Research...
...or what we do between surfing the internet, ferrying around kids, and drinking coffee.
Dictionary Learning
Iterative thresholding and K residual means - ITKrM
the fastest dictionary learning algorithm in the west...of Austria.
You need a basis or an overcomplete dictionary for your data?
If K-SVD is not an option or still running, give ITKrM a quick try -
sequential, fast, with local convergence guarantees and pretty results.
For instance, on the left you see a 64x64 dictionary learned from/for patches of wonderful Fabio. More information about the algorithm or Fabio can be found here, and if you want to start learning dictionaries right away, here is the toolbox!
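If you are curious what one ITKrM iteration does - threshold (keep the S largest inner products per signal), then average residuals per atom - here is a minimal NumPy sketch. This is a simplified illustration under our own naming, not the toolbox implementation:

```python
import numpy as np

def itkrm_step(D, Y, S):
    """One ITKrM iteration (sketch): thresholding + K residual means.
    D: d x K dictionary with unit-norm columns, Y: d x N training signals."""
    d, K = D.shape
    corr = D.T @ Y                                  # inner products, K x N
    D_new = np.zeros_like(D)
    for n in range(Y.shape[1]):
        y = Y[:, n]
        I = np.argsort(-np.abs(corr[:, n]))[:S]     # thresholding: S largest
        DI = D[:, I]
        res = y - DI @ (np.linalg.pinv(DI) @ y)     # residual after projection
        for k in I:
            # add back atom k's own contribution and align the sign
            D_new[:, k] += np.sign(corr[k, n]) * (res + D[:, k] * corr[k, n])
    norms = np.linalg.norm(D_new, axis=0)
    for k in range(K):                              # renormalise the atoms,
        if norms[k] > 1e-12:                        # replacing unused ones
            D_new[:, k] /= norms[k]
        else:
            v = np.random.randn(d)
            D_new[:, k] = v / np.linalg.norm(v)
    return D_new
```

Iterating `D = itkrm_step(D, Y, S)` from a random initialisation is the whole algorithm - each signal is touched once per iteration, which is what makes it sequential and fast.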
Adaptive dictionary learning - ITKrM+
when you don't know which size your dictionary should have.
You need a dictionary but do not know how big it should be?
No problem, just start with a random basis and see where ITKrM+ will get you.
For instance, for Peppers you will get the top dictionary and for Mandrill the bottom one. Despite being smaller, the Peppers dictionary approximates Peppers patches better than the Mandrill dictionary approximates Mandrill patches.
If you want to know more about adaptive dictionary learning and the theory behind it, have a look at the manifesto. If you are just happy about not having to decide dictionary size and sparsity level, here is the toolbox!
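To give a flavour of the adaptive idea: besides adding candidate atoms, ITKrM+ discards atoms that the data does not support. The sketch below shows only that pruning half, with made-up thresholds and our own naming - for the real criteria see the manifesto:

```python
import numpy as np

def prune_dictionary(D, Y, S, mu_max=0.7, min_use=0.01):
    """Drop atoms that are rarely selected by thresholding or are near
    duplicates of another atom (mu_max and min_use are illustrative)."""
    corr = np.abs(D.T @ Y)                        # K x N correlations
    counts = np.zeros(D.shape[1])
    for n in range(Y.shape[1]):
        counts[np.argsort(-corr[:, n])[:S]] += 1  # how often each atom is used
    keep = counts / Y.shape[1] >= min_use
    G = np.abs(D.T @ D)                           # coherence between atoms
    for j in range(D.shape[1]):
        for i in range(j):
            if keep[i] and keep[j] and G[i, j] > mu_max:
                keep[j] = False                   # drop the later twin
    return D[:, keep]
```

Alternating learning steps with pruning (and candidate insertion) lets the dictionary size settle wherever the data wants it.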
Dictionary learning from incomplete data - ITKrMM
when your data is not that perfect - yet.
You need a dictionary but unfortunately your training data is full of holes?
No problem, as long as you know where the holes are, we still have a solution!
ITKrMM can also learn a dictionary from corrupted/erased data.
Moreover, it takes care of the potential presence of a low rank component in the data. The added bonus is that with the learned dictionary you can then go about filling the holes in your data - inpainting. More information about ITKrMM and inpainting can be found here, and if you want to start restoring old family pictures right away, here is the toolbox!
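The inpainting step itself is pleasantly simple once the dictionary is learned: sparse-code each patch on its observed coordinates only, then let the full atoms predict the missing ones. A hedged NumPy sketch (our own simplification, using plain OMP on the observed rows rather than the toolbox routine):

```python
import numpy as np

def inpaint_patch(D, y, mask, S):
    """Fill the unobserved entries of y (mask == False) with a sparse
    approximation in dictionary D, fitted on the observed rows only."""
    Dm = D[mask]                                  # atoms restricted to observed coords
    scale = np.linalg.norm(Dm, axis=0) + 1e-12    # compensate energy lost to holes
    residual = y[mask].astype(float).copy()
    support = []
    for _ in range(S):
        k = int(np.argmax(np.abs(Dm.T @ residual) / scale))
        if k not in support:
            support.append(k)
        x, *_ = np.linalg.lstsq(Dm[:, support], y[mask], rcond=None)
        residual = y[mask] - Dm[:, support] @ x
    out = y.astype(float).copy()
    out[~mask] = (D[:, support] @ x)[~mask]       # fill only the holes
    return out
```

The observed entries are passed through untouched; only the masked coordinates are synthesised from the selected atoms.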
Analysis Operator Learning
Fast analysis operator learning - FAOL, IAOL & SVAOL
the fastest analysis operator learning algorithms in the west...of Austria.
Dictionaries are not your cup of tea?
But you still need a low dimensional structure in your data for regularisation?
Try analysis operator learning with FAOL, IAOL or SVAOL!
For instance, on the left you see a 64x64 analysis operator/dictionary learned with IAOL from/for patches of wonderful Fabio. More information about analysis operators, FAOL, IAOL, SVAOL or image denoising using analysis operators can be found here, and if you want to start learning analysis operators right away, here is the toolbox!
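In the analysis model the operator should map each signal to a vector with many small entries, so learning amounts to driving the smallest responses towards zero. A rough projected-gradient sketch in that spirit (our own toy version with made-up step size, not the FAOL/IAOL/SVAOL updates - those are derived here):

```python
import numpy as np

def aol_step(Om, Y, l, step=0.1):
    """One projected-gradient step for analysis operator learning (sketch):
    push the l smallest responses Om @ y of each signal towards zero,
    then renormalise the rows of the operator."""
    G = np.zeros_like(Om)
    for n in range(Y.shape[1]):
        y = Y[:, n]
        r = Om @ y
        Lam = np.argsort(np.abs(r))[:l]           # estimated cosupport
        G[Lam] += np.outer(r[Lam], y)             # gradient of 0.5*||Om_Lam y||^2
    Om_new = Om - step * G / Y.shape[1]
    return Om_new / np.linalg.norm(Om_new, axis=1, keepdims=True)
```

Note the symmetry to dictionary learning: thresholding picks the largest responses to find a support, while here the smallest responses define the cosupport.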
Sparse Approximation
Average performance of OMP
with no RIP in sight, but incoherence and decaying coefficients.
Need to sparsely approximate a lot of signals?
Happy with the performance of OMP, but everybody tells you BP is better?
If you suspect decaying coefficients, we have an argument for why OMP is a good idea!
For instance, on the left you can see the average performance of OMP with a dictionary of 512 atoms in dimension 128 and coefficients forming a geometric sequence. The theorem describing how the average performance of OMP (assuming random coefficient signs) improves with decay can be found here.
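For readers who have not met it, plain OMP is a short greedy loop - pick the atom that best matches the residual, re-fit on the selected support, repeat. A minimal NumPy sketch (ours, for illustration rather than speed):

```python
import numpy as np

def omp(D, y, S):
    """Orthogonal Matching Pursuit: greedily select S atoms of D,
    re-fitting the coefficients on the support after each pick."""
    residual = y.astype(float).copy()
    support = []
    for _ in range(S):
        k = int(np.argmax(np.abs(D.T @ residual)))   # best-matching atom
        support.append(k)
        x, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ x             # orthogonal re-projection
    coeffs = np.zeros(D.shape[1])
    coeffs[support] = x
    return coeffs
```

With decaying coefficients the first greedy pick is already very likely correct, which is exactly the effect the average-case theorem quantifies.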
The big picture