Are there algorithms that succeed "Bilateral Filtering"? (in terms of functionality)

"Bilateral Filter" algorithm is presented back in 1998. Right now, I am looking around for something newer. It's not that there is anything wrong with the algorithm, but is there be a newer algorithm for performing similar task (that is preserving the edge and at the same time, removing the noises) that is better in some ways today?

Non-local means filtering is a pretty standard algorithm for denoising. For each pixel (or patch), instead of taking the average of nearby pixels, you take the average of the most similar pixels in the image. The intuition is that along an edge, for example, there will be similar pixels that are not necessarily close by.
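For a concrete starting point, OpenCV ships an implementation of non-local means; here is a minimal sketch of calling it. The input file name and the parameter values (h, templateWindowSize, searchWindowSize) are illustrative assumptions, not tuned settings.

```python
# Minimal sketch: non-local means denoising with OpenCV's built-in filter.
import cv2

img = cv2.imread("noisy.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input file

denoised = cv2.fastNlMeansDenoising(
    img,
    h=10,                  # filter strength: larger removes more noise (and detail)
    templateWindowSize=7,  # size of the patch used to measure similarity
    searchWindowSize=21,   # neighborhood searched for similar patches
)
cv2.imwrite("denoised.png", denoised)
```

For color images there is a companion function, cv2.fastNlMeansDenoisingColored, with the same flavor of parameters.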

Another successful technique is dictionary learning: you learn a dictionary of patches from your image, hoping that the noise (being random) will not be learnt. Then you decompose each patch of your image on that dictionary and take that decomposition as your denoised patch. Part II of this tutorial is a good introduction; a rough sketch of the pipeline follows below.
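As an illustration of that pipeline, here is a sketch using scikit-learn's MiniBatchDictionaryLearning with its patch-extraction helpers. The patch size, number of atoms, alpha, and iteration count are illustrative assumptions, not values recommended by the answer.

```python
# Rough sketch of patch-based dictionary-learning denoising:
# learn a dictionary of patch atoms, sparse-code every patch on it,
# and take the sparse reconstruction as the denoised patch.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import (
    extract_patches_2d,
    reconstruct_from_patches_2d,
)

def dictionary_denoise(noisy, patch_size=(7, 7), n_atoms=100):
    # Extract overlapping patches from the noisy image and center them.
    patches = extract_patches_2d(noisy, patch_size)
    data = patches.reshape(len(patches), -1).astype(float)
    mean = data.mean(axis=0)
    data -= mean

    # Learn the dictionary; random noise tends not to be captured
    # well by a small set of atoms.
    dico = MiniBatchDictionaryLearning(n_components=n_atoms, alpha=1.0,
                                       max_iter=100)
    dico.fit(data)

    # Sparse-code each patch and reconstruct; overlapping patch
    # estimates are averaged back into an image.
    code = dico.transform(data)
    recon = code @ dico.components_ + mean
    return reconstruct_from_patches_2d(recon.reshape(patches.shape),
                                       noisy.shape)
```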

I always wanted to give Edge-Preserving Decompositions for Multi-Scale Tone and Detail Manipulation by Farbman, Fattal, Lischinski and Szeliski a try.

It would be nice if you could report back if you used it and for what and with what results. :-)

Since you did not explain what "better" means, it is hard to answer. However, if "better" means faster, you can check the paper on the constant-time bilateral filter here: http://www.cs.cityu.edu.hk/~qiyang/ .

I provided a Java implementation (based on the initial C code) here: http://code.google.com/p/kanzi/source/browse/src/kanzi/filter/FastBilateralFilter.java
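For comparison, the classic bilateral filter (not the constant-time variant from the linked paper) is readily available in OpenCV as cv2.bilateralFilter; a minimal sketch, with illustrative parameter values:

```python
# Classic bilateral filter via OpenCV. This is the standard
# implementation whose cost grows with the kernel size, shown here
# only as a baseline for the faster variants discussed above.
import cv2

img = cv2.imread("noisy.png")  # hypothetical input file

smoothed = cv2.bilateralFilter(
    img,
    d=9,            # diameter of the pixel neighborhood
    sigmaColor=75,  # range (intensity) standard deviation
    sigmaSpace=75,  # spatial standard deviation
)
cv2.imwrite("smoothed.png", smoothed)
```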
