
High Quality Image Magnification on GPU

I'm looking for interesting algorithms for image magnification that can be implemented on a GPU for real-time scaling of video. Linear and bicubic interpolation algorithms are not good enough.

Suggestions?

Here are some papers I've found, though I'm unsure about their suitability for GPU implementation.

Adaptive Interpolation

Level Set

I've seen some demos of scaling on the Cell processor used in TVs, which had some impressive results; unfortunately I have no link.

Lanczos3 is a very nice interpolation algorithm (you can test it in the GIMP or VirtualDub). It generally performs better than cubic interpolation and can be parallelized.
A GPU-based version is implemented in Chromium:
http://code.google.com/p/chromium/issues/detail?id=47447
Check out the Chromium source code.

It may still be too slow for real-time video processing, but it may be worth trying if you don't use too high a resolution.
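For a sense of what this looks like on the GPU, here is a minimal CUDA sketch of a direct (non-separable) Lanczos3 upscaler for a single-channel image. It is not the Chromium implementation; the kernel name and parameters are assumptions for illustration, and a real-time version would split the filter into separate horizontal and vertical passes with precomputed weights.

    #include <math.h>

    // Lanczos3 window: sinc(x) * sinc(x/3) for |x| < 3, 0 otherwise.
    __device__ float lanczos3(float x)
    {
        x = fabsf(x);
        if (x < 1e-6f) return 1.0f;
        if (x >= 3.0f) return 0.0f;
        float px = 3.14159265f * x;
        return 3.0f * sinf(px) * sinf(px / 3.0f) / (px * px);
    }

    // src: single-channel source image (srcW x srcH), dst: output (dstW x dstH).
    __global__ void lanczos3_upscale(const float* src, int srcW, int srcH,
                                     float* dst, int dstW, int dstH)
    {
        int ox = blockIdx.x * blockDim.x + threadIdx.x;
        int oy = blockIdx.y * blockDim.y + threadIdx.y;
        if (ox >= dstW || oy >= dstH) return;

        // Map the output pixel centre back into source coordinates.
        float sx = (ox + 0.5f) * srcW / dstW - 0.5f;
        float sy = (oy + 0.5f) * srcH / dstH - 0.5f;
        int cx = (int)floorf(sx);
        int cy = (int)floorf(sy);

        // Accumulate the 6x6 neighbourhood weighted by the 2D Lanczos3 kernel.
        float sum = 0.0f, wsum = 0.0f;
        for (int j = -2; j <= 3; ++j) {
            for (int i = -2; i <= 3; ++i) {
                int px = min(max(cx + i, 0), srcW - 1);   // clamp at the border
                int py = min(max(cy + j, 0), srcH - 1);
                float w = lanczos3(sx - (cx + i)) * lanczos3(sy - (cy + j));
                sum  += w * src[py * srcW + px];
                wsum += w;
            }
        }
        dst[oy * dstW + ox] = sum / wsum;
    }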

You may also want to try out CUVI Lib, which offers a good set of GPU-accelerated image processing algorithms. Find out more at: http://www.cuvilib.com

Disclosure: I am part of the team that developed CUVI.

Prefiltered cubic B-spline interpolation delivers good results (you can have a look here for some theoretical background). CUDA source code can be downloaded here. WebGL examples can be found here.

Edit: the cubic interpolation code is now available on GitHub: CUDA version and WebGL version.
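This is not the downloadable source, but the core trick that makes cubic B-spline filtering fast on the GPU can be sketched in a few lines of CUDA: the four B-spline weights along each axis are folded into two coordinates and two weights, so one 2D cubic lookup costs four hardware bilinear fetches instead of sixteen reads. The function names here are my own, the texture is assumed to be bound with linear filtering and unnormalized coordinates, and the required prefiltering pass over the image is omitted.

    // Cubic B-spline weights for a fractional offset f in [0,1);
    // w0..w3 weight the samples at i-1, i, i+1, i+2.
    __device__ void bspline_weights(float f, float& w0, float& w1, float& w2, float& w3)
    {
        float f2 = f * f, f3 = f2 * f;
        w0 = (1.0f - 3.0f*f + 3.0f*f2 - f3) / 6.0f;
        w1 = (4.0f - 6.0f*f2 + 3.0f*f3) / 6.0f;
        w2 = (1.0f + 3.0f*f + 3.0f*f2 - 3.0f*f3) / 6.0f;
        w3 = f3 / 6.0f;
    }

    // Fold the four weights along one axis into two linear fetches:
    // x is a grid coordinate (sample i sits at i); h0/h1 are the fetch
    // positions in unnormalized texture coordinates, g0/g1 their weights.
    __device__ void bspline_fetch_params(float x, float& h0, float& h1,
                                         float& g0, float& g1)
    {
        float ix = floorf(x);
        float f  = x - ix;
        float w0, w1, w2, w3;
        bspline_weights(f, w0, w1, w2, w3);
        g0 = w0 + w1;
        g1 = w2 + w3;
        h0 = ix - 0.5f + w1 / g0;   // lands between samples i-1 and i
        h1 = ix + 1.5f + w3 / g1;   // lands between samples i+1 and i+2
    }

    // 2D cubic B-spline lookup from a prefiltered texture:
    // four bilinear fetches combined with the folded weights.
    __device__ float tex2D_cubic(cudaTextureObject_t tex, float x, float y)
    {
        float h0x, h1x, g0x, g1x, h0y, h1y, g0y, g1y;
        bspline_fetch_params(x - 0.5f, h0x, h1x, g0x, g1x);
        bspline_fetch_params(y - 0.5f, h0y, h1y, g0y, g1y);
        return g0y * (g0x * tex2D<float>(tex, h0x, h0y) + g1x * tex2D<float>(tex, h1x, h0y))
             + g1y * (g0x * tex2D<float>(tex, h0x, h1y) + g1x * tex2D<float>(tex, h1x, h1y));
    }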

It is still a work in progress, but GpuCV is a replacement for the OpenCV image processing functions, implemented on the GPU in OpenCL.

You may want to have a look at super-resolution algorithms. A starting point can be found on CiteSeerX.


 