Using GPU for noise removal?

suggest a way to improve Neat Image
andewid
Posts: 62
Joined: Sat May 01, 2004 8:21 pm

Post by andewid »

Would it not be possible to use the pixel shaders in modern GPUs to remove noise? The GPU's performance for image data manipulation is far better than the CPU's, even if you have a Pentium 4 or Athlon 64.

Mac OS X Tiger uses this technique, and Apple says it wants Photoshop on the Mac to use it too. They showed a demonstration where all effects and plugins were rendered in real time, instead of the usual seconds (or longer!) of lag for normal plugins/filters.

Modern DX9 graphics cards allow a programmer to insert assembly code (pixel shaders) that can operate on the image buffer directly at great speed.

If Neat Image could use pixel shaders for noise removal, sharpening, or colour conversions, it could be really fast. :) Instead of 20 seconds per image, we might get near-instant results :D
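
For illustration, here is a minimal sketch of what a noise-smoothing pixel shader could look like in DirectX 9 HLSL. It is just a 3x3 box blur, far simpler than Neat Image's actual filtration, and the texel-size constant is an assumption about what the host application would supply:

Code:

sampler s0 : register(s0);   // source image, bound by the host application
float2 texel : register(c0); // (1/width, 1/height), assumed set by the host

float4 main(float2 tex : TEXCOORD0) : COLOR
{
	// Average the 3x3 neighbourhood around the current pixel:
	// a crude smoothing filter, not Neat Image's real algorithm.
	float4 sum = 0;
	for (int y = -1; y <= 1; y++)
		for (int x = -1; x <= 1; x++)
			sum += tex2D(s0, tex + float2(x, y) * texel);
	return sum / 9;
}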
NITeam
Posts: 3173
Joined: Sat Feb 01, 2003 4:43 pm

Post by NITeam »

The idea is of course very good. I believe GPUs can be used to do filtration, at least to some extent. Whether that will provide a speedup significant enough to outweigh the related slowdowns is a question that deserves more investigation. We will look into this direction to see whether using GPUs can really provide enough benefit.

Thank you for the interesting suggestion.

Vlad
andewid
Posts: 62
Joined: Sat May 01, 2004 8:21 pm

Post by andewid »

Yes, one slowdown would be the transfer back from the GPU to system memory. The AGP bus is significantly slower in the reverse direction.

If you want to see some examples, you should look at the Apple presentation of OS X Tiger. http://www.apple.com/quicktime/qtv/wwdc04/

About 1 hour and 5 minutes into the presentation is the demo of GPU filtering/effects for image and video processing. It shows that a lot of image processing is indeed possible.

Also, if I remember correctly, the 3DMark 04 benchmark uses some pixel shader manipulation in its demo mode to give a film-grain look.
NITeam
Posts: 3173
Joined: Sat Feb 01, 2003 4:43 pm

Post by NITeam »

Thank you for the link! I will take a look at the new OS X.

Vlad
TechMage89
Posts: 6
Joined: Sun Feb 06, 2005 7:20 pm

Post by TechMage89 »

The bus slowdown would be less of a problem for newer graphics cards on the PCI Express bus. Also, in batch processing it would probably work to store jobs in video RAM (which tends to be faster than system RAM anyway) and retrieve them when done. If you did this, you might even be able to do parallel processing (I don't know much about that, though).
andewid
Posts: 62
Joined: Sat May 01, 2004 8:21 pm

Post by andewid »

I have just noticed that Media Player Classic (http://sourceforge.net/projects/guliverkli - the CVS code) has support for using the pixel shaders of DirectX 9 GPUs to render video effects in real time, for example sharpening and other video effects.

I know NI isn't a video processing tool, but NI could certainly take a peek at the source code (it is open source, GPL) to see how it uses the GPU. A similar process could then be integrated into NI, providing very good performance and new possibilities.

As for bus slowness: the AGP bus can do 266 MB/s in the reverse direction (it acts as a 66 MHz PCI bus). That should be enough to transfer an image back and forth =)
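As a rough sanity check: a 6-megapixel image at 4 bytes per pixel is about 24 MB, so reading it back over AGP at 266 MB/s would take roughly 24/266, about 0.09 seconds. The transfer should not be the bottleneck.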
pinobot
Posts: 12
Joined: Wed Dec 03, 2003 12:17 pm

Post by pinobot »

A comparison between CPU and GPU.

http://gamma.cs.unc.edu/GPUSORT/results.html
andewid
Posts: 62
Joined: Sat May 01, 2004 8:21 pm

Post by andewid »

pinobot wrote:A comparison between CPU and GPU.

http://gamma.cs.unc.edu/GPUSORT/results.html
Not that sorting is really about images :). Still, there are plenty of examples of pixel shaders being used for image manipulation. Apple has its Core Image, which allows Photoshop-like applications to manipulate images through pixel shaders.

Another application that uses shaders (which you can write yourself!) is the open source Media Player Classic. [screenshot]
This example shows embossing done in real time on a 30 fps video stream without any extra CPU usage! You could probably do lots of things like noise reduction, edge enhancement, and sharpening with shaders too. It is what the GPU was meant to do =).

Shader code: Emboss:

Code:

// Constants supplied by the host application (Media Player Classic's
// standard shader preamble); most of them are unused in this shader.
sampler s0 : register(s0);
float4 p0 : register(c0);
float4 p1 : register(c1);

#define width (p0[0])
#define height (p0[1])
#define counter (p0[2])
#define clock (p0[3])
#define one_over_width (p1[0])
#define one_over_height (p1[1])

#define PI acos(-1)

float4 main(float2 tex : TEXCOORD0) : COLOR
{
	// one texel step in each direction
	float dx = 1/width;
	float dy = 1/height;
	
	// sample six of the eight neighbours of the current texel
	float4 c1 = tex2D(s0, tex + float2(-dx,-dy));
	float4 c2 = tex2D(s0, tex + float2(0,-dy));
	float4 c4 = tex2D(s0, tex + float2(-dx,0));
	float4 c6 = tex2D(s0, tex + float2(dx,0));
	float4 c8 = tex2D(s0, tex + float2(0,dy));
	float4 c9 = tex2D(s0, tex + float2(dx,dy));
	
	// directional difference (lower-right minus upper-left),
	// converted to grey and biased to mid-grey
	float4 c0 = (-c1-c2-c4+c6+c8+c9);
	c0 = (c0.r+c0.g+c0.b)/3 + 0.5;
	
	return c0;
}
Shader code: Contour:

Code:

// Same host-supplied constants as in the emboss shader above.
sampler s0 : register(s0);
float4 p0 : register(c0);
float4 p1 : register(c1);

#define width (p0[0])
#define height (p0[1])
#define counter (p0[2])
#define clock (p0[3])
#define one_over_width (p1[0])
#define one_over_height (p1[1])

#define PI acos(-1)

float4 main(float2 tex : TEXCOORD0) : COLOR
{
	// sample four texels apart for a coarser edge response
	float dx = 4/width;
	float dy = 4/height;
	
	float4 c2 = tex2D(s0, tex + float2(0,-dy));
	float4 c4 = tex2D(s0, tex + float2(-dx,0));
	float4 c5 = tex2D(s0, tex + float2(0,0));
	float4 c6 = tex2D(s0, tex + float2(dx,0));
	float4 c8 = tex2D(s0, tex + float2(0,dy));
	
	// Laplacian edge detector, thresholded to pure black or white
	float4 c0 = (-c2-c4+c5*4-c6-c8);
	if(length(c0) < 1.0) c0 = float4(0,0,0,0);
	else c0 = float4(1,1,1,0);
	
	return c0;
}
pinobot
Posts: 12
Joined: Wed Dec 03, 2003 12:17 pm

Post by pinobot »

andewid
Posts: 62
Joined: Sat May 01, 2004 8:21 pm

Post by andewid »

Quite an effective way of showing that it is possible, even though the program itself wasn't very useful =). It could handle quite large images with no slowdown at all, though.
RIUM+
Posts: 5
Joined: Wed Dec 29, 2004 2:14 pm

Post by RIUM+ »

Just so you know, if you have a DirectX 9-compatible 3D card and use Media Player Classic with the output set to VMR9 (renderless), rendering the video as textured surfaces in a 3D window, it is capable of doing bilinear and bicubic resizing of videos through pixel shaders. You can also set the sharpening amount of the bicubic resizing (I personally use 0.75). It's not quite as thorough as if it were done on the CPU, but for almost no increase in CPU usage it's quite good.
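
For the curious, the "sharpening amount" in bicubic resizing is typically the free parameter A of the standard cubic interpolation kernel (the 0.75 setting presumably corresponds to A = -0.75; more negative means sharper). A sketch of that weight function in HLSL, with an illustrative name rather than MPC's actual source:

Code:

// Weight of a sample at distance x (0 <= x < 2) from the output position,
// using the standard two-piece cubic kernel; A controls the sharpness.
float cubic_weight(float x, float A)
{
	if (x < 1)
		return (A + 2)*x*x*x - (A + 3)*x*x + 1;
	else
		return A*(x*x*x - 5*x*x + 8*x - 4);
}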

Not that it does any noise removal, nor could it be used by NI, but it's just something I wanted to point out.
andewid
Posts: 62
Joined: Sat May 01, 2004 8:21 pm

Post by andewid »

RIUM+ wrote:Just so you know, if you have a DirectX 9-compatible 3D card and use Media Player Classic....
This is exactly what I posted in an earlier message, with screenshots included too. Noise reduction and other image/pixel manipulations are certainly possible with pixel shaders.
RIUM+
Posts: 5
Joined: Wed Dec 29, 2004 2:14 pm

Post by RIUM+ »

andewid wrote:This is exactly what I posted in an earlier message, with screenshots included too. Noise reduction and other image/pixel manipulations are certainly possible with pixel shaders.
Yeah, I was just pointing out the bicubic/sharpening aspects it has built in. You didn't mention them, so I thought I should, since image resizing/sharpening is more along the lines of what NI does than embossing the image, inverting the colours, or making the image wave around. :)

...actually... altering the brightness, contrast, and gamma just in the viewing window... now THERE is something NI currently does that could be done in pixel shaders. I have no idea what speedup it would give, and altering those settings isn't particularly slow anyway, but it's a possibility. Pixel shaders for the noise reduction itself may not be possible, but it CERTAINLY would be possible to use pixel shaders to provide different viewing styles that might show up the noise better: invert the colours, show just a single colour channel, maybe view a certain hue of the image so you can see the noise in that hue... Output quality has to remain top-notch, and there are often little optimizations/shortcuts in GPU hardware, so if they're applied to the viewed image only, it doesn't matter too much if the displayed image shows a pixel or two that is 1/256th the wrong colour.
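
As a sketch, viewing modes like those are nearly one-liners in a pixel shader. Something along these lines (the constant layout here is hypothetical, not anything NI actually uses) would invert the image or isolate one channel:

Code:

sampler s0 : register(s0);
float4 mode : register(c0); // hypothetical: x > 0 inverts, y > 0 shows one channel

float4 main(float2 tex : TEXCOORD0) : COLOR
{
	float4 c = tex2D(s0, tex);
	if (mode.x > 0)
		c.rgb = 1 - c.rgb;   // invert the colours
	if (mode.y > 0)
		c.rgb = c.g;         // view the green channel only, as an example
	return c;
}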

...Right, I'm ranting now. I need more sleep. Sorry, I just had this idea, and it's a great example of something NI could do with pixel shaders. Advertising a noise removal program that uses the GPU for some functions would be a serious selling point, even if it's only used for viewing the image and the noise removal calculations are all done on the CPU. :)