Noise Reduction/Frequency vs Scan Pixel Density

Posted: Tue Dec 13, 2022 1:35 am
by ddessert
I have 8mm film frames from the 1950s that were professionally scanned at about 4K vertical lines per image/frame (actually 5.6K with an overscan area). I understand the usable resolution of the output may be more like 720 lines.

How would this 5X scan density affect the Frequency settings (High, Mid, Low, Very Low, Ultra Low) for Noise Reduction or Noise Level?

For example, if High Frequency is looking at variations in 2x2 or 4x4 arrays of pixels, then with my extremely dense scan everything on that scale is likely noise and can be smoothed. On the other end, Noise Reduction Ultra Low Frequency is OFF by default, but should I be using it for these densely scanned frames?

I suppose that to answer my own question (and formulate additional ones), I need to know how large the arrays of pixels are for each frequency, assuming that is how it works.
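To make my scale question concrete, here is a rough Python sketch of the kind of band splitting I have in mind. This is only my guess at how a frequency-based reducer could relate "frequency" to pixel scale; the frequency_bands helper, the sigma values, and the mapping to High/Mid/Low settings are illustrative assumptions on my part, not Neat Video's actual internals.

# Assumption: "frequency" bands correspond to pixel scales, split here with a
# difference-of-Gaussians pyramid. The sigmas are illustrative, not the
# scales Neat Video actually uses.
import numpy as np
from scipy.ndimage import gaussian_filter

def frequency_bands(frame, sigmas=(1, 2, 4, 8, 16)):
    """Split a grayscale frame into spatial-frequency bands.
    Smaller sigma = higher frequency = finer pixel-scale detail."""
    bands = []
    previous = frame.astype(np.float64)
    for sigma in sigmas:
        blurred = gaussian_filter(previous, sigma)
        bands.append(previous - blurred)  # detail removed at this blur scale
        previous = blurred
    bands.append(previous)                # residual low-frequency base image
    return bands

# A 4K-tall scan holding only ~720 usable lines is oversampled roughly 5x,
# so the finest bands (sigma 1-2) should contain mostly grain and scanner
# noise rather than real image detail.
frame = np.random.rand(512, 384)          # small stand-in for a scanned frame
for sigma, band in zip((1, 2, 4, 8, 16), frequency_bands(frame)):
    print(f"sigma={sigma:>2}: band RMS = {band.std():.4f}")

If that picture is roughly right, then oversampling shifts the real picture content toward the lower bands, which is why I am wondering whether the Ultra Low filter becomes more relevant for these scans.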

Re: Noise Reduction/Frequency vs Scan Pixel Density

Posted: Tue Dec 13, 2022 10:02 am
by NVTeam
There is no need to do that math by hand; just let the Auto Profile measure the noise and try the default filter settings. If you see that some large-scale noise is not sufficiently reduced, then enable the Ultra Low filter. There is usually no need to disable any of the higher-frequency filters at all.

Thank you,
Vlad