Massive Batch processing optimizations?
Posted: Sun Feb 06, 2005 7:39 pm
I use Neat Image to process video saved as an image sequence, because Neat Image is so effective with VHS noise and the like.
However, processing 95,000 frames (at 720x480 pixels) is a nightmare, because Neat Image doesn't seem to have a way to handle it. Firstly, if I queue more than about 2,000 images (even at this low resolution), weird things start happening. Either Neat Image freezes while processing the first image (actually not using the CPU or hard drive at all), or it gives me an out-of-memory error (although it's only used about half of my paging file).
The other problem I noticed was with performance: when one image is finished and the next image begins processing, the CPU usage drops for a few seconds (a big issue when processing so many images). My guess is that it is programmed so the next image isn't loaded into memory until the first is done.
The first problem, I think, could be fixed by limiting the number of images loaded into memory at a time (to, say, 1,000) and loading an additional image each time one is deleted. The second could be fixed by beginning to load the next image about two seconds before the current one is estimated to be done.
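In other words, a bounded queue fed by a background loader would cap memory use and keep the next frame ready before the current one finishes. A minimal sketch of that idea (the frame names, queue size, and processing step are placeholders, not anything from Neat Image itself):

```python
import queue
import threading

# Hypothetical frame names standing in for the full image sequence.
frames = [f"frame_{i:05d}.tif" for i in range(10)]

# Bounded queue: only a fixed number of frames sit in memory at once.
# put() blocks when the queue is full, so a new frame is loaded only
# after a processed one has been removed.
loaded = queue.Queue(maxsize=3)

def loader():
    for name in frames:
        loaded.put(name)   # blocks while the queue is full
    loaded.put(None)       # sentinel: no more frames

processed = []

def processor():
    while True:
        frame = loaded.get()       # next frame is already waiting
        if frame is None:
            break
        processed.append(frame)    # stand-in for the actual noise filtering

t = threading.Thread(target=loader)
t.start()
processor()
t.join()
```

Because the loader runs in its own thread, the next frame is sitting in the queue the moment the current one finishes, so the CPU never idles waiting on disk.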
Even better would be an actual video filter with the Neat Image Noise Removal System.