
Closed

Insufficient memory (case 4)

description

Hi,

I'm using Magick.NET in a very high-volume, multi-threaded photo processing application. I see random errors like the one below. It usually happens on larger photos; I have to deal with JPG files that are sometimes up to 40 megabytes in size, but I can see errors like this even on 5 megabyte files. The servers have plenty of memory. Is there anything I can do to avoid this other than forcing the application to only process one photo at a time?

I am using Magick.NET-Q16-AnyCPU version 7.0.3.902


ImageMagick.MagickCorruptImageErrorException: Insufficient memory (case 4)
at ImageMagick.MagickImage.NativeMagickImage.WriteBlob(MagickSettings settings, UIntPtr& length)
at ImageMagick.MagickImage.Write(Stream stream)
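
For reference, the failing call follows roughly the pattern below (a simplified sketch; the method name, the processing step, and the buffer names are placeholders, not my actual code):

    using System.IO;
    using ImageMagick;

    // Simplified sketch of the failing path: read from a byte array pulled over HTTP,
    // process, then write the result back to a MemoryStream.
    static byte[] ProcessPhoto(byte[] inputBytes)
    {
        using (var image = new MagickImage(inputBytes))
        using (var output = new MemoryStream())
        {
            image.Strip();           // example processing step (remove metadata)
            image.Write(output);     // this is where "Insufficient memory (case 4)" is thrown
            return output.ToArray();
        }
    }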
Closed Jan 30 at 8:13 PM by dlemstra

comments

dlemstra wrote Jan 6 at 3:24 PM

It looks like you are reading from a stream. Would it be possible to read from a file? I don't know which part of the code is complaining about this, but it looks like you are running into problems when the data from the stream has to be allocated inside the unmanaged code. I am planning to see if I can change the ImageMagick code to add support for better reading and writing of streams: https://github.com/dlemstra/Magick.NET/issues/35. For now I would advise you to switch to a file instead.
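
Something along these lines should work as a stopgap (a rough sketch; temp file naming and cleanup are up to you, and image is your existing MagickImage):

    // Rough sketch: write to a temporary file instead of a MemoryStream,
    // then read the bytes back for the HTTP upload.
    var tempFile = new FileInfo(Path.Combine(Path.GetTempPath(), Guid.NewGuid() + ".jpg"));
    try
    {
        image.Write(tempFile);                                    // file-based overload
        byte[] resultBytes = File.ReadAllBytes(tempFile.FullName);
        // ... push resultBytes back over HTTP ...
    }
    finally
    {
        tempFile.Delete();
    }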

scohen2002 wrote Jan 6 at 4:14 PM

Thank you very much for your response. Yes, throughout the entire application I am dealing with the JPGs as memory streams and never on disk, so every Magick.NET operation is stream based or byte-array based. This is because the files live on AWS S3 and there are multiple servers that perform different operations on the files, so no server ever holds the physical files locally. They bring the file bytes down over HTTP, perform jobs on the JPG, and push the bytes back over HTTP. I could change it around to bring the bytes down into temporary JPG files, I guess, but that would slow things down. I'll experiment a bit.
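
To illustrate, the per-file flow is roughly this (a sketch; DownloadFromS3 and UploadToS3 stand in for our actual HTTP transfer code):

    // Sketch of the per-file flow; the HTTP/S3 transfer helpers are hypothetical names.
    byte[] downloaded = DownloadFromS3(key);
    byte[] processed;
    using (var image = new MagickImage(downloaded))
    {
        image.AutoOrient();                              // example operation
        processed = image.ToByteArray(MagickFormat.Jpg);
    }
    UploadToS3(key, processed);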

scohen2002 wrote Jan 20 at 12:20 PM

I changed all the .Write() calls I had from Stream based to FileInfo based, so from .Write(Stream) to .Write(FileInfo). I then still got an out of memory exception in the Write method. I then created a lock object and ensured that only one single Write call could ever execute at a time, and I still get out of memory exceptions. There are other simultaneous uses of Magick.NET objects going on in parallel at the same time, but no other writes. The .NET process is using about 2 gigs of the available 16 gigs on the server and never goes above that.
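
The serialization I added looks roughly like this (a sketch; WriteImage is just a wrapper name I'm using here):

    // A single shared lock so only one Write call can execute at a time.
    private static readonly object WriteLock = new object();

    static void WriteImage(MagickImage image, FileInfo destination)
    {
        lock (WriteLock)
        {
            image.Write(destination);
        }
    }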

dlemstra wrote Jan 20 at 3:07 PM

Are you running in 32-bit instead of 64-bit?

p.s. Planning to work on adding better stream support this weekend.
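
A quick way to check from inside the service (sketch):

    // Logs whether the current process is running as 64-bit or 32-bit.
    Console.WriteLine(Environment.Is64BitProcess ? "64-bit process" : "32-bit process");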

scohen2002 wrote Jan 20 at 7:38 PM

You got it! Thanks to my code changes to use temp files, I happened to watch the temp folder while the process was running to make sure it wasn't orphaning any temporary JPG files. By the good graces of god, guess what else I noticed sitting in the temp folder? Magick.NET had unpacked the 32-bit native DLL and not the 64-bit one. So the Windows service was running in 32-bit and capped at 2 gigs; I guess the .Write() calls are the most expensive and suffered from this limit. It was caused by the "Prefer 32-bit" build option being checked by default in the service's csproj properties. I've now removed the temp files and the locking and am testing how many more files I can process at once without getting memory errors.
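
For anyone else who hits this, unchecking "Prefer 32-bit" in the project's Build properties corresponds to this fragment in the csproj (sketch of the relevant piece only):

    <PropertyGroup>
      <!-- "Prefer 32-bit" was checked by default; setting it to false lets the service run 64-bit. -->
      <Prefer32Bit>false</Prefer32Bit>
    </PropertyGroup>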

In a roundabout way, your suggestion to use files instead of streams led to this getting figured out, and it's appreciated!

dlemstra wrote Jan 30 at 8:13 PM

I just implemented issue 35 on github. This will be available in the next version of Magick.NET. Closing this issue now.