ImageEn for Delphi and C++ Builder

 

ImageEn Forum
 ImageEn and IEvolution Support Forum
 Lossless image resize setting

AndyColmes

USA
351 Posts

Posted - Oct 10 2018 :  16:35:55
What is the best setting to resize an image that gives the best lossless quality? I am also open to the idea of converting to another format as long as it is compatible with TIEBitmap and TBitmap.

As a footnote, can ImageMagick plugin with ImageEn help in any way?

Thanks very much in advance.

Andy

xequte

38180 Posts

Posted - Oct 10 2018 :  18:40:24
Hi Andy

PNG is generally the best option for lossless image saving: good compression without losing any data.

But JPEG is very good for photos: at high quality values there is very little visible loss, and the files will be significantly smaller.

For example, take a lossless source image. Save it to JPEG at 85% quality, and also scale it to half size and save it to PNG. Even at half size, the PNG will probably have a larger file size, and it is now lower quality because of the downscaling.
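A minimal Delphi sketch of that comparison (untested; the unit name and the TIEBitmap Read/Write and Params.JPEG_Quality members are assumed to behave as in current ImageEn versions):

```pascal
uses
  hyieutils;  // TIEBitmap (ImageEn unit name assumed)

procedure SaveBothCopies;
var
  bmp: TIEBitmap;
begin
  bmp := TIEBitmap.Create;
  try
    bmp.Read('Source.bmp');            // lossless source image
    bmp.ParamsEnabled := True;
    bmp.Params.JPEG_Quality := 85;     // quality for the JPEG copy
    bmp.Write('Copy85.jpg');           // lossy, but small
    bmp.Write('Copy.png');             // lossless
  finally
    bmp.Free;
  end;
end;
```

Comparing the two output files' sizes and appearance shows the trade-off described above.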


ImageMagick is probably only useful if you need to display unsupported image types.



Nigel
Xequte Software
www.imageen.com

AndyColmes


Posted - Oct 11 2018 :  16:50:49
Thanks, Nigel, for the great info. If I am loading the image file into a TIEBitmap and will ultimately want to resize it in case it is too large, what would be your recommendation? If I understand your suggestion correctly, I should first make sure that the file I load into the TIEBitmap is a PNG. Then, once it is in the TIEBitmap, what resize/resample settings should I consider in order to keep the quality as close to lossless as possible for the bitmap I intend to process?

Thanks again.

Andy

xequte


Posted - Oct 11 2018 :  17:58:20
Hi Andy

The loading format does not matter, though obviously if it is a lossy format like JPEG then you have already lost some data.

It is not possible to make specific recommendations without knowing the source of the images and the purpose of storing them. In general, photos lend themselves to lossy storage because, done well, the visual loss is imperceptible while the reduction in file size is significant. Conversely, images that are continuously modified, or that contain text or vector elements, are best stored losslessly.

What is your purpose in reducing them (smaller file size, faster loading, etc.)? Obviously resizing is, in itself, a lossy operation.

Nigel
Xequte Software
www.imageen.com

AndyColmes


Posted - Oct 12 2018 :  12:40:57
Hi Nigel,

The issue that I have been having with this project is that I am almost always encountering "out of memory" situations due to large images. There are fixed routines that I am dealing with that use the default TBitmap, and that is where the bottleneck happens. I am looking for a way to overcome this limitation, but as long as I have to deal with TBitmap at some point, I will get the "out of memory" nightmare. It would be great if the image data could somehow be written to disk instead of keeping everything in memory, which is what is causing the issue.

So, I am open to any suggestion on how to overcome this hurdle. One solution would be to downsize the image to an acceptable size that will not give the "out of memory" error.

I look forward to any suggestion or recommendation on how to handle this large image issue.

Thank you in advance.

Andy

xequte


Posted - Oct 12 2018 :  15:48:06
Hi Andy

Yes, unfortunately TBitmap is just not good with very large images; it cannot cache to file the way TIEBitmap can.
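As a rough sketch of that file-caching capability (untested; the unit name and the exact Location/Allocate members are assumptions based on ImageEn's documented TIEBitmap interface):

```pascal
uses
  hyieutils;  // TIEBitmap, ieFile, ie24RGB (ImageEn unit name assumed)

procedure AllocateHugeImage;
var
  bmp: TIEBitmap;
begin
  bmp := TIEBitmap.Create;
  try
    bmp.Location := ieFile;                // back pixel data with a temp file,
                                           // not RAM, so huge sizes are possible
    bmp.Allocate(20000, 15000, ie24RGB);   // 300 megapixels, mostly on disk
    // ... load or process the image here ...
  finally
    bmp.Free;                              // the temporary file is released
  end;
end;
```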

I can see why you are wanting to upgrade your routines to use TIEBitmap. Judging by the example you emailed me, this should not be technically difficult, but the conversion, and especially the testing, will be time consuming.

Until then, you might have to work with downscaled images in a TBitmap.

The other option, though not ideal, is to use the full-size image but downscale it only for analysis tasks. For some editing tasks, you could work with "strips", e.g. break a 4000x3000 image into twelve 1000x1000 images and perform the editing task on each of them. Of course, there is only a limited set of editing operations where that is practical.
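The strip idea might look roughly like this in Delphi (untested sketch; the CopyRectTo parameter order dest, xsrc, ysrc, xdst, ydst, width, height and the Allocate signature are assumptions):

```pascal
uses
  Math,        // Min()
  hyieutils;   // TIEBitmap (ImageEn unit name assumed)

// Process a large image in 1000-pixel-high strips.
procedure ProcessInStrips(full: TIEBitmap);
var
  strip: TIEBitmap;
  y, h: Integer;
begin
  strip := TIEBitmap.Create;
  try
    y := 0;
    while y < full.Height do
    begin
      h := Min(1000, full.Height - y);
      strip.Allocate(full.Width, h, full.PixelFormat);
      full.CopyRectTo(strip, 0, y, 0, 0, full.Width, h);  // cut the strip out
      // ... apply the editing operation to "strip" ...
      strip.CopyRectTo(full, 0, 0, 0, y, full.Width, h);  // paste it back
      Inc(y, h);
    end;
  finally
    strip.Free;
  end;
end;
```

This only works for operations that are local to each strip (color adjustments, per-pixel filters); anything that needs neighboring pixels across strip boundaries would need overlapping strips.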


Nigel
Xequte Software
www.imageen.com

AndyColmes


Posted - Oct 16 2018 :  06:02:05
Hi Nigel,

Thank you for the suggestions. Do you think this would work: I downscale the bitmap, process it, and after processing upscale it back by the same resample percentage, as long as the resampling is done losslessly?

Andy

xequte


Posted - Oct 16 2018 :  22:01:03
Hi Andy

No, sorry, that is not what I meant. There is almost never a situation where upscaling is a good idea.

I meant that for analysis tasks, you perform them on a downscaled copy of your image. The downscaled copy is used only for the analysis and then discarded. By analysis, I mean tasks where you are only examining the image to determine something (average color, position of blobs, etc.). There are many analysis tasks where working with a downscaled copy gives equivalent results with improved performance. Of course, for a single analysis task, the time taken for the downscaling itself probably negates the performance gain.
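A minimal sketch of that pattern (untested; the Assign and Resample members and the rfLanczos3 filter constant are assumptions based on ImageEn's documented TIEBitmap interface):

```pascal
uses
  hyieutils;  // TIEBitmap, rfLanczos3 (ImageEn unit name assumed)

// Run analysis on a temporary half-size copy, then discard it.
procedure AnalyzeDownscaled(full: TIEBitmap);
var
  small: TIEBitmap;
begin
  small := TIEBitmap.Create;
  try
    small.Assign(full);                                         // duplicate
    small.Resample(full.Width div 2, full.Height div 2, rfLanczos3);
    // ... analysis (average color, blob positions, etc.) on "small" ...
  finally
    small.Free;                                                 // discard copy
  end;
end;
```

The full-size original is never modified, so no quality is lost; only the throwaway copy is resampled.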

Nigel
Xequte Software
www.imageen.com