From: Better Info
On Tue, 10 Aug 2010 08:44:25 +0100, Martin Brown
<|||newspam|||@nezumi.demon.co.uk> wrote:

>On 09/08/2010 19:35, Paul Furman wrote:
>> Ofnuts wrote:
>>>
>>> - In all cases, most of the damage occurs on the 1st save. The
>>> subsequent saves show very little difference from the first one, even
>>> at very low quality settings. Save steps beyond the third do not add any
>>> loss... The JPEG algorithm is "stable": the decoded values
>>> eventually get re-encoded the very same way.
>>
>> Interesting. Also worth noting: while an image remains open (in Photoshop,
>> at least), you can save as often as you like for backup and you won't
>> 'damage' the file further until you close it and open it again.
>
>A lot of programs do that: after a save they just rename the in-memory
>buffer, without reloading the image that results from the JPEG encode and
>decode cycle.
>
>This can be misleading, and I have seen people ruin images by overwriting
>an original with a lower-quality copy because they did not realise that
>what they saw on the screen did not reflect what was encoded in the file.
>Applications that show a zoomable preview of the encoded-and-decoded image
>along with a filesize estimate are better.
>
>Regards,
>Martin Brown

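Ofnuts' stability claim is easy to check for yourself. Below is a minimal
sketch using Pillow and NumPy (the test.jpg filename and quality=60 are
placeholders of my own, not anything from the thread): it repeatedly
encodes and decodes an image in memory and prints the mean pixel drift per
generation. If the algorithm really is stable, that number should collapse
to zero after the first couple of rounds.

from io import BytesIO

import numpy as np
from PIL import Image

img = Image.open("test.jpg").convert("RGB")
prev = np.asarray(img, dtype=np.int16)

for generation in range(1, 8):
    buf = BytesIO()
    # Re-encode the last decoded result, mimicking a save/close/reopen
    # cycle, but entirely in memory so no original gets clobbered.
    Image.fromarray(prev.astype(np.uint8)).save(buf, "JPEG", quality=60)
    cur = np.asarray(Image.open(BytesIO(buf.getvalue())).convert("RGB"),
                     dtype=np.int16)
    print(f"gen {generation}: mean |diff| = {np.abs(cur - prev).mean():.3f}")
    prev = cur

Doing the round trips with BytesIO keeps everything off the disk, which
also sidesteps the overwrite-your-original accident Martin describes above.
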
How about applications that let you set the JPEG compression level on any
layer or layout component individually, and edit it in real time while
watching the result in a preview, like PhotoLine? (Menu > Layout > Image >
JPG Compression.) Its "save-for-web" preview, with a three-panel comparison
of any two file types against the original and your choice of compression
methods, bit depths, and dithering options, plus the filesize of each, is a
separate built-in function. It has also supported 16-bit JPEG compression
(HD Photo/JPEG XR) for years now. Do any browsers support that yet? If so,
I might start using it.
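
And for anyone whose editor has no such preview, a rough stand-in can be
scripted with Pillow: encode in memory at a few quality settings, note the
resulting size, then decode those bytes back so you are looking at exactly
what would land on disk. Another sketch (again, the filename and quality
values are placeholders):

from io import BytesIO

from PIL import Image

original = Image.open("test.jpg").convert("RGB")

for quality in (90, 60, 30):
    buf = BytesIO()
    original.save(buf, "JPEG", quality=quality)
    print(f"quality={quality}: {buf.tell() / 1024:.1f} KB")
    # Re-open from the encoded bytes so the viewer shows the decoded
    # result, not the pristine buffer the editors above keep around.
    Image.open(BytesIO(buf.getvalue())).show(title=f"JPEG q={quality}")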