From: Ofnuts on
I wrote a short utility to check the usual claim that JPEG image quality
degrades with successive saves.

This utility saves an image multiple times, each time after making a
minor and very localized change to it. To rule out the possibility that
"convert" does something clever to minimize losses, the image is saved to a
lossless format (PNG) and then converted from PNG to JPEG. The resulting
image is then compared with the original image (diff0-*) and with the
result of the first step (diff1-*) (red pixels are the changed pixels).

The utility and the results of some runs are available here:

http://dl.free.fr/rjtMETz9h

The subdirectories provided are the results of running the utility over
the same image with JPEG quality 25, 50, 75, and 90.

Now for the interesting part. This dispels some misunderstandings:

- In all cases, most of the damage occurs on the 1st save. The
subsequent saves show very little difference from the first step, even
at very low quality settings. Save steps beyond the third do not add any
loss... The JPEG algorithm is "stable": the decoded values
eventually get re-encoded the very same way.
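For intuition, the convergence can be sketched with a toy model (Python here, not the Perl utility, and not the real codec: it ignores the DCT, subsampling, and inverse-transform rounding). JPEG's lossy step is, at heart, quantization, and quantization is idempotent: once values sit on the quantization grid, re-quantizing changes nothing.

```python
# Toy model of JPEG's lossy step: quantizing coefficients to a grid.
# Illustration only -- not the actual codec.

def quantize(coeff, q):
    """Round a coefficient to the nearest multiple of the step q."""
    return round(coeff / q) * q

def save_cycle(coeffs, q):
    """One simulated save: quantize every coefficient."""
    return [quantize(c, q) for c in coeffs]

original = [13.7, -41.2, 5.1, 88.9]   # hypothetical coefficients
gen1 = save_cycle(original, 10)        # first save: real loss
gen2 = save_cycle(gen1, 10)            # second save: nothing left to lose

print(gen1)          # [10, -40, 10, 90]
print(gen2 == gen1)  # True
```

In the real codec the DCT and colour conversions add some jitter around this fixed point, which is why it can take a few saves rather than exactly one before the output stops changing.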

- The amount of "damage" is very low at reasonable quality settings (75
or above). To get an experimental "feel":

-- load the original image and the result of any step in a photo
editing application that supports layers
-- obtain the "difference" between the two layers
-- the resulting image looks uniformly black to the naked eye
-- use a "threshold" transform and lower the threshold value until
recognizable patterns appear (besides the marker dots at top left)
-- At 90 quality, using the result of the 10th step, the first white
pixel shows up at 20 (an artefact at the lower border, due to the picture
height not being a multiple of 8); the first pixel inside the image proper
appears at 11.
-- At 75 quality, the difference produces a recognizable ghost of the
linnet. The threshold method shows that most differences are below 20.
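The difference-then-threshold check above can be sketched on plain pixel lists (the actual images and file names from the experiment are not reproduced here; the pixel values below are hypothetical):

```python
# Sketch of the layer-difference + threshold procedure described above,
# on flat lists of pixel values instead of real image files.

def difference(a, b):
    """Per-pixel absolute difference of two equal-sized 'images'."""
    return [abs(x - y) for x, y in zip(a, b)]

def threshold(img, t):
    """White (255) where the value reaches t, black (0) elsewhere."""
    return [255 if v >= t else 0 for v in img]

original = [120, 121, 200, 55]
resaved  = [121, 118, 200, 66]   # hypothetical decoded values

diff = difference(original, resaved)   # [1, 3, 0, 11]
print(threshold(diff, 20))  # all black: no pixel differs by 20 or more
print(threshold(diff, 11))  # lowering the threshold: first white pixel
```

Lowering t until white pixels appear tells you the magnitude of the largest per-pixel error, which is what the 20 and 11 figures above measure.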

Disclaimers:

- Global image changes (white balance, contrast, colors) are a whole
different matter, not addressed here (though, IMHO, the problem with JPEG
in these operations is more the 8-bit-per-channel limit it puts on the
picture, which in turn leads to a comb-like histogram)

- The original JPEG uses 1:1:1 sub-sampling (i.e., full-resolution
chroma), and so does 'convert' by default.

-- Unless reproduced by different means, these results only apply when
the same software is used throughout.

Note: you can run the utility yourself if you have Perl and the
ImageMagick toolkit installed. It was written, tested and run on WinXP,
so I make no promises for other OSes; but if you are running one of
those, you should be able to fix any problems yourself.

--
Bertrand

From: Paul Furman on
Ofnuts wrote:
>
> - In all cases, most of the damage occurs on the 1st save. The
> subsequent saves show very little difference from the first step, even
> at very low quality settings. Save steps beyond the third do not add any
> loss... The JPEG algorithm is "stable": the decoded values
> eventually get re-encoded the very same way.

Interesting. Also worth noting: while an image remains open (in Photoshop
at least), you can save as much as you like for backup, and it won't
'damage' the image until you close it and open it again.
From: Ofnuts on
On 09/08/2010 20:35, Paul Furman wrote:
> Ofnuts wrote:
>>
>> - In all cases, most of the damage occurs on the 1st save. The
>> subsequent saves show very little difference from the first step, even
>> at very low quality settings. Save steps beyond the third do not add any
>> loss... The JPEG algorithm is "stable": the decoded values
>> eventually get re-encoded the very same way.
>
> Interesting. Also worth noting: while an image remains open (in Photoshop
> at least), you can save as much as you like for backup, and it won't
> 'damage' the image until you close it and open it again.

This is true of all software, AFAIK. But nitpickers will say your
intermediate version isn't really a backup. Consider:

Original (V0) -> local change 1 -> Save as V1 -> local change 2 -> save
as V2 (V2 is produced directly from V0; round-off errors occur only once).

vs:

Original (V0) -> local change 1 -> Save as V1 -> reload V1 -> local
change 2 -> save as V2 (V2 is produced from V1; round-off errors occur
twice).
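The difference between the two workflows can be sketched with a toy quantization model of a lossy save (an illustration only, not the actual codec; values and the quantization step are made up):

```python
# Keep-it-open vs save-and-reload: rounding once vs rounding twice.

def lossy_save(values, q=10):
    """Model one JPEG save as rounding onto a quantization grid."""
    return [round(v / q) * q for v in values]

def edit(values, delta):
    """A 'local change': shift every value a little."""
    return [v + delta for v in values]

v0 = [13.0, 27.0, 41.0]

# Image stays open: both edits accumulate on v0, one save at the end.
v2_open = lossy_save(edit(edit(v0, 3.0), 3.0))

# Save and reload between edits: the grid is applied twice.
v1 = lossy_save(edit(v0, 3.0))
v2_reload = lossy_save(edit(v1, 3.0))

print(v2_open)    # [20, 30, 50] -- rounded once
print(v2_reload)  # [20, 30, 40] -- rounded twice, drifts further
```

In the reload workflow the second edit starts from already-rounded values, so its own rounding can land on the wrong grid point, as the last element shows.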

And it is even worse with global changes (not speaking of losing
selections, masks, layers...).

--
Bertrand
From: Martin Brown on
On 07/08/2010 21:42, Ofnuts wrote:
> I wrote a short utility to check the usual claim that JPEG image quality
> degrades with the successive saves.
>
> This utility saves an image multiple times, each time after making a
> minor and very localized change to it. To avoid suspecting that
> "convert" does it cleverly to minimize losses, the image is saved to a
> lossless format (PNG) and then converted from PNG to JPEG. The resulting
> image is then compared with the original image (diff0-*), and with the
> result of the first step (diff1-*) (red pixels are the changed pixels).
>
> The utility and the results of some runs are available here:
>
> http://dl.free.fr/rjtMETz9h
>
> The subdirectories provided are the results of running the utility over
> the same image with JPEG quality 25, 50, 75, and 90.
>
> Now for the interesting part. This dispels some misunderstandings:
>
> - In all cases, most of the damage occurs on the 1st save. The
> subsequent saves show very little difference from the first step, even
> at very low quality settings. Save steps beyond the third do not add any
> loss... The JPEG algorithm is "stable": the decoded values
> eventually get re-encoded the very same way.

This is basically correct. The coefficients for each 8x8 (or 16x16) block
usually converge onto an attractor in 5-10 cycles, or may bounce between
a few closely related versions cyclically. I am not sure I would be so
bold as to call it stable, but it mostly orbits the same attractor,
giving a series of very similar-looking images that may repeat with a
short period.

Serious damage tends to be caused mostly by the chroma subsampling
routine, which averages the chroma of YCbCr colour over 4x4 blocks, and
by certain boundary-condition errors in the classical reconstruction
methods.
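Why subsampling is irreversible can be shown in a few lines: averaging a block of chroma samples and replicating the average back cannot recover the detail. The sketch below uses a 2x2 block for brevity (the common 4:2:0 layout; other block sizes lose information the same way), with made-up chroma values:

```python
# Chroma subsampling round-trip: average on encode, replicate on decode.

def subsample(block):
    """Replace a block of chroma samples by their average."""
    return sum(block) / len(block)

def upsample(avg, n):
    """Replicate the stored average back over the block."""
    return [avg] * n

chroma = [100, 100, 100, 180]   # one sharp chroma edge in a 2x2 block
avg = subsample(chroma)          # 120.0 -- this is all the file stores
restored = upsample(avg, 4)      # [120.0] * 4: the edge is gone

print(restored)
print(max(abs(a - b) for a, b in zip(chroma, restored)))  # 60.0
```

This loss happens once, on the first subsampled save; but unlike quantization error it hits every colour edge, which is why full-chroma saves compare so much better on test patterns.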

> - The amount of "damage" is very low at reasonable quality settings (75
> or above). To get an experimental "feel":
>
> -- load the original image and the result of any step in a photo editing
> application that supports layers
> -- obtain the "difference" between the two layers
> -- the resulting image looks uniformly black to the naked eye
> -- use a "threshold" transform and lower the threshold value until
> recognizable patterns appear (besides the marker dots at top left)
> -- At 90 quality, using the result of the 10th step, the first white
> pixel shows up at 20 (an artefact at the lower border, due to the picture
> height not being a multiple of 8); the first pixel inside the image proper
> appears at 11.
> -- At 75 quality, the difference produces a recognizable ghost of the
> linnet. The threshold method shows that most differences are below 20.

A long while ago I did one based on an 8x8 test pattern designed to
distress the JPEG algorithm. The results are at:

http://www.nezumi.demon.co.uk/photo/jpeg/2/jpeg2.htm

The difference between chroma-subsampled JPEG saves (the default in most
applications) and full-chroma JPEG is very significant. A lot of
information is lost in the chroma subsampling and upsampling steps.

The zoomed versions don't look good in modern browsers that smooth
upscaled images: they are 8x8-pixel blocks that should have sharp edges.
>
> Disclaimers:
>
> - Global image changes (white balance, contrast, colors) are a whole
> different matter, not addressed here (though, IMHO, the problem with JPEG
> in these operations is more the 8-bit-per-channel limit it puts on the
> picture, which in turn leads to a comb-like histogram)
>
> - The original JPEG uses 1:1:1 sub-sampling (i.e., full-resolution
> chroma), and so does 'convert' by default.

Full chroma sampling is very much better at preserving image integrity
than subsampled chroma (but the latter produces considerably smaller
files). PSPro 8 manages to do both incorrectly, resulting in patterns in
the sky (and other artefacts that can be demonstrated on simple test
cases).
>
> -- Unless reproduced by different means, these results only apply when
> the same software is used throughout.

And only when you use exactly the same quality settings for every save.

I agree though that JPEG is blamed for a lot of things that are not its
fault. You can encode graphics line art quite successfully with the
right choice of Q and full chroma sampling. The algorithm is optimised
for photographic images but it is not limited to them. PNG is usually
more compact for line art but not always.

Regards,
Martin Brown
From: Martin Brown on
On 09/08/2010 19:35, Paul Furman wrote:
> Ofnuts wrote:
>>
>> - In all cases, most of the damage occurs on the 1st save. The
>> subsequent saves show very little difference from the first step, even
>> at very low quality settings. Save steps beyond the third do not add any
>> loss... The JPEG algorithm is "stable": the decoded values
>> eventually get re-encoded the very same way.
>
> Interesting. Also worth noting: while an image remains open (in Photoshop
> at least), you can save as much as you like for backup, and it won't
> 'damage' the image until you close it and open it again.

A lot of programs do that by just renaming the in-memory buffer, without
reloading the image that results from the JPEG encode and decode cycle.

This can be misleading, and I have seen people ruin images by overwriting
an original with a lower-quality copy, because they did not realise that
what they saw on the screen did not reflect what was encoded in the file.
Applications that show a zoomable preview of the encoded-and-decoded
image along with a filesize estimate are better.

Regards,
Martin Brown