From: Raphael Bustin on
On Sat, 22 Apr 2006 18:04:57 -0400, "Alan Meyer" <ameyer2(a)yahoo.com>
wrote:


>After all the controversy I went back and looked at the images
>I've been making and saving - both from my scanner and from
>my digital camera. What follows are my __subjective__
>impressions. I'm not claiming that everyone else will perceive
>things the way I do, but I bet a lot of people will see the same
>things.
>
>My normal technique is to save using JPEG at compression
>levels of between 10:1 and 20:1. I arrived at those numbers by
>trying numerous different compression levels from 3:1 up to
>100:1, on a number of different images from high detail to low
>detail subjects, and comparing the images. The actual
>compression I use depends on settings in my scanner, my
>camera, or in the GIMP, which produce results that I liked.
>The output of these settings is typically between 10 and 20
>to 1, depending on the detail found in the image.


Compression ratios of 10:1 and 20:1 are not characteristic
of best-quality JPG.

What Scott and I are talking about is what Photoshop used
to call "Quality 10" and what it now calls "Quality 12".
XnView calls it a 10 also.

At these settings, JPG files are typically about 0.4x the
size of the corresponding 24-bit TIF.

At 10:1 or 20:1 compression, artifacts will be apparent.
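For reference, here's a back-of-the-envelope sketch (plain Python, with made-up but plausible scan dimensions) of how these ratios relate file size to pixel count:

```python
# Back-of-the-envelope check of compression ratios: compare a JPEG's
# file size against the uncompressed 24-bit RGB size of the same pixels.
def compression_ratio(width, height, jpeg_bytes):
    """Uncompressed 24-bit size divided by the JPEG file size."""
    uncompressed = width * height * 3  # 3 bytes (R, G, B) per pixel
    return uncompressed / jpeg_bytes

# A 3000 x 2000 scan is 18 MB uncompressed; a 1.8 MB JPEG of it is 10:1.
print(compression_ratio(3000, 2000, 1_800_000))   # 10.0
# A best-quality JPEG at 0.4x the TIF size is only 2.5:1.
print(compression_ratio(3000, 2000, 7_200_000))   # 2.5
```

So "Quality 12" territory and 10:1-20:1 territory are a factor of four to eight apart in file size.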


rafe b
www.terrapinphoto.com
From: Don on
On Sat, 22 Apr 2006 18:04:57 -0400, "Alan Meyer" <ameyer2(a)yahoo.com>
wrote:

>After all the controversy I went back and looked at the images
>I've been making and saving - both from my scanner and from
>my digital camera. What follows are my __subjective__
>impressions. I'm not claiming that everyone else will perceive
>things the way I do, but I bet a lot of people will see the same
>things.

Only the ones that keep an open mind!

>If I magnified my images 4 times, (double each dimension), I
>could often see JPEG digital artifacts - i.e., compression squares.

Using higher compression rates (i.e. lower quality setting) will of
course accentuate the artifacts.

Actually, I usually suggest that as a good way to train one's eye to
spot them. In other words, do a series of JPGs with different
compression settings and then cycle through *overlaid* (!) images.
Going from lowest quality to the original TIF will then identify
problem areas. Once you know what the artifacts look like, it will be
easier to spot them elsewhere without going through this process.
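The exercise above can be sketched in code. Encoding real JPEGs needs an imaging library (Pillow, say), so this self-contained toy uses a crude value quantizer as a stand-in for lossy compression; the point is the diff step, which shows where the loss concentrates:

```python
# Toy version of the "overlay and diff" training exercise. A crude
# quantizer stands in for JPEG compression; diffing against the
# original highlights where the loss lands.
def lossy(pixels, step):
    """Quantize 0-255 values to multiples of `step` (coarser = lossier)."""
    return [min(255, (p // step) * step + step // 2) for p in pixels]

def diff(a, b):
    """Per-pixel absolute difference -- bright spots mark artifacts."""
    return [abs(x - y) for x, y in zip(a, b)]

original = list(range(0, 256, 8))   # a smooth gradient strip
low_q = lossy(original, 32)         # heavy "compression"
high_q = lossy(original, 4)         # light "compression"

# Cycling from low_q up toward the original, the diff shrinks toward
# zero -- the worst-diff regions are where to train your eye.
print(max(diff(original, low_q)), max(diff(original, high_q)))  # 16 2
```

With a real image viewer you'd do the same thing visually: flip between the layers and watch which areas change.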

>So, I will go even further than some of the JPEG defenders have
>gone in this thread and say that some of us don't even require
>the highest quality JPEGs. For some of us (I suspect for many
>of us) something less than that will work just as well.

Absolutely! Like I said, I myself use JPGs (and not even top quality!)
when exchanging snapshots with friends, for example.

>However, I bet that at least some of the people that insist on
>these high quality images are kidding themselves.

No, that's wrong, as I explained earlier. You're making assumptions
about how they use the images and about their intentions.

What you see now is not the only consideration. For example, when
monitor size or bit-depth change (and they will!) or you get a new
printer or the print fades (and it will!) your archived JPGs will not
use all of the capabilities of these new devices.

In other words, if you archive your images as JPGs they will look
murky on an HDTV. Those who archived at full resolution and bit depth
will be able to use (i.e. see) the full dynamic range of an HDTV.
Therefore, the only way to preserve all information is to archive at
native resolution and bit depth.
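The bit-depth half of that point fits in a few lines (the sample values are illustrative):

```python
# Why an 8-bit archive can't be "stretched" later: dropping 16-bit
# scanner data to 8 bits merges 256 adjacent levels into one, and no
# future display or printer can recover the distinction.
def to_8bit(v16):
    """Reduce a 16-bit sample (0-65535) to 8 bits (0-255)."""
    return v16 >> 8

a, b = 5000, 5100              # two distinct 16-bit shadow tones
print(to_8bit(a), to_8bit(b))  # both collapse to level 19
```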

Now, if they don't care and consider murky 8-bit JPGs on HDTV "just
fine" then, of course, there's nothing wrong with that.

But what is wrong is to use that subjective preference and project it
by claiming that others archiving at full resolution and bit-depth are
kidding themselves. They are not because they are saving real data.

>It's easy to get wrapped around the axle on a few aspects
>of image quality and lose sight of the big picture of what
>you're really trying to do.

Or, as I like to say, context. That's why I step back every now and
then and ask myself if I'm going too far. But that's a subjective call,
and although I may share my conclusions to help others decide, the
decision is up to each person to make for themselves.

But, having said that, objective facts are what they are. What matters
is how we deal with them. Being aware of objective facts and consciously
lowering one's requirements is one thing, but closing one's eyes in
ignorance is quite another.

Don.
From: Scott W on
Don wrote:

> Or, as I like to say, context. That's why I step back every now and
> then and ask myself if I'm going too far. But that's a subjective call
> and although I may share the conclusions to help others make a
> decision this decision is up to each person to make for themselves.
>
> Don.

Don, how can you stay in this thread without posting your image?
You keep going on like you know what you are talking about, but as far
as I know you have not pulled back from your statement that even at the
highest quality a JPEG image viewed next to the original will stick out
like a sore thumb.

Why oh why, Don, do you find it impossible to post this image?

Is this really so hard for you?

As long as you maintain that level 12 JPEGs from Photoshop have visible
JPEG artifacts, it is kind of hard to talk sensibly about lower quality
levels.


Scott

From: Alan Meyer on

"Raphael Bustin" <foo(a)bar.com> wrote in message
news:sqbl4251p9lj47l60viahpte77b852cd8t(a)4ax.com...
> ...
> At 10:1 or 20:1 compression, artifacts will be apparent.

Maybe. Maybe not.

If I magnify the images I can see the artifacts. But when
I look at them at 1:1 magnification, I can't. That is acceptable
to me because I always scan or shoot images at somewhat
higher resolution than I am planning to use, and then view
or print them at 1:1 or, typically, less than that.
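That habit has a simple arithmetic basis. Here's a one-dimensional sketch (toy numbers, not real scan data) of why downsampling an oversized scan smooths out compression noise:

```python
# Sketch of why viewing below native size hides artifacts: averaging
# adjacent pixels while downsampling cuts the amplitude of
# pixel-level compression noise.
def downsample_2x(row):
    """Average adjacent pairs in a 1-D strip of pixel values."""
    return [(row[i] + row[i + 1]) // 2 for i in range(0, len(row) - 1, 2)]

noisy = [100, 116, 100, 84]    # flat tone of 100 with +/-16 "ringing"
print(downsample_2x(noisy))    # [108, 92] -- error halved to +/-8
```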

Don suggests that I might not see them now on my equipment,
but might see them on a better monitor in the future. He
might be right.

He also suggests that, even though I don't see them now,
I might see them if I trained myself to look for them. He
might very well be right about that too.

But bear in mind that I'm not selling the images I make
or publishing them in books. I'm producing them only for
myself and my family. To me, they look virtually indistinguishable
from images made at lower compressions. I say "virtually"
because it may happen once in a while that I can spot a
detail that's different if I look _very_ closely. In general,
it seems to me that such detail flaws are less
significant in image quality than a dust spot on a scan, and
much less significant than an out of focus or motion
blurred region in a photograph.

Maybe I'm not demanding enough. I don't own a high-def
TV. My stereo system is less than the best. When I
painted the walls in my house, I didn't go ballistic if I got
a speck of paint on the floor or the ceiling. My desk is
a beat up thing with chips in it. My computer monitor is not
100% tack sharp in the corners.

Do I need my scans and photos to be not just virtually
indistinguishable from the best scans and photos, but
also actually indistinguishable on the best possible
monitor to the most highly trained eye?

I understand that many of the people who participate in
this newsgroup are professionals. They are selling
services to customers or scanning for employers, and
feel quite rightly obligated to produce the best quality
that they can. The work they turn out may be published
and may be held up for inspection.

But the original poster sounded more like a guy like me,
someone who was scanning for a family archive.

In the photography world, most consumers today are
buying point and shoot cameras at around 5 MP, and
using the default JPEG compression ratios, typically
in the 10:1 - 20:1 range that I'm using. Most of those
consumers are pretty happy with the results they're getting
and wouldn't really be happier with higher quality equipment.
Or at any rate, they're not voting with their dollars, euros,
pounds, and yen for higher quality equipment.

I suspect that most of those consumers are being pretty
well served by what they're getting - which is how I feel
about what I'm getting.

Alan


From: Raphael Bustin on
On Sun, 23 Apr 2006 23:06:44 -0400, "Alan Meyer" <ameyer2(a)yahoo.com>
wrote:

>
>"Raphael Bustin" <foo(a)bar.com> wrote in message
>news:sqbl4251p9lj47l60viahpte77b852cd8t(a)4ax.com...
>> ...
>> At 10:1 or 20:1 compression, artifacts will be apparent.
>
>Maybe. Maybe not.
>
>If I magnify the images I can see the artifacts. But when
>I look at them at 1:1 magnification, I can't. That is acceptable
>to me because I always scan or shoot images at somewhat
>higher resolution than I am planning to use, and then view
>or print them at 1:1 or, typically, less than that.
>
>Don suggests that I might not see them now on my equipment,
>but might see them on a better monitor in the future. He
>might be right.



Alan, my argument isn't with you at all.

I'm simply saying that my defense of JPG as
"virtually lossless" extends only to *minimal*
JPG compression.

It's not my business to tell you what the "proper"
quality or compressions are. That's your choice
entirely. No need to justify yourself -- you're using
JPG exactly as it was intended.

Personally, I wouldn't use JPG at all (for important
images) if there were *any* visible image degradation.

At *minimal* JPG compression, I expect virtually-perfect
results. Good enough for high-res scan samples,
in fact.


rafe b
www.terrapinphoto.com