From: nospam on 19 Apr 2010 22:40
In article <4bcd106d$0$1674$742ec2ed(a)news.sonic.net>, Ray Fischer
> Look at DPReview's discussion of Sigma's cameras. I can't provide the
> link right now.
their tests show aliasing at about the same point that bayer stops
resolving. in other words, not much difference. aliasing is not
From: Ray Fischer on 19 Apr 2010 22:44
nospam <nospam(a)nospam.invalid> wrote:
>In article <4bcd106d$0$1674$742ec2ed(a)news.sonic.net>, Ray Fischer
>> Look at DPReview's discussion of Sigma's cameras. I can't provide the
>> link right now.
>their tests show aliasing at about the same point that bayer stops resolving.
You do know why those are two unrelated issues?
From: nospam on 19 Apr 2010 22:52
In article <4bcd1523$0$1674$742ec2ed(a)news.sonic.net>, Ray Fischer
> >> Look at DPReview's discussion of Sigma's cameras. I can't provide the
> >> link right now.
> >their tests show aliasing at about the same point that bayer stops resolving.
> You do know why those are two unrelated issues?
since nothing can resolve close to nyquist, you either get aliasing or
nothing, depending on whether there is an anti-alias filter or not.
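To put a number on that "aliasing or nothing" claim, here is a minimal numpy sketch (my own illustration, not from the thread): once sampled, a sine above the Nyquist frequency is indistinguishable from a lower-frequency sine, so fine detail either folds down into false coarse detail (aliasing) or is removed beforehand by the AA filter.

```python
import numpy as np

# Sampling at 8 samples/unit puts the Nyquist limit at 4 cycles/unit.
fs = 8
x = np.arange(32) / fs

# A 7-cycle signal is above Nyquist; it folds down to |7 - 8| = 1 cycle.
above_nyquist = np.sin(2 * np.pi * 7 * x)
folded = np.sin(-2 * np.pi * 1 * x)  # the 1-cycle alias (sign flipped)

# The sampled values are identical: the fine detail is not merely lost,
# it reappears disguised as coarse detail. An anti-alias filter removes
# the 7-cycle component before sampling, leaving "nothing" instead.
print(np.allclose(above_nyquist, folded))  # True
```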
From: nospam on 20 Apr 2010 15:56
In article <s0nzn.137533$gF5.19566(a)newsfe13.iad>, Martin Brown
> >>>> There are a handful of cases where the Foveon sensor might give a better
> >>>> image and one of those is when photographing fine black detail on
> >>>> saturated red or blue flowers. Rest of the time it is all marketing.
> >>> black detail on saturated colours should be ok with bayer because
> >>> there's a big luminance difference and bayer generally gets that right.
> >> It isn't the Bayer mask that fails in this particular case it is the
> >> chroma subsampling.
> > bayer samples colour at half the rate of luminance. humans can't see
> > the difference, but some think they can (like those who can 'hear'
> > differences in speaker cables).
> That is a caricature of the situation.
no it isn't. chrominance is sampled at half the rate of luminance. the
eye can only resolve chroma about 1/10th as well as luma, so bayer
captures more chroma resolution than people can actually see.
> It really does depend critically
> on what you are trying to image. I actually ran into the problem in a
> real world situation. Photographing the transit of Mercury with an
> H-alpha filter so that prominences were also visible. The planet should
> have been a disc but it was obviously distorted to oval.
that sounds more like a lens aberration than a bayer issue, but in any
event, that's not really a common scenario.
> >> Try it on a test chart with a Wratten 25 filter and
> >> you will see what I mean. There will be a factor of 2 difference in the
> >> effective resolution between horizontal and vertical in the red channel.
> > if you filter only red, you will reduce the resolution of the sensor
> > but that's not a real world scenario.
> See above. Astronomers do have filters that are very precisely
> monochromatic. Most times we use unfiltered monochrome sensors too.
if they're using a monochromatic filter, why aren't they using a
monochrome sensor? you *will* get worse results if you put a deep
red filter on a bayer sensor. it wasn't designed for that use.
foveon might be a good choice if you discard the colour info (sum the 3
layers), but a true monochrome sensor (no bayer filter array at all)
would be even better.
> >> Foveon save their images as fully chroma sampled JPEGs so the issue of
> >> errors in the subsampled chroma decoding approximations do not arise.
> > that's a plus but humans can't tell the difference except in extreme
> > (i.e., not real-world and contrived) cases.
> Red flowers with fine black veins like poppies and tulips is one such
> real world case.
that should be fine on bayer because there's a dramatic luminance
difference between red and black.
> >> The problem arises later in the imaging chain. Bayer sensor struggles a
> >> bit with a pure red (or pure blue) monochrome images because it has
> >> fewer independent pixels.
> > true but nothing in this world is 'pure saturated red' (or blue or
> > green). even bright red objects have a little blue or green in them.
> It only has to be close enough that most of the luminance is in the red
> channel. Red flowers will cause the problem as do some jazz concerts
> with blue and red spotlights.
there's still plenty of blue/green under a red spotlight. not
everything is going to be pure red. it won't be as good as it would be
under white light, but it's not as bad as you make it out to be.
> >> Normally the luminance channel is able to hide
> >> these defects, but when the situation arises where the luminance channel
> >> is corrupted by the chroma channels then you lose detail.
> > and that only happens in edge cases, like red/blue test charts. that's
> > why the foveon fans love those tests, despite it not being relevant to
> > real world photography.
> Having the sharp edges mangled by subsampling faults sticks out like a
> sore thumb in the handful of cases where it is relevant.
not when it's chroma being subsampled. try it in photoshop. convert the
image to lab and blur the ab channels. you only need to blur it with a
2 pixel radius to simulate bayer, but you can crank it to 5 pixels and
not notice a difference.
then blur the luminance and it will be noticeable even with the radius
set to a fraction of a pixel.
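That photoshop experiment is easy to reproduce in code. The sketch below is my own, using numpy/scipy and a BT.601-style YCbCr split rather than Lab (any luma/chroma separation shows the same effect): a synthetic image with fine detail in luminance and a smooth colour wash in chroma, then a heavy chroma blur compared against a slight luma blur.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# BT.601-style luma/chroma transform, standing in for Photoshop's Lab.
def rgb_to_ycc(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.169 * r - 0.331 * g + 0.500 * b
    cr =  0.500 * r - 0.419 * g - 0.081 * b
    return np.stack([y, cb, cr], axis=-1)

def ycc_to_rgb(ycc):
    y, cb, cr = ycc[..., 0], ycc[..., 1], ycc[..., 2]
    return np.stack([y + 1.402 * cr,
                     y - 0.344 * cb - 0.714 * cr,
                     y + 1.772 * cb], axis=-1)

# Synthetic image: fine detail lives in luma, chroma is a smooth wash
# (which is how most real scenes behave).
yy, xx = np.mgrid[0:64, 0:64].astype(float)
ycc = np.stack([0.5 + 0.25 * np.sin(2 * np.pi * xx / 4),  # fine luma lines
                0.15 * np.sin(2 * np.pi * xx / 64),       # smooth chroma
                0.15 * np.cos(2 * np.pi * yy / 64)], axis=-1)
img = ycc_to_rgb(ycc)

def blur_plane(img, plane, sigma):
    ycc = rgb_to_ycc(img)
    ycc[..., plane] = gaussian_filter(ycc[..., plane], sigma)
    return ycc_to_rgb(ycc)

chroma5 = blur_plane(blur_plane(img, 1, 5.0), 2, 5.0)  # heavy chroma blur
luma_slight = blur_plane(img, 0, 0.7)                  # slight luma blur

# RMS error back in RGB: the slight luma blur does far more damage
# than the 5-pixel chroma blur.
rms = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))
print(rms(img, chroma5), rms(img, luma_slight))
```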
> Claims made for Foveon are completely OTT but there is a small element
> of truth in their claims to better colour fidelity. But my point here is
> that JPEG decoders are suboptimal in the standard implementation.
foveon does have higher chroma resolution; however, it's less accurate,
and the eye can't see the difference anyway.
From: Ray Fischer on 20 Apr 2010 23:18
Alfred Molon <alfred_molon(a)yahoo.com> wrote:
>In article <190420101220067533%nospam(a)nospam.invalid>, nospam says...
>> > Luminance requires all *three* colour components. If you do not capture
>> > all three colour components at each pixel, you do not capture luminance
>> > at each pixel.
>> you may not capture full luminance but you do capture enough
>> information to calculate the correct value. the system works.
>Bayer does not capture the luminance at the pixel level, period.
And yet there are billions of photos captured with Bayer sensors that
have luminance information in them.
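To make "calculate the correct value" concrete, here is a minimal sketch of my own (synthetic smooth scene, naive bilinear demosaic; not any real camera's pipeline): each photosite records one colour, the missing two are interpolated from neighbours, and luminance computed from the result tracks the true luminance closely wherever the scene is reasonably smooth.

```python
import numpy as np
from scipy.ndimage import convolve

# Smooth synthetic RGB scene with values in roughly [0, 1].
yy, xx = np.mgrid[0:32, 0:32].astype(float)
scene = np.stack([0.5 + 0.3 * np.sin(2 * np.pi * xx / 32),
                  0.5 + 0.3 * np.cos(2 * np.pi * yy / 32),
                  0.4 + 0.2 * np.sin(2 * np.pi * (xx + yy) / 32)], axis=-1)

# RGGB Bayer mosaic: every photosite samples exactly one channel.
masks = np.zeros_like(scene)
masks[0::2, 0::2, 0] = 1   # R
masks[0::2, 1::2, 1] = 1   # G
masks[1::2, 0::2, 1] = 1   # G
masks[1::2, 1::2, 2] = 1   # B
mosaic = (scene * masks).sum(axis=-1)

# Naive bilinear demosaic: fill each channel's gaps from its neighbours.
k_rb = np.array([[1., 2, 1], [2, 4, 2], [1, 2, 1]]) / 4
k_g  = np.array([[0., 1, 0], [1, 4, 1], [0, 1, 0]]) / 4
demosaiced = np.stack(
    [convolve(mosaic * masks[..., 0], k_rb, mode='mirror'),
     convolve(mosaic * masks[..., 1], k_g,  mode='mirror'),
     convolve(mosaic * masks[..., 2], k_rb, mode='mirror')], axis=-1)

def luminance(rgb):
    return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]

# Away from the borders the reconstructed luminance matches the truth.
err = np.abs(luminance(demosaiced) - luminance(scene))[2:-2, 2:-2].max()
print(err)  # small for this smooth scene (well under 0.02)
```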