From: nospam on
In article <MPG.263a0831ff8f127898c2a9(a)news.supernews.com>, Alfred
Molon <alfred_molon(a)yahoo.com> wrote:

> > Mismatched spectral response is one of the reasons why the Foveon
> > concept isn't as good as its supporters claim. Whilst you do get full
> > colour pixels, and hence increased resolution over a similar pixel count
> > BFA camera, the response is a poor match to the eye. Foveon's highest
> > response is to blue, then green and then red.
>
> If so, wouldn't it be sufficient to multiply the blue channel by a
> factor < 1.0?

no

> Or are you claiming that cameras with a prism and three separate sensors
> (R, G and B), are a bad solution because they do not match the human
> eye's sensitivity?

a 3 chip camera is rgb.

foveon isn't rgb, despite the misleading ads. the three layers have to
be converted to rgb.
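
roughly like this, a toy sketch in python (the matrix numbers are made
up for illustration, not foveon's real calibration):

import numpy as np

# hypothetical per-pixel signals from the top, middle and bottom layers
layers = np.array([0.62, 0.48, 0.35])

# illustrative 3x3 conversion matrix; a real camera derives its matrix
# from spectral calibration of the silicon absorption depths
layer_to_rgb = np.array([
    [ 1.8, -0.9,  0.1],
    [-0.4,  1.6, -0.3],
    [ 0.1, -0.5,  1.7],
])

rgb = layer_to_rgb @ layers   # mix the layer signals into r, g, b
print(rgb.clip(0.0, 1.0))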
From: Martin Brown on
Bubba wrote:
> On Apr 21, 12:50 pm, Chris Malcolm <c...(a)holyrood.ed.ac.uk> wrote:
>> CMOS has an extra on-chip wiring overhead which matters a lot in very
>> tiny dense sensors, and not much on big DSLR sensors. The new
>> back-illuminated CMOS sensors remove that CMOS overhead (to the dark
>> side of the chip) and are bringing CMOS sensors into low-end P&S and
>> phone cameras.
>
> Does this mean (what someone reviewing a Canon SX1 said) that the CMOS
> sensors on $400--$500 P&S are "small" and (I suppose) mediocre? How
> can you tell which CMOS sensor a particular camera has, and whether
> it's any good?

You could read the manufacturer's specification datasheet or practical
reviews of the cameras. The big thing about CMOS sensors is that they
can be *manufactured* on standard chip fab lines. CCD requires a
different, more sophisticated process and until very recently gave
superior results in terms of uniformity. Now CMOS has caught up.

CMOS has had a lot more development money spent on it, as it allows
mass-produced camera-on-a-chip solutions for webcams and security cameras.

http://www.rps-isg.org/wordpress/?p=70

is a reasonable basic-level introduction (and see the references therein).
>
>> You didn't know that Bayer sensors have twice as many green pixels?
>
> No. That's why I post questions here. Why in God's name would they
> have twice as many? I haven't bought a new digital camera in three

Because they know what they are doing! The eye is most sensitive to
green, and the green channel carries most of the luminance (i.e. fine
detail) information, so devoting half the sites to green is the sensible
trade-off. All this stuff is well documented; do some proper background
reading, starting with:

http://en.wikipedia.org/wiki/Bayer_mask
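
If you want to see the layout concretely, here is a toy sketch in Python
(nothing camera-specific) of the standard RGGB mosaic; in every 2x2 cell
there are two green sites, one red and one blue:

import numpy as np

def bayer_mask(height, width):
    """Label the photosites of an RGGB Bayer mosaic."""
    mask = np.empty((height, width), dtype='<U1')
    mask[0::2, 0::2] = 'R'   # red on even rows, even columns
    mask[0::2, 1::2] = 'G'   # green appears in every row
    mask[1::2, 0::2] = 'G'   # and in every column
    mask[1::2, 1::2] = 'B'   # blue on odd rows, odd columns
    return mask

m = bayer_mask(4, 4)
print(m)
print({c: int((m == c).sum()) for c in 'RGB'})   # {'R': 4, 'G': 8, 'B': 4}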

> years, because of something I read here back then about my particular
> problem--red channel flare--not ever improving in digital photography
> unless (three years ago) you could afford a splendiferously expensive
> camera ($+++Ks).

You really do need to post an *example* of this mythical "red flare" you
keep harping on about. It seems to me that you never managed to
understand how to use your original camera; probably an RTFM fault, given
your postings here.

Regards,
Martin Brown
From: Chris Malcolm on
nospam <nospam(a)nospam.invalid> wrote:
> In article <4bcd1523$0$1674$742ec2ed(a)news.sonic.net>, Ray Fischer
> <rfischer(a)sonic.net> wrote:

>> >> Look at DPReview's discussion of Sigma's cameras. I can't provide the
>> >> link right now.
>> >
>> >their tests show aliasing at about the same point that bayer stops
>> >resolving.
>>
>> You do know why those are two unrelated issues?

> since nothing can resolve close to nyquist, you either get aliasing or
> nothing, depending if there is an anti-alias filter or not.

It's not all or none.

AA filters aren't perfect, and camera makers choose different strengths
of AA filter as different compromises between the complete absence of
any aliasing and permitting some aliasing. The point of the compromise is
that a weaker AA filter will capture a little more fine detail at the
cost of allowing some aliasing to appear. A small amount of aliasing will
only be noticeable in certain specific kinds of images. So by weakening
the AA filter you could get finer detail of tree leaves in shots of
trees, and you wouldn't notice the aliasing. On the other hand, if you
were shooting clothed portraits with fine cloth texture, that same filter
might produce annoying aliasing which was destructive of detail.

You can see such a compromise operating in TV images. Occasionally
you'll see someone wearing a certain tie or patterned weave of clothing
which throws up enough aliasing to have weird flickering
iridescence-like effects as they move.
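
If you want to convince yourself numerically, here is a rough
one-dimensional sketch in Python (plain numpy, not a model of any real
sensor) of the same trade-off: a pattern finer than the pixel pitch does
not simply vanish, it comes back as a false coarse pattern, and a
pre-sampling blur suppresses it at the cost of that fine detail:

import numpy as np

fine = 0.45                       # pattern frequency, cycles per scene sample
x = np.arange(0, 400)
scene = np.sin(2 * np.pi * fine * x)

step = 4                          # the "pixels" sample every 4th scene point
sampled = scene[::step]           # aliased: 0.45*4 = 1.8 cycles/pixel folds to 0.2

kernel = np.ones(step) / step     # crude "AA filter": average over one pixel width
filtered = np.convolve(scene, kernel, mode='same')[::step]

print('std without AA filter:', round(float(sampled.std()), 3))   # ~0.71, strong false pattern
print('std with AA filter   :', round(float(filtered.std()), 3))  # ~0.11, alias largely removed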

--
Chris Malcolm
From: Alfred Molon on
In article <6NphRCB1lT0LFwgX(a)kennedym.demon.co.uk>, Kennedy McEwen
says...
> No, there is no reason that such a design would have any particular
> spectral imbalance - you could have each photodiode of the appropriate
> response. The problem is there is no such control of the spectral
> response with the Foveon device.

You are making some assumptions here, but have no way to know the
details of the Foveon implementation.
--

Alfred Molon
------------------------------
Olympus E-series DSLRs and micro 4/3 forum at
http://tech.groups.yahoo.com/group/MyOlympus/
http://myolympus.org/ photo sharing site
From: Martin Brown on
nospam wrote:
> In article <s0nzn.137533$gF5.19566(a)newsfe13.iad>, Martin Brown
> <|||newspam|||@nezumi.demon.co.uk> wrote:
>
>>>>>> There are a handful of cases where the Foveon sensor might give a better
>>>>>> image and one of those is when photographing fine black detail on
>>>>>> saturated red or blue flowers. Rest of the time it is all marketing.
>>>>> black detail on saturated colours should be ok with bayer because
>>>>> there's a big luminance difference and bayer generally gets that right.
>>>> It isn't the Bayer mask that fails in this particular case it is the
>>>> chroma subsampling.
>>> bayer samples colour at half the rate of luminance. humans can't see
>>> the difference, but some think they can (like those who can 'hear'
>>> differences in speaker cables).
>> That is a caricature of the situation.
>
> no it isn't. chrominance is sampled at half the rate as luminance. the
> eye can resolve chroma about 1/10th as well as luma, so bayer has more
> chroma than what people can see.

Luminance is inferred from the Bayer data based on the GR/BG matrix or,
in some cases, a CM/YG complementary matrix. The luminance is dominated
by the green channel; the red and blue provide minor corrections to it
through various cunning heuristics.
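
The exact heuristics are camera-specific, but the standard JPEG/BT.601
luma weights (a trivial Python sketch, not any particular camera's
pipeline) give a feel for how dominant green is:

def luma_bt601(r, g, b):
    """Relative luminance from RGB using the BT.601 weights."""
    return 0.299 * r + 0.587 * g + 0.114 * b

print(luma_bt601(1.0, 0.0, 0.0))   # pure red   -> 0.299
print(luma_bt601(0.0, 1.0, 0.0))   # pure green -> 0.587
print(luma_bt601(0.0, 0.0, 1.0))   # pure blue  -> 0.114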
>
>> It really does depend critically
>> on what you are trying to image. I actually ran into the problem in a
>> real world situation. Photographing the transit of mercury with an
>> H-alpha filter so that prominences were also visible. The planet should
>> have been a disc but it was obviously distorted to oval.
>
> that sounds more like a lens aberration than a bayer issue, but in any
> event, that's not really a common scenario.

It is exactly down to the 2x2 or 2x1 chroma subsampling and how JPEG
reconstructs the colours from it. I know this because I have worked on
methods to fix the problem for high fidelity JPEG reconstruction of
chroma subsampled data.
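
As a toy version of what happened (Python/numpy, a synthetic stand-in for
the real frame, using the standard full-range JPEG YCbCr maths rather
than any particular camera's pipeline):

import numpy as np

h = w = 64
yy, xx = np.mgrid[0:h, 0:w]
red = np.full((h, w), 0.95)
red[(yy - 32) ** 2 + (xx - 32) ** 2 < 10 ** 2] = 0.05   # dark disc on red field
green = blue = np.full((h, w), 0.05)

# RGB -> YCbCr (JPEG full-range)
Y  = 0.299 * red + 0.587 * green + 0.114 * blue
Cb = 0.5 * (blue - Y) / (1 - 0.114)
Cr = 0.5 * (red - Y) / (1 - 0.299)

def sub2x2(c):
    """Average 2x2 blocks then repeat them back up (nearest upsampling)."""
    small = c.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)

Cr_sub, Cb_sub = sub2x2(Cr), sub2x2(Cb)

# YCbCr -> RGB with the subsampled chroma; red carries nearly all the signal
red_rec = Y + 1.402 * Cr_sub

print('max red-channel error after 2x2 chroma subsampling:',
      float(np.abs(red_rec - red).max()))

The reconstructed red edge gets smeared across the 2x2 blocks; with 2x1
subsampling the smearing is horizontal only, which is what turns a crisp
disc into an oval.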
>
>>>> Try it on a test chart with a Wratten 25 filter and
>>>> you will see what I mean. There will be a factor of 2 difference in the
>>>> effective resolution between horizontal and vertical in the red channel.
>>> if you filter only red, you will reduce the resolution of the sensor
>>> but that's not a real world scenario.
>> See above. Astronomers do have filters that are very precisely
>> monochromatic. Most times we use unfiltered monochrome sensors too.
>
> if they're using a monochromatic filter, why aren't they using a
> monochromatic sensor? you *will* get worse results if you put a deep
> red filter and use bayer. it wasn't designed for that use.

Normally you would use a dedicated astronomical camera, but for a quick
one-off in good light the standard DSLR seemed like a reasonable option.
>
> foveon might be a good choice if you discard the colour info (sum the 3
> layers), but a true monochromatic sensor (no bayer filters at all)
> would be even better.

They are also typically cooled for lower system noise, but sometimes it
is a lot easier just to use an ordinary DSLR than a computer-tethered
camera with ancillary power supplies.
>
>>>> Foveon save their images as fully chroma sampled JPEGs so the issue of
>>>> errors in the subsampled chroma decoding approximations do not arise.
>>> that's a plus but humans can't tell the difference except in extreme
>>> (i.e., not real-world and contrived) cases.
>> Red flowers with fine black veins like poppies and tulips is one such
>> real world case.
>
> that should be fine on bayer because there's a dramatic luminance
> difference between red and black.

It is the dramatic luminance difference between the red and black that
causes the problem. If you reconstruct the luminance on its own using
DJPEG, then reconstruct the colour image, convert it to monochrome and
difference the two, you will see exactly what I mean.
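
If DJPEG isn't to hand, a rough equivalent of that comparison can be
sketched in Python with Pillow (treat "photo.jpg" as a placeholder for
your own file; Pillow's draft mode asks libjpeg for a luminance-only
decode):

import numpy as np
from PIL import Image

# 1. Luminance-only decode: no chroma involved at all.
luma_only = Image.open('photo.jpg')
luma_only.draft('L', luma_only.size)
luma_only = np.asarray(luma_only.convert('L'), dtype=float)

# 2. Normal colour decode, converted to monochrome afterwards.
colour = np.asarray(Image.open('photo.jpg').convert('L'), dtype=float)

diff = np.abs(luma_only - colour)
print('mean / max luminance discrepancy:', diff.mean(), diff.max())

The difference image lights up where the chroma reconstruction has leaked
into the luminance, typically along fine dark detail on saturated red.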
>
>>>> The problem arises later in the imaging chain. Bayer sensor struggles a
>>>> bit with a pure red (or pure blue) monochrome images because it has
>>>> fewer independent pixels.
>>> true but nothing in this world is 'pure saturated red' (or blue or
>>> green). even bright red objects have a little blue or green in it.
>> It only has to be close enough that most of the luminance is in the red
>> channel. Red flowers will cause the problem as do some jazz concerts
>> with blue and red spotlights.
>
> there's still plenty of blue/green under a red spotlight. not
> everything is going to be pure red. it won't be as good as if it was
> white light but it's not as bad as you make it out to be.

I am explaining the situations where the JPEG decoder fails to make the
best possible reconstruction. This may or may not be relevant to the OP's
problem, since he won't post an image which shows his "red flare".

>>>> Normally the luminance channel is able to hide
>>>> these defects, but when the situation arises where the luminance channel
>>>> is corrupted by the chroma channels then you lose detail.
>>> and that only happens in edge cases, like red/blue test charts. that's
>>> why the foveon fans love those tests, despite it not being relevant to
>>> real world photography.
>> Having the sharp edges mangled by subsampling faults stick out like a
>> sore thumb in the handful of cases where it is relevant.
>
> not when it's chroma being subsampled. try it in photoshop. convert the
> image to lab and blur the ab channels. you only need to blur it with a
> 2 pixel radius to simulate bayer, but you can crank it to 5 pixels and
> not notice a difference.

That depends. The red channel as reconstructed by standard JPEG decoders
can corrupt the luminance value by enough to be a nuisance in some cases.
It only matters when this situation arises, and it is most obvious with
fine black lines on near-saturated red. This quirk is part of the reason
why JPEG images drift when recompressed many times.

In an ideal world you want to use the chrominance information to just
add colour to the luminance without altering its value. This is not what
happens in standard JPEG decoders at present in the boundary cases.
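
One way to get closer to that ideal, sketched here in plain numpy
(full-range JPEG YCbCr, arrays scaled 0..1; an illustration, not what any
shipping decoder does), is to re-impose the full-resolution luminance
after the normal colour reconstruction:

import numpy as np

def rgb_to_ycbcr(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.5 * (b - y) / (1 - 0.114)
    cr = 0.5 * (r - y) / (1 - 0.299)
    return np.stack([y, cb, cr], axis=-1)

def ycbcr_to_rgb(ycc):
    y, cb, cr = ycc[..., 0], ycc[..., 1], ycc[..., 2]
    r = y + 1.402 * cr
    b = y + 1.772 * cb
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return np.stack([r, g, b], axis=-1).clip(0.0, 1.0)

def reimpose_luma(decoded_rgb, full_res_y):
    """Keep the decoder's chroma but force the luminance back to full_res_y."""
    ycc = rgb_to_ycbcr(decoded_rgb)
    ycc[..., 0] = full_res_y
    return ycbcr_to_rgb(ycc)

The final clip back into gamut is where a small luminance error can still
creep in, which is exactly the boundary case being discussed.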

4:4:4 chroma reconstruction is now being sold in the latest TVs as
processing power has become more available.
>
> then blur the luminance and it will be noticeable even with the radius
> set to a fraction of a pixel.
>
>> Claims made for Foveon are completely OTT but there is a small element
>> of truth in their claims to better colour fidelity. But my point here is
>> that JPEG decoders are suboptimal in the standard implementation.
>
> foveon does have higher chroma resolution, however, it's less accurate
> and the eye can't see the difference anyway.

I am no great fan of the Foveon sensor, but it does have an edge for
certain kinds of photography and might be what the OP is looking for.
However, it also has its own problems, so only he can decide if it is
for him.

Regards,
Martin Brown