From: David J Taylor on
"nospam" <nospam(a)nospam.invalid> wrote in message
news:250420100514133247%nospam(a)nospam.invalid...
> In article <hr15k0$b8n$1(a)news.eternal-september.org>, David J Taylor
> <david-taylor(a)blueyonder.co.uk.invalid> wrote:
>
>> >> Furthermore, a 640x480 sensor with a Bayer CFA is four interleaved
>> >> arrays of 320x240 red, green, blue and green pixels. The output is
>> >> three overlaid arrays of red, green and blue pixels. Going from
>> >> 320x240 to 640x480 *IS* upsizing!
>> >
>> > except that's not how bayer works
>>
>> How does it work, then?
>
> as i said a few posts back, the demosaic algorithm calculates the
> missing colour components at each pixel, and neighbouring pixels get
> reused across several output pixels. there are a number of papers
> online about various algorithms. here's one:
> <http://www.ece.gatech.edu/research/labs/MCCL/pubs/dwnlds/bahadir05.pdf>

... where you'll find the basic spatial interpolation described, along
with various enhancements.
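
For anyone who wants to see the idea in code, here is a minimal sketch
of my own of the basic bilinear interpolation step, assuming an RGGB
mosaic layout. It is not the algorithm from the paper, which adds
edge-aware refinements on top of this baseline:

import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    """Bilinear demosaic of an RGGB Bayer mosaic (H x W array).

    Each photosite samples one colour; the two missing colour
    components at every pixel are interpolated from the nearest
    neighbours that did sample that colour.
    """
    h, w = raw.shape
    raw = raw.astype(float)

    # Masks marking which colour each photosite sampled (RGGB layout).
    r_mask = np.zeros((h, w), dtype=bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), dtype=bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)

    # At a sampled site each kernel returns the original value; at a
    # missing site it returns the mean of the nearest same-colour sites.
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    rgb = np.zeros((h, w, 3))
    for ch, mask, k in ((0, r_mask, k_rb), (1, g_mask, k_g), (2, b_mask, k_rb)):
        rgb[..., ch] = convolve(np.where(mask, raw, 0.0), k, mode='mirror')
    return rgb

Every output pixel ends up with three colour components, two of them
interpolated, which is exactly where the "upsizing" argument upthread
comes from.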

David

From: Chris Malcolm on
nospam <nospam(a)nospam.invalid> wrote:
> In article <hqvhdi$doo$1(a)news.eternal-september.org>, David J Taylor
> <david-taylor(a)blueyonder.co.uk.invalid> wrote:

>> > They were explaining that because the light sensitive area would be
>> > reduced by the live view circuitry, noise levels would rise. You might
>> > want to google for these threads in the ng.
>> >
>> > In any case, unless you are somebody who is designing these sensors, you
>> > are not in a position to make qualified statements about what is
>> > possible and what is not.

>> If what you say is correct, Alfred, what they were saying is also correct,
>> and in accordance with the physics of the situation. Less sensitive area
>> captures fewer photons, which means more noise. Today, improved
>> micro-lenses and higher QE may well have offset the reduced sensing area.

> people did say that live view would impact the noise levels, but the
> missing piece is that sensor technology has advanced, which offsets
> any loss.

The new back-illuminated sensors put the wiring behind the
photodiodes, where it no longer encroaches on photosensitive area.
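
The physics behind the noise argument is just Poisson counting
statistics: shot-noise-limited SNR goes as the square root of the
photon count, so halving the sensitive area costs about 3 dB of SNR,
all else being equal. A rough illustration, where the full-well count
and fill factors are made-up numbers rather than measurements:

import math

FULL_WELL = 40000  # photons collected at 100% fill factor (illustrative)

def shot_noise_snr_db(photons):
    """SNR of a Poisson-limited photon count, in dB (SNR = sqrt(N))."""
    return 20 * math.log10(math.sqrt(photons))

for fill in (1.0, 0.7, 0.5):  # hypothetical fill factors
    print(f"fill {fill:.0%}: SNR = {shot_noise_snr_db(FULL_WELL * fill):.1f} dB")

Better micro-lenses and higher quantum efficiency simply push the
photon count back up, which is how the technology advances offset the
lost area.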

--
Chris Malcolm
From: Alfred Molon on
In article <AuLf5SD67A1LFwc3(a)kennedym.demon.co.uk>, Kennedy McEwen
says...
> as someone who
> does make imaging sensors

Interesting. What sensors do you make?
--

Alfred Molon
------------------------------
Olympus E-series DSLRs and micro 4/3 forum at
http://tech.groups.yahoo.com/group/MyOlympus/
http://myolympus.org/ photo sharing site
From: Ray Fischer on
Alfred Molon <alfred_molon(a)yahoo.com> wrote:
> nospam says...
>
>> > 2/3 of the needed colour information is missing in a Bayer sensor, and
>> > that has an impact on the effective resolution.
>>
>> a small impact.
>
>2/3 of the data are missing,

2/3 of WHAT data?

Do you even know?

If you're referring to visible light then you clearly don't know much
because a 15MP Bayer sensor collects MORE data than does a 4.6MP
Foveon sensor.
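
Counting raw colour samples per exposure makes that concrete. The
sensor sizes are the ones quoted above; the arithmetic is mine:

bayer_samples  = 15_000_000 * 1  # one colour sample per Bayer photosite
foveon_samples =  4_600_000 * 3  # three stacked samples per Foveon photosite
print(bayer_samples, foveon_samples)  # 15000000 13800000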

--
Ray Fischer
rfischer(a)sonic.net

From: Ray Fischer on
David J Taylor <david-taylor(a)blueyonder.co.uk> wrote:
>"Ray Fischer" <rfischer(a)sonic.net> wrote in message
>news:4bd3ec93$0$1610$742ec2ed(a)news.sonic.net...
>> David J Taylor <david-taylor(a)blueyonder.co.uk> wrote:
>>>> however, the total number doesn't change. there are 12 million on the
>>>> sensor and 12 million in the image, or however many the sensor has.
>>>
>>>There are 12 million monochrome pixels on the sensor,
>>
>> No, there are 4.6 million pixels. Any other claim is a lie.
>>
>>> interpolated to 12
>>>million colour pixels. The sensor only has 3 million red pixels, but
>>
>> Learn what "pixel" means.
>
>Ray,
>
>This part of the thread had evolved into a discussion of Bayer in a
>hypothetical 12MP DSLR, and whether or not spatial interpolation was
>involved.

A pixel is a picture element. You cannot split up the color
components of a pixel in some arbitrary way and then claim that a
single pixel is really three, four, or a thousand pixels.

--
Ray Fischer
rfischer(a)sonic.net