From: Bruce on
On 25 Apr 2010 19:08:12 GMT, rfischer(a)sonic.net (Ray Fischer) wrote:
>
>Wrong. Not "elements". Pixels.


According to Wikipedia:

"In digital imaging, a pixel (or picture element) is a single point in
a raster image."

The reference given is:
Rudolf F. Graf (1999). Modern Dictionary of Electronics. Oxford:
Newnes. p. 569. ISBN 0-7506-4331-5.

From: Ray Fischer on
Kennedy McEwen <rkm(a)kennedym.demon.co.uk> wrote:
> Ray Fischer
>>David J Taylor <david-taylor(a)blueyonder.co.uk> wrote:
>>>"Ray Fischer" <rfischer(a)sonic.net> wrote in message
>>>news:4bd3ec93$0$1610$742ec2ed(a)news.sonic.net...
>>>> David J Taylor <david-taylor(a)blueyonder.co.uk> wrote:
>>>>>> however, the total number doesn't change. there are 12 million on the
>>>>>> sensor and 12 million in the image, or however many the sensor has.
>>>>>
>>>>>There are 12 million monochrome pixels on the sensor,
>>>>
>>>> No, there are 4.6 million pixels. Any other claim is a lie.
>>>>
>>>>> interpolated to 12
>>>>>million colour pixels. The sensor only has 3 million red pixels, but
>>>>
>>>> Learn what "pixel" means.
>>>
>>>Ray,
>>>
>>>This part of the thread had evolved to being about Bayer in a hypothetical
>>>12MP DSLR, and whether or not spatial interpolation was involved.
>>
>>A pixel is a picture element. You cannot split up the color
>>components of a pixel in some arbitrary way and then claim that a
>>single pixel is really three, four, or a thousand pixels.
>>
>So a Bayer CFA sensel, being an incomplete picture element,

WRONG.

It's not incomplete. There's no such thing as an "incomplete" pixel.
A monochrome pixel is still a pixel.

--
Ray Fischer
rfischer(a)sonic.net

From: Ray Fischer on
Kennedy McEwen <rkm(a)kennedym.demon.co.uk> wrote:
>In article <250420101139237939%nospam(a)nospam.invalid>, nospam
><nospam(a)nospam.invalid> writes
>>In article <hr1mch$9de$1(a)news.eternal-september.org>, David J Taylor
>><david-taylor(a)blueyonder.co.uk.invalid> wrote:
>>
>>> .. where you find the basic spatial interpolation described, along with
>>> enhancements.
>>
>>interpolation yes, spatial no. there are n pixels going in, n pixels
>>coming out.
>
>There are only x picture elements in and 3x picture elements coming out.

Are you nuts?

--
Ray Fischer
rfischer(a)sonic.net

From: Martin Brown on
Ray Fischer wrote:
> Kennedy McEwen <rkm(a)kennedym.demon.co.uk> wrote:
>> Ray Fischer
>>> David J Taylor <david-taylor(a)blueyonder.co.uk> wrote:
>>>> "Ray Fischer" <rfischer(a)sonic.net> wrote in message
>>>> news:4bd3ec93$0$1610$742ec2ed(a)news.sonic.net...
>>>>> David J Taylor <david-taylor(a)blueyonder.co.uk> wrote:
>>>>>>> however, the total number doesn't change. there are 12 million on the
>>>>>>> sensor and 12 million in the image, or however many the sensor has.
>>>>>> There are 12 million monochrome pixels on the sensor,
>>>>> No, there are 4.6 million pixels. Any other claim is a lie.
>>>>>
>>>>>> interpolated to 12
>>>>>> million colour pixels. The sensor only has 3 million red pixels, but
>>>>> Learn what "pixel" means.

I suggest that you learn what a pixel means - that would help.

The Bayer grid contains filtered photosensor sites. It takes the data
from several of these to construct any pixel in the final image.
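
Purely as an illustration of what "construct" means here, below is a
minimal sketch of bilinear demosaicing for an RGGB mosaic. It assumes
numpy and scipy are available, and the function name is my own; real
cameras use considerably more elaborate algorithms, so treat this as a
toy rather than anybody's firmware.

import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(mosaic):
    """Bilinear demosaic of an RGGB Bayer mosaic (HxW array) into HxWx3 RGB."""
    h, w = mosaic.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1   # red sites
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1   # blue sites
    g_mask = 1 - r_mask - b_mask                        # two green sites per 2x2 tile

    # Each missing value becomes the average of its nearest measured
    # neighbours of the same colour.
    k_rb = np.array([[1, 2, 1],
                     [2, 4, 2],
                     [1, 2, 1]], dtype=float) / 4.0
    k_g  = np.array([[0, 1, 0],
                     [1, 4, 1],
                     [0, 1, 0]], dtype=float) / 4.0

    r = convolve(mosaic * r_mask, k_rb, mode='mirror')
    g = convolve(mosaic * g_mask, k_g,  mode='mirror')
    b = convolve(mosaic * b_mask, k_rb, mode='mirror')
    return np.dstack([r, g, b])

Every output pixel draws on several photosites, and three quarters of
the red and blue values (and half of the green) are interpolated rather
than measured.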

>>>> This part of the thread had evolved to being about Bayer in a hypothetical
>>>> 12MP DSLR, and whether or not spatial interpolation was involved.
>>> A pixel is a picture element. You cannot split up the color
>>> components of a pixel in some arbitrary way and then claim that a
>>> single pixel is really three, four, or a thousand pixels.
>>>
>> So a Bayer CFA sensel, being an incomplete picture element,
>
> WRONG.
>
> It's not incomplete. There's no such thing as an "incomplete" pixel.
> A monochrome pixel is still a pixel.

A monochrome pixel is a pixel, but only if the image it belongs to can
be interpreted as an image in its own right. A Bayer mosaic looks pretty
strange if displayed as colours without demosaicing. If you split a
12Mpixel Bayer mosaic into its four separate colour planes of 3Mpixel
each, then you can call those values pixels of the various monochrome
representations of the colour image. While they are still all mixed up
it is misleading to call them pixels, because that implies that every
channel has been measured at each point.
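
The split itself is just array slicing once you know the tile layout.
A rough sketch, again assuming an RGGB arrangement (the helper name is
my own):

import numpy as np

def rggb_planes(mosaic):
    """Split an RGGB Bayer mosaic into its four quarter-size colour planes."""
    r  = mosaic[0::2, 0::2]   # red sites
    g1 = mosaic[0::2, 1::2]   # green sites on the red rows
    g2 = mosaic[1::2, 0::2]   # green sites on the blue rows
    b  = mosaic[1::2, 1::2]   # blue sites
    return r, g1, g2, b

# A 12M-site sensor yields four planes of about 3M samples each,
# two of them green.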

By any reasonable definition a photosite on a Bayer sensor array
measures only one colour out of R, G and B, or possibly two in the case
of the complementary C, M, Y, G filters used for low light. That
necessarily loses information about the true luminance of the image.

There are 12M sensor sites, but each one measures only part of the
image data. In particular, the red and blue sensors have lower spatial
resolution than the green, and all three colours have lower resolution
than an unfiltered sensor array of the same dimensions. Demosaicing can
fix most of this up, which is why the Bayer pattern is used. It works
because the human eye has much poorer colour resolution than luminance
resolution.

In common computer imaging usage a pixel is generally taken to mean a
monochrome sample of 8, 16 or 32 bits, or a colour pixel that is either
paletted 8 bits, 16 bits (R,G,B = 5,6,5), 24 bits (8 bits for each of
R, G, B) or 48 bits (16 bits each for R, G, B).
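
To put numbers on those formats, a quick sketch of packing a single
colour pixel into the 5-6-5 and 24-bit layouts (plain Python, no
particular file format implied):

def pack_rgb565(r, g, b):
    # Pack 8-bit R, G, B into 16 bits, dropping the low bits of each channel.
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def pack_rgb24(r, g, b):
    # Pack 8-bit R, G, B into 24 bits, 8 bits per channel.
    return (r << 16) | (g << 8) | b

# Pure green: pack_rgb565(0, 255, 0) == 0x07E0, pack_rgb24(0, 255, 0) == 0x00FF00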

The Bayer mosaic is none of these: there are a lot of implicit zeroes
in the measurement grid, and a distinct green bias due to the twofold
excess of green sites in the pattern.

Regards,
Martin Brown
From: David J Taylor on
"Ray Fischer" <rfischer(a)sonic.net> wrote in message
news:4bd49347$0$1611$742ec2ed(a)news.sonic.net...
> David J Taylor <david-taylor(a)blueyonder.co.uk> wrote:
>
>
>>>> .. where you find the basic spatial interpolation described, along
>>>> with
>>>> enhancements.
>>>
>>> interpolation yes, spatial no. there are n pixels going in, n pixels
>>> coming out.
>>
>>It is spatial. Typically there are n red-sensing pixels, but 4n RGB
>>output pixels.
>
> Are you nuts?
>
> --
> Ray Fischer
> rfischer(a)sonic.net

No, I'm describing what happens in a typical 12MP DSLR, which has 3
million red-sensitive photosites and delivers 12 million full-colour
output pixels in its JPEG file.
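
As a back-of-the-envelope check (a sketch only, ignoring edge rows and
assuming a plain RGGB layout):

sites  = 12000000        # photosites on a 12MP Bayer sensor
red    = sites // 4      # one red site per 2x2 tile  -> 3,000,000
green  = sites // 2      # two green sites per tile   -> 6,000,000
blue   = sites // 4      # one blue site per tile     -> 3,000,000
output = sites           # demosaiced JPEG: 12,000,000 full-colour pixels
print(output // red)     # 4 -> n red-sensing sites, 4n RGB output pixels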

David