From: Crownfield on
george_preddy(a)yahoo.com wrote:
>
> No it doesn't. Digital has a very bad reputation among no-compromise
> photographers, because the overwhelming majority of people who use
> digital don't understand that their Bayers take 3 very small RGB
> exposures, then combine them while upscaling by 400%.

you ignorant twit.
they do not upscale them. the images are large to start with.
think again before your next brain fart.
From: george_preddy on


JPS(a)no.komm wrote:
> >Interpolation never adds optical resolution.
> >
> >Try it for yourself, download a satellite picture of the world and
> >interpolatively upscale it using Photoshop. Make it a bigger and bigger
> >image until you can actually see yourself smiling in the picture.
>
> Don't be ridiculous. Everyone arguing with you knows that you can't get
> more detail by upscaling a bitmap.

Each Bayer exposure is a bitmap; interpolated values are inserted
between the optical data points in each channel to upscale each RGB
exposure to the sensor's overall monochrome dimensions. That is
upscaling.
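Stripped to its mechanics, the per-channel fill-in described above looks roughly like this in Python. This is a hypothetical 4x4 RGGB example with a naive neighbor average; real demosaicing algorithms are considerably more sophisticated:

```python
# A hypothetical 4x4 RGGB Bayer layout: the red channel is sampled
# only at even rows and even columns, so 4 of the 16 pixel sites
# carry a real red measurement.
N = 4
red = [[None] * N for _ in range(N)]
for r in range(0, N, 2):
    for c in range(0, N, 2):
        red[r][c] = 100.0  # stand-in sensor reading

# Fill the missing sites by averaging the nearest real samples,
# a crude stand-in for the bilinear step of real demosaicing.
filled = [row[:] for row in red]
for r in range(N):
    for c in range(N):
        if filled[r][c] is None:
            neighbors = [red[rr][cc]
                         for rr in range(max(0, r - 1), min(N, r + 2))
                         for cc in range(max(0, c - 1), min(N, c + 2))
                         if red[rr][cc] is not None]
            filled[r][c] = sum(neighbors) / len(neighbors)

interpolated = sum(v is None for row in red for v in row)
print(interpolated)  # 12: three quarters of the red plane is estimated
```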

That is what interpolation means, after all: inserting a guess between
data points. There is no difference whether you do it on a desktop or
in a camera, as my posted full-size samples showed so nicely.

There is virtually no loss in full color resolution when you downsize a
Bayer image to 25% of its recorded size, because it was already
digitally upscaled 400% as recorded.

> The luminance, the most significant part of the capture, is not upscaled
> at all. Red and blue resolution are upscaled to 200%, and green
> resolution is upscaled 70.7%.

Red and Blue are upscaled 400%; Green is upscaled only 200%, but is
effectively reduced by half in order to topologically align the RGB
composite image. As a result, Green accuracy is somewhat improved,
but the resulting interpolated Green quantity is utterly identical to
Red and Blue. Obviously, since every final pixel has an R, G, and B
channel.
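The ratios being argued over here can be checked by simply counting samples. A minimal sketch, assuming a hypothetical 6 MP sensor with the usual RGGB pattern:

```python
# Sample counts for a hypothetical 6 MP Bayer sensor with the usual
# RGGB pattern: one red, two green and one blue site per 2x2 block.
total = 6_000_000
red_samples = total // 4    # 1,500,000 measured red values
green_samples = total // 2  # 3,000,000 measured green values
blue_samples = total // 4   # 1,500,000 measured blue values

# After demosaicing, every output pixel carries a full R, G and B
# value, so each channel holds `total` values no matter how many
# were actually measured.
print(total - red_samples)    # 4,500,000 red values are interpolated
print(total - green_samples)  # 3,000,000 green values are interpolated
```

Whether one calls the filled-in values "upscaling" or "reconstruction" is exactly the disagreement in this thread; the counts themselves are not in dispute.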

From: Bart van der Wolf on

<george_preddy(a)yahoo.com> wrote in message
news:1118001119.337990.14710(a)g14g2000cwa.googlegroups.com...
>
>
> Bart van der Wolf wrote:
> >> It was already clear that you don't understand the difference
> >> between monochrome (single color) and spectral band. Each sensel
> >> is natively sensitive to a spectrum of roughly 350 to 1000 nm,
> >> and filters restrict that to 3 (sometimes 4) slightly overlapping
> >> spectral bands.
>
> Oh, and I thought monochrome sensors always produced a big solid
> one-color rectangle.

Twisting and turning won't help you.
You stated that the sensor is monochrome.
I explained it isn't.

Now you state something completely different, yawn.

Bart

From: Bart van der Wolf on

<george_preddy(a)yahoo.com> wrote in message
news:1118002917.281821.236530(a)g44g2000cwa.googlegroups.com...
SNIP of all your inaccuracies and lies

Apparently leaves no text.

Bart

From: george_preddy on


Bart van der Wolf wrote:
> <george_preddy(a)yahoo.com> wrote in message
> news:1118002917.281821.236530(a)g44g2000cwa.googlegroups.com...
> SNIP of all your inaccuracies and lies
>
> Apparently leaves no text.

What's the matter? Have you already run out of excuses for a Bayer
sensor's monochrome MP rating, which wrongly describes the sum total of
all 3 of its very small R, G, and B exposures instead of listing the
resolution of the similarly small composite color image that results?

Foveon got it right: a true 3.4MP camera absolutely requires a minimum
10.3MP monochrome sensor size.

But even that is only if there are no mandated inefficiencies, like
imbalanced RGB exposure sizes, or the corresponding need for a blur
filter, or the need for intrusive noise reduction due to the tiny
monochrome pixel size required to cram all 3 RGB exposures onto a
single 2D surface.
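Whatever one makes of the conclusion, the arithmetic behind the figures in this post is easy to check. A Foveon X3 sensor stacks three photodiodes (R, G, B) at every pixel location, so its total photosite count is three times its full-color pixel count; using only the numbers cited above:

```python
# Checking the post's figures: three stacked photodiodes per pixel
# means photosite count = 3 x full-color pixel count.
photosites_mp = 10.3          # total photosite count cited in the post
layers = 3                    # stacked R, G, B photodiodes per pixel
full_color_mp = photosites_mp / layers
print(round(full_color_mp, 1))  # 3.4, matching the "true 3.4MP" figure
```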