From: Terje Mathisen on
Andy "Krazy" Glew wrote:
> I love displays! I love looking at large numbers of pixels, relatively
> large pixels for my aging eyes.

I've been using a 1920x1200 + 1600x1200 combination both at home and at
work for a few years now; both are connected to a little docking slice.

For our trading department I tested a PC-Card graphics adapter some
years ago; using it allowed another pair of relatively fast 2D screens.

Since PC-Card slots have gone away, it makes sense to move the
power-hungry parts to the other end of a sufficiently fast USB2 connection.

I suppose you need external power for these, or do they run on 500 mA?

> comp.arch relevance: what sort of computers are good for processing such
> large displays? Not necessarily GPUs, since not necessarily real time
> graphics.

External screens need external power anyway, so why not embed the
needed USB/display hardware in the screen itself?

I'm pretty sure I've seen such beasts, at least in the form of
projectors supporting both VGA and USB connections. Having the driver
available is the crucial problem for a meeting-room projector, however.

Terje
--
- <Terje.Mathisen at tmsw.no>
"almost all programming can be viewed as an exercise in caching"
From: "Andy "Krazy" Glew" on
Terje Mathisen wrote:
> Andy "Krazy" Glew wrote:
>> I love displays! I love looking at large numbers of pixels, relatively
>> large pixels for my aging eyes.

> Since PC-Card slots have gone away, it makes sense to move the
> power-hungry parts to the other end of a sufficiently fast USB2 connection.
>
> I suppose you need external power for these, or do they run on 500 mA?

The USB display adapters have no external power.

They must be USB powered, unless they can snarf power from DVI or VGA.

2 of my displays have built-in USB hubs. I.e., the display is powering
its own USB display adapter.



From: ChrisQ on
Andy "Krazy" Glew wrote:

>>
>> I suppose you need external power for these, or do they run on 500 mA?
>
> The USB display adapters have no external power.
>
> They must be USB powered, unless they can snarf power from DVI or VGA.
>
> 2 of my displays have built-in USB hubs. I.e., the display is powering
> its own USB display adapter.
>

I replaced my old tube monitor with a large TFT earlier this year:
1600x1200 vs. 1280x1024 on the tube, at a fraction of the power. It's
an order of magnitude clearer as well, with less eye strain when using
it all day.

I wouldn't have thought that USB would have enough bandwidth for video
rates, though perhaps it does if you're not doing anything else with
the USB controller. If there's no PSU, how bright is it? Most of the
power for TFTs is in the backlight, often 10-20 watts on a larger
display...
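
A quick back-of-envelope check (my own arithmetic, not from the thread)
suggests why these adapters cannot push raw frames: assuming USB 2.0's
480 Mbit/s signalling rate with roughly 65% usable as bulk-transfer
payload, uncompressed 60 Hz video overshoots the link by close to an
order of magnitude, so the adapters must compress and/or send only
changed screen regions:

```python
# Back-of-envelope: can USB 2.0 carry uncompressed video?
# Assumptions: 480 Mbit/s raw signalling, ~65% usable as payload.

def uncompressed_mbit_per_s(width, height, bits_per_pixel=24, hz=60):
    """Bandwidth needed to resend every pixel of every frame."""
    return width * height * bits_per_pixel * hz / 1e6

USB2_PAYLOAD = 480 * 0.65  # ~312 Mbit/s usable (assumption)

for w, h in [(1280, 1024), (1600, 1200), (1920, 1200)]:
    need = uncompressed_mbit_per_s(w, h)
    print(f"{w}x{h}@60Hz needs {need:.0f} Mbit/s, "
          f"{need / USB2_PAYLOAD:.1f}x the USB2 payload rate")
```

Even the smallest mode needs several times the link's capacity, which
fits the observation that such adapters are fine for desktop updates
but poor for full-screen video or games.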

Regards,

Chris
From: Peter Grandi on

> I love displays! I love looking at large numbers of pixels,
> relatively large pixels for my aging eyes.

Actually it should be relatively small pixels with scalable
fonts and icons -- they give much better contrast and are easier
for the eyes to focus on than coarse pixels masked by antialiasing.

[ ... ]

> Best of all, I can almost, *almost*, act as if my love of large
> display surfaces is work related. It sure does help to be able
> to look at really, really, wide spreadsheets (although really,
> really, wide spreadsheets are a bit of an abomination).

For programming it is much worse than an abomination. I have
noticed that many people inexplicably (to me) like to have a
single maximized window on the screen, and that many programmers
tend to write lines as long as their window (and I guess many
know that kind of programmer who also likes really tiny fonts
with dark colors on black backgrounds). The result I see is that
lines in programs become longer and longer.

The extreme case I have seen so far is that coworker who used a
tiny font on a large monitor, with a maximized editor window 400
columns wide (and 200 high). I asked him what was the point, and
he said that with very long lines he would write many C functions
entirely on one line, and he could often write a whole module
that would fit in one screenful too.

I had to spend several days reformatting and indenting his code.

[ ... ]

> (After asking IT, who said that I could only have two monitors
> if they were smaller, 1400x1050. Which rather misses the point.)

I see that Mordac-style characters are persecuting you. Where I
work currently some people have 3x 1920x1200 monitors on their
desks. That's pushing it a bit, too.

[ ... ]

> I decided to drive my two 1900x1200 monitors from Hillsboro to
> Bellevue, carefully wrapped in sleeping bags and clothes. So
> now, on my big Biomorph desk at work (another piece of personal
> equipment) I have 5 monitors:

Another case of edging closer to being a "free agent" supplying
your own tools/machinery for work.

> two 1900x1200 in landscape mode, and the two 1050x1680, in
> portrait mode. Plus the laptop LCD display.

The 1920x1200 would not be really necessary if native-portrait
monitors were available, too. Ideally they would also be grayscale.

Unfortunately most monitors are targeted to consumers who can
only think of playing movies on them, and want them as wide and
colorful as possible; for office/programming work high DPI
portrait greyscale monitors would be far better (speaking from
experience).

[ ... ]

> Let's see, that's 8.88 megapixels, if I have done my math
> correctly. Most of it driven by USB. Probably no good for
> video or games, but good enough to throw a lot of data up where
> I can look at it. More! I want more! More slow pixels! If I
> could plug in e-paper displays all about my office, I would.

My observation is that the most precious computer entities are
displays and memory (that is, visual and program memory) as
proven by the degree of multiplexing/caching they are subjected
to. Right now I have a 24" LCD, with the following levels of
multiplexing:

* KVM to switch between 2 computers.
* Window manager with multiple virtual desktops.
* Multiple overlapping windows within a desktop.
* Multiple tabs within a window.
* Multiple buffers (Emacs) within a tab.

It can get pretty confusing and distracting. Never mind the crazy
performance implications of many layers of memory caching, each
with its own inappropriate replacement policy.

> We're on the verge of LCDs and e-paper being cheap enough to
> replace the whiteboards that are ubiquitous in offices.

I am very very much against replacing any tech that works well,
requires no power or batteries, no cabling, little maintenance,
and has excellent viewing properties, with something else, just
for the sake of bringing a "Blade Runner" style world forward.

[ ... ]

> Eventually, we must get rid of refresh.

LCDs don't refresh...

> comp.arch relevance: what sort of computers are good for
> processing such large displays? Not necessarily GPUs, since not
> necessarily real time graphics.

Well, 'comp.arch' is about computer system architecture in
general, not just processors, and the technology of the surface
between the human and computer perceptual worlds is part of that.

There is an angle on processors: current monitors are in effect
display computers, as the signals received from the main unit are
processed (e.g. zooming, sharpening, ...) before being rendered
to the LCD. In effect what one sees on the monitor is a processed
movie of the contents of the frame buffer (that's why some LCD
monitors offer a sharpening setting even for DVI input), and that
is also what those USB adapters you use do too, as do the various
KVM-over-Ethernet-or-IP products. Not too different from current
disk drives, which are in effect complicated block device servers.

The overall architectural relevance here is that modern systems
are ever more asymmetrical distributed systems with specialized
embedded processors (and not just printers and disks, monitors
too, never mind disk host adapters, network cards, ...).

The result is flexibility, but also increased confusion as the
definition of "working" becomes rather fuzzy for even simple
systems.

I wonder how many of the readers of this newsgroup realize that
sometimes current TVs crash and have to be rebooted, and so do
current LCD displays (and trains -- once I was stuck on a train
for a couple of hours as the locomotive engineers had difficulty
rebooting the engine's controllers to a stable state).
From: Bernd Paysan on
Peter Grandi wrote:
> The 1920x1200 would not be really necessary if native-portrait
> monitors were available, too. Ideally they would also be grayscale.
>
> Unfortunately most monitors are targeted to consumers who can
> only think of playing movies on them, and want them as wide and
> colorful as possible; for office/programming work high DPI
> portrait greyscale monitors would be far better (speaking from
> experience).

High resolution: yes, please. Grayscale: no. I use syntax
highlighting, and even when composing a Usenet posting like this, my
editor colors quoted text differently - that's very useful, I would miss
it on a grayscale monitor.

IMHO, a number of things go wrong, which is why the advance of
resolution is slow. My take on this is here:

http://www.jwdt.com/~paysan/hires.html

Executive summary: use a Bayer pattern for the screen to double the
effective resolution without decreasing feature size (i.e. with the
same technology and yield).
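
The cell accounting behind that summary can be sketched roughly as
follows (my own illustration of the general Bayer-mosaic idea, not code
from the linked page): an RGB-stripe panel spends three physical cells
per addressable pixel, while a Bayer mosaic makes every cell an
addressable sample and recovers full color by demosaicing:

```python
# Sketch of the cell budget: RGB stripe vs. RGGB Bayer mosaic.
# Illustrative only; the linked page has the actual proposal.

def bayer_color(row, col):
    """Color of the physical cell at (row, col) in an RGGB mosaic."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

def stripe_pixels(cells_w, cells_h):
    # RGB stripe: three horizontally adjacent cells form one pixel.
    return (cells_w // 3) * cells_h

def bayer_samples(cells_w, cells_h):
    # Bayer mosaic: every cell carries one sample of the image.
    return cells_w * cells_h

CELLS_W, CELLS_H = 5760, 1200  # same physical cell count either way
print("stripe pixels:", stripe_pixels(CELLS_W, CELLS_H))  # 2,304,000
print("bayer samples:", bayer_samples(CELLS_W, CELLS_H))  # 6,912,000
for r in range(2):  # top-left corner of the mosaic
    print(" ".join(bayer_color(r, c) for c in range(8)))
```

The raw sample count triples for the same cell budget; demosaicing
gives back less than that, which is roughly where the "double effective
resolution" figure comes from.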

> I wonder mhow many of the readers of this newsgroup realize that
> sometimes current TVs crash and have to be rebooted, and so do
> current LCD displays

Fortunately, I have not yet had to subject either my TV or my LCD to
an unexpected forced reboot. The complexity still remains low enough
to keep the software robust. Suggestions like mine above would require
slightly more complex software in the screen as long as the interface
stays RGB.

--
Bernd Paysan
"If you want it done right, you have to do it yourself"
http://www.jwdt.com/~paysan/