From: Rhodri James on
On Tue, 10 Nov 2009 18:55:25 -0000, Steven D'Aprano
<steve(a)remove-this-cybersource.com.au> wrote:

> On Tue, 10 Nov 2009 15:46:10 +0000, Grant Edwards wrote:
>
>> On 2009-11-10, Rhodri James <rhodri(a)wildebst.demon.co.uk> wrote:
>>> On Sun, 08 Nov 2009 19:45:31 -0000, Terry Reedy <tjreedy(a)udel.edu>
>>> wrote:
>>>
>>>> I believe the use of tagged pointers has been considered and so far
>>>> rejected by the CPython developers. And no one else that I know of has
>>>> developed a fork for that. It would seem more feasible with 64 bit
>>>> pointers where there seem to be spare bits. But CPython will have to
>>>> support 32 bit machines for several years.
>>>
>>> I've seen that mistake made twice: the IBM 370 architecture (probably
>>> the 360 too, but I'm too young to have used it) and ARM2/ARM3. I'd
>>> rather not see it a third time, thank you.
>>
>> MacOS applications made the same mistake on the 68K. They reserved the
>> high-end bits in a 32-bit pointer and used them to contain
>> meta-information.
>
>
> Obviously that was their mistake. They should have used the low-end bits
> for the metadata, instead of the more valuable high-end.

Oh, ARM used the low bits too. After all, instructions were 4-byte
aligned, so the PC had all those bits going spare...
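
For anyone who hasn't seen the trick, here's a minimal sketch in C of
low-bit tagging (a hypothetical illustration, not CPython's actual
object representation): since heap objects are at least 4-byte aligned,
the bottom two bits of a valid pointer are always zero and can carry a
tag instead, e.g. marking a small integer stored inline.

    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Tagged values: the low two bits of an aligned pointer are spare,
     * so use tag 01 to mean "small integer packed into the upper bits"
     * and tag 00 to mean "real pointer".  Hypothetical sketch only. */
    typedef uintptr_t value;

    #define TAG_MASK ((uintptr_t)3)
    #define TAG_INT  ((uintptr_t)1)

    static value make_int(intptr_t n) { return ((uintptr_t)n << 2) | TAG_INT; }
    static int is_int(value v) { return (v & TAG_MASK) == TAG_INT; }
    /* Right-shifting a negative value is implementation-defined in C;
     * on common platforms this is an arithmetic shift. */
    static intptr_t int_value(value v) { return (intptr_t)v >> 2; }

    static value make_ptr(void *p) {
        assert(((uintptr_t)p & TAG_MASK) == 0); /* relies on alignment */
        return (uintptr_t)p;
    }
    static void *ptr_value(value v) { return (void *)(v & ~TAG_MASK); }

    int main(void) {
        value v = make_int(-42);
        if (is_int(v))
            printf("small int: %ld\n", (long)int_value(v));

        static int boxed = 7;
        value w = make_ptr(&boxed);
        if (!is_int(w))
            printf("pointer payload: %d\n", *(int *)ptr_value(w));
        return 0;
    }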

--
Rhodri James *-* Wildebeest Herder to the Masses
From: Vincent Manis on
On 2009-11-10, at 07:46, Grant Edwards wrote:
> MacOS applications made the same mistake on the 68K. They
> reserved the high-end bits
At the time the 32-bit Macs were about to come on the market, I saw an internal confidential document which estimated that at least 80% of the applications the investigators had looked at (including many from that company named after a fruit, whose head office is on Infinite Loop) were not 32-bit clean. This was in spite of the original edition of Inside Mac (the one that looked like a telephone book), which specifically said always to write 32-bit clean apps, as 32-bit machines were expected in the near future.

It's not quite as bad as the program I once looked at that was released in 1999 and was not Y2K compliant, but it's pretty close.

-- v
From: Steven D'Aprano on
On Tue, 10 Nov 2009 16:05:01 -0800, Vincent Manis wrote:

> At the time the 32-bit Macs were about to come on the market, I saw an
> internal confidential document which estimated that at least 80% of the
> applications the investigators had looked at (including many from that
> company named after a fruit, whose head office is on Infinite Loop) were
> not 32-bit clean. This was in spite of the original edition of Inside
> Mac (the one that looked like a telephone book), which specifically said
> always to write 32-bit clean apps, as 32-bit machines were expected in
> the near future.

That is incorrect. The original Inside Mac Volume 1 (published in 1985)
didn't look anything like a phone book. The original Macintosh's CPU (the
Motorola 68000) already used 32-bit addresses internally, but the high
eight bits of every address were ignored, since the CPU physically lacked
the address pins corresponding to those bits.

In fact, in Inside Mac Vol II, Apple explicitly gives the format of
pointers: the low-order three bytes are the address, and the high-order
byte is used for flags: bit 7 is the lock bit, bit 6 the purge bit, and
bit 5 the resource bit. The other five bits are unused.
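
To make that layout concrete, here is a small C sketch that decodes the
documented format (an illustration only, not Apple's actual Memory
Manager code; the example value is made up):

    #include <stdint.h>
    #include <stdio.h>

    /* The layout Inside Mac Vol II describes: the low three bytes hold
     * the address, the high byte holds flags.  Bit numbers refer to the
     * high byte, so bit 7 of that byte is bit 31 of the 32-bit word. */
    #define ADDR_MASK    0x00FFFFFFu
    #define LOCK_BIT     (1u << 31) /* bit 7 of the high byte */
    #define PURGE_BIT    (1u << 30) /* bit 6 */
    #define RESOURCE_BIT (1u << 29) /* bit 5 */

    int main(void) {
        uint32_t master = 0x80A01234u; /* made-up: locked block at 0xA01234 */

        printf("address:   0x%06lX\n", (unsigned long)(master & ADDR_MASK));
        printf("locked:    %s\n", (master & LOCK_BIT)     ? "yes" : "no");
        printf("purgeable: %s\n", (master & PURGE_BIT)    ? "yes" : "no");
        printf("resource:  %s\n", (master & RESOURCE_BIT) ? "yes" : "no");
        return 0;
    }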

By all means criticize Apple for failing to foresee 32-bit apps, but
criticizing them for hypocrisy (in this matter) is unfair. By the time
they recognized the need for 32-bit clean applications, they were stuck
with a lot of legacy code that was not clean. Including code burned into
ROMs.

--
Steven
From: Grant Edwards on
On 2009-11-11, Steven D'Aprano <steven(a)REMOVE.THIS.cybersource.com.au> wrote:

> By all means criticize Apple for failing to foresee 32-bit
> apps, but criticizing them for hypocrisy (in this matter) is
> unfair. By the time they recognized the need for 32-bit clean
> applications, they were stuck with a lot of legacy code that
> was not clean. Including code burned into ROMs.

They did manage to climb out of the hole they had dug and fix
things up -- something Microsoft has yet to do after 25 years.

Maybe it's finally going to be different this time around with
Windows 7...

--
Grant

From: Vincent Manis on
On 2009-11-10, at 19:07, Steven D'Aprano wrote:

> On Tue, 10 Nov 2009 16:05:01 -0800, Vincent Manis wrote:
>
> That is incorrect. The original Inside Mac Volume 1 (published in 1985)
> didn't look anything like a phone book. The original Macintosh's CPU (the
> Motorola 68000) already used 32-bit addresses internally, but the high
> eight bits of every address were ignored, since the CPU physically lacked
> the address pins corresponding to those bits.
>
> In fact, in Inside Mac Vol II, Apple explicitly gives the format of
> pointers: the low-order three bytes are the address, and the high-order
> byte is used for flags: bit 7 is the lock bit, bit 6 the purge bit, and
> bit 5 the resource bit. The other five bits are unused.
You are correct. On thinking about it further, my source was some kind of internal developer seminar I attended around 1985, where an Apple person said `don't use the high-order bits, we might someday produce machines that use all 32 address bits', and then winked at us.

You are also correct (of course) about the original `Inside Mac': my copy was indeed two volumes in looseleaf binders; the phonebook came later.

> By all means criticize Apple for failing to foresee 32-bit apps, but
> criticizing them for hypocrisy (in this matter) is unfair. By the time
> they recognized the need for 32-bit clean applications, they were stuck
> with a lot of legacy code that were not clean. Including code burned into
> ROMs.
That's my point. I first heard about Moore's Law in 1974, in a talk given by Alan Kay. At about the same time, Gordon Bell had independently concluded that one needs an extra address bit every 18 months (he was looking at core memory, so the cost factors were somewhat different). All of this should have suggested that relying on any `reserved' bits is always a bad idea.

I am most definitely not faulting Apple for hypocrisy, just saying that programmers sometimes assume that just because something works on one machine, it will work forevermore. And that's unwise.

-- v