From: Michael J. Mahon on
Bruce Tomlin wrote:
> In article <d4CdnR_y__CUmfXZnZ2dnUVZ_s-dnZ2d(a)comcast.com>,
> "Michael J. Mahon" <mjmahon(a)aol.com> wrote:
>
>
>>True, but the decoding in the 6502 is handled by a kind of PLA, so
>>it would likely not be very expensive (in real estate) to trap or
>>NOP the invalid combinations.
>
>
> Nope, it's handled by good old random logic. A PLA is designed to be a
> programmable generic replacement for random logic, and is way too
> inefficient for high-volume VLSI.

My mistake--from this chip photo, I incorrectly identified either
the registers or the ALU as a PLA:

http://micro.magnet.fsu.edu/chipshots/mos/6502large.html

Still, given the quite ordered distribution of the 6502's undefined
ops, I think detecting them would have been relatively simple.
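
For what it's worth, the regularity shows up directly in the opcode
matrix: NMOS 6502 opcodes break down as aaabbbcc, and the entire
cc = 11 column (64 of the ~105 undefined opcodes) is unassigned, so
a single AND of the two low opcode bits flags that whole block. A
minimal C sketch of the check (the helper name is just for
illustration; this shows the pattern, not how the silicon decodes):

    #include <stdint.h>

    /* NMOS 6502: every opcode with both low bits set falls in
       the unassigned cc == 11 column of the opcode matrix. */
    static int in_undefined_column(uint8_t opcode)
    {
        return (opcode & 0x03) == 0x03;
    }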

Of course, this is moot for at least two reasons: it wasn't done,
and most designers of the time wouldn't have done it anyway.

-michael

Music synthesis for 8-bit Apple II's!
Home page: http://members.aol.com/MJMahon/

"The wastebasket is our most important design
tool--and it is seriously underused."
From: Michael J. Mahon on
Rainer Buchty wrote:
> In article <d4CdnR_y__CUmfXZnZ2dnUVZ_s-dnZ2d(a)comcast.com>,
> "Michael J. Mahon" <mjmahon(a)aol.com> writes:
> |> True, but the decoding in the 6502 is handled by a kind of PLA, so
> |> it would likely not be very expensive (in real estate) to trap or
> |> NOP the invalid combinations.
>
> You are too today-centric.
>
> Back in the 1970s even that "likely not very expensive" decision raised
> costs significantly and introduced complexity to the chip that was
> unnecessary from both a technical and a marketing point of view.
>
> |> If I had been designing the 6502,
> |> I likely would have made the same choice.
>
> Probably everyone would have. Leaving out the illegal opcode traps or
> mapping them to NOP did no harm; implementing them would just have
> raised costs.
>
> |> (But as a coder, I would have regarded undefined ops as a curiosity,
> |> not as an opportunity to save a few cycles at the expense of future
> |> utility.)
>
> If a platform stays identical over a long enough period of time,
> then why not start squeezing the last bit out of it by using undocumented
> behavior? After all, the use of such undocumented behavior in the end
> led to enhanced capabilities that even the chip designers didn't envision
> in the first place.
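
(A concrete case of what you describe: the NMOS 6502's undocumented
LAX loads A and X from the same operand in one instruction, one byte
and two cycles cheaper than LDA followed by TAX. A C sketch of the
architectural effect only, not of the silicon; the function name is
just for illustration:)

    #include <stdint.h>

    /* Effect of the undocumented NMOS 6502 LAX: a single operand
       fetch loads both A and X (LDA + TAX combined). The N and Z
       flags would be set from the loaded value, as with LDA. */
    static void lax(uint8_t *a, uint8_t *x, uint8_t operand)
    {
        *a = operand;
        *x = operand;
    }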

I suspect that we don't disagree very much about this. But note that
the 6502 was not such a platform. Being relatively successful, it went
on through multiple generations of implementation, and unintended
behaviors are not, in general, maintained in subsequent implementations.

Although early experimenters could not have predicted the course of the
architecture's evolution, it was still common culture, since the 1950s,
not to "exploit" accidental, and therefore unsupported, "instructions"
that might exist in particular computer implementations. The problems
in maintenance and upgrading that this caused were well known.

Commercial computing recognized the significant advantages of creating
an object code-compatible line of machines, with scalable performance,
by the end of the 1950s, and major computer lines were designed with
this in mind by the early 1960s.

It has often been noted that the microprocessor community apparently
needed to rediscover all the lessons already learned by the mainframe
computer culture, but two decades later. A general presumption that
"things will always be as they are now" is one misconception that it
took people a while to correct.

> I'm especially thinking about all the fancy stuff coders did with the
> C64's video chip, which, like the original 6502 (and the 6510 based on
> it), is a hardwired design.

As it turned out, the presumption that these were immutable chips was
correct.

As I stated earlier, if you have good reason to believe that you are
programming for an end-of-the-line system, then you are free to do
anything that works. But I consider this a pessimistic assumption
unless the platform has really fossilized--as the platforms we are
celebrating here have.

> And back then no one really thought about future utility. Apart from maybe
> the Apple II, those machines were pretty much integrated and rather
> unexpandable boxes. Even more, in the early home computer and video game
> market there was no sense of a "family concept" where software from the
> old machine would just run on its next-generation successor, because that
> successor was most likely an entirely new box.
>
> Think of Atari 400/800 vs. 600/800XL, Commodore VIC20 vs. C64, or
> Sinclair ZX81 vs. Spectrum, just to name a few.

No one is praiseworthy for not thinking about future utility!

It seems hard to imagine now that--almost 20 years after commercial
computers had all moved to scalable, compatible lines to leverage
code investments--the idea that this might be at least as valuable
in the microcomputer marketplace did not influence design decisions.

Until the Mac, all of Apple's computers were designed with application
compatibility with previous machines in mind. Of course, the same has
been true since in the Mac line, and in the entire PC line (after a few
early not-quite-clone dead ends).

I suppose I must fault the undisciplined early coders for participating
in the newest, most radical advance in computing without a real vision
of what success would mean. The lesson was already clear to anyone
who was paying attention. (Note that I have no problem whatever with
someone *using* undocumented features themselves--the problem is when
code that uses such features is released for wider use.)

-michael

Music synthesis for 8-bit Apple II's!
Home page: http://members.aol.com/MJMahon/

"The wastebasket is our most important design
tool--and it is seriously underused."
From: Bruce Tomlin on
In article <kKCdnXMGZogpR_XZRVn-qw(a)comcast.com>,
"Michael J. Mahon" <mjmahon(a)aol.com> wrote:

> My mistake--from this chip photo, I incorrectly identified either
> the registers or the ALU as a PLA:
>
> http://micro.magnet.fsu.edu/chipshots/mos/6502large.html

See that mess of spaghetti in the middle? I think that's the
instruction decoder.
From: Michael J. Mahon on
Bruce Tomlin wrote:
> In article <kKCdnXMGZogpR_XZRVn-qw(a)comcast.com>,
> "Michael J. Mahon" <mjmahon(a)aol.com> wrote:
>
>
>>My mistake--from this chip photo, I incorrectly identified either
>>the registers or the ALU as a PLA:
>>
>>http://micro.magnet.fsu.edu/chipshots/mos/6502large.html
>
>
> See that mess of spaghetti in the middle? I think that's the
> instruction decoder.

It would actually be fun to see the real logic diagram of the
6502. ;-)

All of the "internals" documentation I've found is very abstract
with no detail where it would be most interesting.

-michael

Music synthesis for 8-bit Apple II's!
Home page: http://members.aol.com/MJMahon/

"The wastebasket is our most important design
tool--and it is seriously underused."
From: John Selck on
Michael J. Mahon wrote:

> The fact that both companies used "custom" 6502 processors for their
> machines is no doubt part of the reason that they never moved forward.

Huh? They did move forward: 6510 -> 8500 -> 8502. And also, I don't
think it was any kind of problem for Commodore to use customized CPUs,
since they owned MOS, the designers and producers of 6502 tech back
then :)

> Non-portable code is written for two quite different reasons:
>
> 1) Because the coder doesn't even think about portability or
> doesn't understand it, non-portability happens.
> 2) Because the coder understands perfectly, and chooses to
> write non-portable code on purpose. (One-time use, static
> platform, compelling need,...?)

3) Slow-as-hell 8-bit platforms don't have a proper abstraction layer
for any of their hardware, whether sound, graphics, timers, ports, or
the CPU.
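
On those machines, "talking to the hardware" simply means writing chip
registers at fixed addresses. A minimal sketch of the C64 case
(cc65-style C; the macro and function names are just for illustration,
but the VIC-II border color register really does sit at $D020):

    #include <stdint.h>

    /* The VIC-II border color lives at $D020; there is no OS or
       driver call in between, just the register itself. */
    #define VIC_BORDER (*(volatile uint8_t *)0xD020)

    void set_border(uint8_t color)
    {
        VIC_BORDER = color;  /* e.g. 0 = black, 1 = white */
    }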