From: Michael J. Mahon on
heuser.marcus(a)freenet.de wrote:
>>It's kind of a self-fulfilling prophecy. If enough people write applications
>>that depend on undocumented behavior, then that behavior becomes
>>a "feature" that cannot be changed in future versions--usually
>>preventing useful improvements.
>
>
> IMHO Atari didn't even seriously consider replacing the 6502. The 8-bit
> platform was virtually unchanged for ten years. The machines were cash
> cows and the Tramiel-era Atari concentrated on making them cheaper and
> cheaper (to compete with Commodore and reach the Eastern European market)
> - so there was little room to get innovative.

So it seems the message is, "If you have good reason to believe that
you are programming for an 'end of the line' platform, then do anything
that works--it won't matter anyway".

Of course, now *all* 8-bit platforms are unchanging, but some of them,
in particular the Apple II, went through several implementations of
the processor.

So, even though I don't feel any need to provide for *future* changes,
I'm still motivated to cover all the *past* changes by coding for the
widest range of systems that makes sense--particularly since the "cost"
of doing so is negligible.

> The case with the 2600 is even clearer IMHO: It used the cheaper 6507
> and the video game market crash in 1983 did the rest.

In the case of a video game machine, I wouldn't expect any compatible
upgrade path, and the entire hardware arrangement was very idiosyncratic
anyway--so there would be no reason not to do "anything that worked".

-michael

Music synthesis for 8-bit Apple II's!
Home page: http://members.aol.com/MJMahon/

"The wastebasket is our most important design
tool--and it is seriously underused."
From: Michael J. Mahon on
Bruce Tomlin wrote:
> In article <b_2dnVaBwopOtvjZRVn-uQ(a)comcast.com>,
> "Michael J. Mahon" <mjmahon(a)aol.com> wrote:
>
>
>>That was DMA for video RAM access, right? If so, they just didn't get
>>how trivial it is to share RAM with a 6502--it only accesses RAM during
>>half the clock signal!
>
>
> As I understand it, they actually halted the CPU for DMA so as to get
> more memory bandwidth than "half the clock signal", and they needed to
> know when the CPU had actually stopped so that they could start DMA.
> This was also done on the Atari 7800.

Wow--they needed more than a megabyte/second of video data? I guess
if you have about the same resolution as an Apple II, but twice the
color depth, then you do... The //c and following Apple II's got
around this by providing another memory bank in parallel with the
main memory bank.
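
Back-of-the-envelope, in C for concreteness (round NTSC numbers, and
the "twice the depth" figure is just my guess above):

    #include <stdio.h>

    int main(void) {
        /* Apple II hi-res: 40 bytes per scan line x 192 lines, 60 Hz */
        long per_frame = 40L * 192;       /* 7680 bytes per frame     */
        long per_sec = per_frame * 60;    /* 460800 -- about 450 KB/s */
        printf("1x depth: ~%ld KB/s\n", per_sec / 1024);
        printf("2x depth: ~%ld KB/s\n", per_sec * 2 / 1024);
        /* "Half the clock" at ~1 MHz gives about 1 MB/s, so doubling
           the color depth eats most of the interleaved budget.       */
        return 0;
    }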

> I've heard that in the original 400/800 they actually used a regular
> 6502, but the extra signal allowed them to save all the external
> circuitry that figured this out.

Of course, all 6502's had the RDY line that allowed a trivial amount
of glue logic to halt the CPU at the next read cycle. (Of course, you
couldn't get away with this during Disk ][ I/O on the Apple II, since
it would mess up the instruction timing.)
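
The "glue" decision really is trivial. A toy model in C (the 1-in-8
duty cycle is made up, purely to show the idea):

    #include <stdio.h>

    int main(void) {
        /* RDY-style cycle stealing: each bus cycle either goes to
           video (RDY pulled low, CPU waits) or to the CPU.          */
        long cpu = 0, dma = 0;
        for (long cycle = 0; cycle < 1000000; cycle++) {
            if (cycle % 8 == 0)
                dma++;      /* RDY low: video reads RAM, CPU stalls */
            else
                cpu++;      /* RDY high: CPU gets the cycle         */
        }
        printf("CPU: %ld cycles, stolen: %ld\n", cpu, dma);
        return 0;
    }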

-michael

Music synthesis for 8-bit Apple II's!
Home page: http://members.aol.com/MJMahon/

"The wastebasket is our most important design
tool--and it is seriously underused."
From: heuser.marcus on
> So it seems the message is, "If you have good reason to believe that
> you are programming for an 'end of the line' platform, then do anything
> that works--it won't matter anyway".

In my opinion this approach is too fatalistic - after all, Commodore and
Atari didn't really offer CPU upgrades. The C128 is more or less
completely compatible (illegal opcodes supported), and Atari never
offered anything faster or based on a more modern chip.

Therefore their programmers (third party or hobbyists) never really had
to worry about doing something "illegal" when using these opcodes.

Apple, on the other hand, introduced the IIc quite early in comparison.
Its success and the subsequent enhancement kit for the IIe /
modernized IIe effectively killed the usage of these opcodes in the
Apple world. Side note: I would love to hear from Apple why they chose
the 65C02 - because of the much lower power consumption of the CMOS
design, or because of the additional features...

And then of course the 68K platforms came along and took over the
market...

> In the case of a video game machine, I wouldn't expect any compatible
> upgrade path, and the entire hardware arrangement was very idiosyncratic
> anyway--so there would be no reason not to do "anything that worked".

Good points.

What would be interesting to know is when 6502 programmers began
using illegal/undocumented opcodes. Right from the start? I vaguely
remember first hearing about them in the mid-eighties.

bye
Marcus

From: heuser.marcus on
> Wow--they needed more than a megabyte/second of video data?
> I guess if you have about the same resolution as an Apple II, but twice
> the color depth, then you do... The //c and following Apple II's got
> around this by providing another memory bank in parallel with the
> main memory bank.

The Atari and C64 use an official maximum of 8K of RAM for the frame
buffer. The C64 has an additional kilobyte for color information, which
is accessed in parallel, AFAIK. So they are comparable to an Apple
without double-hires mode.
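
The arithmetic, for the record (standard bitmap modes, nothing exotic
assumed):

    #include <stdio.h>

    int main(void) {
        int c64_bitmap = 320 * 200 / 8;  /* 8000 bytes, just under 8K */
        int c64_color  = 40 * 25;        /* 1000 cells of color RAM   */
        int a2_hires   = 40 * 192;       /* 7680 bytes                */
        printf("C64: %d + %d bytes, Apple hi-res: %d bytes\n",
               c64_bitmap, c64_color, a2_hires);
        return 0;
    }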

The Atari can double that memory by page flipping, but this halves the
effective screen refresh rate. It's also possible to change the video
mode in the middle of a scan line - but this only yields a different
"color interpretation" of the memory cells, not actually "more" memory
cells.

The C64 supports some advanced trickery to display more colors than
initially thought and officially advertised, but this again is mostly
useful for static displays.

I can't really speak technically for the C64, but in the case of the
Atari the situation is more complicated than in a "standard" Apple: the
video processor (ANTIC) really takes over the system for the time it
halts the 6502.

You see, ANTIC really is a processor, albeit a very simple one: it has
its own instructions, programs (called display lists) and memory. In
fact it can access the complete RAM, ROM and even the custom-chip
register area of the system. This means that you can, for example,
display a "live view" of the zero page and stack with memory cells
changing all the time...
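
For illustration, here is roughly the display list the OS builds for
the standard 40x24 text screen, written out as C data (the $BC00/$BC40
addresses are the usual ones for a 48K machine, but treat them as
illustrative):

    #include <stdio.h>

    int main(void) {
        /* GRAPHICS 0: a 40x24 text screen in ANTIC mode 2. */
        unsigned char dl[32];
        int n = 0;
        dl[n++] = 0x70; dl[n++] = 0x70; dl[n++] = 0x70; /* 3x8 blank lines */
        dl[n++] = 0x42;                  /* mode 2 + LMS (load memory scan) */
        dl[n++] = 0x40; dl[n++] = 0xBC;  /* screen memory -> $BC40          */
        for (int i = 0; i < 23; i++)
            dl[n++] = 0x02;              /* 23 more mode-2 text lines       */
        dl[n++] = 0x41;                  /* JVB: jump & wait for vert blank */
        dl[n++] = 0x00; dl[n++] = 0xBC;  /* ...back to the list at $BC00    */

        for (int i = 0; i < n; i++)      /* dump the 32 bytes */
            printf("%02X%c", dl[i], (i % 8 == 7) ? '\n' : ' ');
        return 0;
    }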

So I guess Atari chose to keep this "multiprocessor system" simple by
stopping the 6502 whenever ANTIC needs to maintain the display.

The effect on performance was quite drastic, so they clocked the 6502
much higher than in the Apple or C64 systems (1.78 MHz). In the end all
three systems were comparable in speed when using the same video
resolutions or text mode.
When ANTIC is switched off the computer is indeed much faster - a
feature the popular fractal generators of the time exploited.
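
Roughly, in numbers (the ~30% ANTIC overhead for a text screen is a
ballpark figure from memory, not a measurement, and it varies a lot
with the video mode):

    #include <stdio.h>

    int main(void) {
        double atari = 1.78 * (1.0 - 0.30);  /* ~1.25 MHz effective */
        double apple = 1.02;                 /* no video contention */
        printf("Atari ~%.2f MHz vs. Apple ~%.2f MHz\n", atari, apple);
        return 0;
    }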

> Of course, all 6502's had the RDY line that allowed a trivial amount
> of glue logic to halt the CPU between instructions. (Of course, you
> couldn't get away with this during a Disk ][ I/O on the Apple II, since
> it would mess up the instruction timing.)

The Atari and C64 could - with careful programming - even use clean
display interrupts (to change colors or the scroll registers on the
fly) while accessing the disk. But that's mainly due to the more
intelligent disk drive designs - and the slow disk accesses...

bye
Marcus

From: MagerValp on
>>>>> "MJM" == Michael J Mahon <mjmahon(a)aol.com> writes:

MJM> Wow--they needed more than a megabyte/second of video data?

No, but they need more than one byte per clock cycle in some
situations. The C64 does the same thing during badlines and sprite
access. So yes, if you filled the screen with badlines and covered it
with sprites, you would theoretically need about 2 MB/s, but in
reality you only need it in short bursts.
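
Quick sanity check on that figure (cycle behavior as commonly
described; the PAL clock is assumed):

    #include <stdio.h>

    int main(void) {
        /* On a badline the VIC-II takes the CPU's half of each cycle
           to fetch 40 character pointers, on top of the 40 graphics
           bytes it always fetches on its own half: 2 bytes/cycle.   */
        double clock_hz = 985000.0;        /* PAL C64, ~0.985 MHz    */
        double burst = 2.0 * clock_hz;     /* peak bytes per second  */
        printf("peak burst: ~%.2f MB/s\n", burst / 1e6);
        return 0;
    }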

--
___ . . . . . + . . o
_|___|_ + . + . + . Per Olofsson, arkadspelare
o-o . . . o + MagerValp(a)cling.gu.se
- + + . http://www.cling.gu.se/~cl3polof/