From: Herbert Kleebauer on
Randall Hyde wrote:
> "Herbert Kleebauer" <klee(a)unibwm.de> wrote in message

> > There are also many assemblers for the x86 architecture which
> > are all incompatible. None of them is the "standard". Even if
> > Intel were to write its own assembler, they couldn't say that
> > this is the valid assembler syntax for x86 processors.
>
> Intel *did* write their own assembler. It was called ASM86 (and then ASM286,
> and then ASM386). IIRC, they stopped around the 386 version because DOS had
> become so ubiquitous as a development platform and MASM was pretty much ASM86
> syntax compatible.

I don't see the logic of this statement. Just because they once had
an assembler doesn't mean that they still sell one. And if they
currently don't have an assembler, then they also have no assembler
syntax. Even if Intel were to write its own assembler again, they
couldn't say that this is the only valid assembler syntax for x86
processors.


> > All they
> > can define is the machine language but not the assembly
> > language.
>
> Hmmm...
> Last time I checked, Intel hired some computer scientist types to develop
> the *assembly language* for the original 8086.

They can define the language for an "Intel x86 assembler", but
surely not "the" assembly language for the x86. Everybody can
define a syntax for an x86 assembler (not only Intel), and they
are all equal. If we can't ignore an awful processor architecture
(like the Intel x86 or the Atmel AVR) because of its market power,
then we should at least ignore the even more awful assembler
syntax used by these companies.
From: NoDot on
Betov wrote:
> [snip]

You want to waste the c.a.e people's time with this? The r.g.i-f people
have alread complained about this!

NoDot,
....

From: Betov on
"NoDot" <no_dot(a)msn.com> ýcrivait news:1122227437.170347.262770
@g43g2000cwa.googlegroups.com:

> Betov wrote:
>> [snip]
>
> You want to waste the c.a.e people's time with this? The r.g.i-f people
> have alread complained about this!
>
> NoDot,

Said by one from "the tiny couple of definitive idiots
keeping stuck with this horror"...

:))

"c.a.e", "r.g.i-f", "alread", you are sure it was not
"comdlainep", kid?

:))))))

Betov.

< http://rosasm.org >

From: Hans-Bernhard Broeker on
[F'up2 reduced to one group --- should have been done earlier.]

In comp.arch.embedded Herbert Kleebauer <klee(a)unibwm.de> wrote:
> Randall Hyde wrote:
> > "Herbert Kleebauer" <klee(a)unibwm.de> wrote in message

> > > There are also many assemblers for the x86 architecture which
> > > are all incompatible. None of them is the "standard". Even if
> > > Intel were to write its own assembler, they couldn't say that
> > > this is the valid assembler syntax for x86 processors.

> > Intel *did* write their own assembler. It was called ASM86 (and
> > then ASM286, and then ASM386). IIRC, they stopped around the 386
> > version because DOS had become so ubiquitous as a development
> > platform and MASM was pretty much ASM86 syntax compatible.

> I don't see the logic of this statement. Just because they once had
> an assembler doesn't mean that they still sell one.

That's quite obviously not what Randall was trying to make it mean,
either.

The fact that Intel not only documented an assembly syntax along
with the processor pretty much from the get-go, but even
implemented it themselves as part of marketing the chip, means
there does exist something we have little choice but to call "the"
standard assembly language for this processor: the syntax used and
published by Intel. It's quite a broken design of an assembler,
granted, and nobody is expected to like it particularly --- but
it's still the standard syntax.

This standard assembly language for x86 CPUs is also the root of the
entire MASM-compatible family of x86 assembly dialects. The members
of that family may not all be compatible with each other (although
most can still be switched to actual MASM source compatibility) ---
but they're all roughly compatible with ASM86. I.e. you can still feed
assembly source from the original 8086 application notes and data
sheets (or from books from that era) to MASM or one of its derivatives
(running in compatibility mode, where necessary) essentially
unchanged, and expect it to work. Even "rogue" assemblers like a86
still accept that assembly language, even if they'll grumble a lot
about it.
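
As a rough illustration (the segment and label names below are
invented, not taken from any actual Intel application note), a
fragment in that original ASM86/MASM-compatible style still goes
through MASM-family assemblers essentially as-is:

data    SEGMENT
count   DB      0
data    ENDS

code    SEGMENT
        ASSUME  cs:code, ds:data
start:  mov     ax, data        ; load the data segment's address
        mov     ds, ax
        mov     al, count       ; byte-sized move, inferred from the
        inc     al              ; declaration of 'count', not from
        mov     count, al       ; the mnemonic
        ; (program exit omitted; this is only a syntax sketch)
code    ENDS
        END     start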

The only truly independent assembly language family for x86 that I'm
personally aware of is AT&T's (including its GNU descendants).
Off-hand, inventing such a fundamentally incompatible language looks
like a very bad idea, so it either has to be justified or abandoned.

I think AT&T x86 assembler language makes a whole lot more sense than
Intel's ever did. Not because of the oft-maligned opposite order of
source and destination operand, mind you, but because it puts the
definition of operation width where it belongs (as a letter in the
opcode) instead of where it has to be second-guessed (by inspecting
stuff like BYTE PTR [whatever]). The price to be paid for this
improvement of the syntax was a loss of direct access to the entire
body of pre-existing assembly source. AT&T deemed that acceptable,
probably because they didn't plan on using their 'as' on any
third-party original assembler sources anyway, and the compiler
doesn't care what language it emits.
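
A minimal side-by-side sketch of that difference (the register and
the memory operand are arbitrary, chosen just for the example):

        ; Intel/MASM style: the width hangs off the operand
        inc     BYTE PTR [ebx]      ; 8-bit increment
        inc     DWORD PTR [ebx]     ; 32-bit increment

        # AT&T/gas style: the width is a suffix on the mnemonic
        incb    (%ebx)              # 8-bit increment
        incl    (%ebx)              # 32-bit increment

Same two instructions either way; in the AT&T form the operation
width is readable from the mnemonic alone.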

The main problem with Intel syntax IMHO is that they were trying to
let the assembler do half the linker's and a good part of the human's
job on top of that of a genuine assembler. Frankly, if the human
programmer has to look at more than one line of source code (setting
aside macros) to figure out which actual opcode a given line of
assembly code will generate, something's seriously wrong with that
assembler. Once they had added BYTE PTR [] and ASSUME to the
language, the damage was irreversible.
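
To make that concrete, here is a contrived MASM-style sketch (the
segment and variable names are invented); whether the final mov is
emitted with a segment override prefix is decided by an ASSUME that
can sit pages away from it:

extra   SEGMENT
flag    DB      0
extra   ENDS

        ASSUME  es:extra        ; may appear far from the code below

        ; ... lots of unrelated code ...

        mov     al, flag        ; assembled with an ES: override purely
                                ; because of that ASSUME; nothing on
                                ; this line says so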

--
Hans-Bernhard Broeker (broeker(a)physik.rwth-aachen.de)
Even if all the snow were burnt, ashes would remain.
From: Frank-Christian Kruegel on
On Sat, 23 Jul 2005 18:54:16 +0200, Herbert Kleebauer <klee(a)unibwm.de>
wrote:

>But how are AVR programs debugged at all? Are there AVR versions
>which have support for hardware breakpoints or at least single
>step interrupts? The only alternative I see at the moment is
>to add an assembler directive which automatically inserts a call
>to a debug routine after each instruction. But this would
>double the size of the code to debug.

AVR Studio has a Simulator - which uses the official AVR syntax of course.
:-)

Bigger parts have a JTAG port for use with the Atmel JTAG ICE (both the
older version and the newer mkII), where you can single-step, view and
change CPU and peripheral registers etc. Works with AVR Studio.

Newer smaller parts have debugWIRE, a debug port using a single pin (reset).
This requires the Atmel JTAG ICE mkII and AVR Studio.

Atmel also sells real In-Circuit Emulators - to be used with AVR Studio.

Everything is there, and everything works fine - provided you use AVR
Studio and proper symbol tables generated by the compiler, assembler and
linker.

With kind regards

Frank-Christian Krügel