From: Howard Brazee on
On Fri, 11 Sep 2009 13:02:11 -0300, Clark F Morris
<cfmpublic(a)ns.sympatico.ca> wrote:

>To add to your gloom, IBM mainframe COBOL doesn't support decimal
>floating point, IEEE floating point or the new data types in the 2002
>standard. It communicates to Java through a kludge. Somehow the bean
>counters, not the long run strategists are in charge. The Computer
>Science idiots and shortsighted bean counters are winning.

Not necessarily. It very well might be that the long run strategists
have decided that it isn't in IBM's best interest to be a mainframe
computer company. If their strategy along this front is one of
preparing for retreat, then I can't fault their implementation.

--
"In no part of the constitution is more wisdom to be found,
than in the clause which confides the question of war or peace
to the legislature, and not to the executive department."

- James Madison
From: Pete Dashwood on
Howard Brazee wrote:
> On Fri, 11 Sep 2009 17:56:51 +1200, "Pete Dashwood"
> <dashwood(a)removethis.enternet.co.nz> wrote:
>
>>> The future looks bleak folks- I sure hate this.
>>
>> But we knew about it 15 years ago... It wasn't like it was a big
>> surprise. People just wouldn't read the writing on the wall.
>
> Some people wouldn't read it - but others read it quite accurately.
>
> If a genie gives us the ability to predict the future, the story
> usually has us creating that future, despite everything we do to
> change it. But I don't see that happening here - we were just
> powerless to change it.
>
> That's not to say that changing it is necessarily smart. There are
> reasons things have changed that have nothing to do with the
> stubbornness of mainframers not switching to OO CoBOL.
>
>> I don't think it's bleak. Not even for COBOL. The chances of making
>> a living as a one trick COBOL pony are pretty bleak, but there is an
>> argument that says we shouldn't have done that anyway... If people
>> are allowed to move off COBOL in a measured way the future can be
>> bright.
>
> CoBOL hasn't been replaced by another language.

Perhaps not. It depends on how you look at it.

Certainly the paradigm that COBOL represented has been replaced in
Client/Server processing. As client/server (networking, the Internet, etc.)
is where MOST of the computer use in the world is occurring, it is fair to
say that COBOL is being replaced. I originally predicted the process would
take until 2015, but it is mostly there now... In 1996, COBOL was still a
major force, with many "COBOL shops" using COBOL exclusively. Today, you
would be hard-pressed to find any; by 2015, such a place will be an
anachronistic curiosity...


>But people can no
> longer expect one technical skill to be sufficient for their IS
> career.

That has been true for 20 years. How long have I been saying that here in
this forum? Since BEFORE it started... :-)

Pete.
--
"I used to write COBOL...now I can do anything."


From: Pete Dashwood on
Howard Brazee wrote:
> On Fri, 11 Sep 2009 13:02:11 -0300, Clark F Morris
> <cfmpublic(a)ns.sympatico.ca> wrote:
>
>> To add to your gloom, IBM mainframe COBOL doesn't support decimal
>> floating point, IEEE floating point or the new data types in the 2002
>> standard. It communicates to Java through a kludge. Somehow the
>> bean counters, not the long run strategists are in charge. The
>> Computer Science idiots and shortsighted bean counters are winning.
>
> Not necessarily. It very well might be that the long run strategists
> have decided that it isn't in IBMs best interest to be a mainframe
> computer company. If their strategy along this front is one of
> preparing for retreat, then I can't fault their implementation.

I was surprised by Clark's post; it isn't like IBM to NOT implement
something that is axiomatically useful.

The kick is in what's "axiomatic"... (self-evidently true).

Obviously their perception of what's true doesn't include those floating
point formats in their COBOL any more, and they are not extending
credibility to the COBOL 2002 standard, either.

I don't intend to pick the scab off a pretty well-healed wound, but I will
note in passing that I don't blame them one bit on the latter. No-one else
is implementing it; why would they?

Why is no-one else implementing it? Because it costs money to do so and
there is limited return left on that investment. These people are not
idiots.
(And, having moved some of their eggs to other baskets, they are
increasingly less dependent on COBOL than was the case, say, 20 years
ago...)

The COBOL Standard blew its remaining credibility when it took 17 years to
come up with COBOL 2002. Seven years on, nobody is even interested.
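
For anyone who never looked at the 2002 book, the "new data types" Clark
mentions include native binary and floating point USAGE clauses. A minimal
sketch, assuming a compiler that accepts them - the names are standard,
the exact representations are left to the implementor:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. NEW-TYPES.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *> 2002-standard native binary integer (32 bits or more)
       01  WS-COUNT       USAGE BINARY-LONG.
      *> 2002-standard floating point (typically an IEEE double)
       01  WS-RATE        USAGE FLOAT-LONG.
       PROCEDURE DIVISION.
           MOVE 100000 TO WS-COUNT
           MOVE 2.5 TO WS-RATE
           DISPLAY "COUNT: " WS-COUNT " RATE: " WS-RATE
           STOP RUN.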

Floating point, however, is another story. Certainly, IBM produced the
decimal instruction set (PACKED) when they realised that Amdahl's new
System/360 (originally intended as a scientific computer) could be a gold
mine if it could just compute currency accurately, but it looks like the
tail has come to wag the dog. They're making so much money selling
commercial systems that they can afford not to worry about floating point.
(Presumably if you need IBM hardware for a large science lab, they'll
upgrade the floating point, for a price...)
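
The currency point is easy to demonstrate. A minimal sketch - assuming a
compiler with the 2002 USAGE clauses; the packed decimal side is plain old
COBOL - of why decimal arithmetic earns its keep while binary floating
point quietly drifts:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. MONEY-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *> Binary floating point cannot hold 0.10 exactly
       01  FLOAT-TOTAL    USAGE FLOAT-LONG.
      *> Packed decimal holds decimal fractions exactly
       01  PACKED-TOTAL   PIC S9(7)V99 USAGE PACKED-DECIMAL.
       01  I              USAGE BINARY-LONG.
      *> Edited fields so the results display readably
       01  FLOAT-SHOW     PIC 9(3).9(15).
       01  PACKED-SHOW    PIC Z(6)9.99.
       PROCEDURE DIVISION.
           MOVE 0 TO FLOAT-TOTAL PACKED-TOTAL
      *> Add ten cents a thousand times; the exact answer is 100.00
           PERFORM VARYING I FROM 1 BY 1 UNTIL I > 1000
               ADD 0.10 TO FLOAT-TOTAL
               ADD 0.10 TO PACKED-TOTAL
           END-PERFORM
           MOVE FLOAT-TOTAL TO FLOAT-SHOW
           MOVE PACKED-TOTAL TO PACKED-SHOW
           DISPLAY "FLOAT  TOTAL: " FLOAT-SHOW
           DISPLAY "PACKED TOTAL: " PACKED-SHOW
           STOP RUN.

The packed total comes out at exactly 100.00; the float total comes out a
whisker off it, with the error visible in the trailing digits - which is
the whole point for currency work.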

"What is truth?" asked jesting Pilate, and would not wait for an answer.

Truth is very often what you perceive it to be. IBM have changed their
perception over decades. Presumably, that was in response to changing market
conditions.

Who can blame them?

Pete
--
"I used to write COBOL...now I can do anything."


From: Michael Wojcik on
Pete Dashwood wrote:
>
> Certainly the paradigm that COBOL represented has been replaced in
> Client/Server processing. As client/server (networking, the Internet, etc.)
> is where MOST of the computer use in the world is occurring, it is fair to
> say that COBOL is being replaced.

Client/server computing is not "most of the computer use in the
world". Most of the computers sold in recent years are embedded
systems (and a majority of those are 8-bitters). The type of computer
with the most users, worldwide, is the mobile phone - a quintessential
peer-to-peer application. If compute cycles is our metric, most
computer use is in scientific number crunching.

If we restrict "computer use" to mean people using general-purpose
computers for a task they explicitly initiate, the dominant
application is email. That does involve clients and servers, but it's
hardly an interesting example of "client/server processing".

Personally, I doubt that even in classic business back-office
processing a majority of transactions are client/server in any
interesting sense. But that's much harder to measure, since many large
organizations don't even know what they run internally on a daily
basis. (That's why there's a market for application portfolio analysis
tools.)

Various forms of distributed processing, from web applications to
service-oriented architectures to massive server farms to cloud
computing, are certainly getting a lot of attention these days; and I
do think that's the right way to go for many kinds of applications,
including most of the things that were done as big online or batch
applications in the past. But they don't constitute "most computing"
unless you use a very narrow definition of "computing".

This is the same error we see from Web 2.0 pundits, New Media
enthusiasts, "long tail" proponents and the like - they ignore the
sectors of the industry that don't fit their models, and consequently
mistake the innovations of the vanguard for a revolution of the masses.

--
Michael Wojcik
Micro Focus
Rhetoric & Writing, Michigan State University
From: Pete Dashwood on
Michael Wojcik wrote:
> Pete Dashwood wrote:
>>
>> Certainly the paradigm that COBOL represented has been replaced in
>> Client/Server processing. As client/server (networking, the
>> Internet, etc.) is where MOST of the computer use in the world is
>> occurring, it is fair to say that COBOL is being replaced.
>
> Client/server computing is not "most of the computer use in the
> world".

Yes it is.


>Most of the computers sold in recent years are embedded
> systems (and a majority of those are 8-bitters). The type of computer
> with the most users, worldwide, is the mobile phone - a quintessential
> peer-to-peer application. If compute cycles is our metric, most
> computer use is in scientific number crunching.

That's just silly. We are talking about computers used by people interacting
with software. A smart washing machine or toaster is not part of this
discussion, and "compute cycles" was NOT the metric for my statement.
>
> If we restrict "computer use" to mean people using general-purpose
> computers for a task they explicitly initiate, the dominant
> application is email. That does involve clients and servers, but it's
> hardly an interesting example of "client/server processing".

So you want to move my statement to include "interesting" now?

"interesting" is a completely subjective perspective. You think email is not
interesting as a technology; I find it fascinating, and dealing with it on
web-based applications is challenging and satisfying.

>
> Personally, I doubt that even in classic business back-office
> processing a majority of transactions are client/server in any
> interesting sense. But that's much harder to measure, since many large
> organizations don't even know what they run internally on a daily
> basis. (That's why there's a market for application portfolio analysis
> tools.)

The people who work on the shop floor know what they run. They pass
spreadsheets to each other over the corporate intranet, share each other's
data directly via the corporate LAN, and would be completely lost without a
client/server network. If you doubt this, just be there when a major LAN
server goes down and observe the resulting consternation...
>
> Various forms of distributed processing, from web applications to
> service-oriented architectures to massive server farms to cloud
> computing, are certainly getting a lot of attention these days; and I
> do think that's the right way to go for many kinds of applications,
> including most of the things that were done as big online or batch
> applications in the past. But they don't constitute "most computing"
> unless you use a very narrow definition of "computing".

I'm not going to argue how wide my definition is for the sake of this
argument. I stand by my original statement.
>
> This is the same error we see from Web 2.0 pundits, New Media
> enthusiasts, "long tail" proponents and the like - they ignore the
> sectors of the industry that don't fit their models, and consequently
> mistake the innovations of the vanguard for a revolution of the
> masses.

I'm not convinced.

Talk to anyone under thirty and ask them what a computer is.

It isn't a cellphone or a mainframe.

AND they all have one and have been using it all their lives. You completely
overlooked the fact that the internet is taking around 2 billion web page
hits a day, much of this off social _NETWORKS_ (my emphasis).

Client/server networks are definitely the largest part of computing in
industry and the home. Sorry if it isn't "interesting", but reality
sometimes isn't... :-)

Pedantically, there may be more computing cycles consumed by car engine
management systems, but that has no bearing on COBOL, which is what I was
discussing.

Pete.
--
"I used to write COBOL...now I can do anything."