From: Ubiquitous on
By Dorothy Ramienski
Internet Editor
Federal News Radio

Many federal agencies are using machines that are running software developed
25 to 35 years ago, and they're still going strong.

These machines are using the COmmon Business-Oriented Language (COBOL) to get
things done.

Invented in 1959 by Naval officer Grace Hopper, COBOL is still a prevalent
force throughout the federal government.

There are still about 200 billion lines of COBOL code in live operation, and 75
percent of business-critical applications and 90 percent of financial
transactions use it.

It's a very secure, easy-to-learn language, but it's running into some
problems now that the Obama administration has issued modernization
initiatives such as the Open Government Directive.

Joe Moyer is regional director of Micro Focus, a company that's helping
federal agencies move their COBOL applications from older platforms to newer ones.

"It is a very secure and solid and very good language for transactional
business. It's really running the critical mission applications in our
government today. So, when these applications have been around for so long,
how do you make changes? How does the government address this issue?" Moyer
explained.

COBOL does its job, but it is old, and the Obama administration is running
into problems. Since one of its goals is to modernize government while
lowering costs, COBOL is becoming a bit of a sticking point.

"These applications are very, very expensive to run on older mainframes,
whether that's an IBM or Unisys platform. There's really just a few ways
government will address this issue -- do you rewrite these applications into
Java, which could take years and years? Do you replace them and go to a COTS
package -- and that's a little difficult when an application could have 30
million lines of COBOL code going to an ERP? Or, do you do nothing and keep
paying the expensive cost to maintain these applications?"

Moyer says his company provides a solution that could enable the government to
reuse what it has: a software appliance that moves COBOL applications off the
mainframe to newer platforms such as Windows or Unix.

And then there's also the Open Government Directive.

A lot of agency information is being held hostage, so to speak, by COBOL.
Moyer explained that the IRS, SSA and even parts of DHS and DoD still use
COBOL in a closed environment.

"Can you really go to an SOA environment or to cloud computing from the
mainframe? . . . We can actually take a COBOL application and move it to [an]
open environment. . . . [COBOL] works. It's a fantastic language. It's written
in English. It's very easy to develop, and that's why many agencies have not
moved off it yet. So, with the Obama administration's modernization
initiatives, why not save millions and millions of dollars while keeping that
secure language behind the scenes."



From: Pete Dashwood on
Ubiquitous wrote:
> [snip: article quoted in full above]

Moyer is marketing Micro Focus. What he is offering is not a bad solution, but
it neglects the fact that COBOL cannot be a long-term solution. (To be fair, you can
hardly expect a marketing spokesperson to understand the subtle differences
between technical paradigms; his job is to market a COBOL solution and
ensure an ongoing revenue stream for the company, not to raise awareness of
WHY COBOL needs to be replaced.)

Re-compiling existing code for .NET simply enables a platform transfer; it
DOESN'T open up the COBOL resources so they can be easily integrated with
other languages and packages. To do that, the COBOL needs to be wrapped as
objects, and if you are going to wrap it as objects, there is no need for a
.NET compiler. (The .NET platform is designed to accommodate objects that
are not managed code, using COM Interop services.)
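
To make the "wrap it as objects" point concrete, here is a minimal,
hypothetical sketch (the program and data names are invented for
illustration; they are not anyone's actual product). A subprogram whose
whole interface is declared in its LINKAGE SECTION is exactly the kind of
entry point that can be fronted by a COM (or other interop) wrapper and
called from .NET without recompiling it as managed code:

      * Hypothetical COBOL routine with a clean, explicit call interface.
      * A wrapper object can expose this as a method; the business logic
      * inside it does not change.
       IDENTIFICATION DIVISION.
       PROGRAM-ID. GETCUST.
       DATA DIVISION.
       LINKAGE SECTION.
       01  LS-CUST-ID         PIC 9(8).
       01  LS-CUST-RECORD.
           05  LS-CUST-NAME       PIC X(40).
           05  LS-CUST-BALANCE    PIC S9(9)V99 COMP-3.
       01  LS-RETURN-CODE     PIC S9(4) COMP.
       PROCEDURE DIVISION USING LS-CUST-ID LS-CUST-RECORD LS-RETURN-CODE.
           MOVE 0 TO LS-RETURN-CODE
      *    ... existing business logic stays exactly as it is ...
           GOBACK.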

Continuing development in COBOL is fine as an interim measure while people
are being trained in newer languages and techniques but ultimately you
either salvage or rewrite the code you have. Either way, just recompiling it
to run on the .NET platform simply perpetuates the use of COBOL "as it is".
Many companies are finding that, in today's world, that is proving to be
simply not enough.

Modern tools and techniques deal with objects and layers, and traditional
COBOL doesn't. (OO COBOL as implemented by Micro Focus and Alchemy, to name
just two major players, is a step in the right direction, but it is
expensive and the OO features were bolted onto it, so it can never be as
powerful as languages designed to manage objects from the day they were
conceived.)

PRIMA is committed to providing tools that move traditional COBOL into the
new world of Object Orientation. We already have tools that enable fully
automated conversion of COBOL file-based solutions to a relational
database. That is an important step, because it opens up the data resource
without requiring manual amendment or rewrite of existing code. (The tools
amend the code automatically to access the new RDB, so it continues to
function logically as it always has, but now other tools and other languages
can access the data, as well as COBOL.) Both the new database and the code
that accesses it can be generated automatically from existing COBOL
copybooks.
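
As a rough illustration of the idea (the copybook, column and table names
below are invented for this example, not output from our actual tools), a
record described by a copybook like this maps naturally onto a relational
table, with the record key becoming the primary key:

      * Hypothetical copybook for an indexed customer file.
       01  CUSTOMER-REC.
           05  CUST-ID            PIC 9(8).
           05  CUST-NAME          PIC X(40).
           05  CUST-STREET        PIC X(30).
           05  CUST-CITY          PIC X(20).
           05  CUST-BALANCE       PIC S9(9)V99 COMP-3.
      *
      * One plausible generated table, sketched here as SQL in a comment:
      *   CREATE TABLE CUSTOMER (
      *       CUST_ID       NUMERIC(8)    NOT NULL PRIMARY KEY,
      *       CUST_NAME     CHAR(40),
      *       CUST_STREET   CHAR(30),
      *       CUST_CITY     CHAR(20),
      *       CUST_BALANCE  DECIMAL(11,2) )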

The process is a simple three-step one, and we have tools that do each step
fully automatically:

1. Create a new Relational database that is functionally equivalent to the
existing file base. (A set of tables on a Relational Database (RDB) is
generated, in at least second normal form (2NF), for each existing indexed
file.)

2. Objects are generated (callable locally or remotely from the desktop or a
web page) to manage each of the generated table sets. These are COM server
components, and are referred to as Data Access Layer (DAL) objects.

3. Existing programs that access the indexed files are transformed to invoke
the DAL objects that manage the table sets. (As the DAL objects can be used
with any language that supports COM (including COBOL - both OO and standard)
they provide a useful separation between Business logic and Data layers and
can be used for future development as well as the existing applications. In
effect these objects are "future proof".)
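
To make step 3 concrete, the transformation has the following general shape
(the DAL name and its call interface below are invented for illustration;
the generated components define their own interfaces):

      * Before: the program reads the indexed file directly.
           MOVE WS-CUST-ID TO CUST-ID
           READ CUSTOMER-FILE
               INVALID KEY MOVE 1 TO WS-NOT-FOUND
           END-READ

      * After: the same lookup is delegated to a (hypothetical) DAL
      * component. The program no longer knows, or cares, that the data
      * now lives in relational tables.
           MOVE WS-CUST-ID TO DAL-CUST-ID
           CALL "CUSTOMER-DAL" USING DAL-ACTION-READ
                                     DAL-CUST-ID
                                     CUSTOMER-REC
                                     DAL-STATUS
           IF DAL-STATUS NOT = ZERO
               MOVE 1 TO WS-NOT-FOUND
           END-IF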

Having achieved a stable base that is running existing applications against
the new RDB, the next step is to refactor existing COBOL code into objects
also.

This process is currently mainly manual, but we are developing tools to
automate aspects of it and hope eventually to make it "mostly automated".

Ancillary tools generate load modules that read the indexed files and write
the table sets, generate Host Variables for ESQL (DECLGEN), and a number of
other tasks that are tedious (and error prone...) when done manually.
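
The load step itself is conceptually just a read-and-insert loop. A
stripped-down sketch (file, table and host-variable names are hypothetical;
the host variables would come from the generated DECLGEN copybook):

       LOAD-CUSTOMERS.
           OPEN INPUT CUSTOMER-FILE
           PERFORM UNTIL WS-EOF = "Y"
               READ CUSTOMER-FILE
                   AT END
                       MOVE "Y" TO WS-EOF
                   NOT AT END
                       MOVE CUST-ID      TO HV-CUST-ID
                       MOVE CUST-NAME    TO HV-CUST-NAME
                       MOVE CUST-BALANCE TO HV-CUST-BALANCE
                       EXEC SQL
                           INSERT INTO CUSTOMER
                                  (CUST_ID, CUST_NAME, CUST_BALANCE)
                           VALUES (:HV-CUST-ID, :HV-CUST-NAME,
                                   :HV-CUST-BALANCE)
                       END-EXEC
               END-READ
           END-PERFORM
           EXEC SQL COMMIT END-EXEC
           CLOSE CUSTOMER-FILE.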

Please visit http://primacomputing.co.nz and read my statement about how
this approach differs from a straight conversion using .NET COBOL.

For a graphic representation see
http://primacomputing.co.nz/COBOL21/mig.aspx

As for the topic of this thread, our base toolset comes at under half the
price of a .NET COBOL compiler. Furthermore we engage with and support
anyone using our tools, to ensure their migration is successful and as
"painless" as possible.

Pete.
--
"I used to write COBOL...now I can do anything."


From: Joel C. Ewing on
On 05/21/2010 07:39 AM, Pete Dashwood wrote:
> [snip: article and Pete Dashwood's reply, quoted in full above]

I would take issue much earlier in the original comments starting with
the "These applications are very, very expensive to run on older
mainframes,...".

Are they really running these applications on "older mainframes" --
stuff from water-cooled days or earlier? If so, then that is a
ridiculously expensive choice and their cheapest solution would be to
immediately move to latest hardware that gives much more bang per buck
with lower environmental, energy, and maintenance costs. If these are
mission-critical applications, changing to other less-robust platforms
that are harder to manage for Disaster Recovery may be cheaper in the
short term than using modern mainframes, but could be a very unwise
long-term move.

There seems to be all sorts of confusion here about what are potentially
distinct choices of programming language, hardware, and Operating System
platform. If there is one thing the last 30 years have taught, it is that
mass conversion to the latest and greatest programming language or
programming paradigm of the day is not something to be undertaken
lightly, and it doesn't guarantee that one will end up with something that
is as efficient or costs any less to run and maintain than the original.
There are modern mainframe platforms that support COBOL quite well, and
modern COBOL has supported OO programming techniques and relational
databases for years.
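
For example, ISO 2002 COBOL (and vendor OO COBOL before it) has classes,
methods, and INVOKE. A rough, purely illustrative sketch (exact syntax
varies by compiler):

       CLASS-ID. ACCOUNT.
       OBJECT.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  BALANCE            PIC S9(9)V99 VALUE ZERO.
       PROCEDURE DIVISION.
       METHOD-ID. DEPOSIT.
       DATA DIVISION.
       LINKAGE SECTION.
       01  AMT                PIC S9(9)V99.
       PROCEDURE DIVISION USING AMT.
           ADD AMT TO BALANCE
           GOBACK.
       END METHOD DEPOSIT.
       END OBJECT.
       END CLASS ACCOUNT.

      * A client program would create an instance and invoke it:
      *     INVOKE ACCOUNT "NEW" RETURNING WS-ACCOUNT
      *     INVOKE WS-ACCOUNT "DEPOSIT" USING WS-AMOUNT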

I don't think Grace Hopper would have described herself as the inventor
of COBOL. She did create Flow-Matic, one of the earliest English-like
programming languages, pioneered the idea of using English phrases in a
programming language, contributed significantly to the CODASYL committee
that extended Flow-Matic into what became COBOL, and was also a major
player in the adoption of COBOL by the military.
From: Pete Dashwood on
Joel C. Ewing wrote:

I thought this was a very good response, Joel. I agree with what you said,
but I wanted to expand on some of it, so I have done so below :-)


> [snip: earlier posts quoted in full above]
>
> I would take issue much earlier in the original comments starting with
> the "These applications are very, very expensive to run on older
> mainframes,...".

I think it is all relative. Basically, I agree with you on this. If you have
been running something for years and you can afford to do it, then it can't
be "very, very expensive". (If it were, you would have sought an alternative
years ago...)

In fact, if you are using older hardware and it is being supported and
maintained, it is probably cheaper to run now than it has ever been.
Certainly its book value will be far less (if you own it) than it was when
you bought it.
>
> Are they really running these applications on "older mainframes" --
> stuff from water-cooled days or earlier? If so, then that is a
> ridiculously expensive choice and their cheapest solution would be to
> immediately move to latest hardware that gives much more bang per buck

Again, many people who are not in the mainframe world forget that mainframe
technology has also advanced and taken advantage of new hardware. I remember
a company I worked for paying half a million dollars for an IBM 360-30 with
32K of memory. Obviously, that same machine today could probably be
purchased in a garage sale for around $1000, if you could find one. The point
is that mainframe costs were exorbitant as long as there was little
alternative. Now there are alternatives and the mainframe vendors (mainly
IBM) have had to recognise this and respond.

> with lower environmental, energy, and maintenance costs. If these are
> mission-critical applications, changing to other less-robust platforms
> that are harder to manage for Disaster Recovery may be cheaper in the
> short term than using modern mainframes, but could be a very unwise
> long-term move.

I'm not sure that is a persuasive argument any more. For a long time
mainframe people were saying that the networks could not be as secure or
"robust" as a mainframe (with some justification), but I believe that is not
the case any longer. A corporate intranet can be every bit as secure if
proper steps are taken to make it so, and redundant arrays at remote
locations mean that disaster recovery can be effected in moments, even if
the local nodes are completely wiped out. I think some of these are old
arguments that are now moot.
>
> There seems to be all sorts of confusion here about what are
> potentially distinct choices of programming language, hardware, and
> Operating System platform. If there is one thing the last 30 years
> have taught, it is that mass conversion to the latest and greatest
> programming language or programming paradigm of the day is not
> something to be undertaken lightly, and it doesn't guarantee that one
> will end up with something that is as efficient or costs any less to
> run and maintain than the original.

Certainly, platform migration based on fashion alone is a foolish move.
However, in the case of COBOL it isn't about fashion. Object Orientation has
been with us for a long time now and it isn't going away. It underpins
computing and IT in such a pervasive way that it simply cannot be ignored if
you need anything more than the simplest kinds of batch processing. For
existing COBOL functions to be leveraged into the modern world and to
interact with other packages and applications, COBOL has to put on an Object
mantle. Traditionally, the skills to do this haven't been available, nor has
there been much perception of the need. (The majority of "COBOL programmers"
know COBOL and understand procedural processing very well; it is not easy to
make the transition to a new paradigm like OO, and they find it difficult. I
know this from personal experience, having made the necessary transition
many years ago and helped a number of others to make it. Once you have done
it, it is easy to forget that you "sweated it" for a while. However, the
benefits definitely make it worth the effort.) It is only in the past decade
or so, as the rest of the world has shown it can produce applications
cheaply and easily that leave COBOL gasping, that corporate IT departments
have had to sit up and take notice.

Now there is a situation where most corporations are voting with their feet
and looking for ways to get out of COBOL. This is what has largely led to
the "bad reputation" COBOL is perceived to have, and has led in turn to
universities dropping it; where it is still taught at all, it is mostly
covered in a cursory fashion as part of a "History of Computing" module.
(Universities have to respond to the needs of industry, and industry needs
people who understand objects and layers.)

The problem with this is that there is a HUGE existing investment in COBOL
and nobody wants to simply write this off, even if that were possible. The
last decade has seen increasing efforts to address this with packages and
platform migrations and there have been some disasters. (There have also
been some success stories, but we don't really tend to hear about those
except in a marketing context, and most people are sceptical of anything
they hear in a marketing context... :-))

If someone comes along with a "solution" that says: "Don't worry, it's OK.
You can do what you've always done and keep on writing procedural code, but
now you can move it off your "very, very expensive" platform", you can
hardly blame people for embracing it. IT can report that they are now
running on a modern network, nobody needs to be retrained, and the status
quo remains intact, which makes everybody feel comfortable... until Reality
bites. They find that new packages with the functionality they require can't
work with their monolithic procedural code; their existing code is using
network resources at an alarming rate; the cost savings they were promised
don't eventuate, because they are continually upgrading and extending the
network to accommodate their converted legacy; and they are unable to enjoy
newer features and facilities in their environment because COBOL doesn't
support them yet.

(After a while, the tech people get tired of forever playing "catch up" in
OO COBOL and simply move to Java, C# or VB.NET, biting the bullet on
retraining because they are now motivated to expand their skill sets. This,
of course, is what they should have done in the first place...)

The truth is that if you want to live in Rome, you have to act like a Roman.
(Otherwise you are consigned to the ghetto of expats and can never enjoy the
full benefits of Roman citizenship.) If COBOL is to succeed in an OO world,
it needs to be open (a database rather than flat files) and it needs to be
Objects and Layers.

That is why I have ensured that my company, at least, uses this approach. It
doesn't matter how you cut it, there is expense in moving off COBOL (of
course, there may be even more expense in trying to retain it). You can
invest that expense in a planned long-term direction to bring the essential
parts of your legacy into the modern world, or you can use it to simply buy
time and prolong the status quo.

>There are modern mainframe platforms that support
> COBOL quite well, and modern COBOL has supported OO programming
> techniques and relational databases for years.

Yes, it has, Joel. Unfortunately the COBOL community never really embraced
it and, to be fair, it is only gradually becoming apparent to many why
they need to.
>
> I don't think Grace Hopper would have described herself as the
> inventor of COBOL. She did create Flow-Matic, one of the earliest
> English-like programming languages, pioneered the idea of using
> English phrases in a programming language, contributed significantly
> to the CODASYL committee that extended Flow-Matic into what became
> COBOL, and was also a major player in the adoption of COBOL by the
> military.

And that was all over 50 years ago. Kudos to her, but it isn't relevant to
solving IT problems in present time. The world has moved on.

Objects and Layers.

Pete.

--
"I used to write COBOL...now I can do anything."


From: SkippyPB on
On Fri, 21 May 2010 13:27:20 -0500, "Joel C. Ewing"
<jREMOVEcCAPSewing(a)acm.org> wrote:

>[snip: Joel C. Ewing's reply and the earlier posts it quotes]


I have to go with Joel here about costs. While the government usually
isn't on the forefront of technology, I believe they have moved off
the Burroughs mainframes that used to run everything and onto IBM
hardware running z/OS. As for using COBOL to run the bulk of their
applications, there is absolutely nothing wrong with that in my opinion.
So what if the language is over 50 years old? It has, in recent years,
been upgraded and enhanced not only by IBM but by other companies as
well. It is well situated to be a cost-effective language for the
mainframe for the next 50 years...far more than any of the current
soup du jour languages.

Regards,
--

////
(o o)
-oOO--(_)--OOo-


There are three kinds of men. The ones that learn by reading. The few
who learn by observation. The rest of them have to pee on the electric fence
for themselves.
--Will Rogers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Remove nospam to email me.

Steve