
The Return of Ada 336

Posted by Zonk
from the lord-byron-will-be-in-the-sequel dept.
Pickens writes "Today when most people refer to Ada it's usually as a cautionary tale. The Defense Department commissioned the programming language in the late 1970s, but few programmers used Ada, claiming it was difficult to use. Nonetheless, many observers believe the basics of Ada are in place for wider use. Ada's stringency causes more work for programmers, but it will also make the code more secure, Ada enthusiasts say. Last fall, contractor Lockheed Martin delivered an update to ERAM, the Federal Aviation Administration's next-generation flight data air traffic control system — ahead of schedule and under budget, which is something you don't often hear about in government circles. Jeff O'Leary, an FAA software development and acquisition manager who oversaw ERAM, attributed at least part of that success to the use of Ada, which was used for about half the code in the system."
  • by r00t (33219) on Tuesday April 15, 2008 @01:19PM (#23079612) Journal
    All that vulnerable client-side code (image libraries, HTML parser, etc.) would be immune to buffer overflows if it were in Ada.

    Even better, write it in proof-carrying Ada. (While a fully general theorem prover is impossible, one can get a theorem prover to work in practice via minor tweaks to the input.)
  • by ErichTheRed (39327) on Tuesday April 15, 2008 @01:27PM (#23079732)
    I don't think I'm the only one who has had to work with really lousy programming and IT coworkers. One of the good things about the past was that programmers had a much harder time hiding their mistakes. In the days of dual-core processors and tons of RAM, even a mediocre programmer can get Java or any of the .NET languages to produce code that works. Of course, readability, maintainability and speed aren't really a factor.

    Is going back to Ada and other similar languages a good idea? Maybe. But I think you could get the same result by just demanding better quality work out of existing languages. People have correctly pointed out that the languages aren't really to blame, because you can write garbage in just about any language.

    I sound like an old fogey, but I'd much rather see a smaller IT workforce with a very high skill set than a huge sea of mediocre IT folks. This would help combat outsourcing and the other problems affecting our jobs. Almost everyone I've heard complaining the loudest about outsourcing has been either downright lazy or just not very good at what they do.

    I'm primarily a systems engineer/administrator. There are many parallels in my branch of IT to the development branch. We've got the guys who can really pick a system apart and get into the guts of a problem to find the right answer. We also have the ones who search Google for an answer, find one that solves half the problem, and wonder why the system breaks a different way after they deploy it.

    Not sure how to solve it, but I think it's a problem that we should work on.
  • by Alistair Hutton (889794) on Tuesday April 15, 2008 @01:28PM (#23079742) Homepage
    I'm actually quite fond of Ada as a language. Yes, it's a very verbose language, but unlike, say, Java or C#, the verbosity buys you a lot. It gives you good threading. It gives you very good encapsulation. It gives you a very nice parameter system for procedures/functions. That's another point: it distinguishes between procedures and functions. It gives very, very, very good typing. Very good typing. It's very good. I like it. It's what I want when I'm doing strong, static typing, rather than the wishy-washy, getting-in-the-way mess that many other mainstream languages offer. When I use a type I want it to mean something. It's a good language to teach students about programming, in my opinion.
  • by museumpeace (735109) on Tuesday April 15, 2008 @01:31PM (#23079790) Journal
    I make a nice living rewriting Ada systems into C++. When DoD suspended the "only quote us system development costs based on Ada" requirement, most bidders dropped Ada like a burning bag of poop. Its best advances, such as exception handling, have been picked up by modern systems programming languages and even Java. The doctrinaire variable type enforcement has yet to be equaled, but OO it really ain't. Bottom line: plenty of old defense software systems have few living authors who will admit to knowing the code, so upkeep is expensive and talent hard to find. This is ironic, since DoD spec'd Ada in the first place because it had a maintenance nightmare of hundreds of deployed languages. So of course the managers think a more popular language with "all the features" of Ada should be a porting target. Eventually even customers demanded modernization and compatibility ports.

    I know a few die hard Ada programmers who just love it...but very few. The brilliance of the language can be debated but its moot: no talent pool to speak of.

    And besides, Ada is really French. [why did GNU make an ada compiler??????????????]

    technology market: you can't separate technical merits from market forces
    open source: your market has a small leak and is slowly collapsing.
  • by hey! (33014) on Tuesday April 15, 2008 @01:32PM (#23079806) Homepage Journal
    I think there is something to be said for this idea. Not too much, mind you, but something.

    Let's imagine a language so obscure and difficult that 90% of working programmers cannot gain sufficient mastery of it to understand what it is saying at first glance. This sounds terrible, until you realize that every programmer has at some time in his life written code in "friendly" languages that 0% of programmers (including his future self) can understand. And maybe selecting a language that only the top 1% of programmers is capable of using might be a good thing for some projects.

    Unfortunately, I don't think you can mandate thinking via language restrictions.

    If you really, really wanted to improve the quality of thought in code, you wouldn't mandate languages, you'd mandate editors that don't support cut and paste. Then, instead of taking a piece of code that works more or less for one purpose and hammering it into the approximate shape you need for something else, sooner or later you'd be forced to abstract what was useful about it rather than banging it out over and over again.
  • by dprovine (140134) on Tuesday April 15, 2008 @01:43PM (#23079944)

    I think that any strongly typed language with lots of compile time and link time checks would be about as good (e.g., Java).

    In Ada, you can declare a variable to be an integer in the range 1..100, and if it goes outside that range at any point during its lifetime, an exception is immediately thrown. In most languages, you'd have to check it every time you assign it.

    Also, you can declare subtypes which not only define ranges but wall themselves off from each other. If you declare "MonthType" and "DateType" as types, and then ThisMonth and ThisDate as variables, you can't assign ThisMonth to ThisDate (or vice versa) without an explicit cast, even if the value stored is within range.

    I programmed in Ada more-or-less exclusively for a year, with all the warnings possible turned on, and it did change a bit how I think about programming. I always know, instantly, what type any object is and what its limits are, because I got so used to thinking about those things when using Ada.

    Not that it's perfect, or the ultimate, or anything. I had a job where I wrote only C for about two years, and that definitely changed how I thought about programming too. When writing C++ I have a sense of what the computer is going to have to do to actually run the code.

    There's a quote, usually attributed to Alan Perlis, that any language which doesn't change how you think about programming isn't worth knowing. Ada built up my mental macros for making sure my types and values were in order, and for that alone it was worth learning and using for a year.

  • by drxenos (573895) on Tuesday April 15, 2008 @02:20PM (#23080458)
    When is the last time you used Ada? 1) See: Ada.Strings.Unbounded. 2) Ada leaves that choice up to the programmer (like C++) (see: pragma Controlled). The next version of Ada will have an STL-like library, which will at least reduce the need for GC. 3) See: pragma Export(C, Foo, "foo") and Convention(C, Foo). Some compilers even support C++ in place of C, with automatic translation between C++ classes and exceptions. 5) You should always know the type of data you are dealing with (unless you are writing generics, which still have some limits for safety). 6) Ada's dispatching is based on the actual call being made. No need to mark members as virtual (C++) or to just make them all virtual (Java).
  • by shoor (33382) on Tuesday April 15, 2008 @02:33PM (#23080596)
    Back in the 70s there was a big fuss being made about something called "Structured Programming". A lot of people took notice when a big project, an indexing system for the New York Times, was finished with remarkably few errors. Yet that success did not seem to become the norm. (It's mentioned briefly in the Wikipedia article on 'structured programming'.)

    COBOL used to be touted as a great language because it was 'self documenting'. Yet a lot of retired COBOL programmers got a last hurrah when they were hired to update obscure code in this 'self-documenting' language to handle dates with the year 2000 in them back at the end of the 90s.

    Basically, I think what it boils down to is discipline and talent in the development process. That is far more important than the choice of language. To some extent, I would buy the idea that the fewer lines of code required to write out a program, the better, because there are fewer chances of errors. But even that can be taken to extremes in a language like APL, or if the lines refer back through obscure nests of classes. By few lines of code I mean a few readable lines of code that a programmer can look at and actually know what is supposed to be happening and how.

  • by HiThere (15173) <charleshixsn@earthlin k . n et> on Tuesday April 15, 2008 @02:54PM (#23080814)
    I didn't say there weren't solutions to those problems. Merely that the problems exist.

    1) Unbounded strings are a partial solution, but they don't blend nicely with string constants, so they're not a good solution.

    2) I don't like C or C++ and garbage collection either. I'm considering boost C++ for a project, but I'll probably opt for some language with decent garbage collection.

    3) Exporting a C interface isn't equivalent to good support of C++. Try interfacing Qt to Ada. (It's probably been done, but I mean try doing it yourself...this is about the general problem of C++ libraries.) And Ada doesn't even handle interfacing with C well (which isn't unusual, it's a hard problem...macros and other #define's).

    5) I can tell what kind of data I'm dealing with at run time, but dealing with this in Ada is unpleasant. Think of a container of objects that resides on a disk. If you read something in, it will tell you what kind of thing you've retrieved, but if you need to have that type hard-coded into your program, you've got to run through considerable contortions to deal with this. It's hard to handle this well and still be efficient at run time, but to me it's a continual irritation if I'm trying to use Ada.

    6) I think I understand what you're saying, but I guess I was too vague. Ada's model of objects *IS* a data structure. What you inherit is that data structure, which you can extend. Operations on the structure aren't a part of the "object". Operations, however, are typed to deal with particular groupings of objects. (I'm not clear on precisely when you need to say type'class, but I haven't used Ada much because of the other irritations.)

    Yes, I'm picky. I acknowledge that. Currently I'm using Python while I wait for D to develop into a suitable language. (Insufficient libraries is the main problem. Again, there are problems connecting to libraries written in C or C++; nobody seems to handle that well. Macros and other preprocessor abominations appear to be an intractable problem.) If D doesn't shape up in time I'll probably either switch to Boost C++ + Boost Python, or Python + Pyrex + C. Neither option thrills me, but both look possible. So does D + C + Python...though that one looks fragile. And maybe D + Pyrex + Python. But anything involving Python is going to presume that a compatible interpreter is installed on every system that hosts it.

       
  • C A R Hoare on Ada (Score:2, Interesting)

    by jchandra (15040) on Tuesday April 15, 2008 @03:19PM (#23081160) Homepage
    Hoare [wikipedia.org] wrote an interesting paper, titled The Emperor's Old Clothes [ucsb.edu], that discusses Ada.

    He writes

    And so, the best of my advice to the originators and designers of ADA has been ignored. In this last resort, I appeal to you, representatives of the programming profession in the United States, and citizens concerned with the welfare and safety of your own country and of mankind: Do not allow this language in its present state to be used in applications where reliability is critical, i.e., nuclear power stations, cruise missiles, early warning systems, anti-ballistic missile defense systems. The next rocket to go astray as a result of a programming language error may not be an exploratory space rocket on a harmless trip to Venus: It may be a nuclear warhead exploding over one of our own cities.
  • by cmeans (81143) <cmeans&intfar,com> on Tuesday April 15, 2008 @04:04PM (#23081832) Homepage Journal
    Or it could simply mean that they way over-budgeted the work that actually needed to be performed, and poorly estimated the time it would take to do it. Normally, companies are more conservative about how long something will take (this helps bring the cost down and increases the likelihood that they'll get the work), so reality tends to extend the deadline. In this case, they might have simply been overly generous in their time allotment, and fortunate that the client was willing to pay for it.
  • Re:I used ada.... (Score:5, Interesting)

    by SL Baur (19540) <steve@xemacs.org> on Tuesday April 15, 2008 @04:06PM (#23081864) Homepage Journal
    Yeah. The typed I/O stuff was really the pits. It was even more difficult to send arbitrary data across the wire in networked applications (which of course, they all are in C3I - one of Ada's first application domains). Difficult, but not impossible.

    Perhaps the best job I ever had was when I was the 900 pound gorilla who vetted commercial Ada compilers. Every so often the boss would come in to my office, drop a package or tape of a commercial Ada compiler on my desk and say, "tell me what you think about this".

    I got so frustrated with Verdix Ada at one point because they had potentially the best system, but ignored our (valid) bug reports. After perhaps one beer too many and seeing a remark about VADS on comp.lang.ada, I flamed them. The next day, I got email and a telephone call from a guy at Verdix. After some discussion, I agreed to become a beta tester and if my concerns were addressed to issue a formal public apology on the newsgroup. I did, they did and I did. Unfortunately, the fix was in, the official Unix Ada compiler for the DoD was declared to be Alsys (Ichbiah (Green), Brosgol (Red), duh).

    I never met Ichbiah, but I did get to meet Benjamin Brosgol. He participated in Ada training (reeducation sessions) for Software Engineers at the company I worked for. A nice man, but I don't particularly care for the design decisions he makes in language design (and being me, I let him have both barrels - he's remarkably even tempered too).

    Alsys was barely usable - the code it produced worked, but even small systems (30k SLOC) took hours to recompile. At one point I was setting up a network test and noticed that one of my embedded message strings was wrong. Rather than doing a painful recompile of the world, I fixed it by editing the binary in Emacs. A couple weeks later, the test was still chugging along (remarkable for Ada stability at the time) and when it was time to give a demo to the highest ranking General in the US Army, the boss lady told me to just leave it running, so I did.

    So whatever anyone says about Ada in the 1980s, the view from the trenches was somewhat different. I also have no doubt that the technology probably got quite good in the 1990s. Early adopters always get the rough end of the stick.
  • Re:I used ada.... (Score:5, Interesting)

    by Not The Real Me (538784) on Tuesday April 15, 2008 @04:07PM (#23081868)
    "Likewise, I also used Ada in college."

    I too used Ada in college. Ada is a superset of Pascal. It's very similar to Borland's Delphi and Oracle's PL/SQL, which are basically their versions of Object Pascal.

    The FAA should've used Java. Then the project would've taken 3x longer and had cost overruns of 400% and/or would've gotten cancelled, like most government projects.
  • by Darinbob (1142669) on Tuesday April 15, 2008 @04:37PM (#23082236)
    The problem with software taking too long to write and ending up over budget has little to do with how easy it is to write a line of code. Obscurity should have zero impact on a quality team of programmers (who should all be able to switch languages at a drop of a hat). Writing slowly should have little impact as well, since most programming should rarely involve having to type as fast as you can all day.

    So why does a difference in language make an impact? If a language emphasizes good programming practices, the users of that language tend to follow along. A language that is sloppy with types or has malleable interfaces can encourage sloppier code. You can write great code in any language, and yet great code still remains relatively uncommon.

    Another factor is with project management itself. The project under consideration isn't about writing a new version of quicksort quickly and under budget, it's about a huge piece of work with maybe dozens of people working together. Programming such a project is not about writing a new function, but about how to fix or change existing functions without breaking something else, and how to get different people's work to fit together as planned.
  • by Anonymous Coward on Tuesday April 15, 2008 @04:57PM (#23082526)
    Oliver,

    I worked with your tools on this project. They worked great. I agree that Ada was the right tool for this job. I also have the opinion that Ada causes programmer burn-out and it is highly probable that not all the code in the system that was in Ada should have been in Ada.
  • Re:I used ada.... (Score:1, Interesting)

    by Anonymous Coward on Tuesday April 15, 2008 @06:34PM (#23083310)
    I was a victim of Ada '83 in the late 80s. It all sounded like a good idea at the beginning but as I had to wrestle with the crapfest I grew to hate it like no other language. At that point I was willing to pick up any language and try it: Forth, Modula, etc., so I jumped in.

    All the propaganda (even hard technical documentation was full of it) compared Ada favourably with Cobol and Fortran instead of real-world choices in the 80s which all left it for dead. What a grab-bag of incompatible concepts. God forbid you used a limited private type anywhere that found its way into a structure. You'd spend your days writing Set() functions for each of those structures.

    Then there's Section 13, which embedded systems spend a lot of time involved with. The interrupt handling was a joke. It varied wildly from compiler to compiler, even between compilers from the same vendor (you paid separately for a 286 and 386 compiler - A$15k in 1990!). So there went your portability.

    And the compilers! What a gigantic fraud. We paid A$4k for a compiler that allocated all automatic arrays on the heap at the top of a function, but decided deallocating them wasn't worth doing at the bottom.

    Ada '95 might have fixed some or all these issues, but I don't care. I wouldn't work in it again if you doubled my salary. I'm still a fan of strong typing, but living in C for fifteen years I've implemented my own disciplines to 99% replace that.

    As far as I'm concerned Jean Ichbiah is the greatest con-artist in software history.
  • by Samah (729132) on Tuesday April 15, 2008 @08:05PM (#23084186)
    I don't understand why people love dynamic typing so much. I'd much rather the compiler tell me off for misusing a variable than at runtime where it might not crop up until the code is in production. I'm not saying dynamic typing is BAD (I love Lua), just that I don't get why so many people can possibly hate static typing.
    Also, I don't think you can really label Java as "verbose" when it shares mostly the same syntax as C++ and C# (unless you consider those to be verbose too).
    Having said that, it's not exactly terse, either. :)
  • by airdrummer (547536) <{ten.nozirev} {ta} {remmurd_ria}> on Tuesday April 15, 2008 @09:35PM (#23084922)
    i worked in ada, late '80s, early '90s, also wrote a troff clone;-) i had just learned c & chafed @ the strictness (i don't need no steenkin compile-time checks;-) but i grew to appreciate it...definitely leads to better code.

    http://www.adaic.org/atwork/trains.html [adaic.org] gives an interesting case history of the superiority of ada for teaching real-time programming:-)

    " The only difference between the years in which teams succeeded in implementing their projects and those in which no team succeeded was the implementation language."
  • Re:I used ada.... (Score:3, Interesting)

    by Mad Merlin (837387) on Tuesday April 15, 2008 @09:51PM (#23085052) Homepage

    "Ever try reading in and storing an arbitrary length string? I'm fairly convinced it's not possible in Ada."

    "It's not possible anywhere, unless you have access to arbitrarily large memory. Ada simply makes you aware of that fact before you put the code into production."

    I'm not saying it has to be in a single step, indeed to do it safely will often require multiple steps. However, Ada doesn't seem capable of either.

    The icing on the cake was that if you do a get(foo), where foo is a fixed-length string, it will read exactly the length of the string, hanging if there's not enough input and stopping part of the way through the input if there's too much. It might be possible to work around that by reading a single character at a time and then repeatedly appending it to a string which you could theoretically resize. However, IIRC, resizing the string, casting a character to a string, and appending to a string are all extremely painful or impossible.

  • by smellotron (1039250) on Wednesday April 16, 2008 @02:48AM (#23086868)
    > In Ada, you can declare a variable to be an integer in the range 1..100, and if it goes outside that range at any point during its lifetime, an exception is immediately thrown. In most languages, you'd have to check it every time you assign it.

    I know it's not the same, but can't you get the same effect in other languages?  In particular, C++ can be used to implement numbers with ranges and units.  Something like this...

    #include <stdexcept>

    // Range-checked integer tagged with a unit type: the range is
    // enforced at run time, the units at compile time.
    template <int min_value, int max_value, typename type_tag>
    class checked_integer {
        int value;
        static int check(int v) {
            if (v < min_value || v > max_value)
                throw std::out_of_range("out of declared range");
            return v;
        }
    public:
        checked_integer(int v) : value(check(v)) { }
        int get() const { return value; }
    };

    struct distance_tag { };
    struct time_tag { };
    struct speed_tag { };

    typedef checked_integer<1, 100, distance_tag> meters_t;
    typedef checked_integer<1, 10, time_tag> seconds_t; // sidesteps the C runtime's time_t
    typedef checked_integer<1, 20, speed_tag> speed_t;

    // define distance / time = speed
    speed_t operator/(meters_t d, seconds_t t) {
        return speed_t(d.get() / t.get());
    }

    int main() {
        meters_t distance = 30;
        seconds_t time = 5;
        speed_t speed = distance / time;      // compiles
        // meters_t whoops = distance / time; // doesn't compile: result is speed_t
        meters_t oops = 100000;               // throws at run time: out of range
    }
