
Why Programming Still Stinks 585

Andrew Leonard writes "Scott Rosenberg has a column on Salon today about a conference held in honor of the twentieth anniversary of the publication of 'Programmers at Work.' Among the panelists saying interesting things about the state of programming today are Andy Hertzfeld, Charles Simonyi, Jaron Lanier, and Jef Raskin."
This discussion has been archived. No new comments can be posted.

Why Programming Still Stinks

Comments Filter:
  • panel link (Score:5, Informative)

    by AnonymousCowheart ( 646429 ) on Saturday March 20, 2004 @11:18PM (#8624805)
    Here is the link [sdexpo.com] to the people on the panel, with some background on each.
  • by Anonymous Coward on Saturday March 20, 2004 @11:26PM (#8624855)
    In some quarters today, it's still a controversial proposition to argue that computer programming is an art as well as a science. But 20 years ago, when Microsoft Press editor Susan Lammers assembled a collection of interviews with software pioneers into a book titled "Programmers at Work," the idea was downright outlandish. Programming had long been viewed as the domain of corporate engineers and university computer scientists. But in the first flush of the personal computer era, the role of software innovator began to evolve into something more like the grand American tradition of the basement inventor -- with a dollop of the huckster on top and, underneath, a deep foundation of idealism.

    It made sense that the people writing the most important code for the new desktop machines were ragged individualists with eccentric streaks. At a panel on Tuesday (sponsored by the SDWest conference and Dr. Dobb's Journal) that celebrated Lammers' book, seven of the 19 original subjects of "Programmers at Work" lined up on stage to talk about what's changed in software over the past two decades -- and demonstrate that they have lost none of their cantankerous edge.

    In "Programmers at Work," Lammers told the crowd, "I looked at the programmer as an individual on a quest to create something new that would change the world." Certainly, the panel's group lived up to that billing: it included Andy Hertzfeld, who wrote much of the original Macintosh operating system and is now chronicling that saga at Folklore.org; Jef Raskin, who created the original concept for the Macintosh; Charles Simonyi, a Xerox PARC veteran and two-decade Microsoft code guru responsible for much of today's Office suite; Dan Bricklin, co-creator of VisiCalc, the pioneering spreadsheet program; virtual-reality pioneer Jaron Lanier; gaming pioneer Scott Kim; and Robert Carr, father of Ashton-Tate's Framework.

    But for all their considerable achievements, this was not a group content to snooze on a heap of laurels. In fact, though the hour-and-a-half discussion was full of contention, one thing all the participants agreed on was that software today is in dire need of help. It's still too hard: not only for users struggling to make sense of poorly designed interfaces, but for programmers swimming upstream against a current of constraints that numb creativity and drown innovation.

    These veterans shared a starting-point assumption that the rest of the world is only slowly beginning to understand: While computer hardware seems to advance according to the exponential upward curve known as Moore's Law (doubling in speed -- or halving in cost -- every year or two), software, when it advances at all, seems to move at a more leisurely linear pace.

    As Lanier said, "Software inefficiency can always outpace Moore's Law. Moore's Law isn't a match for our bad coding."

    The impact of this differential is not simply a matter of which industry gets to collect more profits. It sets a maddening limit on how much good we can expect information technology to achieve. If computers are, as it has often been put, "amplifiers for our brains," then software's limitations cap the volume way too low. Or, in Simonyi's words, "Software as we know it is the bottleneck on the digital horn of plenty."

    Most successful programmers are at heart can-do engineers who are optimistic that every problem has a solution. So it was only natural that, even in this relatively small gathering of software pioneers, there were multiple, and conflicting, ideas about how we should proceed in order to break that bottleneck.

    Simonyi believes the answer is to unshackle the design of software from the details of implementation in code. "There are two meanings to software design," he explained on Tuesday. "One is, designing the artifact we're trying to implement. The other is the sheer software engineering to make that artifact come into being. I believe these are two separate roles -- the subject matter expert and the software engineer."

    Giving the forme
  • Re:Copyright (Score:4, Informative)

    by CeleronXL ( 726844 ) on Saturday March 20, 2004 @11:30PM (#8624876) Homepage
    In general I'd have to agree, but in this case seeing as Salon was simply trying to get money by having one of their own staffers(?) submit the article here, I think it would be well deserved. ;)
  • by torokun ( 148213 ) on Saturday March 20, 2004 @11:52PM (#8625002) Homepage
    This is some serious copyright infringement, man. Ripping an article verbatim and posting it on another site.

  • by YU Nicks NE Way ( 129084 ) on Sunday March 21, 2004 @12:40AM (#8625238)
    Uhh...no. Although I'm very angry with the Slashdot editors for posting this particular lead, this is blatant infringement of the most egregious form. Fair use only defends people who violate copyright in small ways (e.g., by printing a limited passage with full attribution) or who differ in some other significant way from the original (e.g., a parody is almost by definition infringing, but protected by fair use). Including a whole text verbatim for the purpose of avoiding the tariff a copyright holder has set is not fair use, whether or not the author benefits from it.
  • by JPriest ( 547211 ) on Sunday March 21, 2004 @12:41AM (#8625248) Homepage
    Here is the same link [google.co.th] without the annoying highlighted words.
    just remove all+the+search+terms

    The same link here [google.co.th] with extra words highlighted.

  • Hungarian Notation (Score:5, Informative)

    by Speare ( 84249 ) on Sunday March 21, 2004 @12:56AM (#8625314) Homepage Journal

    The linked page didn't mention that Charles Simonyi is the Hungarian for whom Hungarian Notation is named.

    Hungarian Notation is the Microsoft Windows naming convention in which the first few characters of a variable name hint at its data type. So hToolbox tips you off that it is a HANDLE, even without scrolling your editor up a couple of pages to check; papszEnvironment would likewise tell a Win32 devotee that it is a Pointer to an Array of Pointers to Zero-terminated Strings. (A rough sketch of the idea, transplanted into Java, follows at the end of this comment.)

    It's not the first such instance of binding data type and name, and it won't be the last. For example, FORTRAN compilers have long had implicit variables: any variable not otherwise declared that starts with I, J, K, L, M, or N is assumed to be an integer, while all other variables default to a real (floating-point) type. So FORCE(LAMBDA) pulls a real scalar out of a real array, given an integer index. Many programmers start a routine with IMPLICIT NONE to disable this assumptive behavior, since mistakes are easy to make when you let the machine decide things for you.

    BASIC used sigils at the end of variable names (NAME$, COUNT#), and scripting languages like Perl use sigils that precede the name (%phones, @log, $count).
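
    Since the prefixes above name Win32/C types, here is a rough, hypothetical Java-flavoured sketch of the same idea (the class and field names are invented for illustration; only the prefix convention is the point):

    // HungarianSketch.java -- hypothetical illustration of type-hinting prefixes.
    public class HungarianSketch {
        int cItems;            // c  = count
        boolean fDirty;        // f  = flag
        long hWindow;          // h  = opaque handle (stored as a long here)
        String szUserName;     // sz = zero-terminated string, a C-ism carried over
        String[] rgszArgs;     // rg = array ("range") of sz strings

        public static void main(String[] args) {
            // The prefix tells a reader the intended type at the point of use,
            // without scrolling back to the declaration.
            HungarianSketch s = new HungarianSketch();
            s.cItems = 3;
            s.fDirty = true;
            System.out.println(s.cItems + " items, dirty=" + s.fDirty);
        }
    }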

  • by BobaFett ( 93158 ) on Sunday March 21, 2004 @01:04AM (#8625358) Homepage
    Unrealistic expectations? Yes. Of the management? That's the least of our problems. The "common sense" expectations for software are vastly unrealistic. Modern software tools are incredibly complex systems, both in the number of internal "degrees of freedom" and in their interaction with the environment. Yet the public at large expects them to function properly under circumstances far more removed from what they were designed for than, say, driving a Ford across a lake (why is nobody complaining that Ford was lax in its testing because it only tested its cars on roads, which, after all, cover a tiny fraction of the Earth's surface?).

    At the same time, software is perhaps the only industry where everyone is a friggin expert. Most people (in the US anyway) happily pay someone $25 to change their oil. How many are willing to pay someone $25 to install a new sound card in their PC and load the drivers? And if you look at the complexity of interactions between parts of the system and the open-endedness of the interface with the environment, an oil change is downright primitive compared to a sound card swap. But no, for some reason it should be "easy". Look at everyone who enables "expert mode" on their software, while even "novice mode" presents more controls than a small airplane.

    And the only people who actually understand the complexity of the software, the software engineers, somehow let themselves become convinced that software really should allow millions of users to do thousands of things they want, the way they want, all at once, and be "easy" at the same time.
  • If anybody else goes to Simonyi's company [intentsoft.com] and still can't figure out what they're talking about (mostly because it's vapourware at the moment, I believe), may I direct you to this Wiki [program-tr...mation.org]. It turns out that he thinks source transformation tools will change the world.

    I'm told that my university is one of the leading source transformation research centres in the world, but the only interesting things they're producing right now are for understanding legacy systems. So yes, there's probably a lot of money in source transformation, but it's also boring as hell.
  • by Admiral Burrito ( 11807 ) on Sunday March 21, 2004 @03:45AM (#8625978)
    You can learn as much from a data structures class taught in Java as you can from one taught in $language_of_choice. The idea is to learn how things work fundamentally, and then apply those ideas practically. A linked list in Java works the same as a linked list in C.

    The thing is, Java is somewhat high-level. There are things that go on under the hood that you won't learn about, but once in a while pop up anyway. For example, being taught Java, you won't learn about the difference between memory allocated on the heap, and memory allocated on the stack. And yet...

    This does not work (it doesn't even compile):

    String x = "a";
    (new Runnable() { public void run() { x = "b"; } } ).run();
    System.out.println(x);

    There's nothing wrong with the code; the problem is that Java pretends to support closures but really doesn't. To use x in the anonymous inner class, you need to declare it final. But if you declare it final, you can't do the x = "b" assignment.

    I'm familiar with C, so I understand the difference between the heap and the stack. I can infer that x (the reference to the string, not the string itself) is allocated on the stack. It is not uncommon for instances of anonymous inner classes to outlive the stack frame they were created in, so the compiler doesn't know whether or not x (on the stack) will still exist when the object's run() method is called. So it makes a _copy_ of x, but in order to pretend that it is still x, the compiler wants you to declare it final so that the original and the copy can never get out of sync.

    Having experience with C, I know the heap is a safe place to put things that may need to outlive the current stack frame:

    final String[] x = new String[] { "a" };
    (new Runnable() { public void run() { x[0] = "b"; } } ).run();
    System.out.println(x[0]);

    It's ugly, but it works. The reference called x needs to be declared final (because it's on the stack) but the reference contained in the array does not need to be final (because it's on the heap).

    Because of my experience with lower-level stuff, I understand how Java is faking its support for closures, and how to work around the limitations. This is only one example; there are many other times when understanding things from a closer-to-the-metal perspective gives insights that would be lost if things were only understood from a high level. Joel Spolsky summed this up fairly well: Leaky Abstractions [joelonsoftware.com]
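
    For anyone who wants to run it, here is a self-contained version of the two fragments above (the class name is made up); the non-compiling capture is left in as a comment:

    // ClosureDemo.java -- combines the two snippets above into one runnable class.
    public class ClosureDemo {
        public static void main(String[] args) {
            // Attempt 1 -- rejected by the compiler: a local captured by an
            // anonymous inner class must be final, and a final x can't be reassigned.
            //
            // String x = "a";
            // (new Runnable() { public void run() { x = "b"; } }).run();
            // System.out.println(x);

            // Attempt 2 -- the workaround: the array object lives on the heap, so
            // only the reference x must be final; the cell it holds can still change.
            final String[] x = new String[] { "a" };
            (new Runnable() { public void run() { x[0] = "b"; } }).run();
            System.out.println(x[0]); // prints "b"
        }
    }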

  • Note: This reply got really off-topic as I wrote it, but I still think it is an interesting train of thought. It got me thinking about languages, at least. So it may be a little rambling... I apologize in advance for any hard-to-understand prose.

    "Really cool work", can be done in any language, and the proliferation of languages shows that there's many solutions to the same problem.

    That's generally true except for some niche markets. There are still a few things which should be programmed in assembly or a lower-level language like C. Problems with extreme memory or speed requirements are often only solvable in languages that were designed back when most computers had memory and speed on the order of those extreme specs.

    when the language design is flawed, only deep education of the masses (as in, don't do this, you'll regret it) can save the language.

    I don't think most languages - C++ in particular - are flawed. Rather, they represent different tradeoffs between ease of writing small programs, ease of writing large programs, performance, capabilities, and extensibility.

    For instance, Java and Python provide large built-in libraries to make programs less complex. Those two languages are also generally implemented via virtual machines. VMs provide a middle ground between slow-but-safe (and highly portable) interpreted languages like BASIC, ECMAScript, or Haskell and the traditional fast-but-dangerous compiled languages such as C/C++. Also, C++ was originally implemented as a translator (Cfront) that emitted C - wasn't it? That's extreme extensibility. However, you don't want to use that kind of extensibility in individual projects: it only adds complexity and thus a greater chance of bugs. Such features are generally useful for niche experiments, debugging, or sharing code among many projects. As a movie said (or something like it), with great extensibility comes great responsibility. :-}

    There are many more examples. Special-purpose languages like XSLT do one thing really well (in this case, transform data) and suck at almost everything else. (The game of Life can be implemented in XSLT, but why? :-} ) Some languages are easier to use in some professions than others - e.g. Lisp and Haskell in Comp Sci; Matlab and Mathematica in Engineering (although everyone really hates them); I once saw a psychology experiment written in Pascal (why? I don't know!). The choice of a programming language is all about design tradeoffs. The requirements of the problem dictate which class of languages is best suited - that doesn't make one language right and another wrong.

  • by MythMoth ( 73648 ) on Sunday March 21, 2004 @07:36AM (#8626557) Homepage
    I think you're overstating your case.

    In pure line count, a minimal Java Hello World would only have one additional line (see the sketch at the end of this comment).

    It is crammed with keywords, and it contains the notion of Objects and Classes.

    But I see that as a good thing - you can concentrate on the mainline code and introduce the student to control flow and so forth, but when you come to the concepts of classes you've got a nice immediate example to point to.

    It's so much easier to teach a language when a common reaction to new information is "oh, I wondered what that was for" rather than "why would I ever need that?"

    Finally, I have to completely disagree with you about type safety. A perfectly written and comprehended program does not need type safety. A real-world program will never be either, and type safety will prevent some of the nastiest bugs from occurring and keep your data intact.

    I have no problem with C in its place, but its place is not as a learning language or as a business language.

    D.
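
    For reference, here is the minimal Java Hello World mentioned above, as a sketch (assuming, as the rest of this comment suggests, that C is the baseline being counted against; exact line counts depend on brace style):

    // Hello.java -- the class wrapper is roughly the one extra line relative to a
    // C-style main(), and every keyword in it (public, class, static, void) later
    // becomes the "oh, I wondered what that was for" teaching hook.
    public class Hello {
        public static void main(String[] args) {
            System.out.println("Hello, world");
        }
    }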
  • by dustmite ( 667870 ) on Sunday March 21, 2004 @11:25AM (#8627266)

    Hungarian Notation is not a "Microsoft thing". If you've ever bothered to read the original Charles Simonyi paper, it has nothing to do with Windows, and the original seemed more tailored to, e.g., physics concepts than to window and process handles. Microsoft later adopted an adapted version of it for their own coding. It's not a "Microsoft thing" or a "Windows thing"; sheez, if I had a cent for every time someone repeated that bit of misinformation...

  • by john.r.strohm ( 586791 ) on Sunday March 21, 2004 @11:31AM (#8627289)
    Hungarian notation was originally developed as a band-aid (tm) for the near-complete lack of type checking in C. When all ints are created equal, and may be freely assigned to each other, and pointers must routinely be type-coerced to something else, and the compiler refuses to help the programmer keep things straight, something like Hungarian notation becomes necessary.

    Hungarian notation declined after Stroustrup added strong typing to C++. It is worth noting that Stroustrup never even considered NOT doing strong typing in C++. (Read his book on the design and evolution of C++.) Distaste on the part of hard-line C programmers for strong typing also declined, after C++ forced them to eat their broccoli, and they discovered it actually tasted pretty good (by catching errors that would otherwise not have been found nearly as easily).

    It is also worth noting that Hungarian notation never caught on in any language other than C. In particular, Ada programmers never bothered with it: the Ada type system was rich enough to do everything that Hungarian notation pretended to do, and to enforce it, by requiring compilers to refuse to compile type-incorrect programs. (A small illustration of this point, in Java, follows at the end of this comment.)

    (Somewhere, recently, I saw something about a commercial satellite project that was forced to use Ada, because there was no C/C++ compiler available for a recently-declassified microprocessor. Their programmers screamed bloody murder at the idea. The screams stopped when they noticed that the Ada compiler was catching errors for them.)
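
    To make that concrete, here is a hypothetical sketch (in Java, the language used elsewhere in this thread, since strong typing is the point rather than Ada syntax) of a type system enforcing what a Hungarian prefix can only suggest:

    // TypedHandles.java -- hypothetical illustration: distinct types let the compiler
    // enforce what a Hungarian prefix (hWindow, hFile) can only hint at.
    public class TypedHandles {
        static final class WindowHandle {
            final long value;
            WindowHandle(long value) { this.value = value; }
        }

        static final class FileHandle {
            final long value;
            FileHandle(long value) { this.value = value; }
        }

        static void closeWindow(WindowHandle h) {
            System.out.println("closing window " + h.value);
        }

        public static void main(String[] args) {
            WindowHandle hWindow = new WindowHandle(42);
            FileHandle hFile = new FileHandle(7);

            closeWindow(hWindow);   // fine
            // closeWindow(hFile);  // would not compile: incompatible types.
            // In C, both handles might be plain ints; only the prefix would warn
            // the reader, and the compiler would happily accept the mix-up.
        }
    }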
  • Re:ADA (Score:2, Informative)

    by Bush Pig ( 175019 ) on Sunday March 21, 2004 @10:04PM (#8630519)
    The reason we do this is largely because Ada compilers tend to be expensive, buggy, or huge (sometimes all of these). That said, Ada is a really nice language to program in. It was the primary teaching language at Adelaide University when I did my degree, and certainly helped force good programming habits. Although it's possible to write horrible code in Ada, it requires a particularly dedicated kind of stupidity, and the nice thing is that if you get a clean compile, you can be pretty sure the program will do something sensible (though not necessarily what you intended).

  • Re:ADA (Score:1, Informative)

    by Anonymous Coward on Tuesday March 23, 2004 @12:44PM (#8645879)
    The reason we do this is largely because Ada compilers tend to be expensive, buggy or huge

    Wrong [act-europe.fr], wrong [gnuada.org], and wrong [gnat.com].

    GNAT [gnat.com], the GNU Ada compiler, is Free (as in speech and beer), commercially supported, and has been integrated into mainstream GCC development. Get it. Use it. Love it.

    Users of Debian can simply "apt-get install gnat" (and also think about getting gnat-gps, gnat-doc, ada-mode, and ada-reference). Other distros probably have similar packages; anyone else can check GNUAda.org [gnuada.org], which has packages for Linux, NetBSD, DOS, and OS/2.

    I studied CS at NYU and took a programming languages class with Robert Dewar (main author of GNAT and president of AdaCore, the company behind GNAT, among other things (SPITBOL, anyone?)). One of the best classes I took in college.
