Encryption Security Programming IT Technology

Colossus Cipher Challenge Winner On Ada

An anonymous reader writes "Colossus Cipher Challenge winner Joachim Schueth talks about why he settled on Ada as his language of choice to unravel a code transmitted from the Heinz Nixdorf Museum in Germany, from a Lorenz SZ42 cipher machine (used by the German High Command to relay secret messages during World War II). 'Ada allowed me to concisely express the algorithms I wanted to implement.'"
  • He should have used a real programming language like Java or VB.Net.
    • by morgan_greywolf ( 835522 ) * on Thursday May 15, 2008 @09:59AM (#23417424) Homepage Journal

      He should have used a real programming language like Java or VB.Net.
      Pffft. Real men write programs like this:

      $ cat >/bin/myprogram
      01010111 01101001 01101101 01110000 00100000 01110101 01110011 01101001 01101110 01100111 00100000 01000001 01010011 01000011 01001001 01001001
        • Re: (Score:2, Informative)

          by 91degrees ( 207121 )
          But you're using an ASCII representation of a binary representation of an ASCII string!

          You're using ASCII twice so you're twice the wimp!
      • Re: (Score:3, Funny)

        Comment removed based on user account deletion
        • what about the real programmers who set the universal constants at the start of the universe? =P
          • Yeah, but how much choice do they really have in picking those?

            --
            I made myself sad just now
          • Re: (Score:2, Funny)

            by Hasmanean ( 814562 )
            Real men don't use constants. They express all quantities in their natural units, (such as the size of the known universe, or the diameter of a hydrogen atom). Needless to say, they do not require constants to convert the answers to meter-kilogram-second units.

            In fact better men than these do not even use equations. They let the laws of physics fall out of their simulations, as evanescent perennial relationships between variables.
        • You may mock, but I wrote my first program with the pointy part of a compass.
    • by jellomizer ( 103300 ) on Thursday May 15, 2008 @10:01AM (#23417434)
      For the most part the language doesn't matter that much. ADA, C, C++, PASCAL, BASIC, LISP... Almost every language can get the job done. It is just a matter of how well it handles different details. I like Python for its list processing and top-down design. Some people like Visual Basic for its ease in creating good interfaces. Some people like C and C++ because they can control the system at a lower level.
      ADA being a government/military-based language, I am not too surprised that it won the competition deciphering a government/military code (it is more complex than that).
      • For the most part the language doesn't matter that much. ADA, C, C++, PASCAL, BASIC, LISP...


        The *language* doesn't matter so much as the *particular implementation* of that language and the platform(s) on which it runs and the libraries available.

        C is a fine language, but don't try writing an OS kernel using the Ch [softintegration.com] C interpreter, for instance.
      • I still think they should have gone with DOS-based batch files ... or Monad/Windows PowerShell :)
      • by oni ( 41625 )
        Some people like Visual Basic for its ease in creating good interfaces.

        Perhaps OT: but I think people like VB because they don't know any other language.

        I have never in my life heard anyone say, "I know C, Java, Ruby and VB and I really like VB!" More often it's, "I was working as an office assistant and wanted a promotion so I got a book titled, _Fast-Track Learn VB in 10 Hours for Dummies_ and this is the only language I've ever used - and I like it!"

        Other than that, you're right. The language matters l
      • Re: (Score:3, Insightful)

        I only found out about the contest a couple of days before it began, and I was away at the time, so I couldn't participate in "real time", but I used the copies of the sent ciphertexts on the Bletchley website.

        I worked away on a Lorenz breaker for the SZ42 stuff, written in C. I was able to break ciphertext roughly an order of magnitude faster than Joachim's code. Joachim worked away on his code for several weeks in advance of the contest. I had only a couple of days' notice.

        I think two things matter in a competi
        • Re: (Score:3, Informative)

          Joachim also chose to work a much harder problem than you did.

          You worked with the symbolic cyphertexts. He worked with raw baseband audio from the radio receiver, complete with noise.

          You knew this, of course, because you RTFA.
    • C:\>copy con c:\decrypt.exe
    • It matters not which language is used, just as long as it is edited with vi.
      • No. ED is the standard editor! You don't ask for a viitor or an emacsitor, you ask for an editor! Real programmers use ed. Vi (or, god forbid, vim) is for quiche eaters!

    • Re: (Score:3, Funny)

      He should have used Brainfuck [wikipedia.org]!
  • by Anonymous Coward on Thursday May 15, 2008 @10:04AM (#23417488)
    I wonder how easy it would be to break the Allies' corresponding machine, the SIGABA (http://www.quadibloc.com/crypto/ro0205.htm). It was stated that during WWII, the Lorenz machine was broken, but the SIGABA wasn't. Of course, given 60 years of computer improvements, it might be possible to break the SIGABA now.
  • by Anonymous Coward
    It was at the University of Dayton in the late 90s. ADA was the language they taught all their intro computer science classes in. They then switched to C++. I didn't like ADA, but looking back that may have been my own prejudices more than anything wrong with the language. Every computer class I had in high school used a different programming language. I was getting sick of learning new languages when I wanted to be advancing in my computer skills.

    Brian
    • Re: (Score:3, Funny)

      by kellyb9 ( 954229 )
      Actually most colleges don't want to be typecast as a "C++ school" or an "ADA school". It's more important to learn data structures and theory. If you went to a good school, the language something is written in is trivial; learning the syntax should not be that difficult.
  • Type Casting (Score:3, Interesting)

    by hroo772 ( 900089 ) on Thursday May 15, 2008 @10:06AM (#23417522)
    For those that know the differences in Ada, it's a very strongly typed language, which makes it harder for a beginning programmer to pick up. It doesn't allow for type conversion and pretty much enforces strict coding rules. It would make sense that he used this since he would have complete control over what his code did exactly. This wouldn't be the case with Java or other languages which allow type conversions easily, which is nice for a lot of people, but can definitely lead to issues when not accounted for.
    • Re: (Score:2, Insightful)

      by egilhh ( 689523 )
      What makes you think that Ada does not allow type conversion?
      "typename(variable)" is pretty easy in my opinion...
      (not much harder than the type cast in other languages: "(typename)variable")

      Ada even has a package called Ada.Unchecked_Conversion if you don't care about ensuring the result is within the bounds of the target type...
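      For illustration, a minimal sketch of both styles (the type and variable names here are made up): the checked form is typename(variable), while Unchecked_Conversion is instantiated as an ordinary function:

          with Ada.Unchecked_Conversion;
          with Interfaces; use Interfaces;

          procedure Conversion_Demo is
             type Celsius    is new Float;
             type Fahrenheit is new Float;

             --  Reinterprets the bits of a Float as an Unsigned_32.
             --  Only sensible if the two types have the same size.
             function Float_Bits is new Ada.Unchecked_Conversion
               (Source => Float, Target => Unsigned_32);

             C : constant Celsius := 100.0;
             F : Fahrenheit;
             U : Unsigned_32;
          begin
             --  F := C;                        -- illegal: no implicit conversion
             F := Fahrenheit (C) * 1.8 + 32.0;  -- explicit checked conversion
             U := Float_Bits (1.0);             -- unchecked bit reinterpretation
          end Conversion_Demo;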
    • Re: (Score:3, Informative)

      by Zoxed ( 676559 )
      > It doesn't allow for type conversion

      It does (unchecked_conversion), but never (AFAIK) *implicitly*.
      • Re: (Score:3, Informative)

        by kst ( 168867 )
        >> It doesn't allow for type conversion

        > It does (unchecked_conversion), but never (AFAIK) *implicitly*.

        Unchecked_Conversion reinterprets the bits of the argument as a value of the specified type.

        Ada also allows ordinary value conversions (for example, converting 3.1 to type Integer yields 3) among sufficiently closely related types; for example, a value of any numeric type can be converted to any other numeric type. It requires such conversions to be explicit in more cases than many other languag
        • by drxenos ( 573895 )
          Unchecked_Conversion reinterprets the bits of the argument as a value of the specified type.

          Maybe, maybe not. There are several rules that define what it does, and many that are implementation-specific.
        • It should be called FrackYou_Conversion. On our compiler it "converted" an 8 bit value to a 16 bit value by just grabbing the next byte.

          So you could end up with 256 different conversions depending on where in memory your code is located.
    • Re: (Score:2, Insightful)

      languages which allow type conversions easily, which is nice for alot of people, but can definitely lead to issues when not accounted for.
      This is, IMHO, both Python's greatest strength and its greatest weakness as a dynamically-typed language. Sometimes you can get bizarre results if you're not careful. OTOH, once you get the hang of it, you won't make those mistakes.
    • by MosesJones ( 55544 ) on Thursday May 15, 2008 @11:05AM (#23418236) Homepage
      As someone whose first programming language was Ada, and who knows of several universities around the same time who chose Ada as a teaching language, I can say with certainty that you are completely wrong.

      First off, those strict rules help you because you spend miles less time debugging stuff you don't understand; once it compiles it will tend to run, and the compiler gives helpful messages about what you are doing wrong (often including suggestions on how to fix it). With Java, and especially C and C++, let alone scripting languages, the beginner spends much more time debugging non-operational code than writing the code in the first place. This tends to mean that these people focus on "getting an executable" rather than "getting it running".

      Ada is a brilliant language to teach newbies in (again I've personally done this) as you can explain the abstract concepts and then have the compiler make sure they are doing it right rather than have them say "it compiles but it keeps falling over, why?".
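      To make that concrete, here is a tiny made-up example of the sort of mistake the compiler rejects outright instead of letting the newbie debug it at runtime: mixing two distinct numeric types requires an explicit conversion:

          procedure Units_Demo is
             type Meters  is new Float;
             type Seconds is new Float;
             type Meters_Per_Second is new Float;

             Distance : Meters  := 100.0;
             Time     : Seconds := 9.58;
             Speed    : Meters_Per_Second;
          begin
             --  Speed := Distance / Time;   -- compile error: Meters / Seconds is undefined
             Speed := Meters_Per_Second (Float (Distance) / Float (Time));
          end Units_Demo;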

      Ada's issues are due to the mentality of lots of (IMO) unprofessional engineers who focus on the number of characters over the operational viability of a system.

      And for a final point. Take a look at the complex code the guy wrote, if that was in Java, C, C++, Scala, Ruby, Perl, LISP or what ever do you think that you'd have a chance of understanding it?
      • if that was in Java, C, C++, Scala, Ruby, Perl, LISP or what ever do you think that you'd have a chance of understanding it?

        Why not? Or does Ada magically make complex algorithms more comprehensible?

        Sorry, but a competent programmer should be able to build a clean, comprehensible solution in any of those languages. Time to a working solution may vary, but there's no excuse for one solution to be less understandable than another.
        • Why not? Or does Ada magically make complex algorithms more comprehensible?

          ds ths rd ok? it cn b dffclt 2 ndrstnd a trs lngg

          or

          Does this read ok? It can be difficult to understand a terse language.

          That is the difference between Ada and the likes of C and Java. Sure a good coder could build it but will it be as comprehensible? No it won't.
          • Wow! You've presented a useless argument backed by nothing, and then claimed victory! Congratulations! My mind truly boggles at your boundless intellect.

            Honestly, I'd counter your point... except you didn't even make one.

            And to argue that, for example, Java is terse (probably the *last* thing people would accuse it of), let alone any of those other languages (which are only as terse as the developer who uses them) clearly demonstrates you probably have no idea what you're talking about.
            • Java is terse in comparison to languages like Pascal, Ada, Eiffel, etc. It might not be terse in comparison with Perl or Brainfuck, but that isn't exactly a bar that people should want to go under.

              C-syntax languages are all terse: they have ternary operators, ++, += and all those character-saving devices, but not comprehension-saving devices.

              Now I'm sure you've got another witty response that shows how Java is a massive verbose language in comparison with Pascal, Ada or Eiffel... I'd love to see it to help me im
      • First off those strict rules help you because you spend miles less time debugging stuff you don't understand

        As I like to put it, "Ada makes you say what you mean and mean what you say."

        That is indeed a stumbling block to most newbies, but if you bite the bullet and go through with it, it soon becomes second nature to think that way. And you find yourself dealing with algorithms and abstractions, because once you've built a component you can forget the details and expect it to work right.

        Too bad so many programmers absolutely detest saying what they mean and meaning what they say.
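        A small made-up sketch of what "forgetting the details" looks like: the package spec below is the whole contract a caller ever sees, and the body behind it can change freely:

            package Stacks is
               type Stack is private;
               procedure Push (S : in out Stack; Item : in Integer);
               procedure Pop  (S : in out Stack; Item : out Integer);
               function  Is_Empty (S : Stack) return Boolean;
            private
               type Int_Array is array (1 .. 100) of Integer;
               type Stack is record
                  Data : Int_Array;
                  Top  : Natural := 0;
               end record;
            end Stacks;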

    • by drxenos ( 573895 )
      What the hell are you talking about? Ada most certainly does allow for type conversions. It just doesn't allow potentially unsafe ones implicitly.
    • If you really want protection from typecasting then UTM(*) is the language you want to use!

      (*)Universal Turing Machine
  • by DrWho520 ( 655973 ) on Thursday May 15, 2008 @10:11AM (#23417568) Journal
    Use a masochistic language to break a German code...groovy.
  • ADA Resurgence? (Score:5, Interesting)

    by Arakageeta ( 671142 ) on Thursday May 15, 2008 @10:13AM (#23417604)
    Has anyone else started to notice an ADA resurgence? I feel like several years ago the general feeling was "ADA is a backwards language used only on old military projects." Now I read a positive story about ADA every few weeks! Was ADA 2005 that good of a language revision?
    • by Skeptical1 ( 823232 ) on Thursday May 15, 2008 @10:47AM (#23418022)
      Ada is not a backward language. Ada is a palindrome.
      • by kst ( 168867 )
        Ada isn't just a palindrome. It's a hexadecimal palindrome. How many other languages can make that claim?

        (Well, six that I can think of: B, C, D, E, F, and my own 99 [99-bottles-of-beer.net].)
    • I play with programming on both the PC and Mac (at work I am on a mini and there is no ADA there at all) so I am curious...

      Which are the good compilers for ADA for Mac and PC? As well as being good, what are the relative costs?

      Finally, which sites do ADA supporters consider best?
      • Re: (Score:3, Informative)

        by glop ( 181086 )
        Hi,

        The GNAT is based on GCC. It's free and it is damn good.
        I was also using AONIX and they have a free (as in beer) version. I have always preferred GNAT though.

        I am not sure about a website though...

      • by DdJ ( 10790 ) on Thursday May 15, 2008 @11:16AM (#23418358) Homepage Journal
        Can't give you advice on the PC, but on the Mac, the default compiler is the GNU compiler suite. That's where the C, C++, and Objective-C compilers come from.

        The GNU compiler suite also has an ADA compiler (GNAT, GNU Ada Translator). Should be possible to get it and plug it in without much trouble, and then it'd integrate with everything else. Heck, should be possible to include ADA modules into an Objective-C Cocoa application, even.

        There is also a GNU FORTRAN, worth checking out. Even today, you can't do mathematics as efficiently in C as you can in FORTRAN. (This is because of the language; in Fortran, taking the address of an existing variable isn't normal, so variables don't end up with the possibility of "aliases" that they don't know about, which means a lot more stuff can safely be done all in registers and stuff like that.)

        There is also a GNU Pascal, but unlike ADA and FORTRAN, I'm not personally aware of any reason to actually use it.
    • Re: (Score:3, Interesting)

      by Anonymous Coward
      Ada was considered too complex. By now C++ is orders of magnitude more complex and still does not do half the things (Ada has had a sane integrated threading model that could be used for message passing constructs, namely actual OO programming techniques, pretty much from the start).

      C++ templates, for example, are just a ripoff of Ada's generics _including_ the Ada angle bracket constraint notation which does not fit at all into C.

      Basically it is like the Unix renaissance after Windows tried to offer ever
    • Re:ADA Resurgence? (Score:5, Informative)

      by Barromind ( 783894 ) on Thursday May 15, 2008 @11:52AM (#23418804)
      Ada 2005 is comparatively minor (although some changes, like interfaces, are not that minor). The real improvement was Ada 95. The 95 revision managed to standardize many things that C++/Java are only now settling.

      Ada is not trendy, but it has had built-in portable concurrency and many other killer features for more than a decade. Proper specifications are one of my favs.

      Of course there are other factors, like the lack of good and free compilers. Fortunately now the gcc toolchain has put this to rest. Also there are few libraries. Really few. Binding to C is easy, but still a deterrent for the hobbyist.

      Its emphasis on making maintenance easy over quick programming really pays in the end, not even in the middle/long term but shortly after getting familiar with the language. I find myself much more productive. When something compiles, I'm sure that the only bugs remaining are logical, not some funny pointer or unexpected type conversion or overflow. Nowadays I rarely fire up the debugger more than once a month. My C/C++ has improved because Ada forbids the things that are considered bad practices in C/C++, but that you still end up doing because "you know better".

      I think that Ada is getting now more exposure because, albeit a niche language, Adacore is pushing hard behind it. Also, its SPARK derivative by Praxis has made some headlines with large and difficult projects getting flying marks. SPARK has made static analysis a reality for large projects.

      I'd say that anyone capable of discipline will enjoy the benefits of Ada. It's not the thing for quick hacking, but it is perfect for anything not trivial. Software engineers should love it. I have heard somewhere that it is a safe C++, and I concur: feature-wise it is more or less on par, it catches bugs sooner and prevents many typical ones.

      Have I already said that concurrency is built-in and portable :P? And that inter-thread communication is really well done?
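      To give a rough idea of that built-in concurrency (the names here are invented): a task type with an entry gives you typed, synchronous message passing, no external threading library required:

          procedure Rendezvous_Demo is
             task type Worker is
                entry Process (Item : in Integer);
             end Worker;

             task body Worker is
                Value : Integer;
             begin
                accept Process (Item : in Integer) do
                   Value := Item;        --  data handed over during the rendezvous
                end Process;
                --  ...continue working on Value concurrently with the caller...
             end Worker;

             W : Worker;                 --  the task starts running when W is elaborated
          begin
             W.Process (42);             --  caller blocks until the entry is accepted
          end Rendezvous_Demo;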
      • Heh, sounds a lot like Fortran, for instance with the built-in concurrency. The real goodies came with the F90 standard, while GCC has enabled them only fairly recently. And people associate the name with something old and clunky :-/
        • by samkass ( 174571 )
          I thought the same thing except that it sounded like Java to me. Java's definitely a language that's targeted more towards folks willing to think than just sit and hack, and is far more introspect-able, thus moving significantly more errors to code/analysis time rather than runtime. (And it has concurrency built in to the language and standard libraries.)

          I remember the days when some folks thought that learning BASIC was a stain on a developer's psyche that took years to heal. Now I feel the same way abou
    • It is good to program in a language designed for the task at hand. Ada was designed to control "things that can't fail" like aircraft flight controls and nuclear power plants, guidance systems of "smart bombs" and so on. Most people don't work in this area. Most people write stuff that runs on desktop machines and web servers. In that environment software error is just expected and tolerated. So Ada will be a minority language.

      Ada will always be used more by projects where the cost of software error is
        Ada was designed to control "things that can't fail" like aircraft flight controls and nuclear power plants, guidance systems of "smart bombs" and so on. Most people don't work in this area. Most people write stuff that runs on desktop machines and web servers. In that environment software error is just expected and tolerated.

        And that attitude is exactly why the world is filled with crapware.

        We should expect software to work correctly, and take our business elsewhere when it doesn't.

        No one dies if your kitchen faucet squirts the water out sideways, but we don't just shrug it off and live with it.

    • I think the PR spike may indicate Microsoft is behind an astro-turf campaign to raise awareness of Ada in support of their unannounced but soon to beta Visual Ada Plus Professional Platinum Edition 2008 for the Web.
    • Has anyone else started to notice an ADA resurgence?

      s/ADA/Ada/ -- It's a proper name rather than an acronym. (It refers to Ada, Lady Lovelace, Charles Babbage's assistant.)

      It seems to be popular in France, presumably because the guy who invented it was French. (No, it wasn't designed by a committee. It was selected via a competition for a language to satisfy a rich set of requirements, with special attention to support for embedded programming.)

      For those who aren't familiar with it, it's very similar to Pascal, but has a lot more features. Things like mul

  • Compiler price.. (Score:5, Interesting)

    by renoX ( 11677 ) on Thursday May 15, 2008 @10:25AM (#23417744)
    I think that the main reason why Ada has 'lost' to C++ is that some time ago, C++ compilers were either cheap or free whereas Ada compilers were expensive.

    Too bad, since Ada is 'by default' a language which is more secure than C++.
  • http://www.schlaupelz.de/SZ42/SZ42_software.html
  • hmm. (Score:4, Interesting)

    by apodyopsis ( 1048476 ) on Thursday May 15, 2008 @10:49AM (#23418042)
    I often wondered at the time if this was a fair test.

    I mean the German fellow was near the transmitting station and got a very good signal and started right away.

    Bletchley Park, on the other hand, because of the atmospheric conditions, did not get a signal until late in the day and started late. On the other hand the German SW took only 46 seconds.

    I'm not saying that the German fellow should not have won, he did, fair and square - but there seemed to be no mention in much of the news at the time of the receiver issues.

    On the plus side, it was excellent publicity for the park and Colossus. If only Churchill had not ordered them scrapped, then Britain could have led the technological era.

    • by Anonymous Coward
      He (and his successor, Attlee) kept it classified. Then, during decolonization, they gave lots of captured Enigma machines to former colonies to allow them to keep their communications secure -- and allow the former colonial power to keep an eye on things :)
  • Concise??!! (Score:3, Insightful)

    by Lodragandraoidh ( 639696 ) on Thursday May 15, 2008 @10:53AM (#23418084) Journal
    I can't imagine using the words concise and Ada in the same sentence.

    Constricted - maybe. Painful - most certainly.
    • by hey! ( 33014 ) on Thursday May 15, 2008 @11:16AM (#23418362) Homepage Journal

      I can't imagine using the words concise and Ada in the same sentence.


      Perhaps you should read what you just wrote.
    • The choice was obvious! What better way to solve a cypher contest than to code in a language that is pretty much a cypher to everyone else?
    • Re: Concise??!! (Score:3, Insightful)

      by Black Parrot ( 19622 )

      I can't imagine using the words concise and Ada in the same sentence.

      Not concise in spelling out the details of some low-level module, but very concise for higher-level programming, because it made you be precise when implementing the components.

      Once you've worked in an area for several years you end up with a good collection of libraries with clean interfaces, and you find yourself throwing very complex programs together with simple code that "just works", rather than the spaghettied jazzturbation that you usually see when people use other languages.

      No reason you can't wr

  • by _|()|\| ( 159991 ) on Thursday May 15, 2008 @10:56AM (#23418122)

    Like the author of the article, I have a tendency to dabble with a variety of programming languages. I haven't used Ada seriously, but I am intrigued by it, especially in contrast to the looser languages that are currently popular. A lot of bytes have been spilled on the topic of static and dynamic typing, bondage & discipline vs. unit testing, etc. While these discussions often devolve to religious wars, I do think that language matters. Never mind Sapir-Whorf or Turing, some languages are simply more or less pleasurable or powerful for certain tasks.

    That said, often the language itself is not the dominant factor in choosing the language. As nice as (Ada | Erlang | Haskell | Lisp | Ruby) is, it's not going to be my first choice if another language has a readily available library that will make it easier to write the program. I can write web applications in Lisp, but I probably won't. There is probably a parser generator for Ada, but I'd rather use Flex and Bison, or maybe ANTLR. And when it comes to my first choice, independent of problem domain, I'll usually pick Python, in part because of its extensive library.

    • by hey! ( 33014 ) on Thursday May 15, 2008 @12:11PM (#23419108) Homepage Journal
      Well, libraries. That's a huge part of language choices these days; you really choose frameworks or libraries and live with the language as a consequence. A lot of what we do these days is glue stuff together.

      This problem, however, is a completely different kind of programming. It's old school stuff: building everything you need yourself to run on really slow hardware. And hardware is always slow relative to crypto problems. Ever try to implement RSA encryption from scratch? I have. There's a reason the public key stuff is only used for key exchange.
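      (For a sense of why: the core RSA operation is modular exponentiation by square-and-multiply. A toy sketch using a 64-bit modular type purely for illustration - a real implementation needs multi-precision arithmetic, and this version is only correct for moduli small enough that the intermediate products don't wrap:)

          with Interfaces; use Interfaces;

          function Mod_Exp (Base, Exponent, Modulus : Unsigned_64) return Unsigned_64 is
             Result : Unsigned_64 := 1;
             B      : Unsigned_64 := Base mod Modulus;
             E      : Unsigned_64 := Exponent;
          begin
             while E > 0 loop
                if (E and 1) = 1 then
                   Result := (Result * B) mod Modulus;   --  multiply step
                end if;
                B := (B * B) mod Modulus;                --  square step
                E := Shift_Right (E, 1);                 --  shift to the next exponent bit
             end loop;
             return Result;
          end Mod_Exp;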

      I think the usefulness of Ada on this kind of problem is related to the issue of testing being costly. When I started in this business, compiling and linking a two hundred line program took about fifteen minutes. Something like unit testing would have been utterly impractical. So a strictly typed language was for nearly everyone a good idea.

      Over the last couple of years, I've been trying my hand at a number of difficult algorithmic problems. This is not the stuff that 99% of the programmers in the world do professionally, including me.

      Working on these problems was like programming was in the old days. Not only was it just you and the problem with no frameworks to come between, every output becomes a milestone when it takes a program days to generate. It also means that the style of programming is different. You don't worry so much about language restrictions introducing frictional losses into the code/test/recode cycle. You do worry more about mistakes that make it past the compiler.

      Ada's philosophy is that coding should be, if not exactly slower, certainly more deliberate. If you are running something for which your hardware is monumentally slow, then this is a good style to work in.
    • "when it comes to my first choice, independent of problem domain,"

      Notice the choice of words here. Nothing wrong with that, but it implies a small one-person project. Maybe even less than that: a one-person, part-time project.

      But what if there were 250 software engineers working full time over three to five years? This is what Ada was designed for, large scale software. Very few companies can do this kind of work. Mostly you are looking at the big aerospace companies, like Northrop, Lockheed and Boeing. The major
  • Horses for courses (Score:5, Interesting)

    by Viol8 ( 599362 ) on Thursday May 15, 2008 @11:42AM (#23418686) Homepage
    It's not really surprising that he found ADA nicer to use than C for this sort of project, because it's not the sort of thing C was created for. People seem to think that C was designed as an all-purpose programming language - it wasn't. It was specifically designed as a systems programming language that could substitute for assembler 99% of the time. Its abilities lie in low-level manipulation of memory and I/O, not in high-level mathematical algorithms (though obviously it can do this too).

    Then of course C++ came along, which wanted to have its cake and eat it, and the end result was a nasty mishmash of low and high level constructs which is difficult to learn, unintuitive and generally messy to use.
    • I don't want to start a philosophical battle here, but I would appreciate it if you could give me a pointer to a reference explaining what features of C make it suitable for "low-level manipulation of memory and I/O"?

      I've always found it to be sub-optimal due to its lack of a "bit" data type, the need to explicitly set pointers to address specific regions in memory (which may or may not be in the same address space as I/O - i.e. the x86 architecture), etc. To get around these issues, access func
      • by Z34107 ( 925136 )

        Maybe I misunderstand your question, but I dabble in C, so let me formulate a response.

        In your second paragraph, you mention that you "need to explicitly set pointers to address specific regions in memory (which may or may not be in the same address space as I/O - i.e. the x86 architecture)." I assume you're talking about paging, swapping, segmentation, and the like.

        The OS determines address space. And, if you're trying to program this kind of memory management from within a multitasking operating system

  • HDL (Score:2, Interesting)

    by Anonymous Coward
    I've used both Verilog (C based) and VHDL (ADA based) and the latter wins hands down for being maintainable and easy to debug. Nobody had to write a LINT checker for VHDL like they did for Verilog. I totally believe this guy.
  • Chipping the code into a specific shape by hand... Give it a few years and software development will be more like civil engineering. Pouring concrete into shapes which have known specifications.

     
    • Chipping the code into a specific shape by hand... Give it a few years and software development will be more like civil engineering. Pouring concrete into shapes which have known specifications.

      Yep, Real Soon Now, just like we've been telling ourselves for the past quarter of a century.

      And by the time we get to that point, all the coding will be done by code generators that take the specs as input.

      Too bad none of us will ever meet a customer who actually knows what he wants until after you deliver the program...

  • truth never wins -- its opponents just go extinct

    Yes, and the people who promote Ada as a secure and productive programming language have almost died out.

    Ada is neither, and fortunately, the market has realized that.
  • I wonder ... (Score:3, Interesting)

    by kst ( 168867 ) on Thursday May 15, 2008 @12:31PM (#23419414)
    I can't help wondering how many of the people making snide comments about Ada (note: not ADA; it's not an acronym) have actually used it.
  • Obvious.. (Score:2, Informative)

    by jovius ( 974690 )
    He obviously settled on Ada, because Ada allowed him to implement.

"What man has done, man can aspire to do." -- Jerry Pournelle, about space flight

Working...