
Edsger Wybe Dijkstra: 1930-2002 409

Order writes "Edsger Wybe Dijkstra, one of the founding fathers of computer science and the author of the famous "Go To Considered Harmful", has died on Aug. 6, 2002 after a long struggle with cancer."
  • by Sebastopol ( 189276 ) on Wednesday August 07, 2002 @07:57PM (#4029495) Homepage

    After reading this article, I think we all need to pause for a minute, and consider the insight of this simple observation.

    Add to the list his observation about what human minds are geared to master: static relationships. It's perfectly in line with Dawkins' statement that human minds are designed to comprehend things roughly human-sized, moving at roughly human speeds.

    I keep forgetting how long people have been programming. Think about how many people using GOTO there were back in 1968. Probably only a few thousand. Crazy.

  • Final GOTO (Score:2, Funny)

    by Anonymous Coward
    GOTO Heaven
    • by Turing Machine ( 144300 ) on Wednesday August 07, 2002 @08:03PM (#4029539)
      I'll bet he gets there by the shortest path.
    • by RPoet ( 20693 )
      Gosub without return.

      RIP.
    • while (!nirvana())
      {
          die();
          reincarnate();
      }
    • GOTO Heaven

      The problem with that is, as Dr. Dijkstra wrote:

      "The unbridled use of the go to statement has an immediate consequence that it becomes terribly hard to find a meaningful set of coordinates in which to describe the process progress."

      Looks like we won't be mixing humorous programming references and theology after all...

  • I remember when Tulsa U. brought in Dr. Dijkstra back in the spring of 1984. He spoke at length about software design principles, and how design was the lynchpin of good systems. He was there only for a day, and had insisted on taking time out to talk to anyone interested in hearing him. I'm very glad that TU invited the Computer Science students from ORU over to hear him.

    The Computer Science profession has lost another giant.
    • As an aside, I remember this little snippet from the seminar:

      Tulsa University had shuffled around some classrooms to free up a large conference room/lecture hall. For the students who were supposed to attend a class in there that day, the administration had put a notice on the board: "Class ???? - Goto 426" (or something like that). Dr. Dijkstra had come in from the back of the room, was introduced, and started speaking - he never looked at the chalkboard the whole time. When it came time for questions, one student (not me) asked him about the notice at the back of the room. Dr. Dijkstra turned about, cleaned off the board, and said something about structuring the overall conversation, and how that notice violated good system design.
    • I took a course he taught at UT Austin. The final exam was oral, one-on-one, for about two hours. I walked into his office at the end of the hall, which was larger than the classroom. Actually, I don't walk into the office; I walk into the waiting room leading to the office.

      So there's this huge room, with two walls covered in bookshelves, filled with books, periodicals, publications, a picture of Dijkstra in his graduation robes, awards, etc., all neatly arranged. I get the feeling the Doctor has written half of what's shelved there. (Knuth wrote half of the rest, I reckon.)

      Dr. Dijkstra sits me down, and after a quick chat, launches into the first problem. It's a proof, fairly simple. After presenting the problem, he sits down in the chair across from me, and waits, quietly and patiently. On me.

      I got so flustered I ended up with a B. One of my great regrets.
  • EWD Archives (Score:5, Informative)

    by charvolant ( 224858 ) on Wednesday August 07, 2002 @08:02PM (#4029531) Homepage
    For more of his writings, the Edsger W. Dijkstra Archives [utexas.edu] contains a lot of interesting/insightful/amusing writings.

    A pity he's gone.

  • Respects (Score:4, Interesting)

    by BigWorm ( 103915 ) on Wednesday August 07, 2002 @08:02PM (#4029535) Homepage
    Any service that uses pathfinding algorithms (such as MapQuest) should pay its respects.
  • More articles (Score:5, Informative)

    by devphil ( 51341 ) on Wednesday August 07, 2002 @08:03PM (#4029541) Homepage


    Some links from my article that slashdot rejected some hours ago: the University of Texas announcement [utexas.edu] has a list of his awards and discoveries. (He taught at UT.) A brief paper [utexas.edu] (in PDF, it's scanned from a handwritten paper for CACM if I recall) shows his brilliant, clear, and concise methods of thought and writing.

    If you ever used an application that made use of shortest-path searching -- say, any real-time strategy game -- then you owe this man a debt of gratitude.

    • by Anonymous Coward
      If he's responsible for the pathing in RTSes, his journey from life to the afterworld will inevitably get stuck on the edge of a tree...
    • I believe the Internet core routing protocols use Dijkstra's shortest-path algorithm, whereas RTS games probably use the A* algorithm to find approximate shortest paths. So everyone who accesses slashdot remotely uses his algorithm... :) IIRC, Dijkstra also developed semaphores and mutexes, according to our old friend Andy Tanenbaum, which are an absolute requirement for any multitasking, multithreaded OS. Gosh, the man was a legend...
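
      For anyone who knows the name but has never seen the algorithm itself, here is a minimal sketch of Dijkstra's single-source shortest path in C (adjacency-matrix form, O(V^2); the little graph and its weights are made up for illustration):

        #include <stdio.h>
        #include <limits.h>

        #define V   5
        #define INF INT_MAX

        int main(void)
        {
            /* g[i][j] = weight of edge i-j, 0 = no edge (made-up graph) */
            int g[V][V] = {
                {0, 4, 1, 0, 0},
                {4, 0, 2, 5, 0},
                {1, 2, 0, 8, 0},
                {0, 5, 8, 0, 3},
                {0, 0, 0, 3, 0},
            };
            int dist[V], done[V] = {0};

            for (int i = 0; i < V; i++) dist[i] = INF;
            dist[0] = 0;                        /* source is vertex 0 */

            for (int k = 0; k < V; k++) {
                int u = -1;
                for (int i = 0; i < V; i++)     /* closest unfinished vertex */
                    if (!done[i] && (u < 0 || dist[i] < dist[u])) u = i;
                if (dist[u] == INF) break;      /* the rest is unreachable */
                done[u] = 1;
                for (int v = 0; v < V; v++)     /* relax edges out of u */
                    if (g[u][v] && dist[u] + g[u][v] < dist[v])
                        dist[v] = dist[u] + g[u][v];
            }

            for (int i = 0; i < V; i++)
                printf("shortest distance to %d = %d\n", i, dist[i]);
            return 0;
        }

      Link-state protocols like OSPF run essentially this computation (typically with a priority queue instead of the linear scan) every time the topology changes.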

  • by cperciva ( 102828 ) on Wednesday August 07, 2002 @08:04PM (#4029544) Homepage
    Dijkstra was very good at producing quotable remarks; in addition to his comment about computers, thought, submarines, and swimming (RTFA), he made the following remark about computer science:
    "Computer science is as much about computers as astronomy is about telescopes."
  • by Raindeer ( 104129 ) on Wednesday August 07, 2002 @08:10PM (#4029581) Homepage Journal
    I found the quotes here: http://www.cse.iitb.ac.in:8000/~rkj/dijkstraquotes.html I paste them here in full to counter the slashdot effect.

    Some Quotes of Edsger Dijkstra
    "Always design your programs as a member of a whole family of programs, including those that are likely to succeed it"

    "Separate Concerns"

    "A Programming Language is a tool that has profound influence on our thinking habits"

    "The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague" (from 1972 Turing Award Lecture)

    "Progress is possible only if we train ourselves to think about programs without thinking of them as pieces of executable code"

    "Program testing can best show the presence of errors but never their absence"

    "I mean, if 10 years from now, when you are doing something quick and dirty, you suddenly visualize that I am looking over your shoulders and say to yourself, "Dijkstra would not have liked this", well that would be enough immortality for me"

    And then my quote :-) -->
    • "I mean, if 10 years from now, when you are doing something quick and dirty, you suddenly visualize that I am looking over your shoulders and say to yourself, "Dijkstra would not have liked this", well that would be enough immortality for me"

      A very apt last quote for your post. It reminds me a little bit of one of Richard Feynman's friends talking about how he had seen Feynman in a dream, talking very animatedly about something or other, and he thought 'Should I tell him he's dead, or does he already know?'

      OK, so it seemed more relevant in my own mind, but he certainly has left a legacy for others to follow.
    • by swordboy ( 472941 ) on Wednesday August 07, 2002 @09:09PM (#4029856) Journal
      "I mean, if 10 years from now, when you are doing something quick and dirty, you suddenly visualize that I am looking over your shoulders and say to yourself, "Dijkstra would not have liked this", well that would be enough immortality for me"

      Was he still talking about programming here?
    • by 2b|!2b ( 140353 ) on Wednesday August 07, 2002 @09:53PM (#4030067)
      I've always liked "Object-oriented programming is an exceptionally bad idea which could only have originated in California."

  • My 2 cents (Score:2, Interesting)

    by pjdepasq ( 214609 )
    I first learned that Dijkstra was ill back in February at a conference. Apparently he was sick with cancer and returned home to the Netherlands to live out his remaining days. Since that time, I periodically checked with my source, but they heard nothing new of his condition. I'm shocked to learn he lasted this long, considering what I heard back then.

    I was fortunate to be introduced to Dr. Dijkstra at SIGCSE 2000 in Austin by my advisor. It's unfortunate that our field is so young that its pioneers are just now starting to pass on (compared to other sciences such as Physics, Chemistry, etc.).
    • It's unfortunate that our field is so young that its pioneers are just now starting to pass on (compared to other sciences such as Physics, Chemistry, etc.).

      Yes. Computer science is indeed in its infancy. Dijkstra cleaned up algorithms by eliminating spaghetti code and introducing structured programming. In my opinion, we are still mired deep in the dark ages of computing. If only someone would clean up software engineering by eliminating the algorithm as the basis of software construction.

      Do a search on Google for 'synchronous reactive systems' and find out about the next big advance in software engineering.

      Project COSA [gte.net]
      • Sorry, but I consider you a spammer. Diving into your posted messages, I see several COSA promotions, and never a good reply or any example of COSA's alleged superiority. Nice to see that you have some ideals, but claiming that this is the silver bullet is arrogant nonsense. Such an attitude won't get you far. At least Dijkstra was a humble, wise man, and a great algorithms scientist. I don't think you respected the man by posting this message - if Dijkstra were still alive, he'd probably have told you to chill out until you had something that is not vaporware.
        No thanks.
  • by DrStrangeLoop ( 567076 ) on Wednesday August 07, 2002 @08:23PM (#4029664) Homepage
    ...he is not going to need his forks [nist.gov] anymore and the other guys are finally getting to eat?

    seriously though, i think dijkstra will be remembered as long as there is the need to prevent race conditions... which in my eyes is quite an accomplishment.

    -strangeloop
  • Two articles posted in a row to depress me.

    -Pete
  • by Kook9 ( 69585 ) on Wednesday August 07, 2002 @08:25PM (#4029672)
    It's a shame that /. seems to think "Go To Considered Harmful" is Dijkstra's signature achievement. He was profoundly influential in developing the theory of operating systems. He was one of the first proponents of layered design [cmu.edu]. He also did pioneering work in mutual exclusion [cs.vu.nl] (IIRC, he invented semaphores) and deadlock [ic.ac.uk]. In short, he is responsible for a lot of the fundamental concepts that we use to build complex systems today.
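
    For readers who only know semaphores as a library call: a minimal sketch of Dijkstra's P and V operations, spelled here with the POSIX names sem_wait/sem_post, protecting a shared counter between two threads (the thread and iteration counts are made up):

      #include <stdio.h>
      #include <pthread.h>
      #include <semaphore.h>

      static sem_t mutex;              /* binary semaphore guarding the counter */
      static long  counter = 0;

      static void *worker(void *arg)
      {
          (void)arg;
          for (int i = 0; i < 100000; i++) {
              sem_wait(&mutex);        /* P: block until we may enter */
              counter++;               /* critical section */
              sem_post(&mutex);        /* V: let the next thread in */
          }
          return NULL;
      }

      int main(void)
      {
          pthread_t a, b;
          sem_init(&mutex, 0, 1);      /* initial value 1 => mutual exclusion */
          pthread_create(&a, NULL, worker, NULL);
          pthread_create(&b, NULL, worker, NULL);
          pthread_join(a, NULL);
          pthread_join(b, NULL);
          printf("counter = %ld\n", counter);   /* always 200000, never less */
          return 0;
      }

    Without the P/V pair around counter++ the two threads race and the final value is unpredictable, which is exactly the problem Dijkstra's construct was invented to solve.
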
    • by Lictor ( 535015 ) on Wednesday August 07, 2002 @09:43PM (#4030016)
      Indeed. He was a most amazing man in that he was clearly a brilliant theoretician, but he had a keen interest in solving theoretical problems that were of *practical* value (who would've thought there were so many neat mathematical problems in OS design?).

      The reason that the bulk of the comments here revolve around the whole GOTO thing is, quite frankly, that it is the only one of Dijkstra's contributions that the bulk of Slashdotters are capable of understanding and appreciating.

      Most of these posts are quite equivalent to, upon hearing of the passing of Gandhi, saying "Gee, I heard that guy could go a few days without food".

      But, to paraphrase the great man himself: in Computer Science most folks miss the science for the telescope. Some things never change.

      Rest in peace, Professor Dijkstra.
      • this is the slashdot crowd. The same one that gets all crazy about MS or DRM or RMS. The rabble cannot "clean up" for this important post. Be kinder.

        There is an old zen saying:
        Show a swordsman your sword
        Show a poet your poem.

        Slashdot is just slashdot.
      • Yep. The idea (not Dijkstra's) is that by getting rid of GOTOs, programs magically become more manageable. The thing is that a mess of GOTOs can easily lead to an incredibly complex program that does not warrant that degree of complexity, hence the diatribe. The natural way to implement a Finite State Automaton is with GOTOs (and long cryptic labels ;).
        It's been a long time since I've even thought about them, but Dijkstra's guarded commands are insidiously powerful. I'm a bit surprised that essentially nothing has been done with them. He once commented that, using them, he needed recursion much less. I'm guessing, but I think the power comes from specifying the partial order inherent in the problem rather than inventing a linear order that the program must follow. With a smart-alecky interpreter it might be possible to make a somewhat effective substitute for a nondeterministic finite state automaton.
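
        For anyone who hasn't seen the guarded-command notation, the classic example is Euclid's GCD; here is a rough sketch of the notation (inside the comment) and a direct, if less symmetric, C transcription of it - my transcription, assuming positive inputs, not Dijkstra's own code:

          /* Dijkstra's guarded-command loop, roughly:
           *
           *     do  a > b  ->  a := a - b
           *      [] b > a  ->  b := b - a
           *     od
           *
           * Any guard that holds may fire; the loop stops when no guard
           * holds, i.e. when a == b.  In C: */
          int gcd(int a, int b)          /* assumes a > 0 and b > 0 */
          {
              while (a != b) {
                  if (a > b)
                      a = a - b;
                  else                   /* the other guard: b > a */
                      b = b - a;
              }
              return a;                  /* a == b == gcd of the inputs */
          }

        The point of the notation is that the two guarded alternatives are symmetric and order-free; the C version has to impose an if/else ordering that the problem itself never asked for.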
  • Sad to see him go (Score:5, Insightful)

    by teetam ( 584150 ) on Wednesday August 07, 2002 @08:30PM (#4029698) Homepage
    He was one of the greatest computer scientists and programming language theorists ever. I sincerely mourn his passing.

    In today's computer world, dominated more by marketing folks than by technicians, I wonder how many people have heard of this man. It is sad that in the last decade or so, CEOs like Bill Gates and Jeff Bezos have gained so much public recognition while people like Dijkstra languish in relative anonymity.

    A few weeks ago, there was a post in /. about Knuth. I was surprised to see many ask who he was!

    • by guybarr ( 447727 ) on Thursday August 08, 2002 @04:07AM (#4031252)
      It is sad that in the last decade or so, CEOs like Bill Gates and Jeff Bezos have gained so much public recognition while people like Dijkstra languish in relative anonymity.

      And in his time, who do you think was more famous: Newton or his king and queen? Lagrange or whichever Louis ruled then?

      True metal survives the acid test of time. The ornamentation, the hype-sellers, the Gateses and the Bezoses, will be forgotten by everyone (except historians) by the next century.

      Dijkstra's shortest-path algorithm and other works will be remembered in centuries to come.

  • by nbvb ( 32836 ) on Wednesday August 07, 2002 @08:32PM (#4029709) Journal
    For those of us who have chosen the fields of computer science & engineering as our professions, this is a time to reflect and realize just how lucky we are.

    We're getting in on the ground floor. The folks who were there in the VERY BEGINNING of our field are still around to teach us something. We need to remember just how privileged we are to have these fantastic people with us to "pass the torch" so to speak.

    Look at how far the medical field has come in its history. Or chemistry. Or physics. And these are just scientific professions.

    Think about other things, like teaching or agriculture.

    We're the next group to advance CS/E. We've got to adopt these folks as our mentors and learn all we can from them.

    Not just _how_ their stuff works, but _why_ they did it. Fundamental practices 30 years ago are as fundamental today as they were then.

    "Those who fail to learn from their past tend to repeat it."

    RIP, Mr. Dijkstra. And thanks for being such a great mentor.

    --NBVB
  • by extrasolar ( 28341 ) on Wednesday August 07, 2002 @08:36PM (#4029731) Homepage Journal
    I like spaghetti code. I grew up on AppleSoft Basic and GW-Basic (thank you microsoft).

    I read books I picked up from the library for free, which showed Basic programs threaded back and forth in sequence, for no apparent reason, and, like this sentence, confusing the heck out of me. I saw it as a challenge. I also loved conditional gotos. They were evil.

    Gosub? Bah. They ran out of memory too much. Because I hadn't the discipline to Return before I Goto'd out of the subroutine. So I used Goto's to simulate procedures. I also eventually used Goto's in a way that I would eventually learn is like structured programming. Set some variables, goto here, do stuff, goto back, set the same variables something else, goto here, do stuff, maybe goto back. Or it would be the end of the program.

    Then I got my first C book. I still haven't gotten the hang of that language. Before the book even mentions "goto", it gives me a lecture on how awful gotos are and how they can produce spaghetti code. But I *like* spaghetti code. And what's with these labels? Line numbers were so much cooler. But I took the man's advice; I used functions.

    But Basic spoiled me. I haven't been an effective programmer since. It wasn't long after I learned of structured programming that I got my first book on C++ and was introduced to object-oriented programming. For someone who had been using structured techniques for a couple of years, the need for objects seemed to make sense. But I was lost in a sea of hierarchical classes and virtual methods.

    When I first went on the internet, I started learning all kinds of crazy languages, hoping some of them would be simpler. And there were many. Except for forth and common lisp. Except for ML and Smalltalk. So I am still toying with scheme as I speak, still trying to figure out what exactly the difference between a recursive and iterative process is.

    Eventually, I'll figure out how to write spaghetti code in this otherwise clean and elegant language too. Continuations sound promising, from what I hear.

    I wish the best for Dijkstra--hope he rests in peace. Honestly, I had never heard of him until this post to slashdot.

    But maybe it is slightly better for him not to know that some of us never learn.
    • Goto's are bad because they are a challenge. If you need to engineer mission-critical systems, the person paying you to do so does not want you to try to make things obfuscated and "neat". He wants sound design and a clear implementation. And when someone else comes along to maintain your code, he doesn't want to weave through your spaghetti code either. I doubt you're as smart as Dijkstra. Perhaps you should become familiar with his work if you want to become a better programmer.
    • by sharkey ( 16670 ) on Wednesday August 07, 2002 @10:05PM (#4030110)
      I like spaghetti code.

      Come on, Taco. Post under your own name.
    • Mr. Dijkstra has some bad news [virginia.edu] for you:
      It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.
      • It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

        In a way I know what he means - but he's wrong. I did a lot of Basic, Assembler and opcodes before I got into OOP. Those are two completely different ways of thinking, I'll give him (and everybody else) that.
        The problem isn't the students, though. To date (15 years of computing) I haven't met a single software developer with significant OOP skills who could actually explain to normal people how OOP works. CS theorists are big on insulting hands-on programmers, but as soon as it comes down to getting the message across and the team going, they're often utterly clueless.
        I've learned the hard way. It took me years to grasp OOP even though I had programmed for years before - and I'm still chewing on my design patterns and UML. But when I'm finished, I'll actually be able to teach 'spaghetti coders' the way - and not just bullshit about them like the rest of the arrogant pack.
  • Quotes (Score:5, Funny)

    by dargaud ( 518470 ) <slashdot2@@@gdargaud...net> on Wednesday August 07, 2002 @08:41PM (#4029745) Homepage
    For memory:
    "The question of whether a computer can think is no more interesting than the question of whether a submarine can swim."
    - Edsger W. Dijkstra.
    "The use of COBOL cripples the mind; its teaching should therefore be regarded as a criminal offense."
    - E.W. Dijkstra.
    "In the good old days physicists repeated each other's experiments, just to be sure. Today they stick to FORTRAN, so that they can share each other's programs, bugs included."
    - E.W. Dijkstra.
  • re:guards (Score:3, Interesting)

    by epine ( 68316 ) on Wednesday August 07, 2002 @08:48PM (#4029775)
    Two chapters from one of Dijkstra's books improved my program correctness by an order of magnitude, and this was after I had fully digested Bertrand Meyer on programming by contract. His notion of guards is the number one item on my top ten list of everything I know about writing correct code.
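
    One concrete habit that falls out of Dijkstra's guards: his "if ... fi" aborts when none of the guards holds, so you are forced to say what should happen in every case instead of silently falling through. A minimal sketch of the same habit in plain C (the sign() example is made up, not from his book):

      #include <stdlib.h>

      /* Dijkstra-style:   if  x > 0 -> s := 1
       *                    []  x = 0 -> s := 0
       *                    []  x < 0 -> s := -1
       *                    fi
       * If no guard held, the program would abort rather than guess. */
      int sign(int x)
      {
          if (x > 0)       return 1;
          else if (x == 0) return 0;
          else if (x < 0)  return -1;
          else abort();    /* unreachable here, but the guard structure
                              makes the case analysis visibly complete */
      }

    Writing the last branch as an explicit guard rather than a bare "else" is a small thing, but it is the same discipline Meyer's contracts push: every case either satisfies a stated condition or the program stops, loudly.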
  • by dankelley ( 573611 ) on Wednesday August 07, 2002 @08:49PM (#4029783)
    To enjoy the next hour (or week, or month, ...) set aside this /. thread and enter into a Dijkstra thread.

    Just looking at his UTexas [utexas.edu] publication list is an awesome (pre-1990s meaning) experience. Let your eyes scan it, as they would the Grand Canyon. Then wander around the UTexas site, where many publications are online, and start reading. You'll be a better person for it. And you may experience a thrill of understanding when you see that his hands hold up so much of today's code, as Shakespeare's hands hold up so much of the language and common experience of the English-speaking world.

    To get a feel for the span of his life's work, consider his thesis title, "Communication with an Automatic Computer." The word "automatic" was necessary then, to distinguish the machine from a person with a calculator. The machine he used in his thesis? It had a 32K memory unit. He divided this into what he called "living" and "dead" memory.

    Let's hope that his memory will be of the living variety.

    To a man I never shall meet, thank you.


  • I am frankly not convinced that he showed that nested blocks are *objectively* better than goto's. His argument in that paper is not really rock-solid reasoning, as far as I can ascertain.

    I have tentatively concluded that nested blocks are "better" because they are more consistent from programmer to programmer.

    More about my GOTO ramblings at:

    http://geocities.com/tablizer/goals.htm#goto

    There is yet to be a "killer proof". I heard that when that paper came out there was a lot of controversy; goto fans rightly claimed that it was just an opinion. Regardless, most programmers now prefer nested blocks, whether they know why or not.

    I can't find any GOTO fans to interview, so the reasoning behind their preference seems unfortunately lost to history.
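
    To make the stylistic difference concrete, here is the same toy search written both ways - my own illustration, not an example from Dijkstra's paper:

      /* goto style: the reader has to reconstruct the loop shape */
      int find_goto(const int *a, int n, int key)
      {
          int i = 0;
      top:
          if (i >= n)      goto missing;
          if (a[i] == key) goto found;
          i++;
          goto top;
      found:
          return i;
      missing:
          return -1;
      }

      /* structured style: the loop shape is visible at a glance */
      int find_loop(const int *a, int n, int key)
      {
          for (int i = 0; i < n; i++)
              if (a[i] == key)
                  return i;
          return -1;
      }

    Both are trivially correct here; the argument in the paper is about what happens to your ability to reason about "where am I in the computation" once programs stop being this small.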
  • God Bless Dijkstra (Score:5, Insightful)

    by RickHunter ( 103108 ) on Wednesday August 07, 2002 @09:14PM (#4029873)

    This man contributed many great ideas to our field. The sad thing is how many programmers are still in ignorance of them, even now. You did great things, Mr. Dijkstra, and will be sorely missed. I just hope we're still allowed to have generic computing devices in ten years' time, so we can continue to refine and develop the revolutionary ideas you left us with.

  • Remember that [umn.edu]? The only algorithm from my CS course I ever put into practical use. aka "No Bracket Required" (for Phil Collins fans).
  • I heard reference to his algorithm (only way I heard about him) and just pegged him as another Renaissance man with too much free time on his hands (like Fourier).
  • Learn what he taught. Avoid GOTO. Learn about structured programming and CSP. Strive for elegance and simplicity in your programs. I can think of no better testament to his work than to show that we really were listening.

  • We are informed when great men die. We are never informed when great men are born. Here's hoping whoever he comes back as can live up to his potential.
  • by sbaker ( 47485 ) on Wednesday August 07, 2002 @10:19PM (#4030168) Homepage
    When I was just fresh out of college back in 1978, a colleague of mine who had been on Dijkstra's circulation list gave me a large stack of photocopied papers from Dijkstra... all written in his own handwriting, because he liked to invent his own symbols and found typewriters too limiting. I was working for Philips Research at the time - and I suppose Dijkstra was working at Philips' "Math Center" in Eindhoven, Holland.

    I've kept a whole boxful of his papers over the years - just because they are so fascinating to browse.

    He invented his own programming language for expressing algorithms - but doesn't seem ever to have written a compiler for it. He refers to algorithms his mother came up with...almost every document has something interesting like that.

    The notes are written in the most perfect handwriting you've ever seen.

    They could have been printed - they are that precise. Then one of them, out of the blue, seems to have been written in someone else's handwriting - it's just as amazingly neat, though, and when you get to the end of it, it says something like: "Apologies for the poor handwriting in this note, but my left hand could use some practice." :-)

    These cannot be stored as text files without losing most of their historical interest. Maybe I should spend an evening or two to scan them and put them online. There could be no more fitting tribute to the man.
  • by tarvid ( 48247 ) on Wednesday August 07, 2002 @10:31PM (#4030222) Homepage
    Although Edsger is remembered for the article on the goto, his development of the stack model was an evolutionary leap in the development of computers.

    Every computer made today embodies his model. Interrupt handling, recursion, reentrant programming, multi-programming, multi-processing, virtual memory all come out of Edsger's model.

    I had the great fortune to work on a Burroughs B5500 and later the first B6500 that made it out of manufacturing. This entire series of computers was based on Edsger's model and his Algol 60 compiler.

    Tony Hoare may have put it best when he quipped "Algol is an improvement over all its successors".

    Certainly Edsger was an improvement over most of his successors.

    Jim Tarvid
  • A great loss... (Score:4, Insightful)

    by DCowern ( 182668 ) on Wednesday August 07, 2002 @11:10PM (#4030400) Homepage

    Moderators: This is one of those posts where I say screw karma. Mod me to redundant hell if you wish, it just doesn't matter.

    This is an extremely sad day for computer science. There is hardly a field in CS that Dijkstra's work didn't touch. His work can be seen everywhere we use computers.

    Personally, this is an extremely sad day for me as well. Although I never met the man or saw him speak (now one of my greatest regrets), for someone still in college like me, he's the equivalent of a Joe DiMaggio or a Ted Williams. This man was a hero and an inspiration to me.

    Sometimes it really pisses me off that we show such public sorrow for sports figures who pass away, like Ted Williams, who for the most part didn't do a damn thing to really and truly improve our lives (granted, Ted Williams was a marine and fighter pilot, but that's not why most people were mourning him). This man greatly and directly contributed to a vast improvement of our quality of life as human beings. His obituary will be a footnote on page Z-42 of the NY Times and Washington Post, but when celebrities die, they're front and center on page 1. It makes me sick.

    That's my 2 cents and I'm not giving any damn change. >:o

    • To paraphrase someone who posted this earlier: in his time, Newton was probably less well known than the king who ruled England. I am sure Dijkstra, Knuth and the like will be remembered forever as great pioneers of computer science. I am not sure the celebrities you are talking about will be remembered some years from now.
  • I had Dijkstra for a graduate CS class in the Fall of 1996. It was an exploration of elegance in the process of quantitative reasoning. I must say that he taught me the virtue of careful thinking more than any other instructor during my formal education. Check out this link [utexas.edu] starting around manuscript 1237 to see the course notes. As an example, he showed us an algorithm for calculating successive cubes of integers (x^3 for x = 1 to N) that reduces to two C statements and uses only integer addition and initial assignment as operators. E-mail me if you want the code. Hint: it would still be a two-statement algorithm for any arbitrary polynomial function.
    k u r t AT s p a c e s h i p . c o m
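
    (The trick he is alluding to is presumably the method of finite differences; here is a sketch of the idea, my reconstruction rather than the code he showed in class. For f(x) = x^3 the third difference is the constant 6, so each new cube needs nothing but additions:)

      #include <stdio.h>

      int main(void)
      {
          long cube = 1, d1 = 7, d2 = 12;   /* f(1), first and second differences */
          for (int x = 1; x <= 10; x++) {
              printf("%d^3 = %ld\n", x, cube);
              cube += d1;                   /* the whole update is just  */
              d1   += d2;                   /* additions; no multiplies, */
              d2   += 6;                    /* third difference is fixed */
          }
          return 0;
      }

    (The same scheme works for any polynomial: a degree-k polynomial has a constant k-th difference, so k running sums regenerate it term by term.)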
  • Dijkstra's Law (of Programming Inertia):

    If you don't know what your program is supposed to do, you'd better not start writing it.

    I'm very sorry to hear he's died, even though I never met the good Doctor. In fact, each time I'm led off into the weeds by some dumbass project manager who misinterprets XP or RAD or ??? into contradicting this law, I quote Dijkstra's Law to anyone nearby. Along with quoting from Yourdon's "Death March", it's my favorite self-help therapy method.
  • by Michael Wolf ( 23460 ) on Thursday August 08, 2002 @03:01AM (#4031104)
    Let's not forget this bit of fun. We can banish goto forever now that someone finally invented [fortran.com]
    comefrom.
  • Linux Kernel (Score:5, Interesting)

    by AftanGustur ( 7715 ) on Thursday August 08, 2002 @03:19AM (#4031150) Homepage


    Those who actually read the linux kernel source code probably already knew Dijkstra and his god-like powers in the computer sciences.

    But for those who only put their nose in there and just read the comments, there are some references.

    For example: drivers/scsi/NCR5380.c

    goto part2; /* RvC: sorry prof. Dijkstra, but it keeps the rest of the code nearly the same */
  • A few years back.... (Score:2, Interesting)

    by Hugh Kir ( 162782 )
    I had a friend working for the computer center at UT. She emailed me that she had just done such-and-such a thing for Professor Dijkstra. Upon reading this, I of course sent some reply about all the stuff he had done, and how cool it must be to have met him, etc., etc., and she replied, "I didn't know he was famous. I just thought he was a nice old man!"

    So, here's to a great computer scientist and a "nice old man". May he rest in peace.
  • by C. E. Sum ( 1065 ) on Thursday August 08, 2002 @11:51AM (#4033209) Homepage Journal
    I received my computer science degree from the University of Texas, where Dr. Dijkstra taught before retiring. I never took the undergraduate class he offered (I was kind of intimidated at the time), but the professor who taught my Software Engineering class had him come in to lecture one day.

    This software engineering class was very pragmatic, emphasizing methodical design, implementation, and testing. As I recall, Dr. Dijkstra gave his lecture near the end of our semester, by which time we had been heavily involved in something resembling a team development environment for a few months. There was a very corporate feeling to our regimen of meetings and reports.

    So one day we all go to the faculty lounge to hear the esteemed professor speak. He comes in the door of the lounge appearing to me most unlike the kind of man who could write so forcefully about programming, dressed in shorts and a tee-shirt with a distinctly old-grandfather look on his face.

    In his very soft-spoken manner, he told us that he believed the main problem with programmers was a lack of rigor. People were so concerned with coding and testing that they never learned how to write something correctly the first time. He asked us to prove the correctness of the code for a binary search and spent the next half-hour proceeding glumly as we slowly worked through the process with him. (A sketch of the kind of invariant argument he was after appears at the end of this comment.)

    I got the impression we were a vaguely disappointing group of students who, he could tell, were not convinced of the validity of his approach. It wasn't even a bitter disappointment, though. I felt as though he was someone who had totally convinced himself that he knew how to make the world a better place, but that no one was listening.

    He answered our questions about "gotos considered harmful" (it was his editor's idea to give it the cute title) with what I considered obvious patience. He talked about how he really only was able to keep up on the research that people referred to him these days. And then the lecture was over.

    Our professor and Dr. Dijkstra were good friends, and I hung around after class talking with them about computer science and Dijkstra's past. I ended up in his office after a while and we chatted about the current state of the industry as he saw it, why he really liked Texas, and so on. He was so intelligent in his conversation--asked so many probing questions--that by the time I was done I felt both touched and exhausted. He put on his cowboy hat and walked out of the office with me and headed off to his next appointment.

    That was the last time I saw Edsger Dijkstra--the only real time I ever talked to him. But I feel that the world has lost a quiet crusader, and I feel a tug in my heart thinking about this old dean of computer science with his cowboy hat.
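
    (The kind of invariant argument he asked for, reconstructed as a sketch - my code and my phrasing of the invariant, not his notation:)

      /* Invariant: a[] is sorted ascending; everything before index lo
       * is < key, and everything at index hi or beyond is >= key.      */
      int binsearch(const int *a, int n, int key)
      {
          int lo = 0, hi = n;               /* invariant holds vacuously  */
          while (lo < hi) {
              int mid = lo + (hi - lo) / 2; /* lo <= mid < hi             */
              if (a[mid] < key)
                  lo = mid + 1;             /* a[0..mid] are all < key    */
              else
                  hi = mid;                 /* a[mid..n-1] are all >= key */
          }
          /* lo == hi is the first position holding a value >= key       */
          return (lo < n && a[lo] == key) ? lo : -1;
      }

    Each branch preserves the invariant and strictly shrinks hi - lo, so the loop terminates with the answer pinned down - which is the sort of argument he wanted written before the code, not after.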
  • by Lumpish Scholar ( 17107 ) on Thursday August 08, 2002 @12:03PM (#4033301) Homepage Journal
    It is practically impossible to teach good programming style to students that have had prior exposure to BASIC; as potential programmers they are mentally mutilated beyond hope of regeneration.

    APL is a mistake, carried through to perfection. It is the language of the future for the programming techniques of the past: it creates a new generation of coding bums.

    The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence.

    When FORTRAN has been called an infantile disorder, PL/I, with its growth characteristics of a dangerous tumor, could turn out to be a fatal disease.

    COBOL is for morons.

    With respect to COBOL you can really do only one of two things: fight the disease or pretend that it does not exist.

    The question of whether computers can think is like the question of whether submarines can swim.

