


Remember the Computer Science Past Or Be Condemned To Repeat It? 479

theodp writes "In the movie Groundhog Day, a weatherman finds himself living the same day over and over again. It's a tale to which software-designers-of-a-certain-age can relate. Like Philip Greenspun, who wrote in 1999, 'One of the most painful things in our culture is to watch other people repeat earlier mistakes. We're not fond of Bill Gates, but it still hurts to see Microsoft struggle with problems that IBM solved in the 1960s.' Or Dave Winer, who recently observed, 'We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac.' And then there's Scott Locklin, who argues in a new essay that one of the problems with modern computer technology is that programmers don't learn from the great masters. 'There is such a thing as a Beethoven or Mozart of software design,' Locklin writes. 'Modern programmers seem more familiar with Lady Gaga. It's not just a matter of taste and an appreciation for genius. It's a matter of forgetting important things.' Hey, maybe it's hard to learn from computer history when people don't acknowledge the existence of someone old enough to have lived it, as panelists reportedly did at an event held by Mark Zuckerberg's FWD.us last Friday!"
This discussion has been archived. No new comments can be posted.


  • by zmughal ( 1343549 ) on Tuesday July 30, 2013 @10:10PM (#44430767) Homepage
    John Graham-Cumming gave a talk at OSCON 2013 titled "Turing's Curse [youtube.com]" that speaks to this same idea. Worth a watch.
  • by Anonymous Coward on Tuesday July 30, 2013 @10:51PM (#44431043)

    I would actually recommend you use C++ with only a subset of its features. C is still the most popular language globally, but C++ is a close second; unfortunately, there are some benefits to C++. For the record, universities have been teaching C++ since at least 1995, so you're nearly 20 years out of date. And Slashdot is not a good measure of which languages people are using, as it does not represent the general coding community.

  • Re:Back to BASIC (Score:5, Informative)

    by Samantha Wright ( 1324923 ) on Tuesday July 30, 2013 @10:52PM (#44431065) Homepage Journal
    Here's some worthwhile reading on why Lisp has trouble staying put—possibly a little flamebait-y: Lisp is not an acceptable Lisp [blogspot.ca], The Lisp Curse [winestockwebdesign.com], and Revenge of the Nerds [paulgraham.com]. The core arguments seem to be (a) it's really easy to invent things in Lisp so no one can agree on how to do it, and (b) the lack of a coherent standard platform means there is no easy target for university courses or job descriptions.
  • by sjames ( 1099 ) on Tuesday July 30, 2013 @11:46PM (#44431327) Homepage Journal

    Actually, it was Babbage who faced such idiocy from Parliament:

    On two occasions I have been asked [by members of Parliament], 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

    And you thought the clogged tubes thing was bad.

  • Re:In Browser (Score:4, Informative)

    by cold fjord ( 826450 ) on Tuesday July 30, 2013 @11:56PM (#44431365)

    Did the Mac, 25 years ago, allow people to load code from a remote server and execute it locally in a sandbox and in a platform independent manner all in a matter of a couple of seconds? No. No it did not.

    Depends on how much leeway you are willing to grant. Around 1990 or so, the Mac could run SoftPC, a virtual-machine x86 emulator running DOS or Windows. The Mac could certainly network and had file servers. So you should in fact have been able to download code from a file server and run it in the virtual machine, which from a Mac perspective would effectively be a sandbox. Although the PC DOS/Windows platform isn't "platform independent," it was nearly universal (minus Mac-only systems*) at the time.

    * Yes, yes: Amiga, Apple II, Atari, et al.

  • Re:Paging Linus (Score:4, Informative)

    by phantomfive ( 622387 ) on Wednesday July 31, 2013 @12:31AM (#44431523) Journal
    For those who didn't catch it, he's an APL programmer. "Use a for loop" for him is the biggest insult in that entire paragraph.
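For context: APL's idiom is to replace explicit loops with whole-array operations (sum of squares is just `+/V×V`). A rough C++ sketch of the same contrast, with illustrative names, assuming C++17:

```cpp
#include <numeric>
#include <vector>

// Loop style: iterate and accumulate by hand.
int sum_of_squares_loop(const std::vector<int>& v) {
    int sum = 0;
    for (int x : v) sum += x * x;
    return sum;
}

// Array style, closer in spirit to APL's +/V×V: one whole-array
// operation (an inner product of v with itself) instead of a loop.
int sum_of_squares_array(const std::vector<int>& v) {
    return std::transform_reduce(v.begin(), v.end(), v.begin(), 0);
}
```

To an APL programmer, the second form is the program; the first is bookkeeping.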
  • by AK Marc ( 707885 ) on Wednesday July 31, 2013 @12:46AM (#44431613)
    What past were you from? When I had DOS 3.3 running on my XT (on a hard drive), it booted in a few seconds after POST. When I loaded Windows 3.1 (no network at home at the time, so I didn't run Windows 3.11) on an XT with 1MB of RAM, it would take forever. And DOS 3.3 from floppy was slow, and loud. But 3.3 from the hard drive on an ancient XT was much faster than Windows is today. DOS programs loaded fast; granted, I was running 300k programs, not 300MB programs, but they were still fast on DOS 3.3 back in the day. What were you running on your ancient computer?
  • by Darinbob ( 1142669 ) on Wednesday July 31, 2013 @01:15AM (#44431755)

    However, compare Word from 1990 to Word from today. The 1990 one will start nearly instantly, be incredibly responsive, and have all the features most people use anyway.

  • Re:Back to BASIC (Score:4, Informative)

    by ebno-10db ( 1459097 ) on Wednesday July 31, 2013 @01:24AM (#44431799)

    What confuses people is the functional orientation

    Whether or not a language is functional is a matter of degree (just as whether or not a language is dynamic is a matter of degree).

    Out of the box, Lisp is not nearly as much a functional language as *ML or Haskell; there is no pattern matching, for example. Of course you can turn Lisp into a functional language, which is what Qi [wikipedia.org] is. In true Lisp fashion it's incredibly clever: 10k lines of CL and you've got a functional language that is arguably even more of a functional language than Haskell (not sure about functional-specific optimizations, though). And also in true Lisp fashion, there's already a (not fully compatible) fork/successor called Shen [wikipedia.org] (by the same guy who created Qi!).

    In other words it's an ongoing experiment, rather than something you can rely on. It grew out of the L21 project, which was supposed to be about Lisp for the 21st century. The lesson is that Lisp for the 21st century is just like Lisp for the 20th century - incredibly clever and powerful but not stable or standardized enough to rely on. Of course the great exception to that is Common Lisp - a byzantine composite of many pre-1985 dialects, warts and all, that hasn't really been updated in 27 years.
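To make the pattern-matching point above concrete: ML-family languages let you define a sum type and dispatch on its cases exhaustively. A rough C++17 analogue (illustrative types, not from any real library) uses std::variant and std::visit:

```cpp
#include <variant>

// A sum type with two cases, like an ML datatype.
struct Circle { double r; };
struct Rect   { double w, h; };
using Shape = std::variant<Circle, Rect>;

// The classic "overloaded lambdas" idiom for visiting a variant.
template <class... Ts> struct overloaded : Ts... { using Ts::operator()...; };
template <class... Ts> overloaded(Ts...) -> overloaded<Ts...>;

// Case analysis on the shape, one branch per alternative; the compiler
// checks that every alternative of the variant is handled.
double area(const Shape& s) {
    return std::visit(overloaded{
        [](const Circle& c) { return 3.141592653589793 * c.r * c.r; },
        [](const Rect& r)   { return r.w * r.h; },
    }, s);
}
```

What ML gives natively (binding sub-parts, nested patterns, exhaustiveness as a language feature) has to be simulated here, which is the commenter's point about out-of-the-box Lisp as well.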

  • Re:Back to BASIC (Score:5, Informative)

    by Darinbob ( 1142669 ) on Wednesday July 31, 2013 @02:44AM (#44432157)

    Macsyma? Emacs itself is more Lisp than C. Zork was a Lisp dialect. Mirai is lisp, and was used to do animation in Lord of the Rings. Lots of expert systems. Several CAD systems and other modelling programs. Data analysis stuff. Whoever uses Clojure is using Lisp and it seems to have some traction. And Orbitz is apparently using a lot of Lisp internally (just to throw out a web site since some people think it's not real if it's not a web site or PC app).

  • Re:Back to BASIC (Score:4, Informative)

    by ebno-10db ( 1459097 ) on Wednesday July 31, 2013 @04:02AM (#44432475)

    It can be done in C++, but why go to the extra effort?

    What extra effort is that? Calling your files *.cpp instead of *.c? Ok, that is an extra two letters per file name.

    The why is so that you can take advantage of C++ features. Templates, for example, are a great way to write very fast code, and if you know what you're doing, you don't get the dreaded bloat. Object-based programming is a nice way to encapsulate things and adds zero overhead. True OO can be used judiciously in the non-speed-critical parts (often a clean way to have a single image handle several minor hardware variants). A combination of object-based programming and operator overloading can be a clean way to handle the semantics of fixed-point DSP, which don't map nicely to most languages.
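A minimal sketch of that last point, assuming a Q16.16 format (the type and names here are hypothetical, not from any particular DSP toolchain): operator overloading lets the fixed-point semantics live in one small type instead of being hand-expanded at every call site, with no runtime overhead beyond the shifts the C version would need anyway.

```cpp
#include <cstdint>

// Hypothetical Q16.16 fixed-point value: 16 integer bits, 16 fraction bits.
struct Q16 {
    int32_t raw;  // stored as value * 2^16

    static Q16 fromDouble(double d) {
        return Q16{static_cast<int32_t>(d * 65536.0)};
    }
    double toDouble() const { return raw / 65536.0; }

    // Addition needs no rescaling: both operands share the same scale.
    Q16 operator+(Q16 o) const { return Q16{raw + o.raw}; }

    // Multiplication: widen to 64 bits, multiply, shift back down to Q16.16.
    Q16 operator*(Q16 o) const {
        return Q16{static_cast<int32_t>(
            (static_cast<int64_t>(raw) * o.raw) >> 16)};
    }
};
```

With `a * b + c` reading like plain arithmetic, the scaling rules are stated once and enforced by the type system, which is exactly the encapsulation argument being made above.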

    C++ is not designed for efficiency.

    Read Stroustrup. As I already said, one of the key design principles was not to add overhead unless you explicitly ask for it. It was designed to be as efficient as you need it to be.

    If you don't care quite so much about efficiency, use LISP

    1. GC and hard real-time? They don't go together.

    2. Even where you can write Lisp to be as efficient as C/C++ for low level operations, all you're doing is writing C/C++ in Lisp. What's the point?

    3. How many Lisps, Schemes, or whatevers have you seen that support cross-development to DSPs and other usually-embedded architectures, have good optimizers (a la SBCL, for example), and can run without an OS?

  • by serviscope_minor ( 664417 ) on Wednesday July 31, 2013 @07:48AM (#44433349) Journal

    one that's much more favorable to the MPs

    Come to England and look at our MPs! You will then probably feel that it wasn't such an unfair interpretation on the part of Babbage.

    Seriously though, there are many people out there (and they tend to be non-technical) who simply do not understand computers. This lack of understanding means that they effectively treat computers as magic: they have no way to reason about what a computer might do. Even pretty smart people fall prey to this.

    The UK has never had a tradition of putting technically minded people into parliament.
