Programming

Remember the Computer Science Past Or Be Condemned To Repeat It? 479

theodp writes "In the movie Groundhog Day, a weatherman finds himself living the same day over and over again. It's a tale to which software-designers-of-a-certain-age can relate. Like Philip Greenspun, who wrote in 1999, 'One of the most painful things in our culture is to watch other people repeat earlier mistakes. We're not fond of Bill Gates, but it still hurts to see Microsoft struggle with problems that IBM solved in the 1960s.' Or Dave Winer, who recently observed, 'We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac.' And then there's Scott Locklin, who argues in a new essay that one of the problems with modern computer technology is that programmers don't learn from the great masters. 'There is such a thing as a Beethoven or Mozart of software design,' Locklin writes. 'Modern programmers seem more familiar with Lady Gaga. It's not just a matter of taste and an appreciation for genius. It's a matter of forgetting important things.' Hey, maybe it's hard to learn from computer history when people don't acknowledge the existence of someone old enough to have lived it, as panelists reportedly did at an event held by Mark Zuckerberg's FWD.us last Friday!"

  • by michaelmalak ( 91262 ) <michael@michaelmalak.com> on Tuesday July 30, 2013 @10:19PM (#44430833) Homepage

    That's genius: comparing a "$100k/CPU" non-distributed database to a free distributed database. Also no mention that, yes, everyone hates Hive, and that's why there are a dozen replacements coming out this year promising 100x speedups, all of them also free.

    And on programming languages, Locklin condescends from his high-and-mighty functional-programming mountain, making no mention of the detour the industry first had to take into object-oriented programming to handle and organize the exploding size of software programs before combined functional/object languages could make a resurgence. He also neglects to mention Python, which has been popular and mainstream since the late '90s.

  • by DutchUncle ( 826473 ) on Tuesday July 30, 2013 @10:41PM (#44430975)

    I don't remember that code running cross platform on varying architectures.

    Yes. No code runs cross-platform on varying architectures - INCLUDING the stuff that supposedly does, like Java and JavaScript and all of the distributed web stuff. All of it DEPENDS on an interpretation layer that, at some point, has to connect to the native environment.

    Which is what BASIC was all about. And FORTRAN. Expressing the algorithm in a slightly more abstract form that could be compiled to the native environment - and then, in the case of BASIC, turned into interpreted code. (Oh, you thought Java invented the virtual machine?)
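
    A toy sketch of that idea in Python (illustrative only - not how any real BASIC was implemented): the "program" is a list of abstract opcodes, and only the dispatch loop ever touches the native environment.

        # Minimal stack-machine interpreter: abstract opcodes in, native
        # effects out. Tokenized BASIC pulled this trick decades before the JVM.
        def run(program):
            stack = []
            for op, arg in program:
                if op == "PUSH":
                    stack.append(arg)
                elif op == "ADD":
                    b, a = stack.pop(), stack.pop()
                    stack.append(a + b)
                elif op == "PRINT":
                    print(stack.pop())

        run([("PUSH", 2), ("PUSH", 40), ("ADD", None), ("PRINT", None)])  # prints 42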

  • by Gim Tom ( 716904 ) on Tuesday July 30, 2013 @10:44PM (#44431005)
    As a 66-year-old lifelong geek, I have seen many of the things I worked with decades ago reinvented numerous times under a variety of names, but there is one thing I used extensively on IBM OS/360 that I have never seen in the PC world, and it was a very useful item to have in my toolkit. The Generation Data Set, and by extension the Generation Data Group, were a mainstay of mainframe computing on that platform the entire time I worked on it. When I moved on to Unix and networks in the last few decades of my career I looked for something similar, and never found anything quite as simple and elegant (in the engineering sense of the word) as the Generation Data Set. Oh, you can build the same functionality in any program, but this was built into the OS and used extensively. If anyone has seen a similar feature in Unix or Linux I would love to know about it.
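
    A rough sketch of how one might approximate GDG behavior on Unix, in Python (the new_generation helper and its naming scheme are invented for illustration; nothing like this is standard):

        import glob
        import os

        def new_generation(base, keep=5):
            """Return the path for the next generation of a dataset,
            GDG-style: base.G0001, base.G0002, ... Generations beyond
            `keep` are pruned, mimicking a GDG retention limit."""
            gens = sorted(glob.glob(base + ".G*"))
            nums = [int(g.rsplit(".G", 1)[1]) for g in gens]
            # Prune the oldest generations so at most `keep` remain after creation.
            for old in gens[:max(0, len(gens) - keep + 1)]:
                os.remove(old)
            return "%s.G%04d" % (base, (max(nums) + 1) if nums else 1)

        with open(new_generation("payroll.dat"), "w") as f:
            f.write("today's run\n")

    A pale shadow of the real thing - OS/360 catalogued the generations for you - but it shows the idea.
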
  • by symbolset ( 646467 ) * on Tuesday July 30, 2013 @10:48PM (#44431029) Journal
    These guys were openly publishing their brilliance before hiding how your shit works was even a thing. Believe it or not, once upon a time, if you invented a brilliant thing in code you shared it for others to build upon, so you could learn and grow and benefit. Hiding it for profit wasn't even thought of yet. It wasn't just undesirable: the thought did not even occur. That was the golden age of much progress, as each genius built upon the prior - standing upon the shoulders of giants, reaching for fame. Now that we're in a hiding era we go around and around reinventing the same shit over and over, suing each other over who invented it first. It is madness. In the process we have moved backwards, losing decades of developed wisdom.
  • Au Contraire! (Score:4, Interesting)

    by VortexCortex ( 1117377 ) <VortexCortex AT ... trograde DOT com> on Tuesday July 30, 2013 @11:58PM (#44431379)

    For instance: as a cyberneticist I'm fond of programs that output themselves; that's the key component of a self-hosting compiler... Such systems have a fundamental self-describing mechanism, much like DNA and all other "life". While we programmers continue to add layers of indirection and obfuscation (homomorphic encryption) and segmented computing (client/server), some of us are exploring the essential nature of creation that creates the similarities between such systems -- while you gloat over some clever system architecture, some of us are discovering the universal truths of design itself.
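
    (For the curious: the classic minimal self-outputting program - a quine - looks like this in Python. A sketch of the self-description being described, not tied to any particular system.)

        # A quine: running this prints its own source text, byte for byte.
        s = 's = %r\nprint(s %% s)'
        print(s % s)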

    To those who may think Computer Science is a field whose past must be studied lest it be repeated, I would argue that there is no division in any field, and that you haven't figured out two key things:
    0. Such iteration is part of the cybernetic system of self improvement inherent in all living things -- to cease is death, extinction.
    1. Nothing in Computer Science will truly be "solved" until a self improving self hosting computing environment is created...

    So, while you look back and see the pains of Microsoft trying to implement POSIX poorly, I've studied the very nature of what POSIX tried, and only partially succeeded, to describe. While you chuckle at the misfortunes of programmers on the bleeding edge who are reinventing every wheel in each new language, I look deeper and understand why they must do so. While you look to the "great minds" of the past, I see them as largely ignorant figures of self-importance who thought they were truly masters of something but ultimately did not grasp, at a fundamental level, what they claimed to understand -- the way a quantum physicist might acknowledge the pioneers of early atomic thinking... important, but not even remotely aware of what they were truly doing.

    How foolishly arrogant you puny minded apes are...

  • by phantomfive ( 622387 ) on Wednesday July 31, 2013 @12:35AM (#44431547) Journal
    He's talking about APL, which is not a functional language. It's an array-based language, of which Dijkstra said, "APL is a mistake, carried through to perfection."
  • by MillerHighLife21 ( 876240 ) on Wednesday July 31, 2013 @01:11AM (#44431729) Homepage

    Package management and fit-for-purpose tool chains don't exist in other languages? Is that a joke? Have you seen the Ruby gem ecosystem? Have you seen the Java ecosystem? You can do everything you described in Ruby or Python without blinking, and you won't incur the technical debt that Node's global insanity creates. Node came along and people went "OMG! Non-blocking I/O!" and everybody else with a pulse looked at it and said... yeah, that's what background workers are for - but background workers encapsulate the logic instead of letting it all float around in one process. Eventually, Node code grows to insanity.
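
    The background-worker pattern in question, sketched in a few lines of Python (standard queue/threading modules; a real deployment would use a proper job queue, and the job here is invented for illustration):

        import queue
        import threading

        jobs = queue.Queue()

        def worker():
            # The slow I/O lives here, encapsulated, rather than being
            # woven through one big event loop as callbacks.
            while True:
                job = jobs.get()  # blocks until work arrives
                job()             # e.g. call a payment API, send an email
                jobs.task_done()

        threading.Thread(target=worker, daemon=True).start()
        jobs.put(lambda: print("charging card..."))
        jobs.join()  # the web process just enqueues and moves on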

    Mongo is awesome... for write-heavy applications. In most applications that means one table could probably be better served by Mongo. For logging or cloud-based data aggregators it's EXCELLENT. It's a fantastic session store too. Also a great query cache. That doesn't make it the optimal tool for your entire system, where you might actually care about normalization, data compression, data integrity, or the amount of hard drive space required to store all the data bloat that comes with it.

    I can build a fully functional ecommerce system with an API, payment gateways, an account system, and analytics in 2 weeks (and most of that is just setting up the payment gateway and merchant account) with Ruby, Python, Groovy, or Scala. With 1 person. Having it do $100k/month in sales is a product of what it's offering, how effectively it's marketed, and how the supply-chain side of the business can scale with demand - and has absolutely zero to do with Node and/or Mongo.

    Are you actually serious with such an example?

  • Re:Back to BASIC (Score:2, Interesting)

    by phantomfive ( 622387 ) on Wednesday July 31, 2013 @03:16AM (#44432287) Journal
    If you actually care about efficiency, you might as well do it in C. It can be done in C++, but why go to the extra effort? C++ is not designed for efficiency.

    If you don't care quite so much about efficiency, use LISP, a language that actually does make your life easier (and also happens to be a true multi-paradigm language, and does a much better job of it than C++).
  • by wanax ( 46819 ) on Wednesday July 31, 2013 @04:04AM (#44432485)

    I've always felt like that quotation had another interpretation, one that's much more favorable to the MPs:

    If you're an MP, you've probably had to deal with a lot of people asking for money to fund what is essentially snake oil. If you don't understand the underlying 'cutting edge' technology (which is both plausible and acceptable), one simple test is to ask a question where you KNOW that any answer other than "No" means the person is bullshitting, and you can safely ignore them... and as reported, the question is phrased in such a way that it would sorely tempt any huckster to oversell his device. I think Babbage's lack of comprehension was due to his inability to grasp that the MP was questioning HIM, rather than the device.

  • by Camael ( 1048726 ) on Wednesday July 31, 2013 @04:29AM (#44432565)

    Actually, from the examples cited, it seems to me to be painfully obvious why in those cases information was not shared.

    One of the most painful things in our culture is to watch other people repeat earlier mistakes. We're not fond of Bill Gates, but it still hurts to see Microsoft struggle with problems that IBM solved in the 1960s.

    For quite a long period of time, IBM and MS were stiff competitors (remember OS/2 Warp?). I doubt MS would inform IBM what they were working on, much less seek help. In fact, it seems to be the exception rather than the rule for software companies to share code with each other. Selling code, after all, is usually how they make money.

    'We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac.'

    I'm fairly confident that Apple would sue any company that copies its software written for the Mac. Let us also not forget how many problems Oracle caused for Google when they sued over the Java API in Android [wikipedia.org]. Yes, it is efficient to reuse old, tried-and-tested code, but it also opens you up to a lawsuit. So it is not so much reinventing the wheel as trying to find a different way of doing things so you won't get sued. For that, you have current IP laws to thank.

    One of the problems with modern computer technology is that programmers don't learn from the great masters. 'There is such a thing as a Beethoven or Mozart of software design,' Locklin writes. 'Modern programmers seem more familiar with Lady Gaga. It's not just a matter of taste and an appreciation for genius. It's a matter of forgetting important things.

    The problem here is with equating writing software with producing works of art. People are willing to go out of their way to learn and improve themselves to paint better or make beautiful music because it enables them to express themselves. It's emotionally satisfying. OTOH, most software is programmed to achieve a certain utility, and the programmer is faced with constraints, e.g. having to use certain libraries. He is rarely able to express himself, and his work is subject to the whims of his bosses. For most everyday programmers, I think there is no real motivation to 'learn from the great masters'.

    An exception might be the traditional hacking/cracking community where the members program for the sheer joy/devilry of it. I understand there is a fair amount of sharing of code/information/knowledge/learning from the great masters within their community.

  • Re:Back to BASIC (Score:5, Interesting)

    by TheRaven64 ( 641858 ) on Wednesday July 31, 2013 @05:19AM (#44432757) Journal
    The first C book I read spent most of the first chapter justifying why you'd use a high-level language like C instead of assembly. How times change...
  • by TheRaven64 ( 641858 ) on Wednesday July 31, 2013 @05:24AM (#44432775) Journal
    There was a point made at the 30-year retrospective talk at last year's European Smalltalk conference. If you have two languages, one of which allows developers to be more efficient, then you will end up needing fewer developers for the same amount of work. Unless your entire company uses this language and never experiences mergers, this group of developers will be outnumbered. When you begin a new project or merge two projects, management will typically pick the language that more developers have experience with. If you have a team of 100 C++ programmers and another team of one SilverBulletLang programmer, then it seems obvious that you should pick C++ for your next project, because you have vastly more experience within the company in using C++. The fact that the one SilverBulletLang programmer was more productive doesn't matter. In the real world, languages tend not to be silver bullets, so the productivity difference is more in the range of a factor of two to five - but that's still enough that the less-productive language wins.
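
    The arithmetic, spelled out (a back-of-the-envelope Python sketch; the numbers are illustrative):

        # Headcount swamps per-developer productivity, even at the
        # optimistic end of the 2x-5x range.
        cpp_output = 100 * 1.0           # 100 devs at baseline productivity
        silver_bullet_output = 1 * 5.0   # 1 dev, five times as productive
        print(cpp_output / silver_bullet_output)  # 20.0 -- C++ team out-produces 20 to 1
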
  • by TheRaven64 ( 641858 ) on Wednesday July 31, 2013 @05:38AM (#44432819) Journal
    The designers of Ruby wanted Smalltalk with Perl syntax. I find it amazing that anyone could look at Smalltalk and think 'the one thing this needs to make it better is Perl syntax'. And you can substitute pretty much any language for Smalltalk in that sentence.
