Programming

Remember the Computer Science Past Or Be Condemned To Repeat It? 479

theodp writes "In the movie Groundhog Day, a weatherman finds himself living the same day over and over again. It's a tale to which software-designers-of-a-certain-age can relate. Like Philip Greenspun, who wrote in 1999, 'One of the most painful things in our culture is to watch other people repeat earlier mistakes. We're not fond of Bill Gates, but it still hurts to see Microsoft struggle with problems that IBM solved in the 1960s.' Or Dave Winer, who recently observed, 'We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac.' And then there's Scott Locklin, who argues in a new essay that one of the problems with modern computer technology is that programmers don't learn from the great masters. 'There is such a thing as a Beethoven or Mozart of software design,' Locklin writes. 'Modern programmers seem more familiar with Lady Gaga. It's not just a matter of taste and an appreciation for genius. It's a matter of forgetting important things.' Hey, maybe it's hard to learn from computer history when people don't acknowledge the existence of someone old enough to have lived it, as panelists reportedly did at an event held by Mark Zuckerberg's FWD.us last Friday!"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Bob_Who ( 926234 ) on Tuesday July 30, 2013 @09:07PM (#44430749) Journal

    10 GOTO 20
    20 GOTO 10

  • by cultiv8 ( 1660093 ) on Tuesday July 30, 2013 @09:07PM (#44430753) Homepage
    It's managers and executives who make the decisions, and to them whether it's a browser or mobile app or SaaS or whatever the latest trend is, who cares if you're reinventing the wheel as long as profits are up.
    • by fldsofglry ( 2754803 ) on Tuesday July 30, 2013 @09:21PM (#44430841)
      Also in line with this: I can't imagine that the way patents work actually helps with the problem of reinventing the wheel. You almost have to reinvent the wheel to create a working solution that won't get you sued.
      • by Camael ( 1048726 ) on Wednesday July 31, 2013 @03:29AM (#44432565)

        Actually, from the examples cited, it seems to me to be painfully obvious why in those cases information was not shared.

        One of the most painful things in our culture is to watch other people repeat earlier mistakes. We're not fond of Bill Gates, but it still hurts to see Microsoft struggle with problems that IBM solved in the 1960s.

        For quite a long period of time, IBM and MS were stiff competitors (remember OS/2 Warp?). I doubt MS would inform IBM what they were working on, much less seek its help. In fact, it seems to be the exception rather than the rule for software companies to share code with each other. Selling code, after all, is usually how they make money.

        'We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac.'

        I'm fairly confident that Apple would sue any company that copies its software written for the Mac. Let us also not forget how many problems Oracle caused for Google when they sued over the Java API in Android [wikipedia.org]. Yes, it is efficient to reuse old tried-and-tested code, but it also opens you up to a lawsuit. So it is not so much reinventing the wheel as trying to find a different way of doing things so you won't get sued. For that, you have current IP laws to thank.

        One of the problems with modern computer technology is that programmers don't learn from the great masters. 'There is such a thing as a Beethoven or Mozart of software design,' Locklin writes. 'Modern programmers seem more familiar with Lady Gaga. It's not just a matter of taste and an appreciation for genius. It's a matter of forgetting important things.'

        The problem here is with equating writing software to producing works of art. People are willing to go out of their way to learn and improve themselves to paint better or make beautiful music because it enables them to express themselves. It's emotionally satisfying. OTOH most software is programmed to achieve a certain utility and the programmer is faced with constraints e.g. having to use certain libraries etc. He is rarely able to express himself, and his work is subject to the whims of his bosses. For most everyday programmers, I think there is no real motivation to 'learn from the great masters'.

        An exception might be the traditional hacking/cracking community where the members program for the sheer joy/devilry of it. I understand there is a fair amount of sharing of code/information/knowledge/learning from the great masters within their community.
         

    • Re: (Score:3, Insightful)

      by DaveAtFraud ( 460127 )

      It's managers and executives who make the decisions, and to them whether it's a browser or mobile app or SaaS or whatever the latest trend is, who cares if you're reinventing the wheel as long as profits are up.

      That hasn't changed either. Just the specific subject of the idiocy has changed. Idiotic managers are timeless. Lady Ada probably had the same thing to say about Charles Babbage.

      Cheers,
      Dave

      • by sjames ( 1099 ) on Tuesday July 30, 2013 @10:46PM (#44431327) Homepage Journal

        Actually, it was Babbage who faced such idiocy from Parliament:

        On two occasions I have been asked [by members of Parliament], 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

        And you thought the clogged tubes thing was bad.

        • by wanax ( 46819 ) on Wednesday July 31, 2013 @03:04AM (#44432485)

          I've always felt like that quotation had another interpretation, one that's much more favorable to the MPs:

          If you're an MP, you've probably had to deal with a lot of people asking for money to fund what is essentially snake oil. If you don't understand the underlying 'cutting edge' technology (both plausible and acceptable), one simple test is to ask a question where you KNOW that if the answer is anything other than "No", the person is bullshitting and you can safely ignore them... and as reported, the question is phrased in such a way that it would sorely tempt any huckster to oversell his device. I think Babbage's lack of comprehension was due to his inability to grasp that the MP was questioning HIM, rather than the device.

          • by serviscope_minor ( 664417 ) on Wednesday July 31, 2013 @06:48AM (#44433349) Journal

            one that's much more favorable to the MPs

            Come to England and look at our MPs! You will then probably feel that it wasn't such an unfair interpretation on the part of Babbage.

            Seriously though, there are many people out there (and they tend to be non-technical) who simply do not understand computers. The lack of understanding means that they effectively interpret the actions of computers as magic, in that there is no way for them to reason about what a computer might do. Even pretty smart people fall prey to this.

            The UK has never had a tradition of putting technically minded people into parliament.

    • by TheRaven64 ( 641858 ) on Wednesday July 31, 2013 @04:24AM (#44432775) Journal
      There was a point made at the 30-year retrospective talk at last year's European Smalltalk conference. If you have two languages, one of which allows developers to be more efficient, then you will end up needing fewer developers for the same amount of work. Unless your entire company uses this language and never experiences mergers, this group of developers will be outnumbered. When you begin a new project or merge two projects, management will typically pick the language that more developers have experience with. If you have a team of 100 C++ programmers and another team of one SilverBulletLang programmer, then it seems obvious that you should pick C++ for your next project, because you have vastly more experience within the company in using C++. The fact that the one SilverBulletLang programmer was more productive doesn't matter. In the real world, languages tend not to be silver bullets, so the productivity difference is more in the range of a factor of two to five, but that's still enough that the less-productive language wins.
      • by captainClassLoader ( 240591 ) on Wednesday July 31, 2013 @07:24AM (#44433581) Journal
        What I've seen in the three decades I've been in the industry is that the number of programmers using OldVanillaLang versus SilverBulletLang is less of an issue. Managers are often willing to go with a more resource-efficient solution, given that IT/MIS departments are often considered overhead on the bean counters' spreadsheets. The thing that keeps managers on the OldVanillaLang track is the answer to these questions: "Suppose my SilverBulletLang guys leave: who takes over their code? How do I evaluate SilverBulletLang developers in interviews? And since they're rare, can I afford them?"
    • by Xest ( 935314 ) on Wednesday July 31, 2013 @06:55AM (#44433377)

      That's such a cop-out, and it's not true. Most of the managers making these decisions are technical managers who come from development backgrounds themselves.

      There is a problem at a more fundamental level, even outside of determining what buzzwords to use for a product, and it's prominent even in some of the higher echelons of web society. The most obvious example I'm going to point out is HTML5: it's a braindead spec full of utterly amateur mistakes that could've been avoided if only Ian Hickson had spent five seconds understanding why existing things were the way they were and why that mattered.

      An obvious example is HTML5's semantic tags. Using a study to determine a static set of tags to define semantic capabilities, in a spec that was out of date before the ink had even dried, was just plain stupid. The complaint that we needed more readable markup rather than div soup to make writing HTML easier was naive: firstly because amateurs just don't write HTML anymore, they all publish via Facebook, Wordpress and so forth, and secondly because there's a good reason markup had descended into div soup: genericness is necessary for future-proofing. Divs don't care if their ID is "menu" or their class is "comment"; they're content-neutral, they don't give a fuck what they are, but they'll be whatever you want them to be, which means they're always fit for the future. In contrast, HTML5 tried to replace divs with tags such as aside, header, footer and so forth, which would be great except that when you have a finite number of elements you end up with people arguing about what to do when an element doesn't fit. Do you just go back to using divs for that bit anyway, or do you fudge in one of the new tags because it's kinda-loosely related? That bastardises the semantics in the first place, because we no longer really know what each semantic tag refers to once it's been fudged in where it doesn't make much sense.

      The real solution was to provide a semantic definition language, the ability to apply semantics to classes and IDs externally. Does that concept sound familiar? It should because we had the exact same problem with applying designs to HTML in the past and created CSS. We allowed design to be separate from markup with external stylesheets because this had many benefits, a few obvious ones:

      1) Designers could focus on CSS without getting in the way of those dealing with markup, making development easier

      2) We could provide external stylesheets for no-longer-maintained sites and have the browser apply them, meaning there is a method to retrofit unmaintained sites with new features

      3) Our markup is just markup: it defines document structure, and it does that one thing well without being a complete mess of other concepts.

      Consider that these could have been benefits for building a semantic web too, if HTML5 had been done properly. The fact that Ian Hickson failed to recognise this with HTML5 highlights exactly what the article is talking about. He has completely and utterly failed to learn the lessons before him as to why inline styling was bad, and on a more fundamental level he demonstrates a failure to understand the importance of separation of concerns and the immense benefits it provides, a lesson already learnt the hard way by those who came before him. His solution? Oh, just make HTML5 a "living spec". What? Specs are meant to be static for a reason: so that you can actually become compliant with them and remain compliant with them. Spec compliance, once you've achieved it, shouldn't ever be a moving target. When the target needs to move, that's when you know you need to release a new spec.

      It's a worrying trend because it's not just him; I see it amongst the JavaScript community as they grow in their ambition to build ever bigger software while insisting that JavaScript is all they need to do it. The horrendously ugly fudges they implement to force faux-namespaces into the language, in a desperate attempt to alleviate the fact that JavaScript was never designed for large codebases, are but one example.
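The faux-namespace fudge described above is usually some variant of the "module pattern": an immediately-invoked function expression (IIFE) whose closure stands in for the private scope the language doesn't provide. A minimal sketch, illustrative only and not any particular library's convention:

```javascript
// The classic JavaScript "module pattern": an immediately-invoked
// function expression (IIFE) whose closure hides "private" state.
// The returned object literal serves as a faux namespace.
var MyApp = MyApp || {};

MyApp.counter = (function () {
  var count = 0; // "private": reachable only through the closure

  return {
    increment: function () { count += 1; return count; },
    reset: function () { count = 0; }
  };
})();
```

Nothing actually enforces the namespace: `MyApp` is an ordinary mutable global, which is exactly the kind of fudge complained about above.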

      I see it in frameworks, especially in the PHP world, where there are many not-quite-MVC frameworks. Rather than figure out how to do things properly in the MVC pattern, recognising that sticking as close to the pattern as possible lets people dive straight into your framework, they throw in all these stupid additional concepts that are a mere band-aid for their own ignorance of good software design and sensible use of patterns. I mean, why re-use a common solution to a common problem that was solved long ago, and that everyone else can therefore jump in and understand, when you can come up with your own god-awful fudge that makes no sense to anyone and turns your framework into an unmaintainable mess, such that you end up rewriting the entire framework every few years?

      I'll admit I see the problem almost entirely amongst the web development community. In part I think this is because many web developers are home-grown, more so than in other fields of software development, and so didn't go through the academic rigour of a computer science degree and were brought up on a diet of badly designed languages like PHP and JavaScript. But the problem the article talks about is clearly there. In the web world, even the most basic lessons of a decent computer science or software engineering degree are completely lost on an upsettingly large proportion of those in the field.

      There's a reason why, in this day and age, there are still so many websites trivially exploitable through such long-solved problems as SQL injection attacks: the people who haven't learned the lessons of those who came before them are unfortunately not small in number, and they are out there writing supposedly production-ready software. That problem can't simply be pinned on management.
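For reference, the long-solved fix for SQL injection is parameterization: keep the query text static and pass user input out-of-band. The danger of naive string concatenation can be shown with nothing but strings; no particular database driver is assumed here, and `db.query` below is a hypothetical stand-in:

```javascript
// Naive approach: splice user input directly into the SQL text.
function naiveQuery(username) {
  return "SELECT * FROM users WHERE name = '" + username + "'";
}

// A classic injection payload turns the lookup into "match every row".
var evil = "x' OR '1'='1";
naiveQuery(evil);
// → "SELECT * FROM users WHERE name = 'x' OR '1'='1'"

// The solved approach keeps SQL and data separate. The placeholder
// style (?, $1, :name) varies by driver; db.query here is a stand-in,
// not a real API:
//   db.query("SELECT * FROM users WHERE name = ?", [username]);
```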

  • I saw the Lady Gaga quip and Scott's fondness for effective ancient map-reducey techniques on unusual hardware platforms. It reminded me about things like discovering America. Did the Vikings discover it years before any other Europeans? Certainly. Did the Chinese discover it as well? There's some scholarly thought that maybe they did. But you know whose discovery actually effected change in the world? Lame old Christopher Columbus.

    Perhaps there's a lesson to be learned here from people who want to actuall

    • Eh? Starting 15,000 years ago, various waves of people came here from Asia, and huge and important civilizations have risen and fallen in the Americas since then. Some of those people are still around, and their influence on art, food, and medicine continues into our culture. One group of those Asians was absolutely crucial to the United States winning its independence and also had influence on our Constitution. Talk about effecting change in the world; and they're still around, by the way.

    • by symbolset ( 646467 ) * on Tuesday July 30, 2013 @09:39PM (#44430963) Journal

      Lady Gaga is mentioned because she is both a classically trained artist and sui-generis of successful PopTart art through self-exploitation. Yes, the reference is recursive - as this sort of folk are prone to be. They can also be rude, if you bother to click through, as they give not one shit about propriety - they respect skill and art and nothing else.

      When I plussed this one on the Firehose I knew most of us weren't going to "get it" and that's OK. Once in a while we need an article that's for the outliers on the curve to maintain the site's "geek cred". This is one of those. Don't let it bother you. Most people aren't going to understand it. Actually, if you can begin to grasp why it's important to understand this you're at least three sigmas from the mean.

      Since you don't understand why it's important, I wouldn't click through to the article and attempt to participate in the discussion with these giants of technology. It would be bad for your self-esteem.

      For the audience though, these are the folk that made this stuff and if you appreciate the gifts of the IT art here is where you can duck in and say "thanks."

    • by __aaltlg1547 ( 2541114 ) on Tuesday July 30, 2013 @10:46PM (#44431329)
      Yeah, but too many of today's programmers think they discovered America themselves.
  • by zmughal ( 1343549 ) on Tuesday July 30, 2013 @09:10PM (#44430767) Homepage
    John Graham-Cumming gave a talk at OSCON 2013 titled "Turing's Curse [youtube.com]" that speaks to this same idea. Worth a watch.
    • I find it curious that he didn't mention this [chris-granger.com] or this [vimeo.com] at the end, given that they're both about a year old and both flirt with death and/or the halting problem in order to offer better debugging features.
  • by Anonymous Coward on Tuesday July 30, 2013 @09:12PM (#44430783)

    It's pretty damn obvious why this is: as an industry, we no longer shun those who should definitely be shunned.

    Just look at all of the damn fedora-wearing Ruby on Rails hipster freaks we deal with these days. Whoa, you're 19, you dropped out of college, but you can throw together some HTML and some shitty Ruby and now you consider yourself an "engineer". That's bullshit, son. That's utter bullshit. These kids don't have a clue what they're doing.

    In the 1970s and 1980s, when a lot of us got started in industry, a fool like that would've been filtered out long before he could even get a face-to-face interview with anyone at any software company. While there were indeed a lot of weird fuckers in industry back then, especially here in SV, they at least had some skill to offset their oddness. The Ruby youth of today have none of that. They're abnormal, yet they're also without any ability to do software development correctly.

    Yeah, these Ruby youngsters should get the hell off all of our lawns. There's not even good money in fixing up the crap they've produced. They fuck up so badly and produce so much utter shit that the companies that hired them go under rather than trying to even fix it!

    The moral of the story is to deal with skilled veteran software developers, or at least deal with college graduates who at least have some knowledge and potential to do things properly. And the Ruby on Rails idiots? Let's shun them as hard as we can. They have no place in our industry.

    • I think you were trolling, but there's a point under there. In the 70s you had to have a clue to get anything done. As more infrastructure and support system has been built, in the interest of not having to reinvent the wheel every project, you *can* have people produce things - or appear to produce things - while remaining clueless. Flash and sizzle have been replacing the steak.
      • I think you were trolling, but there's a point under there. In the 70s you had to have a clue to get anything done. As more infrastructure and support system has been built, in the interest of not having to reinvent the wheel every project, you *can* have people produce things - or appear to produce things - while remaining clueless. Flash and sizzle have been replacing the steak.

        Now it's html5 and sizzle.

    • by raymorris ( 2726007 ) on Tuesday July 30, 2013 @10:14PM (#44431163) Journal
      Indeed. Half of today's programmers have roughly zero engineering education, and want to be called software engineers. They have no idea, no idea at all, what their data structures look like in memory and why they are so damn slow. Heck "data structure" is an unfamiliar term to many.

      It's not entirely young vs old, either. I'm in my 30s. I work with people in their 50s who make GOOD money as programmers, but can't describe how the systems they are responsible for actually work.

      How do we fix it? If you want to be good, studying the old work of the masters like Knuth is helpful, of course. Most helpful, I think, is to become familiar with languages at different levels. Get a little bit familiar with C. Not C# or C++, but C. It will make you a better programmer in any language. Also get familiar with high level. You truly appreciate object oriented code when you do GUI programming in a good Microsoft language. Then, take a peek at Perl's objects to see how the high level objects are implemented with simple low level tricks. Perl is perfect for understanding what an object really is, under the covers. Maybe play with microcontrollers for a few hours. At that point, you'll have the breadth of knowledge that you could implement high level entities like objects in low level C. You'll have UNDERSTANDING, not just rote repetition.
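The "what an object really is" point can be made in a few lines: an object is just a record plus free functions that take the record as an explicit first argument, which is essentially what Perl's bless (and Python's self) dress up in nicer syntax. A toy sketch in JavaScript, illustrative only:

```javascript
// No classes, no prototypes: an "object" is a plain record, and
// "methods" are ordinary functions taking the record explicitly.
function makeAccount(owner) {
  return { owner: owner, balance: 0 }; // the bare data structure
}

function deposit(account, amount) {
  account.balance += amount; // "this" is just a normal argument
  return account.balance;
}

var acct = makeAccount("ada");
deposit(acct, 50);
deposit(acct, 25);
// acct.balance is now 75; method-call syntax like acct.deposit(25)
// is only sugar for passing the record as a hidden first argument.
```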

      * none of this is intended to imply that I'm any kind of expert. Hundreds, possibly thousands of people are better programmers than I. On the other hand, tens of thousands could learn something from the approach I described.
    • To be honest, I sort of softened on Ruby on Rails after being forced to endure a project on it and, much to my teeth-grinding resentment, actually found it a decent and productive environment (although I'd say Django more so, because of its relative lack of magic, and hey, who doesn't enjoy screwing around with Python?).

      Now don't get me started on javascript on the server and NoSQL systems. Somewhere between "lets call ourselves amazing because we got a god damn web browser script environment to implement a pa

  • Paging Linus (Score:3, Insightful)

    by TubeSteak ( 669689 ) on Tuesday July 30, 2013 @09:14PM (#44430803) Journal

    http://scottlocklin.wordpress.com/2013/07/28/ruins-of-forgotten-empires-apl-languages/#comment-6301 [wordpress.com]

    Computer science worked better historically in part because humorless totalitarian nincompoopery hadn't been invented yet. People were more concerned with solving actual problems than paying attention to idiots who feel a need to police productive people's language for feminist ideological correctness.

    You may now go fuck yourself with a carrot scraper in whatever gender-free orifice you have available. Use a for loop while you're at it.

  • by michaelmalak ( 91262 ) <michael@michaelmalak.com> on Tuesday July 30, 2013 @09:19PM (#44430833) Homepage

    That's genius: comparing a "$100k/CPU" non-distributed database to a free distributed database. Also no mention that, yes, everyone hates Hive, and that's why there are a dozen replacements coming out this year promising 100x speedup, also all free.

    And on programming languages, Locklin is condescending, speaking from his high-and-mighty functional-programming mountain, and makes no mention of the detour the industry had to take into object-oriented programming to handle and organize the exploding size of software programs before combined functional/object languages could make a resurgence. He also neglects to mention Python, which has been popular and mainstream since the late '90s.

  • by fermion ( 181285 ) on Tuesday July 30, 2013 @09:19PM (#44430835) Homepage Journal
    One of the things that I still see is the idea that when a problem exists, you throw more people at it. The Mythical Man-Month pretty much threw that to the wind for software development, and I am sure there are a whole slew of books that predate it saying essentially the same thing. Yes, advancements do mean that more people can communicate more directly, but there still is a limit, and I do not think it is as great as some believe. Define interfaces, define tests that ensure those interfaces exhibit high fidelity, and let small teams, even a single person, solve a small problem. What technological advance has done is make clock cycles very cheap, so there is less excuse to go digging around trying to change code just to make it run a little faster.

    Speaking of interfaces, we know that when data and processes are not highly encapsulated, it is nearly impossible to create a bug-free large project. One thing that object-oriented programming has done is create a structure where data and processes can be hidden, so they can be changed as needed without damaging the overall software application. Now, many complain because the data is not really hidden; it is just a formality. But really, coding is just a formality, and a professional is mostly one who knows how to respect that formality to generate the most manageable and defect-free code possible. One thing that has been lost with the generation of rapid development systems quickly spouting out bad code is that code, and the ability to tweak it, is the basis of what we do.
    • With too much auto-testing, people can just code to pass the tests; even if someone looking at the code would mark it a fail, it still passes what the automated system thinks is good.

    • For any given software project there is an optimal team size. If the project is small enough, you can keep the team size down to what works with an agile development methodology. If the project is bigger than that, things get ugly. I started my career in a company that considered projects of 50 to 100 man-years to be small to medium sized. Big projects involved over a thousand man-years of effort and the projects were still completed in a few years calendar time. You can do the math as to what that mea

  • by neorush ( 1103917 ) on Tuesday July 30, 2013 @09:24PM (#44430855) Homepage

    We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac

    I don't remember that code running cross-platform on varying architectures. The web as a platform for distribution should not be compared to an actual OS... that doesn't even make sense.

    • by DutchUncle ( 826473 ) on Tuesday July 30, 2013 @09:41PM (#44430975)

      I don't remember that code running cross platform on varying architectures.

      Yes. No code runs cross-platform on varying architectures, INCLUDING the stuff that supposedly does, like Java and JavaScript and all of the web-distributed stuff. All of it DEPENDS on an interpretation layer that, at some point, has to connect to the native environment.

      Which is what BASIC was all about. And FORTRAN. Expressing the algorithm in a slightly more abstract form that could be compiled to the native environment, and then, in the case of BASIC, turned into interpreted code. (Oh, you thought Java invented the virtual machine?)

  • There are a lot of things where, if source code were available, other people could build on it and make higher-quality products. In the absence of source code, people have to start from scratch, often rebuilding the wheel.

    Competition for money might get people to strive to make better pieces of art. But on the flip side, this same competition will sue your pants off for any reason they can find so you don't compete with them either.

    And on an unrelated note, I had an idea today for a zombie video game like Groundhog Day. When you die, it restarts at the beginning of a zombie pandemic. As you die and play through it over and over, you learn secrets about where weapons and supplies are. You find tricks you can use to survive and save people. Eventually you find out who caused the zombie pandemic. You can then kill him before he goes through with it. I'm not sure an ending where you serve time in prison is a good ending, though. I didn't think it the whole way through, but it sounded like a good premise for a zombie game.
    • There are a lot of things where, if source code were available, other people could build on it and make higher-quality products. In the absence of source code, people have to start from scratch, often rebuilding the wheel.

      That doesn't seem true for the most part.

      All open source does with regard to code reuse is that it makes it painfully obvious how much redundancy there is. The spat between the different Linux display managers is one recent example, but I'm sure you can think of many others.

      As for why this is

  • In Browser (Score:5, Insightful)

    by MacDork ( 560499 ) on Tuesday July 30, 2013 @09:26PM (#44430871) Journal

    We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac.

    Did the Mac, 25 years ago, allow people to load code from a remote server and execute it locally in a sandbox and in a platform independent manner all in a matter of a couple of seconds? No. No it did not.

    We should then pay homage to the Mac 25 years ago, when it basically did what Doug Engelbart demonstrated 45 years ago. [youtube.com] Nice logic you have there.

    • 25 years ago you couldn't transmit the data in a matter of seconds. You *could* execute BASIC bytecode, though. Dynamic link libraries were invented for Multics in the 1960s. IBM assembler macros in the '70s could do more than a C++ template function. (OTOH, IBM deliberately crippled the small-computer world by choosing an overlapped 24-bit address space instead of a 32-bit linear one (on the Motorola chips) because their mainframes were still linear 24-bit.)
    • Re:In Browser (Score:4, Informative)

      by cold fjord ( 826450 ) on Tuesday July 30, 2013 @10:56PM (#44431365)

      Did the Mac, 25 years ago, allow people to load code from a remote server and execute it locally in a sandbox and in a platform independent manner all in a matter of a couple of seconds? No. No it did not.

      Depends on how much leeway you are willing to grant. Around 1990 or so, the Mac could run SoftPC, a virtual-machine x86 emulator running DOS or Windows. The Mac could certainly network, and it had file servers. So you should in fact have been able to download code from a file server and run it in the virtual machine, which from a Mac perspective would effectively be a sandbox. Although the PC DOS/Windows platform isn't "platform independent," it was nearly universal (minus Mac-only systems*) at the time.

      * Yes, yes: Amiga, Apple II, Atari, et al.

    • by grcumb ( 781340 ) on Wednesday July 31, 2013 @12:32AM (#44431835) Homepage Journal

      We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac.

      Did the Mac, 25 years ago, allow people to load code from a remote server and execute it locally in a sandbox and in a platform independent manner all in a matter of a couple of seconds? No. No it did not.

      We should then pay homage to the Mac 25 years ago, when it basically did what Doug Engelbart demonstrated 45 years ago. [youtube.com] Nice logic you have there.

      Dude, just ignore this guy. Of all people who have the right to indulge in a good, old-fashioned 'get off my lawn' rant, Dave Winer ranks last. This is the man who, for our sins, gave us XMLRPC and SOAP, paving the way for the re-invention of... well, everything, in a web browser.

      Port 80 died for this man's sins....

  • The third link is mainly praise of APL, the programming language. Talk about odd.

    It would be great if he'd actually given examples of why APL is a good language. I would be interested in that. Instead he says mmap is really interesting, which actually doesn't have anything to do with programming languages.

    He says that old programmers have left a lot of examples of good source code. It would be great if he'd actually linked to their code.......
  • by Russ1642 ( 1087959 ) on Tuesday July 30, 2013 @09:41PM (#44430979)

    He says system performance is the same as it was way back then. He thinks that stuff just happened immediately on those systems because they were running very efficient code. So what. Here's a simple test. Go get one of those computers and set it next to yours. Turn them both on. Mine would be at a desktop before the old one even thinks about getting down to actually running the operating system. Or start a program. On a current system it loads now. As in, right now. Back then it was a waiting game. Everything was a waiting game. He must have simply forgotten or repressed those memories.

    • by siride ( 974284 ) on Tuesday July 30, 2013 @10:23PM (#44431215)

      Also those old programs did a lot less than many of our new programs. People often forget that when complaining about performance.

      That's not to say, of course, that modern programs couldn't be written more efficiently. Because of Moore's Law and other considerations, we have moved away from spending a lot of time on performance and efficiency.

    • Not sure what you're talking about. My current desktop boots in around 30 seconds. My Commodore 64 from 1982 was blinking a READY prompt in about 2 seconds.

      Of course modern computers have faster CPUs and everything else, but I'd really like to know where along the line a 30 second boot time became acceptable......
    • by AK Marc ( 707885 ) on Tuesday July 30, 2013 @11:46PM (#44431613)
      What past were you from? When I had DOS 3.3 running on my XT (on a hard drive), it booted in a few seconds after POST. When I loaded Windows 3.1 (no network at home at the time, so didn't run W3.11) on an XT with 1M RAM, it would take forever. And DOS 3.3 from floppy was slow, and loud. But 3.3 from HD on an ancient XT was much faster than Windows is today. DOS programs loaded fast, granted I was running 300k programs, not 300 MB programs, but they were still fast on DOS 3.3 back in the day. What were you running on your ancient computer?
    • Re: (Score:3, Informative)

      by Darinbob ( 1142669 )

      However, compare Word from 1990 to Word from today. The 1990 one will start nearly instantly, be incredibly responsive, and have all the features most people use anyway.

    • I had an Atari ST at college. It booted to a graphical desktop, no less, pretty much instantly: a few seconds if you had a slew of SCSI peripherals (especially a CD-ROM drive), but otherwise about half a second.

      It was ready to go, too. None of this crap of *showing* the desktop and then spinning the busy cursor for another 30 secs...

      Simon.

  • When people don't learn from those who have made mistakes, or who have real workplace experience (not just years of academic experience), it's easy to end up making mistakes that only seem like good ideas in theory.

    It's also like some of the certification material: the book says one thing, but in the real workplace that approach doesn't work.

  • by AHuxley ( 892839 ) on Tuesday July 30, 2013 @09:43PM (#44430993) Journal
    The best and brightest at Apple, MS, BeOS, Linux did learn from "the great masters" - thank you to all of them.
    They faced the limits of the data on a floppy and cd.
    They had to think of updates over dial up, isdn, adsl.
    Their art had to look amazing and be responsive on first gen cpu/gpu's.
    They had to work around the quality and quantity of consumer RAM.
    They were stuck with early sound output.
    You got a generation of GUIs that worked, file systems that looked after your data, and over time better graphics and sound.
    You got a generation of programming options that let you shape your 3D car on screen rather than write your own GUI and then think about the basics of 3D art for every project.
    They also got the internet working at home.
    • by Darinbob ( 1142669 ) on Wednesday July 31, 2013 @12:21AM (#44431775)

      Some people seem to think this article is about going back to the past. They miss the entire point. We're not saying that older programs were better, or that older computers were better, or that we should roll back the clock. We're saying that they had to pay more attention to what they were doing, they had to learn more and be broad based, they had to learn on their own, and so forth. When they had good ideas they were shared, they were not continually being reinvented and presented as something new. They didn't rely on certification programs.

  • by Gim Tom ( 716904 ) on Tuesday July 30, 2013 @09:44PM (#44431005)
    As a 66 year old life long geek I actually saw many of the things I worked with decades ago reinvented numerous times under a variety of names, but there is one thing I used extensively on IBM OS/360 that I have never seen in the PC world: the Generation Data Set, and by extension the Generation Data Group. They were a mainstay of mainframe computing on that platform the entire time I worked on it.

    When I moved on to Unix and networks in the last few decades of my career I looked for something similar, and never found anything quite as simple and elegant (in the engineering sense of the word) as the Generation Data Set. Oh, you can build the same functionality into any program, but this was built into the OS and used extensively. If anyone has seen a similar feature in Unix or Linux I would love to know about it.
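    For readers who never met it: a Generation Data Group is, roughly, an OS-catalog-level rolling family of dataset versions. Here is a toy sketch of the idea in Python, purely illustrative - OS/360 implemented this in the system catalog and JCL, not as application code, and the names and limit below are made up:

```python
class GenerationDataGroup:
    """Toy model of an OS/360 Generation Data Group (GDG): a rolling
    window of cataloged generations of a dataset, addressed relatively."""

    def __init__(self, base, limit=5):
        self.base = base          # GDG base name, e.g. "PAYROLL"
        self.limit = limit        # how many generations the catalog keeps
        self.next_gen = 1         # absolute generation numbers are never reused
        self.generations = []     # dataset names, oldest first

    def new_generation(self):
        # Catalog a new generation: PAYROLL.G0001V00, PAYROLL.G0002V00, ...
        name = f"{self.base}.G{self.next_gen:04d}V00"
        self.next_gen += 1
        self.generations.append(name)
        if len(self.generations) > self.limit:
            self.generations.pop(0)   # the oldest generation rolls off
        return name

    def resolve(self, rel=0):
        # Relative reference: 0 is the current generation, -1 the previous,
        # mirroring JCL's PAYROLL(0), PAYROLL(-1), ...
        return self.generations[rel - 1]


gdg = GenerationDataGroup("PAYROLL", limit=3)
for _ in range(4):
    gdg.new_generation()
print(gdg.resolve(0))    # PAYROLL.G0004V00 (current)
print(gdg.resolve(-1))   # PAYROLL.G0003V00 (previous)
```

    The point of the original feature was exactly this: versioning and rolloff were handled by the OS for every dataset, not reimplemented per program.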
  • ... or be doomed to repeat it. And they have been for 20 years, every year. Strategy game development in particular seriously needs a persistent collective consciousness.

  • by DutchUncle ( 826473 ) on Tuesday July 30, 2013 @09:52PM (#44431061)
    ... which really means the late '60s into the '70s. Isaac Newton said that he saw far because he stood on the shoulders of giants. Bill Gates and Steve Jobs were *proud* of knowing nothing about the industry they were trying to overturn. The same free, open, do-your-own-thing attitude (partly based on the new abundance helped along by technological advancement) that permitted startups to overtake established manufacturers, also encouraged tossing out anything "established" as "outdated" whether it was useful or not.
  • by Flwyd ( 607088 ) on Tuesday July 30, 2013 @10:08PM (#44431139) Homepage

    You can learn a lot from Mozart because you can read all the notes he published.
    You can listen to many interpretations of his works by different people.
    We don't have the chance to read through 25-year-old Mac symphonies^W programs.
    We aren't even writing for the same instruments.

  • Chariots were masterpieces of art. They were often made of precious metals and had elegant design work. They were environmentally friendly, using no fossil fuels whatsoever. They didn't cause noise pollution, or kill dozens of people when they crashed.

    Aircraft makers should learn from the past. They have totally functional designs, no semblance of artistry anywhere. Accommodations are cramped, passengers treated like cattle.

    We should go back to the good old days, things were so much better back then.

    No

  • by Coditor ( 2849497 ) on Tuesday July 30, 2013 @10:18PM (#44431185)
    I'm old enough at 55 to remember the past, and yes, I did love APL briefly, but lamenting that the present isn't like the past is like wishing it were 1850 again so you could have slaves do all your work. Neither the web nor the modern mobile application is anything like the past, and what we use to write code today is nothing like what I started with. Trying to relive the past is why old programmers get a reputation for being out of touch. The past is important in that I learned a lot then that still rings true today, but I can say that about every year since I started. Today is a new day, every day.
  • Libraries should be archiving (and date-stamping) code. When copyright expires, that code can form public domain building blocks for a lot of cool stuff.

    The kids of the future won't have to reinvent the wheel, they'll be able to improve it.

    Software patents suck.

  • Au Contraire! (Score:4, Interesting)

    by VortexCortex ( 1117377 ) <VortexCortex AT ... trograde DOT com> on Tuesday July 30, 2013 @10:58PM (#44431379)

    For instance: As a cyberneticist I'm fond of programs that output themselves; it's the key component of a self-hosting compiler... Such systems have a fundamental self-describing mechanism, much like DNA and all other "life". While we programmers continue to add layers of indirection and obfuscation (homomorphic encryption) and segmented computing (client / server), some of us are exploring the essential nature of creation that creates the similarities between such systems -- while you gloat over some clever system architecture, some of us are discovering the universal truths of design itself.
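    For anyone who hasn't seen one, the minimal form of a "program that outputs itself" - a quine, the kernel of the self-description being discussed here - fits in two lines of Python (this is the standard textbook construction, not anything specific to the parent's work):

```python
s = 's = %r\nprint(s %% s)'
print(s % s)
```

    Running it prints exactly those two lines: the string holds a template of the whole program, and %r makes the string quote itself when it is substituted back in.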

    To those who may think Computer Science is a field whose past must be studied lest it be repeated, I would argue that there is no division in any field and that you haven't figured out two key things:
    0. Such iteration is part of the cybernetic system of self improvement inherent in all living things -- to cease is death, extinction.
    1. Nothing in Computer Science will truly be "solved" until a self improving self hosting computing environment is created...

    So, while you look back and see the pains of Microsoft trying to implement POSIX poorly, I've studied the very nature of what POSIX tried and only partially succeeded to describe. While you chuckle at the misfortunes of programmers on the bleeding edge who are reinventing every wheel in each new language, I look deeper and understand why they must do so. While you look to the "great minds" of the past, I look at them as largely ignorant figures of self-importance who thought they were truly masters of something, but ultimately did not grasp what they claimed to understand at a fundamental level -- the way a Quantum Physicist might acknowledge pioneers in early Atomic thinking... important, but not even remotely aware of what they were truly doing.

    How foolishly arrogant you puny minded apes are...

  • I think "learning from the old masters" really isn't the problem. It's not that we don't have lots of smart people writing software. The core problem is that we haven't figured out how to do upgrades and backward compatibility properly, and the old masters never figured that out either. You can go and develop an HTML replacement that is better and faster, sure, but now try to deploy it. Not only do you have to update billions of devices, you also have to update millions of servers. Good luck with that. It's basically impossible, and that's why nobody is even trying.

    In a way, HTML/Javascript is actually the first real attempt at solving that issue. As messed up as it might be in itself, deploying an HTML app to a billion people is actually completely doable; it's not even very hard: you just put it on your webserver and send people a link. Not only is it easy, it's also reasonably secure. Classic desktop software management never got anywhere near that ease of deployment.

    If software is to improve in the long run, we have to figure out how to make it not take 10 years to add a new function to the C++ standard. So far we simply haven't. The need for backward compatibility and the slowness of deploying new software slow everything to a crawl.

  • by brillow ( 917507 ) on Wednesday July 31, 2013 @12:11AM (#44431735)

    You don't like Gates but wish programmers looked towards more "Great Masters?" Bill Gates was a Great Master Programmer.

  • by Animats ( 122034 ) on Wednesday July 31, 2013 @12:33AM (#44431845) Homepage

    There are a few problems which keep being rediscovered. In many cases, the "new" solution is worse than the original one.

    • Flattened representations of trees: Fairly often, you want to ship a flattened representation of a tree around. LISP had a text representation of S-expressions for that. XML managed to make a mess of the problem by viewing it as "markup". JSON is essentially S-expressions again.
    • Concurrency primitives: This goes back to Dijkstra, who got the basic primitives right. We had to suffer through decades of bad UNIX/POSIX/Linux locking primitives. The Go language touts as its big advantage the rediscovery of bounded buffers.
    • Virtualization: IBM had this in 1967. IBM mainframes got it right - you can run VM on VM on VM... x86 virtualization can't quite create the illusion of a bare machine.
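    A quick illustration of the trees point: the same small expression tree rendered as an S-expression and as JSON (a Python sketch, purely illustrative - the tree and helper are made up for the example):

```python
import json

# A small expression tree, (+ 1 (* 2 3)), as nested Python lists --
# structurally the same thing an S-expression encodes.
tree = ["+", 1, ["*", 2, 3]]

def to_sexpr(node):
    """Render the nested-list tree as a LISP-style S-expression string."""
    if isinstance(node, list):
        return "(" + " ".join(to_sexpr(n) for n in node) + ")"
    return str(node)

print(to_sexpr(tree))     # (+ 1 (* 2 3))
print(json.dumps(tree))   # ["+", 1, ["*", 2, 3]]

# Round-tripping through JSON preserves the tree exactly.
assert json.loads(json.dumps(tree)) == tree
```

    Same tree, two surface syntaxes roughly 40 years apart.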
