Remember the Computer Science Past Or Be Condemned To Repeat It?
theodp writes "In the movie Groundhog Day, a weatherman finds himself living the same day over and over again. It's a tale to which software-designers-of-a-certain-age can relate. Like Philip Greenspun, who wrote in 1999, 'One of the most painful things in our culture is to watch other people repeat earlier mistakes. We're not fond of Bill Gates, but it still hurts to see Microsoft struggle with problems that IBM solved in the 1960s.' Or Dave Winer, who recently observed, 'We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac.' And then there's Scott Locklin, who argues in a new essay that one of the problems with modern computer technology is that programmers don't learn from the great masters. 'There is such a thing as a Beethoven or Mozart of software design,' Locklin writes. 'Modern programmers seem more familiar with Lady Gaga. It's not just a matter of taste and an appreciation for genius. It's a matter of forgetting important things.' Hey, maybe it's hard to learn from computer history when people don't acknowledge the existence of someone old enough to have lived it, as panelists reportedly did at an event held by Mark Zuckerberg's FWD.us last Friday!"
Back to BASIC (Score:3, Funny)
10 GOTO 20
20 GOTO 10
Re:Back to BASIC (Score:5, Informative)
Re:Back to BASIC (Score:5, Funny)
Lisp never 'comes back'. It merely recurses.
Re:Back to BASIC (Score:4, Insightful)
Lisp is easy to get good programs from. You just have to stop thinking in non-Lisp ways. What confuses people is the functional orientation, but if you don't understand functional style of programming then a lot of modern stuff will pass you by. Procedural stuff is very easy in lisp too.
Re: (Score:3)
Lisp is easy to get good programs from.
Then name some "good programs" written in Lisp. I have worked with thousands of programs written in C. Plenty in C++, Java, Perl, Python, and even a few in Ruby. But other than Emacs scripts, I have never come across Lisp in a non-academic program.
Re:Back to BASIC (Score:5, Informative)
Macsyma? Emacs itself is more Lisp than C. Zork was written in a Lisp dialect. Mirai is Lisp, and was used to do animation in The Lord of the Rings. Lots of expert systems. Several CAD systems and other modelling programs. Data analysis stuff. Whoever uses Clojure is using Lisp, and it seems to have some traction. And Orbitz is apparently using a lot of Lisp internally (just to throw out a web site, since some people think it's not real if it's not a web site or PC app).
Re:Back to BASIC (Score:4, Insightful)
But there's a famously quoted statement by Guy Steele, who wrote some of the Lisp language specs and Java language specs. "we were not out to win over the Lisp programmers; we were after the C++ programmers. We managed to drag a lot of them about halfway to Lisp." ( http://people.csail.mit.edu/gregs/ll1-discuss-archive-html/msg04045.html [mit.edu] )
Going from C to C++ is easy, I did that. Going from C++ to Java is easy. I did that too. Going from Java to Lisp is damn difficult, at least for me. But the fact that teaching mainstream C, C++, and Java developers Lisp is difficult merely makes it unlikely Lisp will be popular. It does not prove Lisp is a poor language.
Re:Back to BASIC (Score:4, Informative)
What confuses people is the functional orientation
Whether or not a language is functional is a matter of degree (just as whether or not a language is dynamic is a matter of degree).
Out-of-the-box Lisp is not nearly as much a functional language as *ML or Haskell. There is no pattern matching for example. Of course you can turn Lisp into a functional language, which is what Qi [wikipedia.org] is. In true Lisp fashion it's incredibly clever - 10k lines of CL and you've got a functional language that is arguably even more of a functional language than Haskell (not sure about functional specific optimizations though). And also in true Lisp fashion, there's already a (not fully compatible) fork/successor called Shen [wikipedia.org] (by the same guy who created Qi!).
In other words it's an ongoing experiment, rather than something you can rely on. It grew out of the L21 project, which was supposed to be about Lisp for the 21st century. The lesson is that Lisp for the 21st century is just like Lisp for the 20th century - incredibly clever and powerful but not stable or standardized enough to rely on. Of course the great exception to that is Common Lisp - a byzantine composite of many pre-1985 dialects, warts and all, that hasn't really been updated in 27 years.
Re: (Score:3, Insightful)
Re: (Score:3)
you think C++ is better than LISP? Seriously?
It's absurd to talk about which is better except with respect to a certain type of application. For example, deeply embedded processors running on an RTOS (not embedded Linux or something), or even bare metal, with memory limitations and hard real-time constraints measured in tens of microseconds is not the best environment for Lisp (or any GC language for that matter).
Re:Back to BASIC (Score:4, Insightful)
For example, deeply embedded processors running on an RTOS (not embedded Linux or something), or even bare metal, with memory limitations and hard real-time constraints measured in tens of microseconds is not the best environment for Lisp (or any GC language for that matter).
It's the best environment for Forth, and conversely, Forth is the best language for that environment.
Re: (Score:3)
It works fine if you're judicious about what C++ features to use, and when and where to use them. As much of a pig as C++ is, one thing Stroustrup got right was making it a true multi-paradigm language, and he stuck to the principle of not dragging in any baggage or overhead that you don't explicitly ask for.
Re:Back to BASIC (Score:4, Informative)
It can be done in C++, but why go to the extra effort?
What extra effort is that? Calling your files *.cpp instead of *.c? Ok, that is an extra two letters per file name.
The why is so that you can take advantage of C++ features. Templates, for example, are a great way to write very fast code, and if you know what you're doing you don't get the dreaded bloat. Object-based programming is a nice way to encapsulate things and adds zero overhead. True OO can be used judiciously in the non-speed-critical parts (often a clean way to have a single image handle several minor hardware variants). A combination of object-based code and operator overloading can be a clean way to handle the semantics of fixed-point DSP, which don't map nicely to most languages.
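To make that last point concrete, here is a minimal sketch of a Q15 fixed-point type built with operator overloading. It is illustrative only; the Q-format, the saturation policy and all the names are assumptions, not anything from the thread.

#include <cstdint>

// Clamp a 32-bit intermediate result into the 16-bit range. Saturating
// arithmetic is typical DSP behaviour; the exact policy is an assumption.
static inline int16_t sat16(int32_t x) {
    if (x >  32767) return  32767;
    if (x < -32768) return -32768;
    return static_cast<int16_t>(x);
}

// Minimal Q15 fixed-point type: 1 sign bit, 15 fractional bits.
struct q15 {
    int16_t raw;

    static q15 from_float(float f) {
        return q15{ sat16(static_cast<int32_t>(f * 32768.0f)) };
    }
    float to_float() const { return raw / 32768.0f; }
};

// Operator overloading keeps the DSP code readable while compiling down
// to the same shifts and adds you would write by hand in C.
inline q15 operator+(q15 a, q15 b) {
    return q15{ sat16(int32_t(a.raw) + int32_t(b.raw)) };
}
inline q15 operator*(q15 a, q15 b) {
    return q15{ sat16((int32_t(a.raw) * int32_t(b.raw)) >> 15) };
}

// A multiply-accumulate then reads like the maths, not like a pile of macros.
inline q15 mac(q15 a, q15 x, q15 b) { return a * x + b; }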
C++ is not designed for efficiency.
Read Stroustrup. As I already said, one of the key design principles was not to add overhead unless you explicitly ask for it. It was designed to be as efficient as you need it to be.
If you don't care quite so much about efficiency, use LISP
1. GC and HRT (hard real-time)? They don't go together.
2. Even where you can write Lisp to be as efficient as C/C++ for low-level operations, all you're doing is writing C/C++ in Lisp. What's the point?
3. How many Lisps, Schemes, whatevers have you seen for cross-development to DSPs and other architectures that are usually only embedded, that have good optimizers (a la SBCL, for example), and that can run without an OS?
Re:Back to BASIC (Score:5, Insightful)
it gives me somewhat of a sigh of relief to see C, C++ and Java/C# be stable in the face of this recurring tide of fad languages
Sheesh, kids today. I remember when C++, Java and C# were the fad languages. I even remember when C outside of the Unix world was a "fad" replacing Fortran, Basic and Pascal. The "classic" languages you grew up with are not the end of programming language evolution.
OTOH I admit that the so-called Cambrian explosion of languages really needs to be followed by a mass extinction. Perl, Python, Ruby, Lua, etc., etc., etc. You could spend the rest of eternity debating their pros and cons, but do we really need all of them? It's great if you want to spend the rest of your life learning yet another genius's "best of" mix of existing language ideas, but it sucks if you just want to get work done. Then there's Clojure, because what the Lisp world really needs is yet another dialect, and F# because, uh, well OCaml has been around a while and we really want yet another variant, and ...
Re:Back to BASIC (Score:5, Interesting)
Re: (Score:3)
So pick a language you're good at and get work done. The existence of other languages doesn't prevent that. The one true language is never going to happen.
Re: (Score:3)
Re: (Score:2)
Your fork bomb fizzles due to running out of stack. Try this:
#include <unistd.h>

int main(void) {
    while (1)
        fork();    /* each child keeps forking too -- don't actually run this */
}
Re: (Score:3)
Your fork bomb fizzles due to running out of stack.
Any C compiler worth its salt will recognize the tail-call recursion and rewrite/optimize that code into a loop... so the bomb will go off after all as long as you compile with -O3. ;)
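For context, the recursive bomb being discussed isn't quoted in this thread; a typical form is reconstructed below as an assumption, not the original poster's code. Because the self-call is in tail position, GCC or Clang at -O2 and above can rewrite it as a jump, so the stack never grows - which is the point being made. Don't actually run it.

#include <unistd.h>

/* Hypothetical reconstruction of a recursive fork bomb.            */
/* The recursive call is in tail position, so an optimizing         */
/* compiler may turn it into a loop: no stack growth, no "fizzle".  */
static void bomb(void) {
    fork();
    bomb();
}

int main(void) {
    bomb();
    return 0;
}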
Re: (Score:3)
Things really started going downhill when BASIC came out.
BASIC came out in 1964.
Re: Back to BASIC (Score:5, Funny)
It's not the programmers making the decisions (Score:5, Insightful)
Re:It's not the programmers making the decisions (Score:5, Insightful)
Parent post speaks truth (Score:5, Interesting)
Actually, from the examples cited, it seems to me painfully obvious why information was not shared in those cases.
For quite a long period of time, IBM and MS were stiff competitors (remember OS/2 warp?). I doubt MS would inform IBM what they were working on, much less seek help. In fact, it seems to be the exception rather than the rule for software companies to share code with each other. Selling code, after all, is usually how they make money.
I'm fairly confident that Apple would sue any company that copies its software written for the Mac. Let us also not forget how much trouble Oracle caused for Google when they sued over the Java API in Android [wikipedia.org]. Yes, it is efficient to reuse old tried-and-tested code, but it also opens you up to a lawsuit. So it is not so much reinventing the wheel as trying to find a different way of doing things so you won't get sued. For that, you have current IP laws to thank.
The problem here is with equating writing software to producing works of art. People are willing to go out of their way to learn and improve themselves to paint better or make beautiful music because it enables them to express themselves. It's emotionally satisfying. OTOH most software is programmed to achieve a certain utility and the programmer is faced with constraints e.g. having to use certain libraries etc. He is rarely able to express himself, and his work is subject to the whims of his bosses. For most everyday programmers, I think there is no real motivation to 'learn from the great masters'.
An exception might be the traditional hacking/cracking community where the members program for the sheer joy/devilry of it. I understand there is a fair amount of sharing of code/information/knowledge/learning from the great masters within their community.
Re: (Score:3, Insightful)
It's managers and executives who make the decisions, and to them whether it's a browser or mobile app or SaaS or whatever the latest trend is, who cares if you're reinventing the wheel as long as profits are up.
That hasn't changed either. Just the specific subject of the idiocy has changed. Idiotic managers are timeless. Lady Ada probably had the same thing to say about Charles Babbage.
Cheers,
Dave
Re:It's not the programmers making the decisions (Score:5, Informative)
Actually, it was Babbage who faced such idiocy from Parliament:
On two occasions I have been asked [by members of Parliament], 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.
And you thought the clogged tubes thing was bad.
Another interpretation (Score:5, Interesting)
I've always felt like that quotation had another interpretation, one that's much more favorable to the MPs:
If you're an MP, you've probably had to deal with a lot of people asking for money to fund what is essentially snake oil. If you don't understand the underlying 'cutting edge' technology (which is both plausible and acceptable), one simple test is to ask a question for which you KNOW that if the answer is anything other than "No", the person is bullshitting and you can safely ignore them... and as reported, the question is phrased in such a way that it would sorely tempt any huckster to oversell their device. I think Babbage's lack of comprehension was due to his inability to understand that the MP was questioning HIM, rather than the device.
Re:Another interpretation (Score:4, Informative)
one that's much more favorable to the MPs
Come to England and look at our MPs! You will then probably feel that it wasn't such an unfair interpretation on the part of Babbage.
Seriously though, there are many people out there (and they tend to be non-technical) who simply do not understand computers. The lack of understanding means that they effectively interpret the actions of computers as magic, in that there is no way for them to reason about what a computer might do. Even pretty smart people fall prey to this.
The UK has never had a tradition of putting technically minded people into parliament.
Re: (Score:3)
Ah, so that's why progress was possible back then and near absent today?
Re:It's not the programmers making the decisions (Score:5, Interesting)
Re:It's not the programmers making the decisions (Score:5, Insightful)
Re: (Score:3)
You're missing the point. Developers may pick the language, but if you've only hired a tenth as many programmers for language A as for language B (because those who use language A are ten times more productive), then when you come to start a new project you'll have ten times as many advocates for language B as you do for language A. Which language will your development team pick?
Re:It's not the programmers making the decisions (Score:5, Insightful)
That's such a cop-out, and it's not true. Most of the managers making these decisions are technical managers who come from development backgrounds themselves.
There is a problem at a more fundamental level, even outside of determining what buzzwords to use for a product and it's prominent even in some of the higher echelons of web society. The most obvious I'm going to point out is that of HTML5 - it's a braindead spec full of utterly amateur mistakes that could've been avoided if only Ian Hickson had spent 5 seconds understanding why existing things were the way they were and why that mattered.
An obvious example is that of HTML5's semantic tags: using a study to determine a static set of tags that would define semantic capabilities, in a spec that was out of date before the ink had even dried, was just plain stupid. The complaint that we needed more readable markup rather than div soup to make writing HTML easier was naive, firstly because amateurs just don't write HTML anymore, they all publish via Facebook, Wordpress and so forth, and secondly because there's a good reason markup had descended into div soup - genericness is necessary for future-proofing. Divs don't care if their ID is menu, or their class is comment; they're content neutral, they don't give a fuck what they are, but they'll be whatever you want them to be, which means they're always fit for the future. In contrast, HTML5 tried to replace divs with tags such as aside, header, footer and so forth, which would be great except that when you have a finite number of elements you end up with people arguing about what to do when an element doesn't fit. Do you just go back to using divs for that bit anyway, or do you fudge one of the new tags in because it's kinda-loosely related, which bastardises the semantics in the first place because we no longer really know what each semantic tag refers to once it's been fudged in where it doesn't make a lot of sense?
The real solution was to provide a semantic definition language, the ability to apply semantics to classes and IDs externally. Does that concept sound familiar? It should because we had the exact same problem with applying designs to HTML in the past and created CSS. We allowed design to be separate from markup with external stylesheets because this had many benefits, a few obvious ones:
1) Designers could focus on CSS without getting in the way of those dealing with markup making development easier
2) We could provide external stylesheets for no longer maintained sites and have the browser apply them meaning there is a method to retrofit unmaintained sites with new features
3) Our markup is just markup, it just defines document structure, it does one thing and one thing well without being a complete mess of other concepts.
Consider that these could've been benefits for building a semantic web too, if HTML5 had been done properly. The fact that Ian Hickson failed to recognise this with HTML5 highlights exactly what the article is talking about. He's completely and utterly failed to learn the lessons before him as to why inline styling was bad, and on a more fundamental level he demonstrates a failure to understand the importance of separation of concerns and the immense benefits it provides - a lesson already learnt the hard way by those who came before him. His solution? Oh, just make HTML5 a "living spec" - what? Specs are meant to be static for a reason, so that you can actually become compliant with them and remain compliant with them. Spec compliance, once you've achieved it, shouldn't ever be a moving target. That's when you know you need to release a new spec.
It's a worrying trend because it's not just him; I see it amongst the Javascript community as they grow in their ambition to make ever bigger software but insist that Javascript is all they need to do it. The horrendously ugly fudges they implement to force faux-namespaces into the language, in a desperate attempt to alleviate the fact that Javascript was just never designed for large codebases, are but one example.
I see it in frameworks, especially in the PHP world, where there are many not-quite-MVC frameworks because, rather than figure out how to do things properly in the MVC pattern - recognising that sticking as close to the pattern as possible lets people dive straight into your framework - they throw in all these stupid additional concepts that are a mere band-aid for their own ignorance of good software design and sensible use of patterns. I mean, why re-use a common solution to a common problem that was solved long ago, and that everyone else can therefore jump in and understand, when you can come up with your own god-awful fudge that makes no sense to anyone and turns your framework into an unmaintainable mess, such that you end up rewriting your entire framework every few years?
I'll admit I see the problem almost entirely amongst the web development community, and in part I think this is because many web developers are home grown more so than in other fields of software development, and so didn't go through the academic rigour of a computer science degree and were brought up on a diet of badly designed languages like PHP and Javascript. But the problem the article talks about is clearly there: in the web world, even the most basic lessons of a decent computer science or software engineering degree are completely lost on an upsettingly large proportion of those in the field.
There's a reason why, in this day and age, there are still so many websites trivially exploitable through long-solved problems like SQL injection attacks - because the people who haven't learnt the lessons from those who came before them are unfortunately not small in number, and they are out there writing supposedly production-ready software. That problem can't simply be pinned on management.
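To put the SQL injection point in concrete terms: the long-solved fix is the parameterized query. Below is a minimal sketch using the SQLite C API; the table and column names are invented for illustration and nothing here comes from the thread itself.

#include <sqlite3.h>
#include <string>

// BAD: building the query by string concatenation lets crafted input
// rewrite the SQL, e.g.
//   "SELECT id FROM users WHERE name = '" + user_input + "'"
// GOOD: a prepared statement with a bound parameter treats the input
// purely as data, never as SQL.
bool find_user(sqlite3 *db, const std::string &user_input) {
    sqlite3_stmt *stmt = nullptr;
    const char *sql = "SELECT id FROM users WHERE name = ?1;";
    if (sqlite3_prepare_v2(db, sql, -1, &stmt, nullptr) != SQLITE_OK)
        return false;
    sqlite3_bind_text(stmt, 1, user_input.c_str(), -1, SQLITE_TRANSIENT);
    bool found = (sqlite3_step(stmt) == SQLITE_ROW);
    sqlite3_finalize(stmt);
    return found;
}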
The thing about repeating the past (Score:2)
I saw the Lady Gaga quip and Scott's fondness for effective ancient map-reducey techniques on unusual hardware platforms. It reminded me about things like discovering America. Did the Vikings discover it years before any other Europeans? Certainly. Did the Chinese discover it as well? There's some scholarly thought that maybe they did. But you know whose discovery actually effected change in the world? Lame old Christopher Columbus.
Perhaps there's a lesson to be learned here from people who want to actuall
Re: (Score:2)
Eh? Starting 15,000 years ago, various waves of people came here from Asia, and huge and important civilizations have risen and fallen in the Americas since then. Some of those people are still around, and their influence on art, food and medicine continues into our culture. One group of those Asians was absolutely crucial to the United States winning its independence and also had influence on our Constitution. Talk about effecting change in the world; and they're still around, by the way.
Re:The thing about repeating the past (Score:4, Insightful)
Lady Gaga is mentioned because she is both a classically trained artist and sui-generis of successful PopTart art through self-exploitation. Yes, the reference is recursive - as this sort of folk are prone to be. They can also be rude, if you bother to click through, as they give not one shit about propriety - they respect skill and art and nothing else.
When I plussed this one on the Firehose I knew most of us weren't going to "get it" and that's OK. Once in a while we need an article that's for the outliers on the curve to maintain the site's "geek cred". This is one of those. Don't let it bother you. Most people aren't going to understand it. Actually, if you can begin to grasp why it's important to understand this you're at least three sigmas from the mean.
Since you don't understand why it's important, I wouldn't click through to the article and attempt to participate in the discussion with these giants of technology. It would be bad for your self-esteem.
For the audience though, these are the folk that made this stuff and if you appreciate the gifts of the IT art here is where you can duck in and say "thanks."
Re:The thing about repeating the past (Score:4, Funny)
Re:The thing about repeating the past (Score:4, Insightful)
A (very) recent OSCON talk (Score:5, Informative)
Re: (Score:2)
We don't shun those who should be shunned. (Score:5, Insightful)
It's pretty damn obvious why this is: as an industry, we no longer shun those who should definitely be shunned.
Just look at all of the damn fedora-wearing Ruby on Rails hipster freaks we deal with these days. Whoa, you're 19, you dropped out of college, but you can throw together some HTML and some shitty Ruby and now you consider yourself an "engineer". That's bullshit, son. That's utter bullshit. These kids don't have a clue what they're doing.
In the 1970s and 1980s, when a lot of us got started in industry, a fool like that would've been filtered out long before he could even get a face-to-face interview with anyone at any software company. While there were indeed a lot of weird fuckers in industry back then, especially here in SV, they at least had some skill to offset their oddness. The Ruby youth of today have none of that. They're abnormal, yet they're also without any ability to do software development correctly.
Yeah, these Ruby youngsters should get the hell off all of our lawns. There's not even good money in fixing up the crap they've produced. They fuck up so badly and produce so much utter shit that the companies that hired them go under rather than trying to even fix it!
The moral of the story is to deal with skilled veteran software developers, or at least deal with college graduates who at least have some knowledge and potential to do things properly. And the Ruby on Rails idiots? Let's shun them as hard as we can. They have no place in our industry.
Re: (Score:3)
Re: (Score:3)
I think you were trolling, but there's a point under there. In the 70s you had to have a clue to get anything done. As more infrastructure and support systems have been built, in the interest of not having to reinvent the wheel every project, you *can* have people produce things - or appear to produce things - while remaining clueless. Flash and sizzle have been replacing the steak.
Now it's html5 and sizzle.
indeed, too many bad code monkeys, few engineers (Score:5, Insightful)
It's not entirely young vs old, either. I'm in my 30s. I work with people in their 50s who make GOOD money as programmers, but can't describe how the systems they are responsible for actually work.
How do we fix it? If you want to be good, studying the old work of the masters like Knuth is helpful, of course. Most helpful, I think, is to become familiar with languages at different levels. Get a little bit familiar with C. Not C# or C++, but C. It will make you a better programmer in any language. Also get familiar with high level. You truly appreciate object oriented code when you do GUI programming in a good Microsoft language. Then, take a peek at Perl's objects to see how the high level objects are implemented with simple low level tricks. Perl is perfect for understanding what an object really is, under the covers. Maybe play with microcontrollers for a few hours. At that point, you'll have the breadth of knowledge that you could implement high level entities like objects in low level C. You'll have UNDERSTANDING, not just rote repetition.
* none of this is intended to imply that I'm any kind of expert. Hundreds, possibly thousands of people are better programmers than I. On the other hand, tens of thousands could learn something from the approach I described.
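As a small illustration of the "what an object really is, under the covers" idea above, here is a sketch of hand-rolled dispatch built from a struct plus a table of function pointers - roughly what C++ vtables and Perl's blessed references automate for you. The names are invented for illustration.

#include <cstdio>

// An "object" is just data plus a pointer to shared behaviour.
struct ShapeOps;                       // the hand-rolled "vtable"

struct Shape {
    const ShapeOps *ops;               // which behaviour table to use
    double a, b;                       // instance data
};

struct ShapeOps {
    double (*area)(const Shape *self); // one "method"
};

static double rect_area(const Shape *s)   { return s->a * s->b; }
static double circle_area(const Shape *s) { return 3.14159265 * s->a * s->a; }

static const ShapeOps rect_ops   = { rect_area };
static const ShapeOps circle_ops = { circle_area };

int main() {
    Shape r = { &rect_ops,   3.0, 4.0 };
    Shape c = { &circle_ops, 2.0, 0.0 };
    // "Virtual dispatch" by hand: look the method up through the table.
    std::printf("%f %f\n", r.ops->area(&r), c.ops->area(&c));
}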
I'm not seeing your point (Score:3)
Are you suggesting that it's not true, that C won't show you things that you don't learn from Ruby? Also the reverse - GUI programming in
Re: (Score:3)
Sorry, my point was not as clear as I had hoped, since I never actually made it directly...
Are you suggesting that it's not true, that C won't show you things that you don't learn from Ruby?
Not as such. I'm suggesting that some people are simply impervious to learning. Back then as now, people managed to struggle through careers in programming without seeming to gain knowledge, skill or apparently any understanding deep enough to write programs. For those people, they learned nothing really from C then and
Re: (Score:3)
To be honest, I sort of softened on Ruby on Rails after being forced to endure a project on it, and, much to my teeth-grinding resentment, actually found it a decent and productive environment (although I'd say Django more so because of its relative lack of magic, and hey, who doesn't enjoy screwing around with Python).
Now don't get me started on javascript on the server and NoSQL systems. Somewhere between "lets call ourselves amazing because we got a god damn web browser script environment to implement a pa
Re: (Score:2)
Re: (Score:3)
I Googled "A poor worker blames his tools". All I got was links to Craftsman and Harbor Freight.
I bet you're blaming Google for this.... :-)
Re:We don't shun those who should be shunned. (Score:4, Insightful)
Re: (Score:3)
Neither one of your meanings matches how I've always heard it. A poor worker will try to place the blame on someone else. "It couldn't have been my fault, they must have been bad tools." So the tools were the constant rather than the variable.
Re: (Score:2, Insightful)
I can tell that you're young simply because you used C++ in a debate where someone slightly older would have used C. Either that, or you're a Windows programmer.
C utterly dominated open source (and thus the Slashdot community) until about 5 years ago. That's when the overwhelming number of universities switched to C++. Of course, before that it was Java, so you can see the trend.
Unless you're a Windows programmer, I'd stick with C, which is infinitely simpler, and provides you freedom to maintain competency i
Re: (Score:3)
The un-useful parts - overloading and inheritance for example, detract from that.
C has overloading. The + operator is overloaded so that it operates differently on pointers, ints, chars, floats, doubles and various combinations of them. Could you imagine actually using a language without overloading?
I have used them and the result is not particularly pleasant.
Likewise for C++ which has *user defined* overloading. The idea of writing complex maths in C is horrible compared to C++.
And inheritance? How is t
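To make the complex-maths comparison above concrete, here is the same expression written with user-defined overloading (std::complex) and spelled out by hand in C style. It is a sketch of the readability point only, not a benchmark; the function names are invented.

#include <complex>

// With user-defined operator overloading the formula reads like maths:
std::complex<double> filter_cpp(std::complex<double> a,
                                std::complex<double> b,
                                std::complex<double> c) {
    return a * b + c;
}

// Without it (plain C style), every operation is spelled out by hand:
struct cplx { double re, im; };

struct cplx filter_c(struct cplx a, struct cplx b, struct cplx c) {
    struct cplx m, r;
    m.re = a.re * b.re - a.im * b.im;   // a*b, real part
    m.im = a.re * b.im + a.im * b.re;   // a*b, imaginary part
    r.re = m.re + c.re;
    r.im = m.im + c.im;
    return r;
}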
Re: (Score:3)
"C has no un-useful parts"
setjmp / longjmp ?!?
Very very very useful when writing error handlers for functions way on down the stack. Much more crufty than C++ exceptions, but a helluva lot more helpful when debugging signal-based code.
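A minimal sketch of that setjmp/longjmp error-handler pattern follows, illustrative only; note that longjmp bypasses destructors, so in C++ it only mixes safely with code that has nothing non-trivial to clean up. The function names are invented.

#include <csetjmp>
#include <cstdio>

static std::jmp_buf error_ctx;        // where to unwind to on failure

static void deep_in_the_stack(int value) {
    if (value < 0)
        std::longjmp(error_ctx, 1);   // bail out across many frames at once
    std::printf("processed %d\n", value);
}

static void middle_layer(int value) {
    deep_in_the_stack(value);         // no error plumbing needed here
}

int main() {
    if (setjmp(error_ctx) != 0) {     // second return: we longjmp'd here
        std::fprintf(stderr, "recovered from a deep error\n");
        return 1;
    }
    middle_layer(42);
    middle_layer(-1);                 // triggers the handler above
    return 0;
}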
Re: (Score:3)
And Ruby is just a hobbled variant of Smalltalk anyway.
Re:We don't shun those who should be shunned. (Score:5, Interesting)
Re:We don't shun those who should be shunned. (Score:5, Interesting)
Re: (Score:2)
We just had an earthquake here, and one of the most heavily damaged buildings was
Re: We don't shun those who should be shunned. (Score:2, Insightful)
I'm offended that Ruby keeps getting thrown in with this Node/NoSQL stuff. Node has a couple of real use cases, but outside of those it's a waste of time. NoSQL has a couple of real use cases, but outside of them it's not something you build around.
Ruby, on the other hand, is a really interesting language that has the benefit of being so flexible that it's made for creating DSLs: Puppet, Chef, Capistrano and Rails, just off the top of my head. Do some libraries have memory leak issues? Yes. Does its thread handli
Re: We don't shun those who should be shunned. (Score:4, Interesting)
Package management and fit-for-purpose tool chains don't exist in other languages? Is that a joke? Have you seen the Ruby gem ecosystem? Have you seen the Java ecosystem? You can do everything that you described in Ruby or Python without blinking, and you won't incur the technical debt that Node's global insanity creates. Node came along and people went "OMG! Non-blocking I/O!" and everybody else with a pulse looked at it and said... yeah, that's what background workers are for, but background workers encapsulate the logic instead of letting it all float around in one process. Eventually, Node code grows to insanity.
Mongo is awesome...for write heavy applications. In most applications that means that one table could probably be better served with Mongo. For logging or cloud based data aggregators it's EXCELLENT. It's a fantastic session store too. Also a great query cache. That doesn't make it the optimal tool for your entire system where you might actually care about normalization, data compression, data integrity or the amount of hard drive space required to store all the data bloat that comes with it.
I can build a fully functional ecommerce system with an API, payment gateways, account system and analytics in 2 weeks (and most of that is just setting up the payment gateway and merchant account) with Ruby, Python, Groovy or Scala. With 1 person. Having it do $100k/month in sales is a product of what it's offering, how effectively it's marketed, and how the supply chain side of the business can scale with demand, and has absolutely zero to do with Node and/or Mongo.
Are you actually serious with such an example?
Re: (Score:3)
That's why I don't work in areas where that can be done. In embedded systems you still need to know the basics and can't just rely on the web technology invented last week. Even in mobile devices they need bare metal C coders (I just got a recruiter letter for it too). Gotta know hardware, operating systems, some assembler, debug from core files, good algorithms, communication with other processors, etc. Try getting a $10/hour guy to do that, or someone who thinks a certificate is proof of qualificati
Paging Linus (Score:3, Insightful)
http://scottlocklin.wordpress.com/2013/07/28/ruins-of-forgotten-empires-apl-languages/#comment-6301 [wordpress.com]
Computer science worked better historically in part because humorless totalitarian nincompoopery hadn't been invented yet. People were more concerned with solving actual problems than paying attention to idiots who feel a need to police productive people's language for feminist ideological correctness.
You may now go fuck yourself with a carrot scraper in whatever gender-free orifice you have available. Use a for loop while you're at it.
Re:Paging Linus (Score:4, Informative)
Hive: distributed and free (Score:3, Interesting)
That's genius: comparing a "$100k/CPU" non-distributed database to a free distributed database. Also no mention that, yes, everyone hates Hive, and that's why there are a dozen replacements coming out this year promising 100x speedup, also all free.
And on programming languages, Locklin is condescending, speaking from his high and mighty functional-programming-languages mountain, and makes no mention of the detour the industry had to first take into object-oriented programming to handle and organize the exploding size of software programs before combined functional/object languages could make a resurgence. He also neglects to make any mention of Python, which has been popular and mainstream since the late '90s.
Re:Hive: distributed and free (Score:4, Interesting)
large teams (Score:3)
What about the lack of QA and too much auto testing? (Score:2)
With too much automated testing, people can just code to pass the tests; even if someone looking at the code would mark it as a failure, it still passes whatever the automated system thinks is good.
Optimal team size (Score:3)
For any given software project there is an optimal team size. If the project is small enough, you can keep the team size down to what works with an agile development methodology. If the project is bigger than that, things get ugly. I started my career in a company that considered projects of 50 to 100 man-years to be small to medium sized. Big projects involved over a thousand man-years of effort and the projects were still completed in a few years calendar time. You can do the math as to what that mea
'Web Based' Coding is not the same... (Score:5, Funny)
We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac
I don't remember that code running cross-platform on varying architectures. The web as a platform for distribution should not be compared to an actual OS... that doesn't even make sense.
Re:'Web Based' Coding is not the same... (Score:5, Interesting)
I don't remember that code running cross platform on varying architectures.
Yes. No code runs cross platform on varying architectures - INCLUDING the stuff that supposedly does, like Java and Javascript and all of the web distributed stuff. All of it DEPENDS on an interpretation level that, at some point, has to connect to the native environment.
Which is what BASIC was all about. And FORTRAN. Expressing the algorithm in a slightly more abstract form that could be compiled to the native environment, and then in the case of BASIC turned into interpreted code (Oh, you thought Java invented the virtual machine?)
In a lot of ways it is closed source vs open (Score:3)
Competition for money might get people to strive to make better pieces of art. But on the flip side, this same competition will sue your pants off for any reason they can find so you don't compete with them either.
And on an unrelated note, I had an idea today for a zombie video game like Groundhog Day. The game starts at the beginning of a zombie pandemic, and when you die you start over. As you die and play through it over and over, you learn secrets about where weapons and supplies are. You find tricks you can use to survive and save people. Eventually you find out who caused the zombie pandemic. You can then kill him before he goes through with it. I'm not sure an ending where you serve time in prison is a good ending though. I didn't think it the whole way through, but it sounded like a good premise for a zombie game.
Re: (Score:3)
That doesn't seem true for the most part.
All open source does with regard to code reuse is that it makes it painfully obvious how much redundancy there is. The spat between the different Linux display managers is one recent example, but I'm sure you can think of many others.
As for why this is
In Browser (Score:5, Insightful)
We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac.
Did the Mac, 25 years ago, allow people to load code from a remote server and execute it locally in a sandbox and in a platform independent manner all in a matter of a couple of seconds? No. No it did not.
We should then pay homage to the Mac of 25 years ago, when it basically did what Doug Engelbart demonstrated 45 years ago. [youtube.com] Nice logic you have there.
Re: (Score:2)
Re:In Browser (Score:4, Informative)
Did the Mac, 25 years ago, allow people to load code from a remote server and execute it locally in a sandbox and in a platform independent manner all in a matter of a couple of seconds? No. No it did not.
Depends on how much leeway you are willing to grant. Around 1990 or so, the Mac could run SoftPC, a virtual-machine x86 emulator running DOS or Windows. The Mac could certainly network and had file servers. So you should in fact have been able to download code from a file server and run it in the virtual machine, which from a Mac perspective would effectively be a sandbox. Although the PC DOS/Windows platform isn't "platform independent," it was nearly universal (minus Mac-only systems*) at the time.
* Yes, yes - Amiga, Apple II, Atari, et al.
Re:In Browser (Score:5, Funny)
We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac.
Did the Mac, 25 years ago, allow people to load code from a remote server and execute it locally in a sandbox and in a platform independent manner all in a matter of a couple of seconds? No. No it did not.
We should then pay homage to the Mac of 25 years ago, when it basically did what Doug Engelbart demonstrated 45 years ago. [youtube.com] Nice logic you have there.
Dude, just ignore this guy. Of all people who have the right to indulge in a good, old-fashioned 'get off my lawn' rant, Dave Winer ranks last. This is the man who, for our sins, gave us XMLRPC and SOAP, paving the way for the re-invention of... well, everything, in a web browser.
Port 80 died for this man's sins....
The third link (Score:2)
It would be great if he'd actually given examples of why APL is a good language. I would be interested in that. Instead he says mmap is really interesting, which actually doesn't have anything to do with programming languages.
He says that old programmers have left a lot of examples of good source code. It would be great if he'd actually linked to their code.......
What past was he from? (Score:3, Funny)
He says system performance is the same as it was way back then. He thinks that stuff just happened immediately on those systems because they were running very efficient code. So what. Here's a simple test. Go get one of those computers and set it next to yours. Turn them both on. Mine would be at a desktop before the old one even thinks about getting down to actually running the operating system. Or start a program. On a current system it loads now. As in, right now. Back then it was a waiting game. Everything was a waiting game. He must have simply forgotten or repressed those memories.
Re:What past was he from? (Score:4, Insightful)
Also those old programs did a lot less than many of our new programs. People often forget that when complaining about performance.
That's not to say, of course, that modern programs couldn't be written more efficiently. Because of Moore's Law and other considerations, we have moved away from spending a lot of time on performance and efficiency.
Re: (Score:3)
Of course modern computers have faster CPUs and everything else, but I'd really like to know where along the line a 30 second boot time became acceptable......
Re:What past was he from? (Score:5, Informative)
Re: (Score:3, Informative)
However, compare Word from 1990 to Word from today. The 1990 one will start nearly instantly, be incredibly responsive, and have all the features most people use anyway.
Re:What past was he from? Mine. (Score:3)
I had an Atari ST at college. It booted to a (graphical, no less) desktop pretty much instantly - say a few seconds if you had a slew of SCSI peripherals (especially a CD-ROM drive), but otherwise it was about half a second.
It was ready to go, too. None of this crap of *showing* the desktop and then spinning the busy cursor for another 30 secs...
Simon.
Too much theory / lack of apprenticeships in CS? (Score:2)
When people don't learn from people who have made mistakes, or who have had some real workplace experience (not just years of academic experience), it's easy to end up making mistakes that in theory seem like good ideas.
It's also similar to some of the certification-type stuff, where the book says one thing but in the real workplace that doesn't work.
Net, CPU and GPU bound (Score:5, Insightful)
They faced the limits of how much data fits on a floppy or a CD.
They had to think about updates over dial-up, ISDN and ADSL.
Their art had to look amazing and be responsive on first-generation CPUs/GPUs.
They had to work around the quality and quantity of consumer RAM.
They were stuck with early sound output.
You got a generation of GUIs that worked, file systems that looked after your data, and over time better graphics and sound.
You got a generation of programming options that let you shape your 3D car on screen rather than write your own GUI and then have to think about the basics of 3D art for every project.
They also got the internet working at home.
Re:Net, CPU and GPU bound (Score:5, Insightful)
Some people seem to think this article is about going back to the past. They miss the entire point. We're not saying that older programs were better, or that older computers were better, or that we should roll back the clock. We're saying that they had to pay more attention to what they were doing, they had to learn more and be broad based, they had to learn on their own, and so forth. When they had good ideas they were shared, they were not continually being reinvented and presented as something new. They didn't rely on certification programs.
Some things are missing (Score:5, Interesting)
Remember the strategy gaming past... (Score:2)
... or be doomed to repeat it. And they have been for 20 years, every year. Strategy game development in particular seriously needs a persistent collective consciousness.
Re: (Score:2)
Well, maybe I should have been more specific and said 4X strategy game development....
A symptom of popular culture in the '60s (Score:5, Insightful)
All Mozart's Works are Open Source (Score:5, Insightful)
You can learn a lot from Mozart because you can read all the notes he published.
You can listen to many interpretations of his works by different people.
We don't have the chance to read through 25-year-old Mac symphonies^W programs.
We aren't even writing for the same instruments.
Back to chariots and horses (Score:2, Insightful)
Chariots were masterpieces of art. They were often made of precious metals and had elegant design work. They were environmentally friendly, using no fossil fuels whatsoever. They didn't cause noise pollution, or kill dozens of people when they crashed.
Aircraft makers should learn from the past. They have totally functional designs, no semblance of artistry anywhere. Accommodations are cramped, passengers treated like cattle.
We should go back to the good old days, things were so much better back then.
No
The past was nice but today is not then (Score:5, Insightful)
Library Code Archives (Score:2)
Libraries should be archiving (and date-stamping) code. When copyright expires, that code can form public domain building blocks for a lot of cool stuff.
The kids of the future won't have to reinvent the wheel, they'll be able to improve it.
Software patents suck.
Au Contraire! (Score:4, Interesting)
For instance: As a cyberneticist I'm fond of programs that output themselves; it's the key component of a self-hosting compiler... Such systems have a fundamental self-describing mechanism much like DNA, and all other "life". While we programmers continue to add layers of indirection and obfuscation (homomorphic encryption) and segmented computing (client/server), some of us are exploring the essential nature of creation that creates the similarities between such systems -- While you gloat over some clever system architecture, some of us are discovering the universal truths of design itself.
To those that may think Computer Science is a field that must be studied or be repeated, I would argue that there is no division in any field and that you haven't figured two key things:
0. Such iteration is part of the cybernetic system of self improvement inherent in all living things -- to cease is death, extinction.
1. Nothing in Computer Science will truly be "solved" until a self improving self hosting computing environment is created...
So, while you look back and see the pains of Microsoft trying to implement POSIX poorly, I've studied the very nature of what POSIX tried and only partially succeeded in describing. While you chuckle at the misfortunes of programmers on the bleeding edge who are reinventing every wheel in each new language, I look deeper and understand why they must do so. While you look to the "great minds" of the past, I look at them as largely ignorant figures of self-import who thought they were truly masters of something, but who ultimately did not grasp what they claimed to understand at a fundamental level -- the way a quantum physicist might acknowledge pioneers in early atomic thinking... Important, but not even remotely aware of what they were truly doing.
How foolishly arrogant you puny minded apes are...
Upgrades and backward compatibility (Score:3)
I think "learning from the old masters" really isn't the problem. It's not that we don't have lots of smart people writing software. I think the core problem is that we haven't figured out how to do upgrades and backward compatibility properly, which the old masters haven't figured out either. You can go and develop a HTML replacement that is better and faster, sure, but now try to deploy it. Not only do you have to update billions of devices, you also have to update millions of servers. Good luck with that. It's basically impossible and that's why nobody is even trying it.
In a way, HTML/Javascript is actually the first real attempt at solving that issue. As messed up as it might be in itself, deploying an HTML app to a billion people is actually completely doable; it's not even very hard, you just put it on your webserver and send people a link. Not only is it easy, it's also reasonably secure. Classic management of software on the desktop never managed to even get near that ease of deployment.
If software is to improve in the long run, we have to figure out a way to make it not take 10 years to add a new function to the C++ standard. So far we simply haven't. The need for backward compatibility and the slowness of deploying new software slows everything to a crawl.
You hate Gates but... (Score:3)
You don't like Gates but wish programmers looked towards more "Great Masters?" Bill Gates was a Great Master Programmer.
Common rediscovered problems. (Score:4, Insightful)
There are a few problems which keep being rediscovered. In many cases, the "new" solution is worse than the original one.
Re: (Score:3)
If the people in the middle ages had only realized it was the rats, and the fleas they brought with them, they wouldn't have suffered from plague for so long. Hindsight is 20/20.
Re: (Score:3)
LocalTalk file sharing, AIFF files, 8-bit audio support.