
Google Engineer Decries Complexity of Java, C++ 878

Posted by Soulskill
from the keep-it-simple-stupid dept.
snydeq writes "Google distinguished engineer Rob Pike ripped the use of Java and C++ during his keynote at OSCON, saying that these 'industrial programming languages' are way too complex and not adequately suited for today's computing environments. 'I think these languages are too hard to use, too subtle, too intricate. They're far too verbose and their subtlety, intricacy and verbosity seem to be increasing over time. They're oversold, and used far too broadly,' Pike said. 'How do we have stuff like this [get to be] the standard way of computing that is taught in schools and is used in industry? [This sort of programming] is very bureaucratic. Every step must be justified to the compiler.' Pike also spoke out against the performance of interpreted languages and dynamic typing."
  • by Black-Man (198831) on Friday July 23, 2010 @03:24PM (#33006454)

    This guy has a lot of nerve telling other folks what programming language to use.

  • by drewhk (1744562) on Friday July 23, 2010 @03:24PM (#33006470)

    Bash Java, and C++ devs will agree. Bash C++, and Java devs will agree.

    Now you've bashed both languages that probably have the most devs. Except for some dynamic languages, of course (PHP and JS come to mind).
    Oh, you insulted them, too.

    OMG...

  • Summary: (Score:5, Interesting)

    by IICV (652597) on Friday July 23, 2010 @03:25PM (#33006488)

    Google distinguished engineer Rob Pike ripped the use of Java and C++ during his keynote at OSCON, saying that these 'industrial programming languages' are way too complex and not adequately suited for today's computing environments. ... Pike also spoke out against the performance of interpreted languages and dynamic typing. ... "Go is an attempt to combine the safety and performance of statically typed languages with the convenience and fun of dynamically typed interpretative languages," [Pike] said

    Shorter Rob Pike: all those other languages suck, but the one I invented rocks. It's elegant and simple just like Lisp was back in the sixties!

    I'm reminded of this blog [scienceblogs.com] post I read, where the author described it as "The Hurricane Lantern Effect". You look at someone else performing a task, and you think "geez, what an idiot! I can do it better in ten different ways!".

    Then they hand the task off to you, and you slowly realize that each of your ten improvements isn't actually any better.

    I bet you that if it's still around in ten years, someone else will decry Go 10.0 as being a "bureaucratic programming language".

  • Programming is Hard (Score:3, Interesting)

    by CSHARP123 (904951) on Friday July 23, 2010 @03:41PM (#33006734)
    I RTFA; really, all he is saying is that programming is hard. Well, duh. I'm sorry to hear that from a Google engineer. Maybe he'd be better off using C# or VB.NET. Welcome to programming.
  • by Mad-Bassist (944409) on Friday July 23, 2010 @03:41PM (#33006744) Homepage

    I kinda miss those days--easy to learn and embedded 6502 machine code subroutines to make things move faster.

  • by AnonymousClown (1788472) on Friday July 23, 2010 @03:42PM (#33006760)

    No,

    What this world needs is competent programmers. C++ too hard for you? You shouldn't be programming. It's that simple.

    Look, C++ is my programming language of choice: I like programming down to the metal, having the OS load my program and run it without any intermediary like an interpreter or some sort of runtime (no, I don't do CLI), and I like the ability to go down and do old school 'C' when I need to. But these days, it's like Stroustrup is adding features for the sake of adding features - trying to be "modern"? I don't know. It's making the language bulkier, it adds even more chances for obfuscation, and it's getting to the point where even C++ fanboys like myself are reevaluating our relationship with the language.

    Good god, when he first added templates (a great feature just see the STL), I had to deal with programmers who made template classes for everything and used it once for one data type. Plah-ease! Just because the feature is there doesn't mean you gotta use it. There's a time and place for everything. C++ is turning into the Word of programming languages: adding esoteric features that less than 1% of users will ever use.

    C++ is turning into a bloated slow fat pig and I'm thinking of getting a divorce.

  • by MightyMartian (840721) on Friday July 23, 2010 @03:44PM (#33006792) Journal

    Not to mention that, no matter what language you use, certain algorithms are going to be inherently complex, and in these areas the "simple" or "true" languages fall flat on their face. I've been coding in one form or another since the mid-1980s, and what I've seen from the "simple" languages is that they might be useful for teaching, but try to write a complex application or solve complex problems, and that simplicity simply makes the code even larger and more awkward.

  • Re:umm... (Score:4, Interesting)

    by owlstead (636356) on Friday July 23, 2010 @03:51PM (#33006894)

    I'm not saying that that would be it, but I would not mind a programming environment where the text files have gone the way of the dodo. With Eclipse we use a rather strong "Clean Up" where missing keywords (e.g. final) are added and statements are reformatted (etc.). Wouldn't it be easier to do without that kind of stuff? What about comparing the differences of two branches (or new code with the head) where the actual semantic changes are compared vs lines of text? What about an environment where you can easily hide complexity and meta-information? Or, possibly, add new literals? Where the base of the language is shifted to the Abstract Syntax Tree, not so much the syntax.

  • by jekewa (751500) on Friday July 23, 2010 @03:53PM (#33006908) Homepage Journal

    I'll start using Go as soon as Google makes a browser-based development environment for it, à la Google Documents meets Bespin, and it makes something I can then deploy to servers other than the one running the development environment...

  • Re:umm... (Score:3, Interesting)

    by FooAtWFU (699187) on Friday July 23, 2010 @03:58PM (#33006960) Homepage

    I'm no Robert Pike, but I imagine that computer programming as we move to thousands and millions of cores will consist less of telling the computer how to do something, and more of telling the computer what you want to do and having a really smart compiler figure out the details. The more low-level you go, the less chance the compiler has of figuring out what you're trying to do and making it work effectively (and do crazy optimizations like speculative out-of-order execution, and what-have-you).

    But this means that the programming languages of the future will be less imperative and more functional. How that will ever catch on is beyond me.

  • by owlstead (636356) on Friday July 23, 2010 @04:02PM (#33007000)

    Thank you, I do agree. I was about to write to the authors of Go, but I thought better of it: simply because I cannot see Go go anywhere.

    Basically, they do really weird things:
      - no exceptions
      - half-assed immutability concepts
      - focus on compile time (compile time? really? yes really!)
      - no modularization system (it's like the micro-kernel vs mono-kernel fight all over)

    It's got some good ideas that make it interesting for small, fast, secure applications, but not enough of them to be compelling. I could see it technically making some headway for small monolithic kernels. But their market placement is lacking to the point that it is non-existent.

  • by Grishnakh (216268) on Friday July 23, 2010 @04:07PM (#33007080)

    Yep, it's like different natural human languages. Some are much simpler than others, and don't have so many cases and tenses and such. However, to express more complex thoughts (such as "he would have liked to go home"), you have to be extremely wordy, whereas with a more complex language, you just use a different verb tense.

  • by slasho81 (455509) on Friday July 23, 2010 @04:07PM (#33007082)
    Rich Hickey talked about incidental complexity in his keynote talk at the JVM Languages Summit 2009: http://www.infoq.com/presentations/Are-We-There-Yet-Rich-Hickey [infoq.com]
    It's worth watching.
    If Pike thinks the Go language solves anything, he should probably watch this talk too.
  • by lgw (121541) on Friday July 23, 2010 @04:24PM (#33007288) Journal

    C++ done right with scoped classes doesn't need garbage collection. No, really it doesn't, most people have just never worked with scoped classes, and it's mind-boggling that they don't show up in C++0x (other than the half-useless auto_ptr we've always had). I have a real problem with garbage collection in production code, because it's freaking hard to find and plug resource leaks. If you forget to close a file in a garbage collected language, it will probably get closed eventually when the garbage collector cleans up, but the program likely has some bug anyhow, and the GC has made it a horrible intermittent bug that changes behavior in a lab environment!

    C++ has two key abilities that any good language needs, but few have: scoped resources (but only in that it allows you to add them yourself!), and const references. Why are people still making high level languages where references aren't const by default?!? In C# I can't (usually) tell from the function signature whether changing an argument will change the caller's copy, nor can I be sure from an interface that a given function won't alter what I pass to it - what insanity is that? (And Java is only slightly better.)

  • by westlake (615356) on Friday July 23, 2010 @04:26PM (#33007310)

    It's why COBOL was invented, with syntax like:
      SUBTRACT DEBIT FROM BALANCE GIVING NEWBALANCE.
    I kid you not, Adm. Hopper actually thought that would make programming easier, and she was no moron.

    COBOL was designed like this so it could be read and understood by corporate auditors and accountants - and for the recruitment and training of accountants as COBOL programmers.

    It makes perfect sense when you remember that modern bookkeeping rules are the product of hundreds of years of law and practical experience which the neolithic geek did not have.

       

  • Don't hate on VB (Score:3, Interesting)

    by labnet (457441) on Friday July 23, 2010 @04:26PM (#33007318)

    Although I mainly do hardware engineering, I also have done/do lots of 8/16/32 bit embedded programming in C and C++. C is a terrific language for embedded. C++ is like a samurai sword, hard to truly master without killing yourself.
    I truly loved VB6 as a RAD platform. I wrote a scientific application involving realtime data collection/control, database method and parameter control, realtime graphing, simultaneous multiple system control including sampling robots etc, that is still being used in hundreds of industrial labs around the world. It was written in around 4 months, and there is no way I could have done it in C++.
    I think C++ sucks for end user app development for most of the reasons mentioned by Rob Pike, but it has its place when you are close to the metal.

  • by lgw (121541) on Friday July 23, 2010 @04:31PM (#33007370) Journal

    No, I think you missed the point entirely here: those escape sequences would begin with a control character, but contain normal characters. So if you were grepping for "foo" you might get a false positive because "foo" appeared in an escape sequence. UTF-8 ensures that will never happen. Any existing code that uses reasonable algorithms to search or sort ASCII text still works reliably with UTF-8 text, but didn't with earlier escape sequence schemes.

  • by 0123456 (636235) on Friday July 23, 2010 @04:32PM (#33007380)

    In all my years of coding in Perl, Ruby, and JavaScript, I have never encountered a single bug where somebody inserted a string into an array meant for integers, or one where someone tried to compare a float to an array.

    I've seen plenty of bugs in interpreted languages where someone meant to call procFoo() but actually wrote procFooo() and therefore the two hour script run failed at 1:58 requiring us to run it again from the start. I've seen plenty of bugs caught because the compiler realised that CustomerId is not the same type as WidgetCount, even though both are wrapped ints. I've seen plenty of bugs where someone meant to write 'f = procFoo()' but actually wrote 'g = procFoo()' and the 'smart' language created a new variable g while the program later used the value of f.

    A compiled language like C++ would catch all of those before you actually ran the code. Obviously a strongly typed interpretive language would catch the second, but a 'smart' language would probably convert it for you, leading to a bug that takes a long time to track down.

    I'm not saying that 'smart' languages are a bad idea, but there are an awful lot of things they can screw up for you if they try to be too 'smart'.

  • Re:Don't hate on VB (Score:3, Interesting)

    by BitZtream (692029) on Friday July 23, 2010 @04:37PM (#33007432)

    You like VB for the same reason you like Tylenol.

    It lets you play doctor without actually being one, or in this case, you get to play programmer without actually being one.

    That most certainly has its usefulness as you are well aware, but that doesn't make it a tool that programmers should be using.

    Doctors and programmers use a different set of tools than you use at home.

    I spend most of my time now converting crappy VB apps to real applications, I have a license to hate VB and shitty VB writers who think they are programmers.

  • by ciggieposeur (715798) on Friday July 23, 2010 @04:37PM (#33007434)

    Nearly everything I was unhappy about in C++ is better in D.

  • by goombah99 (560566) on Friday July 23, 2010 @04:40PM (#33007460)

    I suppose it's worth expanding my polymorphism comment slightly. People who think that Objective-C's messaging concept is just semantic sugar are not understanding it clearly.

    In Objective-C you don't "call an object method" but rather you pass a message to the object. The object, if it chooses to reply, does not "return" a value but instead sends back a reply message.

    What's the difference? In implementation, very often nothing. The message that is sent is a message name and a list of calling args and their names. 99.9% of the time the object chooses to resolve this message by finding a method that corresponds to that prototype, so it looks just like C++. But the thing is, it does not have to do that. It could choose to re-interpret the message, and in particular it might use some method added to the object after link time. So method binding does not have to happen at link time; calls are often prebound then for efficiency, but they do not have to be. The same is true of the return values.

    This makes it more like Java in a way.

    But the nice thing is that the overall syntax is just a thin layer on basic C.

    Another reason that Objective-C is so nice now is that it had a chance to mature and modernize out of the spotlight. Having lived mainly in the Apple ecosystem, it has a lot of standard libs now, with dictionaries, Core Data tied to persistent storage, MVCs, and so forth, that are all (mostly) self-consistent and not the tower of Babel one finds in Java. Things like get/set methods can be handled by decorators rather than explicit coding. That's cool because, by letting the compiler pre-processor define what is in a getter/setter, you can inherit all sorts of things, like listeners binding to your variables, that you did not explicitly encode. Just recompile your code and poof, you inherit all the new features.

  • by benjto (1175995) on Friday July 23, 2010 @04:46PM (#33007544)

    You don't have to write in Java to run on Java. Languages like Clojure, Scala, and Groovy have come about as an answer to the complexity and verbosity of Java's syntax and structure - all running on the Java platform.

    Java

    // HelloWorld.java
    class HelloWorld {
        public static void main(String[] args) {
            System.out.println("Hello World!");
        }
    }

    Groovy

    //hello.groovy
    println "hello, world"

    Clojure

    // hello.clj
    (println "Hello World")

  • by Anonymous Coward on Friday July 23, 2010 @05:00PM (#33007714)

    With the speed of today’s computers, though, you shouldn’t (usually) need that amount of optimization.

    The inefficiencies aggregate to the point where you have CD burning software that eats 200MB of RAM and 300MB of HD space. Binaries were getting bigger and hungrier all through the 90s, but there was a literal explosion after all the sandboxed runtimes became popular with colleges. People don't buy faster computers to run slower code written by lazier programmers. They assume they're going to get a faster experience, or at least more capability. What ends up happening more and more often, though, is that stuff just gets slower, buggier and more expensive.

  • by ciggieposeur (715798) on Friday July 23, 2010 @05:09PM (#33007806)

    Even [XYZ]-modem used a similar setup.

    Not quite. Xmodem and Ymodem use SOH and STX to denote the start of sectors, plus ACK/NAK for handshaking, but after that it's just a raw 8-bit file dump down to the checksum/CRC bytes with no concern for character-set encoding. Zmodem uses DLE and escapes out most of the C0 bytes (XON/XOFF and CAN must be escaped regardless of session flags), but doesn't use the rest of the codes for anything.

    Most of the C0/C1 codes were mixed right in with the text for formatting/presentation, e.g. embedding backspaces followed by underscores to get underlined text. Some of the others did link-level work too. It was a mess, so much so that parsers for ANSI X3.64 / ECMA-48 style escape sequences take a LOT of work to get right (passing 'vttest' is not trivial).

    UTF-8 isn't bad. It specifies that character decoding be done before any other processing, including C0/C1 and ANSI escape sequences, which makes it very easy to integrate on the reading side. Harder is dealing with wide chars on the screen and user I/O. Compared to Avatar's repeat-character and ANSI fallback features, it's much more bang for the buck. And let's not talk about "ANSI Music" and its use of SO (Ctrl-N), because it's the "music symbol" in CP437!

    (Disclaimer: I've written a console-based terminal emulator [sf.net] that does a decent VT102/220, UTF-8, X/Y/Zmodem/Kermit, and lots of other things.)

  • by Anonymous Coward on Friday July 23, 2010 @05:11PM (#33007830)

    ...Google Go. Which has yet to impress me.

    There are some impressive parts of it that would make me want to use it over C, but it seems like it's still in its nascent stages and not ready for serious use yet. The biggest thing I can see that it needs is full two-way integration with existing C code...cgo is a terrible hack that only works to call C code from Go. They also need the ability to create shared libraries to allow C code to call into Go programs.

    I've been disappointed to see them ignoring that kind of stuff and instead focusing on all the Go packages. If you solve the integration with C, the standard library stuff becomes less important because everything they'd implement has a C equivalent that could be used until they've had a chance to implement a Go version.

  • by lgw (121541) on Friday July 23, 2010 @05:13PM (#33007854) Journal

    shared_ptr solves a different problem. An example of a scoped resource is a file handle that you allocate on the heap, that automatically closes the file when you leave scope, whether by returning or by an exception flying past. With scoped resources you can just stop thinking about all the "does my resource get cleaned up if this happens" cases entirely! You should never explicitly clean up a resource, so there's no way to forget to do so in some corner case. Just like a "using" block in C#, except it's the default, so you can't forget to do it (the most malevolent bugs I've fought were people forgetting to use a using block).

    Combine this idea with shared_ptr and it works for heap resources too, not just stack resources.

  • Re:I LOVE perl! (Score:5, Interesting)

    by Grishnakh (216268) on Friday July 23, 2010 @05:32PM (#33008050)

    Actually, English is a very easy language to learn, to a certain degree. It's a lot like learning to play guitar. Any moron can learn to play a few chords on a guitar and make a simple song. However, only really talented people can become true virtuosos of the instrument and play like Joe Satriani or Steve Vai. English is like that: it's easy to learn it to a minimal degree and become somewhat conversant. The words are short and simple, you don't have to worry about silly things like word gender, etc. However, becoming truly fluent in it (so that you can read and write advanced literature, for instance) is difficult and time-consuming because you have to memorize so many things, and learning some Greek and Latin is very useful for understanding many larger words.

  • by Animats (122034) on Friday July 23, 2010 @05:47PM (#33008260) Homepage

    The main problems of the major languages are known, but not widely recognized by many programmers.

    • C Started out with only built-in types, to which a type system was retrofitted. (You have to go back to pre-K&R C documents to see this, but originally, there was just "char", "int", "float", and pointers to them. "struct" was just a set of offsets, with no type checking. You couldn't even use the same field name in two different structs.) Bolting a type system onto this took a long time, and resulted in problems ranging from "array=pointer" to cascading recompilation because "include" files contain implementation details of included modules.

      The killer problems with C today mostly involve lying to the language. "int read(int fd, char* buf, size_t bufl);" is a lie; you're telling the compiler that the function accepts a bare char pointer, while in fact it expects the address of an array of char of length "bufl". This lie is the root cause of most buffer overflows. The other big problems with C involve the fact that you have to obsess on who owns what, both for allocation and concurrency locking purposes, yet the language provides no help whatsoever in dealing with those issues.
    • C++ Was supposed to fix the major problems with C. A few bad design decisions in the type system made that hopeless. The underlying problems with arrays remained. An attempt was made to paper that over with the "standard template library" collection classes. Collection classes were a big step forward, but they were really just papering over the moldy type system underneath, and the mold kept coming through the wallpaper. The C++ standards committee keeps adding bells and whistles to the template system, but after ten years they still don't have anything good enough to release.
    • Java Was supposed to fix the major problems with C. Java itself isn't a bad language, but somehow it got buried under a huge pile of libraries of mediocre quality. Then a template system was bolted on top, along with ever more elaborate "packaging" systems. Java ended up as the successor to COBOL, something that surprised its creators.
    • Python Python is an elegant language held back by painfully slow implementations. Some of the implementation speed problems come from the most common implementation, which is a naive (non-optimizing) interpreter, but some of them come from bad design decisions about when to bind. Late-binding languages are not inherently slow, but Python has lookup by name built into the language specification in ways which make it almost impossible to speed up the language as defined. (The Unladen Swallow team is discovering this the hard way; they're getting only marginal speed improvements with their JIT compiler.) Python also addresses concurrency badly; everything is potentially shared and one thread can even patch the code of another. The end result is that only one thread can run at a time in most implementations.
    • JavaScript A painful language which, due to massive efforts to speed it up, is starting to take over in non-browser applications. JavaScript is the object model of Self expressed in syntax somewhat like that of C. This is ugly but adequate.

    And that's where we are today.

  • by lgw (121541) on Friday July 23, 2010 @05:49PM (#33008282) Journal

    Sure, if you can get the "closing them when I'm done with them" right every time, never failing due to human error, you're golden. However, one wonders why you'd need a garbage collector, or for that matter a text editor - can't you just move the bits around on disk with the power of your mind? :)

    But, yeah, any time you see weak pointers, a living hell awaits. Garbage collection is much better for those corner cases (and, dammit, for a while C++0x had "opt-in" garbage collection, but lost it. It's the best of both worlds!). I'm not the biggest fan of WCF, but it does have one thing going for it: it's not COM.

  • Re:I LOVE perl! (Score:3, Interesting)

    by A Friendly Troll (1017492) on Friday July 23, 2010 @06:07PM (#33008526)

    You went on to describe how Perl is great but just so you know - every one of those reasons you listed is why every multi-lingual person on the planet hates English.

    I'm multi-lingual.

    English, with the exception of Esperanto, was the easiest language to learn. There are several orders of magnitude more exceptions in some other languages; they also have a lot more cases and conjugations, and actually use genders (to which English is oblivious). Even spelling words in English is incredibly easy.

  • Re:And video games (Score:2, Interesting)

    by Creepy (93888) on Friday July 23, 2010 @06:40PM (#33008944) Journal

    Actually, his rant is about the same thing I said about C++ 10 years ago. C++ is a powerful language, but it is also an extremely bloated language, and has a lot of legacy bolted on. After using the much more streamlined Objective-C it was hard to go back. To be honest, I'd say the main reason to use C++ is that all OSes include libraries for it.

    Sadly, C++ is even less portable than C and included numerous questionable decisions in its design. My personal peeve is wchar_t, which has an undefined length, so localization is a pain in the rear. I also wish there was an easy way to add accessor functions, since I always bolt them on except for low level code (to make it more true OOP). Also there are many convenience classes that have never made it in - like streamlined thread and thread pool classes (therefore, you need to implement them yourself on every platform).

  • It was actually Adam de Boor who discussed JavaScript.

    And while JavaScript has its warts, all in all, it's a pretty nice language. If Gmail is 400k+ lines of code in JavaScript, it would have been well over a million (if not four million) in Java.

  • by Anonymous Coward on Friday July 23, 2010 @07:34PM (#33009494)

    Even if it's a Stroustrup quote, it's tough to swallow him saying "we don't want our tools--including our programming languages--to be more complex than necessary".

    Stroustrup invented the MOST COMPLICATED PROGRAMMING LANGUAGE IN EXISTENCE that actually gets used for anything: C++. Because his fucking language is so complicated (and based on an antiquated edit-compile-link model inherited from C), the tools to work with this language (compilers, debuggers and IDEs) are also extremely complicated. So you can't do useful things like fast incremental compilation, useful reflection (unless you totally roll your own) or any sane kind of metaprogramming. And everyone knows a different subset of the language, and it takes about 4x as much programmer effort to get anything done than it does in a higher-level language like D.

    I have to use C++ at work and honestly I wish they would let us write the new one in C. Yes, it would be more inconvenient in some ways -- but it would also REALLY help by keeping us from getting into C++ rat-holes like overuse of templates or overloading or multiple inheritance or dynamic_cast with private non-virtual ctors in a protected virtual base class.

  • by shutdown -p now (807394) on Friday July 23, 2010 @09:19PM (#33010322) Journal

    In a good programming language, the "using" block would be implicit by default when you allocated the object, and you would need to declare it "shared" or something for those corner cases where you want it to survive leaving scope (and done properly, "shared" would only create a problem where there was an algorithmic error, not a simple oversight).

    You've just described C++. Objects with automatic storage duration ("stack-allocated" for those not versed in standardese) get cleaned up when their scope ends, and to mark them "shared" you declare them using std::tr1::shared_ptr and allocate with "new".

  • Re:Don't hate on VB (Score:3, Interesting)

    by shutdown -p now (807394) on Friday July 23, 2010 @09:27PM (#33010390) Journal

    I truly loved VB6 as a RAD platform. I wrote a scientific application involving realtime data collection/control, database method and parameter control, realtime graphing, simultaneous multiple system control including sampling robots etc, that is still being used in hundreds of industrial labs around the world. It was written in around 4 months, and there is no way I could have done it in C++.

    Yes, but you could have done it in Delphi just as fast and convenient, and you'd have had code that was much more readable.

    A language in which this line of code:

    x = y

    does not certainly mean "assign the value of variable y to variable x" (in VB, it might mean that, or it might not; that depends on the type of x - hint: default properties), truly deserves to be fully erased from the memory of mankind.

  • New wave legacy (Score:2, Interesting)

    by randomsearch (1207102) on Saturday July 24, 2010 @04:18AM (#33012032) Journal

    Having spent a few years working with IBM mainframes running 30-year-old COBOL and assembly code, the problems of maintainability are always sitting at the back of my mind when debates like this come up.

    I've worked in Java and C commercially, and recently I've done a few web projects in PHP and JavaScript. I'm currently an academic researcher, and in that job I've spent some time writing Python and C++. I'm also familiar with the way we're teaching languages and programming, having been involved in the labs and lectures.

    I think we are in danger of creating a "new wave" (apologies to French cinema) of legacy software. Whilst scripting, dynamically-typed languages can be fun and faster as tools to build code in the first place, they do not constrain or discipline the programmer as much as something like Java. Object-orientation with static typing etc. took off because it is a great way to design (some, not all) software in a structured manner to improve communication between engineers and address concerns like maintainability.

    I understand why people are using things like Ruby and Python right now, but I suspect it might be a short-termist view of the world. If you're planning to throw your website away, perhaps that's OK. Invariably, though, things last much, much longer than you expect them to.

    We've already seen how poorly sites like Facebook scale - imagine what they will be like in another 10 years. We may well look back on these years of great web development as building a legacy that a lot of us spend the rest of our lives trying to reverse-engineer, fix and replace.

    RS

  • Re:Summary: (Score:3, Interesting)

    by yyxx (1812612) on Saturday July 24, 2010 @04:44AM (#33012106)

    So the only people that can be critical of anyone are people that have some big publicly recognized accomplishments? That's a pretty small list.

    You went beyond critical and went to ad hominems; that makes the question legitimate.

    But merely "This guy thinks he's better and smarter than everyone else...

    He's smarter than the vast majority of programmers; his publications and resume tell you at least that much.

    but his actual accomplishments in what HE'S developed to replace those technologies in no way measure up to his fanatical criticism of them"

    He was part of the development of the technologies he criticizes. Second, his criticism really is valid: C++ and Java are objectively bad designs, and I say that as someone whose main programming languages over the last 30 years have been C, C++, and Java. Third, technically, what he has developed to replace them does measure up; it is certainly quite a bit better.

    So, if he developed better technologies, why didn't they catch on? Because better technologies frequently don't catch on. Replacing a technology with something better doesn't just require developing something better, it requires convincing the users of the old technologies that switching is worthwhile, a process complicated by the fact that most users of those old technologies know little about software or programming languages.

    And, as you noticed, Pike isn't a particularly good advocate; he comes across as a blowhard. Pike is never going to create a successful programming language, even though he is a lot more capable than B.S. or Gosling.
