
Google Engineer Decries Complexity of Java, C++ 878

Posted by Soulskill
from the keep-it-simple-stupid dept.
snydeq writes "Google distinguished engineer Rob Pike ripped the use of Java and C++ during his keynote at OSCON, saying that these 'industrial programming languages' are way too complex and not adequately suited for today's computing environments. 'I think these languages are too hard to use, too subtle, too intricate. They're far too verbose and their subtlety, intricacy and verbosity seem to be increasing over time. They're oversold, and used far too broadly,' Pike said. 'How do we have stuff like this [get to be] the standard way of computing that is taught in schools and is used in industry? [This sort of programming] is very bureaucratic. Every step must be justified to the compiler.' Pike also spoke out against the performance of interpreted languages and dynamic typing."


  • doWhatIWant()
    and
    doItFaster(doWhatIWant)

    • by somaTh (1154199) on Friday July 23, 2010 @03:27PM (#33006500) Journal
      See, I'm already thinking about extensions.

      doWhatIWantEvenThoughImTellingYouToDoSomethingElse()
      • by PixieDust (971386) on Friday July 23, 2010 @03:51PM (#33006874)
        I dunno, I kind of like these extensions:

        workItHarder makeItBetter doItFaster makesUsStronger moreThanEverHourAfterHour workIsNeverOver

      • One of C's great advantages is not only that it is simple and very fast, it is also very close to the hardware -- when you make local variables, structures, assignments, etc... you have a good idea what the compiler needs to do. Likewise control structures, statements and so on.

        The reason it is used is -- frankly -- because it kicks the ass of every other language out there (except machine and assembly) when it comes to both size and performance. This is because a C fragment turns into something very efficient.

        • by clone53421 (1310749) on Friday July 23, 2010 @04:17PM (#33007198) Journal

          One of C's great advantages is not only that it is simple and very fast, it is also very close to the hardware -- when you make local variables, structures, assignments, etc... you have a good idea what the compiler needs to do. Likewise control structures, statements and so on.

          That’s exactly the point... it’s too close to the hardware. Yes, it gives you really fine-grained control over what happens, and you can tweak it to make it as fast as possible. With the speed of today’s computers, though, you shouldn’t (usually) need that amount of optimization. Plus, the compiler should be robust enough to optimize the program nearly as well as you could anyway.

          You don’t want to tell the computer every nitty-gritty detail. The computer is fast enough and powerful enough to do what you want it to do without you needing to exercise that level of control over how it actually does it. You just want to call a function that does what you want without worrying about the underlying hardware or algorithm that does it.

          As programs get more and more complex, more and more abstraction is needed between the programmer and the hardware. This is not surprising. Someday, instead of saying “that’s like coding an entire application in assembly”, we might be saying “that’s like coding an entire application in C++”.

          • by bonch (38532) on Friday July 23, 2010 @04:21PM (#33007246)

            C's closeness to the hardware is probably why it has stayed relevant in the era of mobile computing and battery life. Some developers do need to tell the computer every nitty-gritty detail.

            • And video games (Score:5, Insightful)

              by Sycraft-fu (314770) on Friday July 23, 2010 @05:39PM (#33008144)

              Very hard to find a mainstream game that isn't written in C++. With MS pushing XNA and some other stuff like that, there may start to be a few more written in managed languages, but C++ still reigns supreme. Why? Speed. You can write some really efficient (from the processor's point of view) code if needed, but the language still has higher-level features, like being OO, and the Boost libraries to make things easier.

              Even in games built to be extensible, C++ is usually at the core. Civ 4 is mostly XML and Python: pretty much all the data is stored in XML, and the interactions of that data are scripted in Python. However, the game engine is written in C++, as is the AI's DLL. For the game core you could argue that's because they didn't want people messing with it, but they released the source code for the AI, and it is C++ because speed is essential.

              Some programmers love to whine about C and C++, but they endure for many reasons. I'd also point out they form the core of most OSes. Linux is written in C. The Windows kernel is written in C, the higher level API/ABI stuff in C++ and only some of the user stuff in .NET. OS-X is again C and C++ at the low level, and Objective-C higher up. All of this is not coincidence.

          • by conspirator57 (1123519) on Friday July 23, 2010 @05:10PM (#33007824)

            That's exactly the point... it's too close to the hardware. Yes, it gives you really fine-grained control over what happens, and you can tweak it to make it as fast as possible. With the speed of today's computers, though, you shouldn't (usually) need that amount of optimization. Plus, the compiler should be robust enough to optimize the program nearly as well as you could anyway.

            umm... did you miss the part where the guy also bitched that interpreted languages are "too slow"?

            so which is it? where on this stone are you going to squeeze the blood from? it's a tradeoff and the menu of available programming language choices is already comprehensive. this guy expresses it better and more comprehensively than i care to in a /. comment:

            http://eatthedots.blogspot.com/2008/07/why-is-c-faster-than-python.html [blogspot.com]

            and compiler research has only yielded 4% annual improvement in performance per Proebsting's law
            http://research.microsoft.com/en-us/um/people/toddpro/papers/law.htm [microsoft.com]
            http://www.cs.umd.edu/class/spring2006/cmsc430/lec18.4p.pdf [umd.edu]

            and compiler researchers concede that a competent human will outperform a compiler for the foreseeable future. so your statement about compilers is total hand-waving away of facts inconvenient to your argument.

          • by Like2Byte (542992) <.Like2Byte. .at. .yahoo.com.> on Friday July 23, 2010 @05:13PM (#33007856) Homepage

            This is exactly the reason I program in C and C++. Because it is hard. The level of knowledge required for entry into my field is higher and I am therefore surrounded by more competent engineers.

            Anyone complaining that C/C++ is too hard needs to stay in GUI application and web development. Have fun, I say.

        • by fbjon (692006) on Friday July 23, 2010 @04:19PM (#33007224) Homepage Journal
          You seem confused. He said C++ is complex, not C, and he is entirely right. Also, if you used to do Perl, you might like Ruby. It's no faster than Python, but I find it nicer.
        • by rolfwind (528248) on Friday July 23, 2010 @04:28PM (#33007334)

          "Greenspun's Tenth Rule of Programming: any sufficiently complicated C or Fortran program contains an ad hoc informally-specified bug-ridden slow implementation of half of Common Lisp."

        • by bbn (172659) <baldur.norddahl@gmail.com> on Friday July 23, 2010 @04:32PM (#33007372)

          This is because a C fragment turns into something very efficient

          No, it _can_ turn into something very efficient. But usually it is too much bother, or the programmer is not competent enough, and the program ends up no more efficient than something coded in a different language.

          Take a look at the programming language shootout. The C programs usually win the contest, but they do it by doing crazy things like looking up the CPU's cache size and implementing their own version of malloc to fit the page size. The run-of-the-mill C program is not like that.

          For many or even most tasks, stability and security are more important. Other languages provide those properties better than C does.

        • by Ex-MislTech (557759) on Friday July 23, 2010 @04:43PM (#33007504)

          You got modded flamebait because you dared question the less-is-more crowd that wants higher-level languages to do most of the lower-level work for them.

          The people who coded closer to the metal, as you say, mostly have grey hair now, and are looked at as legacy coders by the new breed who want all that lower-level coding done for them by the language.

          As for it being complex, sure, it is complex for most people, but for a college-trained Comp Sci major it should be bread and butter, the tools of the trade.

          People have declared we are delivering a dumbed-down education, and I think this is a fair indicator of it.

        • by mangu (126918) on Friday July 23, 2010 @05:28PM (#33008004)

          I must say I rarely find a comment on /. that I agree as much as I do with yours.

          C and Python march hand in hand: one is for machine performance, the other for programmer performance. If someone thinks C is too complex or too hard to learn, then he shouldn't be programming computers at all; he's likely to cause great damage sooner or later.

          However, there's one point where C will need a new approach: multiprocessing is coming. Since it seems like Moore's law has hit the ceiling at 3 GHz CPU speeds, all progress in performance for the foreseeable future will come from increasing the number of CPUs and cores working together.

          I have done a lot of programming in multithreads using the pthread library lately and I feel that something better is needed, pthread is not close enough to the metal. I think some new fundamental elements may be needed in the language.

          C is so great for programming because it mirrors the hardware closely. For instance, pointers work so well because they represent memory addresses. Before I learned C I had worked with Fortran, I still have some programs I wrote over 25 years ago. Today I look at those old Fortran programs and I wonder why I did some things the way I did. I see some convoluted loops and wonder why I did that because, with a quarter century hindsight on using pointers, I create almost instinctively the most efficient set of pointers to handle a data structure.

          What programmers often don't realize is that the correct data structure may get orders of magnitude improvement in performance. To give one example, years ago, when I studied artificial neural networks, I read an article in Dr. Dobb's Journal (January 1989, page 32, "Neural Networks and Noise Filtering" by Casey Klimasaukas). It was a good article, but the source code in C that came with it sucked. There was a struct _pe representing a processing element, and each struct _pe had an array of struct _conn representing the connections to that element.

          The problem is that in an artificial neural network what each neuron is doing is, basically, a convolution of two arrays. To do that efficiently in hardware you need to have the array elements contiguous in memory. When you put the connection weight in a structure together with other data you will not have that value contiguous with the weights of the other connections.

          From an "object oriented" point of view that program was perfect. But if you want to use your multi-core CPU with that, the program sucks. That's the benefit you can get from programming in C that you won't get with other languages.

          And don't tell me that raw performance does not matter because you can always get faster hardware. CPU clock speed has stopped at 3 GHz, we must learn to use our multicores if we want to evolve from now on.

        • The vast bulk of your argument is dedicated to defending C, which he never mentioned, and whose core language rules only an imbecile would call complex.

          C++, which you barely mention, on the other hand, is extremely complex at its core, mostly due to its templating language--which, by the way, is one of the very best features of C++, and although it could be (and will be in C++0x (0x more like 2x amirite)) done better, I think the expressiveness is well worth the complexity. Again, only an imbecile would call C++ simple.

          The real question to ask when somebody chooses to bash C++, however, is "what are you selling?" In this case, he's fairly up-front about it--he's selling Go, Google's pet language. To that, I say, "I'll believe it when I see it." Right now Go is a bunch of "but just look at the groundwork we've laid down! This will be great when it's done!" As the article mentioned, that may be true, but its complexity will approach C++ as its capabilities do. It's just hard to do a lot of stuff well, and for all the complaining, C++ does do a lot of stuff, and it does most of it well, or at least ... eh, pretty good.

          Honestly, look over the C++ challengers: every one I've ever seen that's squarely aimed at taking out other languages is little more than an ego project. I have yet to see a language that actually introduces new capabilities without significant downsides compared to what they're trying to replace, so I'll keep doing what's worked for me the last 10 years I've been a C++ fanboy: wait for something that's actually better to come out. Smugly.

        • by sjames (1099) on Friday July 23, 2010 @06:31PM (#33008840) Homepage

          C is very much a relevant language with a strong niche for modern use. It's very good at what it does. It is a mid-level language and it shines at that sort of use.

          Note that he was complaining specifically about C++, not C. It's not even necessarily the language itself, but the patterns and APIs that are inspired by the language that matter.

          When you need a higher level language than C, C++ is NOT a great answer. It's a series of bolt-ons for C that try to make it something it is not.

          Meanwhile, Java started out badly with silly marketing claims it never quite lived up to, still manages to perform poorly in practice, and has been dogpiled with so much alphabet soup that if there is a lean and mean language in there somewhere, we'll never find it.

          So it's not at all C that is too complex, it's those horrific messes we call programs written in C++ that are too complex.

        • by Demena (966987) on Friday July 23, 2010 @08:31PM (#33009988)
          Do you know who you are dissing? Did you read the article? No and no. He never said anything against C; he said C++ was too complex, not C. So you echo him and slag him off over your own misreading at the same time? One of the founders of your profession? Standards dropped? Yep, sure. But whose?
      • Re: (Score:3, Funny)

        by bennomatic (691188)
        My implementation of that is doTheRightThing().
    • by cgenman (325138) on Friday July 23, 2010 @03:29PM (#33006534) Homepage

      slowDownAndCrashSoICanSellAnUpgrade();

    • by MikeyO (99577) on Friday July 23, 2010 @03:47PM (#33006832) Homepage

      You obviously aren't paid by the hour to write java code, or else you'd have come up with something like:

      ThreadFactory.getInstance().setExecutionTarget(new Runnable() { public void run() { doWhatIWant(); } }).addExecutionObserver(ExecutionItemObserverFactory.getInstance()).start()

      Which is much more "enterprise ready" than yours.

      • We have a FoobarFactoryFactory class in the project I'm currently assigned to... yes, it's a factory that creates factories (which in turn create foobars). And the foobars are themselves generic-ish objects which can contain any number of different types of data.

        And they have the nerve to tell me one of the qualities of higher-level devs is that "they tend to make things simpler than entry-level devs".

        • Re: (Score:3, Funny)

          by danieltdp (1287734)
          I just hope, for the sake of your project, that the factory doesn't actually create foobars, but something else whose name you removed from the original
  • by Anonymous Coward on Friday July 23, 2010 @03:19PM (#33006372)
    Segmentation Fault
    • Re: (Score:3, Insightful)

      It's not hard to learn Java, but yes, it is often a complicated mess of shit when you want to use it on the web. Anything where you *have* to generate great swathes of code to keep your sanity is probably an over-engineered shit solution.
  • umm... (Score:4, Insightful)

    by deviator (92787) <bdp@amnesi[ ]rg ['a.o' in gap]> on Friday July 23, 2010 @03:20PM (#33006390) Homepage

    "Efficient" languages are too complex. "Simple" languages are too inefficient.

    Normally I'd write this off as "duh" but this is Rob Pike.

    Oh wait, he's pushing something new that somehow manages to be easy and efficient? OK...

    • Re:umm... (Score:4, Insightful)

      by Bryansix (761547) on Friday July 23, 2010 @03:29PM (#33006526) Homepage
      Right. Basically, what we call "high-level programming languages" are not all that high-level any longer, and the compiler is getting off without doing enough work. The language should be simple, and the "hard work" should be done by the compiler figuring out the most efficient way to accomplish what it is being told to do.
      • Re: (Score:3, Interesting)

        by FooAtWFU (699187)

        I'm no Robert Pike, but I imagine that computer programming as we move to thousands and millions of cores will consist less of telling the computer how to do something, and more of telling the computer what you want to do and having a really smart compiler figure out the details. The more low-level you go, the less chance the compiler has of figuring out what you're trying to do and making it work effectively (and do crazy optimizations like speculative out-of-order execution, and what-have-you).

        But this means that the programming languages of the future will be less imperative and more functional.

        • Re:umm... (Score:5, Insightful)

          by ultranova (717540) on Friday July 23, 2010 @04:58PM (#33007704)

          But this means that the programming languages of the future will be less imperative and more functional.

          No, it means that the programming languages of the future will be subjective: the computer will interpret your commands in the light of whatever other data it has. This, of course, requires artificial intelligence, and it slowly but surely phases out the whole job of programming as a skill separate from commanding people.

          In other words, the ultimate programming languages of the future will be known as English, Chinese, etc.

  • Missing context... (Score:5, Insightful)

    by Akido37 (1473009) on Friday July 23, 2010 @03:21PM (#33006396)

    Pike detailed the shortcomings of such languages as a way of describing the goals that he and other Google engineers have for a new programming language they developed, called Go.

    Oh, so he's pushing a competing product and denigrating his competition? Nothing to see here, I think.

  • by swanzilla (1458281) on Friday July 23, 2010 @03:22PM (#33006416) Homepage
    I'd like to hear what he thinks about Perl.
    • by DG (989) on Friday July 23, 2010 @03:38PM (#33006686) Homepage Journal

      I actually think that perl is the best programming language ever designed.

      (Waits for storms of laughter to subside)

      No, really, I'm completely serious. perl is the English of programming languages. It takes the most useful parts of everything and mixes them all together into a useful conglomerate.

      Much the same way you can use English to write a scientific dissertation, a sonnet (in full Billy S mode), or an "O RLY?", perl can be as descriptive and formal or as loose and unbounded as the programmer chooses, and it all JUST WORKS!

      I **loathe** "bondage and discipline" languages that force me to think and write a certain way just because some would-be language guru thinks HIS way is the One True Path to enlightenment. perl gives me an expressive, more-than-one-way-to-do-it language that lets me think and work the way that best fits the problem at hand.

      I have written enterprise-level perl code optimised for long-term maintainability and reliability (an LDAP server replication program that did schema translation). And I have written 5-second hacks that solved an immediate problem quickly and efficiently. perl lets me do this. No other language I've used matches perl's sheer versatility.

      I love perl!

      And I'm not at all ashamed to admit it.

      DG

      • Re:I LOVE perl! (Score:5, Insightful)

        by Monkeedude1212 (1560403) on Friday July 23, 2010 @04:03PM (#33007022) Journal

        I actually think that perl is the best programming language ever designed[...] perl is the English of programming languages.

        You went on to describe how Perl is great, but just so you know: every one of those reasons you listed is why every multilingual person on the planet hates English. It's a pain in the ass to learn because there are too many exceptions to the rules, or the rules aren't well defined. Look no further than pluralization. Add an S, in most cases. Oh, but if it ends in a y, make it 'ies', like skies. And for some words that end in sh or ch or x, it's 'es', like wrenches. Oh, and for goose, it's geese. But the plural of moose is not meese; in fact, it's just moose, not even mooses.

        We won't bother getting into contractions or prefixes/suffixes or any of the really gritty stuff. English itself is a pain, let alone the many variants of it across the Earth, each with its own formal, informal, and slang terminology.

        So yeah, while the flexibility that makes Perl accessible to more programming styles is good for you, it's still a pain to learn, and that's one of the reasons people are put off by it. Without a standardized way of doing things, it's difficult to understand exactly what's going on. Some of the most obfuscated code I've ever seen has been written in Perl.

        • Re:I LOVE perl! (Score:4, Insightful)

          by Culture20 (968837) on Friday July 23, 2010 @05:20PM (#33007916)

          English. It's a pain in the ass to learn because there are too many exceptions to the rules or the rules aren't well defined. Look no further than pluralization. Add an S, in most cases. Oh, but if it ends in a y, make it 'ies', like skies.

          Unless the word is Monkey, then the plural is Monkeys.

          The nice part about the English language is that someone can speak pidgin English and English speakers understand them very well. All those exceptions teach native speakers to be accepting of lingual oddities. In some other languages, an incorrect verb tense can make the difference between talking about "you" or "she", and incorrect tonality can make a foreigner sound like a stroke victim.

    • by mrogers (85392) on Friday July 23, 2010 @03:39PM (#33006710)
      I'd like to hear what he thinks about Perl.

      Since his talk had no discernible structure, said the same thing in a dozen different ways and won't make any sense this time next year, I'd assume he's a fan.

  • by Culture20 (968837) on Friday July 23, 2010 @03:23PM (#33006444)
    English estas too malmola! Paroli en Esperanto!
  • And...? (Score:4, Insightful)

    by arth1 (260657) on Friday July 23, 2010 @03:23PM (#33006446) Homepage Journal

    And where is the news here?

    Picking the right tool for the job doesn't just cut down half the work time, but can help offset what sloppy workers do to destroy quality.

    C++, Java, perl, C, Forth, and sh are all different languages, well suited to different jobs. But when all you have is a nailgun (i.e. all you are fluent in is a single language), every project starts looking like a nailgun job, including your own foot.

  • by bigsexyjoe (581721) on Friday July 23, 2010 @03:23PM (#33006452)
    Google Engineer promotes Google language Go and claims it addresses weaknesses of existing languages, including Java and C++.
    • Re: (Score:3, Interesting)

      by jekewa (751500)

      I'll start using Go as soon as Google makes a browser-based development environment for it, a la Google Documents meets Bespin, that makes something I can then deploy to servers other than the one running the development environment...

    • Re: (Score:3, Interesting)

      by slasho81 (455509)
      Rich Hickey talked about incidental complexity in his keynote talk at the JVM Languages Summit 2009: http://www.infoq.com/presentations/Are-We-There-Yet-Rich-Hickey [infoq.com]
      It's worth watching.
      If Pike thinks the Go language solves anything, he should probably watch this talk too.
  • by Black-Man (198831) on Friday July 23, 2010 @03:24PM (#33006454)

    This guy has a lot of nerve telling other folks what programming language to use.

  • by drewhk (1744562) on Friday July 23, 2010 @03:24PM (#33006470)

    Bash Java, and C++ devs will agree. Bash C++, and Java devs will agree.

    Now you've bashed both of the languages that probably have the most devs. Except for some dynamic languages, of course (PHP and JS come to mind).
    Oh, you insulted them, too.

    OMG...

  • by John Whitley (6067) on Friday July 23, 2010 @03:25PM (#33006484) Homepage

    How about "Rob Pike Decries Complexity of Java, C++" instead?

    |Rob Pike| >> |Google Engineer|

  • Summary: (Score:5, Interesting)

    by IICV (652597) on Friday July 23, 2010 @03:25PM (#33006488)

    Google distinguished engineer Rob Pike ripped the use of Java and C++ during his keynote at OSCON, saying that these 'industrial programming languages' are way too complex and not adequately suited for today's computing environments. ... Pike also spoke out against the performance of interpreted languages and dynamic typing. ... "Go is an attempt to combine the safety and performance of statically typed languages with the convenience and fun of dynamically typed interpretative languages," [Pike] said

    Shorter Rob Pike: all those other languages suck, but the one I invented rocks. It's elegant and simple just like Lisp was back in the sixties!

    I'm reminded of this blog [scienceblogs.com] post I read, where the author described it as "The Hurricane Lantern Effect". You look at someone else performing a task, and you think "geez, what an idiot! I can do it better in ten different ways!".

    Then they hand the task off to you, and you slowly realize that each of your ten improvements isn't actually any better.

    I bet you that if it's still around in ten years, someone else will decry Go 10.0 as being a "bureaucratic programming language".

    • Re:Summary: (Score:4, Insightful)

      by BitZtream (692029) on Friday July 23, 2010 @03:38PM (#33006666)

      What's better is if you take a look at his history of "inventions": you find one or two things that eventually, with the help of others, turned into something that other people use.

      His personal list of inventions looks like a list of "things no one gives a shit about".

      His list of Wikipedia quotes is golden. I think there was one on the list that didn't make him look like a total douche.

      He's one of those guys who thinks everything sucks except what he's made... unfortunately, the entire rest of the world feels pretty much the exact opposite.

      Looking at his history, I don't think he'll ever say anything bad about Go, he'll just continue thinking it was perfect and that it failed because everyone else wasn't up to the task of using it.

  • by Brett Buck (811747) on Friday July 23, 2010 @03:27PM (#33006498)

    These sorts of languages (and the underlying religious cults they bring with them) are probably appropriate for some uses. But what I see done in my life-critical real-time processor applications borders on criminal. Data hiding? How the f'ing hell do I check what is going on down to the bit level if some twit is determined to "hide the data"? This is particularly apt right now, because we are adding a feature to our code that was almost trivial to add to our FORTRAN simulations, and because of the "cult of classes" C++ programming it's damn near impossible in the final product, and completely impossible to look at and tell what the heck it's doing. Trying to test it like a black box is never going to get to the level we need.

    We started having peer reviews of the code, and my colleagues and I are the designers of the system, so we would hypothetically need to sign off on it. We went for two hours to get 10 lines into it; no one could explain how it was working, just that we should "trust the compiler". That didn't fly with us, so the solution was to *not have us present at the peer reviews*, since we were "disruptive".

    What we need is someone who can write straightforward procedural code, but no one seems to be willing or able to do it any more. It has all the features of a cult or a secret society: even when you get someone to understand and agree, they won't deviate from their dogma.

    • by ultranova (717540) on Friday July 23, 2010 @06:37PM (#33008912)

      But what I see done in my life-critical real-time processor applications borders on criminal. Data hiding? How the f'ing hell do I check what is going on to the bit level is some twit determined to "hide the data".

      You read the code of the class that the data belongs to. You can be sure that what you read is the only thing going on because no other code can do anything to the data since it's hidden from it.

      This is particularly apt right now, because we are adding a feature to our code that was almost trivial to add to our FORTRAN simulations, and because of the "cult of classes" C++ programming it's damn near impossible in the final product, and completely impossible to look at and tell what the heck it's doing.

      Well, high-level languages generally make it impossible to figure out what's really going on behind the scenes. That's intentional: abstracting away details is the whole idea of a programming language.

      C++ is particularly bad here because it mixes high- and low-level abstractions and allows you to redefine basic operations (such as assignment, +, etc.). Combine that with manual memory management and a lack of bounds checking, and you have a rather explosive combination.

      What we need is someone who can write straightforward procedural code, but no one seems to be willing or able to do it any more.

      We went for two hours to get 10 lines into it, no one could explain how it was working but that we should just "trust the compiler".

      If you are using a compiler, you have the choice of either trusting it, or inspecting the machine code it generates by yourself, which is harder than simply writing the damn thing in assembly to begin with, and thus defeats the whole point of using a compiler.

      If you want straightforward procedural code, use C. Using C++ for procedural code is pointless, and simply adds unneeded complications.

      It has all the features of a cult or a secret society, even when you get someone to understand and agree, they won't deviate from their dogma.

      It could simply be that they disagree with you. Your earlier bit about data hiding making it more difficult to figure out what's going on makes it seem that you don't understand the idea of object-oriented programming, so of course it would seem like a "cult" to you.

      It could also be that you're trolling. In that case: bravo sir, you truly have the art down.

  • News Flash (Score:3, Funny)

    by eclectro (227083) on Friday July 23, 2010 @03:29PM (#33006532)

    Rob Pike likes to program in Forth [wikipedia.org] in his spare time.

  • Slashdot Interview (Score:5, Informative)

    by Jodka (520060) on Friday July 23, 2010 @03:31PM (#33006580)

    Slashdot previously interviewed [slashdot.org] Rob Pike.

  • by Cyberax (705495) on Friday July 23, 2010 @03:32PM (#33006590)

    Go has the same problems. They try to make it 'simpler' but along the way they actually make it more complex.

    For example, the try-catch-finally idiom is an easy and standard way to deal with exceptions. But no, they had to invent their own half-assed implementation just to be 'minimal'.

    Also, they insist on using fucking _return_ _codes_ to indicate errors. WTF? It only makes code more complex because of tons of stupid 'if error' statements.

    Personally, I like Rust's ( http://wiki.github.com/graydon/rust/project-faq [github.com] ) design more. At least, it has some new features.

    • by owlstead (636356) on Friday July 23, 2010 @04:02PM (#33007000)

      Thank you, I do agree. I was about to write to the authors of Go, but I thought better of it: simply because I cannot see Go go anywhere.

      Basically, they do really weird things:
        - no exceptions
        - half-assed immutability concepts
        - focus on compile time (compile time? really? yes really!)
        - no modularization system (it's like the micro-kernel vs mono-kernel fight all over)

      It's got some good ideas that make it interesting for small, fast, secure applications, but not enough of them to be compelling. I could see it technically making some headway for small monolithic kernels. But their market placement is lacking to the point that it is non-existent.

  • by ThoughtMonster (1602047) on Friday July 23, 2010 @03:37PM (#33006660) Homepage

    You could at least mention that Rob Pike had a large part in designing Plan 9, a programming language called Limbo, and oh, UTF-8, and that by "he and other Google engineers", TFA means Ken Thompson, who created B (a predecessor to C) and had a part in creating an operating system called Unix.

    These two people are about as close to "computer scientists" as it gets, and I'd wager they know quite a lot about programming language design. Pike is known for his feelings towards programming languages like C++.

    Rob Pike gave a talk about Go and programming language design that makes some interesting points. It's available on youtube [youtube.com].

    • by Bright Apollo (988736) on Friday July 23, 2010 @04:51PM (#33007624) Journal

      Pike and Thompson are not computer scientists, they are practitioners. The difference between Thompson's contributions and Knuth's contributions, for example, illustrates this exact point.

      --#

  • by Call Me Black Cloud (616282) on Friday July 23, 2010 @03:39PM (#33006690)
    How many words are in English? A lot. (According to the OED folks [oxforddictionaries.com], "The Second Edition of the 20-volume Oxford English Dictionary contains full entries for 171,476 words in current use, and 47,156 obsolete words. To this may be added around 9,500 derivative words included as subentries.") How many words does the average native English speaker know? According to this random website [englishenglish.com], 12,000-20,000 words. So English is complex, yet just 10% of the language meets a native speaker's needs (less than that, as we don't use all the words we know in normal conversation... except my wife when she's mad at me, then I hear every word she knows, many repeatedly)

    So Java is complex. C++ is complex. I program in Java for my daily bread and I certainly don't use the entire language. It's only as complex as I need it to be. The complexity of my code is driven by what I'm trying to do, not by the language itself. And for code maintainability, I try to keep things as simple as possible.
  • by oxide7 (1013325) on Friday July 23, 2010 @03:39PM (#33006694)
    It's been a long time since I've coded in a professional environment, but I feel that having learned C++ you can learn any other language. It is complicated and verbose, but it's extremely precise. Imagine having to learn how to manage memory with something like PHP. Actually -- the new generation DOESN'T LEARN C++, and that's why code is getting so sloppy now. There are easier languages, sure, and using them can be fine, but if that's all you know, then you don't really know what your code is doing.
  • Programming is Hard (Score:3, Interesting)

    by CSHARP123 (904951) on Friday July 23, 2010 @03:41PM (#33006734)
    I RTFA; really, all he is saying is programming is Hard. Well duh, I am sorry to hear that from a Google engineer. Maybe he would be better off using C# or VB.Net. Welcome to programming.
  • by Mad-Bassist (944409) on Friday July 23, 2010 @03:41PM (#33006744) Homepage

    I kinda miss those days -- easy to learn, with embedded 6502 machine code subroutines to make things move faster.

  • He's just pimping Go (Score:5, Informative)

    by istartedi (132515) on Friday July 23, 2010 @03:41PM (#33006750) Journal

    The summary makes him sound like a whiner with no solution. If you read TFA, you see he's pimping Google's new language, Go. That's perfectly understandable since they pay him; but TFA also points out that languages accumulate cruft over time, and Go is a baby.

  • by Animats (122034) on Friday July 23, 2010 @05:47PM (#33008260) Homepage

    The main problems of the major languages are known, but not widely recognized by many programmers.

    • C Started out with only built-in types, to which a type system was retrofitted. (You have to go back to pre-K&R C documents to see this, but originally, there was just "char", "int", "float", and pointers to them. "struct" was just a set of offsets, with no type checking. You couldn't even use the same field name in two different structs.) Bolting a type system onto this took a long time, and resulted in problems ranging from "array=pointer" to cascading recompilation because "include" files contain implementation details of included modules.

      The killer problems with C today mostly involve lying to the language. "int read(int fd, char* buf, size_t bufl);" is a lie; you're telling the compiler that the function accepts a pointer to a character, while in fact it accepts a reference to an array of char of length "bufl". This lie is the root cause of most buffer overflows. The other big problems with C involve the fact that you have to obsess on who owns what, both for allocation and concurrency locking purposes, yet the language provides no help whatsoever in dealing with those issues.
    • C++ Was supposed to fix the major problems with C. A few bad design decisions in the type system made that hopeless. The underlying problems with arrays remained. An attempt was made to paper that over with the "standard template library" collection classes. Collection classes were a big step forward, but they were really just papering over the moldy type system underneath, and the mold kept coming through the wallpaper. The C++ standards committee keeps adding bells and whistles to the template system, but after ten years they still don't have anything good enough to release.
    • Java Was supposed to fix the major problems with C. Java itself isn't a bad language, but somehow it got buried under a huge pile of libraries of mediocre quality. Then a template system was bolted on top, along with ever more elaborate "packaging" systems. Java ended up as the successor to COBOL, something that surprised its creators.
    • Python Python is an elegant language held back by painfully slow implementations. Some of the implementation speed problems come from the most common implementation, which is a naive (non-optimizing) interpreter, but some of them come from bad design decisions about when to bind. Late-binding languages are not inherently slow, but Python has lookup by name built into the language specification in ways which make it almost impossible to speed up the language as defined. (The Unladen Swallow team is discovering this the hard way; they're getting only marginal speed improvements with their JIT compiler.) Python also addresses concurrency badly; everything is potentially shared and one thread can even patch the code of another. The end result is that only one thread can run at a time in most implementations.
    • JavaScript A painful language which, due to massive efforts to speed it up, is starting to take over in non-browser applications. JavaScript is the object model of Self expressed in syntax somewhat like that of C. This is ugly but adequate.

    And that's where we are today.

  • by hendrikboom (1001110) on Friday July 23, 2010 @07:04PM (#33009228)
    In the 70's, I wrote a parser generator. It was about 1000 lines long, and it took 25 tries to get it past the static checking of the compiler. After that, it ran correctly the first time I got to run it.

    The language it was written in was Algol 68. What contributed to my success was an expressive static type system, and garbage collection. Unless you specifically turned run-time checking off, you could not break the run-time system.

    Oh, and did I mention that the compiler generated low-level efficient code as well?

    But there are few Algol 68 compilers around these days.

    Looking to what *is* available nowadays, have a look at Modula 3 [wikipedia.org]. It's not my favourite style of syntax, but programs written in it tend to run fast and be easy to debug. Again, most of the bugproofing lies in the static checking and garbage collection. And it's a systems language. It has been used for implementing OS kernels and the like, as well as application programs. It's my language of choice at the moment. Get the CM3 implementation. Follow the link in the Wikipedia article.

    Another attractive language is OCaml. I haven't much experience with it, but it seems to share the behaviour I've grown to love with Algol 68 and Modula 3. If anything, though, OCaml does too much automatic type inference. This leads lazy programmers to forget to mention types at many strategic code locations, making the code unnecessarily obscure.
