Google Engineer Decries Complexity of Java, C++ 878

Posted by Soulskill
from the keep-it-simple-stupid dept.
snydeq writes "Google distinguished engineer Rob Pike ripped the use of Java and C++ during his keynote at OSCON, saying that these 'industrial programming languages' are way too complex and not adequately suited for today's computing environments. 'I think these languages are too hard to use, too subtle, too intricate. They're far too verbose and their subtlety, intricacy and verbosity seem to be increasing over time. They're oversold, and used far too broadly,' Pike said. 'How do we have stuff like this [get to be] the standard way of computing that is taught in schools and is used in industry? [This sort of programming] is very bureaucratic. Every step must be justified to the compiler.' Pike also spoke out against the performance of interpreted languages and dynamic typing."
  • by Anonymous Coward on Friday July 23, 2010 @03:26PM (#33006490)

    ROFL

  • Slashdot Interview (Score:5, Informative)

    by Jodka (520060) on Friday July 23, 2010 @03:31PM (#33006580)

    Slashdot previously interviewed [slashdot.org] Rob Pike.

  • by Cyberax (705495) on Friday July 23, 2010 @03:32PM (#33006590)

    Go has the same problems. They try to make it 'simpler' but along the way they actually make it more complex.

    For example, the try-catch-finally idiom is an easy and standard way to deal with exceptions. But no, they had to invent their own half-assed implementation just to be 'minimal'.

    Also, they insist on using fucking _return_ _codes_ to indicate errors. WTF? It only makes code more complex because of tons of stupid 'if error' statements.

    Personally, I like Rust's ( http://wiki.github.com/graydon/rust/project-faq [github.com] ) design more. At least, it has some new features.
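    For reference, the error-returning idiom being complained about looks roughly like this in (modern) Go — a minimal sketch with a hypothetical readConfig helper:

    ```go
    package main

    import (
    	"fmt"
    	"os"
    )

    // readConfig illustrates the Go idiom: errors are ordinary return
    // values, checked explicitly with 'if err != nil' at each call site.
    func readConfig(path string) ([]byte, error) {
    	data, err := os.ReadFile(path)
    	if err != nil {
    		return nil, fmt.Errorf("reading config: %v", err)
    	}
    	return data, nil
    }

    func main() {
    	if _, err := readConfig("/no/such/file"); err != nil {
    		fmt.Println("error:", err)
    	}
    }
    ```

    Whether the explicit 'if err != nil' checks are clearer or noisier than try-catch is exactly the disagreement here.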

  • by ThoughtMonster (1602047) on Friday July 23, 2010 @03:37PM (#33006660) Homepage

    You could at least mention that Rob Pike had a large part in designing Plan 9 (an operating system), a programming language called Limbo, and, oh, UTF-8, and that by "he and other Google engineers", TFA means Ken Thompson, who created B (a predecessor to C) and had a part in creating an operating system called Unix.

    These two people are about as close to "computer scientists" as anyone gets, and I'd wager they know quite a lot about programming language design. Pike is well known for his feelings about programming languages like C++.

    Rob Pike gave a talk about Go and programming language design that makes some interesting points. It's available on youtube [youtube.com].

  • He's just pimping Go (Score:5, Informative)

    by istartedi (132515) on Friday July 23, 2010 @03:41PM (#33006750) Journal

    The summary makes him sound like a whiner with no solution. If you read TFA, you see he's pimping Google's new language, Go. That's perfectly understandable, since they pay him; but TFA also points out that languages accumulate cruft over time, and Go is a baby.

  • by Zarel (900479) on Friday July 23, 2010 @03:53PM (#33006914)

    Should have RTFA I guess, I now realize Mr Pike just talks in circles and really didn't have anything of value to say other than 'programming is hard'.

    No, he doesn't. TFA-writer Joab Jackson talks in circles and doesn't have anything of value to say. Mr. Pike, on the other hand, appears to be saying that Google Go fixes a lot of unnecessary complexity in Java and C++.

    His keynote isn't linked from either the Slashdot summary or TFA, but can be seen here: http://www.youtube.com/watch?v=5kj5ApnhPAE [youtube.com]

  • by Muad'Dave (255648) on Friday July 23, 2010 @04:13PM (#33007144) Homepage

    ... "make sure the escape sequences cant be confused with notmal [sic] characters"...

    That was figured out _long_ before UTF-8. In fact, having been born in 1956, Rob Pike was probably exposed to the concepts of Control Codes [wikipedia.org] and bisync [wikipedia.org] as well as all the other framing methods that use escape characters to indicate in-band signalling [wikipedia.org].

    Have you ever wondered what all those control characters are doing wasting space down there under decimal 32? Link management, that's what. Start of Header, Start of Text, End of Text, End of Transmission, etc. They were all used to keep systems in sync across the (nasty) comm lines of the day. Even [XYZ]-modem used a similar setup.
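    The property being referred to — multi-byte sequences that can't be mistaken for ordinary characters — is easy to verify in Go (Pike's own language, with UTF-8 strings built in); a quick sketch:

    ```go
    package main

    import "fmt"

    func main() {
    	s := "é" // U+00E9, encoded in UTF-8 as two bytes: 0xC3 0xA9
    	for i := 0; i < len(s); i++ {
    		b := s[i]
    		// Every byte of a multi-byte UTF-8 sequence has its high bit set,
    		// so it can never collide with a 7-bit ASCII character or with
    		// the control codes below decimal 32 discussed above.
    		fmt.Printf("byte %#x, could be ASCII: %v\n", b, b < 0x80)
    	}
    }
    ```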

  • by fbjon (692006) on Friday July 23, 2010 @04:19PM (#33007224) Homepage Journal
    You seem confused. He said C++ is complex, not C, and he is entirely right. Also, if you used to do Perl, you might like Ruby. It's no faster than Python, but I find it nicer.
  • by vbraga (228124) on Friday July 23, 2010 @04:24PM (#33007280) Journal

    Since I'd never come across this quote before and found it quite insightful, here [technologyreview.com] is a link to the interview where Stroustrup said it, for those like me who didn't know it - it's worth reading.

  • by lgw (121541) on Friday July 23, 2010 @04:26PM (#33007316) Journal

    Yes, you can do it the sane way. But my point was that someone actually thought the verbose way was easier to understand!

  • by vbraga (228124) on Friday July 23, 2010 @04:38PM (#33007442) Journal

    You've probably used UTF-8 in one way or another.

  • by Anonymous Coward on Friday July 23, 2010 @04:40PM (#33007468)

    Good thing Google doesn't have an illegally close relationship with the government [nlpc.org] that benefits the company!

  • by Grishnakh (216268) on Friday July 23, 2010 @04:43PM (#33007502)

    Sorry, but while I definitely know Brian Kernighan, Dennis Ritchie, and Ken Thompson by name (and also Bjarne Stroustrup), I'd never heard of Rob Pike before I read this story. Yes, he's done some noteworthy things according to his Wikipedia bio, but he's just a little below these other guys, so he doesn't quite have the same name recognition they do.

  • by BitZtream (692029) on Friday July 23, 2010 @04:44PM (#33007532)

    I'm aware of who he is, what he does now and what he's done in the past. I've now seen his keynote.

    My opinion hasn't changed. He has nothing to say and talks himself in circles. I'm guessing you just don't see it due to a lack of understanding; of course, maybe I'm the one who doesn't understand.

    Who knows, but I'm going to stick with my original assessment that he's just a blowhard spewing about his latest creation and how everyone else's sucks, because his creation somehow mysteriously fixes a problem that no one else has.

    You go listen to what he has to say; I'll continue getting things done while you go play with a new language because it's 'better', until you realize that it's exactly the same as all the others.

    When you start telling me that the language is the problem I realize instantly that you aren't that great of a programmer. My one exception to this is Visual Basic (not BASIC, VB specifically). It is a shitty environment because of the shit support library MS made for it.

  • by nschubach (922175) on Friday July 23, 2010 @05:01PM (#33007716) Journal

    I think you'd be amazed at how much some of the world's companies rely on Excel macros.

  • Re:And...? (Score:3, Informative)

    by jd (1658) <imipak@@@yahoo...com> on Friday July 23, 2010 @05:01PM (#33007720) Homepage Journal

    Forth is extremely good for hardware control. It seems to have lost the edge it used to have, but once upon a time Forth was dominant in radio astronomy, self-contained robots, firmware, scientific lab equipment, etc.

  • Re:objective C (Score:5, Informative)

    by SteeldrivingJon (842919) on Friday July 23, 2010 @05:04PM (#33007752) Homepage Journal

    Objective-C is definitely simpler than C++. A little complexity is creeping in now, especially with Blocks. But overall it adds very little to C.

  • by conspirator57 (1123519) on Friday July 23, 2010 @05:10PM (#33007824)

    That's exactly the point... it's too close to the hardware. Yes, it gives you really fine-grained control over what happens, and you can tweak it to make it as fast as possible. With the speed of today's computers, though, you shouldn't (usually) need that amount of optimization. Plus, the compiler should be robust enough to optimize the program nearly as well as you could anyway.

    umm... did you miss the part where the guy also bitched that interpreted languages are "too slow"?

    so which is it? where on this stone are you going to squeeze the blood from? it's a tradeoff and the menu of available programming language choices is already comprehensive. this guy expresses it better and more comprehensively than i care to in a /. comment:

    http://eatthedots.blogspot.com/2008/07/why-is-c-faster-than-python.html [blogspot.com]

    and compiler research has only yielded 4% annual improvement in performance per Proebsting's law
    http://research.microsoft.com/en-us/um/people/toddpro/papers/law.htm [microsoft.com]
    http://www.cs.umd.edu/class/spring2006/cmsc430/lec18.4p.pdf [umd.edu]

    and compiler researchers concede that a competent human will outperform a compiler for the foreseeable future. so your statement about compilers is total hand-waving away of facts inconvenient to your argument.

  • by mangu (126918) on Friday July 23, 2010 @05:28PM (#33008004)

    I must say I rarely find a comment on /. that I agree as much as I do with yours.

    C and Python march hand in hand: one is for machine performance, the other for programmer performance. If someone thinks C is too complex or too hard to learn, then he shouldn't be programming computers; he's likely to cause great damage sooner or later.

    However, there's one point where C will need a new approach: multiprocessing is coming. Since CPU clock speeds seem to have hit a ceiling around 3 GHz, all progress in performance for the foreseeable future will come from increasing the number of CPUs and cores working together.

    I have done a lot of programming in multithreads using the pthread library lately and I feel that something better is needed, pthread is not close enough to the metal. I think some new fundamental elements may be needed in the language.

    C is so great for programming because it mirrors the hardware closely. For instance, pointers work so well because they represent memory addresses. Before I learned C I had worked with Fortran, and I still have some programs I wrote over 25 years ago. Today I look at those old Fortran programs and wonder why I did some things the way I did. I see some convoluted loops and wonder why, because with a quarter century of hindsight on using pointers, I almost instinctively create the most efficient set of pointers to handle a data structure.

    What programmers often don't realize is that the correct data structure may get you orders of magnitude improvement in performance. To give one example: years ago, when I studied artificial neural networks, I read an article in Dr. Dobb's Journal (January 1989, page 32, "Neural Networks and Noise Filtering" by Casey Klimasaukas). It was a good article, but the C source code that came with it sucked. There was a struct _pe representing a processing element, and each struct _pe had an array of struct _conn representing the connections to that element.

    The problem is that in an artificial neural network what each neuron is doing is, basically, a convolution of two arrays. To do that efficiently in hardware you need to have the array elements contiguous in memory. When you put the connection weight in a structure together with other data you will not have that value contiguous with the weights of the other connections.

    From an "object oriented" point of view that program was perfect. But if you want to use your multi-core CPU with that, the program sucks. That's the benefit you can get from programming in C that you won't get with other languages.

    And don't tell me that raw performance doesn't matter because you can always get faster hardware. CPU clock speeds have stalled around 3 GHz; we must learn to use our multiple cores if we want to keep improving.
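    For what it's worth, the data-layout point carries over to Go as well, since Go slices are contiguous in memory: keep each processing element's weights in one slice so the inner product walks memory sequentially. A hypothetical PE type, just to illustrate:

    ```go
    package main

    import "fmt"

    // PE holds all connection weights of one processing element in a single
    // contiguous slice (structure-of-arrays style), rather than scattering
    // each weight inside a per-connection struct (array-of-structs style).
    type PE struct {
    	weights []float64
    }

    // activate computes the dot product of inputs and weights; the inner
    // loop reads both slices sequentially, which is cache-friendly.
    func (p *PE) activate(inputs []float64) float64 {
    	sum := 0.0
    	for i, w := range p.weights {
    		sum += w * inputs[i]
    	}
    	return sum
    }

    func main() {
    	pe := PE{weights: []float64{0.5, -1.0, 2.0}}
    	fmt.Println(pe.activate([]float64{1, 2, 3})) // 0.5 - 2 + 6 = 4.5
    }
    ```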

  • by ciggieposeur (715798) on Friday July 23, 2010 @05:38PM (#33008118)

    C++ is turning into a bloated slow fat pig and I'm thinking of getting a divorce.

    Check out D sometime, you might really like it.

  • by CosmeticLobotamy (155360) on Friday July 23, 2010 @05:50PM (#33008310)

    "WriteLine()" makes perfect sense in the context of the Console class when paired with its newline-less counterpart, Write(), and is spelled wr[down-arrow][enter]. Four keys. Intellisense (or whatever its equivalent in your preferred IDE is). Use it. Quit inflicting functions like "wrtln" on the rest of us.

  • by Bigjeff5 (1143585) on Friday July 23, 2010 @06:00PM (#33008440)

    He's speaking out against C++, which is hella complicated, not C, which is pretty simple to work with (even though working so close to the hardware can itself be complicated).

    There is a big difference between the two - mainly that C++ is an ultra-extended, tricked-to-the-nines version of C.

    This means you deal with all of the low level stuff that C deals with, but you've got all sorts of other shit to remember on top of it.

  • by Klinky (636952) on Friday July 23, 2010 @06:10PM (#33008572)

    You know, nothing pisses me off more than people responding to a post saying only "THIS". Please, keep it to yourself if that's all you can contribute. I feel bad for the wasted bandwidth and computing power that was exhausted on that little brain fart of a post.

  • by yyxx (1812612) on Friday July 23, 2010 @08:08PM (#33009792)

    And what do you think the difference between "an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp" and an actual Common Lisp implementation would be?

  • Re:Summary: (Score:3, Informative)

    by yyxx (1812612) on Friday July 23, 2010 @08:30PM (#33009972)

    I think UTF-8 and his two books were pretty influential. BLIT and Plan 9 were pretty significant research systems. And as a member of the original UNIX group, I think his opinion on UNIX and C carries some weight (and is probably shared by many of the original UNIX developers).

    As a language, Go actually looks pretty nice to me. The trouble with Go is the same trouble nice new languages have always had: lack of critical mass. If Google starts using Go for Appspot, Android, and internal development, it may have a chance.

    Just out of curiosity: what have you accomplished?

  • by shutdown -p now (807394) on Friday July 23, 2010 @08:55PM (#33010140) Journal

    C++ has RTTI, but it is frequently (often?) disabled.

    RTTI is actually not disabled all that often, since doing so typically also kills dynamic_cast, and that can be very useful at times.

    But then C++ "RTTI" is a misnomer. I mean, it only lets you inspect the name of the type, and even then the precise content of that string is implementation-defined, so you're only guaranteed that it's unique among all types! There's no member reflection whatsoever, and you can't create a new instance from a type_info.

    The thing is, once you have a system running around maintaining type data on all your objects in memory, you are halfway to having a virtual machine

    No, not really. You only have a bunch of static arrays filled with data. It is completely orthogonal to the idea of VM.

    By the way, have you noticed how many C++ frameworks roll out their own RTTI using macros or preprocessors?

    Reflection is such a perf hit that either you want it or you don't, and if reflection is REALLY vital to your code, you might as well run on a fully featured VM; the rest of the VM isn't going to weigh you down too much more.

    There are very few cases where reflection is used all the time, even in a VM-based platform such as Java or .NET. More often than not, it's used once to initialize some complex object graph in a generic way (e.g. read window layout from XML), and after that you're back to good old statically typed code.

    Another case is not really run-time - rather, it's the ability of some visual designer (or other tool) to list types in the assembly, their members etc, so that it can show those neat property grids, or generate code, based on that information. COM has that kind of thing - type libraries - which are completely useless at runtime (unless a specific object implements IDispatch for dynamic invokes), but are used a lot at design-time.
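    For contrast with C++'s name-only type_info, a compiled language without a VM can still carry member-level metadata; Go's reflect package, for instance, can enumerate struct fields and their tags at runtime. A minimal sketch with a hypothetical Window type, matching the initialize-once pattern described above:

    ```go
    package main

    import (
    	"fmt"
    	"reflect"
    )

    // Window is a hypothetical type whose layout metadata lives in struct tags,
    // the sort of thing a designer or loader would read once at startup.
    type Window struct {
    	Title string `layout:"title"`
    	Width int    `layout:"width"`
    }

    func main() {
    	t := reflect.TypeOf(Window{})
    	// Member reflection: names, types, and tags are all available,
    	// unlike C++ RTTI, which only guarantees a unique type name.
    	for i := 0; i < t.NumField(); i++ {
    		f := t.Field(i)
    		fmt.Printf("%s %s tag=%q\n", f.Name, f.Type, f.Tag.Get("layout"))
    	}
    }
    ```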

  • Re:I LOVE perl! (Score:4, Informative)

    by shutdown -p now (807394) on Friday July 23, 2010 @09:35PM (#33010456) Journal

    The words are short and simple, you don't have to worry about silly things like word gender, etc.

    Yeah, you only have to worry about other silly things, such as learning seemingly endless tables of non-standard plural and past tense forms by rote, or understanding just what the hell perfect tense is about.

    English is certainly not the hardest language out there, but it's also far from the easiest.

  • by woodsbury (1581559) on Friday July 23, 2010 @09:47PM (#33010532)
    In case anybody doesn't know, the new standard for C that is currently being planned includes multithreading support in the form of a threads.h header.

    http://en.wikipedia.org/wiki/C1x [wikipedia.org]

    I believe the newest GCC already includes support for some of that standard's features (which ones, I can't remember).
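    For comparison with a library header like threads.h, the concurrency support Go builds into the language itself (goroutines plus the standard sync package) looks roughly like this minimal sketch:

    ```go
    package main

    import (
    	"fmt"
    	"sync"
    )

    func main() {
    	var wg sync.WaitGroup
    	results := make([]int, 4)
    	for i := 0; i < 4; i++ {
    		wg.Add(1)
    		go func(n int) { // one goroutine per independent chunk of work
    			defer wg.Done()
    			results[n] = n * n // each goroutine writes its own slot, so no race
    		}(i)
    	}
    	wg.Wait() // block until all goroutines have finished
    	fmt.Println(results) // [0 1 4 9]
    }
    ```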
  • Re:I LOVE perl! (Score:1, Informative)

    by Anonymous Coward on Saturday July 24, 2010 @01:38AM (#33011584)

    To be fair, pluralization is not that important to get perfectly correct. Most English speakers (myself included) make occasional mistakes with the vast array of special cases. Typically people will still understand you.
