
Stroustrup Reveals What's New In C++ 11

snydeq writes "Bjarne Stroustrup discusses the latest version of C++, which, although not a major overhaul, offers many small upgrades to appeal to different areas of development. From the interview: 'I like the way move semantics will simplify the way we return large data structures from functions and improve the performance of standard-library types, such as string and vector. People in high-performance areas will appreciate the massive increase in the power of constant expressions (constexpr). Users of the standard library (and some GUI libraries) will probably find lambda expressions the most prominent feature. Everybody will use smaller new features, such as auto (deduce a variable's type from its initializer) and the range-for loop, to simplify code.'"
  • Don't forget initializer lists, variadic templates, non-static data member initializers, finally fixing that vector<vector<int>> (note the >>) parsing thing, rvalue references, nullptr, strongly-typed enums, constructor improvements (holy god we don't have to rewrite every fucking thing every fucking time or split off into an ::init()), user-defined literals which are crazy cool combined with templates and initializer lists, and lots of stuff I'm sure I'm forgetting about.

    Since starting on C#, I've kind of felt like I'm back in the dark ages in C++, even as it remains my favorite language. I've already started using a lot of these improvements, and while C++ still has its rough edges, the improvement in "fun" while coding is massive. No more for (some_container_type<vector<map<int, string> > >::reverse_iterator aargh = instance.rbegin(); aargh != instance.rend(); ++aargh) for me!

  • by swillden ( 191260 ) <> on Friday February 24, 2012 @05:27PM (#39152945) Homepage Journal

    auto means I no longer have to type std::vector<T>::iterator in every for loop

    You didn't have to anyway. You could just use an "int" index to loop over a vector.

    Only if you want to tie yourself to using a vector. Using a proper iterator costs you nothing in code space or execution time (because for a vector it optimizes down to just pointer arithmetic anyway), but means that at some future time you can replace that vector with a different data structure without having to modify the code that operates on it.

  • by Anonymous Coward on Friday February 24, 2012 @05:37PM (#39153085)

    for (auto i : v) { /* use i */ }


    Even in terms of typing time it's a nice addition, ignoring the structure-independence benefits of this sort of thing.

  • by Anonymous Coward on Friday February 24, 2012 @05:41PM (#39153155)

    It's pretty simple why you would use iterators.
    While the data you store might not change much over time, the amount of data stored by successful apps tends to grow alarmingly. Iterators allow the vector to be changed to some other, more suitable structure when the need arises, without having to rewrite large swaths of code.

  • by Anonymous Coward on Friday February 24, 2012 @05:46PM (#39153245)

    It is easy to refute your argument on memory safety and auto with a single line of code:

    auto obj = std::make_shared<SomeType>( arg1, arg2 );

    lambda expressions are naturally assigned to an auto, because the actual closure type is compiler-defined and can't be named (though a std::function can hold one too).

    auto some_callable_type = []( float f ){ return f * f; };

    Concurrency isn't supported? What more do you want, apart from: threads, mutexes, atomics, thread-local storage, a concurrency-safe memory model, futures, promises, async tasks and thread exception transfer?

  • by Tyler Durden ( 136036 ) on Friday February 24, 2012 @05:49PM (#39153289)

    The new features in the standard library are brilliant. Threading has never been easier: std::thread t(foo, x, y); will call foo(x, y) in a new thread. When I want to wait for the thread to finish, I call: t.join(); ... Simple.

    Sure. But it should be noted that this feature (along with many others brought into the new standard, I'm sure) was introduced in the Boost set of libraries first.

  • by genjix ( 959457 ) on Friday February 24, 2012 @06:02PM (#39153439)

    > What C++ compiler are you using?

    g++ 4.6 - standard in Ubuntu

    Two of the features I'm waiting on are class-level non-static initialisers and templated typedefs (alias templates). I've heard Microsoft's C++ compiler has better C++11 support, but I've never tried it.

    Beware that MinGW has a bug, so std::thread is disabled. I've heard mingw-w64 works better. You might also want to try boost::thread (essentially the same library, except std::thread has move semantics).

  • Re:He's optimistic (Score:5, Informative)

    by grumbel ( 592662 ) <> on Friday February 24, 2012 @06:07PM (#39153489) Homepage

    > But C++11 describes a standard that absolutely nobody has ever got anywhere close to, so I don't imagine that there's going to be a lot of drive to adopt it.

    All popular C++ compilers already implement large parts of C++11, so the chance of seeing widespread C++11 adoption in the not-so-distant future is pretty high. This wasn't really any different with C++98, which essentially no compiler supported on release, and which then took a few years to gain widespread adoption.

  • by Anonymous Coward on Friday February 24, 2012 @07:20PM (#39154133)

    tl;dr: "I refuse to keep my skills current, so I will dismiss every development in the field of computer programming that post-dates my pet language(s)".

  • Re:I want auto! (Score:5, Informative)

    by shutdown -p now ( 807394 ) on Friday February 24, 2012 @07:40PM (#39154337) Journal

    "auto" has always been there, since the very first version of C; it just had a different meaning: it marked a variable as having "automatic storage class" (as opposed to "static storage class" etc). Because automatic was the default, it was almost always redundant, but it did have a meaning.

    It actually goes way back to B, which only had a single data type - the machine word. Variable declarations looked somewhat like C, but, for the lack of types, they started with the storage class instead, i.e.:

    main() {
      extern x;
      static y = 1;
      auto z = 2;
    }

    In C, we've got types, so you'd normally write "static int y" for a static local, and just "int z" for an automatic one - "auto" being implied. However, C inherited some of B's semantics as "default int" - i.e. if the declaration is clearly a variable, but it omits the type, assume that type to be "int" (i.e. machine word). So in C the above code snippet from B is actually valid, and declares x, y and z to all be ints.

    Then "auto" got inherited by C++, which dropped the "default int", making auto completely redundant - you couldn't write "auto x" in C++ anymore, and in all other cases where you could use "auto", like "auto int x = 123", it was always redundant. So when they appropriated it for type inference in C++11, it was technically a breaking change - it just wasn't ever used by anyone in production code in the old way, so nobody noticed.

  • by RCL ( 891376 ) <rcl,rs,vvg&gmail,com> on Friday February 24, 2012 @11:42PM (#39155821) Homepage

    I think he simply claimed that you have to deal with C++ written by C programmers all too often. That's my experience, too.

    I'm probably one of those C programmers. I use C++ features when I feel they are appropriate, but my definition of "the right thing" has changed over the years. I've started to value custom-tailored solutions, often hardware- and problem-specific. I'm no longer trying to find (or create) code that would fit all (or even most) cases, and micro-managed security is of lesser concern to me.

    I don't see what that has to do with the above. I suppose there are some rare cases where the standard library isn't appropriate, but are you arguing this is an excuse *never* to use it?

    There are always multiple tradeoffs involved, that's why I am very wary of saying "never". STL is no silver bullet, this is what I'm saying.

    E.g. STL is bad at managing memory: it's hard to make it NOT allocate dynamically. Yes, you can write a custom allocator for it, but STL allocators are inflexible and also - for some poorly thought-out reason - manage object construction, not just memory allocation, which makes this unnecessarily hard. STL is a template library, and templates produce a separate implementation for every type used - whereas you can have a single void *-based container for all your POD types, with templates providing just a minimal wrapper. Also, STL makes it hard to control how often memory (object instances) is copied - there's no way to influence its behavior when memory copying tops your profiler results.

    If you are developing primarily for desktop computers which are quite beefy (and also vague in terms of hardware), this may matter less (although it will never cease to matter - just think of all that slow code running on our desktops which eats the performance improvements we still (marginally) get, making desktop performance appear flat since 2004), but if your target is well-defined hardware-wise and if you know/set "upper bounds" for all the practical problem sizes your program is designed to handle, you can think of more optimal solutions for your case. This is one of reasons why you can still play the newest games on 2005 hardware (XBox 360) which hasn't even got an out-of-order CPU, while running the newest desktop software (not games!) on PC of the same era is problematic.

    To sum up, I'm not saying "never use STL", but I would not say "always use STL" either. People claim that using "standard" code helps you create more secure, robust, performant programs - but in my experience, you will end up running a lot of custom tools (all kinds of profilers and validators) anyway before shipping your binary, which makes the point of "performant/secure code by design" moot. When the profiler tells you that some code in an STL container underperforms, reimplementing it - or even understanding why it happens - is harder than fixing/improving your own implementation, yet you'll probably have to do that anyway. And no one starts "from scratch" these days; every "low-level" programmer I know has their own set of STL-alike classes.

    By the way, your reference seems severely dated. Some of its complaints are still valid, but seem based on the state of STL support ten years ago.

    Well, it is perhaps improving with new "move" constructors and whatnot, but that's actually a problem: it's better to have cross-platform code with predictable behavior than code that is supposedly optimal everywhere, but which depends on highly varied implementations. What good is it for me to know that some compilers (e.g. gcc), on some platforms (e.g. x86), handle new, C++11-ready STL well - gcc isn't particularly good at optimizing code even for x86, and another compiler that is a good optimizer may not support the newest features and make my STL-heavy code suck.

    P.S. Also, STL (in the implementations I've seen) is written in a quite unreadable, convoluted style - I always wondered why. This is of course less relevant.

  • by TheRaven64 ( 641858 ) on Saturday February 25, 2012 @10:17AM (#39157991) Journal
    My example is a trivial example. Go and look at some real code. In big applications, we use this pattern in Objective-C all the time. It's a trivial way of constructing localised strings. Now go and look at the mess of template metaprogramming that people use to do the same thing in C++...
  • Re:Uh. No. (Score:4, Informative)

    by Electricity Likes Me ( 1098643 ) on Saturday February 25, 2012 @10:31AM (#39158055)

    Operator overloading addresses the ambiguity of having both by-value and by-reference passing of objects possible, via different semantics, in the language.

    If I have a pointer and write:
    ObjType* pObjRef = &Obj1;
    then it's pretty obvious I have an object reference.

    But what does
    ObjType Obj2 = Obj1;
    actually mean?

    C++ defines this kind of statement as always being a copy operation. But an object is a complex datatype - doing a straight copy of all its memory doesn't necessarily give me sensible behavior. So you need operator overloading to let you enforce sensible behavior.

    That you can also use it to create syntactic sugar or completely illogical behavior doesn't make it bad, though. And absent a garbage collector, I'm not sure it actually makes sense to do what C# does and try to treat all object variables as references (when would you deallocate things?).
