Bjarne Stroustrup Releases 168-Page Paper on How C++ Thrived (acm.org)
Bjarne Stroustrup, the 69-year-old Danish creator of C++, just released a 168-page paper (published under a Creative Commons Attribution-NoDerivatives license) in the Proceedings of the ACM on Programming Languages, detailing the growth of C++ from its 21st birthday in 2006 up through the year 2020.
It begins by noting that by 2006, C++ "contained parts that had survived unchanged since introduced into C in the early 1970s as well as features that were novel in the early 2000s..." Originally, I designed C++ to answer the question "How do you directly manipulate hardware and also support efficient high-level abstraction?" Over the years, C++ has grown from a relatively simple solution based on a combination of facilities from the C and Simula languages aimed at systems programming on 1980s computers to a far more complex and effective tool for an extraordinary range of applications... [T]his is also the story of the people involved in the evolution of C++, the way they perceived the challenges, interpreted the constraints on solutions, organized their work, and resolved their inevitable differences.
From the abstract: From 2006 to 2020, the C++ developer community grew from about 3 million to about 4.5 million. It was a period where new programming models emerged, hardware architectures evolved, new application domains gained massive importance, and quite a few well-financed and professionally marketed languages fought for dominance. How did C++ -- an older language without serious commercial backing -- manage to thrive in the face of all that?
This paper focuses on the major changes to the ISO C++ standard for the 2011, 2014, 2017, and 2020 revisions... Themes include efforts to preserve the essence of C++ through evolutionary changes, to simplify its use, to improve support for generic programming, to better support compile-time programming, to extend support for concurrency and parallel programming, and to maintain stable support for decades-old code... Specific language-technical topics include the memory model, concurrency and parallelism, compile-time computation, move-semantics, exceptions, lambda expressions, and modules.
"I hope other languages learn from C++'s successes," the paper concludes. "It would be sad if the lessons learned from C++'s evolution were limited to the C++ community."
way too complicated (Score:5, Interesting)
drop about 80% of c++ features (keep classes and things like that that everyone uses); but man, c++ has grown so large and hard to understand (at times) that I can stare at some c++ code and not have any idea what the new constructs are doing.
been programming since the 80's; not new to coding at all, but I have to say that c++ has gone to the dark side along with java.
programming languages have had all they needed for decades.
bored language 'architects' should find something more productive to do. stop messing with language features that just make 'better' quiz questions.
final comment: the user is more responsible for code than the language. stop trying to capture every concept as a language feature. that just shows you guys have totally lost touch with what real software engineers need and use daily.
Re: way too complicated (Score:2)
Re: (Score:2)
This creates a perverse incentive for practitioners to make their field as arcane and obscure as possible.
Yeah. I remember that interview with Stroustrup from years back.
I could not agree more. (Score:4, Insightful)
Programming has become so abstract in the last couple of decades. And there is a reason for it: programmer productivity - i.e. getting the most code for the least amount of money. THAT is the main thrust of programming language progress.
As someone who learned C and assembly programming on DOS and later Windows and OS/2, all that boilerplate, reinvent-the-wheel code for every stinking application got old (message loops, anyone?). There was no creating - just typing the same old shit over and over again. And when Turbo C++ 1.0 came out, I was really enjoying it. Create a class, and all the mindless bullshit was taken care of.
Now, when I look at C++ code, I cannot understand it. It is a completely different language now. Has there been anything gained? Has it improved programmer productivity?
Because let us remember CS 101 - programming languages are nothing but a way to make programming easier for humans - human readable code as opposed to bits and bytes; which ALL languages eventually turn into. Programming languages do not give any extra powers beyond the CPU itself.
And I find that many modern languages and their obscenely complicated libraries and frameworks actually reduce productivity. The last time I tried programming a Windows application in C# (or a Linux one in Java), the class structure was so goddamn confusing for something that is pretty simple in win32 - tedious for all the C code, but nevertheless simple.
I just cannot wait for a time when programming computers becomes as simple as talking to them - Make it so! - like on Star Trek.
Re: (Score:2)
I had a similar story, but my experience with C goes back to the earliest implementations on PDP-11s.
What I liked originally about C++ is that the early implementations were basically just transpilers to C. You could always look at the generated C code and understand what it was doing. Then there was the early Microsoft IDE that made writing Windows apps reasonably straightforward.
But I drifted away from C++ to other things (mostly Java) and the few times I tried to came back it was like hittin
Re: (Score:2)
Re: (Score:2)
Re:I could not agree more. (Score:5, Interesting)
Now, when I look at C++ code, I cannot understand it. It is a completely different language now. Has there been anything gained? Has it improved programmer productivity?
I can assure you, I'm significantly more productive in C++ than I was two decades ago, and it's not solely because of my increased experience. A lot of the new features relieve much of the mental overhead in keeping track of mundane chores, like tracking and freeing resources when you're finished with them, putting the burden on the compiler instead of the programmer. As one simple example of a nice feature improvement, looping over a container no longer requires this sort of overly-complex syntax:
for (std::vector<int>::iterator itr = container.begin(); itr != container.end(); ++itr)
Instead, you can just write:
for (auto& element : container)
Not only is that code much easier to read, write, and understand, but it's significantly safer as well, since there's less chance of accidentally copying and pasting the wrong container name - something I've done more than once. And I won't even get into other approaches of using algorithms instead of loops. C++ has a bunch of ways to do the same thing because it's a very old language that's been continuously upgraded for all that time. Yes, there's a lot of syntax to learn, but a lot of it falls under "here's the best way to do x", versus "here's how it was often done in the past, so you should only see it in legacy code."
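Here's a minimal, self-contained sketch of that evolution - the container and element type are just placeholders for illustration, and the algorithm version is only one of several possible approaches:

#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    std::vector<int> container{1, 2, 3};

    // Pre-C++11 style: explicit iterator type.
    for (std::vector<int>::iterator itr = container.begin(); itr != container.end(); ++itr)
        std::cout << *itr << ' ';

    // C++11 range-based for: shorter and harder to get wrong.
    for (auto& element : container)
        std::cout << element << ' ';

    // An algorithm instead of a hand-written loop.
    std::for_each(container.begin(), container.end(),
                  [](int x) { std::cout << x << ' '; });
    std::cout << '\n';
}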
Sort of a pain, but what's the alternative... break compatibility? Might as well switch languages at that point, because you're essentially throwing away your entire codebase, which may have been developed and refined over decades. Very few organizations can afford to do that. Would it be nice if we could somehow sweep away all that legacy cruft? Absolutely. But we live in reality, where the ecosystem and backwards compatibility matter. Or, the language could just stagnate, never adopting new features or applying lessons learned. Screw that. As someone who uses C++ almost every day, I can tell you I VASTLY prefer the modern language. It's not even close.
Re: I could not agree more. (Score:4, Insightful)
Auto is a double-edged sword: it makes code easier to write but harder to read, since you may have no idea of the type, which doesn't exactly aid comprehension. I use it very sparingly.
Re: (Score:2)
You usually don't need to know the type.
(And if you do, you can simply hover the mouse over the variable and the IDE will tell you)
Re: I could not agree more. (Score:2)
You don't need to know the type? Seriously? Unless you're doing some mickey mouse op with it, you sure as hell do.
As for mouse hovering, good luck with that in vi or emacs.
Re: (Score:3, Insightful)
No, you don't need to know the type. All you need to know are the operations supported. If you're implementing a function which takes a BidirectionalIterator, you don't care whether the actual type of the variable is std::list<T>::iterator, std::vector<T>::iterator (which is actually a RandomAccessIterator, a superset of BidirectionalIterator), std::map<K, V>::iterator, etc. All you need to know are the operations supported (increment, decrement, dereference, comparing for equality/inequality).
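As a rough sketch of that point (the function name previous and the element types are made up for illustration), the same template works unchanged across all three iterator types:

#include <list>
#include <map>
#include <string>
#include <vector>

template <typename BidirIt>
BidirIt previous(BidirIt it) {
    --it;          // only requires decrement: any bidirectional (or random-access) iterator works
    return it;
}

int main() {
    std::vector<int> v{1, 2, 3};
    std::list<int> l{4, 5, 6};
    std::map<std::string, int> m{{"a", 1}, {"b", 2}};

    auto a = previous(v.end());   // std::vector<int>::iterator (random access)
    auto b = previous(l.end());   // std::list<int>::iterator (bidirectional)
    auto c = previous(m.end());   // std::map<std::string, int>::iterator (bidirectional)
    (void)a; (void)b; (void)c;
}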
Unfortunately, auto is a
Re: I could not agree more. (Score:2)
No, you're right, who needs to know the type. I'll just try and get .first or .second from a vector element, I'm sure that would work, right? Or maybe I could just try and dereference the elements and hope they're pointers - why wouldn't it compile? Or how about a fun runtime error of reinterpret_cast to the wrong base class and a nice exception?
When you do some serious development instead of just regurgitating your student textbooks, get back to me.
Re: (Score:2)
This argument right here is why C++ occupies a niche that no one really needs, except maybe in AAA game titles. The first argument demonstrates the complexity required to do something as simple as iterating over a container; the second rails against type inference without understanding that compilers inferring the correct types (and tooling/IDEs providing auto-complete support based on that type) can be a real aid to developer productivity.
These days if I was looking for a solution for some
Re: (Score:1)
Oh no. We have someone like you on our team. All the other engineers pointed out that this doesn't work in PRs on GitHub, nor does it work when you're grepping around in the console for shit or using vim. You need to be able to READ code, not do tool-assisted lookups on ambiguous parts of code, hold the various results in short-term working memory, and only then understand the types and thus what the code is doing.
Re: (Score:3)
In Java-land you're probably right, although I would argue that if your program structures or types are _that_ complicated that you can't replace most explicit types with `var` and auto-inference, _you're doing it wrong_.
I've seen code with full explicit type annotations that are entirely incomprehensible; I've seen code with no explicit type annotations that are easily readable.
Re: (Score:2)
You usually don't need to know the type.
(And if you do, you can simply hover the mouse over the variable and the IDE will tell you)
Firstly: Yes, you *do* need to know the type if you're tracking down a bug.
Secondly, If the language is incomprehensible without tools that effectively compile it before being able to provide usable hints, maybe it's the language that is the problem.
Re: (Score:3, Informative)
That's often the issue with abstraction: when it works as intended, it saves a lot of time. When it doesn't, then one can waste a lot of time trying to figure out what's actually going on. The layers that hide the details then become, well, layers that hide details.
Good abstractions either have ways to discourage mistakes in th
Re: (Score:2)
Re: I could not agree more. (Score:2)
You've not written much complex code, have you? Stick to VB.
Re: (Score:2)
The name of the variable should tell you what it is. Most of the time, you shouldn't need to directly know what type is used to implement a variable. customer_list is obviously not an int or a float and customer_list_index is pretty obviously an integer type.
If you're writing complex code, you are the problem. I've written a number of very large programs in C++ and I would not describe any of them as "complex code". My job as a programmer is to dissect what seems to be a complex problem into its simple components.
Re: (Score:2)
"Most of the time, you shouldn't need to directly know what type is used to implement a variable"
Oh ok genius. So if I have a container containing multiple instances of the following structure:
struct
{
    uint32_t src;
    uint32_t dest;
    u_char pad;
    u_char proto;
    /* ... */
};
Re: (Score:2)
So you admit you're the problem. The names of that struct and its members are really shitty. They tell very little about what they actually are, how they're used, or why they're needed. Most of the time, you really don't give a fuck that the IP address is stored in a uint32_t, you care that it's an IP address.
You can't hear me because you don't want to. If you listened to someone else, that would be admitting you're not God's gift to programming and you don't know everything. You know why the text books say
Re: (Score:2)
LOL :) Too complex for you sonny? Obviously you've done zero low level network programming in your life just like you've never written anything complex - probably just GUI code or other noddy shit.
You're utterly ignorant and full of shit. Go away.
Re: (Score:2)
Defensive much?
I'd offer more, but since your comment consists primarily of what seems to me to be a variation of the no-true-scotsman fallacy and a factually inaccurate implication that I must work in VB, you haven't really offered me much to give a rebuttal to, and it honestly seems like I struck a nerve.
If you want to elaborate a bit on why you think knowing the underlying type of a variable is somehow correlated with complex code writing, or particularly why you would be inclined to make an assessm
Re: (Score:2)
If you need to know the type of a variable to understand what a code fragment that uses the variable is doing, then I would suggest the code was not written clearly in the first place.
Okay, agreed, but how does that help? Most programmers are fixing code written by other people, after all.
Re: (Score:2)
Re: (Score:2)
It doesn't help if your job is to maintain perpetually bad code,
Unless you are working solo, you don't have the luxury of not working with bad code.
it helps when you bear it in mind to avoid writing bad code in the first place so that others are not burdened with that effort, and to the extent that it is permitted, refactoring code you have been assigned to maintain so that it is easier to read and understand both for yourself and for others who might have to maintain it after you.
If it isn't helping but makes code harder to read ... well, you can see the argument against it. There's places for auto, but those places are rare.
Unless you're perpetually working solo, things that make writing code easier but reading it harder should not be in the language.
C++ is declining in use, and it's because of all these little things that make it hard to read. Without being able to attract new blood, it will eventu
Re: (Score:2)
My point is that not having the data type explicitly mentioned *DOESN'T* make code harder to read. If the code has been written in an idiomatic style, the type can almost always be inferred from context, but in terms of understanding the algorithms, the actual underlying data type is usually irrelevant. I would argue that a person who actually depends on knowing types in order to maintain code they can't understand otherwise does not actually understand what programming is.
Re: (Score:2)
+5 Informative
Re:way too complicated (Score:4, Interesting)
From time to time there's a poll what is actually hard in C++. The vast majority of answers fall into two camps:
1. All that baggage it inherited from C, like raw pointers, array to pointer decay, etc. This is not new; it's the oldest part of C++ that people are complaining about. Generally, new additions have been received very positively by the community, precisely because they allow us to step away from C-style coding.
2. The almost complete lack of useful OS abstractions, such as sockets and windows, never mind higher level functionality like encryption, support for various protocols, etc. Almost every example of why C++ is hard seems to bring up a comparison whereby languages x, y, and z let you run a webserver in two lines of code, while in C++ you're basically f*cked. But this is a question for a much larger C++, rather than a much smaller one.
So let me ask you: what 80% would you remove? You claim 80% can be removed, surely you can give us some indication what that should be?
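For what it's worth, here is a minimal sketch of the array-to-pointer decay mentioned in point 1 - the function names are made up for illustration:

#include <cstddef>
#include <iostream>

void takes_array(int arr[8]) {                 // despite appearances, arr is just an int*
    std::cout << sizeof(arr) << '\n';          // prints sizeof(int*), not 8 * sizeof(int)
}

template <std::size_t N>
void takes_reference(int (&arr)[N]) {          // the C++-style fix: pass by reference, keep the size
    std::cout << sizeof(arr) << '\n';          // prints 8 * sizeof(int)
}

int main() {
    int a[8] = {};
    takes_array(a);        // the array silently decays to a pointer
    takes_reference(a);    // no decay; N is deduced as 8
}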
Re: (Score:2)
I'd like to see a function declared to return nothing and accept no arguments do just that; instead we still assume an int return type. I'd like the new use of the auto keyword to be assumed, such that if a variable is assigned when it's declared, it takes on both the type and value assigned to it by default - no type, not even auto, required as preamble. I wish 'class' had never been invented, as it adds almost nothing of value to the language, but it's too late now. On a similar note, array to pointer decay
Re: (Score:1)
You can't have your first request, it would break backwards compatibility, also stop being so fucking lazy. If you want it void, SAY SO. Be explicit in your code.
Your second request is horrible. Now every typo on an assignment makes a new variable. Those will be some fun bugs to track down. This makes scope masking bugs look simple.
Re: (Score:2)
Huh? C++ doesn't allow an implicit int return type. You have to say it's int or void. As for auto, do you mean you want to say "i = 123" instead of "auto i = 123"? Is that it? I'm really struggling to see how either of these is a problem at all, or at least one worth complaining about.
Do you hate something about "class" syntax or just the concept of object-oriented programming in general? That was the whole point of C++, otherwise C still exists if you don't want OO.
Re: (Score:2)
As for auto, do you mean you want to say "i = 123" instead of "auto i = 123"? Is that it? I'm really struggling to see how either of this is a problem at all
I can see a problem: Should i be a char, a short, an int, a long....?
Should we just assume all numbers are double precision floating point like JavaScript does?
Re: (Score:2)
I meant the problem with using the "auto" keyword that going to a scripting-language approach of "i = 123" would solve. I don't think the OP was advocating going back to explicit declarations.
Here it'd be an int of course, but certainly there's some ambiguity with auto. In practice though I find that this is rarely an issue because I wouldn't use auto with a constant, but even if you do that you can see what type is deduced. And it's a huge help with those monster template types.
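For reference, a quick sketch of what auto actually deduces from plain literals (illustrative only):

#include <type_traits>

int main() {
    auto i = 123;       // int: integer literals are int by default
    auto l = 123L;      // long, because of the suffix
    auto u = 123u;      // unsigned int
    auto d = 1.5;       // double, not float
    auto f = 1.5f;      // float, because of the suffix

    static_assert(std::is_same<decltype(i), int>::value, "");
    static_assert(std::is_same<decltype(l), long>::value, "");
    static_assert(std::is_same<decltype(u), unsigned int>::value, "");
    static_assert(std::is_same<decltype(d), double>::value, "");
    static_assert(std::is_same<decltype(f), float>::value, "");

    (void)i; (void)l; (void)u; (void)d; (void)f;
}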
Re: (Score:1)
I'd like to see a function declared to return nothing and accept no arguments do just that, instead we still assume an int return type.
Um, no.
I'd like the new use of the auto keyword to be assumed, such that if a variable is assigned when it's declared, it takes on both the type and value assigned to it by default, no type, not even auto, required as preamble.
So what type would n be in this code?
n = 1;
I wish 'class' had never been invented, as it adds almost nothing of value to the language
I stopped reading there. Go back to JavaScript.
Re: way too complicated (Score:3, Insightful)
Anything related to the OS should be handled in a library, NOT built into the language, because OSs vary. E.g. windowing is built into the Windows kernel, but in Unix only very low level display handling is there as frame buffers; windowing as a concept exists in the X server, not the OS kernel. Similarly, C++11 added threading, which is all well and good, but POSIX threads support 3-level locking whereas C++ threads have to be lowest common denominator and hence only support 2 levels, which makes them fucking
Re: (Score:2)
Bring-your-own build system. Maybe it's Make or CMake or SCons or Ninja or Meson. Maybe there's configure/autotools thrown on top. Time spent choosing or learning a new system is wasted, and it splinters your community support.
On top of that, these build systems are complex, with their own DSLs that can require debugging before you're able to get a complete build.
Next, C++ seems to have a lot of foot-guns la
Re:way too complicated (Score:5, Interesting)
I use C++ all the time, but also split my time with much higher level languages and so the contrast is always at the forefront.
To me C++ is death by a thousand cuts. I hate constantly repeating myself between the header file and the cpp file (especially when the change is something entirely private and yet triggers a large, slow rebuild). I groan a little every time there's a template error - even if it's an easy fix, there's something about the gobs of template error spew. Data structures like tuples, vectors, lists, maps, etc. are great but will forever feel bolted on (at least compared to other languages). All the ways stuff can get initialized is easy to mess up (http://mikelui.io/2019/01/03/seriously-bonkers.html). Etc., etc.
None of these are individually the end of the world, but they add up to a real cost that I'm not sure is worth it. On more than one occasion (including just this week), I've temporarily rewritten some code in a higher level language where it's much cheaper to do algorithmic experimentation and troubleshooting. Once everything is all sorted out I'll port it back to C++. To some extent this isn't all that unusual, but it seems like C++ has enough stuff that doing this shouldn't be so helpful as often as it is. :)
Re: (Score:2)
From time to time there's a poll what is actually hard in C++. The vast majority of answers fall into two camps:
1. All that baggage it inherited from C, like raw pointers, array to pointer decay, etc. This is not new; it's the oldest part of C++ that people are complaining about. Generally, new additions have been received very positively by the community, precisely because they allow us to step away from C-style coding.
2. The almost complete lack of useful OS abstractions, such as sockets and windows, never mind higher level functionality like encryption, support for various protocols, etc. Almost every example of why C++ is hard seems to bring up a comparison whereby languages x, y, and z let you run a webserver in two lines of code, while in C++ you're basically f*cked. But this is a question for a much larger C++, rather than a much smaller one.
So let me ask you: what 80% would you remove? You claim 80% can be removed, surely you can give us some indication what that should be?
Classes are horribly over-complicated; almost all of that complication can be removed if all classes got a default copy-constructor that did correct deep-copying[1].
Default values in parameter lists + overloaded function names mean that, at the point of calling a function, the reader is never quite sure which function will be called. I'd remove default values in parameter lists; they're just a convenience when writing, but a real hindrance when reading.
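A hedged illustration of that point - these log functions are hypothetical, but they show how the reader at the call site has to know every overload and default to tell what runs:

#include <iostream>
#include <string>

void log(const std::string& msg, int level = 0) {
    std::cout << "string overload (level " << level << "): " << msg << '\n';
}

void log(const char* msg) {
    std::cout << "char* overload: " << msg << '\n';
}

int main() {
    log("starting up");               // picks the char* overload: it's the better match
    log(std::string("starting up"));  // picks the string overload, with the hidden default level = 0
}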
Remove reference parameters, the job they do is better d
Comment removed (Score:4, Insightful)
Re: way too complicated (Score:2)
That's fine, but C++ syntax is now so complicated that you get the problem where people try to be clever and end up creating some unmaintainable and hard to extend nightmare that makes life difficult for everyone who has to mess with their code. Yes, this can happen in any language, but C++ has taken it to a fine art.
Also, the problem exists where devs like myself are expected to keep up to date with all the useless extra kitchen-sink crap, gotchas and API/STL interactions being thrown in, which add little to
Re: (Score:2)
Re: way too complicated (Score:2)
The whole point of C++ is that it's an extension of C and you can still use all of C's features; otherwise, use Java. And if you think printf is deprecated because of iostream, then I would suggest you have no idea how to use it properly.
The only imbecile in this thread is you sonny.
Re: (Score:2)
Re: (Score:2)
"No, if you had actually read any of Stroustrups works"
I tend to avoid anything he writes, he could put an amphetamine addict into a coma.
"you would realize that C compatability was only add in order to allow C programmers an easier path into C++ programming"
That's unlikely given it started out as C with Classes - i.e. a way to have OO in C.
"printf and its derivatives are deprecated"
They're not deprecated in any sense.
"They work on unprotected pointers"
Do pointers scare you? You might want to avoid writing dev
Re: (Score:2)
Re: (Score:2)
The art of a good programmer is knowing when to use the appropriate tool. Treating everything as a nail because you only know how to use a hammer rarely works long term.
Re: (Score:2)
Re: (Score:2)
At least when I see a pile of printf statements in a C++ program, I know that the author was an imbecile
Sounds like you aren't very smart yourself. It's easier and safer to use printf than cout: everything is easy to read and in 2020 all major compilers warn you if you make a mistake.
Re: (Score:2)
Re: (Score:2)
Sounds like you aren't very smart yourself. It's easier and safer to use printf than cout:
Spoken like a true C programmer. printf is in no way safer than stream processing. printf and scanf have no fundamental protections for most kinds of evil that can happen on input and output streams. Why do you think scanf is essentially deprecated, even in C? printf is only marginally better, as you still have to manage your own buffers (whether you realize that's what you are doing or not). Streams handle buffers for you and thus make it much more difficult to code in a buffer-related bug. In short, your lack of understanding of C++ makes you believe (incorrectly) that C++ language structures are less safe than C style code. In reality what is unsafe is your lack of understanding of how C++ works.
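As a rough sketch of the kind of mismatch being argued about (the names are illustrative, not from any particular codebase):

#include <cstdio>
#include <iostream>
#include <string>

int main() {
    std::string name = "world";

    // printf knows nothing about std::string; passing it where %s expects a
    // C string is undefined behaviour. Modern compilers warn (-Wformat), but
    // only because printf is special-cased, and the check disappears as soon
    // as the format string is built at runtime.
    // std::printf("Hello, %s\n", name);       // UB if uncommented
    std::printf("Hello, %s\n", name.c_str());  // the caller must remember .c_str()

    // The stream version is checked by ordinary overload resolution: there is
    // no format string to fall out of sync with the arguments.
    std::cout << "Hello, " << name << '\n';
}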
Previously it looked like you had a superficial understanding. Now it looks like you have none at all. Sure, scanf is deprecated, but I never said anything about scanf, so I don't know why you think it is relevant. Strawman much?
As for printf/cout - passing data to them both has to be managed whether you like it or not, except that the burden is much greater using cout than using printf.
I feel sorry for those who work with you. It's really difficult to deal with someone who "feels" that the higher-burden,
Re:way too complicated (Score:4, Interesting)
How do you directly manipulate hardware and also support efficient high-level abstraction?
When C++ became more of a way to pile abstractions on top of abstractions, with no hardware really needed, you ended up with people not really programming in C++, but in a set of STL (and other) libraries.
I still program occasionally, and for me, I still do C++ as "C with classes"; the concept of wrapping data and the functions that can operate on that data together into a single object is quite powerful. There's no need for 17 layers of abstraction upon it.
Re: (Score:2)
I work with Indian devs IN India for a Bay Area company, and as a group they're pretty terrible. I mean terrible as in: put
Re: (Score:2)
Re: (Score:3)
drop about 80% of c++ features (keep classes and things like that that everyone uses)
If you don't need any of the "fringe" features of C++, then Java or C# is probably a better language. Lots of people who use C++ regularly use 2% of its oddball features and think the other 98% could be removed. Thing is, it's a different 2% for everyone.
programming languages have had all they needed for decades.
All the oddball stuff people don't see the point of in C++ is the stuff that managed code doesn't allow you to do. If you only need the features that you can get in every language, C++ is probably a bad choice.
C++ is in a unique place in that it allows
Re:way too complicated (Score:5, Insightful)
I don't think you've got the narrative right. It's not that C++ has grown too large; it's that it started out too large.
C++ set out to implement all those features that people back in the 80s thought a truly object oriented language would *have* to have. But this was at a time when people had very little practical experience with the paradigm. Not surprisingly, some of the features people thought they needed turned out to be not so useful (operator overloading). Others turned out to be ill-conceived approaches to a real problem (multiple inheritance). It also got some things right (polymorphism).
People who hate C++ bristle when I say this, but C++ is a work of genius. It just tried to solve a problem people had very little practical understanding of at the time. If you look at the constraints Stroustrup *thought* he needed to satisfy, C++ was a brilliant hack in the best sense of the word. Looking at it in the light of over thirty years of real world experience will naturally diminish it.
You just have to remember how wrong-headed people were about object oriented programming in the 80s. A lot of the initial enthusiasm for the paradigm was naivete; people thought it would radically simplify the complicated business of programming, whereas it really opened up entirely new vistas of complexity. It took decades of trial and error to learn how to make productive use of that complexity, and C++ played a vital role in gaining that experience.
I can't speak to C++'s utility today; given its history I would expect the ecosystem to appear somewhat messy from the perspective of an outsider. But it was absolutely pivotal historically.
Re: (Score:2)
Not surprisingly, some of the features people thought they needed turned out to be not so useful (operator overloading).
Operator overloading is essential for anybody who does any sort of numerical computing.
Re: (Score:2, Interesting)
Sure, it has *some* uses, it's just not a very good idea to saddle other users with.
Re:way too complicated (Score:5, Insightful)
Operator overloading doesn't "saddle" anyone, as people do not have to use it if they do not wish to. The supposedly "limited" number of cases where operator overloading has practical use may certainly be smaller than the number of cases where it really is not needed, but that does not mean the set of useful cases is somehow finite.
You can argue that there is no real difference between a+b and a.plus(b), but if a person called a function "plus" that had nothing to do with the addition operation, it's just as misleading as when a person overloads the + operator to do something unintuitive.
It's my observation that the greatest downside to operator overloading actually reduces to the same argument that a programmer can still choose a bad name for some variable or function in their program which makes the code less readable and understandable.
And obviously, just as the argument goes that a programmer should have the discipline to name their variables and functions appropriately so that other programmers can understand their code, instead of just using the first name that pops into their head because it is faster to write, a programmer using a language with operator overloading should have the discipline to recognize when and where operator overloading makes their program easier for others to understand, and not just use it because they believe it makes their own coding job easier.
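As a small illustrative sketch (the Money type here is made up), the overloaded operator does exactly what a well-named member function would:

#include <iostream>

struct Money {
    long cents;

    Money plus(const Money& other) const { return Money{cents + other.cents}; }
};

// The operator overload and the member function have identical meaning.
Money operator+(const Money& a, const Money& b) { return Money{a.cents + b.cents}; }

int main() {
    Money a{150}, b{275};
    Money c = a + b;        // reads like arithmetic
    Money d = a.plus(b);    // same meaning, more ceremony
    std::cout << c.cents << ' ' << d.cents << '\n';   // 425 425
}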
Re: (Score:2)
You're assuming that you only have to work on programs you write yourself.
Re: (Score:2)
Re: (Score:3)
And it's even more important for implementing functors, and for copy and now move semantics. People who believe it to be useless have never understood it.
Re: (Score:2)
And it's even more important for implementing functors, and for copy and now move semantics. People who believe it to be useless have never understood it.
This. Operator overloading is absolutely essential to the direction modern C++ has gone. For those who haven't looked at it, Modern C++ is attempting to synthesize a small, clean, safe language from a bloated, messy, terribly-unsafe language. Honestly, I think Rust is a better choice, but if you're a C++ programmer who hasn't learned and tried the modern style, you should.
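A brief sketch of those two uses - operator() for a functor and operator= for move assignment - with a made-up Buffer type standing in for a real resource:

#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

struct ByLength {                       // a functor: overloading operator() makes it callable
    bool operator()(const std::vector<int>& a, const std::vector<int>& b) const {
        return a.size() < b.size();
    }
};

struct Buffer {
    int* data = nullptr;
    std::size_t size = 0;

    Buffer() = default;
    explicit Buffer(std::size_t n) : data(new int[n]), size(n) {}
    ~Buffer() { delete[] data; }

    Buffer(Buffer&& other) noexcept : data(other.data), size(other.size) {
        other.data = nullptr;           // steal the allocation instead of copying it
        other.size = 0;
    }
    Buffer& operator=(Buffer&& other) noexcept {   // move assignment is operator overloading too
        std::swap(data, other.data);
        std::swap(size, other.size);
        return *this;
    }
    Buffer(const Buffer&) = delete;
    Buffer& operator=(const Buffer&) = delete;
};

int main() {
    std::vector<std::vector<int>> rows{{1, 2, 3}, {4}, {5, 6}};
    std::sort(rows.begin(), rows.end(), ByLength{});  // the functor is passed like a value

    Buffer a(1024);
    Buffer b;
    b = std::move(a);                   // invokes the overloaded move assignment
}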
Re: (Score:2)
Re: (Score:2)
Operator overloading is essential for anybody who does any sort of numerical computing.
This is a drastic self-goal. You even bolded the word that doesn't mean what you think it means.
Re: (Score:2)
Not surprisingly, some of the features people thought they needed turned out to be not so useful (operator overloading).
Operator overloading is essential for anybody who does any sort of numerical computing.
And for the other 99.99% of devs? If you want to do numerical computing, maybe use a language made for it. A general purpose language should err on the side of being easy for people who don't want to do numerical computing and hard for people who do, rather than the other way around.
Re: (Score:2)
drop about 80% of c++ features (keep classes and things like that that everyone uses); but man, c++ has grown so large and hard to understand (at times) that I can stare at some c++ code and not have any idea what the new constructs are doing.
been programming since the 80's; not new to coding at all, but I have to say that c++ has gone to the dark side along with java.
programming languages have had all they needed for decades.
I used lambdas the other day for the first time. I was working on some code where lambdas were the perfect fit, I'm glad C++ has them if I need them.
There's lots of other stuff that I haven't used yet but I know somebody out there really needs them or they wouldn't have been added.
(despite what C programmers tell each other, things don't get added to C++ without justification. The history of C++ is a history of rejected ideas).
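For anyone curious, here's roughly the kind of spot where a lambda is the perfect fit - an ad-hoc comparison passed straight to an algorithm (the data is just an example):

#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

int main() {
    std::vector<std::string> words{"modules", "c++", "lambda"};

    // Before C++11 this needed a named function or a hand-written functor class.
    std::sort(words.begin(), words.end(),
              [](const std::string& a, const std::string& b) { return a.size() < b.size(); });

    for (const auto& w : words)
        std::cout << w << '\n';
}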
Do you know what makes this worse? (Score:2)
Re: (Score:1)
Re: (Score:1)
There doesn't seem to be a system in place to scrutinize new language feature requests in an organized way. A small group seem to add what they want without sufficient analysis, often going with fads. Or at least they don't document the decision process and alternatives very well. If they documented it better, then we could learn what kind of things go wrong when they add something they shouldn't have. Those who don't learn from history are bound to
Re: (Score:2)
Re: (Score:1)
Between Cfront and GCC, Bjarne Stroustrup changed the definition of overriding equal.
At the time we made the compiler switch we were near the end of working on the Xanadu project. The change caused obscure bugs in the tree rebalancing code that were hard to fathom. I think we worked on it for over a month before I started single-stepping through the code at the assembly level and found out what was happening.
It was not hard to fix, but finding what was failing was just awful.
C++ biggest strength... (Score:4, Insightful)
All the many programming languages that become popular for a while and then vanish into irrelevance usually die from either the greed of some copyright holder or inflated egos of a few "entitled" super-maintainers.
C++ may not be perfect, but it successfully circumnavigated those standard obstacles.
If his paper is anything like his language... (Score:4, Funny)
If his paper is anything like his language, a couple of pages of it will be amazing and everybody will enjoy reading them, while you have to wade through the other 166 and you will find interesting bits, although the bits you liked won't necessarily be the bits others liked...
Yeah, should have been a 9 page paper (Score:2)
"Preserve the essence" (Score:3)
In general, setting a goal of preserving the essence of something, while it feels good, is mostly for psychological purposes and detracts from the greater goal of making the thing the best and most efficient it can be.
Bjarne (Score:3)
C++ has indeed succeeded, despite its inherent ugliness, unwieldiness, bugginess, and needless ambiguities. Bjarne (bless!) is the only one who ever thought that his baby was pretty.
I'm sure I'll pay for this in karma, but C++ is awful. Herb Sutter (etc.) have done an amazing job untangling its worst parts, but imagine how much further along this industry and computer science in general would have been, had it not been for C++.
Can someone please summarize the paper (Score:1)
Re: (Score:2)
Neither am I. Slashdot seems to shit all over it for some reason. It doesn't seem to be going away any time soon either. I couldn't care less what language my employer uses as long as they pay me.
I prefer this C++ interview (Score:5, Funny)
Interviewer: Well, it's been a few years since you changed the world of software design, how does it feel, looking back?
Stroustrup: Actually, I was thinking about those days, just before you arrived. Do you remember? Everyone was writing 'C' and, the trouble was, they were pretty damn good at it. Universities got pretty good at teaching it, too. They were turning out competent - I stress the word 'competent' - graduates at a phenomenal rate. That's what caused the problem.
Interviewer: Problem?
Stroustrup: Yes, problem. Remember when everyone wrote Cobol?
Interviewer: Of course, I did too
Stroustrup: Well, in the beginning, these guys were like demi-gods. Their salaries were high, and they were treated like royalty.
Interviewer: Those were the days, eh?
Stroustrup: Right. So what happened? IBM got sick of it, and invested millions in training programmers, till they were a dime a dozen.
Interviewer: That's why I got out. Salaries dropped within a year, to the point where being a journalist actually paid better.
Stroustrup: Exactly. Well, the same happened with 'C' programmers.
Interviewer: I see, but what's the point?
Stroustrup: Well, one day, when I was sitting in my office, I thought of this little scheme, which would redress the balance a little. I thought 'I wonder what would happen, if there were a language so complicated, so difficult to learn, that nobody would ever be able to swamp the market with programmers?' Actually, I got some of the ideas from X10, you know, X windows. That was such a bitch of a graphics system, that it only just ran on those Sun 3/60 things. They had all the ingredients for what I wanted. A really ridiculously complex syntax, obscure functions, and pseudo-OO structure. Even now, nobody writes raw X-windows code. Motif is the only way to go if you want to retain your sanity.
Interviewer: You're kidding...?
Stroustrup: Not a bit of it. In fact, there was another problem. Unix was written in 'C', which meant that any 'C' programmer could very easily become a systems programmer. Remember what a mainframe systems programmer used to earn?
Interviewer: You bet I do, that's what I used to do.
Stroustrup: OK, so this new language had to divorce itself from Unix, by hiding all the system calls that bound the two together so nicely. This would enable guys who only knew about DOS to earn a decent living too.
Interviewer: I don't believe you said that...
Stroustrup: Well, it's been long enough, now, and I believe most people have figured out for themselves that C++ is a waste of time but, I must say, it's taken them a lot longer than I thought it would.
Interviewer: So how exactly did you do it?
Stroustrup: It was only supposed to be a joke, I never thought people would take the book seriously. Anyone with half a brain can see that object-oriented programming is counter-intuitive, illogical and inefficient.
Interviewer: What?
Stroustrup: And as for 're-useable code' - when did you ever hear of a company re-using its code?
Interviewer: Well, never, actually, but...
Stroustrup: There you are then. Mind you, a few tried, in the early days. There was this Oregon company - Mentor Graphics, I think they were called - really caught a cold trying to rewrite everything in C++ in about '90 or '91. I felt sorry for them really, but I thought people would learn from their mistakes.
Interviewer: Obviously, they didn't?
Stroustrup: Not in the slightest. Trouble is, most companies hush-up all their major blunders, and explaining a $30 million loss to the shareholders would have been difficult. Give them their due, though, they made it work in the end.
Interviewer: They did? Well, there you are then, it proves O-O works.
Stroustrup: Well, almost. The executable was so huge, it took five minutes to load, on an HP workstation, with 128MB of RAM. Then it ran like treacle. Actually, I thought this would be a major stumbling-block, and I'd get found out within a week, but nobody cared. Sun and HP were only too glad to sell enormously powerful boxes, with huge resources just to run trivial programs. You know, when we had our first C++ compiler, at AT&T, I compiled 'Hello World', and couldn't believe the size of the executable. 2.1MB
Interviewer: What? Well, compilers have come a long way, since then.
Stroustrup: They have? Try it on the latest version of g++ - you won't get much change out of half a megabyte. Also, there are several quite recent examples for you, from all over the world. British Telecom had a major disaster on their hands but, luckily, managed to scrap the whole thing and start again. They were luckier than Australian Telecom. Now I hear that Siemens is building a dinosaur, and getting more and more worried as the size of the hardware gets bigger, to accommodate the executables. Isn't multiple inheritance a joy?
Interviewer: Yes, but C++ is basically a sound language.
Stroustrup: You really believe that, don't you? Have you ever sat down and worked on a C++ project? Here's what happens: First, I've put in enough pitfalls to make sure that only the most trivial projects will work first time. Take operator overloading. At the end of the project, almost every module has it, usually, because guys feel they really should do it, as it was in their training course. The same operator then means something totally different in every module. Try pulling that lot together, when you have a hundred or so modules. And as for data hiding. God, I sometimes can't help laughing when I hear about the problems companies have making their modules talk to each other. I think the word 'synergistic' was specially invented to twist the knife in a project manager's ribs.
Interviewer: I have to say, I'm beginning to be quite appalled at all this. You say you did it to raise programmers' salaries? That's obscene.
Stroustrup: Not really. Everyone has a choice. I didn't expect the thing to get so much out of hand. Anyway, I basically succeeded. C++ is dying off now, but programmers still get high salaries - especially those poor devils who have to maintain all this crap. You do realise, it's impossible to maintain a large C++ software module if you didn't actually write it?
Interviewer: How come?
Stroustrup: You are out of touch, aren't you? Remember the typedef?
Interviewer: Yes, of course.
Stroustrup: Remember how long it took to grope through the header files only to find that 'RoofRaised' was a double precision number? Well, imagine how long it takes to find all the implicit typedefs in all the Classes in a major project.
Interviewer: So how do you reckon you've succeeded?
Stroustrup: Remember the length of the average-sized 'C' project? About 6 months. Not nearly long enough for a guy with a wife and kids to earn enough to have a decent standard of living. Take the same project, design it in C++ and what do you get? I'll tell you. One to two years. Isn't that great? All that job security, just through one mistake of judgement. And another thing. The universities haven't been teaching 'C' for such a long time, there's now a shortage of decent 'C' programmers. Especially those who know anything about Unix systems programming. How many guys would know what to do with 'malloc', when they've used 'new' all these years - and never bothered to check the return code. In fact, most C++ programmers throw away their return codes. Whatever happened to good ol' '-1'? At least you knew you had an error, without bogging the thing down in all that 'throw' 'catch' 'try' stuff.
Interviewer: But, surely, inheritance does save a lot of time?
Stroustrup: Does it? Have you ever noticed the difference between a 'C' project plan, and a C++ project plan? The planning stage for a C++ project is three times as long. Precisely to make sure that everything which should be inherited is, and what shouldn't isn't. Then, they still get it wrong. Whoever heard of memory leaks in a 'C' program? Now finding them is a major industry. Most companies give up, and send the product out, knowing it leaks like a sieve, simply to avoid the expense of tracking them all down.
Interviewer: There are tools...
Stroustrup: Most of which were written in C++.
Interviewer: If we publish this, you'll probably get lynched, you do realise that?
Stroustrup: I doubt it. As I said, C++ is way past its peak now, and no company in its right mind would start a C++ project without a pilot trial. That should convince them that it's the road to disaster. If not, they deserve all they get. You know, I tried to convince Dennis Ritchie to rewrite Unix in C++.
Interviewer: Oh my God. What did he say?
Stroustrup: Well, luckily, he has a good sense of humor. I think both he and Brian figured out what I was doing, in the early days, but never let on. He said he'd help me write a C++ version of DOS, if I was interested.
Interviewer: Were you?
Stroustrup: Actually, I did write DOS in C++, I'll give you a demo when we're through. I have it running on a Sparc 20 in the computer room. Goes like a rocket on 4 CPU's, and only takes up 70 megs of disk.
Interviewer: What's it like on a PC?
Stroustrup: Now you're kidding. Haven't you ever seen Windows '95? I think of that as my biggest success. Nearly blew the game before I was ready, though.
Interviewer: You know, that idea of a Unix++ has really got me thinking. Somewhere out there, there's a guy going to try it.
Stroustrup: Not after they read this interview.
Interviewer: I'm sorry, but I don't see us being able to publish any of this.
Stroustrup: But it's the story of the century. I only want to be remembered by my fellow programmers, for what I've done for them. You know how much a C++ guy can get these days?
Interviewer: Last I heard, a really top guy is worth $70 - $80 an hour.
Stroustrup: See? And I bet he earns it. Keeping track of all the gotchas I put into C++ is no easy job. And, as I said before, every C++ programmer feels bound by some mystic promise to use every damn element of the language on every project. Actually, that really annoys me sometimes, even though it serves my original purpose. I almost like the language after all this time.
Interviewer: You mean you didn't before?
Stroustrup: Hated it. It even looks clumsy, don't you agree? But when the book royalties started to come in... well, you get the picture.
Interviewer: Just a minute. What about references? You must admit, you improved on 'C' pointers.
Stroustrup: Hmm. I've always wondered about that. Originally, I thought I had. Then, one day I was discussing this with a guy who'd written C++ from the beginning. He said he could never remember whether his variables were referenced or dereferenced, so he always used pointers. He said the little asterisk always reminded him.
Interviewer: Well, at this point, I usually say 'thank you very much' but it hardly seems adequate.
Stroustrup: Promise me you'll publish this. My conscience is getting the better of me these days.
Interviewer: I'll let you know, but I think I know what my editor will say.
Stroustrup: Who'd believe it anyway? Although, can you send me a copy of that tape?
Interviewer: I can do that.
Re: (Score:2)
A true classic.
I always retch a little when I get to the part about implicit typedefs. If you know a few languages, you can usually grope your way through reading a new language - even better if the new language is related to something you know. Not so much C++, and it is worse, somehow, if you know C. You can easily spend hours trying to figure out all of the implied stuff (aka "what is really going on") in a single statement.
Re: (Score:2)
I'm trying to decide whether Stroustrup would have actually used the term "X windows", or instead use the proper term, "X Window".
Re: (Score:2)
People don't actually say "X Window System" in conversation, it was always pronounced as "X windows" or "X."
Re: (Score:2)
Why does it feel like he's interviewing himself?
Re: (Score:2)
Because the writer wasn't a good character writer.
Re: (Score:2)
Mentor Graphics had a lot of success, they sold to Siemens for $4.5 billion in 2016. The stuff they were doing in the 90s worked out well for them.
That's a lot of pages (Score:3)
to say "it was faster than Smalltalk and compatible with C".
Still shit, though.
Now I'm gonna have to release a paper too... (Score:1)
... on how the black plague thrived. ... ;)
(Just to show that thriving does not equal good. :)
Lack of alternatives (Score:3)
I think any high level language that allowed people to code in C and gain the power of classes without any downsides except a very high cost of memory and call setup would have been successful.
If C were not such a steaming piece of shit... and by the standards of today... it really is... most any other language could have won. But C makes use of header files which have been possibly the worst idea ever in computing.
Consider that header files were a blessing in 1969 because computers simply did not have the resources to read and compile all the files in a large project at once. Header files allowed data structures for an entire project to be defined centrally, though with the severe shortcoming that they must be declared in the order referenced... thus permitting many small modules to reference and employ those structures with no knowledge of the contents of other modules.
Therefore, one could compile something as complex as Unix even though there was no possible way to store abstract syntax trees of massive size on a low memory system.
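A minimal sketch of that arrangement, with illustrative file names - the header pins down the layout once, in dependency order, and each small translation unit compiles against it separately:

/* shapes.h -- shared declarations, in dependency order */
struct point { int x, y; };
struct rect  { struct point min, max; };     /* must come after point, or it won't compile */
int rect_area(struct rect r);                /* any module including this header may call it */

/* area.c -- one small translation unit, compiled on its own */
int rect_area(struct rect r) {
    return (r.max.x - r.min.x) * (r.max.y - r.min.y);
}

/* main.c -- another translation unit; the linker joins them into one program */
int main(void) {
    struct rect r = { {0, 0}, {4, 3} };
    return rect_area(r) == 12 ? 0 : 1;
}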
Linking files also became very important. Whether through static linking, where objects within a library could be referenced through a very lightweight interface, or through dynamically loaded shared libraries, which only required a table of pointers to be established when loading a program at startup, C was extremely versatile.
That said, the compromises C made for raw compile-time efficiency make it a truly terrible language in 2020.
We have memory... and plenty of it. We have CPU. We have disk. There is absolutely no excuse for a programming language that treats individual source files of a program independently. It is devastating to compilation. Since most modern code is written so a single function is rarely more than a few lines, inlining should be used far more than it is. C style forward declarations make code shitty and unmaintainable. The preprocessor is absolutely horrifying. Cross platform binaries are pretty much impossible.
Then there is the ABI for C and C++. We actually still do not have the ability to compile a shared library that is recognized by multiple operating systems natively. C and C++ should have years ago defined a portable object file. A fat version would allow including not just multiple operating systems but also multiple CPU architectures.
Even more so, a disgusting weakness of modern C and C++ is that there is no standard intermediate format for binaries. Compilers should never produce raw binary files. Instead, they should parse code, generate an AST, optimize the AST (including inlining), and publish that in a dense (possibly compressed) form. Then all operating systems should read those files, generate machine code, and cache it. This is not complex. The Web standardized on the DOM years ago and all browser vendors implement it. This should be much, much easier in C and C++, but due to the horrifying insistence of C and C++ programmers on bypassing everything manually, it is impossible.
Then there is the disgusting lack of metadata and corresponding APIs. Any sufficiently advanced C or C++ programmer has written code at some point to correlate internal memory structures (walked the stack, for example) to map files or debugging symbols. More advanced developers have been memory leak hunting by instrumenting code, overriding malloc/free or new/delete.
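Here's a minimal, illustrative sketch of that instrumentation trick - replacing the global allocation operators to count live allocations (the counter name is made up):

#include <atomic>
#include <cstddef>
#include <cstdio>
#include <cstdlib>
#include <new>

static std::atomic<long> g_live_allocations{0};

void* operator new(std::size_t size) {
    if (void* p = std::malloc(size)) {
        g_live_allocations.fetch_add(1, std::memory_order_relaxed);
        return p;
    }
    throw std::bad_alloc{};
}

void operator delete(void* p) noexcept {
    if (p) {
        g_live_allocations.fetch_sub(1, std::memory_order_relaxed);
        std::free(p);
    }
}

int main() {
    int* leaked = new int(42);          // deliberately never deleted
    (void)leaked;
    std::printf("live allocations at exit: %ld\n", g_live_allocations.load());
}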
I suppose I can go even further and tear apart these languages for their utter disrespect for strings, buffers and possibly collections.
C++ always tries to fix everything by using templates. If there is something missing in the language, rather than break backwards compatibility, they add templates which are basically useless.
RTTI in a modern language is not a nice to have... it is a core feature.
C and C++ have served us very well for very long, but in 2020 they are examples of what not to do... EVER. There are actually no redeeming features of C and C++ today. We can write operating systems and boot loaders and even firmware in Rust and other far better languages.
As for language bindings, all modern languages can be quite easily bound from most other languages thanks to sufficient compiler tools. We can even thunk easily between almost any language in any runtime environment over any protocol because headers are dead and language ABIs facilitate it... except when interacting with C and usually C++ (which MS has done truly weird shit with in C++/RT).
Thank you Bjarne for what we needed at a time when we needed it. I still slum around in C++ once in a while. But writing code in C and C++ in 2020 is like making children toys with mercury and lead.
C++ is magical (Score:1)
let a = 10;
let b = 20;
let c = {a, b, "hello"};
auto squareFilter = []() { return def(x) { return x*x; }; };
c | squareFilter() | print;
let s = "";
auto concatFilter = [](let& t) { return def(x) { t+=x; }; };
c | concatFilter(s)
s | print;
Templates tell you everything wrong with C++ (Score:2)
Caveat: templates are very useful.
That aside, they're an obnoxious way to implement generic data structures. Worse, they're an over-complicated, Turing-complete bolt-on. Template meta-programming is a *thing*. Templates are useful because they're effectively another programming language that tags along for the ride. And all the interesting stuff that templates do that break your brain are *discovered functionality*.
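A tiny taste of that discovered Turing-completeness - the classic (if contrived) compile-time factorial, computed entirely by template recursion:

template <unsigned N>
struct Factorial {
    static const unsigned long long value = N * Factorial<N - 1>::value;
};

template <>
struct Factorial<0> {                       // the "base case" is a template specialization
    static const unsigned long long value = 1;
};

static_assert(Factorial<10>::value == 3628800, "evaluated by the compiler, not at runtime");

int main() {}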
See, that's the problem with C++: for all the planning, most of the stuff that's either good