Objective-C Overtakes C++, But C Is Number One
mikejuk writes "Although the TIOBE Index has its shortcomings, the finding that Objective-C has overtaken C++ is reiterated in the open source Transparent Language Popularity Index. The reason is, of course, that Objective-C is the language you have to use to create iOS applications — and as iPads and iPhones have risen in popularity, so has Objective-C. If you look at the raw charts you can see that C++ has been in decline since about 2005 and Objective-C has shot up to overtake it with amazing growth. But the two charts are on different scales: if you plot both on the same chart, you can see that rather than rocketing up, Objective-C has just crawled its way past, and its rise has as much to do with the decline of C++. It simply hasn't reached the popularity C++ enjoyed in its heyday before 2005. However, the real story is that C, a raw, machine-independent, assembler-like language, with no pretense of being object-oriented or sophisticated, has beaten all three of the object-oriented heavyweights — Java, C++ and Objective-C. Yes, C is number one (and a close second in the transparent index)."
C Programming Language (Score:4, Insightful)
However, the real story is that C, a raw, machine-independent, assembler-like language, with no pretense of being object-oriented or sophisticated, has beaten all three of the object-oriented heavyweights
This sounds like it was written by someone who doesn't understand C. You can write object-oriented code in C. You don't always need the language to hold your hand. And C is NOT an assembler-like language. Not even close.
And as far as sophisticated code goes, I guess the author doesn't consider operating systems or most systems programming to be sophisticated.
Re:C Programming Language (Score:5, Insightful)
Is this a touchy subject for you, AC?
The author didn't say anything about sophisticated code, they said that C isn't a particularly sophisticated language. And it's not. C doesn't have very many bells and whistles -- it's just a very good, general-purpose language. The fact that the language itself is unsophisticated is what makes it good for writing the kind of code people write in C.
Secondly, C is not an object oriented language. I can write object oriented code in assembly language if I want, but that doesn't make assembly language object oriented.
Bravo that C is still relevant. (Score:5, Insightful)
Re:fp (Score:4, Insightful)
Here's a crazy brief explanation:
The big draw to OO is that it (ostensibly) makes it easier and/or faster to write applications. This doesn't mean that you can make programs with an OO language that you couldn't with an imperative or structured language, only that certain tasks may be easier to implement.
That said, OO isn't always the best option. OO languages are typically a lot more complex and produce slower executables than plain C, so there is a trade-off that can be important in certain situations. As with anything, pick the best tool for the job.
For myself, when I first learned programming (via some books), I learned C before moving to C++. I absolutely hated C++ and didn't see the point of OO programming, due in large part to the way the book presented it. At the start, the author had you write a C program, and throughout the course of the book, you would change it into a C++ program full of OO goodness. The final C++ program wound up having 50% more lines of code for the exact same functionality, and that was the point where I gave up on it. It was a pretty bad first impression.
So maybe you're reading from the wrong book?
Re:fp (Score:5, Insightful)
It's like the static vs. dynamic linking debate that you sometimes hear. There's no real valid answer to that one either; it's a best guess at what'll lead to the best performance. With dynamic linking you don't need to load all the libraries at the start; on the other hand, with static linking you don't need to call up the linker each time a library is loaded, and so on... My main advice: stay out of it. There's no real valid answer to this sort of thing.
Re:sorry (Score:5, Insightful)
That'd be like saying letters are no longer required because we'll all be using words and sentences from now on.
Re:Objective-C not required to create iOS Apps (Score:4, Insightful)
C, in combination with some form of assembly, still holds the absolute first position in terms of how much it's actually deployed: every mainstream OS has its core and bootloader written in it.
C++ holds the second spot without problem, simply because it's compatible with C and it does offer native object extensions.
The top 5 will probably be completed by Visual Basic, C# and Java for enterprise applications. They're perfectly fine languages for such goals and they do their job well.
After that it becomes tricky: most likely a couple of web languages like PHP and Perl, in combination with a few of the old gems like Ada and FORTRAN. Ada is used in the aircraft industry on a regular basis, and FORTRAN is the cornerstone of weather prediction. Two rather interesting languages (not really programming languages, though) would most likely also show up there: VHDL and Verilog.
Anyway, I just wish people would stop linking to the TIOBE index, because it has essentially zero value compared to real research into the subject. I'd rather see them do a study trying to correlate suicide statistics in the programming community with the programming language being used at the time; that might actually give more information about how good a language is than a couple of search engine hits.
Re:Agreed. (Score:5, Insightful)
So strange... I find Ayn Rand completely guilty of the very same romantic notions that got the founders of Communism (which she so despised) into so much hot water. Perhaps it's true what they say about choosing your enemies well. Both presumed that the underlying greatness and magnificence of the human spirit — whether as a society or as a specific productive individual — would prove the guiding light for humanity. In fact humanity has shown precious few guiding lights, and for the most part we are little removed from our primate ancestors. This isn't to say that we aren't capable of transcendence, simply that you can't depend on it to build a social or philosophical framework.
Design the system that demands human transcendence, inspires greatness, puts strict limits on personal power, and responsibly accounts for the grosser human foibles and frailties, and you'll have a winner. We had that system in the form of checks and balances, until the "Randians" among us began to systematically dismantle those very defenses against our poorer natures, beginning in the 80s. Up until then, we had the time and means to look at the future we wanted as a society, not just a few social (read: financial) elites, and strive towards that future wisely and with due consideration. Now we're in a fine kettle of fish. Those elites have proven to be every bit as ignorant, self-obsessed/serving and foolish as everyone else, and they've squandered the future on extra McMansions, expensive cars and yachts, and the virtual hijacking of our society.
C is a great language. You can't get any closer to bare metal without slinging assembly around, and as more and more intelligent particles infiltrate everything from household appliances to ubiquitous sensors in the roads we drive on, you'd better believe that C will bring consciousness to the dross matter that surrounds us. I can only hope that we can put aside our prejudices (not only racial, but societal), begin to replace belief systems with educated inquiry, and treat the future with our intelligence rather than our primate predilections. It is the only hope I can see for a future worth living in.
Re:C Programming Language (Score:3, Insightful)
You can write object-oriented code in C.
The OP never said you couldn't, they said: "no pretense to be object oriented" (Emphasis mine.)
And as far as sophisticated code, I guess the author doesn't consider operating systems or most system programming to be sophisticated.
Again that's not what the OP said, they said C has: "no pretense to be ... sophisticated" They're saying C itself is not sophisticated, that has nothing to do with the code written in C. Sand is not a sophisticated medium, but I've seen sand sculptures that are definitely sophisticated.
However, you are 100% correct in challenging the "assembler-like" comment.
Re:C Programming Language (Score:5, Insightful)
Code is a way of expressing human thought (language) in a way that binary machines can interpret and perform. There has been an endless search for a language that elegantly captures the grace and power of abstract human thinking.
One of those searches led to Object Oriented Programming. An OO language breaks the organization of THINGS in a very natural way for western thinkers. The thought here is that, by creating logical constructs representing an OBJECT — which has both its own unique qualities and abilities, while at the same time inheriting qualities and capabilities from the family of OBJECT from which it was derived — you can perform wonderful things with a minimum of code, and that if you were careful in designing your application, it should be easily adaptable and extensible to the vagaries of life. Of course this power doesn't come free, and there is operational code to support its behavior, so tiny problems or very small code may well demand C, while a large application is best implemented in a framework that gives you the logical freedom of an OO environment.
I see you nodding — is that you understanding, or falling asleep? Sorry if the monologue uses big words; they're part of the concepts. Anyway, languages have intrinsic power depending on their features and capabilities. Arguably, LISP is the most powerful language one can program in today. It is also one of the more syntactically challenging, and demands a fairly healthy understanding of what a machine is fundamentally capable of doing to use to its full potential. There is a spectacular free course available from MIT online; go here [oreillynet.com] to read more about it, and decide if it's something you might be interested in. While you're at it, you might want to read up on functional languages (for the more action-oriented among us), or just spend a while over at Wikipedia learning about computer languages and how we got here. Definitely read a book on algorithms. Understanding how we take everyday problems and reduce them to logical constructs, and how very smart people have optimized the process of managing those problems, is a very cool exercise... and it'll grow your brain a notch or two (help you look at problems newly). Master abstraction and reduction, and you've got a bright future wherever you go.
Re:fp (Score:5, Insightful)
The idea of object-oriented vs. non-object-oriented languages has always thrown me off.
Everyone else will attempt to explain OO using OO terms to a non-OO programmer. That's like trying to teach my dog to sail a boat by speaking Japanese. I'll try a different tack. You know what a computed goto is, right? (Other than pure unadulterated evil, that is.) What if your compiler enforced the hell out of good commenting and error-bound checking to let you do computed gotos safely (er, more or less)? Well, that is barely scratching the surface of OO. Syntactic sugar mounded on top of syntactic sugar. You know that quote about turtles all the way down; well, fundamentally, no matter the paradigm, it's Turing machines all the way down... more or less.
but really slashdot, what is the big draw to OO
When your professor was a little baby skript kiddie wannabe on his TRS-80 Coco-2 running OS/9 and BASIC09 and liking it, object orientation was the silver bullet among the crowd who could not bother to read "the mythical man month" by Brooks. So now you suffer thru OO because it was "cool" back when parachute pants were also cool, and leggings. Much as we're now raising a crop of wannabe skript kiddies who look up to the functional programming and agile methods people who have also never read "the mythical man month" by Brooks, so your kids / my grandkids are going to have to learn functional programming as The_One_True_Paradigm_And_all_disbelievers_should_be_burned_at_the_stake. And I'll still be writing device driver code on PIC microcontrollers in raw assembly, and it'll work great and I'll be liking it.
There's a really nice wiki article you probably need to read. The world is a lot bigger than "OO" "non-OO".
http://en.wikipedia.org/wiki/Comparison_of_programming_paradigms [wikipedia.org]
Re:fp (Score:5, Insightful)
What the fuck? Why the fuck would I subclass a button just to make it blue? That's just data, and damned trivial data at that. If your button object doesn't already have some mechanism for dealing with that data, it sucks and I'm using a different object, not yours.
Instead of having to code buttons from scratch, you subclass them...
No no no no NO. Goddamnit NO. Fucking Java. Motherfucking Javascript. They've ruined a generation of programmers.
Subclassing is the LAST thing you should be doing. The very last. First you should be using the customization features built in to the object, and using them directly on an instance of that object. Set the blue color on the Button instance and be done with it. If that's not sufficient, use object composition. Most of the time, your object is NOT a Button. It's a something that needs to have a button. Only as a last possible resort do you subclass Button, and you'd damn well better be writing an object that still is-a Button. If you're not, you've done it WRONG.
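The same "set the data on the instance, compose when you need more" idea can be sketched in C rather than a class-based language. Everything here is hypothetical for illustration — there is no real Button API in question; the point is that color is plain data on an instance, and the dialog HAS-A button rather than IS-A button:

```c
#include <assert.h>
#include <string.h>

/* A hypothetical Button "object": customization is plain data on the
   instance. No subclass is needed just to change the color. */
typedef struct {
    char label[32];
    char color[16];
} Button;

void button_init(Button *b, const char *label) {
    strncpy(b->label, label, sizeof b->label - 1);
    b->label[sizeof b->label - 1] = '\0';
    strcpy(b->color, "gray");   /* default color */
}

/* Composition: a dialog is NOT a button, it HAS a button. */
typedef struct {
    const char *title;
    Button ok;
} Dialog;

void dialog_init(Dialog *d, const char *title) {
    d->title = title;
    button_init(&d->ok, "OK");
    strcpy(d->ok.color, "blue"); /* customize the instance directly */
}
```

No inheritance anywhere: the blue color is just a field set on one instance, and reuse comes from embedding the Button inside the Dialog.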
Re:Agreed. (Score:5, Insightful)
By all means, there's nothing wrong with primates... fine animals. They just tend to form hierarchies along lines of dominance; they commit acts of violence on one another, including infants; they're greedy, scheming, back-stabbing, self-serving Machiavellian bastards (to paraphrase one of the world's leading authorities [wordpress.com] on primate research).
So we aren't as bad as baboons and we aren't as good as bonobos. We fall neatly on the primate continuum of behavior (good and bad). The problem is that we have nukes. A pissing contest among humans could end in a 20-mile-wide blue glass ashtray. All I'm saying is that, as far as being a primate has gotten us, it's perhaps time to begin rising above the worst of our inclinations while rising above them still makes a difference.
Re:fp (Score:5, Insightful)
Except that if you read his code, he's not actually subclassing Button, he's instantiating it. He's certainly saying it wrong, though.
Re:C Programming Language (Score:5, Insightful)
And--I know I'm going to be stoned for this--Linus =/= God.
Re:I guess you don't understand languages either (Score:5, Insightful)
So, you think that is object-orientation? Oh boy.
From wikipedia [wikipedia.org]: "Object-oriented programming (OOP) is a programming paradigm using "objects" - data structures consisting of data fields and methods together with their interactions - to design applications and computer programs."
The GP's method certainly qualifies. Just because it doesn't include all the sugary syntax or features that are included in your favorite so-called "OOP language" doesn't mean that you can't do object-oriented programming in C.
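A minimal sketch of what that looks like in practice: a struct carrying data fields plus function pointers for its methods gives you encapsulation, dynamic dispatch, and inheritance-by-embedding, all by hand. The Shape/Rect names are purely illustrative, not from any particular codebase:

```c
#include <assert.h>
#include <stdlib.h>

/* A "class": data fields plus a method table, built by hand. */
typedef struct Shape {
    double (*area)(const struct Shape *self);   /* "virtual" method */
    void   (*destroy)(struct Shape *self);
} Shape;

/* "Inheritance" by embedding the base struct as the first member. */
typedef struct {
    Shape base;
    double w, h;
} Rect;

static double rect_area(const Shape *self) {
    /* Valid cast: Rect's first member is a Shape, so the pointers
       to the Rect and to its base coincide. */
    const Rect *r = (const Rect *)self;
    return r->w * r->h;
}

static void shape_destroy(Shape *self) {
    free(self);
}

Shape *rect_new(double w, double h) {
    Rect *r = malloc(sizeof *r);
    r->base.area = rect_area;      /* wire up the method table */
    r->base.destroy = shape_destroy;
    r->w = w;
    r->h = h;
    return &r->base;
}
```

Calling `s->area(s)` on a `Shape *` dispatches to whichever implementation the concrete struct installed — which is essentially what a C++ vtable does for you automatically.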
Re:I guess you don't understand languages either (Score:4, Insightful)
Oh, but it is. C is actually very, very close to assembly language, with only the most unimportant CPU-specific details abstracted away. The primitive types in C are almost always natively supported by the CPU in assembly language, with few exceptions. Instead of having to manage your own stack, it mostly manages it for you, but it still leaves plenty of room for shenanigans, particularly because it doesn't enforce the number of arguments any more than asm does. And if you use varargs, you pretty much are doing direct accesses to the stack using indexed addressing. Simple asm.
Accesses to a struct are just a tiny bit of syntactic sugar on top of an indexed load/store. Goto is a jmp; setjmp and longjmp just save a register context and later jump back to it. The if/then constructs have near-exact ASM equivalents (albeit with a couple of extra jump instructions thrown in), and even while loops are just a couple of instructions (not counting whatever calculations must be performed to determine which path to take).
C abstracts away some stack management details, register quantity limits, etc., but it really is little more than portable assembly language, by design. It was intended for systems-level programming, and does that job well, in part because it is such a thin layer compared with most other languages.
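The varargs point above is easy to see in standard C: `stdarg.h` hands you a cursor into the argument area and you walk it yourself, much like indexed loads in hand-written assembly. A small sketch:

```c
#include <assert.h>
#include <stdarg.h>

/* Sum `count` ints passed as variable arguments. va_arg advances a
   cursor through the argument area one typed slot at a time -- the
   compiler does not check that count matches what the caller passed,
   exactly as in asm. */
int sum_ints(int count, ...) {
    va_list ap;
    int total = 0;

    va_start(ap, count);            /* position cursor after `count` */
    for (int i = 0; i < count; i++)
        total += va_arg(ap, int);   /* fetch the next int argument */
    va_end(ap);

    return total;
}
```

Note that nothing stops you from lying about `count` or the argument types — the undefined behavior you get if you do is the same class of bug you'd get from a miscounted stack offset in assembly.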
Re:sorry (Score:5, Insightful)
That'd be like saying letters are no longer required because we'll all be using words and sentences from now on.
That's what the Chinese did!
Re:fp (Score:0, Insightful)
Objects simply allow for an efficient programming structure for large software. That's the main reason.
Sorry, those imaginary benefits are as mythical as unicorns. OOP is objectively inefficient.
Really, this shit has been studied. No benefits in productivity were found for OOP. I'm actually shocked to find that OOP didn't harm productivity!
OOP leads to an incomprehensible mess of dependencies (inheritance is bad, composition is almost worse) -- and design patterns lead directly to over-engineered monstrosities that are not only bloated and slow, but destroy any hope of maintainability.
People get confused because objects can be handy abstractions -- and used extremely sparingly, and only when other modular approaches don't fit the problem as well, can make code easier to understand and maintain. Objects, however, do not OOP make.
Even Alan Kay (the guy who coined the term) regrets it. It distracted from his much better idea: agents that communicate via message passing (incidentally, a concept NOT well served by modern OOP languages).
One apology I hear all the time is that no one really understands OOP and that it takes years of study to truly grok the concepts and see the benefits. Well, if that's the case (very doubtful), then half of the promises of OOP are false. It's not easier to use, it clearly doesn't simplify design or development, and to top it all off, it bloats your code with no tangible benefits.
OOP is a lie, a hoax, over-hyped, and over-sold. It's a blight on software development. It needs to die. It's anti-modular (accidentally) and anti-parallel (intrinsically). It's the single worst paradigm (if it even qualifies, as there is so little agreement about what constitutes OOP) for this modern era.
If you're an advocate of OOP, I can automatically assume that you're either an idiot or so insecure that you don't dare question the "mainstream" view, like the average creationist.
Re:C Programming Language (Score:5, Insightful)
An OO language breaks the organization of THINGS in a very natural way for western thinkers.
Yes. And that right there is a subtle trap.
The first problem is that the "tree of subclasses" organisation, while on the surface seeming natural, is not in fact an accurate description of real taxonomies found either in nature or in large software projects. Especially so if the "software" includes business data. It turns out there are an awful lot of platypuses in the real world, things which simply don't fit neatly into the tree.
For example, a classic "toy" example often used in the OO analysis world is a database of employees. Let's see, we have managers, and we have workers. Great, we can subclass those! We'll have an abstract Person class, then personWorker and personManager as subclasses of Person. Instantiate Jack Smith as an instance of class personWorker. Problem sol — um. Wait. Jack just got promoted from a worker to a manager. Crap. Can our OO system of choice handle dynamically changing an object's class during its lifetime? No, it enforces strict classing, so it can't. Oops. No problem, we'll delete Jack and recreate... oh. His entire work history was attached to that object, linked by opaque reference and not by name or staff ID, and now it's all gone forever. Double crap. Oh well. He's left the company anyway, and now he's come back as a private contractor. We'll just make him a new personContractor. Easy. Yeah, wait, now we're also dealing with him over in the billing system as a personSupplier. But wait, there's more: he just bought some stuff from us, so he's also a personCustomer! Now he's three classes at once! The universe has gone crazy!
Most real OO systems "solve" this problem by either not doing inheritance here at all - therefore completely invalidating the "OO is about inheritance" line - or duplicating the data in multiple objects - thereby invalidating the "OO is about modelling the business domain directly" line. But at this point we're really starting to lose most of the advantages of OO entirely.
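One common way out, which the parent alludes to when it mentions not doing inheritance at all, is to model roles as plain data on a single record instead of as classes. A rough sketch in C (the Person struct and ROLE_* flags are hypothetical, not from any real system):

```c
#include <assert.h>

/* Roles as a bitmask: one Person can hold any combination of roles,
   and roles can change at runtime without destroying the record or
   the history attached to its stable staff_id. */
enum {
    ROLE_WORKER     = 1 << 0,
    ROLE_MANAGER    = 1 << 1,
    ROLE_CONTRACTOR = 1 << 2,
    ROLE_CUSTOMER   = 1 << 3
};

typedef struct {
    int staff_id;    /* stable identity, independent of any role */
    unsigned roles;  /* bitmask of ROLE_* flags */
} Person;

/* Promotion is a data update, not a delete-and-recreate. */
void promote_to_manager(Person *p) {
    p->roles &= ~(unsigned)ROLE_WORKER;
    p->roles |= ROLE_MANAGER;
}
```

Jack being a contractor, a supplier, and a customer at once is then just three bits set on one record — the "three classes at once" problem never arises, because class identity was never used to encode the roles.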
But there's a second, even more subtle problem: although OO usually uses "class" as a synonym for "type", it turns out that subclasses are NOT at all the same thing mathematically as a subtype. (Because you can override the behaviour of a class, meaning its behaviour is now not a strict subset of its superclass, but can also be a superset.) In fact there's no really sensible definition of "subtype" at all - Liskov substitutability requires that you define a context within which you want to limit your idea of "equality", and over the scope and lifetime of a sufficiently large software system, that context is going to change radically. So there goes all your type safety. Add in runtime reflection (which was a fundamental principle of Smalltalk, the first OO system, but seems to be an optional add-on recently tossed haphazardly back into the modern variety) and things get even more confused.
And finally, even the idea of typing can become a third subtle trap. Even if you could (which you can't, in the real world) restrict your software system to a neat tree of subclasses corresponding exactly to strict subtypes in a glorious Platonic universe — if you look at your code carefully, you find that your class/type structure, no matter how strict and clever you make it, doesn't actually tell you anything about the behaviour of your objects. It only tells you the calling signature. That you've defined an addOne method in all your IncrementableByOne class structures doesn't mean that any of those subclasses actually have to implement int addOne(int X) as returning X plus one — just that they receive and emit an integer. So after all your compile-time declarations, you've gained a whole lot of not much at all, and you have to implement a whole testing-harness apparatus to do by hand what your compiler initially promised it would do.
tl;dr: Just like (insert a political philosophy you dislike), OO is a big idea, a seductive idea, but not actually a correct idea. And attempting to apply it thoughtlessly will cause pain.
Re:C Programming Language (Score:3, Insightful)
- inefficient abstracted programming models where two years down the road you notice that some abstraction wasn't very efficient, but now all your code depends on all the nice object models around it, and you cannot fix it without rewriting your app.
This is no different in C...
The same problem is much worse in C. Because you will have grafted a crappy half-OO hack onto it that has accreted stupid numbers of macros, casts and other abuses. And it will be hell to change. Speaking from experience.
Re:C Programming Language (Score:5, Insightful)
Linus is a pretty smart kernel programmer & architect who programs almost exclusively in C. It's no surprise that he doesn't even want to grok C++.
Re:TIOBE is 'real research'. Just misunderstood. (Score:4, Insightful)
If you look at the description of how they compute the index, it's essentially useless for any practical purpose. So why even bother debating it?
Re:C Programming Language (Score:4, Insightful)
And yet he's replying to someone who seems to want to use C++ for the sake of using C++...