Objective-C Overtakes C++, But C Is Number One 594
mikejuk writes "Although the TIOBE Index has its shortcomings, the finding that Objective-C has overtaken C++ is reiterated in the open source Transparent Language Popularity Index. The reason is, of course, that Objective-C is the language you have to use to create iOS applications — and as iPads and iPhones have risen in popularity, so has Objective-C. If you look at the raw charts then you can see that C++ has been in decline since about 2005 and Objective-C has shot up to overtake it with amazing growth. But the two charts are on different scales: if you plot both on the same chart, you can see that rather than rocketing up, Objective-C has just crawled its way past, and it is as much to do with the decline of C++. It simply hasn't reached the popularity of C++ in its heyday before 2005. However the real story is that C, a raw machine independent assembler-like language, with no pretense to be object oriented or sophisticated, has beaten all three of the object oriented heavyweights — Java, C++ and Objective-C. Yes, C is number one (and a close second in the transparent index)."
I prefer Subjective-C (Score:5, Funny)
But that's just my opinion.
Agreed. (Score:5, Funny)
C's philosophy doesn't integrate well with Ayn Rand's.
Re:Agreed. (Score:4, Funny)
C's philosophy doesn't integrate well with Ayn Rand's.
Like hell it doesn't.
With C and Ayn Rand - you're on your own.
No pussy footing around with pee-pee holding concepts like "garbage collection", "array bounds checking", "welfare", "free health care".
Those are all for fucking wimps who need something to protect their incompetent asses.
Re:Agreed. (Score:5, Insightful)
So strange... I find Ayn Rand completely guilty of the very same romantic notions that got the founders of Communism (whom she so despised) into so much hot water. Perhaps it's true what they say about choosing your enemies well. Both presumed that the underlying greatness and magnificence of the human spirit, either as a society or as a specific productive individual, would prove the guiding light for humanity. In fact humanity has shown precious few guiding lights and for the most part, we are little descended from our primate ancestors. This isn't to say that we aren't capable of transcendence, simply that you can't depend on that to build a social or philosophical framework.
Design the system that demands human transcendence, inspires greatness, and puts strict limits to personal power and responsibly accounts for the grosser of human foibles and frailties, and you'll have a winner. We had that system in the form of checks and balances, until the "Randian" among us began to systematically dismantle those very defenses against our poorer natures, beginning in the 80s. Up until then, we had the time and means to look at the future we wanted as a society, not just a few social (read financial) elites, and strive towards that future wisely and with due consideration. Now we're in a kettle of fish. Those elite have proven to be every bit as ignorant, self obsessed/serving and foolish as everyone else and they've squandered the future on extra McMansions, expensive cars and yachts, and the virtual hijacking of our society.
C is a great language. You can't get any closer to bare metal without slinging assembly around, and as we move to more and more intelligent particles infiltrating everything from household appliances to ubiquitous sensors in the roads we drive on, you better believe that C will bring consciousness to the dross matter that surrounds us. I can only hope that we can put aside our prejudices (not only racial, but societal), and begin to replace belief systems with educated inquiry, and treat the future with our intelligence rather than our primate predilections. It is the only hope I can see for a future worth living in.
Re:Agreed. (Score:5, Funny)
Re:Agreed. (Score:5, Insightful)
By all means, there's nothing wrong with primates... fine animals. They just tend to form hierarchies along lines of dominance, commit acts of violence on one another including infants; they're greedy, scheming, back-stabbing, self-serving Machiavellian bastards (to paraphrase one of the world's leading authorities [wordpress.com] on primate research).
So we aren't as bad as baboons and we aren't as good as bonobos. We fall neatly on the primate continuum of behavior (good and bad). The problem is that we have nukes. A pissing contest among humans could end in a 20 mile wide blue glass ashtray. All I'm saying is that as good as being a primate has gotten us so far, it's perhaps time to begin rising above the worst of our inclinations while rising above them still makes a difference.
Re:Agreed. (Score:5, Funny)
'C' - The language of technocrats (Score:3)
What we need now is another tenuously linked meme... In my copy of 1984 there is a reference to a fictional document that describes the different languages spoken by various groups. One of those languages is 'C' - the language of technocrats. So it follo
Re:Agreed. (Score:5, Funny)
But it's bottoms-up
Yes, Forth programming does tend to go a lot smoother when you drain a glass each time you have to look up the stack effect of a word.
Re: (Score:3)
Forth is a bit like darts. I paradoxically get better at darts the more I drink. Thus I'm very fond of darts, like I'm very fond of forth.
Somehow I feel that if I did forth for a day job, my liver would be destroyed.
Re: (Score:3)
Forth is only closer to the metal than C when it is running on a native Forth processor.
If not, it abstracts away everything that the underlying machine gives/has. So how can it be close to something that it is abstracting away?
Re: (Score:3)
Forth is only closer to the metal than C when it is running on a native Forth processor.
If not, it abstracts away everything that the underlying machine gives/has. So how can it be close to something that it is abstracting away?
Because the same instruction leads to the same piece of code, every time. When you do "dup +", you can know what code it creates. It won't be different when done a different place. You can follow the program, step by step, and know what the CPU does.
Re:Agreed. (Score:5, Funny)
"all illegal immigrants should be sent back to whence they came. america for americans."
Wasn't that Sitting Bull motto?
Re:Agreed. (Score:5, Funny)
Re:OH I GOT MODDED DOWN??? (Score:5, Funny)
It's about mentioning this "god" fantasy thing
It's no fantasy.
God started out as a C coder, got bored and tried to rebuild the project in a self-built language similar to Brainfuck http://en.wikipedia.org/wiki/Brainfuck [wikipedia.org], now called DNA. The signs are everywhere - in fact GCC is still being used in places, notably to produce Alanine.
Of course, it's an old project, abandoned long ago. There's cruft, commented-out code and dependencies everywhere. The APIs are wildly inconsistent, the whole thing is a virus and worm magnet. Even fork bombs are rarely trapped.
The documentation is archaic and unreadable, rewritten from the original by ancient geeks. Modern coders can only guess at what it means, and according to Nietzsche, the guy who wrote it left the company long ago.
About the only thing going for it is a very effective, if slightly weird, bootstrapping process.
sorry (Score:4, Funny)
sorry but html and javascript are the future.. it must be true because all the kids just out of college say so.
Re:sorry (Score:5, Insightful)
That'd be like saying letters are no longer required because we'll all be using words and sentences from now on.
Re:sorry (Score:5, Insightful)
That'd be like saying letters are no longer required because we'll all be using words and sentences from now on.
That's what the Chinese did!
Re:sorry (Score:5, Informative)
That'd be like saying letters are no longer required because we'll all be using words and sentences from now on.
That's what the Chinese did!
Kind of, but not really. There are far, far more words in Chinese than there are Chinese characters, and characters often don't stand on their own as words. Rather, individual characters represent morphemes with a single (or small number of) sounds, which often have no real meaning on their own. These morphemes are then combined to form words. In that sense Chinese characters are like an alphabet, albeit with characters which represent complete syllables rather than individual sounds and which generally (but not always) have some sort of semantic meaning.

Secondly, if you look at the way characters are formed in Chinese, there is a set of basic characters which are used as phonetic units in constructing most of the other characters. Most of the other characters end up consisting of a basic character indicating the phonological sound and a radical to (very broadly) indicate the semantic meaning of the character. So in terms of both 1. how the characters are used and 2. how the characters are constructed, Chinese characters still deal with sound and meaning at a sub-word level.
Re: (Score:3)
Most individual Chinese characters do have their own individual meaning(s).
It used to be (thousands of years ago) that characters were essentially words, but these days multiple-character words are more in fashion. Classical Chinese is still legible to those with a bit of training, and still sees some use in modern contexts.
You are indeed correct that the characters are often constructed with other "basic characters" that contribute a meaning and sound, but that's at a "sub-character" level, not really
Re: (Score:3)
Korean uses a phonetic system now.
Re: (Score:3)
C Programming Language (Score:4, Insightful)
However the real story is that C, a raw machine independent assembler-like language, with no pretense to be object oriented or sophisticated, has beaten all three of the object oriented heavy weights
This sounds like it was written by someone who doesn't understand C. You can write object orientated code in C. You don't always need the language to hold your hand. And C is NOT an assembler-like language. Not even close.
And as far as sophisticated code goes, I guess the author doesn't consider operating systems or most system programming to be sophisticated.
Re:C Programming Language (Score:5, Insightful)
Is this a touchy subject for you, AC?
The author didn't say anything about sophisticated code, they said that C isn't a particularly sophisticated language. And it's not. C doesn't have very many bells and whistles -- it's just a very good, general-purpose language. The fact that the language itself is unsophisticated is what makes it good for writing the kind of code people write in C.
Secondly, C is not an object oriented language. I can write object oriented code in assembly language if I want, but that doesn't make assembly language object oriented.
Re: (Score:2)
> You can write object orientated code in C.
You technically could...but why would you? If you want to write object oriented C, there's C++. What's the benefit?
Re:C Programming Language (Score:4, Interesting)
Subject: Re: [RFC] Convert builin-mailinfo.c to use The Better String Library.
Newsgroups: gmane.comp.version-control.git
Date: 2007-09-06 17:50:28 GMT (2 years, 14 weeks, 16 hours and 36 minutes ago)
On Wed, 5 Sep 2007, Dmitry Kakurin wrote:
>
> When I first looked at Git source code two things struck me as odd:
> 1. Pure C as opposed to C++. No idea why. Please don't talk about portability,
> it's BS.
*YOU* are full of bullshit.
C++ is a horrible language. It's made more horrible by the fact that a lot of substandard programmers use it, to the point where it's much much easier to generate total and utter crap with it. Quite frankly, even if the choice of C were to do *nothing* but keep the C++ programmers out, that in itself would be a huge reason to use C.
In other words: the choice of C is the only sane choice. I know Miles Bader jokingly said "to piss you off", but it's actually true. I've come to the conclusion that any programmer that would prefer the project to be in C++ over C is likely a programmer that I really *would* prefer to piss off, so that he doesn't come and screw up any project I'm involved with.
C++ leads to really really bad design choices. You invariably start using the "nice" library features of the language like STL and Boost and other total and utter crap, that may "help" you program, but causes:
- infinite amounts of pain when they don't work (and anybody who tells me that STL and especially Boost are stable and portable is just so full of BS that it's not even funny)
- inefficient abstracted programming models where two years down the road you notice that some abstraction wasn't very efficient, but now all your code depends on all the nice object models around it, and you cannot fix it without rewriting your app.
In other words, the only way to do good, efficient, and system-level and portable C++ ends up to limit yourself to all the things that are basically available in C. And limiting your project to C means that people don't screw that up, and also means that you get a lot of programmers that do actually understand low-level issues and don't screw things up with any idiotic "object model" crap.
So I'm sorry, but for something like git, where efficiency was a primary objective, the "advantages" of C++ is just a huge mistake. The fact that we also piss off people who cannot see that is just a big additional advantage.
If you want a VCS that is written in C++, go play with Monotone. Really.
They use a "real database". They use "nice object-oriented libraries". They use "nice C++ abstractions". And quite frankly, as a result of all these design decisions that sound so appealing to some CS people, the end result is a horrible and unmaintainable mess.
But I'm sure you'd like it more than git.
Linus
- - -
From: Linus Torvalds
Subject: Re: Compiling C++ kernel module + Makefile
Date: Mon, 19 Jan 2004 22:46:23 -0800 (PST)
On Tue, 20 Jan 2004, Robin Rosenberg wrote:
>
> This is the "We've always used COBOL^H^H^H^H" argument.
In fact, in Linux we did try C++ once already, back in 1992.
It sucks. Trust me - writing kernel code in C++ is a BLOODY STUPID IDEA.
The fact is, C++ compilers are not trustworthy. They were even worse in 1992, but some fundamental facts haven't changed:
- the whole C++ exception handling thing is fundamentally broken. It's _especially_ broken for kernels.
- any compiler or language that likes to hide things like memory allocations behind your back just isn't a good choice for a kernel.
- you can write object-oriented code (useful for filesystems etc) in C, _without_ the crap that is C++.
In general, I'd say that anybody who designs his kernel modules for C++ is either
(a) looking for problems
(b) a C++ bigot that can't see what he is writing is really just C anyway
(c) was given an assignment in CS class to do so.
Feel free to make up (d).
Linus
Re:C Programming Language (Score:5, Interesting)
From: Linus Torvalds Subject: Re: [RFC] Convert builin-mailinfo.c to use The Better String Library. Newsgroups: gmane.comp.version-control.git Date: 2007-09-06 17:50:28 GMT (2 years, 14 weeks, 16 hours and 36 minutes ago) ...
In other words, the only way to do good, efficient, and system-level and portable C++ ends up to limit yourself to all the things that are basically available in C. And limiting your project to C means that people don't screw that up, and also means that you get a lot of programmers that do actually understand low-level issues and don't screw things up with any idiotic "object model" crap.
And, for a view somewhat less harsh about C++, but still not a case of "C++ roolz, C droolz!", see The Old Man and the C [opensolaris.org], the abstract of which says
Re:C Programming Language (Score:5, Insightful)
And--I know I'm going to be stoned for this--Linus =/= God.
Re: (Score:3, Insightful)
- inefficient abstracted programming models where two years down the road you notice that some abstraction wasn't very efficient, but now all your code depends on all the nice object models around it, and you cannot fix it without rewriting your app.
This is no different in C...
The same problem is much worse in C. Because you will have grafted a crappy half-OO hack onto it that has accreted stupid numbers of macros, casts and other abuses. And it will be hell to change. Speaking from experience.
Re:C Programming Language (Score:5, Interesting)
For all his brilliance, Linus is stupid about some things. One of them is his near complete lack of understanding of C++. He obviously has zero skill in it, but he has plenty of skill at getting up on his pulpit and flaming people who do.
Re:C Programming Language (Score:4, Insightful)
And yet he's replying to someone that seems to want to use C++ for the sake of using C++...
Re:C Programming Language (Score:5, Insightful)
Linus is a pretty smart kernel programmer & architect who programs almost exclusively in C. It's no surprise that he doesn't even want to grok C++.
Re: (Score:3)
Re: (Score:3)
The first "real" OO language was Simula 67, and in fact it's a direct ancestor of C++ - if you look at Simula, you suddenly realize that the first version of C++ was more or less C with Simula OOP constructs grafted on top (this is where "virtual" came from, by the way).
Re: (Score:3)
You technically could...but why would you? If you want to write object oriented C, there's C++. What's the benefit?
C++ is a more expressive language than C. That's not necessarily better, though--I could define an extension to C where the + operator works between a string and an integer as follows:
mystr + myint is defined such that it returns a string of the same length as mystr, but where each element of the return value is myint less than the equivalent char in mystr. So "dddd" + 2 returns "bbbb", "zzzz
Re:C Programming Language (Score:4, Interesting)
Creating unusual object structures:
I once played around with a state machine framework that was object-oriented C. It had a virtual table at the top of one base class and at the bottom of a different one. Using the same layout in the virtual table allowed multiple inheritance without any special effort and without any wasted space. It's the only object structure I've ever played with that I couldn't implement with C++ classes and inheritance. (I guess a *really* good C++ compiler might be able to optimize to that structure.)
Virtual Static members and member functions:
There are occasionally times when I need polymorphic behavior, but the behavior itself isn't instance specific. As a contrived example, imagine needing to query an instance of an object for the total number of peer objects that are in existence (I'm probably managing a count during construction/destruction). I need to call some member that is class specific, but that member will only need to use static members to execute. As it is, this ends up being declared virtual and the this pointer is (needlessly) passed in.
Those are the only two instances I've run across where I actually wrote code up to the point of noticing that I couldn't do that in C++. (I'm actually still shocked, ten years later, that there's no such thing as a pure virtual static function.)
Re:C Programming Language (Score:5, Interesting)
You will often see comments to the effect that C is like assembler, or that you can do anything in C and it just lacks some syntactic sugar. But that is very wrong. Yes, you can to some degree emulate object oriented programming in C. But how would you go about changing your memory allocation (malloc) to use a copying garbage collector? Or doing lazy evaluation, Haskell style? How do you implement zero-cost exception handling? (longjmp is NOT zero-cost because it requires setjmp).
These concepts are easy for a compiler that compiles directly to assembly language. Often less mature compilers will compile to C as an intermediate language, but in that case the compiler will not be able to generate the most efficient code. For example, a compiler that uses C as an intermediate step can implement exceptions using setjmp/longjmp, but this adds extra code at every function call. A compiler that goes directly to assembler can instead implement exception unwinding using static knowledge about the stack layout, for a so-called zero-cost exception handling solution.
Similarly, a compiler using C as an intermediate will be forced to use a conservative garbage collector such as the Boehm GC. More efficient solutions such as a copying garbage collector are simply not possible without knowledge of the stack layout.
Re: (Score:3, Insightful)
You can write object orientated code in C.
The OP never said you couldn't, they said: "no pretense to be object oriented" (Emphasis mine.)
And as far as sophisticated code, I guess the author doesn't consider operating systems or most system programming to be sophisticated.
Again that's not what the OP said, they said C has: "no pretense to be ... sophisticated" They're saying C itself is not sophisticated, that has nothing to do with the code written in C. Sand is not a sophisticated medium, but I've seen sand sculptures that are definitely sophisticated.
However, you are 100% correct in challenging the "assembler-like" comment.
Re:C Programming Language (Score:5, Insightful)
Code is a way of expressing human thought (language) in a way that binary machines can interpret and perform. There has been a forever search for a language that best captures the grace and power of abstract human thinking elegantly.
One of those searches lead to Object Oriented Programming. An OO language breaks the organization of THINGS in a very natural way for western thinkers. The thought here is that by creating logical constructs representing an OBJECT which has both its own unique qualities and abilities, while at the same time inheriting qualities and capabilities from the family of OBJECT from whence it was derived, that you can perform wonderful things with a minimum of code and that if you were careful in designing your application that it should be easily adaptable and extensible to the vagaries of life. Of course this power doesn't come free, and there is operational code to support its behavior, so tiny problems or very small code may well demand C, while a large application is best implemented in a framework that gives you the logical freedom of an OO environment.
I see you nodding, is that you understanding or falling asleep... sorry if the monologue uses big words, they're part of the concepts. Anyway, languages have intrinsic power depending on their features and capabilities. Arguably, LISP is the most powerful language one can program in today. It is also one of the more syntactically challenging, and demands a fairly healthy understanding of what a machine is fundamentally capable of doing to use it to its full potential. There is a spectacular free course available from MIT online; go here [oreillynet.com] to read more about it, and decide if it's something you might be interested in. While you're at it, you might want to read up on functional languages (for the more action oriented among us) or just spend a while over at Wikipedia learning about computer languages and how we got here. Definitely read a book on algorithms. Understanding how we take everyday problems and reduce them to logical constructs, and how very smart people have optimized the process of managing those problems, is a very cool exercise... and it'll grow your brain a notch or two (help you look at problems in new ways). Master abstraction and reduction, and you've got a bright future wherever you go.
Re: (Score:3)
Re:C Programming Language (Score:5, Insightful)
An OO language breaks the organization of THINGS in a very natural way for western thinkers.
Yes. And that right there is a subtle trap.
The first problem is that the "tree of subclasses" organisation, while on the surface seeming natural, is not in fact an accurate description of real taxonomies found either in nature or in large software projects. Especially so if the "software" includes business data. It turns out there are an awful lot of platypuses in the real world, things which simply don't fit neatly into the tree.
For example, a classic "toy" example often used in the OO analysis world is a database of employees. Let's see, we have managers, and we have workers. Great, we can subclass those! We'll have an abstract Person class, then personWorker and personManager, which are subclasses of Person. Instantiate Jack Smith as an instance of class personWorker. Problem sol - um. Wait. Jack just got promoted from a worker to a manager. Crap. Can our OO system of choice handle dynamically changing an object's class during its lifetime? No, it enforces strict classing, so it can't. Oops. No problem, we'll delete Jack and recreate... oh. His entire work history was attached to that object, linked by opaque reference and not by name or staff ID, and now it's all gone forever. Double crap. Oh well. He's left the company anyway, and now he's come back as a private contractor. We'll just make him a new personContractor. Easy. Yeah, wait, now we're dealing with him also over in the billing system as a personSupplier. But wait, there's more, he just bought some stuff from us, so he's also a personCustomer! Now he's three classes at once! The universe has gone crazy!
Most real OO systems "solve" this problem by either not doing inheritance here at all - therefore completely invalidating the "OO is about inheritance" line - or duplicating the data in multiple objects - thereby invalidating the "OO is about modelling the business domain directly" line. But at this point we're really starting to lose most of the advantages of OO entirely.
But there's a second, even more subtle problem: although OO usually uses "class" as a synonym for "type", it turns out that subclasses are NOT at all the same thing mathematically as a subtype. (Because you can override the behaviour of a class, meaning its behaviour is now not a strict subset of its superclass, but can also be a superset.) In fact there's no really sensible definition of "subtype" at all - Liskov substitutability requires that you define a context within which you want to limit your idea of "equality", and over the scope and lifetime of a sufficiently large software system, that context is going to change radically. So there goes all your type safety. Add in runtime reflection (which was a fundamental principle of Smalltalk, the first OO system, but seems to be an optional add-on recently tossed haphazardly back into the modern variety) and things get even more confused.
And finally, even the idea of typing can become a third subtle trap. Even if you could (which you can't in the real world) restrict your software system to a neat tree of subclasses corresponding exactly to strict subtypes in a glorious Platonic universe - if you look at your code carefully, you find out that your class/type structure, no matter how strict and clever you make it, doesn't actually tell you anything about the behaviour of your objects. It only tells you the calling signature. That you've defined an addOne method in all your IncrementableByOne class structures doesn't mean that any of those subclasses actually have to implement int addOne(int X) as returning X plus one - just that they receive and emit an integer. So after all your compile-time declarations, you've gained a whole lot of not much at all, and you have to implement a whole testing harness apparatus to do by hand what your compiler initially promised it will do.
tl;dr: Just like (insert a political philosophy you dislike), OO is a big idea, a seductive idea, but not actually a correct idea. And attempting to apply it thoughtlessly will cause pain.
Re:I guess you don't understand languages either (Score:5, Informative)
typedef struct { // And data goes here.
int (*open)(void *self, char *fspec);
int (*close)(void *self);
int (*read)(void *self, void *buff, size_t max_sz, size_t *p_act_sz);
int (*write)(void *self, void *buff, size_t max_sz, size_t *p_act_sz);
} tCommClass;
http://stackoverflow.com/questions/351733/can-you-write-object-oriented-code-in-c
Re: (Score:3)
You don't need to get to functions pointers if you don't need polymorphism. It's like making (in C++) all methods virtual just for fun. You can have static OOP in C:
typedef struct FileClass FileClass;

int FileClass_open(FileClass* this, char* fileName);
int FileClass_close(FileClass* this);
int FileClass_read(FileClass* this, void* in, size_t size, size_t* actualSize);
int FileClass_write(FileClass* this, void* out, size_t size, size_t* actualSize);

struct FileClass {
    /* data fields go here */
};
Re:I guess you don't understand languages either (Score:5, Insightful)
So, you think that is object-orientation? Oh boy.
From wikipedia [wikipedia.org]: "Object-oriented programming (OOP) is a programming paradigm using "objects" - data structures consisting of data fields and methods together with their interactions - to design applications and computer programs."
The GP's method certainly qualifies. Just because it doesn't include all the sugary syntax or features that are included in your favorite so-called "OOP language" doesn't mean that you can't do object-oriented programming in C.
Re: (Score:3)
Doesn't sound like you know what OOD is. That method qualifies, and you also get inheritance and polymorphism. Perhaps you shouldn't be so smug next time?
It's not being smug when it's about pointing out the folly of reinventing the wheel, poorly, out of a square block and claiming that it is a wheel. You guys are the MadTV Stuarts of software development (ma, look what I can do... with structs and pointers to callback functions.)
One could say that programming in assembly using nothing but unconditional jumps and cond jumps is just the same as structured programming because, after all, structured programming will get translated to its assembly/machine language ju
Re: (Score:3)
I think that falls under "you don't always need the language to hold your hand."
Re:I guess you don't understand languages either (Score:5, Insightful)
Re: (Score:3, Informative)
you are aware that the first C++ compilers simply generated C code from the C++ and then compiled that.
Oh, and I've seen several OO languages written in C as well, that some senior engineer who "didn't trust C++" came up with. The only thing you don't get with these is enforcement of visibility with private/public, which isn't strictly required for OO. But polymorphism and the lot, yup, all that was there.
Re: (Score:3)
Enforcement of public/private can be done: simply have your base object struct have a void pointer field called private, and declare and allocate a private data struct as static in the .c implementation of the class. Voila, private fields and methods as needed.
Re:I guess you don't understand languages either (Score:5, Informative)
doesn't need to be void, even. (I'm sure purists will complain about _t being reserved)
[header file] ...
typedef struct something_s something_t;
something_t *private_stuff;
[C file] ...
struct something_s {
};
I use this sort of construct quite a lot.
Re:I guess you don't understand languages either (Score:4, Informative)
An object-oriented language has lots of syntactic help for the purpose, but all languages compile to some type of runtime code structure. If you understand what code gives the object-oriented behavior you want, then you can write it in C.
And yes, the poster who said C was assembler-like has likely never seen an assembler language. I do remember writing a C routine once which had an initialized array containing hex representations of machine code to do a particular highly specialized task, and then using some coding wizardry to get the locus of control into that array when needed. Ah, those were the days.
Re:I guess you don't understand languages either (Score:4, Informative)
And yes, the poster who said C was assembler-like likely has never seen an assembler language,
C doesn't look like 6502 or 1802 or 10f220 assembly, but if you squint you can see some PDP-11 addressing modes in there. Because a primary dev box was a ....
Also I see aspects of BAL from MVS370, but maybe that's just dead brain cells flickering.
Re:I guess you don't understand languages either (Score:4, Insightful)
Oh, but it is. C is actually very, very close to assembly language, with only the most unimportant CPU-specific details abstracted away. The primitive types in C are almost always natively supported by the CPU in assembly language, with few exceptions. Instead of having to manage your own stack, it mostly manages it for you, but it still leaves plenty of room for shenanigans, particularly because it doesn't enforce the number of arguments any more than asm does. And if you use varargs, you pretty much are doing direct accesses to the stack using indexed addressing. Simple asm.
Accesses to a struct are just a tiny bit of syntactic sugar on top of an indexed load/store. Goto is a jmp; setjmp and longjmp just set a register and then perform a jump to that address. The if/then constructs have near-exact asm equivalents (albeit with a couple of extra jump instructions thrown in), and even while loops are just a couple of instructions (not counting whatever calculations must be performed to determine which path to take).
C abstracts away some stack management details, register quantity limits, etc., but it really is little more than portable assembly language, by design. It was intended for systems-level programming, and does that job well, in part because it is such a thin layer compared with most other languages.
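A rough illustration of that thinness; the assembly in the comments is hand-waved x86-64-style pseudocode, not actual compiler output:

```c
struct point { int x, y; };       /* y lives at byte offset 4 */

/* A struct field access is one indexed load: roughly mov eax, [rdi+4] */
int get_y(const struct point *p) {
    return p->y;
}

/* The whole loop is a handful of instructions: an add, an increment,
   a compare, and a conditional jump back to the top. */
int sum_to(int n) {
    int total = 0;
    for (int i = 1; i <= n; i++)
        total += i;
    return total;
}
```

There is no runtime, no dispatch machinery, and no hidden allocation between what you wrote and what the CPU executes, which is the sense in which C is "portable assembly."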
Re:I guess you don't understand languages either (Score:5, Informative)
Considering that C++ was originally implemented as a preprocessor for C, there's an existence proof that says you are wrong.
Re:C Programming Language (Score:4, Funny)
In American English, one says and writes "oriented". In British English, the word "orientated" is used for the same purpose. Something I had to learn when I left the USA.
So the British book was "Murder on the Orientat Express"?
Java and C duking it out (Score:5, Interesting)
Java's apparent decline seems to be a product of the financial slump, during which the number of new enterprise projects using Java fell. Most of this work was deferred and is starting to pick up again (at least as far as I can see). Some of the apparent 'decline' in languages is due to the introduction of new languages: the absolute number of projects using any one language may be increasing, but with new languages arriving, the proportion held by any single language becomes diluted.
That said, C deserves to be right up there because it is still completely relevant as a 'lingua franca' (common language) for talking to hardware or operating systems. It also has the same benefit as Java in that the language is small and the convention is to place complexity in the libraries rather than in arbitrarily added keywords. This is not very exciting for many Slashdotters, but for regular joes it lets them get things done on huge, long-term projects (where the staff who start the project aren't necessarily those who finish it), where being able to follow other people's code is critical. This doesn't make for good press or excitement in the blogosphere or on the conference circuit, but these two stalwarts pretty much let you solve any problem in any computing environment (portability matters!).
Re:Java and C duking it out (Score:4, Funny)
Java's apparent decline seems to be a product of the financial slump, during which the number of new enterprise projects using Java fell. Most of this work was deferred and is starting to pick up again (at least as far as I can see)
I'm sure Oracle's Mongolian horde of lawyers factors in there somewhere, too.
Re: (Score:3)
Maybe they didn't change away from it, since that would involve trashing already invested time and money, but there's certainly a non-zero number of developers who went another direction at the start of a new project because of it.
Re: (Score:3)
Objective-C not required to create iOS Apps (Score:3, Informative)
"Objective-C is the language you have to use to create iOS applications"
There are plenty of games and other iOS applications that are written in C and C++.
Yes, there is a little bit of "glue" code required for interaction with Apple APIs, but the implication here is that you can't use another language to write the majority of an iOS application, which is wrong.
Re:Objective-C not required to create iOS Apps (Score:4, Insightful)
C in combination with some form of assembly still holds the absolute first position in terms of how much it's actually deployed. Every mainstream OS has its core and bootloader written in it.
C++ holds the second spot without problem, simply because it's compatible with C and offers native object extensions.
The top 5 will probably be completed by Visual Basic, C# and Java for enterprise applications. They're perfectly fine languages for such goals and they do their job well.
After that it becomes tricky: most likely a couple of web languages like PHP and Perl, in combination with a few of the old gems like Ada and FORTRAN. Ada is used in the aircraft industry on a regular basis, and FORTRAN is the cornerstone of weather prediction. Two rather interesting languages (not really programming languages, though) would most likely also show up there: VHDL and Verilog.
Anyway, I just wish people would stop linking to the TIOBE index, because it has essentially zero value compared to real research into the subject. I'd rather see them do a study trying to correlate suicide statistics in the programming community with the programming language being used at the time; that might actually give more information about how good a language is than a couple of search engine hits.
Re:TIOBE is 'real research'. Just misunderstood. (Score:4, Insightful)
If you look at the description of how they compute the index, it's essentially useless for any practical purpose. So why even bother debating it?
Bravo that C is still relevant. (Score:5, Insightful)
Re: (Score:3)
Yes, those two needs for performance are cloud and mobile.
Both have energy issues, so efficient code means more battery life or a lower electricity bill. Nowadays, no one cares about the old desktop arena, so native code is coming back.
"machine independent assembler-like language"?!? (Score:3)
What? What idiot posted that garbage? Oh, timothy...
Understood.
There are several problems with that article (Score:2)
1) The one stated in the summary - the C++ vs. Objective C graph is on a very small Y axis that exaggerates the differences.
2) They've included Javascript, apparently in its seldom-used server-side form, to intimate its popularity is going down (AFAICT they don't bother to mention this differentiation).
3) In their 2011 vs 2012 table, they indicate a language's change in table rank using arrows - one point in change equals one arrow. Visually that makes it look like some languages (e.g. Visual Basic .NET) a
Re: (Score:3)
Shifting market or shifting paradigm (Score:3)
Ever tried looking for jobs using C? (Score:5, Interesting)
I don't get it. If you try searching for jobs programming in C, you'll find that almost everything that matches the search is Objective-C, C++, or C# (or, on some poorly run job sites, a C++ or C# job where the punctuation got lost and it's displayed as C). Sometimes a job will say C/C++. C is as rare as hen's teeth except for embedded development, and there aren't *that* many jobs in embedded development.
I just went to monster.com and searched for C. What I found starting at the top was:
-- C++ job that lost the punctuation
-- Objective-C
-- C# job that lost the punctuation
-- C/C++
-- Objective-C
-- C/C++, C#
-- C/C++
-- Objective-C
etc. The first C job was item 14 (and is embedded). The next C job, ignoring the false hits on such things as A B C, was item 24 (also embedded), and C wasn't the main skill required. So how in the world can C be number one?
Re:Ever tried looking for jobs using C? (Score:5, Informative)
Mod parent up. "The popular search engines Google, Bing, Yahoo!, Wikipedia, Amazon, YouTube and Baidu are used to calculate the ratings." That's rather lame. Exactly how do they search for "C", anyway? Do Sesame Street episodes brought to you by the letter C count?
The decline in C++ is probably real. It's on the way out as an application-level programming language. Big, complex applications with serious performance requirements and elaborate internal data structures, like 3D CAD, benefit from being written in C++. But there's no reason to write a routine desktop business app in it any more. Just moving windows and menus around and talking to the database can be done far more easily by other means.
Re:Ever tried looking for jobs using C? (Score:4, Interesting)
Mod parent up. "The popular search engines Google, Bing, Yahoo!, Wikipedia, Amazon, YouTube and Baidu are used to calculate the ratings." That's rather lame. Exactly how do they search for "C", anyway? Do Sesame Street episodes brought to you by the letter C count?
TIOBE is good for generating bullshit headlines and boastful articles about misleading statistics.
The definition [tiobe.com] is pretty simple. They search for: +"<language> programming", then they try to look for false positives to get a "confidence" factor, and then use that to scale the resulting number of hits. They also include some search term qualifiers for certain languages, but I didn't see any listed for C.
This is really, really poor for a language like C with many false positives, because they are only looking at the first 100 results. The first results will have the fewest false positives, while the later results will almost all be false positives. They are assuming a linear relationship where instead it is most likely an exponential dropoff.
The fact that C is now on top is almost certainly due to the rise of false positives as Objective-C gains popularity.
C lives! (Score:2)
Wherefore art thou Dennis Ritchie?
We need a new language (Score:2)
And nothing of value was said (Score:3)
And that's not something to be settled by a popularity contest.
Survey is skewed by iOS developers (Score:3)
This survey is skewed by iOS developers trowelling out tons of appstore apps of questionable utility.
May Not Be a Fair Comparison (Score:3)
Therefore, the survey might include usage such as mine, which could tag every app I ever wrote as a 'C' app. FWIW
Re:fp (Score:4, Insightful)
Here's a crazy brief explanation:
The big draw to OO is that it (ostensibly) makes it easier and/or faster to write applications. This doesn't mean that you can make programs with an OO language that you couldn't with an imperative or structured language, only that certain tasks may be easier to implement.
That said, OO isn't always the best option. OO languages are typically a lot more complex and produce slower executables than plain C, so there is a trade-off that can be important in certain situations. As with anything, pick the best tool for the job.
For myself, when I first learned programming (via some books), I learned C before moving to C++. I absolutely hated C++ and didn't see the point of OO programming, due in large part to the way the book presented it. At the start, the author had you write a C program, and over the course of the book you would change it into a C++ program full of OO goodness. The final C++ program wound up having 50% more lines of code for the exact same functionality, and that was the point where I gave up on it. It was a pretty bad first impression.
So maybe you're reading from the wrong book?
Re:fp (Score:5, Interesting)
I'm guessing this was because the authors were exhibiting uselessly "object-oriented" toy programs to illustrate language features. You'd probably have had a different first impression if you'd started with Cocoa and Objective-C. While it hadn't been updated in years and consequently seems to have disappeared down the memory hole, one of Apple's old Cocoa tutorials was something to the effect of "Build a Text Editor in 15 Minutes", where they showed how you could build a TextEdit-like rich text editor with Cocoa in a couple pages of code.
In fact, it's pretty easy to figure out how to do this starting from the Xcode "document-based application" template, as there's not much more to it than replacing the label control in the document window with a Text View and implementing a couple of methods in the document class to get and set its contents.
Re:fp (Score:5, Informative)
The advantage of the object-oriented paradigm is not primarily that it makes programming easier or faster. It is the better support for separation between components, which makes it possible to contain the complexity of large projects with multiple software engineers.
Of course, there are other ways of handling large projects (for example, there are examples of large projects written in C that control complexity by conventions about the separation of data and modules). But the object oriented paradigm is a common choice for large software engineering projects.
You might miss this when learning from a textbook, since you are often only given small code examples and toy object hierarchies. But that extra 'overhead' of defining object abstractions pays off as the complexity increases. For many problems, thinking in terms of objects rather than instruction sequences can make the problem easier to solve.
Starting off with C and moving to C++ is not necessarily a good process, as you will not begin to learn to think in terms of objects; it is a completely different way of problem solving. Even for experienced programmers, the transition from C to C++ can be a six month process, not because of the extra language features, but because it requires a change in approach. Many don't stick at it long enough to realize the benefits.
The trade-off over speed is not an issue at all; C++, for example, is not significantly slower than C. Speed is affected far more by other choices: data structures and algorithms, memory locality, parallelism, and so on.
And you would also be aware that there are other paradigms as well, such as functional programming. These paradigms are not just "different tools for the job". They can have a radical impact on problem solving methods.
Re: (Score:3)
OO is *always* the best choice, depending on your definition of OO.
Believing this is a tremendously bad mistake to make. There are plenty of other high-level programming paradigms that for some tasks can be vastly superior to OO--e.g. straight functional programming in ML or Dylan, or logic programming in Prolog, or explicitly stack-based programming in Forth or Postscript. There is no "one size fits all" when it comes to programming.
Re:fp (Score:5, Insightful)
It's like the static vs. dynamic linking debate that you sometimes hear. There's no real valid answer to that one either; it's a best guess at what will lead to the best performance. With dynamic linking you don't need to load all the libraries at the start; on the other hand, with static linking you don't need to call up the linker each time a library is loaded, and so on... My main advice: stay out of it. There's no real valid answer to these sorts of things.
Re: (Score:3)
Re: (Score:3)
Say you're coding a graphical interface and you want two buttons for okay and cancel. They both need to be blue. The toolkit you're using will have an object called Button that has the basic characteristics of what a button is, e.g., a clickable icon that does something. You subclass this Button and give it the specifics.
Button okay = new Button;
Button cancel = new Button;
You now have two objects of type Button. Next you get specific.
okay.onClick(proceed());
cancel.onClick(abort());
okay.color("00f");
Re:fp (Score:4, Funny)
Say you're coding a graphical interface and you want two buttons for okay and cancel. They both need to be blue. The toolkit you're using will have an object called Button that has the basic characteristics of what a button is, e.g., a clickable icon that does something. You subclass this Button and give it the specifics.
Button okay = new Button;
Button cancel = new Button;
You now have two objects of type Button. Next you get specific.
okay.onClick(proceed());
cancel.onClick(abort());
okay.color("00f");
cancel.color("00f");
This is terrible pseudocode, but you get the idea: instead of having to code buttons from scratch, you subclass them and only add what you need. Typing on a tablet, so I hope I haven't been unclear.
OK:
typedef struct button {
long long color[3];
void (*onClick)(int);
} Button;
Button okay;
Button cancel;
okay.onClick = proceed;
cancel.onClick = abort;
okay.color[2] = 0xffffffff;
cancel.color[2] = 0xffffffff;
The C version is probably smaller and faster than your version.
Re: (Score:3)
Re:fp (Score:5, Informative)
Oh, and by the way, you left color[0] and [1] undefined on both of your Buttons.
The C standard requires the compiler initialize all stack-allocated memory to zero. color[0] and color[1] are exactly as the OP specified. To be safe, they should indeed be initialized to zero. In professional practice, I always memset everything I allocate to 0 for the entire block of memory I have allocated, and then initialize individual members of structures to whatever their default value should be.
The C standard requires static variables to be initialized to zero by default. Stack variables that aren't explicitly initialized can be random garbage.
To verify that I'm correct, I just tested it:
Re: (Score:3)
The C standard requires the compiler initialize all stack-allocated memory to zero.
Not only does the standard not require it; virtually all real-world implementations don't do it, either.
In professional practice, I always memset everything I allocate to 0 for the entire block of memory I have allocated
That's a bad habit. There's no guarantee in the standard that setting all bytes to 0 will correspond to the default value of a type, or even to a valid representation, for pretty much any type other than the character types. For locals, you'd do better to write an explicit initializer = {0}; this works with any type and implicitly initializes all remaining fields to their default (zero) values.
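A small sketch of the three cases being discussed (the config type is invented for illustration):

```c
#include <string.h>

struct config {
    int flags;
    double ratio;
    char name[16];
};

/* Static storage duration: the standard guarantees zero-initialization. */
struct config g_cfg;

/* The poster's advice: = {0} zero-initializes every member, whatever
   its type, including the ones not mentioned in the initializer. */
struct config make_config(void) {
    struct config c = {0};
    return c;
}

/* The memset habit: sets all *bytes* to zero.  On common platforms this
   matches = {0}, but the standard doesn't promise that all-bits-zero is
   the right (or even a valid) representation for every type. */
struct config make_config_memset(void) {
    struct config c;
    memset(&c, 0, sizeof c);
    return c;
}
```

A plain local declared without either (`struct config c;`) is the dangerous case: its contents are indeterminate, which is the "random garbage" the grandparent describes.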
Re:fp (Score:5, Insightful)
What the fuck? Why the fuck would I subclass a button just to make it blue? That's just data, and damned trivial data at that. If your button object doesn't already have some mechanism for dealing with that data, it sucks and I'm using a different object, not yours.
Instead of having to code buttons from scratch, you sub class them...
No no no no NO. Goddamnit NO. Fucking Java. Motherfucking Javascript. They've ruined a generation of programmers.
Subclassing is the LAST thing you should be doing. The very last. First you should be using the customization features built in to the object, and using them directly on an instance of that object. Set the blue color on the Button class and be done with it. If that's not sufficient, use object composition. Most of the time, your object is NOT a Button. It's a something that needs to have a button. Only as a last possible resort do you subclass Button, and you'd damn well better be writing an object that still is-a Button. If you're not, you've done it WRONG.
Re:fp (Score:5, Insightful)
Except that if you read his code, he's not actually subclassing Button, he's instantiating it. He's certainly saying it wrong, though.
Re: (Score:3)
YES! Now, explain to me how you got past "C, a raw machine independent assembler-like language"?
Assembler-like? C? C used to be the hot new language that held your hand and did everything for you, unlike assembler. Now C is assembler-like?
Re:fp (Score:5, Insightful)
the idea of object oriented vs. non object oriented languages has always thrown me off.
Everyone else will attempt to explain OO using OO terms to a non-OO programmer. That's like trying to teach my dog to sail a boat by speaking Japanese. I'll try a different tack. You know what a computed goto is, right? (Other than pure unadulterated evil, that is.) What if your compiler enforced the hell out of good commenting and error bound checking to let you do computed gotos safely (er, more or less)? Well, that is barely scratching the surface of OO: syntactic sugar mounded on top of syntactic sugar. You know that quote about turtles all the way down? Well, fundamentally, no matter the paradigm, it's Turing machines all the way down... more or less.
but really slashdot, what is the big draw to OO
When your professor was a little baby skript kiddie wannabe on his TRS-80 CoCo 2 running OS-9 and BASIC09 and liking it, object orientation was the silver bullet among the crowd who could not be bothered to read The Mythical Man-Month by Brooks. So now you suffer thru OO because it was "cool" back when parachute pants were also cool, and leggings. Much as we're now raising a crop of wannabe skript kiddies who look up to the functional programming and agile methods people who have also never read The Mythical Man-Month by Brooks, so your kids / my grandkids are going to have to learn functional programming as The_One_True_Paradigm_And_all_disbelievers_should_be_burned_at_the_stake. And I'll still be writing device driver code on PIC microcontrollers in raw assembly, and it'll work great and I'll be liking it.
There's a really nice wiki article you probably need to read. The world is a lot bigger than "OO" vs. "non-OO".
http://en.wikipedia.org/wiki/Comparison_of_programming_paradigms [wikipedia.org]
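The computed-goto analogy above can be made concrete: a bounds-checked table of function pointers is a "safe computed goto," and it's also roughly what virtual dispatch compiles down to. A sketch with invented names:

```c
#include <stddef.h>

/* Each handler returns the index of the next state. */
static int state_start(void) { return 1; }
static int state_run(void)   { return 2; }
static int state_done(void)  { return 2; }  /* terminal: stays put */

static int (*const handlers[])(void) = { state_start, state_run, state_done };

/* The "computed goto": jump to handlers[state], but with the bounds
   check that raw assembler would never give you for free. */
int step(int state) {
    if (state < 0 || (size_t)state >= sizeof handlers / sizeof handlers[0])
        return -1;
    return handlers[state]();
}
```

Swap the array index for a pointer stored in a struct and you have a vtable; the rest of OO is, as the post says, sugar layered on top of that jump.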
Re: (Score:3)
OK, with you so far...
But you're still trying to teach your dog to sail using sailing terms!
Now if on the other hand your dog was Japanese...
Re:fp (Score:4, Funny)
OK, with you so far...
But you're still trying to teach your dog to sail using sailing terms!
Now if on the other hand your dog was Japanese...
Ah, but is he sailing on a starboard tack or a port tack? And should he tack or jibe the boat? And should he attach the sheets to the tack or the clew of the sail?
Re: (Score:3)
Object-oriented code is a way of collecting functions and the data types they operate on. Instead of having hundreds of functions that all operate on a particular data type, you group them together into a class (a collection of functions) so that you, the programmer, can easily see the relationship between them.
That's the best-case, well-behaved upside of OO, yes. But it has an evil twin called "object-oriented analysis".
Object-oriented analysis is a way of taking your company's essential business data - the data you need to trade and survive, which has been around since before punched cards and which will be around when mind-mapped DNA moon crystals are obsolete - and then wrapping that data together with hard-coded functions hacked up in some quirky, platform-specific language invented five minutes ago and for w
Re: (Score:3)
We got the Web? That's your argument?
Object oriented programming is a nice organizational technique for larger programs and code reuse. It works well, if used properly, not least because it imposes a namespace system. Most OO systems also give you some neat automatic features like inheritance.
OO is not a silver bullet, not everything needs to be OO, and nobody, ever, should start learning programming with OO. If you make absolutely everything an object you're either an idiot or a Java programmer (no com
Re: (Score:2)
Java is only slow for programmers who don't understand what they're doing.
Or those who have to do anything complex and CPU-intensive.
Re: (Score:3)
BS!
I have written fast programs in both Java and C# that are maybe 10% slower than pedal to the metal C or C++.
Good for you: for i = 1 to 100 is pretty fast in any language.
Now try doing complex signal processing in Java.
Re: (Score:3)
I didn't know they retired the C preprocessor.
Re: (Score:3)
Re: (Score:3)