Programming

Are C and C++ Losing Ground? 961

Pickens writes "Dr. Dobb's has an interesting interview with Paul Jansen, the managing director of TIOBE Software, about the Programming Community Index, which measures the popularity of programming languages by monitoring their web presence. Since the TIOBE index has now been published for more than six years, it gives an interesting picture of trends in the usage of programming languages. Jansen says not much has affected the top ten programming languages in the last five years, with only Python entering the top 10 (replacing COBOL), but C and C++ are definitely losing ground. 'Languages without automated garbage collection are getting out of fashion,' says Jansen. 'The chance of running into all kinds of memory problems is gradually outweighing the performance penalty you have to pay for garbage collection.'"
  • Always be there (Score:5, Insightful)

    by ohxten ( 1248800 ) on Thursday April 24, 2008 @04:35PM (#23188972) Homepage
    C/C++ will always be there. Period. Just look at all of the C/C++ projects on SourceForge. New languages will come and go, but C/C++ are just too stable to go so quickly.
  • Re:Always be there (Score:4, Insightful)

    by KlomDark ( 6370 ) on Thursday April 24, 2008 @04:38PM (#23189016) Homepage Journal
    Yep, it'll be right out there with all the Cobol projects on Sourceforge...
  • Dying...not hardly (Score:5, Insightful)

    by PalmKiller ( 174161 ) on Thursday April 24, 2008 @04:39PM (#23189026) Homepage
    I know I am gonna get flamed for this, but they said web programming, like it's the only game out there. Sure it's not Web 2.0 friendly, and sure most web script kiddies don't use it... mainly because it doesn't hold their hand, but it's far from dead when you need to squeeze every last ounce of power out of your hardware, or even that other 25-30% of it.
  • by Noodles ( 39504 ) on Thursday April 24, 2008 @04:41PM (#23189048)
    I develop desktop application software. Right now I wouldn't think about using anything else but C++.
  • by SilentTristero ( 99253 ) on Thursday April 24, 2008 @04:46PM (#23189136)
    For image processing (film/video), real-time audio or any serious signal processing, the overhead of anything but C/C++ is killer. It'll be news when Adobe After Effects or Autodesk Flame is rewritten in python.

    Besides, measuring the popularity of a language by the size of its web presence is the worst kind of fallacious reasoning.
  • by krog ( 25663 ) on Thursday April 24, 2008 @04:47PM (#23189158) Homepage
    C and C++ are entrenched, but it was never their stability which caused it. Computer languages are theoretical; one valid language is just as 'stable' as another. The real issue of stability lies in the implementation, and that is language-independent.

    Anyway, C is going to stick around because it is the most superb assembly language developed by man. C++ will of course stay around as well, but by modern standards it fails as a "high-level" language. The ceiling got a lot higher in the intervening 20 years; other languages reach much higher in a very useful way. I'd be happy to see less C++.
  • Re:Always be there (Score:5, Insightful)

    by fyngyrz ( 762201 ) * on Thursday April 24, 2008 @04:47PM (#23189160) Homepage Journal

    C is perfectly capable of extremely high-quality memory management with significant ease-of-use. However, you get to create that facility, or of course you can utilize someone else's design if you can locate one that fits your API needs, budget and time frame.

    For instance, years ago I faced this issue and wrote a module that ensures there are no leaks in any part of an application I write; I also get over-run and under-run detection, named segments, double-free attempt capture, memory usage reporting, and more. I have debug and end-user levels for the code so that during development, I get enormous detail, while the end user doesn't see that unless I specifically turn it on for them.

    I have both pool and chunk level management; I have both pool and individual "free" levels; all of this in very few K indeed.

    C is the perfect language to implement memory management in, in fact, because it has perfect fine-grained control over memory.

    That goes for other things as well; C is highly capable if you need to build in certain types of OO; objects with built-in methods and variables can be crafted in seconds, with no waste at all; uniform list handling can be crafted (and is an interesting and useful programming exercise.)

    C *could* go away as a result of a generation of programmers who really don't know how to deal with such things, but I think it would be a real loss if it happened. The up side is that it'll take a while. There's a whole generation of us who know C quite well, and we're nowhere near dead yet. ;-)
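
    For illustration, a minimal sketch of the kind of tracking layer described above -- a leak report at exit plus guard words for overrun detection -- might look something like this. The names and layout here are hypothetical, not the poster's actual module:

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        #define GUARD 0xDEADBEEFu            /* sentinel written after each block */

        typedef struct header {
            struct header *next, *prev;      /* doubly linked list of live blocks */
            size_t         size;
            const char    *file;             /* where the allocation was made     */
            int            line;
        } header;

        static header *live = NULL;          /* every block not yet freed         */

        void *dbg_malloc(size_t size, const char *file, int line)
        {
            header *h = (header *)malloc(sizeof(header) + size + sizeof(unsigned));
            unsigned guard = GUARD;
            if (!h) return NULL;
            h->size = size; h->file = file; h->line = line;
            h->prev = NULL; h->next = live;
            if (live) live->prev = h;
            live = h;
            memcpy((char *)(h + 1) + size, &guard, sizeof guard);  /* plant guard */
            return h + 1;
        }

        void dbg_free(void *p)
        {
            if (!p) return;
            header *h = (header *)p - 1;
            unsigned guard;
            memcpy(&guard, (char *)p + h->size, sizeof guard);
            if (guard != GUARD)                                    /* overrun?    */
                fprintf(stderr, "overrun in block from %s:%d\n", h->file, h->line);
            if (h->prev) h->prev->next = h->next; else live = h->next;
            if (h->next) h->next->prev = h->prev;
            free(h);
        }

        void dbg_report(void)                /* call at exit: whatever is still   */
        {                                    /* on the list never got freed       */
            header *h;
            for (h = live; h; h = h->next)
                fprintf(stderr, "leak: %lu bytes from %s:%d\n",
                        (unsigned long)h->size, h->file, h->line);
        }

        #define MALLOC(n) dbg_malloc((n), __FILE__, __LINE__)

    Double-free capture, pools, named segments and the rest would layer on top of the same list, but even this much is enough to get a "you leaked 48 bytes from parser.c:212" report for free.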

  • by jythie ( 914043 ) on Thursday April 24, 2008 @04:48PM (#23189170)
    I could actually see C++ slowly going away over the next decade as it is replaced by other languages that fill the same needs but better. C on the other hand is probably going to be around for a long, long time.
  • Statistics (Score:5, Insightful)

    by Anonymous Coward on Thursday April 24, 2008 @04:48PM (#23189172)
    Measuring by internet web pages mentioning it? Can you say, "worthless statistic," kids? I write code that controls hardware. You bet it's C++. I write code that's IN the hardware. An interpreted language? Are you out of your damn mind? Do I blog about it? Don't be absurd. Am I generating "web presence" for it? Only on slashdot. Go away useless statistic.
  • by Sta7ic ( 819090 ) on Thursday April 24, 2008 @04:49PM (#23189200)
    "C++ is dying". Next!

    C/C++ won't be going anywhere fast.
      * There's too much code written in both languages.
      * There's NO CHANCE (imo) that anyone is going to write a kernel with code that plays with the memory behind the scenes. That's what the kernel's there for.
      * "If it works, don't fix it." See old FORTRAN, COBOL, Pascal, and other "dead/dying" languages that are still being used in industries because it'd cost more to replace everything than it's worth to update it ~ and in downtime & support costs more than direct dev costs.
      * "Fashionable" languages may affect the language choice, but do not affect task requirements. Construction workers don't wear hard hats because it's fashionable to do so.
  • by thermian ( 1267986 ) on Thursday April 24, 2008 @04:50PM (#23189228)
    I've been C coding for years, and I have to say, even though I like it, the number of things that I can do more easily with, say, Python, is getting larger.

    I suspect that soon all I will use C for is writing shared libraries that I can call from some other language.

    I wish people would stop banging on about C's memory problems. C has *no* memory management problem. It has no memory management at all, um, I mean, you just have to be careful when writing your code.

    C is fast, seriously fast even. For that reason alone it will always have a place. I shouldn't think there will be many coders who only use C left soon though, because the job market for pure C programmers is pretty small these days.
  • Re:so what? (Score:5, Insightful)

    by dreamchaser ( 49529 ) on Thursday April 24, 2008 @04:51PM (#23189234) Homepage Journal
    I consider a proper coder to be anyone who can write a proper flowchart and the pseudo-code/logic for their target application. It has nothing to do with the language they finally use to implement.

    That being said, I agree with you otherwise. The first thing I thought of when I read the summary was 'lazy coders' when garbage collection was cited as a driving factor. That's the sad fact; many of the kids being cranked out of schools today can't code their way out of a paper bag without a compiler/interpreter that does most of the dirty work for them.

    Yeah I know. Get off my lawn.
  • Re:not so.. (Score:5, Insightful)

    by ArcherB ( 796902 ) on Thursday April 24, 2008 @04:51PM (#23189246) Journal

    when we have internet that is as fast as cpu response times c and c++ will go the way of the dinosaur and the internet will be your main application platform and gaming platform, meaning game over for c and c++.
    As long as computers need an OS, C/C++ will be in wide use. All major OS's are written in C/C++ and will be for the foreseeable future.
  • by pclminion ( 145572 ) on Thursday April 24, 2008 @04:53PM (#23189274)

    I'm not sure why you feel you need to "track memory" in C++. I did an analysis of all the code I've written a year or so ago, and I found that there is approximately one usage of a pointer in every 5700 lines of code (the way I write it, at least).

    We have this great stuff called containers and RAII. And for when you absolutely must, must use a pointer, you have boost::scoped_ptr and boost::shared_ptr. I have not coded a memory leak or buffer overrun in C++ in over six years.

    The best way to not leak memory is to never allocate it in the first place. The best way to avoid overflowing raw buffers is to not use raw buffers. Use the containers. When you think you can't, think harder.
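
    A trivial illustration of that point -- the same string concatenation written with a raw allocation versus letting the containers own everything (hypothetical functions, not from any particular codebase):

        #include <cstring>
        #include <string>

        // Raw-buffer version: the programmer owns the size math and the delete[].
        std::string join_raw(const char *a, const char *b)
        {
            char *buf = new char[std::strlen(a) + std::strlen(b) + 1]; // remember the +1...
            std::strcpy(buf, a);
            std::strcat(buf, b);
            std::string result(buf);
            delete[] buf;                    // ...and remember this, on every path
            return result;
        }

        // Container version: nothing to size, nothing to free, nothing to overrun.
        std::string join(const std::string &a, const std::string &b)
        {
            return a + b;
        }

        // When a heap object genuinely is needed, a smart pointer ties the delete
        // to scope exit, e.g.:  boost::shared_ptr<Widget> w(new Widget);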

  • Re:Always be there (Score:5, Insightful)

    by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Thursday April 24, 2008 @04:54PM (#23189278) Journal
    Assembly will always be there. Period.

    That doesn't mean it will be particularly popular, or very likely that you can get a job in doing nothing but assembly programming.

    Really, with C especially, just about every advantage it has over more modern languages is an advantage that C itself has over assembly. Assembly is still needed, but no one in their right mind would, say, write an entire OS in assembly.

    The day is coming when no one in their right mind will write an entire OS in C or C++, or even an entire OS kernel -- depending on your definition of "kernel".
  • by Froze ( 398171 ) on Thursday April 24, 2008 @04:55PM (#23189298)
    those who can code in binary and those who can't code.

    OK, kidding aside.

    There are those who write code so that a person can do something on a computer. In which case the users are comparatively slow and the high level languages give you a distinct advantage in development.

    Then there are those who write code to make the computer do something, in which case the low level languages give you the ability to more effectively optimize how the computer interacts with itself, this is where languages like C, C++ really come into their own.

    In the early days of computing it was all about the latter; now it's much more about the former, but the latter will never go away. So the decrease is reasonable and IMHO does not represent a failing of the language, just a shift in the way computers are being used. I will be very surprised if the high-level languages ever get widespread acceptance in the areas that require computational efficiency, a la computational physics, protein folding, etc.
  • by pclminion ( 145572 ) on Thursday April 24, 2008 @04:57PM (#23189324)

    GC is available for C++, but IMHO inappropriate. One of the great advantages of C++ is that the construction/destruction mechanism, along with automatic variables, gives you absolute control of the lifetime of every single resource. Whereas a garbage collected language like Java gives you absolutely no control over when (if ever) an object is destructed. I think it is a little wacky to give up this total control of object lifetimes in return for such a puny benefit, a benefit which could easily be achieved through C++ resource management techniques like RAII.

    And anyway, garbage collection is irrelevant if you never "new" anything in the first place.
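
    A small sketch of what "absolute control of the lifetime" means in practice: with RAII the release point is exactly the closing brace, not whenever a collector or finalizer gets around to it. Illustrative class only:

        #include <cstdio>

        class File {                             // RAII: the resource lives exactly
        public:                                  // as long as the object does
            explicit File(const char *path) : fp_(std::fopen(path, "r")) {}
            ~File() { if (fp_) std::fclose(fp_); }   // runs at a known, fixed point
            std::FILE *get() const { return fp_; }
        private:
            std::FILE *fp_;
            File(const File &);                  // non-copyable (pre-C++11 idiom)
            File &operator=(const File &);
        };

        void read_config()
        {
            File f("config.txt");                // opened here
            // ... parse f.get() ...
        }                                        // closed here, exception or not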

  • by CustomDesigned ( 250089 ) <stuart@gathman.org> on Thursday April 24, 2008 @05:03PM (#23189410) Homepage Journal
    I do most of my work in Python and Java now. However, I often need to write in C/C++ to create JNI modules for Java or extension modules for Python. Wrapping low-level (e.g. to use a 3rd-party library) and performance-intensive stuff for control via a higher-level language is very productive. (C++ is handy for JNI, C is better for Python.) Furthermore, I even occasionally write small functions in assembler for C - usually to utilize a specialized instruction.
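
    For anyone who hasn't seen one, a Python extension module of that sort is only a page of C. This is a minimal Python 2-era sketch; the module and function names (fastmod, fastmod_sum) are made up, and error handling is omitted:

        #include <Python.h>

        /* Sum a Python list of ints in C. */
        static PyObject *fastmod_sum(PyObject *self, PyObject *args)
        {
            PyObject *list;
            long total = 0;
            Py_ssize_t i, n;
            if (!PyArg_ParseTuple(args, "O!", &PyList_Type, &list))
                return NULL;
            n = PyList_Size(list);
            for (i = 0; i < n; ++i)
                total += PyInt_AsLong(PyList_GetItem(list, i));
            return Py_BuildValue("l", total);
        }

        static PyMethodDef fastmod_methods[] = {
            {"sum", fastmod_sum, METH_VARARGS, "Sum a list of ints in C."},
            {NULL, NULL, 0, NULL}
        };

        PyMODINIT_FUNC initfastmod(void)      /* Python 2 entry point: init<name> */
        {
            Py_InitModule("fastmod", fastmod_methods);
        }

    Built into fastmod.so (a distutils setup script is the usual route), it's just "import fastmod; fastmod.sum(...)" on the Python side, with the loop running at C speed.
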
  • by Anonymous Coward on Thursday April 24, 2008 @05:04PM (#23189424)
    Programmers that have trouble tracking memory typically show themselves to also have trouble managing other resources properly, especially in the presence of non-obvious control structures such as exceptions. If exceptions are present, then your code must run correctly forward and backward. Thinking that "everything is OK as long as I throw a call to close() in after I retrieve the results" is hopeful thinking. So long as the flow is not impacted by an exception, you are fine. Once an exception is thrown and your frame is unwound, you may have leaked the database connection until the GC decides to come along and pick it up for you.

    Most managed languages use exceptions in some form, so the task of resource management is certainly not 'fixed' like proponents of GC would like to think. Languages need good constructs to help the programmer specify the lifetime of an object accurately. One example of this is C#'s using construct, similar to Python's with statement.

    While I do not miss manual memory management, I do appreciate being able to have direct control over memory if need be. Unfortunately many languages do not allow manual memory allocation for the rare cases it is needed.
  • Re:Always be there (Score:2, Insightful)

    by Hatta ( 162192 ) on Thursday April 24, 2008 @05:05PM (#23189444) Journal
    Already you'll find that no one writes an entire OS in C/C++. Look at the BSD init system, written in shell script for instance.
  • by SatanicPuppy ( 611928 ) * <Satanicpuppy.gmail@com> on Thursday April 24, 2008 @05:07PM (#23189460) Journal
    I'm not sure C is up to the multithreading/multiprocessor support that is going to be required as processors keep shifting from single-core to multicore architectures... I know it can be done, but C is hard enough to program for a single core... Multicore support may take it over the edge.

    Mind you, I don't think anything else is really set up for it either (Erlang?) but that's going to be the next big challenge.
  • Re:Always be there (Score:5, Insightful)

    by afabbro ( 33948 ) on Thursday April 24, 2008 @05:08PM (#23189470) Homepage

    However, you get to create that facility

    s/get to/must/

    Seriously, most people want to sit down and write the logic for their application, not invent (or even copy-paste) memory management schemes.

  • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Thursday April 24, 2008 @05:11PM (#23189508) Homepage

    Garbage collection is surely a factor in them losing ground, but I think the main reason is simple: library support.

    Java and .NET have huge well-designed frameworks behind them. You can get things done really fast. What does C have? A bunch of separate libraries all with different conventions. C++ is a little better with a more useful standard library and Boost, but it still doesn't have anywhere near the infrastructure Java and .NET have.

  • Re:so what? (Score:3, Insightful)

    by moderatorrater ( 1095745 ) on Thursday April 24, 2008 @05:16PM (#23189582)

    The first thing I thought of when I read the summary was 'lazy coders' when garbage collection was cited as a driving factor
    While I somewhat agree with you, there are two things that I think you're overlooking. First, there are going to be bad programmers no matter what you do. Someone can sound good in an interview and turn out to be awful. Until everyone realizes that and comes to the decision that the programmer in question should be fired, they're introducing code to the system. Or, even worse, they're not bad enough to fire, but bad enough that it could be a problem. These people will always be there, so you have to try and work around them.

    Second, everyone makes mistakes. I don't care who you are, if you write 1 million lines of code, there's going to be a bug in there somewhere. Given enough bugs, there's going to be one you don't catch. Garbage collection takes away a class of bug and makes it so that even the very good programmers can write more stable code.

    There's a lot to be said for programmers getting taught better and applying those principles better, but in the end, taking away a class of bugs is going to be useful in the long run. Even with garbage collection it's possible to run into memory management problems, but it's a lot harder.
  • bandwagonism (Score:5, Insightful)

    by epine ( 68316 ) on Thursday April 24, 2008 @05:16PM (#23189584)
    I wouldn't say C or C++ is losing ground. They both continue to serve well in the niches they established.

    Meanwhile, other segments of the pie are expanding, and few of these new applications are coded in C or C++. Does that mean C and C++ are losing ground?

    There is no language out there that serves as a better C than C, or a better C++ than C++. The people who carp about C++ reject the C++ agenda, which is not to achieve supreme enlightenment, but to cope with any ugly reality you throw at it, across any scale of implementation.

    For those who wish to gaze toward enlightenment, there is always Java. Enlightenment is on the other side of a thick, protective window, but my isn't the view pretty? I've yet to encounter an "enlightened" language that offers a doorway rather than a window seat. I would be first in line if the hatch ever opened.

    The problem with C/C++ has long been that the number of programming projects far exceeds the number of people who have the particular aptitudes that C/C++ demand: those of us who don't need (or wish) to be protected from ourselves (or the guy programming next to us).

    It's not economically practical to force programmers who don't have that temperament to begin with to fight a losing battle with self-imposed coding invariants. I'm glad these people have other language choices where they can thrive within the scope of their particular gifts. I don't feel my role is diminished by their successes.

    For those of us who have gone to the trouble to cultivate hardcore program correctness skills, none of the supposed problems in the design of C or C++ are progress limiting factors, not within the zone of applications that demand a hardcore attitude toward program correctness.

    It's the natural order of things that hardcore niches are soon vacated by those unsuited to thrive there, leaving behind a critical core of people who specialize in deep-down nastiness.

    For example, it's not just anyone who maintains a root DNS server. I can say with some assurance that the person who does so did not earn his (or her) grey hairs by worrying about whether the implementation language supported automatic GC.

    Let's take a metaphor from the security sector. Ten years ago, a perimeter firewall was considered a good security measure. This measure alone eliminated 99% of TCP/IP based remote exploits.

    These days, most exploits are tunneled through http, or maybe I'm behind the times, and the true threat is now regarded to be some protocol tunneled within http.

    Then some genius comes along and says "in the security sector, TCP/IP defenses are losing ground". Quoi? Actually, no one is out there dismantling their TCP/IP based perimeter firewall. It's continuing to do the same essential job as ever.

    It's only the bandwagon that has picked up and moved camp. Yes, garbage collection and deep packet inspection are now all the rage. So it goes.

    Why not go around saying that sexual reproduction is all the rage these days? Would that imply we could eliminate all the organisms that reproduce asexually, and the earth's ecology would continue to function? Hardly.

    These new languages are soaking up much of the new code burden because these languages are freed from having to cope with the nastiness at the extremes (small and large) that C/C++ have already taken care of.

    I would almost say that defines a success criterion for a programming language: if it removes enough nastiness from the application space that the next language that comes along is free to function on a higher plane of convenience. C/C++ have both earned their stripes. Which of these new languages will achieve as much?
  • Re:Statistics (Score:2, Insightful)

    by Anonymous Coward on Thursday April 24, 2008 @05:17PM (#23189590)

    I absolutely agree. We have a disease in our industry, and it's that fast and cheap with under-experienced, under-trained people is the way to go. There's a reason why OS's are not coded in languages like Java - any programming language that needs to pause to "clean up after itself" needs some serious damn help - please, coders, audit your code properly instead of writing yet another piece of code because you refuse to properly #def, #undef and manage how you use memory.

    I'm sick and tired of the "Javoids" hanging up my web browsing, and fouling up real-time delivery on my trading floors. We are burning up serious quantities of electricity, creating serious amounts of heat, and feeding a bloated marketplace with this kind of inefficiency. We don't need faster processors, we need better code. You can make money with excellence - you just have to try instead of taking the easy road every time. Cray made a fortune being brilliant and keeping true to his engineering principles. Microsoft made billions by placing a pillow over the head of any and every innovation that anyone with two brain cells in Redmond thought would threaten their bloatware.

    Somewhere along the line the craftspeople left the building because IT managers were replaced with "Executrons" - these mindless folks who actually get a bonus for how much they "save" by cutting costs in IT - that's letting the fox into the hen-house. Thanks to this idiocracy, we now teach people not programming, but how to use Microsoft & Sun products in college for CS101, and to find any worthwhile thinking in a candidate I need to look for people from a foreign country.

    Thanks, Sun. Thanks, Microsoft. How about some real damn innovation, please, instead of these statistical analyses that make the status quo seem palatable.
  • by jameson ( 54982 ) on Thursday April 24, 2008 @05:18PM (#23189610) Homepage
    Hi,

    Yes, some things need to be done in assembly or C in order to `stay competitive' or even just to remain within the realm of the possible. How much that is depends on your application and your platform.

    So, systems programmers, you need not worry, your skills are always going to be needed for something.

    But let's be honest here, 80% of the applications you can code entirely in Haskell or Prolog or Python or whatever fancy high-level language you may personally have come to love. And of the remaining 20%, you can usually still code 80% of the application in your favourite language and optimise the core 20% in C. (After profiling. Let me repeat that, AFTER profiling.)

    Will it run faster and in less memory if you do it all in C? If you do it properly, sure. But that's not the question to ask. If you work commercially, ask for `what will be most profitable in the long run, while remaining ethical'. If you work free software projects, ask for `what will benefit people the most'.

    Don't confuse the above questions with `what will satisfy my C(++) hacker ego the most'. And remember that it's not just about getting the code working and making it fast, it's about making the code robust; and in many cases it's also about making the code readable for whoever will maintain it after you.

    Apologies for this rant; feel free to mod it down if you so desire, but you, dear fellow programmers, have had it coming for quite a while, as did I.
  • by Sloppy ( 14984 ) on Thursday April 24, 2008 @05:20PM (#23189660) Homepage Journal

    Mind you, I don't think anything else is really set up for it either (Erlang?) but that's going to be the next big challenge.

    Whatever it is, its compiler and low-level libraries will be written in C.

  • So what? (Score:5, Insightful)

    by menace3society ( 768451 ) on Thursday April 24, 2008 @05:21PM (#23189680)
    FORTRAN, Lisp, and Cobol have all lost ground. BASIC and Pascal used to be the big dogs instead of also-rans, and if Ada ever had any ground in the first place, it lost that.

    Even Perl isn't as popular as it used to be, now that other languages have started to fill its niche.

    Times change, and it should be unsurprising that the dominant programming languages change along with them. Some day, Java, PHP, Visual Basic, Python, and Ruby will all be obsolescent as well. Thirty years ago, computers were vastly different from what they are now. In another thirty years, there will have been another quantum leap (pun intended) in computing. Why should the languages we program them with remain the same?
  • Re:not so.. (Score:4, Insightful)

    by gangien ( 151940 ) on Thursday April 24, 2008 @05:25PM (#23189730) Homepage
    I wouldn't be so [microsoft.com] sure [jnode.org].
  • Yes it is (Score:4, Insightful)

    by Weaselmancer ( 533834 ) on Thursday April 24, 2008 @05:26PM (#23189754)

    It's just slightly higher level. A C compiler outputs assembly code - that's the whole point of a C compiler. Think of C as the world's greatest macro processor for assembly.

    That's why most compilers have some sort of ASM pragma - so you can inject your assembly into the code if you feel the compiler is doing a poor job of it.

    That's also why you'll never find a faster language. And that's why it'll never go away.
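
    For example, GCC's extended asm syntax lets you drop a single instruction into otherwise ordinary C; the snippet below is x86-specific and purely illustrative (MSVC spells the same idea with __asm blocks or an intrinsic):

        #include <stdint.h>

        /* Read the x86 time-stamp counter without leaving C. */
        static inline uint64_t read_tsc(void)
        {
            uint32_t lo, hi;
            __asm__ __volatile__ ("rdtsc" : "=a"(lo), "=d"(hi));
            return ((uint64_t)hi << 32) | lo;
        }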

  • by Anonymous Coward on Thursday April 24, 2008 @05:30PM (#23189806)
    >> ways to code (like pimpl)

    pimpl is not a way to code. It's a way to abuse a language feature in order to avoid a language issue.
  • by Weaselmancer ( 533834 ) on Thursday April 24, 2008 @05:35PM (#23189898)

    I haven't written a line of code in C or C++ since I started with C#

    That says nothing about those languages. All that says anything about is your job.

    I write drivers, so I could make the opposite statement. Doesn't say anything about the relative merits of one language versus another though. All it says is that I'm in an environment where C makes more sense.

    In summary: A hammer is best when your problem is a nail, and a screwdriver is best when your problem is a screw.

  • by tepples ( 727027 ) <tepples.gmail@com> on Thursday April 24, 2008 @05:40PM (#23190000) Homepage Journal

    when we have internet that is as fast as cpu response times
    Unlikely. Even hard drives are faster than routing a ping from London to Tokyo and back.
  • by lgw ( 121541 ) on Thursday April 24, 2008 @05:41PM (#23190022) Journal
    As someone who programmed in assembly for 5 years professionally, let me say: C is a crappy assembly language. It has a crappy macro language, and I'm often left guessing what the compiler will do with my C code, especially on an unfamiliar platform.

    Is an int 32 or 64 bits? I had better compile a test program and fire up a debugger to find out. OK, since there's no C standard type for "32 bit int", what works on this compiler? Maybe INT32 is defined somewhere?

    And don't get me started on implicit conversion.
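
    For what it's worth, C99 did pin this down: <stdint.h> gives exact-width types, so the guessing game is only necessary on compilers that predate it or ignore it. A tiny illustration:

        #include <stdint.h>
        #include <stdio.h>

        int main(void)
        {
            int32_t  a = 0;                  /* exactly 32 bits, by definition */
            uint64_t b = 0;                  /* exactly 64 bits                */
            printf("int32_t: %u bits, uint64_t: %u bits, plain int: %u bits\n",
                   (unsigned)(sizeof(a) * 8), (unsigned)(sizeof(b) * 8),
                   (unsigned)(sizeof(int) * 8));  /* the part that still varies */
            return 0;
        }
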
  • Re:Always be there (Score:5, Insightful)

    by fyngyrz ( 762201 ) * on Thursday April 24, 2008 @05:44PM (#23190084) Homepage Journal

    Seriously, most people want to sit down and write the logic for their application, not invent (or even copy-paste) memory management schemes.

    Yes, I understand that perfectly. I'm a huge fan of Python for that very reason.

    However, in C, writing memory management only needs to be done once; while writing the "logic for the[ir] application" is done many times. Consequently, the apparent load of writing memory management is much lighter than one might initially recognize. Or to put it another way, once it's done, it's done and represents no load at all.

    Further, there are huge advantages to having 100% control over the memory management of your application; speed advantages, fewer wasted/tied-up resources, and all the downhill consequences of those things -- if you don't waste resources, they're available for the user, or for other aspects of your programs. Likewise, if you get things done faster, more CPU is available elsewhere.

    Another thing: Depending on an external agency to manage your resources is a two-edged sword. If there are bugs in *your* code, you can fix them as fast as you are competent to do so. Considering you wrote it in the first place, the presumption that you are competent to fix it is usually on target.

    If there are bugs in an external agency, you typically get to report them... and wait, bugs happily chewing on the users of your applications, until said external agency gets around to fixing whatever it was. If indeed they ever do.

    Same thing goes for list management, etc. Write it once, learn all about it (which is interesting AND increases your Leet Skillz) and now you have a generally useful tool that is as fast as you can make it, totally amenable to fixes and updates, and invulnerable to the ass-draggery of outside influences. I have used my list management module in AI apps, ray tracers, image processing, file management, and even in dialogs to control layer types in various (what I think are) clever ways. I have huge confidence in it, but, should it turn out to be broken... I could fix it in minutes. At which point every app I've written gains ground, all my customers win, etc.

    There's something else that has always remained in the back of my mind. As languages get more sophisticated, there is a trend for them to generate much larger and much slower resulting applications. It isn't uniform, and it depends on what you're doing, compilers as compared to interpreters, etc., but the trend itself is pretty clear. For instance, a Python app seems small, until you realize that the Python interpreter is loaded for your one-liner. C++ apps tend to be huge compared to C apps. And so on.

    This trend - basically - tracks the increasing availability of memory and CPU power. Seems reasonable enough. But the funny thing is, if you take an app that was designed to run at adequate speed on hardware from, say, 1992, and keep the technology behind the app the same as you update it - that is, keep writing efficient C and so on - then the increase in memory and CPU resources serves to turn the app into some kind of blistering miracle implementation instead of the run-of-the-mill performance you get from depending on the latest and greatest HLL with garbage collection, the implicit inclusion of module after module of object-oriented processing and modeling, data hiding, etc., etc.

    Directly related to this is the fact that if you attempt a modern task - such as an image manipulation system - in a modern language, you, as the programmer, can be significantly enabled by the language; that is, you can be done sooner, and you can have a lot of things done, too, many coming along for the ride, for "free." Garbage collection / memory management being one such thing. But if you approach the task using C, which is basically capable of creating as fast an application as you are capable of writing, it is so close to assembly, while we can certainly agree up front it'll take you longer, the end result could be a lot faster and a lot more capable of efficiently managing the user's resources than that which you might create using a modern HLL.

  • by Yokaze ( 70883 ) on Thursday April 24, 2008 @05:48PM (#23190156)
    That is hardly a conceptual problem of the language C++, but more one of the toolchain and/or ABI, and can be improved on by rewriting the old GNU linker in C++ :) [sourceware.org]. And maybe someday the GNU binutils will gain incremental linking.

    More critical is that the grammar of C++ is undecidable. [yosefk.com]
  • by kitgerrits ( 1034262 ) * on Thursday April 24, 2008 @05:49PM (#23190166)
    Shows what I know about PowerShell...
    Being a UN*X admin and a reasonably-competent scripter, I tried looking into it and my brain had trouble grasping how this is supposed to be a shell.

    From what I can see, Bourne and other UN*X shells are stream-oriented and PS seems object-oriented.
    I see LDAP as a flat text-based database with Organizational Units, not as a magical forest with trees, domains and groups.

    This is, most likely, because UN*X admins are used to modifying and/or generating configuration files, whilst Windows admins have spent the last 13 years ticking tickboxes and, lately, dragging objects.

    (does knowing win.ini and system.ini make me l33t or simply a dinosaur? I fled the Windows scene when 2003 came out...)
  • Re:C/C++ is dying! (Score:3, Insightful)

    by intangible ( 252848 ) on Thursday April 24, 2008 @05:50PM (#23190210) Homepage
    A musket isn't as useful or respectable when everyone else has M1A2s though.
  • Re:Always be there (Score:2, Insightful)

    by osu-neko ( 2604 ) on Thursday April 24, 2008 @05:53PM (#23190254)

    There are libraries out there available to just install and link to. But it certainly would be nice if some of this stuff got into the Standard C libs, so that all you needed was something like:

    #include <stdgcmem.h>

    ... and off you go on your merry way.

    The argument against would be that not everyone has the same needs from such a library, but it's a spurious one. Not everyone has the same needs from an I/O library, which is why there are a million alternatives to <stdio.h>, that doesn't mean you can't provide at least one standard library, and let those with other needs link to something else instead.
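
    The closest existing thing is probably the Boehm-Demers-Weiser collector, which bolts conservative GC onto plain C via a library header rather than the standard one wished for above. Illustrative usage (linked with -lgc):

        #include <gc.h>
        #include <stdio.h>

        int main(void)
        {
            GC_INIT();                               /* start the collector      */
            for (int i = 0; i < 1000000; ++i) {
                char *p = (char *)GC_MALLOC(1024);   /* allocate, never free...  */
                p[0] = 'x';
            }
            printf("heap grew to %lu bytes, not ~1 GB\n",
                   (unsigned long)GC_get_heap_size()); /* ...the GC reclaimed it */
            return 0;
        }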

  • Re:C/C++ is dying! (Score:5, Insightful)

    by Anonymous Brave Guy ( 457657 ) on Thursday April 24, 2008 @05:55PM (#23190276)

    Popular as in people using it, or popular as in lots of people writing about it?

    Personally, I'm not convinced AJAX is that popular on the people-using-it count. It's a very useful technique for a particular niche, but it is only a niche.

  • Re:C/C++ is dying! (Score:5, Insightful)

    by neokushan ( 932374 ) on Thursday April 24, 2008 @05:56PM (#23190292)
    Surely it's not really a fair indication just because its web presence is dropping? I could easily argue that Java is only so "popular" because more people are posting with problems they're having using the language, and that C/C++ are only losing ground because better information on using them is already out there.
  • Re:Always be there (Score:4, Insightful)

    by Chris Burke ( 6130 ) on Thursday April 24, 2008 @05:56PM (#23190314) Homepage
    That makes C++ a lot better for application writing, but not necessarily for OS writing. The kinds of resources being managed in a kernel usually aren't the kind that are easily managed through "scope".

    One criticism of C++ is that by automatically handling the destruction of objects when they go out of scope, it can lead to a feeling of false security to programmers who assume that because their objects are destroyed, that all resources are properly freed. The possibility for leaks is quite significantly there, in the design of constructors and destructors and anything that uses a pointer. Though while not always easy, having to make no mistakes in any of your destructors for any class is a heck of a lot easier than never making a mistake on any individual object's deallocation as in C.

    By the same token, it's quite possible to have "leaks" in Java or C#, simply by having extraneous references to no-longer needed objects laying around in objects that are themselves still referenced.

    I'd still take C++ any day over C for a big application.
  • Re:Always be there (Score:3, Insightful)

    by fyngyrz ( 762201 ) * on Thursday April 24, 2008 @06:00PM (#23190382) Homepage Journal

    some one did that for us, so we can all use the same type of thing.

    No. They didn't. They implemented something else entirely, something that works outside the paradigm of what you're doing, attempting to track memory use by indirect means (such as counting references.) And this is precisely the problem with such memory management; by taking the programmer out of the management loop and abstracting it into irrelevance, the programmer no longer has either the means or the incentive to keep tight control of resources. The mindset leads to thinking nothing of bringing the entire Python interpreter into memory in order to evaluate a line or two of trivial logic, because it's trivially easy to do. Consequently, huge system loads are incurred for relatively small tasks. Write that same logic in C, and you have a dedicated executable that (a) is tiny by comparison, (b) runs orders of magnitude faster, (c) you actually understand on every level. Presuming you don't drag in some huge library you didn't really need or do a really bad job. :-)

    Don't get me wrong - I am *not* a Python hater, in fact, it is one of my favorite languages. But I don't use it for everything just because it is easy. I always think about resources used, whose resources they are, whether I have the implicit right to consume them if I don't have to, and if I do have such a right, do I *want* to consume them? After all, they may be mine, and if I'm trying to do something else, some interpreter clumping around in the background consuming large chunks of resources may not be in my best interests. If it is true for me, it's probably true for my customers, so in the end, I have to do the same evaluation for them as well.

  • by karmaflux ( 148909 ) on Thursday April 24, 2008 @06:01PM (#23190404)
    1. Java.....20.5% - runtime written in C
    2. C........14.7% - duh
    3. VB.......11.6% - who cares
    4. PHP......10.3% - written in C++
    5. C++.......9.9% - duh
    6. Perl......5.9% - written in C
    7. Python....4.5% - written in C
    8. C#........3.8% - who cares
    9. Ruby......2.9% - written in C
    10. Delphi...2.7% - no idea

    ...somehow I doubt C or C++ are going anywhere. The good news is if they DO die at least they'll take PHP with them.
  • by porneL ( 674499 ) on Thursday April 24, 2008 @06:02PM (#23190430) Homepage
    D [digitalmars.com] also offers syntax and ease of writing comparable to C#/Java, but is faster [debian.org], doesn't require VM and compiles to native code linkable with C.
  • Re:Always be there (Score:5, Insightful)

    by jsebrech ( 525647 ) on Thursday April 24, 2008 @06:04PM (#23190460)
    But if you approach the task using C, which is basically capable of creating as fast an application as you are capable of writing, it is so close to assembly, while we can certainly agree up front it'll take you longer, the end result could be a lot faster and a lot more capable of efficiently managing the user's resources than that which you might create using a modern HLL.

    Agreed 100 percent. If you write it in C, you can make it run faster with lower resources, but you will spend a lot more time creating, debugging and maintaining it.

    Most software simply doesn't need to be that fast. The performance sensitive pieces of code are in database queries (C code), or disk operations (C code), or math operations (C code). Modern garbage collectors also are proven, they're fast, they're reliable. It doesn't make sense for the majority of classes of software, from a cost vs. gain perspective, to use C for the job.

  • by Tomy ( 34647 ) on Thursday April 24, 2008 @06:06PM (#23190480)

    Until a language comes along that can outperform C or C++, there will always be a use for them.

    It's still right-tool-for-the-job. I don't use Ruby to write audio DSP plugins, and I don't use C/C++ to code a web application.

    I'll keep both in my tool box, along with lisp, ... and this thermos.
  • by goombah99 ( 560566 ) on Thursday April 24, 2008 @06:07PM (#23190496)
    In the work I do--scientific calculations with a lot of fast numerics--Python + Fortran seems like nirvana, as each overcomes the shortcomings of the other. One could just as well use C, except the efficient numeric libs and memory layout give Fortran an edge.

    This of course is not the match made in heaven for everyone but numerical scientists should look hard.

    What's so good?
    Utility:
    Well, there's a strong base of numeric libraries in Python that are Fortran-array friendly, so there's a good base to grow off of.

    Second, F2PY, which creates Python modules out of Fortran subs, works so well it's almost transparent and painless. Even cooler is that because Fortran compiles are ludicrously fast compared to C++, you can generate Fortran code from Python at run time and compile it on the fly, creating modules optimized to your problem.

    Given that you are wrapping in Python, the availability of groovy C++ libs is not really very enticing, considering the pain you would pay for having to write the whole program in C++.

    Practical:
    Fortran as a stand-alone language kinda blows for versatility and modern program architectures. But if all you are doing is writing a function, then it's a sweet language, because its syntax is so tight that it's harder to make a syntax error that compiles, and hard-to-spot logic errors seem to be less likely than in C (e.g. using i++ instead of ++i, or doing i=4 instead of i==4, are not possible in Fortran's limited syntax).

    Thus you write functions and let Python deal with all the memory management, human interface, file management, command-line arg parsing and all the messy bits that end up being a lot bigger than the function where the program spends all its time.

    Fortran is also very optimization friendly since things like matrix multiplies and out-of-order loops are part of the core language.

    This is debatable but I find that fortran seems to have a more logical memory order in 2-d arrays. Namely if you take a sub-array you get elements that are consecutive in memory and thus for most microprocessors will all get pulled into the cache on the same page. Slices of C-arrays have consecutive elements spaced by the row width apart in memory. One can of course find cases where one is preferred over another.
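
    The C side of that comparison, for reference (an illustrative snippet, not the poster's code): a C 2-D array is row-major, so keeping the last index in the inner loop walks memory contiguously, while swapping the loops strides by the row width -- the cache-unfriendly case described above.

        #include <stdio.h>
        #define N 1024
        static double a[N][N];

        int main(void)
        {
            double total = 0.0;
            /* contiguous: a[i][0], a[i][1], ... are adjacent in memory */
            for (int i = 0; i < N; ++i)
                for (int j = 0; j < N; ++j)
                    total += a[i][j];
            /* strided: consecutive a[i][j] accesses are N doubles apart */
            for (int j = 0; j < N; ++j)
                for (int i = 0; i < N; ++i)
                    total += a[i][j];
            printf("%f\n", total);
            return 0;
        }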

    I do, however, wish Python had some way of optionally typing variables that was less cumbersome and slow than decorators or explicit run-time type checking. I virtually never write Python that takes advantage of introspection or dynamic typing, so not being able to get some protection--optionally, and just to debug--via type checking is annoying.

    But if I were starting from scratch and did not have a compelling need for all those wonderful Fortran numeric libs, I think the optimal choice in the future is going to be

    Java + Groovy.

    Basically you learn one syntax and get the best of both interpreted and compiled languages. Develop it in Groovy, then migrate the slow bits to Java. Import all the great Java libs.

    And since it's nearly the same syntax, it's easy to read.

  • by Anonymous Coward on Thursday April 24, 2008 @06:08PM (#23190522)
    can I borrow it to clean my cat's litter box?

    To spoil the joke, but to explain it to you:

    1) The speed of the internet has fuckall to do with the programming language used to code the processing parts of it.
    2) A hard drive connected to a terabit internet won't do jack shit; -something- at your end needs to receive and interpret the stuff coming down the tubes. Or do you think magic fairies will do that?
    3) Even a hard drive has software inside it, which will still need to be programmed, which means some language will be in use.

    You're getting flamed because your statement makes about as much sense as: "Because TV will be digital next year, everybody will wear lederhosen with rabid gerbils stuffed down the front."
  • by Kevin Stevens ( 227724 ) <kevstev&gmail,com> on Thursday April 24, 2008 @06:13PM (#23190604)
    If you use the facilities provided by the STL and BOOST (most notably shared_ptr), C++ is not a whole lot different than Java these days. Java went a little too far in my opinion on being nice to the programmers while giving up performance. Modern C++ hits the sweet spot in my opinion.

    If only the standards committee could get off its arse and progress as quickly as BOOST does....
  • Re:Statistics (Score:3, Insightful)

    by jim3e8 ( 458859 ) on Thursday April 24, 2008 @06:14PM (#23190618) Homepage

    I write code that controls hardware. You bet it's C++. I write code that's IN the hardware. An interpreted language? Are you out of your damn mind?


    Forth is interpreted.
  • by Nursie ( 632944 ) on Thursday April 24, 2008 @06:39PM (#23191012)
    Jesus christ there's another one....

    C has been doing multi process for decades, and multi thread for a decade or more.

    It's used in commercial apps all over the world.

    How many times - threads and parallelism have been with us for years. Just because games haven't been threaded doesn't mean the rest of the world hasn't been doing it, and doing it well for a long time

    Look up pthreads sometime.

    Seriously, threaded processing in C is damn simple.
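
    For anyone who hasn't looked it up yet, a minimal pthreads example really is this short (illustrative; error handling omitted, compile with -pthread):

        #include <pthread.h>
        #include <stdio.h>

        static void *worker(void *arg)
        {
            long id = (long)arg;             /* small-integer-as-pointer idiom */
            printf("thread %ld doing its slice of the work\n", id);
            return NULL;
        }

        int main(void)
        {
            pthread_t t[4];
            long i;
            for (i = 0; i < 4; ++i)
                pthread_create(&t[i], NULL, worker, (void *)i);
            for (i = 0; i < 4; ++i)
                pthread_join(t[i], NULL);    /* wait for every worker          */
            return 0;
        }

    Add a mutex around any shared state and that is most of the API a lot of real multi-threaded C code ever touches.
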
  • by goombah99 ( 560566 ) on Thursday April 24, 2008 @06:58PM (#23191268)
    Replying to myself because I forgot to add why I think Java+groovy has a big future.

    The big Achilles heel of Python is that it currently truly sucks for multi-core programming, and it would appear that attempts to solve this are not coming quickly. Its global interpreter lock means that multi-threading gains almost no speed over a single processor. It's darn clumsy to fork, in part because it takes so long for Python to unwind its stack when a job exits. And it was never written from the ground up to be thread-safe.

    Fortran 95 and 2003 have huge potential for multi-core, since vector ops and out-of-order loops are part of the core language, the memory order of arrays can be favorable to vector ops, and the developers have been thinking about high-performance computing as a driver.

    However, neither Fortran 95 nor Python has notions of synchronizing and locking, so all the parallelism is implicit, not explicit. You'd rather have implicit parallelism, to be sure, but sometimes you need explicit control.

    Java was written with threading in mind from the beginning. So it can potentially embrace the coming multi-core revolution more quickly than other languages.
  • Re:Always be there (Score:3, Insightful)

    by Namlak ( 850746 ) on Thursday April 24, 2008 @07:01PM (#23191310)
    It seems logical that the best approach is to write the speed critical portions in C and the (G)UI in a HLL - each is best suited for the task.

    As a VB.NET programmer building business automation apps for a living, I can't imagine building a (G)UI in a LLL. Not that I wouldn't appreciate the exercise, but the demands of the business environment won't allow it. Not just for the initial build but for the inevitable stream of change requests that will follow. Drag/Drop/Done is the name of the game.

    But as a hobbyist microcontroller programmer, well, there's no such thing as bloat in that space - you can't do it!

    If I was writing some image manipulation software, all the actual processing would most certainly be in C if not straight assembly for the very most critical parts. But the Load/Save/View/whatever parts, I'll do in VB!
  • by Rorschach1 ( 174480 ) on Thursday April 24, 2008 @07:08PM (#23191442) Homepage
    I'm curious what the list looks like for embedded programming - particularly at the low end. For my money, C is hard to beat on small systems - it's a good compromise between efficiency and manageability. If you know how the compiler works, it's not hard to write code that's nearly as efficient as hand-optimized assembly.

    I'm sure Java ranks high there, too, but I don't consider it to be in the same class. Native Java hardware is relatively expensive, and running a VM takes a significant amount of memory and processing power.

    My latest project is pure C (aside from about 100 lines of assembly for a firmware upgrade bootloader), around 30 pages of source code at present, and it compiles to about 9k of object code. It's targeted for a $2.50 processor, and I'm able to do things like simultaneous Bell 202 and DTMF decoding in software because I know exactly how C arithmetic is implemented on the processor and can take advantage of that without having to actually do the implementation in assembler. Doing the same thing in Java would cost a lot more. And when $5 saved on the bill of materials means an extra $5-10k in my bank account at the end of the year, that's a big deal.
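
    As one concrete illustration of that style of code (a common technique, not necessarily what the poster ships): DTMF detection on an integer-only micro is usually a fixed-point Goertzel filter, a handful of multiplies and shifts per sample. The coefficient below is for the 697 Hz row tone at an 8 kHz sample rate; 64-bit intermediates are used here for clarity, where a real small part would scale the input to stay within 16/32-bit arithmetic.

        #include <stdint.h>

        /* round(2*cos(2*pi*697/8000) * 16384): the 697 Hz Goertzel coefficient
         * in 2.14 fixed point. */
        #define COEFF 27980

        int64_t tone_energy_697(const int16_t *samples, int n)
        {
            int64_t s1 = 0, s2 = 0;
            int i;
            for (i = 0; i < n; ++i) {
                int64_t s = samples[i] + ((COEFF * s1) >> 14) - s2;
                s2 = s1;
                s1 = s;
            }
            /* squared magnitude of the 697 Hz bin; compare against a threshold */
            return s1 * s1 + s2 * s2 - ((COEFF * s1) >> 14) * s2;
        }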

    So what other languages can compete in that space?
  • Re:C/C++ is dying! (Score:2, Insightful)

    by Anonymous Coward on Thursday April 24, 2008 @07:16PM (#23191542)
    I know your troll is a mockery of the groupthink, and I don't want to spoil your momentum, but I think this bears saying. Visual Basic, before .NET style, was as Turing-complete as any other language I have had to use. Meaning, I CAN solve any solvable computational problem. Classes might NOT be like C++ or Java or whatever... but it isn't an inferior way of doing it. It is the COM way, which just about no one is comfortable with. Sure, error handling sucks, but seriously, do you REALLY think that try/catch is any prettier? I can call any DLL function with only half of my brain.
  • Re:C/C++ is dying! (Score:4, Insightful)

    by utopianfiat ( 774016 ) on Thursday April 24, 2008 @07:19PM (#23191584) Journal
    I was convinced that in the scientific programming world, people were still into Fortran as it grants a slight increase in speed over C for certain algorithms. Of course, this wouldn't have a broad _web presence_ due to the fact that realtime and mission critical applications aren't posted on the web. I think the limit of the fortran that still exists publicly would be the open source ATLASes and MATLAB clones (ie:matplotlib), as well as, of course, Linux itself.
  • by DrYak ( 748999 ) on Thursday April 24, 2008 @07:28PM (#23191720) Homepage

    I wouldn't say it's a niche.
    It's a niche in the sense that AJAX is exclusively used for web interfaces.

    So it could also be considered a niche on the scale of all the situations at which you can throw the other languages. C can be found on anything electronic that can run code, from small embedded micro-controllers up to packages running on huge mainframes or clusters.

    It's not bad, it's very useful indeed. It just hasn't seen as many different usages as the other languages yet.
  • Re:Always be there (Score:1, Insightful)

    by Anonymous Coward on Thursday April 24, 2008 @07:41PM (#23191872)
    Blah blah blah, C is great for writing your own programming language runtime in, blah blah blah.

    You seem to be suffering from Not-Invented-Here syndrome. Sure, all of the theoretical advantages you cite are real, but look at all the time you end up spending developing this framework. You say it's a small, one-time cost... but what if your application requirements change? What if you need to keep up with changing technology (what if you suddenly want to adapt your code to be cache-aware? distributed network shared memory? multiprocessor safe? something that hasn't even been thought of yet?) What if there are subtle memory model bugs you haven't anticipated because you're not an expert in machine architecture X?

    The big advantage of shoving all this work off to a third party, even if it's a suboptimal solution, is that they get to worry about all that stuff. Furthermore, if it's fundamental to the language, you can be assured all the libraries you might want to use will also take advantage of it. That's unlikely in the case of a framework you roll yourself.

    It's the open source model of sharing, instead of re-inventing the wheel. While I'm sure it's a very beautiful thing to build all this infrastructure, 99.9% of the time it's also completely unnecessary for you, personally, to do it. This is where languages like C/C++ fall down.

    (To be honest: I code lots of infrastructure-type stuff in C, too. Systems programming, we used to call it. But I do it as a hobby, not to get stuff done.)
  • Re:Always be there (Score:5, Insightful)

    by lena_10326 ( 1100441 ) on Thursday April 24, 2008 @07:56PM (#23192050) Homepage

    However, in C, writing memory management only needs to be done once; while writing the "logic for the[ir] application" is done many times. Consequently, the apparent load of writing memory management is much lighter than one might initially recognize. Or to put it another way, once it's done, it's done and represents no load at all.
    I don't believe that is true at all. One huge reason for building a memory management scheme is to tailor it to a specific algorithm, which is bound to a particular application. Optimization for allocating small chunks (bytes to kilobytes) can be very different compared to allocating extremely large chunks (megabytes to gigabytes), or variable sized versus fixed size, or read/writes with sequential access versus random access, or low access frequency versus high access frequency, or multi-threaded versus serial. These are all intricately bound to the overall application algorithm and can yield extremely different solutions given a particular problem. It's simply not possible to write a general allocation scheme that is fully optimized for every type of problem. I've experienced this in real world projects.
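
    As a concrete example of that tailoring, a fixed-size pool allocator -- the classic answer when one algorithm churns through millions of identical small nodes -- looks nothing like a general-purpose malloc. Illustrative sketch only:

        #include <stdlib.h>

        /* Pool of fixed-size blocks: O(1) alloc/free, no per-block header,
         * no fragmentation -- but useless for variable-sized requests. */
        typedef struct pool {
            void *free_list;   /* singly linked list threaded through free blocks */
            char *storage;
        } pool;

        int pool_init(pool *p, size_t block_size, size_t count)
        {
            if (block_size < sizeof(void *)) block_size = sizeof(void *);
            p->storage = (char *)malloc(block_size * count);
            if (!p->storage) return -1;
            p->free_list = NULL;
            for (size_t i = 0; i < count; ++i) {        /* thread the free list */
                void **slot = (void **)(p->storage + i * block_size);
                *slot = p->free_list;
                p->free_list = slot;
            }
            return 0;
        }

        void *pool_alloc(pool *p)                       /* pop the head: O(1) */
        {
            void *block = p->free_list;
            if (block) p->free_list = *(void **)block;
            return block;
        }

        void pool_free(pool *p, void *block)            /* push back: O(1)    */
        {
            *(void **)block = p->free_list;
            p->free_list = block;
        }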

    Another thing: Depending on an external agency to manage your resources is a two-edged sword. If there are bugs in *your* code, you can fix them as fast as you are competent to do so. Considering you wrote it in the first place, the presumption that you are competent to fix it is usually on target.
    It's rare that the original developer stays on the project for its lifetime of usage. In fact, I've never seen that happen. People quit, get fired, get promoted, or move onto new projects. When the sole hot-shot in the organization writes a complex codebase, it places a future burden on the lesser experienced team that may inherit it. Maintenance is always more expensive than original development.

    If there are bugs in *your* code, you can fix them as fast as you are competent to do so. Considering you wrote it in the first place, the presumption that you are competent to fix it is usually on target... [CUT]... I have huge confidence in it, but, should it turn out to be broken... I could fix it in minutes
    I don't believe that for a second. I've seen sneaky bugs in C code plague development teams for days and on a few occasions a week. You're either vastly underestimating or are totally unaware of very well hidden bugs in your code.

    But the funny thing is, if you take an app that was designed to run at adequate speed on hardware from, say, 1992, keep the technology behind the app the same if you update it - that is, keep writing efficient C and so on - then the increase in memory and CPU resources serve to turn the app into some kind of blistering miracle implementation instead of the run of the mill performance you get from depending on the latest and greatest HLL with garbage collection
    99+% of the time with general problems, I/O is the bottleneck. For those cases, a C application might run 1% faster on newer hardware, given equivalent I/O hardware (same model/make of drive or network). In the vast majority of cases, the effort is simply not worth it. It's far more expensive to pay your salary to build and maintain that codebase than it is to simply purchase a beefier machine. The former is a repeating expense, the latter is a one time expense. Business managers love the latter, not the former.

    I do agree that if your domain consists of highly CPU-bound computational algorithms that don't require frequent HD or network access, then your approach will scale well with the faster hardware; however, I don't think advocating it as a baseline approach for all or most projects makes any sense. It is far more work and causes more maintenance headaches than you're describing.
  • Re:C/C++ is dying! (Score:2, Insightful)

    by ibbie ( 647332 ) on Thursday April 24, 2008 @08:50PM (#23192548) Journal

    I was surprised D was in the top 20. I had to check that it wasn't a joke.
    D is pretty spiffy though, especially with the Tango package. I'm not surprised that it's gaining in popularity.
  • Re:C/C++ is dying! (Score:5, Insightful)

    by billcopc ( 196330 ) <vrillco@yahoo.com> on Thursday April 24, 2008 @09:11PM (#23192718) Homepage
    I agree with you on ColdFusion, simply because I'm forced to work with it on a daily basis. As a longtime "real" programmer, I find CF an insult to my skill and experience, but sadly I need to eat.

    Delphi though, slow down! Everyone keeps repeating how Pascal is a teaching language, yet it was my official language for many years. Back in the 90's I was developing commercial games with little more than Borland Pascal 7 and Turbo Assembler. I did the speedy bits in assembler, and the logic in Pascal. My development time was extremely short and my code was very reliable and reusable.

    When Delphi happened, well, honestly, the first few versions stank, but I remember writing all sorts of apps in Delphi 4 (yes, even DirectX games). Delphi today has turned into a schizophrenic marketing clusterfuck thanks to Borland/Inprise/Codegear/TrendyNameOfTheMomentInc, but I think Delphi as a language is just right for a large number of situations.

    It's right in the sweet spot between useless VB and painful C, plus it compiles crazy fast and performs very respectably given how easy it is to develop in. In fact, its qualities closely resemble those of C#, only Delphi did it over 12 years ago. It's no coincidence: Microsoft hired the creator of Turbo Pascal, Anders Hejlsberg, to create C#, J++ and many key architectural features of .NET. If Borland hadn't gone mental in the mid-90s, .NET would not exist today; instead we'd have Borland's equivalent, and people would be praising Delphi just as they praise C# in today's reality. It would probably run a helluva lot faster too!
  • Re:C/C++ is dying! (Score:1, Insightful)

    by Anonymous Coward on Thursday April 24, 2008 @09:16PM (#23192760)

    9) "Hello, world" only takes one line, and that can't be right, because I learned in Java class that it should take pages and pages of setup code and stuff.
    System.out.println("Hello world");

  • Re:C/C++ is dying! (Score:3, Insightful)

    by Anonymous Brave Guy ( 457657 ) on Thursday April 24, 2008 @09:42PM (#23192960)

    Everyone benefits from not having to reload all the sidebars, etc. on a page when they click a link.

    Not if they're writing the firmware for a washing machine, the operating system for a telephone switching system, the back-end of a corporate database application, the latest FPS blockbuster, the drivers for a new Linux file system...

  • by Angst Badger ( 8636 ) on Thursday April 24, 2008 @11:18PM (#23193684)
    which measures the popularity of programming languages by monitoring their web presence

    Web presence doesn't equal much; it certainly doesn't equate to popularity. Nor do these numbers bear much resemblance to the mix of programming openings I see on job boards. C is number two? Really? Or are they just counting the number of times C shows up in the meaningless expression "C/C++"? Outside of the DSP and embedded-devices niche, the appearance of "C/C++" in a job listing means they're looking for a C++ programmer, and it's generally followed by a list of C++ APIs that the successful candidate will be familiar with. And please, C fans, keep your flames low. C is my favorite language, but if it were really the second most popular programming language, I wouldn't spend eight to ten hours a day programming in C++ and PHP.

    Anyway, the bit about the lack of garbage collection in C++ is a crock. There are a number of GC libraries for C++ that are easy to learn and use, and several of them can be dropped into most codebases with little to no code change, simply by linking them in. If the popularity of C++ is declining over GC, it's because people have gotten too lazy to type "c++ garbage collector" into Google. There are plenty of reasons to dislike C++, but that's just not one of them.
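
    The usual suspect here is the Boehm-Demers-Weiser conservative collector (libgc). Assuming it is installed (header locations, package names, and link flags vary by platform, so treat this as a sketch rather than a recipe), the "just allocate and stop freeing" style looks roughly like this:

        // Sketch of leaning on the Boehm-Demers-Weiser conservative collector.
        // Build with something like:  g++ gc_demo.cpp -lgc
        #include <gc.h>      // the collector's C API (also usable from C++)
        #include <cstdio>

        struct Node {
            int   value;
            Node* next;
        };

        int main() {
            GC_INIT();                       // initialize the collector up front

            Node* head = 0;
            for (int i = 0; i < 1000000; ++i) {
                // Collectable memory: there is deliberately no matching free/delete.
                Node* n = static_cast<Node*>(GC_MALLOC(sizeof(Node)));
                n->value = i;
                n->next  = head;
                head     = n;
                if (i % 100000 == 99999)
                    head = 0;                // drop the whole list; the GC reclaims it
            }

            std::printf("collector heap: %lu bytes\n",
                        static_cast<unsigned long>(GC_get_heap_size()));
            return 0;
        }

    For C++ proper you would typically also route operator new through the collector (the library ships C++ helpers for that, if memory serves), but the details differ between versions.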
  • by Shados ( 741919 ) on Friday April 25, 2008 @12:08AM (#23193960)
    I have news for you, however. Statistically and historically, CS-related work was mostly a big trail of failures. Virtually no project, aside from a select few with insane resources backing them and a couple of lucky shots, actually ended up delivering more value than it cost.

    In this day and age, maybe it's inefficient, maybe it's not quite CS, maybe it's not "cool", but it gets the job done, it saves money, it makes things work, it's enabling. It's still not good enough in its current state, but it's going in the right direction. We're leaving the "how" behind and replacing it with the "what". People have ideas, and they make those ideas a reality, and it "works".

    It may not be as badass to implement an enterprise management system as it was to implement the first Ethernet network, but you don't spend 3 years on a project only to be left empty-handed 90% of the time, either.
  • C++ is easy (Score:3, Insightful)

    by jessecurry ( 820286 ) <jesse@jessecurry.net> on Friday April 25, 2008 @12:25AM (#23194040) Homepage Journal
    Saying that C++ is losing ground because it lacks garbage collection makes me fear for the future of the "general programmer". If you use C++ for any length of time beyond a week, managing your memory becomes second nature. Hell, after two days of programming in C++ I had caught on to the situations in which a memory leak would even be possible. It's scary how easy it is to do... and it's even scarier to think that people would drop a language because they might have to be mindful of memory when they code.
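
    For anyone who hasn't been bitten yet, the situations being alluded to are easy to show; a made-up before/after (purely illustrative, not from the comment):

        #include <stdexcept>
        #include <vector>

        // A typical leak: the early return skips the delete[].
        void leaky(bool bail_out) {
            int* buffer = new int[1024];
            if (bail_out)
                return;                      // leak: this path never frees buffer
            // ... use buffer ...
            delete[] buffer;                 // only the happy path gets here
        }

        // The same shape with RAII: the vector frees its storage on every exit,
        // including when an exception is thrown mid-function.
        void safer(bool bail_out) {
            std::vector<int> buffer(1024);
            if (bail_out)
                return;                      // destructor runs, nothing leaks
            // ... use buffer ...
            if (buffer.size() != 1024)
                throw std::runtime_error("unexpected");  // still nothing leaks
        }

    Spotting the first pattern quickly really is mostly habit; pushing ownership into destructors is how most C++ shops avoid having to rely on that habit at all.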
  • Re:C/C++ is dying! (Score:2, Insightful)

    by Count Fenring ( 669457 ) on Friday April 25, 2008 @01:55AM (#23194500) Homepage Journal

    How is a scripting language not a programmer's tool?

    And how, in that case, does PHP not have the same caveat attached, since (in capability) it resembles a sped-up but limited subset of Perl?

  • Re:C/C++ is dying! (Score:5, Insightful)

    by joggle ( 594025 ) on Friday April 25, 2008 @02:04AM (#23194538) Homepage Journal
    It really comes down to different tools for different jobs. Having a vast number of tools at your disposal for free is not a bad thing, just get a cursory knowledge of each tool and what it's good for so that when your next project comes up you can make an informed decision on which one(s) to use.

    Also, you make it seem like only knowing the C/C++ languages is sufficient to accomplish anything. That's really not true--at a minimum you need to know the STL for C++ related stuff, some GUI library for doing graphics, an XML library for doing XML manipulation, a database library for interacting with the database of your choice, a cross-platform library to write portable code, etc. Even if you're using something that does all of that (such as Qt) you still need to learn about XML, XMLSchema and DTD if you are using those technologies (just as you would for web programming).
  • Re:C/C++ is dying! (Score:1, Insightful)

    by Anonymous Coward on Friday April 25, 2008 @02:49AM (#23194742)

    Visual Basic, before the .NET style, was as Turing complete as any other language I have had to use. Meaning, I CAN solve any solvable computational problem. Classes might NOT be like C++ or Java or whatever... but it isn't an inferior way of doing it.

    The Whitespace [wikipedia.org] language is also Turing complete, as is LOLCODE [wikipedia.org]. You CAN solve any solvable problem with either of them.

    The comparison doesn't prove anything one way or the other about VB, but neither does it necessarily follow from the fact that it is "Turing complete" that it "isn't an inferior way of doing it". (I know I wouldn't want to get stuck with the job of maintaining, editing, or debugging someone else's Whitespace program...)

  • Re:C/C++ is dying! (Score:3, Insightful)

    by asc99c ( 938635 ) on Friday April 25, 2008 @03:22AM (#23194898)

    For most apps of any size, having a 'separate' GUI is no bad thing. It encourages you to simplify the back-end processing and keep it efficient and easy to understand, with a limited number of hooks for a GUI to hook into.

    The stuff I write at work has multiple possible user interfaces. Little has to change to swap our Windows-only C++ / Visual Studio GUI for a multi-platform web-based GUI.

  • Re:C/C++ is dying! (Score:3, Insightful)

    by TapeCutter ( 624760 ) on Friday April 25, 2008 @03:24AM (#23194908) Journal
    "Our team takes it the other direction altogether. Our engine is in C++. Our GUI is in Swing. As long as they're going to be different, may as well really make them different."

    Conceptually that's exactly how it should be: decouple the engine from the display as much as possible. Many enterprise projects learnt this lesson the hard way when they spread the engine across a multitude of MFC dialogs and widgets under Win 3.1, and many of the same enterprises are still learning it with web apps.
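
    As a rough, made-up illustration of that decoupling (every name here is invented), the engine can publish everything through one small listener interface and know nothing about who is on the other end:

        #include <cstddef>
        #include <string>
        #include <vector>

        // The engine talks to the outside world only through this interface and
        // never includes a GUI header. A native dialog, a Swing front end (via JNI),
        // a web layer, or a unit test can all implement it.
        class ProgressListener {
        public:
            virtual ~ProgressListener() {}
            virtual void onProgress(int percent) = 0;
            virtual void onFinished(const std::string& summary) = 0;
        };

        class Engine {
        public:
            void addListener(ProgressListener* l) { listeners_.push_back(l); }

            void run() {
                for (int step = 0; step <= 100; step += 25)
                    notify(step);                          // pretend work happens here
                for (std::size_t i = 0; i < listeners_.size(); ++i)
                    listeners_[i]->onFinished("done");
            }

        private:
            void notify(int percent) {
                for (std::size_t i = 0; i < listeners_.size(); ++i)
                    listeners_[i]->onProgress(percent);
            }

            std::vector<ProgressListener*> listeners_;
        };

    Swapping the Swing GUI for a web front end then means writing one new listener, not touching the engine.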
  • Re:Always be there (Score:3, Insightful)

    by MichaelSmith ( 789609 ) on Friday April 25, 2008 @04:10AM (#23195114) Homepage Journal
    Believe me I have done my share of machine code and assembly programming in my time and I understand exactly what the people on this project have achieved.

    Good on them, but I believe the right tool should be used for the job. Many very high-level tools such as UML get overused these days. That's a shame, and it's good that people are working to keep their assembler skills alive.
  • by thesteef ( 1279258 ) on Friday April 25, 2008 @09:00AM (#23196600)
    It's unbelievable that this company gets any attention with this index. Just a few quick points, which have been glaring mistakes for years now.

    "VB" is a combination of Basic, VB.NET, Visual Basic.NET, Visual Basic .NET, Visual Basic 2005, VB 2005, Visual Basic 2003, VB 2003, Visual Basic 2002, and VB 2002. Some of these, especially the .NET variants, are totally different languages with only superficial similarities to the rest. Lumping them together in the index is nonsense.

    Secondly, one has to wonder why .NET in general is so underrepresented versus Java. This is explained by their data-gathering method: they google for terms like "+c# programming", which omits the entire group of articles that target .NET as a whole rather than any one language, because of the language-independent nature of the platform. Some numbers to put this 'index' in some much-needed perspective:

        +java programming: 8,320,000
        +.net programming: 13,500,000
        +c# programming: 1,260,000
        +vb programming: 618,000
        +vb +.net programming: 664,000

    This is the only meaningful way to look at the relative popularity of Java and .NET: Java is not only a language but also a platform, and so is .NET. Furthermore, the majority of .NET programming (judging from these numbers, I'd say about double) is done in C#. You'd have to conclude that C# should at least be up there with Java. That is, if you think counting the number of search-engine results is a meaningful way to build an index, which I don't.

Arithmetic is being able to count up to twenty without taking off your shoes. -- Mickey Mouse
