Programming

Are C and C++ Losing Ground? 961

Posted by Soulskill
from the lots-of-ground-to-lose dept.
Pickens writes "Dr. Dobbs has an interesting interview with Paul Jansen, the managing director of TIOBE Software, about the Programming Community Index, which measures the popularity of programming languages by monitoring their web presence. Since the TIOBE index has been published now for more than 6 years, it gives an interesting picture about trends in the usage of programming languages. Jansen says not much has affected the top ten programming languages in the last five years, with only Python entering the top 10 (replacing COBOL), but C and C++ are definitely losing ground. 'Languages without automated garbage collection are getting out of fashion,' says Jansen. 'The chance of running into all kinds of memory problems is gradually outweighing the performance penalty you have to pay for garbage collection.'"
  • by KlomDark (6370) on Thursday April 24, 2008 @04:36PM (#23188984) Homepage Journal
    I haven't written a line of code in C or C++ since I started with C# - C/C++ syntax with no tracking of memory (I detest tracking memory!!) except in the more obscure situations. Both .NET and Mono allow for C#, so you're not stuck on one platform.

  • so what? (Score:2, Interesting)

    by rastoboy29 (807168) on Thursday April 24, 2008 @04:38PM (#23189010) Homepage
    Of course C++ is losing ground.  But does it matter?  No.

    If you still need optimum performance at any cost--like for OS's, many games, simulations, etc.--you must use the C's.  If you don't, then your Java or favorite scripting language will give you faster development time and easier deployment.

    So it just doesn't matter.  I still don't consider someone a proper coder unless they know C++, though.
  • by QuantumG (50515) * <qg@biodome.org> on Thursday April 24, 2008 @04:41PM (#23189052) Homepage Journal
    Lately I've found the biggest advantage of using C# over C++ is compile time. If I change a header file in C++, that's it, I'm off to make coffee, but with C# you can change just about anything and the code is recompiled in seconds.

    Now if only the native code generation for C# wasn't so pitiful and unsupported.

  • not so.. (Score:1, Interesting)

    by Anonymous Coward on Thursday April 24, 2008 @04:45PM (#23189124)
    When we have internet that is as fast as CPU response times, C and C++ will go the way of the dinosaur; the internet will be your main application and gaming platform, meaning game over for C and C++.
  • Incompetence... (Score:2, Interesting)

    by HetMes (1074585) on Thursday April 24, 2008 @04:47PM (#23189152)
    ...I say. If reference counting or basic allocation/deallocation coupling is something you cannot do, you're in the wrong business. However, educating students in the art (c.i.t., I know) of programming with Java invites these kinds of problems.
  • by ThePhilips (752041) on Thursday April 24, 2008 @04:55PM (#23189294) Homepage Journal

    What I love about such studies is that they can confirm any theory you want.

    Truth remains that every particular market has requirements which dictate selection of languages.

    I doubt that the telecom industry (as it is right now) would ever move beyond C or C++. Just as I can't see kernels or system libraries in anything but C.

    If you look at the rise of the Web - and the plethora of languages supporting it - then C/C++ are of course out of the question. Though again, I can hardly imagine Apache or MySQL or PHP being written in anything but C or C++.

    The market for systems and telecom programming is definitely shrinking - and consequently so are their languages. Other markets are now blooming - and their languages are becoming more popular.

    My point is that the languages are complementary - they are not competing. After all, you have to build the hardware, firmware and OS first. Only then does your beloved automated garbage collection have a chance to kick in.

  • by Sarusa (104047) on Thursday April 24, 2008 @05:14PM (#23189556)
    We have certainly replaced C/C++ with Python wherever we can. This is about 90% of our software. Except where C is absolutely needed (which is mostly just in our kernel/device driver stuff), the 10x faster Python development and far easier code maintenance just outweighs everything else. That the Python is much less prone to crashing for programs beyond tiny one-offs is another big positive (yes, yes, if you write perfect C/C++ and don't use glib you'll never crash either, but in practice this never happens).

    In practice the speed difference doesn't matter for almost every application we've run into - we have a high speed network load tester in Python, which sounds ridiculous, but it works and it makes it insanely easier to add new tests or behaviors. If we ever hit a bottleneck, we just write a small C extension module and call that from the Python.

    I'm saying Python here, but insert your higher level language of choice.
  • Re:not so.. (Score:1, Interesting)

    by Anonymous Coward on Thursday April 24, 2008 @05:18PM (#23189620)

    As long as computers need an OS, C/C++ will be in wide use. All major OS's are written in C/C++ and will be for the foreseeable future.
    I'm sure the same was said quite a while ago with assembly swapped for C/C++. And, for the sake of application reliability and programmer productivity, one can only hope that the unforeseeable, but inevitable, future comes sooner than later.

  • Re:C/C++ is dying! (Score:3, Interesting)

    by dfiguero (324827) on Thursday April 24, 2008 @05:22PM (#23189688)
    Funny that, according to the site, Javascript is losing ground. I was thinking the opposite should be happening now that Ajax is so popular.
  • Who's going to bother listening to my "back in my day, we programmed uphill in the snow both ways" stories when I don't even bother to use a monospaced font!

    And before I started up my 80x25 terminal window, I tied an onion to my belt, which was the style at the time.

    Yeah. Much better.
  • Re:not so.. (Score:3, Interesting)

    by peragrin (659227) on Thursday April 24, 2008 @05:26PM (#23189740)
    I hate to say so, but MSFT's Singularity and now others (including open source versions) are doing a core OS in C# and .NET. It's genuinely innovative work from MSFT.

    It will take time, but it's well within the foreseeable future.
  • Re:C/C++ is dying! (Score:5, Interesting)

    by lgw (121541) on Thursday April 24, 2008 @05:30PM (#23189800) Journal
    Do they somewhere discriminate between VB and VB.Net? Claiming that VB is not even a programming language is ... probably reasonable. VB.Net is just C# without curly braces, however.
  • by SpinyNorman (33776) on Thursday April 24, 2008 @05:32PM (#23189830)
    There's nothing to stop you from exclusively using reference-counted smart pointers and garbage collection in C++, for some or all of a project, if that's really your thing.

    For me, C++ destructors (each object responsible for its own storage) remove most of the hassle of freeing storage, and I've never hankered after garbage collection.
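The idiom the comment describes can be sketched in a few lines. This is a minimal illustration with a made-up `Buffer` type, written with today's `std::shared_ptr` (in 2008 the same idea lived in Boost or TR1): the destructor frees storage when the object leaves scope, and reference counting deletes a shared object when the last owner lets go.

```cpp
#include <cstddef>
#include <memory>
#include <vector>

// Hypothetical resource type: ~Buffer() is implicit, and std::vector
// releases its storage automatically when the Buffer is destroyed (RAII).
struct Buffer {
    std::vector<char> data;
    explicit Buffer(std::size_t n) : data(n) {}
};

// Reference-counted sharing: the last shared_ptr to go out of scope
// deletes the Buffer. No manual delete, and no collector pause.
std::shared_ptr<Buffer> make_shared_buffer(std::size_t n) {
    return std::make_shared<Buffer>(n);
}
```

The point is that reclamation happens deterministically at scope exit, which is exactly the property the garbage-collected languages in this discussion give up.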
  • Anecdotal experience (Score:3, Interesting)

    by raw-sewage (679226) on Thursday April 24, 2008 @05:38PM (#23189966)

    I've been in the "real world" for about six years now, after graduating with a computer science degree. I'm currently in Chicago, Illinois, USA. I've spent the past several months looking for a good software engineering job, both in the Chicago and Milwaukee (Wisconsin) areas. Just from this experience, my take is that Java and C#/.NET technologies are hottest right now.

    My first job was using C and C++. This was partly due to historical reasons (the application was about 12 years old), but also because the API for the platform was only in C. Shortly before I came in, and during my tenure there, we were trying to move more towards C++ and build a more object-oriented framework. My current position is at a high-frequency trading firm. All our software is custom and mostly C++ (some C here and there, and a handful of Perl to glue things together).

    So based on this experience, when I was looking for a job, I was focusing on C/C++ positions. What I found is that there aren't a lot of people looking for C/C++ developers. In Milwaukee, virtually all of the demand for C/C++ programmers was for embedded systems. In Chicago, there was little demand for experience in those languages outside of embedded systems and the finance industry (which I was/am trying to get out of!).

    This is just my casual observation of a relatively small portion of the software engineer landscape as a whole.

    On top of a diminishing demand for C/C++ programmers, I found that quite a few companies that were looking for Java/C# programmers wouldn't even consider C/C++ people. The languages aren't all that different, and the concepts should definitely be portable. I think knowing concepts, understanding programming ideas/patterns, problem solving, etc, are more important than knowing the specifics of a particular language. Shrug.

  • C++ losing favor (Score:2, Interesting)

    by apharmdq (219181) on Thursday April 24, 2008 @05:53PM (#23190256)
    I'm not really sure why C++ is so ill-favored lately. It may not be fully OO, but there are many times when a fully OO solution is counterintuitive. Instead, C++ allows the developer to choose whether they want an OO solution or not. I also see a lot of complaints about C++ performance in comparison to C performance, but really, when properly implemented there is little difference. The same goes for garbage collection. Granted you have to write it yourself, but program-specific garbage collection will ALWAYS be more efficient than automatic garbage collection.

    I think it's generally agreed that C and C++ aren't going anywhere anytime soon, since a lower-level programming language will always be needed. However, sidelining C++ in favor of C is definitely not a good idea, as C++ does offer many advantages that C lacks. (Yes, even for kernel development.)

    I'd go into more depth, but this article really does a good job of explaining it:
    http://unthought.net/c++/c_vs_c++.html [unthought.net]
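A hedged sketch of what "program-specific" memory management can look like in C++: a fixed-size free-list pool (the `Pool` class and its names are illustrative, not from the linked article). Allocation and release are just pointer pushes and pops on a slab allocated once up front, which is the kind of scheme that can beat a general-purpose allocator or collector for a known workload.

```cpp
#include <cstddef>
#include <new>

// Fixed-capacity object pool: one preallocated slab, with free slots
// threaded into an intrusive free list. alloc() pops a slot and
// placement-constructs T in it; release() destroys T and pushes the
// slot back. No per-object malloc/free, no collector.
template <typename T, std::size_t N>
class Pool {
    union Slot { T obj; Slot* next; Slot() {} ~Slot() {} };
    Slot slots_[N];
    Slot* free_ = nullptr;
public:
    Pool() {
        // Thread every slot onto the free list.
        for (std::size_t i = 0; i < N; ++i) {
            slots_[i].next = free_;
            free_ = &slots_[i];
        }
    }
    T* alloc() {
        if (!free_) return nullptr;      // pool exhausted
        Slot* s = free_;
        free_ = s->next;
        return new (&s->obj) T();        // placement-new into the slot
    }
    void release(T* p) {
        p->~T();                         // explicit destructor call
        Slot* s = reinterpret_cast<Slot*>(p);
        s->next = free_;                 // slot goes back on the free list
        free_ = s;
    }
};
```

Freed slots are reused immediately, so for a program whose object lifetimes fit this shape, the bookkeeping is a handful of pointer writes per operation.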
  • by ajs (35943) <ajs@@@ajs...com> on Thursday April 24, 2008 @06:01PM (#23190406) Homepage Journal
    1. The Web is 90% noise
    2. 99% of statistics are lies
    3. This is a case of statistics based on the Web

    Thus we have a 99.9% certainty that this article is a noisy lie.

  • by radarsat1 (786772) on Thursday April 24, 2008 @06:18PM (#23190686) Homepage
    I'd _like_ to stop using C++, frankly, but I don't seem to have a choice. A lot of my work depends on real-time capability, the kind of speed that is still only really possible on natively compiled languages that don't do dynamic typing.

    I don't even mean hardcore real-time mechanical nanosecond control of knife-wielding deathbots, just simple, This Must Run As Fast or Faster Than The Rate At Which It Will Be Converted To Analog. Python and Java still don't replace C in this area. (Mainly audio, video, and high-speed mechanical control.) And when it gets complex and you need to get into object oriented models to simplify the programming, there is unfortunately no real alternative other than C++. Combine this with the fact that there are a bunch of great libraries out there written in C++ that would be very difficult to replace, and you're stuck with it.

    (I sort of oscillate between liking C++ and hating it, but I'm preferring straight C more and more these days. But like I said, I don't always have the luxury of choice, depending on what libraries I need to use.)

    All these other languages mentioned (Java, Python, Ruby, PHP, Perl, etc) do not compile to native code, and all do dynamic memory management. Hell, that's exactly what makes them *good*. But unfortunately they're not so good for real-time tasks.

    For real-time, you need deterministic memory management, and native speed. I've been looking at some other languages that compile to native code these days, like D [digitalmars.com], or Vala [gnome.org], but I haven't really decided yet whether I can start using them on serious projects.

    I'd really like to learn more about functional programming in this area, too, but there seem to be very few functional languages that are designed for real-time. FAUST [sourceforge.net] is one, but it's only for audio.

    Anyone know any other good natively-compiled languages that actually have well-implemented modern features?

    I wish it were possible to have a compiled version of Python, for example, but there are many dynamic features it depends on. (Some stuff could be done in Pyrex, which is a pretty cool little project, but so far I've only used it to make bindings to C libraries.)
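The constraint described above (deterministic time per audio block, nothing unpredictable in the hot path) can be sketched roughly like this; `GainProcessor` is a made-up example, not code from the comment:

```cpp
#include <cstddef>

// Sketch of a real-time-safe processing stage: everything is set up
// ahead of time, and the per-block callback does no allocation, no
// locking, and no I/O, so its worst-case runtime is bounded and it
// can keep up with the rate at which samples go to the DAC.
class GainProcessor {
    float gain_;
public:
    explicit GainProcessor(float gain) : gain_(gain) {}

    // Called once per audio block; must finish before the converter
    // needs the samples. Only deterministic, allocation-free work here.
    void process(const float* in, float* out, std::size_t n) const {
        for (std::size_t i = 0; i < n; ++i)
            out[i] = in[i] * gain_;
    }
};
```

This is also why a stop-the-world collector is disqualifying here: a pause of even a few milliseconds inside `process` means an audible dropout.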
  • Re:C/C++ is dying! (Score:5, Interesting)

    by jd (1658) <imipak AT yahoo DOT com> on Thursday April 24, 2008 @06:22PM (#23190768) Homepage Journal
    ColdFusion should be shot with a silver bullet, stabbed through the heart with a stake, be stuffed with garlic, and be buried at a crossroads at midnight in a holy water-filled lead coffin with elder signs on all sides, inside and out. Other than that, I have no idea why it ranks in the top 20.

    Delphi and Pascal are other puzzlers. Pascal is great as a teaching language, but there are later iterations of that family of languages - Modula-2 and Modula-3 - that arguably provide better rigor if rigor is what you are after. And I see no obvious reason to use Pascal or related languages if you're not after truly rigorous code.

    C seems to be holding ground; the slight loss seems to be within the fluctuations seen by other languages that are holding steady. It's too powerful, too close to bare-metal programming and too close to the actual machine architecture to fade for some time yet. C++ might genuinely be losing ground - C# and D provide a lot of the power and object-orientedness of C++ but make an effort to learn from its complexity. Personally, I suspect D might stand a better chance, as C# is still very much tied to a single vendor in people's minds. I don't see C++ vanishing; rather, I see them reaching some common point and staying there.

    VB is quick-n-dirty, and it's popular because it's so easy to write something in it. If it ever became unlawful to have a website that was dangerously insecure or a hazard to Internet traffic (in much the same way cars have to be inspected every so often in some places to ensure it meets certain minimum safety standards) I imagine Visual Basic would lose appeal. Well, that or the EU eventually raising the fines to the point of driving Microsoft out of international competition.

    Given that so much new scientific code is still produced in Fortran, whereas not much is really written in COBOL although a lot of legacy code is maintained in it, I'm surprised COBOL is there and Fortran is not. (Fortran is popular enough that there are TWO competing front-ends for GCC for it. There are open-source COBOL compilers, but as far as I know, all work has stopped on all of them. To me, that says something about the level of interest and serious usage.)

  • by CoughDropAddict (40792) * on Thursday April 24, 2008 @06:31PM (#23190892) Homepage
    I am a die-hard C and C++ advocate. I consider it a high priority to make sure that the JVM and .NET aren't the de facto future of all computing, which seems like more and more of a risk when you see things like Singularity OS [wikipedia.org], which is an OS where all application code must be managed code. These managed code people go nuts and think that everything should be managed.

    The current generation of managed code VMs clearly has some benefits. But they fall far short on some of the key properties that make C and C++ so powerful. Even if I grant you that the JVM and .NET have caught up to C and C++ in speed (which I still don't believe has been demonstrated), it's undeniable that
    • VMs have comically bloated memory footprints: between 2x and 30x comparable C programs according to benchmarks: JVM [debian.org], Mono [debian.org]. Even if you consider memory cheap, smaller is always better because it means fewer bits flying over the bus and better cache utilization.
    • VMs stop the world to do garbage collection. Point me to all the articles you want that explain how "it's getting better" and "they've figured out how to make it real-time," but that doesn't change the fact that you're stopping all threads whenever you garbage collect, which is making your latency suffer.

    C and C++ are the only game in town for getting the best performance and a small memory footprint and the ability to have the lowest possible latency.

    That said, I think that C and C++ are becoming harder to justify when you consider the havoc that memory errors can wreak. It's highly embarrassing to vendors and damaging to their customers when a buffer overflow exploit is discovered. malloc and free, even when used correctly, can still have some forgotten downsides like the memory fragmentation that was discovered in Firefox 2 [pavlov.net], and took some very smart people a lot of work to address.

    What I would like to see is a language that gives the benefits of C and C++ (extremely fast, extremely small memory footprint, and no GC pauses) but that is also immune to C and C++'s weaknesses (memory corruption, memory leaks, memory fragmentation). Yep, I pretty much want to have my cake and eat it too. Why do I think this is possible? I think that the future is to have a fully concurrent, compacting GC. Everyone's telling us we're going to have more cores than we know what to do with soon, right? Well why not use all those extra cores to do GC in the background? Even if it's more expensive on the whole, we barely know what to do with all those extra cores as it is. With this strategy, you could get the performance guarantees and low overhead of C and C++ (on the real, non-GC thread, that is) without having to give up GC or suffer from memory fragmentation.

    I'm also not willing to give up the option of dropping to C or C++ (or even assembly language) when it's justified. Mention JNI in a room of Java people and observe them reel in horror -- it's culturally shunned to deviate from "100% pure Java." Maybe this is a good value when you're on a big team of people writing a web app, but for systems and multimedia programming this is silly -- inner loops are inner loops, and some of them can benefit from machine-specific optimization.

    Theoretically you could experiment with the fully concurrent GC using an existing language/runtime like Java, but I've sort of given up on the JVM and .NET communities, because they have empirically demonstrated that they culturally have no regard for small memory footprint, low overhead, short startup time, etc. They just don't consider huge memory footprint or ridiculous startup times a problem.

  • Re:Always be there (Score:4, Interesting)

    by fyngyrz (762201) * on Thursday April 24, 2008 @06:34PM (#23190936) Homepage Journal

    ...but you will spend a lot more time creating, debugging and maintaining it.

    Hmm. Creating, probably so. You're writing smaller steps on a per-keystroke basis, so it's pretty much a given.

    Debugging and maintaining, however, are issues more predicated upon design skills than the language used. From things entirely outside the code's executing domain (like comments and other documentation) to things inside (structures and algorithms), correctness (on which debugging depends), reliability (on which maintenance depends) and completeness/applicability (on which maintenance also depends): all these things are independent of the language, except in very minor and essentially irrelevant ways.

    I would argue that coding in an HLL does not improve these latter things. However, coding in C brings you extremely close to both the problem(s), and the solution(s) you decide to implement without taking you that last troublesome step down into assembler, where you lose platform independence. I think that is a uniformly positive set of consequences to enjoy as a result of spending that extra time.

    It doesn't make sense for the majority of classes of software, from a cost vs. gain perspective, to use C for the job.

    Well, we'll have to agree to disagree here. Wasting resources can have unpredictably large effects, such as pushing a system over the edge between running in memory and beginning to swap. The more you waste, the more likely you are to cause such problems.

    The fact is, running the user out of resources for no reason other than saving small amounts of my time up front is outside the bounds I am willing to go. The gains at the user's end, especially when multiplied by many users across many invocations, are likely to be substantial. Consequently, the investment on my end is almost certain to be small by comparison, even if it is actually many of my hours.

    As a user, I run into this all the time. If I start a certain application, it typically takes quite some time to start. It's the "industry standard", but frankly, it runs like a pig in hip deep dung on every startup. And it eats memory like crazy, even the executable is 4x larger than other apps that do the same thing, but which -- notably -- aren't the "industry standard." So I make the choice, as a user, to use the other apps for all tasks that are achievable either way (and as it turns out, I *very* rarely have to start the industry standard program.) I want my memory to be used for data, not for a bloated application; and I want my time used in working on that data, not waiting to count and register every plugin or aux feature in the system every time the application starts.

    The problem is that from the programmer's perspective, "time and effort" are not even slightly the same as they are from the user's perspective. For my part, I consider it an ethical "must-do" to consider the user's perspective as the primary one driving the design. Both from the viewpoint that their resources are not "mine to waste" just because they have extended me the courtesy of allowing my software to run in their machine, but also from the viewpoint that any supposedly "extra" time I spend, I spend once; any time I cost the users unnecessarily, I extract that cost from every user, and every time the software is run.

  • by IamTheRealMike (537420) <mike@plan99.net> on Thursday April 24, 2008 @06:37PM (#23190992) Homepage

    You didn't say what kind of software you're writing. Personally, I think using Python for large codebases is shooting yourself in the foot (I've seen it tried several times and the results were never pleasant).

    Problems Python has that C++ doesn't (imho): Python is oddly easier to write than read in my experience, because it's so dynamic. The result is that it's a lot of fun to write and really no fun at all to try and figure out when you're new to a codebase.

    Python doesn't even try to be efficient. Fans of Python tend to say that it doesn't matter because either performance doesn't matter for their application, or because they can write the hot-spots in C. Well, for a lot of apps there aren't really any well defined hotspots after some optimisation. Instead the app just chugs. Look at the fate of Chandler or Sugar for instance. You can't fix that kind of thing by judiciously rewriting the bottlenecks in C because there isn't one bottleneck - it's death by a thousand cuts. This is especially true of memory-constrained environments like desktop software. I've seen way too many apps where the developers clearly thought they'd "make it fast later" and then discovered that they didn't understand performance like they thought ...

    It's rather hard to distribute Python apps without distributing a giant runtime with them as well. For many apps that doesn't matter, but if you want people to download your app, it's going to hurt. Any consumer desktop app for instance ...

  • Re:C/C++ is dying! (Score:4, Interesting)

    by ATMD (986401) on Thursday April 24, 2008 @06:47PM (#23191122) Journal
    I wouldn't say it's a niche. I recently got around to learning AJAX and proper DOM scripting and now I want to use it for everything. It makes the UI so much nicer not having to reload the entire page every time you have anything dynamic.
  • myth (Score:3, Interesting)

    by nguy (1207026) on Thursday April 24, 2008 @06:59PM (#23191288)
    The chance of running into all kinds of memory problems is gradually outweighing the performance penalty you have to pay for garbage collection

    It's a myth that there is a performance penalty for garbage collection; garbage collection usually has less overhead than doing the same kind of memory management with malloc/free.

    The reason that garbage collection has a reputation for being slower is that once people have garbage collection, they get sloppy and wasteful. And languages using garbage collection often just aren't designed for efficiency (e.g., Java).

    It would be nice to have a safe, garbage collected systems language, and such languages are possible. Unfortunately, all we get is Java and C#, two bloated languages that make writing efficient programs harder than eating tomato soup with chopsticks.
  • Garbage Collection (Score:3, Interesting)

    by RAMMS+EIN (578166) on Thursday April 24, 2008 @07:10PM (#23191464) Homepage Journal
    ``The chance of running into all kinds of memory problems is gradually outweighing the performance penalty you have to pay for garbage collection.''

    Moreover, automatic memory management makes for more elegant and workable APIs, and some things (I think closures are among those) need automatic memory management.

    Also, automatic memory management is not necessarily slower than manual memory management. In fact, garbage collection can be faster than malloc/free, and has been demonstrated to be so in some cases. (One obvious case where a collector outperforms manual memory management is when the collector never has to run.)
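One way to see the "collector never has to run" point from the manual-management side is an arena (bump) allocator, sketched below; the `Arena` class is illustrative, not from the comment. Each allocation is a pointer bump, cheaper than a general-purpose `malloc`, and everything is released at once, much like discarding a GC nursery wholesale instead of tracing it.

```cpp
#include <cstddef>
#include <vector>

// Bump allocator over one preallocated buffer: alloc() rounds the
// request up to 8-byte alignment and advances a cursor; reset()
// "frees" every allocation at once by rewinding the cursor.
class Arena {
    std::vector<char> buf_;
    std::size_t used_ = 0;
public:
    explicit Arena(std::size_t bytes) : buf_(bytes) {}

    void* alloc(std::size_t n) {
        n = (n + 7) & ~std::size_t(7);            // 8-byte alignment
        if (used_ + n > buf_.size()) return nullptr;  // arena exhausted
        void* p = buf_.data() + used_;
        used_ += n;
        return p;
    }
    void reset() { used_ = 0; }                   // free everything at once
    std::size_t used() const { return used_; }
};
```

For allocation patterns with a clear end-of-phase (a request, a frame, a parse), this is about as cheap as memory management gets in either the manual or the collected world.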
  • Re:Always be there (Score:3, Interesting)

    by smellotron (1039250) on Thursday April 24, 2008 @08:30PM (#23192366)

    Write a quick five-liner in python, it might take 100ms to run. Spend a few hours writing equivalent code in C, it might take 10ms to run. Scale this up to million-line (in C code) applications.

    To give you some perspective, I did perform an experiment along these lines. I wrote a Python script to measure the entropy, cross entropy, and KL-divergence of words in a text file with some basic smoothing models applied. It was about 250 lines of code. I rewrote it in C++ mostly out of curiosity. My naive implementation was about 10% more code, and ran at about the same speed (mostly it's disk I/O and comparing std::map<std::string,int> to a Python dict). I spent a bit more effort and ended up with a 450-line solution using better data structures. The end result (as I look back and run timings) is that it's about 8 times faster when examining a 12k file.

    If I didn't already have the design figured out in Python, it would have taken much longer to write the C++ solution... but having a prototype right there really helped. For performance-critical code (like performing anything more complex than O(n) on a large dataset), it was definitely worth it to spend the extra time on a better implementation. Honestly, the biggest benefit was using both languages in order to end up with a good design and a good implementation in a reasonable amount of time.

    If you're interested, I can provide the source for comparison (I just don't want to put up a public link to my private svn repo).
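A rough sketch of the kind of code the experiment above describes: counting word frequencies in `std::map<std::string, int>` (the structure the comment benchmarks against a Python dict) and computing the entropy of the empirical word distribution. This is a reconstruction under stated assumptions, not the poster's actual source, and it omits the smoothing models.

```cpp
#include <cmath>
#include <map>
#include <string>
#include <vector>

// Entropy of the empirical word distribution:
//   H = -sum_w p(w) * log2 p(w),  p(w) = count(w) / total
double word_entropy(const std::vector<std::string>& words) {
    std::map<std::string, int> counts;
    for (const std::string& w : words)
        ++counts[w];                        // frequency table

    const double total = static_cast<double>(words.size());
    double h = 0.0;
    for (const auto& kv : counts) {
        double p = kv.second / total;
        h -= p * std::log2(p);
    }
    return h;
}
```

As the comment suggests, most of the runtime in a real run would be file I/O and map lookups, which is why the naive port landed at roughly Python speed.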

  • Algol68 (Score:3, Interesting)

    by hedley (8715) <hedley@pacbell.net> on Thursday April 24, 2008 @08:49PM (#23192534) Journal
    Strong typing. Garbage collection. Native code generation.

    The lack of OO and of an up-to-date machine-code-generating compiler would, though, be a detriment to rolling out this old language anytime soon.

    http://en.wikipedia.org/wiki/ALGOL_68/ [wikipedia.org]

    H.
  • by arevos (659374) on Thursday April 24, 2008 @09:01PM (#23192638) Homepage

    The big achilles heel of python is that it currently truly sucks for multi-core programming and it would appear that attempts to solve this are not coming quickly.
    Jython? IronPython? These Python implementations don't have the same global interpreter lock issues that CPython has.

    Java was written with threading in mind from the beginning. So it can potentially embrace the coming multi-core revolution more quickly than other languages.
    Java was written with a lot of things in mind, but in my opinion, it didn't fully achieve many of them. Locks and explicit threads are increasingly regarded as not a very good model for handling concurrency.

    Fortunately, whilst Java may have problems, other JVM based languages like Scala and Clojure handle concurrency extremely well in comparison.
  • Re:C/C++ is dying! (Score:4, Interesting)

    by MBGMorden (803437) on Thursday April 24, 2008 @09:38PM (#23192932)
    I agree. My educational background was in C/C++ programming (command line on Unix systems) and during and immediately following school most of my hobby programming was in Borland C++ Builder. About 2 years ago I discovered PHP and went wild with that. It was fun, but that old saying started becoming true: "When all you have is a hammer, everything starts looking like a nail."

    Web apps are nice and quick to develop, but I'm definitely starting to come back around to the idea that there is a place for local apps, and most certainly for fast compiled code over interpreted.
  • by goombah99 (560566) on Thursday April 24, 2008 @09:50PM (#23193022)

    I'm not convinced Java's "synchronized" facilities are a significant improvement over Python's global interpreter lock.
    Java gets a (somewhat) linear speed-up when you add cores. Python gets virtually zero and in some cases it loses over unthreaded. Big difference!
  • by goombah99 (560566) on Thursday April 24, 2008 @09:53PM (#23193048)
    I agree Lua is fast and slim. Perl seems to fork really well. Java threads really well and presumably so does groovy. Python is terrible at both.

    In the end you want a language that is both powerful and in wide use (for libraries and people willing to code). That sort of brings it back to Java.
  • Re:C/C++ is dying! (Score:5, Interesting)

    by CastrTroy (595695) on Thursday April 24, 2008 @10:22PM (#23193276) Homepage
    Well, as a VB developer, you have to remember that when people talk about VB now, they are talking about VB.Net. Which is exactly the same as C#, with a different syntax. Comparing VB.Net to VBScript or even VB6 is like comparing Java with Javascript. VB gets a bad name because it used to be pretty bad, and there's a lot of non-programmers using it to do a lot of stuff they aren't qualified to do, and messing it up royally. But that doesn't mean VB.Net is a terrible language. I wouldn't fault PHP for all the insecure newbie websites created with PHP.
  • Re:C/C++ is dying! (Score:5, Interesting)

    by PocketPick (798123) on Thursday April 24, 2008 @10:49PM (#23193488)
    I completely agree - you know why I like C or C++? Because I only need to know one thing to do 90% of everything - C or C++. In the world of web development, I must not only be proficient in as many libraries as I'd find on a desktop platform, but also in any number of scripting languages (PHP/JavaScript/Ruby/etc), HTML/XHTML/XML/SGML/DTD/RelaxNG/XMLSchema, perhaps ColdFusion, or maybe Adobe Flex/Silverlight - and I'm probably only scratching the surface of this odd little world.

    Why the pain? Why not keep it simple? In spite of our advancement, it's amazing how much more practicality and common sense some software academics had 20 years ago compared to today.

    When are web standards committees or intellectuals going to quit trying to one-up each other and start consolidating some of their standardizations?
  • Re:C/C++ is dying! (Score:3, Interesting)

    by Planesdragon (210349) <slashdot&castlesteelstone,us> on Thursday April 24, 2008 @11:10PM (#23193634) Homepage Journal

    A musket isn't as useful or respectable when everyone else has M1A2s though.
    It is when all you've got is gunpowder and lead. The average soldier can't make bullets for his M1A2 in the field. The musketeer... can.

    Which is exactly why VB has the staying power it does.
  • by Animats (122034) on Thursday April 24, 2008 @11:21PM (#23193694) Homepage

    There's nothing good to program in. This is a serious problem.

    We have C. C isn't a bad language, but the "pointer=array" concept, while it provided some performance gains in the PDP-11 days, continues to cause millions of crashes and intrusions every day. The fundamental problem is that you can't even express the size of an array in the language. Given that, the odds of consistently getting subscripts right are low. This could be fixed [animats.com], but it will never happen.
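    The lost array bound the parent describes is easy to demonstrate; here's a minimal sketch (the function names are mine, for illustration):

```cpp
#include <cstddef>

// Inside the defining scope, the bound is part of the array's type,
// so the compiler can still recover it.
std::size_t size_at_definition() {
    int buf[16];
    (void)buf;
    return sizeof(buf) / sizeof(buf[0]);   // 16: the bound is known here
}

// A parameter declared as an array silently becomes a pointer:
// the bound is gone, and the callee has no way to check subscripts.
std::size_t size_after_decay(int buf[16]) {
    return sizeof(buf) / sizeof(buf[0]);   // sizeof(int*) / sizeof(int), NOT 16
}
```

    The `[16]` in the parameter list is pure decoration - the language throws it away, which is why callees can't validate indices.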

    There's C++. C++ has lost its way, as I've pointed out before. The C++ committee is off in template la-la land, putting in features that few will use and fewer still will use correctly. (Coming soon: "concepts"). The real problem with C++ is that it's no safer than C, but hides more.

    There were once better languages. Delphi is better, but it's Borland. Modula-3 was a good systems programming language, but it died with DEC. Various attempts at improvement, from Ada to Eiffel to Sather, have almost died off. Amazingly, "D", which is Walter Bright's successor to C++, has a measurable market share.

    Some progress is being made on numeric issues, like compiling Matlab to efficient code for parallel hardware. But that doesn't help systems programming much. Hard-compiled Python would have potential, if Guido weren't against it. (Python has a speed penalty of about 10x to 60x over C/C++. Maybe at some point Google management will decide that a hard-compiled Python system would be cheaper than building additional data centers at former aluminum-smelter sites.)

    As for garbage collection, it's a headache. "Finalizer" and "destructor" semantics get weird. (See "Managed C++") Reference counting leads to saner semantics and repeatable timing, but is inefficient unless the compiler knows how to hoist reference count updates out of loops. (Incidentally, about 90% of subscript checks can be hoisted out of loops, and you can almost always hoist them out of inner FOR loops. So subscript checking is almost free if done right.) Note that Perl is reference-counted, and Perl programmers don't spend much attention on memory management. If you have strong and weak pointers, reference counting, and treat cycles as errors, you don't really need garbage collection.
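    The strong/weak discipline the parent describes is roughly what C++'s `std::shared_ptr`/`std::weak_ptr` pair gives you; a minimal sketch (the `Node` type is a made-up illustration):

```cpp
#include <memory>

// A parent/child pair with a back-pointer: the back edge is weak,
// so the reference counts never form a cycle.
struct Node {
    std::shared_ptr<Node> child;   // strong: owns, keeps alive
    std::weak_ptr<Node>   parent;  // weak: observes only
    static int live;               // count of constructed Nodes
    Node()  { ++live; }
    ~Node() { --live; }
};
int Node::live = 0;

// Returns true if both nodes were destroyed the instant the last
// strong pointer went out of scope -- repeatable timing, no GC pause.
bool cycle_free_teardown() {
    {
        auto parent = std::make_shared<Node>();
        parent->child = std::make_shared<Node>();
        parent->child->parent = parent;   // would leak if this were shared_ptr
        if (Node::live != 2) return false;
    }                                     // counts drop to zero here, in order
    return Node::live == 0;
}
```

    Making the back-pointer `shared_ptr` instead would leave both counts pinned at one forever - which is exactly the "treat cycles as errors" rule the parent proposes.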

  • Not forever (Score:1, Interesting)

    by Anonymous Coward on Friday April 25, 2008 @12:18AM (#23194012)
    Your argument used to go thusly:

    For performance-critical processing, the overhead of anything but assembly language is killer. It'll be news when After Effects or Flame is rewritten in C.

    As time goes by, C# will be to C++ what C++ currently is to assembly language.
  • Re:Always be there (Score:3, Interesting)

    by pclminion (145572) on Friday April 25, 2008 @01:54AM (#23194494)

    Wow. I thought I was reading one of my own posts. I also use Python as a prototyping language for projects which will ultimately be done in C++, and I do it with exactly the same kinds of problems you are solving: statistical NLP.

    I've found that most NLP algorithms are easily expressed in Python, but it is too slow to apply to realistic corpora. On the other hand, Python can easily translate to C++ (once you've built up some experience and the kit to go along with it) and the time it takes to write Python + C++ probably ends up shorter than trying to begin right away in C++.
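    As a flavor of why the port is mechanical: the `counts[w] += 1` dictionary idiom that dominates statistical NLP prototypes maps almost line-for-line onto C++ (a hypothetical sketch, not the poster's actual code):

```cpp
#include <string>
#include <unordered_map>
#include <vector>

// Python:  counts = defaultdict(int)
//          for w in tokens: counts[w] += 1
// C++:     operator[] default-constructs the count to 0,
//          so the loop body is essentially identical.
std::unordered_map<std::string, int>
count_tokens(const std::vector<std::string>& tokens) {
    std::unordered_map<std::string, int> counts;
    for (const auto& w : tokens)
        ++counts[w];
    return counts;
}
```

    The translation is line-for-line, but the C++ version runs over realistic corpora without the interpreter overhead.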

  • by corecaptain (135407) on Friday April 25, 2008 @03:44AM (#23195002)
    As for NBL - I think the NBL will provide support/semantics for development methodologies and components - abstractions built in for defining components, instantiating them from repositories, support for testing. I think all the current focus on things like dynamic vs. static typing, closures, functional languages, and DSLs is at too low a level - it is analogous to assembly language programmers arguing over instruction set design when out of left field come compiled high-level languages (i.e., Fortran). So I think the guy saying he doesn't see any new paradigms showing up is correct if your "domain" is limited to what we know so far. I mean, take C vs. Ruby - how much do their paradigms differ? Compare the difference in paradigms between assembly and Fortran - that is a big difference. I think the NBL will be the NBL exactly because it will force us to think about programming in an entirely new way - in my opinion it will need to address the fundamental problem that a lot of software engineering (despite 20 years of OO) involves re-inventing the wheel.
    Take the hottest web dev framework - Ruby on Rails... what is pathetic is how much functionality each web dev team is duplicating all over the world: user management, logging, session management, on and on, not to mention higher-level domain stuff like shopping carts. So there is lots of re-inventing the wheel going on. Software needs components - this is not a new idea; I think Parnas in the 70s or so delivered a speech on this. So the NBL will enable components. Software development is hard (re: Brooks, The Mythical Man-Month - essential vs. accidental complexity), that is true, but it doesn't follow that we are doomed to repeating the essential (hard) parts ad infinitum... okay, sorry for rambling... good night.
  • by Anonymous Coward on Friday April 25, 2008 @06:28AM (#23195718)
    You're forgetting that knowing C isn't the same as knowing all the library calls. Knowing C (or Perl/Java/Python/PHP/...) means knowing how to program in it. Knowing the way to approach it properly (you can write C code in Perl, practically. It's not really Perl then, though, and does things in a wasteful way), knowing when you have to do something, and what that something is, is knowing the language.

    You have different ways of doing loops, different ways of defining things (C is a strongly typed language, as is Java [though it doesn't have unsigned...], but Perl isn't). And each language does it differently.
  • Re:Always be there (Score:4, Interesting)

    by Carewolf (581105) on Friday April 25, 2008 @08:44AM (#23196486) Homepage
    C/C++ might give you 1% CPU speed-up, but by fine-tuning the memory allocations, the block allocations on the disk and the way you communicate with the I/O devices it can give you a speed-up on I/O operations that is not available in any of the modern toy languages.
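    One concrete (hypothetical) example of the kind of knob the parent means: stdio lets you choose the buffer that sits between your reads and the OS, so it can be sized to match the device's block size. The 1 MiB figure below is an assumption for illustration, not a measured optimum:

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

// Count the bytes in a file, reading through a caller-supplied stdio
// buffer. Tuning iobuf's size to the underlying block size is the sort
// of control higher-level languages typically don't expose.
long count_bytes(const char* path) {
    std::FILE* f = std::fopen(path, "rb");
    if (!f) return -1;
    std::vector<char> iobuf(1 << 20);                      // 1 MiB buffer (assumed size)
    std::setvbuf(f, iobuf.data(), _IOFBF, iobuf.size());   // must precede the first read
    long total = 0;
    char chunk[4096];
    std::size_t n;
    while ((n = std::fread(chunk, 1, sizeof chunk, f)) > 0)
        total += static_cast<long>(n);
    std::fclose(f);
    return total;
}
```

    The same idea extends to `posix_fadvise`, direct I/O, and custom allocators - all things you reach for from C/C++ when the bottleneck is the device, not the CPU.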
  • Re:C/C++ is dying! (Score:4, Interesting)

    by GeckoX (259575) on Friday April 25, 2008 @09:35AM (#23196946)
    Agreed. VB.Net is not VB. Still a tad behind C# for language features, but barely. I worked in C# for the last 4 years at my last job, and dreaded having to use VB.Net at my new place of work. But now that I have been for a year, other than syntax, there's really zero difference between the two. The catch is to turn off the 'features' that let you write more VB6-ish bastardized code. Make sure Option Strict and Option Explicit are on, and throw out the Microsoft.VisualBasic namespace, and you're good to go. One great benefit it has over C# is that the 'perceived' challenge for a VB6 developer in switching to .Net is greatly reduced when they can be introduced to VB.Net rather than C#. I've mentored people that would never have attempted anything in C# in moving to VB.Net with great success.

    I still prefer the syntax of C#, but that's mostly just personal preference.
  • Re:C/C++ is dying! (Score:3, Interesting)

    by somersault (912633) on Friday April 25, 2008 @11:42AM (#23198502) Homepage Journal
    I think Delphi is great for a 'do anything' language - I use it basically for any non-web-based app requiring a GUI. I'm not sure how cross-platform compatible it is, as I haven't ever tried out Kylix or anything like that. I've used C/C++ for doing command line code, DLLs and OpenGL apps, but I've never actually used it for GUIs as it just looks like a royal pain in the ass compared to Delphi. For web-based stuff I used to use PHP as that is what I learned at Uni, but I've moved to Perl now; it's a lot more pleasant to work with, and encourages more secure practices IMO (I submitted an Ask Slashdot question about PHP security ages ago and the answers convinced me to give Perl a go :) ). Knowing Perl will also be useful for doing any random scripting I may want to do in future too. I highly recommend it to anyone who hasn't tried it yet.
  • Re:Always be there (Score:3, Interesting)

    by cmburns69 (169686) on Friday April 25, 2008 @11:53AM (#23198670) Homepage Journal
    This is an interesting discussion for me because I've experienced the whole thing in real life. My cousin has run a large, free, web-statistics tracking site. When he started it, it was running on PHP + Apache. He was just to the point where he was able to start making money off it, and I gave him some bad advice. I told him that if he was having performance issues, he should re-engineer the entire thing himself.

    After 2 years building a web-server from scratch, he's in beta now. But during those 2 years that he wasn't improving (or maintaining) the existing code base, he lost all his thousands of free customers. He has no regrets, and his code behind it is awesome! It's fast, efficient, and modular. As far as code is concerned, it's like Michelangelo's David. From a purely coding perspective, it was the right decision. From a business perspective, it was a very poor decision. He could have a thriving business right now, but unfortunately for him, all he has now are a few prospective customers in beta.

    I (on the other hand) have built a technology to enable certain kinds of web-apps for my personal site. It's in Java, so it's slower and eats memory a bit. I could spend a year or 2 re-writing it in C, and know for certain there were no memory or performance issues. But that's not worth it to me. I can start getting paying customers now. From a business perspective it's more than enough. As my business grows, I simply need to upgrade my hardware.

    My cousin spent a total of 3 man-years getting to where he is now (1 year for the original code-base, and 2 years for the current code-base). I've spent 1 man year for my current code-base. Assuming our time is worth only $30/hr, and that these part time, off hours man years are only 500 hours each, his cost was $60k more than mine was. I can buy a lot of hardware with that difference, and I'm collecting money at the same time.

    From a coding perspective, it makes sense to spend an infinite amount of time to make your code-base perfect. In the real world, the right language is often what is most practical.

    Note: This post was heavily influenced by Joel Spolsky's article Things you should never do [joelonsoftware.com]

"A great many people think they are thinking when they are merely rearranging their prejudices." -- William James

Working...