
Professors Slam Java As "Damaging" To Students

jfmiller calls to our attention two professors emeritus of computer science at New York University who have penned an article titled Computer Science Education: Where Are the Software Engineers of Tomorrow? in which they berate their university, and others, for not teaching solid languages like C, C++, Lisp, and Ada. The submitter wonders whether any CS students or professors would care to respond. Quoting the article: "The resulting set of skills [from today's educational practices] is insufficient for today's software industry (in particular for safety and security purposes) and, unfortunately, matches well what the outsourcing industry can offer. We are training easily replaceable professionals... Java programming courses did not prepare our students for the first course in systems, much less for more advanced ones. Students found it hard to write programs that did not have a graphic interface, had no feeling for the relationship between the source program and what the hardware would actually do, and (most damaging) did not understand the semantics of pointers at all, which made the use of C in systems programming very challenging."
  • tasty (Score:5, Funny)

    by User 956 ( 568564 ) on Tuesday January 08, 2008 @04:19AM (#21951006) Homepage
    Professors Slam Java As "Damaging" To Students

    I dunno about you, but Java was nothing but helpful to me as a student. The drinkable kind, at least.
    • Re:tasty (Score:5, Interesting)

      by Pvt_Ryan ( 1102363 ) on Tuesday January 08, 2008 @06:50AM (#21951788)
      I have to agree with the article. At uni I was taught Java, much to my detriment and annoyance (I had taught myself VB6 prior to going, so I wasn't a newbie to programming).

      I find I am now having to teach myself C++, and am struggling in a lot of areas where, had I been taught them at uni, I would be a lot more confident.
      • Re:tasty (Score:5, Insightful)

        by SQLGuru ( 980662 ) on Tuesday January 08, 2008 @10:29AM (#21953506) Homepage Journal
        My biggest gripe with my college experience (graduated 1994 - BS in CS) was that even though they were teaching the "solid" languages, they still didn't really teach me what I needed to know in order to do the job I do today (DB app development). Sure, things like algorithms and data structures had some low-level fundamental use, but they didn't teach me how to develop a SYSTEM... most of my "projects" were simple "take some input, produce some output" programs.

        Most of my professors had no real world experience, either. So, teaching things like team dynamics and working within a project schedule were really beyond their expertise. Granted, I've been quite successful, but I attribute most of that to my abilities, not what I learned in college. College just got me a piece of paper that opened the door.

        I don't think the problem is with the languages being taught, but in the lack of true engineering being taught. This is true of any of the programming related fields (CS, MIS, SE). All of them need these skills.

        Layne
        • Re:tasty (Score:5, Insightful)

          by pyite ( 140350 ) on Tuesday January 08, 2008 @10:38AM (#21953636)
          So, teaching things like team dynamics and working within a project schedule were really beyond their expertise.

          And also beyond the scope of computer science. If that's what you wanted, you should have specialized in software engineering. People keep forgetting that computer science classes should feel more like math classes than engineering or management classes. One look at TAoCP would hint at that. For the record, I'm an engineer and I find the pseudo-engineering that most CS programs push out to be highly disturbing. Either do it right and call it software engineering, or remove the non-CS stuff and call it computer science. If you're not gonna do either aggressively, give it a fake major name like "Information Technology" or "Management of Information Systems" and teach a bunch of stuff really poorly.

    • Java == Jobs (Score:3, Interesting)

      I am not especially fond of Java myself. In fact, my focus has been precisely on lower-level languages like the ones mentioned in the summary. I would love to work with these languages, preferably on tasks that were algorithmically and mathematically challenging, etc.

      But I have not been able to find any such jobs. Job databases show 90% .NET or Java jobs. The summary makes it sound like there is a great demand for my skills, but where are the jobs? /David
      • Re:Java == Jobs (Score:5, Insightful)

        by asc99c ( 938635 ) on Tuesday January 08, 2008 @08:30AM (#21952394)
        I think a lot of employers advertise Java / .NET because a lot of employees believe that is the new thing and the way forward - i.e. C programming is on the decline, and (young to middle-aged) employees don't want to get too far behind the times. Older employees might instead make a selling point of their skills.

        I'm mainly a C programmer these days, but I took the job basically understanding that I would be working significantly with Java. That was the only language I had experience with on leaving Uni, and I was promptly put to work on a Pascal / OpenVMS system! Friends from Uni have had similar experiences.

        I have been a bit worried about an outdated skillset, as lots of employers ask for lots of object-oriented programming experience and I only occasionally use this. I think this would be my primary problem if I started looking for a new job. I also think it's a bit unfair, as the skills are pretty transferable - there's only a little new theory to learn, and after that, good programming practices aren't hugely dependent on the language used.

        In dealings with many (perhaps even most) other companies whose software I write interfaces with, it's pretty clear that they are also using C or C++, and often even older systems (in one interface we have to convert our messages from ASCII to EBCDIC). You can frequently tell what language the other system is from the sort of errors that crop up, and sometimes from the design of the interface. I'm forced to believe that my area of the industry is still primarily C based.
        • Re:Java == Jobs (Score:5, Insightful)

          by Phleg ( 523632 ) <stephenNO@SPAMtouset.org> on Tuesday January 08, 2008 @10:37AM (#21953628)

          I have been a bit worried about an outdated skillset as lots of employers ask for lots of object oriented programming experience and I only occasionally use this.
          What's stopping you from learning a language on your own, for fun? If you think employers won't care about non-professional experience, you're either simply wrong, or working for employers who hire crappy programmers. Where I work, we'll likely throw your resume in the trash if it doesn't have Haskell, Lisp, Ruby, Python, OCaml, Scheme, Scala, or some other obscure, clearly self-taught language on it.

          I think this would be my primary problem if I started looking for a new job. I also think it's a bit unfair as the skills are pretty transferrable - there's only a little new theory to learn and after that, good programming practices aren't hugely dependant on language used.
          You're far off here. This seems to be a prevailing thought, but it just doesn't bear out in practice.

          As code gets more complex, the best way to keep it understandable to others is to follow common language idioms, indentation / code formatting practices, and use built-ins in the standard libraries. These alone often take months to become familiar with, but that's only half of it. The other half I can only describe as trying to approach problems from the unique perspective of the language. Any asshole can jump from Java to Ruby, or from C++ to Lisp, or from VB.NET to Scala. But learning how to solve problems using those languages' strengths, rather than writing code as you would in the language you're coming from, is crucial.

          From my own experience, Java programmers coming fresh into Ruby don't use blocks. When you finally convince them to use blocks for enumerators, they miss the point entirely and simply use each_with_index for everything, rather than more powerful methods from functional programming like map. They also don't like to reopen classes. In Ruby, classes can be added to at will, so if you want a method to calculate the average value of an Array, you can simply define it as a new method on the class. But Java programmers will create a Util module, throw a method in there that takes an Array, and think nothing more. It's not wrong, per se, but it's ignoring Ruby's strengths, and simply writing Java code inside the Ruby interpreter. And the people who do this are bloody useless.

          My rant is getting long, but the main point is this: learning syntax for a new language is easy. Learning to use that language properly (much as a screw is used differently than a nail) is crucial to being able to work with other people, and getting anything meaningful done.

      • Re:Java == Jobs (Score:5, Insightful)

        by aldousd666 ( 640240 ) on Tuesday January 08, 2008 @10:02AM (#21953184) Journal
        There are many jobs in .NET and Java, yes. I hated Ada in school, and FP was particularly difficult. But once we got to assembler it all made sense. It was the guts of the system, and I finally saw how it all fit together. Once I saw data structures, and then had a look at how stack-based code was generated from all of the other languages, I felt like I could learn any of the languages and not feel like I was using a black box.

        In my opinion, it's OK to learn Java and C# in school after one has had a look at the internals, perhaps with a primer in virtual machines. That would cover the bases of actually knowing how computing works, in addition to allowing for preparation for the job market. One thing that's absolutely crucial to a computer science grad in the real world is being able to adapt to any language when needed, so all of this argument over which language to learn is a little off the mark. You should learn programming in general in school, and optionally focus on any language of the day for the market after you've become versed in the art in general.

        I realize that 'becoming versed' while in school is a little bit unrealistic as well, but if you've at least been exposed to the concepts at a lower level, it doesn't leave you scratching your head as much in practice when you can't figure out, for example, why your C# code makes a distinction between stack- and heap-allocated structures, and what impact that has on performance. It also means that when security holes are pointed out, or patched, you at least know what the hell is going on, and why it was a big deal to begin with.
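
        Since the stack/heap distinction comes up at the end there, here is a minimal C sketch of the underlying difference (illustrative only; C#'s struct/class rules are layered on top of this same idea):

            #include <stdlib.h>

            void demo(void) {
                int on_stack[4] = {1, 2, 3, 4};              /* automatic storage: gone when demo() returns */
                int *on_heap = malloc(4 * sizeof *on_heap);  /* dynamic storage: lives until free() */
                if (on_heap == NULL)
                    return;                                  /* allocation can fail; check before use */
                for (int i = 0; i < 4; i++)
                    on_heap[i] = on_stack[i];
                free(on_heap);                               /* the programmer's job here; the GC's job in Java/C# */
            }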
        • Re:Java == Jobs (Score:4, Insightful)

          by SatanicPuppy ( 611928 ) * <Satanicpuppy&gmail,com> on Tuesday January 08, 2008 @11:01AM (#21953972) Journal
          They taught mostly Java where I went to school, though "taught" isn't really right. They didn't "teach" by languages at all, it's just that the programming projects mostly tended to be in Java, unless the class concepts were better suited to something else.

          Mostly the classes were theory, concepts; stuff that applied equally to all languages. It was weird in some ways; I had a networking class that assigned the eternal "create a server/client chat program" project, where part of the project was a Java GUI. At this point, I'd never programmed a GUI, and neither had anyone else I talked to. The response of the TA (who was the only one who'd ever give programming advice, because the professor only dealt in theory), was that GUI design was beyond the scope of the class and we'd just have to figure it out.

          My method for figuring it out involved downloading a Java editor and using the GUI design tools. It was the first time I'd used a graphical editor for Java; we were encouraged to do the work in vi or Emacs, and generally, that's all we did.

          Now I hated that crap at the time, but nothing has prepared me better for my day to day life than having projects dumped on me where I had to goddamn well use my initiative and figure it out. Over and over again, I was forced to go out and read and work out for myself how to translate the theory into code. These days, I program in Java about 20% of the time. I'd hardly say it stunted my abilities, and it certainly didn't make me into a cookie cutter corporate programmer.

          I'd have to say that specifically teaching any language is a problem. They all come in and out of fashion. I work with a guy whose mind is stuck in Visual Basic... and I don't mean .NET. The idea that you should focus on intellectually sexy teaching languages (like goddamn SCHEME) because it builds character or some crap... I just don't buy it. Language is a tool, and should be treated as such.
      • Re:Java == Jobs (Score:4, Interesting)

        by Pengo ( 28814 ) on Tuesday January 08, 2008 @10:42AM (#21953698) Journal
        I agree, Java does equal Jobs.

        But C and a C++ background will help you keep that job.

        It's very hard to be taken seriously as a computer programmer, even a Java one, if you aren't able to understand memory allocation, pointers, etc. Often it's critical to understand that so you can understand what the garbage collector is doing.

        I'm the director of technology for a small/medium company (about 5 programmers), and when I'm interviewing a new job candidate for a Java position, I ask them to explain how they would write a linked list in Java without using the collection classes.

        Some of the responses I get are interesting. :)
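
        For reference, a minimal sketch of the structure that question is probing for, written here in C so the pointer plumbing is explicit (the question itself asks for Java; names are illustrative):

            #include <stdlib.h>

            struct node {
                int value;
                struct node *next;   /* NULL marks the end of the list */
            };

            /* Push a new node onto the front; returns the new head,
               or the old head unchanged if allocation fails. */
            struct node *push(struct node *head, int value) {
                struct node *n = malloc(sizeof *n);
                if (n == NULL)
                    return head;
                n->value = value;
                n->next = head;
                return n;
            }

            /* Free every node; the caller's pointer is dangling afterwards. */
            void free_list(struct node *head) {
                while (head != NULL) {
                    struct node *next = head->next;
                    free(head);
                    head = next;
                }
            }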
  • by gangien ( 151940 ) on Tuesday January 08, 2008 @04:24AM (#21951026) Homepage
    "A Real Programmer Can Write in Any Language (C, Java, Lisp, Ada)"

    That's true, but again, software engineering/programming is a subset of computer science (maybe - I suppose you could argue it isn't).

    "Computer science is no more about computers than astronomy is about telescopes."
    - Edsger Dijkstra
    • by hedleyroos ( 817147 ) on Tuesday January 08, 2008 @04:36AM (#21951094)
      Dijkstra did say that, and if the software world consisted of only theory then we could all get 90% for our efforts and be happy with that.

      In practice systems have to work 100%, and when your graph search algorithm (by Dijkstra naturally) segfaults due to dereferencing a wrong pointer then computer science is very much about computers.

      I'm just worried that too few students these days know assembly and C, which leaves us in a predicament when the current generation of kernel devs retire.
      • Re: (Score:3, Insightful)

        by Jackmn ( 895532 )

        and if the software world consisted of only theory then we could all get 90% for our efforts and be happy with that.
        University is not occupational training.

        In practice systems have to work 100%, and when your graph search algorithm (by Dijkstra naturally) segfaults due to dereferencing a wrong pointer then computer science is very much about computers.
        Computer science is not computer programming.
      • by epine ( 68316 ) on Tuesday January 08, 2008 @05:24AM (#21951384)
        Anyone with a true gift to become a kernel dev has probably engaged in flame wars with his/her professors already, regardless of what they teach.

        Pointers aren't rocket science. If you never perform an operation where you haven't first met the operation's preconditions, you never get a pointer error.

        If you aren't rigorously checking preconditions on *every* operation you perform, you're not going to cut it as a kernel dev anyway. Pointers are the least of your problems. Race conditions can become exceptionally hard to reason about. The prudent kernel dev architects the system such that this doesn't transpire. That requires a whole different galaxy of aptitude beyond not leaking pointers.

        When I first learned C in the K&R era, I thought those greybeards were pretty clever. Then I came across strcpy() and I wondered what they were smoking that I wasn't sharing. I thought to myself, there must be some higher-level idiom that protects against buffer overflow, because no sane architect would implement such a dangerous function otherwise. Man, was I ever naive.

        More likely, too many of them had learned to program on paper teletypes, and just couldn't bring themselves to face having to type unsafe_strcpy() when they had reason to know it would work safely and efficiently.

        The C language deserves a great deal of shame in this matter of giving many beginning programmers the false impression that any function call should dispense with formal preconditions.
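
        To make that concrete, here is the classic failure mode next to a precondition-checked version (a minimal C sketch; the buffer size is illustrative):

            #include <stdio.h>
            #include <string.h>

            int main(void) {
                char buf[8];
                const char *input = "this string is longer than eight bytes";

                /* strcpy() has an unstated precondition: buf must hold
                   strlen(input) + 1 bytes. Nothing checks it; violating it
                   silently writes past the end of buf. */
                /* strcpy(buf, input);    <-- undefined behavior here */

                /* The checked version states the precondition explicitly. */
                if (strlen(input) + 1 <= sizeof buf) {
                    strcpy(buf, input);
                } else {
                    fprintf(stderr, "input too long for buffer\n");
                    return 1;
                }
                return 0;
            }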

        Interestingly, if you sit down to implement an STL template algorithm manipulating iterators, it proves pretty much impossible to avoid careful consideration of range and validity.

        OTOH, C++ makes it charmingly easy for an object copy routine, such as operator=(self& dst, const self& src), to make a complete hash of managed resources if you fail to affirm that dst and src are not the same object.

        There are plenty of amateur mathematicians who can manipulate complex formulas in amazing ways. The difference with a professional mathematician is that the necessary conditions for each transformation are clearly spelled out.

        A = B ==> A/C = B/C iff C != 0
        A > B ==> C*A > C*B iff C > 0

        Infinite series must converge, etc.

        I'm not even getting into defining A,B,C as fields, groups, rings, monoids, etc. for maximum generality.

        Yet the average programmer feels sullied to exercise the same intellectual caution manipulating pointers. I've never understood that sentiment. My attitude is this: if that's how you feel, get your lazy coding ass out of my interrupt handler; go code a dialog box in some Visual Basic application that won't work right no matter what you do.

        Why did the software industry play out this way? Other professions have much harsher standards. Primarily because software was in an exponential expansion phase, any work was regarded as better than no work (perhaps falsely), and industry couldn't afford to reduce the talent pool by demanding actual talent.

        Now we've allowed many people to enter the profession without comprehending the rigors of preconditions. It's as if we had taught a generation of lawyers how to practice law, but omitted liability. Oops. What to do about it? Invent Java, and tell all these programmers it wasn't their fault in the first place.

        So yes, Java doesn't teach very darn much about the harsh realities of actually thinking. And since thinking is hard, it's an impediment to productivity anyway, so it hasn't much been missed. The only thing we lost in the shuffle is our professional self respect.

        • by erc ( 38443 ) <erc@p o b o x.com> on Tuesday January 08, 2008 @06:09AM (#21951612) Homepage
          I thought to myself, their must be some higher level idiom that protects against buffer overflow, because no sane architect would implement such a dangerous function otherwise. Man, was I ever naive.
          Naive about the purpose of C, anyway. C was never designed to prevent you from shooting yourself in the foot. Writing C requires you to think, which is sadly out of vogue these days, as you point out later. C was never designed to protect you from yourself, as explicitly pointed out by Dennis Ritchie many times. If you want a language that will protect you from yourself, program in VB.

          So yes, Java doesn't teach very darn much about the harsh realities of actually thinking.
          But C obviously does - like checking boundary conditions. I don't understand how you can slam C in one breath, then praise it in the next.
          • by epine ( 68316 ) on Tuesday January 08, 2008 @08:05AM (#21952228)
            I don't advocate protecting the programmer from him/herself.

            I do advocate that primitives as essential to the language as the C string functions should powerfully remind the programmer using them of the programmer's logical obligations, and support the programmer in reasoning correctly about those obligations, without having to digest 15 lines of preceding context to see that calloc() provided the implied terminating NUL.

            strlcpy and strlcat - consistent, safe, string copy and concatenation [gratisoft.us] by Todd C. Miller and Theo de Raadt, OpenBSD project

            There are several problems encountered when strncpy() and strncat() are used as safe versions of strcpy() and strcat(). Both functions deal with NUL-termination and the length parameter in different and non-intuitive ways that confuse even experienced programmers. They also provide no easy way to detect when truncation occurs. Finally, strncpy() zero-fills the remainder of the destination string, incurring a performance penalty. Of all these issues, the confusion caused by the length parameters and the related issue of NUL-termination are most important. When we audited the OpenBSD source tree for potential security holes we found rampant misuse of strncpy() and strncat(). While not all of these resulted in exploitable security holes, they made it clear that the rules for using strncpy() and strncat() in safe string operations are widely misunderstood.
            An Interview with OpenBSD's Marc Espie [onlamp.com]

            We have had a lot of success explaining the issues and getting a lot of people to switch from strcpy/strcat to strlcpy/strlcat.

            Weirdly enough, the Linux people are about the only major group of people that has constantly stayed deaf to these arguments. The chief opponent to strlcpy in glibc is most certainly Ulrich Drepper, who argues that good programmers don't need strlcpy, since they don't make mistakes while copying strings. This is a very mystifying point of view, since bugtraq daily proves that a lot of Linux and free software programmers are not that bright, and need all the help they can get.
            The original C strcpy() could just as easily have had the semantics of strlcpy(), with insane_strcpy() provided to copy strings/trash core without a cycle wasted.
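
            A sketch of the strlcpy() idiom the OpenBSD authors describe: the return value is the full length strlcpy() tried to create, so truncation detection is one comparison (this assumes a libc that provides strlcpy, e.g. the BSDs; elsewhere you supply it yourself):

                #include <stdio.h>
                #include <string.h>   /* declares strlcpy() on the BSDs */

                int copy_name(char *dst, size_t dstsize, const char *src) {
                    /* strlcpy returns strlen(src): the length it *tried* to create.
                       dst is always NUL-terminated whenever dstsize > 0. */
                    if (strlcpy(dst, src, dstsize) >= dstsize) {
                        fprintf(stderr, "truncated: %s\n", src);
                        return -1;    /* the caller decides what truncation means */
                    }
                    return 0;
                }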

            One must recognize that in a solid code base, thinking occurs more often while reading code than writing code. Correctness is not a write-only proposition in any living code base.

            We came to the conclusion that a foolproof alternative to strncpy() and strncat() was needed, primarily to simplify the job of the programmer, but also to make code auditing easier.

            The original C string functions were (and remain) a pedagogic disaster. Most beginning programmers failed to realize how much thinking had been folded into the surrounding context. If they were reading K&R, that thinking existed. If they were reading whatever code they had at hand, it likely didn't, judging by any survey of average C code quality ten years later. With the original string functions, whether this careful thinking existed is not obvious without doing a lot of mental work, and that work has to be repeated *every time* the code is seriously reviewed.

            Worst of all, the strcpy() function seemed to imply "buffer overflow is no great concern, we're not even going to give you a single argument on this very dangerous function to help you avert it". It was a false parsimony to save that extra argument in the default case.

            This isn't at the level of whether the handgun has a safety or not. It's at the level of whether it is possible to chamber a round too large for the barrel. I can point the gun successfully, but I'd greatly prefer it not to detonate in any other direction.

            A more thoughtful C string API would have averted mistakes on the magnitude of chambering bad ammunition, without encumbering the pointy end in the slightest, or failing to endanger the programmer's foot.
          • by autophile ( 640621 ) on Tuesday January 08, 2008 @11:46AM (#21954630)

            Naive about the purpose of C, anyway. C was never designed to prevent you from shooting yourself in the foot. Writing C requires you to think, which is sadly out of vogue these days, as you point out later.

            I know, it's insane nowadays! Why, just the other day I was just remarking to my barefoot wife, who was cleaning our clothes by beating them against the rocks in the river, and then rolling them through a wringer, that all these fancyboys with their pushbutton machines and laundry powder just don't value godly, manual labor anymore! Then I went to my factory job where the machines have no safety features, so they really require you to think, which is sadly out of vogue these days with OSHA and safety goggles and whatnots.

            The shame!

            --Rob

        • by James Youngman ( 3732 ) <jay&gnu,org> on Tuesday January 08, 2008 @06:24AM (#21951680) Homepage

          Anyone with a true gift to become a kernel dev has probably engaged in flame wars with his/her professors already, regardless of what she/he teaches.
          Piffle. You are equating software engineering talent with a propensity to participate in shouting (or its equivalent) matches. Those things are, to say the least, incommensurate.

          If you aren't rigorously checking preconditions on *every* operation you perform, you're not going to cut it as a kernel dev anyway.
          I disagree. Once a precondition has been checked once (on entry to whatever subsystem we're talking about) there is no need to re-check it all the time. Especially if it's an invariant of the algorithm. Sometimes such precondition re-checking gives rise to bugs anyway, since the negative arm of the conditional may contain code with a bug in it (though obviously using an assert macro will prevent that) - error cases get poor test coverage so such bugs may persist for a long time, too.
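
          A sketch of that split in C (function names illustrative): validate untrusted input once at the subsystem boundary, then record it internally as an assert()ed invariant instead of re-checking it with error branches that rarely get test coverage:

              #include <assert.h>
              #include <stddef.h>

              /* Internal helper: the precondition is an invariant here, so it is
                 documented with assert() (compiled out under -DNDEBUG) rather than
                 re-checked with an error path that would rarely be exercised. */
              static int process_internal(const char *buf, size_t len) {
                  assert(buf != NULL && len > 0);
                  /* ... actual work ... */
                  return 0;
              }

              /* Public entry point: validate untrusted input once, here. */
              int subsystem_process(const char *buf, size_t len) {
                  if (buf == NULL || len == 0)
                      return -1;     /* real error handling, tested like any other path */
                  return process_internal(buf, len);
              }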
      • by rucs_hack ( 784150 ) on Tuesday January 08, 2008 @06:23AM (#21951674)
        Last year I was in the decidedly odd position of having to teach third-year CS students (who had primarily used Java) what pointers were, how memory allocation worked, and how to use C.

        That they didn't know C wasn't too surprising. That they didn't have more than a basic grasp of memory management was shocking. They were also completely baffled when it came to not using an IDE to develop software. Makefiles had to be explained several times.

        I've grumbled many times about this concentration on Java, and the resultant lack of detailed understanding about programming, but each time I did so at my university I was disregarded, and someone always trotted out that age-old nonsense about "not re-inventing the wheel".

        I mean, sure, I see the point, but surely you should have a basic idea of how wheels are made?
    • by bigstrat2003 ( 1058574 ) on Tuesday January 08, 2008 @05:01AM (#21951230)
      True. Besides, the idea that Java is damaging to students is pure bullshit anyway. If the students are learning the Java way to do things, and nothing else, then they have horrible professors. I learned CS from good profs (well... one good and one bad), and surprise, even though I got my start in Java, I am perfectly capable of doing things in other ways.

      When I took data structures and we used C++, I didn't have mental convulsions because Java had wrecked my thinking so much (although I did have mental convulsions because C++ is incredibly messy to read at a glance); I learned different ways of doing things. So maybe these professors should look at whoever's teaching these kids so sloppily, not the language.

      • by timmarhy ( 659436 ) on Tuesday January 08, 2008 @05:05AM (#21951260)
        Unfortunately he doesn't go far enough into the core of the problem, which is that today's universities are mass-producing what employers want, rather than the thinkers of tomorrow.

        Employers want nothing more than easily replaceable drones who come with an easily definable skill set and who can be replaced when a new buzzword comes along. This is NOT what universities should be pandering to.

        • by cheater512 ( 783349 ) <nick@nickstallman.net> on Tuesday January 08, 2008 @07:07AM (#21951918) Homepage
          In Australia, universities are spewing out people who can pass a test but can't think in programming terms to save their life.

          I was rolling on the ground laughing when I saw the problems people were having making a simple Sudoku program in C#.
          Once they were done drag/dropping all the UI elements, they all got stuck.

          Mind you, you're right that they're teaching what they think employers want.
          We didn't get to see a *nix system, let alone use one.
          Although that may be because of that rather large donation Microsoft gave...
        • by JavaRob ( 28971 ) on Tuesday January 08, 2008 @09:33AM (#21952892) Homepage Journal

          unfortunately he doesn't go far enough into the core of the problem, which is today's universities are mass producing what employers want, rather then the thinkers of tomorrow.
          Is that even true, though? The "drones" are the reason why so many projects fail (because they have no clue about larger risks, and no way to solve difficult problems), which costs those same employers vast amounts of money.

          I code mostly in Java in my professional life, but when I was in school we were forced to diversify, and it was a definite plus.

          The intro course used mostly JavaScript for some reason (!), but other (even relatively low-level) courses required projects written in C, Scheme, and Java. I took an operating systems course where we had to write a project in some kind of mini-assembly language... it's all a bit fuzzy now (I graduated 10 years ago), but I remember it being tough to wrap my head around for a while. And that's a good thing, right?

          I also did a couple of summer-long solo projects that probably taught me more than anything -- just fighting through the problems on my own, learning the hard way about the value of clean code, OO, version control, debugging skills, etc. etc..

          Perhaps obviously, I'm a much better *Java* developer than I would have been without the other stuff. So I agree wholeheartedly that students must learn more than one language in their schooling - whether for professional reasons OR for academic reasons; you simply have to flex your thinking in more than one way if you're going to learn.

          It also strikes me as a tough way to learn... how do you learn what X language is, and the reasons behind its design, totally in isolation? How do you learn what OO is if "functional" is a meaningless concept to you?
      • by daem0n1x ( 748565 ) on Tuesday January 08, 2008 @06:18AM (#21951650)
        It's just the usual senseless Java bashing. It has ZERO to do with Java. If the students are taught VB or C# and nothing else, as happens in my country today, the problem is the same.
  • by KillerCow ( 213458 ) on Tuesday January 08, 2008 @04:24AM (#21951028)

    Java programming courses did not prepare our students for the first course in systems, much less for more advanced ones. Students found it hard to write programs that did not have a graphic interface, had no feeling for the relationship between the source program and what the hardware would actually do, and (most damaging) did not understand the semantics of pointers at all, which made the use of C in systems programming very challenging.


    Yeah, I just read a press release from the FAA blasting driver training courses. Apparently, flight students who just got their drivers licenses were not able to navigate in the air, execute banks, take-off, or land properly.

    Students have to start somewhere. It's easier to start with simple stuff than to try to cram their heads full of everything all at once.
    • by jandersen ( 462034 ) on Tuesday January 08, 2008 @04:45AM (#21951144)
      Don't be silly. Flying an aircraft requires a whole new set of skills that are outside the normal experience of most people. Driving is not just flying with a number of 'security enhancements', whereas programming in Java is like programming in C, but without the need to learn about pointers or good programming discipline. So if C is like a manual car, Java is an automatic.

      It is reasonable to expect that a CS student has both the ability and the interest it takes to learn all the details of programming well in C.
      I would think the best language to start with is quite possibly C. Pascal used to be better, but it's no longer used. I personally would prefer a scripting language like Python or Perl; I think scripting languages are better because there is no complexity of a compiler involved.

      I would think it better to have a functional language next. Students are much more receptive in the earlier years, and functional programming does take some getting used to. I don't know enough to recommend specific languages ;-).

      After tha
  • by toolslive ( 953869 ) on Tuesday January 08, 2008 @04:28AM (#21951048)
    If they teach you only one programming language, yes, they damage you.

    In the course of my CS education (early 90s), they started with Pascal when they explained algorithmic basics.
    Later courses were in C for OS and networking, while other courses used about everything from Prolog to Ada.

    You learn that some paradigms map to certain types of problems better (or worse) than others. So don't open sockets in
    Prolog (I have seen 'em do it, man), and don't do AI in C.

    A quote: "If the only tool you have is a hammer, every problem looks like a nail."
    • Re: (Score:3, Insightful)

      by synx ( 29979 )
      Yes, early 90s - but have you checked out universities lately? We're talking cradle to grave Java. Intro to dev in Java, mid level courses in Java, Sr courses in Java. We're graduating people who don't know what pointers are!

      COME ON!

      And don't tell me Java doesn't have pointers - what do you think references are? Glorified pointers with auto-null checks.

      One problem I've seen is Java Developer Syndrome (JDS) - think devs who don't know the difference between Java API names and the data structures that are used
      • And don't tell me Java doesn't have pointers - what do you think references are? Glorified pointers with auto-null checks.

        And what's a SIGSEGV if it's not an auto-null check? ;-)
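
        For contrast, a minimal C sketch of that "check": dereferencing NULL isn't checked by the language at all; on typical hardware the MMU faults because page 0 is unmapped (formally it's undefined behavior, so even the crash isn't guaranteed):

            #include <stdio.h>

            int main(void) {
                int *p = NULL;
                /* Java would throw NullPointerException here, deterministically.
                   In C this is undefined behavior; on typical systems the MMU
                   delivers SIGSEGV - a "null check" done by the hardware, after
                   the fact. */
                printf("%d\n", *p);
                return 0;
            }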

  • About the Authors (Score:5, Informative)

    by etymxris ( 121288 ) on Tuesday January 08, 2008 @04:32AM (#21951068)
    Gee, wonder why they're praising Ada so much.

    Robert B.K. Dewar, Ph.D., is president of AdaCore and a professor emeritus of computer science at New York University. He has been involved in the design and implementation of Ada since 1980 as a distinguished reviewer, a member of the Ada Rapporteur group, and the chief architect of the GNU Ada Translator.

    Edmond Schonberg, Ph.D., is vice-president of AdaCore and a professor emeritus of computer science at New York University. He has been involved in the implementation of Ada since 1981. With Robert Dewar and other collaborators, he created the first validated implementation of Ada83, the first prototype compiler for Ada9X, and the first full implementation of Ada2005.
    Maybe Ada is helpful for learning concurrent programming and safe typing, but I'll wait for the opinion of a slightly less partial party.
    • Re: (Score:3, Insightful)

      by superash ( 1045796 )
      TFA says:

      A Real Programmer Can Write in Any Language (C, Java, Lisp, Ada)

      Why C matters...
      Why C++ matters...
      Why Lisp matters...
      Why Java matters...
      Why Ada matters...

      So, I don't think the article is biased.
    • Re:About the Authors (Score:5, Informative)

      by Col Bat Guano ( 633857 ) on Tuesday January 08, 2008 @05:08AM (#21951294)
      I'm a slightly less partial party. I've taught Ada to first year students, and our school replaced Ada with Java. Some observations are...

            Students found Ada a relatively simple language to start with (if you choose an appropriate subset)
            Java can have more overhead for a beginning student
            Lecturers are often tempted to push a lot of "stuff" in intro subjects
            Java GUI motivates some students to get more involved
            Many of my students regretted that Ada would no longer be taught in first year (having quite enjoyed it)

      No matter what you start with, teaching students to be better programmers takes more than just a language. Each language allows you to teach a specific set of skills, and Ada is not bad for teaching some important SE skills (IMHO).

      I think pointers are overrated as a first year concept, and can wait for later years.
    • Maybe Ada is helpful for learning concurrent programming and safe typing, but I'll wait for the opinion of a slightly less partial party.

      You might be right, but just because they're involved in Ada doesn't necessarily make them biased towards it -- it does mean that they probably know a lot about it. What actually matters are their teaching qualifications and their understanding of what's important.

      They might just as easily have come to be involved in Ada because it met all their requirements as a good

  • by delt0r ( 999393 ) on Tuesday January 08, 2008 @04:34AM (#21951084)
    There are going to be plenty of flames on this topic.

    As someone who programs mainly in Java, I have to say they have a point. Surely a degree in CS should make someone familiar with all forms of higher-order programming (both OO and functional). They should also have a reasonably solid understanding of basic hardware architecture and how it affects programs.

    Unfortunately this does not seem to be the case, at least in NZ. Some don't even know about basic complexity ideas, and often have little to zero mathematics under their belt.

    I did not do CS but physics. I was required to do assembly, BASIC, C, MATLAB, R, Lisp, Java, C++, Haskell and a bunch of others I don't care to mention (like PLCs and FPGA stuff).
  • Biased? (Score:5, Informative)

    by silverhalide ( 584408 ) on Tuesday January 08, 2008 @04:35AM (#21951092)
    This might be obvious, but take a close look at the authors of the article:

    Dr. Robert B.K. Dewar, AdaCore Inc. (President)
    Dr. Edmond Schonberg, AdaCore Inc. (Vice President).

    The article by some weird coincidence slams Java and praises Ada.

    Salt, please...

    PS, Ada is mainly alive in the Military/Aerospace industries where projects can last 20+ years.
  • Different goals (Score:4, Insightful)

    by Secret Rabbit ( 914973 ) on Tuesday January 08, 2008 @04:41AM (#21951124) Journal
    In a College for a 2-yr Programming Diploma, this would be fine because the goal of such a program is just writing some business application or some such. Nothing that requires any real competence.

    On the other hand, Universities have a much different end goal. They want to teach such that completing their program means that the student can go on to a Masters program, etc. Obviously, Java won't get students there without a massive amount of pain if they go on to further study.

    Well, at least that's how it was, and how it should be. Currently, Universities are edging toward the College level. What this has produced is a massive gap in knowledge/skill between where the student is expected to be and where they actually are upon entering a Graduate program.

    Unfortunately, this isn't just in CS. More and more I see Mathematics and Physics programs degrading as well. From what I've seen, this is due to Administration applying... pressure for high grades, etc. No grades, no funding. The cycle continues.

    Though, I must point out that there are some Departments that are making an attempt at fighting back. Small in number though they may be, there is still hope for a return to actual academics. Though, we'll see how that plays out. You never know - I wouldn't put it past a spiteful Administration to turn a Department into offering just service courses.
  • by freedom_india ( 780002 ) on Tuesday January 08, 2008 @04:46AM (#21951150) Homepage Journal
    For the record, the same professors said the same thing when C++ was in vogue. They also said the same thing when C was popular.
    To take apart their argument by logic:
    1. Java is an abstraction, a VM over the hardware. People who study comp sci today are of three kinds:
    a) Those who want to be a programmer and then a project manager.
    b) Those who want to maintain hardware and/or design new hardware. iPods, maybe.
    c) Those who want to manage systems (20% hardware : 80% software).

    For the first kind, Java is a better choice because it does not force you to dig down to machine code, which is unnecessary today - much like car driving in the 1920s versus the 1990s. It teaches you the best of programming by forcing you to think in terms of objects and how to act upon them in the real world. If you mess up, you don't overwrite the root disk, thus causing innumerable references to the time-worn joke about shooting yourself in the foot.
    It also teaches you that GUI writing is tougher, by way of its MVC programming. At this point, programmers can be split into real men or MSFT weenies: real men go to Java in the server world. Weenies love GUIs and goto VB.

    It also teaches you the worst in programming: Forcing you to think only in OO way.

    The second kind is better off learning C or even assembly.

    The third kind is tricky: there are lots of management tools nowadays, some of them written in Java. If they want to write their plugins easily, then Java is the way to go.

    2) Java is one more step in evolution which normally the professors hate because it moves them away from the machine. But mankind has more important things to do (watching LOST and Sopranos) than twiddling with RPG.

    3) Blaming Java alone for problems is like blaming the Sea for causing Katrinas.

    Lastly, if anyone should be blamed for warping the minds of youngsters permanently, it should be MSFT with its Visual Basic system.

    • by ardor ( 673957 ) on Tuesday January 08, 2008 @07:49AM (#21952128)
      Java is one more step in evolution which normally the professors hate because it moves them away from the machine. But mankind has more important things to do (watching LOST and Sopranos) than twiddling with RPG.

      Wrong. Java is a step *back*. Not because of the abstraction, mind you, but simply because Java is such a severely limited language. 30-40 years ago amazing new things were developed in CS, none of these are included in Java. Lambda? Map-Reduce? Purely functional code? Caml-style pattern matching? Conditions? People say Java made C++ obsolete, which is completely wrong, since C++ is far more powerful than Java (thanks to generic programming and metaprogramming). Just have a look at Boost, or C++0x, and try to replicate the features in Java - WITHOUT runtime overhead (this is one key feature of generic/meta programming - a lot of computation can be done at *compile*-time).

      Actually, the next step in evolution should be Lisp, since it is one of the most powerful and flexible languages in existence. There is nothing in the language that intrinsically prohibits Lisp from performing as well as C++; it's mostly the tools that lack development (for comparison, g++ has had a hell of a lot of improvements; the 2.x series was awful). If you avoid certain things like eval, Lisp code can be optimized well enough. There is a Scheme compiler called Stalin which does just that, and it can even outperform comparable C code.

      So, if you want to abstract away from the machine, why use Java instead of Lisp?
      IMO there are other reasons for Java:
      1) It's a braindead-easy language.
      2) You only need mediocre programmers, since Java is easy enough for them, and there are many more mediocre programmers than good ones.
      3) As a consequence, programmers are expendable, jobs can be outsourced, etc. One C++/Lisp/Haskell expert could replace an entire Java team. Not good for the company, since this guy becomes indispensable and can demand more salary.
      4) Universities can claim to have more success in teaching, since the number of guys with a CS degree is higher.
      5) Companies need fewer teaching courses, because of (1).
      6) Java has been overhyped for a while now, and many CTOs are so clueless they just buy into it.

      Well, you can disagree with my opinion about Java, but it is a fact that CS students *should* learn about the functional paradigm, what lambdas are, closures, and so on, as well as what pointers are, how memory works, what the garbage collector actually does, etc. However, this is not the case - and THIS is a serious problem.
  • Variety (Score:5, Interesting)

    by andy753421 ( 850820 ) on Tuesday January 08, 2008 @04:48AM (#21951162) Homepage
    Where I go to school [rose-hulman.edu], just this year we switched from teaching the introductory classes in Java to a combination of Python, then C, then Java. I think this is much better than using any one of those languages the whole time. It gives students experience with a wider range of concepts, and from that I think they can begin to see how everything works together. Also, starting with something simpler than Java/Eclipse seems to make the first few weeks of the course a lot easier.

    One thing I have noticed, though, is a complete lack of security-related training. Something about calling eval() on every input just to parse integers makes me cringe. I guess the idea is that worrying too much about standard practices keeps people from thinking creatively or something. Unfortunately, it also seems like a good way to get into a lot of bad habits.
  • Why not D? (Score:4, Interesting)

    by Twinbee ( 767046 ) on Tuesday January 08, 2008 @04:52AM (#21951186)
    I would suggest teaching a programming language such as D [digitalmars.com], at least in addition. Although the transition from C/C++ to D can be painful, D shares many similarities with C/C++, such as speed, except it's much tidier and has many of the advantages of Java syntax whilst maintaining the power of C/C++ where necessary (optional pointers, optional manual garbage collection, etc.).

    Sooner or later, languages are going to evolve, and surely it's only a matter of time before something D-like is going to be used anyway. Might as well make the switch sooner rather than later.
  • by wrook ( 134116 ) on Tuesday January 08, 2008 @05:00AM (#21951228) Homepage
    OK, this hit one of my hot buttons. Before I continue, though, let me preface my statement by saying that I don't disagree with the article (which is right on the button). But I disagree with the way the summary characterizes the situation.

    I totally agree that universities shouldn't be teaching Java exclusively. They need to teach the basics of modular, functional, declarative and oo languages. Why? Certainly *not* to fill "software engineering" positions!!! A university's role is to do research, not to act as some technical college. OK, I can see having a programming course aimed at creating programmers for industry if it's going to pay the bills at the uni. But *don't* make that your "Computer Science" course!!

    Computer Science should be science (well, math anyway). Universities should be getting the 5 or 10 graduates they need that will move on to academia (or industry research) later in their careers. Because right now, *nobody* is getting taught Computer Science! Lately I've been reading papers posted on http://lambda-the-ultimate.org/ [lambda-the-ultimate.org]. Regularly I have to go back to the basics and learn extremely fundamental theory because nobody *ever* taught it to me in the first place. Half the time I think, "OMG, I never even knew this existed -- and it was done in 1969!!????"

    More and more lately, I've been wanting to phone my University up and ask for my tuition back.

    If you want to learn how to program in a professional setting, there's nothing better to do than just start writing code. Get your chops up. Then find some big free software projects and start fixing bugs. Learn how to use the tools (configuration management, etc). Learn how to interact with the other programmers. That's all you really need (well, that and a quick automata and grammar course so that I don't have to look at yet another context free grammar being "parsed" by regular expressions).

    But right now, where do you go if you want to actually learn theory? I guess the library... And getting back to the point, this is essentially what the paper is suggesting. Students need to learn all these things because they are relevant to the field. A university supports industry by doing basic research. If you don't understand the concepts that they point out, you just can't do that. Paraphrasing from the article, having a university course that's meant to pad out a student's resume is shoddy indeed.

  • Java for Dummies (Score:5, Interesting)

    by DCFC ( 933633 ) on Tuesday January 08, 2008 @05:03AM (#21951248)
    I am a headhunter for high end roles at investment banks, and we are close to classifying CompSci as a "non degree", along with media studies, languages, etc.

    Java is fine for teaching design patterns, and classical algorithms like Quicksort, or binary search.
    But you can't do operating systems, and the success of Java in isolating you from any notion of the hardware is actually the problem.
    We have already blacklisted courses like the one at Kings College, because they teach operating systems in Java.
    Yes, really.
    Their reason apparently is that it is "easier".
    I have zero interest in kids who have studied "easy" subjects.

    The world is a bigger, more competitive place; how many jobs do you think there are for people who had an easy time at college?

    Java is part of the dumbing down of CS.
    A computer "expert" is not someone who knows template metaprogramming in C++, or compiler archaeology in BCPL, or the vagaries of the Windows scheduler.
    It is someone who understands computers at multiple levels, allowing them to choose which one illuminates the problem at hand.
    To be wise in computers you choose whether to think of something as a block of bytes, quadwords, a bitmap, a picture, or a buffer overflow pretending to be porn. If you also have the option of understanding flash vs. static RAM, virtual memory, or networked storage, all the better. I doubt if even 1% of CS grads could write code to turn a BMP into a JPG, or even explain the ideas behind it. In my experience, 50% could not work out how to develop a data structure for a bitmap that used palettes.
    I have interviewed CS grads with apparently good grades who could not explain any data structure beyond arrays.
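
    (For what it's worth, that palettized-bitmap exercise amounts to something like this minimal C sketch - field names illustrative; a real 8-bit BMP adds headers, padding, and row alignment:)

        #include <stdint.h>

        /* One palette entry: an actual color. */
        struct rgb {
            uint8_t r, g, b;
        };

        /* A palettized bitmap: pixels store indices into the palette
           instead of full colors, trading a lookup for a ~3x size win. */
        struct indexed_bitmap {
            uint32_t   width, height;
            uint16_t   palette_size;   /* at most 256 for 8-bit indices */
            struct rgb palette[256];   /* index -> color */
            uint8_t   *pixels;         /* width * height palette indices */
        };

        /* Resolve pixel (x, y) to its color via the palette. */
        static inline struct rgb pixel_color(const struct indexed_bitmap *bm,
                                             uint32_t x, uint32_t y) {
            return bm->palette[bm->pixels[y * bm->width + x]];
        }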

    Any CS grad who sends us their CV with bullshit like "computers and society" or "web design" has their CV consigned to trash with no further reading.
    A CS grad should be able to write a web server, not be an arts graduate who didn't get laid.

    C++ makes you think at multiple levels; unlike Java, you simply cannot avoid thinking about your system from patterns down to bytes. This may be good or bad for productivity, and I'm sure we risk a flame war here.
    But I am entirely convinced you need to hack your way through a "real" system.

    How can someone understand the Linux kernel without C & C++?
    Is someone really fit to be called a computer scientist if, like >50% of the Computer "Scientists" we interview for very highly paid jobs, they show actual fear of working at that level?
    They have the same "way above my head" attitude that a mediocre biologist might have to applying quantum theory to skin disease.

    Partly, as in the Kings College debacle, it is lazy, mediocre lecturers, but also CompSci grads frankly are not that smart, so they need their hands held.
    Although the seats get filled, the quality is in monotonic decline.
    • by Endymion ( 12816 ) <slashdot.org@thoughtnoise. n e t> on Tuesday January 08, 2008 @05:34AM (#21951438) Homepage Journal
      Java is part of the dumbing down of CS.

      Java is the new COBOL. And we will regret it in 20 years for much the same reasons.

      It actually gives me hope that you have recognized this in hiring practices. That a CV with a list of Sun's Java buzzwords is not an indication of a useful programmer.

      I was disturbed in college (1997-2001) that things were changing towards Java and other idiocy. Too many people didn't get pointers and other basic concepts, and Java was hiding them even more. I believe it was the one class we had in assembly programming that really pointed it out - when confronted with having to deal with real hardware, most of the students didn't know what to do. Concepts like "two's complement" vs "one's complement" caused a strange brain-lock for them, as they were so sheltered from the actual binary math and hardware of the computer.

      It was only a handful of us who had been programming for years already (yay for the Atari 800XL) that had any idea of what was going on. The college (UC Davis) skipped entirely over very basic concepts like von Neumann architecture. I ended up having to spend most of my time trying to help my fellow students; there were so many fundamentals missing.

      I think the most frightening part was having to yell at one of the professors one day, because the basic data structures he was teaching were being done incorrectly. He was teaching people to leak memory. ("Let's allocate a huge linked list, and then just set the head pointer to NULL and consider it freed!")

      Sigh. It was frightening then, and apparently all my fears were justified, as now the entire discipline is getting a bad reputation. Unfortunately, I can't exactly disagree with that reputation from some of the CVs I've seen recently. My degree is destined to be fscked, apparently.

      You hiring? ^_^
    • Re: (Score:3, Insightful)

      by philipgar ( 595691 )

      I doubt if even 1% of CS grads could write code to turn this BMP into a JPG, or even explain the ideas behind this.

      Just to harp on one statement in your comment: this is either an absurdly basic demand to ask of programmers, or an absurdly complex one. On one side, this is as simple as making a function call bmp2jpg(...) or whatever it is with the libraries they are using. The process of doing this is simplistic, and taught to the Java mentality of "find the right tool in the shed to do your entire pro

    • by rjh ( 40933 ) <rjh@sixdemonbag.org> on Tuesday January 08, 2008 @02:50PM (#21957590)
      I hold a Master's degree in Computer Science with a focus in information security. I have also worked in the private sector in a variety of IT jobs, so don't think I'm some propellerheaded academic. I have also taught programming courses at the university level. I call shenanigans on your entire argument.

      I am a headhunter for high end roles at investment banks, and we are close to classifying CompSci as a "non degree", along with media studies, languages, etc

      So, what, you're going to hire math geeks only? People with degrees in mathematics or operations research, or perhaps some of the hard sciences? In my own experience, while there are some non-CS degrees that are excellent preparation for a CS career, only a CS degree is a CS degree. It is lamentable that some schools have embraced the trade-school mentality, but many more have not. When I was teaching courses as a graduate student (just a couple of years ago), the curriculum began with Java and quickly shifted to Haskell. A neighboring institution still uses Ada as an undergraduate language. There's also a legion of Knights of the Lambda Calculus who are trying to get Scheme reintroduced to the undergraduate curricula in several institutions in the area. Intellectual diversity about languages is alive and well in the academy, based on the institutions I've seen up close and personal.

      Also, who is this "we"? You and someone else who shares your prejudices? Or is this you and the senior engineering staff? If you're about to decree CS as a non degree, maybe you should get the input of the people who will be most brutally affected by your shortsightedness.

      Java is fine for teaching design patterns, and classical algorithms like Quicksort, or binary search.
      But you can't do operating systems

      So glad to know that you think design patterns and classic algorithms are worth studying.

      Look, pick up a copy of Cormen, Leiserson, Rivest, and Stein's Introduction to Algorithms sometime. It's the definitive work on the subject--if you need an algorithm, it's probably sitting in CLRS somewhere, along with some beautiful mathematical exposition. Every algorithm in the book can be trivially converted into Java. So why the hate for teaching CS with Java? It's a perfectly sensible language for many very important parts of CS.
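      To make that concrete, here's a CLRS-style binary search transcribed into Java - just a sketch, but notice how little the language gets in the way:

      public class Search {
          // Iterative binary search over a sorted array, straight out of the book.
          static int binarySearch(int[] a, int key) {
              int lo = 0, hi = a.length - 1;
              while (lo <= hi) {
                  int mid = lo + (hi - lo) / 2;   // avoids overflow of (lo + hi)
                  if      (a[mid] < key) lo = mid + 1;
                  else if (a[mid] > key) hi = mid - 1;
                  else return mid;                // found
              }
              return -1;                          // not present
          }

          public static void main(String[] args) {
              int[] sorted = { 2, 3, 5, 7, 11, 13 };
              System.out.println(binarySearch(sorted, 7)); // prints 3
              System.out.println(binarySearch(sorted, 4)); // prints -1
          }
      }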

      Further, I've taught operating system design in Python. Yes, Python. When talking about how a translation lookaside buffer works, I don't write C code on the board. I write pseudocode in Python and say "so, this is how it looks from twenty thousand feet." On those rare occasions when we have to get down and dirty with the bare metal, then it's time to break out C--and we leave C behind as soon as possible. I want students to be focused on the ideas of translation lookaside buffers, not the arcane minutiae of implementations.

      After all, implementing it is their homework - and that involves hacking up the Minix code. In C.
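      The twenty-thousand-foot version looks something like the sketch below - rendered in Java rather than Python purely to match the rest of the thread, with every name invented for illustration:

      import java.util.HashMap;
      import java.util.Map;

      // A TLB is just a small, fast cache sitting in front of the page table.
      public class TlbSketch {
          private final Map<Integer, Integer> tlb =
              new HashMap<Integer, Integer>();       // virtual page -> frame
          private final Map<Integer, Integer> pageTable =
              new HashMap<Integer, Integer>();       // the "slow" full mapping

          TlbSketch() { pageTable.put(7, 42); }      // pretend page 7 lives in frame 42

          Integer translate(int virtualPage) {
              Integer frame = tlb.get(virtualPage);
              if (frame != null) return frame;       // TLB hit: done
              frame = pageTable.get(virtualPage);    // TLB miss: walk the table
              if (frame == null)
                  throw new IllegalStateException("page fault");
              tlb.put(virtualPage, frame);           // cache it (a real TLB also evicts)
              return frame;
          }

          public static void main(String[] args) {
              TlbSketch mmu = new TlbSketch();
              System.out.println(mmu.translate(7)); // miss: walks the table, then caches
              System.out.println(mmu.translate(7)); // hit: straight from the TLB
          }
      }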

      Their reason apparently is that it is "easier".
      I have zero interest in kids who have studied "easy" subjects.

      If it was an easy subject, would changes need to be made to make it easier?

      If it was a spectacularly hard subject with a 50% washout rate, would changes need to be made to make it easier?

      I've been in courses where 50% of the class washed. They were horrible, horrible classes. The pedagogy needed to change. The learning curve needed to be smoothed out and made gentler. This is in no way equivalent to saying it was made easy. The fact you think otherwise brands you as an intellectual elitist who can't be bothered to think logically about his own prejudices.

      A computer "expert" is not someone who knows template metaprogramming in C++, or compiler archaeology in BCPL, or the vagaries of the Windows scheduler.
      It is someone who understands computers at multiple leve

  • "Sure I know C!" (Score:5, Insightful)

    by Durandal64 ( 658649 ) on Tuesday January 08, 2008 @05:07AM (#21951278)
    I'm kind of a proponent of having a student's evolution mirror the industry's, to an extent. Start them with C and then gradually introduce problems that are more and more difficult to solve in C. That way, when you show them C++ or Java, they can appreciate why those languages were needed, what classes of problems they're appropriate for, and, more importantly, what they're not appropriate for. But to really appreciate these things, students have to implement their own little OO runtime in C or some other procedural language. You can bet that by the time you show them a true OO language after that, they'll know plenty about OOP, and things will just come more naturally.

    These students are being trained as engineers. They shouldn't be afraid of a little grease.
  • 2 professors, 1 cup. (Score:3, Interesting)

    by forgotten_my_nick ( 802929 ) on Tuesday January 08, 2008 @05:08AM (#21951288)
    I disagree with them. The flaw isn't with Java but with how it is most likely being taught to the students. They also undersell Java as a language. Dot-com? Seriously? And claiming Java is only good for threading and reflection? Are they serious?

    I've taught grinds (private tutoring) in Java to first-year students in Ireland (I'm SCJP 1.4/5), and their professors do not even allow the use of an IDE when coding. They also grade the students on Java patterns and OO rather than on knowledge of the language.

    C/C++ have their place, but any good CS student normally learns a number of languages.

    I can code in a number of languages, and am certified in quite a few as well, but I've never used Ada. Considering both professors work for a company that sells Ada tools, the article seems a little biased - and uninformed about Java.
  • That's true (Score:4, Insightful)

    by cgomezr ( 1074699 ) on Tuesday January 08, 2008 @05:08AM (#21951290)

    I love Java, and I find it much more pleasant to use than C/C++, but I generally agree with TFA. I have seen many people doing things like this

    //build a string with 10K a's
    String as = "";
    for ( int i = 0 ; i < 10000 ; i++ ) as += "a";

    which creates 10K temporary objects to construct a single string*. This happens because they started learning programming at a high abstraction level, so they have no idea what is going on behind the scenes. It's similar to starting out with one of these new "intelligent" IDEs such as Eclipse, which do so many things for you that you never have to figure them out for yourself. All these abstractions are great for people who already know how to program, not for beginners. You wouldn't give a calculator to six-year-old kids learning math, would you?

    I personally would begin with C and then jump to Java. C is not so complicated as a first language if you don't introduce all its features from day one. It was my first language and I think it was a good choice: it shows you enough low-level concepts to let you write efficient, optimised code in higher-level languages later. Besides, when you eventually jump to a more high-level OO language, you appreciate the difference and learn it with interest.

    * I know javac rewrites each concatenation using a StringBuffer or StringBuilder under the hood, but it allocates a fresh one per iteration, so the object churn remains. I chose that (somewhat unrealistic) example for the sake of simplicity; the same thing happens in plenty of cases the compiler can't help with at all.
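    For contrast, here's what the same loop looks like when the programmer knows what is happening underneath - one mutable buffer, one final String:

    // Build the same 10K-character string with a single mutable buffer.
    StringBuilder sb = new StringBuilder(10000); // pre-sized: no re-allocation
    for (int i = 0; i < 10000; i++) sb.append('a');
    String as = sb.toString();                   // exactly one String at the end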

  • by Bryan Ischo ( 893 ) on Tuesday January 08, 2008 @05:28AM (#21951412) Homepage
    The better CS undergrad programs don't really teach languages per se. The main focus of the curriculum should be the theoretical underpinnings of computer science, combined with the practical aspects of software development. Since languages are part of the practical side of software development, and are also the focus of some computer science theory, it is unavoidable that they be studied to some degree - and used to a large degree to practice the theory being taught. Most theoretical CS really needs only pseudocode to illustrate the concepts being discussed, but since students are often asked to write programs to demonstrate their understanding of the subject matter, a real language is unavoidable. The language itself, though, is secondary to the real meat of the subject, which should be mathematical and theoretical in nature.

    At CMU the very first CS class (the one taken by losers like me who didn't AP out of it - mostly because my high school didn't even have computer classes, let alone AP computer classes!) really did focus on teaching a language - Pascal - and a significant part of the class was learning the language. It was the least useful CS class I took, in the long run (not surprising, as an introductory course in any subject is likely to be the same). Subsequent courses would spend 1-2 weeks going over the fundamentals of the language used for the remainder of the coursework (C in some classes, C++ in others, ML or Scheme elsewhere) to get everyone started; after that, you had to figure it out on your own while actually learning the theory being taught. It really isn't that hard to pick up a new language once you know a couple, although I did have a hard time with ML, mostly because I was completely unmotivated to learn it, feeling that it was absolutely useless to know (I was right).

    No really good CS program has any classes with names like "Java 101" or "Advanced C++". To use a carpentry analogy, I would expect a really good carpentry school to teach the fundamental rules and "theory" of carpentry, so that the student upon graduation really understood what carpentry was all about and could apply their knowledge to aspects of the subject that they hadn't even encountered in school. I wouldn't expect a good carpentry school to have classes like "Advanced Hammering" and "Bandsaw 101". The courses would instead be "Introduction to House Frames" and "How to Construct Joints". You'd be expected to learn the use of the tools in the natural course of studying these subjects.

    It's the same for CS. Good programs don't teach the tools, they teach the *subject*; learning the tools is intrinsic in the study of the theory.

  • by IBitOBear ( 410965 ) on Tuesday January 08, 2008 @05:42AM (#21951486) Homepage Journal
    If you make everyone special, nobody is really special.

    Every time someone tells me that there are no pointers in Java, I laugh a little. EVERYTHING in Java that isn't a scalar is actually referenced through a pointer. That is, you declare the pointer variable and then "new" the object into place.

    They are just incredibly _boring_ pointers. You cannot do math on them. There is no sense of location to those pointers. But the absence of interesting pointer operations, and the absence of a _semantic_ _copy_ operation, is all this alleged pointerlessness really amounts to.
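    A tiny sketch of the point (names invented) - this is exactly the aliasing behavior of C pointers, minus the arithmetic:

    class Box { int value; }

    public class RefDemo {
        public static void main(String[] args) {
            Box a = new Box();   // 'a' holds a reference - a pointer by any other name
            Box b = a;           // copies the reference, NOT the object
            b.value = 42;
            System.out.println(a.value); // prints 42: both names alias one object
        }
    }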

    I have only two _real_ problems with Java... (okay, three if you count being forced to handle checked exceptions even when you know they cannot really happen - and when, if they did, you would want the thing to abort anyway... but I digress)

    (1) Java has no useful destructors, because no object has predictable scope. If you think finalize methods are the same as destructors, don't bother responding; you don't know what destructors are...

    (2) Since everything is a pointer in Java, you have to bend over backwards to pass an object by value - the language doesn't even begin to provide copy-construction semantics. What a miserable PITA.
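    A sketch of the contortion (names invented): if you want a callee to get its own copy, you write the "copy constructor" yourself and remember to call it at every call site:

    class Point {
        int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
        Point(Point other) { this(other.x, other.y); } // hand-rolled "copy constructor"
    }

    public class CopyDemo {
        static void mutate(Point p) { p.x = 99; } // happily scribbles on the caller's object

        public static void main(String[] args) {
            Point original = new Point(1, 2);
            mutate(new Point(original));        // pass an explicit copy to stay safe
            System.out.println(original.x);     // prints 1; without the copy, 99
        }
    }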

    Now the _dumbest_ thing about Java is that they were so set against multiple inheritance that they never bothered to ask themselves why _every_ OO language starts out life without multiple inheritance only to have to add it later. By making everything a proper linear subclass of Object, they left themselves having to graft on "interfaces", which is just multiple inheritance with the "bonus" of completely preventing default implementations. (Which led to delegation, etc.)
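    A small sketch of the resulting boilerplate (all names invented): the shared behavior has to live in a helper class, and every implementer hand-wires the calls across:

    interface Logger {
        void log(String msg); // no default body allowed, so...
    }

    // ...the shared implementation lives in a helper...
    class ConsoleLogging implements Logger {
        public void log(String msg) { System.out.println(msg); }
    }

    // ...and every class that wants the behavior must delegate by hand.
    class Service implements Logger {
        private final Logger delegate = new ConsoleLogging();
        public void log(String msg) { delegate.log(msg); } // pure boilerplate
        // actual Service work goes here; new Service().log("hi") prints "hi"
    }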

    The way the language keeps sprouting things it once claimed it would never have and never need - well, it's like watching a clown car endlessly explode with ridiculous archetypes. After a while it just isn't funny any more.

    So yea, teaching people Java as an introductory language is something of a disservice if you ever want them to truly think about programming and about what makes some things machine-smart and others machine-stupid.

    --- BUT ---

    I worked in education for years. The fundamental problem with computer science education is that it is being taught by computer scientists instead of educators. We are stuck learning from the people who learned from the people who made it up. None of these people ever learned to EFFECTIVELY IMPART INFORMATION.

    Consequently, the students are largely unemployable on the day of graduation.

    The classic computer science curriculum seems to consist of throwing three or four languages at a kid in the hope that they will "just kind of figure out this programming stuff."

    The field of computer science has not yet come up with a "basic theory"... a starting place... The list of things a student simply must know before you start filling their head with syntax.

    And so we are a bunch of prelates training our acolytes in our special, individualized deeper mysteries.

    And that's what everybody is doing worldwide, so our graduates are just as lame as everyone else's...

    Cue "Enter the Gladiators"...
  • Beginner language? (Score:5, Interesting)

    by Jartan ( 219704 ) on Tuesday January 08, 2008 @05:44AM (#21951500)
    People learn a lot through failure and pain. C is clearly the perfect choice when you look at it this way.
  • by MCTFB ( 863774 ) on Tuesday January 08, 2008 @07:10AM (#21951946)
    99% of what you learn as a programmer, you don't learn at college anyway (at least among the people who don't totally suck at programming). Furthermore, unless you have one-on-one mentoring from a senior programmer or professor with at least 10 years of solid professional coding experience under their belt, not much besides your own effort is going to help you mature as a programmer.

    Most CompSci college graduates are totally unproductive in their first job. They can be put to work on trivial things, but no matter what school they came from, they are going to need a lot of hand-holding to make it through the first year. That is just how it is. Doing coursework at school is no substitute for coding on a meaningful project, whether it's work-related, open source, or just for fun. That is the honest-to-god truth from a software developer of over 12 years, and I don't even consider myself that seasoned in the field (maybe after 20 years I will feel differently).

    Now, with respect to Java as an introductory programming language: it is not bad, but not great either. The purpose of any introductory course should be to capture the interest of the people curious enough to take it in the first place. Back in college we started with C (most of my peers had been programming since they were teething, but this was CMU), and if not for my persistent no-quit attitude, I probably would have given up programming right then and there. Spending an entire night debugging a trivial program - not because you didn't understand the material, but because of one stupid uninitialized pointer - turns off a lot of people who may have had the potential to be great programmers; their first impression of programming was so bad that they gave it up before they got to learn how great programming really is.

    Oh yeah, and the utterly irrelevant math courses didn't really help much either. Whenever your career does call for advanced calculus or discrete math, you will likely have forgotten 99% of it and need to look it up in a book anyway. Besides, 99% of real-world programming projects involve high-school algebra and not much else. What separates productive programmers from unproductive ones is not who got the better grade in a college math course, but who innately understands systems and is willing to make the extra effort to learn the gazillion design patterns available, so that when faced with a difficult project they don't waste inordinate amounts of time reinventing the wheel.

    As for understanding computing at a low level, as in an operating systems class, Java might not be such a great choice - but then, learning C is easy, because C is made up of very simple constructs (C++ is another story). Using C productively just requires a crapload of practice and experience, not necessarily a whole lot of computing expertise. In addition, mastery of whatever APIs you happen to base your career on is paramount. In the real world, employers don't want to hear "but I can learn anything quickly", because mastering some APIs can take 6 months or more; if you come out of university with no specific skill set, it is going to be really hard to get that first job, because unless you can be productive soon (or even on day one), you are useless as far as employers are concerned. Also, though I don't program Win32 professionally myself, my understanding is that it takes at least 3 years of non-stop work with those APIs just to become semi-proficient. Professionally, most of my work over the years has been in Java, and Java is probably scary to a lot of neophyte programmers these days because, since 1.5, it has unfortunately turned into a bastard child of complexity like its twisted sister C++.

    Last but not
  • Lisp (Score:4, Insightful)

    by wikinerd ( 809585 ) on Tuesday January 08, 2008 @07:34AM (#21952054) Journal
    Professors who care about students' education teach Lisp and Scheme. I had a professor who taught us Scheme in his free, unpaid time after the main lecture. The university did not include it in the curriculum, but he explained why Scheme matters, and those of us who understood its importance chose to stay and listen to his unpaid, unofficial lectures. The reason these languages are important lies in the mathematical thought behind their structure. Every language has a way of thinking behind it: some languages are procedural, others functional. It is these paradigms that matter in a curriculum, because most mediocre programmers who learn to program in one paradigm stay with it for a lifetime and never learn another. So a university should ideally offer courses on all the available paradigms, so that future programmers can choose the most productive one for each specific project.
  • Why We Teach Java (Score:5, Insightful)

    by fartrader ( 323244 ) on Tuesday January 08, 2008 @08:09AM (#21952250)
    I'm a CS professor, and here are some of my thoughts on the article:

    (1) Java is what the market wants. Yes, we can teach any other language under the sun, but the reality is that the software industry values individuals who are Java-literate. By this I mean someone who has a basic understanding of the OO principles the language is founded upon, can write Java code using common tools, and has at least some insight into the more common Java APIs. Any institution that doesn't take this into account when designing its curriculum is doing a serious disservice to its student body. While some do go to university for the sheer joy of learning a subject, most are ultimately there to get a job.

    (2) In my opinion there is something seriously wrong with a Java course that emphasizes Swing or web development rather than the fundamentals. Yes, it's important to get things in and out of a program, but at least initially these should be incidental to the main event: learning the language and applying it effectively, and thinking in an object-oriented way - which, as many of you know, is not necessarily an intuitive way to look at the world, especially if you already have a procedural background. GUI and web application development should be separate, advanced courses.

    (3) I sometimes lament the lack of insight into pointers, but any professor worth their salt will spend some time discussing Java's object reference model and relating it to pointer-based languages. Regardless of how abstract your language is, "opening up the hood" to demonstrate how things work, and why they have been designed the way they are, is almost always worthwhile.

    (4) I laughed when I read the part of the article about Praxis, especially regarding formal methods. Are they serious? Yes, I was taught formal methods in school, and I could understand *why* I'd want to use them... if I had all the time in the world, a huge budget to burn, and no customers screaming for something the business needed yesterday. Praxis offers software development based on formal methods and consequently occupies an important (and probably expensive) specialized niche of the software ecosystem. To suggest that this approach should be the norm, and to lament its absence, really betrays that the authors have spent too much time in academia and not enough in the real world.

    (5) Ada is a great language - in fact, I learned Ada 83 as a first language along with C. It just isn't relevant to most software development companies or IT departments, if indeed it ever was. I worked on a research project that was part of the Ada 9X real-time initiative; the main users were aerospace and military vendors, particularly for embedded systems. There you do need to know about concurrency and distribution, along with hard performance deadlines and often a slew of safety and mission-critical issues, to do a good job. However, I fail to see the general relevance of Ada to a commercial market that is primarily interested in "simple" information systems - getting information out of a database and/or putting it in, with some processing en route. Why should I use Ada when the market in general doesn't?

    (6) We teach concurrency - it's useful stuff to know. I do think that using formalisms to describe concurrent programs goes a wee bit too far (see (4) above).

  • by TemporalBeing ( 803363 ) <bm_witness@nOSPAm.yahoo.com> on Tuesday January 08, 2008 @11:54AM (#21954738) Homepage Journal
    First, some of the best comments in this thread: Comment 1 [slashdot.org] Comment 2 [slashdot.org] Comment 3 [slashdot.org] Comment 4 [slashdot.org] Comment 5 [slashdot.org] Comment 6 [slashdot.org]

    I list them because they hold a lot of wisdom, and I wanted to draw special attention to them.

    When I was in college I got really ticked off at the level of theory - there was too much of it. It wasn't balanced well enough with implementation, and as I looked around, I noticed that was pretty commonplace among academic institutions (colleges AND universities - and I'm not talking about trade schools either). That was before they moved their curriculum to using Java for the first couple of classes; after they did, I heard stories about upper-level classes that couldn't focus on the course material because the instructors first had to teach these "new" students C/C++, which the students had a hard time picking up. (It didn't happen the other way around.)

    That said, I've started thinking about how I would put together a curriculum for teaching computer programming/science/engineering. (I'm not talking about computer _hardware_ engineering, btw.) I even did some tutoring after college. So what would I do?

    I'd start students with a language that can be used to teach the really basic skills and concepts (variables, functions, etc.) - even VBScript could work at this level, though it would only be used for a couple of weeks at most - and then quickly move them on to more advanced concepts, going from language to language to build not only depth of concepts and understanding but also breadth across languages and kinds of tasks. I'd also ensure that somewhere in the curriculum students are exposed to assembly; I've found that even a small exposure makes a big difference in a programmer's style and philosophy.

    Furthermore, I'd break the curriculum into two parts: one starting from the ground up, the other from the top down. Both would be required. The idea is that one part would focus more on the theoretical, while the other would focus on the substantial - implementation - and together they would produce a well-rounded student. Additionally, it would be designed so that students who wanted to work on operating systems could follow it from start to end, while other students could branch off into more focused courses at the layer of their choice. (Students wanting OS work would still have focused courses too, btw.) The underlying idea is that even a web-app developer needs to know the underlying systems, and even the OS developer needs to understand the abstractions of the web-app developer.

    I'd also make the overall curriculum far more software-engineering focused. Yes, people who really want to be computer "scientists" could still do that; but industry needs software engineers, not computer scientists. Real programs require engineers, and sadly, this is sorely lacking from almost all academic computing programs. (Some have changed, but not many.)

    I also think this approach would sit well with the authors of TFA and of the comments I've linked. The ideas probably need a bit more refinement, but the general approach is sound - and it's not what academia is doing today, by any stretch of the imagination.

    FWIW - While I am relatively young (college grad of 2003), my main strength is C prog
  • by jilles ( 20976 ) on Tuesday January 08, 2008 @12:05PM (#21954922) Homepage
    1. Mathematics requirements in CS programs are shrinking.

    The reason is that computer science has developed into a discipline that is no longer pure mathematics. There are only so many courses you can squeeze into four years.

          2. The development of programming skills in several languages is giving way to cookbook approaches using large libraries and special-purpose packages.

    Guess what: that's what building real software is like today. We don't need people who can write quicksort in obscure, unused languages; we need people who can grasp systems of millions of lines of code. Ada doesn't prepare you for that, because it never saw broad adoption outside a few niches and has none of the good, widely used frameworks and libraries you find in the real world. People don't use it for the whole range of software systems you actually encounter, and to prepare you for that real world there are simply much better languages around these days.

          3. The resulting set of skills is insufficient for today's software industry (in particular for safety and security purposes) and, unfortunately, matches well what the outsourcing industry can offer. We are training easily replaceable professionals.

    I agree that skills are important. A good prof can teach them using pretty much any Turing-complete language if it needs to be done. Java isn't half bad for teaching a whole lot of important CS concepts and theory - and unlike Ada, people actually use it. As for C and C++, they are of course useful languages to learn, and many colleges still teach them.

    But of course two ex-profs working for AdaCore are hardly objective. Ada is as dead as Latin: it has some nice features, but nothing you won't find somewhere else. Keeping professional skills up to date is as important for professors as it is for students. Having done a Ph.D. in software engineering and architecture, and having practiced my skills in several companies, my view is that one of the largest problems in computer science education is teachers who have never worked on real, industrial-sized software systems and who keep sending students into industry with a lot of misguided, naive ideas about how to build software. Most SE teachers out there simply have no clue what they are talking about. Software engineering is a skill learned in practice, because the teachers at university mostly lack the skills required to properly prepare students. That's the sad reality.
