Professors Slam Java As "Damaging" To Students 1267

jfmiller calls to our attention two professors emeritus of computer science at New York University who have penned an article titled Computer Science Education: Where Are the Software Engineers of Tomorrow? in which they berate their university, and others, for not teaching solid languages like C, C++, Lisp, and Ada. The submitter wonders whether any CS students or professors would care to respond. Quoting the article: "The resulting set of skills [from today's educational practices] is insufficient for today's software industry (in particular for safety and security purposes) and, unfortunately, matches well what the outsourcing industry can offer. We are training easily replaceable professionals... Java programming courses did not prepare our students for the first course in systems, much less for more advanced ones. Students found it hard to write programs that did not have a graphic interface, had no feeling for the relationship between the source program and what the hardware would actually do, and (most damaging) did not understand the semantics of pointers at all, which made the use of C in systems programming very challenging."
This discussion has been archived. No new comments can be posted.

  • by gangien ( 151940 ) on Tuesday January 08, 2008 @04:24AM (#21951026) Homepage
    "A Real Programmer Can Write in Any Language (C, Java, Lisp, Ada)"

    that's true, but again software engineering/programming is a subset of computer science (maybe; i suppose you could argue it isn't)

    "Computer science is no more about computers than astronomy is about telescopes."
    - Edsger Dijkstra
  • by Facegarden ( 967477 ) on Tuesday January 08, 2008 @04:35AM (#21951086)
    I've noticed that. I'm not a CS or a CE major; I'm an ME, and I build robots. I'm great at the mechanical side, but whenever I talk to Computer Engineers or CS majors to see what they can do for me, all I've surmised is that they just aren't taught anything useful! Sure, there's probably all kinds of great theory and whatnot, and that's all very important, but at the end they should have a class that teaches them the useful stuff.

    I say that because the guys I've spoken to were never taught how to make ANY kind of GUI, and have no idea how to send data out a serial port (something often neglected but very useful in robotics). In fact, even the head of the Computer Engineering department at my school (Santa Clara University, supposedly 14th in the country for engineering) didn't know how to talk to a serial port. And not teaching how to make a GUI? I know you can figure it out on your own, but then what is the point of school? Not all software goes to consumers, but when it does it had better have a GUI, so why not teach at least the basics? And hardware output - either from the serial port or programming USB peripherals - is incredibly useful, yet seems to be completely left out of today's undergrad programs, and that seems insane!

    What has been your experience with this stuff?

    -Taylor
  • by hedleyroos ( 817147 ) on Tuesday January 08, 2008 @04:36AM (#21951094)
    Dijkstra did say that, and if the software world consisted of only theory then we could all get 90% for our efforts and be happy with that.

    In practice systems have to work 100%, and when your graph search algorithm (by Dijkstra naturally) segfaults due to dereferencing a wrong pointer then computer science is very much about computers.

    I'm just worried that too few students these days know assembly and C, which leaves us in a predicament when the current generation of kernel devs retire.
  • In close to a decade now of workplace programming, I have yet to actually have a need for any systems level programming of any kind.

    I wonder whether the engineers supporting your production hardware would agree with you. ;-)

    Not that being conversant in systems is important in every practical application, but, speaking as a systems engineer, I do prefer to work with developers that understand it.
  • Different goals (Score:4, Insightful)

    by Secret Rabbit ( 914973 ) on Tuesday January 08, 2008 @04:41AM (#21951124) Journal
    In a College for a 2-yr Programming Diploma, this would be fine because the goal of such a program is just writing some business application or some such. Nothing that requires any real competence.

    On the other hand, Universities have a much different end goal. They want to teach such that completing their program means the student can go on to a Masters program, etc. Obviously, Java won't get students there without a massive amount of pain if they go on to further study.

    Well, at least that's how it was, and how it should be. Currently, Universities are edging toward the College level. What this has produced is a massive gap between where the student is expected to be, knowledge- and skill-wise, and where they actually are upon entering a Graduate program.

    Unfortunately, this isn't just in CS. More and more I see Mathematics and Physics programs degrading as well. From what I've seen, this is due to Administration applying... pressure for high grades, etc. No grades, no funding. The cycle continues.

    Though, I must point out that there are some Departments that are making an attempt at fighting back. Small in number though they may be, there is still hope for a return to actual academics. We'll see how that plays out; I wouldn't put it beyond a spiteful Administration to reduce a Department to offering just service courses.
  • by jandersen ( 462034 ) on Tuesday January 08, 2008 @04:45AM (#21951144)
    Don't be silly. Flying an aircraft requires a whole new set of skills that are outside the normal experience of most people. Driving is not just flying with a number of 'safety enhancements', whereas programming in Java is like programming in C, but without the need to learn about pointers or good programming discipline. So if C is like a manual car, Java is an automatic.

    It is reasonable to expect that a CS student has both the ability and the interest it takes to learn all the details of programming well in C.
  • by doktor-hladnjak ( 650513 ) on Tuesday January 08, 2008 @04:48AM (#21951160)
    It really depends what you're trying to teach when. A few years ago I taught a data structures course [decidedlyodd.com] (second out of a three course lower division sequence) in Java and thought it worked fairly well. About a year before that, I was a teaching assistant for a data structures class taught in C++ [uni-sb.de].

    Language choice affected the content of both courses quite a bit. In the Java course, students spent more time understanding how specific data structures worked and working on more interesting programming assignments. Students got to a working knowledge of Java fairly quickly after having only a semester of Scheme under their belt. In the C++ course (which followed a semester taught in ML), the students as a whole spent a lot more time learning about memory management along with the ins and outs of C++. Instruction on data structures was much more limited. Both sets of skills are valuable for practicing engineers, but I think it's fair to say that both the staff (myself and the teaching assistants) and the students enjoyed the class taught in Java more than the class taught in C++, probably because the interesting parts of such a class have to do with efficient data structures more than fighting with pointers and copy constructors.

    Following the Java course, those students normally took an introductory hardware class. That course was taught in assembly language, C, and Verilog, where learning pointers fit in better with the other material. Following the C++ course, I don't think those students saw much more C or C++ programming for a while (possibly not until upper-division topic-specific coursework) and instead went off to complete 4 more semesters of more theoretical Computer Science.
  • by synx ( 29979 ) on Tuesday January 08, 2008 @04:49AM (#21951172)
    Yes, early 90s - but have you checked out universities lately? We're talking cradle to grave Java. Intro to dev in Java, mid level courses in Java, Sr courses in Java. We're graduating people who don't know what pointers are!

    COME ON!

    And don't tell me Java doesn't have pointers - what do you think references are? Glorified pointers with auto-null checks.
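    The "auto-null check" the poster mentions can be sketched in a few lines (example mine, class and method names hypothetical, not from the post): dereferencing a null Java reference throws a NullPointerException at the point of use, rather than reading an arbitrary address the way a bad C pointer can.

```java
// Minimal sketch: a Java reference acts like a pointer that the runtime
// null-checks on every dereference.
class Node {
    int value;
    Node next; // a "pointer" to another Node, or null
}

public class NullCheckDemo {
    static int valueOrZero(Node n) {
        // Explicit precheck; without it, n.value on a null n throws
        // NullPointerException instead of corrupting memory.
        return (n == null) ? 0 : n.value;
    }

    public static void main(String[] args) {
        Node head = new Node();
        head.value = 42;
        System.out.println(valueOrZero(head)); // 42
        System.out.println(valueOrZero(null)); // 0
        try {
            Node missing = null;
            int boom = missing.value; // the runtime's "auto-null check" fires here
        } catch (NullPointerException e) {
            System.out.println("NPE caught");
        }
    }
}
```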

    One problem I've seen is Java Developer Syndrome (JDS): devs who don't know the difference between Java API names and the data structures used to implement them.

    Think of someone who when you ask them what a hashtable is, they say 'oh, that's the synchronized version of HashMap'. Tell me that is a quality developer you want to work with. Go ahead, TELL ME.
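    For contrast with the API-name-only answer the poster is mocking, here is the data structure itself: a deliberately tiny separate-chaining hash table (a sketch of my own, names invented; java.util.HashMap's real implementation is far more elaborate).

```java
import java.util.LinkedList;

// A minimal separate-chaining hash table: hash the key to pick a bucket,
// then scan that bucket's short list. This is the *data structure*; the
// class java.util.Hashtable is merely one (legacy, synchronized) API for it.
public class TinyHashTable {
    private static final int BUCKETS = 16;
    private final LinkedList<long[]>[] table; // each entry: {key, value}

    @SuppressWarnings("unchecked")
    public TinyHashTable() {
        table = new LinkedList[BUCKETS];
        for (int i = 0; i < BUCKETS; i++) table[i] = new LinkedList<>();
    }

    private int bucket(long key) {
        return (int) Math.floorMod(key, (long) BUCKETS);
    }

    public void put(long key, long value) {
        for (long[] e : table[bucket(key)]) {
            if (e[0] == key) { e[1] = value; return; } // overwrite existing key
        }
        table[bucket(key)].add(new long[] { key, value });
    }

    public Long get(long key) {
        for (long[] e : table[bucket(key)]) {
            if (e[0] == key) return e[1];
        }
        return null; // absent
    }
}
```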
  • by Jackmn ( 895532 ) on Tuesday January 08, 2008 @04:51AM (#21951178)

    and if the software world consisted of only theory then we could all get 90% for our efforts and be happy with that.
    University is not occupational training.

    In practice systems have to work 100%, and when your graph search algorithm (by Dijkstra naturally) segfaults due to dereferencing a wrong pointer then computer science is very much about computers.
    Computer science is not computer programming.
  • by doktor-hladnjak ( 650513 ) on Tuesday January 08, 2008 @04:53AM (#21951188)
    ... and I've been writing almost entirely in C/C++ without any SQL since finishing undergrad over 5 years ago (grad school followed by a desktop application development job). The bottom line is that students should be learning a broad range of skills because it's hard to say where any of them are going to end up right out of school or especially within a few years of finishing school.
  • Re:Right on! (Score:5, Insightful)

    by putaro ( 235078 ) on Tuesday January 08, 2008 @04:55AM (#21951198) Journal

    .fuck you collage idiots
    Well, the problem is that you went to a collage. That's a bunch of stuff pasted together by art majors. If you had gone to a college, or perhaps a university, you would have learned stuff beyond programming, such as data structures, compiler theory, etc. Programming, especially in any particular language, is a skill, like plumbing or electrical wiring. Knowing the theory behind it is education. I was a decent programmer when I started college. All the theory and stuff that I learned in college didn't seem that useful at the time, but as I've gone along in my career it's definitely been the difference between being a code monkey and being someone who can design systems.
  • by bigstrat2003 ( 1058574 ) on Tuesday January 08, 2008 @05:01AM (#21951230)
    True. Besides, the idea that Java is damaging to students is pure bullshit anyway. If the students are learning the Java way to do things, and nothing else, then they have horrible professors. I learned CS from good profs (well... one good and one bad), and surprise, even though I got my start in Java, I am perfectly capable of doing things in other ways.

    When I took data structures and we used C++, I didn't have mental convulsions because Java had wrecked my thinking (although I did have some because C++ is incredibly messy to read at a glance); I just learned different ways of doing things. So maybe these professors should look at whoever's teaching these kids so sloppily, not at the language.

  • by remitaylor ( 884490 ) on Tuesday January 08, 2008 @05:02AM (#21951238)

    software engineering != computer science

    [...] soft engineering/programming is a subset of computer science

    true! i would take that even a step further ... it depends on the target consumer of your applications, as well.

    mr. awesome computer science man, who can program in everything, might not be what a business wants for someone who's good at whipping up quick, user-friendly (potentially resource-hungry and not secure enough to face the public internet or for commercial distribution) applications to help streamline business processes.

    and, obviously, mr. business programmer, who's good at getting the employees what they need, probably isn't the guy you want to program some super computer ... math ... stuff. or even a decent performing application that could be sold and distributed, commercially.

    i'm sure Java's *perfect* for some people to learn. C or even Assembly are *perfect* for other people. C# / Python / Ruby might be perfect for someone else. [some other languages here].

    it depends on the person and their career / interests / environment.
  • by timmarhy ( 659436 ) on Tuesday January 08, 2008 @05:05AM (#21951260)
    unfortunately he doesn't go far enough into the core of the problem, which is that today's universities are mass-producing what employers want, rather than the thinkers of tomorrow.

    employers want nothing more than easily replaceable drones who come with an easily definable skill set, which they can swap out when a new buzzword comes along. This is NOT what universities should be pandering to.

  • by superash ( 1045796 ) on Tuesday January 08, 2008 @05:06AM (#21951274)
    TFA says:

    A Real Programmer Can Write in Any Language (C, Java, Lisp, Ada)

    Why C matters...
    Why C++ matters...
    Why Lisp matters...
    Why Java matters...
    Why Ada matters...

    So, I don't think the article is biased.
  • "Sure I know C!" (Score:5, Insightful)

    by Durandal64 ( 658649 ) on Tuesday January 08, 2008 @05:07AM (#21951278)
    I'm kind of a proponent of having a student's evolution mirror the industry's, to an extent. Start them with C, then gradually introduce problems that are more and more difficult to solve in C. That way, when you show them C++ or Java, they can appreciate why these languages were needed, what classes of problems they're appropriate for and, more importantly, what they're not appropriate for. But to really appreciate these things, you have to have students implement their own little OO runtime in C or whatever other procedural language. You can bet that after that, by the time you show them a true OO language, they'll know plenty about OOP, and things will just come more naturally.

    These students are being trained as engineers. They shouldn't be afraid of a little grease.
  • That's true (Score:4, Insightful)

    by cgomezr ( 1074699 ) on Tuesday January 08, 2008 @05:08AM (#21951290)

    I love Java, and I find it much more pleasant to use than C/C++, but I generally agree with TFA. I have seen many people doing things like this

    //build a string with 10K a's
    String as = "";
    for ( int i = 0 ; i < 10000 ; i++ ) as += "a";

    which creates 10K temporary objects to construct a single string*. This is because they started learning programming at a high abstraction level, so they have no idea what is going on behind the scenes. It's similar to starting programming with one of these new "intelligent" IDEs such as Eclipse, which do lots of things for you so you don't have to figure them out for yourself. I think all these abstractions are great for people who already know how to program, not for beginners. You wouldn't give a calculator to six-year-old kids learning math, would you?

    I personally would begin with C and then jump to Java. C is not so complicated as a first language if you don't introduce all its features from day one. It was my first language and I think it was a good choice; it teaches you enough low-level concepts to be able to write efficient, optimised code in higher-level languages. Besides, when you later jump to a more high-level OO language you appreciate the difference and learn it with interest.

    * I know modern compilers are able to optimise that automatically using a StringBuffer or StringBuilder. I just chose that (somewhat unrealistic) example for the sake of simplicity, but the same happens in other cases that aren't so easily handled by the compiler.
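    The contrast the post describes can be sketched as follows (method names mine, sizes illustrative): repeated += on a String copies the whole string each iteration, roughly O(n^2) work, while a StringBuilder appends into one growable buffer, amortized O(n).

```java
// Sketch of the += vs. StringBuilder contrast from the post above.
public class StringBuildDemo {
    static String buildSlow(int n) {
        String s = "";
        for (int i = 0; i < n; i++) s += "a"; // each += allocates a fresh String
        return s;
    }

    static String buildFast(int n) {
        StringBuilder sb = new StringBuilder(n); // one pre-sized buffer
        for (int i = 0; i < n; i++) sb.append('a');
        return sb.toString();
    }

    public static void main(String[] args) {
        // Same result, very different allocation behaviour.
        System.out.println(buildSlow(10).equals(buildFast(10))); // true
    }
}
```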

  • by Anonymous Coward on Tuesday January 08, 2008 @05:10AM (#21951300)
    I'll answer as a computer scientist.

    I view school as bootstrapping a person to learn how to learn, and as teaching them the things that are timeless. The only reason a popular programming language like Java is used in the first place is that something has to be used, so it may as well be that. However, many schools offer Scheme, ML, or Common Lisp as the programming language of choice even though their job-market demand is comparatively low. This is because they're seen to help the learning process. The goal isn't a marketable skill, but a vehicle to teach the timeless things: algorithms, data structures, and all those courses that have the word "theory" tacked onto the end of their titles.

    If you want someone to be a lackey and build you a GUI, you'd be better off looking for someone who has an ITT certificate. If you're looking for something more on the math side of computing (again, algorithms, analysis), then you talk to a computer scientist.
  • by phugoid ( 1176331 ) on Tuesday January 08, 2008 @05:13AM (#21951312)
    Are you suggesting that the CS curriculum should be designed around solving your little practical problems?

    I'm a Mechanical Engineer as well. Are you suggesting that _we_ should have spent our degrees studying look-up charts for HVAC ducts, or how to make nice Excel graphs? (calculus, mechanics, thermodynamics, heat transfer, ring any bells?)
  • by wbren ( 682133 ) on Tuesday January 08, 2008 @05:23AM (#21951376) Homepage

    And here's a radical concept: what about learning both types of languages? You know, the purpose of education being to provide a wide-ranging vision, and not just familiarity with what your teacher happens to like.
    That's exactly what the article is saying, and clearly you are missing the point. They aren't really complaining that students don't learn enough languages, but rather that students aren't being shown the bigger picture. A student can be taught how to use a linked list class in Java (or C++ for that matter), but that's not all you should know. You should also know the advantages of a linked list over, say, a dynamically-expanding array, as well as the situations in which each should be used (or avoided). You should implement a linked list yourself at least once so you understand how it's done. That's just a really small example, but everything from speed optimizations (optimal buffer sizes with respect to processor caches, etc.) to enterprise-level application scalability relies on a deep knowledge of system architecture as well as language syntax.

    Many universities are simply training highly replaceable professionals, which is a big reason why outsourcing is such a problem. When two people - one in the USA, one in India, for example - have the same skills, the cheaper one will be chosen (and rightly so, sorry). The point of the article is that many universities are simply training programmers rather than educating computer scientists. It's an important distinction, which some people just don't understand.
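    The "implement a linked list yourself at least once" exercise above fits in a few lines (a sketch of my own, names invented): O(1) insertion at the head, O(n) indexed access - exactly the trade-off against a dynamically-expanding array, which has O(1) indexed access but O(n) front insertion.

```java
// Minimal singly linked list, the kind the post says every student
// should build at least once.
public class TinyList {
    private static final class Node {
        final int value;
        final Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }
    }

    private Node head;

    // O(1): just point a new node at the old head.
    public void push(int v) { head = new Node(v, head); }

    // O(n): unlike an array, we must walk from the head to reach an index.
    public int get(int index) {
        Node n = head;
        for (int i = 0; i < index; i++) n = n.next;
        return n.value;
    }
}
```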
  • by epine ( 68316 ) on Tuesday January 08, 2008 @05:24AM (#21951384)
    Anyone with a true gift to become a kernel dev has probably engaged in flame wars with his/her professors already, regardless of what they teach.

    Pointers aren't rocket science. If you never perform an operation where you haven't first met the operation's preconditions, you never get a pointer error.

    If you aren't rigorously checking preconditions on *every* operation you perform, you're not going to cut it as a kernel dev anyway. Pointers are the least of your problems. Race conditions can become exceptionally hard to reason about. The prudent kernel dev architects the system such that this doesn't transpire. That requires a whole different galaxy of aptitude beyond not leaking pointers.

    When I first learned C in the K&R era, I thought those greybeards were pretty clever. Then I came across strcpy() and I wondered what they were smoking that I wasn't sharing. I thought to myself, there must be some higher level idiom that protects against buffer overflow, because no sane architect would implement such a dangerous function otherwise. Man, was I ever naive.

    More likely, too many of them had learned to program on paper teletypes, and just couldn't bring themselves to face having to type unsafe_strcpy() when they had reason to know it would work safely and efficiently.

    The C language deserves a great deal of blame in this matter, for giving many beginning programmers the false impression that a function call can dispense with formal preconditions.

    Interestingly, if you sit down to implement an STL template algorithm manipulating iterators, it proves pretty much impossible to avoid careful consideration of range and validity.

    OTOH, C++ makes it charmingly easy for an object copy routine, such as operator=(self& dst, const self& src) to make a complete hash of managed resources if you fail to affirm dst != src.

    There are plenty of amateur mathematicians who can manipulate complex formulas in amazing ways. The difference with a professional mathematician is that the necessary conditions for each transformation are clearly spelled out.

    A = B ==> A/C = B/C iff C != 0
    A > B ==> C*A > C*B iff C > 0

    Infinite series must converge, etc.

    I'm not even getting into defining A,B,C as fields, groups, rings, monoids, etc. for maximum generality.
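    The same discipline looks like this in code (a sketch of my own, names hypothetical): state the operation's precondition explicitly and check it before the operation runs, mirroring the "A/C = B/C iff C != 0" rule above.

```java
// Illustrative only: the precondition is part of the operation's contract,
// checked before the operation, not discovered via a crash afterwards.
public class Preconditions {
    // Precondition: c != 0 (the "iff C != 0" side condition made executable).
    static int divideChecked(int a, int c) {
        if (c == 0) {
            throw new IllegalArgumentException("precondition violated: c == 0");
        }
        return a / c;
    }

    public static void main(String[] args) {
        System.out.println(divideChecked(10, 2)); // 5
        try {
            divideChecked(1, 0);
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```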

    Yet the average programmer feels sullied to exercise the same intellectual caution manipulating pointers. I've never understood that sentiment. My attitude is this: if that's how you feel, get your lazy coding ass out of my interrupt handler; go code a dialog box in some Visual Basic application that won't work right no matter what you do.

    Why did the software industry play out this way? Other professions have much harsher standards. Primarily because software was in an exponential expansion phase, any work was regarded as better than no work (perhaps falsely), and industry couldn't afford to reduce the talent pool by demanding actual talent.

    Now we've allowed many people to enter the profession without comprehending the rigors of preconditions. It's as if we had taught a generation of lawyers how to practice law, but omitted liability. Oops. What to do about it? Invent Java, and tell all these programmers it wasn't their fault in the first place.

    So yes, Java doesn't teach very darn much about the harsh realities of actually thinking. And since thinking is hard, it's an impediment to productivity anyway, so it hasn't much been missed. The only thing we lost in the shuffle is our professional self respect.

  • by papaver1 ( 846310 ) on Tuesday January 08, 2008 @05:27AM (#21951402)

    I've seen this in a couple of students I went to school with. During my CS degree at UT Austin, the CS department started migrating to Java as its primary language. Java is definitely an easier language to learn on, especially with its GUI environments and libraries. That, however, is the main problem. A couple of friends I kept up with after college had a really hard time picking up lower-level languages.

    It's much easier to pick up higher-level languages when you know the building blocks from the bottom. I wrote all my code in C/C++ in vim. Not using a GUI gave me a good understanding of how code works: managing files, linking object files and libraries, using certain flags to enable compiler options. Once you're familiar with these concepts, a GUI is great for becoming more efficient and not having to deal with such rudimentary tasks; but if you never learn these concepts, you are losing out.

    C/C++ helped me hang myself in college. It was grueling but worth it. I've picked up most languages I've tried pretty easily. I've coded/scripted in C/C++, C#, Pascal, Haskell, Perl, PHP, ASP, HTML, SQL, VB, Bash, MEL, Lua, UnrealScript. It's better to stick the pain out in college than to try to figure out pointers and such when you have a job, there are deadlines to meet, and there's the possibility of getting fired if you keep slipping.

  • by Bryan Ischo ( 893 ) on Tuesday January 08, 2008 @05:28AM (#21951412) Homepage
    The better CS undergrad programs don't really teach languages per se. The main focus of the curriculum should be the theoretical underpinnings of computer science, combined with the practical aspects of software development. Since languages themselves are part of the practical aspect of software development, in addition to also being the focus of some computer science theory, it is unavoidable that languages should themselves be studied to some degree, and also used to a large degree to practice the theory that is being taught. Most theoretical CS only really needs 'pseudocode' to illustrate the concepts being discussed. But since students are often asked to write programs to demonstrate their understanding of the subject matter, a real language is unavoidable. But the language itself is secondary to the real meat of the subject, which should all be mathematical and theoretical in nature.

    At CMU the very first CS class (the one taken by losers like me who didn't AP out of it, mostly because my high school didn't even have computer classes, let alone AP ones!) really did focus on teaching a language - Pascal - and a significant part of the class was the learning of the language. It was the least useful CS class I took in the long run (not surprising, as an introductory course in any subject is likely to be the same). Subsequent courses would spend 1-2 weeks going over the fundamentals of the language to be used in coursework for the remainder of the class (which in some classes was C, in some C++, some used ML, others Scheme, etc.), to get everyone started; after that, you had to figure it out on your own in conjunction with actually learning the theory being taught. It really isn't that hard to pick up a new language once you know a couple, although I did have a hard time with ML, mostly because I was completely unmotivated to learn it, feeling that it was absolutely useless to know (I was right).

    No really good CS program has any classes with names like "Java 101" or "Advanced C++". To use a carpentry analogy, I would expect a really good carpentry school to teach the fundamental rules and "theory" of carpentry, so that the student upon graduation really understood what carpentry was all about and could apply their knowledge to aspects of the subject that they hadn't even encountered in school. I wouldn't expect a good carpentry school to have classes like "Advanced Hammering" and "Bandsaw 101". The courses would instead be "Introduction to House Frames" and "How to Construct Joints". You'd be expected to learn the use of the tools in the natural course of studying these subjects.

    It's the same for CS. Good programs don't teach the tools, they teach the *subject*; learning the tools is intrinsic in the study of the theory.

  • by Endymion ( 12816 ) <slashdot...org@@@thoughtnoise...net> on Tuesday January 08, 2008 @05:34AM (#21951438) Homepage Journal
    Java is part of the dumbing down of CS.

    Java is the new COBOL. And we will regret it in 20 years for much the same reasons.

    It actually gives me hope that you have recognized this in hiring practices. That a CV with a list of Sun's Java buzzwords is not an indication of a useful programmer.

    I was disturbed in college (1997-2001) that things were changing towards Java and other idiocy. Too many people didn't get pointers and other basic concepts, and Java was hiding them even more. I believe it was the one class we had in assembly programming that really pointed it out - when confronted with having to deal with real hardware, most of the students didn't know what to do. Concepts like "two's complement" vs "one's complement" caused a strange brain-lock for them, as they were so sheltered from the actual binary math and hardware of the computer.

    Only a handful of us, who had been programming for years already (yay for the Atari 800XL), had any idea what was going on. The college (UC Davis) skipped entirely over very basic concepts like von Neumann architecture. I ended up having to spend most of my time trying to help my fellow students; there were so many fundamentals missing.
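    The two's complement concept that caused the brain-lock can be shown in a few lines (example mine): on Java's 32-bit ints, negating x is the same as flipping all bits and adding one.

```java
// Two's complement in brief: -x == ~x + 1 on Java's 32-bit ints, so a
// negative number is the bitwise complement of its magnitude, plus one.
public class TwosComplement {
    public static void main(String[] args) {
        int x = 5;
        System.out.println(Integer.toBinaryString(x));  // 101
        System.out.println(Integer.toBinaryString(-x)); // 11111111111111111111111111111011
        System.out.println((~x + 1) == -x);             // true
    }
}
```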

    I think the most frightening part was having to yell at one of the professors one day, because the basic data structures he was teaching were being done incorrectly. He was teaching people to leak memory. ("Let's allocate a huge linked list, and then just set the head pointer to NULL and consider it freed!")

    Sigh. It was frightening then, and apparently all my fears were justified, as now the entire discipline is getting a bad reputation. Unfortunately, I can't exactly disagree with that reputation from some of the CVs I've seen recently. My degree is destined to be fscked, apparently.

    You hiring? ^_^
  • by anandsr ( 148302 ) on Tuesday January 08, 2008 @05:38AM (#21951466) Homepage
    I would think the best language to start with is quite possibly C. Pascal used to be better, but it's no longer used. Personally I would prefer a scripting language like Python or Perl; I think scripting languages are better because there is no compiler complexity involved.

    I would think it better to have a functional language next. Students are much more receptive in the earlier years, and functional programming does take some getting used to. I don't know them well enough to recommend specific languages ;-).

    After that I would take an object-oriented language, preferably Java. It is a nice object-oriented language.

    The important thing is that these languages must not be taught as languages, but as tools to understand some Computer Science subject. For example, we had something like this in my CS courses:

    1) Data Structures: Pascal
    2) Linear Programming: FORTRAN
    3) Programming Languages: Lisp
    4) Graphics: C (now it would be better to use something like C++)
    5) Systems Programming: Assembly
    6) File Systems / Databases: COBOL (I would think Java would be good here)
    7) Artificial Intelligence: Prolog

    I think the layout now could be:
    1) Data Structures: C
    2) Mathematics (Set Theory, Combinatorics, Boolean Algebra, Linear Programming): Scheme (C for Linear Programming)
    3) Graphics: C++
    4) Systems Programming and Operating Systems: C, Assembly, and shell
    5) Databases: Java
    6) Windowing Systems: Java
    7) Compilers: Perl and Yacc
    8) Artificial Intelligence: Scheme

    I am not sure Ada is required as such, because it is not used as much as other languages. I think exposure to different types of languages will give people enough of a base to learn Ada later if required. I would have liked to fit Python in somewhere, but I don't know where ;-). It does everything well but nothing specifically better.

    -anandsr
  • by cicatrix1 ( 123440 ) <cicatrix1@@@gmail...com> on Tuesday January 08, 2008 @05:57AM (#21951568) Homepage
    Isn't that kind of like saying science is not math? Of course it isn't technically, but computer programming sure is a great way to describe a lot of the theory and put it into practice, and in many ways make it easier to discuss.
  • by tieTYT ( 989034 ) on Tuesday January 08, 2008 @06:08AM (#21951610)

    I am a headhunter[...] I have zero interest in kids who have studied "easy" subjects.
    You have zero interest in hiring kids that are in the most demand? [tiobe.com] Does your boss know you're turning them away? I think he'd be upset.

    To be wise in computers...
    Not to be an elitist, but you're a fucking headhunter. If you actually knew anything about what makes someone wise in computers, you'd be the first I've met. Every headhunter I've met thought SQL was a database and didn't know there was a difference between "C-pound" and C++.

    You can't know unless you've been developing 8 hours every day after work. Wanna know why? Because every day I work I am 8 hours more knowledgeable in my field than the day before. I doubt you're keeping up with me.

  • by erc ( 38443 ) <erc AT pobox DOT com> on Tuesday January 08, 2008 @06:09AM (#21951612) Homepage
    I thought to myself, there must be some higher-level idiom that protects against buffer overflow, because no sane architect would implement such a dangerous function otherwise. Man, was I ever naive.
    Naive about the purpose of C, anyway. C was never designed to prevent you from shooting yourself in the foot. Writing C requires you to think, which is sadly out of vogue these days, as you point out later. C was never designed to protect you from yourself, as explicitly pointed out by Dennis Ritchie many times. If you want a language that will protect you from yourself, program in VB.

    So yes, Java doesn't teach very darn much about the harsh realities of actually thinking.
    But C obviously does - like checking boundary conditions. I don't understand how you can slam C in one breath, then praise it in the next.
  • by daem0n1x ( 748565 ) on Tuesday January 08, 2008 @06:18AM (#21951650)
    It's just the usual senseless Java bashing. It has ZERO to do with Java. If the students are taught VB or C# and nothing else, as happens in my country today, the problem is the same.
  • by aenikata ( 1193273 ) on Tuesday January 08, 2008 @06:21AM (#21951668)
    Most teaching adopts the approach that you learn the easy stuff first, and then move on to harder and harder bits. Teaching Java first, and languages like C++ only later, takes much the same approach. The argument is essentially that you should learn languages like C++ first to get an understanding of all the lower-level concepts, and look at libraries later.

    At uni I studied C++ and Prolog, and Java was reserved for the 3rd year. So I learned a bit about memory allocation and pointers, and now know that I prefer to stay well clear of them - and in fact have been influenced by other academics who suggest that such low-level access to memory is better avoided unless it is necessary. If you need all-out performance, then you may need to do this, but the cost can be obscure crashes, memory leaks, code insecurity, and so on. I also learned about recursion, learning to work with just a single method for iterating through a list. I still use it when it's appropriate (e.g. when parsing trees of data). However, it also taught me a greater appreciation of the value of having more options: for loops, and for-each especially (again avoiding invalid references).

    Yes, you can learn to use Java to design a GUI using an IDE, leveraging libraries to achieve most of the 'hard' stuff. And you know what, for most business requirements this is probably the most productive way to code: minimal reinventing of the wheel, maximum leverage of tools to enhance productivity. No, that doesn't give a full appreciation of memory streams, stack vs heap memory issues, hardware I/O, etc. But you're talking about catering for the mainstream.

    I think the authors are WRONG in saying that a university should be focussed on the academic. The reason people go to university (apart from the social aspects and cheap drink, or dossing around for another few years) is to become more employable, and to earn more. They don't achieve this by learning skills which are in minimal demand in a real-world workplace. As a result there should be a balance between the academic aspects and the more employment-friendly ones.

    Learning to use an IDE and toolkits also allows the developer to focus on things like the user interface, something which is often severely overlooked and treated as relatively incidental to the final design. Rather than criticising the teaching of a GUI-focussed course, this should be encouraged as part of a complete education, covering essential skills that are often missing from an academic course. I've known a brilliant mathematician and developer who has worked on a number of commercial games, but who took some time to get anywhere because he hadn't got the skills in user interface design. Equally, I've known various developers with a shaky grasp of logic and maths, who created a nice GUI and couldn't get the code behind it working right. So yes, there is a place for mathematics and learning the details that require more thinking, because you have to sort the chaff from the wheat.

    It's a typically academic article, in that it presents the authors' particular biases as fact, and one which doesn't sufficiently mesh academic utopia with commercial realities.
  • by James Youngman ( 3732 ) <jay&gnu,org> on Tuesday January 08, 2008 @06:24AM (#21951680) Homepage

    Anyone with a true gift to become a kernel dev has probably engaged in flame wars with his/her professors already, regardless of what she/he teaches.
    Piffle. You are equating software engineering talent with a propensity to participate in shouting (or its equivalent) matches. Those things are, to say the least, incommensurate.

    If you aren't rigorously checking preconditions on *every* operation you perform, you're not going to cut it as a kernel dev anyway.
    I disagree. Once a precondition has been checked once (on entry to whatever subsystem we're talking about) there is no need to re-check it all the time. Especially if it's an invariant of the algorithm. Sometimes such precondition re-checking gives rise to bugs anyway, since the negative arm of the conditional may contain code with a bug in it (though obviously using an assert macro will prevent that) - error cases get poor test coverage so such bugs may persist for a long time, too.
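    To make the check-once-at-the-boundary idea concrete, here is a minimal C sketch (the function names are made up for illustration): the public entry point performs the one real check, while the internal helper records the invariant with an assert rather than a live error branch that would itself go untested.

```c
#include <assert.h>
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

/* Internal helper: the precondition (s != NULL) is an invariant here,
   documented with assert instead of an error-handling branch. */
static size_t internal_length(const char *s) {
    assert(s != NULL);          /* already guaranteed by the boundary */
    return strlen(s);
}

/* Public subsystem boundary: the one place the precondition is
   actually checked. */
static bool subsystem_length(const char *s, size_t *out) {
    if (s == NULL)              /* checked once, on entry */
        return false;
    *out = internal_length(s);  /* no re-checking from here on */
    return true;
}
```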
  • by DerWulf ( 782458 ) on Tuesday January 08, 2008 @06:26AM (#21951690)
    They aren't as valid once you've realized that abstraction is the key to tackling complexity, that how hard something is does not equal its worthiness, that a language in which even K&R wrote unsafe code is probably not for everyone, and that not every architect needs to be a Michelangelo.

  • by jdickey ( 1035778 ) <jdickey AT seven-sigma DOT com> on Tuesday January 08, 2008 @06:35AM (#21951714) Homepage
    If what is desired is the training of hordes of marginally useful, low level hacks who can be easily replaced by the Bots from Bangalore, we're on the right track. Someone obviously noticed that their "bachelor's" degrees are 2-3 years and "master's" is a year on top of that. It's not politically or economically acceptable (to the schools) to do the same thing, yet they're under pressure to produce drones who can be easily replaced. Hence the insectlike specialization inherent in a lot of "IT" "education" for the past decade or so.

    Back in the day, "computer science" apparently had a strong liberal-arts component, with topics such as logic that are traditionally taught by philosophy departments. By gaining a solid grounding in theory, compiler design, and different languages, a CS graduate could reasonably be expected to fit into a wide variety of roles. By the late '70s and early '80s, that had been folded into the engineering schools in many universities; CS was seen as a subset of EE, and CS underclassmen generally had the same requirements as any "other" engineering underclassmen, getting precious little specialization until their junior year. We all know how well that turned out; it's what led to the "industry-focused" "curricula" of the late '80s and '90s. Education became a byword for vocational training, with an ever-shrinking set of currently topical skills being taught; shrinking largely because the American (and, to a lesser degree, Canadian and British) lower public educational systems were being systematically raped and dismantled by the political trends of the day. Johnny can't program? Well, that's to be expected; he can't read or write past what for half a century was deemed a third-grade level by the time he hit university.

    Rather than solve that problem by re-broadening (all levels of) education, industry "solved" it post-1990 by offshoring everything that didn't involve a well-paid management position, and bringing in indentured labor (via H1B visas in the US, for instance) for positions that were deemed "too difficult" to offshore. Everybody drank the purple Kool-Aid for a decade and more, paying little attention to the fact that failed projects were becoming more and more common, and more and more costly. Instead, approaches like XP were introduced and sold, not on their very real merits, but on the idea that "this will help failing projects fail faster, earlier and cheaper".

    What ever happened to the idea that the project shouldn't fail at all? Or, more heretical still, that software shouldn't fail at all? We put our lives at the mercy of software whenever we get onto a modern elevator, a recent-model airliner, or an automobile with electronic fuel injection. We put our wealth and comfort in the hands of software much more regularly. I recall one day back in early 2002 when I walked into the local branch of my bank, to be greeted by the sight of every "terminal", including at the teller windows, displaying the Windows "blue screen of death". I walked out, came back the next day, closed my account and took my money to a different bank. I was, according to the local newspaper, far from the only one to do so.

    Our society is and will remain completely dependent on the correct functioning of computer software for its continued health and growth, if not survival. We, as a society, are being extremely shortsighted and apathetic by tolerating the status quo without examination or serious discussion. Which of these "sensible" "reforms" of the last 20 years will be the equivalent of the Romans' engineering decision to use lead as the lining of their water pipes?
  • by jacquesm ( 154384 ) <j@NoSpam.ww.com> on Tuesday January 08, 2008 @06:36AM (#21951718) Homepage
    Different languages have different purposes. C has gravitated to a 'niche': system-level stuff, situations where performance is more important than security (not everything is connected to the internet, appearances to the contrary). And - surprise - 30 years ago we were living in a different world security-wise. The biggest problem with strcpy is not that it is 'inherently unsafe'; it is that if you do not do proper input sanitization, you cannot rely on it.

    This goes for most of the so called 'insecure' functions in C, they only become insecure if you have already messed up in an earlier stage of your code. If you are aware of the limitations of the standard library routines (even the unsafe ones) and you are operating in a 'hostile' environment (and todays internet certainly qualifies as such) then you'll need to take great care to accept only input that matches your assumptions in the code further down, if not you are in trouble. But good programmers will work like that anyway.
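    The "accept only input that matches your assumptions" discipline might look something like this minimal sketch (the name and length limit are made up for illustration): validate once at the edge, after which the strcpy into a buffer sized for the validated data is safe by construction.

```c
#include <stdbool.h>
#include <string.h>

#define NAME_MAX_LEN 31   /* illustrative limit, not from any standard */

/* Reject input that violates the assumption; once validated, the
   "unsafe" strcpy cannot overflow the destination. */
static bool store_name(char dest[NAME_MAX_LEN + 1], const char *input) {
    if (strlen(input) > NAME_MAX_LEN)
        return false;       /* input sanitization happens here, once */
    strcpy(dest, input);    /* safe: length already proven to fit */
    return true;
}
```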

    It's perfectly possible to write crappy code in *any* language, not just in C (though, in the words of one old-timer programmer, 'C is like a racecar: you can cut corners, but if you do that too often you'll end up on your side').

    Coming back at a fairly well-thought-out piece with an answer like the one written several levels above here is not in any way helping the discussion; it is simply insulting.

  • by MythMoth ( 73648 ) on Tuesday January 08, 2008 @06:41AM (#21951736) Homepage

    Fail gracefully? I mean, at least attempt to close the open files, sockets and whatnot
    They weren't opened in the shown try block, so closing them in its catch block wouldn't be appropriate; there should be a higher level try/catch to handle this. At the point of failure shown there's nothing much that can be done, and I agree that the AC was adding complexity to bolster a completely bogus argument.
  • by philipgar ( 595691 ) <pcg2 AT lehigh DOT edu> on Tuesday January 08, 2008 @06:47AM (#21951774) Homepage

    I doubt if even 1% of CS grads could write code to turn this BMP into a JPG, or even explain the ideas behind this.

    Just to harp on one statement in your comment: this is either an absurdly basic demand to ask of programmers, or an absurdly complex one. On one side, it is as simple as making a function call, bmp2jpg(...) or whatever it is in the libraries they are using. The process of doing this is simple, and suits the Java mentality of "find the right tool in the shed to do your entire program".

    On the other hand the question you are asking could be one so absurdly difficult that it is unlikely you'll find more than a handful of people who just got a BS in CS will know (having a BS in Computer Engineering it is possible they might understand it, and having advanced degrees makes it even more likely). Expecting a CS student to be an expert in signal processing is rather silly. Even if someone was a computer engineer and was actually changing the JPEG code (a process I have done recently), it isn't expected that they understand the entire algorithm. They need to know the basic steps, and how the data flows through the process. They may even need to know the basics of what a DCT does (knowing the math behind it could be useful, but experts have optimized it so using a naive algorithm is idiotic). They should also know how Huffman encoding works, and any student who's taken an algorithms course should be familiar with such ideas. However at a low enough level, they really don't need to understand it. People spend entire careers optimizing something like a DCT, so to most programmers it should be seen as another tool.

    The point is that different levels of abstractions exist in different problems, and a good programmer (not necessarily a good CS student) should know what level of abstraction is necessary for the task at hand. Having someone who reinvents the wheel for everything is about as useful as a programmer who only uses canned solutions. It's finding the right mix that is important.

    However, one key point of this article is that CS curricula spend TOO much time on programming, and too little on theory. A university is not a trade school, and students should learn the important theories and not the tools. However, teaching with the wrong tool is highly detrimental. And one thing that was observed is that while Java may make the introductory course easier, it causes problems later on when a student must know some other tool (such as pointers). Not having a good enough foundation can mean that the professor must waste a significant portion of a class teaching students the new tool.

    Phil

  • by Shohat ( 959481 ) on Tuesday January 08, 2008 @07:14AM (#21951970) Homepage
    That's bullshit
    C is a perfect beginner's language. When something is wrong, you know that YOU did it, not the compiler, not some class or wrapper or interface you have never seen, not the VM, or some bug or leak or god knows what.
    I write for real-time control systems (thermal/air/chemical, etc...) in C , for PIC and NEC microcontrollers, and I've written Java servlets and PHP (5, OO) for myself.
    When I write something like
    shortPointer = (BYTE *)&Short_Transmission + 2
    I know exactly where I point, what kind of data is there, how I handle it and what can change it. If something is messed up, I KNOW I did it, and I shouldn't google around for workaround, a bug fix or some undocumented feature.
    Learning programming should start from the point where you have full control over your code and data, and build up to the level of Java, where very often you handle little more than instances of objects that you just happen to understand enough to use.
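    A compilable sketch of that kind of full-control access, keeping his BYTE name (Short_Transmission here is just a 16-bit value for illustration; which byte you land on depends on the target's endianness):

```c
#include <stdint.h>

typedef uint8_t BYTE;   /* mirrors the BYTE typedef used above */

/* View the raw bytes of a 16-bit value through a BYTE pointer.
   Note the form: cast first, then index -- '(BYTE *)&value', with
   no extra '&' in front of the cast. */
static BYTE byte_of(uint16_t *value, int index) {
    BYTE *shortPointer = (BYTE *)value;  /* reinterpret, don't convert */
    return shortPointer[index];
}
```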
  • by wrook ( 134116 ) on Tuesday January 08, 2008 @07:19AM (#21951992) Homepage
    I think you're missing my point. If you want to learn theoretical computer science, where would *you* go to learn it? Because they don't teach it at universities in general (with some notable exceptions). They churn out professional programmers, who would actually be *much* better off (in terms of being good programmers) to just spend those 4 years writing code.

    The comment about my degree was flippant, I admit. And it's been 25 years since I took that degree. So it's only in retrospect that I realize its worth (or lack thereof). It got me a job. Without that piece of paper, it would have been difficult for me to break into the field. But it did *not* teach me anything about CS. Nor did it get me to a point where I could realistically do meaningful graduate work in the way, say, a physics graduate would have. But I had lots of trivial information about systems that were in use at the time (if you know what IEFBR14 does, then you understand what kind of systems those were. If you don't, be *very* thankful!)

    I take your point about researching the degree, and if I had known what I know now, then I would have known to go to those few schools that actually teach CS. But here's the thing: when you don't know something, you don't know what you don't know. Now that I've spent the last 2 and a half decades writing and reading, I'm beginning to understand what you need to know in theoretical CS. I at least have an idea of what I don't know. It is not reasonable to expect someone who has not done that work to know what is out there. But most professors who do research in CS know this. So I don't excuse them.

    And the point of my rant remains. The purpose of the university is basic research. Most universities are not training graduates (even grad students) to a level where they can do basic research in computer science. They may very well be providing a function for industry by churning out people by the tens of thousands with an introduction into the hot programming languages of the day. But that's the function of a technical college.

    What we risk by not investing in basic research is stagnation in computer theory. Marvin Minsky proved that perceptrons can't compute everything. Then, in an offhand comment, he said that he didn't know if multi-level neural nets had the same problem. It took something like 10 years before anyone even checked. In other areas, programs are getting more and more complex all the time. If we can't find ways of representing that knowledge in more expressive forms, then we will just hit a glass ceiling. By choosing to train programmers and neglect theorists, our whole profession loses. *That's* what I'm complaining about.
  • by Simon Brooke ( 45012 ) <stillyet@googlemail.com> on Tuesday January 08, 2008 @07:24AM (#21952014) Homepage Journal

    C->C++->assembly (any ISA)->some functional language->some other language (Java, python, ruby, etc).

    Assembly is necessary, to understand how a computer really works. Functional languages are good, just to know a completely different style. Some other language for breadth. Then the student can realise that everything after asm was a waste of time, and return to C.

    This is kind-of bollocks.

    When I was a young programmer - which is about twenty-five years ago - the team I was on got a new ink-jet printer. It printed its own character set, we needed it to print bitmaps. The processor it used was one none of us had ever worked with before. One of the older members of the team - a guy called Chris Burton - took the spec sheet for the processor and the spec sheet for the printer home with him on the train, and came back the next day with the code for the new printer driver written in long hand, not in assembler mnemonics but in actual op-codes, in pencil on a pad of paper. It was burned on an EEPROM that day and drove the printers until that model became obsolete five years later - there were no bugs, it never needed fixing.

    It should be said in passing that Chris had worked in his youth on the Manchester Mark One [wikipedia.org], and after he retired was part of the team that rebuilt Baby and got it running again.

    I've always thought that was epic programming, a standard I'll never reach. But it's one particular layer on the stack. My job on that team was writing inference engines, and Chris was always really impressed by that. It's nearly thirty years since I touched any assembler and fifteen since I wrote anything serious in C. A modern computer system is way too complex for any single person to really understand, in depth, all the layers. I take what the silicon designers do as given, and likewise the microcode programmers. Right back in the early days of Linux I did fix issues in kernel code a couple of times but I wouldn't even try these days - the guys who do that are much more expert at it than I am. Likewise, I don't expect them to understand the compiler compilers that I write. It's a different layer on the stack.

    I agree that you need to have a rough idea about how the whole stack works. But we no longer expect all computer science students to be able to wire up NAND gates from discrete valves or transistors. And although a computer scientist needs to know that there are primitive logic operations carried out on the metal, and that on top of that there are a stack of different software layers with real machine code on the bottom and a whole slew of intermediate code representations above that, I don't believe that it is any longer necessary for all students to be able to write a serious program in assembler.

  • by teh kurisu ( 701097 ) on Tuesday January 08, 2008 @07:24AM (#21952018) Homepage

    They were also completely baffled when it came to not using an IDE to develop software. Makefiles had to be explained several times.

    That's a teaching problem, not a Java problem (you can teach badly in any language). If they can't function without an IDE then they've obviously not been taught how to use Ant, or even java and javac come to think of it.

  • Lisp (Score:4, Insightful)

    by wikinerd ( 809585 ) on Tuesday January 08, 2008 @07:34AM (#21952054) Journal
    Professors who care about students' education teach Lisp and Scheme. I had a professor who taught us Scheme in his free unpaid time after the main lecture. The university did not include it in the curriculum, but he explained to us why Scheme is important, and those of us who understood its importance chose to stay and listen to his unpaid, unofficial lectures on Scheme. The reason these languages are important is the mathematical thought that lies behind their structure. Every language has a way of thinking behind it. Some languages are procedural, others are functional. It is these paradigms that are important in a curriculum, because most mediocre programmers who get to program using one paradigm usually stay with it for a lifetime and never learn another. So a university should ideally offer courses on all available paradigms, to make sure future programmers can choose the one which is the most productive for every specific project.
  • by Moraelin ( 679338 ) on Tuesday January 08, 2008 @07:37AM (#21952070) Journal
    Yes, abstraction is key to tackling complexity. But equally, having no clue what happens behind that pretty Java code is the key to writing bad code and spending time debugging what you shouldn't even worry about.

    Guess what? Even in Java, pointers still come to bite you in the arse when you least expect them. I see people every day who have trouble understanding the difference between "==" and "equals()" in Java, because they never learned the pointers behind them. They're essentially one abstraction level too far from understanding what their own code is doing.

    Or even in Java, learning why you can't modify an "int" parameter, but you can modify the contents of an "int[]" parameter: guess what? Requires pointers. People end up doing all sorts of unnatural mental contortions to remember when passing by value isn't really passing by value, when "it's a pointer" would sum it up perfectly.
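    The C picture underneath both confusions can be sketched in a few lines (a minimal illustration, not anyone's production code): "==" compares the pointers themselves while strcmp() compares what they point at, which is exactly the "==" vs equals() distinction; and a pointer parameter is itself passed by value, so reseating it does nothing for the caller, while writing through it does.

```c
#include <string.h>

/* Reassigning the parameter changes only the local copy of the
   pointer -- the caller's variable is untouched. */
static void reseat(char *p)  { p = NULL; (void)p; }

/* Writing through the pointer changes the shared pointee -- the
   caller sees this. */
static void clobber(char *p) { p[0] = 'H'; }
```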

    And it shows. I've had people come to me half a dozen times with basically the same idiotic "auugh! Java's Hashtable is broken! I added a new value, and when I look into its array with a debugger it replaced my old one!" When in fact, it had only added a node to the front of the linked list. But they don't know what a linked list is, nor what a hash table really is, nor how a Node can contain another Node, without a concept of pointers. Worse yet, not only do I see them spending a week debugging Hashtable, I see piss-poor workarounds done to prevent it from doing its job.
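    What the debugger was actually showing is trivial once you've built a chained bucket yourself; here is a minimal C sketch of the idea (not java.util.Hashtable itself): the bucket slot points at a new node whose next is the old node, so nothing was replaced.

```c
#include <stdlib.h>

/* A Node "contains" another Node only via a pointer. */
struct Node {
    int key, value;
    struct Node *next;
};

/* Prepend a new entry to a bucket's chain; the old entry survives,
   one link down -- it is not overwritten. */
static struct Node *bucket_put(struct Node *head, int key, int value) {
    struct Node *n = malloc(sizeof *n);
    if (n == NULL)
        return head;     /* allocation failure: leave the chain untouched */
    n->key = key;
    n->value = value;
    n->next = head;      /* old head becomes the second node */
    return n;            /* new front of the chain */
}
```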

    Or I see burger-flippers-turned-programmers occasionally get the real programmers fired for doing the right thing. Like using a "==" where it's correct to use it. But the burger flipper doesn't understand that. He learned some "for String use equals()" mantra, and he'll apply it and preach it, cargo-cult style, without even understanding what he's _doing_.

    Or I see people think that optimization means replacing two lines with a one-line call, because they have no fucking clue what the machine does with that code. They think that speed is measured in lines of code, because no one explained to them otherwise. So they wonder why replacing two ifs with a catch is actually slower. (And I'm not getting into the many ways such a catch can make the code less secure, for example by assuming that a real exception is just their loop reaching the end of the array.) Exactly what throwing an exception does is a mystery to them.

    Etc.

    No, no one said you must keep programming in "a language where even K&R wrote unsafe code," nor that difficulty equals worthiness. But it helps to be at least exposed to those concepts once, even if thereafter you go on to program in Java or VB for the rest of your days. The fact that you worked with pointers once in C and managed to get them right _will_ show in your Java code too.

    Probably the best thing that helped my coding was doing assembly on my parents' old home computer, back in high school. In fact, in hex, because that ZX-81 with 1k RAM didn't even have enough RAM for an assembler. Wrap your mind around _that_, if you think C is too hard.

    Would I advise anyone to write a production program in assembly nowadays? Nope, God forbid. I wouldn't have advised writing a whole program in assembly even back then. But understanding the machine behind that high level stuff will show even in your Java code.

    And, yes, not every architect needs to be a Michelangelo. But it helps if they're not a clueless moron who can't even build a doghouse right. You can see plenty of architects nowadays who can't even get a basic house right. They know how to draw an artsy sketch of a house, but they have no clue how to calculate it to actually stand upright or what materials to use so it doesn't get damaged by rain within a year or two. And/or need a civil engineer to fix their elementary mistakes. Maybe it wouldn't hurt that much if they knew a bit more, ya know?
  • by mwvdlee ( 775178 ) on Tuesday January 08, 2008 @08:04AM (#21952220) Homepage
    I agree with much of what you write, but think assembler should still be covered at least to the point of students being able to create tutorial programs.

    My day job is as a mainframe (z/OS) programmer in a number of languages and whenever a program has a non-obvious bug, it still comes in damn handy when you can read what is actually happening underneath.

    I'm not talking about training them up to the level of building systems in assembler, but they should be able to take hex values, processor specs and an assembler guide and translate what's going on.

    Assembler teaches you what a processor does, just like students should know how it's possible for a processor to be constructed out of NAND gates. They needn't be able to create their own processors, just understand why it actually works.
  • by epine ( 68316 ) on Tuesday January 08, 2008 @08:05AM (#21952228)
    I don't advocate protecting the programmer from him/herself.

    I do advocate designing primitives, as essential to the language as the C string functions, that powerfully remind the programmer using them of the programmer's logical obligations, and that support the programmer in reasoning correctly about those obligations, without having to digest 15 lines of preceding context to see that calloc() provided the implied terminating NUL.

    strlcpy and strlcat - consistent, safe, string copy and concatenation [gratisoft.us] by Todd C. Miller and Theo de Raadt, OpenBSD project

    There are several problems encountered when strncpy() and strncat() are used as safe versions of strcpy() and strcat(). Both functions deal with NUL-termination and the length parameter in different and non-intuitive ways that confuse even experienced programmers. They also provide no easy way to detect when truncation occurs. Finally, strncpy() zero-fills the remainder of the destination string, incurring a performance penalty. Of all these issues, the confusion caused by the length parameters and the related issue of NUL-termination are most important. When we audited the OpenBSD source tree for potential security holes we found rampant misuse of strncpy() and strncat(). While not all of these resulted in exploitable security holes, they made it clear that the rules for using strncpy() and strncat() in safe string operations are widely misunderstood.
    An Interview with OpenBSD's Marc Espie [onlamp.com]

    We have had a lot of success explaining the issues and getting a lot of people to switch from strcpy/strcat to strlcpy/strlcat.

    Weirdly enough, the Linux people are about the only major group of people that has constantly stayed deaf to these arguments. The chief opponent to strlcpy in glibc is most certainly Ulrich Drepper, who argues that good programmers don't need strlcpy, since they don't make mistakes while copying strings. This is a very mystifying point of view, since bugtraq daily proves that a lot of Linux and free software programmers are not that bright, and need all the help they can get.
    The original C strcpy() could just as easily have had the semantics of strlcpy(), with insane_strcpy() provided to copy strings/trash core without a cycle wasted.

    One must recognize that in a solid code base, thinking occurs more often while reading code than writing code. Correctness is not a write-only proposition in any living code base.

    We came to the conclusion that a foolproof alternative to strncpy() and strncat() was needed, primarily to simplify the job of the programmer, but also to make code auditing easier.

    The original C string functions were (and remain) a pedagogic disaster. Most beginning programmers failed to realize how much thinking had been folded into the surrounding context. If they were reading K&R, that thinking existed. If they were reading any code they had at hand, it likely didn't, going by any survey of average C code quality ten years later. With the original string functions, whether this careful thinking exists is not obvious without doing a lot of mental work, and that work has to be repeated *every time* the code is seriously reviewed.

    Worst of all, the strcpy() function seemed to imply "buffer overflow is no great concern, we're not even going to give you a single argument on this very dangerous function to help you avert it". It was a false parsimony to save that extra argument in the default case.

    This isn't at the level of whether the handgun has a safety or not. It's at the level of whether it is possible to chamber a round too large for the barrel. I can point the gun successfully, but I'd greatly prefer it not to detonate in any other direction.

    A more thoughtful C string API would have averted mistakes on the order of chambering bad ammunition, without encumbering the pointy end in the slightest, while still leaving the programmer free to shoot at his own foot.
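    For readers who have not run into it, the contract strlcpy offers can be sketched in a few lines of portable C. This is a simplified reimplementation for illustration, not the OpenBSD original:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Simplified sketch of strlcpy(): copy at most size-1 bytes of src
 * into dst, always NUL-terminate when size > 0, and return the full
 * length of src -- so truncation is detected by (ret >= size). */
size_t my_strlcpy(char *dst, const char *src, size_t size)
{
    size_t srclen = strlen(src);
    if (size > 0) {
        size_t n = (srclen < size - 1) ? srclen : size - 1;
        memcpy(dst, src, n);
        dst[n] = '\0';
    }
    return srclen;
}
```

    Unlike strncpy(), the destination is always a valid C string (for size > 0), and truncation is detected with a single comparison of the return value against the buffer size, which is exactly what makes audits cheap.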
  • Why We Teach Java (Score:5, Insightful)

    by fartrader ( 323244 ) on Tuesday January 08, 2008 @08:09AM (#21952250)
    As a CS Professor, here are some of my thoughts on this article:

    (1) Java is what the market wants. Yes, we can teach any other language under the sun. But the reality is that the software industry values individuals who are Java-literate. By this I mean an individual who has a basic understanding of the OO principles the language is founded upon, can write Java code using common tools, and has at least some insight into the more common Java APIs. Any learning institution that doesn't take this into account when designing its curriculum is doing a serious disservice to its student body. While some do go to university for the sheer joy of learning a subject, most are there to ultimately get a job.

    (2) In my opinion there is something seriously wrong with a Java course that emphasizes Swing or Web development rather than the fundamentals. Yes, it's important to get things in and out of a program, but, at least initially, these should be incidental to the main event: learning the language and applying it effectively, and thinking in an object-oriented way, which many of you know is not necessarily an intuitive way to look at the world, especially if you already have a procedural background. GUI and web application development should be separate, advanced courses.

    (3) I sometimes lament the lack of insight into pointers, but any professor worth their salary will spend some time discussing the Java object reference architecture and relating it to pointer-based languages. Regardless of how abstract your language is, "opening up the hood" and demonstrating how things work, and why things have been designed the way they are, is often worth knowing.

    (4) I laughed when I read the article about Praxis, especially the part about formal methods. Are they serious? Yes I was taught formal methods in school, and could understand *why* I'd want to use them... If I had all the time in the world... a huge budget to burn and customers not screaming for something that the business needed yesterday. Praxis offers software development based on formal methods and as a consequence occupies an important (and probably expensive) specialized niche of the software ecosystem. To suggest that this approach should be the norm and lament its absence really betrays that the authors have spent too much time in academia and not enough in the real world.

    (5) Ada is a great language - in fact I learned Ada 83 as a first language along with C. It just isn't relevant to most software development companies or IT departments - if indeed it ever was. I worked on a research project that was part of the Ada 9X Real-Time initiative - the main users were aerospace and military vendors - particularly embedded systems. There you do need to know about concurrency and distribution - along with hard performance deadlines and often a slew of safety and mission-critical issues you need to consider to do a good job. However, I fail to see the general relevance of Ada to a commercial market that is primarily interested in "simple" information systems, getting information out of a database and/or putting it in, with some processing en route. Why should I use Ada when the market in general doesn't use it?

    (6) We teach concurrency - it's useful stuff to know. I think that using formalisms to describe concurrent programs is going a wee bit too far (see (4) above).
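    The bridge from Java references to pointers described in point (3) can be made concrete with a few lines of C. A hedged sketch; the struct and function names are invented for illustration:

```c
#include <assert.h>

/* A Java object reference behaves much like this C pointer:
 * assignment copies the address, not the object, so both names
 * alias the same storage -- the semantics students miss when they
 * have only ever seen "a variable holds a value". */
struct point { int x, y; };

/* Returns p.x as seen through alias a after writing through b. */
int alias_demo(void)
{
    struct point p = {1, 2};
    struct point *a = &p;  /* roughly: Point a = new Point(1, 2); */
    struct point *b = a;   /* roughly: Point b = a;  (no copy)    */
    b->x = 99;             /* mutate through one name ...         */
    return a->x;           /* ... and the other name sees it      */
}
```

    The same demo written in Java (two variables referring to one object) behaves identically, which is the point: a reference is a pointer with the arithmetic taken away.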

  • Re:tasty (Score:0, Insightful)

    by Anonymous Coward on Tuesday January 08, 2008 @08:14AM (#21952282)
    Engineers Slam Wheels As "Damaging" To Engineering Students

    Where are the car designers of tomorrow? We need to be teaching solid principles such as wheel design. They just take pi for granted these days!
  • Re:Java == Jobs (Score:5, Insightful)

    by asc99c ( 938635 ) on Tuesday January 08, 2008 @08:30AM (#21952394)
    I think a lot of employers advertise Java / .NET because a lot of employees believe that is the new thing and the way forward - i.e. C programming is on the decline, and (young to middle-aged) employees don't want to fall too far behind the times. Older employees might instead make a selling point of their existing skills.

    I'm mainly a C programmer these days, but I took the job basically understanding that I would be working significantly with Java. That was the only language I had experience with on leaving Uni, and I was promptly put to work on a Pascal / OpenVMS system! Friends from Uni have had similar experiences.

    I have been a bit worried about an outdated skillset, as lots of employers ask for lots of object-oriented programming experience and I only occasionally use it. I think this would be my primary problem if I started looking for a new job. I also think it's a bit unfair, as the skills are pretty transferable - there's only a little new theory to learn, and after that, good programming practices aren't hugely dependent on the language used.

    In dealings with many (perhaps even most) other companies whose software I write interfaces with, it's pretty clear that they are also using C or C++, and often even older systems (in one interface we have to convert our messages from ASCII to EBCDIC). You can frequently tell what language the other system is from the sort of errors that crop up, and sometimes from the design of the interface. I'm forced to believe that my area of the industry is still primarily C based.
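    For the curious, the ASCII-to-EBCDIC conversion mentioned above can be sketched in C. This covers only digits, uppercase letters, and space, using code points from the standard EBCDIC table; a real interface bridge would use a full 256-entry translation table (or iconv):

```c
#include <assert.h>

/* Minimal ASCII -> EBCDIC sketch. Note EBCDIC letters are not
 * contiguous: A-I, J-R, and S-Z sit in three separate ranges,
 * which is one reason naive "add an offset" conversions break. */
unsigned char ascii_to_ebcdic(unsigned char c)
{
    if (c >= '0' && c <= '9') return 0xF0 + (c - '0');
    if (c >= 'A' && c <= 'I') return 0xC1 + (c - 'A');
    if (c >= 'J' && c <= 'R') return 0xD1 + (c - 'J');
    if (c >= 'S' && c <= 'Z') return 0xE2 + (c - 'S');
    if (c == ' ') return 0x40;
    return 0x3F;  /* EBCDIC SUB: "no mapping" placeholder */
}
```

    The gaps in the letter ranges are also why you can often tell an EBCDIC system is on the other end: sorting and range checks that work on ASCII quietly misbehave.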
  • by bcharr2 ( 1046322 ) on Tuesday January 08, 2008 @08:51AM (#21952528)
    The colleges would teach computer science if businesses were willing to pay for it. Unfortunately the industry wants to pay for the minimum, so that is what is being produced. Add in inept business managers, who see no difference between CS and CIS majors (except that the CIS majors are willing to work for less money), and you have the industry today.

    It's why so many "project managers" think taking a little extra time to properly engineer a system is a waste of time, but have no problem rewriting entire systems every few years because they have become unmaintainable.
  • by david_thornley ( 598059 ) on Tuesday January 08, 2008 @08:53AM (#21952542)

    Java is also a lousy "beginners" language, because its reliance on standard libraries leads beginners to look for pre-packaged solutions rather than writing their own.

    Which is exactly what we want people to do when they're off doing real work. One problem with teaching languages without such libraries is that people get used to writing their own rather than looking for pre-packaged.

    There's no problem with basic data structures and algorithms in Java. You declare that certain libraries are off-limits for an assignment, and give a zero score to anybody who violates that. The ones that actually belong in the field will catch on real quick.

  • by Shirotae ( 44882 ) on Tuesday January 08, 2008 @08:59AM (#21952588)

    There is no shortage of highly skilled computer scientists who know the theory and can program in any language, including ones they invent for themselves because the existing ones are not good enough. If there were a shortage then there would be head hunters out looking for people like that and offering big salaries and golden handshakes. The same is true for various other science and engineering disciplines.

    When businessmen or politicians talk about that sort of shortage what they really mean is that these days there is a shortage of naive people who will spend many years racking up debts pursuing an advanced education and then work for a pittance afterwards.

  • by DerWulf ( 782458 ) on Tuesday January 08, 2008 @09:01AM (#21952602)
    Yes, a programmer should be able to grasp the concept of pointers, no argument there. But as you said yourself this concept is also necessary in Java. So, what was your point? That badly trained programmers are bad?
       
  • Ada 2005 (Score:2, Insightful)

    by krischik ( 781389 ) <krischik&users,sourceforge,net> on Tuesday January 08, 2008 @09:05AM (#21952632) Homepage Journal
    If someone has actually stopped caring about the evolution of Computer Science then it's you - otherwise you would have known that the newest Ada is Ada 2005 and not Ada 83. Have a look for yourself:

    http://www.adaic.com/standards/05rm/html/RM-TTL.html [adaic.com]

    There is nothing Java (as a language) has that Ada 2005 hasn't got as well - but there is a lot that Java hasn't got and Ada does.

    Martin
  • by syousef ( 465911 ) on Tuesday January 08, 2008 @09:17AM (#21952740) Journal
    I think you're missing my point. If you want to learn theoretical computer science, where would *you* go to learn it?

    I'd research my courses very carefully and go somewhere where theoretical subjects were taught. That may mean a masters or even a PhD.

    When I did my B.Sc. in Computing Science, we had a mix of theoretical subjects (automata and discrete math were actually core; compilers was an elective I took, and I took a couple of artificial intelligence subjects as well).

    Because they don't teach it at universities in general (with some notable exceptions).

    Ahhh so there are "some notable exceptions". Could it be that you should seek one of these out if that's what you want to do?

    They churn out professional programmers, who would actually be *much* better off (in terms of being good programmers) to just spend those 4 years writing code.

    Complete garbage. Programmers who don't understand how the machine works under the hood, or that you can express problems that can't be solved (or can't be solved practically), are rubbish. They don't understand the very tools they use. I've worked with people who understand the classics like the travelling salesman problem or the halting problem. I've also worked with people who don't. Let me tell you, I'd much prefer to work with those who aren't going to propose something impossible and set a deadline for my team to meet in creating it. (I once had a manager propose we write a full-blown compiler in a month, and it was clear she had zero understanding. My current boss, on the other hand - and I work elsewhere, thankfully - used to write software, and she's brilliant.) It doesn't matter if you never become a computer scientist proper. These things are VERY important.

    And the point of my rant remains. The purpose of the university is basic research.

    The point of a university is higher learning. One avenue is basic research.

    What we risk by not investing in basic research is stagnation in computer theory. Marvin Minsky proved that perceptrons can't compute everything. Then, in an offhand comment, he said that he didn't know if multi-level neural nets had the same problem. It took something like 10 years before anyone even checked.

    If it were an easy problem, Minsky would have set one of his grad students to work on it.

    Look, the rest of this refutation is pointless. If you want to move into a career of research now, you clearly have the intelligence and articulateness required. Your education has put you in better stead than you give it credit for. What I don't know is whether you have the drive and circumstances to pursue another degree or a change of career. If you want it desperately enough, are willing to make the sacrifices, and are lucky enough to do it, I wish you the best of luck.

    I'm not saying this flippantly either. I dropped out of a science degree because I found it didn't suit me. I worked, then went back and did my comp sci, and found my career in IT. Science was a dream of mine right through high school. When I found I couldn't pursue it I didn't give up on it. I went and did a Masters of Astronomy, part time and over the Internet. I certainly didn't have my hand held to learn what I did, and though I never intended it as a change of career (i.e. I did the degree "for fun"), and though it cost me big time health-wise and socially (not to mention financially), I don't regret doing it. It is part of who I am, and I loved the challenge and cherish the knowledge I gained. I finally know how we know what we know about the universe. I've computed the distances to stars and understood their life cycles. I understand what the universe looks like on the grand scale. In short, I have a better understanding than the average person of the universe I live in. While I'd love to go and do research, I know I won't sacrifice what I'd have to for that. It's still okay. My degrees have been anything but a waste of time.
  • Re:tasty (Score:5, Insightful)

    by Anonymous Coward on Tuesday January 08, 2008 @09:32AM (#21952880)
    I've been alternating between C++ and Python

    Do more. Try doing your homework in Haskell or Lisp, or hell, write in Forth or PostScript. It's a billion times easier to learn a language when you have someone else telling you what to do in it, and a billionth of the stress when your paycheck doesn't depend on it working.

    I've wanted to learn Ruby and Rails for a while now, but I've got nothing to do with it at home, and like hell I'm going to show up at work and replace a production app with Ruby for the hell of it, even though we've got a number of internal web apps that are basically exactly the kind of CRUD RoR was designed for.
  • by JavaRob ( 28971 ) on Tuesday January 08, 2008 @09:33AM (#21952892) Homepage Journal

    unfortunately he doesn't go far enough into the core of the problem, which is that today's universities are mass-producing what employers want, rather than the thinkers of tomorrow.
    Is that even true, though? The "drones" are the reason why so many projects fail (because they have no clue about larger risks, and no way to solve difficult problems), which costs those same employers vast amounts of money.

    I code mostly in Java in my professional life, but when I was in school we were forced to diversify, and it was a definite plus.

    The intro course used mostly JavaScript for some reason (!), but other (even relatively low-level) courses required projects written in C, Scheme, and Java. I took an operating systems course where we had to write a project in some kind of mini assembly language... it's all a bit fuzzy now (I graduated 10 years ago), but I remember it being tough to wrap my head around for a while. And that's a good thing, right?

    I also did a couple of summer-long solo projects that probably taught me more than anything - just fighting through the problems on my own, learning the hard way about the value of clean code, OO, version control, debugging skills, and so on.

    Perhaps obviously, I'm a much better *Java* developer than I would have been without the other stuff. So I agree wholeheartedly that students must learn more than one language in their schooling - whether for professional reasons or for academic ones; you simply have to flex your thinking in more than one way if you're going to learn.

    It also strikes me as a tough way to learn... how do you learn what X language is, and the reasons behind its design, totally in isolation? How do you learn what OO is if "functional" is a meaningless concept to you?
  • Re:Java == Jobs (Score:5, Insightful)

    by aldousd666 ( 640240 ) on Tuesday January 08, 2008 @10:02AM (#21953184) Journal
    There are many jobs in .NET and Java, yes. I hated Ada in school, and FP was particularly difficult. But once we got to assembler it all made sense. It was the guts of the system, and I finally saw how it all fit together. Once I saw data structures, and then had a look at how stack-based code was generated from all of the other languages, I felt like I could learn any of the languages and not feel like I was using a black box.

    In my opinion, it's OK to learn Java and C# in school after one has had a look at the internals, perhaps with a primer in virtual machines. That would cover the bases of actually knowing how computing works, in addition to allowing for preparation for the job market. One thing that's absolutely crucial to a computer science grad in the real world is being able to adapt to any language when needed, so all of this argument over which language to learn is a little off the mark. You should learn programming in general in school, and optionally focus on any language of the day for the market after you've become versed in the art in general.

    I realize that 'becoming versed' while in school is a little bit unrealistic as well, but if you've at least been exposed to the concepts at a lower level, it doesn't leave you scratching your head as much in practice when you can't figure out, for example, why your C# code makes a distinction between stack- and heap-allocated structures, and what impact that has on performance. It also means that when security holes are pointed out, or patched, you at least know what the hell is going on, and why it was a big deal to begin with.
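    The stack/heap distinction alluded to at the end is explicit in C, which makes it easy to sketch (a hedged illustration; the function names are invented):

```c
#include <assert.h>
#include <stdlib.h>

/* Stack allocation is essentially a frame-pointer bump, reclaimed
 * for free when the function returns; heap allocation is a call
 * into the allocator and must be released by hand. In C# terms,
 * this is roughly the struct-versus-class split. */
int sum_stack(void)
{
    int a[4] = {1, 2, 3, 4};        /* lives in this frame only */
    return a[0] + a[1] + a[2] + a[3];
}

int sum_heap(void)
{
    int *a = malloc(4 * sizeof *a); /* lives until free()       */
    if (a == NULL) return -1;
    for (int i = 0; i < 4; i++)
        a[i] = i + 1;
    int s = a[0] + a[1] + a[2] + a[3];
    free(a);
    return s;
}
```

    Managed languages hide the second pattern behind the garbage collector, which is exactly why a programmer who has never seen the C version struggles to reason about its performance cost.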
  • Re:Java == Jobs (Score:2, Insightful)

    by Lodragandraoidh ( 639696 ) on Tuesday January 08, 2008 @10:19AM (#21953390) Journal
    This is exactly what I've been railing about for years.

    The first PC generation (those kids who first wrote programs in the late 70's and 80's) were blessed with having to learn all of this low level stuff to make their systems sing on the limited resources at the time.

    Since then, students have been getting farther and farther from the key understandings that would make them excellent regardless of the language they use in the future. The best programmers understand the underlying implementation so they can leverage it.

    Now industry is calling for more developers who can do concurrent applications - but how can you expect them to understand and build a system that avoids race conditions, when they don't even understand pointers?
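    The race condition worry above is the classic shared-counter example, sketched here in C with POSIX threads (a hedged illustration; the names and iteration count are invented):

```c
#include <pthread.h>

/* Two threads incrementing one counter through shared memory.
 * counter++ is a read-modify-write; without the mutex the two
 * threads' sequences interleave and increments are silently lost. */
enum { ITERS = 100000 };
static long counter;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < ITERS; i++) {
        pthread_mutex_lock(&lock);   /* remove this pair and the  */
        counter++;                   /* final count becomes       */
        pthread_mutex_unlock(&lock); /* anybody's guess           */
    }
    return NULL;
}

long run_two_workers(void)
{
    pthread_t t1, t2;
    counter = 0;
    pthread_create(&t1, NULL, worker, NULL);
    pthread_create(&t2, NULL, worker, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    return counter;
}
```

    A student who has never traced a pointer to shared storage has no mental model for why the unlocked version fails only sometimes, which is the point being made above.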

    The quality of software will continue to suffer until universities get their act together and standardize on systems programming: assembly, C/C++, and scripting (shell, Perl, awk, sed, Python).
  • by forgotten_my_nick ( 802929 ) on Tuesday January 08, 2008 @10:22AM (#21953408)
    Nearly every head hunter I have met has no clue what they are talking about when it comes to technology. They have a few fancy buzzwords and believe what they read in the local computer mag. They also try to inflate your CV to the customer they are selling you to.

    A bit like your post. I mean comments like "How can someone understand the Linux kernel without C & C++?". Why would anyone need to understand the Linux kernel, or C++, unless they were a Linux kernel programmer? Even so, we have these things called APIs, which allow developers to write against a kernel without having to know how the kernel works internally, or whether it is even written in C++.

    Also, your comments on Java are laughable. Java is in use in day-to-day objects in real life - phones, set-top boxes, cars. It is not just web servers. There are even operating systems and emulators written in Java which run fine. Go read up on JIT and stop living in the 90's.

    "In my experience, 50% could not work out how to develop a data structure for a bitmap that used palettes."

    And why would they? Unless they were looking for a job that does that. Seriously, if I was in an interview and they came out with the stuff you posted I would laugh at them.

  • by AndersOSU ( 873247 ) on Tuesday January 08, 2008 @10:26AM (#21953470)
    OK, I'm not a programmer but a mechanical engineer, and I know precious little about programming, but I find articles like this very interesting. I was reading your post, and I stopped dead in my tracks here:

    I think the authors are WRONG in saying that a university should be focused on the academic
    People don't go to the university to get real world skills, people go to the university to understand the foundations of their field, so that they can adapt their fundamental understanding to solving the problem at hand. People get certificates to learn routine use of employment friendly tools.

    As I said, I'm not a programmer. I could (if I had to) model the frequency response of a simple mechanical system to a range of perturbations by hand. The chance that I'd have to do that in the course of my professional employment is so slim as to be laughable. Yet the fact that I could do this (if I really had to) tells me that I don't want to put an eccentric load on a rotating shaft without a lot of careful consideration. Now, if I sort of knew this was a bad thing but didn't really understand why, I might do something silly like put only a single U-joint in a shaft. After all, it provides flexibility, and as long as the shaft is straight there is no eccentricity. The problem occurs when there is a deflection; then your single U-joint transmits a nasty sinusoid downstream. If you do that, things tend to break.

    Now, I agree that the university should have some courses focused on things that practicing professionals in the field use. I could draw a part by hand (if I really had to), but if I'd never seen CAD before, I'd be at a serious disadvantage if I ever wanted to be a machine designer. However, fundamentally, a university is an academic institution. The suggestion that it should be an employment mill would severely compromise our education system.
  • by joel.neely ( 165789 ) on Tuesday January 08, 2008 @10:28AM (#21953492)
    ... and so self-defeating.

    There are many dynamics in (American) programming; some are real problems, and some are changes that the "old guard" finds uncomfortable. Nobody here (including me) wants to read a 30-minute rant, so let me just mention a few high spots.

    1. There are forces in almost every industry that seek to dumb down the role of the worker so that jobs can be commoditized or outsourced.
    2. Any institution that caters to the forces of #1 is acting like a trade school offering training, rather than a university offering education.
    3. There are many fine companies and schools that do not cave in to the forces of #1.
    4. No curriculum that teaches just one language will produce well-rounded programmers, regardless of what that language is. Any well-rounded programmer knows multiple languages, and is capable of learning more.
    5. The decline in mathematical competence of American college students has been widely discussed. Nobody with his head on straight believes that Java (or any other programming language) caused it. (The same can be said of grammatical competence.)
    6. Overemphasis on GUI toys is a fault of the curriculum and/or the teacher, not of the language. On the planet where I live, Java is more likely to be found on the server than in the browser.
    7. It is possible to have a long, productive career as a programmer without ever writing an operating system, or picking up a soldering iron.
    8. It is possible to learn a great deal about the foundations of programming by writing an operating system, or picking up a soldering iron.
    9. Conclusion from #7 and #8: there's more than one way to do IT.
    10. Anybody who thinks that application servers, messaging middleware, and compilers are not "systems programming" is still living in the 1960's. Nobody who knows the current industrial software world can be ignorant of the fact that Java is one of the languages in which these are being written. It is also not the only one (see #4 and #9).


    There are also the traditional inter-generational conflicts:
    • Every generation laments the fact that the following generations don't have to do/learn things "the hard way, like we did". Cars no longer require hand-cranking, manual choke, manual spark advance, or manual transmissions. Today's automotive technology makes it possible to be a good driver without knowing how to use any one of those bits of historical infrastructure. (It also means that we are more dependent on professional mechanics when our cars break down! ;-) Modern type-safe languages allow very sophisticated programs to be written without ever directly manipulating a memory address masquerading as a "pointer".
    • Every generation runs the risk of losing important lessons learned the hard way by preceding generations. Merely playing video games doesn't prepare one for a career in a high-tech world (as if anybody thought it would! ;-). How many Viet Nam wars or Great Depressions do we need before we learn to do better? ("How many roads must a man walk down before you can call him a man?") OO is about 45 years old (going back to Simula in the early 1960s). Functional programming is about 30 years old (going back to the mid 1970s). Our industry hasn't yet fully learned how to do either one correctly.


    Progress is made by human beings, not by tools.

    Or, as Shakespeare might have put it: "The fault, dear Brutus, is not in our programming languages, but in ourselves..." Julius Caesar (I, ii, 140-141), 2008 edition
  • Re:tasty (Score:5, Insightful)

    by SQLGuru ( 980662 ) on Tuesday January 08, 2008 @10:29AM (#21953506) Homepage Journal
    My biggest gripe with my college experience (graduated 1994 - BS in CS) was that even though they were teaching the "solid" languages, they still didn't really teach me what I needed to know in order to do the job I do today (DB app development). Sure, things like algorithms and data structures had some low-level fundamental use, but they didn't teach me how to develop a SYSTEM... most of my "projects" were simple "take some input, produce some output" programs.

    Most of my professors had no real world experience, either. So, teaching things like team dynamics and working within a project schedule were really beyond their expertise. Granted, I've been quite successful, but I attribute most of that to my abilities, not what I learned in college. College just got me a piece of paper that opened the door.

    I don't think the problem is with the languages being taught, but in the lack of true engineering being taught. This is true of any of the programming related fields (CS, MIS, SE). All of them need these skills.

    Layne
  • Re:Java == Jobs (Score:5, Insightful)

    by Phleg ( 523632 ) <stephen AT touset DOT org> on Tuesday January 08, 2008 @10:37AM (#21953628)

    I have been a bit worried about an outdated skillset as lots of employers ask for lots of object oriented programming experience and I only occasionally use this.
    What's stopping you from learning a language on your own, for fun? If you think employers won't care about non-professional experience, you're either simply wrong, or working for employers who hire crappy programmers. Where I work, we'll likely throw your resume in the trash if it doesn't have Haskell, Lisp, Ruby, Python, OCaml, Scheme, Scala, or some other obscure, clearly self-taught language on it.

    I think this would be my primary problem if I started looking for a new job. I also think it's a bit unfair as the skills are pretty transferrable - there's only a little new theory to learn and after that, good programming practices aren't hugely dependant on language used.
    You're far off here. This seems to be a prevailing thought, but it just doesn't bear out in practice.

    As code gets more complex, the best way to keep it understandable to others is to follow common language idioms, indentation / code formatting practices, and use built-ins in the standard libraries. These alone often take months to become familiar with, but that's only half of it. The other half I can only describe as trying to approach problems from the unique perspective of the language. Any asshole can jump from Java to Ruby, or from C++ to Lisp, or from VB.NET to Scala. But learning how to solve problems using those languages' strengths, rather than writing code as you would in the language you're coming from, is crucial.

    From my own experience, Java programmers coming fresh into Ruby don't use blocks. When you finally convince them to use blocks for enumerators, they miss the point entirely and simply use each_with_index for everything, rather than more powerful methods from functional programming like map. They also don't like to reopen classes. In Ruby, classes can be added to at will, so if you want a method to calculate the average value of an Array, you can simply define it as a new method on the class. But Java programmers will create a Util module, throw a method in there that takes an Array, and think nothing more. It's not wrong, per se, but it's ignoring Ruby's strengths, and simply writing Java code inside the Ruby interpreter. And the people who do this are bloody useless.

    My rant is getting long, but the main point is this: learning syntax for a new language is easy. Learning to use that language properly (much as a screw is used differently than a nail) is crucial to being able to work with other people, and getting anything meaningful done.

  • Re:tasty (Score:5, Insightful)

    by pyite ( 140350 ) on Tuesday January 08, 2008 @10:38AM (#21953636)
    So, teaching things like team dynamics and working within a project schedule were really beyond their expertise.

    And also beyond the scope of computer science. If that's what you wanted, you should have specialized in software engineering. People keep forgetting that computer science classes should feel more like math classes than engineering or management classes. One look at TAoCP would hint at that. For the record, I'm an engineer and I find the pseudo-engineering that most CS programs push out to be highly disturbing. Either do it right and call it software engineering, or remove the non-CS stuff and call it computer science. If you're not gonna do either aggressively, give it a fake major name like "Information Technology" or "Management of Information Systems" and teach a bunch of stuff really poorly.

  • Re:Java == Jobs (Score:4, Insightful)

    by SatanicPuppy ( 611928 ) * <SatanicpuppyNO@SPAMgmail.com> on Tuesday January 08, 2008 @11:01AM (#21953972) Journal
    They taught mostly Java where I went to school, though "taught" isn't really right. They didn't "teach" by languages at all, it's just that the programming projects mostly tended to be in Java, unless the class concepts were better suited to something else.

    Mostly the classes were theory, concepts; stuff that applied equally to all languages. It was weird in some ways; I had a networking class that assigned the eternal "create a server/client chat program" project, where part of the project was a Java GUI. At this point, I'd never programmed a GUI, and neither had anyone else I talked to. The response of the TA (who was the only one who'd ever give programming advice, because the professor only dealt in theory), was that GUI design was beyond the scope of the class and we'd just have to figure it out.

    My method for figuring it out involved downloading a Java editor and using its GUI design tools. It was the first time I'd used a graphical editor for Java; we were encouraged to do the work in vi or Emacs, and generally, that's all we did.

    Now I hated that crap at the time, but nothing has prepared me better for my day to day life than having projects dumped on me where I had to goddamn well use my initiative and figure it out. Over and over again, I was forced to go out and read and work out for myself how to translate the theory into code. These days, I program in Java about 20% of the time. I'd hardly say it stunted my abilities, and it certainly didn't make me into a cookie cutter corporate programmer.

    I'd have to say that specifically teaching any language is a problem. They all come in and out of fashion. I work with a guy whose mind is stuck in Visual Basic... and I don't mean .NET. The idea that you should focus on intellectually sexy teaching languages (like goddamn SCHEME) because it builds character or some crap... I just don't buy it. Language is a tool, and should be treated as such.
  • by WED Fan ( 911325 ) <akahige@tras[ ]il.net ['hma' in gap]> on Tuesday January 08, 2008 @11:20AM (#21954228) Homepage Journal

    I think there needs to be a distinction between learning concepts and the tools you use to learn them.

    University is not where you go to learn a specific set of skills. If you want that, you go to a technical trade school.

    University is where you go to get an in-depth set of concepts, critical thinking skills, research skills, and theory foundations. This is true for any major you wish to approach. In the CS department, there is a reason you take different languages, some are for system development, some are for app development, some are for theory exploration with little to no value outside of the educational environment. Java falls into one of those categories. Assembly, C, C++ fall into others. Ada falls into yet another.

    Think of it in terms of the English major, you know, those dime-a-dozen students who will end up working at Burger King and Mr. Chow's Empire Chinese Buffet, or who go to Hollywood to work as waitresses while they wait for their big break. The English major takes a load of literature, English, American, Russian, Manga, and poetry from Bacon/Shakespeare to Ginsberg to Hughes to Tupac, and writing from haiku to freestyle with a goofy footed pentameter (trademark and patent pending). None of this is particularly helpful to someone who wants to come out of school with business writing skills.

    Remember, in University, some of the most mistaken ideas come from the professors.

  • Re:Java == Jobs (Score:2, Insightful)

    by raz0 ( 899158 ) on Tuesday January 08, 2008 @11:25AM (#21954296) Homepage
    Reopening a class certainly has its drawbacks too, which is why I mainly try to avoid it. There's a good reason why almost all the core Java classes are final.
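    The parent's claim is easy to check with reflection: the value-like core classes (String, the boxed numerics) are declared final and so can't be subclassed or "reopened," while the collection classes are deliberately left open for extension. A minimal sketch:

    ```java
    import java.lang.reflect.Modifier;
    import java.util.ArrayList;

    public class FinalCheck {
        public static void main(String[] args) {
            // String and Integer are final: no subclassing, no reopening.
            System.out.println(Modifier.isFinal(String.class.getModifiers()));    // true
            System.out.println(Modifier.isFinal(Integer.class.getModifiers()));   // true
            // ArrayList, by contrast, is designed to be extended.
            System.out.println(Modifier.isFinal(ArrayList.class.getModifiers())); // false
        }
    }
    ```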
  • Re:tasty (Score:5, Insightful)

    by Kadin2048 ( 468275 ) * <slashdot.kadin@xox y . net> on Tuesday January 08, 2008 @11:40AM (#21954534) Homepage Journal
    I'd offer only one correction:
    anyone with a CS degree and half a brain should be able to pick up new languages within a very short amount of time.

    There are unfortunately a great many universities turning out a great many low-quality "computer science" grads who don't know the first thing about programming, much less the intricacies of stacks and pointers in C. I've met some alleged CS grads who didn't know a compiler from a hole in the ground.

    I think the problem may be improving from how it was a few years back (the dot-com bust knocked CS off the lists of many students just looking for an easy $80k paycheck on graduation), but there are still a lot of dolts around, devaluing the degree.
  • Re:tasty (Score:1, Insightful)

    by Anonymous Coward on Tuesday January 08, 2008 @11:41AM (#21954558)
    There are no "fake majors," only fake programs. The strength of a graduate is largely determined by two factors: quality of student and quality of program. A strong student can excel with a mediocre program; it will just take more preparation outside of the classroom. An average student can do well with a great program; he'll just lower the average some and won't be as good as other graduates of the program. An inferior program at one school does not mean the degree is worthless, except from that one school. To assume that "software engineering" as a major will always prepare a student for real-world business programming better than CS or Information Systems would is a fallacy. The strength of a program and its curriculum must be examined, as well as the specific requirements of the "real-world" position. For example, if low-level programming is required, perhaps a CS degree is better, especially for something like an operating system. Most organizations don't require OS programmers; they want someone who understands business processes and controls. This doesn't happen in most CS programs.
  • by FatSean ( 18753 ) on Tuesday January 08, 2008 @11:44AM (#21954590) Homepage Journal
    If you can't figure out concurrency from theory and pseudo code, you're a hack. You must first understand the theory, then worry about how to implement it. Besides, Java has so many tools and support systems for concurrency that you'd have to be a lunatic to try to develop that in C++. Hardware is cheap, talent is not.
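    As a rough illustration of the "tools and support systems" the parent means, here is a small sketch using java.util.concurrent: a fixed thread pool plus an AtomicInteger make a shared accumulator safe without any hand-rolled locking (illustrative only; names like `parallelSum` are my own, not from the thread).

    ```java
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicInteger;

    public class ConcurrencyDemo {
        // Sum 1..n across a thread pool; AtomicInteger makes the shared
        // accumulator thread-safe without explicit synchronization.
        static int parallelSum(int n) throws InterruptedException {
            ExecutorService pool = Executors.newFixedThreadPool(4);
            final AtomicInteger sum = new AtomicInteger();
            for (int i = 1; i <= n; i++) {
                final int k = i;
                pool.execute(new Runnable() {
                    public void run() { sum.addAndGet(k); }
                });
            }
            pool.shutdown();                             // no new tasks accepted
            pool.awaitTermination(10, TimeUnit.SECONDS); // wait for queued tasks
            return sum.get();
        }

        public static void main(String[] args) throws InterruptedException {
            System.out.println(parallelSum(100)); // prints 5050
        }
    }
    ```

    The same exercise in C++ (circa 2008) would mean picking a threading library, writing the pool, and guarding the counter by hand, which is the parent's point.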

  • by autophile ( 640621 ) on Tuesday January 08, 2008 @11:46AM (#21954630)

    Naive about the purpose of C, anyway. C was never designed to prevent you from shooting yourself in the foot. Writing C requires you to think, which is sadly out of vogue these days, as you point out later.

    I know, it's insane nowadays! Why, just the other day I was just remarking to my barefoot wife, who was cleaning our clothes by beating them against the rocks in the river, and then rolling them through a wringer, that all these fancyboys with their pushbutton machines and laundry powder just don't value godly, manual labor anymore! Then I went to my factory job where the machines have no safety features, so they really require you to think, which is sadly out of vogue these days with OSHA and safety goggles and whatnots.

    The shame!

    --Rob

  • by stryder100 ( 1215372 ) on Tuesday January 08, 2008 @12:09PM (#21954984)
    Good point - C was invented primarily not to make assembly language safe but to make it portable. That being said, to use it as an application development tool is not a good idea, even if the developers are "real" programmers. Things like checking for buffer overwrites, null pointers, etc. should definitely be automated, like it is in C++. Otherwise you're reinventing the wheel at every turn.
  • by jandersen ( 462034 ) on Tuesday January 08, 2008 @12:18PM (#21955114)

    Give me a break
    Sure - where do you want it? But of course, when I boldly claim that you don't need to learn discipline in Java, I am oversimplifying in the hope that the reader is able to fill in the obvious gap. After all, discipline is required for anything you do, to some extent; but in C you need so much more of it.

    I am not saying that Java is bad or that Java programmers are bad, but there is still some serious value in understanding how a computer works right down there at the metal, and C is a lot closer to the bare metal; and that is why it is important for CS students to learn to use C well, even if they will forever more use Java, C# or whatever. When you study computer science, you study to become a Computer Scientist, which requires a lot more than the ability to program - hence the word 'Scientist'. Computer Scientists are, as far as I know, required to be able to write things like operating systems from scratch and that kind of thing, and you won't choose Java for that if you want it to go fast (not quite true, I know - Nixdorf made a series of servers once, whose OS was written almost entirely in interpreted BASIC).

    Now, as for the 80% of errors caused by pointers - it is perfectly possible to learn a coding discipline that avoids them. This should happen automatically with experience; after a while you learn to always initialize buffers - whether they are pointers, strings, ints or anything else - when you define them.
  • by Abcd1234 ( 188840 ) on Tuesday January 08, 2008 @12:20PM (#21955142) Homepage
    Gah, you're completely and utterly missing the point. This isn't about friggin' languages, and it never was. It's about programming paradigms. My training in computing science took us through the usual host of languages: C/C++/Java, Assembler, Prolog, and of course Lisp. Plus SQL and, if you were so interested, Smalltalk. Was the point, here, to teach languages? Of course not! The purpose was to teach traditional imperative, functional and declarative programming, in procedural, functional, and object-oriented styles, all the while exposing us to high-level software concepts while giving us a grounding in the underlying details so we can grasp how things work down at the hardware level. And *that* experience is vital for producing well-rounded software developers who understand a variety of techniques and can apply the appropriate tools for the job.

    So, sure, if you could find a language that supports all those programming paradigms, and allows one to expose students to both low- and high-level programming concepts, then by all means, teach it from start to finish. Because you're absolutely right, it isn't really about the language. But, that said, the language is there to best express one's ideas. And I'd be *very* surprised if you could find a language that encapsulated the full breadth and depth of programming approaches while not being a pathetic, lowest-common-denominator expression of those concepts.

    Incidentally, the real irony in all this is that languages like Java and C# are taking on more and more functional and declarative features (though, again, only a poor man's version of them). Just look at C#: suddenly, they have lambdas, and soon it'll have a built in query language. Bang, suddenly we have declarative, functional, and procedural programming models all jammed into the same syntax. And the brutal, bitter truth is that most programmers using C# probably have *no idea* how to best use these tools to solve problems.
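    To make the paradigm contrast concrete (my own sketch, not from the parent post): the same computation can be written imperatively, with an explicit loop and accumulator, or in a poor man's functional style, where Java's anonymous classes stand in for the lambdas that languages like C# were just then acquiring.

    ```java
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    public class Paradigms {
        // A minimal function interface, standing in for first-class functions.
        interface Fn<A, B> { B apply(A a); }

        // A generic map: the loop is written once; callers state *what*, not *how*.
        static <A, B> List<B> map(List<A> xs, Fn<A, B> f) {
            List<B> out = new ArrayList<B>();
            for (A x : xs) out.add(f.apply(x));
            return out;
        }

        public static void main(String[] args) {
            List<Integer> nums = Arrays.asList(1, 2, 3);

            // Imperative style: explicit loop, explicit accumulator.
            List<Integer> squaresLoop = new ArrayList<Integer>();
            for (int n : nums) squaresLoop.add(n * n);

            // Functional style: the iteration is hidden inside map.
            List<Integer> squaresMap = map(nums, new Fn<Integer, Integer>() {
                public Integer apply(Integer n) { return n * n; }
            });

            System.out.println(squaresLoop.equals(squaresMap)); // prints "true"
        }
    }
    ```

    The verbosity of the anonymous class is exactly the "poor man's version" the parent describes: the concept is expressible, but the language fights you.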
  • Re:tasty (Score:4, Insightful)

    by cayenne8 ( 626475 ) on Tuesday January 08, 2008 @12:21PM (#21955152) Homepage Journal
    "I think it would be a very bad move for universities to cater to the corporate world. If you want to just learn programming, get some certs or buy a book. If you want an education, go to the university."

    But, the whole reason to GO to a University, is to get the skills/education to make more money when finished, than you would have if you had not gone.

    College is a means to an end....and while it is nice to learn other things to be a bit well rounded, that is extra fluff if you have the time and money for it while there, but, don't forget the real reason for going.

    If people could make good $$ without college, I doubt you'd see so many people trying to go....

    A degree gets you in the door for a job....regardless of what it is in often...you have to have one these days to get a good job.

  • Re:tasty (Score:5, Insightful)

    by CaptainPinko ( 753849 ) on Tuesday January 08, 2008 @12:42PM (#21955444)
    Sorry, but universities are meant for education, not job training. The fact that jobs are the main reason people go is sadly just another sign of the times. If you look at the long history of universities you'd realise that they have focused on research and the arts. Even the sciences were so abstract that they didn't have a use at the time and for many years to come. Please, let's stop perverting universities for industry. If universities don't produce well-rounded, educated people, where will they come from?
  • Re:tasty (Score:5, Insightful)

    by Stamen ( 745223 ) on Tuesday January 08, 2008 @12:51PM (#21955606)
    Sad commentary; unfortunately you are correct, this is the view of many people.

    I think anyone who is spending 4 to 6 years getting a degree in computer science only to get a high-paying job when they get out is a tad silly. They are really, really wasting their time. They can get an intern job right now at a software consulting company and study their ass off (as we all have to do in this field). Within a year they will be making decent money; within 3 years, really good money. 4 years later, when the person has their shiny degree, after studying Java (which probably won't even be used then), they get the joy of getting a junior developer's job.

    There is an old adage: "How do you become a writer?" "Write... a lot." It is the same with programming. You can't fake your skills, and a PhD in CS won't matter if you can't bill your clients because your application doesn't fulfill requirements or even work.

    Truthfully if all you care about is money, work in finance, or become a salesperson. The best developer in the world won't compete with a high end salesperson dollar for dollar; hell CEOs can't compete with top salespeople. Zero education required.

    I very much value a university education, but it has nothing to do with making more money. Learn, create, become a very educated person; the money will follow; the money part really isn't that hard.

  • Re:tasty (Score:5, Insightful)

    by Thomas M Hughes ( 463951 ) on Tuesday January 08, 2008 @01:14PM (#21955912)

    But, the whole reason to GO to a University, is to get the skills/education to make more money when finished, than you would have if you had not gone.

    College is a means to an end....and while it is nice to learn other things to be a bit well rounded, that is extra fluff if you have the time and money for it while there, but, don't forget the real reason for going.

    If people could make good $$ without college, I doubt you'd see so many people trying to go....

    A degree gets you in the door for a job....regardless of what it is in often...you have to have one these days to get a good job.
    I want to preface my comments by saying that a lot of people have a similar mindset as you do. It's highly prevalent in the United States at this point. So, it certainly isn't your own personal shortcoming for thinking like this, it's a larger societal problem.

    For a moment, put the reason why YOU go to a University to the side and consider what the purpose of the University is. It's an institution that's literally thousands of years old, dating back to the old Greek institutions of education. When Plato and Aristotle founded their schools, they didn't put up a big sign that said "When you're done, you get more money." That wasn't the promise. The promise was that by teaching you about the world, you would become a better person. That is to say, the founding concept of the University was that education led to human excellence. And, for the Greeks especially, human excellence was not directly related to the possession of wealth.

    This understanding of education was dominant up until very recently. Everyone was required to learn Greek and Latin, so they could read Homer and Plato. Reading Homer isn't going to get you a job, it's not going to get you a promotion, it's not going to get you an interview, and it's not going to get you laid this Friday. No one at the University used to make the claim that it would. They'd claim that reading Homer made you a better person, even if it doesn't get you a job.

    Now, as to why YOU should go to a University? If you're going for the purpose of getting a job, you're not going to understand the vast majority of your classes at the University. You're going to be wondering "Why do I have to take this anthropology class?" or "I have no interest in Operating Systems, why do I need this Operating Systems class?" and "Why do I need a foreign language, I'm going to be working with code all day." All these questions miss the larger point of what the University is trying to do to you. And if you're missing the point of the entire institution, it's exceptionally difficult to do well there.

    The whole thing is really just the result of multiple generations of corruption, I think. Employers realized that well-rounded, educated (dare I say, excellent) human beings are better for the health of a company. So they pay more for people who are excellent, and a University degree used to be a short-hand of some form of excellence. The masses of uneducated began to realize this, and started saying to their kids "If you want a good job, you need a degree." So their kids started going to the University, thinking the point was to make money. Professors, having tenure, just did what they were going to do anyway, but now we've gone two or three generations like this. We're reaching the point where current professors went to school thinking it was for money. We have boards of Universities with pressure from the state to focus less on the goal of education for excellence and more on the goal of education for job skills.
  • Re:tasty (Score:3, Insightful)

    by engwar ( 521117 ) on Tuesday January 08, 2008 @02:13PM (#21956994)

    Universities certainly do both educate and prepare people for jobs but I'd guess that the main reason people go to universities is to get a good job and make more money than they would otherwise.

    Let's imagine for a moment that going to a university would have ABSOLUTELY NO IMPACT on the kind of job you get or money you can make and that the ONLY thing anyone could get out of it was to be a more rounded and educated person.

    Do you really think that people would pay the kind of money that is required to go to a university if this were the case?

    The vast majority of people go to college to end up with a better job.

  • Re:Java == Jobs (Score:5, Insightful)

    by raddan ( 519638 ) on Tuesday January 08, 2008 @02:21PM (#21957130)
    Sorry, that's just not the case. OO is just a formalization of what was already happening with good procedural programmers. OO is not fundamentally different than procedural programming-- it is a superset. OO languages force the programmer to do certain things: code modularity, polymorphism, typedefs/classes, etc., and do so in a way that encourages a programmer NOT to come up with their own system to do the same thing.

    If you look at developers who spend a lot of time doing things in C (e.g., the OpenBSD developers-- have a look at their repository [openbsd.org]), you'll see that they are keenly aware of "object-oriented" design principles. They also tend to know exactly when things like byte alignment are an issue, and when you really should just use a void pointer, because they are forced to think about their machines. Most OO programmers I know have no idea why they would need OO language features-- they just use them because that's what they've been taught-- and they know next to nothing about the machines themselves. I would argue that a good programmer is a good programmer; and if they have standard procedural programming experience, that will nicely complement their future OO work.

    GP is right-- OO is simply a design philosophy. The actual mechanics of building an application are no different.
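    The "formalization" point can be sketched in a few lines (my own illustration, not the parent's): virtual dispatch through an interface is exactly the tag-plus-switch routing that careful procedural programmers were already writing by hand.

    ```java
    public class Dispatch {
        // OO style: the language routes the call to the right implementation.
        interface Shape { double area(); }
        static class Circle implements Shape {
            double r; Circle(double r) { this.r = r; }
            public double area() { return Math.PI * r * r; }
        }
        static class Square implements Shape {
            double s; Square(double s) { this.s = s; }
            public double area() { return s * s; }
        }

        // Procedural equivalent: a type tag plus a switch -- the
        // hand-rolled dispatch that OO languages formalize for you.
        static final int CIRCLE = 0, SQUARE = 1;
        static double area(int tag, double dim) {
            switch (tag) {
                case CIRCLE: return Math.PI * dim * dim;
                case SQUARE: return dim * dim;
                default: throw new IllegalArgumentException("unknown tag");
            }
        }

        public static void main(String[] args) {
            Shape c = new Circle(2.0);
            // Both styles compute the same thing; only the routing differs.
            System.out.println(c.area() == area(CIRCLE, 2.0)); // prints "true"
        }
    }
    ```

    The OO version wins on extensibility (adding a shape touches one class, not every switch), which is the design discipline the C programmers above enforce by convention.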
  • Re:tasty (Score:5, Insightful)

    by Radres ( 776901 ) on Tuesday January 08, 2008 @02:30PM (#21957288)
    I don't think the OP's point was that learning those languages would directly put the student in a better position to get a job after college. I think the point was that learning those languages would help the student to understand more about computer science.
  • by curri ( 107175 ) on Tuesday January 08, 2008 @02:48PM (#21957546) Homepage
    You may not find some languages (Scheme, Haskell) practical, or useful for getting a job; however, they help you understand certain concepts a lot better. If you really learn Scheme, you'll understand recursion, and will also get an appreciation for syntax (since scheme has none :); Haskell will teach you typing (templates etc) and lazy evaluation. Of course, you *could* learn those things in C++ or Java, but the concepts will be very unnatural, so chances are you won't really grok them. After you've learned the concepts, then it is relatively easy to apply them in a different language.

  • by rjh ( 40933 ) <rjh@sixdemonbag.org> on Tuesday January 08, 2008 @02:50PM (#21957590)
    I hold a Master's degree in Computer Science with a focus in information security. I have also worked in the private sector in a variety of IT jobs, so don't think I'm some propellerheaded academic. I have also taught programming courses at the university level. I call shenanigans on your entire argument.

    I am a headhunter for high end roles at investment banks, and we are close to classifying CompSci as a "non degree", along with media studies, languages, etc

    So, what, you're going to hire math geeks only? People with degrees in mathematics or operations research, or perhaps some of the hard sciences? In my own experience, while there are some non-CS degrees that are excellent preparation for a CS career, only a CS degree is a CS degree. It is lamentable that some schools have embraced the trade-school mentality, but many more have not. When I was teaching courses as a graduate student (just a couple of years ago), the curriculum began with Java and quickly shifted to Haskell. A neighboring institution still uses Ada as an undergraduate language. There's also a legion of Knights of the Lambda Calculus who are trying to get Scheme reintroduced to the undergraduate curricula in several institutions in the area. Intellectual diversity about languages is alive and well in the academy, based on the institutions I've seen up close and personal.

    Also, who is this "we"? You and someone else who shares your prejudices? Or is this you and the senior engineering staff? If you're about to decree CS as a non degree, maybe you should get the input of the people who will be most brutally affected by your shortsightedness.

    Java is fine for teaching design patterns, and classical algorithms like Quicksort, or binary search.
    But you can't do operating systems

    So glad to know that you think design patterns and classic algorithms are worth studying.

    Look, pick up a copy of Cormen, Leiserson, Rivest and Stein's Algorithms textbook sometime. That's the definitive work on algorithms--if you need an algorithm, it's probably sitting in CLRS somewhere, along with some beautiful mathematical exposition about it. Every algorithm listed in the book can be trivially converted into Java. So why the hate for teaching CS with Java? It's a perfectly sensible language for many very important parts of CS.

    Further, I've taught operating system design in Python. Yes, Python. When talking about how a translation lookaside buffer works, I don't write C code on the board. I write pseudocode in Python and say "so, this is how it looks from twenty thousand feet." On those rare occasions when we have to get down and dirty with the bare metal, then it's time to break out C--and we leave C behind as soon as possible. I want students to be focused on the ideas of translation lookaside buffers, not the arcane minutiae of implementations.

    After all. Implementing it is their homework, and it involves hacking up the Minix code. In C.
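    In the same twenty-thousand-foot spirit the parent describes (this is my own toy sketch, not the parent's course material, and it's in Java rather than Python), a TLB is just a small, bounded cache from virtual page numbers to physical frames with LRU eviction:

    ```java
    import java.util.LinkedHashMap;
    import java.util.Map;

    // Toy translation lookaside buffer: a bounded virtual-page -> frame
    // cache with least-recently-used eviction. Illustrative only; a real
    // MMU does this in hardware, per process, with permission bits.
    public class Tlb {
        private final int capacity;
        private final LinkedHashMap<Integer, Integer> entries;

        Tlb(final int capacity) {
            this.capacity = capacity;
            // accessOrder=true makes iteration order LRU; overriding
            // removeEldestEntry turns the map into a bounded cache.
            this.entries = new LinkedHashMap<Integer, Integer>(capacity, 0.75f, true) {
                protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                    return size() > capacity;
                }
            };
        }

        // Returns the cached frame, or null on a TLB miss (which would
        // trigger a page-table walk in a real memory system).
        Integer lookup(int virtualPage) { return entries.get(virtualPage); }

        void insert(int virtualPage, int frame) { entries.put(virtualPage, frame); }

        public static void main(String[] args) {
            Tlb tlb = new Tlb(2);
            tlb.insert(0, 100);
            tlb.insert(1, 101);
            System.out.println(tlb.lookup(0)); // hit: prints 100
            tlb.insert(2, 102);                // evicts page 1 (least recently used)
            System.out.println(tlb.lookup(1)); // miss: prints null
        }
    }
    ```

    The idea survives translation into any language, which is rather the parent's point: the concept is the lesson; C is for the homework.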

    Their reason apparently is that it is "easier".
    I have zero interest in kids who have studied "easy" subjects.

    If it was an easy subject, would changes need to be made to make it easier?

    If it was a spectacularly hard subject with a 50% washout rate, would changes need to be made to make it easier?

    I've been in courses where 50% of the class washed. They were horrible, horrible classes. The pedagogy needed to change. The learning curve needed to be smoothed out and made gentler. This is in no way equivalent to saying it was made easy. The fact you think otherwise brands you as an intellectual elitist who can't be bothered to think logically about his own prejudices.

    A computer "expert" is not someone who knows template metaprogramming in C++, or compiler archaeology in BCPL, or the vagaries of the Windows scheduler.
    It is someone who understands computers at multiple leve

  • Re:Java == Jobs (Score:1, Insightful)

    by magical_mystery_meat ( 1042950 ) on Tuesday January 08, 2008 @03:25PM (#21958270)
    The lack of first-class functions/methods in Java makes functional-style programming unnecessarily difficult.

    Then use a different language that supports your pet features. First-class functions are a procedural hack and don't really belong in a truly OO design in the first place.

    However, one could also argue that a design's "OO-ness" can never reach 100% no matter what language you use for implementation.

    The map is shorter, less cluttered, has less potential for bugs, and ultimately easier to understand once you understand what map and lambda do. Which shouldn't take a long time for a competent programmer.

    I find the second block to read much more clearly, and six months from now if the guy who wrote it got hit by a bus I wouldn't need to make sure the next guy knew what map and lambda did. It makes work really dull to write code like that, but you're not paying people to show off their knowledge of a language's esoteric syntactic sugar. You pay them to get the job done in the most effective way possible, which in the real world, means writing the clearest, most maintainable code possible given the constraints of the language.

  • Re:tasty (Score:1, Insightful)

    by CaptainNerdCave ( 982411 ) on Tuesday January 08, 2008 @03:57PM (#21958832)
    >But, the whole reason to GO to a University, is to get the skills/education to make more money when finished, than you would have if you had not gone.

    Funny, that's what I used to think. Now that I have a couple of degrees, I finally realize that there is more to life than money. THAT is why one gets a bachelor's degree - to learn that there is more to life than just making money.

  • by Anonymous Coward on Tuesday January 08, 2008 @04:36PM (#21959588)
    "Hard to see knocking up forms for data as a real good job, regardless of the employer's industry."

    Unfortunately, it seems most other people agree. Based on my own experience, the user interface for data input is actually quite important - you need to think about how to make it as intuitive and foolproof as possible. Catching categories of errors and warning the user about them with some visual cue is a biggie, for example - or planning for how to scale a form for larger data input sets and still allow some sort of coherence in how it is displayed.

    Of course I'm sure the hope is that input is a relatively rare activity or a one-time-per-problem step, but nonetheless many errors can be caught there in a variety of real-world scenarios (corporate data process collection, for example, where controls are known.)
  • by Wolvey ( 918106 ) on Tuesday January 08, 2008 @07:26PM (#21962346)
    I spent those 4 years programming and not in a university. I've worked alongside people who did spend those 4 years in a university, and some came out with hardly a drop of practical knowledge. While they were reading from a book I was solving real-world problems. The halting problem is a fun exercise in logic, but to say that one must have knowledge of it to realize that writing a compiler in 1 month is unrealistic is... unrealistic.

    Considering the years and (tens of) thousands of dollars you've invested in formal education, I can see why you would want to justify that decision. Personally, the years I spent earning money and gaining experience in the field led me to believe that a degree would have been a less-than-optimal use of my time and money.
  • Re:tasty (Score:3, Insightful)

    by emilng ( 641557 ) on Tuesday January 08, 2008 @09:47PM (#21963872)
    You're lucky that you don't have to take out a student loan and that your parents are paying for your whole education to NYU no less. It's good that it seems like you have a passion in programming. I wonder if it's really necessary for you to be attending such an overpriced school as NYU though. For an undergrad education I think there are probably state schools or other private schools that can offer you a comparable schooling in learning how to program for a much lower price. I think the only advantage you gain going to NYU for undergrad is having the NYC experience, but working here and having grown up here, I'm getting pretty sick of it. Given the cost of your education you have to figure in the amount of time you will have to work to recoup the cost. You should also calculate the cost with compound interest and inflation. I think a better route would be to do some research and transfer to a cheaper school that offers a comparable education and then dish out the big bucks when you feel like you need to go back to grad school.
  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Wednesday January 09, 2008 @02:12AM (#21965532)
    Comment removed based on user account deletion
