Professors Slam Java As "Damaging" To Students 1267

jfmiller calls to our attention two professors emeritus of computer science at New York University who have penned an article titled Computer Science Education: Where Are the Software Engineers of Tomorrow?, in which they berate their university, and others, for not teaching solid languages like C, C++, Lisp, and ADA. The submitter wonders whether any CS students or professors would care to respond. Quoting the article: "The resulting set of skills [from today's educational practices] is insufficient for today's software industry (in particular for safety and security purposes) and, unfortunately, matches well what the outsourcing industry can offer. We are training easily replaceable professionals... Java programming courses did not prepare our students for the first course in systems, much less for more advanced ones. Students found it hard to write programs that did not have a graphic interface, had no feeling for the relationship between the source program and what the hardware would actually do, and (most damaging) did not understand the semantics of pointers at all, which made the use of C in systems programming very challenging."

  • About the Authors (Score:5, Informative)

    by etymxris ( 121288 ) on Tuesday January 08, 2008 @04:32AM (#21951068)
    Gee, wonder why they're praising Ada so much.

    Robert B.K. Dewar, Ph.D., is president of AdaCore and a professor emeritus of computer science at New York University. He has been involved in the design and implementation of Ada since 1980 as a distinguished reviewer, a member of the Ada Rapporteur group, and the chief architect of the GNU Ada Translator (GNAT).

    Edmond Schonberg, Ph.D., is vice-president of AdaCore and a professor emeritus of computer science at New York University. He has been involved in the implementation of Ada since 1981. With Robert Dewar and other collaborators, he created the first validated implementation of Ada83, the first prototype compiler for Ada9X, and the first full implementation of Ada2005.
    Maybe Ada is helpful for learning concurrent programming and safe typing, but I'll wait for the opinion of a slightly less partial party.
  • by koinu ( 472851 ) on Tuesday January 08, 2008 @04:35AM (#21951090)
    The title is bad. Read the text and you will notice that what they claim is "Teaching Java, as the only language, is damaging," and that might be true. I have held this opinion for years, and I am teaching Java to students this semester. There is far more than just Java, and you can only get better the more languages you know.

    By the way, C (yes, the one without "++") is still my favorite language.

  • Biased? (Score:5, Informative)

    by silverhalide ( 584408 ) on Tuesday January 08, 2008 @04:35AM (#21951092)
    This might be obvious, but take a close look at the authors of the article:

    Dr. Robert B.K. Dewar, AdaCore Inc. (President)
    Dr. Edmond Schonberg, AdaCore Inc. (Vice President).

    The article by some weird coincidence slams Java and praises Ada.

    Salt, please...

    PS, Ada is mainly alive in the Military/Aerospace industries where projects can last 20+ years.
  • Re:About the Authors (Score:5, Informative)

    by Col Bat Guano ( 633857 ) on Tuesday January 08, 2008 @05:08AM (#21951294)
    I'm a slightly less partial party. I've taught Ada to first year students, and our school replaced Ada with Java. Some observations are...

          Students found Ada a relatively simple language to start with (if you choose an appropriate subset)
          Java can have more overhead for a beginning student
          Lecturers are often tempted to push a lot of "stuff" in intro subjects
          Java GUI motivates some students to get more involved
          Many of my students regretted that Ada would no longer be taught in first year (having quite enjoyed it)

    No matter what you start with, teaching students to be better programmers takes more than just a language. Each language allows you to teach a specific set of skills, and Ada is not bad for teaching some important SE skills (IMHO).

    I think pointers are overrated as a first year concept, and can wait for later years.
  • by goose-incarnated ( 1145029 ) on Tuesday January 08, 2008 @05:14AM (#21951320) Journal

    C is a lousy beginner's language.

    That depends on what you are trying to teach.

    I started programming in Pascal, and then moved to C/C++. Structured programming, language syntax, variable typing, functions, parameters, recursion, etc I could ALL learn in Pascal.

    Once again, it depends on what you are trying to teach or learn. Pascal (which I like, btw) teaches different things than one would learn in C. In Pascal you are learning a programming language; in C you are very often targeting a specific platform/architecture, and hence get to use and understand things like signals, volatile variables, memory-mapped IO, etc ...

    When I came through, Java was still pretty new, but I did take a Java course and found it reminded me more of Pascal than C/C++; I'd say it's a good starter language.

    For what, exactly? Java (and C#, etc) bring nothing new to the party; everything they offer has already been offered (sometimes decades ago) by different programming languages. The only reason to use them is peer-pressure, in which case the only place to teach them is in vocational colleges, not universities.

    Also, you can easily write command-line apps in Java, so I don't know why they blamed GUI dependency on Java. And as for 'systems programming', well, DUH. Your first language is where you learn the basics of programming; before you start taking systems programming you should also have a lower-level course, ideally in something like assembly language (even if it's just on emulated hardware) or C.
    Like I said above, it all depends on what you are trying to teach; for example, it is possible to teach OO using plain old C, but it is rather painful! In much the same way, the only time you would want to use Java is when your problem maps nicely to an OO architecture (and most problems do not!).

  • by Mr2001 ( 90979 ) on Tuesday January 08, 2008 @05:21AM (#21951360) Homepage Journal

    And because it's like that, you have heap allocations for every non-atomic data type, which is really the opposite of performance. Need a simple int[2] for the pipe(2) syscall (let's just assume one exists, even though Java does not have it)?

    try {
                    int[] fds = new int[2];
                    pipe(fds); /* still check return value */
    } catch (MemoryAllocationErrorOrSo e) { ...

    Why does this need to be so complicated [...]
    It doesn't. You've made it more complicated than it needs to be, by putting in an exception handler. What are you going to do in the unlikely event that there is an exception, anyway - fix it somehow? Free up another 8 bytes of memory to make room? Just remove that try statement, and let the exception be caught by your top level handler.

    And then there is this garbage collector that professors rave about. Does it handle cycles?
    Yes, it does. It's not a reference counter, it's a garbage collector. It collects garbage, i.e. any heap object that can't be reached by following a chain of references from a root reference (like a local variable, a static field, or an instance field of any non-garbage object). A modern GC won't be fooled by two garbage objects holding references to each other.
  • by garphik ( 996984 ) on Tuesday January 08, 2008 @05:22AM (#21951374)
    Software engineering is not just about writing a class or implementing a piece of functionality. It's about a process, and has less to do with which programming language you have been taught. Mostly it has to do with the object-oriented paradigm (C++ and Java, for example). OO has a lot of proven advantages.

    C, Lisp, and ADA are all different types of languages.
    I think the best way to learn programming is not just by mastering one particular language, but by going by the programming principles; just like if you learn to play the guitar, you can easily learn other strumming instruments.

    Languages like assembly and machine code don't require learning; one just needs a manual while programming.
  • by trifish ( 826353 ) on Tuesday January 08, 2008 @08:39AM (#21952436)
    Out of curiosity: how can you check the size of a memory block pointed to by a pointer in C?

    The modern secure variants of strcpy() solve it by requiring an additional parameter where you pass e.g. sizeof(dest).
  • Re:tasty (Score:5, Informative)

    by StarvingSE ( 875139 ) on Tuesday January 08, 2008 @09:13AM (#21952712)
    I think there needs to be a distinction between learning concepts and the tools you use to learn them. I graduated from uni within the past 5 years, and they taught first-year CS students in Java. They used it to teach data structures (i.e. stacks, dictionaries, etc.) and some simple algorithms like sorting. At the same time, they taught us proper object-oriented skills, since that is what the industry is demanding.

    During the second year we took a class that taught C/C++ which basically taught pointers and memory management. In my upper level courses like operating systems and graphics, it was all C and C++ from then on. I think this gave me a pretty well rounded education.

    When I was done, I had used a number of tools (languages) to learn a variety of CS topics, and felt that I was well prepared for the industry.
  • by makomk ( 752139 ) on Tuesday January 08, 2008 @09:14AM (#21952720) Journal
    Except that's not true anymore with modern optimising compilers, and assuming it still holds is a good way to shoot yourself in the foot. (Apparently, gcc does some really interesting things sometimes - for example, see this thread [] and the related lkml thread.)
  • by egomaniac ( 105476 ) on Tuesday January 08, 2008 @09:44AM (#21953006) Homepage
    Well, the whole point is that this exception should not be necessary at all. Just have 2 ints on the stack...

    Of course, because while it's possible to run out of heap space (implying a possible OutOfMemoryError), computers have had infinite stacks since the 1960s. Simply by moving the memory over to the stack, you can't possibly run out anymore! It's brilliant!

    Hint: it's all the same memory. Your desperate desire to have the bytes come out of the "stack" bucket instead of the "heap" bucket is simply irrational.
  • Re:Java == Jobs (Score:3, Informative)

    by timster ( 32400 ) on Tuesday January 08, 2008 @09:54AM (#21953096)
    I had to explain to him how the computer handles two's complement in binary, and how a binary 1 converted to an integer would be -1 and 0 would be 0.

    Maybe I'm not reading this right, but a binary 1 (as in 00000001) interpreted as a two's-complement integer is positive 1, not negative 1. -1 would be 11111111. Maybe this doesn't have much to do with your point, though.
  • Re:Java == Jobs (Score:3, Informative)

    by smallfries ( 601545 ) on Tuesday January 08, 2008 @11:43AM (#21954584) Homepage
    Ouch, three replies and nobody got what he was saying.

    A single binary 1 would be -1 in two's complement. You are talking about bytes, as in 8 bits; he is talking about a single bit, for which the two values are -1 and 0. The key word in the description is single.
  • by TemporalBeing ( 803363 ) <> on Tuesday January 08, 2008 @11:54AM (#21954738) Homepage Journal
    First, some of the best comments in this thread: Comment 1 [] Comment 2 [] Comment 3 [] Comment 4 [] Comment 5 [] Comment 6 []

    I list them because they hold a lot of wisdom, and I wanted to draw special attention to them.

    When I was in college I got really ticked at the level of theory - there was too much of it. It wasn't balanced well enough with implementation; and as I looked around, I noticed that was pretty commonplace among academic institutions (colleges AND universities - and I'm not talking about trade schools either). That was before they moved their curriculum to using Java for the first couple of classes; after they did, I heard stories about the upper-level classes getting some of these "new" students and not being able to focus on the class material, because they had to teach these students C/C++ first, and the students had a harder time getting it. (Not so the other way around.)

    That said, I've started thinking about how I would put together a curriculum for teaching computer programming/science/engineering. (I'm not talking about computer _hardware_ engineering, btw.) I even did some tutoring after college. So what would I do?

    I'd start students with a language that can be used to teach the real basic skills and concepts (variables, functions, etc.) - even vbscript could be used at this level; but I'd also quickly move them on to more advanced concepts (in the case of vbscript, it would only be used for a couple of weeks at most), moving from language to language to bring not only a depth of concepts and understanding, but also a breadth of computer languages and kinds of tasks. I'd also ensure that somewhere in the curriculum students would be exposed to Assembly; I've found that even a small exposure makes a big difference in programming styles and philosophies.

    Furthermore, I'd break the curriculum into two parts. One part would start from the ground up; and the other would start from the top down. Both would be required of students. The idea being one part would be more focused on the theoretical, while the other would be more focused on the substantial - implementation. Both would work together to produce a well-rounded student. Additionally, it would be designed such that students that wanted to work on operating systems would simply follow the one from start to end; while other students would be able to leave for more focused courses at the layer of their choice. (Students wanting OS would still have other courses for focus work too, btw.) The primary idea being that even a web-app developer needs to know the underlying systems, and even the OS developer needs to understand the abstractions of the web-app developer.

    I'd also have the overall curriculum be far more software engineering focused. Yes, if people want to really be computer "scientists", then they could do that; but industry really needs software engineers, not computer scientists. Real programs require engineers, and sadly, this is strongly lacking from almost all academic computer programs. (Some have changed it, but not many.)

    I'd also think that this approach would be very favorable to the authors of TFA and the comments I've linked. The ideas probably need a bit more refinement, but the general approach would be sound - and it's not what academia is doing today by any stretch of the imagination.

    FWIW - While I am relatively young (college grad of 2003), my main strength is C prog
  • by Mr2001 ( 90979 ) on Tuesday January 08, 2008 @11:58AM (#21954796) Homepage Journal

    Well, the whole point is that this exception should not be necessary at all. Just have 2 ints on the stack...
    As another response pointed out, you can run out of stack space just as easily as you'd run out of heap space, and running out of stack is even less graceful.

    With a modern GC, allocating on the heap is basically the same operation as allocating on the stack anyway: just advance the top-of-heap pointer. Not really a performance issue, unless you're running it inside a tight loop where you want to avoid triggering a collection.
  • Re:tasty (Score:3, Informative)

    by jjn1056 ( 85209 ) <> on Tuesday January 08, 2008 @01:00PM (#21955732) Homepage Journal
    Yeah, I think the basic OO support that was added to Perl in version 5 leaves a lot to be desired, since it's extremely minimal and requires a lot of boilerplate typing to do even basic stuff. The thinking was to make the most minimal solution and see what people needed.

    If you are interested in doing real OO, I'd recommend using the Perl OO framework called 'Moose', which you can install with 'cpan Moose' if you have a reasonably modern Perl (you might need to sudo that command, btw).

    Moose is a full meta object system with introspection, Roles (sort of like Interfaces on steroids), and a neat syntax. Check out the tutorial at: [] (which is the latest version as of this posting)

    There's also a very active IRC channel at

    If you like Perl but feel envious of languages with more strongly defined OO systems, I really recommend looking at Moose. It's a big step forward for the community and gives you one of the nicest and most productive OO systems around.

  • by julesh ( 229690 ) on Tuesday January 08, 2008 @01:42PM (#21956388)
    Java is all about pointers actually. Everything that is not an atomic type (int/long) is actually a pointer. They even call that a reference! Hah, people go use C++ for a while.

    And because it's like that, you have heap allocations for every non-atomic data type, which is really the opposite of performance.

    Not really, no. The just-in-time compiler performs pointer escape analysis for the allocated objects and only uses the heap for the ones where heap allocation is actually necessary; the rest use the stack regardless of how the programmer wrote the declaration.

    Admittedly, it's taken a while for this optimisation to be included, but it is there in the latest versions of Java.
  • third incarnation. (Score:2, Informative)

    by krischik ( 781389 ) <krischik@[ ]rs.s ... t ['use' in gap]> on Tuesday January 08, 2008 @01:46PM (#21956498) Homepage Journal
    Earlier this year Ada even went into its third incarnation, now called Ada 2005 (OK, it took a bit longer to iron out all the rough edges from the new standard): []

