Professors Slam Java As "Damaging" To Students
jfmiller calls to our attention two professors emeritus of computer science at New York University who have penned an article titled Computer Science Education: Where Are the Software Engineers of Tomorrow? in which they berate their university, and others, for not teaching solid languages like C, C++, Lisp, and Ada. The submitter wonders whether any CS students or professors would care to respond. Quoting the article: "The resulting set of skills [from today's educational practices] is insufficient for today's software industry (in particular for safety and security purposes) and, unfortunately, matches well what the outsourcing industry can offer. We are training easily replaceable professionals... Java programming courses did not prepare our students for the first course in systems, much less for more advanced ones. Students found it hard to write programs that did not have a graphic interface, had no feeling for the relationship between the source program and what the hardware would actually do, and (most damaging) did not understand the semantics of pointers at all, which made the use of C in systems programming very challenging."
You have to start somewhere... (Score:5, Interesting)
Yeah, I just read a press release from the FAA blasting driver training courses. Apparently, flight students who had just gotten their driver's licenses were not able to navigate in the air, execute banks, take off, or land properly.
Students have to start somewhere. It's easier to start with simple stuff than to try to cram their heads full of everything all at once.
1 language is damaging. (Score:5, Interesting)
In the course of my CS education (early '90s), they started with Pascal when they explained algorithmic basics.
Later courses were in C for OS and networking, while other courses used just about everything from Prolog to Ada.
You learn that some paradigms map to certain types of problems better (or worse) than others. So don't open sockets in Prolog (I have seen 'em do it, man), and don't do AI in C.
A quote: "if the only tool you have is a hammer, every problem looks like a nail".
Programming languages are tools, not religions. (Score:5, Interesting)
As someone who programs mainly in Java, I have to say they have a point. Surely a degree in CS should get someone familiar with all forms of higher-order programming (both OO and functional). They should also have a reasonably solid understanding of basic hardware architecture and how that affects programs.
Unfortunately this does not seem to be the case, at least in NZ. Some don't even know about basic complexity ideas and often have little to zero mathematics under their belt.
I did not do CS but physics. I was required to do assembly, BASIC, C, MATLAB, R, Lisp, Java, C++, Haskell, and a bunch of others I don't care to mention (like PLCs and FPGA work).
Oh noes! Java is not C! (Score:1, Interesting)
Well, of course learning Java will not help you with languages intended to be "high level assembler" like C, just like learning C will not help you with languages intended to be compiled to bytecode and executed under a VM like Java or C#. How that makes C the "solid language" and Java the usurper, I don't know. And here's a radical concept: what about learning both types of languages? You know, the purpose of education being to provide a wide-ranging vision, not just familiarity with whatever your teacher happens to like.
I haven't read TFA (this being Slashdot and all), but if those professors actually mention Ada as a better language for teaching than Java, I wouldn't trust anything else they say, because they obviously stopped caring about the evolution of Computer Science at some point in the 80s.
Re:You have to start somewhere... (Score:5, Interesting)
I started programming in Pascal, and then moved to C/C++. Structured programming, language syntax, variable typing, functions, parameters, recursion, etc. I could ALL learn in Pascal.
When I came through, Java was still pretty new, but I did take a Java course, and found it reminded me more of Pascal than C/C++; I'd say it's a good starter language.
Also, you can easily write command-line apps in Java, so I don't know why they blamed GUI dependency on Java.
And as for 'systems programming', well, DUH. Your first language is where you learn the basics of programming; before you start taking systems programming you should also have a lower-level course, ideally in something like assembly language (even if it's just on emulated hardware) or C.
Java brings out the best and Worst in you (Score:3, Interesting)
To take apart their argument by logic:
1. Java is an abstraction, a VM over the hardware. People who study comp sci today are of three kinds:
a) Those who want to be a programmer and then a project manager.
b) Those who want to maintain hardware and/or design new hardware. iPods, maybe.
c) Those who want to manage systems (20% hardware : 80% software).
For the first kind, Java is a better choice because: It does not force you to dig down to machine code, which is unnecessary today, much like car driving in the 1920s versus the 1990s. It teaches you the best of programming by forcing you to think in terms of objects and how to act upon them in the real world. If you mess up, you don't overwrite the root disk, thus causing innumerable references to the time-worn joke about shooting oneself in the foot.
It also teaches you that GUI writing is tougher, by way of its MVC programming. At this point, programmers can be split into real men or MSFT weenies: real men go to Java in server-world. Weenies love the GUI and goto VB.
It also teaches you the worst in programming: Forcing you to think only in OO way.
The second kind is better off learning C or even assembly.
The third kind is tricky: there are lots of management tools nowadays, some of them written in Java. If they want to write their plugins easily, then Java is the way to go.
2) Java is one more step in evolution which normally the professors hate because it moves them away from the machine. But mankind has more important things to do (watching LOST and Sopranos) than twiddling with RPG.
3) Blaming Java alone for problems is like blaming the Sea for causing Katrinas.
Lastly, if anyone should be blamed for warping the minds of youngsters permanently, it should be MSFT with its Visual Basic system.
Variety (Score:5, Interesting)
One thing I have noticed, though, is a complete lack of security-related training. Something about calling eval() on every input just to parse integers makes me cringe. I guess the idea is that worrying too much about standard practices keeps people from thinking creatively, or something. Unfortunately, it also seems like a good way to pick up a lot of bad habits.
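For contrast, parsing an integer safely in Java needs no eval-style trick at all: validate and reject, never evaluate. A minimal sketch (the class and helper names here are made up for illustration):

```java
// Sketch of validating numeric input instead of eval()-style parsing:
// anything that isn't an integer is rejected, never executed.
public class SafeParse {
    // Returns the parsed value, or the fallback if the input is not an integer.
    static int parseIntOrDefault(String input, int fallback) {
        if (input == null) {
            return fallback;
        }
        try {
            return Integer.parseInt(input.trim());
        } catch (NumberFormatException e) {
            return fallback;
        }
    }

    public static void main(String[] args) {
        System.out.println(parseIntOrDefault("42", -1));           // 42
        System.out.println(parseIntOrDefault("42; rm -rf /", -1)); // -1: rejected, not evaluated
    }
}
```

The point is that hostile input never reaches an interpreter; it simply fails to parse.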
Why not D? (Score:4, Interesting)
Sooner or later, languages are going to evolve, and surely it's only a matter of time before something D-like is going to be used anyway. Might as well make the switch sooner rather than later.
Why is "Computer Science" Staffing S/W companies? (Score:5, Interesting)
I totally agree that universities shouldn't be teaching Java exclusively. They need to teach the basics of modular, functional, declarative and oo languages. Why? Certainly *not* to fill "software engineering" positions!!! A university's role is to do research, not to act as some technical college. OK, I can see having a programming course aimed at creating programmers for industry if it's going to pay the bills at the uni. But *don't* make that your "Computer Science" course!!
Computer Science should be science (well, math anyway). Universities should be getting the 5 or 10 graduates they need who will move on to academia (or industry research) later in their careers. Because right now, *nobody* is getting taught Computer Science! Lately I've been reading papers posted on http://lambda-the-ultimate.org/ and regularly I have to go back to the basics and learn extremely fundamental theory because nobody *ever* taught it to me in the first place. Half the time I think, "OMG, I never even knew this existed -- and it was done in 1969!?"
More and more lately, I've been wanting to phone my University up and ask for my tuition back.
If you want to learn how to program in a professional setting, there's nothing better to do than just start writing code. Get your chops up. Then find some big free software projects and start fixing bugs. Learn how to use the tools (configuration management, etc). Learn how to interact with the other programmers. That's all you really need (well, that and a quick automata and grammar course, so that I don't have to look at yet another context-free grammar being "parsed" by regular expressions).
But right now, where do you go if you want to actually learn theory? I guess the library... And getting back to the point, this is essentially what the paper is suggesting. Students need to learn all these things because they are relevant to the field. A university supports industry by doing basic research. If you don't understand the concepts that they point out, you just can't do that. Paraphrasing from the article, having a university course that's meant to pad out a student's resume is shoddy indeed.
Re:You have to start somewhere... (Score:4, Interesting)
Java is also a lousy "beginners" language, because its reliance on standard libraries leads beginners to look for pre-packaged solutions rather than writing their own. That was one of the main arguments against Java in the paper, and it was a problem even a decade ago when my school was transitioning beginners' classes to Java (I was ahead of the change by a semester or two in each class, so I got to start with Scheme, learn data structures in C++, learn AI in Lisp, etc). Yes, in the "real world" you don't want everybody reimplementing their own linked list or hashtable. However, a beginner must learn the concepts behind those data structures in order to advance, and Java just makes it too easy to use the standard set of classes.
That's not to say that Java is all bad. With a good teacher and a good curriculum, it's absolutely possible to teach core concepts in Java (or any language, really). You have to be merciless about banning standard library usage such as collections, and teach your students the theory behind those data structures. People understand theory best when they can actually see it in practice, so you have to have your students implement their own linked lists, doubly linked lists, trees, etc. With Java it's an uphill battle getting people to ignore the standard libraries for "academic" purposes, but it's possible to do.
Personally, I'm thankful that my first real programming language (not counting BASIC in its various forms) was Scheme, and that I was exposed to a number of languages through my college career (the aforementioned Scheme, C/C++, and Lisp, as well as ML, Java, and MIPS assembly) even though my current day job consists of C# and SQL. Because of my background, I can easily pick up pretty much any language (and have done so several times), which gives me an advantage over those "programmers" churned out of today's Java-mill universities.
Java for Dummies (Score:5, Interesting)
Java is fine for teaching design patterns and classical algorithms like quicksort or binary search.
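And on that much we agree: binary search teaches just as well in Java as anywhere. A routine sketch (not from any particular course):

```java
// Classical binary search over a sorted array: returns the index of
// target, or -1 if it is absent.
public class BinarySearch {
    static int search(int[] a, int target) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2; // avoids overflow of (lo + hi)
            if (a[mid] == target) {
                return mid;
            } else if (a[mid] < target) {
                lo = mid + 1;  // target can only be in the upper half
            } else {
                hi = mid - 1;  // target can only be in the lower half
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] sorted = {2, 3, 5, 7, 11, 13};
        System.out.println(search(sorted, 11)); // 4
        System.out.println(search(sorted, 4));  // -1
    }
}
```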
But you can't do operating systems, and the success of Java in isolating you from any notion of the hardware is actually the problem.
We have already blacklisted courses like the one at Kings College, because they teach operating systems in Java.
Yes, really.
Their reason apparently is that it is "easier".
I have zero interest in kids who have studied "easy" subjects.
The world is a bigger, more competitive place, how many jobs do you think there are for people who have an easy time at college ?
Java is part of the dumbing down of CS.
A computer "expert" is not someone who knows template metaprogramming in C++, or compiler archaeology in BCPL, or the vagaries of the Windows scheduler.
It is someone who understands computers at multiple levels, allowing them to choose which one illuminates the problem at hand.
To be wise in computers you choose whether to think of something as a block of bytes, quadwords, a bitmap, a picture, or a buffer overflow pretending to be porn. If they can also understand flash vs. static RAM, virtual memory, or networked storage, all the better. I doubt if even 1% of CS grads could write code to turn a BMP into a JPG, or even explain the ideas behind it. In my experience, 50% could not work out how to develop a data structure for a bitmap that used palettes.
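For the record, the palette exercise is not deep. One possible sketch, in the Java these grads supposedly know (the class design here is just one guess at an answer, not a spec):

```java
// A paletted bitmap: pixels are stored as byte indices into a small
// table of colors, rather than as full RGB values per pixel.
public class PalettedBitmap {
    private final int[] palette;  // up to 256 packed 0xRRGGBB colors
    private final byte[] pixels;  // one palette index per pixel
    private final int width;

    PalettedBitmap(int width, int height, int[] palette) {
        if (palette.length > 256) {
            throw new IllegalArgumentException("a byte index addresses at most 256 colors");
        }
        this.width = width;
        this.palette = palette;
        this.pixels = new byte[width * height];
    }

    void setPixel(int x, int y, int paletteIndex) {
        pixels[y * width + x] = (byte) paletteIndex;
    }

    // Resolve a pixel back to its packed RGB color via the palette.
    int getColor(int x, int y) {
        return palette[pixels[y * width + x] & 0xFF];
    }
}
```

The whole point is the space trade-off: one byte per pixel plus a small color table, instead of three or four bytes per pixel.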
I have interviewed CS grads with apparently good grades who could not explain any data structure beyond arrays.
Any CS grad who sends us their CV with bullshit like "computers and society" or "web design" has their CV consigned to trash with no further reading.
A CS grad should be able to write a web server, not be an arts graduate who didn't get laid.
C++ makes you think at multiple levels; unlike Java, you simply cannot avoid thinking about your system from patterns down to bytes. This may be good or bad for productivity, and I'm sure we risk a flame war here.
But I am entirely convinced you need to hack your way through a "real" system.
How can someone understand the Linux kernel without C & C++ ?
Is someone really fit to be called a computer scientist if, like >50% of the Computer "Scientists" we interview for very highly paid jobs, they show actual fear of working at that level?
They have the same "way above my head" attitude that a mediocre biologist might have toward applying quantum theory to skin disease.
Partly, as in the Kings College debacle, it is lazy, mediocre lecturers, but also CompSci grads frankly are not that smart, so they need their hands held.
Although the seats get filled, the quality is in monotonic decline.
2 professors, 1 cup. (Score:3, Interesting)
I've taught grinds to first-year students in Ireland in Java (I'm SCJP 14/5) and their professors do not even allow the use of an IDE when coding. They also grade them on Java patterns and OO rather than knowledge of the language.
C/C++ have their place, but any good CS student normally learns a number of languages.
I can code in a number of languages, and am certified in quite a few as well, and I've never used Ada. Considering both professors work for a company that sells Ada tooling, the article seems a little biased and uninformed about Java.
Re:You have to start somewhere... (Score:5, Interesting)
Assembly is necessary to understand how a computer really works. Functional languages are good, just to know a completely different style. Some other language for breadth. Then the student can realise that everything after asm was a waste of time, and return to C.
Start with (Score:3, Interesting)
Though you may not use assembly language much, it helps to better understand what is going on under the hood.
Re:You have to start somewhere... (Score:2, Interesting)
1. Computer architecture, OS designs, Hardware, etc
then use C (or a subset of C++).
2. Computer Science, theory of automata, TMs, state machines, etc
then use Scheme.
3. Programming, data-structures, algorithms, etc
then use Pascal.
4. Java, C++, $LANGUAGE_OF_CHOICE
then use Java, C++, $LANGUAGE_OF_CHOICE
There is no place for Java in tuition unless it is taught as a vocational skill. Universities never were about teaching vocational skills.
Low level languages (Score:2, Interesting)
Re:I've noticed that... (Score:3, Interesting)
As for the theory you callously disregard: that theory is what allows us to dream up ideas like GUIs, OSes, etc. and implement them. Without the theory, all a CS major would be qualified for is to make a new webpage that does 90% the same thing as another webpage. That's why theory is stressed, and they can learn individual tasks like I/O handling on the job as needed.
I think they have some credibility (Score:3, Interesting)
You might be right, but just because they're involved in Ada doesn't necessarily make them biased towards it -- it does mean that they probably know a lot about it. What actually matters are their teaching qualifications and their understanding of what's important.
They might just as easily have come to be involved in Ada because it met all their requirements for a good language with flying colours. If they have experience teaching, working with other teachers, and teaching in other languages, then I think it's perfectly reasonable for them to comment on this and be given some credibility. (As it is, I can't personally tell for sure just how much experience they've had.)
If you ignore everything someone says simply because it looks like they have a reason for saying it, there would be almost nothing to listen to. After all, who else is going to stand up and say that a language like Ada is better for something besides the people who actually design and use it? They are the people who would understand it best, after all... and there would have to be a lot of them before anyone even tries to run a visibly independent review.
Java is like "The Incredibles", or a circus (Score:5, Interesting)
Every time someone tells me that there are no pointers in Java I laugh a little. EVERYTHING in Java that isn't a scalar is actually referenced through pointers. That is, you declare the pointer variable and then "new" the object into place.
They are just incredibly _boring_ pointers. You cannot do math on them. There is no sense of location to those pointers. The absence of interesting pointer operations, and the absence of a _semantic_ _copy_ operation, is what all this alleged pointerlessness is about.
I have only two _real_ problems with Java... (okay, three if you count the requirement that you constantly have to deal with exceptions even when you know they cannot really happen, and that if they did, you would want the thing to abort all over the place... but I digress)
(1) Java has no useful destructors because no object has predictable scope. If you think finalize methods are the same as destructors then don't bother responding, you don't know what destructors are...
(2) Since everything is a pointer in Java, you have to bend over backwards to pass by value. The language doesn't even begin to provide copy-construction semantics. What a miserable PITA.
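Both gripes fit in a few lines. Java passes those boring pointers by value: a callee can mutate the shared object, but reassigning its parameter never reseats the caller's variable, and nothing copies the object for you (a minimal sketch; the class and method names are arbitrary):

```java
import java.util.ArrayList;
import java.util.List;

// Java object variables are pointers passed by value: a callee can mutate
// the pointed-to object, but rebinding its parameter changes nothing
// for the caller.
public class ReferenceDemo {
    static void mutate(List<String> list) {
        list.add("changed");       // visible to the caller: same object
    }

    static void reseat(List<String> list) {
        list = new ArrayList<>();  // only rebinds the local copy of the pointer
        list.add("lost");
    }

    public static void main(String[] args) {
        List<String> items = new ArrayList<>();
        mutate(items);
        reseat(items);
        System.out.println(items); // [changed] -- reseat had no effect
    }
}
```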
Now the _dumbest_ thing about Java is that they were so set against multiple inheritance that they never bothered to ask themselves why _every_ OO language starts out life without multiple inheritance only to have to add it later. By making everything a proper linear subclass of Object, they left themselves having to graft on "interfaces", which is just multiple inheritance with the "bonus" of completely preventing default implementations. (Which leads to delegation, etc.)
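Concretely, that graft forces every implementor to repeat (or delegate) the same logic, because an interface of this era can carry no implementation at all (a contrived sketch with made-up names):

```java
// Pre-default-method interfaces: multiple inheritance of *type* only.
// There is no way to share an implementation, so every implementor
// repeats the same boilerplate or delegates to a helper class.
interface Describable {
    String name();
    String describe(); // must be re-implemented in every class, even if identical
}

class Widget implements Describable {
    public String name() { return "widget"; }
    public String describe() { return "a " + name(); } // duplicated logic
}

class Gadget implements Describable {
    public String name() { return "gadget"; }
    public String describe() { return "a " + name(); } // ...duplicated again
}

public class InterfaceDemo {
    public static void main(String[] args) {
        System.out.println(new Widget().describe()); // a widget
        System.out.println(new Gadget().describe()); // a gadget
    }
}
```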
The way the language keeps sprouting things it claimed to never have and never need is very like watching a clown car endlessly disgorge ridiculous archetypes. After a while it just isn't funny any more.
So yeah, teaching people Java as an introductory language is something of a disservice if you ever want to make them truly think about programming, and about what makes some things machine-smart while others are machine-stupid.
--- BUT ---
I worked in education for years. The fundamental problem with computer science education is that it is being taught by computer scientists instead of educators. We are stuck learning from the people who learned from the people who made it up. None of these people ever learned to EFFECTIVELY IMPART INFORMATION.
Consequently, the students are largely unemployable on the day of graduation.
The classic computer curriculum seems to consist of throwing three or four languages at a kid in the hope that they will "just kind of figure out this programming stuff."
The field of computer science has not yet come up with a "basic theory"... a starting place... The list of things a student simply must know before you start filling their head with syntax.
And so we are a bunch of prelates training our acolytes in our special, individualized deeper mysteries.
And that's what everybody is doing worldwide, so our graduates are just as lame as everyone else's...
Cue "Enter the Gladiators"...
Beginner language? (Score:5, Interesting)
Re:You have to start somewhere... (Score:2, Interesting)
Re:software engineering != computer science (Score:5, Interesting)
That they didn't know C wasn't too surprising. That they didn't have more than a basic grasp of memory management was shocking. They were also completely baffled when it came to not using an IDE to develop software. Makefiles had to be explained several times.
I've grumbled many times about this concentration on Java, and the resultant lack of detailed understanding about programming, but each time I did so at my university I was disregarded, and someone always trotted out that age-old nonsense about "not re-inventing the wheel".
I mean, sure, I see the point, but surely you should have a basic idea of how wheels are made?
Re:tasty (Score:5, Interesting)
I find I am now having to teach myself C++, and am struggling in a lot of areas that, had I been taught them at uni, I would be a lot more confident in.
Re:You have to start somewhere... (Score:2, Interesting)
Re:software engineering != computer science (Score:5, Interesting)
I was rolling on the ground laughing when I saw the problems people were having making a simple Sudoku program in C#.
Once they were done drag/dropping all the UI elements, they all got stuck.
Mind you, you're right that they're teaching what they think employers want.
We didn't get to see a *nix system, let alone use one.
Although that may be because of that rather large donation Microsoft gave...
This guy is about as unbiased as Stroustrup! (Score:5, Interesting)
Most CompSci college graduates are totally unproductive on their first job. They can be put to work on trivial things, but no matter what school they came from, they are just going to need a lot of hand-holding to make it through the first year. That is just how it is. Doing coursework at school is no substitute for coding on a meaningful project, whether it be work related, something open-source related, or just something for fun. That is the honest-to-God truth, as a software developer for over 12 years now, and I don't even consider myself that seasoned in the field (maybe after 20 years I will feel differently).
Now, with respect to Java as an introductory programming language: it is not bad, but not great either. However, the purpose of any introductory course to anything should be to capture the interest of the people who are curious enough to take the course in the first place. Back in college, we started with C (most of my peers had already been programming since they were teething, but this was CMU), and if not for my persistent no-quit attitude in life, I probably would have given up programming right then and there. Spending your entire night trying to debug a trivial program, not because you didn't understand the material but because of one stupid uninitialized pointer, turns off a lot of people who may have had the potential to be great programmers; because their first impression of programming was so bad, they gave it up before they got to learn how great programming really is.
Oh yeah, and the not-at-all-relevant math courses didn't really help much either. Whenever in your career you need some advanced calculus or discrete math, you will likely have forgotten about 99% of it and need to look it all up in a book anyway. Besides, 99% of programming projects in the real world basically involve high-school-level algebra and not much else. What separates the productive programmers from the unproductive ones is not who got a better grade in their math course back in college, but who innately understands systems and is willing to make the extra effort to learn the gazillion design patterns available to programmers, so that when faced with a difficult project, they will not waste inordinate amounts of time reinventing the wheel.
As for understanding computing at a rather low level, as in a class on operating systems: yeah, Java might not be such a great choice. Then again, learning C is easy, because C is made up of very simple constructs (C++ is another story); using C productively just requires a crapload of practice and experience, not necessarily a whole lot of computing expertise. In addition, mastery of whatever APIs you happen to be basing your career on is paramount as well. In the real world, employers don't want to hear "but I can learn anything quickly", because mastering some APIs can take 6 months or more, so if you come out of university with no specific skill set, it is going to be really hard to get that first job; unless you can be productive soon (or even on day one), you are useless as far as employers are concerned. Also, though I don't program in Win32 professionally myself, from my understanding it takes at least 3 years of non-stop work with those APIs just to be semi-proficient. Professionally, most of my work over the years has been in Java, and Java is probably scary to a lot of neophyte programmers these days because, since 1.5, it has unfortunately turned into a bastard child of complexity like its twisted sister C++.
Last but not
Java == Jobs (Score:3, Interesting)
But I have not been able to find any such jobs. Job databases show 90%
Re:software engineering != computer science (Score:5, Interesting)
With Java and most other 'friendly' languages you have literally no way of knowing what is going on under the hood unless you are prepared to invest a lot more time and effort than is available to the average comp-sci student.
With C, that's as close as a single flag on your compile line (gcc's -S, for instance), and you can study the generated code until you're tired of it.
It is a coursework issue, not a language issue (Score:3, Interesting)
Why is the professor whining about Java (or C#, for that matter) instead of teaching what he thinks is "useful"?
I suppose he will say that there is no time... well then, either make the course longer or more focused!
Java/C# have solved a number of nagging issues in programming. Do we want to program using tools that help our job or tools that hinder us? NOTE: I do not think that Java is good for device drivers or an OS, but if you do GUIs or any other higher-level stuff... why not? Just to have fun with pointers?
If you really want to, you can reach the "guts" of the OS even from Java, but unless you really need to, why bother? For fun?
Another thing: only Java is mentioned. Why not C#?
Re:tasty (Score:2, Interesting)
I don't know a lot about programming as a profession, but the article feels on target to me.
Re:Java brings out the best and Worst in you (Score:4, Interesting)
Wrong. Java is a step *back*. Not because of the abstraction, mind you, but simply because Java is such a severely limited language. 30-40 years ago amazing new things were developed in CS, and none of them are included in Java. Lambdas? Map-reduce? Purely functional code? Caml-style pattern matching? Conditions? People say Java made C++ obsolete, which is completely wrong, since C++ is far more powerful than Java (thanks to generic programming and metaprogramming). Just have a look at Boost, or C++0x, and try to replicate the features in Java WITHOUT runtime overhead (this is one key feature of generic/meta programming: a lot of the computation can be done at *compile* time).
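To illustrate the lambda gap: the closest the Java of this vintage gets to a function value is an anonymous inner class. What Lisp or ML writes in one line takes a named interface and a page of ceremony (a contrived sketch; the interface here is made up, not a standard one):

```java
import java.util.Arrays;

// Passing behavior in pre-lambda Java: an anonymous inner class stands
// in for what a functional language would write as a one-line lambda.
public class ClosureDemo {
    interface IntTransform {
        int apply(int x);
    }

    // A hand-rolled "map" over an int array, since none is built in.
    static int[] map(int[] input, IntTransform f) {
        int[] out = new int[input.length];
        for (int i = 0; i < input.length; i++) {
            out[i] = f.apply(input[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        final int offset = 10; // captured locals had to be final
        int[] result = map(new int[]{1, 2, 3}, new IntTransform() {
            public int apply(int x) {
                return x + offset;
            }
        });
        System.out.println(Arrays.toString(result)); // [11, 12, 13]
    }
}
```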
Actually, the next step in evolution should be Lisp, since it is one of the most powerful and flexible languages in existence. There is nothing in the language that intrinsically prohibits Lisp from performing as well as C++; it's mostly the tools that lack development (for comparison, g++ has had a hell of a lot of improvements; the 2.x series was awful). If you avoid certain things like eval, Lisp code can be optimized well enough. There is a Scheme compiler called Stalin which does just that, and it can even outperform comparable C code.
So, if you want to abstract away from the machine, why use Java instead of Lisp?
IMO there are other reasons for Java:
1) It's a braindead-easy language.
2) You only need mediocre programmers, since Java is easy enough for them, and there are many more mediocre programmers than good ones.
3) As a consequence, programmers are expendable, jobs can be outsourced, etc. One C++/Lisp/Haskell expert could replace an entire Java team. Not good for the company, since this guy becomes indispensable and can demand more salary, etc.
4) Universities can claim to have more success in teaching, since the number of guys with a CS degree is higher.
5) Companies need fewer training courses, because of (1).
6) Java has been overhyped for a while now, and many CTOs are so clueless, they just buy into it.
Well, you can disagree with my opinion about Java, but it is a fact that CS students *should* learn about the functional paradigm (what lambdas are, closures, and so on) as well as what pointers are, how memory works, what the garbage collector actually does, etc. However, this is not the case, and THIS is a serious problem.
Re:tasty (Score:2, Interesting)
In the upper division courses, we generally have our choice of languages. I've been alternating between C++ and Python (except for the rare professor that actually requires Java).
Re:Programming languages are tools, not religions. (Score:3, Interesting)
I never understood uni CS programs (Score:3, Interesting)
Now I could understand this approach if the world still operated the way it did when my dad learned the ropes, the way he kept insisting it operated even though things have changed in the decades since. In the old days, you went through college for whatever program you were on and when you got out, all the shiny diploma meant is "the boy can be trained." You got hired on at the firm you'd work at for the next 40 years and they would then teach you the business from the ground up.
That's the way it used to be. These days, nobody wants to mentor. Everyone expects you to have experience coming directly out of school, nobody wants to hire entry-level. And the way the jobs go, you may be technically a full-time employee but your work history will end up looking like a contractor. Work here for a few years, laid off, work another place for a few more years, laid off, maybe another place only lasts six months and you're laid off or fired because your boss doesn't like your tie, wash rinse repeat.
I'm a huge fan of education but I've been very disappointed with the educational institutions I've done time at. Public education was a joke, a waste of time. Because I lacked the big bucks and the desire to take on a crushing student loan, I went to JC after high school. It was a laughable experience. I did my undergrad at a local uni and again, it was just a bunch of hoop-jumping and wasted effort. True, there's the point that you get out of something what you put into it, I've seen people turn a bad situation into a positive experience and I've seen others make the same situation worse. But overall, I just think the education system is highly wasteful and inefficient. It'd be expensive to change. There's a reason why tailored suits cost more than the ones off the rack and a tailored education to best suit the student would cost a hell of a lot more than our current one-size-fits-all model.
Re:software engineering != computer science (Score:2, Interesting)
Re:tasty (Score:3, Interesting)
I taught myself Perl first (talk about a bad first language), then learned a bit of VB and C/C++ (not enough) at a community college, was taught mostly Java at university, then taught myself PHP at work, am now working on C#, and have a Python project already set as my next language.
Re:Java == Jobs (Score:4, Interesting)
But C and a C++ background will help you keep that job.
It's very hard to be taken seriously as a computer programmer, even a Java one, if you aren't able to understand memory allocation, pointers, etc. Often it's critical to understand that so you can understand what the garbage collector is doing.
I'm the director of technology for a small/medium company (about 5 programmers), and when I'm interviewing a candidate for a Java position, I ask them to explain how they would write a linked list in Java without using the collection classes.
Interesting, some of the responses I get.
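For what it's worth, here's one shape a good answer might take — a minimal sketch with class and method names of my own invention, not anything the parent's interviews actually use:

```java
// A singly linked list built from scratch -- no java.util collections.
// Illustrative names; a real answer could of course vary.
public class IntList {
    // Each node holds a value and a reference to the next node;
    // the reference plays the role a pointer would play in C.
    private static class Node {
        int value;
        Node next;
        Node(int value) { this.value = value; }
    }

    private Node head;
    private int size;

    // Prepend in O(1).
    public void push(int value) {
        Node n = new Node(value);
        n.next = head;
        head = n;
        size++;
    }

    // Walk the chain of references, exactly like following pointers.
    public boolean contains(int value) {
        for (Node cur = head; cur != null; cur = cur.next) {
            if (cur.value == value) return true;
        }
        return false;
    }

    public int size() { return size; }
}
```

A candidate who can write and explain something like this has demonstrated they understand what a reference chain actually is, which is the whole point of the question.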
Re:tasty (Score:3, Interesting)
Spot on (Score:1, Interesting)
The university has now dropped all this "hard" stuff to try and attract more people to their degree. The teaching staff has also halved in size. I have spoken to many recent graduates from around Australia and it seems this is happening everywhere. It is a real shame.
Re:tasty (Score:5, Interesting)
Computer science is theory and math. Computers and languages are the tools used to explore these concepts. The specific languages you learn at university don't really matter; anyone with a CS degree and half a brain can pick up new languages in a very short amount of time.
I think it would be a very bad move for universities to cater to the corporate world. If you want to just learn programming, get some certs or buy a book. If you want an education, go to the university.
Re:software engineering != computer science (Score:3, Interesting)
I would agree with you, if the students were encouraged to learn on their own and given direction in which to point their curiosity. But they aren't. They're encouraged (from early in childhood, really) to do the bare minimum work to get through the course, to stick to the examples and lectures given by the instructor explicitly, and not deviate from what everyone else is doing, or you'll stick your neck out too far.
Case in point: in my data structures class we were given an assignment that required a hash table with an underlying pointer-based linked list. The assignment could be completed in any language, but since Java was the curriculum language, the course was taught using only Java examples. I preferred C, but knowing STL list was available, I wrote my implementation in C++. I got half credit with the reason "Not use pointer based linked list". Now, aside from the obvious flaw that Java does not have pointers per se, and the Java kiddies all got full credit, STL list is in fact implemented with pointers.

I argued this with the instructor, who relegated me to the TA who graded the assignment. When I explained to her that STL list is pointer based, she told me I was wrong. So I directed her to the SGI documentation, which says "STL list is implemented by a doubly-linked pointer-based linked list..." Reluctantly, she gave me full credit. However, she wouldn't correct the scores of the handful of other people who had also used C++ and STL list unless they came in and showed her the same web page. It wasn't enough that she had been proven undeniably wrong.

Lesson learned: don't deviate from the strict curriculum and curriculum tools, or you get screwed. That, at least, is how the others took it. I was too angry to let the man get the better of me, and I was determined to fight it. So many other people just rolled over and took it, though, and they abandoned all other languages for Java for the rest of their undergraduate classes.
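For the curious, the assignment as described can be sketched in the curriculum language too. This is my own illustrative reconstruction (all names invented, fixed bucket count, no resizing): a hash table whose buckets are hand-rolled linked chains, references standing in for pointers.

```java
// Hash table with separate chaining; each bucket is a hand-rolled
// linked list of Entry nodes (no java.util collections used).
public class ChainedHashTable {
    private static class Entry {
        String key;
        String value;
        Entry next; // the "pointer" to the next entry in this bucket
        Entry(String key, String value, Entry next) {
            this.key = key; this.value = value; this.next = next;
        }
    }

    private final Entry[] buckets = new Entry[16];

    private int indexFor(String key) {
        // hashCode() % 16 is in -15..15, so abs() is safe here.
        return Math.abs(key.hashCode() % buckets.length);
    }

    public void put(String key, String value) {
        int i = indexFor(key);
        for (Entry e = buckets[i]; e != null; e = e.next) {
            if (e.key.equals(key)) { e.value = value; return; } // update in place
        }
        buckets[i] = new Entry(key, value, buckets[i]); // prepend to chain
    }

    public String get(String key) {
        for (Entry e = buckets[indexFor(key)]; e != null; e = e.next) {
            if (e.key.equals(key)) return e.value;
        }
        return null;
    }
}
```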
Pointers... (Score:2, Interesting)
What's new? (Score:2, Interesting)
Basics (Score:2, Interesting)
two profs working for adacore love ada (Score:5, Interesting)
The reason is that Computer Science has developed into a discipline that is no longer pure mathematics. There are only so many courses you can squeeze into four years.
2. The development of programming skills in several languages is giving way to cookbook approaches using large libraries and special-purpose packages.
Guess what: that's what building real software is like today. We don't need people who can write quicksort in obscure, unused languages, but people who can grasp systems of millions of lines of code. Ada doesn't prepare you for that, because it is a toy language that never really was adopted outside the academic world. It has no good, widely used frameworks & libraries like you find in the real world. People don't use it for the whole range of software systems you find in the real world, and to prepare you for this real world there are simply much better languages around these days.
3. The resulting set of skills is insufficient for today's software industry (in particular for safety and security purposes) and, unfortunately, matches well what the outsourcing industry can offer. We are training easily replaceable professionals.
I agree that skills are important. A good prof can teach those using pretty much any Turing-complete language if it needs to be done. Java isn't half bad for teaching a whole lot of important CS concepts and theory. And unlike Ada, people actually use it. As for C and C++, they are of course useful languages to learn. Many colleges still teach them.
But of course two ex-profs working for AdaCore are hardly objective. Ada is as dead as Latin. It has some nice features, but nothing you won't find somewhere else. Keeping professional skills up to date is as important for professors as it is for students. Having done a Ph.D. in software engineering & architecture and having practiced my skills in several companies, my view is that one of the largest problems in computer science education is teachers who have never worked on real, industrial-sized software systems and continue to send students into industry with a lot of misguided & naive ideas about how to build software. Most SE teachers out there simply have no clue what they are talking about. Software engineering is a skill learned in practice, because the teachers in university mostly lack the skills required to properly prepare students. That's the sad reality.
Matters entirely on the industry. (Score:3, Interesting)
The vast majority of development is done using pre-compiled libraries. Because let's face it, there is no need for your employer to pay you to recreate the string class, or a hash table, or any other primitive functionality that already exists in any number of languages. Sure, this knowledge is good to have, as it can come in handy. But in reality, beyond a cursory understanding to ground your knowledge in, in-depth knowledge of them will not affect the vast majority of developers.
Sure, if you are working in OS development, or embedded software, or other arenas where you may not be able to use a managed-code solution, then yes, knowing C/C++ and all of the underlying mechanics is critical. But when was the last time you heard a web developer talk about span width, compression algorithms, and rendering engines?
I think Java is an excellent tool for teaching OO design, especially for people with VB6 experience (because going from VB6 to VB.Net without learning OO design was both possible and painful). C/C++ are also great tools for teaching the stuff that has already been written (as you mentioned: stacks, pointers, memory management). So both should be taught for the purpose of educating students. As for ADA, having taken a crap ton of ADA courses while in the military, I can only say that I saw nothing in it that really impressed me over Pascal. I could see bringing Lisp back into the educational realm, but its real-world usage is, again, very limited.
I picked up my assoc CS degree from a tech college. We had 2 courses of C++, 3 courses of Java, 3 courses of VB.Net, 3 courses of Web related stuff (ASP.Net, HTML, Javascript, IIS/Apache, etc...), and the like. We never touched Assembler, no one coded an OS, we never touched a lot of stuff that my friends up at the University were working with.
But after 2 years, 9 of the original 60 students graduated the program. And of them, 5 were spot on to become entry level consultants with the flexibility to pick up a variety of languages and technologies. The other 4 were dedicated students that had worked very hard, but just didn't have the mindset to really make it in the development arena, but would still make solid tech support, technical writers, and technical managers.
-Rick
Not why I went (Score:4, Interesting)
That's not why I went to college. That's why you go to a trade school.
I went to college as a CS major because I loved programming. I went to college because I enjoyed learning, and wanted to round out my education in a lot of ways.
That I happened to be able to get a job after was because I was able to take all of the very abstract concepts I had learned and apply them to practical matters. But I had always been doing that on my own all through school anyway - why would I need school for that? Anyone can do that on their own, schools are there to teach you things that are hard to grasp or learn on your own.
Re:tasty (Score:3, Interesting)
That the unis have become almost vassals to industry mirrors that high schools have become vassals to the unis.
Re:tasty (Score:4, Interesting)
Lisp and Scheme I'll lump in "parenthesis hell." I've never seen the allure of list processing languages - they drive me nuts. In the real world, you'll probably never see them even for what it's known best for, AI (python or lua are much more useful these days).
Haskell I've never used, and it seems to be stuck in the university wasteland. Ruby and Rails seem more practical, but no more so than Python. To be honest, I dislike Python because it uses significant indentation, the one thing I despised most about Make. I've been meaning to look at Ruby as well, because I like the Smalltalk object model, but I'm not sure how much I like its significant line breaks.
As abusive as it sounds, languages like COBOL (banking/finance) or Ada (government/military) or even FORTRAN (mech-E) will get you a job faster than Haskell. Or learn RPG and dedicate your life to IBM and Unisys mainframes (*shudder*).
Re:tasty (Score:3, Interesting)
No. The whole reason to go to a university is to learn. Not just learning the subject at hand, but also learning how to learn. That's what separates a decent school from a training seminar.
If all you want is more money, go get an IT certification
Re:tasty (Score:5, Interesting)
I think you are missing the mark here; the profs who wrote the original article are both principals at a company that sells Ada tools. What they are really complaining about is the lack of demand for their stuff. Treating the argument seriously is a mistake.
I don't think that there was ever a time when Ada was a popular teaching language for any purpose other than coding in Ada. Same goes for COBOL, Fortran and such at this point.
Nobody would claim that there was a desperate need to teach CPL or BCPL, Pascal, or the like these days. They had their moment, they were found wanting. There are much better teaching languages these days and much better production languages.
These days I would probably teach either Java or C# as the intro language, depending on which is ahead at the time. I might teach C# to comp sci students simply to force the students to acknowledge the fact that they need to be able to adapt to new languages. Most other cases Java is most likely to be the most useful language.
The big change that came with Java is that, when Java appeared, it was the first time a mainstream language was an acceptable teaching language. Pascal was popular in universities, but the architecture was borked (an array of ten elements is a different type from an array of eleven). The functional languages had dreadful performance and pawky support libs.
Re:tasty (Score:4, Interesting)
Art majors
English majors
Performing arts majors
I could go on... I don't see how you have a valid point. I am not saying that the above fields are not worthy of pursuing, but people do not get into them for the money.
Re:Java is like "The Incredibles", or a circus (Score:3, Interesting)
myThingy = null;
Granted the actual memory won't be deallocated until the next garbage collection cycle but the object is lost as far as the programmer is concerned and the memory is usable if needed.
As to passing by value: why would you be doing that for anything other than scalars (which Java does pass by value) anyway?
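A minimal sketch of both points, with invented names: nulling a reference makes the object eligible for collection, and Java passes the reference itself by value, so reassigning a parameter never affects the caller, while mutating the object it refers to does.

```java
// Illustrative only -- class and method names are made up.
public class PassByValueDemo {
    static class Box { int n; } // n defaults to 0

    static void reassign(Box b) {
        b = new Box(); // only the local copy of the reference changes
        b.n = 99;      // the caller's Box is untouched
    }

    static void mutate(Box b) {
        b.n = 42;      // same object as the caller's: the caller sees this
    }

    public static void main(String[] args) {
        Box box = new Box();
        reassign(box);
        System.out.println(box.n); // still 0
        mutate(box);
        System.out.println(box.n); // now 42

        box = null; // the Box is now unreachable and eligible for GC
    }
}
```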
I am glad my introductory algorithm classes were taught in Java. The same algorithms were used in a C++ class later on. The Java class was to actually teach you the algorithms. The C++ class was to teach you memory management. The algorithms just so happen to be a very good exercise in memory management.
A humble suggestion (Score:3, Interesting)
With some luck, your work on the project might even bring some money in, and you won't need to impress employers with a long list of languages. Even if the project doesn't bring any money in, your resume will show that you're someone who can work on an actual project.
Re:Matters entirely on the industry. (Score:3, Interesting)
I have a buddy who works in the laser microscopy industry. He has a 4-year EE BS from a tech school; almost all of his coworkers have 4-year EE BSs from universities. They are all extremely smart guys. The university guys have an edge on him in some of the theoretical areas, which is to be expected. But he has an edge over them in engineering a product that will optimize a lot of the theory within the limitations of their production abilities. For instance, he stepped in on a project that another engineer was working on. The project was functional, but just barely; all of the theory was spot on, but it wasn't anything they could sell. So my friend took over, refining the layout and functionality of the electronics, removing almost 500 feet of wiring, completely restructuring the layout, designing new enclosures, mounting systems, etc... Could he have figured out the theory behind the original engineering all by himself? Probably, given enough time. But the university grad has more knowledge of it, and so he could come up with that design faster. Could the university grad have turned the proof of concept into a sellable box? Probably, given enough time. But the tech school grad has more knowledge of real-world environments, functional design, fabrication, and economic and production constraints, so he could come up with that design faster.
Me personally, I have a tech school AS in comp sci with a BS in Computer Information Systems (CS focus) and a BS in Computer Information Management. I've been doing software development since '97 in the military and as a civilian as a contractor and full time employee. I've worked with uneducated prodigies, grad school masters, and tech school graduates. And I have to say, if I need an entry level application developer for a business environment application, I head straight to the local tech schools and look for a student who is ace-ing the courses and getting along with his/her classmates.
It's not because I think tech degrees are better than university degrees. It's because I think tech degree course catalogs have more pertinent classes to the industry I work in than university programs do. If I were working at MS or Google, trying to find more people to work on a new search algorithm, or the next version of Windows, I would move my recruitment efforts to the universities.
Once again, I'm not saying either degree is better than the other, just that they are different, and the graduates of both programs will likely have different things they can bring to the table.
-Rick
Re:tasty (Score:3, Interesting)
But... PostScript is practical! (Score:3, Interesting)
Some examples:
* Make lines on someone's chart (available as eps) thicker/different color/etc. Translate it to another language.
* Add basic visualization capabilities to any program (good luck linking some old dusty Fortran code with graph plotting library of the day, but it is entirely possible to FORMAT some E16 E16 lineto there)
* Generate some recursive graph(ics).
Like this... [google.com]
* Yeah, good luck writing anything that produces print-outs without PostScript knowledge!
etc.
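Here's a toy illustration of the "just print lineto commands" trick, in Java (class and method names made up, nothing beyond the standard library assumed) — any language that can write text can emit a usable EPS this way:

```java
// Generate a minimal EPS polyline by printing PostScript operators.
// Toy sketch: fixed bounding box, integer coordinates only.
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class TinyEps {
    public static String polyline(int[][] pts) {
        StringBuilder sb = new StringBuilder();
        sb.append("%!PS-Adobe-3.0 EPSF-3.0\n");
        sb.append("%%BoundingBox: 0 0 100 100\n");
        sb.append("newpath\n");
        sb.append(pts[0][0]).append(' ').append(pts[0][1]).append(" moveto\n");
        for (int i = 1; i < pts.length; i++) {
            sb.append(pts[i][0]).append(' ').append(pts[i][1]).append(" lineto\n");
        }
        sb.append("2 setlinewidth stroke\nshowpage\n");
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        try (PrintWriter out = new PrintWriter(new FileWriter("plot.eps"))) {
            out.print(polyline(new int[][] {{10, 10}, {50, 90}, {90, 10}}));
        }
    }
}
```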
Paul B.
Re:But knowing those langs helps a lot ! (Score:2, Interesting)
Maybe one of us missed something... how do you traverse a list in Scheme? That's recursion.
In fact, in RnRS there is no *iteration* mechanism in the language other than recursion, unlike in CommonLisp for example. All apparent iteration is just syntactic sugar around recursion.
As a result, RnRS Scheme implementations are required to be safe for space when doing tail recursion. It's a fundamental point of the language, and directly ties in to how it handles large data structures, including lists.
If your course touched on Scheme without making this obvious, you got ripped off. Or you had an instructor who very ingeniously avoided using recursion to operate on lists! Who was it?
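To make the point concrete in the thread's lingua franca, here is the same traversal-by-recursion pattern sketched in Java (invented names). One caveat the parent's point depends on: a conforming Scheme guarantees tail-call elimination, while Java does not, so a deep enough list will overflow the stack here.

```java
// Scheme-style cons cells and a tail-recursive traversal, in Java.
// Java does NOT eliminate tail calls, so this is a sketch of the
// pattern, not production advice.
public class RecursiveSum {
    static class Cons {
        final int head;
        final Cons tail; // null plays the role of Scheme's '()
        Cons(int head, Cons tail) { this.head = head; this.tail = tail; }
    }

    // Tail-recursive in shape: the recursive call is the last action,
    // with the running total carried in an accumulator.
    static int sum(Cons list, int acc) {
        if (list == null) return acc;
        return sum(list.tail, acc + list.head);
    }

    public static void main(String[] args) {
        Cons xs = new Cons(1, new Cons(2, new Cons(3, null)));
        System.out.println(sum(xs, 0)); // 6
    }
}
```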