Professors Slam Java As "Damaging" To Students
jfmiller calls to our attention two professors emeritus of computer science at New York University who have penned an article titled Computer Science Education: Where Are the Software Engineers of Tomorrow? in which they berate their university, and others, for not teaching solid languages like C, C++, Lisp, and Ada. The submitter wonders whether any CS students or professors would care to respond. Quoting the article: "The resulting set of skills [from today's educational practices] is insufficient for today's software industry (in particular for safety and security purposes) and, unfortunately, matches well what the outsourcing industry can offer. We are training easily replaceable professionals... Java programming courses did not prepare our students for the first course in systems, much less for more advanced ones. Students found it hard to write programs that did not have a graphic interface, had no feeling for the relationship between the source program and what the hardware would actually do, and (most damaging) did not understand the semantics of pointers at all, which made the use of C in systems programming very challenging."
software engineering != computer science (Score:5, Insightful)
that's true, but then again software engineering/programming is a subset of computer science (maybe; i suppose you could argue it isn't)
"Computer science is no more about computers than astronomy is about telescopes."
- Edsger Dijkstra
Re:software engineering != computer science (Score:5, Insightful)
In practice systems have to work 100%, and when your graph search algorithm (by Dijkstra, naturally) segfaults from dereferencing a bad pointer, computer science is very much about computers.
I'm just worried that too few students these days know assembly and C, which leaves us in a predicament when the current generation of kernel devs retire.
Re:Damaging to students maybe, but not workers (Score:2, Insightful)
I wonder whether the engineers supporting your production hardware would agree with you.
Not that being conversant in systems is important in every practical application, but, speaking as a systems engineer, I do prefer to work with developers that understand it.
Different goals (Score:4, Insightful)
On the other hand, Universities have a much different end goal. They want to teach such that completing their program means the student can go on to a Master's program, etc. Obviously, Java won't get students there without a massive amount of pain if they go on to further study.
Well, at least that's how it was, and how it should be. Currently, Universities are edging toward the College level. What this has produced is a massive gap between where students are expected to be in knowledge/skill and where they actually are upon entering a Graduate program.
Unfortunately, this isn't just in CS. More and more I see Mathematics and Physics programs degrading as well. From what I've seen, this is due to Administration applying... pressure for high grades, etc. No grades, no funding. The cycle continues.
Though, I must point out that there are some Departments making an attempt at fighting back. Small in number though they may be, there is still hope for a return to actual academics. We'll see how that plays out, though. You never know: I wouldn't put it beyond a spiteful Administration to reduce a Department to offering just service courses.
Re:You have to start somewhere... (Score:5, Insightful)
It is reasonable to expect that a CS student has both the ability and the interest it takes to learn all the details of programming well in C.
Re:You have to start somewhere... (Score:3, Insightful)
Language choice affected the content of both courses quite a bit. In the Java course, students spent more time understanding how specific data structures worked and working on more interesting programming assignments. Students got to a working knowledge of Java fairly quickly after having only a semester of Scheme under their belt. In the C++ course (which followed a semester taught in ML), the students as a whole spent a lot more time learning about memory management along with the ins and outs of C++. Instruction on data structures was much more limited. Both sets of skills are valuable for practicing engineers, but I think it's fair to say that both the staff (myself and the teaching assistants) and the students enjoyed the class taught in Java more than the class taught in C++, probably because the interesting parts of such a class have to do with efficient data structures more than fighting with pointers and copy constructors.
Following the Java course, those students normally took an introductory hardware class. That course was taught in assembly language, C, and Verilog, where learning pointers fit in better with the other material. Following the C++ course, I don't think those students saw much more C or C++ programming for a while (possibly not until upper-division topic-specific coursework) and instead went off to complete 4 more semesters of more theoretical Computer Science.
Re:1 language is damaging. (Score:3, Insightful)
COME ON!
And don't tell me Java doesn't have pointers - what do you think references are? Glorified pointers with auto-null checks.
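That auto-null check is observable directly; a minimal sketch (class and method names are mine):

```java
// A Java reference behaves much like a pointer with a built-in null check:
// dereferencing null raises NullPointerException instead of reading
// arbitrary memory the way a stray C pointer can.
public class RefDemo {
    static int len(String s) {
        return s.length();   // the JVM null-checks s before this "dereference"
    }

    public static void main(String[] args) {
        System.out.println(len("abc"));          // prints 3
        try {
            len(null);                           // the auto null check fires
        } catch (NullPointerException e) {
            System.out.println("NPE caught");
        }
    }
}
```

A stray C pointer would have let len(null) read whatever happened to be at that address; the reference fails loudly instead.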
One problem I've seen is Java Developer Syndrome (JDS): think devs who don't know the difference between Java API names and the data structures used to implement them.
Think of someone who when you ask them what a hashtable is, they say 'oh, that's the synchronized version of HashMap'. Tell me that is a quality developer you want to work with. Go ahead, TELL ME.
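For contrast, what a developer should actually have in their head is the data structure itself, independent of any java.util class name; a deliberately tiny sketch (all names are mine):

```java
import java.util.LinkedList;

// The concept: a hash table is an array of buckets indexed by
// hash(key) mod capacity, with some collision strategy (here, chaining).
// java.util.Hashtable and HashMap are just two implementations of this idea.
public class TinyHashTable {
    private final LinkedList<String[]>[] buckets;

    @SuppressWarnings("unchecked")
    TinyHashTable(int capacity) {
        buckets = new LinkedList[capacity];
        for (int i = 0; i < capacity; i++) buckets[i] = new LinkedList<>();
    }

    private int index(String key) {
        return Math.floorMod(key.hashCode(), buckets.length);
    }

    void put(String key, String value) {
        LinkedList<String[]> bucket = buckets[index(key)];
        for (String[] pair : bucket) {
            if (pair[0].equals(key)) { pair[1] = value; return; }  // overwrite
        }
        bucket.add(new String[]{key, value});                      // chain
    }

    String get(String key) {
        for (String[] pair : buckets[index(key)]) {
            if (pair[0].equals(key)) return pair[1];
        }
        return null;
    }
}
```

Whether the buckets are synchronized is an implementation detail bolted on top, not the data structure.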
Re:software engineering != computer science (Score:5, Insightful)
When I took data structures, and we used C++, I didn't have mental convulsions because Java had wrecked up my thinking so much (although I did have mental convulsions cause C++ is incredibly messy to read at a glance), I learned different ways of doing things. So, maybe these professors should look at whoever's teaching these kids so sloppily, not the language.
commercial app dev != internal business app dev (Score:2, Insightful)
true! i would take that even a step further
mr. awesome computer science man, who can program in everything, might not be what a business wants for someone who's good at whipping up quick, user-friendly (potentially resource-hungry and not secure enough to face the public internet or for commercial distribution) applications to help streamline business processes.
and, obviously, mr. business programmer, who's good at getting the employees what they need, probably isn't the guy you want to program some super computer
i'm sure Java's *perfect* for some people to learn. C or even Assembly are *perfect* for other people. C# / Python / Ruby might be perfect for someone else. [some other languages here].
it depends on the person and their career / interests / environment.
Re:software engineering != computer science (Score:5, Insightful)
employers want nothing more than easily replaceable drones who come with an easily definable skill set and whom they can replace when a new buzzword comes along. this is NOT what universities should be pandering to.
Re:About the Authors (Score:3, Insightful)
A Real Programmer Can Write in Any Language (C, Java, Lisp, Ada)
Why C matters...
Why C++ matters...
Why lisp matters...
Why Java matters...
Why Ada matters...
So, I don't think the article is biased.
"Sure I know C!" (Score:5, Insightful)
These students are being trained as engineers. They shouldn't be afraid of a little grease.
That's true (Score:4, Insightful)
I love Java, and I find it much more pleasant to use than C/C++, but I generally agree with TFA. I have seen many people doing things like building a string with repeated "+" concatenation inside a 10K-iteration loop,
which creates 10K temporary objects to construct a single string*. This is because they started learning programming at a high abstraction level, so they have no idea what is going on behind the scenes. It's similar to starting programming with one of these new "intelligent" IDEs such as Eclipse, which do lots of things for you so you don't have to figure them out for yourself. I think all these abstractions are great for people who already know how to program, not for beginners. You wouldn't give a calculator to six-year-old kids learning math, would you?
I personally would begin with C and then jump to Java. C is not so complicated as a first language if you don't introduce all its features from day one. It was my first language and I think it was a good choice: it shows you enough low-level concepts to be able to write efficient, optimised code in higher-level languages. Besides, when you later jump to a more high-level OO language you appreciate the difference and learn it with interest.
* I know modern compilers are able to optimise that automatically using a StringBuffer or StringBuilder. I just chose that (somewhat unrealistic) example for the sake of simplicity, but the same happens in other cases that aren't so easily handled by the compiler.
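For reference, the shape of the pattern being described, as a minimal sketch (class and method names are mine):

```java
// Sketch of the anti-pattern: in the naive version, each += compiles to a
// fresh StringBuilder plus a new String object every iteration, so a
// 10,000-iteration loop churns through thousands of short-lived objects.
// The second version reuses one buffer, grown in place.
public class ConcatDemo {
    static String naive(int n) {
        String s = "";
        for (int i = 0; i < n; i++) {
            s += i + ",";                       // new String each iteration
        }
        return s;
    }

    static String better(int n) {
        StringBuilder sb = new StringBuilder(); // one buffer for the whole loop
        for (int i = 0; i < n; i++) {
            sb.append(i).append(',');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Same result either way; only the allocation behavior differs.
        System.out.println(naive(10000).equals(better(10000)));
    }
}
```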
Re:I've noticed that... (Score:5, Insightful)
I view school as bootstrapping a person to learn how to learn, and for teaching them the things that are timeless. The only reason a popular programming language like Java is used in the first place is that something has to be used, so it may as well be that. However, many schools offer Scheme, ML, or Common Lisp as the programming language of choice even though the job market for them is comparatively small. This is because they're seen to help the learning process. The goal isn't a marketable skill, but a vehicle to teach the timeless things like algorithms, data structures, and all those courses that have the word "theory" tacked on to the end of their titles.
If you want someone to be a lackey and build you a GUI, you'd be better off looking for someone who has an ITT certificate. If you're looking for something more on the math side of computing (again, algorithms, analysis), then you talk to a computer scientist.
Re:I've noticed that... (Score:4, Insightful)
I'm a Mechanical Engineer as well. Are you suggesting that _we_ should have spent our degrees studying look-up charts for HVAC ducts, or how to make nice Excel graphs? (calculus, mechanics, thermodynamics, heat transfer, ring any bells?)
Re:Oh noes! Java is not C! (Score:4, Insightful)
Many universities are simply training highly replaceable professionals, which is a big reason why outsourcing is such a problem. When two people--one in the USA, one in India, for example--have the same skills, the cheaper one will be chosen (and rightly so, sorry). The point of the article is that many universities are simply training programmers rather than educating computer scientists. It's an important distinction, which some people just don't understand.
Re:software engineering != computer science (Score:5, Insightful)
Pointers aren't rocket science. If you never perform an operation where you haven't first met the operation's preconditions, you never get a pointer error.
If you aren't rigorously checking preconditions on *every* operation you perform, you're not going to cut it as a kernel dev anyway. Pointers are the least of your problems. Race conditions can become exceptionally hard to reason about. The prudent kernel dev architects the system such that this doesn't transpire. That requires a whole different galaxy of aptitude beyond not leaking pointers.
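Spelled out in code, that discipline looks like this; a minimal sketch (names are mine), with every precondition established before the operation that depends on it:

```java
// The habit being described: don't perform an operation until you have
// explicitly met its preconditions. Here the "dangerous" operation is an
// array access; the checks above it make the access provably safe.
public class PreconditionDemo {
    // Preconditions: a != null and 0 <= i < a.length.
    static int elementAt(int[] a, int i) {
        if (a == null) throw new IllegalArgumentException("null array");
        if (i < 0 || i >= a.length) throw new IllegalArgumentException("bad index " + i);
        return a[i];   // safe: both preconditions were checked above
    }

    public static void main(String[] args) {
        System.out.println(elementAt(new int[]{5, 6, 7}, 1));
    }
}
```

The same reflex, applied to pointers in C, is what keeps a kernel dev out of trouble; the language just stops catching it for you.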
When I first learned C in the K&R era, I thought those greybeards were pretty clever. Then I came across strcpy() and wondered what they were smoking that I wasn't sharing. I thought to myself, there must be some higher-level idiom that protects against buffer overflow, because no sane architect would implement such a dangerous function otherwise. Man, was I ever naive.
More likely, too many of them had learned to program on paper teletypes, and just couldn't bring themselves to face having to type unsafe_strcpy() when they had reason to know it would work safely and efficiently.
The C language bears a great deal of blame in this matter for giving many beginning programmers the false impression that a function call can dispense with formal preconditions.
Interestingly, if you sit down to implement an STL template algorithm manipulating iterators, it proves pretty much impossible to avoid careful consideration of range and validity.
OTOH, C++ makes it charmingly easy for an object copy routine, such as operator=(self& dst, const self& src), to make a complete hash of managed resources if you fail to affirm &dst != &src.
There are plenty of amateur mathematicians who can manipulate complex formulas in amazing ways. The difference with a professional mathematician is that the necessary conditions for each transformation are clearly spelled out.
A = B ==> A/C = B/C iff C != 0
A > B ==> C*A > C*B iff C > 0
Infinite series must converge, etc.
I'm not even getting into defining A,B,C as fields, groups, rings, monoids, etc. for maximum generality.
Yet the average programmer feels sullied to exercise the same intellectual caution manipulating pointers. I've never understood that sentiment. My attitude is this: if that's how you feel, get your lazy coding ass out of my interrupt handler; go code a dialog box in some Visual Basic application that won't work right no matter what you do.
Why did the software industry play out this way, when other professions have much harsher standards? Primarily because software was in an exponential expansion phase: any work was regarded as better than no work (perhaps falsely), and industry couldn't afford to reduce the talent pool by demanding actual talent.
Now we've allowed many people to enter the profession without comprehending the rigors of preconditions. It's as if we had taught a generation of lawyers how to practice law, but omitted liability. Oops. What to do about it? Invent Java, and tell all these programmers it wasn't their fault in the first place.
So yes, Java doesn't teach very darn much about the harsh realities of actually thinking. And since thinking is hard, it's an impediment to productivity anyway, so it hasn't much been missed. The only thing we lost in the shuffle is our professional self respect.
Java being easy to learn is the problem (Score:2, Insightful)
I've seen this in a couple of students I went to school with. During my CS degree at UT Austin the CS department started migrating to Java as its primary language. Java is definitely an easier language to learn on, especially with its GUI environments and libraries. That, however, is the main problem. A couple of friends I kept up with after college had a really hard time picking up lower-level languages.
It's much easier to pick up higher-level languages when you know the building blocks from the bottom. I wrote all my code in C/C++ in vim. Not using a GUI gave me a good understanding of how code works: things like managing files, linking object files and libraries, and using certain flags to enable compiler options. I think once one is familiar with these concepts a GUI is great for becoming more efficient and not having to deal with such rudimentary tasks sometimes. But if you never learn these concepts you are losing out.
C/C++ helped me hang myself in college. It was grueling but worth it. I've picked up most languages I've tried pretty easily. I've coded/scripted in C/C++, C#, Pascal, Haskell, Perl, PHP, ASP, HTML, SQL, VB, Bash, MEL, Lua, UnrealScript. It's better to stick the pain out in college than to try to figure out pointers and such when you have a job, there are deadlines to meet, and there's the possibility of getting fired if you keep slipping.
Better CS programs don't teach languages anyway (Score:5, Insightful)
At CMU the very first CS class (taken by losers like me who didn't AP out of it, mostly because my high school didn't even have computer classes, let alone AP computer classes!) really did focus on teaching a language, Pascal, and a significant part of the class was the learning of the language. It was the least useful CS class I took in the long run (not surprising, as an introductory course in any subject is likely to be the same). Subsequent courses would spend 1-2 weeks going over the fundamentals of the language to be used in coursework for the remainder of the class (which in some classes was C, in some C++, some used ML, others Scheme, etc.) to get everyone started, and after that you had to figure it out on your own in conjunction with actually learning the theory that was being taught. It really isn't that hard to pick up a new language once you know a couple, although I did have a hard time with ML, mostly because I was completely unmotivated to learn it, feeling that it was absolutely useless to know (I was right).
No really good CS program has any classes with names like "Java 101" or "Advanced C++". To use a carpentry analogy, I would expect a really good carpentry school to teach the fundamental rules and "theory" of carpentry, so that the student upon graduation really understood what carpentry was all about and could apply their knowledge to aspects of the subject that they hadn't even encountered in school. I wouldn't expect a good carpentry school to have classes like "Advanced Hammering" and "Bandsaw 101". The courses would instead be "Introduction to House Frames" and "How to Construct Joints". You'd be expected to learn the use of the tools in the natural course of studying these subjects.
It's the same for CS. Good programs don't teach the tools, they teach the *subject*; learning the tools is intrinsic in the study of the theory.
Re:Java for Dummies (Score:5, Insightful)
Java is the new COBOL. And we will regret it in 20 years for much the same reasons.
It actually gives me hope that you have recognized this in hiring practices. That a CV with a list of Sun's Java buzzwords is not an indication of a useful programmer.
I was disturbed in college (1997-2001) that things were changing towards Java and other idiocy. Too many people didn't get pointers and other basic concepts, and Java was hiding them even more. I believe it was the one class we had in assembly programming that really pointed it out - when confronted with having to deal with real hardware, most of the students didn't know what to do. Concepts like "two's complement" vs "one's complement" caused a strange brain-lock for them, as they were so sheltered from the actual binary math and hardware of the computer.
It was only a handful of us who had been programming for years already (yay for the Atari 800XL) that had any idea of what was going on. The college (UC Davis) skipped entirely over very basic concepts like von Neumann architecture. I ended up having to spend most of my time trying to help my fellow students; there were so many fundamentals missing.
I think the most frightening part was having to yell at one of the professors one day, because the basic data structures he was teaching were being done incorrectly. He was teaching people to leak memory. ("Let's allocate a huge linked list, and then just set the head pointer to NULL and consider it freed!")
Sigh. It was frightening then, and apparently all my fears were justified, as now the entire discipline is getting a bad reputation. Unfortunately, I can't exactly disagree with that reputation given some of the CVs I've seen recently. My degree is destined to be fscked, apparently.
You hiring? ^_^
Start simple and use different types of languages (Score:3, Insightful)
I would think it better to have a functional language next. Students are much more receptive in the earlier years, and functional programming does take some getting used to. I don't know these languages well enough to recommend a specific one.
After that I would take an object-oriented language, preferably Java. It is a nice object-oriented language.
The important thing is that these languages must not be taught as languages, but as a tool to understand some Computer Science subject. For example we had something like this in my science courses
1) Data-Structures : Pascal
2) Linear Programming: FORTRAN
3) Programming Languages : Lisp
4) Graphics: C (Now it would be better to use something like C++)
5) Systems Programming: Assembly
6) Filing Systems / Database: Cobol (I would think Java would be good here)
7) Artificial Intelligence: Prolog
I think the layout now could be
1) Data-Structures: C
2) Mathematics (Set Theory, Combinatorics, Boolean Algebra, Linear Programming): Scheme (C for Linear Programming)
3) Graphics: C++
4) Systems Programming and Operating Systems: C, Assembly, and shell
5) Databases: Java
6) Windowing Systems: Java
7) Compilers: Perl and Yacc
8) Artificial Intelligence: Scheme
I am not sure Ada is required as such, because it is not used as much as other languages. I think having different types of languages will give people enough of a base to learn Ada later on if required. I would have liked to fit Python in somewhere, but I don't know where.
-anandsr
Re:Java for Dummies (Score:2, Insightful)
You can't know unless you've been developing 8 hours every day after work. Wanna know why? Because every day I work I am 8 hours more knowledgeable in my field than the day before. I doubt you're keeping up with me.
Re:software engineering != computer science (Score:5, Insightful)
Naive about the purpose of C, anyway. C was never designed to prevent you from shooting yourself in the foot. Writing C requires you to think, which is sadly out of vogue these days, as you point out later. C was never designed to protect you from yourself, as explicitly pointed out by Dennis Ritchie many times. If you want a language that will protect you from yourself, program in VB.
So yes, Java doesn't teach very darn much about the harsh realities of actually thinking.
But C obviously does - like checking boundary conditions. I don't understand how you can slam C in one breath, then praise it in the next.
What is the purpose being served? (Score:2, Insightful)
Back in the day, "computer science" apparently had a strong liberal-arts component, with topics such as logic that are traditionally taught by philosophy departments. By gaining a solid grounding in theory, compiler design, and different languages, a CS graduate could reasonably be expected to fit into a wide variety of roles. By the late '70s and early '80s, that had been folded into the engineering schools in many universities; CS was seen as a subset of EE, and CS underclassmen generally had the same requirements as any "other" engineering underclassmen, getting precious little specialization until their junior year. We all know how well that turned out; it's what led to the "industry-focused" "curricula" of the late '80s and '90s. Education became a byword for vocational training, with an ever-shrinking set of currently topical skills being taught; shrinking largely because the American (and, to a lesser degree, Canadian and British) lower public educational systems were being systematically raped and dismantled by the political trends of the day. Johnny can't program? Well, that's to be expected; he can't read or write past what for half a century was deemed a third-grade level by the time he hit university.
Rather than solve that problem by re-broadening (all levels of) education, industry "solved" it post-1990 by offshoring everything that didn't involve a well-paid management position, and bringing in indentured labor (via H1B visas in the US, for instance) for positions that were deemed "too difficult" to offshore. Everybody drank the purple Kool-Aid for a decade and more, paying little attention to the fact that failed projects were becoming more and more common, and more and more costly. Instead, approaches like XP were introduced and sold, not on their very real merits, but on the idea that "this will help failing projects fail faster, earlier and cheaper".
What ever happened to the idea that the project shouldn't fail at all? Or, more heretical still, that software shouldn't fail at all? We put our lives at the mercy of software whenever we get onto a modern elevator, a recent-model airliner, or an automobile with electronic fuel injection. We put our wealth and comfort in the hands of software much more regularly. I recall one day back in early 2002 when I walked into the local branch of my bank, to be greeted by the sight of every "terminal", including at the teller windows, displaying the Windows "blue screen of death". I walked out, came back the next day, closed my account and took my money to a different bank. I was, according to the local newspaper, far from the only one to do so.
Our society is and will remain completely dependent on the correct functioning of computer software for its continued health and growth, if not survival. We, as a society, are being extremely shortsighted and apathetic by tolerating the status quo without examination or serious discussion. Which of these "sensible" "reforms" of the last 20 years will be the equivalent of the Romans' engineering decision to use lead as the lining of their water pipes?
Re:software engineering != computer science (Score:5, Insightful)
This goes for most of the so-called 'insecure' functions in C: they only become insecure if you have already messed up at an earlier stage of your code. If you are aware of the limitations of the standard library routines (even the unsafe ones) and you are operating in a 'hostile' environment (and today's internet certainly qualifies as such), then you'll need to take great care to accept only input that matches the assumptions in your code further down; if not, you are in trouble. But good programmers work like that anyway.
It's perfectly possible to write crappy code in *any* language, not just in C (though, in the words of one old-timer programmer, 'C is like a racecar; you can cut corners, but if you do that too often you'll end up on your side').
Coming back at a fairly well-thought-out piece with an answer like the one written several levels above is not in any way helping the discussion; it is simply insulting.
Re:Java for Dummies (Score:3, Insightful)
Just to harp on one statement in your comment: this is either an absurdly basic demand to make of programmers, or an absurdly complex one. On one side, it is as simple as making a function call bmp2jpg(...) or whatever it is with the libraries they are using. The process of doing this is simplistic, and fits the Java mentality of "find the right tool in the shed to do your entire program".
On the other hand the question you are asking could be one so absurdly difficult that it is unlikely you'll find more than a handful of people who just got a BS in CS will know (having a BS in Computer Engineering it is possible they might understand it, and having advanced degrees makes it even more likely). Expecting a CS student to be an expert in signal processing is rather silly. Even if someone was a computer engineer and was actually changing the JPEG code (a process I have done recently), it isn't expected that they understand the entire algorithm. They need to know the basic steps, and how the data flows through the process. They may even need to know the basics of what a DCT does (knowing the math behind it could be useful, but experts have optimized it so using a naive algorithm is idiotic). They should also know how Huffman encoding works, and any student who's taken an algorithms course should be familiar with such ideas. However at a low enough level, they really don't need to understand it. People spend entire careers optimizing something like a DCT, so to most programmers it should be seen as another tool.
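As a concrete instance of the "familiar from an algorithms course" level, here is a deliberately tiny sketch of the Huffman idea (all names are mine): repeatedly merge the two least-frequent symbols, so rare symbols end up deeper in the tree with longer codes:

```java
import java.util.PriorityQueue;

// Huffman coding in miniature: build the tree by repeatedly merging the two
// lowest-weight nodes. A symbol's code length equals its depth in the tree,
// so frequent symbols get short codes and rare symbols get long ones.
public class HuffmanSketch {
    static class Node implements Comparable<Node> {
        final long weight; final Node left, right; final char symbol;
        Node(char s, long w) { symbol = s; weight = w; left = right = null; }
        Node(Node l, Node r) { symbol = 0; weight = l.weight + r.weight; left = l; right = r; }
        public int compareTo(Node o) { return Long.compare(weight, o.weight); }
    }

    static Node build(char[] symbols, long[] weights) {
        PriorityQueue<Node> pq = new PriorityQueue<>();
        for (int i = 0; i < symbols.length; i++) pq.add(new Node(symbols[i], weights[i]));
        while (pq.size() > 1) pq.add(new Node(pq.poll(), pq.poll()));  // merge two rarest
        return pq.poll();
    }

    // Depth of the leaf holding target, i.e. its code length; -1 if absent.
    static int depthOf(Node n, char target, int depth) {
        if (n.left == null) return n.symbol == target ? depth : -1;
        int d = depthOf(n.left, target, depth + 1);
        return d >= 0 ? d : depthOf(n.right, target, depth + 1);
    }
}
```

That is the "basic steps and data flow" level of understanding; the entropy-coding stage of a real JPEG codec wraps far more machinery around the same idea.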
The point is that different levels of abstraction exist in different problems, and a good programmer (not necessarily a good CS student) should know what level of abstraction is necessary for the task at hand. Having someone who reinvents the wheel for everything is about as useful as a programmer who only uses canned solutions. Finding the right mix is what's important.
However, one key point of this article is that CS curricula spend TOO much time on programming, and too little on theory. A university is not a trade school, and students should learn the important theories, not the tools. However, teaching a student with the wrong tool is highly detrimental. And one thing that was observed is that while Java may make the introductory course easier, it causes problems later on when a student must know some other tool (such as pointers). Not having a good enough foundation can mean that the professor must waste a significant portion of a class teaching students the new tool.
Phil
Re:Beginner language? (Score:3, Insightful)
C is a perfect beginner's language. When something is wrong, you know that YOU did it, not the compiler, not some class or wrapper or interface you have never seen, not the VM, or some bug or leak or god knows what.
I write for real-time control systems (thermal/air/chemical, etc.) in C, for PIC and NEC microcontrollers, and I've written Java servlets and PHP (5, OO) for myself.
When I write something like
shortPointer = (BYTE *)&Short_Transmission + 2;
I know exactly where I point, what kind of data is there, how I handle it and what can change it. If something is messed up, I KNOW I did it, and I shouldn't google around for workaround, a bug fix or some undocumented feature.
Learning programming should start from the point where you have full control over your code and data, and build up to the level of Java, where very often you handle little more than instances of objects that you just happen to understand enough to use.
Re:Why is "Computer Science" Staffing S/W companies (Score:3, Insightful)
The comment about my degree was flippant, I admit. And it's been 25 years since I took that degree. So it's only in retrospect that I realize its worth (or lack thereof). It got me a job. Without that piece of paper, it would have been difficult for me to break into the field. But it did *not* teach me anything about CS. Nor did it get me to a point where I could realistically do meaningful graduate work in the way, say, a physics graduate would have. But I had lots of trivial information about systems that were in use at the time (if you know what IEFBR14 does, then you understand what kind of systems those were. If you don't, be *very* thankful!)
I take your point about researching the degree, and if I had known what I know now, then I would have known to go to those few schools that actually teach CS. But here's the thing: when you don't know something, you don't know what you don't know. Now that I've spent the last 2 and a half decades writing and reading, I'm beginning to understand what you need to know in theoretical CS. I at least have an idea of what I don't know. It is not reasonable to expect someone who has not done that work to know what is out there. But most professors who do research in CS know this. So I don't excuse them.
And the point of my rant remains. The purpose of the university is basic research. Most universities are not training graduates (even grad students) to a level where they can do basic research in computer science. They may very well be providing a function for industry by churning out people by the tens of thousands with an introduction to the hot programming languages of the day. But that's the function of a technical college.
What we risk by not investing in basic research is stagnation in computer theory. Marvin Minsky proved that perceptrons can't compute everything. Then, in an offhand comment, he said that he didn't know if multi-level neural nets had the same problem. It took something like 10 years before anyone even checked. In other areas, programs are getting more and more complex all the time. If we can't find ways of representing that knowledge in more expressive forms, then we will just hit a glass ceiling. By choosing to train programmers and neglect theorists, our whole profession loses. *That's* what I'm complaining about.
Re:You have to start somewhere... (Score:5, Insightful)
Assembly is necessary, to understand how a computer really works. Functional languages are good, just to know a completely different style. Some other language for breadth. Then the student can realise that everything after asm was a waste of time, and return to C.
This is kind-of bollocks.
When I was a young programmer - which is about twenty-five years ago - the team I was on got a new ink-jet printer. It printed its own character set, we needed it to print bitmaps. The processor it used was one none of us had ever worked with before. One of the older members of the team - a guy called Chris Burton - took the spec sheet for the processor and the spec sheet for the printer home with him on the train, and came back the next day with the code for the new printer driver written in long hand, not in assembler mnemonics but in actual op-codes, in pencil on a pad of paper. It was burned on an EEPROM that day and drove the printers until that model became obsolete five years later - there were no bugs, it never needed fixing.
It should be said in passing that Chris had worked in his youth on the Manchester Mark One [wikipedia.org], and after he retired was part of the team that rebuilt Baby and got it running again.
I've always thought that was epic programming, a standard I'll never reach. But it's one particular layer on the stack. My job on that team was writing inference engines, and Chris was always really impressed by that. It's nearly thirty years since I touched any assembler and fifteen since I wrote anything serious in C. A modern computer system is way too complex for any single person to really understand, in depth, all the layers. I take what the silicon designers do as given, and likewise the microcode programmers. Right back in the early days of Linux I did fix issues in kernel code a couple of times but I wouldn't even try these days - the guys who do that are much more expert at it than I am. Likewise, I don't expect them to understand the compiler compilers that I write. It's a different layer on the stack.
I agree that you need to have a rough idea about how the whole stack works. But we no longer expect all computer science students to be able to wire up NAND gates from discrete valves or transistors. And although a computer scientist needs to know that there are primitive logic operations carried out on the metal, and that on top of that there are a stack of different software layers with real machine code on the bottom and a whole slew of intermediate code representations above that, I don't believe that it is any longer necessary for all students to be able to write a serious program in assembler.
Re:software engineering != computer science (Score:3, Insightful)
That's a teaching problem, not a Java problem (you can teach badly in any language). If they can't function without an IDE then they've obviously not been taught how to use Ant, or even java and javac come to think of it.
Lisp (Score:4, Insightful)
And ignorance is key to bad habits (Score:5, Insightful)
Guess what? Even in Java, pointers still come to bite you in the arse when you least expect them. I see people every day who have trouble understanding the difference between "==" and "equals()" in Java, because they never learned the pointers behind them. They're essentially one abstraction level too far from understanding what their own code is doing.
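A minimal sketch of the trap (hypothetical class name); "==" compares references, which are pointers under the hood, while equals() compares contents:

```java
public class EqualsDemo {
    public static void main(String[] args) {
        String a = new String("hello"); // two distinct objects on the heap...
        String b = new String("hello"); // ...holding identical characters
        System.out.println(a == b);      // false: different objects, different references
        System.out.println(a.equals(b)); // true: same contents
    }
}
```

Someone who has seen pointers reads that instantly; someone who hasn't spends an afternoon in the debugger.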
Or even in Java, learning why you can't modify an "int" parameter but can modify the contents of an "int[]" parameter. Guess what? Requires pointers. People end up doing all sorts of unnatural mental contortions to remember when passing by value isn't really passing by value, when "it's a pointer" would sum it up perfectly.
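A hypothetical sketch of that asymmetry: the reference itself is copied into the parameter, so reassigning the parameter changes nothing, but mutating what it points at does:

```java
public class PassByValueDemo {
    static void bump(int n) {
        n = n + 1; // increments a local copy; the caller's variable is untouched
    }
    static void bump(int[] a) {
        a[0] = a[0] + 1; // the reference was copied, but it points at the caller's array
    }
    public static void main(String[] args) {
        int x = 1;
        int[] xs = {1};
        bump(x);
        bump(xs);
        System.out.println(x);     // 1: still the original value
        System.out.println(xs[0]); // 2: the shared contents changed
    }
}
```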
And it shows. I've had people come to me half a dozen times with basically the same idiotic "auugh! Java's Hashtable is broken! I added a new value, and when I look into its array with a debugger it replaced my old one!" When in fact, it had only added a node to the front of the linked list. But they don't know what a linked list is, nor what a hash table really is, nor how a Node can contain another Node, without a concept of pointers. Worse yet, not only do I see them spending a week debugging Hashtable, I see piss-poor workarounds done to prevent it from doing its job.
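What they're missing fits in a dozen lines. A toy separate-chaining table (hypothetical; not the real java.util.Hashtable source) shows how a Node contains a reference to another Node, and why a new entry appears at the front of the bucket rather than "replacing" the old one:

```java
class Chain {
    static class Node {
        Object key, value;
        Node next; // a Node referencing another Node: the per-bucket linked list
        Node(Object k, Object v, Node n) { key = k; value = v; next = n; }
    }

    Node[] buckets = new Node[8];

    void put(Object k, Object v) {
        int i = Math.abs(k.hashCode()) % buckets.length;
        // Prepend: the new node goes to the FRONT of the chain. Inspect the bucket
        // in a debugger and you see the new entry "instead of" the old one, which
        // is still there, one hop down the next pointer.
        buckets[i] = new Node(k, v, buckets[i]);
    }
}
```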
Or I see burger-flippers-turned-programmers occasionally get the real programmers fired for doing the right thing. Like using a "==" where it's correct to use it. But the burger flipper doesn't understand that. He learned some "for String use equals()" mantra, and he'll apply it and preach it, cargo-cult style, without even understanding what he's _doing_.
Or I see people think that optimization means replacing two lines with a one-line call, because they have no fucking clue what the machine does with that code. They think that speed is measured in lines of code, because no one explained to them otherwise. So they wonder why replacing two ifs with a catch is actually slower. (And I'm not getting into the many ways such a catch can make the code less secure, for example, by assuming that a real exception is just their loop reaching the end of the array.) Exactly what throwing an exception does is a mystery to them.
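For the curious, the antipattern looks roughly like this (hypothetical method names). The catch version throws and catches ArrayIndexOutOfBoundsException on every call, which means constructing an exception object and capturing a stack trace, and it silently swallows any genuine indexing bug inside the loop body:

```java
public class CatchLoop {
    // Antipattern: let the loop run off the end and use the exception as control flow.
    static int sumWithCatch(int[] a) {
        int s = 0;
        try {
            for (int i = 0; ; i++) s += a[i];
        } catch (ArrayIndexOutOfBoundsException e) {
            // reached the end... or masked a real out-of-bounds bug in the body
        }
        return s;
    }

    // The boring bounds check: cheaper, and it hides nothing.
    static int sumWithCheck(int[] a) {
        int s = 0;
        for (int i = 0; i < a.length; i++) s += a[i];
        return s;
    }
}
```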
Etc.
No, no one said you must keep programming in "a language where even K&R wrote unsafe code", nor that difficulty equals worthiness. But it helps to be at least exposed to those concepts once, even if thereafter you go on to program in Java or VB for the rest of your days. The fact that you worked with pointers once in C and managed to get them right _will_ show in your Java code too.
Probably the best thing that helped my coding was doing assembly on my parents' old home computer, back in high school. In fact, in hex, because that ZX-81 with 1k RAM didn't even have enough RAM for an assembler. Wrap your mind around _that_, if you think C is too hard.
Would I advise anyone to write a production program in assembly nowadays? Nope, God forbid. I wouldn't have advised writing a whole program in assembly even back then. But understanding the machine behind that high level stuff will show even in your Java code.
And, yes, not every architect needs to be a Michelangelo. But it helps if they're not a clueless moron who can't even build a doghouse right. You can see plenty of architects nowadays who can't even get a basic house right. They know how to draw an artsy sketch of a house, but they have no clue how to calculate it to actually stand upright or what materials to use so it doesn't get damaged by rain within a year or two. And/or need a civil engineer to fix their elementary mistakes. Maybe it wouldn't hurt that much if they knew a bit more, ya know?
Re:You have to start somewhere... (Score:3, Insightful)
My day job is as a mainframe (z/OS) programmer in a number of languages and whenever a program has a non-obvious bug, it still comes in damn handy when you can read what is actually happening underneath.
I'm not talking about training them up to the level of building systems in assembler, but they should be able to take hex values, processor specs and an assembler guide and translate what's going on.
Assembler teaches you what a processor does, just like students should know how it's possible for a processor to be constructed out of NAND gates. They needn't be able to create their own processors, just understand why it actually works.
Re:software engineering != computer science (Score:5, Insightful)
I do advocate designing primitives as essential to the language as the C string functions so that they powerfully remind the programmer of the logical obligations those functions impose, and support the programmer in reasoning correctly about those obligations, without having to digest 15 lines of preceding context to see that calloc() provided the implied terminating NUL.
strlcpy and strlcat - consistent, safe, string copy and concatenation [gratisoft.us] by Todd C. Miller and Theo de Raadt, OpenBSD project
Weirdly enough, the Linux people are about the only major group that has consistently stayed deaf to these arguments. The chief opponent to strlcpy in glibc is most certainly Ulrich Drepper, who argues that good programmers don't need strlcpy, since they don't make mistakes while copying strings. This is a very mystifying point of view, since bugtraq daily proves that a lot of Linux and free software programmers are not that bright, and need all the help they can get.
One must recognize that in a solid code base, thinking occurs more often while reading code than writing code. Correctness is not a write-only proposition in any living code base.
The original C string functions were (and remain) a pedagogic disaster. Most beginning programmers failed to realize how much thinking had been folded into the surrounding context. If they were reading K&R, that thinking existed. If they were reading whatever code they had at hand, it likely didn't, by any survey of average C code quality ten years later. With the original string functions, whether this careful thinking exists is not obvious without doing a lot of mental work, and that work has to be repeated *every time* the code is seriously reviewed.
Worst of all, the strcpy() function seemed to imply "buffer overflow is no great concern, we're not even going to give you a single argument on this very dangerous function to help you avert it". It was a false parsimony to save that extra argument in the default case.
This isn't at the level of whether the handgun has a safety or not. It's at the level of whether it is possible to chamber a round too large for the barrel. I can point the gun successfully, but I'd greatly prefer it not to detonate in any other direction.
A more thoughtful C string API would have averted mistakes on the magnitude of chambering bad ammunition, without encumbering the pointy end in the slightest, or failing to endanger the programmer's foot.
Why We Teach Java (Score:5, Insightful)
(1) Java is what the market wants. Yes, we can teach any other language under the sun. But the reality is, that the software industry values individuals who are Java-literate. By this I mean an individual who has a basic understanding of the OO principles that the language is founded upon, can write Java code using common tools, and has at least some insight into some of the more common Java APIs. Any learning institution that doesn't take this into account when designing their curriculum is doing a serious disservice to their student body. While some do go to University for the sheer joy of learning a subject - most are there to ultimately get a job.
(2) In my opinion there is something seriously wrong with a Java course that emphasizes Swing or web development rather than the fundamentals. Yes, it's important to get things in and out of a program, but, at least initially, these should be incidental to the main event: learning the language and applying it effectively, and thinking in an object-oriented way, which many of you know is not necessarily an intuitive way to look at the world, especially if you already have a procedural background. GUI and web application development should be separate, advanced courses.
(3) I sometimes lament the lack of insight into pointers, but any professor worth their salary will spend some time discussing the Java object reference architecture, and relate that to pointer-based languages. Regardless of how abstract your language is "opening up the hood" and demonstrating how things work, and why things have been designed the way they are, is often worth knowing.
(4) I laughed when I read the article about Praxis, especially the part about formal methods. Are they serious? Yes I was taught formal methods in school, and could understand *why* I'd want to use them... If I had all the time in the world... a huge budget to burn and customers not screaming for something that the business needed yesterday. Praxis offers software development based on formal methods and as a consequence occupies an important (and probably expensive) specialized niche of the software ecosystem. To suggest that this approach should be the norm and lament its absence really betrays that the authors have spent too much time in academia and not enough in the real world.
(5) Ada is a great language - in fact I learned Ada 83 as a first language along with C. It just isn't relevant to most software development companies or IT departments - if indeed it ever was. I worked on a research project that was part of the Ada 9X Real-Time initiative - the main users were aerospace and military vendors - particularly embedded systems. There you do need to know about concurrency and distribution - along with hard performance deadlines and often a slew of safety and mission-critical issues you need to consider to do a good job. However, I fail to see the general relevance of Ada to a commercial market that is primarily interested in "simple" information systems, getting information out of a database and/or putting it in - with some processing en route. Why should I use Ada when the market in general doesn't use it?
(6) We teach concurrency - it's useful stuff to know. I think that using formalisms to describe concurrent programs is going a wee bit too far (see (4) above).
Re:tasty (Score:0, Insightful)
Where are the car designers of tomorrow? We need to be teaching solid principles such as wheel design. They just take pi for granted these days!
Re:Java == Jobs (Score:5, Insightful)
I'm mainly a C programmer these days, but I took the job basically understanding that I would be working significantly with Java. That was the only language I had experience with on leaving Uni, and I was promptly put to work on a Pascal / OpenVMS system! Friends from Uni have had similar experiences.
I have been a bit worried about an outdated skillset as lots of employers ask for lots of object-oriented programming experience and I only occasionally use this. I think this would be my primary problem if I started looking for a new job. I also think it's a bit unfair, as the skills are pretty transferable - there's only a little new theory to learn and after that, good programming practices aren't hugely dependent on the language used.
In dealings with many (perhaps even most) other companies whose software I write interfaces with, it's pretty clear that they are also using C or C++, and often even older systems (in one interface we have to convert our messages from ASCII to EBCDIC). You can frequently tell what language the other system is from the sort of errors that crop up, and sometimes from the design of the interface. I'm forced to believe that my area of the industry is still primarily C based.
You get what you pay for... (Score:2, Insightful)
It's why so many "project managers" think taking a little extra time to properly engineer a system is a waste of time, but have no problem rewriting entire systems every few years since they are unmaintainable.
Re:You have to start somewhere... (Score:4, Insightful)
Which is exactly what we want people to do when they're off doing real work. One problem with teaching languages without such libraries is that people get used to writing their own rather than looking for pre-packaged.
There's no problem with basic data structures and algorithms in Java. You declare that certain libraries are off-limits for an assignment, and give a zero score to anybody who violates that. The ones that actually belong in the field will catch on real quick.
How much extra pay for knowing the theory? (Score:3, Insightful)
There is no shortage of highly skilled computer scientists who know the theory and can program in any language, including ones they invent for themselves because the existing ones are not good enough. If there were a shortage then there would be head hunters out looking for people like that and offering big salaries and golden handshakes. The same is true for various other science and engineering disciplines.
When businessmen or politicians talk about that sort of shortage what they really mean is that these days there is a shortage of naive people who will spend many years racking up debts pursuing an advanced education and then work for a pittance afterwards.
Re:And ignorance is key to bad habits (Score:3, Insightful)
Ada 2005 (Score:2, Insightful)
http://www.adaic.com/standards/05rm/html/RM-TTL.html [adaic.com]
There is nothing Java (as a language) has what Ada 2005 hasn't got as well - but a lot what Java hasn't got but Ada does.
Martin
Re:Why is "Computer Science" Staffing S/W companie (Score:3, Insightful)
I'd research my courses very carefully and go somewhere where theoretical subjects were taught. That may mean a masters or even a PhD.
When I did my B.Sc. in Computing Science, we had a mix of theoretical subjects (automata and discrete math were actually core, compilers was an elective I took, I took a couple of artificial intelligence subjects as well)
Because they don't teach it at universities in general (with some notable exceptions).
Ahhh so there are "some notable exceptions". Could it be that you should seek one of these out if that's what you want to do?
They churn out professional programmers, who would actually be *much* better off (in terms of being good programmers) to just spend those 4 years writing code.
Complete garbage. Programmers who don't understand how the machine works under the hood, don't understand that you can express problems that can't be solved, or can't be solved practically, etc. are rubbish. They don't understand the very tools they use. I've worked with people who understand the classics like the travelling salesman problem, or the halting problem. I've also worked with people who don't. Let me tell you I'd much prefer to work with those who aren't going to propose something impossible and set a deadline for my team to meet in creating it. (I once had a manager propose we write a full-blown compiler in a month, and it was clear she had zero understanding. My current boss on the other hand - and I work elsewhere thankfully - used to write software and she's brilliant). It doesn't matter if you never become a computer scientist proper. These things are VERY important.
And the point of my rant remains. The purpose of the university is basic research.
The point of a university is higher learning. One avenue is basic research.
What we risk by not investing in basic research is stagnation in computer theory. Marvin Minsky proved that perceptrons can't compute everything. Then, in an offhand comment, he said that he didn't know if multi-level neural nets had the same problem. It took something like 10 years before anyone even checked.
If it were an easy problem Minsky would have set one of his grad students to work on it.
Look, the rest of this refutation is pointless. If you want to move into a career of research now, you clearly have the intelligence and articulation required. Your education has put you in better stead than you give it credit for. What I don't know is whether you have the drive and circumstances to pursue another degree or a change of career. If you want it desperately enough, you're willing to make the sacrifices and you're lucky enough to do it, I wish you the best of luck.
I'm not saying this flippantly either. I dropped out of a science degree because I found it didn't suit me. I worked, then went back and did my comp sci and found my career in IT. Science was a dream of mine right through high school. When I found I couldn't pursue it I didn't give up on it. I went and did a Masters of Astronomy part time and on the Internet. I certainly didn't have my hand held to learn what I did, and though I never intended it as a change of career (i.e. I did the degree "for fun") and though it cost me big time health-wise and socially (not to mention financially) I don't regret doing it. It is part of who I am, and I loved the challenge and cherish the knowledge I gained. I finally know how we know what we know about the universe. I've computed the distances to stars and understood their life cycles. I understand what the universe looks like on the grand scale. In short I have a better understanding than the average person of the universe I live in. While I'd love to go and do research, I know I won't sacrifice what I have to for that. It's still okay. My degrees have been anything but a waste of time.
Re:tasty (Score:5, Insightful)
Do more. Try doing your homework in haskell or lisp or hell, write in forth or postscript. It's a billion times easier to learn a language when you have someone else telling you what to do in it, and a billionth of the stress when your paycheck doesn't depend on it working.
I've wanted to learn ruby and rails for a while now, but I've got nothing to do with it at home, and like hell I'm going to show up at work and replace a production app with ruby for the hell of it, even though we've got a number of internal web apps that are basically exactly the kind of CRUD RoR was designed for.
Re:software engineering != computer science (Score:4, Insightful)
I code mostly in Java in my professional life, but when I was in school we were forced to diversify, and it was a definite plus.
The intro course used mostly JavaScript for some reason (!), but other (even relatively low-level) courses required projects written in C, Scheme, and Java. I took an operating systems course where we had to write a project in some kind of mini-assembly language... it's all a bit fuzzy now (I graduated 10 years ago), but I remember it being tough to wrap my head around for a while. And that's a good thing, right?
I also did a couple of summer-long solo projects that probably taught me more than anything -- just fighting through the problems on my own, learning the hard way about the value of clean code, OO, version control, debugging skills, etc. etc..
Perhaps obviously, I'm a much better *Java* developer than I would have been without the other stuff. So I agree wholeheartedly that students must learn more than one language in their schooling - either for professional reasons OR for academic reasons; you simply have to flex your thinking in more than one way if you're going to learn.
It also strikes me as a tough way to learn... how do you learn what X language is, and the reasons behind its design, totally in isolation? How do you learn what OO is if "functional" is a meaningless concept to you?
Re:Java == Jobs (Score:5, Insightful)
Re:Java == Jobs (Score:2, Insightful)
The first PC generation (those kids who first wrote programs in the late 70's and 80's) were blessed with having to learn all of this low level stuff to make their systems sing on the limited resources at the time.
Since then, students have been getting farther and farther from the key understandings that would make them excellent regardless of the language they would use in the future. The best programmers understand the underlying implementation so they can leverage it.
Now industry is calling for more developers who can do concurrent applications - but how can you expect them to understand and build a system that avoids race conditions, when they don't even understand pointers?
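The connection is direct: a data race is two threads doing an unsynchronized read-modify-write through references to the same memory location. A minimal sketch (hypothetical class, using the standard java.util.concurrent.atomic package):

```java
import java.util.concurrent.atomic.AtomicInteger;

public class RaceDemo {
    static int unsafeCount = 0;                                 // shared, unsynchronized
    static final AtomicInteger safeCount = new AtomicInteger(); // shared, atomic

    static void run(int perThread) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < perThread; i++) {
                unsafeCount++;               // read-modify-write: increments can be lost
                safeCount.incrementAndGet(); // atomic: never loses one
            }
        };
        Thread a = new Thread(work), b = new Thread(work);
        a.start(); b.start();
        a.join(); b.join();
    }

    public static void main(String[] args) throws InterruptedException {
        run(100_000);
        System.out.println(safeCount.get()); // always 200000
        System.out.println(unsafeCount);     // frequently less: lost updates
    }
}
```

Without the mental model of two references aliasing one memory location, the unsafe counter looks indistinguishable from the safe one.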
The quality of software will continue to suffer until Universities get their act together and standardize system programming (including assembly, as well as C/C++, and scripting - shell, Perl, awk, sed, Python).
Re:Java for Dummies (Score:3, Insightful)
A bit like your post. I mean comments like "How can someone understand the Linux kernel without C & C++?" Why would anyone need to understand the Linux kernel, or C++, unless they were a Linux kernel programmer? Even so, we have things called APIs, which allow developers to write to a kernel without having to know how the kernel works, or even whether it is in C++.
Also your comments on Java are laughable. Java is in use in day to day objects in real life. Phones, Set top boxes, Cars. It is not just web servers. There are even operating systems and emulators in Java which run fine. Go read up on JIT and stop living in the 90's.
"In my experience, 50% could not work out how to develop a data structure for a bitmap that used palettes."
And why would they? Unless they were looking for a job that does that. Seriously, if I was in an interview and they came out with the stuff you posted I would laugh at them.
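For the record, the structure behind that interview question is small; whether it's worth asking is another matter. A hypothetical sketch: pixels hold one-byte indices into a shared colour table instead of full RGB values.

```java
class PalettedBitmap {
    final int width, height;
    final int[] palette;  // up to 256 packed 0xRRGGBB entries, shared by every pixel
    final byte[] pixels;  // one palette index per pixel, row-major

    PalettedBitmap(int width, int height, int[] palette) {
        this.width = width;
        this.height = height;
        this.palette = palette;
        this.pixels = new byte[width * height];
    }

    int rgbAt(int x, int y) {
        // & 0xFF because Java bytes are signed: index 200 is stored as -56
        return palette[pixels[y * width + x] & 0xFF];
    }
}
```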
Re:software engineering != computer science (Score:5, Insightful)
As I said, I'm not a programmer. I could (if I had to) model the frequency response of a simple mechanical system to a range of perturbations by hand. The chance that I'd have to do that in the course of my professional employment is so slim as to be laughable. Yet, the fact that I could do this (if I really had to) tells me that I don't want to put an eccentric load on a rotating shaft without a lot of careful consideration. Now, if I sort of knew this was a bad thing but didn't really understand why, I might do something silly like put only a single U-joint in a shaft. After all, it provides flexibility, and as long as the shaft is straight there is no eccentricity. The problem occurs when there is a deflection; then your single U-joint translates a nasty sinusoid downstream. If you do that, things tend to break.
Now, I agree that the university should have some courses focused on things that practicing professionals in the field use. I could draw a part by hand (if I really had to), but if I'd never seen CAD before, I'd be at a serious disadvantage if I ever wanted to be a machine designer. However, fundamentally, a university is an academic institution. The suggestion that it should be an employment mill would severely compromise our education system.
Oversimplification is sooooo easy ... (Score:3, Insightful)
There are many dynamics in (American) programming; some are real problems, and some are changes that the "old guard" finds uncomfortable. Nobody here (including me) wants to read a 30-minute rant, so let me just mention a few high spots.
There are also the traditional inter-generational conflicts:
Progress is made by human beings, not by tools.
Or, as Shakespeare might have put it: "The fault, dear Brutus, is not in our programming languages, but in ourselves..." Julius Caesar (I, ii, 140-141), 2008 edition
Re:tasty (Score:5, Insightful)
Most of my professors had no real world experience, either. So, teaching things like team dynamics and working within a project schedule were really beyond their expertise. Granted, I've been quite successful, but I attribute most of that to my abilities, not what I learned in college. College just got me a piece of paper that opened the door.
I don't think the problem is with the languages being taught, but in the lack of true engineering being taught. This is true of any of the programming related fields (CS, MIS, SE). All of them need these skills.
Layne
Re:Java == Jobs (Score:5, Insightful)
As code gets more complex, the best way to keep it understandable to others is to follow common language idioms, indentation / code formatting practices, and use built-ins in the standard libraries. These alone often take months to become familiar with, but that's only half of it. The other half I can only describe as trying to approach problems from the unique perspective of the language. Any asshole can jump from Java to Ruby, or from C++ to Lisp, or from VB.NET to Scala. But learning how to solve problems using those languages' strengths, rather than writing code as you would in the language you're coming from, is crucial.
From my own experience, Java programmers coming fresh into Ruby don't use blocks. When you finally convince them to use blocks for enumerators, they miss the point entirely and simply use each_with_index for everything, rather than more powerful methods from functional programming like map. They also don't like to reopen classes. In Ruby, classes can be added to at will, so if you want a method to calculate the average value of an Array, you can simply define it as a new method on the class. But Java programmers will create a Util module, throw a method in there that takes an Array, and think nothing more. It's not wrong, per se, but it's ignoring Ruby's strengths, and simply writing Java code inside the Ruby interpreter. And the people who do this are bloody useless.
My rant is getting long, but the main point is this: learning syntax for a new language is easy. Learning to use that language properly (much as a screw is used differently than a nail) is crucial to being able to work with other people, and getting anything meaningful done.
Re:tasty (Score:5, Insightful)
And also beyond the scope of computer science. If that's what you wanted, you should have specialized in software engineering. People keep forgetting that computer science classes should feel more like math classes than engineering or management classes. One look at TAoCP would hint at that. For the record, I'm an engineer and I find the pseudo-engineering that most CS programs push out to be highly disturbing. Either do it right and call it software engineering, or remove the non-CS stuff and call it computer science. If you're not gonna do either aggressively, give it a fake major name like "Information Technology" or "Management of Information Systems" and teach a bunch of stuff really poorly.
Re:Java == Jobs (Score:4, Insightful)
Mostly the classes were theory, concepts; stuff that applied equally to all languages. It was weird in some ways; I had a networking class that assigned the eternal "create a server/client chat program" project, where part of the project was a Java GUI. At this point, I'd never programmed a GUI, and neither had anyone else I talked to. The response of the TA (who was the only one who'd ever give programming advice, because the professor only dealt in theory), was that GUI design was beyond the scope of the class and we'd just have to figure it out.
My method for figuring it out involved downloading a Java editor and using the GUI design tools. It was the first time I'd used a graphical editor for Java; we were encouraged to do the work in vi or Emacs, and generally, that's all we did.
Now I hated that crap at the time, but nothing has prepared me better for my day to day life than having projects dumped on me where I had to goddamn well use my initiative and figure it out. Over and over again, I was forced to go out and read and work out for myself how to translate the theory into code. These days, I program in Java about 20% of the time. I'd hardly say it stunted my abilities, and it certainly didn't make me into a cookie cutter corporate programmer.
I'd have to say that specifically teaching any language is a problem. They all come in and out of fashion. I work with a guy whose mind is stuck in Visual Basic...And I don't mean
Right Tools for the Job (Score:4, Insightful)
University is not where you go to learn a specific set of skills. If you want that, you go to a technical trade school.
University is where you go to get an in-depth set of concepts, critical thinking skills, research skills, and theory foundations. This is true for any major you wish to approach. In the CS department, there is a reason you take different languages, some are for system development, some are for app development, some are for theory exploration with little to no value outside of the educational environment. Java falls into one of those categories. Assembly, C, C++ fall into others. Ada falls into yet another.
Think of it in terms of the English major, you know, those dime-a-dozen students who will end up working at Burger King and Mr. Chow's Empire Chinese Buffet, or who go to Hollywood to work as waitresses while they wait for their big break. The English major takes a load of literature, English, American, Russian, Manga, and poetry from Bacon/Shakespeare to Ginsberg to Hughes to Tupac, and writing from haiku to freestyle with a goofy footed pentameter (trademark and patent pending). None of this is particularly helpful to someone who wants to come out of school with business writing skills.
Remember, in University, some of the most mistaken ideas come from the professors.
Re:Java == Jobs (Score:2, Insightful)
Re:tasty (Score:5, Insightful)
anyone with a CS degree and half a brain should be able to pick up new languages within a very short amount of time.
There are unfortunately a great many universities turning out a great many low-quality "computer science" grads who don't know the first thing about programming, much less the intricacies of stacks and pointers in C. I've met some alleged CS grads who didn't know a compiler from a hole in the ground.
I think the problem may be improving from how it was a few years back (the dot-com bust knocked CS off the lists of many students just looking for an easy $80k paycheck on graduation), but there are still a lot of dolts around, devaluing the degree.
Re:tasty (Score:1, Insightful)
Sounds like a person problem, not a language probl (Score:3, Insightful)
Re:software engineering != computer science (Score:4, Insightful)
I know, it's insane nowadays! Why, just the other day I was just remarking to my barefoot wife, who was cleaning our clothes by beating them against the rocks in the river, and then rolling them through a wringer, that all these fancyboys with their pushbutton machines and laundry powder just don't value godly, manual labor anymore! Then I went to my factory job where the machines have no safety features, so they really require you to think, which is sadly out of vogue these days with OSHA and safety goggles and whatnots.
The shame!
--Rob
Re:software engineering != computer science (Score:2, Insightful)
Re:You have to start somewhere... (Score:3, Insightful)
I am not saying that Java is bad or that Java programmers are bad, but there is still serious value in understanding how a computer works right down there at the metal, and C is a lot closer to the bare metal; that is why it is important for CS students to learn to use C well, even if they will forever more use Java, C#, or whatever. When you study computer science, you study to become a Computer Scientist, which requires a lot more than the ability to program - hence the word 'Scientist'. Computer Scientists are, as far as I know, expected to be able to write things like operating systems from scratch, and you won't choose Java for that if you want it to go fast (not quite true, I know - Nixdorf made a series of servers once whose OS was written almost entirely in interpreted BASIC).
Now, as for the 80% of errors caused by pointers - it is perfectly possible to learn a coding discipline that avoids them. This should happen automatically with experience; after a while you learn to always initialize buffers - whether they are pointers, strings, ints, or something else - when you define them.
Re:1 language is damaging. (Score:3, Insightful)
So, sure, if you could find a language that supports all those programming paradigms, and allows one to expose students to both low- and high-level programming concepts, then by all means, teach it from start to finish. Because you're absolutely right, it isn't really about the language. But, that said, the language is there to best express one's ideas. And I'd be *very* surprised if you could find a language that encapsulated the full breadth and depth of programming approaches while not being a pathetic, lowest-common-denominator expression of those concepts.
Incidentally, the real irony in all this is that languages like Java and C# are taking on more and more functional and declarative features (though, again, only a poor man's version of them). Just look at C#: suddenly it has lambdas, and soon it'll have a built-in query language. Bang, suddenly we have declarative, functional, and procedural programming models all jammed into the same syntax. And the brutal, bitter truth is that most programmers using C# probably have *no idea* how to best use these tools to solve problems.
Re:tasty (Score:4, Insightful)
But, the whole reason to GO to a University, is to get the skills/education to make more money when finished, than you would have if you had not gone.
College is a means to an end....and while it is nice to learn other things to be a bit well rounded, that is extra fluff if you have the time and money for it while there, but, don't forget the real reason for going.
If people could make good $$ without college, I doubt you'd see so many people trying to go....
A degree gets you in the door for a job....regardless of what it is in often...you have to have one these days to get a good job.
Re:tasty (Score:5, Insightful)
Re:tasty (Score:5, Insightful)
I think anyone who spends 4 to 6 years getting a degree in computer science only to get a high-paying job when they get out is a tad silly. They are really, really wasting their time. They could get an intern job right now at a software consulting company and study their ass off (as we all have to do in this field). Within a year they will be making decent money, within 3 years really good money. 4 years later, when the person has their shiny degree after studying Java (which probably won't even be used by then), they get the joy of a junior developer's job.
There is an old adage: "How do you become a writer?" "Write... a lot." It is the same with programming. You can't fake your skills, and a PhD in CS won't matter if you can't bill your clients because your application doesn't fulfill requirements or even work.
Truthfully, if all you care about is money, work in finance, or become a salesperson. The best developer in the world won't compete with a high-end salesperson dollar for dollar; hell, CEOs can't compete with top salespeople. Zero education required.
I very much value a university education, but it has nothing to do with making more money. Learn, create, become a very educated person; the money will follow; the money part really isn't that hard.
Re:tasty (Score:5, Insightful)
College is a means to an end....and while it is nice to learn other things to be a bit well rounded, that is extra fluff if you have the time and money for it while there, but, don't forget the real reason for going.
If people could make good $$ without college, I doubt you'd see so many people trying to go....
A degree gets you in the door for a job....regardless of what it is in often...you have to have one these days to get a good job.
For a moment, put the reason why YOU go to a University to the side and consider what the purpose of the University is. It's an institution that's literally thousands of years old, dating back to the old Greek institutions of education. When Plato and Aristotle founded their schools, they didn't put up a big sign that said "When you're done, you get more money." That wasn't the promise. The promise was that by teaching you about the world, you would become a better person. That is to say, the founding concept of the University was that education leads to human excellence. And, for the Greeks especially, human excellence was not directly related to the possession of wealth.
This understanding of education was dominant up until very recently. Everyone was required to learn Greek and Latin, so they could read Homer and Plato. Reading Homer isn't going to get you a job, it's not going to get you a promotion, it's not going to get you an interview, and it's not going to get you laid this Friday. No one at the University used to make the claim that it would. They'd claim that reading Homer made you a better person, even if it doesn't get you a job.
Now, as to why YOU should go to a University? If you're going for the purpose of getting a job, you're not going to understand the vast majority of your classes at the University. You're going to be wondering "Why do I have to take this anthropology class?" or "I have no interest in Operating Systems, why do I need this Operating Systems class?" and "Why do I need a foreign language, I'm going to be working with code all day." All these questions miss the larger point of what the University is trying to do to you. And if you're missing the point of the entire institution, it's exceptionally difficult to do well there.
The whole thing is really just the result of multiple generations of corruption, I think. Employers realized that well-rounded, educated (dare I say, excellent) human beings are better for the health of a company. So they pay more for people who are excellent, and a University degree used to be shorthand for some form of excellence. The masses of uneducated began to realize this, and started saying to their kids "If you want a good job, you need a degree." So their kids started going to the University, thinking the point was to make money. Professors, having tenure, just did what they were going to do anyway, but now we've gone two or three generations like this. We're reaching the point where current professors went to school thinking it was for money. We have boards of Universities with pressure from the state to focus less on the goal of education for excellence and more on the goal of education for job skills.
Re:tasty (Score:3, Insightful)
Universities certainly do both educate and prepare people for jobs but I'd guess that the main reason people go to universities is to get a good job and make more money than they would otherwise.
Let's imagine for a moment that going to a university would have ABSOLUTELY NO IMPACT on the kind of job you get or money you can make and that the ONLY thing anyone could get out of it was to be a more rounded and educated person.
Do you really think that people would pay the kind of money that is required to go to a university if this were the case?
The vast majority of people go to college to end up with a better job.
Re:Java == Jobs (Score:5, Insightful)
If you look at developers who spend a lot of time doing things in C (e.g., the OpenBSD developers-- have a look at their repository [openbsd.org]), you'll see that they are keenly aware of "object-oriented" design principles. They also tend to know exactly when things like byte alignment are an issue, and when you really should just use a void pointer, because they are forced to think about their machines. Most OO programmers I know have no idea why they would need OO language features-- they just use them because that's what they've been taught-- and they know next to nothing about the machines themselves. I would argue that a good programmer is a good programmer; and if they have standard procedural programming experience, that will nicely complement their future OO work.
GP is right-- OO is simply a design philosophy. The actual mechanics of building an application are no different.
Re:tasty (Score:5, Insightful)
But knowing those langs helps a lot ! (Score:4, Insightful)
You're barking mad. (Score:5, Insightful)
So, what, you're going to hire math geeks only? People with degrees in mathematics or operations research, or perhaps some of the hard sciences? In my own experience, while there are some non-CS degrees that are excellent preparation for a CS career, only a CS degree is a CS degree. It is lamentable that some schools have embraced the trade-school mentality, but many more have not. When I was teaching courses as a graduate student (just a couple of years ago), the curriculum began with Java and quickly shifted to Haskell. A neighboring institution still uses Ada as an undergraduate language. There's also a legion of Knights of the Lambda Calculus who are trying to get Scheme reintroduced to the undergraduate curricula in several institutions in the area. Intellectual diversity about languages is alive and well in the academy, based on the institutions I've seen up close and personal.
Also, who is this "we"? You and someone else who shares your prejudices? Or is this you and the senior engineering staff? If you're about to decree CS as a non degree, maybe you should get the input of the people who will be most brutally affected by your shortsightedness.
So glad to know that you think design patterns and classic algorithms are worth studying.
Look, pick up a copy of Cormen, Leiserson, Rivest, and Stein's Introduction to Algorithms sometime. That's the definitive work on algorithms--if you need an algorithm, it's probably sitting in CLRS somewhere, along with some beautiful mathematical exposition about it. Every algorithm listed in the book can be trivially converted into Java. So why the hate for teaching CS with Java? It's a perfectly sensible language for many very important parts of CS.
Further, I've taught operating system design in Python. Yes, Python. When talking about how a translation lookaside buffer works, I don't write C code on the board. I write pseudocode in Python and say "so, this is how it looks from twenty thousand feet." On those rare occasions when we have to get down and dirty with the bare metal, then it's time to break out C--and we leave C behind as soon as possible. I want students to be focused on the ideas of translation lookaside buffers, not the arcane minutiae of implementations.
After all. Implementing it is their homework, and it involves hacking up the Minix code. In C.
If it were an easy subject, would changes need to be made to make it easier?
If it were a spectacularly hard subject with a 50% washout rate, would changes need to be made to make it easier?
I've been in courses where 50% of the class washed. They were horrible, horrible classes. The pedagogy needed to change. The learning curve needed to be smoothed out and made gentler. This is in no way equivalent to saying it was made easy. The fact you think otherwise brands you as an intellectual elitist who can't be bothered to think logically about his own prejudices.
Re:Java == Jobs (Score:1, Insightful)
Then use a different language that supports your pet features. First-class functions are a procedural hack and don't really belong in a truly OO design in the first place.
However, one could also argue that a design's "OO-ness" can never reach 100% no matter what language you use for implementation.
The map is shorter, less cluttered, has less potential for bugs, and is ultimately easier to understand once you understand what map and lambda do. Which shouldn't take a long time for a competent programmer.
I find the second block to read much more clearly, and six months from now if the guy who wrote it got hit by a bus I wouldn't need to make sure the next guy knew what map and lambda did. It makes work really dull to write code like that, but you're not paying people to show off their knowledge of a language's esoteric syntactic sugar. You pay them to get the job done in the most effective way possible, which in the real world, means writing the clearest, most maintainable code possible given the constraints of the language.
Re:tasty (Score:1, Insightful)
>more money when finished, than you would have if you had not gone.
funny, that's what i used to think. now that i have a couple of degrees, i finally realize that there is more to life than money. THAT is why one gets a bachelor's degree - to learn that there is more to life than just making money.
Re:Java for Dummies (Score:1, Insightful)
Unfortunately, it seems most other people agree. Based on my own experience, the user interface for data input is actually quite important - you need to think about how to make it as intuitive and foolproof as possible. Catching categories of errors and warning the user about them with some visual cue is a biggie, for example - or planning for how to scale a form for larger data input sets and still allow some sort of coherence in how it is displayed.
Of course, I'm sure the hope is that input is a relatively rare activity or a one-time-per-problem step, but nonetheless many errors can be caught there in a variety of real-world scenarios (corporate data process collection, for example, where controls are known).
Re:Why is "Computer Science" Staffing S/W companie (Score:2, Insightful)
Considering the years and (tens of) thousands of dollars you've invested in formal education, I can see why you would want to justify that decision. Personally, the years I spent earning money and gaining experience in the field led me to believe that a degree would have been a less-than-optimal use of my time and money.
Re:tasty (Score:3, Insightful)
Comment removed (Score:3, Insightful)