CMU Eliminates Object Oriented Programming For Freshman
fatherjoecode writes "According to this blog post from professor Robert Harper, the Carnegie Mellon University Computer Science department is removing the required study of O-O from the Freshman curriculum: 'Object-oriented programming is eliminated entirely from the introductory curriculum, because it is both anti-modular and anti-parallel by its very nature, and hence unsuitable for a modern CS curriculum.' It goes on to say that 'a proposed new course on object-oriented design methodology will be offered at the sophomore level for those students who wish to study this topic.'"
so the wheels are coming off the OO bandwagon then (Score:5, Interesting)
Re: (Score:2)
I've found the same problem with SQL and Prolog (oh, that was object Prolog; scrub that)
Victory for Tablizer? (Score:3)
Longtime Slashdot readers know and either love or hate user "Tablizer" [slashdot.org] .
He has a website [geocities.com] detailing his objections to object-oriented programming, while arguing for "table-oriented programming". It has been a fruitful source of flamewars over the years.
So, is this a vindication for Tablizer? Tablizer, what say you?
Re: (Score:2)
No, you have terrible reading and logic skills, because what I summarized is exactly what the post to which I replied actually means.
Computer scientists? (Score:5, Funny)
Re: (Score:3)
Not sure whether you're being sarcastic or not, assuming you're not...
Programming puts your mind in a certain 'mindset' that can help you analyse and solve problems, even if you don't actually get your hands dirty in the end.
Re: (Score:3)
Chem majors do work in chem labs.
Physics majors do work in physics labs.
Why shouldn't CS students do lab work?
Re:Computer scientists? (Score:5, Insightful)
Why are computer scientists even learning programming? When did this happen? Programming sounds like one of those get-your-hands-dirty jobs in flyover territory, where you would show a lot of ass crack on the job and live in a trailer park. Educated people don't do that.
They need to be able to program for the same reasons management and engineers need to spend some time on the assembly line: so they can learn how things actually work. There's often a wide chasm between "on paper" and "in practice" and ideas need to be able to traverse it.
Re: (Score:3)
Goodness forbid an "educated" person get their hands dirty. It is useful to put theory into practice through direct effort, and it demonstrates a proficiency with the practical tools and results that can only lead to better theory.
Why remove it? (Score:4, Insightful)
I don't see why it should be removed; it should be 'complemented' instead with other programming methodologies, in order to let students compare and contrast. But most CS work in the end will be done with OOP, so there's no reason not to start with it at the beginning - at least that's my personal experience.
In my first year we had a mixture of different programming styles, including functional programming. I never really used any of those; I'm sure there are certain places where Prolog or Haskell is used, but it's not as common as OOP.
Comment removed (Score:5, Funny)
Re: (Score:2)
Well, I would not call OO itself directly anti-parallel; the statefulness of objects is. The entire Actors/Messages model, which seems to be becoming more popular despite being functional, really blends well into OO. After all, some OO systems are basically very close to that: you have objects and messages for interoperation. The main difference from the Actors/Messages model is that OO leaves it up to the implementor whether the message sender/receiver can be stateful or not.
Outside of that, I personally think an
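The objects-plus-messages blend the parent describes can be sketched in a few lines. This is an illustrative toy (all class and message names are invented, using only Python's standard library), not anything from the CMU curriculum: each actor's state is touched only by its own thread, so other code interacts with it purely through messages.

```python
import queue
import threading

class Actor:
    """A tiny actor: state is mutated only on the actor's own thread,
    so callers never touch it directly - they send messages."""
    def __init__(self):
        self._mailbox = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def send(self, msg):
        self._mailbox.put(msg)

    def _run(self):
        while True:
            msg = self._mailbox.get()
            if msg is None:  # sentinel shuts the actor down
                break
            self.receive(msg)

    def stop(self):
        self._mailbox.put(None)
        self._thread.join()

class Counter(Actor):
    """An 'object' whose only interface is message passing."""
    def __init__(self):
        self.count = 0
        self.replies = queue.Queue()
        super().__init__()  # start the thread after state is set up

    def receive(self, msg):
        if msg == "inc":
            self.count += 1
        elif msg == "get":
            self.replies.put(self.count)

c = Counter()
for _ in range(3):
    c.send("inc")
c.send("get")
result = c.replies.get()  # blocks until the actor replies
c.stop()
print(result)  # prints 3
```

Whether the sender may hold state of its own is, as the parent notes, left entirely to the implementor here.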
Interesting move (Score:5, Interesting)
OO is practical for lots of problems because it makes modelling real-world data easy. However, it is not useful if you want to give students a solid understanding of theoretical computer science. OO is fundamentally data-centric, which gets in the way of algorithmic analysis.
To give a pure view of programming, it would make sense to teach pure functional and pure logic programming. If CMU really wanted to concentrate on the theory, they would have eliminated imperative programming from the introductory semesters, because it is very difficult to model mathematically. Apparently that was too big of a step.
Re: (Score:2)
On the other hand, you must balance theory with practice, because otherwise the students will a) leave or b) not be able to do practical projects while studying the theory. So teaching only logic programming (which is great, IMO - it helped me a lot!) is not practical.
Re: (Score:3)
Functional and logic programming get in the way of some aspects of algorithmic analysis too ... the hardware is imperative after all.
Re:Interesting move (Score:5, Interesting)
OO is fundamentally data-centric
Maybe the way that you do it. Personally I find that quite a lot of my classes have methods.
If CMU really wanted to concentrate on the theory, they would have eliminated imperative programming from the introductory semesters, because it is very difficult to model mathematically.
Imperative programming isn't really any more difficult to model mathematically than functional programming; they just use different branches of mathematics. Check out David Gries's The Science of Programming, for example, which shows how to do it, and Object-Z, which actually does it. The main difficulty is with side effects, but functional programming has the same issues as soon as you try to interact with the external world.
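For the curious, the Gries-style approach gives imperative assignment a purely mathematical meaning via substitution. The standard weakest-precondition identity, with one worked instance (assuming an integer variable x), is:

```latex
% weakest precondition of assignment is substitution:
wp(x := e,\; P) \;=\; P[e/x]

% worked instance, for integer x:
wp(x := x + 1,\; x > 0) \;=\; (x + 1 > 0) \;\equiv\; (x \ge 0)
```

So "x := x + 1 establishes x > 0" is guaranteed exactly when x >= 0 held beforehand - no operational hand-waving required.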
Re: (Score:3)
Sure your classes have methods. But good OO design says that your classes correspond to groups of real-world objects: bank accounts, people, widgets, whatever. Why? Because that is the data that needs to be processed. The methods are tied to the classes, meaning that their very definitions depend on the data modelling you have done.
Working this way is suitable when you are dealing with complex data and comparatively simple algorithms; get the data model right, and your problem is half solved. Moreover, basi
Re:Interesting move (Score:5, Insightful)
Sure your classes have methods. But good OO design says that your classes correspond to groups of real-world objects: bank accounts, people, widgets, whatever. Why? Because that is the data that needs to be processed.
That seems to be a fundamental misunderstanding of OO. The focus on real-world objects is because they have self-contained properties and behaviour. "Because that is the data that needs to be processed" is no more (and no less) the case than "Because that is the behaviour that needs to be produced."
The methods are tied to the classes, meaning that their very definitions depend on the data modelling you have done.
Or (and just as valid): the members are tied to the classes, meaning that their very definitions depend on the use cases (i.e., actions) you have identified.
Working this way is suitable when you are dealing with complex data and comparatively simple algorithms; get the data model right, and your problem is half solved.
You do realise that data model != object model, don't you?
The other situation is where you have complex algorithms, but simple data. Signal processing, control software, scientific programming, etc.. Take a radar as an example: the data is just a continuous stream of numbers coming in off the radar dish; but the algorithms that extract meaning from those numbers are very sophisticated. Using an OO language for problems like this is like trying to loosen a screw using a wrench - maybe you can make it work, but it's basically a really bad fit for the problem.
Radar data was a bad example. A modern radar processing system takes data from many sources such as primary radar heads, secondary radar heads, ADS-B, ADS-C, WAM, neighbouring radar processing systems and so on. These behave the same in most ways but differently in some, and OO turns out to be an excellent way of dealing with that.
Yes, in some applications OO doesn't offer you anything significant. They are typically the cases in which you end up effectively with one class because the data and the processing are so simple. But in this context "simple" processing doesn't mean mathematically or computationally simple, it means that you only do one thing with the data (in the case of the radar front-end, for instance, the one thing will be extracting distinct decluttered returns into distinct objects for passing to the next processing stage). OO design still doesn't get in the way, though: if that's the only thing going on in that process you simply end up with a design with only one class having only one significant method (so if your language supports it you might just as well write the lot as a single routine with no class packaging).
For what it's worth, I have no problem with OO not being taught from the outset on a computer science course. It's a decompositional/compositional approach and is not a lot of use until you understand what you are decomposing and composing.
Re:Interesting move (Score:5, Interesting)
The other situation is where you have complex algorithms, but simple data. Signal processing, control software, scientific programming, etc.. Take a radar as an example: the data is just a continuous stream of numbers coming in off the radar dish; but the algorithms that extract meaning from those numbers are very sophisticated. Using an OO language for problems like this is like trying to loosen a screw using a wrench - maybe you can make it work, but it's basically a really bad fit for the problem.
I nodded at first when I read this, then I realized, the complex algorithms are objects in themselves. And the "simple data" usually has structure and meaning to it. I have some experience trying OO approaches to scientific computing. A naive approach can result in some awful problems that add little to the fundamental work.
My personal experience has been with writing matrix algorithms. The matrices involved were "large" and usually had nice internal structure (for example, a class structure might be rectangular matrix, upper/lower triangular matrices, diagonal matrix, constant matrices, constant diagonal and the zero matrix). They are "simple" data sets, but a little insight into their structure can result in significant savings in data stored and in the cost of common matrix operations (like matrix multiplication).
For me, I ran into some deep problems when I attempted to operate on multiple matrices with structure. For example, suppose in addition to the list mentioned above, I create a new class type for block diagonal matrices. There were already six matrix classes. So if I want to do matrix multiplication, I have potentially 13 new pairings of classes (left or right multiplication by block diagonal matrix on the six existing classes plus multiplying two block diagonal matrices together).
For my language choice, C++ (and later Java), I couldn't just implement matrix multiplication as a method. I know there's some stuff with better OO implementation out there, but that's what I was using. So I ended up representing matrix multiplication as its own object, and registering specialized multiplication algorithms in a poset structure. When two matrices need to be multiplied, the matrix multiplication class looks up all the specific algorithms that would work and picks one with the best heuristics.
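A stripped-down version of that registry idea might look like the following sketch (in Python rather than the C++/Java the parent used; the class names and the MRO-walk fallback are invented for illustration, and the poset/heuristic ranking of candidate algorithms is omitted):

```python
class Matrix: pass
class Diagonal(Matrix): pass
class BlockDiagonal(Matrix): pass

# Registry mapping (left type, right type) -> specialized algorithm.
_mul_rules = {}

def register_mul(left_t, right_t, fn):
    _mul_rules[(left_t, right_t)] = fn

def multiply(a, b):
    """Walk each operand's inheritance chain to find the most specific
    registered rule, falling back to the generic (Matrix, Matrix) one."""
    for lt in type(a).__mro__:
        for rt in type(b).__mro__:
            fn = _mul_rules.get((lt, rt))
            if fn is not None:
                return fn(a, b)
    raise TypeError("no multiplication rule registered")

# Each new structured class only registers the pairings it can exploit;
# everything else falls through to the generic algorithm.
register_mul(Matrix, Matrix, lambda a, b: "generic")
register_mul(Diagonal, Diagonal, lambda a, b: "diag*diag fast path")

print(multiply(Diagonal(), Diagonal()))       # prints: diag*diag fast path
print(multiply(BlockDiagonal(), Diagonal()))  # prints: generic
```

This sidesteps the "13 new pairings per class" explosion the parent describes: unregistered pairings cost nothing beyond the fallback lookup.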
Whether the complexity of that overhead is justified, well, eh... I never did finish it. All I can say is that what I implemented allowed me to throw stuff together without thinking too much about the internal structure or taking a huge hit on computation (given what I was doing: PDE difference-method modeling and least-squares stuff on large piles of data).
But this example steers my intuition here. You may have several competing complex algorithms. You may end up changing the structure of the simple data you operate with. The places where your scheme can change are places which you can and maybe should objectify. Also you have processes which you can abstract through objects, for example, if you are piping data through a DSP system, then a stream structure with filters applied to the stream in sequence might be a suitable abstraction for what you're doing.
Re: (Score:3)
Most of the imperative languages I have used in the last 45 years have been easy to model, but not easy to prove mathematically at the same level of logic. They invariably needed to be proved in some customized Higher-Order-Language, and many times this was LISP or PROLOG.
Object-Oriented programming has the drawback of distributing data all over the system, and it tends to obscure the underlying structure of the problem space and the matching solutions. OOP is convenient for plugging completed subsystems in
OO a tool for craftsmen, not comp sci (Score:5, Insightful)
It seems to me that CMU's made the very obvious decision that today, OO is a tool for craftsmen, not for freshman computer scientists. And they're probably right. It's important not to confuse the tools of the trade with the basics of the science, and this is especially true at the freshman level. For a good while (going back decades) OO was enough on the leading edge that its very existence was an academic and research subject, but that hardly seems necessary today.
In the electrical engineering realm, the analogy is deciding that they're gonna teach electronics to freshmen, and not teach them whatever the latest-and-greatest-VLSI-design-software tool is. And that's a fine decision too. I saw a lot of formerly good EE programs in the 80's and 90's become totally dominated and trashed by whatever the latest VLSI toolset was.
Re: (Score:2)
CMU decided that the future is focusing on parallel. Because of the direction of hardware, I think they are on the right track.
Re: (Score:2)
Ok, so let's make a computer science degree exclusively about "computer science" as opposed to "computer programming". Then we might as well dispense with the CS degree requirement for the vast majority of programming positions. Then we should realize that the market for "computer scientists", the ones that design pure math algorithms and do scientific studies of computer-related systems, is extremely small and already overfilled. Then everyone will realize that going into "computer science" as opposed to a
Re: (Score:3)
We teach music *theory*, not how to play an instrument. Nothing to do with us.
Computers are tools. You program them to get them to do something useful. They're not an abstraction, they're real. Computer science is about the theory behind computers. It's not about some "higher dimension" of thought that is unrelated to programming machines, it's all about what you can achieve by programming those complex machines.
Otherwise, why study functional or logical programming? That's "dirty" as well, isn't it? Why ha
Re:OO a tool for craftsmen, not comp sci (Score:5, Insightful)
Well, I'd agree with you if this were a class in an O-O language, but it's a class in a major programming *paradigm*. And even theoreticians have to express themselves in code. If the summary is to be believed, the department is taking a stand against O-O programming based on what it sees as the paradigm's shortcomings. Let me tell you, thirty years ago you'd hear the same kinds of arguments in a computer science department about how bad virtual machines were. Virtual machines were an early, failed attempt at getting around the limitations of early compiler technology. C showed you could have a language which was nearly as fast as assembler (faster for most programmers), and easily portable across architectures. There was no reason to study virtual machines; they had no practical application. Well, it probably would have been a good idea to have them in the curriculum somewhere to study on a theoretical basis, because ideas in computer science have a way of re-emerging.
Computer science as a discipline embraces topics other than the theory of computability and complexity theory, which might as well be taught in the mathematics department. For example there is computer language design, hardware architecture, data communication theory, database theory, AI etc. Some of these could be put into other departments, I suppose, but an understanding of the core intellectual discipline *is* widely applicable to all the topics that commonly fall under the "computer science" rubric.
I think an empirical case for separating computer science from software engineering might have sounded convincing fifteen years ago, but Google, and the Internet in general, are strong disproofs of this. Google isn't a search company. It's an algorithm company. It's about doing things on a scale so massive it requires serious computer science ability, well above the "tradesman" level of expertise. Which is not to say that what it does is as "pure" as the work an academic interested in computability per se and defining new kinds of complexity classes, but that kind of work, isolated from its applications, is of such narrow interest it hardly justifies the existence of a department separate from the Mathematics department.
I do see the need for separate concentration areas or departments for computer science per se, software engineering and information science, although having employed a number of people with Masters' degrees in information science I have grave doubts about the academic quality of degree programs in that field. An academic degree should give you the intellectual tools to think about a problem, not necessarily the practical tools to work on it, which can be picked up other ways. The O-O paradigm fits into this framework. The requirement to learn a specific O-O language like C++ or Java is a different matter; somebody with a degree in C.S. should be able to pick any such language up because he should understand the *design* of the language.
I suspect we're seeing departmental politics here.
Re: (Score:3)
Say what? Virtual machines were an early, successful attempt at allowing multiple users to use the same machine at the same time for different purposes.
Re: (Score:3)
I'm talking about VM/CMS. Yes, it's time sharing. It's also based on virtual machines (hence "VM").
A larger problem (Score:5, Insightful)
The big problem with OO courses has always been that there's far more focus on the mechanism (and, lately, on design patterns) than on actually structuring your program to take advantage of objects. The word "polymorphism" isn't as important as getting across the concept that you can coordinate a few things as if they were the same thing, and have them act in different ways. Knowing a bunch of design patterns by name isn't as important as figuring out once and for all that you can structure your program in such a way that you can create interfaces (not necessarily actual interfaces/protocols) between different parts of your program and essentially replace them, or at least rework them entirely, without the whole house of cards collapsing.
There's no focus on making it click. There's no course that teaches you explicitly just about formulating problems in code, and that makes me sad.
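For what it's worth, the "coordinate a few things as if they were the same thing" idea fits in a dozen lines. A toy illustration in Python (not from any course; the shape classes are invented):

```python
class Shape:
    def area(self):
        raise NotImplementedError

class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side * self.side

class Circle(Shape):
    def __init__(self, r):
        self.r = r
    def area(self):
        return 3.14159 * self.r * self.r

# The caller treats every shape the same; each one acts in its own way.
shapes = [Square(2), Circle(1)]
total = sum(s.area() for s in shapes)
print(round(total, 2))  # prints 7.14
```

The loop never asks what each thing is; that is the whole concept, with or without the word "polymorphism" attached.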
Really? (Score:3)
Re:Really? (Score:5, Funny)
Perhaps I'm misunderstanding the post... it sounds to me like OO techniques are only going to be taught in elective courses from now on. If that's the case, I think CMU is missing the fact that the majority of development work in the "real world" is done on already-existing platforms. Parallel/cloud computing and modular design may be the "next big thing", but what happens when the student gets their first job working with an application built with Java or .NET? Maybe in their ivory tower they can say "OO is dead" but in the real world, OO is very real.
This is a CS program we are talking about. Much like economics, in these disciplines the real world is often considered a special case.
Re: (Score:2)
When I went to school, graphics was an elective.
I didn't take graphics, so I didn't get a job where I needed to write a ray tracer or whatever.
See how that works? Not so hard.
Anti-Modular? (Score:5, Interesting)
Apparently the meaning of "Modular" has changed since I was in University back in '82. OO used to be the epitome of modularity.
But I do agree that making it an introductory first-level course does warp the mind of the young programmer. There are a lot of languages that don't enable OO programming at all (e.g. Erlang), which become much more difficult to grasp because OO is so ingrained in their thinking.
I can't think of anything specific about OO that makes it poorly suited to parallel programming. There are languages whose nature is parallelism (again, Erlang), but that's usually accomplished by adding parallelism operators and messaging operators to a relatively "traditional" language. I don't see why you couldn't add and implement those constructs in a non-parallel language.
I also shudder to think how a CS student is going to deal with parallelism using languages that don't make it a natural extension if they're learning to rely on those extensions in their first year.
I gotta tell you, though, I really object to the use of Java as an introductory language for programming. Java is far from a shining example of any particular style of programming. It's not real OO because it only supports single inheritance. It's not designed for parallelism. It doesn't have messaging built in. In short, Java is actually a pretty archaic and restricted language.
Re: (Score:2)
I also shudder to think how a CS student is going to deal with parallelism using languages that don't make it a natural extension if they're learning to rely on those extensions in their first year.
I don't shudder. They should learn why some patterns lend themselves better to parallelism than others, instead of just learning some high-level tools baked into the language.
For instance, if you give a student Erlang or Scala with a dedicated actor model, I personally think they will never grasp why this high-level construct works better as a pattern than, let's say, a critical-region/semaphore-based model.
The students need a deeper understanding than applying a few patterns. The entire segment of parallelism probably s
Re:Anti-Modular? (Score:5, Interesting)
Yes, there has been much progress in module systems. The very Bob Harper mentioned there was involved in the design of the SML module system, which is often cited in the CS literature as `the' reference module system. SML achieves a level of isolation that is simply impossible to achieve in a language like Java or Smalltalk. In Java, you can still call `hashCode()' and `toString()' etc. on any object, and it's often possible for someone to subclass one of your internal classes and thereby break your intended module structure.
In SML, you can confine types through (e.g.) the following signature:
(* this is imperative code; a typical SML program focuses on functional code, but it's good enough for illustration purposes. *)
signature STACK =
sig
    type 'a stack (* 'a is a type parameter, so "'a stack" is what OO-land calls a `generic type' *)
    val empty : unit -> 'a stack (* create a fresh stack *)
    val push : 'a stack -> 'a -> unit (* take a stack of 'a elements and an element of type 'a, and plug them together *)
    val pop : 'a stack -> 'a option (* "'a option" means that the operation may fail *)
end
You can then implement this stack in various structures that match this signature, and confine it in such a way that only the operations listed above are available. That is, you can't stringify stacks (there is no such operation listed there, though you can choose to add one), you can't compare them (again, no such operation), you can't `reflect' on them and you can't access their `protected' functionality by subclassing them, unless the stack implementer put a separate view of the structure into place for that particular purpose.
Why is this good? Client code won't end up using features that you didn't want to expose. Why is it bad? If you forgot to expose something important, someone else will have to re-implement it. But that's your fault, then, not the language's fault.
SML also allows you to build modules from other modules through something called `functors', but let's leave that for another time.
Now, SML's modules have issues-- you can't have mutual dependencies between them (which does have advantages, too, though), and the question of how to integrate type classes (something you may know as `C++ concepts') is unresolved. But the concepts behind the module system are clear and powerful. So if you want to teach the concepts underlying modular software design, this is a vastly better choice than most other options out there. (I remember the Modula-3 module system being fairly good, too, but not quite at this level.)
So, for teaching purposes I'd say Bob Harper is closely connected to the best system out there that has actual working implementations tied to it.
OOP in freshman year (Score:5, Interesting)
Many kids coming to college these days have no programming experience, or very shaky experience at best. Picking up concepts like classes, inheritance, and the entire idea behind OO modelling is difficult if you are lacking basics such as how memory is managed, what a pointer is, and how to make your program properly modular. From the course description they are going to use a subset of C; I think that is a good starting basis for transitioning to something else (C/C++/C#/Java/...) later on.
What is worse, many of these introductory courses were given in Java - producing students who were completely lost when the black box of the Java runtime and libraries was taken away - e.g. when having to transition to C/C++. We are talking about engineering students here who could be expected to work on embedded systems later on, or perhaps do some high-performance work. Even Java and C# still need C/C++ skills for interfacing the runtime with the external environment.
I think it is a good move, indeed.
Re:OOP in freshman year (Score:5, Interesting)
I agree that OO in the 101 course is a little much. You should really be focusing on simple programming techniques that a non-major might encounter when, say, writing a batch script or macro. I'm not sure about the second semester courses, though, since those are more for potential majors. Certainly at some point a CS major needs to be exposed to OO, but I don't think it needs to come first.
As for understanding the infrastructure, I do think C/C++ get you closer, but in my experience it doesn't really click until you take some kind of computer architecture course or similar. For instance, I didn't *really* understand pointers until I understood how values and addresses are stored in memory.
Re: (Score:3, Interesting)
What is worse, many of these introductory courses were given in Java - producing students who were completely lost when the black box of the Java runtime and libraries was taken away - e.g. when having to transition to C/C++.
What is worse, many of these introductory courses were given in C/C++ - producing students who were completely lost when the black box of the C/C++ runtime and libraries was taken away (stl, libstdc++, libc, stdlib/malloc) - e.g. when having to transition to Assembly.
What is worse, many of these introductory courses were given in Assembly - producing students who were completely lost when the black box of the Assembly runtime and libraries was taken away (OS virtual mem) - e.g. when having to transition to
It's about time! (Score:2)
Thank goodness a university has finally decided to teach a curriculum based on what its professors like, instead of adhering to silly concerns about what might be useful in the real world. Students can rest assured that they'll get a first class CS education, and--sorry, what was that? Jobs? You want to get a job? What the fuck do you think this is, DeVry?
Now go finish your LISP homework!
Re: (Score:2)
Speaking of which, I recently read a message from someone who was praising Lisp as the perfect choice for expressing algorithms. Talk about boneheaded. This guy probably never worked with languages like Pascal, which really put strong emphasis on clean syntax for expressing algorithms; instead he has been drowning his brain in parentheses.
They are right (Score:2)
The decision is sound. OOP mostly leads to bloated software, mind-bending and partly just plain stupid object-oriented constructions, and it strongly encourages mutation; in a sense it's even based on the idea of mutating objects. Because of the dramatic increase in concurrent programming, in the future strictly functional programming will be much more important than OOP. I understand that OO enthusiasts will not like to hear this, and of course OOP also has its good sides, but it is highly overrated. A good mo
This is a gOOd thing. (Score:3)
Programming is about utilizing paradigms. Not being stuck in one. OO is just another layer on top of anything else. It is a set of rules that one can follow in any language. Some languages have it built in for convenience, but it is also an inconvenience for implementations where it is not so optimal.
There was definitely a moment when OO seemed to be some new paradigm in programming, but no, it is merely a tool, and CMU put it in its rightful place.
Good strategy. (Score:3)
Right for the wrong reasons (Score:3)
OOP should absolutely not be taught at the freshman level, because it gets in the way of understanding more basic concepts like, oh I dunno, variables, branching, looping, subroutines, I/O, etc. Their claim that OOP is "anti-modular" is of course absurd. "Anti-parallel" is probably arguable, but how that's relevant at an undergraduate level of instruction is highly questionable.
At any rate the real problem is that colleges offer degrees in "computer science" but not "software engineering". How many of these students will finish their degrees without ever having committed to source control? Without coding to a 3rd-party API? Connecting to a database? Performing maintenance programming? Working in an honest-to-god team?
Come back to reality.. (Score:3)
Of course you don't want to teach algorithm development on a high level language that already has all of the algorithms (that these teachers are capable of teaching) built in. What would be the point of that? And I think when they said anti-parallel and anti-modular, they meant in the attitudes it instills in students.
--- start off topic rant ----
For the record, these degree programs provide you with skills tuned for different jobs. The CompSci folks are algorithm based - down in the details making things fast and efficient. The SoftEng folks are architecture/technology based - up at the top making things that WORK and that people can actually use; you know - products. They are COOPERATIVE jobs with a very large overlap.
Any CompSci should be grateful to have talented SoftEng helping them provide a cogent, fast, and polished view of their algorithmic efforts; your work means nothing if it never gets in the hands of the users or if it's spoiled by a bad implementation.
Any SoftEng should be pleased to have hard-working CompSci making fast algorithms to help them create their next product. You couldn't make these fancy architectures if you didn't have well-thought-out underpinnings backing you up.
I see some piss-poor attitudes on this thread and some of you snobs and dorks are in for a rude awakening.
First math and now OO... (Score:3)
A few weeks ago, there was a lot of discussion about how many universities want to drop the math requirements from their CS degrees. Now we have dropping OO, too. I seem to recall that CS stands for Computer Science. Should not somebody with a Computer Science degree be able to handle things like OO and math? I could see dropping these requirements if the degree were something like Computer Technician or Computer Programming (although even that last one would indicate that a graduate would be versed in multiple programming paradigms).
Is all of this a continued dumbing down of college so that the masses can get a degree? If so, it appears more and more technical jobs will go offshore as the next generation of true computer scientists won't receive their training in this country.
On the other hand, if the goal is to pay a university $100K to teach you how to do web pages and the like, then go for it. But if the US wants to return to being a technology leader, then it needs to focus on more than just the currently hot marketing skills and focus instead on the whole big picture.
Another CMU Perspective (Score:4, Insightful)
http://reports-archive.adm.cs.cmu.edu/anon/2010/CMU-CS-10-140.pdf [cmu.edu]
Whatever you may think about delaying OO--and opinions are mixed at CMU as everywhere--one advantage of the new curriculum is that the sophomore-level course can do OO design more justice than we were ever able to do in the prior intro sequence, since the students already know how to program. Modularity and parallelism are in fact major emphases of that course, which I and other CMU faculty are currently developing.
Re:Hmmm ... (Score:5, Insightful)
I don't know about *starting* in assembler, but a programmer who isn't somewhat proficient in assembler is going to have a very weird mental model of how programs work. OOP has the same problem--it's not that OOP is bad; it's that if you start out with an OOP language, you don't learn a lot of things that are important to understand. Once you know how the machine works, then you can start studying abstraction. Treating OOP as the only way, or even the best way, to solve any computing problem is going to tend to produce programmers who think everything is a nail. It doesn't mean that there are no nails, nor that students shouldn't learn to swing a hammer.
Re:Hmmm ... (Score:4, Interesting)
Re:Hmmm ... (Score:5, Funny)
Re:Hmmm ... (Score:5, Informative)
Um. No. Many modern libraries or "frameworks" (newfangled word for library) are OO. Most OSes remain written in classic system programming languages like C and assembly language. In fact, most frameworks start as object oriented wrappers for certain OS calls and cruft up from there.
Re: (Score:3)
That certainly depends on what OO means to you. From where I sit, the Linux kernel can be considered OO code: everything is quite modular and it's a very nice state machine. The fact that it doesn't use classes or an OO language doesn't mean anything.
non-OO code is like libc
Actually, not using classes does mean something: the code is more ugly, inflexible and hard to maintain than necessary. Speaking as someone who lives and breathes the stuff.
The Linux kernel would be far better off actually compiling its class-oriented portions with a proper object-oriented compiler. Likely as not, the code would be more efficient due to optimizations that can be done on virtual functions. Look at the hundreds of places where you have "if (obj->fn) obj->fn". In many cases that extra test i
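For illustration, here's a minimal C++ sketch (hypothetical names, not actual kernel code) contrasting the kernel-style optional function pointer described above with a virtual call, where the dispatch is implicit and never null:

```cpp
#include <cassert>

// C-style "manual vtable": callers must null-check each hook before use.
struct file_ops {
    int (*read)(int n);   // may be null if the driver doesn't support it
};

int read_twice(const file_ops* ops, int n) {
    // The "if (obj->fn) obj->fn(...)" pattern the comment refers to.
    return ops->read ? ops->read(n) + ops->read(n) : 0;
}

// C++ equivalent: the vtable slot is filled by the compiler and is never
// null, so the per-call test disappears from the caller's code.
struct File {
    virtual int read(int n) { return 0; }  // default body instead of a null check
    virtual ~File() = default;
};

struct Driver : File {
    int read(int n) override { return n; }
};
```

Whether the virtual version is actually faster depends on the compiler and call site, but it does remove the repeated explicit test.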
Re:Hmmm ... (Score:5, Insightful)
Agreed ... but aren't most modern OSes OO-based? In most cases students need OO programming in order to become employable. OO certainly isn't the holy grail of computing, but it is entrenched in business and needs to be taught just like COBOL was all those years ago (when I had to learn it even though it was like writing a book every time I wanted to write a small program).
And how is this an argument for including it in the introductory freshman curriculum? I put forward the possibility that some topics may be more appropriate to teach students only after they've learned the basics.
Re: (Score:3)
Yes, learning Basic at home helped me a lot with Java.
I had a little trouble at the beginning compiling with the line numbers, but I solved it quickly. Now my code is as efficient and good looking as everyone else's.
/* 10 */ public static void main(String args[]) { // This line does nothing. Brackets are empty and they did not even 'DIM' args
/* 20 */ System.out.println("Hello world!"); // semi-colon because the guys at Sun / Oracle didn't figure out that the line ends at CRLF
/* 30 */ } // Maybe this has something to do with the opening symbol at 10
Still working on the GOTOs, though.
Re: (Score:3)
Programming is very easy, but requires math and logic skills.
I agree that learning the basics of programming is easy. But, like with any other trade, becoming a good programmer requires the right kind of motivation, curiosity, attention to detail, and desire to improve one's craft.
In my experience people who say "Programming is easy" are usually the ones you don't want to have working with you on a project.
Programming is a craft. You have to love doing it to be good at it.
Anyone can saw wood or swing a hammer. It takes a passionate person to properly build a st
Re: (Score:3)
If you come out with solid math and science skills you do fine. Programming is very easy, but requires math and logic skills.
That is certainly one (popular) route to take into programming. A strong background in linguistics also provides a solid mental foundation for programming. The important ability to cultivate is structured, organized thinking.
Re: (Score:3)
Man, HS has gone downhill. In the 70's I had 6 years of math, 3 years of chemistry, 1 semester of physics and 1 semester of programming in Fortran. The card decks had to be sent to the main downtown facility to be run and they would send back the printouts the next day. Now, with computers all but free, your school could not come up with a VB course? Sad indeed.
Re: (Score:3)
I went to CMU. The curriculum is designed so that if you work hard, you can make it through the freshman courses without an incoming knowledge of programming; it's much, much easier if you know some basics coming in, but if you're motivated it's not necessary.
Re:Hmmm ... (Score:5, Insightful)
I outright laugh at people in an Interview when they ask me if I'll write OO code.
Good advice for those seeking work.
If you can't read straight up C code and understand what the fuck is going on, stop calling yourself a programmer.
They used to say that about assembly. Try writing a modern game or GUI in assembly. You might well be able to do it, but by the time you're done I'll have finished several projects and be moving on to the next one.
If you can't code directly for the hardware you're interfacing with, stop calling yourself a programmer.
Right. Abstraction has no place in computer science education or in programming. As a matter of fact, if you're using anything other than a magnetized needle to write to your hard drive, quit calling yourself a computer user.
If you depend on .NET, any library, framework, or something written by someone else, you're not a programmer. You're a script kiddy.
Right. Code reuse is so lame. Anything that reduces development time and improves efficiency has no place in computer science. Everything should be written directly in binary on a computer you, the programmer, built from latches, switches, and flip-flops. Otherwise, you're just a bum.
OO should never be taught lest we end up with a generation of useless tools who think they're "programmers" that can't actually accomplish fuck all
Right. After all, every program written using OO is useless and nonfunctional. Except for the hundreds of thousands of programs that work just fine.
Go back to your garage with your unflinching dedication to The Old Ways and leave the rest of us to be productive.
Re:Hmmm ... (Score:4, Informative)
Re:Hmmm ... (Score:5, Funny)
So you can't call yourself an English major if you can't read Sumerian cuneiform?
Re: (Score:3)
Personally, I think the best wa
Re:Hmmm ... (Score:4, Funny)
Yes it produces more powerful methodologies, because of the synergistic effects that objects have with modern cloud environments.
Re:Hmmm ... (Score:4, Insightful)
You are still probably doing OO on some level even in C. OO is not a whole different paradigm, but is rather an institutionalization, so to speak, of a variety of patterns that are already used in procedural programming, with its own additional "enhancements" (for better or for worse).
I don't think the difference in job markets between C and Java really has a lot to do with OO as a methodology. It probably has more to do with what's available on what platform and what companies are using those platforms (more importantly, what *kind* of companies are using those platforms).
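As a hypothetical sketch of the point that procedural code already uses OO patterns, here is the classic C-style idiom: state in a struct, behavior as free functions taking the struct as an explicit first argument. This is exactly what OO languages institutionalize as methods and `this` (the names here are illustrative):

```cpp
#include <cassert>

// A hypothetical "counter object" in procedural style. No classes, no
// inheritance - yet the encapsulation pattern is recognizably OO.
struct Counter {
    int value;
};

void counter_init(Counter* c) { c->value = 0; }       // "constructor"
void counter_add(Counter* c, int n) { c->value += n; } // "method" with explicit self
int  counter_get(const Counter* c) { return c->value; }
```

For better or worse, a language like C++ mostly gives this pattern syntax: the struct becomes a class and the explicit pointer becomes the hidden `this`.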
Re: (Score:3)
By your logic, universities should be teaching COBOL again. COBOL programmers STILL, 40 years after the language's heyday, make massive amounts of money to basically maintain its ugly code because nobody else wants to. Do you respect COBOL programming more than C? Then I suggest you detach the salary proposition from the equation.
I learned C in school and for years worked in companies where my job was in C. I was in a tight spot in 2001 when the bubble broke and I dove into a Java shop. I had done a
Re:Hmmm ... (Score:4, Interesting)
Re:Hmmm ... (Score:5, Interesting)
If it means they stop doing everything in Java throughout their education, I'm all for it. There's nothing wrong with Java, and I use it often in my current company, but kids in school need to learn from the get-go that languages are tools in a toolbox: use the right tool for the right job when you can. I can't remember the last time I interviewed a graduate who had used C++ or C outside of a single survey course on the language! Hell, I can't remember the last time I interviewed a post-2000 graduate who had built their own processor or had even taken an assembly class. The kids are just as smart, just as eager, but woefully unprepared. The one thing they are getting a little better at is including some 'software engineering' in the curriculum - but only a little bit better, in that they do 'projects together', which in my experience means that the alpha nerd does 90% of the work and the other 4 team members offer worship and keep the ramen coming.
Re:Hmmm ... (Score:4, Insightful)
but only a little bit better in that they do 'projects together' which in my experience means that the alpha nerd does 90% of the work and the other 4 team members offer worship and keep the ramen coming.
That's called "managing expectations", and is a wonderful introduction to their future careers for a lot of CS grads. Now go fetch me some coffee while this build finishes, would you? Thanks.
Re: (Score:3)
An applicant for what though?
If I wanted a guy/gal as a back end systems and server engineer, I would want somebody with a well rounded CSCI education including assembler, kernel space programming, long running process experience, et cetera.
If I wanted someone who had to rapidly iterate client software because I was running a consulting firm, I'd prefer the person with the tech writing skills.
I've never seen a single job post 2000 that required students to know calculus either, but I'd still prefer to hire
Re: (Score:3)
It's not compiler optimizations you're missing when you only learn Java. Sure, Java does lots of stuff for you and does the "right thing" most of the time... But it doesn't teach you how things work at a low level. Learning only Java doesn't teach you how indexes in a database work, because you just use Java's built-in data structures that "just work" and you never have to learn how to manually build a B-tree. Which then teaches you how indexes work, and by deduction why it's a bad idea to index every column
Re: (Score:2)
Yes, but how do you create a transistor? Do you just buy one?
I think all studies should start from scratch - throw you on an island and let you invent calculus yourself. What's the point in progress if you can't explain it to a 10 year old in two sentences?
Re: (Score:3)
far better to hand wire logic gates and address decoders to have a real feel for what's going on.
Laugh all you want, but this was a big part of my intro to CS class way back before "CS" turned into "using Visual Studio".
Programming is a useful skill and should be taught to CS students, that isn't in question. However, college isn't (or shouldn't be) a trade school. Programming is a bit of an art, and consequently shouldn't be a major part of any CS curriculum, certainly not the dominant part as it seems is the case at far too many institutions.
That said, first year students should learn to program,
Re: (Score:3)
I actually agree with that.
Or rather, it makes sense that students get a chance to sort of "climb up" to the current level of abstraction. Start out with basic electronics, digital gates, registers and ALUs, machine code/asm, C, OO...
Maybe I'm biased because I sort of followed that path on my own and I'm annoyed by all the developers out there who started out with VB using Visual Studio and then went on to VB.NET or Java using Visual Studio or some other IDE. I mean, I've worked with guys who think that som
Re:Hmmm ... (Score:4, Insightful)
I actually have never believed that assembly is in any sense informative or educational because "it is how the computer works". It's actually a remarkably hard to decipher representation of a program, and what the program is and does is the important part.
I consider myself quite a strong programmer, and I never think in terms of assembly when writing programs. Higher-level languages, in particular functional programming languages, are far closer to my mental model of things. Why not work in a language that represents or helps formulate the problem-space abstractions better?
Re:Hmmm ... (Score:5, Insightful)
In a word, performance. In your particular field of expertise, performance and memory footprint may not matter (99% of desktop applications, I suspect), but a strong programmer is fully aware of what goes on under the covers when [his|her] program is compiled, to the degree that it impacts the product. Until recently, I never thought there was a functional difference between a lone line containing "i++;" or "++i;". Of course, for variable assignment it matters, but what's going on under the covers? If you stop and think about it, i++ actually has to return the old value. ++i can destroy that old value and never needs to worry about returning the old value (you can avoid an extra copy). If you're in a high-performance loop, either with not much body or with a lot of increments, such things matter. Understanding the impact of caching, the scarcity of registers, the high cost of flushing the cache: all of these can matter, and it's hard to teach these concepts without using at least a pseudocode that looks suspiciously like assembly. If you're paying for compute cycles, or if you're selling something in a market where performance matters, you're going to see a strong advantage in a computer scientist who understands assembly and compiling.
Now, if your main concern is code readability, maintenance, and/or moving on to the next product as soon as this one compiles with no errors, higher-level languages are undoubtedly far more appropriate.
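For built-in ints the compiler elides the unused copy, but with a user-defined type the canonical post-increment really does construct a temporary. A minimal sketch (hypothetical type, instrumented to count copy-constructions for illustration):

```cpp
#include <cassert>

// A hypothetical wrapper type with the canonical increment operators.
struct Big {
    int v;
    int copies = 0;               // how many times this value was copy-constructed
    Big(int x) : v(x) {}
    Big(const Big& o) : v(o.v), copies(o.copies + 1) {}

    Big& operator++() {           // pre-increment: mutate in place, return *this
        ++v;
        return *this;
    }
    Big operator++(int) {         // post-increment: must preserve the old value
        Big old(*this);           // the extra copy the comment describes
        ++v;
        return old;
    }
};
```

Used as `b = a++;` the post-increment pays for at least one copy of the whole object, while `++a` pays for none - which is why style guides suggest preferring pre-increment for class types.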
Re: (Score:3)
You need not worry about that nowadays, at least in C and C++. Recent compilers optimize the useless instructions out automatically.
Re: (Score:3)
If you stop and think about it, i++ actually has to return the old value. ++i can destroy that old value and never needs to worry about returning the old value (you can avoid an extra copy). If you're in a high performance loop, either with not much body or with a lot of increments, such things matter.
Note that this only matters with overloaded ++ operators in C++. For the basic types, the compiler will optimize out an unused copy.
Re:Hmmm ... (Score:4, Informative)
Isn't it great that we have optimizing compilers that'll take care of considerations like that for us?
Code should be written for clarity first, and only if it turns out that it's not quite fast enough should you start worrying about whether it should be i++ or ++i.
Re: (Score:3)
I sort-of agree with you; I am not making an argument that one "should not" know assembly, or at least the principles of it, so you can read output even if you did not actually write it yourself. It's about being a well-rounded programmer.
My issues are more about this (mythical, to me) attitude that some people display that assembly is "the real thing" that is just hidden by abstraction in higher-level languages and that this is some kind of a mental crutch and that the assembly is what somehow actually "tell
Re: (Score:3)
I think you have a very limited view of what kind of programming folks do. Only a small percentage of folks do programming at the level where they need to know how a processor works. The vast majority of programmers out there work in high-level languages like Java or SQL or XSLT, where the nature of the processor's capabilities, whether it is RISC or CISC, or whether it is a multi-processor or uni-processor system, doesn't matter a bit to what they are delivering. A lot of times you have assembler bigots on /.
Re: (Score:3)
my experience is that knowing how the processor or the stack works isn't helpful in 99% of cases.
However, it is helpful in 100% of cases where efficiency matters.
Re: (Score:3)
Which is why a certain CS professor, you may have heard of him, he's called Knuth, uses assembler in his reasonably well-known (some would call them the bible of CS) books.
I have a great deal of respect for Knuth and have implemented a number of algorithms from his texts. Particularly in the case of high-precision arithmetic, it was necessary from time to time to actually read his horrible assembly language. The end result was good and tight; however, I can say with confidence that his exclusive use of assembler for detailing algorithms is a blemish that weakens his monumental work. There are some cases where only a description of the algorithm in machine instructions will do,
Re: (Score:3)
Re: (Score:3)
Seriously, TeX is a great example of how being a great computer scientist doesn't make you a good programmer. From the concept ("instead of a typesetting program, I'll create a programming language which typesets as a side effect") to the implementation of that concept, TeX is evidence that they had some good weed in the South Bay in the 70s (BibTex, on the other hand, is a tragic illustration of the effects of the crack epidemic in the 1980s).
Re: (Score:3)
Sure, they may be "good" inside their specific niche, but they have no versatility in their problem solving.
Actually it sounds like you are the one coming from a specific niche -- probably some kind of embedded development. It is exactly the versatility in problem solving that is, in my view, not tied to any kind of processor architectures or anything of the sort -- it comes from exposure to both problems (in their abstract form, as the same stuff shows up in various real-life incarnations) and various kinds of solution styles. As my argument goes, assembly just simply doesn't provide the latter -- you still need
Re: (Score:3)
C++ and parallelization don't seem to be mutually exclusive. It's fairly easy to write a C++ wrapper around pthreads, so you have a class container for all the data, one or more functions to process a single thread task, and a single function call to scatter the data, assign and supervise so many threads to the task, and gather the data afterwards.
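A minimal sketch of such a wrapper (hypothetical names, error handling omitted): the class holds the data, a static function processes one thread's slice, and a single `run()` call scatters the work, joins the threads, and gathers the partial results:

```cpp
#include <pthread.h>
#include <vector>
#include <cassert>

// Hypothetical pthread wrapper: parallel sum over a shared vector.
class ParallelSum {
public:
    ParallelSum(const std::vector<int>& data, int nthreads)
        : data_(data), partial_(nthreads, 0) {}

    long run() {
        int n = static_cast<int>(partial_.size());
        std::vector<pthread_t> tids(n);
        std::vector<Task> tasks;
        tasks.reserve(n);                       // must not reallocate after create
        for (int i = 0; i < n; ++i)
            tasks.push_back(Task{this, i, n});
        for (int i = 0; i < n; ++i)             // scatter: one worker per slice
            pthread_create(&tids[i], nullptr, &ParallelSum::entry, &tasks[i]);
        for (int i = 0; i < n; ++i)             // supervise: wait for all workers
            pthread_join(tids[i], nullptr);
        long total = 0;                         // gather: combine partial results
        for (long p : partial_) total += p;
        return total;
    }

private:
    struct Task { ParallelSum* self; int idx; int nthreads; };

    // Each thread sums a strided slice of the shared data.
    static void* entry(void* arg) {
        Task* t = static_cast<Task*>(arg);
        long sum = 0;
        for (size_t j = t->idx; j < t->self->data_.size(); j += t->nthreads)
            sum += t->self->data_[j];
        t->self->partial_[t->idx] = sum;
        return nullptr;
    }

    const std::vector<int>& data_;
    std::vector<long> partial_;
};
```

Each worker writes only its own slot of `partial_`, so no locking is needed for this particular gather step.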
Re: (Score:3)
Ya, I am not really sure I believe this article either; sure, OO is not great for parallel, but it seems a lot better than any alternative to OO I have ever heard of.
OO is great for parallel programming. The actor model of concurrency comes from object orientation! A number of OOP concepts come from Simula, which was designed for writing concurrent systems. It is a bit less good for implicit concurrency, but it's still pretty good if you have some profiling data to determine when it's sensible to insert the concurrency.
I read TFA when it was published a few days ago and it sounded like a theoretician trying to push the functional programming agenda by spreading F
Re: (Score:2)
Re: (Score:2)
We're back to the discussion about the "scientific education" vs. "trade schools" rift again?
Re: (Score:3, Insightful)
The point of a university Computer Science course is not to train students in iPhone development.
Re:What? (Score:5, Interesting)
Ah, but in the same vein... CS classes aren't supposed to train them for the "real world" in their first semesters. SERIOUSLY. Do keep in mind that CS is really a branch of theoretical mathematics (I should know... I studied CS and made the leap to Software Engineering) and in order to grasp the actual study, you need more than just "real world" training. (If you're wondering why I'm putting that in quotes: there's a substantial subset (dare I say a small majority...) of development work that simply can't use Java or "pure" C++, and a nearly as large subset where doing the task in Java or "pure" C++ causes more problems than it's worth because it requires REAL skill. Many, many of the train wrecks are caused by someone using the wrong tool because they don't understand how the tool actually works - they were trained for the "real world" by their college in a CS or Software Engineering degree and were ill prepared to make the right decisions.) Quite simply, you teach fundamentals first, then branch off into functional and OO programming after the fact, so students understand all the tradeoffs and can actually be theoretical mathematicians or software engineers at their discretion.
Re: (Score:3, Insightful)
Re: (Score:3)
Geez, I don't know.
Most kids go to school for four years. Maybe they'll learn OOP in one of the other three?
Re: (Score:2)
So, because last month a car manufacturer sold more cars with an automatic transmission we no longer need to teach people to drive with a stick shift?
Re: (Score:2)
> Can someone name me some actual real world, large
> software projects based on functional programming?
Ericsson's stuff scales to billions of cell phones and is written in Erlang.
Re: (Score:3)
Major parts of the world's telecom networks: for example, the software in any of Ericsson's equipment made in the last 15 years is written in Erlang. Nokia also uses a fair chunk of Erlang, IIRC.
Software upgrades for some equipment used by the Swedish Defense Force, to accommodate the Network-Centric Warfare model, used functional programming too.
I've used Erlang myself for semi-embedded networks that need to work in parallel.
Re: (Score:3)
How about the Ericsson AXD301 [ericsson.com] ATM switching system, with over a million lines of Erlang?
Or...how about RabbitMQ [rabbitmq.com]?
Or...how about Facebook's Chat backend [facebook.com]?
There's quite a bit more than you think and the three I referred to were using Erlang- there's loads more with some of the other functional languages.
Re: (Score:2)
Clojure is the typical unreadable Lisp hackjob. Scala, however, is pretty enterprise-ready if you ask me; it is just one step up from a complexity point of view, and you need some time to grasp all the additional stuff. All that stands between Scala and the enterprise is their refactor-at-full-force mentality, which means the APIs are not entirely stable yet.
Btw. someone mentioned OOP will go away in favor of functional programming. This is a wet dream of the functional guys, not even parallelism will do that instead
Law of Demeter the problem? (Score:3)
Demeter, they tell me, was some manner of software project, and the Law of Demeter is a style of OO programming that is supposed to have come out of the experience on that project. In the strict sense, you are never supposed to invoke methods on objects embedded inside other objects. Instead, the containing object is supposed to have a method that in turn invokes the method on the embedded object.
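A hypothetical sketch of the rule (names invented for illustration): rather than letting callers reach through a Customer into its Wallet with something like `customer.wallet().deduct(...)`, the Customer exposes a forwarding method and the Wallet stays encapsulated:

```cpp
#include <cassert>

class Wallet {
    int cents_;
public:
    explicit Wallet(int cents) : cents_(cents) {}
    bool deduct(int amount) {
        if (amount > cents_) return false;
        cents_ -= amount;
        return true;
    }
    int balance() const { return cents_; }
};

class Customer {
    Wallet wallet_;   // embedded object: callers never touch it directly
public:
    explicit Customer(int cents) : wallet_(cents) {}
    // Demeter-style forwarding: the containing object invokes the method
    // on the embedded object on the caller's behalf.
    bool pay(int amount) { return wallet_.deduct(amount); }
    int balance() const { return wallet_.balance(); }
};
```

The cost, as critics note, is a proliferation of thin forwarding methods; the benefit is that Customer can later change how payment works (multiple wallets, credit) without breaking any caller.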
Re: (Score:3)
Ironically, I find it vastly easier to encapsulate my mechanisms for parallelization in objects :).
Re: (Score:3)
These single-focus functional/imperative/O-O programming classes are teaching tools (techniques), not tool selection.
- that's where it all goes wrong. Colleges teach people about specific tools rather than helping them adequately approach a problem and come up with a solution that is generic enough, mathematically, that it could be expressed in any language. The tools they will learn on the job.
Of course, if that's a technical school, then there is more merit in that sort of approach; still, I'd rather have somebody who can think his way out of a box than somebody who has a hammer.