CMU Eliminates Object Oriented Programming For Freshman 755

fatherjoecode writes "According to this blog post from professor Robert Harper, the Carnegie Mellon University Computer Science department is removing the required study of O-O from the Freshman curriculum: 'Object-oriented programming is eliminated entirely from the introductory curriculum, because it is both anti-modular and anti-parallel by its very nature, and hence unsuitable for a modern CS curriculum.' It goes on to say that 'a proposed new course on object-oriented design methodology will be offered at the sophomore level for those students who wish to study this topic.'"
This discussion has been archived. No new comments can be posted.

  • by mjwalshe ( 1680392 ) on Saturday March 26, 2011 @07:15AM (#35621146)
    I always thought the obsession with making everything OO when it doesn't suit every type of programming problem was a bad thing - glad to see someone agrees with me.
    • I've found the same problem with SQL and Prolog (oh, that was object Prolog, scrub that)

    • Longtime Slashdot readers know and either love or hate user "Tablizer" [slashdot.org] .

      He has a website [geocities.com] detailing his objections to object-oriented programming, while arguing for "table-oriented programming". It has been a fruitful source of flamewars over the years.

      So, is this a vindication for Tablizer? Tablizer, what say you?

  • by DNS-and-BIND ( 461968 ) on Saturday March 26, 2011 @07:22AM (#35621170) Homepage
    Why are computer scientists even learning programming? When did this happen? Programming sounds like one of those get-your-hands-dirty jobs in flyover territory, where you would show a lot of ass crack on the job and live in a trailer park. Educated people don't do that.
    • Not sure whether you're being sarcastic or not, assuming you're not...

      Programming lets you put your mind in a certain 'mindset' which can help you analyse and solve problems, even if you don't actually get your hands dirty in the end.

    • Chem majors do work in chem labs.
      Physics majors do work in physics labs.

      Why shouldn't CS students do lab work?

    • by WrongSizeGlass ( 838941 ) on Saturday March 26, 2011 @07:46AM (#35621302)

      Why are computer scientists even learning programming? When did this happen? Programming sounds like one of those get-your-hands-dirty jobs in flyover territory, where you would show a lot of ass crack on the job and live in a trailer park. Educated people don't do that.

      They need to be able to program for the same reasons management and engineers need to spend some time on the assembly line: so they can learn how things actually work. There's often a wide chasm between "on paper" and "in practice" and ideas need to be able to traverse it.

    • Goodness forbid an "educated" person get their hands dirty. It is useful to put theory into practice through direct effort, and it demonstrates a proficiency with the practical tools and results that can only lead to better theory.

  • Why remove it? (Score:4, Insightful)

    by Haedrian ( 1676506 ) on Saturday March 26, 2011 @07:25AM (#35621182)

    I don't see why it should be removed; it should be 'complemented' instead with other programming methodologies in order to let students compare and contrast. But most CS work will end up being done with OOP in the end, so there's no reason not to start with it in the beginning - at least that's my personal experience.

    In my first year we had a mixture of different programming styles, including functional programming. I never really used any of those; I'm sure there are certain places where Prolog or Haskell is used, but it's not as common as OOP.

    • by account_deleted ( 4530225 ) on Saturday March 26, 2011 @07:36AM (#35621258)
      Comment removed based on user account deletion
    • Well, I would not call OO itself directly anti-parallel; the statefulness of objects is. The Actors/Messages model, which seems to be becoming more popular despite being functional, actually blends well into OO. After all, some OO systems are basically very close to that: you have objects and messages for interoperation. The main difference from the Actors/Messages model is that OO leaves it up to the implementor whether the message sender/receiver is stateful or not.

      Outside of that, I personally think an

  • Interesting move (Score:5, Interesting)

    by bradley13 ( 1118935 ) on Saturday March 26, 2011 @07:25AM (#35621186) Homepage

    OO is practical for lots of problems, because it makes modelling real-world data easy. However, it is not useful if you want to give students a solid understanding of the theoretical computer science. OO is fundamentally data-centric, which gets in the way of algorithmic analysis.

    To give a pure view of programming, it would make sense to teach pure functional and pure logic programming. If CMU really wanted to concentrate on the theory, they would have eliminated imperative programming from the introductory semesters, because it is very difficult to model mathematically. Apparently that was too big of a step.

    • by janoc ( 699997 )
      If you read the article, they kept functional programming alongside the imperative one, with a focus on proving the validity of programs. So part of that is there.

      On the other hand, you must balance theory with practice, because otherwise the students will a) leave, or b) not be able to do practical projects while studying the theory. So teaching only logic programming (which is great, IMO - it helped me a lot!) is not practical.

    • Functional and logic programming get in the way of some aspects of algorithmic analysis too ... the hardware is imperative after all.

    • Re:Interesting move (Score:5, Interesting)

      by digitig ( 1056110 ) on Saturday March 26, 2011 @08:01AM (#35621420)

      OO is fundamentally data-centric

      Maybe the way that you do it. Personally I find that quite a lot of my classes have methods.

      If CMU really wanted to concentrate on the theory, they would have eliminated imperative programming from the introductory semesters, because it is very difficult to model mathematically.

      Imperative programming isn't really any more difficult to model mathematically than functional programming. They just use different branches of mathematics. Check out David Gries's The Science of Computer Programming for example, which shows how to do it, and Object-Z which actually does it. The main difficulty is with side effects, but functional programming has the same issues as soon as you try to interact with the external world.

      • Sure your classes have methods. But good OO design says that your classes correspond to groups of real-world objects: bank accounts, people, widgets, whatever. Why? Because that is the data that needs to be processed. The methods are tied to the classes, meaning that their very definitions depend on the data modelling you have done.

        Working this way is suitable when you are dealing with complex data and comparatively simple algorithms; get the data model right, and your problem is half solved. Moreover, basi

        • by digitig ( 1056110 ) on Saturday March 26, 2011 @10:20AM (#35622268)

          Sure your classes have methods. But good OO design says that your classes correspond to groups of real-world objects: bank accounts, people, widgets, whatever. Why? Because that is the data that needs to be processed.

          That seems to be a fundamental misunderstanding of OO. The focus on real-world objects is because they have self-contained properties and behaviour. "Because that is the data that needs to be processed" is no more (and no less) the case than "because that is the behaviour that needs to be produced".

          The methods are tied to the classes, meaning that their very definitions depend on the data modelling you have done.

          Or (and just as valid), the members are tied to the classes, meaning that their very definitions depend on the use cases (ie, actions) you have identified.

          Working this way is suitable when you are dealing with complex data and comparatively simple algorithms; get the data model right, and your problem is half solved.

          You do realise that data model != object model, don't you?

          The other situation is where you have complex algorithms, but simple data. Signal processing, control software, scientific programming, etc.. Take a radar as an example: the data is just a continuous stream of numbers coming in off the radar dish; but the algorithms that extract meaning from those numbers are very sophisticated. Using an OO language for problems like this is like trying to loosen a screw using a wrench - maybe you can make it work, but it's basically a really bad fit for the problem.

          Radar data was a bad example. A modern radar processing system takes data from many sources such as primary radar heads, secondary radar heads, ADS-B, ADS-C, WAM, neighbouring radar processing systems and so on. These behave the same in most ways but differently in some, and OO turns out to be an excellent way of dealing with that.

          Yes, in some applications OO doesn't offer you anything significant. They are typically the cases in which you end up effectively with one class because the data and the processing are so simple. But in this context "simple" processing doesn't mean mathematically or computationally simple, it means that you only do one thing with the data (in the case of the radar front-end, for instance, the one thing will be extracting distinct decluttered returns into distinct objects for passing to the next processing stage). OO design still doesn't get in the way, though: if that's the only thing going on in that process you simply end up with a design with only one class having only one significant method (so if your language supports it you might just as well write the lot as a single routine with no class packaging).

          For what it's worth, I have no problem with OO not being taught from the outset on a computer science course. It's a decompositional/compositional approach and is not a lot of use until you understand what you are decomposing and composing.

        • Re:Interesting move (Score:5, Interesting)

          by khallow ( 566160 ) on Saturday March 26, 2011 @10:53AM (#35622524)

          The other situation is where you have complex algorithms, but simple data. Signal processing, control software, scientific programming, etc.. Take a radar as an example: the data is just a continuous stream of numbers coming in off the radar dish; but the algorithms that extract meaning from those numbers are very sophisticated. Using an OO language for problems like this is like trying to loosen a screw using a wrench - maybe you can make it work, but it's basically a really bad fit for the problem.

          I nodded at first when I read this; then I realized that the complex algorithms are objects in themselves. And the "simple data" usually has structure and meaning to it. I have some experience trying OO approaches to scientific computing. A naive approach can result in some awful problems that add little to the fundamental work.

          My personal experience has been with writing matrix algorithms. The matrices in question were "large" and usually had nice internal structure (for example, a class hierarchy might include rectangular matrices, upper/lower triangular matrices, diagonal matrices, constant matrices, constant diagonal matrices and the zero matrix). They are "simple" data sets, but a little insight into their structure can result in significant savings in data stored and in the cost of common matrix operations (like matrix multiplication).

          For me, I ran into some deep problems when I attempted to operate on multiple matrices with structure. For example, suppose in addition to the list mentioned above, I create a new class type for block diagonal matrices. There were already six matrix classes. So if I want to do matrix multiplication, I have potentially 13 new pairings of classes (left or right multiplication by block diagonal matrix on the six existing classes plus multiplying two block diagonal matrices together).

          For my language choice, C++ (and later Java), I couldn't just implement matrix multiplication as a method. I know there's some stuff with better OO implementation out there, but that's what I was using. So I ended up representing matrix multiplication as its own object, and registering specialized multiplication algorithms in a poset structure. When two matrices need to be multiplied, the matrix multiplication class looks up all the specific algorithms that would work and picks one with the best heuristics.
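          The registration scheme described above can be sketched roughly as follows - a hypothetical Python illustration, not the poster's actual C++/Java code, with all class and function names invented. Specialized routines are registered per pair of matrix classes, and a lookup walks the class hierarchies to pick the most specific pairing available:

```python
# Hypothetical sketch of pairwise dispatch for matrix multiplication.
# All names are invented for illustration.

class Matrix: pass
class DiagonalMatrix(Matrix): pass
class BlockDiagonalMatrix(Matrix): pass

_mul_registry = {}

def register_mul(left_cls, right_cls, fn):
    """Register a specialized algorithm for one pair of matrix classes."""
    _mul_registry[(left_cls, right_cls)] = fn

def multiply(a, b):
    # Walk both class hierarchies (most specific first) to find the
    # best registered pairing, falling back to (Matrix, Matrix).
    for lc in type(a).__mro__:
        for rc in type(b).__mro__:
            fn = _mul_registry.get((lc, rc))
            if fn is not None:
                return fn(a, b)
    raise TypeError("no multiplication registered for this pair")

register_mul(Matrix, Matrix, lambda a, b: "generic multiply")
register_mul(DiagonalMatrix, DiagonalMatrix, lambda a, b: "O(n) diagonal multiply")

fast = multiply(DiagonalMatrix(), DiagonalMatrix())  # specialized routine
slow = multiply(BlockDiagonalMatrix(), Matrix())     # falls back to generic
```

          Adding a new matrix class then means registering only the pairings that have genuinely faster algorithms; every other combination falls back to the generic routine.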

          Whether the complexity of that overhead is justified, well, eh... never did finish it. All I can say is that what I implemented allowed me to throw stuff together without thinking too much about the internal structure or taking a huge hit on computation (given what I was doing, PDE difference method modeling and least squares stuff on large piles of data).

          But this example steers my intuition here. You may have several competing complex algorithms. You may end up changing the structure of the simple data you operate with. The places where your scheme can change are places which you can and maybe should objectify. Also you have processes which you can abstract through objects, for example, if you are piping data through a DSP system, then a stream structure with filters applied to the stream in sequence might be a suitable abstraction for what you're doing.
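          The stream-with-filters abstraction mentioned at the end can be sketched as a chain of generators - a minimal, hypothetical Python illustration (the function names are invented, not from any real DSP library):

```python
# Each filter consumes a stream and yields a transformed stream;
# filters compose in sequence, like stages in a DSP chain.

def source(samples):
    yield from samples

def remove_dc(stream, offset):
    # Subtract a constant offset from every sample.
    for x in stream:
        yield x - offset

def clamp(stream, lo, hi):
    # Limit every sample to the range [lo, hi].
    for x in stream:
        yield max(lo, min(hi, x))

pipeline = clamp(remove_dc(source([5, 12, 9, 30]), 10), -5, 5)
result = list(pipeline)   # [-5, 2, -1, 5]
```

          Because each stage only assumes "something iterable comes in, something iterable goes out", stages can be reordered, swapped, or added without touching the others.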

    • by meburke ( 736645 )

      Most of the imperative languages I have used in the last 45 years have been easy to model, but not easy to prove mathematically at the same level of logic. They invariably needed to be proved in some customized higher-order language, and many times that was LISP or Prolog.

      Object-Oriented programming has the drawback of distributing data all over the system, and it tends to obscure the underlying structure of the problem space and the matching solutions. OOP is convenient for plugging completed subsystems in

  • by shoppa ( 464619 ) on Saturday March 26, 2011 @07:26AM (#35621188)
    Focusing on the basics, and not on the tools of the trade, is very important at a school that is not a "trade school", and CMU's computer science department certainly lives above the trade school level. (Just to contrast: when I was a freshman, the "trade school" argument was whether new students should be taught Fortran or Pascal! Thank heaven I didn't devote my career to being a programmer.)

    It seems to me that CMU's made the very obvious decision that today, OO is a tool for craftsmen, not for freshman computer scientists. And they probably are right. It's important not to confuse the tools of the trade with the basics of the science, and this is especially true at the freshman level. For a good while (going back decades) OO was close enough to the leading edge that its very existence was an academic and research subject, but that hardly seems necessary today.

    In the electrical engineering realm, the analogy is deciding that they're gonna teach electronics to freshmen, and not teach them whatever the latest-and-greatest-VLSI-design-software tool is. And that's a fine decision too. I saw a lot of formerly good EE programs in the 80's and 90's become totally dominated and trashed by whatever the latest VLSI toolset was.
    • CMU decided that the future is focusing on parallel. Because of the direction of hardware, I think they are on the right track.

    • by Chemisor ( 97276 )

      Ok, so let's make a computer science degree exclusively about "computer science" as opposed to "computer programming". Then we might as well dispense with the CS degree requirement for the vast majority of programming positions. Then we should realize that the market for "computer scientists", the ones that design pure math algorithms and do scientific studies of computer-related systems, is extremely small and already overfilled. Then everyone will realize that going into "computer science" as opposed to a

    • by obarel ( 670863 )

      We teach music *theory*, not how to play an instrument. Nothing to do with us.

      Computers are tools. You program them to get them to do something useful. They're not an abstraction, they're real. Computer science is about the theory behind computers. It's not about some "higher dimension" of thought that is unrelated to programming machines, it's all about what you can achieve by programming those complex machines.

      Otherwise, why study functional or logical programming? That's "dirty" as well, isn't it? Why ha

    • by hey! ( 33014 ) on Saturday March 26, 2011 @08:44AM (#35621652) Homepage Journal

      Well, I'd agree with you if this were a class in an O-O language, but it's a class in a major programming *paradigm*. And even theoreticians have to express themselves in code. If the summary is to be believed, the department is taking a stand against O-O programming based on what it sees as the paradigm's shortcomings. Let me tell you, thirty years ago you'd hear the same kinds of arguments in a computer science department about how bad virtual machines were. Virtual machines were an early, failed attempt at getting around the limitations of early compiler technology. C showed you could have a language which was nearly as fast as assembler (faster for most programmers), and easily portable across architectures. There was no reason to study virtual machines; they had no practical application. Well, it probably would have been a good idea to have them in the curriculum somewhere to study on a theoretical basis, because ideas in computer science have a way of re-emerging.

      Computer science as a discipline embraces topics other than the theory of computability and complexity theory, which might as well be taught in the mathematics department. For example there is computer language design, hardware architecture, data communication theory, database theory, AI etc. Some of these could be put into other departments, I suppose, but an understanding of the core intellectual discipline *is* widely applicable to all the topics that commonly fall under the "computer science" rubric.

      I think an empirical case for separating computer science from software engineering might have sounded convincing fifteen years ago, but Google, and the Internet in general, are strong disproofs of this. Google isn't a search company. It's an algorithm company. It's about doing things on a scale so massive it requires serious computer science ability, well above the "tradesman" level of expertise. Which is not to say that what it does is as "pure" as the work of an academic interested in computability per se and in defining new kinds of complexity classes, but that kind of work, isolated from its applications, is of such narrow interest it hardly justifies the existence of a department separate from the Mathematics department.

      I do see the need for separate concentration areas or departments for computer science per se, software engineering and information science, although having employed a number of people with Masters' degrees in information science I have grave doubts about the academic quality of degree programs in that field. An academic degree should give you the intellectual tools to think about a problem, not necessarily the practical tools to work on it, which can be picked up other ways. The O-O paradigm fits into this framework. The requirement to learn a specific O-O language like C++ or Java is a different matter; somebody with a degree in C.S. should be able to pick any such language up because he should understand the *design* of the language.

      I suspect we're seeing departmental politics here.

      • Virtual machines were an early, failed attempt at getting around the limitations of early compiler technology.

        Say what? Virtual machines were an early, successful attempt at allowing multiple users to use the same machine at the same time for different purposes.

  • A larger problem (Score:5, Insightful)

    by wootest ( 694923 ) on Saturday March 26, 2011 @07:26AM (#35621190)

    The big problem with OO courses has always been that there's far more focus on the mechanism (and, lately, on design patterns) than on actually structuring your program to take advantage of objects. The word "polymorphism" isn't as important as getting across the concept that you can coordinate a few things as if they were the same thing, and have them act in different ways. Knowing a bunch of design patterns by name isn't as important as figuring out once and for all that you can structure your program in such a way that you can create interfaces (not necessarily actual interfaces/protocols) between different parts of your program and essentially replace them, or at least rework them entirely, without the whole house of cards collapsing.

    There's no focus on making it click. There's no course that teaches you explicitly just about formulating problems in code, and that makes me sad.
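    The "click" described above - coordinating different things as if they were the same thing, behind a replaceable interface - fits in a few lines. This is a hypothetical Python sketch with invented names, not from any course material:

```python
# Two unrelated backends that happen to answer the same question.
# The report code only assumes a .fetch() method, so either can be
# swapped in without touching it.

class DbBackend:
    def fetch(self):
        return ["row from database"]

class FileBackend:
    def fetch(self):
        return ["line from file"]

def build_report(backend):
    # Treats every backend as "the same thing"; each acts its own way.
    return [item.upper() for item in backend.fetch()]

db_report = build_report(DbBackend())      # ['ROW FROM DATABASE']
file_report = build_report(FileBackend())  # ['LINE FROM FILE']
```

    The point is not the class keyword; it's that `build_report` never has to change when a new backend appears.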

  • by vmfedor ( 586158 ) on Saturday March 26, 2011 @07:27AM (#35621204)
    Perhaps I'm misunderstanding the post... it sounds to me like OO techniques are only going to be taught in elective courses from now on. If that's the case, I think CMU is missing the fact that the majority of development work in the "real world" is done on already-existing platforms. Parallel/cloud computing and modular design may be the "next big thing", but what happens when the student gets their first job working with an application built with Java or .NET? Maybe in their ivory tower they can say "OO is dead" but in the real world, OO is very real.
    • Re:Really? (Score:5, Funny)

      by Fnord666 ( 889225 ) on Saturday March 26, 2011 @07:43AM (#35621288) Journal

      Perhaps I'm misunderstanding the post... it sounds to me like OO techniques are only going to be taught in elective courses from now on. If that's the case, I think CMU is missing the fact that the majority of development work in the "real world" is done on already-existing platforms. Parallel/cloud computing and modular design may be the "next big thing", but what happens when the student gets their first job working with an application built with Java or .NET? Maybe in their ivory tower they can say "OO is dead" but in the real world, OO is very real.

      This is a CS program we are talking about. Much like economics, in these disciplines the real world is often considered a special case.

    • When I went to school, graphics was an elective.

      I didn't take graphics, so I didn't get a job where I needed to write a ray tracer or whatever.

      See how that works? Not so hard.

  • Anti-Modular? (Score:5, Interesting)

    by msobkow ( 48369 ) on Saturday March 26, 2011 @07:30AM (#35621216) Homepage Journal

    Apparently the meaning of "Modular" has changed since I was in University back in '82. OO used to be the epitome of modularity.

    But I do agree that making it an introductory first-level course does warp the mind of the young programmer. There are a lot of languages that don't enable OO programming at all (e.g. Erlang), which become much more difficult for those programmers to grasp because OO is so ingrained in their thinking.

    I can't think of anything specific about OO that makes it poorly suited to parallel programming. There are languages whose nature is parallelism (again, Erlang), but that's usually accomplished by adding parallelism operators and messaging operators to a relatively "traditional" language. I don't see why you couldn't add and implement those constructs in a non-parallel language.

    I also shudder to think how a CS student is going to deal with parallelism using languages that don't make it a natural extension if they're learning to rely on those extensions in their first year.

    I gotta tell you, though, I really object to the use of Java as an introductory language for programming. Java is far from a shining example of any particular style of programming. It's not real OO because it only has single inheritance. It's not designed for parallelism. It doesn't have messaging built in. In short, Java is actually a pretty archaic and restricted language.

    • I also shudder to think how a CS student is going to deal with parallelism using languages that don't make it a natural extension if they're learning to rely on those extensions in their first year.

      I don't shudder. They should learn why some patterns lend themselves better to parallelism than others, instead of just learning some high-level tools baked into the language.
      For instance, if you give a student Erlang or Scala with a dedicated actor model, I personally think they will never grasp why this high-level construct works better as a pattern than, let's say, a critical-region/semaphore-based model.
      The students need a deeper understanding than applying a few patterns. The entire segment of parallelism probably s
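      The contrast the poster is driving at can be sketched in a few lines of Python - a hypothetical illustration of message passing (an actor that owns its state) versus a shared-state critical region; all names are invented:

```python
import threading, queue

def counter_actor(inbox, results):
    count = 0                     # state private to the actor
    while True:
        msg = inbox.get()
        if msg == "stop":
            results.put(count)    # reply by message, too
            return
        count += msg              # only this thread ever touches count

# Actor style: no lock, because nothing is shared.
inbox, results = queue.Queue(), queue.Queue()
t = threading.Thread(target=counter_actor, args=(inbox, results))
t.start()
for _ in range(1000):
    inbox.put(1)                  # communicate by message, not shared memory
inbox.put("stop")
t.join()
actor_total = results.get()       # 1000

# Critical-region style: state is shared, so every access must be guarded.
shared_count = 0
lock = threading.Lock()
def add_one():
    global shared_count
    with lock:                    # forget this and the count can be wrong
        shared_count += 1
workers = [threading.Thread(target=add_one) for _ in range(100)]
for w in workers: w.start()
for w in workers: w.join()
# shared_count is now 100
```

      The actor version is correct by construction; the semaphore version is correct only as long as every writer remembers the lock - which is exactly the insight the grandparent wants students to arrive at.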

    • Re:Anti-Modular? (Score:5, Interesting)

      by jameson ( 54982 ) on Saturday March 26, 2011 @08:27AM (#35621566) Homepage

      Yes, there has been much progress in module systems. The same Bob Harper mentioned there was involved in the design of the SML module system, which is often cited in the CS literature as `the' reference module system. SML achieves a level of isolation that is simply impossible to achieve in a language like Java or Smalltalk. In Java, you can still call `hashCode()' and `toString()' etc. on any object, and it's often possible for someone to subclass one of your internal classes and thereby break your intended module structure.

      In SML, you can confine types through (e.g.) the following signature:

      (* this is imperative code; a typical SML program focuses on functional code, but it's good enough for illustration purposes. *)
      signature STACK =
      sig
          type 'a stack (* 'a is a type parameter, so "'a stack" is what OO-land calls a `generic type' *)
          val empty : unit -> 'a stack (* create a fresh stack *)
          val push : 'a stack -> 'a -> unit (* take a stack of 'a elements, take an element of type 'a, and plug them together *)
          val pop : 'a stack -> 'a option (* "'a option" means that the operation may fail *)
      end

      You can then implement this stack in various structures that match this signature, and confine it in such a way that only the operations listed above are available. That is, you can't stringify stacks (there is no such operation listed there, though you can choose to add one), you can't compare them (again, no such operation), you can't `reflect' on them and you can't access their `protected' functionality by subclassing them, unless the stack implementer put a separate view of the structure into place for that particular purpose.

      Why is this good? Client code won't end up using features that you didn't want to expose. Why is it bad? If you forgot to expose something important, someone else will have to re-implement it. But that's your fault, then, not the language's fault.

      SML also allows you to build modules from other modules through something called `functors', but let's leave that for another time.

      Now, SML's modules have issues: you can't have mutual dependencies between them (which has its advantages, too), and the question of how to integrate type classes (something you may know as `C++ concepts') is unresolved. But the concepts behind the module system are clear and powerful. So if you want to teach the concepts underlying modular software design, this is a vastly better choice than most other options out there. (I remember the Modula-3 module system being fairly good, too, but not quite at this level.)

      So, for teaching purposes I'd say Bob Harper is closely connected to the best system out there that has actual working implementations tied to it.

  • OOP in freshman year (Score:5, Interesting)

    by janoc ( 699997 ) on Saturday March 26, 2011 @07:34AM (#35621246)
    From the position of someone who used to teach basic programming courses to freshmen, I can only applaud the decision.

    Many kids coming to colleges these days do not have any programming experience, or a very shaky one at best. Picking up concepts like classes, inheritance, and the entire idea behind OO modelling is difficult if you are lacking basics such as how memory is managed, what a pointer is, how to make your program properly modular, etc. From the course description they are going to use a subset of C; I think that is a good starting basis for transitioning to something else (C/C++/C#/Java/...) later on.

    What is worse, many of these introductory courses were given in Java - producing students who were completely lost when the black box of the Java runtime and libraries was taken away, e.g. when having to transition to C/C++. We are talking about engineering students here, who could be expected to work on embedded systems later on or perhaps do some high-performance work. Even Java and C# still need C/C++ skills for interfacing the runtime with the external environment.

    I think it is a good move, indeed.

    • by DCheesi ( 150068 ) on Saturday March 26, 2011 @08:06AM (#35621446) Homepage

      I agree that OO in the 101 course is a little much. You should really be focusing on simple programming techniques that a non-major might encounter when, say, writing a batch script or macro. I'm not sure about the second semester courses, though, since those are more for potential majors. Certainly at some point a CS major needs to be exposed to OO, but I don't think it needs to come first.

      As for understanding the infrastructure, I do think C/C++ get you closer, but in my experience it doesn't really click until you take some kind of computer architecture course or similar. For instance, I didn't *really* understand pointers until I understood how values and addresses are stored in memory.

    • Re: (Score:3, Interesting)

      by stefski66 ( 1643585 )

      What is worse, many of these introductory courses were given in Java - producing students who were completely lost when the black box of the Java runtime and libraries was taken away - e.g. when having to transition to C/C++.

      What is worse, many of these introductory courses were given in C/C++ - producing students who were completely lost when the black box of the C/C++ runtime and libraries was taken away (stl, libstdc++, libc, stdlib/malloc) - e.g. when having to transition to Assembly.

      What is worse, many of these introductory courses were given in Assembly - producing students who were completely lost when the black box of the Assembly runtime and libraries was taken away (OS virtual mem) - e.g. when having to transition to

  • Thank goodness a university has finally decided to teach a curriculum based on what its professors like, instead of adhering to silly concerns about what might be useful in the real world. Students can rest assured that they'll get a first class CS education, and--sorry, what was that? Jobs? You want to get a job? What the fuck do you think this is, DeVry?

    Now go finish your LISP homework!

    • Speaking of which, I recently read a message from someone who was praising Lisp as the perfect choice for expressing algorithms. Talk about boneheaded. This guy probably never worked with languages like Pascal, which really put a strong emphasis on clean syntax for expressing algorithms; instead he has been drowning his brain in parentheses.

  • The decision is sound: OOP mostly leads to bloated software and to mind-bending, partly just plain stupid object-oriented constructions, and it strongly encourages mutation. In a sense it is even based on the idea of mutating objects. Because of the dramatic increase in concurrent programming, in the future strictly functional programming will be much more important than OOP. I understand that OO enthusiasts will not like to hear this, and of course OOP also has its good sides, but it is highly overrated. A good mo
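    The mutation point can be sketched in a few lines of C (a toy example, nothing from the CMU curriculum): the in-place version updates shared state and would race if two threads called it concurrently, while the pure version shares nothing and is safe to evaluate from anywhere.

    ```c
    #include <stdio.h>

    /* Mutating style: updates shared state in place; concurrent
     * callers would race on this without locking. */
    static int balance = 100;
    void deposit_mut(int amount) { balance += amount; }

    /* Functional style: no mutation, the result is a new value;
     * nothing is shared, so concurrency is a non-issue. */
    int deposit_pure(int bal, int amount) { return bal + amount; }

    int main(void) {
        deposit_mut(25);
        printf("mutated balance: %d\n", balance);   /* prints 125 */
        int b = deposit_pure(100, 25);
        printf("pure result: %d\n", b);             /* prints 125 */
        return 0;
    }
    ```

    Same arithmetic, but only the second form lets the compiler and the programmer reason locally.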

  • by v(*_*)vvvv ( 233078 ) on Saturday March 26, 2011 @07:51AM (#35621346)

    Programming is about utilizing paradigms, not being stuck in one. OO is just another layer on top of anything else: a set of rules that one can follow in any language. Some languages have it built in for convenience, but that built-in support becomes an inconvenience where the paradigm is not a good fit.

    There was definitely a moment when OO seemed to be some new paradigm in programming, but no, it is merely a tool, and CMU put it in its rightful place.
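    As a rough illustration of "OO is a discipline, not a language feature," here is a minimal sketch in plain C (a toy of my own, not anything CMU teaches): a struct bundling state with behavior through a function pointer, which is essentially what early C++ compiled down to.

    ```c
    #include <stdio.h>

    /* An "object" without an OO language: state plus behavior,
     * with the receiver passed explicitly as `self`. */
    typedef struct Counter {
        int count;
        void (*increment)(struct Counter *self);
    } Counter;

    static void counter_increment(Counter *self) {
        self->count++;
    }

    Counter counter_new(void) {
        Counter c = { 0, counter_increment };
        return c;
    }

    int main(void) {
        Counter c = counter_new();
        c.increment(&c);   /* a "method call" with explicit self */
        c.increment(&c);
        printf("count = %d\n", c.count);  /* prints "count = 2" */
        return 0;
    }
    ```

    The language gives you none of the syntax, but the encapsulation discipline is all there — which is the point: you can follow the rules anywhere.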

  • by davidbrit2 ( 775091 ) on Saturday March 26, 2011 @08:10AM (#35621478) Homepage
    I guess they want to build a nice moat to go with their ivory tower.
  • by MoNsTeR ( 4403 ) on Saturday March 26, 2011 @08:20AM (#35621526)

    OOP should absolutely not be taught at the freshman level, because it gets in the way of understanding more basic concepts like, oh I dunno, variables, branching, looping, subroutines, I/O, etc. Their claim that OOP is "anti-modular" is of course absurd. "Anti-parallel" is probably arguable, but how that's relevant at an undergraduate level of instruction is highly questionable.

    At any rate the real problem is that colleges offer degrees in "computer science" but not "software engineering". How many of these students will finish their degrees without ever having committed to source control? Without coding to a 3rd-party API? Connecting to a database? Performing maintenance programming? Working in an honest-to-god team?

  • by segfault_0 ( 181690 ) on Saturday March 26, 2011 @09:16AM (#35621842)

    Of course you don't want to teach algorithm development on a high level language that already has all of the algorithms (that these teachers are capable of teaching) built in. What would be the point of that? And I think when they said anti-parallel and anti-modular, they meant in the attitudes it instills in students.

    --- start off topic rant ----

    For the record, these degree programs provide you with skills tuned for different jobs. The CompSci folks are algorithm based - down in the details making things fast and efficient. The SoftEng folks are architecture/technology based - up at the top making things that WORK and that people can actually use; you know - products. They are COOPERATIVE jobs with a very large overlap.

    Any CompSci should be grateful to have talented SoftEng helping them provide a cogent, fast, and polished view of their algorithmic efforts; your work means nothing if it never gets in the hands of the users or if it's spoiled by a bad implementation.

    Any SoftEng should be pleased to have hard-working CompSci making fast algorithms to help them create their next product. You couldn't build these fancy architectures if you didn't have well-thought-out underpinnings backing you up.

    I see some piss-poor attitudes on this thread and some of you snobs and dorks are in for a rude awakening.

  • by Dcnjoe60 ( 682885 ) on Saturday March 26, 2011 @11:09AM (#35622638)

    A few weeks ago, there was a lot of discussion about how many universities want to drop the math requirements from their CS degrees. Now we have dropping OO, too. I seem to recall that CS stands for Computer Science. Should not somebody with a Computer Science degree be able to handle things like OO and math? I could see dropping these requirements if the degree were something like Computer Technician or Computer Programming (although even the last one would indicate that a graduate would be versed in multiple programming paradigms).

    Is all of this a continued dumbing down of college so that the masses can get a degree? If so, it appears more and more technical jobs will go offshore as the next generation of true computer scientists won't receive their training in this country.

    On the other hand, if the goal is to pay a university $100K to teach you how to do web pages and the like, then go for it. But if the US wants to return to being a technology leader, then universities need to focus on more than just the currently hot, marketable skills and focus instead on the whole big picture.

  • by JonathanAldrich ( 878621 ) <jonathan@aldrich.cs@cmu@edu> on Saturday March 26, 2011 @05:01PM (#35625044) Homepage
    As some readers may have guessed, "anti-modular," "anti-parallel," and "unsuitable for a modern CS curriculum" are one person's opinions, and do not represent the majority view of the CMU faculty. The introductory curriculum was changed away from Java for different reasons: primarily to focus on a language (Python) with simpler syntax and dynamic types, and to supplement with material on C that is closer to the machine. For more details, see a report by the SCS Deans:

    http://reports-archive.adm.cs.cmu.edu/anon/2010/CMU-CS-10-140.pdf [cmu.edu]

    Whatever you may think about delaying OO--and opinions are mixed at CMU as everywhere--one advantage of the new curriculum is that the sophomore-level course can do OO design more justice than we were ever able to do in the prior intro sequence, since the students already know how to program. Modularity and parallelism are in fact major emphases of that course, which I and other CMU faculty are currently developing.
