Anthropomorphism and Object Oriented Programming 303

An anonymous reader writes: We've all been warned about how anthropomorphizing animals and machines can lead us astray. But Edsger Dijkstra once cautioned (PDF) developers against thinking of their programs that way as well. "I think anthropomorphism is worst of all. I have now seen programs 'trying to do things,' 'wanting to do things,' 'believing things to be true,' 'knowing things' etc. Don't be so naive as to believe that this use of language is harmless. It invites the programmer to identify himself with the execution of the program and almost forces upon him the use of operational semantics." A new article fleshes out Dijkstra's statement, providing a good example of where an anthropomorphized analogy for Object Oriented Programming breaks down when you push it too far.
  • Strawman (Score:5, Insightful)

    by Anonymous Coward on Saturday January 03, 2015 @05:28PM (#48726995)

    Dijkstra spends time building an analogy, then explains how it's flawed, and uses that to argue against 'anthropomorphizing'.

    That's nice, and I certainly agree that the analogy can only go so far, but he was the one to build that analogy in the first place. This is not a valid argument against anthropomorphizing at all.
    I do agree with the conclusion that anthropomorphising is not a reason to call OOP better than procedural.
    ---
    "I don't know how many of you have ever met Dijkstra, but you probably know that arrogance in computer science is measured in nano-Dijkstras"

    • You seem to think Dijkstra wrote the article. He was just the one who made the first quote. The article was written by Loup Vaillant.
  • Summarizing (Score:5, Funny)

    by ofranja ( 589375 ) on Saturday January 03, 2015 @05:30PM (#48727001)
    Object oriented programming, the "crystal healing therapy" of computer science.
  • by NoNonAlphaCharsHere ( 2201864 ) on Saturday January 03, 2015 @05:35PM (#48727013)
    I used to have a procedural toaster which cooked the bread until it became toast. Then I upgraded to a much more elegant OO toaster, which simply sends a "toast yourself" message to the bread. Unfortunately, bagels don't have a self.toast() method, so I still have to have a backup procedural toaster to handle the older API.
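
    A minimal Python sketch of the two toasters in the joke, with made-up names, just to make the contrast concrete:

      # Procedural style: one routine that knows how to toast whatever it is given.
      def toast(item, minutes=2):
          print(f"Toasting {item} for {minutes} minutes")

      # OO style: the toaster just sends a "toast yourself" message.
      class Bread:
          def toast(self, minutes=2):
              print(f"Bread toasting itself for {minutes} minutes")

      class Bagel:
          pass  # no toast() method: the "older API"

      def oo_toaster(item):
          item.toast()

      toast("a bagel")      # the backup procedural toaster still works
      oo_toaster(Bread())   # fine
      # oo_toaster(Bagel()) # AttributeError: 'Bagel' object has no attribute 'toast'
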
    • Toaster DRM (Score:5, Funny)

      by tonywestonuk ( 261622 ) on Saturday January 03, 2015 @06:16PM (#48727221)
      Oh FFS. Look on the bloody bagel packet before you buy. If it doesn't say 'implements toastable' then don't buy 'em. Yeah, they may be a few bucks more, but that's your own fault for getting a toaster that is made by the same people who make the bagels.
      • by tonywestonuk ( 261622 ) on Saturday January 03, 2015 @06:25PM (#48727263)
        What you can buy is a 'Toast Decorator' - it's a Chinese import, probably not the most legal thing as they've cracked the DRM... What you do is just slip your generic, non-toastable bagels in this toasting bag, and then shove it in your toaster. It accepts the 'self.toast()' method, and does what's required to make sure your bagels are toasted to perfection every time. Result!
    • by quenda ( 644621 ) on Saturday January 03, 2015 @06:40PM (#48727343)

      Hardware analogies are fraught with peril but ...
      the object-oriented toaster never burns bread, because the slices/bagels etc set their own cook time. You don't need to upgrade the toaster every time you get a new kind of bakery product to toast.

      And best of all, you never need to empty the crumb tray, because of the built-in garbage collection.

      • By the same token, now you have to implement pie.toast() and cake.toast() and lots of other useless and irrelevant methods, even though you're never ever going to use them, simply because they extend the isBakeable() interface.
        • by kjots ( 64798 ) on Saturday January 03, 2015 @08:15PM (#48727787)

          By the same token, now you have to implement pie.toast() and cake.toast() and lots of other useless and irrelevant methods, even though you're never ever going to use them, simply because they extend the isBakeable() interface.

          Unnecessary, since the IsBakeable interface provides a default implementation of the toast() method that throws an exception if you attempt to toast something that is not toastable.

          The real issue is why the IsBakeable interface has a toast() method in the first place...
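
          In Python terms (a base class standing in for Java's default interface method; every name here is invented), the default-that-throws looks roughly like this:

            class Bakeable:
                # Default implementation: bakeable things are assumed non-toastable
                # unless they override toast().
                def toast(self):
                    raise TypeError(f"{type(self).__name__} is not toastable")

            class Pie(Bakeable):
                pass  # inherits the exception-raising default

            class Bread(Bakeable):
                def toast(self):
                    print("Toasting bread")

            Bread().toast()          # fine
            try:
                Pie().toast()        # the default kicks in
            except TypeError as e:
                print(e)             # "Pie is not toastable"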

    • Improper subclassing. bagel::toast() should be inherited from bread::toast(); you don't have to rewrite the whole thing, just let it know to toast one side.
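
      A toy version of that override (hypothetical classes):

        class Bread:
            def toast(self, sides=2):
                print(f"Toasting {sides} side(s)")

        class Bagel(Bread):
            def toast(self, sides=1):
                super().toast(sides)   # reuse the parent, just toast one side

        Bagel().toast()   # Toasting 1 side(s)
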
    • I used to have a procedural toaster which cooked the bread until it became toast. Then I upgraded to a much more elegant OO toaster, which simply sends a "toast yourself" message to the bread. Unfortunately, bagels don't have a self.toast() method, so I still have to have a backup procedural toaster to handle the older API.

      Man, does that bring back some memories... Borland used to make some truly awesome toasters, whatever happened to them?

  • Ya, Sure. (Score:5, Insightful)

    by wisnoskij ( 1206448 ) on Saturday January 03, 2015 @05:37PM (#48727029) Homepage
    Ya, sure. It is so much better to use the phrase: "The program contains a variable that stores your name", instead of: "The program knows your name". English and other natural languages were not designed to work that way. Unless you want to take a week to describe a single program, it really helps to anthropomorphise it.
    • by N1AK ( 864906 )
      Yeah, the blanket dislike for anthropomorphizing surprised me as well. The example of how discussing program behaviour/structure in human terms can mislead is useful and informative; but refusing outright to use words like 'know', 'tries', etc. when discussing programs, rather than just keeping in mind that they can be potentially misleading abstractions, seems like throwing the baby out with the bath water.
      • Re: Ya, Sure. (Score:2, Insightful)

        by Anonymous Coward

        Atheism suffers from this too... There are so many standard phrases for swearing/exclamations that invoke some deity construct that atheists are forced to continue using them, as they are too entrenched in our cultures to replace. "Damn you!" is too perfect to replace with "my anger is so vivid that I want the universe to arrange something very bad to happen to you. Err, except I don't believe the universe is sentient of course, ummm, errrr, I think I'll shut up now"

    • Unless you want to take a week to describe a single program, it really helps to anthropomorphise it.

      Especially when it's that cute little Clippy!

    • Re:Ya, Sure. (Score:4, Informative)

      by sandytaru ( 1158959 ) on Saturday January 03, 2015 @06:31PM (#48727293) Journal
      I think there is a level at which it's useful, and one at which it's not. It's fine to anthropomorphize a program when explaining to an end user why it's broken, e.g. "The program doesn't know to check for the start date of a new lease when the old one expires, it just thinks it should activate it regardless." (Actual problem we're having to fix right now.) But of course the developer shouldn't think that the program is confused; it's doing exactly what we asked of it in a nightly stored procedure, and not bothering to check start dates because it wasn't programmed to do that in the first place!
      • "The program doesn't know to check for the start date of a new lease when the old one expires, it just thinks it should activate it regardless."

        What's wrong with "The program wasn't designed to check for the start date of a new lease when the old one expires, it just activates it regardless."

        More accurate, fewer words, and no shifting of responsibility for the situation onto a "naughty program" in a manipulative subconscious effort to evade responsibility for what you built.

        • by SuperKendall ( 25149 ) on Saturday January 03, 2015 @07:27PM (#48727569)

          "The program doesn't know to check for" is in fact more accurate than "The program wasn't designed to check for". The second statement could mean that it wasn't designed to do something, but might do it anyway - what the program ACTUALLY does is left somewhat ambiguous (this is a technique lawyers will often use answering in court). In the first instance the statement makes it quite clear the program DOES NOT KNOW HOW to do what you are talking about.

          You can shorten something so far for clarity, but if you go to far you end up with less clarity.

          The "thinks" part at the end could go though, removing that does no harm to clarity.

          The thing is, writing clearly is just plain hard - being able to use anthropomorphic terms helps make it simpler to add clarity to a description at the cost of somewhat wordier sentences... kind of like how sometimes you make a program a little more verbose so that a different programmer coming across the code later can understand it.

          • Or even simpler (and closer to the truth) "We screwed up. We didn't code a check for the start date."

            No passing the blame off to some imaginary "bug".

          • It's either a feature or a bug.

            I understand what you are saying, but language that makes the computer sound like an out of control actor makes me sound like I'm not in control of my job and my dog ate my homework, so I make an effort not to use it. I think it makes me look less professional. Language that involves me saying things like "I designed it that way for these justified reasons, but we can discuss changing it", or "I'm not sure why it's responding this way, but it's my screw up and these are the

            • by SuperKendall ( 25149 ) on Saturday January 03, 2015 @07:58PM (#48727725)

              language that makes the computer sound like an out of control actor... describes most accurately what is really going on. You sent the slinky down the stairs but it veered sideways and hit a wall.

              A program is basically a state machine, a very complex one - we start it and watch what happens, but with a complex enough state machine it is basically indistinguishable from watching a video feed of a person robbing a convenience store. You can describe later on what went wrong quite well, but at the time the result is unexpected.

              To me, describing something as a "bug" leans more towards papering over flaws; just admit we build systems and almost never have a 100% understanding of what they will do in operation. That is far more truthful than saying there is *A* bug, rather than admitting the program is a series of statements that may do what we intend, but may not.

              • No, it's a system I built to fix a problem, and if it's not working precisely like it does in my mind's eye, then it's wrong because I made a mistake, and I'm OK admitting that.

                  No, it's a system I built to fix a problem, and if it's not working precisely like it does in my mind's eye, then it's wrong because I made a mistake, and I'm OK admitting that.

                  "The program has a bug" indicates the program is flawed, not you.

                  I'm more than OK admitting I made a mistake, I prefer it.

          • You can shorten something so far for clarity, but if you go to far you end up with less clarity.

            I see what you did there!!

            ... I only wish it had been intentional.

            how sometimes you make a program a little more verbose so that a different programmer coming across the code later can understand it.

            Oh, like comments? You can write insanely complicated code and as long as it produces the results you had intended, it's correct. But it helps the programmer behind you if you then also write "War and Peace" describing how it works. "It's Magic" is too short.

    • by dissy ( 172727 )

      The last time I anthropomorphized a program it got quite angry at me.

      Mrs Compiler wouldn't let me sleep in C: for a week, and even then she wouldn't let me declare unsigned variable types for the rest of the month!

    • Say, "the program contains your name." Or "the program has your name." Problem solved.
      • Or someone hearing "the program knows your name" can simply recognize its meaning instead of throwing a hissy fit.
      • by Imrik ( 148191 )

        Neither of those means quite the same thing as "the program knows your name." The first could be a reference to attribution, the second could be a reference to the program's name.

        • the second could be a reference to the program's name.

          You mean "the program has your name"? There is no way that could be a reference to the program's name.

    • by AmiMoJo ( 196126 ) *

      "The program stores your name." Simple and precise.

      I agree though, TFA is dumb. It is accurate to say that a program tries to do something when it makes some call that may fail, or when it uses a try...catch to attempt something that may fail. "Want" may be a bit of a stretch, but it's a shorthand way of describing behaviour like needing a file to be in a certain location, or trying a particular operation first before taking some alternate action.
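
      For example, this ordinary bit of Python is fairly described as "the program tries to read its config file" (the path is made up):

        # "Tries" maps onto an attempt that may fail, handled with try/except.
        try:
            with open("/etc/example.conf") as f:   # hypothetical path
                config = f.read()
        except FileNotFoundError:
            config = ""   # fall back to defaults when the attempt fails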

    • by mbone ( 558574 )

      He doesn't like assignments [loup-vaillant.fr] either. As he says, the variable "x itself doesn't change. Ever. The value it holds is just replaced by another." So he would probably want you to say "the program contains a variable whose value stores your name"

  • Like all analogies, this one works up to a point. Learning something new without analogies is probably going to be very slow. It's not like this is the downfall of programming, but it's passing-interesting as a note.
  • My code wants to be anthropomorphized!
  • Encapsulation (Score:5, Insightful)

    by mrflash818 ( 226638 ) on Saturday January 03, 2015 @05:47PM (#48727081) Homepage Journal

    For me, I prefer OO programming (c++/java) to functional programming (C-lang), just due to encapsulation. I like having an object, with methods for its attributes, public/private methods, and such. Then having the objects interact in my program. It kinda makes sense to how I think.

    However, I also think that if a team or individual programmer can release software that serves a useful purpose, and is maintainable, then those are the only things that matter. Which language, functional or object-oriented 'way' that was used to get there? Seems less important.

    • It sounds like you might be somebody who learned to program someplace other than a CS degree, or who got a CS degree and forgot some academic stuff that you haven't used in your day-to-day work.

      You've run afoul of the "functional" doesn't mean "uses function calls a lot" problem and some chest-pounders here are slamming you for that. As an EE who only had a few undergrad CS courses, I've had similar problems. Somewhere out there is a USENET thread in which I'm assuming that "side effects" are "bad side ef

  • Something that Dijkstra wrote in 1983, and a crap article that gives some examples of bad programming to show a bad point.
    • by cruff ( 171569 )
      I didn't see any anthropomorphism in the example. It just looked like a comparison of procedural and OO coding styles.
  • by Chris Mattern ( 191822 ) on Saturday January 03, 2015 @05:49PM (#48727095)

    ...they hate that.

  • Missing the point (Score:5, Insightful)

    by pushing-robot ( 1037830 ) on Saturday January 03, 2015 @05:51PM (#48727103)

    OO isn't about anthropomorphism, it's about isolation and providing a clear API. If this was a large scale project with fifty people working on code that could move students, I don't want them implementing fifty different versions of move_student that will break whenever the Student or Classroom model changes.

    I know it's trendy to hate on things that have been around a while, and OO indeed isn't the answer to everything, but it's still a useful way of keeping a complex program from getting out of control.
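
    A sketch of that isolation argument, with invented names: callers go through one method, so a change to how Student or Classroom stores its data stays behind that method.

      class Classroom:
          def __init__(self, room_number):
              self.room_number = room_number

      class Student:
          def __init__(self, name):
              self.name = name
              self.classroom = None

          def move_to(self, classroom):
              # The only place that knows how a move is recorded.
              # If the representation changes, only this method changes.
              self.classroom = classroom

      alice = Student("Alice")
      alice.move_to(Classroom(101))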

    • by phantomfive ( 622387 ) on Saturday January 03, 2015 @06:01PM (#48727145) Journal

      OO isn't about anthropomorphism, it's about isolation and providing a clear API.

      Believe it or not, isolation and 'providing a clear API' existed before OOP, so you can't say that's it. In general, object oriented programming means that the functions go with the data.
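
      A bare-bones illustration of "the functions go with the data" (names are illustrative):

        # Procedural: the data and the function that operates on it are separate.
        def area_of(rect):
            return rect["w"] * rect["h"]

        # OO: the same function travels with the data as a method.
        class Rect:
            def __init__(self, w, h):
                self.w, self.h = w, h

            def area(self):
                return self.w * self.h

        print(area_of({"w": 3, "h": 4}), Rect(3, 4).area())   # 12 12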

      • by lokedhs ( 672255 )
        Except in good object orientation systems, functions do not go with the data. That's a limitation introduced by Simula 67, which started a trend that survives to this day, with Java, Ruby and all the other single-dispatch languages.

        Far too many programmers these days seem to believe that object orientation is equivalent to the limited form available in languages like Java and C++. They would be helped by learning a language that has multiple dispatch [wikipedia.org] before commenting on what object orientation is.
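
        Python itself is single-dispatch out of the box (functools.singledispatch picks an implementation from the type of the first argument only), but a toy registry keyed on the types of both arguments shows what multiple dispatch means. This is only an illustration, not how CLOS-style systems implement it:

          # Toy multiple dispatch: choose the implementation by the types of *both* arguments.
          _registry = {}

          def multi(*types):
              def register(fn):
                  _registry[types] = fn
                  return fn
              return register

          def collide(a, b):
              return _registry[(type(a), type(b))](a, b)

          class Car: pass
          class Wall: pass

          @multi(Car, Car)
          def _(a, b): return "crash"

          @multi(Car, Wall)
          def _(a, b): return "thud"

          print(collide(Car(), Wall()))   # thud -- neither class "owns" the method
          print(collide(Car(), Car()))    # crash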

      • There's a big difference between allowing something and requiring it. I think OO was the natural evolution of concepts like interfaces and namespacing, but what sets it apart is that it insists (so far as it can) the developer encapsulate information, while procedural languages at best suggest it.

        Of course it's possible to write clear code in any paradigm, and some of the most beautiful code I've ever seen has been purely procedural, but that is unfortunately the exception. Most code I read is pretty poor,

        • I think OO was the natural evolution of concepts like interfaces and namespacing, but what sets it apart is that it insists (so far as it can) the developer encapsulate information,

          No, that's not what sets it apart. See above for the meaning of OOP: it means that the data and the functions are bundled together.

    • OO isn't about anthropomorphism

      Originally, it was actually about cellulomorphism. Programs were supposed to look like systems of communicating biological units, i.e. cells.

    • by Tom ( 822 )

      it's about isolation and providing a clear API.

      Which did not exist before the invention of OOP... oh, wait...

      I don't want them implementing fifty different versions of move_student

      If only someone had thought of a way to define shared functions. He could call it a collection, or an archive, or a library, no that's crazy...

      but it's still a useful way of keeping a complex program from getting out of control.

      Being organized and professional is a useful way of keeping a complex program under control. With the right approach, it almost doesn't matter which language or paradigm you use. And with the wrong approach - ditto.

  • He says that anyone who uses the term 'object oriented programming' is crazy [loup-vaillant.fr]. I don't think he's realized that OOP means that the functions go with the data.
  • One of the worst things about OOP is the stupid analogies used to explain it. If the people you're explaining it to can't understand it in abstract, programming terms then they're not worth wasting your time on because they'll be useless programmers anyway. But, of course, it's probably not the audience that's the problem, but the writer - who's incapable of communicating without resorting to stupid analogies.

  • by RhettR ( 632157 ) on Saturday January 03, 2015 @05:56PM (#48727127)
    Dijkstra makes an argument about what he calls object oriented programming, but doesn't really use OOP. That he happens to base his argument around two styles of coding, one that is clearly procedural, and one that happens to use objects, is merely accidental. His argument is centered around poor code organization, plain and simple. He passingly slaps down some code modeling Student as an object, neglecting to mention anything about why one would do that (e.g. encapsulation), and completely fails to even mention other OOP ideas such as composition, inheritance, polymorphism, etc. In short, he bashes horrendous code organization, and calls that OOP, without addressing a single reason typically given in favor of OOP. Frankly, that article was awful.
    • by gnasher719 ( 869701 ) on Saturday January 03, 2015 @06:15PM (#48727217)
      Just saying: it's not Dijkstra talking rubbish. Dijkstra died 12 years ago. There is a handwritten article by Dijkstra which probably made sense in 1983 when it was written. In 1983 Dijkstra didn't know about C++. Stroustrup didn't know about C++ in 1983.
      • C++ is a horrible language. It's made more horrible by the fact that a lot
        of substandard programmers use it, to the point where it's much much
        easier to generate total and utter crap with it. Quite frankly, even if
        the choice of C were to do *nothing* but keep the C++ programmers out,
        that in itself would be a huge reason to use C.

        Linus. :)

  • by anchovy_chekov ( 1935296 ) on Saturday January 03, 2015 @06:10PM (#48727187)
    I think what this article highlights is that not everything Dijkstra said was gold. Nor does slavishly following his missives make you a better programmer.
  • stupid (Score:5, Insightful)

    by Tom ( 822 ) on Saturday January 03, 2015 @06:11PM (#48727193) Homepage Journal

    One of the more stupid blog-level postings I've read. I use "blog-level" as an insult, btw., because blogs are generally a source of shallow thinking: it is just too convenient to publish some thoughts. When publishing is more trouble, you're also forced to polish them more.

    Firstly, to understand the difference between trying to do and "trying to do", read some Dennett. If correctly understood, anthropomorphisms like the attribution of intention to a non-intentional entity can be extremely helpful.

    Secondly, not even his example is anywhere near what he's trying to explain. Yes, the analogy breaks down, but it has nothing to do with the convoluted reasoning he's applying. The reason the analogy breaks down is that there's no equivalent of walking to the classroom in his example. All of his code simply assigns a classroom number, without any equivalent of the walking part. As soon as you add that - magic! - the analogy works again.

  • The insensitive clod who wrote this is clearly not working in AI.

  • by caseih ( 160668 ) on Saturday January 03, 2015 @06:31PM (#48727289)

    This is why I like Python. Python allows object-oriented or procedural programming styles, or a mix. Python has a lot of warts, but it's really refreshing to me to use. Every time I look at Java, I'm turned off by the forcing of class-based object-oriented programming for everything, even when the program is really just procedural with a static main. Perhaps this tends to make programmers try to shoehorn in OOP when it's not the best fit.
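
    For instance, both of these are ordinary Python and can sit in the same file (a deliberately trivial example):

      # Procedural: just functions and data.
      def total(prices):
          return sum(prices)

      # Object-oriented: the same idea as a class, for when the extra structure earns its keep.
      class Cart:
          def __init__(self):
              self.prices = []

          def add(self, price):
              self.prices.append(price)

          def total(self):
              return sum(self.prices)

      print(total([1, 2, 3]))   # 6
      cart = Cart()
      for p in (1, 2, 3):
          cart.add(p)
      print(cart.total())       # 6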

    • by radish ( 98371 )

      There's absolutely nothing stopping you writing procedural code in Java: just put everything in one class and mark all your methods as static. Of course, if you're going to start interacting with the class library you'll have to bend to its way of thinking, but that's not a _language_ thing. I don't recommend doing that, but it can be done.

      This is why an experienced developer has multiple tools at her disposal - Java is great (IMHO) for a lot of things, but I'll pull out Ruby or Perl for some stuff

  • Balance (Score:4, Insightful)

    by Tablizer ( 95088 ) on Saturday January 03, 2015 @07:00PM (#48727419) Journal

    OOP is a tool, and like any tool it has places and times where it is useful, and places and times where it is not. Knowing when to use it and when not to is part of the art and craft of programming. Does OOP improve the overall organization, reading, and flexibility of a given portion of software to change? If you don't have a good answer, then consider skipping it.

    All-or-nothing zealots are usually full of squishy stuff. In the past, one was told to model all domain nouns as objects and/or classes (depending on language flavor). That doesn't always work well, especially if a database is being used. Domain nouns are often poor candidates for OO-ness. Objects seem better suited, however, for computing-domain nouns, such as GUIs, report columns, files, sockets, security profiles, etc. That's my experience.

    (However, lately GUIs seem to have outgrown OOP, as multi-factor and situational grouping and searching become more important for managing mass attributes and event handlers, and I'd like to see research in database-driven GUI engines, perhaps with "dynamic" relational or the like.)

  • by SuperKendall ( 25149 ) on Saturday January 03, 2015 @07:07PM (#48727459)

    A dash of anthropomorphism is actually quite helpful to a programmer - the trick is that people are just not applying it in quite the right way. Instead of considering a program or its objects as rational beings that do what you tell them, consider that code and objects are all malevolent entities that exist only to twist what you tell them to do into the most horrific possible result.

  • Do anthromorphise! (Score:4, Insightful)

    by dwheeler ( 321049 ) on Saturday January 03, 2015 @07:25PM (#48727553) Homepage Journal

    Don’t anthropomorphize computers, they hate that [dwheeler.com] notes that most developers do use anthropomorphic language. I think there are probably a variety of good reasons for it, too. Here's one speculation: When we communicate with a human, we must use some language that will be more-or-less understood by the other human. Over the years people have developed a variety of human languages that do this pretty well (again, more-or-less). Human languages were not particularly designed to deal with computers, but languages have been honed over long periods of time to discuss human behaviors and their mental states (thoughts, beliefs, goals, and so on). In any case, the problem isn't anthropomorphic language, it's the use of a bad analogy.

  • First I would like to point out that OO and procedural are not so much different. The main difference is that in OO the procedure that runs is chosen implicitly by the class the call is made on, and not explicitly by the code that makes the call. Saying OO vs procedural is a false dichotomy. I will refer to "procedural" as non-OO.

    The example in the blog is overly simplistic and does not show the strengths of OO. It uses OO where it is not needed. Here are some issues with the OO implementation:
    1. Too few classes.

  • If programming were strictly about efficiently providing instructions to computers, then anthropomorphism would be wasteful and counter-productive. Think about all of the code and processor cycles devoted to displaying data as windows, folders, icons, or just plain aesthetics. Those metaphors are highly wasteful of computer processing power.

    But the point is, computers are, above all, a tool for people. So why not make them function in a way that is understandable to people? If anthropomorphism helps pro

  • by nuckfuts ( 690967 ) on Saturday January 03, 2015 @07:29PM (#48727591)
    This has been going on since before object-oriented programming existed. In Unix, for example, processes have "parents", "children", and can be "killed".
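
    For example, straight from Python's standard library on a POSIX system, where "killing" really does mean sending a signal to the child:

      import os, signal, subprocess, time

      child = subprocess.Popen(["sleep", "60"])   # the parent spawns a child process
      print("parent", os.getpid(), "-> child", child.pid)

      time.sleep(1)
      child.send_signal(signal.SIGTERM)           # "kill" the child
      child.wait()                                # the parent reaps it, avoiding a zombie
      print("child exited with", child.returncode)
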
  • Car Analogy (Score:4, Funny)

    by PaddyM ( 45763 ) on Saturday January 03, 2015 @07:41PM (#48727645) Homepage

    Let's say you're a traveling auto salesman, and you would like to sell your cars to different stores around the state. You could either drive each car, one at a time, to each assigned destination and hitchhike back to your starting point (always with a towel). Or you could come up with an algorithm for taking all the cars, putting them into a truck, and finding the shortest path that visits each auto store, saving gas and giving you the street credibility to comment on the appropriateness of OOP vs procedural languages. Then, after having spent a more fulfilling life than most people by being so efficient, you can watch as people invoke your name and come up with a poor analogy which doesn't really explain OOP vs procedural languages, and which shows up on Slashdot.

  • by melchoir55 ( 218842 ) on Saturday January 03, 2015 @07:43PM (#48727655)

    Humans are bad at abstract logic. Not just bad, but shockingly, horribly bad. It requires many years of teaching to get them to learn how to reason according to logical principles and to avoid logical fallacies. Most people never get there at all.

    OOP is a step in the right direction, for some kinds of programming. You don't always need to tell a story about your concept space. Sometimes what you're doing is so simple, and so short-lived, that it doesn't matter. However, if you want long-term maintainability and something that other people are going to be able to learn as quickly as possible, OOP wins. Consider the following example:

    English:
    John loves Sally. People like to spend time with others that they love. Does John like spending time with Sally?

    If you are human and not deeply mentally impaired, you will quickly answer "yes" to the above question

    Symbolic logic:
    Derive Q (John likes spending time with Sally)
    P (Assertion)
    P -> Q (Modus Ponens)

    Did you immediately think in your brain the following:
    Q (Consequence of premises 1 & 2)

    A lot of people who stare at that sequence of symbols will require a few moments to process it. Very few will read it as trivially as they read the English expression, although both expressions communicate the same relationships and information. Why is that? It is because the logical derivation is an abstraction above the English expression (which itself is of course an abstraction of something else). Every level of abstraction adds to how difficult it is for a human to comprehend something. It doesn't mean they can't get it, it means it will take longer (though depending on the person it might mean they can't get it).

    Do you want people to be able to read your code in the future? The best chance of them succeeding is to use object oriented programming, and to create a class model that closely resembles what most people intuitively understand as the concept space you are working in.

    Humans did not evolve to process information regarding Ps and Qs. They evolved to process information about Johns and Sallys. They are much better at the latter than the former.
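
    As a sketch of a class model that tracks the concept space (purely illustrative):

      class Person:
          def __init__(self, name):
              self.name = name
              self.loves = set()

          def likes_spending_time_with(self, other):
              # "People like to spend time with others that they love."
              return other in self.loves

      john, sally = Person("John"), Person("Sally")
      john.loves.add(sally)
      print(john.likes_spending_time_with(sally))   # True; it reads almost like the English version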

    • A lot of people who stare at that sequence of symbols will require a few moments to process it. Very few will read it as trivially as they read the English expression

      It's not very well written.

      although both expressions communicate the same relationships and information.

      I don't think they actually do. You don't define what P means, and actually you have two assertions in your English sentence, but only one apparently in your symbolic logic.

  • What? (Score:5, Funny)

    by matunos ( 1587263 ) on Saturday January 03, 2015 @10:56PM (#48728307)

    An analogy that breaks down if you push it too far?

    Ridiculous! You cannot push analogies. They are not tangible things.

  • The original analogy involved a principal (a fairly complex process) who instructs a group of students (a collection of complex processes) to go to various classrooms (a fairly expensive, time-consuming task). The pseudo-code he has breaks most of these assumptions and results in a simple single-threaded task calling a move-to-classroom routine that is so trivial that the implementation is not even shown. His straw-man analogy is thus not correct.

    It could easily be the case that the analogy is correct.
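
    A rough sketch of what the parent describes, with each student as its own thread of execution and the walk taking real time (this is not the article's code, just an illustration):

      import threading, time

      class Student:
          def __init__(self, name):
              self.name = name
              self.classroom = None

          def move_to_classroom(self, room):
              time.sleep(0.1)   # stand-in for the slow walk down the hall
              self.classroom = room
              print(f"{self.name} arrived at room {room}")

      # The principal instructs everyone; the students move concurrently.
      students = [Student(f"student-{i}") for i in range(3)]
      walks = [threading.Thread(target=s.move_to_classroom, args=(101,)) for s in students]
      for w in walks: w.start()
      for w in walks: w.join()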

  • Object orientation is a tool; unfortunately most programming languages treat it as if it were a religion.

  • by khallow ( 566160 ) on Sunday January 04, 2015 @12:34PM (#48730807)
    Anthropomorphizing is not this ridiculous caricature. It is a very effective process by which we relate new information to information we inherently have. Would you rather relate new information in a way you and others can readily understand or in a way you and others can't readily understand?

    Sure, it's not perfect, but usually you're looking for good enough, not perfect. For example, consider this one from my neighbor, Yellowstone National Park. You are a tourist and come across a bull (male) elk [wikipedia.org]. It's a 600-pound member of the deer family with an antler spread around two meters wide. There are correct and incorrect ways to anthropomorphize the behavior of that bull elk. The following is the incorrect way, which unfortunate real-world tourists use each year: "That's a pretty elk. I know he wants me to pet him, because I would like being petted if I were a pretty elk too!" Their world gets rocked as a result. Consider the alternate approach: "That's a big elk in the middle of rut season [wikipedia.org]. I bet he'll be pissed if a crazy human tries to touch him. I would be if I were running around hyped up on hormones." Look! No headbutt, Ma! Obviously, neither approach captures what it means to be an elk or those elk sensibilities. There is a certain lack of nuance of the elk point of view. But one approach, which I would go as far as to label an entirely correct approach to understanding in this scenario, keeps you from finding out what pointy antlers backed by 600 pounds of enraged elk can do to a person.

    My view is that humanity and our behaviors are sufficiently complex that one can shoehorn any understandable phenomenon into an anthropomorphic basis. The real problem is not the process, but insufficient understanding of the problem needed to come up with a sufficiently correct anthropomorphic model.
