Think Python 201

An anonymous reader writes "In a never-ending effort to spread the word about free, quality online programming books, here is a Python programming book. 'How to Think Like a Computer Scientist: Learning With Python', by Allen B. Downey, Chris Meyers, and Jeffrey Elkner, is a copylefted work available in multiple formats at Green Tea Press: HTML, PDF, and LaTeX. Compliments of the online books what's-new page."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Tonetheman ( 173530 ) on Saturday July 27, 2002 @09:23AM (#3964089)
    This book has been translated into other programming languages (like C++ and Java)... so if Python is not for you (it should be), you can read those too.


    • I can only find two versions - Java and Python.

      Where can I find the C++ version?

      Thanks in advance!

    • Hi. Thanks for mentioning the other versions of "How to think..."! Actually, the Java version was the original (I used it at Colby College) and then I wrote the C++ version to help students prepare for the AP exam. Jeff Elkner translated the Java version into Python, and Chris Meyers translated some of the later chapters and added some new material. So the Python version is truly the result of a Free Content collaboration (I have never met Jeff or Chris in person). I am in the process of editing and expanding the Java version, in preparation for the AP Exam's switch to Java. The web page for the Java version is: http://thinkapjava.com Cheers, Allen Downey
  • At first it looked like you were talking about a new IDE for writing Python programs ;) Anyone remember Think Pascal?
  • Sounds like a good read for anyone who thinks that 'computer science' is 'installing Linux and setting up a web server'.

    The purest computer science is essentially mathematical at heart. I don't think current CS curricula put enough emphasis on that basic tenet.
    • by affenmann ( 195152 ) on Saturday July 27, 2002 @09:42AM (#3964137)
      I think part of the problem is the name `Computer Science', which gives a wrong impression of what the core of the poodle really is. That's like calling Astronomy `Telescope Science'. I have met so many people who didn't want to study CS at all - they just wanted to learn `installing Linux and setting up a web server'. This has regrettably put universities under pressure to change their curriculum...

      Some universities (e.g. Edinburgh) have started calling it `Informatics', which is much more appropriate. (In fact in Germany, and probably elsewhere, it was always called `Informatik'.)
      Maybe there should be CS *and* Informatics.

      Uhm, guess that was offtopic.

      • I agree that the term Informatics is more appropriate. In my country (Spain), it is called "Informática".
      • My local community college puts stuff like "Introduction to Microsoft Word" under Computer Science! Gaaaaaaaaaaaa!!!
      • I think part of the problem is the name `Computer Science', which gives a wrong impression of what the core of the poodle really is.

        Well, yeah.

        I'd make the following analogies:

        • Computer Scientist ~= Physicist
        • Software Engineer ~= Electrical Engineer
        • Informaticians and Coders ~= Electronics Technicians

        CS is a science that deals with unravelling how information and logical systems function, and with developing frameworks to understand them. Computer scientists are the most likely to determine the boundaries at which things can happen, and to lay out how to practically approach those boundaries.

        Software Engineering is an engineering discipline that deals with manipulating those systems to perform a needed task. Software engineers take the work of the computer scientists and design systems to address specific problems. Quick and dirty is just fine, provided that all the needs are being met.

        Coders assemble the systems that the SEs design and informaticians maintain those systems.

        There's overlap among all of them to some degree, and plenty of people do them all, but from an education point of view, if you mix them together, you get a mess - and most schools mix them together. It was easier to mix them in the past because the field was narrow. But now, you just can't do it.

        CS has become very deep, and you can't get into any of the real work if you spend your time dealing with SE and coding practices. SE has become very deep as well and you don't want these folks getting bogged down with the NP completeness proofs and whatnot, or with learning the programming tools too much. There's enough to do in all three areas that they need to be treated as different but complementary disciplines...

    • Well, as a CS student, what I can say is that CS mostly deals with optimization in the first place, plus a basic programming language like C/C++. The idea is hardly ever to teach a language, but rather how to optimize things.
    • True 'Computer Science' covers a lot of areas, mostly fundamental, including Computer Theory, Electronics, Mathematics, Logic, and Processor Theory and Design, to name but a few.

      The majority of today's CS Courses seem to fall into two broad categories, 'Software Development', and 'Systems Management'. Whilst these are both elements of computer science, they do not encompass computer science as a whole.

      Universities are more and more often cutting out the core Computer Science components of their courses, such as Electronics and Computer Theory, which is a shame - whilst the courses leave graduates with an understanding of 'how' to do certain tasks, they are left with no understanding of 'why' they are done that way, because they have no real understanding of how the computer systems they are carrying out tasks on actually perform their functions.

      NDFSMs are important, Karnaugh maps are important, and understanding the CPU fetch/execute cycle is important too - bring back real CS to our universities!
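      The state-machine fundamentals mentioned above take only a few lines to demonstrate; as a sketch, here is a deterministic finite-state machine (the deterministic cousin of an NDFSM) that accepts binary strings with an even number of 1s. The states and alphabet are a made-up toy example:

```python
# Toy deterministic finite-state machine: accepts binary strings
# containing an even number of 1s (a classic introductory example).
def accepts(string):
    transitions = {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd", "0"): "odd",   ("odd", "1"): "even",
    }
    state = "even"                  # start state
    for symbol in string:
        state = transitions[(state, symbol)]
    return state == "even"          # "even" is the accepting state

print(accepts("1011"))  # -> False (three 1s)
print(accepts("1001"))  # -> True  (two 1s)
```

      The transition table is the entire machine; everything an NDFSM adds on top is allowing several possible next states per (state, symbol) pair.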

      • by Anonymous Coward
        The lower-level concepts you have described probably fit better into a degree of Electronics and Electrical Engineering than Computer Science. Indeed, the style of EE courses has shifted from electronic component based logic to programming in languages like C++.

        In many ways, CS and EE are extremely similar.
      • by Anonymous Coward
        I wouldn't make a broad statement that CS is lacking in CS degrees... I know my CS degree contained:

        Computer theory, system design and architecture, analysis of algorithms, algorithm design, data structures, compiler design, AI, networking theory, files and databases, mathematics (up to multi-variable calculus, discrete structures/combinatorics, linear algebra, numerical analysis, statistics and probability), software engineering, languages (LISP, C, C++, Java, Prolog, Cobol, Assembly), computer graphics, to name a few things - there's more but it's been quite a few years and that's what comes to mind at the moment. I'd say that's pretty well-rounded for a CS program.
      • I'm gonna make a bold claim with the electronics bit, but it's kinda... easy. I do think it should be taught, though. I'd never really played with electronics much until a recent embedded project at work, when the EEs from the lab came in and started drilling me on, like, what sorta DACs we'd be using and the like. Out came the books. The thing that struck me was how easy it was to schematify a little circuit up (ignoring ugly stuff like, oh say, resistors and the like!) that I could give to the EEs to make into real toys. But that's because back in the day we actually learned how this stuff fits together. Really, if one can understand the finer points of stack tuning, then figuring out that one needs to provide a latch to a parallel thinger is not too hard. Real CS rocks, and amazingly, once or twice in a career it actually gets used.
  • by Taco Cowboy ( 5327 ) on Saturday July 27, 2002 @09:35AM (#3964113) Journal


    Thanks to Copylefted Online Books, I now can read the books before I buy.

    On my bookshelf, seven of the books were bought after I read their online version.

    I live in a third-world country where there is no Towers bookstore, nor Borders, nor Barnes - there is NO WAY for you to know how good a book is without first buying it - the bookstores here do NOT allow you to read the book!

    The idea of copylefted books really helps me, and many others who are in the situation of buying books without knowing if the books are good or not.

    Thanks again!

    • Another option for you may be to subscribe to a service like Safari [oreilly.com] from O'Reilly. I subscribed and use it daily. Basically you can check out books from O'Reilly and several other publishers for 45 days at a time (after which you can check them back in if you want another book).

      It's not too expensive compared to how much technical books cost in some countries outside the US I have visited. The Safari service is about 10 dollars a month (US) for 5 books, 15 dollars for 10, and so on.

  • Question (Score:3, Insightful)

    by citizenc ( 60589 ) <cary&glidedesign,ca> on Saturday July 27, 2002 @09:37AM (#3964118) Journal
    Am I the only person who thought the title of the book was "How to Think Like a Computer Scientist: Learning With Monty Python"?

    Man, it's early.
    • You aren't totally crazy -- that's actually where the name "Python" came from. In fact, Guido van Rossum (the creator of Python) strongly encourages the interspersion of Monty Python references into the comments of Python programs :-)
  • Dive Into Python (Score:4, Informative)

    by jonwiley ( 79981 ) on Saturday July 27, 2002 @09:42AM (#3964138) Homepage

    Another excellent free book for Python is Dive Into Python [diveintopython.org] by Mark Pilgrim [diveintomark.org]. It is available in HTML, PDF, Word 97, Windows Help, plain text, and XML formats.

    This book has plenty of examples and pointers to further reading on each subject. It features good layout, use of colors, and typography, which make for easy reading and comprehension.

  • Are there any free Perl books out there like this one for Python?
    I haven't done any Perl or Python before and I wanted to look at both before I pick one and use it primarily.

  • I found this a great book for when you are learning Python as a second or third language.

    I learnt Python in two nights from it because, as we all know, once you have the basics of CS in your head, the language you use is just an implementation detail.
  • Computer 'Science'? (Score:2, Interesting)

    by nih ( 411096 )
    Feynman once said during an interview that some 'sciences' were really pseudosciences, in that they have never made any laws - social science, for example. Is computer science a real science with laws, or a pseudoscience? If it's a science, does anyone know of any of these laws?
    • by Anonymous Coward
      Feynman doesn't have a fucking clue about anything outside his own field. He's a prime example of one of these arrogant beings who believes nothing is more important than their field of choice.

      Of course Computer Science is a science - take any introductory CS course and you will come across many formal theorems and hypothesis-based discoveries.

      Examples:

      Halting problem - this is essentially derived logically from basic premises.

      Neural networks - these are constantly the subject of scientific study in much the same way as geneticists study rats.
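      The halting problem mentioned above really is derived logically from basic premises, and the derivation fits in a few lines of Python. This is only a sketch: halts() stands in for a hypothetical oracle, and make_trouble is an illustrative name, not any real API:

```python
# Sketch of the halting-problem diagonalization. halts() is a
# hypothetical oracle claiming to decide whether f(arg) halts;
# the construction shows no such total, always-correct function exists.
def make_trouble(halts):
    def trouble(f):
        # Do the opposite of whatever halts() predicts about f(f).
        if halts(f, f):
            while True:       # oracle said f(f) halts -> loop forever
                pass
        return "halted"       # oracle said f(f) loops -> halt at once
    return trouble

# With an oracle that always answers "does not halt", trouble(trouble)
# halts immediately -- contradicting the oracle's own prediction.
trouble = make_trouble(lambda f, arg: False)
print(trouble(trouble))  # -> halted
```

      Whatever answer the oracle gives about trouble(trouble), the program does the opposite, so no correct halts() can exist.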
      • Here's what Feynman actually had to say on the subject:
        Computer science also differs from physics in that it is not actually a science. It does not study natural objects. Neither is it, as you might think, mathematics; although it does use mathematical reasoning pretty extensively. Rather, computer science is like engineering - it is all about getting something to do something, rather than just dealing with abstractions, as in the pre-Smith geology.
        That's from the introduction to Feynman Lectures On Computation - a book which suggests that RPF thought the subject important. Certainly not as important as physics - but if you don't think your own field of study is the most important thing there is, can we assume you're only in it for the paycheck?
        • by Lictor ( 535015 ) on Saturday July 27, 2002 @11:02AM (#3964355)
          Hmm... to be honest... the "Feynman Lectures on Computation" are just about as absurd as the "Goedel Lectures on Biochemistry" (these don't really exist... I'm just being sarcastic). The original poster's comments on Feynman had some merit.

          The phrase that you quote here displays a mind-boggling ignorance about exactly what "Computer Science" is. Software Engineering is, indeed, "like Engineering" but there are many branches of Computer Science that deal *purely* with the abstract. I do Formal Language Theory and Automata Theory for a living and I just can't see how these fields are about "getting something to do something". Feynman, like most people, has missed the science for the telescope.

          The real joke is that things like the Church-Turing thesis could not possibly be MORE about "natural objects". In the abstract, I can define a machine that can solve the halting problem. Heck, I can define a machine that solves any problem I want! The Church-Turing thesis tells us about a PHYSICAL limitation on computing. In this universe, you can only build a machine that will compute *these* functions.... But what if I live in a universe where time has no meaning? All of a sudden, I get a *very* different Church-Turing thesis.

          There is no question that Feynman had some brilliant insights in physics, but I have to admit that when I read the Lectures on Computation, not only did I lose a small amount of respect for him... I found myself actually outraged. Many intelligent people will read these lectures and believe them... I mean, after all, they're written by Feynman, right?

          Computer scientists have enough trouble trying to explain to people that, no, we don't just sit around installing Windows network drivers all day, without a respected and intelligent person like Feynman adding to the problem.

          • by hysterion ( 231229 ) on Saturday July 27, 2002 @02:03PM (#3964935) Homepage
            Those are hard things to tell a Caltech alumnus :-)

            No doubt, Feynman was a very, very good physicist. But he was also a genius at self-promotion, and his cult has gone way overboard as a result. It's well-established by now that some of the ideas he's famous for were first published by others.

            (Not that he wasn't honest about it sometimes. I think he's on record, for instance, crediting Stueckelberg for the renormalization of electrodynamics, and for the idea that positrons are electrons travelling backwards in time. See e.g. this timeline [weburbia.com], or the last chapter of this book [amazon.com].)

    • It's a lot more like mathematics than science (because essentially we're talking about a generative and definitional activity rather than an analytical process of discovery), but yes, there are plenty of CS "laws": Church's theorem, Gödel's theorem, Shannon's law, NP-completeness (no cracks about quantum computing now), etc., etc. Oh, I suppose there's always quantum computing to make it a science.

      Lately, though, it's starting to become analytical, courtesy of Microsoft. First, you make a hypothesis about how the OS works, then you write a program to test this hypothesis, analyze the results, and modify your hypothesis to fit the facts :-).

      You might find it interesting that, science or not, Feynman spent the last few years of his life teaching a computer science class called "The Potentialities and Limitations of Computing Machines". It was a very interesting class. He brought his unique wit and wisdom to a subject more in need of wit and wisdom than most.

    • by panurge ( 573432 )
      Many years ago I went for interview to the CS department of the University of Leeds, England. Things were going well until I asked the full professor interviewing me if we were going to learn anything about the hardware side of computing. He looked down his nose at me, drew in his breath and said "My boy, we don't concern ourselves with the doings of technicians". (Leeds is not precisely in the forefront of advances in modern computing.)
      However, for the last hundred years or so it has really been increasingly difficult to separate science and engineering. More and more, scientific hypotheses can only be tested when sufficiently advanced engineering comes along. There have always been "whiteboard scientists" (i.e. theoreticians) who resent this.
      But most great scientists were skilled engineers as well. Galileo, Newton, Bunsen, Babbage, Turing...

      I think the terminology is the problem. We don't talk about "Physics science" or "Biology science", so why "computer science" or "rocket science"?
      Why not just computing and rocketry?

      While I'm having a rant, there's also a problem with degrading the word "engineer". MCSEs and such are basically technicians, not engineers. Perhaps if we admitted that the people who implement systems using standard components that just have to be set up correctly (although this may be a challenging role) are technicians, then we could accept that most "computer scientists" are actually trained as engineers, that this is a highly skilled and challenging professional role, and the number of real scientific researchers is not that great. Just like physics and chemistry nowadays, in fact.

      I would suggest that the test of a pseudoscience is that it doesn't create a hierarchy of engineers and technicians because, basically, it doesn't work and there would be nothing for them to do. You don't get sociological engineers designing ever better societies, and socio-technicians building them just as fast as people can throw money at them. (At least, the attempts, such as Marxism-Leninism, have been abject failures.) But you get plenty of sociologists. On this basis, computing, with its deep organisational structures, is an extremely successful science-based system. Arguments about testing hypotheses are irrelevant: real scientists tend not to work like that anyway.
      Scientific proof has been conventionally about other people reproducing your results. But if the nature of your science/engineering is that you can rapidly produce millions of copies of your concept or invention, this becomes trivial. If I claim to have invented (say) a graphics chip architecture that can draw polygons twice as fast as the previous best for a given clock speed and die size, I prove this by marketing the product, not by publishing and waiting for other labs to build a copy and duplicate my result.

    • of all sciences. Believe it or not, CS means nothing unless you know some other science (but it can also glue other things, like information management, which is not a science but actual data).
  • Practical PostgreSQL [commandprompt.com]
    Using Samba [oreilly.com]
    Personally I thought both were very well written, the samba book has helped me greatly.
  • by hacksoncode ( 239847 ) on Saturday July 27, 2002 @10:18AM (#3964233)
    I should probably quit whining and write my own copyleft book, but I skimmed through this book, and I'm not all that impressed.

    This book doesn't strike me as a book on how to think like a Computer Scientist, except insofar as Computer Scientists generally make lousy Software Engineers. There are no descriptions of the advantages of object-oriented programming, no discussions of theoretical topics, and in general very little encouragement to view programming as a science. Basically, this appears to be just a book on the Python language, written for someone who has never programmed before. That's a fine thing, don't get me wrong. My brief look even makes me think it could be an effective example of such a book. At the very least, I think it's hyped wrong.

    However, from a software engineering point of view, I find it damning that the book forgoes any explanation of the practice of, or motivation for, writing maintainable code. I consider that unforgivable in a beginning programming book. You absolutely have to impress on newbies early the importance of documentation, sensible structure, logical variable naming, good class hierarchy, etc.

    I consider this especially true for Python, which is an interpreted, non-declarative language (making maintainability all the more important). Python is, conversely, also especially well designed as a platform where such concepts could be taught. It largely overcomes the occasional weaknesses of its design philosophy by consciously including language features such as built-in support for docstrings, well-crafted namespaces, modules as first-class citizens, etc.

    Yet, these language features are barely given a nod in this book.

    It's books for existing programmers that can afford to skimp on these areas.
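    For what it's worth, the docstring support mentioned above takes only a few lines to show; a minimal sketch (the area function is just an illustrative example, not from the book):

```python
import math

def area(radius):
    """Return the area of a circle with the given radius."""
    return math.pi * radius ** 2

# Docstrings are first-class: the text is attached to the function
# object itself and available at runtime to help(), IDEs, and doc tools.
print(area.__doc__)   # -> Return the area of a circle with the given radius.
print(area(2.0))
```

    Because the documentation lives on the object rather than in a comment, tools can extract it automatically, which is exactly the kind of maintainability habit a beginner's book could teach.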

    • by Anonymous Coward
      That's because it's not about Software Engineering, you fool. It's about Computer Science.

      Software Engineering is essentially the application of CS to real world projects - and the current fashions in SE should be a separate course entirely. It's more about psychology and HR than it is about Computer Science.
      • That's because it's not about Software Engineering, you fool. It's about Computer Science.

        If you actually look at the book in question, you'll see that the original poster was correct: it's not about computer science at all. It's a Python programming book with a marketing angle relating it to computer science.

        If you really want a book which teaches "How to Think Like a Computer Scientist", try SICP [mit.edu]. For a good summary of the book, see this comment [slashdot.org] from the recent "Best Computer Books" article.

        • We want it for free :( ... though many of the criticisms are indeed covered in the book: naming of variables, abstracting and not hardcoding stuff, making it readable, making it generic, wrapping code, etc.

          "It's something like do as I do (though I am not saying it explicitly every 5 seconds)".
          • We want it for free :(

            The link I gave to SICP [mit.edu] includes the full text in HTML. How much more free do you need it to be? It's accessible by clicking the link on that page which reads "Full text! The complete text in HTML". Unintuitive, I know, but that's the kind of thing you'll learn to understand as a computer scientist...

            though many of the criticisms are indeed covered in the book: naming of variables, abstracting and not hardcoding stuff, making it readable, making it generic, wrapping code, etc.

            These things have very little to do with computer science, with the exception of abstraction (which has very little to do with "not hardcoding stuff"). What we have here is a situation where people who know absolutely nothing about the current practice of computer science are using the term to mean whatever they imagine it means.

            Variable naming, readability, wrapping code etc. all involve techniques which any good programmer should understand, academic or otherwise. But calling that computer science is like calling hammering a nail "civil engineering" or "materials science". The Python book, at best, could be described as an intro guide to software engineering, but even that's a stretch. It's really "how to program in Python", with a few technical terms introduced along the way.

            • Oh thanks! (It sounded like a place to buy the book.) I'll try and see if I can get along with it (I don't like math much, though I adore logic); it seems more akin to what I need... (i.e.: I know how to program, but not how to make good programs. Every time I need to add unexpected features, or when I review my old stupid ugly code, I am reminded of that fact :-)

              • I don't like math much though I adore logic

                I also tend to prefer the logic side of things. A lot of computer science of the kind that SICP deals with has to do with mathematical logic, which I find more logic-like than math-like. For that reason, I love the lambda calculus, which is what almost all functional programming is based on.

                If you like logic, and you like programming, you *must* learn the lambda calculus - it'll give you new insight into the meaning of computer programming, and you'll probably enjoy it, too. It has applications to literally every programming language. If you're not already familiar with lambda calculus, the name sounds a lot more daunting than it really is. To start out with, you can think of it as a really simple and primitive programming language - almost the prototypical programming language.

                I'm not sure what the best intro to the lambda calculus is - the academic works are mostly heavier-duty than you really need. Some Google searching definitely turns up some useful stuff, but offhand I don't know of a definitive intro site. However, SICP teaches a lot of the necessary fundamentals, since the Scheme language that it uses as a teaching tool is just about the closest language to the lambda calculus that there is. If you learn a bit of Scheme first, learning lambda calculus from the lay perspective is fairly straightforward. (I say "lay perspective" because it has some hairy mathematical underpinnings, which I mostly ignore, and that's OK for most purposes unless you're trying to prove mathematical theorems.)

                (i.e.: I know how to program, but not how to make good programs. Every time I need to add unexpected features, or when I review my old stupid ugly code, I am reminded of that fact :-)

                ...And then there's programming under a deadline, which is a whole 'nother story.

                I think that these academic topics can be excellent for improving one's design and thinking skills; at least, I've found that to be the case. It's an indirect kind of thing, though, and experience is probably equally useful. Abstraction is key, though, whether you do it using object orientation or modules in functional programs. Keeping interfaces separate from implementations is probably one of the most important things, and it's a principle people violate all the time. It doesn't help that many languages don't provide features to do this properly...
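                Since Python has first-class lambdas, you can even play with the lambda calculus discussed above directly; here is a small sketch using Church numerals (a standard textbook encoding, not something from SICP itself):

```python
# Church numerals: represent the number n as a function that applies
# its first argument n times. This is pure lambda calculus, written
# with Python's lambda syntax.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Convert a Church numeral to a Python int by counting applications.
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
print(to_int(add(two)(two)))  # -> 4
```

                Everything here is just function application; numbers, successor, and addition all fall out of the encoding, which is a good first taste of why the lambda calculus can express arbitrary computation.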

    • This isn't exactly what you're looking for, but have you checked out The Art of Unix Programming [tuxedo.org] by ESR? Only the first four chapters are written (plus the preface and TOC), but it looks like it covers a lot of what you're thinking of.
    • Ooo! I _knew_ I was gonna get flamed for choosing a pretentious title. Really, it's mostly meant to be silly (not a marketing angle).

      The book is (just) an introduction to computer science that focuses on the basics of programming. It covers the material I've been able to get college students to understand in one semester, which means yes to functional and data abstraction and no to modules as first-class citizens.

      It's also aimed at people with no programming experience at all, so I tried to explain the basics slowly and LOUDLY.

      Thanks to all the slashdotters that have commented on the book!

      Cheers,
      Allen Downey
  • From the opening section of each book: 1.1 What is a programming language?

    Java is an example of a high-level language; other high-level languages you might have heard of are Pascal, C, C++ and FORTRAN.

    Python is an example of a high-level language; other high-level languages you might have heard of are C, C++, Perl, and Java

    Both C++ and Pascal are high-level languages; other high-level languages you might have heard of are Java, C and FORTRAN

    C, a language without file I/O, without bounds checking, and with direct access to ports, is high-level? If you say the libraries chucked into a C load make it so... then Assembler is a high-level language, too.

    Last I heard, it was Binary Code=0, Assembly=1, C=1.5; Fortran, Cobol, and Basic were about 3; Ada and C++=5.

    Perl was not even in the picture, because it was a scripting language.

    Also, high-level languages do not equal easier code, nor do they make for faster code... They do make coding more strict, forcing you to follow the limited ways the authors of the language thought you should think (like the use of GOTOs :-). Low-level languages allow the coder the freedom to get the job done and not compromise the functions to the limits of the authors, and they require the coders to truly think like computer scientists. Look at Ada for what is wrong with a really high-level language. See how limiting the language can be made, and how much time is needed to set up the coding effort.

    PS: maybe these are great books, but I stopped reading there, because how can it teach you to "Think Like a Computer Scientist" when it does not know the basics of computer science?
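    For what it's worth, the bounds-checking difference is easy to see from Python's side; a small sketch (safe_get is an illustrative helper, not a standard function):

```python
# Python bounds-checks every indexing operation at runtime and raises
# IndexError; C with raw pointers would silently read adjacent memory.
values = [10, 20, 30]

def safe_get(seq, i):
    try:
        return seq[i]
    except IndexError:
        return None   # the out-of-range access is caught, not undefined

print(safe_get(values, 1))   # -> 20
print(safe_get(values, 5))   # -> None
```

    That runtime check is part of what people mean when they call Python higher-level than C: the language spends cycles to rule out a whole class of memory errors.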

    • I think you're being too harsh.

      The fact is, how "high"-level something is is completely relative. There are no strict rules, to my knowledge, about what's included in each "level" (e.g., file I/O, bounds checking like you mentioned). Maybe you're just annoyed because you've always heard C called medium- or low-level and this goes against that?

      The first programming language I used was QBasic (I was very young, ok?). Eventually, I moved on to Turbo Pascal. During the time I spent developing software in Pascal, I frequently heard people talk about C and how it was tighter and superior because it was a low-level language. And compared to Pascal, it is _slightly_ lower level.

      Last year I had a Machine Organization class where we used MIPS assembly. The professor always referred to C as a high level language.

      I think a justified complaint would not be that the author classified C as a high level language, but that he classified it in the same realm as Python and Java. Clearly, Python and Java are substantially more high level.
      • Being too harsh?

        The basis for a science is its terms. If they cannot get the basic terms right, then there is no science.

        About background... I started with Basic on an HP2000F in 1973; I was 14 at the time.

        Which carries into the claim that high-level languages are portable - hahaha. Basic is a high-level language and it is not portable; it was not until about 1989 that the Basic language was generally uniform. Cobol today, even higher-level (in my opinion) than BASIC, still has "flavours".

        This book was a primer - when you cannot get the history/levels correct, what does it say for the rest?

        MIPS assembly compared to C - C is higher, more human-readable (if you do not compound expressions, for example: x[i++] += y[--j];). But C is much lower than Basic.

        Now the question: did your teacher refer to C as a HIGH-LEVEL LANGUAGE (wrong) or a HIGHER-LEVEL LANGUAGE (right)?
        • The basis for a science is its terms. If they cannot get the basic terms right, then there is no science.

          I don't think they got any "terms" wrong. I still assert that how "high-level" a language is is NOT an absolute scale.

          MIPS assembly compared to C - C is higher, more human-readable (if you do not compound expressions, for example: x[i++] += y[--j];). But C is much lower than Basic.

          Good, so you are thinking in terms of "higher" and "lower".

          Now the question: did your teacher refer to C as a HIGH-LEVEL LANGUAGE (wrong) or a HIGHER-LEVEL LANGUAGE (right)?

          High-level language. This is NOT wrong in the context of the course. Relative to the course material (assembly), C is a very high-level language. I have always thought of this scale in terms of context and relativity, not absolute.

          Why do I do this? Well, I suppose in theory someone could come up with an absolute scale and say "C is classified as a X-level language", but when the only terms we're using are "high", "low", and sometimes "medium", this will not hold up.

          You've been programming for a long time. Certainly you realize that C was considered more high level back then than it is now.

          Rewind back to the early 90's. Ignoring Basic, Pascal was the highest-level language I was familiar with and used. People undeniably (and fairly consistently) referred to Pascal as a high-level language. Along come languages like Java, which are clearly higher level than Pascal, and we have to re-evaluate what we consider to be "high-level".

          The point is, we can't keep redefining this stuff every time a new language comes out. You call Python high-level today, but 15 years from now when some other language comes out that is even more abstract, you can no longer call Python high level unless you keep attaching qualifiers (like "very") to the new languages.

          Relative. Not absolute.
    • Compared to something like assembly or machine language, I would consider C a "high-level" language. Of course, compared to a language like C++ or Java, it is not as "high-level".
  • by AmericanInKiev ( 453362 ) on Saturday July 27, 2002 @10:37AM (#3964276) Homepage
    Does anyone know if the author of the book gets paid by Green Tea for donating or "copylefting" the book?

    I'm working on the theory of collecting tax deductions for copylefted art, and this contribution is a great example because it closely resembles historically donated items. If the author donates the artwork to the right organization, he could, by my reading of the IRS rules, be paid in tax deductions.

    Does anyone know of cases in Open Source / Copyleft where tax deductions are being used to help cover expenses?

    I'm sure that the competition - i.e. Microsoft - uses every tax deduction in the book. Are Open Source contributors playing by the same rules - or are we handicapping ourselves by ignoring the tax benefits of donation?

    If anyone can provide examples of copylefted donations and how you documented it for tax purposes - I'm interested.

    I believe there are billions of dollars in potential government funding just waiting to be collected by Open Source artists. Let's go get it!

    AIK

    • Does anyone know if the author of the book gets paid by Green Tea for donating or "copylefting" the book?
      The authors are the publishers.

      O'Reilly offers quite a few books for free on the web because they're out of print.

      For a book that's in print, I don't think it's appropriate to pay the authors extra for copylefting it, because making the book free in digital form is actually a wonderful sales tool. It's worked for me [lightandmatter.com], and it's also worked for Baen books.

      For a book that's out of print, I also don't think it makes sense. The reason these O'Reilly books are out of print is that they weren't big sellers. If they're not making a profit on the book, there's no reason to pay the author extra. Of course I assume O'Reilly only makes the books free online with the author's consent.

    • I believe what you are on is the road to jail for tax evasion.

      The IRS does not allow you to donate your time and deduct your perceived cost from your taxes. It's highly illegal because it is quite obviously prone to abuse.

      So in the case of your artwork example. If you go out and buy a painting for $20,000 and then donate that painting to a non-profit charity, you may deduct the $20,000 or whatever the current market value is of that painting.

      However, if you go out and buy a canvas and some paints and then make your own painting to give to charity, the only thing you may deduct is the cost of the canvas and paints, i.e. the supplies. Now if you sold the painting at auction for $20,000 and then proceeded to give that $20,000 to charity, you may deduct the $20,000, but you're also showing the $20k as income, so it's a net zero-sum game.

      The same is going to be true of a book.
    • Does anyone know if the author of the book gets paid by Green Tea for donating or "copylefting" the book?

      It probably doesn't make any difference either way, since the author owns Green Tea.
  • I can't. (Score:1, Troll)

    by FreeLinux ( 555387 )
    Every time I start to "Think Python", it's something huge like a Burmese. Then Steve Irwin jumps in and starts wrestling with it while shouting something about what a beauty she is. It's very distracting and I just can't code with all that going on.

  • Pointers required (Score:3, Insightful)

    by magi ( 91730 ) on Saturday July 27, 2002 @10:54AM (#3964331) Homepage Journal
    While Python is my favourite language, I think it's rather silly to teach Computer Science and especially basic algorithmics with a language that doesn't have pointers.

    At low level, pointers are everything, and low level is what you want to teach when you're teaching basic data structures and algorithms. There's simply no point in demonstrating list implementation with an interpreted language that has very efficient native lists, dictionaries, etc. C/C++ or Pascal are much better for that; with them you can teach real implementations, not toy ones.

    On the other hand, Python might be ideal for teaching advanced algorithms such as sorting and string algorithms, as those are more "high-level" problems and low-level pointer-messing is no longer needed nor desired. Python has very beautiful string and list operations, which make such algorithm implementations cleaner.

    Also, Python might be OK for the very first Basics of Programming course with respect to pointers, as they don't really teach any algorithmics there. However, the dynamic typing (very late binding) would be a problem in this case. Beginners will have enough trouble understanding the language without the need to handle implicit types. I'd very much suggest a strongly typed object-oriented language such as C++, Java, or Eiffel, where the types are always explicit. For an algorithms course this isn't so much a problem.

    For some classes, such as AI, nothing beats Prolog, and perhaps Lisp, but many Python features such as easy string manipulation and other middle-level data structures make it tempting for many subjects such as Automata and Formal Languages. It would be interesting to have a good Python interface to a Prolog interpreter; one that is well integrated with the syntactic philosophy of Python.
    • by cheezedawg ( 413482 ) on Saturday July 27, 2002 @11:49AM (#3964506) Journal
      I think it's rather silly to teach Computer Science with a specific programming language at all. Computer Science is not about teaching people how to program.

      To me, Computer Science is a very cool blend of:
      - Discrete mathematics
      - Computation theory
      - Linguistics
      - Complexity theory
      - Logic
      - Probability

      None of these rely on programming at all, let alone a specific language or whether that language has pointers or not. Programming is only an application of Computer Science.

      You gave the example of an AI class - to me, the core of AI is learning about first-order logic and predicate calculus, searching and graph traversal heuristics, reasoning, and natural language processing. I know this list isn't complete, but the point is that these concepts are for the most part independent of programming. Sure, you can start off by teaching somebody a specific implementation in Lisp or Prolog, but then they only know how to program AI in Lisp instead of how to apply AI concepts in any setting.
      • Programming is only an application of Computer Science.

        One might reasonably argue that programming is the application of Computer Science. Without programming to implement its ideas, CS is pretty much just a bunch of theory.

    • by pclminion ( 145572 ) on Saturday July 27, 2002 @01:19PM (#3964800)
      While Python is my favourite language, I think it's rather silly to teach Computer Science and especially basic algorithmics with a language that doesn't have pointers.

      You have failed to understand the point of Computer Science (pun intended). Python is a terrific language for teaching CS because it has the basics of discrete structures: lists, maps (in Python, called dictionaries), tuples, and atomic data types such as strings, ints, and reals. That's all you need.

      There's really nothing you can't do once you have lists and maps. Don't object that you can't have O(1) access-time arrays -- you can do that with a map.
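
      For instance, a dict keyed by consecutive integers behaves like an O(1) average-case array - a quick sketch in modern Python (make_array is just an illustrative name, not anything from the book):

```python
# Sketch of the point above: a dictionary keyed by consecutive
# integers gives O(1) average-case indexed access, no pointers needed.
def make_array(values):
    """Build a dict-backed 'array' from any sequence."""
    return {i: v for i, v in enumerate(values)}

arr = make_array(["a", "b", "c"])
arr[1] = "B"           # O(1) average-case update, like an array slot
print(arr[0], arr[1])  # -> a B
```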

      I challenge you to describe any algorithm at all that can't be implemented without pointers. If you think you need pointers, you just aren't thinking like a computer scientist.

      For some classes, such as AI, nothing beats Prolog, and perhaps Lisp

      In general, you are absolutely correct. Of course, this is opinionated and others may disagree. But remember, you can use any Turing-complete language to simulate any other Turing-complete language (that's the entire definition of Turing-complete). Which means I can write a C interpreter in Prolog if I want (and I'm feeling particularly masochistic), and therefore I can simulate pointers using Prolog.

      Oh, but you cry "That can't possibly be efficient!" Right again. But you've again missed the point of Computer Science. CS is about efficient algorithms, not efficient programs. That's something we leave to the software engineers and other "implementors." Us CS freaks think about what can be done, we don't actually do it ;-)

      • Python is a terrific language for teaching CS because it has the basics of discrete structures: lists, maps (in Python, called dictionaries), tuples, and atomic data types such as strings, ints, and reals. That's all you need.

        You're probably right with regards to basic and advanced programming courses, as I also mentioned, but I was talking especially about certain intermediate courses such as Data Structures and Algorithms, and System Programming. In those courses, low-level languages would be better in my opinion.

        I don't know how it is elsewhere, but at our CS dept, we also had mandatory courses such as Introduction to Computer Science, which quickly introduces logic gates and processor microcode programming; Computer Organization, which is about processor architecture at all levels; and Physics for Computer Science, which goes all the way down to electric fields in capacitors and coils.

        Sure, you might want to call that Computer Engineering, but I think, and I believe many CS teachers think, that such low-level stuff is important for CS students too for "getting the whole picture".

        Thus, learning how those lists, maps, and other things are implemented is important for CS students too.

        But you've again missed the point of Computer Science. CS is about efficient algorithms, not efficient programs.

        That's of course the academic ideal, but I'd say most CS students will work as programmers after they graduate (or drop out of school). If they don't have the low-level knowledge, they will be next to useless. In all the companies I've worked in, I've never seen a CS student working as an "algorithm designer"; they all either code or do overall system design. Even at university, most algorithm researchers have to pay at least some attention to code efficiency, which usually means using C or C++.
    • While Python is my favourite language, I think it's rather silly to teach Computer Science and especially basic algorithmics with a language that doesn't have pointers...

      I'd very much suggest a strongly typed object-oriented language such as C++, Java, or Eiffel...

      Actually, while you're probably correct about Python not being ideal for what is about to come, the real problem facing many students learning computer science is that they've never programmed anything before. Many don't know how to break a problem down and build out a logical structure.

      It's a basic issue of not demoralizing the student in the first course by tossing a strongly typed, constructor-based language like Java at them. Early on, students just need to see how you take a problem and build a solution. The easier and quicker it is, the more likely they are to engage and stay on. As you go, you can illustrate the various benefits of languages that trade off short-term gains for long-term gains. If the students are engaged they'll stay with you.

      Most CS students don't graduate from CMU or MIT. There are tons of students at other universities who might be great CS students, but they bail out after a quarter or two because the benefits of the program are lost on them. That first course in Java is like hell on earth. Hours and hours of writing code that might be the most reusable, modular code on earth but doesn't do a goddamn thing.

      Python is wonderful for an intro course because you can solve problems quickly. Even in a first course you can teach students to turn their code into a CGI. Now they can actually do productive things. That goes a long way toward keeping students in the program.

    • by alienmole ( 15522 ) on Saturday July 27, 2002 @03:50PM (#3965201)
      While Python is my favourite language, I think it's rather silly to teach Computer Science and especially basic algorithmics with a language that doesn't have pointers.

      At low level, pointers are everything, and low level is what you want to teach when you're teaching basic data structures and algorithms.

      Conceptually and from a computer science perspective, the object references present in languages like Python, Java etc. are equivalent to pointers in all the ways that matter for representation of data structures and algorithms. In the academic community and elsewhere, it's generally considered beneficial to teach such things without reference to the machine pointers which you're referring to, since machine pointers carry a lot of baggage that's unrelated to the abstractions involved in data structures and algorithms.

      There's simply no point in demonstrating list implementation with an interpreted language that has very efficient native lists, dictionaries, etc.

      To refute this, let me offer a tutorial: A Gentle Introduction to ML [napier.ac.uk]. If you work through this tutorial, you'll very soon begin implementing functions in the ML language for basic list operations and the like - functions that already exist in the language. And guess what: the implementations that the beginner typically comes up with in that tutorial are very close to the actual implementations that ML uses - the tutorial gives some examples of actual implementations for comparison.

      This high-level operation doesn't even cost much - languages in the ML family, including OCaml, are regularly top performers - see e.g. Doug Bagley's language shootout [bagley.org]. They can perform on par with languages like C because their type systems allow sophisticated compile-time optimizations to be performed, and their high-level abstraction features are supported by optimizations such as tail-call optimization.

      C/C++ or Pascal are much better for that; with them you can teach real implementations, not toy ones.

      If you believe that C/C++ and Pascal are good languages for teaching computer science, you don't know much about modern computer science. All three of those languages have very weak type systems and lack basic features that allow the construction of high-level abstractions.

      Pascal is all but a dead language in the CS community nowadays. The primary use for C is as a decent portable assembler. Learning C has very little to do with computer science, and absolutely nothing to do with teaching computer science concepts.

  • After having a quick look at bits I'm qualified to assess (I'm not a Python programmer, but do have plenty of background in CS, C++ and other related topics) I'm not convinced at all that I'd want to learn from this book.

    Much of the preface by Jeff Elkner basically compares C++ to Python and has a go at the deficiencies of C++. It would be more convincing if he knew the return type of main(), the name of the standard header <iostream>, and what a statement was. Three fundamental mistakes just in discussing "Hello, world!" is not a good sign for the author's level of knowledge and understanding.

    Trying to put aside my bias, as I like C++ as a practical language, I examined the appendix on creating a UDT for fractions to form a second opinion. Here, they do the obvious simple things to create a rational number class, and nowhere do they make the basic sanity check that your denominator is not zero. Surely one of the basic tenets of OO theory is that you always maintain your class' invariant properly? Their class may be a fine demonstration of Python's OO features -- I don't know, I'm not familiar enough with them to judge -- but it's a lousy demonstration of either good CS or good OO.

    From these observations, I have to ask whether I'd actually want to learn Python from this book. If I do, how will I ever have any faith that what I've learned is correct and in good style?

    • I got put off by just the second paragraph:
      or why teach programming with Python?" Answering these questions is no simple task - especially when popular opinion is on the side of more masochistic alternatives such as C++ and Java.
      It manages to make two derogatory statements: that programming in C++/Java is masochistic (that's a matter of opinion, so I can let it slide), and the subtle barb that they're used because they're popular and not because of merit. It just smacked of programming religion to me. I'm very much a "use the right tool for the job" guy. I haven't used Python, so I'm a little cautious about talking about it, but I'm fairly confident saying there are a lot more C and C++ libraries available to a C++ programmer than off-the-shelf Python packages.
        "I'm fairly confident saying there are a lot more C and C++ libraries available to a C++ programmer than off-the-shelf Python packages."

        The libraries available to C and C++ are irrelevant for teaching an introductory programming course. The idea behind using Python to teach programming is to start with a readable language that can be used to illustrate basic concepts without mucking with lower-level machine-related details. Once the basic concepts have been conveyed, lower-level details such as static typing and pointers can be introduced, using languages like C or C++. In short, get the students to walk before they try to run.

        Just because C++ is a good practical language doesn't mean it's a good teaching language.
    • Much of the preface by Jeff Elkner basically compared C++ to Python and has a go at the deficiencies of C++.
      ...

      Python is a great language, but it is just plain ignorant to sell it as a C++ replacement. It makes substantially different design tradeoffs, which makes it suitable for very different programming problems. C++ has better compile time checking, and better performance, while python has more runtime flexibility.

      Actually, Python and C++ work very well together, I see them as collaborators more than competitors.

      For a good introduction to Python, I'd recommend Mark Lutz's Learning Python

      • "Python is a great language, but it is just plain ignorant to sell it as a C++ replacement."

        The author of the book wasn't selling Python as a C++ replacement. He was pointing out the lower-level features of C++ that get in the way of using C++ as a first language for people learning to program.
  • a review (Score:4, Informative)

    by bcrowell ( 177657 ) on Saturday July 27, 2002 @11:09AM (#3964378) Homepage
    I've written a review of this book [theassayer.org] on The Assayer. The book is self-published (the authors run Green Tea Press), and one of the things people don't realize about self-publishing is how hard it is to attract reviews. (Actually, it's hard in ordinary publishing, and even harder in self-publishing.) Without reviews, you don't get much credibility. So if there's a free book in The Assayer's database [theassayer.org] that you've read, please write a review!
    • by Anonymous Coward
      My first impression of your review was quite favorable. You started off well with an engaging style.

      Unfortunately, you spent far too much time writing about the merits of Linux and Open Source and not enough reviewing the book. Using the review text as a platform for your views on the GPL was inappropriate and didn't tell me what I wanted to know about - the book.

      The few paragraphs on the content of the book were sadly lacking and offered little insight into the use of the book for learning about CS or Python.

      I'd say your review scores 1 out of 5.
  • I like electronic books, but when something's good and long, I prefer reading them in printed form. Probably due to my failing eyesight.

    At any rate, I went to Green Tea Press's homepage and that's got to take the cake for the barest website I've seen. They make mention of printing copies for a reasonable price, but they don't say how one can make that request, nor give any contact info.

    Would someone tell me how I can get printed versions of the book?
    • Reread their page at http://greenteapress.com/thinkpython.html
      They link to Amazon and Barnes and Noble, where you can get a printed copy.
    • I think you missed the following sentence from their web page [greenteapress.com]:
      You can order the book from Barnes and Noble or Amazon.
      They give you links and everything. You should also realize that they're in the textbook publishing business, so nearly all their sales are probably going to be wholesale.
  • Try Haskell [haskell.org] instead. Or, perhaps, SML/O'CAML.
  • by Animats ( 122034 ) on Saturday July 27, 2002 @12:37PM (#3964688) Homepage
    "How to think like a computer scientist" is a bit much for this book. It's an introduction to Python programming, and at best, a mediocre one. It's aimed at the overpopulated "first book on programming" market. The book reads like a BASIC programming manual of 25 years ago.
  • Upgrade C? No! (Score:2, Interesting)

    by wackybrit ( 321117 )
    I definitely think this book has some shortcomings. Not in its practicality or even in its multitude of examples, but in the attitude it presents. Here's a quote:
    C is becoming irrelevant to computer scientists, as it fails to adapt to the changing environment of computer engineering. Unlike Python, C fails to embrace newer concepts such as automatic memory management and object orientation. We recommend the use of Python in place of C at any point in the development cycle to all modern day computer scientists.
    I thought the point of these books was to educate people rather than slate languages. C is over 30 years old, so should it really come up for a slating because it doesn't 'embrace newer concepts'?

    C is a system-level language, and is still used widely, especially in OS and VM coding. The whole point is for C to remain stable. I certainly don't see Python being used in these applications, and it doesn't belong at the system level either. Python is nothing but a glorified scripting language.
    • Re:Upgrade C? No! (Score:2, Informative)

      by abdowney ( 596643 )
      Hi All, I don't know where this quote is from, but it is not from my book, and does not reflect my attitude toward C. The preface of the book discusses some of the problems using C++ as a first language for new programmers, but after that we get down to the business of teaching programming. Anyway, there is a C++ version of the book, too! Cheers, Allen Downey
  • They'd be much better if they used the standard PDF Type 1 fonts. (I use pdflatex.)

    -Kevin

  • from the because-thinking-perl-hurts-too-much dept.

    Thinking (and writing) Perl doesn't hurt at all. It's reading Perl that hurts. Write Once Read Never.

    • What I want to know is when someone is going to write a book like "Python for Perl Programmers".

      (Or, for the flamebait oriented folks, "Python for Perl Refugees".)

  • by stevarooski ( 121971 ) on Saturday July 27, 2002 @02:19PM (#3964987) Homepage
    Python is an excellent language that I've been using for about 8 months now. Anyone who has ever programmed before can pick it up easily just by reading the [very thorough] documentation at python.org [python.org]. It also has a rabid support base via newsgroup, who are excellent at answering questions.

    That said, I don't know if I would teach a beginning computer science course in Python. At my university, our general intro to CSE involves a two-class series teaching generic basic theory wrapped around a programming language. We used to teach them with C and C++ but just recently moved to Java. I have been a TA for these classes before. Based on my experiences, I think there are both pluses and minuses to the idea of teaching these classes in Python.

    Benefits:
    1. Python is extremely easy to learn, as mentioned before. Much easier than C, C++, or Java.
    2. Python works really well with Tk, which would make it easy to build out skeleton code (multiplatform skel code at that) for the students using windowing and graphics. Students are 100% happier if they can see what they're working on reflected graphically. Makes it more fun to show off. This is why our projects usually include basic games.
    3. Basic Python is truly, completely multiplatform, working identically on Mac, Win*, and *nix. Some specialized functions in modules don't support all platforms, but nothing that would be important to a beginning student. Support issues would be MUCH simpler than C or C++. God, we had huge headaches trying to support MSVC, CodeWarrior, CodeWright, Borland, etc...
    4. There is a great installer script available that will build python modules into either standalone exe's or distributable directories. (Available here [mcmillan-inc.com] if you've never seen this before)
    But, there are also some downsides that would have to be weighed. These are:
    1. I have yet to find a solid dev environment that includes a great debugger. Yes, emacs can be set up to help a bit, and the default program included with the Windows install is OK (albeit a bit flaky), but I would want a rock-solid, easy-to-use complete dev studio for my students.
    2. I REALLY dislike any language that depends on white space. Miranda and Haskell are two other examples of this. It's a pain to move blocks around, and anyone who doesn't use an editor with auto-indent is screwed. Also, unless tabs are set to spaces, computers with different tab stops will see your code differently, which can be a problem if code is emailed, etc.
    3. As has already been mentioned, there's not too much one can teach about memory management and pointers with Python...
    4. Sounds kinda strange as a complaint, but too much is built in. I have this complaint about Java too. As an example, I would much rather have an early homework be a sorting algorithm and then have them reuse this algorithm in other homeworks than let them just type "xxx.sort()". Not that this isn't a great feature for experienced programmers; it's just that beginning students should have to do sorting, reversing, duplicating, etc. themselves at first.

    Looking away from basic intro classes, python is great to know. I did a lot of AI code sketches in python, and have used it to slap together simple programs at work. However, I would consider it a tool to be learned after the basics have been beat in. If I had learned python first, it would be a lot harder to force me to do everything in C later. :o)

    -s
    • 1. I have yet to find a solid dev environment that includes a great debugger.

      I've found IPython to be very useful. It's not a complete development environment, but it works quite well. It's not the same as a debugger - it has some relative advantages and disadvantages. A point in its favor is that Python always issues backtraces automatically, which is the most essential function of a debugger.

      It's a pain to move blocks around and anyone who doesn't use an editor with auto-indent is screwed

      You're pretty screwed without auto-indent anyway (-; Re tabs set to spaces: indentation needs to be all spaces or all tabs. As long as it's not a mixture of the two, there shouldn't be any problems.

      As has already been mentioned, not too much one can teach about memory management and pointers with python...

      The Python API will give you so many pointers and so much memory management that it hurts (-;

      As an example, I would much rather have an early homework be a sorting algorithm and then have them reuse this algorithm in other homeworks than let them just type "xxx.sort()".

      An alternative philosophy is that it's a good idea to learn to reuse a well designed library before rolling one's own poor-quality imitation. Python opens up interesting possibilities by allowing one to pass in predicates: L.sort(lambda x,y: y-x), etc. It's a good thing to expose students to flexible and well designed class libraries, rather than encouraging them to reinvent the wheel. Then they can implement their own custom containers.
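
      To spell that out (the two-argument comparison above is Python 2 style; in modern Python you either wrap it with functools.cmp_to_key or pass a key function instead):

```python
from functools import cmp_to_key

# Python 2's L.sort(lambda x, y: y - x) sorts ints descending; the
# modern equivalent wraps the same comparison with cmp_to_key...
L = [3, 1, 2]
L.sort(key=cmp_to_key(lambda x, y: y - x))
print(L)       # -> [3, 2, 1]

# ...or, more idiomatically, passes a key function directly:
words = ["banana", "fig", "apple"]
words.sort(key=len)   # shortest first
print(words)   # -> ['fig', 'apple', 'banana']
```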

      I REALLY dislike any language that depends on white space. ... It's a pain to move blocks around and anyone who doesn't use an editor with auto-indent is screwed. Also, unless tabs are set to spaces, computers with different tab stops will see your code differently, which can be a problem if code is emailed, etc.

      You have to move blocks around in C, C++, Java and Perl, too. Yes, I know these languages' compilers/interpreters don't care about formatting, but programmers still must indent code to show structure. Otherwise you've got unmaintainable garbage.

      Once you accept this point, you realize that all of those braces are actually completely spurious. The indentation shows the blocks, the braces are just there for the interpreter/compiler. Python has taken a step forward by eliminating them.

      As for tab stops, yes, they are annoying, but the answer to this problem is known. Tab stops must be mod 8. If your terminal/editor/whatever does anything else, it's broken.

      --Mike

      I REALLY dislike any language that depends on white space. Miranda and Haskell are two other examples of this. It's a pain to move blocks around and anyone who doesn't use an editor with auto-indent is screwed. Also, unless tabs are set to spaces, computers with different tab stops will see your code differently, which can be a problem if code is emailed, etc.

      It isn't a pain to move blocks around. At least not more so than in any other language when one is coding properly. Nor does not having auto-indent cause any more problems here than in any other language when coding properly. Operative words: coding properly. The nice thing about Python is you're assured that the programmer got his blocks mostly right.

      As for the space/tab problem, that is simple: no tabs. Done. Tabs are bad. They are very bad. Just say no to tabs.

      As has already been mentioned, not too much one can teach about memory management and pointers with python...

      Oddly enough, it is with Python that I finally grasped the concept of pointers. The nice thing about Python is that one isn't constantly trying to do the dereference shuffle to do the right thing. This makes copying lists a little sticky, but that is an acceptable trade-off compared to scratching one's head over "OK, this function requires a variable, a pointer-to-a-variable, another variable which I'm going to dereference from this pointer here..."
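
      The "sticky" list-copying behaviour can be seen in a few lines (a sketch in modern Python; copy.deepcopy is the standard escape hatch):

```python
import copy

a = [1, [2, 3]]
b = a            # no copy: b is another reference to the same list
b[0] = 99
print(a[0])      # -> 99, a sees the change

c = a[:]         # shallow copy: new outer list, shared inner objects
c[1].append(4)
print(a[1])      # -> [2, 3, 4], inner list is still shared

d = copy.deepcopy(a)   # fully independent copy
d[1].append(5)
print(a[1])      # -> [2, 3, 4], unchanged
```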

      Sounds kinda strange as a complaint, but too much is built in. I have this complaint about Java too. As an example, I would much rather have an early homework be a sorting algorithm and then have them reuse this algorithm in other homeworks than let them just type "xxx.sort()". Not that this isn't a great feature for experienced programmers; it's just that beginning students should have to do sorting, reversing, duplicating, etc. themselves at first.

      Not only is this an odd complaint, it is also a non-issue. There is nothing that precludes you from requiring a home-brew sort to be used in your assignments. It begins something like this: "Yes, Python is nice in that it has a sort function. However, knowing how a sort function works really helps one understand computers and programming. So, in today's class we're going to program our own sort function which we'll later use in a larger program." The nice thing is then you can say this: "Then we'll compare what we've written with the default sort function..." It is an unimaginative teacher who cannot figure that one out. It has been pretty common practice in all of the programming classes I've taken over the years. High school, college and vocational all did it while I was learning BASIC, Turbo Pascal and C.
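
      A first homework of that kind might look like this hypothetical selection sort (just an illustration, not anything from the book), which students can then race against the built-in list.sort():

```python
def selection_sort(items):
    """Sort a list in place by repeatedly selecting the smallest
    remaining element - a classic first sorting assignment."""
    for i in range(len(items)):
        # find the index of the smallest element in items[i:]
        smallest = i
        for j in range(i + 1, len(items)):
            if items[j] < items[smallest]:
                smallest = j
        items[i], items[smallest] = items[smallest], items[i]
    return items

data = [5, 2, 9, 1]
print(selection_sort(data))   # -> [1, 2, 5, 9]
print(sorted([5, 2, 9, 1]))   # the built-in, for comparison
```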

  • While not CopyLefted, Bruce Eckel has online a 0.x version of Thinking in Python [mindview.net], which is more pattern oriented.
