Think Python 201
An anonymous reader writes "In a never-ending effort to spread the word about free quality online programming books, here is a Python programming book. 'How to Think Like a Computer Scientist: Learning With Python', by Allen B. Downey, Chris Meyers, and Jeffrey Elkner is a copylefted work available in multiple formats at Green Tea Press: HTML, PDF, LaTeX. Compliments of the Online Books what's new page."
There are other books in this series (Score:3, Informative)
Where are they ? (Score:2)
I only find two versions - Java and Python.
Where can I find the C++ version ?
Thanks in advance!
Re:Where are they ? (Score:4, Informative)
http://www.ibiblio.org/obp/thinkCS.php [ibiblio.org]
I was actually quite surprised to find this article on slashdot. You see, I'm the author of the Perl script which converts the LaTeX source to HTML. I hope nobody finds any blatant problems with the online book websites...
Thanks! (Score:2)
Many thanks!
Re:There are other books in this series (Score:2, Interesting)
"Think Python" (Score:2)
How to think like a computer scientist (Score:1)
The most pure computer science is essentially mathematical at heart. I don't think current CS curriculums put enough emphasis on that basic tenet.
Re:How to think like a computer scientist (Score:4, Interesting)
Some universities (eg. Edinburgh) have started calling it `Informatics', which is much more appropriate. (In fact in Germany, and probably elsewhere, it was always called `Informatik'.)
Maybe there should be CS *and* Informatics.
Uhm, guess that was offtopic.
'Informatics', I agree (Score:1)
Re:How to think like a computer scientist (Score:2)
Re:How to think like a computer scientist (Score:3, Interesting)
I think part of the problem is the name `Computer Science', which gives a wrong impression of what the core of the poodle really is.
Well, yeah.
I'd make the following analogies:
CS is a science that deals with unravelling how information and logical systems function and with developing frameworks to understand them. Computer scientists are the most likely to determine the boundaries at which things can happen and to lay out how to practically approach those boundaries.
Software Engineering is an engineering discipline that deals with manipulating those systems to perform a needed task. They take the work of the CS and design systems to address specific problems. Quick and dirty is just fine, provided that all the needs are being met.
Coders assemble the systems that the SEs design and informaticians maintain those systems.
There's overlap among all of them to some degree, and plenty of people do them all, but from an education point of view, if you mix them together, you get a mess - and most schools mix them together. It was easier to mix them in the past because the field was narrow. But now, you just can't do it.
CS has become very deep, and you can't get into any of the real work if you spend your time dealing with SE and coding practices. SE has become very deep as well and you don't want these folks getting bogged down with the NP completeness proofs and whatnot, or with learning the programming tools too much. There's enough to do in all three areas that they need to be treated as different but complementary disciplines...
Re:How to think like a computer scientist (Score:1)
I honestly have not learned very much in a classroom setting. Almost all of my programming skills are self taught. I am not saying a CS degree is useless or that I don't secretly wish I had gotten one before I got married and had bills to pay, just that you can get hired as a programmer and write some nifty apps without one.
I personally hate math. But for some reason, when I am expressing equations in code, it is a natural thing I don't think about. If I had to take a math test, however, I would probably do horribly.
Re:How to think like a computer scientist (Score:1)
And a basic programming language like C/C++.
The idea is hardly ever to teach a language, but rather how to optimize things.
Re:How to think like a computer scientist (Score:2, Insightful)
The majority of today's CS Courses seem to fall into two broad categories, 'Software Development', and 'Systems Management'. Whilst these are both elements of computer science, they do not encompass computer science as a whole.
Universities are more and more often cutting out the core Computer Science components of their courses, such as Electronics and Computer Theory, which is a shame. Whilst the courses leave graduates with an understanding of 'how' to do certain tasks, they are left with no understanding of 'why' those tasks are done that way, because they have no real understanding of how the computer systems they are carrying out tasks on actually perform their functions.
NDFSMs are important, Karnaugh maps are important, and understanding the CPU fetch/execute cycle is important too - bring back real CS to our universities!
Re:How to think like a computer scientist (Score:1, Interesting)
In many ways, CS and EE are extremely similar.
Re:How to think like a computer scientist (Score:1, Interesting)
Computer theory, system design and architecture, analysis of algorithms, algorithm design, data structures, compiler design, AI, networking theory, files and databases, mathematics (up to multi-variable calculus, discrete structures/combinatorics, linear algebra, numerical analysis, statistics and probability), software engineering, languages (LISP, C, C++, Java, Prolog, Cobol, Assembly), computer graphics, to name a few things - there's more but it's been quite a few years and that's what comes to mind at the moment. I'd say that's pretty well-rounded for a CS program.
Re:How to think like a computer scientist (Score:2)
Re:How to think like a computer scientist (Score:2)
Thanks to Online Books (Score:5, Interesting)
Thanks to copylefted online books, I can now read the books before I buy.
On my bookshelf, seven of the books were bought after I read their online version.
I live in a third world country where there is no Towers bookstore, nor Borders, nor Barnes - there is NO WAY for you to know how good a book is without first buying it - the bookstores here do NOT allow you to read the book!
The idea of copylefted books really helps me, and many others who are in the situation of buying books without knowing whether the books are good or not.
Thanks again!
Re:Thanks to Online Books (Score:3, Interesting)
It's not too expensive compared to how much technical books cost in some countries outside the US that I have visited. The Safari service is about 10 dollars a month (US) for 5 books and 15 dollars for 10, and so on.
Question (Score:3, Insightful)
Man, it's early.
Re:Question (Score:1)
Dive Into Python (Score:4, Informative)
Another excellent free book for Python is Dive Into Python [diveintopython.org] by Mark Pilgrim [diveintomark.org]. It is available in HTML, PDF, Word 97, Windows Help, plain text, and XML formats.
This book has plenty of examples and pointers to further reading on each subject. It features good layout, use of color, and typography, which make for easy reading and comprehension.
Re:Dive Into Python (Score:1)
Perl? (Score:1)
Are there any free Perl books out there like this one for Python?
I haven't done any Perl or Python before and I wanted to look at both before I pick one and use it primarily.
Re:Perl? (Score:2)
Both actually have remarkably similar approaches and capabilities with regard to object-oriented programming (this is in terms of Perl 5).
Python's functional programming ideas can fairly easily be translated to Perl constructs such as map and grep.
Not sure how easily generators would translate to Perl, as I've not had occasion to use them yet.
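For what it's worth, here's a rough Python sketch of the functional bits being discussed - map/filter (roughly Perl's map and grep) and a generator. This is just my own illustration, not from either language's docs:

    # map/filter, roughly what Perl's map and grep do
    squares = map(lambda x: x * x, [1, 2, 3, 4])
    evens = filter(lambda x: x % 2 == 0, [1, 2, 3, 4])
    print(list(squares))   # [1, 4, 9, 16]
    print(list(evens))     # [2, 4]

    # a generator: lazily yields values on demand
    def countdown(n):
        while n > 0:
            yield n
            n -= 1

    print(list(countdown(3)))   # [3, 2, 1]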
Re:Perl? (Score:2, Interesting)
- Python is strongly dynamically typed, while Perl is weakly typed
- Python has a small number of syntactic constructs, while Perl has many
- In Python, everything is an object
blah, blah, blah
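To make the typing point concrete, a quick sketch (my example, not the parent's): Python refuses to silently coerce between strings and numbers, even though a name can be rebound to any type.

    x = "1"
    try:
        print(x + 1)        # TypeError: no implicit string/number coercion
    except TypeError as e:
        print("strongly typed:", e)

    x = 42                  # but rebinding to a different type is fine
    print(x + 1)            # 43 -- dynamic, yet still strongly typed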
Re:Perl? (Score:2, Funny)
Re:Perl? (Score:2)
In this respect Ruby has both languages (at least until Perl 6) beat.
I'm not saying they're exactly the same, but I do think they're more similar than most people like to think.
Re:Perl? (Score:2)
I've apparently been out of the loop a little too long.
Re:Perl? (Score:2)
It also has the advantage of being a fresh start (it behaves much like Perl 5 in everyday use, but for different, better, more consistent reasons). Python is a wonderful language, but that doesn't have to mean Perl can't be a great language too.
Excellent (Score:1)
I learnt Python in two nights from it because, as we all know, once you have the basics of CS in your head, the language you use is just an implementation detail.
Computer 'Science'? (Score:2, Interesting)
Re:Computer 'Science'? (Score:1, Insightful)
Of course Computer Science is a science - take any introductory CS course and you will come across many formal theorems and hypothesis-based discoveries.
Examples:
Halting problem - this is essentially derived logically from basic premises.
Neural networks - these are constantly the subject of scientific study in much the same way as geneticists study rats.
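To flesh out the halting problem example, here is a rough Python rendering of the usual diagonalization argument (a sketch of the reasoning, not working code in any useful sense - the whole point is that halts() cannot exist):

    # "halts" is a hypothetical oracle: no total implementation can exist,
    # which is exactly what the argument below shows.
    def halts(func, arg):
        """Pretend oracle: returns True iff func(arg) eventually halts."""
        raise NotImplementedError("no such oracle can exist")

    def paradox(func):
        # If the oracle says func(func) halts, loop forever; otherwise halt.
        if halts(func, func):
            while True:
                pass
        return "halted"

    # Now ask: does paradox(paradox) halt?
    #  - If halts(paradox, paradox) returns True, paradox(paradox) loops forever.
    #  - If it returns False, paradox(paradox) halts.
    # Either answer contradicts the oracle, so halts() cannot be implemented.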
Re:Computer 'Science'? (Score:1)
Re:Computer 'Science'? (Score:4, Insightful)
The phrase that you quote here displays a mind-boggling ignorance about exactly what "Computer Science" is. Software Engineering is, indeed, "like Engineering" but there are many branches of Computer Science that deal *purely* with the abstract. I do Formal Language Theory and Automata Theory for a living and I just can't see how these fields are about "getting something to do something". Feynman, like most people, has missed the science for the telescope.
The real joke is that things like the Church-Turing thesis could not possibly be MORE about "natural objects". In the abstract, I can define a machine that can solve the halting problem. Heck, I can define a machine that solves any problem I want! The Church-Turing thesis tells us about a PHYSICAL limitation on computing. In this universe, you can only build a machine that will compute *these* functions.... But what if I live in a universe where time has no meaning? All of a sudden, I get a *very* different Church-Turing thesis.
There is no question that Feynman had some brilliant insights in physics, but I have to admit that when I read the Lectures on Computation, not only did I lose a small amount of respect for him... I found myself actually outraged. Many intelligent people will read these lectures and believe them... I mean, after all, they're written by Feynman, right?
Computer Scientists have enough trouble trying to explain to people that, no, we don't just sit around installing Windows network drivers all day, without a respected and intelligent person like Feynman adding to the problem.
Re:Computer 'Science'? (Score:4, Informative)
No doubt, Feynman was a very, very good physicist. But he was also a genius at self-promotion, and his cult has gone way overboard as a result. It's well-established by now that some of the ideas he's famous for were first published by others.
(Not that he wasn't honest about it sometimes. I think he's on record, for instance, crediting Stueckelberg for the renormalization of electrodynamics, and for the idea that positrons are electrons travelling backwards in time. See e.g. this timeline [weburbia.com], or the last chapter of this book [amazon.com].)
Troll Feedin' Time (Score:2, Informative)
Given a countably infinite amount of time, one can set a Turing machine running on an input, and then simply observe whether it halts or not. Heck, you can set it running on a countably infinite number of words, and see if it halts on each one. In other words, you can solve the halting problem.
Likewise, you can get super-Turing power if you can compute with real numbers (not floating-point approximations, but the true continuum). But again, due to the physics of this world, we can't maintain analog values with an infinite degree of precision (due to thermal noise, etc.). This has even been published in the journal Science by Siegelmann et al. a few years ago.
In any case, you've made it clear in your post that you are unable to think these things through for yourself. You simply read Feynman, and accept it as gospel truth, because it came from the mouth of a great prophet.
Re:Computer 'Science'? (Score:2, Interesting)
Lately, though, it's starting to become analytical, courtesy of Microsoft. First, you make a hypothesis about how the OS works, then you write a program to test this hypothesis, analyze the results, and modify your hypothesis to fit the facts :-).
You might find it interesting that, science or not, Feynman spent the last few years of his life teaching a computer science class called "The Potentialities and Limitations of Computing Machinery". It was a very interesting class. He brought his unique wit and wisdom to a subject more in need of wit and wisdom than most.
Re:Computer 'Science'? (Score:3, Interesting)
However, for the last hundred years or so it has really been increasingly difficult to separate science and engineering. More and more, scientific hypotheses can only be tested when sufficiently advanced engineering comes along. There have always been "whiteboard scientists" (i.e. theoreticians) who resent this.
But most great scientists were skilled engineers as well. Galileo, Newton, Bunsen, Babbage, Turing...
I think the terminology is the problem. We don't talk about "Physics science" or "Biology science", so why "computer science" or "rocket science"?
Why not just computing and rocketry?
While I'm having a rant, there's also a problem with degrading the word "engineer". MCSEs and such are basically technicians, not engineers. Perhaps if we admitted that the people who implement systems using standard components that just have to be set up correctly (although this may be a challenging role) are technicians, then we could accept that most "computer scientists" are actually trained as engineers, that this is a highly skilled and challenging professional role, and the number of real scientific researchers is not that great. Just like physics and chemistry nowadays, in fact.
I would suggest that the test of a pseudoscience is that it doesn't create a hierarchy of engineers and technicians because, basically, it doesn't work and there would be nothing for them to do. You don't get sociological engineers designing ever better societies, and socio-technicians building them just as fast as people can throw money at them. (At least, the attempts, such as Marxism-Leninism, have been abject failures.) But you get plenty of sociologists. On this basis, computing, with its deep organisational structures, is an extremely successful science-based system. Arguments about testing hypotheses are irrelevant: real scientists tend not to work like that anyway.
Scientific proof has been conventionally about other people reproducing your results. But if the nature of your science/engineering is that you can rapidly produce millions of copies of your concept or invention, this becomes trivial. If I claim to have invented (say) a graphics chip architecture that can draw polygons twice as fast as the previous best for a given clock speed and die size, I prove this by marketing the product, not by publishing and waiting for other labs to build a copy and duplicate my result.
It's the glue .... (Score:2)
Couple of online books i've come across (Score:2, Informative)
Using Samba [oreilly.com]
Personally I thought both were very well written; the Samba book has helped me greatly.
Re:Couple of online books i've come across (Score:1)
Re:Couple of online books i've come across (Score:1)
How about how to think like a Software Engineer? (Score:5, Insightful)
This book doesn't strike me as a book on how to think like a Computer Scientist, except insofar as Computer Scientists generally make lousy Software Engineers. There are no descriptions of the advantages of object oriented programming, discussions of theoretical topics, and in general very little encouragement to view programming as a science. Basically, this appears to be just a book on the Python language, written for someone who has never programmed before. That's a fine thing, don't get me wrong. My brief look even makes me think it could be an effective example of such a book. At the very least, I think it's hyped wrong.
However, from a software engineering point of view, I find it damning that the book forgoes any explanation of the practice of, or motivation for, writing maintainable code. I consider that unforgivable in a beginning programming book. You absolutely have to impress on newbies early the importance of documentation, sensible structure, logical variable naming, good class hierarchy, etc.
I consider this especially true for Python, which is an interpreted non-declarative language (making maintainability all the more important). Python is, conversely, also especially well designed as a platform where such concepts could be taught. It largely overcomes the occasional weaknesses of its design philosophy by consciously including language features such as built-in support for docstrings, well-crafted namespaces, modules as first-class citizens, etc.
Yet, these language features are barely given a nod in this book.
It's books for existing programmers that can afford to skimp on these areas.
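For anyone unfamiliar with the features I mean, here's a tiny sketch (my own, not from the book) of docstrings and explicit module namespaces:

    import math

    def circle_area(radius):
        """Return the area of a circle with the given radius."""
        return math.pi * radius ** 2

    # The docstring travels with the function, and math.pi keeps the module
    # namespace explicit instead of dumping names into the current scope.
    print(circle_area.__doc__)
    print(circle_area(2.0))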
Re:How about how to think like a Software Engineer (Score:1, Insightful)
Software Engineering is essentially the application of CS to real world projects - and the current fashions in SE should be a separate course entirely. It's more about psychology and HR than it is about Computer Science.
Not about computer science; try SICP instead (Score:3, Insightful)
If you actually look at the book in question, you'll see that the original poster was correct: it's not about computer science at all. It's a Python programming book with a marketing angle relating it to computer science.
If you really want a book which teaches "How to Think Like a Computer Scientist", try SICP [mit.edu]. For a good summary of the book, see this comment [slashdot.org] from the recent "Best Computer Books" article.
Re:Not about computer science; try SICP instead (Score:2)
"It's something like do as I do (though I am not saying it explicitly every 5 seconds)".
Re:Not about computer science; try SICP instead (Score:3, Insightful)
The link I gave to SICP [mit.edu] includes the full text in HTML. How much more free do you need it to be? It's accessible by clicking the link on that page which reads "Full text! The complete text in HTML". Unintuitive, I know, but that's the kind of thing you'll learn to understand as a computer scientist...
though many of the criticisms are indeed covered in the book: naming of variables, abstracting and not hardcoding stuff, making it readable, making it generic, wrapping code, etc.
These things have very little to do with computer science, with the exception of abstraction (which has very little to do with "not hardcoding stuff"). What we have here is a situation where people who know absolutely nothing about the current practice of computer science are using the term to mean whatever they imagine it means.
Variable naming, readability, wrapping code etc. all involve techniques which any good programmer should understand, academic or otherwise. But calling that computer science is like calling hammering a nail "civil engineering" or "materials science". The Python book, at best, could be described as an intro guide to software engineering, but even that's a stretch. It's really "how to program in Python", with a few technical terms introduced along the way.
Re:Not about computer science; try SICP instead (Score:2)
Re:Not about computer science; try SICP instead (Score:3, Insightful)
I also tend to prefer the logic side of things. A lot of computer science of the kind that SICP deals with has to do with mathematical logic, which I find more logic-like than math-like. For that reason, I love the lambda calculus, which is what almost all functional programming is based on.
If you like logic, and you like programming, you *must* learn the lambda calculus - it'll give you new insight into the meaning of computer programming, and you'll probably enjoy it, too. It has applications to literally every programming language. If you're not already familiar with lambda calculus, the name sounds a lot more daunting than it really is. To start out with, you can think of it as a really simple and primitive programming language - almost the prototypical programming language.
I'm not sure what the best intro source into lambda calculus is - the academic works are mostly heavier-duty than you really need. Some google searching definitely turns up some useful stuff, but offhand I don't know of a definitive intro site. However, SICP teaches a lot of the necessary fundamentals, since the Scheme language that it uses as a teaching tool is just about the closest language to the lambda calculus that there is. If you learn a bit of Scheme first, learning lambda calculus from the lay perspective is fairly straightforward. (I say "lay perspective" because it has some hairy mathematical underpinnings, which I mostly ignore, and that's OK for most purposes unless you're trying to prove mathematical theorems.)
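As a small taste of what that looks like in practice, here are Church numerals written with Python lambdas - my own toy example, not something from SICP or any particular lambda-calculus text:

    # Church numerals: a number n is a function that applies f to x n times.
    zero = lambda f: lambda x: x
    succ = lambda n: lambda f: lambda x: f(n(f)(x))
    add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

    def to_int(n):
        """Convert a Church numeral back to an ordinary Python int."""
        return n(lambda k: k + 1)(0)

    two = succ(succ(zero))
    three = succ(two)
    print(to_int(add(two)(three)))   # prints 5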
(i.e.: I know how to program, but not how to make good programs. Every time I need to add unexpected features or when I review my old stupid ugly code, I am reminded of that fact :-)
I think that these academic topics can be excellent for improving one's design and thinking skills; at least I've found that to be the case. It's an indirect kind of thing, though, and experience is probably equally useful. Abstraction is key, though: whether you do it using object orientation or modules in functional programs, keeping interfaces separate from implementations is probably one of the most important things, and it's something people violate all the time. It doesn't help that many languages don't provide features to do this properly...
The Art of Unix Programming [was: How about how to (Score:1)
Re:How about how to think like a Software Engineer (Score:2, Informative)
choosing a pretentious title. Really, it's mostly meant to be silly (not a marketing angle).
The book is (just) an introduction to computer science that focuses on the basics of programming. It covers the material I've been able to get college students to understand in one semester, which means yes to functional and data abstraction and no to modules as first-class citizens.
It's also aimed at people with no programming experience at all, so I tried to explain the basics slowly and LOUDLY.
Thanks to all the slashdotters that have commented on the book!
Cheers,
Allen Downey
C is a high-level language?? (Score:2, Insightful)
From the opening section of each book: 1.1 What is a programming language?
Java is an example of a high-level language; other high-level languages you might have heard of are Pascal, C, C++ and FORTRAN.
Python is an example of a high-level language; other high-level languages you might have heard of are C, C++, Perl, and Java
Both C++ and Pascal are high-level languages; other high-level languages you might have heard of are Java, C and FORTRAN
C, a language without file I/O, without bounds checking, and with direct access to ports, is high-level? If you say the libraries chucked into a C load make it so... then Assembler is a high-level language, too.
Last I heard it was Binary Code=0, Assembler=1, C=1.5, Fortran, Cobol, & Basic were about 3, Ada, C++=5.
Perl was not even in the picture, because it was a scripting language.
Also, high-level languages do not equal easier code, nor do they make for faster code... They do make coding more strict, more bound to the limited ways the authors of the language thought you should think (like the use of GOTOs :-). Low-level languages allow the coder the freedom to get the job done and not compromise the functionality to fit the limits set by the language's authors, and they require the coder to truly think like a computer scientist. Look at Ada for what is wrong with a really high-level language. See how limiting the language can be made, and how much time is needed to set up the coding effort.
PS: maybe these are great books, but I stopped reading there, because how can it teach you to "Think like a Computer Scientist" when it does not get the basics of computer science right?
Re:C is a high-level language?? (Score:1)
The fact is, how "high" level something is, is completely relative. There are no strict rules to my knowledge about what's included in each "level" (e.g., file I/O, bounds checking like you mentioned). Maybe you're just annoyed because you've always heard C described as medium- or low-level and this goes against that?
The first programming language I used was QBasic (I was very young, ok?). Eventually, I moved on to Turbo Pascal. During the time I spent developing software in Pascal, I frequently heard people talk about C and how it was tighter and superior because it was a low-level language. And compared to Pascal, it is _slightly_ lower level.
Last year I had a Machine Organization class where we used MIPS assembly. The professor always referred to C as a high level language.
I think a justified complaint would not be that the author classified C as a high level language, but that he classified it in the same realm as Python and Java. Clearly, Python and Java are substantially more high level.
Re:C is a high-level language?? (Score:1)
The basis of a science is its terms. If they cannot get the basic terms right, then there is no science.
About background... I started with Basic on an HP 2000F in 1973; I was 14 at the time.
Which carries over into "high-level languages are portable", hahaha. Basic is a high-level language and it is not portable; it was not until about 1989 that the Basic language was generally uniform. Cobol today, even higher-level in my opinion than BASIC, still has "flavours".
This book was a primer - when you cannot get the history/levels correct, what does it say for the rest?
MIPS assembly compared to C - C is higher, more human to read (if you do not compound expressions, e.g. x[i++] += y[--j];). But C is much lower than Basic.
Now the question: did your teacher refer to C as a HIGH-LEVEL LANGUAGE (wrong) or a HIGHER-LEVEL LANGUAGE (right)?
Re:C is a high-level language?? (Score:2, Insightful)
I don't think they got any "terms" wrong. I still assert that how "high-level" a language is is NOT an absolute scale.
MIPS assembly compared to C - C is higher, more human to read (if you do not compound expressions, e.g. x[i++] += y[--j];). But C is much lower than Basic.
Good, so you are thinking in terms of "higher" and "lower".
Now the question: did your teacher refer to C as a HIGH-LEVEL LANGUAGE (wrong) or a HIGHER-LEVEL LANGUAGE (right)?
High-level language. This is NOT wrong in the context of the course. Relative to the course material (assembly), C is a very high-level language. I have always thought of this scale in terms of context and relativity, not absolute.
Why do I do this? Well, I suppose in theory someone could come up with an absolute scale and say "C is classified as a X-level language", but when the only terms we're using are "high", "low", and sometimes "medium", this will not hold up.
You've been programming for a long time. Certainly you realize that C was considered more high level back then than it is now.
Rewind back to the early '90s. Ignoring Basic, Pascal was the highest-level language I was familiar with and used. People undeniably (and fairly consistently) referred to Pascal as a high-level language. Along come languages like Java, which is clearly higher level than Pascal, and we have to re-evaluate what we consider to be "high-level".
The point is, we can't keep redefining this stuff every time a new language comes out. You call Python high-level today, but 15 years from now, when some other language comes out that's even more abstract, you can no longer call Python high-level unless you keep attaching superlatives (like "very") to the new languages.
Relative. Not absolute.
Re:C is a high-level language?? (Score:1)
Did the author get paid? (Score:3, Interesting)
I'm working on the theory of collecting tax deductions for copylefted art, and this contribution is a great example because it closely resembles historically donated items. If the author donates the artwork to the right organization, he could - by my reading of the IRS rules - be paid in tax deductions.
Does anyone know of cases in Open Source / Copyleft where tax deduction are being used to help cover expenses?
I'm sure that the competition - i.e. Microsoft - uses every tax deduction in the book. Are Open Source contributors playing by the same rules, or are we handicapping ourselves by ignoring the tax benefits of donation?
If anyone can provide examples of copylefted donations and how you documented it for tax purposes - I'm interested.
I believe there are billions of dollars in potential government funding just waiting to be collected by Open Source artists. Let's go get it!
AIK
Re:Did the author get paid? (Score:1)
The authors are the publishers.
O'Reilly offers quite a few books for free on the web because they're out of print.
For a book that's in print, I don't think it's appropriate to pay the authors extra for copylefting it, because making the book free in digital form is actually a wonderful sales tool. It's worked for me [lightandmatter.com], and it's also worked for Baen books.
For a book that's out of print, I also don't think it makes sense. The reason these O'Reilly books are out of print is that they weren't big sellers. If they're not making a profit on the book, there's no reason to pay the author extra. Of course I assume O'Reilly only makes the books free online with the author's consent.
Re:I also don't think it makes sense. (Score:1)
AIK
Re:Did the author get paid? (Score:3, Interesting)
The IRS does not allow you to donate your time and deduct your perceived cost from your taxes. It's highly illegal because it is quite obviously prone to abuse.
So in the case of your artwork example. If you go out and buy a painting for $20,000 and then donate that painting to a non-profit charity, you may deduct the $20,000 or whatever the current market value is of that painting.
However, if you go out and buy a canvas and some paints and then make your own painting to give to charity, the only thing you may deduct is the cost of the canvas and paints, i.e. the supplies. Now if you sold the painting at auction for $20,000 and then proceeded to give that $20,000 to charity, you may deduct the $20,000, but you're also showing the $20k as income, so it's a net-zero-sum game.
The same is going to be true of a book.
Re: Road to jail (Score:2)
Re:Did the author get paid? (Score:2)
It probably doesn't make any difference either way, since the author owns Green Tea.
"you haven't contributed anything" (Score:1)
Re:"you haven't contributed anything" (Score:2)
Good response. I just replied to one of your other posts about your tax deduction idea, but when I saw this I wanted to add: getting Federal matching funds for Open Source is not a bad idea, but I think you may be looking in the wrong direction when it comes to the IRS. However, the Federal government gives out grants for all sorts of things, and it would seem to make more sense to look in that direction for ways in which open source might fit into existing programs.
But I doubt you'll find some existing loophole that allows you to simply receive cash for open source by just filling out a form. More likely, with a lot of political lobbying etc., it might be possible to get the government to be more receptive to funding open source projects that are in the public interest. However, you'll have to fight lobbies of commercial software makers who feel threatened, so it's not likely to be easy.
If you want to be realistic instead of ideological (no value judgement there, just being pragmatic), you might have more luck finding a business which is interested in contributing towards the work in question.
I can't. (Score:1, Troll)
Pointers required (Score:3, Insightful)
At low level, pointers are everything, and low level is what you want to teach when you're teaching basic data structures and algorithms. There's simply no point in demonstrating list implementation with an interpreted language that has very efficient native lists, dictionaries, etc. C/C++ or Pascal are much better for that; with them you can teach real implementations, not toy ones.
On the other hand, Python might be ideal for teaching advanced algorithms such as sorting and string algorithms, as those are more "high-level" problems and low-level pointer-messing is no longer needed nor desired. Python has very beautiful string and list operations, which make such algorithm implementations cleaner.
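As an illustration of that cleanliness (my own sketch, and deliberately not the fastest way to sort in Python), here is a quicksort written with list comprehensions:

    def quicksort(items):
        # readable high-level version: partition with list comprehensions
        if len(items) <= 1:
            return items
        pivot = items[0]
        smaller = [x for x in items[1:] if x < pivot]
        larger  = [x for x in items[1:] if x >= pivot]
        return quicksort(smaller) + [pivot] + quicksort(larger)

    print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))   # [1, 1, 2, 3, 4, 5, 6, 9]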
Also, Python might be ok for the very first Basics of Programming course with respect to pointers, as they don't really teach any algorithmics there. However, the weak typing (very late binding) would be a problem in this case. Beginners will have enough trouble understanding the language without the need to handle implicit types. I'd very much suggest a strongly typed object-oriented language such as C++, Java, or Eiffel, where the types are always explicit. For an algorithms course this isn't so much a problem.
For some classes, such as AI, there's simply no beating Prolog, and perhaps Lisp, but many Python features, such as easy string manipulation and other middle-level data structures, make it tempting for many subjects such as Automata and Formal Languages. It would be interesting to have a good Python interface to a Prolog interpreter; one that is well integrated with the syntactic philosophy of Python.
Re:Pointers required (Score:5, Insightful)
To me, Computer Science is a very cool blend of:
- Discrete mathematics
- Computation theory
- Linguistics
- Complexity theory
- Logic
- Probability
None of these rely on programming at all, let alone a specific language or whether that language has pointers or not. Programming is only an application of Computer Science.
You gave the example of an AI class - to me, the core of AI is learning about first-order logic and predicate calculus, searching and graph-traversal heuristics, reasoning, and natural language processing. I know this list isn't complete, but the point is that these concepts are for the most part independent of programming. Sure, you can start off by teaching somebody a specific implementation in Lisp or Prolog, but then they only know how to program AI in Lisp instead of how to apply AI concepts in any setting.
Re:Pointers required (Score:2)
One might reasonably argue that programming is the application of Computer Science. Without programming to implement its ideas, CS is pretty much just a bunch of theory.
Re:Pointers required (Score:2)
(A) If you see CS more as a way to solve generic problems then you'll like to teach a high level language. (B) If you see it more like a hardware/software problem solving, then you'd like to teach a low level language.
Which is best depends on what you are doing. (A) is more akin to a real abstract science (you may not care how memory segments are referenced, or whether a particular machine uses 64-bit or 32-bit registers), and (B) would be a more practical approach that would allow some more things to be done with the actual hardware in use today.
If I knew I'd live 400 years, I'd certainly learn a high-level language first. But that depends, because you can abstract the hardware specifics in low-level languages into some part of the program (but then again, why should you care?).
You miss the point(er) (Score:4, Insightful)
You have failed to understand the point of Computer Science (pun intended). Python is a terrific language for teaching CS because it has the basics of discrete structures: lists, maps (in Python, called dictionaries), tuples, and atomic data types such as strings, ints, and reals. That's all you need.
There's really nothing you can't do once you have lists and maps. Don't object that you can't have O(1) access-time arrays -- you can do that with a map.
I challenge you to describe any algorithm at all that can't be implemented without pointers. If you think you need pointers, you just aren't thinking like a computer scientist.
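A quick sketch of the dictionary-as-array point, for the skeptical (an illustration of mine, not a claim about how you'd actually write Python):

    # a dict keyed by integers behaves like an O(1) random-access array
    a = dict.fromkeys(range(10), 0)
    a[3] = 42              # constant-time (amortized) write
    print(a[3], a[7])      # constant-time reads: 42 0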
For some classes, such as AI, there's simply no beating Prolog, and perhaps Lisp
In general, you are absolutely correct. Of course, this is opinionated and others may disagree. But remember, you can use any Turing-complete language to simulate any other Turing-complete language (that's the entire definition of Turing-complete). Which means I can write a C interpreter in Prolog if I want (and I'm feeling particularly masochistic), and therefore I can simulate pointers using Prolog.
Oh, but you cry "That can't possibly be efficient!" Right again. But you've again missed the point of Computer Science. CS is about efficient algorithms, not efficient programs. That's something we leave to the software engineers and other "implementors." Us CS freaks think about what can be done, we don't actually do it ;-)
Re:You miss the point(er) (Score:2)
You're probably right with regards to basic and advanced programming courses, as I also mentioned, but I was talking especially about certain intermediate courses such as Data Structures and Algorithms, and System Programming. In those courses, low-level languages would be better in my opinion.
I don't know how it is elsewhere, but at our CS dept we also had mandatory courses such as Introduction to Computer Science, which quickly introduces logic gates and processor microcode programming; Computer Organization, which is about processor architecture at all levels; and Physics for Computer Science, which goes all the way down to electric fields in capacitors and coils.
Sure, you might want to call that Computer Engineering, but I think, and I believe many CS teachers think, that such low-level stuff is important for CS students too for "getting the whole picture".
Thus, learning how those lists, maps, and other things have been done, is important for CS students too.
But you've again missed the point of Computer Science. CS is about efficient algorithms, not efficient programs.
That's of course the academic ideal, but I'd say most CS students will work as programmers after they graduate (or drop out of school). If they don't have the low-level knowledge, they will be next to useless. In all the companies I've worked in, I've never seen a CS student working as an "algorithm designer"; they all either code or do overall system design. Even at university, most algorithm researchers have to pay at least some attention to code efficiency, which usually means using C or C++.
Re:Pointers required (Score:2)
While Python is my favourite language, I think it's rather silly to teach Computer Science and especially basic algorithmics with a language that doesn't have pointers...
I'd very much suggest a strongly typed object-oriented language such as C++, Java, or Eiffel...
Actually, while you're probably correct about Python not being ideal for what is about to come, the real problem facing many students learning computer science is that they've never programmed anything before. Many don't know how to break a problem down and build out a logical structure.
It's a basic issue of not demoralizing the student in the first course by tossing a strongly typed, constructor based language at them like Java. Early on, students just need to see how you take a problem and build a solution. The easier and quicker it is, the more likely they are to engage and stay on. As you go, you can illustrate the various benefits of languages that trade off short-term gains for long-term gains. If the students are engaged they'll stay with you.
Most CS students don't graduate from CMU or MIT. There are tons of students attending other universities who might be great CS students, but they bail out after a quarter or two because the benefits of the program are lost on them. That first course in Java is like hell on earth. Hours and hours of writing code that might be the most reusable, modular code on earth but doesn't do a goddamn thing.
Python makes for a wonderful intro course because you can solve problems quickly. Even in a first course you can teach students to turn their code into a CGI script. Now they can actually do productive things. That goes a long way toward keeping students in the program.
Sorry, but you're confused (Score:4, Insightful)
At low level, pointers are everything, and low level is what you want to teach when you're teaching basic data structures and algorithms.
Conceptually and from a computer science perspective, the object references present in languages like Python, Java etc. are equivalent to pointers in all the ways that matter for representation of data structures and algorithms. In the academic community and elsewhere, it's generally considered beneficial to teach such things without reference to the machine pointers which you're referring to, since machine pointers carry a lot of baggage that's unrelated to the abstractions involved in data structures and algorithms.
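A minimal sketch of that equivalence (my example): a singly linked list built from plain object references, with no machine pointers anywhere.

    class Node:
        def __init__(self, value, next=None):
            self.value = value
            self.next = next      # a reference, conceptually a "pointer"

    def prepend(head, value):
        return Node(value, head)

    def to_list(head):
        out = []
        while head is not None:
            out.append(head.value)
            head = head.next
        return out

    head = None
    for v in (3, 2, 1):
        head = prepend(head, v)
    print(to_list(head))   # [1, 2, 3]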
There's simply no point in demonstrating list implementation with an interpreted language that has very efficient native lists, dictionaries, etc.
To refute this, let me offer a tutorial: A Gentle Introduction to ML [napier.ac.uk]. If you work through this tutorial, you'll very soon begin implementing functions in the ML language for basic list operations and the like - functions that already exist in the language. And guess what: the implementations that the beginner typically comes up with in that tutorial are very close to the actual implementations that ML uses - the tutorial gives some examples of actual implementations for comparison.
This high-level operation doesn't even cost much - languages in the ML family, including OCaml, are regularly top performers - see e.g. Doug Bagley's language shootout [bagley.org]. They can perform on par with languages like C because their type systems allow sophisticated compile-time optimizations to be performed, and their high-level abstraction features are supported by optimizations such as tail recursion.
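Transposed into Python, the kind of exercise that tutorial has beginners do looks roughly like this (my sketch, not taken from the tutorial itself) - re-implementing list operations the language already provides:

    def length(xs):
        # recursive re-implementation of the built-in len()
        if not xs:
            return 0
        return 1 + length(xs[1:])

    def my_map(f, xs):
        # recursive re-implementation of map() for lists
        if not xs:
            return []
        return [f(xs[0])] + my_map(f, xs[1:])

    print(length([10, 20, 30]))                # 3
    print(my_map(lambda x: x * x, [1, 2, 3]))  # [1, 4, 9]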
C/C++ or Pascal are much better for that; with them you can teach real implementations, not toy ones.
If you believe that C/C++ and Pascal are good languages for teaching computer science, you don't know much about modern computer science. All three of those languages have very weak type systems and lack basic features that allow the construction of high-level abstractions.
Pascal is all but a dead language in the CS community nowadays. The primary use for C is as a decent portable assembler. Learning C has very little to do with computer science, and absolutely nothing to do with teaching computer science concepts.
Not convinced at all (Score:2, Insightful)
After having a quick look at bits I'm qualified to assess (I'm not a Python programmer, but do have plenty of background in CS, C++ and other related topics) I'm not convinced at all that I'd want to learn from this book.
Much of the preface by Jeff Elkner basically compares C++ to Python and has a go at the deficiencies of C++. It would be more convincing if he knew the return type of main(), the name of the standard header <iostream>, and what a statement was. Three fundamental mistakes just in discussing "Hello, world!" is not a good sign for the author's level of knowledge and understanding.
Trying to put aside my bias, as I like C++ as a practical language, I examined the appendix on creating a UDT for fractions to form a second opinion. Here, they do the obvious simple things to create a rational number class, and nowhere do they make the basic sanity check that your denominator is not zero. Surely one of the basic tenets of OO theory is that you always maintain your class' invariant properly? Their class may be a fine demonstration of Python's OO features -- I don't know, I'm not familiar enough with them to judge -- but it's a lousy demonstration of either good CS or good OO.
From these observations, I have to ask whether I'd actually want to learn Python from this book. If I do, how will I ever have any faith that what I've learned is correct and in good style?
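For the record, the kind of invariant check I mean takes only a couple of lines. A minimal sketch (mine, not a fix of the book's actual class; math.gcd needs a reasonably recent Python):

    from math import gcd

    class Rational:
        def __init__(self, num, den):
            # enforce the class invariant at construction time
            if den == 0:
                raise ValueError("denominator must not be zero")
            g = gcd(num, den)
            self.num = num // g
            self.den = den // g

        def __mul__(self, other):
            return Rational(self.num * other.num, self.den * other.den)

        def __repr__(self):
            return "%d/%d" % (self.num, self.den)

    print(Rational(2, 4) * Rational(3, 5))   # 3/10

A beginner-oriented book doesn't need much more than that to plant the idea of maintaining a class invariant.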
Re:Not convinced at all (Score:2)
Re:Not convinced at all (Score:2)
The libraries available to C and C++ are irrelevant for teaching an introductory programming course. The idea behind using Python to teach programming is to start with a readable language that can be used to illustrate basic concepts without mucking with lower-level machine-related details. Once the basic concepts have been conveyed, lower-level details such as strong typing and pointers can be introduced, using languages like C or C++. In short, get the students to walk before they try to run.
Just because C++ is a good practical language doesn't mean it's a good teaching language.
Re:Not convinced at all (Score:2)
Python is a great language, but it is just plain ignorant to sell it as a C++ replacement. It makes substantially different design tradeoffs, which make it suitable for very different programming problems. C++ has better compile-time checking and better performance, while Python has more runtime flexibility.
Actually, Python and C++ work very well together; I see them as collaborators more than competitors.
For a good introduction to Python, I'd recommend Mark Lutz's Learning Python
Re:Not convinced at all (Score:2)
The author of the book wasn't selling Python as a C++ replacement. He was pointing out the lower-level features of C++ that get in the way of using C++ as a first language for people learning to program.
Re:Not convinced at all (Score:3, Insightful)
The C++ spec certainly specifies that the return type of main() is int. It's covered in 3.6.1 of the C++ standard. (I'm pretty sure the same is true for C).
Also, in practice, different implementations require different return types.
They're not conforming implementations if they require that main() doesn't return int.
Re:Not convinced at all (Score:2)
If it's running on an OS, it's hosted. And even if not, the question is whether it has a main() function, not the return type of same if it does.
Re:Not convinced at all (Score:2)
With all due respect, no, it doesn't.
The relevant standards are crystal clear on this matter: main() is required to return int in both standard C and C++. In C90, it was permissible to omit the int in the spec, leaving it implicit (though it was never allowed to use void) and in C99 and C++, this is not allowed. In C++, you are allowed to omit the return statement itself, and if control reaches the end of main(), 0 is returned. In every case, though, the return type is always int. If you'd like more information on the subject, please consult the FAQs for the various relevant language newsgroups, where an idiot causes a flame war on this subject approximately once every five minutes.
If your implementations do not have main() returning int, then you aren't programming standard C or C++, and it's really as simple as that.
a review (Score:4, Informative)
Re:a review of your review (Score:1, Interesting)
Unfortunately, you spent far more time writing about the merits of Linux and Open Source than reviewing the book. Using the review text as a platform for your views on the GPL was inappropriate and didn't tell me what I wanted to know about - the book.
The few paragraphs on the content of the book were sadly lacking and offered little insight into the use of the book for learning about CS or Python.
I'd say your review scores 1 out of 5.
Printed copy wanted (Score:1)
At any rate, I went to Green Tea Press's homepage, and that's got to take the cake for the most bare-bones website I've seen. They mention printed copies for a reasonable price, but they don't say how one can make that request, nor do they give any contact info.
Would someone tell me how I can get printed versions of the book?
Re:Printed copy wanted (Score:1)
They link to Amazon and Barnes and Noble, where you can get a printed copy.
Re:Printed copy wanted (Score:1)
You can order the book from Barnes and Noble or Amazon.
They give you links and everything. You should also realize that they're in the textbook publishing business, so nearly all their sales are probably going to be wholesale.
Think like a computer scientist? (Score:1)
Misnamed book (Score:3)
Upgrade C? No! (Score:2, Interesting)
C is a system-level language and is still used widely, especially in OS and VM coding. The whole point is for C to remain stable. I certainly don't see Python being used in these applications, and it doesn't deserve to be used at the system level either. Python is nothing but a glorified scripting language.
Re:Upgrade C? No! (Score:2, Informative)
how to make hideous PDFs like a Computer Scientist (Score:2)
-Kevin
Thinking Perl (Score:2, Funny)
from the because-thinking-perl-hurts-too-much dept.
Thinking (and writing) Perl doesn't hurt at all. It's reading Perl that hurts. Write Once Read Never.
Re:Thinking Perl (Score:2)
(Or, for the flamebait oriented folks, "Python for Perl Refugees".)
Python is a GREAT language, but. . . (Score:4, Insightful)
That said, I don't know if I would teach a beginning computer science course in Python. At my university, our general intro to CSE involves a two-class series teaching generic basic theory wrapped around a programming language. We used to teach them with C and C++ but just recently moved to Java. I have been a TA for these classes before. Based on my experiences, I think there are both pluses and minuses to the idea of teaching these classes in Python.
Benefits:
Looking away from basic intro classes, Python is great to know. I did a lot of AI code sketches in Python and have used it to slap together simple programs at work. However, I would consider it a tool to be learned after the basics have been beaten in. If I had learned Python first, it would be a lot harder to force me to do everything in C later.
-s
Re:Python is a GREAT language, but. . . (Score:2)
I've found IPython to be very useful. It's not a complete development environment, but it works quite well. It's not the same as a debugger - it has some relative advantages and disadvantages. A point in its favor is that Python always issues backtraces automatically, which is the most essential function of a debugger.
It's a pain to move blocks around and anyone who doesn't use an editor with auto-indent is screwed
You're pretty screwed without auto-indent anyway (-; Re tabs set to spaces: indentation needs to be all spaces or all tabs. As long as it's not a mixture of the two, there shouldn't be any problems.
# As has already been mentioned, not too much one can teach about memory management and pointers with Python...
The Python API will give you so much pointer and memory management that it hurts (-;
As an example, I would much rather have an early homework be a sorting algorithm and then have them reuse this algorithm in other homeworks than let them just type "xxx.sort()".
An alternative philosophy is that it's a good idea to learn to reuse a well-designed library before rolling one's own poor-quality imitation. Python opens up interesting possibilities by allowing one to pass in comparison functions: L.sort(lambda x,y: y-x), etc. It's a good thing to expose students to flexible and well-designed class libraries, rather than encouraging them to reinvent the wheel. Then they can implement their own custom containers.
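A sketch of how both philosophies can coexist in one assignment (my example; note that the modern spelling of the comparison-function idiom above is a key= function):

    def insertion_sort(items):
        # hand-rolled sort: build up a sorted result one element at a time
        result = []
        for x in items:
            i = 0
            while i < len(result) and result[i] <= x:
                i += 1
            result.insert(i, x)
        return result

    data = [5, 3, 8, 1]
    print(insertion_sort(data))              # [1, 3, 5, 8]
    print(sorted(data))                      # same result from the built-in
    print(sorted(data, key=lambda x: -x))    # [8, 5, 3, 1], descending via a key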
Re:Python is a GREAT language, but. . . (Score:2)
You have to move blocks around in C, C++, Java and Perl, too. Yes, I know these languages' compilers/interpreters don't care about formatting, but programmers still must indent code to show structure. Otherwise you've got unmaintainable garbage.
Once you accept this point, you realize that all of those braces are actually completely spurious. The indentation shows the blocks, the braces are just there for the interpreter/compiler. Python has taken a step forward by eliminating them.
As for tab stops, yes, they are annoying, but the answer to this problem is known. Tab stops must be mod 8. If your terminal/editor/whatever does anything else, it's broken.
--Mike
Re:Python is a GREAT language, but. . . (Score:2, Interesting)
I REALLY dislike any language that depends on white space. Miranda and Haskell are two other examples of this. It's a pain to move blocks around, and anyone who doesn't use an editor with auto-indent is screwed. Also, unless tabs are set to spaces, computers with different tab stops will see your code differently, which can be a problem if code is emailed, etc.
It isn't a pain to move blocks around - at least not more so than in any other language when one is coding properly. Nor does not having auto-indent cause any more problems here than in any other language when coding properly. Operative words: coding properly. The nice thing about Python is you're assured that the programmer got his blocks mostly right.
As for the space/tab problem that is simple, no tabs. Done. Tabs are bad. They are very bad. Just say no to Tabs.
As has already been mentioned, not too much one can teach about memory management and pointers with Python...
Oddly enough it is with Python that I finally grasped the concept of pointers. The nice thing about Python is that one isn't constantly trying to do the dereference shuffle to do the right thing. This makes copying lists a little sticky but that is an acceptable trade off compared to scratching one's head over "Ok, this function requires a variable, a pointer-to-a-variable, another variable which I'm going to dereference from this pointer here...."
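The "copying lists is a little sticky" part boils down to this (my illustration):

    a = [1, 2, 3]
    b = a          # b is another name for the same list object
    b.append(4)
    print(a)       # [1, 2, 3, 4] -- a sees the change too

    c = a[:]       # a slice (or list(a)) makes a genuine copy
    c.append(5)
    print(a)       # [1, 2, 3, 4] -- unchanged
    print(c)       # [1, 2, 3, 4, 5]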
Sounds kinda strange as a complaint, but too much is built in. I have this complaint about Java too. As an example, I would much rather have an early homework be a sorting algorithm and then have them reuse this algorithm in other homeworks than let them just type "xxx.sort()". Not that this isn't a great feature for experienced programmers; it's just that beginning students should have to do sorting, reversing, duplicating, etc. themselves at first.
Not only is this an odd complaint, it is also a non-issue. There is nothing that precludes you from requiring a home-brew sort to be used in your assignments. It begins something like this: "Yes, Python is nice in that it has a sort function. However, knowing how a sort function works really helps one understand computers and programming. So, in today's class we're going to program our own sort function, which we'll later use in a larger program." The nice thing is that then you can say, "Then we'll compare what we've written with the default sort function..." It is an unimaginative teacher who cannot figure that one out. It has been pretty common practice in all of the programming classes I've taken over the years. High school, college and vocational all did it while I was learning BASIC, Turbo Pascal and C.
YA : Thinking in Python (Score:2, Informative)