Conceptual Models of a Program?
retsofaj queries: "Almost all of the introductory programming books I've looked at focus on syntax, with possible digressions into a bit of semantics. What I haven't found are any great discussions that go beyond syntax and semantics and make it all the way to conceptual models. My goal is to develop a set of resources that can be used in an introductory course that teaches students programming starting with conceptual models, as opposed to starting with syntax."
"What I mean by conceptual models are how you think about what a program is (if a program can be anything!). Examples would be (all prefaced by "a program is made up of..."):
- flowcharts (structured programming)
- arrangements of opaque things sending messages to each other (OO)
- transformations of data structures (Wirth's view)
- state machines
- a knowledgebase (Prolog, etc.)
- algebraic operations on sets (Functional languages)
- Who/Where/How are the different models of a program being taught?
- What conceptual models do you use when programming (and where would I go to find out about them)?
Experience!!! (Score:2, Redundant)
Re:Experience!!! (Score:3, Insightful)
Without reading about, say, functional programming, how will one make the intuitive leap merely by exercising the iterative?
Like *all* disciplines, it is the combination of study and practice that will lead the way.
Re:Experience!!! (Score:2)
Analysis/Design? (Score:4, Insightful)
Programming (coding) is how you implement a design. By the time you get around to coding, I would hope that you already have the design worked out.
Or am I missing something here?
Re:Analysis/Design? (Score:2, Insightful)
Re:Analysis/Design? (Score:4, Interesting)
But it sounds like this question was posed by someone who has recently discovered that analysis and design are important, and doesn't understand why programming books don't cover analysis and design in greater detail. Which is, to some extent, a bit like asking why books on carpentry don't teach architectural design.
It's simple, really (Score:2)
Re:It's simple, really (Score:2)
Then I flesh out the lower-level functions, creating even lower ones where needed. I try to keep each function fairly short; going over 40 lines, let's say, could indicate that an unexpressed function is "swallowed" in your function. Longer functions are sometimes OK, and sometimes it's possible to factor out very short functions that have a clear-cut, reusable purpose.
When I code in C, I often start by defining the data structures, but in Perl I haven't felt a need to do this since Perl's data structures are dynamic.
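A rough sketch of the shape that tends to produce (the names are invented, and it's Java only because that's what other posts in this thread use; the same outline works in C or Perl):

public class ReportTool {
    public static void main(String[] args) {
        String[] lines = readInput();         // top level reads like the design
        String[] cleaned = normalize(lines);  // the details live one level down
        printSummary(cleaned);
    }

    // Each helper stays small; if one creeps past ~40 lines, that's usually a
    // sign another function is "swallowed" inside it and wants factoring out.
    static String[] readInput() {
        return new String[] { "  alpha ", "beta", "  gamma " };  // stand-in for real I/O
    }

    static String[] normalize(String[] lines) {
        String[] out = new String[lines.length];
        for (int i = 0; i < lines.length; i++) out[i] = lines[i].trim();
        return out;
    }

    static void printSummary(String[] lines) {
        System.out.println(lines.length + " lines:");
        for (String line : lines) System.out.println("  " + line);
    }
}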
I hope this helps.
Simple, LOGO (Score:2, Funny)
If a turtle walking around the screen can't teach the fundamentals of programming, nothing can.
Design Patterns (Score:4, Informative)
Re:Design Patterns (Score:2, Insightful)
Re:Design Patterns (Score:5, Funny)
Deferring decisions to runtime makes code hard to read. Inheritance trees can get fairly deep, work is delegated off in clever but unintuitive ways to weird generic objects, and finding the code you're looking for is impossible, because when you're looking for the place where stuff actually happens, you eventually come across a polymorphic wonder like
object.work();
and the trail ends there. Simply reading the code doesn't tell you what it does; the subtype of object isn't determined until runtime. You basically need a debugger.
You can take a really simple program and screw it up with aggressive elegance like this. Here is Hello World in Java:
public class HelloWorld {
public static void main(String[] args) {
System.out.println("Hello, world!");
}
}
But this isn't elegant enough. What if we want to print some other string? Or what if we want to do something else with the string, like draw "Hello World" on a canvas in Times Roman? We'd have to recompile. By fanatically applying patterns, we can defer to runtime all the decisions that we don't want to make at runtime, and impress later consultants with all the patterns we managed to cram into our code:
public interface MessageStrategy {
public void sendMessage();
}
public abstract class AbstractStrategyFactory {
public abstract MessageStrategy createStrategy(MessageBody mb);
}
public class MessageBody {
Object payload;
public Object getPayload() {
return payload;
}
public void configure(Object obj) {
payload = obj;
}
public void send(MessageStrategy ms) {
ms.sendMessage();
}
}
public class DefaultFactory extends AbstractStrategyFactory {
private DefaultFactory() {;}
static DefaultFactory instance;
public static AbstractStrategyFactory getInstance() {
if (instance==null) instance = new DefaultFactory();
return instance;
}
public MessageStrategy createStrategy(final MessageBody mb) {
return new MessageStrategy() {
MessageBody body = mb;
public void sendMessage() {
Object obj = body.getPayload();
System.out.println((String)obj);
}
};
}
}
public class HelloWorld {
public static void main(String[] args) {
MessageBody mb = new MessageBody();
mb.configure("Hello World!");
AbstractStrategyFactory asf = DefaultFactory.getInstance();
MessageStrategy strategy = asf.createStrategy(mb);
mb.send(strategy);
}
}
Look at the clean separation of data and logic. By overapplying patterns, I can build my reputation as a fiendishly clever coder, and force clients to hire me back since nobody else knows what all this elegant crap does. Of course, if the specifications were to change, the HelloWorld class itself would require recompilation. But not if we are even more clever and use XML to get our data and to encode the actual implementation of what is to be done with it. XML may not always be a good idea for every project, but everyone agrees that it's definitely cool and should be used wherever possible to create elegant configuration nightmares.
Re:Design Patterns (Score:3, Interesting)
There are requirements that can make abstracted designs a good idea - things like i18n, which suggests that all the text in your UI should be ripped out of constants and put in config files. So in that case Hello World would need a way to read the config file in order to easily morph into Bonjour Monde or Hola Mundo or whatever.
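A minimal sketch of that idea (the key and the file names are made up; in a real project the strings would sit in per-locale .properties files on disk rather than inlined like this):

import java.io.StringReader;
import java.util.Properties;

public class HelloI18n {
    public static void main(String[] args) throws Exception {
        // Pretend these came from messages_en.properties / messages_fr.properties.
        Properties en = new Properties();
        en.load(new StringReader("greeting=Hello World"));
        Properties fr = new Properties();
        fr.load(new StringReader("greeting=Bonjour Monde"));

        // The code no longer knows or cares what the text actually says.
        String lang = args.length > 0 ? args[0] : "en";
        Properties messages = lang.equals("fr") ? fr : en;
        System.out.println(messages.getProperty("greeting"));
    }
}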
I can't tell you how many times I've seen an object model in a project that starts with EVERYTHING being based on some GenericItem object that all the others inherit from, and GenericItem has no real value. The designer just wanted to use inheritance a lot. Oy.
I like OO and all, I just don't like OO developers who read 10 books on OO and try to apply 100% of what they've read in their first project... and spend all their time building more and more goofy infrastructure instead of actually implementing any functionality.
Re:Design Patterns (Score:2)
Amen to that ... (Score:3, Interesting)
Re:Design Patterns (Score:4, Insightful)
But on the other hand, most tasks aren't as simple or well-defined as 'Hello World'. Remember the last time your boss/client said "hey, can we change it to do [this]" and you groaned because [this] wasn't anywhere near the original spec and you knew how much work it would take to hack it in? At that point you're wishing you'd abstracted just a little bit more at the outset.
My corollary: the boss is always going to ask for something that you didn't expect. Twice.
Re:Design Patterns (Score:2)
While Gamma et al talk about these patterns from an object-oriented point of view, most of them predate OO languages. We've been doing publish-subscribe, decorator, etc. for a long time; we just didn't use an OO language to talk about them.
Re:Design Patterns (Score:2)
I have started documenting procedural/relational versions of GOF-influenced patterns at:
http://www.geocities.com/tablizer/prpats.htm
Personally, I find p/r versions of them simpler, and more flexible. In table-oriented thinking, a "pattern" is often merely a relational formula adapted to solve a need for a particular task. GOF tends to view them as global.
In GOF OO thinking, one tends to make the *code* resemble the pattern. However, in p/r one tries to factor potential relationships into formulas on tables instead of code shape. Changes are then less disruptive, because it is easier to change a formula than a physical code structure. And, you can have as many different such view-providing formulas as you need without trampling on one another.
Perhaps it is subjective. After many heated debates, I have come to conclude that developers tend to model their own minds more than anything external. I just like p/r modeling better. I find it cleaner, more flexible, and less intrusive.
Relational modeling is as close as anything I have seen to reducing GOF-like patterns into a compact, virtual "pattern language". IOW, "pattern math".
One problem is that... (Score:2)
Programming is a process, and modern US schools don't teach processes at all. They teach you what 2+2 is, not how to add numbers. If someone figures out how to add on their own, it's not because they were taught it in school. How many people can add up all the numbers from 1 to a hundred quickly?
But not to make this a pessimistic post: I would say that teaching others how to program is probably best started with flowchart-type modeling, since that is easy to understand, can be represented graphically, and can then easily be turned into code.
State diagrams are harder to understand at first, but once you get them they are very good tools for writing programs.
Re:One problem is that... (Score:2)
I can...
$ perl -e 'foreach $i (1..100) { $sum += $i; }; print $sum . "\n";'
5050
Probably should have done this recursively for brain calisthenics, though
Re:One problem is that... (Score:2)
$ perl -e 'foreach $i (1..100) { $sum += $i; }; print $sum . "\n";'
Why waste all the CPU cycles:
$ perl -e 'print (100+101)/2;'
mmm,
$ perl -e 'print (100+101)/2;'
201$
Probably not the result you wanted, and definitely not a useful result: Perl parses that as (print(100+101))/2, so it prints 201 with no trailing newline (the $ is just your shell prompt coming back) and then quietly divides print's return value by 2. And even with the parentheses fixed, the formula for 1..100 is 100*101/2, not (100+101)/2. Which points out why you need to understand syntax (operator precedence) as well as structure...
the guy needs a little Knuth (who doesn't?) (Score:2)
Premature optimization is the root of all evil
:-)
Re:One problem is that... (Score:3, Insightful)
50 pairs + one lone number (Score:2)
1 + 99 = 100
2 + 98 = 100
...
49 + 51 = 100
50 + 0 = 50
That's 49 pairs of 100, the odd pair 50 + 0, and the leftover 100 itself: 4900 + 50 + 100 = 5050. So sum(1..100, step=1) = 5050 because of that.
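Or, once the pattern is spotted, just use the closed form n*(n+1)/2. A quick sanity check (a throwaway sketch, in Java only because that's what other posts here use):

public class GaussCheck {
    public static void main(String[] args) {
        int n = 100;
        long loopSum = 0;
        for (int i = 1; i <= n; i++) loopSum += i;    // the brute-force way
        long closedForm = (long) n * (n + 1) / 2;     // the pairing trick, done once and for all
        System.out.println(loopSum + " == " + closedForm);   // prints 5050 == 5050
    }
}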
Come to think of it, patterns like this (e.g. all the multiples of 9 times a single digit N have N-1 as the first digit and 9-(N-1) as the second) are probably why I hated calculus and used Taylor series in place of integrals every chance I got in college.
Integration still seems like voodoo to me. Sequences, now there's something *REAL*.
(even in job choices -- computational chemistry, database work, etc. -- I ended up with discrete rather than continuous bases for solving most problems... odd)
Re:50 pairs + one lone number (Score:2)
Karel The Robot (Score:2, Insightful)
Worm's eye view (Score:2, Insightful)
Re:Worm's eye view (Score:2)
Well, you're one up on Redmond.
Andrew
A Famous One Is... (Score:5, Informative)
by Harold Abelson, Gerald Jay Sussman, Julie Sussman.
Re:A Famous One Is... (Score:3, Interesting)
Re:A Famous One Is... (Score:4, Insightful)
The only problem, as I point out in another message, is that they almost totally ignore other conceptual models of programming; lambda calculus is thoroughly explored, but combinatory logic and similar models, as demonstrated impurely by APL/J/K and purely by Forth/Postscript/Joy, are almost ignored. A good teacher would, IMO, base a class on SICP, but augment it with two of the above languages and a discussion of their paradigms.
-Billy
Re:A Famous One Is... (Score:2, Informative)
mod parent up (link to full text of SICP online)! (Score:4, Informative)
SICP online [mit.edu] (my god that background is ugly)
Not to be confused with the Society for Invasive Cardiovascular Professionals [sicp.org], mind you.
Another option (Score:5, Informative)
(Read the articles on Graham's site. They're friggin' amazing distillations of experience. If you've been programming (successfully) for long enough, you'll not only be pleasantly surprised, but will find yourself nodding in agreement whilst learning about new topics.) Anyways, the book is an implementation of much of what he writes about, in his 'Mother Tongue' of Common Lisp. Hell, this is one of the few good writers who can correctly answer the question:
"If you're so damn smart, why aren't you rich?"
The answer, for anyone whose opinions you'd want to trust, is "I am", and it's BECAUSE of his opinions.
How To Design Programs (HTDP) (Score:2)
Well ... (Score:2, Interesting)
That having been said, it's not too hard to find good texts that cover these issues. Knuth's opus is only one example; there are several good books on algorithm design and analysis that are recent enough not to suffer from the "provably correct" disease of the '70s and '80s, and there's probably dozens of books by now that are part of the Booch series on OO design. Plus the hundreds of excellent texts on linear algebra and other basics.
Is it necessary to *start* with these? Well, I'm a fan of a two-track system where students are given both theory and praxis, allowing them to apply the former by way of the latter. You can learn theory without application, but chances are you'll end up with something like Python (/me ducks and covers); you can also learn application without theory, but then you get
Just US$0.02 from The Watchful Babbler.
You are VERY correct (Score:2)
I would add that this approach of theory without the specifics of implementation is not as workable on a practical level.
People learning to code want to get in there and 'do' something. They want to see some results, and I don't think there is anything wrong with that. You need to give them that opportunity to keep enthusiasm up.
The 2 track idea is excellent.
.
Java Design Patterns (Score:2)
While many design patterns are specific to good J2EE design, and may not be relevant for all types of application, many are general enough to be used in any object-oriented language. This may be part of what you are looking for.
-Pete
(yes, the book link has my amazon id in it...click and buy the book. It's worth having.)
How To Design Programs (Score:5, Informative)
That's a really clear book (Score:3, Interesting)
I liked the chapter differentiating generative recursion from structural recursion. That's a really insightful distinction in terms of the mechanics of grasping a problem and a good solution for the problem.
Definitely worth a read, although I think SICP is the cleverest (most intellectually satisfying) exposition of these little gems ever written.
I am reading (and doing) Paul Graham's 'ANSI Common Lisp' book for amusement and it's really sharpened my thinking. Macros are a great example of meta-generative-recursion, if you can call it that. Whatever you call it, it's raw power.
To Code Well - Write Code (Score:4, Interesting)
You have to know syntax and semantics to practice.
To take the high road right off the bat is good conceptually, but the problem is that implementation is often where it gets difficult. I know a lot of people will disagree, but I can tell you that the concepts behind something do not have a lot of value until the user has a level of experience that brings out that value.
I believe this is true across a wide range of disciplines - not just programming. If you tell someone that breaking in boots is important to hiking (w/out getting into a lot of messy details) they may listen they may not. If they sit at the end of a trail w/blisters all over their feet (or see a companion in that shape) they will value the information much, much more.
I've never found a conceptual approach to be nearly as useful before I've tried something compared to after those attempts.
.
...and read code! (Score:3, Insightful)
Case studies are widely used in other disciplines like engineering, and they can be useful in programming too.
Re:To Code Well - Write Code (Score:2, Insightful)
If you have never developed before, the conceptual underpinnings will be largely meaningless. Sure, students can and will learn what you're teaching by rote, but the information is not *real* to them yet because they can't make the connection to anything within their experience. Further, they won't retain it long enough for it to be of value later when their development skills catch up enough to appreciate it. So it is all well and good to say 'teach the abstract stuff first', but you're just going to frustrate them and yourself. It's like trying to give the benefit of your life experience to a teen by lecturing them. It's just words to them until they come into that wisdom on their own someday and finally 'get' what you were trying to tell them.
Re:To Code Well - Write Code (Score:2)
You are discussing the difference between maths, engineering and social science. Programming can unfortunately only be described as a social science subject.
In maths you are presented with the result (the theory), and then with the proof. You don't need to satisfy yourself that it is correct empirically or "in the real world", or that it's better to do it that way. It's proven.
In engineering you are presented with the result. You don't care about the proof, because you know that there is research and proof underlying the result, and you use the result in the recommended way to achieve those ends.
In social science everyone has their own theory. Some people rise to the top of the pile, and get continually picked on. Occasionally someone provides sufficient empirical evidence to justify the result being called a fact within the subject's domain.
This is the case with subjects like psychology and computer science. There is always more than one way to do or interpret something, and ego dictates that you must contest "accepted facts", and either derive or disprove the result for yourself.
Programming SHOULD be taught as an engineering subject. There is a body of knowledge for programming according to the conceptual model in use, and it is based on hard evidence from decades of development.
Unfortunately, software engineers prefer to make their own cement rather than ask the expert cement maker for the properties of the cement that is readily available.
Re:To Code Well - Write Code (Score:2)
I'm hardly surprised. How can you understand functions and procedures without understanding the conceptual model of a functional program? How can you understand pointers, references and pass-by-value without understanding the execution model behind the program?
To teach, you have to start at the beginning. Programming is the means, which comes in just before the end. The approach of teaching students how to hack their way along in a visual language is a primary cause of the number of inept developers, and gives our profession a bad name. Bookkeeping is a well-established profession: if you get a useless bookkeeper, you blame the person or the institution. A useless developer is assumed to be par for the course.
Programming courses need to start with the most basic topic: what IS a program. Then the theory of how programs run (jumping around to repeat bits of processing, and varying the processing with parameters). Followed by how we achieve that using source code and a compiler.
Only then can a student appreciate and understand what they are going to be doing when they "program".
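For instance, without some model of how calls and references actually work, a student has no way to predict what the following prints (a minimal sketch; the Counter class and the method names are invented for illustration):

class Counter { int value; }

public class PassingDemo {
    static void reassign(Counter c) {
        c = new Counter();   // rebinds only the local copy of the reference
        c.value = 99;        // the caller never sees this
    }
    static void mutate(Counter c) {
        c.value = 42;        // follows the shared reference; the caller sees this
    }
    public static void main(String[] args) {
        Counter counter = new Counter();
        reassign(counter);
        System.out.println(counter.value);   // 0
        mutate(counter);
        System.out.println(counter.value);   // 42
    }
}

Java passes the reference itself by value, so reassigning the parameter changes nothing for the caller while mutating through it does; that's an execution-model fact, not a syntax fact.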
Models (Score:3, Funny)
You have to learn arithmetic ... (Score:3, Insightful)
Students need to learn syntax before they learn (much) in the way of structure. It doesn't matter what language they first learn in, though I think something in the C family (e.g., C, C++, Java) is a good place to start, since a) most real-world programming is done in one of these languages and b) if you can really, truly learn C, you can learn anything.
But hell, teach 'em in Perl or LISP or Pascal if it makes you happy. The point is that programming courses have traditionally started out with "Hello World" or some such thing for a reason: beginning computer scientists need to learn that they can type in, compile, and run a program before they start worrying about higher-level structures. Any attempt to teach theory before practice will fail as surely as the "New Math" -- which basically did try to teach algebra before arithmetic -- did a couple of decades ago.
Re:You have to learn arithmetic ... (Score:2)
While that's true, a lot can be done with beginners in a pseudocode type of "language" that does not actually get run. I recall my first "programming" exposure was BASIC in 7th grade math. We had no computers, and this was shortly before the Apple ][ came out. We wrote very short programs on paper which were then essentially traced by hand by the teacher.
Still, as a beginner with zero exposure to programming at the time, I learned a lot: linear processing of commands, simple loops, printing, and yes, even some syntax such as line numbers and labels.
(back to my point) Pseudocode can help teach the logic and flow and structure aspects of programming without the burden of having to follow syntactical rules at the same time. So, the first step should be to write out the "program" in a lightly structured pseudocode; the second step would be to "compile" that by hand into actual code when syntax had been covered. At that point, the fundamentals of the logic are out of the way and the students can focus on getting the syntax right.
When I think about how my projects have been completed in the real world recently, this is essentially the same process I go through: map out the general logic (either in prose or with a type of pseudocode shorthand) and then program the thing after I've got that worked out.
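A tiny made-up example of that two-step process, pseudocode first and then the hand-"compiled" version (Java here, but any language would do):

// Pseudocode:
//   for each score in the list
//       add it to the total
//   average = total / number of scores
//   print the average

public class AverageScore {
    public static void main(String[] args) {
        int[] scores = {70, 85, 90, 62};

        int total = 0;
        for (int score : scores) {   // "for each score in the list"
            total += score;          // "add it to the total"
        }
        double average = (double) total / scores.length;
        System.out.println("Average: " + average);
    }
}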
Re:You have to learn arithmetic ... (Score:2)
However, that pseudocode can actually be run if it's done in Python [python.org]. Python is often described as "executable pseudocode". The syntax is very clear and very simple, but it is also amazingly powerful. Check it.
Perl Version (Score:2)
or this:
Couldn't resist.
Re:You have to learn arithmetic ... (Score:2)
Re:You have to learn arithmetic ... (Score:2)
Mwahaha, couldn't agree more. I'm still a rather inexperienced programmer, and while I've dabbled in various languages, C is definitely the one that makes me shudder. They present it to you easily enough... at first. Ya know: these are variables, this is an array, these are for loops. And then all of a sudden it's "Oh my god, how do I use pointers? How do I define a structure?! HOLY SHIT, IF THIS 5-LINE FUNCTION SEGFAULTS ONE MORE TIME I'M GOING TO MURDER THE ENTIRE PLANET!!"
I'm sure we've all been there.
Re:You have to learn arithmetic ... (Score:2)
What I've seen of the "current crop of programmers" is that they are not very good with fundamental concepts, and they are especially bad at breaking problems down and then applying the tool (whatever language/programming "style" that happens to be). So I know guys who know Java like the back of their hand, but when it comes to solving problems, they don't know _how_ to think about solving them. I also know some masters grads who can talk algorithms and concepts till they're blue in the face, but they couldn't actually implement it if you wrote half the code for them.
start lower than you'd think (Score:2)
If this is a truly introductory class, please do what my first programming teacher didn't do: take a few classes to explain what the hell "programming" is. I went through a lot of confusion making the connection between words written into a file and behavior later when executing that file. Granted, this was a long time ago, and I had never sat behind a computer until my first lab in that class, but I imagine this is still a problem for most first timers.
In my case, I was able to fall back on my considerable experience with logic concepts and advanced mathematics. Fortunately for me, Fortran behaved (and behaves) itself in this regard and allowed me to apply these concepts. I learned that this text needed to be processed before running it, but I had no idea until I started studying C what the hell those steps really meant.
Students today already have a bunch of assumptions of what computers are/do, so I imagine you'll have to help them unlearn some of that limited perspective. They also probably haven't learned a great deal of formal logic, so they might need some instruction there as well.
Hrm... can you teach people to think? (Score:3, Insightful)
I think you could take people through graded series of exercises soluble in different approaches, but there's no "one size fits all" way to develop intuition.
One approach used widely in architecture, a sibling profession if we ever had one, is "masterworks" - taking students through the works of other great architects, examining each decision made in some detail with reference to notebooks and discussions.
I think that this approach may make a lot more sense than teaching theory because it gives some access to an experienced mind, rather than just a methodology created by such a mind.
I know that I learned more from working with great programmers and absorbing their tricks than from any book I ever read or course I took.
Re:Hrm... can you teach people to think? (Score:2)
Plenty of changelogs available, and he'd probably talk you through a big chunk of the design process if asked nicely.
I do think that people can change the way you think about things, but I think 90% of the time it's done by "call-and-response": you ask a question which reflects your current state, they emit a response which changes your current state into some new state, and you fall on your ass wondering how you could have lived not knowing the world could be seen like that. That's what code gurus are.
Our models of education, alas, assume this is to happen by chance during lectures based on rough ideas of the makeup of a class.
I think the best programming education of all is apprenticeship and the open source community provides an excellent pool of masters [avogatro.org] from which to learn.
Email somebody: I bet within the first couple of dozen, you'll find somebody who's willing to mail you their source code to comment or standardize.
Re:Hrm... can you teach people to think? (Score:2)
flowcharts (Score:2, Interesting)
Design Patterns, The Book (Score:3, Informative)
I would also urge you not to follow the dogma that object orientation is the holy grail of software. Be open-minded toward structured programming too! :)
Re:Design Patterns, The Book (Score:3, Informative)
Re:Design Patterns, The Book (Score:2)
I just went and looked at both of those sites and I don't think that either are portraying design patterns nearly as harshly as you say they are.
Here is a quote from the second link: "Some of the patterns disappear -- that is, they are supported directly by language features, some patterns are simpler or have a different focus, and some are essentially unchanged."
From the first, "16 out of 23 patterns have qualitatively simpler implementations in lisp or Dylan than C++ for at least some uses of these patterns".
Both of these are quite balanced, evenhanded statements, neither condemning design patterns nor equating them with language flaws. Both links are oriented towards only one difference between kinds of languages: dynamic rather than static. There are quite satisfactory reasons to choose static languages over dynamic ones regardless of whether the design patterns are more complex, and it is somewhat close-minded to call either approach flawed.
Re:Design Patterns, The Book (Score:2)
I'm sure you are aware of established books on computing fundamentals, such as How to Design Programs [htdp.org], so it must be fairly obvious that the GoF is not remotely comparable to these.
Structure And Interpretation of Computer Languages (Score:5, Informative)
Lambda isn't everything, and a good teacher should also cover some languages which use it lightly (J and K) as well as a language which doesn't use it at all (Forth, Postscript, Joy) -- but it's good to have as a starter. SICP doesn't teach 'conceptual models', though; I don't think that the authors even realised there were other conceptual models out there. Most people don't, since most people don't even know that lambda calculus has almost nothing to do with how computers work, but is rather just the way most programming languages have been designed, in imitation of Fortran.
But I can't slam SICP. It may not cover other conceptual models, but it does a BANG-up job of covering the one it acknowledges, and even points out the weaknesses.
-Billy
Re:Structure And Interpretation of Computer Langua (Score:2, Informative)
Just the "Table of Contents" should be enough to set any red-blooded programmer on "DROOL".
Scheme has trivial syntax. This lets the authors explore semantics in amazing detail. Scheme's semantics are explained using progressively finer (and more accurate) "models". Eventually these models are implemented, in the form of interpreters and compilers for interesting subsets of Scheme. Meanwhile, various data types are presented. Unlike the vast majority of programming textbooks, arrays and array-based types are given little space. Meanwhile, lists, trees and various (potentially) "infinite" data structures are examined.
Models are also given for other types of programming, including a machine-code (register machine) model, a logic programming language, and the pure functional style. Functional programming is used extensively (assignments are deliberately rare) but not fanatically.
An amazing introduction to the subject (even if you already know it).
Re:Structure And Interpretation of Computer Langua (Score:3, Interesting)
Sussman was a co-author on several papers, the titles of which approximated: "Lambda: The ultimate goto instruction." I think Sussman and Abelson know a lot more about cs than you give them credit for.
You might like Design Patterns (Score:2)
Perhaps more similar to ideas from systems theory than from ontology, design patterns are sets of abstractions and ideas that are repeatedly useful in many different contexts for solving complex problems in an elegant manner.
As a developer, patterns are the most powerful part of my design repertoire. Conceptual models are invariably domain specific, but uses for patterns come up again and again.
Try Design Patterns, by Gamma, Helm, Johnson and Vlissides. There are many other books in that genre, though. Analysis Patterns by Martin Fowler may be more of what you want if you're looking for domain-specific analytical abstractions.
The book Cognitive Patterns by Karen Gardner describes a different kind of conceptual model for programming. An interesting read, but I've rarely used anything from it.
If you really want to get into a detailed ontological perspective on programming and the world, read On the Origin of Objects by Brian Cantwell Smith. Absolutely fascinating philosophical study. Absolutely useless to a working programmer. (unless you work at PARC or Watson labs)
Structure and Interpretation of Computer Programs (Score:3, Redundant)
I'm sure many people here are already familiar with it, but if you're not it's worth a look.
Best advice: Keep It Simple (Score:3, Insightful)
Structure and Interpretation of Computer Programs (Score:3, Informative)
It's the standard MIT intro text. Philip Greenspun called it the "one great book on the subject of computer programming". It's even online [mit.edu]!
The only caveat is that students reportedly find it hard to absorb on the first pass--even at MIT. (This is second hand information--I didn't read SICP in a class, nor did I go to MIT. I read it after programming professionally for a few years, and loved it.)
How to Think Like a Computer Scientist (Score:4, Informative)
However, I think it would be a mistake not to teach any syntax at the beginning. Students need concrete examples, and the only thing that makes it fun to learn how to program is that you get to build actual programs that really do things.
I always start with the concept of transformation (Score:2, Insightful)
The fact that this is vague is, frankly, exactly the point. I also take the student through the real-world-words exercise, particularly "What is a file? Now try again, but completely forget that there has ever been a thing called a computer; what is a file?", wearing them down to "an arbitrary collection of arbitrary things."
Once you have done this decomposition of practical thought (typically about ten minutes of easy banter and cheap jokes) you can "really start" with the idea that there is a thing called "a state", that that thing is "only as defined as it needs to be for the task" and that any task is a transformation on that state.
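If you want something concrete to hang that on, the whole idea reduces to a sketch like this (the fields and names are mine, not part of the lecture):

public class StateDemo {
    // The state is "only as defined as it needs to be for the task".
    static final class State {
        final int balance;
        final String lastAction;
        State(int balance, String lastAction) {
            this.balance = balance;
            this.lastAction = lastAction;
        }
        public String toString() {
            return "State(balance=" + balance + ", lastAction=" + lastAction + ")";
        }
    }

    // A task is then just a transformation from one state to the next.
    static State deposit(State s, int amount) {
        return new State(s.balance + amount, "deposit " + amount);
    }

    public static void main(String[] args) {
        State before = new State(0, "start");
        State after = deposit(before, 25);
        System.out.println(before + " -> " + after);
    }
}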
The whole lecture really involves working a crowd (of students) like you were the warm-up act for a TV sitcom live-taping audience. But done correctly, you are seeding your students with the tiniest grains of everything you will be requiring them to think with from there on out. Most importantly, you are doing it in a non-threatening way AND showing them that what they already know is vitally important. (That's the heart of the all-important act of validating your students.)
Then you start bouncing back and forth between the practical and the theoretical.
Basically, what you should *REALLY* do is spend a few weeks with an "english as a second language" teacher or just a plain english teacher and learn how to *TEACH* before you even think about teaching a particular subject like "programming".
The fundamental problem with computer science is that the students are learning from the people who learned from the people who made it up in the first place, and not one of those people ever learned to "teach" any subject to anybody. -- me
Re:I always start with the concept of transformati (Score:2, Insightful)
The most important "conceptual model" is "you already know how to solve problems, you do it every day."
This is generally followed by:
"This is not hard, just new/strange to you"
and/or
"Everything anybody can do in here, at the atomic level, is exactly analogous to something you have already done in the real world."
The first and hardest thing to do is demystify the experience. Computer mystics program by rote formula, always recreating the same program with nearly identical structures in a cookbook-like format. Computer scientists take knowledge and use it to manufacture a good solution for each unique problem.
Most of the freshly-minted graduates in Computer Science are Computer Mystics. If you can break down this formula approach to the subject you will get much further much faster and produce a better graduate.
They used to call it "Systems Analysis" (Score:2)
OK, I'm being cynical, but my point is that, in the last 40 years or so, everybody in the industry has agreed that one should have *something* that precedes programming, but there has never been really good agreement on what that should be. Methods go in and out of fashion, just like names: Yourdon, Chris Gane, etc.
My tiny piece of advice, based solely on my decades of working with computers is: hire really _good_ programmers (they may be expensive, but they are worth every cent) and let them do what they like. Nothing hampers a good programmer as much as imposing a "methodology".
Concept Programming (Score:2)
I'm currently hard at work on the first concept-programming language, called XL.
"Cardboarding" and Literate Programing. (Score:2)
Because computers are so literal, you need to think out the fine details of the process as well. This is where the cardboard comes in. In essence, you assume the program works, and manually work through the data.
You need to be conversant with your target enough to deal with all of the possibilities. To get an idea of this, read some of the bug-lists and history files of shareware.
The other thing to do is to design the thing so that the user knows what to expect. Spreadsheets were there before computers [Stationers carry books of x or y columns].
Once you do this, sit down with your material, and start with this program:
call startup
call looping
call closure
exit
You break the program down into parts, and then address each part. Each part should do a clearly defined activity. This may not be just "one thing". Each routine must leave the global variables in a clearly defined state.
More importantly, each of the controls (be they buttons or command switches) must do logical things. You can draw on your user's expectations. Putting a function at menu|F|X will draw the ire of people who use this sequence to shut the program down.
There are of course excellent books out there. The one by Hackel [I forget the name... will post it tonight :)] is good, as is Knuth's "Literate Programming". Jon Bentley's comment in there is worth considering as well.
IT is about information, not technology (Score:2)
The key thing is to keep a sense of proportion - anyone referring to UML or the GoF design patterns book has failed to understand what the fundamentals of IT are, and is certainly overestimating the relevance of their own preferred language or paradigm.
Scheme/LISP, logic and some database theory is a good way to approach the fundamentals, as it was 15 years ago when I went through it. They won't thank you to begin with, but it's what college is for! I'm not totally sold on SICP (students might think it's a bit pedantic); you might like to look at How to Design Programs [htdp.org] as an alternative. I don't have a good reference for database and logic texts; I use Joe Celko's Data and Databases [mkp.com] book, but this isn't suitable as an introduction.
Brain and Box (Score:2)
The box was your state machine, and the brain was the operator/programmer/user whatever looking at some aspect of it.
The whole concept of programming languages was explained using a sequence of grunts and sidetracks, and lots of pictures of the brain and the box at different relative sizes.
Best book I've read.... and the one I started with (Score:2, Informative)
Nino & Hosch
That's the book I started with 7 months ago (had never programmed before; HTML doesn't count).
While this approach to programming is _very_ frustrating (I was very itchy, wanting to get ahead and start coding real programs which could actually 'do' something), it gives you a very good base from which to go on and learn more about programming. GUIs, for instance, are discussed in chapter 20 (one of the last chapters), and only AFTER completely digesting the whole Model-View-Controller pattern.
Enough of the propaganda for this book; I'm simply a student who is very happy to have been able to get such a good introduction to programming from it. I've tried other Java books in the past, only to have been irritated with the lengthy examples which they START OFF with, even before teaching you exactly what a class is.
The problem with this book is that it doesn't really stand alone. You'll be able to grasp the concepts of OOP completely, but you'll have almost no hands-on experience unless you pay _very_ close attention. I recommend using a second book as a reference (we used a book by Kalin which I do NOT recommend; maybe an O'Reilly in a Nutshell book would be good).
Old timer comment... (Score:5, Insightful)
Re:Old timer comment... (Score:3, Interesting)
Teach that it might be easier to use 'i' as a variable in a short loop, but loop-idx or object_idx make more sense.
Research suggests that, for complex equations and/or complex operations, shorter variable names are more easily recognized than longer variable names. That's because most people who learned algebra recognize patterns in the equations, and using longer variable names makes the equations harder to recognize.
Thus, "force = gravitational_constant * object_1_mass * object_2_mass / (distance * distance)" takes a little more time for the brain to parse than "f = G*m1*m2/(r*r)", even though they represent the same thing.
With this in mind, I would suggest that if the iterator of a loop was being used as part of a mathematical operation (such as an array index), perhaps using 'i' will make the code more readable. However, if your iterator is not being used as part of a complex equation or represents an object (such as a pointer in a linked list), perhaps using a descriptive name makes more sense.
Personally I tend to write:
for (i = 0; i < 10; ++i) a[i] = 0;
yet:
for (windowPtr = GWindowList; windowPtr != NULL; windowPtr = windowPtr->next) {
windowPtr->Update();
}
You get the idea.
Just my two cents worth.
Re:Old timer comment... (Score:3, Interesting)
for (int i=0; i<size; ++i) {...}
(assuming that the "..." doesn't contain any braces), but function/method arguments should be longer, like this:
int createWidget(string name, billOfMaterials parts) {...}
and globally-visible items (like the class and function/method names above) get the longest names.
Re:Old timer comment... (Score:2)
I generally have to agree. As a rule of thumb, the more often something is referenced, the shorter its name should be. Long-winded code is harder to read IMO.
If you have a long variable name, then shorten it with an abbreviation, and then comment the abbreviation. Example:
var fbtb;
or perhaps:
var fooXfr;
IOW, factor the "documentation" into one spot.
As a compromise, perhaps try "idx" instead of "i".
Who needs to look? (Score:3, Funny)
Look at the screen, you must not.
Shut your eyes, you must.
Guide your fingers, the Force will.
Great coder, you will be, yesss.
Re:Old timer comment... (Score:2)
I tend to use "j" for that reason, but in vim there's always \<i\> to search for the solo letter "i".
You Should Start with Some Syntax (Score:2)
Basic rule of software: do the right things before you worry about doing the right things well. Learning the syntax is doing the right things, proper design is doing the right things well.
/me getting off soapbox
Everyone is different (Score:5, Insightful)
Some people prefer to read code. I definitely prefer reading code, because I think backwards and use non-traditional techniques to learn programming principles. I prefer to deconstruct a piece of good code and work back to the theory that way. Some people prefer to understand the theory first and think about different approaches to applying it.
A good teacher is one who is able to adapt the study plan to the strengths and weaknesses of the students. People should stop thinking of teaching as a mechanical process. Teaching is a creative, organic process that changes both the teacher and student. There are many smart and talented people working as teachers, who can't teach worth a dime. There are great teachers who are terrible programmers. Finding some one who is great at both is difficult.
Perhaps you should be asking, "How do I become a good teacher?" As Lao Tzu taught, if a person wants to be a good teacher, he must first be a good student. The teacher has to be a student of the student, to understand how and why a particular student fails, so that he can adapt the explanation/technique for that individual.
Re:Everyone is different (Score:2)
I agree!
People tend to assume that either:
1. Others *do* think like themselves
and/or
2. People *should* think like themselves
People simply process and digest and "model" information in very different ways. What works for you may not work for others. For example, social people may learn or process information better if they envision processes as a bunch of friendly little elves interacting and cooperating with each other to get a job done.
Although I tend to be visual, I find that some people just don't relate to diagrams and prefer some sort of written description or dialog of some sorts.
One size does *not* fit all.
Ensure they know why... (Score:3, Insightful)
Show them a couple of very simple constructs (like an if statement and a while loop), then show them the corresponding code in as many languages as you can. Build up in their mind that a computer language is just a tool to solve the problem.
The language you use to solve problems isn't irrelevant (some languages are better at certain tasks than others), but at this point in their programming career, it largely is.
The Modern Prometheus (Score:2)
Developing software means creating something new, something that has never existed before. Something that is often cobbled together out of parts from both the living and the carcasses of the recently or not-so-recently deceased. Something that, when we first try to bring it to life, often fails, badly at times. Something that we, acting as gods, mold and shape into an active entity worthy of respect from our peers and customers. The range to which software can be applied is vast, encompassing the entire spectrum of human endeavor. The consequences when we make mistakes can be devastating. Be careful out there, ok?
Two good books (Score:2)
-m
Thinking like a Unix Programmer (Score:2)
Re:Practicum first, then theory (Score:2)
For i = 1 to 10
Print i
Next
is a whole lot easier to understand than
for(i=1;i<=10;i++)
{
printf("%d\n", i);
}
for newcomers to programming.
Re:Practicum first, then theory (Score:2)
ugh.
Re:There is no such thing as "theory" in software (Score:2)
For example, consider the sorting algorithms. There are several different methods of sorting a series. Some of them, like quicksort, are generally better than others, like bubblesort. But in Knuth, we find a system where bubble sort is the most efficient method.
The idea that different methods "cost" different amounts, and that doing these calculations can speed up programs greatly, is also not to be missed.
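A quick way to make the "cost" point concrete (a throwaway sketch; the comparison counter and the choice of numbers are just for illustration):

import java.util.Random;

public class SortCost {
    public static void main(String[] args) {
        int n = 1000;
        int[] a = new Random(42).ints(n, 0, 1_000_000).toArray();

        long comparisons = 0;
        for (int i = 0; i < n - 1; i++) {            // plain bubble sort
            for (int j = 0; j < n - 1 - i; j++) {
                comparisons++;
                if (a[j] > a[j + 1]) {
                    int tmp = a[j]; a[j] = a[j + 1]; a[j + 1] = tmp;
                }
            }
        }
        System.out.println("bubble sort comparisons: " + comparisons);   // about n*n/2, i.e. ~500,000
        System.out.println("n*log2(n) for comparison: " + Math.round(n * Math.log(n) / Math.log(2)));   // ~10,000
    }
}

For a thousand items the bubble sort does roughly half a million comparisons, while a good O(n log n) sort needs on the order of ten thousand; that gap is exactly the "cost" calculation paying for itself.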
There is also "design philosophy", which is about how the parts fit together, and how they are built. The shift from the hard-coded jumps and gotos of assembler and BASIC to the structured programs of C, REXX, &c is the application of a theoretical discovery. The Unix pipe is built around the idea of task segments.
Like bridge design, program design is a case of learnt skills. And if you don't periodically look objectively with some theory at the design process, the design of both the bridge and the program may be sadly lacking and unnecessarily expensive.
And yes, you CAN learn bad habits. Languages like BASIC teach you bad habits, in the sense that all values can be accessed from anywhere, along with the use of GOTOs.
Flow charts are generally a bad idea, since they force you into excessive code. Better to let things wait and ride.
Re:There is no such thing as "theory" in software (Score:2)
All the same, theory is valuable, not all the time, but as often as it is in engineering. Debugging is a fairly theoretical task. The more you move up front (into design), the less you have to debug and rewrite.
Yes, the average program does benefit from a fair bit of design. In essence, if you can plan for things beforehand, then you can avoid these pitfalls. A big program in an area where you can't lock away variables, or one that needs a tight squeeze of space, does well with a bit of foreplanning.
And, besides, what's the difference between "SORT" "ORDER BY", and cutting and pasting a standard sort routine. :)
I may not know how to do quicksort, but the program runs faster with it than, say, bubblesort. Most of the time I use the "sort" command, and structure output to make its life easier. It's only when I need an inline sort that I dig out the code.
I mean, I wanted some code for matrix determinants and inverses, so I took some stock code and made my programs use the right variables. It's not that much different to doing the same in a spreadsheet.
There's nothing wrong with a spot of idealism, and I assure you, it's not a new thing with computers.
As far as programming goes, the first thing I show people is things like how to write filters in REXX, and shell programming. Both of these are dirt-easy, and immediately applicable.
Re:There is no such thing as "theory" in software (Score:2)
I mean, what's the difference between cut-and-paste and kit-assembled homes? The bulk of the work is done elsewhere.
Sorting routines are pretty much thrown at people. I do not deny this. So is bridge design. But this is not the point.
On the other hand, a careful study of where things go in a program, and a consideration of alternate options for the program, resulted in massive speed increases (hundred- or thousand-fold) in some of the programs I wrote and rewrote. Yet the first program I wrote was recycled to do an entirely different task, where it was much more effective.
There's no "magic wall" between theory and practice. It's more a matter of how far you stand back and see the scheme of things.
Re:Think in Parts (Score:2)
Good general advice. However, the problem is that there are often competing, orthogonal ways to divide something up. For example, traditional procedural programming tends to divide by tasks or verbs (at least the code part), while OOP tends to divide up by nouns. (I see no evidence that noun-centric code division is superior to task-centric, BTW.)
Every experienced programmer will probably agree that "parts is good", but the hard part (pun) is how to partition things to reflect the multi-faceted world we live in. Dealing with multiple, interweaving facets is the hardest part of designing complex systems IMO. (And please don't say that AOP has "solved" this. It is only a lab toy at this point.)
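As a toy contrast of the two slicings (an invented example; neither version is being claimed as superior):

public class PartitionDemo {
    // Task-centric: the data is a plain structure and the verbs live in functions.
    static double area(double w, double h) { return w * h; }
    static double perimeter(double w, double h) { return 2 * (w + h); }

    // Noun-centric: the data owns its verbs.
    static class Rectangle {
        final double w, h;
        Rectangle(double w, double h) { this.w = w; this.h = h; }
        double area() { return w * h; }
        double perimeter() { return 2 * (w + h); }
    }

    public static void main(String[] args) {
        System.out.println(area(3, 4) + " " + perimeter(3, 4));   // verbs first
        Rectangle r = new Rectangle(3, 4);
        System.out.println(r.area() + " " + r.perimeter());       // nouns first
    }
}

Same behavior either way; the interesting (and hard) question raised above is what happens when the next requirement cuts across whichever partitioning you picked.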
Re:Pseudocode (Score:2)
In fact, the concepts of bubble-programming and literate programming are based on pseudocode. All it means is that you write the calls to the subroutines before the subroutines appear...
So a program like:
call begin
call loops
call endme
exit
is a perfect pseudocode, that becomes a valid rexx script when the bits are filled in. It's also a neat block-breaker.
Often the pseudocode for something becomes, with a little debugging, the alpha code.
Re:Design (Score:2, Informative)
So, I have to start very basic. I get them to look at the source of a webpage (like Google or something). Then I point out that it's all just text. Then I get them to open up Notepad.exe, where we'll stay for the first 8 weeks before graduating to HomeSite.
Then I describe what the H, T, M, and L in HTML stand for, and what (exactly) mark-up is. I compare it to an English essay that you hand in, and when you get it back, the teacher has "marked it up", saying "new paragraph here" or "italic" or "bold" here... she's not telling you to change the content, just how it appears.
Then I teach them 1 tag at a time and the well-formedness rules. After they type it in, they look at it in the browser. They understand it that way.
After completing all of the basic xHTML tags we move on to CSS and start emphasizing the separation of data and presentation.
Then we do css layers/positioning and @media types, even further separating data & presentation.
Finally we do JavaScript and DHTML. This is full-on scripting. Data and presentation layers are separated (for the most part), and I can now introduce the idea of variables (which I relate back to grade 10 algebra: x=1, y=2, therefore x+y=3), looping, conditional structures, and functions.
Over 16 three-hour classes, this gets them from point zero to understanding the basics of scripting, as well as all of the details required to make a site with CSS-P that will validate under the strict XHTML DTD. For beginners, I have no doubt that "doing" is much more powerful than "talking/thinking about".
Re:UML tailored for OOP... (Score:2)
> etc etc) is done in Z type languages.
Really? Can you back that up? Specifically, can you prove that any of Windows NT / VMS / Solaris / FreeBSD / Linux / MacOS X have a kernel that was written in a "Z type language" as opposed to just coded directly in C/C++/whatever and tested the hard way?