A Better Way To Program 467
mikejuk writes "This video will change the way you think about programming. The argument is clear and impressive — it suggests that we really are building programs with one hand tied behind our backs. Programmers can only understand their code by pretending to be computers and running it in their heads. As this video shows, this is incredibly inefficient and, as we generally have a computer in front of us, why not use it to help us understand the code? The key is probably interactivity. Don't wait for a compile to complete to see what effect your code has on things — if you can see it in real time then programming becomes much easier."
Great but... (Score:5, Interesting)
Seems like it mostly only applies to GUI programming or programming with results expressed through a GUI. What about, say, kernel programming?
Re:Great but... (Score:5, Funny)
Re:Great but... (Score:5, Insightful)
That sounds simple but it isn't. While you could theoretically do this from a virtual machine, the difference between "visualising" it and testing it on real hardware is significant, especially when it comes to device drivers, which are known to be the most common source of bugs in kernels.
Plus verifying a kernel or a compiler is a pretty hard problem, it's a miracle if you manage to do it in decent time, let alone manage to visualise it in any way.
Re:Great but... (Score:4, Funny)
Re: (Score:3)
Re:Great but... (Score:5, Informative)
Then you didn't see the point where he showed how you can find bugs in algorithms as you type them.
Re:Great but... (Score:5, Insightful)
In the video he covers that as well. Well, at least he conceptually says it's covered; I disagree...
Let's start with his abstract example. His binary search on the surface looks straightforward, and he wanted to portray it as magically finding bugs, as he got a float in one instance and an infinite loop in another. However, the infinite loop example was found because he *knew* what he was doing, as he intentionally miswrote it to start with and intentionally changed the inputs in accordance with this knowledge. There are a few more possibilities that you have to *know* to try out. For example, he didn't try a value that was lower than the lowest (which would have panned out), he didn't try a value omitted from the list but still higher than the lowest and lower than the highest (which also would have been fine), and he didn't try an unordered list (which is incorrect usage, but accounting for incorrect usage is a fact of life). He didn't try varying dataset sizes (in this algorithm it doesn't matter, but he has to *know* that) or different types of data. You still have the fact that 'B' is smaller than 'a' and all sorts of 'non-intuitive' things inherent in the situation.
Now consider that binary search is a freshman level programming problem and therefore is pretty low in terms of the complexity a developer is going to deal with. Much of software development will deal with far more complicated scenarios than this, and the facility doesn't *really* cover even the binary search complexity.
I know I may sound more negative than is appropriate, but his enthusiasm and some people's buy-in can be risky. I've seen poor developers suckered in by various 'silver bullets' and produce lower quality code because they think that unit tests or other mechanisms passed and they can rest easy. Using these tools is good, but it should always be accompanied with some wariness to avoid overconfidence.
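The edge cases listed above are cheap to write down as an explicit checklist. Here's a sketch in Python (my own code, not the code from the talk):

```python
def binary_search(xs, target):
    """Return an index of target in sorted list xs, or -1 if absent."""
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # no overflow risk with Python ints
        if xs[mid] < target:
            lo = mid + 1
        elif xs[mid] > target:
            hi = mid - 1
        else:
            return mid
    return -1

xs = [3, 7, 9, 14, 21]
assert binary_search(xs, 1) == -1            # below the lowest element
assert binary_search(xs, 10) == -1           # in range but absent
assert binary_search(xs, 25) == -1           # above the highest element
assert binary_search(xs, 14) == 3            # present
assert binary_search([], 5) == -1            # varying dataset sizes
assert binary_search(['B', 'a'], 'a') == 1   # 'B' < 'a' in ASCII order
```

The point stands either way: every one of these cases had to be *known* in advance; the tool only makes checking them cheaper.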
Re:Great but... (Score:4, Insightful)
So your point, basically, is that programming is all about knowing what could go wrong with your code?
Not a bad definition, actually... it would certainly explain why coding productivity increases in step with experience; you've made those mistakes already and now know to avoid them.
Re: (Score:3)
Re:Great but... (Score:5, Insightful)
However the infinite loop example was found because he *knew* what he was doing as he intentionally miswrote it to start with and intentionally changed the inputs in accordance with this knowledge.
It was a demo. Demos by their nature tend to be both simplistic and contrived.
He was putting across a principle in the first half of his talk and a way of life in the second. Both very valid.
From your comments it's clear that you're more of a craftsman than a visionary. There's room in the world for both.
Re:Great but... (Score:4, Insightful)
I personally am a fan of "debugging by printf." Kernighan and Pike make a good argument for it in The Practice of Programming. But, that's not limited to debugging, really. It's a great tool for understanding one's own code. Basically, whenever I want to get a greater intuition about how something works, I load it up with print statements at strategic points. I guess you might call it "understanding through printf."
I'll be honest: I didn't invest an hour of my time watching this video. How really does his technique compare to "understanding through printf"?
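For what it's worth, "understanding through printf" in miniature looks like this (a trivial sketch; the function and the "strategic points" are mine, not from the video):

```python
# Sprinkle prints at strategic points and watch the state evolve.
# Euclid's algorithm as a stand-in example.
def gcd(a, b):
    while b:
        print(f"step: a={a:>4} b={b:>4}")  # strategic print
        a, b = b, a % b
    return a

print("gcd(252, 105) =", gcd(252, 105))  # three steps, then 21
```

His technique is essentially this, but continuous: every variable is "printed" next to the code as you type, with no edit-rerun cycle.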
Re: (Score:3)
You should invest an hour of your life, it really is worth it.
To the discussion at hand: sure, it is not an end-all, but how often I see people doing the same thing in a debugger, only with mouse-over variable / watch windows. His approach is essentially what many people do in the debugger, only the feedback loop is tighter. Sure, you still need to know about the problem and solution domain to isolate boundary conditions; but it surely changes the way you program.
Re: (Score:3)
On a side note, another field that is sorely in need of this is the medical field. A simple example: we have built machines to take blood pressure and heart rate automatically. But why aren't they in homes? We brush our teeth every day; why can't we take blood pressure? Why can't we get a daily or even real-time heart rate? We carry smart phones. We have wireless networks. Why not advocate daily blood pressure readings tied to a wireless network that would log and analyze your blood pressure and heart rate over time, so you could see trends on a daily basis rather than behaving reactively by going to the doctor only when you start to have severe symptoms?
It's become trivial to get a few basic medical parameters on demand cheaply. A pulse oximeter (heart rate and blood oxygen saturation) costs from $10-$50 for a home version, a blood sugar testing kit is $20 + $1/test strip, and an automatic blood pressure cuff system is about $50. They all work well, they usually have storage, and if you're willing to spend more you can probably get a USB version. Heart rate monitors with a lot of memory have been used by athletes for decades. Fully networked versions of all this that give information to your doctor are not far off, given that the baby boomers are getting older and it will cost too much to have live people monitor them as much as they want. The military is also developing such things so that someone can sit in a dark room in Florida and play FPS using live people halfway around the planet, seeing what they see and what their health status is. If you watch high-level bicycle racing, the team directors can often get real-time data like heart rate and power output of riders.
Re:Great but... (Score:5, Funny)
What about, say, kernel programming?
If you enter a syntax error, you get a video of Orville Redenbacher ripping you a new one.
Re:Great but... (Score:4, Insightful)
Re:Great but... (Score:4, Insightful)
You need to watch the video. The horizons of his thinking couldn't be wider.
Re:Great but... (Score:5, Insightful)
Re:Great but... (Score:4, Insightful)
It's worth an hour. But if you don't want to watch it, fine. Just don't expect your comments to be worth anything if you haven't done your homework.
He has a point. An hour-long video shouldn't be the sole medium for conveying a technical point. Summaries, diagrams, and/or an 8-10 page paper are also necessary. Asking people to devote an hour just to find out whether something is worth it is not how you present technical ideas or issues.
Re: (Score:3)
It's worth an hour. But if you don't want to watch it, fine. Just don't expect your comments to be worth anything if you haven't done your homework.
I'd like to have my hour back actually.
He said we need to push testing closer to the developer. His algorithm example was simply a demonstration of doing unit tests in real time. It's a good concept, suitable for developing small algorithms, but it breaks down quickly once you hit a million lines of code or have to work with large datasets.
Re: (Score:3)
It's amazing how many people are proud to be closed minded. Too bad you missed out. I'm glad I didn't.
You had a life-changing experience watching the JavaScript guy with his cargo pants; I get it, and I rejoice for you. That does not mean that people who find him tedious and cheesy are closed-minded; it just means that some people need a little more substance before they applaud.
Re: (Score:3)
Smalltalk and Erlang both have this neat feature where you can not only load and unload new code on the fly, you can change implementations in mid-execution just like you can with the Eclipse Java debugger at a breakpoint, or with the C# debugger under Microsoft's IDE platform.
So what exactly is he talking about that's "new"?
We haven't had to rely on static build-test-debug-fix-repeat cycles for day-to-day programming for at least 5-6 years!
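Python has a rough (and much cruder) analogue of that Smalltalk/Erlang workflow via importlib.reload. A self-contained sketch, with the "edit" simulated by rewriting a module file on disk:

```python
import importlib
import pathlib
import sys

# Write a tiny module, import it, change it "on disk", and reload it in
# the same process -- a crude stand-in for Smalltalk/Erlang hot swapping.
pathlib.Path("live_mod.py").write_text("def version():\n    return 1\n")
sys.path.insert(0, ".")
import live_mod
assert live_mod.version() == 1

pathlib.Path("live_mod.py").write_text("def version():\n    return 22\n")
importlib.invalidate_caches()       # make sure the new source is seen
importlib.reload(live_mod)          # new implementation, same process
assert live_mod.version() == 22
```

Unlike Smalltalk, existing object instances keep their old class bindings unless you rebind them yourself, which is exactly why the real image-based environments still feel nicer.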
Connecting to your creation in Clojure (Score:5, Interesting)
Re:Connecting to your creation in Clojure (Score:5, Informative)
And here is the vimeo video [vimeo.com] for those who want to tear their eyes out when visiting i-programmer and their 180px content column.
Re: (Score:3, Insightful)
Re: (Score:3)
25 minutes of that video are live demos. For the rest, perhaps a transcription would do.
Re:Connecting to your creation in Clojure (Score:4, Insightful)
And for those decrying the use of video, you'll definitely want to check out Up and Down the Ladder of Abstraction by the same author: http://worrydream.com/LadderOfAbstraction/ [worrydream.com]
It's a big wall of text with interactive javascript examples and no video.
They invented the debugger! (Score:5, Insightful)
Yeeehaaa! ;)
The tough problems aren't about running the code and seeing what happens, they're about setting up very specific situations and testing them easily.
Re:They invented the debugger! (Score:5, Insightful)
The tough problems aren't about running the code and seeing what happens, they're about setting up very specific situations and testing them easily.
Handling non-specific unknown/unpredicted situations gracefully is also tough. Unsanitized user input, crazy failure modes, failure in other code making your state machines go bonkers... The trendy thing to do is just throw your hands up in the air and tell the user to reboot and/or reinstall, but that's not so cool.
Maybe another way to phrase it is that at least one of the specific situations needs to be input from a random number generator doing crazy stuff.
Your Arabic to Roman numeral converter accepts an INT? Well, it had better not crash when fed a negative, a zero, 2**63-1 (or whatever max_int is where you live), or any ole random place in between.
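Taking the parent's example literally, here is a hypothetical converter that makes its domain explicit instead of crashing on bad input (my own sketch; the 1..3999 bound is the classic standard-notation Roman numeral limit):

```python
def to_roman(n):
    """Convert an int in 1..3999 to Roman numerals; reject the rest."""
    if not isinstance(n, int) or not 1 <= n <= 3999:
        raise ValueError(f"expected an int in 1..3999, got {n!r}")
    vals = [(1000, 'M'), (900, 'CM'), (500, 'D'), (400, 'CD'),
            (100, 'C'), (90, 'XC'), (50, 'L'), (40, 'XL'),
            (10, 'X'), (9, 'IX'), (5, 'V'), (4, 'IV'), (1, 'I')]
    out = []
    for v, sym in vals:
        while n >= v:
            out.append(sym)
            n -= v
    return ''.join(out)

assert to_roman(1994) == 'MCMXCIV'
for bad in (-1, 0, 2**63 - 1):      # the parent's hostile inputs
    try:
        to_roman(bad)
        assert False, "should have raised"
    except ValueError:
        pass
```

Rejecting the invalid inputs up front is what keeps the random-place-in-between cases from ever reaching the interesting logic.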
Re:They invented the debugger! (Score:5, Interesting)
The tough problems aren't about running the code and seeing what happens, they're about setting up very specific situations and testing them easily.
another way to phrase it is that at least one of the specific situations needs to be input from a random number generator doing crazy stuff.
Your Arabic to Roman numeral converter accepts an INT? Well, it had better not crash when fed a negative, a zero, 2**63-1 (or whatever max_int is where you live), or any ole random place in between.
This is still archaic thinking. A much more efficient way would be for the IDE to, when specifying a variable, ask there and then what the boundaries of the variable should be. Then the compiler could flag an error any time it saw a situation where the variable could be (or was) handed a value outside those boundaries. Programmers should not be having to catch weird situations over and over; that's what computers are for. Allowing a variable to be any possible INT/FLOAT/REAL just doesn't make any sense in many situations, so I'm quite curious why we're still having to even talk about random number generators for debugging and testing. It feels like we're still working for the computers instead of the other way around.
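You can approximate this today at runtime, even without compiler support; a hand-rolled sketch (the Bounded class is my own invention, not a standard facility):

```python
class Bounded:
    """A value constrained to [lo, hi]; out-of-range assignment raises."""

    def __init__(self, lo, hi, value):
        self.lo, self.hi = lo, hi
        self.value = value          # routed through the setter below

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, v):
        if not self.lo <= v <= self.hi:
            raise ValueError(f"{v} is outside [{self.lo}, {self.hi}]")
        self._value = v

angle = Bounded(0, 359, 45)
angle.value = 180                   # fine
try:
    angle.value = 360               # caught at assignment, not later
except ValueError as e:
    print("rejected:", e)
```

The catch, of course, is that this only fires when the bad value actually arrives; proving at compile time that it *cannot* arrive is the hard part.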
Re: (Score:3, Insightful)
This is still archaic thinking. A much more efficient way would be for the IDE to, when specifying a variable, ask there and then what the boundaries of the variable should be. Then the compiler could flag an error any time it saw a situation where the variable could be (or was) handed a value outside those boundaries. Programmers should not be having to catch weird situations over and over; that's what computers are for. Allowing a variable to be any possible INT/FLOAT/REAL just doesn't make any sense in many situations, so I'm quite curious why we're still having to even talk about random number generators for debugging and testing. It feels like we're still working for the computers instead of the other way around.
A good Ada compiler could do this almost thirty years ago, due to the flexibility of the type system. Of course, Ada '83 looked hopelessly complex compared to the other languages of the time such as K&R C. Put that together with the fact that the major users were bureaucratic mega-corps involved in government aerospace projects and it acquired a horrible reputation as a bloated mess.
Time moved on. C begat C++ and C++ started adding features without much evidence of overall direction. For example it was
Strict Typing (Score:3)
We have languages that force you to define the allowed range of an integer every time you define it (like Ada), and static analysis tools that find many of these problems. Entering this information into a dialog box wouldn't be any faster than typing it into the source code. The problem is that programmers consider these languages too cumbersome, and using such tools on other languages has too many false positives. They would rather have freeform languages like Python, which are faster to program and ea
Re: (Score:3)
It seems to me that what you're describing is Design by contract [wikipedia.org]. Specifically, Code Contracts [microsoft.com] in .NET allow you to do this. The compiler there tries to check as much as it can, but checking every situation is simply not possible. It also produces runtime checks, so that your application crashes immediately when the contract is broken, instead of corrupting data.
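Design by Contract in miniature looks like this hand-rolled pre/postcondition decorator (my own sketch; .NET's Code Contracts and Python libraries like icontract do this far more thoroughly):

```python
import functools

def contract(pre, post):
    """Wrap fn so pre(*args) and post(result) are checked on every call."""
    def deco(fn):
        @functools.wraps(fn)
        def wrapped(*args):
            assert pre(*args), f"precondition of {fn.__name__} violated"
            result = fn(*args)
            assert post(result), f"postcondition of {fn.__name__} violated"
            return result
        return wrapped
    return deco

@contract(pre=lambda x: x >= 0, post=lambda r: r >= 0)
def isqrt(x):
    return int(x ** 0.5)

assert isqrt(16) == 4
try:
    isqrt(-1)                       # precondition trips immediately
except AssertionError as e:
    print("caught:", e)
```

The "crash immediately at the broken contract" behaviour the parent describes is exactly what the assertions buy you: the failure surfaces at the call site, not three modules later as corrupted data.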
Re: (Score:2)
That's why you put a human in charge, to recognize the infinite loop conditions and fix them by stopping the inference engine.
Fuzz testing (Score:3)
at least one of the specific situations needs to be the input of a random number generator doing crazy stuff.
You're by far not the only person to reach this conclusion [wikipedia.org].
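The minimal version of fuzz testing is only a few lines; a sketch (the classify function is an invented example, not from any library):

```python
import random

def fuzz(fn, trials=1000, seed=0):
    """Throw random ints at fn; only the documented failure mode is OK."""
    rng = random.Random(seed)
    for _ in range(trials):
        n = rng.randint(-2**63, 2**63 - 1)
        try:
            fn(n)
        except ValueError:
            pass                    # documented failure mode: fine
        # any other exception propagates and fails the fuzz run

def classify(n):
    if n <= 0:
        raise ValueError("positive ints only")
    return "even" if n % 2 == 0 else "odd"

fuzz(classify)
print("fuzz run passed")
```

Real fuzzers (AFL, libFuzzer, Hypothesis) add coverage guidance and input shrinking, but the core idea is exactly this loop.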
Conjecture. (Score:4, Insightful)
Basically, the video referenced by the article is no different than "wouldn't it be nice if we were no longer dependent on foreign oil... that would make so many things so much easier!"
Re:Conjecture. (Score:5, Funny)
Re: (Score:2)
Yes, if only there were existing systems that worked that way. Such as the Lisp environment from 1958 or the Smalltalk environment from 1976. Such revolutionary new ideas about programming! I wonder if he will invent automatic refactoring tools next...
Lisp.. Smalltalk... Oh wait, I've got a FORTH answer oh no wait thats just the third... (Sorry bad joke, makes more sense if you know what FORTH is)
Gotta admit that the Venn diagram of languages with that kind of environment and "write-only languages" nearly perfectly overlaps. If someone would make a real-time dev system for Perl and Basic, then we'd have near-perfect overlap.
Re: (Score:3)
Re: (Score:2)
How about Java? If you run your program in debug mode inside Eclipse, it'll do hot code replacement every time you save (it doesn't ALWAYS work, and obviously won't retroactively change the results of previous calculations).
Re:Conjecture. (Score:4, Informative)
"Whoever does not understand LISP, is doomed to reinvent it".
(As a practical example, I used OpenGL in Lisp 10 years ago, and it was great to modify the code while the system was running.)
Re:Conjecture. (Score:5, Insightful)
Smalltalk and Lisp are a good example, and they show (to me) that the problem isn't the language. The hard part about programming isn't the code.
The hard part about programming is understanding and decomposing the problem. If you're not any good at that, then no matter what language you use, you're going to struggle and produce crap.
This isn't to say that languages aren't important -- different languages lend themselves to particular problem-spaces by suggesting particular solutions. Picking the right language for the problem is as important as picking the right wrench for the nut.
But there will never be a DWIM language, because the big problem is getting the programmer's brain wrapped around what needs to be done. Once that's done, what's left is only difficult if the programmer doesn't have the figurative toolset on hand.
languages (Score:2)
The hard part about programming is understanding and decomposing the problem. If you're not any good at that, then no matter what language you use, you're going to struggle and produce crap.
there was a report a few years ago about an entrepreneur who *refused* to employ computer science majors. he only employed english language majors. the reason was he said that it was easier to communicate with english language majors in order to teach them programming, and they were more effective and also worked better when introduced into teams, than the people who had been taught programming at university.
Re: (Score:3)
there was a report a few years ago about an entrepreneur who *refused* to employ computer science majors. he only employed english language majors. the reason was he said that it was easier to communicate with english language majors in order to teach them programming, and they were more effective and also worked better when introduced into teams, than the people who had been taught programming at university.
I think there's a reason we don't all know this entrepreneur's name and abbreviated life history.
Re: (Score:3)
Sure, the difficult part is the problem. But if you have the computer doing the trivial and menial parts, the programmer's mind will have more resources to expend analyzing the problem, and thus it will be easier to solve the difficult part.
Re: (Score:2)
"wouldn't it be nice if we were no longer dependent on foreign oil... that would make so many things so much easier!"
The funniest thing about this one is that it wouldn't matter. Even if we were no longer dependent on foreign oil, so many other countries around the world would be, that the middle east would still be getting funding for their terrorist programs.
Wait (Score:5, Insightful)
Someone re-invented scripting languages ?
Re:Wait (Score:5, Insightful)
well, yes. but it sold the idea as having someone (the computer, an AI) magically set up the situation you were thinking of when writing that code, too.
it's very easy to demo the benefits with something that, for example, just draws something. but such game engines have been done before, and it isn't really that different from just editing the code in an SVG. however, as you add something dynamic to it... how is the computer supposed to know, without you instructing it? and using mock content providers for realtime ui design is nothing new either, so wtf?
Fine. (Score:2)
Now go program me a nuclear reactor control system with this.
Re: (Score:3)
Are you saying that trial and error isn't appropriate for a system that cannot fail even one time?
I think it'd be very appropriate to build reactor control software with tests. Lots of tests. Lots and lots of tests. And you can simulate every device out there: you can simulate what happens when pressure builds or releases unexpectedly, what happens when the operator pours his Pepsi down the control panel and provides you with nonsensical inputs, etc.
Matter of fact, I can't see any other
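Simulating a broken device for a test is standard-library stuff these days. A sketch with unittest.mock (the sensor API and the controller function are invented for illustration):

```python
from unittest import mock

def safe_read(sensor, fallback):
    """Read a sensor, degrading gracefully instead of crashing."""
    try:
        return sensor.read()
    except IOError:
        return fallback             # documented degraded-mode behaviour

# Simulate the failure mode: a sensor whose read() always raises.
sensor = mock.Mock()
sensor.read.side_effect = IOError("sensor shorted by spilled Pepsi")
assert safe_read(sensor, fallback=101.3) == 101.3
```

The same side_effect trick covers the "pressure releases unexpectedly" class of scenarios: you script the device to misbehave exactly when and how you want, repeatably.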
Re: (Score:2)
So nuclear reactors are out. Can you program a firewall this way? What about an airline ticket reservation system? A compiler, maybe? A web browser?
His design is very good for instant unit testing, which is fine and dandy and I'm all for it.
Sounds like they have a GUI REPL (Score:5, Insightful)
Unless somebody wants to give a better executive summary, there's no way I'm weeding through an hour of video. Do they have any idea how many hours there are of "the one video you must see this year" on YouTube?
Re:Sounds like they have a GUI REPL (Score:5, Insightful)
The first example (with the fractal tree) is interesting. He changes a number in the code, and the trunk gets taller, or the number of leaves grow. He then adjusts the variable in the code as if it were a slider, and the picture updates accordingly in realtime.
Second example is a platform game. He is able to use a slider to go back and forward through time, and change certain parameters such as the gravity or the speed of the enemy turtle. To solve a problem at the end, where he wants the character to jump to a very particular height, he employs a 'trail' of the character over time (so you can see all of 'time' at once). Adjusting the variable, he can get that perfect jump height more intuitively. The 'character trail' changes in realtime as he adjusts the variable.
The third example is where he talks through a binary search algorithm. Usually you're blind, as you have to figure out what the computer is doing behind the scenes. But let's say you can see the output as you're typing. The values of the variables are drawn to the right as you're typing. If there's a loop, then there will be more than one value. In this case, the values are segmented horizontally in the output.
I've thought of a lot of the things that this guy has said (and even semi-planned to incorporate his third idea into my OpalCalc program shown in my sig), but a couple of things were moderately surprising, and it's nice to see it all in action.
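A crude text-mode imitation of that third demo: log each loop iteration's variables as a row, so you see the values instead of simulating the loop in your head (my own sketch, nothing like Victor's actual UI):

```python
def binary_search_traced(xs, target):
    """Binary search that records (lo, hi, mid, xs[mid]) per iteration."""
    rows = []
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        rows.append((lo, hi, mid, xs[mid]))
        if xs[mid] < target:
            lo = mid + 1
        elif xs[mid] > target:
            hi = mid - 1
        else:
            break
    for lo_, hi_, mid_, v in rows:  # one printed row per loop pass
        print(f"lo={lo_:2} hi={hi_:2} mid={mid_:2} xs[mid]={v}")
    return rows

rows = binary_search_traced(list(range(0, 100, 7)), 63)
```

The difference in the talk is that these columns update live, character by character, as the code is edited; here you still pay an edit-rerun cycle.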
Been There Done That. (Score:4, Informative)
I've been doing this for years with Python's interactive prompt. I write code and test it line by line as I'm programming when working with unfamiliar libraries. The code that works is then put into a permanent file for reuse as a script, or compiled to an executable for distribution to my end users.
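One nice twist on that workflow: paste the interactive session straight into a docstring and let doctest keep it honest forever after (the example function is mine, just to show the shape):

```python
import doctest

def chunk(xs, n):
    """Split xs into length-n pieces (the last may be shorter).

    The examples below are a pasted interactive session; doctest replays
    them and fails if the output ever drifts.

    >>> chunk([1, 2, 3, 4, 5], 2)
    [[1, 2], [3, 4], [5]]
    >>> chunk([], 3)
    []
    """
    return [xs[i:i + n] for i in range(0, len(xs), n)]

assert doctest.testmod().failed == 0
```

So the throwaway prompt experiments become the permanent regression tests, for free.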
Obligatory Dijkstra (Score:5, Insightful)
"I remember how, with the advent of terminals, interactive debugging was supposed to solve all our programming problems, and how, with the advent of colour screens, "algorithm animation" was supposed to do the same. And what did we get? Commercial software with a disclaimer that explicitly states that you are a fool if you rely on what you just bought."
From http://www.cs.utexas.edu/~vl/notes/dijkstra.html [utexas.edu].
Re: (Score:3)
Re: (Score:2)
Yeah, I like syntax highlighting (and color screens in general), as well as interactive debuggers. Certainly, we can write code faster and find bugs more efficiently with all the tools available today. But it's the "silver bullet" claims that I'm skeptical about.
Re: (Score:3)
Re: (Score:3)
Sometimes very smart people can be mostly insightful, but very spectacularly wrong on some points.
Dijkstra turned into a figurative monk in his later years, dedicating himself to austerity and impractical principles. Kinda like Stallman is for "Software Freedom". There's some value in that, but the rest of us need to get work done, perfection being the enemy of the good.
My boss sent me this drivel as well (Score:5, Funny)
It is the most worthless, dumbass thing I've ever had to sit through.
It's just another "wouldn't it be great if computers programmed themselves" video by someone too stupid to tie their own shoes.
I know what the code I'm writing is going to do *before* I start writing it, as I hope for the love of god most programmers do.
In fact, the biggest plague on programming ever has been this concept that changing what you want fifteen times along the way is perfectly okay and we just need to adapt methods to that idiocy. You don't need any of this crap if you just know what you want ahead of time.
Re: (Score:2)
When you write code you should be 100% sure what the code will do. If not, you don't really know how to program, and can be categorized as belonging to the "poke at it until it works" crowd and be banned from commercial programming altogether. (Members of that crowd should stay away from kernels and other important open source projects too, please.)
Working with such coders will be frustrating at best, and the death of projects at worst.
Re: (Score:2)
Maybe an analogy would actually be better...
Think of programming as a Mathematician developing a new maths proof. The Mathematician may not know how to get to his goal, but that doesn't mean that the solution isn't robust, or that he needs a calculator at every step.
Similarly, a good programmer can develop robust and easy-to-maintain code even without an a-priori design, or automated assistance.
Where machine-assistance comes in is that I can see situations where a computer can assist the Mathematician in a
Re: (Score:2)
You are wrong, and I hope in time you see this.
Programming is *not* like a mathematician stumbling upon a new proof. Programming is in fact extremely rigid and based upon a series of existing mathematics.
There is no room in *professional* software for stabbing in the dark. There is all the room in the world for that in hobby programming done by professionals or otherwise.
I don't expect a carpenter to build a house without a blueprint, it'd turn up as shitty as most modern software.
Re: (Score:2)
Re: (Score:2)
what is with your morons and equating the video I was commenting on to debugging?
Re: (Score:3)
It's "you morons", not "your morons and", and a sentence starts with a capital letter. What's the matter, didn't you know what the sentence was going to be before you wrote it?
Re: (Score:2)
A bug is a failure on the part of the programmer. You are supposed to recover from occasional failures, not have them as a constant and mandatory presence in your work.
Re: (Score:2)
Why exactly do you guess that?
See, we intelligent programmers realize that without knowing what you're building to start with, it's impossible to ever build it in a reasonable amount of time.
But you go ahead with your idiocy, I'm sure it will serve you well in some low paid job at a sub-par software shop.
This interactive language already exists. (Score:4, Interesting)
It's called Python. I use the interactive interpreter to test snippets of code all the time. This makes me 4 times more productive than with Java, for example.
Turbo Pascal (Score:2)
'nuff said
Re: (Score:2)
VB Redux? (Score:2)
The much-hated VB offers much of what you're looking for. Like it or hate it, it is really good for prototyping and for rapid development. The only drawbacks are that it encourages bad code and marries you to Windows.
This concept has already been in use for some time (Score:2)
... in the creative coding community.
Patchers like Max and Pure Data allow for realtime graphical programming, and live coding environments such as Fluxus exist for realtime graphics and sound. Max [cycling74.com] was originally written by Miller Puckette in the mid 80s for realtime control of DSP for computer music at IRCAM, and Pure Data [wikipedia.org], started in the mid 90s, is his open source take on addressing some of its design issues. Fluxus [pawfal.org] originates from around 2000 and is a live 3D engine for performance using a Lisp variant a
But we already have it! (Score:3)
We already have something like it: IDEs for typed languages with strong inspections.
When I'm writing something, my IDE helps me by highlighting possible errors or undesirable effects. That's not the 'visualization' stuff the author is talking about in the article, but it's actually useful.
understanding by doing... faster (Score:2)
the extension of this idea is to use evolution-style techniques. from an automated perspective, that means deploying genetic algorithms to improve the code. unfortunately that means that good tests have to be written in order to verify that the code so (eventually) generated is actually "correct". many people would fail to go to the trouble of writing good enough tests.
yes - the alternative is seen to be to "read every line of code and make it run in your head".
i can't handle that. a) i can't be bothere
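The genetic-algorithm idea above, in toy form: "evolve" a constant until the tests (a fitness function) pass. Real genetic programming mutates code, not a single number; every name here is illustrative:

```python
import random

def fitness(c):
    # the "test suite": distance of c*x from the target behaviour 3*x
    return sum(abs(c * x - 3 * x) for x in range(1, 10))

rng = random.Random(42)
pop = [rng.uniform(-10, 10) for _ in range(20)]     # random population
for _ in range(100):                                # generations
    pop.sort(key=fitness)
    # keep the 10 fittest, breed 10 mutated copies of them
    pop = pop[:10] + [p + rng.gauss(0, 0.5) for p in pop[:10]]
best = min(pop, key=fitness)
print(f"evolved constant: {best:.3f} (target 3)")
```

Note the parent's caveat in action: the evolved result is only as "correct" as the fitness function. A weak test suite here would happily accept a wrong constant.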
Re:understanding by doing... faster (Score:5, Insightful)
Congratulations, you are an idiot.
"Tests", no matter how numerous, cover an infinitely small fraction of possible instances of your code being used (that would be an infinite number of, unless you take into account the maximum lifetime of the Universe). They are supposed to assist a developer finding some faults in his reasoning, nothing more than that.
the plural of anecdote is performance bonus (Score:2)
I can't even bring myself to watch this, and I'm generally a compulsive bring-myselfer.
Dijkstra spins in his grave. Somewhere out there, Lamport is guzzling Guinness by the barrel and swinging his underwear around his head, while Knuth plays organ dirges.
The plural of anecdote is performance bonus. That was the VB business model in the late 1990s. This won't work twice. To obtain twice as many programmers at half the price there's now India and China. And they can do math.
The biggest Mistake Today (Score:5, Insightful)
Most programmers think that coding takes up the largest part of development. In some cases they would admit that testing is also very time consuming. But the real issue is understanding the problem. A lot of programmers design while they code. This results in strange development cycles, which also include trying to understand what the program does. That part can be done with a debugger, an interpreter or a trace analysis tool. The real issue is the problem description. First, understand the problem and its borderline cases. Second, come up with a solution for that problem. And third, try to implement it.
BTW: Most programs today do not contain that many surprising new algorithms. They collect data, they validate data against some constraints, they store data, they provide an interface to query the data. In desktop applications like Inkscape or Word, the data is stored in in-memory models, and there exist serializers and renderers for the content. So the code as such is not hard to understand, as we all know such structures.
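That collect/validate/store/query shape fits in a dozen lines; a sketch (all names are illustrative, no framework implied):

```python
# The collect -> validate -> store -> query pipeline described above.
records = []

def validate(rec):
    """The constraint check: every record needs a name and a sane age."""
    if not isinstance(rec.get("name"), str) or rec.get("age", -1) < 0:
        raise ValueError(f"bad record: {rec!r}")
    return rec

def store(rec):
    records.append(validate(rec))   # storage guarded by validation

def query(pred):
    return [r for r in records if pred(r)]

store({"name": "Ada", "age": 36})
store({"name": "Edsger", "age": 72})
assert query(lambda r: r["age"] > 50) == [{"name": "Edsger", "age": 72}]
```

Which rather supports the point: the structure is boilerplate; the hard part is deciding what the constraints and queries should be.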
Re: (Score:3)
And why would it be a bad practice using a computer to analyze the available data to discover the possible cases that must be covered, and thus understand what the program needs to do? Sure, the final released code should not contain the tests implemented for the first assumptions about data. But understanding the problem is easier if you can run a primary version of the basic program logic on live data.
Re:The biggest Mistake Today (Score:4, Insightful)
No. My argument is: you have to understand the problem and its cases first, and then code it. At least that is what we learned from the analysis of different software development methods. Some developers think that the perfect model to fit all cases will emerge while they code. This is true for very small problems, where they can keep all the cases in their head. As an alternative, you could start from the cases. Define your input and desired output and all parameters which affect the result. Define your constraints. If you believe in strict test-driven development, you could even design the tests. Then you divide the problem and specify the different sub-problems before you try to implement them in C, Java, Fortran, Cobol, etc.
Your argument is: you start with some input (a subset of the real input, or maybe a specification of the input) and a desired output. Then you fiddle around and try to come up with a suitable transformation which reads the input and provides the desired output. To "test" whether it works, you run it with input data. When it does not suit your needs, you modify the transformation code. On a sample-input basis you cannot know what all the alternatives of the input look like. And if you have just a specification of the input, you have to generate suitable input from that specification.
My proposition is, if you know what the input is and what the output is and what the cases are, you should be able to come up with a specification for the transformation before you implement it. The specification is then the model for the implementation. That approach is much saver and you can even try to proof if the code is supporting all specified input. I know most people work differently. But it is expensive and results in long coding nights and a lot of extra hours.
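In miniature, that spec-first approach looks something like this (a hypothetical Python sketch; the `clamp` function and its case table are invented for illustration): enumerate the inputs, outputs and constraints first, then write the transformation against them.

```python
# Spec first: the case table is written before the implementation.
# (input args, expected output) pairs covering the three cases.
CASES = [
    ((5, 0, 10), 5),     # in range: unchanged
    ((-3, 0, 10), 0),    # below range: clamp to low bound
    ((42, 0, 10), 10),   # above range: clamp to high bound
]

def clamp(x, lo, hi):
    # the transformation, written to satisfy the spec above
    return max(lo, min(x, hi))

for args, expected in CASES:
    assert clamp(*args) == expected
```

The case table is the specification; the implementation is only correct relative to it.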
Re: (Score:3)
That is the myth of the classical waterfall model. It assumes that you can and already do understand the whole problem in advance. As I said in another thread, that only works if you already have a model for the problem, maybe because it's a classical problem in physics or math. What I say is that you can use software to develop a new model; in
TL;DW (Score:2)
It's an hour long, and I don't have the patience to sit through someone gibbering on for that long. Is there a transcript?
Reasonable idea, terrible article (Score:4, Informative)
The article completely misses the point. The talk starts out awful, but after about five minutes of blithering, he gets to the point. He's set up an environment where he's running a little Javascript program that draws a picture of a tree with flowers, with mountains in the background. As he edits the code, the picture changes. There's a nice user interface which allows selecting a number in the code and then changing it with a slider. This immediately affects the picture. There's an autocomplete feature which, when code typing has reached a limited number of possible choices, offers options like drawCircle, drawRect, etc. Mousing over the selections changes the picture.
It makes sense if you're drawing pictures with programs. Something like this should be in editors for Renderman shaders and Maya programs, and maybe for some game engines. It also makes sense for HTML editors, and there are HTML editors which do that. Beyond that, it may not be too useful.
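The mechanism behind that slider is essentially "re-run the drawing function on every edit". A minimal sketch of the idea in Python (the talk's actual environment is a custom JavaScript editor; `render_scene` is an invented stand-in for routines like drawCircle):

```python
# Stand-in for a drawing routine: instead of pixels, return a textual
# description of the picture so the idea stays easy to check.
def render_scene(radius):
    return f"circle(r={radius})"

# Dragging the slider just re-invokes the function with each new value,
# so the "picture" updates as the number in the source changes.
frames = [render_scene(r) for r in (10, 20, 30)]
```

The hard part the talk glosses over is making this re-evaluation cheap and side-effect-free enough to run on every keystroke.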
Re: (Score:2)
Just replying to myself, did not plan for the post above (yes, a first post!) to be made anonymous...
Re: (Score:3, Insightful)
Re:An observation... (Score:4, Insightful)
If you need to "run" code, either in your head or on a computer, in order to see what it's going to do, you're probably not really programming and you're definitely not an engineer.
Would be a better post if you explained the "right way"; hopefully it's not mysticism.
What's wrong with processing this line of perl in your head according to the rules to figure out what it does? (Admittedly I have no idea why the heck you'd want to do this, but it's the simplest example I can think of using about 3 key perl concepts...)
s/(.*):(.*)/$2:$1/;
The other aspect has to do with new code vs maint (even maint of my own code). If I have no idea what I'm doing with my own freshly written code, that's just wrong... but old code always has some element of intense CSI work to figure out what it does before I can modify it.
Re: (Score:3)
Looking at your expression, I do not think I "ran" it in my head. Rather I "understood" it the same way I look at an equation.
Generally I do not find myself simulating the running of code when I look at a program. I only do that when the code is overly complicated and hard to understand. Or if it is a clever algorithm that I do not already understand.
This is even more true for functional languages. Looking at some Haskell program it is not even always clear how the computer is going to "run" it. You look at
Re:An observation... (Score:5, Funny)
I'd guess it slips a $1 and $2 bill into a stripper's titties by the looks of it.
Re: (Score:3)
I'd guess it slips a $1 and $2 bill ...
Oh so close. She dances over to you with somebody else's one dollar bill in the left cup and a two dollar bill in the right cup, and this magically swaps the $1 into the right cup and the $2 into the left cup without using a third cup, or even a hand. So it is essentially the popular internet meme "1 Girl 2 Cups". Even worse, a sharp eye can see the process involves a colon in the regex... This is going downhill fast...
Three obvious(?) perl concepts and the pitfalls surrounding them are:
1) You're operating
Re: (Score:2)
okay, never played with perl (I know, I know) - it takes a line of input and swaps two blocks around the first ':' character it finds?
Re:An observation... (Score:5, Informative)
* is a greedy "match as many as you can", and the first
So the result of -- $_ = "foo:bar:baz:qux"; s/(.*):(.*)/$2:$1/; -- would be "qux:foo:bar:baz".
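The same behaviour is easy to check in Python's `re` module, where the greedy/non-greedy distinction is the whole story (a quick sketch, not from the thread):

```python
import re

s = "foo:bar:baz:qux"

# Greedy: the first (.*) swallows everything up to the LAST colon,
# so the swap happens around that colon.
greedy = re.sub(r'(.*):(.*)', r'\2:\1', s)      # "qux:foo:bar:baz"

# Non-greedy: (.*?) stops at the FIRST colon instead.
lazy = re.sub(r'(.*?):(.*)', r'\2:\1', s)       # "bar:baz:qux:foo"
```

Which is exactly why the expression above surprises people when the input contains more than one colon.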
Re: (Score:2)
Your expression does not work the way you think it does if there is more than one colon in the input... ...bitch!
Re: (Score:2)
Or you're debugging.
Re: (Score:2)
WTF? How do you see what's going on in the code, if not by running it on a computer or in your head?
Let me guess, if you need to add numbers with a calculator or in your head, then you don't really understand arithmetic?
Re: (Score:2)
he means that you should code it in your head/paper/screen to some meta-language first("design") - then you're a real computer scientist engineer programmer, otherwise you're just a stinking hacker! waterfall or death!
Re: (Score:2)
I'd guess probably half of people doing a task involving significant technical skills don't really know what they're doing.
Not to say that so many people aren't capable. It's more that for a significant chunk, once they reach that point they're not far off promotion. Then there's another chunk who specialise in something else entirely and then find they need to do some programming or whatever as a tool for that.
It's all about efficient resource management (Score:3)
Sure, you can do that in your head. They got to the moon with an abacus and some slide rules, so why do we need computers again?
That you can make it in your head doesn't mean that you should. Human brains are incredibly slow and error-prone at recall and logic, the primary strengths of computers. On the other hand our brains are
Re: (Score:2, Insightful)
And if you don't run it thru the debugger and STEP thru it you are just guessing what it will do.
If you are not right about behavior of your code, you are not qualified to write it in the first place.
Many times I step thru my code to find some assumption I was making that is invalid.
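A classic instance of the kind of invalid assumption that only shows up when you step through the code (a hypothetical Python illustration, not from the thread): the default argument below looks like it gives a fresh list on every call, and stepping into the second call shows it does not.

```python
def append_item(item, bucket=[]):
    # Assumption (wrong): bucket starts empty on every call.
    # In fact the default list is created once and shared across calls.
    bucket.append(item)
    return bucket

first = append_item(1)
second = append_item(2)   # stepping in shows bucket already holds [1]
```

Reading the code, the assumption looks fine; watching `bucket` in a debugger on the second call is what exposes it.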
Then go kill yourself. People like you are the reason why there are bugs everywhere.
You can write code that compiles with 0 warnings on the highest levels, can get thru the most stringent of lint checks, passed dozens of code reviews, pair wise coded, etc, etc etc.
Compiler warnings are about things you are supposed to know -- a good programmer only gets them on typos or after removing things thus leaving something unused in the code.
But until you run it and step thru and see you will never know.
LISTEN, EVERYONE!
This is what is wrong with those people. They think they can write random shit, single-step through it, make more random changes, and rep
Re:An observation... (Score:5, Interesting)
Your approach makes lots of sense for writing mathematical or physical code, where you have a perfect model of the world that is the basis of the problem to solve. Your ancient Slashdot ID seems to indicate that this is the kind of work you have experience with; that, or the essential infrastructure for big businesses, which is similarly model-based.
But when your program is a specific business-flow application, for which there is no model and the logic and corner cases must be elucidated from the client by showing a partially working prototype, the only way to extract all the requisites for the program is to throw in a few lines, show it to the user, ask what's missing or how the current version is wrong, and then fix it. Because the program's behavior has not been fully defined when the first version is written, it's d^Hrefined as a reaction to the working code.
Re: (Score:2)
Re: (Score:2)