Why Programming Still Stinks
Andrew Leonard writes "Scott Rosenberg has a column on Salon today about a conference held in honor of the twentieth anniversary of the publishing of 'Programmers at Work.' Among the panelists saying interesting things about the state of programming today are Andy Hertzfeld, Charles Simonyi, Jaron Lanier, and Jef Raskin."
Re:This is why I hate slashdot (Score:3, Informative)
It made sense that the people writing the most important code for the new desktop machines were ragged individualists with eccentric streaks. At a panel on Tuesday (sponsored by the SDWest conference and Dr. Dobb's Journal) that celebrated Lammers' book, seven of the 19 original subjects of "Programmers at Work" lined up on stage to talk about what's changed in software over the past two decades -- and demonstrate that they have lost none of their cantankerous edge.
In "Programmers at Work," Lammers told the crowd, "I looked at the programmer as an individual on a quest to create something new that would change the world." Certainly, the panel's group lived up to that billing: it included Andy Hertzfeld, who wrote much of the original Macintosh operating system and is now chronicling that saga at Folklore.org; Jef Raskin, who created the original concept for the Macintosh; Charles Simonyi, a Xerox PARC veteran and two-decade Microsoft code guru responsible for much of today's Office suite; Dan Bricklin, co-creator of VisiCalc, the pioneering spreadsheet program; virtual-reality pioneer Jaron Lanier; gaming pioneer Scott Kim; and Robert Carr, father of Ashton-Tate's Framework.
But for all their considerable achievements, this was not a group content to snooze on a heap of laurels. In fact, though the hour-and-a-half discussion was full of contention, one thing all the participants agreed on was that software today is in dire need of help. It's still too hard: not only for users struggling to make sense of poorly designed interfaces, but for programmers swimming upstream against a current of constraints that numb creativity and drown innovation.
These veterans shared a starting-point assumption that the rest of the world is only slowly beginning to understand: While computer hardware seems to advance according to the exponential upward curve known as Moore's Law (doubling in speed -- or halving in cost -- every year or two), software, when it advances at all, seems to move at a more leisurely linear pace.
As Lanier said, "Software inefficiency can always outpace Moore's Law. Moore's Law isn't a match for our bad coding."
The impact of this differential is not simply a matter of which industry gets to collect more profits. It sets a maddening limit on how much good we can expect information technology to achieve. If computers are, as it has often been put, "amplifiers for our brains," then software's limitations cap the volume way too low. Or, in Simonyi's words, "Software as we know it is the bottleneck on the digital horn of plenty."
Most successful programmers are at heart can-do engineers who are optimistic that every problem has a solution. So it was only natural that, even in this relatively small gathering of software pioneers, there were multiple, and conflicting, ideas about how we should proceed in order to break that bottleneck.
Simonyi believes the answer is to unshackle the design of software from the details of implementation in code. "There are two meanings to software design," he explained on Tuesday. "One is, designing the artifact we're trying to implement. The other is the sheer software engineering to make that artifact come into being. I believe these are two separate roles -- the subject matter expert and the software engineer."
Giving the forme
Full Article (Score:0, Informative)
Programming must change -- but how? At a reunion of coding pioneers, answers abound.
- - - - - - - - - - - -
By Scott Rosenberg
March 19, 2004 | In some quarters today, it's still a controversial proposition to argue that computer programming is an art as well as a science. But 20 years ago, when Microsoft Press editor Susan Lammers assembled a collection of interviews with software pioneers into a book titled "Programmers at Work," the idea was downright outlandish. Programming had long been viewed as the domain of corporate engineers and university computer scientists. But in the first flush of the personal computer era, the role of software innovator began to evolve into something more like the grand American tradition of the basement inventor -- with a dollop of the huckster on top and, underneath, a deep foundation of idealism.
Re:And Here's the Google Cache to the Article (Score:5, Informative)
just remove all+the+search+terms
The same link here [google.co.th] with extra words highlighted.
Hungarian Notation (Score:5, Informative)
The linked page didn't mention that Charles Simonyi is the Hungarian after whom the term "Hungarian Notation" is named.
Hungarian Notation is the Microsoft Windows programming naming strategy in which the first few characters of a variable name hint to the reader at its data type. So hToolbox tips you off that it's a HANDLE, even without scrolling your editor up a couple of pages to find the declaration; papszEnvironment likewise tells a Win32 devotee that it's a Pointer to an Array of Pointers to Zero-terminated Strings.
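As a quick sketch of the idea (in Java for convenience, with invented names -- real Win32 Hungarian lives in C, and these prefixes just mimic the convention described above):

```java
// Hypothetical illustration of type-hinting prefixes in the Win32 style:
// c = count, sz = (zero-terminated) string, f = flag.
public class HungarianDemo {
    static String describe() {
        int cWindows = 3;            // a count
        String szTitle = "Toolbox";  // a string
        boolean fVisible = true;     // a flag
        return szTitle + ": " + cWindows + " windows, visible=" + fVisible;
    }

    public static void main(String[] args) {
        System.out.println(describe());
    }
}
```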
It's not the first such instance of binding data type to name, and it won't be the last. FORTRAN compilers, for example, have long had implicit typing: any variable not otherwise declared whose name starts with I, J, K, L, M, or N is assumed to be an INTEGER, while most other variables default to REAL (floating point). So FORCE(LAMBDA) reads a real scalar from a real array, given an integer index. Many programmers start a routine with IMPLICIT NONE to disable this assumptive behavior, since mistakes are easy to make when you let the machine decide things for you.
BASIC used sigils at the end of variable names (NAME$, COUNT#), and scripting languages like Perl use sigils that precede the name (%phones, @log, $count).
Unrealistic expectations (Re:From the soapbox) (Score:2, Informative)
At the same time, software is perhaps the only industry where everyone is a friggin' expert. Most people (in the US, anyway) happily pay someone $25 to change their oil. How many are willing to pay someone $25 to install a new sound card in their PC and load the drivers? And if you look at the complexity of the interactions between parts of the system, and the open-endedness of the interface with the environment, an oil change is downright primitive compared to a sound card swap. But no, for some reason it should be "easy". Look at everyone who enables "expert mode" on their software, when even "novice mode" presents more controls than a small airplane.
And the only people who actually understand the complexity of the software, the software engineers, somehow let themselves become convinced that software really should allow millions of users to do thousands of things they want, the way they want, all at once, and be "easy" at the same time.
Simonyi's Intentional Software Explained (Score:4, Informative)
I'm told that my university is one of the leading source transformation research centres in the world, but the only interesting things they're producing right now are for understanding legacy systems. So yes, there's probably a lot of money in source transformation, but it's also boring as hell.
It helps to understand low-level stuff (Score:5, Informative)
The thing is, Java is somewhat high-level. There are things going on under the hood that you won't learn about, but that pop up anyway once in a while. For example, if you're taught Java, you won't learn the difference between memory allocated on the heap and memory allocated on the stack. And yet...
This does not work (it doesn't even compile):
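(The code sample was lost from this comment; judging from the description below, it would have been something like the following reconstruction -- variable names are guesses. As stated, javac rejects it.)

```java
// Reconstruction of the failing example (hypothetical names).
// javac reports that x "is accessed from within inner class" and
// "needs to be declared final" -- but making x final would forbid
// the x = "b" assignment below.
public class ClosureFail {
    public static void main(String[] args) {
        String x = "a";
        Runnable r = new Runnable() {
            public void run() {
                System.out.println(x);  // error: x must be (effectively) final
            }
        };
        x = "b";   // this reassignment is what the compiler objects to
        r.run();
    }
}
```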
There's nothing wrong with the code; the problem is that Java pretends to support closures but really doesn't. To use x in the anonymous inner class, you need to declare it final. But if you declare it final, you can't do the x = "b" assignment.
I'm familiar with C, so I understand the difference between the heap and the stack. I can infer that x (the reference to the string, not the string itself) is allocated on the stack. It is not uncommon for instances of anonymous inner classes to outlive the stack frame they were created in, so the compiler doesn't know whether or not x (on the stack) will still exist when the object's run() method is called. So it makes a _copy_ of x, but in order to pretend that it is still x, the compiler wants you to declare it final so that the original and the copy can never get out of sync.
Having experience with C, I know the heap is a safe place to put things that may need to outlive the current stack frame:
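(This snippet is also missing; here is a sketch of the one-element-array workaround the comment describes, with invented identifier names. The array reference is final, but the cell it points to lives on the heap, so the inner class and the enclosing method share one mutable location.)

```java
// Sketch of the described workaround: hide the mutable value in a
// one-element array. The reference x is final (so the inner class may
// capture it), but the heap cell it points to can still change.
public class ClosureWorkaround {
    static String demo() {
        final String[] x = { "a" };   // final reference, mutable heap cell
        Runnable r = new Runnable() {
            public void run() {
                x[0] = x[0] + "!";    // mutates the shared heap cell
            }
        };
        x[0] = "b";                   // legal: the reference itself stays final
        r.run();
        return x[0];                  // both mutations are visible
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```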
It's ugly, but it works. The reference called x needs to be declared final (because it's on the stack) but the reference contained in the array does not need to be final (because it's on the heap).
Because of my experience with lower-level stuff, I understand how Java is faking its support for closures, and how to work around the limitations. This is only one example; there are many other times when understanding things from a closer-to-the-metal perspective gives insights that would be lost if things were only understood from a high level. Joel Spolsky summed this up fairly well: Leaky Abstractions [joelonsoftware.com]
Re:why does programming stinks today, an opinion (Score:2, Informative)
Note: This reply got really off-topic as I wrote it, but I still think it is an interesting train of thought. It got me thinking about languages, at least. So it may be a little rambling... I apologize in advance for any hard-to-understand prose.
That's generally true, except in some niche markets. There are still a few things that should be programmed in assembly or a lower-level language like C. Problems with extreme memory or speed requirements are often only solvable in languages designed when most computers had exactly those memory and speed constraints.
I don't think most languages - C++ in particular - are flawed. They simply represent different tradeoffs among ease of writing small programs, ease of writing large programs, performance, capabilities, and extensibility.
For instance, Java and Python provide large built-in libraries to make programs less complex. Those two languages are also generally implemented via virtual machines. VMs provide a medium between slow-but-safe (and highly portable) interpreted languages like BASIC, ECMAScript or Haskell and the traditional fast-but-dangerous compiled languages such as C/C++ or others. Also C++ was originally implemented in C macros - wasn't it? That's extreme extensibility. However, you don't want to use those features for individual projects: it'll only add complexity and thus a greater chance of bugs. Those features are generally useful for niche experiments, debugging, or sharing code among many projects. As a movie said (or something like it), with great extensibility comes great responsibility. :-}
There are many more examples. Special-purpose languages like XSLT do one thing really well (in this case, transform data) and suck at almost everything else. (The Game of Life can be implemented in XSLT, but why? :-} ) Some languages are easier to use in some professions than others - e.g. Lisp and Haskell in comp sci; Matlab and Mathematica in engineering (although everyone really hates them); I once saw a psychology experiment written in Pascal (why? I don't know!). The choice of a programming language is all about design tradeoffs. The requirements of the problem dictate which class of languages is best suited - that doesn't make one right and another wrong.
Re:Keep it simple, stupid. (Score:5, Informative)
In pure line count, a minimal Java Hello World has only one additional line.
It is crammed with keywords, and it contains the notions of objects and classes.
But I see that as a good thing - you can concentrate on the mainline code and introduce the student to control flow and so forth, but when you come to the concept of classes you've got a nice immediate example to point to.
It's so much easier to teach a language when a common reaction to new information is "oh, I wondered what that was for" rather than "why would I ever need that?"
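(For reference, the minimal Java version being counted is presumably something like this - my sketch, not from the original comment. The class wrapper is the "additional line" versus C, and it doubles as a first concrete example of a class.)

```java
// A plausible minimal Java "Hello, world" (illustrative sketch).
public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello, world");
    }
}
```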
Finally, I have to completely disagree with you about type safety. A perfectly written and perfectly comprehended program does not need type safety. A real-world program will never be either, and type safety will prevent some of the nastiest bugs from occurring and keep your data intact.
I have no problem with C in its place, but its place is not as a learning language or as a business language.
D.
Re:Hungarian Notation (Score:2, Informative)
Hungarian Notation is not a "Microsoft thing". If you've ever bothered to read the original Charles Simonyi paper, it has nothing to do with Windows, and the original seemed tailored more to, e.g., physics concepts than to window and process handles. Microsoft later adopted an adapted version of it for its own coding. It's not a "Microsoft thing" or a "Windows thing" - sheesh, if I had a cent for every time someone repeated that bit of misinformation...
Re:Hungarian Notation (Score:5, Informative)
Hungarian notation declined after Stroustrup added strong typing to C++. It is worth noting that Stroustrup never even considered NOT doing strong typing in C++. (Read his book on the design and evolution of C++.) Distaste on the part of hard-line C programmers for strong typing also declined, after C++ forced them to eat their broccoli, and they discovered it actually tasted pretty good (by catching errors that would otherwise not have been found nearly as easily).
It is also worth noting that Hungarian notation never caught on in any language other than C. In particular, Ada programmers never bothered with it: the Ada type system was rich enough that it could do everything that Hungarian notation pretended to do, and enforce it, by requiring compilers to refuse to compile type-incorrect programs.
(Somewhere, recently, I saw something about a commercial satellite project that was forced to use Ada, because there was no C/C++ compiler available for a recently-declassified microprocessor. Their programmers screamed bloody murder at the idea. The screams stopped when they noticed that the Ada compiler was catching errors for them.)
Re:ADA (Score:2, Informative)
Re:ADA (Score:1, Informative)
Wrong [act-europe.fr], wrong [gnuada.org], and wrong [gnat.com].
GNAT [gnat.com], the GNU Ada compiler, is Free (as in speech and beer), commercially supported, and has been integrated into mainstream GCC development. Get it. Use it. Love it.
Users of Debian can simply "apt-get install gnat" (and should also think about getting gnat-gps, gnat-doc, ada-mode, and ada-reference). Other distros probably have similar packages; everyone else can check GNUAda.org [gnuada.org], which has packages for Linux, NetBSD, DOS, and OS/2.
I studied CS at NYU and took a programming-languages class with Robert Dewar, main author of GNAT and president of AdaCore, the company behind GNAT, among other things (SPITBOL, anyone?). One of the best classes I took in college.