
 



Programming IT Technology

Why Programming Still Stinks 585

Andrew Leonard writes "Scott Rosenberg has a column on Salon today about a conference held in honor of the twentieth anniversary of the publishing of 'Programmers at Work.' Among the panelists saying interesting things about the state of programming today are Andy Hertzfeld, Charles Simonyi, Jaron Lanier, and Jef Raskin."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Imperator ( 17614 ) <slashdot2.omershenker@net> on Saturday March 20, 2004 @11:24PM (#8624846)
    Someone with a @salon.com address submits a story to slashdot linking to a Salon article. That article costs money to read. Slashdot posts the story anyway.

    Could an AC please post the full text?
  • Re:Copyright (Score:3, Insightful)

    by PktLoss ( 647983 ) on Saturday March 20, 2004 @11:34PM (#8624912) Homepage Journal
    The article is of obvious interest to a large subset of the Slashdot community, and the editors made the choice to post it here. If he/she hadn't posted it, someone else would have, so I don't see how who made the initial post is relevant.
  • by bug1 ( 96678 ) on Saturday March 20, 2004 @11:36PM (#8624921)
    Courtesy of Wikipedia:

    "In the common law theft is usually defined as the unauthorised taking or use of someone else's property with the intent to deprive the owner or the person with rightful possession of that property or its use."

    By posting the article here, we aren't depriving the copyright owner of its possession or use.

    If I make a noise, am I stealing someone's silence?

    Copyright infringement isn't stealing; it's copyright infringement.
  • by BeerSlurpy ( 185482 ) on Saturday March 20, 2004 @11:36PM (#8624924)
    It is entirely possible to survive in many companies as a bad programmer who nonetheless manages to be productive and produce seemingly non-buggy code. They may even appear to be especially hardworking and motivated because of the poor design that they have to spend extra time working around as they add features.

    The forces that allow this phenomenon to self-perpetuate are:
    - Lack of people who know how to manage engineers properly: how to recognize good ones and bad ones, and how to motivate the ones you have to be productive.
    - Lack of good project management skills, which inevitably leads to crunched schedules and poor-quality code, plus a lack of perception on management's part as to why the software is having problems with performance, bugs, or schedule.
    - Lack of desire to retain good engineers or cultivate improvement in the junior ones.
    - Lack of communication between engineering and whoever is giving them work, especially regarding desired features and schedule.
    - Lack of quality control, oversight, and checkpoints on project progress.

    It doesn't help that the concept of "good engineering" is so hard to measure: a few things are "obviously bad," but most things are not. Even if someone is completely wrong-headed about one particular concept, it is entirely possible that they are exceptionally strong in many other areas within that field. It eventually boils down to "the proof being in the pudding," with the pudding being exceptionally complex to make and subject to the whims of the royal pudding tasters when done.

  • by Otter ( 3800 ) on Saturday March 20, 2004 @11:37PM (#8624926) Journal
    Could an AC please post the full text?

    They have free day passes, FYI.

    To summarize, though:

    • Charles Simonyi has a new company that he claims will change everything.
    • Jaron Lanier is still happy to inform you that he's a genius and everyone else is stupid. Don't count on him to do anything, though.
    • Andy Hertzfeld sounds like he's gearing up to lose more money on Linux desktop software.
    • Salon continues to suck up to Linux users.
    That's pretty much it. Don't count on anything more useful out of these guys, except maybe Simonyi.
  • by PktLoss ( 647983 ) on Saturday March 20, 2004 @11:38PM (#8624935) Homepage Journal
    We are robbing them of its purpose, to generate revenue via ads, day passes or subscriptions.
  • While that is true, it's also naive to think it would be any other way. Why? Think about every other profession! The opportunity to do something creative is reserved for a select, lucky few. I say lucky few because everyone has met great people in crap jobs.

    So the question of course becomes: how do you dodge the bullet of crap positions? Doing well in academia is probably, unfortunately, the best solution. I'm going to CMU next year, and had a nice talk with one of their CS profs about some OpenGL + C++ projects I'm working on, plus some AI research I started. I will get to work on these things; however, I will also have to try not to swallow cyanide while dying through Java classes teaching me how to program in a way so dumbed down that even the greatest imbecile can't screw up.

    This of course touches upon a great sadness of modern engineering training for me: you don't get taught to think--you get taught to use prescribed methods. Why? Ostensibly so you never reinvent the wheel, but also so that anyone at all can do the job passably.

    Let's not be stupidly depressed about everything, however. Trying to shoot to the very top has always required talent and hard work, and always been possible.

    I think programming is truly great, truly beautiful. This afternoon I made some money writing some boring PHP code, but also worked on my personal projects, and I'll work to have the tides change in the future.
  • by Illserve ( 56215 ) on Saturday March 20, 2004 @11:43PM (#8624958)
    For programming to get "good" it's going to have to get unfun. No more will long haired super cool geniuses plug away for hours on end.

    It'll have to be a managed engineering process with all the fun and excitement of a CPA convention.

  • by G4from128k ( 686170 ) on Saturday March 20, 2004 @11:43PM (#8624961)
    Moore's Law is one reason why software still stinks. Instead of perfecting systems within the confines of a limited amount of resources, it's too easy to just assume more MHz, MB, and Mbps.

    With exponentially increasing resources, nothing ever stabilizes and everyone knows it. If people design software with the assumption that it will be totally obsolete and replaced in 18 months, they create software that is so badly designed that it must be replaced in 18 months.

    Until hardware performance plateaus and people get off the upgrade-go-round, programming will be sloppy and ugly.
  • by Rinikusu ( 28164 ) on Saturday March 20, 2004 @11:44PM (#8624969)
    And it's precisely this attitude of yours towards your "common man" that really makes computing suck, in general. Your entire post reeks of "elitism," harking back to a time when computer programmers were some sort of elite bunch that people "just depended upon" to make the magic database "go". Now that computing has been brought to the masses, for better or for worse, you feel that you and your 4-8 years of education in the computing field are threatened by a bunch of ITT graduates who don't have the theoretical knowledge that is absolutely not required to generate a simple GUI that queries a database and presents the results to the user.

    Frankly, programming as a profession bores me, which is why I no longer do it. I don't mind programming, on projects that *I* want to work on, but I no longer want to work on databases that aren't mine or are applicable to anything I'm remotely interested in. I'd rather use the computer as a TOOL, you know, a means to an end. It seems a lot of programmers are programming because it IS the end. They have no interests outside of computing.

    The fact that many ITT Tech students may not "understand" Knuth is irrelevant because many ITT Tech students don't give a flying fuck about Knuth. I know who Knuth is, I've owned/used his "Art of Programming" or whatever it was called, and I don't consider myself a better person.
  • by gilmet ( 601408 ) on Saturday March 20, 2004 @11:44PM (#8624970) Homepage
    The two are remarkably similar. As time goes on, analogous roles to those found in the production of physical machines/structures (such as concept artists, architects, engineers, construction workers) will be defined for digital creation. Actually, this has already happened. Perhaps what's lagging behind is the partitioning of education that leads to these professions?
  • by xtal ( 49134 ) on Saturday March 20, 2004 @11:45PM (#8624971)
    I agree with you, but only partly. Another problem is that some people are interested in programming applications as an end in itself--e.g. their whole life revolves around implementing solutions to other people's problems. The guy from Cox probably couldn't care less about Knuth--it's just what he's being told to do. Perhaps this isn't so much a problem as it is a side-effect of the need for programming services.

    That's because business has a need to get their problems solved, and finds the most effective tool to do it - in this case, generic problem solvers or programmers. This is work that is easily outsourced.

    Back in the day, the guy programming was solving problems to make -his- life easier. It's not a stark distinction, but one that needs to be made. My formal training is as an EE; I took MANY more advanced mathematics courses than the CS people, at least at the undergraduate level. We did a grand total of three programming courses, all of them offered by the CS faculty, and when I was there, we were taught Modula-2. It's since moved to Java. They don't start out teaching the virtual machine or bytecode, either. Pointer? Eh?

    Anyway, back to my point - I used Matlab, C, Assembly, you name it in my digital systems courses. We were not taught those things; we were expected to know them or learn them on our own to solve the problem at hand.

    Using a calculator to solve a problem and making the calculator are different things.
  • by bug1 ( 96678 ) on Saturday March 20, 2004 @11:46PM (#8624978)
    Its purpose remains the same whether it's successful or not.

    Copyright infringement doesn't change the intent of the copyright owner.
  • by JordanH ( 75307 ) on Saturday March 20, 2004 @11:47PM (#8624988) Homepage Journal
    From the article:
    Simonyi believes the answer is to unshackle the design of software from the details of implementation in code. "There are two meanings to software design," he explained on Tuesday. "One is, designing the artifact we're trying to implement. The other is the sheer software engineering to make that artifact come into being. I believe these are two separate roles -- the subject matter expert and the software engineer."

    Giving the former group tools to shape software will transform the landscape, according to Simonyi. Otherwise, you're stuck in the unsatisfactory present, where the people who know the most about what the software is supposed to accomplish can't directly shape the software itself: All they can do is "make a humble request to the programmer." Simonyi left Microsoft in 2002 to start a new company, Intentional Software, aimed at turning this vision into something concrete.

    It's difficult to believe that Simonyi could be ignorant of the many many years of development of CASE tools and AI projects that have promised to build software systems from specifications.

    In 1980, a Professor told a lecture hall of Sophomore Computer Science students, myself included, that almost none of them would have jobs in programming, because in just a few years we would have AI systems that would build software systems from specifications that subject specialists could input.

    I don't think we are even a little bit closer to that dream today than we were 24 years ago.

    Maybe I'm confusing things here, though. Specifications aren't exactly the same as design. I know that I've sat through some CASE tool presentations where they implied that the work was all done when the design was done, but they were doing some pretty fast hand waving. I believe that those tools did not live up to the promises of their marketing.

    Am I off-base here? Has Simonyi cracked this problem with something entirely new?

  • by Anonymous Coward on Saturday March 20, 2004 @11:48PM (#8624990)
    When we have the power to build highly generalized, evolutionary programs we might start to approach the reliability levels seen in nature. We will look back on all these fancy programming metaphors we have now as barely better than hunter-gathering. We haven't even had our programming agricultural revolution, let alone our industrial one.

  • by timothy ( 36799 ) on Saturday March 20, 2004 @11:55PM (#8625017) Journal
    Ads can be (are not always) annoying, in any medium, but they make the content possible.

    Radio ads drone on seemingly forever, but they pay for me to listen to Coast to Coast AM once in a while, or NPR (whose ads, in the form of begging, are even worse, but whose content is better). Television ads, on programs not caught by TiVo, can be obnoxious, too.

    The Salon article *can* cost money (that is, you can subscribe to Salon to read it), but you can also watch an ad (or you can click on the ad and carefully look away from it) and then read the article for free. That's what I do. Sites not run as charities need to pay for their content somehow: Even some commercial websites don't make money per se, but are justified by other means (goodwill, information spreading leading to sales, etc), and some are free to read and make money with banners. Salon, unlike some sites, has provided two ways to read their stuff, meaning (I hope) that they stay in business, since I like some of their original stories. Note that reading Salon by the watch-ad/get-daypass means doesn't require you to give them demographic information, answer surveys, surrender your email, click checkboxes to avoid (yeah right!) spam, choose a password, or pay any money.

    Probably someone will come up with a way to block the content of the interstitial Salon ads: the arms race continues. But I prefer their approach to the increasing number of news sources that require registration and/or a paid subscription. The New York Times is annoying but hard to ignore as a news source, enough so that we link to it from Slashdot despite the required registration process; other papers, barring unusual circumstances, we won't link to because it's annoying to keep so many username/password combinations and have to log in to read their content.

    And that it's someone from Salon who submitted ... Na und? An editor or writer with a publication or website can submit just like anyone else; I'm glad when they're up-front about it. Would you rather A. Leonard have submitted more sneakily from a throwaway hotmail account? :)

    Cheers,

    timothy

  • by ArbitraryConstant ( 763964 ) on Saturday March 20, 2004 @11:57PM (#8625023) Homepage
    My employer has a pretty sweet racket. They keep a few employees that are still in university so they can snap up bright young people without having to sift through hundreds of idiots.

    In other news, this is the inaugural post from my new account, created so no one at my office knows I'm talking about them.
  • by gcaseye6677 ( 694805 ) on Saturday March 20, 2004 @11:59PM (#8625031)
    Anybody remember parameterized programming about 15 years ago that was supposed to replace the need for programmers? Gotta love how that worked out.
  • by miu ( 626917 ) on Sunday March 21, 2004 @12:00AM (#8625043) Homepage Journal
    From the article:
    "Making programming fundamentally better might be the single most important challenge we face -- and the most difficult one." Today's software world is simply too "brittle" -- one tiny error and everything grinds to a halt: "We're constantly teetering on the edge of catastrophe." Nature and biological systems are much more flexible, adaptable and forgiving, and we should look to them for new answers. "The path forward is being biomimetic."
    This is easy to say, but what to do about it? A CPU is controlled by a set of registers and the contents of a stack; even if you virtualize those things (JVM, Smalltalk, .NET, ...) and give them access controls, you still have a system that is subject to massive failure once a single part of the system fails.

    So for this biomimetic approach to work would require a dramatically different machine architecture from what we have now. Of course, this would also require rewriting all existing operating systems and lots of existing application and library software. So 'emulate biological systems' is a nice easy answer that does not really answer anything in the near term.

  • Charles Simonyi? (Score:2, Insightful)

    by Anonymous Coward on Sunday March 21, 2004 @12:02AM (#8625061)
    Charles Simonyi, former MS Chief Scientist and inventor of the horror that is Hungarian Notation? I guess they picked him as an example.
  • by alienmole ( 15522 ) on Sunday March 21, 2004 @12:10AM (#8625099)
    For programming to get "good" it's going to have to get unfun. No more will long haired super cool geniuses plug away for hours on end.

    It'll have to be a managed engineering process with all the fun and excitement of a CPA convention.

    This only works when no innovation is necessary. You can't CPA-ify innovation (at least, no-one has ever succeeded at that). That's why big companies have to buy small companies, and why big companies run research departments for the long-haired super cool geniuses to play.
  • Any moron fresh out of ITT or CLN with a degree pasted to their face by their own drool can churn out a reeking pile of code that will work.

    However, without the theoretical knowledge to back that basic syntax knowledge up, it won't work well.

    The grandparent post mentions coding being a "factory job". The commoditization of coding IS a huge problem. Coding WELL is not easy. However, because these half-wits that barely dragged their sorry asses through high school can go to CLN and pick up the latest "Microsoft cert du jour" or whatever other worthless piece of paper they offer, the overall expectation of coding is dropping. I couldn't tell you how often I've been ordered to cut critical corners by clueless bosses who don't understand coding on any level deeper than how to throw syntax together to create a brittle shell of a program that will work just long enough to take the customer's money and run (a favorite quote: "... *I* was never taught that." -- spoken in the manner of someone who can't imagine they don't know everything).

    The problem isn't that we're elite. The problem is that good programming is no less complex or time-consuming a task now than it was 20 years ago. Why is it elite to try and explain that to someone when they tell you not to bother with such and such critical piece or this basic security test? It's not elite, it's just that we've been flooded by so many bozos that wouldn't know good programming practice if it bit them in the balls that we're constantly deluged by sub-par workers and everyone has come to accept that sub-par work as the norm.

  • by WasterDave ( 20047 ) <davep@z e d k e p.com> on Sunday March 21, 2004 @12:16AM (#8625131)
    Oh, for fuck's sake. Sooner or later somebody doing this--anonymous or not--is going to get Slashdot sued.

    Fucking stop it. It's a copyrighted piece of work, it belongs to someone else, and it is their right to control it.

    Dave
  • by KrispyKringle ( 672903 ) on Sunday March 21, 2004 @12:17AM (#8625134)
    I know a number of people just bitched you out for this post, so I'm going to try to keep it brief. ;) Just a few points, in no particular order.

    You refer to the cable guys as if they are the epitome of computer science. They aren't computer scientists. They almost certainly aren't even programmers. Perhaps to the completely ignorant, all computer-related jobs are the same, but they aren't. Most jobs as a technician are crap. Slightly above that would be the post of admin: keeping something up to date, installing new software. Above that, some network and system admins have interesting jobs designing new systems, implementing creative solutions to problems, and so forth. Programmers have a similar opportunity, to do creative coding, but often it's just another solution to another problem. Not something that sounds like a lotta fun. And above that would be computer science. Research. Whole different ball game.

    I think this is the root of your confusion. You see more blue-collar technical jobs. This doesn't mean less research is going on, though. Back in the day, the only people who interacted with computers were academics and researchers. There was no ITT tech. Now, in addition to the academics and researchers (of whom there are actually almost certainly many many more), there are hordes of unwashed masses actually (heaven forbid) using computers as tools, rather than just for the academic prospects themselves. Point is, the research is still there; in fact, there's far more of it. But there are also more and more other uses. This isn't a bad thing; it's a good thing.

    In case you don't see what I mean, look at it this way. Your complaint could be summarized with an analogous complaint about the watch industry. Back in the 1800s, the only watches available were really classy, expensive, work-of-art kinda things. A gentleman's accessory. Now, any old Joe on the street has one; they come in all sorts of cheap, disposable, low-quality shitty versions. But that doesn't mean there are fewer high-quality versions; in fact, there are more. TAG Heuer, Rolex, Citizen, Suunto... the competition to make the greatest precision timepiece is quite tough, I suspect. Point is, there's a lotta shit out there now that wasn't there in 1800, but plenty more nice watches as well.

    Hmm. I guess I didn't really keep that brief. Sorry.

  • by torokun ( 148213 ) on Sunday March 21, 2004 @12:20AM (#8625150) Homepage
    It's clear to me why you wrote this post from a subjective standpoint -- I thought the same way when I was 20. Even when I was 24. I don't think the same way now at 28.

    Why? Because I've seen through experience that (1) most people can't learn the hard CS stuff, and (2) 95% of projects don't require it. The sad fact is that "Computer Science" is only really applicable to solving "hard problems," writing compilers, designing languages, or to AI and its kin. It is, in general, not applicable to business applications.

    It used to be, but what happened? Computers got faster. Here's the progression in my career...

    1. Kudos for optimizing memory management and speed in C/C++, or even assembler.
    2. Questioning the need for such optimizations and pushing profiling before such work if it would take a significant amount of time
    3. Questioning the need for ever doing optimization, questioning the value of low-level languages.
    4. Pushing high-level languages (web-based solutions / VB) for everything unless a clear need exists.
    5. Sending everything to India.
    This just wouldn't work unless most apps simply didn't need the work that we as computer scientists want to put into them. Knuth is a perfect example -- he spent years and years getting TeX perfect just so he could see his books typeset perfectly. We have that sort of perfectionist bent.

    But it's all driven by money in the end, unless you're in academia or research... A very few people are in positions doing both technically hard stuff and making money for it. These would be like Wolfram Research, Google, some game companies (although I hear they're sweatshops, but what isn't nowadays?)...

    In the end, you're correct that a lot of hard problems can only be handled by people trained in CS. These would be the things mentioned, along with parallelism, threading, and optimization issues... But it's also true that most of the products out there don't need these. We're sifting these categories apart now, and unfortunately, it's just a fact that not much yummy stuff is left.

    But when you have garbage collection, a raging fast machine, and graphical IDEs, even if someone puts crappy code together, as long as they make a decent API, it's going to work after they try to compile it half a million times.

  • by slamb ( 119285 ) on Sunday March 21, 2004 @12:22AM (#8625165) Homepage

    The article is crap. A typical snippet:

    "There's this wonderful outpouring of creativity in the open-source world," Lanier said. "So what do they make -- another version of Unix?"

    Jef Raskin jumped in. "And what do they put on top of it? Another Windows!"

    "What are they thinking?" Lanier continued. "Why is the idealism just about how the code is shared -- what about idealism about the code itself?"

    This is similar to many articles before it disparaging the WIMP (Windows, Icons, Menus, Pointer) model. A bunch of "visionaries" see that we've used this same model for some time and are therefore convinced it is horribly limiting, and that we are using it solely because the people who actually make systems have less imagination than the people who write these kinds of articles. [*] They never offer any but the most vague suggestions of a better model. They certainly never take the time to explore a proposed model's limitations long enough to ensure it really is workable (much less actually an improvement).

    In fact, this article is so vacuous that I'm not sure what they think stinks about software, much less why. And certainly not how to fix it.

    [*] In fairness, this article mentions people who have done some impressive work in the past (and is thus atypical of the genre). But still, I do not see any suggestions for a fundamentally better model or even any concrete problems with the existing one.

  • by Paleomacus ( 666999 ) on Sunday March 21, 2004 @12:25AM (#8625180)
    My company does this too. I'm one of those in university employees you speak of.

    It is great to experience both sides of the fence simultaneously. Every week I get a good helping of theory and a huge truckload of the real world application (or tossing the theory out the window,whichever the case may be).

    I enjoy programming as long as I'm not doing something brain-dead factory work like building a new form/ui for one of our web apps.
  • Re:panel link (Score:5, Insightful)

    by nlper ( 638076 ) on Sunday March 21, 2004 @12:26AM (#8625182)

    Reviewing the list of contributors, it's interesting to note that some of them had already stopped programming back when they were interviewed. So why should we listen to them opine about software development techniques today?

    My pet peeve on the list would have to be Jef Raskin [sourceforge.net], who's far better at self-promotion than actually coding. Had people actually listened to his ideas in the early days of the Macintosh project, they would have delivered a machine without a mouse or other features most people associate with the Mac. (As Andy Hertzfeld puts it, he's not the father of the Macintosh so much as the eccentric uncle.) [folklore.org]

    However, if you want to hear him repeat the same things he's been saying for the last 20 years, he'll be keynoting the Desktop Linux Summit [desktoplinuxsummit.org]. No doubt he'll be beating the horse's skeleton that mice, icons, and the windowing interface are what's holding Linux back on the desktop. (MacOS X be damned!) Using those special "leap" keys that made the Canon Cat so successful, now that's the future!

    Tyler

  • by heyitsme ( 472683 ) on Sunday March 21, 2004 @12:28AM (#8625192) Homepage
    As a student at a major Big Ten University (tm) I can easily tell that your perception is a bit skewed. The old cliche "you get what you put into it" applies to many things in life, and computer science is no different.

    My school's core computer science curriculum is in Java. Language of instruction is a moot point to a rather great extent. You can learn as much from a data structures class taught in Java as you can from one taught in $language_of_choice. The idea is to learn how things work fundamentally, and then apply those ideas practically. A linked list in Java works the same as a linked list in C. It's not about Java being the "industry standard" as you call it; it's about Java being a perfectly modern and capable programming language.
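    That language-independence is easy to see in code. Here's a minimal, purely illustrative sketch of a singly linked list in Java (the class and method names are my own, not from any curriculum); the push-to-head logic translates almost line for line into C structs and pointers:

```java
// Minimal singly linked list: the structure is the same in Java, C, or
// any other language -- only the surface syntax changes.
class Node {
    int value;
    Node next;
    Node(int value, Node next) { this.value = value; this.next = next; }
}

public class LinkedListDemo {
    // Push a new node onto the head of the list: O(1), exactly as in C.
    static Node push(Node head, int value) {
        return new Node(value, head);
    }

    // Walk the list to count nodes: O(n), exactly as in C.
    static int length(Node head) {
        int n = 0;
        for (Node p = head; p != null; p = p.next) n++;
        return n;
    }

    public static void main(String[] args) {
        Node head = null;                 // empty list
        head = push(head, 3);
        head = push(head, 2);
        head = push(head, 1);
        System.out.println(length(head)); // prints 3
        System.out.println(head.value);   // prints 1 (most recently pushed)
    }
}
```

    In C, `Node` becomes `struct node { int value; struct node *next; };` and `new` becomes `malloc`; the algorithm itself is untouched.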

    Your next analogy of the cable repairmen almost prompted me to moderate your post as +1 Funny, but when I found out you were not joking I decided to write this reply instead. To even equate a cable repair person with a computer scientist is pure madness. Even if they were programmers, how is getting the cable modem working a good metric of "computer stuff in general" being "a lot less like a science or craft and more like a factory job", or even relevant to the discussion of computer programmers vs. computer scientists at all?

    None of your points even remotely explain what you consider the fundamental problem: "why software sucks...why the programming "trade" sucks...why companies can send the jobs abroad to work for peanuts" The fact is not all software sucks, many people love their jobs in the industry, and these people are getting paid well to do their jobs. Most of the computer scientists you speak of don't work in the private sector, you can find them at government [fnal.gov] research [llnl.gov] institutions [ornl.gov].

    To say that these types of people don't currently exist, and that current CS curricula can't produce scientists of this caliber, is nothing short of ignorant.
  • From the soapbox (Score:5, Insightful)

    by DaveAtFraud ( 460127 ) on Sunday March 21, 2004 @12:41AM (#8625242) Homepage Journal
    I have been working professionally in software development for not quite 24 years, with experience in aerospace/defense, established commercial, "dot com", and post-dot-com startup companies, plus I dabble in Linux. Still, this is a series of single data points taken in different industries at different times, so take what I have to say with a grain of salt.

    The worst programming problem is unrealistic expectations on the part of management. What it will really cost and how long it will take are always too much and too long, so budgets and schedules get cut. At least aerospace/defense makes an attempt to figure this out and bid the contract accordingly. The commercial world looks at when the next trade show is, or something else equally irrelevant, and then says it has to be done by then with the staff that's available. They end up getting what they paid for and blaming the programmers when it crashes (see my sig; yes, I do software QA). Established commercial companies aren't quite as bad, but there is still a tendency for making the sale to trump the determination of what can be developed with the time and resources available. The resources may be there, but there is a tendency to try to produce a baby in one month by getting nine women pregnant, and then wondering why there is no baby after the month is up in spite of publishing detailed schedules.

    In contrast, I think one of the primary reasons free/open source software tends to be of significantly higher quality is that these factors don't come into play. A feature or program either is ready or it is not. If it is not, it stays as a development project until it either dies of apathy or enough people are attracted to it to make it into something real. For established projects, you have people like Linus who "own" the project and ensure that contributions only get incorporated if they pass muster.

    I find it amusing that one of the criticisms of FOSS is that its schedules are unpredictable. The reality is that software development schedules ARE somewhat unpredictable*, but at least the FOSS development process recognizes this and focuses on the quality of the program, rather than pretending the uncertainty doesn't exist and coughing up something that isn't really done to meet someone else's absurd schedule.

    * If someone develops the same sort of software over and over again (think IBM) they will eventually gain enough experience to have a reasonable shot at scheduling and resourcing a project correctly. The fewer data points you have, the less likely you are to get it right.
  • by bcrowell ( 177657 ) on Sunday March 21, 2004 @12:43AM (#8625256) Homepage
    They have free day passes, FYI.
    Too bad you can't get the day pass if you don't have Flash installed.
  • by melatonin ( 443194 ) on Sunday March 21, 2004 @12:45AM (#8625265)

    Uh, I haven't read the salon article.

    I think it comes down to a matter of discipline. One of the things that made Unix such a success is KISS. Unix is built around the philosophy that a tool should focus on one thing and do that one thing well. Tools in Unix don't re-invent the wheel to get stuff done; they rely on other tools.

    Another thing that comes to mind is what Apple's VP of hardware engineering Jon Rubinstein said in an interview. He said that engineers get too creative when they work, and end up engineering things that do not need engineering - the rote work, as he put it. You have to get engineers to be creative in the non-rote work.

    IMO, Java is too complicated and easily gets in the way (I'm not going to touch C++ here). At least with C you can start off with a one-line Hello World program. I've seen first-year CS students struggling to understand the many lines of code that constitute a Java Hello World program (there are so many elements to a typical Java HW example: the class declaration, the static main declaration, the exception handler, System.out, etc. Every token except for "Hello, world!" is gobbledygook to newbies). In the case of C, it can go either way. To a well-disciplined programmer, C programs can be extremely elegant. On the other hand, you can end up with some really fugly code. However, a Pascal programmer is more likely to use C in an elegant way, because Pascal is simple enough to make a programmer focus on the task, restricting the programmer's freedom to do stupid things.
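    To make the comparison concrete, here is the canonical textbook Java Hello World (the standard form; the exception handler sometimes added by old textbooks is optional and omitted here). Every token except the string literal is mandatory ceremony; the C equivalent of the interesting part is a single printf() line inside main():

```java
// The canonical Java Hello World: a class declaration, a public static
// void main with a String[] parameter, and the System.out object, all
// required just to reach one print statement.
public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello, world!"); // the only line a newcomer cares about
    }
}
```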

    Many argue that type-safety is incredibly important, that you can get really weird behaviour without it, and that it's important for languages to restrict programmers by being type-safe. Uh, no. Type-safety is like training wheels. If your program is well structured, 'types' in the program are for the most part irrelevant. What matters is what the code is doing, and knowing what the general flow of the program accomplishes. In a well-structured program with a well-defined execution flow, the data that moves around should always be well defined anyway.

    Here's another way to look at it (something an old professor brought up). What useful program has no input and produces no output? There is no such thing. Your job as a programmer is to transform your input sources into your desired output products, and the path to get that done should be as simple as possible. Unfortunately, we've got languages with templates, boxing/unboxing, exceptions, RTTI, etc. as our main tools, and they often provoke us to design systems that are more complicated than they should be.
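    A small illustration of that point in Java (my own example, not from the article): autoboxing quietly wraps primitive ints in Integer objects, and reference comparison then depends on whether the runtime happened to cache the boxed value - a subtlety that has nothing to do with the input-to-output transformation the program is actually performing:

```java
public class BoxingDemo {
    public static void main(String[] args) {
        Integer a = 127, b = 127;    // values in -128..127 come from a shared cache
        Integer c = 1000, d = 1000;  // larger values are (typically) freshly boxed objects

        System.out.println(a == b);      // true: both names refer to the same cached object
        System.out.println(c == d);      // usually false on default JVM settings: two distinct objects
        System.out.println(c.equals(d)); // true: value comparison, which is what was meant
    }
}
```

    The -128..127 cache is guaranteed by the language spec; caching outside that range is implementation-dependent - exactly the kind of incidental detail that complicates an otherwise simple program.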

    Really, it's like an artist trying to make a painting, and going out and buying a self-cleaning, featherweight, self-balancing $1K paintbrush. That's beside the point, isn't it? An artist needs to envision the artwork and realize it in the medium of choice. Similarly, a programmer needs to understand the problem to be solved and then implement a solution efficiently on the target system (be it a PC or an automated toaster oven). Language features don't make programmers write better programs; discipline does. I can't remember the example perfectly, but it's something like: if you give a team 20 tools to do the job, they'll try to use all 20 tools. If you give the team the 4 of those tools they really need, they'll find a solution too - and a more efficient one.

    Anyway, KISS. You need to focus on what execution paths your program takes and what affects those execution paths (note that exceptions throw a wrench into that; I avoid them). Don't get too creative with the language you're using. The best thing for a programmer to do is to learn how to program non-trivial projects in several different languages (Java and C++ don't count as different languages). Then hopefully more programmers will demand cleaner programming environments that aren't just souped-up C++ clones, and programming will stink less. (Take a look at Smalltalk [squeak.org] some time.)

  • by ciggieposeur ( 715798 ) on Sunday March 21, 2004 @12:48AM (#8625278)
    Quoting the article: "Giving the [software architects] tools to shape software will transform the landscape, according to Simonyi. Otherwise, you're stuck in the unsatisfactory present, where the people who know the most about what the software is supposed to accomplish can't directly shape the software itself: All they can do is 'make a humble request to the programmer.'"

    As a programmer who recently stopped working for a very very very large computer firm that sells both hardware and software, let me say that Simonyi's point makes zero sense. Tools already exist to "shape software," and they are known as programming languages like Visual Basic, C, C++, C#, Java, Perl, PHP, Python, etc...

    I'm frankly sick of architects (that's the term for people who say they design software but don't actually design software) who bemoan the gap between their glorious visions and the real products their teams end up producing. These people need to click "Close" on their UML models and go get their hands dirty by writing parts of the production code. Then they'll understand the real-world constraints that their codeless design didn't account for, like internationalization, performance bottlenecks, user authentication, heterogeneous networked environments, and ACID transaction support (to name the first few).

    Oh yeah, and the reason open-source developers wrote a Unix-like operating system (Linux) and put a Windows-like interface on top of it (X11 + GNOME/KDE) is because these are both very reasonable and mature solutions for a variety of computing needs. If any of you architects out there want something besides Linux that conveniently abstracts away 99.9% of the hardware interaction yet also provides an easy-to-learn interface for general users, you are more than welcome to write it yourself. Or you can model it in UML, click some buttons, and hope it compiles.

    Why do I think software sucks? Because market droids and architects who forgot how to program get together and promise their customers AI in only six months.
  • by Doomdark ( 136619 ) on Sunday March 21, 2004 @12:58AM (#8625322) Homepage Journal
    however I will also have to try not to swallow cyanide while dying through Java classes teaching me how to program in a way so dumbed down that even the greatest imbecile can't screw up.

    ...

    This afternoon I made some money writing some boring PHP code, but also worked on my personal projects,

    While I whole-heartedly agree with the points you are making, it's worth mentioning that there's nothing fundamentally wrong with either Java or PHP that leads to boring lowest-common-denominator programming: it's possible to do interesting, advanced, and sophisticated things with both, as well as with their countless alternatives. Except for some elitists who claim one has to use, say, functional programming languages to do anything interesting, or "no pain, no gain" hard-core low-level language fanatics, most truly good programmers understand that it's not the tools that make exceptional advances; it's the craftsmen who use them.

  • by afgates ( 755260 ) on Sunday March 21, 2004 @12:59AM (#8625325)
    I consider this to be improper use of the hard work of the Salon staff. There are only 74,000 paying subscribers to Salon (full disclosure: I am one of them). They provide a nice counter-balance to the Microsoft-sponsored Slate online magazine. So many of the online e-zines have suffered and closed down over the last few years, and the infosphere is poorer for it. As an anarcho-capitalist, I believe that any valuable service or product should be made as easily available as possible, but as a consumer in this arrangement, I have a responsibility to support the creative producer. For example, though I download music for my personal use, when I find artists and tracks to my liking, I go and buy the artist's product, either through iMusic or at a brick-and-mortar store. Salon will allow you to sample some of their content with only a small hurdle. If you find the product useful and to your liking, support them directly. Take responsibility for yourself. Remember, IAAMOAC (http://www.davidbrin.com).
  • by MagikSlinger ( 259969 ) on Sunday March 21, 2004 @01:34AM (#8625520) Homepage Journal
    I hate programming now. I loathe the thought of it. Not because I hate the act of programming, but because of the systems I have to work with.

    Sure, in the nice old days, the C64 and IBM PC were fairly easy to code for, but they also gave you very little bang for the buck. The nice thing was that a couple of hours of programming could produce something nice.

    Now it can take me a couple of hours to write even a simple notepad application from scratch. I'm forever writing lines of code to fill in structures or respond to all the events an API demands.

    The computers got more powerful, and the APIs also got more powerful, but now I spend so much time filling out basic structures that I don't need. I'd rather a lot of that stuff were user-configurable or stored in an XML file somewhere. I don't want to have to know about allocating and positioning fonts! I just want to dump my text in a nice scrolling box.

    It's like a bureaucratic nightmare writing code now. Sure, there's MFC, etc., but that's like the "easy" tax form: the moment you want to do just one thing differently, you're back to square one. And the learning curve, sheesh!

    That's why I like projects like XUL. We've made the APIs so programmer-centric that I can't breathe anymore. I just want to code the important stuff and let someone else make the GUI pretty.
  • by Anonymous Coward on Sunday March 21, 2004 @01:59AM (#8625624)
    Admittedly it was outside the letter of the law to post this outside its home site. But this is the internet. The expectation is to be able to link to another part of someone else's site, so that we have this inter-related "web" of information. The commercialization of the internet with broadcast-paradigm limitations destroys the peer-to-peer, web-like nature of the web. Linking was supposed to be the proper attribution format for the web. The Salon web site (and several other web sites) don't really allow this any more. They are definitely resources available through the internet, but they are no longer really part of the web, as they have chosen not to play by the rules and guidelines of the medium. If they don't want to play by the rules, they can leave.

    As for copyright: copyright was meant to limit copying with printing presses. The world wide web is not what the provision was designed for. I haven't visited Salon since they went with this advertising scheme, and I would not have gone to their site if this hadn't been posted here. I'm reading web pages for news, and if I wanted to mindlessly watch ads as I was told to by my corporate masters, I'd be watching TV instead. If Salon wants to force ads, they should be on TV or radio. This is the wrong format for that crap. I don't think this loses them anything, as many other people here feel the same way and won't even go through the hassle of a free registration to read NYTimes articles.

    After I lost my third Slashdot username, I just started signing my AC posts. If I couldn't post as an AC, I wouldn't post at all. I have better things to do than jump through someone else's hoops.

    Anyway, even the letter-ignoring but otherwise law-abiding Slashdotters understand the bad karma of posting the text directly, but the impetus for doing so was that the submission of this article appeared to be an intentionally subversive advertising ploy: "Submit to a news site so that they will do the work of advertising for us so we don't have to pay for it, then force them to watch ads so that we get paid for it." Even if it wasn't like that, it surely fails to avoid the appearance of impropriety; and as cynical and paranoid as the techno-elite usually are, it was assumed that someone was trying to pull a fast one, so we pulled back.

    I'm glad I got to read it. It was nice. If it hadn't been posted, I wouldn't have read it. Salon gained from this by having content attributed to them that was of decent quality. I'd consider going back if it hadn't been made quite clear that they are still doing the crap that made me start my boycott of them to begin with. If they didn't want the article put up on slashdot, then someone from salon shouldn't have sent it in.

    I have no remorse for a commercial organization failing to reap the full financial rewards of exploiting a non-profit web site, just like I have no tolerance for SPAM or unsolicited phone calls hijacking a communications service for the purpose of advertising against the will of the audience. Not to mention leveraging laws well outside their intended boundaries to bully common citizens into compliance with unfair practices.

    - theed.
  • by Hamster Of Death ( 413544 ) on Sunday March 21, 2004 @02:00AM (#8625630)
    Nobody knows how it should be done. Plain and simple. Sure, there are 50 different ways to shuffle 1's and 0's around to produce something that kind of solves a problem someone may or may not have (customers seldom even know the problem they are paying to solve).

    Add to that the string of PHBs, the development team, and the testing team, and you end up with people who don't really understand the problem solving it with tools that may or may not work and that they might not even fully understand. (Have you proved GCC correct lately? How about .NET?)

    So until we get a method that ties programming to what it really is (problem solving), we get to poke about blindly in the dark for our 'solution' and hope it shuts up the customer long enough for them to write the check. We're slowly getting there, but because programming is still so new, it hasn't been remotely fully explored yet.

    There's lots of room to figure out how to make a computer solve a problem once it's defined. Finding the problem is a major portion of the battle. The rest comes in finding a repeatable, provable way to solve it.

    Until that happens, you'll need your blindfold and poking stick handy.
  • by Stinking Pig ( 45860 ) on Sunday March 21, 2004 @02:21AM (#8625703) Homepage
    Lemme get this straight, doing well in academia is the answer but the only good thing in your academic curriculum is a side research project because your main classes don't teach thinking?

    I'll say this for the English degree, which I highly recommend to any young geek looking at schools: you learn to think, you learn to communicate, and you learn to differentiate shit from shinola. Surviving through a hardcore postmodernist-influenced seminar will prepare you for any amount of corporate meetings.
  • by ebuck ( 585470 ) on Sunday March 21, 2004 @02:24AM (#8625709)
    Well, I took the bait and read it.

    It's surprisingly full of fluff. I admire the challenge to go somewhere new and interesting, but I am equally appalled by the article's lack of a sense of direction.

    It's about as coherent as pointing out that there are 360 degrees around you, and they are all hopeful and promising, then asking you "Where do you want to go today?" while reminding you that you're in your hometown.

    The dismissal of open source as a non-innovator is questionable, and the statements about programming itself not getting better keep me scratching my head. What do you mean by more innovative programming? Compilers won't accept just any kind of creative garbage, and personal expression in the language (a la Perl) has its own limitations.

    These guys should be motivational speakers; the problem is they don't have as large an audience as, say, people with financial troubles, grumpy employees, or people with weight problems.

    Lack of industry-redefining innovation is an indicator of maturity in computing science. Innovations become small steps forward, and are no longer the cataclysmic leaps that existed in the past.

    Look at Gnome as (only one) example. They changed their default browsing mode to a spatially oriented one. It's innovative, but it's not as big a leap as, say, going from the command line to a windowing interface. Arguing that it's not a big enough innovation to have real merit implies that the early pioneers' breakthroughs elevate them to a kind of revered super-programmer status.

    It's easy to be innovative via discovery in a field that hasn't matured; it's a lot harder when millions are working alongside you. I'm not trying to diminish their hard work and effort, nor discount the magnitude of their accomplishments, but to stretch my analogy (a bit too far): the fact that nobody has discovered a new continent recently isn't the fault of less innovative map makers.
  • by jjohnson ( 62583 ) on Sunday March 21, 2004 @02:26AM (#8625720) Homepage
    I think comparing the progress of software development to the progress of hardware development is a fundamental mistake. Moore's Law depends upon the properties of the materials we're using and our ability to exploit them--chip fabrication, heat conduction, power consumption--not upon our ability to design chips. It's not the chip layout that's improving; it's our ability to milk what's already in the materials, which yields exponential growth. We're not responsible for that growth curve; the materials are. Doubling the length of our lever gets us the ability to move four times the weight.

    Software, on the other hand, is all about design. Of course it's not going to double in power every 18 months--our design ability isn't doubling every 18 months. If the computing power of our hardware were limited by our chip design abilities, it would be improving just as slowly.
  • by PopCulture ( 536272 ) <PopCulture@@@hotmail...com> on Sunday March 21, 2004 @02:32AM (#8625746)
    which notation is nothing but an irritant when code is well-structured

    which happens how many times when you buy software from anybody.

    which happens how many times when your own development team is faced with insane deadlines and unrealistic specs.

    Which happens how many times in the post dot com boom?
  • by Milo77 ( 534025 ) on Sunday March 21, 2004 @02:32AM (#8625748)
    I agree - coding well isn't easy, and my CS degree only half prepared me to code well. Yeah, I learned a lot of theory and understand more or less exactly what's going on on that processor, and this has enabled me to write some pretty clever code. But I've also learned that cleverly written code (code that's "better" in a purely academic sense) can be some of the worst code. It reminds me of that awful hacker creed: "it was difficult to write, it should be difficult to read." It's sad, but true for most of the "software developers" I've met. They'll write terribly clever code, but in the end they've done a disservice to the project's long-term viability. I saw another quote this week (off the UNO website, via Slashdot earlier this week) [paraphrased]: "the code will get written once, but read many, many times - if you make it easy to read, you and others will benefit in the long run." (You get the gist.)

    It is sad what we've allowed to happen to software engineering - a disgrace really.
  • by noonien_soong ( 723097 ) on Sunday March 21, 2004 @02:53AM (#8625815)
    Let me get this straight. You're 19 years old, you've played around with computers a little bit "in your day," you've read some Knuth, and you're smart enough that you've come to see yourself as superior to the average blue-collar worker. This gives you so much insight into the world of software engineering that you can discern exactly what the problem is---programmers aren't as smart as you are, and if they were, software wouldn't suck. Would you say that's a fair characterization of your argument? Because that's how you come off.

    The first thing you have to realize is that software is being written at a much higher rate now than it was back in the days when programmers were physicists and mathematicians. That's because it no longer takes a rocket scientist to write a program, and that is a Good Thing (TM). If the programming learning curve hadn't come down, we wouldn't be living in the wonderful information society that we are today, because there simply aren't enough rocket scientists to hammer out every PHP script and database app one might desire. The technical aptitude of the people working on the foundations of computing is certainly not degrading as programming becomes more democratized, because those people's talent and skills are not forged in some intro programming class. Every hacker knows he learned his art outside the classroom. So it's not exactly logical to assume that colleges could turn out a crop of brilliant innovators if only they used Haskell, read Knuth, and taught all their CS students physics and numerical analysis on the side. Clearly, what would actually happen if one simply cut the lowly Java-bred programmers out of the programming craft is that a lot less bad software would be written, a little less good software would be written, and all of it would cost more. That's not exactly an improvement.

    None of this is to say that there are not pervasive (and addressable) problems in modern software engineering. Those problems are simply much more endemic to the state of the art of programming than they are to any particular group of people. As many have pointed out, software today is brittle. It is frequently opaque, offering users and programmers alike only the most rudimentary means of debugging. The bread-and-butter software used by everyone every day is often monolithic, designed for one purpose and impossible to customize without intensive study. Think about it for a moment, and you'll realize that much of the functionality of a typical document-editing application is duplicated in any other. In principle, such functionality could be factored out. But I can't digress too far into that here - let me continue with the list of reasons why software sucks. Integrated development environments give programmers a view of a project that scales poorly with complexity; software is incredibly difficult to build from source (unless you use Java--heaven forbid that should find its way into the bubble of intellectual purity you inhabit); and perhaps worst of all, the design decisions and architecture of software are usually not expressed clearly anywhere except in source code, where they are obscured by all manner of syntactic complexities, compiler optimizations, and details that aren't significant to the overall intent of the code. These things--all the things that make software complex, which make it difficult for groups to work together on large software projects (as you would understand if you'd ever worked on one)--are some of the real hurdles to be overcome in software engineering. ITT Tech and outsourcing to India are NOT the problem.

    I haven't said much about how to solve any of these problems. But I've said a lot, so I'm going to stop now. I highly encourage you to get some more experience and perspective before you make sweeping and arrogant generalizations. College-aged know-it-alls with overblown rhetoric are a dime a dozen. Real problem solvers are rare.

  • by ebuck ( 585470 ) on Sunday March 21, 2004 @02:58AM (#8625837)
    Agreed, good programming is not easy, and many more of the avenues to enter the field should require more basis in theory (language design, automata, OS internals, compilers, underpinnings of good database design, etc.)

    But good programming should be less complex now than it was before. That's the whole impetus of language design. That's the reason that the last of the big languages to roll out is the same "simple" Java bashed in a few previous posts.

    In a previous post, I couldn't fathom the divergence of thoughts that denounced Java as a language while espousing the really cool C++/OpenGL stuff out there. C++ has a syntax that's unwieldy and awkward, mastered by comparatively few, and full of "compatibility" weaknesses shared with its older brother, C. It's almost as if it was thrown in there subconsciously to say, "Look, I am an uber-elite programmer. OpenGL and C++. Watch me whine as I use something that has a clean, clear syntax."

    I'd hate to hear him gripe about Pascal.

    "Really cool work" can be done in any language, and the proliferation of languages shows that there are many solutions to the same problem.

    His bashing the language for its simplicity was about as insightful as bashing good error checking, array bounds testing, or enforced garbage collection. Technological improvements can lessen the impact of these annoyances, but when a language's design is flawed, only deep education of the masses (as in: don't do this, you'll regret it) can save the language.
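    To make the array-bounds point concrete (a sketch of my own, not from the parent post): in C, reading one element past the end of an array silently yields garbage; Java's mandatory bounds check turns the same mistake into a precise, catchable error at the faulting index:

```java
public class BoundsDemo {
    public static void main(String[] args) {
        int[] xs = {1, 2, 3};
        try {
            System.out.println(xs[3]); // one past the end: C would happily read garbage here
        } catch (ArrayIndexOutOfBoundsException e) {
            // The runtime pinpoints the bad access instead of corrupting memory
            System.out.println("caught out-of-bounds access: " + e.getMessage());
        }
    }
}
```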
  • by slamb ( 119285 ) on Sunday March 21, 2004 @03:02AM (#8625849) Homepage
    You refer to the cable guys as if they are the epitome of computer science. They aren't computer scientists. They almost certainly aren't even programmers. Perhaps to the completely ignorant, all computer-related jobs are the same, but they aren't. Most jobs as a technician are crap.

    Agreed.

    Slightly above that would be the post of admin. Keeping something up to date. Installing new software. Above that, some network and system admins have interesting jobs designing new systems, implementing creative solutions to problems, and so forth. Programmers have a similar opportunity, to do creative coding, but often it's just another solution to another problem. Not something that sounds like a lotta fun. And above that would be computer science. Research. Whole different ball game.

    Here's where you lose me. I don't agree that computer science is "above" programming. In fact, I'd say that programming is the union of computer science and software engineering. Superior programming requires contributions from both fields.

    Software engineering is nothing to sneer at. It encompasses version control, coding style, rigorous specifications, code review, bug tracking, API documentation, user manuals, user interface design and testing, unit/regression/acceptance testing, etc. There's some real artistry involved. It's easy for us programmers to neglect these, saying that they aren't hard but that we don't have the interest or resources. But in reality, when I really try to do even the most seemingly mundane of these tasks, I find there's a lot more skill involved than I previously realized.

    The experience has also made me skeptical of the idea of farming software engineering tasks off wholesale to specialized people. For example, writing good API documentation requires the involvement of the people who designed the interface. Likewise good user manuals require the people who designed the UI and administered the user tests. Pure technical writers can maybe even write the bulk of the prose, but if they can't do what they document, they don't have a prayer of noting subtleties without help.

    Summary: I think actually designing, implementing, and even proving the correctness/efficiency of algorithms is a much smaller part of the whole than we like to admit. The other tasks are not only valuable and difficult, but should also be done by the same people.

  • by ExEm2SS ( 763982 ) <jhcrump@gmai l . com> on Sunday March 21, 2004 @03:02AM (#8625850)
    Okay, I just had to respond to this post. You're young, so it's expected that you're going to be opinionated and judgmental. I had that luxury too when I was your age.

    I am an ITT Tech graduate, and I have been a computer geek since 1995, when I bought my first computer. When I was working in Portland as an electrician, we used laptops for troubleshooting. We needed these laptops so that we could plug them into our machines' PLCs (Programmable Logic Controllers). Unfortunately, laptops and industrial environments don't mix. Although we asked management to buy us industrial laptops, they refused, which meant that we would usually get the sales force's old laptops, which happened to be in great supply. These were old IBM ThinkPads, and their life expectancy was about three months once they were subjected to the harsh environment. OK, no problem. If a laptop dies, pop out the hard drive and stick it in another laptop. Unfortunately, our system admin could never figure out how to open up the laptop and remove the hard drive. Finally, after about the third time showing her, I decided it was probably time for me to change careers. Here I am, an electrician, showing a University of Washington graduate how to replace a hard drive in a laptop. Go figure.

    Anyhow, I knew that I couldn't get into IT based on my background. Even though I played around with Linux, I needed to have that magic piece of paper. Guess what: this was during the dot-com boom, when every Tom, Dick, and Harry got an $80,000-a-year job (or so it seemed at the time) and became an instant millionaire (on paper). I figured, what the hell, I didn't have the time to go to a four-year university, and ITT seemed to have a respectable name. (Little did I know.) Anyhow, it seemed like a fast track to a career that I would enjoy.

    Anyway, I got to ITT and started their CNS (Computer Network) program. However, I was always interested in programming, so I switched to the SAP (Software And Applications) program. While it's not a Stanford education by any means, it wasn't bad as long as you were willing to work hard and apply yourself. It didn't really go into any advanced topics, but it did cover the fundamentals. We had two Java classes, a Linux class, two VB classes, two C++ classes, and a Data Structures class using C++. We also had two software design and architecture classes. Not university level, but pretty solid. I guess I was lucky; I had good instructors. Four of them come to mind. One had a PhD and worked for NASA on the space shuttle program. Another developed missile guidance software for the Air Force. The third worked for IBM. The fourth, well, she didn't know shit from shinola.

    While in school, I was able to get a job as a customer support rep at a very, very tiny ISP: 800 customers, three employees. Anyhow, the guy they hired as the system admin got fired for incompetence. My boss found out that I played around with Linux and made me the system admin. That, and I also worked cheap. Let me tell you, I learned FreeBSD the HARD WAY!!!! As in WHAT NOT TO DO!!! As in: the web server's down, and you're feverishly reading the FreeBSD handbook and parsing the newsgroups to get the freaking thing up and running. Oh yeah, also explaining to your customers why your web server's down. The job sucked and so did the pay, but I got a lot of experience real quick. Pretty soon my reputation got around because I gave the customers good service, and I was able to land a few minor consulting jobs.

    Fortunately, after I graduated from ITT Tech, I was lucky enough to land a programming job, as a Microsoft Access report writer (Yuck!). Mind you, I did my final software project at ITT using Java and MySQL so I wouldn't have to work with Access!!! However, in this economy, beggars can't be choosers, especially when you have a two year degree from ITT Tech. Anyway, after working with Microsoft Access and exploring VBA's capabilities, I found that Access is a very use
  • by The Snowman ( 116231 ) * on Sunday March 21, 2004 @03:02AM (#8625851)

    If you think this post had a lot of obscenities, you should have seen the email I sent to Salon.

    Salon writes good articles for the most part, and they have had rough financial times. If ignoring an "internet commercial" means they get money to keep writing articles such as this one, so be it. I do not like it, but I understand they are just playing the system. I understand economics well enough to know that this is a necessary evil.

    Now, as to your comment, please try to be constructive. I communicate with companies and my Congressmen on a regular basis, by both email and snail mail. Some of the issues are very dear to me, such as the letter I wrote to Dell berating them for laying off American call center employees to outsource to India. Not once in my letter did I swear or come across as uneducated, unintelligent, or uncaring. Swearing and other vulgar language does nothing to help your cause. Be constructive. Do not just say "ads suck"; provide them with a solution to your perceived problem. I think you will be hard pressed to do this, however. Their solution is very good for their company and not too intrusive to their readers, even if it is still an intrusion.

  • by jhoger ( 519683 ) on Sunday March 21, 2004 @03:20AM (#8625911) Homepage
    It makes variable names butt ugly.

    The farthest I am willing to go is to end my pointer variable names with the letter p, and pointer-to-pointers with pp. Array names don't get a suffix, since they're not really pointers.

    It makes the code leaps and bounds clearer without the hideous ugliness of Hungarian notation. And pointers are where the bad errors are, so that's where the bang for the buck is...
  • by morphage ( 62416 ) on Sunday March 21, 2004 @04:01AM (#8626070)
    If one compares the industrial revolution with the so called "information age", we are somewhere in the early 1800's. The industrial revolution resulted from the experimentation of scientists with formal training and tinkerers with informal training. The information age is no different. During the first few decades of the information age, those who worked with computers were both scientists and engineers with formal training, as well as tinkerers with informal training.

    After the foundation of the industrial revolution was laid, two professions emerged: the engineer who designed the machines, and the mechanic who maintained them. Some of the previous posts noted with dismay the similarities between blue collar workers and graduates of technology programs and certificate holders. Computer science and engineering is still barely 40 years old. Since the demand for programmers is still very large (even if the demand is being met in India), simple jobs are being delegated to programmers whose training lacks theory. These are the mechanics of the information age, and they have their place.

    The "technician programmer" is on the same level as those technicians who obtain degrees in "engineering technology". The technician programmer is well suited for cranking out small system administration scripts, coding SQL, creating database front ends, and developing websites. Like machinists, they work with tools that are comparatively simple and repetitive in nature. Occasionally, a complex problem requires a new method of applying the tool. A machinist can make engine parts but can't design an engine, in the same way a technician-level programmer can create a database but not the database engine.

    Unfortunately, the tech schools try to convince both their graduates and businesses that technician programmers are able to do more than this. In the current economy, businesses would much rather pay a tech school graduate to customize an off-the-shelf solution than an unemployed programmer trained as a scientist or engineer. However, there will always be jobs for systems programmers and software engineers, as long as someone is willing to pay for new ideas. The difference between the "technician programmer" and the computer scientist and engineer needs to be recognized, just as the difference between the appliance repairman and the electrical engineer, and between the mechanic and the mechanical engineer, is recognized.

    We have not seen the true next generation of computers. Miniaturization and speed increases are the results of advancements in materials science and electrical engineering. Computer science and software engineering are still using ideas from the first generation of technology. The majority of computers (PCs) use a von Neumann architecture, with software written in languages that are procedural, even if support for classes and objects is included. Classes and objects simply provide a method of abstraction that enables the problem to be approached in a modular manner. Not that this is bad, but the models and techniques of computer programming are based on the limitations of hardware technology from the 1960s and '70s. The tradeoff between performance and ease of use is still an issue in software engineering. The next generation of computers may eliminate the constraints of current methods, as well as introduce new constraints. It's all part of an ongoing process called technology. This is a good thing.
  • by arkanes ( 521690 ) <arkanes@NoSPam.gmail.com> on Sunday March 21, 2004 @04:24AM (#8626135) Homepage
    It's useful for what it does, but it does make code damn fugly (and near-unreadable, for me), and there are far better tools now for figuring out what a variable is.
  • by arkanes ( 521690 ) <arkanes@NoSPam.gmail.com> on Sunday March 21, 2004 @04:26AM (#8626139) Homepage
    Try actually reading what he's pissed off about - the ad he got required his interaction before he could continue.

    You should also remember that this isn't a letter to his congressman or any other such thing, it's a pissed off rant on slashdot, and as such swearing is just as acceptable as it would be if he were blowing off steam in a pub.

  • by pipacs ( 179230 ) on Sunday March 21, 2004 @05:46AM (#8626319)
    It makes variable names butt ugly.

    Another problem is that if you change types, you have to change the names, too. Or if you don't, you end up with completely misleading code.
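    A hypothetical C fragment showing how that goes wrong; the names and the tax rate are invented for illustration:

```c
/* Originally this was: int iTotal = ...;  -- the "i" prefix matched the
   type. Later the type was changed to double to hold fractional amounts,
   but nobody renamed the variable everywhere it appears: */
double add_tax(double amount)
{
    double iTotal = amount; /* prefix now lies: "i" says int, type is double */
    iTotal *= 1.10;         /* 10% tax; would silently truncate if int */
    return iTotal;
}
```

    A reader trusting the prefix would assume `iTotal` truncates, and renaming it in every file it touches is exactly the maintenance cost being described.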

  • by mabu ( 178417 ) on Sunday March 21, 2004 @05:59AM (#8626352)
    The problem is not the art, it's the "artists".

    Programmers are analogous to lawyers now. It used to be that passion and a genuine interest were why most people were in this business. Now most people arbitrarily pick CompSci because they think it will give them career stability, and really giving a damn about the art of programming doesn't matter much. So like lawyers, you have this new breed of people in the industry who are just there for the money and have no appreciation for the work and the accomplishment. You don't see lawyers trying to use their craft to change the world... you see them chasing ambulances. Likewise, you don't see programmers these days trying to make things better... you see them promoting ASP, Java, PL/SQL, and a horde of other get-bys so they can collect their check and move on.

  • by Anonymous Coward on Sunday March 21, 2004 @06:07AM (#8626378)
    If you are serious about the C++ part, the problem is that he'll have gotten the diploma, raised a family, watched his children grow, retired, and died before he can start learning Java that way.

    Weren't you advocating simplicity?
  • by prockcore ( 543967 ) on Sunday March 21, 2004 @06:17AM (#8626401)
    even without scrolling your editor up a couple pages to find out; papszEnvironment would likewise tell a Win32 devotee that it was a Pointer to an Array of Pointers to Zero-terminated Strings.

    No it wouldn't. It would tell a Win32 devotee that it started out that way... it may not be that way now.

    Look at how many "lp" variables are in the win32 headers.

    Hungarian Notation is the most horrible concept ever because it always ends up lying. I bet that's why MS is so slow to fix buffer overflows... changing a variable from an int to a long is an arduous process.
  • by lokedhs ( 672255 ) on Sunday March 21, 2004 @06:34AM (#8626445)
    Every couple of months some know-it-all, usually with a few degrees in CS, comes out saying that "programming sucks", "we haven't evolved in 20 years", "we need better tools that can automate things", and usually finishes off with "this can't go on! We're working on a tool that will transform programming!"

    Then you usually don't hear from them again. Want to know why? Because they're wrong.

    The fact is that regardless of what methodology is used when developing software, in the end you are simply giving the computer instructions on what to do. No matter how many layers of tools you try to add on top of this, in the end you want to give the computer instructions on how it should solve the problem at hand. What it all boils down to is that the more complex the problem is, the more detailed your instructions to the computer must be.

    Allow me to give an example: If you have some calculations to perform you can do that in a spreadsheet app, but when your formulas grow more complex you start scripting the spreadsheet. After a while even that isn't enough and you write a separate VB (or other scripting language) app to do this for you. Again, the problem might grow to the level that even your scripting language can't handle it, and you sit there with a full app implemented in Java or C++ which solves your original problem. If you happen to be a CS professor, you will start thinking: "Why did I have to write this app to solve this simple problem? Programming sucks! We haven't evolved in 20 years! I'm going to write an app that takes the complexity out of programming!", you will publish an article on this, and then you'll spend the next couple of years trying to solve a problem that doesn't exist.

    Are you old enough to remember the craze about 4GL? The reasoning behind that is exactly the same as what Charles Simonyi says:

    Giving the former group tools to shape software will transform the landscape, according to Simonyi. Otherwise, you're stuck in the unsatisfactory present, where the people who know the most about what the software is supposed to accomplish can't directly shape the software itself: All they can do is "make a humble request to the programmer." Simonyi left Microsoft in 2002 to start a new company, Intentional Software, aimed at turning this vision into something concrete.

    Right. Bringing programming to non-programmers. Think about it. Does it make sense? As soon as you start programming, you ARE a programmer. Why, then, would you want to limit yourself to a limiting point-and-click tool? This is where 4GL failed. While making it very easy to connect a few database tables together, the real business logic was hard or even impossible to create, and the resulting apps were extremely difficult to understand and impossible to maintain.

    One of the best tools to help with programming in recent years is, in my opinion, IDEA [intellij.net]. It's a Java IDE that doesn't assume the programmer is stupid and doesn't understand programming, but rather automatically creates the boilerplate code for you while you write the code. You still work with the code, you just don't have to write all of it.

    There's an enormous difference between IDEA and the "4GL mindset". While IDEA acknowledges that the best way of writing code is by typing in the instructions that will actually run, the 4GL mindset assumes that people are incapable of learning this and need fancy icons instead. Allow me to clarify: icons are not a good way of representing computer code.

    It feels to me that the people who claim these things have realised that they are not the world's best programmers. They realised that programming can be hard, but instead of acknowledging this and trying to be better, they decide that it's the "state of programming today" that is at fault. That if they don't know how to write good code, it's got to be the tools' fault. It couldn't possibly be that some non-academics can be better programmers than them, now could it?

    So

  • by arkanes ( 521690 ) <arkanes@NoSPam.gmail.com> on Sunday March 21, 2004 @07:41AM (#8626567) Homepage
    I don't care if you use it or not, but I thought we were talking about the relative merits of it. I prefer descriptive and readable variable names, and I rely on automated tools and an IDE to show me the type if my memory fails. Note that I'm familiar with Hungarian notation and I still think it fails the readability test - it's messy and distracts from the variable name which is where the real meaning is.
  • by sonamchauhan ( 587356 ) <sonamc@PARISgmail.com minus city> on Sunday March 21, 2004 @08:05AM (#8626619) Journal

    > > > If you think this post had alot of obscenities,
    > > > you should have seen the email I sent to salon.

    > > Now, as to your comment, please try to be constructive.

    > Try actually reading what he's pissed off about - the ad
    > he got required his interaction before he could continue.

    So? It's a *Slashdot* slip-up, not a Salon one. If he was angry, he should have berated "timothy" (the Slashdot editor of this story) for wasting people's time pointing to the Salon article without stating it wasn't free (this probably means timothy subscribes to Salon.) And then, maybe, also scold the story submitter (who was quite probably another Salon subscriber.)

    Instead this guy blows up, spews profanities on Slashdot, *and* sends a (much worse) attack letter to ... Salon!?! He is an idiot.

    > ...pissed off rant on slashdot, ...as acceptable as ... in a pub ...
    Being angry at somebody for no reason is not acceptable - anywhere.
  • by Ed Avis ( 5917 ) <ed@membled.com> on Sunday March 21, 2004 @08:20AM (#8626657) Homepage
    I feel that if you want to use a notation where the type of a variable determines part of its name, then this should be checked by the compiler. Furthermore, it should be easy to strip off the type prefixes when editing code (for those who don't like them) and add them back, since they can be automatically determined from the variable's type.

    Manually prefixing each variable name with redundant information is the kind of extra work that I'd rather have the computer do for me.
  • by Anonymous Coward on Sunday March 21, 2004 @10:34AM (#8627046)
    I have been working in industry for a bit over 7 years and have made a few observations about what I believe makes software suck:
    -People try to rely on process to fix 'common sense'. By common sense I mean being careful, meticulous, and using one's brain for each situation one comes across. Granted, I work in a large corporation and this may not apply to some of the smaller companies, but I have noticed that when we find a bug, or have some sort of other development issue, management tacks on more process to fix it. It's of course normal for people to make mistakes but sometimes you just have people that continually use poor judgement - get rid of those people and get others who can do the job right!
    -Today's engineers have a large tendency to overarchitect, doing MUCH more than is required; overthinking what possible changes may occur in the future and designing code around that idea (this idea isn't bad in and of itself, but I have found people get carried away in this area) - what happened to the KISS principle?
    -I may sound horribly outdated, but I have serious questions as to whether OOP has bought us anything as an industry. Sure, when used properly, I believe it can have some benefits. BUT I think it gives the programmer so many powerful tools that incompetent programmers (of which there are many) turn those tools into powerful weapons. Most of the projects I have dealt with have been C based (~90%) - the rest, some sort of OOP (the remaining 10%). Even though the quantity of procedural code greatly outnumbered the OOP, the two most confusing, sh*ttiest pieces of garbage were OOP - I don't think this is a coincidence. I have found that people can learn the techniques and tools of OOP, but often they fail to understand the philosophy and why it was developed in the first place. Abuse of OOP creates mass amounts of crap code that needs to be maintained.
  • by Anonymous Coward on Sunday March 21, 2004 @11:03AM (#8627165)
    The problem isn't the tools. The problem is that people don't know what to do with them. It's either been done before, or you need to invent it. How are different tools supposed to ease invention or make people more creative? Ever notice how when you *do* encounter someone with a creative spark, they manage to get the job done with the tools available?

    I would take this kind of bullshit seriously if someone could present the following: "I have a great idea, and here's what it is, but I can't implement it well using the tools available." Basically, all we're hearing from these folks are excuses as to why they haven't accomplished anything.
  • by Anonymous Coward on Sunday March 21, 2004 @11:14AM (#8627208)
    Somehow I don't believe this person at all...
  • I'm frankly sick of architects (that's the term for people who say they design software but don't actually design software) who bemoan the gap between their glorious visions and the real products their teams end up producing. These people need to click "Close" on their UML models and go get their hands dirty by writing parts of the production code. Then they'll understand the real-world constraints that their codeless design didn't account for, like internationalization, performance bottlenecks, user authentication, heterogeneous networked environments, and ACID transaction support (to name the first few).

    While I generally agree with your point and have spent a great deal of my own time and energy over the years ranting in much the same way about reality-free architects, I think you also need to consider the other side of the story. Just as you get tired of architects who stay up at the 30,000 foot level and never come to earth to code, architects get tired of programmers who are grubbing around in the dirt and refuse to look at how their little piece affects the whole. Any project of significant complexity needs people working at both high and low conceptual levels. A message-formatting module might be important, for example, but still a small part of the overall design. It shouldn't consume huge amounts of (human or machine) resources, or drive interface definitions throughout the rest of the system, but all too often the guy writing the message-formatting module acts like it's The Only Thing That Matters. It's an architect's job to remind such people that there are other things that matter too. There are other modules, and other considerations, and of course the features that users need and developers occasionally forget. Some of those considerations, like security or performance, are of immediate concern to developers, while others involve different constituencies.

    No you can't do it that way, the architect has to say, because it complicates recovery or doubles the number of scenarios QA has to test or requires an overhaul to the documentation or leaves our support people hanging as they try to explain a weird program limitation to an unsympathetic customer. Of course the person who just wants to get their one piece done doesn't like to hear that, and they go off grumbling about architects who don't understand how hard it is to write a decent message-formatting module. Screw 'em. Somebody has to look out for how all the pieces will eventually fit together or else you end up with a dozen brilliantly-coded modules that don't work together. Been there, done that, seen projects and companies fail because of it. Yes, there are reality-free architects who spend way too much time with their nose stuck in the latest $10K piece of UML-diagramming junk and don't have Clue One about how anything actually works. Maybe they're even the majority of architects, and every one of them should be shot (except that shooting's too quick). However, there are also architects who can code as well as anyone else on their team in any of a half-dozen languages, and who would love to prove it day in or day out, but who end up being architects because they're the only ones on their team who can do anything but code. More projects fail due to a lack of architecture than due to an excess of it.

  • by GooberToo ( 74388 ) on Sunday March 21, 2004 @01:15PM (#8627926)
    I remember reading DD mag years ago. They had a series of articles on why HN sucked. For me, the ones that really stuck in my head were:

    o When properly used, it created many ambiguous meanings, which defeated the whole purpose of using it. Most of these ambiguities centered around various pointer constructs, which is supposedly one of the major areas it attempts to address. Basically, HN is broken right out of the gate.

    o It often required more typing, which slowed development and increased typo error rates. Which, in turn, required another round of typo fixing and compile attempts.

    o While HN provides "instant" type information, it's often harder to read, especially for complex types, requiring much longer than an "instant" duration to comprehend or extract the information that it's attempting to provide.

    o Maintaining code is very problematic. Changing the type of a variable may result in changes in countless files. This increases development time. It also creates a lot of garbage in RC logs for what might otherwise have required changes in as little as one file.

    Long story short, the disadvantages of HN do not justify what is basically a single reward, when it actually works. Basically, there are smart programmers that use good names which help convey the same types of "instant" type information and then there are bad programmers that insist on using HN.
  • by 1iar_parad0x ( 676662 ) on Sunday March 21, 2004 @01:29PM (#8628020)
    Just a random, but somewhat related thought....

    IIRC, Stephen Wolfram has argued that there exists a Principle of Computational Equivalence. (Kolmogorov and Chaitin said similar things in a more rigorous way.) In short, PCE says that for certain phenomena, any computation describing them would take about as long to run as the phenomena themselves take to occur. In other words, massive parallelism or some other intrinsic property makes some phenomena so computationally intractable that it would be just as quick to measure the phenomenon as to compute it mathematically. Algorithmic Information Theory says that some data is so random that the program to describe it is of a similar message length (in terms of information) as the actual data itself. Thus there is a hidden complexity in the data.

    I wonder what Kolmogorov's view of Brooks' Mythical Man Month would be. Could it be that software is complex because software is Complex? Perhaps we've reduced software to its most reduced form. In other words, maybe there exists no more efficient way of describing the phenomena we wish to describe. Is the data we describe in the business world so random that no set of instructions can reduce it by any significant order?

  • So the question of course becomes--how do you dodge the bullet of crap positions? Doing well in academia is probably unfortunately the best solution.

    I very much doubt that. The #1 method of succeeding in ANY business, whether it's programming, engineering, or even performing, is the ability to market and sell what you do. All the skill in the world isn't going to count for aught if you can't sell what you do... and if you can't program worth beans, if you can sell, then you're going to be able to make big bucks on crap.

    Hey, if it can work for Bill Gates, it can work for you!
  • by gnu-generation-one ( 717590 ) on Sunday March 21, 2004 @03:41PM (#8628645) Homepage
    "Sorry, could have been clearer. I was talking about commercial websites (like Salon)"

    Yep, it's pretty awful watching people like Salon trying to operate a website -- kind'a like watching an amateur at anything failing.

    For any other website, it's surprisingly simple. Write something interesting, put it on the web, and people come to your site and read it.

    Then newspaper companies try the same, and completely fail.

    They design a subscription system, spend years sorting out credit-card handling, account management, setting up the website to track people, setting up systems to stop images being linked to, setting up referer logs and browser-detects, setting up banner advert servers, loggers, and systems to charge advertisers. Setting up systems to convert simple text stories into flash or images or 10 frames, or anything to make it more difficult to copy.

    And then after a hundred thousand dollars have disappeared into website design, start copying stories from the newspaper onto the website. And wonder why so few people are jumping the hurdles to get to any of their content.

    Then having a meeting, wondering why nobody's subscribed, and spending thousands more dollars setting up a system where people can watch an advert and do something to prove they read it, and setting up a temporary ID and making it so people can use that ID to read content, and doing the logging and the tracking and the cookies to make sure that only one person can use each ID at a time.

    And they wonder why the website isn't making a profit.

    Eventually they get a few thousands of subscribers, and rack their brains trying to think how to make the site more profitable, even as they spend 60K per year on DBAs to handle all the user-ID and user-tracking databases, and the same again on support staff to handle lost passwords and account cancellations.

    Yep, it's like watching a newbie bricklayer's wall falling down each time he builds it and wondering what's wrong. So bad to watch, you almost feel compelled to go over and offer them a clue. But they'd never take it. They know what they want, and it's big-budget.
  • by Anonymous Coward on Sunday March 21, 2004 @03:57PM (#8628707)
    If you change the type of a variable, you should look at every place where that variable is used and consciously confirm that the code is still valid. This is true whether you use Hungarian or not. It is not a good thing to be able to change a variable type in a header file and leave it at that. Every module that is changed then has a newer date and will be flagged for a new round of testing. Facing all this may make you decide that you don't want to change a variable type and make you think a bit harder about the original decision.

  • by OnanTheBarbarian ( 245959 ) on Sunday March 21, 2004 @06:52PM (#8629460)
    How does this get modded "insightful"? I'm sort of tired of people parading around their ignorance about programming disguised as 'good design sense'.

    "Type-safety is like training wheels."

    Yeah, right. One day you graduate to the level of Supreme Programming God, with the power to effortlessly remember the exact types of all the "void *" pointers you left lying around. And of course, you don't need type safety when interfacing to other people's code, because you're just so damn intelligent, you instantly understand everything about their code and don't need those 'training wheels'. Because lord knows we all understand exactly what the other programmers mean and don't need no steenking type system to communicate what should and shouldn't be legal.

    "Unfortunately, we've got languages with templates, boxing/unboxing, exceptions, RTTI, etc as our main tools, that often provokes us to design systems that are more complicated than they should be."

    I love Slashdot for these kinds of broad-brush statements. Which language uses exceptions or RTTI as its 'main tool', exactly? And who the hell thinks exceptions and templates are more complicated than implementing the same features _without_ exceptions or templates?

    My guess: someone who has never written a large program* or a program that actually requires generic programming or sophisticated exception handling (yes, sometimes it is OK to just print an error message and quit). The stuff about type safety also suggests near-total lack of experience with programming large systems with portions of the system written by other people.

    Also, absent turning newlines into spaces, how do you write a properly formatted 1-line "Hello World" program in C? If you're going to complain about Java's main declaration and System.out, you should at least take into account C's main declaration, and the equally inscrutable (to a first-year student) '#include' line.

    * Pre-emptively: to the people that always post about how you don't actually need to ever build large systems, because you can always build elegant collections of small, gem-like, perfect tools, please close your Web browsers and finish your sophomore CS homework.
  • by Barryo_Stereo ( 546123 ) on Sunday March 21, 2004 @09:59PM (#8630464)
    Since I haven't yet seen my main objection to HN posted, I'll add it in for the archives: What do you do first and most when reading code? Try to figure out what is happening at that point. The first thing you should run into is a variable name that is descriptive and will help this out. Only after all else fails, when debugging, would you check the type. HN is backwards in that the type appears prominently first and slows down the normal understanding of the purpose of the code. Not a big deal, but one more thing to make debugging more difficult.
  • by SlashdotStu ( 721099 ) on Sunday March 21, 2004 @11:58PM (#8631152) Homepage

    And who says that a program that experiences fundamental errors but limps along anyway is a good thing? Is it going to hose my data?

    I cite the Rule of Repair [faqs.org] (Repair what you can -- but when you must fail, fail noisily and as soon as possible.) from "Basics of the Unix Philosophy".

  • by miu ( 626917 ) on Monday March 22, 2004 @03:02AM (#8632101) Homepage Journal
    I agree with the philosophical point; the problem is that highly configurable systems are subject to a wide variety of fundamental errors. I've caught flak on several occasions for implementing "early verbose" failure; the powers that be want systems to deal with the error and "figure out what to do". This often results in some pretty heinous code, but the servers must roll.

    So a desire for "self-healing" is motivated more by the bottom line than any sort of engineering principle. (The question of good engineering being a necessity for good business has been fought and lost too many times for me to care any more)

  • by yason ( 249474 ) on Monday March 22, 2004 @04:28AM (#8632403) Journal
    Why Programming Still Stinks

    Because it IS hard and has very few ubiquitous engineering disciplines and practices to support creating a distinguished craft out of it.

    It also offers seductive fallacies to explain away the lack of those qualities. First, as opposed to "programming" (the real craft), "coding" (a more common practice) is NOT hard, and almost any technical person can write code. And coding is unfortunately commonly mistaken for programming. Designing and writing maintainable programs that are as simple and elegant as possible, and then actually maintaining them, is not included in the skillset of many people in the business. Most commercial software sucks and has a relatively short life.

    Secondly, companies not investing in quality helps create a breeding pit of bad, unsustainable program-making habits. Sadly enough, customers are so used to malfunctioning computer programs that they won't demand standards for software quality. Therefore, it's not profitable to do a good job, since technically inferior products often conquer the markets and push genuinely good products aside due to non-technical factors (as happened with Microsoft Windows).
