Programming IT Technology

What I Hate About Your Programming Language 854

chromatic writes "Perl programmers like punctuation. Python programmers like indentation. Every programming language has its own syntax, stemming from its philosophy. What I Hate About Your Programming Language examines the issues that shape languages as they grow. It's not advocacy, I promise."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • PHP (Score:5, Interesting)

    by sethadam1 ( 530629 ) * <ascheinberg@gmai ... inus threevowels> on Tuesday May 13, 2003 @05:01PM (#5949439) Homepage
    What he hates about PHP doesn't sound so bad, and doesn't seem like anything that won't be corrected in PHP5.

    I knew there was a reason I liked it.
    • by Imperator ( 17614 ) <{slashdot2} {at} {omershenker.net}> on Wednesday May 14, 2003 @03:17AM (#5952734)
      I code PHP every day, so I've become quite proficient at it. However, I constantly find myself horrified at some newly-discovered inconsistency in its library. The language itself is not so terrible, but its library is a beast.

      As a particular example, take PHP's error handling. The language has no real exceptions, which is forgivable--but it insists on making up for it by faking them.

      It has something akin to sigaction(), but much less powerful. It allows you to provide one function to handle all errors, except for some that PHP insists on handling itself. At least that function can switch on the error, right? Nope! There are only 5 different error codes which your code can catch, only 3 of which you can actually throw (again, with a function instead of a language construct).

      And if you thought this was bad, try the error handling in the library. Each set of functions seems to have its own function to check for errors, and you have to repeatedly check the manual to find out how a function indicates failure. I've seen the following different methods of indicating failure:

      function returns FALSE

      function returns TRUE

      function prints a message to the browser

      function returns 0

      function returns 1

      function returns nonzero

      function returns negative

      call another function to find out

      function returns something that can be fed into another function to find out

      function raises an error condition you can catch (through fake exceptions described above)

      function raises an error condition you can't catch

      pass in a variable by reference and the result will be there

      check if the returned array is empty, and if it is use a different function to find out whether that indicates an error or just a (legitimate in context) empty array
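      The pain in that list is that every call site needs a different checking idiom. A minimal sketch of why that hurts, in Python rather than PHP and with invented function names, purely as an illustration of the conventions the poster describes:

```python
# Illustrative sketch (not real PHP): three of the failure-signaling
# conventions from the list above, transplanted into Python to show
# why a caller can never reuse one checking pattern.

_last_error = None

def find_user(name):
    # Convention 1: returns FALSE on failure.
    return {"name": name} if name else False

def delete_user(name):
    # Convention 2: returns 0 on success, nonzero on failure.
    return 0 if name else 1

def connect(host):
    # Convention 3: returns None; call another function to find out why.
    global _last_error
    if not host:
        _last_error = "empty host"
        return None
    _last_error = None
    return object()

def last_error():
    return _last_error

# Every call site needs its own idiom:
assert find_user("") is False           # falsy check
assert delete_user("") != 0             # C-style status code
connect("")
assert last_error() == "empty host"     # errno-style side channel
```

      With real exceptions, all three would be caught the same way; without them, the checking convention has to be looked up function by function.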

      Don't even get me started on the naming conventions of functions, or the ordering of their arguments. (Check out the array functions if you want some good examples.)

      PHP is a language that was designed for small, simple CGI scripts, and it does this well. It does not scale. PHP was never meant to be used from the command line, but how else can you write a cron job to do some nightly maintenance? (Write in another language? Sure, and give up all the libraries you've written for the project.) Sure, you can use lynx -dump http://example.com/nightly.php >/dev/null, but then you have to make sure no one but you can use that script, and it's just generally an ugly thing to do.

      For all of its faults (and it has many), one of the things Perl does well is provide actual language features for things like merging arrays, sorting arrays with a user-provided comparison function, or declaring a variable with loop scope. PHP's libraries keep growing, which is nice, but the language itself is too small and too limited. I don't want to use library functions for everything, nor do I honestly care whether the language is even context-free. I just want a language that doesn't suck.

      </rant>

  • by ucblockhead ( 63650 ) on Tuesday May 13, 2003 @05:03PM (#5949450) Homepage Journal
    What I hate about your programming language is that it doesn't work like mine does.
  • Slash (Score:3, Funny)

    by Malicious ( 567158 ) on Tuesday May 13, 2003 @05:04PM (#5949460)
    I hate your Grammer/Punctuation.
  • by TheDormouse ( 614641 ) on Tuesday May 13, 2003 @05:06PM (#5949481)
    That's why I use Whitespace [dur.ac.uk], of course [slashdot.org]!
  • Firestarter (Score:5, Funny)

    by Flounder ( 42112 ) * on Tuesday May 13, 2003 @05:06PM (#5949485)
    I used to have a T-shirt that was designed to piss off everybody. It said "Nuke the Gay Unborn Baby Seals". That's what reading this article felt like. Tinder to start a flame war that everybody can join in on.
    • by Reziac ( 43301 ) on Wednesday May 14, 2003 @01:44AM (#5952452) Homepage Journal
      The article reminded me of this old gem:

      THE PROGRAMMER'S QUICK GUIDE TO THE LANGUAGES

      The proliferation of modern programming languages (all of which seem to have stolen countless features from one another) sometimes makes it difficult to remember what language you're currently using. This handy reference is offered as a public service to help programmers who find themselves in such a dilemma.

      =====> TASK: Shoot yourself in the foot.

      C: You shoot yourself in the foot.

      C++: You accidentally create a dozen instances of yourself and shoot them all in the foot. Providing emergency medical assistance is impossible since you can't tell which are bitwise copies and which are just pointing at others and saying, "That's me, over there."

      FORTRAN: You shoot yourself in each toe, iteratively, until you run out of toes, then you read in the next foot and repeat. If you run out of bullets, you continue with the attempts to shoot yourself anyway because you have no exception-handling capability.

      Pascal: The compiler won't let you shoot yourself in the foot.

      Ada: After correctly packing your foot, you attempt to concurrently load the gun, pull the trigger, scream, and shoot yourself in the foot. When you try, however, you discover you can't because your foot is of the wrong type.

      COBOL: Using a COLT 45 HANDGUN, AIM gun at LEG.FOOT, THEN place ARM.HAND.FINGER on HANDGUN.TRIGGER and SQUEEZE. THEN return HANDGUN to HOLSTER. CHECK whether shoelace needs to be re-tied.

      LISP: You shoot yourself in the appendage which holds the gun with which you shoot yourself in the appendage which holds the gun with which you shoot yourself in the appendage which holds the gun with which you shoot yourself in the appendage which holds the gun with which you shoot yourself in the appendage which holds the gun with which you shoot yourself in the appendage which holds...

      FORTH: Foot in yourself shoot.

      Prolog: You tell your program that you want to be shot in the foot. The program figures out how to do it, but the syntax doesn't permit it to explain it to you.

      BASIC: Shoot yourself in the foot with a water pistol. On large systems, continue until entire lower body is waterlogged.

      Visual Basic: You'll really only _appear_ to have shot yourself in the foot, but you'll have had so much fun doing it that you won't care.

      HyperTalk: Put the first bullet of gun into foot left of leg of you. Answer the result.

      Motif: You spend days writing a UIL description of your foot, the bullet, its trajectory, and the intricate scrollwork on the ivory handles of the gun. When you finally get around to pulling the trigger, the gun jams.

      APL: You shoot yourself in the foot, then spend all day figuring out how to do it in fewer characters.

      SNOBOL: If you succeed, shoot yourself in the left foot. If you fail, shoot yourself in the right foot.

      Unix:
      % ls
      foot.c foot.h foot.o toe.c toe.o
      % rm * .o
      rm: .o: No such file or directory
      % ls
      %

      Concurrent Euclid: You shoot yourself in somebody else's foot.

      370 JCL: You send your foot down to MIS and include a 400-page document explaining exactly how you want it to be shot. Three years later, your foot comes back deep-fried.

      Paradox: Not only can you shoot yourself in the foot, your users can, too.

      Access: You try to point the gun at your foot, but it shoots holes in all your Borland distribution diskettes instead.

      Revelation: You're sure you're going to be able to shoot yourself in the foot, just as soon as you figure out what all these nifty little bullet-thingies are for.

      Assembler: You try to shoot yourself in the foot, only to discover you must first invent the gun, the bullet, the trigger, and your foot.

      Modula2: After realizing that you can't actually accomplish anything in this language, you shoot yourself in the head.

      CLARION: You tell your computer to create a program for shooting y
  • I hate (Score:4, Insightful)

    by AvitarX ( 172628 ) <me@@@brandywinehundred...org> on Tuesday May 13, 2003 @05:08PM (#5949499) Journal
    I hate all your programming languages because they aren't just a .wav file of me dictating what I want them to do.

    PS.
    I don't program for a living
  • by gilesjuk ( 604902 ) <giles.jones@nospaM.zen.co.uk> on Tuesday May 13, 2003 @05:09PM (#5949502)
    Produce a language without some rules and you would end up with even messier code.

    I wish some higher level languages would force the use of comments in code, make it part of the declaration for a class or function.
    • by EvanED ( 569694 ) <evaned.gmail@com> on Tuesday May 13, 2003 @05:17PM (#5949590)
      >>I wish some higher level languages would force the use of comments in code, make it part of the declaration for a class or function.

      I'm not sure if that would help... how many "// fucking compiler requires this" comments would you see?
      • Visual C++ (Score:5, Funny)

        by ucblockhead ( 63650 ) on Tuesday May 13, 2003 @05:59PM (#5949906) Homepage Journal
        Probably about as many as the number of "// TODO: Place code here" in Visual C++ projects.
      • The University I used to teach at designed and implemented their own OO teaching language that did enforce comments as well as invariants (pre and post conditions on methods).

        99% of the comments and invariants were just what the parent described. Sure, as markers we should have bounced more code back to the students, but heck, sometimes we just agreed with them!
    • by elmegil ( 12001 ) on Tuesday May 13, 2003 @05:21PM (#5949623) Homepage Journal
      /* this is the mandatory comment */
    • I wish some higher level languages would force the use of comments in code, make it part of the declaration for a class or function.

      Better would be languages which are self-documenting... you don't need to read the comments because the purpose is clear anyway.

      Class or package specifications are an improvement over having to plough through masses of functions; there are bound to be methods of making plain code easier to read in the specification of the language too.

      Phil

      • by xdroop ( 4039 ) on Tuesday May 13, 2003 @05:53PM (#5949851) Homepage Journal
        Better would be languages which are self-documenting... you don't need to read the comments because the purpose is clear anyway.

        I think that this won't happen, partially for Mr. Kringle's comment above, but mostly because there is a difference between what you do and why you did it (and again from why you didn't do it a different way). You can see function, but you can't necessarily see the intent of the programmer. There are many times in my programs where a single line (often less than 10 characters long) will result in several lines of comments explaining why it is done that way. That way, the poor boob who inherits the job of extending/fixing the program (who is usually me) has a fighting chance of figuring out my intent, not just my procedure.

      • Self-documenting? (Score:5, Interesting)

        by steveha ( 103154 ) on Tuesday May 13, 2003 @05:59PM (#5949907) Homepage
        Better would be languages which are self-documenting...

        There is no language that will force perfect code. There is always room for a poor programmer to produce hard-to-understand code. Functions that do two unrelated things, confusing control flow, bad variable names, broken code that was repeatedly patched instead of being cleaned up... the possibilities are endless.

        Nonetheless, some languages have been designed with self-documenting code in mind; sometimes it even works.

        If you look at languages like COBOL, they have long descriptive keyword names designed to make the code easy to read. But you get tired of looking at those long keywords.

        I haven't used Ada, but I understand that it is somewhat designed for self-documenting code, and that as a result you are hemmed in on all sides by language rules. (Ada fans, please comment here.)

        The best language I have seen for this is Python. As a rule there is exactly one way to do things, so you don't trip over obscure hackish tricks that you have to puzzle out. The language doesn't force self-documenting or comments, but it does force indentation; everyone indents their Python pretty much the same (compare with the mess that is C indentation). The language is high-level enough, with lots of libraries, so you don't need to write 10 lines of code just to do one simple thing.

        Python was designed by a guy who is both a computer geek and a math geek. The math geek in him led to a very tidy language design, and I like it very much. I think schools ought to be using Python to teach introductory programming classes.
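        A small illustration of the point about enforced indentation (my own example, not from the article): in Python the indentation *is* the block structure, so there is no brace-placement style to argue about, and any two programmers' code for the same logic ends up shaped the same way.

```python
# Minimal illustration: each block is delimited by its indent level,
# so the compiler rejects code whose layout disagrees with its structure.

def classify(numbers):
    evens, odds = [], []
    for n in numbers:
        if n % 2 == 0:      # the `if` body is defined by its indentation
            evens.append(n)
        else:
            odds.append(n)
    return evens, odds

print(classify([1, 2, 3, 4]))  # ([2, 4], [1, 3])
```

        Compare with C, where the same function can be brace-styled a half-dozen ways and indentation is free to lie about the control flow.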

        steveha
      • by RevAaron ( 125240 ) <revaaron@hotmail. c o m> on Tuesday May 13, 2003 @06:43PM (#5950259) Homepage
        Smalltalk is quite self-documenting. I'm sure most C/C++/Java/Perl/Python programmers think you're joking when you talk about a "self-documenting language," but such languages are real.

        A simple language plus a decent code browser can equal a self-documenting language. Methods are organized into logical groups (e.g. "accessing", "initialization", etc.), and clicking on a category will tell you the methods there. Especially when there is a tradition of short methods (seven lines or less is the rule), as in Smalltalk, you can usually see what the entire method is doing just by looking at it, if you cannot guess what it is for by looking at the name.

        People may think this is an exaggeration, especially if they're used to systems that require various man pages, books, and on-line class library references just to write some code. Other than one book on Smalltalk style, I've not read any books on Smalltalk. I read some tutorials when I began, but after you learn the basic syntax [1], the very basic ideas [2], and especially how to browse classes, you learn as you go, finding out classes to use as you need them.

        [1] All of Smalltalk's syntax can be summarized as:
        a := 1. ":= is assignment"
        obj + 2. "a binary message"
        obj methodName. "a unary message"
        obj methodName: argument. "a keyword message; unlimited keywords"
        [ :a :b | a + b ] "block creation: a block closure, aka an anonymous subroutine"

        [2] You don't even need to know anything about OOP or OOA/D--simply the rudiments of *object-based* programming: just understand that an object is a chunk of data that can do certain things.
      • Better would be languages which are self-documenting... you don't need to read the comments because the purpose is clear anyway.

        Even if you could program in plain English, that would still only tell you the how: the low level of what it's doing and how it does it. Those sort of comments are usually redundant or obsolete, anyway.

        What's important to comment is why: the big picture of what's being done and how it fits into the rest of the system. Once you know that, then you can work out the rest, readabl

    • by los furtive ( 232491 ) <ChrisLamothe&gmail,com> on Tuesday May 13, 2003 @05:41PM (#5949752) Homepage
      Yeah, my favorite of all time:
      /**
      *
      * Javadoc goes here
      */
  • by Loki_1929 ( 550940 ) on Tuesday May 13, 2003 @05:09PM (#5949505) Journal

    It causes tens of thousands of clients to slam into unsuspecting web servers at bone-shattering rapidity.


    Ok, so slashcode's not a language... Sue me

  • by limekiller4 ( 451497 ) on Tuesday May 13, 2003 @05:09PM (#5949508) Homepage
    What I hate about your programming language is that nobody can understand you with that damned Lisp.

    Oh god. Did I just type that? I'm very, very sorry...
  • FlameRPL? (Score:3, Interesting)

    by SHEENmaster ( 581283 ) <travis&utk,edu> on Tuesday May 13, 2003 @05:10PM (#5949524) Homepage Journal
    If no one has heard of it [frob.us] they can't make fun of it. Until they realize that I haven't gotten around to loops in the released version yet :-)

    I think that varied languages are a necessity. It'll be better when .jar files can be executed from the console, J2dk for Linux/ppc version 1.4 is released, caffeine springs from trees, and C++ no longer requires the programmer to deal w/ pointers. Oh yeah, and all BASIC interpreters are dumped into Sol.
  • Pretty limited scope (Score:4, Interesting)

    by rpg25 ( 470383 ) on Tuesday May 13, 2003 @05:12PM (#5949535)

    I looked at this article, and I was disappointed by what a limited set of languages chromatic had examined. Where was Prolog? ML? Common Lisp? SNOBOL? Smalltalk? Dylan? All the languages in the article are in the class of "imperative languages with varying amounts of object-oriented gravy." If you're talking about how languages embody a philosophy, why stick to languages that pretty much embody the same philosophy, with some minor tail-fins and chrome as their differences?

    [I suppose that's some flame bait....]

    • by chromatic ( 9471 ) on Tuesday May 13, 2003 @05:20PM (#5949609) Homepage

      I didn't list specific gripes about the languages you describe because I don't really have enough practical experience with them to analyze them well. I do discuss languages such as Lisp and Smalltalk in the analysis section though, just as you mention.

      Just to be fair, though, one of my gripes with Lisp is the idea that reducing all syntax to a Lambda form makes up for moving all the remaining complexity to built-ins and extensions. I certainly don't think in trees -- a little syntactic sugar is tasty. That doesn't make Lisp wrong; it just doesn't fit my brain as well.

    • Haskell, you didn't mention Haskell. How can you mention Dylan, ML without Haskell.

      Eiffel, you didn't mention Eiffel. How can you mention Dylan, ML, Haskell without mentioning Eiffel.
      • by Anonymous Coward
        And how can you mention Dylan without mentioning the Mamas and the Papas? How can you mention Dylan without mentioning Woody Guthrie?
    • by NetSettler ( 460623 ) <kent-slashdot@nhplace.com> on Tuesday May 13, 2003 @05:45PM (#5949780) Homepage Journal
      I looked at this article, and I was disappointed by what a limited set of languages chromatic had examined.

      Given the superficial and haphazard nature of the review, I was just as glad it didn't touch on those other languages. I really didn't get much of value out of the article and the only thing that would have been worse is an equally superficial treatment of my own languages of choice.

      And anyway, one person's opinion is just one person's opinion. It's a pity the author didn't attempt to do any kind of survey. Even an unscientific survey might have been more interesting and/or informative than this was. In its present form, there's no way to detect hints of incompleteness, idiosyncrasy, bias, ... other than to incompletely, idiosyncratically, and in biased form say "well, here's something I noticed that I disagree with".

      I'm sorry if these remarks sound critical, but the entire article came across to me as flamebait, and I'm not sure what positive quality I can draw from it. It started off as a nice idea--that language philosophy can influence syntax or vice versa. But it diverged about halfway through into random, unmotivated jabs at this and that language, and really ended up going nowhere with few, if any, useful take-home messages.

      Maybe I was also put off by the author's statement that "Lisp is very much the lambda calculus". As a matter of history, several decades ago it might have been reasonable to say that Lisp was "inspired by" ideas of the lambda calculus (though some might say "misunderstandings of the lambda calculus"), but the language as a whole is enormously different now. It is often used as a teaching vehicle for esoteric things like the lambda calculus because other languages can't stretch that far, but mainstream Lisp does not look or feel much at all like the lambda calculus, any more than "modern music is very much that of Elvis Presley", however much his break from the past may have been a founding influence on modern music. This failed allusion injured the author's credibility with me almost irreparably.
      • by Tom7 ( 102298 )
        Maybe I was also put off by the fact that the author's statement, that "Lisp is very much the lambda calculus".

        Yeah, he pretty much lost all credibility with me there. Basically anyone who's used modern lisp knows that the language has mutated far beyond its initial inspiration by the lambda calculus. And, indeed, anyone who's studied the lambda calculus knows that lisp gets its static scope wrong--and in a language as minimal as the lambda calculus, that's enough to hardly make them related.

        The author s
  • by evronm ( 530821 ) <evronm@NoSpam.dtcinc.net> on Tuesday May 13, 2003 @05:13PM (#5949545) Homepage

    Anyone else noticed how, in the middles of the "my language is better than your language" flame war this guy was starting, he managed to slip in an editor flamewar by linking to vim?

    Truly brilliant!

  • Say What? (Score:3, Interesting)

    by fidget42 ( 538823 ) on Tuesday May 13, 2003 @05:13PM (#5949549)
    Python programmers like indentation.
    As a Python programmer, I loathe its indentation sensitivity. It is hard to find the end of a block, and heaven help the people who use tabs rather than spaces for indentation.

    The lack of good line termination (sorry, but a carriage return doesn't cut it) is another problem.
    • Re:Say What? (Score:3, Insightful)

      by los furtive ( 232491 )
      Amen to that! I used to think that tabbing as a form of indent was a major sin, but since most IDEs these days (and every text editor worth its salt) will let you adjust tab length, replace tabs with spaces, or replace spaces with tabs, I don't see what the big deal is any more; in fact, I think tabs are the better choice.
  • by secolactico ( 519805 ) on Tuesday May 13, 2003 @05:14PM (#5949563) Journal
    No mention whatsoever of BASIC or Logo. Yes! At least he spared my languages of choice.
  • by fishlet ( 93611 ) on Tuesday May 13, 2003 @05:15PM (#5949567)

    I don't think it's always technical. A few years ago it seemed like most comments in regard to Java were positive, but when it became evident that it wasn't really "free" in the same sense as Perl or Python, lots of people started bashing it. Though, like many languages, it has its flaws, it remains a solid language. The same with VB: virtually no one in *this* audience considers VB a great language, which is reinforced by the fact that no one's really putting much effort into creating a VB-like tool for Linux (albeit there are several dead projects that have tried). It's a shame, because VB actually works quite well for a particular niche: quickly developing business apps. In the case of VB, I can safely predict most people here will not give it credit because of its links to Monopolysoft.

    • Actually, I kind of like VB for certain purposes. I am not, nor do I have any wish to be a programmer, but sometimes there are things you just can't do in a batch file, and dammit if VB isn't a nice way to get around this limitation. Not to mention that it looks a bit cleaner, and certainly less scary, to the average end user, than a command prompt. Plus the ability to send windows messages (logoff and the like) can be really damn useful.
      So yes, VB is another MS product, but it can still be a very usefu
    • by aardvarkjoe ( 156801 ) on Tuesday May 13, 2003 @05:36PM (#5949722)
      Well, I rather suspect that the main reason why you see a lot more Java bashing is because most CS students used to have to learn C++; now they have to learn Java. Since /. is mostly high schoolers and college kids, you have a whole bunch of people that were forced to learn the language in the last few years. Obviously, you're going to end up with quite a few people that don't like it.

      There's not really all that much bashing of any other non-free stuff (the exception being Microsoft, but that's mostly because they're still beating up on the penguin.) I don't see it as being the primary reason for Java bashing.
    • by pi_rules ( 123171 ) on Tuesday May 13, 2003 @06:10PM (#5949990)
      I can safely predict most people here will not give it credit because of its links to Monopolysoft.


      This is probably true, and I'm as much as an anti-MS guy as you can get really, but I have my reasons for not liking VB. I did a few projects with it in the past year (ASP/VBScript with VB COM components, MTS, etc), so I speak from experience.

      I went into it thinking it would suck, but I quickly found it okay for getting things done. "Hey, maybe these guys are onto something," I think... Then the project gets more complex, and I realize why I like languages that are far more strict about what you can and cannot do.

      • Lack of short-circuiting conditionals really started to eat me up, mostly because every other language I've used (except BASIC) would short-circuit conditionals. All too often I would find myself writing complex loops using short-circuits and then realize later on I had totally blown the algorithm. I wasn't the only one, either. I saw experienced VB guys do stuff like: If (objRS IsNotNull AND objRS.RecordCount > 0) Then.... Not much fun when that blows up.

      • Some consistency would be nice. I think Pascal is the only other language I've ever used that lets you declare a function in two different ways: one for returning data and one that doesn't. The separation of Sub and Function is a friggen mess. It's even worse when you realize later on that you need to return a boolean out of your Sub, and suddenly you have to track down everybody that called it and change the way they call it... even the ones that couldn't care less about its true/false return. What's wrong with the return type void?

      • No macros or a precompiler. VB didn't clean up unused objects very well for itself at one point in time, if I recall correctly, which made developers (at least where I was) make sure they always set objects to Nothing before a failed function would return. That's fine and dandy, but I hated repeating the same awkward (due to lack of short-circuiting conditionals) cleanup code in function after function, all nearly identical. Think nearly identical functions with a bit of business logic in them passing data off to a data layer for the real DB access. Each function had maybe 3-4 lines of individual code, sometimes up to 20. Every function, though, was at least 30 lines long, with the same drivel repeated over and over again. How much better it would have been to write:

        If (ErrorState = True) Then
        CLEANUPCOMMONOBJECTS
        End If

        If I had good exception and a good GC I wouldn't have even needed this though.

      • That damned VARIANT type needed for COM. Okay, this is common amongst all COM-enabled apps when going across boundaries, but it really stunk if you ask me.

      • Little bugs. If you return a 'decimal' datatype from an ADODB.RecordSet and call IsNumeric() on it, would you expect true or false? Assuming the value in question wasn't Nothing, you'd assume it's true, right? Bzzzzzz! IsNumeric(CStr(val)) would return true, though. All because IsNumeric didn't understand all the possible variant datatypes you could toss into it. Minor oversight, but it turned up a pretty decent and noticeable bug in my code once. Err, wait, that was VBScript; anybody know if that happens in real VB?


      It's a short list, but it's been a while since I coded in it.
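      The short-circuit gotcha the poster describes is easiest to see in a language whose boolean operators do stop early. A sketch in Python (class and function names invented for illustration): the second test never runs when the first fails, which is exactly the guarantee classic VB's And does not give.

```python
# In Python, `and` short-circuits: the right-hand operand is only
# evaluated if the left-hand one is truthy. Classic VB's `And`
# evaluated both sides, so the equivalent guard blew up on Nothing.

class RecordSet:
    def __init__(self, count):
        self.record_count = count

def has_rows(rs):
    # Safe only because `and` stops as soon as `rs is None`:
    return rs is not None and rs.record_count > 0

assert has_rows(None) is False          # never touches .record_count
assert has_rows(RecordSet(0)) is False
assert has_rows(RecordSet(3)) is True
```

      Without short-circuiting, the null check and the field access have to be split into two nested conditionals, which is the boilerplate the poster is complaining about.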

      • by Steve G Swine ( 49788 ) on Tuesday May 13, 2003 @08:51PM (#5951075) Journal
        Scary thing is, all the things you mention are gone now in VB.NET.

        AndAlso/OrElse for short circuits, subs and functions all take parens to call, real exceptions, variants are dead, and they took the VBScript IsNumeric oddities to the grave with them (though you were begging for pain when you relied on IsAnything with VBScript).

        I mention this lest you think VB was crippled forever... Try the new stuff - it's like object oriented code with real words, instead of seventy different types of punctuation to make something simple look like some goddamn magic trick.
    • OK, standard disclaimer: I did CS at a (very) good uni, and was taught the "proper" way to program, via Miranda, Modula, Smalltalk, Prolog, Turing etc. Never learnt C++ (all our lecturers hated it) - Java had only just come out at the time so did very little of that.

      Went straight into the finance world - first 2 years or so were straight ksh/perl/sybase - you'd be amazed how much of the banking world is held together with that stuff (which I affectionately term "sticky tape"). Don't get me wrong, perl is gr
  • by WndrBr3d ( 219963 ) * on Tuesday May 13, 2003 @05:17PM (#5949585) Homepage Journal
    <%@ Language=VBScript %>

    If you see this, please call Crime Stoppers at (888)580-TIPS.
  • But (Score:3, Insightful)

    by Timesprout ( 579035 ) on Tuesday May 13, 2003 @05:21PM (#5949619)
    As the article points out, many languages have a lot of quirks. While The Pragmatic Programmer is one of my favourite books on coding, I don't think learning a new language every year is a particularly useful thing to do. There is a world of difference between just 'learning' a language and gathering the experience over an extended period of time to become truly proficient in it. I already code in several languages, and while I might have a passing interest in reviewing a couple more just to see how they work, I really don't want to invest the time required to learn them properly when I don't see them appear in job adverts.
  • by Anonymous Coward on Tuesday May 13, 2003 @05:22PM (#5949632)
    ...I work at a company that uses an early 1970's mainframe (won't divulge any details). We use punchcards (yes punchcards) to program the beast in FORTRAN. As you may or may not know, FORTRAN was originally adapted to punch cards, hence the 80 column limit and the 6 column space prior to issuing commands. (These limitations have been relaxed in FORTRAN 90/95). Of course, I also program on other, more modern systems using other languages, mostly C++ and Perl. However, I still find myself writing programs that basically mimic FORTRAN's style. I prefer short lines no longer than 80 characters and capitalized command names, etc. Once I actually rewrote some of gcc's source code so that reserved words like for, while, switch, etc. were changed to FOR, WHILE, SWITCH, etc. I also capitalized the functions in the standard library (!). Since then, I've gotten over my capitalization fetish, but FORTRAN's code still looks better to me. I guess old habits never die.
  • by Animats ( 122034 ) on Tuesday May 13, 2003 @05:24PM (#5949642) Homepage
    C is the syntactical ancestor of many programming languages today, but none of them get it right.

    ANSI C itself is at least stable. The procedural part of the language is generally accepted (it's basically the same in Java, C++, etc.) The declaration syntax has problems. It's broken for historical reasons. Originally, C was LALR(1), but then came "typedef", and it went downhill from there with "class", etc. Nobody has been able to fix this properly. This is why the parser gets lost in so many error situations.

    C++ suffers from some early bad design decisions. Templates came late. Stroustrup knew about templates, and decided not to put them in. This led to great pain and ugly code, templates went in, and it's taken a decade to clean up that mess.

    Java was supposed to clean this all up, but now Java is getting generics, which it wasn't supposed to need. So it's going down the same path as C++, but with a new set of mistakes.

    Other attempts to fix C include Objective C (which still has a following), "C+@" (a Bell Labs product that predates Java), "C#" (a Microsoft variant), and several others with tiny market share, such as "D". None are enormously better than C.

    I'd like to see C++ cleaned up, but the ANSI committee is more interested in putting in obscure features for template writers.

    • I think the main problem with C++ is that you *can't* clean it up. It took an agonizing 5 years for compilers to finally support the C++ '98 standard, and any breaks in compatibility at this point will have developers up in arms. Also, remember that there are many more large software systems written in C++ than in Java or Python, so they can't afford to change things in a way that would break that code. Heck, given the pains Sun is going through to implement generics, I'd say that even Sun is having problems becaus
    • by avdi ( 66548 ) on Tuesday May 13, 2003 @05:51PM (#5949831) Homepage
      C++ suffers from some early bad design decisions. Templates came late. Stroustrup knew about templates, and decided not to put them in. This led to great pain and ugly code, templates went in, and it's taken a decade to clean up that mess.


      Bjarne wanted to put generics in from the very beginning.

      Java was supposed to clean this all up, but now Java is getting generics, which it wasn't supposed to need. So it's going down the same path as C++, but with a new set of mistakes.


      Java "cleans up" nothing; it simply strips out all the more powerful features of C and C++ which novices tend to stub their toes on. Oh, and it adds one important feature: inner classes. Unfortunately, the result is a language whose omissions actually make it more verbose and harder to maintain than C++.

      Other attempts to fix C include Objective C (which still has a following) "C+@" (a Bell Labs product that predates Java), "C#", a Microsoft variant, and several others with tiny market share such as "D". None are enormously better than C.


      Neither ObjC nor C# is an attempt to "fix" C; Objective C is an attempt to embed a Smalltalk object system in C, and C# is an attempt to fix Java. Neither of them is applicable to the same problem domains as C.

      I'd like to see C++ cleaned up, but the ANSI committee is more interested in putting in obscure features for template writers.


      Everything in C++ belongs there, and most of it was intended to go in from a very early stage. The only thing that needs to be "cleaned up" is the C preprocessor. Templates could use easier syntax, but no one has come up with anything significantly better than the current syntax.
      • Unfortunately the result is a language whose omissions actually make it more verbose and harder to maintain than C++.

        I disagree with this a little; I find uncommented Java code easier to understand than commented C code. Regardless, I'm not a huge Java fan, partly because I've always found the class library to be a pain in the ass. The first Java project I worked on, I spent hours trying to figure out how to manipulate dates properly. Writing the code in Python or Perl instead would have been much short
  • Important clue: (Score:5, Insightful)

    by DdJ ( 10790 ) on Tuesday May 13, 2003 @05:25PM (#5949650) Homepage Journal
    An important clue for evaluating this: he finds fault with Perl, Python, C, and Java, but does not find fault with JavaScript.
  • by YllabianBitPipe ( 647462 ) on Tuesday May 13, 2003 @05:34PM (#5949713)

    ... set far in the future where AI civilizations try to eradicate each other because they disagree about which programming language is better. Obviously, they see the programming language they use as the superior one.

    Throw in some wire-fu and you got yourself a franchise.

  • by CuteAlien ( 415982 ) on Tuesday May 13, 2003 @05:39PM (#5949743) Homepage
    Actually, I enjoy programming in a lot of languages and, as probably most who have programmed for a while will agree, the problem is seldom the language (otherwise you chose the wrong one for the job).

    But it always gets ugly when it comes to debugging. You're in a bad mood anyway (it's a bug - probably your bug - and it will, very probably, cost you even more time than programming the whole f**king function).

    No matter which language, after a while you start hating your debugger. You're programming 3D and have a problem with vectors - all you see are variables with some numbers. You're programming a database and the results don't fit - all you see are variables with the wrong results. Etc...

    It's always like your car broke down and you get messages like iron content of bumper 100%, mass of bumper 1.4, foo.ineedtorenamethis 1.5...

    And then you gotta dig through the dirt :(
  • by SimHacker ( 180785 ) on Tuesday May 13, 2003 @05:43PM (#5949767) Homepage Journal
    Many languages weren't designed, they were just reactions to the mistakes of other languages.

    Perl is a reaction to the flaws of many different languages. Unfortunately, it reacted by imitating all the worst flaws of all the worst languages. People who think Perl is great are totally ignorant of other languages, and have extremely bad taste. They are desperate about their job security, which is why Perl is the best choice for corporate parasites looking to drum up busy-work to justify their salary.

    PHP is a reaction to Perl, used by amateurs who were burned by Perl, but actually want to get work done, however they don't know any better languages. Perl (mis)taught them that programming languages were extremely difficult to learn. But they couldn't stand Perl, so they switched to PHP because it seemed "simpler", without realizing how much better other programming languages are. So they stick with PHP because they're afraid to learn another programming language, having been traumatized by Perl, and tranquilized by the incredible mediocrity of PHP. PHP was designed to recruit disillusioned Perl programmers.

    C++ is a baroque overreaction to C, whose designers were obviously ignorant about programming language design, learnability, usability, readability and maintainability. So all those lessons had to be (mis)learned again, the hard way. Which brings us to...

    Java is a moderate reaction to C++, one that still ignored much about programming language design that the C++ designers never bothered to learn (so as not to drive off C++ converts by forcing them to learn new concepts). So if you know C++ but don't know Lisp or any other reasonable language, you think Java is great. Java was designed to recruit disillusioned C++ programmers.

    So PHP is to Perl as Java is to C++. The lesson: You can't fix a badly designed, fatally flawed language by imitating it.

    -Don

    • by mbrubeck ( 73587 ) on Tuesday May 13, 2003 @06:17PM (#5950049) Homepage
      Paul Graham put together a brief list along those lines, titled What Languages Fix [paulgraham.com].
    • by Bendebecker ( 633126 ) on Tuesday May 13, 2003 @06:26PM (#5950117) Journal
      I know Lisp, Java, C, C++, PHP, and Prolog. First, I think your view of PHP and Perl is correct, but don't ignore the fact that they are very powerful and very useful languages, regardless of their flaws. Just because PHP is simple doesn't take anything away from it; it was meant to be simple. Every language was designed with a use in mind, and PHP in that theater has uses that make it better than other languages. People don't just use PHP because "they are afraid to learn other programming languages." Rather, most people use PHP because it is the best tool for the job.

      As for the rest, it just seems that you're pissed off that people prefer languages that aren't Ada. C++ is sort of a mess, I'll give you that, but that is mostly because they tried to keep too much of C in it. As for Java, it is a great language. It is very readable, it is easy to write, and its OOP design (if implementing a good modularized design) makes it very easy to maintain. The developers didn't ignore programming language design; they made choices, as all developers must. They had certain goals for the language in mind and they met those goals. Just because you don't like their design choices doesn't mean they didn't know how to design. You have to make trade-offs as well. You can't have your cake and eat it too. I have programmed in Lisp. Java is still great. Java is an object-oriented language, Lisp is a functional language. They have different design issues for different paradigms. As for readability, writability, and maintainability, consider this: how many versions of Java are there? How many versions of Lisp are there? Ever try to write a Lisp program that could run in all implementations of Lisp? And how about the fact that Lisp goes crazy with ()? Ever try to read a Lisp program with twenty nested function calls? Even writing them and trying to keep track of all those () is a pain in the ass.

      Go read "Programming Languages" by Kenneth Louden.
  • by Bendebecker ( 633126 ) on Tuesday May 13, 2003 @05:49PM (#5949814) Journal
    A programming language where I don't have to do any work. One where I can just decide, "hey, I have a great idea for a program" and then discover that my computer had already programmed it for me.
  • by The Bungi ( 221687 ) <thebungi@gmail.com> on Tuesday May 13, 2003 @05:50PM (#5949821) Homepage
    As long as you have a good library and support of some type (community or corporate).

    Other than that, the language is just like the favorite couch - it doesn't really matter where you sit, but that one just happens to be more comfortable.

    That's one of the reasons .NET is cool. It provides a unified runtime library that caters to any number of languages, as long as someone has bothered to port them. The end result should always be the same. We joke about COBOL.NET, but the reality is, it's made possible by this - dare I say - revolutionary idea. Soon we'll have Python.NET, Perl.NET, Ruby.NET, PHP.NET, etc, etc.

    You will be assimilated =)

    • I think that the gcc group had that figured out first. gcc uses front ends to translate the c, c++, fortran, java, and whatever other languages it can use to intermediate files, which are then compiled to assembly then machine code.

      Once again, Microsoft "innovates" themselves into territory where others have led them.
  • by Anonymous Coward on Tuesday May 13, 2003 @05:54PM (#5949860)
    I can't stand machine language. I'll be typing along and accidentally type a '0' when I meant to type a '1' and my program goes apeshit. They should fix that.
  • by Ilan Volow ( 539597 ) on Tuesday May 13, 2003 @06:02PM (#5949929) Homepage
    One fault I find with the author's assessment is that he is evaluating each language only from the standpoint of the one who is writing in it. I think a better language assessment would also evaluate a language from the viewpoint of the poor bastard who actually has to read someone else's code written in that language. Does the language have the tendency to produce code that is readable and understandable by a person who didn't write it? Or does the language have the tendency to produce code that is readable and understandable only by its original author?

    For example, Perl allows the programmer to make their code as terse and unreadable as possible, fitting everything on one line by exploiting some bizarre behavior of the Perl interpreter. While this "expressiveness" might be wonderful to the person who's writing the code, it's really going to be a problem for a second person who might want to contribute to the project or maintain it after the original author threw in the towel or got hit by a bus.

    Another example is operator overloading. Perhaps operator overloading is useful to the first person writing the code, as it provides a nice little shortcut where they can do foo + bar as opposed to something like foo.add(bar). But if there's another person who's decided to work on this project, and they're not very familiar with the code and they are trying to get the idea of how it works, how can they tell whether foo+bar is a mathematical operation or some sort of concatenation? Yes, if they look over the code enough, they can understand it. But perhaps that extra amount of fuss and the extra amount of time wasted trying to make sense of things will convince that person it would be easier to write their own stuff than try to reuse someone else's.
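    The ambiguity described above is easy to demonstrate in any language with operator overloading; here is a minimal Python sketch (the Money and Path classes are hypothetical, invented for illustration):

    ```python
    class Money:
        """A toy value type that overloads + for arithmetic."""
        def __init__(self, cents):
            self.cents = cents
        def __add__(self, other):
            return Money(self.cents + other.cents)

    class Path:
        """A toy type that overloads + for concatenation instead."""
        def __init__(self, parts):
            self.parts = parts
        def __add__(self, other):
            return Path(self.parts + other.parts)

    # At the call site, `a + b` looks identical for both types; a reader
    # unfamiliar with the code cannot tell arithmetic from concatenation
    # without hunting down the class definitions.
    total = Money(150) + Money(250)        # arithmetic: 400 cents
    route = Path(["usr"]) + Path(["bin"])  # concatenation: ["usr", "bin"]
    ```

    The two call sites are syntactically indistinguishable, which is exactly the maintenance cost the comment describes.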

    A final area I wish the author focused on is documentation. Does the language support some sort of embedded and standardized documentation that make it easier for the first programmer to provide information that would help a second programmer make sense of the code, or is documentation at the discretion and mercy of the first programmer and whatever bizarre and non-standard documentation system they might use?
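    Python's docstrings are one example of the embedded, standardized documentation asked about here: the documentation is part of the function object itself, so help(), pydoc, and documentation generators can all extract it. A minimal sketch:

    ```python
    def normalize(values):
        """Scale a list of numbers so they sum to 1.0.

        Because this docstring is attached to the function object,
        help(normalize) and pydoc can display it without any external
        documentation system.
        """
        total = sum(values)
        return [v / total for v in values]

    # The documentation travels with the code:
    print(normalize.__doc__.splitlines()[0])
    ```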

    I would suspect that projects using languages that make it harder on the person who has to read the code have higher incidences of duplication of effort and a great NIH (Not Invented Here) tendency.

    But that's just my opinion.
  • Delphi.. (Score:3, Insightful)

    by jagilbertvt ( 447707 ) on Tuesday May 13, 2003 @06:21PM (#5950080)
    What we can conclude from this article is that Delphi roxors :)
  • OK, Here's My List (Score:5, Insightful)

    by avdi ( 66548 ) on Tuesday May 13, 2003 @06:49PM (#5950301) Homepage
    C
    Let's start this off nice and flameworthy: what is the point of using C anymore? Nearly any valid C program is a valid C++ program, and C++ gives me the option of selectively using much higher-level abstractions than C can support, with little or no overhead, in a much safer and easier-to-debug way than any pure-C approximation. And most of the projects which are coded in C these days shouldn't even be coded in C++; they should be coded in something higher-level like Java or Python.

    C++
    • Manifest typing is so damn verbose! If the compiler's so clever, why can't it do a little type inference?
    • Needs Java's inner classes to do typesafe pseudo-closures
    • All C++ compilers suck. This in itself is not the problem; the problem is that all compilers suck differently.
    • Templates are powerful, but ugly
    • C++ code is full of juicy semantic information, which all IDEs uniformly fail to exploit, making coding far more painful than it should be in such a mature language.

    Java

    • In their haste to throw out every frightening piece of "complexity" from C++, the designers managed to throw out all the expressive power as well. The result is a language that is so syntactically impoverished that it is actually less readable than C++, the language it sets out to improve upon. See the bloated maintenance nightmares that are used to work around the lack of enums, just as one example.
    • Ironically, by eschewing C++'s "confusing" features, Java actually manages to be more error-prone than C++. For example, by forcing casts to be used everywhere, Java defers to runtime a whole class of errors that would never make it past the C++ compiler. This makes type errors much harder to track down, hardly a net gain for the novice programmer.

    Perl

    • Doing OO in Perl is like... doing OO in C. Sure, you can do it, but it probably won't work with anyone else's OO code, and you have to do a lot of the compiler's work yourself.
    • Perl isn't as ugly as the Python fanatics claim it is; but there's still a hard limit on how readable it can ever be. Any language where the canonical way to sort an arbitrary list is expressed as @sorted = sort { $a <=> $b } @mylist; has readability issues.
    • Figuring out what chain of braces, sigils, and arrows is needed to properly dereference a deeply nested data structure is a PITA.

    Python
    By far the biggest problem with Python is the user community. There's something about Pythoneers that makes them glom onto the language with religious zeal, and then go around telling everyone else that their own language of choice isn't elegant enough. Many Python users have the mistaken impression that Python is a carefully worked-out work of modern programming cleanliness like Scheme. In fact, Python was an unremarkable in-house procedural "little language" that, rather than dying the graceful death that most such languages eventually experience, was hyped to a larger audience and has been loaded down with all kinds of trendy features. Unfortunately, due to its humble roots, these features have gone in rather awkwardly.
    All this would be fine, in fact, it would be similar to Perl's story, if it weren't for the singular nature of Python apologists. Python is perhaps the only open-source language whose users will proudly and vehemently defend a language flaw as a feature. The best example is the post-facto rationalization of the extra "self" argument to methods, which the Python FAQ helpfully explains was simply an artifact of the way OO was hacked into an originally procedural language. This fact doesn't deter the fanatics however, who will happily tell you that it was an intentional feature and that it somehow makes Python better.
    Other examples of Python's awkward growing pains and the inexplicable attitude of its users: the fact that Python defines private variables as variables whose
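    Both quirks this comment points at are easy to demonstrate: Python methods receive the instance as an explicit first argument (the "self" discussed above), and "private" names (those with two leading underscores) are merely name-mangled rather than enforced. A minimal sketch:

    ```python
    class Counter:
        def __init__(self):
            self.__count = 0   # "private" only by name mangling

        def tick(self):        # every method spells out `self` explicitly
            self.__count += 1
            return self.__count

    c = Counter()
    c.tick()
    c.tick()
    # Name mangling rewrites __count to _Counter__count, so the
    # "private" variable is still reachable from outside the class.
    print(c._Counter__count)
    ```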

  • by Fweeky ( 41046 ) on Tuesday May 13, 2003 @06:53PM (#5950342) Homepage
    What I Hate About Ruby

    The sigils that mark instance and class variables always stick out visually in an otherwise clean language.

    Er, the use of @foo to define an object attribute is great; it means there's no need to type self. all the time, makes attributes obvious, and means you don't need to use lame prefixes like m_ObjectAttribute.

    A much better hate would have been the awful Perl/sh-era pseudo globals ($_ $@ $! $| $" $' $1 - what were you thinking matz!?); we all hate those ;)
  • by CoughDropAddict ( 40792 ) on Tuesday May 13, 2003 @07:51PM (#5950671) Homepage
    I can think of only two possible reasons why Python's whitespace-significant block structure would bother people:
    • people are determined to write code that is not indented the way it looks (so that the parser will recognize a different block structure than the indentation implies)
    • people feel warm and fuzzy staring at braces and "begin/end" keywords.

    Someone please explain: why does this feature make you so upset? How could it possibly make your life more difficult to know for a fact that the interpreter sees the blocks the same way you do on the screen?
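    The parent's point is easiest to see in a minimal sketch: in Python, the indentation *is* the block structure, so the code cannot claim one structure on screen while the parser sees another:

    ```python
    def classify(n):
        if n < 0:
            sign = "negative"
        else:
            sign = "non-negative"
        return sign   # the dedent ends the if/else; no braces needed

    # Moving `return sign` one indentation level right would place it
    # inside the else branch and change the function's behavior, which
    # is exactly why reader and interpreter can never disagree.
    print(classify(-5))
    ```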
    • Try this some day: (Score:4, Insightful)

      by avdi ( 66548 ) on Tuesday May 13, 2003 @11:40PM (#5952040) Homepage
      Take a hundred-line snippet of Python code. Stick it into a web page. Copy&paste the web page to an email. Post the email to a programming mailing list. Have a lengthy thread about the code, quoting and requoting the original.

      Now, let an intermediate Python programmer try to take the mangled code from the end of that thread and reformat it so that it works as intended. If it were in a language with explicit block syntax, chances are it would run as intended with nothing more than the removal of any quoting prefixes that mail clients have added. And a decent programmer would be able to whip up a script that would automate the transformation from mangled code into nicely indented code. Not so for the Python code.

      The problem is not so much that whitespace is significant by default; it's that there is no way to modify this behaviour in order to generate "portable" code which can survive whitespace mangling.

      But frankly, all things considered, whitespace significance is by far the least of Python's worries.
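      The irrecoverability argument above can be made concrete: the same sequence of statements with different indentation is valid Python in both forms but computes different things, so no tool can restore mangled indentation from the tokens alone. A minimal sketch:

      ```python
      # Version A: the total is returned once, after the loop finishes.
      def sum_after_loop(values):
          total = 0
          for v in values:
              total += v
          return total

      # Version B: identical statements, but the return sits inside the
      # loop body, so it fires on the first iteration. A brace language
      # would mark the difference explicitly; here only whitespace does.
      def sum_inside_loop(values):
          total = 0
          for v in values:
              total += v
              return total
      ```

      Given only the mangled token stream, both readings are legal, which is why an automatic reformatter cannot know the author's intent.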
  • by Gldm ( 600518 ) on Tuesday May 13, 2003 @08:35PM (#5950981)
    Ok begin flaming me but I love what I've seen of C# so far. I'm not a very experienced programmer, but I was forced through C, C++, MIPS assembly, shellscripts, and Java in college. Since then I've done C# and PHP on my own. So far I like C# the best.

    Why? C is an ancient ugly mess that needs to adapt or die. I'd hate to do more than a 200 line program in it because I'd get lost without objects. "Oh but you can use objects in C by doing blah blah struct blah blah kludge etc." No thanks, it took me years to figure out what the big deal with objects was and how to use them without overusing them, and I'm never going back now for anything serious.

    C++ has objects, you say, but they always feel like they're grafted onto C. Granted, it works, and it's still reasonably portable, which is C's main advantage these days, but some things are still just ugly. How about an array whose size you don't know until runtime? Welcome back to pointers 101. Sure, you can use new and delete instead of malloc and it looks nicer, but a lot of things just don't have really elegant solutions, and the standard libraries are too sparse for what modern apps do with modern languages.

    Java... everything you hate about C++ fixed the wrong way! Yay, we have big useful libraries now... but they're constantly changing, complaining that what you just used is now "deprecated", doing things you're not allowed to do, etc. No, I do not want to use something called "vector" to replace a linked list; give me a freaking "linked list" object! Even if it's just a renamed vector, at least it doesn't confuse people into thinking calculus and matrices will be popping out in the next few lines. This may have been the fault of my instructor, but he loved crap like this: "Don't use the Stack class, use vector to make your own stack!" Oh, and just because I don't want to use pointers if I can help it doesn't mean I don't EVER want to use pointers; I'd like to code without a babysitter, please. If I screw up, at least it's me to blame. Everything must be a class! Umm, yeah, that's great when I just want a struct with an int and a float, so I don't have to write half a dozen methods to implement a "proper" class with private data and a constructor and operators and copy... Put up with all this and you're rewarded with 10x slower performance and maybe cross-platform execution on alternate Tuesdays when it's raining and the moon is waxing.

    PHP seems nice, though I haven't really written much of anything in it yet. Some things kinda weird me out like how nothing cares if your variable is an int, float, string, etc. It's kinda nifty but extremely unsettling at the same time. At least it's easy to spot variables since they all start with $. I really don't have much else to say about it yet.

    By now everyone's waiting for why I like C#. I like it because it fixes the things I hate about C++ and Java and just seems to make everything work smoothly. Want to use pointers? Sure, just put it in an unsafe section for the over-paranoid. Want to use objects? It's easy. Want to do threading? We've got this easy-to-use library for it. How about resizing an array? No problem. Arrays remember their own sizes. They can even sort themselves. They can even sort themselves and another array at the same time based on the values in the first array (someone PLEASE show me how to do this with qsort() in C++ elegantly). Networking? Got it. Performance? Eh, about a 20% hit from C++ on my machine, less if you use ngen to precompile it. Still too bad? Ok, put your critical sections in C++, C, or even ASM libraries and link them seamlessly. GUI apps? Tons of easy-to-use stuff there, though it's mostly Windows-specific. The downside is you don't get the portability of other languages... yet.
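    For comparison, the parallel-array sort praised above (C#'s Array.Sort(keys, items) overload) has a short analogue in most dynamic languages; this Python sketch sorts one list by the values of another (the variable names are illustrative only):

    ```python
    keys = [3, 1, 2]
    items = ["carol", "alice", "bob"]

    # Pair each key with its item, sort the pairs by key, then unzip.
    pairs = sorted(zip(keys, items))
    sorted_keys = [k for k, _ in pairs]
    sorted_items = [i for _, i in pairs]

    print(sorted_keys)   # [1, 2, 3]
    print(sorted_items)  # ['alice', 'bob', 'carol']
    ```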
  • by halfgeek ( 657613 ) on Tuesday May 13, 2003 @08:54PM (#5951094) Homepage

    This was a very interesting article. I natively speak Perl, C, and C++, know enough about PHP to get by, and still remember some Commodore 64 BASIC (10 ? CHR$(147)). I am also, as I believe I've said before, not afraid to learn things like Java, Python, Ruby, maybe even Visual Basic again (God forbid) should they prove exceedingly relevant to my case - in fact, I quite look forward to knowing (hopefully) all of them and then some. But never Pascal. (Just kidding.)

    I've really found that the thing I hate most about programming in general is that no single language is the right one to use for any of my programs! I am very interested in any effort I ever come across to do functional merging of disparate environments. In addition to a couple of workarounds I've invented in the past for shoehorning Perl into PHP, I like reading about things like SWIG [swig.org], the open CLR [go-mono.org], and even COM (the concept [mozilla.org] more than the implementation [microsoft.com]), and a smile always comes to my face when I think about the Inline library [cpan.org] written for Perl.

    Now, the thing I really pine for is all of this interlanguage binding stuff being easy, fairly portable, more syntactically simple, and less hacky. I know that these exist, but not quite completely together. If I write a program in Perl with use Inline C, I can never be sure that anyone else has all the development tools necessary to compile all the C on the fly. Writing a program in Visual Basic with a nice mouse-drawn GUI and an external component is really easy - but it's Visual Basic. Writing a component wrapper for Perl is fairly straightforward with SWIG, but some well-thought-out language features would make it easier. And COM... I'm going to have to try wrapping my head around that book again someday... I'm sure the ATL makes it all very simple, but can I use ATL from MinGW? From C? From Perl? And don't try to tell me that I need to learn yet another flavor of XML to make all of this work.

    That's my two cents.

    (Note: I disclaim perfection. Don't hit me too hard; I admit I haven't done enough of my homework to claim this post isn't full of holes. Once I've looked this whole matter through, if ever, and if I still haven't come up with anything good, I may just have to take a deep breath, lay down a syntax, figure out how to use a lexer generator and a compiler compiler, and throw together some ghastly but very easy-to-use homogeneous aggregator system. Either that, or I wait for Parrot [parrotcode.org] to interoperate with Mono...)

"Facts are stupid things." -- President Ronald Reagan (a blooper from his speech at the '88 GOP convention)
