
Lobster, a New Game Programming Language, Now Available As Open Source

Aardappel writes "Lobster is a new programming language targeting game programming specifically, building on top of OpenGL, SDL 2 and FreeType. The language looks superficially similar to Python, but is its own blend of fun features. It's open source (ZLIB license) and available on GitHub."
  • by decora ( 1710862 ) on Wednesday June 19, 2013 @05:46AM (#44047909) Journal

    i := find([ 1, 2, 3 ]): _ > r

    yeah. no. thanks but no thanks.

    • by fph il quozientatore ( 971015 ) on Wednesday June 19, 2013 @06:03AM (#44047967)
      Still looks like a step forward from Perl.
        • Look, if we are going to have these bizarre punctuation marks everywhere, then why not just start using Chinese characters, which actually mean what we are talking about?

        "oh because nobody can read them"

        nobody can read _?@$$$ __ *&* anyway. But at least Chinese has some meaning already attached to its characters, like

        look at 下载

        it means download

        it's literally the character for "down" followed by a character for a wagon/cart (top view.. two wheels, see?) and a thing next to it. (down transport)

        that makes a hell o

        • by decora ( 1710862 ) on Wednesday June 19, 2013 @06:48AM (#44048141) Journal

          oh well

          • by Xest ( 935314 ) on Wednesday June 19, 2013 @08:21AM (#44048805)

            Amusingly, this is somewhat the answer to your question: most programming languages avoid Unicode characters because code then runs a greater risk when transmitted between systems. Unfortunately there are still all too many applications, sites and programs that don't properly support Unicode, which means bugs could arise in source code for no reason other than loading it up, manipulating it, and saving it in the wrong text editor.

            But I agree, it's a sad state of affairs that we can't rely on the existence of unicode even now.

            • by CastrTroy ( 595695 ) on Wednesday June 19, 2013 @09:28AM (#44049477)
              This is one of my favourite things about .NET. All strings are Unicode (UTF-16) by default. You don't have to do any fancy trickery to get the language to interpret your string as UTF, and all the functions (assuming no bugs) work properly for international characters. In most other languages, you have to remember to precede the string with some character to signify that it's Unicode, and then strange things start happening when you mix Unicode and non-Unicode strings, or the functions don't work properly with Unicode strings to begin with. The same goes for base-10 decimal numbers. It's a native type. You don't have to import some library and write a = b.add(c) every time you want to add a couple of numbers (it gets really messy with more complex math).
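
              For contrast, here is a minimal sketch of the friction being described, using Python as the "other language": Python 2 needed an explicit u prefix for Unicode literals, and base-10 decimals live in a library rather than being a native literal type.

                # -*- coding: utf-8 -*-
                # Python 2: byte strings and unicode strings are different things.
                s = "café"            # byte string; len(s) is 5 with this file encoding
                u = u"café"           # explicit unicode string; len(u) is 4

                # Base-10 decimals come from a library rather than being a literal type:
                from decimal import Decimal
                a = Decimal("0.10")
                b = Decimal("0.20")
                print(a + b)          # 0.30, whereas the float expression 0.1 + 0.2 gives 0.30000000000000004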
              • by spitzak ( 4019 )

                Bull. Microsoft's refusal to interpret byte strings as UTF-8 is the problem. The fact that you have to use "wide characters" everywhere is by far one of the biggest impediments to I18N.

                Unicode in bytes with UTF-8 is *TRIVIAL*. Look at the bytes and decode them. Variable length is not a problem, or if it is then you are lying about UTF-16 being so great because it is variable length as well! And if there are errors you can do something *intelligent*, like guess an alternative encoding (thus removing the need
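
                As a rough sketch of the "look at the bytes and decode them" claim (Python 3 here, with error handling kept minimal; a real decoder would also reject overlong forms and surrogates), decoding one code point by hand is just a lead byte plus continuation bytes:

                  def decode_utf8_char(data, i=0):
                      """Decode one code point from UTF-8 bytes, returning (codepoint, next_index)."""
                      b0 = data[i]
                      if b0 < 0x80:                 # 1 byte: 0xxxxxxx (plain ASCII)
                          return b0, i + 1
                      if b0 < 0xC0:
                          raise ValueError("unexpected continuation byte")
                      if b0 < 0xE0:                 # 2 bytes: 110xxxxx 10xxxxxx
                          n, cp = 2, b0 & 0x1F
                      elif b0 < 0xF0:               # 3 bytes: 1110xxxx 10xxxxxx 10xxxxxx
                          n, cp = 3, b0 & 0x0F
                      elif b0 < 0xF8:               # 4 bytes: 11110xxx plus three continuation bytes
                          n, cp = 4, b0 & 0x07
                      else:
                          raise ValueError("invalid lead byte")
                      for b in data[i + 1:i + n]:
                          if b & 0xC0 != 0x80:      # every continuation byte is 10xxxxxx
                              raise ValueError("invalid continuation byte")
                          cp = (cp << 6) | (b & 0x3F)
                      return cp, i + n

                  print(hex(decode_utf8_char("€".encode("utf-8"))[0]))   # 0x20ac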

            • by HiThere ( 15173 )

              There's an interesting reason, though. Consider building a Trie around Unicode chars. Granted, this may not be a major reason, but UGH! There are a lot of advantages to having a small alphabet. The early languages didn't usually even allow both upper and lower case. Well, memories have expanded, processors have sped up, etc. But Unicode is still too verbose for many algorithms to work well. And using bytes and UTF-8 yields different problems.

              I will grant that there are lots of approaches that don't

              • Consider building a Trie around Unicode chars.

                Done it.

                Any data structure programmer worth their paycheck knows that a trie is an abstract structure which can be realised in many different ways. It is logically a tree of "nodes", where each "node" is a map from a digit (where the key is expressed in some radix) to another node. That map can be implemented in multiple ways. The simplest is an association list (sorted or unsorted), but it could be a simple array, a binary search tree (often realised as a terna
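
                For what it's worth, a minimal sketch of the simplest realisation described above, with each node's map being a plain Python dict (so the "alphabet" can be anything hashable, Unicode characters included); the class names are just for illustration:

                  class TrieNode:
                      def __init__(self):
                          self.children = {}   # maps one character to a child node
                          self.is_key = False  # True if an inserted key ends at this node

                  class Trie:
                      def __init__(self):
                          self.root = TrieNode()

                      def insert(self, key):
                          node = self.root
                          for ch in key:
                              node = node.children.setdefault(ch, TrieNode())
                          node.is_key = True

                      def contains(self, key):
                          node = self.root
                          for ch in key:
                              node = node.children.get(ch)
                              if node is None:
                                  return False
                          return node.is_key

                  t = Trie()
                  t.insert("下载")    # Unicode keys work because the per-node map is just a dict
                  print(t.contains("下载"), t.contains("下"))   # True False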

              • by dkf ( 304284 )

                Consider building a Trie around Unicode chars.

                Why would you be building a trie? A less-elaborate data structure will usually be faster due to memory access patterns that fit better with what cache predictors cope with.

                • by HiThere ( 15173 )

                  OK, I'm not a compiler builder. And a hash table would be better for a symbol table. And I was thinking about a slightly different representational problem, for which a Trie would also not be the correct data structure to use, but which seemed to have the same problem. (It's actually an n-dimensional list structure...though less general than that implies. And I'm probably going to slap a restrictive upper limit on n...at least if I can figure a way to do so that won't choke things up.)

          • by Inda ( 580031 )
            You must be new here.

            Welcome to the circus.
        • Chinese characters on Slashdot, really? When do you think you're living? The 21st century?
        • by dintech ( 998802 ) on Wednesday June 19, 2013 @07:49AM (#44048519)
          Been there, done that. Look specifically at APL [wikipedia.org] in the 60s. Functions were represented by single characters which you needed a special keyboard to type. For example, instead of typing the string floor, it was represented by what is now Unicode Character 'LEFT FLOOR' (U+230A) [fileformat.info], and a special terminal was required to reproduce it. This limited where you could input and also display APL code.

          One evolution of APL was the A+ [wikipedia.org] language leading finally to K [wikipedia.org] in the 90s. Having these special character requirements was too much of a pain in APL so all special characters were replaced by tuples of ASCII characters that were already common. In K, 'floor' was now expressed as _: which is no easier to guess the meaning of if you don't know the syntax, but now you need only standard ASCII to represent it.

          'Son of K' was Q [wikipedia.org], which comes full circle, replacing _: with the keyword floor. Iverson's argument in developing APL was that the terseness achieved by using notation (single characters) meant that you could express concepts more concisely. This in turn meant that complex concepts were easier to visualise. There's a lot to be said for this, but I think Q now provides a much happier medium between the two perspectives.
        • Look, if we are going to have these bizarre punctuation marks everywhere, then why not just start using Chinese characters, which actually mean what we are talking about?

          There are a lot better languages already out there [wikipedia.org] if you want bizarreness.

      • by Anonymous Coward

        shut your mouth when grownups are talking. ;) /what's wrong with perl? It only looks odd to you because you don't know the language. To me (20-year C programmer and 10-year Perl programmer) it's extremely straightforward.

        • Old timer (Score:2, Interesting)

          by Anonymous Coward

          You've been programming for at least 20 years. That means you've started when things weren't buried behind seven layers of abstraction but had to be done by hand. In languages that didn't help you all that much, but didn't get in the way of letting you get things done either. So, like me, you've seen things those young whippersnappers wouldn't believe.

          Anyway, about perl, I've never seen why it got such a bad rap for excessive punctuation. The sigils on variables aren't that weird, even BASIC used them when

          • by alexo ( 9335 )

            You've been programming for at least 20 years. That means you've started when things weren't buried behind seven layers of abstraction

            How many levels of abstraction again? [wikipedia.org]

          • by Lumpy ( 12016 )

            "You've been programming for at least 20 years. That means you've started when things weren't buried behind seven layers of abstraction but had to be done by hand. In languages that didn't help you all that much, but didn't get in the way of letting you get things done either. So, like me, you've seen things those young whippersnappers wouldn't believe."

            Those languages still exist, and real programmers learn them and even more powerful stuff that makes you a far better programmer... Like Assembler.

            the "you

            • Define "better" please.

              My boss would have defined better as the 10x programmer who got done in 1 month what I'd said would take 10, but left zero documentation, unit tests or comments, and code so brittle that the slightest deviation from spec brought the entire mess crashing down around our ears. Sure, he was 2x as expensive, and it took me nearly 12 months to sneak something past the powers that be that reduced my daily support request queue back to what it was prior to him coming and working his magic, but go

              • by Lumpy ( 12016 )

                One that actually has a clue as to how logic and computers work. Almost every single CS grad we get nowadays can't even pass the employment tests, like binary math or solving a logic problem.

          • Perl is very easy to understand.... if you wrote it.

          • The strangeness of Perl at times is that the excessive punctuation changes meaning in context. It makes sense once you know the rules, but even then you can run across something very strange and head-scratching if you're not actively using it all the time. I.e., there's a lot of overloading. So almost every time I use Perl I'm still referring to my dog-eared O'Reilly quick reference guide and manual, despite having used Perl since 1989.

      • People who don't like Perl don't know Perl.
        • by Anonymous Coward

          Because if they really knew it, they'd hate it with the fiery passion of a thousand suns?

          Oh wait, I'm thinking of Javascript. Carry on.

          • Or Python or Basic.

            Perl has a bad rap mainly because systems administrators have abused it to hack out quick solutions. But, really, any language can have that happen. There's nothing inherently evil about Perl.

            Python and Basic, OTOH, both have some pretty evil formatting requirements.

            • by kraut ( 2788 )

              Python only forces you to indent in the way any sane person would indent anyway. That's not evil.

              • Re: (Score:2, Informative)

                by hedwards ( 940851 )

                It hides formatting information in whitespace, something that no sane person would do.

                It also ends statements at the newline rather than at a ';', which means you can end up with long lines at times where normally you would just hit Enter and continue on the next line.

                In general though, any language that depends upon white space for anything other than separating elements is just asking for trouble.

                • by tepples ( 727027 ) <.tepples. .at. .gmail.com.> on Wednesday June 19, 2013 @10:16AM (#44050097) Homepage Journal

                  It also ends statements at the newline rather than at a ';', which means you can end up with long lines at times where normally you would just hit Enter and continue on the next line.

                  Python uses newline as a statement delimiter only if all bracketing constructions (...) [...] {...} are closed. The arguments of any function call, for instance, can be split over multiple lines, as can the elements of a list or dictionary or a long expression. And back when print was a statement (Python 2) as opposed to a function (Python 3), it was my common practice to do something like this:

                  print ("%s: not raising price because %s"
                          % (sku, reason))

              • Python only forces you to indent in the way any sane person would indent anyway. That's not evil.

                It is when you have to send code through a channel that strips whitespace from the start of each line. With languages that use curly brackets or BEGIN/END, you can pass the code through something like GNU indent to restore the sane indentation. With Python, the block structure is just lost. And if you have your Slashdot posting preferences set to "HTML Formatted" rather than "Plain Old Text", Slashdot is one such channel, as <ecode> loses indentation in "HTML Formatted" mode.
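
                A small runnable illustration of that failure mode (the identifiers in the snippet are made up; only the text is manipulated, nothing in it is executed):

                  original = """\
                  if price > limit:
                      reject(sku)
                  log(sku)
                  """

                  # Simulate a channel that strips leading whitespace from every line:
                  flattened = "\n".join(line.lstrip() for line in original.splitlines())
                  print(flattened)
                  # if price > limit:
                  # reject(sku)
                  # log(sku)
                  #
                  # The flattened text no longer records whether reject(sku) or log(sku) was
                  # inside the if block, so no tool can mechanically restore the indentation;
                  # brace- or BEGIN/END-delimited code keeps that information in the tokens.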

        • ... and therein lies our problem.

      • Please tell me that people aren't writing games in Perl... for the love of all that is sacred in this world.

    • by pmontra ( 738736 )

      I agree that the meaning of this one liner is not easy to guess but there are other more fundamental things that bother me in Lobster. One is why they make a distinction between = to assign and := to define and assign. The first assignment should define; most languages just do that and everybody is happy. The second rant is about the Pythonish end-of-line colon. The : is ugly. It still strikes me as bad taste when writing Python: if a statement looks complete at the end of the line, then it should

      • I agree that the meaning of this one liner is not easy to guess but there are other more fundamental things that bother me in Lobster.

        I think you're agreeing to something the GP didn't say. By virtue of the subject, he's referring to the number of times you have to use the SHIFT key to type up that line, slowing your programming down. Understanding the line is a different question.

      • by samkass ( 174571 )

        It's always nice to see a new language even if the chances it will survive a couple of years are slim (that's true for every new language). Ideas spread so keep inventing.

        IMHO, it's almost never nice to see a new language. They really couldn't have just extended Lua? What new value is offered by a new syntax for the same concepts everyone else has?

      • Actually, having something like `len(x)` instead of `x.len()` has some benefits. Check out Guido's rationale [effbot.org] for why it was done that way in Python:

        There are two bits of “Python rationale” that I’d like to explain first.

        First of all, I chose len(x) over x.len() for HCI reasons (def __len__() came much later). There are two intertwined reasons actually, both HCI:

        (a) For some operations, prefix notation just reads better than postfix — prefix (and infix!) operations have a long tradition in mathematics which likes notations where the visuals help the mathematician thinking about a problem. Compare the ease with which we rewrite a formula like x*(a+b) into x*a + x*b to the clumsiness of doing the same thing using a raw OO notation.

        (b) When I read code that says len(x) I know that it is asking for the length of something. This tells me two things: the result is an integer, and the argument is some kind of container. To the contrary, when I read x.len(), I have to already know that x is some kind of container implementing an interface or inheriting from a class that has a standard len(). Witness the confusion we occasionally have when a class that is not implementing a mapping has a get() or keys() method, or something that isn’t a file has a write() method.

        Saying the same thing in another way, I see 'len' as a built-in operation. I'd hate to lose that.
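
        As a small sketch of the protocol he is describing (the class here is invented purely for illustration): a type opts in to the built-in len() by implementing __len__, and the call site stays prefix-style:

          class Playlist:
              def __init__(self, tracks):
                  self._tracks = list(tracks)

              def __len__(self):          # opt in to the built-in len() operation
                  return len(self._tracks)

          p = Playlist(["intro", "boss theme", "credits"])
          print(len(p))                   # 3: the reader knows the result is an integer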

        • by pmontra ( 738736 )
          Yes, I know his argument. I just don't buy it, and I could build counterarguments, but it doesn't matter. In the end, even programming languages are not completely rational and our tastes are even less so. To me Python looks similar to C and tastes of the '80s, with all those unnecessary double underscores (Guido should have used a keyword for that). Maybe that is why it is getting successful as a system language. But there is the inconvenience of paying attention to spaces when copy-pasting code around, which is a pai
          • Yeah, ultimately it's a matter of taste. I am happy and productive programming in Python so I know that I like it =). I agree with the spaces thing. That's the one downside I can think of to having the space indentation.
    • by dywolf ( 2673597 )

      press the buttons in the right order and win!
      Call it....Rock Band Lobster

  • Dynamically Typed? (Score:5, Insightful)

    by Wattos ( 2268108 ) on Wednesday June 19, 2013 @05:48AM (#44047917)

    Dynamically Typed with Optional Typing

    Thanks, but no thanks. I prefer to stay with statically typed languages. I know that the "kewl" kids love dynamically typed languages, but it becomes a horror for maintenance. I'll be sticking with UDK in the meantime.

    • by buchner.johannes ( 1139593 ) on Wednesday June 19, 2013 @06:43AM (#44048125) Homepage Journal

      It really depends what you are doing. For many projects, scripting with some OOP is good enough (all those web projects, RoR, etc.). Having short code in an expressive language leads to fewer bugs.

      Static typing is extremely useful because it catches all mistakes of a certain class. However, you still have to unit test for other mistakes. So if you are unit and integration testing well, the benefit of static typing is small, and you are capturing more mistakes than static typing alone would.

      For projects where you have contract-like, long-term stable interfaces/APIs, yes, use static typing. But don't pretend it's for every project.

      • I agree.

        A robust, statically typed language is for the framework and core functionality.
        Dynamic typing is for scripting languages. As the name implies, it is for running short, often-modifiable scripts in a well-defined context.

        I don't get why some people insist on going dynamic all the way.

        • by Anonymous Coward

          I don't get why some people insist on going dynamic all the way.

          Probably because they never had to. I like Python, quite a lot, but at some point you just throw it away because running help() on every ill-documented object you encounter stops being funny.

        • It's the same reason people use virtual everywhere or make every class a template: it's the latest 'trick' they've discovered, and they think it's the silver-bullet solution to everything. Twelve months down the line, the painful maintenance nightmares they've created will encourage them to do things differently next time.
      • by Xest ( 935314 ) on Wednesday June 19, 2013 @07:56AM (#44048571)

        "It really depends what you are doing. For many projects, scripting with some OOP is good enough (all those web projects, RoR, etc.). Having short code in an expressive language leads to less bugs."

        Are you sure you're not conflating two different things here? It sounds like you're saying some languages are better for short, more expressive code, but that's not the same as static vs. dynamic typing.

        The only increase in code from static typing is explicit conversion, but I do not see how this extra code can increase bugs; on the contrary, it's what often decreases bugs in applications written with static typing, because the developer has to explicitly declare and perform the possible conversions. In contrast, with a dynamically typed language you're relying on the interpreter to guess, which is much more error prone.

        If you perform a conversion in a statically typed language and it's wrong, you know the second you try and execute, but in a dynamically typed language you may not know there's a problem until you hit some edge case input, which is more likely to get out into production due to the subtle nature of it.

        Do you have any examples of the classes of problem you believe dynamic typing avoids but static typing doesn't? You make the assertion that if you unit and integration test a dynamically typed language you capture more mistakes than you would with a statically typed language. I don't think that's ever the case, because static typing makes the capture of certain errors explicit in the implementation; the faults are unavoidable when you attempt execution, whilst dynamic typing relies on you stumbling across the error during execution. That means capturing it with unit tests is only as good as your unit tests, which will rarely be as good as explicit and inherent capture of errors.

        I agree that dynamic code has its place: where you want to make quick, dynamic changes and see them instantly, or where you don't care about code quality because you're just doing prototyping or proof of concept. But I think dynamic code is always inherently more error prone; I think it's a fallacy to pretend otherwise, and I've never seen any evidence to suggest dynamically typed code is less error prone than statically typed code, so I'd be intrigued to see it. I don't see how an inherent ability to capture a certain class of errors, coupled with tools for finding every other class of errors, can ever be worse than no inherent ability to capture that class of errors with the same tools to find the other classes of errors. It just doesn't make sense.
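
        To make the edge-case argument concrete, a contrived Python sketch (the function and its types are invented): a static checker such as mypy flags the None case before the code ever runs, while a dynamic run only fails when the unusual input finally arrives.

          from typing import Optional

          def parse_discount(raw: Optional[str]) -> float:
              # Bug: the None case is forgotten. A static checker reports that
              # float() does not accept Optional[str]; at runtime the failure
              # only appears when a record without a discount actually turns up.
              return float(raw)

          print(parse_discount("0.15"))   # the common case works, so tests may pass
          print(parse_discount(None))     # TypeError at runtime: the edge-case input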

        • by Hentes ( 2461350 )

          If you perform a conversion in a statically typed language and it's wrong, you know the second you try and execute, but in a dynamically typed language you may not know there's a problem until you hit some edge case input, which is more likely to get out into production due to the subtle nature of it.

          Dynamic typing doesn't mean those languages are typeless. Type errors like trying to add a string to a number still get caught at runtime. Unlike static languages, where a wrong cast can make the code compile and the program will never complain afterwards, leaving you wondering where those segfaults are coming from.

          Do you have any examples of the classes of problem you believe dynamic typing avoids but static typing doesn't? You make the assertion that if you unit and integration test a dynamically typed language you capture more mistakes than you would with a statically typed language. I don't think that's ever the case, because static typing makes the capture of certain errors explicit in the implementation; the faults are unavoidable when you attempt execution, whilst dynamic typing relies on you stumbling across the error during execution. That means capturing it with unit tests is only as good as your unit tests, which will rarely be as good as explicit and inherent capture of errors.

          Static error checking is a shallow way to test your code, and will only catch simple syntactic errors that usually don't even occur in a dynamic language with a less complicated syntax. Regardle

        • Are you sure you're not conflating two different things here? It sounds like you're saying some languages are better for short, more expressive code, but that's not the same as static vs. dynamic typing.

          The only increase in code from static typing is explicit conversion, but I do not see how this extra code can increase bugs; on the contrary, it's what often decreases bugs in applications written with static typing, because the developer has to explicitly declare and perform the possible conversions. In cont

    • Hm, count me among the skeptics, too. The problem is that "dynamic typing" creates fundamental performance bottlenecks - not good for games. The golden rule is to compute as much as possible at compile time using a strong type system, including type checking, type inference, bounds checking and overflow checks. Heck, with a strong enough type system you might even be able to avoid most runtime exception handling (see e.g. the design goals of ParaSail). What you want is to encourage the programmer to use very

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        How many games have you written, exactly? I've worked on AAA games from 1995 to today, and most of the industry is using dynamically-typed languages for scripting, and has been since the days of QuakeC. The iteration time is so much faster because the compiler doesn't have to work all that shit out up front. Iteration time is king in game production. Runtime is important too but we all know (right?) that only 10% of your code is responsible for 90% of your runtime. The other 90% of your code can bloat b

        • by The Cat ( 19816 ) *

          Get it done cheap and fast, and most of all cheap. Then fire everyone.

          The AAA game market is not a good example of tight programming.

        • We're talking about a new language; the claim that fast, easy development cannot be combined with strong typing and compile-time checking is totally unjustified. There is absolutely no reason why a language with "dynamic types" does, could, or should lead to easier development or faster development cycles, particularly not if automatic type inference is available. In fact, the opposite is true due to improved error checking at compile time in a strongly and statically typed language.

          Even less understandab

    • by dkf ( 304284 )

      Dynamically Typed with Optional Typing

      Thanks, but no thanks. I prefer to stay with statically typed languages. I know that the "kewl" kids love dynamically typed languages, but it becomes a horror for maintenance. I'll be sticking with UDK in the meantime.

      As the project becomes larger, you get more and more of the code devoted to converting values between different type systems and serializations and all that stuff. It's boring code, but often just slightly too complex for a computer to do for you without some oversight. Going to a looser dynamic type system greatly reduces this overhead.

      That's not to say that strict static types are useless; they're very useful when developing the components that the dynamically-typed language sticks together. Indeed, using

  • Alternatively you could just use the Python OpenGL bindings [sourceforge.net] (or pick your favourite language). From the project home page I can't see any reason why this language is better than many existing, stable, and optimised languages for accessing OpenGL.
    • Python OpenGL bindings [sourceforge.net]

      Pygame [pygame.org] and Pyglet [pyglet.org] are a couple of other Python-based choices.
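
      For reference, a minimal Pygame event loop looks roughly like this (a sketch only, assuming pygame is installed; Pyglet and raw PyOpenGL are wired up differently):

        import pygame

        pygame.init()
        screen = pygame.display.set_mode((640, 480))
        pygame.display.set_caption("Hello")
        clock = pygame.time.Clock()

        running = True
        while running:
            for event in pygame.event.get():
                if event.type == pygame.QUIT:   # window close button
                    running = False
            screen.fill((20, 20, 40))           # clear to a dark background
            pygame.display.flip()               # present the frame
            clock.tick(60)                      # cap at roughly 60 FPS

        pygame.quit()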

  • "Fun features"? (Score:5, Insightful)

    by Viol8 ( 599362 ) on Wednesday June 19, 2013 @05:57AM (#44047953) Homepage

    Languages don't have "fun" features; they either have useful features or bloat.

    Looks like yet another me-too language that's someone's pet project and will be forgotten by this time tomorrow.

  • by Anonymous Coward

    Another programming language! Why do people keep reinventing the spoon? Is it all CS-majors that feel they need to make a mark on the world?

    • by RaceProUK ( 1137575 ) on Wednesday June 19, 2013 @06:51AM (#44048157)

      Another programming language! Why do people keep reinventing the spoon?

      Which spoon? The soup spoon? Teaspoon? Tablespoon? Dessert spoon? Wooden spoon?

    • Why do people keep reinventing the spoon? Is it all CS-majors that feel they need to make a mark on the world?

      So that they can delude themselves that their also-ran game programming language is going to catch on and become all the rage, as if all the big game developers are going to throw away their uber-expensive proprietary development environments and rewrite their engines in some shitty new open-source language that has shit for documentation, a billion bugs, no IDE support, and a micro-fraction of the libraries available for even the lamest existing language.

      • Why do people keep reinventing the spoon? Is it all CS-majors that feel they need to make a mark on the world?

        So that they can delude themselves that their also-ran game programming language is going to catch on and become all the rage, as if all the big game developers are going to throw away their uber-expensive proprietary development environments and rewrite their engines in some shitty new open-source language that has shit for documentation, a billion bugs, no IDE support, and a micro-fraction of the libraries available for even the lamest existing language.

        Throw enough shit at a wall and eventually something will stick. I'm pretty sure that's how PHP got any use at all. :)

        Is compiler design still part of a healthy CS diet? That means thousands of languages are being pumped out every semester. It takes a special kind of ego to think any of them are worth a damn.

        Actually, the wide variety of commercial home-use 3D printers out there seems to follow this model pretty closely. Get smart enough to make one and all of a sudden they think their version is so relevan

    • by Xest ( 935314 )

      To be fair, developing a programming language is actually an excellent project for anyone wanting to further their comp. sci./development skills.

      There's just absolutely no need to plaster it over the front page of a news site on the internet. Keep it to yourself, no one cares.

  • by Anonymous Coward

    No thanks, I already have a perfect programming language. Why reinvent the wheel when the old wheel still works OK?
    Mostly we don't need new languages; we simply need better libraries.

    • Re: (Score:3, Insightful)

      We've had a perfect programming language since C.

      That's why everything since has copied the syntax and half the operators.

      • We've had a perfect programming language since C.

        And a whole bunch of segfaults, too.

      • We've had a perfect programming language since Fortran.

        That's why everything since has copied the syntax and half the operators.

        Fixed that for you. Where did you think most of C syntax came from?

        • I just looked up some Fortran code. It doesn't look very C-like. No semicolons, no curly braces, some functions take bracketed parameters while others do not, and the example code on Wikipedia contains a lot of things that just make no sense to me. Like 'IF (IA) 777, 777, 701' - what does that do? There is no variable I can find called IA. It may be a good language once you've learned it, but it doesn't look remotely like C. If anything, I'd say it shows some similarity to BASIC.

        • by The Cat ( 19816 ) *

          C syntax sure as hell didn't come from FORTRAN. What the hell mixture of recreational chemicals are you on?

  • by Anonymous Coward

    The author should be commended for creating something and releasing it publicly, rather than getting the whining and complaining found here. As a personal project, it may be improved, abandoned, or rewritten, or it may simply enhance skills that will lead to other contributions.

    • by abies ( 607076 )

      All of us have half-finished, useless projects out there, which have the potential to be something nice if we spend another 30 man-years of effort and rewrite them a few times. Nothing wrong with that. But posting ninja self-promoting submissions to Slashdot about them... that's pathetic.

      • All of us have half-finished, useless projects out there, which have the potential to be something nice if we spend another 30 man-years of effort and rewrite them a few times. Nothing wrong with that.

        That is not always good. Finishing your projects properly is a very important skill for an engineer, an artist, or anyone really. Half-finished stuff gives a bad impression of your work and makes you feel uncomfortable about not completing it.

        Just spec your projects before starting and assess whether you can realistically complete them, and you're good.

  • Then we could have had fun with the "It wasn't a rock, it was a... Rock Lobster!" comments.
  • Great, another one of those wannabe languages. There are already a lot of other alternatives out there.
    Just use one of the classic languages with the same libraries as this one uses; you'll be glad you did.

    • by pmontra ( 738736 )

      Why not? Experimentation is useful: it produces many languages that die quickly, but also ideas that spread and end up in languages that stick. Just imagine saying "use the classic languages" at the time of COBOL and Fortran, or at the time of C later on. No C, no Perl, no Python, no Ruby, no Java, no PHP (oh well...), no JavaScript. All of them got ideas from other languages and spread their ideas into newer languages or into contemporary ones (PHP has traits nowadays).

      As for really using Lobster in production,

      • by abies ( 607076 )

        It makes sense if a language explores new ideas or has a groundbreaking implementation. There is no reason to experiment with languages whose design and implementation are both sub-par compared to multiple existing ones.
        That said, everybody should try writing their own language at least once in their lifetime. It is a very good experience and you learn a lot about why other languages have certain quirks. It is just that you should not try to sell your 'baby' on Slashdot...

  • by Yvanhoe ( 564877 ) on Wednesday June 19, 2013 @07:28AM (#44048371) Journal
    Why do you need a new language?
  • Is it tasty?

  • OP here, let me see if I can address common comments I see here:

    "why another language?" - because I can? I can't wrap my head around the thinking that creating new languages is somehow a problem for our development ecosystem. Noone forces you to use them. And like others have so kindly already mentioned, this one will probably die in obscurity, solving your problem before it even started.

    "what's the point when it's not a major innovation?" - Better mainstream languages is an evolutionary process of designs

  • Yet another language from Wouter van Oortmerssen? When will he ever get enough?

    He's also the guy behind the Cube game and game-engine [wikipedia.org].
