
Mirah Tries To Make Java Fun With Ruby Syntax 444

Posted by timothy
from the user-friendly-is-hard dept.
An anonymous reader writes "Java is performant, widely adopted, and eminently portable; however, its syntax is largely inherited from C++, along with some of its esoteric unfriendliness. Mirah aims to put a friendly face on Java by implementing a syntax whose primary concerns are developer friendliness (think Ruby/Python/Groovy) and the route of least surprise. The result is a truly cogent alternative syntax delivering readability, expressiveness, and some compelling new language features."
This discussion has been archived. No new comments can be posted.


  • by Foofoobar (318279) on Sunday March 20, 2011 @06:55PM (#35554122)
    Don't Groovy and Grails already do that? Speaking as a LAMP developer who uses Groovy/Grails, I figured that WAS Java's answer, because I'm having a blast and dumping PHP like a hot potato.
  • Ruby syntax is fun? (Score:3, Interesting)

    by shish (588640) on Sunday March 20, 2011 @07:02PM (#35554176) Homepage
    Is it just me who finds Ruby even more cryptic than Perl? Reading why's poignant guide [uniqpath.com], I loved the presentation of the book, and really wanted to love the language, but every time he said "read this code out loud, it makes perfect sense, doesn't it?" all I could think of was "you, my dear little cartoon foxy friend, have clearly been snorting too much of the good white stuff. I'm going back to Python now" :-(
  • I prefer C++ syntax (Score:4, Interesting)

    by Jorl17 (1716772) on Sunday March 20, 2011 @07:07PM (#35554204)
    Yes, that's right: I prefer C++ syntax and coding style. It's efficient, does exactly what I tell it to, and makes sense in my head. All this Python/Ruby etc. just makes my head blow up. I often code in those languages and I see them as very useful (and easy to work with), but I still prefer C++ syntax.
  • Re:Uh... (Score:5, Interesting)

    by snookums (48954) on Sunday March 20, 2011 @07:30PM (#35554366)

    "Prepend" isn't a word either, but technical people use it a lot because there is a specific meaning there that needs a word -- to append at the beginning. Strictly speaking you could use "prefix" as a verb, but that word has a connotation of adding a small fixed string to the beginning of one or more items. "Prefix all international phone numbers with a + symbol." "Prepend the header before sending the request."

    Similarly there is a need for a concise expression meaning "of adequate performance" without stretching to "high-performance" (especially since High Performance Computing [wikipedia.org] has a specific meaning of its own). Unfortunately, in the modern language of hyperbole, terms like "adequate" and "acceptable" have negative connotations along the lines of "not really good enough but better than nothing". So, we, as an industry, have invented a jargon word "performant" to express the idea that a thing has a level of performance sufficient that you don't need to worry about it and can look for optimisations elsewhere in your system.

  • by jfbilodeau (931293) on Sunday March 20, 2011 @08:57PM (#35554920) Homepage
    I may be the only programmer in the world who is willing to admit it, but having programmed in Java since version 1.0, I really like the syntax. And yes, I do find Java a lot of fun, especially for serious enterprise development. The thing I presently hate the most about Java is that new 'Oracle' thing.
  • by IBitOBear (410965) on Sunday March 20, 2011 @09:15PM (#35555014) Homepage Journal

    I am a guy who loves computer languages. I had a lot of fun with Ada, for gawd's sake. I am pursuing Erlang at the moment. I thought about Haskell, but it was just too big to play with...

    So, keeping that in mind... Java and the JVM were a non-starter for me.

    Every time a new object-oriented language comes out, the purists start with "we don't need multiple inheritance" and so on, and they always end up having to hack it back in as some half-conceived junk. See "interfaces," which provide, at a later date, all the much shat-upon "complexity" of multiple inheritance with none of the ability to provide a default implementation; so then you add delegation, which is all the default implementation with none of the inheritance, etc. And "Java doesn't have pointers," my pasty white behind: every object is a pointer in Java, you just can't use them properly. But they do manage to use pointers to prevent first-class object copying, so then they added clone(), etc.

    Then they "didn't need" proper destructor behavior; instead we have a finalize() that would run at some time in the future. But really, the code ought to know when the last object reference is going out of scope, so it can call a destructor manually if it wants. That was a stopper for me.

    See, they coded their "every other language should have remained pure" hubris into their virtual machine, they encoded it into their hardware, and from inception they designed their system to be limited and resistant to repair. So no thank you. And now Ruby fans want to take their niche language and cram it into the fundamentally flawed Java VM. Ooooohhh, sign me up!

    I swear the language feels like it was designed on a dirty napkin by someone who had no grasp of scope or symmetry.

    Might as well be Python. I am old enough to remember RPG and COBOL coding forms. Saving one apparent character, e.g. "}" (because in Python ":[newline]" is the same as "{", so what did we save again?), for the privilege of using whitespace, and counting tabs, as a control structure. And save the "but now we have editors to help us, so that doesn't matter" tripe; we had coding forms to "help us" too. I suppose it's okay to hack off a foot because you can always get a peg-leg to help you too?

    Why is it that each new generation of "language designers" insists on reinventing the same old square wheels of the previous generation and calling it new?

    Now get off my damn lawn... (yes, this rant makes me feel old, but come on, people, imagine where we would be going if you would just stop trying to reopen the same tapped-out mines...)

  • by TheTyrannyOfForcedRe (1186313) on Sunday March 20, 2011 @09:20PM (#35555042)
    Coming from someone who writes a lot of Python and currently has a lot of Python code in production, the Python language is a steaming pile of shit. Yep, I said it. It's as if Guido specifically went out of his way to choose the worst possible option at every design decision. The only things Python has going for it are a comprehensive library, "better than Ruby" performance, and mindshare. I would much rather code in Ruby, Java, C/C++, Lisp, Scheme, Smalltalk, or assembly than Python. Sadly, about 50% of my production code must be in Python. Sucks to be me!
  • I don't like Java, but I do have a couple of issues here:

    "Java doesn't have pointers" my pasty white behind, every object is a pointer in Java,

    Every object is a reference in Java. There is a world of difference between a name that refers to some object and an integer that might refer to some object, on which you can still do integer math, and the object might not even be there anymore...

    This is almost as if you're trying not to see the advantages. No more segfaults. No double-frees, no crazy-ass debugging where the wrong method gets called because your pointer is pointing to the wrong (or a corrupt) vtable, and you really have to try to get a memory leak.

    they do manage to use pointers to prevent first class object copying, so then they added clone() etc.

    And the number of times I should've just passed the original object, vs the number of times I really didn't want it to implicitly clone something? Again, I have to give this one to Java, with the caveat that the interface to clone() kind of sucks. Ruby has dup, and all objects have it by default. Implementing clone() in Java is a pain, and if an object doesn't implement it, you're SOL.

    the code ought to know when the last object reference is going out of scope so it can call a destructor manually if it wants. That was a stopper for me.

    Really? This?

    Think back to all the destructors you've ever written in C++. How many of them can you count that did more than free memory? In other words, how many destructors have you ever written which aren't entirely replaced by the garbage collector?

    I can pretty much count them on one hand. Filehandles, DB handles, etc. Yes, it sucks, but having to close a filehandle vs having to free every bit of memory I ever allocate? I'll take the filehandle.

    So now Ruby fans want to take their niche language and cram it into the fundamentally flawed Java VM.

    Wait, what?

    You haven't mentioned a single issue with the JVM. Your complaints have been about the Java language. Surely you can tell the difference?

    Oh, alright, you had one other complaint: You don't like the lack of proper destructors. Guess what? Ruby doesn't have them either. Ruby has finalizers, just like Java. Of course, I don't see anything about the JVM's design that prevents a language from implementing destructors anyway.

    It's also funny how you, like most of Slashdot, seem to have missed the point: JRuby exists, and is pretty much neck and neck with the official C Ruby implementation in terms of performance. It's just as stable, and almost everything that works in one implementation works in the other -- kind of like how you can have multiple C compilers.

    This article was about Mirah, which is not Ruby, nor trying to be. It's a way to make Java suck less, at least syntactically. If your gripe isn't with the syntax, you probably won't care about Mirah.

    And for a bit of balance, here are the features I really, really miss in Java:

    • Closures
    • Better setters/getters
    • Operator overloading (or something other than the awkward handling of == vs. equals())
  • yes, that... (Score:4, Interesting)

    by IBitOBear (410965) on Sunday March 20, 2011 @11:42PM (#35555860) Homepage Journal

    I _massively_ use destructors for doing more than freeing memory, particularly in multi-threaded code.

    Let's see...

    Closing files.

    Releasing, e.g. _unlocking_ regions of files shared between applications (e.g. matching flock() calls between constructor and destructor to lock and release records at precise times).

    Terminating protocols semantically and _then_ closing sockets instead of just closing sockets.

    Issuing signals on the _last_ release of a mutex that is coupled with a condition variable instead of on every release of a nested mutex.

    Unlocking and dismissing shared object libraries (e.g. undoing dlopen()) when, but not before, the last instance of any/every object dependent on the shared object file goes out of scope.

    Preventing "Cruft" in my heap by doing "deep" memory frees of complex structures as soon as I no longer need them instead of at "some random time in the future if ever".

    Changing modes and states on devices using ioctl() etc. (e.g. when the last "raw" use of the controlling terminal goes out of scope you put the terminal back into line mode until/unless you need to bring it back into raw mode.)

    Resetting hardware on last use.

    Emulating devices and subsystems that, by definition, reset themselves on last use.

    Doing all of the above with "exception safety" without having to write an ass-ton of "finally" blocks (though I _do_ wish C++ had "finally" 8-).

    Doing all of the above in "deep structures" so that my objects are true active objects instead of just nested hunks of memory.

    You are like a blind guy asking "when was the last time you really used your eyes for anything but reading" because you have never heard of art. When you presume destructors are "really just for freeing memory," you demonstrate a horrific limitation in your understanding of object, functional, and event-driven programming. Just because you don't understand the non-beginner ways to use a construct doesn't mean the construct is only used the way a beginner would use it.

    Meanwhile:

    I lived through the "P-System" and "P-Code"; the shortcomings, cost overheads, and raft of annoying assumptions built into the JVM are a "given" to me. Sorry for not doubling the size of my rant to make you happy. I work with too many system internals to think the JVM is a win. You go back to treating your heap as executable, and over-stressing your CPU translation lookaside buffers, and leave me alone... 8-)

    In counterpoint:

    I think closures are overrated, but I don't disparage them, because I recognize that just because "they have never been particularly necessary or useful to anything I have done" doesn't mean they are unnecessary or useless to persons other than me. Plus, you can pull off the same thing, more or less, in C and C++, for a limited number of variables, by returning a pointer to a nested function, but, like, eeew... Closures do typically require an executable data segment, which I find impure, but all of Java requires an executable data segment, so who am I to judge? Closures are just the latest brand of secret sauce that lets people throw memory at a problem instead of logic. 8-)

  • Re:yes, that... (Score:4, Interesting)

    by mmcuh (1088773) on Monday March 21, 2011 @08:46AM (#35557984)

    You are like a blind guy asking "when was the last time you really used your eyes for anything but reading" because you have never heard of art. When you presume destructors are "really just for freeing memory," you demonstrate a horrific limitation in your understanding of object, functional, and event-driven programming. Just because you don't understand the non-beginner ways to use a construct doesn't mean the construct is only used the way a beginner would use it.

    Also, freeing memory manually in C++ is almost always the wrong way of doing things. There are plenty of adaptable containers in the standard library, and when you want to write your own and actually do need to keep track of memory directly, there are smart pointers that deallocate whatever object or array they're pointing to when they go out of scope. Raw pointers and manual deallocation are only needed in very special cases, like when you're writing lock-free data structures or have to deal with C APIs.

    C++ really isn't about manual memory management, it's about scope-based memory management. You don't have to free things manually, and yet you can be completely certain of when a certain chunk of memory is deallocated.
