Red Hat Uncloaks 'Java Killer': the Ceylon Project

talawahdotnet writes "Gavin King of Red Hat/Hibernate/Seam fame recently unveiled the top secret project that he has been working on over the past two years, a new language and SDK designed to replace Java in the enterprise. The project came out of hiding without much fanfare or publicity at QCon Beijing in a keynote titled 'The Ceylon Project — the next generation of Java language?'"
This discussion has been archived. No new comments can be posted.

  • by Kagato ( 116051 ) on Tuesday April 12, 2011 @11:24PM (#35803336)

    Mod up. It's a fair point.

  • Re:Java killer? (Score:4, Informative)

    by tomhudson ( 43916 ) <barbara,hudson&barbara-hudson,com> on Wednesday April 13, 2011 @11:33AM (#35808542) Journal

    Sheesh, obviously I hit a sore spot :-)

    > GC implementation is not a language feature, it's a runtime feature. This has nothing to do with Java as a language. Furthermore, any sufficiently complex C++ program implements some form of (at least) semi-automated garbage collection; even if it's just reference-counting smart pointers. You apparently have no experience writing real software.

    Finalizers are a language feature. The fact that a finalizer is not guaranteed to ever run, neither when your object hits the java equivalent of going "out of scope" in c++ (no valid references left to it) nor even at program termination, is a language flaw. The runtime just surfaces that flaw.
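    To make that concrete, here's a minimal sketch (class names are made up, not from any real project). Nothing in the language or the JVM spec promises that finalize() will ever be called, not even at exit:

        public class FinalizerDemo {
            static class Resource {
                @Override
                protected void finalize() {
                    // May run at some arbitrary point after the object becomes
                    // unreachable, or never at all; behaviour can differ from
                    // run to run on the same machine.
                    System.out.println("finalize() ran");
                }
            }

            public static void main(String[] args) {
                new Resource();   // immediately unreachable: the java "out of scope"
                System.gc();      // only a hint; collection is not guaranteed
                // The program may exit right here without the finalizer ever running.
            }
        }

    Run it a few times; whether that line ever prints depends on the JVM, its flags, and timing.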

    Would multi-threaded real-time data servers qualify as "sufficiently complex"? The kind that never kill off a thread in their thread pool between startup and a shutdown months later, while serving a thousand requests a second, because they never leak a byte of memory? No reference counting needed, just well-behaved code and contracts between the loadable modules and the server about who owns what memory.

    It's called the "Dear Abby" school of memory management. If you pick it up, put it back when you're finished. If you give it to someone, either make sure that they know that they're expected to put it back when finished, or make explicit arrangements for them to return it to you so you can put it back.
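    In java terms (the names below are invented for the sketch, not taken from any real server), the same discipline looks like explicit acquire/release against a pool, with the release done deterministically in finally rather than left to a finalizer:

        import java.util.ArrayDeque;
        import java.util.Deque;

        public class BufferPool {
            private final Deque<byte[]> free = new ArrayDeque<byte[]>();

            public synchronized byte[] acquire() {
                // "Pick it up": the caller now owns this buffer.
                byte[] buf = free.poll();
                return (buf != null) ? buf : new byte[8192];
            }

            public synchronized void release(byte[] buf) {
                // "Put it back when you're finished."
                free.push(buf);
            }

            public static void main(String[] args) {
                BufferPool pool = new BufferPool();
                byte[] buf = pool.acquire();
                try {
                    // ... use the buffer; anyone we hand it to must not keep a reference ...
                } finally {
                    pool.release(buf);   // deterministic, unlike a finalizer
                }
            }
        }

    The contract is the whole trick: exactly one owner at a time, and ownership is handed back explicitly.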

    It's not that hard to get right if such an obvious dummy as I can do it, right?

    > Really though, are you seriously arguing against non-deterministic GC because you think every program in the universe should be "well-behaved" and "provable" (on your own terms)? We might as well toss out, gee, I dunno, almost every modern language under the sun while we're at it. I think I can count on one hand the number of languages in regular, widespread use today whose standard runtimes leave memory management exclusively up to the programmer.

    Finalizers in java can end up being just so much dead code, even between runs on the same machine. That's the sort of non-deterministic behaviour, where the code says one thing but what actually happens is "undefined" even from run to run, that cries out for fixing. That's not "arbitrary" - that's a bug.

    "Class explosion?" This sounds like a term that would be used by a C programmer who writes C code inside of C++ classes, but who doesn't actually design object-oriented programs. "Classes if necessary, but not necessarily classes" would be exactly the kind of tripe I would absolutely never expect an experienced C++ programmer to say.

    Every non-trivial c++ program contains a fair chunk of c code inside classes. All those flow-control statements ... for(), if(), else, switch(), break;, continue; ... they're all c, not c++. Same with all the non-overloaded operators. You can't write a c++ program without using some c code. Here's a hint - look at the declaration for main().

    "Classes if necessary, but not necessarily classes" is a rule of thumb a lot of us use; in my case, it's because, before java was ever even a gleam in JG's eye, I tried the "make everything a class" idiom, and ran into the same class explosion problem everyone does. Some people think that a proliferation of classes shows how great they are at coding. I don't. Classes, like everything else, are just semantic tools for managing complexity. Nothing more. Anyone who invests them with more meaning than that doesn't understand what's actually going on behind the scenes - your classes aren't real "objects" - just a series of bytes handily organized to do the job.

    > It's a separate tool though. #define is not part of the C++ language in any way. It's part of the CPP language.

    The pre-processor is an integral part of every c implementation, and always has been. And what is this CPP you speak of? The c pre-processor ...

  • by JesseMcDonald ( 536341 ) on Wednesday April 13, 2011 @12:06PM (#35808960) Homepage

    In this case, yes. Just like UTF-16, UTF-8 can encode every Unicode code point; the original UTF-8 design actually went all the way to 31 bits (U+7FFFFFFF), well beyond UTF-16's reach. Since both are variable-length encodings, UTF-16 is no simpler to work with. UTF-8 has the additional advantages of being byte-for-byte identical to ASCII for the first 128 code points, having a single byte order instead of UTF-16's big-endian/little-endian variants, only producing a zero byte for an actual NUL character (UTF-16 embeds 0x00 bytes in every ASCII character), generally taking less space, not being confused for UCS-2, etc.
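    A quick sketch of the size trade-off (class name is made up; UTF-16BE is used below so getBytes() doesn't prepend a byte-order mark):

        import java.nio.charset.StandardCharsets;

        public class EncodingSizes {
            public static void main(String[] args) {
                String ascii = "A";                                    // U+0041
                String cjk   = "\u4E2D";                               // U+4E2D
                String emoji = new String(Character.toChars(0x1F600)); // U+1F600, outside the BMP

                for (String s : new String[] { ascii, cjk, emoji }) {
                    System.out.printf("U+%04X  UTF-8: %d bytes  UTF-16: %d bytes%n",
                            s.codePointAt(0),
                            s.getBytes(StandardCharsets.UTF_8).length,
                            s.getBytes(StandardCharsets.UTF_16BE).length);
                }
                // U+0041: 1 vs 2, U+4E2D: 3 vs 2, U+1F600: 4 vs 4.
                // UTF-8 wins for ASCII, UTF-16 wins for most CJK, they tie outside the BMP.
            }
        }

    So "generally taking less space" holds for the ASCII-heavy text (source, markup, protocols) most programs shuffle around; CJK-heavy text is the main counterexample.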

    See also: advantages of UTF-8 compared to UTF-16 [wikipedia.org]

"A car is just a big purse on wheels." -- Johanna Reynolds

Working...