Ruby, Clojure, Ceylon: Same Goal, Different Results
snydeq writes "Charles Nutter, Rich Hickey, and Gavin King each discovered that 'simplicity' doesn't mean the same thing as they developed Ruby, Clojure, and Ceylon, respectively. 'Languages that are created with similar goals in mind may yield highly disparate final results, depending on how their communities understand those goals,' writes Andrew Oliver. 'At first, it surprised me that each language's creator directly or indirectly identified simplicity as his goal, as well as how differently the three creators and their languages' communities define what simplicity is. For Ruby, it is about a language that feels natural and gets out of your way to do what you want. For Clojure, it is about keeping the language itself simple. For Ceylon, it is a compromise between enabling the language to help, in King's words, "communicating algorithms to humans" and providing proper tooling support: the same general goal, three very different results.'"
Lots of information about Clojure... (Score:4, Insightful)
Re:Lots of information about Clojure... (Score:4, Informative)
There isn't. But this is how to do it in Ruby:
String.instance_methods.select {|m| m.match("sub")}
=> ["sub", "gsub!", "gsub", "sub!"]
instance_methods returns an array. select iterates over it, calling a code block on every element; if the block returns true, the element is added to the array select returns at the end of the loop. |var| is how the element is passed into the block. Blocks can span multiple lines; multi-line blocks are usually wrapped in do..end, but { } still works.
String.instance_methods.select {|m| m =~ /sub/} is the perlish alternative.
I didn't match "last" as in the article because Ruby's String has no methods with "last" in their name.
Re: (Score:3, Informative)
A slightly better way, IMHO:
String.instance_methods.grep(/sub/)
Re: (Score:2)
Re: (Score:3)
Programmers with zero *nix background puzzle me.
Re: (Score:2)
Programmers with zero *nix background make sure you get your dole check and ensure that the local kwik-e mart has goods for you to spend it on.
Re: (Score:2)
I prefer the more concise "".methods[/sub/].
Re: (Score:2)
Does that work?
> "".methods[/sub/]
TypeError: can't convert Regexp into Integer
I tried it with 1.9.3 and 1.8.7. Array's rdocs don't list a [] method with a regexp argument.
Re: (Score:2)
"String.instance_methods.select {|m| m =~ /sub/}} is the perlish alternative."
Actually your "Perlish" form is the preferred way to do it in Ruby.
On the other hand, that's one of the nice things about Ruby. There are different ways to get things done. There are a few people who might consider that a flaw, but that does not seem to be the general consensus.
Re: (Score:2)
Well, we only just started on Ceylon and are not even finished yet, so it's a bit early to start comparing how many times it gets mentioned in relation to the rest, but people have to start some time :)
Every programming language is touted as "simple" (Score:4, Insightful)
If it gains traction, then it will have to deal with feature creep (keeping up with the new hot languages), standard library bloat, backward compatibility, and differing interpretations of the spec by compilers and developers. Then it becomes no longer simple.
Java is the classic example. It's hard not to giggle or mutter "WTF?" when you read Sun's original position paper claiming the language was "simple".
Re: (Score:2)
Well, from my understanding changing times/styles also contribute to this. Wasn't C once considered a relatively "high" language when it first emerged and is now more of a "middle" language?
Compared to rubbing two sticks together or using flint, matches are "simple", but now we have lighters.
Re: (Score:2)
"Wasn't C once considered a relatively "high" language when it first emerged and is now more of a "middle" language?"
No, C has never been considered a "high level" language in the Computer Science world, when compared to its predecessors such as BASIC and PASCAL. (Say what you want about BASIC, but it *is* a high level language, and vastly more so today than when it first appeared.)
People who insist that C is a "high-level" language (you did not do that) make me cringe. At best, C is a "mid-level" language, lying somewhere between a high-level language and Assembly.
Re: (Score:2)
I'm not sure that C is that old. FORTRAN, COBOL, and LISP (and a number of others) are all older than C and are higher level than C. Not to mention that the LISP enthusiasts probably consider all other languages to be lower level languages.
Re: (Score:2)
"I'm not sure that C is that old. FORTRAN, COBOL, and LISP (and a number of others) are all older than C and are higher level than C."
This is precisely what I was saying. It came later, but was lower-level.
And it was, for the specific purpose of being more performant. It is not easier to use, and it is not "simpler" to learn to use wisely. It is just more efficient at the compiled level, while being drastically less "efficient" at the code level. Personally, I can barely stand to look at it.
Re: (Score:2)
It must have been a real long time ago, as I wrote Win98 VxDs in C and in high-level assembly (assembly with complex macros, but not more complex than recursive substitutions). My C driver, which did the exact same thing as my HLA one, was bigger and less understandable than the other one. It was that way because function pointers rapidly get ugly, a big contrast to assembly, where there are no types, only word sizes and alignment.
According to my anecdotal knowledge, C is only a ubiquitous portable assembly language; nothing more, since a good set of macros in assembly can be more terse than C if you target only one type of CPU.
Re: (Score:2)
"Win98" and "a long time ago" have no business being remotely close to each other -- when we're talking programming language lineages, many of the developments still interesting today happened between the 60s and the 80s, and some (LISP) date back to the 50s.
Now get off my lawn!
Re: (Score:2)
"According to my anecdotal knowledge, C is only an ubiquitous portable assembly language; nothing more, since a good set of macros in assembly can be more terse than C if you target only one type of CPU."
I would tend to agree with this. That was exactly the purpose of C. It wasn't "high level", but it played well cross-platform while Assembly did not. So it was "higher level" than Assembly, in that it abstracted out much of the hardware interface.
Re: (Score:2)
In the 1980s, C was considered a high-level assembly language.
Re: (Score:2)
C is absolutely fairly and squarely a high level language.
What features do Pascal and Fortran and Cobol have that make them high level and C not?
Re: (Score:3, Insightful)
Java is the classic example. It's hard not to giggle or mutter "WTF?" when you read Sun's original position paper claiming the language was "simple".
It was, and still is, simple compared to C++ or Ada.
Although at the time, there was the quote "Claiming Java is easier than C++ is like saying that K2 is shorter than Everest." And there is some truth to that.
But Java *IS* simple! (Score:2)
Java combines the natural and easy syntax of C with the blazing speed of Smalltalk.
*Tadum* *Crash* *Thud*
Thank you, thank you, I'm here all week. Try the fish and tip your waiter.
Re:Every programming language is touted as "simple (Score:4, Insightful)
Just as there is no compression algorithm that's best at compressing all data, it is unlikely that anyone will come up with a "decision compression language" that will be the best at compressing "everything". To make things more complicated, you often need to change certain stuff in the future, so you shouldn't pack everything too tightly, even if the language allows it.
Last but not least, I prefer a language not because of the code I need to write, but because of all the code I won't need to write (and debug and document, etc). In other words, the libraries and modules are important. Even if a language is very good and simple, and you only have to write one third the lines to do something, it still is not as good if you have to write everything you may need yourself (database connectors, XML parsers, web clients, big-number support, strong crypto, etc). In contrast, a language that is three times more verbose but has libraries for nearly everything you need would actually result in you writing a lot fewer lines, and, if the libraries aren't crap, supporting and documenting a lot fewer lines.
So a language that makes my life simple isn't necessarily a simple language.
Re: (Score:3)
You are very right.
May I recommend Paul Graham's "On Lisp"?
Use of functional programming, and macros to build DSLs, reduces the code you need to write and can simplify things.
You then need a good FFI (foreign function interface) to utilize external libraries.
My favorite system (currently) is Gambit-C Scheme. It supports define-macro as well as hygienic macros. It compiles to C, so the FFI is simply writing inline C code where needed. Best of all, it has a 20-year history behind it.
Re: (Score:2)
Mad (Score:3, Insightful)
It makes me mad to see a link to an article about the "9 top languages" in which some major (established) players in the field, such as Haskell or OCaml, are not mentioned, while languages-to-be get some nice coverage.
Creating a programming language boils down to being fashionable, rather than doing something neat.
Success (Score:2)
Re: (Score:2)
apparently it's a Java remix
Re: (Score:2)
"apparently its a java remix"
Not even close.
Its syntax is nothing like Java's. Its structures are nothing like Java's. Both being high-level languages, of course it is almost inevitable that they deal with some of the same data structures, like strings and arrays. But program structure is markedly different.
When it was first becoming popular outside Japan, various comparisons showed Ruby programs averaging about 20% as many lines of code as Java programs that did the same things.
Granted, those demonstrations m
Re: (Score:2)
That's great and all, but I was responding to the line about Ceylon.
Re: (Score:2)
" thats great and all but I was responding to the line about Ceylon"
Pardon me then. My comments were well-meant.
Re: (Score:2)
It's not, really, but of the three it's maybe the most similar. We're trying not to stray too far from familiar territory for Java developers.
Find/replace "Simple" with "Good" (Score:3)
Simple leads to different results because it usually means something more like "quality". Simple is in itself not an absolute value. Instead, the simplicity of something is a ratio of its value to its sucking. So what they're really saying is "I'd like to achieve high value outcomes with the least amount of sucking along the way." There's a lot of ways to do that.
Simple Made Easy (Score:3)
Re: (Score:2)
He's a smart guy and that's a good talk, but his arguments about simplicity are a little weird. Trying to use the etymology of the word 'simple' as a justification for design choices? I found Clojure to be a language where first you had to make the leap-to-lisp before it was easy, and making that transition wasn't helped by having the JVM/standard lib/Java syntax as prerequisites (not to mention lein/ant/maven/ide configs/etc...).
Heh, consider what this description says about Clojure's simplicity: "Leiningen
Re: (Score:3)
Keep in mind that Clojure users could use Ant or Maven or Gradle for their build system, and all of those work perfectly fine; Leiningen, on the other hand, is simpler than any of them.
Clojure developers have a high bar for simplicity, albeit measured in a way that might not make much sense to anyone who doesn't first grok LISP.
Re: (Score:2)
Ruby's Principle of Least Surprise (Score:2)
Matz designed Ruby with the "principle of least surprise" in mind. No, not for you. You might be surprised or not. It's whatever causes the least surprise to Matz, the creator. 8D
Re: (Score:2)
Yes. And perhaps Ruby is simple... to Matz. One important point: simpler != fewer characters in source code.
Re: (Score:2)
"One important point: simpler != fewer characters in source code."
But you can be as verbose as you like in Ruby. Unlike many other languages, the "standard" way of getting things done is not mandatory. You have options.
So you can make your source code compact, or, if you think that's not easily readable, expand it to a "simpler" form.
Case in point: while working on a project, one of my fellow workers (very smart guy) refactored some of my code to a single line of nested lambdas.
I examined his code very carefully. But the simple fact was that it was too hard to me
Re: (Score:2)
I love Ruby. It's easily my favorite language right now.
With that said, more verbose doesn't always equate to easier to read nor does terse and short equate to simple. I'm very happy with Ruby's flexibility in that regard.
Re: (Score:2)
Even MacRuby does not really compile, but exposes your raw code. JRuby does, of course, but it's not 100% compatible and requires JVM. JRuby is an admirable project, don't get me wrong... but having native bytecode compilation would be tremendous.
One
Re: (Score:2)
It's trivial to decompile Java bytecode, and even decompiling machine code isn't all that hard. It really doesn't matter, just use Ruby for desktop apps if you like it.
Re: (Score:2)
"It's trivial to decompile Java bytecode, and even decompiling machine code isn't all that hard. It really doesn't matter, just use Ruby for desktop apps if you like it."
It is trivial for knowledgeable people to decompile bytecode. It isn't trivial for the majority of commercial software customers. And it is even easier to simply read the raw code of non-compiled programs.
Further, don't confuse the task of decompiling with the task of making sense of the decompiled code. In most cases there are no meaningful variable names; instead they get named things like "integer0", or whatever the decompiler decides is a good designation. And it is not generally well-formatted.
Any Non-Relational Language is OVER-Simplified (Score:2)
Stop looking under the lamp post for the keys.
Re: (Score:2)
That's what the fucking fuck.
Re: (Score:2)
Re: (Score:2)
Out of interest, what languages do you think are simple?
Re: (Score:3)
Re: (Score:2)
The problem is the designers assumed that all programmers are simple minded idiots.
For a large market that Java was originally aimed at, that assumption was a reasonable first-order approximation.
It is why I never liked Java (that, and Sun's ridiculously exaggerated claims about it)--but it is still arguably a feature, and one that is appropriate in many situations.
Re:java backend is not simple. (Score:4, Interesting)
That's interesting. The creators of C# have a somewhat similar philosophy: they say that they would like it to be a "pit of quality": it should be easy to write correct code. But that doesn't mean they removed features that can be abused.
As a consequence, the things you mention (pointers, gotos, operator overloading) are all included. But, for example, pointers are "hidden": they have to be in an "unsafe" block.
On the other hand, fall-through switch cases, for example, are not allowed in C# at all; they thought those were not worth all the bugs they cause.
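For anyone who hasn't hit it, here is a minimal Java sketch (Java still allows fall-through) of the accidental-fall-through bug the C# designers were guarding against; the method name and values are invented for illustration:

class FallThroughBug {
    // Hypothetical example: the missing break means a "pending" status
    // silently falls through and is reported as "cancelled".
    static String describe(int status) {
        String result;
        switch (status) {
            case 0:
                result = "pending";
                // oops: no "break;" here, execution falls through into case 1
            case 1:
                result = "cancelled";
                break;
            default:
                result = "unknown";
        }
        return result;
    }
}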
Re: (Score:2)
Which is a bad idea, because they're very frequently useful. A better idea would have been to reverse the default: make it not fall through unless you use a special keyword.
Re: (Score:3)
Maybe that would be a good idea in an ideal world. But in reality, such behavior would be deeply confusing for people who know C, C++ or Java. And I think that "it should be easy to write correct code" applies to people who already know another language too.
Also, from my experience, fall-through is not that useful anyway. I don't think I ever wrote code in C# where it would be useful. Having two cases for the same code sometimes is useful, and C# does support that.
Re: (Score:2)
I actually kind of like that. It also enables you to have 3 or 4 cases that all need different minor initializations (say they all want to initialize a starting condition to different values) to then jump to a common case, which was actually a frequent pattern in assembly programming that's unfortunately difficult in modern languages.
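A small Java sketch of that staged-initialization pattern (the tier numbers and amounts are made up for illustration): each case adds its own extra and then deliberately falls through into the cases below it.

class TierBudget {
    // Hypothetical: tier 3 gets every bonus, tier 1 only the last one.
    static int startingBudget(int tier) {
        int budget = 100;            // common base amount
        switch (tier) {
            case 3: budget += 50;    // tier-3 extra, then fall through
            case 2: budget += 30;    // tier-2 extra, then fall through
            case 1: budget += 20;    // tier-1 extra, then fall through
            default: break;          // common end point
        }
        return budget;               // tier 3 -> 200, tier 2 -> 150, tier 1 -> 120
    }
}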
Re:java backend is not simple. (Score:5, Informative)
That's why there is no easy way to explicitly do things such as pointers, gotos, and operator overloading.
The reason there were no pointers was that pointer manipulations were highly machine dependent. Java emerged out of Oak, and the slogan "write once, run anywhere" was key to its popularity.
Goto -- came from the whole philosophy that goto leads to bad code.
Operator overloading and multiple inheritance are both examples where subtle shifts in code can lead to enormous shifts in how the compiler views the code. One of the key aspects of Java was making sure that the side effects of changing code were contained.
Re: (Score:3)
"Operator overloading and multiple inheritance are both examples where subtle shifts in code can lead to enormous shifts in how the compiler views the code."
That and I've still not really seen many, if any, convincing arguments where multiple inheritance is a good idea. We've had a few MI zealots extol their virtues here and give us examples of why MI was essential to their project. The problem is, each time they've done so, so far on Slashdot at least, they've only served to prove they have absolutely no idea
Re: (Score:3)
Program in Java? Everywhere you see interfaces, that's multiple inheritance; they just restricted you to inheriting only from the interface, not the implementation. Which means every class that implements it has to rewrite that code. Depending on the interface and the class, that may or may not be a good idea. But I'll frequently find myself writing very similar code for multiple classes that implement the same interface.
What they really needed to do was just block diamond inheritance: inheriting from two
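One common Java workaround for the duplication described above is to keep the interface for the type relationship and push the shared body into a helper object that each class delegates to. A minimal sketch, with every name invented for illustration:

class AuditSupport {                          // the shared implementation, written once
    String record(String owner) {
        return "[audit] " + owner + " checked";
    }
}

interface Auditable {
    String auditRecord();
}

class Invoice implements Auditable {
    private final AuditSupport audit = new AuditSupport();
    public String auditRecord() { return audit.record("Invoice"); }
}

class Shipment implements Auditable {
    private final AuditSupport audit = new AuditSupport();
    public String auditRecord() { return audit.record("Shipment"); }
}

The one-line delegating methods are still boilerplate, which is essentially the complaint; the shared logic itself, though, lives in exactly one place.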
Re: (Score:2)
"Program in Java? Everywhere you see interfaces, that's multiple inheritance"
Yes, but when most people talk about MI they're talking about the ability to perform actual inheritance of real classes, rather than the ability to implement interfaces - it's that that I'm referring to.
"Which means every class that implements it has to rewrite that code. Depending on the interface and the class, that may or may not be a good idea. But I'll frequently find myself writing very similar code for multiple classes that
Re: (Score:2, Interesting)
Interfaces are about type checking and making your compiler happy; MI is about sharing implementation (which is mixed with type checking in most programming languages). Part of the confusion about interfaces and MI in Java is caused by tutorials and docs which wrongly mention interfaces as a way of doing MI (Sun training, I'm blaming you!).
The biggest problem with MI is the implementation: it leads to lots of "exceptional" cases in the handling of instance variables and method dispatch. If you google for
Re: (Score:2)
Re: (Score:3)
Well, in languages like Haskell I use them all the time. For example, if I'm doing algebra on a matrix I want to inherit from the algebra library and the matrix library.
So something like 3x^2 + 4x + 7, where x = [[2,3], [1,1]] can be used.
Re: (Score:3)
Sure, but in an OO language it makes far less sense, which is what I was referring to - specifically why the lack of support for actual MI in Java isn't a problem - apologies for not making that explicit.
In an OO language you'd recognise that algebra defines actions upon an object, and so you'd simply implement algebra the interface against a Matrix class, and define the Matrix specific implementations of algebraic actions there. Then anything that is a matrix, is a matrix, and anything that is a matrix, can have algebraic actions performed upon it.
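A rough Java sketch of what that looks like, with invented names: the algebraic operations are declared in an interface and Matrix supplies the matrix-specific implementations.

interface Algebra<T> {
    T add(T other);
    T multiply(T other);
}

class Matrix implements Algebra<Matrix> {
    private final double[][] cells;

    Matrix(double[][] cells) { this.cells = cells; }

    public Matrix add(Matrix other) {
        int rows = cells.length, cols = cells[0].length;
        double[][] out = new double[rows][cols];
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++)
                out[i][j] = cells[i][j] + other.cells[i][j];
        return new Matrix(out);
    }

    public Matrix multiply(Matrix other) {
        int rows = cells.length, inner = cells[0].length, cols = other.cells[0].length;
        double[][] out = new double[rows][cols];
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++)
                for (int k = 0; k < inner; k++)
                    out[i][j] += cells[i][k] * other.cells[k][j];
        return new Matrix(out);
    }
}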
Re: (Score:2)
I don't see how OO changes it. To me you would run into a problem with not wanting to implement everything twice or 100x. For example, the connection between sin(x) and its Taylor series I would have in Algebra, not Matrix. But if I want to compute a sin function on a matrix space I'm going to want to use the Taylor series. Why should I have to re-implement all that code?
___
Or, to be less mathy: I have "imports" and "cars" as classes; why not inherit from both for Toyota objects?
Re: (Score:2)
I don't see why you'd have to implement it many times; you only implement it if it changes, and then you have to implement it.
Most of the solution revolves around breaking down the problem and reusing those broken-down chunks. If you're implementing a function that has many, many lines of code and then complaining about needing to reimplement it, then the chances are that you could have broken down that function more and reused parts of it.
"Or to be less mathy. I have "imports" and "cars" as classes why not
Re: (Score:2)
"Or to be less mathy. I have "imports" and "cars" as classes why not inherit both for Toyota objects?"
I don't really understand this example. Are you saying you might have imported cars? If so, why not have Toyota inherit from ImportedCar, which inherits from Car? If your assertion is that you might have other types of imports than cars, then your base class is Import, from which Car inherits. I don't really see the problem?
Then I still have a problem
Car -> ImportedCar -> Mini Cooper
is fine but
Suit
Re: (Score:2)
"I have to re-implement the entire imported structure again.
A Mini has properties of Car (like wheels) and properties of imports (like international duties)"
So do as I said, if an import sits below a car, then have Import as your base class, so:
Import -> Car -> Mini Cooper
Import -> Suit -> Gucci
Stick your international duties in import, your wheels in car. This only further serves to demonstrate the point that most people think they need MI when they fail to grasp proper OO structure.
"True, but
Re: (Score:2)
And with that structure how do you handle a GM or a Brooks Brothers suit? You can't have Import as a base class for Car because not all cars are imports. You can't have Cars fork into Import and Domestic because you also need Import / Domestic for Suits and no Suits are Cars.
Re: (Score:2)
"And with that structure how do you handle a GM or a Brooks Brother's Suit? You can't have Import as a base class for Car because not all cars are imports. You can't have Cars fork into Import and Domestic because you also need Import / Domestic for Suits and no Suits are Cars."
This is the problem though, you can keep on adding cases until you break a solution and say "Hey look, I told you multiple inheritance was the solution!" but you're still completely wrong. In this case you need to then question whether import even needs to be a class, or whether you just have, say, a product class with a country of origin property from which you can derive if it's imported.
Re: (Score:2)
This is the problem though, you can keep on adding cases until you break a solution and say "Hey look, I told you multiple inheritance was the solution!" but you're still completely wrong.
This is the same case from the start. It's just taken several rounds for you to see why you can't use a single hierarchy.
In this case you need to then question whether import even needs to be a class, or whether you just have, say, a product class with a country of origin property from which you can derive if it's imported.
Re: (Score:2)
"This is the same case from the start. Its just taken several rounds for you to see why you can't use a single hierarchy."
But I've done exactly that. I've given you a solution. You're using a really weak argument here.
"I don't want a taxes data structure I want hundreds of methods and objects having to do with imported vs. domestic. Imported objects may have far service offices. They have shipping times. They may have multiple conflicting law sets that apply to them."
Sure, and all this can be represented as
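A hedged Java sketch of the composition approach being argued for here: import-versus-domestic behaviour hangs off an Origin object that any product carries, instead of off a second base class. Every name below is invented for illustration.

interface Origin {
    double dutyRate();
    int shippingDays();
}

class Domestic implements Origin {
    public double dutyRate()  { return 0.0; }
    public int shippingDays() { return 2; }
}

class Imported implements Origin {
    private final String country;
    Imported(String country)  { this.country = country; }
    public double dutyRate()  { return 0.08; }  // far service offices, law sets, etc. would hang off here too
    public int shippingDays() { return 21; }
}

class Car {
    private final Origin origin;                 // a GM is a Car with a Domestic origin,
    Car(Origin origin) { this.origin = origin; } // a Mini is a Car with an Imported origin
    double duty(double price) { return price * origin.dutyRate(); }
}

class Suit {
    private final Origin origin;                 // likewise a Brooks Brothers suit vs. a Gucci one
    Suit(Origin origin) { this.origin = origin; }
    double duty(double price) { return price * origin.dutyRate(); }
}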
Re: (Score:2)
In an OO language you'd recognise that algebra defines actions upon an object, and so you'd simply implement algebra the interface against a Matrix class, and define the Matrix specific implementations of algebraic actions there. Then anything that is a matrix, is a matrix, and anything that is a matrix, can have algebraic actions performed upon it.
While I can't speak for Haskell's implementation of algebra, I'd have to say that mathematically, an algebra is far more than just a definition of "actions upon an object" in the OO sense. An algebra also defines the results of those actions, while in OO, an interface only defines the type signature of those actions. So you can happily define your interface as "supports-add-and-subtract" without defining that "x - x must equal zero". This is only half of an algebra - if that.
And really, isn't this the core p
Re: (Score:2)
Typo: "language define" -> "language development"
Re: (Score:2)
Funnily enough, Haskell allows you to do that. What you are looking for is monadic operators (http://en.wikipedia.org/wiki/Monad_(functional_programming)).
You can define arbitrary types and then use functions that return an Either.
So for example say you are using Nat (natural numbers).
data Nat = Zero | Suc Nat
so, for example,
8 = Suc (Suc (Suc (Suc (Suc (Suc (Suc (Suc Zero)))))))
Now you have fromInt :: Integer -> Either Nat Integer
given by
fromInt x = if x >= 0
            then Left (toNat x)   -- in range: converted to a Nat
            else Right x          -- out of range: handed back unchanged
  where
    toNat 0 = Zero
    toNat n = Suc (toNat (n - 1))
Re: (Score:2)
"While I can't speak for Haskell's implementation of alebra, I'd have to say that mathematically, an algebra is far more than just a definition of "actions upon an object" in the OO sense. An algebra also defines the results of those actions, while in OO, an interface only defines the type signature of those actions. So you can happily define your interface as "supports-add-and-subtract" without defining that "x - x must equal zero". This is only half of an algebra - if that."
It's the implementation of the
Re: (Score:2)
That and I've still not really seen many, if any convincing arguments where multiple inheritance is a good idea.
There is value in declaring that instances of a class can participate in some pattern or other. The concept of interfaces is a way of doing this that is used in Java (and in variations in a number of other languages too). However, it uses a different dispatch model than direct inheritance: straight indexing into a vtable won't work (the point of dispatch doesn't have enough information at compile time, so a more complex — and somewhat slower — lookup is required).
Ontologically, multiple inheritance is not a problem either: it's just the is-a (well, strictly the is-a-specialization-of) relationship.
Re: (Score:2)
"There is value in declaring that instances of a class can participate in a some pattern or other. The concept of interfaces is a way of doing this that is used in Java (and in variations in a number of other languages too)."
Agreed, you at least need the facility to do that.
"Ontologically, multiple inheritance is not a problem either: it's just the is-a (well, strictly the is-a-specialization-of) relationship."
Indeed, but how often is something actually two things? That's what multiple inheritance implies.
Re: (Score:2)
I tend to think "simplicity" was part of the goal in not including those things. As you point out, it was more simplicity for the compiler, but simplicity for the developer still flows out of it in this case.
Re: (Score:2)
Clojure has an incredibly hard job.
Take language A and use libraries from language B where A and B have totally different ideologies about everything. We need a good LISP and we need a good set of modern libraries for that LISP. But yeah mixing them kinda sucks.
Re: (Score:3)
Re: (Score:2)
oh my, Ada. COBOL meets Pascal. ;P
Re: (Score:2)
Re: (Score:2)
I can assure you I am very proficient in all three of those languages. Ada had its day, peaking in the late 90s but in decline ever since.
Re: (Score:3)
Ada wasn't bad, and certainly capturing more bugs at compile time is wonderful. One of the things I love about Haskell is that generally if the program compiles it does what you wanted it to. I save a ton of time on debugging.
A new Ada-like language, procedural with light object orientation, static and strong compile-time checks, extensive libraries, and financial backing would be good.
As an aside, Ada doesn't have closures, it doesn't have tail recursion... even in the 1970s this was the reason ironica
Re: (Score:2)
Actually Ada pretty much fits the description you've given except for the extensive library. The financial backing is already there due to the military projects attached to it. If you want reliable software for potentially dangerous things Ada is the only acceptable choice.
And Ada doesn't support tail recursion for the simple reason that well written software shouldn't need recursion. Additionally it's actually terribly inefficient, Ada was also meant for embedded systems. Do you realise what happens every time you call a function? Your processor puts the program counter and other registers on the stack and then jumps to the function call.
Re: (Score:3)
And Ada doesn't support tail recursion for the simple reason that well written software shouldn't need recursion. Additionally it's actually terribly inefficient, Ada was also meant for embedded systems. Do you realise what happens every time you call a function? Your processor puts the program counter and other registers on the stack and then jumps to the function call.
When a language supports "tail recursion" that actually means it does "tail recursion elimination". Which means that the processor does NOT put anything on the stack per iteration. There is nothing inefficient about tail recursion when the language supports it.
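A small sketch of what that rewrite means, written in Java only because it's the thread's lingua franca (javac does not actually perform this elimination); the names and numbers are invented for illustration.

class TailCallSketch {
    // Tail-recursive form: the recursive call is the very last thing done.
    static long sumTo(long n, long acc) {
        if (n == 0) return acc;
        return sumTo(n - 1, acc + n);    // nothing left to do after this call
    }

    // What tail-call elimination effectively turns it into: a plain loop,
    // so nothing is pushed on the stack per iteration.
    static long sumToEliminated(long n, long acc) {
        while (n != 0) {
            acc = acc + n;
            n = n - 1;
        }
        return acc;
    }
}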
Re: (Score:2)
So you might want to look at how rewriting what the programmer did is against the Ada philosophy. You should keep in mind that predictability is important. Hence using tail recursion really would lead to recursion in Ada, resulting in possible stack overflows.
Re: (Score:2)
And Ada doesn't support tail recursion for the simple reason that well written software shouldn't need recursion. Additionally it's actually terribly inefficient, Ada was also meant for embedded systems. Do you realise what happens every time you call a function? Your processor puts the program counter and other registers on the stack and then jumps to the function call.
That's what happens without tail recursion, which is why you want tail recursion. What happens with tail recursion is the call gets rewritten as iteration.
Re: (Score:2)
It's not cause you can solve most problems using large explosions
Re: (Score:3)
Tail recursion doesn't imply the compiler rewriting what the programmer wrote, to me; it simply means using recursion in the return line.
A "tail recursive call" is one where recursion happens in the return line.
For any language that allows recursion at all, "tail recursion" as a language property means the language rewrites tail-recursive calls as iteration during compilation or execution.
In any case if the language wanted to allow recursion and offer reliability it could just flag on recursive calls it couldn
Re: (Score:2)
"Ruby has a horrible syntax, it's slow and unreliable."
Ruby's syntax is exactly why most people who like it use it. And it's not as if its syntax were unusual; almost all of it was "borrowed" from existing languages, though in a generally consistent way, so it is still coherent.
But okay. You don't like the syntax. That's your prerogative. About being "slow and unreliable", however:
It is no slower than other modern "dynamically typed" languages, though it is generally true that as a group they are slower than compiled languages like C or even Java. (And e
Re: (Score:2)
Re: (Score:2)
Ruby is far from consistent in my opinion. But that's subjective so I'll skip that one. Chaining dot operators doesn't add to the readability of a language as some people seem to think it does. Yet for some reason the Ruby crowd seems to assume this is a good idea.
It actually is slower than the competing languages. Python (also dynamically typed) is a lot faster. In fact in certain cases even PHP beats Ruby at speed of execution. And your assumption that Java can only be run by a virtual machine is just p
Re: (Score:2)
"Ruby is far from consistent in my opinion. But that's subjective so I'll skip that one. Chaining dot operators doesn't add to the readability of a language as some people seem to think it does. Yet for some reason the Ruby crowd seems to assume this is a good idea. ... It actually is slower than the competing languages."
It must all be kept in perspective. If you think Ruby syntax is inconsistent in comparison to Python, you need your head examined.
And while I will grant that Ruby is -- a little, not a lot -- slower than Python on benchmark suites, code maintainability is important, and if you want to compare readability (and, as I mentioned, actual syntactical consistency), Ruby is the clear winner. Significant-whitespace languages simply aren't as readable as other infrastructures. I know some die-hards disagree, but blind st
Re: (Score:2)
I'd say in terms of readability it might actually be ranked below SPARC assembly (and that's not a compliment).
What do you have against SPARC assembly? It is an extremely straightforward three-address instruction set without complications. The only slightly challenging part is register windows. It is by far my favourite instruction set to write for (admittedly I don't really write assembler anymore...)
Now PA-RISC on the other hand... Luckily that is dead.
Re: (Score:2)
Re: (Score:2)
Ah right, it's the Sun assembler not the instruction set you don't like :) That makes a lot more sense.
I am not sure what you mean by missing symbols and confusing naming. Coming from MC 68k at the time, SPARC assembler was a great leap forward.
99 bottles of beer in PA-RISC [99-bottles-of-beer.net]
Re: (Score:2)
But the behaviour is unpredictable for more complicated situations
Can you give an example of this? We're working hard to make Ceylon exactly the opposite of what you mention here, and even though it's still a work in progress I don't see why you would say this.
Re: (Score:2)
Re: (Score:2)
I agree that was the case on the older systems. Luckily with the increase in processor speed it's not that much of a problem any longer.
I can only assume you've never used Eclipse.
Dumb programmer approach to Java:
1. Don't even think about memory allocation. Forget to free up references to objects you're not using and lose references to objects you still need, so your program leaks memory or randomly stops working (a sketch of this kind of leak follows the list).
2. Notice that the program freezes regularly when the garbage collector runs.
3. Increase the amount of RAM allocated to the program to stop the garbage collector running and compensate for those memory leaks you forgot to clean up.
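A minimal Java sketch of the step-1 leak (class and method names are invented): the collector can only reclaim objects that are unreachable, so a reference you forget about in a long-lived collection keeps its object alive forever.

import java.util.ArrayList;
import java.util.List;

class EventLog {
    private static final List<byte[]> history = new ArrayList<byte[]>();

    static void record(byte[] event) {
        history.add(event);   // added but never removed: the GC must keep every event alive
    }

    // The "fix" is simply to drop references you no longer need, e.g. a
    // bounded buffer, an occasional history.clear(), or a weak/soft reference structure.
}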
Re: (Score:2)
I can only assume you've never used Eclipse.
The slowness in Eclipse is down to SWT, which doesn't even use the garbage collector. For proper Java code, the incremental garbage collector that has been the default for many years now prevents the kinds of slowdowns or pauses that used to be common (and afflicted other languages, like most Common Lisp implementations).