ShoeHorn writes: "Here is a good article (1st of a 4 part series), that introduces you to the Ruby language. If you are currently a programmer coming from the likes of C++, Perl, or Python, you will see some strong similarities (especially to Python)."
I'm just guessing, but I think I have learned at least 15 programming languages. Maybe it is 30, if you count assembly languages. At some point, learning languages must come to an end. At some point, I would like to see all the good features put into one or two languages. I have no intention of spending my life learning languages that then die very quickly, like Pascal.
It seems that, at some point, every skilled programmer becomes interested in writing either an editor or a computer language. I wrote one myself, for use with some H-P data acquisition equipment. Now both the equipment and language are obsolete.
With every new language, there are years of extending the language, finding and curing the bugs, and wrestling with incompatibility problems. At some point, this must stop.
Language writers should put their creativity into extending C++ rather than abandoning it.
Consider Java, for example. There have been literally years of hassle and discussion about Java, when it could have been implemented merely as a compiler switch to a C++ compiler. The compiler switch could have provided automatic garbage collection, given error messages for use of pointers, and provided the other features of Java.
Microsoft, following its usual abusive practices, put many ease-of-use features into Visual Basic, while keeping them out of Visual C++. The result was that there are maybe 1,500,000 programmers who know Visual Basic, but don't know a real language. And what language is Visual Basic itself programmed in? C++, of course.
When you write in Visual Basic, you are just a dog on a leash. Anytime Microsoft wants to yank your chain, it can. If Bill Gates happens to lose interest, Microsoft can kill this one-OS language, the way it killed xBase (dBase, FoxPro), for example, by making quirky additions that no other vendor could/would follow, and then abandoning development.
It is a shortcoming of the U.S. culture that so many men feel that they must play King of the Mountain, and try to knock everyone else down, rather than cooperate.
Like everyone, Bill Gates has inner conflict. But don't let his anger and obsessive desire to make money reduce the quality of your life!
So now it is Ruby, supposedly the next big thing. How long will it be until the books are all written? How long until we discover the shortcomings? Is there some feature in Ruby that can't possibly be added to another language?
I reviewed Ruby a little about a year ago, but came to the conclusion that the documentation was extremely poor. The overall value of a language is the value of the language itself, plus the quality of documentation of the language.
At some point, I want the merry-go-round to stop, so that I can get off.
Is there something in C# that simply could not be made a compiler switch to C++? Do we really need more syntax? I'm not saying I know all the answers to some of these issues. But I sense that something bad is happening.
For a few years, languages were interesting to me. But now, I just want to do the job, not discover other ways to do the job. Let's move on, rather than repeating what we have done before. Let's improve what we have, rather than starting again.
If C++ were extended in the direction of making it easier to call from other languages, and of making it easier to call routines in other languages, then this would become easier. But as I understand it, the C++ standard doesn't even talk about this. The closest it comes is with the extern "C" construct, which allows C programs to call a subset of a C++ program's functions.
I would not want to claim that this is an easy problem, but the C++ justification of "Well, we can't tell the C compiler how to do things, so we'll just leave that unspecified" grates on my nerves. If you can't decide on the proper specification, at least you could define a way that would be guaranteed to work. And I don't believe they even talk about any language other than C (though that would be a reasonable lingua franca if it were better defined). But a C interface is sort of guaranteed to be a bare minimum that leaves out all of the OO features. I'd like it to be something that would open up C++ templated routines to being called from Smalltalk, etc. Then I might agree that C++ could be worked on as the "only needed" language. Currently it doesn't have anything approaching that state. Java comes closer, despite its single inheritance and clumsy interpreter.
Note. Much of the foregoing is wrt Standard C++. gcc is a somewhat different beast. It is aggressively much more compatible between languages than the C++ standard requires. But it suffers the problem that its features are non-standard. And this is a problem. Standards are the fixed posts around which designs evolve, so if the standard doesn't proclaim that "this should be a feature of the language" then it's quite easy to evolve away from it.
Consider, e.g., purely virtual C++ compilers. For a while there were several C++ compilers that would preferentially make all class methods virtual calls, so that they could be overridden by inheritance. That seems to have slipped away. I don't know the precise reason, but the result is that code written for those compilers is often broken with the currently common compilers, because they were using a useful extension that wasn't in the standard.
So for working code, rather than experimental code, it's important to only depend on features specified by the standard. Which means that as long as the C++ standard doesn't specify decent methods for connection to other languages, then such a feature can't be depended upon.
Languages rarely give up a feature that was a part of the standard. At most they will mark it as obsolescent, and suggest that all new programs adopt another way of doing things. They only give up features quite reluctantly, and with as much backward compatibility as possible. And I have a feeling that C++ may have painted itself into a corner. In the name of efficiency it has specified that certain features should be implemented in certain ways with certain results. So it would, for example, be quite difficult to import a feature like Smalltalk/Python/Ruby's run-time binding of name to feature into it. It has gained efficiency, and lost flexibility. The obvious way to regain the flexibility is to allow the flexible part to be done in another language. But it's difficult to link most other languages to C++. They generally need to pipe themselves in through the restricted pipe of a C compliant interface. (Though gtk shows both a way around that, and the costs involved in using that way.)
And, no, I don't have any better choice for the central position. Even though it's also missing garbage collection (another bonus of its C heritage -- it's hard to distinguish pointers from data).
I agree and disagree with your point. On the one hand, I agree strongly that there are too many languages. I have yet to see anyone post why Ruby is different from Perl or Python.
And I strongly agree that making Java a compiler switch to a C++ compiler might have been a Good Thing.
BUT...! The reason C++ sucks (to me, anyway) is the language complexity, not lack of features. In other words, it has too many features. The reason I like Java-the-language (versus Java-the-environment, which I don't like) is that it strips out a lot of the B.S. that makes C++ unwieldy, like multiple inheritance, operator overloading, and other "somewhat useful features but not worth the extra complexity and downsides".
Sometimes to make something really good you have to throw away the past and start over with a clean slate. With languages, this is very dangerous, because it's hard to build up a following for a new language. The whole reason C++ was able to create a following is because it was semi-backward compatible with C.
I don't know that much about C#, but it's guaranteed to be used and successful simply because Microsoft will probably embrace a huge internal development effort which will bleed over into the industry. I hope and pray that it turns out to be what C++ should have been... a nice, tight language like C with modern features, but without the insanity. I also hope and pray that it can be native compiled, and not require a Java-style runtime environment.
I think your point is interesting about C++ having too many features. But why not just have compiler switches that prevent use of some features? That way, if a project didn't need the complex features, they could be prevented from being used, assuring uniformity and, in some cases, ease of debugging.
The features of C++ don't bother me. I just use what I want. Similarly, English is a very complex language. But, as a professional writer, I choose to avoid obscure words and poor constructions. For example, I usually avoid verb phrases ending in prepositions. I don't say "used to", for example.
And the complex features of C++ are of higher quality than the complex features of a human language that was never designed. The complex features of C++ are very useful in special cases.
What I really don't want is to have to change languages just because I need a particular feature.
"For a few years, languages were interesting to me. But now, I just want to do the job, not discover other ways to do the job. Let's move on, rather than repeating what we have done before. Let's improve what we have, rather than starting again."
You sound like a guy I work with. I sympathize with him, because he's getting close to retirement. Maybe you are too. I think it was the "Pragmatic Programmers" who said you should learn a new language every year. I find it helps to keep the brain plastic. It's all too easy to get into ruts, but soon the joy is gone too, and you might as well be flipping burgers.
Age is not the issue here. The issue is that it might be better to put more of the fine features into one language, rather than have so many languages.
There would still be the same amount of features to learn, but they would not be scattered all over the programming landscape.
How do you think these experimental features are going to get into an ISO standardized language like C++?
Do you think some smart guy like Matz, or Guido Van Rossum or Larry Wall can just go up to the ISO committee and say: "I've got this set of cool ideas -- let's add them to your language."
Also, do you really think that all of the good ideas of Perl, SmallTalk, C++, Lisp and Icon can naturally fit together in a single language? It isn't always possible, nor advisable, to pile every possible feature into a language.
Most people dislike C++ because it has TOO MANY FEATURES. That may not bother you but it is a big part of why Java caught on so quickly. So if you make an uber-language with every feature, you will CAUSE the creation of competitive new languages with smaller feature-sets. Smaller languages are often a reaction AGAINST bigger ones.
You are raising extremely interesting questions. If people don't like many features, should language designers provide subsets that people can access easily? Could a compiler switch make C++ like Java?
Is the dislike people have for a language partly due to poor explanation of the features?
I'm not saying I know the answers. I'm just thinking that we could do better for ourselves than create an average of one big language each year.
Maybe the ideas of Perl, SmallTalk, C++, Lisp and Icon are just a subset of a larger truth that, if we recognized it, would simplify all languages.
You raise an interesting question about the politics. My guess is that, if someone knew enough to lead us in a better direction, the politics would eventually be changed to fit the new situation.
"most teaching of C++ at the moment is terrible, which is the single
biggest problem the language has"
I agree with this. My city, Portland, Oregon, U.S., has an extremely large
technical bookstore, Powell's (http://www.powells.com/technicalbooks [powells.com]), and I have spent several
afternoons looking through all the books on C++. There is a strong tendency in
most of these books to explain without truly explaining, or to mystify without
explaining at all.
It is great to know that C++ will eventually be extended. But the 2 to 4 human
years that you mention is equivalent to 14 or 28 technology years.
You mention "better support for interfacing with other programming
languages".
This is not a controversial addition to the C++ language. It seems to me that
it should have been finished 2 years ago, not 2 years from now.
Most of the advantages being quoted are "ease of use", "rapid development", etc. Fine. But what about runtime? How does Ruby compare with Perl in running efficiency? Are there any benchmarks out there?
An associate once said about programming, "Horses for courses." What he meant was that we should use the best tool for the job at hand. When he said it he was talking about using Clipper (a dBase clone) over Quick Basic in 1990.
Today I know two languages to what I consider 'professional level'. On Windows I develop stand-alone applications with Delphi. It's easy to develop quality code - I define quality in this case by not crashing due to obscure memory errors. It's very hard to write code in Delphi which causes memory handling issues. Writing Delphi feels like the IDE wants to help you, but if you want to get your hands dirty you still can (unlike VB).
For Web development I now use Java. I wouldn't use Java for stand-alone GUI apps because it's still too slow compared to Delphi, but for server side web development it has the features I want - portability, easy to develop in, connects with SQL, etc.
Learning a new syntax takes time. You need to be sure that a new language is going to have a payoff in terms of being able to achieve something you can't in others. For years I wrote Delphi, as it was the best horse for the course. Now with Web development the course has changed, and Java is best (for me).
That said, a programmer should be able to learn any language. But just because you can doesn't mean you should...
It is important not to judge Ruby on this article.
The content in the article is specifically designed to look like either Perl or Python. This was to create a sense of familiarity for programmers coming from those languages.
The second article in the series looks a lot less like Perl and more like Python. This is due to Ruby being a true OO language, very similar to Python.
The third article will (if they let me write it) cover advanced Ruby structures and illustrate them by interfacing with the PostgreSQL database.
The fourth article will cover Ruby/Qt.
If you guys want to see these articles you may want to give some decent feedback on the Developer Works site.
I'm just learning python now. There just aren't enough hours in the day for me to learn all of these languages.
Then learn Ruby. Python looks clean to the untrained eye, but you have to remember lots and lots of special cases because of the arbitrary mixture of object-oriented and procedural features.
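A small illustration of the kind of mixture meant here, in Python 2-era syntax (matching the versions discussed in this thread); whether it amounts to "lots and lots of special cases" is a matter of taste:
items = [3, 1, 2]
items.sort()              # sorting is a method on the list object...
print len(items)          # ...but length is a free-standing built-in function

s = "hello"
print s.upper()           # strings grew methods in Python 2.0...
import string
print string.upper(s)     # ...but the older procedural spelling lives on in the string module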
Ruby is extremely consistent and well thought out. I've used well over a dozen languages, and Ruby is my favorite by far.
I think one of the great advantages of python is the wealth of material the monty python scripts bring to the table. Endless material for tutorial and how-to writers. Think of the novelty of reassembling the string "spam, eggs, spam, sausage, spam, spam, bacon, spam and spam" in a million different ways using all the different types of sequence objects. Ruby just can't compete with this type of idiom.
Hey, cool. I just finished reading "The Pragmatic Programmer" last week. This seems to be by the same authors. I recommend their work. I only wish I got ahold of it earlier than I did. It seems I've come up with a lot of the same points they did, the hard way. (Perhaps that's for the best...)
I think it's time for Slashdot to draw up an icon for Ruby topics. This story is about the 5th or 6th Ruby-related item to appear on /. in the past few months. A simple picture of a good red ruby like you would find in a jewelry store would do fine, such as one of these [israel-diamonds.com]. Anyone for it? Against it? Got a better pic to use?
good idea. I particularly like the pics by John Kaurin, the ruby slippers are good, but we could always go with that anime-chic to preserve Ruby's Japanese heritage:)
There's nothing wrong with it, it just doesn't get into any of the crazy stuff that makes ruby ruby. So wait until the other parts come out or check out the Pragmatic Programmers Guide.
If Python was the result of Lisp and C++ having a baby, Ruby is the result of Perl and Smalltalk having a baby.
If Python was the result of Lisp and C++ having a baby, Ruby is the result of Perl and Smalltalk having a baby.
Which is funny, as Smalltalk itself is really a child of a Lisp mommy getting some unknown, alien artificial insemination. Some put Smalltalk in the LISP family of languages, but those who don't just avoid classifying it. Its family tree is fascinating.
If Python was the result of Lisp and C++ having a baby, Ruby is the result of Perl and Smalltalk having a baby.
The key features of Lisp are the use of a uniform notation for programs and data, a notation that makes it very easy to write syntactic extensions, and full support for functional programming. Python 2.1 now has some limited support for lexical closures, but users almost never extend its syntax. The key feature of Smalltalk is its programming environment. As a language, Smalltalk's main feature is the use of very descriptive method and argument names. Neither of those is shared by Ruby.
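For readers who haven't met the term, here is a minimal sketch of a lexical closure in Python 2-era syntax (in Python 2.1 this required "from __future__ import nested_scopes"; from 2.2 on it works as written):
def make_adder(n):
    def add(x):
        return x + n      # 'n' is captured from the enclosing function's scope
    return add

add5 = make_adder(5)
print add5(3)             # prints 8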
Python, Perl, and Ruby are nifty little scripting languages, but don't think for a moment that if you are using them you are using the next generation of Lisp or Smalltalk. The object models of Python, Perl, and Ruby are general but slow and memory intensive, their programming environments are oriented towards scripting, not large system development, and compilation to native code, when it exists at all, is of limited benefit. If the scripting languages ever evolve into something like Lisp or Smalltalk, their object models, syntax, and semantics will have to change dramatically and incompatibly.
Wow! I nominate you for the "Sweeping Statements Award." I especially like the way you say "The object models of Python, Perl, and Ruby."
So far, I haven't heard anyone nominate these languages for "large system development," but a lot of large systems are made out of small systems, and while it is tempting to create these large systems in a uniform language (*cough*)Java(*cough*), this can be a mistake. I read a magazine article recently about a shop where they were replacing shell scripts with Java for automating tasks--which turned out to be a big mistake, considering the overhead of the Java VM.
You're right about compilation to native code being of limited benefit to these languages. For the most part, it is a "benefit" they don't need. Take, for example, creating a complex web site in C or another statically-typed, and therefore easily-compilable, language. By the time you're on your third page, you'll realize you need some kind of templating system. Next, you'll discover you need to vary the appearance of pages based on aspects of your business logic, so you add conditions to your templating system. Next, you'll find management wants to be able to update the site quickly using cheap HTML monkeys and is reluctant to spend expensive programmer time recovering old ground, so you make the site entirely data-driven. So now you have a buggy, ad hoc, poorly-documented excuse for PHP! Sure, the code you wrote is compiled, but the system as a whole is slower, and it took a whole lot longer to write.
You said, "If the scripting languages ever evolve into something like Lisp or Smalltalk, their object models, syntax, and semantics will have to change dramatically and incompatibly."
Have you ever considered that maybe these languages ARE the evolution of Lisp and Smalltalk? With all due respect to Lisp and Smalltalk, their creators didn't have the benefit of Lisp and Smalltalk when creating their languages. Larry and GVR and Matz didn't create their languages in a vacuum. Even PL/1 must be of benefit to current language designers (here's what NOT to do!).
Perhaps it would be of benefit to aspiring language designers if you could elucidate exactly how "objects models, syntax and semantics" could be improved and what the benefits would be--keeping in mind the purpose(s) of these languages.
Granted, the best programming language doesn't always win out, but the purpose of evolution is to thrive, or at least survive. I am therefore equally suspicious of praise for highly-evolved-yet-dead languages and snubs of unevolved-yet-wildly-thriving languages.
I nominate you for the "Sweeping Statements Award." I especially like the way you say "The object models of Python, Perl, and Ruby."
Maybe you haven't noticed, but Python, Perl, and Ruby all represent objects as general purpose dictionary types, in contrast to the Lisp and Smalltalk approaches.
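In Python's case the point is easy to see interactively: instance attributes live in an ordinary dictionary. A minimal sketch, in Python 2-era syntax (the class name is made up for illustration):
class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y

p = Point(1, 2)
print p.__dict__          # {'y': 2, 'x': 1} -- the attributes are plain dictionary entries
p.__dict__['z'] = 3       # adding a key to the dictionary...
print p.z                 # ...adds an attribute: prints 3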
Beyond that, I have no idea what you're railing against. Did I say scripting languages were bad? Did I say people should switch to Lisp? What is your problem?
Have you ever considered that maybe these languages ARE the evolution of Lisp and Smalltalk? With all due respect to Lisp and Smalltalk, their creators didn't have the benefit of Lisp and Smalltalk when creating their languages.
I would definitely agree that it's evolution; however, it's not a controlled evolution pointed forwards, like the evolutions that created Lisp and Smalltalk.
Python, Ruby, and others are adaptations of Lisp and Smalltalk to a way of thinking that people used to C and Unix can handle. Smalltalk and Lisp are so advanced and forward-thinking that languages like Python and Ruby have to take a forced step backwards to accommodate those who cannot advance. It's kind of sad in a way, but I suppose it's still a step in the right direction, as there's a better chance of converting C++, Java and Perl people to Ruby than converting them to Smalltalk or Lisp, unfortunately. Better part way than none!
I notice strong distinctions between those languages that have been developed by actual programming language researchers and those who hack together a language to scratch an itch.
Usually the latter turns out to be some baroque conglomeration of features piled on features, creating a very top heavy feeling to the language, while the former classification languages all have a purity to them, e.g. smalltalk, lisp, and c.
Some would defend the "hack languages" as a means to Rapid Application Development, but Smalltalk has been shown to be the most productive language, and Ruby/Python/Perl all seem to me to have a BASIC odor to them; I'm wondering if people are afraid to learn a new way of speaking?
Usually the latter turns out to be some baroque conglomeration of features piled on features, creating a very top heavy feeling to the language, while the former classification languages all have a purity to them, e.g. smalltalk, lisp, and c.
Well, I mean, as long as we're talking in generalities here, the latter also seem to be geared towards getting jobs done where the former are geared towards elegant problem-solving. Purity is great when you're admiring something, but no one wants to hang out with the righteous virgin when they feel like getting laid. Personally, when I program, I'm not looking for my code to fit some elegant theory. I'm looking for the job to get done as succinctly as possible.
Some would defend the "hack languages" as a means to Rapid Application Development, but Smalltalk has been shown to be the most productive language, and Ruby/Python/Perl all seem to me to have a BASIC odor to them; I'm wondering if people are afraid to learn a new way of speaking?
I would say that Perl stinks more of shell and C than it does BASIC. If you're saying that because it has a lot of built-in features, there are plenty of languages that have those. But out of all the languages you've mentioned, I found it ironic you say that, because Perl has probably undergone the most intensive language development of any (and the new process probably blows efforts for other languages out of the water). Larry Wall has a great affinity for languages (both spoken and programming), and it shows, because, for English speakers, Perl is designed so that you can write it almost like you would speak it. Now, you might say that that is "hack"-ish, but it shows a lot more care for the process of programming than a language that idealistically sticks to a theoretical truth instead of making the language easier to use.
And provide a link to a study that shows Smalltalk is the most productive language. I'm not saying I don't believe you, but I personally find the "hack languages" to allow a much more natural flow between my brain and the screen, so I'd be interested in seeing how that conclusion was reached.
...I personally find the "hack languages" to allow a much more natural flow between my brain and the screen, so I'd be interested in seeing how that conclusion was reached.
Different strokes for different folks, I suppose. While you seem to say that the simplicity and elegance of Lisp and Smalltalk isn't practical, it is for me - my brain thinks in such terms. Perl is, for me, in many ways almost as big a pain in the ass as C++, because it tries way too hard to fit what Larry Wall said is the "natural flow."
I'm wondering if people are afraid to learn a new way of speaking?
You bet they are! Ruby, Perl and Python are recasting Lisp and Smalltalk in terms of a contrived BCPL-derived syntax. They don't quite embody the semantics fully (Ruby seems about the closest), but as people are too lazy to learn a fully new way, they have to be given syntactical training wheels.
I agree Ruby and Perl seem to be like the latter type.
Python, however, is a truly clean, well-designed, quite-pure, strong-typed language.
I wouldn't say it has a Basic odor at all, besides for a built-in 'print', perhaps:)
Basic is braindead and has no library - instead, it has its entire library built into its syntax in a horrid way.
Python is the exact opposite - many, many libraries separated into modules, with only flow constructs, OO organization, functions and exception handling built in.
Python gets all of its object-orientness from C++, Ruby gets its from Smalltalk. C++ is generally regarded as a crude (although useful) hack, whereas Smalltalk is the definition of elegance itself. So, if it is cleanliness you want you should move to Ruby. I find it amusing that while the Pythoners mock the Perl-hackers that refuse to upgrade to Python, they themselves refuse to upgrade to the next level .
Python gets all of its object-orientness from C++, Ruby gets its from Smalltalk.
I have a strong sense that you don't know Python. Python gets *none* of its object-orientedness from C++.
Python has full dynamic dispatch, the ability to catch unknown method calls at runtime and so forth. In other words it is much more like Smalltalk than like C++.
Python's OO credentials are just as strong as Ruby's. This is especially true for Python 2.2 where you can subclass even primitive types.
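A minimal sketch of both claims, in Python 2.2-era syntax (the class names Proxy and EvenInt are made up for illustration):
class Proxy:
    def __init__(self, target):
        self._target = target
    def __getattr__(self, name):
        # called only when normal lookup fails -- the dispatch is decided at run time
        print "forwarding", name
        return getattr(self._target, name)

p = Proxy([1, 2, 3])
print p.count(2)          # the unknown 'count' call is caught and forwarded to the list

# Python 2.2 and later: built-in ("primitive") types can be subclassed
class EvenInt(int):
    def is_even(self):
        return self % 2 == 0

print EvenInt(4).is_even()   # prints a true value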
I find it amusing that while the Pythoners mock the Perl-hackers that refuse to upgrade to Python, they themselves refuse to upgrade to the next level.
If Ruby were clearly the next level, you'd be right. But Ruby has as many weaknesses relative to Python as it has strengths. Python's threads are extremely robust (at least on single-processor machines). Ruby's are weak. Python's industry support is much better. Python has full multiple inheritance. Python has less "syntax".
So Ruby isn't more OO and it isn't more elegant. It is at best more "Ruby" -- that will be important for some people and not for others.
I get the stronger sense that you don't know Ruby -- most Rubyists tend to be ex-Pythoners and not the other way around, but nevermind.
Python gets *none* of its object-orientedness from C++
Python lacks metaclasses, lacks a true unified object hierarchy, and supports multiple inheritance (considered a very bad idea by most experts in OO. Like the "goto" statement, multiple inheritance may seem useful at times but it leads to unmaintainable code). All these are simple repeats of the mistakes of C++.
Python's OO credentials are just as strong as Ruby's. This is especially true for Python 2.2 where you can subclass even primitive types
Yes, Python is improving, but all these improvements (like allowing the subclassing of primitives) only serve to point out flaws in the design (why aren't primitives normal objects in the first place?)
I get the stronger sense that you don't know Ruby -- most Rubyists tend to be ex-Pythoners and not the other way around, but nevermind.
People do tend to progress from more well-known languages to less well-known ones. They don't go to a bookstore and pick up a K book and think: "this would be a good first language." I've played with Ruby enough to decide that it is neat but not appropriate for many of my projects and not a sufficient improvement over Python for the rest.
Python lacks metaclasses, lacks a true unified object hierarchy, and supports multiple inheritance (considered a very bad idea by most experts in OO. Like the "goto" statement, multiple inheritance may seem useful at times but it leads to unmaintainable code). All these are simple repeats of the mistakes of C++.
Python 2.2 has metaclasses, a unified object hierarchy and supports multiple inheritance which is quite useful and safe when used thoughtfully.
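A minimal sketch of the mixin style of multiple inheritance being defended here, in Python 2-era syntax (the class names are made up for illustration):
class ReprMixin:
    # supplies a readable __repr__ to any class that keeps its state in __dict__
    def __repr__(self):
        items = self.__dict__.items()
        items.sort()
        return "%s(%s)" % (self.__class__.__name__,
                           ", ".join(["%s=%r" % kv for kv in items]))

class ComparableMixin:
    # supplies comparison by value to any such class
    def __cmp__(self, other):
        return cmp(self.__dict__, other.__dict__)

class Point(ReprMixin, ComparableMixin):
    def __init__(self, x, y):
        self.x = x
        self.y = y

print Point(1, 2)                  # Point(x=1, y=2)
print Point(1, 2) == Point(1, 2)   # true -- compared field by field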
It is simply a matter of historical fact that Python was not based upon C++. If it shares features with C++, those would probably be traced back to Simula through Modula-3. Smalltalk is also based upon Simula.
Yes, Python is improving, but all these improvements (like allowing the subclassing of primitives) only serve to point out flaws in the design (why aren't primitives normal objects in the first place?)
Nobody would claim that Python 1.0 was perfect, nor that Python 2.2 is perfect. Nevertheless, you haven't yet mentioned a feature of Ruby that isn't in Python 2.2. There certainly are such features -- but they get increasingly esoteric as Python improves. If I have to choose between Bertrand Meyer-approved-OO-cleanliness and native threads that don't block when you do I/O, I would choose the solid threads. OO-cleanliness is about conceptual elegance and native threads are about getting the job done.
And how deep is Ruby's Unicode support? If you can please point me to documentation on using Ruby's regexp engine to match Unicode characters, I would appreciate it.
My current Python projects depend heavily on both threads and Unicode. Mixins, multiple inheritance and the ability to subclass "integer" really don't matter one whit! I am mostly happy that Python is unifying its type system for rhetorical reasons. Practically it hardly matters at all!
If you want to impress the vast majority of programmers who are not language collectors, you'll have to show us some programs that are hard to solve in (e.g.) Python and easy to solve in Ruby.
For this and other reasons, it is not accurate to paint Ruby as the next step after Python. It is another good language with strengths and weaknesses. One day it will have a superset of features that Python currently has...but Python will itself have evolved by then.
And how deep is Ruby's Unicode support? If you can please point me to documentation on using Ruby's regexp engine to match Unicode characters, I would appreciate it.
Here you have an issue of implementation, not design, but at present a legitimate issue. Ruby is a product of Japan, and ironically, from the perspective of Westerners, who generally see Unicode as a sort of peace offering to Asians to make up for the dark ages of ASCII, the Japanese hate Unicode and prefer their own multibyte solution. So, the status of Unicode in Ruby is somewhat primitive at present. However, as Ruby was designed with multibyte characters in mind, it should be much easier to improve the Unicode support than in other languages.
It is simply a matter of historical fact that Python was not based upon C++. If it shares features with C++, those would probably be traced back to Simula through Modula-3. Smalltalk is also based upon Simula.
No, Smalltalk borrowed ideas from Simula. Other than the idea of classes, Smalltalk derived very little else from Simula. Most everything else came from Lisp.
Maintaining code is a pretty thankless job, but at least with MI the changes only have to be made in one place. Java's interface-based MI either forces similar code to be included in each class that implements an interface or the inclusion of lots of little stub routines that call the same-named routine on a different class.
Clearly you aren't a Java programmer. If you want to do multiple inheritance, what you most likely really want to do is share some utilities across several class hierarchies. This means you want a utility class, if you want things to be clean.
What you should do is declare an interface which exposes various properties which are needed to perform the utility operation. Then create a singleton to which you pass objects of the interface type.
Presto! Code in one place can be modified to support multiple classes. Inheritance is used for what it is supposed to be used for, and maintenance is easy.
If Ruby were clearly the next level, you'd be right. But Ruby has as many weaknesses relative to Python as it has strengths
No, the next level isn't Ruby. It's Smalltalk. It's Lisp. That's the point your post's parent is saying- if you can move from Perl to Python, why not go all the way to the source, to the acme of elegance and simplicity- Smalltalk or Lisp?
What's wrong with Smalltalk being "the next level"?
Well, there is a place for scripting languages and "normal" programming languages. Smalltalk competes with Java and C++. It complements scripting languages.
Oh no! I've been writing scripts in Smalltalk! I should've been using Ruby, its "script-world equivalent," all along!
That's bunk. There's nothing other than what is "acceptable usage" in a coder's mind that makes something a "normal" or "scripting" language. Smalltalk is equipped with the tools to build large systems more so than most scripting languages like Perl, Ruby, and Python - but out of the box, I'd say it's also more equipped than C++ and Java.:)
That's bunk. There's nothing other than what is "acceptable usage" in a coder's mind that makes something a "normal" or "scripting" language.
Well, in the case of Smalltalk, most implementations I've seen live in their own environments rather than interacting with the native command line. This makes them more or less useless for "scripting" in the normal sense where you call a script from the command line and pipe data into a script and output results on stdout. For example:
neatoScript < data.txt > processedData.txt
Perhaps you don't have to do such tasks in your own work, but that's what scripting languages are used for.
As a follow up mentioned, GNU Smalltalk fits in with the usual Unix environment fairly well. Squeak can run headless and run a Smalltalk text file and has a package called OSProcess that allows you to do piping and such. Furthermore, I believe VisualAge/Smalltalk and/or VisualWorks has facilities for this.
I don't use Smalltalk to script in the old-fashioned way, however. I spend a lot of my time with an image (meaning, a Smalltalk environment) open, and write, run, and debug my scripts directly out of it, rather than using an external editor.
You can look at this in the way some people spend a goodly amount of time in nothing but (X)Emacs, writing scripts in elisp.
Regardless, nowhere in the definition of "script" is it specified you have to be working with pipes.
who says Python is an upgrade from Perl? It's all a matter of choice.
Well, even Larry Wall himself admits that Perl 5 is showing its age. Look at the plans for Perl 6. It is looking more and more like Python and Ruby, isn't it?
Ruby is too much OO. I don't want methods for literal values.
Then don't use them, silly. Just because you can give your integers a different function for the addition operator doesn't mean you *have* to. Do whatever is appropriate.
In the professional world (i.e. the one that matters), OO is way overdone and misused.
Sturgeon's Law: 90% of everything is crud. Procedural programming is just as poorly used as object-oriented programming. You probably weren't around at the time, but OO was supposed to be the 'silver bullet' that solved all the problems that structured programming didn't solve. Structured programming, in turn was supposed to be the silver bullet that slew the beast of ad hoc programming.
With an OO environment, you can abstract the design of a program from the code itself, in the old procedural model you couldn't really do that.
Sure you could. Clear conceptual abstraction was, in fact, the *whole point* of structured design. 'This module has these responsibilities. It provides these functions. It uses these other modules. It functions as a highly-extended wrapper around this basic module.'
Take a good structured design, substitute 'class' for 'module' and 'member method' for 'public function', and you'll have a good OO design. OO is just an easier way to implement the classic structured design elements: encapsulation, extensibility, and genericity.
and not have to type all that class/method setup garbage just to get to one library routine. Is that too much to ask? To OO zealots it is.
Huh? I can't speak for Java, but your 'dream code' is standard operating procedure for Python (and, I imagine, Ruby). Here's a transcript of your Hello World example from an actual Python interpreter:
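A minimal version of that session, with the sys.stdout.write() call mentioned in the parenthetical below, looks like this:
>>> import sys
>>> sys.stdout.write("Hello World\n")
Hello World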
(I could have said print "Hello World" instead of the sys.stdout.write() call, but I wanted it to look like what you had written.)
Download a Python interpreter, type that code in, and see it run for yourself. If you don't like OO because of your experiences with C++, Java, or Perl, you should try Python or Ruby. They are OO languages done right, as opposed to the hideous Lovecraftian horror of C++. Even if you insist on structured programming with no OO, Python has a very nice module system that makes it easy to create and encapsulate modules.
One could call C++ hacked - it's "features on top of features". But then, it's the most widely used language, so I guess it _MUST_ have its strengths...
C++ is almost nothing else but a huge hack. Its only strength over other languages is backwards compatibility of a way of thinking with C. Nothing more.
FWIW, within a week you'll learn to love whitespace. Yes, it imposes a visual "order" to your code. You'll soon appreciate the consistency: you can look at other code and immediately make sense of it, without having to mentally adjust for the other programmer's weird habits.
I notice strong distinctions between those languages that have been developed by actual programming language researchers and those who hack together a language to scratch an itch.
"Actual programming language researchers" are typically not even interested in designing languages for general purpose use. Often they are just trying to explore a particular idea of aspect of programming.
Usually the latter turns out to be some baroque conglomeration of features piled on features, creating a very top heavy feeling to the language, while the former classification languages all have a purity to them, e.g. smalltalk, lisp, and c.
I think you're going to have to define your terms if you want to make this point. How were Kernighan and Ritchie programming language researchers but not Van Rossum and Wall? By now, the latter two have spent about ten years of their lives thinking about almost nothing other than programming languages.
Some would defend the "hack languages" as a means to Rapid Application Development, but Smalltalk has been shown to be the most productive language,
Could you provide a reference to back up that claim? I'd like to see evidence that Smalltalk fares well at system administration or text processing.
and Ruby/Python/Perl all seem to me to have a BASIC odor to them;
Now you are really grasping. That claim isn't even solid enough to refute.
...I'm wondering if people are afraid to learn a new way of speaking?
I don't know...are you? If you are into purity and elegance, I would suggest you give either Python or Ruby a real try. There are many Lisp fans that like both, and Ruby is especially popular among Smalltalk users.
If C is the kind of language you DO NOT like, and SML is the kind of language you DO like, then I sincerely hope that you continue to dislike the scripting languages. C has proven itself perfectly adapted to solve a large and important set of real-world problems. Perl, Python and Ruby are similarly designed.
I don't really see "symbolic AI", "planning", "Dynabook, educational software" and "proof systems" as particularly representative of the programming most of us do in the real world.
"Text munging", "kernel hacking", "GUI and server application programming" etc. are more typical. Thank Guido there are "hacked" languages to let us do our jobs!
I'm not sure how importing 'frederick' other than 'freddy' shows how Ruby is dynamic, but I'll assume you meant something that you didn't totally illustrate.
Smalltalk is just as (and more) dynamic. I work on an IRC client that's part of Squeak. Without restarting the client once, I added a plug-in system and wrote a few sample bots. Pretty amazing. Change a method, and the next time it's called, it's using the new version. That's kind of difficult to do using Ruby without an interactive environment, but I suppose it can be done with a TkListener or a thread taking irb-like evals on a socket.
Interesting. Can one of you Ruby or Python elmers tell me if it's possible to, say, have the user input an arbitrary snippet of code at runtime
return (foo + bar)
and have the body of the multiplication operator be replaced with this, making it now an addition operator?
In Python, yes. At any point where you could define a class's * operator, you could define it by compiling an arbitrary string on the fly.
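A minimal sketch of what that can look like, in Python 2-era syntax (the class Num, the function name patched_mul, and the wrapper code are made up for illustration; the snippet itself is the one from the post above):
# the snippet the user typed in at run time
snippet = "return (foo + bar)"

# wrap it in a function definition and compile it on the fly
source = ("def patched_mul(self, other):\n"
          "    foo, bar = self.n, other.n\n"
          "    %s\n" % snippet)
env = {}
exec source in env                 # Python 2 spelling; Python 3 would use exec(source, env)

class Num:
    def __init__(self, n):
        self.n = n
    def __mul__(self, other):
        return self.n * other.n

print Num(3) * Num(4)              # 12 -- ordinary multiplication
Num.__mul__ = env["patched_mul"]   # rebind the class's * operator to the compiled code
print Num(3) * Num(4)              # 7 -- '*' now performs the user's addition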
Smalltalk didn't come before C. C was created in 1972. There was an early version of Smalltalk called Smalltalk-72 back then (written in BCPL, IIRC), but it's not much like the version most implementations today resemble - Smalltalk-80, which was created in 1980.
Why didn't Smalltlak take off? There are a lot of reasons, including:
Horrible business management. In some ways, it's a surprise ParcPlace didn't totally kill it. Nowadays there are a few actually-free implementations as well as non-commercial versions of others that can be used for no charge.
Smalltalk (not SmallTalk) came out in 1980, and required more resources than most PCs had at the time: 1 MB of RAM and a bitmapped display. By the time those were available on a good many PCs, people were already used to C and Pascal.
In what sense would you say Smalltalk "didn't grow up as much?" Smalltalk is incredibly mature and stable- so, I doubt that's what you mean.
Smalltalk is productive, in my experience. Putting Smalltalk and C in the same league is silly- at the very least, C++ and the STL would be a better comparison.
Smalltalk isn't the best for kernel-hacking or serious number crunching (F90 anyone?), but I don't claim it to be. It's a great language, a great system- one with which a person can be incredibly productive, especially in comparison to C++/STL and Java.
Not being able to spell the language's name correctly is a sign you've never actually had any experience with it.
However, I misspelled Smalltalk as "Smalltlak," which is a typo, and would be taken as such by a member of the Smalltalk community. When you misspelled it, it was as SmallTalk, often taken as a sign of ignorance rather than a simple typo. Why is that? The "proper" way is Smalltalk, but it's pretty usual for people that don't know what they're talking about, who have maybe only read about St on /., to spell it SmallTalk. No one knows why.
So, if it was a typo, my apologies, but that is my reasoning.
The first time I touched Smalltalk, I immediately placed it in the category of academic curiosity (just like Prolog) because it is an interpreted language.
Smalltalk isn't interpreted. It's incrementally compiled and run on top of a VM. What does that mean? Methods are compiled into bytecode and then interpreted or JIT-compiled. This is similar to Java, with the exception that there is no explicit "compile" command.
In a Smalltalk browser, when you save the method you're working on, the method is compiled, and you're alerted to any errors in your code. Now, this compilation takes a fraction of a second, even on a 486. It may not feel like it's compiling, but that's only because you're used to having to take the explicit step of running gcc/g++/javac or what have you.
Considering it productive was nearly the same, IMHO, as considering QuickBASIC to be productive.
It shows you've not done more with Smalltalk than "Hello, World" examples. For application level programming, in my experience, Smalltalk is more productive than C++ and Java, given both the Smalltalk and C++/Java programmers are somewhat experienced (at least 6mos). I've been using Smalltalk for a little over a year, and am quite a bit more productive than I am with Java (3mos use), C (4 years, on and off), and C++ (2 years).
For one, there are simply not the same amount and magnitude of "gotchas" as the above listed languages/systems.
Frankly, I see C++ as mostly worthless, especially as an applications-level language. The STL and other libraries don't let you escape from C++ and its many annoyances; they just complicate things. I'd much rather be using Smalltalk (or another high-level language like Common Lisp, Ruby, Python) and writing extensions in C when necessary.
In most Smalltalks and Lisps, there is a call-out interface, such that lib-extensions like those found in Perl, Python, and Ruby don't have to be written. You just tell the system the function signature and the name of the library.
You don't need a 1+GHz machine to run a Smalltalk system for development, or to run a Smalltalk-based app. You're too used to Java, which is still largely impractical for real-world apps.
GUI builders have been a part of Smalltalk for a while. Don't know a date though. Check out Dolphin Smalltalk for an implementation tightly integrated with Windows.
As far as there not being a place in the world for Smalltalk- bah. If you don't want to use it, don't. But please don't spread misinformation because your 20 minutes worth of experience was confusing. I understand some people are set in their ways, and prefer the way they've been doing for years- a contrived, BCPL-based syntax, and a very static, compilation based way of life. That's fine- but I'll continue using Smalltalk and getting my stuff done.
Well, I see a lot of people complaining as usual... Why should they learn another language? So I thought I would throw in my $0.02 for all languages (not just Ruby, which I am particularly fond of).
A lot of you people come from what I call the Computer Science Student mentality. This is unfortunately something I've noticed as a side effect of the way computer science is taught in most schools. It's quite unfortunate, because that clearly isn't the aim of the professors when they are teaching classes this way.
I'll use my school as an example. When I started at the University of Dayton, about 7 years ago, every class was taught in Ada. It was a horrible, horrible experience. The Ada compilers were horribly lacking at the time, and anybody who has done any work in Ada will know that it is an extremely strict and picky language. That has its good and bad sides when it comes to teaching programming, but I'll save that for a different discussion. I had no problems, since, well, I had a good solid 4 years of Pascal and C/C++ programming experience before I even started college, but this seriously affected the other students in my class.
The problem was, about two years into the program, my school decided (thank god) to switch the department over to C++. This was a great move, because now the students were being taught a language that they could actually apply in the real world (beyond the confines of the Wright-Patterson Air Force Base anyway). It worked out quite well for the newer students, but my classmates were blindsided. Most of them suffered through half a semester of C (the other half of the semester was 360 assembler) and never even touched upon C++. They should have just been able to pick up a C++ book, apply the concepts they learned from their Ada classes and the syntax from the C++ book and their C course and move on, but most of them had a hard time doing it.
Why was that? They all knew the concepts. They all knew how to write their algorithms, and their trees, and their stacks and queues. Yeah, they weren't taught how to write real software, but they clearly knew the basics. I spent more time helping others learn these concepts than I did doing my own homework the first two years, so I know what they were capable of from first hand experience.
The problem was entirely in their minds. C++ is a huge ugly beast, and it is a bit imposing when you first start. But if you've got two solid years of programming behind you, it should be a relatively smooth and easy transition. Most of them didn't realize this though. They were scared of C++, they were scared of new languages, and they suffered as a result.
The simple fact of the matter is, if you know one language, you know them all. It's not the syntax that makes the difference, it's the concepts that you express within the framework of the language's syntax that are the real guts of programming. My classmates took a while to realize this (and I'm sure many of them still don't). That's the same thing with Ruby, or Python, or Perl, or just about any language. Unless you are making the jump from procedural to OOP, or OOP to Functional for the first time, you *CAN* pick up a book and learn a new language in a day's time. The only thing stopping you is yourself.
Now, the other part of this is, why would you do that? I love learning new languages, and I love learning new languages for a few reasons. These reasons apply to every programmer, and I honestly don't understand why some people are so opposed to learning something new. I guess that's what separates a good programmer from a bad programmer. So if you want to know why you should learn Ruby, or LISP, or Haskell, or even Visual Basic, I'll tell you why.
1. It helps keep your skills in tip top shape. Perusing a computer manual may remind you of algorithms or techniques you haven't used in a long time and forgotten.
2. You always seem to learn something new. Not some new technical trick that only works in one language (although that definitely happens), but just a different way of approaching problems that sometimes can transcend language boundaries.
3. You may find a new language that allows you to get the job done faster!
4. Your enhanced knowledge of languages looks great on your resume no matter what you use as your primary language.
5. You learn the way other people think. And I don't want to gloss over this one. As a programmer, you frequently have to work with other programmers. Learning new languages is a *GREAT* way to see how other people do things. To learn the way other people think, so to speak. By learning Ruby and Smalltalk, you start to learn why people in those communities are so die hard about OOP programming styles. By learning LISP or Haskell you start to learn why Functional styles even exist! And it all comes back full circle. Techniques I learned from Haskell I now use when writing C++ programs and vice versa. It only made my C++ code better.
Knowledge is power, and learning new languages is one (of many) ways of increasing your knowledge. Go ahead and try it, even if you think you won't use the language, and even if you're just starting out and don't think the transition from your learning language to a new one will be easy. You just might be surprised by how much you already know, and how much you have yet to learn. That's the real benefit of it.
I appreciate and share your view that by learning multiple languages you enlarge your toolbox, and in the big picture all languages are intertwined. (Jeez. I sound like I'm rambling on about "The Force" or something.)
I've noticed a similarity in my "night job" as a musician, where playing different types of music acts in a similar way to programming in different languages. You pick up stuff in one style that can enrich your playing of another style. Well, I sure am staying on topic, eh?
One comment I wanted to make regarding picking up another language in a day: I agree somewhat. There was no question that after programming in (time order) Pascal, C, Fortran, Ada, Lisp, Clips, and Scheme I was able to "pick up" Perl pretty quickly. That said, it took some time to really learn the Perl idioms and to do things in the "Perl Way". I've seen my share of C code that was really Fortran written in C, or Java code that was really C written in Java (procedural vs. OO).
With apologies to Heinlein, it takes longer than a day to grok a new language.
With apologies to Heinlein, it takes longer than a day to grok a new language.
Not always. IMO, Java built so much on my existing C++ and Smalltalk knowledge that it didn't really provide anything that Smalltalk (and to a lesser extent, C++) couldn't already provide - it was merely a new syntax and was part of a different business model. For those unexposed to Smalltalk, Ruby has some aspects which may take a while to grok, but for the most part, it builds on my existing knowledge, leaving almost nothing to grok. But in many cases, it's more convenient.
I'm still trying to get to know how to use Perl, let alone grok it - it seems there are so many rules for every tiny thing you'd want to do that I've had to keep docs open for anything that departs from print "hello world...";
Great post, but IMHO any 3-year CS course that aims to teach only one language is going about things completely the wrong way.
You should spend time in (for example) a simple teaching language to start off, 1 mainstream procedural/OO language, one functional language, one scripting language, and study briefly a sampling of languages of commercial or academic interest, and what makes them interesting or successful.
IMHO when you learn your second language is when you start to 'get it' about what is an essential feature, and what is an accident of syntax or history in programming.
I wouldn't call knowing only 1 language a "the Computer Science Student mentality" because a CS graduate should definitely not know just one language - they should know how to pick up any language quickly, having had practice at it.
I noticed this too. I'm curious what schools are pursuing this type of curriculum. When I was in school, it seemed like I was learning a new language for every class. Whatever language was most apropos was (usually) used. Freshman year Intro class? Pascal. Artificial Intelligence? Lisp. Operating Systems? C++ (that's a little odd, I admit.) Embedded Systems? 68K Assembly. Numerical Analysis? Mathematica. The default of course was good old C, which I ended up doing a fair amount of.
In the second half of his post above, Dalroth has made a very sensible case in my opinion, for learning other languages.
(Dalroth seems like the kind of programmer employers want to hire. But he has provided no way to contact him. In fact, he is very negative about being contacted in his bio.)
I'm tired of new languages (read my post #128), but Dalroth has a point that other languages sometimes teach other ways of thinking.
The answer seems to be to put all the knowledge in one place, or as few places as possible. At some point even Dalroth will decide that one more language is too many.
The answer seems to be to put all the knowledge in one place, or as few places as possible. At some point even Dalroth will decide that one more language is too many.
I doubt he will. That is, there is no point at which there are too many languages. To say otherwise is just a submission to the stronger and stronger trend of American capitalist corporate philosophy under the guise of "efficiency."
Let me explain. Oftentimes new languages are created for someone to learn the ins and outs of designing and implementing a language and its tools, be it an interpreter or compiler, native or bytecode. Writing a language seems to be a popular enough hobby that it won't soon go away. Eventually, some schmuck, consortium, or business will come up with some new whiz-bang theoretical basis for some new language that will make everything look that much older.
Now, you don't have to go and learn any of these new languages. No one is forcing you. In fact, most of you are already mentally stuck in the 60s, with languages based on flat files. And that's fine. But the nature of science is that there's a lot of experimentation before discovery, and regardless of whether or not you think Ruby is that useful, it's serving some use to someone, and seems to be advancing computer science (not "programming", but CS) to a degree, if only for a small group of people.
The simple fact of the matter is, if you know one language, you know them all.
I was about to object to that, but then you wrote...
Unless you are making the jump from procedural to OOP, or OOP to Functional for the first time, you *CAN* pick up a book and learn a new language in a day's time. The only thing stopping you is yourself.
Thank you for being one of the few to acknowledge that while languages using similar ideas may be easy to learn once you know the first, languages using fundamentally different approaches might take some effort! Sadly few people seem to realise this.
Having said that, I'm afraid I have to disagree with your "one day" as well. You can learn a new syntax in one day, sure. But how long does it take to learn the new idioms? Java and C++ have similar syntax, but Java uses a GC and finally for resource management, while C++ has predictable destruction and uses the abysmally-named-but-rather-neat "resource acquisition is initialisation" idiom. Anyone working seriously in these languages needs to appreciate this distinction, but it's not written down in (m)any of the books.
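Ruby, incidentally, leans on blocks for the same job. A minimal sketch, assuming a writable log file:

    File.open("trace.log", "w") do |f|
      f.puts "starting up"
    end
    # the file is closed when the block exits, even if an exception is raised inside

That's closer in spirit to RAII than to finally, and again it's the kind of idiom you pick up from the community rather than from the syntax reference.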
There seems to be something about schools that systematically damps down any enthusiasm for any subject. I'm really glad that programming doesn't feel much like math to me. I used to be enthusiastic about math, but a few years as a math major killed that. Now I avoid it to the extent possible. But programming doesn't feel like math (to me), so I've got a decent job that doesn't feel bad.
If you don't feel enthusiastic about something, then it can feel like an imposition to be required to learn something different. It doesn't much matter what. And a new language certainly can qualify.
OTOH, languages certainly do run in tides. There don't seem to be any decent books right now on "Data Structures and Algorithms the C++ way" that aren't primarily texts. Now I'm trying to switch over to C++ after using a raft of different languages, so most of the introductory books get insufferably dull. I don't want to work through all the drill. Been there, done that. I want a guide on picking things up fast. The best book that I've found so far is "Data Structures, a PseudoCode approach with C++". It isn't really designed as a reference book either, but at least it doesn't stop in the middle of an algorithm and say "the rest is left as an exercise for the student". So I may be able to get my AVL tree working and tested by Monday. (Then I need to decide whether to switch the code over to Eiffel, or recode a bunch of Eiffel in C++... or experiment with trying to link the two in a manner not covered by the C++ standard, and quite compiler specific in Eiffel.)
So using lots of languages has its drawbacks, too.
There seems to be something about schools that systematically damps down any enthusiasm for any subject. I'm really glad that programming doesn't feel much like math to me.
Perhaps it's because most schooling in America, both secondary and tertiary, seems to be geared towards producing employees, and not thinkers? It used to be that you'd learn many ways of thought, and usually many languages, when pursuing a CS degree. Now, they teach you C, C++, and Java. Forget that you may not learn any real science or how to think and synthesize, you're taught how to use the tools. At least they still teach students to experiment in Biology, rather than just showing them how to follow procedure and use the microscopes.
A serious programmer should always be interested in some new paradigm/language/tool/os/...
However, a language like Ruby fails to impart or embody any new paradigms or concepts. That is, it's mostly a new syntax atop existing semantics found in Perl or Python. An exercise in syntax and grammar is relatively uninteresting. For me, when learning a new language, be it spoken or coded, the most interesting parts are novel ways to express old semantics, or completely new semantics that are put out in the open.
That said, I like Ruby- while it may be pointing at the same problem-niche as Perl and Python, as a language, it makes a lot more sense to me, and I'd heartily choose it over Perl or Python in most circumstances.
I currently make my living writing .jsp's and Java servlets (that's not what I signed up to do, but hey, as long as they pay me, I don't mind).
Conclusions:
1. Java is a nice language. I would prefer Java to C++ any time of the day.
2. Java is a useless language outside of a very controlled environment. Because it is not Open Source, it is highly unportable (just try running Java on OpenBSD sometime!), and because the existing runtimes are so bloated, it's only useful for applications where you don't mind having a spare 40 megabytes of bloat hanging around.
In other words, Java is *not* a panacea, and certainly isn't a replacement for highly dynamic languages such as Python or Ruby, which tackle an entirely different problem set.
My opinion of Ruby: Nice language. Some stupidities though -- the whole notion of making variable types case-sensitive reeks of Fortran. I considered Ruby for the TapiocaStor project, but had to dismiss it from contention because it's not yet mature enough. We're using Java, but only because we can't use Python for legal reasons.
Because it is not Open Source, it is highly unportable (just try running Java on OpenBSD sometime!)
Ok. On a just-installed machine (complete with generic kernel, yes I know):
(rob@denali:~)$ uname -a
OpenBSD denali.CENSORED.com 2.9 GENERIC#653 i386
(rob@denali:~)$ java -version
java version "1.4.0-beta"
Java(TM) 2 Runtime Environment, Standard Edition (build 1.4.0-beta-b65)
Java HotSpot(TM) Client VM (build 1.4.0-beta-b65, mixed mode)
That's OpenBSD 2.9 running the latest JDK 1.4 beta. What were you saying again?
You also seem to be confusing Java (the platform) with Sun's version of Java. Open Source editions (eg Kaffe [kaffe.org]) can run on just about anything. Not to mention that you can get the sources to Sun's JDK.
Some stupidities though -- the whole notion of making variable types case-sensitive reeks of Fortran.
That's because they're not types, they're classes. And why does that matter? Because you reference a class by its name- as a VARIABLE. If all Ruby variables were case-insensitive, then classes, or types as you refer to them, would also be case-insensitive.
It's simple, beautiful elegance. This is unlike other languages, where types and/or classes are an exception and aren't normal variables. There lies power in Ruby's approach.
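To make that concrete, a rough sketch (the class name is made up):

    class Greeter
      def hello
        puts "hi"
      end
    end

    puts Greeter.class   # => Class; the class itself is just an object
    ref = Greeter        # ...bound to a constant, which must start with a capital letter
    ref.new.hello        # prints "hi"

If case didn't distinguish constants from locals, Greeter and greeter could silently shadow each other.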
Almost as fashionable as pretending to be pragmatically cynical.
If we followed Dalroth's advice, we would repeatedly waste weeks at a time on other fringe languages as well. Sure it MIGHT pay off but this is the same logic people are criticized for when they spend money on lottery tickets.
I don't do things because I think they'll "pay off." I do them because I learn things from them, and because they're fun. Every time I learn a new language and compare it with the ones I already know, I learn something (and it's fun, too).
Perhaps more demand, but when your supply is huge, more than needed, that higher demand is worthless. Smalltalk is far from as popular as Java, and there definitely aren't as many Smalltalk jobs out there. But there also aren't as many Smalltalkers- so it's definitely not impossible to get a job doing Smalltalk. Incidentally, Smalltalkers tend to make more money than Java and C++ coders, who are a dime a dozen nowadays.
That is, for the most part, true, but there are languages that make it difficult and some that make it easy. For me, no matter how much I am forced to use it in school, it is still a lot harder to express ideas (especially well-designed OO ones) in C++ than in some of the other languages I know. This is for a lot of reasons, from syntax to manual memory management. Now, it's not that what I want to do is impossible in C++, but there's a lot more crap I have to think about and deal with to get the same basic stuff done.
Let's pretend you were writing an "rcurses" library.
Need a line of 20 "-". No problem.
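In Ruby that's a one-liner either way, something like:

    puts "-" * 20
    20.times { print "-" }; print "\n"

No counter variable, no off-by-one.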
Anyway, from now on we'll all contact you before adding a feature to a language.
If I had modpoints, I'd have used them on you. The other explanations (ie: it's better/quicker than a for loop) are valid, but seem wasteful. This, though, seems appropriate.
I guess I'll download and try it later tonight, I just don't like the VB style conditionals (if fuck break else shit break end nonsense).
Heh. Does perl speed up development? Every time I've tried to use it, I had to have multiple browser windows open for docs and other people's code, in an attempt to figure out what the hell its goofy syntax stood for. The small script I ended up making kind-of worked- doing the same read, search-and-replace, and write back to file only works some of the time on some of the files- I try it on a different directory and BAM it doesn't work!
I'm not saying perl is completely flaky and useless- but contrary to Wall's silly assertions, perl isn't a language you can pick up gradually, there are simply too many rules and funny symbols ($/, @_, $_ anyone?) to learn just to do simple things.
And why the hell can you only store scalars into hashes? Pffft! And don't tell me I can just use references to non-scalar data, as refs are scalar- if I wanted to deal with C's bullshit, I would.
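For comparison, a Ruby hash will hold whatever you throw at it. A quick sketch with made-up data:

    config = { "name" => "rcurses", "dims" => [80, 24], "colors" => { "fg" => "white" } }
    puts config["dims"][0]        # => 80
    puts config["colors"]["fg"]   # => white

No references, no dereferencing syntax to memorize.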
Come on people, this is (meant to be) funny, not a troll. It doesn't even remotely try to make a sound argument. A troll would say:
For embedded systems: C# has a defined bytecode that can be JIT compiled onto a variety of special purpose chips without rewriting anything.
Web Programming: C# has built in XML and network support, and don't forget that because everything tunnels over port 80, no firewall will prevent your code from executing!
Instead, all of the statements are absolutely absurd. Maybe you don't think it is funny, but it is absolutely NOT a troll. Trolls try to say things that have a basis in reality but are completely non-constructive for the purposes of discussion to have people respond (I'm not sure where the line to Flamebait is, but...). This is just a joke.
and for the MS route
[...]
(Java/no other language is as useless to compete with Java)
I think you're under the false impression that Java was written by Microsoft. You also probably don't use Java for anything, hence you feel it's unnecessary. If you want a good, easy to maintain and robust app on multiple platforms then use Java, but if you need that app ready to go into production quickly, then Java is not your best choice; scripting languages like Perl are. I haven't learned Python as of yet, but based on what I'm told, it's good for getting an app done quickly.
Actually, as much as I hate to say it, Microsoft is on to something with the Windows Scripting Host. It can interpret Perl, VBScript, JScript, and possibly some others (I don't remember offhand).
Actually, there is a lot of talk going on between L Wall and GvR to try to unify their bytecode interpreters (and maybe object models, etc.) for just this reason. I'm not aware of the details, but perhaps in some distant future, Python, Perl, and maybe even Ruby will all converge on a common backend (forget about TCL, though. It ain't never gonna happen)
It was indeed a joke. But I've also heard discussion on the Ruby list that sounded serious about it. And nobody piped up to deny it. So, perhaps, the joke is turning serious.
I think a developer's productivity has less to do with the language he's using and more to do with how much experience he has with that language
If that were true, you could just as easily write a text processing system in raw assembly as in a scripting language. I don't believe that. Languages are tools. Different tools affect your productivity in different ways. A farmer's productivity does depend on his tools, and so does a programmer's.
An interesting discussion can be had as to whether one's spoken/written language affects your "productivity" for certain tasks (writing lyrics, novels, technical documents, etceteras).
Are there concepts that can be expressed in one (human) language, but not another?
I really hate Python's indentation scheme. Really. Really. As of this month (or next) they will have solved the licensing issues that had been bothering me since around the time of Python 1.6, and basically caused me to switch languages (well, I'd only written a couple of programs in it, so there was no big investment), but on thinking about switching back I really notice just how much I hate the indentation scheme. I really prefer tabs to spaces, and some Python editors seem to feel compelled to switch tabs to spaces. Programs keep breaking because of this. In no other language have I had this kind of problem.
I keep (kept) trying to think of some alternate syntax that would be easy to preprocess into the non-delimited form, but all of the traditional parenthetic characters on my keyboard have already been preempted for other uses. And of course if I do design this, then none of the standard tools would work with it. So I'd need an easy way to do a bijective mapping. And at a minimum this would mean parsing all of the strings to be sure that the parenthetic marking I was using wasn't a part of a string instead of a code delimiter. Basically I'd be writing a complete parsing engine with a two-way mapping.
So one of the things that I really like about Ruby is that it didn't adopt the alignment delimited program logic. I certainly see using alignment as an added check for logic correctness, but there should be syntax markers that determine the syntax. Layout should be for verification.
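A small illustration of the point (deliberately mis-indented):

    header_wanted = true
    if header_wanted
          print "-" * 20
      puts
    end

The indentation above is ugly, but the if/end pair, not the layout, decides where the block ends, so a tab-vs-space mixup can't change what the program means.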
There is a DBI module at http://www.ruby-projects.org/dbi/ . There are also a number of DB specific modules. See the Ruby Application Archive ( http://www.ruby-lang.org/en/raa.html ) for a list of other software modules that are available for Ruby.
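Usage follows the Perl DBI pattern fairly closely. A rough sketch, assuming the PostgreSQL driver is installed; the driver string, credentials, and table name are placeholders:

    require 'dbi'
    dbh = DBI.connect('DBI:Pg:mydb', 'someuser', 'somepass')
    rows = dbh.select_all('SELECT id, name FROM widgets')
    rows.each { |row| puts "#{row[0]}: #{row[1]}" }
    dbh.disconnect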
15 languages is enough. (Score:2, Insightful)
I'm just guessing, but I think I have learned at least 15 programming languages. Maybe it is 30, if you count assembly languages. At some point, learning languages must come to an end. At some point, I would like to see all the good features put into one or two languages. I have no intention of using my life learning languages that then die very quickly, like Pascal.
It seems that, at some point, every skilled programmer becomes interested in writing either an editor or a computer language. I wrote one myself, for use with some H-P data acquisition equipment. Now both the equipment and language are obsolete.
With every new language, there are years of extending the language, finding and curing the bugs, and wrestling with incompatibility problems. At some point, this must stop.
Language writers should put their creativity into extending C++ rather than abandoning it.
Consider Java, for example. There have been literally years of hassle and discussion about Java, when it could have been implemented merely as a compiler switch to a C++ compiler. The compiler switch could have provided automatic garbage collection, given error messages for use of pointers, and provided the other features of Java.
Microsoft, following its usual abusive practices, put many ease-of-use features into Visual Basic, while keeping them out of Visual C++. The result was that there are maybe 1,500,000 programmers who know Visual Basic, but don't know a real language. And what language is Visual Basic itself programmed in? C++, of course.
When you write in Visual Basic, you are just a dog on a leash. Anytime Microsoft wants to yank your chain, it can. If Bill Gates happens to lose interest, Microsoft can kill this one-OS language, the way it killed xBase (dBase, FoxPro), for example, by making quirky additions that no other vendor could/would follow, and then abandoning development.
It is a shortcoming of the U.S. culture that so many men feel that they must play King on the Mountain, and try to knock everyone else down, rather than cooperate.
Like everyone, Bill Gates has inner conflict. But don't let his anger and obsessive desire to make money reduce the quality of your life!
So now it is Ruby, supposedly the next big thing. How long will it be until the books are all written? How long until we discover the shortcomings? Is there some feature in Ruby that can't possibly be added to another language?
I reviewed Ruby a little about a year ago, but came to the conclusion that the documentation was extremely poor. The overall value of a language is the value of the language itself, plus the quality of documentation of the language.
At some point, I want the merry-go-round to stop, so that I can get off.
Is there something in C# that simply could not be made a compiler switch to C++? Do we really need more syntax? I'm not saying I know all the answers to some of these issues. But I sense that something bad is happening.
For a few years, languages were interesting to me. But now, I just want to do the job, not discover other ways to do the job. Let's move on, rather than repeating what we have done before. Let's improve what we have, rather than starting again.
Re:15 languages is enough. (Score:2)
I would not want to claim that this is an easy problem, but the C++ justification of "Well, we can't tell the C compiler how to do things, so we'll just leave that unspecified" grates on my nerves. If you can't decide on the proper specification, at least you could define a way that would be guaranteed to work. And I don't believe they even talk about any language other than C (though that would be a reasonable lingua franca if it were better defined). But a C interface is sort of guaranteed to be a bare minimum that leaves out all of the OO features. I'd like it to be something that would open up C++ templated routines to being called from Smalltalk, etc. Then I might agree that C++ could be worked on as the "only needed" language. Currently it doesn't have anything approaching that state. Java comes closer, despite its single inheritance and clumsy interpreter.
Note. Much of the foregoing is wrt Standard C++. gcc is a somewhat different beast. It is aggressively much more compatible between languages than the C++ standard requires. But it suffers the problem that its features are non-standard. And this is a problem. Standards are the fixed posts around which designs evolve, so if the standard doesn't proclaim that "this should be a feature of the language" then it's quite easy to evolve away from it.
Consider, e.g., purely virtual C++ compilers. For a while there were several C++ compilers that would preferentially make all class methods virtual calls, so that they could be overridden by inheritance. That seems to have slipped away. I don't know the precise reason, but the result is that code written for those compilers is often broken with the currently common compilers. Because they were using a useful extension that wasn't in the standard.
So for working code, rather than experimental code, it's important to only depend on features specified by the standard. Which means that as long as the C++ standard doesn't specify decent methods for connection to other languages, then such a feature can't be depended upon.
Languages rarely give up a feature that was a part of the standard. At most they will mark it as obsolescent, and suggest that all new programs adopt another way of doing things. They only give up features quite reluctantly, and with as much backward compatibility as possible. And I have a feeling that C++ may have painted itself into a box. In the name of efficiency it has specified that certain features should be implemented in certain ways with certain results. So it would, for example, be quite difficult to import a feature like Smalltalk/Python/Ruby's run-time binding of name to feature into it. It has gained efficiency, and lost flexibility. The obvious way to regain the flexibility is to allow the flexible part to be done in another language. But it's difficult to link most other languages to C++. They generally need to pipe themselves in through the restricted pipe of a C compliant interface. (Though gtk shows both a way around that, and the costs involved in using that way.)
And, no, I don't have any better choice for the central position. Even though it's also missing garbage collection (another bonus of its C heritage -- it's hard to distinguish pointers from data).
Re:15 languages is enough. (Score:2)
I agree and disagree with your point. On the one hand, I agree strongly that there are too many languages. I have yet to see anyone post why Ruby is different from Perl or Python.
And I strongly agree that making Java a compiler switch to a C++ compiler might have been a Good Thing.
BUT...! The reason C++ sucks (to me, anyway) is the language complexity, not lack of features. In other words, it has too many features. The reason I like Java-the-language (versus Java-the-environment, which I don't like) is that it strips out a lot of the B.S. that makes C++ unwieldy, like multiple inheritance, operator overloading, and other "somewhat useful features but not worth the extra complexity and downsides".
Sometimes to make something really good you have to throw away the past and start over with a clean slate. With languages, this is very dangerous, because it's hard to build up a following for a new language. The whole reason C++ was able to create a following is because it was semi-backward compatible with C.
I don't know that much about C#, but it's guaranteed to be used and successful simply because Microsoft will probably embrace a huge internal development effort which will bleed over into the industry. I hope and pray that it turns out to be what C++ should have been... a nice, tight language like C with modern features, but without the insanity. I also hope and pray that it can be native compiled, and not require a Java-style runtime environment.
Just don't use the features. (Score:2)
I think your point is interesting about C++ having too many features. But why not just have compiler switches that prevent use of some features? That way, if a project didn't need the complex features, they could be prevented from being used, assuring uniformity and, in some cases, ease of debugging.
The features of C++ don't bother me. I just use what I want. Similarly, English is a very complex language. But, as a professional writer, I choose to avoid obscure words and poor constructions. For example, I usually avoid verb phrases ending in prepositions. I don't say "used to", for example.
And the complex features of C++ are of higher quality than the complex features of a human language that was never designed. The complex features of C++ are very useful in special cases.
What I really don't want is to have to change languages just because I need a particular feature.
Re:15 languages is enough. (Score:2, Insightful)
You sound like a guy I work with. I sympathize with him, because he's getting close to retirement. Maybe you are too. I think it was the "Pragmatic Programmers" who said you should learn a new language every year. I find it helps to keep the brain plastic. It's all too easy to get into ruts, but soon the joy is gone too, and you might as well be flipping burgers.
Brent
Age is not the issue. (Score:2)
Age is not the issue here. The issue is that it might be better to put more of the fine features into one language, rather than have so many languages.
There would still be the same amount of features to learn, but they would not be scattered all over the programming landscape.
Re:Age is not the issue. (Score:2)
Do you think some smart guy like Matz, or Guido Van Rossum or Larry Wall can just go up to the ISO committee and say: "I've got this set of cool ideas -- let's add them to your language."
Also, do you really think that all of the good ideas of Perl, SmallTalk, C++, Lisp and Icon can naturally fit together in a single language? It isn't always possible, nor advisable, to pile every possible feature into a language.
Most people dislike C++ because it has TOO MANY FEATURES. That may not bother you but it is a big part of why Java caught on so quickly. So if you make an uber-language with every feature, you will CAUSE the creation of competitive new languages with smaller feature-sets. Smaller languages are often a reaction AGAINST bigger ones.
Too many features. (Score:2)
You are raising extremely interesting questions. If people don't like many features, should language designers provide subsets that people can access easily? Could a compiler switch make C++ like Java?
Is the dislike people have for a language partly due to poor explanation of the features?
I'm not saying I know the answers. I'm just thinking that we could do better for ourselves than create an average of one big language each year.
Maybe the ideas of Perl, SmallTalk, C++, Lisp and Icon are just a subset of a larger truth that, if we recognized it, would simplify all languages.
You raise an interesting question about the politics. My guess is that, if someone knew enough to lead us in a better direction, the politics would eventually be changed to fit the new situation.
RE: The future of C++ (Score:2)
"most teaching of C++ at the moment is terrible, which is the single biggest problem the language has"
I agree with this. My city, Portland, Oregon, U.S., has an extremely large technical bookstore, Powell's (http://www.powells.com/technicalbooks [powells.com]), and I have spent several afternoons looking through all the books on C++. There is a strong tendency in most of these books to explain without truly explaining, or to mystify without explaining at all.
It is great to know that C++ will eventually be extended. But the 2 to 4 human years that you mention is equivalent to 14 or 28 technology years.
You mention "better support for interfacing with other programming languages".
This is not a controversial addition to the C++ language. It seems to me that it should have been finished 2 years ago, not 2 years from now.
Is there a tutorial for perl that is similar? (Score:2)
Re:Is there a tutorial for perl that is similar? (Score:2)
Performance, gentlemen (and ladies)? (Score:2, Interesting)
Re:Performance, gentlemen (and ladies)? (Score:2, Informative)
what type of shot is Ruby ? (Score:2)
We've all seen the usenet post in the coffee room or on various websites. Usually under some title like:
Horses for Courses (Score:2, Interesting)
An associate once said about programming, "It's horses for courses." What he meant was that we should use the best tool for the job at hand. When he said it, he was talking about using Clipper (a dBase clone) over Quick Basic in 1990.
Today I know two languages to what I consider 'professional level'. On Windows I develop stand-alone applications with Delphi. It's easy to develop quality code - I define quality in this case by not crashing due to obscure memory errors. It's very hard to write code in Delphi which causes memory handling issues. Writing Delphi feels like the IDE wants to help you, but if you want to get your hands dirty you still can (unlike VB).
For Web Development I now use Java. I wouldn't use Java for stand-alone GUI apps because it's still too slow compared to Delphi, but for server-side web development it has the features I want - portability, easy to develop in, connects with SQL etc.
Learning a new syntax takes time. You need to be sure that a new language is going to have a payoff in terms of being able to achieve something you can't in others. For years I wrote Delphi, as it was the best horse for the course. Now with Web Development the course has changed, and Java is best (for me).
That said, a programmer should be able to learn any language. But just because you can doesn't mean you should...
As the guy who wrote the article. (Score:5, Informative)
The content in the article is specifically designed to look like either Perl or Python. This was to highlight similarities that programmers coming from other languages can recognize.
The second article in the series looks a lot less like Perl and more like Python. This is due to Ruby being a true OO language, very similar to Python.
The third article will (if they let me write it) cover advanced Ruby structures and illustrate them by interfacing with the PostgreSQL database.
The fourth article will cover Ruby/Qt.
If you guys want to see these articles you may want to give some decent feedback on the Developer Works site.
so many choices (Score:2, Insightful)
Re:so many choices (Score:2, Informative)
Then learn Ruby. Python looks clean to the untrained eye, but you have to remember lots and lots of special cases because of the arbitrary mixture of object-oriented and procedural features.
Ruby is extremely consistent and well thought out. I've used well over a dozen languages, and Ruby is my favorite by far.
Advantage of Python (Score:2)
Re:so many choices (Score:3, Funny)
when you end up in Mexico with an English-to-Mexican and vice-a-versa dictionary.
Mexican? Everyone knows they speak Latin in Latin America.
The Pragmatic Programmers Ruby Book is Online (Score:4, Informative)
Re:The Pragmatic Programmers Ruby Book is Online (Score:2)
reviews, anyone? (Score:2)
Time for an icon.... (Score:3, Insightful)
Re:Time for an icon.... (Score:2)
I'm a fan of the ruby-crowned kinglet, myself.
Re:Time for an icon.... (Score:2)
Re:Time for an icon.... (Score:2)
Don't judge ruby based on the article (Score:5, Interesting)
If Python was the result of Lisp and C++ having a baby, Ruby is the result of Perl and Smalltalk having a baby.
Re:Don't judge ruby based on the article (Score:2)
Which is funny, as Smalltalk itself is really a child of a Lisp mommy getting some unknown, alien artificial insemination. Some put Smalltalk in the LISP family of languages, but those who don't just avoid classifying it. Fascinating, its family tree is.
missing the point (Score:3, Insightful)
The key features of Lisp are the use of a uniform notation for programs and data, a notation that makes it very easy to write syntactic extensions, and full support for functional programming. Python 2.1 now has some limited support for lexical closures, but users almost never extend its syntax. The key feature of Smalltalk is its programming environment. As a language, Smalltalk's main feature is the use of very descriptive method and argument names. Neither of those is shared by Ruby.
Python, Perl, and Ruby are nifty little scripting languages, but don't think for a moment that if you are using them you are using the next generation of Lisp or Smalltalk. The object models of Python, Perl, and Ruby are general but slow and memory intensive, their programming environments are oriented towards scripting, not large system development, and compilation to native code, when it exists at all, is of limited benefit. If the scripting languages ever evolve into something like Lisp or Smalltalk, their object models, syntax, and semantics will have to change dramatically and incompatibly.
Re:missing the point (Score:2, Insightful)
So far, I haven't heard anyone nominate these languages for "large system development," but a lot of large systems are made out of small systems, and while it is tempting to create these large systems in a uniform language (*cough*)Java(*cough*), this can be a mistake. I read a magazine article recently about a shop where they were replacing shell scripts with Java for automating tasks--which turned out to be a big mistake, considering the overhead of the Java VM.
You're right about compilation to native code being of limited benefit to these languages. For the most part, it is a "benefit" they don't need. Take, for example, creating a complex web site in C or another statically-typed, and therefore easily-compilable, language. By the time you're on your third page, you'll realize you need some kind of templating system. Next, you'll discover you need to vary the appearance of pages based on aspects of your business logic, so you add conditions to your templating system. Next, you'll find management wants to be able to update the site quickly using cheap HTML monkeys and is reluctant to spend expensive programmer time recovering old ground, so you make the site entirely data-driven. So now you have a buggy, ad hoc, poorly-documented excuse for PHP! Sure, the code you wrote is compiled, but the system as a whole is slower, and it took a whole lot longer to write.
You said, "If the scripting languages ever evolve into something like Lisp or Smalltalk, their object models, syntax, and semantics will have to change dramatically and incompatibly."
Have you ever considered that maybe these languages ARE the evolution of Lisp and Smalltalk? With all due respect to Lisp and Smalltalk, their creators didn't have the benefit of Lisp and Smalltalk when creating their languages. Larry and GVR and Matz didn't create their languages in a vacuum. Even PL/1 must be of benefit to current language designers (here's what NOT to do!).
Perhaps it would be of benefit to aspiring language designers if you could elucidate exactly how "object models, syntax and semantics" could be improved and what the benefits would be--keeping in mind the purpose(s) of these languages.
Granted, the best programming language doesn't always win out, but the purpose of evolution is to thrive, or at least survive. I am therefore equally suspicious of praise for highly-evolved-yet-dead languages and snubs of unevolved-yet-wildly-thriving languages.
Brent
Re:missing the point (Score:2, Informative)
Maybe you haven't noticed, but Python, Perl, and Ruby all represent objects as general purpose dictionary types, in contrast to the Lisp and Smalltalk approaches.
Beyond that, I have no idea what you're railing against. Did I say scripting languages were bad? Did I say people should switch to Lisp? What is your problem?
Re:missing the point (Score:2, Insightful)
I would definitely agree that it's evolution; however, it's not a controlled evolution pointed forwards, like the evolutions that created Lisp and Smalltalk.
Python, Ruby, and others are adaptations of Lisp and Smalltalk to a way of thinking that people used to C and Unix can handle. Smalltalk and Lisp are so advanced and forward-thinking that languages like Python and Ruby have to take a forced step backwards to accommodate those who cannot advance. It's kind of sad in a way, but I suppose it's still a step in the right direction, as there's a better chance of converting C++, Java and Perl people to Ruby than converting them to Smalltalk or Lisp, unfortunately. Better part way than none!
Strict languages vs. hacked languages (Score:4, Interesting)
Usually the latter turns out to be some baroque conglomeration of features piled on features, creating a very top-heavy feeling to the language, while the languages in the former classification all have a purity to them, e.g. Smalltalk, Lisp, and C.
Some would defend the "hack languages" as a means to Rapid Application Development, but Smalltalk has been shown to be the most productive language, and Ruby/Python/Perl all seem to me to have a BASIC odor to them; I'm wondering if people are afraid to learn a new way of speaking?
Re:Strict languages vs. hacked languages (Score:3, Insightful)
Well, I mean, as long as we're talking in generalities here, the latter also seem to be geared towards getting jobs done where the former are geared towards elegant problem-solving. Purity is great when you're admiring something, but no one wants to hang out with the righteous virgin when they feel like getting laid. Personally, when I program, I'm not looking for my code to fit some elegant theory. I'm looking for the job to get done as succinctly as possible.
Some would defend the "hack languages" as a means to Rapid Application Development, but Smalltalk has been shown to be the most productive language, and Ruby/Python/Perl all seem to me to have a BASIC odor to them; I'm wondering if people are afraid to learn a new way of speaking?
I would say that Perl stinks more of shell and C than it does BASIC. If you're saying that because it has a lot of built-in features, there are plenty of languages that have that. But out of all the languages you've mentioned, I found it ironic you say that, because Perl has probably undergone the most intensive language development of any (and the new process probably blows efforts for other languages out of the water). Larry Wall has a great affinity for languages (both spoken and programming), and it shows, because, for English speakers, Perl is designed so that you can write it almost the way you would speak it. Now, you might say that that is "hack"-ish, but it shows a lot more care for the process of programming than a language that idealistically sticks to a theoretical truth rather than making the language easier to use.
And provide a link to a study that shows Smalltalk is the most productive language. I'm not saying I don't believe you, but I personally find the "hack languages" to allow a much more natural flow between my brain and the screen, so I'd be interested in seeing how that conclusion was reached.
Re:Strict languages vs. hacked languages (Score:2)
Different strokes for different folks, I suppose. While you seem to say that the simplicity and elegance of Lisp and Smalltalk isn't practical, it is for me- my brain thinks in such terms. Perl is, for me, in many ways almost as big of a pain in the ass as C++, because it tries way too hard to fit what Larry Wall says is the "natural flow."
Just another perspective...
Re:Strict languages vs. hacked languages (Score:2)
You bet they are! Ruby, Perl and Python are putting Lisp and Smalltalk in terms of a contrived, BCPL-derived syntax. They don't quite embody the semantics fully (Ruby seems about the closest), but as people are too lazy to learn a fully new way, they have to be given syntactic training wheels.
Re:Strict languages vs. hacked languages (Score:3, Insightful)
Python, however, is a truly clean, well-designed, quite-pure, strongly-typed language.
I wouldn't say it has a Basic odor at all, aside from a built-in 'print', perhaps
Basic is braindead and has no library - instead, it has its entire library built into its syntax in a horrid way.
Python is the exact opposite - many, many libraries separated into modules, with only flow constructs, OO organization, functions and exception handling built in.
Re:Strict languages vs. hacked languages (Score:2)
Re:Strict languages vs. hacked languages (Score:2)
Re:Strict languages vs. hacked languages (Score:2)
I get the stronger sense that you don't know Ruby -- most Rubyists tend to be ex-Pythoners and not the other way around, but nevermind.
Python gets *none* of its object-orientedness from C++
Python lacks metaclasses, lacks a true unified object hierarchy, and supports multiple inheritance (considered a very bad idea by most experts in OO. Like the "goto" statement, multiple inheritance may seem useful at times but it leads to unmaintainable code). All these are simple repeats of the mistakes of C++.
Python's OO credentials are just as strong as Ruby's. This is especially true for Python 2.2 where you can subclass even primitive types
Yes, Python is improving, but all these improvements (like allowing the subclassing of primitives) only serve to point out flaws in the design (why aren't primitives normal objects in the first place?)
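For the record, Ruby sidesteps multiple inheritance with mixins. A small sketch of sharing behaviour across unrelated (made-up) classes:

    module Audited
      def log(msg)
        puts "#{self.class}: #{msg}"
      end
    end

    class Account
      include Audited
    end

    class Shipment
      include Audited
    end

    Account.new.log("opened")    # => Account: opened
    Shipment.new.log("sent")     # => Shipment: sent

One implementation, no second parent class, no diamond problem.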
Re:Strict languages vs. hacked languages (Score:3, Insightful)
People do tend to progress from more well-known languages to less well-known ones. They don't go to a bookstore and pick up a K book and think: "this would be a good first language." I've played with Ruby enough to decide that it is neat but not appropriate for many of my projects and not a sufficient improvement over Python for the rest.
Python 2.2 has metaclasses, a unified object hierarchy and supports multiple inheritance which is quite useful and safe when used thoughtfully.
It is simply a matter of historical fact that Python was not based upon C++. If it shares features with C++, those would probably be traced back to Simula through Modula-3. Smalltalk is also based upon Simula.
Nobody would claim that Python 1.0 was perfect, nor that Python 2.2 is perfect. Nevertheless, you haven't yet mentioned a feature of Ruby that isn't in Python 2.2. There certainly are such features -- but they get increasingly esoteric as Python improves. If I have to choose between Bertrand Meyer-approved-OO-cleanliness and native threads that don't block when you do I/O, I would choose the solid threads. OO-cleanliness is about conceptual elegance and native threads are about getting the job done.
And how deep is Ruby's Unicode support? If you can please point me to documentation on using Ruby's regexp engine to match Unicode characters, I would appreciate it.
My current Python projects depend heavily on both threads and Unicode. Mixins, multiple inheritance and the ability to subclass "integer" really don't matter one whit! I am mostly happy that Python is unifying its type system for rhetorical reasons. Practically it hardly matters at all!
If you want to impress the vast majority of programmers who are not language collectors, you'll have to show us some problems that are hard to solve in (e.g.) Python and easy to solve in Ruby.
For this and other reasons, it is not accurate to paint Ruby as the next step after Python. It is another good language with strengths and weaknesses. One day it will have a superset of features that Python currently has...but Python will itself have evolved by then.
Re:Strict languages vs. hacked languages (Score:3, Insightful)
Here you have an issue of implementation, not design, but at present a legitimate issue. Ruby is a product of Japan, and ironically, from the perspective of Westerners, who generally see Unicode as a sort of peace offering to Asians to make up for the dark ages of ASCII, the Japanese hate Unicode and prefer their own multibyte solution. So, the status of Unicode in Ruby is somewhat primitive at present. However, as Ruby was designed with multibyte characters in mind, it should be much easier to improve the Unicode support than in other languages.
Re:Strict languages vs. hacked languages (Score:2)
No, Smalltalk borrowed ideas from Simula. Other than the idea of classes, Smalltalk derived very little else from Simula. Most everything else came from Lisp.
Re:Strict languages vs. hacked languages (Score:2)
Clearly you aren't a Java programmer. If you want to do multiple inheritance, what you most likely really want to do is share some utilities across several class hierarchies. This means you want a utility class, if you want things to be clean.
What you should do is declare an interface which exposes the various properties needed to perform the utility operation. Then create a singleton to which you pass objects of the interface type.
Presto! Code in one place can be modified to support multiple classes. Inheritance is used for what it is supposed to be used for, and maintenance is easy.
-jon
Re:Strict languages vs. hacked languages (Score:2)
Man, don't be so bitter- it's just Slashdot.
Re:Strict languages vs. hacked languages (Score:2)
No, the next level isn't Ruby. It's Smalltalk. It's Lisp. That's the point your post's parent is saying- if you can move from Perl to Python, why not go all the way to the source, to the acme of elegance and simplicity- Smalltalk or Lisp?
Re:Strict languages vs. hacked languages (Score:2)
Well, there is a place for scripting languages and "normal" programming languages. Smalltalk competes with Java and C++. It complements scripting languages.
Re:Strict languages vs. hacked languages (Score:2)
That's bunk. There's nothing other than what is "acceptable usage" in a coder's mind that makes something a "normal" or "scripting" language. Smalltalk is equipped with the tools to build large systems more so than most scripting languages like Perl, Ruby, and Python- but out of the box, I'd say it's also more equipped than C++ and Java. :)
Re:Strict languages vs. hacked languages (Score:2)
Well, in the case of Smalltalk, most implementations I've seen live in their own environments rather than interacting with the native command line. This makes them more or less useless for "scripting" in the normal sense where you call a script from the command line and pipe data into a script and output results on stdout. For example:
neatoScript < data.txt > processedData.txt
Perhaps you don't have to do such tasks in your own work, but that's what scripting languages are used for.
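For instance, a whole Ruby filter can be one line (call it upcase.rb; a made-up example):

    ARGF.each_line { |line| print line.upcase }

and then

    ruby upcase.rb < data.txt > processedData.txt

That kind of plumbing is what I mean by scripting.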
Re:Strict languages vs. hacked languages (Score:2)
I don't use Smalltalk to script in the old-fashioned way, however. I spend a lot of my time with an image (meaning, a Smalltalk environment) open, and write, run, and debug my scripts directly out of it, rather than using an external editor.
You can look at this in the way some people spend a goodly amount of time in nothing but (X)Emacs, writing scripts in elisp.
Regardless, nowhere in the definition of "script" is it specified that you have to be working with pipes.
Re:Strict languages vs. hacked languages (Score:2)
who says Python is an upgrade from Perl? It's all a matter of choice.
Well, even Larry Wall himself admits that Perl 5 is showing its age. Look at the plans for Perl 6. It is looking more and more like Python and Ruby, isn't it?
Re:Strict vs. hacked languages (THIS IS A RANT) (Score:2)
Take a good structured design, substitute 'class' for 'module' and 'member method' for 'public function', and you'll have a good OO design. OO is just an easier way to implement the classic structured design elements: encapsulation, extensibility, and genericity.
You also said:
Huh? I can't speak for Java, but your 'dream code' is standard operating procedure for Python (and, I imagine, Ruby). Here's a transcript of your Hello World example from an actual Python interpreter (I could have said print "Hello World" instead of the sys.stdout.write() call, but I wanted it to look like what you had written). Download a Python interpreter, type that code in, and see it run for yourself. If you don't like OO because of your experiences with C++, Java, or Perl, you should try Python or Ruby. They are OO languages done right, as opposed to the hideous Lovecraftian horror of C++. Even if you insist on structured programming with no OO, Python has a very nice module system that makes it easy to create and encapsulate modules.
Re:Strict languages vs. hacked languages (Score:2)
C++ is almost nothing else but a huge hack. Its only strength over other languages is backwards compatibility with C's way of thinking. Nothing more.
Re:Strict languages vs. hacked languages (Score:2)
Re:Strict languages vs. hacked languages (Score:3, Insightful)
Re:Strict languages vs. hacked languages (Score:5, Interesting)
Re:Strict languages vs. hacked languages (Score:2)
I don't really see "symbolic AI", "planning", "Dynabook, educational software" and "proof systems" as particularly representative of the programming most of us do in the real world.
"Text munging", "kernel hacking", "GUI and server application programming" etc. are more typical. Thank Guido there are "hacked" languages to let us do our jobs!
Re:Strict languages vs. hacked languages (Score:2)
Smalltalk is just as (and more) dynamic. I work on an IRC client that's part of Squeak. Without restarting the client once, I added a plug-in system and wrote a few sample bots. Pretty amazing. Change a method, and the next time it's called, it's using the new version. That's kind of difficult to do in Ruby without an interactive environment, but I suppose it can be done with a TkListener or a thread taking irb-like evals on a socket.
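Something along those lines is all I mean; a minimal, unhardened sketch (the port number is arbitrary, and eval'ing whatever arrives is obviously a security hole):

    require 'socket'
    Thread.new do
      server = TCPServer.new(4455)
      loop do
        client = server.accept
        while expr = client.gets
          result = begin; eval(expr).inspect; rescue Exception => e; e.message; end
          client.puts result
        end
        client.close
      end
    end

Run that inside the client and you can telnet in and redefine methods while it stays connected to IRC.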
Re:Strict languages vs. hacked languages (Score:2)
Re:Strict languages vs. hacked languages (Score:2)
Re:Strict languages vs. hacked languages (Score:2)
Why didn't Smalltlak take off? There are a lot of reasons, including:
In what sense would you say Smalltalk "didn't grow up as much?" Smalltalk is incredibly mature and stable- so, I doubt that's what you mean.
Smalltalk is productive, in my experience. Putting Smalltalk and C in the same league is silly- at the very least, C++ and the STL would be a better comparison.
Smalltalk isn't the best for kernel-hacking or serious number crunching (F90 anyone?), but I don't claim it to be. It's a great language, a great system- one with which a person can be incredibly productive, especially in comparison to C++/STL and Java.
Not being able to spell the language's name correctly is a sign you've never actually had any experience with it.
Re:Strict languages vs. hacked languages (Score:2)
However, I misspelt Smalltalk as "Smalltlak," which is a typo, and would be taken as such by a member of the Smalltalk community. When you misspelled it, it was as SmallTalk, often taken as a sign of ignorance rather than simply a typo. Why is that? The "proper" way is Smalltalk, but it's pretty usual for people who don't know what they're talking about, who have maybe only read about St on /., to spell it SmallTalk. No one knows why.
So, if it was a typo, my apologies, but that is my reasoning.
Re:Strict languages vs. hacked languages (Score:2)
Smalltalk isn't interpreted. It's incrementally compiled and run on top of a VM. What does that mean? Methods are compiled into bytecode and then interpreted or JITted. This is similar to Java, with the exception that there is no explicit "compile" command.
In a Smalltalk browser, when you save the method you're working on, the method is compiled, and you're alerted to any errors in your code. Now, this compilation takes a fraction of a second, even on a 486. It may not feel like it's compiling, but that's only because you're used to having to take the explicit step of running gcc/g++/javac or what have you.
Considering it productive was nearly the same, IMHO, as considering QuickBASIC to be productive.
It shows you've not done more with Smalltalk than "Hello, World" examples. For application-level programming, in my experience, Smalltalk is more productive than C++ and Java, given both the Smalltalk and C++/Java programmers are somewhat experienced (at least 6mos). I've been using Smalltalk for a little over a year, and am quite a bit more productive than I am with Java (3mos use), C (4 years, on and off), and C++ (2 years). For one, there are simply not the same number and magnitude of "gotchas" as in the above listed languages/systems.
Frankly, I see C++ as mostly worthless, especially as an application-level language. The STL and other libraries don't let you escape from C++ and its many annoyances, they just complicate things. I'd much rather be using Smalltalk (or another high-level language like Common Lisp, Ruby, Python) and writing extensions in C when necessary.
In most Smalltalks and Lisps, there is a call-out interface, such that lib-extensions like those found in Perl, Python, and Ruby don't have to be written. You just tell the system the function signature and the name of the library.
You don't need a 1+GHz machine to run a Smalltalk system for development, or to run a Smalltalk-based app. You're too used to Java, which is still largely impractical for real-world apps.
GUI builders have been a part of Smalltalk for a while. Don't know a date though. Check out Dolphin Smalltalk for an implementation tightly integrated with Windows.
As far as there not being a place in the world for Smalltalk- bah. If you don't want to use it, don't. But please don't spread misinformation because your 20 minutes worth of experience was confusing. I understand some people are set in their ways, and prefer the way they've been doing things for years- a contrived, BCPL-based syntax, and a very static, compilation-based way of life. That's fine- but I'll continue using Smalltalk and getting my stuff done.
Why learn another language? (Score:5, Informative)
A lot of you people come from what I call the Computer Science Student mentality. This is unfortunately something I've noticed as a side effect of the way computer science is taught in most schools. It's quite unfortunate, because that clearly isn't the aim of the professors when they are teaching classes this way.
I'll use my school as an example. When I started at the University of Dayton, about 7 years ago, every class was taught in Ada. It was a horrible, horrible experience. The Ada compilers were horribly lacking at the time, and anybody who has done any work in Ada will know that it is an extremely strict and picky language. That has its good and bad sides when it comes to teaching programming, but I'll save that for a different discussion. I had no problems, since, well, I had a good solid 4 years of Pascal and C/C++ programming experience before I even started college, but this seriously affected the other students in my class.
The problem was, about two years into the program, my school decided (thank god) to switch the department over to C++. This was a great move, because now the students were being taught a language that they could actually apply in the real world (beyond the confines of the Wright-Patterson Air Force Base anyway). It worked out quite well for the newer students, but my classmates were blindsided. Most of them suffered through half a semester of C (the other half of the semester was 360 assembler) and never even touched upon C++. They should have just been able to pick up a C++ book, apply the concepts they learned from their Ada classes and the syntax from the C++ book and their C course, and move on, but most of them had a hard time doing it.
Why was that? They all knew the concepts. They all knew how to write their algorithms, and their trees, and their stacks and queues. Yeah, they weren't taught how to write real software, but they clearly knew the basics. I spent more time helping others learn these concepts than I did doing my own homework the first two years, so I know what they were capable of from first hand experience.
The problem was entirely in their minds. C++ is a huge ugly beast, and it is a bit imposing when you first start. But if you've got two solid years of programming behind you, it should be a relatively smooth and easy transition. Most of them didn't realize this though. They were scared of C++, they were scared of new languages, and they suffered as a result.
The simple fact of the matter is, if you know one language, you know them all. It's not the syntax that makes the difference, it's the concepts that you express within the framework of the language's syntax that are the real guts of programming. My classmates took a while to realize this (and I'm sure many of them still don't). That's the same thing with Ruby, or Python, or Perl, or just about any language. Unless you are making the jump from procedural to OOP, or OOP to Functional for the first time, you *CAN* pick up a book and learn a new language in a day's time. The only thing stopping you is yourself.
Now, the other part of this is, why would you do that? I love learning new languages, and I love learning new languages for a few reasons. These reasons apply to every programmer, and I honestly don't understand why some people are so opposed to learning something new. I guess that's what separates a good programmer from a bad programmer. So if you want to know why you should learn Ruby, or LISP, or Haskell, or even Visual Basic, I'll tell you why.
1. It helps keep your skills in tip top shape. Perusing a computer manual may remind you of algorithms or techniques you haven't used in a long time and forgotten.
2. You always seem to learn something new. Not some new technical trick that only works in one language (although that definitely happens), but just a different way of approaching problems that sometimes can transcend language boundaries.
3. You may find a new language that allows you to get the job done faster!
4. Your enhanced knowledge of languages looks great on your resume, no matter what you use as your primary language.
5. You learn the way other people think. And I don't want to gloss over this one. As a programmer, you frequently have to work with other programmers. Learning new languages is a *GREAT* way to see how other people do things; to learn the way other people think, so to speak. By learning Ruby or Smalltalk, you start to understand why people in those communities are so die-hard about OOP programming styles. By learning LISP or Haskell, you start to understand why functional styles even exist! And it all comes back full circle: techniques I learned from Haskell I now use when writing C++ programs, and vice versa. It has only made my C++ code better (see the small sketch at the end of this comment).
Knowledge is power, and learning new languages is one (of many) ways of increasing your knowledge. Go ahead and try it, even if you think you won't use the language, and even if you're just starting out and don't think the transition from your learning language to a new one will be easy. You just might be surprised by how much you already know, and by how much you have yet to learn. That's the real benefit of it.
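To make point 5 concrete, here is a minimal, hypothetical sketch of my own (the names and data are made up, not anything from the article) of the kind of functional habit that migrates between languages once you've seen it: transforming and folding a list with blocks instead of an explicit loop, written here in Ruby.

# Hypothetical example: map/reduce style picked up from functional languages.
prices = [3, 5, 8]

doubled = prices.map { |p| p * 2 }               # transform each element
total   = prices.inject(0) { |sum, p| sum + p }  # fold the list into one value

puts doubled.inspect   # => [6, 10, 16]
puts total             # => 16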
Re:Why learn another language? (Score:2, Insightful)
I've noticed a similarity in my "night job" as a musician, where playing different types of music acts in a similar way to programming in different languages. You pick up stuff in one style that can enrich your playing of another style. Well, I sure am staying on topic, eh?
One comment I wanted to make regarding picking up another language in a day: I agree, somewhat. There was no question that after programming in (time order) Pascal, C, Fortran, Ada, Lisp, CLIPS, and Scheme, I was able to "pick up" Perl pretty quickly. That said, it took some time to really learn the Perl idioms and to do things the "Perl Way". I've seen my share of C code that was really Fortran written in C, or Java code that was really C written in Java (procedural vs. OO). With apologies to Heinlein, it takes longer than a day to grok a new language.
Re:Why learn another language? (Score:2)
Not always. IMO, Java built so directly on my existing C++ and Smalltalk knowledge that it didn't really provide anything that Smalltalk (and to a lesser extent, C++) couldn't provide; it was merely a new syntax and part of a different business model. For those unexposed to Smalltalk, Ruby has some aspects which may take a while to grok, but for the most part it builds on my existing knowledge, leaving almost nothing to grok. And in many cases, it's more convenient.
I'm still trying to get to know how to use Perl, not just to grok it; it seems there are so many rules for every tiny thing you'd want to do that I've had to keep the docs open for anything that departs from print "hello world...";
Re:Why learn another language? (Score:4, Insightful)
You should spend time in, for example, a simple teaching language to start off, one mainstream procedural/OO language, one functional language, and one scripting language, and briefly study a sampling of languages of commercial or academic interest and what makes them interesting or successful.
IMHO, learning your second language is when you start to 'get it' about what is an essential feature and what is merely an accident of syntax or history in programming.
I wouldn't call knowing only one language "the Computer Science Student mentality", because a CS graduate should definitely not know just one language; they should know how to pick up any language quickly, having had practice at it.
Re:Why learn another language? (Score:2)
Other languages teach other ways of thinking. (Score:3, Insightful)
In the second half of his post above, Dalroth has made a very sensible case, in my opinion, for learning other languages.
(Dalroth seems like the kind of programmer employers want to hire. But he has provided no way to contact him. In fact, he is very negative about being contacted in his bio.)
I'm tired of new languages (read my post #128), but Dalroth has a point that other languages sometimes teach other ways of thinking.
The answer seems to be to put all the knowledge in one place, or as few places as possible. At some point even Dalroth will decide that one more language is too many.
Re:Other languages teach other ways of thinking. (Score:2)
I doubt he will. That is, there is no point at which there are too many languages. To say otherwise is just a submission to the ever-stronger trend of American capitalist corporate philosophy under the guise of "efficiency."
Let me explain. Oftentimes new languages are created so that someone can learn the ins and outs of designing and implementing a language and its tools, be it an interpreter or compiler, native or bytecode. Writing a language seems to be a popular enough hobby that it won't soon go away. Eventually, some schmuck, consortium, or business will come up with some new whiz-bang theoretical basis for a new language that will make everything else look that much older.
Now, you don't have to go and learn any of these new languages. No one is forcing you. In fact, most of you are already mentally stuck in the '60s, with languages based on flat files. And that's fine. But the nature of science is that there's a lot of experimentation before discovery, and regardless of whether or not you think Ruby is that useful, it's serving some use to someone, and it seems to be advancing computer science (not "programming", but CS) to a degree, if only for a small group of people.
Re:Why learn another language? (Score:2)
Re:Why learn another language? (Score:2, Insightful)
I was about to object to that, but then you wrote...
Thank you for being one of the few to acknowledge that while languages using similar ideas may be easy to learn once you know the first, languages using fundamentally different approaches might take some effort! Sadly few people seem to realise this.
Having said that, I'm afraid I have to disagree with your "one day" as well. You can learn a new syntax in one day, sure. But how long does it take to learn the new idioms? Java and C++ have similar syntax, but Java uses a GC and finally for resource management, while C++ has predictable destruction and uses the abysmally-named-but-rather-neat "resource acquisition is initialisation" idiom. Anyone working seriously in these languages needs to appreciate this distinction, but it's not written down in (m)any of the books.
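To give one more example of the same idiom gap, here is a small sketch of my own (the filename is made up) of how this shows up when you move to Ruby: the usual answer to the GC-vs-RAII question there is block-scoped cleanup, which a one-day skim of the syntax won't teach you.

# Idiomatic Ruby: the block form closes the file for you, even if the
# block raises an exception.
File.open("notes.txt", "w") do |f|
  f.puts "hello"
end

# The same thing spelled out longhand with begin/ensure:
f = File.open("notes.txt", "w")
begin
  f.puts "hello again"
ensure
  f.close
end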
Schools! (Score:2)
If you don't feel enthusiastic about something, then it can feel like an imposition to be required to learn something different. It doesn't much matter what. And a new language certainly can qualify.
OTOH, languages certainly do run in tides. There don't seem to be any decent books right now on "Data Structures and Algorithms the C++ Way" that aren't primarily textbooks. I'm now trying to switch over to C++ after using a raft of different languages, so most of the introductory books get insufferably dull. I don't want to work through all the drill. Been there, done that. I want a guide for picking things up fast. The best book I've found so far is "Data Structures: A Pseudocode Approach with C++". It isn't really designed as a reference book either, but at least it doesn't stop in the middle of an algorithm and say "the rest is left as an exercise for the student". So I may be able to get my AVL tree working and tested by Monday. (Then I need to decide whether to switch the rest of the code over to Eiffel, or recode a bunch of the Eiffel in C++.)
So using lots of languages has its drawbacks, too.
Re:Schools! (Score:2)
Perhaps it's because most schooling in America, both secondary and tertiary, seems to be geared towards producing employees, and not thinkers? It used to be that you'd learn many ways of thought, and usually many languages, when pursuing a CS degree. Now they teach you C, C++, and Java. Never mind that you may not learn any real science, or how to think and synthesize; you're taught how to use the tools. At least they still teach students to experiment in Biology, rather than just showing them how to follow procedure and use the microscopes.
Re:Why learn another language? (Score:2)
However, a language like Ruby fails to impart or embody any new paradigms or concepts. That is, it's mostly a new syntax atop existing semantics found in Perl or Python. An exercise in syntax and grammar is relatively uninteresting. For me, when learning a new language, be it spoken or coded, the most interesting parts are novel ways to express old semantics, or completely new semantics that are put out in the open.
That said, I like Ruby. While it may be pointing at the same problem niche as Perl and Python, as a language it makes a lot more sense to me, and I'd heartily choose it over Perl or Python in most circumstances.
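For what it's worth, here is a tiny sketch of my own (made-up data) of what I mean by the same semantics in friendlier syntax: walking a hash with a block, something Perl and Python can do just as well, only spelled differently.

# Made-up data; the point is only the block syntax.
prices = { "apple" => 3, "pear" => 5 }
prices.each do |name, cost|
  puts "#{name} costs #{cost}"
end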
Java has won *nothing* (Score:2)
Conclusions:
1. Java is a nice language. I would prefer Java to C++ any time of the day.
2. Java is a useless language outside of a very controlled environment. Because it is not Open Source, it is highly unportable (just try running Java on OpenBSD sometime!), and because the existing runtimes are so bloated, it's only useful for applications where you don't mind having a spare 40 megabytes of bloat hanging around.
In other words, Java is *not* a panacea, and certainly isn't a replacement for highly dynamic languages such as Python or Ruby, which tackle an entirely different problem set.
For a bit of background:
The Problem with Java [badtux.org]
My opinion of Ruby: Nice language. Some stupidities though -- the whole notion of making variable types case-sensitive reeks of Fortran. I considered Ruby for the TapiocaStor project, but had to dismiss it from contention because it's not yet mature enough. We're using Java, but only because we can't use Python for legal reasons.
-E
Re:Java has won *nothing* (Score:3, Informative)
Ok. On a just-installed machine (complete with generic kernel, yes I know):
(rob@denali:~)$ uname -a
OpenBSD denali.CENSORED.com 2.9 GENERIC#653 i386
(rob@denali:~)$ java -version
java version "1.4.0-beta"
Java(TM) 2 Runtime Environment, Standard Edition (build 1.4.0-beta-b65)
Java HotSpot(TM) Client VM (build 1.4.0-beta-b65, mixed mode)
That's OpenBSD 2.9 running the latest JDK 1.4 beta. What were you saying again?
You also seem to be confusing Java (the platform) with Sun's version of Java. Open Source editions (e.g. Kaffe [kaffe.org]) can run on just about anything. Not to mention that you can get the sources to Sun's JDK.
Re: (Score:2)
Re:Java has won *nothing* (Score:3, Informative)
That's because they're not types, they're classes. And why does that matter? Because you reference a class by its name, as a VARIABLE. If all Ruby variables were case-insensitive, then classes, or types as you refer to them, would also be case-insensitive.
It's simple, beautiful elegance. This is unlike other languages, where types and/or classes are an exception and aren't normal variables. There lies power in Ruby's approach.
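Here is a small sketch of my own (with a made-up class name) of what that means in practice: a Ruby class name is just a capitalized constant bound to a Class object, so it can be assigned and passed around like any other value.

class Widget
  def hello
    "hi from Widget"
  end
end

factory = Widget      # the Class object assigned to an ordinary variable
w = factory.new       # and used like any other value
puts w.hello          # => hi from Widget
puts w.class          # => Widget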
Re:Java has won Re:Why learn another language? (Score:2)
Almost as fashionable as pretending to be pragmatically cynical.
If we followed Dalroth's advice, we would repeatedly waste weeks at a time on other fringe languages as well. Sure, it MIGHT pay off, but this is the same logic people are criticized for when they spend money on lottery tickets.
I don't do things because I think they'll "pay off." I do them because I learn things from them, and because they're fun. Every time I learn a new language and compare it with the ones I already know, I learn something (and it's fun, too).
Re:Java has won? Evidence to the contrary. (Score:2)
Re:Why learn another language? (Score:3, Insightful)
Comment removed (Score:3, Insightful)
Re:word math (Score:3, Funny)
Re:word math (Score:3, Insightful)
Re:word math (Score:2, Informative)
$indent = "....";
...
print $indent * $current_indent_level;
print $indented_text;
Re:word math (Score:2)
I guess I'll download it and try it later tonight. I just don't like the VB-style conditionals (if fuck break else shit break end nonsense).
Re:I think this said it all... (Score:3, Insightful)
Heh. Does Perl speed up development? Every time I've tried to use it, I've had to have multiple browser windows open with docs and other people's code, trying to figure out what the hell its goofy syntax stood for. The small script I ended up making kind-of worked: doing the same read, search-and-replace, and write back to file only works some of the time on some of the files. I try it on a different directory and BAM, it doesn't work!
I'm not saying Perl is completely flaky and useless, but contrary to Wall's silly assertions, Perl isn't a language you can pick up gradually; there are simply too many rules and funny symbols ($/, @_, $_, anyone?) to learn just to do simple things.
And why the hell can you only store scalars in hashes? Pffft! And don't tell me I can just use references to non-scalar data; refs are scalars, and if I wanted to deal with C's bullshit, I would.
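For contrast, and since the article is about Ruby, here is a rough sketch of my own (made-up data): Ruby hashes don't have this restriction, because values can be any object, no references required.

config = {
  "dirs"   => ["/tmp", "/var/tmp"],   # an array stored directly as a value
  "limits" => { "files" => 10 }       # a nested hash, also stored directly
}
puts config["dirs"].first        # => /tmp
puts config["limits"]["files"]   # => 10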
Mod Parent UP Funny, not DOWN Troll (Score:2, Interesting)
For embedded systems: C# has a defined bytecode that can be JIT compiled onto a variety of special purpose chips without rewriting anything.
Web Programming: C# has built in XML and network support, and don't forget that because everything tunnels over port 80, no firewall will prevent your code from executing!
Instead, all of the statements are absolutely absurd. Maybe you don't think it is funny, but it is absolutely NOT a troll. Trolls try to say things that have a basis in reality but are completely non-constructive for the purposes of discussion, in order to get people to respond (I'm not sure where the line to Flamebait is, but...). This is just a joke.
Re:Why worry about importing IT workers, (Score:2)
You forgot (Score:2, Informative)
Re:Why worry about importing IT workers, (Score:3, Informative)
[...]
(Java/no other language is as useless to compete with Java)
I think you're under the false impression that Java was written by Microsoft. You also probably don't use Java for anything, and hence feel it's unnecessary. If you want a good, easy-to-maintain, and robust app on multiple platforms, then use Java; but if you need that app ready to go into production quickly, then Java is not your best choice, and scripting languages like Perl are. I haven't learned Python yet, but from what I'm told, it's good for getting an app done quickly.
Re:Lots o' languages (Score:2, Interesting)
Re:Lots o' languages (Score:4, Interesting)
Re:Lots o' languages (Score:2)
Re:productive? (Score:2)
If that were true, you could just as easily write a text processing system in raw assembly as in a scripting language. I don't believe that. Languages are tools. Different tools affect your productivity in different ways. A farmer's productivity depends on his tools, and so does a programmer's.
Re:productive? (Score:2)
Are there concepts that can be expressed in one (human) language, but not another?
Re:Please clarify (Score:2, Redundant)
I keep (kept) trying to think of some alternate syntax that would be easy to preprocess into the non-delimited form, but all of the traditional parenthetic characters on my keyboard have already been preempted for other uses. And of course, if I do design this, then none of the standard tools would work with it. So I'd need an easy way to do a bijective mapping. And at a minimum this would mean parsing all of the strings to be sure that the parenthetic marking I was using was part of a string rather than a code delimiter. Basically, I'd be writing a complete parsing engine with a two-way mapping.
So one of the things that I really like about Ruby is that it didn't adopt alignment-delimited program logic. I certainly see the value of using alignment as an added check for logic correctness, but there should be syntax markers that determine the syntax. Layout should be for verification.
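As a small sketch of my own of that point: Ruby marks block structure with explicit keywords, so indentation stays a readability check rather than part of the grammar.

# The 'end' keywords, not the indentation, determine the structure here.
def describe(n)
  if n > 0
    "positive"
  else
    "non-positive"
  end
end

puts describe(3)    # => positive
puts describe(-1)   # => non-positive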
Yes, there is DB access (Score:4, Informative)