The D Programming Language
dereferenced writes: "Walter Bright, author of the
original Zortech C++ Compiler and the free (as in beer) Digital Mars C/C++ Compiler, has posted a draft
specification for a new programming language that he describes as "a successor to C and C++". It seems to me that most of the "new" programming languages fall into one of two categories: those from academia with radical new paradigms, and those from large corporations with a focus on RAD and the web. Maybe it's time for a new language born out of practical experience implementing compilers."
A critique (and take a look at Ocaml) (Score:3, Interesting)
o Multiple inheritance is absolutely necessary. The main way it is useful is for Java-style interfaces.
o Getting rid of macros (preprocessor) is a very bad idea. What is needed is even more powerful macros (see Lisp).
o Generic programming with templates is the greatest thing about C++ -- the one feature that puts C++ above other programming languages. I'd rate generic programming capability as being a "must" of any modern programming language.
o Operator overloading is a Good Thing, in that it helps you set up a well-designed library as a "mini-language". Good programming practice involves reducing the number of keystrokes required to achieve a given result (ultimately). Generic programming, macros, and operator overloading all go in this direction. Eliminating them is a step backward.
o You say "smart pointers are irrelevant in a garbage collected language". Not true. There are many types of resources which a destructor might free besides memory. One weakness of Java vs C++ is that it is hard to control Java's "destructors".
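The destructor point can be made concrete with a small C++ sketch (the Tracked type is hypothetical): with destructors, cleanup happens at a known point, scope exit, rather than whenever a collector decides to run.

```cpp
// Hypothetical resource type: the constructor acquires, the destructor
// releases. `live_handles` lets us observe when release happens.
int live_handles = 0;

struct Tracked {
    Tracked()  { ++live_handles; }   // acquire (imagine a file or socket)
    ~Tracked() { --live_handles; }   // release, deterministically
};

int demo() {
    {
        Tracked t;                   // resource acquired here
        // ... use the resource ...
    }                                // destructor has already run here
    return live_handles;             // 0: freed at scope exit, not "eventually"
}
```

A finalizer in a garbage-collected language gives no such guarantee about *when* the release runs, which is exactly the complaint about Java's "destructors".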
The "best" programming language (for general-purpose "big" programming projects) I've encountered may be Ocaml. It can compile into native code as efficient as C++'s. It can also be interpreted. It is strongly typed. It supports a powerful "functional" programming idiom. It looks pretty good to me, although I haven't used it for anything "real" yet. But if you're looking for the "be-all, end-all" modern programming language, I think Ocaml's worth taking a look at.
Re:A critique (and take a look at Ocaml) (Score:2)
Good programming practice involves reducing the number of keystrokes required to achieve a given result (ultimately).
Not really. What you really want is to increase productivity. This might mean "reducing keystrokes", but more importantly, you want to maximize maintainability. This is why operator overloading is often a great evil -- you are hiding information that is not readily apparent. What does "+" mean in context X? It can be extremely difficult to know, particularly when bringing on a new programmer to work on old code.
Your argument will probably be something like, "well, yeah, but everything can be abused", but that's not the point. The point is how easily overloading can be abused, how little it actually adds to the language, and how complex it makes things.
Re:A critique (and take a look at Ocaml) (Score:2, Insightful)
Still, overloading, and macros, add a great deal to the language, increasing the possibilities for creating powerful libraries, with simple, intuitive interfaces. If you want a language which protects you from yourself, use Java. But don't expect to be as productive! (he says as he ducks for cover...)
As an example (of overloading), consider the C++ streams library. Imagine having to do:
It's horrible! Instead, we can write: Nice! It's much faster to type, and much more clear (hence more maintainable and less prone to bugs). In my experience, in general, reducing the number of keystrokes (increasing the conciseness of code) leads, simultaneously, to faster-written, less-buggy, and more-maintainable code.
In general.
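The contrast with the streams library is presumably along these lines (a sketch with made-up data; the function names are illustrative):

```cpp
#include <cstdio>
#include <sstream>
#include <string>

// Without an overloaded <<, mixed-type output goes through format
// strings, with no compile-time check that "%d" really gets an int:
std::string the_c_way(const char* name, int age, double height) {
    char buf[64];
    std::snprintf(buf, sizeof buf, "%s is %d years old, %.1fm tall",
                  name, age, height);
    return buf;
}

// With the streams library's overloaded operator<<, each value is
// type-checked at compile time and the call site reads left to right:
std::string the_streams_way(const char* name, int age, double height) {
    std::ostringstream os;
    os << name << " is " << age << " years old, " << height << "m tall";
    return os.str();
}
```

Both produce the same text, but in the streams version adding a new user-defined type to the output is just another `operator<<` overload, not a new format specifier.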
Re:A critique (and take a look at Ocaml) (Score:2, Interesting)
Sure, and such surveys as have been done have repeatedly shown that your typical programmer will average roughly the same number of lines of code in a given period of time (about 20 per day, usually). Thus, the more power there is in each of those lines, the better.
It should be. If it's not pretty much immediately obvious what a + operator means in a given context, then it's clearly a bad use of operator overloading. (Granted, it does get widely abused. So does inheritance. That's not to say these things can't be very useful when used properly.)
What many people ignore is that operator overloading, like the option to use value or reference semantics, is important to allowing user-defined types to function just like built-in ones. C++ is one of the few languages that (almost) achieves this. As a result, you can do things like writing nice generic algorithms using templates, which is still a much under-rated but incredibly powerful feature.
For example, in C++, I can write a "sum" algorithm that iterates over an array of values and +s them all. On ints, you get the sum of the values. On complex numbers, with a suitably overloaded operator+, you also get the sum of the values. On strings, if I've defined + to mean "concatenate" (which even those languages claiming operator overloading is bad actually do), then I get the concatenation of several strings. All of this makes sense and is nicely consistent. It's just that in C++, it's fully controllable, whereas in Java, you're stuck with + meaning concatenate with a String, whether you like it or not.
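A sketch of that generic sum, assuming only that the element type has a suitable operator+:

```cpp
#include <cstddef>
#include <string>

// Generic sum over a built-in array: the only requirement on T is a
// suitable operator+, which is the consistency argument made above.
template <typename T, std::size_t N>
T sum(const T (&xs)[N]) {
    T total = xs[0];
    for (std::size_t i = 1; i < N; ++i)
        total = total + xs[i];   // operator+ may be built-in or overloaded
    return total;
}
```

The same template sums ints, concatenates strings, and adds complex numbers, with no per-type code.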
Re:A critique -- macros (Score:2)
LISP style symbolic manipulation of code permits the construction of special purpose syntax that is closer to the problem at hand, making the code easier to write and understand, provided that the macros are written to make things more clear rather than less. It is certainly true that you can take things too far with LISP-style macro expansion, but you can also go much much farther than you can with any text-preprocessor scheme.
Not a nice thing to say (Score:2)
The time for this was a few years ago:
1. Java is already invented.
2. Nobody cares about 16 bit code anymore.
If you like Java, use it. If you dislike Java, stick to good old C++. No need to invent a new language.
Give it a chance. (Score:2)
This sort of statement really amazes me. Are you so righteous that you think Java and C++ are the answer to all programming problems? Get real. They both have their place, and there's nothing to say that D might not have its place too.
This guy's trying something new. If you see a particular problem with his approach, by all means let him know. We all value constructive criticism and suggestions. But don't just say it's no good before it's even been given a chance. I for one think he has some good ideas in there, and I look forward to being able to try it out some time.
Re:Give it a chance. (Score:2)
> righteous that you think Java and C++ are the answer
> to all programming problems? Get real. They both
> have their place, and there's nothing to say
> that D might not have its place too.
If he had invented something really new, I would not argue. What he offers is another facelift to C++, which was an extension of C. He offers it in a way very similar to Java.
For fresh approach to OO languages take a look at
OZ [mozart-oz.org]. I wish I had more time to play with it. Looks quite interesting.
I really get disappointed when people invent a pet language for their project when there are thousands of other languages which you could use. For my projects I am using
GUILE [gnu.org].
Did anyone RTFA before posting? (Score:2)
Overall, it looks like a pretty good job, if what you wanted is a language for large projects on modern desktop & server computers. It doesn't entirely take away C's capability of letting you screw up enormously, but it does make screwups a little less likely. For instance, C and C++ programmers tend to manipulate strings with pointers; this results in very efficient code, but combined with sloppy programming (always by management decree, I'm sure) it also results in hundreds of security exploits involving buffer overruns. D gives you dynamic arrays, which handle strings the way they ought to be handled, and I think it makes a program that allows buffer overruns harder to write than one that doesn't.
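A rough C++ illustration of the difference, with std::string and std::vector standing in for D's dynamic arrays (function names are made up):

```cpp
#include <cstddef>
#include <stdexcept>
#include <string>
#include <vector>

// The C idiom the post describes would be e.g.:
//     char buf[8];
//     strcpy(buf, untrusted_input);   // nothing stops the overrun
//
// With a dynamic array (std::string here standing in for D's arrays),
// the length travels with the data, so copying cannot overrun:
std::string safe_copy(const std::string& untrusted_input) {
    return untrusted_input;            // allocates exactly what's needed
}

// And checked indexing turns an overrun into a catchable error:
bool out_of_bounds(const std::vector<char>& v, std::size_t i) {
    try {
        (void)v.at(i);                 // at() range-checks; [] does not
        return false;
    } catch (const std::out_of_range&) {
        return true;
    }
}
```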
On the other hand, I do some embedded programming with 8-bit CPUs. I rather suspect that over half the people who frequently code on the job are doing the same, although most of them are not "programmers" but rather EEs, design engineers programming their own designs. Certainly you'll find a few dozen of those 8-bitters in the average American home, even though they own nothing they recognize as a computer.
D doesn't even define how to compile to a CPU with less than 32 address bits. That doesn't mean it's a bad language -- in fact, it makes it a better language for the 386 and up -- but it does mean that it's widening the gap between embedded programmers and the rest.
Likewise, garbage collection is acceptable in an embedded system of any size only if you can control when it happens, and there is no mechanism in place for this. (The docs do mention that interrupt handlers probably can't be written in D because of the garbage collection.)
But for desktop and server applications, I like 99% of what I've seen and have just one complaint: operator overloading is _important_, it lets you write extensions to the language.
It's like Niklaus Wirth and Java had a child (Score:2)
Reading the document, I felt that old shuddering horror I felt when I learned Modula-2. M-2 looks great on paper until you try to build a major project with it.
It sounds like he took Java's design aims, added Niklaus Wirth's bugbears (being an academic), and tried to marry them, but the problem I have is: what's the compelling reason to use this over Java? I didn't see anything in there that gives it a clear advantage over Java, and he doesn't give an alternative to templates. Templates, especially as implemented by C++ and Ada, can create type-safe structures that a pure OO design can't (a Stack of Objects cannot distinguish what's being pushed onto it at compile time).
Sounds like he's enjoying the ego trip of making his own language. Personally, I'd rather wait and see what Stroustrup's C++ Redux effort generates.
There's been a "D". The next one should be "P" (Score:2)
(It led to an interesting confrontation at one point. Roger Gregory was accosted by a member of a cult, who gave him a flower and started on the cult's conversion spiel. When the cultist got to the first question (your occupation), he said "I'm a 'D' programmer." Of course the cultist heard it as "deprogrammer" and ran.)
I hear that the language "BCPL" was part of the inspiration first for a language called "B" and then for "C". By that precedent the first non-superset successor to "C" should be "P".
Same old same old! For a change, try O'Caml (Score:2)
This language looks to be the same as everyone else's attempt to make a modern C-like language (ie, Java, C#). What is the point?
For a fast (as fast as C, maybe faster), safe language with some really neat (and probably unfamiliar) features, try O'Caml. This will cure your doldrums, and you may never want to go back...
http://caml.inria.fr/
Sounds Ok but needs Interfaces... (Score:2, Interesting)
Here's my 2 cents: D sounds OK. I DO like the idea that a typedef actually creates a new type. But as a C++ programmer of 9+ years who is not "terrified" of managing his own memory, and who thinks that operator overloading and templates are cool, I have some issues with the draft standard as it stands:
1) Templates: I hope that Mr. Bright does find an answer. I agree that C++ template syntax is tough, but the power of generic programming is far too great a feature to drop for large applications!
2) Operator overloading: I like it; many people don't. Used properly, you can make some very cool-looking type-safe code. I don't think a very powerful feature should be dropped from a language just because some people are idiots. C++ is not a toy; and neither should its successor be.
3) Interfaces: Hello out there? The world has gone distributed. How about direct language support for CORBA interfaces? Now THAT would be a slick feature to add to an extended C++ language!
4) Standardize the name mangling! Name mangling issues are what make different C++ compilers incompatible; let's fix this oversight...
5) Garbage Collection: I'm ok with garbage collection but DO give me a way to override the collector! There will always be situations where I know I can get rid of something but the garbage collector wouldn't see it that way. DO give me a way to manually kick off a garbage collection cycle and DO give me a way to manually delete things.
6) I'm working on a million-line+ surface ship combat system right now. One thing that the old Ada programmers keep screaming about is the inability to get very fine-grained control over numbers; and for this application I can see why they are complaining. What's needed is a way to enforce the domain of a numeric type, à la low bound / high bound, with an exception thrown for invalid values. Very fine-grained control over coordinate and range values is key to a large class of military applications.
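For what it's worth, a rough sketch of such a domain-enforced numeric type is already expressible with C++ templates (the Ranged template and Bearing alias are hypothetical names, not from any standard or proposal):

```cpp
#include <stdexcept>

// Hypothetical Ada-style range type: any value outside [Lo, Hi]
// throws, whether it arrives by construction, assignment, or arithmetic.
template <int Lo, int Hi>
class Ranged {
    int v_;
    static int check(int v) {
        if (v < Lo || v > Hi)
            throw std::out_of_range("value outside declared range");
        return v;
    }
public:
    Ranged(int v) : v_(check(v)) {}
    Ranged& operator=(int v) { v_ = check(v); return *this; }
    operator int() const { return v_; }
    Ranged operator+(Ranged o) const { return Ranged(v_ + o.v_); }
};

// e.g. a compass bearing in whole degrees:
using Bearing = Ranged<0, 359>;
```

`Bearing b(350)` is fine; `Bearing(400)`, or an addition that leaves the range, throws rather than silently wrapping.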
I've been pondering my own new language too. Maybe I should go for it. My language would look a lot like C++/D - with the items listed above - plus some other ideas that I've been pondering...
--Richard
conservative language: java-- (Score:2)
Let's summarize:
- no generics, yet (he was 'thinking' about it)
- I haven't spotted an interface construct like in Java, very useful and no performance penalty -> include it!!
- no dynamic classloading, another useful Java feature
- no reflective features, another useful Java feature.
- no new features worth mentioning
I would propose this language be called Java--. It removes features the author of the spec deems irrelevant and doesn't add any new ones. Java has its flaws, but these flaws are tolerated because it also adds useful features.
Incidentally, the VM approach has been adopted by MS now in
If the author of the spec still wants to go ahead with implementing D (aka Java--), I'd strongly suggest to make it compile to
And finally, there is no market for this language. C/C++ programmers are generally so much in love with the language that they are virtually blind to its disadvantages, and they ignore/dismiss competing languages (of which there are many). Based on this, it is safe to assume that most of them will also dismiss D, no matter how good it is. And since I already argued that the language really doesn't add anything new, programmers of other languages will also not be inclined to switch.
It is funny that all attempts to make a new language that looks like C++ but doesn't have its advantages end up looking like Java. Maybe the Sun guys got it right after all. It certainly is a nice language to program in.
reminds me (Score:2)
I called it Dtone, since I thought calling it just D would be arrogant, claiming a single-letter name. However, it seems that didn't disturb others.
Well, it's not nice to link to one's own page, and advertise on
http://www.dtone.org
D versus C# (Score:3, Informative)
Many of the features look pretty sensible. There is now pretty unanimous support for dropping multiple inheritance, the problem with multiple inheritance being that it leads to programs only the original authors understand.
It is disappointing that the syntax was not changed more radically. I for one am pretty bored with typing semicolons at the end of lines. Using braces for block structure is equally tedious.
The garbage collector is of the 'stop everything and collect' type; this is not a good scheme, as anyone who has seen a PET running Microsoft Basic GC will agree. The incremental GC in .NET is a better scheme, even if it is slower overall. But that is an implementation detail.
It would be good if people would start to look at adding support for parallel program execution. The threads programming model is very clunky and hard to use, in part because there is no means for the program to perform checks on access conflicts.
Also a persistence model should be part of any new language. The current division between programming language and database is a lot of wasted overhead.
Microsoft will HATE it (Score:2, Funny)
Garbage Collection vs. Virtual Memory (Score:3, Interesting)
Meaning that, since the garbage collector has to periodically walk the entire heap of a process, it would seem to me that it would periodically force any pages that have been paged out to disk to be brought back in by the VM, even if they didn't need to be otherwise.
I used to do a lot of Java programming, and I got the uncanny feeling that every time my program grew very large (which was very often - Java programs use *soooo* much memory; I don't know if that's a general tendency of GC or just Java's implementation), the system would thrash quite a bit more than if I wasn't running any Java programs... and I came to believe that it might have something to do with the garbage collector forcing the OS to load every page of the process into memory as the GC swept through. Everything that modern OSes do to streamline VM kind of gets thrown out the window when garbage collectors force every page of their process to be loaded in periodically.
Just wondering why no one has ever made the point (to my knowledge, anyway) that garbage collectors may be very bad for virtual memory performance. It seems quite likely to me, anyway.
Otherwise, I like just about every idea in the D language, especially his Name Space notion - although I didn't read too much detail of his spec, at least he's thinking about it. I hate the fact that modern languages are based on string identifiers during linking; there's no formal mechanism whatsoever of avoiding clashes in the namespace (Java's class package name idea is a small step in the right direction), and it really seems stupid to me that shared libraries should be carrying around all this string baggage, and doing all these string compares during linking ...
Anyway, that's how I see it.
Re:Garbage Collection vs. Virtual Memory (Score:3, Informative)
Java's class packaging is considerably more than a small step in the right direction. It supports a universal naming convention, based on the internet's naming systems, that can underlie local and remote code. D's modules are a primitive mechanism at best, similar to Delphi's. They're OK for a single organization, but problematic for integrating code on a wider basis.
Re:Garbage Collection vs. Virtual Memory (Score:2)
And the reason that I think Java's mechanism is only a small step in the right direction is that it's simply a convention. There's very little behind it. I don't have to name my classes with a package, nor do I have to pick a package name that is reasonable. Package names actually take up a considerable amount of space in a Java file - I wrote a commercial obfuscator package, and it turned out that something like 5-10% of a Java class file's size is just package names embedded in every reference to other classes - and it seems silly to drag all of these strings around when a more concise, robust system could be used instead...
Re:Garbage Collection vs. Virtual Memory (Score:2)
Actually, that's not quite true. Generational collectors are set up so you don't have to run through entire sections of memory except in rare circumstances. And you can use virtual memory hardware to alert the collector when an old generation is being written to (so you don't have to check yourself).
D's floating-point model is dangerous (Score:4, Informative)
If you get different answers on different computers due to different roundoff errors, your software becomes unreliable. It's true!
People get confused by Intel's 80-bit FP arithmetic. Yes, the FPU expends some effort in rounding the 80-bit result back to 64 bits, but the result is not more accurate than a 64-bit FPU. In fact the answers will be exactly the same--this is mandated by the standard.
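The sensitivity to rounding is easy to demonstrate even on one machine: in IEEE-754 double arithmetic, two mathematically identical expressions can give different answers because each intermediate result is rounded (the volatile stores force rounding to 64 bits at each step; function names are illustrative):

```cpp
// At 1e16 the gap between adjacent doubles is 2.0, so adding 1.0 to it
// rounds straight back; subtracting first keeps the 1.0 intact.
double add_then_subtract() {
    volatile double big = 1e16;
    volatile double t = big + 1.0;   // rounds to exactly 1e16
    return t - big;                  // 0.0
}

double subtract_then_add() {
    volatile double big = 1e16;
    volatile double t = big - big;   // exactly 0.0
    return t + 1.0;                  // 1.0
}
```

Under Intel's 80-bit evaluation without the intermediate stores, the first expression could instead come out as 1.0, which is exactly the "different answers on different computers" problem.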
Anyone using floating-point arithmetic for anything serious needs to know exactly what the arithmetic model is. If Walter pursues this philosophy with his new language, he will make it unusable for numerical applications.
Walter needs to read:
David Goldberg, "What Every Computer Scientist Should Know About Floating-Point Arithmetic," ACM Computing Surveys, vol. 23, pp. 5-48, 1991.
I could not find a copy online, but here is an interview with William Kahan [berkeley.edu], the Turing Award winner who co-developed the IEEE 754 floating-point standard. Language designers should notice that Kahan takes Java and Fortran to task at the end of the article.
Here's the link (Score:2)
But you know, just yesterday I re-read 80% of Kahan's site, and he does in fact recommend that non-expert numerical analysts (i.e. just about everyone) should use the maximum precision possible, as a sort of band-aid. Not that a guru like him thinks it's a cure-all, just that it'll help people a bit.
So Walter isn't that far wrong there, after all.
Languages should be written for programmers (Score:2, Interesting)
As a programmer that's worked with about 15 languages over 18 years, what I really want is a language that:
1> Is as quick to program in as php/perl/python.
2> Is still manageable for large projects.
3> Is as fast as C/C++.
4> Is easy to port across platforms (porting Quake V from Linux to Windows should just be a recompile).
5> Performs in a predictable manner (no weird behavior out of the basic operations every language has in common).
6> Memory management should be handled automatically.
7> Integrates seamlessly over networks.
Is this too much to ask for?
After C comes P! (Score:4, Informative)
Re:After C comes P! (Score:2)
I guess few remember the great debates (in good humor) about whether the successor to C would be called D or P. Bjarne Stroustrup managed to appease (and probably get a good chuckle out of) both sides by calling his newly developed language C++... An obvious software engineering in-joke.
Re:After C comes P! (Score:3, Informative)
And calling it Fifth fits more than Further - Forth comes from the word "Fourth," as in the ordinal number. The mythology goes, the filesystem where Forth was first implemented couldn't handle a filename as long as "Fourth". ;) Hence, Forth.
Re:No, "Fith" is a spoken language (Score:2)
PostScript and Forth may be based on some of the same ideas (stack-based), but they're far different in purpose, so I wouldn't say PS is Forth's successor.
Re:After C comes P! (Score:2)
Why would anyone need OO? They don't. Why does anyone need C? They don't. Maybe we should all just be using hex editors and doing raw binary. Don't really need assembly, or the OO macros for assembly (yes, they exist).
There are actually quite a few Forth object systems. MOPS [netaxs.com] and bigForth [sourceforge.net] come to mind. Come to think of it, Forth plus an object system is probably about the fastest OO you can get.
On the other hand, you could propose that "Forth" be followed by "Further". After that, you need to *think* before finding a new name.
Ignoring the obvious fact that choosing a name like "Further" for no reason but that would be stupid, one could just as easily say "Farthest," and "Damn, we're serious about being Far now (DWSABFW)".
Re:After C comes P! (Score:2)
I'm no Forth bigot, but give bigForth a try- it's fun! I say, I'm impressed!
can you say "Java?" (Score:5, Insightful)
from the overview page [digitalmars.com]...
features to keep: All except the last is contained in Java.
features to drop: This seems to be precisely the parts of C++ that Java also does away with. Furthermore, the C preprocessor is not strictly part of the C language, and in fact many other programming projects use cpp for simple cut-and-paste includes of their favorite language. When I first read about trigraphs, I couldn't wait to try them out to make some extra obfuscated [ioccc.org] code, but alas, the C compiler I was using didn't support them. In fact, the lack of standards compliance is one of the main drawbacks to programming in C++ and C. If my Java code compiles on Sun's compiler, then I can be assured that it will also compile on any other compiler claiming to compile Java code.
The author also mentions that D will not have any bytecodes. From a strict perspective, the Java programming language and the Java VM are two different standards, and just because you typically compile Java code into (confusingly named) Java byte codes doesn't mean you can't use one without the other. For example, anyone (who is insane) can pick up a copy of the Java Virtual Machine Specification [sun.com] and a hex editor and make some syntactically correct class files. More realistically, Java bytecodes are often targets for compiler construction classes. Also, if you use the GNU Java Compiler [gnu.org], you can compile programs written in the Java programming language directly into machine code.
While 90% of the description of this language screams Java, there seem to be some of the more useful features of C++ thrown in (typedefs, scope operator, etc.). The only way for this to be successful is to finish standardizing the language as soon as possible and get a reference compiler for it, so it leaves the realm of theoretical vaporware. Perhaps Java might have looked more like this if the language design had been revisited. However, Java has lots of compilers [geocities.com] which are much more likely to conform to the standard [sun.com] than the C++ equivalents [cplusplus.com].
Big deal (Score:2)
Personally, I think we need a radically different way of programming. I don't know what it is, but we need it. We're really in the bronze-age of programming. We've got a long way to go. Right now, writing software is more art than engineering. The few groups that do it like engineering pay a heavy premium to do it (i.e. the Shuttle Software group)
There was a guy a while back who wrote a program that emulated a bunch of CPUs. He then wrote a language for those CPUs and had a "goal" for the program. He then introduced the idea of mutations and spawning child programs. He would start off writing a program to achieve the goal, and then feed it into the "CPUs". After several generations and mutations, and a "natural selection" type process, the computer ended up generating better code than he originally did.
I've had it in the back of my mind that that's what we really need to do in software. Come up with a way for computers to put our software through some sort of "mutation" and "natural selection" process and in the end we get better code. Obviously in the real world, this is a much more complex problem than the simulation this guy wrote. Wish I could remember where it was and what the link was. Very cool stuff.
SECURITY BY DEFAULT (Score:2, Insightful)
Re:SECURITY BY DEFAULT (Score:2)
Built-in security = slower runtime. If you know how to program, then you don't write code with buffer overruns. If you don't, then you can use a bounds-checking coddle language like VB.
Re:Bounds checking (Score:2)
You can use a template-based array class in C++, which (at a guess) will be no slower than Basic or Java arrays:
Array<int> Karma(1000);
Karma[1001] = 50;
etc., and just not use the * and [] syntax. If you were feeling really leet, you could make some macros and typedefs to make your arrays look pretty Basic-like or C-like.
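A minimal sketch of the kind of class the poster means (hypothetical, built on std::vector for brevity):

```cpp
#include <cstddef>
#include <stdexcept>
#include <vector>

// Template-based array with a range-checked operator[]; with inlining
// the check is cheap, and it could be compiled out in release builds.
template <typename T>
class Array {
    std::vector<T> data_;
public:
    explicit Array(std::size_t n) : data_(n) {}
    T& operator[](std::size_t i) {
        if (i >= data_.size())
            throw std::out_of_range("Array index out of range");
        return data_[i];
    }
    std::size_t size() const { return data_.size(); }
};
```

With this, `Array<int> Karma(1000); Karma[1001] = 50;` raises an exception instead of silently corrupting memory.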
Maybe? (Score:3, Interesting)
Maybe it's time for a new language to be born out of practical experience writing software.
I don't know how it is in Linux, but I really hate having to write several hundred lines of code for a single window with controls in Windows API calls. Personally, I'd like to see MS get rid of those API calls (and not replace them with ActiveX until ActiveX works). Between the ones that don't work as documented and the rest of them being overly cumbersome, it's just a hassle. Especially when you have to create your own encapsulation objects for those things. I like Delphi because of its encapsulation of the visual components, but its base library sucks in that it doesn't expose all the functionality. And since they saw fit to declare everything as PRIVATE methods, you can't get a descendant object to do everything you want either, because you don't have access to all the base functionality.
Simplicity shouldn't be taken to the extreme either, gearing a new language towards the non-programmer crowd like MS tries to do.
Of course, MS is just making things worse right now by implementing these new Luna APIs for XP. I'm sorry, but I don't know of anybody that's been really dying for the ability to use API calls to put a gradient on a button. In my opinion, this is just MS's attempt at getting developers to waste time, so they don't work that hard on developing new products that may compete with MS.
Re:Maybe? (Score:4, Insightful)
However, for those who bother to do the job right, VB can be a very powerful tool, used to create shipping applications. (As I personally have done.)
With VB, you don't care about all the "stuff" underneath (which can be a problem when you try to do something that isn't built-in, but there are creative solutions). You just drag controls onto your window, and write the code behind them. Very easy.
VisualStudio.NET is bringing this in two different directions: First, VB gets full access to everything, and is no longer the "bastard" child of the VS family. Secondly, the other VisualStudio languages get a new Forms system similar to VB's -- just drag controls onto the Window, set properties, then write the code to handle the events. Easy and clean.
That's really what the entire
This message sometimes gets lost because
Microsoft has already stated that when the Win9x code line is pretty much dead, and everyone is writing to the CLR instead of Win32, they are going to make a move to port the CLR to the WinNT Executive (that is NT's native kernel API). Win32 will finally be relegated to "legacy" tech just like DOS interrupts and Win16.
Re:Maybe? (Score:3, Informative)
While I think you have some valid points, you are far too eager to suckle at Microsoft's teat and call the watered-down skim soya milk it gives 10% m.f. homogenized.
I too like VB (quit staring at me like that!), and I agree with you that poor programmers give VB a bad name, but I beg to differ on the idea that VB somehow protected you from Microsoft's shifting APIs. Have you read the reviews for VB.NET? Every VB programmer out there is screaming blue murder, because the object model in VB has so radically changed that it requires re-learning VB. Yes, I said re-learning VB. MFC has changed its official "ways to do things" so much with each major release that it's necessary to re-learn MFC every major release. Sure, MS can provide some insulation from their APIs, but even their insulation can't protect you from the pointy spikes that poke through every time MS changes its architecture.
One other thing:
As an embittered and disgruntled fan of NeXTSTEP, I vehemently disagree with your opinion. :-) You want a seriously kick-ass distributed networking object-based API? Try NeXTSTEP's Distributed Objects. Can you say, "Sweeeeeeet!"
Other than that, I think you made some good points for people to think about.
Re:Maybe? (Score:2)
Given that I converted several hundred programs from VB to VB.NET, I can speak on how different the two are.
Essentially, if you "just got by" with VB6 because of how simple a world view it could present, you may be in for a small learning curve with VB.NET.
Now the good part:
If you can program in any C-family language, Perl, or anything moderately more complex than VB, you're going to love VB.NET, because you get to keep the simplicity of VB 95% of the time, but you get 99% of the power of C whenever you want it. These percentages are of course made up, but for those of you that are following C# and like it - VB.NET is almost the same language, what-you-can-do-wise.
You'll be happy to know that there is no longer a VB runtime - now there's a
On the other end of the spectrum, there's a VB project template for creating Windows NT services. This was not really possible with VB6 - and to do many interesting things in VB6 you needed to import Win32 functions.
So in general, I think people who are programmers will love VB.NET - it gives all the long-time VB haters and complainers many things they've been asking for - and then some.
On the other hand, people that can barely grasp VB6 may be in trouble - the new power and flexibility of VB.NET does come at a price in terms of complexity of auto-generated code. If you're already in trouble when you accidentally go into the code window instead of the design window, expect problems with VB.NET.
Re:Maybe? (Score:2)
Why not completely from scratch? (Score:2)
However, to play Devil's Advocate, why base a language on anything pre-existing? Is anyone creating languages completely (or mostly) from scratch?
Admittedly a from-scratch language would be up against a higher learning curve, but I wonder if the benefits would outweigh this.
Just a thought from a person who's had to learn a lot of languages.
Useful? Not Really. (Score:3, Interesting)
What D will implement in the core language is really meant for the standard library. Not everyone needs resizable and bounds-checked arrays (the bounds checking is the part with the real overhead). If you are coding a kernel or something low-level, the overhead isn't necessary. If I don't need to resize my arrays, I just don't #include <vector>. Simple as that.
Also, there are no prototypes. Now, tell me, how does one get the source for a 3rd-party proprietary library and read the source for the documentation? Often, I document my code by putting a 3- or 4-line description of what the class [member|function|data type] does below its declaration in the header. If I forget what a function does, I just open the header in another frame in emacs and read its description (which has such useful information as what it uses its arguments for, what exceptions it may throw, what it returns, and whether or not it will modify an argument). It is also much easier to see what members are in a class when you can look at a simple declaration with the outline of the class, instead of having to wade through 50-line members to see the next member. That just makes the class look messy, unless each function is only 1 line long.
The lack of operator overloading also makes it harder to implement something like, say, a complex number type in a library. With C++, you have the standard complex type in the standard library. If there were no operator overloading, using complex would be more difficult (which is easier: complex foo, bar; foo.i = 1; bar.r = 2; foo.add(bar); or complex foo, bar; foo.i = 1; bar.r = 2; foo += bar;?).
I do see some good qualities. One is the ability to call a constructor from a constructor. This results in less duplicated code, and makes it easier to keep two constructors of a class synced. Say you had a class with two constructors: one that takes no arguments (default) and one that takes an int argument. In C++, the int-argument one can't call the default constructor (that just creates a temporary, constructs it, and then deletes the temporary). D allows you to do that. Maybe the next C++ specification will fix that.
D does seem to have a lot of flaws. It doesn't seem very useful. Maybe some people will find it useful. But it seems to me to be yet another language written for someone's personal usage. It makes sense to that person, but not to anyone else. C is a good language because its creators made it useful for other people as well as themselves, same for C++, lisp, Objective-C, and countless other languages.
Re:Useful? Not Really. (Score:2)
templates and operator overloading are good things (Score:5, Interesting)
Templates and stack instantiation of objects with semantics [i.e. constructors/destructors] are a royal pain in the a** for compiler writers. In fact, only somewhat recently has g++ even been able to handle templates in a decent way; it took a long time to get it right. C++ was a very ambitious language, hard as hell to implement, but that's what makes it so useful. Give up templates and multiple inheritance? He suggests this is a good thing?! D is clearly not a language innovation; he should have called it C--.
Besides, you don't actually have to use such features extensively [or at all, really] in a C++ program. You could always avoid iostream and just #include the old stdio.h, for example, only choosing to use classes with constructors for some useful/necessary/labor-saving part of the code, while all the rest of it is essentially no different than C [aside from stricter compile-time type checking, which ANSI C has been moving towards lately anyway].
This is no innovation.
A few other random points:
Ohh! Garbage collection? You can link a garbage-collecting malloc into a C++ program anyway. [If you really care to look into it, C++ allows a whole range of complex underlying storage classes for custom memory management of different parts of a project.]
Arrays are not first class objects?!
Well, this is true, sort of. But you can choose to use vectors [or other more efficient representations, such as maps, etc., depending on your data type], and with inlining, they will be as efficient as if they were 'native language syntax' features. You don't even have to use the STL; you can write a custom implementation of dynamically resizable vectors of your own [with automatic bounds checking and resizing, for example] quite trivially. I did it once, and it took, what, 2 pages of source. That's the power of C++: it's so expressive for implementing custom manipulations of low-level data, packaged nicely into classes.
No on-stack variables? All data as dynamic references?
Yech. Generally too inefficient. I still suspect that he just doesn't want to tackle the hairiness of writing such a complex compiler. Remember, you can use only dynamic memory in C++ easily enough, with garbage collection too.
Overall, I think D is too lenient. I give him an F.
Still, I strongly respect the desire to attempt to implement a novel language. Not that there aren't hundreds out there, but it's a noble effort. But publishing without even demo code? Yeesh.
Re:templates and operator overloading are good thi (Score:2)
I find it humorous, because two of the 'advantages' of java were simplicity through lack of support of templates and operator overloading.
But, now that the language is mature, and the people using it want a more mature language, they are probably going to add genericity back in :).
Which kind of leads me to the thought that, in general, the whole idea of 'leaving features out because they aren't used' is flawed. Leaving features out for other reasons is good. But perceptions of lack of use are not.
Sounds like... (Score:5, Interesting)
If, on the other hand, all he wants to do is sell compilers, and therefore he needs to convince the rest of the world of the language's benefit, then fooey.
And for the record, damn, I feel old -- I remember trying to make the Zortech compiler work for an old project of mine circa maybe 1989 or so(?) and having problems. I think at one point or another I might have actually gotten email from Walter. Wow, names from the past. In a conference call yesterday I needed to come up with a secure hashing algorithm and I said "ROT13. If we need extra security we can do it twice." and absolutely no one got it.
Anyway, back on topic: No templates? Oooooo, I have a C++ friend who is gonna be pissed....
duane
"In C++, you can look at your friend's privates."
Re:Sounds like... (Score:2)
Didn't Borland end up buying the Zortech compiler and turning it into Turbo C? There were a lot of C compilers back then.
Which reminds me, back in 1987 I was working with the Computer Innovations CI86 compiler. The documentation was a few hundred pages of photocopies in a 3-ring binder; no tools, just the debugger and linker, but it came with the source. Find a commercial compiler these days that includes the source.
Re:Sounds like... (Score:2)
No -- Symantec bought Zortech, turned it into Symantec C++, back when Symantec was into development tools; it had the coolest Windows IDE at the time, but like many other Symantec products throughout the years it died a silent death.
Walter Bright probably did a deal with Symantec to acquire the rights to the compiler and development tools; essentially this is the free C++ compiler available on the Digital Mars [digitalmars.com] site.
Zortech may have been the first native C++ compiler, but TopSpeed had the better one, known as the fastest compiler around. TopSpeed had a common IDE/back end for C/C++, Pascal and probably some other languages. TopSpeed merged with Clarion and Clarion/TopSpeed was acquired by SoftVelocity [softvelocity.com]. Clarion isn't C++, but its compiler is probably still based on TopSpeed technology.
Re:Sounds like... (Score:2)
I think Walter Bright should be encouraged to keep doing what he is doing. Your comments about TopSpeed (I remember that one, too) are an illustration of where the software industry is going today: a small number of big companies buying a large number of small companies, with diminishing choices for the consumer. Let's encourage all the innovation we can, even if we don't think the idea is particularly sound.
Re:Sounds like... (Score:2)
Ah...the good ole days. I think in around 1991 I was dealing with a Watcom? C compiler under VM/CMS and was having trouble with something. I posted to a BITNET mailing list - nothing too major, and someone from Watcom actually called me like 10 minutes later after reading the post.
Back when customer service was good!
Re:Sounds like... (Score:2)
Nothing special... (Score:3, Informative)
switches (Score:2)
Also, I like sizeof() since it's a feature of the language, not a class's function, so it shouldn't pretend to be a member function.
Re:Nothing special... (Score:2)
The only thing new is the name (Score:4, Insightful)
Re:The only thing new is the name (Score:2, Informative)
Re:The only thing new is the name (Score:2)
BCPL. But
(credits to Larry Wall, I think.)
Re:tangential: try-catch exception handling (Score:2)
Well, actually you can handle non-errors as exceptions too, why not? But be aware that catching an exception is an expensive job. It takes time to do the multi-level unwind (jumping backward over several function calls at once, not just one as in a normal return), since the whole stack built up so far has to be cleanly unwound and its objects destructed in the process.
I'm not saying that exceptions used this way are the final answer to the multi-return problem; rather, they can easily be misused in this situation.
To declare a function with multiple returns could be done rather easily; take this as a dummy example:
int32, bool positivize(int32 a)
{
    if (a < 0) {
        return -a, true;
    } else {
        return a, false;
    }
}
This could be done easily with parsers/compilers; the problem arises, however, in how to call the function:
positivize(4)
Now how should this be treated further? So far there is no public solution to it.
function(
Re:tangential: try-catch exception handling (Score:2)
Here, you can see the Right Thing is to write code naturally and functionally:
float factorial (float x) {
return x * factorial(x-1)
catch overflow throw overflow;
}
that is a billion times better (I put infinitely first, but got an overflow :) than this:
bool, float factorial(float x) { :)
// and by now if you can't see what
// is so fscked up about this approach
// you shouldn't be writing code
// forget about compilers...
}
Now, you're probably thinking "yeah, but overflow is an exception already" but my point is that looping over an array of strings and looking them up in a dictionary and doing something with the result should be treated just the same way: functionally till there is a not found, and it is so much more natural to write it as an exception rather than put that ugly if in there every time.
This was my whole point in the beginning: stop thinking of exceptions as "errors" and think of them as normal control flow and build compilers that can handle it.
Re:The only thing new is the name (Score:2, Informative)
D already exists.. (Score:4, Informative)
weirdness (Score:3, Interesting)
He doesn't have any post-codegen optimization? I know you can perform elementary optimization on the intermediate rep (such as folding, etc.), but he'll really need another phase if he wants to optimize for pipelines, which vary from architecture to architecture. Tut tut. Maybe it's just an omission on his part.
Re:Convince me (Score:3, Insightful)
It's unfair to compare a Java UI directly with a native UI. How well does that native UI run on other platforms? Oh yeah, it can't. How well does it run from a web page? It doesn't.
Properly written Java code can approach the speed of pure C, be done in a tenth the time, and be significantly more maintainable and portable.
Re:Convince me (Score:3, Insightful)
Why? Users don't care why your application has a slow user interface; all they'll do is complain that "this program is slow".
Java has a place, and by extension Java UIs have their place. But saying "Oh well, it's OK for the user interface to be slow, because it will run equally slowly across all platforms" is a load of rubbish. If Java code can run just as fast as natively compiled C or C++, just why are Java user interfaces slow?
Any Java zealots want to clear up the apparent contradiction there?
Price for everything... (Score:2)
In the case of the UIs, it's partly a problem that Java just isn't QUITE as fast as either C or C++, and that you've got the sandbox in the way - amongst other things.
Re:Convince me (Score:2)
Re:Convince me (Score:2)
Well, I don't believe C++ will ever reach the speed of assembly, but then again 'good enough' will usually do: Just look at Windows
Now will you people give up this silly argument?
Re:Convince me (Score:5, Insightful)
Now on the server...that's a totally different story. I write server apps all day in Java --- my development times are SLASHED from what they would be in C/C++, or even CGI's. Maintenance and documentation are a breeze, and performance is fabulous. Java really has done great things on the server.
Re:Convince me (Score:2)
Yes, such as properly written Swing. I saw a scheduling applet using a Swing GUI that was running damn fast.
Moral of the story: if you want decent performance, do it yourself, because the stuff you get out of the box will produce crap. Meanwhile, every whipped-up VB and Delphi app I've seen has at least had a responsive interface.
Re:Convince me (Score:2)
Remember: threads are hard. And lock contention blows up combinatorially as you add locks and threads. If you're just going to throw faster hardware at it, you should switch back to a process-based architecture.
That's why your Perl, Python, and PHP services deal with heavy loads better - no thread contention. OTOH, they have all that process-creation overhead (which grows linearly, rather than blowing up like lock contention), so if you can fix your thread bugs you can beat them.
Re:Convince me (Score:3, Informative)
I'd be interested to see a *true* benchmark
I've done that - kinda. Wrote several mickey-mouse comparisons (moving memory, calculating pi, etc), in C, C-machine-translated-to-Java and in regular Java.
The biggest problem was that, for the tasks we were interested in (memory-management, for example) C and Java do it so differently there is no easy way to compare. (Java's habit of creating multiple references to single objects instead of multiple copies of the same object really helps it here).
In general, Java was 3-4 times slower than C on string manipulation with built-in classes/library functions, but was damn-near identical on heavy maths (Java dropped ~1 second for every 30 secs of calculations.)
(Visual C++ 6 compiler against Sun's latest JRE for Windows NT. These timings were only ever meant to be rule-of-thumb.)
Re:Convince me (Score:4, Informative)
Read again. Nowhere do I compare the speed of a VM-executed program to a natively compiled one. Java is not the end-all, be-all of languages, but it is much more than the applet-creation toolkit it was in 1995. Will Fourier transforms ever run as fast in a VM as they do in optimized native code? Probably not. But then again, how many of your programs are doing Fourier transforms?
It's simply a right-tool-for-the-right-job issue. Plain and simple.
Re:Convince me (Score:2, Funny)
Re:Floating Point (Score:2, Informative)
Some operations always give an 80 bit result (eg. adds & muls) but some (eg. divides) can be limited by the current precision setting.
floats have 23 bits of mantissa, 7 digits precision.
doubles have 52 bits of mantissa, 15 digits precision.
80 bit "long doubles" have 64 bits of mantissa, 19 digits precision.
Re:practical experience implementing compilers?? (Score:2)
Contracts are a new idea to me, and they look pretty good. A contract is a sort of super-assert statement (and assert is now built in, not a library). Using contracts properly should help both in communicating with other programmers writing related code and in catching bugs. I don't know about you, but I hate debugging and I'd much rather write bug-free code to begin with; this is going to help a little.
One quibble: his square-root function example shows he's never programmed anything mathematical. The "out" contract specifies that squaring the result gives you back the input. In long integers, it doesn't: x = 20, result = sqrt(x) = 4, result * result = 16, and the program fails. In floating point, you can get pretty close, but it's never exact, so you can't just assert result * result == x. You assert abs(result * result - x) < 10.0E-6 * result, for example.
Re:practical experience implementing compilers?? (Score:5, Insightful)
He's talking about making the compiler do all the work - for instance, there are no headers, as declarations are lifted from the source. For that matter, modules and libraries and source are treated the same. I think that he *might* be talking about features that would require a new object format, and thus a new linker.
I really don't like his ".html" file idea: code inside an HTML file is compiled by ignoring everything but the tagged bits. The concept is to use HTML to document the code and compile it right in the documentation. Personally, I prefer to generate documentation from the code. A language that implements context-sensitive comments that can be compiled into various types of documentation would be, IMHO, a very good thing. As it is, systems like doxygen seem to work okay, but if it were built into the language, you could even dump documentation out of modules on the fly. Nifty in an IDE environment, or in makefile-driven dev when you want to check that version 2.2 of openfoo() does the same thing that 2.1's openfoo() did.
--
Evan
Re:practical experience implementing compilers?? (Score:2)
Doesn't FWEB [pppl.gov] already do this? In arbitrary languages?
Re:practical experience implementing compilers?? (Score:2)
I read the ".html" thing slightly differently from you. Rather than compiling code into the html file, source is included in the HTML file. If the volume of source is high relative to the volume of text, then it looks a lot like generating documentation from code.
It looks to me a lot like being able to use HTML markup for your comments, which seems pretty harmless. What I'd really like to see is an XML schema which is custom crafted for helping with programming problems -- in essence adding semantic distinctions to various kinds of comments:
[section name=init_foo]
[remarks] Our foo unit must be instantiated and helper objects initialized [/remarks]
[modification by="Alice" date="1/1/01"]Created[/modification]
[modification by="Bob" date="7/4/01"]Added code for foo-2 hardware[/modification]
[modification by="Alice" date="7/15/01"]Fixed GPF in new foo-2 code, still problems in Win95[/modification]
[TBD assigned-by="Alice" assigned-to="Charlie" for-release="2.1" due-by="9/1/01" applies-to="WIN32"] Fix random GPFs under Win95 OSR2 [/TBD]
[code]
foo my_unit = new foo(something);
init_bar(foo);
etc.;
[/code]
[/section]
The point would be to take just plain old comments and structure them so tools could do useful things with them (e.g. find out what Charlie was supposed to do for release 2.1 on Win32). Of course you wouldn't want to see all this detail all the time; perhaps the default editor view could be just the code and the remarks, with little glyphs to indicate change history or pending tasks.
Re:practical experience implementing compilers?? (Score:2)
Literate programming, Code-in-html and what I am suggesting have in common that they intersperse code with documentation in markup language. What I am suggesting is different, in that I believe the markup should be semantic, not stylistic in nature. I don't care about the point size of typeface of the comment, but I do care about the nature of the information each comment carries.
Re:practical experience implementing compilers?? (Score:2)
Rereading the site, I think these are a mishmash of ideas, not completely thought out, or at least not peer-reviewed (a.k.a., "Hey, lemme bounce an idea off you"). Some of them are good, depending on your localized value of good in the context of a programming language. Some I don't like, but then there are things in all languages that I (or any given programmer) don't like.
--
Evan
Re:practical experience implementing compilers?? (Score:3, Insightful)
If you look at the source files, you see that you still have to declare variables (e.g. "int i;").
What you don't have to do is declare classes in a separate header file, when all the information about the class's public interface could have been gleaned from the source file in which the class is actually defined.
The purpose of this in C++, I guess, is to allow the compiler to lay out an object in memory prior to the constructors being called, and to generate assembly for class member access, without necessarily knowing where the class is defined or even having access to the source at all. Secondarily, the class's interface can be determined for compile-time checking. I say secondarily because clearly that isn't the main purpose of class definition headers, since they also reveal information about private members and methods, which are of no interest to client modules.
D is more like Java in that the compiler can do all this without any special help (in the form of header files) from the programmer.
Perhaps somebody who knows java better than I can comment, but I expect that the Java compiler does all its checking by looking for
In any case, I've never found the C++ way of doing things much of a problem, but if you think about it, it is rather unnecessarily complicated. Every little bit counts.
Re:ASM ROCKS !!! :) (Score:3, Funny)
Re:No templates? (Score:2, Insightful)
Oof... believe me, I know about strong versus weak typing. (I posted the parent, but posted it anonymously by accident.) I learned real programming (i.e. not Applesoft Basic) with Scheme, and learned SML last year. SML is just about as strongly-typed as you can get, and Scheme is weakly typed.
Weak typing does have some advantages. I use Perl, which is weakly typed, and the convenience is worth it. But weakly typed languages are slower than strongly typed ones (and this is a fundamental limitation that can't be removed: weakly typed languages have to do runtime checks on types). Also, type errors can catch a lot of common mistakes at compile time rather than runtime (for example, putting arguments to a function in the wrong order will often trigger a type error).
Overall, I definitely agree that weak typing has some purpose, but for general applications development, strong typing makes for significantly more maintainable code - at a cost to developers, to be sure, but in my opinion a worthwhile one.
Re:Here's what D can't do... (Score:2)
Re:First Parrot (Score:2)
Re:First Parrot (Score:4, Funny)
1. "#", ASCII code 35.
Common names: number sign; pound; pound sign; hash; sharp; crunch; hex; mesh; grid; crosshatch; octothorpe; flash; square; pig-pen; tictactoe; scratchmark; thud; thump; splat.
Personally, I like "C-octothorpe"
- JoeShmoe
Re:Encore! (Score:2)
1) American Revolutionary soldier hanged by the British as a spy. According to tradition, his last words were "I only regret that I have but one life to lose for my country."
2) An asterisk ("*") Notionally, from the misquote "I regret that I have only one asterisk for my country!" of the famous remark uttered by Nathan Hale just before he was hanged ("life to give" -> "ass to risk" -> "asterisk").
Weird, wild stuff there, Ed.
- JoeShmoe
Re:Casting pointers to integers (Score:2)
Walter Bright wrote the Zortech C++ compiler. He's good.
Re:Casting pointers to integers (Score:2)
I'm not saying this guy doesn't know what he's doing; just that if he is trying to solve all the world's problems, then he may have missed the mark in this particular area. (Although in his defence, he has not yet removed unions from his language, so apparently he is aware that they may be too useful.)
Re:Overloading? (Score:2)
Re:Overloading? (Score:2)
Re:What are his motives? (Score:2)
This is probably what everybody said about C++ when they were using C.
The first version of C++ was a preprocessor which converted C++ into C. Thus you still had all the compiler optimizations and even the code for the compiler itself. Then you could further optimize the binary by shortcutting some of the C++ -> C -> machine code into C++ -> machine code.
Garbage collection has already been implemented for C++; it seems silly to make a new language for it unless you can obtain some serious optimizations.
The only real advantage of java over C++ that you can't build into C++ is the security manager. That can't be done without either hooks into the OS or an interpreted language.
Re:Forth !!!! (Score:2)
Re:Forth !!!! (Score:5, Funny)
Variable x to 10 be setting.
1 to x you add.
This times 10 you be repeating.
Re:Forth !!!! (Score:4, Informative)
: Variable ; IMMEDIATE
: to SWAP ;
: be ; IMMEDIATE
: setting ! ;
"Variable" and "be" do nothing and compile to nothing; they are just syntactic sugar. "to" does a SWAP so you can say "x to 10 !" rather than "10 x !". "setting" just does a ! (store) operation.
Actually, you could make "to" and "setting" IMMEDIATE words; you would just need to make them compile in the words they implement. I'm very rusty on my FORTH, but I think you can do it this way:
: to COMPILE SWAP ; IMMEDIATE
Then "to" compiles a reference to SWAP, instead of creating a subroutine that calls SWAP and then returns. The IMMEDIATE version saves one subroutine call and one return.
This would make a nice short article to publish in Dr. Dobb's or some similar magazine, right around April Fool's Day.
I have fond memories of an April-Fools article on FORTH, describing how to add GOSUB to FORTH. He went through several versions, before finally arriving at this very efficient solution:
: GOSUB ; IMMEDIATE
In other words, GOSUB does nothing and compiles to nothing. FORTH is all subroutine calls anyway; it never really needed GOSUB in the first place.
steveha
Re:Forth !!!! (Score:2)
You just don't hear about them. Check out Joy [latrobe.edu.au] and Chaos [sourceforge.net]. (The link for Chaos points to ColdStore; Chaos is a toy that comes with ColdStore and is used to test it.) Chaos has perhaps more in common with PostScript than Forth. The cold fact is, people don't like to program everything in RPN.
Personally, I like the idea of Intentional Programming [microsoft.com], where you code to an AST, creating higher-level abstract AST nodes called "intentions". In IP, the language is merely an intermediate tool to reach that end, and the runtime is a particular implementation of it, both expressed in terms of transformations on the tree (Simonyi's colorful term for such transformation functions is "enzymes").