Perl Programming

Perl6 Being Rewritten in C++

jamiemccarthy writes "A rewrite of Perl in C++ is underway. The audacious plan, now called Topaz, will become Perl6 if and when it's successful. Its author, Chip Salzenberg, will tell you all about it." Wow. That's quite a project - you can also listen to Chip's talk given at the Open Source Convention. For those of you unaware, Chip is one of the Perl core developers.
  • It would give an embarrassing twist to the Python vs. Perl religious wars :)

    After all, the Python people wrote a mailing list manager just so they wouldn't have to use Majordomo, which is written in Perl.

    Well, we could use Python to write Perl, Perl to write Python, and have tons of fun.

    There's that story about when Steve Jobs bought a Cray machine. The sales manager called up the guy who designed the Cray (I forget the name) and told him that Apple had bought a Cray to be used for designing Apple computers. The Cray designer thought a bit and said that it was logical - he was using an Apple to design the Cray.
  • Chip listed a couple other contenders (ObjC, Eiffel, Ada) but decided not to choose them because of various issues ranging from implementation to the availability of compilers on many platforms.

    They could have used JPython, which is a Python implementation that compiles to Java bytecode. Can't get much more portable than that, and GCC will compile Java class files into native executables if you want more speed.

    What?
    What?
    WHAT?

  • ..and no, this isn't intended as a flame (I'm sure it will be interpreted as one by many anyway, but whatever.. ;).. even if it comes from someone who thinks that Perl stands for Practical Extraction and Report Language..

  • by Anonymous Coward
    I fear this project is doomed, but hopefully we can draw some good lessons from it when it does fail.

    First, anyone who has any aspirations of writing platform independent C++ code should read what netscape [mozilla.org] has to say about the subject. This should scare off any right-minded hacker.

    A re-write of perl has interesting correctness issues. Larry Wall believes that languages are so complex that they can't be completely specified. Therefore, he has intentionally avoided a formal specification for perl, relying on the one implementation as the spec. How will we know if this new implementation is correct? I bet that any new implementation of perl will break thousands of existing scripts.

    C++ is deceptively complex - in ways that perl isn't. You really need to know every corner of C++ before you can write solid, reliable, extendable code. Chip's description of his project doesn't make him seem like one of the top ten C++ programmers in the world, and I think you'd need to be in order to pull off this project. As one particular example, I'm curious how he intends to get static initializers and destructors called for dynamically loaded code in any portable way (toy sketch at the end of this comment).

    Binary compatibility. C++ classes aren't guaranteed to have binary compatibility on the same platform from compiler to compiler, or even from any given release of one compiler to the next. This makes distributing and maintaining pre-compiled perl modules for a given platform much more difficult.
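    To make the static-initializer worry concrete, here is a toy sketch (invented names; it assumes an ELF platform with g++ and POSIX dlopen, and has nothing to do with Topaz itself):

    /* plugin.cpp -- built as a shared object, e.g. g++ -shared -fPIC -o plugin.so plugin.cpp */
    #include <cstdio>

    struct Tracker {
        Tracker()  { std::printf("plugin constructed\n"); }   /* should run at dlopen() time */
        ~Tracker() { std::printf("plugin destroyed\n"); }     /* should run at dlclose() time */
    };

    static Tracker tracker;   /* whether this fires portably is exactly the question */

    /* host.cpp -- g++ -o host host.cpp -ldl */
    #include <dlfcn.h>

    int main()
    {
        void *handle = dlopen("./plugin.so", RTLD_NOW);   /* constructor should fire here */
        if (handle)
            dlclose(handle);                              /* destructor should fire here */
        return 0;
    }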

  • This is not exactly on-topic, but libstdc++.so sucks. I have to have 3 incompatible versions on my system to have anything linked against them work correctly. And even then I have to make symlinks for programs linked to other versions than I have.

    In /usr/lib I have libstdc++.so.2.8 and libstdc++.so.2.9; in /usr/lib/gnulibc1 I have libstdc++.so.27.2.8 (probably s/b 2.7.2.8) with a symlink called libstdc++.so.2.7.2 (for RealAudio 5, which doesn't even work with most content nowadays).

    Half of my programs are linked to 2.8 and the other half to 2.9, and they are completely incompatible. What nonsense!
  • I have a /lib/libc.so.5.3.12 and a /lib/libc.so.6 and they're completely incompatible?




    What nonsense!
  • I wrote a few short programs to test simple file I/O using standard FILE pointers, and the C++ abstraction. Each program reads a large file from stdin, breaks it down into words, and writes it to a file.


    #wc Database-HOWTO.txt

    13313 63905 472980 Database-HOWTO.txt


    So.. it's a pretty big file. Both programs were compiled under pgcc-2.91.66, with -O6 -mpentium and the binaries were stripped.


    Here's the code: (hope this looks ok.. please excuse the complete lack of formatting.. html is a tricky medium with a limited subset of tags!)


    #include <iostream>
    #include <fstream>
    #include <string>

    using namespace std;

    int main( int argc, char *argv[] )
    {
        string szinput;
        ofstream write;
        write.open( "output_cpp.txt", ios::out );

        while( cin >> szinput )
            write << szinput << endl;
        write.close();

        return 0;
    }




    #include <stdio.h>
    #include <string.h>

    int main( int argc, char *argv[] )
    {
        FILE *write;
        char buffer[4096];
        char *ptr;

        write = fopen( "output_c.txt", "w" );

        while( fgets( buffer, 4096, stdin ))
        {
            ptr = strtok( buffer, "\n " );
            while( ptr )
            {
                fputs( ptr, write );
                fputs( "\n", write );
                ptr = strtok( NULL, "\n " );
            }
        }

        fclose( write );
        return 0;
    }


    ...the results...
    Juggernaut:~/test/source> time words_c
    0.080u 0.020s 0:00.11 90.9% 0+0k 0+0io 200pf+0w




    Juggernaut:~/test/source> time words_cpp
    0.370u 0.140s 0:00.50 102.0% 0+0k 0+0io 145pf+0w

    Don't ask me how I got 102.0% CPU. I just cut and pasted. It is an overclocked Celeron, so that might have something to do with it.



    Anyway.. C is obviously much faster at this. However.. the C++ code is much easier to understand.



    Of course, the perl code for this is very simple, short and easy to understand:

    open( OUTFILE, "> output_perl.txt" );
    while( <> )
    {
        foreach $word ( split )
        {
            print OUTFILE "$word\n";
        }
    }



    Juggernaut:~/test/source> time ./words.pl
    1.080u 0.020s 0:01.35 81.4% 0+0k 0+0io 402pf+0w




    At a cost! ;-)
  • by cdonat ( 96648 )
    "...It would also mean that Perl would benefit from the portability of Java."

    You don't get that from implementing Perl in Java. You get that by implementing Perl to compile not to its own intermediate code but to Java bytecode, and then running the result through a Java VM.

    So I guess that the implementation of Perl should never have been thought of as something to be done in Java.

  • It would be a fascinating idea to have a Perl frontend for egcs. Then the last point would also speak for Perl.

    One could also write and test Perl code using the interpreter, and then compile it when it comes to being productive and fast.

  • OTOH, I wonder why he didn't choose GNAT.

    From the article: The problem is that if Perl 6 were written in Ada--it would require people to bootstrap GNAT before they could even get to Perl. That's too much of a burden to put on anybody.

    Basically Ada '95 didn't get used because a lot of people don't have it. Chicken, egg, egg, chicken.

    But that may also be suffering from lack of current maintainers.

    Errm? 3.11p was released in January, and 3.12p is supposed to be out RSN.

    AdaCore Technologies is the normal maintainer, but they were sounding very commercial when I talked to them a year ago.

    Hmmm. I'm a member of the Ada-Linux team [gnuada.org] and we've got a very good relationship with ACT and ACT-Europe. They are a company, but they do do a lot to help both the Ada and GNU communities (there was an ALT/ACT meeting in Paris in June), both officially and unofficially (the Ada-mode [eu.org] for emacs is maintained by them, and several members of ACT-Europe helped create GtkAda [eu.org])

    Of course the problem of where to get his programmers couldn't have anything to do with his choice. :-)

    Course, comp.lang.ada [lang.ada] wouldn't be a good place to start, would it?

    - Aidan (dislikes disinformation, but is horrendously behind on /.)

  • (Note...please don't take the following seriously, last night I went through a rather lengthy C++ debug session and am a little bitter at the moment)

    The problem, as I see it, with C++ is that if you give them a few more iterations, the Standards Folks will have used every English word as a keyword and we will be left using hard-coded constants. I suspect that is why Unicode is getting more and more popular. They've even had to start re-using keywords (if re-use works for code, why not language syntax?). At last count, the number of different uses for 'const' was approaching Avogadro's number.

    Mind you, I think keyword bloat is probably good for compiler writers. Since no one can possibly implement ALL of the C++ keywords a lot of the pressure is gone. They only have to implement the subset of them that they like most. For the rest of the keywords, they can just give suitably entertaining compiler errors, such as the following that I got out of g++ (or was it CC?) a couple of years ago...

    WARNING! Namespaces is mostly broken!

    (Ah, I feel better after that rant :) )

    Dana

  • There is a school of thought that says compilers and interpreters are systems programs and hence should be written in systems programming languages like C.

    There is another school of thought that says the language being implemented is clearly the best in the world for almost all purposes, therefore its compiler/interpreter should be bootstrapped.

    In general, I don't think that either of these attitudes is entirely appropriate. It strikes me that when implementing a translator for a programming language, that one would want to use a language that had:

    1) Automatic lexer and parser generator tools. I suspect that the lack of these tools is not a real barrier though, since most compiler jocks know enough to rattle both off in a weekend.

    2) Reasonable linguistic facilities for manipulating strings.

    3) Reasonable linguistic facilities for specifying and manipulating user defined data structures -- particularly trees.

    4) Some way to do information hiding and separate compilation. And perhaps other facilities for programming in the large.

    5) A reasonably speedy implementation, so the translator that you're writing isn't hideously slow.

    Now, does Perl have all of these attributes? Mostly. The thing that I would think you wouldn't want to do, is write a bytecode interpreter in Perl. This, you definitely want to do in C (if you're concerned with portability) or C/assembler (if performance matters more than anything else).

  • I know a lot of companies run Perl4. But hey, wake up and smell the coffee. Perl5 has been the stable, developed Perl for longer than all previous versions *combined*.

    Besides which the Perl5 syntax is simpler, and it supports lots of neat things. (Taken a look at CPAN recently?)

    Cheers,
    Ben
  • Having read the entire article, but not listened to the realaudio (owing to no sound card at work), I seem to have missed the bit where he says what language it uses atm. Does anyone know offhand?

    Please excuse me for never having looked at the Perl sources, or even downloaded them. I got fed up with compiling it myself and now use RPMs.

  • At the last Python conference, Paul Dubois of LLNL talked about a package that let you write Python extension modules in C++. What was really neat about this was that, because of operator overloading, the C++ code looked a lot more like the corresponding Python code. For example, instead of writing PyDict_GetItemString(dict,"foo"), you could code dict["foo"]. Instead of checking for a NULL return value, it would raise an exception that you could catch, so the code was simpler and easier to read. GvR found this pretty impressive, and said he'd keep this in mind if/when the interpreter is ever rewritten from the ground up.

    So, I'd expect a similar design decision was made for Topaz. In interpreters you have to deal with data structures that represent scalars, arrays, dictionaries, etc., so an OO approach might let you avoid hard-wiring an expected type, in favor of an expected interface. Whether this will pay off significantly is the purpose of the experiment.
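    A toy sketch of the kind of wrapper class being described (made-up names, not the actual package from the talk):

    #include <map>
    #include <stdexcept>
    #include <string>

    class Dict {
    public:
        /* dict["foo"] instead of PyDict_GetItemString(dict, "foo"); a missing key
           becomes an exception rather than a NULL return that must be checked */
        const std::string &operator[](const std::string &key) const {
            std::map<std::string, std::string>::const_iterator it = items_.find(key);
            if (it == items_.end())
                throw std::runtime_error("KeyError: " + key);
            return it->second;
        }
        void set(const std::string &key, const std::string &value) { items_[key] = value; }
    private:
        std::map<std::string, std::string> items_;
    };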

  • by Anonymous Coward
    So instead of killing Perl, rewrite the code you have. Get rid of quick little hacks and macros that do way too much (as someone said, more than one line and you are being lazy). I've hacked and debugged some VERY old MUD code, and the one thing that saved me was getting rid of macros and replacing them with good ole functions. CircleMUD uses macros for memory management *ouch*. That was the world's best nightmare to debug. I broke down and rewrote the code. Now it hums along without ever crashing. Nice. The point is, changing languages is only a band-aid on a big sore that should be fixed a little better. Not just throw OO and cool C++ stuff at it and see what happens.
  • Isn't perl written in perl, currently?

    No, Perl is interpreted, so it's not possible for it to interpret itself. How would the interpreter run, if it were in Perl? (Actually, Perl compiles to bytecode - but that needs interpreting by a C program.)

    As perlfaq1(1) [perl.com] says:

    You'll notice that perl is not itself written in Perl.
  • Youre right! :)
    But, but, everything else's suposed to be in C++!
    (this is me desperatly trying save my side of the argument and loosing...badly)
    No, I can't spell!
    -"Run to that wall until I tell you to stop"
    (tagadum,tagadum,tagadum .... *CRUNCH*)
    -"stop...."
  • I think there are several interesting points raised by this article, some of which have been mentioned, and some which haven't.

    Basically, apart from the time it would take to write, the only downside I can see to this is the huge number of Perl modules that would be utterly broken.

    Part of the attraction of using Perl for me is that I know whatever I'm doing there are going to be modules available to help me solve my problem.

    Changing the implementation and breaking these modules is going to be a huge loss for the whole Perl community - I've written Perl modules, and I've dabbled in embedding a Perl interpreter inside some of my programs, so I know how deeply you sometimes have to dig inside Perl to make things work.

    The obvious question is how much gain there would be if the code was re-written in C++. I know that a lot of the Perl code is an utter mess to follow, not to mention the "witty" comments like:
    /* And you'll never guess what the dog had */
    /* in its mouth... */ (in file "mg.c")

    If the code was restructured in C++ it could be better, but I'm sure that it could be equally well structured if the C code was smartened up.

    Some OO(P) advocates tend to forget that it's possible to program OO'dly without necessarily using an OO language. (Though granted it is a little harder, and some things are not as "nice".)

    As for the whole "C++ is too slow / inefficient" argument, I think that's a no-brainer. Perl is never going to be the fastest language in the world if it remains interpreted (I wonder how the compiler Malcolm Beattie was working on is coming along), and the implementation language should have little impact on the end-user.

    Besides, I don't imagine that many "advanced" features of C++ would be used if it was to be re-written. But the readability of the source code would go way up if there was a nice object structure with a clear inheritance model.

    As soon as more people can understand the Perl code, maintenance suddenly becomes much easier.

    (I wonder how many of the people who say that Perl is a write only language have seen the source ... I'm sure they'd think much less of C if they had...)

    Steve
  • Why? Isn't C more stable when you're developing across platforms?

    It's sad to live in a world where knowing how to
  • "Jou don't get that from himplementing Perl in Java."

    Sure you do, the interpreter is a Java program. The java program gets optimized so the perl implementation benefits from the optimizations. Of course you would never get the same speed as with compiled perl (since we are talking about an interpreter), you are right about that. But also a Perl JIT compiler could be written in Java, especially if it is supposed to compile to Java bytecode but other code should be no problem either.

    But as I said in my original post, I don't belief Java is fast enough at this moment to meet the performance requirements for PERL (i.e. be as fast as modc, or whatever its called, for apache).
  • > You are forgetting that class==private struct!

    Well, no I wasn't, I was just using a contrived example. The point is that the reason it looks like C++ is doing something weird is because it just provides language definition capabilities needed to make first class types. I get your point though.

    >Instead of using recursion, they wrote an
    >iterative function with handcoded
    > stack. They made the stack dynamic, extending
    > their sizes using malloc() and free()!

    Ha! Too bad you can't tweak them about it without getting a bad grade.

    Actually, call/return overhead is the least of your worries when you're calling malloc and free. On many systems, these are the slowest OS calls there are that don't do IO.
  • Isn't perl written in perl, currently?
  • Am I missing something here? Lots of people have raved about how much slower C++ is. C++ code, just like C code, is written by a human. The most efficient language is therefore the one that suits the writers of the code. Larger projects benefit from object-orientation, so leaving C for such a language seems reasonable.

    Speed issues are largely just compiler issues. Whatever you say about each language, they both end up as machine code. How fast the executable is depends on the compiler far more than the language.
  • Honestly, I don't trust it. C++ is not a very stable language; most systems don't implement the spec properly yet.

    Anyway, this says to me "we are buying into buzzwords". I don't like it. Perhaps someone else should look at a clean rewrite in a real language?

    Maybe it's time to start pushing C9X. :)
  • Chip listed a couple other contenders (ObjC, Eiffel, Ada) but decided not to choose them because of various issues ranging from implementation to the availability of compilers on many platforms. In the end he says:

    "So, we're left with C++. It's rather like the comment that I believe Winston Churchill is reported to have said about democracy: It's absolutely the worst system except for all the others. So, C++ is the worst language we could have chosen, except for all the others. "
  • C++ doesn't fix that any better than well-written C fixes that.

    Let me put it another way:

    If perl's OO module is not fundamentally broken, why would you rewrite perl in a language that isn't doing things that way?

    If "there is no security. don't use the undocumented API" is a good philosophy (and I think it's a great one), then you can do that just as well in C, without pushing us all to much less reliable compilers.

    C will be more stable on more platforms forever.
  • by Evangelion ( 2145 ) on Thursday September 30, 1999 @04:32AM (#1648849) Homepage
    And they're off! We already have some vague allusions to C++ being a bad idea, and the cross-platform card has been played mighty early, implying that C is better than C++ for that. This is an exciting match, ladies and gentlemen, and we'll keep you updated as the race continues. We have yet to see any real material claims, but we expect plenty of personal anecdotes and opinions this match. It promises to be an exciting time, so stay tuned!


  • Who remembers the C++ episode in Linux 0.99? I think it changed to C++ between patch levels 12 and 13, and then back to C for patch level 14. (Of course, some people would argue that a change in the base language qualifies for more than a patch...)

    The reason Perl is so hard to maintain isn't that it's written in C. It's that it's written in hard-to-maintain C. If someone were to rewrite it in C with the same clarity as, say, emacs or qmail or tcl, the problems that Chip alludes to would go away.

  • A feature req is certainly off topic, but I'd still like to mention it here.

    Does it feel awkward to you to write every fscking structure reference in the perl ->{} format? Wouldn't it be nice to do structured variable embedding as s/\$([\w\.]+)/${$1}/gs ? A syntax hook for such would be nice :)
  • I don't like C++. Guess I'll have to stick with Perl5* for the rest of my life to make sure I die happily. :-)

    Perl6 (if I'm reading and listening correctly) will still be the same unwieldy, arcane beast you have known and loved. ;) It might have a smaller footprint, squeeze into tighter spots (MicroPerl! yay!), and be a bit easier to maintain.

    And it won't have all those nasty macros you C programmers seem to like so much. (ducking and covering) ;)

  • I believe difficulty in code maintenance with languages such as C was one of the primary reasons for the spawning of object oriented programming languages (complex and sometimes unwieldy though they may be to many), such as C++.

    Not really. Simula [dmoz.org] is generally considered to be the first object-oriented language (i.e. being built around the notion of classes: entities which encompass both data and behaviour). It builds on Algol60 [dmoz.org] and was defined in 1967; it predates both C and Pascal, which are early seventies.

    Object-orientation gained popularity through Smalltalk [dmoz.org] (late seventies).

    C++ isn't a pure OO language; it's a procedural/OO hybrid language due to being based on C. Its unwieldiness isn't explained by this. Other OO and OO-hybrid languages (say Objective C [dmoz.org], which was a stronger influence on Java's OO nature than C++) are a lot less unwieldy.

    Nowadays, Object Orientation [dmoz.org] and OO programming [dmoz.org] are popular because they allow for a more natural modelling of problem domains and implementations of solutions than the procedural approach which makes it possible to deal with larger problem sizes.

  • While an interpreted interpreter is a fascinating concept to ponder, the recursive nature of such a design, it seems to me, would by necessity eventually consume all computing cycles in the universe.
  • by LHOOQtius_ov_Borg ( 73817 ) on Thursday September 30, 1999 @05:36AM (#1648859)
    Sather is an interesting language, but the resultant portable C code is, according to the testimonials on the Sather pages themselves, only as fast as the equivalent code written in C++.

    Object Orientation is a relatively new paradigm, and work is still being done in making OO code run faster - in general. It is the OO hierarchy that slows down C++ compared to C, it is what helps make Java slower even though the JVM is a relatively efficient interpreter/VM, and it is what "crippled" languages like Smalltalk and Eiffel in the eyes of "real" developers. However, the ease of design and maintainability of OO code has inspired a lot of research into trying to speed up OO languages so they can compete with their functional counterparts.

    The advantage of Sather isn't speed (not yet, maybe someday), but, according to the authors, ease of maintainability. This, however, might be a reason to make the choice to use this language. As C++ is fairly broken in some places due to the ad-hoc OO/C integration, it can be very quirky to work with, and it's easy to write bad C++. I'd need to study Sather in detail to say it is actually more elegant and maintainable than C++, but I rarely doubt claims that something is more elegant than C++ ;-)

    Unfortunately, though, to port the Perl source to Sather one would first need to port the Sather compiler to Windows, as Berkeley hasn't bothered doing so... and the Perl6 team will obviously need to support Perl on Windows, as that's been in core Perl for some time now...

    Creating a formal language base to bootstrap a framework into C through LISP is a great idea... but according to Larry Wall, Perl cannot be formally specified in such a manner, and therefore I believe that this tactic would not work in this case, though I have not really studied the possibility in any great detail.

    The suggestion of rewriting Perl in Python is a bit silly... why not rewrite Perl in Scheme, TCL or Java... or Perl? Talk about speed not being an advantage... While Python is pretty fast, Java with a good compiler (such as Jikes) and a good JVM (such as Java2 with HotSpot) is faster, at least on our tests on WindowsNT and Linux.

    If raw speed at the expense of maintainability (and therefore the possibility of speed lost to bad programming rather than a "bad language") is one's goal, C is still the language to use (or assembler if you're both wed to one chipset and a total maniac). Otherwise, there are a lot of choices, including C++.

    I think C++ was chosen because porting from C to C++ isn't that difficult to do (at least to do very badly), and can be done incrementally. I think the efficiency of Perl in C vs C++ will depend entirely on the programmers and how they use the language. As one person already pointed out, if they use C++ for some added maintainability but with an eye towards speed (that is, use the parts that don't slow the program to a crawl), they'll do just fine on the speed front...



  • What a great idea! Bind the Perl language to a single processor by implementing it in Assembler. We could all have "Perl6 coprocessors" that we would shoehorn into our hardware to run Perl scripts with.

    Not!
  • I'm gratified that Topaz has captured the imaginations and enthusiasms of so many slashdotters. But the timing of this story was very unfortunate.

    You see, I'm in the middle of a cross-country move, having just been hired by VA Linux Systems, and my net access this week is probably the worst it's been for five years. Right now I'm in a borrowed office at an Alabama rest stop.

    I will review everyone's comments and do my best to respond to them--and to work on the code! But please be patient while I settle into Geek Country. Just the culture shock will probably set me back for a day or two. :-)

  • by jilles ( 20976 ) on Thursday September 30, 1999 @05:37AM (#1648864) Homepage
    "Macros are the reason to dump C for C++? "

    Macros are evil. Each time you use a macro you are working around a language limitation. C++ has more language constructs than C, so you can avoid macros more often (tiny example at the end of this post). The nice thing about C++ is that the use of those language constructs is optional. I.e., if the use of a certain language construct in a certain spot of your program has a serious performance impact, the solution is simple: don't use it.

    "Overall this will end up one of two ways: 1) it will fail because C++ is too slow or 2) it will succede because he uses C++ only for what advantages the syntax can provide him without being trapped into the glitzy (and mostly useless) constructs that make C++ about as speedy as my grandmother on a cold winter day."

    I would go for number 2 and would like to add the prediction that if this thing is designed in a proper way, performance might actually be better than the old C version.

    One of the arguments for reimplementing was that the current version is so complex that there are only a handful of programmers who have enough knowledge to make non-trivial changes to the code. A nicely designed C++ version of Perl could change this. An example of an increasingly successful C++ implementation is Mozilla. It is fast, well designed, and progressing really fast.

    This discussion seems to be drifting in the "you can do anything in C" direction. While this is certainly true, it does not mean that C is the best solution for everything. Most perl users at least wouldn't dream of using C for the stuff they normally use perl for. Why?? perl is a better solution for their problem.

    Likewise, C++ is a good language for large, complicated software. Its OO features allow for better structuring of programs, and its C inheritance allows for performance optimizations should they be needed.

    "Along those lines: I heard a good joke recently. A JIT compiler for Java that claimed to run code "just as fast as C++"! I laughed for minutes. Then I cried when I realized that this will likely work as an add campaign. Sigh."

    You'll be crying a lot in the coming few years. Probably Java is not a good solution for implementing Perl right now. But I wouldn't be surprised if it becomes an option in a few years. Perl is used mainly for server-side processing of scripts. Good performance is essential there. Java has been coming along quite nicely in this area and can be expected to improve even further over the coming few years.

    from the article: "I finally realized that Perl may be competing with Java in the problem space, but when you're writing Perl, implementing the Perl runtime, really what you're doing is something equivalent to writing a JVM. You're writing the equivalent of a Java Virtual Machine. Now, would you write a JVM in Eiffel? I don't think so. No, so neither would you write the Perl runtime in Java or in Eiffel"

    While this may be true, implementing Perl in Java would mean that Perl benefits from all the optimizations in the Java VM (which would otherwise have to be implemented in the C++ version of Perl). It would also mean that Perl would benefit from the portability of Java. One major advantage of a Java implementation would be that it would be far easier to maintain than a C or C++ implementation.
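    A tiny example of the macro trap that the C++ constructs let you avoid (toy code, nothing to do with the actual Perl sources):

    #include <cstdio>

    #define MAX(a, b) ((a) > (b) ? (a) : (b))              /* C: evaluates the winner twice */

    template <class T>
    inline T max_of(T a, T b) { return a > b ? a : b; }    /* C++: each argument evaluated once */

    int main()
    {
        int i = 7, j = 5, k = 7;
        int m1 = MAX(i++, j);      /* expands to ((i++) > (j) ? (i++) : (j)): i ends up 9, m1 is 8 */
        int m2 = max_of(k++, j);   /* k ends up 8 and m2 is 7, as you would expect */
        std::printf("%d %d %d %d\n", m1, i, m2, k);
        return 0;
    }
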
  • Maybe we could reconfigure Hugh so that he would use this hypothetical "Recursive Perl" instead of the standard Perl, and he'd bring it back to the Borg collective, at which point it would spread and infect every Borg in the galaxy, and eventually their ships would grind to a halt and all of the Borg would suffocate from the lack of NOOPs.

    Yes, I believe this is a clever plan.. does anyone have any qualms about committing genocide using Recursive Perl?
  • Didn't you know? It's written in Visual Basic.

    sheesh.
  • I remember. Wasn't it an option: you could choose which (C or C++) to compile it with/in?
  • I've seen a number of other projects (albeit not open source) that have been rewritten in C++. All have been a disaster. The memory footprint is always larger, and the speed of execution is noticeably slower. Plus there is the learning curve involved with C++ for the developers involved in the previous project.

    One application I had a lot of dealings with was rewritten in such a way, and was so buggy that users reverted to the previous version, or went elsewhere. It even caused a number of NT migrations. Unfortunately, the C++ version was the only one that was Year 2000 compatible. I even spoke to the head of support at the UK end of the company - he did not admit there were any problems, probably to save his job. The product sucked more than a VAX (that's the UK vacuum cleaner, not a DEC machine!).

    Another thing - have you seen how much memory gcc uses when compiling C++? With the sudden rise in memory prices I may not be able to compile perl6, as I won't be able to afford the extra gigabyte of RAM required! :-)

    Please, please, please don't do it.
  • I have read the article and the posts, and re-read everything I saw, but I still have no clue what language Perl 5 is written in. Can someone help me out here, please?
  • Every rule has an exception. Ever tried BASIC? :-O
  • Perl's written in C. http://www.perl.com/
  • I believe that Sather is not currently maintained. Possibly he didn't want the new Perl to also require that he maintain a second language. OTOH, I wonder why he didn't choose GNAT. But that may also be suffering from lack of current maintainers. AdaCore Technologies is the normal maintainer, but they were sounding very commercial when I talked to them a year ago.

    Of course the problem of where to get his programmers couldn't have anything to do with his choice. :-)
  • ANSI C++ is a fine idea. Unfortunately the compilers haven't caught up with the ANSI standard yet. Some of them will never be ANSI compliant due to lack of vendor interest.

  • Having written in both, I can tell you that I am able to actually write faster code in C++ than in C, not to mention that I make MUCH fewer stupid mistakes because of better type
    checking etc.

    Remember that C++ is a semantic extension of C, so whatever C can do, C++ can do too. There are two places where C++ can beat C: 1) inline functions and 2) const reference parameters (little sketch at the end of this post). The first can be approximated using macros or compiler-specific inlines.

    Based on my experience, the reason why some people find C code bigger than C++ is that the C code is written in a 'neat and tidy' fashion. C will punish you for this. If you want to write elegant code and want the compiler to figure out how to generate it efficiently, sometimes it is better to use C++. This phenomenon is similar to the case where assembly coders start to heavily use macros to make their code look neat and tidy, and later find out that it is much bigger than the C equivalent. C will look through the macros, do proper register allocation, and topologically sort the expressions for you.

    You can use C++ as efficiently as C if you don't use the extensions. When you do, it MAY start to kill you. At best, it will simplify code writing and maintenance for something you could manually code in C anyway. Case in point: GTK. At worst, you end up with code bloat because compilers are either immature or there are semantic constraints that prevent them from doing further optimization which could easily be done by human inspection.

    Kernighan and Ritchie designed C based on what they could generate in assembly. Stroustrup designed C++ based on what he could generate in C. There is going to be some inefficiency in C++. So you have to ask yourself what is more important: ease of coding and maintenance (C++), or efficient and deterministic code (C)?

    Hasdi
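    A little sketch of the two points above (made-up type, nothing from the Perl internals):

    #include <cstdio>

    struct BigValue {
        char data[256];
    };

    /* C style: passing by value copies all 256 bytes on every call */
    int first_byte_by_value(struct BigValue v) { return v.data[0]; }

    /* C++ style: a const reference passes a pointer under the hood but reads like a
       value, and "inline" invites the compiler to remove the call altogether */
    inline int first_byte_by_ref(const BigValue &v) { return v.data[0]; }

    int main()
    {
        BigValue b = { "hello" };
        std::printf("%d %d\n", first_byte_by_value(b), first_byte_by_ref(b));
        return 0;
    }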


  • I imagine he limited himself to those languages that the principal developers of Perl have experience in. There is nothing worse than trying to learn a new language to develop a major project. By the end of the project, you realize that you made several mistakes in the early stages which have locked you into undesirable outcomes.
  • by hey! ( 33014 ) on Thursday September 30, 1999 @06:00AM (#1648879) Homepage Journal
    I don't know where this idea that C++ generates slow executables came from. Bad programming creates slow executables.

    Early OO systems did a lot of runtime message passing, so they _were_ slow. C++, on the other hand implements a lot of the OO paradigm at compile time for more speed.

    Virtual functions DO add a table lookup each time they're called, but this would only have any effect in a tight loop. And you don't have to use them if you don't want them, but they do provide the benefit of reducing the need for type-unsafe downcasting (toy example at the end of this post).

    The other argument I've seen about C++ has to do with it somehow not being as "close to bare metal" as C. This impression comes from the fact that C++ does a lot of weird things like construct temporary objects to fit an object into an expression where it otherwise can't be used. Of course C does this too; if f takes a float and i is an int, then in the expression "f(i)", C constructs a float out of i for you. The difference is that C++ allows you to create bonafide first class types.

    If you don't want temporary objects constructed, code around this problem. After all, in C++ you can always define both f(int) and f(float), but in the end it won't make any difference because most of the time you'll be doing the equivalent conversion in the function anyhow.

    A lot of the hot opinion about C++ "inefficiency" reminds me of all my CS professors who used to say as an article of faith that "recursion is inefficient", and then go on to code elaborate ugly iterative algorithms to get around this. Well, one day I went home and ran a few algorithms through the profiler, and guess what? Most of the time there was no difference, and some of the time the recursive algorithm was a tad faster.

    You can't trust your intuition about what is fast and what is slow. So, profile your code. If a virtual function or constructor for a temporary is taking a lot of time in a tight loop somewhere, optimize just that one piece by coding around the problem. Usually performance problems tend to be in a very tiny fraction of code.

    I think that a lot of bad C++ comes from the fact that some programmers can't effectively use the extra flexibility that the language provides. They say if all you have is a hammer, every problem looks like a nail. If you have a hammer and a screwdriver, you're not in any better position unless you can tell the difference between a nail and a screw.

    In the end, I personally prefer C, because I find it small and elegant. Look at how thin K&R is! C++, because it exposes so much language definition machinery to the programmer, will never be as elegant, but it is nonetheless in most respects very well thought out.
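    For what it's worth, a toy example of both points (all names invented):

    #include <cstdio>

    struct Value {
        virtual ~Value() {}
        virtual double as_number() const = 0;   /* one extra table lookup per call */
    };

    struct IntValue : public Value {
        int v;
        IntValue(int x) : v(x) {}
        double as_number() const { return v; }
    };

    /* first-class overloads: no temporary double gets built for an int argument */
    void show(int i)    { std::printf("int %d\n", i); }
    void show(double d) { std::printf("double %g\n", d); }

    int main()
    {
        IntValue iv(42);
        const Value &v = iv;
        std::printf("%g\n", v.as_number());   /* dispatched through the vtable */
        show(7);                              /* picks show(int): no conversion, no temporary */
        show(7.5);                            /* picks show(double) */
        return 0;
    }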

  • Early OO systems did a lot of runtime message passing, so they _were_ slow. C++, on the other hand implements a lot of the OO paradigm at compile time for more speed.

    I have to agree with you on this one. C++ is probably the fastest OOPL around. You can code OO in C but it gets ugly. Stroustrup wanted the code to be as elegant as other OOPLs but as fast as C. That's why you have the virtual, inline, const, and static keywords cluttering the class declaration.


    The other argument I've seen about C++ has to do with it somehow not being as "close to bare metal" as C. This impression comes from the fact that C++ does a lot of weird things like construct temporary objects to fit an object into an expression where it otherwise can't be used. Of course C does this too; if f takes a float and i is an int, then in the expression "f(i)", C constructs a float out of i for you. The difference is that C++ allows you to create bonafide first class types.

    You are forgetting that class == private struct! At this point, a lot of compilers optimize ints and floats using automatic register allocation. A struct or array is usually implemented as a pointer to memory on the current stack frame (although since a struct is well-bounded, there are optimization opportunities you don't get with arrays). A C/C++ compiler does not *care* how many temporary ints and floats you have; it will weed them out! Struct/class temporaries always get spilled! If you don't believe me, encapsulate your ints in a 'class Integer' and your floats in a 'class Float' and overload the operators accordingly. Compile an arithmetic-intensive function with -O3, once using pure ints and floats and once substituting 'class Integer' and 'class Float' (sketch at the end of this post). Get the code size using a dump utility. QED.


    A lot of the hot opinion about C++ "inefficiency" reminds me of all my CS professors who used to say as an article of faith that "recursion is inefficient", and then go on to code elaborate ugly iterative algorithms to get around this. Well, one day I went home and ran a few algorithms through the profiler, and guess what? Most of the time there was no difference, and some of the time the recursive algorithm was a tad faster.

    I know what you mean. That's what they say at Ann Arbor too. Some professors are either misinformed or doing poor lip-service to the real gurus. Instead of using recursion, they wrote an iterative function with a handcoded stack. They made the stack dynamic, extending their sizes using malloc() and free()!

    So what is wrong with this picture? The iterative function is poorly written. 99% of the time, the reason a recursive function is slower is the call/ret overhead and the function header/tail overhead. If you replace those with a cmp/jmp, you will get much better performance, hence the endorsement of iterative functions. If you start using malloc() and free(), you are making the problem worse!

    If you want to optimize something, don't look at the C code. Look at how the code is being generated. A smart compiler can automagically convert a recursive function into an iterative one, if it is tail recursive.

    Enough ranting for one day. Back to work.

    Hasdi
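    For anyone who wants to try it, here is a minimal version of the experiment described above (toy code; the point is only to compare the generated code, and the result will vary by compiler):

    class Integer {
    public:
        Integer(int v = 0) : v_(v) {}
        Integer operator+(const Integer &o) const { return Integer(v_ + o.v_); }
        Integer operator*(const Integer &o) const { return Integer(v_ * o.v_); }
        int value() const { return v_; }
    private:
        int v_;
    };

    int sum_plain(int n)
    {
        int acc = 0;
        for (int i = 1; i <= n; ++i)
            acc = acc + i * i;                     /* plain ints: can live entirely in registers */
        return acc;
    }

    int sum_wrapped(int n)
    {
        Integer acc;
        for (int i = 1; i <= n; ++i)
            acc = acc + Integer(i) * Integer(i);   /* class temporaries: watch for stack spills */
        return acc.value();
    }

    Compile both with -O3, run a dump utility over the object file, and compare.
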
  • by dave_aiello ( 9791 ) on Thursday September 30, 1999 @06:01AM (#1648881) Homepage
    veldrane wrote:

    We run Perl5 in development but have been working very diligently to get it approved for prod, which currently uses v4. In my travels writing Perl code for Fortune 500 companies, this is a much more serious impediment to acceptance than the majority of Perl users realize. My largest clients are commercial banks and brokerage houses, and you constantly find machines at these places running versions like 5.002 or earlier.

    The normal reaction from a Perl user at an ISP or an academic institution is that the people running that system must be really dumb or must not use Perl. This is not the case.

    Many consultants like me and the software development staffs we work with are not the people that decide which development tools are used. These decisions are made by people at the Chief Technology Officer level at many companies. They tend to spend most of their time evaluating Closed Source Software because this is the software that is promoted through the corporate IT news, sales, and marketing channels.

    OTOH, software that is actually used, like Perl, is considered an ancillary part of the UNIX OS build that is installed on each Sun, HP, or IBM server. The geeks in the back room often have to resort to civil disobedience in order to get a new version of Perl approved and installed.

    Sorry if this seems slightly off-topic. To tie it back to the discussion about Perl 6 I would have to say, with all the new features we would like to add, we in the community need to re-double our efforts at advocacy. We need to do a better job of identifying places where obsolete Perl versions are still installed, and figure out ways to remedy the situation.

  • I didn't assess why the movement towards conformance took place, and couldn't address that in detail in a mere three sentences.

    My point was not to detail the "splintering" but to indicate that it took place, and represents cause for the lack of stability of C++ on Linux.

    I certainly agree that the now-much-better implementation results from the EGCS efforts.

  • I believe difficulty in code maintenance with languages such as C was one of the primary reasons for the spawning of object oriented programming languages (complex and sometimes unwieldy though they may be to many), such as C++.

    Kill the parenthesized section and the subsequent comma, and you get a little closer to my point. I wasn't attempting to take a historical look at OOP in general, just its recent widespread popularity (with regards to C++ vs. C).

    C++ isn't a pure OO language; it's a procedural/OO hybrid language due to being based on C. Its unwieldiness isn't explained by this. Other OO and OO-hybrid languages (say Objective C, which was a stronger influence on Java's OO nature than C++) are a lot less unwieldy.

    One might even say less useful. Java hurts my mind, yet C++ doesn't, so the term "unwieldy" is merely an opinion (well, it could be considered a documented fact in the case of certain languages.. but certainly not among the languages of choice), which is why I said "some" people might find it to be so.

    Besides, C++ isn't so much a procedural/OOP hybrid as that it allows you to use C within C++ (though if you do, you are clearly missing the point of using C++ in the first place.. unless you have a real good reason for doing such a thing). Yes, Java lets you "play it safe" (and I won't even go into Objectionable C), but it sacrifices a bit of power and versatility. I'll take the power and versatility, thanks.

    Nowadays, Object Orientation and OO programming are popular because they allow for a more natural modelling of problem domains and implementations of solutions than the procedural approach which makes it possible to deal with larger problem sizes.

    An excellent point, though I don't see how this debunks the notion that encapsulation, and the subsequent ease of code maintenance, is a primary factor in the widespread acceptance of OOP and/or C++.

  • Firstly, you can ignore the problem if you allow only the exact same executable to re-open the memory map. In that case, the vtable pointers in that persistent file should be valid, because the vtables have fixed virtual addresses in static storage. All you have to ensure is that the region is mapped to the same address, so that pointers into that storage also remain valid. Of course, this will not work if you change and recompile the program, because the vtables might then shuffle around.

    Secondly, you can still hack outside of the C++ language to mess with objects' vtable pointers. It should be possible to wrap up this ugliness in a module that can be adapted for various compilers.
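    The way that hack is usually spelled (a sketch only, with invented names, and well outside what the standard guarantees): give the class an extra constructor that touches none of the data, then placement-new onto the re-mapped memory so the compiler rewrites just the vtable pointer.

    #include <new>

    struct remap_tag {};

    struct Node {
        Node() : value(0), next(0) {}
        Node(remap_tag) {}                /* deliberately leaves value and next alone */
        virtual ~Node() {}
        int value;
        Node *next;
    };

    void fix_vtable(void *raw)
    {
        new (raw) Node(remap_tag());      /* re-installs the current vtable pointer in place */
    }
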
  • If you're interested in Perl Advocacy, you might want to join the Perl Advocacy mail list. Send the message "subscribe advocacy" to majordomo@perl.org [mailto].

    David
  • Well, python is largely an interpreter (just as Perl is) so this might not make that much sense. OTOH, I suppose that you could have a code generator module to create the Perl interpreter in native (for whatever machine) code. So it might.

    This might make the language design easier to understand, but at the same time it would make the generation of the interpreter (should I say virtual machine, to be current?) more complex. That shouldn't be a real problem, since it wouldn't need to be done often.

    But given this line of thought, perhaps the Perl-in-Perl thread earlier should be the way chosen.
  • by AT ( 21754 ) on Thursday September 30, 1999 @07:07AM (#1648888)
    Ok, a lot of people seem to really hate C++. I think part of this is because Linus rejected it for the kernel. This makes sense, though. The kernel is very low-level and you want as much control as possible. C++ is a bigger language, and it does more stuff behind your back. C is very predictable, and that's why C is such a good systems language.

    But for a big, non-nuts and bolts project like perl, C++ has a few important advantages.

    1. OO. I think this is obvious. Of course you can do OO in C, and C++ doesn't guarantee a good object model, etc. But it is less work, "cleaner", and requires fewer dodgy practices like macros to make things work. Additionally, there is far more tool support for doing OO in C++ as opposed to C.

    2. Templates/generic programming. Now that there is a cross-platform compiler that does these relatively painlessly (egcs/gcc3), Open Source C++ programmers should no longer have the qualms about exploiting this feature that they have had in the past. What do templates give you that macros and typecasts don't? Well, first of all, type-safe containers. That means you can't add a string to a list of ints. It won't even compile. Compare this to the best C containers, which can only catch it at run time (with a performance hit). If type-safe containers aren't enough, the C++ standard template library gives you iterators. To make a long story short, iterators allow you to write container-independent algorithms. Your sort routine will run on a tree, a hash, a list, a vector, etc. Try doing that in C without enough casts and macros to sterilize a rat. And of course, the STL comes with nice implementations of most of the well-known algorithms you are likely to need (quick sketch at the end of this post).

    3. Exceptions. Exceptions are simply a more elegant way to program, as any Java programmer will tell you. It nicely solves the ever-recurring problem, "How do I return both an error code and a result?". No more if (foo() == -1) return -1; everywhere and wondering if returning TRUE means success or failure.

    In summary, C++ isn't for everyone or every project. But it has some nice features that make it nice for large projects.
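    A quick sketch of points 1 through 3 (toy code, nothing to do with Topaz):

    #include <algorithm>
    #include <iostream>
    #include <list>
    #include <stdexcept>
    #include <string>
    #include <vector>

    int parse_flag(const std::string &s)
    {
        if (s != "on" && s != "off")
            throw std::runtime_error("bad flag: " + s);   /* no -1 return codes */
        return s == "on";
    }

    int main()
    {
        std::vector<int> v;              /* type safe: v.push_back("oops") won't compile */
        std::list<int> l;
        for (int i = 0; i < 5; ++i) {
            v.push_back(5 - i);
            l.push_back(5 - i);
        }

        std::sort(v.begin(), v.end());   /* the same algorithm runs on any container that
                                            offers the right kind of iterator */
        std::cout << *std::max_element(l.begin(), l.end()) << "\n";

        try {
            parse_flag("maybe");
        } catch (const std::runtime_error &e) {
            std::cout << e.what() << "\n";   /* one catch instead of checking every return */
        }
        return 0;
    }
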
  • >> Nononono, everyone knows that it stands for Pathologically Eclectic Rubbish Lister :-)

    > Just be careful which circles you swim in when you say things like that. I'm sure a lot of zealots would refer to such a thing as blasphemy.

    Er... Direct quote:

    "To those who merely like it, Perl is the Practical Extraction and Report Language. To those who love it, Perl is the Pathologically Eclectic Rubbish Lister".

    Camel book, p. xi

    Kaa
  • Then you would break things like:

    $onetwo = $one.$two;

    and:

    $blah = $one.scalar($two);

    This becomes less of a benign "feature request" and more of a fundamental change in one of Perl's more frequently used operators.

    Granted, you could make the distinction by looking for whitespace near the . (e.g. $blah = $one . scalar($two);), but you would still end up breaking a significant number of scripts. Plus it's less obvious when you see $one->{$two} versus $one.$two versus $one . $two. Also, how do you distinguish between $one->{'item'} and $one->item (a method)? Use parentheses? Things get complicated.
  • if Perl is the ultimate language, why didn't they write it in Perl?

    "The number of suckers born each minute doubles every 18 months."
  • I know they should be angle brackets, couldn't remember the HTML to make the browser not think they were tags. At least I previewed.

    &lt;&gt;==<nbsp;>

    :)

    --
    A host is a host from coast to coast...

  • Just be careful which circles you swim in when you say things like that. I'm sure a lot of zealots would refer to such a thing as blasphemy. Personally, I don't care, but I would rather it remained named Perl (and I'm sure it will, unless someone assassinates Larry ;). Hee hee..

  • oops. forgot an &... you get the picture.

    --
    A host is a host from coast to coast...
  • by Anonymous Coward

    Granted, C++ and Perl are overcomplex and have exception-ridden syntax, but underpowerful?

    IMO, a big part of C++'s problem in terms of complexity is that it offers many weak constructs rather than a few powerful ones. Thus you need to learn a new syntax for each partial solution, and then when they run out of steam you need to combine them in patchwork fashion to solve your real problem. This makes it much harder than it should be to do clean design and reason about your system.

    Let's take C++'s approach towards polymorphism. It gives us templates, virtual functions/inheritance, and function overloading on argument types. Each of these is an ad-hoc solution to a piece of the larger problem of supporting polymorphism, and in a well-designed language there would be a unified mechanism for all of these things. (Objective Caml, Cecil, and Dylan are examples of languages that are of comparable performance to C++ but exhibit much better design.)

    The standard objection to this line of argument is that this is all just syntactic sugar, and that a good programmer can write elegant code in C++. This is IMHO true but vacuous; since all computer languages are Turing equivalent, all the differences between them are essentially a matter of syntactic sugar. And since I don't see many systems written in Intercal, I think that the differences are of practical importance.

  • The problem with Sather is that it could be mistaken for being a dead language, and isn't something that many people are well-familiar with.

    That's not the only problem; the other critical issue is that of Foreign Function Interface. Perl presently offers the opportunity to connect in libraries with C interface plumbing, which requires that the implementation language be fairly deeply compatible with C.

    That being said, if there was a good compiled Scheme or Lisp with a bunch of back ends (arguably, the way to do this would be a Scheme front end for GCC), this would in fact represent a fast option. It is a longstanding fallacy that "Lisps are Interpreted." That may have been true 20 years ago, and was particularly true with David Betz's XLisp, but isn't a truthful statement.

  • Whatever Visual C++ implements, we shall call that "enough," because I really don't think that we can ignore Windows as a target market. If nothing else, we need the checklist item-works on Windows.

    Great. So now development of Perl is driven by least-common-denominator capabilities just so they can say "It works on Windows!" I thought the whole thing behind open source projects like this was that decisions could be made on technical merit rather than what some marketroid decided the software needed to be able to do. Or is that turning out to be just a myth?

    Chip could have at least phrased that statement a little differently, to say something like, "We just want to make sure it works with Visual C++ so we can support Windows," rather than the "We're basing it on Visual C++ because we think supporting Windows is much more important than using a compiler where bugs really get fixed on a stable platform the developers really use" implication of the way he stated it.

    -=-=-=-=-

  • A number of C compilers still don't conform to ANSI. I've noted that they aren't in very widespread use. Personally, while I'm in love with the idea of Perl being rewritten in C++, I'd rather it was fully ANSI/ISO compliant. I could just about smack Chip across the face for even thinking thoughts about Visual C++. (am I the only person who thinks MS Visual anything is simply a Bad Idea?)
  • Though I don't see how this debunks the notion that encapsulation, and subsequent ease of code maintenance, is a primary factor in the widespread acceptance of OOP and/or C++.

    I didn't intend to debunk this notion, but rather I intended to offer a slightly different perspective on it. In my view, OO(P)'s success results from it being the right tool for more jobs than procedural modelling/programming ("if all you have is a hammer, every problem looks like a nail"). Encapsulation is just a part of it (as is, e.g., the notion of inheritance), and the easier code maintenance is also because the program is more similar to one's mental map of the problem domain than a procedural program for the same problem.

  • by Anonymous Coward
    I'm working on a C++ compiler written in Perl. I'm calling it P++. Of course, the Windows version will be called "Visual P++" or just "VP++".

    -- Peter Piper
  • Bah... I don't buy into the whole language wars thing.. course I don't have to worry about it cause I just use Perl.. cause everything else sucks.. especially Python. Now... I also like Perl cause it runs well on Linux.. I just use Linux cause everything else sucks.. especially Apple/Mac. Oh.. and Amigas.. they suck too. Yep, Linux rocks.. course I use KDE on Linux cause everything else sucks.. especially Gnome. Yep.. I like GTK cause QT sucks. But I usually use the command line cause GUIs suck.

    Now...I'm gonna go watch an old movie on my Betamax...cause VHS sucks.
  • As nice a name as Topaz would be for the new Perl 6, you should be aware that there already exists a software product by that name. They (Software Science) have been making TOPAZ for Delphi (on PCs) for quite some years now. Check out their Web site:
    http://www.softsci.com/

    Scott
  • DIE, FOOL!!! PERL FOREVER!!! if (PHP == 'crappy') { print "You are a good person"; } else { print "You MORON.\n"; } if (Perl == "God in a programming language") { print "Well, that's an understatement but I'll let it pass...\n"; } else { print "Moron.\n"; }
  • by Anonymous Coward

    I could not access the link in this article so I may be asking questions that may have been covered.

    1. Why undertake such a huge task of re-writing a proven language ('s interpreter) in another language instead of cleaning up existing C source?

    2. What are the benefits of using C++ to do this as opposed to C?

    3. What is bootstrapping?

    4. I think [?] that PHP4 was a complete re-write of the interpreter of the PHP language, while remaining backwards compatible with most PHP3 code. This seems to have been a dramatic advantage, and did not seem to [?] take all that long to accomplish. Can the re-write of Perl be compared to this?

    5. What is Larry Wall's opinion on this?


    Please be a little descriptive, for my sake at least. :-)

  • I am using VC++ 6.0 (+ SP3) and I don't have the problems you refer to.

    STL seems to work fine. It now supports default template arguments.

    I even use it in thread code without trouble. Of course, you need to use a synchronization object when one thread inserts items into a list and another removes them. The synchronization objects I use (CCriticalSection) are not portable, but there are ways to create classes that hide the OS implementation (a rough sketch appears at the end of this comment). We just have no need to port the code to Unix/Linux at work.

    I learned STL on G++ and started using it with MS VC++ 4.2 (using explicit allocator template args).

    At work, we are getting ready to migrate the changes needed from VC++ 4.2 to VC++ 6.0. This mostly involves adding "using namespace std" in
    a lot of places.

    Folks who argue that C is more portable seem to forget about things like configure, autoconf, automake, and the ever popular #ifdef, #else, #endif.



  • by Feldmrschl ( 79133 ) on Thursday September 30, 1999 @10:15AM (#1648913) Homepage
    After reading the article and the posts, I feel compelled to share my thoughts and experiences with both languages.

    I've been a software engineer for 13 years. My first exposure to C came during a junior year course, Survey of Programming Languages, at ULowell, now UMass/Lowell. Once exposed, there was no going back. I was hooked. IMHO, it was so much better than Pascal. Once exposed, my Turbo Pascal went back to the bookshelf, never to be used again.

    My initial exposure to C++ came during my senior year. I was enrolled in a compiler design course and the professor was a Modula-2 fanatic. Since half of the class were budding C gurus, he relented and allowed us to use "this new language", C++. We shelled out the $$$; he went to the Harvard Coop to buy copies of Stroustrup's book.

    I was impressed by the differences between C and C++. I particularly liked the organizational improvements that class definitions had over struct definitions. I found it a joy to code.

    The project for the course was to build a Pascal-subset compiler. We couldn't use global variables and all functions, except for the tokenizer, had to fit on one page of hard copy printout. At the end of the course, he ran informal benchmarks for our compilers in terms of output size and compilation speed. My C++ compiler was the fastest and 3rd in efficiency!

    Fast forward to the real world. After working for Omtool and Phoenix Technologies coding C and x86 assembler respectively, I worked as chief software engineer at Image Concepts. As chief software engineer, I was to design and develop their image cataloguing database product. To do so, I had the freedom to pick and choose the development tools and platforms for development. Since our product was slated to run on Unix/X11, Mac System 7 and Win3.x systems, I chose C. Again, portability, and the fact that Unix shipped with free compilers and Image Concepts was a very small company...

    Once I started working at Pegasystems, after 6 years at Image Concepts, C++ became the language of choice, both for the day job and for night coding. Why? Code organization. Even though I wrote my C code in a C++ style (the first argument to any function written for a particular struct was a pointer to a variable of that struct), C++'s constructors, destructors and exceptions made error management much, much better.

    Typical C function (pseudo-code):

    open file
    if error: return
    alloc memory
    if null: close file and return
    read from database
    if error: free memory, close file and return
    do something else
    if error: yada, yada, yada...

    C++:
    open file object
    if error: return
    init buffer object
    if error: return (file object destructor closes file)
    read using database object:
    if error: return (buffer object frees memory, file closed)

    -or-
    throw exception on error, which will also call destructors

    You get the idea (a compilable sketch of this pattern appears at the end of this comment). When using C, I spent more time writing error management code than writing the algorithm. Worse, the algorithm was hidden amongst the error checking and management code. In C++, I build my libraries of well-designed classes and exceptions and, wow, you can actually see the logic. What a concept!

    C++ isn't perfect, particularly some implementations. However, now that the language has been standardized, it should become more portable.

    Languages are tools. We use them to design apps. We use the compilers to build the apps. Some are better than others in some areas. Each has a place. I use C++ for systems programming. I use Perl for CGI programming. For pattern parsing and portable scripting, Perl gets the nod. For outright speed, C++ gets the nod.

    Is C better than C++? Vice versa? Is Java better? Is C++ purely OO? Does it really matter? Just pick a language, write the app, and distribute your work and add some value to the computing community at large. 'Nuff said.

    One caveat to note regarding C++: templates and shared libraries are a bitch to implement well, that is, if you don't want multiple copies of instantiated templates (stack, for example) scattered throughout your libraries. Both Visual C++ and Metrowerks CodeWarrior had ways of working around this, keeping the instantiation in one library and having other libraries link back to the first one, but the syntax each used was an extension of standard C++. I don't know how G++ will handle this.
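
    Coming back to the error-handling comparison above, here is a small compilable sketch of the C++ side; the File and Buffer classes are hypothetical stand-ins, not from any particular library:

    #include <cstdio>
    #include <stdexcept>

    // Hypothetical RAII wrappers: the destructors release the resources.
    class File {
    public:
        File(const char *name) : fp_(std::fopen(name, "rb")) {
            if (!fp_) throw std::runtime_error("open failed");
        }
        ~File() { std::fclose(fp_); }
        std::FILE *get() { return fp_; }
    private:
        std::FILE *fp_;
    };

    class Buffer {
    public:
        Buffer(std::size_t n) : p_(new char[n]) {}
        ~Buffer() { delete [] p_; }
        char *data() { return p_; }
    private:
        char *p_;
    };

    void processRecord(const char *name)
    {
        File f(name);       // closed automatically on every exit path
        Buffer buf(4096);   // freed automatically on every exit path
        if (std::fread(buf.data(), 1, 4096, f.get()) == 0)
            throw std::runtime_error("read failed");
        // ...do something else; errors just throw, destructors clean up
    }

    Real code would also forbid copying of such wrappers, but the point stands: the error paths collapse into "throw and let the destructors clean up."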
  • by Kitsune Sushi ( 87987 ) on Thursday September 30, 1999 @04:34AM (#1648914)

    Might I inquire what would give one such an impression? C is covered by ANSI standards, and so is C++ (not to mention, yay! ISO!), which means just about any compiler worth anything can compile perfectly ANSI compliant C or C++ code. Both are just about the epitome of portability (unless you count Java.. but you know, I don't feel like it), which aside from their being quite powerful languages, is what makes them languages of choice. C being very nearly a subset of C++, etc.. I'm not sure what would cause one to think of C++ as "less stable". Perhaps I'm not understanding the question? After all, it is rather early..

    And since C++ is object oriented, it's a lot more.. useful.. in most situations than C. C still has its uses where C++ would be considered blatant overkill, but for anything that's going to be complex enough and is going to need to evolve a lot, C++ is the better alternative.

  • by Anonymous Coward on Thursday September 30, 1999 @04:36AM (#1648915)
    Well, so what the article says is basically this: "Over a long time we have done a shitload of ad hoc programming without order or system. Now we look at it and we are scared. So we will just drop everything we've done and rewrite this whole thing in some other language. We hope that the object-oriented nature of C++ will solve the problems we as programmers cannot resolve (such as writing good, well-structured code)."
    Well, it won't... It will be slow as hell; Perl, not being a jet rocket now, will be worse than your Java in Netscape on a 486 :) And if they can't keep their code neat, no OO language is going to save them...
    Sad.. very sad.. they are smart people, but that doesn't necessarily give them discipline.
  • by ajs ( 35943 ) <{ajs} {at} {ajs.com}> on Thursday September 30, 1999 @04:38AM (#1648916) Homepage Journal
    From the article:

    Why not use C? Certainly C does have a lot to recommend it. The necessity of using all those weird macros for namespace manipulation, which I'd rather just use the namespace operator for, and the proliferation of macros are all disadvantages. Stroustrup makes the persuasive argument that every time you can eliminate a macro and replace it with an inline function or a const declaration or something or that sort, you are benefiting yourself because the preprocessor is so uncontrolled and all of the information from it is lost when you get to the debugger. So I'd prefer to use C++ for that reason.

    Macros are the reason to dump C for C++? Whoa, I got off of perl5-porters way too soon. Not to start a language war or anything, but this article read like a C++ lover's manifesto, not a reasonable set of excuses to use the language to re-implement one of the most powerful and stable interpreted languages ever.

    If you want a clean object model, just look at GTK+. If you want to eliminate macros, just use inlines. Most people have a hard time with the idea of building inlines in header files, and I don't blame them, but that's how you're going to end up doing it in C++... And don't try that "inline isn't in most C compilers" line, because most OSes don't ship with a C++ compiler either. If you say that you can just buy one or get GCC, then the one you buy will almost certainly also be a C compiler that handles inline, and gcc supports inline (or __inline__ if you have -ansi turned on).

    OTOH, macros are a good thing. Yep, I said what you thought I said. When a macro performs only a simple syntactic transformation, that's fine; I've never once been caught by such a thing. When a macro is a few lines of code, you've failed to correctly design your program, and I've lived in debugger-hell for that one. I can certainly see the value of consts over macros for variables, but that's not even something that's hard in C (the kind of macro gotcha Stroustrup has in mind is sketched at the end of this comment).

    Overall this will end up one of two ways: 1) it will fail because C++ is too slow, or 2) it will succeed because he uses C++ only for the advantages its syntax can provide him, without being trapped into the glitzy (and mostly useless) constructs that make C++ about as speedy as my grandmother on a cold winter day.

    Along those lines: I heard a good joke recently. A JIT compiler for Java that claimed to run code "just as fast as C++"! I laughed for minutes. Then I cried when I realized that this will likely work as an ad campaign. Sigh.
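
    To put the macro-versus-inline argument in concrete terms, the classic gotcha (using a hypothetical MAX macro) looks something like this:

    #include <cstdio>

    // Macro version: pure text substitution, invisible to the debugger.
    #define MAX(a, b) ((a) > (b) ? (a) : (b))

    // Inline version: ordinary function semantics, each argument evaluated once.
    inline int max_int(int a, int b) { return a > b ? a : b; }

    int main()
    {
        int i = 5, j = 5;
        std::printf("%d\n", MAX(i++, 3));     // i++ expands twice: prints 6, leaves i at 7
        std::printf("%d\n", max_int(j++, 3)); // j++ evaluated once: prints 5, leaves j at 6
        return 0;
    }

    Whether that sort of gotcha justifies switching implementation languages is, of course, exactly what is in dispute here.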
  • by cd-w ( 78145 ) on Thursday September 30, 1999 @04:38AM (#1648917) Homepage
    Why doesn't someone bootstrap a Perl compiler written in Perl? Isn't that usually the way it's done?
  • As much as I like the new undertaking (I will be installing Perl6 on my system @home when it gets released), this will mean bad news for the geeks at our large company. Right now we have a split system between development and production, where everything in production has to go through what I can only assume to be rigorous testing before it is implemented. We run Perl5 in development but have been working very diligently to get it approved for prod, which currently uses v4. Change can be slow with the overhead that comes with big business, so I see a much larger gap between dev and prod here. (That's our problem, I guess. ;)

    But yes, this is great news and I can only hope they can make a great thing(tm) better with more functionality.

    -Vel
  • by Anonymous Coward

    Why does writing Perl in C++ not make sense to me?

    It's a natural fit! Consider that C++ is a systems programming language with an arcane and exception-ridden syntax and a semantics that is at once overcomplex and underpowerful, and that Perl is a scripting language with precisely the same design. :)

    I suppose the corresponding effort in the Python world is John Max Skaller's attempt to reimplement Python in Objective CAML. You may draw your own parallels between the size of OCaml's user base and Python's.

  • by Christopher B. Brown ( 1267 ) <cbbrowne@gmail.com> on Thursday September 30, 1999 @04:39AM (#1648920) Homepage
    The stability of C++ has been perpetually around the corner for some time now. It has assortedly suffered from:
    • There not being a normative standard.

      That became untrue a year or so ago, when the ANSI C++ committee released the "final" standard.

    • There not being a good free compiler.

      G++ only fairly recently has started to be both "reliable," "correct," and "nearly completely conformant to standard."

      The gyrations between GCC and EGCS, which has recently become GCC, did not help.

    • Interoperability of LIBC++ versions has not been very good.

      The claim is that there shouldn't need to be Yet Another Noninteroperable Version of the GCC libc++ library; I'm inclined to wait six months and see...

    What this adds up to is that C++ on Linux has had some fairly severe handicaps. Several are no longer in effect; we'll see if this allows C++ to "get up and walk."

    (My personal suspicion is that there may continue to be some LIBC++ gyrations for a while...)

  • ...why they left C in the first place.
    ---
  • qmail? you mean, that piece of modular C code that manages to avoid *every single convention* that C programmers are used to (starting with null-terminated strings), just so you can read it and feel like it's another language altogether? and there's next to 0 comments in the qmail code, too. i love qmail as a user, and i've made minor changes to it before, but i wouldn't count it as an example of a particularly clear program.
  • It is true that C++ on Linux has had some handicaps, but in my experience it's Visual C++ which will be the biggest problem for the Perl Team.

    I'm the coordinator of the Free Trek project, which is a space battle simulator intended to run at the very least on Linux and Windows computers. The project uses C++, compiled by G++ and Visual C++ 6.0. It has been tremendously difficult at times getting VC++ to work properly with the code, mostly due to the brain damaged Standard Template Library that ships with VC++.

    The STL that comes with VC++ is based on the Hewlett Packard implementation which is old, and not thread safe. In my experience, VC++ will not even compile that old STL without a pile of errors which are nearly impossible to fix. I found an STL implementation called STLPort (http://www.stlport.org) that does the job, but it was also somewhat difficult to get working even with the latest Microsoft compiler and all the patches. The good part of that implementation is that it is a modern implementation based on the SGI code (which is in turn based on the HP code), and it is thread safe. The Free Trek distribution includes a configuration file for STLPort which can be dropped right into the STLPort distribution for VC++ compilation. It is not entirely obvious how to compile STLPort with VC++, and it certainly doesn't compile properly out of the box. With our config file, it will.

    They are right in pointing out that G++ is pretty much standards compliant right now, and that they will have to limit their usage of the standard to what VC++ supports. If they use the STLPort library, then those problems will be minimized. One of the developers on the Free Trek project uses the Metrowerks compiler, and he seems to have far fewer problems compiling the code.

    So, based on my experience, I'd say that C++ on Linux is really excellent right now, and whatever kinks might exists should not exclude C++ from being chosen as a development language when it is appropriate for the project. Only if you are going to be maintaining a Windows port will you run into problems. I'd have to say that Free Trek has hit most of those problems already, so if anyone writing the Perl code gets stuck with something, please send me an e-mail. I might be able to help you out.

  • Here are some good reasons why they might be considering using C++.

    1. Perl is already written in C. C++ gives you an easier migration path from C than other languages: you can gradually move the program component by component, the C++ parts can freely call the C parts, and everything is linked into a homogeneous whole (a minimal sketch follows below).

    2. Speed. Perl is an interpreter for a programming language. Moreover, it is widely used in practical applications. So it deserves to be written in a systems programming language that is statically typed and translates to reasonably efficient machine code.
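
    A minimal illustration of point 1 (the function and class names here are hypothetical, not Topaz's actual design): the old C code keeps doing what it always did, and new C++ code calls it through a declaration with C linkage.

    #include <string>

    // Imagine this is a function from the existing C sources, shown inline
    // here so the sketch is self-contained; extern "C" gives it C linkage.
    extern "C" int guts_string_length(const char *s)
    {
        int n = 0;
        while (s[n]) ++n;
        return n;
    }

    // New C++ code can wrap the old C guts without rewriting them first.
    class Scalar {
    public:
        Scalar(const std::string &s) : s_(s) {}
        int length() const { return guts_string_length(s_.c_str()); }
    private:
        std::string s_;
    };

    Nothing in the old C file has to change until someone actually wants to rewrite that piece.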
  • There not being a good free compiler.

    G++ only fairly recently has started to be both "reliable," "correct," and "nearly completely conformant to standard."

    The gyrations between GCC and EGCS, which has recently become GCC, did not help.

    I disagree strongly. The creation of the splintered-off EGCS compiler is what gave us the "reliable," "correct," and "nearly completely conformant to standard" compiler! If the old maintainers of GCC were left to their own devices, we would still be worlds away from having a free standards-compliant compiler.

    The fact that the two groups have re-merged and adopted the EGCS compiler as the standard GNU one just shows how well the Darwin model of free software development works. Again, the ability to temporarily fragment the code base allowed the code base as a whole to become better and more reliable than it otherwise would have.

    99 little bugs in the code, 99 bugs in the code,
    fix one bug, compile it again...

  • Crazy talk! There are many ways to develop screamin' code with ANSI C++ (as with many other languages). Most of the slow, big-ass OO C++ out there is due to bastard designs, not to the language. OO != horrific over-use of polymorphism constructs, etc., etc.

    C++ is a good tool when used properly. It sounds like C may be 'easier' in this case though, as the current sources reside there... unless an architectural change is needed.
  • There have been a lot of statements along the lines of 'C++ is slower than C'. After being puzzled by where this perception came from, I realized it came from Visual C++ development under Windoze. Things end up slow there because of both the pressure people in that environment are under and the slow, badly designed nature of the function-call libraries masquerading as an OS that they have to work with.

    In my experience C++ is as fast, or faster than C.

    Sure, virtual functions impose a tiny bit of overhead. So does RTTI. Exceptions also introduce some overhead, though implementations are getting better.

    The speedup doesn't come from using these features directly. It comes from the flexibility afforded the programmer. Want to switch from using a list to using a tree because you discovered that your list is unexpectedly holding thousands of elements instead of the expected few dozen? No problem (a sketch of that appears at the end of this comment). Want to have a more elegant and faster way than a switch statement of picking the right thing to do for a particular type of data? No problem.

    And that's just a sampling. Done correctly, C++ both leverages optimization efforts and makes them easier.

    I find this dislike of C++ in the open source community to be very upsetting and distressing. It puzzles me how a bunch of people who claim to have open minds can reject a language out of hand that is so obviously useful and worthwhile. One can only suspect that the declaration of open mindedness is a facade and that they're horribly afraid of learning anything new.

    PS. To me, GTK's main disadvantage is that it isn't written in an OO language. I thought that people, after seeing what a cock-up Xt is, would've known better.
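
    To make the container-swapping point concrete: if the choice of data structure hides behind a typedef (the SymbolStore name below is purely illustrative), switching from a list to a tree-backed set is a one-line change.

    #include <algorithm>
    #include <list>
    #include <set>
    #include <string>

    // One line decides the data structure; the rest of the program
    // only ever sees the SymbolStore name.
    // typedef std::list<std::string> SymbolStore;   // original choice
    typedef std::set<std::string> SymbolStore;       // tree-backed replacement

    bool contains(const SymbolStore &store, const std::string &name)
    {
        // std::find compiles for either container; with std::set you could
        // also switch to the O(log n) member find.
        return std::find(store.begin(), store.end(), name) != store.end();
    }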

  • If I were Chip (which I'm not) I would barely give a moment's thought to the Visual C++ port at this stage. Yes, Perl needs to run on Windows. But GCC works fine on x86 Win32, and I'm not sure Windows plans to support any non-Intel-types in the future. Well over 99% of Perl users on Microsoft platforms will never compile perl.exe and don't care what compiler built their binary. And if M$ considers Perl worthwhile (which it seems to), it'll take care of its own. Even if the first stable Topaz works with Visual C++, I bet M$ will rewrite it to ``take advantage of Windows features''.

  • Then feel free to write your own implementation. Or don't use it.
  • by Kitsune Sushi ( 87987 ) on Thursday September 30, 1999 @04:50AM (#1648943)
    Why in the world would I do such a thing? Or rather, start the ball rolling? Well, the primary reason was difficulty in maintenance. Perl's guts are, well, complicated. Nat Torkington described them well. I believe he said that they are "an interconnected mass of livers and pancreas and lungs and little sharp pointy things and the occasional exploding kidney." It really is hard to maintain Perl 5. Considering how many people have had their hands in it, it's not surprising that this is the situation. And you really need indoctrination in all the mysteries and magic structures and so on before you can really hope to make significant changes to the Perl core without breaking more things than you're adding.

    I believe difficulty in code maintenance with languages such as C was one of the primary reasons for the spawning of object-oriented programming languages (complex and sometimes unwieldy though they may be to many), such as C++.

  • by kalamon ( 36451 ) on Thursday September 30, 1999 @04:54AM (#1648946)
    I fail to see anything in C++ that makes it inherently slower than C. Having written in both, I can tell you that I can actually write faster code in C++ than in C, not to mention that I make MUCH fewer stupid mistakes because of better type checking etc.
    If you use features such as polymorphism or RTTI, of course you get a performance hit, but if you try to implement the same things in C manually, you won't get any better results (see the sketch just below this comment).

    Slowness of a program comes from sloppy programming; it has nothing to do with the language.
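
    A sketch of that comparison (the types are purely illustrative): a virtual call and a hand-maintained function-pointer table both boil down to one indirect call.

    #include <cstdio>

    // C++ version: the compiler builds and indexes the vtable for you.
    struct Node {
        virtual ~Node() {}
        virtual void eval() = 0;
    };
    struct AddNode : Node {
        void eval() { std::printf("add\n"); }
    };

    // Hand-rolled C-style version: you build the "vtable" yourself.
    struct CNode;
    typedef void (*eval_fn)(CNode *);
    struct CNode { eval_fn eval; };

    void add_eval(CNode *) { std::printf("add\n"); }

    int main()
    {
        AddNode a;
        Node *n = &a;
        n->eval();              // one indirect call through the vtable

        CNode c = { add_eval };
        c.eval(&c);             // one indirect call through your own pointer
        return 0;
    }

    Either way it is one indirect call; the difference is who maintains the table.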

  • And what exactly is the performance problem with C++?

    My C++ is admittedly rusty, (I moved on to Java three years ago), but I don't recall any performance issues. I do remember being completely screwed by over-enthusiastic use of operator overloading, virtual base classes, fancy conversions, and especially templates, but not performance issues.

    You say that it's the C++ glitz that makes the language slow. Exactly what features are you referring to, and what sort of performance penalty is incurred?

    If I had to write C or C++ again, I would definitely want to use C++ (performance permitting), but I would be much more selective about the features I used.

  • by WasterDave ( 20047 ) <davep@z e d k e p.com> on Thursday September 30, 1999 @04:59AM (#1648948)
    Something that's been rumbling in my head for a little while. Where do we draw the line between real C++ and pseudo-C++?

    For instance, if I write something using MFC (CMapStringToPtr or the less disgusting CMap template) then, well, it isn't real C++, is it? I won't be able to compile it anywhere else. I'll be stuck with that god-awful library.

    So you start using STL, because it's ANSI'd and open and designed by adults who knew what they were doing. You go looking for (and find) a convenient map template and in the process notice templates for near as dammit everything else! Strings, memory allocation, streams, the whole enchilada.

    So, to be *real* C++, should I only be using the STL library, doing #include <string> and only stretching as far as #include <cstdio> when I feel a nasty hack urge coming on? Can I say goodbye to the happy days of getting a char* to the first byte and bouncing along the string until it dereferences to zero? Do I, in short, have to clean my act up? (A tiny before-and-after sketch appears at the end of this comment.)

    And to what extent do other major C++ projects do this, especially given the relative newness (in terms of C++ standardisation) of the STL?

    Dave :)

    (offtopic note) MFC is horrible. Immense kudos to the ATL team for a fine piece of work.

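    On the char* question above, a tiny before-and-after sketch (purely illustrative):

    #include <cstdio>
    #include <string>

    int main()
    {
        std::string s = "hello, world";

        // Standard-library idiom: the string knows its own length.
        std::printf("%lu\n", (unsigned long) s.size());

        // The old happy days: grab a char* and bounce along until the NUL.
        const char *p = s.c_str();
        std::size_t n = 0;
        while (*p++) ++n;
        std::printf("%lu\n", (unsigned long) n);
        return 0;
    }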

  • by BIFFSTER ( 31667 ) on Thursday September 30, 1999 @05:48PM (#1648956)
    I've ported perl to a new OS and I've written my own modules, so I feel I have a relatively valid viewpoint on perl's internals.

    Basically, the internal data structures are hoary. They're awful to dig through, and even worse to try to modify. Even writing external modules for use in Perl can be horribly nasty because of the structures involved, the reference counting, etc.

    I'd say at this point there are about 10-20 people (at most) who are competent to go and change the guts of perl. That's not a large number, especially when they don't have all that many tuits.

    One of the prime motivations of rewriting perl in C++ is so that things will be modularized so that the knowledge barrier is greatly reduced. Right now, you need to know how _everything_ works before you can modify _anything_.

    I'll be quite glad to get rid of things like
    return sv_2mortal(newSVpv(ret,len));, thank you very much.
  • by Paul Crowley ( 837 ) on Thursday September 30, 1999 @05:05AM (#1648962) Homepage Journal
    Chip didn't consider all the programming languages in the world before making his choice; there are several others, some of which might have been a better choice.

    How about, say, Sather [berkeley.edu]? A clean, fast, free (libre and gratis), object-oriented, garbage-collected language, which (and this is the beauty) compiles into portable C, so you can bootstrap Perl without first bootstrapping Sather. You only need Sather if you want to tweak Perl.

    If that's too much effort, then you've basically already made the decision that only C or C++ will do since that's the compiler that machines will already have, so there's not much point in talking about other languages you might have used were it not for that condition. But I think it would be a mistake for free software projects to mandate that only these languages will do - it's not a restriction that proprietary, binary-only releases labour under.
    --

  • "...and I kind of like having the code and the HTML in the same file."

    Then you should check out the HTML::Mason [masonhq.com] module running under mod_perl. Even though I'm a hard-core Perl advocate, I was a big fan of PHP for web development (until I found Mason).

    Very cool stuff...

    --