Ask Chuck Moore About 25X, Forth And So On

Chuck Moore is, among other things, a chip designer. His latest design, the 25x, is based on a 5x5 array of X18 microprocessor cores, and could provide 60,000 MIPS at a production cost of about one dollar. And Moore has the chops to back that up: he's been designing tiny, efficient processors for many years. He's also the inventor of the programming language Forth, which has evolved from a minuscule but radically fast language "difficult for a human to read" (according to The Secret Guide) to the even more radical colorForth. How radical? Try "includes its own operating system; has its own 27-key Dvorak keyboard layout; meaningful color syntax." How's that for starters? Post your questions for Chuck about processors and programming below (ask all you'd like, but one per post, please). We'll pass the best ones on to him, with answers soon to follow.
  • 5x5 grid of procs (Score:2, Interesting)

    by Lord Kestrel ( 91395 )
    What kind of signalling method are you using for inter-processor communication?

    And what led you to design/implement this array?
    • These CPU cores are on a single silicon die, so there is no required (electrical) signaling method, just address select lines. It's a direct register-to-register transfer, at least in the original discussions that I had with Chuck about this.

      There are a number of different mechanisms that have been discussed to signal the software, I'm not sure what Chuck has experimented with so far.

      There is a group at MIT with a similar architecture, only much less efficient.
      http://www.cag.lcs.mit.edu/raw/

      John L. Sokol
  • by Ed Avis ( 5917 ) <ed@membled.com> on Tuesday August 28, 2001 @12:22PM (#2225945) Homepage
    Many high-level languages compile into C code, which is then compiled with gcc or whatever. Do any use Forth instead? I understand Forth is a stack-based language: doesn't that present problems when compiling for CPUs that mostly work using registers?
    • Both the Java Virtual Machine and the intermediate code created by GCC, RTL, are stack based, and later interpreted or compiled for a register machine. It isn't really a problem; sometimes it's actually easier that way.
    • No, not really. Stack-based systems just use registers for the topmost stack elements.

      The Java VM is a stack-based system which is approximately equivalent to C's speed (a bit faster in some benchmarks, a bit slower in others). If you were to examine the machine code HotSpot produces, you'd see that it was indeed using registers even though the Java bytecode is entirely stack-based.
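
      To make that concrete (a hypothetical sketch, not HotSpot's actual output): a stack-oriented word such as

      : hypot2 ( a b -- a*a+b*b ) dup * swap dup * + ;

      doesn't have to keep its stack in memory at all. A code generator can pin the top few stack slots to machine registers, so the body might come out as something like:

      \ hypothetical codegen, top two stack slots pinned to r0, r1
      \   mul r1, r1, r1    b*b
      \   mul r0, r0, r0    a*a
      \   add r0, r0, r1    sum in r0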
  • From the 25X webpage [colorforth.com]:

    A 7 sq mm die, packaged, will cost about $1 in quantity 1,000,000. Cost per Mip is 0.

    At that price, I'll take a few billion MIPs, please!
  • by rho ( 6063 ) on Tuesday August 28, 2001 @12:26PM (#2225960) Journal

    Do you have a direction in mind as to where Forth/colorForth and the 25x could go? e.g. do you see them in handhelds, set-top boxes, etc?

  • Market Niche (Score:2, Interesting)

    by kaladorn ( 514293 )
    What market niche will the new grid-array processor be targeted at? 60 BOPS is a fair value for a buck. But is the power consumption ridiculous? Or would this be suitable for deployment in various mobile applications? Or is the only way to get this high a projected performance to clock the chip like a six-year-old on chocolate frosted sugar bombs?

    And an aside:
    My ignorance of Forth might be showing (one of the few I haven't had kicked into me over the years) - but wouldn't "meaningful colour syntax" represent quite a nasty disadvantage for those who are either entirely or partially (red-green) colour blind?

    And speaking of Dvorak... anyone know where I can get an ergonomic, full-sized keyboard with a Dvorak key layout? I can probably remap the keys on the existing MS keyboard, but the idea of switching the keycaps is nasty. It'd be better to have a keyboard that sent the right scancodes.

    • Regarding your keyboard question, http://www.kinesis-ergo.com/contoured.htm [kinesis-ergo.com]
      has some great (though expensive) keyboards. They take some getting used to, but they're great once you do.
    • I don't think you really want such a keyboard.

      You MIGHT want one that sent the same scancodes but on which someone had already switched the keycaps...

      and I could see some usefulness in one that switched between the two modes in hardware, except it's probably exorbitantly expensive, due to the low production run. That alone is reason enough not to get one...

      But there are a bunch of things which map to a keyboard geometry, the one foremost in my mind being IJKM as arrows... and there are others, some I've programmed for testing reaction time. There's no reason I can think of to swap the keycodes around, and several not to, unless you're trying to fool everything, which I don't think you are. At least, if I were going to do that, I'd want an A/B switch.

      - Arete
    • You need keycaps to type Dvorak? Wimp. :)

      Seriously, though -- it's (very) well worth knowing how to remap any keyboard quickly (and being able to type without keycaps), because much of the time you won't be at your own box, and you can't very well drag your own keyboard along. Software key remapping works well almost everywhere (it gets covered as part of "internationalization" initiatives, even if nobody had it done beforehand), so a keyboard that sends remapped scancodes is really redundant.
    • Re:Market Niche (Score:2, Informative)

      by treyb ( 9452 )
      60 BOPS is a fair value for a buck. But is the power consumption ridiculous?

      From the web site: Max power 500 mW @ 1.8 V, with 25 computers running

      Or is the only way to get this high of projected performance by clocking the chip like a six year old on chocolate frosted sugar bombs?

      Again, from the website: asynchronous microcomputer core, meaning you don't count clocks like you do in synchronous logic.

      My ignorance of Forth might be showing (one of the few I haven't had kicked into me over the years) - but wouldn't "meaningful colour syntax" represent quite a nasty disadvantage for those who are either entirely or partially (red-green) colour blind?

      Chuck uses color, but you could change the colors to different fonts and/or font styles, if you want. Just as Python source uses indentation for telling the compiler about nesting levels, colorForth uses color tokens (think of it as a trivial markup language) to tell the compiler about word (aka function) definition starts, literal numeric values, etc.
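
      To give a flavor of that (an illustrative sketch; colorForth actually stores the color as tag bits on each pre-parsed word, not as anything textual): where ANS Forth punctuates,

      : square dup * ;

      colorForth drops the punctuation and instead colors the name square red ("this begins a definition") and dup * green ("compile this"); a yellow word means "execute it right now, at compile time," roughly what ANS Forth does between [ and ].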

  • revolutionary (Score:4, Interesting)

    by rnd() ( 118781 ) on Tuesday August 28, 2001 @12:27PM (#2225966) Homepage
    Put on your Thomas Kuhn hat for a moment and tell us:

    What is the most revolutionary (i.e., it is scoffed at by those in control/power) idea in the software industry today? Explain how this idea will eventually win out and revolutionize software as we know it.

    • Now take off the paradigmatic Kuhnian hat of revolutionary scientific philosophization, and try on this colorful Dr. Seuss one. Ahh yes, good. Now tell us:

      How did it get so late so soon?
      It's night before it's afternoon.
      December is here before it's June.
      My goodness how the time has flewn.
      How did it get so late so soon?

  • uh, forth post? (Score:2, Interesting)

    by alienmole ( 15522 )
    Sorry, couldn't resist.

    Forth is a very cool language: I first used it running on an Apple ][ a couple of decades ago, to write programs to control lasers for laser shows at a planetarium. The combination of interactivity and performance was great - it allowed details of a show to be reliably tweaked right before and even during a show. This was one of those situations where the tool really made a difference to the end result. Other languages available on the Apple at the time couldn't really compete.

    I don't have a question for Chuck, but I'll come back when I think of one.

  • by DG ( 989 ) on Tuesday August 28, 2001 @12:30PM (#2225977) Homepage Journal
    Now that sub-$1k computers are running in the GHz range, it seems that hardly any of the computational tasks on a common desktop system are processor-bound.

    3D, rendered-on-the-fly games get well over 30 frames per second at insanely high resolutions and levels of detail. The most bloated and poorly-written office software scrolls though huge documents and recalculates massive spreadsheets in a snap. Compiling the Linux kernel can be done in less than 5 minutes. And so on.

    It seems that the limiting factor in modern computers is off the processor, in I/O.

    What, then, do you foresee coming down the pike that requires more processor power than we have today? What's the underlying problem you intend to solve with your work?

    • How about a PDA which gets input by reading your lips -- and perhaps even allows you to do text searches on everything that's been said to you by a specific person? A map utility which can map an optimal route from one point to another, taking into account traffic and time of day? A general scientific system which solves large systems quickly by massively parallel simulated annealing? A computer which could actually solve the trade difficulties of the Poictsome system, and perhaps predict in advance the fall of the Federation and its replacement by Empire?

      A computer which, when asked whether there is a God, would simply answer, "present." ;-)

      The possibilities are boundless. Seriously.

      -Billy
  • by JanneM ( 7445 ) on Tuesday August 28, 2001 @12:33PM (#2225988) Homepage
    I learned forth early on in my programming career; it was very memory and CPU efficient, something that was important on early microcomputers. It was also a great deal of fun (though far less fun to try and understand what you wrote a week earlier...). Today, even small, cheap microcontrollers are able to run fairly sophisticated programs, and it is far easier to cross-compile stuff on a 'big' machine and just drop the compiled code onto the development board.

    Forth has (in my eyes) always been about small and efficient. Today, though, embedded apps are more likely to be written in C than in forth, and the "OS as part of the language" thing isn't as compelling today as it was in the eighties. Where is forth being used today, and where do you see it going in the future?

    /Janne

    • Forth is small and efficient enough that the UltraSPARC PROMs contain a small interpreter. You can write Forth code and store it in non-volatile RAM, to be executed at powerup.

    • Actually, it goes beyond the UltraSPARC. It has been around in Suns since the Sun-4s. I'm not sure that it was in the original boot monitor for the Sun-4 chips, but Sun-4c and later all included a Forth interpreter. And as others have mentioned, many PPC computers have a Forth interpreter, as part of the OpenFirmware standard.
    • Follow the colorforth link to "early binding" and you'll see what Chuck Moore really values: tiny and fast. Portability, readability, ease of programming, and even ability of the program to handle unexpected inputs take second place to efficiency, according to Moore.

      If you're programming a microprocessor-controlled toaster, and will sell a million toasters, then using forth might let you use a $0.50 microcontroller instead of a $0.75 one, saving $250,000. That's certainly worth some extra programming effort. This is called "embedded programming". And, believe it or not, there are several guys coding the smallest-scale embedded programs for every one coding PC and server operating systems and applications. (I suspect embedded programmers are grossly underrepresented in surveys, because most of them are EEs and also do circuit design, and so show up as design engineers rather than programmers. Likewise, few of the many people customizing databases to fit the needs of each and every corporation are called "programmers", although they do sort of code, and there are a heck of a lot of them.)

      For bigger jobs, things are a bit different. I don't disagree with Moore's comments about bloatware; it's just that I think there's a middle ground between Win ME (a bloated program with a 20-year history during which nothing was ever taken out, including bugs) and tiny OS-less applications where the programmer sweated for each byte saved. And I think that for less than million-unit quantities, that middle ground is more cost-effective -- otherwise we'd all still be using assembly language, except for the Forth programmers. Apparently Forth can have a smaller memory footprint than good assembly; it's not going to run as fast, but there never have been many good assembly programmers, and Forth might beat poorly written assembly. At any rate, for toasters, washing machines, ATMs, and about 90% of the other 8-bit embedded systems, it doesn't matter, because the thing being controlled is thousands of times slower than any reasonably written software.

      I don't have experience working with Forth, but it looks like it would be much harder to work in than C, mainly because algebraic notation is much easier for humans to comprehend than reverse polish notation. (RPN is easier for computers to comprehend -- that's why the first scientific calculators used it, and why Forth is so compact.) In some respects, they have similar capabilities: they let you work on an extremely low level when you have to, and they let you make horrendous mistakes. (I can't imagine a compiler that would work for direct hardware control that didn't have those possibilities.) They also let you work at a higher level. It's claimed that Forth can go to a higher and more abstract level than C or even C++, and this might be true, but I don't see how I could ever program Forth without continually thinking about the stack, and I can shove the details into C functions and work at a pretty high level in C. C produces much larger machine code than Forth, but at the present prices of memory, working harder on the code to save memory takes something like 100,000 units sold to pay off. Forth is sort of interpreted, very quickly, while C is compiled into pretty fast machine code. C should be faster, but there's a lot more overhead to C, so Forth would win sometimes. And of course, a good algorithm in a slow language beats a poor algorithm in expertly hand-tuned assembly -- will Forth's general weirdness make it harder to find and apply good algorithms?
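
      For instance (my own toy comparison): where C writes r = a * (b + c);, Forth writes

      : r ( a b c -- a*[b+c] ) + * ;

      The dataflow is identical, but you read the Forth strictly in execution order while keeping the stack picture in your head.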

      Am I biased? Too many people merely like the first language they met, but I think I'm about as unbiased as it is possible to be while still having some experience: I first learned programming in FORTRAN and COBOL long before these languages were "structured", then BASIC, APL, assembly, many more dialects of BASIC and assembly, a little Pascal and LISP, C, and Labview. I'm not wedded to any particular paradigm of programming languages, not even to using the Roman alphabet, but I do have to admit that C resembles FORTRAN in several ways (algebraic notation, printf formatting, the simpler variable declarations), and Forth doesn't resemble anything at all.

      One final note: Forth does seem like a great language for p-codes, that is, compiler output that is not machine code. P-code can be machine and OS independent, and protection against rogue code can be built into the interpreters. There's a performance loss, but since MS's bloated code has pushed everyone into buying ridiculously overpowered desktop boxes, does it matter?
      • I ask this question precisely due to the reasons you give above. I learned forth before learning C, actually, and I'm still fairly comfortable using it if the need arises. My problem is sort of that the need very rarely arises anymore.

        For almost any microcontroller, what you will get, development-wise, is a C compiler (or an assembler in the case of signal processors, as you _want_ to get that close to it to realise its benefits). About the only advantage forth seems to have in the embedded space today is that the code can be even smaller than assembler, and that advantage is being eroded.

        At the same time, forth is Way Cool(tm), and it would be a crying shame to see the ideas slip away. The only thing I believe forth really has against itself is that the choice of keywords is... non-intuitive, let's say. From a readability standpoint, a language that allows you to define a useful, non-trivial function using only punctuation is not optimal. Now and again, I even get this urge to write a forth-like shell before I sober up and come to my senses :-)

        /Janne
      • algebraic notation is much easier for humans to comprehend than reverse polish notation

        I guess you haven't taught or tutored any algebra classes? So many programmers imagine that the way they've learned to think so well is the only way to think. Both forms are hard to work with; however, the Forthlike form is easier to express action in (since it's chronologically ordered, with no execution occurring out of order) and easier to refactor (since most refactorings require only cut'n'paste, with no possibility of code breakage); the Algol- or Lisp-like form is easier to do certain other transformations in (since the arguments of a function are 'tied' or applied to the function by hard syntax).

        The theory of Forthlike languages is brand new, in spite of Forth's age and Postscript's overwhelming success; it's discussed at the Joy page [latrobe.edu.au].

        will Forth's general weirdness make it harder to find and apply good algorithms?

        Forth's weirdness is explicitly tailored to help the programmer find and apply good algorithms. Let me list some ways:

        • interactive, to encourage experimentation
        • fast compile/run cycle, to encourage testing
        • programmer can write code to execute at any time: while editing, while compiling, while defining a word, while parsing, while generating code, and (of course) at runtime. This allows unit tests to be run as part of the compilation process.
        • syntax is infinitely mutable: if your problem requires BASIC or FORTRAN notation, just write (or load in) a parser and use it. To write an application in Forth, you first write a language in which the problem appears natural; then you naturally solve the problem (see the sketch after this list).
        • refactoring is trivial, natural, and mostly free of the possibility of error.
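
        A tiny sketch of those last two points (my invented example, in plain ANS Forth rather than colorForth): grow the vocabulary the problem reads naturally in, then let a check run right at load time:

        \ first, words the problem domain actually speaks...
        : seconds ( n -- ms ) 1000 * ;
        : toast-time ( -- ms ) 90 seconds ;
        \ ...then, since source is just code that runs, a unit
        \ check right in the file:
        toast-time 90000 <> [if] .( toast-time is broken! ) [then]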


        Am I biased?

        You have an awesome list of languages, but all of them operate on the same basic system: functions syntactically take parameters. Forth, together with Postscript and Joy, is /completely/ different; its functions aren't syntactically tied to its parameters. This causes it to behave completely differently from the "applicative" languages you list. The author of the Joy page I linked to earlier calls the set of Forth-like languages "concatenative", since any concatenation of valid programs is also a valid program, AND any dissection of a valid program along token boundaries is also a valid program.

        Read the Joy page -- I found it mind-stretching. It's good for a programmer to know some truly _different_ languages, which encourage truly different thinking.

        -Billy
      • Interesting post.

        Apparently Forth can have a smaller memory footprint than good assembly; it's not going to run as fast, but there never have been many good assembly programmers, and Forth might beat poorly written assembly.

        Every high-level language beats poorly written assembly. Believe me, I know. I've seen too much of it.

        Apart from that, though, Forth in its original P-Code form is tighter than assembly. At least 33% tighter, and often as high as 90% tighter, particularly in inefficient assembly codes like pre-386 Intel.

        I first learned programming in FORTRAN and COBOL long before these languages were "structured", then BASIC, APL, assembly, many more dialects of BASIC and assembly, a little Pascal and LISP, C, and Labview.

        Wow, you admit to learning in COBOL. I am in awe of your bravery. :)

        Horrible languages designed for appeasing the needs of suits aside, FORTRAN, COBOL, BASIC, Pascal and C are all Algol-family languages. Assembly of course is its own breed. LISP is also. I don't know what Labview is.

        It is true that Forth is unique. It's not an Algol-family language, like most languages are, outside of assembly. This is both a strength and a weakness. It's a strength because the way Forth does things is, well, better; it's a weakness because there are no commonly-used operating systems written in it, and it sometimes can be a real pain in the butt calling OS routines from Forth.

        The weirdness you reference does not make it harder to do algorithm design in Forth. It makes it easier. What it makes harder is learning Forth in the first place. If it wasn't for Leo Brodie (some of whose books [amazon.com] are still in print; unfortunately Starting Forth is not) I doubt there would be nearly as many Forth programmers as there are.

        Forth does seem like a great language for p-codes, that is, compiler output that is not machine code. P-code can be machine and OS independent, and protection against rogue code can be built into the interpreters.

        Forth is the original language for p-code, as far as I know. I am not sure what you mean by rogue code. Normally, Forth compiles to tokens. Each token is a numerical referent to a routine. For example, here's a sample (rather goofy) Forth program:

        : additiondemo
          4 5 + .
        ;

        the delimiter between commands is either a space or a CR, so that's actually four commands compiled there (six if you count the compiler control commands). : means compile a new command, and takes the name of the new command as the argument following it (there are a few commands in Forth which do that). therefore, this is a new command named "additiondemo."

        and by the way, that's what all Forth programs are. new commands. so as you write programs, your Forth just gets bigger. and bigger. there are tools available to reduce your Forth code to the minimum necessary to run a particular command, and then run that command immediately on startup. this is referred to as turnkeying. but you don't have to do that. you can just start Forth on bootup and keep all your programs in RAM all the time.

        so back to our program. the command "4" puts the number 4 on the top of the stack. the command "5" does a similar operation. the command + adds the top two numbers on the stack, and places their total back on the top of the stack. (Therefore, at this point there's a 9 on the stack.) the command . pops the number on the top of the stack, and outputs it to the screen (well stdout really).

        finally, the command ; says time to stop compiling, write the end of the command, and return to interpreter mode.

        anyway, there ya go. that's basically how it all works. that command above should output a single 9, and leave the stack in the same condition it found it.

        with most Forth implementations, the 4 and 5 commands would compile to assembly. actually, probably so would the + - but it could compile to a P-code reference to the + command. the . command usually compiles to a P-code reference. result? this additiondemo should come in under 10 bytes on most architectures.

        this is also why Forth only looks like it's using RPN. in reality, the + command is just a command like any other: it takes its arguments from the stack. it just so happens that it looks like RPN, most of the time. but you can do some rather strange things with +, especially if you have pointers on the stack. some of these things can be useful, although their implementations are usually very hairy.
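
        an example of the useful-but-hairy side (contrived): because + just takes whatever is on the stack, address arithmetic looks exactly like ordinary arithmetic:

        create buf 10 cells allot \ a 10-cell array
        : buf@ ( i -- x ) cells buf + @ ; \ + applied to a pointer

        nothing in the syntax distinguishes adding numbers from indexing into memory. that's the power, and the danger.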

  • Somewhat off topic, but the Open Firmware system used to boot all current Sun, Apple Macintosh and I think also IBM machines is a Forth implementation.

    Basically it's a Forth interpreter with a stack, and a device tree. You can literally 'cd' and 'ls' around the PCI and other busses on Sun workstations.
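
    For example, at the "ok" prompt on a Sun box you can poke around interactively (this is from memory, so treat the exact spellings as approximate):

    ok show-devs
    ok dev /pci
    ok ls
    ok .properties

    and every one of those is just a Forth word.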
  • by sulli ( 195030 )
    umm, that's an old communications protocol [cisco.com]. I think you mean 25X.
  • by Midnight Ryder ( 116189 ) <midryder@midn[ ] ... m ['igh' in gap]> on Tuesday August 28, 2001 @12:39PM (#2226017) Homepage

    This one would probably require a bit more time to answer than you have available, but a quick rundown would be cool: Where do you see programming languages headed -vs- where do you think they SHOULD be headed? Java, C#, and some of the other 'newer' languages seem to be a far cry from Forth, but are languages headed (in your opinion) in the proper direction?

  • by Anonymous Coward
    Chuck,

    I've read everything on your site & also Jeff Fox's Ultratechnology.com site about your Minimal Instruction Set Chips, their design, performance etc.

    What advice and tools would you recommend to anyone today starting out and wanting to follow and build upon the path that you've set out?

    Very Interested
  • What is Forth? (Score:2, Interesting)

    Someone on Slashdot has recommended Forth to me as "perfect middle ground between ASM and C." I have looked at the FAQs and could not find a quick-and-dirty overview of the language.

    I am looking for the simplicity, control, and elegance of ASM. But I also would like to enjoy some degree of abstraction and features that reduce the drudgery of programming. I have looked at HLA and Terse but they are platform-dependent, unless I write my own compiler. Do you think Forth meets these criteria?

    Another thing. Just from peeking at the FAQ I see Forth uses postfix expressions (among other things), which seems a little dated. I assume this was implemented for compiling on resource-constrained machines? Do you plan on giving Forth a minor face-lift?
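
    For reference, the postfix style in question looks something like this (my own transcription, so caveat lector):

    : square ( n -- n*n ) dup * ;
    : sum-of-squares ( a b -- c ) square swap square + ;
    3 4 sum-of-squares . \ prints 25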
  • Quick question (Score:4, Interesting)

    by jd ( 1658 ) <imipak@[ ]oo.com ['yah' in gap]> on Tuesday August 28, 2001 @12:45PM (#2226047) Homepage Journal
    I have often conjectured that multi-threaded processors (ie: processors that can store multiple sets of internal states, and switch between them) could be useful, as the bottleneck moves from the processor core to communications and dragging stuff out of memory.


    (If you could microcode the "instruction set", all the better. A parallel processor array can become an entire Object Oriented program, with each instance stored as a "thread" on a given processor. You could then run a program without ever touching main memory at all.)


    I'm sure there are neater solutions, though, to the problems of how to make a parallel array useful, have it communicate efficiently, and yet not die from boredom with a hundred wait-states until RAM catches up.


    What approach did you take, to solve these problems, and how do you see that approach changing as your parallel system & Forth language evolve?

  • The 25x concept looks like it could really be a damned interesting idea. But one of the questions in my mind is where you want to head with it. Is this something that is to be used for very specialized research and scientific applications, or is this something that you envision for a general 'desktop' computer for normal people eventually?


    Secondly, if you are considering the 25x for a desktop machine that would be accessible by people that aren't full-time geeks, what about software? Forth is a lost development art for many people (it's probably been 10 years since I even looked at any Forth code) and porting current C and C++ applications would be impossible - or would it? Is there a potential way to minimize the 'pain' of completely re-writing a C++ app in colorForth for the 25x machines, which could help to speed adoption of a platform?

  • First: Great to see you here. I really enjoyed a FORTH-like language on my Altair 8800 (written from an old Byte book, "Threaded Interpretive Languages", or TILs) because BASIC was SLOW and assembly tedious.

    Only question I have is about the choice of ShBoom for a microprocessor - any story behind that??

    I sure wish I could get a uP like an NC4000, RTX2000 or PSC1000 - inexpensively.
  • I played with forth back in the early 80s on an Apple II. It's really a remarkably powerful and efficient programming system. I was able to write programs in forth that ran at nearly the speed of the equivalent assembly language, but were much easier to read and maintain.

    Two excellent books worth finding, both of which are probably long out of print:

    "Starting FORTH", Leo Brody, Prentice Hall, 1981.

    A very well written book aimed at the absolute beginning programmer. Brodie uses cartoon drawings to illustrate the operation of the forth operators, and over the course of some 350 pages, explains not only how to program in forth, but how the language works under the covers and how to extend the compiler. Highly recommended and extremely novice-friendly.

    "Threaded Interpretive Languages", R.G. Loeliger, Byte Books, 1981.

    A more technical work, Loeliger describes and explains the implementation of an almost-but-not-quite-FORTH language. The book contains (and explains) the full source code, assembly as well as high-level, for the interpreter.

  • What is Forth (Score:3, Interesting)

    by mcelrath ( 8027 ) on Tuesday August 28, 2001 @12:53PM (#2226103) Homepage
    This is going to be a stupid question...but one I suspect many will have.

    What is Forth? Why is it useful? How fast is it in terms of useful computations? X MIPS, when comparing minuscule Forth instructions to CISC Intel instructions, isn't really a good comparison. So how many *useful* computations can it perform compared to modern processors? What has it been used for in the "real world"?

    I recall a company creating a transputer -- basically an array of FPGA's, all doing 4-bit add operations, and claimed X thousand MIPS, where X is large. How are Forth machines different?
  • I've followed you at http://www.ultratechnology.com for years now and I'm a big fan of your outlook on technology. Now, when are we going to see your chips in use? Your cpu would be great in a game box (for example). If your stuff is being used, please tell us how it is being used and for what purpose. Thanks.
  • I love Forth, always have. From the first hand-entered FIG listings to the excellent Win32Forth, I've known this is the "right" computer language. So why doesn't the rest of the world see what is so abundantly clear? Is it because they can't make the small mental leap to RPN?

    And why isn't Forth used more as a platform? Is it speed, security, advertising, what? I've never understood why the Forth community will take an excellent implementation right up to the point of being useful, then leave it without developing any applications. I can see an efficient, user configurable web cruiser built on any one of a number of Forths. But nobody has done it. Ditto for httpd servers. Why?

    And to the rest of the world, please stop parroting the old line about Forth being hard to read. It isn't. You can pick up most of what you need to know in an afternoon, then begin to enjoy some very elegantly stated code.
    • "why isn't Forth used more as a platform?" Simple answer: it looks too darned weird. I sometimes do small embedded programs where Forth's tiny memory footprint would be advantageous, and it would be more than fast enough. But my boss isn't about to let me use it. He doesn't want to learn to read it, and if something happened to me, the nearest guy that could maintain Forth code might be 200 miles south of here...

      Other reasons: It isn't advertised. It isn't standardized. (I doubt this matters much since it's pretty easy to add new words to any available Forth as needed to support a program from another dialect, but this sort of thing scares away managers who don't have time to listen to the details.) It allows you to do horrible things like writing a subroutine that removes too many or too few items from the stack. (Sort of like the horrible mistakes C programmers make with pointers, malloc and free, == vs =, etc, but we're talking about managerial perceptions again...) Some versions of Forth store your tokenized source code right in the executable program, so you can't protect your "trade secrets." RPN really is hard to work with, at least for me (with the early HP scientific calculators). But the basic reason is that it looks weird and this inclines people to find fault...
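
      To illustrate that last horrible thing (a contrived example), nothing stops you from writing:

      : average ( a b -- avg ) + 2/ dup ; \ claims one result, leaves two

      The stack comment is pure documentation; no compiler checks it, and the extra item sits there waiting to corrupt some caller far away.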
      • RPN on an old calculator and RPN on a computer are extremely different.

        On old HP calculators, you could only ever see the bottom item on the stack. Using these calculators was difficult because you had to think about "registers" that you couldn't see.

        A good RPN calculator on a computer (or an HP graphing calculator) lets you see almost everything that goes on with the stack. This is when RPN begins to make sense.

        And if you're _programming_ in an RPN language like Forth, then the stack is what you make it. If you switch things around on the stack at bizarre times, it will be hard to work with. If you think of stack slots like function inputs and outputs, it's easy to work with.
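
        For example (toy code), treat the stack comment as the signature:

        : f>c ( fahrenheit -- celsius ) 32 - 5 9 */ ;
        212 f>c . \ prints 100

        Read ( fahrenheit -- celsius ) the way you'd read a C prototype, and the stack stops being mysterious.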

        But you did hit on what are probably the main reasons Forth isn't used: tradition, and the fact that Forth is closer to assembly than C, and as such it would not be obfuscated enough in binary form.
  • What part do you see Forth playing in the future of GNU/Linux and other FreeSoftware and OpenSource operating systems (*BSD, *[A-Z]*[T/N]*[X/CS] etc)?

    How do you foresee such a synergy affecting the popularity of both parties?

    -Marvin
  • * 5 x 5 array of cores: 60,000 Mips

    ...

    * Max power 500 mW @ 1.8 V, with 25 computers running


    500 milliwatts is .5 J/s. Divided by 60,000 million instructions/second, that implies this can execute 1 instruction while consuming only 8.3e-12 Joules of energy. What I'd like to know: Pretending for a moment that the instruction was simply to flip a single bit, how close does this come to the absolute limit dictated by Information Theory?
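
    (For anyone checking: the usual "absolute limit" here is Landauer's bound for erasing one bit, E_min = kT ln 2 = 1.38e-23 J/K x 300 K x 0.693, or about 2.9e-21 J at room temperature. At roughly 8.3e-12 J per instruction, the chip would still be nine or ten orders of magnitude above that thermodynamic floor -- assuming Landauer's limit is the bound meant here.)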
  • It may not be clear to the rest of the Slashdot audience that they are asking questions of a legend of the field. The FORTH language could easily have been what C became, and only luck decided the fate of each language.

    I was truly amazed when I first found a FORTH compiler for the Apple II. It was so alien to everything else available, yet so advanced, so ahead of the pack.

    So, as for a question, do you think the growth of the appliance and handheld markets can give FORTH a chance to achieve a mainstream status? What steps are being taken to bring FORTH compilers to Palm OS, Windows CE and such?
  • Do people actually buy stuff from all of those pop-unders* you guys have ON EVERY STINKING WEB PAGE?!?? Does it really fit in the palm of your hand? Neat!


    *Yes, I know that is X10. It is a joke. Live a little.
  • Ahem (Score:2, Funny)

    For-- is wel- kno-- for sto---- all key----- as thr-- let---- and the len--- of the nam-. Why was thi- des--- dec----- hel- ont- eve- aft-- mem--- bec--- che-- eno--- for spa-- not to be an iss--?
    • Re:Ahem (Score:2, Interesting)

      (Which, for the sake of people reading the above and going 'huh?', actually reads:
      "Forth is well known for storing all keywords as three letters and the length of the name. Why was this design decision held onto even after memory became cheap enough for space not to be an issue?"
      Trust me, Moore will understand it without the need for translation...)
  • by PureFiction ( 10256 ) on Tuesday August 28, 2001 @01:20PM (#2226242)
    The 25X system reminded me of IBM's Blue Gene computer, where a large number of inexpensive CPU cores are placed on a single chip.

    The biggest problem in dealing with a large number of small cores lies in the programming. I.e. how do you design and code a program that can utilize a thousand cores efficiently for some kind of operation? This goes beyond multi-threading into an entirely different kind of program organization and execution.

    Do you see Forth (or future extensions to Forth) as a solution to this kind of problem? Does 25X dream of scaling to the magnitude that IBM envisions for Blue Gene? Do you think massively parallel computing with inexpensive, expendable cores clustered on cheap dies will hit the desktop or power-user market, or forever be constrained to research...
  • Most of the questions I see here are FAQs (e.g. "What is Forth?") that you don't need to ask Chuck. It seems that people don't really understand what he has been up to, so here's a general overview.

    He did create Forth, yes, but that was thirty years ago. And while Forth has been relatively unchanged for the last twenty years, Chuck has kept evolving the language in a quest for the minimum interface between a human and a computer. The "OS" talked about in the intro is only a couple of kilobytes (yes, kilobytes).

    He works not just on software, but does true systems work: a combination of software and hardware. And that is what he is trying to minimize. The system as a whole, not just a programming language. He has been designing processors hand-in-hand with stack-based languages. So he can do things like write a compiler for his language in a hundred lines of code. And he has a chip that uses _milliwatts_ of power and only 15,000 or so transistors.

    If nothing else, realize that Chuck is one of the few people single-handedly creating microprocessors. And he's way, way out there. Remember the recent Slashdot post about asynchronous logic? Chuck has been designing chips without proper clocks for ten years now.

    My question to Mr. Moore: Linux is seen as a more stable and reliable alternative to Windows, but at the same time I wonder if it's real progress or just a similar incarnation of a traditional operating system. Is the concept of "operating system" outdated?
  • I don't think anyone will argue with the fact that Forth programming can allow one good programmer to generate small programs with amazing functionality compared to other languages, but do you feel that Forth's advantages map well onto large-scale projects? Forth has obviously been used for complicated embedded projects, but has it ever been used to develop a GUI word processor or CAD program from scratch? If so, did it hold onto its small-application virtues, or if not, do you feel that it would?
    Thanks for delivering a language that has proven to be great on memory constrained systems for years.
    • Forth has obviously been used for complicated embedded projects, but has it ever been used to develop a GUI word processor or CAD program from scratch?

      Chuck's CAD system used to design his processors is written in colorForth.
      • "Chuck's CAD system used to design his processors is written in colorForth." That is really eating your own dogfood! I have some idea of what goes into writing a schematics and board layout CAD system from watching a couple of vendors struggle through the transition to GUIs. IC design is much harder. Amazing to see it done by one man. Can I download a copy somewhere?

        Note that it wasn't used to design something equivalent to a Pentium. I think it was used to design a much simpler but not too slow CPU, and then replicate that 25 times with interconnections. And the user interface is often the hardest part of a program; Moore may have left that as simply a text console or something that takes a lot of work for anyone but the programmer to master. But still, I would think that writing a CAD program to do even that much well would take a large programming team, and various routing algorithms that are held as trade secrets or patents by large corporations.

        Mr. Moore, are the specs or a demo on the web somewhere?
  • by nate37 ( 171012 ) on Tuesday August 28, 2001 @01:28PM (#2226280)
    Chuck,
    What are your views on Object-Oriented programming and how it would relate to forth?
  • From reading your webpage, it appears that the 25X is a 5x5 array of asynchronous processors that execute an assembly language very similar to Forth. How do they communicate and share information to do useful computations?
  • It seems from the X18 architecture and the general format of Forth that it would be efficient at executing Java bytecodes, or at least be a good target for on-the-fly translation, since the JVM is also a stack machine. Java even has enough multithreading that it might be able to make use of having 25 processor cores on a chip.

    Have you looked at Java as a high-level language for these systems or at Java bytecodes as a way to make common software available to users?
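
    The correspondence really is close at the instruction level. For a trivial int add(int a, int b) { return a + b; }, the compiler emits (illustrative, via javap):

    iload_1 iload_2 iadd ireturn

    which is, almost instruction for instruction, the Forth word

    : add ( a b -- sum ) + ;

    so a translator's work would be mostly register allocation and threading, not deep semantic mapping.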
  • The 25X has lots of MIPS for grinding away on small amounts of data in each CPU's stack, but it looks like getting data on and off the CPUs from memory is the bottleneck, especially since the CPUs implement this in software - this means it can crunch very hard on very localized bits of data, but it's tough to give a single CPU enough for, say, a Fast Fourier Transform (at least for the interior CPUs). What kinds of applications work well on this sort of machine? How much cache is it practical to build around it? Has anybody built a bignum multiplier with it?

  • I used Forth a couple of times in my younger days, for a PC data collection board, and STOIC, a VAX/VMS Forth in which an excellent editor was written. So I'm familiar with some of Forth's strengths.

    But Forth hasn't taken off, either in the general market or in its target market. In the same time period, numerous other languages have either become quite popular or have become well established in niches: C++, Java, Perl, Python. Companies like Cygnus have made mucho $$ supporting C in embedded environments, supposedly a natural niche for Forth. And research projects which involve, say, downloading codelets into an operating system to filter network packets tend to use Java or interpreted C instead of Forth.

    If you could somehow wave a magic wand and create projects to make Forth popular, what would you do? What vendors would begin to offer Forth as an alternative, what killer open source projects could be done far more efficiently with Forth, and what great benefits could firewall vendors create by letting admins add little arbitrary packet filters written in Forth?
  • Your philosophy seems to be one of minimizing the solution and the problem at the same time. That is, not only do you write the least amount of code to solve a problem, but you also reduce the scope of the original problem to make a solution easier, sometimes drastically. What are the limits of this philosophy?
  • It's good that someone is opposing ever-growing processor complexity. But is this useful?


    Moore rejects most of the innovations in computer architecture of the past 20-30 years. No superscalar execution units. No pipelines. No caches. No floating point. No huge memories. Just simple little stack machines with high clock rates.


    Nobody seems interested. Not even the digital signal processing people, who should like the repeatable timing and be willing to put up with the tiny memories.


    So the real question is, what is this for?

  • You have talked of writing a tiny web browser, one in line with your view that you can write any application in 1000x less code than generally available solutions. What would your approach be to writing such a browser?

  • What happened to the "Ask FCC Chief Technologist David J. Farber" interview questions (http://slashdot.org/interviews/01/01/22/1349237.s html) posted on Jan. 22nd? I am still waiting to hear what benefit citizens gained when deregulation allowed Clear Channel's stranglehold on public airwaves to expand to over 1,000 radio stations. Why ask for interview questions if you aren't going to follow up with an interview...

    maru
    www.mp3.com/pixal
  • With many parallel threads and an internal cycle time much faster than external SRAM access time, won't this chip have a lot of trouble managing locks in parallel applications?

    Will this restrict the set of applications for which this chip is useful, or have you come up with a clever solution to the problem?
  • Your philosophy of coding emphasises (or seems to me to emphasise) many of the same things recently elaborated by the new "fad", Extreme Programming [xprogramming.com]. What's your opinion on this rediscovery? How strong is the correspondence?

    How, if at all, does Forth help you to do things like refactoring and unit testing in ways that other languages don't?

    -Billy
  • There used to be three major language families: Algol, Lisp, and Forth-types (lots of variants on these and lots of other minor families, I know). Forth and Lisp were popular enough to support hardware systems running them natively, but in the end the Algol family (in the form of C, C++, Java, Perl, etc.) became dominant.

    Why?

    Many people complain about the RPN, and many others argue that that factor can't be enough of a reason to spurn a language of the quality of Forth; but is it actually that maths as taught in schools the world over really does give languages like C a huge "familiarity bonus"?

    • True, in the early days most programmers had degrees in a mathematical field (math, physics, engineering), and that certainly biased them towards the familiar-looking Fortran, Algol, and their descendants. But I think it's not just that the algebraic-type languages look more like the math you learned in school, but that algebraic notation went through centuries of evolution to become as easy for humans to read as possible.
      Another factor is that Algol and its descendants visually break up the code more than Lisp and Forth usually do, making it easier to see the structure. (This depends on the programmer, of course -- anyone with a smidgen of artistic ability and any concern for those who must follow him can add white space to make any language look good. But the average hacker seems to be lacking in either artistry or concern for maintainability. I've even seen Pascal code run together until it was unreadable. But the varied grouping elements in C (parentheses and curly brackets) give it some visible structure even when the indentation is snafu'd. Lisp's all-parentheses style is harder to parse visually, and I've seen Forth programs presented as a single string with no breaks at all.)

      Who is saying "what's Algol?" It was a structured language created by a committee of mathematicians around 1960, before anyone thought to call it "structured programming". It had most of the ideas you find in Pascal, C, and Modula 2, ideas which also filtered back into the originally unstructured FORTRAN, COBOL, and BASIC languages. It also had a lot of ideas which turned out to be either unimplementable or just plain bad. If you want to see just the good parts of Algol, learn Pascal.

      It had blocks separated by begin and end, which allowed you to replace the single statement controlled by an if with a whole group of statements (C turned that into {} for faster typing), plus the whole nine yards of "structured" languages (while, for, etc.). For wasn't new, FORTRAN and COBOL already had equivalent loop statements, but using blocks instead of some hokey syntax to delimit the loop was new. Algol might have been the first to make the variable names in a subroutine independent of the names in the main and other subroutines. It had rules for variable scope that only a mathematician could love -- e.g., you could define a function inside another function inside main, and it would have access to the outer function's and main's variables, besides having its own local variables. (C simplified this to each function having its own distinct set of variables, and blocks being allowed to have local variables plus inheriting the next level's.) It introduced := for assignment, to distinguish this from = for comparison in an if statement. (C kept the single character = for the most common usage and == for the less common comparison operator.) It had several different ways to pass an argument to a function: call by value (the only one C uses directly), call by reference (passing a pointer in C, but C++ brought back references in a more general way), and call by name. (Wirth, the Swiss creator of Pascal and Modula 2, is supposed to have explained how to pronounce his name by saying, "You can call me by name, "Veert", or you can call me by value "Worth". You never heard a professor of mathematics and computing tell a joke before?)

      Call by Name seems to have been both hard to implement and a basically bad idea. Since a function could access the variables in the calling program, all the way up to the main program, the original language definition seems to say that a function could actually call a subordinate function that would invisibly change the value of the first function's argument, and if it was by name the first function should use the new value henceforth. Uhhg! The second edition of the standard (which I long ago discovered marked down to about 10 cents in hardcover and still have packed somewhere...) discussed this and various other language features that were impractical with the compiler technology of the time, but didn't actually say what to do about it. So I guess each compiler writer implemented a different subset of Algol. Fortran and Cobol maintained fairly good compatibility; Algol started out incompatible, so it died except as an inspiration to others.

      In the 1970's, Wirth finally handled it right: he named his favorite Algol subset "Pascal" and pushed it as a standard language. But it didn't compete too well with the more hacker-friendly C. It told you when you f*d up and didn't let the program compile, while C just assumed you knew what you were doing... Borland gave a version of Pascal a new lease on life by writing remarkably fast compilers for it (after, I presume, changing whatever part of Wirth's standard made it slow to compile but mathematically correct) and selling them cheap, but it was still treated mainly as a toy language -- good for learning to program, but to do real work you used C which didn't complain when you did something odd with a pointer. Even though most of the time it was a mistake... But I don't think it's possible to write a hardware driver in Pascal, it will think your reads and writes to hardware are a mistake.
  • by Medievalist ( 16032 ) on Tuesday August 28, 2001 @05:20PM (#2227414)
    When I built my first Internet node, the web did not yet exist, and one of the amazing things about the Internet was how friendly it was to the blind.
    Now, with some computer experts estimating that over 50% of the Internet is incomprehensible to braille interfaces, and most computer operating systems devolving to caveman interfaces ("point at the pretty pictures and grunt") we seem to be ready to take the next step - disenfranchising the merely color-blind.
    I realize that colorForth is not inherently discriminatory, in that there are a great many other languages that can be used to do the same work. The web is also not inherently discriminatory, because it does not force site designers to design pages as stupidly as, for example, Hewlett-Packard.
    Would you care to comment on the situation, speaking as a tool designer? How would you feel if a talented programmer were unable to get a job due to a requirement for colored sight?
    --Charlie
  • I've been a user of several self-hosted language environments, and a friend of mine has done major Forth hacking for the past ten years or so. From what I understand, the beauty of Forth is the extensibility, the tiny footprint, and the closeness to the metal.

    Wouldn't this make Forth and similar small-footprint environments a natural choice for devices such as sub-$100 PDA's, and why does it seem that line of development is completely unexplored?

    -jhp

  • by Baldrson ( 78598 ) on Tuesday August 28, 2001 @06:38PM (#2227712) Homepage Journal
    In his 1977 Turing Lecture, John Backus challenged computists to break free of what he called "the von Neumann bottleneck". One of the offshoots of that challenge was work on massive parallelism based on combinator calculus [nec.com], a branch of mathematics that is far closer to Forth's formalism than parameter list systems (which are more or less lambda calculus derivatives). The prolific Forth aficionado Philip Koopman did some work on combinator reduction related to Forth but seems not to have followed through with implementations that realize the potential for massive parallelism that was pursued in the early 1980s by adherents of Backus's Formal Functional Programming paradigm. Given recent advances in hierarchical grammar compression algorithms, such as SEQUITUR [rutgers.edu], that are one step away from producing combinator programs as their output, and your own statements that Forth programming consists largely of compressing idiomatic sequences, it seems Backus's original challenge to create massively parallel Formal Functional Programming machines in hardware is near realization with your new chips -- lacking only some mapping of the early work on combinator reduction machines. It is almost certainly the case that you are aware of the relationship between combinator reduction machines and Forth machines -- and of Backus's challenge. What have you been doing toward the end of unifying these two branches of endeavor so that the software engineering advantages sought by Backus are actualized by Forth machines of your recent designs?
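
    (A concrete taste of the connection, in plain ANS Forth rather than a reduction machine -- my own sketch, not Koopman's: in a concatenative language, composition is concatenation, and a "combinator" is just a word that manipulates quoted programs on the stack.)

    : inc ( n -- n+1 ) 1+ ;
    : twice ( x xt -- y ) dup >r execute r> execute ; \ apply xt to x two times
    5 ' inc twice . \ prints 7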
