
Chuck Moore Holds Forth

A little while ago you asked Forth (and now colorForth) originator Chuck Moore about his languages, the multi-core chips he's been designing, and the future of computer languages -- now he's gotten back with answers well worth reading, from how to allocate computing resources on chips and in programs, to what sort of (color) vision it takes to program effectively. Thanks, Chuck!

FFP, Combinator Calculus and Parallel Forth
by Baldrson

In his 1977 Turing Lecture, John Backus challenged computists to break free of what he called "the von Neumann bottleneck". One of the offshoots of that challenge was work on massive parallelism based on combinator calculus, a branch of mathematics that is far closer to Forth's formalism than parameter-list systems (which are more or less lambda-calculus derivatives).

The prolific Forth aficionado Philip Koopman did some work on combinator reduction related to Forth, but seems not to have followed through with implementations that realize the potential for massive parallelism pursued in the early 1980s by adherents of Backus's Formal Functional Programming paradigm. Given recent advances in hierarchical grammar-compression algorithms such as SEQUITUR, which are one step away from producing combinator programs as their output, and your own statements that Forth programming consists largely of compressing idiomatic sequences, it seems Backus's original challenge to create massively parallel Formal Functional Programming machines in hardware is near realization with your new chips -- lacking only some mapping of the early work on combinator reduction machines.

You are almost certainly aware of the relationship between combinator reduction machines and Forth machines -- and of Backus's challenge. What have you been doing toward unifying these two branches of endeavor, so that the software-engineering advantages Backus sought are actualized by Forth machines of your recent designs?

Chuck Moore: What can I say? Backus did not mention Forth in his lecture. He probably didn't know of it then. Yet Forth addresses many of his criticisms of conventional languages.

He thinks a language needs, or benefits from, a formal specification. I grew up worshiping Principia Mathematica until I learned how Gödel refuted it. The result is that I distrust formal representations. For example, the ANSI Forth standard does not describe Forth, but a language with the same name.

Yes, I am struck by the duality between Lisp and Lambda Calculus vs. Forth and postfix. But I am not impressed by the productivity of functional languages. Even as research tools, they have failed to live up to their promise. By that I mean to do something with computers that I couldn't do more easily in Forth.
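
To see the duality in miniature, here is an illustrative sketch, not Moore's code: Lisp names a parameter, while postfix Forth names nothing and lets the stack carry the value.

    \ Lisp / lambda calculus:  (define square (lambda (x) (* x x)))
    \ Forth / postfix: no parameter name, just a stack transformation
    : SQUARE  ( n -- n*n )  DUP * ;
    5 SQUARE .   \ prints 25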

I designed the memory for the c18 to occupy the same area as the processor. This means small, fast and smart. c18 can respond to a bus request by fetching from its memory, accessing off-chip or performing a calculation. The 25x avoids the von Neumann bottleneck by making up to 27 memory accesses at the same time (2 off-chip). And its multiple buses do not substitute a network bottleneck for a memory one.

Standard code will be in the ROM of each computer. How this is customized in RAM, and how the computers are assigned tasks, is left to the ingenuity of the programmer, not a compiler. Automatically generated or factored code has never impressed me. Nor has automatic place-and-route for circuit boards or silicon. Both are an order of magnitude short of human performance, because humans understand the problem, judge the results and cheat as required.

Marginalizing of the blind
by Medievalist

When I built my first Internet node, the web did not yet exist, and one of the amazing things about the Internet was how friendly it was to the blind.

Now, with some computer experts estimating that over 50% of the Internet is incomprehensible to braille interfaces, and most computer operating systems devolving to caveman interfaces ("point at the pretty pictures and grunt"), we seem to be ready to take the next step: disenfranchising the merely color-blind.

I realize that colorForth is not inherently discriminatory, in that there are a great many other languages that can be used to do the same work. The web is also not inherently discriminatory, because it does not force site designers to design pages as stupidly as, for example, Hewlett-Packard.

Would you care to comment on the situation, speaking as a tool designer? How would you feel if a talented programmer were unable to get a job due to a requirement for color vision?

CM: I'm amazed at how effective blind programmers can be. I rely so strongly upon seeing the code that it's hard to imagine listening to it. Yet I know it can be done. Not being color-blind myself, I find it hard to appreciate the degree of information loss. But it's less than being blind.

My goal is to develop tools that augment my abilities. If others can use them, fine. It would be foolish to lose an opportunity to explore or excel just to conform to some egalitarian philosophy. Too often our culture seeks the lowest common denominator.

20/20 vision is required of fighter pilots. I have no qualms about requiring color vision for programmers. Everyone does not need to be a programmer.

But in fact, color is merely a property of words that helps to distinguish them. As is intensity, size, font, volume and tone. I'm sure colorForth will be translated into these other representations. I, myself, will be exploring spoken colorForth. (As soon as I can decipher PC sound cards.)

Massively Parallel Computing
by PureFiction

The 25X system reminded me of IBM's Blue Gene computer, where a large number of inexpensive CPU cores are placed on a single chip.

The biggest problem in dealing with a large number of small cores lies in the programming. That is, how do you design and code a program that can utilize a thousand cores efficiently for some kind of operation? This goes beyond multi-threading into an entirely different kind of program organization and execution.

Do you see Forth (or future extensions to Forth) as a solution to this kind of problem? Does 25X dream of scaling to the magnitude that IBM envisions for Blue Gene? Do you think massively parallel computing with inexpensive, expendable cores clustered on cheap dies will hit the desktop or power-user market, or forever be constrained to research?

CM: Forth is a massively pragmatic language: do whatever you can to solve a problem. Its strength is in the ease of violating whatever rules it has. The 25x is similarly pragmatic. I don't know how to program it yet, but I'm confident I can. It's just another level of factoring.

The parallelism provided by the 25x has a different slant from other parallel architectures. The computers are not identical. I expect many will have different ROM and different interface to the real world. This asymmetry is a powerful clue as to how applications will be factored.

A 10x10 array of 25x chips is an easy board to build. At 50 Watts, it needs as much power as a notebook. That's 2500 computers providing 6M Mips. I can't imagine programming them any other way than Forth.

The advantage of Forth in this kind of context is that it scales. Forth is the machine language, Forth is the high-level language, Forth is the task-control language, Forth is the supervisory language. Each of these has a different vocabulary, but they share syntax, compiler and programmer skills.
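
As a rough illustration of that layering, here is a sketch in ordinary ANS Forth; the words are hypothetical, and PORTA is a plain variable standing in for a memory-mapped register.

    VARIABLE PORTA                    \ stand-in for a hardware register (assumption)
    : LED-ON   ( -- )  1 PORTA ! ;    \ "machine level": poke the hardware
    : LED-OFF  ( -- )  0 PORTA ! ;
    : BLINK    ( n -- )  0 ?DO  LED-ON 100 MS  LED-OFF 100 MS  LOOP ;  \ application level
    10 BLINK                          \ "task level": the same syntax throughout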

Back to the array of 25x chips. Each chip could be on a vertical and horizontal serial bus with 10 others. A half-duplex bus requires a computer to manage, so that accounts for 200 computers. Now, whatever the application, data must be provided. Say gigabit Ethernet. Data (and program) is received, distributed and crunched. The assignment and coding of computers follows the data flow. Results are routed back to Ethernet, or displayed or whatever. It's a nice programming problem, well within the ability of a human to organize.

Will this ever reach the mass market? I don't know.

The direction of 25x Microcomputer...
by Midnight Ryder

The 25x concept looks like it could really be a damned interesting idea. But one of the questions in my mind is where you want to head with it. Is this something to be used for very specialized research and scientific applications, or is this something that you envision for a general 'desktop' computer for normal people eventually?

Secondly, if you are considering the 25x for a desktop machine that would be accessible to people who aren't full-time geeks, what about software? Forth is a lost development art for many people (it's probably been 10 years since I even looked at any Forth code) and porting current C and C++ applications would be impossible -- or would it? Is there a potential way to minimize the 'pain' of completely rewriting a C++ app in colorForth for the 25x machines, which could help speed adoption of the platform?

CM: At this stage the 25x is a solution looking for a problem. It's an infinite supply of free Mips. There's no obligation to use them all, or even very many. But they can effectively be used to eliminate hardware. To bit-bang what would otherwise need a controller. So if you want video or audio or radio or ...

The first applications will doubtless be embedded. These offer greater volume, less software and less market resistance than a general-purpose computer. I see 25x reaching the desktop as dedicated appliances rather than universal golems.

I'm not interested in recoding C applications. My experience indicates that most applications are hardware-dependent. The 25x is as large a change in the hardware environment as I can imagine. This changes the program so much it might as well be rethought and recoded. The most efficient way to do that is Forth.

Forth is a simple, interactive language. Its learning curve is steep with a long tail. You can be productive in a day/week. This depends only on how long it takes to memorize pre-existing words. Good documentation and management helps mightily. I'd rather train programmers than fight code translators.

That said, there are those who look at the mountain of existing applications and want to mine it. C-to-Forth translators exist and, with some pre/post editing, could produce code for the c18 core. How to distribute the application among 25 tiny computers would make a good thesis.

Quick question
by jd

I have often conjectured that multi-threaded processors (i.e., processors that can store multiple sets of internal state and switch between them) could be useful, as the bottleneck moves from the processor core to communications and dragging stuff out of memory.

(If you could microcode the "instruction set", all the better. A parallel processor array could become an entire object-oriented program, with each instance stored as a "thread" on a given processor. You could then run a program without ever touching main memory at all.)

I'm sure there are neater solutions, though, to the problems of how to make a parallel array useful, have it communicate efficiently, and yet not die from boredom with a hundred wait-states until RAM catches up.

What approach did you take, to solve these problems, and how do you see that approach changing as your parallel system & Forth language evolve?

CM: The 25x could implement a multi-thread application nicely indeed. Except that most applications expect more memory than a c18 core has. Whereupon memory remains the bottleneck.

It's important to choose problems and solutions that avoid using off-chip memory. Even so, with 25 computers to support, I expect that every memory cycle will be utilized. The computer controlling memory can be smart about priorities and about anticipating requirements. For example, it could guarantee enough access to support display computers.

And the nice thing about memory-mapped communication is that a computer need not be aware of its environment. It's an ordinary Forth program accessing data asynchronously. Delays are invisible, as is synchronization. Of course, due care is required to avoid lock-up loops.
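
In ordinary ANS Forth terms, the idea looks something like the sketch below. NEIGHBOR is a plain variable standing in for a mapped bus address, and the polling loop is an assumption; on the 25x the hardware itself would stall the reader.

    VARIABLE NEIGHBOR   \ stand-in for an address mapped to another computer (assumption)
    : SEND  ( x -- )  NEIGHBOR ! ;
    : POLL  ( -- x )  BEGIN  NEIGHBOR @  ?DUP  UNTIL ;  \ spin until a nonzero word arrives
    42 SEND  POLL .     \ prints 42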

These conjectures are fun. But in a year we'll have real applications to review. And a much better appreciation of the advantages and drawbacks of so many tiny computers.

Programming languages...
by Midnight Ryder

This one would probably require more time to answer than you have available, but a quick rundown would be cool: where do you see programming languages headed, versus where do you think they SHOULD be headed?

Java, C#, and some of the other 'newer' languages seem to be a far cry from Forth, but are languages headed (in your opinion) in the proper direction?

CM: I've been bemused by the preoccupation of new languages with text processing. I've been accused of not providing string operators in both Forth and colorForth. Indeed, I haven't, because I don't use them. Editing a file to pass on to another program never struck me as productive. That's one reason I chose pre-parsed source: to break the dependence upon strings and text processors.

Languages are evolving, as evidenced by the new ones that arise. But as with natural evolution, the process is not directed. There is no goal to approach nor any reward for approaching it. But whatever progress you might perceive, I don't. New languages seem only to propose new syntax for tired semantics.

These languages are all infix. Which is extraordinarily clumsy for anything but arithmetic expressions. And even those are comfortable only because we learned them in Algebra 101. Do you remember the learning curve?
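
For a small example of the contrast, illustrative only: the infix expression (a + b) * (a - b) needs parentheses and precedence rules, while the postfix version is a straight-line sequence of core words.

    : DIFF-SQUARES  ( a b -- n )  2DUP +  >R  -  R>  * ;   \ (a + b) * (a - b)
    5 3 DIFF-SQUARES .   \ prints 16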

Does everyone really think that 50 years into the computer age we have hit upon the ultimate language? As more and more C code accumulates, will it ever be replaced? Are we doomed to stumble over increasingly bloated code forever? Are we expecting computers to program themselves and thus save civilization?

I'm locked in the Forth paradigm. I see it as the ideal programming language. If it had a flaw, I'd correct it. colorForth uses pre-parsed source to speed and simplify compilation. This solves a non-problem, but it's neat and worth exploring. At least it proves I haven't gone to sleep.

What about memory protection?
by jcr

From the web pages, I don't see any mention of access control.

Can this processor be used in a multi-user, general-purpose mode?

CM: If you had a chip, you'd physically control access to it. It doesn't make sense for another person to share your chip. He can get his own. Certainly an individual c18 has too little memory to multi-task. And I doubt 25 computers could run 25 tasks.

But the 25 computers can certainly perform more than one task. They have to share resources: communication buses, off-chip memory and interfaces. Access is negotiated by the computer in charge of the resource. There is no hardware protection. Memory protection can be provided by the access computer. But I prefer software that is correct by design.

Communication with other computers, via internal or external buses, is subject to the usual problems of scheduling, routing and authentication. Internally, at least, my goal is to minimize delay rather than attempt protection. I anticipate spectacular crashes while software is developed. (Have you ever crashed 2500 computers?)

Where is Forth going?
by JanneM

I learned Forth early in my programming career; it was very memory- and CPU-efficient, something that was important on early microcomputers. It was also a great deal of fun (though far less fun to try to understand what you wrote a week earlier...). Today, even small, cheap microcontrollers can run fairly sophisticated programs, and it is far easier to cross-compile stuff on a 'big' machine and just drop the compiled code onto the development board.

Forth has (in my eyes) always been about small and efficient. Today, though, embedded apps are more likely to be written in C than in Forth, and the "OS as part of the language" thing isn't as compelling today as it was in the eighties. Where is Forth being used today, and where do you see it going in the future?

CM: Forth is being used today as it always has been. In resource-constrained applications. I think they will always exist. I'm creating some with the tiny c18 computers in the 25x. I imagine molecular computers will be limited when they first appear.

Personally, I don't mind losing a mature market that can afford abundant resources. Such applications aren't as much fun. But Forth isn't restricted to small applications. Even with huge memories and fast processors, small, reliable programs have an advantage.

The major project cost has become software, to the dismay of managers everywhere. On-time, bug-free software is the grail. Forth doesn't guarantee it, but sure makes it easier. Will this ever be convincingly demonstrated? Will management ever value results over procedures?

The currently popular language is selected by uninformed users. The only thing in favor of such democratic choice is that it's better than any other. But why would anyone want to debug 1M lines of code instead of 10K?

What's the next Big Computational Hurdle?
by DG

Now that sub-$1k computers are running in the GHz range, it seems that most of the computational tasks on a common desktop system are no longer processor-bound.

3D, rendered-on-the-fly games get well over 30 frames per second at insanely high resolutions and levels of detail. The most bloated and poorly written office software scrolls through huge documents and recalculates massive spreadsheets in a snap. Compiling the Linux kernel can be done in less than 5 minutes. And so on.

It seems that the limiting speed of modern computers is off the processor, in I/O. What, then, do you foresee coming down the pike that requires more processor power than we have today? What's the underlying problem you intend to solve with your work?

CM: Memory is cheap. I don't mind wasting memory as long as it's not full of code that has to be debugged.

Likewise, Mips are cheap. The trick is to find productive ways to waste them. A Pentium waiting for a keystroke isn't very clever.

So here's a huge pool of Mips. What can you do with them? Voice recognition comes instantly to mind. Image recognition close behind. The brain deploys substantial resources to these tasks, so I suspect a computer must.

IO is indeed a bottleneck, but not in principle. If you can't get data from the camera to the computer, combine them. Put the image recognition algorithms in the camera. Analyse, reduce, compress data at the source. Meanwhile, it helps to have multiple paths off-chip.

revolutionary
by rnd

What is the most revolutionary (i.e., it is scoffed at by those in control/power) idea in the software industry today? Explain how this idea will eventually win out and revolutionize software as we know it.

CM: Forth! But then I haven't been out looking for revolutionary ideas. I like the phrase Baldrson used above: compressing idiomatic sequences. If you do this recursively, you obtain an optimal representation. I see no way to get a more compact, clear, reliable statement of a problem/solution.
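
The idea in miniature, as an illustrative sketch: when a phrase recurs, factor it into a word. The definition is stored once, every use site shrinks to a single reference, and applied recursively the program becomes a hierarchy of ever-shorter definitions.

    \ suppose the phrase DUP @ SWAP recurs throughout a program
    : @KEEP  ( addr -- x addr )  DUP @ SWAP ;   \ factor it once and name the idiom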

Forth clearly revolutionizes software as most know it. It could lead to efficient, reliable applications. But that won't happen. A mainstay of our economy is the employment of programmers. A winnowing by a factor of 100 is in no one's interest: not the programmers', not the companies', not the government's. To keep those programmers busy requires clumsy languages and bugs to chase.

I don't have to be glib or cynical. Those are facts of life. Society must cope with them. But I don't have to. Nor you. There are niches in which you can be creative, productive, inspired. Not everyone can be so lucky.

Forth as intermediate language
by Ed Avis

Many high-level languages compile into C code, which is then compiled with gcc or whatever. Do any use Forth instead? I understand Forth is a stack-based language: doesn't that present problems when compiling for CPUs that mostly work using registers?

CM: I remember my shock at learning that Fortran compiled into assembler, which then had to be assembled. A language that can be translated into another is clearly unnecessary. Truly different languages cannot be translated: C into Lisp.

Forth would make a fine intermediate language. But why have an intermediate language? It introduces another layer of confusion and inefficiency between the programmer and her computer. Macros were invented to support compiling directly to machine code.

Stacks are a compiler-friendly construct. Every compiler has to use one to translate infix notation to postfix. If this stack solution has to be assigned to registers, it's an extra step. Forth uses stacks explicitly and avoids the whole subject.
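
For instance, as an illustration: the infix expression (a + b) * c compiles into straight-line stack code, with no register assignment left to do.

    VARIABLE A  VARIABLE B  VARIABLE C
    : EXPR  ( -- n )  A @ B @ +  C @ * ;   \ infix source: (a + b) * c
    3 A !  4 B !  5 C !  EXPR .            \ prints 35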

Register-based CPUs have more problems than just the complexity of their compilers. Their instructions must contain register addresses, which makes them longer and programs bigger. And it is rare that every register can be used in every instruction.

Moreover, registers need to be optimized. After assigning system registers, performance depends on how well the remaining registers are handled. Compilers can optimize infix expressions better than humans can. But such expressions are no longer the preferred means of doing arithmetic. DSPs and supercomputers integrate difference equations.

Design guidelines encourage code with many subroutine calls, each with only a few arguments. This is the style Forth employs. But it plays havoc with optimization, since register usage must be resolved before each call. So apart from being unnecessary and difficult, optimization has no effect on good code.

  • Re:Hmmm (Score:3, Informative)

    by William Tanksley ( 1752 ) on Friday September 14, 2001 @05:23PM (#2300404)
    "He effectively said there was a conspiracy to perpetrate bugs, which there isn't."

    He most certainly did NOT say that. He said that there is a common interest in keeping to the current ways of doing things, and further that the current way is less efficient than his way. His first statement is obviously true; he doesn't infer from this that there's any conspiracy, nor does he need to. His second statement is not obviously true, and in fact I believe it misses the point. It may be true that his way is actually more efficient, but the costs of converting everything to work his way would be very high indeed. When you add in the fact that the theory behind Forth isn't completely understood and is only beginning to be explored, it's very clear that now isn't the time to switch to Forth, and even clearer that 1980 was worse.

    "This is not to say that you can't write code in other languages that is just as incomprehensible, but I find that I can pick up a perl script I wrote 2 years ago and read it right away."

    What can I say? You know Perl, and you know how to write clear code in it. Try reading a hardware engineer's Perl code (i.e. someone who didn't grow up writing software), and see how far you get.

    "I can't do that with Forth."

    Odds are that you can -- I certainly can, and I don't have an immense amount of experience with Forth. You simply have to realise that Forth is different, and you have to learn how those differences affect the ideal coding style.

    For example, you're used to a vertical code layout:

    function1
    function2
    function3

    In Forth, the ideal code layout is horizontal:

    function1 function2 function3

    You're also used to keeping your routine size down to a page or two -- any bigger, and it gets too hard to maintain, and any smaller and the parameters take up too much space to type and too much time to pass to the function. In Forth the ideal routine size is about 50 characters.

    In both Perl and Forth, you can write unit tests to check your code. In Forth, you're encouraged to keep them in the same source file, and run them as part of compilation. There are tools to help -- one of my favorites defines the words "testing", "{", "--", and "}".

    testing addition
    { 3 4 + -- 7 }
    testing associative property of addition
    { 3 4 5 + + -- 3 4 + 5 + }

    Include a section like that after every word definition, and update it regularly whenever you try to use the word in a different manner, and you'll have a much easier time maintaining the word when it has to be changed.

    (I'm not claiming that unit testing is new to Forth, although Forth was one of the first languages to use it heavily as part of the language. Perl and Python in particular can be used very effectively for unit tests, since they're interactive.)
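
    For the curious, here is one minimal way such words could be defined -- a sketch, and whether the favorite tool mentioned above is built this way is an assumption; the classic tester.fs harness by John Hayes takes a similar shape under the names T{, -> and }T.

    VARIABLE START-DEPTH
    VARIABLE #RESULTS   CREATE RESULTS 32 CELLS ALLOT
    : testing  ( "description" -- )  SOURCE >IN ! DROP ;  \ skip the rest of the line
    : {   ( -- )  DEPTH START-DEPTH ! ;      \ remember the stack depth
    : --  ( actuals -- )                     \ save what the code under test produced
        DEPTH START-DEPTH @ -  DUP #RESULTS !
        0 ?DO  RESULTS I CELLS + !  LOOP ;
    : }   ( expecteds -- )                   \ compare against the saved results
        DEPTH START-DEPTH @ -  #RESULTS @ <> ABORT" wrong number of results"
        #RESULTS @ 0 ?DO  RESULTS I CELLS + @  <> ABORT" wrong result"  LOOP ;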

    Anyhow, my point is that the rules for programming in Forth are very different than those for other languages.

    -Billy

"Experience has proved that some people indeed know everything." -- Russell Baker

Working...