Software/Hardware FPGA Dev Board that runs Linux

bforsse writes "The ML300 allows engineers to develop hardware with HDL synthesis/simulation and software with standard GNU tools. The entire system is implemented inside one FPGA with an integrated IBM PPC processor. The board comes with all the peripherals that a standard motherboard or laptop has, and then some. It currently ships with MontaVista Linux; a number of other Linux flavors and OSs are in the pipeline. Maybe this new merging of the hardware and software worlds will settle some of the religious wars between hw and sw engineers? ...ok, maybe not."
This discussion has been archived. No new comments can be posted.


  • funny... (Score:5, Funny)

    by jda487 ( 646991 ) on Wednesday February 12, 2003 @02:11AM (#5285921)
    ...the term 'engineer' is used very loosely when you are referring to software engineers...
    • Watch it mate...
    • Re:funny... (Score:1, Troll)

      by schwep ( 173358 )
      ...the term 'funny' is used very loosely by the ignorant masses - especially when talking about things they don't know very much about...
      • Re:funny... (Score:1, Funny)

        by Anonymous Coward
        considering how fat most hardware guys are, 'ignorant masses' is a good term... ;-)
    • I agree to some point...

      Well, if you want to get technical, if you're not a certified P.E., then some people say you shouldn't call yourself an engineer.
      • If we're stuck with these guys insisting that they're engineers, and the term has gotten common usage, then we obviously need a hierarchy.

        Wannabe Engineer: Application "Engineers" and Network "Engineers." Requirement: You're the guy down the street that "knows stuff about computers." A degree in IS, a diploma from ITT Tech and Certifications are all optional.

        I'm not an Engineer, But I Play One At Work: Software "Engineers." Requirement: A CS degree. You think you're cool because you hack XML. You think you know stuff about computers, but you couldn't so much as recognize a K-Map if it slapped you in the face. You've heard of "assembly" programming, but you've never actually seen it.

        Real Engineer (junior grade): Electrical/Computer Engineers. Requirement: A B.S. in Electrical Engineering. You actually know something. You eat Verilog for breakfast and assembly for lunch. You could probably design a power supply for your computer from scratch, but it wouldn't have good voltage regulation.

        Real Engineer: Requirement: PE. You think you are a demigod, and all the Engineer(j.g.)'s simultaneously despise you and want to be you. You dream in SPICE and FORTRAN and scorn those who rely on silly schematic capture tools, since you write all of your netlists in vi.

    • Re:funny... (Score:5, Interesting)

      by WhaDaYaKnow ( 563683 ) on Wednesday February 12, 2003 @02:58AM (#5286084)
      ...the term 'engineer' is used very loosely when you are referring to software engineers...

      Well, that's what hardware engineers think when the software engineer refuses to install MS-DOS 3.3 to test the hardware. ;-)

      Seriously though, I've been through two board bringups, both Intel Architecture.

      The first board was considered 'done' by the hardware guys after it booted DOS. I told them that that was not really a test, and sure enough, months (and numerous patch wires) later we were finally able to use _all_ the features on the board and boot Linux and Win95.

      On the second board I was most impressed with the software tools hardware guys used. NOT! Although the board was more or less up and running I found a couple of places where transmits were connected to transmits and receives to receives. I asked why the schematic capture tools didn't catch such obvious mistakes. I know the software can, but quite honestly, all the software used for hardware design feels like it was written by, uh, hardware guys. :-O

      Seriously though, the software tools that hardware engineers use leave a lot to be desired (I mean, the last board I worked on was in 2002 and they used a DOS based program to do the layout, for Pete's sake)

      In defense of the hardware engineer though, he'd use symbols provided by the manufacturer, and they, for some reason, could not be bothered to indicate properly the type of signal a pin has (e.g. input, output, bidir, etc.)

      Until today I never understood why they'd risk the chance of having to do a new rev of a board vs. the cost of spending a few minutes to create the symbols properly.

      Then again, I've seen software 'engineers' do the same stupid stuff. ;-)
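The transmit-to-transmit mixups described above are exactly what a basic electrical rule check (ERC) catches. A minimal sketch of such a check, assuming a toy netlist format with per-pin directions (all names here are invented for illustration):

```python
# Minimal electrical-rule check (ERC) sketch: flag nets whose pins are
# all outputs (driver conflict) or all inputs (undriven), as the parent
# wishes the schematic capture tool had done. Names are illustrative.

def check_nets(nets):
    """nets: {net_name: [(pin_name, direction), ...]}, direction in
    {'in', 'out', 'bidir'}. Returns a list of warning strings."""
    warnings = []
    for net, pins in nets.items():
        dirs = {d for _, d in pins}
        if 'bidir' in dirs:
            continue  # bidirectional pins can drive or receive
        if dirs == {'out'}:
            warnings.append(f"{net}: output tied to output")
        elif dirs == {'in'}:
            warnings.append(f"{net}: no driver (input tied to input)")
    return warnings

nets = {
    "uart_tx": [("cpu.TXD", "out"), ("phy.TXD", "out")],  # TX-to-TX mistake
    "uart_rx": [("cpu.RXD", "in"), ("phy.RXD", "in")],    # RX-to-RX mistake
    "clk":     [("osc.OUT", "out"), ("cpu.CLK", "in")],   # correct
}
for w in check_nets(nets):
    print(w)
```

A real schematic tool would also handle tri-state buses, open-drain pins, and power nets; this only shows the output-to-output and undriven-net cases the comment complains about.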
      • Re:funny... (Score:5, Insightful)

        by anubi ( 640541 ) on Wednesday February 12, 2003 @05:18AM (#5286377) Journal
        Seriously though, the software tools that hardware engineers use leave a lot to be desired (I mean, the last board I worked on was in 2002 and they used a DOS based program to do the layout, for Pete's sake)

        I work in circuit design all the time, and I use DOS based tools for schematic, PCB layout, and circuit analysis (spice). Why?

        Most of the stuff I do links to stuff I have done before. I have no trouble with my DOS tools re-opening files say 10 years old. I also know that I will be able to see my files 10 years from now if I play my cards right and do *not* "upgrade".

        Processors change. Although my schematic capture program requires an 8088 or better, both the PCB layout and circuit analyzer require at least an 80386. As processors changed, I simply copied the files to a directory on the new machine and they run. No "installation" or registry entries, authorization codes, or the like. They just run.

        The libraries on all of the programs are user configurable. As any new parts come out, I simply enter the configuration into the library.

        No user authentication. These programs were coded in a day where there was not all this emphasis on piracy - I am free to move these programs around to any machine I get my hands on. And because the family of machines I use all run the exact same software, the files can be read/modified/written on any machine without need of version controls. It was common in those days to buy a "site license".

        The programs are quite small. The schematic editor, along with all its libraries, and a good sized project's worth of files will all fit on a 1.4 Megabyte bootable floppy! The spice analyzer requires 1 floppy, and the PCB layout program requires three floppies. None of the programs require any sort of "installation", per se. Just make a subdirectory for them, copy them over, and run the appropriate .exe file and start work.

        But the part I like best is that I intimately understand what these programs are doing. If something goes amiss somewhere, I know where to look. Their file structures are pretty simple; if something goes amiss, I can usually patch it with a hexadecimal file editor.

        If I want the file in another format, it's usually not all that difficult to pull up the C++ compiler and code a little file converter.

        The schematic editor and spice analyzer are wide open to debuggers, but the people who made the PCB editor got crafty and made theirs hard to debug - just thank goodness they coded it well, and there was only one instance where their program needed debugging. (For those of you who have ever had to use what passes for technical support, you may find the time better spent learning how to fix it yourself.) But this was five years ago... today fixing it yourself is illegal in many cases as a result of the congresscritters foisting the DMCA on us.

        I know it's fashionable for me to say I run the latest systems. If the later systems actually gave me better results, I would gladly switch, but all I see out there is that I would be throwing away a trusted and faithful system to get more problems than I could shake my proverbial stick at.

        My take on this is that my tools are precisely that: tools. It took me 12 years of education before I could even emit a coherent sentence in English. It took me 5 years in front of a keyboard before I typed halfway worth a damn. What I am trying to say is that although hardware and software complexity has grown by leaps and bounds, I have not. It still takes me a helluva long time to learn how to use this stuff. If I spent all this time learning how to play a piano fluently, I'd feel foolish going onto the stage with a clarinet. My job is not learning new tools all the time, it's applying what I know to get a job done. Would you want a seasoned old mechanic using his grandpa's wrench on your car, or somebody with the latest 200HP pneumatic tool seating the oil-drain plug? (I use that as an example because they did it to me... that car never stopped leaking oil once they improperly used that power wrench on my car.)

        If what I am doing is bad, I guess it won't matter much - as this is the last decade I think I will be in the workforce. This grandpa is about ready for pasture.

        • Actually, you've spotted the problem with software engineering in general! We've had tools that did the job on DOS (640K limit and all) that were MORE than adequate for better than 10 years.
          Yet look at the Windows versions of these things. They cost 3x as much minimum and take 100x more memory to run (I get to count the OS here!).

          Why? Poor software engineering practice! You can put a fair amount of blame for the bloat directly at MS's feet. Fine. Still, software guys just add layer upon layer upon layer without ever cleaning up after themselves (see the OSI model as proof of concept ;-)

          Nowadays I use RTL to do chip design. The tools don't advance too much, for the simple fact that there is a lack of competition and people are afraid of touching a big bulky piece of software that happens to work. The synthesis tool in particular has a reputation that for every feature they add, they break three, and the new feature won't REALLY work for two or three releases.

          As for software that operates like EEs designed it: you guys have never seen the HP500 logic analyzer's user interface. Looks like a software guy designed it. I've been griping about that for a decade too! ;-)

        • [real world computer use account snipped]

          Yes, we computer folk are (or should be) constantly reminded that "if it ain't broke, don't fix it" (or pay for it!) is a good axiom for most people.

          I think some of the points you made directly apply to the rush to get EE oriented tools onto Linux. No one can ever declare an old version "obsolete", and I'm pretty sure you'll be able to get x86 hardware to run it for many, many years (I'd guess at least 50, and perhaps 100,000 ;)

          Contrast that to Windows where "OS Design du Jour" is the rule...and there is a constant commercial push to generate churn.

          Linux rocks!

        • I certainly hope your computer doesn't break down, because it'll be really hard these days to get a replacement system that can run your software. You might want to look into Linux for that reason: it can emulate DOS (and may be able to run your software natively-- I remember seeing a *nix version of spice), and supports available hardware.

          Hopefully, the seasoned old mechanic will be able to use a brand-new titanium wrench if his grandpa's old one finally gets lost. For that matter, switching to a lighter but equally strong material might be a good idea anyway.
      • Re:funny... (Score:3, Insightful)

        Hardware design tools tend to be extremely poor. Doing an EE degree, you get to use quite a few of them, and soon realise that no proper software engineer could have been involved in the design of these tools.

        Shall I show some examples?

        • Open/Save/etc. dialogs that always start from the same directory (eg. something nested really deeply in the windows directory). This is common in programs which require you to open many files, eg. VHDL compilers.
        • Context sensitive menu bars... Altera Max Plus does this. The menu bar changes depending on which window you are in. It makes options that affect one window accessible only from another. It changes the order of the menu bar. Very annoying.
        • Old style open/save dialog boxes from win 3.11 era that don't have Desktop and all those other buttons....
        • Meaningless errors all over the shop... they don't reflect on the design you are working on, but still, get in the way.
        • Ok/Cancel pop ups after every single stage in the compilation/simulation process. This would be good if it took hours and I wanted to stop it, but these things pop up, always in the centre of the screen, away from where you are using the mouse. Why?
        • Horrible visual design (Synplify)... the buttons on the screen are entirely random sizes. Looks like it was programmed by kids, there is no consistency between one screen and another, so you have to hunt for buttons.
        • Huge numbers of windows (Modelsim) - if you are simulating a design, you can end up with over 20 windows open, all separate, and it gets very confusing, especially with some of them not having any informative information (such as the name of the file) on the taskbar button.
        • Assuming that you work at a given resolution (eg. 1024x768). Most development environments benefit greatly from being used at high resolutions... but when the designer of the software assumed you would use one resolution alone, you end up with tiny text, things that look very odd when maximised etc.
        • Different implementations of VHDL - what works in one compiler may not in others. This can lead to tearing hair out.
        • Stupidly expensive - some of these packages cost more than $10,000... you think they would work better.
        • I could go on and on and on...

        Essentially, there are so many stupid, small mistakes in the user interfaces in these pieces of software, that it leads to people making mistakes. The less time you have to spend using them the better, so designs don't get tested or verified fully....

        • You seem to primarily focus on shortcomings in the GUIs of the tools. You should realise that the GUI is not the primary UI of these tools. Practically all EDA tools are script based (usually Tcl) and anyone who uses them for real will hardly ever use the GUI.

          Performance and quality of results are much more important than GUI niceties.

          As for the price, I think you left out a zero. While there are a few tools that cost as little as $10,000, prices of $100,000-$400,000 are not uncommon.
        • Well - your first mistake is using all this GUI nonsense! I use the command line pretty much ALL the time. The ONLY GUI I use for chip design is the waveform viewer, i.e. the equivalent of a logic analyzer. That's it!

          Who CARES about human interfaces when all you have to do is type "verilog -f files"

      • On the second board I was most impressed with the software tools hardware guys used. NOT! Although the board was more or less up and running I found a couple of places where transmits were connected to transmits and receives to receives. I asked why the schematic capture tools didn't catch such obvious mistakes. I know the software can, but quite honestly, all the software used for hardware design feels like it was written by, uh, hardware guys. :-O

        I'm a hardware engineer, and I have to agree somewhat about hardware EDA tools. They do suck much more than SW design tools.

        'Compiling' (i.e. synthesising, layout, DRC, etc.) a hardware ASIC design is much more hands on and time consuming than compiling software (i.e. type 'make'), and I guess that's why hardware design tools seem to be much more primitive.

        And on top of all that, hardware is expected to work after the first full compile (i.e. taping out a chip).


        I mean, the last board I worked on was in 2002 and they used a DOS based program to do the layout, for Pete's sake.

        That statement literally makes me cringe. Do you remember the growing pains that so many programs had as they transitioned between the trusty DOS versions and the first and second (and often current) Windows versions? Programs that did the job perfectly fine turned into a complete fucking mess when ported to Windows. Remember how bank tellers used to fly at keyboard entry? Now you have to watch them point and click and click and click...

        Just because a program runs under DOS does not make it inferior.
      "...the term 'engineer' is used very loosely when you are referring to software engineers..."

      Dilbert: "You're mighty brave in cyberspace, flame boy."

    • I call myself an engineer because I am a Master of Engineering, according to my University's Faculty of Engineering.

      But maybe they're wrong.
    • It's even worse for non-software engineers. After all, we all know those are the guys driving the train.


  • by Anonymous Coward on Wednesday February 12, 2003 @02:13AM (#5285928)
    VHDL is a tool of terror! Especially when put in the hands of those lunix cyberterrorists! These terrorists and their sympathizers are an affront to American liberty, justice, and equality for all non-Muslims. I strongly urge the Right Honourable Prime Minister George Williamson Bush, Junior to pass binding legislation which would put an end to these un-American activities.

    P.S. I have similar views on the 3rd world clone chip manufacturer, AMD.
  • Wow, bforsse works for Xilinx. Looks like another slashvertisement to me.
  • Yeah, sure. (Score:5, Funny)

    by modecx ( 130548 ) on Wednesday February 12, 2003 @02:14AM (#5285933)
    The board comes with all the peripherals that a standard motherboard or laptop has and then some.

    With a $6k price tag, it should come with a high class hooker.
    • Actually, if you're considering buying a lot of them, the sales team throws that in for free.

      Actually, you're the one that... oh never mind.

    • Actually, a "High Class Hooker" is much more expensive than $6k.

      $12K/day, 2 day min.

    • Re:Yeah, sure. (Score:3, Interesting)

      by baywulf ( 214371 )
      If you want a cheaper FPGA board, then try out the following company. They have some decent boards for under $100 though no microprocessor is included.
      • Anyone have any experience with this stuff? Can I really buy this board and build a circuit to do calculations for ~$100?

        I want to experiment with converting programming logic into hardware.

      • Thanks! I was looking for a cheap FPGA board exactly like these!
    • Many hardware development environments have a somewhat pricy development box with lots of software tools, debugging ports, every interface the system could support, etc., which you can use to prototype and develop your real application, which is some embedded thing that might cost $5 or $50 or $5000 depending on what it is. So how much does one of these things cost in a typical deployed environment, e.g. a PCI board or one of a bunch of chips on a graphics board? Does it go on a $500 board, or is $5000 realistic? (Makes a lot of difference if you want to build a Beowulf cluster of them....)
  • GNU tools? (Score:3, Informative)

    by e__alf ( 642611 ) on Wednesday February 12, 2003 @02:14AM (#5285936)
    The GNU tools are just for the software part... the actual FPGA design tools are still covered by what looks like 200 patents (and run on NT or Solaris)

    But still, me wants! Think about it.. 4 PowerPC cores embedded in a sea of programmable logic? *drool*
    • That's what I was afraid of.

      So, it is not only $5k, but also a $5k Sun Workstation (or a BSOD interface for $300) and third party tools - such as Synopsys for $xxx,xxx.

      I was just hoping that Xilinx had ported its tools from Solaris to Linux and sells them with an evaluation board for a mere $5k!
    • Yeah, I drooled over this, too. Even though this dev board has just one PPC, there are others with 4.

      But then I found out that the PPC cores are very slow relative to state-of-the-art standalone processors - it's in the range of, IIRC, 266-366 MHz each. Even multiplying by 4, it doesn't fare well compared to 1 GHz PPCs. Xilinx explained that the process they were using just didn't allow for fast processors, so I doubt they'll ever catch up.

      What I'd like to see is a 1 GHz 4-PPC core with either no programmable logic or just a little (enough to implement some simple communications fabric and maybe a little left over for a simple coprocessor).

      But, that's just me. This would be more sellable to my old company, which used a Virtex II and a discrete PPC. This combo would save some real estate (they had 2 Virtexes, 2 PPCs, and a whole lot of other big parts on a 6U VME card; it was crowded!!)
  • by jericho4.0 ( 565125 ) on Wednesday February 12, 2003 @02:16AM (#5285941)
    Ok. I have a vague idea what a Field Programmable Gate Array (FPGA) is. I understand that this is a device that can be programmed at the chip level. But I still don't understand what this is, really.

    Can someone with a bit of know-how point us towards some more info?

    • by anonymous cupboard ( 446159 ) on Wednesday February 12, 2003 @02:36AM (#5286014)
      I don't think there is a single 'HOWTO' on the subject, but essentially an FPGA is a chip with a large array of simple logic gates that may be interconnected in a programmable way. Tools exist to simulate and compile logic expressions into a form where they can be downloaded into an FPGA as a gate interconnection matrix. Once the FPGA has been programmed, it then will execute the logic function.

      As with software, a lot of modules exist (mostly quite expensive) for logic blocks up to and including microprocessor cores. Rather than having a chip with a single function, it is possible to squeeze in multiple functions up to the limits imposed by the gate count.

      FPGAs can be reprogrammable, or programmable only once. There is often a fusible link inside that, once blown, prevents reprogramming or the design being read out.

      If you are producing in quantity, then you can go from an FPGA component to a gate array which is programmed by a photographic mask during manufacture. The mask is prepared from the same program that created the FPGA. The setup costs are high, but once you talk about big numbers of chips, the component becomes significantly cheaper than an FPGA and often better performing.

    • It's a Field Programmable Gate Array...

      It's a piece of hardware that you can buy to do stuff (you tell it how to map out a 'virtual processor'). If you had one large enough, you could emulate an x86 CPU.

      They're used mostly in applications where (price OR time to market OR development costs) are a big factor. Custom silicon for a custom purpose will always be faster, and cheaper (If you build enough to justify the development costs).

      Flame away, more knowledgeable /.errs; Trying my best with a shallow understanding.
    • An FPGA is used to design hardware in its early stages. It allows hardware engineers to define the hardware on a PC, by drawing schematics and writing descriptions in Hardware Description Languages (HDLs) like Verilog and VHDL. The FPGA software on the PC then links to the FPGA itself (through a serial cable or whatever) and programs an "array of gates" to implement the functionality the designers have defined. What this means, essentially, is that it defines the interconnections between prefabbed gates on a chip to cause them to implement the specified logic.

      It is used in early stages of hardware design to verify timing and functional correctness, and heavily in education. Sometimes final production products will use FPGAs, but usually only when the production volume is low. This is because FPGAs are more expensive per individual unit than ASICs (Application Specific Integrated Circuits), but ASICs require more expense on design costs.
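The FPGA-vs-ASIC cost tradeoff above boils down to a break-even volume: the ASIC's one-time design (NRE) cost amortized against its lower unit price. A back-of-the-envelope sketch (all dollar figures invented for illustration, not real pricing):

```python
# Back-of-the-envelope FPGA vs. ASIC break-even volume. All dollar
# figures are invented for illustration, not real pricing.

def break_even_units(asic_nre, asic_unit, fpga_unit):
    """Smallest volume at which total ASIC cost (NRE + units) drops
    below total FPGA cost; None if the FPGA is never more expensive
    per unit."""
    if fpga_unit <= asic_unit:
        return None  # the ASIC never catches up
    # Solve asic_nre + n * asic_unit < n * fpga_unit for integer n.
    n = asic_nre / (fpga_unit - asic_unit)
    return int(n) + 1

# Hypothetical: $250k of ASIC NRE, $5/unit ASIC vs. $100/unit FPGA.
print(break_even_units(250_000, 5, 100))  # prints 2632
```

Below the break-even volume the FPGA's higher unit price is cheaper than paying the NRE, which is the parent's point about low production volumes.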

    • by baywulf ( 214371 ) on Wednesday February 12, 2003 @03:15AM (#5286135)
      The book "ASIC" has a good chapter on FPGAs. Go to the following link and review chapter 5.

      But to answer your question briefly: the internal structure of the FPGA is an array of configurable logic blocks. The boundary between these blocks in the array is routing logic that allows nearly arbitrary connections between the logic blocks. There are also IO blocks at the perimeter of the array. Each logic block typically consists of some combinational logic followed by a register element. The combinational logic element can be programmed to implement arbitrary logic functions of around 4-8 inputs. Thus you can configure a block to be a 1-bit adder, a mux, a register, etc. By programming the CLBs and the routing between the blocks, a hardware system can be built. You write the hardware description in Verilog, VHDL or schematic capture. Then a synthesizer maps your design to the bit pattern necessary to program the FPGA. You generally program this into the chip or into an external flash memory connected to the FPGA.
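The programmable combinational element described above is typically a lookup table (LUT): a 4-input LUT is just a 16-entry truth table addressed by its inputs. A toy sketch of "programming" one (purely illustrative; not any vendor's actual bitstream format):

```python
# A 4-input LUT is a 16-entry truth table addressed by its inputs.
# Sketch: "program" a LUT from a Python function, then evaluate it,
# mirroring how a CLB's combinational element is configured.

def program_lut(fn, n_inputs=4):
    """Build the truth table (the 'bitstream' for one LUT) from fn."""
    return [fn(*((i >> b) & 1 for b in range(n_inputs)))
            for i in range(2 ** n_inputs)]

def eval_lut(table, *inputs):
    """Address the table with the input bits, LSB first."""
    addr = sum(bit << b for b, bit in enumerate(inputs))
    return table[addr]

# Configure one LUT as the sum bit of a full adder (a, b, carry-in);
# the fourth input is left unused, as often happens in real mappings.
sum_lut = program_lut(lambda a, b, cin, _unused: a ^ b ^ cin)

print(eval_lut(sum_lut, 1, 1, 1, 0))  # prints 1: 1+1+1 -> sum bit 1
print(eval_lut(sum_lut, 1, 1, 0, 0))  # prints 0: 1+1 -> sum 0, carry 1
```

A real CLB would feed the LUT output through the optional register element, and the synthesizer, not the user, picks the truth tables.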
    • I'd like to expand on this question. I've been intrigued by FPGAs since I first read about them. To my (admittedly relatively untrained) mind, they seem like the ultimate platform: an infinitely reconfigurable piece of computer hardware.

      It would seem like an FPGA, with its gates programmed for optimum efficiency for a given task, should be able to kick the pants off a traditional fixed architecture CPU. If this isn't the case, can someone tell me why? Someone once told me that the distances between various virtual components in a configured FPGA slowed things down; is this true? Why don't we see desktops made with these? (I suppose, if FPGAs are really expensive enough to warrant the price tag on this doo-dad, that'd be one reason.)

      There actually is a company called Star Bridge Systems who make computers with FPGAs. They even promised a "desktop supercomputer" a few years back. But apparently, they've given up on that (or decided that selling a few multi-million dollar supercomputers is better than selling lots of thousand dollar desktops).

      And even if FPGAs aren't much faster than regular chips, there's something about compiling your code down to the actual gate layout (rather than for a set gate structure) that just seems really, really cool.

      • It would seem like an FPGA, with its gates programmed for optimum efficiency for a given task, should be able to kick the pants off a traditional fixed architecture cpu. If this isn't the case, can someone tell me why?

        FPGAs are slower than ASICs, due in part to the programmable switches used to wire up your design. FPGAs are also many times less dense than an ASIC, again due to the general purpose building blocks and programmable switches.

        Basically, they're bigger, slower, and more expensive than an ASIC.


  • VHDL (Score:3, Interesting)

    by Amon Re ( 102766 ) on Wednesday February 12, 2003 @02:18AM (#5285953)
    Does Linux even have any good VHDL simulators?
    • Re:VHDL (Score:5, Informative)

      by dlbowm ( 99810 ) on Wednesday February 12, 2003 @02:24AM (#5285974)
      Icarus is a competent Verilog (not VHDL) open source simulator. It even has some support for synthesizing to some FPGA libraries.
      Verilog is more common than VHDL in the US, so this is the only open source HDL tool I've used. Primarily, we are still slaves to Synopsys and Cadence though.
      • Speaking as a slave to Mentor Graphics who has seen Icarus before (you beat me to posting a link), I was wondering if you could offer any insight into how well it works? Since you seem to have more experience with it than I do, I was hoping you could expand on your declaration of Icarus as "competent".
        • I have primarily used it only for syntax checking when I am not connected to the office LAN (vcs needs access to the license server) since it will not work with my verification environment. In recent builds it seems to do fine as far as the compile goes. My main problem is it does not fully support the TF PLI library and I have written a TF PLI app in C that embeds a PERL interpreter into the verilog models so I can use PERL to generate stimulus from inside. At some point I will migrate the code (there isn't much of it) over to VPI which is more completely implemented in Icarus. But without a testbench, I haven't been able to really stress Icarus in real world HDL development.
      • OK. The question is, can these Xilinx tools xpr the RTL design into the Xilinx format that can be downloaded to the FPGA on Linux machines, or do I have to run the RTL->xpr on Windows?

        Of course I do not expect an open source Synopsys any time soon.
      • Can't call ourselves slaves. Though only proprietary solutions are there, there is enough competition in the market: Mentor, Synopsys, Cadence... and so on. Open source simulators are not really near yet.
        My guess is that Verilog simulators will come out pretty soon; however, VHDL will remain in infancy because more of the hardware people are moving on to Verilog.
    • Re:VHDL (Score:4, Informative)

      by Colonel Panic ( 15235 ) on Wednesday February 12, 2003 @04:16AM (#5286260)
      Well, there have been a couple of attempts, but nothing complete...

      Right now the most interesting one is a VHDL frontend for GCC called GHDL.

      Also note that you need a lot more than a simulator to get it to work with this board: you need a synthesis tool that can map into the Xilinx part. The FPGA companies tend to keep their formats quite proprietary, so don't expect any open source tools for synthesis and tech mapping any time soon.... (unfortunately).
    • Modelsim has been supporting Linux for some time. As for free tools, Alliance EDA provide a stripped down version of their simulator for free, but it's not as sophisticated.
    • Re:VHDL (Score:3, Informative)

      by jeff_bond ( 135948 )
      Does Linux even have any good VHDL simulators?

      Certainly, Modelsim


      • I second that remark. Modelsim simply screams on Linux. I use it daily. It seems that for the truly large designs, performance on Linux starts to degrade, but I think that has more to do with bus and memory limitations inherent in the PC architecture (as compared with high-octane Solaris servers that cost > 5x more). All my recent designs (RTL through post-layout) have gone through Modelsim on Linux, and I'm not going back.
  • by dlbowm ( 99810 ) on Wednesday February 12, 2003 @02:19AM (#5285955)
    This looks interesting, but way too expensive to break down any barriers in the short term. Actually, being a hardware (ASIC) designer, I find that many of the embedded software guys know their hardware as well as the designers do. Some, however, need their hands held every step of the way and can't understand why we put all those damned interrupt capabilities in there. Just makes the software harder to write!
    I'd love to see something like this out in the market in a lower price range. It's great to have GNU software tools to write code inexpensively, and to have hardware as well would really be fun and useful. Sharing cool hardware accelerator HDL with others would be great. I've used Icarus recently and it is becoming quite a useable open source alternative to vcs, verilog xl, nc verilog, etc.
  • by nweaver ( 113078 ) on Wednesday February 12, 2003 @02:21AM (#5285964) Homepage
    E.g., the XC2VP7 which is used in the core of that board has a PowerPC (>250 MHz), 8 SERDESes which can speak Gb Ethernet with optical transceivers (among other things), about 100 Kb of RAM, and 11,000 4-LUTs and flip-flops.

    Xilinx promises that at the end of the year, in suitable quantities (>25,000), they will be $100/each.
    • Hey Nick.

      And maybe if I get off my ass, we'll have a nice language with which to program the things, too.

    • Xilinx promises that at the end of the year, in suitable quantities (>25,000), they will be $100/each.

      Yeah, yeah, yeah... the same meaningless price quotes from Xilinx as always. Historically it's been at 100,000 pieces in their widely published literature.

      At 25,000 chips, you'd probably go to the trouble to create a custom ASIC. Or at least you'd do a "hard wire" conversion of the FPGA design to an ASIC, if you used the FPGA to "get to market quickly".

      If you call up one of the [] actual distributors [] where you'd actually buy this part, say at qty 1000 to 5000, the story would be different. A lot different.

  • TCPA (Score:2, Insightful)

    by raistphrk ( 203742 )
    Maybe this new merging of the hardware and software worlds will settle some of the religious wars between hw and sw engineers? ...or maybe this will provide an architecture that's free of DRM? If TCPA ends up being as insidious as we think it will be, an alternative architecture will be in order for those who want to actually USE their PCs (as opposed to their $1500 multimedia toaster that they bought from Intel). This is good. This is very good.
    • This is what I was going to say.

      This gives us great freedom with hardware.

      One of the things that made software approaches take off was that software is easy to change. You don't need a dedicated circuit designed for each purpose. I remember as a youngster, in the decade of polyester suits before popular microcomputers, playing with 7400 series TTL gates to build various logic circuits (e.g. a clock; a burglar alarm with keypad code entry). As I looked at more sophisticated devices, the logic circuits needed became way too complex. For instance, from PolyPaks you could order a single digit readout with a 5x7 array of LEDs. You would have to multiplex-drive this. I never ended up building any breadboard circuits, but I designed a few on paper. I would end up using a PROM after reading about them in Popular Electronics.

      It wasn't far from here to make the jump to "programming". I finally "got" the idea when I got hold of a friend's HP25 calculator. I never went back. I can no longer remember which end of a soldering iron to pick up.

      The greatness of software was that you only had one universal hardware circuit. But you could control the outputs of, say, a parallel port.

      Now here we are in the 21st century. Fantastic hardware. But there is the potential for us to lose control of it to powerful, greedy interests.

      I would love to see the day when anyone could buy a cheap part and "burn" (or whatever) their own circuit or chip design. This would open the floodgates. Especially if the development systems were cheap, like a CD burner. Especially if your chip could fit into a standard PCI board, or dangling USB module.

      This would ward off the dangers of hardware control, just as open source wards off the dangers of software control, and garlic wards off vampires who want to suck you dry through neck lock-ins and licensing.
  • If it was cheaper, it would be really neat - there's a lot of things you could do with such a device. Problem is, it's US$5000 - for that price, you could buy *several* decent laptops, or even more decent PDAs :-(
  • by johnjones ( 14274 ) on Wednesday February 12, 2003 @02:45AM (#5286049) Homepage Journal
    God, I hate PPC. In fact I nearly hate it as much as x86, but...

    Now, ARM is a nice little design. There is the same deal but with an ARM, which Altera do; see www []
    and MIPS have been doing a dev board with a hard and soft core mix for a while

    Well, you'll never guess: they ALL come with GNU tools, and they use standard architectures that Linux is already ported to.

    Really, what you want to get into is a CPU on an FPGA, and one that you don't have to pay a licence for. This is what it's all about, and credit to them, Flextronics have started looking at it as a solution; see

    news about the use of open hardware at []

    the openRisc 100 project at []

    See the FAQ at []

    hope that helps


    John Jones
    • The question: can I design the (Altera) FPGA internals _on_ Linux?

      It seems that I can design, on Windows, a system that can run Linux (and a Linux kernel is included), BUT I have to have Windows/Solaris first to design and compile the FPGA. Am I correct?
    • What in specific is it that you hate about PowerPC? I've found it a very pleasant instruction set to work with.
    • Altera's Excalibur chip with the ARM processor is basically dead. When it came out it had pages of errata, the processor was hacked onto the edge of the chip, and from what I have heard they have excess inventory they can't sell; they have not been talking about it for months. Pretty much dead in the water as far as I can tell.
  • We [] demoed the ML300 at Embedded Systems East last year. It has an LCD screen, CompactFlash, serial, parallel, USB, Firewire, Fibre Channel, Ethernet and who knows what other I/O, PLUS heaps of blinking blue LEDs, on this one little card. It was like a shiny toy designed to attract geeks.

    Fantastic to look at, even if you had no reason to be legitimately interested.

  • I'm sorry, but I'm getting tired of FPGAs. Many early USB peripherals had FPGAs in them. The result? You need some weird driver CDs, and the hardware becomes useless when the special drivers don't install anymore.

    For hardware developers to imitate the mistakes of software development is a mistake. Hardware should conform to well-defined interfaces, it should be carefully designed, debugged, and tested, and then it should not require "upgrades" or "installation" later on, it should just work. If it hooks up to computers, it should only require generic drivers.

    • If it hooks up to computers, it should only require generic drivers.

      reminds me of the time that I used an HP LaserJet 4 driver to run a Deskjet 350 in DOS...guy said it would never work...granted, only printed black and white...but still
    • by kinnell ( 607819 ) on Wednesday February 12, 2003 @05:13AM (#5286364)

      Using an FPGA does not in any way require "weird driver CDs". Nor does it prevent the hardware developers from implementing clean, well-defined, standard interfaces. In fact, hardware implemented in an FPGA is, from the user's point of view, no different from hardware implemented any other way, or from embedded software running on a micro-controller for that matter.

      If your USB peripherals didn't work properly, it's because they were poorly designed. This has nothing to do with the choice of using an FPGA to implement the interface.

      To say that hardware engineers are imitating the mistakes of software engineers is ridiculous (although obviously some are making the same mistakes). Is it therefore perfectly acceptable for software engineers to implement poorly designed interfaces and neglect testing and quality control? I don't think so, but perhaps we have become numb to this issue. Bad engineering is bad engineering. The choice of using FPGAs for an emerging standard is good engineering, because if the standard changes before maturing, the hardware does not instantly become obsolete. This is why FPGAs are popular in mobile telecoms base stations, and rightly so. Being able to upgrade hardware is a good thing. Releasing an immature design is bad, both in hardware and software.

    • Just because something can be done poorly doesn't mean it must be done poorly.

      wasn't something similar to this said at the conference in Khitomer?
  • for maybe $100?

    This looks pretty cool, but there's no way I can afford one at $5K... You could do the slow bits of code in C targeting the PPC and the fast stuff in VHDL or Verilog...
  • Very good, this will finally allow open source to get a grip on the hardware development market, which is good for us all!
  • does FPGA still mean Flip chip Pin Grid Array? That's what I thought at first.

  • by Anonymous Coward
    The problems of the war are pretty easy to solve.

    Assume that $religion means the presence of a $deity (belief systems without a $deity, like Taoism, will not be considered ${religion}s, which is to their credit).

    Either $deity is hardware (real, grounded in nature, possibly via a marked green cable) or is software (virtual, made up in human minds, subject to revision and short-lived cultural fad approaches like "extreme religion" and "christianity"). Since there are and have been umpteen different $deities, none of which has lasted, while the hardware has remained relatively stable, $deity is software. This is also confirmed by the near-universal belief that $deity is infinite, which can only be true of software (since it is virtual). As a side note, when you consider the state of software, this explains a LOT.

    So since $deity is software and software requires hardware to run, hardware engineers are titans. They win and software engineers lose.
    But since $deity is software and can thus be made and freely and infinitely revised by software engineers, they're the ones who are titans. They win and hardware engineers lose.

    So I hope that's cleared things up. Now fight amongst yourselves.
  • by kinnell ( 607819 ) on Wednesday February 12, 2003 @05:25AM (#5286390)

    The Xilinx parts are for embedded systems, and have no real benefits for your average PC user (hence they can market them for $$$).

    Look here [] for genuinely cool FPGA technology. They use transputer based technology to implement parallel algorithms in, well, parallel. The demos are very impressive - real time raytracing @50MHz anyone?

    • Celoxica's main product is a development environment that basically lets you write C code and drop it onto an FPGA. (Yes it runs on Linux.)

      It's a lot faster to develop this way than with more traditional methods (HDLs) as it's so easy to iterate; for example, being able to drag code back and forth to optimise the flow between a processor on your board and an FPGA being used as a custom parallel coprocessor is pretty cool.

      As for the demos, that ray trace one is pretty cool, but I did like the space invaders demo - I think the game code was from a ROM dump - you even got an insert coin prompt!

  • by freeio ( 527954 ) on Wednesday February 12, 2003 @08:58AM (#5286774) Homepage
    The ability to run one or more concurrent instances of Linux (or whatever, quite frankly) internally to one of the Xilinx Virtex II parts is seriously amazing. Ignore the board it comes on for development for now - that is just cruft. The Virtex II is probably the most powerful instantly reconfigurable DSP engine in existence (think audio and video manipulation at real time speeds). They have internal hardware to perform from 16 to 128 simultaneous 16x16 multiply/accumulate operations, _in_one_clock_cycle_. And if you don't like what it is doing, you can change it, time and time again, forever. Raw Power. Complete Reconfigurability. Sweet!

    Combine this kind of power with multiple PPC processors on the same die, and the possibilities are incredible. The big difficulty is that the operation of the hardware and software can be so tightly tied together that it is difficult to program and debug. Everything is controlled by software (both the software and the VHDL or Verilog based FPGA code) and so the possibilities are limitless.

    Kudos to Jim Ready and the folks at Monta Vista for supporting this kind of device with development tools for Linux.
    • The ability to run one or more concurrent instances of Linux (or whatever, quite frankly) internally to one of the Xilinx Virtex II parts is seriously amazing.

      There is a Virtex II and a Virtex II Pro, which are not the same.

      The article is about a demo board with the PRO version on it. The plain Virtex II doesn't have a PPC processor built-in to it.

      Just wanted to clear that up.


  • Nice, but... (Score:3, Insightful)

    by eXtro ( 258933 ) on Wednesday February 12, 2003 @09:04AM (#5286800) Homepage
    I'd much prefer a native port of their FPGA development tools. They list compatibility with Red Hat 7.2, but if you read the fine print that means you use WINE to run them. Better yet, release specifications on programming your CLBs and routing. You would then see some real innovation in tools come out. FPGAs should be the electronics hobbyist's component of choice, much like PROMs and 7400 series TTL logic were a couple of decades ago. Instead you're forced into using their tools, which the last time I used them (admittedly ~7 years ago) were about as much fun as extracting your molars with a spoon.
  • LEON (Score:2, Interesting)

    by girmann ( 34561 )
    Nobody here has mentioned the LEON project, which is based on the SPARC V8. This is an open processor core that you can put into any FPGA. Speeds aren't as great as the PowerPC in this design, but hey, it works!
  • Board details (Score:3, Informative)

    by brandido ( 612020 ) on Wednesday February 12, 2003 @01:51PM (#5288849) Homepage Journal
    Some information on the boards:

    The CPU board, which has all of the main components on it, is a 16-layer board. It comes with 8 3.125-gigabit-capable transceivers (used as 4 gigabit fiber, two HSSDC2/Infiniband and two Serial ATA), 128 MBytes of DDR, 2 PS/2, 2 serial ports, parallel port, FireWire, two PCCard/PCMCIA slots, Compact Flash interface (for configuration and file system), PMC slot, BDM and Trace ports, JTAG port, AC97 audio codec and a kitchen sink.

    The Power-I/O board, which has the TFT, most of the I/O and the majority of the power regulation, is an 8-layer board, and has a 640x480 TFT, 14 I/O buttons, a multitude of LEDs and a small prototyping area underneath the TFT.

    Included with the kit is a 1GB microdrive, 2 fiber cables, 2 serial cables, an HSSDC2 cable, a serial ATA cable, two flavors of firewire, a Parallel Cable 4 programming cable, Xilinx ISE software, Chipscope ILA Pro, and on and on.

    In addition, I would like to say that this was an exciting project to work on - between the gigabit transceivers, the DDR and the high density of components on the board, this was the hardest board I've designed (I did the majority of the schematics and parts of the layout).

  • First, the Altera Flex is only $3000 or something and you could probably fabricate your own FPGA development board for $100. Slashdot shopping network won't tell you that nugget of information.

    Doing stuff in hardware is neat because it runs real fast, you're interacting with the real world instead of living in a black box, and you can charge money for it. Other than that, it's too expensive to use in most commercial situations and you need to go back to a general purpose computer. Let's put it this way. The ML300 is $4695 in materials. A standalone FPGA with supporting electronics and PCB fabrication is $100 in materials. Pure software on a general purpose computer is $0 in materials.

    • heroine: "Doing stuff in hardware is neat because it runs real fast, you're interacting with the real world instead of living in a black box, and you can charge money for it. Other than that, it's too expensive to use in most commercial situations and you need to go back to a general purpose computer. Let's put it this way. The ML300 is $4695 in materials. A standalone FPGA with supporting electronics and PCB fabrication is $100 in materials. Pure software on a general purpose computer is $0 in materials."

      The board is expensive because tech support for something like this is expensive. By charging a non-trivial amount of money, the vendor is able to weed out the non-serious players.

    • The bulk of the cost of the ML300 is not in the FPGA. The peripherals on the board and the accessories in the kit constitute a lot of the price.

      If you're interested in a "standalone" development board those are also available [].

Round Numbers are always false. -- Samuel Johnson