Open Source Graphic Card Project Seeks Experts 370
An anonymous reader writes "Could this dream of many open source developers and users finally happen? A 100% open sourced graphic card with 3D support? Proper 3D card support for OpenBSD, NetBSD and other minority operating systems? A company named Tech Source will try to make it happen. You can download the preliminary specs for the card here (pdf). The project, though a commercial one, wants to become a true community project and encourages experts and everyone who has good ideas to add to the development process to join the mailing list. You can also sign a petition and tell how much you would be willing to pay for the final product."
Great!! (Score:5, Interesting)
In theory other companies might steal the design and build and sell the card on their own, but if the design is community-owned, then that actually works to lower prices...
Anonymous Cow
Re:Great!! (Score:5, Insightful)
In fact, this is the very point of such a project. If a company comes along and wants to use it for a product they want to develop, then they can!
Re:Great!! (Score:4, Informative)
Having a straightforward design suitable for an FPGA would enable them to add additional fail-safe mechanisms and to qualify more easily for these applications. Oh yes, and they get others to work on their products for free. They could use rad-hardened FPGAs for the final implementation.
Waste of time (Score:5, Interesting)
Building a good open 2D card? Maybe... I doubt it's really feasible, but have at it. Chase that dream.
But a 3D card? You are going to make a card to run the latest Quake and Doom? Or even a release or two back? Do you realize how much time, how many thousands of man-hours go into these cards? The dollar amounts for the simulators, the fabs to make the prototypes, etc.?
This could however, make a great teaching tool.
I take it back... if the card can target elementary 3D and stellar 2D, it could (in a few years) be THE card to own for a commodity Linux box. Target your audience carefully and don't get caught up in the IdSoftware upgrade cycle! :)
Re:Waste of time (Score:5, Insightful)
A harder problem is getting enough of the target audience to accept that they're in the target audience, because people (or at least Americans; I can't speak for other cultures) like to have the possibility of doing something, even if they'll never do it (hence the ubiquity of SUVs on our roads, but I digress). This should be easier with people that use open-source software, though; 3D-intensive software for those isn't nearly as common as on Windows.
That said, if they can convince someone to slap it on a PCB, I'll keep an eye out for these things next time I need a video card.
Re:Waste of time (Score:5, Insightful)
You end up with much smoother window rendering, and it allows you to add in things like desktop transparency and shadowing without much of a performance hit. A 2D only card may be "good enough" for some, but the desktop environments are quickly moving in a direction where that may no longer be the case by time this card would come to market. Going for at least rudimentary OpenGL support from the start would be a good idea.
Re:Waste of time (Score:3)
Re:Waste of time (Score:5, Informative)
Also, Quartz Extreme is most definitely using 3D hardware acceleration. Regular "old" Quartz used before 10.2 was purely 2D based, but Quartz Extreme leverages your 3D accelerator to render the desktop on screen - acting like "Everything is a textured polygon." [udnimweb.de]
Re:Waste of time (Score:4, Interesting)
When you think about it, Quartz Extreme only needs to handle a relatively small number of parallel polygons at basically a constant distance away. That's a much simpler job than millions of triangles at arbitrary angles to each other at varying distances and whatnot.
The job the video card does can potentially be as simple as figuring out which window is exposed in a given area and grabbing pixels from the appropriate frame buffer. OpenGL is a good deal more complicated than that, but since both the driver and the FPGA are under our control, I would think it would be possible.
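The per-pixel job described above can be sketched in a few lines. This is a hypothetical, simplified model, not any real driver's code: for each screen pixel, find the frontmost window covering it and pull the pixel from that window's own frame buffer.

```python
# Minimal sketch of the compositing job described above. Windows are given
# front-to-back; the first one covering a pixel wins. All names here are
# illustrative, not from any real driver.

def composite(screen_w, screen_h, windows):
    """windows: list of dicts front-to-back, each with keys x, y, w, h and
    'pixels', a 2D list holding that window's private frame buffer."""
    screen = [[(0, 0, 0)] * screen_w for _ in range(screen_h)]
    for sy in range(screen_h):
        for sx in range(screen_w):
            for win in windows:  # front-to-back: first hit wins
                if (win["x"] <= sx < win["x"] + win["w"]
                        and win["y"] <= sy < win["y"] + win["h"]):
                    screen[sy][sx] = win["pixels"][sy - win["y"]][sx - win["x"]]
                    break
    return screen
```

A real compositor would do this per exposed rectangle rather than per pixel, and would let the 3D hardware do the copying as textured quads, but the logic is this simple at heart.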
Re:Waste of time (Score:4, Informative)
Re:Waste of time (Score:4, Insightful)
Re:Waste of time (Score:5, Interesting)
That being said, 3D provides a lot more possibilities - you could make windows be actual objects that could be moved forward or backwards, stacked up, leaned against each other, and so on. Implement Havok physics so I can grab an icon and smash it into my other icons and watch them scatter all over my desktop, or throw it and watch it bounce off the edge of the screen and land in my network drive.
Eventually, all we'll need to do to solve the spyware problem is to use a wallhack and noclip and go bounce that crap to the curb. Sure, we'll have to endure the cries of spyware makers shouting 'lamer!' or 'wallhack' or 'aimbot', but we can just kick them off the network if it comes to that, or
Re:Waste of time (Score:2, Insightful)
False logic (Score:5, Insightful)
"No, it's impossible to build a replacement for Microsoft Office. Do you realize how much time, how many thousands of man hours went into this software?"
But there you go, Open Office is doing pretty well.
If anything, development of a good "open-source" 3D card could be hampered by patents.
Re:False logic (Score:5, Insightful)
Re: (Score:3, Insightful)
Re:False logic (Score:5, Informative)
The Windows driver for a DirectX card is not that complex - and there are several available reference sources (3Dlabs, ATI). The highly complex drivers out there at the moment are very heavily optimized for a given card - speed sells. But the central core of the driver is simple, with almost all work handled by one entry point that takes command batches.
I ought to know - I'm the guy that designed the Windows kernel interface to the driver back in '97, and it's basically unchanged to this day.
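The "one entry point that takes command batches" shape described above can be sketched as a toy, with invented opcodes (these are not the real DirectX interface names):

```python
# Toy sketch of a driver whose core is a single submission entry point
# consuming command batches, as described above. Opcodes are invented
# for illustration only.

class ToyDriver:
    def __init__(self):
        self.state = {}       # current render state
        self.draw_calls = 0   # count of draws executed

    def submit(self, batch):
        """The single entry point: interpret a list of (opcode, args)."""
        for op, args in batch:
            if op == "SET_STATE":
                self.state.update(args)
            elif op == "DRAW":
                self.draw_calls += 1
            else:
                raise ValueError("unknown opcode %r" % op)
```

The point is that almost everything else in a heavily optimized driver is about building those batches efficiently, not about the interface itself.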
OpenGL Drivers (Score:3, Insightful)
No one has to write an OpenGL driver from scratch. You just start with Mesa and start offloading stuff to hardware as much as you can. It's not a great route to a great system, but it's a straightforward route to something that works and is feature-complete.
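The "start from software, offload piecemeal" approach described above can be sketched like this. Class and method names here are illustrative, not Mesa's actual internals: every operation has a software fallback, and a hardware layer overrides individual entries as support becomes available.

```python
# Hedged sketch of incremental hardware offload: a pure-software pipeline
# provides every operation, and a subclass replaces only what the hardware
# can accelerate. Names are invented for illustration.

class SoftwarePipe:
    def fill_rect(self, x, y, w, h, color):
        return ("sw_fill", x, y, w, h, color)      # CPU rasterization

    def draw_triangle(self, verts):
        return ("sw_triangle", verts)              # CPU rasterization

class OffloadingPipe(SoftwarePipe):
    """Override just what the hardware can do; the rest falls through."""
    def fill_rect(self, x, y, w, h, color):
        return ("hw_fill", x, y, w, h, color)      # accelerated path
    # draw_triangle is inherited: still software until the hardware grows it
```

Each new hardware feature is one more override; the result is always feature-complete because the fallback never goes away.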
Re:False logic (Score:2)
I see you drive a DeLorean. Dude, maybe this was the case back in 1996, but these days, Word owns the market.
Sad but true. Hell, I haven't even SEEN a WP5.1 installation in years. Maybe it's still big in some small business sectors but overall it's history.
Kind of sad, too. It was the last great word processor for DOS.
-Z
Re:False logic (Score:2, Funny)
Silly me, I'll read more carefully next time.. }:)
-Z
Re:False logic (Score:2, Interesting)
Re:False logic (Score:2, Offtopic)
Which was purchased from Star Software (Score:3, Informative)
A New Hope. (Score:3, Insightful)
Repeat after me. Hardware is not software. Software is not hardware.
Overestimating is not any better than underestimating.
"If anything, development of a good "open-source" 3D card could be hampered by patents."
I've said as much elsewhere. The Vorbis people have shown that patents can be dealt with. However, graphics is considerably more complex.
I won't repeat after you (Score:3, Insightful)
Re:False logic (Score:2)
Re:False logic (Score:5, Informative)
Talk about "false logic." Open Office is doing pretty well because it has had a huge amount of time and money put into it over the years. By the way, it existed for many years as closed source before it became open source, even before Sun bought it.
http://en.wikipedia.org/wiki/StarOffice
And it's not anywhere near being ready to replace Microsoft Office, but I guess they've only had 10 years...
Re:False logic (Score:3, Funny)
Yeah I'm still waiting for Open Office to support macro viruses.
Re:False logic (Score:3, Insightful)
Re:False logic (Score:5, Insightful)
I disagree. The same thing was said about Linux back when it didn't have networking, didn't have SMP support, didn't have a journalled filesystem, etc. It only took 10 years for all those comments to become irrelevant. It turns out that 10 years is a reasonable timeframe.
In the same vein, AbiWord and Gnumeric, while admittedly not as good as OpenOffice or Microsoft Office, are well on their way to being decent office applications. The KDE crowd also has their own fully-free office suite (KWord, KSpread, etc). If OpenOffice hadn't been donated then the development effort would have gone into the GNOME and KDE applications and they would be further along than they are currently. They would without doubt have been at the tipping point within 5 years; that sounds reasonable to me.
Sun helped the process along, fast-forwarding us at least 5 years, but they did not solve the "impossible".
Re:False logic (Score:3, Insightful)
Incorrect. Linux had TCP/IP, SMP, journalled filesystems and lots of advanced features before IBM started paying attention. At least 4 years before, in fact.
It was primarily because Linux was so advanced that the big companies started payin
Re:Waste of time (Score:5, Insightful)
I don't think there's any requirement for it to be cutting edge. They just said "3D support", not "runs Doom 3 as fast as the latest nVidia or ATI card". For a lot of people a card that was capable of running, say, Quake 3 at reasonable (but not necessarily blindingly fast) frame rates would be quite sufficient. Not everyone gets 3D support on a card for gaming purposes, and for those people an open card that provides credible 3D support may be an attractive option.
Sure, you won't compete with ATI and nVidia, but then guaranteed open source drivers that will get the maximum performance out of the card are quite a benefit in themselves. Especially given the quality of ATI's Linux drivers.
There is a market for this card. No, it isn't a huge market, but then Apple doesn't have a huge chunk of the desktop market, and they seem to be rolling along fine. As long as there is a big enough niche to support the company, that's all they need. More power to them.
Jedidiah.
Re:Waste of time (Score:3, Insightful)
I'm off-track though: my point is that if the "market" of co
Re:Waste of time (Score:3, Informative)
I already have one of them and probably wouldn't buy a worse card.
Re:Waste of time (Score:5, Informative)
If you read the mailing list archive, you'll see that what they are proposing is a card with simple, OpenGL compatible 3D. The interface will be PCI at first. My impression is that they have mini-ITX boards in mind. The last paragraph of your post is correct: they will probably target commodity Linux (and significantly, BSD) boxes.
I think that this is a great idea. Right now, if you want open source 3D, the only good hardware available is the Matrox G400/450/550 line, and that's over 5 years old. I bought my G450 in 1999 and am still using it quite happily, but I would certainly buy an open hardware card from Tech Source if this project comes to fruition.
As someone on OSNews posted, this project could be profitable for a small company even if it would be considered a flop by ATI or Nvidia.
Re:Waste of time (Score:3, Interesting)
I'm all for open-source hardware products, but let's make them something that isn't already readily available in a form open-source folks find to be generally acceptable. They should at least give the thing *one* major feature advantage (how about quad DVI? No one is doing THAT yet... at least not in any reaso
Re:Waste of time (Score:2, Interesting)
While I love my Matrox G450, the fact is, Matrox will never release another card like it, nor will they improve on it. If the Tech Source project works, then one day, it will release a card that is superior to the G450.
Re:Waste of time (Score:2)
Re:Waste of time (Score:5, Insightful)
Strange, my ATi Radeon 9200 RV280s disagree with you.
All of the R100 and R200 family Radeons are supported by the open DRI 3D drivers - type 'man radeon' for further information (including product names). The R300s are not supported, though (but are supported for 2D by X). The fastest open-driver supported 3D card is the R200-based FireGL (careful - there's a newer R3xx-based FireGL which won't work). There is work underway to reverse engineer the R3xx family and support the 3D features in the open drivers, see r300.sf.net [sf.net]. Also, there is an experimental R2xx Xorg kdrive Xserver featuring acceleration of XRender, and it's probably where the work to move the Xserver over to 3D primitives will occur.
Anyway, go stock up on ATi Radeon 9200's. I have two, one AGP and one PCI, running happily on AMD64 and Alpha.
Re:Waste of time (Score:5, Insightful)
Commodity Linux boxes already have elementary 3D and stellar 2D. It's called Intel Extreme Graphics, has open source drivers, and it costs like $10.
Just want to repeat that $10 figure again. You are going to have to do better than fanboyism to beat that.
Finally (Score:3, Interesting)
I have never understood this project. If they want to start with something at least equivalent to a five-year-old SGI graphics pipeline and build from there, then I'd say go for it. But the specs on this card don't look any better than the stuff you get right OOTB with an Intel chipset (which, after suffering with this goddamned nvidia system for too long now, is the reason I'll not be buyi
Re:Waste of time (Score:2)
Hardware is fairly straightforward to build: 80 percent of the speedups in hardware come from process and fabrication technology. The other 20% is clever hardware and gate arrangements.
I mean, if I remember right, such things as latches, memory cells, etc. are just the same circuit pattern repeated, for the most part.
I am not so sure the community wants THE fastest card.
I would be happy with three cards:
1) Super high end using PCI Express with complete support for 3D, particularly OpenGL
Re:Waste of time (Score:2)
$50-100.
Er, only problem is this market is already served by generic $20 cards.
Personally, I've never paid more than $50 for a video card and I doubt I ever will. This goes for everyone I know both personally and professionally because, as unlikely as it might seem here on
In the busi
Not For Quake (Score:5, Insightful)
For 99% of users, this could be a great card. If it does great 2D, and can do good 3D (especially features like those used in Apple's Quartz, or Project Looking Glass) it would work more than well enough. Let's face it, for a large number of applications, a GeForce (original) quality 3D would be MORE than enough for most anything many people would do. And if the graphics are localized into a small area (say a little 200x200 area of a window), then even such a card would be able to render very nice looking graphics (just like a "slow" card could run Doom 3 looking great at such a low resolution).
I'm with you. For a quality, commodity card this could be great. Plus, with the FPGA, not only could you hack the DRIVERS, you could hack the FIRMWARE! Think! You could buy the card, and write software to take the burden off the CPU for decoding MPEG-2 or MPEG-4. You could even (with a little kernel help) swap firmware on the fly so you could have that video decoding, and then enter a command (or press a button on your desktop) to have the 3D firmware put in. When you're done, go back to video decoding acceleration.
Hell, make it run SETI in the background at super fast speed when just using 2D (like using nVidia cards to do scientific calculations on the GPU).
These things could be a LOT of fun to mess around with. I think I just sold myself on one ;)
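The on-the-fly firmware swapping idea above could look something like this from the kernel-helper side. Everything here is hypothetical, including the bitstream names and the idea that loading is just a table lookup; a real implementation would stream a configuration bitfile into the FPGA:

```python
# Sketch of a manager that keeps the FPGA loaded with whichever core the
# current workload needs. Bitstream contents and the load mechanism are
# purely hypothetical placeholders.

class FpgaManager:
    BITSTREAMS = {
        "video_decode": b"<mpeg decode core bitstream>",
        "render_3d": b"<3d render core bitstream>",
    }

    def __init__(self):
        self.loaded = None  # name of the currently flashed core

    def ensure(self, task):
        """Reconfigure only when the requested core isn't already loaded."""
        if task != self.loaded:
            _bitstream = self.BITSTREAMS[task]  # would be flashed to the FPGA
            self.loaded = task
        return self.loaded
```

The interesting design question is reconfiguration latency: swapping a full bitstream takes long enough that you'd want it tied to coarse mode switches (a button, a media player starting), not per-frame decisions.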
Re:Waste of time (Score:2)
You could try and target that old 386 box in the basement, but don't even bother with SMPs, clusters and large high end machines!
Re:Waste of time (Score:2)
I'm sure they do realise this. However I'm their target audience; the 99% of people who don't spend $500 on a video card. I don't need nor want an ATI RadForce 9630 Plus Zero Alpha (+++++) video card. I just want something that runs a
Great Idea (Score:5, Interesting)
Really what a project like this needs is the developer to shut out the open source community until the project is done. If Linus had made a large project out of the original kernel, I seriously doubt it would have ever been completed. This should be kept simple, and then open sourced only once there is a good code base to build from.
Don't let them turn this into a HURD (Score:5, Insightful)
ati & nvidia release old specs? (Score:5, Interesting)
Re:ati & nvidia release old specs? (Score:4, Interesting)
Re:ati & nvidia release old specs? (Score:2)
Re:ati & nvidia release old specs? (Score:2)
If I remember right, though, most video card companies can't open source their drivers even if they wanted to. Parts of them contain licensed copyrighted code from companies like SGI, which will not permit their stuff to be released to the public.
Re:ati & nvidia release old specs? (Score:2)
Won't happen, and this is why: (Score:4, Insightful)
This, in addition to the very good point made by the poster above.
Re:ati & nvidia release old specs? ATI already (Score:3, Informative)
ATI already released specs for their R2xx cards. So everything up to an ATI 9200 has accelerated 3D support under X.org using the standard radeon driver. You won't get speeds as fast as the ATI drivers, and some things like texture compression aren't supported due to patents, but it gives good performance for something like Chromium B.S.U. and Tux Racer.
Yay! It has an FPGA on it. (Score:5, Informative)
I think the company would make a ton of money just making these as a reference platform and selling them to University students looking for a way to program their own GPU on the cheap for research purposes. Heck, Xilinx should do it themselves, and give all these students exposure to Xilinx parts (and their crappy design software) before they even find out who Altera is.
This project looks interesting. I'd sign on to help out, but this gets dangerously close to what my Day Job is, and I don't think my management would smile on my participation...
Re:Yay! It has an FPGA on it. (Score:3, Interesting)
Boards such as the Multimedia Board http://www.xilinx.com/products/boards/multimedia/ [xilinx.com] contain everything you would need. Not cheap though...
They have not put the whole thing on a PCI card, probably because it's even more fun to integrate a CPU core and build the whole system-on-chip on the FPGA while at it.
Cheers!
tech source isn't some n00b company... (Score:5, Informative)
tell how much you would be willing to pay (Score:5, Insightful)
What sense does this make? There are some people (not me) that might pay up to $500 for the newest ATI or Nvidia cards. But they do that with the knowledge that the hottest 3D applications will take advantage of them. More importantly, that is the price they might pay for those cards today. It's well known that in six months those cards might be worth half that, in a year perhaps around $100. How can anyone say how much they would be willing to pay for the final product when by that time it might not even compete with the $100 cards?
Re:tell how much you would be willing to pay (Score:2)
There's a certain amount of money people are willing to pay for basic functionality, and then on top of that you add the amount you'd be willing to pay for a fully open-sourced card.
Add the two up, there you go.
As there is no competitor for the open-source aspect of the card, that's pretty much the only factor.
Re:tell how much you would be willing to pay (Score:2)
Re:tell how much you would be willing to pay (Score:2, Informative)
I suppose they'll have to either buy a lot of FPGAs to get price reductions or wait until the price of those programmable chips comes down. Add to this the price of DRAM, a 250 MHz 3-channel DAC (which would not be on the FPGA), power converter, serial Flash for FPGA configuration and extra discretes. My guess on the final price... Ea
"Could this dream... really happen?" (Score:3, Interesting)
Come on, folks, let's get real.
I'd throw a few bux their way (Score:3, Insightful)
Interesting tech... (Score:3, Insightful)
I can see it now: custom logic patches to change the core for extra performance on your favorite game...
A semi closed source model (Score:2)
A further question. (Score:2)
Now that a little bit of money is rolling in,
get it out quickly and create a framework (Score:5, Interesting)
Re:get it out quickly and create a framework (Score:5, Insightful)
It's already obsolete. It's on par with cards from about 6-7 years ago, if they achieve everything in their spec. It's only good enough as a teaching tool.
You can charge a little more than a comparable regular graphics card, but not a lot more. If this becomes a premium custom hardware product, it's dead on arrival.
A comparable graphics card costs $10 if you can even find it these days.
I don't see how this is worth the effort when you can buy the cheapest ATI card, use the generic open-source VGA driver, and achieve better 2D performance. This is somewhat like somebody trying to get people to work on an open-source version of DOS. Sure, you get your freedom of the free software, but who would want to use DOS? I'm all for open source, but it has to be at least remotely competitive to get somebody to look at it.
What about OpenGL, 3dFX and gameing? (Score:5, Informative)
The first is that these are respectable specs - providing you don't want to do any gaming.
I think that is a really important caveat. I know that every once in a while people get all excited because the usual suspects port their games to Linux - you know, id and Blizzard come to mind.
It is a good thing that these two companies do this, but it is a bad thing that there are really only two companies that do this with anything approaching reliability.
Thing is... a card with these specs, especially considering that it is a year if not more away from reality, will never cut it for any sort of gaming. You are going to produce a card with 3D support that doesn't have the muscle to handle any 3D games that are produced.
If you are fine with that then there is nothing wrong with those specs. This card will be able to handle email, porn and movies as well as anything ATI produces.
My 2nd thought is a bit more practical.
Actually there may not be anything practical about it. Might just be wishful thinking, really.
What about 3DFX? What about OPENGL?
Between the two things isn't half the work already done?
I know it might seem insane - nuts even, but back in the day 3dfx had some very respectable hardware. They didn't fail because their stuff was poop; they failed because they underestimated nVidia (which in turn underestimated ATI). The hardware is still out there, the code is still out there. It just isn't being utilized.
Would there be anything wrong with utilizing these old resources to achieve this goal?
Re:What about OpenGL, 3dFX and gameing? (Score:2, Informative)
Blizzard doesn't port their games to Linux - only to Mac, unfortunately. (I would buy a Linux version of WC3/TFT, if anyone from Blizzard is reading this...) Perhaps you are thinking of Epic?
What about 3DFX? [...] Would there be anything wrong with utilizing these old resources to achieve this goal?
NVidia bought 3Dfx and assimilated its intellectual property. I don't think you could make a clone of a 3Dfx card withou
Sweet! (Score:5, Funny)
Not going over well... (Score:3, Informative)
a big mistake (Score:3, Insightful)
Osho
A few comments (Score:5, Insightful)
FPGAs are also slower than ASICs. This, and the cost, are the reasons why commercial manufacturers use ASICs. You may have a great design, but if it is limited by the performance of your FPGA you lose.
FPGAs are designed to be universal, and to do that they feature programmable interconnects. But the number of those interconnects is limited, and many FPGA designs are thus constrained. You may have plenty of gates left and no way to get to them... With ASICs this is not a problem because if you need a wider bus you build it there, on your own silicon. In FPGAs the busses are already there, and you can't add more.
Yet another concern is tools. Xilinx, for example, offers a free download of some bare-minimum tools. They work OK if you are making a door lock with RS-232 control. But they fail miserably, to the point of being unusable, on a complex design - which this one is. Better tools, such as Synplify, will cost you your yearly salary. How many developers have access to tools like that? And once you switch to some specific tool you are committed.
Finally, there is a problem with developers' skills. There are many s/w developers who are very good with C/C++. But not that many are good with Verilog (or its wickedly evil cousin, VHDL :-) Hardware design is very, very different from software design. And you can't debug it, you can only simulate it. Simulation tools, such as ModelSim, are absolutely not free at the level that you need for this design.
To summarize, this project can be done, but not by a bazaarful of people - only by a small, dedicated band of wizards who have locked themselves up in a small cathedral. Even if these wizards release their works, no mere mortal will even be able to open their files, since the tools to do that are not free.
And besides, why would any sane person, who is not burdened with FOSS thoughts, want to buy such a card even for $100? This cash buys you a decent entry-level Quadro, and if anyone suggests that this design can beat Quadro I won't believe that...
And if anyone wants a real entry-level card, then it can be had (Vanta TNT2, for example) for $10 in any bargain bin, at many places. Beat that first.
Re:A few comments (Score:5, Informative)
I'm actually quite surprised that they're opting for a Xilinx FPGA here; I must be missing something. Is there any particular reason that reprogrammability is more important than cost and speed for this?
In a second. (Score:2, Insightful)
The way I see it, most of the cost of the latest ATi or nVidia cards is to cover R&D expenses. The fact that the price drops drastically in a year or two is evidence of this.
The advantage of an open source hardware project isn't just that you have documentation for the hardware and can therefore write drivers for it. The real advantage is the same advantage that
Trend: Graphics Cards become General Purpose Cards (Score:4, Interesting)
Some of the thoughts expressed by experts are that 3D cards may become general purpose parallel computing cards.
If it weren't for bottlenecks in the AGP bus, it would be possible to use today's 3D cards for more general-purpose computing (I'm fuzzy on what the actual holdups are here... timing issues?).
There have been Slashdot discussions about using the graphics card for audio processing, because audio is usually less than a 32 bit stream. The problem is that audio and often general purpose computing have "real time" requirements.
Also, make sure your open source card supports ARB_fragment_program!
contact HP & IBM (Score:3, Insightful)
Good company to do it.. (Score:3, Informative)
Order-driven fabbing, other crazy ideas (Score:3, Interesting)
Might want to consider setting up a site for people to register their interest and potential orders, not just how much you would pay for but actually get the orders.
I don't remember if it was successful, but Sony has done this in the past. I know it failed once due to (I believe) a WebLogic crash from too many orders, or a weak system.
If the website is mentioned every time a story appears on slashdot or some other site, you can continue to accumulate and update information. If you make transparent the financials behind it, people may rush in to get you over the threshold of a precalculated breakeven point (including reasonable profit of course).
Personally I am in the market for a graphics card in the next 6 months. I am planning on getting the best I can afford at the time, and am curious what this project might offer to sway me. Sure performance is not likely to beat the top of the line of the other competitors at the same price point.. at least that is what one would guess. Maybe not true? Well, the FPGA looks really cool.
Consider that the fastest supercomputer in the world is the GRAPE-6 (GRAvity PipE) built on FPGAs for simulation of gravitational interactions (of globular clusters, etc.).
I was thinking it might be closer to something insanely great if you go for the multiple channels now for example. Maybe if you ask about that on your site you'll get people to agree. (How much more would it cost? etc.).
Also I don't know what the FPGA would promise, presumably quick firmware updates from the net of course. Could part of it be used for another purpose, or is that too difficult? Could an additional FPGA be turned into a chip that runs linux (use it on a PC) or perhaps be flashed with the results of another project (I'd love to have a Perl chip.. make it and they will come?) Could another chip or expanded memory provide say a video wall controller with edge blending for multiple screens in realtime? This kind of thing alone might sell enough to make it useful. What do commercial image processors have that this couldn't?
I just saw a sexy video switching fabric thingy here [jupiter.com]
I am curious about what exact "X.org eye candy" this would enable. I am guessing something along the lines of this OSNews comment by Bryan Kagnime: "I don't really care so much for the 3D gaming aspect; distribute with the card an open source operating system like Slackware with some 2D desktop eye candy (translucency/transparency/OpenGL) and I'll buy a card for everyone I know with a comp. This'll show users *what* Linux is all about, distributing a superior product and opening the market share for innovators."
One post on osnews mentioned realtime encoding/decoding of video streams, and though I am not sure this would not still impact the rest of the machine considering the design, that sounds neat!
128MB is enough to hold a couple frames of 20 times the resolution of a 1024x768 screen and still have over 30 MB left over. What if it included support for edge/corner blending and warping for a video wall? Is it conceivable that this could take the output from a fast consumer card and provide 2D warping and other effects for displays using multiple projected patches? Consider what it is good at. How about talking over the network or other bus to other oss graphics cards for multiple projector support.
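The edge-blending idea above boils down to a per-pixel weight: in the band where two projector patches overlap, each projector's output is attenuated by a ramp so the combined brightness stays constant. A minimal sketch with a linear ramp (real blenders typically use a gamma-corrected curve, and this assumes both edges of a patch are blended):

```python
# Sketch of edge blending for a video wall: each projector patch gets a
# weight that ramps from 0 to 1 over the overlap band, so that weights of
# adjacent patches sum to 1 where they overlap. Linear ramp for simplicity.

def blend_weight(x, patch_start, patch_end, overlap):
    """Weight in [0, 1] for a patch spanning [patch_start, patch_end)
    with 'overlap' pixels shared at each blended edge."""
    if x < patch_start or x >= patch_end:
        return 0.0
    left_ramp = (x - patch_start) / overlap    # rising edge
    right_ramp = (patch_end - x) / overlap     # falling edge
    return min(1.0, left_ramp, right_ramp)
```

With two patches [0, 100) and [90, 190) and a 10-pixel overlap, the weights at any x in the overlap band sum to 1, which is exactly the "constant combined brightness" property a wall controller needs.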
If some nonvolatile memory was included, the card could remember a video wall wallpaper and open window/document information, or keep some megapixel images or something else always available. Would this be useful, say for quick startup or as a backup for important memories?
How about selling with an external patchbay that can take many video sources and provi
Priorities (Score:3, Interesting)
The absolute #1 focus for this card (if they hope to get people to pay more than $30 for it) needs to be full reprogrammability by mere mortals. It would be absolutely wonderful to get a general-purpose FPGA in a computer. People pay more than $100 for crypto cards, video capture cards, etc. because hardware is so much better at those tasks. This would wipe the floor with them, because you could program in a new codec or cipher.
Even if it didn't have any video output at all, I'd still pay $100+ for a PCI card version. Once video encoding apps are optimized to send the processing that's hardest on the CPU to the FPGA instead, I expect we'll see huge increases in encoding speed. That, BTW, also leads to much more complex codecs (MPEG-6 anyone?) that reduce filesize/bitrate significantly.
Besides that, I would also like to see a bit of effort in making sure it works on non-x86 hardware. Since this company makes video cards for SPARC systems, that surely would not be difficult for them to handle.
If this thing actually sees the light of day, it will completely change what a video card is. This also strikes me as a potentially pivotal moment in computer hardware. Perhaps, a few years from now, the biggest graphics card maker will have a museum wing dedicated to remembering how it all started back in 2004. Yeah, I know it's a stretch, but this really does have that potential.
What about this??? (Score:3, Interesting)
Manticore has already existed for some time, and it is also what they call Open Hardware. If they could work together, this could result in a good hardware implementation for Linux/Un*x.
Re:Dupe! (Score:3, Interesting)
Re:Dupe! (Score:3, Funny)
Whoops, I've been using a terminal too much.
Re:Dupe! (Score:4, Funny)
Hahahaha! Oh I'm a pathetic shell of a human.
+5 Korny!
Re:Dupe! (Score:3, Funny)
There's no need to bash yourself in that manner; it's perfectly all right to make a pun every now and then.
When will you /.-ers ever wake up... (Score:4, Funny)
TRANSLATION OF TRANSLATION: All the fruit of your labor are belong to us.
TRANSLATION OF TRANSLATION OF TRANSLATION: You've been 0wned, l00zer.
Re:When will you /.-ers ever wake up... (Score:4, Insightful)
But seriously, it's not fair to criticize them for what they are doing. For a very long time, users of open source software have whined and complained about the dearth of open-spec hardware.
Here, a company has come along, offering to give the people what they have been asking for. You see a problem with that?
Not a dupe (Score:5, Insightful)
RTFA/RTFWS/RTFE! (Score:5, Informative)
If you'd read up on this subject, you'd have seen that these folk *do* know their hardware.
They are also not being overly ambitious. While they expect to be able to develop a card with 3D acceleration for desktop applications, they make no bold claims about gaming.
Indeed, this card is being designed as the ideal desktop-card for open-source systems with open-source drivers and firmware. Any gaming performance, while unlikely, should be treated as a bonus.
I have already pledged my intention to buy one of these cards just out of curiosity.
Re:RTFA/RTFWS/RTFE! (Score:5, Interesting)
Falling anywhere short of, say, OpenGL 1.4 support would make it pretty much useless. In other words, it doesn't have to have pixel shaders, but it has to have good, filtered texture mapping, lighting, alpha, quite a bag of stuff. The Spartan 3 (not III, as the tech spec suggests) has 1.5 million gates and runs at 384 MHz, which ought to be enough for a decent 3D core, with one catch: it's got 32 18x18 multipliers, but no dividers. Don't even think about floating point, obviously, but without dividers, perspective interpolation is going to be pretty tough. Without perspective interpolation... well, think "1970s".
I just hope there's a standard way of getting around this. Any hardware hacks out there?
no dividers. (Score:2)
I.e., you need x/n for your perspective interpolation, so you instead calculate 1/n outside the FPGA and do x*(1/n).
Of course, if n is not a constant, or a small group of constants, then external calculations of its value will slow things down dramatically.
Also, why not just use the 'MicroBlaze' soft CPU for the Spartan 3? This includes a mapped instruction to do a divide fairly q
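The x*(1/n) trick described above can be sketched in a few lines. This is a hypothetical software model (function and parameter names are my own, not anything from the spec): the per-vertex reciprocals of w are precomputed outside the inner loop, so the only divide left per pixel is a single reciprocal, which a multiplier plus a small lookup table can approximate.

```python
def persp_interp(u0, u1, inv_w0, inv_w1, t):
    """Perspective-correct interpolation of a texture coordinate u.

    Callers precompute inv_w0 = 1/w0 and inv_w1 = 1/w1 once per vertex,
    moving those divides out of the per-pixel loop.
    """
    uw = (1 - t) * (u0 * inv_w0) + t * (u1 * inv_w1)  # interpolate u/w linearly
    iw = (1 - t) * inv_w0 + t * inv_w1                # interpolate 1/w linearly
    # One reciprocal per pixel remains; in hardware this would come from
    # a small seed table plus multiplies rather than a real divider.
    return uw * (1.0 / iw)
```

For example, interpolating halfway between u=0 at w=1 and u=1 at w=2 gives 1/3 rather than the screen-space midpoint 1/2, which is exactly the perspective correction a plain linear interpolator misses.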
Use a coprocessor? (Score:2)
Perhaps there exists a cheap ASIC divider/trig unit (a 487?) that they can use as a coprocessor...
Re:RTFA/RTFWS/RTFE! (Score:5, Interesting)
Odds are that your CPU doesn't have a divider on it either.
Google for Newton-Raphson.
Fast hardware dividers are big and expensive - somewhat more expensive than a multiplier. But if you have a multiplier and you're not too concerned about performance, or are happy to trade off precision for performance, then you can do division using your multiplier, a small seed ROM and a microcode engine.
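A minimal software model of the multiplier-plus-seed-ROM scheme the parent describes, using the standard Newton-Raphson reciprocal iteration (function names here are illustrative, not from any real design):

```python
import math

def recip_newton(n, iterations=3):
    """Approximate 1/n for n in [0.5, 1.0) using only multiplies and adds.

    The seed 48/17 - 32/17*n is the classic linear estimate; in hardware it
    would come from a small ROM. Each iteration roughly doubles the number
    of correct bits, so three iterations are plenty for a graphics pipeline.
    """
    x = 48 / 17 - (32 / 17) * n       # seed (ROM lookup in hardware)
    for _ in range(iterations):
        x = x * (2 - n * x)           # Newton-Raphson step: multiplies only
    return x

def divide(a, b):
    """a / b for b > 0, built entirely on the reciprocal above."""
    m, e = math.frexp(b)              # normalize: b = m * 2**e, m in [0.5, 1)
    return math.ldexp(a * recip_newton(m), -e)
```

The normalization step mirrors what hardware would do with a shifter: reduce the operand to a fixed range so one small seed table covers every input.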
Re:RTFA/RTFWS/RTFE! (Score:3, Interesting)
There is of course also the question of whether open-source drivers can compete with the quality of, say, the NVIDIA drivers; after all, they 'just work'[tm], which is not something I can say about all the open-source stuff I use.
Overall I wish them luck, but I have a hard time imagining a market where su
Re:RTFA/RTFWS/RTFE! (Score:2)
The drivers and the firmware would be open to ad-hoc development so users could change things as they would see fit within the limitation of the hardware.
Of course, this has never been done before, but I wouldn't want to be a naysayer. I'd prefer to say, "Let's try and see how we do."
Re:RTFA/RTFWS/RTFE! (Score:2, Insightful)
Re:RTFA/RTFWS/RTFE! (Score:5, Insightful)
So if they aren't competing on the gaming front, and I highly doubt they'll be able to compete on the CAD front for the price they're expecting to sell the card for, then I'm afraid this idea is going to be dead before it ever really gets a chance to start.
So if they're not shooting for ATI or NVIDIA levels of performance... are they seriously thinking they'd be able to put out a card that could compete with the Wildcat Realizm cards for around $200? If so, I'd sign up, even if that's not the best card for games. As it is, however, I can't sign the petition in good conscience knowing that if the product couldn't compete with what's already out there, I'd just pass it up for something else that better suits my needs. I don't make enough money to be able to buy things I can't really use.
Re:Bloody Stupid (Score:2, Insightful)
We are designing our operating system for exactly what we want. I think you missed the point of Free Software.