NVIDIA Releases Source To CUDA Compiler
An anonymous reader writes "NVIDIA has announced they have 'open-sourced' their new CUDA compiler so that their GPGPU platform can be brought to new architectures. NVIDIA's CUDA compiler is based upon LLVM. At the moment, though, they seem to be restricting access to the source code to 'qualified' individuals.
The official press release implies wider access to the source will happen later. It so happens that a few days ago AMD open-sourced their OpenCL backend and added initial support to the Free Software r600 driver."
Should work well with AMD 7000 series (Score:3)
Based on what we know about the high-end AMD 7000 series (namely, that it will forgo VLIW in favor of separate threads), CUDA might actually map very well onto that architecture. As long as the right 'qualified' individuals work on it.
Re: (Score:2)
Doesn't this open-source compiler still compile code that requires Nvidia-proprietary hardware technology? Anyone who wants to run on non-Nvidia GPUs should just use OpenCL.
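For what it's worth, the kernel language is not really the proprietary part. A minimal sketch for comparison (mine, not from the article): the same trivial vector add in CUDA C with its rough OpenCL C equivalent in the trailing comment; the lock-in lives in the host-side runtime and driver, not in the kernel body.

    // CUDA C: a trivial vector add. The kernel syntax is nearly
    // interchangeable with OpenCL C; what differs is the host-side
    // runtime (cudaMalloc/cudaMemcpy/<<<...>>> launches) and the driver.
    __global__ void vec_add(const float *a, const float *b, float *c, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            c[i] = a[i] + b[i];
    }

    /* Rough OpenCL C equivalent of the same kernel:
     * __kernel void vec_add(__global const float *a, __global const float *b,
     *                       __global float *c, int n)
     * {
     *     int i = get_global_id(0);
     *     if (i < n) c[i] = a[i] + b[i];
     * }
     */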
Re:Should work well with AMD 7000 series (Score:4, Interesting)
D'oh, NM, I RTFA'd:
Nvidia's CUDA compiler will be able to create code that supports more programming languages and will run on AMD and Intel processors, while previously it ran only on Nvidia's GPUs. The company made the announcement today at the GPU Technology Conference in Beijing.
Continuing to call it a CUDA compiler is a bit misleading then isn't it?
GPU drivers (Score:1)
If they would open-source their video drivers too, NVIDIA would be the clear card of choice on *nix systems.
Re: (Score:1)
That's because right now, the nVidia solution just works. The open source community is primarily made up of people who proclaim open source as the second coming, use the software, and contribute absolutely nothing meaningful back. Of the small percentage that does contribute, you have people lending advice on IRC and message boards, people writing documentation, people doing HCI work, people managing, and people contributing code. Of those contributing code, you are only
not related to open source (Score:2)
Blaming ATi's lack of success on Linux desktops on the fact that they went op
Re: (Score:1)
I appreciate AMD opening up some of their secrets, and the open source Radeon driver is a lot better than it was in the ATI days. However, the closed driver still gives you double the fps in games; the bad news is that it works so badly, messing up the screen and such.
I recently bought a new nVidia-based card, partly because I was fed up with the proprietary Radeon driver not working. Gave up trying t
Re: (Score:2)
ATI once bricked my Radeon laptop by suddenly releasing drivers that couldn't draw a single pixel on the mobile 9600. Okay, so maybe it was a bug, but they weren't in a hurry to fix it. Yes, I could have installed an older driver, but because of Linux, that would have also meant installing an old distribution with an old kernel. I needed new features and programs. And even when the ATI driver initially worked, it didn't support everything (dual screen in particular was hacky).
I'd be very happy to see AMD make st
Re: (Score:1)
Interesting that you add this little delusion but completely ignore the part where, immediately prior, ATI told Linux users to go fuck themselves and handed the entire Linux market to NVIDIA. Furthermore, unlike ATI, NVIDIA has been very good to the Linux community for a very, very long time now. And unlike what you present, ATI's effort reflects an attempt to grab market share away from NVIDIA rather than a good-faith effort to do "right" by the open source community. On top of that, NVIDIA has repeatedly st
Re: (Score:3)
"It is shit, but will be supported and won't stop working after the kernel upgrades from 3.8.54-patch3 to 3.8.54-patch4."
Seems to be a sane option.
Re: (Score:3, Informative)
They can't; not fully, at least. They have to keep details about tilt bits and other DRM and patented crap secret.
Re: (Score:3, Informative)
If it's patented, then it isn't a secret anymore, by virtue of the patent.
Re:GPU drivers (Score:4, Insightful)
Do you know something we don't? Is there some deficiency in our understanding of patents? "A patent [wikipedia.org] ... consists of a set of exclusive rights granted by a sovereign state to an inventor or their assignee for a limited period of time in exchange for the public disclosure of an invention" (emphasis added).
Re: (Score:2)
Yes. Very valid point. I was only addressing the remark about patents.
Re: (Score:2)
If that does happen, it's a violation. Look, I'm strongly against the absurd fringe of patents (software, processes, business methods, and the like), and I'm not in favor of ANY patents under the present absurd system, but if they are going to exist in the first place, then outright violations should be shot down on sight.
Re: (Score:2)
I consider that practice a fundamental violation and agree that it should be stamped out.
Re: (Score:2)
Without patents, there would be far more trade secrets. Patents make inventions public, not private. That's the whole purpose of patents: to promote the spread of ideas by making them public.
Really? Wow, and here I thought they're solely a means for companies that don't actually produce anything except lawsuits to make money via protectionist licensing schemes! :P
Re: (Score:2)
If they would open-source their video drivers too, NVIDIA would be the clear card of choice on *nix systems.
It already is. At least in the post production industry. You don't see any of us whining about a lack of open-source drivers. We just want to get stuff done with the most reliable drivers available.
So... (Score:2)
Does this mean eventually running CUDA applications on AMD GPUs?
From TFA: CUDA runs on x86 (Score:4, Interesting)
OpenCL is a much better path, because it can execute code on a CPU as well as a GPU.
So can CUDA, according to a graphic in one of the featured articles [nvidia.com]: "NVIDIA C or C++, PGI Fortran, or new language support, through LLVM-based CUDA compiler, to NVIDIA GPUs, x86 CPUs, and new processor support."
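To see why that is plausible, here is a hedged sketch (mine, not NVIDIA's actual x86 path, which per TFA goes through their LLVM-based compiler): CUDA's programming model, a kernel body indexed by a thread id, collapses naturally onto a CPU loop, so retargeting the same source is mostly a compiler problem.

    // Sketch: a CUDA-style kernel body lowered by hand to a CPU loop.
    // This only illustrates why such retargeting is feasible; it compiles
    // as plain C++ or as a .cu file.
    #include <cstdio>

    // What one CUDA thread would compute, factored out of the launch.
    static void saxpy_body(int i, float a, const float *x, float *y)
    {
        y[i] = a * x[i] + y[i];
    }

    int main()
    {
        const int n = 8;
        float x[n], y[n];
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        // On a GPU this would be saxpy<<<blocks, threads>>>(...); on a CPU
        // the grid of threads collapses into an ordinary loop over the body.
        for (int i = 0; i < n; ++i)
            saxpy_body(i, 3.0f, x, y);

        printf("y[0] = %f\n", y[0]); // 5.0
        return 0;
    }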
any word on a license? (Score:4, Insightful)
Despite the phrase "open-source", there seems to be a distinct lack of information about whether this is a "source is now available for inspection" type release, or actually under an open-source license, and if so, which one.
Re: (Score:1)
"unfortunately" ?
BSD > GPL in terms of freedom.
Re: (Score:2)
This is the kind of disagreement in which neither side can ever hope to convince the other. Let's just say that you are free to do almost anything you want under a BSD license. That's an objective fact. You are not compelled to contribute back any improvements you may make in a work. Which one contributes more to global "freedom" is much too broad and slippery a concept to ever resolve unanimously.
Re: (Score:3, Interesting)
OK, then how do you explain the rather large number of companies that give back to BSD projects? This anti-BSD FUD that the Linux and GPL camp seem to need to spread got old many, many years ago.
Without a permissive license, the internet would have been greatly delayed, as MS and the others would have had to develop their own TCP/IP stacks from scratch.
Re: (Score:2)
I believe that is consonant with my comment. Not everyone only does what they are compelled to do. I have BSD licensed some of my own code.
Re: (Score:1)
BSD brings uncertainty: contributors can close the source code and never contribute back.
Re: (Score:2)
GPL has its place, I agree. But unfortunately its usage is often 'wrong', or even malicious, IMHO.
Re: (Score:2)
BSD is the perfect license to apply to a layer of software that helps people talk to the hardware you sell.
If somebody wants to "steal" it and make something great without sharing upstream, well, great for you: more people will buy your hardware.
No they haven't (Score:5, Interesting)
The title is correct; from TFA, the summary appears wrong. It seems they are not open-sourcing anything. To quote TFA:
On December 13th, NVIDIA announced that it will open up the CUDA platform by releasing source code for the CUDA Compiler.
They will let you look at the code, and they might let you send patches back to them. Nowhere that I can find did NVIDIA promise anything along the lines of an open license, or even any license at all. This is more like a Microsoft shared-source deal, where you can look, but no rights or privileges are transferred to you.
That said, it would still be cool to see.
Re: (Score:2)
Not sure any developer should even look at the code, given that it is not really open. The last thing a developer would want is to end up being accused of illegally copying stuff from said code.
usually the company wouldn't admit you'd seen the code anyhow, if they considered it that secret, and the NDAs usually go both ways.
you know why? because usually it's shit, and then they'd have to admit you worked on the project, and that would give the employee an edge when looking for future work. big companies don't like giving exact creds to people walking out; they like some creds on people walking in.
getting tainted from looking at the code is pure bullshit scaredy-pants nonsense - the code won't b
Re: (Score:1)
What a load of absolute crap. By that logic, don't ever work for anyone or ever sign any NDA.
Looking at stuff is how programmers learn. The whole industry is built on residual knowledge. If I say I used to work at nVidia (which I did) on stuff that they want to look at themselves, then provided the non-compete clause is done, I am at a massive advantage compared to someone who has never looked at anything at all.
Stop trotting out this bullshit myth that you can't look at things if you later want a job
Re: (Score:2)
It may appear that way to some, but it may or may not be so in fact. We don't know whether they intend to open-source it or not; not from this article, anyway. It certainly doesn't say it will NOT be open-sourced, though.
Re: (Score:2)
Open Source != Free Software.
Re: (Score:2)
"Open source is a development methodology; free software is a social movement." [gnu.org]
Re: (Score:1)
Why is everybody thinking this is big news?
ftp://download.nvidia.com/CUDAOpen64/ [nvidia.com]
The previous compiler, based upon Open64, has been available in source form since CUDA 1.0. They (partially) switched to LLVM in 4.1, and they are releasing that source code too. They didn't have to, because unlike Open64, LLVM is not GPL; so it's nice of them, but it's not exactly earth-shattering news...
In related news (Score:2)
Stikypad [sic] has "given away" all of his "money" to a "qualified individual."
Re: (Score:3)
Ah, you're married, aren't you?
Re: (Score:2)
No, he's just started investing.
Re: (Score:2)
Lost at Monopoly.
Can someone tell me NVidia's business model? (Score:2)
Discrete graphics is going away; they seem to be leaning increasingly towards the HPC market, but that is tiny compared to the consumer graphics market their company was built on. I just don't see it. Anyone?
Re: (Score:3)
(Off topic): Tegra [wikipedia.org]? (An ARM chip with nVIDIA graphics.)
Re:Can someone tell me NVidia's business model? (Score:4, Informative)
Well, they are making some of the best mobile/low-power solutions with the Tegra [nvidia.com] family of chipsets.
I also believe that it's still going to take some time before the integrated solutions (Intel IGP and AMD Fusion) are good enough to replace discrete graphics for gamers - where discrete cards are strong today.
Xbox 360 and Wii have integrated graphics (Score:2)
They are making some of the best mobile/low-power solutions with the Tegra [nvidia.com] family of chipsets.
That and nForce.
it's still going to take some time before the integrated solutions (Intel IGP and AMD Fusion) are good enough to replace discrete graphics for gamers
SWF games, such as those seen on Facebook, are targeted at PCs with Intel GMA (Graphics My Ass) IGPs. So to people for whom "gaming" means FarmVille and "upgrade" means buying a new PC, integrated graphics have replaced discrete.
Xbox 360 and Wii have "integrated graphics" by AMD in the sense that the GPU is on the northbridge. The 360's graphics are also "integrated" in the sense that all 512 MB of its RAM can be used as VRAM. I'm not very familiar with the PS3 architecture other than tha
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Discrete graphics is going away; they seem to be leaning increasingly towards the HPC market, but that is tiny compared to the consumer graphics market their company was built on. I just don't see it. Anyone?
The discrete GPU market is growing. See JPR's analyst report: http://jonpeddie.com/press-releases/details/embedded-graphics-processors-killing-off-igps-no-threat-to-discrete-gpus/ [jonpeddie.com]
Here is the full report: http://jonpeddie.com/download/media/slides/An_Analysis_of_the_GPU_Market.pdf [jonpeddie.com]
That is the funniest thing I've read in a long (Score:2)
long time.
I'm going to print it out and put it on the shelf next to:
* "Buggy whip industry still growing with no end in sight"
* "Refrigeration is no threat to the ice delivery business"
* "Travel agents expect little competition from internet sales"
GPGPU Cold War finally ending? (Score:1)
Open Source the Libraries (Score:4, Interesting)
The pain is not in compiling GPU code; rather, the pain is in writing good GPU code. The major difference between NVIDIA and AMD (and the major edge NVIDIA has over AMD) is not as much the compiler as it is the libraries.
Of course, I'm biased, because I work at AccelerEyes and we do GPU consulting with our freely available, but not open source, ArrayFire GPU library [accelereyes.com], which has both CUDA and OpenCL versions.
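To make the library point concrete, a hedged illustration using Thrust (the template library NVIDIA bundles with CUDA) rather than ArrayFire, whose API I won't reproduce from memory: a full GPU reduction, which hand-written means a careful multi-pass shared-memory kernel, becomes a single call.

    // Why libraries are the real moat: summing a million floats on the
    // GPU. Hand-written, this is a multi-pass shared-memory reduction
    // kernel; with Thrust it is one call that hides allocation, transfer,
    // and launch configuration entirely.
    #include <thrust/device_vector.h>
    #include <thrust/reduce.h>
    #include <cstdio>

    int main()
    {
        // A million ones on the device; the constructor does the
        // cudaMalloc and the fill for us.
        thrust::device_vector<float> d(1 << 20, 1.0f);

        // One call replaces the whole tree-reduction kernel.
        float sum = thrust::reduce(d.begin(), d.end(), 0.0f);

        printf("sum = %f\n", sum); // 1048576.0
        return 0;
    }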
Running Linux in the CUDA Cores (Score:1)
Re: (Score:3)
You probably wouldn't gain anything. Passing data between your CPU and GPU has a high latency penalty, and OpenVPN processes small amounts of data at a time. You would probably need to buffer a few hundred KB before it became worth it.
A GPU would be great for large amounts of data, like a block device, but small packetized data streams work best with super-low-latency instruction-level acceleration.
My guess anyway.
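Roughly what that buffering might look like, as a sketch under the same guess: the kernel is a toy XOR stand-in (not real crypto), BATCH_BYTES is the "few hundred KB" threshold from above, and on_packet is a hypothetical hook called once per packet.

    // Sketch of the batching idea: don't ship each ~1.5 KB packet across
    // PCIe on its own; accumulate a batch, then pay the transfer and
    // launch overhead once for the whole buffer.
    #include <cuda_runtime.h>
    #include <cstring>

    #define BATCH_BYTES (256 * 1024)  // flush threshold, per the guess above

    __global__ void xor_cipher(unsigned char *buf, int n, unsigned char key)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            buf[i] ^= key;  // toy transform; real crypto would go here
    }

    static unsigned char h_buf[BATCH_BYTES];
    static int fill = 0;

    // Hypothetical hook, called once per packet (len assumed packet-sized,
    // i.e. far smaller than BATCH_BYTES). Only touches the GPU on a flush.
    void on_packet(const unsigned char *pkt, int len)
    {
        if (fill + len > BATCH_BYTES) {
            unsigned char *d_buf;
            cudaMalloc(&d_buf, fill);
            // These two copies and the launch dominate for small batches,
            // which is exactly why per-packet offload doesn't pay.
            cudaMemcpy(d_buf, h_buf, fill, cudaMemcpyHostToDevice);
            xor_cipher<<<(fill + 255) / 256, 256>>>(d_buf, fill, 0x5a);
            cudaMemcpy(h_buf, d_buf, fill, cudaMemcpyDeviceToHost);
            cudaFree(d_buf);
            fill = 0;  // processed data would be handed back here
        }
        memcpy(h_buf + fill, pkt, len);
        fill += len;
    }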