Intel's First Discrete GPU is Built For Developers (engadget.com)
At its CES 2020 keynote, Intel showed off its upcoming Xe discrete graphics chip and today, we're seeing exactly how that's going to be implemented. From a report: First off, Intel unveiled a standalone DG1 "software development vehicle" card that will allow developers to optimize apps for the new graphics system. It didn't reveal any performance details for the card, but did show it running the Warframe game. It also noted that it's now "sampling to ISVs (independent software vendors) worldwide... enabling developers to optimize for Xe." As far as we know right now, Intel's discrete graphics will be chips (not cards) installed together with the CPUs on a single package. However, it's interesting to see Intel graphics in the form of a standalone PCIe card, even one that will never be sold to consumers.
Naming fail. (Score:4, Funny)
Xe? Sounds like some newfangled gender.
Isn't that the leader of the Chinese government? (Score:2)
Xe Impeeing
Re: (Score:3)
Does not matter. This thing will likely vanish in short order again. Intel cannot do GPUs, they lack the skills. And lately, it looks like their CPUs are not very good either. They do make excellent non-volatile memory though.
Fact fail. (Score:1)
This isn't Intel's first discrete GPU. Intel had the i740 and i752 discrete GPU video cards back in the late 90s.
Is the author of this "article" a know-nothing millennial or something?
Re: (Score:2)
Xe? Sounds like some newfangled gender.
It's not new.
https://en.wikipedia.org/wiki/... [wikipedia.org]
Integrated (Score:3)
discrete graphics will be chips (not cards) installed together with the CPUs on a single package
That doesn't really sound discrete.
Making it not discrete anymore. (Score:2)
It's a chiplet. Plain and simple...
Intel are forced to move to chiplets too, or die due to low yields.
Re: (Score:2)
Multiple chips, one package -- this new thing
vs
One chip, with integrated graphics, in one package -- your 2013 13" MacBook Pro
vs
One chip per package, multiple packages -- your 9900K with an Nvidia 2080 Ti
Re: Integrated (Score:2)
What's new about that? About 15 years ago I worked at a company where we put 2 chips of different technologies in the same package, and it was called an MCM, a multi-chip module. It looked just like a single chip. We had one BGA at some point with 2 dies stacked, a six-layer redistribution substrate, 254 balls or so, 10x10 mm and 1 mm thick...
Re: (Score:2)
Yes. MCMs were around when I was but a young engineer back in the UK.
There are technical details that differ in the current multi-chip packages, but those are just details. The basic principle is the same.
The 'new' I was referring to was the specific CPU+graphics chip+other things package, which is indeed new.
Re: (Score:2)
Yeah maybe I’m missing something but how is this any different than their current integrated graphics?
Ummmm, it's not integrated any more.
(as indicated by the word "discrete")
So by your logic AMD sells discrete cores? (Score:2)
And discrete IO "northbridges" for those cores?
After all they are separate chiplets in one package too.
Re: (Score:1)
I was going to say, I remember having an AGP i740 in my old Pentium II.
Re: (Score:2)
It's integrated++.
Sounds like this thing exists to improve yields while still selling an iGPU, since Intel has lately been selling a lot of CPUs with disabled iGPUs.
Re: (Score:2)
>Sounds like this thing exists to improve yields while still selling an iGPU
The main driver for chiplets in the same package is to allow different processes to be used for different purposes - DRAM, Flash, thick gate for 3V pins, fast logic for CPU. The yield improvement is also a thing and adds to the attractiveness.
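To put rough numbers on the yield point: under the simple Poisson defect model (yield = e^(-area x defect density)), splitting one big die into small chiplets that are tested individually ("known good die") raises the fraction of usable silicon per wafer. Here is a back-of-the-envelope sketch in C; the 0.2 defects/cm^2 and 6 cm^2 figures are made-up illustrative assumptions, not anything Intel has published:

#include <math.h>
#include <stdio.h>

/* Back-of-the-envelope chiplet yield comparison using the simple
   Poisson defect model: yield = exp(-die_area * defect_density).
   All numbers below are illustrative assumptions, not Intel data. */

static double poisson_yield(double area_cm2, double defects_per_cm2)
{
    return exp(-area_cm2 * defects_per_cm2);
}

int main(void)
{
    const double d0 = 0.2;          /* assumed defect density, defects/cm^2 */
    const double total_area = 6.0;  /* assumed silicon per product, cm^2 */
    const int chiplets = 4;         /* split the same silicon four ways */

    double y_mono = poisson_yield(total_area, d0);
    double y_chip = poisson_yield(total_area / chiplets, d0);

    /* Chiplets are tested before packaging ("known good die"), so the
       silicon cost of a working package scales with 1/yield of each
       small die rather than 1/yield of one huge die. */
    printf("monolithic %.1f cm^2 die yield: %4.0f%%\n", total_area, 100 * y_mono);
    printf("per-chiplet %.1f cm^2 yield:    %4.0f%%\n", total_area / chiplets, 100 * y_chip);
    printf("usable-silicon advantage:      %.2fx\n", y_chip / y_mono);
    return 0;
}

Compile with cc yield.c -lm. With those inputs, a monolithic die yields about 30% while each quarter-size chiplet yields about 74%, so roughly 2.5x more of the wafer ends up in sellable packages.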
Re: (Score:2)
No. This thing exists first as a laptop component, just like other "discrete" laptop chips. There are plans for a PCIe card in the future, but the near-term announcement is that they will be sold to system vendors in chip form before they hit the market as add-in cards.
Re: (Score:2)
If you read the summary, you'll see that it's on a card just for software developers to develop on. ("software development vehicle"). The chip will only ship as a product to consumers as an integrated solution.
Re: (Score:2)
That doesn't really sound discrete.
It's unfortunate wording that misses the content of the announcement. The Xe most definitely *is* a discrete GPU. What's being said here is that the initial release will be in chip form only, much like any discrete GPU currently installed in a laptop.
A dedicated PCIe add-in graphics card will come later. But this is a discrete GPU in every way that a laptop NVIDIA 2060Q is a discrete GPU, i.e., completely standalone, communicating with the CPU via PCIe lanes.
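For the curious, "standalone, communicating with the CPU via PCIe lanes" is directly visible from software: on Linux, every GPU, integrated or discrete, enumerates as a PCIe display-class function under sysfs, and a DG1-style part would simply appear as an extra entry. A minimal sketch; the 0x03 display class and Intel's 0x8086 vendor ID are standard PCI values, everything else here is illustrative:

#include <dirent.h>
#include <stdio.h>

/* Walk /sys/bus/pci/devices (Linux) and print every display-class
   function (PCI class 0x03xxxx). Integrated and discrete GPUs both
   enumerate here; a DG1-style card would just be one more entry. */

static unsigned read_hex(const char *base, const char *dev, const char *attr)
{
    char path[512];
    unsigned val = 0;
    snprintf(path, sizeof path, "%s/%s/%s", base, dev, attr);
    FILE *f = fopen(path, "r");
    if (f) {
        if (fscanf(f, "%x", &val) != 1)
            val = 0;
        fclose(f);
    }
    return val;
}

int main(void)
{
    const char *base = "/sys/bus/pci/devices";
    DIR *dir = opendir(base);
    if (!dir) {
        perror(base);
        return 1;
    }
    struct dirent *de;
    while ((de = readdir(dir)) != NULL) {
        if (de->d_name[0] == '.')
            continue;
        unsigned cls = read_hex(base, de->d_name, "class");
        if ((cls >> 16) != 0x03)   /* keep display controllers only */
            continue;
        unsigned vendor = read_hex(base, de->d_name, "vendor");
        unsigned device = read_hex(base, de->d_name, "device");
        printf("%s  class=0x%06x  vendor=0x%04x  device=0x%04x%s\n",
               de->d_name, cls, vendor, device,
               vendor == 0x8086 ? "  (Intel)" : "");
    }
    closedir(dir);
    return 0;
}

On a typical Intel laptop this prints just the iGPU at 0000:00:02.0; a machine with the dev card installed would presumably list a second Intel entry.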
U FAIL IT (Score:4, Informative)
This is NOT Intel's first discrete GPU!
Go look at the 740!
Re: (Score:3)
Indeed. I used to expect better reporting from Engadget, but alas.
https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
For some reason the i740 driver is still included in xf86. Who the hell still has a need for that turd of a card? It was garbage two decades ago.
Re: (Score:2)
Re: (Score:2)
What's it like living in 1999?
Hmm...
Not as good as the 70s - 80s...
But still way better than it is now!
Re: (Score:2)
You beat me to it.
#include <stdComplaintAboutEditorsNotEditing.h>
The general trend seems to be utter lack of tech background knowledge on an allegedly tech web site. (The other egregious example I can think of is the use of the DEC logo on things that have nothing to do with DEC.)
I missed that chance (Score:2)
When I went for the Trident 3D Image 975. But that worked with Linux too.
Damn commies! (Score:1)
Doesn't mean much (Score:3)
Re:Doesn't mean much (Score:4, Interesting)
You're forgetting Intel poached AMD's chief architect of Radeon, Raja Koduri, at the end of 2017. Well, now, about 2 years later, we have a discrete graphics card. Kind of like clockwork.
It will be interesting to see, and hopefully things play out well enough for AMD to remain competitive. Or we'll learn the "magic" Raja provided wasn't really from him after all but from the team he left at AMD.
Disclosure: I'm pretty sure I have mutual funds that include stock from Intel, Nvidia, and AMD.
Re:Doesn't mean much (Score:4)
I'm not saying it's impossible for Intel to compete. If anyone can, it's them. I just don't see their early efforts bearing much fruit when the competition's as strong as it is. I'm more interested in seeing if they stay in the game long-term.
Re: (Score:2)
I think it's beyond a single person to be the sole source of "magic."
False. A single person is the sole source of magic. It is beyond a single person to deliver results; that requires whole teams and departments. Koduri is only the most prominent poach. Intel has in the past 24 months poached a lot of dedicated GPU staff from both AMD and NVIDIA. Not just engineers, either: they also took product marketing and design people, and poached firmware writers. There was a good article about it back in... September, I think; I can't find it now, but it was quoting people as saying that Inte
Re: (Score:2)
Given Intel's long record of failed projects in recent years, this indeed doesn't prove much.
I could see it working if they started from their iGPU and made it a first-class product, allowing for more compute cores and special instructions for general-purpose computing. Whether it works will depend heavily on the quality and scalability of their initial iGPU architecture.
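The iGPU's general-purpose compute side is already exposed today through OpenCL (and Intel's oneAPI stack); if Xe scales the same architecture up, the same enumeration should simply report more compute units. A minimal device query in C, assuming an OpenCL runtime and headers are installed (link with -lOpenCL); the buffer sizes and the 8-entry caps are arbitrary:

#define CL_TARGET_OPENCL_VERSION 120  /* silence header version warnings */
#include <stdio.h>
#include <CL/cl.h>

/* Enumerate the GPU devices the OpenCL runtime exposes and report
   their compute-unit counts. On current Intel iGPUs the figure
   roughly corresponds to the execution-unit count; a scaled-up Xe
   part would presumably just report more of them. */

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint nplat = 0;
    if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
        fprintf(stderr, "no OpenCL platforms found\n");
        return 1;
    }
    for (cl_uint p = 0; p < nplat; p++) {
        cl_device_id devs[8];
        cl_uint ndev = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                           8, devs, &ndev) != CL_SUCCESS)
            continue;  /* this platform has no GPU devices */
        for (cl_uint d = 0; d < ndev; d++) {
            char name[256] = "";
            cl_uint cus = 0;
            clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof name, name, NULL);
            clGetDeviceInfo(devs[d], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof cus, &cus, NULL);
            printf("GPU: %s (%u compute units)\n", name, cus);
        }
    }
    return 0;
}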
Re: (Score:2)
I could see it work if they started from their iGPU and made it a first class product
I could not. Their iGPU is *not* a high-performance part and doesn't attempt to be. It's optimised for a completely different purpose than what a discrete GPU requires.
Performance looks like it will be kind of weak (Score:2)
I can't figure out who this is for, except maybe Apple asked for it. It's still early, but AMD has new silicon in the pipeline that is likely to run rings around this. Intel usually has fantastic dri
Re: (Score:2)
I can't figure out who this is for except maybe Apple asked for it.
For people like me who have no interest in nVidia's closed-source-only driver model, and who are not willing to tolerate amdgpu's daily driver crashes and monthly oh-the-new-driver-version-does-not-even-start-on-my-hardware surprises.
Of course, Xe drivers will have to prove they are better in that regard, but past experience with Intel's iGPUs leads me to assume that's very probable.