Intel's First Discrete GPU is Built For Developers (engadget.com)

At its CES 2020 keynote, Intel showed off its upcoming Xe discrete graphics chip, and today we're seeing exactly how that's going to be implemented. From a report: First off, Intel unveiled a standalone DG1 "software development vehicle" card that will allow developers to optimize apps for the new graphics system. It didn't reveal any performance details for the card, but it did show it running Warframe. It also noted that it's now "sampling to ISVs (independent software vendors) worldwide... enabling developers to optimize for Xe." As far as we know right now, Intel's discrete graphics will be chips (not cards) installed together with the CPU on a single package. However, it's interesting to see Intel graphics in the form of a standalone PCIe card, even one that will never be sold to consumers.
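For developers, "optimizing for Xe" starts with simply finding the device. A minimal sketch of device enumeration using pyopencl (my assumption, not Intel's prescribed workflow; it presumes Intel's OpenCL runtime is installed):

import pyopencl as cl

# List every OpenCL-visible GPU; an Xe part would show up under an
# Intel platform once its runtime is installed.
for platform in cl.get_platforms():
    for device in platform.get_devices():
        if device.type & cl.device_type.GPU:
            print(platform.name, "->", device.name,
                  f"{device.global_mem_size // 2**20} MiB,",
                  f"{device.max_compute_units} compute units")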
  • by MrNaz ( 730548 ) on Thursday January 09, 2020 @02:08PM (#59603614) Homepage

    Xe? Sounds like some newfangled gender.

  • by Major Blud ( 789630 ) on Thursday January 09, 2020 @02:14PM (#59603636) Homepage

    discrete graphics will be chips (not cards) installed together with the CPUs on a single package

    That doesn't really sound discrete.

    • Yeah, maybe I'm missing something, but how is this any different from their current integrated graphics? I guess the chip has enough architecture built in that it could be its own package, whereas their current offerings can only be paired with a CPU.
      • It's like Kaby Lake-G, where they put a discrete Radeon Vega GPU on the package.
      • Multiple chips, one package -- this new thing
        vs.
        One chip with integrated graphics in one package -- your 2013 13" MacBook Pro
        vs.
        One chip per package, multiple packages -- your 9900K with an Nvidia 2080 Ti

        • Multiple chips, one package -- this new thing

          What's new about that? About 15 years ago I worked at a company where we put two chips of different technologies in the same package, and it was called an MCM, a multi-chip module. It looked just like a single chip. At one point we had a BGA with two stacked dies, a six-layer redistribution substrate, 254 balls or so, 10x10 mm and 1 mm thick...

          • Yes. MCMs were around when I was but a young engineer back in the UK.

            There are technical differences in the current multi-chip packages, but those are just details. The basic principle is the same.

            The 'new' I was referring to was the specific CPU+graphics chip+other things package, which is indeed new.
             

      • Yeah maybe I’m missing something but how is this any different than their current integrated graphics?

        Ummmm, it's not integrated any more.

        (as indicated by the word "discrete")

    • It's integrated++.

      Sounds like this thing exists to improve yields while still selling an iGPU, since Intel has lately been selling a lot of CPUs with disabled iGPUs.

      • >Sounds like this thing exists to improve yields while still selling an iGPU

        The main driver for chiplets in the same package is to allow different processes to be used for different purposes -- DRAM, flash, thick gate for 3V pins, fast logic for the CPU. The yield improvement is also real and adds to the attractiveness.
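        The yield argument is easy to put numbers on. A back-of-the-envelope sketch using the classic Poisson yield model (all numbers hypothetical):

        from math import exp

        D = 0.001        # defect density, defects per mm^2 (hypothetical)
        A = 300.0        # chiplet area in mm^2; the monolithic die is 2*A
        WAFER = 70000.0  # usable wafer area in mm^2 (hypothetical)

        # Poisson yield model: P(die is good) = exp(-area * defect_density)
        y_mono = exp(-2 * A * D)  # one big 600 mm^2 die
        y_chip = exp(-A * D)      # one 300 mm^2 chiplet

        # Products per wafer: monolithic needs 1 good big die; the chiplet
        # design needs 2 good small dies, binned as known-good after test.
        mono = (WAFER / (2 * A)) * y_mono
        chip = ((WAFER / A) * y_chip) / 2

        print(f"monolithic: {mono:.0f} products/wafer")
        print(f"chiplets:   {chip:.0f} products/wafer")
        print(f"advantage:  {chip / mono:.2f}x")  # = exp(A*D), ~1.35x here

        The win comes from testing dies individually before packaging: a defect costs you one 300 mm^2 chiplet instead of a whole 600 mm^2 die.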

      • No. This thing exists first as a laptop component, just like other "discrete" laptop chips. There are plans for a PCIe card in the future, but the near-term announcement here is that they will be sold to system vendors in chip form before they hit the market as add-in cards.

    • From the photos in the article, it just looks like a regular discrete GPU on its own PCIe board with its own fan, just like Nvidia or whatever. So I don't know what that sentence about the single package is referring to.
      • by Chaset ( 552418 )

        If you read the summary, you'll see that it's on a card just so software developers have something to develop on (a "software development vehicle"). The chip will only ship to consumers as an integrated solution.

    • That doesn't really sound discrete.

      It's unfortunate wording that misses the content of the announcement. The Xe most definitely *is* a discrete GPU. What's being said here is that the initial release of the GPU will be chip-only, much like any discrete GPU currently installed in a laptop.

      A dedicated PCIe add-in graphics card will come later. But this is a discrete GPU in every way that an NVIDIA 2060 Max-Q is a discrete GPU, i.e. completely standalone, communicating with the CPU via PCIe lanes.
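      For what it's worth, "standalone, communicating via PCIe lanes" is something you can check from software. A minimal sketch (an illustration, assuming Linux and sysfs; the vendor and class IDs are standard PCI values) that lists display controllers on the PCI bus:

      import pathlib

      PCI = pathlib.Path("/sys/bus/pci/devices")
      VENDORS = {"0x8086": "Intel", "0x10de": "NVIDIA", "0x1002": "AMD"}

      for dev in sorted(PCI.iterdir()):
          # PCI class 0x03xxxx = display controller (0x0300 VGA, 0x0302 3D)
          cls = (dev / "class").read_text().strip()
          if not cls.startswith("0x03"):
              continue
          vendor = (dev / "vendor").read_text().strip()
          print(dev.name, VENDORS.get(vendor, vendor), cls)

      Note that Intel's integrated GPUs appear here too, typically at 00:02.0; a discrete part enumerates at its own address behind a PCIe link, whatever physical package it ships in.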

  • U FAIL IT (Score:4, Informative)

    by sexconker ( 1179573 ) on Thursday January 09, 2020 @02:17PM (#59603654)

    This is NOT Intel's first discrete GPU!

    Go look at the i740!

  • AMD FTW!
  • by twocows ( 1216842 ) on Thursday January 09, 2020 @02:45PM (#59603790)
    Intel's great at putting out a lot of marketing BS that doesn't translate into any sort of real-world performance. Until we see real-world performance numbers, it may as well be a discrete brick as far as I'm concerned. Competition's very strong in the discrete graphics market; Nvidia and AMD are no slouches. Even in the iGPU market, where Intel has years of experience, AMD's offerings tend to be better, and Intel has almost none with discrete graphics. Maybe they can pull something out of their hat, but I won't hold my breath until they have something to prove it. Full disclosure: I own stock in AMD at the moment.
    • Re:Doesn't mean much (Score:4, Interesting)

      by G00F ( 241765 ) on Thursday January 09, 2020 @02:55PM (#59603848) Homepage

      You're forgetting Intel poached AMD's chief Radeon architect, Raja Koduri, at the end of 2017. And now, about two years later, we have a discrete graphics card. Like clockwork.

      It will be interesting to see, and hopefully things play out well enough for AMD to remain competitive. Or we'll learn the "magic" Raja provided wasn't really from him after all, but from the team he left behind at AMD.

      Disclosure: I'm pretty sure I have mutual funds that include stock in Intel, Nvidia, and AMD.

      • by twocows ( 1216842 ) on Thursday January 09, 2020 @04:13PM (#59604162)
        Without being privy to their internal corporate structure it's hard to say for sure, but I will say that modern hardware is an absolute behemoth of an undertaking, especially something like a discrete GPU. I think it's beyond a single person to be the sole source of "magic." It's likely he had some useful insights, and it'll be up to Intel to provide him with the team and environment to make good on them.

        I'm not saying it's impossible for Intel to compete. If anyone can, it's them. I just don't see their early efforts bearing much fruit when the competition's as strong as it is. I'm more interested in seeing if they stay in the game long-term.
        • I think it's beyond a single person to be the sole source of "magic."

          False. A single person *is* the sole source of magic; what's beyond a single person is delivering results. That requires whole teams and departments. Koduri is only the most prominent poach. Intel has in the past 24 months poached a lot of dedicated GPU staff from both AMD and NVIDIA, and not just engineers: they also took product marketing and design people, and poached firmware writers. There was a good article about it back in... September, I think; I can't find it now, but it was quoting people as saying that Inte

    • Given Intel's long record of failed projects in recent years, this indeed doesn't prove much.

      I could see it working if they started from their iGPU and made it a first-class product, allowing for more compute cores and special instructions for general-purpose computing. Whether it works will depend heavily on the quality and scalability of their initial iGPU architecture.

      • I could see it working if they started from their iGPU and made it a first-class product

        I could not. Their iGPU is *not* a high-performance part and doesn't attempt to be. It's optimised for a completely different purpose than what a discrete GPU requires.

  • Based on this [youtube.com], it looks like they caught up to the Vega 11 in a 3400G, but as a discrete GPU. The extra cost is likely to push users to a 1050 Ti mobile for the same price or just a little more, with 3x the performance. Spend another $100 and you're in 1660 mobile territory and almost 5x the performance...

    I can't figure out who this is for except maybe Apple asked for it. It's still early, but AMD has new silicon in the pipeline that is likely to run rings around this. Intel usually has fantastic dri
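    The price/performance argument is just arithmetic. A sketch using the multipliers above and purely hypothetical prices:

    # Relative performance figures are from the comment above;
    # prices are hypothetical placeholders, not real quotes.
    cards = {
        "Xe DG1-class":   {"perf": 1.0, "price": 150},
        "1050 Ti mobile": {"perf": 3.0, "price": 160},
        "1660 mobile":    {"perf": 5.0, "price": 250},
    }

    for name, c in cards.items():
        print(f"{name:>15}: {100 * c['perf'] / c['price']:.2f} perf per $100")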
    • by ffkom ( 3519199 )

      I can't figure out who this is for except maybe Apple asked for it.

      For people like me, who have no interest in nVidia hardware driven only by closed-source drivers, and who are not willing to tolerate amdgpu's daily driver crashes and monthly oh-the-new-driver-version-does-not-even-start-on-my-hardware surprises.

      Of course, Xe drivers will have to prove they are better in that regard, but past experience with Intel's iGPU drivers makes me assume that's very probable.
