
kostacurtas

Member
Oct 27, 2017
9,065
Intel (Xe) DG1 spotted with 96 Execution Units

The Xe architecture will span many segments, from entry-level mobile gaming to high-performance computing. DG1 is believed to be Intel's first discrete graphics card for gamers. Rumors also suggest Intel will launch a DG2 variant as well. Neither of these is a product name, though, so we still do not know what the new series will be called.

The latest leak from the EEC points towards 96 Execution Units. If Intel were to keep the same design principle for DG1 as for its mobile HD/UHD graphics, then we should expect 96*8 shading units, or 768 in total. It is pointless to speculate whether this is 768 or 1536 (*16) shading units, but it appears that DG1 focuses on entry-level graphics.
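For anyone wanting to sanity-check the numbers, here is a quick back-of-the-envelope sketch. It assumes 8 shading units (ALUs) per EU, as in Intel's mobile HD/UHD iGPUs, and also shows the speculative 16-per-EU case; both figures are rumor-based, not confirmed specs.

```python
# Back-of-the-envelope shading-unit count for the rumored DG1.
# 8 ALUs/EU matches Intel's existing mobile HD/UHD iGPU design;
# 16 ALUs/EU is the doubled layout speculated about above.
execution_units = 96

for alus_per_eu in (8, 16):
    shading_units = execution_units * alus_per_eu
    print(f"{execution_units} EUs x {alus_per_eu} ALUs/EU "
          f"= {shading_units} shading units")
```

Either way, the totals land well below current midrange discrete parts, which is why the leak reads as entry-level.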



KaiPow

Member
Oct 25, 2017
2,116
No FPS, resolution, anything in depth from the gameplay demo. Just proof of it running at CES.
 

Xiaomi

Member
Oct 25, 2017
7,237
Just sounds like a half-way point between integrated graphics and a current low-end GPU. Feels like something OEMs will love.
 

Sandcrawler

Member
Oct 27, 2017
545
I wonder if they're holding the DG2 (assuming it's a more powerful GPU, because why would they make one with a similar number of compute units as an iGPU) back from the public so early impressions aren't bad. Since the drivers are way newer than AMD's and nVidia's, I'd imagine there's a ton of performance to be gained with more driver development. How a bigger GPU performs against the midranges of AMD and nVidia is really important, so bad (or even just middling) publicity before it launches would hurt sales. That said, I don't expect much from the first generation (or even second) of dGPUs from Intel, since its competitors have both been in the market so long.

Does anyone have any insight on how this will affect game devs? Do they now have 50% more work in this department because of a 3rd GPU manufacturer? I'm not sure how much work is done on the game devs' part in terms of optimizing for different architectures. I figure it's partly API dependent?
 

Flandy

Community Resettler
Member
Oct 25, 2017
3,445
Is the portable Alienware thing running on this? They said they were on Intel 10th gen
 
Nov 8, 2017
13,111
That is terrible, sigh.

The full context, as pointed out in the video by both Steve and Gordon, is that this hardware is basically for developers to get used to it before anything goes out to the public. The performance is expected to be somewhat better on release (which will be months and months down the line). 96 execution units is what's supposed to be integrated onto the high-end Tiger Lake chips, but this version has its own non-shared memory. This is really a look at their lowest-of-the-low discrete stuff.
 
Nov 8, 2017
13,111
Is the portable Alienware thing running on this? They said they were on Intel 10th gen

No. But if the Alienware UFO ever does come to market in the next 12 months, it will probably be using Tiger Lake U chips, which could have performance not too far behind this, since that's also 96 EU (but with shared memory instead of dedicated).
 

elenarie

Game Developer
Verified
Jun 10, 2018
9,823
Does anyone have any insight on how this will affect game devs? Do they now have 50% more work in this department because of a 3rd GPU manufacturer? I'm not sure how much work is done on the game devs' part in terms of optimizing for different architectures. I figure it's partly API dependent?

Very much depends on how different the drivers are compared to those for their iGPUs. Afaik, Intel is the biggest GPU manufacturer out there (if you count iGPUs, which you should), so many devs should already have experience with Intel GPUs.

I wouldn't be worried about performance competitiveness yet. Give them a few more years. For context, when we were developing DXR support for BFV, we were often running and testing at 10-20 fps on prototype hardware, but nVidia did a great job and in the end we shipped with 50+ fps on the actual retail hardware.
 

RestEerie

Banned
Aug 20, 2018
13,618
I've been hearing about Intel making headway toward a 'decent'-performing integrated GPU since 2008.

It's the 10th day of the year 2020 and I am still just hearing about it.