
CPU/GPU Specs, Why no info?

BearFlag

Member
Yes, but the way it's looking I'm not even sure if it can play LoL well at the native resolution (3000x2000). If the GPU wasn't designed for it, then why even mention it? He could have left it at "this is for the architect who is building a building right now".
 

GoodBytes

Well-Known Member
Yes, but the way it's looking I'm not even sure if it can play LoL well at the native resolution (3000x2000). If the GPU wasn't designed for it, then why even mention it? He could have left it at "this is for the architect who is building a building right now".
Well we have to see.
 

MattL

New Member
Yes, but the way it's looking I'm not even sure if it can play LoL well at the native resolution (3000x2000). If the GPU wasn't designed for it, then why even mention it? He could have left it at "this is for the architect who is building a building right now".

3000x2000 is a really big resolution to game at. Keep in mind that many people with gaming desktops still game at 1080p, and 3000x2000 is nearly 3 times the pixel count of 1080p. So if you were expecting a dedicated video card that can both fit in and be powered by basically a 13" ultrabook body to perform well in demanding games at that resolution, those were pretty unrealistic expectations.

Gamers are just now upgrading to 1440p gaming, and even that is very demanding and requires some good desktop-level gaming hardware to perform well in any recent game... the SB resolution has over 60% more pixels than 1440p. Heck, 4K requires about $700 of desktop gaming video card hardware (more like $1500-$2000+ if you really want to perform well) to run moderately, and it's about 8.8 million pixels vs the SB's 6 million pixels... So you really should set your expectations to what gaming hardware can achieve and what can actually fit into a very small form factor with limited power.
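Just to show where those ratios come from, here's a quick back-of-the-envelope check (a throwaway C++ snippet of my own, not anything official; I use 3840x2160 for 4K, so the "8.8 million" figure above is closer to DCI 4K at 4096x2160):

```cpp
// Rough pixel-count math for the resolutions discussed above.
#include <cstdio>

int main() {
    const double p1080 = 1920.0 * 1080;  // ~2.07 million pixels
    const double p1440 = 2560.0 * 1440;  // ~3.69 million pixels
    const double sb    = 3000.0 * 2000;  //  6.00 million pixels (Surface Book)
    const double uhd   = 3840.0 * 2160;  // ~8.29 million pixels (4K UHD)

    std::printf("SB vs 1080p: %.1fx the pixels\n", sb / p1080);  // ~2.9x
    std::printf("SB vs 1440p: %.2fx the pixels\n", sb / p1440);  // ~1.63x
    std::printf("4K  vs SB  : %.2fx the pixels\n", uhd / sb);    // ~1.38x
    return 0;
}
```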
 

MattL

New Member
I'm not convinced it's going to be in line with those benchmarks as I have my doubts that this GPU is going to be the equivalent of a 940M.

And even if it was... those benchmarks are 1366x768, and it's still struggling to get 30 fps. Why would someone want to play a game in 2015 at that resolution on a machine that you spent $2,000 on?

It's impossible to know until we see benchmarks on the actual system, but based on the GPU-Z readout it looks very close to a 940M, slightly downclocked in core speed but with much faster memory... Might actually be faster than a 940M, in fact, due to the GDDR5 memory vs GDDR3.

You say $2,000 like that means anything. It's all about context... for a very high-end gaming rig, $2,000 means two GTX Titan X video cards and nothing else... at that point you can actually game at 4K well... It takes about $700 of desktop video card hardware to really run 4K even moderately... and that's not even factoring in the space in the very slim SB base and the very limited power. Panos specifically said this was not designed to be a gaming laptop... it's basically a 13" ultrabook with a very low-end (low-voltage) dedicated video card... I'd guess it will run 50%-100% faster than the integrated video, and to be honest that still makes it the best option, since the comparable is the rMBP 13, which has no dGPU at all (and is only $500 cheaper than the SB while also topping out at over $2,000), and nearly all other 13" ultrabook-level laptops are the same (this one obviously having tricks up its sleeve too).

If you want serious gaming power then you are looking at the wrong form factor, lol... MS can't perform magic that no one else can; heck, if they could fit and power a significantly more powerful card (enough to satisfy what you suggest), it'd probably generate *far* too much heat for the small base... There is a limit to what can be done with current technology that is out of the hands of any manufacturer, and it seems they already worked with NVIDIA to customize the dGPU as much as possible as is.
 

GreyFox7

Super Moderator
Staff member
Microsoft's secret sauce: DirectX 12 “Multiadapter” will use the integrated GPU and discrete GPU together.

Microsoft's GPU Secret Sauce: DirectX 12 Multiadapter | Microsoft Surface Forums
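For anyone curious what "using both GPUs" looks like at the API level, here's a minimal sketch of my own (not Microsoft's code, and the feature level is just an arbitrary choice): with explicit multiadapter in D3D12 the application enumerates every adapter itself and creates an independent device on each, so on a Surface Book it would see both the Intel iGPU and the NVIDIA dGPU and split work between them.

```cpp
// Hedged sketch: enumerate every hardware GPU the way a DX12 "explicit
// multiadapter" app would, creating an independent device per adapter.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;  // skip the WARP software adapter

        // One device per physical GPU; the app divides the work itself.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            std::printf("Created D3D12 device on adapter %u: %ls\n", i, desc.Description);
    }
    return 0;
}
```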
It's impossible to know until we see benchmarks on the actual system, but based on the GPU-Z readout it looks very close to a 940M, slightly downclocked in core speed but with much faster memory... Might actually be faster than a 940M, in fact, due to the GDDR5 memory vs GDDR3.

Right on with everything. One small point though: I believe the specs on a 940M indicate even slower DDR3, not GDDR3, unless they made a typo. Either way, as soon as you say "customized" all bets are off until you know how and what. He said it was optimized for running Professional Applications, so some benchmarks might reveal that and others may not. The apparent removal of PhysX may not have been as important to the targeted apps, or didn't buy much over what the i7 could do.

I think it will do well with what it was targeted at, which was definitely not games, but it will run LoL (I'm guessing that must be a gamer's inside joke). :) Like, yeah, he's a good baseball player, he can play in the girls' softball league. LoL :D
 

GoodBytes

Well-Known Member
Heck, 4K requires about $700 of desktop gaming video card hardware (more like $1500-$2000+ if you really want to perform well) to run moderately, and it's about 8.8 million pixels vs the SB's 6 million pixels... So you really should set your expectations to what gaming hardware can achieve and what can actually fit into a very small form factor with limited power.
Not really. A GeForce 980 will do the trick, to be honest. Now, if you want max, max, max settings, you need lots of memory speed and quantity. Actually, in most cases you need either one or the other; it really depends on how the game engine loads textures. But you want both if you want great performance across the board. So you want something like a fast GPU with 8GB of HBM2 memory, which is memory that sits right next to the GPU. AMD has a graphics card with this technology (HBM1), but due to the limitations of HBM1 it can only support 4GB.

All to say, next-generation GPUs should allow you to play games at max settings across the board at 4K. But currently, if you are content with 'high' in most games, then a GTX 980 should do the trick; the 980 Ti will help in some games due to the increased memory.
 

GoodBytes

Well-Known Member
Microsoft's secret sauce: DirectX 12 “Multiadapter” will use the integrated GPU and discrete GPU together.

Microsoft's GPU Secret Sauce: DirectX 12 Multiadapter | Microsoft Surface Forums


Right on with everything. One small point though: I believe the specs on a 940M indicate even slower DDR3, not GDDR3, unless they made a typo. Either way, as soon as you say "customized" all bets are off until you know how and what. He said it was optimized for running Professional Applications, so some benchmarks might reveal that and others may not. The apparent removal of PhysX may not have been as important to the targeted apps, or didn't buy much over what the i7 could do.

I think it will do well with what it was targeted at, which was definitely not games, but it will run LoL (I'm guessing that must be a gamer's inside joke). :) Like, yeah, he's a good baseball player, he can play in the girls' softball league. LoL :D

The 940M has 2 versions. (I know, it is a mess)
GM107 and GM108.
The GM107 uses PCIe x16, while the GM108 uses PCIe x8 (version 3.0 on both), and there are other variations in the specs.
They both support GDDR5 and DDR3 (not GDDR3).

Have a look at this: GeForce 900 series - Wikipedia, the free encyclopedia (go to 940M on the table).

You will notice that the 940M GM107 model is a slightly cut-down 950M.

I think the GPU in the Surface Book is a GM108 version of the 940M, because it is on PCIe 3.0 x8, downclocked a bit, with elements of the GM107 model and on GDDR5.
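To put a rough number on the GDDR5-vs-DDR3 difference, here is my own back-of-the-envelope math, assuming the usual 64-bit bus on the 940M and typical effective data rates of roughly 2 GT/s for DDR3 and 5 GT/s for GDDR5 (those rates are assumptions, not confirmed Surface Book clocks):

```cpp
// Peak memory bandwidth = (bus width in bytes) * effective transfer rate.
#include <cstdio>

int main() {
    const double bus_bytes  = 64.0 / 8.0;       // 64-bit bus -> 8 bytes per transfer
    const double ddr3_gbps  = bus_bytes * 2.0;  // ~2.0 GT/s effective -> ~16 GB/s
    const double gddr5_gbps = bus_bytes * 5.0;  // ~5.0 GT/s effective -> ~40 GB/s

    std::printf("DDR3  940M: ~%.0f GB/s\n", ddr3_gbps);
    std::printf("GDDR5 940M: ~%.0f GB/s\n", gddr5_gbps);
    return 0;
}
```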
 

GreyFox7

Super Moderator
Staff member
The 940M has 2 versions. (I know, it is a mess)
GM107 and GM108.
The GM107 uses PCIe x16, while the GM108 uses PCIe x8 (version 3.0 on both), and there are other variations in the specs.
They both support GDDR5 and DDR3 (not GDDR3).

Have a look at this: GeForce 900 series - Wikipedia, the free encyclopedia (go to 940M on the table).

You will notice that the 940M GM107 model is a slightly cut-down 950M.

I think the GPU in the Surface Book is a GM108 version of the 940M, because it is on PCIe 3.0 x8, downclocked a bit, with elements of the GM107 model and on GDDR5.
Been there, looked at that :) However, sometimes there are artifacts on a spec chart that were never implemented, as I never found a 940M with GDDR5 or 40 GB/s of bandwidth anywhere. All to say, customized is customized, and customizing up is much harder than customizing down, but who knows what NVIDIA had on the table to play with. When I look at where they ended up, and then at the 940M or 950M, I think it's a whole lot easier to get to the end state starting from a 950M. That in no way suggests its end-state performance level. Some things don't add up, almost as if the GPU-Z detection is flat wrong.
 

GoodBytes

Well-Known Member
GPU-Z uses NVIDIA driver APIs for everything it offers. (Same for all overclocking software; it is all the NVIDIA API going through their drivers, aside from GPU firmware replacement.)

It is easy to go up as well as down, except if you have the highest-end GPU, in which case it is hard to go up.
The reason is the way processors are made. They usually produce the high-end one, and chips that fail the specification/requirements are binned into groups and averaged down; then fuses are blown to disable the broken cores (or cores that need to be disabled to match the spec) and to set the frequency and voltages to a very stable state, and voila! Lower-end GPU. So you can go up by starting from a higher-end model chip.
 

GreyFox7

Super Moderator
Staff member
GPU-Z uses NVIDIA driver APIs for everything it offers. (Same for all overclocking software; it is all the NVIDIA API going through their drivers, aside from GPU firmware replacement.)

It is easy to go up as well as down, except if you have the highest-end GPU, in which case it is hard to go up.
The reason is the way processors are made. They usually produce the high-end one, and chips that fail the specification/requirements are binned into groups and averaged down; then fuses are blown to disable the broken cores (or cores that need to be disabled to match the spec) and to set the frequency and voltages to a very stable state, and voila! Lower-end GPU. So you can go up by starting from a higher-end model chip.
I think you just said what I said :) It's harder to stretch silicon or add in missing pieces than it is to disable stuff you don't need. There are certain elements of this mystery GPU that seem to exceed the standard 940M. If I were making this, I'd take a 950M, blow off the parts I don't need or had to eliminate to fit the thermal envelope, and call it done. Much easier than masking a new part, even if I already have all the changes on file in the library. Although perhaps NVIDIA was going to make a 945x model anyway. :D

Speculation be damned
[attached image: benchmark chart from AnandTech]
 

MattL

New Member
Not really. A GeForce 980 will do the trick, to be honest. Now, if you want max, max, max settings, you need lots of memory speed and quantity. Actually, in most cases you need either one or the other; it really depends on how the game engine loads textures. But you want both if you want great performance across the board. So you want something like a fast GPU with 8GB of HBM2 memory, which is memory that sits right next to the GPU. AMD has a graphics card with this technology (HBM1), but due to the limitations of HBM1 it can only support 4GB.

All to say, next-generation GPUs should allow you to play games at max settings across the board at 4K. But currently, if you are content with 'high' in most games, then a GTX 980 should do the trick; the 980 Ti will help in some games due to the increased memory.

Well, the 980 Ti is a significant performance increase over the 980:
The NVIDIA GeForce GTX 980 Ti Review

You can indeed game at 4K with a 980, but it doesn't take max settings across the board for most games to drop to barely acceptable FPS. Basically, if you game at 4K with high settings (and even turn off some specific perf killers) on a 980, based on benchmarks you'll get anywhere from 20-40 FPS in most high-end games. The Ti nets about a 30% improvement, and if you want to run at the highest preset settings (High or Ultra, depending on the game's naming terms) and a moderate FPS across the board, then it's realistically the starting point... those start at $650, hence my $700 comment.

For example, Dragon Age: Inquisition - Ultra settings but with 0x MSAA (at 4K):

980 Ti - 40 FPS
980 - 30 FPS

So $500-$600 in *desktop*-grade gaming hardware can get you to about 30 FPS average at 4K with the top settings preset (while tweaking down MSAA)... while $650-$700 nets you 40 FPS. That seems pretty representative of most of the benchmarks too.

So basically, if I wanted to get to 60 FPS, I'd have to drop over $1,000 in hardware to make it (the cheapest might be two 980s, but I won't go down that road for this comparison).

Then consider that the Surface Book's max resolution is 6 million pixels vs the 8.8 million of 4K, and you can start to calibrate expectations for an ultrabook-class form factor dedicated card.

For another data point, in DA:I the 980 gets 59 FPS at 1440p... 1440p is about 3.7 million pixels vs the SB at 6 million... The SB has about 60% more pixels, and the 980 is a $500+ desktop card. Not to mention the space, or even the 165W TDP of the 980 vs the roughly 33W TDP of a 940M (though likely a bit less in the SB, power-wise).
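If you naively scale that 1440p number by pixel count, you get a feel for how hard the SB's panel is to drive even with desktop hardware. This is a very rough fill-rate-only estimate of mine; real games rarely scale linearly with resolution:

```cpp
// Naive pixel-proportional FPS estimate: fps_new ~= fps_old * (pixels_old / pixels_new).
#include <cstdio>

int main() {
    const double p1440    = 2560.0 * 1440;  // ~3.69 million pixels
    const double sb       = 3000.0 * 2000;  //  6.00 million pixels
    const double fps_1440 = 59.0;           // GTX 980 in DA:I at 1440p (figure quoted above)

    const double fps_sb_est = fps_1440 * (p1440 / sb);
    std::printf("Estimated GTX 980 FPS at 3000x2000: ~%.0f\n", fps_sb_est);  // ~36 FPS
    return 0;
}
```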
 