ME:A DirectX 12 confirmed?


144 replies to this topic

#51
Khrystyn

Khrystyn
  • Members
  • 478 messages

Here's an interesting article from Ars Technica using a test bed with DX 11 and DX12, rating performance with "Oxide Games' real-time strategy game Ashes of the Singularity." Apparently it's the "very first publicly available game that natively uses DirectX 12." They compare the benchmarks of the AMD R9 290X and GTX 980 Ti. The article was posted Aug. 20th, 2015. Still searching for articles that may be helpful with the DX11/12 issue for AMD and Nvidia Geforce. Thoughts?



#52
UniformGreyColor

UniformGreyColor
  • Members
  • 1 455 messages

When tested, Maxwell GPUs didn't increase in latency up to 31 command queues, but they showed a linear increase in the time to compute each task once the command queues exceeded 31. Which means they don't compute asynchronously.

 

 

Can you talk more about this, @Novak?



#53
Novak

Novak
  • Members
  • 370 messages

Here's an interesting article from Ars Technica using a test bed with DX 11 and DX12, rating performance with "Oxide Games' real-time strategy game Ashes of the Singularity." Apparently it's the "very first publicly available game that natively uses DirectX 12." They compare the benchmarks of the AMD R9 290X and GTX 980 Ti. The article was posted Aug. 20th, 2015. Still searching for articles that may be helpful with the DX11/12 issue for AMD and Nvidia Geforce. Thoughts?

 

You can't really take that benchmark seriously. It's only one game, by one small studio with no prior experience and no help from anyone who has done something like this before.


  • Khrystyn likes this

#54
Novak

Novak
  • Members
  • 370 messages

Can you talk more about this, @Novak?

 

Sure, what do you want to know?



#55
UniformGreyColor

UniformGreyColor
  • Members
  • 1 455 messages

Sure, what do you want to know?

 

I'd like to know more about latency and how that ties into this mysterious number of >31.



#56
spinachdiaper

spinachdiaper
  • Members
  • 2 042 messages

I'd make a dumb assumption that DX12 won't be used for "AAA" games till Xbox One is updated to use it.



#57
InterrogationBear

InterrogationBear
  • Members
  • 732 messages

I'd make a dumb assumption that DX12 won't be used for "AAA" games till Xbox One is updated to use it.

That happened last year.

 

I would be very surprised if the next Battlefield and all the other upcoming Frostbite games didn't support DX12. Johan Andersson (Technical Director of Frostbite) is basically the father of Mantle and the driving force behind the development of a low-level API. He would like to make Win10 and DX12 a requirement for all holiday 2016 Frostbite titles.


  • UniformGreyColor and Novak like this

#58
Novak

Novak
  • Members
  • 370 messages

I'd like to know more about latency and how that ties into this mysterious number of >31.

 

Right, so in DX11 all tasks were performed serialized, which just means one after another. Asynchronous compute works a little differently: tasks or calculations are worked on simultaneously. Not every task can be calculated in parallel, which is why dev time increases when working on more low-level stuff (generally not a good thing for a small studio without many engineers, but good for big ones): they have to work out what can be split up and what can be worked on at the same time.
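The serialized-vs-parallel difference can be sketched in a few lines of Python (just an illustration, not real graphics code; the `time.sleep` is a stand-in for one GPU compute job):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def task(i):
    time.sleep(0.05)          # stand-in for one compute job
    return i

# Serialized (DX11-style): tasks run strictly one after another
t0 = time.perf_counter()
serial_results = [task(i) for i in range(4)]
serial_time = time.perf_counter() - t0    # roughly 4 x 0.05 s

# Asynchronous (DX12-style): independent tasks overlap in time
t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    async_results = list(pool.map(task, range(4)))
async_time = time.perf_counter() - t0     # roughly 1 x 0.05 s
```

The results are identical either way; only the wall-clock time differs. And that's exactly why the splitting-up work costs dev time: only tasks with no dependencies on each other can safely overlap like this.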

 

So there was this guy who didn't buy into the whole "Nvidia is fully DX12-capable from Fermi upwards" claim, so he designed a benchmark that threw up to 128 compute tasks at any given GPU (the dev who made the game in the first place had Nvidia cards run DX11 instead of DX12, so he didn't trust them). He then recorded the time it took the GPU to handle an increasing number of compute tasks.

 

Latency in this case just means the time the GPU takes to execute each task. So what happened? Well, nothing is definitive of course, since nobody can inspect the architecture or code directly, but it showed that Maxwell GPUs didn't increase in latency up to 31 command queues, yet did once the count went beyond 31 (hence >31). At that point the compute tasks weren't done at the same time anymore but were instead queued up to be calculated later. So each additional compute task adds a linear increase in time, which should not happen if async compute were truly utilized.
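That "queued up to be calculated later" behaviour can be captured with a toy model (numbers hypothetical, Python purely for illustration): up to 31 tasks run concurrently as one "wave", and anything beyond that waits for a later wave.

```python
import math

def total_time(n_tasks, usable_queues, task_time=1.0):
    """Toy model: up to `usable_queues` tasks run concurrently in one
    'wave'; any excess is queued and computed in later waves."""
    waves = math.ceil(n_tasks / usable_queues)
    return waves * task_time

# Flat total time up to 31 concurrent tasks...
flat = [total_time(n, 31) for n in range(1, 32)]   # all 1.0
# ...but extra time as soon as the count exceeds 31 (hence >31)
overflow = total_time(32, 31)                      # 2.0: a second wave
```

With truly deep async queues (say 64, as claimed for GCN below), the same 32-task workload would still fit in a single wave.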

 

There was another guy who thought that just means Maxwell doesn't have as many command queues available as AMD does. That could be a valid point, but the heights of the graph indicated that several tasks were being batched together and done in order, which wouldn't happen with asynchronous compute.

And why is that? Well, for simplicity's sake: if async units are computing something and the workload increases, the total time increases in sudden spikes and valleys, since tasks can, so to speak, be slotted in wherever there's room. There isn't one continuous line of processing, and sometimes a unit may do nothing for a while.

 

As far as Nvidia goes, they claim to have only one shader engine, which is apparently able to handle a 32-deep command queue, unlike AMD, which uses 8 shader engines but with fewer queues each.

 

GCN uses 1 graphics engine and 8 shader engines with 8-deep command queues, for a total of 64 queues.

 

Maxwell uses 1 graphics engine and 1 shader engine with a 32-deep command queue, for a total of 32 queues (31 usable in graphics/compute mode)
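In code form, those queue counts work out as follows (figures as quoted in this thread, not independently verified):

```python
# GCN: 1 graphics engine plus 8 shader engines,
# each shader engine with an 8-deep command queue
gcn_total = 8 * 8                    # 64 compute queues

# Maxwell: 1 graphics engine plus 1 shader engine
# with a single 32-deep command queue
maxwell_total = 1 * 32               # 32 queues
maxwell_usable = maxwell_total - 1   # 31 usable in graphics/compute mode
```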

 

Hope that answers your question.



#59
UniformGreyColor

UniformGreyColor
  • Members
  • 1 455 messages

I'd like more info on this @Novak. Do you have any links?



#60
Novak

Novak
  • Members
  • 370 messages

I'd like more info on this @Novak. Do you have any links?

 

On what specifically? Asynchronous compute? Or that specific thing I talked about? For the latter I can't give you anything really solid, since nobody knows for sure; it's all just guesswork.



#61
UniformGreyColor

UniformGreyColor
  • Members
  • 1 455 messages

On what specifically? Asynchronous compute? Or that specific thing I talked about? For the latter I can't give you anything really solid, since nobody knows for sure; it's all just guesswork.

 

I wanted links on the tests that the unknown guy did.



#62
Novak

Novak
  • Members
  • 370 messages

I wanted links on the tests that the unknown guy did.

 

Alright, I'll try to dig it up again.



#63
Novak

Novak
  • Members
  • 370 messages

So I was unable to find the original benchmark post from him, but I found a blog post in which his graphs were used:

 

http://blog.logicali...dont-panic-yet/

 

And I found some other resources for you to check out if you're interested in how this works and how I and others came to this conclusion:

 

http://www.extremete...-we-know-so-far

 

Note: they used the analogy that AMD's version of async compute is more like hyperthreading while Nvidia's is CPU-bound. I talked about the software layer in Nvidia cards, so the analogy still fits (I just don't want there to be any confusion about this).

 

Also if you're eager to find the original benchmark post you can search here:

 

https://forum.beyond...ad.57188/page-7

 

The guy who did it was named MDolenc on the forum. 

 

By the way, didn't you have a question about HDVC or something?


  • Almostfaceman, Khrystyn and UniformGreyColor like this

#64
UniformGreyColor

UniformGreyColor
  • Members
  • 1 455 messages

So I was unable to find the original benchmark post from him, but I found a blog post in which his graphs were used:

 

http://blog.logicali...dont-panic-yet/

 

And I found some other resources for you to check out if you're interested in how this works and how I and others came to this conclusion:

 

http://www.extremete...-we-know-so-far

 

Note: they used the analogy that AMD's version of async compute is more like hyperthreading while Nvidia's is CPU-bound. I talked about the software layer in Nvidia cards, so the analogy still fits (I just don't want there to be any confusion about this).

 

Also if you're eager to find the original benchmark post you can search here:

 

https://forum.beyond...ad.57188/page-7

 

The guy who did it was named MDolenc on the forum. 

 

By the way, didn't you have a question about HDVC or something?

 

Thanks for the links. I found a link myself that talked about the HDVC, which I had named incorrectly. The actual thing I wanted to know about can be found here.


  • Khrystyn likes this

#65
Drakoriz

Drakoriz
  • Members
  • 383 messages

Probably, like most of the new games coming out around 2017, ME:A will support DirectX 12 but won't exploit it 100%. Why? Because Windows 10 isn't as popular as Win7 at the moment, and they would lose a lot of money if they released a game that made DirectX 12 mandatory.

 

Myself, I'm still using Win7; it's more stable than Win10 by far, and games have fewer problems. (A lot of games have issues on my laptop, which has Win10.)

 

It's going to be a while yet till DirectX 12 becomes mandatory, because DirectX 11 is still really powerful.


  • DaemionMoadrin likes this

#66
goishen

goishen
  • Members
  • 2 427 messages

Probably, like most of the new games coming out around 2017, ME:A will support DirectX 12 but won't exploit it 100%. Why? Because Windows 10 isn't as popular as Win7 at the moment, and they would lose a lot of money if they released a game that made DirectX 12 mandatory.

 

Myself, I'm still using Win7; it's more stable than Win10 by far, and games have fewer problems. (A lot of games have issues on my laptop, which has Win10.)

 

It's going to be a while yet till DirectX 12 becomes mandatory, because DirectX 11 is still really powerful.

 

 

 

I'm already seeing that day creeping up on us. I've already tried to upgrade my Windows to Win10, but I keep getting "Couldn't update the reserved partition." I can handle it, as I've found directions for fixing it. It's just that I don't really wanna.

 

I know that I'm gonna have to upgrade though.



#67
Novak

Novak
  • Members
  • 370 messages

Thanks for the links. I found a link myself that talked about the HDVC, which I had named incorrectly. The actual thing I wanted to know about can be found here.

 

That makes more sense. So I take it I don't need to explain anything about it?



#68
Novak

Novak
  • Members
  • 370 messages

Probably, like most of the new games coming out around 2017, ME:A will support DirectX 12 but won't exploit it 100%. Why? Because Windows 10 isn't as popular as Win7 at the moment, and they would lose a lot of money if they released a game that made DirectX 12 mandatory.

 

Myself, I'm still using Win7; it's more stable than Win10 by far, and games have fewer problems. (A lot of games have issues on my laptop, which has Win10.)

 

It's going to be a while yet till DirectX 12 becomes mandatory, because DirectX 11 is still really powerful.

 

If they do decide to cut features, it won't be because of poor Win10 coverage. If anything, Nvidia will halt development in that area. Why? Simple: cutting features is pretty easy, so with current tools it's no problem to use every feature of DX12 and simply downgrade it for non-Win10 users. The Xbone also already supports DX12, and since nobody gives two shits about the PC platform for development anyway, that's gonna set the tone for it.

 

We've seen this before with DX10: you simply weren't able to enable some of the advanced features if you didn't run DX10 on your computer. It would probably be the same story again. They don't really have to worry about it that much, since Win7 will probably get a tool to enable DX12 anyway.
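A rough sketch of that downgrade logic (names and tiers invented for illustration; no real engine works exactly like this):

```python
def pick_render_path(os_supports_dx12: bool, gpu_feature_level: float) -> str:
    """Pick the richest render path the platform can run, falling back
    for users without DX12 (e.g. on Win7): same game, with the
    advanced features switched off."""
    if os_supports_dx12 and gpu_feature_level >= 12.0:
        return "dx12"   # async compute, lower CPU overhead, etc.
    if gpu_feature_level >= 11.0:
        return "dx11"   # advanced DX12-only features disabled
    return "dx10"       # legacy fallback
```

So a DX12-capable card on Win7 (`pick_render_path(False, 12.0)`) would still get the DX11 path, which is exactly the "downgrade for non-Win10 users" scenario.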

 

But in the end it doesn't really matter. In that regard DX12 is much like a driver: sure, you can run an outdated one, but don't expect your latest and greatest game to run well on an old driver.

 

Oh yeah, and DX11 is really powerful? **** that, it's an outdated piece of **** and is only regarded as good and stable because DX10 was even more of a mess.

DX11 was heavily serialized and its resource management was complete crap.


  • DaemionMoadrin likes this

#69
DaemionMoadrin

DaemionMoadrin
  • Members
  • 5 855 messages

Btw... I recommend a clean Win10 installation instead of upgrading a Win7 system that has been running for a few years. Win10 will -not- clean up your system; you'll keep tons of unnecessary, outdated files... oh, and a lot of the problems you had, too. If you're using old hardware that stopped getting driver updates back in Win98 times, you might not get it to work properly in Win10 either.


  • Novak likes this

#70
Novak

Novak
  • Members
  • 370 messages

Btw... I recommend a clean Win10 installation instead of upgrading a Win7 system that has been running for a few years. Win10 will -not- clean up your system; you'll keep tons of unnecessary, outdated files... oh, and a lot of the problems you had, too. If you're using old hardware that stopped getting driver updates back in Win98 times, you might not get it to work properly in Win10 either.

 

Exactly! Honestly, Win7 wasn't as great as people made it out to be. It just wasn't as shitty as Vista and didn't force that Metro crap down our throats. But really nothing special. The last proper Windows OS was XP. And from what I've seen so far, Win10 is a new chapter and Microsoft has finally gotten their **** together. Yes, there's a **** load of spyware, but that's easy enough to disable with a couple of tools. Or PowerShell, if you like working with text-based consoles.

But I'm happy with both stability and performance (well, for a Windows OS with a lot of overhead and legacy crap, at least), and Win7 really has nothing on that if I'm honest.

 

Of course I would prefer it if Linux were the primary platform for both development and consumer use, but I'm not seeing that happen any time soon, so I'll settle for Win10.


  • DaemionMoadrin likes this

#71
Sylvius the Mad

Sylvius the Mad
  • Members
  • 24 111 messages
1. Mantle

2. I'll upgrade to Win10 if I have to, but until then I'm really happy with 8.1. It's the first version of Windows I've really liked since Win98.

#72
Drakoriz

Drakoriz
  • Members
  • 383 messages

If they do decide to cut features, it won't be because of poor Win10 coverage. If anything, Nvidia will halt development in that area. Why? Simple: cutting features is pretty easy, so with current tools it's no problem to use every feature of DX12 and simply downgrade it for non-Win10 users. The Xbone also already supports DX12, and since nobody gives two shits about the PC platform for development anyway, that's gonna set the tone for it.

 

We've seen this before with DX10: you simply weren't able to enable some of the advanced features if you didn't run DX10 on your computer. It would probably be the same story again. They don't really have to worry about it that much, since Win7 will probably get a tool to enable DX12 anyway.

 

But in the end it doesn't really matter. In that regard DX12 is much like a driver: sure, you can run an outdated one, but don't expect your latest and greatest game to run well on an old driver.

 

Oh yeah, and DX11 is really powerful? **** that, it's an outdated piece of **** and is only regarded as good and stable because DX10 was even more of a mess.

DX11 was heavily serialized and its resource management was complete crap.

 

LOL, right, that's why a game like The Witcher 3 was made for DX11. Sooo much garbage. lol

 

And yeah, no, really: no matter whether the Xbox One supports DX12, the game could easily not be optimized for it. The game has been in development since before DX12 was announced, but we really won't know anything till they release new info about ME:A.

 

And really, since last year companies have been making more money on PC games than on console. Why? I don't know, but I guess it's because of digital downloads (or, if I remember correctly, it was something about reselling or renting games, some **** like that). So I doubt they give a crap about PC. (I'm not a PC owner, by the way =P) But not giving a crap about part of your market is a horrible financial decision.



#73
AlanC9

AlanC9
  • Members
  • 35 661 messages

Myself, I'm still using Win7; it's more stable than Win10 by far, and games have fewer problems. (A lot of games have issues on my laptop, which has Win10.)


Which games? I've been using W10 since prerelease, and the only problems I had were when it autoinstalled buggy AMD vidcard drivers; reverting to the last 8.1 release always cleared things up.

#74
Novak

Novak
  • Members
  • 370 messages

LOL, right, that's why a game like The Witcher 3 was made for DX11. Sooo much garbage. lol

 

And yeah, no, really: no matter whether the Xbox One supports DX12, the game could easily not be optimized for it. The game has been in development since before DX12 was announced, but we really won't know anything till they release new info about ME:A.

 

And really, since last year companies have been making more money on PC games than on console. Why? I don't know, but I guess it's because of digital downloads (or, if I remember correctly, it was something about reselling or renting games, some **** like that). So I doubt they give a crap about PC. (I'm not a PC owner, by the way =P) But not giving a crap about part of your market is a horrible financial decision.

 

Recent developments when it comes to AAA titles have shown otherwise. While it's true that the PC market generally makes more money, that isn't the case for AAA titles. The primary development platform for almost all AAA titles has been consoles for the last couple of years.

 

The Witcher was, in terms of performance, a ****** piece of ****. But that's not the point, and you're missing the point with The Witcher either way. This is not a question of how good or bad games are. I was pointing out that DX11 was not a good API, not at all; you said the contrary and thought that referring to a good game developed for DX11 would disprove that. It doesn't. If it had been developed on DX12, for example, performance and visual appeal could probably have been taken up a few notches. I mean, have you looked at the software overhead when running something on DX11? It's enormous.

 

While ME:A was in development before the official announcement of DX12, the technical development didn't start until later. I admit it probably started before the announcement of DX12, but that doesn't matter, since BioWare isn't bound by official announcements the way consumers are; they get information, and resources, ahead of time. Furthermore, even if technical development began before they had any information at all about DX12, it doesn't really matter. In big studios the actual API implementation and hard optimization happen much later in development and can constantly change; optimization can occur up to four weeks before the official release. And since DX12 is very much about optimization and not so much about new capabilities, it's entirely possible to bring in DX12 even deep into development. I mean, splitting a few tasks across ACE units and maybe enabling stacked VRAM is relatively easy for a talented team, and the Frostbite engine makes it pretty easy to pull off (Mantle did good work there; if you're interested in Mantle you should read the earlier posts).

 

Of course, we won't know anything until they make an announcement, since this is all just speculation, but there are good reasons why they would use DX12, and I've named more than enough in this thread.

 

As for The Witcher and its studio, CD Projekt Red: it's one of the very few studios that still cares about the PC market and sees its capabilities in terms of performance, which is very good. There are a ton of media-based companies emerging in Poland, and from what I've seen so far there's a lot of talent; when it comes to video games they still understand the value of PC, just like Crytek did back in 2007. (They went down the drain as well.)


  • DaemionMoadrin likes this

#75
goishen

goishen
  • Members
  • 2 427 messages

They included Mantle with DA:I; I think they're doing anything they can to get away from bloated DX-anything. DX12, with the help of Mantle/Vulkan, has really helped.