
Now That TW3 Has Announced System Requirements, Hopefully People Will Shut Up About Quad-Core Being The Current Gaming Standard For AAA Titles


59 replies to this topic

#51
Frenrihr

Frenrihr
  • Members
  • 364 messages

Er? TW1 and 2 were... kind of a technical, bug-ridden mess. TW1 used the NWN2 engine, too. While it's good that they kept working and released the Enhanced Editions, let's not forget the Enhanced Editions were basically huge patches to fix numerous technical and UI issues, as well as to polish up the original subpar assets and finish content that was cut from the original release.

 

I mean, that's kind of the epitome of "release it now, finish it later."

 

You are full of BS.



#52
loralius

loralius
  • Members
  • 19 messages

Oh god, my PC just meets the minimum requirements for TW3...



#53
Rizilliant

Rizilliant
  • Members
  • 754 messages

Er? TW1 and 2 were... kind of a technical, bug-ridden mess. TW1 used the NWN2 engine, too. While it's good that they kept working and released the Enhanced Editions, let's not forget the Enhanced Editions were basically huge patches to fix numerous technical and UI issues, as well as to polish up the original subpar assets and finish content that was cut from the original release.

 

I mean, that's kind of the epitome of "release it now, finish it later."

And the entire time, they communicated constantly and apologized for it, which is mostly what players ask for. Not only that, but they GAVE free content as a thank-you for playing, both to people who already owned the games and with the newly released enhanced versions. What more could you POSSIBLY ask for?

 

And I don't remember part 1 being the technical catastrophe you seem to be implying. I didn't own 2 until the Enhanced Edition released, so I can't comment.


  • Rawgrim and Dominic_910 like this

#54
Rizilliant

Rizilliant
  • Members
  • 754 messages

I'd rather games push technology forward than have games dumbed down to fit onto consoles.

 

My PC is 3 years old; I built it for Skyrim, and upon release I was very disappointed with a 32-bit application and poor textures, since my PC was overpowered for it.

Currently DA:I runs at a solid 30-60 FPS on medium to ultra settings, with FPS drops in heavy areas, but at least I feel that it's using what I paid for.

 

If the next game requires an upgrade I'll be happy to upgrade. Money isn't an issue, my gaming experience is.

Which is why I'm completely baffled as to the reason behind DA:I releasing for last-gen consoles. It utterly crippled any technological advancement for the game, from a company that used to do just that!



#55
Rawgrim

Rawgrim
  • Members
  • 11,524 messages

I only played the enhanced editions of both games, you know, those versions with all of the free DLCs added to them, and they worked fine. I don't remember DA:O or DA2 being particularly bug-ridden either, though.



#56
AlanC9

AlanC9
  • Members
  • 35,624 messages

Which is why I'm completely baffled as to the reason behind DA:I releasing for last-gen consoles. It utterly crippled any technological advancement for the game, from a company that used to do just that!


Well, over 20% of DAI sales are on the old consoles. That's a lot of money to give up for the sake of technological advancement.

#57
Viper371

Viper371
  • Members
  • 287 messages

Except that no matter your rig's power, DA:I runs like ****...

Not true at all.

 

DA:I runs just fine on my system: Core i5 4670k, AMD R9 290x, 16GB RAM, 240GB SSD, 4TB WD Black hard drive.

Zero technical issues and zero slowdowns since AMD released their beta drivers the day after DA:I's launch and I properly installed them (meaning: uninstall the old, reboot, install the new, reboot).

In fact, I think DA:I is the most stable game on release I've ever played.
Now, I can't speak for AMD processors or Nvidia cards; I haven't played on those rigs.



#58
mutantspicy

mutantspicy
  • Members
  • 467 messages

A little OT, but as a follow-up to my questions about upgrading to GTX 980's with a repurposed 580 for PhysX, I decided to put EVGA to the test.

 

I sent their customer service this email.

 

Subject: Looking to purchase GTX 980's for SLI but have ?'s
Comment: First, I would like to say I'm a happy EVGA customer. I currently have an EVGA X58 FTW3 mobo with two EVGA GTX 580's; this arrangement has been nothing short of great.

 

1st) I'm considering upgrading to EVGA GTX 980's FTW but am concerned whether my mobo will be alright to run them. I've heard about issues on the X58 chipset with these cards in SLI.

 

2nd) If I were to put 2x GTX 980's in and use one of the 580's for PhysX, does the slot arrangement matter? I'm aware PCIe lane speeds would be 16x in slot 1, 8x in slot 3, and 8x in slot 5. Would I have to run the two SLI cards in the 8x slots (3 & 5), or does it even matter?

 

3rd) If I were to just do it and get the X99 FTW/Classified mobo, can it do 3 PCIe slots at 16x, 16x, 8x lane speeds?

 

EVGA's Response

 

Hello Eric,


Our PM team has done some tests and has been able to use 900-series cards in X58 chipset motherboards.  You may need to make manual adjustments to the Memory Low Gap in the BIOS (Frequency/Voltage Control--Memory Features) for it to work with SLI properly.  The slot arrangement will still matter, as you need to have the 980's in slots of at least x8 or higher; the PhysX card has no such requirements.

For the X99 FTW, and the platform specifically, you'll have different lane availability depending on the CPU.

PCI-E Lane Distribution (40 Lane Processor)
PE1 – x16 (x8 if PE2 is used)
PE2 – x8
PE3 – x8
PE4 – x16 (x8 if PE3 is used)
PE5 – x4 (Gen 2 only, 4 lanes pulled from PCH)
PE6 – x8

PCI-E Lane Distribution (28 Lane Processor)
PE1 – x16 (x8 if PE2)
PE2 – x8
PE3 – x8 (Slot is *NOT* functional with a 28 lane processor.)
PE4 – x8
PE5 – x4 (Gen 2 only, 4 lanes pulled from PCH)
PE6 – x4


 

Hope that helps!  If you have more questions, please let us know.

Regards,
EVGA Support

 

 

Now how's that for customer service? Not only did I find out that I can run the GTX 980's in my PC with a BIOS tweak, I can also probably put the 580 in for PhysX, though I would have to remove the air cooler. Not sure if that's wise, but then I'm not sure how hot the card would get under PhysX workloads. They also answered my question about lane throttling and slot arrangement, and they confirmed my concern that my lane distribution would be 16x, 8x, 8x as opposed to my current 16x, 16x arrangement. In addition, they gave me a detailed layout of the mobo I'm dreaming about. This, along with the fact that I've had nothing but success with EVGA, is why they will continue to get my $$$.
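For anyone puzzling over EVGA's lane tables, the sharing rules boil down to simple arithmetic: some slots borrow lanes from their neighbors when both are populated. Here's a small sketch that encodes the X99 FTW table from their reply; the slot widths come straight from the email, but the allocator function itself is just an illustration I put together, not anything from EVGA's firmware.

```python
# Hypothetical model of the X99 FTW lane distribution from EVGA's reply.
# Slot widths are quoted from their table; the function name and logic
# are my own illustration, not EVGA's actual BIOS behavior.

def x99_ftw_lanes(cpu_lanes, populated):
    """Return the electrical width of each populated slot."""
    if cpu_lanes == 40:
        base = {"PE1": 16, "PE2": 8, "PE3": 8, "PE4": 16, "PE5": 4, "PE6": 8}
    elif cpu_lanes == 28:
        # PE3 is non-functional with a 28-lane processor, hence width 0.
        base = {"PE1": 16, "PE2": 8, "PE3": 0, "PE4": 8, "PE5": 4, "PE6": 4}
    else:
        raise ValueError("X99 CPUs expose 40 or 28 PCIe lanes")

    widths = {slot: base[slot] for slot in populated}

    # Sharing rules from the reply: PE1 drops to x8 if PE2 is used;
    # on 40-lane parts, PE4 drops to x8 if PE3 is used.
    if "PE1" in widths and "PE2" in populated:
        widths["PE1"] = 8
    if cpu_lanes == 40 and "PE4" in widths and "PE3" in populated:
        widths["PE4"] = 8
    return widths
```

So, per the table, two 980's in PE1 and PE4 on a 40-lane chip would both get a full x16: `x99_ftw_lanes(40, ["PE1", "PE4"])` gives `{"PE1": 16, "PE4": 16}`, while adding a third card in PE2 or PE3 knocks its neighbor down to x8.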

 

Just wanted to share for those who may be interested.


  • inquartata02 likes this

#59
Etragorn

Etragorn
  • Members
  • 559 messages

EVGA is one of my favorite companies as well for this exact reason.

 

It seems that, using a 40-lane processor, you should have no trouble running both 980's in 16x mode, but I guess you'd have to dump the PhysX card, since it needs at least x8 (due to the size of the connector it uses) and you don't have enough lanes left: with two 16x links, the X99 will only have x4 left to work with. No point in running a 980 at 8x just for a PhysX card.

 

I don't see why you want to run two 980's in SLI. I'm running one EVGA 980, and that is more than enough for everything out now and coming out in 2015. The only real reason would be if you're running a 4K setup.



#60
mutantspicy

mutantspicy
  • Members
  • 467 messages

EVGA is one of my favorite companies as well for this exact reason.

 

It seems that, using a 40-lane processor, you should have no trouble running both 980's in 16x mode, but I guess you'd have to dump the PhysX card, since it needs at least x8 (due to the size of the connector it uses) and you don't have enough lanes left: with two 16x links, the X99 will only have x4 left to work with. No point in running a 980 at 8x just for a PhysX card.

 

I don't see why you want to run two 980's in SLI. I'm running one EVGA 980, and that is more than enough for everything out now and coming out in 2015. The only real reason would be if you're running a 4K setup.

Because you know how it goes: this year's superstar card won't be as awesome on next year's games. And like I said earlier, there's always the possibility of disabling one of the cards and running the other for PhysX. Of course, I could save myself money by buying just one and using a 580 for PhysX. That's the more fiscally responsible path, but I'm irresponsible.   :)    And actually, I was reading a bit of the TW3 reviews, and I want to say one of them mentioned that some of the ultra textures were so big you could use upwards of 6GB of VRAM, so...   :o  Don't quote me on that, but I feel like I read that somewhere.

 

edit - disregard that, I was remembering a review of Shadow of Mordor, which does recommend 6GB of VRAM for ultra.