New Computer for NWN2
#1
Posted December 18, 2010 - 08:17
I wasn't planning on posting anything yet but I'm already stumped by an oddity...
Instead of using a monitor, I'm going to use an HDTV. I plan on using an HDMI cable to connect the TV to the desktop.
A while ago, there were video cards with HDMI connectors. Now, I'm starting to see video cards with mini-HDMI connectors. A little research tells me that portable devices like camcorders typically use this type of connector.
So what are these doing on video cards then? And more importantly, would I be transmitting the same level of quality (video and sound) to the TV if it uses a mini-HDMI connector instead of an HDMI connector?
#2
Posted December 19, 2010 - 07:54
Here is an example:
http://www.newegg.co...N82E16814125319
Not the specific card I planned on getting, but this one has the mini-HDMI connector.
#3
Posted December 19, 2010 - 10:17
“Legacy interfaces such as VGA, DVI and LVDS have not kept pace, and newer standards such as DisplayPort and HDMI clearly provide the best connectivity options moving forward. In our opinion, DisplayPort 1.2 is the future interface for PC monitors, along with HDMI 1.4a for TV connectivity.”
- AMD, Dell, Intel Corporation, Lenovo, Samsung Electronics and LG (Dec 8, 2010)
/ hype
HDMI.org
& if you can put up with Jimbo's smirk :
HDMI - connectors.
DisplayPort - "is not expected to displace HDMI in high-definition consumer electronics devices."
Since both the mini/micro and normal connectors use 19 pins, I doubt there's any loss of signal quality. Just use high-quality cables & adaptors where needed .. and since their sizes are part of the HDMI definition, it's all good.
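To put that "same 19 pins" point in a throwaway sketch (connector names per the HDMI 1.4 spec; Type B dual-link exists too but was never used in consumer gear):

```python
# HDMI connector variants used in consumer gear: different shell sizes,
# identical 19-pin signal set, so a passive adapter loses nothing.
HDMI_CONNECTORS = {
    "Type A (standard)": 19,
    "Type C (mini)": 19,
    "Type D (micro)": 19,
}

# One distinct pin count across all variants -> same electrical signals.
assert set(HDMI_CONNECTORS.values()) == {19}
print("All consumer HDMI connector types carry 19 pins")
```

So a mini-HDMI-to-HDMI cable or adapter carries exactly the same video and audio as a standard-to-standard cable.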
But here's a question, if you decide to go back to a monitor in a year or two, will the card be compatible with those inputs?
#4
Posted December 20, 2010 - 06:34
Banshe said:
"Ideally, something that won't require a major reworking/redoing when the next big tech "thing" happens. But something I can easily upgrade."
From my experience with this, I suggest that you get:
- High-quality, high-wattage power supply: A bigger power supply will allow you to upgrade your computer's hardware without the fear of not having enough juice to run the computer with the upgraded parts.
- Powerful CPU: if you get a high-end processor now, you'll save a lot of headaches in the future.
I got a mid-level CPU on my second computer and it was a mistake. The chip was a Pentium D dual-core @ 3.0GHz. I should have spent the extra $150 & gotten the better CPU. On my new computer, I've got an Intel Core i7 920 (2.66GHz, 8MB Cache) Quad Core. It works great & runs the games that I play just fine (even NWN2).
- Graphics card: Same as with the CPU - get a really good one. I'm running a 1.8GB NVIDIA GeForce GTX 260 & it can handle the maximum settings for Grand Theft Auto 4 and Dragon Age. It can handle almost all of the available settings on max for NWN2; I have to turn down anti-aliasing to 4x or 8x for the game to run smoothly.
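The "bigger power supply" advice above is really just headroom arithmetic. Here's a minimal sketch; the per-part wattages are illustrative ballpark guesses, not measured draws:

```python
# Rough PSU headroom check: sum estimated component draw and compare it
# against the supply's rating. All wattage figures below are guesses
# chosen for illustration only.
parts = {
    "CPU (quad core, under load)": 130,
    "GPU (GTX 260 class)": 180,
    "Motherboard + RAM": 60,
    "Drives, fans, misc": 50,
}

def headroom(psu_watts, parts):
    """Return spare wattage after the estimated total system draw."""
    return psu_watts - sum(parts.values())

print(headroom(875, parts))  # 455 W left over for future upgrades
```

With numbers like these, an 875W supply leaves well over half its rating free, which is the kind of margin that lets you drop in a hungrier GPU later without replacing the PSU.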
Here is the rundown of my desktop for you to examine:
Brand: Alienware Aurora Desktop
New Operating System: Windows 7 Home Premium, 64bit
Processor: Intel Core i7 920 (2.66GHz, 8MB Cache) Quad Core
RAM: 6GB Triple Channel 1333Mhz DDR3
Video Card: 1.8GB NVIDIA GeForce GTX 260
Power Supply: 875W PSU
Sound Card: Sound Blaster X-Fi Xtreme Audio
CD/DVD Drive: 24X CD/DVD burner (DVD+/-RW) w/double layer write capability
Monitor: 22-inch Widescreen Flat Panel
Harddrive: 640GB - SATA-II 3Gb/s 7200RPM, 16MB Cache HDD
Got this in February of this year, and it works fantastically.
#5
Posted December 20, 2010 - 01:59
Well, I say just fine, it works, and works OK. Not as well as it could, maybe, but that's just the nature of the beast (NWN2, that is) I guess.
#6
Posted December 20, 2010 - 09:33
Edit: On a different note, this apparently over-the-hill desktop is going for about 100 USD. How would this work as a server for fewer than ten players:
http://www.janus.hu/...esktop_pc_13358
Apparently you can increase the RAM to 4 GB.
It is in Hungarian (which I don't speak either) but most of the stats are self explanatory.
Edited by Banshe, December 20, 2010 - 09:58.
#7
Posted January 7, 2011 - 02:55
I have been trying to wrap my head around this whole "Sandy Bridge" thing in regards to my new build. Does anyone here understand this stuff?
What I think I get is that the CPU now has a GPU too. But the GPU is not good enough for video games (especially NWN2). But I think you can add discrete video cards too, but this article seems to contradict that:
http://www.anandtech...on-sandy-bridge
So, from what I have read, this Sandy Bridge thing is worth it to put in a new build as it is a huge change/improvement.
What do you guys think (with NWN2 in mind as its primary use)?
#8
Posted January 7, 2011 - 03:05
Banshe wrote...
Hi again guys.
But I think you can add discrete video cards too, but this article seems to contradict that:

No, that's just for that one specific feature, unrelated to gaming. The gaming news sites would be apoplectic if you couldn't use a discrete GPU.
#9
Posted January 7, 2011 - 03:30
From here: Computerworld: What Intel's Sandy Bridge chips offer you
Specifically:
...the end of the section entitled "Graphics":
The advantage to having integrated graphics is that with one chip instead of two, there's no need to connect graphics to the CPU. Eliminating that hop between the two chips saves on heat and power loss.
The downside is that you don't get the graphics performance you would with separate graphics hardware. Intensive users might get more stuttering video or see some drag in graphics rendering, according to Olds.
And from here: The Register: Intel's Sandy Bridge Welcomes Discrete Graphics
Opening statement:
IDF: The on-chip graphics of Intel's Sandy Bridge processor may be measurably ahead of Chipzilla's previous integrated graphics, but it's not intended to replace discrete graphics for high-end users and dedicated gamers.
"I don't see high-end discrete graphics cards going away," said Tom Piazza, Intel Fellow and graphics-architecture guru, speaking on Monday at the company's annual developer forum, "nor do I see Formula One race cars going away just because we built Priuses."
He also reminded his audience that the Sandy Bridge architecture has PCIe support on-chip — 1x16 or 2x8 — so support for the addition of discrete-graphics cards or GPUs is up to an OEM's decision concerning the inclusion of a PCI slot or on-board bus.
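For perspective on that "1x16 or 2x8" split: PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, which works out to 500 MB/s of usable bandwidth per lane, per direction, so one x16 link and two x8 links offer the same aggregate bandwidth. A quick sketch of the arithmetic:

```python
# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding (8 data bits per 10 bits on
# the wire) -> 4000 Mbit/s = 500 MB/s usable per lane, per direction.
MB_PER_LANE = 5_000 * 8 / 10 / 8  # = 500.0 MB/s

def link_bandwidth(lanes):
    """Usable one-way bandwidth of a PCIe 2.0 link, in MB/s."""
    return lanes * MB_PER_LANE

print(link_bandwidth(16))     # 8000.0 MB/s for a single x16 card
print(2 * link_bandwidth(8))  # 8000.0 MB/s total across two x8 cards
```

In other words, splitting the controller's 16 lanes across two cards halves each card's link but not the total, which is why 2x8 setups remain workable for dual-GPU builds.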
Piazza also attempted to deflect questions about Sandy Bridge's lack of DirectX 11 support. "There are no exclusive DirectX 11 games out today," he answered a questioner. "In fact, most [game developers] have actually skipped over DirectX 10, and most of the games fall all the way back to DirectX 9. I don't see the issue right now specifically about DirectX 11. DirectX 11 is, I'll just say, 'around the corner' — and on the Intel products as well."
In short, the Sandy Bridge is targeted for Mobile computing and not gaming. It is targeted for uploading and transcoding (that's moving from one location to another, essentially) video data much faster, due to the innate connections on silicon of the integrated graphics and main core(s) processor(s). It is NOT slated, targeted or geared for anything more than the notebook video power watcher (included is something called "Insider" which is a form of Hardware DRM encoding technology that several major Movie Companies are supporting (fyi) ) or for the traveling salesman, pitchman or executive on the go allowing them to slap out a spreadsheet faster than Neo can fly through the simulation known as the Matrix.
If you're going to be gaming or producing serious video yourself, modeling, editing or creating intensive graphics applications/programs, you're still going to need a "real" video card with separate True Graphics Processing for your video needs. And it will apparently allow that switch with no issues. More than likely it'll either detect it on its own (as it has PCIe core integration detection/support) or it'll be a simple BIOS switch at bootup. I'm guessing it will be self checking/correcting in such cases. I didn't see anything specific to that, and why would I? It's just been officially announced two days ago, and I guarantee you that Intel is not advertising their core integrated graphics technology as "easily replaced by a real graphics card."
You should be fine with it as long as you're not going to be gaming. That you're here in the NWN2 forums qualifies you as a candidate for a True Graphics Processing enabled system. That said, it's a fair bet that having a Sandy Bridge chip will still allow you to do all that fancy schmancy video transcoding and spreadsheet busting, even if you do have a dedicated graphics card. It's right there and there's no reason to shut it off for those sorts of things. Will it do it? I don't know, but the hope would be that it still functions in some fashion, being as how it's partnered to the cores of the processor on the same chip.
The ultimate advantage of this is now Mobo makers for mobile computing can save board space (which means smaller footprint and improved overall performance) without having to have a second integrated graphics chip as well. And, in reality, it will also remove (near as I can tell, anyway) the need to have a third (if you count the IGC too) chip to "bridge" the Integrated Graphics Chip to the mobo and thus to the Processors.
North Bridge, South Bridge, now Sandy Bridge. Hmm. I'm betting there's another advantage to desktop mobo makers there as well. Another set of Bridge chips looks now to be also on board. Don't quote me on that, though, it's been a long time since I bothered studying the guts of a motherboard of any sort, other than where to slap in my RAM, video card, audio card and processors/heatsinks.
regards,
dunniteowl
#10
Posted January 7, 2011 - 08:12
If you are making a gaming rig with discrete video cards, is it a better idea to go with Sandy Bridge or pre-Sandy Bridge?
Judging by your posts and things I have read, it clearly seems to benefit others (i.e. video peeps) more so than gamers. But is there an advantage at all to gamers? There is a lot of talk about how this won't replace discrete cards and one will still be needed. But nothing really concrete about whether or not it improves the gaming experience.
On Tom's Hardware, before the great unveiling there was a multitude of posts from people regarding processors for gaming and lots of responses saying wait to see what happens with Sandy Bridge.
#11
Posted January 7, 2011 - 09:53
The ultimate answer, though, at this point is: Wait and see.
dno
#12
Posted January 7, 2011 - 09:57
http://hardocp.com/a...ocessors_review
Short and sweet: their highest award.
#13
Posted January 7, 2011 - 10:29
"Wait and see" are not two of my favorite words...
@ Kamalpoe: That article seems to be pretty definitive in that the i7-2600K is master of all for the future.
#14
Posted January 7, 2011 - 10:31
it sounds like brand new architecture (unlike the HDMI stuff mentioned earlier, which builds upon years of SVGA->DVI etc.), so there's going to be time needed to build up comfort-factor in the industry ..
browsing through the article yourself mentioned, I'd like to pull some quotes out :
"We will not be looking at the integrated graphics capabilities of these processors, as we do not think these capabilities will be important to our readers either. We are sure other sites will be looking into this, we still however highly suggest a discrete graphics card for true gamers."
-p.1 (emphasis mine)
"As always, these benchmarks in no way represent real-world gameplay. They are all run at very low resolutions to try our best to remove the video card as a bottleneck. I will not hesitate to say that anyone spouting these types of framerate measurements as a true measuring tool in today’s climate is not servicing your needs or telling you the real truth.
"For what it is worth, I think these gaming benchmarks are fairly useless, but people want to see these."
-p.4
"If you are still back on a dual core Socket 775 system, you just found out it is time to upgrade. If you are on a Socket 1156 or 1366 system, then you might likely be best off staying put, but I am very sure many of you will see enough value in Sandy Bridge to start putting together a new build list, especially if you are still running a dual core."
-p.6
pardon me, but I'm not upgrading anytime soon. Beyond that, lots of machines on the popular market run NWN2 just great!
While I'm sure there are advantages to these new Sandy-chips, look at it this way, what do they do that isn't already being done more than well enough? Second, price? Thirdly ( back to my initial point ) industry standard ..?
it might make an excellent server, i donno .. my Vote : pre-Sandy bridge
#15
Posted January 9, 2011 - 12:49
I was simply pointing out the 'inconsistency' in the article above. Banshe, if you're a 'beta-testing' kinda guy .. go with it, but my point is that there are plenty of magic-carpets out there already. And the line that stands out to me is,
"If you are on a Socket 1156 or 1366 system, then you might likely be best off staying put"
meaning, I'd personally invest my effort, time, and money into one of those (or similar) and enjoy the carpet ride
#16
Guest_Red Wagon_*
Posted January 9, 2011 - 03:45
#17
Posted January 9, 2011 - 04:37
"If you are on a Socket 1156 or 1366 system, then you might likely be best off staying put"
In other words, the whole point of that statement only applies to that market segment that is already there prior to Sandy Bridge chips (and mobos.)
dno
#18
Posted January 9, 2011 - 07:54
dunniteowl wrote...
I'll simply point out that the quote is: "If you are on a Socket 1156 or 1366 system, then you might likely be best off staying put." In other words, the whole point of that statement only applies to that market segment that is already there prior to Sandy Bridge chips (and mobos.)
dno
This is how I read it too. Stay put if you "already" have it. In other words, it is not worth making the switch if you have already invested in the old tech. At least for now.
I know that tech always changes so there is little point in waiting for the next best things otherwise you will always be waiting. But this seems like a tidal wave change which is what is making me hesitant.
It was this statement that seemed to be close enough to apply to me:
"but I am very sure many of you will see enough value in Sandy Bridge to start putting together a new build list"
Even though that one is still not a clear vote for Sandy Bridge if you are building up a completely new system.
A lot of things that were said (and pointed out by you KevL) are making me hesitate. Mostly because none of these people are saying: "Gamers - We recommend that you get the Sandy Bridge mobo/processor along with one or two video cards". Nor are they saying the opposite. And that seems like the most obvious question to answer...
#19
Posted January 9, 2011 - 08:38
http://www.tomshardw...andy-bridge#bas
One answer so far with interesting results. What do you guys think?
#20
Posted January 10, 2011 - 01:53
Banshe wrote...
dunniteowl wrote...
I'll simply point out that the quote is: "If you are on a Socket 1156 or 1366 system, then you might likely be best off staying put." In other words, the whole point of that statement only applies to that market segment that is already there prior to Sandy Bridge chips (and mobos.)
dno

This is how I read it too. Stay put if you "already" have it. In other words, it is not worth making the switch if you have already invested in the old tech. At least for now.

yes. And coming from an industry sponsored webzine that has a vested interest in hyping the industry, it also means, those chips honk!

Banshe wrote...
I know that tech always changes so there is little point in waiting for the next best things otherwise you will always be waiting. But this seems like a tidal wave change which is what is making me hesitant.

I'd say this is what it comes down to: your comfort zone, Banshe. Do you want to build your computer now, or wait a few months till the discussion gets a resolution?
I admit it's a tough choice, because (considering that the integrated graphics are nearly irrelevant) the Sandy Bridge chips also honk! I mean, there's prestige-factor plus, apparently a cheap price.
now, here's the caveat on me. If I were building a new computer, literally I'd be building it. Right now, I wouldn't want the potential headache of wrapping my head around this new technology, looking for a relative scarcity of new information, and resolving conflicts that are bound to crop up when building a computer as an amateur.
ps. i cast my vote *shrug
#21
Posted January 10, 2011 - 09:11
Did you see the performance links in that Tom's thread? It seems that Sandy Bridge's i7-2600k beats the i7-950 in terms of performance.
Also, apparently the 1366 socket is on its way out, soon to be replaced by LGA 2011.
So... If I chose an X58 board, thereby committing to 1366, it is possible that no more 1366 processors are coming (or not too many). Therefore, within a couple of years, Sandy Bridge will outpace the 1366 range. Even the availability of 1366 processors is dropping off at places like Newegg (there are still some, but not too many). In short, is 1366 on the way to being obsolete? I realize that with games they always go for the mainstream, so it will take more time to get to the point of being obsolete.
So I guess my new question would be in regards to what both you and DNO said: Wait and see. What exactly would I be waiting to see? What is it that needs to be resolved? It seems the benchmarks have been done. So what else is there?
Edited by Banshe, January 10, 2011 - 09:13.
#22
Posted January 10, 2011 - 10:24
Dno? anyone else??
( -devil's advocate, once again proud of his work
#23
Posted January 10, 2011 - 11:31
As far as the integrated graphics go, you can ignore them altogether when you think of a gaming computer. Also, Intel is not alone there; we also have AMD's Fusion technology, which will most probably even have the better integrated graphics... if it mattered, but it doesn't.
All the rest is absolutely easy. It comes down to: How much money do you want to spend, what will you do, do you run certain applications that require technology X or stat Y?
For NWN2, the absolutely newest doesn't always have to be the best. You can throw as many processor cores at it as you want, the game will just use one, the toolset two and that's it - a good old Core 2 Duo processor will run the game just fine enough, or any recent Core or Athlon/Phenom. NWN2 is no reason to wait at all.
What NWN2 needs is a CPU at a decent speed, let's say everything recent above 2.6 Ghz would just be fine, and a multicore CPU with a turbo boost if only one Core is in use would help too.
Second is the graphics card, and yes, it still needs a decent one, because NWN2 is still one of the most demanding games I personally have. It wants a midrange card; that means, plan for one in the $150 range, like an AMD 6850 or nVidia GTX 460. Older cards will do too, as long as they were in the same league (it still runs very well on the old nVidia 8800/9800 cards, for example).
Last, even though they seem to have more driver issues with NWN2, I'd recommend an nVidia card if you want to play the game with all the eye candy. Not because nVidia makes the better cards and AMD sucks, no, but because only nVidia cards render point light shadows in the game as they should be. They also perform better with full shadows on. That's again a typical case of a "nVidia - The way it's meant to be played" game, as sad as it is. My own ATI 4870 performs much worse in NWN2 than my old nVidia card, against all logic.
Okay, since this is covered, all the rest depends entirely on your needs and on your purse. Personally, I'd buy a comp when I need it and never wait for the next big thing - usually the cheaper and better way, because the next big thing will just be horribly expensive, might still have bugs, and last year's high end is still good enough for a few years. At this very moment, if I needed a new one I'd buy an AMD Phenom X4 or X6 system with a nVidia GTX 460 card. Fast and inexpensive.
Oh, and as for that HDMI/Mini-HDMI issue, just get an adapter. It should be the same technology with a smaller plug, so all you need is a cable that fits both ends. Your electronic store should be able to help out.
Edited by casadechrisso, January 10, 2011 - 11:37.
#24
Posted January 10, 2011 - 03:39
#25
Posted January 10, 2011 - 08:42
It's simple, really. I shook my Magic 8-Ball quite vigorously, because you are asking about near term futures and, as the tachyon quantum whorls and eddies of the near future are in a constant state of change (Hmm, difficult to see. Always in motion is the future) a solid amount of shaking was required for decent to good results.
My results (and they're almost sort of kind of close to scientifically silly) indicate two things:
1) My Magic 8-Ball has a slight crack in it (which doesn't affect performance); and
2) You're overthinking it.
Here's the really important part. Once you make a purchase of any chip socket on the market today, it's a fair bet (and the M8B verifies this completely, so I stand by this) that by the time you're really ready for another processor or type of RAM that requires a newer motherboard, you'll be in luck, because all that old crap you bought is completely obsolete and there's only new stuff to get.
My advice: Don't worry about it. You go for best price to performance of what you have now and, honestly, the motherboard and processor are going to change hand in glove quite rapidly. The fact that you're worried about the availability of the same socket type, only newer, faster, stronger than what you put in there only highlights the fact that this thinking is already well behind the technology curve. There was a time when that was solid planning (I think it might have been as long ago as 2 or 3 years ago.) No longer.
If the core processors were stable at a set amount (and I happen to know that both Intel and AMD are hard at work on 12+ core processors) of cores and the speed gains were only to be had from more powerful, higher clock frequency processors with the same number of pins and features, planning on a socket/mobo longevity course of action to upgrade the processor later would be the way to go. This, sadly, just ain't valid in today's ever-fluctuating market to find that next plateau of processor architecture and design that can just be tweaked again next year for more sales.
At this point in time, neither AMD nor Intel are going to be saying, "We have a solid pin grid chip that won't change in the near term future." So relying on availability of the same socket type is like, in the words of Elton John's "Honky Cat," ...trying to find gold in a silver mine, it's like trying to drink whiskey, from a bottle of wine.
Don't worry, be happy, is the song you should be singing, or maybe David Bowie's, Changes.
Right now, I would focus on RAM that meets or exceeds today's mobo plethora of speeds in the same mobo. If you can get some super chunky peanut RAM and slap it into your current mobo choice and it will be able to run at the top speed of the board (though not at the top speed of the RAM), then you have somewhat future-proofed your next mobo purchase, so that you at least don't have to buy RAM at the same time.
This methodology should also be in place with a Power Supply. Get MOAR, 'cause later you'll be sorry when you get that next gen Multi-GPU dedicated card and it's going to suck down half to more than half of your available power on a 750 or 1000 Watt supply.
Again, from all that I have read so far regarding Sandy Bridge is that there is really only an advantage to Mobile computing Mobo makers as it further reduces the chip footprint on the board -- As Well As, provides a much faster transcoding time to get video running much smoother on your laptops (and desktops, I guess) for when the Fasten Seatbelts lights come off and the You Can Now Turn On Your Mobile Computing Devices lights flash on in the pressurized cabin you just took your shoes off to get into. (Trust me, don't buy those headphones they sell on the plane, bring yer own laptop and watch what YOU wish to watch and hear it with your own earbuds.)
In short, the Sandy Bridge is probably a good idea if you're thinking about jumping up and buying right now. It certainly couldn't hurt. As the IGP (Integrated Graphics Processor) is (along with the Bridge chipset) now truly integrated into the core processor, you'll definitely see a bump in the quality of your YouTube videos or Netflix "Watch Instantly" entertainment. I also believe, based on additional reading (and bear in mind, this is inferential supposition on my part), that even with a dedicated video card to do the "heavy lifting" in gaming, the Sandy Bridge looks poised to provide that smoother video performance and internet browsing and spreadsheet quickness, whether or not you have a great PCIe card of whatever make installed as well.
Everything looks to be monitored and controlled by the Sandy Bridge on the core chip and, as such, it most likely acts as a traffic cop between the IGP on chip phase and the dedicated VGAC (Video Graphics Adapter Card) when both are present.
My prediction: Even if you get this newer technology versus the older technology, vis-a-vis the chip sockets, when you're ready for yet another motherboard and processor combo, you're going to have to get a newer motherboard anyway. This is, because the chip socket technology will once again have to change to meet the adaptations in Computer Chip Architecture technological changes. So go with your gut and your pocket book on what gives you the most bang for your bucks now and maybe 2-3 years from now. Other than that, I'll have to start assessing a fee for my use of the Magic 8 Ball in these and other forums. Meanwhile, remember, the Magic 8 Ball is to be used for entertainment purposes only. Any forward looking statements can fluctuate wildly based on market conditions and the clarity of that dark blue ink used inside.
best regards,
dunniteowl