
Future proofing


David


Looking at replacing my primary 16:10 22" (1680x1050) monitor with a 4K 28". Ideally no more than £450.

 

I'm not all clued up on CrossFire etc., but I'm currently running a single AMD HD 7950. I was thinking of getting another to run in CrossFire, but the card is no longer manufactured, so my only option would be to get a used one (thoughts on this?). Ideally I wouldn't buy used, so I was wondering if anybody knew the limitations of CrossFire, i.e. does it have to be an identical card to work, or can I get something else to run alongside it? I'd rather not bin my current card as it still performs well.

 

If push comes to shove (my card's not up to the task of 4K) and it's not compatible with any other cards, I'd upgrade to an Nvidia setup, but I've not given that much thought yet.


* WALL OF TEXT INCOMING *

 

Right, replacing the monitor is always a fun experience. Not sure if you want to cash in now or wait a bit until 4K is better priced, but that's up to you.

 

As far as CrossFire goes, it does not have to be the same card to pair them together (known as 'regular' CrossFire), as long as both cards are at least from the HD 2xxx series.

Previously, if you had a card older than that, you would need to designate a master card or have two of the same card. But that's a whole other story.

 

So yes, your card can benefit from buying another AMD card that isn't the same model. It does, however, mean that the lowest-performing card determines the effective specs for both of them; e.g. the AMD HD 7950 currently in your rig has 3GB of VRAM. If you were to pair it with a card that has 1GB of VRAM, you would end up with an effective total of 1GB of VRAM. There are some exceptions to these rules: not all cards play nicely together, and sometimes tweaks can be used to maximise the available amount of VRAM.

 

Now for Nvidia SLI there is a whole other ruleset. The general rule is that the cards have to be the same, in contrast to CrossFire, but again with some exceptions. For example, a 9800 GT cannot be paired with an 8800 GT even though they have exactly the same specs, but a GTS 250 with a 9800 GTX+ will work fine, because the GTS 250 is a rebranded 9800 GTX+ (goodness knows why Nvidia feels the need to do this). However, one nice addition to the Nvidia system is that if the clock speeds of the cards differ, instead of one card lowering its speed to match the other, both will run at their default clock speeds.
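
If you like thinking in code, here's a minimal Python sketch of those pairing rules. The card names and figures below are made-up examples, not real specs, and it glosses over all the exceptions mentioned above:

```python
# Rough sketch of the pairing rules described above (illustrative only).

def crossfire_effective(card_a, card_b):
    """AMD CrossFire: the 'weakest' card's VRAM and clock apply to both."""
    return {
        "vram_gb": min(card_a["vram_gb"], card_b["vram_gb"]),
        "clock_mhz": min(card_a["clock_mhz"], card_b["clock_mhz"]),
    }

def sli_effective(card_a, card_b):
    """Nvidia SLI: cards must generally match; clock speeds stay independent."""
    if card_a["gpu"] != card_b["gpu"]:
        raise ValueError("SLI generally requires the same GPU (exceptions apply)")
    return {
        "vram_gb": card_a["vram_gb"],  # identical by requirement
        "clocks_mhz": (card_a["clock_mhz"], card_b["clock_mhz"]),  # each keeps its own
    }

# Hypothetical figures, just to show the effect of mixing VRAM sizes:
hd7950 = {"gpu": "Tahiti", "vram_gb": 3, "clock_mhz": 800}
budget_card = {"gpu": "Pitcairn", "vram_gb": 1, "clock_mhz": 1000}

print(crossfire_effective(hd7950, budget_card))  # -> {'vram_gb': 1, 'clock_mhz': 800}
```

The point of the example: pair a 3GB card with a 1GB card in CrossFire and you're effectively running a 1GB setup.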

 

 

Now, TL;DR

 

In general, it is strongly recommended to buy the same card regardless of the chipset manufacturer. If you really want to run two different cards, AMD's CrossFire pairing is definitely the way to go. A quick comparison of what I stated above:

 

AMD*

 

- Different memory sizes okay, but will lower to the 'weakest' card's amount

- Different clock speeds okay, but will lower to the 'weakest' card's speed

- In general, different cards may be paired together*

 

Nvidia*

 

- Different memory sizes not okay

- Different clock speeds okay; each card runs at its default clock speed

- In general, different cards may not be paired together*

 

* exceptions do apply

 

 

Now, as for buying second hand. I don't have that much personal experience, but buying graphics cards can be considered reasonably safe, since not much can really break down. Some tips for buying second hand (not only applicable to GPUs):

 

- Contact the seller for more pictures of the item. Aside from confirming that they actually own the item they claim to be selling, it also gives you a better, more detailed look so you can inspect the product for damage or other points of interest.

- You can also ask them to post a video, or make a Skype call, and inspect the item that way.

- Finally, the best way to ab-so-lute-ly guarantee a good item is to go and inspect it yourself (or have a relative / friend do it).

 

Some tips specific to GPUs:

 

- Check the dust build-up. Seriously. If a card has tons and tons of dust build-up, it may very well not be in working condition.

- How accurate is the owner with the info they provide about the card? Simply looking up the difference between, say, the TurboMaxx Special XXXXX 2GB and 3GB variants and making sure they're selling the one they claim can be vital.

- Ask them to plop the card in their rig and show you a video of it playing a game with an FPS counter. It sounds a bit weird, but if the seller is genuinely interested in selling it, they may not have a problem with it. I have requested it and done it for someone, so some personal experience there.

 

 

Well, that was a doozy to type! Seriously though, I really hope this helps you out bud, and wish you the best of luck on your GPU endeavours!

If anyone wishes for more info, either here or on Steam/Skype/whatever, let me know and I'll see about helping you out! If anyone has remarks about my knowledge no longer being up to date, or my spelling, let me know as well! Bye!

 

* WALL OF TEXT OVER. SAFE TO RESUME READING *


Awesomely detailed, thanks!

 

I'm more than able to buy new; the only reasons for not doing so are getting a matching card and not throwing away one that's perfectly good outside a 4K environment. You wouldn't happen to know if there's a compatibility list? I notice AMD have discontinued their HD line and now have some sort of R7/R9 series which I need to get my head around. I was looking at this card http://www.novatech.co.uk/products/components/amdradeongraphicscards/amdr9290xseries/r9290x-dc2oc-4gd5.html as I wouldn't want a card that drags down the one I currently have.

 

I'm not a member of the build brigade; the most I've ever done is replace RAM and take the heatsink and fan off an old defunct rig (so not really anything...).

 

I need to get into this building malarkey!

 

EDIT: Oh but then PSU could be a problem...... #fwp


Thou meanest this?

 

http://www.wsgf.org/f/u/contrib/article/18259/AMD_CrossfireX_Chart_1619W.jpg

 

EDIT: Building is really not that hard though; it's often compared to Lego in difficulty. The hardest hardware part of a fresh build is usually considered to be applying the thermal paste, although other things like drivers and compatibility issues can stir up a lot of trouble as well.

 

EDIT 2: Only now do I see that the chipset compatibility chart is all blobs, meaning all combos are valid. Anyone remember that fragment from Top Gear where they discuss those car brochures?


Right, the chipset is not really the limiting factor here, but please post your motherboard model here to be sure. I'll give another 'lecture' later today (currently on a tablet, which limits my typing) about chipsets, card compatibility and G-sync / 4K (because why not).


Right, it took me somewhat longer to get home due to stormy weather (thanks, Dutch public transport) and I still had to finish my shift. Anyway.
 
* LECTURE THINGY STARTS HERE *
 
1. CrossFire & card compatibility
 
To CrossFire two cards together, they only need the same GPU architecture, which is why you can 'mix and match' them. This is what the chart refers to: any white or grey blocks cannot be paired together, as their basic makeup differs too greatly. It's like trying to add a second engine to your car for extra power, except your car runs on diesel and you're trying to add a petrol engine and feed it diesel. It won't work (well).

 
For a more professional take on this, watch Linus' As Fast As Possible video on the subject.

 

The chart I provided above basically shows which GPUs can be paired together and whether they require a 'bridge': a bridge, in SLI/CrossFire terms, is a small piece of hardware that lets the cards transfer information directly. Note that this is not required for all setups, but it is most definitely strongly advised because it increases performance considerably.

 

 

 

2. Chipsets

 

The chipset makes up the bulk of the motherboard and is the determining factor in choosing a certain motherboard. It basically handles all information transfer in your PC (between CPU, GPU, RAM, storage, network, etc). As can be seen on this diagram, it traditionally consists of two parts, the northbridge and the southbridge, each of which communicates with different parts of the PC.

 

Now, if you were to pick a card that is not compatible with the particular chipset on the motherboard in your PC, it would not transmit data properly between the two, possibly creating issues ranging from reduced performance to the card not being detected or even being damaged. Luckily, these days it's nigh impossible to do this, since almost all chipsets work with almost all cards.

 

 

 

3.   4K 

 

Technology is a wonderful thing, especially since it keeps improving so quickly. You've probably seen 8-bit graphics before, which was a way to display images without using a lot of memory; memory was scarce due to hardware limitations, because only so much circuitry could fit on a square centimetre, which limited the ability to display high resolutions and vibrant colours.

 

Luckily, as technology improved over time, displays (and everything around them) increased in resolution, so where we were once limited to 320x200 in the 8-bit era, 1080p (1920x1080) is now considered standard by most and already outdated by some. Just for funsies, here's a comparison chart.

 

Due to newer manufacturing techniques that allow for even more circuitry per cm², it's now possible to push the resolution even further, to a whopping 4K, or 3840x2160 (the '4K' referring to the fact that it's just shy of 4,000 pixels along the horizontal axis). However, since it is so brand-spanking new, it's also very expensive at the moment. Prices are expected to drop quite rapidly since the big manufacturers are adopting it quickly, which lowers the cost.
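
To put the jump in perspective, here's a quick back-of-the-envelope calculation in Python; nothing vendor-specific, just the raw pixel counts for the resolutions mentioned above:

```python
# Raw pixel counts for the resolutions mentioned above.
resolutions = {
    "8-bit era (320x200)": 320 * 200,
    "1080p (1920x1080)": 1920 * 1080,
    "4K UHD (3840x2160)": 3840 * 2160,
}

for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")

# 4K has exactly 4x the pixels of 1080p, which is a big part of why it needs so much GPU grunt.
print(3840 * 2160 / (1920 * 1080))  # -> 4.0
```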

 

 

 

4. G-sync

 

G-sync is a technique developed by Nvidia which basically lowers input lag by synchronising the display's refresh rate to your GTX GPU [ SO IT WILL NOT WORK WITH AMD ], which in turn improves smoothness and reduces both screen tearing and display stutter.

It's all done by taking advantage of DisplayPort's capabilities, really.

 

For AMD users, fear not. Samsung is currently developing monitors which support the open FreeSync standard (AMD's answer to G-sync) using similar techniques. Most ''tech people'' expect that G-sync and FreeSync will merge in a couple of years (as has happened with chipset standards and GPU slot standards before).

 

 

 

 

Once again, hope this helps you out. I like doing this, so how about we make it a more or less regular thing on here? If anyone has more questions on this sort of thing, please ask! :D

 

* WALL O' TEXT CEASED *

 

EDIT: all links seem to work perfectly. I also recommend watching LinusTechTips (referred to above), since he explains most of this very well (and keeps it basic).

 

 

 

EDIT 2: may as well throw in some more 4K v. G-sync.

 

Aside from the obvious disadvantage of G-sync requiring you to have an Nvidia GTX GPU, it really does help smooth things out, so it's great for fast-paced action games, whereas 4K is more for people who work with stills, video editing, etc. Each can be used for the other as well, so really it's just personal preference.

 

I would personally go for G-sync, because I'd rather have the low input lag than the high resolution. Also, aside from games rendered in real time on your PC, almost no pre-rendered footage like films, or web pages, is available in 4K yet, so everything looks really tiny, which I don't like. Also, apparently G-sync is so good that when you move text around in a web browser (dragging it with the mouse) you can still read it properly while it's in motion.
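
If you want to see why things look tiny, a rough pixels-per-inch comparison does the trick. Here's a small sketch assuming a 22" 1680x1050 panel against a 28" 4K one (the diagonals are nominal sizes, so treat the numbers as approximations):

```python
import math

def ppi(width_px, height_px, diagonal_inch):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inch

# Approximate figures for the two monitors discussed in this thread.
old = ppi(1680, 1050, 22)   # ~90 PPI
new = ppi(3840, 2160, 28)   # ~157 PPI

print(f'22" 1680x1050: {old:.0f} PPI')
print(f'28" 4K: {new:.0f} PPI')
# At roughly 1.75x the pixel density, unscaled text and UI elements render about 1.75x smaller.
print(f"Density ratio: {new / old:.2f}x")
```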

 

Also, don't even think about 4K + G-sync: not available, and it will make your wallet exploderize.


Again, awesome!

 

After watching that video, it seems CrossFire could well be a pain to implement.

 

I have two weeks off and would rather make the most of it. OK, it may not be the most cost-effective way, but in my typical "ooh, shiny!" pitfall I'm considering a cut-and-shut replacement with a GTX 980, added bonus of ShadowPlay.

 

As far as I can figure out, my current card draws about 15W more than the GTX 980, so a PSU upgrade won't be required.
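
For anyone else weighing up a similar swap, the sanity check is just simple addition; here's a minimal sketch, with the wattage figures below as placeholders rather than the actual numbers for this build:

```python
# Rough PSU headroom check (all figures are illustrative placeholders).
psu_capacity_w = 600      # rated PSU output
rest_of_system_w = 250    # CPU, motherboard, drives, fans, etc. (estimate)

old_card_w = 200          # outgoing card's typical power draw (example value)
new_card_w = 165          # incoming card's typical power draw (example value)

old_total = rest_of_system_w + old_card_w
new_total = rest_of_system_w + new_card_w

print(f"Old load: {old_total} W, new load: {new_total} W")
print(f"Headroom after the swap: {psu_capacity_w - new_total} W")
# If the new card draws the same or less than the old one, the existing PSU
# (and its PCIe connectors) should still cope, assuming it was adequate before.
```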

 

EDIT: Also, I did manage to find a G-sync 4K monitor. I know what G-sync does, but with my lack of knowledge I'm torn (pun intended): would it tear more on a 4K screen, or the same, given that the frames it'll be drawing are much larger?


Nice buy! :D

Sorry for not posting this earlier, but I had some personal family issues that needed to be dealt with first.

 

* TEXT ARTILLERY INCOMING!    TAKE COVER! *

 

Overall, CrossFire / SLI can indeed be a pain to implement, especially since most games are not (properly) optimised for it, so if there is any gain, it would be very little.

Because of that, I would recommend getting a single better card rather than two mediocre cards for SLI / CrossFire (so lucky you! :D). Anyway, the 4K display with G-sync (Acer XB230HK, I presume?) will have lower input lag than the 'regular' 4K one; it should also remove any screen tearing that occurs.

 

Let me go into a bit more detail about screen tearing. Screen tearing is what happens when your PC outputs more frames per second than the refresh rate of your monitor; an example would be running Arma 3 at 100 fps (god knows how, though) on a monitor that says 60 Hz on it.

 

Because the game/PC fps far exceeds the refresh rate, the PC will send a new frame to the monitor while it is still drawing the last one. As a result, you get tears throughout your frames, like this: http://upload.wikimedia.org/wikipedia/commons/thumb/0/03/Tearing_%28simulated%29.jpg/797px-Tearing_%28simulated%29.jpg. It gets even worse when games have strobe effects in them; Dead Space is a well-known ugly example.

 

To prevent this from happening, most games include something called V-sync, aka vertical synchronisation. This technique prevents the video card from writing anything visible to the display memory until the monitor finishes its current refresh cycle. However, V-sync does increase (!) input lag. Because the video card (GPU) now has to wait for every frame to be displayed fully first, your actions take effect later as well: the frames on which you took your action get placed at the back of the queue of frames to display, and hey presto, input lag for everyone!

 

A well-known trick to get around this is to limit the FPS to 1 frame lower than your refresh rate (which is why you have that option in the first place), ensuring that frames never queue up, since the game/PC fps stays below the refresh rate.
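
As a toy illustration of that trick (not tied to any particular game or driver, just the logic described above):

```python
# Toy model of the tearing condition and the frame-cap workaround (illustrative only).

def tearing_likely(game_fps, refresh_hz, vsync=False, fps_cap=None):
    """Very rough rule of thumb: tearing shows up when unsynced fps exceeds the refresh rate."""
    if vsync:
        return False                      # frames wait for the refresh, at the cost of input lag
    if fps_cap is not None:
        game_fps = min(game_fps, fps_cap)
    return game_fps > refresh_hz

refresh = 60
print(tearing_likely(100, refresh))                        # True: 100 fps on a 60 Hz panel
print(tearing_likely(100, refresh, vsync=True))            # False: V-sync, but extra input lag
print(tearing_likely(100, refresh, fps_cap=refresh - 1))   # False: the "cap to refresh - 1" trick
```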

 

A better solution to this problem is getting a G-sync monitor (again, lucky you :D!). The way these monitors work is that instead of limiting how many frames the GPU is allowed to put on the monitor ('slaving the GPU to the monitor'), the G-sync monitor's refresh rate is synchronised to that of the GPU ('slaving the monitor to the GPU').
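
A minimal sketch of that difference in behaviour, assuming the monitor simply follows whatever frame rate the GPU produces within the panel's supported range (the range below is made up):

```python
# Fixed refresh vs an adaptive-sync style monitor (made-up numbers, for illustration).

def fixed_refresh(gpu_fps, panel_hz=60):
    """Classic monitor: the panel refreshes at its fixed rate regardless of the GPU."""
    return panel_hz

def adaptive_refresh(gpu_fps, panel_min_hz=30, panel_max_hz=60):
    """G-sync / FreeSync style: the panel refresh follows the GPU, clamped to its range."""
    return max(panel_min_hz, min(gpu_fps, panel_max_hz))

for fps in (25, 45, 58, 100):
    print(f"GPU {fps} fps -> fixed {fixed_refresh(fps)} Hz, adaptive {adaptive_refresh(fps)} Hz")

# With adaptive sync, each refresh carries exactly one complete frame, which is how it avoids
# both tearing (no partial frames) and V-sync's queuing delay.
```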

 

* TEXT SHELLING CEASED. SAFE TO COME OUT OF FOXHOLES *

 

Overall, I am very happy to hear/see you make such a wonderful purchase :D. I hope to be able to build my own rig soon, since the pre-built I used for ~7 years recently died after my little niece stuck putty in its vents. Pretty much everything died :(, but it does leave me with some nice options for my first build :)

 

Would you give me a little review of how the setup is performing? I am very interested to hear your opinion; also, send my kind regards to your wallet, I hope it recovers soon!



 

I've always had screen tearing and it was just something I was used to; I then researched it properly and started noticing it everywhere. My question about it happening 4x as much came from the thought that it would constantly be drawing 5 frames at once (if I'd actually engaged my brain before asking, I would have realised how silly that sounded). I also thought: what if it was more obvious on a 4K screen? I know the 980 is a good card, but can it do V-sync and 4K? It would suck to have the whole experience ruined by tearing. Plus I think we all know what we think when somebody says you can't have something.  :ph34r: It's the XB280HK.

The only downside I have with the monitor is that my force feedback wheel can make the desk, and therefore the monitor, vibrate, which shakes my TrackIR receiver so that after about 5 laps I'm looking out of the right-hand window  :wacko:. If they'd built it to be a bit more solid this wouldn't be an issue, but hey ho, I doubt it's something that would've come up in design meetings  :lol:

 

My in-game performance while running at 4K is actually about the same as I was getting before, better than I was expecting, with amazing clarity in games (especially when it comes to reading gauges etc.), so it's great for flight simming or anywhere there are a lot of dials/numbers displayed. Browsing, on the other hand, is a bit of a change to get used to (screenshot incoming).

 

[Screenshot: gNrH3Vv.png]

 

As you can see, the desktop background doesn't reach the edges, not by a long shot.

 

In games like Assetto Corsa, the in-game apps can be unreadable at speed (as they're so small now), so I actually had to scale the game down to read them.

 

[Screenshot: nYmYzkA.jpg]

 

It's a bit blurry in the shot above, but trust me, when it's font size 10-12 it's not easy to read, and not suitable for glancing at, at least!

 

It might be worth noting I run an i5 4670K @ 3.8GHz, so although it's not the strongest available, it's still got a reasonable bit of power behind it.

 

My verdict on 4K is: do it if you can, but make sure you've got the graphics card to do it with. For me the GTX 980 is slightly more grunt than is required right now, but a year on it could be a different story.

 

The reason I went with EVGA was the dimensions (about the same size as the previous card, so not a tricky fit) and it uses the same connectors as the previous card (2 x 6-pin); it also seemed to be a reasonable price for what it delivers :)

