Olham

Graphic Card Solutions - Crossfire or one top Card?


Recently I read somewhere here that one user wanted to either get himself a top-ranking new graphics card,

or buy two less expensive mid-range ones to share the job via Crossfire.

 

Now, I don't know anything more about "Crossfire" than that it seems to use two cards sharing the work effectively,

which should provide the user with faster graphics.

I use an ATI HD 4870 (1 GB) now, and could get a second one for maybe 150,- Euro.

The latest flagship from ATI costs more than 550,- Euro, I think I have read.

 

Can anyone here say something about the pros and cons of "Crossfire", and whether it works on every system?



 

Olham, from what I've read, two slightly lesser cards in a Crossfire config will be better than a larger single card, as long as you have a system that supports that setup.

 



Olham,

 

In order to run Crossfire you'd need a motherboard that supports it, and two nearly identical ATI cards (as well as a Crossfire bridge) or a single card with dual GPUs (such as the ATI 5970). The mainboard would have to have two PCI-e slots (preferably capable of both running @ PCI-e 16x even when both are populated). A lot of motherboards, even if they have two PCI-e 16x slots, will drop PCI-e speeds to 8x when both slots are populated. This isn't a game-breaker necessarily, because the difference between 16x and 8x performance levels is, in my opinion, not all that noticeable. Another requirement is that the mainboard's chipset BIOS provides support for Crossfire arrays. Not all do... particularly when it comes to a number of "older" boards. That's easily determined, though, by simply checking the manufacturer's website. Special attention must also be paid to the power requirements of a Crossfire configuration, which are substantially higher than those of running a single card.
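For what it's worth, those requirements boil down to a short checklist. A toy sketch in Python (the field names are my own invention, purely illustrative):

```python
def crossfire_blockers(system):
    """Return which Crossfire requirements (per the post above) a system fails."""
    checks = {
        "two PCI-e x16 slots": system.get("pcie_x16_slots", 0) >= 2,
        "chipset BIOS supports Crossfire": system.get("bios_crossfire", False),
        "two matching ATI cards (or one dual-GPU card)": system.get("matching_cards", False),
        "Crossfire bridge": system.get("bridge", False),
        "PSU headroom for a second card": system.get("psu_ok", False),
    }
    return [name for name, ok in checks.items() if not ok]

# A board with two slots but an older BIOS with no Crossfire support:
print(crossfire_blockers({"pcie_x16_slots": 2, "matching_cards": True,
                          "bridge": True, "psu_ok": True}))
# -> ['chipset BIOS supports Crossfire']
```

Any single missing item blocks the whole setup, which is why checking the board maker's website first is the cheap step.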

 

You also have to remember that your system will only utilize the memory of one of your two Crossfire-arranged cards. I.e., if you have two 1 GB cards, you won't double the amount of available dedicated graphics memory. You'll still be limited to 1 GB, if I'm not mistaken.
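To put that memory point in concrete terms, a toy sketch (my own illustration: in alternate-frame rendering each card keeps a full copy of the textures, so memory is mirrored, not pooled):

```python
def effective_vram(cards_mb):
    """Usable VRAM in Crossfire/SLI is that of a single card (the smallest,
    if the cards are mismatched), because each GPU holds its own full copy
    of the frame data and textures rather than sharing a pool."""
    return min(cards_mb)

print(effective_vram([1024, 1024]))  # 1024 - two 1 GB cards still give 1 GB
print(effective_vram([1024, 512]))   # 512  - a mismatch drops to the smaller card
```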

 

In your specific case, dropping in another card would theoretically "double" your performance in games that actually support Crossfire. In my experience the performance levels are never "theoretically correct", but you would realize a significant performance boost with any game that supports Crossfire or has a Crossfire profile associated with it.

 

Once again, unfortunately OFF provides no support for either Crossfire or SLI, so either enhancement would be a non-enhancement in this particular scenario.

 

The best possible upgrade I can think of in your case would be going to something like the ATI 5850... or better still, the 5870. These cards are becoming more affordable every day, and I suspect when AMD/ATI releases their new offerings in the very near future, prices will drop to very reasonable levels. I can tell you without hesitation that a 5870 would literally wipe the floor with your current card, as would a 5850... just not to quite the same degree... lol.

 

Cheers Mate,

 

Parky


Fundamental knowledge and advice as always, Parky; you cleared up several aspects at once; thank you very much. :good:


Parky is 100% correct. Neither ATI Crossfire nor Nvidia's SLI configurations are supported by the CFS3 engine. That was why I chose to go with the single GTX 580 solution, which I am quite happy with. That said, however, ATI is supposed to be shipping a new single-GPU card sometime in the next few months. It may be worth waiting for that: first because it could potentially be a faster card than the GTX 580, and second because, if I recall correctly, ATI owners don't have the problem with the rear-view mirror reflection not working that Nvidia owners like myself do. I could be wrong on that, but I believe that is correct.

 

For games that do support Crossfire and SLI solutions, the performance gain can be up to an 80% improvement. You will never get the 100% improvement (doubling) that one would expect; however, all the reviews say that it is indeed quite noticeable.
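A back-of-the-envelope way to see what that scaling means for framerates (the numbers are illustrative, not benchmarks):

```python
def multi_gpu_fps(single_card_fps, num_gpus=2, scaling=0.8):
    """Estimate FPS under imperfect multi-GPU scaling.

    scaling=0.8 means each extra GPU contributes 80% of one card's
    performance - roughly the best case reviewers report. A game with
    no Crossfire/SLI profile is closer to scaling=0.0 (no gain at all).
    """
    return single_card_fps * (1 + (num_gpus - 1) * scaling)

print(multi_gpu_fps(40))               # 72.0 - good scaling, not the naive 80
print(multi_gpu_fps(40, scaling=0.0))  # 40.0 - unsupported game, no benefit
```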

 

Hellshade

Edited by Hellshade


Olham,

 

I have the 5870 2GB version in my machine, and it runs BHAH without any problems (as it does with all my games). Previously I had an Nvidia 8600GT, which I thought was a good card, although it only ran at 40fps in BHAH as opposed to the 60fps of the 5870. Both framerates were capped by me, but I can now run at 60fps comfortably with the graphics all set to 5 (UncleAl, do not mention terrain problems on 5; you remind me of a broken record on that subject).

 

Thanks

Rugbyfan1972


Yeah, I'm thinking of getting that card - thank you guys!


I'd like to add something: SLI (and probably "multi-GPU" in general, to include Crossfire) IMHO is one of the most widely misunderstood features of personal computers these days.

 

While I do understand the concept that CFS3/OFF do not have "built-in" support for SLI or Crossfire, I do not agree that CFS3/OFF will not benefit at all from a multi-GPU platform.

 

It is the video driver's job to "support" SLI, not necessarily the application (game). What Parky said, I think, is the absolute truth:

 

"...you would realize a significant performance boost with any game that provides crossfire support or has a crossfire profile associated with it"

 

The thing is - while I cannot speak for Crossfire from personal experience, I can speak for SLI - the "profiles" are something you can create and edit yourself. Many are posted online, like for OFF here, with user 'tips and tricks', etc.

 

While perhaps not as effective as having 'native' support for multi-GPU, I believe the performance benefit of multi-GPU is still possible even if there's no built-in support for it. This was certainly my experience when I ran SLI for about a year: everything graphical I ran was improved at least somewhat, in some way, by using SLI. Some more than others, naturally, but still. If nothing else, for example, SLI would allow me to run AA at a higher level, and at a larger screen resolution, than either of the two cards alone.

 

SLI doesn't make sense if you're buying two brand-new video cards, for exactly the reasons discussed here: primarily, you won't double your performance by using two cards; the gain might go well over 30 or even 50%, depending, but never to twice a single card. The entire idea behind SLI, IMHO, was/is to run the single card you can afford, and then later (after prices on that exact card have dropped) get a decent boost in performance by adding an identical second card, at less than the cost of a new, better card.

 

But the various retailers' price models, sales, rebates, etc. tend to screw up the whole concept *lol*

 

In any event, I believe the best advice when buying new is to buy the best video card you can afford at that time, rather than two of anything else. Later on, once prices come down, you can get a second card like the first for less than the 'latest and greatest' (which tend to be expensive at first release), ideally getting performance near the newest card at a much lower cost.

 

As for PCIe slots, Parky's got a good point: you have to have a mainboard that supports it, on two slots, and many boards do actually drop the number of PCIe "lanes" to x8 on both slots if you fill more than one slot.

 

That being said, however, there are a number of articles, reviews, and tests that I've read on this subject which suggest that two slots running at x8 isn't going to bottleneck current-gen video cards anywhere near as badly as it sounds... seems I recall bandwidth measurements, etc. The impression I was left with is that, with the CPUs at the time of these tests, a CPU bottleneck is far more of a potential problem than running two video cards at x8 instead of x16 (this may no longer be true... I don't honestly recall what generation of CPUs was involved at the time the articles/tests were published). With the appropriate Google search string, you can find tons of info about PCIe x8 vs x16, and it may have changed substantially since I studied it (a year ago, maybe?).

 

HTH

Edited by Tamper


I'd like to add something: SLI (and probably "multi-GPU" in general, to include Crossfire) IMHO is one of the most widely misunderstood features of personal computers these days.

HTH

+1 +1 +1

Today's graphics cards do not employ the original SLI. The way SLI works changed when Nvidia came along. The original SLI, or Scan-Line Interleave, worked by each video card processing alternate lines and then combining them to render a scene.

 

The term SLI was simply carried over when explaining how later video cards were linked mechanically and worked together to render a scene. Each card now works on a section of the scene and then both combine their output. The CPU and system RAM are very important parts of the process. They integrate the video driver and the game developer's "SLI" software, and tell the cards what, when and how to render.

 

The really big difference between the "old" SLI and the "new" is that the cards are now capable of driving insanely high resolutions while delivering tremendous texture detail. BUT you have to have the display(s) with that capability.

 

The best single card of one generation can see a performance gap of 40 to 60% when compared to next-gen video cards running in SLI. Running SLI within the same generation will see much, much less difference in performance. And remember, this performance difference is totally dependent on driver and software efficiency.

 

The other major problems are the PSU and the extra heat generated by multiple video cards. The PSU's continuous load rating, and matching the needed amperage per rail, become VERY important. Air flow cooling the cards becomes even more critical if the software is able to sustain the GPUs' workload.
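The PSU side can be roughed out numerically. A hypothetical sketch (the wattages and the 80% continuous-load rule of thumb are illustrative placeholders, not specs for any particular card):

```python
def psu_ok(rail_amps_12v, system_watts, gpu_watts_each, n_gpus, margin=0.8):
    """Check the 12 V rail's continuous capacity against total draw,
    keeping load at or under `margin` (80%) of the rating for headroom."""
    capacity_watts = rail_amps_12v * 12 * margin
    return system_watts + gpu_watts_each * n_gpus <= capacity_watts

print(psu_ok(50, 150, 160, 1))  # True  - one card fits a 50 A rail easily
print(psu_ok(40, 150, 160, 2))  # False - two cards blow past a 40 A rail's headroom
```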

 

Parky's and Tamper's comments hit several nails on the head. SLI can be of great benefit IF you have a display that will handle high resolutions, minimum 1920 x 1200, for baseline results. SLI/Crossfire setups are already running at 3600-wide resolutions on true monitors! SLI is also totally dependent on your choice of games, the drivers, and the ability of developers to fully utilize SLI's potential.

 

plug_nickel

 

ps: How many "old timers" remember or used Voodoo cards in SLI? I remember when 1024 x 768 @ 70Hz was the hallmark of resolutions. 3dfx gave us that.

Edited by almccoyjr


Al,

 

I do thoroughly enjoy when you are involved in one of these technical discussions :) I certainly remember the "original" SLI, no doubt some others here will, too. OFF has a lot of old RB3D fans, and you know, RB3D was all about Glide...and Glide was all 3Dfx at the time.

 

But I do have to offer some clarification: the "old" SLI was actually what you're referring to - Scan-Line Interleave, where the idea (and the *only* "mode") was to alternate each line of the display data per GPU, in order to theoretically cut the load on each GPU in half. (Of course, there's overhead, etc... but that was the theory.) In fact, I owned one of the rare Voodoo 5500 cards. They were among the first to incorporate two GPUs on a single card, to allow the original SLI concept on a single board (multiple GPU interface slots weren't really around at that time). Of course, that's when 3Dfx was already having serious problems... the 5500s never did too well, and the company was bought up shortly thereafter. As I recall, it was Nvidia that bought them.

 

Today's SLI isn't the same thing, though. SLI now stands for "Scalable Link Interface", and although the idea of a 'bridge' between GPUs is roughly the same, the "new" SLI offers different modes besides just the 'every other line' approach. Granted, they're all still essentially variants of splitting the workload, but there are (at least) two different frame-rendering modes, and at least one additional mode, last I checked. Plus, I don't think the original SLI would ever have worked with more than 2 GPUs, whereas new SLI is designed to accommodate genuine "multi-GPU" setups; there are already triple- and quad-SLI setups. That's the S in SLI - "scalable" - which it was designed to accommodate, where the original wasn't.

 

Of course, now the issue has become the chipset's limitation of 36 PCIe lanes. So, even if you have three cards, one will have to operate at 8 lanes while the other two can run at 16 lanes each, assuming no other cards. (This limitation is present in Intel's X58 chipset; I don't know about whatever Nvidia themselves may have done recently, or the newer chipsets... it changes too fast to keep up with *lol*.) Last I checked, when I was shopping around for my current motherboard, there were very few that delivered 16 lanes on two slots with both populated. I found exactly one that did this and still offered the full balance - 4 lanes - on a third slot (I have a PCIe x4 RAID controller).

 

At present I am not running a second video card; I'm actually weighing a second GTX260/216 vs. a newer 400- or 500-series Nvidia card. I was never (ever) displeased by the SLI setups I had (three of them, actually: 8800GSs, 8600GTs, and 9800GTs), even though SLI was inaccurately blamed for a lot of stuff. I finally figured out and solved where the 'stutters' come from (it's not SLI), and only really stopped using SLI to try a single-card solution due to cost and the aforementioned marketing impact on pricing.

 

For me, it's funny - any number of folks here will readily tell you OFF runs best at higher resolution (no real details, mind you - but I believe it is a consensus, and I've no reason to disagree). The same number will tell you OFF doesn't benefit from SLI...

 

...but one of the biggest benefits of SLI is higher resolutions, and native "support" (by the game) isn't required for it to work - the video driver and configuration takes care of it.

 

So, I say OFF can and does benefit from SLI. How much, as has been discussed, is a function of many other variables. Whether it's "better" than a single card solution is a question of which exact cards, on which system, and how much each will cost. Another factor is that, if OFF isn't the only game you play, then other games might benefit even more from multi-GPU, which might possibly even out-do a single card on a 'bang-per-buck" scale, depending on what's on sale where.


Good points, Tamper. The original SLI acronym carried over even though its function was radically altered and improved. I'd call it, and actually do refer to it as, "SVLI", or Scalable Video Link Interface, because it's so far removed from the original concept. I think, if memory serves me, that Scan-Line went out when PCI Express was fully incorporated, eliminating AGP. The necessary bandwidth was now available to fully render whole sections and then combine them.

 

SVLI comes into its own with higher resolutions and detailed textures. A 5-10% gain in performance really needs to be addressed in those terms, IMHO. BHaH/HiTR looks better on my monitor at 2304 x 1440 @ 80Hz than it does at 1600 x 1200 or 1920 x 1200 because of the gain in texture and detail. I haven't been able to pin it down, still doing some research off and on, but the higher the resolution and frequency, the better SVLI performs, simply because you're now approaching its capabilities. The old Scan-Line was incapable of achieving anything close to that. And if that be the case, then SVLI should be able to enhance many games even though they don't support it. Higher resolutions combined with deeper textures and greater detail "seem" to make a game run better. Unfortunately, marketing seems to have gone down the path of fps performance as the rule of thumb when discussing the benefits of SVLI. The visual gain of 60fps at 1600 x 1200 doesn't compare to 60fps at 2304 x 1440. The kicker with SVLI is the role the CPU, system RAM, drivers and software integration play in its overall benefits. Adding a second video card isn't necessarily a performance panacea in and of itself, but higher resolutions can be.

 

plug_nickel


Try hard to live with it, because it happens to be true. And many times newbies post with unexplainable video problems, and what do you know: simply turning down to 4, and their problems are gone.

 

I think it comes about because most are used to running CFS3 maxed out. I know I did with my 5700LE in AGP.

 

It looks the same, and everybody THINKS their computer is top shelf, so they run OFF maxed out, . . . Surprise . . . Surprise . . . Surprise

 

Actually it's written in enough places, . . but then the average person . . . Doesn't Bother with Instructions

 

Till needed, of course

 

UncleAl,

 

I could have sworn that Rugbyfan more or less "politely" asked you not to respond to his post with that "if you run your terrain slider on 5, your computer will, over a period of time, mysteriously self-destruct or burn your #@%^ing house down" theory of yours. It's absolute rubbish. I'll grant you that it's not exactly rocket science that running any of the sliders on 5, as opposed to a lower setting, is going to result in some kind of performance deterioration, particularly if your hardware isn't up to snuff. A moderately clever chimpanzee could figure that out. He'd also figure out that reducing the terrain slider one notch, either from 5 to 4 or 4 to 3, or even 2 to 1 for that matter, isn't going to miraculously solve all those noobs' "inexplicable video problems", and it isn't going to help them avoid formatting their hard drives either. It may help in some cases, but it's not a cure-all, as you would seemingly have us believe. I've been running all sliders on 5, with the exception of clouds, since last May. I haven't seen any cartoon characters, pink elephants or UFOs appear on my screen yet. Perhaps I'm just lucky, as Rugby apparently is as well, but his "broken record" comment was well justified. You've just managed to reinforce that justification with your latest response.

 

Basically, you're :deadhorse:. So......when somebody asks you directly to cease and desist with one of your ill-founded, completely irrational and irritating "theories", :please: have the courtesy to do so.

 

 

Oh......and......on Olham's behalf, I'd like to personally thank Almccoyjr, Tamper and Hellshade for their valuable, well informed and helpful contributions to this thread.

 

 

This has been a public service announcement......

 

 

Cheers,

 

Parky


Ditto to the last part!

 

And as for our "UncCarp" - you know what that "Smiley" means that you used - flogging a dead horse.


Thanks Parky, for your technical contribution, and the PSA :grin:

 

A correction on something I stated earlier (apparently, "Edit" only works for so long after you post...):

 

"...if you have three cards, one will have to operate at 8 lanes while the other two can run at 16 lanes each assuming no other cards. (This limitation is present in Intel's x58 chipset..."

 

This is inaccurate. The 36 lanes is a limitation of the Intel X58 chipset... I still can't speak for others. But if you operate three cards, and two of them operate at x16 (depending on the board, etc.), then the third card can't exceed x4. Similarly, if running two cards means (due to the board itself) that each of them cuts back to 8 lanes, then a third card could run at x16 and still leave 4 lanes available to other slots. However, it would be my guess that triple- and quad-SLI setups are almost always going to result in each card running at 8 lanes (3x8=24; 4x8=32).
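That lane arithmetic amounts to a simple budget check against the X58's 36 lanes (other chipsets have different budgets, and real boards hardwire their own allocations, so this is only a sanity check):

```python
LANE_BUDGET = 36  # Intel X58 figure cited above; chipset-dependent

def fits(slot_widths, budget=LANE_BUDGET):
    """True if the requested per-slot PCIe lane widths fit the chipset budget."""
    return sum(slot_widths) <= budget

print(fits([16, 16, 4]))   # True  - two x16 cards plus an x4 RAID controller
print(fits([16, 16, 8]))   # False - a third card can't exceed x4 here
print(fits([8, 8, 8, 8]))  # True  - quad-SLI at x8 each (4*8 = 32)
```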

 

And, like Parky mentioned, I wouldn't buy a shiny new x16 card and run it in a configuration where I knew it would be limited to 8 lanes. It just doesn't seem right to me either, even if some indications are that 8 lanes won't choke the card (still not finished studying this, TBH... some day *lol*).


Al,

 

There's no need for me to PM anyone about any problems. I don't have one....but I'll most certainly and eagerly look forward to the release of Phase IV.

 

Cheers,

 

Parky


I always enjoy the technical discussions on this board, especially those with Parky, Tamper, Almccoyjr, and Von Paulus... no disputing that you have a lot of talent when it comes to PCs.

 

However, I have to defend Uncle Al on this one, even though he's tried to twist my tail a time or two. There are still plenty of users here with older or budget builds, including me. I'm running BHaH at 3.6GHz with a relatively slow 5770 card, and can easily run all sliders at 5 with a decent framerate. However, there are times, flying over an airfield, over the frontlines, or during a flak attack, when the framerate drops, the game may stutter briefly, and it interferes with the sense of immersion. I run BHaH with supersample AA, and dropping the sliders a notch or two DOES improve overall graphics performance. Your mileage may vary.

 

 


I "envidia" the tech talk getting much more detailed, "ati" the discourse more involved, as our PCs begin rolling over from the tremendous visual hits Phase 4 will be dishing out, with our sturdy boards beginning their slow, stuttering death spirals to the tarmac below, our video cards dead and fans silent, with only the sound of code and data whistling through the cables, to be finally enveloped in the calming presence of the great BSOD.

 

All will be quiet on the Western Front.

 

Dam*; can't wait!

 

plug_nickel


Al,

 

I do thoroughly enjoy when you are involved in one of these technical discussions :) I certainly remember the "original" SLI, no doubt some others here will, too. OFF has a lot of old RB3D fans, and you know, RB3D was all about Glide...and Glide was all 3Dfx at the time.

Thank you. But this discussion may end up costing us both some money.

 

When my curiosity gets piqued, I hit Google and some forums and start digging. Something I just found out that's taken me completely by surprise was a technical article about games benefiting from SLI - SVLI to be more accurate (hehe). The second card can handle antialiasing without adversely affecting the primary GPU's performance. Nvidia has incorporated a process called SLI Antialiasing at 8x, 16x, and, for quad setups, 32x. It offloads the work to the second card, freeing up the primary "SVLI" card. This function is aimed squarely at "older" games that don't support SLI and that are primarily CPU-bound! I found much of the info at SLI Zone and when I started to dig into some white papers on current Nvidia drivers.

 

If you can run any game at higher resolutions with better textures and detail without incurring any adverse effect, even without gaining any of the mythical fps marketing hype, that's a positive performance boost that's worth it, IMO.

 

It looks like my soon to be installed GTX260/216 just might be having company.

 

plug_nickel

Edited by almccoyjr


Olham,

 

I'm running 2 x ATI 5970 in CrossfireX on my system. There is a huge difference in all my gaming, even OFF P3.

 

I know that OFF is limited by hardware, but the fewer bottlenecks you build into an uber PC, the better for any gaming.

 

Take some cash and buy an Intel or OCZ SSD; that will improve your OFF experience.

Cheers

 

Morris


UncleAl, "falsehood" is NOT a word that comes to mind when I think about you.

Grumbler, yes. Coarse, maybe. Poltergeist, why not. But falsehood - never.

You seem to always speak right from your heart, which can't speak in polite

tongues. Quite a Tasmanian Devil.

 


Thank you. But this discussion may end up costing us both some money.

 

When my curiosity gets piqued, I hit Google and some forums and start digging. Something I just found out that's taken me completely by surprise was a technical article about games benefiting from SLI - SVLI to be more accurate (hehe). The second card can handle antialiasing without adversely affecting the primary GPU's performance. Nvidia has incorporated a process called SLI Antialiasing at 8x, 16x, and, for quad setups, 32x. It offloads the work to the second card, freeing up the primary "SVLI" card. This function is aimed squarely at "older" games that don't support SLI and that are primarily CPU-bound! I found much of the info at SLI Zone and when I started to dig into some white papers on current Nvidia drivers.

 

If you can run any game at higher resolutions with better textures and detail without incurring any adverse effect, even without gaining any of the mythical fps marketing hype, that's a positive performance boost that's worth it, IMO.

 

It looks like my soon to be installed GTX260/216 just might be having company.

 

plug_nickel

 

*lol* Yes, it might at that.

 

As always, your study hits the nail on the head. The info you're referring to - the SLI AA modes - is among the "modes" I spoke of earlier (I don't know that 'modes' is the accurate word). But there definitely are things an SLI setup can do to improve performance overall without requiring 'native' support. This is *the* reason I still don't get why folks say games like OFF (and the "other" WW1 flight sim) don't benefit from SLI.

 

It's almost as if the only "improvement" that's recognized is increased FPS. And while FPS is obviously an important measure of performance, it's another often-misunderstood aspect of PC performance. (I can show you a rig that runs 60+ FPS and still stutters like a fuel-starved rotary...) I think FPS in this respect is "overrated", for lack of a better term.

 

Olham,

 

I'm running 2 x ATI 5970 in CrossfireX on my system. There is a huge difference in all my gaming, even OFF P3.

 

I know that OFF is limited by hardware, but the fewer bottlenecks you build into an uber PC, the better for any gaming.

 

Take some cash and buy an Intel or OCZ SSD; that will improve your OFF experience.

Cheers

 

Morris

 

Absolutely. Again, another great piece of advice for how to get good *performance* - and not necessarily just a higher FPS rate - from OFF (or any other game). Very timely, considering I was just on a rant about FPS being overrated :good:


The info you're referring to - the SLI AA modes - these are among the "modes" I spoke of earlier (don't know that 'modes' is the accurate word).

You know which buttons to push. Your "modes" comment got me to go back, do some research, and shine a brighter light on SVLI. Sorry, I ain't letting the "new" acronym go away peacefully.

 

plug_nickel


Grumbler, yes. Coarse, maybe. Poltergeist, why not.

 

 

:rofl: :rofl:

Edited by 77Scout


I was correct, I said to myself: this'll be good, Scout won't add anything one way or the other, he's a chronic agreer.

 

I have nothing more to add to UncleAl's excellent comments. I totally agree. :no: (grin)


Okay, perhaps we go back to the topic then? Arrh - what was it...?

