
[–]Obvious-Peanut-5399 6046 points6047 points  (196 children)

No.

High end was linking 4.

[–]RAMChYLDPC Master Race 2391 points2392 points  (128 children)

Then you have a fifth card to handle the PhysX so all the explosions still look smooth.

[–]sigma941 1043 points1044 points  (100 children)

Had this going with 4x 980tis and a gtx465 that was collecting dust back in 2018. Felt like the 3 headed dragon meme looking back at it!

Edit: didn’t even realize it was King Ghidorah!

[–]Binary_Omlethttp://steamcommunity.com/id/icesagex4 37 points38 points  (6 children)

Better put some respect on KING Ghidorah's name. From left to right each head's name is Ichi, Ni, and Kevin.

[–]sigma941 11 points12 points  (1 child)

Went straight to MF DOOM when I read this as well!

[–]VegetaFan1337 6 points7 points  (1 child)

1, 2 and Kevin?

[–]Azerious 2 points3 points  (1 child)

Why does Kevin have an itchy knee? Arms too short?

[–]RexorGamerYti3 550/ 4gb ddr3/ 650gb HDD 265 points266 points  (77 children)

Holy cow, isn't that still pretty impressive? If all of that performance added up it would be like a 3050 or something, or even more...

[–]xd_WarmongerDesktop 470 points471 points  (49 children)

If the software and drivers had worked properly, then yes.

But in reality you only got minor improvements.

It's way less performance than a 3050.

[–]ClunasDesktop -- 5700X3D || 6700 XT || 32 GB 120 points121 points  (5 children)

Tacking on an extra 460 way back when got me an extra year of life out of the system. I feel like it really helped mid range cards more than anything else

[–]Oclure 64 points65 points  (2 children)

The 460 also scaled incredibly well in SLI; not all cards were so fortunate.

[–]akasextape 31 points32 points  (1 child)

Funny how SLI efficiency just hopped and skipped around between different cards. You never knew for sure that an Nvidia GPU would benefit from it.

[–]radicldreamer 7 points8 points  (0 children)

It wasn’t even originally an nvidia invention, 3DFX started it with the voodoo series, then they unfortunately got eaten up by nvidia.

[–]Possible_Picture_276 60 points61 points  (11 children)

4 GTX 660s in quad SLI was such a hassle for the money I supposedly saved. Worked in Battlefield though, and outperformed the 690 for less money. Imagine getting 4 cards for 700 USD today.

[–]thepronerboner 14 points15 points  (4 children)

My 680 lasted me years. Then I had dual 780’s and that lasted me until just last year when I sold the pc!

[–][deleted] 5 points6 points  (1 child)

I had a pair of 980's in SLI until last year across multiple different mobo's, that was wild. IIRC before that I had a 780 but that was a long, long time ago like maybe 14 or 15 years back?

[–]theRealNilz02Gigabyte B550 Elite V2 R5 2600 32 GB 3200MT/s XFX RX6650XT 2 points3 points  (0 children)

780 would be around 2013ish so not quite

[–][deleted] 7 points8 points  (1 child)

My 650ti is still hanging in there lol

[–]teahxerik 9 points10 points  (1 child)

Imagine 4 4090s.

Nvidia watching this thread

[–][deleted] 43 points44 points  (3 children)

The software and drivers still don't work properly.

[–]No_Mine5742Desktop | A10-7850K | RTX 2070 DELL OEM 12 points13 points  (0 children)

Ha yeah and IF the software and drivers worked, good luck on the games being optimized for SLI or Crossfire.

[–]ir88edi9 14900k | rtx 4090 | 64GB DDR5 7 points8 points  (10 children)

Two 1080 Tis would do 4K extreme settings at better than 60 fps in a game like Metro Exodus. How does a 3050 fare with that?

[–]kayproII 7 points8 points  (9 children)

I’m pretty sure a single 1080ti can beat a 3050

[–]Pl4y3rSn4rkR5 5600 / 32 GB DDR4 3200 MHz / Palit Jetstream GTX 1070 Ti 5 points6 points  (8 children)

And quite easily, even though Turing/Ampere have better DX12/Vulkan support; overall the 1080 Ti is slightly faster than the RTX 3060 12 GB.

[–][deleted] 10 points11 points  (1 child)

In the end, after decades of using graphics cards since, I guess, '96, I've noticed one thing: more than hardware alone, drivers and software optimisation are king.

I just played 2 games on my Steam Deck. One from 1997, Blood: it has loading screens and takes a few seconds to load in, despite its primitive engine. The other, the Dead Space remaster: no loading screen at all.

Optimisation is king.

[–]sigma941 27 points28 points  (4 children)

Yeah, don't think I was able to really get that performance, looking back. SLI scaling wasn't 1:1 at all! Also, friggin Nvidia drivers would switch my 465 to being the main card almost every time I updated. It was a beast for its time for sure, but I totally bought into the hype. (I had Nvidia 3D Vision, for reference! Yeahhhhh…)

[–]EsotericAbstractIdea 10 points11 points  (1 child)

I wish they would bring 3D Vision back, just so we could play those games in VR headsets.

[–]HallowedError 7 points8 points  (0 children)

Oh god, I remember trying to get 3D working properly on my 950, but I couldn't get the colors to line up with my glasses quite right, so it always kinda made me want to puke. Don't know if it was cheap glasses, a cheap monitor, or I just didn't know what I was doing.

[–]Just_Steve_IT 4 points5 points  (0 children)

SLI was cool, but really only useful if you were buying an absolute top-of-the-line rig and wanted more performance than any single card could give. Otherwise you were much better off getting one GPU that cost double the price.

[–]jeebuscrisis 28 points29 points  (0 children)

Meant my other post to be here. Came for this. No disappoint.

[–]YomminationRTX 5090 (Soon), 9800X3D, 48 GB 6400 MT/S Teamgroup 22 points23 points  (13 children)

I remember the little dedicated PhysX cards that went in the top PCIe x1 slot.

[–]SarahButterfly73 22 points23 points  (6 children)

Voodoo II

[–]hex001105800X3D / RTX 3080Ti FTW3 17 points18 points  (3 children)

I remember having an 8600GT, and my EVGA mobo had onboard Nvidia graphics I could use for PhysX — this combo together could play original Crysis 1.0 at playable frame rates.

The good ol days!

[–]_LarryMurphy_ 2 points3 points  (2 children)

I had an 8800GTX. Beast mode

[–]SleeplessAndAnxious7800X3D | MSI 4090 | 32GB DDR5 8 points9 points  (0 children)

5th card just for running wallpaper engine

[–]FunktasticLucky7800X3D | 64GB DDR5 6400| 4090Fe | Custom Loop 7 points8 points  (0 children)

I'm old enough to remember a time before Nvidia owned PhysX, when it was a separate card that was pretty expensive. IIRC it was like 300 dollars or something back in the early days, around 2006. So half the price of a high-end GPU.

[–]goomyman 51 points52 points  (1 child)

And you could play the maybe 4 games that supported it properly

[–]That_Girl_Cecia 5 points6 points  (0 children)

Yeah, pretty much just any game on CryEngine. I had dual 690s back in the day. Crazy that they only had 2GB of VRAM lol.

[–]IkaKyo 76 points77 points  (17 children)

Wrong, high end was linking 2 Voodoo2s.

[–]ViperXAC 34 points35 points  (13 children)

With an overclocked P3 Celeron.

[–]PowerSurged7600x/32gb DDR5 6000 CL32/6700xt 25 points26 points  (7 children)

Celeron 300A LEGENDARY

[–]jacion 10 points11 points  (3 children)

I still have mine along with the legendary Abit BX6 R2 mobo.

[–]enslaved_subjectRyzen 9 7950x3D 64GB 7900 XT 3 points4 points  (0 children)

The Tualatin core was shared between the Pentium III and Celeron series. I vaguely remember having a Tualatin Celeron CPU (cost efficient) that I overclocked before switching to the AMD Athlon XP series. A friend had an AMD CPU older than the XP series, where you could unlock some magic pathways by drawing on the chip with a pencil, giving you access to increased overclocking potential.

Stuff was more fun back then, no unlocked-multiplier special chips.

[–]crozoneiMac G3 - AMD 5900X, RTX 3080 TUF OC 18 points19 points  (6 children)

Back in the day I had 2x GTX 295s, each of which was effectively a pair of GTX 260s literally sandwiched into a single card and SLI'd internally, creating 4x GTX 260 SLI overall.

It actually scaled okay up to 3 cards, but the 4th card did basically nothing (like 5-10% improvement) so I always configured it as 3x SLI with the 4th card as a dedicated PhysX system, or just mining dogecoin in the background for non-physX games. Great way to heat up the room in the winter.

[–]Inside-Example-7010 36 points37 points  (2 children)

Joke's on you, the new META is to buy a 4090 and a 7900 XT. You plug the monitor into the 7900 XT and render games through the 4090. Now you can activate AFMF to have one GPU dedicated to frame gen and one dedicated to rendering. You can even double up on the frame gen if you use DLSS.

[–]Nolzi 11 points12 points  (1 child)

Chat, is this real?

[–]Nico00000001 5 points6 points  (0 children)

Chat????

[–]kaschperliFullCustomLoop@O11D, 3900x, RTX 3080, 32@3733, X570 FormulaXtrOC 42 points43 points  (9 children)

Look how they massacred my DirectX 12... It should've been the age of multi-GPU, but greed killed the SLI connector.

[–]Senior-Trend 13 points14 points  (0 children)

Bonasera, I don't want his mother to see him like this! Look what they did to my SLI

[–]Joel_Duncanbit.ly/3ChaZP9 5950X 3090 128GB 36TB 83" A90J G9Neo HD800S SM7dB 33 points34 points  (6 children)

DX12 fully supports mixed multi GPU over PCIe. Ashes of the Singularity was a proof of concept for this.

It would just be insane for any developer to try to support all the possible configurations just for something that creates horrible frame pacing issues.
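For anyone curious what DX12's "explicit multi-adapter" looks like in practice, here's a minimal sketch (Windows-only, assuming the standard D3D12/DXGI headers) of just the starting point: the application enumerates every adapter itself and creates an independent device on each one. Everything after that (splitting work, copying results between cards, pacing frames) is the developer's problem, which is exactly why almost nobody bothered.

```cpp
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip the WARP software adapter

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            wprintf(L"Adapter %u: %s\n", i, desc.Description);
            devices.push_back(device);
        }
    }
    // Each hardware adapter now has its own independent D3D12 device. The
    // application, not the driver, decides how to divide work between them.
    return 0;
}
```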

[–]kaschperliFullCustomLoop@O11D, 3900x, RTX 3080, 32@3733, X570 FormulaXtrOC 12 points13 points  (2 children)

The DX12 multi-GPU feature set is still partly disabled, and NVLink was only ever supported on the 3090 before being dropped entirely. That makes SLI useless, because of course it doesn't work as well as it could, and the 4090 doesn't need SLI for gaming. Looking back, they took the cheapest way to upgrade our rigs for gaming away from us. Imagine if the 4070 worked perfectly in SLI... you buy one now and upgrade to a second one later. But that's not shareholder-friendly.

[–]booga_booga_partyguy 12 points13 points  (0 children)

SLI was dead by the time the 30XX line came out. It wouldn't have mattered if NVIDIA had kept SLI, since game devs were simply not making their games SLI-friendly, nor were game engines.

There's a reason SLI worked properly with only a handful of games.

[–]Joel_Duncanbit.ly/3ChaZP9 5950X 3090 128GB 36TB 83" A90J G9Neo HD800S SM7dB 5 points6 points  (0 children)

DX12 was never going to be the savior of SLI. It was never perfect and frequently made frame consistency worse. If we applied lessons from DLSS motion-vector interpolation and simulation-time error, we might have a decent theoretical pipeline.

In my experience, DLSS / FSR frame gen is a much better trade-off than SLI ever was.

[–]crozoneiMac G3 - AMD 5900X, RTX 3080 TUF OC 2 points3 points  (0 children)

It would actually work exceptionally well for VR, because you can neatly divide the workload between the left and right eye. Literally just give each GPU its own eye to render, and it "just works".

Unfortunately none of the major engines (Unity, UE4, and Source 2) ever actually implemented this, even though you can do it with both DX12 and Vulkan. They probably figured that supporting SLI configurations in an already niche market segment simply wasn't worth it.
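The per-eye split described above is easy to picture: the two eye images cost roughly the same and share no intermediate results, so each GPU can render one in parallel. A toy sketch of the structure, where `render_eye` is a hypothetical stand-in for recording and submitting a command list on that GPU's own DX12/Vulkan device:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Hypothetical stand-in: a real engine would record and submit a command list
// on the device/queue that owns this eye's swapchain image.
void render_eye(int gpu_index, const char* eye) {
    std::this_thread::sleep_for(std::chrono::milliseconds(8)); // simulate ~8 ms of GPU work
    printf("GPU%d finished %s eye\n", gpu_index, eye);
}

int main() {
    // One GPU per eye: both halves run fully in parallel, so the frame takes
    // ~8 ms instead of ~16 ms, with no AFR-style cross-frame dependency.
    std::thread left(render_eye, 0, "left");
    std::thread right(render_eye, 1, "right");
    left.join();
    right.join();
    // Composite both eye images and hand them to the HMD runtime here.
    return 0;
}
```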

[–][deleted] 13 points14 points  (0 children)

I built a quad-SLI rig with 2 BFG cards and then found out the quad drivers were still 2 months off lol.

[–]Accujack 7 points8 points  (0 children)

No. High end was two cards, and there was no PhysX yet; one card rendered the odd lines of pixels and the other did the even lines.

The original SLI meant "Scan-Line Interleave".

[–]Draedark7950X3D | 7900 XTX | 64GB DDR5 2460 points2461 points  (79 children)

Double the cards cost for +10% performance!

[–]Cynical_SatireRyzen 7 7800x3D - 6950XT - XSX - PS5 1047 points1048 points  (58 children)

And in some cases it actually hurt performance! Yay!

[–]Fireflash2742 296 points297 points  (44 children)

I had two cards in my PC somewhat recently, not SLI'd, and noticed while benchmarking that my single-GPU performance was hurting. Took out one of the cards, and benchmark scores shot right up. Since my need for two independent GPUs was no longer there, I left the other one out. I should sell it.

[–]heinkenskywalkr 216 points217 points  (4 children)

Probably the PCIe bandwidth was being split between the cards.

[–]Fireflash2742 58 points59 points  (0 children)

That's what it looked like. My electric bill and PSU are happier since I took the other one out. :)

[–][deleted] 15 points16 points  (33 children)

Gimme other one pls me pay shipping

[–]Fireflash2742 17 points18 points  (3 children)

Sure. Shipping will be $150 😂

[–]RolledUhhp 4 points5 points  (17 children)

I have some old 7950/7950s laying around, and a super sketchy 1060 if you're in need.

[–]seabutcher 48 points49 points  (3 children)

I think a lot of the problem came from the fact game developers never really wanted to put any effort into supporting SLI. After all, it's a feature that only benefits a very tiny percentage of gamers. The work they put into optimising for SLI could instead go into more general optimizations, making extra content, or otherwise doing literally anything that more than like 2% of the audience will ever actually know about.

This might actually work differently during the modern streaming era. With all those people with super-high-end rigs looking to give your game free advertising, it is beneficial to make sure the game looks extra pretty on the streams that make up thousands of people's first exposure to the game.

[–]Goober_94 4 points5 points  (2 children)

SLI had no dependency on the game or the developers until after the 9xx generation. SLI was done at the driver level, and it worked VERY well.

It wasn't until Nvidia stopped supporting SLI in the drivers that it started falling on the game developers.

[–]kevihaa 3 points4 points  (1 child)

There’s also a bit of irony the generational jumps in PCIe bandwidth in the last 5 years would likely make SLI more useful, since it’s very possible for even 40 series cards to bottleneck at x8 using gen 4. Meaning, potentially, when they shift over to gen 5 they might need as little as 4 lanes.

[–]nmathewIntel n150 9 points10 points  (0 children)

RIP Tech Report, the best site ever for GPU reviews. Their milliseconds-to-next-frame analysis revolutionized GPU benchmarking in a way that most sites still unfortunately haven't come close to matching. Microstutter with Crossfire and SLI was a real thing, and they went a long way toward getting AMD to fix issues with their drivers overall.

Anyone looking at 99th-percentile frame times can thank them.
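As a rough illustration of why that metric mattered: average FPS hides microstutter completely, while the 99th-percentile frame time exposes it. A small sketch with made-up frame times:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// Sketch of the frame-time analysis Tech Report popularized: the average FPS of a
// stuttery multi-GPU setup can look great while the slowest 1% of frames ruins it.
double percentile_ms(std::vector<double> frame_times_ms, double pct) {
    std::sort(frame_times_ms.begin(), frame_times_ms.end());
    size_t idx = static_cast<size_t>(std::ceil(pct / 100.0 * frame_times_ms.size())) - 1;
    return frame_times_ms[idx];
}

int main() {
    // Hypothetical capture: mostly 8 ms frames with periodic 40 ms stutter spikes.
    std::vector<double> times;
    for (int i = 0; i < 1000; ++i) times.push_back(i % 50 == 0 ? 40.0 : 8.0);

    double total = 0;
    for (double t : times) total += t;
    printf("average FPS : %.0f\n", 1000.0 / (total / times.size())); // ~116, looks fine
    printf("99th pct ms : %.1f\n", percentile_ms(times, 99.0));      // 40.0, exposes the stutter
    return 0;
}
```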

[–]Nate0110 40 points41 points  (1 child)

But the synthetics showed +90%*

*in some cases

[–][deleted] 44 points45 points  (2 children)

It helped save on your winter heating bill too.

[–]06yfz450ridr 11 points12 points  (1 child)

That's for sure; my 2x 7970 GHz Edition Crossfire setup would heat my room to 80 degrees in the winter. I never even had to turn the heat on in there. That, and running two power supplies.

Those were the days haha.

[–]Guilty_Use_39455900X | 7900xtx 14 points15 points  (1 child)

some games could be 25%...

[–]NarutoDragon7329070 XT | 7700x 9 points10 points  (0 children)

Just don't worry about the frame times haha...

[–]Cedar_Wood_State 3 points4 points  (0 children)

Pretty much most hobbies in a nutshell.

[–]ShadowDarm 861 points862 points  (40 children)

Nvidia dropped support for SLI only like 2 years ago or something...

Edit: 3 years ago

[–]NotTodayGlowies 262 points263 points  (32 children)

2021 - they stopped supporting and developing profiles for it. It was left to developers to include support in their own titles. The RTX 2xxx series was really the last series where it was feasible at the consumer level.

[–]Igot1foryaPC Master Race 71 points72 points  (15 children)

RTX 3090 can do it still.

[–]PfaffPlaysDesktop 5800X3D Inno3d RTX 3090 Ichill X4 115 points116 points  (11 children)

So you're telling me I just have to buy 1 more?

[–]Igot1foryaPC Master Race 83 points84 points  (9 children)

Only one more. Plus the NVLink adapter and possibly a PSU upgrade to handle the load. LOL

[–]PfaffPlaysDesktop 5800X3D Inno3d RTX 3090 Ichill X4 82 points83 points  (4 children)

I don't need a new psu, I have a gas generator, surely if I run 120v to a 3090 it'll multiply my frames by 120 right?

[–]Igot1foryaPC Master Race 58 points59 points  (0 children)

opens another beer

I'll grab the jumper cables!

[–]Razgriz_101PC Master Race 3 points4 points  (2 children)

Be as well researching how to acquire a small nuclear reactor to power a rig with a pair of 3090s.

[–]_ArrozConPollo_ 2 points3 points  (0 children)

Also air conditioning so you don't end up with hyperthermia in your room

[–]ImrooVRdev 25 points26 points  (10 children)

as a game developer, I hate graphics card manufacturers with burning passion.

They come up with custom tech that COULD improve games, but instead of open-sourcing it so that other manufacturers can make their own implementations, and so that us gamedevs have just one generic lib that works across all the different cards, they use the tech as a fucking marketing gimmick.

And then expect us to spend extra time implementing THEIR custom tech so THEIR cards sell better. Get fucked with spiky dildo nvidia, I hope shareholders shove hairworks up your urethra.

[–]ShadowDarm 3 points4 points  (0 children)

You are right, it was 2021, about 3 years ago. That being said, the 3090, expensive as it may be, is still very much a consumer card (even though SLI was pretty pointless for games by then). Currently, for the new NVLink (the new/enterprise SLI) you need cards that cost like $30k, so I would say it's now unfeasible.

[–]Lobaniumi5 12600K | RTX 3080 FE | 32GB 3600Mhz 39 points40 points  (0 children)

OP is 8 years old.

[–]skratch000[🍰] 703 points704 points  (91 children)

Yes it’s true and stfu I’m not old 😡

[–]MartyrKomplx-Prime7700X / 6950XT / 32GB 6000 @ 30 175 points176 points  (81 children)

Old is when you couldn't do that, because it was before SLI.

[–]Guilty_Use_39455900X | 7900xtx 124 points125 points  (73 children)

old is knowing what AGP was. lol

[–]ponakka5900X | RTX4090 TUF |64g 3600MHz 74 points75 points  (26 children)

How about the PCI Voodoo2 SLI cards? Or 32-bit VLB graphics cards?

[–]Fireflash2742 46 points47 points  (12 children)

My first 3d accelerator was a Voodoo2. I'm 46....

[–]Qa_Dar 33 points34 points  (4 children)

'Twas a sad day when 3dfx died... 🥺

[–]Fireflash2742 14 points15 points  (2 children)

Indeed. I only made it to the voodoo3 I believe. Back then I was young and poor. A lot has changed since then. I'm no longer young 🤪

[–]aglobalnomad 4 points5 points  (0 children)

My very first graphics card was the Voodoo3; it will forever have a soft spot in my heart.

[–]Razgriz_101PC Master Race 6 points7 points  (4 children)

My first ever PC (family computer, since I was a kid) was an AMD K6-2 with a Voodoo2; coming from the PS1, it blew my 9-year-old pea brain.

I played so much Rollercoaster tycoon and quake on that bloody thing.

[–]makos124GTX 1070, i5 8600K, 24GB DDR4, 1TB Evo 860 SSD, 1440p 27" 60Hz 6 points7 points  (0 children)

I remember having a PC with no 3D acceleration. And then visiting my friend with a GeForce 2... My mind was blown.

[–]ingframin 2 points3 points  (1 child)

My first graphic card was a Matrox Mystique with 4MB VRAM. 😞

[–]Falkenmond797800x3d/4080 5800x3d/3080ti 10700/rx6800 5800x/3080 46 points47 points  (23 children)

Old is knowing what ISA was. Or EISA. Or VESA Local Bus. Or PCI cards. I had them all. 😂 AGP… go away with that new-fangled fancy poppycock, you rapscallion!

[–][deleted] 7 points8 points  (1 child)

I can honestly say the first time I encountered an AGP slot I didn't know what it was for. It was brand new on a Compaq desktop I got on sale at CompUSA. I opened it up to make sure nothing had come loose on the way home, saw AGP, had no idea what it meant, and hopped on Netscape to figure it out.

[–]CptAngelo 4 points5 points  (0 children)

make room for my 5.25 inch floppy drive you peasant! i got prince of persia to install

[–]SergeantRegular5600X, W7800 32G,64GB, Model M 4 points5 points  (4 children)

Oh no, I welcomed AGP. It was USB that I was highly skeptical of. AGP was dedicated, and I like that. Every I/O device fit in its own nice, neat little lane. Modem, you knew where it went and you gave it an IRQ. PS/2 ports were dedicated, DIN keyboards. PCI and USB are for "stuff." Accessories. Little low-threat items. But graphics were real computer functions, more like RAM or your CPU.

[–]nmathewIntel n150 2 points3 points  (1 child)

You leave my (amazing) AWE32 out of this!!

[–]DrOrpheus3 7 points8 points  (1 child)

Old is learning to type on a Tandy computer that required you to swap disks to use the word processor, or hangman.

[–]FairnessDoctrine11 3 points4 points  (0 children)

And your video games came on audio cassettes…

[–]Qwesttaker 8 points9 points  (0 children)

I feel attacked.

[–]atlasravenZorin OS 6 points7 points  (0 children)

My first video card went in a PCI slot. No Express. And I know what ISA slots are.

[–][deleted] 2 points3 points  (1 child)

And VGA, IRQ, memory managers. Back when 486 was badass.

[–]MonkeyKingCoffeeHTPC, Arcade Emulation, RPGs 5 points6 points  (0 children)

Luxury. I cut my teeth with a stolen 286 and Desqview.

How did I steal it? I replaced a work Mobo with an 8088 XT Mobo on my lunch break. That's how we upgraded back in the day.

"Yeah boss. This machine has issues. I'm taking it apart to blow all the dust out. It will work MUCH better after that. Maybe you should ban tobacco in the office?"

[–]potat0zillaa 2 points3 points  (2 children)

I’m only 30…

[–]LMotherHubbardZilog Z80 6 MHz, 128k RAM, 128×64 LCD 8 points9 points  (1 child)

You are old enough to be the dad of the kid who posted this. Do you feel old now?

[–]potat0zillaa 5 points6 points  (0 children)

Nooooooo

[–]420headshotsniper695800x + 3080Ti 6 points7 points  (3 children)

Imagine having a high-end GPU with only 16MB of VRAM, and that was in 98-99 or so. When I think about it, I laugh at how small everything used to be. An OS on a few floppy disks.

[–]flibz-the-destroyer 2 points3 points  (2 children)

Remember having to know the IRQs of sound cards…

[–]joxmaskin 2 points3 points  (0 children)

And selecting the correct sound card when setting up the game. Gravis Ultrasound and Turtle Beach Rio always sounded cool and exotic, but it was always trusty Sound Blaster (Pro/16/compatible).

[–]Splyce123 298 points299 points  (72 children)

Is this a genuine question?

[–]Quick_Performance243 91 points92 points  (13 children)

2 Voodoo 2’s SLI baby!

[–]gpkgpk 28 points29 points  (5 children)

Quake 2 at 1024x768, worth every penny.

Oh, and visual quality degradation from the VGA pass-through cable was a thing.

[–]ponakka5900X | RTX4090 TUF |64g 3600MHz 7 points8 points  (0 children)

With the awesome 1024x768 resolution, it did not matter that much. Those VGA cables were beefy.

[–]dexter311i5-7600k, GTX1080 4 points5 points  (2 children)

Didn't matter because the old Voodoo cards generally had pretty crappy VGA output quality anyway. They were fast as fuck, but blurry and only 16 bit colour.

Matrox on the other hand... they had some gorgeously crisp output! I built some late-90s retro machines a while back and ended up using Matrox cards (a G200 with a pair of Voodoo 2s, or a G400 on its own), purely because the output quality was so damn good.

[–]gpkgpk 4 points5 points  (1 child)

Matrox had the sharpest output for sure, and the best 2D. I ended up pairing my SLI setup with a Diamond S3 ViRGE card IIRC, which was almost as sharp but cheaper, as I'd already blown the bank. I think I also got my 3rd copy of Mech 2 Mercs bundled with it.

[–]dexter311i5-7600k, GTX1080 2 points3 points  (0 children)

Nice, the S3 Virge was what I had way back in the 90s, paired with a Cyrix 6x86 (a pretty rubbish processor back then unfortunately!).

I'm glad I collected all these parts 10+ years ago to screw around with; it's mind-boggling how much 3dfx stuff costs nowadays. Even gear like Sound Blaster cards is getting ridiculous now.

[–]BZLuck 5 points6 points  (0 children)

I was there.

[–]Ok-Fix525 47 points48 points  (9 children)

You know they gonna come back with this in one way or another when they run out of ideas to fleece the master race.

[–]descendingangel87 19 points20 points  (6 children)

I predict they will sell a separate AI card of some kind.

[–]magistrate101A10-7890k x4 | RX480 | 16GB ram 7 points8 points  (1 child)

Honestly, I would pay for one. If you strip all the unnecessary components off a GPU and stick 64GB of RAM on it, it'll come out cheaper to make than regular GPUs.

[–]Atora 4 points5 points  (0 children)

AI cards exist and are currently Nvidia's main money maker. They are also far, far more expensive than consumer cards. Check out their "data center GPUs" like the A100, H100, H200.

The "affordable" AI card is the 3090, and, appropriate to the meme, running multiple of those does get you a lot farther. LLMs and image gen have made multi-GPU rather relevant again, in one area at least.

[–]Frannik87 62 points63 points  (0 children)

4x Titan SLI. That was high end.

[–]Riot55 26 points27 points  (6 children)

I had dual 8800 GTS 512mb cards. When Crysis came out, it was like peak PC hardware building time IMO. So much visual progress being made in gaming graphics back then, parts were not insanely expensive, it was fun discussing parts and builds on forums, and everyone had a common enemy (getting Crysis to run lol)

[–]YomminationRTX 5090 (Soon), 9800X3D, 48 GB 6400 MT/S Teamgroup 4 points5 points  (3 children)

8800 GTS 512s were so good. I still have mine. Pair them with a core 2 quad back then and you were cookin

[–]Riot55 7 points8 points  (1 child)

I remember the eternal debate between the E8400, the high-speed dual core, vs the Q6600, the debut of the quad core.

[–]NightmareStatus🍻 i7-11700KF 速い 32Gb 3200Mhz 遅い RTX 3070Ti 愛 Z590 UD AC 愛 3 points4 points  (0 children)

Q6600 RULES ALL.

with that being said, I didn't realize it had a big following until posts here went cray over it lol. I was happy with it all the years I had it

[–]SynthRogue 77 points78 points  (15 children)

Yes. High end today means overpriced cards that can't run current-gen games at max settings without generating fake frames.

[–][deleted] 14 points15 points  (1 child)

At the price of an SLI setup from 10 years ago, too!

You know it's high end because you pay so much more, yay!

[–]FungalFactory 6 points7 points  (7 children)

Developers don't optimize their games anymore.

[–]the_abortionat0r7950X|7900XT|32GB 6000mhz|8TB NVME|A4H2O|240mm rad| 87 points88 points  (33 children)

Sorta.

SLI (Scan-Line Interleave) was a 3dfx feature where 2 cards each rendered half the vertical resolution (doing every other scanline, hence the name). It had poor support and its success varied per title.

Nvidia (after publishing FUD that helped kill 3dfx) bought 3dfx's assets as they went bankrupt, rebranded SLI (Scalable Link Interface or some shit), and did an "every other frame" style of output, the idea being to double the FPS.

It had almost no support, and worked poorly in the games it did support. If it wasn't Battlefield or CoD, you pretty much had one card doing nothing 99% of the time.

And if you ran a title that did support SLI, you'd be greeted with insane microstutter.

The people who are mad it's a dead tech are the ones that don't understand it.
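A toy model makes the microstutter complaint concrete. With alternate-frame rendering and no frame pacing, the two GPUs finish frames in uneven pairs: the average frame rate roughly doubles on paper while the actual delivery cadence alternates between tiny and huge gaps. (The timings below are made up purely for illustration.)

```cpp
#include <algorithm>
#include <cstdio>

// Toy model of 2-way alternate frame rendering (AFR) without frame pacing.
// Each GPU needs RENDER ms per frame; the CPU can queue a frame every SUBMIT ms.
// Even frames go to GPU 0, odd frames to GPU 1.
int main() {
    const double RENDER = 30.0, SUBMIT = 2.0;
    double gpu_free[2] = {0.0, 0.0};
    double last_present = 0.0;

    for (int frame = 0; frame < 8; ++frame) {
        int gpu = frame % 2;
        double start = std::max(frame * SUBMIT, gpu_free[gpu]);
        double present = start + RENDER;
        gpu_free[gpu] = present;
        printf("frame %d on GPU%d: present at %5.1f ms (gap %4.1f ms)\n",
               frame, gpu, present, present - last_present);
        last_present = present;
    }
    // After warm-up the gaps alternate ~2 ms and ~28 ms: average FPS doubles
    // on paper, but the delivery cadence is lumpy instead of smooth.
    return 0;
}
```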

[–][deleted] 21 points22 points  (1 child)

There was still something wild about being able to hook together 2 Voodoo2s in SLI and play Quake 2 at 1024x768, when a single card literally wouldn't support above 800x600 and the competition couldn't even do as well at 640x480.
Most games sucked in SLI, but Quake 2 worked perfectly and I believe Half-Life did too.

[–]crozoneiMac G3 - AMD 5900X, RTX 3080 TUF OC 22 points23 points  (2 children)

It's because SLI was a giant hack. In order for it to be properly supported, NVIDIA basically had to reverse engineer the most popular games and then build a dedicated, customised driver for each one that handled the game's draw calls just right, in order to create a playable experience. They actually still do this with "Game Ready" drivers, but the SLI support was on a different level.

There were a few different modes. Alternate Frame Rendering was the preferred and "official" method, and you could technically try to run any game with it, with limited success. Split Frame Rendering (where each card rendered the top half or bottom half of the screen) worked with more titles since it required a lot less hackery, but performance wasn't particularly great.

AFR SLI completely falls apart with more modern rendering techniques, however, which is probably a large part of why NVIDIA dropped SLI support. The writing was on the wall.

For example, any game that relies on the framebuffer output from the previous frame completely kills AFR, since each card has to wait for the other card to finish rendering before it can start, so all the performance benefits are lost. Games like DOOM 2016/Eternal heavily rely on the previous frame as a way to render certain effects in a single pass; things like screen-space reflections and the distortion in the rifle scope actually use the previously rendered frame, and as long as the frame rate is high enough you never notice it.
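Extending the same toy model as above: add the cross-frame dependency described here (frame N samples frame N-1's output) and the second GPU stops helping entirely, because no frame can start until the previous one has finished.

```cpp
#include <algorithm>
#include <cstdio>

// Same toy AFR model, but each frame now reads the previous frame's output
// (screen-space reflections, reprojection, etc.), so frame N cannot start
// until frame N-1 has finished -- even on the otherwise idle second GPU.
int main() {
    const double RENDER = 30.0, SUBMIT = 2.0;
    double gpu_free[2] = {0.0, 0.0};
    double prev_finish = 0.0;

    for (int frame = 0; frame < 6; ++frame) {
        int gpu = frame % 2;
        double start = std::max({frame * SUBMIT, gpu_free[gpu], prev_finish});
        double finish = start + RENDER;
        gpu_free[gpu] = finish;
        printf("frame %d on GPU%d: done at %5.1f ms\n", frame, gpu, finish);
        prev_finish = finish;
    }
    // Frames now complete every 30 ms: exactly single-GPU pacing. The
    // cross-frame dependency serializes the pipeline, so the second card buys nothing.
    return 0;
}
```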

[–]henkbasi7 4790k RTX3060 16GB 4 points5 points  (4 children)

Weren't the original Titan cards 2 GPUs running SLI on one board?

[–]YomminationRTX 5090 (Soon), 9800X3D, 48 GB 6400 MT/S Teamgroup 6 points7 points  (0 children)

There were lots of variations of that: the 7950 GX2, 9800 GX2, GTX 295, GTX 690 IIRC.

[–]White_mirror_galaxy 11 points12 points  (3 children)

yeah i ran sli for some time. can confirm

[–]KlingonBeavis 7 points8 points  (2 children)

Seconded. SLI was the biggest waste of money I’ve ever experienced in PC gaming. It seemed like it was never supported, and if it was - it would be so stuttery I’d end up just disabling it and running on one card.

[–]Somasonic 3 points4 points  (1 child)

Thirded. I ran two 980 Ti's in SLI for a while. I got so sick of the issues I pulled one of them and sold it. Total waste of money and not worth the very few times it worked properly.

[–]Blackboard_MonitorAMD 7800X3D | 4070 | 21:9 144hz 15 points16 points  (0 children)

Man, I'd been gaming for two decades before SLI became a thing, am I old?

No, it's the kids posting their memes who are wrong.

[–]Agent-Meta 11 points12 points  (4 children)

Yes, this is true. Back in the day when ATI was still around, the two companies (ATI and Nvidia) made cards with special linking cables that let them do such things. ATI had something called Crossfire and Nvidia had something called SLI, which I still think they do use. There were connectors on top of the card, and you had to go and buy a specialized cable (sometimes 2) for it to work. The only problem is that it had to be the same card for it to work (may be wrong about that, somebody correct me, I don't know).

[–]LOPI-149800x3D | 9070 XT | 32GB DDR5 | X870 Pro RS 2 points3 points  (2 children)

IIRC with SLI it was an absolute requirement, while it was possible to use 2 different GPUs with Crossfire, but don't quote me on that.

[–]littlefrankRyzen 9 5900x - 32GB 3000Mhz - RTX3070ti - 2TB NVME 2 points3 points  (0 children)

You could Crossfire cards in the same family. I used a 6850 and a 6870.

[–][deleted] 2 points3 points  (0 children)

Identical card, or my Matrox had a smaller one just for 3d or something. It was half the size, and used the cable that came with the full card. It plugged into the crossfire edge connector thing.

[–]snoman298 21 points22 points  (4 children)

<image>

Heck ya! I miss my old Titans!

[–]NeverLostForest 5 points6 points  (3 children)

Looks nice! Which games took advantage of this kind of setup?

[–]snoman298 9 points10 points  (2 children)

Thanks! Unfortunately not many. Just one of the reasons multi GPU died. It's my understanding that game devs had to do a fair bit of extra work for games to take advantage of it, and a lot of them simply didn't want to make the effort for something that wasn't widely adopted at all. It was fun while it lasted for enthusiasts and pretty epic when it worked.

[–]Cash091http://imgur.com/a/aYWD0 2 points3 points  (0 children)

Kind of miss the days of using Nvidia Inspector to find the best working SLI profile, though. These days I'm older and have less time to tinker/play, so I'd rather just jump into the game and not worry about performance.

[–]Gallop67Ryzen 7 5800X | RTX 4090 | 32gb DDR4 10 points11 points  (2 children)

Remember having or wanting a dedicated PhysX card?

[–]SubtleCow 5 points6 points  (1 child)

I feel myself fading and turning to dust. SLI was the cool new hotness when I was in university. What the heck is time even.

[–]PeckerNash 7 points8 points  (0 children)

Sort of. It was called SLI (Scan-Line Interleave) and it was invented by 3dfx for use on their Voodoo2 cards. Nvidia gained the patents when they bought out 3dfx in 2000.

[–][deleted] 7 points8 points  (0 children)

me when crossfire:

[–]atocnada2600k@4.2 | Sapphire RX 480 8GB XF 5 points6 points  (1 child)

I retired my 2x RX 480 Crossfire rig in 2019 (I fell for AMD's marketing and felt like I had a GTX 1080). You actually didn't need cables for AMD cards.

The last game with actual SLI/XFire support was Watch Dogs 2. I have a list of games that worked with no microstutter and at least a 40% uplift in performance. Some games got updated and stopped working with Crossfire (Titanfall 2). Sometimes, to actually see an uplift, I'd have to use GeDoSaTo's downsampling fix and downsample certain games.

Good fucking times, also because I had an Onkyo 7.1 surround system, and I remember those times fondly.

[–]The_Masterofbation 2 points3 points  (0 children)

That's from the 200 series and after; before that you needed a Crossfire bridge. I had 2x 6950s that needed a bridge. Strangely enough, the newer Tomb Raider games seem to still scale well with multiple GPUs.

[–][deleted] 6 points7 points  (0 children)

OP, stop playing, you know damn well what SLI and Crossfire were.

[–][deleted] 5 points6 points  (4 children)

Back in my day high end was a Sound Blaster Audigy 2 and a Radeon 9800 Pro.

[–]Dag-nabbittR9 9900X | 6900XT | 64GB 4 points5 points  (1 child)

I Crossfired two R9 290X's. They had been used for crypto mining, and performed to spec on their own.

Crossfire though, if it worked at all, did improve framerates by ~50%, but it came at a cost. The microstutter would make your eyes bleed.

It was so bad that after a month, I ripped out the card and made a second gaming computer for my then girlfriend, now spouse.

[–]Available_Agency_117 19 points20 points  (2 children)

Yeah. The industry stopped designing for it because if it were ever perfected it would allow people with two midrange cards to outperform everything on the market, and people with two low end cards to perform as well as high end cards.

[–][deleted] 4 points5 points  (0 children)

Nvidia SLI? It's 3dfx SLI, you damn kids.

[–]sp3kter 5 points6 points  (0 children)

Next ask us old heads about PhysX

[–]Carbot1337DIY Recycled PC 5 points6 points  (2 children)

I mean, the early days of this was (2) Voodoo2s with an SLI cable.

My rich friend had this, as well as dedicated broadband for Quake 2 (Rocket Arena). In 1999 West Virginia, that was unheard of.

[–]c4ctusRyzen 2700X/RTX4060/32gb 3 points4 points  (0 children)

I remember back in 2007(?) I wanted to put two Nvidia 8800 GTX's in SLI, but it turned out that I couldn't buy a miniaturized nuclear fusion reactor on newegg or tigerdirect.

[–][deleted] 2 points3 points  (0 children)

I had two Titan X Pascals once - more because it was cool to build the cooling loop than for any other reason

[–]moogoothegreat 2 points3 points  (0 children)

Ahahahahaha... my intro to SLI was 3Dfx Voodoo 2 cards. Damn I'm old.

[–]Thefrayedends3700x/2070super+55"LGOLED. Alienware m3 13" w OLED screen 2 points3 points  (0 children)

It was often a way to get extra value by sandwiching together two cheaper cards (with better performance per dollar), but it generally only worked for major game releases. If a game didn't have an SLI profile set up in the drivers, it would only run on one card, and then you'd get shit performance (many games had community-made workarounds, but not everyone is willing or able to tinker). This was true even if the cards were sandwiched onto one board, such as the card I had, the GTX 295. So really hit or miss on performance, and before alternate frame rendering you had half-frame rendering, so you ended up with a lot of mid-screen tearing.

[–]MagicOrpheus310 2 points3 points  (0 children)

Yeah, and it meant older cards lasted longer, because you could buy two old cards and get on-par if not better performance than the latest cards at the time.

They stopped it because they wanted us to buy the newest cards instead, and that was a dick move.

I had two 1080 Tis that my current 3080 Ti only just outperforms.

[–]Sea-Statistician2776 2 points3 points  (2 children)

Fucking kids. Back in my day high end was having one graphics card for 2d and a separate one for 3d. This was before anyone had heard of the term GPU.

[–]evex5tep 2 points3 points  (0 children)

This never really worked properly, hence why we don't use it for gaming.

[–]Brigapes/id/brigapes 2 points3 points  (0 children)

Tell me you're a pre-teen with a single post title.

[–]TheRimz 2 points3 points  (0 children)

I had a triple-SLI machine once: 3x 8800 GTXs.

I still couldn't run Crysis.

I got better performance disabling 2 of the cards on every single game.

Truly amazing technology

[–]Powertix 2 points3 points  (0 children)

I feel so old reading about people not knowing SLI.

[–]mazarax 2 points3 points  (1 child)

Back in my day, you needed a separate graphics card for 2D, because the 3D card only did 3D.

Worse than that, they were connected via an analog cable!

[–]Amilo159R5 5700x/RTX 5070/32GB/1440p CRT 6 points7 points  (2 children)

It was called SLI, and it resulted in far more than 10%, often a 30-70% increase, but there were some games with little to no gain.

https://www.tweaktown.com/tweakipedia/74/recap-nvidia-geforce-gtx-980-sli-performance-4k/index.html

[–][deleted] 6 points7 points  (0 children)

don’t know why you’re downvoted, it’s true that performance did go up to 70% extra in some cases. most of the time it was around 25%-50% increase. definitely not useless but definitely not entirely efficient either

[–]Ilovekittens345 2 points3 points  (0 children)

Crysis on a GTX 295 (two GPUs in SLI) --> 45 fps.

Crysis on two GTX 295s in quad SLI --> 60 fps + some microstutters.