Where are the external video cards?

One of the things I still don’t understand is why AMD and nVidia haven’t started making external video cards.

Obviously, one problem is going to be what you plug it into.  But it would sure be handy to be able to bring my laptop home, plug it into my docking station, and play some mega cool games on it.

Reply #1

Well, unlike hard drives and other USB 2.0 or FireWire devices, video cards can't easily vary their speeds. The idea of an external video card sounds nice indeed, but it'd need a whole different kind of external slot, which would really make for some crazy-looking towers/laptops. But hey, I'd buy one - after I got a motherboard capable of handling such a thing, of course.

Reply #2

There are hardware issues with both shielding the cable and, even more importantly, the length of the cable, vis-à-vis signal strength and integrity.  That said, I recently used the nVidia QuadroPlex, which is a high-end version of what you are talking about, and I just saw this article on Guru3D about another vendor's external graphics card product.

http://www.guru3d.com/news/amilo-sa3650-with-graphics-booster-review/

 

Reply #3

Eventually, it'll end up as a LEGO computer.

 

:fox:

Reply #5

I was just talking about this with somebody.   I figure it's because of the processing that a video card must do.   The CPU, RAM, and video card are pretty much all bundled together to ensure efficiency and speed.  If you don't have them all plugged into the motherboard bridge, wouldn't it cause all sorts of problems?

Reply #6

The required technology for external video cards has only become available recently. AGP doesn't allow long cable lengths, so it is unsuitable for external video cards. With PCI Express it has become possible; however, 16 PCI Express lanes are something not many vendors would like to waste on an external port.

Reply #7

Like this?

 

http://ati.amd.com/technology/xgp/index.html

Reply #8

There are a few of them out there - I've seen them.  But I don't know why they aren't more prevalent.

Reply #9

Like I said above, the nVidia QuadroPlex already does this as an external PCI Express bridge.  :)

Reply #10

Bandwidth.

A 16x PCIe slot is capable of 4 GB/s in each direction. Your average USB 2.0 port is capable of 60 MB/s. Gigabit Ethernet is capable of 125 MB/s (theoretical). eSATA is capable of 375 MB/s (theoretical). None come close to PCIe's max speed.

NOTE: GB/s = gigabytes per second. Multiply by 8 to get gigabits per second.

For best operation, a video card needs the full bandwidth of the bus.

The best way to get great video on a laptop is to buy a laptop with great video. Buy a gaming laptop.
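The gap between those interfaces is easy to see with a quick back-of-the-envelope sketch (using the peak/theoretical figures quoted above, not real-world throughput):

```python
# Peak/theoretical bandwidth of each interface, in MB/s,
# as quoted in the post above.
interfaces = {
    "PCIe 1.x x16": 4000,      # 4 GB/s per direction
    "USB 2.0": 60,
    "Gigabit Ethernet": 125,
    "eSATA": 375,
}

pcie = interfaces["PCIe 1.x x16"]
for name, rate in interfaces.items():
    # Show each interface as a fraction of a full x16 slot.
    print(f"{name}: {rate} MB/s = {rate / pcie:.1%} of PCIe x16")
```

Even the fastest external option here, eSATA, delivers under a tenth of what a x16 slot provides.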

Reply #11

This is actually a very good idea. It would be a simple matter to design, or modify, an existing mobo design to give it an external PCIe slot, which would couple with its mate in the dock. The graphics board, cooling, and power supply would be built into the dock itself.

Why stop at graphics? One could also add a higher end sound board.

This wouldn't necessarily have to be in a dock; it could also be stand-alone, a la an external hard drive. A reasonable length of highest-quality cable, and an enclosure with good cooling and its own power supply, could work at a potentially lower price point.

I am quite surprised that no company has implemented this idea yet. It seems obvious to me that this would be a winner. Precious few people can afford a high-end gaming laptop - even fewer of the target market (the gaming demographic) can. Most buy, or have bought for them, a fairly basic rig for school. Processing power, memory, and hard drive size/speed are becoming increasingly affordable now, so the basic requirements of a potential gaming rig are already there.

Tower owners who game generally put their rigs together piece by piece, as the funds become available; laptop owners, IMO, would jump at the chance to be able to do the same.

 

 

Reply #12

I also know some of the older docking stations had card slots built in. It was older tech, though (PCI, maybe?). Most of what I see now are port replicators. I'm guessing there might be a newer version out there, but I haven't looked.

Reply #13

Potential problem: If you are using this external graphics card only at home, does that mean you need to constantly install different drivers depending on if the external card is currently plugged in or not? In that case, wouldn't it be easier to come up with an OS that has every driver built in so that you could simply carry around an external hard drive and boot from that at any desktop/laptop you come across?

Reply #14

Quoting ToxDrawace, reply 13
Potential problem: If you are using this external graphics card only at home, does that mean you need to constantly install different drivers depending on if the external card is currently plugged in or not? In that case, wouldn't it be easier to come up with an OS that has every driver built in so that you could simply carry around an external hard drive and boot from that at any desktop/laptop you come across?

 

Not with Windows XP or above; the HAL eliminates that problem. It might take a minute for it to catch up to you, but the transition is smooth. I took an OS course at community college and they gave the students hard drive bays, so no matter what system you were on, it was always your files and your configuration.

Reply #16

Quoting USSENTERNCC1701E, reply 14

Quoting ToxDrawace, reply 13
Potential problem: If you are using this external graphics card only at home, does that mean you need to constantly install different drivers depending on if the external card is currently plugged in or not? In that case, wouldn't it be easier to come up with an OS that has every driver built in so that you could simply carry around an external hard drive and boot from that at any desktop/laptop you come across?


 

Not with Windows XP or above; the HAL eliminates that problem. It might take a minute for it to catch up to you, but the transition is smooth. I took an OS course at community college and they gave the students hard drive bays, so no matter what system you were on, it was always your files and your configuration.

 

That's odd, because I have XP, and I've cloned my hard drive to an external as a backup. I was also hoping that I could boot from it using my mom's computer when I came back home, but all it did was start to boot up, get to the XP loading screen, and then restart.

Reply #17

I suppose there might be an issue with heat? At least with the high-end cards, it might be quite dangerous to have them out where someone could touch them, so they would need a big clunky box to house them.

No, you're right, it's a good idea and they should do it. There are a lot of fairly pricey laptops out there with stupid Intel integrated barely-runs-Vista graphics, yet pretty decent CPUs and plenty of RAM. Even the cheapest discrete GPU could vastly improve their performance.

Reply #18

Quoting ToxDrawace, reply 16


That's odd, because I have XP, and I've cloned my hard drive to an external as a backup. I was also hoping that I could boot from it using my mom's computer when I came back home, but all it did was start to boot up, get to the XP loading screen, and then restart.

Did you clone the hard drive while running Windows from that hard drive?  XP is a bit pickier. It might also be because you were trying to boot from a USB hard drive: when Windows starts to boot, the original install expects to find its files on a primary IDE hard drive, but when it looks there the files aren't what it expects. When I was in the class, the OS was installed to the drive while it was in the bay; if you did a clean install to the USB drive, that should work.

Reply #19

As others have said, you can get USB 2.0 video devices.  The problem is that they tend to be low-power devices that have trouble doing basic video overlay at their maximum resolutions; there are some that are more feature-complete.  They also don't tend to have any 3D capabilities.

So they exist, but are a bit naff when it's all said and done.  A few pundits think USB 3.0 might give enough bandwidth to allow for 3D acceleration, but who knows.

Reply #20

Anything that can function within the bandwidth limitations of a USB connection isn't going to be a whole lot faster than what you already have.  Plus it will need power, and it will have to be designed with ground loops in mind, since you aren't likely to be able to draw enough power from any connector coming from the laptop.

 

These things can be designed around, but neither is a trivial task with GPUs drawing so much power and unknown feedback circuitry on the laptop side (you don't want a ground loop to fry your laptop while you're trying to play a game).

 

These are just things off the top of my head, and I have never designed a video card.  I am sure that those have additional challenges that also preclude this from being a common thing.  I know that I have seen these external video cards before, so they are available if you really need one.

 

Here is one, but it doesn't include 3D acceleration: http://www.everythingusb.com/iogear_usb_2.0_external_video_card_12787.html

 

Reply #21

Quoting CobraA1, reply 10
Bandwidth.

A 16x PCIe slot is capable of 4 GB/s in each direction. Your average USB 2.0 port is capable of 60 MB/s. Gigabit Ethernet is capable of 125 MB/s (theoretical). eSATA is capable of 375 MB/s (theoretical). None come close to PCIe's max speed.

NOTE: GB/s = gigabytes per second. Multiply by 8 to get gigabits per second.

For best operation, a video card needs the full bandwidth of the bus.

Well, PCI Express 2.0 has double the bandwidth, which means 8 GB/s, although very few cards (none as of today) make use of the full bandwidth, even though the internal (on-card) memory bus is usually faster. (There are quite a few bottlenecks along the way, so the graphics card will not be able to fully utilize all the bandwidth; these differ depending on the chipset, chipset design, processor, and so forth.) There are a lot of benchmarks that show how much bandwidth certain cards use, although I don't have any direct links, unfortunately. (Will update when I find them, though.)
But yes, "the more the merrier" when it comes to bandwidth, even if you don't need 100% of it.

Anyway, back On Topic:
nVidia Tesla D870 / nVidia QuadroPlex 2200 D2 - 'nuff said ^^

Yes, it requires a PCIe host card, but it's all external GPU power :) And quite a lot of it, too... Not too mobile, but hey ^^

Reply #22

USB/Firewire are too slow for the positively obscene amount of bandwidth that a modern video card uses.

Even when USB 3.0 is out (it will be much, much faster), I doubt there would be a significant improvement over the latest Intel crappy integrated graphics chips.

For just one example, remember that a modern video card has at least 320 MB of RAM, but more likely 512/768/1024 MB.  In the near future, 2 GB cards will be out (I mean cards that aren't X2).  Well, that's 1 GB of information that needs to be read and written almost instantaneously if a game is going to be playable.

You could use system RAM instead, but you'd reach a point where the improvement (if any) would not be significant over even an Intel GMA 950 or so, which is the minimum standard for onboard graphics.

I suppose you could rig up some sort of "GPU port" on the back of a computer which has the necessary interface speed, but then you run into another problem: heat.  Either your external GPU is going to be well-protected and sort of attractive, or it's going to have copper sticking out all over the place... and you just know some asshat will use it as a stand for his bottle of butane lighter fluid or something, and then nVidia would be no more.

Next, you've got the problem of the end user.  People are DUMB.  Really, people are massive idiots.  And presumably, if these video cards are external, the idea is that they will be user-installable.  Someone will accidentally bash it, or push it up against a wall too hard, or use it as a can opener... it'll get busted, and there's no way a consumer will admit to being dumb enough to have been responsible for damaging their $400 "FATAL1TY X-TREME ASSASSIN Z8000 PRO PREMIUM TURBO SUPERCLOCKED VISTA-READY EDITION" - it will have stopped working all on its own.

The fewer user-serviceable parts there are, the better.  Damn, people are stupid.

In short, it's technically infeasible and without any practical value even if it were possible.
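To put the "read and written almost instantaneously" point in numbers, here is a rough estimate of how long moving 1 GB of texture data would take over each link, using the peak theoretical rates quoted earlier in the thread (real-world times would be worse):

```python
# Peak theoretical rates in MB/s, from the figures earlier in the thread.
rates_mb_s = {
    "USB 2.0": 60,
    "Gigabit Ethernet": 125,
    "eSATA": 375,
    "PCIe 1.x x16": 4000,
    "PCIe 2.0 x16": 8000,
}

payload_mb = 1024  # 1 GB of texture/model data

for link, rate in rates_mb_s.items():
    # Seconds to move the whole payload across the link once.
    print(f"{link}: {payload_mb / rate:.2f} s")
```

Over USB 2.0 that 1 GB takes around 17 seconds at best, versus a fraction of a second over a x16 slot - which is the whole problem in one line.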

Reply #23

lol, you seem to have experience with tech support work :grin: I do, and it's just like that - things "just go broke" for no apparent reason ^^

However, you don't need to write all the graphics memory instantly. If the application/game has any optimization at all (GTA IV is a great example of how not to do this /end bash), much of the data that ends up in the graphics card's memory (usually GDDR, up to GDDR5 on today's cards) is buffered by the application, meaning it preloads much of the data into memory before it's actually needed. This, in addition to the GPU <-> GDDR link being much faster (141.7 GB/s on my GTX 280, for example) than the GPU <-> PCIe link (8 GB/s on PCIe 2.0 x16, just to compare), makes buffering quite essential.

So the link is more important than where the graphics memory is. Actually, it's better for the GPU to handle it, since the data doesn't have to cross the PCIe lane, which would create another bottleneck...
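The preloading idea above can be sketched as a simple prefetch loop - the names here (`stream_textures`, `upload`, the budget figure) are hypothetical illustrations, not any real engine's API:

```python
from collections import deque

def stream_textures(needed_soon, in_vram, upload, budget_mb=64):
    """Upload textures before they're needed, within a per-frame budget.

    needed_soon: ordered list of (texture_id, size_mb) pairs the game
    predicts it will draw in upcoming frames.
    in_vram: set of texture ids already resident on the card.
    upload: callback that pushes one texture across the PCIe link.
    """
    # Only queue textures not already resident in graphics memory.
    queue = deque(t for t in needed_soon if t[0] not in in_vram)
    used = 0
    while queue and used + queue[0][1] <= budget_mb:
        tex_id, size = queue.popleft()
        upload(tex_id)       # crosses the PCIe link once, ahead of time
        in_vram.add(tex_id)
        used += size
    return used              # MB actually transferred this frame
```

The point of the budget is that the slow PCIe crossing happens a little at a time, frames before the data is drawn, so the much faster GPU <-> GDDR link serves the actual rendering.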

Reply #24

Quoting Caydr, reply 22
For just one example, remember that a modern video card has at least 320 mb of RAM, but more likely 512/768/1024 mb.  In the near future, 2gb cards will be out (I mean cards that aren't X2).  Well that's 1 gig of information that needs to be read and written to almost instantaenously if a game will be playable.

$400 "FATAL1TY X-TREME ASSASSIN Z8000 PRO PREMIUM TURBO SUPERCLOCKED VISTA-READY EDITION", it will have stopped working all on its own.

Looking at some of the tech demos and listening to the talks given by people at the EVE Online Fanfest, it seems that you don't shove 1 GB of textures into the card at once - you'll choke the bus/processor/something.  The RAM is for textures/models/etc., but also for processing: running micro-applications purely on the GPU that figure out shadows, do real-time lighting calculations, and such.  At least that's the impression I'm getting from nVidia's CUDA technology and whatever ATI's version of it is/will be.

Curiosity has the better of me. I keep seeing all this, well... overpriced tat with FATAL1TY written on it.  What the hell is all that about, and why the hell can't these people spell better than me?