Most graphics processors sold today are integrated, and at the same time a large share of gaming systems are laptops. You would therefore expect external graphics cards to be everywhere, yet there are very few of them, and the ones that did exist seem to have all but disappeared. Why did this happen? Let us explain.
The main weakness of a gaming laptop is precisely its graphics power: because of power-consumption and temperature limits, you simply cannot fit the same kind of beast found inside a tower PC. You might expect strong demand for external graphics cards as a result, but that demand never materialized, and they remain a niche curiosity.
External Graphics Card
Why are there so few external graphics cards? The first issue is the high power draw involved. Since some models consume hundreds of watts, an external enclosure is not exactly small and needs its own built-in power supply to run. That same power draw is what makes it impossible to turn desktop cards into portable devices: to make one truly portable you would have to add batteries, which raises the overall cost and forces performance down.
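To put the battery problem in perspective, here is a rough back-of-the-envelope sketch. The 300 W board power and the 99 Wh battery are illustrative assumptions (99 Wh is roughly the largest battery commonly fitted to laptops), not measurements of any particular product.

```python
# Back-of-the-envelope sketch: how long could a laptop-sized battery
# feed a desktop-class GPU on its own? Figures are illustrative assumptions.

GPU_POWER_W = 300   # assumed board power of a high-end desktop GPU
BATTERY_WH = 99     # roughly the largest battery typically found in a laptop

runtime_hours = BATTERY_WH / GPU_POWER_W
print(f"Runtime on battery alone: {runtime_hours:.2f} h "
      f"(~{runtime_hours * 60:.0f} minutes)")
# -> about 0.33 h, i.e. roughly 20 minutes, before counting the CPU,
#    the screen or any conversion losses
```

Even under these generous assumptions, a battery-powered external GPU would last around twenty minutes, which is why portability and desktop-class graphics do not mix.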
If you use a laptop and want extra graphics power for specific applications, consider that a few years ago laptop graphics cards were not soldered to the motherboard; they came on a removable module format called MXM. These were still laptop-class GPUs, but you could swap them for more powerful ones when needed. The problem? They were expensive and hard to find, to the point that buying an equivalent desktop GPU was usually the better deal.
Ultimately, the best option, not only for gaming but also for professional 3D rendering, is to buy a desktop computer and avoid a laptop's limitations altogether. But the reason users never embraced external graphics cards is, above all, technical.
The problem is the interface
Currently, our graphics cards connect through PCI Express, an interface that reaches transfer rates of several tens of gigabytes per second. It achieves this, however, by bundling many serial lanes into a wide connector with a large number of pins over a very short distance. One of the reasons external peripherals moved to narrow serial interfaces is precisely that wide connectors are unwieldy: older readers will remember the size and bulk of the traditional parallel printer connector, the port colloquially known as LPT1.
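As a quick sanity check on that "tens of gigabytes per second" figure, this small sketch computes the theoretical bandwidth of a few PCIe generations and link widths from their nominal per-lane transfer rates (real-world throughput is always somewhat lower):

```python
# Theoretical PCI Express bandwidth per generation and link width.
# Per-lane rates are the published nominal figures; actual throughput is lower.

GT_PER_S = {"PCIe 3.0": 8, "PCIe 4.0": 16, "PCIe 5.0": 32}  # gigatransfers/s per lane
ENCODING_EFFICIENCY = 128 / 130                              # 128b/130b line coding

for gen, gt in GT_PER_S.items():
    for lanes in (4, 8, 16):
        gb_per_s = gt * ENCODING_EFFICIENCY / 8 * lanes      # bits -> bytes
        print(f"{gen} x{lanes}: ~{gb_per_s:.1f} GB/s")

# A PCIe 4.0 x16 slot lands around 31.5 GB/s, and PCIe 5.0 x16 around 63 GB/s,
# which is where the "tens of gigabytes per second" in the text comes from.
```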
Optical PCI Express
In reality, external PCI Express interfaces do exist, but they are based on optical links rather than copper wires, to avoid the power that copper wastes to resistance as the cable gets longer. The very short connection between a graphics card and its slot is no accident: the longer the link, the more energy it takes to push data across it, and power consumption climbs quickly.
However, just look at the size of the connector, in this case an x8 link, which already takes up a considerable part of the width of the card's rear bracket. It could also be placed on the side of the card, but that would require a PCB or dedicated board designed around this type of interface, and it would only solve part of the problem: moving the data is one thing, and delivering the power an external graphics card needs is quite another.
Keeping things simple
The reason nobody opted for external graphics cards is that, with current models, you can simply slot the card into the PC motherboard, which keeps costs down: the card needs no power supply or case of its own. That means there is no need to complicate the setup with a second PSU or an extra enclosure, and everything stays much simpler.
Furthermore, with the earlier AGP interface it soon became clear that slot power alone was not enough, which led to the adoption of 6-, 8- and, later, 16-pin connectors to feed graphics cards directly. As a historical curiosity, it is worth noting that a handful of models even shipped with their own integrated power supply.
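For context on why those supplemental connectors became necessary, the sketch below adds up the commonly cited power limits of the slot and the auxiliary connectors; the example card configuration is hypothetical.

```python
# Power budget available to a graphics card from its connectors.
# Values are the commonly cited maximums from the PCIe/ATX specifications.

POWER_SOURCES_W = {
    "PCIe x16 slot": 75,
    "6-pin connector": 75,
    "8-pin connector": 150,
    "16-pin 12VHPWR connector": 600,
}

# Hypothetical example: a card fed by the slot plus two 8-pin connectors.
config = ["PCIe x16 slot", "8-pin connector", "8-pin connector"]
total = sum(POWER_SOURCES_W[c] for c in config)
print(f"Maximum deliverable power: {total} W")  # -> 375 W
```

With the slot capped at 75 W, any card drawing hundreds of watts simply has to pull the rest through these auxiliary connectors, which is exactly the problem an external enclosure would have to solve with its own PSU.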