Today, most graphics processors sold are integrated, and at the same time most systems sold for gaming are laptops. You might therefore expect external graphics cards to be everywhere, yet they are barely visible on the market and seem to have all but disappeared. Why did this happen? Let us explain.
The biggest issue when buying a gaming laptop is precisely its graphics capability: everyone knows that, due to power consumption and temperature limits, it is impossible to fit the same beast of a card found in tower computers. You might think there would be demand for external graphics cards as a result, but that is not the case; they remain little more than an anecdote.
Why are external graphics cards not more widespread?
The first issue with an external graphics card is the high power draw involved. Since some models consume several hundred watts, a full-sized power supply is required, not a small one. This also makes it impossible to turn these desktop-class cards into something portable that can be used anywhere. To make them fully portable, a battery would have to be built into the enclosure, increasing the overall cost and reducing performance.
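To put that power draw in perspective, here is a minimal back-of-the-envelope sketch in Python; every figure in it is an illustrative assumption, not a measured value:

```python
# Rough sketch: how long could a laptop-sized battery feed a desktop-class GPU?
# All figures are illustrative assumptions, not measurements.

BATTERY_WH = 99            # roughly the largest battery allowed in carry-on luggage (Wh)
GPU_WATTS = 300            # assumed board power of a high-end desktop GPU
REST_OF_SYSTEM_WATTS = 60  # assumed draw of CPU, screen, and storage

total_draw = GPU_WATTS + REST_OF_SYSTEM_WATTS     # watts
runtime_minutes = BATTERY_WH / total_draw * 60    # Wh / W = hours, then minutes

print(f"Estimated runtime: {runtime_minutes:.1f} minutes")  # ~16.5 minutes
```

Even under these generous assumptions, the battery would last barely a quarter of an hour, which is why a truly portable external card makes little sense.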
If you are currently using a laptop and want extra graphics processing power for a specific application, you may be interested to know that a few years ago the graphics cards in these computers were not soldered to the motherboard, but came as a module type called MXM. This meant a laptop's graphics card could be replaced with a more powerful one if needed. The problem was that these modules were so expensive and rare that it was usually more worthwhile to buy a tower with an equivalent GPU instead.
In the end, the best option, not only for gaming but also for professional 3D rendering, is to buy a desktop computer free of a laptop's limitations. But the main reason external graphics cards have never been widely adopted by users is technical.
The problem is the interface
Our current graphics cards connect through PCI Express, an interface that reaches transfer rates of tens of gigabytes per second by running many lanes side by side, which requires a large number of pins over a very short distance. One reason peripheral devices moved to narrow serial interfaces is precisely that wide connectors become unmanageable; older readers will remember the size, width, and length of the classic parallel printer connector, colloquially known as LPT1.
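As a rough illustration of where those tens of gigabytes per second come from, here is a short Python sketch that derives the per-direction bandwidth of a PCIe link from its published transfer rate, line encoding, and lane count:

```python
# Per-direction PCIe bandwidth from transfer rate, line encoding, and lane count.
# Transfer rates and encodings are the published per-generation figures.

GENERATIONS = {
    # generation: (GT/s per lane, payload bits, total bits on the wire)
    "1.0": (2.5, 8, 10),     # 8b/10b encoding
    "2.0": (5.0, 8, 10),     # 8b/10b encoding
    "3.0": (8.0, 128, 130),  # 128b/130b encoding
    "4.0": (16.0, 128, 130),
    "5.0": (32.0, 128, 130),
}

def bandwidth_gb_s(gen: str, lanes: int) -> float:
    """Usable bandwidth in GB/s for one direction of a PCIe link."""
    gt_s, payload, total = GENERATIONS[gen]
    return gt_s * (payload / total) * lanes / 8  # Gbit/s -> GB/s

for gen in GENERATIONS:
    print(f"PCIe {gen} x16: {bandwidth_gb_s(gen, 16):.1f} GB/s")
# PCIe 4.0 x16 works out to about 31.5 GB/s, the "tens of GB/s" figure above
```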
External PCI Express interfaces do exist, but as the cable gets longer, the resistance of the copper wiring increases power consumption, which is why such links tend to be optical rather than copper. The very short connection used for graphics cards is no accident: the longer the link, the more energy is needed to transmit the data, and the higher the electricity cost.
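A back-of-the-envelope calculation makes the trend visible. Real high-speed links lose signal mainly to skin effect and dielectric loss rather than plain DC resistance, but even the DC figure, sketched below with an assumed conductor cross-section, already grows linearly with distance:

```python
# DC resistance of a copper conductor versus length: R = rho * L / A.
# The cross-section is an assumption; real losses at PCIe speeds are dominated
# by skin effect and dielectric loss, but the linear growth with length holds.

RHO_COPPER = 1.68e-8  # resistivity of copper, ohm * metre
AREA = 0.05e-6        # assumed conductor cross-section, square metres

def resistance_ohms(length_m: float) -> float:
    return RHO_COPPER * length_m / AREA

for length in (0.05, 0.5, 2.0):  # on-board trace vs. external cable lengths
    print(f"{length:4.2f} m -> {resistance_ohms(length) * 1000:6.1f} milliohms")
# Resistance, and with it transmission loss, scales linearly with distance,
# which is why long external PCIe runs switch to optical links.
```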
You only have to look at the connector size: even an x8 external connection occupies a significant portion of the width of the card's rear bracket. It could also be placed on the side of the card, but that would require designing a PCB or a dedicated board for this type of interface, and even then it would only solve part of the problem, namely data transfer and the power required to drive it.
Simplifying things
The reason nobody chose external graphics cards is that, with current models, you cut costs simply by plugging the card into the PC's motherboard. The slot itself supplies power, so no separate power supply or enclosure is needed. There is no need to complicate things with multiple power sources or cases, which keeps everything much simpler.
In addition, when the previous interface, AGP, proved unable to supply enough power, 6-, 8-, and later 16-pin connectors had to be introduced to feed graphics cards directly. As a historical curiosity, some models even went so far as to include an integrated PSU.
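To see the power budget those connectors created, here is a small sketch using the nominal per-connector ratings from the PCIe specifications:

```python
# Nominal power budget of a PCIe graphics card: slot plus auxiliary connectors.
# Figures are the nominal ratings from the PCIe specifications.

SLOT_WATTS = 75  # a PCIe x16 slot itself delivers up to 75 W
CONNECTOR_WATTS = {
    "6-pin": 75,
    "8-pin": 150,
    "16-pin": 600,  # 12VHPWR, maximum configuration
}

def board_power(connectors: list[str]) -> int:
    """Total nominal power available to a card: slot plus auxiliary connectors."""
    return SLOT_WATTS + sum(CONNECTOR_WATTS[c] for c in connectors)

print(board_power([]))                   # 75 W  -- slot only
print(board_power(["6-pin"]))            # 150 W
print(board_power(["8-pin", "8-pin"]))   # 375 W -- a typical high-end card
print(board_power(["16-pin"]))           # 675 W -- current flagship territory
```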