This card does Metro Exodus at 4K, 67 FPS. The end.
Just kidding, but that could be the review; it really tells you everything you need to know.
We have waited for it, hoped for it, and our dreams have now come true. Enter the RTX3080. Yet with Nvidia's very odd cooler design on the reference card, potential buyers worried whether third-party manufacturers would be able to cool the card. MSI has not disappointed.
Before the review itself, I would like to point out that Gamereactor, like most European media, got the card very late, and therefore a lot of features haven't been fully tested, and overclocking wasn't a priority either; we simply focused on gathering as much raw data as possible. Therefore, there will be follow-up articles, and when press samples are more readily available, we hope to have another go at the Trio X card, because boy does it kick ass. This also means the tech part isn't as deep as one might have hoped.
Before we go any further, however, let's take a look at this technical marvel more broadly.
The RTX3000 series is based on an architecture called Ampere. It uses the second generation of Nvidia's Ray Tracing technology, although that has become a smaller part of it as the platform has grown.
The Ray Tracing cores have been upgraded to the second generation of that tech, and they are responsible for calculating lighting effects amongst other things. The precision math/deep learning Tensor cores have been upgraded as well, and are now third generation, and the number of calculation units, or streaming multiprocessors, has exploded, as well as being upgraded.
The RTX3080 has 8704 CUDA cores and 10GB of GDDR6X memory operating at 19 Gbps. The boost clock is 1755 MHz, surpassing Nvidia's own card, because that is what MSI does on a boring Tuesday afternoon. The memory bus is 320-bit, there are three DisplayPort 1.4a outputs, and there's an HDMI 2.1 port, as you most likely want to use this card for 4K gaming. Power draw is pretty huge though - 340 watts - so the card uses three 8-pin connectors. You had better pray you didn't skimp on your power supply. It's also worth noting that the card is pretty big and weighs in at 1565 grams.
This does two things. First of all, the raw computing power on offer is a massive leap forward. Second, the card is able to use deep learning and AI computing to a much higher degree than previous iterations. This means that using deep learning to reconstruct frames, rather than rendering every pixel natively, now provides even better performance. It's called DLSS, and if your game supports it, you had better turn it on. It has also enabled Nvidia to come up with a couple of new things that we will have to cover later, as the press samples are limited, and so is the time they are available for testing.
Nvidia has a new system called Reflex. Precisely how it works hasn't been detailed, but it's claimed to redistribute the workload of the machine, ensuring a faster end-to-end response. This is, of course, only possible if you've got an Nvidia G-Sync monitor. MSI's Dragon Center software is also fairly easy to use, and while the red colour scheme is a tad aggressive, the functionality is good.
RTX cards have always had a dedicated section that handles overlays, screen capture and streaming without interfering with the actual computing. This has now turned into Nvidia Broadcast, which enables noise reduction and better live streaming, and can even add an artificial background, making streaming easier and higher quality. Last but not least, the card supports DX12 Ultimate, with even better Ray Tracing and variable rate shading. Last-gen DX12 games were a mixed experience, but when it worked, it was impressive.
MSI has chosen to use their own tried and tested Frozr2 system, and should actually get some credit for it. Despite being so much larger and more power-hungry than the last generation, it kept fairly quiet at 37dB during testing, with small 38dB peaks once in a while. GPU temperature was a moderate 58 degrees, although prolonged sessions forced it up to 61 degrees; while it was rare, I suspect that it was more about the temperature of the room rising over time, as the heat output from the card is massive. You can't feel the heat blast directly, but pets and small children should stay clear of the back of your computer.
There is, of course, synchronised lighting, because why not, and Trio in the name refers to the three massive fans you can see in the pictures. These are fourth-gen Torx fans, each with double ball bearings, cooling an improved heatsink design with something called "airflow control", which is marketing speak for MSI having spent a lot of time adding small changes to the heatsink to avoid turbulence. The relative silence of the card operating at full power suggests that they have succeeded. MSI has connected all this to "core pipes", meaning metal heat pipes milled with a high degree of precision to ensure both a large surface area and maximum contact area, with the heat being spread out across the entire length of the pipe instead of concentrating in specific areas.
The backplate is no longer metal but graphene, in an attempt to limit the weight, and while we did not have time to disassemble the card to confirm it, all evidence points towards MSI using a healthy array of thermal pads where needed. They have chosen to add additional fuses, as well as using "enhanced PCB material" with thicker copper layers for more stability and better conductivity - so says the marketing material. There's also a metal support bracket included, and at least they put some rubber feet on it so it doesn't scratch the card.
The pricing here in the UK is around £830, and while that's more than the Founders Edition, it's simply because Nvidia has dictated it be so. But for a card that makes your RTX2080 look like the integrated graphics chip on your CPU - it's, well, not cheap, but it is still really good value.
Testing was done on both an Intel 10900K with an MSI Z490 MEG motherboard, and an AMD Ryzen 9 3900X system with an X570 motherboard. Both systems used 16GB of 3200 MHz CL16 RAM. Our initial fear was that the card's PCIe 4.0 specification would cause a bottleneck, as the Intel system only supports PCIe 3.0. That turned out not to be the case.
Both systems ran the same benchmarks with the RTX3080 Gaming X Trio and the RTX2080 Gaming X Trio, and where results varied between runs, the lowest scores were used. While we as a gaming magazine focus on the games, synthetic benchmarks such as Heaven and 3D Benchmark were also used. Programs that test the entire system or collective computing power were omitted, however.
All scores are listed as RTX2080 score / RTX3080 score / difference in %. Game measurements are in FPS:
3D Benchmark - Synthetic Score
Heaven Benchmark - Synthetic Score
Total War: Warhammer II
Red Dead Redemption 2
Assassin's Creed Odyssey
The Division 2
Middle Earth: Shadow of War
Shadow of the Tomb Raider
4K Score - FPS
Red Dead Redemption 2 - 111
Doom Eternal - AMD: 115-182 / Intel: 171-212
Metro Exodus - No DLSS: 49. With DLSS: 67. We finally did it: 60+ FPS in Metro Exodus at 4K.
Hitman 2 - 82
Battlefield V - AMD: 84 both with and without DLSS. Intel: 121 native, 92 with DLSS. Something seems wrong here. We'll re-run this one again later.
Control - Intel 65 native, 105 with DLSS. AMD 70 native, 201 DLSS. The systems did not seem to base render at the same resolution, and the Intel system rendered at 1440p with DLSS, while the AMD system rendered at 1171p.
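For readers who want to compare entries themselves, the difference figure in the format above is simply the percentage gain of the RTX3080 score over the RTX2080 score. A minimal sketch (the function name is ours, not part of any benchmark tool), using the Metro Exodus 4K numbers from the results above:

```python
def uplift_percent(old_score: float, new_score: float) -> float:
    """Percentage gain of the new score over the old one,
    as used in the 'RTX2080 / RTX3080 / difference in %' format."""
    return (new_score - old_score) / old_score * 100

# Metro Exodus at 4K: 49 FPS native vs 67 FPS with DLSS enabled
print(round(uplift_percent(49, 67), 1))  # → 36.7
```

The same calculation applies to the synthetic scores; only the units differ.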
Overall, MSI has managed to take this 340-watt monster and turn it into a cool and quiet kitten that purrs; it's hard not to be impressed by the lack of noise and the low temperatures considering the massive amount of heat this much raw computing power creates.
4K gaming is finally a reality for those who crave it. But for everyone else, it's still very much worth considering getting a new graphics card as the leap from the previous 20 series is massive. And that's putting it mildly; I can't remember a generational leap this massive before.
The age of true 4K PC gaming has begun.