NVIDIA just launched the GeForce MX450 – a new MX-class GPU for laptops, with GDDR6 memory and a PCIe 4.0 interface!
Find out what else is new and what it offers over the GeForce MX300 and GeForce MX200 series GPUs!
NVIDIA GeForce MX450 : Finally Time For Turing!
The NVIDIA GeForce MX300 and MX200 series of laptop graphics are based on the old Pascal architecture from 2016, and are built on the 14 nm process technology.
This allows them to deliver decent esports gaming performance at just 25 watts, but they are certainly not as fast as the NVIDIA GeForce GTX 1050.
In addition, with the exception of the MX350, they all lack the NVENC hardware encoder and NVDEC hardware decoder.
With the upcoming 11th Gen Intel Core processors (Tiger Lake) threatening to pose a serious challenge with their Xe graphics, NVIDIA needed to up the ante.
And so, they launched their new GeForce MX450 graphics, based on the newer Turing microarchitecture!
NVIDIA GeForce MX450 : New MX-Class Laptop Graphics
The NVIDIA GeForce MX450 was released quietly, without much fanfare, even though it is the first NVIDIA GPU to officially support PCI Express 4.0.
On top of that, it is also the first MX-class GPU to support faster GDDR6 memory, and feature the newer Turing microarchitecture.
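To put the PCI Express 4.0 jump in perspective, here is a minimal Python sketch of the theoretical per-direction link bandwidth. It assumes an x16 link and uses the transfer rates and 128b/130b encoding defined in the PCIe specification, not figures from this article:

```python
def pcie_bandwidth_gbps(transfer_rate_gt: float, lanes: int = 16) -> float:
    """Theoretical per-direction PCIe bandwidth in GB/s.

    PCIe 3.0 and 4.0 both use 128b/130b encoding, so only
    128 of every 130 transferred bits carry payload data.
    """
    payload_fraction = 128 / 130
    # GT/s x payload fraction = usable Gbit/s per lane; divide by 8 for GB/s
    per_lane_gbs = transfer_rate_gt * payload_fraction / 8
    return per_lane_gbs * lanes

print(f"PCIe 3.0 x16 : {pcie_bandwidth_gbps(8.0):.2f} GB/s")   # ~15.75 GB/s
print(f"PCIe 4.0 x16 : {pcie_bandwidth_gbps(16.0):.2f} GB/s")  # ~31.51 GB/s
```

In short, moving to PCIe 4.0 doubles the transfer rate per lane, so an x16 link roughly doubles its usable bandwidth.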
The NVIDIA GeForce MX450 is built on the more efficient 12 nm process, and comes with 1024 CUDA cores, 64 texture units and 16 ROPs.
More importantly, the Turing TU117 GPU it uses supports the NVENC hardware encoder and NVDEC hardware decoder, allowing it to boost content creation and media playback performance with lower power consumption.
That is, if NVIDIA doesn’t actually disable them in the GeForce MX450, to avoid cannibalising its higher-end mobile GPUs.
Like the other MX-class GPUs, it has a 25 watt TDP. And while it supports GDDR6 memory, manufacturers can opt to use GDDR5 memory.
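Given the 1024 CUDA cores above, you can estimate peak FP32 throughput with the usual cores × clock × 2 rule of thumb (each CUDA core can retire one fused multiply-add, i.e. 2 FLOPs, per cycle). A quick sketch, using our estimated 1575 MHz boost clock, which is an educated guess and not an NVIDIA-confirmed figure:

```python
def peak_fp32_tflops(cuda_cores: int, boost_clock_mhz: float) -> float:
    """Peak FP32 throughput in TFLOPS: each CUDA core retires
    one fused multiply-add (2 FLOPs) per clock cycle."""
    flops = cuda_cores * boost_clock_mhz * 1e6 * 2
    return flops / 1e12

# GeForce MX450 : 1024 CUDA cores at an estimated 1575 MHz boost clock
print(f"{peak_fp32_tflops(1024, 1575):.2f} TFLOPS")  # ~3.23 TFLOPS
```

That works out to roughly 3.2 TFLOPS of peak FP32 compute, though real-world performance also depends on memory bandwidth and sustained clocks.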
NVIDIA GeForce MX450 : Specifications
Now, NVIDIA has not revealed much about the GeForce MX450, so here are our best educated estimates of its key specifications, compared to its predecessors :
| Specifications | GeForce MX450 | GeForce MX350 | GeForce MX330 | GeForce MX250 | GeForce MX230 |
|---|---|---|---|---|---|
| Transistors | 4.7 Billion | 3.3 Billion | 1.8 Billion | 1.8 Billion | 1.8 Billion |
| Fab Process | 12 nm | 14 nm | 14 nm | 14 nm | 14 nm |
| Die Size | 200 mm² | 132 mm² | 74 mm² | 74 mm² | 74 mm² |
| Base Clock | 1395 MHz | 1354 MHz | 1531 MHz | 1519 MHz | 1519 MHz |
| Boost Clock | 1575 MHz | 1468 MHz | 1594 MHz | 1582 MHz | 1582 MHz |
| Memory Size | 2 GB | 2 GB | 2 GB | 2 GB | 2 GB |
| Memory Speed | 2500 MHz | 1750 MHz | 1750 MHz | 1750 MHz | 1500 MHz |
| Memory Bandwidth | 80 GB/s | 56 GB/s | 56 GB/s | 56 GB/s | 48 GB/s |
| Interface | PCIe 4.0 x16 | PCIe 3.0 x16 | PCIe 3.0 x16 | PCIe 3.0 x16 | PCIe 3.0 x16 |
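The memory bandwidth figures (80 GB/s, 56 GB/s, 48 GB/s) follow directly from memory clock × data rate × bus width. A minimal sketch, assuming the 64-bit memory bus typical of MX-class GPUs and the quad-pumped data rates implied by these numbers (the bus width and data rate are our assumptions, not NVIDIA-stated figures):

```python
def memory_bandwidth_gbs(mem_clock_mhz: float,
                         bus_width_bits: int = 64,
                         transfers_per_clock: int = 4) -> float:
    """Memory bandwidth in GB/s: clock x transfers per clock x bus width."""
    bits_per_second = mem_clock_mhz * 1e6 * transfers_per_clock * bus_width_bits
    return bits_per_second / 8 / 1e9

print(memory_bandwidth_gbs(2500))  # GDDR6 @ 2500 MHz -> 80.0 GB/s
print(memory_bandwidth_gbs(1750))  # GDDR5 @ 1750 MHz -> 56.0 GB/s
print(memory_bandwidth_gbs(1500))  # GDDR5 @ 1500 MHz -> 48.0 GB/s
```

This is also why opting for slower GDDR5 memory, as NVIDIA allows manufacturers to do, would cut the GeForce MX450's memory bandwidth considerably.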