The Microsoft Surface Laptop 3 boasts a number of improvements, including new AMD Ryzen processors and an 11.5-hour battery life!
Here is EVERYTHING you need to know about the new Microsoft Surface Laptop 3 – key features, specifications, price and availability!
Microsoft Surface Laptop 3
The Surface Laptop 3 is Microsoft’s response to the Apple MacBook Pro. Like its rival, it promises a powerful computing experience in a thin and light form factor, with a phenomenal battery life.
Two Form Factors
The Microsoft Surface Laptop 3 comes in 13.5-inch and 15-inch form factors, with weights of just under 1.3 kg and just over 1.5 kg respectively.
New AMD + Intel Processors
The 13.5-inch Surface Laptop 3 comes with the latest 10th Gen Intel Core processors, while the 15-inch Surface Laptop 3 is powered by the AMD Ryzen Microsoft Surface Edition processors!
With USB-C Port
The Microsoft Surface Laptop 3 now comes with both a USB Type A port and a USB Type C port. This gives you more flexibility in connecting it to external displays, docking stations or USB accessories.
Larger Trackpad + Two Palm Rest Options
The glass trackpad is now 20% larger, and you can opt for a metal palm rest, or have it covered in Alcantara fabric.
Microsoft Surface Laptop 3 Specifications
To make it easier for you to compare both the 13.5-inch and 15-inch models, here is a table comparing their key specifications :
Camera
Windows Hello face authentication camera
720p HD front camera (f/2.0)
Audio
Dual far-field Studio microphones
Omnisonic speakers with Dolby Audio Premium
Connectivity Options
1 x USB Type C
1 x USB Type A
1 x Surface Connect Port
1 x 3.5 mm headphone jack
Battery Life
Up to 11.5 hours
Dimensions
13.5-inch : 308 mm wide x 223 mm deep x 14.5 mm thick
15-inch : 339.5 mm wide x 244 mm deep x 14.69 mm thick
Weight
13.5-inch : 1.288 kg (Matte Black) / 1.265 kg (Platinum)
15-inch : 1.542 kg
Colour Options
13.5-inch : Matte Black (Metal Palm Rest) / Platinum (Alcantara Palm Rest)
15-inch : Matte Black (Metal Palm Rest) / Platinum (Metal Palm Rest)
Microsoft Surface Laptop 3 Price + Availability
The Microsoft Surface Laptop 3 is available for pre-order now, with general availability beginning 9 December 2019, and retail sales starting in January 2020.
Here are the prices for the first models and accessories :
If you like our work, you can help support our work by visiting our sponsors, participating in the Tech ARP Forums, or even donating to our fund. Any help you can render is greatly appreciated!
The Acer ConceptD 5 | ConceptD 5 Pro is a new laptop designed for the discerning graphic designer, offering the power of a desktop computer, in a svelte and light chassis.
Here is our quick primer on Pro and non-Pro variants of this designer laptop!
Acer ConceptD 5 | ConceptD 5 Pro For Graphic Designers!
The Acer ConceptD 5 and ConceptD 5 Pro are designed to be light and thin laptops, weighing just 1.5 kg (3.3 lbs) and measuring just 16.9 mm (0.67 inches) thick. Yet, they pack a lot of professional-grade hardware.
Flip open the cover and a PANTONE-validated IPS display with a 4K UHD resolution greets you. To ensure colour accuracy for design work, it covers 100% of the Adobe RGB colour gamut, with Delta E < 2.
They are both powered by the 9th Gen Intel Core i7 mobile processor, with up to 16 GB of DDR4 memory, and up to two 512 GB PCIe NVMe SSDs in RAID 0.
They will come with AMD Radeon RX Vega M GL graphics in some regions, and a choice between NVIDIA Quadro and GeForce graphics in other regions.
They come with a full range of ports, including a USB-C Gen 1 port, a DisplayPort and a dedicated USB port that supports offline charging.
There is also an embedded fingerprint reader that allows for easier login through Windows Hello.
They are both kept cool by a 4th Gen AeroBlade 3D fan. It features very thin 0.1 mm fan blades with a serrated edge, winglets along the top and bottom, as well as a curved fin along the inner portion of each blade. Acer claims this increases airflow by up to 45%!
To keep things quiet though, the AeroBlade 3D fan has a noise reduction mechanism that keeps it down to less than 40 dB – as quiet as a library!
Acer ConceptD 5 | ConceptD 5 Pro Specifications
How do the Pro and non-Pro models differ? Only in one aspect – graphics. The Pro model uses a Quadro RTX 3000 / Quadro T1000, while the non-Pro model uses a GeForce GTX 1660 Ti or Radeon RX Vega M GL.
NVIDIA spiked the AMD Radeon RX 5700 XT launch party by first teasing, and then launching the GeForce RTX 2070 SUPER.
A cut-down version of the GeForce RTX 2080, it comes with 2560 stream processors and 8 GB of GDDR6 memory, at a lower price point!
Find out how it matches up against the competition, and why we gave it our Reviewer’s Choice Award!
NVIDIA GeForce RTX 2070 SUPER Primer
Designed to extend NVIDIA’s performance lead against the new AMD Navi-based graphics cards, the NVIDIA RTX SUPER graphics cards are based on cut-down versions of existing GeForce RTX models.
The GeForce RTX 2060 survives as the cheapest RTX card you can buy, while the GeForce RTX 2080 Ti remains the most powerful RTX graphics card money can buy.
NVIDIA GeForce RTX 2070 SUPER Price + Availability
The NVIDIA GeForce RTX 2070 SUPER has a launch price that starts at $499 / ~£399 / ~€444 / ~RM 2,051. This makes it $100 more expensive than the GeForce RTX 2060 SUPER.
Here are some direct purchase links and prices of various AIB custom cards (prices accurate as of 18 August 2019) :
The NVIDIA GeForce RTX 2070 SUPER (US | UK | MY) comes in a large cardboard box that doubles as a stand. Let’s take a quick look around, and see what we find inside!
The RTX 2070 SUPER package is rather basic, but for those who are still using monitors with DVI ports, the bundled DP-to-DVI dongle will be very useful.
The NVIDIA GeForce RTX 2070 SUPER Hands On!
The NVIDIA GeForce RTX 2070 SUPER (US | UK | MY) has an NVIDIA TU104 GPU with 2560 stream processors.
This gives it 160 TMUs and 64 ROPs, delivering a peak output of 283 GT/s and 113 GP/s, thanks to its higher clock speeds.
Its 8 GB of GDDR6 memory runs at a speed of 1750 MHz, giving it a peak memory bandwidth of 448 GB/s.
Even though the TU104 GPU is fabricated on the 12 nm process technology, the card only needs 215 watts of power.
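The peak figures above follow directly from the unit counts and clock speeds. Here is a quick sketch of the maths, assuming a 1770 MHz boost clock, a 256-bit memory bus and a 14 Gbps effective GDDR6 data rate (none of which are stated above):

```python
# Peak throughput maths for the GeForce RTX 2070 SUPER.
# Assumed: 1770 MHz boost clock, 256-bit bus, 14 Gbps effective GDDR6
# data rate (8x the 1750 MHz base clock quoted above).
tmus, rops = 160, 64
boost_ghz = 1.77
mem_rate_gbps = 14
bus_width_bits = 256

texture_fillrate = tmus * boost_ghz              # GT/s
pixel_fillrate = rops * boost_ghz                # GP/s
bandwidth = mem_rate_gbps * bus_width_bits / 8   # GB/s

print(f"{texture_fillrate:.0f} GT/s, {pixel_fillrate:.0f} GP/s, {bandwidth:.0f} GB/s")
```

Under those assumptions, the numbers come out to the 283 GT/s, 113 GP/s and 448 GB/s quoted above.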
NVIDIA Turing
The NVIDIA GeForce RTX SUPER series continues to use the NVIDIA Turing microarchitecture, the first to introduce real-time ray tracing capabilities and new Tensor Cores.
It also continues to benefit from NVIDIA DLSS (Deep Learning Super-Sampling), which can be used to deliver better than TAA quality with a significantly lower performance penalty, or much better image quality – equivalent to 64X super sampling.
[adrotate group=”1″]
RTX 2070 SUPER Power Requirements
The NVIDIA GeForce RTX 2070 SUPER (US | UK | MY) is a cut-down version of the GeForce RTX 2080, so it comes as no surprise that it requires the same 215 watts of power.
Like the GeForce RTX 2080, you will need a power supply with both 6-pin and 8-pin PCI Express power cables.
RTX 2070 SUPER Display Ports
The NVIDIA GeForce RTX 2070 SUPER (US | UK | MY) has three DisplayPort 1.4 ports, each supporting up to 8K HDR resolution at 60 Hz, or 4K HDR at 120 Hz or more.
There is also a tiny USB-C VirtualLink port, which allows you to connect a VR headset. But guess what – it works like a standard USB-C port, so you can also connect any USB-C device to it!
For compatibility with older displays, there is an HDMI 2.0b port, which supports up to 4K HDR displays at up to 60 Hz, as well as a bundled DP-to-DVI dongle that lets you hook it up to older monitors with a DVI port.
NVIDIA GeForce RTX 2070 SUPER Benchmarking Notes
In this review, we will take a look at its gaming performance, in comparison to 7 other graphics cards :
F1 2019
F1 2019 is a new racing game by Codemasters, released on 28 June 2019.
We tested it on three resolutions at the Ultra High settings :
1080p : 1920 x 1080
1440p : 2560 x 1440
2160p : 3840 x 2160
1080p Gaming Resolution
At 1080p, the NVIDIA GeForce RTX 2070 SUPER (US | UK | MY) was slightly (just over 2%) faster than the GeForce GTX 1080 Ti.
Based on the 2013 movie, World War Z is a relatively new third-person shooter game, released in April 2019.
We tested it on three resolutions using the Vulkan API at the Ultra High settings :
1080p : 1920 x 1080
1440p : 2560 x 1440
2160p : 3840 x 2160
Note : This game is known to be highly optimised for AMD Radeon cards.
1080p Gaming Resolution
At 1080p, World War Z was CPU-limited even with the new AMD Ryzen 7 3700X processor. So take these results with a pinch of salt.
Even so, the GeForce RTX 2070 SUPER (US | UK | MY) was 7% faster than the RTX 2060 SUPER, and 9-10% faster than the RX 5700 and GTX 1080 Ti graphics cards.
1440p Gaming Resolution
When we kicked up the resolution, it was now just 3-4% faster than the Vega 64 and RX 5700 graphics cards.
The Division 2
Tom Clancy’s The Division 2 is a new third-person shooter game released in March 2019.
We tested it on three resolutions using the Extreme settings :
Metro Exodus
Metro Exodus is a relatively new first-person shooter game, released in February 2019.
We tested it on three resolutions using the Ultra settings :
NVIDIA GeForce RTX 2070 SUPER – Summary
Coming up with the RTX SUPER cards was easy for NVIDIA. They are basically cut-down versions of existing and faster graphics cards.
The NVIDIA GeForce RTX 2070 SUPER (US | UK | MY) is essentially a GeForce RTX 2080 with 384 CUDA Cores disabled, but higher clock speeds to compensate.
This allows it to be almost as fast as a regular RTX 2080, just a whole lot cheaper! In fact, that was precisely why it replaced the RTX 2080, which was $200 more expensive.
Sadly, we currently do not have an AMD Radeon RX 5700 XT to compare it with, only the Radeon RX 5700.
Needless to say, the GeForce RTX 2070 SUPER far outclasses the RX 5700. But until we get our hands on its true rival, the Radeon RX 5700 XT, let’s compare it against its “junior”, the RTX 2060 SUPER, in our six real-world game benchmarks :
F1 2019 : 16% faster at 1080p, 19% faster at 1440p, 21% faster at 2160p
World War Z : 7% faster at 1080p, 15% faster at 1440p, 20.5% faster at 2160p
The Division 2 : 15.5% faster at 1080p, 20.5% faster at 1440p, 24% faster at 2160p
Strange Brigade : 17% faster at 1080p, 16.5% faster at 1440p, 15% faster at 2160p
Metro Exodus : 16% faster at 1080p, 17% faster at 1440p and 2160p
AOTS : 3% faster at 1080p, 9% faster at 1440p, 15% faster at 2160p
Generally, we would say that the GeForce RTX 2070 SUPER (US | UK | MY) is roughly 12-18% faster than the RTX 2060 SUPER.
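Averaging the per-game numbers in the list above bears this out. A quick tally (figures copied straight from our summary):

```python
from statistics import mean

# Per-game speedups (%) of the RTX 2070 SUPER over the RTX 2060 SUPER,
# copied from the summary list above, as (1080p, 1440p, 2160p) tuples.
speedups = {
    "F1 2019":         (16, 19, 21),
    "World War Z":     (7, 15, 20.5),
    "The Division 2":  (15.5, 20.5, 24),
    "Strange Brigade": (17, 16.5, 15),
    "Metro Exodus":    (16, 17, 17),
    "AOTS":            (3, 9, 15),
}

averages = {}
for i, res in enumerate(("1080p", "1440p", "2160p")):
    averages[res] = mean(v[i] for v in speedups.values())
    print(f"{res}: {averages[res]:.1f}% faster on average")
```

The averages work out to roughly 12% at 1080p, 16% at 1440p and 19% at 2160p, in line with the 12-18% range quoted above.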
NVIDIA GeForce RTX 2070 SUPER Game Bundle
As with the RTX 2060 SUPER, NVIDIA is bundling TWO GAMES – Wolfenstein: Youngblood and Control – with the RTX 2070 SUPER. They are worth at least $50, so that’s a really nice deal… if you like those games.
NVIDIA GeForce RTX 2070 SUPER – Our Verdict + Award
Now, there is nothing “super” about what’s really a cut-down version of the existing GeForce RTX 2080 graphics card.
What will really sell the RTX 2070 SUPER is the fact that it offers a much better value proposition than the RTX 2080.
The NVIDIA GeForce RTX 2070 SUPER is basically a slightly slower GeForce RTX 2080 with a $200 discount.
That means you get a card that’s fast enough for 4K gaming at 60 fps or better in ⅔ of the games we tested, at what you used to pay for a GeForce RTX 2070!
That’s sure as heck worthy of our Reviewer’s Choice Award! Nice work, NVIDIA!
NVIDIA GeForce RTX 2070 SUPER – Where To Buy
Here are some direct purchase links and prices of various AIB custom cards (prices accurate as of 18 August 2019) :
NVIDIA spiked the AMD Radeon RX 5700 launch party by first teasing, and then launching the GeForce RTX 2060 SUPER.
A cut-down version of the GeForce RTX 2070, it comes with 2176 stream processors and 8 GB of GDDR6 memory, at a $100 lower price point!
How will it match up against the new Radeon RX 5700, and its predecessor, the RTX 2060? Let’s find out!
NVIDIA GeForce RTX 2060 SUPER Primer
Designed to extend NVIDIA’s performance lead against the new AMD Navi-based graphics cards, the NVIDIA RTX SUPER graphics cards are based on cut-down versions of existing GeForce RTX models.
The GeForce RTX 2060 survives as the cheapest RTX card you can buy, while the GeForce RTX 2080 Ti remains the most powerful RTX graphics card money can buy.
NVIDIA GeForce RTX 2060 SUPER Price + Availability
The NVIDIA GeForce RTX 2060 SUPER (US | UK | MY) has a launch price that starts at $399 / ~£314 / ~€351 / ~RM 1,649, just $50 more than the GeForce RTX 2060.
Here are some direct purchase links and prices of various AIB custom cards (prices accurate as of 18 August 2019) :
The NVIDIA GeForce RTX 2060 SUPER (US | UK | MY) comes in a large cardboard box that doubles as a stand. Let’s take a quick look around, and see what we find inside!
As you can see, it basically comes with just a Quick Start Guide, and a Support Guide. Nothing else.
The NVIDIA GeForce RTX 2060 SUPER Hands On!
The NVIDIA GeForce RTX 2060 SUPER (US | UK | MY) has an NVIDIA TU106 GPU with 2176 stream processors.
This gives it 136 TMUs and 64 ROPs, delivering a peak output of 224 GT/s and 106 GP/s, thanks to its higher clock speeds.
Its 8 GB of GDDR6 memory runs at a speed of 1750 MHz, giving it a peak memory bandwidth of 448 GB/s.
Even though the TU106 GPU is fabricated on the 12 nm process technology, the card only needs 175 watts of power.
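As with its bigger sibling, the peak figures follow from the unit counts and clock speeds. A quick sketch, assuming a 1650 MHz boost clock, a 256-bit memory bus and a 14 Gbps effective GDDR6 data rate (none of which are stated above):

```python
# Peak throughput maths for the GeForce RTX 2060 SUPER.
# Assumed: 1650 MHz boost clock, 256-bit bus, 14 Gbps effective GDDR6
# data rate (8x the 1750 MHz base clock quoted above).
tmus, rops = 136, 64
boost_ghz = 1.65
mem_rate_gbps = 14
bus_width_bits = 256

texture_fillrate = tmus * boost_ghz              # GT/s
pixel_fillrate = rops * boost_ghz                # GP/s
bandwidth = mem_rate_gbps * bus_width_bits / 8   # GB/s

print(f"{texture_fillrate:.0f} GT/s, {pixel_fillrate:.0f} GP/s, {bandwidth:.0f} GB/s")
```

Under those assumptions, the numbers come out to the 224 GT/s, 106 GP/s and 448 GB/s quoted above.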
NVIDIA Turing
The NVIDIA GeForce RTX SUPER series continues to use the NVIDIA Turing microarchitecture, the first to introduce real-time ray tracing capabilities and new Tensor Cores.
It also continues to benefit from NVIDIA DLSS (Deep Learning Super-Sampling), which can be used to deliver better than TAA quality with a significantly lower performance penalty, or much better image quality – equivalent to 64X super sampling.
RTX 2060 SUPER Power Requirements
The NVIDIA GeForce RTX 2060 SUPER (US | UK | MY) is a cut-down version of the GeForce RTX 2070, so it comes as no surprise that it requires the same 175 watts of power.
Like the GeForce RTX 2070, you will need a power supply with an 8-pin PCI Express power cable.
RTX 2060 SUPER Display Ports
Surprisingly, it comes with a wide variety of display options. It has two DisplayPort 1.4 ports, which allow it to support up to 8K HDR resolution at 60 Hz, or 4K HDR at 120 Hz or more.
There is also a tiny USB-C VirtualLink port, which allows you to connect a VR headset. But guess what – it works like a standard USB-C port, so you can also connect any USB-C device to it!
For compatibility with older displays, it also comes with an HDMI 2.0b port, which supports up to 4K HDR displays at up to 60 Hz, as well as a dual-link DVI port.
NVIDIA GeForce RTX 2060 SUPER Benchmarking Notes
In this review, we will take a look at its gaming performance, in comparison to 7 other graphics cards :
We used 3DMark’s Time Spy and Time Spy Extreme synthetic benchmarks, which offer a good approximation of general gaming performance.
The Time Spy and Time Spy Extreme benchmarks support DirectX 12, and the latest features like asynchronous compute, and multi-threading support.
Time Spy – 2560 x 1440
Wow! In the Time Spy test, the GeForce RTX 2060 SUPER (US | UK | MY) was 16% faster than the RTX 2060, 13% faster than the Radeon RX 5700, and just 6.5% slower than the GeForce GTX 1080 Ti!
Let’s see how much of an effect they have on the overall gaming score…
Amazingly, it came within 2% of the GeForce GTX 1080 Ti, and was 11% faster than the Radeon RX 5700, and 14% faster than the RTX 2060.
Time Spy Extreme – 3840 x 2160
At the higher 4K resolution, the GeForce RTX 2060 SUPER (US | UK | MY) was just 6.2% slower than the GeForce GTX 1080 Ti.
It was now 17% faster than the Radeon RX 5700, and 20% faster than the RTX 2060.
And here are their overall gaming scores at 4K…
Again, its performance gap closed against the GeForce GTX 1080 Ti in the overall score – it was now just 5% slower.
It was also 14% faster than the Radeon RX 5700, and 17% faster than the RTX 2060.
F1 2019
F1 2019 is a new racing game by Codemasters, released on 28 June 2019.
We tested it on three resolutions at the Ultra High settings :
1080p : 1920 x 1080
1440p : 2560 x 1440
2160p : 3840 x 2160
1080p Gaming Resolution
At 1080p, the NVIDIA GeForce RTX 2060 SUPER (US | UK | MY) was slightly (less than 2%) faster than the Radeon RX 5700 and Vega 56 graphics cards.
1440p Gaming Resolution
At 1440p, it tied with the Radeon RX 5700, and they were both about 12.5% faster than the RTX 2060, Vega 64 and Vega 56 graphics cards.
2160p Gaming Resolution
When we bumped up the resolution to 4K, it continued to tie with the Radeon RX 5700, and they were both 13% faster than the RTX 2060 and Vega 64 graphics cards.
World War Z
Based on the 2013 movie, World War Z is a relatively new third-person shooter game, released in April 2019.
We tested it on three resolutions using the Vulkan API at the Ultra High settings :
1080p : 1920 x 1080
1440p : 2560 x 1440
2160p : 3840 x 2160
Note : This game is known to be highly optimised for AMD Radeon cards.
1080p Gaming Resolution
At 1080p, World War Z was CPU-limited even with the new AMD Ryzen 7 3700X processor.
So take these results with a big pinch of salt. The top 5 cards – Vega 56, Vega 64, RX 5700, RTX 2060S and GTX 1080 Ti – were producing roughly the same frame rates.
1440p Gaming Resolution
When we kicked up the resolution, it was 21% faster than the RTX 2060, but was now 10% slower than the RX 5700 and Vega 64 graphics cards.
2160p Gaming Resolution
Switching to 4K shook things up again. It now tied with the RX 5700, and they were both 6% slower than the Vega 56, 12% slower than the GTX 1080 Ti, and 16% slower than the Vega 64.
The Division 2
Tom Clancy’s The Division 2 is a new third-person shooter game released in March 2019.
We tested it on three resolutions using the Extreme settings :
1080p : 1920 x 1080
1440p : 2560 x 1440
2160p : 3840 x 2160
1080p Gaming Resolution
At 1080p, the NVIDIA GeForce RTX 2060 SUPER (US | UK | MY) was 20% faster than the RTX 2060, and 11.5% faster than the new Radeon RX 5700.
1440p Gaming Resolution
When we bumped the resolution to 1440p, it was 14.5% faster than the RTX 2060, and 9% faster than the new Radeon RX 5700.
2160p Gaming Resolution
At 4K, it was 18% faster than the RTX 2060, and 10% faster than the new Radeon RX 5700.
Strange Brigade
Strange Brigade is also a third-person shooter game, released in August 2018.
We tested it on three resolutions using the Vulkan API at the Ultra High settings :
1080p : 1920 x 1080
1440p : 2560 x 1440
2160p : 3840 x 2160
1080p Gaming Resolution
At 1080p, the NVIDIA GeForce RTX 2060 SUPER (US | UK | MY) was 14.6% faster than the RTX 2060, and 7.5% faster than the new Radeon RX 5700.
1440p Gaming Resolution
At 1440p, it was 16% faster than the RTX 2060, and 6.5% faster than the new Radeon RX 5700.
2160p Gaming Resolution
At 4K, it basically tied with the new Radeon RX 5700, and was 18% faster than the RTX 2060.
Metro Exodus
Metro Exodus is a relatively new first-person shooter game, released in February 2019.
We tested it on three resolutions using the Ultra settings :
1080p : 1920 x 1080
1440p : 2560 x 1440
2160p : 3840 x 2160
1080p Gaming Resolution
At 1080p, the NVIDIA GeForce RTX 2060 SUPER (US | UK | MY) was slightly faster than the Vega 64, and tied with the new RX 5700.
1440p Gaming Resolution
At 1440p, it now tied with the Vega 64, and was now 8% slower than the RX 5700, and 16% slower than the GTX 1080 Ti.
2160p Gaming Resolution
At 4K, it tied with the Vega 64, and they were both 14% slower than the GTX 1080 Ti, and 7% faster than the new RX 5700.
Ashes of the Singularity
Ashes of the Singularity is a 2016 game that supports multi-core processing and asynchronous compute.
We tested it on three resolutions using the DirectX 12 API at the Extreme settings :
1080p : 1920 x 1080
1440p : 2560 x 1440
2160p : 3840 x 2160
1080p Gaming Resolution
At 1080p, AOTS is CPU-limited. The GeForce RTX 2060 SUPER (US | UK | MY) was 3% slower than the RX 5700 and GTX 1080 Ti graphics cards.
1440p Gaming Resolution
Even at 1440p, it was still a little CPU-limited. The RTX 2060 SUPER was now 7% slower than the RX 5700, and 8.5% slower than the GTX 1080 Ti.
On the other hand, it was 9% faster than the Vega 64, and 16% faster than the RTX 2060.
2160p Gaming Resolution
At 4K, we could finally see the performance difference between the eight cards.
It was now 9% slower than the RX 5700, and 17% slower than the GTX 1080 Ti.
On the other hand, it was 8% faster than the Vega 64, and 18% faster than the RTX 2060 and Vega 56.
NVIDIA GeForce RTX 2060 SUPER – Summary
Coming up with the RTX SUPER cards was easy for NVIDIA. They are basically cut-down versions of existing and faster graphics cards.
The NVIDIA GeForce RTX 2060 SUPER (US | UK | MY) is essentially a GeForce RTX 2070 with 128 CUDA Cores disabled, but higher clock speeds to compensate.
This allows it to be almost as fast as a regular RTX 2070, just a whole lot cheaper! In fact, that was precisely why it replaced the RTX 2070, which was $100 more expensive.
In the synthetic 3DMark benchmark, the RTX 2060 SUPER is, without a doubt, far superior to the RX 5700. But in the six real-world games we put them through, here is a summary of the results :
F1 2019 : Tie
World War Z : Tie
The Division 2 : RTX 2060 SUPER is 12% faster on average.
Strange Brigade : RTX 2060 SUPER is 7% faster at 1080p and 1440p
Metro Exodus : Tie at 1080p, RX 5700 is 8% faster at 1440p, RTX 2060 SUPER is 7% faster at 2160p
AOTS : RX 5700 is 3-10% faster
In other words, its real world gaming performance vis-à-vis the AMD Radeon RX 5700 really depends on the game you are playing or testing.
Generally, we would say that the GeForce RTX 2060 SUPER is equivalent to the Radeon RX 5700.
To fight off the challenge, AMD immediately cut the launch price of the RX 5700 to $349, making it $50 cheaper and, on paper, a better buy than the RTX 2060 SUPER.
However, NVIDIA is bundling TWO GAMES – Wolfenstein: Youngblood and Control – with each RTX SUPER card. They are worth at least $50, so there goes the AMD price advantage.
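If you value the bundle at the $50 estimate above, the arithmetic works out to a wash. A quick sketch, using the article's own numbers:

```python
# Effective-price comparison, using the figures quoted above:
# RTX 2060 SUPER at $399, RX 5700 cut to $349, and the two bundled
# games valued at "at least $50".
rtx_2060_super_price = 399
rx_5700_price = 349
bundle_value = 50

effective_rtx_price = rtx_2060_super_price - bundle_value
price_gap = rx_5700_price - effective_rtx_price
print(f"Effective RTX 2060 SUPER price: ${effective_rtx_price}")
print(f"Remaining AMD price advantage: ${price_gap}")
```

Of course, this only holds if you actually want the two bundled games.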
NVIDIA GeForce RTX 2060 SUPER – Our Verdict & Award!
You can’t go wrong with either card, but here is what we feel will tilt many buyers toward this card over the Radeon RX 5700 :
Support for NVIDIA RTX in certain games, for a more realistic gaming experience
Support for NVIDIA DLSS in certain games, to improve image quality with a minimal performance penalty
A much quieter two-fan cooler
10 watts lower power consumption
Only a single 8-pin power connector required
A USB-C VirtualLink port and a dual-link DVI port
Now, there is nothing “super” about what’s really a cut-down version of the existing GeForce RTX 2070 graphics card. What will really sell the RTX 2060 SUPER is the fact it offers a much better value proposition than the RTX 2070.
The NVIDIA GeForce RTX 2060 SUPER is basically a slightly slower GeForce RTX 2070 with a $100 discount. It will deliver 16%-20% better performance than the regular RTX 2060 at just $399.
On that note, we believe it makes our grade for a Reviewer’s Choice Award. Congratulations, NVIDIA!
NVIDIA GeForce RTX 2060 SUPER – Where To Buy
Here are some direct purchase links and prices of various AIB custom cards (prices accurate as of 18 August 2019) :
AMD just announced that gamers can get three upcoming games for FREE when they buy an AMD Radeon RX Vega, RX 580 or RX 570 graphics card! So buy Radeon and get these three games absolutely FREE!
Buy Radeon, Get Three Upcoming Games Absolutely FREE!
Get the PC versions of the highly anticipated Assassin’s Creed Odyssey, Star Control: Origins and Strange Brigade for FREE when you buy an AMD Radeon RX Vega, RX 580 or RX 570 graphics card, once the titles are released.
Fast-forward to the year 2086 to join the galactic community and feel the thrill of ship-to-ship battle in Star Control: Origins. Stand against an ancient forgotten evil power in Strange Brigade. And forge your destiny and define your own path in war-torn Ancient Greece as you live an epic adventure in Assassin’s Creed Odyssey.
Strange Brigade will be available on August 28, Star Control: Origins will be available on September 20 and Assassin’s Creed Odyssey will be available on October 5. The promotional period begins August 7 and expires November 3, 2018. Games can be redeemed until December 31, 2018.
This is a great time to buy Radeon! Don’t forget to share this news!
Some Buy Radeon Deals!
Here are some great purchase options on qualifying Radeon graphics cards.
AMD revealed a great many things at their Computex 2018 press conference, but the only announced product that will actually ship with immediate effect is the Radeon RX Vega 56 nano. Let’s hear from Scott Herkelman about this new graphics card, and check out the actual Vega 56 nano card, as well as its key specifications!
Scott Herkelman On The Radeon RX Vega 56 nano
Here is the segment where Scott Herkelman reveals the Radeon RX Vega 56 nano graphics card, which is made and sold by PowerColor.
Basically, the PowerColor Radeon RX Vega 56 nano is a small version of the regular Radeon RX Vega 56, designed specifically for mini-ITX systems. Other than its much smaller size, the main difference is that it uses 6-pin + 8-pin power connectors instead of dual 8-pin power connectors.
Everything else, from its performance characteristics to its power consumption, appears to be the same as a regular Radeon RX Vega 56 graphics card.
PowerColor Radeon RX Vega 56 nano Specifications
Model : AXRX VEGA 56 NANO 8GBHBM2
GPU : AMD Vega 10
Stream Processors : 3584
Textures Per Clock : 224
Pixels Per Clock : 64
Base Clock Speed : 1156 MHz
Boost Clock Speed : 1471 MHz
Texture Fillrate : 258.9~329.5 GT/s
Pixel Fillrate : 74.0~94.1 GP/s
Graphics Memory : 8 GB HBM2
Graphics Memory Bus Width : 2048-bit
Graphics Memory Speed : 800 MHz
Graphics Memory Bandwidth : 409.6 GB/s
TDP : 210 W
PCI Express Power Connectors : 1 x 8-pin + 1 x 6-pin
Display Ports : 3 x DisplayPort + 1 x HDMI
Dimensions : 170 mm long x 95 mm tall x 38 mm thick
Retail Price : $449 (card only) / $599 (with PowerColor Gaming Station)
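The fillrate and bandwidth ranges in the specifications above can be derived directly from the unit counts and clock speeds. A quick sanity check (nothing assumed beyond the listed figures; HBM2 transfers twice per clock):

```python
# Reproducing the Radeon RX Vega 56 nano fillrate and bandwidth figures
# from its unit counts and clock speeds, as listed above.
textures_per_clock, pixels_per_clock = 224, 64
base_ghz, boost_ghz = 1.156, 1.471
hbm2_mhz, bus_width_bits = 800, 2048

tex_lo = textures_per_clock * base_ghz    # GT/s at base clock
tex_hi = textures_per_clock * boost_ghz   # GT/s at boost clock
pix_lo = pixels_per_clock * base_ghz      # GP/s at base clock
pix_hi = pixels_per_clock * boost_ghz     # GP/s at boost clock
bandwidth = hbm2_mhz * 2 * bus_width_bits / 8 / 1000  # GB/s (HBM2 is DDR)

print(f"Texture fillrate : {tex_lo:.1f}~{tex_hi:.1f} GT/s")
print(f"Pixel fillrate : {pix_lo:.1f}~{pix_hi:.1f} GP/s")
print(f"Memory bandwidth : {bandwidth:.1f} GB/s")
```

The results match the 258.9~329.5 GT/s, 74.0~94.1 GP/s and 409.6 GB/s figures quoted above.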
The PowerColor Radeon RX Vega 56 nano Up Close!
Right after the Computex 2018 press conference, we had the opportunity to check out the PowerColor Radeon RX Vega 56 nano up close. Here is a video we took of the two samples AMD provided at the end of the press conference.
Interestingly, AMD actually used one of them to power the large display for their presentation.
Radeon RX Vega 56 nano Price + Availability
The PowerColor Radeon RX Vega 56 nano is available from 6 June 2018 onwards, with a recommended retail price of US$ 449. This is a US$ 50 premium over the regular Radeon RX Vega 56.
PowerColor is also offering a bundle with the PowerColor Gaming Station, priced at US$ 599. The PowerColor Gaming Station alone is priced at US$ 299, so that’s a US$ 149 discount for the bundle – a really good deal!
If you thought the Intel Computex 2018 press conference was impressive, the AMD Computex 2018 press conference was even more so. One by one, key AMD executives came on stage to announce and show off new AMD products.
Bam! Bam! Bam! They just kept rolling out the announcements – the 2nd Gen Ryzen Threadripper, the 7nm EPYC CPU, the 7nm Vega GPU, and even a Radeon RX Vega nano graphics card! Get the full details below!
The AMD Computex 2018 Press Conference
Here is our video of the complete 76+ minute long AMD Computex 2018 press conference, followed by our pictures and comments that were posted live on Instagram, Facebook and Twitter during the event.
Dr. Lisa Su kicks off her keynote speech! @ The Westin Taipei
Cisco just announced that they’re adopting AMD EPYC processors!
HP just announced that they’re going to offer their first 1P AMD EPYC Proliant servers
Tencent CEO just announced that they’re offering AMD EPYC-powered SA1 cloud service!
Scott Herkelman takes over from Dr. Lisa Su to talk about Radeon and gaming.
Acer’s Jerry Hou shows off the Predator Helios 500 gaming laptop powered by 2nd Gen Ryzen and Radeon RX graphics.
Scott Herkelman reveals the new Radeon RX Vega 56 Nano graphics card
Jim Anderson begins his talk on the Ryzen processor family.
Kevin Lensing starts showing off new Ryzen Mobile laptops.
Ray Wah talks about Dell’s adoption of Ryzen Mobile processors in their Inspiron laptops.
AMD Computex 2018 Press Conference Part 2
ASUS’ Marcel Campos shows off a Ryzen Mobile laptop with… wait for it… an NVIDIA GeForce GTX 1050 GPU… LOL
Jerry Hou talks about Lenovo’s Ryzen Mobile laptops.
The 2nd Gen Ryzen Threadripper will have up to 32 cores and 64 threads. It’ll be available in Q3 2018!
Jim Anderson shows off the new 2nd Gen Ryzen Threadripper processor… with a delidded example.
David Wang talks about the world’s first 7nm GPU.
Introducing the AMD Radeon Instinct Vega 7nm!
Want to see how fast the 7nm Radeon Instinct is? Wait for our video!
The new 7nm AMD Radeon Instinct is sampling now and will launch in 2H 2018!
Dr. Lisa Su announces that the EPYC processor will be shrunk to 7nm.
The AMD EPYC processor Lisa held is an early sample, with sampling in Q2 and a target launch in 2019.
The 7nm EPYC processor with the 7nm Radeon Instinct Vega.
The Official AMD Computex 2018 Press Release
TAIPEI, Taiwan — June 6, 2018 — AMD (NASDAQ: AMD) today demonstrated its next generation of CPU and GPU product leadership during a live-streamed press conference at COMPUTEX TAIPEI 2018. AMD provided a first look at the performance of upcoming 7nm AMD Radeon Vega GPU products slated for launch in 2018, 12nm 2nd Generation AMD Ryzen Threadripper processors with up to 32 cores slated for launch in Q3 2018, and unprecedented customer adoption of Ryzen and Radeon products in premium OEM devices. AMD also announced four EPYC processor milestones: immediate availability of EPYC processors through Tencent Cloud Services, a new HPE single-socket system, details of its first Cisco UCS server platform, and that the next generation 7nm EPYC processor, codenamed “Rome”, will begin sampling in 2H 2018.
“At Computex 2018 we demonstrated how the strongest CPU and GPU product portfolio in the industry gets even stronger in the coming months,” said AMD President and CEO Dr. Lisa Su. “Our upcoming 7nm and 12nm products build on the momentum of our Ryzen, Radeon and EPYC processors, positioning AMD to lead the next generation of high-performance computing in markets from premium devices and gaming to machine learning and the datacenter.”
AMD Computex 2018 Client Compute Update
AMD delivered the first public demonstration of 2nd Generation AMD Ryzen Threadripper processors—the second AMD 12nm product family—featuring up to 32 cores and 64 threads. 2nd Gen Ryzen Threadripper processors are scheduled to launch in Q3 2018 with outstanding performance expected in rendering, post production, and encoding workloads.
AMD showcased its broadest portfolio of premium notebook and desktop systems from global OEM partners powered by Ryzen APUs, 2nd Gen Ryzen desktop processors, and Radeon graphics. Newly introduced systems from partners include:
Dell’s latest Inspiron series including Inspiron 13” 7000 2-in-1, Inspiron 15” 5000 notebooks and Inspiron 7000 gaming desktop
HP Envy x360 13” and Envy x360 15” notebooks
Huawei MateBook D 14” notebook
Lenovo Yoga 530, IdeaPad 530S, 330S and 330 notebooks
AMD also continued to expand and improve upon the AM4 desktop ecosystem for Ryzen desktop processors with the announcement of AMD B450 chipset-based motherboards. Optimized for 2nd generation Ryzen desktop processors, B450 chipsets, designed to offer a superb balance of features, performance, and value, will be available from partners including ASRock, ASUS, Biostar, Gigabyte, and MSI.
AMD Computex 2018 Graphics and Gaming Update
AMD showcased the first public demonstration of its Radeon Vega GPU based on 7nm process technology built specifically for professional/datacenter applications.
AMD announced that 7nm Radeon “Vega” architecture-based Radeon Instinct has started sampling to initial customers and will launch in both server and workstation form factors for key compute use cases in 2H 2018.
AMD also revealed:
Radeon™ RX Vega56 “nano” graphics card from PowerColor that enables small form factor enthusiast gaming performance,
the latest Radeon FreeSync™ technology adoption with Samsung’s 80” QLED TV, for tear-free and smooth gameplay directly from a PC equipped with a Radeon RX graphics card, or with a Microsoft Xbox One S or Xbox One X console,
FreeSync support with HDR now available in Ubisoft’s recently launched and popular AAA game Far Cry 5.
AMD Computex 2018 Server Update
AMD announced growth in its EPYC processor engagements including:
The company’s first Cisco UCS server engagement, in Cisco’s highest-density offering ever, with 128% more cores, 50% more servers, and 20% more storage per rack.
The all new HPE ProLiant DL325 Gen10 one socket server for virtualization and software-defined storage applications with up to 27% lower cost per virtual machine than the leading two-socket competitor.
Immediate availability of the EPYC based SA1 Tencent Cloud Service
The next generation 7nm EPYC processor, codenamed “Rome” and featuring Zen2 architecture, is now running in AMD labs and will begin sampling to customers in the second half of this year, ahead of launch in 2019.
Partner Quotes
“Acer is excited to partner with AMD on the Predator Helios 500 gaming notebook featuring 2nd Gen AMD Ryzen 7 processors, Radeon RX Vega56 graphics, and a built in 17.3-inch display supporting Radeon FreeSync,” said Jerry Hou, General Manager, Consumer Notebooks, IT Products Business, Acer Inc. “It’s a gaming beast that offers superior desktop-class gaming performance in a notebook, built for graphic-intensive AAA titles.”
“ASUS is dedicated to delivering the most innovative hardware for gamers of all levels,” said Marcel Campos, ASUS Global PC & Phone Marketing Senior Director. “We’re excited to announce the new ASUS X570ZD, the first laptop in the world to pair AMD’s Ryzen mobile processors with NVIDIA GTX 1050 graphics for gaming on the go. If it’s productivity you’re after, the ASUS X505ZA laptop is powered by a choice of AMD Ryzen mobile processors with Radeon Vega Graphics to give you the performance you need.”
“Dell is excited to continue implementing AMD’s Ryzen and Radeon solutions across a variety of products, including the latest Inspiron notebooks and Inspiron Gaming Desktops,” said Ray Wah, senior vice president and general manager, Dell Consumer and Small Business Product Group. “AMD’s Ryzen mobile processors with Radeon Vega Graphics deliver the responsiveness and performance users want and the performance in creative and productivity apps they need.”
“HP’s focus on design and engineering is delivering uncompromised style, performance and versatility to our customers,” said Kevin Frost, vice president and general manager, consumer PCs, HP Inc. “And we are delivering unrivaled premium PC experiences on the latest AMD Ryzen mobile processors, including the HP ENVY x360 13 – the first 13-inch convertible with the AMD Ryzen Processor with Radeon Vega Graphics and the HP ENVY x360 15 for incredible multi-tasking performance.”
“AMD EPYC has enabled HPE to pack more performance into an efficient server design, removing the need for a second processor and reducing TCO for our customers. The HPE ProLiant DL325 Gen10 allows customers to achieve dual-processor performance in a versatile single-socket server,” said Justin Hotard, vice president and general manager, Volume Global Business Unit, HPE. “By providing up to 32 processor cores, 2 terabytes of memory and more fully utilizing 128 PCIe lanes of I/O, we have set the bar for single processor virtualization performance, and with HPE OneView, customers can optimize their applications and dramatically speed deployment of new virtual machines.”
“As a first venture with AMD, Huawei is excited to be working with AMD to integrate the latest Ryzen mobile processors featuring Radeon Vega Graphics into Huawei’s MateBook D.” said Michael Young, General Manager of Xunwei technology. “The Huawei MateBook D is designed to provide superior performance and sleek design for users at work and play.”
“In the past, great computing performance with practical portability wasn’t always accessible,” said Jeff Meredith, Senior Vice President and General Manager of Lenovo’s Consumer PCs and Smart Devices. “The Lenovo IdeaPad 530S, 330 and 330S laptops and Yoga 530 convertible 2-in-1 are designed to change that. Built with the latest AMD Ryzen processor with Radeon Vega graphics, these laptops offer consumers more choice – enabling mobile computing to meet nearly any budget.”
Hurray! AMD will now provide a FREE Far Cry 5 licence with every purchase of the Radeon RX Vega 64 or Radeon RX Vega 56 graphics card from authorised dealers and retailers! Here are the full details!
Far Cry 5
Deemed one of the most exciting games of 2018, Far Cry 5 (Price Check) delivers endless fun as you roam through an open world to liberate Hope County, Montana from dictatorial rule! For maximum gaming pleasure, it deserves a top-of-the-range graphics card like the Radeon RX Vega 64 (Price Check) or Radeon RX Vega 56 (Price Check).
Far Cry 5 costs US$ 59.99 / RM 209, but now you can get Far Cry 5 (Price Check) absolutely FREE!
Step 1 : Purchase an AMD Radeon RX Vega 56 or Radeon RX Vega 64 graphics card from an authorised dealer or retailer
Step 2 : Obtain your free Far Cry 5 redemption code card from the dealer or retailer *
Step 3 : Download your free Far Cry 5 game from the AMD Rewards website with the redemption code and enjoy!
* If you had earlier purchased an AMD Radeon RX Vega 56 or Radeon RX Vega 64 but did not receive your redemption code, check with your retailer or dealer ASAP.
Free Far Cry 5 Promo Availability
The AMD Free Far Cry 5 promo is available across the globe – North America, Latin America, Western Europe, Eastern Europe, the Middle East and Africa.
Note that in some of these regions, you may qualify for the free Far Cry 5 (Price Check) licence if you purchase an AMD Radeon RX 580 (Price Check).
At CES 2018, AMD announced the AMD Raven Ridge desktop processors – the long-awaited AMD Ryzen APUs. They are basically AMD Ryzen processors with AMD Radeon Vega graphics built-in. We can now share with you the full details and our reviews of the AMD Raven Ridge desktop APUs!
Updated @ 2018-02-13 : Added the AMD Ryzen 5 2400G and Ryzen 3 2200G review links. Updated various parts of the article.
Updated @ 2018-02-10 : Added two new sections on the new CPU package, L3 cache, PCI Express lanes, and Precision Boost 2. Also updated the section on the Single CCX Configuration.
Originally posted @ 2018-02-08
The AMD Raven Ridge Desktop APU Reviews
Here are the reviews of the new AMD Raven Ridge desktop APUs.
Raven Ridge is AMD’s codename for their Ryzen-Vega APUs (Accelerated Processing Units). First introduced in the mobile segment as the AMD Ryzen Mobile, AMD is now introducing them to the desktop market.
Desktop APUs are not new. AMD has been making them for years, and many Intel desktop processors come with integrated graphics. But AMD is still the only manufacturer to integrate “premium CPU cores” with “premium graphics cores” to deliver gaming for the masses with :
1080p HD+ gaming performance without a discrete graphics card
support for Radeon FreeSync, Radeon Chill, Enhanced Sync and Radeon ReLive
Single CCX Configuration
Unlike the Summit Ridge-based Ryzen CPUs, the AMD Raven Ridge processors use a single CCX configuration. This is a cost-saving measure that yields a much smaller die size, with some performance benefits – reduced cache and memory latencies.
AMD analysed the performance of the 2+2 and 4+0 configurations, and concluded that they are “roughly equivalent on average across 50+ games”.
Smaller L3 Cache
Using a single CCX configuration halves the Raven Ridge L3 cache size from 8 MB to 4 MB. To compensate, AMD increased their base and boost clock speeds, particularly in the Ryzen 5 2400G.
New CPU Package
The Raven Ridge APUs also introduce a revised CPU package, and a switch to the traditional non-metallic TIM (thermal interface material). These are again cost-cutting measures, albeit with a side benefit of allowing the Raven Ridge processors to officially support DDR4-2933 memory.
Precision Boost 2
The new Raven Ridge processors also boast the improved Precision Boost 2, whose more graceful and linear boost algorithm allows them to “boost more cores, more often, on more workloads”. It is now able to change frequencies in a very fine granularity of just 25 MHz.
According to AMD, this will allow the Raven Ridge processors to perform better with apps and games that spawn many lightweight threads, as opposed to apps with persistent loads (e.g. video editing and 3D rendering).
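To make the 25 MHz granularity concrete, here is a minimal Python sketch. Only the clocks and step size come from AMD's disclosures; the quantisation function itself is our own illustration, not AMD's actual boost algorithm:

```python
# Illustration only: with Precision Boost 2, the effective clock can land
# on any 25 MHz step between the base and maximum boost clock, instead of
# jumping in coarse bins.
BASE_MHZ = 3600   # Ryzen 5 2400G base clock
BOOST_MHZ = 3900  # Ryzen 5 2400G maximum boost clock
STEP_MHZ = 25     # Precision Boost 2 frequency granularity

def quantize(freq_mhz: float) -> int:
    """Snap a requested frequency to the nearest 25 MHz step,
    clamped to the base..boost range. (Hypothetical helper.)"""
    snapped = round(freq_mhz / STEP_MHZ) * STEP_MHZ
    return max(BASE_MHZ, min(BOOST_MHZ, snapped))

# Number of distinct boost states between base and max clock:
states = (BOOST_MHZ - BASE_MHZ) // STEP_MHZ + 1
print(states)           # 13 distinct 25 MHz steps
print(quantize(3712))   # 3700
```

Thirteen intermediate steps between base and boost is what lets the processor track lightweight, bursty workloads more closely than a coarse two-state boost would.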
PCIe x8 For Discrete GPU
The Summit Ridge-based AMD Ryzen 7, Ryzen 5 and Ryzen 3 processors have 16 PCI Express 3.0 lanes dedicated to the PCIe graphics card. In Raven Ridge, that is cut in half. That means any external graphics card will only communicate with a Raven Ridge processor at PCIe x8 speed.
This is a cost-saving measure, making the Raven Ridge processor simpler and cheaper to produce. The Ryzen 3 2200G, for example, is $10 cheaper than its predecessor, the Ryzen 3 1200. AMD also claims that the move contributed to a smaller and more efficient uncore.
AMD made this decision because “abundant public data has shown that this is a neutral change for the midrange GPUs and workloads likely to be paired with a $99-169 processor“.
Frankly, the Raven Ridge is best used as-is. If you plan to use a discrete graphics card, it makes far more sense to get the Summit Ridge-based AMD Ryzen 7, Ryzen 5 and Ryzen 3 processors instead.
The AMD Ryzen 2000G Series APUs Revealed!
As announced at CES 2018, AMD is introducing two Raven Ridge desktop processors as part of the new AMD Ryzen 2000G family – the AMD Ryzen 5 2400G, and the AMD Ryzen 3 2200G.
AMD kindly sent us their Raven Ridge desktop processor media kit, which we unboxed in this video :
The AMD Raven Ridge Desktop APU Specification Comparison
For your convenience, we created this table that compares their key specifications with those of their CPU-equivalents – the AMD Ryzen 5 1400 and the AMD Ryzen 3 1200.
| Specifications | AMD Ryzen 5 2400G | AMD Ryzen 5 1400 | AMD Ryzen 3 1200 | AMD Ryzen 3 2200G |
|---|---|---|---|---|
| TDP | 65 W | 65 W | 65 W | 65 W |
| Socket | AM4 | AM4 | AM4 | AM4 |
| Process Technology | 14 nm FinFET | 14 nm FinFET | 14 nm FinFET | 14 nm FinFET |
| Transistor Count | 4.94 billion | 4.8 billion | 4.8 billion | 4.94 billion |
| Die Size | 209.78 mm² | 192 mm² | 192 mm² | 209.78 mm² |
| CCX Configuration | 4+0 | 2+2 | 2+2 | 4+0 |
| Processor Cores | 4 | 4 | 4 | 4 |
| Simultaneous Threads | 8 | 8 | 4 | 4 |
| L2 Cache Size | 2 MB | 2 MB | 2 MB | 2 MB |
| L3 Cache Size | 4 MB | 8 MB | 8 MB | 4 MB |
| Base Clock Speed | 3.6 GHz | 3.2 GHz | 3.1 GHz | 3.5 GHz |
| Boost Clock Speed | 3.9 GHz | 3.4 GHz | 3.4 GHz | 3.7 GHz |
| Max. DDR4 Speed | DDR4-2933 | DDR4-2667 | DDR4-2667 | DDR4-2933 |
| GPU | Radeon RX Vega 11 (704 stream processors, 44 TMUs, 16 ROPs, up to 1250 MHz) | None | None | Radeon Vega 8 (512 stream processors, 32 TMUs, 16 ROPs, up to 1100 MHz) |
| PCI Express Lanes | PCIe x8 | PCIe x16 | PCIe x16 | PCIe x8 |
| Bundled CPU Cooler | AMD Wraith Stealth | AMD Wraith Stealth | AMD Wraith Stealth | AMD Wraith Stealth |
| Launch Price | US$ 169 | US$ 169 | US$ 109 | US$ 99 |
AMD Raven Ridge Price & Availability
The AMD Raven Ridge desktop APUs are available for purchase starting 12 February 2018, at the following prices :
At those price points, these Raven Ridge APUs will shred the value proposition of Intel processors with integrated graphics. More cores and more threads, with a much faster graphics core, at such prices. What more can you ask for?
The AMD Raven Ridge desktop APUs will be a relief to many esports gamers, who are suffering from extremely high GPU prices because of cryptocurrency miners.
AMD announced the AMD Ryzen 5 2400G with Radeon RX Vega 11 Graphics at CES 2018, and now it is finally here! Today, we will share with you our review of the AMD Ryzen 5 2400G APU, and its integrated Radeon RX Vega 11 Graphics!
The AMD Ryzen 5 2400G Specifications Compared
We created this table to compare the specifications of the AMD Ryzen 5 2400G (Price Check) and AMD Ryzen 3 2200G (Price Check) APUs, against the AMD Ryzen 5 1400 and AMD Ryzen 3 1200 CPUs, that they will replace.
| Specifications | AMD Ryzen 5 2400G | AMD Ryzen 5 1400 | AMD Ryzen 3 1200 | AMD Ryzen 3 2200G |
|---|---|---|---|---|
| TDP | 65 W | 65 W | 65 W | 65 W |
| Socket | AM4 | AM4 | AM4 | AM4 |
| Process Technology | 14 nm FinFET | 14 nm FinFET | 14 nm FinFET | 14 nm FinFET |
| Transistor Count | 4.94 billion | 4.8 billion | 4.8 billion | 4.94 billion |
| Die Size | 209.78 mm² | 192 mm² | 192 mm² | 209.78 mm² |
| CCX Configuration | 4+0 | 2+2 | 2+2 | 4+0 |
| Processor Cores | 4 | 4 | 4 | 4 |
| Simultaneous Threads | 8 | 8 | 4 | 4 |
| L2 Cache Size | 2 MB | 2 MB | 2 MB | 2 MB |
| L3 Cache Size | 4 MB | 8 MB | 8 MB | 4 MB |
| Base Clock Speed | 3.6 GHz | 3.2 GHz | 3.1 GHz | 3.5 GHz |
| Boost Clock Speed | 3.9 GHz | 3.4 GHz | 3.4 GHz | 3.7 GHz |
| Max. DDR4 Speed | DDR4-2933 | DDR4-2667 | DDR4-2667 | DDR4-2933 |
| GPU | Radeon RX Vega 11 (704 stream processors, 44 TMUs, 16 ROPs, up to 1250 MHz) | None | None | Radeon Vega 8 (512 stream processors, 32 TMUs, 16 ROPs, up to 1100 MHz) |
| PCI Express Lanes | PCIe x8 | PCIe x16 | PCIe x16 | PCIe x8 |
| Bundled CPU Cooler | AMD Wraith Stealth | AMD Wraith Stealth | AMD Wraith Stealth | AMD Wraith Stealth |
| Launch Price | US$ 169 | US$ 169 | US$ 109 | US$ 99 |
Unboxing The AMD Ryzen 5 2400G
The AMD Ryzen 5 2400G with Radeon RX Vega 11 Graphics (Price Check) comes bundled with an AMD Wraith Stealth cooler. Let’s unbox it, and see what we find inside!
The AMD Ryzen 5 2400G APU Up Close!
The AMD Ryzen 5 2400G with Radeon RX Vega 11 Graphics (Price Check) has four Ryzen processor cores with a 3.6 GHz base clock, and a 3.9 GHz boost clock. It supports SMT (simultaneous multi-threading), and can therefore handle 8 threads simultaneously.
It uses a single CCX (CPU Complex), allowing AMD to fit a Radeon GPU on the same die. Its transistor count only increased by 3% to 4.94 billion, with a 9% larger die size of 209.78 mm².
Single CCX Configuration
Unlike the Summit Ridge-based Ryzen CPUs, the AMD Ryzen 5 2400G uses a single CCX configuration. This is a cost-saving measure that yields a much smaller die size, with some performance benefits – reduced cache and memory latencies.
AMD analysed the performance of the 2+2 and 4+0 configurations, and concluded that they are “roughly equivalent on average across 50+ games”.
Smaller L3 Cache
Using a single CCX configuration has the unfortunate effect of halving the L3 cache size from 8 MB to 4 MB. AMD increased its base and boost clock speeds to compensate for the smaller L3 cache.
The AMD Ryzen 5 2400G has a 400 MHz (12.5%) higher base clock and a 500 MHz (14.7%) higher boost clock than the Ryzen 5 1400 it replaces.
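Those percentages check out against the published clocks; here is a quick Python sanity check (clocks taken from the specification table above):

```python
# Verify the clock-speed deltas between the Ryzen 5 2400G and the
# Ryzen 5 1400 it replaces (clocks from the specification table).
base_2400g, boost_2400g = 3.6, 3.9   # GHz
base_1400,  boost_1400  = 3.2, 3.4   # GHz

base_gain  = (base_2400g - base_1400) / base_1400    # 0.4 / 3.2
boost_gain = (boost_2400g - boost_1400) / boost_1400 # 0.5 / 3.4

print(f"{base_gain:.1%}, {boost_gain:.1%}")  # 12.5%, 14.7%
```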
New CPU Package
The Raven Ridge APUs also introduce a revised CPU package, and a switch to the traditional non-metallic TIM (thermal interface material). These are again cost-cutting measures, albeit with a side benefit of allowing the AMD Ryzen 5 2400G (Price Check) to officially support DDR4-2933 memory.
Precision Boost 2
The AMD Ryzen 5 2400G (Price Check) supports the improved Precision Boost 2, whose more graceful and linear boost algorithm allows it to “boost more cores, more often, on more workloads”. It can change frequencies in a very fine granularity of just 25 MHz.
According to AMD, this will allow the Raven Ridge processors to perform better with apps and games that spawn many lightweight threads, as opposed to apps with persistent loads (e.g. video editing and 3D rendering).
PCIe x8 For Discrete GPU
The Summit Ridge-based AMD Ryzen 7, Ryzen 5 and Ryzen 3 processors have 16 PCI Express 3.0 lanes dedicated to the PCIe graphics card. The AMD Ryzen 5 2400G (Price Check) only has half that – 8 PCIe lanes. That means any external graphics card will only communicate with it at PCIe x8 speed.
This is a cost-saving measure, although AMD also claims that the move contributed to a smaller and more efficient uncore. According to AMD, this is unlikely to make a significant difference with the type of (mid-range) graphics cards this processor will usually be paired with.
AMD Wraith Stealth
The AMD Ryzen 5 2400G (Price Check) is bundled with the AMD Wraith Stealth cooler. This is a basic CPU cooler, so don’t expect LED or RGB lighting, a copper base or even heatpipes.
The Wraith Stealth cooler uses a simple, low-profile aluminium heatsink, with a new spring-screw clamping system. Its main advantage is that it is quiet, with a maximum noise level of 28 dBA.
In the graphics tests, we will compare it to the NVIDIA GeForce GTX 1050 (Price Check), and the AMD Radeon RX 460 (Price Check) graphics cards. The graphics drivers used were the NVIDIA GeForce 390.77 and the AMD Radeon Software 17.7.
3D Rendering Speed – CINEBENCH R15
CINEBENCH R15 is a real-world 3D rendering benchmark based on the MAXON Cinema 4D animation software. This is a great way to accurately determine the actual performance of a processor in 3D content creation.
CINEBENCH R15 Single Core
This Single Core test is not reflective of real world performance, but it is useful to find out the performance of the individual core.
Despite its clock speed advantage, the AMD Ryzen 5 2400G (Price Check) delivered significantly poorer single-core performance than the Ryzen 3 2200G (Price Check)! You can see it in the video above.
CINEBENCH R15 Multi Core
The Multi-Core test shows the processor’s real-world 3D rendering performance.
The analysis of the Multi-Processing Ratio is useful in checking the efficiency of the SMT implementation. The MP Ratio is independent of the processor’s clock speed.
HandBrake is a free, open-source video transcoding utility. Video transcoding basically converts a video file from one resolution / format to another. As you can imagine, it’s very compute-intensive. In our test, we converted a 4K video of 1.3 GB in size into a 1080p video (HQ1080p30).
Despite its clock speed advantage, the AMD Ryzen 5 2400G (Price Check) was 9% slower than the Ryzen 5 1500X (Price Check) at video transcoding. This is due to the much smaller L3 cache size. But thanks to its support for SMT, it is 15% faster than the Ryzen 3 1300X (Price Check), and 32% faster than the Ryzen 3 2200G APU.
Radial Blur Speed – Photoshop CC 14
The radial blur filter adds the perception of motion to a picture. This is a compute-intensive operation that benefits from multiple processing cores. This radial blur test was performed on a single 13.5 megapixel photo, with a filesize of 4,910,867 bytes.
Again, the smaller L3 cache size had a noticeable effect on its performance in Photoshop. Even though it supports SMT and has a slightly higher clock speed, the AMD Ryzen 5 2400G (Price Check) was 22% slower than the Ryzen 5 1500X (Price Check), and 7% slower than the Ryzen 3 1300X (Price Check).
3DMark – FireStrike (1920 x 1080)
In the 3DMark FireStrike benchmark that runs at 1920 x 1080, the Radeon RX Vega 11 core of the AMD Ryzen 5 2400G (Price Check) delivered a Graphics Score of just over 3100. This makes it roughly half as fast as the AMD Radeon RX 460 (Price Check).
3DMark – FireStrike Extreme (2560 x 1440)
When we bumped up the resolution to 2560 x 1440, the AMD Ryzen 5 2400G delivered a Graphics Score of just under 1400.
Ashes of the Singularity (1080p)
In the RTS game, Ashes of the Singularity, the single core CPU performance has a significant effect on the actual frame rate. We tested the cards using the Low setting preset.
At the resolution of 1920 x 1080, the AMD Ryzen 5 2400G (Price Check) delivered a frame rate just under 30 fps. But remember – we tested it at Low settings. So gamers will want to drop the resolution even further to get a decent frame rate.
Ashes of the Singularity (1440p)
We then bumped up the resolution to 2560 x 1440, again with the settings set to Low.
The AMD Ryzen 5 2400G (Price Check) held up surprisingly well at 1440p – its frame rate dropped only 9%.
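That small drop is telling: 2560 x 1440 pushes far more pixels than 1920 x 1080, so a GPU-bound game would slow down much more than 9%. A quick calculation shows the pixel-count increase:

```python
# 1440p renders ~78% more pixels than 1080p, yet the frame rate only
# fell ~9% - a strong hint that this game is CPU-limited here, not
# GPU-limited.
pixels_1080p = 1920 * 1080
pixels_1440p = 2560 * 1440

pixel_increase = pixels_1440p / pixels_1080p - 1
print(f"{pixel_increase:.0%}")  # 78%
```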
Total War: Warhammer (1080p)
Like Ashes of the Singularity, the single core performance appears to be crucial in Total War: Warhammer.
At 1080p, the AMD Ryzen 5 2400G (Price Check) delivered a decent average frame rate of 42 fps. However, note that the quality settings were set to Low.
Total War: Warhammer (1440p)
We then bumped up the resolution to 1440p to see how it fares, again with the quality settings set to Low.
Alas, 1440p is just too hard for the Ryzen 5 2400G (Price Check) to handle. The average frame rate of 24 fps is just too low to be playable.
For Honor (1080p)
We started out testing For Honor using the Low settings, which means Trilinear Filtering and no Anti-Aliasing.
The AMD Ryzen 5 2400G (Price Check) was able to deliver a playable frame rate, which averages out at 39 fps. This makes it half as fast as the AMD Radeon RX 460 (Price Check) graphics card.
For Honor (1440p)
We then increased the resolution to 2560 x 1440, still using the Low settings.
The AMD Ryzen 5 2400G (Price Check) was definitely not powerful enough to handle the increased workload, with an average frame rate of only 24 fps.
Our Verdict
First off, it bears reminding that the AMD Ryzen 5 2400G (Price Check) is an APU – basically a quad-core processor, with built-in AMD Vega graphics. Even with the added graphics capability, AMD kept the Ryzen 5 2400G at the same $169 price point as the Ryzen 5 1400 CPU it replaces.
AMD achieved this by using half of a Summit Ridge processor die, allowing 11 Vega Compute Units to be inserted with only a slight bump in transistor count and die size. This clever engineering compromise, plus a number of other tweaks, allowed AMD to keep costs low.
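Those 11 Compute Units line up with the stream-processor figures in the specification table, since each Vega Compute Unit contains 64 stream processors:

```python
# Each GCN/Vega Compute Unit packs 64 stream processors, so the CU
# counts map directly onto the stream-processor figures in the table.
SP_PER_CU = 64

vega_11_sps = 11 * SP_PER_CU  # Radeon RX Vega 11 (Ryzen 5 2400G)
vega_8_sps = 8 * SP_PER_CU    # Radeon Vega 8 (Ryzen 3 2200G)

print(vega_11_sps, vega_8_sps)  # 704 512
```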
The biggest downside of the single CCX configuration is that the AMD Ryzen 5 2400G (Price Check) only has a 4 MB L3 cache – half that of the Ryzen 5 1400 CPU it replaces. AMD increased its clock speed to compensate, actually making the Ryzen 5 2400G run faster than the Ryzen 5 1500X (Price Check) on paper!
The Ryzen 5 2400G was 9-15% slower than the Ryzen 5 1500X in our benchmarks, which makes them both equal in price / performance, because the 1500X is 12% more expensive at $189. The Ryzen 5 2400G, however, has more value, thanks to its integrated Vega graphics.
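The rough parity in price / performance can be checked with the numbers quoted above, taking the mid-point of the 9-15% performance gap:

```python
# Price/performance sanity check: the Ryzen 5 1500X is ~12% more
# expensive and ~9-15% faster than the Ryzen 5 2400G, so the two
# chips end up roughly even in CPU price/performance.
price_2400g, price_1500x = 169, 189  # US$, launch prices

premium = price_1500x / price_2400g - 1  # ~0.12 -> ~12% dearer
print(f"{premium:.0%}")  # 12%

# If the 1500X is, say, 12% faster (mid-point of 9-15%), its
# performance-per-dollar is essentially identical:
perf_ratio = 1.12 / (price_1500x / price_2400g)
print(f"{perf_ratio:.2f}")  # 1.00
```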
When it comes to games, AMD promised that it will deliver “1080p HD+ gaming performance”. That may be true for less strenuous esports games like League of Legends. In the games we tested, its Radeon RX Vega 11 graphics core could only deliver playable frame rates at 1080p if we used the lowest possible quality settings.
Make no mistake – the Radeon RX Vega 11 processor graphics is no replacement for a good graphics card, like AMD’s own Radeon RX Vega 64 and Radeon RX Vega 56.
So what is the AMD Ryzen 5 2400G really good for? We see it being used mostly in small form-factor esports gaming PCs. It offers great CPU performance paired with good gaming performance for games like Dota 2, League of Legends and CS:GO in a single, highly-affordable and power-efficient package.
Remember – the AMD Ryzen 5 2400G (Price Check) offers 4-core, 8-thread CPU performance, with faster than average processor graphics, for just 65 watts of power consumption.
If you are looking to play games with all of the bells and whistles enabled, you need to opt for a dedicated graphics card. But if you are a casual gamer, or just want a really affordable and power-efficient esports gaming system (looking at you esports cafe owners!), it’s hard to beat the value proposition of the AMD Ryzen 5 2400G (Price Check).
Mark our words – the AMD Ryzen 5 2400G is going to shred Intel processors with integrated graphics to pieces.
The AMD Ryzen 5 2400G Price & Availability
As AMD announced, the AMD Ryzen 5 2400G (Price Check) desktop APU will be available starting 12 February 2018.
It is priced at just US$169 (RM 759 in Malaysia), making it an affordable gaming solution. This will be a relief to many esports gamers, who are suffering from extremely high GPU prices because of cryptocurrency miners.
You can help support Tech ARP by ordering your AMD Ryzen 5 2400G from this Amazon link.
Las Vegas – 2018 CES – January 7, 2018— AMD (NASDAQ: AMD) today detailed its forthcoming roll-out plan for its new and next generation of high-performance computing and graphics products during the AMD CES 2018 technology update event in Las Vegas.
Alongside announcing the first desktop Ryzen processors with built-in Radeon Vega Graphics, AMD also detailed the full line-up of Ryzen Mobile APUs including the new Ryzen PRO and Ryzen 3 models, and provided a first look at the performance of its upcoming 12nm Ryzen 2 desktop CPU expected to launch in April 2018.
In graphics, AMD announced the expansion of the AMD Vega family with Radeon Vega Mobile and that its first 7nm product is planned to be a Radeon Vega GPU specifically built for machine learning applications.
AMD CES 2018 Technology Updates
AMD CTO and SVP Mark Papermaster shared updates on AMD’s process technology roadmaps for both x86 processors and graphics architectures.
x86 Processors
The “Zen” core, currently shipping in Ryzen desktop and mobile processors, is in production at both 14nm and 12nm, with 12nm samples now shipping.
The “Zen 2” design is complete and will improve on the award-winning “Zen” design in multiple dimensions.
Graphics Processors
Expanding the “Vega” product family in 2018 with the Radeon Vega Mobile GPU for ultrathin notebooks.
The first 7nm AMD product, a “Vega” based GPU built specifically for machine learning applications.
A production-level machine learning software environment, with AMD’s MIOpen libraries supporting common machine learning frameworks like TensorFlow and Caffe on the ROCm Open eCosystem platform – the industry’s first fully open heterogeneous software environment, which makes it easier to program AMD GPUs for high performance compute and deep learning.
AMD CES 2018 Client Compute Updates
AMD SVP and General Manager, Computing and Graphics Business Group, Jim Anderson detailed upcoming AMD client compute processors including:
The Ryzen desktop processor with Radeon graphics
Desktop Ryzen APUs combine the latest “Zen” core and AMD Radeon graphics engine based on the advanced “Vega” architecture, bringing:
The highest performance graphics engine in a desktop processor[i]
Advanced quad core performance with up to 8 processing threads
1080p HD+ gaming performance without a discrete graphics card
Beautiful display features with Radeon™ FreeSync technology[ii]
Full benefit of Radeon software driver features including Radeon Chill, Enhanced Sync and Radeon ReLive
Planned to be available starting February 12, 2018.
2nd Generation Ryzen Desktop Processors
AMD’s first 12nm-based processor, with Precision Boost 2 technology.
Scheduled for introduction April 2018.
Ryzen PRO Mobile Processors with Radeon Vega Graphics
Targeted at commercial, enterprise, and public sector deployments, Ryzen PRO mobile processors are designed to power sleek and powerful enterprise notebooks, featuring the world’s fastest processor for commercial ultrathin notebooks[iv], state-of-the-art silicon-level security, and reliable solutions with enterprise-class support and top-to-bottom DASH manageability across the product stack.
Up to 22% higher productivity performance than the competition[v]
Up to 125% higher graphics performance than the Intel i7-8550U, and 150% higher than the Intel i7-7500U[vi]
AMD expanded the Ryzen Mobile Processor family, which features the world’s fastest processor for ultrathin notebooks[iv], with the introduction of the Ryzen 3 mobile processor.
AMD discussed its first mobile discrete graphics solution based on the “Vega” architecture. This razor-thin Radeon Vega Mobile GPU is designed to enable new, powerful gaming notebooks in 2018 with extraordinary performance and incredible efficiency.
AMD also announced that Radeon Software will add support for HDMI 2.1 Variable Refresh Rate (VRR) technology on Radeon RX products in an upcoming driver release. This support will come as an addition to the Radeon FreeSync technology umbrella, as displays with HDMI 2.1 VRR support reach market.
Ubisoft announced that Far Cry 5 will support Radeon RX Vega-specific features like Rapid Packed Math as well as Radeon FreeSync 2 technology. Radeon RX Vega owners will be able to enjoy Far Cry 5 at exceptional fidelity, with stunning frame rates and beautiful image quality.
AMD CES 2018 Footnotes
[i] Testing by AMD Performance labs as of 12/08/2017 for the Ryzen 5 2400G, and 09/04/2015 for the Core i7-5775c on the following systems. PC manufacturers may vary configurations yielding different results. Results may vary based on driver versions used.
System Configs: All systems equipped with Samsung 850 PRO 512GB SSD, Windows 10 RS2 operating system. Socket AM4 System: Ryzen 5 2400G processor, 16GB (2 x 8GB) DDR4-2667 RAM, Graphics Driver 1710181048-17.40-171018a-319170E 23.20.768.0 :: 12/08/2017. Socket LGA1150 System: Core i7-5775c processor, 8GB (2x4GB) DDR3-1867 MHz RAM, graphics driver 10.18.15.4256 :: 09/04/2015. 3DMark 11 Performance benchmark used to represent graphics power. In the 3DMark 11 ‘performance’ benchmark v1.0.132.0, the Ryzen 5 2400G scored 5042, while the Core i7-5775c (the Intel desktop processor with the highest Intel desktop graphics performance) achieved 3094. RZG-01
[ii] FreeSync 2 does not require HDR capable monitors; driver can set monitor in native mode when FreeSync 2 supported HDR content is detected. Low-latency HDR only attainable when using a FreeSync 2 API enabled game or video player and content that uses at least 2x the perceivable brightness and color range of sRGB, and using a FreeSync 2 qualified monitor. Based on AMD internal testing as of November 2016. GD-105.
[iii] AMD Radeon and FirePro GPUs based on the Graphics Core Next architecture consist of multiple discrete execution engines known as a Compute Unit (“CU”). Each CU contains 64 shaders (“Stream Processors”) working together. GD-78
[iv] “Processor for ultrathin notebooks” defined as 15W nominal processor TDP. Based on testing of the AMD Ryzen 7 PRO 2700U, AMD Ryzen 5 PRO 2500U, and Core i7-8550U mobile processors as of 10/6/2017. Performance based on Cinebench R15 nT and 3DMark® TimeSpy, in order of AMD Ryzen 7 PRO 2700U and Intel 8550U. Cinebench R15 nT results: 660.5, 498.2; 3DMark TimeSpy results: 978, 350. 50:50 CPU:GPU weighted relative performance with i7 baseline: Intel i7-8550U = (498.2/498.2*.5) + (350/350*.5) = 100%; AMD Ryzen 7 PRO 2700U = (660.5/498.2*.5) + (978/350*.5) = 206%. AMD Ryzen 7 PRO 2700U Processor: HP 83C6, AMD Ryzen 7 PRO 2700U Processor with Radeon Vega 10 Graphics, 8GB Dual Channel (2x4GB) DDR4-2400 RAM, Samsung 850 PRO 512GB SATA SSD, Windows 10 Pro RS2, Graphics driver 22.19.655.0, 12-Sep-2017. i7-8550U: KBL Woody_KL, i7-8550U with Intel UHD Graphics 620, 8GB Dual Channel (2x4GB) DDR4-2400 RAM, MTFDDAV256TBN M.2 SATA SSD, Windows 10 Pro RS2, Graphics driver 22.20.16.4771, 12-Aug-2017. Different configurations and drivers may yield different results. RPM-6
[v] Testing by AMD Performance labs. PCMark 10 Extended is used to simulate productivity performance; the AMD Ryzen 7 PRO 2700U scored 3102, while the Intel i7-8550U scored 2533 for a benchmark score comparison of 3102/2533 = 1.22X or 22% faster. AMD Ryzen™ 7 PRO 2700U: HP 83C6, AMD Ryzen™ 7 PRO 2700U with Radeon Vega 10 Processor Graphics, 8GB DDR4-2400 RAM, Samsung 850 PRO 512GB SATA SSD, Windows 10 Pro RS2, Graphics driver 22.19.655.2, 06-Sep-2017. Core i7-8550U: Acer Spin 5, Core i7-8550U with Intel UHD Graphics 620, 8GB DDR4-2400 RAM, MTFDDAV256TBN – M.2 Sata SSD, Windows 10 Pro RS2, Graphics driver 22.20.16.4771, 12-Aug-2017. Different configurations and drivers may yield different results. RPM-2
[vi] Testing by AMD Performance labs. 3DMark 11 Performance is used to simulate system performance; the AMD Ryzen 7 PRO 2700U scored 4357, while the Intel i7-8550U scored 1937 for a benchmark score comparison of 4357/1937 = 2.25X or 125% faster and the Intel i7-7500U scored 1743 for benchmark score comparison of 4357/1743 = 2.50X or 150% faster. AMD Ryzen 7 PRO 2700U: HP Envy x360 @25W, AMD Ryzen 7 PRO 2700U Processor with Radeon Vega 10 Graphics, 8GB DDR4-2400 RAM, Samsung 850 PRO 512GB SATA SSD, Windows 10 Pro RS2, Graphics driver 22.19.655.2, 06-Sep-2017. Intel Core i7-8550U: Acer Swift 3, Intel Core i7-8550U with Intel UHD Graphics 620 @15W, 16GB DDR4-2400 RAM, Samsung 850 PRO 512GB SATA SSD, Windows 10 Pro RS2, Graphics driver 22.20.16.4771, 12-Aug-2017. Intel Core i7-7500U: HP Envy x360, Intel Core i7-7500U with Intel HD Graphics 620 @15W, 8GB DDR4-2400 RAM, Samsung 850 PRO 512GB SATA SSD, Windows 10 Pro RS2, Graphics driver 22.20.16.4691 , 01-Jun-2017. Different configurations and drivers may yield different results. RPM-1
[vii] Based on AMD testing as of 10/11/2017. Battery life targets for the AMD Ryzen PRO Processor with Radeon Graphics assume a fully power-optimized software/hardware solution stack, and the following system configuration: AMD Reference Platform, AMD Ryzen 7 PRO 2700U, 2x4GB DDR4-2400, graphics driver 17.30.1025, Windows 10 x64 (1703). Assuming a 50 Wh battery capacity, MobileMark 14 battery life for the Ryzen 7 PRO 2700U playback is estimated at 13.5 hours. Different configurations and drivers may yield different results. RPM-5
[viii] AMD SenseMI technology is built into all Ryzen processors, but specific features and their enablement may vary by product and platform. mXFR enablement must meet AMD requirements. Not enabled on all notebook designs. Check with manufacturer to confirm “amplified mXFR performance” support.
[ix] mXFR enablement must meet AMD requirements. Not enabled on all notebook designs. Check with manufacturer to confirm “amplified mXFR performance” support.
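The relative-performance figures in footnotes [iv] through [vi] can be reproduced with a few lines of arithmetic. This is a sketch of the calculations using the benchmark scores quoted above, not AMD's actual benchmark tooling:

```python
# Reproduce the relative-performance arithmetic from footnotes [iv]-[vi].
# All scores below are the benchmark results quoted in the footnotes.

def percent_faster(score_a: float, score_b: float) -> float:
    """How much faster A is than B, as a percentage."""
    return (score_a / score_b - 1.0) * 100.0

def weighted_relative(cpu_a, gpu_a, cpu_b, gpu_b, cpu_weight=0.5):
    """50:50 CPU:GPU weighted performance of A relative to baseline B."""
    gpu_weight = 1.0 - cpu_weight
    return (cpu_a / cpu_b) * cpu_weight + (gpu_a / gpu_b) * gpu_weight

# Footnote [iv]: Cinebench R15 nT (CPU) and 3DMark Time Spy (GPU) scores
ryzen = weighted_relative(660.5, 978, 498.2, 350)
print(f"Ryzen 7 PRO 2700U weighted relative performance: {ryzen:.0%}")  # 206%

# Footnote [v]: PCMark 10 Extended scores
print(f"Productivity: {percent_faster(3102, 2533):.0f}% faster")        # 22%

# Footnote [vi]: 3DMark 11 Performance scores
print(f"vs i7-8550U: {percent_faster(4357, 1937):.0f}% faster")         # 125%
print(f"vs i7-7500U: {percent_faster(4357, 1743):.0f}% faster")         # 150%
```

The 50:50 weighting simply averages the CPU and GPU score ratios against the Intel baseline, which is why the baseline itself always works out to 100%.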
At IFA 2017, one of the most exciting products that Acer revealed was the 18-core, quad-GPU behemoth known as the Acer Predator Orion 9000! We had the opportunity recently to take a close look at the Orion 9000, and get updated on its price and specifications.
The Acer Predator Orion 9000
The Acer Predator Orion 9000 is a massive HEDT (High-End Desktop) PC that boasts an 18-core processor, and up to four graphics cards. It has a silver-and-black chassis with a spacecraft design and customisable RGB lighting along the sides of the front bezel.
To showcase the powerful hardware inside, it has a massive side window panel with a metal mesh that keeps its electromagnetic interference (EMI) in check. Up to 5 fans can be installed, with customisable RGB lighting to create a virtual light show. The two side panels are also tool-less, allowing for easy upgrades.
Despite its size, Acer made it easy to move – it is outfitted with two handles and wheels covered with a carbon fibre pattern. It also has a headset cradle and cable management to keep everything tidy during and after the move.
Liquid Cooling & IceTunnel 2.0
The Predator Orion 9000 features AIO liquid cooling for the CPU, and Acer IceTunnel 2.0 to keep the temperature down even in the most demanding conditions. IceTunnel 2.0 is an advanced airflow management design that separates the system into several thermal zones, each with an individual “airflow tunnel” to expel heat.
There are huge metal mesh panels on the front and top that allow more cold air in, and the rising hot air of the liquid-cooled CPU out. Up to five 120 mm fans can be installed in the front, top, and back to channel cool air through the chassis.
Part of the airflow is redirected towards the back of the motherboard tray to cool the storage devices. The graphics cards feature blower-style fans to drive the heat out from the back, while the PSU is self-contained to avoid thermal interference.
Within 24 hours of Intel announcing the integration of Vega into the 8th Gen Intel Core processor, Raja Koduri, the Radeon Technologies Group SVP and Chief Architect, announced that he was leaving AMD. 24 hours later, Intel announced his appointment as the head of the new Intel CVC group that will focus on such products!
Confused? Let us summarise his recent moves…
Summary : Raja Koduri Leaves AMD To Head Intel CVC Group
For those who missed the details, here is a summary of the recent events leading to the announcement that Raja Koduri was leaving AMD and heading to Intel.
13 September 2017
Raja Koduri began his sabbatical from AMD, with a target return date in December. In his absence, AMD CEO Dr. Lisa Su took charge of the Radeon Technologies Group. The date also marked two years since he took charge of the newly-formed Radeon Technologies Group.
7 November 2017
Raja Koduri announced his departure from AMD. Rumours have it that he would be announcing a leading role in Intel shortly.
He also switched his Twitter handle, with his old account now called Fake Raja Koduri.
8 November 2017
Intel announced that Raja Koduri had been appointed as Intel Chief Architect, Senior Vice President of the newly-formed Intel Core and Visual Computing Group, and General Manager of “a new initiative to drive edge computing solutions”.
For those who want more details, the article continues…
Raja Koduri Takes A Sabbatical
On 13 September 2017, Fudzilla and Tweaktown reported that Raja Koduri was taking a sabbatical from the Radeon Technologies Group, right after the launch of the AMD Radeon RX Vega graphics cards. This was the letter he sent to his team (as provided by PC Perspective):
Raja Koduri's Sabbatical Letter To The RTG Team
RTG Team,
You haven’t heard from me collectively in a while – a symptom not only of the whirlwind of launching Vega, but simply of the huge number of demands on my time since the formation of RTG. Looking back over this short period, it is an impressive view. We have delivered 6 straight quarters of double-digit growth in graphics, culminating in the launch of Vega and being back in high-performance. What we have done with Vega is unparalleled. We entered the high-end gaming, professional workstation and machine intelligence markets with Vega in a very short period of time. The demand for Vega (and Polaris!) is fantastic, and overall momentum for our graphics is strong.
Incredibly, we as AMD also managed to spectacularly re-enter the high-performance CPU segments this year. We are all exceptionally proud of Ryzen, Epyc and Threadripper. The computing world is not the same anymore and the whole world is cheering for AMD. Congratulations and thanks to those of you in RTG who helped see these products through. The market for high-performance computing is on an explosive growth trajectory driven by machine intelligence, visual cloud, blockchain and other exciting new workloads. Our vision of immersive and instinctive computing is within grasp. As we enter 2018, I will be shifting my focus more toward architecting and realizing this vision and rebalancing my operational responsibilities.
At the beginning of the year I warned that Vega would be hard. At the time, some folks didn’t believe me. Now many of you understand what I said. Vega was indeed hard on many, and my sincere heartfelt thanks to all of you who endured the Vega journey with me. Vega was personally hard on me as well and I used up a lot of family credits during this journey. I have decided to take a time-off in Q4 to spend time with my family. I have been contemplating this for a while now and there was never a good time to do this. Lisa and I agreed that Q4 is better than 2018, before the next wave of product excitement. Lisa will be acting as the leader of RTG during my absence. My sincere thanks to Lisa and rest of AET for supporting me in this decision and agreeing to take on additional workload during my absence.
I am looking to start my time-off on Sept 25th and return in December.
Thank you, all of you, for your unwavering focus, dedication and support over these past months, and for helping us to build something incredible. We are not done yet, and keep the momentum going!
Regards, Raja
Raja Koduri Leaves AMD
On 7 November, HEXUS reported that Raja Koduri was leaving AMD, and shared his final memo to the Radeon Technologies Group :
Raja Koduri's Final Letter To The AMD Family
To my AMD family,
Forty is a significant number in history. It is a number representing transition, testing and change. I have just spent forty days away from the office going through such a transition. It was an important time with my family, and it also offered me a rare space for reflection. During this time I have come to the extremely difficult conclusion that it is time for me to leave RTG and AMD.
I have no question in my mind that RTG, and AMD, are marching firmly in the right direction as high-performance computing becomes ever-more-important in every aspect of our lives. I believe wholeheartedly in what we are doing with Vega, Navi and beyond, and I am incredibly proud of how far we have come and where we are going. The whole industry has stood up and taken notice of what we are doing. As I think about how computing will evolve, I feel more and more that I want to pursue my passion beyond hardware and explore driving broader solutions.
I want to thank Lisa and the AET for enabling me to pursue my passion during the last four years at AMD, and especially the last two years with RTG. Lisa has my utmost respect for exhibiting the courage to enable me with RTG, for believing in me and for going out of her way to support me. I would also like to call out Mark Papermaster who brought me into AMD, for his huge passion for technology and for his relentless support through many difficult phases. And of course, I want to thank each and every one of my direct staff and my indirect staff who have worked so hard with me to build what we have now got. I am very proud of the strong leaders we have and I’m fully confident that they can execute on the compelling roadmap ahead.
I will continue to be an ardent fan and user of AMD technologies for both personal and professional use.
As I mentioned, leaving AMD and RTG has been an extremely difficult decision for me. But I felt it is the right one for me personally at this point. Time will tell. I will be following with great interest the progress you will make over the next several years.
On a final note, I have asked a lot of you in the last two years. You’ve always delivered. You’ve made me successful both personally and professionally, for which I thank you all from the bottom of my heart. I have these final requests from you as I leave:
. Stay focused on the roadmap!
. Deliver on your commitments!
. Continue the culture of Passion, Persistence and Play!
. Make AMD proud!
. Make me proud!
Yours,
Raja
Raja Koduri To Head New Intel CVC Group
On 8 November, 2017, Intel announced Raja Koduri’s appointment as Intel Chief Architect, Senior Vice President of the newly-formed Intel Core and Visual Computing Group, and General Manager of “a new initiative to drive edge computing solutions”. He will officially start in his new role at Intel in early December.
Intel's Announcement About Raja Koduri's Appointment To Head Intel CVC Group
SANTA CLARA, Calif., Nov. 8, 2017 – Intel today announced the appointment of Raja Koduri as Intel chief architect, senior vice president of the newly formed Core and Visual Computing Group, and general manager of a new initiative to drive edge computing solutions. In this position, Koduri will expand Intel’s leading position in integrated graphics for the PC market with high-end discrete graphics solutions for a broad range of computing segments.
Billions of users today enjoy computing experiences powered by Intel’s leading cores and visual computing IP. Going forward under Koduri’s leadership, the company will unify and expand differentiated IP across computing, graphics, media, imaging and machine intelligence capabilities for the client and data center segments, artificial intelligence, and emerging opportunities like edge computing.
“Raja is one of the most experienced, innovative and respected graphics and system architecture visionaries in the industry and the latest example of top technical talent to join Intel,” said Dr. Murthy Renduchintala, Intel’s chief engineering officer and group president of the Client and Internet of Things Businesses and System Architecture. “We have exciting plans to aggressively expand our computing and graphics capabilities and build on our very strong and broad differentiated IP foundation. With Raja at the helm of our Core and Visual Computing Group, we will add to our portfolio of unmatched capabilities, advance our strategy to lead in computing and graphics, and ultimately be the driving force of the data revolution.”
Koduri brings to Intel more than 25 years of experience in visual and accelerated computing advances across a broad range of platforms, including PCs, game consoles, professional workstations and consumer devices. His deep technical expertise spans graphics hardware, software and system architecture.
“I have admired Intel as a technology leader and have had fruitful collaborations with the company over the years,” Koduri said. “I am incredibly excited to join the Intel team and have the opportunity to drive a unified architecture vision across its world-leading IP portfolio that helps accelerate the data revolution.”
Koduri, 49, joins Intel from AMD, where he most recently served as senior vice president and chief architect of the Radeon Technologies Group. In this role, he was responsible for overseeing all aspects of graphics technologies used in AMD’s APU, discrete GPU, semi-custom and GPU compute products. Prior to AMD, Koduri served as director of graphics architecture at Apple Inc., where he helped establish a leadership graphics sub-system for the Mac product family and led the transition to Retina computer displays.
Koduri will officially start in his new role at Intel in early December.
With the launch of the NVIDIA GeForce GTX 1070 Ti, rumours have been swirling around the GeForce GTX 1070. Some say NVIDIA is dropping it from the market. Others say that there is a GeForce GTX 1070 price cut. However, the rumours are all false. It is very much alive, and NVIDIA has no plans to cut its price.
Yes, you read that correctly – there was no GeForce GTX 1070 price cut. Not now, not even back in March. NVIDIA never announced any price adjustments of the GeForce GTX 1070.
In this article, we will clarify the history and price cuts of the NVIDIA GeForce 10 Series graphics cards. Please share this out so people won’t get confused by the earlier reports of a GeForce GTX 1070 price cut.
Updated @ 2017-11-02 : Added The History of GeForce 10 Series Prices. Clarified the GeForce GTX 1070 price cut rumours.
Updated @ 2017-10-31 : Added a clarification by NVIDIA on the GeForce GTX 1070 price cut. Updated the article with the new details, and some corrections.
Originally posted @ 2017-10-29
Was There A GeForce GTX 1070 Price Cut? Updated!
When NVIDIA announced the GeForce GTX 1080 Ti on 28 February 2017, they also announced price cuts for the GeForce GTX 1080 Founders Edition and GeForce GTX 1070 Founders Edition. At that time, it was mentioned in many articles that NVIDIA also cut the prices of the regular GeForce GTX 1080 and GeForce GTX 1070 cards.
Those articles are only partly true. The fact is, of the regular (non-Founders Edition) cards, only the GeForce GTX 1080 received a price cut. There was never a price cut for the regular GeForce GTX 1070, even back in March 2017.
Coming to the present, with the release of the GeForce GTX 1070 Ti graphics card, there is no accompanying GeForce GTX 1070 price cut either. The GeForce GTX 1070 remains priced at $399 (Founders Edition) with a base price of $379.
Factoid : $449 was the original price of the GeForce GTX 1070 Founders Edition, when NVIDIA launched it on 10 June 2016. The base price for the GeForce GTX 1070 was $379, and has remained so until today.
Even without a price adjustment, the GeForce GTX 1070 and the new GeForce GTX 1070 Ti bracket the Radeon RX Vega 56, which is priced at $399. Here’s a quick comparison of their key specifications and prices.
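That bracketing is easy to see when the quoted prices are laid out side by side; a minimal sketch using the base prices quoted in this article:

```python
# Base prices quoted in this article (USD, as of November 2017).
prices = {
    "GeForce GTX 1070":    379,
    "Radeon RX Vega 56":   399,
    "GeForce GTX 1070 Ti": 449,
}

# Sorting by price shows the two GeForce cards bracketing the Vega 56.
for card, price in sorted(prices.items(), key=lambda kv: kv[1]):
    print(f"${price:>4}  {card}")
```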
With help from Bryan Del Rizzo, Global PR Manager, NVIDIA GeForce, here is a table that summarises the history of the GeForce 10 Series prices.
| Model | Launch Date | Launch Price | Price Cut Date | Current Price | Difference |
|---|---|---|---|---|---|
| GeForce GTX 1080 Ti Founders Edition | 10 March 2017 | $699 | Never | $699 | – |
| GeForce GTX 1080 Ti | 10 March 2017 | $699 | Never | $699 | – |
| GeForce GTX 1080 Founders Edition | 27 May 2016 | $699 | 28 February 2017 | $549 | $150 |
| GeForce GTX 1080 | 27 May 2016 | $599 | 28 February 2017 | $499 | $100 |
| GeForce GTX 1070 Ti Founders Edition | 2 November 2017 | $449 | Never | $449 | – |
| GeForce GTX 1070 Ti | 2 November 2017 | $449 | Never | $449 | – |
| GeForce GTX 1070 Founders Edition | 10 June 2016 | $449 | 28 February 2017 | $399 | $50 |
| GeForce GTX 1070 | 10 June 2016 | $379 | Never | $379 | – |
| GeForce GTX 1060 6GB Founders Edition | 19 July 2016 | $299 | Never | $299 | – |
| GeForce GTX 1060 6GB | 19 July 2016 | $249 | Never | $249 | – |
| GeForce GTX 1060 3GB | 18 August 2016 | $199 | Never | $199 | – |
| GeForce GTX 1050 Ti | 25 October 2016 | $139 | Never | $139 | – |
| GeForce GTX 1050 | 25 October 2016 | $109 | Never | $109 | – |
| GeForce GT 1030 | 17 May 2017 | $80 | Never | $80 | – |
As you can now see, only three GeForce 10 series graphics cards were ever discounted – the GeForce GTX 1080 Founders Edition, the regular GeForce GTX 1080, and the GeForce GTX 1070 Founders Edition. They saw price cuts from $50 to $150.
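The "Difference" column above is simply launch price minus current price; a quick sketch that derives it for the three discounted cards, using the figures from the table:

```python
# (card, launch_price, current_price) for the three cards that were cut,
# using the figures from the price table above.
cuts = [
    ("GeForce GTX 1080 Founders Edition", 699, 549),
    ("GeForce GTX 1080",                  599, 499),
    ("GeForce GTX 1070 Founders Edition", 449, 399),
]

for card, launch, current in cuts:
    print(f"{card}: cut by ${launch - current}")
```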
Bracketing The Vega 56
Although many critics say that the GeForce GTX 1070 Ti is pointless or stupid, it actually makes a lot of sense for NVIDIA. Even without a GeForce GTX 1070 price cut, gamers now face these options:
The road to Vega has been a rather long one. We first saw the AMD Vega prototype running DOOM in December 2016. Not unlike a baby, it took AMD nine months to give birth to the AMD Radeon RX Vega. Today, we are going to take a close look at the AMD Radeon RX Vega 64 (Price Check) graphics card. Let’s see how it performs!
The AMD Radeon RX Vega 64 Up Close
The AMD Radeon RX Vega 64 (Price Check) looks exactly like the Radeon RX Vega 56 (Price Check). In fact, the only way to tell them apart is a small label on the back of the card. Let’s take a quick tour of the Vega 64 up close!
AMD dumped the dual-link DVI port starting with the Radeon RX 480. So it’s no surprise to see that the AMD Radeon RX Vega 64 (Price Check) lacks a dual-link DVI port. It has three DisplayPorts and a single HDMI 2.0b port instead.
The Vega 64 Power Consumption
The AMD Radeon RX Vega 64 (Price Check) uses the Vega 10 GPU with 12.5 billion transistors fabricated using the 14 nm FinFET process technology. That’s 500 million transistors more than the NVIDIA GP102 GPU powering the NVIDIA GeForce GTX 1080 Ti.
Even though it’s using a finer process technology than NVIDIA, the Radeon RX Vega 64 (Price Check) has a significantly higher TDP of 295 W – 45 W (18%) more than the GeForce GTX 1080 Ti.
AMD equipped the Radeon RX Vega 64 (Price Check) with two 8-pin PCI Express power connectors for a peak power draw of 375 W. This will come in handy if you intend to overclock, although keeping the GPU cool will be a challenge.
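The 375 W peak figure follows from PCI Express power delivery limits: 75 W from the x16 slot plus 150 W from each 8-pin connector. A quick sanity check of that figure, together with the TDP gap mentioned above:

```python
# PCI Express power delivery limits (per the PCIe specifications).
SLOT_POWER_W = 75        # PCIe x16 slot
EIGHT_PIN_W  = 150       # each 8-pin PCIe power connector

peak_draw = SLOT_POWER_W + 2 * EIGHT_PIN_W
print(f"Peak board power budget: {peak_draw} W")          # 375 W

# TDP comparison quoted above: Vega 64 at 295 W vs GTX 1080 Ti at 250 W.
vega64_tdp, gtx1080ti_tdp = 295, 250
delta = vega64_tdp - gtx1080ti_tdp
print(f"Vega 64 draws {delta} W ({delta / gtx1080ti_tdp:.0%}) more")
```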
GPUTach
There is a GPUTach LED light strip right next to the two 8-pin PCI Express power connectors, which tells you the GPU load at a glance. There are two switches nearby that allow you to turn it on or off, and to switch between the red and blue LED colours.
Note that these are not the recorded temperatures, but how much hotter the exhaust air is above ambient temperature.
With its higher TDP, it’s no surprise that the Radeon RX Vega 64 (Price Check) has the highest exhaust temperature in this comparison – 2.8 °C higher than the GeForce GTX 1080 Ti, and 5.4 °C higher than Vega 56 (Price Check).
The Vega 64 Noise Level
Needless to say, you will be wondering about the noise level of the more powerful cooler used to keep the Radeon RX Vega 64 (Price Check) cool. In this video, we recorded the Radeon RX Vega 64 running the Ashes of the Singularity benchmark at the 4K resolution.
Benchmarking Notes
Our graphics benchmarking test bed has the following specifications :
3DMark DirectX 12 Benchmark (2560 x 1440)
3DMark Time Spy is the DirectX 12 benchmark in 3DMark. It supports new API features like asynchronous compute, explicit multi-adapter, and multi-threading.
This is torture, even for high-end graphics cards.
The AMD Radeon RX Vega 64 (Price Check) held its place at this extremely high resolution, maintaining its 14% lead over the Vega 56 (Price Check), and slightly extending its performance lead over the GeForce GTX 1070 to 27.5%. The NVIDIA GeForce GTX 1080 Ti was 28% faster than the Vega 64 at this resolution.
Ashes of the Singularity (1920 x 1080)
We tested Ashes of the Singularity in the DirectX 12 mode, which supports the Asynchronous Compute feature. We started with the full HD resolution.
Ashes of the Singularity (2560 x 1440)
We then took Ashes of the Singularity up a notch to the resolution of 2560 x 1440. Let’s see how the cards fare now…
At this higher resolution, the GeForce GTX 1080 Ti‘s pixel fill rate advantage allowed it to creep past the Vega 64 (Price Check) and Vega 56 (Price Check). The two AMD Vega cards were now 9% and 7.4% faster than the GTX 1070 respectively.
Ashes of the Singularity (3840 x 2160)
Finally, let’s see how the cards perform with Ashes of the Singularity running at the Ultra HD resolution of 3840 x 2160.
At this ultra-high resolution, the NVIDIA GeForce GTX 1080 Ti gained a small 4% performance advantage over the Radeon RX Vega 64 (Price Check), and 16% over the Vega 56 (Price Check). The two AMD Vega cards were 24% and 11% faster than the GTX 1070 respectively.
Warhammer (1920 x 1080)
Total War : Warhammer is another game that supports Asynchronous Compute. This chart shows you the minimum and maximum frame rates, as well as the average frame rate, recorded by Total War : Warhammer‘s internal DirectX 12 benchmark.
At this resolution, most fast graphics cards are CPU-limited. Thanks to Asynchronous Compute, both the Vega 64 (Price Check) and Vega 56 (Price Check) were able to beat the GeForce GTX 1080 Ti by 8% and 6.5% respectively, and the GTX 1070 by 10% and 9%.
Warhammer (2560 x 1440)
This chart shows you the minimum and maximum frame rates, as well as the average frame rate, recorded by Total War : Warhammer‘s internal DirectX 12 benchmark.
At this higher resolution, the GeForce GTX 1080 Ti was now right in between the Vega 64 (Price Check) and Vega 56 (Price Check) in performance, with a small 3% gap either way. The two AMD Vega cards were now 8% and 1.5% faster than the GTX 1070 respectively.
Warhammer (3840 x 2160)
This chart shows you the minimum and maximum frame rates, as well as the average frame rate, recorded by Total War : Warhammer‘s internal DirectX 12 benchmark.
This ultra-high resolution proved too much for the AMD Vega cards, with the GeForce GTX 1080 Ti the only card capable of delivering an average frame rate above 60 fps. The GTX 1080 Ti was now 41% faster than the Vega 64, and 60% faster than the Vega 56.
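The percentages here and throughout this review are simple ratios of average frame rates. A sketch of the calculation, using hypothetical frame rates chosen purely for illustration (the charts' actual values are not reproduced in the text):

```python
def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster card A is than card B, from average frame rates."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical average frame rates, for illustration only.
avg_fps = {"GTX 1080 Ti": 62.0, "Vega 64": 44.0, "Vega 56": 38.8}

lead_64 = percent_faster(avg_fps["GTX 1080 Ti"], avg_fps["Vega 64"])
lead_56 = percent_faster(avg_fps["GTX 1080 Ti"], avg_fps["Vega 56"])
print(f"{lead_64:.0f}% faster than the Vega 64")   # 41%
print(f"{lead_56:.0f}% faster than the Vega 56")   # 60%
```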
The Witcher 3 (1920 x 1080)
This chart shows you the minimum and maximum frame rates, as well as the average frame rate, that FRAPS recorded in The Witcher 3.
The Witcher 3 (2560 x 1440)
This chart shows you the minimum and maximum frame rates, as well as the average frame rate, that FRAPS recorded in The Witcher 3.
At 1440p, the Vega 64’s performance lead over the GeForce GTX 1070 dropped slightly to 15%. It was now 32% slower than the GeForce GTX 1080 Ti.
The Witcher 3 (3840 x 2160)
This chart shows you the minimum and maximum frame rates, as well as the average frame rate, that FRAPS recorded in The Witcher 3.
At this Ultra HD resolution, the Radeon RX Vega 64 (Price Check) maintained its 15% performance lead over the GeForce GTX 1070. It was now 35% slower than the GeForce GTX 1080 Ti, which was the only card to maintain an average frame rate in excess of 60 fps.
For Honor (1920 x 1080)
This chart shows you the minimum and maximum frame rates, as well as the average frame rate, recorded by For Honor‘s internal DirectX 12 benchmark.
For Honor (3840 x 2160)
This chart shows you the minimum and maximum frame rates, as well as the average frame rate, recorded by For Honor‘s internal DirectX 12 benchmark.
At this Ultra HD resolution, even the GeForce GTX 1080 Ti could not maintain an average frame rate above 60 fps. But it was now 43% faster than the Vega 64. The Radeon RX Vega 64 (Price Check) was now 12% faster than the GTX 1070, and 17% faster than the Vega 56 (Price Check).
Mass Effect: Andromeda (1920 x 1080)
Mass Effect: Andromeda is another game that supports Asynchronous Compute. This chart shows you the minimum and maximum frame rates, as well as the average frame rate, that FRAPS recorded in Mass Effect: Andromeda.
All six cards did well, delivering average frame rates in excess of 70 fps. In fact, the higher-end cards – the GeForce GTX 1070 onwards – were obviously CPU-limited at this resolution.
Thanks to Asynchronous Compute, the Radeon RX Vega 64 (Price Check) was able to beat the GeForce GTX 1080 Ti by 7%, albeit at this low resolution. It was also 11% faster than the Vega 56 (Price Check), and 12% faster than the GTX 1070.
Mass Effect: Andromeda (2560 x 1440)
This chart shows you the minimum and maximum frame rates, as well as the average frame rate, that FRAPS recorded in Mass Effect: Andromeda.
With the jump in resolution, the GeForce GTX 1080 Ti barely registered a drop in its average frame rate, allowing it to leapfrog over the Radeon RX Vega 64 and beat it by 26%. The Vega 64 (Price Check) was now 7% faster than the GTX 1070, and 14% faster than the Vega 56 (Price Check).
Mass Effect: Andromeda (3840 x 2160)
This chart shows you the minimum and maximum frame rates, as well as the average frame rate, that FRAPS recorded in Mass Effect: Andromeda.
Increasing the resolution to 4K UHD greatly sapped the average frame rate of all six cards. Only the GeForce GTX 1080 Ti managed to achieve an average frame rate of about 60 fps. It was now 48% faster than the Vega 64 (Price Check).
The Vega 64, on the other hand, was 10% faster than the GTX 1070, and 16% faster than the Vega 56 (Price Check).
Our Verdict
From the moment we saw AMD Vega running DOOM, we had high hopes for it. The success of the AMD Ryzen family of processors only buoyed that optimism. However, the AMD Vega turned out not to be the Pascal-killer we thought it would be.
On paper, the AMD Radeon RX Vega 64 (Price Check) has considerable advantages over even the NVIDIA GeForce GTX 1080 Ti. It has a 34% advantage in texture fill rate, and a 51% advantage in memory bandwidth. The GeForce GTX 1080 Ti only beats it in pixel fill rate by 20%.
In real life though, the Radeon RX Vega 64 (Price Check) was significantly slower than the GeForce GTX 1080 Ti… unless the game supports Asynchronous Compute. In those DirectX 12 games, the Vega 64 actually beats the GeForce GTX 1080 Ti, albeit at lower resolutions.
Of course, the GeForce GTX 1080 Ti costs more than the Vega 64, but it bears remembering that numbers on paper do not always translate into real world performance. It also showcases how important certain software features are in delivering better performance.
At $499, the Vega 64 (Price Check) is frankly a bit overpriced, because it is, on average, only 13% faster than the Vega 56 (Price Check), which is $100 (20%) cheaper! The price will become even harder to justify when NVIDIA launches the GeForce GTX 1070 Ti for just $429.
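The value argument can be made concrete with a quick performance-per-dollar sketch. The prices and the 13% average performance gap are from this review; the baseline performance index is arbitrary:

```python
# Relative value of the Vega 64 vs the Vega 56, using a 13% average performance gap
vega56_price, vega64_price = 399.0, 499.0
vega56_perf = 100.0               # arbitrary performance index for the Vega 56
vega64_perf = vega56_perf * 1.13  # Vega 64 is ~13% faster on average

value56 = vega56_perf / vega56_price  # performance per dollar
value64 = vega64_perf / vega64_price

print(f"Vega 56 offers {value56 / value64:.2f}x the performance per dollar")
```

By this rough measure, the Vega 56 delivers about 11% more performance per dollar than the Vega 64.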
The Radeon RX Vega 64 also has the disadvantage of a significantly higher power consumption at 295 watts. It also requires at least a 750W power supply with two 8-pin PCI Express power cables.
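Those connector requirements follow from the PCI Express power limits: the slot supplies up to 75 W and each 8-pin connector up to 150 W. A quick budget check against the Vega 64's 295 W board power:

```python
# Power budget: PCIe slot plus two 8-pin connectors vs the Vega 64's rated board power
SLOT_W = 75        # max power drawn from the PCIe x16 slot
EIGHT_PIN_W = 150  # max power per 8-pin PCIe power connector
TDP_W = 295        # Radeon RX Vega 64 board power

available = SLOT_W + 2 * EIGHT_PIN_W
print(f"{available} W available, {available - TDP_W} W headroom over the {TDP_W} W TDP")
```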
That said, the Radeon RX Vega 64 (Price Check) is a step in the right direction. It is about 51% faster (on average) than the Radeon RX 580, which is based on the last generation AMD Polaris architecture. It may not have met our overly high expectations, but it is still a big improvement.
If you want a graphics card for 1440p gaming today, the Vega 56 (Price Check) or GTX 1070 offer better value. But if you want a graphics card for 1440p gaming with more headroom for future games, then the AMD Radeon RX Vega 64 (Price Check) makes sense.
When AMD announced the ability to run two Radeon RX Vega cards simultaneously, they conspicuously called it mGPU (short for multiple GPU) instead of the far more familiar CrossFire. That’s because they are retiring the CrossFire brand in favour of the generic mGPU moniker. They also limited the mGPU capability. Find out why!
In fact, the AMD Radeon RX Vega graphics cards were only capable of running as single cards until the release of Radeon Software 17.9.2. That release also marked the end of the road for AMD CrossFire, with AMD officially abandoning the brand in favour of mGPU.
Why? Here is AMD’s response when they were asked that very question by Brad Chacos of PCWorld :
CrossFire isn’t mentioned because it technically refers to DX11 applications.
In DirectX 12, we reference multi-GPU as applications must support mGPU, whereas AMD has to create the profiles for DX11.
We’ve accordingly moved away from using the CrossFire tag for multi-GPU gaming.
This is a surprising turn of events, because the CrossFire brand goes all the way back to 2005. Almost 12 years to the day, as a matter of fact. That's a lot of marketing history for AMD to throw away. But throw it all away, they did.
Nothing has changed though. They simply decided to call the ability to use multiple graphics cards mGPU, instead of CrossFire. In other words – this is a branding decision.
AMD will continue to use CrossFire for current and future DirectX 11 profiles, but refer to mGPU for DirectX 12 titles.
Limited mGPU Capability
AMD is also limiting mGPU support to just two graphics cards. The 4-way mGPU capabilities that top-of-the-line Radeon cards used to support have been dropped. The AMD Radeon RX Vega family is therefore limited to two cards in mGPU mode :
Gamers can pair two Radeon RX Vega 56 GPUs or two Radeon RX Vega 64 GPUs
This move was not surprising. Even NVIDIA abandoned three- and four-card configurations with the GeForce GTX 10 series last year. With fewer games supporting multiple GPUs and interest in power efficiency burgeoning, the days of 3-way or 4-way multi-GPU configurations are over.
Ever since AMD Radeon RX Vega was launched, there have been questions about support for Radeon RX Vega CrossFire. After all, running two Radeon RX Vega cards greatly increases performance for gaming. Now the wait is over. AMD has finally enabled CrossFire for Radeon RX Vega!
AMD Radeon RX Vega CrossFire Is Here!
While the AMD CrossFire mode was enabled for earlier graphics cards (e.g. Radeon RX 480 CrossFire), it was not possible to use two Radeon RX Vega graphics cards in CrossFire mode. That ends with the release of Radeon Software 17.9.2.
With Radeon Software 17.9.2, you can now pair two RX Vega 56, or two RX Vega 64 graphics cards to greatly boost performance.
For some reason, AMD no longer calls this feature CrossFire, just the plain “multi GPU”. But unless they are planning to permit combinations of more than two graphics cards later, they should call it “dual GPU”. We think it would have been easier and better to stick with CrossFire. Every techie / gamer worth his / her salt knows what CrossFire means.
AMD Radeon RX Vega CrossFire Performance
Now, you may be wondering – how much of a performance boost can you expect with AMD Radeon RX Vega CrossFire?
AMD shared this slide with us. It shows the results of the benchmarks performed at the AMD Performance Lab. According to their tests, running two RX Vega 64 graphics cards in CrossFire mode (okay, multi GPU mode) delivers more than 80% faster performance in Far Cry Primal, Metro Last Light Redux, Sniper Elite 4 and The Witcher 3: Wild Hunt.
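An uplift of "more than 80%" from a second card implies a multi-GPU scaling efficiency of over 90%. Here is a simple sketch of that relationship, using a hypothetical single-card frame rate:

```python
def mgpu_fps(single_fps: float, extra_card_scaling: float, cards: int = 2) -> float:
    """Frame rate when each additional card contributes a fraction of a full card."""
    return single_fps * (1 + (cards - 1) * extra_card_scaling)

# An 80% uplift means the second card contributes 0.8 of a card,
# i.e. 1.8x total from two cards, or 90% scaling efficiency
fps = mgpu_fps(60.0, 0.8)  # hypothetical 60 fps single-card average
print(fps)  # -> 108.0
```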
The road to Vega has been a rather long one. We first saw the AMD Vega prototype running DOOM in December 2016. Not unlike a baby, it took AMD nine months to give birth to the AMD Radeon RX Vega. Today, we are going to take a close look at the AMD Radeon RX Vega 56 (Price Check) graphics card that is designed to take on the NVIDIA GeForce GTX 1070. Let’s see how it performs!
The AMD Radeon RX Vega 56 Specification Comparison
This table compares the specifications of the AMD Radeon RX Vega 56 against those of its rival, the NVIDIA GeForce GTX 1070.
Specifications            | AMD Radeon RX Vega 64 | AMD Radeon RX Vega 56 | NVIDIA GeForce GTX 1070
GPU                       | Vega 10               | Vega 10               | NVIDIA GP104
Stream Processors         | 4096                  | 3584                  | 1920
Textures Per Clock        | 256                   | 224                   | 120
Pixels Per Clock          | 64                    | 64                    | 64
Base Clock Speed          | 1247 MHz              | 1156 MHz              | 1506 MHz
Boost Clock Speed         | 1546 MHz              | 1471 MHz              | 1683 MHz
Texture Fillrate          | 319.2~395.8 GT/s      | 258.9~329.5 GT/s      | 180.7~202.0 GT/s
Pixel Fillrate            | 79.8~98.9 GP/s        | 74.0~94.1 GP/s        | 96.4~107.7 GP/s
Graphics Memory           | 8 GB HBM2             | 8 GB HBM2             | 8 GB GDDR5
Graphics Memory Bus Width | 2048-bit              | 2048-bit              | 256-bit
Graphics Memory Speed     | 945 MHz               | 800 MHz               | 2000 MHz
Graphics Memory Bandwidth | 483.8 GB/s            | 409.6 GB/s            | 256.0 GB/s
TDP                       | 295 W                 | 210 W                 | 150 W
Retail Price              | $499                  | $399                  | $379 ($449 Founder's Edition)
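The fill rate and bandwidth figures in the table fall straight out of the clocks and unit counts. A quick sanity check of the Radeon RX Vega 56's numbers:

```python
# Re-derive the Vega 56's headline figures from its clocks and unit counts
boost_mhz = 1471   # boost clock (MHz)
tmus      = 224    # textures per clock
rops      = 64     # pixels per clock
mem_mhz   = 800    # HBM2 clock (DDR: two transfers per cycle)
bus_bits  = 2048   # memory bus width

texture_fill = boost_mhz * tmus / 1000            # GT/s
pixel_fill   = boost_mhz * rops / 1000            # GP/s
bandwidth    = mem_mhz * 2 * bus_bits / 8 / 1000  # GB/s

print(round(texture_fill, 1), round(pixel_fill, 1), round(bandwidth, 1))
# -> 329.5 94.1 409.6, matching the table
```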
The AMD Radeon RX Vega 56 Up Close
The AMD Radeon RX Vega 56 (Price Check) looks exactly like the AMD Radeon RX 480, just longer. It has the same black shroud design that debuted with that Polaris-based card.
The NVIDIA GeForce GTX 1070 still offers a dual-link DVI port, but the AMD Radeon RX Vega skips that for three DisplayPorts and a single HDMI 2.0b port.