
AMD at Computex 2019

The AMD Computex 2019 presentation is here: https://www.youtube.com/watch?v=jy0Q75xCwDU


I think it is an excellent presentation. 

Navi Pre-Release Information:

- 400 million people game on Radeon.
- Vega / Radeon VII & GCN for high-performance compute.
- Navi GPU for gaming on PC, console & cloud, including the Sony PlayStation.
- Navi uses the new RDNA architecture.
- New CUs with +25% IPC.
- +50% performance per watt.
- ~10% faster than the RTX 2070.
- Launches in July.


Ryzen 3700X, 3800X, 3900X Information:

- Double the floating point throughput & cache.
- IPC increase of 15% in PC workloads.
- Ryzen 7 3700X: 8 cores / 16 threads, 3.6 GHz base, 4.4 GHz boost, 36 MB cache, 65 W TDP.
- Ryzen 7 3700X ~33% faster than the i7-9700K in Cinebench R20.
- 3800X equals the i9-9900K in PUBG.
- 3DMark PCIe test: Gen 4 is +69%.
- 3900X ~18% faster than the i9-9920X HEDT chip in Blender.

The third-generation Ryzen processors will be available to purchase on July 7, 2019.


amdbooger
Adept II

Hmmm. A bit torn about the Computex presentation. As usual, I'm left more excited about the CPUs and less so about the GPUs. I already have a Vega 64 and a Radeon VII, so Navi, so far, is of little interest to me. I wish AMD's GPU division would be as aggressive as their CPU division and release an equivalent GPU product to go with the Intel-crushing Ryzen 9 3900X, a high-end product to silence the 2080 Ti. It is as if AMD is just "ok" with matching Nvidia's mid-tier cards and letting them have the high-end market. I am perfectly happy with my Radeon VII and Vega 64, but I would like to see, just once, AMD silence all the Nvidia fanboys.


Hi,

RE: I am perfectly happy with my Radeon VII and Vega 64, but I would like to see, just once, AMD silence all the Nvidia fanboys.

Not many people can, or are prepared to, spend the eye-watering amount of money Nvidia asks for the RTX 2080 Ti. I think RTX 2080-level performance is more than enough for most people. RTX ray tracing does need an RTX 2080 Ti if you game at 4K, but it is only available in a very small number of games. From the benchmark numbers I see, the Radeon VII beats an RTX 2080 in BFV, so I think that is good. Pity AMD AIB partners didn't make a Radeon VII with a fully populated VRM, dual BIOS, and an AIO with rear hoses like on the Fury X, with the option to fit a single- or dual-fan-wide radiator. I think that would be enough to give the RTX 2080 a harder time. I would also like to see improvements to Radeon Adrenalin 2019 so that Auto Overclocking and Radeon Chill work, or work better.

RE: I wish AMD's GPU division would be as aggressive as their CPU division

They seem to have been more aggressive releasing GPUs recently: RX 590, Radeon VII, and now Navi.

I just wish more work could be done to fix broken features and improve existing ones in Adrenalin 2019.
Info below.


Bye.

I searched for and bought an RX Vega 64 Liquid in November 2018.
The GPU and Adrenalin drivers were very unstable and I was not happy with it; there were lots of stability issues and a number of broken things in the Adrenalin driver.
AMD reports were filed and some things have been fixed in Adrenalin 2019. The Radeon Performance Overlay, for example, was simply broken and caused a green screen. I bought an RTX 2080 in December 2018 and fitted it in the same PC to compare the cards; they cost me about the same, with the RTX 2080 OC being ~20 more expensive.

I have been focusing on and comparing BFV performance in DX12 at 4K Ultra with Adrenalin 2019 19.5.2 recently (forget RTX and DLSS for the moment).
I test with VSync and FreeSync off initially.

The RTX 2080 provides mostly around 60-65 FPS with VSync off and no overclocking. I have to use the ThunderMaster frame rate limiter set to ~58 to keep at about 59-60 FPS and avoid screen tearing on a 4K 60 Hz monitor. FPS will dip below 60 in some parts of the game, especially where there is lots of physics and CPU computation, or in intro scenes. I run the game with both fans maxed out on the RTX 2080, and they are louder than the single maxed-out fan on the RX Vega 64 Liquid.

Setting the frame rate limiter in BFV to 60 minimised screen tearing on the RTX 2080 OC and is a sensible setting at 4K, so I keep it.
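For anyone wondering what a frame rate limiter is actually doing under the hood, here is a minimal sleep-based sketch. It is a generic illustration, not ThunderMaster's or BFV's actual implementation, and `render_frame` is just a stand-in for the game's draw call:

```python
import time

def run_frame_limited(render_frame, fps_cap=58):
    """Minimal sleep-based frame limiter (illustrative only).
    Capping just below the 60 Hz refresh stops frames outrunning
    the monitor, which is what reduces tearing."""
    frame_budget = 1.0 / fps_cap                # seconds allowed per frame
    while True:
        start = time.perf_counter()
        render_frame()                          # draw one frame
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # spend the leftover budget
```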

Now onto the RX Vega 64 Liquid. Out of the box, with the Turbo profile and the fan maxed out, I get about 50-55 FPS.
Using a careful overclock, Memory Timing Level 1 on the HBM, a small undervolt, and a PowerPlay table mod, I can get about 56-60 FPS.
I use FRTC set to 59 and there is almost no screen tearing. Initially the frame rates are around 59 when the GPU is cool, but as it heats up towards its maximum of 56°C the FPS drops to the ~50-52 range. The GPU power report shows 340-350 W of power consumption.

I had to manually overclock/undervolt the RX Vega 64 Liquid because the auto-overclock button in Radeon Adrenalin 2019 is a self-destruct button for my GPU. A report was filed a long time ago, right after Adrenalin 2019 released. It is still broken and needs to be fixed.

The RTX 2080 OC is the faster GPU in terms of FPS and draws less power. It can be auto-overclocked easily using ThunderMaster or ASUS GPU Tweak II. I don't really need to run it with both fans maxed out, but that keeps the GPU temps low, and I wear headphones / don't care anyhow.

However, there is also a neglected AMD weapon that, with some extra work, could push that 55-60 FPS number higher: Radeon Chill.
Wattman BFV profile settings: Chill_Min = 30, Chill_Max = 300, FRTC off.
Wattman global settings: FRTC = 59.
When the Adrenalin 2019 driver is working correctly, rapid mouse-only movement is limited to Chill_Min.

With the above settings, the Chill_Max slider is just a slider that controls how high keyboard-only input FPS goes.

Using the above setting of Chill_Min = 30, GPU power is ~90-120 W with no keyboard or mouse input and FPS = 30, instead of 330-350 W.
That is a massive power saving.
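Putting rough numbers on it, using the midpoints of the ranges I measured above:

```python
# Rough idle-input saving, using midpoints of my measured ranges.
chill_watts = (90 + 120) / 2    # ~105 W idling at Chill_Min = 30
full_watts = (330 + 350) / 2    # ~340 W at full tilt
saving = full_watts - chill_watts
print(f"saving = {saving:.0f} W ({saving / full_watts:.0%})")
# -> saving = 235 W (69%)
```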

The problem is, those settings limit keyboard-only input FPS to about 52 FPS. In most games the rule of thumb is: keyboard-only input FPS ≈ Chill_Min + 15-25 FPS, for a Chill_Max anywhere in the ~60-300 range. Tapping WASD or other keys rapidly and repeatedly can sometimes push the FPS up to the global FRTC limit of 59, depending on how hot the GPU is, but that rapid tapping is unnatural gameplay for me. I could increase Chill_Min to 35 to attempt a 35-60 FPS range, but that costs an additional 50-70 W with no keyboard/mouse input. Chill_Min = 35 increases idle power consumption to around 140-190 W, which negates Chill's benefit of reducing temps and power when there is no mouse/keyboard input, just so I can hit the "Chill off"/"cold GPU" max FPS as soon as input starts.
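That rule of thumb is an empirical fit from my own testing, not an AMD formula, but it predicts my numbers well:

```python
def keyboard_only_fps(chill_min, bump=(15, 25)):
    """Empirical rule of thumb from my own testing (not an AMD
    formula): keyboard-only FPS lands roughly Chill_Min + 15..25
    FPS when Chill_Max is anywhere in the ~60-300 range."""
    return chill_min + bump[0], chill_min + bump[1]

print(keyboard_only_fps(30))  # (45, 55) -> matches the ~52 FPS I see
print(keyboard_only_fps(35))  # (50, 60) -> reaches 60, at a 50-70 W idle cost
```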

Chill simply detects which keys are pressed and how rapidly they are pressed, and uses that to control FPS and power consumption. In BFV, just pressing the right mouse button to zoom into a rifle scope causes Chill to ramp from 120 W to 350 W. Why? Why can't I turn that behaviour off?
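Based on that behaviour, this is roughly how I picture the Chill heuristic. It is my guess at the logic, not AMD's code, and RAMP_KEYS / target_fps are names I made up:

```python
import time

# My guess at Chill's input-driven FPS ramp, not AMD's actual code.
# RAMP_KEYS models the hard-wired triggers I complain about: I cannot
# remove mouse_right (scope zoom) from the set of ramp triggers.
RAMP_KEYS = {"w", "a", "s", "d", "mouse_left", "mouse_right"}

def target_fps(chill_min, chill_max, recent_inputs, window=1.0):
    """Pick an FPS target from how much ramp-trigger input arrived in
    the last `window` seconds; no input means idling at Chill_Min."""
    now = time.monotonic()
    hits = [key for stamp, key in recent_inputs
            if now - stamp <= window and key in RAMP_KEYS]
    if not hits:
        return chill_min                    # idle: 30 FPS, ~90-120 W
    # More (and more rapid) input pushes the target toward Chill_Max.
    activity = min(len(hits) / 10.0, 1.0)
    return chill_min + activity * (chill_max - chill_min)
```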

I think AMD are leaving performance and a positive user gaming experience on the table for Nvidia to eat by not improving Radeon Chill.
Please allow users to control the internal "Chill variables" so we can adjust Chill behaviour on a per-game basis.
I would like to be able to control which keyboard and mouse keys cause the Chill algorithm to ramp up FPS and power consumption.
I would like control of the "key held down" max FPS and the "key tapped rapidly" max FPS.
The original HiAlgo Chill allowed such control by modifying values in a user file, and it was easy to use; a sketch of what I mean is below.
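Something like this is what I am asking for, written as a user-editable per-game profile. It is entirely hypothetical: none of these keys exist in Adrenalin, and it is not HiAlgo's real file format either:

```python
# Hypothetical per-game Chill profile (not a real Adrenalin or HiAlgo
# format), showing the per-key control I would like to have.
BFV_CHILL_PROFILE = {
    "chill_min": 30,
    "chill_max": 300,
    "ramp_keys": ["w", "a", "s", "d"],  # only movement keys ramp FPS
    "ignore_keys": ["mouse_right"],     # scope zoom stays at low power
    "held_key_max_fps": 60,             # ceiling while a key is held down
    "tapped_key_max_fps": 59,           # ceiling for rapid key tapping
}
```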
Chill control in the Radeon Overlay also needs to be fixed. Global FRTC should always be present, and moving Chill_Max should not switch on profile FRTC to match Chill_Max, because that can "break FreeSync" if the old behaviour of limiting rapid mouse movement to the FRTC value kicks in (another Chill bug, reported to AMD).