
regwin
Journeyman III

TDP : FreeSync vs G-Sync


Hi there,

I wanted to know whether, with FreeSync technology, it is the AMD graphics card (and not the display) that does all the calculations and adjustments to avoid tearing and stuttering.

Nvidia, on the other hand, needs a module built into the monitor, which is responsible for the calculations and adjustments that avoid stuttering.

I have noticed that the TDP of the AMD Radeon RX 580 card is 180W and the Nvidia GTX 1060 card is 120W.

If we compare RX 580 + FreeSync vs GTX 1060 + G-Sync, is it really just 180W vs 120W?! That can't be the whole picture, can it?

Shouldn't the TDP of the G-Sync module also be counted? Since it is a hardware component it must consume power, and that should be added to the comparison, right?
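
To put my question another way, here is a rough sketch of the comparison I mean (the G-Sync module figure is exactly the unknown I am asking about, not a published spec):

rx580_tdp_w = 180      # AMD's stated TDP for the Radeon RX 580
gtx1060_tdp_w = 120    # Nvidia's stated TDP for the GTX 1060
gsync_module_w = 0     # unknown -- this is the number I am asking about

amd_total_w = rx580_tdp_w                        # FreeSync needs no extra hardware
nvidia_total_w = gtx1060_tdp_w + gsync_module_w  # only a fair comparison once the module's draw is known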

Thank you very much

Regards,

Hiroshi Richi



The current Nvidia architecture is simply more efficient; it has nothing to do with adaptive sync technologies.


Yes, it does, because I want to compare an RX 480 + FreeSync setup against a GTX 1060 + G-Sync setup.

The RX 480 does the work of FreeSync without needing an extra hardware module, and its TDP is 180W, while the GTX 1060 requires an extra hardware module (G-Sync) to perform the same task.

So I want to compare the TDP of the RX 480 (180W) against the TDP of the GTX 1060 (120W) plus the TDP of the G-Sync module (??W).

Do you know the TDP of the G-Sync module?

The G-Sync module has its own board, its own processor, its own RAM and even its own fan... it has to consume something!
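
Just to put a number on the gap, using the TDP figures quoted above (simple arithmetic, nothing official):

rx480_tdp_w = 180       # figure quoted above
gtx1060_tdp_w = 120
breakeven_module_w = rx480_tdp_w - gtx1060_tdp_w   # 60 W -- what the module would have to draw for the totals to match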



Actually this question is more suitable for an Nvidia forum.

I suggest you look at the power consumption specification of a monitor with G-Sync and compare it to a similar model without G-Sync.

Another thing to consider is that, as you correctly stated, G-Sync requires a proprietary module. That is why G-Sync monitors cost more than FreeSync ones. For the same reason, the choice of FreeSync monitors is currently 6 times greater than G-Sync.

regwin
Journeyman III


Thank you, Quant.

I was able to see that there is a difference of 5V.

If I do the theoretical calculation, 5V * 16A = 80W?? That seems like a lot, doesn't it?
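
For what it's worth, the arithmetic itself, assuming (perhaps wrongly) that the 5 V difference and the 16 A rating belong to the same supply:

volts = 5
amps = 16
power_w = volts * amps   # P = V * I = 80 W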



You must have looked at the wrong figure. There should be a direct statement of power usage in watts; 16 A is really a lot of current.

For example, the Acer XB0 XB240Hbmjdpr has a typical power consumption of 23 W, while the Acer XB1 XB241Hbmipr (G-Sync) has 36 W. That is a 13 W difference. You may want to look at other models to make sure that G-Sync is the cause.

But take into account that AMD graphics cards consume more power only under load (gaming), while their idle power consumption is comparable to Nvidia cards. On the other hand, a G-Sync monitor will probably use additional power whether you are gaming or not.
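
As a rough illustration of that trade-off: only the 13 W monitor difference comes from the specs above; the hours and the GPU load-power gap below are made-up assumptions, just to show how the totals could compare over a day.

gaming_hours_per_day = 3        # assumption
monitor_on_hours_per_day = 10   # assumption
gpu_load_gap_w = 60             # assumption: extra draw of the AMD card under load only (the TDP gap quoted earlier)
gsync_monitor_gap_w = 13        # from the Acer spec comparison above, drawn whenever the monitor is on

amd_extra_wh_per_day = gpu_load_gap_w * gaming_hours_per_day              # 180 Wh in this example
gsync_extra_wh_per_day = gsync_monitor_gap_w * monitor_on_hours_per_day   # 130 Wh in this example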
