I built a new gaming PC about six months ago (Ryzen 5 1600) but I had a nasty shock when I went looking for a new graphics card. Crypto-currency miners were buying up graphics cards by the dozen to perform the hard sums needed to unlock virtual currency, pushing prices for gaming graphics cards through the roof. The €300 price bracket that I consider to be my comfort zone seemed to be particularly affected, with the result that there were no cards available in my price range that would give a reasonable upgrade over my three-year-old GTX 970.
Aside: Bitcoin was the forerunner and is still the most widely known crypto-currency, but this year a new player called Ethereum seems to be demanding all the attention. A key feature of Ethereum is that it is resistant to mass extraction using customised chips called ASICs and therefore requires actual GPUs for mining. It is my understanding that the AMD Radeon architecture is the favourite for this process, leading to a complete lack of availability and sky-high prices for cards such as the Radeon RX 580. Nvidia cards are less popular with miners, but the lack of competition in the marketplace has ensured that the price of Nvidia cards remains high even on the second-hand market. My gut feeling is that crypto-currency is a bubble waiting to burst. I believe this despite knowing someone who has made hundreds of thousands on an early Bitcoin investment. It seems to me that the majority of miners are just selling to themselves and to speculative investors. I suspect that, like most bubbles, a few insiders will make fortunes but most will lose their shirts.
Anyway, rather than spend €400 on a graphics card I decided to spend slightly less than €400 on a new monitor instead and bought a Dell 2417DG. The 2417DG has three features that make it a significant upgrade from what I was using previously: a QHD resolution of 2560x1440, a high refresh rate of up to 165Hz and G-Sync. G-Sync is the killer feature for me: it means that the refresh rate of the monitor dynamically adjusts to the frame rate being produced by the GPU. It is an absolute game changer (apologies for the unintended pun). Synchronising monitor and GPU refresh rates leads to very smooth visuals with no tearing effect. My ageing GTX 970 is quite under-powered for this QHD monitor. In graphically intensive games such as The Witcher 3 and Total War: Warhammer my frame rates drop to below 40 fps. This would be quite unpleasant with an old fixed refresh rate monitor but on the G-Sync monitor it looks butter smooth.
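If you want to see why low frame rates feel so much better with adaptive sync, here is a toy Python sketch (purely illustrative, not any real driver or display API) comparing when frames actually reach the screen on a fixed 60Hz monitor with v-sync versus an adaptive-sync monitor, assuming a steady 40 fps from the GPU:

```python
# Toy illustration only -- no real driver or display API is used.
# Assumes the GPU produces a steady 40 fps, roughly what my GTX 970 manages at 1440p.
import math

GPU_FPS = 40      # frames rendered per second by the GPU
FIXED_HZ = 60     # refresh rate of a traditional fixed-refresh monitor

# Times (in seconds) at which each frame finishes rendering.
ready = [i / GPU_FPS for i in range(9)]

# Fixed refresh with v-sync: a finished frame waits for the next scheduled
# scan-out, so the gap between displayed frames alternates (33 ms, 17 ms, ...)
# and motion judders.
fixed = [math.ceil(t * FIXED_HZ) / FIXED_HZ for t in ready]

# Adaptive sync (G-Sync/FreeSync): the monitor starts a scan-out as soon as the
# frame is ready, so every gap is the same 25 ms and 40 fps still looks even.
adaptive = list(ready)

def gaps_ms(times):
    """Intervals between successive displayed frames, in milliseconds."""
    return [round((b - a) * 1000, 1) for a, b in zip(times, times[1:])]

print("fixed 60Hz gaps (ms):    ", gaps_ms(fixed))
print("adaptive-sync gaps (ms): ", gaps_ms(adaptive))
```

The uneven 33/17 ms cadence on the fixed-refresh panel is the stutter you feel at sub-60 fps; the constant 25 ms cadence is what the G-Sync monitor delivers instead.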
Buying a G-Sync monitor really did allow me to extend the useful life of my old graphics card, but there are a couple of disadvantages. The first is the nagging feeling that I am not getting the most out of my lovely 165Hz monitor when my current graphics card struggles to get frame rates above 60fps in any modern game. The second issue is that buying a G-Sync monitor has locked me into Nvidia graphics for the foreseeable future because AMD cards use the similar but incompatible FreeSync technology. This is not a major issue today given the ridiculous pricing of AMD GPUs, but it may be a problem in the future.
My cunning plan, as you may have guessed, was to rely on G-Sync to extend the life of my old GTX 970 while waiting for the crypto-currency madness to pass. It was a good plan except for the weakness of human nature. Six months passed and GPU prices remained sky high. There is only so much browsing of Amazon, eBay and pcpartpicker.com a human can endure. Last weekend I crumbled and bought a GTX 1080 for €460. This is more than I have ever paid for a graphics card and, to be honest, is a silly amount of money. The only justification I can give is that it should give me frame rates about twice what the old 970 could achieve (roughly turning the sub-40 fps I see in demanding games into 70-80 fps) and it didn't seem worth it to settle for anything less.
TLDR: I strongly recommend G-Sync (or presumably FreeSync) monitors; they can extend the useful life of an old graphics card by delivering smooth visuals even at low frame rates. They will only save you money, however, if you have strong willpower and can resist the impulse to buy a new graphics card as well.