NVIDIA Unveils New GeForce GTX 1050 Variant – Has Upgraded Core Specifications
It would seem that NVIDIA has quietly introduced a brand new variant of the GeForce GTX 1050. How does it fare against the base model as well as the 1050 Ti? Here’s everything you need to know.
NVIDIA Unveils New GeForce GTX 1050 Variant
In a somewhat surprising move, NVIDIA has introduced a new variant in the GTX 1050 family of graphics cards. This new variant is based on the same Pascal GP107 GPU as the original, but has upgraded core specs, a cut-down memory interface, and increased VRAM.
The new GTX 1050 has upgraded specifications over practically everything the base model offered. The reason NVIDIA reportedly produced this card is that the 2GB model runs into serious memory limitations with today’s computing needs. With 3GB of memory, you can do a bit more work, and games will perform slightly better without the memory buffer getting choked up.
In terms of specifications, the new GTX 1050 uses the same GPU configuration as the 1050 Ti, featuring 768 CUDA cores, a 1392 MHz base clock, and a 1518 MHz boost clock. In this sense, it is faster than the 1050 Ti, but keep in mind that it has 1GB less memory. The memory speed is clocked at 7 Gbps just like the others, but it runs on a narrower 96-bit bus interface, for a memory bandwidth of 84 GB/s. It is still expected to feature a 75W TDP.
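The 84 GB/s figure follows directly from the memory speed and bus width: effective speed in Gbps per pin times the bus width in bits, divided by 8 bits per byte. A quick sketch (the numbers for the 128-bit cards are the standard GTX 1050/1050 Ti figures, included for comparison):

```python
def memory_bandwidth_gbs(speed_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin speed x bus width / 8 bits-per-byte."""
    return speed_gbps * bus_width_bits / 8

# New GTX 1050 3GB: 7 Gbps over a 96-bit bus
print(memory_bandwidth_gbs(7, 96))   # 84.0 GB/s

# Original GTX 1050 / 1050 Ti: 7 Gbps over a 128-bit bus
print(memory_bandwidth_gbs(7, 128))  # 112.0 GB/s
```

So despite the higher clocks, the narrower bus leaves the new card with noticeably less memory bandwidth than its siblings.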
While no official pricing has been announced at the time of writing, the card is expected to be priced between the 1050 and the 1050 Ti. This may sound odd since its core specifications are better than the 1050 Ti’s, but the memory limitation will definitely keep the Ti variant ahead of this card.
We will be reporting on the official price as well as availability of the card as soon as information becomes available so stay tuned to Pokde.net.
Pokdepinion: Well, I am currently using the Ti variant, so I think I’m fine with what I have. But if you’re looking to build a budget rig or don’t plan on splurging on a GPU, this could be something to consider.