
The Turing Cards




 

Stirlingsays (Wisbech, England) · Holmesdale Online Elite Member · 14 Sep 18, 11.22pm

Now that NVIDIA has set a release date of 20 September for the 2080 and 2080ti Turing cards, I'm wondering if any of our Palace HOL brethren are interested in upgrading their graphics cards.

Like many, I'm licking my lips in expectation of some price drops on 1080ti cards, and I plan to dip my toes in early next month.

Ray tracing, while an exciting technology, seems like it needs another card generation or two before it can really impress.

Any thoughts?

 

Jimenez (SELHURSTPARKCHESTER, DA BRONX) · 14 Sep 18, 11.44pm

Originally posted by Stirlingsays

Now that NVIDIA has set a release date of 20 September for the 2080 and 2080ti Turing cards, I'm wondering if any of our Palace HOL brethren are interested in upgrading their graphics cards.

Like many, I'm licking my lips in expectation of some price drops on 1080ti cards, and I plan to dip my toes in early next month.

Ray tracing, while an exciting technology, seems like it needs another card generation or two before it can really impress.

Any thoughts?

 


[Link]
Give us a like & Follow

Stirlingsays (Wisbech, England) · Holmesdale Online Elite Member · 15 Sep 18, 12.16am

Originally posted by Jimenez


Not for everyone I know.

 

JusticeToad (Beckenham) · 15 Sep 18, 6.33am

The 2080 is a grand! [Link]

PC gaming has always been a premium market but this is getting silly, especially given that consoles can offer pretty good graphics now.

 

chateauferret (Airdrie) · 15 Sep 18, 9.38pm

I have been learning how to use the graphics card as a pretty good parallel computing engine, and for certain general computing tasks (as well as just graphical display) it can blow CPU processing out of the water. This generally means work involving the same smallish bit of computation repeated for many millions of data items.

I took a set of C++ procedures involving the generation and transformation of 3D noise across a spherical space (planet generation) which was taking a couple of minutes on the CPU, ported it to GLSL and ran it as a compute shader on a reasonably decent but not bleeding-edge NVIDIA graphics card, and it did the job in a few tenths of a second. You can also get software development kits like CUDA which present the GPU's parallel architecture as a pure general-purpose computing platform: you move the data from the CPU to the GPU beforehand, run the high-volume computation massively in parallel, and pull the results back on completion.
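The pattern described above (ship the data over, run the same small computation on every element, pull the results back) can be sketched by analogy in plain Python/NumPy. This is not the poster's GLSL or CUDA code; the toy "noise" function and all names here are invented for illustration. The vectorised version plays the role of the data-parallel kernel: the same arithmetic applied to every sample at once, with no sample depending on any other.

```python
import numpy as np

def noise_scalar(x, y, z):
    # Toy "3D noise": a cheap hash-like function for one sample.
    # Stand-in for the per-sample work a compute shader thread would do.
    return (np.sin(x * 12.9898 + y * 78.233 + z * 37.719) * 43758.5453) % 1.0

def noise_loop(points):
    # Serial, CPU-style: one sample at a time.
    return np.array([noise_scalar(x, y, z) for x, y, z in points])

def noise_vectorised(points):
    # Data-parallel style: the same small computation applied to every
    # sample in one shot, which is how a GPU kernel runs across threads.
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    return (np.sin(x * 12.9898 + y * 78.233 + z * 37.719) * 43758.5453) % 1.0

points = np.random.default_rng(0).uniform(-1.0, 1.0, size=(10_000, 3))
assert np.allclose(noise_loop(points), noise_vectorised(points))
```

The two functions compute identical results; the difference is purely in how the work is scheduled, which is the whole point of the GPU port.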

Other applications would include raster image filtering (Gaussian blur, edge detection, emboss etc.), Fourier transforms, process simulation such as erosion modelling, and even neural networks (I think).
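As a concrete instance of one of those applications, here is a minimal sketch of a separable Gaussian blur in NumPy (the function names and parameters are my own, not from any particular library). Every output pixel is the same small weighted sum over its neighbours, independent of every other pixel, which is exactly why this maps well onto one GPU thread per pixel.

```python
import numpy as np

def gaussian_kernel_1d(sigma, radius):
    # Discrete 1D Gaussian, normalised to sum to 1.
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-(x ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def gaussian_blur(image, sigma=1.0):
    # Separable Gaussian blur: filter rows, then columns.
    # Edge padding keeps the output the same size as the input.
    radius = max(1, int(3 * sigma))
    k = gaussian_kernel_1d(sigma, radius)
    padded = np.pad(image, radius, mode="edge")
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, rows)

img = np.zeros((9, 9))
img[4, 4] = 1.0                         # single bright pixel
blurred = gaussian_blur(img, sigma=1.0)
assert abs(blurred.sum() - 1.0) < 1e-6  # blur preserves total brightness
assert blurred[4, 4] < 1.0              # energy spread to the neighbours
```

A GPU version would compute each output pixel in its own thread; the serial NumPy sketch just makes the per-pixel computation explicit.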

Why anyone wants to waste computing power like this on s*** like Fortnite I don't comprehend.

 


============
The Ferret
============

davenotamonkey · 15 Sep 18, 10.36pm

Originally posted by Stirlingsays

Now that NVIDIA has set a release date of 20 September for the 2080 and 2080ti Turing cards, I'm wondering if any of our Palace HOL brethren are interested in upgrading their graphics cards.

Like many, I'm licking my lips in expectation of some price drops on 1080ti cards, and I plan to dip my toes in early next month.

Ray tracing, while an exciting technology, seems like it needs another card generation or two before it can really impress.

Any thoughts?

I have a 1080ti, and not just any one - one of the high-spec cards. The only interest I'd have is driving 4K VR. And that would be before I can stump up an upgrade to the Vive, so not soon.

Based on price point, early benchmarks (from NVIDIA, so they'll likely have cherry-picked the tests) show the 2080 is about 13% faster than the 1080ti.

I'll wait!

 

davenotamonkey · 15 Sep 18, 10.41pm

Originally posted by chateauferret

I have been learning how to use the graphics card as a pretty good parallel computing engine, and for certain general computing tasks (as well as just graphical display) it can blow CPU processing out of the water. This generally means work involving the same smallish bit of computation repeated for many millions of data items.

I took a set of C++ procedures involving the generation and transformation of 3D noise across a spherical space (planet generation) which was taking a couple of minutes on the CPU, ported it to GLSL and ran it as a compute shader on a reasonably decent but not bleeding-edge NVIDIA graphics card, and it did the job in a few tenths of a second. You can also get software development kits like CUDA which present the GPU's parallel architecture as a pure general-purpose computing platform: you move the data from the CPU to the GPU beforehand, run the high-volume computation massively in parallel, and pull the results back on completion.

Other applications would include raster image filtering (Gaussian blur, edge detection, emboss etc.), Fourier transforms, process simulation such as erosion modelling, and even neural networks (I think).

Why anyone wants to waste computing power like this on s*** like Fortnite I don't comprehend.

I never really got into CUDA, to be honest. I'd have had to really hack away at my legacy code, to the point of diminishing returns. I'm aware it's really good at array functions, though.

Is this procedural planet topology you're generating, incidentally?

PS: those games are driving the FLOPS, don't bash them ;-)

 

Stirlingsays (Wisbech, England) · Holmesdale Online Elite Member · 15 Sep 18, 10.43pm

Originally posted by chateauferret

I have been learning how to use the graphics card as a pretty good parallel computing engine, and for certain general computing tasks (as well as just graphical display) it can blow CPU processing out of the water. This generally means work involving the same smallish bit of computation repeated for many millions of data items.

I took a set of C++ procedures involving the generation and transformation of 3D noise across a spherical space (planet generation) which was taking a couple of minutes on the CPU, ported it to GLSL and ran it as a compute shader on a reasonably decent but not bleeding-edge NVIDIA graphics card, and it did the job in a few tenths of a second. You can also get software development kits like CUDA which present the GPU's parallel architecture as a pure general-purpose computing platform: you move the data from the CPU to the GPU beforehand, run the high-volume computation massively in parallel, and pull the results back on completion.

Other applications would include raster image filtering (Gaussian blur, edge detection, emboss etc.), Fourier transforms, process simulation such as erosion modelling, and even neural networks (I think).

Why anyone wants to waste computing power like this on s*** like Fortnite I don't comprehend.


Gaming is cool but it's just one channel of improvement. AI, VR and AR will advance with each generation....though the gap between generations may grow considerably once we get past 7nm.

 

Stirlingsays (Wisbech, England) · Holmesdale Online Elite Member · 15 Sep 18, 10.47pm

Originally posted by davenotamonkey

I have a 1080ti, and not just any one - one of the high-spec cards. The only interest I'd have is driving 4K VR. And that would be before I can stump up an upgrade to the Vive, so not soon.

Based on price point, early benchmarks (from NVIDIA, so they'll likely have cherry-picked the tests) show the 2080 is about 13% faster than the 1080ti.

I'll wait!

Maybe by the time of the 30 series, VR will have reached critical mass with the developer time and the specs it offers.

I see it breaking through to the mass market via the consoles first, if they stick with it....though AMD are the chip makers there.

Exciting times.

 

Stirlingsays (Wisbech, England) · Holmesdale Online Elite Member · 15 Sep 18, 10.52pm

Originally posted by davenotamonkey

I have a 1080ti, and not just any one - one of the high-spec cards. The only interest I'd have is driving 4K VR. And that would be before I can stump up an upgrade to the Vive, so not soon.

Based on price point, early benchmarks (from NVIDIA, so they'll likely have cherry-picked the tests) show the 2080 is about 13% faster than the 1080ti.

I'll wait!

I've got my eye on an Asus STRIX 1080ti.....I might go second hand, as the 30 series will possibly be released around Q4 of 2019.....and those cards are likely to give you a reliable 60-plus, probably more like 100fps, at 4K.

Proper immersion.

Edited by Stirlingsays (15 Sep 2018 10.53pm)

 

davenotamonkey · 15 Sep 18, 11.00pm

Originally posted by Stirlingsays

I've got my eye on an Asus STRIX 1080ti.....I might go second hand, as the 30 series will possibly be released around Q4 of 2019.....and those cards are likely to give you a reliable 60-plus, probably more like 100fps, at 4K.

Proper immersion.

Edited by Stirlingsays (15 Sep 2018 10.53pm)

That's a nice one... I imagine there will be quite a few trading up, so you could grab a bargain. I ended up adding keyword alerts to [Link] so you get a notification if something is posted there.

For some games, I couldn't go back to "pancake mode" - 3D all the way :-)

 

Stirlingsays (Wisbech, England) · Holmesdale Online Elite Member · 15 Sep 18, 11.08pm

Originally posted by davenotamonkey

That's a nice one... I imagine there will be quite a few trading up, so you could grab a bargain. I ended up adding keyword alerts to [Link] so you get a notification if something is posted there.

For some games, I couldn't go back to "pancake mode" - 3D all the way :-)

Thanks for the link. I use PCPartPicker; looks like I have something else now as well.

[Link]

VR is the future along with AR.....but I do love pancakes as well.

 


 


