
SunBroDave

Member
Oct 25, 2017
13,183
You know, it usually annoys me how unknowledgeable a lot of posters on this forum can be. But every now and then, you get a gem like this and it's all worth it.
 

platocplx

2020 Member Elect
Member
Oct 30, 2017
36,072
While the CPU is developing at a snail's pace (with Moore's law breaking down), GPUs are developing at a faster pace. It's possible that at some point these will intersect, meaning that GPUs will be just as good as CPUs at general computation. GPUs can already do general computation; it's just that they are slower than CPUs for these tasks.

I don't have actual in-depth PC knowledge, though.
You are literally talking about what they were attempting to do with the Cell processor lol.
 
Oct 27, 2017
704
This is probably a troll thread, but I'll give it a serious answer. No.

A GPU is only one component of a modern PC. You'd still need a CPU, motherboard, RAM, PSU, storage, etc. to actually make a functioning PC.
 

jay

Member
Oct 25, 2017
2,275
Not only that, but soon a graphics card will replace you at work.
 

R.T Straker

Chicken Chaser
Member
Oct 25, 2017
4,715
What's up with these weird topics since the Nvidia event?

Did they fry your brain or something?
 

Mark It Zero!

Member
Apr 24, 2020
494
I have a really good troll thread I am thinking about posting that has to do with these GPUs, but I am restraining myself.
 

platocplx

2020 Member Elect
Member
Oct 30, 2017
36,072
Crazy Ken was living in 2030, while failing to notice that the rest of us were still living in 2006. ;(
Yup lmao, when you said that I was like, wait, that def sounds like Cell. I actually found an academic paper where some people were trying to use it to do ray tracing lmao. It was definitely very, very ambitious; maybe one day it gets revisited, but there's a reason GPUs are great at what they do, which is specialized work, versus a CPU, which is general.
 

Spinluck

▲ Legend ▲
Avenger
Oct 26, 2017
28,513
Chicago
Gfx cards have technically been like their own computer within a computer for a little while now.
 

julia crawford

Took the red AND the blue pills
Member
Oct 27, 2017
35,350
Lmao, imagine a time when I have to buy a CPU to put inside my graphics card.

Truly Soviet Russia thinking right there.
 

MysteryM

Member
Oct 28, 2017
1,751
Err, what? Are you sure you haven't fallen for that Jez Corden Twitter post with a graphics card being directly connected to a TV? That was a joke, meant to show that GPUs aren't the only cost in building a PC.
 

shark97

Banned
Nov 7, 2017
5,327
With the new reveals of these Nvidia cards yesterday, are we now coming to a point where they will simply replace PCs completely?

Becoming all-encompassing devices where you won't need PCs for memory, storage, etc., and everything will be on the graphics card itself? It would be what people buy, like a console: just one device rather than various parts and peripherals complicating things due to incompatibilities.

It seems to be heading this way with the size of these things and the memory, where half of it could be used as system memory and the other half as video memory. With the size of flash memory, it would be easy to attach storage onto a GPU the size of a 3090, too. I mean, the GPU is already taller than an XBSX, so it can fit at least as much on.


You realize the average person is a lot more interested in price than anything, right? A $169 Chromebook works for them. Why on earth would they want super expensive, hot graphics cards to be their PCs?
 
Oct 27, 2017
4,647
I get the feeling what the OP is actually asking is more along the lines of whether APU/SoC-type designs could become the default or dominant paradigm instead of the discrete/modular setups we are used to in computing.
 
Last edited:
Oct 28, 2017
1,091
Yup lmao, when you said that I was like, wait, that def sounds like Cell. I actually found an academic paper where some people were trying to use it to do ray tracing lmao. It was definitely very, very ambitious; maybe one day it gets revisited, but there's a reason GPUs are great at what they do, which is specialized work, versus a CPU, which is general.
Yeah, the technology just wasn't there yet, and maybe it never will be, but it made sense in theory.
 

LumberPanda

Member
Feb 3, 2019
6,373
While the CPU is developing at a snail's pace (with Moore's law breaking down), GPUs are developing at a faster pace. It's possible that at some point these will intersect, meaning that GPUs will be just as good as CPUs at general computation. GPUs can already do general computation; it's just that they are slower than CPUs for these tasks.

I don't have actual in-depth PC knowledge, though.

Right. Not only are GPUs slower for certain tasks, they require a lot more energy to do those tasks.

The very, very layman explanation is that GPUs are set up so that most or all of their cores execute the exact same instruction, just fed different data, while on a CPU each core can run independently (which is one of the reasons why 8+ core CPUs are the hot thing, yet GPUs already have hundreds of cores). This makes GPUs really good at things like 3D matrix math, since you can feed all of the cores the rows and columns of the matrices and get them to all add, or all subtract, or all multiply, etc., at the same time. But if you only need one core for something, you waste all the others.
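If a concrete picture helps: here's a minimal CUDA sketch of that model (the array names, sizes, and launch configuration are just illustrative, not from any real project). Thousands of GPU threads all execute the same kernel; the only per-thread difference is which element of the arrays each one touches.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Every thread runs this exact same kernel; the only thing that differs
// per thread is the index i, i.e. which data element it operates on.
__global__ void addVectors(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        c[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;                 // ~1 million elements
    const size_t bytes = n * sizeof(float);

    // Host-side buffers with some dummy data.
    float* ha = (float*)malloc(bytes);
    float* hb = (float*)malloc(bytes);
    float* hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device-side buffers, plus copies of the inputs onto the GPU.
    float *da, *db, *dc;
    cudaMalloc((void**)&da, bytes);
    cudaMalloc((void**)&db, bytes);
    cudaMalloc((void**)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough threads to cover every element; they run in lockstep
    // groups, all executing the same instruction on different data.
    const int threadsPerBlock = 256;
    const int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    addVectors<<<blocks, threadsPerBlock>>>(da, db, dc, n);
    cudaDeviceSynchronize();

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %.1f (expected 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```

If your workload is one long chain of dependent, branchy steps instead, only one of those threads ends up doing useful work, which is exactly the kind of task where a CPU core still wins.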

EDIT: Did not respond to the wrong post lol.
 
Last edited:

arsene_P5

Prophet of Regret
Member
Apr 17, 2020
15,438
No, not at all. Sure, GPUs are getting better and better with all the ML capabilities and so on, but they don't get better at the tasks the CPU excels at. The CPU is the brain of a computer, console, etc.

Also, GPUs as of now don't have SSD/HDD storage and rely on RAM, which is great for the tasks it serves. However, RAM can't keep its data when the power is gone, meaning that when you shut down the device, whatever is in RAM is lost. That's a no-go for a standalone device.
 
Last edited:

exodus

Member
Oct 25, 2017
9,953
This GPU jump is huge for 2 reasons:

1) 12nm down to 7nm
2) They actually decided to release high-wattage cards for first-gen 7nm, whereas they previously would only release low-wattage cards in the first gen and high-wattage cards in the second gen.

Don't expect much advancement past this gen any time soon. You really can't go much higher than a 350W card; that's pushing the limits. From here, advancement is going to come down to extracurricular tech (e.g. DLSS).

Intel CPUs have stagnated because they've exhausted the potential of 14nm. We're due for a big jump, and then more incremental improvements over the next few years.