Honestly, I did not foresee the rise of AI. I had assumed that the unit of measurement for computer power would always remain GHz, and that hardware rendering would simply keep getting faster over time. But given that DLSS and other comparable technologies already use AI, I'm concerned that it could eventually replace our GPUs. Excuse my ignorance, but if AI is already able to place "fake" frames between real frames, how much further could that go? What if, for example, the GPU only needs to render one frame per second, possibly even less, and AI takes care of the remainder, filling it in at whatever framerate you want?
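
To illustrate what I mean, here is a minimal sketch of the idea in Python. It only does plain linear blending between two rendered frames, nothing like the motion-vector-plus-neural-network approach DLSS actually uses, and the 60 fps target is just a number I picked as an example:

    import numpy as np

    def interpolate_frames(frame_a, frame_b, n_between):
        """Naive stand-in for AI frame generation: linearly blend two rendered
        frames to make n_between synthetic frames. Real frame generation
        (DLSS 3, FSR 3) uses motion vectors and a trained network instead."""
        for i in range(1, n_between + 1):
            t = i / (n_between + 1)  # position between the two real frames, 0 < t < 1
            yield ((1.0 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

    # Example: the GPU renders only 1 frame per second; we "fill in" up to 60 fps,
    # so 59 synthetic frames are needed between each pair of real frames.
    frame_a = np.zeros((1080, 1920, 3), dtype=np.float32)  # placeholder rendered frame
    frame_b = np.ones((1080, 1920, 3), dtype=np.float32)   # next rendered frame
    synthetic = list(interpolate_frames(frame_a, frame_b, 59))
    print(len(synthetic))  # 59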

The thought that an AI processor might one day take over the functions of our GPUs is both intriguing and unsettling.


All 4 Replies

Take some time to research AI/ML as it exists today. The usual one (I won't write its name because it could result in a deletion) runs on hardware such as multiple GPUs with 48 or more gigabytes of RAM each.

So no, very few will have that hardware to host the AI on.
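
As a rough back-of-the-envelope check (my assumptions: 2 bytes per parameter for fp16 weights, and only the weights are counted; activations, caches and overhead would need considerably more):

    import math

    def model_memory_gib(n_params, bytes_per_param=2):
        """Weight memory in GiB, assuming fp16 (2 bytes per parameter).
        Activations, KV caches and framework overhead are not included."""
        return n_params * bytes_per_param / 2**30

    def gpus_needed(n_params, gpu_vram_gib=48.0):
        """Minimum number of cards of the given VRAM size just to hold the weights."""
        return math.ceil(model_memory_gib(n_params) / gpu_vram_gib)

    # A 175-billion-parameter model at fp16:
    print(round(model_memory_gib(175e9)))  # ~326 GiB of weights
    print(gpus_needed(175e9))              # 7 cards at 48 GB each, for the weights alone

And that is only the floor needed to load the thing; serving it with any kind of throughput takes far more hardware, which is why hardly anyone hosts these models at home.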

commented: Could result in a deletion? We have been through this before. Stop being dramatic. -8
commented: thanks for the suggestion +0

As to the fake frames, that's already here, and not just in games; it's inside our cameras too. We are quickly moving away from the "picture" being only what the pixels in the sensor reported. Apple, Samsung and others now run the image through various algorithms (which do include AI/ML), so the old saying "a picture never lies" has been laid to waste. The pictures you get from many smartphones are no longer the raw sensor data they used to be.
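
As a toy illustration of that (my own simplified stages, not any vendor's actual pipeline), the saved image is the raw readout pushed through denoising, tone mapping and sharpening:

    import numpy as np
    from scipy import ndimage

    def process_raw(sensor):
        """Toy post-capture pipeline: what gets saved is not the raw sensor readout.
        These stages are simple stand-ins for the proprietary (and increasingly
        ML-based) pipelines in modern phones."""
        img = sensor.astype(np.float32)
        img = ndimage.gaussian_filter(img, sigma=1.0)          # denoise
        img = np.sqrt(img / img.max())                         # crude gamma-style tone mapping
        blurred = ndimage.gaussian_filter(img, sigma=2.0)
        img = np.clip(img + 0.5 * (img - blurred), 0.0, 1.0)   # unsharp-mask sharpening
        return (img * 255).astype(np.uint8)

    raw = np.random.randint(0, 1024, size=(480, 640), dtype=np.uint16)  # fake 10-bit sensor data
    saved = process_raw(raw)  # already several steps removed from "what the sensor saw"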

As to fake videos, those have been demonstrated for years, and as compute/AI/ML hardware has improved, running into one is now an everyday occurrence. From my friend David: "My boss got deep-fake scammed yesterday. He received a video call from his boss (redacted). It was (redacted)'s face, speaking in (redacted)'s voice saying: "I'm on site in (redacted) and need the server credentials. Please text them to me here. I gotta go. Thanks." Aaron happened to know that (redacted) wouldn't have the slightest idea what to do with server credentials, so it didn't work. But it sure could have."

You already mentioned DLSS and such, so I take it we don't need to discuss Tensor cores in GPUs and their relationship to AI/ML.


AI runs on GPUs: thousands of them, perhaps millions of them. The difference is where the processing occurs, not the kind of hardware.

commented: "GPT-3 has 175B parameters and would require 326GiB memory" so a few hundred current gen GPU cards. +17