H[ARC]! A New Challenger [Intel] Enters The Arena
--
Today Intel took the wraps off its long-awaited GPU platform.
Arc.
Technology is great, wonderful, amazing, and all of that jazz. The best news here is that there is more competition in the gaming space.
Once we get beyond that aspect of this announcement, I find myself just shrugging my shoulders and moving on with my day.
Don’t get me wrong — I have spent more than my fair share of time reading articles about transistor counts and CUDA cores and ray tracing.
Even having done more research and reading than most, the moment that stands out for me is watching a younger gamer turn on ray tracing on a brand-new Nvidia RTX 2060, expecting something absolutely amazing. The telling comment came not long afterward.
“Is it working?”
Then I had to explain to a 14-year-old what ray tracing actually is, and point out the small reflections in a few areas of the screen that showed it was, indeed, working.
The problem here is very simple. Computer graphics processing has fallen into the same trap as cell phones.
Every year Apple builds and releases new cell phones. But going back to 2007, when the first iPhone launched, there hasn’t been much change in them. They have been maddeningly iterative.
Sure, the cameras get slightly better every year, and the processors get slightly faster. The memory bumps up every few years. The screens get gradually bigger and sharper.
GPUs are mired in the same morass of slogging along with higher and higher transistor counts and more and more buzzwords heaped atop larger and larger silicon dies.
Nobody is doing anything revolutionary here, and I don’t think Intel is even trying to.
Yes, Intel has already answered the one question that has been basic table stakes for PC gaming for years now.
Their GPUs can run Crysis.
For ages that question has been a rallying cry, the de facto memed litmus test for GPU quality…