Does anyone remember the Inmos transputer, programmed in Occam, the parallel language? Decades ago I worked on writing software for what was at the time the world's largest transputer array processor (~5,000 processors). That work was all about parallel processing, within each processor and across all of them.
This reads like someone saw an oversimplified explanation of how a GPU is different from a CPU and is unaware that Graphics Processing Units do other things besides AI.
What's the source of that? That's not how the graphic appears on the page when I looked the article up. There it's labeled conventional CPU vs. GPU parallel processing. The text of the piece mentions parallel computing, but in context it's clear they're trying to explain CPU vs. GPU architecture.
I'll take your word for it since it's not visible there anymore. FWIW, I think the version in the article makes a lot more sense, especially in the context of the surrounding text, which is trying to explain to a layperson why Nvidia's chips suddenly matter so much.
Of course, I would nitpick that multicore CPUs are standard these days, as is threading, but I get that they're trying to quickly explain the difference between, like, 8 Ryzen cores and thousands of CUDA cores without getting into what things like CUDA or tensors are.
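For what it's worth, the gap they're gesturing at looks roughly like this in code. This is just a toy CUDA sketch I'm making up to illustrate the idea (the array size, kernel name, and launch dimensions are arbitrary): the GPU side launches on the order of a million lightweight threads for a single trivial operation, which is the thing a handful of CPU cores can't do.

```cuda
#include <cstdio>
#include <vector>

// GPU side: one lightweight thread per array element. The launch below
// spawns ~1M of them, which is what "thousands of CUDA cores" are for.
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;                  // ~1M elements, arbitrary size
    std::vector<float> host(n, 1.0f);

    float *dev = nullptr;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // 4096 blocks x 256 threads: roughly a million GPU threads in one call.
    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);
    cudaDeviceSynchronize();

    cudaMemcpy(host.data(), dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);
    printf("first element: %f\n", host[0]); // expect 2.000000
    return 0;
}
```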
Lmao, holy shit. Like... bruh, relatively inexpensive personal computers have been capable of multi-threaded computing for AT LEAST 15 years. Hell, my cellphone (brand new, yes, but still) has an octa-core processor, for crying out loud lol
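(And yeah, the CPU side of that has been bog-standard for ages. A host-only sketch in the same C++/CUDA toolchain, again purely illustrative: eight worker threads, roughly one per core on a typical desktop CPU or phone SoC, splitting an array among themselves.)

```cuda
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    const int n = 1 << 20;
    std::vector<float> data(n, 1.0f);

    // Eight threads, roughly one per core on a run-of-the-mill octa-core chip.
    const int workers = 8;
    std::vector<std::thread> pool;
    for (int w = 0; w < workers; ++w) {
        pool.emplace_back([&data, n, w, workers] {
            // Each thread scales its own contiguous slice of the array.
            int chunk = n / workers;
            int begin = w * chunk;
            int end = (w == workers - 1) ? n : begin + chunk;
            for (int i = begin; i < end; ++i) data[i] *= 2.0f;
        });
    }
    for (auto &t : pool) t.join();

    printf("first element: %f\n", data[0]);  // expect 2.000000
    return 0;
}
```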
Incredible that the NYT doesn't have anyone with an undergraduate-level education in CS reading over this stuff. Absolute fucking rag. Honestly, any STEM student with, like, 2 or 3 programming classes should know this is wrong.
I have not studied CS and did not do any STEM courses (OK, I did math as a major in high school 35 years ago). But I've used computers for, well, 40 years, and I nearly choked reading this. I mean, what about "multithreading" or "multiple cores" is so confusing to them?
(Yeah and // computing ofc)
There's a difference between "just discovered" and "is currently explaining," though. There's an overwhelming amount of fair criticism to level at the Times but I would say this isn't that.
Holy bleep, this is amazing. You mean I can use all of the thousands of CPUs in my cluster _at the same time_? The decade-long wait times to run something on 1 CPU have been stifling research at UC Davis since 1837, so this will be transformative.
I had recently read it somewhere else and suspected it was already a meme for reasons, so I googled it myself as well. Chin up, buddy: with this administration you get a new meme like every six hours, nobody can keep up with that.
Oh, that reminds me: do you have guidance or links on how to set Handbrake up for proper squeezing? I am very bad at finding the sweet spot for size vs. quality.
I can't even think of what must have happened in that editorial room. Nobody, like, ever read anything on computing, not even on Wikipedia?!
I'm afraid journalism has been dead for good for a long while now. Nobody ever checks any information. Ever. As long as it fits the spin and sounds good enough.
It's bad for his brain, and it's bad for the rest of us as well.
IT'S FULL OF AI!!!
This graphic makes it seem like the technology to... display an image on a screen didn't exist until Sam Altman pooped his diapers
Everything is now AI
https://knowyourmeme.com/memes/everything-is-computer
Good lord the NYT is getting stupider by the minute.
"Think" does a lot of work in that sentence.
Right?
But actually... this really is state of the art in embedded-systems chips (ask me how I know 😉).