The “C” in “CPU” stands for the C Programming Language. CPUs are called that because they are a hardware implementation of the C Abstract Machine as specified by ISO/IEC JTC 1/SC 22/WG 14.
Comments
When I was a student in the early '80s I don't recall a topic on GPUs. They're integrated nowadays. Not to be confused with Graphics Processing Units, which weren't a thing until the late '80s.
You and me both. It's hard to tell these days with acronyms being renamed, like RN was Registered Nurse, not right now... I'll sit my fossil self in the corner now.
I mean legit the FPU instructions and number types really are from the Fortran / scientific product line and the CPU ones really are from the product line for business and, uh, COBOL. Shit. I think .. is the CPU the COBOL processing unit?
I thought x86 was targeting C, since it was 78 and I'm pretty sure at the very least that better support for HLLs was the rationale for base+index*mult addressing support and LEA.
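A quick sketch of what base+index*scale addressing and LEA buy a compiler, assuming x86-64 and a typical optimizing compiler; the assembly in the comments shows the usual shape of the output, not a quote from any specific toolchain:

```c
/* Array indexing maps onto a single scaled-addressing operand, and LEA
   can do the same address arithmetic without touching memory at all. */
int load_elem(const int *base, long i) {
    return base[i];      /* roughly: mov eax, dword ptr [rdi + rsi*4] */
}

int *elem_addr(int *base, long i) {
    return &base[i];     /* roughly: lea rax, [rdi + rsi*4], address math, no load */
}
```

That is the HLL-friendly part: one instruction per subscript for the common element sizes (1, 2, 4, 8 bytes).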
...Then again, I can't think of any HLL except COBOL that AA[ASMD] would help with. And those are one byte each!
I think x86 inherited them from 8080 (they were also in the 4004), but I mean BCD instructions and binary-coded decimal in some form go back to the very earliest machines and predate all HLLs entirely. ENIAC was decimal!
(also x86 wasn't mainly targeting C, what with all the nested-function stuff..)
Ah. Hmmm. Admittedly my history of popular imperative languages in the 70s is a bit spotty. Pascal was also important I suppose? BLISS maybe? Probably like 50 others...
You're right about the BCD stuff coming from the 8080. Not quite as nice as the 6502 equivalent, but MOS had a patent x.x
False. The cpu is the Cool Processing Unit which is why it wears sunglasses all the time. The GPU is the Gay Processing Unit which is why it has all the rainbow colors
Specialized picometer feature chips for running large AI workloads with 32 bit floating point precision are AFUs. The most specialized have to be shipped in containers marked "Ship High In Transit" and thus are S.H.I.T. AFUs.
Failure to see the humor in telling lies and calling them jokes may be a neurodivergent thing. Literal thinkers are likely to be annoyed by this type of "comedy".
Things become funnier if you assume that other people know at least as much as you do, and that when someone says something we all know to be false, it can therefore be assumed to be a joke. Assuming by default that people are stupid and arrogant makes one's life utterly miserable.
I actually assumed they knew more than me because I didn’t know what the last part was referring to. But everyone knows what CPU means. If someone knows one thing, but not another that’s more fundamental, it gives the impression they’re parroting info they read.
damn maybe those 'literal thinkers' should give up on using social media then since they not only can't understand jokes but think jokes shouldn't exist because they can't enjoy them?
As a mostly web developer, “ISO/IEC JTC 1/SC 22/WG 14” is either a real ass thing or one of the most perfect parodies of IEEE-like specs I have ever seen. Bravo!
I am lucky to be following you. I am embarrassed a bit that after 45+ years of coding (although unfortunately not much C lately) I was unaware of this.
Well, that's wrong. It's called C because its predecessor language was B, and B's predecessor was BCPL ("Basic Combined Programming Language"). https://en.wikipedia.org/wiki/BCPL
I figured that out eventually, but I agree with other posters that it failed to meet the criteria of being funny. I would be happy to workshop any future C related humour with you.
Ohhh I get it now 🤭
I’m an LPN and a dumb jock. I got confused because I thought it was central processing unit. I was like wait. Oh, smart people humor 😉 ✌🏼
Blaming this on the C VM is kind of missing the forest for the trees. A lot of CPU architectures started to develop independently of C, because C was still too "big" to write for them until well into the 16-bit era. Each arch had its own assembly and ran into all the same problems.
The root cause is that a lot of tasks we need in general computing are sequential. There's a reason we don't handle page requests on servers with a GPU despite there being a massive ton of them arriving in parallel - each thread is still sequential branching, and GPUs aren't good at it.
And the way we do write GPU programs exposes the underlying problem. You can give the code control over prediction, but GPU programs are basically compiled to do a ton of speculative execution and caching, discarding results. So you turn a hardware exploit into a whac-a-mole of software ones.
I'm not going to claim that C design hasn't influenced CPU architecture design at all, especially the x86 lineage, but simply saying, "And that's why we have speculative execution side channel exploits," is really missing the point.
https://youtu.be/iJy8VgB83OQ?si=uu-3DdYVGxEj7Y5f
Carma Police, Uh-rest this man.
This is a brilliant post thank you.
I don't think you're real
The C Processing Unit as “common knowledge” is funny because of how obviously false it is, especially the common knowledge part.
Even my Boomer dad knows it is wrong.
Pretty sure CPU stands for "Central Processing Unit"
*checks notes*
Yep, "Central Processing Unit" alright.
The CPU predates the C programming language, because, well, it sorta has to, right?
https://en.m.wikipedia.org/wiki/Central_processing_unit
Don't call us. We'll call you.
all performance wins now come from cache locality (=> prediction) and SIMD, not from massaging void stars
And... reachable from better languages.
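To make the cache-locality and SIMD point above concrete, here is a small sketch in plain C (hypothetical function names; the note about auto-vectorization describes typical compiler behavior at -O2/-O3 with floating-point reassociation allowed, e.g. -ffast-math, not a guarantee):

```c
#include <stddef.h>

/* The same reduction over n floats, two memory layouts.
   The contiguous version walks memory with unit stride, so it is
   prefetch-friendly and the kind of loop compilers auto-vectorize.
   The linked-list version is a chain of dependent loads: cache misses
   dominate and SIMD cannot help, no matter how the pointers are cast. */
float sum_contiguous(const float *a, size_t n) {
    float s = 0.0f;
    for (size_t i = 0; i < n; i++)
        s += a[i];
    return s;
}

struct node { float v; struct node *next; };

float sum_list(const struct node *p) {
    float s = 0.0f;
    for (; p; p = p->next)
        s += p->v;
    return s;
}
```

Both functions compute the same sum; the only difference is memory layout, which is where the wins (or losses) come from.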