Oh damn, this mostly reflects my thoughts and feelings about the committee.
And as soon as I saw gaby's name I knew how it'd end up. I'd only had a few interactions online with him, and it was enough for me to almost hate him.
Welp, I stopped following the c++ std changes 5 years ago anyway.
The reaction to this post on Reddit is predictably not great.
Appeals to respectability, pleas for professionalism, and even complaints about how it isn't actually about safety in C++.
And probably the worst part is that it's being moderated out on both r/cpp and r/programming
Nail on the head. So many programmers chase an “ideal” of simplicity that doesn’t exist. Anything that can’t be neatly boxed and categorised is anathema. The masculine desire to create an Other and deny their humanity with the darksign.
> As a result, this post is going to violate the Tech Industry Blog Social Compact. People get uncomfortable when you have a tech blog and then talk about topics that aren’t purely technical
Love it! I’d never explicitly realised this until now, and in hindsight it’s so obvious I wonder how I missed it!
Just read the post. I was intentionally trying not to follow these things too closely, but this post just confirmed my suspicions, and it also had tons of things I didn't know and don't know whether I want to know. It is overwhelming.
I hope you can find a community that's worth your time. One thing to keep in mind is that just like how Americans are *not* the government we live under, using C++ doesn't make you associated with the committee.
Honestly, if clang allowed people to add more builtins and extensions, or to improve or fork the language, I think you would find that the committee would lose both its bark and its bite.
This was one wild ride for sure. I don’t keep more than half an eye on C++, but ever since the USG report started making the rounds, the dysfunction has been leaking out in a very noticeable way.
And I didn’t know the half of it.
Hope you’re ok.
hi im late to the party and im just a basic bitch white male cs grad with little knowledge about c++ but i liked this post and all its callouts on asshole interviewers belittling interviewees and wishes for a titanfall 3. you're cool. keep on keeping on.
I think you're right that C++ will probably go the way of the COBOL dodo. It's just a doom spiral at this point that's unlikely to change trajectory any time soon, and if it someday does, it will probably be too late.
Thanks for this, super well written and informative.
Lots of important takeaways, but the lead-in/reference for the last section was so incredibly on point that I understood the whole comparison before even reading a single word. Absolutely incredible work.
Sorry you have to endure all of that. I used to look up to those kinds of people, thinking that their technical skills elevated them to a hero-like status. How wrong I was!
What makes me really sad is how such a toxic attitude is actually harmful to them and to the whole ecosystem they created. This is not an excuse for shitty behavior! It makes it even more disappointing.
I’m sorry >_< you do so much cool stuff with toolchains and retrocomputing. I’d seen your blog update while taking a break so the WonderSwan was on my mind… 🥲
It's fine, I learned a lot (and I'm sorry) - it was just a surprising place to find a mention in, that's all!
(I wish we weren't stuck on a slightly buggy port of GCC 6.3, but there's not enough of a community for me to justify writing a new 80186 backend... it'd be a *lot* of work.)
I've spent a long time doing retrocomputing stuff and have some awareness of the diversity of toolchains in that space (there's like four or five actively maintained C compilers for the 6502), so if you ever need to ask questions, I'll do my best to help.
Staying away from /r/cpp is a good choice. That place is insanely toxic.
Even constructive criticism of C++ based on maintaining big C++ codebases will get you treated like a member of the Rust Evangelism Strike Force. Which ironically has been pushing me more towards Rust for new code.
This took me a day and a half to read. Well worth it.
Thank you for speaking up. If you need funding to protect yourself from those assholes, keep us posted. I don't have a well-paying job atm, but when I get one, I'll be chipping in for sure.
What a trip... I don't even like or care about C++, yet for some reason I managed to stay focused on a piece of text, start to finish, for the longest time in a very long while.
Great post, just a question about the BDFL stuff. Did I understand you correctly that you think BDFL governance is bad, but the examples you list later (Hare and Zig) are good cases of it?
I'm saying that every time someone gets slighted by a language they run off to create their own fiefdom and this repeats again and again. I am in fact against the concept of a BDFL. Language development should be a horizontal collaboration, not a hierarchical system with someone at the top.
I think your criticism makes sense, even if I think a collaborative environment can be fostered with a BDFL. I think Python did pretty well when it went from a BDFL to a steering council. It sounds like you'd want a flatter organization but I appreciate having a central authority to make decisions.
A steering council is a perfectly fine option, as long as community members are voting for said council, and there is a guaranteed way of moving people out of said position.
Obviously sites like Bluesky aren't the best for long explanations, but the goal of a BDFL-less system would rely on fluid networks and creating an environment that allows for a more stigmergic approach to language design (i.e., there wouldn't even need to be a steering council in an ideal situation)
in regard to the O(0) thing:
O(0) is actually faster than O(1). take the function f(n) = 1. clearly f is O(1) but there's no n_0 such that forall n >= n_0, |f(n)| <= 0, so f isn't O(0). the other defns of big O also agree
i agree for IRL programs, but the definition of big O permits O(0) to be a thing. i don't claim the best time complexity of an IRL program is O(0) – i claim O(0) is a strict subset of O(1)
izzy's interviewer sucked at interviewing, but i just disagree with 'any constant inside O turns into O(1)'
just to be clear, I think forall x in R \ {0}, O(n ↦ x) = O(1)... and it's fine/good to call it O(1). i don't know about this cost thing. O(0) is the only special case
Would you still use O(1) for two constant time functions, where one runs in milliseconds and the other takes a minute?
They're both constant time, so by your definition O(1), but they are not the same thing.
That's why I prefer O(c) for constant time functions as it forces you to consider this.
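(For anyone skimming this sub-thread: here is the elementary constant-factor definition the O(0) argument above is appealing to, written out. The notation is mine, not anything from the post or the comments.)

```latex
% Elementary (constant-factor) definition of big-O:
%   f is O(g) iff there exist C > 0 and n_0 such that |f(n)| <= C|g(n)| for all n >= n_0.
\[
  f \in O(g) \iff \exists\, C > 0,\ \exists\, n_0:\ \forall n \ge n_0,\ |f(n)| \le C\,|g(n)|.
\]
% Taking g \equiv 0 collapses the bound to |f(n)| \le 0, so under this definition
\[
  O(0) = \{\, f : f(n) = 0 \text{ for all sufficiently large } n \,\} \subsetneq O(1),
\]
% which is exactly the "strict subset" claim above: f(n) = 1 is in O(1) but not in O(0).
```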
in regard to the O(1) thing:
O(1) is actually faster than O(2). take the function f(n) = 2. clearly f is O(2) but there's no n_1 such that forall n >= n_1, |f(n)| <= 1, so f isn't O(1). the other defns of big O also agree
The definition of big-O requires that the function inside O is non-zero (for sufficiently large x). Otherwise the definition/equivalence by limit superior (f(x) = O(g(x)) <=> lim sup x->inf |f(x)|/|g(x)| < inf) wouldn't work.
This pedantry is both annoying *and* wrong, the worst kind.
1. The mathematical definition, which predates computers, doesn't allow it as I said above. So from a formal perspective it doesn't make sense.
2. Who cares about the runtime of something that doesn't run, it's like a zen riddle. So practically it also makes no sense.
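(Written out, the limit-superior formulation being quoted here; notation mine. It presumes the comparison function is eventually non-zero, which is why a constant-zero g is excluded from it.)

```latex
% Limit-superior formulation of big-O, defined only when g(x) != 0 for all
% sufficiently large x (otherwise the quotient below is undefined):
\[
  f(x) = O(g(x)) \iff \limsup_{x \to \infty} \frac{|f(x)|}{|g(x)|} < \infty .
\]
% Note the hidden constant in either formulation: f(n) = 2 gives
% limsup |f(n)| / |1| = 2 < infinity, so f is O(1) even though |f(n)| <= 1 fails.
```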
god, what a fucking journey this was to read. dark souls bit got me good. mad props for not holding back against these fuckers. power structure hell forever
std::optional is absolutely based on JeanHeyd's work, but he went off to fix C's transcoding.
Also, though, we fixed a bunch of problems with dangling and lifetime that we couldn't have 6 years ago. Maybe.
I drew the short straw, though.
Also optional is the dumbest smart pointer.
The point I was trying to make is that for a long time we wanted JeanHeyd’s paper. However, once someone else’s name was attached to the paper, suddenly it was taken more seriously. This happened with byteswap, from my perspective.
I was directly promised that if I pushed through the paper it would be killed in plenary, regardless of technical merit.
It had nothing to do with me going to C. I explicitly asked my name to not be on the Action Item poll to bring a follow up paper because I have a nose to smell and am not stupid.
Holy shit, that was a ride. Thank you for writing it all down. Now you have me wondering what an anarchist/feminist PL development process would be. I hope you find some rest after writing and reliving all that.
This was fantastically written and infuriating. Experiencing the toxicity and sometimes blatant racism in online 'professional' C++ forums in the late 90s led me to avoid those spaces and get on with my work. It was never safe.
It's maddening that anyone still has to deal with that bullshit.
You are actually the second person to say this to me (and also there is a reason the post ended the way it did talking about a specific game that he defended once. 🤔). I was joking to my girlfriend that this was turning into a YT video, which is why I added those "subject jumps" 😁
😂
"Now, in that linked paper Bjarne writes the most fucking insane thing I’ve ever read that is most definitely technically correct in the same way that a gram of Uranium-238 contains enough calories for a lifetime."
Super interesting post. Thanks for taking the time to write and publish it despite possible backlash. Lately I've become increasingly cynical about the direction of the language, and it's disappointing to see some of my fears confirmed.
I would love to be able to use another language for gamedev, but all the battle-tested libraries in this space are C or C++ first. Managing bindings is a huge waste of time to me. I wish we had a standardized IDL to generate bindings from or at least blessed, first-party bindings in safer languages.
I think the closest we're going to get are whatever is being cooked up over in WebAssembly land. Zig I know is picking up steam in some indie dev spaces, but I just can't enjoy that language when it's basically just "LLVM Intrinsics: The Language".
It’s almost like most people aren’t trying to explode the world or be malicious by default despite what so-called tries to say or push as a narrative for what amounts to “the most base” of human behaviors 🤔
Almost as if people writing code are willing to take on some personal responsibility. 💆♀️
what I mean is more along the lines of this hypothetical:
a library whose tests cover runtime execution behavior would need to extend its test suite to cover arbitrary compile-time execution contexts when checking that a version bump doesn't break compatibility.
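(To make that hypothetical concrete, a minimal C++ sketch; the function name and values are made up for illustration, not from any real library.)

```cpp
#include <cassert>

// Hypothetical library function: constexpr, so callers can use it in both
// runtime and compile-time contexts.
constexpr int clamp_to_byte(int v) {
    return v < 0 ? 0 : (v > 255 ? 255 : v);
}

int main() {
    // The runtime behavior most existing test suites already cover.
    assert(clamp_to_byte(300) == 255);

    // The extra compile-time context the comment describes: if a later version
    // of the library stops being usable in constant expressions, only a test
    // like this one catches the break.
    static_assert(clamp_to_byte(-5) == 0, "must remain usable at compile time");
}
```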
I'm not convinced parsing C is the way to go. zig cImport works for basic stuff, but in practice its output changes between compiler releases or it fails on all-too-common macro spaghetti. To get any stability/usability guarantees you pretty much have to run translate-c manually and fix the output.
Interestingly, GNAT Ada has had this feature for ages (via gcc -fdump-ada-spec), and it has similar issues. It's a good starting point, but neither language truly solves the maintenance burden of keeping bindings in sync with the wrapped library.
Also would you mind if I print this out and hang it above my monitor? I have *many* conversations in which this is directly and instrumentally appropriate.
I've been told this many times. Yet someone keeps paying me to write more Rust, so I have to assume they are using what my team is building or they wouldn't keep paying us.
Wow that post was a WILD ride... and then FEMINISM IN PL DESIGN! This is something I have lived and breathed since I first realized that my autistic brain *loves* programming language design and implementation. Thank you for the honesty, the raw heart, and the openness you've shared with these words
There are many things to appreciate about this post, but the fact that after I read this I went "Oh, like the plot of Dark Souls", and then you *referenced fucking Dark Souls* was just 👌 mwah, masterful, chef's kiss
“C++ joins the ranks of languages like COBOL” — and I thought it already did this. If it weren’t for outdated education and lots of legacy code I don’t think anyone would learn C++ as a first language.
Good point! Although, they'd use C if it had classes and bolt whatever they like on as a pre-processor (I'm looking at Epic and whatever abomination of C++ they're using in their engines).
There is a reason that I mentioned Lambda the Ultimate in my post. And it’s because Tim Epic is the 97th registered user. I used to look up to him when I was a kid because I was very much into Unreal Engine, and UnrealScript was very interesting to me. So Verse is basically his ideal language.
So many exploitable quotes in this wall of man-made horrors.
"You're about to do the following actions"
*people do the actions*
"wow, who could have predicted this"
De facto the clang infrastructure moves underneath you and you can't.
😔
A generation has been inspired. We may yet see a spiritual successor.
I’ll keep that in mind :D!
Staring too long at the internals of C++ is a certain path to madness
Surprisingly I am even less interested in C++ now.
(PS: loved the DS reference. Fkn Gwyn, man.)
I agree the interviewer was terrible.
I didn't mention such a coefficient in my post since it's dominated by 0 anyway
it's so deeply stupid in so many ways it almost beggars belief
At one point it would have been 79 pages 😅
Sigh.
You did good work, and weren't wrong.
It also helped that library has mostly turned over, so the people looking at it today weren't personally wrong.
Real problems are people problems.
But I am constantly citing your work, to people who are now acknowledging it was correct.
it looks like it took courage to post it.
i am grateful that you did, and it looks like there's a lot of us who feel that way.
"Now, in that linked paper Bjarne writes the most fucking insane thing I’ve ever read that is most definitely technically correct in the same way that a gram of Uranium-238 contains enough calories for a lifetime."
Almost as if people writing code are willing to take on some personal responsibility. 💆♀️
a library whose tests cover runtime execution behavior would need to extend its test suite to cover arbitrary compile-time execution contexts when checking that a version bump doesn't break compatibility.
*couldn't find your pref. pronouns
I saw their swap function before it was turned into std::swap. That’s what I was referring to with the “Jesus *wept*” part of my post
It's awesome.