pcwalton.bsky.social
Programming languages and graphics person. Rust compiler developer, Firefox hacker, Bevy contributor.
30 posts · 2,000 followers · 13 following
comment in response to post
I believe GPU-driven rendering is fairly common in AAA engines. Unreal uses it in Nanite, but not for the "normal" path yet. Most other engines are CPU-driven, I think.
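Roughly what "GPU-driven" means in code, as a minimal sketch assuming wgpu's `MULTI_DRAW_INDIRECT` feature (the names here are illustrative, and the buffer setup and culling shader are elided):

```rust
/// One entry per mesh; mirrors wgpu's `wgpu::util::DrawIndexedIndirectArgs`.
/// A culling compute shader writes `instance_count` (0 = culled, 1 = drawn),
/// so visibility is decided on the GPU rather than the CPU.
#[repr(C)]
#[derive(Clone, Copy)]
struct DrawIndexedIndirectArgs {
    index_count: u32,
    instance_count: u32, // written by the GPU culling pass
    first_index: u32,
    base_vertex: i32,
    first_instance: u32,
}

/// The CPU records a single draw call no matter how many meshes are in
/// the scene; requires `wgpu::Features::MULTI_DRAW_INDIRECT`.
fn record_gpu_driven_pass(
    pass: &mut wgpu::RenderPass<'_>,
    indirect_buffer: &wgpu::Buffer, // filled by the culling compute pass
    mesh_count: u32,
) {
    pass.multi_draw_indexed_indirect(indirect_buffer, 0, mesh_count);
}
```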
comment in response to post
As an example of the performance: Activision's Caldera map, with ~30k meshes and millions of polygons, takes about:
* 60 ms/frame in 0.14;
* 45 ms/frame in 0.15;
* 9.1 ms/frame in 0.16 (with all my PRs landed).
By way of comparison, Blender won't even load the glTF file, it's so big.
comment in response to post
I'm using the code editor it popped up, not the chat directly. It's not going well :(
comment in response to post
Yep, Swift is GC'd.
comment in response to post
I know that's not really true, but it's good enough to get started, and then you can learn all about bivectors and Clifford algebras and whatnot later once you have the basic intuition. 2/2
comment in response to post
A lot of people defend LLMs with "well, humans are fallible too!" True, but humans can also gauge their level of certainty, and LLMs (currently) don't.
comment in response to post
It's in main. Try `cargo run --example occlusion_culling`
comment in response to post
It can't be implemented on WebGL 2 because it needs compute shaders, which WebGL 2 lacks. WebGPU *should* work, modulo wgpu bugs.
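A minimal sketch of how to detect that at runtime, assuming wgpu's downlevel-capabilities API (the helper function name is made up):

```rust
/// Returns true if the adapter can run compute shaders at all.
/// WebGL 2 reports false here, which is why GPU occlusion culling
/// can't work on it; a WebGPU adapter should report true.
fn supports_compute(adapter: &wgpu::Adapter) -> bool {
    adapter
        .get_downlevel_capabilities()
        .flags
        .contains(wgpu::DownlevelFlags::COMPUTE_SHADERS)
}
```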
comment in response to post
I just... don't see it. Doing gameplay logic in the ECS speeds the development process up for me. Maybe for tiny games it's different. But when everything is interacting with everything else, I need all the help I can get to manage complexity.
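A tiny sketch of the kind of ECS gameplay logic meant here, using Bevy's `add_systems` API; the components are invented for illustration:

```rust
use bevy::prelude::*;

#[derive(Component)]
struct Health(f32);

#[derive(Component)]
struct Burning {
    damage_per_second: f32,
}

/// Systems like this compose: burning affects anything with Health,
/// regardless of what else the entity is. That separation is what helps
/// manage complexity once everything interacts with everything else.
fn apply_burning(time: Res<Time>, mut query: Query<(&mut Health, &Burning)>) {
    for (mut health, burning) in &mut query {
        health.0 -= burning.damage_per_second * time.delta_secs();
    }
}

fn main() {
    App::new()
        .add_plugins(DefaultPlugins)
        .add_systems(Update, apply_burning)
        .run();
}
```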
comment in response to post
Generally, yes. For UI, Bevy is a bit clunky right now, but so are all the other Rust UI solutions.
comment in response to post
Now if only we could get iOS/macOS to move to real GC...
comment in response to post
Also, the types would have to be POD (plain old data).
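For reference, a hedged sketch of what POD means in Rust terms, assuming the bytemuck crate (with its derive feature) as the usual way to express it:

```rust
use bytemuck::{Pod, Zeroable};

/// A POD type: fixed layout, no pointers or destructors, so it can be
/// copied byte-for-byte (e.g. straight into a GPU buffer).
#[repr(C)]
#[derive(Clone, Copy, Pod, Zeroable)]
struct Particle {
    position: [f32; 3],
    lifetime: f32,
}

/// Something like this is NOT POD: String owns heap memory and has a
/// destructor, so a raw byte copy of it would be unsound.
#[allow(dead_code)]
struct NotPod {
    name: String,
}
```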