slckl.bsky.social
I work on software, mostly in Rust, mostly computer vision.
Love weird music, science and pompous writing.
I buy many books, but read few.
Will mostly post/repost about AI stuff.
comment in response to
post
if your llm isn't quantum nirvanic, it's ngmi
comment in response to
post
I mean, I need to see a particular PC to dunk on before I can trust this combination exists in the wild.
comment in response to
post
*slaps ass cheeks of straw man*
this bad boy can hold so many contradictory opinions
comment in response to
post
This one leans closer to the original, but the original is still livelier: www.youtube.com/watch?v=lEAw...
comment in response to
post
SheetsGPT here to do you...r taxes?
comment in response to
post
This is in great contrast to the dark days when nothing worked. A magical era, truly.
comment in response to
post
Poor Poland. Why them?
comment in response to
post
The growth of a watchlist commonly exceeds the speed of consumption.
We'll take our watch and read lists to the grave...
comment in response to
post
Given the vibes from this list, may I recommend Bubblegum Crisis and the second Patlabor movie? The Patlabor movie feels like a prototype for the GitS movie in many ways. Bubblegum Crisis is the quintessential 80s cyberpunk vibe.
Memories and Perfect Blue are also missing.
comment in response to
post
get another, and you'll be at 20, yay.
comment in response to
post
The quotes do not seem entirely bad. LLMs do eat human labor as data, and private AI labs would love for everyone to pay for controlled access. The tone is somewhat cringe, and I don't understand the bureaucracy part, but I've seen worse for sure.
comment in response to
post
following you on bsky :>
otherwise I follow everything in English; of the Latvian content I'm mostly interested in those doing something interesting with AI in our local language. Simply retelling what's happening in English isn't very interesting...
comment in response to
post
a scruffy and a neat on the same chair, preposterous
comment in response to
post
they write whole books like this, wtfff
comment in response to
post
Apple certainly thinks so.
comment in response to
post
royal kitty
comment in response to
post
well, it's not that bad, it splits proportionally; overall, at least for now, it looks like a narrow win in Riga.
comment in response to
post
Got it. Originally I read it as 2/3 of everyone either having voted or being willing to vote for šleso/sjv, but apparently the thought there wasn't that grim.
comment in response to
post
can you explain the math? I see these badgers got 30%. how do you squeeze 2/3 out of that?
comment in response to
post
most of the world*, really.
comment in response to
post
Whether you're a Kremlin stooge by conviction or just selling yourself as one doesn't really change the substance. Someone like that shouldn't be anywhere near the reins of power...
comment in response to
post
sounds like a good case.
do tell how it went.
comment in response to
post
ye, a solid -3000 LOC PR is worth gold.
depends on how easy it is to prove it's not used/needed, however...
comment in response to
post
amazing, gj, elon
comment in response to
post
I agree that the brain is a messy affair, but I think biological self-repairing systems that can tolerate the loss of whole modalities are not good examples of "fragility".
If the brain were code, it'd be incredible spaghetti, yet difficult to topple completely.
comment in response to
post
MIT license on code, a very pleasant surprise.
comment in response to
post
I think there's a small contradiction between "fragility" and "redundant mechanisms", haha.
But I think I get what you mean. The complexity of the brain may turn out not to buy so much?
Remains to be seen, however...
comment in response to
post
I feel like something has to give if we drop most of pretraining (data). Does creativity suffer? Can you bootstrap creativity via a synth pipeline? Is there a synth pipeline version of human empathy? Seemingly hard to verify?
Maybe we'll find out...
comment in response to
post
The synthetic approaches do make the capabilities less magical. What previously magically fell out of web-scale data can now be traced back to an intentional training pipeline.
comment in response to
post
Of course, the pile-of-spaghetti approach reaches saturation too, where models are no longer capable of adding new stuff to the pile, but in my experience this moment comes a bit later in Python/JS than in Rust.
comment in response to
post
Very sensible take.
I feel that with extreme "vibe coding", where you don't interact with the code yourself at all and try to one-shot whole apps/features, models tend to get stuck in Rust, bogged down in compiler errors, whereas Python/JS lets them add more and more to the pile of spaghetti.
comment in response to
post
An absolute legend for sure.
comment in response to
post
this seems very impressive, thanks!
comment in response to
post
Gambling analogy for vibe coding hits well. Just one more roll, bro, one shot only.
comment in response to
post
I may have python-induced trauma.
comment in response to
post
some backlogs are more impressive than others