tehscientist.bsky.social
(He/Him) Dad and Husband, Materials Scientist in Med Devices, AuDHD, dabble in leadership, data, photos, and make lots of pizza.
525 posts 251 followers 691 following

Jamelle Bouie hits it out of the park. Gift link. www.nytimes.com/2025/02/05/o...

Some photos from Oregon

We all wanted a younger generation to have more influence on the country, right? www.wired.com/story/elon-m...

99% of my fears right now stem from having two kids under 7 while this is all going on; I’m not sure how I’ll protect them or even feed them 12 months from now.

When does the bidding begin on some sweet domains? I want IRS.gov to start my next ‘business’ idea.

Every time I go for my laptop instead of my phone I am reminded of how much better it is to have a full screen and peripherals. Why do I look at my phone so much????

OpenAI right now

Finally getting around to organizing my catalogue and editing some photos - here's a hummingbird in my brother's backyard in LA from last spring when we visited

I sensed that this would happen but it’s still a gut punch. My hope is that state-level regulators and municipalities will do the right thing, but I don’t trust that. Whole-home filtration system, here we come?

Driving while impaired cases dropped during investigation into NC troopers’ credibility. Wake County DA has dismissed 180 pending cases while investigating whether a trooper provided false or misleading information about a wreck. (Via @virginabridges.bsky.social) www.newsobserver.com/news/local/c...

Just cancelled my Netflix after 20 years of subscription. Raising prices after a record quarter is not something I’m keen on. Made me do the math on annual spend, and that made it easy to say no-thanks.

This morning I read that DeepSeek used training data generated from other language models, which has companies like OpenAI fuming, and I can't stop thinking about how there's a lesson in there.

I’m not in the ML industry but I find the DeepSeek situation interesting. For a while I’ve wondered:
• Why not run fewer, more refined parameters for training? Huge models seem like bloat or applying a sledgehammer.
• Why not focus on multiple models for different use cases instead of AGI?