Elon Musk's favorite supposed data expert, who he's retweeted at least a dozen times, claims she can only process 60,000 rows of data before her "hard drive overheats"
Comments
Perhaps someone should rescue her from where she's apparently stuck twenty years in the past, which is the only possible explanation for those hardware limitations and the apparent lack of access to cloud compute.
Unless, of course, she's just completely making shit up.
My ThinkStation P620 can easily cope with 50m rows of data if I spend some time thinking about memory management. Sure, it's a bit more powerful than your average computer, but not exactly a supercomputer.
Actually, just checked: the biggest dataset I work on regularly has 464m rows. Query times on that are about 10-100 ms depending on what I want to do.
I've been in the business for 30 years and hard drive overheating is not a thing, unless the head of your magnetic hard drive is digging into the platter, in which case I would still not describe it that way
I don't think this would have been a problem even 20 years ago. It might have blinked for a second, but I think my Compaq Presario from 2002ish could have easily handled this. Pretty sure I was running Perl scripts on bigger datasets around then.
60k rows. I work with SQL databases that have 10 million rows across hundreds of tables and my laptop handles it just fine. This person barely knows how to operate a computer, I bet.
Honestly, I'm pretty sure my midrange Apple laptop from 2005 could handle 60k rows without overheating unless you decided to smother it with a blanket at the same time
Eh, maybe. I guess I don't know what's in those records, but I doubt it's enough that any computer under 10 years old would have to start swapping for only 60K records. I mean, at 1GB total that would be almost 18KB per record. Even with adding overhead for models, that's a lot.
Probably not making things up, just confused. Older versions of Excel were limited to ~65k rows, and older databases may still output to the .xls format, which means you're limited even if you have a newer version of Excel installed.
Twenty years? We were handling this kind of shit without our drives overheating 50 years ago. This is from an era when my textbooks had an algorithm for sorting data spanning multiple 9 track tapes.
I'm ex-Oracle, with 20 years of experience in large data infrastructures, including the datasets generated by the Large Hadron Collider. That DataRepublican account is cosplaying. I skimmed their Twitter history. It's a lot of gibberish and correct-but-irrelevant "look at meee!!" bullshit.
Seriously, nobody with database expertise even talks like that in the first place, much less complains "my hard drive overheated". They jumped the shark with that one.
I have an open source program, that runs on your desktop (that is its point), that can easily process 1 million rows. But as you point out, she's just 'making shit up'. https://github.com/BlazorData-Net/PersonalDataWarehouse
I went over to the hell site to see what processing this was, and it looks like an SQL query that could probably be a tiny awk script. But the discussion gets lost to insults, temu ads, crypto bros, simps and probably AI responses. So much nicer here.
She runs a website called, wait for it... "DataRepublican" (dot com). Looking through it, I get the distinct impression it's one of those "only use data that supports my hypothesis" stats sites.
20 years? And 60k rows counts as Big Data now, does it?🤪
I was running Oracle RDBMS on a laptop in the 1990s. Of course, Musk's tech geniuses are not known for their database skills, but you'd think he'd at least have bought them a decent laptop with a post-90s spec.
I'm an old fart, and hard disks never "overheated". They spin constantly, consumer ones have an air cushion that the head floats on, and they're literally actively cooled by their very operation
Maybe they converted the whole dataset to JSON? Or wait, no, to XML.
Maybe that would be inefficient enough to make the reading heads of the hard drive go 🤯
pure conjecture based on a few painful attempts to use claude as a data cleanup shortcut, but i feel pretty confident it's got nothing to do with her machine
@jeffgeerling.com could provide some advice on connecting a NVMe drive to the Pi. If they want to use an ML model to analyse the data, Jeff can suggest how to hook up an eGPU too.
3.1415926535 8979323846 2643383279 5028841971 6939937510
5820974944 5923078164 0628620899 8628034825 3421170679
8214808651 3282306647 0938446095 5058223172 5359408128
4811174502 8410270193 8521105559 6446229489 5493038196
4428810975 6659334461 2847564823 378, and that’s all I have to say about that.
Eventually it came out that they were using some grossly inefficient LLM because none of them know how computers work beyond running scripts and asking chatgpt to write stuff for them.
I have an HP 48g graphing calculator from the mid 90s still kicking around I could probably donate as an upgrade. Although someone would have to explain RPN I'm sure.
She's claiming that her dbms is handling most of the actual data move and her result set is only 60k lines. Sus that it would "overheat" unless it was under Satan's ass but I have seen MacBooks just melt internally from misuse.
I'm a data engineer and imo she sounds sus but not nuts.
does she know that external drives are not the part where computation is happening and that part is still the same computer you're on no matter where the storage is 😭
For me, the story here is that the US spending database is now outside of government property. I'm not sure if there's anything sensitive in it, but I do find it concerning.
Did she just take terabytes of data randomly around with her on an external HD into hotels and stuff? I'm not totally sure what data we are talking about here, but that sounds like it might not be properly secured.
“Luckily I brought my backup drive with me”, was it already loaded with this database, because no way did they manage to download terabytes over hotel wifi.
I-... even I had a data query script for MIMIC-3 and 4 patient profiles that would just batch retrieve what I could keep in RAM and I'm an embedded dev. Where did they learn how to do data processing?
she's a conspiracist that musk promoted on X, internet-famous. Not aware of actual employment by "doge" but I may be out of date, or it may be informal and illegal, like everything else "doge" does
I did lose two MBPs (Intel) during heat waves and I was working when it was over 90F inside at home! That's how I learned batteries can shoot jets of blue flame when they swell out of the laptop case.
Then the database isn't on the macbook (it shouldn't be). It's just sending "DELETE FROM spending WHERE dei = 1" and then actual server somewhere is doing the work.
Either this person is the worst data scientist in the history of the world, or they're lying. I think the latter
I don't care if they're trying to say "I bought the biggest computer I could from dell then added discs myself", running a database as big as they claim locally is _idiot work_.
Typically with large data sets I recommend uploading directly to a warm cloud storage solution and run queries directly using remote compute.
It’s much more effective, simpler, and enables a larger array of tooling to be applied, as well as having the bonus of implicit sharing of analytics results.
For sensitive data (which I’ve since learned this is not), you can typically get secure cloud compute (government) provided, meaning no issues with securing it locally.
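To make that concrete, here's a minimal sketch of the remote-query approach using DuckDB as one example engine; the bucket path, table layout, and column names are all hypothetical:

    -- read straight from object storage (DuckDB's httpfs extension)
    INSTALL httpfs;
    LOAD httpfs;
    -- aggregate over files sitting in cloud storage; nothing is copied to a local drive
    SELECT recipient, SUM(amount) AS total_awarded
    FROM read_parquet('s3://example-bucket/spending/*.parquet')
    GROUP BY recipient
    ORDER BY total_awarded DESC
    LIMIT 100;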
I've collaborated with researchers at a US Army site before, and they had to burn data files to CD-R (!) or use full milspec encrypted drives (SEDs?) with inventory tracking codes traceable to the research project.
So the US Spending database has been exported, has travelled through unknown areas throughout the country, has been in a hotel room unattended for a period of time, and is now at someone's personal home. SECURITY WIN.
This was my issue. I don't know anything about datasets but 1) where is this person that it's that hot and humid in March (outside the US, I assume?) and 2) what kind of budget hostel are they staying in without AC?
Like, has she done absolutely no optimization on the data? I could throw a few terabytes at my old MacBook Air and it wouldn’t even break a sweat if the database was set up correctly
feel like "process a data set too large to hold in memory" is a standard coder question as well... though i can't imagine SQL would balk at 60k rows anyways.
That's still not going to overheat the hard drive though. If anything, it'll end up holding the active data in memory (RAM, CPU cache, etc.) for longer, and there will be less activity on the hard drive, I would have thought.
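For what it's worth, the textbook answer to that interview question is to stream the data in batches instead of loading it all at once. A rough SQL sketch using keyset pagination (table and column names are made up):

    -- first batch, ordered by a unique key
    SELECT id, recipient, amount
    FROM spending
    ORDER BY id
    LIMIT 50000;

    -- each following batch resumes after the last id seen in the previous one
    SELECT id, recipient, amount
    FROM spending
    WHERE id > 1234567   -- last id from the previous batch
    ORDER BY id
    LIMIT 50000;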
Wait…she’s on a MacBookAir? Look, I’m a Mac person all the way but I know that most computer coder type people prefer PCs to Macs for this stuff right? Macs are for people like me who don’t want to think about how their computer computers!
As a computer coder type person, very very few of us in the modern era prefer PCs. Those are for enterprise applications and people who insist on playing FPS games
There's three classes of "Very Computer" people: Gamers, tinkerers, and the rest. The first two have the same hardware (PCs) but different operating systems (Mostly Windows vs. Mostly 'nix), the third is probably the largest of the three and they're basically all on Macs.
Mac laptops have been a popular choice for programmers for 20+ years now. At the places I've worked, developers overwhelmingly use Macs. I have 3 Mac laptops sitting in front of me at the moment: 2 MacBook Pros for work and my personal MacBook Air.
No, the Macbook has been the laptop of choice for Very Computer people for a while because we don't want to think about it either ;) It's the easiest way to have access to a lot of the Very Computer tools without having to deal with the Very Computer user experience, which is terrible
These days if you go to a Very Computer conference, unless it's a very specific kind of Very Computer conference (and sometimes even then), a good half the laptops will be Macbooks Air because of how easy they are to haul around and how long the battery lasts.
Google had the infrastructure (they even had their own in-house Linux distro) for me to have a work computer running Linux; since I left Google, I've mostly ended up on Macs for work computers because trying to code on Windows is so much worse than dealing with the bits of Macs that I dislike.
I’m a serious data scientist and I use a MacBook Air! Not that I would default to running something locally in such a case, but even if I did, it would be hard to construct a query so poorly that it broke after scanning 60k rows.
In an earlier moment, I'd suggest reporting to the FBI that someone is manipulating federal government data over an insecure network, at least.
In an earlier moment.
What I don't understand is why she didn't just do select distinct on the column that would hold the data she's looking for, find how the award she's looking for is referenced, and then put that in the where clause of a select statement
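The two-step lookup being described might look something like this; the table and column names are invented for illustration:

    -- step 1: see exactly how the award is labelled in that column
    SELECT DISTINCT award_type
    FROM spending;

    -- step 2: filter on the exact value found above
    SELECT *
    FROM spending
    WHERE award_type = 'Grant';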
Beyond laughing at hatenerds being shit at data processing, there is also a story here where the young folks signing up for the Musk-Youth are being abused as well. Obviously, under extreme pressure to produce results, even the (possibly) smartest person will descend into hardcoding Excel 97 queries
And if we just bully the Nazi babies, and don't also disentangle the abusive system driven to the extreme embodied by DOGE, then we're just little shitty bullies.
I’m sorry, is she doing this data work on government files/records in an unsecured HOTEL ROOM on an easily stolen external hard drive?
I mean, I’ve only been peripherally involved with data security for the last several decades, but that seems to be, and I use the technical term: KooKoo-bananapants
Also, unclear if he was traveling with just the non-sensitive spending database, or the full version that has confidential/classified data? Like was he just bopping around with the full dataset in his backpack?
I have questions about what sort of query you are running that an Apple Silicon machine can’t solve in less time than it takes to press the enter button
Inventing a unique architecture in which a large powerful central computer “serves” multiple smaller remote computers (I’d like to call them “clients”)
Since you've got their attention, can you ask about the trusted computing requirements for a laptop transmitting classified data over an untrusted hotel wifi network?
Wondering if they've figured out secure quantum entanglement wifi and I'm the last to hear about it.
The fun thing about lying in service to the fascists is that no one cares... until you're mysteriously poisoned/thrown out a window and promptly scrubbed from the official history.
Why does she have a copy of a federal government database on her laptop or an external hard drive?
Any query she is running should be remotely on a secure server.
I'm a CS undergraduate, relatively amateurish in data science, who uses a 2020 Air with one of those god forbid WD Elements external drives. I have tried many datasets WELL over 60K successfully, and without bottleneck for obvious reasons.
This makes these deranged bird app comments all the funnier
If every accusation from a certain segment of the population is a confession...a lizard person would require a warmer than normal room with some level of humidity (depending on the species of lizard person)
Well, probably lots of tech people. I’ve got 4 multi-terabyte USB drives the size of a deck of cards lying around my house. But she’s still full of shit. 💩
It ... doesn't "fit" in any of the standard MBA builds? Can we eliminate the possibility that these are physical documents being jammed into a spinning disk hard drive?
It's even more pathetic that she more likely doesn't realize Molly isn't talking about the FILE SIZE; either way, there's no way even a low-power computer using an external HDD should be struggling with 60K rows of ANYTHING
This is what happens when you don't round things correctly. You think it's fine to use 1.1 or 2.6, but it always bites you in the end to be so precise. Just say ~1,~10,~100 and so on. That's good enough.
lol these grifters think everyone else must be a grifter because they are but like cmon. Sure, my MacBook pro couldn't open a multiple terabyte .xls file either! Oh no, 60,000 rows. B----, I've had to print out 60,000 rows and attach them to hard copy certifications for my work.
I am no programmer but my husband works with a large database and says Excel can handle a million lines of code... And he's got the slowest Lenovo on earth but it must be AWS that handles the load. I dunno. But can't this person just export it and open it elsewhere if it's only 60k?
No, she processed 60k out of an unknown number of total rows before HDD caved.
DB is > 1.5 TB in size (half of it indices and views), so constant disk thrashing => HDD might overheat.
Hard drive overheating hasn't really been a problem since the 90s, unless you're making some appalling cooling decisions. These days they'll throttle down to an irritating speed, but keep functioning.
I could be wrong, but I keep imagining that it's CSVs (with hundreds of millions of records), which keep opening in a spreadsheet, and the "processing" involved is "open it, ctrl F a few times, delete the rows that loaded, close, repeat"
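If that guess is right, the boring fix is to load the CSV into something that can actually query it instead of a spreadsheet. A minimal sketch, again assuming DuckDB and a hypothetical filename and columns:

    -- query the CSV in place; no spreadsheet, no manual Ctrl+F passes
    SELECT recipient, SUM(amount) AS total
    FROM read_csv_auto('awards.csv')
    WHERE fiscal_year = 2024
    GROUP BY recipient
    ORDER BY total DESC;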
I just can’t stop laughing at the idea that the first component to overheat in a batch run would be the drive. Even assuming a poorly ventilated NVMe, let alone an old spinning platter, that is the last component that’s going to heat throttle. Idiot’s version of smart people all around
I’m getting way too much enjoyment imagining them thinking they’re “warming up” their process. Forget baking your thermal paste. They’re cooking their solid states
I love the fact that she has monitoring on the temperature of her hard drive and yet she doesn't have hardware sufficient to process > 60,000 rows of data.... Also, is she producing this data locally?
She is saying "The hard drive overheated" as a way of saying "it crashed" in the same way that she also might blame any computer crash on a "virus". These are low-ability computer user phrases I have heard from users for decades.
copilot wrote the code
it crashed couple of times
after each row read it wrote result to a file
there's only 60k lines in the text file
by touching the components ssd feels the warmest
i've had a pc stuffed into a cubby rendering video for 30+ hrs at a time and never get close to overheating; the idea that a *hard drive* would overheat before anything else is implausible, at best
I wonder if this is one of those people who call the entire desktop (other than the screen) the "hard drive" and if the computer freezes or becomes unresponsive (possibly due to a badly designed query), they call it "overheating".
Almost certainly. The more interesting question is who does she think this is fooling? The whole of the American people or just Elon Musk? I think it might be the latter.
Shhh! No, no, it's really such a shame that 60,000 rows is around the upper limit of the data a person could reasonably analyze with one query, just so sad really, maybe in another 50 years we'll have the technology...
I don’t know if there’s a configuration change required to allow more resource usage to go over the old 64k limit or not.
I learned long ago it wasn’t the best tool for giant CSV files..
When working rather massive (for 2007) 70MB CSV files i used either notepad++ or bulk import to sql
2003 or earlier. Since Excel 2007 they increased it to just over 1mill rows.
Could also be using Google Sheets, where there’s a cell limit of 10 million cells, so if the DB had around 170 fields per entry, it would limit you to about 60k rows (10,000,000 ÷ 170 ≈ 58,800).
It used to. For at least 5 years that's been over 1 million rows, and even on an off the shelf Mac laptop, pivot tabling 1m rows takes very little time and most definitely doesn't overheat the "hard drive"
it also reads like when you haven't started the project that's due tomorrow because you forgot about it and your boss asks you for an update. "overheated LONG before I could finish" -- wait... you only did 1 "pass" through? only 1 search? we know you didn't finish, but you didn't even do a second?
More concerning is the sensitive data stored on an external drive. Was it encrypted, and how was the key managed? What is the process for cleaning the drive afterwards?
My first pc was an IBM with two floppy drives, no hard drive, running MS-DOS. I think she must have something pre-dating that. You know: if she were telling the truth.
The irony, per the Deseret News article about Pounds: not only the remote working, AHEM, but her saying unregulated AI gives the people the power to watch the Government. I don't think she even understands what the Government does, not to mention the problem of biased programming.
My consumer-grade gamer PC built in 2014 can burn through queries on million+ row databases, stored locally, in under a minute, without breaking a sweat.
These people are beyond parody. They HAVE to be making shit up and assuming no one will call them out on it.
I’ve built a few computers over the years. I’ve put cooling systems on CPUs, GPUs and even a heat sink on the ram but I’ve never considered hard drives overheating as a problem.
Yeah, I’ll give you that, but old-school spinning-disk hard drives don’t get meaningfully warm in a PC. I took apart a WD 2-bay NAS recently; it didn’t have any fans, just a few air slots at the top and bottom.
Remember when people would ask “will my phone ring to let me know if I get an email?” and we’d kind of stifle a laugh and explain that no, those things have nothing to do with each other. How things have changed…
Haven't heard anyone mention Compuserve since we partied like it was 1999. Oh, the days of AltaVista and Windows 3.1, 56k dial-up modems and mobile phones you carried in a suitcase. Such memories. I worked for a company called GEAC then. Library systems and leasing. Long gone.
I built a preferential vote counting spreadsheet in high school more than two decades ago (ugh, I feel old) that had several SHEETS of 65,536 rows in it...
Maybe she's processing her stuff in Excel 97 running on an already overtaxed 486 with 8MB of RAM running Windows 95 ;)
You joke, but this is what’s happening! GSA’s old (6-7 years ago?) market intel widget had an Excel-based chokepoint buried in it and they went live and I exported all fields and it was 65,536 rows. I had to be like “folks, if this number doesn’t set off fireworks in your brain we need to talk.”
1st impression was... what am I missing here, but reading through this confirms: this is not set up for success. May as well store the data in a blender.
Several observations on this. I’m a retired IT networker/deployment specialist.
People often call the PC’s main "box" the "hard drive". Even given that, there’s no way you overheat any part of a computer doing database analysis, searches or anything similar. This kid is a hack.
Tweets like this aren’t for computer experts, who will instantly recognize the bullshit. They’re like the “the odds of the election turning out like this are one in sixty quadrillion, therefore voter fraud.” They’re to give internal justification to people who *want* to believe Elon.
Republican: (picks up mouse, uses like microphone)
“Computer! Locate all liberals for me and tell them to stop being mean! Also, program yourself to fire any federal worker with a coexist bumper sticker.“
Weirdly, it fits with how R officials still don't understand that things you publicly post online or put in emails to multiple recipients are not private conversations. They certainly don't disappear forever, when you delete them from your inbox.
She just means that her 'hard drive' to get her work done fogs out after the first 60,000 rows. How would you feel after 60,000 rows? Anyone would need a break!
I don’t know the full context of this stupid tweet, but is this person stating they’ve got critical data with certain privacy implications on a personal computer? And where is this located? Is this in a secure location? What the fuck???
Someone can correct me but I don't think hard drives generate heat unless they are very close to failure and then not that much. The processor generates much more heat.
Let’s face it, musk & the people he thinks are awesome, clearly are not. He is not an engineer, an inventor, an accountant, a rocket scientist, a video game champion, or anything he says he is. He is a complete fucking fraud in every sense of the word. That also means everyone he likes is a fraud.
Actually that was Google, but I get your point. I am not technically savvy enough to understand most computer stuff beyond basic use. However, the screenshot gave a specific number. And I was able to look that up. 60,000 versus billions if not trillions? This lady is missing between 6 and 9 zeros.
yes an overheating hard drive on a data set that is small is obviously and apparently a dumb thing to claim but you copy pasted an AI summary my dawg which is the exact same credulous information environment the OP is critiquing
"Literal TERABYTES. You cannot imagine the scale of these data sets. I'm gonna have to get a *second IP* address just to query all these parameters in memory."
As to her workstation being poorly configured and under spec'd?
A workstation with a multi-core CPU, SSDs in a RAID array for performance (redundancy on a data analyst's machine is not necessary), and a scaled-up memory config for query processing on remote data volumes should not show this symptom.
What pathetic excuses... multi-SSDs on a software RAID? If overheating drives were a regular issue, animation render houses (for example) could never run continuously days on end...
Uh, did everyone have a brain fart? DOGE is kabuki theater and a cringe waste of time, but she probably means each run reads 60k rows and her HDD overheated before she could finish all the runs.
Actually, the whole thing is odd. She's described as a senior AI/ML analyst at her (former) employer. Those folks typically deal with gigabytes to petabytes of data. Even if she's been conscientious enough to avoid work or public compute resources to process govt data, any recent laptop could handle this.
“AI gives us the ability to take on massive, entrenched systems that would otherwise be impossible to untangle,” Jennica Pounds told the Deseret News. “Without it, we’d be fighting blind.”
I was having a problem with a big data set last year. My process would choke somewhere around the 6000th record. But I dug into it and figured out the problem.
It couldn't recognize Spanish names with accented characters.
Any Trump project probably will have a similar problem.
Fortunately the U.S. will be down to < 60,000 recipients of social security, medicare and medicaid soon! And also, Musk's secret GROK AI data center outside Nashville will be very responsible handling everything immediately of course.
This tracks for Musk and his cretins. Musk playing POE2 and manually dragging items into his inventory not knowing how the pickup mechanic works.
This person using CTRL-X/CTRL-Y for a giant-fuck file when "Import" exists
This, so much THIS!
And for the love of GOO, it's a spreadsheet, MSExcel is a spreadsheet, not a bleeping database like SQL, er, MSAccess (is it possible she means dataset?)! 1/2
SQL isn't a database, it's a query language. And MSAccess also isn't a proper database. It's a mixup of GUI builder and query IDE with a file-based relational spreadsheet without essential database functions underneath.
Like my DB professor said: "Let's do Access and then let's look into databases."
Per Google AI
"No, Microsoft Excel is a spreadsheet application, not a database, though it can be used to store & manipulate data in a tabular format. Databases are designed for robust data storage, manipulation, &management, while spreadsheets excel at calculations and data visualization."
and if she's in a hotel using hotel wifi it's probably just I/O wait time that's bogging down her PC, layered with using apps like Excel, in the worst possible access method
I transponded the database onto quantum blockchain, removing microservices causing tachyon overload. Starlink now precaches dark matter RPC - rows now load at precognitive latency, up to 5000% faster!
I am constructing additional pylons to compensate for the TPS reports lagging a bit on the MOPS infrastructure. We may need some people to up their 4d3d3d3 to make this work.
My guess would be that's the cyber gang's word for "any individual amount of money paid out", mostly to avoid spooking the press by saying "entitlements", "grants", or "contracts". "Award" sounds like a gift or bonus, rather than the legal obligation it is.
Most likely, assuming they aren't lying just to make it sound to non-techies like they are doing something difficult, they are "processing" it by sending it through a GenAI system, possibly a local instance?
Some time spent doing ISP tech support taught me there are far too many people for whom the words "CPU" and "Hard Drive" are synonymous to "Desktop/Laptop". Sad, but true.
Even if the set of 60k rows was just a sample, overheating on a 60k sample set demonstrates a seriously, ridiculously inefficient piece of code, unless each row is like a nesting doll of data. In any case, it seems amusingly bad code.
I replied to the main thread with links and extended info, should be up there somewhere. The whole thing is bizarre; there is a REST API available, and someone (probably her) posted on the datasource's help board asking about finding the exact info they are having trouble with
Oh yeah! Had to spool up the turbo and run a special boot sequence to get Comanche: Maximum Overkill running smoothly on the DX-33. It never overheated the hard drive, though.
Summed up like when IT gives a long, bloated "mansplaining" solution and then tells you to unplug the router and wait 10 seconds before starting it again. And then 10 seconds later, you receive a ticket saying resolved.
What's she using for processing a friggin' IBM 5150?
I regularly deal with datasets with billions of rows and I've never had any issue with my "hard drive overheating" because *checks notes* that's a thing that doesn't happen (well unless the AC in the datacentre goes out for an extended period)
Out of context, I would give the benefit of the doubt: "I tested my code with a sample of 60,000 rows because processing billions of rows would be impractical on local hardware"
This is half baked but I dug into the original thread and this is the dataset: https://onevoicecrm.my.site.com/usaspending/s/database-download . JSONs/metadata, API on git, it's extremely accessible. I haven't seen anyone post this though - not sure of DataRepublican's IRL name but:
Not sure why they would need to have a nested list? I haven't set up the API yet, but based on the thread they are trying to figure out some very basic-ass information.
*DOOR slams open. In the doorway, backlit by white light, stands a being that greatly resembles Grimace's brother. It wears a t-shirt labeled 'MONGO DB' in block font.*
I think we should actively encourage her to give snorting Draino a really aggressive go. Like, just go for it, it's totally safe for you to do, our sweet summer Data Nazi child
He's assembled a party of Dunning-Kruger savants.
Spare parts table for copiers and fax machines was the biggie.
I don't know what her workload is, but maybe she's just trying to run it through Excel directly?
Never heard of SQL. Never heard of client software. Never heard of Ctrl+F, nothing.
Which an average Joe might round down to 60000
Ya starting to smell anything?
Like abusing a spreadsheet as a database because you can’t SQL?
Maximum is like 2 million rows
I've been working with big data for over 30 years. Overheating a hard drive has never once happened to me.
I AM A NERD
I'm uncertain why "sample size" would even be a thing here either.
(Isn't some of it really secret stuff?)
I'm seriously not impressed.
- they are "processing" the rows manually (e.g. reading),
- they try copy-pasting the rows into ChatGPT.
Processing power would require them to be used.
....just like the little fan grills on her hard drive case.
Like even my 10+ year old thinkpad can run searches on >250 million entry databases.
https://www.rollingstone.com/politics/politics-features/elon-musk-data-republican-anonymous-data-expert-doge-tech-1235280817/
the fumes explain the rest
(This is complete nonsense, b-trees are one of the index structures but they don’t affect the underlying data organization)
I’ve witnessed data scientists doing work the hard way but usually they’re inexperienced / academics.
Perhaps they’re more lax about who has access to their spending data.
None of the work was even security-related.
https://bsky.app/profile/arcknightt.bsky.social/post/3lkdtt5rt5k25
Any string of reasons that lead to this being necessary is indicative of extreme institutional incompetence.
Still far more effective to work on it through remote services than carrying it around, which was my primary concern.
Even downloading directly to cloud storage would have been quicker than downloading it to external storage.
https://bsky.app/profile/fillip.pro/post/3lkdue2lu4k2z
https://www.youtube.com/watch?v=Nl_Qyk9DSUw
Half the database is views and indexes so my best guess is creating really bad queries
But holy shit I wouldn’t tell the internet about it
"SELECT * FROM payments WHERE receiver LIKE '%scammer%' ORDER BY amount;"
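The leading wildcard is part of the problem: LIKE '%scammer%' can't use an index, so it forces a full scan. Building on that hypothetical query, an index plus an anchored pattern lets the database do far less work:

    -- index the column being searched (names follow the example above)
    CREATE INDEX idx_payments_receiver ON payments (receiver);

    -- an anchored pattern (or exact match) can use the index; '%scammer%' cannot
    SELECT receiver, amount
    FROM payments
    WHERE receiver LIKE 'scammer%'
    ORDER BY amount;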
Also love how "Very Computer People" is now a thing; I will use it henceforth!
A MacBook Air is …not the tool a serious data analyst would use.
Your critique seems more apt.
Maybe she does not know that because she is a fake account and Elon is deeply ignorant.
Classic grifter
I get the feeling she actually doesn't know what she's doing.
"Tell me you don't know how use a database without telling me"
The crazy thing, this person may have barely written any SQL in their life.
Amazing how ORMs and GraphQL and other tools can make it harder. Which may also be why she worked so hard
a) their personal device
b) a USB connected hard drive
She's going through the lines manually and got bored, there's no other explanation.
Go figure.
¯\_(ツ)_/¯
what a clown
My old 2011 MacBook overheated from time to time, but that was when it was pushing 10 years old.
> 60,000 rows
🤔
I don’t see how that could go wrong at all
'Cause that is 100% what she's doing
Wired: thicc data
I think every excel doc I have ever touched, combined, might be around 100 megs.
What the fuck is she looking at
Which is still absurd, of course.
Either way she's full of it.
She was running it on a smart thermometer
Now let’s assume they are in a high side network to give them the benefit of the doubt (but then why would they be posting about it online 🤣)
Nope I still can’t work this out! Maybe they are using a laptop and it’s heating up 🤣🤣🤣🤣🤣 I’m still like…. Wtf
I'm at a loss as to what they could be struggling with at 65K rows.... LOL
60000 is not very many rows! if you’re going to bullshit pick a bigger number!
So anyway, this bug in Stellaris' Gigastructures mod is annoying!
I sense a CSV database dump being loaded into a spreadsheet.
Likely because a SQL bulk import would be the equivalent of high magic to this person
Like 60,000 rows is trivial for anyone who can use Excel
I think I still have my 386 with windows 3.11 and my Compaq Presario 2200's CPU in the closet.
> "hard drive overheated"
AHAHAHAHAHAHAHAHA......*takes breath*
HAHAHAHAHAHAHAHAHAHAHAHA 🤣🤣🤣🤣
OMG...these people are either the biggest liars ever, or have no idea how computers work, or are completely incompetent...or all three.
I was sure this “virus” was just an urban legend used to scare people into forwarding useless emails, but… here we are.
Perhaps it mated with an LLM?
In a parallel dimension maybe, or in the metaverse.
Duh
Yeah... I've seen most things.
*overloads the GPU in order to self-immolate*
I guess 95 must've still been mostly the 16-bit codebase and they only moved fully to 32-bit in 97.
>just need more water cooling
I hope they're getting roasted on X. Even if you're just brute forcing csvs you can do better than 60k.
Smart at what, he didn't say.
He was able to post some fucking slop vomited by google's llm to feel good about himself.
You're part of the problem.
(Like, shut the fuck up man, this isn't twitter, you don't get 3 pennies for people being mad at you in replies)
And oh, heck, even the best data guru can typo an endless loop.
The sheer technical acumen to get a second ip address. I mean. How do you even do that??
It’s too complex for liberals.
This isn't "big data". Big data doesn't fit in the RAM of a goddamn iPhone.
On the other hand... maybe not.
https://youtu.be/H2uHBhKTSe0?si=4gcsX6tTp1M2kNW1
Wait a minute. I think this is some kind of kinky sexy talk between those two. This is how they flirt...
https://bsky.app/profile/molly.wiki/post/3lkcvotmjgk2u
No excuse for not doing this in govt cloud though
like, honestly, why
She needs to upgrade her abacus.
solid fucking bullshit
What the hell kind of rig are they running?
I've been working with big data for over 30 years and I have never, not once, overheated a hard drive.
We’re using VMS 8.4-2L3 on Itaniums working with datasets way bigger than 60k rows
Patient: "Doctor, thank you!"
Wife: "Yes, thank you Doct...wait...what's happening?"
Dr: "The... uhhh... the fistibular is filling with...err.. myo cradial bloo...oosto...po...fy...?"
Wife: "OMG! That sounds bad!"
Patient: Beeeeeeeeeeeeeeeee
Microsoft itself sets these parameters for spreadsheets:
1,048,576 rows by 16,384 columns
60K? Call me after you take a basic data analysis class.
https://www.heise.de/en/news/New-Zealand-14-5-billion-euros-managed-with-an-Excel-spreadsheet-10313220.html
https://drive.google.com/file/d/1kGcIshItHjdmQdyGnATdkUWhy8-NVqrX/view?usp=drivesdk
our phones can process more data than that.
She tried to open the file in Pages or Excel
If so why is hers in the toaster oven
/s
386SX-16, DX-33, DX-66; there were also DX-50s. I don't remember any 20/40 MHz versions
*AH* AMD made 20/40 MHz chips
🥰
Wish the people that need to see how fucked he is would see this shit... Or at least understand how fucked this is 😔
So, uh, why are you using it? Use the sekret "data science" build, right?
So brave against such data.
MONGO DB: My time has come!
look, when you're claiming the dog ate your homework, you have to work on a good excuse