If you’re in a currently cold place, then it’s a reasonably good way to heat a room and make the energy do something useful along the way, instead of just running a space heater.
Physics nerd. Currently studying some quantum gravity adjacent stuff in QFT
They/them
I picked the worst time to get into print media…
This is 100% anecdotal of course, but I’ve noticed weirdly inconsistent behaviour. I have one tab I permanently keep open for YouTube and that one loads videos really fast. If I open a second tab by following a link from that main tab, then it partly loads the site and sits there for a weirdly long time before any content even appears.
I’ve got a really fast connection too, and nothing else was having issues. This whole thing is bizarre.
That’s literally the opposite of what it needs to be. I wanted to check out Threads out of curiosity, but I don’t want to have an Instagram account, so I won’t.
Cheers, I’ll give it a go, though I suspect I’ve already done it. I believe I’ve read the rant you’re talking about too.
I have a personal server, mostly acting as a NAS but with some web hosting as well. For whatever reason, it randomly freezes until you manually power cycle it, and it happens really often, like every 20 minutes.
Turns out it’s due to some weird interaction between Debian and older Ryzen CPUs: if the CPU isn’t busy, it just dies. The solution? A Minecraft server with no one on it; it keeps the CPU just busy enough to stay alive. I’ve had it running for months at a time with no issues.
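The same workaround can be sketched without Minecraft at all: a tiny keep-alive script (purely hypothetical, not what I actually run) that burns a sliver of CPU every second so the core never fully idles.

```python
# keep_awake.py: a hypothetical stand-in for the idle Minecraft server.
# Burns a few milliseconds of CPU every second so the core never fully idles.
import time

def busy_cycle(busy_ms: float = 50.0) -> int:
    """Spin until busy_ms milliseconds have elapsed; return iteration count."""
    end = time.perf_counter() + busy_ms / 1000.0
    n = 0
    while time.perf_counter() < end:
        n += 1  # trivial busy work
    return n

def keep_cpu_awake(busy_ms: float = 50.0, interval_s: float = 1.0) -> None:
    """Run forever, alternating a short busy spin with a sleep."""
    while True:
        busy_cycle(busy_ms)
        time.sleep(interval_s)
```

The busy/sleep ratio is a trade-off between wasted power and how aggressively the CPU can drop into its (apparently buggy) idle states.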
Generic “USB media players” used to be a big thing back before they were integrated into every TV; I’m fairly sure you can still easily find one with a remote.
It seems like a lot of DVD/Blu-ray players still take USB too; we used to play off an external HDD and USB thumb drives using our Blu-ray player when I was a kid. You can probably get something really good second-hand too.
When I was 21 and I got the first of my “arthritis.”
It’s in quotes because years later it was diagnosed as an entirely different (and even more debilitating) disease. I still effectively have arthritis a lot of the time; I just now also have constant insomnia and fatigue.
To oppress some of the most vulnerable people in the world who desperately need help, of course.
The thing I don’t get is all these random LMG apologists saying “actually, here is exactly why it happened and it’s actually completely okay” despite absolutely no one having a true inside view of the company.
Like I saw someone just randomly guessing really stupid reasons why Madison is in the wrong, and acting like they have secret knowledge.
After reading that, I fully expect their plans “to improve” will involve abusing and blaming staff unfairly. Seems like they’re already doing that when they blame “human error” for the videos being bad rather than taking personal responsibility, as management should.
Holy shit
This is Billet Labs all over again; they’ll literally only do something if they’re publicly called out.
I think “rounding error” is probably the closest term I can think of. A quick back-of-the-envelope estimate says erasing 1 byte per cycle at 1 GHz would raise the temperature of an average silicon wafer by about 1 K in ~10 years. That’s hilariously lower than I’m used to these things turning out to be, but I’m normally doing relativistic stuff, so it’s not really fair to assume they’ll be even remotely similar.
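For anyone who wants to poke at the non-thermal half of that estimate, the Landauer side works out roughly as below. The temperature (300 K) and erase rate (8 bits per cycle at 1 GHz) are my own assumptions; the wafer heat-capacity step, which the final ~1 K figure depends on, is left out.

```python
# Back-of-envelope: minimum heat from erasing 1 byte/cycle at 1 GHz.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact, by SI definition)

def landauer_energy_per_bit(temp_k: float = 300.0) -> float:
    """Minimum heat dissipated per erased bit: k_B * T * ln 2."""
    return K_B * temp_k * math.log(2)

def erasure_power(bits_per_cycle: int = 8, clock_hz: float = 1e9,
                  temp_k: float = 300.0) -> float:
    """Heat dissipated per second by Landauer erasure alone (watts)."""
    return bits_per_cycle * clock_hz * landauer_energy_per_bit(temp_k)

print(f"{landauer_energy_per_bit():.2e} J/bit")  # ~2.87e-21 J per bit
print(f"{erasure_power():.2e} W")                # ~2.3e-11 W total
```

Tens of picowatts of unavoidable heat, which is why it only adds up to anything over a timescale of years.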
I appreciate you revising your reply to be less harsh, I wasn’t aiming to correct you on anything I was just offering some thoughts, I find this stuff interesting and like to chat about it. I’m sorry if I made your day worse, I hope things improve.
I said superconducting semiconductors as just a hand-wavy way to refer to logic gates/transistors in general. I’m aware that those terms are mutually exclusive, but that’s on me; I should have used quotes to indicate it was a loose analogy or something.
The only thing I disagree with is your assessment that computation doesn’t create heat; it does, albeit in an entirely negligible amount. Traditional computation involves deleting information, which necessarily increases entropy, so heat is created. It’s called Landauer’s principle. It’s an extremely small proportion compared to resistive loss and the like, but it’s there nonetheless; you could pretty much deal with it by just absorbing the heat into a housing or something. We can, of course, design architectures that don’t delete information, but I’m reasonably confident we don’t have anything ready to go.
All I really meant to say is that while we can theoretically create superconducting classical computers, a room temperature superconductor would mostly still be used to replace current superconductors, removing the need for liquid helium or nitrogen cooling. Computing will take a long time to sort out, there’s a fair bit of ground to make up yet.
There is still heat generated by the act of computation itself, unless you use something like reversible computing, but I don’t believe there’s any practical way to do that yet.
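As a toy illustration of what “not deleting information” means, here’s a sketch of the Toffoli (CCNOT) gate, a standard reversible-logic primitive: it computes AND while keeping its inputs around, and it is its own inverse, so no bit is ever erased. The framing as runnable code is mine.

```python
# Toffoli (CCNOT) gate: flips c iff a and b are both 1.
# Because inputs pass through unchanged, nothing is erased, and applying
# the gate twice recovers the original bits (it is its own inverse).
def toffoli(a: int, b: int, c: int) -> tuple:
    return a, b, c ^ (a & b)

# AND falls out as the third output when c starts at 0:
assert toffoli(1, 1, 0) == (1, 1, 1)
assert toffoli(1, 0, 0) == (1, 0, 0)

# Self-inverse: round-tripping any input returns it unchanged.
for bits in [(0, 0, 0), (0, 1, 1), (1, 1, 0), (1, 1, 1)]:
    assert toffoli(*toffoli(*bits)) == bits
```

Contrast this with an ordinary AND gate, which maps two input bits to one output bit: three of the four input states collapse to the same output, and that lost information is what Landauer’s principle charges for.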
And even then, superconducting semiconductors are still going to be some ways off. We could have superconductors in power transmission for the next decade and still have virtually no changes to processors. I don’t doubt that we will eventually do something close to what you describe, but I’d say it’s easily a long way off still. We’ll probably only be seeing cheaper versions of things that already use superconductors, like MRI machines.
Bruh
Data centres are wild
10TB of RAM? Surely that’s a typo?
I don’t really understand why, but this seems to be a common misunderstanding of the multiverse theory.
All it says is that every possible universe exists, so it’s not at all required that everything you can think of exists, just everything permitted by physics. “Possible” is the keyword here, and you can still have an infinity of universes even if you restrict what is possible.
I’m no expert on the subject, but as I understand it there are generally two types of multiverse theory: the one where you have infinite universes all with the same physical laws, but every unique possibility under those laws exists somewhere in the multiverse; and the one where every possible variation on the laws of physics exists (generally meaning different coupling constants rather than entirely different laws). It’s entirely reasonable that both types are one and the same.
In either case, it wouldn’t really be consistent for there to be a universe where the multiverse doesn’t exist, unless it is the only universe and there is no multiverse at all.
Propaganda.
I’ll only celebrate AI when someone makes one without stealing all the training data