

Servo is another WIP web browser, managed by the Linux Foundation's European branch (Linux Foundation Europe). It's a little less far along but is making relatively quick progress now. Apparently Discord already mostly works, with sending messages currently being a problem.
There are some pretty corporate “open core” software companies though; that's more of a grey area.
IDK if the tech for 3D printers is really more complex. All of the parts are readily available; basically nothing needs to be specially made except the hot end (one single metal part).
The consumer experience for 2D printers is worse IMO, but that's probably because I'm stuck on Windows with its terrible printing system.
You could keep the kernel though while changing the GUI.
A lot of people block political keywords and related communities, and most niche subs aren’t there
Hmm, I was not aware of that. I've seen (not Nvidia-related) simulations with probably tens of thousands of rigid bodies running in real time on relatively old midrange CPUs, so it's pretty crazy that it's that slow.
Are there really any 32-bit era games that your CPU can't handle, especially if you have a $1k+ GPU? This post is honestly pretty misleading, as it implies modern versions of PhysX don't work when they actually do.
That being said, it doesn't make all that much sense as a decision; doubles are rare in most GPU code anyway (as they are very slow). NVIDIA is just being lazy and doesn't want to write the drivers for that.
Well, at least you aren't on a Mac, where 32-bit things just don't launch at all… (I think they might be playable through Wine, but even while still in the x86 era macOS dropped native 32-bit support, so games like Portal 2 or TF2 for example just stopped working even though they had a macOS version.)
An interesting decision from the moderators of the Signed Distance Field Organization
Or maybe from the Syrian Democratic Forces?
I'm just using basic Fabric stuff running through a systemd service for my MC server. It also basically just has every single performance mod I could find and nothing else (as well as Geyser + Floodgate), so there isn't all that much admin stuff to do. I set up RCON (I think it's called) to send commands from my computer, but I just set up everything through SSH. I haven't heard of either Pterodactyl or Crafty Controller, I'll check those out!
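For what it's worth, RCON is simple enough that you don't even need a dedicated tool; here's a rough sketch of sending one command with nothing but the Python standard library (host, port, and password are placeholders, and it assumes enable-rcon=true, rcon.port, and rcon.password are set in server.properties):

```python
import socket
import struct

def recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed")
        buf += chunk
    return buf

def send_packet(sock: socket.socket, req_id: int, ptype: int, payload: str) -> None:
    # RCON packet: int32 length, int32 request id, int32 type, payload, two NULs
    body = struct.pack("<ii", req_id, ptype) + payload.encode("utf-8") + b"\x00\x00"
    sock.sendall(struct.pack("<i", len(body)) + body)

def read_packet(sock: socket.socket) -> tuple[int, int, str]:
    (length,) = struct.unpack("<i", recv_exact(sock, 4))
    data = recv_exact(sock, length)
    req_id, ptype = struct.unpack("<ii", data[:8])
    return req_id, ptype, data[8:-2].decode("utf-8", errors="replace")

# Placeholders: your server address, rcon.port, and rcon.password
HOST, PORT, PASSWORD = "my.server.example", 25575, "changeme"

with socket.create_connection((HOST, PORT)) as sock:
    send_packet(sock, 1, 3, PASSWORD)      # type 3 = login
    if read_packet(sock)[0] == -1:         # server answers with id -1 on a bad password
        raise SystemExit("RCON auth failed")
    send_packet(sock, 2, 2, "list")        # type 2 = run a server command
    print(read_packet(sock)[2])            # command output comes back as text
```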
For high-VRAM AI stuff it might be worth waiting and seeing how the 24GB B580 variant is.
Intel has a bunch of translation-layer sort of stuff that I think generally makes it easy to run most CUDA AI things on it, but I'm not sure if common AI software supports multi-GPU with it.
IDK how cash-limited you are, but if it's just the VRAM you need and not necessarily the tokens/sec, it should be a much better deal when it releases.
Not entirely related, but I have a full, half-hourly snapshotted computer backup going to a large HDD in my home server using Kopia. It's very convenient, and you don't need to install anything on the server except a large drive and the ability to use SSH/SFTP (or another method, it supports several). It supports many compression formats and also avoids storing duplicate data. I haven't needed to use it yet, but I imagine it could become very useful in the future. I also have the same setup in the CLI on the server, largely so I can roll back in case some random person happens upon it and decides to destroy everything in my Minecraft server (which is public and doesn't have a whitelist…). It's pretty easy to set up, and since it can back up over the internet, it's something you could easily use for a whole family.
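If anyone's curious, the whole setup is basically a handful of commands (written from memory, so exact flags may differ slightly between Kopia versions; host, user, and paths are placeholders):

```sh
# Create the repository on the server over SFTP (run once from the client)
kopia repository create sftp \
  --host my-home-server --username backupuser \
  --keyfile ~/.ssh/id_ed25519 --known-hosts ~/.ssh/known_hosts \
  --path /mnt/bigdrive/kopia-repo

# Take a snapshot of whatever you want backed up
kopia snapshot create ~/Documents

# List (and later restore) snapshots
kopia snapshot list
```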
My home server (with a bunch of used parts plus a computer from the local university surplus store) was probably ~$170 in total (i7-6700, 16GB DDR4, 256GB SSD, 8TB HDD) and is enough to host all of the stuff I have (very light modded MC with Geyser, a GitLab instance, and the backup) very easily. But it is very much not expandable (the case is quite literally tiny and I don't have space to leave it open; I could get a PCIe storage controller, but the PSU is weak and there aren't many SATA ports), probably not all that future-proof either, and definitely isn't something I would trust to perform well with AI models.
This (sold out now) is the HDD I got; I did a lot of research and they're supposed to be super reliable. I was worried about noise, but after getting one I can say that as long as it isn't within 4 feet of you, you'll probably never hear it.
Anyway, it's always nice to really do something the proper way and have something fully future-proof, but if you just need to host a few light things you can probably cheap out on the hardware and still get a great experience. It's worth noting that a normal Minecraft server, backups, and a document editor, for example, are all things you could run on a Raspberry Pi if you really wanted to. I have absolutely no experience using a NAS, metasearch, or heavy mods, however; those might be a lot harder to get fast for all I know.
“Everything Everywhere All at Once” was made largely in Blender, I think; it's the most popular film from a studio using Blender that I know of.
The browser would also have to guarantee that you yourself didn't edit the website. It's not supposed to ensure that the content was real, only that that website really had that content on it.
Yeah, that's kinda why I thought a screenshot thing would be better. It could also ideally work on private data like DMs. The idea also includes having the URL as tagged, unencrypted metadata on the image that anyone can access by opening the image in a metadata viewer (or the hypothetical authenticity-checking service).
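Just to illustrate the metadata part (the tag name and URL are made up, and none of this is a real service), a plain PNG text chunk would do the job; a quick sketch with Pillow:

```python
from PIL import Image
from PIL.PngImagePlugin import PngInfo

# Stand-in for a real screenshot; in practice this would come from the capture tool
screenshot = Image.new("RGB", (640, 360), color="white")

# Store the capture URL as a plain, unencrypted PNG text chunk
meta = PngInfo()
meta.add_text("capture_url", "https://example.com/some-dm-thread")
screenshot.save("screenshot_with_url.png", pnginfo=meta)

# Anyone can read it back later without any special service
print(Image.open("screenshot_with_url.png").text.get("capture_url"))
```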
From what others are saying though, it sounds like my original screenshot idea would probably be impossible, so linking to the source is the best we can actually do
That’s why having the URL as part of the hash is important. I’m thinking less for real photos and more for ‘screenshot of a deleted tweet’ sort of things.
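Concretely, I'm imagining something like hashing the screenshot bytes together with the URL, so the same image "attributed" to a different page wouldn't verify; a minimal sketch in Python (the function name and the whole registration/verification flow are hypothetical):

```python
import hashlib

def screenshot_fingerprint(image_bytes: bytes, url: str) -> str:
    """Hash the raw screenshot together with the URL it was captured from.

    Binding the URL into the digest means the same pixels re-attributed
    to a different page produce a different fingerprint.
    """
    h = hashlib.sha256()
    h.update(url.encode("utf-8"))
    h.update(b"\x00")        # separator so url/image boundaries can't be shuffled
    h.update(image_bytes)
    return h.hexdigest()

# Hypothetical flow: the capturing browser registers the fingerprint with some
# trusted service; a verifier later recomputes it from the image + claimed URL.
if __name__ == "__main__":
    fake_png = b"\x89PNG...pretend screenshot bytes..."
    url = "https://example.com/deleted-tweet"
    print(screenshot_fingerprint(fake_png, url))
```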
What got me thinking about this actually was whether there would be a way to verify which screenshots of the Google search AI are real and which are fake.
The obvious solution is to sacrifice control of your software and hardware to some proprietary third-party system that presumably has no stake in the outcome, but that causes more problems than it solves.
Yes, I can imagine a world in which some company has a system like this, and then could discreetly delete hashes from the database if they see the original image and realize that it shows evidence of something they don’t like.
If it were used for actual investigative journalism or criminal evidence, that would be giving that company a lot of power.
Yes, it would need some way to prove the exact software it was made in, and I'm not sure that's possible.
It would actually be convenient to have a screenshot feature that also automatically links to the latest archive of the website
There are situations where that doesn't work, though, such as when the content can only be seen on your account, like if you take a screenshot of a DM or something.
IDK if Mastodon has a good way to port accounts, but I think it's good to have people first join a basic instance and then move to something more specialized once they get used to the platform.
“The surge of emotion that shot through me when I saved your life taught me an even more valuable lesson: where woke_mind_virus lives in my brain. [Announcer: “woke_mind_virus deleted.”] Goodbye, woke_mind_virus.”
It's hard to say. “Open core” means that most of the software is open source (licenses vary) but some features are locked behind a paywall. GitLab takes this approach, for example; maybe ONLYOFFICE too.