open source; it can be self-hosted, or you can use the official instance.
Personally, I have been using KDE Connect most of the time when I am at home.
PairDrop I use more when sharing with other people across the internet.
You can run a GUI-less service that receives and displays push notifications. I’ve programmed something like this before. I know it is technically a kind of client, but it is not an email client.
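Roughly, such a service can be as small as this sketch (the endpoint URL is made up; plug in whatever your notification source actually exposes, and it assumes a line-oriented HTTP stream plus notify-send on the desktop):

```python
# Hypothetical sketch of a headless notification receiver: subscribe to a
# line-oriented HTTP stream and surface each message as a desktop notification.
import subprocess

import requests

STREAM_URL = "https://example.com/notifications/stream"  # made-up endpoint

def main() -> None:
    # Keep the connection open and read messages as they arrive.
    with requests.get(STREAM_URL, stream=True, timeout=None) as resp:
        resp.raise_for_status()
        for raw in resp.iter_lines():
            if not raw:
                continue  # skip keep-alive blank lines
            message = raw.decode("utf-8", errors="replace")
            # Hand the text to the desktop notification daemon.
            subprocess.run(["notify-send", "Push notification", message], check=False)

if __name__ == "__main__":
    main()
```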
Is the ‘%MARKDOWN’ part of your example correct? Should that also be converted to a dash? Or did you forget the 20 there?
Sadly it’s a bit more complicated than just a Docker container, but there is the manual install doc that goes into a bit more detail.
For anything deeper you’d have to read the script.
Personally I use Dokploy. It’s a dead-simple Docker web UI that makes domains and SSL easy peasy.
GrayJay works fine for me
“Something released! What’s this?” he thinks while following the link and reading:
OpenVox, the community-maintained open source implementation of Puppet.
“Ah yes, Puppet, we have Puppet at home, as does everybody! I use Puppet all the time with the ladies, when they come over for Puppet and chill!”
Be aware, of course, that even though you can type the same commands, use all the same modules and extensions, and configure the same settings, OpenVox is not yet tested to the same standard that Puppet is.
“Of course, of course! As one should know, the Puppet and the OpenVox commands, yes…”
Giving up on extracting any usable information from the website, he opens the GitHub link and reads:
OpenVox is fully Puppet™️ compatible, so modules from the Forge will work
“Can’t forget the Forge now, can we? Aaah, all the fond memories I have of looking at modules coming straight hot from the Forge, amiright fellas?”
All your data and traffic passes through various routers and servers (both of which are computers and have memory) while you do anything on the internet (you can find the list of such computers by doing a traceroute). But because it is end-to-end encrypted, you don’t care.
It’s transient.
You can self-host PairDrop though, including the signaling and TURN server. It’s open source.
You can write a pacman hook that just runs the fix PreTransaction.
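A minimal sketch of what that hook could look like (the target package and script path are placeholders; point them at whatever the actual fix is), saved under /etc/pacman.d/hooks/:

```ini
# /etc/pacman.d/hooks/apply-fix.hook  (hypothetical filename)
[Trigger]
Operation = Install
Operation = Upgrade
Type = Package
# Placeholder: the package whose transaction needs the fix.
Target = some-package

[Action]
Description = Applying the fix before the transaction
When = PreTransaction
# Placeholder path; point this at the script that performs the fix.
Exec = /usr/local/bin/apply-fix
```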
The file does not get uploaded to remote servers. It passes through them, fully encrypted, and the server does not have the keys to decrypt your files.
I think I am limited by the software.
With a gigabit ethernet connection, I was not able to have a good experience.
What’s so bad about servers?
Both are open source.
The signaling server just sees the IPs of your devices and matches them by room ID.
The TURN server sees only locally encrypted files and your IPs (and it is used only IF you are behind a NAT).
As far as I can see, there is no way for anything bad to happen, but I am happy to learn if you know something. If you need proof, I’d gladly give you some of my IPs and encrypted files; see what you can do with them.
Even when my internet doesn’t suck for a minute, I have yet to find Linux remote-desktop software that is not sluggish or ugly from compression artifacts, low resolution, and inaccurate colors.
I tried my usual workflows, and doing any graphic design or 3D work was impossible. But even stuff like coding or writing notes made me mistype A LOT, then backspace 3-5 times, since the visual feedback was delayed by at least half a second.
I run this somewhat. The question I asked myself was: do I R-E-A-L-L-Y need a clone of the root disk on two devices? And the answer was: no.
I have a desktop and a laptop.
Both run the same OS (with some package overlap, but not identical).
I use Syncthing and a VPS Syncthing server to sync some directories from the home folder: Downloads, project files, .bashrc, .local/bin scripts, and everything else that I would actually really need on both machines.
The Syncthing VPS is always on, so I don’t need both computers on at the same time to sync the files. It also acts as an offsite backup this way, in case of a catastrophic destruction of both my computers.
(The trick with Syncthing is to give the same directories the same folder ID on each machine before syncing. Otherwise it creates a second dir like “Downloads_2”.)
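What that boils down to in Syncthing’s config.xml (the path below is illustrative, and the file’s location varies by version; only the id attribute has to match across machines, the label and path can differ):

```xml
<!-- Excerpt from Syncthing's config.xml on both machines. Folders are
     matched across devices by their id attribute, so give the shared
     directory the same id everywhere. -->
<folder id="downloads" label="Downloads" path="/home/me/Downloads" type="sendreceive">
    <!-- device entries etc. omitted -->
</folder>
```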
That setup is easy and gets me 95% there.
The 5% that is not synced is packages (which are sometimes only needed on one of the computers, not both) and system modifications (which I wouldn’t even want to sync, since a lot of those are hardware-specific, like screen resolution and display layout).
The downsides:
I have to configure some settings twice. Like the printer that is used by both computers.
I have to install some packages twice. Like when I find a new tool and want it on both machines.
I have to run updates separately on both systems. I have been thinking about also setting up a shared package cache somehow, but was ultimately too lazy to do it; I just run the update twice.
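If I ever stop being lazy, a minimal sketch of the shared-cache idea (assuming pacman; the mount point is made up, and other package managers have their own equivalents) would be pointing both machines at a shared cache directory:

```ini
# /etc/pacman.conf on both machines
[options]
# Made-up path: a directory shared between the machines, e.g. an NFS mount.
CacheDir = /mnt/shared-pkg-cache/
```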
I find the downsides acceptable; the whole thing was a breeze to set up, and it has been running like this for about a year now without any hiccups.
And as a bonus, I also sync some important documents to my phone.
There is stuff like
https://meshtastic.org/
https://unsigned.io/private-messaging-over-lora/