This. They even provide the cover image to use. If they don’t want embedding they could just block the request.
But they don’t want to. They want to sell the cake and eat it too.
BlueSky is its own thing with its own federated protocol called ATproto. Their docs explain how it works and how its features differ. There's also a bridge between the two; a bit janky, but effective.
You just put both in the server_name line and you're good to go.
That should mostly be the default. My secondary Vega 64 reports using only 3W, which would be worth chasing on a laptop, but I doubt 3W noticeably affects your electricity bill. It's nothing compared to the overall power draw of the rest of the desktop and the monitors. Pretty sure even my fans use more.
The best way to address this would be to first take proper measurements. Maybe get a kill-a-watt and measure usage with and without the card installed to get the true draw at the wall. Also maybe get a baseline with as little hardware as possible. With that data you can calculate roughly how much it costs to run the PC and how much each component contributes, and from there it's easier to decide if it's worth it.
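To put rough numbers on it (the electricity rate and the 150W desktop figure are just assumptions, plug in your own):

```
Idle card:    3 W × 24 h × 30 days   = 2.16 kWh/month ≈ $0.32/month at $0.15/kWh
Whole desktop: 150 W × 8 h × 30 days = 36 kWh/month   ≈ $5.40/month at $0.15/kWh
```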
Just the electric bill being higher isn't a lot to go with. Could just be that it's getting cold, or hot. Little details can really throw expectations off. For example, mining crypto during the winter is technically cheaper than not mining for me, because I have electric heat: whether it's 500W going into a heating strip or 500W going into mining, both produce the same amount of heat in the room, but one of them also makes me a few cents as a byproduct. You have to consider that sort of thing when you're optimizing for cost rather than maximizing battery life on a laptop.
Yeah, that didn’t stop it from pwning a good chunk of the Internet: https://en.wikipedia.org/wiki/Log4Shell
IMO the biggest attack vector there would be a Minecraft exploit like log4j, so the most important part to me would be making sure the game server is properly sandboxed just in case. Start from the point of view that the attacker has breached Minecraft and has shell access as that user. What can they do from there? Ideally, nothing useful other than maybe running a crypto miner. Don't reuse passwords, obviously.
With systemd, I'd use the various Protect* directives like ProtectHome and ProtectSystem=full, or failing that, a container (Docker, Podman, LXC, set up manually; there are options). Just a bare Alpine container with Java would be pretty ideal, as you can't exploit sudo or other SUID binaries if they don't exist in the first place.
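A rough sketch of what the systemd route could look like (paths, user, and memory flag are placeholders; check each directive against your setup):

```ini
# /etc/systemd/system/minecraft.service (hypothetical example)
[Unit]
Description=Minecraft server
After=network.target

[Service]
User=minecraft
WorkingDirectory=/srv/minecraft
ExecStart=/usr/bin/java -Xmx4G -jar server.jar nogui
# keep a compromised process away from the rest of the system
ProtectHome=true
ProtectSystem=full
PrivateTmp=true
NoNewPrivileges=true

[Install]
WantedBy=multi-user.target
```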
That said, the WireGuard solution is ideal because it limits potential attackers to people you handed a key, so at least you'd know who breached you.
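A minimal sketch of that WireGuard side, assuming one peer per player (keys and addresses are placeholders):

```ini
# /etc/wireguard/wg0.conf on the server (placeholder keys/addresses)
[Interface]
Address = 10.0.0.1/24
ListenPort = 51820
PrivateKey = <server-private-key>

# one [Peer] per player you hand a key to,
# so any breach traces back to a specific key
[Peer]
PublicKey = <player-public-key>
AllowedIPs = 10.0.0.2/32
```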
I've left Minecraft servers online and forgotten about them, and really nothing happened whatsoever.
Meanwhile, me running my whole Steam library off ZFS over NFS 😅
For a CPU benchmark like this, something is definitely weird, because Wine shouldn't be translating any CPU instructions. I wonder if the benchmark might be doing weird things with the Windows API.
That sounds like you’re hitting an edge case and it might not be representative of the actual performance you can expect out of Wine.
Have you tried other benchmarks?
I think it's not so much that we expect everyone to host their own, but that it's possible at all, so multiple companies can compete without having to start from scratch.
Sure, there will be hobbyists that do it, but just on Lemmy, users already have the freedom of going with lemmy.ml, lemmy.world, SJW, lemm.ee and plenty more.
It’s about spreading the risk and having alternatives to run to.