![](https://lemmy.jelliefrontier.net/pictrs/image/4674667d-6a51-4f15-be2f-a6c8a874201f.png)
![](https://lemmy.world/pictrs/image/44bf11eb-4336-40eb-9778-e96fc5223124.png)
I know you’re complaining, but I think you just described a good chunk of the reasons why I like Lemmy and the fediverse in general.
No one here is important or official. There are no video game community managers or anything like that here. Lemmy is not used for interacting with anyone other than fellow idle nerds.
This is how Reddit was before it exploded in popularity and companies and celebrities started taking it seriously. I don’t know if Lemmy will ever get to that point, especially seeing how much abuse people will endure before they change platforms.
Other harmful side-effects aside, how much a game impacted you is significantly affected by the context of your life. Experiencing the same game at a different time in your life might not be as meaningful.
as a starting point to learn about a new topic
No. I’ve used several models to “teach” me about subjects I already know a lot about, and they all frequently get many facts wrong. Why would I then trust it to teach me about something I don’t know about?
to look up a song when you can only remember a small section of lyrics
No, because traditional search engines do that just fine.
when you want to write a block of code that is simple but monotonous to write yourself
See this comment.
suggest plans for how to create simple structures/inventions
I guess I’ve never tried this.
Anything with a verifiable answer that you’d ask on a forum can generally be answered by an LLM, because they’re largely trained on forums, and there’s a decent chance the training data included someone asking the question you are currently asking.
Kind of, but here’s the thing: it’s rarely faster than just using a good traditional search, especially if you know where to look and how to use advanced filtering features. Also (and this is key), verifying the accuracy of an LLM’s answer requires about the same amount of work as just not using an LLM in the first place, so I default to skipping the middleman.
Lastly, I haven’t even touched on the privacy nightmare that these systems pose if you’re not running local models.
Creating software is a great example, actually. Coding absolutely requires reasoning. I’ve tried using code-focused LLMs to write blocks of code, or even some basic YAML files, but the output is often unusable.
It rarely makes syntax errors, but it will do things like reference libraries that haven’t been imported or hallucinate functions that don’t exist. It also constantly misunderstands the assignment and creates something that technically works but doesn’t accomplish the intended task.
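As a hypothetical illustration (not the output of any specific model), this is the kind of hallucination I mean: syntactically valid code that calls an API that simply doesn’t exist, yet looks plausible enough to pass a casual review.

```python
# Hypothetical sketch of a hallucinated API call. Python's standard
# json module has no prettify() function, but the call reads as if
# it should work.
import json

data = {"user": "alice", "active": True}

try:
    pretty = json.prettify(data)  # AttributeError: no such function
except AttributeError:
    # The real API is json.dumps() with an indent argument.
    pretty = json.dumps(data, indent=2)

print(pretty)
```

Catching the mistake still requires knowing (or looking up) the real API, which is exactly the verification work the LLM was supposed to save.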
Personally I have yet to find a use case. Every single time I try to use an LLM for a task (even ones they are supposedly good at), I find the results so lacking that I spend more time fixing its mistakes than I would have just doing it myself.
If you think of LLMs as something with actual intelligence you’re going to be very unimpressed… It’s just a model to predict the next word.
This is exactly the problem, though. They don’t have “intelligence” or any actual reasoning, yet they are constantly being used in situations that require reasoning.
Roughly 10 million.
I would consider 1/3 a notable contender. Granted, only ~1 million of those users are active daily, but that’s still very significant for a FOSS alternative.
EDIT: Source
I just tried it out and it’s really nice!
I see you’ve got biceps to spare.
Shame / embarrassment is an extremely powerful teacher, for better or worse.
The current theory is that shame evolved in humans as a survival mechanism to keep humans in groups. Shame is our brain’s corrective tool to avoid behavior that would ostracize us from a social group. If an early human were outcast by their tribe, their chances of survival or reproduction plummeted.
They’ll be fine. I don’t think Apple can trademark something as common as “Hello” in French, and it looks like this project is based in France.
It also poses zero threat to Apple, since this doesn’t compete with any of their products or services.
Yes! Authentik is a great self-hosted OAuth platform. They actually publish integration guides in their documentation.
Integrate with Immich
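For reference, that guide essentially amounts to creating an OAuth2/OIDC provider and application in Authentik, then pointing Immich’s OAuth settings at it. A rough sketch of the Immich side (field names are from memory, and the domain and application slug are placeholders, so verify everything against the guide):

```json
{
  "oauth": {
    "enabled": true,
    "issuerUrl": "https://authentik.example.com/application/o/immich/",
    "clientId": "<client ID from Authentik>",
    "clientSecret": "<client secret from Authentik>",
    "scope": "openid email profile",
    "buttonText": "Login with Authentik"
  }
}
```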
Funny, but very confusing for people out of the loop.
I’m an American and I think American social media should be banned.
That is, closed-source, centralized for-profit social media platforms that will inevitably devolve into ads and data collection machines should be banned.
The problem isn’t the country that hosts the platform. The problem is the incentive structure for social media to profit off its users.
Platforms that are either FOSS, run by non-profits, or pay-to-use don’t have an intrinsic incentive to exploit their users and can, in theory, be run ethically and sustainably.
The sad reality is that the quality of modern Blu-ray releases has significantly declined. Sure, the picture looks great, but they barely come with special features anymore. Also, the QA is atrocious. I buy a lot of UHD Blu-rays and ~30% of them come corrupted/damaged out of the box.
I really want physical media to become popular again so companies start actually putting in effort.
EDIT: I still love physical media. It’s pretty much the only way to own a copy of media anymore. I just wish it were as beloved as it was in the DVD days.
This is neat! Bookmarking this. Not sure what you’re using on the backend, but are you open to contributions for more detailed descriptions?
If you mean those jars of mushed up fruit, they used mushed up fruit.
If you mean formula, then they either had a wet nurse who could assist, or the baby would die.
Infant mortality has decreased significantly in the modern age, and it’s the main reason average life expectancy has gone up so much.
Okay, fair enough.
Yes. The Lemmy instance I’m commenting from is running on a Raspberry Pi 4. A couple things you’ll need to consider though: