

Unless OP is running a data center, then there’s not really much of a power increase to run a local Ollama.
Hi! I’m Katherine, or webkitten. I’ve been on the internet since our family got our first computer - a Tandy Sensation.
Yes, I went to computer camp as a kid and learned how to program in BASIC on a Radio Shack TRS-80 Model 4.
I’m trans, queer, and bisexual. #actuallyautistic
I started programming with PHP in the mid-90s and haven’t stopped. I’m an advocate for the open web; I used Netscape for as long as I can remember.
I have an obsession with Hello Kitty, Moogles, and Squishmallows.




This is late, but I switched from AWS to Hetzner and Domeneshop for VPS hosting and DNS, and I’ve been quite happy. I have a dedicated VPS with far more power for far less than I paid through AWS.
I’m also using Hetzner’s Storage Box as a Google Drive replacement.
I mean it depends on what subs you subscribe to; my subscribed feed is mostly tech, history, and wholesome news.
But politics will tend to bubble up because *gestures around*


There’s a really good Docker image I use for RustDesk at home. I’m thinking of just setting it up on my mom’s laptop and then dropping a script on her desktop to toggle it on or off, depending on whether she needs help (so she doesn’t have to fiddle with the commands).
But, yeah, the RustDesk Docker image is super easy to use along with the client. Then I just set up Tailscale on my mom’s computer and invite her to my network.
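For anyone curious, the self-hosted setup can be sketched with a compose file. This is a sketch, assuming the public rustdesk/rustdesk-server image and its two standard services (hbbs, the rendezvous server, and hbbr, the relay); check the current RustDesk docs for the exact ports and flags your version expects:

```yaml
# Minimal sketch of a self-hosted RustDesk server (verify against current docs)
services:
  hbbs:
    image: rustdesk/rustdesk-server   # rendezvous/ID server
    command: hbbs
    volumes:
      - ./data:/root                  # persists generated keys
    network_mode: host
    restart: unless-stopped
  hbbr:
    image: rustdesk/rustdesk-server   # relay server
    command: hbbr
    volumes:
      - ./data:/root
    network_mode: host
    restart: unless-stopped
```

With Tailscale in the mix, the clients can point at the server’s tailnet address, so nothing has to be exposed to the open internet.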


git init --bare repository.git if you want to just have a private git somewhere on a server you own.
I’ve also used Beanstalk for years, both for SVN and Git, and have been pretty happy (back when Github private repos were paid only). I have no connection with them; I just used them because back in the day it was cheaper for private repositories than subscribing to Github.
Personally now, I just use Codeberg as an alternative and love it.
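The bare-repo route above is just a couple of commands. A quick sketch (the /tmp path, server name, and user here are placeholders for illustration):

```shell
# On the server: create a bare repository (no working tree).
mkdir -p /tmp/gitdemo
cd /tmp/gitdemo
git init --bare repository.git

# From your own machine you would then clone over SSH, e.g.:
#   git clone user@yourserver:/tmp/gitdemo/repository.git
# Here we clone locally just to show the repo is usable:
git clone repository.git working-copy
```

A bare repo has no checked-out files, which is exactly what you want on the server side: everyone pushes to it and nobody edits in it.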


Pretty flawless update from the apt repo on my end.
Server version 10.11.7
I have a GL-AX1800 and I’ve been happy with it; going to get another for my mum.


This is good, but cis people still need to push him to keep the promise he made during the campaign to protect trans people, and to stand up to Mount Sinai and Langone.
https://bsky.app/profile/erininthemorning.com/post/3mhtvnnr4222c


What troubles were you having with Baikal? I generally let mine just sit with a checked out tag from Git.


When you go to a shelf of recommendations, you’re not picking from a human; you’re picking from a shelf.


Seriously, local AI use is what everyone should strive for, not only for privacy but because it’s better than relying on a large data centre, and the power draw for Ollama is negligible.


Is it any different than getting movies based on recommendations from employees at video stores?
You could probably get away with using gemma3:4b or phi3.5.