Could it be an MTU issue? Networking can be weird if packets get fragmented unexpectedly, but I see this mostly with IKEv2 and other VPN services. Maybe try lowering your MTU on the WAN side?
I have been running Nextcloud for many, many years. For a very long time I hosted it on Hetzner's second-lowest webspace tier. It was not very fast there (you get what you pay for), but fast enough for our needs. Later I moved it to an Azure VM and after that to my home server, where it runs blazingly fast, especially since the last updates they pushed out.
In all that time I never reinstalled. I just upgraded to the newer versions when they came out. The only times I had problems upgrading were when I was hosting on that cheap webspace instance at Hetzner and the upgrade process took longer than the PHP timeout my very cheap hosting plan allowed. So it was never a fault of Nextcloud, just that I hosted it on basically the cheapest hosting plan I could find.
We use it for file sharing, calendar + contacts (synced with DAVx), Notes and of course Talk. For Talk to make full use of voice + video calls you should have a TURN server, but if you do not use that (if you just text), it ran great even on the webspace instance at Hetzner.
We are very happy in our family that it exists, that it is free and that it has served us well for so many years.
You would think so, yes. But to my surprise, my well over 60 containers so far consume less than 7 GB of RAM, according to htop. And of course containers can network and share services: for external access, for example, I run only one Traefik instance, and one coturn for Nextcloud and Synapse.
I would absolutely look into it. Many years ago when Docker emerged, I did not understand it and called it “hipster shit”. But a lot of people around me who used Docker at that time did not understand it either. Some lost data, some had services that stopped working and they had no idea how to fix them.
Years passed and containers stayed, so I started to take a closer look and tried to understand it: what you can do with it and what you can not. As others here said, I also had to learn how to troubleshoot, because stuff now runs inside a container and you don't just copy a new binary or library into a container to try to fix something.
Today, my homelab runs 50 containers and I am not looking back. When I rebuilt my homelab this year, I went full Docker. The most important reason for me: every application I run dockerized is predictable and isolated from the others (on the binary side; the network side is another story). The issues I had earlier, when everything ran directly on the box in Linux, were things like one application needing PHP 8.x while another, older one still only runs with PHP 7.x. Or multiple applications depending on a specific library, and after updating it one app works while the other doesn't anymore because it would need an update too. Running an apt upgrade was always a very exciting moment… and not in a good way. With Docker I do not have these problems. I can update each container on its own, and if something breaks in one container, it does not affect the others.
Another big plus is the backups you can do. I back up every docker-compose file + data for each container with Kopia. Since barely anything is installed directly in Linux, I can spin up a VM, restore my backups with Kopia and start all containers again to test my backup strategy. Stuff just works. No fiddling with the Linux system itself, adjusting tons of config files and installing hundreds of packages to get all my services up and running again after a hardware failure.
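To make that concrete, here is a rough Python sketch of such a per-stack backup loop. It is not my actual script: the /opt/stacks layout and the one-folder-per-stack structure are assumptions, and it expects that a Kopia repository is already connected on the machine.

```python
#!/usr/bin/env python3
# Hypothetical sketch: snapshot each compose stack (compose file + data dirs)
# into an already-connected Kopia repository.
import subprocess
from pathlib import Path

STACKS_DIR = Path("/opt/stacks")  # assumed layout: one folder per stack

for stack in sorted(p for p in STACKS_DIR.iterdir() if p.is_dir()):
    compose_file = stack / "docker-compose.yml"
    if not compose_file.exists():
        continue  # skip folders that are not compose stacks

    # Stop the stack so databases etc. are consistent, snapshot, then restart.
    subprocess.run(["docker", "compose", "-f", str(compose_file), "down"], check=True)
    try:
        subprocess.run(["kopia", "snapshot", "create", str(stack)], check=True)
    finally:
        subprocess.run(["docker", "compose", "-f", str(compose_file), "up", "-d"], check=True)
```

Restoring on a fresh VM is then basically a `kopia snapshot restore` per stack followed by `docker compose up -d`, which is why the strategy is so easy to test.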
I really started to love Docker, especially in my Homelab.
Oh, and you would think everything being containerized means big resource usage? My 50 containers right now consume less than 6 GB of RAM, and I run stuff like Jellyfin, Pi-hole, Home Assistant, Mosquitto, multiple Kopia instances, multiple Traefik instances with CrowdSec, Logitech Media Server, Tandoor, Zabbix and a lot of other things.
I love Traefik! When I started, I tried Nginx, but could not wrap my head around it. So I tried Caddy. Pretty easy to understand, and I used it for a while. Then I had demands Caddy could not meet and stumbled upon Traefik. As you said, a learning curve, but for me much easier than Nginx. I like that you can put the Traefik config inside the compose files and that a service is only active in Traefik when the actual containers are up and running. I added CrowdSec to my external-facing Traefik instance, and I also use a plain Traefik instance for all my internal services. And it can forward HTTP, HTTPS, TCP and UDP.
Thank you for your feedback! I get the impression that it might work if used on a small scale when it's not public. I guess I will have a new container soon :-)
One reason is simply because I can. And because of that, I tend to host the things myself that I can. That way the cost and the work of maintaining it are on my side and not on others. A few fewer users from our household on a public instance means more room for people who are just not as tech-savvy and have no other choice than to rely on public instances. So it is a mix of respecting other people's time, effort and money, and partly just the nerd in me who wants to find out how it works and how it's done :-)
Oh wow, that is a lot more usage than I can think of for all of us here, haha! Thank you very much. That sounds very promising.
That is cool, thank you very much!
I was about to order an SLZB-06, but they were out of stock. That one looks like exactly what I want. I never really looked into Node-RED, but normalizing everything before using it makes sense. The SLZB-06 makes the Zigbee network connection independent from any server, and having everything go through MQTT makes you independent from any software that has to communicate with the devices. Sounds like a lot of flexibility and independence.
Matter, and whether external connections are needed or not, will be interesting to follow. My HA instance is internal only too, since it does nothing that requires access over the internet. And Owntracks delivers to a separate MQTT instance that has no internal devices on it. So my HA is shut off from the internet, and for everything Smart Home I buy I will pay attention that it does not require an internet connection either.
Thanks :-)
Thank you! Regarding the Sonoff Zigbee 3 dongle: I see it has an external antenna. Did you need a USB extension cord too, or is the interference problem other dongles seem to have with USB ports mitigated by the external antenna?
May I ask what gateway / dongle you use? Oh, and that weather station sounds interesting; if you happen to have a manufacturer / model for me, I would like to read up on that too :-)
I am leaning towards MQTT too because of other solutions that already integrate with it. It looks like a great way to throw different data into a single pool and make it all accessible in the same way, no matter whether it is a switch, a temperature sensor, a camera, GPS data, etc.
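Roughly what I mean by a single pool, as a small Python sketch (assuming the paho-mqtt 2.x library, a Mosquitto broker on localhost, and topic names that only loosely match what Zigbee2MQTT / Frigate / Owntracks publish by default):

```python
# Hypothetical sketch: one subscriber reading a mixed pool of MQTT data,
# no matter whether it comes from Zigbee2MQTT, Frigate or Owntracks.
import json
import paho.mqtt.client as mqtt

TOPICS = [
    ("zigbee2mqtt/+", 0),    # switches, temperature sensors, ...
    ("frigate/events", 0),   # camera events
    ("owntracks/+/+", 0),    # GPS locations
]

def on_connect(client, userdata, flags, reason_code, properties):
    client.subscribe(TOPICS)

def on_message(client, userdata, msg):
    try:
        payload = json.loads(msg.payload)
    except ValueError:
        payload = msg.payload.decode(errors="replace")
    print(f"{msg.topic}: {payload}")

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_connect = on_connect
client.on_message = on_message
client.connect("localhost", 1883)
client.loop_forever()
```

The point is that everything arrives as topic + payload, so whatever consumes the data does not need to know anything about the device behind it.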
I am leaning more towards something that might be complex to set up but has broader support and is future-proof, and the latter is something that is not really clear for any of the current protocols. Maybe in 10 years things will have settled and everyone uses the same protocol, but who knows what it will be :-) I am leaning towards Zigbee2MQTT for now, since my impression is that MQTT gives you a lot of flexibility. I already use MQTT for Frigate and Owntracks, and if other devices put their stuff into MQTT, I will have a pretty open pool for all the data / actions, even if I switch from HA to something else in the future. I feel MQTT is here to stay for a while, but well… that could all be wrong, haha!
Thanks for your input! MQTT is not an issue; I have Mosquitto running on both my installs (one with HA + Frigate at a remote location, the other one HA for me, which is also used by Owntracks). I will even set up a second broker and connect the two together, so the instance that needs internet access for MQTT (Owntracks) is not shared with the devices in my home.
My impression of Matter was also that it is not “done” yet and device support is poor. On the other hand, you read at every corner that it will be the future. This is why the SkyConnect adapter looked very interesting to me at first, but most of the features I would use now (probably Z2M, Docker compatibility) do not seem to work yet, or at least not reliably.
To be honest, I might have mixed something up while reading up on all those standards. After researching a topic, my browser usually ends up with a hundred tabs “just in case I need that information again”, and honestly, in all that information I cannot find it specifically. My conclusion from reading all of it was: “Make sure every device you buy works with your specific gateway, even if it says it supports protocol X.”
Thank you very much, that was very helpful! I am leaning towards Zigbee2MQTT and it seems to be a good choice.
ZHA is out of the question now: given how often I have to restart HA for updates, I am sure it would get annoying. My Mosquitto instance, however, does not receive updates as frequently, so Z2M might be the better choice for me.
Could you tell me what kind of bridge / device / dongle you use? I often read about the ConBee II**, but this is not on the recommended list for Z2M. On the other hand, I am thinking about getting a network-only gateway (there are some on the recommended list too), so I do not rely on a USB connection that needs to be mapped into a Docker container. Having pure network connections and independent devices sounds like a more stable solution. I also assume a networked gateway, unlike a USB-connected one, would be unaffected by restarts of my containers or even the host.
**Edit: It IS on the recommended list, I just got confused because it is not listed in the USB section but under Other :-)
We follow the principle of doing one thing well instead of all things mediocre, so we use two solutions for what you asked about. Like others in the thread, we use Tandoor, but only for recipes and meal planning. It does this exceptionally well, but the shopping list part does not fit our style of shopping.
As a shopping list, we use David Shay's Groceries / Specifically Clementines. Why?
There is more, but this post got too long already. It also has user management, permissions and live sync. Yes, my partner can see live when I tick off items on the list and can put stuff on the list while I am shopping :-)
Everything in that software feels like it was created by a person who actually goes shopping.
It has a very good web interface (which also has an offline mode, AFAIK) and a very good Android app.
Does it look fancy? No. Does it have everything we ever wanted in a shopping list app? Absolutely!