I think something like this would do. You can search the list of supported devices there; filter by devices that expose power.
Forgejo's built-in package registry supports a ton of formats, including Docker/OCI images.
Yeah, it works well until it's under load, which federation definitely brings. Matrix and Lemmy each have about 20GB of RAM dedicated to the database on my servers.
There are Postgres settings to reduce disk writes: a maximum WAL size and a checkpoint timeout that control how often data gets flushed to disk. By default these values are on the lower end, I believe.
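As a sketch, the knobs I mean are the checkpoint settings in `postgresql.conf`; the values below are purely illustrative, tune them for your own workload and RAM:

```ini
# postgresql.conf — illustrative values, not a recommendation
# Spreading checkpoints out lets Postgres batch more writes in memory
checkpoint_timeout = 15min         # default 5min
max_wal_size = 4GB                 # default 1GB; a checkpoint triggers when WAL reaches this
checkpoint_completion_target = 0.9 # smooth checkpoint I/O over the interval
# Optional and riskier: trade a few seconds of durability for fewer fsyncs
# synchronous_commit = off
```

Larger values mean fewer, bigger flushes, at the cost of longer crash recovery.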
It's using OpenID Connect, so yes, it would support those, and a lot of others as well. Authelia, Keycloak, and Authentik are a few examples that are open source and self-hostable. It's nice to have SSO for all your services, even in a homelab.
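For context, wiring a service up to any of those providers usually comes down to the same handful of OIDC client fields. This is just the generic shape; the exact field names vary per app, and every value here is a placeholder:

```yaml
# Generic OIDC client settings — placeholder values, key names differ per service
oidc:
  issuer: https://auth.example.com   # the provider's discovery base URL
  client_id: my-service
  client_secret: some-long-random-secret
  redirect_uri: https://service.example.com/oidc/callback
  scopes: [openid, profile, email]
```

You register the client once in the provider (Authelia/Keycloak/Authentik) and paste the matching values into the service.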
NUCs are the way to go. Intel chips are good for transcoding, and they come with a 3-year warranty. I had 1 of my 3 die within 4 months and got a full replacement. I've since bought another, so I'm running 4 now and have been for about a year. They're running tons of stuff, and I measured power at about $2.50/mo per PC.
Ah yeah, it sucks to host on your actual computer, especially if you're using the laptop portably. I just wanted to point out that storage isn't too bad if set up right. Best of luck finding a good host for your needs!
Curious here: how is storage an issue? Lemmy is 90% text, and if you enable the image proxy, pictures are maybe 10%. If images are hosted elsewhere and not stored locally, it's all text. I've had Lemmy hosted for over a year now and it's using under 35GB. I have a bot that subscribes to the top posts on larger instances, so I should have a lot of communities loaded.
Edit: oh yeah, pict-rs integrates with the S3 API now, so you can offload image storage to the cloud for pennies.
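As a rough sketch of what that looks like: pict-rs can take its store config from environment variables, something along these lines. The variable names are from memory and may differ between versions, and every value is a placeholder, so check the pict-rs docs for your release:

```shell
# Illustrative pict-rs object-storage settings — verify names against your version's docs
export PICTRS__STORE__TYPE=object_storage
export PICTRS__STORE__ENDPOINT=https://s3.example.com
export PICTRS__STORE__BUCKET_NAME=lemmy-images
export PICTRS__STORE__REGION=us-east-1
export PICTRS__STORE__ACCESS_KEY=changeme
export PICTRS__STORE__SECRET_KEY=changeme
```

Any S3-compatible provider works, which is where the "pennies" part comes from.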
At least one great thing about Bitwarden: the passwords are cached on each device, so you kind of already have backups. That being said, backups for Vaultwarden are still beneficial.
I used Google Lens. Got stuck afterwards on a chess rule. The captcha used the notation from the chess rule to complicate it further, haha.
Other notable resources:
- Hardware-accelerated machine learning requirements
- Hardware-accelerated transcoding for videos
You'll need a stronger CPU (or maybe multiple, since you can run machine learning across multiple machines) to handle the load if you're not using a supported discrete GPU. Also, if you want to transcode videos, you'll have to check the CPU's compatibility with the codecs you want to encode/decode and the format you want to store.
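For the transcoding side specifically, a quick way to see what your ffmpeg build can hand off to hardware (the guard just avoids an error on machines without ffmpeg):

```shell
# List the hardware acceleration methods (qsv, vaapi, etc.) this ffmpeg build supports
if command -v ffmpeg >/dev/null 2>&1; then
  ffmpeg -hide_banner -hwaccels
else
  echo "ffmpeg not installed"
fi
```

On Intel iGPUs you'd expect to see `qsv` and/or `vaapi` in that list; from there you can test an actual transcode against a sample file before committing to a storage format.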
That being said, Immich barely uses CPU resources 99% of the time, with the exception of when media is backing up to it.
The client sets up its transcoding profile (what it supports for direct play, etc., for automatic transcoding), or the client has to specifically request a different quality. Findroid has had PRs for the latter, and I did one of them, building on the older PRs.
This is because it doesn't support transcoding. It does direct streams only.
There are Forgejo runners, and they seem compatible with a bunch of GitHub Actions. I created one that builds a Docker image and publishes it to the repo's registry.
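A minimal sketch of that kind of workflow, assuming a Forgejo instance with its container registry enabled. The instance host, repo path, secret name, and the `runs-on` label are all placeholders; the label in particular depends on how your runner was registered:

```yaml
# .forgejo/workflows/docker.yml — sketch only; host, paths, and secrets are placeholders
on:
  push:
    branches: [main]

jobs:
  build:
    runs-on: docker
    steps:
      - uses: actions/checkout@v4
      - name: Log in to the instance registry
        run: echo "${{ secrets.REGISTRY_TOKEN }}" | docker login code.example.com -u owner --password-stdin
      - name: Build and push the image
        run: |
          docker build -t code.example.com/owner/repo:latest .
          docker push code.example.com/owner/repo:latest
```

The nice part is that `uses:` steps like `actions/checkout` resolve against GitHub-compatible actions, so much of the existing ecosystem just works.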
Lidarr to download music; it has Spotify playlist integration. The only problem is it pulls by album, not by song.
LMS to play music; it supports selecting which tags to use.
Picard to tag the music. Kind of optional, but with plugins it can pull genre, moods, and BPM, which I liked using to build a smart playlist of songs I like the sound of without hand-picking from thousands.
The only time I use Caddy is to serve static files… I then put an nginx proxy in front of it to expose it lol
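A minimal sketch of that setup, with paths and hostnames as placeholders: Caddy serves the files on an internal port, and nginx is the proxy that's actually exposed.

```Caddyfile
# Caddyfile — serve a static directory on an internal port (placeholder path)
:8080 {
    root * /srv/static
    file_server
}
```

```nginx
# nginx — the externally exposed proxy (placeholder hostname)
server {
    listen 80;
    server_name static.example.com;
    location / {
        proxy_pass http://127.0.0.1:8080;
    }
}
```

Caddy's `file_server` handles the static serving with almost no config, which is the appeal even with another proxy in front.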
Tools for this already exist for Takeout archives; that's how I migrated to Immich.
I ordered 5 of these when I saw they were finally available. I probably won't use them until PiKVM is ported to them; there's a lot of talk about the firmware currently being closed source.
It just says you're unlicensed in the web app. They've also said no features will ever be locked behind it.
Using the iGPU might be problematic for transcoding if you need that. I'd recommend older Intel/Asus NUCs if you want a mini PC: 3-year warranty, built for enterprise, and the tall version has room for a 7mm SATA SSD or HDD alongside an NVMe M.2 SSD.
I think the Asus 12th-gen+ models have another M.2 slot, though it's the smaller 2242 size. Doing all this, you can upgrade one to 64GB RAM, an 8TB M.2 2280, an 8TB SATA SSD, and a 1TB M.2 2242. In a homelab, especially with mini PCs, the limit is usually RAM/storage rather than CPU.
I got 4 11th-gen units with 64GB RAM each and 32TB of SSD storage total. I recommend avoiding QLC SSDs as much as possible; aim for TLC, MLC, or SLC. Higher-capacity drives tend to be QLC or TLC, and QLC has the shortest endurance and slowest speeds.