/r/selfhosted
A place to share, discuss, discover, assist with, gain assistance for, and critique self-hosted alternatives to our favorite web apps, web services, and online tools.
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Service: Dropbox - Alternative: Nextcloud
Service: Google Reader - Alternative: Tiny Tiny RSS
Service: Blogger - Alternative: WordPress
We welcome posts that include suggestions for good self-hosted alternatives to popular online services, how they are better, or how they give back control of your data. Also include hints and tips for less technical readers.
What is self-hosted, as it pertains to this subreddit?
A few months ago I asked about a solution that would let non-technical users host multiple existing self-hosted services:
https://www.reddit.com/r/selfhosted/comments/1fswyyl/selfhosted_as_a_desktop_application_idea/
Now I've found YunoHost, which looks AMAZING.
But I'm a technical user: I deploy my services with Docker, with volumes for backups (via rclone), init scripts for the database, and so on.
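For context, my usual per-service pattern looks roughly like this (a trimmed sketch rather than a real stack; the image and the rclone remote name are stand-ins):

```yaml
# Hypothetical docker-compose sketch: an app with a named data volume,
# plus an rclone sidecar that syncs that volume to a remote for backup.
services:
  app:
    image: nginx:alpine        # stand-in for the actual service
    volumes:
      - appdata:/data

  backup:
    image: rclone/rclone
    volumes:
      - appdata:/data:ro                             # read-only view of the app data
      - ./rclone.conf:/config/rclone/rclone.conf:ro  # default config path in this image
    # One-shot sync; in practice I schedule this (cron, or a loop with sleep).
    command: sync /data remote:backups/app

volumes:
  appdata:
```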
I'm asking my friends to try the system; meanwhile, does anyone in our community have experience with YunoHost and want to share?
My website somehow got flagged for phishing in the Cloudflare backend. I think it was probably an abandoned WordPress plugin that created a security issue. I'm not entirely sure, but my site is just a basic landing page with info; there is no store or anything like that.
When I went into Cloudflare to figure out how to fix it, they let me request a review, but nothing has happened for a couple of months now.
Anyone have this kind of experience and know what I should do?
TIA
I would like to hear your opinion on a possible piece of hardware to purchase for my home lab. I'm not sure if I'll be able to buy it yet, because these things are very expensive here in Brazil, but here we go.
Option 1: HP ProDesk 400 G9 SFF (not micro), i5-12500, 8 GB DDR4, 1 TB PCIe NVMe SSD;
Option 2: Dell OptiPlex Micro, i5-13500T, 8 GB DDR4, 256 GB SSD.
They are practically the same price, and the performance of the two processors seemed very similar in most things according to the research I did, but the i5-13500T has 14 physical cores and 20 threads, while the i5-12500 has 6/12.
Today I use a Raspberry Pi 4 (8 GB) with Debian, running Home Assistant, AdGuard Home, Vaultwarden, Node-RED, etc. via Docker, and I have no problems with this hardware.
However, I would like to use Immich with ML for family photos, maybe something from the *arr stack, Plex for videos on the living-room TV, and I would like to back up my important things.
A NAS would be overkill for me, since my important documents don't reach 2 GB, and even with the photos added I can easily store everything in 1 TB, but I know I will need to expand in the future.
My idea would be to use Proxmox as follows:
1 HAOS VM (Home Assistant, MQTT, Zigbee2MQTT, Node-RED);
1 VM with other services in Docker (or LXC, I still don't understand this part well), such as the *arr stack, DNS, a Plex server, Vaultwarden, a proxy, Immich with ML, Paperless-ngx, and other services that I want to "play" with;
1 VM with some Linux system for tests (Docker tests or any other utility);
Maybe 1 Windows VM for tax software that doesn't work well on my MacBook.
The RPi 4 (8 GB) would handle DNS, VPN, and backup redundancy (one HDD or SSD via USB 3.0, with scheduled backups of the photos in Immich, the documents in Paperless-ngx, and snapshots of the VMs).
I think I will have to increase the RAM in either case, to 16 or 32 GB (I don't know if I'm missing something here).
I know that both computers would be able to run this, and from what I saw the power consumption would be similar in both cases (considering the use of 1 SSD and 1 HDD in both). My doubt is about the big difference in physical cores and threads in this VM and/or Docker/LXC scenario, as I don't know how it behaves in practice.
Hello /r/selfhosted !
I just finished building GopherDrop, a self-hostable tool inspired by Bitwarden Send. It's a secure REST API and UI for sharing one-time secrets and files. Built with Go for the backend and Vue.js with Vuetify for the frontend.
You can check it out here: Github Link
Would love to hear your thoughts and suggestions since this is my first open source project.
Hi. I have an Android phone that uses Chrome as its browser. I'm trying to use it to locally view web pages that I built using VS Code from a Windows 10 laptop. VS Code is running the Live Server extension and is serving pages on port 5500.
When I go to my computer's local IP address and port 5500 on my phone, Chrome shows "This site can't be reached", says the address is unreachable, and reports "ERR_ADDRESS_UNREACHABLE".
On the PC itself, browsing using the local IP address or using 127.0.0.1:5500 both work. I tried disabling Windows Defender's firewall, but it had no effect. I also added an inbound rule there which allows traffic on port 5500. The other security programs that I have are Avira Phantom VPN and Malwarebytes.
Does anyone have any insights that could help me? My goal is to preview the pages served from my laptop on my phone as I build them, so I can check that they render properly without having to publish every change to Netlify and view the live deployment.
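One thing worth checking, assuming this is the popular Live Server extension by Ritwick Dey: I believe it binds to 127.0.0.1 by default, which other devices can't reach. Its settings let you bind to your LAN address instead, something like:

```jsonc
// .vscode/settings.json (example values; see the extension's docs)
{
  "liveServer.settings.host": "0.0.0.0",   // listen on all interfaces, not just loopback
  "liveServer.settings.port": 5500,
  "liveServer.settings.useLocalIp": true   // advertise the LAN IP instead of 127.0.0.1
}
```

If that's already set, the Avira VPN is another suspect, since an active VPN tunnel can make the laptop unreachable on the LAN.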
Hi, I use WireGuard and Unbound. All my WireGuard clients use the Unbound DNS server that is running on my WireGuard central node.
Things were working perfectly, but recently I decided to fiddle with the config and remove some local-data and local-data-ptr values which I thought were unnecessary (after all, I don't have that many devices, and I know the IPs by heart). It took me a while to realise that without those lines, Google push notifications stopped working.
What could be the reason why these local name-resolution lines in unbound.conf would cause Google notifications to fail?
Thank you.
Hello Paperless-ngx community,
I would like to automate the extraction of ticket amounts when a document of type "Tickets" is detected. My goal is to extract the amount from the OCR content of the document and store it in a custom field called "Montant du ticket" (Ticket Amount).
I already have a post-process hook script to perform this extraction, but I’m still facing challenges getting the data properly added to the custom field.
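Roughly, the script looks like this (a simplified sketch rather than my exact code; the URL, token, regex, and custom-field id are placeholders you'd adapt):

```python
#!/usr/bin/env python3
# Sketch of a Paperless-ngx post-consume script: pull the document's OCR
# text over the REST API, extract an amount, and write it to a custom field.
import os
import re

import requests

PAPERLESS_URL = "http://localhost:8000"    # placeholder
TOKEN = os.environ["PAPERLESS_API_TOKEN"]  # placeholder secret
TICKET_FIELD_ID = 1  # id of the "Montant du ticket" custom field

doc_id = os.environ["DOCUMENT_ID"]  # set by Paperless for post-consume scripts
headers = {"Authorization": f"Token {TOKEN}"}

# Fetch the consumed document, including its OCR'd content.
doc = requests.get(f"{PAPERLESS_URL}/api/documents/{doc_id}/", headers=headers)
doc.raise_for_status()
content = doc.json().get("content", "")

# Naive amount extraction, e.g. "12,50 EUR" or "12.50 €"; adapt to your tickets.
match = re.search(r"(\d+[.,]\d{2})\s*(?:EUR|€)", content)
if match:
    amount = match.group(1).replace(",", ".")
    # On PATCH, custom_fields takes a list of {field id, value} pairs.
    resp = requests.patch(
        f"{PAPERLESS_URL}/api/documents/{doc_id}/",
        headers=headers,
        json={"custom_fields": [{"field": TICKET_FIELD_ID, "value": amount}]},
    )
    resp.raise_for_status()
```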
Has anyone implemented a similar workflow or could offer advice on how to achieve this? Thank you in advance for your help!
I recently got a notification that I was running out of email storage, with a prompt to upgrade. Since I'm trying to de-Google my life and have a home NAS, I started researching Synology MailPlus and came across the Synology Knowledge Center article "How do I back up emails from Gmail or Outlook.com to Synology MailPlus?".
This got me thinking about the whole “if it’s free, you’re the product” debate. Google/Outlook (and to a lesser extent, ProtonMail) don’t need 20 years of my emails. I don’t need them—except for the occasional nostalgia trip when I like to reflect and reminisce. Sure, I could delete them, but I prefer to keep them locally since I already have a 3-2-1 backup strategy at home.
Two questions for the community:
Has anyone transitioned to Synology MailPlus or a similar setup? How was your experience?
If you’ve archived emails locally, have you found it easy to search and revisit them when needed?
Over the holidays I'm looking to see if I can solve a workflow problem. Is there any kind of OSS software that'll let me turn a PC or RPi5 w/SSD into an automatic upload station? I'm prepared to buy/find hardware to solve this problem and write code, if needed.
When I take photos of our kids, I want to be able to make these available as quickly as possible, but I use Lightroom for editing. That means I have a serial workflow that goes: import, cull, edit (optional), export, and upload. The real job to be done is to import and upload the JPEGs so that my wife can get those quickly. I still want to go through my process on my own time in parallel for long-term curation of the photos.
I think my ideal solution would:
I don't think anything like this exists, but I wanted to ask. I program in Python and C# so building something isn't an impossibility, but I know I'd get distracted trying to build a touchscreen UI for an RPi. I could just script this locally from my PC, but the standalone station concept is very useful in case my wife uses the cameras.
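If I do script it, the import-and-upload piece might start as something like this (a rough sketch in Python with hypothetical paths, using the watchdog package):

```python
# Rough sketch: watch a card-import folder and copy any new JPEG to a
# share my wife can reach. Paths are placeholders; pip install watchdog.
import shutil
import time
from pathlib import Path

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

IMPORT_DIR = Path("/mnt/card-import")   # where photos land off the card
SHARE_DIR = Path("/mnt/family-share")   # destination visible to my wife

class JpegUploader(FileSystemEventHandler):
    def on_created(self, event):
        src = Path(event.src_path)
        if event.is_directory or src.suffix.lower() not in {".jpg", ".jpeg"}:
            return
        # Copy, keeping the filename; a real version would retry and dedupe.
        shutil.copy2(src, SHARE_DIR / src.name)

observer = Observer()
observer.schedule(JpegUploader(), str(IMPORT_DIR), recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    observer.stop()
observer.join()
```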
I'm using nginx as a reverse proxy for my applications, and when trying to route it-tools, the favicon returns fine but the page is totally blank.
nginx config (inside the server block):

    listen 80;
    listen [::]:80;

    location /ittools/ {
        proxy_pass http://ittools:80/;
        proxy_http_version 1.1;
        proxy_set_header Host $host; # Forwarded host
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_redirect off;
    }
docker network config: (not real ips)
name: proxy network {gateway: 172.35.0.1}
nginx: 172.35.0.3
ittools: 172.35.0.4
I tried curl from inside the nginx container to the ittools container: it can reach http://ittools:80 fine, but the body also comes back empty.
Any help please?
Hi Guys,
I'm trying to do something, I don't know if it makes sense, but I'm asking anyways.
I want all users on one segment of my network to be able to access my dashboard (still wrapping my head around how this works) using a one-word browser entry. So if I typed "Azkaban" into my browser, my wife and I would get all our shared services.
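From the reading I've done so far, a local DNS record might do it; a sketch assuming dnsmasq or Pi-hole serves DNS for that segment (the name and address are made up):

```
# dnsmasq.conf (Pi-hole exposes the same idea as "Local DNS Records")
# Resolve the bare name "azkaban" to the dashboard host:
address=/azkaban/192.168.1.50
```

One caveat I've read about: browsers often treat a single word in the address bar as a search, so typing azkaban/ (with the trailing slash) or http://azkaban forces navigation.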
Does this make sense? Or am I just thinking about something impossible?
Thanking you in advance, R.
Hi everyone, I am currently planning to set up a backup for my home server. About the server: it's an old tower PC running Proxmox, with Pi-hole, Nextcloud, Immich, and Home Assistant on it. I'd love to get some feedback on my thoughts.
For my data (Nextcloud and Immich) I am planning to upgrade to 2 TB of storage for both services. Since I don't have any drive slots left, I might need to connect an external drive via USB. Might that be a problem? If so, why?
My idea for a backup solution is FritzNAS. I'm thinking of buying an ICY BOX two-bay RAID enclosure with 2x 2 TB WD Red NAS HDDs from Amazon and hooking it up to my FRITZ!Box. This would be a share on my network to which I would upload my backups.
The configuration would be RAID 1 for the backup. The data (Nextcloud and Immich) would be backed up separately from their host machines.
I know that a good backup follows the 3-2-1 rule, but I am kind of cheap, so I don't want to pay for cloud storage, and I have no other off-site location to set something up. Furthermore, I don't want to spend tons of money on an expensive NAS when my FRITZ!Box already supports one.
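If the FRITZ!Box share works out, my understanding is any Linux host (Proxmox included) could mount it over SMB with an fstab entry along these lines (the share name and credentials file are placeholders):

```
# /etc/fstab -- mount the FRITZ!Box SMB share as a backup target
//fritz.box/FRITZ.NAS  /mnt/fritznas  cifs  credentials=/root/.smbcredentials,vers=3.0,iocharset=utf8  0  0
```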
My main concerns are: Will FritzNAS work for this setup?
Is the RAID actually needed? If so, is the ICY BOX a good product?
Is the external HDD a problem?
Thanks for your feedback!
I am looking for a DDNS server that I can host on my own Ubuntu server. Can you recommend a software solution?
So far, I have only found this Python-based solution: https://github.com/SFTtech/sftdyn
A long time ago, when I was still a newbie in the self-hosting world, I deployed a Docker LXC on my Proxmox host using the tteck scripts.
Now, a long time later, I need to move all the deployed containers (more than 40) to a VM.
I've dug around the web, but I can't find an easy way to do this without losing all the setup and data.
Can anyone help me figure out how to do that?
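So far the closest thing to a plan I've pieced together is something like this (a hypothetical, untested sketch; hostnames and paths are placeholders):

```bash
# On the old Docker LXC: stop everything so volume data is consistent.
docker compose down            # or: docker stop $(docker ps -q)

# Archive named volumes plus the compose files / bind-mount directories.
tar czf /tmp/docker-data.tgz /var/lib/docker/volumes /opt/stacks

# Copy the archive to the new VM.
scp /tmp/docker-data.tgz root@new-vm:/tmp/

# On the new VM (Docker already installed, daemon stopped while restoring):
ssh root@new-vm 'systemctl stop docker &&
                 tar xzf /tmp/docker-data.tgz -C / &&
                 systemctl start docker &&
                 cd /opt/stacks && docker compose up -d'
```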
Thanks in advance.
Any recommendations for simple / minimal photo gallery apps, with basic metadata editing functionality?
I am looking for something to organise and view my photos, and perform minor metadata edits like rotating or renaming files (or maybe even moving between folders). I generally organise my photos by folder/directories within other parent folders - I don't often use custom albums (it would be a nice to have feature, but it's not strictly required). I take pictures on a digital camera and film SLR so I am usually dealing with a few dozen pics at a time - organising, rotating and then just flicking through them occasionally. I am also just backing up photos received from family/friends and other images I've collected over the years.
I usually copy the photos to my PC, copy to my Ubuntu server which is attached as a SMB share, and then point the self-hosted app to that 'photos' parent folder, grouped by camera, date/event and so on.
PhotoPrism and Immich look to be overkill for my needs. PiGallery2 looks great, but I really would like to be able to rotate pictures and make minor edits to metadata. Something with a few more features than PiGallery2, but without the bloat and overkill of PhotoPrism, would be ideal. I've used PhotoPrism previously and it was too much for what I needed.
What's a nice middle ground between something like PiGallery2 and PhotoPrism?
Hey all!
A recent addition to my homelab is MeTube, a self-hosted YouTube downloader with a sleek and simple web interface.
I've been using it for a while now and decided to write a quick guide on how to set it up.
Blog: https://akashrajpurohit.com/blog/metube-selfhosted-youtube-downloader-with-a-sleek-web-interface/
While MeTube is primarily focused on YouTube, it uses yt-dlp under the hood, so it can download videos from 1000+ other platforms as well.
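If you just want to kick the tires first, a minimal compose file looks something like this (image and port as I recall them from the MeTube README; double-check there):

```yaml
services:
  metube:
    image: ghcr.io/alexta69/metube
    container_name: metube
    restart: unless-stopped
    ports:
      - "8081:8081"            # web UI
    volumes:
      - ./downloads:/downloads # where finished downloads land
```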
Give it a try yourself and see how it works for you!
I was setting up Frigate/Home Assistant on Ubuntu Server with Docker. The hardware was an i3 with a 1050 Ti and an HDD. It worked fine for a few days, until I couldn't SSH into it; checking the monitor output showed an Input/Output error (I've attached a screenshot).
I replaced the HDD with an SSD, thinking the HDD must be toast; however, after setting everything up again, the issue came back after a few days.
I'm currently stumped as to what the issue could be. I also noticed that the first time it happened I was able to reboot back into Ubuntu, but lately when it happens I cannot boot back into the OS at all (this is actually why I swapped the hard drive for an SSD).
This is the image of what I see when it crashes:
Any help would be appreciated.
I am using WireGuard to access my local resources when away from home, but I was curious about its viability for serving local resources to the wider web via a reverse proxy on a cloud instance. I'm curious how secure a setup like this is, what the main concerns are, and how to mitigate them.
For now I've only really used it to quickly demo a project I've been working on to a friend, which relied on some of my other resources on my LAN.
The setup was as follows:
/etc/wireguard/wg0.conf
[Interface]
PrivateKey = <private_key_value>
Address = <wg_adapter_ip>
DNS = <wg_server_ip>
[Peer]
PublicKey = <public_key_value>
AllowedIPs = <allowed_ip_cidr>
Endpoint = <home_external_ip>:51820
PersistentKeepalive = 25
<allowed_ip_cidr> typically points to the single IP address of my local server (e.g. 192.168.0.100/32) or to my main subnet (192.168.0.0/24).
sudo wg-quick up wg0
to start up the connection to my local network
Then I can access my webserver
/etc/nginx/sites-available
server {
    listen 80;
    server_name <your_instance_ip>;

    location / {
        proxy_pass http://<your_local_server>:<port>;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
<your_local_server> being the internal IP of my home server (e.g. 192.168.0.100), and the port being where my app is served from (e.g. 3000);
then simply set up a symbolic link in sites-enabled and restart nginx.
As far as I can tell the main concerns would be:
And the mitigations would be:
TL;DR an open-source app to create an internal system for ordering sandwiches from an external shop, with instant payment processing on the company account (independent of the shop).
Our company doesn't have an on-site restaurant, so many colleagues visit a nearby sandwich shop. However, going there, ordering, and payment can take time.
We can call in advance, but asking for individual sandwiches and manual collection of payments from colleagues can be time-consuming.
I want to deploy an in-house app that allows colleagues to:
I would then place the whole sandwich order, go pay for and collect it at noon, and bring it back.
I've explored TastyIgniter, but it doesn't support payments to a company account with instant validation. I'm in Belgium, Europe, by the way.
Thanks!
Hello,
I was wondering how folks around handle automatic backup for Vaultwarden.
Basically, in my deployment the data is stored in a PVC on an NFS share. I've done manual backups of the PVC through a job that also encrypts the backup file, which is later stored in a VeraCrypt container (I guess all the data there is encrypted anyway, but I'm not sure how easily it could be decrypted if the backup file were compromised).
What approaches are people following to preserve their data in case of disaster?
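For reference, my job is shaped roughly like this (a simplified sketch rather than my exact manifest; the claim names, schedule, and secret are placeholders):

```yaml
apiVersion: batch/v1
kind: CronJob
metadata:
  name: vaultwarden-backup
spec:
  schedule: "0 3 * * *"        # nightly
  jobTemplate:
    spec:
      template:
        spec:
          restartPolicy: OnFailure
          containers:
            - name: backup
              image: alpine:3.20
              command: ["/bin/sh", "-c"]
              # tar the data dir and symmetrically encrypt it with gpg
              args:
                - apk add --no-cache gnupg &&
                  tar czf - /data | gpg --batch --symmetric
                  --passphrase "$BACKUP_PASSPHRASE"
                  -o "/backup/vaultwarden-$(date +%F).tgz.gpg"
              env:
                - name: BACKUP_PASSPHRASE
                  valueFrom:
                    secretKeyRef: {name: backup-secret, key: passphrase}
              volumeMounts:
                - {name: data, mountPath: /data, readOnly: true}
                - {name: backup, mountPath: /backup}
          volumes:
            - name: data
              persistentVolumeClaim: {claimName: vaultwarden-data}
            - name: backup
              persistentVolumeClaim: {claimName: backup-target}
```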
Two Dell Latitude 5400 laptops, both acquired cheap from eBay due to broken screens and other damage, with the batteries removed too. Both 8th-gen i5, Debian 12, 12 GB RAM. They're underneath the worktop in my office, right in the corner.
The top one is running our family Better Minecraft server (MC Java, but with around 200 mods, including furniture!), my DynDNS pings, and a custom backend for a magic-mirror-type thing I run on an old Kindle in the kitchen. Future plans involve a new SSD to replace the 128 GB one, and then I can put Immich on it (and every photo I've taken since 2004) to get me off Google Photos.
The far one is running Portainer + qBittorrent + Jellyfin + Navidrome (still about 50+ albums I need to run through Picard to tag properly). It already has a 2 TB SSD in it; the future plan is to put Audiobookshelf on it for podcasts/audiobooks, and I plan to try to hack it so I can put archived radio shows and live concert bootlegs on there too, basically any long-form audio that's not a traditional album/EP etc.
Originally I had an old full-sized Dell OptiPlex running most of the above in the spare room (music/videos/etc. were just SMB shares), with two 3 TB HDDs in a RAID 1 config. Whirring fans going all the time, 200 W PSU. These two don't run their fans when idle, and there's no spinning rust either.
Future potential plans are a note-taking app (to replace Google Keep), and possibly Calendar too.
Hi,
it's my first time self-hosting applications that are exposed to the internet. Before, I used a VPN to connect to my home network and use the services hosted on a Raspberry Pi; that worked great.
I rented a VPS and want to host an application there. My plan was to set up my infrastructure with docker-compose, because I was impressed by how fast I was able to get my home lab back up and running after the Pi's microSD card died:
nginx, an ASP.NET webpage, PostgreSQL, service monitoring (and also external monitoring for host availability).
In my home network I used Portainer to manage the containers, but after a quick check I found out that it is not recommended to expose Portainer to the internet, which makes perfect sense.
Is there another way to easily manage the containers without always connecting via SSH?
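One option that might fit, assuming key-based SSH is already set up: a docker context lets you drive the remote engine from your local machine without exposing any management UI (the hostname and user are placeholders):

```bash
# Create a context pointing at the VPS engine over SSH, then use it.
docker context create vps --docker "host=ssh://deploy@my-vps.example.com"
docker context use vps
docker compose ps     # docker/compose commands now run against the VPS
```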
Not sure if this is possible, or if this is even the best place to post.
I am out of the area hiking/camping from time to time on a weekend, which usually means I miss my local team's radio commentary. What I would like to do is pass the radio commentary through my server to my phone, or something similar. I don't think I can use an online radio stream, as the commentary is only available over FM.
I've seen this device on Amazon but I'm not sure if it's something that would help the cause: https://www.amazon.co.uk/RTL2832U-Digital-Receiver-Recording-Playback/dp/B0CTHRBF1C
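If that listing is an RTL2832U SDR dongle, then from what I understand a pipeline like the following could work (a hypothetical sketch; the frequency, credentials, and mount point are placeholders):

```bash
# Demodulate wideband FM with rtl_fm and push it to a self-hosted
# Icecast server via ffmpeg; the phone then plays the Icecast URL.
rtl_fm -f 95.8M -M wbfm -s 200000 -r 48000 - | \
  ffmpeg -f s16le -ar 48000 -ac 1 -i - \
         -codec:a libmp3lame -b:a 128k -content_type audio/mpeg \
         -f mp3 icecast://source:hackme@my-server:8000/radio.mp3
```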
Any help would be greatly appreciated
Hey folks,
Recently, I started getting serious about automation for my homelab. I’d played around with Ansible before, but this time I wanted to go further and try out Packer and Terraform. After a few days of messing around, I finally got a basic setup working and decided to document it:
Blog:
https://merox.dev/blog/homelab-as-code/
Github:
https://github.com/mer0x/homelab-as-code
Here’s what I did:
Starting next year, I plan to add services like Grafana, Prometheus, and other tools commonly used in homelabs to this project.
I admit I probably didn’t use the best practices, especially for Terraform, but I’m curious about how I can improve this project. Thank you all for your input!
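For anyone who hasn't seen the Terraform side of this, the kind of resource involved looks roughly like the sketch below (hypothetical values, using the community telmate/proxmox provider; this is not code from the repo):

```hcl
# Clone a VM from a Packer-built template on a Proxmox node.
resource "proxmox_vm_qemu" "svc" {
  name        = "svc-01"
  target_node = "pve"                 # Proxmox node name
  clone       = "debian12-template"   # template produced by Packer
  cores       = 2
  memory      = 2048

  disk {
    storage = "local-lvm"
    type    = "scsi"
    size    = "20G"
  }
}
```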