Selfhosted

46677 readers
882 users here now

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

Rules:

  1. Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.

  2. No spam posting.

  3. Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.

  4. Don't duplicate the full text of your blog or github here. Just post the link for folks to click.

  5. Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).

  6. No trolling.

Resources:

Any issues in the community? Report them using the report flag.

Questions? DM the mods!

founded 2 years ago
MODERATORS
1
 
 

Hello everyone! Mods here 😊

Tell us, what services do you selfhost? Extra points for selfhosted hardware infrastructure.

Feel free to take it as a chance to present yourself to the community!

🦎

2
 
 

I'm still dipping my toes into self-hosting and trying to figure out which services I'd want always accessible from my devices vs. those that could be woken on demand (e.g. via Wake-on-LAN), and which services should be installed at the OS level vs. as containers.

As of now, I just have an OrangePi 5 Plus running Home Assistant Supervised under Debian and nothing else. I'm hoping to expand the OPi a bit and also build out another PC (hardware unknown) as a NAS media server and NextCloud machine.

Before I start doing anything I can't undo, I'm wondering if I'm on the right track with my proposed setup in the image, or if there's anything else I should consider?

3
 
 

I have a Raspberry Pi 4 connected to the TV that I want to use for a couple of different apps (Jellyfin client, YouTube, displaying Foundry battle maps), and I'm looking for a launcher application that could switch between these. Ideally it could be controlled by HDMI-CEC (or whatever it's called). Does anyone have suggestions for this?

4
5
 
 

A list of new features from their release page:

  • 🤖 Automatic "Follow Up" Suggestions: Open WebUI now intelligently generates actionable "Follow Up" suggestions automatically with each message you send, helping you stay productive and inspired without interrupting your flow; you can always disable this in Settings if you prefer a distraction-free experience.
  • 🧩 OpenAI-Compatible Embeddings Endpoint: Introducing a fully OpenAI-style '/api/embeddings' endpoint—now you can plug in OpenAI-style embeddings workflows with zero hassle, making integrations with external tools and platforms seamless and familiar.
  • ↗️ Model Pinning for Quick Access: Pin your favorite or most-used models to the sidebar for instant selection—no more scrolling through long model lists; your go-to models are always visible and ready for fast access.
  • 📌 Selector Model Item Menu: Each model in the selector now features a menu where you can easily pin/unpin to the sidebar and copy a direct link—simplifying collaboration and staying organized in even the busiest environments.
  • 🛑 Reliable Stop for Ongoing Chats in Multi-Replica Setups: Stopping or cancelling an in-progress chat now works reliably even in clustered deployments—ensuring every user can interrupt AI output at any time, no matter your scale.
  • 🧠 'Think' Parameter for Ollama Models: Leverage new 'think' parameter support for Ollama—giving you advanced control over the AI reasoning process and further tuning of model behavior for your unique use cases.
  • 💬 Picture Description Modes for Docling: Customize how images are described/extracted by Docling Loader for smarter, more detailed, and workflow-tailored image understanding in your document pipelines.
  • 🛠 Settings Modal Deep Linking: Every tab in Settings now has its own route—making direct navigation and sharing of precise settings faster and more intuitive.
  • 🎤 Audio HTML Component Token: Easily embed and play audio directly in your chats, improving voice-based workflows and making audio content instantly accessible and manageable from any conversation.
  • 🔑 Support for Secret Key File: Now you can specify 'WEBUI_SECRET_KEY_FILE' for more secure and flexible key management—ideal for advanced deployments and tighter security standards (see the compose sketch after this list).
  • 💡 Clarity When Cloning Prompts: Cloned workspace prompts are clearly labelled with "(Clone)" and IDs have "-clone", keeping your prompt library organized and preventing accidental overwrites.
  • 📝 Dedicated User Role Edit Modal: Updating user roles now reliably opens a dedicated edit user modal instead of cycling through roles—making it safer and more clear to manage team permissions.
  • 🏞️ Better Handling & Storage of Interpreter-Generated Images: Code interpreter-generated images are now centrally stored and reliably loaded from the database or cloud storage, ensuring your artifacts are always available.
  • 🚀 Pinecone & Vector Search Optimizations: Applied latest best practices from Pinecone for smarter timeouts, intelligent retry control, improved connection pooling, faster DNS, and concurrent batch handling—giving you more reliable, faster document search and RAG performance without manual tweaks.
  • ⚙️ Ollama Advanced Parameters Unified: 'keep_alive' and 'format' options are now integrated into the advanced params section—edit everything from the model editor for flexible model control.
  • 🛠️ CUDA 12.6 Docker Image Support: Deploy to NVIDIA GPUs with capability 7.0 and below (e.g., V100, GTX1080) via new cuda126 image—broadening your hardware options for scalable AI workloads.
  • 🔒 Experimental Table-Level PGVector Data Encryption: Activate pgcrypto encryption support for pgvector to secure your vector search table contents, giving organizations enhanced compliance and data protection—perfect for enterprise or regulated environments.
  • 👁 Accessibility Upgrades Across Interface: Chat buttons and close controls are now labelled and structured for optimal accessibility support, ensuring smoother operation with assistive technologies.
  • 🎨 High-Contrast Mode Expansions: High-contrast accessibility mode now also applies to menu items, tabs, and search input fields, offering a more readable experience for all users.
  • 🛠️ Tooltip & Translation Clarity: Improved translation and tooltip clarity, especially over radio buttons, making the UI more understandable for all users.
  • 🔠 Global Localization & Translation Improvements: Hefty upgrades to Traditional Chinese, Simplified Chinese, Hebrew, Russian, Irish, German, and Danish translation packs—making the platform feel native and intuitive for even more users worldwide.
  • General Backend Stability & Security Enhancements: Refined numerous backend routines to minimize memory use, improve performance, and streamline integration with external APIs—making the entire platform more robust and secure for daily work.
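
One of the items above, the 'WEBUI_SECRET_KEY_FILE' option, is easy to miss but handy for compose users. As a rough, hedged sketch (the image name, port mapping, and data path follow the commonly published Open WebUI compose examples; double-check them against the docs for your version), the key can be fed from a Docker secret along these lines:

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      # point the new option at the file Docker mounts for the secret
      - WEBUI_SECRET_KEY_FILE=/run/secrets/webui_secret_key
    volumes:
      - open-webui:/app/backend/data
    secrets:
      - webui_secret_key

secrets:
  webui_secret_key:
    file: ./webui_secret_key.txt   # keep this file out of any repo or backup you share

volumes:
  open-webui:
```
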
6
18
submitted 19 hours ago* (last edited 15 hours ago) by gedaliyah@lemmy.world to c/selfhosted@lemmy.world
 
 

So I currently have a 1 TB drive running all of my containers for my home server. I'm trying to finalize the process of transitioning from corporate cloud storage to my own personal cloud, and I have a new 12 TB drive to hold all my files.

I am running Xubuntu, with ownCloud running via Docker Compose, similar to this setup.

By default, it saves all data in a Docker-managed volume under /var, but I am stuck on how to change it. I have my data backed up separately, so the plan is to set it up from scratch.

There are complicated instructions here, but I don't have the skill to translate that process to a Docker Compose installation. I have never used SQL commands, and I really have no idea how to do anything with them in Docker.

This is a critical step in my home server setup, without which I pretty much can't move forward. Can anyone help?

Edit: Thanks from a real beginner. I was making it harder than I needed to. The default volumes in the YML file were in a format unfamiliar to me. I rewrote the relevant lines, re-initialized, and now it is working as intended.

I guess I am still learning about the different ways that volumes can be declared in YAML.
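
For anyone finding this later: the change amounted to swapping the named volume for a bind mount, roughly like the sketch below. The host path is just an example, and the container-side path has to match whatever your original compose file used (/mnt/data is what the ownCloud compose examples I followed use), so treat this as a hedged illustration rather than a drop-in config.

```yaml
services:
  owncloud:
    image: owncloud/server            # whichever tag your original compose file pins
    volumes:
      # before: a named volume, which Docker keeps under /var/lib/docker/volumes
      # - files:/mnt/data
      # after: a bind mount onto the new 12 TB drive (example host path)
      - /mnt/bigdrive/owncloud:/mnt/data
```

That named-volume default is also why the data looked like it was living "in the var directory": Docker stores named volumes under /var/lib/docker/volumes unless you point the service at a path of your own.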

7
23
submitted 22 hours ago* (last edited 21 hours ago) by d00phy@lemmy.world to c/selfhosted@lemmy.world
 
 

Current setup is a PMS (Plex Media Server) running on a Synology 5-bay, and another PMS running on a Shield Pro. The NAS is primarily used for remote streaming, while the Shield serves my home LAN (mainly Apple TVs).

I've been seeing stuttering on larger files, whether using the Plex app or Infuse, and I'm fairly certain the Synology is the weak link. Network performance in the house has been pretty solid, though admittedly I could stand to test it more thoroughly. I've been looking at moving my library to a standalone system, specifically the Beelink ME Mini (which happens to be on sale!). What I don't know is the best way to build this out.

I don't want to have to buy all 6 SSDs (or at least 6x4TB ones!) at once, so I'd be looking at either a stock Linux (Ubuntu or Rocky) install w/ a Btrfs pool for the SSDs (I'm guessing I can use the eMMC for the OS depending on how big the install is - that, or use the SSD in slot 4), or possibly setting up TrueNAS w/ the Plex app to manage the storage.

As for populating the media, I plan to keep the Synology as the central repo of my data. I have it replicating to another NAS at my dad's house, with movies/music/tv replicating using Syncthing. I plan to also use Syncthing to populate the Beelink.

Anyway, please poke holes in this plan and/or suggest a better one. My main goals are to get the media I'm streaming off spinning disk w/ minimal power draw (didn't mention that above) in a way that I can expand storage as necessary to accommodate the media library. Nothing's purchased yet, so I'm not married to the hardware. I would ideally like to convert the library to h.265 or even AV1 if I can make it work.

ETA: For clarity: I'm not transcoding AFAIK. My Shield mounts the Synology over SMB and mostly works fine, until I try to play anything 4k - then I get stuttering. On the surface, this sounded like a network issue, but I can't find a problem w/ the LAN. My thought was to move the PMS to a single location w/ local storage, and use the Synology just as an archive.

ETA2: FWIW, I have not expanded the memory on the Synology or installed any cache drives.

8
29
submitted 1 day ago* (last edited 19 hours ago) by dr_robotBones@reddthat.com to c/selfhosted@lemmy.world
 
 

I'm trying to self host my portfolio on an old laptop running Ubuntu server. I've successfully set up docker and nginx. I got a DNS subdomain from freedns.afraid.org.

The IP connected to the DNS matches my server's public IP address.

I can connect with https://mypublicip/ from outside the network, but it shows as an insecure connection and the "https" in the address bar is struck through in the browser.

Any attempts to connect to the website via DNS have failed, and trying to connect via IP on port 80 fails as well. I really have no clue what is going on, let me know if you need more information, or if this is the wrong place to ask for help with this sort of thing.

Edit: Whatever problem I had before, it seems it's been fixed. However, my subdomain is being blocked by ISPs. Thank you for the help everyone; I'll probably have to do Cloudflare tunneling instead of fully self-hosting it.

9
43
submitted 1 day ago* (last edited 1 day ago) by Lv_InSaNe_vL@lemmy.world to c/selfhosted@lemmy.world
 
 

Hey guys, I have been using Navidrome to stream my music from my server and it's been amazing. I primarily use YT Music because of discoverability, so I have all of my "primary" playlists in YouTube (about 8 of them really, but support for a somewhat arbitrary number would be nice).

I'm looking for an automated way to download the music and keep my Navidrome instance updated with a couple of playlists. I started working on a Python script to handle it, but it's just not working super well, so I would prefer to use someone else's solution haha.

Anyone have any good recommendations? I tried this one, but I couldn't actually find the music, and it seems to only support one playlist at a time. It would also be nice to download the album art and set some ID3 tags too.
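
For context, the kind of setup I'm after would look roughly like this as a one-shot compose job built around yt-dlp (the image, paths, playlist file, and output template are placeholders I'd still need to adapt, and the flags should be checked against the yt-dlp docs):

```yaml
services:
  playlist-sync:
    image: python:3.12-alpine          # assumption: any container/host with yt-dlp + ffmpeg works
    volumes:
      - /srv/music:/music              # assumption: a folder Navidrome already scans
      - ./playlists.txt:/playlists.txt:ro   # one YT Music playlist URL per line
    command:
      - sh
      - -c
      - |
        apk add --no-cache ffmpeg
        pip install --quiet yt-dlp
        yt-dlp --batch-file /playlists.txt \
               --extract-audio --audio-format mp3 --audio-quality 0 \
               --embed-metadata --embed-thumbnail \
               --download-archive /music/yt-archive.txt \
               -o "/music/%(playlist_title)s/%(title)s.%(ext)s"
    restart: "no"                      # run it from cron/systemd rather than as a daemon
```

The --download-archive file is what keeps repeat runs incremental, and --embed-metadata / --embed-thumbnail cover the tags and album art; private playlists may additionally need --cookies.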

10
11
12
 
 

Anyone have any recommendations for Blog software?

I was considering for a while just using a Mastodon instance as my blog, because I just kinda wanna sign in and upload the papers I've written. I was pretty close with Hugo, but I'd rather not have to rebuild the site every time I upload, and I want to self-host and not use GitHub Actions. I think I still could do it, since I like using Cloudflare tunnels.

What is all out there?

13
 
 

Corporate VPN startup Tailscale secures $230 million CAD Series C on back of “surprising” growth

Pennarun confirmed the company had been approached by potential acquirers, but told BetaKit that the company intends to grow as a private company and work towards an initial public offering (IPO).

“Tailscale intends to remain independent and we are on a likely IPO track, although any IPO is several years out,” Pennarun said. “Meanwhile, we have an extremely efficient business model, rapid revenue acceleration, and a long runway that allows us to become profitable when needed, which means we can weather all kinds of economic storms.”

Keep that in mind as you ponder whether and when to switch to self-hosting Headscale.

14
15
 
 

Setting up a personal site on local hardware has been on my bucket list for a long time. I finally bit the bullet and got a basic website running with Apache on an Ubuntu-based Linux distro. I bought a domain name, linked it up to my IP, got SSL via Let's Encrypt for HTTPS, and added header rules until Security Headers and Mozilla Observatory gave it a perfect score.

Am I basically in the clear? What more do I need to do to protect my site and local network? I'm so scared of hackers and shit I do not want to be an easy target.

I would like to make a page about the hardware it's running on, since I intend to have it run entirely off solar power like solar.lowtechmagazine, and wanted to share technical specifics. But I heard somewhere that revealing the internal state of your server is a bad idea, since it can make exploits easier to find. Am I being stupid for wanting to share details like the computer model and the software running it?

16
 
 

cross-posted from: https://lemmy.zip/post/40833329

We are pleased to announce the first preview release candidate of Jellyfin 10.11.0!

This is a preview release, intended for those interested in testing 10.11.0 before its final public release. We welcome testers to help find as many bugs as we can before the final release.

As always, please ensure you stop your Jellyfin server and take a full backup before upgrading!

WIP release notes: https://notes.jellyfin.org/v10.11.0_features

This is the first release that uses the new EF Core database mapper. If you'd like to help test this release, please remember to remove all plugins to make debugging logs as easy as possible.

17
 
 

So, I'm trying to get pangolin up and running.

What I have: an Ubuntu server running in Proxmox, Docker running on that Ubuntu, a dynamic IP, DuckDNS in Docker to counter that, and a domain name

What I did: installed Pangolin with the installation script, said yes to CrowdSec because it looked like the safest option (over time) even if I don't know what it is/does, set a CNAME from pangolin.mydomain.com to my.duckdns.org, and set port forwards for ports 80 and 443 on TCP and for port 51520 on UDP

What is happening: well, honestly, not much. If I test it from outside the network, I get a connection refused. If I test it locally (clicking the 443 or 80 port in Portainer), I get a page not found

What I want: I want it to just work without a hassle, and I hope one of you can help me out here, because I'm starting to lose my mind

18
 
 

So I am for the most part a lurker and a hobbyist. I've always been a bit of a techie, but over time decided I wanted to be more anti-consumption and such.

I started out by doing my own calendar. I have a desktop that hosts my Nextcloud and use it to sync my GNOME calendar with Fossify (via DAVx5). This was rather straightforward and gave me a nice confidence boost. It's mostly done on my local network, though I am thinking of reading more into Tailscale and getting a domain.

The next move was to bring my todo list over. This was a bit tricky, as many apps don't support repeating todos, and crossing one off might just remove the item entirely and kill the resets that another app set up. At one point I found the app Super Productivity. This app is basically perfect; the only downside is that it is a bit more strict (particularly in the mobile app) about an SSL cert. There is an option to have the app sync with a local file, so I thought I could be clever and let Nextcloud do the syncing while the apps think they are only working off the local file on their respective device. Alas, there was a snag: for some reason Nextcloud writes the file with read-only permissions on the laptop, so I cannot add or cross off items.

Then I remembered using some apps around a decade ago that worked off a todo.txt file. I figured maybe I could find some mobile and desktop apps and recycle the idea of letting Nextcloud manage two-way sync of a file while the apps interact with it as if it were local. It seems like I have some winners here, with sleek on desktop and ntodo.txt on mobile.

Just my humble story of selfhosting so I don't feel like a poser when listening to podcasts or lurking.

19
 
 

MAZANOKE is a simple image optimizer that runs in your browser, works offline, and keeps your images private without ever leaving your device.

Created for everyday people and designed to be easily shared with family and friends, it serves as an alternative to questionable "free" online tools.

See how you can easily self-host it here:
https://github.com/civilblur/mazanoke

---

Highlights from v1.1.5 (view full release note)

The focus of this release has been to improve the core foundation and file format support, but I'm planning to expand with more features further down the road in order to improve the usefulness of MAZANOKE (while still keeping the UX simple).

  • Support basic authentication for Docker setups.
  • TIFF file format support.
    • Convert from TIFF to JPG, PNG, WebP, or ICO
  • ICO file format support.
    • Convert from and to an ICO image.

---

I also feel incredibly honored that MAZANOKE was recently featured on several of my favorite communities.

It's been incredible to see the growth of the user base, with over 54,000 Docker pulls for the previous release alone, and now over 1,400 stars! I never anticipated this at all and I'm truly grateful for the support!

I'd like to thank everyone who helped spread the word, whether through starring, word of mouth, community engagement, blog posts, or by packaging it for things like Unraid and NixOS, and everything in between!

20
 
 

I ditched most streaming services well over a year ago now, but Spotify has clung on because I have a playlist of around 2000 songs. I've set up Navidrome but now need to transfer all my music in the highest quality possible as efficiently as possible.

I tried lidarr some time ago, but it seemed to be based more around artists than individual songs and my indexer failed to find most of my library.

I've seen a couple of apps that will look at a playlist and then try to yt-dlp the song from YouTube, but I'm worried about ending up with a lower-quality or different version. I've wondered whether automating an "analog hole" type approach, where I just pipe the audio of each song to a file and leave it playing overnight for a couple of weeks, might actually be the best option, but that does seem a bit insane at this scale.

21
 
 

Hey gang, I'm considering using DNS4EU in Canada. My ping to their servers is ~130 ms, which is way longer than anything local (on the order of 1-5 ms). Apart from uncached lookups taking longer to resolve, is there any contraindication to using a DNS server with high latency?

22
 
 

Hi, community :)

Thank you for your help on each post, it really makes me want to create more and more stuff ❤️

A few new updates for Postiz, but just a small recap:

Postiz is a social media scheduling tool supporting 19 social media channels:

Instagram, Facebook, TikTok, Reddit, LinkedIn, X, Threads, BlueSky, Mastodon, YouTube, Pinterest, Dribbble, Slack, Discord, Warpcast, Lemmy, Telegram, VK, Nostr.

https://github.com/gitroomhq/postiz-app/

(20k+ stars, thank you for all the love 🚀)

What's new:

  • Create a PDF carousel on LinkedIn. Upload pictures as normal, and then check "Post as images carousel." It will convert the pictures to a PDF in the background and schedule them as a carousel.
  • Multi-language support - We added tons of languages and support for RTL. I used Lingo.dev for that, which was super helpful!
  • Post finisher - added a post finisher for BlueSky, X, and Threads; it will add a post at the end quoting the first post and telling people to follow you :)
  • Mastodon custom URL (self-hosted only)
  • Dub shortlinking custom URL (self-hosted only)
  • Disable image compression in the client (self-hosted only)
  • Created a Chrome extension that overrides your LinkedIn / X post modal with Postiz, to be more productive.

Our amazing mod egelhaus added tons of YouTube videos to the docs website about setting up different providers / installing Postiz.

What else would you like to see in Postiz?

23
 
 

I've been using Tube Archivist to archive my YouTube playlists, but I've hit a portability snag. It stores all metadata in its internal database and saves video files with non-human-readable filenames. This makes the archive unusable without the software and its database, which defeats the point of long-term archival storage.

Are there any tools that:

  • Archive playlists with human-readable filenames (or let you control the naming scheme)
  • Have an API for queuing archival jobs
  • Store metadata in portable formats (e.g., sidecar JSON or YAML)
  • Don’t require additional software to interpret the archive (one possible approach is sketched below)
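
For illustration, plain yt-dlp on a schedule already covers the naming and sidecar-metadata points (though not the job-queue API). A hedged sketch, with the playlist URL, paths, image, and template as placeholders to adapt:

```yaml
services:
  playlist-archive:
    image: python:3.12-alpine        # assumption: any environment with yt-dlp + ffmpeg will do
    volumes:
      - /srv/yt-archive:/archive     # assumption: where the long-term copies should live
    command:
      - sh
      - -c
      - |
        apk add --no-cache ffmpeg
        pip install --quiet yt-dlp
        yt-dlp "https://www.youtube.com/playlist?list=PLACEHOLDER" \
               --write-info-json --write-thumbnail \
               --download-archive /archive/already-downloaded.txt \
               -o "/archive/%(playlist_title)s/%(upload_date)s - %(title)s [%(id)s].%(ext)s"
```

--write-info-json drops a plain .info.json next to each video, so the archive stays readable without any database or extra software.
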
24
 
 

Hi, I live in Germany and only have public IPv6. My address changes only very, very rarely and has never changed in the time I've been self-hosting.

I also have a very small, pretty cheap VPS with static IPv4/IPv6, which seems like a great fit for some sort of tunneling/proxy setup. Now comes the question: what should I use, and how? I would like to avoid the additional latency for IPv6-enabled hosts; can I just set up a reverse proxy for IPv4? Would Tailscale work for my use case, and what are some resources you found useful when using it?

Currently, I'm just hosting everything IPv6-only and hoping my address never changes, but that does not work for everyone, since (strangely) many new buildings with fiber-optic connections still only have IPv4.

25
 
 