After spending the last few months diving deeper into IT and cybersecurity, I've been planning a home lab project that goes beyond just learning: I want to build something practical that I'll actually use. The goal is to create my own private cloud infrastructure, accessible from anywhere, that stores my media collection and all my notes and documentation while maintaining security.
This is beyond the scope of a weekend project. There are serious considerations around hardware choices, security implications of remote access, and the complexity of doing this right. Here's my thinking process and all the rabbit holes I've gone down planning this thing.
What I'm aiming for is pretty straightforward in concept, complex in execution:

- Stream my media collection to any of my devices
- Keep my notes and documentation synced and accessible
- Reach all of it securely from outside my home network
I want to access this from coffee shops, airports, and client sites without compromising security. That means solving the eternal IT dilemma: making something both convenient and secure.
This is where I've spent the most time going back and forth. The Raspberry Pi 5 with 8GB RAM looks appealing - low power consumption, small footprint, and honestly, there's something satisfying about running a full server stack on a $100 computer. Pair it with a 2TB NVMe SSD via USB-C, and you've got a capable little machine.
For what I'm planning, a Pi could handle it. Jellyfin runs fine on Pi hardware for direct play media (no transcoding), and file serving doesn't require much horsepower. The low power draw means I could run this 24/7 without worrying about electricity costs. Plus, if something goes wrong, I'm only out $100 in hardware.
But here's where I keep hitting walls in my research - transcoding. If I want to stream media to different devices with varying capabilities, hardware transcoding becomes crucial. The Pi 5 has some GPU acceleration, but it's limited. And what happens when I want to expand this setup later?
The alternative I'm considering is a refurbished mini PC - something like a Dell OptiPlex Micro with an i5, 16GB RAM, and room for a proper NVMe drive. More expensive (around $300-400 used), but significantly more capable. Intel Quick Sync would handle transcoding without breaking a sweat.
The power consumption difference isn't as dramatic as I initially thought. A modern mini PC might pull 15-30W under load vs 8-12W for the Pi. Over a year, that's maybe $50 in electricity difference.
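The back-of-envelope math, using assumed average draws and an assumed $0.15/kWh residential rate:

```shell
# Rough yearly electricity cost; the wattages and the $0.15/kWh rate are assumptions.
hours=8760      # 24 * 365
rate=0.15       # $/kWh

yearly_cost() {
  awk -v w="$1" -v h="$hours" -v r="$rate" 'BEGIN { printf "%.2f\n", w/1000*h*r }'
}

yearly_cost 25   # mini PC averaging 25W: ~$32.85/yr
yearly_cost 10   # Pi 5 averaging 10W:    ~$13.14/yr
```

At the worst-case spread (30W vs 8W) and a pricier $0.25/kWh, the gap grows to roughly $48 a year, which is where the "maybe $50" figure lands.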
Either way, I'm planning on external storage. A single 4TB NVMe drive in a USB-C enclosure gives me plenty of space for media and documents, with room to grow. For a home lab, redundancy through backups makes more sense than trying to build enterprise-level fault tolerance.
This choice feels more philosophical than technical. Plex has the polish and ecosystem - great apps, easy setup, robust transcoding. But it's also becoming increasingly commercial, with ads in the interface and features locked behind subscriptions. Plus, there's the whole "phoning home" aspect that makes me uncomfortable for a security-focused setup.
Jellyfin appeals to my open-source sensibilities. No telemetry, no ads, no subscription fees. The interface isn't as polished as Plex, but it's gotten significantly better in recent releases. For my use case, primarily streaming my own content to my own devices, Jellyfin seems like the right choice.
The transcoding situation is where this gets interesting. Jellyfin supports hardware acceleration on Intel systems, which pushes me toward the mini PC option. Being able to transcode multiple streams simultaneously ensures I have a system that actually works when I need it.
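Assuming a Docker deployment (paths and ports here are placeholders), passing the Intel iGPU's /dev/dri device into the container is what makes Quick Sync available to Jellyfin:

```shell
# Sketch: Jellyfin container with the Intel iGPU exposed for Quick Sync.
# Paths are examples; hardware acceleration still has to be enabled in
# Jellyfin's playback settings afterwards.
docker run -d --name jellyfin \
  --device /dev/dri:/dev/dri \
  -v /srv/jellyfin/config:/config \
  -v /srv/media:/media:ro \
  -p 8096:8096 \
  jellyfin/jellyfin
```

On the Pi, there's no equivalent lever to pull for most codecs, which is the asymmetry driving the hardware decision.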
Right now all my notes live in Apple's Notes app synced through iCloud, which works fine until you start thinking about data ownership and vendor lock-in. The plan is to move everything to a self-hosted solution where I actually control my data and can access it from any device with proper sync capabilities.
Obsidian's sync service works great, but I'm trying to reduce dependence on external services. The challenge is that Obsidian doesn't have an official self-hosted sync solution. There are workarounds using Git or Syncthing, but they're not seamless.
Joplin has better self-hosting support with Joplin Server, but the mobile experience isn't as smooth as Obsidian. I'm leaning toward running both - Joplin Server for structured notes and documents, with a separate Syncthing setup for my Obsidian vault.
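A rough shape of that dual setup under Docker (the URL, ports, and paths are placeholders, and persistence volumes are omitted from this sketch):

```shell
# Joplin Server as the sync target for the Joplin apps
docker run -d --name joplin-server \
  -e APP_BASE_URL="https://notes.example.com" \
  -p 22300:22300 \
  joplin/server

# Syncthing carrying the raw Obsidian vault between devices
docker run -d --name syncthing \
  -v /srv/obsidian:/var/syncthing \
  -p 8384:8384 -p 22000:22000 \
  syncthing/syncthing
```

The two services never touch each other, which is the point: the Obsidian vault stays plain Markdown files that Syncthing replicates as-is.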
Adding Nextcloud to the mix would give me a full file sharing solution with document editing capabilities. It's another service to maintain, but the functionality might be worth it.
This is where the project gets complicated. Opening up your home network to the internet is essentially painting a target on your back. Every security researcher I've read emphasizes the same thing: assume you will be found and attacked.
I'm planning to register a dedicated domain for this project, something generic that doesn't tie back to my personal information. Using Cloudflare for DNS gives me flexibility with their proxy services and security features.
The question is whether to use Cloudflare's proxy features or go direct. The proxy adds a layer of protection and hides my real IP, but it also means trusting Cloudflare with my traffic. For a personal server, that might be the right tradeoff.
This is non-negotiable: everything goes behind proper authentication. I'm planning to implement Authelia as a single sign-on solution with two-factor authentication. Every service - Jellyfin, file access, server management - requires authentication through this central system.
The mobile access challenge is real though. Typing in 2FA codes every time I want to access my media from my phone is not ideal. I'm researching approaches using longer-lived tokens or certificate-based authentication for trusted devices.
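In Authelia terms, the "everything behind authentication" rule might look like this sketch (the hostnames are placeholders, and session and 2FA settings live elsewhere in the config):

```yaml
# Sketch of Authelia's access control section; domains are examples.
access_control:
  default_policy: deny          # anything unmatched is blocked outright
  rules:
    - domain: media.example.com       # Jellyfin
      policy: two_factor
    - domain: files.example.com       # file access
      policy: two_factor
    - domain: admin.example.com       # server management
      policy: two_factor
```

The deny-by-default policy means a forgotten or newly added subdomain fails closed instead of open.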
The server absolutely cannot be on my main home network. I'm planning a separate VLAN with strict firewall rules; the server can initiate connections outbound for updates and external services, but all inbound access goes through the reverse proxy with authentication.
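As a sketch (the interface names, VLAN number, and subnets are assumptions), that firewall intent translates into something like this in nftables:

```
# nftables sketch: server VLAN isolated, outbound allowed, inbound only to the proxy
table inet homelab {
  chain forward {
    type filter hook forward priority 0; policy drop;

    # replies to established connections
    ct state established,related accept

    # server VLAN may initiate outbound traffic (updates, external services)
    iifname "vlan20" oifname "wan0" accept

    # inbound from the internet only to the reverse proxy ports
    iifname "wan0" oifname "vlan20" tcp dport { 80, 443 } accept
  }
}
```

Everything not explicitly matched, including any path from the server VLAN to the main home LAN, falls through to the chain's drop policy.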
Adding VPN server capabilities to this setup seems like a natural extension. WireGuard has become the standard for good reason: it's fast, secure, and relatively simple to configure. The idea of having my own VPN server for use on public networks is appealing.
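A minimal WireGuard server config sketch, with placeholder keys and an assumed 10.8.0.0/24 tunnel subnet:

```ini
# /etc/wireguard/wg0.conf (server side); keys and subnet are placeholders
[Interface]
Address = 10.8.0.1/24
ListenPort = 51820
PrivateKey = <server-private-key>

[Peer]
# phone or laptop
PublicKey = <client-public-key>
AllowedIPs = 10.8.0.2/32
```

On the client side, setting AllowedIPs to 0.0.0.0/0 routes all traffic through the tunnel, which is the public-Wi-Fi use case; a narrower range would cover only home services.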
But there's a complexity here I'm still working through. If I'm accessing my media server through the public internet via my domain, do I also need VPN access? The VPN would be more for general internet browsing security rather than accessing home services.
The bandwidth considerations are real too. My home upload speed is only about 35 Mbps, and a single high-bitrate 4K stream can consume 20 Mbps or more of that on its own. Streaming high-quality media while also routing other traffic through the VPN could create bottlenecks. I might need to implement quality-of-service rules or reconsider whether the VPN is really necessary for my use case.
If I'm opening up services to the internet, I need to know what's happening. This means implementing proper logging and monitoring from day one, not as an afterthought.
For a home lab, a full enterprise SIEM is overkill, but I still need centralized logging and alerting. I'm considering a lightweight ELK stack or potentially using something like Wazuh for intrusion detection.
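Before any of that stack is in place, even a cron-driven sketch like this covers the "know what's happening" baseline (the log path, pattern, threshold, and alert action are all placeholders for whatever the eventual monitoring stack provides):

```shell
#!/bin/sh
# Minimal log-watch sketch: count failed SSH logins and flag when a
# threshold is exceeded. A real SIEM/IDS replaces this eventually.
LOG="${1:-/var/log/auth.log}"
THRESHOLD="${2:-20}"

# grep -c prints the match count; fall back to 0 if the file is missing
count=$(grep -c "Failed password" "$LOG" 2>/dev/null) || count=0
count=${count:-0}

if [ "$count" -gt "$THRESHOLD" ]; then
  echo "ALERT: $count failed logins in $LOG"
fi
```

The same pattern generalizes to reverse-proxy 401s or Authelia denials; the interesting part is picking thresholds that alert on attacks without paging on background internet noise.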
The key metrics I want to monitor:

- Failed authentication attempts against the reverse proxy and every service behind it
- Unexpected outbound connections from the server VLAN
- CPU, memory, disk, and bandwidth usage trends
- Service availability and unexpected restarts
If the server gets compromised, I need to be able to restore from clean backups quickly. I'm planning automated nightly backups pulled by a separate device that the server itself can't reach over the network, plus periodic manual backups to offline storage.
The backup strategy needs to cover both data and configuration. Rebuilding all the services and authentication setup would be painful without proper documentation and config backups.
I'm not rushing into this. The plan is to build and test everything locally first, then gradually expose services to the internet with extensive monitoring. Phase one is getting Jellyfin and file sharing working on the local network. Phase two adds the authentication layer and reverse proxy. Phase three opens it up to external access with full monitoring in place.
Each phase gets thoroughly tested and documented before moving to the next. The last thing I want is to rush the security implementation and create vulnerabilities.
Beyond the practical benefits, this project covers a huge range of IT and security concepts: network segmentation, reverse proxies, authentication systems, monitoring, backup strategies, and incident response planning. It's exactly the kind of hands-on experience that you can't get from just reading about these topics.
The security challenges are real, but they're also educational. Every decision involves weighing convenience against security, understanding attack vectors, and implementing defense in depth. These are the same considerations that drive enterprise security decisions, just at a smaller scale.
I'm still finalizing the hardware decision, but I'm leaning toward the mini PC approach for the additional capabilities. The next post in this series will cover the actual implementation, starting with the basic server setup and working through each component.
If you're thinking about a similar project, I'd love to hear about your approach. The security considerations around remote access are complex, and there are probably angles I haven't considered yet. Feel free to reach out through the contact information on the main page.
This is going to be a fun project. Complicated, sure, but that's what makes it interesting. The goal isn't just to build something that works; it's to build something that works securely and teaches me something in the process.