How to Deploy Services at a Low Cost?
If you need a cheap VPS like I do, take a look at RackNerd.
- Get a cheap VPS as a gateway
- Deploy services on any device
- Use FRP to forward requests from the VPS to different services
- Use IPFS to save bandwidth for static files
- Use Serverless to save computing resources
I am the kind of person who wants to keep personal data in my own hands, so I always prefer self-hosted services. My services grew as I found more interesting and useful projects. For example, I use Bitwarden to remember my passwords, HedgeDoc to write documents, Flarum to discuss topics with friends, Gitea to host my personal repositories, and so on.
I tried Docker, loved it, and now I can't go back, so I deploy most of my services in Docker containers.
Docker images are usually large. Even with Alpine-based images, a handful of them can take gigabytes of storage. And even without Docker, a number of services run out of memory easily.
However, storage and memory are exactly the expensive parts of a VPS, and I'm not ready to pay that much just for my personal services.
In my experience, I always needed more storage and memory, but I never used even half of the bandwidth. I have a Raspberry Pi with an external SSD running some services, but I don't want to expose my home IP address to the public directly. So can I combine them to get not only enough bandwidth but also expandable storage and memory?
The answer is yes. I found a project called FRP, a fast reverse proxy that can create a tunnel from any device to my VPS.
So my cheap VPS becomes a gateway.
As a gateway, HTTPS is necessary; no one wants to be snooped on over the Internet. Let's Encrypt is a good choice. I could maintain signed TLS certificates with a few scripts and apply them to an Nginx instance, but after getting a taste of Docker I wanted an even easier approach.
Caddy is a very powerful web server with automatic HTTPS, written in Go. It is easy to install and easy to extend. To support HTTPS verified via DNS, I created a Docker image that rebuilds Caddy with Cloudflare and Aliyun DNS support. Check it out on GitHub.
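Such a custom build can be reproduced with the builder variant of the official Caddy image. A minimal sketch, assuming the Cloudflare DNS plugin (the Aliyun one follows the same pattern):

```dockerfile
# Build Caddy with the Cloudflare DNS plugin using the official builder image
FROM caddy:builder AS builder
RUN xcaddy build --with github.com/caddy-dns/cloudflare

# Copy the custom binary into a clean runtime image
FROM caddy:latest
COPY --from=builder /usr/bin/caddy /usr/bin/caddy
```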
Then I only need to maintain a Caddyfile with all my services. With Docker I don't even need to track ports for each service, because they are already isolated at the container level, so I can simply point each reverse proxy entry at the hostname of the corresponding container.
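Such a Caddyfile can look like the sketch below; the domains, container hostnames, and ports are hypothetical, and each site block gets its TLS certificate automatically:

```caddyfile
# Hypothetical domains; each hostname resolves on the shared Docker network
bitwarden.example.com {
    reverse_proxy bitwarden:80
}

docs.example.com {
    reverse_proxy hedgedoc:3000
}
```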
Using FRP is straightforward. I started an FRP server on my VPS, exposing port 7001 for HTTP reverse proxies. Any HTTP service on a client is forwarded to :7001 on my VPS and opened to the public through the VPS as a gateway.
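On the server side this only takes a couple of lines. A sketch in FRP's classic INI format, with the port numbers described above:

```ini
# frps.ini -- on the VPS
[common]
# port that frpc clients connect to
bind_port = 7000
# forwarded HTTP services are exposed here
vhost_http_port = 7001
```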
I used my Raspberry Pi as a client. I can even set custom domains from the client, provided that wildcard TLS certificates are properly handled on the VPS. So every time I add or remove a service, I only need to make changes on the client, and the client communicates with the server on my VPS. It's really simple and convenient.
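The client side is a similar sketch; the server address, domain, and local port here are hypothetical:

```ini
# frpc.ini -- on the Raspberry Pi
[common]
server_addr = vps.example.com
server_port = 7000

# each proxy needs a unique name across all clients
[web01]
type = http
local_ip = 127.0.0.1
local_port = 8080
custom_domains = app.example.com
```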
What's more, I can have multiple FRP clients. Say I have a few services on my Raspberry Pi and another few on my old computer: I can start an FRP client on each of them and connect both to the same server. As long as the proxy names (web01 in the example above) are different, they will work at the same time.
In this way I don't need to worry about the storage or memory of my VPS any more. I can still enjoy a low-cost VPS with rich features backed by my own devices.
We can serve static files on dedicated services like Vercel, Netlify, Cloudflare, etc. Thanks to the Internet, we have so much great and free infrastructure to use.
What if we run out of bandwidth? For personal services this is unlikely to happen. But sometimes it does, because of crawlers, or more visitors and RSS subscribers than expected.
There is another way to save bandwidth for static files: IPFS. Read my last post for more details.
This blog has an RSS feed served on IPFS. So even if there are a lot of subscribers requesting the feed periodically -- which is not going to happen but I have a long-term vision 😆 -- the data is transferred from IPFS gateways and doesn't affect my bandwidth at all.
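Publishing a file to IPFS is a one-liner; a sketch assuming a running local IPFS node and a hypothetical feed.xml:

```shell
# Add the feed to the local IPFS node and print its CID (v1);
# the file is then reachable from any public gateway by that CID.
ipfs add --cid-version 1 feed.xml
```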
What if we need more devices? What if we have services that are used only occasionally?
The answer is serverless. Serverless platforms let us deploy a service that runs only when it is invoked. They can be integrated with CI/CD and free us from DevOps chores.
For example, if you use Deno as I do, you can deploy services to Deno Deploy automatically. There are other serverless providers with free tiers or fair pricing; try them yourself.
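A serverless service can be as small as a single request handler. A minimal sketch in TypeScript; the greeting is a placeholder, and on Deno Deploy you would start it with `Deno.serve(handler)` at the top level of the module:

```typescript
// main.ts -- a minimal HTTP handler for a serverless platform.
// On Deno Deploy, `Deno.serve(handler)` at module top level starts serving.
export function handler(_req: Request): Response {
  return new Response("Hello from a serverless function", { status: 200 });
}
```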