Busride-rs
Feb. 14th, 2024 02:42 pm

Okay, so I know this is going to shock you, but I've been working on something arcane and impractical.
I made a wrapper for normal HTTP-speaking Rust web apps so their traffic can take an extra round trip through a totally different protocol, before being translated back into HTTP for the outside world. Specifically, I plan to serve Axum-based apps via FastCGI, a protocol that went out of fashion in the mid '00s.
This probably sounds dubiously useful, but, man, listen,
( Or don't! Contents: historical background and some technical exegesis. )
Anyway, I Did It!
Here's a little 3m demo I recorded when I got my initial proof-of-concept working. If you know anything about deploying a self-hosted app in the 2020s, it will shock and scandalize you.
And, here's the code itself, including a demo project:
I found a FastCGI server library for Rust (I'm SO curious about why the author made this, but yeah, it's very precisely what I needed) and put together a server loop that translates between the normal HTTP that an inner app understands and the FastCGI protocol that Apache is willing to accept. As long as the binary you build knows how to start up in the weird environment that classic FastCGI provides, you can just install it, drop in an .htaccess file, and wander off to go do something else.
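For the curious, the .htaccess side of that can be pretty small. Here's a sketch of what it might look like on a shared host running Apache with mod_fcgid — the binary name `myapp.fcgi` and the rewrite setup are my own illustration, not something shipped with the project, and the exact handler name can vary by host:

```apache
# Allow CGI-style execution in this directory, and treat .fcgi files
# as FastCGI programs (assumes the host has mod_fcgid enabled).
Options +ExecCGI
AddHandler fcgid-script .fcgi

# Send every request that isn't a real file on disk to the app binary,
# preserving the query string. "myapp.fcgi" is a hypothetical name.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*)$ myapp.fcgi/$1 [QSA,L]
```

The appeal is exactly what the paragraph above says: once the binary speaks FastCGI, this handful of lines is the entire "deployment pipeline."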
At the moment, it's Axum-specific and has to be built into your app as an alternate server mode. In theory it ought to be possible to make a fully generalized wrapper that can spawn any program as a child process and proxy real-actual HTTP to it, but that's more work than I want to do on this; at the moment, this should work fine for me.
So... Why??
Here's another interesting point about apps that run in this mode: anyone else can install them on their shared hosting just as easily, if I give them a build and a README.
In the last few years, there's been a medium amount of big talk about how we need to re-wild the interwebs; bring back some spirit of curiosity and generosity and chaos that we thought we perceived in the '90s and the '00s.
In a recent thread that rolled across my Mastodon feed (wish I could remember and link it, but it took a while to percolate before I took it to heart), someone pointed out the short version of what I described above — that hosting has gotten better for pros at the expense of amateurs — and then said: if we think there's a connection between self-hosting and re-wilding the web, then we're going to have to reverse that, because getting out of a tech-dominated world of walled gardens is going to require empowering the type of normal users who could kinda-sorta keep a Wordpress installation afloat back in the day but who have no hope of, say, sysadmining a Mastodon instance.
I've been thinking about that in the background, a bit.
