I agree, this is a critical question to answer for the future of Rhasspy. A project with a bus factor of 1 is no good for longevity.
@maxbachmann, @koan, and other community members have stepped up to help with development and documentation. The GitHub organization has helped, and I plan to do something similar for DockerHub and PyPI.
One thing I hadn’t considered was setting up the Python package requirements to work with semantic versioning instead of fighting against it. So instead of:
we would have:
Then each patch version (0.1.9, 0.1.10) would automatically get picked up during the next upgrade.
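To make the idea concrete, here is a sketch of what that change might look like in a `requirements.txt` (the package name is illustrative, not the actual pin):

```
# Exact pin: picking up 0.1.9 means editing requirements and cutting a new release
rhasspy-some-service==0.1.8

# Compatible-release specifier (PEP 440 "~="): allows 0.1.9, 0.1.10, ... but not 0.2.0
rhasspy-some-service~=0.1.8
```

The `~=` operator is pip's built-in way of saying "any release that's semver-compatible with this one," which is exactly the behavior described above.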
I don’t have a problem with this in theory, it’s just that a “release” ends up being a lot of work to get out right now. I’m hoping this will change with more automation. What you and @maxbachmann were helping set up on GitHub before the 2.5 release (using git tags and Actions) would get us most of the way there.
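For reference, a minimal sketch of what that tag-triggered workflow could look like (the file name, job names, and build steps here are all assumptions, not the actual setup):

```yaml
# .github/workflows/release.yml (hypothetical sketch)
name: release
on:
  push:
    tags: ["*"]            # fire whenever a version tag like 2.5.1 is pushed
jobs:
  build-and-publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build and push Docker images
        run: make docker   # placeholder for the real build step
      - name: Publish to PyPI
        run: make pypi     # placeholder for the real publish step
```

With something like this in place, the manual part of a release shrinks to pushing a tag.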
I think this is the Achilles’ heel of trying to automate releases right now. I use `rhasspy-test` to run a suite of automated tests against the latest (`amd64`) Docker image before pushing an update. This spins up an isolated Docker container, downloads and trains an English profile, then runs the tests. It relies on a local web server to serve up the ~15 GB of profile artifacts.
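The local-artifact-server part of that setup can be as simple as Python's built-in `http.server`. A self-contained sketch (the artifact name and directory here are stand-ins, not the real profile layout):

```python
import functools
import http.server
import pathlib
import tempfile
import threading
import urllib.request

# Hypothetical stand-in for the ~15 GB local artifact directory.
artifacts = pathlib.Path(tempfile.mkdtemp())
(artifacts / "profile-en.tar.gz").write_bytes(b"fake profile artifact")

# Serve the directory on a random free port on localhost.
handler = functools.partial(
    http.server.SimpleHTTPRequestHandler, directory=str(artifacts)
)
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The test container would download its profile artifacts from this URL
# instead of pulling them over the internet on every run.
port = server.server_address[1]
data = urllib.request.urlopen(f"http://127.0.0.1:{port}/profile-en.tar.gz").read()
server.shutdown()
```

In the real setup the container just points its download URLs at the host's server, so repeated test runs never re-download the artifacts.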
I imagine going forward, I’ll still be using `rhasspy-test`, but when it succeeds and we’re satisfied, one of us will just `git tag` the version and everything will be built and deployed from GitHub.
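That final step really is just a couple of commands. A runnable sketch, using a throwaway repo so it works anywhere (the version number is illustrative):

```shell
set -e
# Throwaway repo standing in for the real rhasspy checkout.
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "commit that passed rhasspy-test"

# Tag the tested commit with the release version.
git tag 2.5.1
git tag -l                 # → 2.5.1
# git push origin 2.5.1    # in the real repo, this kicks off the Actions build/deploy
```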