Reproducible Performance Tests
Kept in a personal, open repo; run on company-hosted cloud K8s infra.
Folks, I have a crazy thought. Please let me know if you see any flaws with the idea.
Within a company I’m helping, I’d like to persuade our DevOps to:
Set up a registry for blueprints and/or performance tests.
So that we could quickly spin up a few machines, and run those tests.
On our company-paid Kubernetes cluster, reproducibly.
At the same time:
I would like some of those tests / blueprints to be open.
Ideally, on my GitHub.
And, of course, I want to be able to run someone else’s open-source perf tests.
On our company’s K8s cloud infra.
As long as those tests are configured accordingly, of course.
And now, the questions:
How seamlessly could I integrate the above?
Ideally, 100% same build & run scripts.
And no extra code.
Would I be able to use GitHub Actions to trigger those test runs?
So that the actions are “shared across the company boundary”.
I.e., 100% identical GitHub actions, that:
take different run arguments, or use different secrets,
so that the company is not exposed,
but some of my (our!) blueprints / perftests remain open source?
GitHub secrets are not inherited by forks — each repo defines its own secrets in its own settings — so a company fork of my public repo can supply its own credentials without exposing anything. That, I think, solves the problem neatly. Then I’ve got myself a nice playground for some immediate future.
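To make the idea concrete, here is a minimal sketch of what such a shared workflow might look like. Everything here is hypothetical: the secret name `PERF_KUBECONFIG` and the script `run-perf.sh` are placeholders that each repo (the company fork or my public one) would fill in with its own values — the workflow file itself stays byte-for-byte identical on both sides:

```yaml
# .github/workflows/perf.yml — a sketch; secret and script names are placeholders.
name: perf
on:
  workflow_dispatch:
    inputs:
      scenario:
        description: "Which perf scenario / blueprint to run"
        required: true
jobs:
  run:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Each repo defines its own PERF_KUBECONFIG secret (base64-encoded
      # kubeconfig), pointing at its own cluster. Forks don't see upstream
      # secrets, so nothing leaks across the company boundary.
      - name: Run perf test
        env:
          KUBECONFIG_DATA: ${{ secrets.PERF_KUBECONFIG }}
        run: |
          echo "$KUBECONFIG_DATA" | base64 -d > kubeconfig
          KUBECONFIG=./kubeconfig ./run-perf.sh "${{ inputs.scenario }}"
```

The same file, triggered with different inputs and backed by different secrets, would run against the company cluster, my cloud account, or a Hetzner box — whichever the repo’s secrets point at.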
My tests would likely be wrapped into Docker containers.
Although, generally, I don’t think they have to be.
And perf-testing Docker-run vs. bare metal is also interesting.
If Docker is used, I’d love the containers to be built from GH actions too.
So that no one has to trust “my” containers.
They can build their own, on their own hardware, into their own registry.
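The build-it-yourself part could be a stock workflow job as well. A sketch, assuming hypothetical `REGISTRY_URL` / `REGISTRY_USER` / `REGISTRY_TOKEN` secrets that each fork sets to its own registry:

```yaml
# Hypothetical build-and-push job — registry and secret names are placeholders.
jobs:
  build-image:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Log in to this repo owner's registry
        run: >
          echo "${{ secrets.REGISTRY_TOKEN }}" |
          docker login "${{ secrets.REGISTRY_URL }}"
          -u "${{ secrets.REGISTRY_USER }}" --password-stdin
      - name: Build and push the perf-test image
        run: |
          docker build -t "${{ secrets.REGISTRY_URL }}/perftest:${{ github.sha }}" .
          docker push "${{ secrets.REGISTRY_URL }}/perftest:${{ github.sha }}"
```

Anyone who distrusts “my” image just runs this job in their fork: same workflow, their registry, their credentials, an image tagged by the exact commit it was built from.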
Last, but not least: I do want to be able to run these tests on “my” machines too.
Via the same set of GitHub Actions.
Maybe by providing my own AWS / GCP / Azure credentials as GH secrets.
Or even pointing that particular run to use my Hetzner instance.
Is this a crazy idea, or might it fly?