If you can do without the dashboard altogether, you can use the Community Edition, which is configured entirely with files: all policies and API definitions are stored as files on disk, so they can be versioned and stored however you like. This has been the case since Tyk launched.
Option 2: Use Pro and the import/export scripts
The Tyk Dashboard installation folder has a utils/scripts folder containing scripts that export the parent organisation, API definitions and policies from a Dashboard installation; those exports can then be imported into a new environment, retaining all the IDs.
If you import into a pre-populated dashboard (e.g. moving from one revision to another rather than from zero to a revision), you need to drop the policies and API definitions collections in MongoDB before importing. This ensures that all API definition and policy IDs are retained.
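The drop step can be sketched with mongosh. The database name is an assumption (the Dashboard commonly uses one called tyk_analytics) and the connection string is a placeholder — adjust both to your deployment:

```shell
# Hypothetical connection string; "tyk_analytics" is the commonly-used
# default Dashboard database name -- adjust to your deployment.
MONGO_URL="mongodb://localhost:27017/tyk_analytics"

# Drop the collections that hold API definitions and policies so the
# subsequent import recreates them with their original IDs.
mongosh "$MONGO_URL" --quiet --eval '
  db.tyk_apis.drop();
  db.tyk_policies.drop();
'
```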
What you would do is:
Set up a staging env
Create the org, and initial APIs / API Definitions in this staging env
Export these
Create a prod env
Import everything
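A hedged sketch of that one-off bootstrap, assuming the export/import scripts under utils/scripts are named export.sh and import.sh (names and arguments vary by Dashboard version — check your installation); the host, admin secret and org ID are placeholders:

```shell
# Placeholders -- replace with your staging Dashboard details.
DASHBOARD="http://localhost:3000"
ADMIN_SECRET="your-admin-secret"   # admin_secret from tyk_analytics.conf
ORG_ID="your-org-id"

# Script names are an assumption; check utils/scripts in your install.
SCRIPTS_DIR="/opt/tyk-dashboard/utils/scripts"
if [ -d "$SCRIPTS_DIR" ]; then
  cd "$SCRIPTS_DIR"
  # Export the org, its API definitions and policies from staging
  ./export.sh "$DASHBOARD" "$ADMIN_SECRET" "$ORG_ID"
fi

# Copy the exported JSON to the new environment, then run the matching
# import script there to recreate everything with the same IDs, e.g.:
#   ./import.sh http://prod-dashboard:3000 "$ADMIN_SECRET"
```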
This will set up everything ready for the pipeline. Next:
Modify API definitions and policies in staging GUI
Use export scripts to export API Definitions and policies
Version control
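In practice the export-and-version-control loop might look like this; the export directory and commit message are illustrative, and the actual export command depends on your Dashboard's scripts:

```shell
# Directory where the exported JSON lands -- an assumption for this sketch.
EXPORT_DIR="./tyk-export"
mkdir -p "$EXPORT_DIR"

# (run the Dashboard export scripts here, writing JSON into $EXPORT_DIR)

# Commit the exported API definitions and policies so every change to
# staging is tracked and the pipeline deploys from a known revision.
cd "$EXPORT_DIR"
git init -q 2>/dev/null || true
git add -A
git -c user.email="ci@example.com" -c user.name="CI" \
  commit -m "Export API definitions and policies from staging" || true
```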
Your pipeline would then:
Build artefacts
Deploy services
Drop tyk_apis and tyk_policies in MongoDB prod
Run the import script for apis and policies
Trigger hot reload
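A minimal sketch of that deploy stage, assuming the default tyk_analytics database, placeholder hosts and secrets, and an import script whose name matches whatever ships in your utils/scripts folder. The final step uses the Gateway API's /tyk/reload/group endpoint to reload every gateway in the group:

```shell
# Placeholders -- replace with your production details.
PROD_MONGO="mongodb://localhost:27017/tyk_analytics"
DASHBOARD="http://localhost:3000"
ADMIN_SECRET="your-admin-secret"
GATEWAY="http://localhost:8080"
GW_SECRET="your-gateway-secret"    # "secret" from tyk.conf

# 1. Drop the collections so the import recreates them with original IDs
mongosh "$PROD_MONGO" --quiet --eval 'db.tyk_apis.drop(); db.tyk_policies.drop();'

# 2. Import the version-controlled API definitions and policies
#    (script name is an assumption -- see your utils/scripts folder)
./import.sh "$DASHBOARD" "$ADMIN_SECRET"

# 3. Trigger a hot reload across the whole gateway group
curl -s -H "x-tyk-authorization: $GW_SECRET" "$GATEWAY/tyk/reload/group"
```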
You can disconnect the hot-reload signal from the dashboard if you want to explicitly “publish” changes to the gateway instead of having that happen automatically on import; this reduces risk during changeover.
Option 3: Use the dashboard API
You could always use the Dashboard API: if you have the policies set up initially in a staging env, or even just as files, you can use the Dashboard API to update API definitions and policies that already exist, and use the import scripts to import new policies.
This has some subtleties depending on how you’re using things…
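For example, updating an existing API definition in place via the Dashboard API might look like the following; the host, authorization key, API ID and file name are all placeholders:

```shell
# Placeholders -- replace with your Dashboard details.
DASHBOARD="http://localhost:3000"
AUTH_KEY="your-dashboard-user-api-key"   # a Dashboard user's API key
API_ID="your-api-id"

# Update an existing API definition in place (its ID is preserved);
# api_def.json is an exported definition you have modified.
curl -s -X PUT "$DASHBOARD/api/apis/$API_ID" \
  -H "Authorization: $AUTH_KEY" \
  -H "Content-Type: application/json" \
  -d @api_def.json

# Policies can be updated the same way via /api/portal/policies/{id}
```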