API object validation failed in setup script in tyk_quickstart repo

Hi,

I started with a fresh Tyk install and no external conf file. I would like to customize the TYK_DASHBOARD_DOMAIN variable in the setup.sh script, but it doesn’t work.

The response of the

OK=$(curl --silent --header "Authorization: $USER_AUTH" --header "Content-Type:application/json" --data "$API_PORTAL_DATA" -X PUT http://$DOCKER_IP:3000/api/apis/$API_PORTAL_ID 2>&1)

call is:

{
    "Status": "Error",
    "Message": "API object validation failed.",
    "Meta": null,
    "Errors": [{}]
}

Hello!

Just to clarify, did this call fail after your changes, or before?

You mentioned you tried to override TYK_DASHBOARD_DOMAIN; which value did you set? Maybe you included “http://” in this variable?

Worth noting that we are working on a new quick start guide: https://github.com/TykTechnologies/tyk-pro-docker-demo. But be aware that to change the dashboard domain in the new setup.sh in this repo, you need to change the “TYK_DASH_DOMAIN” variable, not “TYK_DASHBOARD_DOMAIN”.
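For example (illustrative value only, and only a guess at the failure mode): the domain should be a bare hostname, with no scheme and no port.

```shell
# Illustrative value, not the repo default: a bare hostname,
# with no "http://" scheme, which may be what trips validation.
TYK_DASHBOARD_DOMAIN="www.tyk-test.com"
echo "$TYK_DASHBOARD_DOMAIN"
```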

Hope it helps!

Leonid, Tyk Team

Hello!

It was before my changes. I didn’t change the setup.sh content, except adding an echo $OK to see the curl response after this line:

OK=$(curl --silent --header "Authorization: $USER_AUTH" --header "Content-Type:application/json" --data "$API_PORTAL_DATA" -X PUT http://$DOCKER_IP:3000/api/apis/$API_PORTAL_ID 2>&1)

The URL prefixes for the three (portal) APIs are still http://localhost instead of http://tyk_dashboard (because of the error?)

Thank you for the new quick start guide. It seems to be missing the curl call to create portal access?

It seems to be missing the curl call to create portal access?

It is already inside the setup.sh file.

Ok, it’s very different from the previous quickstart.

I set these values:

TYK_PORTAL_DOMAIN="developers.domain.com"
TYK_DASH_DOMAIN="api.domain.com"
TYK_PORTAL_PATH="/portal/"

And I can access the dashboard with this URL: http://api.domain.com:3000

And I can access the portal with this URL: http://developers.domain.com:3000

But I would like to access the portal without specifying port 3000; is that possible? And what is the purpose of /portal/ now?

Hi,

Setting those values in setup.sh will break the demo. The demo is designed with very specific hostnames in mind, changing them requires changing more than just the variables in the setup file.

The setup script just initialises your stack so you can try it with the values that have been provided. If you deviate from those, it will break.

M

Ok, thank you Martin.

I’m trying to write a stackfile with the minimum environment variables to get Tyk working, but when I compare the Excel file here: https://tyk.io/wp-content/uploads/2016/11/Gateway-Environment-Vars.xlsx with the conf values in tyk.conf, tyk_analytics.conf, or pump.conf, some values are missing.

Also, some values are nested and I don’t know how to inject them via environment variables, like:

For example, in pump.conf:

  • pumps
  • uptime_pump_config

How can I translate these into environment variables?

"pumps": {
    "mongo": {
        "name": "mongo",
        "meta": {
            "collection_name": "tyk_analytics",
            "mongo_url": "mongodb://mongo:27017/tyk_analytics"
        }
    }
},
"uptime_pump_config": {
    "collection_name": "tyk_uptime_analytics",
    "mongo_url": "mongodb://mongo:27017/tyk_analytics"
},

Same problem for the “tags” array: what is the equivalent?

"db_app_conf_options": {
    "connection_string": "http://tyk_dashboard:3000",
    "node_is_segmented": false,
    "tags": ["test2"]
},

There are special variables for the pumps. We recommend including a pump config file with your deployment, and then setting the DB servers using these variables:

Redis

  • TYK_PMP_REDIS_HOSTS=hostname:port,hostname:port,hostname:port
  • TYK_PMP_REDIS_ENABLECLUSTER=true

Mongo Pump

Aggregate Pump

For any nested elements in the other confs, it’s the prefix (e.g. TYK_GW), then the setting: TYK_GW_NESTPARENT_NESTVALUE
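As a sketch of that naming rule applied to the db_app_conf_options block quoted earlier (the exact names are my assumption from the rule above — underscores inside a key appear to be dropped — so verify them against your Tyk version):

```shell
# Sketch, assuming the TYK_GW prefix rule described above:
# "db_app_conf_options" -> DBAPPCONFOPTIONS
# "connection_string"   -> CONNECTIONSTRING
export TYK_GW_DBAPPCONFOPTIONS_CONNECTIONSTRING="http://tyk_dashboard:3000"
export TYK_GW_DBAPPCONFOPTIONS_NODEISSEGMENTED=false
echo "$TYK_GW_DBAPPCONFOPTIONS_CONNECTIONSTRING"
```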

Thank you for your answer Martin

The goal with environment variables is to avoid using external files…?

I’m not sure I understand what TYK_PMP_REDIS_HOSTS and PMP_MONGO_CONNECTIONSTRING are?

Can you confirm that these are equivalent?

pump.conf

...
"mongo": {
    "name": "mongo",
    "meta": {
        "collection_name": "tyk_analytics",
        "mongo_url": "mongodb://tyk-mongo:27017/tyk_analytics"
    }
}
...

ENVIRONMENT:

- TYK_PMP_PUMPS_MONGO_NAME=mongo
- TYK_PMP_PUMPS_MONGO_META_COLLECTIONNAME=tyk_analytics
- TYK_PMP_PUMPS_MONGO_META_MONGOURL=mongodb://tyk-mongo:27017/tyk_analytics

pump.conf

...
"uptime_pump_config": {
    "collection_name": "tyk_uptime_analytics",
    "mongo_url": "mongodb://mongo:27017/tyk_analytics"
}
...

ENVIRONMENT:

- TYK_PMP_UPTIMEPUMPCONFIG_COLLECTIONNAME=tyk_uptime_analytics
- TYK_PMP_UPTIMEPUMPCONFIG_MONGOURL=mongodb://tyk-mongo:27017/tyk_analytics

I’m afraid that for the pump, because the pumps list is a dynamic key/value object, the environment variables will not work like that.

For the pump you’ll need a file, and then use the special variables I mentioned earlier to set the mongo and redis connection strings.
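In a stackfile that approach looks roughly like this (service and image names are assumptions, not taken from the demo repo):

```yaml
# Sketch: ship pump.conf as a mounted file for the pump structure,
# then point the pump at your DB servers with the special variables.
tyk_pump:
  image: tykio/tyk-pump-docker-pub
  volumes:
    - ./pump.conf:/opt/tyk-pump/pump.conf
  environment:
    - TYK_PMP_REDIS_HOSTS=redis:6379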

But you can’t set the sub-keys of a named pump config (e.g. “mongo”, “mongo_aggregate”, etc.) because we don’t register them with the environment config handler, since we don’t know whether they exist (it’s dynamic, after all).

It’s something we’re looking to improve, but it will take a while.

Too bad we need an external file :frowning:

Yes, it’s dynamic, but Tyk could parse the values between TYK_PMP_PUMPS_ and the next _.
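The parsing suggested above could be sketched like this (hypothetical; Tyk does not do this today):

```shell
# Hypothetical sketch of the suggested parsing: the segment between
# TYK_PMP_PUMPS_ and the next "_" would name the pump.
var="TYK_PMP_PUMPS_MONGO_META_MONGOURL"
rest="${var#TYK_PMP_PUMPS_}"   # -> MONGO_META_MONGOURL
pump="${rest%%_*}"             # -> MONGO
echo "$pump"
```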

I think there is a typo in this environment variable name : TYK_DB_USESHARDEDANLAYTICS

“ANLAYTICS” instead of “ANALYTICS”

And I think this is in the Tyk source code too, because I can change the value with this variable name.

Thanks for letting us know - it’s something we’ll need to fix in a major version release, as it will break backwards compatibility.

For the pump, because we have a blanket environment parser, it isn’t quite that easy to add the extensions; instead, extensions have their own prefixes so they can be managed independently, but they still need to be registered in a file.

One way to manage this might be to build a new container based off ours as a base image and then bake in a config file to override the settings dynamically with the environment variables.
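That derived-image idea could look like the following entrypoint sketch (the variable name PUMP_MONGO_URL and the trimmed pump.conf template are invented for illustration):

```shell
#!/bin/sh
# Sketch of an entrypoint in a derived image: rewrite pump.conf from an
# environment variable before starting the pump. PUMP_MONGO_URL and the
# trimmed template below are invented for this sketch.
PUMP_MONGO_URL="${PUMP_MONGO_URL:-mongodb://tyk-mongo:27017/tyk_analytics}"

# Stand-in for the pump.conf shipped in the image.
cat > pump.conf.template <<'EOF'
{"pumps": {"mongo": {"name": "mongo", "meta": {"mongo_url": "mongodb://mongo:27017/tyk_analytics"}}}}
EOF

# Substitute the runtime URL into every mongo_url field.
sed "s|\"mongo_url\": \"[^\"]*\"|\"mongo_url\": \"$PUMP_MONGO_URL\"|g" \
    pump.conf.template > pump.conf
```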