How to install Tyk?


#1

Hi,

A few weeks ago I managed to get Tyk working, but now I have to reinstall everything, and I'm again spending hours trying different setups, always ending up with 404s and timeouts. By now I almost know the install docs by heart, but I still don't understand them: they only cover special cases, like installing Tyk on AWS on port 8080 and nothing else. I can't find docs that would help me, and the setup script doesn't give me a functioning result.

I see that lots of people here are having similar problems, and I suppose that's down to a lack of understanding of the system. I don't see why it has to be so complicated; or maybe it isn't complicated at all, but then I'm not aware of that.

My basic question is, how do I set up a gateway with a pump and a dashboard?
Let’s say these are the parameters:

Could you provide a very simple step-by-step guide for doing that, and especially for what to look for in the config files to make sure it's working? It would also be helpful to understand how to check whether a component is working, because in lots of cases I only see that stdout and stderr are empty and I just get a 404, and I don't even know if that's good or bad news.

I'd really, really appreciate your help. If you can point me to a doc that answers my question, that would also be great.
Thanks,
Gergely


#2

Hi Gergely

Have you followed the getting started with Ubuntu guide?
https://tyk.io/tyk-documentation/get-started/with-tyk-on-premise/installation/on-ubuntu/

Let me know if this hasn't helped and I'll see if there is something else we can provide.

James


#3

Hi James,

Thank you! I was reading this one:
https://tyk.io/docs/tyk-api-gateway-v-2-0/installation-options-setup/install-tyk-on-ubuntu/

I still can't say I understand the logic behind the configuration, but this is definitely more helpful.

Now I have the gateway running at 127.0.0.1:8000, and there’s the dashboard too.

I have no idea whether the gateway is now working correctly or not:

$ sudo service tyk-gateway status
tyk-gateway start/running, process 27115

/var/log/tyk-gateway.stdout and /var/log/tyk-gateway.stderr are empty.
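When stdout and stderr are empty, it can help to first confirm that anything at all is listening on the configured port. A generic TCP probe (nothing Tyk-specific, just stdlib Python) might look like:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# For the gateway in this thread, you would check:
#   port_open("127.0.0.1", 8000)
```

If this returns False for 127.0.0.1:8000, the gateway isn't listening at all, and the `listen_port` in tyk.conf is the first thing to check.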

I defined three APIs on the dashboard (A, B and C in the example below).

When I try this:

$ curl \
  -H "x-tyk-authorization: 352d20ee67be67f6340b4c0605b044b7" \
  -s -H "Content-Type: application/json" \
  -X GET "http://127.0.0.1:8000/tyk/health/?api_id=68917538a89c4ec9580cab4046a65a3e"

The result is:
{"status":"error","error":"Health checks are not enabled for this node"}
which suggests that the gateway itself is up and answering.
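For what it's worth, that error body is itself a useful signal: any JSON reply with "status": "error" means the gateway process answered the request. A rough triage helper (my own heuristic, not part of Tyk; the 403 status in the example below is an assumption, since the thread doesn't show the status code) could read:

```python
import json

def diagnose(status_code, body):
    """Rough triage of a Tyk gateway REST reply (heuristic, not official)."""
    try:
        payload = json.loads(body)
    except ValueError:
        payload = None
    if status_code == 404:
        return "listener reached, but no route matched (wrong path, or API not loaded)"
    if isinstance(payload, dict) and payload.get("status") == "error":
        # The process answered; the error text says what it refused and why.
        return "gateway is up and answering: " + payload.get("error", "")
    if 200 <= status_code < 300:
        return "success"
    return "unexpected response"

print(diagnose(403, '{"status":"error","error":"Health checks are not enabled for this node"}'))
```

The point is simply that a structured error and an empty reply are very different symptoms: the former means the gateway is alive.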

But when I use curl to create a token, I get no response or output at all:

$ curl \
 -H "x-tyk-authorization: 352d20ee67be67f6340b4c0605b044b7" \
 -s -H "Content-Type: application/json" -X POST -d '{
  "allowance":100,
  "rate":100,
  "per":60,
  "expires":-1,
  "quota_max":10010,
  "quota_renews":1475893670171,
  "quota_remaining":10100,
  "quota_renewal_rate":2592000,
  "access_rights":{
    "510424f4d2ff4a4341b3f4197a30ce77":{
      "api_name":"A",
      "api_id":"510424f4d2ff4a4341b3f4197a30ce77"
    },
    "20e099b2bbb84efc617c78f527148638":{
      "api_name":"B",
      "api_id":"20e099b2bbb84efc617c78f527148638"
    },
    "c3e50c33b0214cc27dc3bbf4c3ac1bdc":{
      "api_name":"C",
      "api_id":"c3e50c33b0214cc27dc3bbf4c3ac1bdc"
    }
  }
}' http://127.0.0.1:8000/tyk/keys/create

I simply get no output on stdout. With curl's verbose mode, the answer is:

* Hostname was NOT found in DNS cache
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8000 (#0)
> POST /tyk/keys/create HTTP/1.1
> User-Agent: curl/7.35.0
> Host: 127.0.0.1:8000
> Accept: */*
> x-tyk-authorization: 352d20ee67be67f6340b4c0605b044b7
> Content-Type: application/json
> Content-Length: 500
> 
* upload completely sent off: 500 out of 500 bytes
* Empty reply from server
* Connection #0 to host 127.0.0.1 left intact

If I try to call the same request from nodejs I get an error:
request to http://127.0.0.1:8000/tyk/keys/create failed, reason: socket hang up

As I could find no detailed docs for the REST API calls, I have no idea which parameters are required to create a token; maybe some essential info is missing? The docs only contain a sample, or is it documented somewhere and I just couldn't find it? But even if this is a problem with the parameters, is it normal to hang up the socket instead of returning an error message?
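Since the required fields aren't documented here, one way to experiment is to build the payload programmatically and vary the fields one at a time. A sketch using only the fields from the sample request above (the helper itself is hypothetical; which fields are strictly required is an assumption, not something the docs confirm):

```python
import json

def build_key_request(apis, rate=100, per=60, quota_max=10010):
    """Assemble a /tyk/keys/create payload.

    Field names are copied from the sample request above; which of them
    are mandatory is an open question in this thread.
    """
    return {
        "allowance": rate,
        "rate": rate,
        "per": per,
        "expires": -1,
        "quota_max": quota_max,
        "quota_remaining": quota_max,
        "quota_renewal_rate": 2592000,
        "access_rights": {
            api_id: {"api_name": name, "api_id": api_id}
            for api_id, name in apis.items()
        },
    }

payload = build_key_request({"510424f4d2ff4a4341b3f4197a30ce77": "A"})
body = json.dumps(payload)  # this string is what would go into curl's -d
```

At minimum this rules out malformed JSON as the cause, since the body is serialised programmatically rather than typed by hand.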

My current tyk.conf is the following:

{
  "listen_port": 8000,
  "secret": "352d20ee67be67f6340b4c0605b044b7",
  "template_path": "/opt/tyk-gateway/templates",
  "tyk_js_path": "/opt/tyk-gateway/js/tyk.js",
  "use_db_app_configs": false,
  "app_path": "/opt/tyk-gateway/apps",
  "middleware_path": "/opt/tyk-gateway/middleware",
  "storage": {
    "type": "redis",
    "host": "localhost",
    "port": 6379,
    "username": "",
    "password": "",
    "database": 0,
    "optimisation_max_idle": 2000,
    "optimisation_max_active": 4000
  },
  "enable_analytics": false,
  "analytics_config": {
    "type": "csv",
    "csv_dir": "/tmp",
    "mongo_url": "",
    "mongo_db_name": "",
    "mongo_collection": "",
    "purge_delay": -1,
    "ignored_ips": [],
    "normalise_urls": {
        "enabled": true,
        "normalise_uuids": true,
        "normalise_numbers": true,
        "custom_patterns": []
    }
  },
  "health_check": {
    "enable_health_checks": false,
    "health_check_value_timeouts": 60
  },
  "optimisations_use_async_session_write": true,
  "allow_master_keys": false,
  "policies": {
    "policy_source": "file",
    "policy_record_name": "policies"
  },
  "hash_keys": true,
  "suppress_redis_signal_reload": false,
  "close_connections": true,
  "enable_non_transactional_rate_limiter": true,
  "enable_sentinel_rate_limiter": false,
  "local_session_cache": {
    "disable_cached_session_state": false
  },
  "uptime_tests": {
    "disable": false,
    "config": {
      "enable_uptime_analytics": false,
      "failure_trigger_sample_size": 3,
      "time_wait": 300,
      "checker_pool_size": 50
    }
  },
  "http_server_options": {
        "enable_websockets": true
  },
  "hostname": "",
  "enable_custom_domains": true,
  "enable_jsvm": true
}

Thanks,
Gergely


#4

Hi Gergely, can you run this command and share the output?

ps aux | grep tyk

This will be useful to see if the Tyk processes are up.

The "Health checks are not enabled for this node" error occurs because health checks need to be enabled in your tyk.conf; currently you have:

"health_check": {
    "enable_health_checks": false,
    "health_check_value_timeouts": 60
}

If you want to use the health check feature, you will need to set enable_health_checks to true.
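For reference, the enabled form would look like this (timeout value kept from the config above); the gateway will need a restart afterwards to pick up the change:

```json
"health_check": {
    "enable_health_checks": true,
    "health_check_value_timeouts": 60
}
```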

It would be useful to know the output of the first command so we can help you out.


#5

Hi,

$ ps aux | grep tyk
root       409  0.0  0.0 302692  5404 ?        Ssl  szept03   1:08 /opt/tyk-dashboard/tyk-analytics --conf=/opt/tyk-dashboard/tyk_analytics.conf
root       413  0.0  0.6 422004 51408 ?        Ssl  szept03   3:26 /opt/tyk-pump/tyk-pump -c /opt/tyk-pump/pump.conf
root     17085  0.0  0.0 439984  6688 ?        Ssl  04:06   0:06 /opt/tyk-gateway/tyk --conf=/opt/tyk-gateway/tyk.conf
gulyas   24126  0.0  0.0  17432  2428 pts/31   S+   14:09   0:00 grep --color=auto tyk

Thanks,
Gergely


#6

It seems all the services are up. It would be interesting to run Tyk in debug mode:

service tyk-gateway stop
/opt/tyk-gateway/tyk --conf=/opt/tyk-gateway/tyk.conf --debug

The first command stops the Tyk gateway; the second runs it in the foreground with debug mode enabled.
You can then retry your token creation request and check the output.
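Debug mode is chatty, as the output below shows. A trivial filter to surface the lines worth reading first (my own helper, nothing Tyk-specific) can make the retry easier to analyse:

```python
import re

# Matches the log levels that usually explain a dead connection.
ALERT = re.compile(r"ERROR|FATAL|panic", re.IGNORECASE)

def interesting(lines):
    """Keep only error and panic lines from a captured debug log."""
    return [line for line in lines if ALERT.search(line)]

sample = [
    "[Sep  9 20:39:40] DEBUG Connecting to redis cluster",
    "[Sep  9 20:39:40] ERROR policy: Couldn't load policy file",
    "2016/09/09 20:40:09 http: panic serving 127.0.0.1:36756",
]
print(interesting(sample))
```

Feeding it the full foreground output (e.g. captured with `tee` into a file) narrows hundreds of DEBUG lines down to the few that matter.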


#7
[Sep  9 20:39:40]  INFO Connection dropped, connecting..
[Sep  9 20:39:40]  INFO host-check-mgr: Starting Poller
[Sep  9 20:39:40]  INFO main: Setting up analytics normaliser
[Sep  9 20:39:40] DEBUG main: Enabling debug-level output
[Sep  9 20:39:40] DEBUG main: Initialising default org store
[Sep  9 20:39:40] DEBUG Connecting to redis cluster
[Sep  9 20:39:40] DEBUG Redis pool already INITIALISED
[Sep  9 20:39:40] DEBUG Connecting to redis cluster
[Sep  9 20:39:40] DEBUG Redis pool already INITIALISED
[Sep  9 20:39:40] DEBUG main: Loaded API Endpoints
[Sep  9 20:39:40]  INFO main: Setting up Server
[Sep  9 20:39:40]  INFO main: --> Standard listener (http)
[Sep  9 20:39:40] DEBUG Connecting to redis cluster
[Sep  9 20:39:40] DEBUG Redis pool already INITIALISED
[Sep  9 20:39:40] DEBUG Subscription started: tyk.cluster.notifications
[Sep  9 20:39:40]  INFO Loading API Specification from /opt/tyk-gateway/apps/1.json
[Sep  9 20:39:40] DEBUG INITIALISING EVENT HANDLERS
[Sep  9 20:39:40] DEBUG Checking for transform paths...
[Sep  9 20:39:40] DEBUG Checking for transform paths...
[Sep  9 20:39:40]  INFO Loading API Specification from /opt/tyk-gateway/apps/app_sample.json
[Sep  9 20:39:40] DEBUG INITIALISING EVENT HANDLERS
[Sep  9 20:39:40] DEBUG Checking for transform paths...
[Sep  9 20:39:40] DEBUG Checking for transform paths...
[Sep  9 20:39:40]  INFO main: Detected 2 APIs
[Sep  9 20:39:40]  INFO main: Loading API configurations.
[Sep  9 20:39:40] DEBUG Connecting to redis cluster
[Sep  9 20:39:40] DEBUG Redis pool already INITIALISED
[Sep  9 20:39:40]  INFO main: --> Loading API: Tyk Test API
[Sep  9 20:39:40]  INFO main: ----> Tracking: (no host)
[Sep  9 20:39:40] DEBUG Storage Engine already initialised...
[Sep  9 20:39:40] DEBUG Redis handles: 1
[Sep  9 20:39:40] DEBUG Storage Engine already initialised...
[Sep  9 20:39:40] DEBUG Redis handles: 1
[Sep  9 20:39:40] DEBUG Connecting to redis cluster
[Sep  9 20:39:40] DEBUG Redis pool already INITIALISED
[Sep  9 20:39:40] DEBUG Connecting to redis cluster
[Sep  9 20:39:40] DEBUG Redis pool already INITIALISED
[Sep  9 20:39:40] DEBUG main: ----> Loading Middleware
[Sep  9 20:39:40] DEBUG main: Batch requests enabled for API
[Sep  9 20:39:40] DEBUG Connecting to redis cluster
[Sep  9 20:39:40] DEBUG Redis pool already INITIALISED
[Sep  9 20:39:40]  INFO main: ----> Checking security policy: Token
[Sep  9 20:39:40] DEBUG Chain array start
[Sep  9 20:39:40] DEBUG URL Rewrite enabled
[Sep  9 20:39:40] DEBUG Chain array end
[Sep  9 20:39:40] DEBUG main: ----> Custom middleware processed
[Sep  9 20:39:40] DEBUG Chain completed
[Sep  9 20:39:40] DEBUG main: ----> Rate limits available at: /tyk-api-test/tyk/rate-limits/
[Sep  9 20:39:40]  INFO main: ----> Setting Listen Path: /tyk-api-test/
[Sep  9 20:39:40] DEBUG Subrouter done
[Sep  9 20:39:40] DEBUG tempSpecRegister done
[Sep  9 20:39:40]  INFO main: --> Loading API: Test API
[Sep  9 20:39:40]  INFO main: ----> Tracking: (no host)
[Sep  9 20:39:40] DEBUG Storage Engine already initialised...
[Sep  9 20:39:40] DEBUG Redis handles: 1
[Sep  9 20:39:40] DEBUG Storage Engine already initialised...
[Sep  9 20:39:40] DEBUG Redis handles: 1
[Sep  9 20:39:40] DEBUG Storage Engine already initialised...
[Sep  9 20:39:40] DEBUG Redis handles: 1
[Sep  9 20:39:40] DEBUG Storage Engine already initialised...
[Sep  9 20:39:40] DEBUG Redis handles: 1
[Sep  9 20:39:40] DEBUG main: ----> Loading Middleware
[Sep  9 20:39:40] DEBUG Connecting to redis cluster
[Sep  9 20:39:40] DEBUG Redis pool already INITIALISED
[Sep  9 20:39:40]  INFO main: ----> Checking security policy: Token
[Sep  9 20:39:40] DEBUG Chain array start
[Sep  9 20:39:40] DEBUG URL Rewrite enabled
[Sep  9 20:39:40] DEBUG Chain array end
[Sep  9 20:39:40] DEBUG main: ----> Custom middleware processed
[Sep  9 20:39:40] DEBUG Chain completed
[Sep  9 20:39:40] DEBUG main: ----> Rate limits available at: /test-api/tyk/rate-limits/
[Sep  9 20:39:40]  INFO main: ----> Setting Listen Path: /test-api/
[Sep  9 20:39:40] DEBUG Subrouter done
[Sep  9 20:39:40] DEBUG tempSpecRegister done
[Sep  9 20:39:40] DEBUG Checker host list
[Sep  9 20:39:40]  INFO host-check-mgr: Loading uptime tests...
[Sep  9 20:39:40] DEBUG host-check-mgr: --- Setting tracking list up
[Sep  9 20:39:40] DEBUG host-check-mgr: Reset initiated
[Sep  9 20:39:40] DEBUG [HOST CHECKER] Checker reset queued!
[Sep  9 20:39:40] DEBUG Checker host Done
[Sep  9 20:39:40]  INFO main: Initialised API Definitions
[Sep  9 20:39:40] DEBUG main: Loading policies
[Sep  9 20:39:40] ERROR policy: Couldn't load policy file: open policies: no such file or directory
[Sep  9 20:39:40]  INFO main: Gateway started (v2.2.0.13)
[Sep  9 20:39:40]  INFO main: --> Listening on address: 
[Sep  9 20:39:40]  INFO main: --> Listening on port: 8000

And here I made the same request for creating a new key:

[Sep  9 20:39:50] DEBUG [STORE] Getting WAS: PollerActiveInstanceID
[Sep  9 20:39:50] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:39:50] DEBUG [STORE] Getting: host-checker:PollerActiveInstanceID
[Sep  9 20:39:50] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:39:50] DEBUG host-check-mgr: Primary instance set, I am master
[Sep  9 20:39:50] DEBUG [STORE] SET Raw key is: PollerActiveInstanceID
[Sep  9 20:39:50] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:39:50] DEBUG [STORE] Setting key: host-checker:PollerActiveInstanceID
[Sep  9 20:39:50] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:39:50] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:00] DEBUG [STORE] Getting WAS: PollerActiveInstanceID
[Sep  9 20:40:00] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:00] DEBUG [STORE] Getting: host-checker:PollerActiveInstanceID
[Sep  9 20:40:00] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:00] DEBUG host-check-mgr: Primary instance set, I am master
[Sep  9 20:40:00] DEBUG [STORE] SET Raw key is: PollerActiveInstanceID
[Sep  9 20:40:00] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:00] DEBUG [STORE] Setting key: host-checker:PollerActiveInstanceID
[Sep  9 20:40:00] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:00] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:09]  INFO auth-mgr: Reset quota for key. inbound-key=****3224 key=quota-e1bfb97c
2016/09/09 20:40:09 http: panic serving 127.0.0.1:36756: runtime error: invalid memory address or nil pointer dereference
goroutine 114 [running]:
net/http.(*conn).serve.func1(0xc4205f4000)
	/usr/local/go/src/net/http/server.go:1491 +0x12a
panic(0xb07c00, 0xc420010120)
	/usr/local/go/src/runtime/panic.go:458 +0x243
main.createKeyHandler(0xf4d240, 0xc420118270, 0xc4200c60f0)
	/home/tyk/go/src/github.com/lonelycode/tyk/api.go:1337 +0x2933
main.CheckIsAPIOwner.func1(0xf4d240, 0xc420118270, 0xc4200c60f0)
	/home/tyk/go/src/github.com/lonelycode/tyk/middleware_api_security_handler.go:24 +0x2ae
net/http.HandlerFunc.ServeHTTP(0xc42046bb80, 0xf4d240, 0xc420118270, 0xc4200c60f0)
	/usr/local/go/src/net/http/server.go:1726 +0x44
github.com/gorilla/mux.(*Router).ServeHTTP(0xc4202ccc80, 0xf4d240, 0xc420118270, 0xc4200c60f0)
	/home/tyk/go/src/github.com/gorilla/mux/mux.go:98 +0x255
net/http.(*ServeMux).ServeHTTP(0x1006ee0, 0xf4d240, 0xc420118270, 0xc4200c60f0)
	/usr/local/go/src/net/http/server.go:2022 +0x7f
net/http.serverHandler.ServeHTTP(0xc4206a0000, 0xf4d240, 0xc420118270, 0xc4200c60f0)
	/usr/local/go/src/net/http/server.go:2202 +0x7d
net/http.(*conn).serve(0xc4205f4000, 0xf4dd00, 0xc42055e100)
	/usr/local/go/src/net/http/server.go:1579 +0x4b7
created by net/http.(*Server).Serve
	/usr/local/go/src/net/http/server.go:2293 +0x44d
[Sep  9 20:40:10] DEBUG [STORE] Getting WAS: PollerActiveInstanceID
[Sep  9 20:40:10] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:10] DEBUG [STORE] Getting: host-checker:PollerActiveInstanceID
[Sep  9 20:40:10] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:10] DEBUG host-check-mgr: Primary instance set, I am master
[Sep  9 20:40:10] DEBUG [STORE] SET Raw key is: PollerActiveInstanceID
[Sep  9 20:40:10] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:10] DEBUG [STORE] Setting key: host-checker:PollerActiveInstanceID
[Sep  9 20:40:10] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:10] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:20] DEBUG [STORE] Getting WAS: PollerActiveInstanceID
[Sep  9 20:40:20] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:20] DEBUG [STORE] Getting: host-checker:PollerActiveInstanceID
[Sep  9 20:40:20] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:20] DEBUG host-check-mgr: Primary instance set, I am master
[Sep  9 20:40:20] DEBUG [STORE] SET Raw key is: PollerActiveInstanceID
[Sep  9 20:40:20] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:20] DEBUG [STORE] Setting key: host-checker:PollerActiveInstanceID
[Sep  9 20:40:20] DEBUG Input key was: host-checker:PollerActiveInstanceID
[Sep  9 20:40:20] DEBUG Input key was: host-checker:PollerActiveInstanceID
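As an aside, the ERROR line about the policy file appears to come from the relative "policy_record_name": "policies" in the tyk.conf above. Pointing it at an absolute path should avoid it (the exact file name below is an assumption for illustration):

```json
"policies": {
    "policy_source": "file",
    "policy_record_name": "/opt/tyk-gateway/policies/policies.json"
}
```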

#8

It's also interesting that my dashboard shows the APIs I created with the dashboard but doesn't mention the Tyk Test API, while this debug log only mentions the Tyk Test API and doesn't show the others.
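A plausible explanation for the mismatch, though this is an assumption rather than a confirmed fix: the tyk.conf above has "use_db_app_configs": false, so the gateway loads only the file-based API definitions in app_path and never pulls the APIs created in the dashboard. Enabling database app configs might look like this (the connection string assumes the dashboard on port 9000, matching the dashboard config):

```json
"use_db_app_configs": true,
"db_app_conf_options": {
    "connection_string": "http://127.0.0.1:9000",
    "node_is_segmented": false,
    "tags": []
}
```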

I’m sharing the dashboard config:

{
    "listen_port": 9000,
    "tyk_api_config": {
        "Host": "http://127.0.0.1",
        "Port": "8000",
        "Secret": "352d20ee67be67f6340b4c0605b044b7"
    },
    "mongo_url": "mongodb://127.0.0.1/tyk_analytics",
    "page_size": 10,
    "admin_secret": "12345",
    "shared_node_secret": "352d20ee67be67f6340b4c0605b044b7",
    "redis_port": 6379,
    "redis_host": "localhost",
    "redis_password": "",
    "enable_cluster": false,
    "force_api_defaults": false,
    "notify_on_change": true,
    "license_key": "eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.eyJhbGxvd2VkX25vZGVzIjoiOWNmYTUwMTMtNzBkYS00OGZiLTVhNTMtZDNiNzBhMTBlZWJhIiwiZXhwIjoxNTAyMzI0MjgyLCJvd25lciI6IjU3YjEwYTNhNDVmOTJlNjY4OTAwMDIzNCJ9.MYTSHKPI0CzfjBYEESsdzo8Uvdveit0WeZ3haWl0oQinPW3N-O59dMrc9Y4OPbFNryMuP5HlhiRXa_-DsmodaFs5-BIyP3V3T7hVazl5zDfHq4LT2NBVhOYiBoucHKZAW24-GO-6cTu2ew0x7b__gy7Ewgmo3Ob98ClMUqfn7ZC5rjXvWQ1qeQT4G_lza86J5RAm9ytmXFqmowijUTcdx3rwwpOyOIVxJBZHFobXClsTu4vntE9K8ZJgzpHGs01jn3gufF5gUKyIjlziB-tEqRbOROd1qFP-8ytMzLGpdMLTEzsWEeeMgZfgKuSUlA81y5fJO_uoo_z6DAsWTxZn5w",
    "redis_database": 0,
    "redis_hosts": null,
    "hash_keys": true,
    "email_backend": {
        "enable_email_notifications": false,
        "code": "",
        "settings": null,
        "default_from_email": "",
        "default_from_name": ""
    },
    "hide_listen_path": false,
    "sentry_code": "",
    "sentry_js_code": "",
    "use_sentry": false,
    "enable_master_keys": false,
    "enable_duplicate_slugs": true,
    "show_org_id": true,
    "host_config": {
        "enable_host_names": true,
        "disable_org_slug_prefix": true,
        "hostname": "dashboard.mapcat.dev",
        "override_hostname": "127.0.0.1",
        "portal_domains": {},
        "portal_root_path": "/portal",
        "generate_secure_paths": false
    },
    "http_server_options": {
        "use_ssl": false,
        "certificates": [
            {
                "domain_name": "",
                "cert_file": "",
                "key_file": ""
            }
        ],
        "min_version": 0
    },
    "ui": {
        "languages": {
            "Chinese": "cn",
            "English": "en",
            "Korean": "ko"
        },
        "hide_help": false,
        "default_lang": "en",
        "login_page": {},
        "nav": {},
        "uptime": {},
        "portal_section": null,
        "designer": {},
        "dont_show_admin_sockets": false,
        "dont_allow_license_management": false,
        "dont_allow_license_management_view": false
    },
    "home_dir": "/opt/tyk-dashboard",
    "identity_broker": {
        "enabled": false,
        "host": {
            "connection_string": "http://localhost:3010",
            "secret": "934893845123491238192381486djfhr87234827348"
        }
    },
    "tagging_options": {
        "tag_all_apis_by_org": false
    },
    "use_sharded_analytics": false,
    "enable_aggregate_lookups": true,
    "aggregate_lookup_cutoff": "01/07/2016",
    "maintenance_mode": false,
    "allow_explicit_policy_id": false
}

And the pump config too:

{
    "analytics_storage_type": "redis",
    "analytics_storage_config": {
        "type": "redis",
        "host": "localhost",
        "port": 6379,
        "hosts": null,
        "username": "",
        "password": "",
        "database": 0,
        "optimisation_max_idle": 100,
        "optimisation_max_active": 0,
        "enable_cluster": false
    },
    "purge_delay": 10,
    "pumps": {
        "mongo": {
            "name": "mongo",
            "meta": {
                "collection_name": "tyk_analytics",
                "mongo_url": "mongodb://127.0.0.1/tyk_analytics"
            }
        },
        "mongo-pump-aggregate": {
            "name": "mongo-pump-aggregate",
            "meta": {
                "mongo_url": "mongodb://127.0.0.1/tyk_analytics",
                "use_mixed_collection": true
            }
        }
    },
    "uptime_pump_config": {
        "collection_name": "tyk_uptime_analytics",
        "mongo_url": "mongodb://127.0.0.1/tyk_analytics"
    },
    "dont_purge_uptime_data": false
}

#9

FYI: I started a brand new AWS installation and got it to work. I don't know what I did differently, and that's my problem with the configuration documentation. Your reply was greatly helpful: I did the setup in a trial-and-error way and used the debug messages to check whether the gateway could communicate with the dashboard. The debug messages also helped me quickly put things behind nginx.

I guess I found a bug above. If you need help figuring it out, I'm planning to keep the non-working configuration around for a little while.