Imported Google Group message.
Date: Tuesday, 14 April 2015 17:42:07 UTC+1.
Yep, that's pretty much how it works. The reason you need to pass an API ID into the key list endpoint is that Tyk needs to know which Session handler to query.
The way Tyk is built, you can add custom Session and Identity handlers and hook custom storage engines into those - the simple LDAP provider is an example. What happens here is that if you want a list of API keys from your Gateway, Tyk will check the API ID, get its definition, pull a reference to the in-memory storage driver, then query that for the list of keys.
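That lookup flow can be sketched roughly like this - note that every name here (the classes, fields, and the in-memory store standing in for Redis) is illustrative, not Tyk's actual internals:

```python
class InMemoryStorage:
    """Stands in for a storage driver (e.g. the Redis-backed one)."""
    def __init__(self):
        self.keys = {}

    def list_keys(self):
        return list(self.keys)


class ApiDefinition:
    """Each API definition holds a reference to its storage driver."""
    def __init__(self, api_id, storage):
        self.api_id = api_id
        self.storage = storage


def list_keys_for_api(api_definitions, api_id):
    # The API ID tells the gateway which definition - and therefore
    # which session-storage driver - to query for keys.
    definition = api_definitions[api_id]
    return definition.storage.list_keys()


# Two APIs sharing one storage engine will report the same keys.
shared = InMemoryStorage()
shared.keys["abc123"] = {"allowance": 1000}
apis = {
    "api-a": ApiDefinition("api-a", shared),
    "api-b": ApiDefinition("api-b", shared),
}
```

Querying either API ID here returns the same key list, because both definitions point at the same storage engine - which is exactly the behaviour described below.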
If API A and API B both use the Redis storage engine (they will), then you will get the same keys, because the same engine is being queried.
API keys are not segmented by API ID the way they are by Org ID (and even then that only works in non-hashed key mode). Since a single key can give access to multiple APIs based on its access control settings, keys are only segmented by back end.
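To see why per-API segmentation doesn't fit, here is the approximate shape of a key's session object - one key carrying access rights for several APIs. The field names follow the general shape of a Tyk session (access rights keyed by API ID), but treat the details as an approximation:

```python
# Approximate shape of a key's session object: one key, multiple APIs.
# Field names are illustrative, not an exact Tyk schema.
session = {
    "org_id": "default",
    "access_rights": {
        "api-a": {"api_id": "api-a", "api_name": "API A", "versions": ["Default"]},
        "api-b": {"api_id": "api-b", "api_name": "API B", "versions": ["Default"]},
    },
}


def apis_accessible(session):
    # The key itself decides which APIs it can reach, so the key
    # can't "belong" to any single API.
    return sorted(session["access_rights"])
```

Because access control lives on the key rather than in a per-API key store, segmenting by API ID would have no single right answer for a key like this one.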
A further note on that: if you enable hashed key mode (in the latest builds), then key listing goes right out the window, because all segmentation is encoded away. Keys can then only be accessed raw (via Redis), or if you know the original key (i.e. you have taken the risk upon yourself of storing the raw API key). We get around this in the latest version by having a concept of a developer: developers have (hashed) keys, and those keys can be revoked through a special API call that deletes the hashed representation. Anyway, I digress.
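Hashed key mode can be sketched like this - the store holds only a hash of each key, so you can still revoke a key if you held on to the raw value (re-hash it and delete that entry), but you can't enumerate the originals. The hash function and store here are illustrative; Tyk's actual hashing differs:

```python
import hashlib


def key_hash(raw_key):
    # Illustrative hash; Tyk's actual algorithm differs.
    return hashlib.sha256(raw_key.encode()).hexdigest()


class HashedKeyStore:
    def __init__(self):
        self.sessions = {}  # hashed key -> session data

    def add(self, raw_key, session):
        # Only the hash is stored; the raw key is never kept.
        self.sessions[key_hash(raw_key)] = session

    def revoke(self, raw_key):
        # Revocation only works if you still hold the raw key:
        # re-hash it and delete the hashed representation.
        self.sessions.pop(key_hash(raw_key), None)


store = HashedKeyStore()
store.add("my-raw-key", {"allowance": 1000})
store.revoke("my-raw-key")
```

Listing `store.sessions` would only ever give you hashes, which is why a key-listing endpoint has nothing useful to return in this mode.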
The reason you see this behaviour is that both APIs use the same back end. It's odd, but it means that keys can exist across multiple data providers if necessary - it depends on the integrator.
If you are looking at controlling keys on a per-API level, I'd suggest using the policies feature (in master atm) - this will let you control keys en masse by changing a few settings on a key profile.
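The idea behind policies, roughly: keys reference a shared policy profile, so editing the profile changes every key that points at it in one go. All names below are illustrative, not Tyk's exact policy schema:

```python
# Illustrative sketch of policy-based key control: each key points at a
# policy profile, so editing the profile affects all keys en masse.
policies = {
    "standard": {"rate": 100, "per": 60, "active": True},
}

keys = {
    "key-1": {"apply_policy_id": "standard"},
    "key-2": {"apply_policy_id": "standard"},
}


def effective_settings(key_id):
    # A key's effective limits come from its policy profile,
    # not from settings stored on the key itself.
    return policies[keys[key_id]["apply_policy_id"]]


# Disable the policy once; every key that references it is affected.
policies["standard"]["active"] = False
```

Changing one profile here flips both keys at once - that's the en-masse control the post is suggesting.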