sharing a few lessons learned managing a few thousand secrets in AWS Parameter Store.
At my current gig, we've got 50+ services, 6+ environments, and roughly 3.5k parameters across them. We used Chef::EncryptedDataBags back when we ran chef-server. A few years later, we migrated to Vault, and a year after that we finally settled on Parameter Store. Over time, we've grown used to the intricacies of managing secrets and access to them.
If you've tried using the Parameter Store console, you'll know that the experience isn't great. But the ease/security of using it outweighs a few of its nuisances. Here's how we've dealt with some of these annoying bits.
the ui isn't that great to use.
One really annoying bit is that I can't easily search for parameters across the whole Parameter Store. For example, I can't search for all parameters containing "db_", i.e. *db_*.
A useful workaround: do a quick search of Parameter Store from the CLI. This requires fetching all the parameters, so it's a little slow, but it lets you apply a plain regex across the available paths (which you can't quite do in the UI) and gives you a quick preview of the secrets you have available.
# set REGEX to the pattern you're after, e.g. 'db_'
> aws ssm describe-parameters \
    --output text \
    | egrep '^PARAMETERS' \
    | awk '{print $5}' \
    | egrep "$REGEX"
/dev/myApp/foo
/dev/myApp/bar
...
anyone know of a better way of doing this?
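One thing on my list to try is letting the API do the filtering with describe-parameters' --parameter-filters flag, which (if I'm reading the filter options right) supports a Contains match on parameter names:

# server-side name filter; swap db_ for whatever you're hunting
aws ssm describe-parameters \
  --parameter-filters "Key=Name,Option=Contains,Values=db_" \
  --query "Parameters[].Name" \
  --output text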
keep your db connection parameters together as unique entities.
Use a convention like /$env/databases/$appDb/{host,user,password,port,dbname}, which includes all the fields necessary to build out the full database connection string.
This would let you control who has access to which sets of databases via IAM policies. You can also script it to automatically log you into a database without you needing to see the password.
here's an example of how you might do it in bash
# assumes $dbName and $REGION are already set
dbInfo=$(aws ssm get-parameters \
  --names "/dev/dbs/$dbName/database" \
          "/dev/dbs/$dbName/host" \
          "/dev/dbs/$dbName/password" \
          "/dev/dbs/$dbName/port" \
          "/dev/dbs/$dbName/user" \
          "/dev/dbs/$dbName/scheme" \
  --region "$REGION" \
  --with-decryption \
  --query "Parameters[*].Value" \
  --output text | tr "\t" " ")

# the positional mapping below relies on the values coming back ordered by
# parameter name: database, host, password, port, scheme, user
function set_parameters {
  database="$1"
  database_host="$2"
  database_pass="$3"
  database_port="$4"
  database_scheme="$5"
  database_user="$6"
}

set_parameters $dbInfo

if [ "$database_scheme" = "mysql" ]; then
  MYSQL_PWD=$database_pass mysql -h "$database_host" -P "$database_port" -u "$database_user" "$database"
else
  PGPASSWORD=$database_pass psql -h "$database_host" -p "$database_port" -U "$database_user" -d "$database"
fi
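If you drop that snippet into a script (db-login.sh is just an illustrative name), logging into any database you're allowed to read becomes a one-liner:

# hypothetical wrapper around the snippet above; it reads dbName and REGION from the environment
REGION=us-east-2 dbName=myAppDb ./db-login.sh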
stick to the naming convention
This is the convention we've used for our parameters, and it has scaled OK so far. We've got 10-20 databases and ~50 services spanning 6+ environments organized with this structure:
/$environment_name/databases/$database_name/{host,port,pass,user}
/databags/$service_name/{all,my,server,creds}
/other_sensitive_info/{foo,bar,baz}
We borrowed "databags" from our days using chef cookbooks & encrypted databags. It basically just means that there's a bunch of parameters under keys belonging to a service.
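Creating a secret under this convention is a one-liner; the path and value below are made up for illustration:

# store an encrypted value under the naming convention (illustrative path/value)
aws ssm put-parameter \
  --name "/dev/databags/myService/api_token" \
  --type SecureString \
  --value "s3kret" \
  --region us-east-2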
This structure lets us scope access along a few different dimensions via IAM policies. We can say that a developer should have access to database secrets in a dev environment, but only a specific service's secrets in prod.
{
  "Effect": "Allow",
  "Action": [
    "ssm:GetParameters"
  ],
  "Resource": [
    "arn:aws:ssm:us-east-2:123456123:parameter/dev/databases/*",
    "arn:aws:ssm:us-east-2:123456123:parameter/prod/databags/myService/*"
  ]
}
you can't stuff big items into parameters
We used to use the certificate cookbook to install certs on our hosts, which required us to keep our certificates stored in this format:
{
  "id": "mail",
  "cert": "-----BEGIN CERTIFICATE-----\nMail Certificate Here...",
  "key": "-----BEGIN PRIVATE KEY-----\nMail Private Key Here...",
  "chain": "-----BEGIN CERTIFICATE-----\nCA Root Chain Here..."
}
But you can't stuff this JSON into a single parameter because of the 4096-character limit on parameter values.
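One workaround (a sketch of what we could do, not what the cookbook does) is to split the blob into one parameter per field, each of which fits comfortably under the limit; the /dev/certs/mail/ prefix here is just an illustrative choice:

# split certs.json (the format above) into separate SecureString parameters
for field in cert key chain; do
  aws ssm put-parameter \
    --name "/dev/certs/mail/$field" \
    --type SecureString \
    --value "$(jq -r ".$field" certs.json)" \
    --overwrite
done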
use labels/pointers to help you rotate passwords
Password rotation is an inherently hard problem, but one strategy is to use pointers (or what AWS now calls labels) to secrets. For example, let's say you want to rotate an encryption key, but you have multiple services relying on it.
you could do something like
/dev/databags/my-service/current = 2   (index of the actual secret)
/dev/databags/my-service/1 = 'secret 2018'
/dev/databags/my-service/2 = 'secret 2019'
Your applications need to know to follow the pointer and read the intended secret. You'd typically do this by regenerating a config file and bouncing your service.
Fortunately, you can now do this natively on AWS with labels on parameters.
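Here's a rough sketch of what that looks like with the CLI; the parameter name is made up, and the name:label selector on get-parameter is the bit doing the pointer work:

# tag version 2 of the secret with a "current" label
aws ssm label-parameter-version \
  --name "/dev/databags/my-service/encryption-key" \
  --parameter-version 2 \
  --labels current

# consumers read whatever version the label currently points at
aws ssm get-parameter \
  --name "/dev/databags/my-service/encryption-key:current" \
  --with-decryption \
  --query "Parameter.Value" \
  --output text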
Rotating passwords can also be done using Lambda and Secrets Manager - check out this walkthrough here
param store keeps versions of your parameters, but not if you delete them
The docs indicate that parameters are versioned, and they are, but only while the parameter exists. If you accidentally delete a parameter, its history is gone with it. The consensus seems to be that you should periodically export your Parameter Store contents to S3 or DynamoDB, but I haven't come across tools in this space yet.
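In the meantime, a crude backup is easy enough to script yourself. Here's a sketch that dumps everything (decrypted, so treat the output file accordingly) and ships it to an S3 bucket; the bucket name is made up:

# dump every parameter (names + decrypted values), then snapshot it to S3
aws ssm get-parameters-by-path \
  --path / \
  --recursive \
  --with-decryption \
  --output json > params-backup.json

aws s3 cp params-backup.json "s3://my-param-backups/$(date +%F)/params-backup.json"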
some related tools that support param-store
secretly for exporting secrets into your environment, written in python - https://github.com/energyhub/secretly
parameter-store-exec - similar to secretly, written in go - https://github.com/cultureamp/parameter-store-exec
confd for putting into config files - https://github.com/kelseyhightower/confd
as part of a consul-template - https://github.com/hellofresh/consul-template-plugin-ssm
chamber for managing secrets including parameter store - https://github.com/segmentio/chamber
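As a taste of what these look like in practice, chamber's exec subcommand pulls a service's parameters into the environment before launching your process (the service and command names below are illustrative):

# read myService's parameters from Parameter Store, export them as env vars, then run the app
chamber exec myService -- ./start-my-app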
Hope these tips help. I'm also interested to hear how other people have worked around param-store's nuisances, and what cool integrations you've seen with it.
It's been hard to figure out what best practices are when it comes to devops on AWS. I'd like to help you figure it out too. Subscribe to my list to get these updates in your inbox.