Configuring your Rails app via environment variables works well, but sometimes you want to be able to update your configuration on the fly. Here's a way to update your app's environment using SSM Parameter Store.

Why would you want to do this? Well, say you deploy your Rails app to an EC2 instance that's part of an autoscaling group. To get the fastest boot times, you should create a custom AMI with your code already on it (a.k.a. a golden image) that the autoscaling group can use when it's time to boot a new instance. Unfortunately, if you store your configuration on the image (and use something like dotenv to load it), you'll need to create a new AMI every time you have a configuration change. You can work around this by storing your configuration in SSM Parameter Store and letting your app fetch it at boot time.

Putting data into Parameter Store is easy enough -- you can use the CLI or the AWS console to edit the variables. But how do you get the data back out for your app to use? One way to do it is to fetch these parameters into your ENV via an initializer, like so:

# Note: the response is paginated, so iterate the pages in case you have more than ten parameters
Aws::SSM::Client.new.get_parameters_by_path(path: "/honeybadger/#{Rails.env}/", recursive: true, with_decryption: true).each do |page|
  page.parameters.each do |param|
    ENV[param.name.split("/").last] = param.value
  end
end

Assuming you have parameters named like /honeybadger/production/MY_API_KEY, this snippet will result in ENV["MY_API_KEY"] having whatever value you supplied for that parameter.
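If you haven't created that parameter yet, one CLI call will do it -- here's a sketch, with a placeholder value:

aws ssm put-parameter \
  --name "/honeybadger/production/MY_API_KEY" \
  --type SecureString \
  --value "whatever-your-api-key-is"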

But what if loading the environment variable values in an initializer is too late in the Rails boot process? What if you store settings like DATABASE_URL in SSM and need them available before your app loads? For that, you can save the variables to a .env file and let dotenv handle the loading. But first, let's set up the database and store our DATABASE_URL value.

Here's a Terraform snippet that creates an RDS instance and stores the connection URL in SSM Parameter Store:

resource "aws_db_instance" "db" {
  allocated_storage      = 20
  storage_type           = "gp2"
  engine                 = "postgres"
  engine_version         = "11.4"
  password               = "${var.database_password}"
  name                   = "honeybadger"
  username               = "honeybadger"
}

resource "aws_ssm_parameter" "database_url" {
  name  = "/honeybadger/${var.environment}/DATABASE_URL"
  type  = "SecureString"
  value = "postgres://${aws_db_instance.db.username}:${var.database_password}@${aws_db_instance.db.endpoint}/${aws_db_instance.db.name}"
}
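After a terraform apply, you can sanity-check the stored value from the CLI (substituting your environment name for production):

aws ssm get-parameter --name /honeybadger/production/DATABASE_URL \
  --with-decryption --query "Parameter.Value" --output text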

With the following shell command, you can grab all the parameters (just like we did with the Ruby snippet above) and get them ready for use as environment variables:

aws ssm get-parameters-by-path --path /honeybadger/production/ \
  --recursive --with-decryption --output text \
  --query "Parameters[].[Name,Value]" |
  sed -E 's#/honeybadger/production/([^[:space:]]*)[[:space:]]*#export \1=#' \
  > /home/honeybadger/shared/.env.production.local

Now you have a .env.production.local file that dotenv can load -- assuming it gets symlinked into your app's current directory at deploy time if you're using Capistrano. As a bonus, you can also source that env file to have variables like $DATABASE_URL defined for you in any shell scripts you want to run.
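For example, a one-off maintenance script might look something like this (the psql query is purely illustrative):

#!/bin/bash
# Load the exported variables written by the command above
source /home/honeybadger/shared/.env.production.local

# $DATABASE_URL (and any other parameters) are now available to whatever this script runs
psql "$DATABASE_URL" -c "SELECT count(*) FROM users;"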

We put that shell command in our deployment script (which is triggered by our CI/CD pipeline), so any new code that goes to production will pick up any changes made to our parameters in SSM. Now we get to have our golden image, and we don't have to build a new one for every configuration change. 😎
