A platform-agnostic way of accessing credentials in Python.
Even though AWS enables fine-grained access control via IAM roles, our scripts sometimes need credentials for external resources unrelated to AWS, such as API keys, database credentials, or passwords of any kind. There are myriad ways of handling such sensitive data. In this article, I’ll show you a simple and effective way to manage it using AWS and Python.
Depending on your execution platform (Kubernetes, on-prem servers, distributed cloud clusters) or version control hosting platform (GitHub, Bitbucket, GitLab, Gitea, SVN, …), you may use a different method to manage confidential access data. Here is a list of the most common ways to handle credentials that I’ve encountered so far:
All of the above solutions are perfectly feasible, but in this article, I want to demonstrate an alternative: leveraging AWS Secrets Manager. This method is secure (encrypted using AWS KMS) and works the same way whether you run your Python script locally, in AWS Lambda, or on a standalone server, provided that your execution platform is authorized to access AWS Secrets Manager.
We will perform the following steps:
If you want to follow along, go to https://www.alphavantage.co/ and get your API key.
Alpha Vantage — image by the author
First, make sure that you have configured the AWS CLI with an IAM user that is allowed to interact with AWS Secrets Manager. Then, you can store the secret with a single command in your terminal:
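The original command isn’t shown here, but a `create-secret` call along these lines would do it (the secret name `alpha-vantage-api-key` and the JSON key `api_key` are my own placeholders):

```bash
# Store the Alpha Vantage API key as a new secret
aws secretsmanager create-secret \
    --name alpha-vantage-api-key \
    --description "API key for the Alpha Vantage stock market data API" \
    --secret-string '{"api_key": "YOUR_API_KEY"}'
```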
To see whether it worked, you can list all secrets that you have in your account using:
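Something along these lines lists the secrets in your account:

```bash
aws secretsmanager list-secrets
```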
If your credentials should change later (ex. if you changed your password), updating the credentials is as simple as the following command:
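A sketch of the update, again using my placeholder secret name; `put-secret-value` writes a new version of an existing secret:

```bash
aws secretsmanager put-secret-value \
    --secret-id alpha-vantage-api-key \
    --secret-string '{"api_key": "YOUR_NEW_API_KEY"}'
```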
AWS Secrets Manager allows storing credentials as a JSON string. This means that a single secret can hold your entire database connection configuration: username, password, hostname, port, database name, etc.
The awswrangler package offers a method that deserializes this data into a Python dictionary. When combined with **kwargs, you could unpack all the credentials from a dictionary straight into your Python function performing authentication.
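As an illustration of the unpacking idea (with a made-up `connect` function and secret payload — nothing here calls AWS):

```python
import json

# A JSON secret string, as it might be stored in AWS Secrets Manager
secret_string = '{"user": "admin", "password": "s3cret", "host": "db.example.com", "port": 5432}'

# awswrangler would return this as a dict; here we deserialize it manually
credentials = json.loads(secret_string)

def connect(user, password, host, port):
    """Stand-in for a real database connection function."""
    return f"connected to {host}:{port} as {user}"

# **credentials unpacks every key of the dict into a named argument
print(connect(**credentials))
```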
My requirements.txt looks as follows (using Python 3.8):
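The exact file isn’t reproduced here, but it would contain the two packages used in this article (version pins are illustrative):

```
awswrangler
pandas-datareader
```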
Then, to retrieve the secret stored via the AWS CLI, you need just two lines:
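Assuming the secret was stored under the placeholder name used above, the retrieval looks roughly like this — `get_secret_json` returns the secret’s JSON string already deserialized into a dictionary:

```python
import awswrangler as wr

# e.g. {"api_key": "YOUR_API_KEY"}
secret = wr.secretsmanager.get_secret_json("alpha-vantage-api-key")
```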
There is a handy Python package called pandas_datareader that lets you easily retrieve data from various sources and store it as a pandas DataFrame. In the example below, we retrieve intraday Apple stock market data for the last two days. Note that we pass the API key from AWS Secrets Manager to authenticate with the Alpha Vantage data source.
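A sketch of that retrieval (the secret name is my placeholder; `"av-intraday"` is pandas_datareader’s Alpha Vantage intraday source):

```python
import awswrangler as wr
import pandas_datareader.data as web

# Fetch the API key from AWS Secrets Manager
secret = wr.secretsmanager.get_secret_json("alpha-vantage-api-key")

# Retrieve intraday Apple stock data from Alpha Vantage
df = web.DataReader("AAPL", "av-intraday", api_key=secret["api_key"])
print(df.head())
```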
Here is a dataframe that I got:
Apple stock market data — image by the author
Since we were able to access the credentials on a local machine, the next step is to do the same in AWS Lambda to demonstrate that this method is platform-agnostic and works in any environment that can run Python.
Side note: I’m using the new alternative way of packaging AWS Lambda function with a Docker container image. If you want to learn more about that, have a look at my previous article discussing it in more detail.
I use the following Dockerfile as a basis for my AWS Lambda function:
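The original Dockerfile isn’t shown here; a minimal sketch, assuming the AWS-provided Python 3.8 Lambda base image, would look like this:

```dockerfile
FROM public.ecr.aws/lambda/python:3.8

# Install the dependencies into the Lambda task root
COPY requirements.txt ./
RUN pip install -r requirements.txt

# Copy the function code and point Lambda at its handler
COPY src/lambda.py ./
CMD ["lambda.handler"]
```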
The script lambda.py inside of src directory looks as follows:
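A plausible sketch of the handler, reusing the placeholder secret name from earlier (the original script isn’t reproduced here):

```python
import awswrangler as wr
import pandas_datareader.data as web

def handler(event, context):
    # Retrieve the API key from AWS Secrets Manager
    secret = wr.secretsmanager.get_secret_json("alpha-vantage-api-key")

    # Pull intraday Apple stock data from Alpha Vantage
    df = web.DataReader("AAPL", "av-intraday", api_key=secret["api_key"])
    return {"rows": len(df)}
```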
To build and package the code to a Docker container, we use the commands:
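Roughly, with an illustrative image name:

```bash
docker build -t lambda-secrets-demo .
```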
Finally, we create an ECR repository and push the image to ECR:
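The commands would look roughly like this (repository name is my placeholder; account ID and region match the note below):

```bash
# Create the ECR repository (one-time step)
aws ecr create-repository --repository-name lambda-secrets-demo

# Authenticate Docker against ECR
aws ecr get-login-password --region eu-central-1 | \
    docker login --username AWS --password-stdin 123456789.dkr.ecr.eu-central-1.amazonaws.com

# Tag and push the image
docker tag lambda-secrets-demo:latest 123456789.dkr.ecr.eu-central-1.amazonaws.com/lambda-secrets-demo:latest
docker push 123456789.dkr.ecr.eu-central-1.amazonaws.com/lambda-secrets-demo:latest
```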
Note: Replace 123456789 with your AWS account ID. Also, adjust your AWS region accordingly — I’m using eu-central-1.
We are now ready to build and test our AWS Lambda function in the AWS management console.
Deployed AWS Lambda using AWS Secrets Manager for API Key retrieval — image by the author
If you are running multiple Lambda function workloads, it’s beneficial to consider using an observability platform that will help you keep an overview of all your serverless components. In the example below, I’m using Dashbird to obtain additional information about the Lambda function executed above, such as:
You can see in the image above that the first function execution had a cold start. The second one used 100% of memory. Those insights helped me to optimize the resources by increasing the allocated memory. In the subsequent invocation, my function ran faster and didn’t max out the total memory capacity.
Hopefully, you can see how easy it is to store and retrieve your sensitive data using this AWS service. Here is a list of benefits that this method gives you:
I have a policy of always providing pros and cons for any technology without sugar-coating anything. These are the risks and downsides I see so far in using this service to manage credentials at an enterprise scale:
In this article, we looked at AWS Secrets Manager as a way of managing credentials in Python scripts. We saw how easy it is to put, update, or list secrets using the AWS CLI. We then looked at how to access those credentials in Python using just two lines of code, thanks to the awswrangler package. Additionally, we deployed the script to AWS Lambda to demonstrate that this method is platform-agnostic. As a bonus, we looked at how to add observability to our Lambda functions using Dashbird. Finally, we discussed the pros and cons of AWS Secrets Manager as a way of managing credentials at an enterprise scale.
Further reading:
Python error handling in AWS Lambda
Top 3 tools for monitoring Python in AWS Lambda
How to save hundreds of hours on Lambda debugging