Auto-instrumenting Functions with AWS Lambda Layers!

Erica Windisch
Published in the IOpipe Blog · 3 min read · Nov 29, 2018

We’re excited to be a launch partner for AWS Lambda Layers! This new feature allows developers to mix and match source code and other application resources into a single AWS Lambda function deployment.

Lambda layers let developers and operators add and enable the IOpipe libraries on your Lambda functions without directly modifying your application source code — or even redeploying it!

Updating the IOpipe libraries becomes as simple as an API call or an update to your CloudFormation stack. Layers also make it painless to audit which functions are running IOpipe and whether they’re using the latest version of our libraries.
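For instance, moving a function to a newer layer release can be a one-line template change: bumping the version suffix on the layer ARN. A sketch — the account ID and layer name below are placeholders, not IOpipe’s published values:

```yaml
# Was: ...:layer:iopipe-python:1
# Bumping the trailing version number picks up the new release on the
# next stack update; no application code changes or redeploys needed.
Layers:
  - arn:aws:lambda:us-east-1:123456789101:layer:iopipe-python:2
```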

Using AWS Layers

Developers can build their own layers, or integrate them from third parties such as IOpipe using an ARN, similar to using an AMI with EC2. Unlike AMIs, however, layers can be attached to existing functions, making the attachment model closer to that of EBS volumes or VPCs.

Each layer is defined as a zip file that is simply extracted into the /opt directory. When multiple layers are specified, they are extracted in order, with later layers overwriting files from earlier ones if there is a conflict.

Layers can be published using the AWS API, the AWS CLI, or via SAM templates. Functions can be updated or published as usual, with API parameters for specifying layers in these commands.
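As a sketch of the SAM route, a layer of your own can be declared alongside your functions. This assumes SAM’s AWS::Serverless::LayerVersion resource type; the resource name, layer name, and bucket below are placeholders:

```yaml
Resources:
  MySharedLibs:
    Type: 'AWS::Serverless::LayerVersion'
    Properties:
      LayerName: my-shared-libs
      ContentUri: s3://my-bucket/my-layer.zip  # this zip is extracted to /opt
      CompatibleRuntimes:
        - python3.6
```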

Using Layers to integrate IOpipe

Select the layer for your language runtime:

Layers are published as compatible with specific runtimes. When writing your SAM template, make sure that the layer and the runtime you specify match. At launch we support the following runtimes, with the specified layers:

IMPORTANT: The region where your function is deployed must match the region in the layer ARN you source. For this reason we have written $REGION in the above; replace it with us-west-2, us-east-1, etc., to match where your function is deployed.
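One way to keep the two in sync is to derive the region portion of the ARN from the stack itself using CloudFormation’s !Sub intrinsic. A sketch — the account ID and layer name are placeholders:

```yaml
# ${AWS::Region} resolves to the region the stack is deployed in,
# so the layer ARN always matches the function's region.
Layers:
  - !Sub 'arn:aws:lambda:${AWS::Region}:123456789101:layer:iopipe-python:1'
```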

Instrumenting a function with SAM:

Because layers are runtime- and language-dependent, many users will find that configuring layers on a per-function basis is the easiest way to get started with Lambda’s new layer support:

Resources:
  IOpipePythonExample:
    Type: 'AWS::Serverless::Function'
    Properties:
      CodeUri: s3://sam-demo-bucket/hello.zip
      Handler: iopipe.handler.wrapper
      Runtime: python3.6
      Layers:
        - arn:aws:lambda:us-east-1:123456789101:layer:layer1:1
      Environment:
        Variables:
          IOPIPE_HANDLER: your_function_handler.hello

Note that three changes are made here compared to a standard SAM template:

  1. The Layers field specifies the IOpipe layer to add to the function.
  2. The environment variable IOPIPE_HANDLER is set to your function’s handler method.
  3. The Handler parameter is set to iopipe.handler.wrapper, which instruments the handler specified by the IOPIPE_HANDLER environment variable.

Alternatively, users can skip these changes to the handler definition and instrument their code by importing and initializing the IOpipe libraries within their code, as classically done — while still using layers as the delivery and update mechanism for those libraries.
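In that model the function keeps its own handler and only pulls the library in from the layer. A sketch, with a placeholder account ID and layer name:

```yaml
Properties:
  Handler: your_function_handler.hello  # your own handler, not the wrapper
  Runtime: python3.6
  Layers:
    - arn:aws:lambda:us-east-1:123456789101:layer:iopipe-python:1
  # No IOPIPE_HANDLER variable: the code imports and initializes
  # the IOpipe library itself, as in a classic integration.
```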

Deploying to AWS

Because we’re using SAM, packaging and deployment are unchanged from the existing SAM workflow. For reference, however, here are the steps you’ll take:

$ sam package --template-file template.yml --s3-bucket my-bucket-name-$RANDOM --output-template-file sam.yml
$ sam deploy --template-file sam.yml --stack-name my-stack-name-$RANDOM --capabilities CAPABILITY_IAM

I recommend changing the names of your stack and S3 bucket, but I’ve added $RANDOM in there to keep them unique in case any of you try this without changing them 😅

Forging forward

Let us know what you think. Support for custom runtimes and layers in Lambda is new, and we’re still experimenting. Further work to simplify the auto-instrumentation of code and to provide a one-click experience for getting started with IOpipe is coming soon! We’re happy to take feedback and improve the experience. Thank you!
