Getting started with a Python AWS Lambda function?

After creating a new SST application with:

npx create-sst@latest my-sst-app

There is a lambda function provided as a “hello world” example in /packages/functions/src/lambda.ts.
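
For reference, that TypeScript handler is roughly along these lines (going from memory, so the exact body may differ in your version of the template):

import { ApiHandler } from "sst/node/api";

export const handler = ApiHandler(async (_evt) => {
  return {
    statusCode: 200,
    body: "Hello world.",
  };
});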

I have some Python code that I want to run as an API. But to get started, here is a little hello world in Python:

import json

def lambda_handler(event, context):
    print("Hello from Lambda!")
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }

How do I go about deploying this? Do I just remove lambda.ts and replace it with lambda.py, or is there more to it? Thanks.

OK, so that was not so obvious, but here is what I did to make it work:

import { StackContext, Api, Function } from "sst/constructs";

export function API({ stack }: StackContext) {
  const api = new Api(stack, "api", {
    routes: {
      "GET /": {
        function: new Function(stack, "PythonDemo", {
          handler: "packages/functions/src/lambda.hello", // Not lambda.py.hello
          runtime: "python3.11",
        })
      }
    }
  });

  stack.addOutputs({
    ApiEndpoint: api.url,
  });
}

The other important thing is that, in order for SST to recognize that it is dealing with a Python handler, it looks for a requirements.txt (or a Pipfile or poetry.lock).

So I also created a very simple requirements.txt at packages/functions/src/requirements.txt. All it had in it was:

json

And that enabled my Python AWS Lambda function to run.

Made a little PR for the docs: "Added some more details on creating a Python Lambda handler" by rupertlssmith (Pull Request #3467, sst/sst on GitHub).

Normally you would use *file-path*/*file-name*.*function-name* rather than including the runtime's file extension (.py in your case). I have been working with serverless for almost three years now and have always seen the .handler form as the convention.

The .handler convention is used to point directly to the specific function (handler) within the code that will be invoked by the Lambda service when the function is triggered.

Sorry, I do not understand what you are getting at. Are you saying that I should be including the file extension, like 'lambda.py.hello', since the file is named 'lambda.py'? It seems to work fine with just 'lambda.hello'.

There is no exported function in Python; 'hello' is just the name I chose for the function in the Python code.

The above works fine in dev with npx sst dev. But when I try to deploy to production with npx sst deploy --stage=production, it fails because the deployment package is too large.

It builds using this image:

Sending build context to Docker daemon  4.608kB
Step 1/11 : ARG IMAGE=amazon/aws-sam-cli-build-image-python3.7

and then fails:

API PythonDemo AWS::Lambda::Function CREATE_FAILED Resource handler returned message: "Unzipped size must be smaller than 262144000 bytes (Service: Lambda, Status Code: 400, Request ID: f443d341-8ee1-46b7-9803-328463f514a5)" (RequestToken: 1300324e-265e-bc7d-ff92-c75fa8fb56e0, HandlerErrorCode: InvalidRequest)

This image, amazon/aws-sam-cli-build-image-python3.7, is about 650 MB, so too big; the 262144000 bytes in the error is Lambda's 250 MB limit on the unzipped deployment package. Is there some way to tell it not to use a custom image and just use whatever the default image is for Python AWS Lambdas? Or am I going to be forced down the route of deploying a custom image to run under Fargate?

Sorry, I just edited it and removed the unclear parts; I can see how it was confusing.

Normally you would use lambda.functionName, for example in your case lambda.lambda_handler (from your example code), but I missed the part where you said that you changed the function name to hello.
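
For example, if you had kept the lambda_handler name from your first snippet, the route would be wired up something like this (just a sketch, using the same file path and runtime as your stack code above):

import { StackContext, Api, Function } from "sst/constructs";

export function API({ stack }: StackContext) {
  new Api(stack, "api", {
    routes: {
      "GET /": {
        function: new Function(stack, "PythonDemo", {
          // handler string = file path without the .py extension + "." + the Python function name
          handler: "packages/functions/src/lambda.lambda_handler",
          runtime: "python3.11",
        }),
      },
    },
  });
}

Either way, the .py extension itself never goes into the handler string.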


I was able to deploy the hello world example by starting from this template:

npx create-sst@latest --template=other/python

Rather than starting from:

npx create-sst@latest my-sst-app

Which is encouraging.

One thing I don't understand is why it is using the amazon/aws-sam-cli-build-image-python3.7 image when I have set the runtime to python3.9. There is of course a published 3.9 build image from Amazon, which seems likely to be what I should build on top of.

Actually, I think I see what I was doing wrong when the deployment was too large. I had set up a venv, but the venv folder was under packages/functions/venv, so I think it must have gotten sucked into the deployment package.

Now I have put the venv at the top level of the application, and it works!
