Tips and Tricks for using S3 presigned POST urls on AWS


Borislav Hadzhiev

Wed Apr 21 2021 · 3 min read


Photo by NOAA


Reasons for using S3 Presigned POST urls on AWS #

When allowing users to upload files to an S3 bucket, we almost certainly want to limit the size of the files they can upload.

We can't do that with s3.getSignedUrl, but we can with s3.createPresignedPost, whose API is slightly more complex, but not by much.

The flow of using a presigned url is the same regardless - your frontend makes a request to your backend, possibly specifying the content type of the file you want to upload. The backend responds with the presigned url, which is valid for a specified amount of time, and your frontend uploads the file using the presigned url.

This flow lets you avoid sending the file to your backend and then on to S3 - instead, the frontend uploads directly to S3.
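As a sketch of the backend half of this flow - the helper name `buildPresignedPostParams` and its arguments are hypothetical, adapt them to your own API - the Lambda would assemble a params object like this and pass it to `s3.createPresignedPost` (AWS SDK v2):

```typescript
// Hypothetical helper that assembles the createPresignedPost params.
// All argument names are illustrative.
type PresignedPostParams = {
  Bucket: string;
  Fields: {key: string; acl: string};
  Conditions: (string | number)[][];
  Expires: number;
};

function buildPresignedPostParams(
  bucketName: string,
  filePath: string,
  fileType: string,
  identityId: string,
): PresignedPostParams {
  return {
    Bucket: bucketName,
    Fields: {key: filePath, acl: 'public-read'},
    Conditions: [
      ['content-length-range', 0, 1000000], // limit uploads to 0-1MB
      ['eq', '$Content-Type', fileType], // only the requested content type
      ['starts-with', '$key', identityId], // keys scoped to the uploading user
    ],
    Expires: 15, // seconds the signed policy stays valid
  };
}

// With the AWS SDK v2 the Lambda would then call (not executed here):
//   const s3 = new AWS.S3();
//   const {url, fields} = s3.createPresignedPost(params);
// and return `url` and `fields` to the frontend.
```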

Things to note when using S3 Presigned Post URLs #

  • the S3 bucket must have CORS enabled to allow uploads from a web application. Since your frontend is served from a different domain, you must enable CORS on the bucket to allow requests from that domain. For example, in CDK:
```ts
import * as s3 from '@aws-cdk/aws-s3';
import * as cdk from '@aws-cdk/core';
import {DOMAIN_NAME} from '../../globals';

export class UploadsBucketConstruct extends cdk.Construct {
  public readonly s3Bucket: s3.Bucket;

  constructor(scope: cdk.Construct, id: string) {
    super(scope, id);

    this.s3Bucket = new s3.Bucket(this, id, {
      cors: [
        {
          allowedMethods: [
            s3.HttpMethods.GET,
            s3.HttpMethods.POST,
            s3.HttpMethods.PUT,
          ],
          allowedOrigins: [DOMAIN_NAME],
          allowedHeaders: ['*'],
          maxAge: 3000,
        },
      ],
    });
  }
}
```
  • the Lambda that requests the presigned URL from S3 must have the s3:PutObject permission (and optionally s3:PutObjectAcl) on the bucket.
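In CDK this can be granted with the bucket's grant methods - a sketch assuming the `UploadsBucketConstruct` above and a hypothetical `presignLambda` function:

```ts
// Sketch: grant the Lambda write access to the uploads bucket.
// `uploadsBucket` and `presignLambda` are hypothetical names.
const uploadsBucket = new UploadsBucketConstruct(this, 'uploads-bucket');
uploadsBucket.s3Bucket.grantPut(presignLambda);
// In CDK v1, grantPut attaches s3:PutObject* (which covers
// s3:PutObjectAcl) and s3:Abort* for the bucket's objects.
```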

  • the conditions in the params object passed to s3.createPresignedPost must be met - i.e. if you restrict the Content-Type and your frontend attempts to upload a file with a different Content-Type, the upload will fail with an error.

```ts
const params = {
  Bucket: bucketName,
  Fields: {
    key: filePath,
    acl: 'public-read',
  },
  Conditions: [
    // content length restrictions: 0-1MB
    ['content-length-range', 0, 1000000],
    // specify content-type to be more generic - images only
    // ['starts-with', '$Content-Type', 'image/'],
    ['eq', '$Content-Type', fileType],
    ['starts-with', '$key', identityId],
  ],
  // number of seconds for which the presigned policy should be valid
  Expires: 15,
};
```
  • to make the file publicly readable, you can set the acl in the Fields; public-read means that anyone who has the link can access and view the file. This is why the Lambda needs the s3:PutObjectAcl permission on the bucket.

  • the default expiration for the presigned POST policy is one hour (3600 seconds), but you most likely want to set a shorter one - the params above use 15 seconds.

  • once the frontend has the presigned URL, it can make a POST request to S3, with the fields from the Lambda response included as FormData:

```ts
import {client} from '@utils/api-client';

export async function uploadToS3({
  fileType,
  fileContents,
}: {
  fileType: string;
  fileContents: File;
}) {
  const presignedPostUrl = await getPresignedPostUrl(fileType);

  const formData = new FormData();
  formData.append('Content-Type', fileType);
  Object.entries(presignedPostUrl.fields).forEach(([k, v]) => {
    formData.append(k, v);
  });
  formData.append('file', fileContents); // The file must be the last element

  const response = await fetch(presignedPostUrl.url, {
    method: 'POST',
    body: formData,
  });
  if (!response.ok) {
    throw new Error(
      'Invalid file upload, check that your file size is less than 1MB.',
    );
  }

  return presignedPostUrl.filePath;
}

type PresignedPostUrlResponse = {
  url: string;
  fields: {
    key: string;
    acl: string;
    bucket: string;
  };
  filePath: string;
};

async function getPresignedPostUrl(fileType: string) {
  const presignedPostUrl = await client<PresignedPostUrlResponse>(
    `get-presigned-url-s3?fileType=${fileType}`,
  );
  return presignedPostUrl;
}
```
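Since the content-length-range condition only rejects an oversized file after the POST reaches S3, a small client-side pre-check - a hypothetical helper mirroring the 1MB limit above - lets the frontend fail fast before even requesting a presigned URL:

```typescript
// Hypothetical pre-check mirroring the backend's
// ['content-length-range', 0, 1000000] condition.
const MAX_UPLOAD_BYTES = 1000000;

export function isWithinUploadLimit(
  sizeInBytes: number,
  maxBytes: number = MAX_UPLOAD_BYTES,
): boolean {
  return sizeInBytes >= 0 && sizeInBytes <= maxBytes;
}

// Usage before calling uploadToS3:
//   if (!isWithinUploadLimit(file.size)) {
//     // show an error instead of uploading
//   }
```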
