How to Delete a Folder or Files from an S3 Bucket

Borislav Hadzhiev

Last updated: Feb 26, 2024


# Table of Contents

  1. Delete an entire Folder from an S3 Bucket
  2. Filter which Files to Delete from an S3 Bucket

# Delete an entire Folder from an S3 Bucket

To delete a folder from an AWS S3 bucket, use the s3 rm command, passing it the path of the objects to be deleted along with the --recursive parameter which applies the action to all files under the specified path.

Let's first run the s3 rm command in test mode to make sure the output matches the expectations.

shell
aws s3 rm s3://YOUR_BUCKET/ --recursive --dryrun --exclude "*" --include "my-folder/*"


The output shows that all of the files in the specified folder would get deleted.

The folder itself also disappears: folders in S3 are just key prefixes, and a prefix with no objects under it no longer exists.

Note that the order of the --exclude and --include parameters matters. Filters passed later in the command have higher precedence.

We passed the following parameters to the s3 rm command:

| Name | Description |
| --- | --- |
| --recursive | applies the s3 rm command to all nested objects under the specified path |
| --dryrun | shows the command's output without actually running it |
| --exclude | we only want to delete the contents of a specific folder, so we first exclude all other paths in the bucket |
| --include | we then include the path that matches all of the files we want to delete |

Now that we've made sure the output from the s3 rm command is what we expect, let's run it without the --dryrun parameter.

shell
aws s3 rm s3://YOUR_BUCKET/ --recursive --exclude "*" --include "my-folder/*"


To verify all files in the folder have been successfully deleted, run the s3 ls command.

If the path no longer contains any objects, the command produces no output.

shell
aws s3 ls s3://YOUR_BUCKET/YOUR_FOLDER --recursive


It's always a best practice to run destructive commands like s3 rm with the --dryrun parameter first, so you can confirm the command does what you intend before actually running it.

# Filter which Files to Delete from an S3 Bucket

Here is an example of deleting multiple files from an S3 bucket with AWS CLI.

Let's run the command in test mode first. By setting the --dryrun parameter, we instruct the AWS CLI to only print the outputs of the s3 rm command, without actually running it.

shell
aws s3 rm s3://YOUR_BUCKET/YOUR_FOLDER/ --dryrun --recursive --exclude "*" --include "file1.txt" --include "file2.txt"


The output shows the names of the files that would get deleted, had we run the command without the --dryrun parameter.

Note that the order of the --exclude and --include parameters matters. Filters passed later in the command have higher precedence and override those that come before them.

This means that passing the --exclude "*" parameter after --include "file1.txt" would delete all files in the S3 bucket.
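The last-match-wins behavior can be sketched with a small shell function. This is a hypothetical simulation of how the CLI evaluates filters in order, not the CLI's actual implementation; the `decide` function and the `action:pattern` argument format are illustrative inventions:

```shell
#!/usr/bin/env bash
# Simulate aws s3 rm filter evaluation: every key starts out "included",
# then each exclude/include pattern is applied in order. The LAST filter
# that matches the key decides its fate.
decide() {
  local key="$1"; shift
  local state="include"   # by default, every object is included
  local filter action pattern
  for filter in "$@"; do
    action="${filter%%:*}"    # "exclude" or "include"
    pattern="${filter#*:}"    # the glob pattern after the colon
    case "$key" in
      $pattern) state="$action" ;;   # later filters override earlier ones
    esac
  done
  echo "$state"
}

decide "my-folder/a.txt" "exclude:*" "include:my-folder/*"                  # -> include
decide "other/b.txt"     "exclude:*" "include:my-folder/*"                  # -> exclude
decide "my-folder/c.png" "exclude:*" "include:my-folder/*" "exclude:*.png"  # -> exclude
```

The third call mirrors the extension-filter example later in this article: the trailing exclude matches last, so the .png file survives the deletion.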


In the example above, both of the files are located in the same folder. If they weren't, we would include the path to the files in the --include parameters:

shell
aws s3 rm s3://YOUR_BUCKET/ --dryrun --recursive --exclude "*" --include "folder1/file1.txt" --include "folder2/file2.txt"


To exit test mode and perform the s3 rm operation, simply remove the --dryrun parameter.
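For example, the first command in this section would become the following (with YOUR_BUCKET and YOUR_FOLDER still placeholders for your own bucket and folder names):

```shell
aws s3 rm s3://YOUR_BUCKET/YOUR_FOLDER/ --recursive --exclude "*" --include "file1.txt" --include "file2.txt"
```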

The --include parameter also accepts wildcard patterns.

For example, to delete all files with a .png extension under a specific prefix, use the following command.

shell
aws s3 rm s3://YOUR_BUCKET/ --recursive --exclude "*" --include "my-folder/*.png"


The output shows that the --include parameter matches files with the .png extension in nested directories.

The command deleted my-folder/image.png as well as my-folder/nested-folder/another-image.png.

Running the s3 rm command with an --include parameter that does not match any files produces no output.


The --include and --exclude parameters are used for fine-grained filtering.

The following command deletes all objects in the folder, except for objects with the .png extension:

shell
aws s3 rm s3://YOUR_BUCKET/ --recursive --exclude "*" --include "my-folder/*" --exclude "*.png"


The output of the s3 ls command shows that the image at the path my-folder/hook-flow.png has not been deleted.

If you wanted to preserve all .png and all .txt files, you would just add another --exclude "*.txt" flag at the end of the command.
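Continuing the example above (with the same placeholder bucket and folder names), preserving both extensions would look like this:

```shell
aws s3 rm s3://YOUR_BUCKET/ --recursive --exclude "*" --include "my-folder/*" --exclude "*.png" --exclude "*.txt"
```

Both trailing excludes come after the include, so they take precedence for any key ending in .png or .txt.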

The order of the --exclude and --include parameters is very important. For instance, if we reverse the order and pass --include "my-folder/*" before the --exclude "*" parameter, we would delete all of the files in the S3 bucket, because the exclude comes after the include and overrides it.

Finally, let's look at an example where we have the following folder structure:

shell
bucket
  my-folder-3/
    image.webp
    file.json
    nested-folder/
      file.txt
      file.json

We want to delete all of the files in the my-folder-3 directory, but preserve everything in its nested-folder.

shell
aws s3 rm s3://YOUR_BUCKET/ --recursive --exclude "*" --include "my-folder-3/*" --exclude "my-folder-3/nested-folder/*"


We can run the s3 ls command to verify the nested folder didn't get deleted.

shell
aws s3 ls s3://YOUR_BUCKET/my-folder-3/nested-folder --recursive --human-readable


The output shows that the nested folder was excluded successfully and has not been deleted.

# Additional Resources


I wrote a book in which I share everything I know about how to become a better, more efficient programmer.
You can use the search field on my Home Page to filter through all of my articles.