Comments for Upload a File to S3

Hello, I’m running into an issue at this point in the tutorial. It seems a few people before me had the same issue. I tried adding the AmazonS3FullAccess policy, but that didn’t work. My IAM policy and the permissions for my S3 bucket look good to me.

Code for backend is here and code for frontend is here.

Here’s the error in my Chrome console.

Please let me know if you would like to see anything else or if there are any solutions you have in mind. Thank you!

Hmm, have you had a chance to check the CORS settings for your S3 bucket?

I encountered the same problem today while going through this tutorial, and my Chrome console shows error messages similar to the ones @frontendsomething posted.

I used the eu-central-1 region, but otherwise proceeded as instructed. My S3 bucket has the following CORS configuration:

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>HEAD</AllowedMethod>
    <AllowedMethod>DELETE</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>*</AllowedHeader>
</CORSRule>
</CORSConfiguration>

That looks good. Can you check the IAM role that is created in this chapter: https://serverless-stack.com/chapters/create-a-cognito-identity-pool.html? It needs to have the right S3 bucket name.

How does Storage.vault.put() know the target directory to upload the file to? As I understand it, we already set the S3 bucket name when we call Amplify.configure(), but we didn’t tell Amplify which directory in that bucket we want to upload the file to. Why does Amplify automatically create a private/${cognito-identity.amazonaws.com:sub} directory in that bucket?

That is the convention Amplify follows, and we followed it as well while creating the bucket and the IAM roles.
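To make the convention concrete: with the vault (private) storage level, Amplify prefixes the key with private/ and the caller’s Cognito identity ID. This is a sketch of the key layout, not Amplify’s actual implementation; the helper name is made up for illustration.

```javascript
// Sketch of the object-key convention Amplify's private ("vault") level
// follows. identityId is the signed-in user's Cognito identity pool ID.
function privateObjectKey(identityId, filename) {
  return `private/${identityId}/${filename}`;
}

console.log(privateObjectKey("us-east-1:1234-abcd", "photo.jpg"));
// → "private/us-east-1:1234-abcd/photo.jpg"
```

This is also why the IAM policy scopes the resource to private/${cognito-identity.amazonaws.com:sub}/* — the policy variable resolves to the same identity ID at request time.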

I am facing the same issue. I get an Access Denied error when I try to upload:

HTTP403: FORBIDDEN - The server understood the request, but is refusing to fulfill it.

Below is my IAM Policy:

    {
        "Sid": "VisualEditor2",
        "Effect": "Allow",
        "Action": "s3:*",
        "Resource": [
            "arn:aws:s3:::usernotes-attachments/private/${cognitoidentity.amazonaws.com:sub}/*",
            "arn:aws:s3:::usernotes-attachments"
        ]
    }

Below is CORS configuration:

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>HEAD</AllowedMethod>
    <AllowedMethod>DELETE</AllowedMethod>
    <MaxAgeSeconds>300000</MaxAgeSeconds>
    <AllowedHeader>*</AllowedHeader>
</CORSRule>
</CORSConfiguration>

I am using aws-amplify as shown in the sample code:

import { Storage } from "aws-amplify";

export async function s3Upload(file) {
	const filename = `${Date.now()}-${file.name}`;
	const stored = await Storage.vault.put(filename, file, {
		contentType: file.type
	});
	return stored.key;
}

Any idea why this is failing?

Sorry, it was a typo in my IAM policy: the "-" was missing in "cognito-identity". The file is getting uploaded now. Thanks.
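For anyone else hitting this, the corrected Resource lines (with the hyphen restored in cognito-identity) would read:

```json
"Resource": [
    "arn:aws:s3:::usernotes-attachments/private/${cognito-identity.amazonaws.com:sub}/*",
    "arn:aws:s3:::usernotes-attachments"
]
```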


Thanks for reporting back. Glad you figured it out.


I am receiving the following errors:

Failed to load https://notes-app-uploads-jdccr.s3.us-east-2.amazonaws.com/private/us-east-2%3A636d8661-1573-42e5-aa8b-83f6854c666f/1543670268959-Guía%20del%20desarrollador.txt: Response to preflight request doesn’t pass access control check: No ‘Access-Control-Allow-Origin’ header is present on the requested resource. Origin ‘http://localhost:3000’ is therefore not allowed access.

xhr.js:87 Cross-Origin Read Blocking (CORB) blocked cross-origin response https://notes-app-uploads-jdccr.s3.us-east-2.amazonaws.com/private/us-east-2%3A636d8661-1573-42e5-aa8b-83f6854c666f/1543670287533-Guía%20del%20desarrollador.txt with MIME type application/xml. See Chrome Platform Status for more details.

I have reviewed my configuration several times and everything looks right to me, but I can’t see where the error is… :frowning:

Please, help me!

This definitely seems like an issue with the CORS settings for the S3 bucket. But before that, I would check to make sure the file it is pointing to exists.

Go into your AWS console for S3 and check if this file exists:

https://notes-app-uploads-jdccr.s3.us-east-2.amazonaws.com/private/us-east-2%3A636d8661-1573-42e5-aa8b-83f6854c666f/1543670268959-Gu%C3%ADa%20del%20desarrollador.txt
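As a side note, the object key in that URL is percent-encoded; decoding it (a quick sketch) gives the exact key to look for in the S3 console:

```javascript
// Decode the percent-encoded object path from the error URL so it can be
// matched against the key shown in the S3 console.
const encodedPath =
  "private/us-east-2%3A636d8661-1573-42e5-aa8b-83f6854c666f/1543670268959-Gu%C3%ADa%20del%20desarrollador.txt";
const key = decodeURIComponent(encodedPath);
console.log(key);
// → "private/us-east-2:636d8661-1573-42e5-aa8b-83f6854c666f/1543670268959-Guía del desarrollador.txt"
```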

Thank you @jayair for your answer. My bucket says: This bucket is empty. Upload new objects to get started.

Yeah that definitely seems like a CORS issue. Make sure you have the CORS part of this chapter configured.

I have been working through this tutorial and was able to get everything working up until this point: I’m getting “access denied” errors when I try to upload an object to the S3 bucket. I have double-checked the CORS settings and experimented with the IAM role settings, so far to no avail.

At some point I must have had it all correct, because a couple of objects did get put in a private sub-bucket, but I’m not able to reproduce that, and I can’t recall ever not getting the access denied error (with the exception of trying the full ARN of the bucket instead of the bucket name, which resulted in “network error”).

I followed the chapters in order and have revisited the bucket creation and the user pool roles. It all appears good, but something must be wrong because it’s not working. Any suggestions on how best to proceed with debugging would be greatly appreciated.

Good morning, everyone. I took a React course last week and my teacher gave me a tip for the problem with the CORS settings.

OPTION 1
Create shortcut for Chrome in the desktop and change the target with:

"C:/Program Files (x86)/Google/Chrome/Application/chrome.exe" --disable-web-security --user-data-dir="c:/data"

OPTION 2
Use the following plugin: https://chrome.google.com/webstore/detail/allow-control-allow-origi/nlfbmbojpeacfghkpbjhddihlkkiljbi

I tested OPTION 1 and now I can add notes with attachments to S3. My teacher says the problem is that the URL is http://localhost:3000 and AWS requires an authorized domain.

I didn’t try jdccr’s options yet, because my front-end is running from http://jbm.test.deployment.bucket.s3-website.us-east-2.amazonaws.com/… which would seem to be an authorized domain.

I tried to test the API without the front end by using the npx command in the ‘Test the APIs’ chapter, but eventually I realized that the file is only uploaded from the front end…

Curiouser and curiouser. I wanted to install the amplify-cli to approach it from a different angle. I had problems installing it until I uninstalled and reinstalled Node (using Homebrew on Mac). I ran amplify init, which created a few more buckets, but I didn’t connect the new buckets to the app. For whatever reason, file uploads are now working from the app running locally.

However, the deployed app loaded from the S3 deployment bucket still throws access denied errors on file upload!? I thought that whatever had changed to make the local version work might need to be uploaded to the deployment bucket, so I redid ‘npm run build’, followed by ‘aws s3 sync build/ …’, but the result is the same. This seems really odd to me: both versions are connecting to the same back end, from the same browser on my machine, and presumably are running the same code? Ack!

Update: after re-deploying (with no substantive changes), now it works in both places!?

I seem to be missing something…

@jdccr So you got it to work with one of those options?

@jbmulligan It seems like there might be some issues in the settings. We recently updated the settings for configuring S3; maybe that was causing it.

Yes, with the custom Chrome shortcut the attachment upload works.

Is it a security concern to let the React app upload directly to an S3 bucket? I’m not familiar with React and haven’t used it before this tutorial, but it’s front-end code, and the user has access to the JS scripts, right? So someone could abuse it and upload an arbitrary number of files into the bucket. Would it be better to handle the upload through another Lambda function?
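One partial answer: the per-identity IAM policy shown earlier in the thread confines each authenticated user to their own private/<identity-id>/ prefix, so a user cannot read or write anyone else’s keys. Conceptually it enforces something like the following check (this is a sketch of what the policy variable does, not actual AWS code):

```javascript
// Conceptual check mirroring the IAM policy variable
// ${cognito-identity.amazonaws.com:sub}: an identity may only touch
// keys under its own private prefix.
function isKeyAllowed(identityId, key) {
  return key.startsWith(`private/${identityId}/`);
}

console.log(isKeyAllowed("us-east-2:abc", "private/us-east-2:abc/file.txt")); // → true
console.log(isKeyAllowed("us-east-2:abc", "private/us-east-2:def/file.txt")); // → false
```

That said, this does not limit how many files or how much data a signed-in user uploads; quotas or size limits would need something extra, such as routing uploads through a Lambda function.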