Save Changes to a Note

From @jayair on Mon Apr 10 2017 01:07:11 GMT+0000 (UTC)

Link to chapter - http://serverless-stack.com/chapters/save-changes-to-a-note.html

Copied from original issue: https://github.com/AnomalyInnovations/serverless-stack-com/issues/55

From @SherpaPsy on Mon Apr 24 2017 10:50:14 GMT+0000 (UTC)

Perhaps I am getting ahead of myself here, but if I edit a note and select a new attachment, the new attachment is uploaded, but the original attachment is still there. Is this intended? Oh, and once again, I am amazed by the quality of this work and your tutorial on it, even though I am not a developer and the code is way over my head.

From @jayair on Mon Apr 24 2017 19:55:22 GMT+0000 (UTC)

@SherpaPsy you are right, we don’t handle that case. And we don’t clean up the attachment when we delete a note either. For the sake of simplicity, we leave a lot of these cases up to you.

I’ll add a note making that clear so it doesn’t confuse people.

From @jayair on Mon Apr 24 2017 20:12:06 GMT+0000 (UTC)

Added a note regarding the attachment - 41987fbd9795f997f6bc06feb02efc9665cf7c85

From @toddmcneill on Thu May 25 2017 19:50:56 GMT+0000 (UTC)

I attempted to delete the file from S3 when attaching a new file.
I read the documentation and built a function patterned after the s3Upload function, but it just wouldn’t work. I finally found that you need to add a line to the S3 bucket’s CORS configuration:
<AllowedMethod>DELETE</AllowedMethod>
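
For reference, the bucket’s full CORS configuration with DELETE added would look roughly like this (a sketch based on the permissive settings used elsewhere in the tutorial; your AllowedOrigin may be more restrictive):

<CORSConfiguration>
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>HEAD</AllowedMethod>
    <AllowedMethod>DELETE</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
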
Once I did that, I encountered no more errors. Here’s some sample code:

In awsLib.js, add

export async function s3Delete(key, userToken) {
  // Refresh the temporary AWS credentials before talking to S3
  await getAwsCredentials(userToken);

  const s3 = new AWS.S3({
    params: {
      Bucket: config.s3.BUCKET,
    }
  });

  // Delete the object with the given key and return the request promise
  return s3.deleteObject({
    Key: key,
  }).promise();
}

In the handleSubmit function of Notes.js, under the line if (this.file) {, add

let fileKey = decodeURIComponent(new URL(this.state.note.attachment).pathname.substring(1));
await s3Delete(fileKey, this.props.userToken);

This parses the S3 object key out of the URL and uses it to delete the S3 object. It would probably be better practice to store both the URL and the S3 object key in the first place rather than storing only the URL.
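
Putting those two lines in context, the relevant part of handleSubmit would look roughly like the following. This is only a sketch: uploadedFilename, this.file, and this.props.userToken follow the tutorial’s Notes.js as it stood at the time, and the guard for notes that never had an attachment is an extra safety check, not something from the chapter.

let uploadedFilename;

if (this.file) {
  // Delete the old attachment first, if the note has one
  if (this.state.note.attachment) {
    const fileKey = decodeURIComponent(new URL(this.state.note.attachment).pathname.substring(1));
    await s3Delete(fileKey, this.props.userToken);
  }

  // Then upload the replacement and keep its URL for the update call
  uploadedFilename = (await s3Upload(this.file, this.props.userToken)).Location;
}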

From @jayair on Mon May 29 2017 10:36:11 GMT+0000 (UTC)

@toddmcneill Thanks for writing it up. There’s a PR that I’ll be merging soon to add this in.

From @jomazu on Fri Jul 14 2017 03:09:48 GMT+0000 (UTC)

@toddmcneill Thanks for the write-up! I also added s3Delete to import { invokeApig, s3Upload, s3Delete } from '../libs/awsLib'; at the top of Notes.js. However, I still cannot seem to get this to work properly. Can you point me to the documentation you read to figure this out? jz

From @jayair on Fri Jul 14 2017 17:00:35 GMT+0000 (UTC)

@jomazu Did you add <AllowedMethod>DELETE</AllowedMethod> to the CORS configuration for the S3 bucket?

From @jomazu on Mon Jul 17 2017 05:17:56 GMT+0000 (UTC)

@jayair Thanks for the follow up! I did add <AllowedMethod>DELETE</AllowedMethod> to the CORS config in AWS. However, whenever I select the Delete bar, the spinner just continues spinning endlessly… When I get some free time, I will look into this further.

From @d3sandoval on Sat Jul 29 2017 21:07:32 GMT+0000 (UTC)

I’m having a problem calling the PUT method. I get the following error in the console when I hit the submit button:

972e7390-74a1-11e7-9b16-adf6e59582b7:1 Fetch API cannot load https://<my-endpoint>.execute-api.us-east-1.amazonaws.com/prod/notes/972e7390-74a1-11e7-9b16-adf6e59582b7. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:3000' is therefore not allowed access. The response had HTTP status code 502. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.

Notes.js:100 TypeError: Failed to fetch

I’ve double-checked my serverless.yml file and it seems to match what is expected at this point in the tutorial:

service: notes-app-api

plugins:
  - serverless-webpack

custom:
  webpackIncludeModules: true

provider:
  name: aws
  runtime: nodejs6.10
  stage: prod
  region: us-east-1

  # 'iamRoleStatements' defines the permission policy for the Lambda function.
  # In this case Lambda functions are granted permissions to access DynamoDB.
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:DescribeTable
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
        - dynamodb:UpdateItem
        - dynamodb:DeleteItem
      Resource: "arn:aws:dynamodb:us-east-1:*:*"

functions:
  create:
    # Defines an HTTP API endpoint that calls the main function in create.js
    # - path: url path is /notes
    # - method: POST request
    # - cors: enabled CORS (Cross-Origin Resource Sharing) for browser cross
    #     domain api call
    # - authorizer: authenticate using the AWS IAM role
    handler: create.main
    events:
      - http:
          path: notes
          method: post
          cors: true
          authorizer: aws_iam

  get:
    # Defines an HTTP API endpoint that calls the main function in get.js
    # - path: url path is /notes/{id}
    # - method: GET request
    handler: get.main
    events:
      - http:
          path: notes/{id}
          method: get
          cors: true
          authorizer: aws_iam

  list:
    # Defines an HTTP API endpoint that calls the main function in list.js
    # - path: url path is /notes
    # - method: GET request
    handler: list.main
    events:
      - http:
          path: notes
          method: get
          cors: true
          authorizer: aws_iam


  update:
    # Defines an HTTP API endpoint that calls the main function in update.js
    # - path: url path is /notes/{id}
    # - method: PUT request
    handler: update.main
    events:
      - http:
          path: notes/{id}
          method: put
          cors: true
          authorizer: aws_iam

  delete:
    # Defines an HTTP API endpoint that calls the main function in delete.js
    # - path: url path is /notes/{id}
    # - method: DELETE request
    handler: delete.main
    events:
      - http:
          path: notes/{id}
          method: delete
          cors: true
          authorizer: aws_iam

Any help is appreciated.

From @d3sandoval on Sat Jul 29 2017 21:12:19 GMT+0000 (UTC)

Also, maybe related: I get the same set of notes no matter who I am logged in as.

From @jayair on Sun Jul 30 2017 00:26:20 GMT+0000 (UTC)

@d3sandoval The serverless.yml looks fine. You can try debugging this endpoint and checking the CloudWatch logs. More info on this here - http://serverless-stack.com/chapters/test-the-apis.html#common-issues

The login issue could be because the credentials are not being cleared out after you log out. Just double check that this step was completed properly - http://serverless-stack.com/chapters/clear-aws-credentials-cache.html

From @d3sandoval on Sun Jul 30 2017 01:18:01 GMT+0000 (UTC)

Good call, @jayair. The log pointed me to the exact problem:

Unhandled promise rejection (rejection id: 1): TypeError: Cannot read property 'claims' of undefined

Turns out I had forgotten to change the line

userId: event.requestContext.authorizer.claims.sub,

to

userId: event.requestContext.identity.cognitoIdentityId,

when updating the authentication mechanism. I redeployed and it seems to be working successfully.
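
For context, that userId line is part of the DynamoDB parameters in each handler. In update.js it sits roughly like this (a sketch from memory of the tutorial’s handler; the table name and expression details may differ slightly from your copy):

const params = {
  TableName: 'notes',
  // The key identifies which item to update: the caller's Cognito
  // identity id plus the note id from the request path
  Key: {
    userId: event.requestContext.identity.cognitoIdentityId,
    noteId: event.pathParameters.id,
  },
  UpdateExpression: 'SET content = :content, attachment = :attachment',
  ExpressionAttributeValues: {
    ':attachment': data.attachment ? data.attachment : null,
    ':content': data.content ? data.content : null,
  },
  ReturnValues: 'ALL_NEW',
};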

I’ll double check my log out code next and comment on that chapter’s comments, if I am still having issues :slight_smile:

From @deepseafishing on Sun Aug 06 2017 03:31:46 GMT+0000 (UTC)

@jayair
Hi, Jayair.

I’m having a problem where, after I create a new note and the history push back to the root URL finishes, I can’t see the newly created note in the notes list. All I see is the previous list. When I hard refresh the page, I get the new list that includes the newly created note.

Is this normal, or am I doing something wrong here?

I see invokeApig getting called when I am redirected to the root URL after creating a note.
If I wrap the history push code in a setTimeout of 200 ms, it works, but with 100 ms it doesn’t.

Also, it sometimes produces this kind of error when I refresh the page multiple times.

From @jayair on Sun Aug 06 2017 23:54:04 GMT+0000 (UTC)

@deepseafishing The new note should show up right away but there seems to be some weird timing-related issue for you. Can I see your Home.js and NewNote.js?

For the refresh error, it seems like it’s related to your local setup. Make sure your Create React App setup is up to date.

From @deepseafishing on Mon Aug 07 2017 01:08:22 GMT+0000 (UTC)

https://github.com/deepseafishing/showCode

Thanks for the reply, @jayair.
Yes, here is the code. Do you mean I need to npm install Create React App to update it to the newest version?

Also, when I log in to the deployed website with one account, log out, and log in again with another account, I can still see the notes the previous user wrote. I guess this has something to do with clearing storage? It worked in the local environment, but I don’t understand why it doesn’t behave the same way in the deployed environment.
Here is my website for the note app.
https://onejune.me

From @jayair on Mon Aug 07 2017 17:02:35 GMT+0000 (UTC)

@deepseafishing You are missing the await on this line - https://github.com/AnomalyInnovations/serverless-stack-demo-client/blob/master/src/containers/NewNote.js#L54. So it’s redirecting before it has a chance to finish adding the note.
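
In other words, the createNote call has to finish before the redirect. The relevant part of NewNote.js’s handleSubmit would look roughly like this (a sketch; the method and prop names are taken from the tutorial’s version at the time and may not match your copy exactly):

try {
  // Wait for the API call to complete before navigating away,
  // otherwise the home page renders before the new note exists
  await this.createNote({
    content: this.state.content,
    attachment: uploadedFilename,
  });
  this.props.history.push('/');
} catch (e) {
  alert(e);
  this.setState({ isLoading: false });
}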

On the deployed version are you seeing the refresh error? I don’t think I see it. So it’s probably related to your dev environment. You can try updating Create React App by doing the following.

npm install -g create-react-app
npm install --save-dev react-scripts

The login issue is a bug with the tutorial. We are not clearing the session properly upon logout. I haven’t had a chance to update the tutorial yet but you can try replacing this line - https://github.com/AnomalyInnovations/serverless-stack-demo-client/blob/master/src/App.js#L87 with this.

// Clear the Cognito identity ID cached in local storage
AWS.config.credentials.clearCachedId();
// Reset the credentials object so the next login starts with a fresh identity
AWS.config.credentials = new AWS.CognitoIdentityCredentials({ });

From @deepseafishing on Tue Aug 08 2017 01:06:01 GMT+0000 (UTC)

Thank you so much, @jayair !
This totally worked out all great. I’m doing a tech talk about the things I learned from building notes from your tutorial. I’ll send you the link of the video later.
I was wondering though how much would be the difference in cost between using auto scaling in AWS and using Lambda services. Would using this serverless architecture a lot cheaper than using ordinary aws instance with auto scaling? or what would be the obvious benefits?

From @jayair on Tue Aug 08 2017 17:34:24 GMT+0000 (UTC)

@deepseafishing That’s cool to hear. Yeah definitely send it over. We’d like to share it with everybody.

The cost savings are fairly big for a lot of folks. The biggest reason for this is that most workloads are not constant. So with standard server-based models you are paying for resources even when they are not in use. Here is an example - Bustle Case Study

Bustle has also experienced approximately 84% cost savings by moving to a serverless architecture.

Now not everybody is going to see an improvement like this. And there are cases where your workload might make it so that Lambda ends up being more expensive. So you should try and run estimates for your setup.

From @samxi on Fri Sep 29 2017 15:21:51 GMT+0000 (UTC)

After completing this chapter I was getting a compile error that s3Upload was not part of awsLib.js. I took a look at the front-end source on GitHub for this chapter, noticed that method in the file there, and added it to my local file.

After this I am getting an error when trying to save my existing note with a new attachment: TypeError: Cannot read property 'BUCKET' of undefined.

Not sure what I am missing here.