
aws_rust_samples

1. About

Sample Rust projects that access AWS via the SDK.

2. Prerequisites

  • ~/.aws/credentials (This can be created via the aws configure command.)

  • cargo-lambda

3. ./ec2 project

3.1 About

This project creates an HTTP server which does the following for each request:

  1. Receive a JSON of the form {"r": 255, "g": 255, "b": 0}.

  2. Create a PNG image whose every pixel is filled with the given color.

  3. Upload it to S3.

  4. Log the request with its timestamp to RDS (MySQL) and DynamoDB.

  5. Create and return a pre-signed URL (i.e. a public URL with an expiration date) for accessing the object uploaded to S3.

It works locally, or you can deploy it to EC2.
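
For illustration, step 2 of the flow above (filling a PNG with the requested color) can be done in memory roughly as follows. This is a minimal sketch that assumes the image and serde crates; the crates and names actually used by the project may differ.

//A minimal sketch of step 2, assuming the `image` and `serde` crates
//(the crates actually used by the project may differ).
use image::{ImageFormat, Rgb, RgbImage};
use serde::Deserialize;

//The JSON body of a request, e.g. {"r": 255, "g": 255, "b": 0}.
#[derive(Deserialize)]
struct ColorRequest {
    r: u8,
    g: u8,
    b: u8,
}

//Encodes a `width` x `height` PNG whose every pixel has the requested color.
fn make_png(req: &ColorRequest, width: u32, height: u32) -> Vec<u8> {
    let img = RgbImage::from_pixel(width, height, Rgb([req.r, req.g, req.b]));
    let mut buf = std::io::Cursor::new(Vec::new());
    img.write_to(&mut buf, ImageFormat::Png)
        .expect("failed to encode the PNG image");
    buf.into_inner()
}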

3.2 Architecture

3.3 Usage

  1. Access S3 console to create a bucket called bucket-test-002-a with the default settings.

  2. Access RDS console.

    1. Create a MySQL instance called test-rds-001. Make sure Publicly accessible is turned on (to test it locally).

    2. Edit the inbound rules to allow access to port 3306.

    3. Connect to it from the command line to create a database called test.

      # Note: you cannot use a space instead of `=`.
      $ mysql -h <host> -P 3306 --user=<user> --password=<password>
      create database test;
      
  3. Access DynamoDB console to create a table called test_dynamodb_001, whose partition key is named timestamp and has the type String. (A sketch of how the server writes to this table is shown after this procedure.)

  4. Access EC2 console.

    1. Create an instance whose OS is Amazon Linux 2.

    2. Edit the inbound rules to allow access to the port you want to listen on (in addition to SSH).

  5. Deploy and build the project. (Warning: If you use a free-tier instance, skip to step 6 instead. Running cargo build on such an instance consumes so many resources that it can break the instance entirely: afterwards you cannot connect to it via SSH or even via the browser, and rebooting does not fix the problem, so you have no choice but to create a brand-new instance. This behavior was observed on both Ubuntu and Amazon Linux 2.)

    1. Connect to the instance via SSH.

      $ ssh aws
    2. Install Rust by following the official instructions:

      $ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
    3. Install dependencies.

      $ sudo yum update
      $ sudo yum install git gcc openssl-devel
    4. Clone this repository.

      $ git clone 'https://github.com/your-diary/aws_rust_samples'
    5. Edit the configuration file (see below for the details).

      $ cd aws_rust_samples/ec2
      $ vi config.json
    6. Build.

      $ cargo build --release
  6. Build and deploy the project using cross-compilation. (If you executed step 5, skip this step.)

    1. Cross-compile the project.

      $ rustup target add x86_64-unknown-linux-gnu
      $ cargo build --release --target x86_64-unknown-linux-gnu
    2. Write a config file (see below for the details).

      $ vi config.json
    3. Send the config file and the built binary to the instance.

      $ ssh aws 'mkdir ec2'
      $ rsync -auv ec2 config.json aws:./ec2/
  7. Access EC2 console.

    1. Choose the instance.

    2. Select Actions > Security > Modify IAM role to attach a role to the instance so that AWS services are accessible from inside it.

  8. Send the credential files to the instance. (Warning: Normally this is not needed to use the AWS SDK, but this project uses the third-party rust-s3 crate, which did not seem to support reading the credentials via an IAM role.)

    $ rsync -auv ~/.aws aws:./
  9. Check if RDS (MySQL) is accessible inside the instance.

    1. Connect to the instance via SSH.

      $ ssh aws
    2. Install mysql command.

      $ sudo yum localinstall -y https://dev.mysql.com/get/mysql80-community-release-el9-1.noarch.rpm
      $ sudo yum install -y mysql-community-client
    3. Try to connect to RDS (MySQL).

      # Note: you cannot use a space instead of `=`.
      $ mysql -h <host> -P 3306 --user=<user> --password=<password>
    4. If the connection fails, access the RDS console and make sure the security group of the instance (e.g. sg-...) is listed in the inbound rules.

  10. Run the server.

    1. Connect to the instance.

      $ ssh aws
    2. Run the server.

      $ cd ec2/
      $ screen -d -m ./ec2
      $ screen -ls
  11. Call the API.

    $ curl \
        -H 'Content-Type: application/json' \
        -d '{"r": 100, "g": 100, "b": 200}' \
        <URL>
    {
        "status": "success",
        "url": "https://..."
    }
    $ curl <returned URL> | imgcat
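
For reference, the DynamoDB logging performed by the server (step 4 of 3.1, writing to the test_dynamodb_001 table created in step 3 above) might look roughly like this. It is a minimal sketch that assumes the official aws-config and aws-sdk-dynamodb crates and a hypothetical color attribute; the project's actual code may differ.

//A minimal sketch of logging a request to DynamoDB, assuming the official
//aws-config and aws-sdk-dynamodb crates (the project's actual code may differ).
use aws_sdk_dynamodb::types::AttributeValue;

async fn log_to_dynamodb(table_name: &str, r: u8, g: u8, b: u8) -> Result<(), aws_sdk_dynamodb::Error> {
    //Reads the region and credentials from the environment or `~/.aws/`.
    let config = aws_config::load_from_env().await;
    let client = aws_sdk_dynamodb::Client::new(&config);

    //The table's partition key is `timestamp` (type String), as created in step 3.
    let timestamp = std::time::SystemTime::now()
        .duration_since(std::time::UNIX_EPOCH)
        .unwrap()
        .as_millis()
        .to_string();

    client
        .put_item()
        .table_name(table_name)
        .item("timestamp", AttributeValue::S(timestamp))
        //`color` is a hypothetical attribute name used only for this sketch.
        .item("color", AttributeValue::S(format!("({r}, {g}, {b})")))
        .send()
        .await?;
    Ok(())
}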

3.4 Configurations

Configurations are read from config.json.

Example:

{
    "port": 30021,
    "img_width": 300,
    "img_height": 200,
    "s3": {
        "bucket_name": "bucket-test-002-a",
        "expiration_sec": 30
    },
    "rds": {
        "host": "test-rds-001.xyz.ap-northeast-1.rds.amazonaws.com",
        "port": 3306,
        "user": "admin",
        "password": "abcde",
        "database_name": "test",
        "table_name": "colors"
    },
    "dynamodb": {
        "table_name": "test_dynamodb_001"
    }
}
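
On the Rust side, this file can be mapped onto plain structs with serde. The following is a minimal sketch whose field names simply mirror the example above (it assumes the serde and serde_json crates); the structs actually defined in the project may be organized differently.

//A minimal sketch of deserializing config.json, assuming `serde` and `serde_json`
//(the structs actually used by the project may differ).
use serde::Deserialize;

#[derive(Deserialize)]
struct Config {
    port: u16,
    img_width: u32,
    img_height: u32,
    s3: S3Config,
    rds: RdsConfig,
    dynamodb: DynamoDbConfig,
}

#[derive(Deserialize)]
struct S3Config {
    bucket_name: String,
    expiration_sec: u32,
}

#[derive(Deserialize)]
struct RdsConfig {
    host: String,
    port: u16,
    user: String,
    password: String,
    database_name: String,
    table_name: String,
}

#[derive(Deserialize)]
struct DynamoDbConfig {
    table_name: String,
}

fn load_config(path: &str) -> Config {
    let json = std::fs::read_to_string(path).expect("failed to read the config file");
    serde_json::from_str(&json).expect("failed to parse the config file")
}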

3.5 References

4. ./lambda/ project

4.1 About

This project creates a REST API which receives a JSON of the form {"content": <string>} and uploads its content as <timestamp>.txt to S3.
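
With cargo-lambda, the handler for such a request can be structured roughly as follows. This is a minimal sketch that assumes the lambda_runtime, serde and tokio crates; the project's actual handler and its integration with API Gateway may differ. The S3 upload itself is sketched at the end of 4.3 Usage.

//A minimal sketch of the handler, assuming the `lambda_runtime` crate
//(the project's actual handler may differ).
use lambda_runtime::{service_fn, Error, LambdaEvent};
use serde::{Deserialize, Serialize};

//The request body: {"content": "<string>"}.
#[derive(Deserialize)]
struct Request {
    content: String,
}

//The response body: {"status": "...", "filename": "<timestamp>.txt"}.
#[derive(Serialize)]
struct Response {
    status: String,
    filename: String,
}

async fn handler(event: LambdaEvent<Request>) -> Result<Response, Error> {
    let timestamp = std::time::SystemTime::now()
        .duration_since(std::time::UNIX_EPOCH)?
        .as_millis();
    let filename = format!("{timestamp}.txt");
    //`event.payload.content` would be uploaded to S3 as `filename` here
    //(see the upload sketch at the end of 4.3 Usage).
    let _content = &event.payload.content;
    Ok(Response {
        status: "success".to_string(),
        filename,
    })
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    lambda_runtime::run(service_fn(handler)).await
}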

4.2 Architecture

4.3 Usage

  1. Run tests. We expect every test to pass.

    $ cargo test
  2. Cross-compile the project for Amazon Linux 2.

    $ cargo lambda build --release
  3. Deploy the project as a Lambda function. You do not have to create a Lambda function from the console in advance; this command automatically creates or updates the Lambda function named lambda_test_001. (The name can be customized via the name property in Cargo.toml.)

    $ cargo lambda deploy
  4. Access S3 console to create a bucket called bucket-test-001-a with the default settings.

  5. Access Lambda console.

    1. Select lambda_test_001.

    2. Select Configuration > Permissions > Execution role and click the name of the role (e.g. cargo-lambda-role-...).

    3. Attach the AmazonS3FullAccess policy to the role.

  6. Access API Gateway console.

    1. Create a new REST API called test_gateway_001.

    2. Select Actions > Create Method to create a POST method and bind it to lambda_test_001.

    3. After creating the POST method, click Method Request and change the value of API Key Required to true.

    4. Select Actions > Deploy API to deploy the API. The stage name is arbitrary, but let's use testing for convenience. After that, you are redirected to the Stage Editor. Change the values of Rate and Burst as you like. You may also want to note the Invoke URL, which is the URL for this API.

    5. Select Usage Plans in the sidebar to start creating a usage plan called Testing. First set Rate, Burst and Quota as you want, and then click Add API Stage to bind the testing stage of test_gateway_001 to the plan.

    6. Select API Keys in the sidebar and select Actions > Create API key to create an API key. Then click Add to Usage Plan to bind it to the Testing plan.

  7. Call the API with the key.

    $ curl \
        -H 'x-api-key: <API key>' \
        -d '{"content": "hello"}' \
        <URL>
    {"status":"success","filename":"1678969418940.txt"}

4.4 References
