
This solution provides a framework for Next Generation Sequencing (NGS) genomics secondary-analysis pipelines using AWS Step Functions and AWS Batch.

Home Page: https://aws.amazon.com/solutions/implementations/genomics-secondary-analysis-using-aws-step-functions-and-aws-batch/

License: Apache License 2.0


Deprecation Notice

In 2022, AWS launched AWS HealthOmics, a purpose-built service for storing, querying, and analyzing genomics and other omics data securely and at scale. Because HealthOmics lets users run bioinformatics workflows written in industry-specific workflow languages, and abstracts away the underlying AWS infrastructure and its management, we recommend using AWS HealthOmics instead of this solution. Updated guidance is available here: https://aws.amazon.com/solutions/guidance/development-automation-implementation-monitoring-of-bioinformatics-workflows-on-aws/

Genomics Secondary Analysis Using AWS Step Functions and AWS Batch

This solution provides a framework for Next Generation Sequencing (NGS) genomics secondary-analysis pipelines using AWS Step Functions and AWS Batch. It deploys AWS services to develop and run custom workflow pipelines, monitor pipeline status and performance, fail over to on-demand compute, handle errors, optimize for cost, and secure data with least-privilege permissions.

The solution is designed to be a starting point for developing your own custom genomics workflow pipelines with Amazon States Language and AWS Step Functions, following continuous integration / continuous deployment (CI/CD) principles. That is, everything (from the workflow definitions to the resources they run on top of) is code, tracked in version control, and automatically built, tested, and deployed when developers make changes.

Standard deployment

To deploy this solution in your account use the "Launch in the AWS Console" button found on the solution landing page.

We recommend deploying the solution this way for most use cases.

This will create all resources you need to get started developing and running genomics secondary analysis pipelines. This includes an example containerized toolset and definition for a simple variant calling pipeline using BWA-MEM, Samtools, and BCFtools.
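As a rough sketch of what the example pipeline's containerized steps do, here is the usual alignment-to-variants chain these tools implement. This is illustrative only: the solution runs each step as a separate AWS Batch job, and the file names below are placeholders, not the solution's actual parameters.

```shell
bwa mem ref.fasta sample_R1.fastq.gz sample_R2.fastq.gz > sample.sam   # align reads to the reference
samtools sort -o sample.sorted.bam sample.sam                          # coordinate-sort the alignments
samtools index sample.sorted.bam                                       # index the sorted BAM
bcftools mpileup -f ref.fasta sample.sorted.bam \
    | bcftools call -mv -o sample.vcf                                  # call variants
```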

Install options

Customized deployment

A fully customized solution can be deployed for the following use cases:

  • Modifying or adding resources deployed during installation
  • Modifying the "Landing Zone" of the solution - e.g., adding artifacts or customizing the "Pipe" CodePipeline

Fully customized solutions need to be self-hosted in your own AWS account, and you will be responsible for any costs incurred in doing so.

To deploy and self-host a fully customized solution use the instructions below.

Note: All commands assume a bash shell.

Customize

Clone the repository and make your desired changes.
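For example, assuming git and a bash shell:

```shell
git clone https://github.com/awslabs/genomics-secondary-analysis-using-aws-step-functions-and-aws-batch.git
cd genomics-secondary-analysis-using-aws-step-functions-and-aws-batch
```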

File Structure

.
├── CHANGELOG.md
├── CODE_OF_CONDUCT.md
├── CONTRIBUTING.md
├── LICENSE.txt
├── NOTICE.txt
├── README.md
├── deployment
│   ├── build-s3-dist.sh
│   └── run-unit-tests.sh
└── source
    ├── code
    │   ├── buildspec.yml
    │   ├── cfn
    │   │   ├── cloudwatch-dashboard.cfn.yaml
    │   │   ├── core
    │   │   │   ├── batch.cfn.yaml
    │   │   │   ├── iam.cfn.yaml
    │   │   │   └── networking.cfn.yaml
    │   │   └── workflow-variantcalling-simple.cfn.yaml
    │   ├── containers
    │   │   ├── _common
    │   │   │   ├── README.md
    │   │   │   ├── aws.dockerfile
    │   │   │   ├── build.sh
    │   │   │   ├── entrypoint.aws.sh
    │   │   │   └── push.sh
    │   │   ├── bcftools
    │   │   │   └── Dockerfile
    │   │   ├── buildspec.yml
    │   │   ├── bwa
    │   │   │   └── Dockerfile
    │   │   └── samtools
    │   │       └── Dockerfile
    │   └── main.cfn.yml
    ├── pipe
    │   ├── README.md
    │   ├── buildspec.yml
    │   ├── cfn
    │   │   ├── container-buildproject.cfn.yaml
    │   │   └── iam.cfn.yaml
    │   └── main.cfn.yml
    ├── setup
    │   ├── lambda
    │   │   ├── lambda.py
    │   │   └── requirements.txt
    │   ├── setup.sh
    │   ├── teardown.sh
    │   └── test.sh
    ├── setup.cfn.yaml
    └── zone
        ├── README.md
        └── main.cfn.yml

| Path | Description |
| --- | --- |
| deployment | Scripts for building and deploying a customized distributable |
| deployment/build-s3-dist.sh | Shell script for packaging distribution assets |
| deployment/run-unit-tests.sh | Shell script for executing unit tests |
| source | Source code for the solution |
| source/setup.cfn.yaml | CloudFormation template used to install the solution |
| source/setup/ | Assets used by the installation and un-installation process |
| source/zone/ | Source code for the solution landing zone - the location for common assets and artifacts used by the solution |
| source/pipe/ | Source code for the solution deployment pipeline - the CI/CD pipeline that builds and deploys the solution codebase |
| source/code/ | Source code for the solution codebase - the containerized tooling, workflow definitions, and AWS resources for workflow execution |

Run unit tests

cd ./deployment
chmod +x ./run-unit-tests.sh
./run-unit-tests.sh

Build and deploy

Create deployment buckets

The solution requires two buckets for deployment:

  1. <bucket-name> for the solution's primary CloudFormation template
  2. <bucket-name>-<aws_region> for additional artifacts and assets that the solution requires - these are stored regionally to reduce latency during installation and avoid inter-regional transfer costs
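For example, with a hypothetical base bucket name and region, the second bucket's name is derived by appending the region. Bucket creation is shown commented out, since it requires AWS credentials:

```shell
# Hypothetical values; substitute your own globally unique bucket name and region.
DIST_OUTPUT_BUCKET=my-genomics-dist
AWS_REGION=us-east-1
REGIONAL_BUCKET="${DIST_OUTPUT_BUCKET}-${AWS_REGION}"
echo "${REGIONAL_BUCKET}"   # my-genomics-dist-us-east-1

# Create both buckets (requires credentials with s3:CreateBucket):
# aws s3 mb "s3://${DIST_OUTPUT_BUCKET}" --region "${AWS_REGION}"
# aws s3 mb "s3://${REGIONAL_BUCKET}" --region "${AWS_REGION}"
```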

Configure and build the distributable

export DIST_OUTPUT_BUCKET=<bucket-name>
export SOLUTION_NAME=<solution-name>
export VERSION=<version>

cd ./deployment
chmod +x ./build-s3-dist.sh
./build-s3-dist.sh $DIST_OUTPUT_BUCKET $SOLUTION_NAME $VERSION

Deploy the distributable

Note: you must have the AWS Command Line Interface (AWS CLI) installed for this step.

cd ./deployment

# deploy global assets
# this only needs to be done once
aws s3 cp \
    ./global-s3-assets/ s3://<bucket-name>/$SOLUTION_NAME/$VERSION \
    --recursive \
    --acl bucket-owner-full-control

# deploy regional assets
# repeat this step for as many regions as needed
aws s3 cp \
    ./regional-s3-assets/ s3://<bucket-name>-<aws_region>/$SOLUTION_NAME/$VERSION \
    --recursive \
    --acl bucket-owner-full-control

Install the customized solution

The link to the primary CloudFormation template will look something like:

https://<bucket-name>.s3-<region>.amazonaws.com/genomics-secondary-analysis-using-aws-step-functions-and-aws-batch.template

Use this link to install the customized solution into your AWS account in a specific region via the AWS CloudFormation console.
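Alternatively, the stack can be created from the same template URL with the AWS CLI. The bucket name, region, and stack name below are hypothetical, and the `create-stack` call is commented out since it requires credentials:

```shell
# Hypothetical values; substitute your own.
DIST_OUTPUT_BUCKET=my-genomics-dist
AWS_REGION=us-east-1
TEMPLATE_URL="https://${DIST_OUTPUT_BUCKET}.s3-${AWS_REGION}.amazonaws.com/genomics-secondary-analysis-using-aws-step-functions-and-aws-batch.template"
echo "${TEMPLATE_URL}"

# IAM capabilities are needed because the template creates IAM resources:
# aws cloudformation create-stack \
#     --stack-name genomics-secondary-analysis \
#     --template-url "${TEMPLATE_URL}" \
#     --capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM \
#     --region "${AWS_REGION}"
```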


This solution collects anonymous operational metrics to help AWS improve the quality and features of the solution. For more information, including how to disable this capability, please see the implementation guide.


Copyright 2021 Amazon.com, Inc. or its affiliates. All Rights Reserved.

Licensed under the Apache License Version 2.0 (the "License"). You may not use this file except in compliance with the License. A copy of the License is located at

http://www.apache.org/licenses/

or in the "license" file accompanying this file. This file is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, express or implied. See the License for the specific language governing permissions and limitations under the License.
