altimeter's Introduction

Altimeter

Community Supported | Python 3.8

Altimeter is a system to graph and scan AWS resources across multiple AWS Organizations and Accounts.

Altimeter generates RDF files which can be loaded into a triplestore such as AWS Neptune for querying.

Quickstart

Installation

pip install altimeter

Configuration

Altimeter's behavior is driven by a toml configuration file. A few sample configuration files are included in the conf/ directory:

  • current_single_account.toml - scans the current account - this is the account to which the environment's currently configured AWS CLI credentials belong.
  • current_master_multi_account.toml - scans the current account and attempts to scan all organizational subaccounts - use this configuration if you are scanning all accounts in an organization. To do this, the currently configured AWS CLI credentials should point to an AWS Organizations master account.

To scan a subset of regions, set the regions parameter in the scan section to a list of region names.
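For example, a minimal scan section limiting a scan to two regions might look like the sketch below (key names other than regions follow the bundled sample files in conf/ and may differ in your version):

[scan]
regions = ["us-east-1", "us-west-2"]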

Required IAM permissions

The following permissions are required for a scan of all supported resource types:

acm:DescribeCertificate
acm:ListCertificates
cloudtrail:DescribeTrails
dynamodb:DescribeContinuousBackups
dynamodb:DescribeTable
dynamodb:ListTables
ec2:DescribeFlowLogs
ec2:DescribeImages
ec2:DescribeInstances
ec2:DescribeInternetGateways
ec2:DescribeNetworkInterfaces
ec2:DescribeRegions
ec2:DescribeRouteTables
ec2:DescribeSecurityGroups
ec2:DescribeSnapshots
ec2:DescribeSubnets
ec2:DescribeTransitGateways
ec2:DescribeTransitGatewayAttachments
ec2:DescribeVolumes
ec2:DescribeVpcEndpoints
ec2:DescribeVpcEndpointServiceConfigurations
ec2:DescribeVpcPeeringConnections
ec2:DescribeTransitGatewayVpcAttachments
ec2:DescribeVpcs
elasticloadbalancing:DescribeLoadBalancers
elasticloadbalancing:DescribeLoadBalancerAttributes
elasticloadbalancing:DescribeTargetGroups
elasticloadbalancing:DescribeTargetGroupAttributes
elasticloadbalancing:DescribeTargetHealth
eks:ListClusters
events:ListRules
events:ListTargetsByRule
events:DescribeEventBus
guardduty:GetDetector
guardduty:GetMasterAccount
guardduty:ListDetectors
guardduty:ListMembers
iam:GetAccessKeyLastUsed
iam:GetAccountPasswordPolicy
iam:GetGroup
iam:GetGroupPolicy
iam:GetLoginProfile
iam:GetOpenIDConnectProvider
iam:GetPolicyVersion
iam:GetRolePolicy
iam:GetSAMLProvider
iam:GetUserPolicy
iam:ListAccessKeys
iam:ListAttachedGroupPolicies
iam:ListAttachedRolePolicies
iam:ListAttachedUserPolicies
iam:ListGroupPolicies
iam:ListGroups
iam:ListInstanceProfiles
iam:ListMFADevices
iam:ListOpenIDConnectProviders
iam:ListPolicies
iam:ListRolePolicies
iam:ListRoles
iam:ListSAMLProviders
iam:ListUserPolicies
iam:ListUsers
kms:ListKeys
lambda:ListFunctions
rds:DescribeDBInstances
rds:DescribeDBInstanceAutomatedBackups
rds:ListTagsForResource
rds:DescribeDBSnapshots
route53:ListHostedZones
route53:ListResourceRecordSets
s3:ListBuckets
s3:GetBucketLocation
s3:GetBucketEncryption
s3:GetBucketTagging
sts:GetCallerIdentity
support:DescribeSeverityLevels

Additionally, if you are doing multi-account scanning via an MPA master account, you will also need:

organizations:DescribeOrganization
organizations:ListAccounts
organizations:ListAccountsForParent
organizations:ListOrganizationalUnitsForParent
organizations:ListRoots
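As an illustration only, the actions above can be bundled into a customer-managed IAM policy with boto3; the policy name below is hypothetical and the action list is abbreviated.

import json
import boto3

# Illustrative sketch: collect the actions listed above (abbreviated here) into one policy.
SCAN_ACTIONS = [
    "acm:DescribeCertificate",
    "acm:ListCertificates",
    # ... include the remaining actions from the lists above ...
    "organizations:ListRoots",
]

policy_document = {
    "Version": "2012-10-17",
    "Statement": [{"Effect": "Allow", "Action": SCAN_ACTIONS, "Resource": "*"}],
}

iam = boto3.client("iam")
iam.create_policy(
    PolicyName="AltimeterScanPolicy",  # illustrative name
    PolicyDocument=json.dumps(policy_document),
)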

Generating the Graph

Assuming you have configured AWS CLI credentials (see https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html), run:

altimeter <path-to-config>

This will scan all resources in regions specified in the config file.

The full path to the generated RDF file will be printed, for example:

Created /tmp/altimeter/20191018/1571425383/graph.rdf

This RDF file can then be loaded into a triplestore such as Neptune or Blazegraph for querying.
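As a quick sanity check before loading it into a triplestore, the file can also be inspected locally with rdflib; this sketch is not part of Altimeter and assumes the output is RDF/XML, with the path taken from the example above.

from rdflib import Graph

g = Graph()
# Parse the generated file; format="xml" assumes RDF/XML output
g.parse("/tmp/altimeter/20191018/1571425383/graph.rdf", format="xml")
print(f"{len(g)} triples loaded")

# List the most common rdf:type values in the graph
query = """
SELECT ?type (COUNT(?s) AS ?count)
WHERE { ?s a ?type }
GROUP BY ?type
ORDER BY DESC(?count)
"""
for rdf_type, count in g.query(query):
    print(rdf_type, count)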

For more user documentation see https://tableau.github.io/altimeter/

altimeter's People

Contributors

bechbd, cemito, james-baker, jbmchuck, jroimartin, katdev, manelmontilla

altimeter's Issues

./altimeter: line 9: aws2n.py: command not found

Seems like pip didn't leave everything ok, or perhaps I'm missing some instructions on how to run it.

altimeter was installed at /Users/gonzalovasquez/Library/Python/3.7/bin/altimeter as my user and also with sudo at /System/Volumes/Data/Users/gonzalovasquez/Library/Python/3.7/bin/altimeter, but both yield the same error message about the missing aws2n.py.

Facts:

aws --version
aws-cli/2.0.15 Python/3.7.4 Darwin/19.6.0 botocore/2.0.0dev19

macOS Catalina 10.15.7

All permissions needed

Hey Guys,

We are running altimeter on our production AWS account.
It is running under a restricted-access IAM role and continuously runs into AccessDenied errors.
We request those permissions, but on the next run new ones appear, and so on.

Is it possible to document all the permissions that need to be allowed to run altimeter under a restricted-access policy?

To work in Python 3.10 and greater

To get this to work with a Python 3.10.13 environment I needed to change
v-env/lib/python3.10/site-packages/tornado/httputil.py
this line 25:
From:
import collections
To:
import collections.abc

and also here line 107:

From:
class HTTPHeaders(collections.MutableMapping):
To:
class HTTPHeaders(collections.abc.MutableMapping):

Cut a new release including the merged PRs #168 and #163

@jbmchuck, if I'm not wrong, after merging the PRs #168 and #163 no releases have been cut. Would it make sense to cut a new release including them?

$ git log --oneline 6.4.4..master
525750f (HEAD -> master, upstream/master, upstream/HEAD, origin/master) Gather user and group policies (#163)
09f974a Make the field UserId in a Security Group optional (#168)

Thanks in advance!

Make runnable in ECS

Lambda's 15 min timeout becomes an issue as the number of accounts increases. Move the main aws2n process to ECS and continue using Lambda for each account scan.

GuardDuty GetMasterAccount does not always return InvitedAt

for key in ("AccountId", "RelationshipStatus", "InvitedAt",)

ScalarField("InvitedAt", alti_key="master_invited_at"),

The code assumes that the InvitedAt key is always present in the dictionary, but it is not required: https://docs.aws.amazon.com/guardduty/latest/APIReference/API_Master.html

Use .get(key) and add optional=True in the schema.
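A minimal sketch of the suggested handling, assuming the documented response shape for GetMasterAccount; Altimeter's actual field classes may differ.

# Simulated GetMasterAccount response with no InvitedAt key
response = {"Master": {"AccountId": "012345678901", "RelationshipStatus": "Enabled"}}
master = response.get("Master", {})

# .get() returns None for missing keys instead of raising KeyError
details = {key: master.get(key) for key in ("AccountId", "RelationshipStatus", "InvitedAt")}
print(details)

# Schema side: mark the field optional, e.g.
# ScalarField("InvitedAt", alti_key="master_invited_at", optional=True)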

Make installable using pipx

pipx is a super useful way to install scripts into their own virtualenv. At the moment, it can make a good attempt at installing altimeter but doesn't set up the main scripts as these are defined as scripts in setup.py rather than console_scripts.

My suggestion is that the scripts be converted to console_scripts to allow pipx to install them. (I appreciate I might be missing some of the reasons why this isn't possible.) A sketch of what this might look like follows.
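A sketch of the suggested change in setup.py; the module:function path is illustrative, not Altimeter's actual layout.

from setuptools import setup, find_packages

setup(
    name="altimeter",
    packages=find_packages(),
    entry_points={
        "console_scripts": [
            # hypothetical entry point shown for illustration
            "altimeter = altimeter.bin.altimeter:main",
        ],
    },
)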

Use Pydantic where appropriate for data-container classes

Several classes are responsible only for holding data and serializing/deserializing it. Replacing them with Pydantic models could remove a significant amount of code and provide a check against mutability where appropriate.
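A minimal sketch of the idea using a hypothetical data container (pydantic v1 style):

from typing import List
from pydantic import BaseModel

class AccountScanResult(BaseModel):  # hypothetical class, for illustration only
    account_id: str
    regions: List[str]
    errors: List[str] = []

    class Config:
        allow_mutation = False  # reject attribute assignment after construction

result = AccountScanResult(account_id="012345678901", regions=["us-east-1"])
print(result.json())  # serialization comes for free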

-bash: altimeter: command not found

I am trying to set up altimeter. I have set up the config file ('current_single_account.toml') and am having trouble getting the 'altimeter' command to work. I am working on a macOS machine with an M1 chip, using Terminal. Any tips?

Example query showing incorrect results

The example query "Locate vpcs with no ec2 instances, rds instances lambdas or ENIs attached" returns all VPCs in my account, including those containing EC2 instances, etc.

Apologies but my SPARQL isn't quite sharp enough to spot the issue.

NoRegionError ("You must specify a region") causing Altimeter to crash on use

The line s3_client = boto3.client("s3") causes Altimeter to crash on first use: it throws NoRegionError ("You must specify a region") because no region parameter is passed and the AWS_DEFAULT_REGION environment variable is not set.

Suggest either adding a region parameter to the call based on what's in the config file, picking a region by default, or telling the user to set AWS_DEFAULT_REGION and/or updating the documentation.
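A sketch of the first suggestion; the fallback region shown is illustrative.

import os
import boto3

# Prefer an explicit region, falling back to the environment if it is set
region = os.environ.get("AWS_DEFAULT_REGION", "us-east-1")
s3_client = boto3.client("s3", region_name=region)
print(s3_client.meta.region_name)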

quick start example runs into errors

Using the quickstart guide and running the command below results in an error. Am I missing some configuration? I have the AWS CLI set up and all the aws cli commands work fine.
altimeter --base_dir /tmp/altimeter --regions us-east-1
usage: aws2json.py [-h] --config CONFIG output_dir
aws2json.py: error: the following arguments are required: --config

Question: ResourceLinkFields to two possible targets

I am currently trying to scan Base Path Mappings in API Gateway. This works as expected until it encounters an API Gateway V2 API. The get_base_mappings response data includes an API ID, but there is no way to tell whether this points to a V1 or a V2 API. As such, I can't just use a ResourceLinkField as usual.

Can you recommend a way to achieve the link? I'm thinking it would require checking the list of both v1 and v2 APIs for each mapping and setting my object as appropriate. I'll probably define two ResourceLinkFields and set them as optional, unless there's a better way to do this.

Allow continuing on scanning errors/support low privilege scanning

I am currently scanning an account in which I have close to full access but, due to compliance and security settings, I am unable to enumerate various settings. (SAML provider information and user access keys, as examples).

It would be useful if the scanning was allowed to continue when access errors are encountered, as I am only interested in the resources that I have access to. This would also help with being able to audit a user's access and ensure they do not have too many privileges.

(One potential way to achieve this could be a settings value that allowed you to exclude certain resources from scanning but that seems a little inelegant, plus could require repeated runs until no errors are generated.)
