
aws-iot-twinmaker-samples's Introduction

2023/11/28

For a sample Digital Twin application highlighting an AI-assistant integration, check out our blog: Building an AI Assistant for Smart Manufacturing with AWS IoT TwinMaker and Amazon Bedrock

For a sample Digital Twin application built with a React application on iot-app-kit check out CookieFactoryV2

For a sample Digital Twin application highlighting TwinMaker Knowledge Graph, check out our guided SmartBuilding workshop!


Note: if you are just looking for sample IAM policies to use when creating an AWS IoT TwinMaker workspace, please see these sample permission and trust relationship policies. If you would like to create this role using AWS CloudFormation, please use this template.

The role permission policy only grants AWS IoT TwinMaker access to manage workspace resources in your S3 buckets. We recommend you scope down the bucket permissions to your specific S3 bucket once it is created. You will also need to update the role to grant further permissions for your use case, such as invoking AWS IoT TwinMaker custom AWS Lambda connectors you've implemented or accessing video stream metadata in AWS IoT SiteWise and Amazon Kinesis Video Streams. For an end-to-end setup experience (including auto-generation of these roles with all necessary permissions for the sample use case), we recommend following the getting started guide below.
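As an illustration of scoping down, here is a minimal sketch (not from this repo; the bucket name and the exact set of actions are hypothetical, so check the linked sample policies for what your workspace actually needs) of building a bucket-scoped S3 policy statement:

```python
import json

# Hypothetical bucket name -- replace with the bucket your workspace actually uses.
bucket = "iottwinmaker-workspace-example-bucket"

# Sketch of a bucket-scoped S3 statement: instead of granting access to all
# buckets, restrict Resource to the workspace bucket and its objects.
scoped_statement = {
    "Effect": "Allow",
    "Action": ["s3:GetBucket*", "s3:GetObject", "s3:ListBucket", "s3:PutObject", "s3:DeleteObject"],
    "Resource": [
        f"arn:aws:s3:::{bucket}",
        f"arn:aws:s3:::{bucket}/*",
    ],
}
print(json.dumps(scoped_statement, indent=2))
```

The printed statement can replace the broad S3 statement in the role's permission policy once the workspace bucket exists.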

AWS IoT TwinMaker Getting Started

Summary

This project walks you through the process of building a digital twin application using AWS IoT TwinMaker. The project contains many samples, including a simulated cookie factory that you can use to explore many of the features of IoT TwinMaker. After going through this README you will have the following dashboard running in Grafana, which you can use to interact with the sample CookieFactory digital twin.

(Screenshot: the imported CookieFactory mixer alarms dashboard in Grafana)

If you run into any issues, please see the Troubleshooting section of this page.

Prerequisites

Note: These instructions have primarily been tested for Mac/Linux/WSL environments. For a standardized development environment, consider following our Cloud9 setup guide instead.

  1. This sample depends on AWS services that might not yet be available in all regions. Please run this sample in one of the following regions:

    • US East (N. Virginia) (us-east-1)
    • US West (Oregon) (us-west-2)
    • Europe (Ireland) (eu-west-1)
  2. An AWS account for IoT TwinMaker + AWS CLI

    • We recommend that you configure your default credentials to match the account in which you want to set up this getting started example. Use the following command to verify that you are using the correct account. (This should be pre-configured in Cloud9.)
      aws sts get-caller-identity
    • Ensure your AWS CLI version is at least 1.22.94. (or 2.5.5+ for AWS CLI v2)
      aws --version
    • When you are set up, test your access with the following command. (You should not receive errors.)
       aws iottwinmaker list-workspaces --region us-east-1
      
  3. Python3

    • Verify your python3 path and version (3.7+). (This should be pre-installed in Cloud9.)
      python3 --version
      
    • Optional: Pyenv and Pyenv-virtualenv. Use pyenv and pyenv-virtualenv to ensure you have the correct Python dependencies. They are optional as long as you have a system-wide Python 3 installation, but highly recommended for avoiding conflicts between multiple Python projects.
  4. Node.js & NPM with node v14.18.1+ and npm version 8.10.0+. (This should be pre-installed in Cloud9.) Use the following commands to verify.

    node --version
    
    npm --version
    
  5. AWS CDK toolkit with version at least 2.27.0. (The CDK should be pre-installed in Cloud9, but you may need to bootstrap your account.) Use the following command to verify.

    cdk --version
    
    • You will also need to bootstrap your account for CDK so that custom assets, such as sample Lambda functions, can be easily deployed. Use the following command.

      cdk bootstrap aws://[your 12 digit AWS account id]/[region]
      
      # example
      # cdk bootstrap aws://123456789012/us-east-1
      
  6. Docker version 20+. (This should be pre-installed in Cloud9.) You will also need to authenticate Docker for public ECR registries.

    docker --version
    
    • Use the following command to authenticate Docker against the public ECR registry, which is required to build Lambda layers with CDK.
      aws ecr-public get-login-password --region us-east-1 | docker login --username AWS --password-stdin public.ecr.aws
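The minimum versions listed above can be checked mechanically. The following is a small illustrative sketch (not part of the samples) that compares dotted version strings such as those printed by `aws --version`, `node --version`, and `cdk --version`:

```python
def at_least(installed: str, minimum: str) -> bool:
    """Return True if a dotted version string meets a minimum, comparing numerically.

    Strips a leading "v" so outputs like node's "v14.18.1" also work.
    """
    parse = lambda v: [int(part) for part in v.lstrip("v").split(".")]
    return parse(installed) >= parse(minimum)

# Minimums from the prerequisites above:
print(at_least("1.22.94", "1.22.94"))  # AWS CLI v1 minimum met -> True
print(at_least("2.4.9", "2.5.5"))      # below the AWS CLI v2 minimum -> False
print(at_least("v14.18.1", "14.18.1")) # node minimum met -> True
```

Note this is numeric comparison only; pre-release suffixes (e.g. "2.27.0-rc1") would need extra handling.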

Deploying the Sample Cookie Factory Workspace

  1. Set up environment variables.

    Set the following environment variables to make it easier to execute the remaining steps.

    # Change into the same directory as this README
    cd [directory_of_this_README]
    # Set your AWS account ID. You can use `aws sts get-caller-identity` to see the account ID you're currently using
    export CDK_DEFAULT_ACCOUNT=[replace_with_your_aws_account_id]
    # Set some options for our install. If you want to use another workspace ID then change 'CookieFactory' to your preference
    export GETTING_STARTED_DIR=$PWD
    export AWS_DEFAULT_REGION=us-east-1
    export CDK_DEFAULT_REGION=$AWS_DEFAULT_REGION
    export TIMESTREAM_TELEMETRY_STACK_NAME=CookieFactoryTelemetry
    export WORKSPACE_ID=CookieFactory
  2. Install Python Libraries.

    We use Python to help deploy our Cookie Factory sample data. Use the following command to install the required Python libraries.

    pip3 install -r $GETTING_STARTED_DIR/src/workspaces/cookiefactory/requirements.txt
  3. Create an IoT TwinMaker workspace.

    a. Create an IoT TwinMaker execution role

    Different Digital Twin applications use different resources. Run the following command to create an execution role for our workspace that has the necessary permissions for this sample application. Note that you will use the role name when creating a workspace in the next step.

    python3 $GETTING_STARTED_DIR/src/workspaces/cookiefactory/setup_cloud_resources/create_iottwinmaker_workspace_role.py --region $AWS_DEFAULT_REGION

    b. Create the workspace in the AWS Console

    Now go to the console and create a workspace with the same name that you used for WORKSPACE_ID in step 1. You can have the console automatically create S3 buckets for you. When asked to provide a role for the workspace, use the role name generated by the preceding script. (The name should contain the string "IoTTwinMakerWorkspaceRole".)

    After entering the workspace settings, you will be asked to specify the Grafana environment you will use to interact with your workspace. We recommend Amazon Managed Grafana for hosting, but you can follow the Grafana instructions to choose and build the environment that fits your needs.

    When asked for your dashboard role, you can follow the instructions in the console to manually create an IAM policy and role to be used in Grafana. Alternatively, after creating the workspace, you can create the role automatically using the script in this package (next step).

    Finally, click "Create" on the Review page to create your workspace.

    Console link for us-east-1: https://us-east-1.console.aws.amazon.com/iottwinmaker/home?region=us-east-1

    c. Create a Grafana dashboard IAM role

    If you did not complete the dashboard settings steps in the console, run the following script to create a role for accessing the workspace from a Grafana dashboard. It uses scoped-down, read-only permissions for AWS IoT TwinMaker and the other AWS services used in Grafana. Note the ARN of the role you create; you will use it when configuring a data source in Grafana.

    python3 $GETTING_STARTED_DIR/src/modules/grafana/create_grafana_dashboard_role.py --workspace-id $WORKSPACE_ID --region $AWS_DEFAULT_REGION --account-id $CDK_DEFAULT_ACCOUNT

    If you are using Amazon Managed Grafana, add the field:

    --auth-provider <Amazon Managed Grafana Workspace IAM Role ARN>

    Make sure that your current AWS credentials are the same as the ones you use in Grafana. If not, go to the IAM console after running this script and update the trust permissions for the authentication provider you will be using. Read more about your authentication provider in the documentation.

    We automatically add permission for IoT TwinMaker and Kinesis Video Streams to enable the basic functionality of the Grafana datasource, Scene Viewer panel, and Video Player panel. If you would like to enable more features of the Video Player (time scrubber bar + video upload request from cache) then you need to manually update your IAM policy by following our video player policy documentation.

  4. Deploy an Instance of the Timestream Telemetry module.

    Timestream Telemetry is a sample telemetry store for IoT data. It uses a single Amazon Timestream database and table, plus a Lambda function for reading and writing. Later steps will fill this table with sample data for the Cookie Factory. The following commands create the database and table and deploy the Lambda function found under /src/lib/timestream_telemetry.

    cd $GETTING_STARTED_DIR/src/modules/timestream_telemetry/cdk/

    Use the following command to install dependencies for the module.

    npm install
    

    Deploy the module. (Enter 'y' when prompted to accept IAM changes.)

    cdk deploy
    
  5. Use the following commands to import the Cookie Factory content.

    cd $GETTING_STARTED_DIR/src/workspaces/cookiefactory/
    
    # import cookie factory data into your workspace
    python3 -m setup_content \
      --telemetry-stack-name $TIMESTREAM_TELEMETRY_STACK_NAME \
      --workspace-id $WORKSPACE_ID \
      --region-name $AWS_DEFAULT_REGION \
      --import-all

    If you want to reimport the sample content you need to add flags to delete the old content (such as --delete-all or individual flags such as --delete-telemetry and --delete-entities).

    If you want to import only parts of the sample content, you can use individual import flags instead of --import-all (such as --import-telemetry and --import-entities).

    Note: on initial import, the script saves the starting timestamp used for generating sample telemetry and video. It is stored on the TwinMaker workspace in the samples_content_start_time tag. On subsequent re-runs of the scripts, this starting timestamp is re-used for consistent data generation. If you would like to recreate the data using the current time instead, delete the tag from the workspace.

  6. (Optional) Verify connectivity for entities, scenes, and Unified Data Query (UDQ), then test data access by using UDQ.

    After importing all content, you can go to the IoT TwinMaker console to view the entities and scenes that you created.

    AWS IoT TwinMaker provides features to connect to and query your data sources via its component model and Unified Data Query interface. In this getting started guide, we imported some data into Timestream and set up the component and supporting UDQ Lambda function that enable us to query it. Use the following command to test whether we're able to query for alarm data by using the get-property-value-history API.

    aws iottwinmaker get-property-value-history \
       --region $AWS_DEFAULT_REGION \
       --cli-input-json '{"componentName": "AlarmComponent","endTime": "2023-06-01T00:00:00Z","entityId": "Mixer_2_06ac63c4-d68d-4723-891a-8e758f8456ef","orderByTime": "ASCENDING","selectedProperties": ["alarm_status"],"startTime": "2022-06-01T00:00:00Z","workspaceId": "'${WORKSPACE_ID}'"}'
    

    See Additional UDQ Sample Requests for other supported request examples.

  7. Set up Grafana for the Cookie Factory.

    AWS IoT TwinMaker provides a Grafana plugin that you can use to build dashboards using IoT TwinMaker scenes and modeled data sources. Grafana is deployable as a docker container. We recommend that new users follow these instructions to set up Grafana as a local container: Instructions. (If the link doesn't work in Cloud9, open docs/grafana_local_docker_setup.md.)

    For advanced users aiming to set up a production Grafana installation in their account, we recommend checking out https://github.com/aws-samples/aws-cdk-grafana.

  8. Import Grafana dashboards for the Cookie Factory.

    When you have the Grafana page open, you can click through the following to import the sample dashboard json file in $GETTING_STARTED_DIR/src/workspaces/cookiefactory/sample_dashboards/. (If you are running from Cloud9, you can right-click and download the file locally, then import it from your local machine.)

    • mixer_alarms_dashboard.json

    (Screenshot: importing the CookieFactory dashboard into Grafana)

    For the CookieFactory sample running with local Grafana, you can navigate to http://localhost:3000/d/y1FGfj57z/aws-iot-twinmaker-mixer-alarm-dashboard?orgId=1& to see the dashboard.

Deploying Additional (Add-on) Content

SiteWise Connector

In this section we'll add SiteWise assets and telemetry, and then update the CookieFactory digital twin entities to link to this data source.

  1. Add SiteWise assets and telemetry.

    python3 $GETTING_STARTED_DIR/src/modules/sitewise/deploy-utils/SiteWiseTelemetry.py import --csv-file $GETTING_STARTED_DIR/src/workspaces/cookiefactory/sample_data/telemetry/telemetry.csv \
      --entity-include-pattern WaterTank \
      --asset-model-name-prefix $WORKSPACE_ID
    
  2. Update entities to attach SiteWise connector.

    python3 $GETTING_STARTED_DIR/src/modules/sitewise/lib/patch_sitewise_content.py --workspace-id $WORKSPACE_ID --region $AWS_DEFAULT_REGION
    
  3. Test SiteWise data connectivity with UDQ to query WaterTank volume metrics.

    aws iottwinmaker get-property-value-history \
      --region $AWS_DEFAULT_REGION \
      --cli-input-json '{"componentName": "WaterTankVolume","endTime": "2023-06-01T00:00:00Z","entityId": "WaterTank_ab5e8bc0-5c8f-44d8-b0a9-bef9c8d2cfab","orderByTime": "ASCENDING","selectedProperties": ["tankVolume1"],"startTime": "2022-06-01T00:00:00Z","workspaceId": "'${WORKSPACE_ID}'"}'
    

S3 Document Connector

In this section we'll add an S3 connector to allow IoT TwinMaker entities to link to data stored in S3.

Go to the S3 module directory and check the README.

cd $GETTING_STARTED_DIR/src/modules/s3

AWS IoT TwinMaker Insights and Simulation

Note: this add-on will create running Amazon Kinesis Data Analytics (KDA) compute resources that may incur AWS charges. We recommend stopping or deleting the KDA notebook resources with the steps in Add-on Teardown: AWS IoT TwinMaker Insights and Simulation once you are finished using them.

In this section we'll use the AWS IoT TwinMaker Flink library to connect our Mixers' telemetry data to two services to enrich our entity data for deeper insights:

  • A Maplesoft simulation to calculate Mixer power consumption based on RPM
  • A pre-trained machine learning model for RPM anomaly detection

Both services are exposed as SageMaker endpoints that this add-on will set up in your account.

Go to the insights module directory and check the README.

cd $GETTING_STARTED_DIR/src/modules/insights

Additional UDQ Sample Requests

This section contains additional sample requests supported by get-property-value-history in the CookieFactory workspace.

  1. Single-entity, multi-property request (mixer data)

    aws iottwinmaker get-property-value-history \
       --region $AWS_DEFAULT_REGION \
       --cli-input-json '{"componentName": "MixerComponent","endTime": "2023-06-01T00:00:00Z","entityId": "Mixer_2_06ac63c4-d68d-4723-891a-8e758f8456ef","orderByTime": "ASCENDING","selectedProperties": ["Temperature", "RPM"],"startTime": "2022-06-01T00:00:00Z","workspaceId": "'${WORKSPACE_ID}'"}'
    
  2. Multi-entity, single-property request (alarm data)

    aws iottwinmaker get-property-value-history \
      --region $AWS_DEFAULT_REGION \
      --cli-input-json '{"componentTypeId": "com.example.cookiefactory.alarm","endTime": "2023-06-01T00:00:00Z","orderByTime": "ASCENDING","selectedProperties": ["alarm_status"],"startTime": "2022-06-01T00:00:00Z","workspaceId": "'${WORKSPACE_ID}'"}'
    
  3. Multi-entity, multi-property request (mixer data)

    aws iottwinmaker get-property-value-history \
      --region $AWS_DEFAULT_REGION \
      --cli-input-json '{"componentTypeId": "com.example.cookiefactory.mixer","endTime": "2023-06-01T00:00:00Z","orderByTime": "ASCENDING","selectedProperties": ["Temperature", "RPM"],"startTime": "2022-06-01T00:00:00Z","workspaceId": "'${WORKSPACE_ID}'"}'
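The sample requests above differ only in how they target data: a single-entity request uses componentName plus entityId, while a multi-entity request uses componentTypeId. The following illustrative sketch (not part of this repo) builds these payloads programmatically. As noted in an issue report later on this page, newer AWS CLI versions expect startDateTime/endDateTime in place of startTime/endTime, so adjust the field names for your CLI version.

```python
import json

# Illustrative sketch: build the --cli-input-json payload for
# get-property-value-history. Single-entity requests use componentName +
# entityId; multi-entity requests use componentTypeId instead.
def udq_request(workspace_id, selected_properties, component_name=None,
                entity_id=None, component_type_id=None):
    payload = {
        "workspaceId": workspace_id,
        "selectedProperties": selected_properties,
        # Newer AWS CLI versions require "startDateTime"/"endDateTime" here.
        "startTime": "2022-06-01T00:00:00Z",
        "endTime": "2023-06-01T00:00:00Z",
        "orderByTime": "ASCENDING",
    }
    if component_type_id is not None:
        payload["componentTypeId"] = component_type_id   # multi-entity
    else:
        payload["componentName"] = component_name        # single-entity
        payload["entityId"] = entity_id
    return json.dumps(payload)

# Multi-entity, single-property alarm request (sample 2 above):
print(udq_request("CookieFactory", ["alarm_status"],
                  component_type_id="com.example.cookiefactory.alarm"))
```

The output string can be passed directly to `aws iottwinmaker get-property-value-history --cli-input-json`.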
    

Teardown

Note that these are destructive actions and will remove all content you have created/modified from this sample.

You should have the following environment variables set from the previous Setup instructions.

GETTING_STARTED_DIR=__see_above__
WORKSPACE_ID=__see_above__
TIMESTREAM_TELEMETRY_STACK_NAME=__see_above__
AWS_DEFAULT_REGION=us-east-1

Add-on Teardown: SiteWise Connector

Run the following if you installed the add-on SiteWise content and would like to remove it.

python3 $GETTING_STARTED_DIR/src/modules/sitewise/deploy-utils/SiteWiseTelemetry.py cleanup --asset-model-name-prefix $WORKSPACE_ID

Add-on Teardown: S3 Document Connector

Run the following if you installed the add-on S3 Document Connector content and would like to remove it.

aws cloudformation delete-stack --stack-name IoTTwinMakerCookieFactoryS3 --region $AWS_DEFAULT_REGION && aws cloudformation wait stack-delete-complete --stack-name IoTTwinMakerCookieFactoryS3 --region $AWS_DEFAULT_REGION

Add-on Teardown: AWS IoT TwinMaker Insights and Simulation

Run the following if you installed the add-on AWS IoT TwinMaker Insights and Simulation content and would like to remove it. These stacks may take several minutes to delete.

Delete installed assets

python3 $INSIGHT_DIR/install_insights_module.py --workspace-id $WORKSPACE_ID --region-name $AWS_DEFAULT_REGION --kda-stack-name $KDA_STACK_NAME --sagemaker-stack-name $SAGEMAKER_STACK_NAME --delete-all

Delete cloudformation stacks

aws cloudformation delete-stack --stack-name $KDA_STACK_NAME --region $AWS_DEFAULT_REGION && aws cloudformation wait stack-delete-complete --stack-name $KDA_STACK_NAME --region $AWS_DEFAULT_REGION
aws cloudformation delete-stack --stack-name $SAGEMAKER_STACK_NAME --region $AWS_DEFAULT_REGION && aws cloudformation wait stack-delete-complete --stack-name $SAGEMAKER_STACK_NAME --region $AWS_DEFAULT_REGION

Delete Base Content

Change directory

cd $GETTING_STARTED_DIR/src/workspaces/cookiefactory

Delete the Grafana dashboard role (if it exists)

python3 $GETTING_STARTED_DIR/src/modules/grafana/cleanup_grafana_dashboard_role.py --workspace-id $WORKSPACE_ID --region $AWS_DEFAULT_REGION

Delete AWS IoT TwinMaker workspace + contents

# this script is safe to terminate and restart if entities seem stuck in deletion
python3 -m setup_content \
     --telemetry-stack-name $TIMESTREAM_TELEMETRY_STACK_NAME \
     --workspace-id $WORKSPACE_ID \
     --region-name $AWS_DEFAULT_REGION \
     --delete-all \
     --delete-workspace-role-and-bucket

Delete the Telemetry CFN stack + wait

aws cloudformation delete-stack --stack-name $TIMESTREAM_TELEMETRY_STACK_NAME --region $AWS_DEFAULT_REGION && aws cloudformation wait stack-delete-complete --stack-name $TIMESTREAM_TELEMETRY_STACK_NAME --region $AWS_DEFAULT_REGION

(Optional) Delete local Grafana configuration

rm -rf ~/local_grafana_data/

Troubleshooting

For any issue not addressed here, please open an issue or contact AWS Support.

ImportError: libGL.so.1: cannot open shared object file: No such file or directory

Ensure you have mesa-libGL installed, e.g.

sudo yum install mesa-libGL

Security

See CONTRIBUTING for more information.

License

This project is licensed under the Apache-2.0 License.

aws-iot-twinmaker-samples's People

Contributors

amazon-auto, crazyshipone, cshang2017, gsechkin-aws, hanamz, hemantborole, hwandersman, intpix, johnnyw-aws, massimosporchia, maxzyantao, mo-elmu, panjintian, parsonjs, schleidl, stevenyang-aws


aws-iot-twinmaker-samples's Issues

Cloud9 Setup VPC

The setup for Cloud9 now requires a VPC.

Would appreciate a mention of this as well as acceptable default values in the file describing the Cloud9 setup.

Tables are not being created for TimestreamTelemetryCdkLambdasStack database

/AWS/aws-iot-twinmaker-samples/src/workspaces/cookiefactory> python3 -m setup_content --telemetry-stack-name $TIMESTREAM_TELEMETRY_STACK_NAME --workspace-id $WORKSPACE_ID --region-name $AWS_DEFAULT_REGION --import-all
Traceback (most recent call last):
File "c:\Python310\lib\runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "c:\Python310\lib\runpy.py", line 86, in _run_code
exec(code, run_globals)
File "c:\opt\AWS\aws-iot-twinmaker-samples\src\workspaces\cookiefactory\setup_content\__main__.py", line 179, in <module>
main()
File "c:\opt\AWS\aws-iot-twinmaker-samples\src\workspaces\cookiefactory\setup_content\__main__.py", line 89, in main
telemetry = timestream_libs.TimestreamTelemetryImporter(
File "c:\opt\AWS\aws-iot-twinmaker-samples\src\workspaces\cookiefactory\setup_content/../../../modules\timestream_telemetry\lib\TimestreamTelemetryUtils.py", line 22, in __init__
cfn_stack_description = cfn_client.describe_stacks(StackName=stack_name)
File "C:\Users\SRJ\AppData\Roaming\Python\Python310\site-packages\botocore\client.py", line 391, in _api_call
return self._make_api_call(operation_name, kwargs)
File "C:\Users\SRJ\AppData\Roaming\Python\Python310\site-packages\botocore\client.py", line 719, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (ValidationError) when calling the DescribeStacks operation: Stack with id CookieFactoryTelemetry does not exist
using following timestamp for data ingestion: 2022-02-03 03:19:28 UTC (1643858368758ms from epoch)

Issue with newer versions of ffmpeg

The version of ffmpeg that currently works is 5.1.2; however, this is no longer available via Homebrew, which only has versions 5.1.3 and 6. This creates an issue when running this command in the build steps:

python3 $GETTING_STARTED_DIR/src/modules/grafana/create_grafana_dashboard_role.py --workspace-id $WORKSPACE_ID --region $AWS_DEFAULT_REGION --account-id $CDK_DEFAULT_ACCOUNT

I have already resolved this issue by updating opencv to the latest version and am in the process of creating a pull request.

PutObject operation: Access Denied

CookieFactoryDemo: creating CloudFormation changeset...
3:32:44 PM | CREATE_FAILED | AWS::CloudFormation::CustomResource | TmdtAppiottwinmakerWorkspaceData40DBAECF
Received response status [FAILED] from custom resource. Message returned: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied (RequestId: 9b4e4168-57c1-492c-9903-8daa99bc0bae)

❌ CookieFactoryV2Stack (CookieFactoryDemo) failed: Error: The stack named CookieFactoryDemo failed creation, it may need to be manually deleted from the AWS console: ROLLBACK_COMPLETE: Received response status [FAILED] from custom resource. Message returned: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied (RequestId: 9b4e4168-57c1-492c-9903-8daa99bc0bae)
at FullCloudFormationDeployment.monitorDeployment (/Users/mk/.nvm/versions/node/v20.4.0/lib/node_modules/aws-cdk/lib/index.js:426:10236)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async Object.deployStack2 [as deployStack] (/Users/mk/.nvm/versions/node/v20.4.0/lib/node_modules/aws-cdk/lib/index.js:429:153208)
at async /Users/mk/.nvm/versions/node/v20.4.0/lib/node_modules/aws-cdk/lib/index.js:429:136985

❌ Deployment failed: Error: The stack named CookieFactoryDemo failed creation, it may need to be manually deleted from the AWS console: ROLLBACK_COMPLETE: Received response status [FAILED] from custom resource. Message returned: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied (RequestId: 9b4e4168-57c1-492c-9903-8daa99bc0bae)
at FullCloudFormationDeployment.monitorDeployment (/Users/mk/.nvm/versions/node/v20.4.0/lib/node_modules/aws-cdk/lib/index.js:426:10236)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async Object.deployStack2 [as deployStack] (/Users/mk/.nvm/versions/node/v20.4.0/lib/node_modules/aws-cdk/lib/index.js:429:153208)
at async /Users/mk/.nvm/versions/node/v20.4.0/lib/node_modules/aws-cdk/lib/index.js:429:136985

The stack named CookieFactoryDemo failed creation, it may need to be manually deleted from the AWS console: ROLLBACK_COMPLETE: Received response status [FAILED] from custom resource. Message returned: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied (RequestId: 9b4e4168-57c1-492c-9903-8daa99bc0bae)

turn non-prod into prod

Please identify what needs to be done to make it a production stack

src/workspaces/cookiefactoryv3

Security token invalid when executing script to create role for Grafana

Hi,

When I execute the script

python3 $GETTING_STARTED_DIR/src/modules/grafana/create_grafana_dashboard_role.py --workspace-id $WORKSPACE_ID --region $AWS_DEFAULT_REGION --profile $MyProfile

It produces error botocore.exceptions.ClientError: An error occurred (InvalidClientTokenId) when calling the GetCallerIdentity operation: The security token included in the request is invalid.

This happens when using a specific profile.

AttributeError: module 'deploy_utils' has no attribute 'WorkspaceUtils'

python3 $GETTING_STARTED_DIR/src/modules/grafana/cleanup_grafana_dashboard_role.py --workspace-id $WORKSPACE_ID --region $AWS_DEFAULT_REGION
Traceback (most recent call last):
File "/home/ubuntu/environment/aws-iot-twinmaker-samples/src/modules/grafana/cleanup_grafana_dashboard_role.py", line 80, in
main()
File "/home/ubuntu/environment/aws-iot-twinmaker-samples/src/modules/grafana/cleanup_grafana_dashboard_role.py", line 44, in main
ws = deploy_utils.WorkspaceUtils(
AttributeError: module 'deploy_utils' has no attribute 'WorkspaceUtils'

Make setup_local_grafana_docker.sh script idempotent

When the environment is set up for long-term usage, such as a demo, the user will probably encounter the Grafana Docker container being stopped or terminated. The easiest way to re-run the container is by re-running setup_local_grafana_docker.sh, which emitted some errors about file-deletion permissions.
This lowers the trust users have in the stability of the environment.
Suggestion: make the script idempotent.

iottwinmaker project access problem

After inputting "aws iottwinmaker list-workspaces --region us-east-1" as shown in the prerequisites section, the console says:

An error occurred (AccessDeniedException) when calling the ListWorkspaces operation: User: arn:aws:iam::453344025470:user/xijianlou is not authorized to perform: iottwinmaker:ListWorkspaces on resource: * because no identity-based policy allows the iottwinmaker:ListWorkspaces action:

The security token included in the request is expired

Hi guys,

Could someone help me? I am getting the following error when following the steps to deploy the cookie factory:

botocore.exceptions.ClientError: An error occurred (ExpiredToken) when calling the CreateRole operation: The security token included in the request is expired

Thanks

Query data error while we are building this sample

Hi Team,
We had set up the CookieFactory sample to explore TwinMaker. Since yesterday, we have started to see a Query Data error on the Alarms List panel of the CookieFactory AWS Managed Grafana dashboard. We have tried multiple ways to debug this issue, and we need the experts to support us here.
(Screenshot: Query Data error on the Alarms List panel)

Can someone help us on this issue to proceed further?

Update: when we tried with a local instance of Grafana, we see the below error in the log.

logger=context traceID=00000000000000000000000000000000 userId=1 orgId=1 uname=admin t=2022-06-17T09:30:52.699790656Z level=error msg="Query data error" error="failed to query data: Failed to query data: rpc error: code = Unavailable desc = error reading from server: EOF" remote_addr=103.210.207.111 traceID=00000000000000000000000000000000
logger=context traceID=00000000000000000000000000000000 userId=1 orgId=1 uname=admin t=2022-06-17T09:30:52.700484991Z level=error msg="Request Completed" method=POST path=/api/ds/query status=500 remote_addr=103.210.207.111 time_ms=909 duration=909.577838ms size=75 referer="http://local:3000/d/FTW3SQj7k/testing-twinmaker-integration?orgId=1&editPanel=2" traceID=00000000000000000000000000000000
logger=plugin.grafana-iot-twinmaker-datasource t=2022-06-17T09:30:53.333277278Z level=info msg=Profiler enabled=false

Thanks,
Palani

Step 6 in Deploy needs update [startDateTime / endDateTime]

startTime and endTime in step6 of Deploy Guide needs to be updated to startDateTime and endDateTime:

aws iottwinmaker get-property-value-history
--region $AWS_DEFAULT_REGION
--cli-input-json '{"componentName": "AlarmComponent","endTime": "2023-06-01T00:00:00Z","entityId": "Mixer_2_06ac63c4-d68d-4723-891a-8e758f8456ef","orderByTime": "ASCENDING","selectedProperties": ["alarm_status"],"startTime": "2022-06-01T00:00:00Z","workspaceId": "'${WORKSPACE_ID}'"}'

Parameter validation failed:
Missing required parameter in input: "endDateTime"
Missing required parameter in input: "startDateTime"

Works with below update:

aws iottwinmaker get-property-value-history
--region $AWS_DEFAULT_REGION
--cli-input-json '{"componentName": "AlarmComponent","endDateTime": "2023-06-01T00:00:00Z","entityId": "Mixer_2_06ac63c4-d68d-4723-891a-8e758f8456ef","orderByTime": "ASCENDING","selectedProperties": ["alarm_status"],"startDateTime": "2022-06-01T00:00:00Z","workspaceId": "'${WORKSPACE_ID}'"}'

Grafana not showing data

Hi,

I executed all the scripts and everything seems fine. However, I am stuck at the last step: after importing the dashboard into Grafana, there is no data in any tab:
(Screenshot: dashboard panels showing no data)

Is there any step that should be done?

Thanks.

Cloud9: Browsing your Grafana instance without exposing the instance to HTTP in security groups

When working in a Cloud9 environment, there is a simple option to expose the Grafana container instantiated as part of setup_local_grafana_docker.sh to the end user without

temporarily changing the Ingress rule for HTTP in your security group to "Anywhere-IPv4"

This can be achieved by leveraging Cloud9's integrated port forwarding. See here for more details:
https://docs.aws.amazon.com/cloud9/latest/user-guide/app-preview.html#app-preview-preview-app

A simple modification of the way the Grafana container is instantiated, using port 8080 instead of 80:

docker run -d \
  -p 8080:3000 \
  --name=${CONTAINER_NAME} \
  -v ${SCRIPT_DIR}/local_grafana_data:/var/lib/grafana \
  -e "GF_INSTALL_PLUGINS=grafana-iot-twinmaker-app" \
  grafana/grafana:8.2.5

This will enable the user to browse from their local PC to the Cloud9 address (securely over HTTPS):

https://<cloud9-environment-id>.vfs.cloud9.<region>.amazonaws.com/

Issue in this, some recommendations

Deploy an Instance of the Timestream Telemetry module.

failed to register layer: Error processing tar file(exit status 1): write /usr/lib/jvm/java-1.8.0-amazon-corretto/jre/lib/rt.jar: no space left on device

/home/ec2-user/environment/aws-iot-twinmaker-samples/src/modules/timestream_telemetry/cdk/node_modules/@aws-cdk/core/lib/bundling.ts:276
throw new Error(`${prog} exited with status ${proc.status}`);
^
Error: docker exited with status 1
at dockerExec (/home/ec2-user/environment/aws-iot-twinmaker-samples/src/modules/timestream_telemetry/cdk/node_modules/@aws-cdk/core/lib/bundling.ts:276:11)
at Function.fromBuild (/home/ec2-user/environment/aws-iot-twinmaker-samples/src/modules/timestream_telemetry/cdk/node_modules/@aws-cdk/core/lib/bundling.ts:159:5)
at Object.bundle (/home/ec2-user/environment/aws-iot-twinmaker-samples/src/modules/timestream_telemetry/cdk/node_modules/@aws-cdk/aws-lambda-python/lib/bundling.ts:106:33)
at new PythonLayerVersion (/home/ec2-user/environment/aws-iot-twinmaker-samples/src/modules/timestream_telemetry/cdk/node_modules/@aws-cdk/aws-lambda-python/lib/layer.ts:43:13)
at new TimestreamTelemetryCdkLambdasStack (/home/ec2-user/environment/aws-iot-twinmaker-samples/src/modules/timestream_telemetry/cdk/lib/timestream_telemetry_lambdas-stack.ts:45:11)
at Object.<anonymous> (/home/ec2-user/environment/aws-iot-twinmaker-samples/src/modules/timestream_telemetry/cdk/bin/timestream_telemetry_cdk_lambdas.ts:9:1)
at Module._compile (node:internal/modules/cjs/loader:1103:14)
at Module.m._compile (/home/ec2-user/environment/aws-iot-twinmaker-samples/src/modules/timestream_telemetry/cdk/node_modules/ts-node/src/index.ts:1056:23)
at Module._extensions..js (node:internal/modules/cjs/loader:1155:10)
at Object.require.extensions.<computed> [as .ts] (/home/ec2-user/environment/aws-iot-twinmaker-samples/src/modules/timestream_telemetry/cdk/node_modules/ts-node/src/index.ts:1059:12)
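The "no space left on device" failure usually means the Cloud9 instance's small root volume has filled up with Docker build layers. A common workaround (a sketch, not an official fix) is to reclaim Docker disk space before retrying cdk deploy; if that is not enough, the Cloud9 EBS volume itself needs to be resized.

```shell
# Show current root-volume usage first.
df -h /

# Remove unused Docker images, containers, and build cache to free space.
# Guarded so the snippet is a no-op where Docker is not installed.
if command -v docker >/dev/null 2>&1; then
  docker system prune -af
fi
```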

Cloud9: ImportError: libGL.so.1: cannot open shared object file: No such file or directory

I'm setting up the environment on Cloud9 IDE, and getting the following error:
ImportError: libGL.so.1: cannot open shared object file: No such file or directory

Full trace:

python3 $GETTING_STARTED_DIR/src/modules/grafana/create_grafana_dashboard_role.py --workspace-id $WORKSPACE_ID --region $AWS_DEFAULT_REGION
Traceback (most recent call last):
  File "/home/ec2-user/environment/aws-iot-twinmaker-samples/src/modules/grafana/create_grafana_dashboard_role.py", line 11, in <module>
    import deploy_utils
  File "/home/ec2-user/environment/aws-iot-twinmaker-samples/src/modules/grafana/../../libs/deploy_utils/__init__.py", line 5, in <module>
    from .VideoUtils import *
  File "/home/ec2-user/environment/aws-iot-twinmaker-samples/src/modules/grafana/../../libs/deploy_utils/VideoUtils.py", line 12, in <module>
    import cv2
  File "/home/ec2-user/.local/lib/python3.7/site-packages/cv2/__init__.py", line 5, in <module>
    from .cv2 import *
ImportError: libGL.so.1: cannot open shared object file: No such file or directory

Environment:
uname -a: Linux ip-.ec2.internal 4.14.256-197.484.amzn2.x86_64 #1 SMP Tue Nov 30 00:17:50 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
AMI: Cloud9AmazonLinux2-2021-12-11T02-42
AWS managed credentials: YES

Solution:

sudo yum install mesa-libGL
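After installing the package, a quick sanity check (a sketch) confirms the loader can see libGL and that cv2 imports again:

```shell
# Confirm the shared object is registered with the dynamic loader...
ldconfig -p | grep libGL \
  || echo "libGL.so.1 still missing - rerun: sudo yum install mesa-libGL"

# ...and that OpenCV now imports (prints its version on success).
python3 -c "import cv2; print(cv2.__version__)" 2>/dev/null \
  || echo "cv2 import still failing"
```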

7. Set up Grafana for the Cookie Factory

Any suggestions?

rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/CHANGELOG.md’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/LICENSE’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/MANIFEST.txt’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/README.md’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/datasource/dashboards/twinmaker-alarm-dashboard.json’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/datasource/dashboards/twinmaker-main-dashboard.json’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/datasource/img/AWS-IoT-TwinMaker.svg’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/datasource/module.js’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/datasource/module.js.LICENSE.txt’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/datasource/module.js.map’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/datasource/plugin.json’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/gpx_twinmaker_app_darwin_amd64’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/gpx_twinmaker_app_darwin_arm64’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/gpx_twinmaker_app_linux_amd64’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/gpx_twinmaker_app_linux_arm’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/gpx_twinmaker_app_linux_arm64’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/gpx_twinmaker_app_windows_amd64.exe’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/img/AWS-IoT-TwinMaker.png’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/module.js’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/module.js.LICENSE.txt’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/module.js.map’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/alarm/img/icon.svg’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/alarm/module.js’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/alarm/module.js.LICENSE.txt’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/alarm/module.js.map’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/alarm/plugin_SKIP.json’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/layout/img/AWS-IoT-TwinMaker.svg’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/layout/module.js’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/layout/module.js.LICENSE.txt’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/layout/module.js.map’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/layout/plugin.json’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/scene-viewer/img/AWS-IoT-TwinMaker.svg’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/scene-viewer/module.js’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/scene-viewer/module.js.LICENSE.txt’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/scene-viewer/module.js.map’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/scene-viewer/plugin.json’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/video-player/img/AWS-IoT-TwinMaker.svg’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/video-player/module.js’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/video-player/module.js.LICENSE.txt’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/video-player/module.js.map’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/panels/video-player/plugin.json’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/static/draco/draco_decoder.js’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/static/draco/draco_decoder.wasm’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/static/draco/draco_wasm_wrapper.js’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/static/hdri/Chromatic_sm.hdr’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/static/hdri/Directional_sm.hdr’: Permission denied
rm: cannot remove ‘/home/ec2-user/local_grafana_data/plugins/grafana-iot-twinmaker-app/static/hdri/Neutral_sm.hdr’: Permission denied
24d2121c4b1ecd89122f532fedcc5066d04eb9363757b6200d7d5ff0abc976fc

Process exited with code: 0

Pane is dead
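The files under local_grafana_data are created by the Grafana container's internal user (uid 472 in the grafana/grafana image), which is why the host user gets Permission denied. A hedged sketch of a cleanup, taking ownership back before removing (the path matches the error messages above):

```shell
# The plugin files belong to the container's grafana user, not ec2-user.
# Reclaim ownership first; then rm succeeds without sudo on every file.
GRAFANA_DATA="$HOME/local_grafana_data"   # directory used by setup_local_grafana_docker.sh
if [ -d "$GRAFANA_DATA" ]; then
  sudo chown -R "$(id -un)":"$(id -gn)" "$GRAFANA_DATA"
  rm -rf "$GRAFANA_DATA/plugins/grafana-iot-twinmaker-app"
fi
```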

Error when running cdk deploy

Error msg:
no such file or directory, open
"/usr/lib/node_modules/docker/node_modules/highlight.js/styles/rsync -r /var/dependencies/. /asset-output/python && rsync -r . /asset-output/python.css"

  1. I went to the node_modules folder and found there was no highlight.js module;
  2. I installed the module: sudo npm install highlight.js@latest -g
  3. I went to the styles folder, but still cannot find the rsync file.

Is there a different Highlight.js package? Thanks!

unable to execute 'x86_64-linux-gnu-gcc': No such file or directory

I tried AWS IoT TwinMaker Insights, but I'm having trouble running the cdk deploy command in a Cloud9 environment.

$ cdk deploy --all --require-approval never                    

✨  Synthesis time: 18.38s

CookieFactoryKdaStack: building assets...

current credentials could not be used to assume 'arn:aws:iam::000000000000:role/cdk-hnb659fds-deploy-role-000000000000-us-east-1', but are for the right account. Proceeding anyway.
[0%] start: Building 9ac91652bed689e137a8c2dc37127d334623e1706c485e737842c897a9e2fa64:000000000000-us-east-1
[100%] success: Built 9ac91652bed689e137a8c2dc37127d334623e1706c485e737842c897a9e2fa64:000000000000-us-east-1

CookieFactoryKdaStack: assets built

CookieFactorySageMakerStack: building assets...

current credentials could not be used to assume 'arn:aws:iam::000000000000:role/cdk-hnb659fds-deploy-role-000000000000-us-east-1', but are for the right account. Proceeding anyway.
[0%] start: Building b13c84220b58e4ebc58b6534d32c90621be7a29ba8e292daa1b3fea7c9100af2:000000000000-us-east-1
[0%] start: Building 6c86094f4acfcf2f82ef96efa77c98f0485abe716d705290350e419445a6ac06:000000000000-us-east-1
[0%] start: Building e1d3d136595270ead4b70f507ab0fc6e83efa762a3cbfdbd3527574738b9e29b:000000000000-us-east-1
[0%] start: Building 4afdcd85010adc84f46c5851a7f4baf15718f79c83916fb4c118f2eea5f1024f:000000000000-us-east-1
[25%] success: Built b13c84220b58e4ebc58b6534d32c90621be7a29ba8e292daa1b3fea7c9100af2:000000000000-us-east-1
[50%] success: Built 6c86094f4acfcf2f82ef96efa77c98f0485abe716d705290350e419445a6ac06:000000000000-us-east-1
[75%] success: Built e1d3d136595270ead4b70f507ab0fc6e83efa762a3cbfdbd3527574738b9e29b:000000000000-us-east-1
current credentials could not be used to assume 'arn:aws:iam::000000000000:role/cdk-hnb659fds-image-publishing-role-000000000000-us-east-1', but are for the right account. Proceeding anyway.
Sending build context to Docker daemon  160.8kB
Step 1/11 : FROM ubuntu:18.04
 ---> 251b86c83674
Step 2/11 : RUN apt-get -y update && apt-get install -y --no-install-recommends          wget          python3-pip          python3-matplotlib          python3-setuptools          nginx          cmake          ca-certificates     && cd /usr/local/bin     && ln -s /usr/bin/python3.6 python     && rm -rf /var/lib/apt/lists/*
 ---> Using cache
 ---> 7d6a64b9cb3a
Step 3/11 : RUN ln -s /usr/bin/python3 /usr/bin/python
 ---> Using cache
 ---> 1d8830efe4c1
Step 4/11 : RUN ln -s /usr/bin/pip3 /usr/bin/pip
 ---> Using cache
 ---> 737f976d9f6c
Step 5/11 : RUN pip --no-cache-dir install matplotlib pathlib numpy==1.16.2 scipy==1.2.1 scikit-learn==0.20.2 pandas flask gunicorn gevent==1.4 fmpy==0.3.1
 ---> Running in 2d5485e31f80
Requirement already satisfied: matplotlib in /usr/lib/python3/dist-packages
Collecting pathlib
  Downloading https://files.pythonhosted.org/packages/78/f9/690a8600b93c332de3ab4a344a4ac34f00c8f104917061f779db6a918ed6/pathlib-1.0.1-py3-none-any.whl
Collecting numpy==1.16.2
  Downloading https://files.pythonhosted.org/packages/35/d5/4f8410ac303e690144f0a0603c4b8fd3b986feb2749c435f7cdbb288f17e/numpy-1.16.2-cp36-cp36m-manylinux1_x86_64.whl (17.3MB)
Collecting scipy==1.2.1
  Downloading https://files.pythonhosted.org/packages/7f/5f/c48860704092933bf1c4c1574a8de1ffd16bf4fde8bab190d747598844b2/scipy-1.2.1-cp36-cp36m-manylinux1_x86_64.whl (24.8MB)
Collecting scikit-learn==0.20.2
  Downloading https://files.pythonhosted.org/packages/0d/3a/b92670f5c368c20329ecc4c255993fae7934564d485c3ed7ea7b8da7f741/scikit_learn-0.20.2-cp36-cp36m-manylinux1_x86_64.whl (5.4MB)
Collecting pandas
  Downloading https://files.pythonhosted.org/packages/c3/e2/00cacecafbab071c787019f00ad84ca3185952f6bb9bca9550ed83870d4d/pandas-1.1.5-cp36-cp36m-manylinux1_x86_64.whl (9.5MB)
Collecting flask
  Downloading https://files.pythonhosted.org/packages/cd/77/59df23681f4fd19b7cbbb5e92484d46ad587554f5d490f33ef907e456132/Flask-2.0.3-py3-none-any.whl (95kB)
Collecting gunicorn
  Downloading https://files.pythonhosted.org/packages/e4/dd/5b190393e6066286773a67dfcc2f9492058e9b57c4867a95f1ba5caf0a83/gunicorn-20.1.0-py3-none-any.whl (79kB)
Collecting gevent==1.4
  Downloading https://files.pythonhosted.org/packages/f2/ca/5b5962361ed832847b6b2f9a2d0452c8c2f29a93baef850bb8ad067c7bf9/gevent-1.4.0-cp36-cp36m-manylinux1_x86_64.whl (5.5MB)
Collecting fmpy==0.3.1
  Downloading https://files.pythonhosted.org/packages/93/12/524dfcfd08ffff4ef60b1eb407e685f862d8a1be4d71092b7328695c7559/FMPy-0.3.1-py3-none-any.whl (4.4MB)
Requirement already satisfied: pytz>=2017.2 in /usr/lib/python3/dist-packages (from pandas)
Collecting python-dateutil>=2.7.3 (from pandas)
  Downloading https://files.pythonhosted.org/packages/36/7a/87837f39d0296e723bb9b62bbb257d0355c7f6128853c78955f57342a56d/python_dateutil-2.8.2-py2.py3-none-any.whl (247kB)
Collecting click>=7.1.2 (from flask)
  Downloading https://files.pythonhosted.org/packages/4a/a8/0b2ced25639fb20cc1c9784de90a8c25f9504a7f18cd8b5397bd61696d7d/click-8.0.4-py3-none-any.whl (97kB)
Collecting Jinja2>=3.0 (from flask)
  Downloading https://files.pythonhosted.org/packages/20/9a/e5d9ec41927401e41aea8af6d16e78b5e612bca4699d417f646a9610a076/Jinja2-3.0.3-py3-none-any.whl (133kB)
Collecting Werkzeug>=2.0 (from flask)
  Downloading https://files.pythonhosted.org/packages/f4/f3/22afbdb20cc4654b10c98043414a14057cd27fdba9d4ae61cea596000ba2/Werkzeug-2.0.3-py3-none-any.whl (289kB)
Collecting itsdangerous>=2.0 (from flask)
  Downloading https://files.pythonhosted.org/packages/9c/96/26f935afba9cd6140216da5add223a0c465b99d0f112b68a4ca426441019/itsdangerous-2.0.1-py3-none-any.whl
Requirement already satisfied: setuptools>=3.0 in /usr/lib/python3/dist-packages (from gunicorn)
Collecting greenlet>=0.4.14; platform_python_implementation == "CPython" (from gevent==1.4)
  Downloading https://files.pythonhosted.org/packages/fd/6a/f07b0028baff9bca61ecfcd9ee021e7e33369da8094f00eff409f2ff32be/greenlet-2.0.1.tar.gz (163kB)
Collecting attrs (from fmpy==0.3.1)
  Downloading https://files.pythonhosted.org/packages/fb/6e/6f83bf616d2becdf333a1640f1d463fef3150e2e926b7010cb0f81c95e88/attrs-22.2.0-py3-none-any.whl (60kB)
Collecting msgpack (from fmpy==0.3.1)
  Downloading https://files.pythonhosted.org/packages/22/44/0829b19ac243211d1d2bd759999aa92196c546518b0be91de9cacc98122a/msgpack-1.0.4.tar.gz (128kB)
Collecting lxml (from fmpy==0.3.1)
  Downloading https://files.pythonhosted.org/packages/6d/b9/44f7e3b8a27eeef778188c50ad11feb46c7572f06227b4842188730591db/lxml-4.9.2-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (5.8MB)
Collecting lark-parser (from fmpy==0.3.1)
  Downloading https://files.pythonhosted.org/packages/76/00/90f05db333fe1aa6b6ffea83a35425b7d53ea95c8bba0b1597f226cf1d5f/lark_parser-0.12.0-py2.py3-none-any.whl (103kB)
Requirement already satisfied: six>=1.5 in /usr/lib/python3/dist-packages (from python-dateutil>=2.7.3->pandas)
Collecting importlib-metadata; python_version < "3.8" (from click>=7.1.2->flask)
  Downloading https://files.pythonhosted.org/packages/a0/a1/b153a0a4caf7a7e3f15c2cd56c7702e2cf3d89b1b359d1f1c5e59d68f4ce/importlib_metadata-4.8.3-py3-none-any.whl
Collecting MarkupSafe>=2.0 (from Jinja2>=3.0->flask)
  Downloading https://files.pythonhosted.org/packages/fc/d6/57f9a97e56447a1e340f8574836d3b636e2c14de304943836bd645fa9c7e/MarkupSafe-2.0.1-cp36-cp36m-manylinux1_x86_64.whl
Collecting dataclasses; python_version < "3.7" (from Werkzeug>=2.0->flask)
  Downloading https://files.pythonhosted.org/packages/fe/ca/75fac5856ab5cfa51bbbcefa250182e50441074fdc3f803f6e76451fab43/dataclasses-0.8-py3-none-any.whl
Collecting zipp>=0.5 (from importlib-metadata; python_version < "3.8"->click>=7.1.2->flask)
  Downloading https://files.pythonhosted.org/packages/bd/df/d4a4974a3e3957fd1c1fa3082366d7fff6e428ddb55f074bf64876f8e8ad/zipp-3.6.0-py3-none-any.whl
Collecting typing-extensions>=3.6.4; python_version < "3.8" (from importlib-metadata; python_version < "3.8"->click>=7.1.2->flask)
  Downloading https://files.pythonhosted.org/packages/45/6b/44f7f8f1e110027cf88956b59f2fad776cca7e1704396d043f89effd3a0e/typing_extensions-4.1.1-py3-none-any.whl
Installing collected packages: pathlib, numpy, scipy, scikit-learn, python-dateutil, pandas, zipp, typing-extensions, importlib-metadata, click, MarkupSafe, Jinja2, dataclasses, Werkzeug, itsdangerous, flask, gunicorn, greenlet, gevent, attrs, msgpack, lxml, lark-parser, fmpy
  Found existing installation: numpy 1.13.3
    Not uninstalling numpy at /usr/lib/python3/dist-packages, outside environment /usr
  Found existing installation: python-dateutil 2.6.1
    Not uninstalling python-dateutil at /usr/lib/python3/dist-packages, outside environment /usr
  Running setup.py install for greenlet: started
    Running setup.py install for greenlet: finished with status 'error'
    Complete output from command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-noscv1ag/greenlet/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-_7ohigw7-record/install-record.txt --single-version-externally-managed --compile:
    /usr/lib/python3.6/distutils/dist.py:261: UserWarning: Unknown distribution option: 'long_description_content_type'
      warnings.warn(msg)
    running install
    running build
    running build_py
    creating build
    creating build/lib.linux-x86_64-3.6
    creating build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/__init__.py -> build/lib.linux-x86_64-3.6/greenlet
    creating build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/__init__.py -> build/lib.linux-x86_64-3.6/greenlet/platform
    creating build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/__init__.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/leakcheck.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/test_contextvars.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/test_cpp.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/test_extension_interface.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/test_gc.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/test_generator.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/test_generator_nested.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/test_greenlet.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/test_greenlet_trash.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/test_leaks.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/test_stack_saved.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/test_throw.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/test_tracing.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/test_version.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/test_weakref.py -> build/lib.linux-x86_64-3.6/greenlet/tests
    running egg_info
    writing src/greenlet.egg-info/PKG-INFO
    writing dependency_links to src/greenlet.egg-info/dependency_links.txt
    writing requirements to src/greenlet.egg-info/requires.txt
    writing top-level names to src/greenlet.egg-info/top_level.txt
    reading manifest file 'src/greenlet.egg-info/SOURCES.txt'
    reading manifest template 'MANIFEST.in'
    warning: no previously-included files found matching 'benchmarks/*.json'
    no previously-included directories found matching 'docs/_build'
    warning: no files found matching '*.py' under directory 'appveyor'
    warning: no previously-included files matching '*.pyc' found anywhere in distribution
    warning: no previously-included files matching '*.pyd' found anywhere in distribution
    warning: no previously-included files matching '*.so' found anywhere in distribution
    warning: no previously-included files matching '.coverage' found anywhere in distribution
    writing manifest file 'src/greenlet.egg-info/SOURCES.txt'
    copying src/greenlet/greenlet.cpp -> build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/greenlet.h -> build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/greenlet_allocator.hpp -> build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/greenlet_compiler_compat.hpp -> build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/greenlet_cpython_compat.hpp -> build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/greenlet_exceptions.hpp -> build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/greenlet_greenlet.hpp -> build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/greenlet_internal.hpp -> build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/greenlet_refs.hpp -> build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/greenlet_slp_switch.hpp -> build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/greenlet_thread_state.hpp -> build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/greenlet_thread_state_dict_cleanup.hpp -> build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/greenlet_thread_support.hpp -> build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/slp_platformselect.h -> build/lib.linux-x86_64-3.6/greenlet
    copying src/greenlet/platform/setup_switch_x64_masm.cmd -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_aarch64_gcc.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_alpha_unix.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_amd64_unix.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_arm32_gcc.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_arm32_ios.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_arm64_masm.asm -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_arm64_masm.obj -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_arm64_msvc.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_csky_gcc.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_m68k_gcc.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_mips_unix.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_ppc64_aix.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_ppc64_linux.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_ppc_aix.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_ppc_linux.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_ppc_macosx.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_ppc_unix.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_riscv_unix.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_s390_unix.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_sparc_sun_gcc.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_x32_unix.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_x64_masm.asm -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_x64_masm.obj -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_x64_msvc.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_x86_msvc.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/platform/switch_x86_unix.h -> build/lib.linux-x86_64-3.6/greenlet/platform
    copying src/greenlet/tests/_test_extension.c -> build/lib.linux-x86_64-3.6/greenlet/tests
    copying src/greenlet/tests/_test_extension_cpp.cpp -> build/lib.linux-x86_64-3.6/greenlet/tests
    running build_ext
    building 'greenlet._greenlet' extension
    creating build/temp.linux-x86_64-3.6
    creating build/temp.linux-x86_64-3.6/src
    creating build/temp.linux-x86_64-3.6/src/greenlet
    x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.6m -c src/greenlet/greenlet.cpp -o build/temp.linux-x86_64-3.6/src/greenlet/greenlet.o
    unable to execute 'x86_64-linux-gnu-gcc': No such file or directory
    error: command 'x86_64-linux-gnu-gcc' failed with exit status 1
    
    ----------------------------------------
Command "/usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-noscv1ag/greenlet/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-_7ohigw7-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-build-noscv1ag/greenlet/
The command '/bin/sh -c pip --no-cache-dir install matplotlib pathlib numpy==1.16.2 scipy==1.2.1 scikit-learn==0.20.2 pandas flask gunicorn gevent==1.4 fmpy==0.3.1' returned a non-zero code: 1
[100%] fail: docker build --tag cdkasset-4afdcd85010adc84f46c5851a7f4baf15718f79c83916fb4c118f2eea5f1024f . exited with error code 1: The command '/bin/sh -c pip --no-cache-dir install matplotlib pathlib numpy==1.16.2 scipy==1.2.1 scikit-learn==0.20.2 pandas flask gunicorn gevent==1.4 fmpy==0.3.1' returned a non-zero code: 1

 ❌ Building assets failed: Error: Building Assets Failed: Error: Failed to build one or more assets. See the error messages above for more information.
    at buildAllStackAssets (/home/ec2-user/.nvm/versions/node/v16.19.0/lib/node_modules/cdk/node_modules/aws-cdk/lib/build.ts:21:11)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
    at CdkToolkit.deploy (/home/ec2-user/.nvm/versions/node/v16.19.0/lib/node_modules/cdk/node_modules/aws-cdk/lib/cdk-toolkit.ts:197:9)
    at initCommandLine (/home/ec2-user/.nvm/versions/node/v16.19.0/lib/node_modules/cdk/node_modules/aws-cdk/lib/cli.ts:374:12)

Building Assets Failed: Error: Failed to build one or more assets. See the error messages above for more information.

Please tell me what I should do.
Thanks.
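The root cause here appears to be that gevent==1.4 pulls in a greenlet version that must compile a C extension, while the ubuntu:18.04 base image in the SageMaker Dockerfile ships no compiler. A hedged sketch of a fix: extend the Dockerfile's existing apt-get step (Step 2/11 in the build log above) with a toolchain such as build-essential. The snippet below only writes the proposed RUN line to a scratch file for review; merge it into the module's actual Dockerfile by hand.

```shell
# Proposed replacement for the Dockerfile's apt-get step, adding
# build-essential so the greenlet C extension can compile.
cat > /tmp/dockerfile_fix.txt <<'EOF'
RUN apt-get -y update && apt-get install -y --no-install-recommends \
        build-essential \
        wget python3-pip python3-matplotlib python3-setuptools \
        nginx cmake ca-certificates \
    && rm -rf /var/lib/apt/lists/*
EOF
cat /tmp/dockerfile_fix.txt
```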

Twin Maker - Deploy an Instance - Error This CDK CLI is not compatible

Hi all,
I tried to follow the guide to set up the workspace with the aws-iot-twinmaker-samples example in my AWS Cloud9 environment, but I'm having trouble running the cdk deploy command. Below is the log:

`
ec2-user:~/environment/aws-iot-twinmaker-samples/src/modules/timestream_telemetry/cdk (main) $ cdk deploy
Sending build context to Docker daemon 3.584kB
Step 1/9 : ARG IMAGE=public.ecr.aws/sam/build-python3.7
....
Step 3/4 : RUN yum -q list installed rsync &>/dev/null || yum install -y rsync
---> Using cache
---> e4568f4cd9fe
Step 4/4 : CMD [ "python" ]
---> Running in de1146631e5f
Removing intermediate container de1146631e5f
---> f0a31f91d9c2
Successfully built f0a31f91d9c2
Successfully tagged cdk-3de8e40c6b0631b1a2e8eed72754e5e4d95c87eefd46f85e40f67fbf1de6eb55:latest
Bundling asset TimestreamTelemetryCdkLambdasStack/timestreamReaderUDQ/Code/Stage...

**This CDK CLI is not compatible with the CDK library used by your application. Please upgrade the CLI to the latest version.

(Cloud assembly schema version mismatch: Maximum schema version supported is 20.0.0, but found 21.0.0)**
`

I also report the versions of the tools involved:

`
Python 3.7.10

node v16.17.0

npm v8.15.0

cdk v2.39.1 (build f188fac)

aws aws-cli/2.7.29 Python/3.9.11 Linux/4.14.290-217.505.amzn2.x86_64 exe/x86_64.amzn.2 prompt/off

`

I also followed this guide -> iot.awsworkshops.com/aws-iot-twinmaker/lab101-twinmaker-cookiefactory/ and got the same error...

Kindly waiting for your feedback.

Thanks and best regards,
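The schema-mismatch message means the globally installed CDK CLI (schema 20.0.0 in v2.39.1) is older than the CDK library the sample app depends on (schema 21.0.0). A hedged sketch of the usual fix: upgrade the global CLI and re-run the deploy. The snippet below only checks the current version and prints the upgrade command rather than installing anything.

```shell
# Check the currently installed CLI version (the mismatch came from v2.39.1).
command -v cdk >/dev/null 2>&1 && cdk --version

# The fix: upgrade the globally installed CLI, then re-run cdk deploy.
CDK_UPGRADE_CMD="npm install -g aws-cdk@latest"
echo "run: $CDK_UPGRADE_CMD"
```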

CDK deploy issues

/home/ec2-user/work/aws-iot-twinmaker-samples/src/modules/timestream_telemetry/cdk/node_modules/@aws-cdk/aws-lambda-python-alpha/lib/bundling.ts:108
IMAGE: runtime.bundlingImage.image,

CDK deploy is failing with this message. Any help?
