ArcGIS Online Data Admin

This project is an evolving collection of scripts used to manage public-facing datasets for demonstration purposes. The file/directory structure is designed to be modular and thus reusable.

Environment

This project is authored in Python 3.8.5, managed by Conda, and requires the arcgis package (the ArcGIS API for Python).

Setup

Clone this repo and cd into this directory (arcgis-online-data-admin).

Conda Virtual Environment

Prereq: Install Conda

It is recommended that you use conda to create and manage the virtual environment.

Create

Create a Python 3 virtual environment and install project dependencies:

conda env create -f agol-data-admin.yml

Activate

Then, activate the Python 3 virtual environment.

conda activate agol-data-admin

To deactivate the virtual environment, run conda deactivate.

Contribute

If your contribution requires a new Python dependency, please continue to use conda.

Consult the cheat sheet for quick conda onboarding.

conda install [-c <channel>] <dependency>

Be sure to export the updated environment so that all contributors share the same dependencies.

conda env export -c esri --from-history > agol-data-admin.yml

Other contributors can then update their environment:

conda env update -f agol-data-admin.yml

AGOL Credentials

Some aspects of this project (will) require AGOL credentials. The ArcGIS API for Python treats the supplied username and password as case-sensitive. There are two avenues through which you can supply your AGOL username and password.

1. Config File -> Environment

Some programs ask for credentials via a yml configuration file. If credentials are not supplied via the configuration file, the program falls back to reading credentials from the environment. If the program cannot find credentials in either location, an exception is raised. For programs that follow this paradigm, the supplied yml configuration file must define a portal attribute at the top level:

portal:
  # url is required
  url: https://www.arcgis.com
  # username is optional, omit this attribute to fallback to the environment
  username: your-username
  # password is optional, omit this attribute to fallback to the environment
  password: your-password
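
Below is a minimal sketch of how this config-file-then-environment fallback could be implemented, assuming PyYAML is available; the helper name and error handling are illustrative, not the project's actual code.

import os

import yaml  # PyYAML


def resolve_portal_credentials(config_path):
    """Hypothetical helper: config file first, then environment, else error."""
    with open(config_path) as f:
        portal = yaml.safe_load(f).get("portal", {})

    url = portal["url"]  # url is required
    # Prefer the configuration file, then fall back to the environment.
    username = portal.get("username") or os.environ.get("AFD_PORTAL_USERNAME")
    password = portal.get("password") or os.environ.get("AFD_PORTAL_PASSWORD")

    if not (username and password):
        raise ValueError("AGOL credentials not found in config file or environment")
    return url, username, password
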
2. Arguments -> Environment

Some programs ask for credentials via command line arguments. If credentials are not supplied on the command line, the program falls back to reading credentials from the environment. If the program cannot find credentials in either location, an exception is raised. For programs that follow this paradigm, arguments can be supplied using the flags below (a sketch of the fallback follows the list):

  • -u --username
  • -p --password
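
A minimal sketch of the arguments-then-environment fallback using argparse; reading the defaults from the environment is one reasonable implementation, not necessarily what the project's scripts do.

import argparse
import os

# Hypothetical parser: command line arguments win, environment fills the gaps.
parser = argparse.ArgumentParser()
parser.add_argument("-u", "--username", default=os.environ.get("AFD_PORTAL_USERNAME"))
parser.add_argument("-p", "--password", default=os.environ.get("AFD_PORTAL_PASSWORD"))
args = parser.parse_args()

if not (args.username and args.password):
    raise SystemExit("AGOL credentials not found in arguments or environment")
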
Environment Credentials

You can specify your AGOL credentials via environment variables. The variable names are:

  • AFD_PORTAL_USERNAME
  • AFD_PORTAL_PASSWORD

It's recommended that you set credentials in your environment for one-and-done credential management.

Logger

Some scripts use a logger. The logger outputs script logs to the console and, optionally, to a file path. The script log levels, ordered from most to least verbose, follow this pattern:

DEBUG > INFO > WARNING > ERROR > CRITICAL

These scripts supply two optional command line arguments:

Verbose

-v --verbose

Print DEBUG-level script logs to the console. The default level is INFO.

Output

-o --output

Write DEBUG-level script logs to a file path.

Run tail -f output.log to follow the output in a separate terminal session.
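
A minimal sketch of a logger wired up this way with Python's standard logging module; the logger name and handler details are assumptions, not the project's actual implementation.

import logging


def configure_logger(verbose=False, output=None):
    """Hypothetical setup: console at INFO (DEBUG with -v), optional DEBUG-level file via -o."""
    logger = logging.getLogger("agol-data-admin")
    logger.setLevel(logging.DEBUG)

    console = logging.StreamHandler()
    console.setLevel(logging.DEBUG if verbose else logging.INFO)
    logger.addHandler(console)

    if output:
        file_handler = logging.FileHandler(output)
        file_handler.setLevel(logging.DEBUG)
        logger.addHandler(file_handler)

    return logger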

Dev

-d --dev

Some scripts support a 'DEV' environment, which allows you to pass a limit on how many records the script processes, reducing the run time of potentially long-running scripts.
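
A tiny sketch of how such a limit might be applied; the helper and the record source are purely illustrative.

def maybe_limit(records, dev_limit=None):
    """Hypothetical helper: truncate the record set when a DEV limit is given."""
    return records if dev_limit is None else records[:dev_limit]

# Example: only the first 100 of 100,000 records are processed in DEV mode.
limited = maybe_limit(list(range(100_000)), dev_limit=100)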

Scripts

transmute

The transmute script transfers data stored in a source feature layer to a destination feature layer. This is particularly useful for performing nightly data scrubbing tasks.

The script treats the source data as the source of truth and edits the destination feature layer based on what the source contains. Any edits made to the source layer will be captured by the script and applied to the destination layer the next time the script runs.

The script is designed to transmute a single feature layer, not an entire service; to transmute an entire service, run the script repeatedly, layer by layer.
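
A rough sketch of the kind of workflow this implies, using the ArcGIS API for Python; the item IDs are placeholders and the edit logic is simplified (it only illustrates adds), so it is not the script's actual implementation.

from arcgis.gis import GIS

# Hypothetical, simplified transmute pass: treat the source layer as the
# source of truth and push its features to the destination layer.
gis = GIS("https://www.arcgis.com", "your-username", "your-password")

source = gis.content.get("SOURCE_ITEM_ID").layers[0]
destination = gis.content.get("DEST_ITEM_ID").layers[0]

source_features = source.query(where="1=1").features

# A real run would diff against the destination (via the reference-id-key)
# and issue adds/updates/deletes accordingly; here we only illustrate adds.
result = destination.edit_features(adds=source_features)
print(result)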

The script takes a configuration file formatted as follows:

portal:
  url: https://www.arcgis.com
source:
  feature-service-item-id: "ITEM_ID"
  layer-index: 0
destination:
  feature-service-item-id: "ITEM_ID"
  layer-index: 0
  reference-id-key: "ATTRIBUTE_NAME"

The schemas for the source and destination datasets must be compatible, based on this simple relationship:

destination.fields = source.fields + reference-id-key

The reference-id-key is a required field on the destination layer and is used to map each feature in the destination layer to its source counterpart. The reference ID key must be an Integer attribute field and can be named whatever you'd like.
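
A hedged sketch of how that relationship could be verified with the ArcGIS API for Python; the item IDs and the reference-id-key value are placeholders carried over from the configuration above.

from arcgis.gis import GIS

# Hypothetical check of: destination.fields = source.fields + reference-id-key
gis = GIS("https://www.arcgis.com", "your-username", "your-password")

source = gis.content.get("SOURCE_ITEM_ID").layers[0]
destination = gis.content.get("DEST_ITEM_ID").layers[0]

source_fields = {f.name for f in source.properties.fields}
destination_fields = {f.name for f in destination.properties.fields}

reference_id_key = "ATTRIBUTE_NAME"  # placeholder, as in the configuration above
assert destination_fields == source_fields | {reference_id_key}, (
    "Destination schema must equal the source schema plus the reference-id-key field"
)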

You can test this script using the testing-dataset scheme.

python3 transmute.py -c ./schemes/testing-dataset.yml

Note: the datasets are hosted in the arcgisruntime org.

Setting up your own transmutable services

Follow these steps to set up your own transmutable services.

  1. Create a feature service in AGOL by uploading a Shapefile.
  2. Create a second feature service in AGOL by uploading the same Shapefile.
  3. You are welcome to drop some fields but be sure to drop them on both feature layers.
  4. On the destination layer, add an Integer field for referencing the source datum.
  5. Enable the data collection setting on the destination layer.
  6. Set sharing on the destination layer to public.
  7. Set sharing on the source layer to private or organization.
  8. Enable editing for both feature services.
  9. Run the script.
