
dominode's Introduction

dominode

Development of the Dominica SDI

dominode's People

Contributors

gubuntu, ingenieroariel, lucernae, meomancer, nyakudyaa, ricardogsilva, samweli, waybarrios, zacharlie


dominode's Issues

implement analytics

'analytics' is mentioned in the section 5 Deliverables table; enquire what this actually means.

Load all vector layers used in the topo maps into PostGIS

All layers used in the production of the topo maps are a priority to make available in PostGIS, in order to support producing the topo maps in QGIS.

The layers may also be published in GeoServer (to provide web service endpoints for general users).
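A common way to load a vector layer into PostGIS is GDAL's `ogr2ogr`. Below is a minimal sketch that only builds the command line; the connection details, schema, and layer names are placeholder assumptions, not the project's actual configuration:

```python
# Sketch: build an ogr2ogr command for loading a shapefile into PostGIS.
# Connection parameters, schema and table names are assumptions.

def ogr2ogr_load_command(shapefile, table, schema="staging",
                         dbname="dominode", host="localhost", user="dominode"):
    """Return the ogr2ogr argument list for loading `shapefile` into PostGIS."""
    conn = f"PG:host={host} dbname={dbname} user={user}"
    return [
        "ogr2ogr",
        "-f", "PostgreSQL", conn,
        shapefile,
        "-nln", f"{schema}.{table}",   # target schema-qualified table name
        "-lco", "GEOMETRY_NAME=geom",  # consistent geometry column name
        "-nlt", "PROMOTE_TO_MULTI",    # avoid mixed single/multi geometry errors
    ]

cmd = ogr2ogr_load_command("rivers.shp", "rivers")
# Run with e.g. subprocess.run(cmd, check=True) once the DB is reachable.
```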

Fill in the data catalogue spreadsheet

The spreadsheet in

https://docs.google.com/spreadsheets/d/1uiuey2w4IvCNtNK0rV2Z3e5IOk0TsbGrkQit9Cdic0o/edit#gid=0

contains a Catalog sheet with an initial list of most of the datasets identified during previous stages of the project.

We need to fill in the blanks in the spreadsheet in order to

  1. Get a description of the existing datasets and their respective custodians
  2. Track the progress of data upload, validation and publishing

This sheet shall contain the following columns - be sure to create any columns listed here that may not be in the spreadsheet yet:

  • Columns related to data description and owners

    • OGC Topic Category - A value coming from the fixed ISO 19115 topic categories, which are listed in the first column of the Lookups sheet
    • Dataset - Name of the dataset
    • Dataset custodian department - Department that will be responsible for owning the data
    • Dataset custodian person - Person that will be responsible for owning the data
    • Metadata custodian department - Department that will be responsible for maintaining the metadata of the dataset
    • Metadata custodian person - Person that will be responsible for maintaining the metadata of the dataset
    • Style custodian department - Department that will be responsible for maintaining the style for the dataset
    • Style custodian person - Person that will be responsible for maintaining the style for the dataset
    • name of dataset on old DomiNode - This means that we most likely already have the dataset (as we have a dump of old DomiNode's contents)
    • location of authoritative file - Different departments may hold different copies of a dataset; we need to identify which one is canonical
    • additional details
    • dataset is to be freely accessible to the general public (YES/NO)
    • dataset type - NATIONAL BASEMAP/CORE LAYER/PROJECT LAYER - Read more about this in the DomiNode Data Assessment SOP
  • Columns related to data ingestion workflow status - These are mostly Yes/No columns

    • dataset has been uploaded to staging DB or filesystem
    • dataset has been validated
    • dataset has a metadata record
    • metadata record has been validated
    • dataset has a style
    • dataset style has been validated
    • dataset has been promoted to the published DB schema or filesystem dir
    • dataset should be published via OGC webservices
    • dataset has been published via OGC webservices
    • dataset should be published via GeoNode
    • dataset has been published via GeoNode

Interact with the various project stakeholders in order to gather the information specified above.

Start with the datasets owned by the LSD and PPD departments, as these are the biggest stakeholders.
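Once the spreadsheet is exported as CSV, the Yes/No workflow-status columns listed above can be summarised programmatically. A minimal sketch; the column names and sample rows here are assumptions matching the list above, not the real spreadsheet contents:

```python
import csv
import io

# Hypothetical two-row export; real column names must match the spreadsheet.
CSV_TEXT = """Dataset,dataset has been uploaded to staging DB or filesystem,dataset has been validated
Roads,YES,NO
Rivers,YES,YES
"""

def pending_validation(csv_text):
    """Return dataset names that are uploaded but not yet validated."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [
        r["Dataset"]
        for r in rows
        if r["dataset has been uploaded to staging DB or filesystem"].upper() == "YES"
        and r["dataset has been validated"].upper() != "YES"
    ]

print(pending_validation(CSV_TEXT))  # -> ['Roads']
```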

Write production deployment guide

Production deployment is to be documented with step-by-step directions.

This document shall then be used to perform the actual deployment.

Complete data permission matrix

The data permission matrix should identify

  • Main stakeholder agencies
  • Specific people inside each agency that will serve designated roles (data custodian, etc)
  • Specific people inside each agency that will be given privileged access

It should mention

  • Required capabilities for each database (data, styles, metadata, GeoNode), each web service (WFS, etc.), and the MinIO storage
  • Required capabilities for data publishing

Access to the matrix:

https://docs.google.com/spreadsheets/d/1uiuey2w4IvCNtNK0rV2Z3e5IOk0TsbGrkQit9Cdic0o/edit#gid=656035837

coordinate data loading

Start now with the staging system and continue through to the production release.

Assign to Luisa.

implement data request form

in 3.5.1: The firm will also assist in developing a data request process along with a data request form hosted on the DomiNode.

more detail from Collin email 8 June:

The PCU will follow up with Lands and Surveys, ICTU as well as Physical Planning with regards to a form. A call with the Firm and those agencies to sketch out the content of this form may be useful. They will need to define if they want one form for all requests or different forms for different parties, i.e. commercial vs research requests. Parties will also need to define the process for acceptance and processing of these requests.

Compile DomiNode five year maintenance budget

https://docs.google.com/spreadsheets/d/1a52ZJAwjK5hI-BNBaTTPSh06uc0b0NPG2eEfvDIBXz8/edit#gid=0

3.5.7 Budgeting and Recurring Costs

The firm will assist the GoCD in budgeting for the maintenance of the NSDI, and particularly the SDMP and related components. Minimizing recurring costs to a level which can be supported by the budgets of participating agencies is critical. Cloud services used under this project, for instance, must have a long term recurring cost which is possible with the budgets of participating agencies. Yearly budgets for maintenance of all hardware and software required for maintaining the processes and systems created under this project, or deemed vital for the NSDI by the GoCD, will be developed by the firm for a period of 5 years from the start of this contract.

further clarification from Collin email 8 June:

A five-year budget will be a rough estimate at best given the pace of technological change. If the Firm gives a budget before the end of your engagement, it should be fine as long as it identifies the factors which could substantially increase or lower the quoted amounts.

switch from staging to production

  • prep orchestration to remove Nucs from staging
  • ship Nucs to Dominica
  • install Nucs in Dominica data centre
  • update orchestration to final staging and production nodes and clusters
  • test everything
  • point dominode.dm to production instance

Skin GeoNode

Use elements from the existing dominode.dm site to skin the new GeoNode in the current best-practice way, using the same logo and colour scheme.

We just want a basic skin for now, not too much effort.

Develop cloud processing tutorial

3.5.6 Introduction of Cloud Based Analysis and Processing Tools

The firm will work with the LSD and PPD to determine possible cloud based tools, such as Google Earth Engine and cloud hosted Jupyter notebook, which they can use in their workflows. Other possible tools include cloud based drone imagery processing for LSD.

Check if geonetwork is a viable choice for integrating with geonode

We want to provide users with a nice interface for performing CRUD operations on metadata. geonetwork seems like a good fit for this job.

However, we need to investigate the current state of the integration between GeoNode and GeoNetwork:

  • Is it still possible to use a geonetwork catalog as a backend to geonode?
  • Is it possible to make this geonetwork catalog be read-write or read-only from the geonode side?

Deploy stack to production environment

After #33 is done, or while it is being finalized, we need to deploy the tech stack into the production environment.

  • Rancher and Kubernetes cluster configuration

  • MinIO

    • Set up a mount for raster data. It should be properly structured to offer dedicated areas for each stakeholder, including a production and a staging area for each, as specified in #25
    • Set up a mount for the GeoServer data dir
    • Set up a mount for the Django media dir (to store uploaded user profile pictures)
  • Install the PostGIS data DB

    • Configure appropriate schemas for each stakeholder, including those for production and staging as specified in #26
  • Install and configure GeoServer

  • Install and configure GeoNetwork - this will also require a DB for storing metadata records

  • Install and configure GeoNode

    • Also needs a DB
    • Set up GeoNetwork as the metadata catalogue
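The per-stakeholder schema layout (a staging and a production area per department) can be generated mechanically. A sketch that only emits the DDL; the department codes and `<dept>_editor` role names are assumptions for illustration:

```python
def schema_ddl(departments=("lsd", "ppd")):
    """Generate CREATE SCHEMA / GRANT statements for staging and production
    areas per department. Role names (<dept>_editor) are assumptions."""
    statements = []
    for dept in departments:
        for area in ("staging", "production"):
            statements.append(f"CREATE SCHEMA IF NOT EXISTS {dept}_{area};")
        # Editors write to staging only; publishing is a separate promotion step.
        statements.append(f"GRANT ALL ON SCHEMA {dept}_staging TO {dept}_editor;")
        statements.append(f"GRANT USAGE ON SCHEMA {dept}_production TO {dept}_editor;")
    return statements

for statement in schema_ddl():
    print(statement)
```

The generated statements could then be reviewed and applied with `psql` against the data DB.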

Write data management workflows document

Write a document describing how data management workflows should be done.

This document shall cover workflows for both initial dataset ingestion and subsequent modifications, describing the following:

  • Staging area -> process -> validate -> production area
  • Creation and validation of styles
  • Creation and validation of metadata records
  • Publishing on GeoServer and GeoNode

The document shall be created here:

https://docs.google.com/document/d/1rcoH0c1RpCguuPFLe92V-2xdipJU9ODwItOS_XPe-GM/edit
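For PostGIS-backed layers, the "staging area -> process -> validate -> production area" step above amounts to moving a validated table between schemas. A hedged sketch of the SQL one might run; the schema names are assumptions, not the project's actual layout:

```python
def promote_table_sql(table, src="lsd_staging", dst="lsd_production"):
    """Return SQL statements moving a validated table from a staging schema
    to the matching production schema. Schema names are assumptions."""
    return [
        # Replace any previously published copy of the table first.
        f"DROP TABLE IF EXISTS {dst}.{table};",
        f"ALTER TABLE {src}.{table} SET SCHEMA {dst};",
    ]

for statement in promote_table_sql("rivers"):
    print(statement)
```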

Develop CORS app

The GoCD currently operates four (4) high accuracy GNSS base-stations, which report observations in RINEX, Leica, and Trimble formats. These observations are useful for the post-processing of GNSS observations by surveyors and other data collection personnel. It is the intention of the GoCD to consolidate the collection of these data and streamline access.

The firm will work with the GoCD to consolidate all CORS data on a single server and develop web-based, read-only access to this server (as detailed in Task 3.2.2 of these Terms of Reference).

To streamline the use of these data, the firm will develop an app to extend the DomiNode interface to allow for easy access to the CORS datasets. The app shall:

  • Provide an interactive web-map based view of station locations (as points) and status (online/offline/under repair). Additional details such as station names, coordinates, owner and other details deemed necessary by the client may be included in a popup or sidebar for each point on the map.
  • Provide the ability to select a site, then as a sidebar or popup interface provide an interface for the user to select a date range of observations to download.
  • Provide the ability for DomiNode administrators to control user access to the app. Public access will be restricted, and downloads will require a user account granted download rights by administrators.
  • Allow the download of data as a .zip file.
  • Implement basic error handling within the app, including messages on the interface when the selected date range is invalid or data is unavailable.
  • Provide warnings to the user in case they have selected a large download (over 50 MB).
  • Provide a disclaimer before download and terms of use for user acceptance. The text of this disclaimer will be provided by the LSD.
  • Require confirmation that the user downloading is human (CAPTCHA or similar).
  • Keep human-readable logs of access to the app, along with user logons and downloads.
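The date-range validation and large-download warning from the list above can be prototyped independently of the web layer. A minimal sketch; the 50 MB threshold comes from the requirement, while the function shape and messages are assumptions:

```python
from datetime import date

LARGE_DOWNLOAD_BYTES = 50 * 1024 * 1024  # warn above 50 MB, per the requirements

def validate_request(start, end, estimated_size_bytes, today=None):
    """Return (ok, message) for a requested CORS observation download."""
    today = today or date.today()
    if start > end:
        return False, "Invalid date range: start date is after end date."
    if end > today:
        return False, "Invalid date range: end date is in the future."
    if estimated_size_bytes > LARGE_DOWNLOAD_BYTES:
        return True, "Warning: selected download exceeds 50 MB."
    return True, "OK"

print(validate_request(date(2020, 1, 1), date(2020, 1, 31), 10_000_000))
# -> (True, 'OK')
```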

Bug notes from Charles

Taken from slack message:

I pointed out the following to Gavin in respect of the Dominode:

  • the Coat of Arms must be at the top of the page
  • the link to the homepage says Geonode instead of Dominode
  • Layer thumbnails do not load
  • Legend thumbnails do not load
  • my credentials don't get passed on to Geoserver from GeoNode (after signing in)
  • clicking "View on Site" on Django User Details page sends user to an invalid page (URL is incomplete)

implement mobile data collection solution

https://docs.google.com/document/d/1y9YGe7o1r286ZWtP1Hxy2RmG-228v_2n0XlNHmqBAHU/edit#heading=h.wc70n4w8x69s

not sure we have to implement it - not mentioned in contract

clarification from Collin email 8 June:

When the TOR was initially written, we felt that the implementation of a field surveying application would be out of scope given the available budget. So only a plan, which the government could act upon if funding became available from a different source, was requested. If doing more is within scope and the line agencies find this useful then the PCU would go ahead with a full implementation.
