Comments (2)
How often will knowledge of pixel-level provenance be required? From my understanding, it must be available, but it wouldn't need to be saved at high resolution on disk.
At the moment the database records indicate which datasets were used to produce a storage unit, from which it's possible to compute per-pixel provenance.
The extra (probably crazy) step would be storing this provenance information in the storage units as a variable, but I don't see any benefit in doing this, since it won't be necessary for general usage, and it increases both storage overhead and code complexity.
from datacube-core.
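The idea above — that per-unit lineage records are enough to reconstruct a per-pixel provenance map on demand — can be sketched roughly as follows. This is an illustrative sketch only, not the datacube-core API: the dataset IDs and the pixel-space footprints are hypothetical, and real footprints would come from dataset geometry reprojected onto the storage unit's grid.

```python
# Illustrative sketch: reconstructing per-pixel provenance from per-unit
# lineage records. All names and footprints here are hypothetical; the
# real datacube-core index stores lineage per dataset, not per pixel.
import numpy as np

# Hypothetical lineage records: dataset id -> pixel-space bounding box
# (row_start, row_stop, col_start, col_stop) within a 4x4 storage unit.
lineage = {
    "ls8_scene_a": (0, 4, 0, 2),  # covers the left half
    "ls8_scene_b": (0, 4, 2, 4),  # covers the right half
}

def per_pixel_provenance(shape, lineage):
    """Return (array of dataset-id indices per pixel, ordered id list)."""
    ids = sorted(lineage)
    prov = np.full(shape, -1, dtype=np.int8)  # -1 = no source dataset
    for idx, ds_id in enumerate(ids):
        r0, r1, c0, c1 = lineage[ds_id]
        prov[r0:r1, c0:c1] = idx
    return prov, ids

prov, ids = per_pixel_provenance((4, 4), lineage)
print(prov)
```

Because the map is cheap to rebuild from the index, it never needs to be persisted alongside the pixels themselves, which is the point being made above.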
Suspect it would start to get very busy quickly. As an example, the seasonal ground cover composites use pretty much every Landsat pixel (with all the masking and shadow and water...) to make a prediction for every date, so the metadata soon blows out. We (JRSRP) currently have an issue raised looking at compression and truncation methods for these higher-level products, so the provenance can be recreated easily, just not necessarily stored within every file.
from datacube-core.
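One plausible shape for the "compression" mentioned above is run-length encoding of a per-pixel provenance row, since source datasets tend to cover large contiguous regions. This is a hedged sketch of the general technique, not whatever method JRSRP's issue actually proposes:

```python
# Sketch of run-length encoding a row of per-pixel provenance indices.
# Contiguous runs of the same source dataset compress to (value, count)
# pairs; this is one generic option, not a datacube-core feature.
def rle_encode(values):
    """Encode a sequence into [value, run_length] pairs."""
    runs = []
    for v in values:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1       # extend the current run
        else:
            runs.append([v, 1])    # start a new run
    return runs

def rle_decode(runs):
    """Expand [value, run_length] pairs back into the original sequence."""
    out = []
    for v, n in runs:
        out.extend([v] * n)
    return out

row = [0, 0, 0, 1, 1, 2, 2, 2, 2]
runs = rle_encode(row)
assert rle_decode(runs) == row
print(runs)  # [[0, 3], [1, 2], [2, 4]]
```

The trade-off is exactly the one raised in the comment: a lossless encoding like this keeps files small when provenance is spatially coherent, while truncation (storing only per-unit lineage) pushes reconstruction cost back onto the index.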
Related Issues (20)
- API Query doctest fails due to different timezone representation
- --with-docker unable to connect to local postgres when running integration tests
- Errors following documentation build instructions
- AttributeError: module 'ee.data' has no attribute '_get_cloud_api_resource'
- issue in find_less_mature when indexing datasets without region_code
- Installation environment issues
- Error running `datacube system init --no-init-users`
- datacube.utils.documents.UnknownMetadataType: Unknown metadata type: 'eo3' while running the command `datacube product add s2_l2a.odc-product.yaml` in opendatacube
- Record intentional omissions from collections
- Compatibility issue - python version and aiohttp
- Unable to Work with Temporary AWS Security Credentials
- lost date time and measurements after ingesting the data using the template s2amsil1c_albers_10.yaml
- ODC feature request: env var for `skip_broken_datasets`?
- list_products ignores information in load hints
- Replace datacube GeoBox with odc-geo GeoBox?
- Use odc-geo GridSpec model internally
- Cannot pass `odc.geo` GeoBoxes to `dc.load`
- Regression with new database connection handling/transaction isolation
- Deprecation Warning on `pkg_resources` in 1.9 branch
- "NOT" operator for ODC queries