
aiforearthdatasets's Introduction

AI for Earth Data Sets

The Microsoft AI for Earth program hosts geospatial data on Azure that is important to environmental sustainability and Earth science. This repo hosts documentation and demonstration notebooks for all the data that is managed by AI for Earth. It also serves as a "staging ground" for the Planetary Computer Data Catalog.

If you have feedback about any of this data, or want to request additions to our data program, email [email protected].

Table of contents

Data sets

ALOS World 3D

Global topographic information from the JAXA ALOS PRISM instrument.

ASTER L1T (2000-2006)

The ASTER instrument, launched on-board NASA's Terra satellite in 1999, provides multispectral images of the Earth at 15m-90m resolution. This data set represents ASTER data from 2000-2006.

Copernicus DEM

Global topographic information from the Copernicus program.

Daymet

Estimates of daily weather parameters in North America on a one-kilometer grid, with monthly and annual summaries.

Deltares Global Flood Maps

Global estimates of coastal inundation under various sea level rise conditions and return periods at 90m, 1km, and 5km resolutions. Also includes estimated coastal inundation caused by named historical storm events going back several decades.

Deltares Global Water Availability

Simulations of historical daily reservoir variations for 3,236 locations across the globe for the period 1970-2020 using the distributed wflow_sbm model. The model outputs long-term daily information on reservoir volume, inflow and outflow dynamics, as well as information on upstream hydrological forcing.

Esri 10m Land Cover

Global estimates of 10-class land use/land cover (LULC) for 2020, derived from ESA Sentinel-2 imagery at 10m resolution, produced by Impact Observatory.

Global Biodiversity Information Facility (GBIF)

Exports of global species occurrence data from the GBIF network.

Harmonized Global Biomass

Global maps of aboveground and belowground biomass carbon density for the year 2010 at 300m resolution.

Harmonized Landsat Sentinel-2

Satellite imagery from the Landsat 8 and Sentinel-2 satellites, aligned to a common grid and processed to compatible color spaces.

High Resolution Electricity Access (HREA)

Settlement-level measures of electricity access, reliability, and usage derived from VIIRS satellite imagery.

High Resolution Ocean Surface Wave Hindcast

Long-term wave hindcast data for the U.S. Exclusive Economic Zone (EEZ), developed by the U.S. Department of Energy's Water Power Technologies Office.

Labeled Information Library of Alexandria: Biology and Conservation (LILA BC)

AI for Earth and partners have assembled a repository of labeled information related to wildlife conservation, particularly wildlife imagery.

Landsat TM/MSS Collection 2

Global optical imagery from the Landsat MSS and TM instruments, which imaged the Earth from 1972 to 2013, aboard the Landsat 1-5 satellites.

Landsat TM/MSS data are in preview; access is granted by request.

Landsat 7 Collection 2 Level-2

Global optical imagery from the Landsat 7 satellite, which has imaged the Earth since 1999.

Landsat 7 data are in preview; access is granted by request.

Landsat 8 Collection 2 Level-2

Global optical imagery from the Landsat 8 satellite, which has imaged the Earth since 2013.

MODIS (40 individual products)

Satellite imagery from the Moderate Resolution Imaging Spectroradiometer (MODIS).

Monitoring Trends in Burn Severity Mosaics

Annual burn severity mosaics for the continental United States and Alaska.

National Solar Radiation Database

Hourly and half-hourly values of the three most common measurements of solar radiation (global horizontal, direct normal, and diffuse horizontal irradiance), along with meteorological data.

NASADEM

Global topographic information from the NASADEM program.

NREL Puerto Rico 100 Dataset (PR100)

A collection of geospatial data useful for renewable energy development in Puerto Rico, curated by the National Renewable Energy Laboratory.

NREL PV Rooftop Database

A lidar-derived, geospatially-resolved dataset of suitable roof surfaces and their PV technical potential for 128 metropolitan regions in the United States.

NOAA Climate Data Records (CDR)

Historical global climate information.

NOAA Climate Forecast System (CFS)

Model output data from the NOAA NCEP Climate Forecast System Version 2.

NOAA Digital Coast Imagery

High-resolution (1 meter or less) imagery collected by a number of sources and contributed to the NOAA Digital Coast.

NOAA GFS Warm Start Initial Conditions

Warm start initial conditions for the NOAA Global Forecast System.

NOAA GOES-R

Weather imagery from the GOES-16, GOES-17, and GOES-18 satellites.

NOAA Global Ensemble Forecast System (GEFS)

Model output data from the NOAA Global Ensemble Forecast System.

NOAA Global Forecast System (GFS)

Model output data from the NOAA Global Forecast System.

NOAA Global Hydro Estimator (GHE)

Global rainfall estimates in 15-minute intervals.

NOAA High-Resolution Rapid Refresh (HRRR)

Weather forecasts for North America at 3km spatial resolution and 15-minute temporal resolution.

NOAA Integrated Surface Data (ISD)

Historical global weather information.

NOAA Monthly US Climate Gridded Dataset (NClimGrid)

Gridded climate data for the US from 1895 to the present.

NOAA National Water Model

Data from the National Water Model.

NOAA Rapid Refresh (RAP)

Weather forecasts for North America at 13km resolution.

NOAA US Climate Normals

Typical climate conditions for the United States from 1981 to the present.

National Agriculture Imagery Program

NAIP provides US-wide, high-resolution aerial imagery. This data set includes NAIP images from 2010 to the present.

National Land Cover Database

US-wide data on land cover and land cover change at a 30m resolution with a 16-class legend.

NatureServe Map of Biodiversity Importance (MoBI)

Habitat information for 2,216 imperiled species occurring in the conterminous United States.

Ocean Observatories Initiative CamHD

Video data from the Ocean Observatories Initiative seafloor camera deployed at Axial Volcano on the Juan de Fuca Ridge.

Sentinel-1 GRD

Global synthetic aperture radar (SAR) data from 2017-present, projected to ground range.

Sentinel-1 GRD data are in preview; access is granted by request.

Sentinel-1 SLC

Global synthetic aperture radar (SAR) data for the last 90 days.

Sentinel-1 SLC data are in preview; access is granted by request.

Sentinel-2 L2A

Global optical imagery at 10m resolution from 2016-present.

Sentinel-3 L2

Global multispectral imagery at 300m resolution, with a revisit rate of less than two days, from 2016-present.

Sentinel-3 data are in preview; access is granted by request.

Sentinel-5P

Global atmospheric data from 2018-present.

Sentinel-5P data are in preview; access is granted by request.

TerraClimate

Monthly climate and climatic water balance for global terrestrial surfaces from 1958-2019.

UK Met Office CSSP China 20CRDS

Historical climate data for China, from 1851-2010.

UK Met Office Global Weather Data for COVID-19 Analysis

Data for COVID-19 researchers exploring relationships between COVID-19 and environmental factors.

University of Miami Coupled Model for Hurricanes Ike and Sandy

Modeled wind, wave, and current data for Hurricanes Ike and Sandy, produced by the National Renewable Energy Laboratory.

USFS Forest Inventory and Analysis

Status and trends on U.S. forest location, health, growth, mortality, and production, from the US Forest Service's Forest Inventory and Analysis (FIA) program.

USGS 3DEP Seamless DEMs

U.S. topographic information from the USGS 3D Elevation Program (3DEP).

USGS Gap Land Cover

U.S. land cover data from the USGS Gap Analysis Project.

Legal stuff

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.

aiforearthdatasets's People

Contributors

agentmorris, brunosan, calebrob6, lossyrob, manongros, mattblissett, microsoft-github-operations[bot], microsoftopensource, mmcfarland, smartlixx, tomaugspurger


aiforearthdatasets's Issues

Adding requirements in term of dependencies

Hi AIforEarthDataSets amazing team!

THANK YOU so much for this amazing work! As we (through usegalaxy.eu / Galaxy for Ecology / Galaxy for Climate / BiodiFAIRse GO FAIR Implementation Network / PNDB French Biodiversity e-infrastructure / French Challenge IA-Biodiv) want to capitalize on this work and facilitate the reuse of these notebooks, it appears to me that adding information about requirements in terms of dependencies, such as R packages or Python libraries, for each notebook would be of great interest and would raise the FAIRness of each. A preliminary idea would be to add a requirements.txt file to each notebook, like this example https://github.com/galaxyecology/webhook_SPIPOLL_Flash/blob/master/requirements.txt, or better yet an environment.yml listing conda packages, like this example https://github.com/IFB-ElixirFr/ReproHackathon/blob/master/reprohackathon1/galaxy/environment.yml, for example gbif_environment.yml for the gbif notebook.

If you think this can be relevant, please don't hesitate to comment or propose a way to do so.
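To make the suggestion concrete, a minimal environment.yml sketch for, e.g., the GBIF notebook could look like this (the package list is illustrative, not taken from the notebook itself):

```yaml
name: gbif-notebook
channels:
  - conda-forge
dependencies:
  - python=3.10
  - pandas              # tabular handling of occurrence records
  - pyarrow             # reading the Parquet export
  - azure-storage-blob  # listing and downloading blobs from Azure
```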

Wishing you a very good end of the week,

Best,

Yvan

Question: Data projection transform

Hello,
I am getting familiar with the MSFT Planetary Computer datasets, specifically the HRRR forecasts. I have been reviewing the example notebook provided and I am confused by the projection transformations given. Specifically, I am confused by the need to provide the Plate Carree transform on the temperature plot, given that the dataset (ds.t) is in the Lambert Conformal projection, which is the same as the GeoAxes object for the plot. My understanding of the Cartopy library is that the transform parameter should indicate the data projection, and if it is left out, the GeoAxes projection is assumed to apply to the data, which should be the case for this example. However, if the Plate Carree transform is omitted, the data does not get mapped correctly (at all). I tried emailing the NOAA team with this question, but their email server rejected my message. Thanks for any information you can provide to help me understand this dataset better.

Impossible to retrieve goes-r lst data due to faulty documentation

Microsoft planetary computer documentation about getting goes lst data is insufficient to get that data.

Let me explain, here is an example of a goes image url:

https://goeseuwest.blob.core.windows.net/noaa-goes16/ABI-L2-LSTC/2023/008/12/OR_ABI-L2-LSTC-M6_G16_s20230081201170_e20230081203543_c20230081205077.nc

Notice that the directory name is fully predictable, while the file name is not, because it contains processing timestamps that vary by a few milliseconds from one GOES image to another. GCP and AWS deal with this by exposing bucket listings, so the user can list image paths under a specific prefix. Another solution would be to provide a data catalog, either a STAC catalog or a Parquet file, as you are already doing for other GOES products.

Or am I missing something?
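In the meantime, one workaround is to build the predictable directory prefix, list the files under it with any blob-listing client, and parse the timestamps out of the file names. A sketch of the parsing step, assuming the standard GOES-R ABI naming convention (`OR_<product>-M<mode>_G<satellite>_s<start>_e<end>_c<created>.nc`, with timestamps encoded as year, day-of-year, hour, minute, second, tenth-of-second):

```python
import re
from datetime import datetime, timedelta

# Matches ABI product file names like the example above; GLM and other
# products use slightly different prefixes and are not handled here.
GOES_NAME = re.compile(
    r"OR_(?P<product>ABI-L\d-\w+)-M(?P<mode>\d+)_G(?P<sat>\d+)"
    r"_s(?P<start>\d{14})_e(?P<end>\d{14})_c(?P<created>\d{14})\.nc"
)

def parse_goes_time(ts: str) -> datetime:
    """Decode sYYYYDDDHHMMSSt-style timestamps (t = tenths of a second)."""
    base = datetime.strptime(ts[:13], "%Y%j%H%M%S")
    return base + timedelta(milliseconds=100 * int(ts[13]))

name = "OR_ABI-L2-LSTC-M6_G16_s20230081201170_e20230081203543_c20230081205077.nc"
m = GOES_NAME.match(name)
start = parse_goes_time(m["start"])
print(m["product"], m["sat"], start)  # ABI-L2-LSTC 16 2023-01-08 12:01:17
```

The parsed start times can then be used to pick the scene of interest from a listing of the fixed prefix (here `ABI-L2-LSTC/2023/008/12/`).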

Add time-series query examples to noaa-nwm-example data notebook

The NWM notebook currently displays interaction examples with single files from the NOAA National Water Model output data.

Many uses of the NWM data require collecting a series of multiple outputs and assembling them into a time series either for one or many points. This aggregation and cross-querying can be accomplished in a number of different ways: e.g., MultiZarr, Kerchunk, concatenated xarray datasets, direct netcdf library access, etc.

Including several examples in this notebook, along with performance statistics to show the relative advantages, especially if there is a particular advantage that can be obtained by using the data within the Azure platform specifically, would be powerful.

(I'll work on some prototypes and see if I can issue a PR...)
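As a starting point for the time-series case, the list of files to aggregate can be generated from the forecast cycle. The path pattern below is an assumption based on the usual NWM short-range naming and should be checked against the actual container layout:

```python
from datetime import datetime

def nwm_short_range_paths(cycle: datetime, n_hours: int) -> list:
    """Hypothetical blob names for one short-range channel_rt forecast cycle."""
    day = cycle.strftime("nwm.%Y%m%d")
    return [
        f"{day}/short_range/nwm.t{cycle.hour:02d}z.short_range.channel_rt.f{h:03d}.conus.nc"
        for h in range(1, n_hours + 1)
    ]

paths = nwm_short_range_paths(datetime(2023, 1, 8, 12), 3)
print(paths[0])  # nwm.20230108/short_range/nwm.t12z.short_range.channel_rt.f001.conus.nc
```

Each file could then be opened with a netCDF reader and the variable of interest for the chosen feature IDs concatenated along time; Kerchunk/MultiZarr-style indexes avoid opening the files one by one at all.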

Invalid base64-encoded string

First of all, fantastic work from the whole team.

I'm having an error when trying to reproduce the code at this point:

generator = container_client.list_blobs(name_starts_with=azure_scene_prefix)
scene_files = list(generator)

I tried several products and got the same error. Can you please help me solve this issue?

Thanks in advance.

Detailed error:

---------------------------------------------------------------------------
Error Traceback (most recent call last)
/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/storage/blob/_shared/authentication.py in _add_authorization_header(self, request, string_to_sign)
117 try:
--> 118 signature = sign_string(self.account_key, string_to_sign)
119 auth_string = 'SharedKey ' + self.account_name + ':' + signature

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/storage/blob/_shared/__init__.py in sign_string(key, string_to_sign, key_is_base64)
46 if key_is_base64:
---> 47 key = decode_base64_to_bytes(key)
48 else:

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/storage/blob/_shared/__init__.py in decode_base64_to_bytes(data)
36 data = data.encode('utf-8')
---> 37 return base64.b64decode(data)
38

/srv/conda/envs/notebook/lib/python3.8/base64.py in b64decode(s, altchars, validate)
86 raise binascii.Error('Non-base64 digit found')
---> 87 return binascii.a2b_base64(s)
88

Error: Invalid base64-encoded string: number of data characters (37) cannot be 1 more than a multiple of 4

During handling of the above exception, another exception occurred:

AzureSigningError Traceback (most recent call last)
/tmp/ipykernel_359/1926042224.py in &lt;module&gt;
1 generator = container_client.list_blobs(name_starts_with=azure_scene_prefix)
----> 2 scene_files = list(generator)
3 image_blobs = [blob.name for blob in scene_files if blob.name.endswith('.tiff')]
4 preview_blobs = [blob.name for blob in scene_files if blob.name.endswith('quick-look.png')]
5

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/core/paging.py in __next__(self)
127 if self._page_iterator is None:
128 self._page_iterator = itertools.chain.from_iterable(self.by_page())
--> 129 return next(self._page_iterator)
130
131 next = __next__ # Python 2 compatibility.

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/core/paging.py in __next__(self)
74 raise StopIteration("End of paging")
75 try:
---> 76 self._response = self._get_next(self.continuation_token)
77 except AzureError as error:
78 if not error.continuation_token:

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/storage/blob/_list_blobs_helper.py in _get_next_cb(self, continuation_token)
77 use_location=self.location_mode)
78 except HttpResponseError as error:
---> 79 process_storage_error(error)
80
81 def _extract_data_cb(self, get_next_return):

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/storage/blob/_shared/response_handlers.py in process_storage_error(storage_error)
87 serialized = False
88 if not storage_error.response:
---> 89 raise storage_error
90 # If it is one of those three then it has been serialized prior by the generated layer.
91 if isinstance(storage_error, (PartialBatchErrorException,

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/storage/blob/_list_blobs_helper.py in _get_next_cb(self, continuation_token)
70 def _get_next_cb(self, continuation_token):
71 try:
---> 72 return self._command(
73 prefix=self.prefix,
74 marker=continuation_token or None,

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/storage/blob/_generated/operations/_container_operations.py in list_blob_flat_segment(self, prefix, marker, maxresults, include, timeout, request_id_parameter, **kwargs)
1473
1474 request = self._client.get(url, query_parameters, header_parameters)
-> 1475 pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
1476 response = pipeline_response.http_response
1477

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/core/pipeline/_base.py in run(self, request, **kwargs)
209 else _TransportRunner(self._transport)
210 )
--> 211 return first_node.send(pipeline_request) # type: ignore

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/core/pipeline/_base.py in send(self, request)
69 _await_result(self._policy.on_request, request)
70 try:
---> 71 response = self.next.send(request)
72 except Exception: # pylint: disable=broad-except
73 if not _await_result(self._policy.on_exception, request):

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/core/pipeline/_base.py in send(self, request)
69 _await_result(self._policy.on_request, request)
70 try:
---> 71 response = self.next.send(request)
72 except Exception: # pylint: disable=broad-except
73 if not _await_result(self._policy.on_exception, request):

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/core/pipeline/_base.py in send(self, request)
69 _await_result(self._policy.on_request, request)
70 try:
---> 71 response = self.next.send(request)
72 except Exception: # pylint: disable=broad-except
73 if not _await_result(self._policy.on_exception, request):

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/core/pipeline/_base.py in send(self, request)
69 _await_result(self._policy.on_request, request)
70 try:
---> 71 response = self.next.send(request)
72 except Exception: # pylint: disable=broad-except
73 if not _await_result(self._policy.on_exception, request):

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/core/pipeline/_base.py in send(self, request)
69 _await_result(self._policy.on_request, request)
70 try:
---> 71 response = self.next.send(request)
72 except Exception: # pylint: disable=broad-except
73 if not _await_result(self._policy.on_exception, request):

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/core/pipeline/policies/_redirect.py in send(self, request)
156 redirect_settings = self.configure_redirects(request.context.options)
157 while retryable:
--> 158 response = self.next.send(request)
159 redirect_location = self.get_redirect_location(response)
160 if redirect_location and redirect_settings['allow']:

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/core/pipeline/_base.py in send(self, request)
69 _await_result(self._policy.on_request, request)
70 try:
---> 71 response = self.next.send(request)
72 except Exception: # pylint: disable=broad-except
73 if not _await_result(self._policy.on_exception, request):

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/storage/blob/_shared/policies.py in send(self, request)
513 self.sleep(retry_settings, request.context.transport)
514 continue
--> 515 raise err
516 if retry_settings['history']:
517 response.context['history'] = retry_settings['history']

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/storage/blob/_shared/policies.py in send(self, request)
487 while retries_remaining:
488 try:
--> 489 response = self.next.send(request)
490 if is_retry(response, retry_settings['mode']):
491 retries_remaining = self.increment(

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/core/pipeline/_base.py in send(self, request)
69 _await_result(self._policy.on_request, request)
70 try:
---> 71 response = self.next.send(request)
72 except Exception: # pylint: disable=broad-except
73 if not _await_result(self._policy.on_exception, request):

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/core/pipeline/_base.py in send(self, request)
69 _await_result(self._policy.on_request, request)
70 try:
---> 71 response = self.next.send(request)
72 except Exception: # pylint: disable=broad-except
73 if not _await_result(self._policy.on_exception, request):

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/core/pipeline/_base.py in send(self, request)
67 :rtype: ~azure.core.pipeline.PipelineResponse
68 """
---> 69 _await_result(self._policy.on_request, request)
70 try:
71 response = self.next.send(request)

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/core/pipeline/_tools.py in await_result(func, *args, **kwargs)
27 def await_result(func, *args, **kwargs):
28 """If func returns an awaitable, raise that this runner can't handle it."""
---> 29 result = func(*args, **kwargs)
30 if hasattr(result, "__await__"):
31 raise TypeError(

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/storage/blob/_shared/authentication.py in on_request(self, request)
139 self._get_canonicalized_resource_query(request)
140
--> 141 self._add_authorization_header(request, string_to_sign)
142 #logger.debug("String_to_sign=%s", string_to_sign)

/srv/conda/envs/notebook/lib/python3.8/site-packages/azure/storage/blob/_shared/authentication.py in _add_authorization_header(self, request, string_to_sign)
122 # Wrap any error that occurred as signing error
123 # Doing so will clarify/locate the source of problem
--> 124 raise _wrap_exception(ex, AzureSigningError)
125
126 def on_request(self, request):

AzureSigningError: Invalid base64-encoded string: number of data characters (37) cannot be 1 more than a multiple of 4
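This error usually means that the string supplied as the storage credential is not a base64-encoded account key (for example, a SAS token or some other plain string was passed where an account key is expected). The failure can be reproduced with the standard library alone; the 37-character string below is arbitrary:

```python
import base64
import binascii

# Any string whose length is 1 more than a multiple of 4 is rejected by the
# base64 decoder with exactly the message seen in the traceback above.
not_an_account_key = "x" * 37

try:
    base64.b64decode(not_an_account_key)
except binascii.Error as err:
    print(err)  # ... number of data characters (37) cannot be 1 more than a multiple of 4
```

If you are authenticating with a SAS token, it should be passed as the `credential` argument of the client (or appended to the account URL), not as an account key.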

GOES L1 data

Hi @TomAugspurger, the NOAA website (https://www.noaa.gov/nodd/datasets) makes it sound like L1 datasets are also available on Azure; however, scanning through the docs as well as some manual searches didn't yield anything. Do you know if they're available but not documented, or are the AIforEarthDataSets docs (here) more accurate in this case? Thanks!


From the website:
GOES-16 & GOES 17 》Amazon Web Services offsite link 》Google offsite link 》Microsoft Azure offsite link
Advanced Baseline Imager Level 1b
Full Disk (ABI-L1b-RadF) | CONUS (ABI-L1b-RadC) | Mesoscale (ABI-L1b-RadM)


From the docs here:

Products available

The following GOES-R products are available on Azure, for both GOES-16 and GOES-17:

ABI-L2-CMIPC (Advanced Baseline Imager Level 2 Cloud and Moisture Imagery CONUS)
ABI-L2-CMIPF (Advanced Baseline Imager Level 2 Cloud and Moisture Imagery Full Disk)
ABI-L2-CMIPM (Advanced Baseline Imager Level 2 Cloud and Moisture Imagery Mesoscale)
ABI-L2-FDCC (Advanced Baseline Imager Level 2 Fire (Hot Spot Characterization) CONUS)
ABI-L2-FDCF (Advanced Baseline Imager Level 2 Fire (Hot Spot Characterization) Full Disk)
ABI-L2-LSTC (Advanced Baseline Imager Level 2 Land Surface Temperature CONUS)
ABI-L2-LSTF (Advanced Baseline Imager Level 2 Land Surface Temperature Full Disk)
ABI-L2-LSTM (Advanced Baseline Imager Level 2 Land Surface Temperature Mesoscale)
ABI-L2-MCMIPC (Advanced Baseline Imager Level 2 Cloud and Moisture Imagery CONUS)
ABI-L2-MCMIPF (Advanced Baseline Imager Level 2 Cloud and Moisture Imagery Full Disk)
ABI-L2-MCMIPM (Advanced Baseline Imager Level 2 Cloud and Moisture Imagery Mesoscale)
ABI-L2-RRQPEF (Advanced Baseline Imager Level 2 Rainfall Rate (Quantitative Precipitation Estimate) Full Disk)
ABI-L2-SSTF (Advanced Baseline Imager Level 2 Sea Surface (Skin) Temperature Full Disk)
GLM-L2-LCFA (Geostationary Lightning Mapper Level 2 Lightning Detection)

Sentinel 3 Token ClientAuthenticationError

Hi, I am trying to run the Sentinel 3 notebook; I retrieve the SAS token on the fly from https://planetarycomputer.microsoft.com/api/sas/v1/token/sentinel3euwest/sentinel-3 and save it as a text file.

I am able to retrieve the scenes from SciHub; however, while listing the blobs using the Azure container client, I get the error below:

ClientAuthenticationError: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:1a44bfb3-001e-0015-5ad0-6033fc000000
Time:2022-05-05T22:32:55.2029339Z
ErrorCode:AuthenticationFailed
authenticationerrordetail:The MAC signature found in the HTTP request '70KLIGh33cHeZoJWYUZGhUJeaobyK/t522J6P4WX8FA=' is not the same as any computed signature. Server used following string to sign: GET

Is there another correct way to incorporate the SAS token?
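Not an official answer, but two things commonly go wrong here: the saved token file ends up with a trailing newline (which corrupts the signature computation), and the token gets passed as if it were an account key. A sketch, with a hypothetical token, of sanitizing the token and appending it to the container URL:

```python
# Hypothetical SAS token as it might be read back from a saved text file,
# including a leading '?' and a trailing newline.
raw_token = "?sv=2021-06-08&se=2022-05-06T00%3A00%3A00Z&sig=abc%2Fdef%3D\n"

# Strip whitespace/newlines and any leading '?' before use.
token = raw_token.strip().lstrip("?")

account_url = "https://sentinel3euwest.blob.core.windows.net"
container_url = f"{account_url}/sentinel-3?{token}"

# With azure-storage-blob, the cleaned token would typically be passed as the
# `credential` argument instead, e.g.:
#   ContainerClient(account_url, "sentinel-3", credential=token)
print(container_url)
```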
