
tutorials' People

Contributors

cassienickles, celiaou-podaac, cgentemann, edshred2000, frankinspace, jinbow, jjmcnelis, mgangl, mike-gangl, nicktarpinian, nikki-t, sanchezroj, sciencecat18, skorper, sliu008, stefaniebutland, swot-jpl, torimcd, zoewalschots


tutorials' Issues

broken link, please provide new URL for anonymous download

The URL https://opendap.jpl.nasa.gov/opendap/allData/homage/L4/gmsl/global_timeseries_measures.nc used in the tutorials is no longer available.

This breaks my corresponding Julia tutorial, which runs via GitHub Actions like the rest of my CI for OceanStateEstimation.jl.

An important consideration for me is that downloads from OPeNDAP or FTP do not require authentication; that factored into my choice of this dataset back then. As a fallback, I could use the paper's data link https://doi.org/10.5281/zenodo.3862995, but I am trying to avoid extra work.

Please provide a new URL for anonymous download via a basic protocol.
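Until a replacement URL is posted, a small stdlib-only probe like the following could pick the first reachable anonymous link before wiring it into CI; the helper name first_reachable is illustrative, not part of any tutorial, and the candidate list just pairs the dead OPeNDAP link with the Zenodo fallback mentioned above.

```python
import urllib.request
import urllib.error

def first_reachable(urls, timeout=10):
    """Return the first URL that answers an anonymous HEAD request with 200."""
    for url in urls:
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                if resp.status == 200:
                    return url
        except (urllib.error.URLError, OSError):
            continue  # unreachable or errored; try the next candidate
    return None

# Candidates: the original OPeNDAP link and the paper's Zenodo fallback.
candidate_urls = [
    "https://opendap.jpl.nasa.gov/opendap/allData/homage/L4/gmsl/global_timeseries_measures.nc",
    "https://doi.org/10.5281/zenodo.3862995",
]
# usage: first_reachable(candidate_urls)  ->  URL string, or None if all fail
```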

error downloading data

podaac-data-downloader works for me for ocean collections but not for the NA rivers one; i.e., it fails for:

-c SWOT_SIMULATED_NA_CONTINENT_L2_HR_Raster_V1 -d ./SWOT_SIMULATED_NA_CONTINENT_L2_HR_Raster_V1 --start-date 2022-08-02T00:00:00Z --end-date 2022-08-22T00:00:00Z -b="-97,32.5,-96.5,33"

Moreover, I invoke it as:
python3 podaac-data-downloader.py -c etc.
so I do not know whether my issue is related to that; the rest does not work.

Thanks for your reply.
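For reference, one likely source of confusion here is the invocation style: when the downloader is installed from PyPI (the podaac-data-subscriber package provides the podaac-data-downloader console script), it is run directly rather than as "python3 podaac-data-downloader.py". A sketch of the install plus the exact command from this report (credentials still come from ~/.netrc):

```shell
# Assumption: installing from PyPI gives you the console-script entry point,
# so no "python3 ... .py" wrapper is needed.
pip install podaac-data-subscriber

podaac-data-downloader \
  -c SWOT_SIMULATED_NA_CONTINENT_L2_HR_Raster_V1 \
  -d ./SWOT_SIMULATED_NA_CONTINENT_L2_HR_Raster_V1 \
  --start-date 2022-08-02T00:00:00Z \
  --end-date 2022-08-22T00:00:00Z \
  -b="-97,32.5,-96.5,33"
```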

AWS S3 access

I have been trying to work with different notebooks and have been having access problems such as the one in the attached notebook:
test1.pdf

The dataset I am trying to read is podaac-ops-cumulus-protected/TELLUS_GRAC-GRFO_MASCON_CRI_GRID_RL06_V2/

And the notebook I used as an example is a recent one.

I verified that my credentials from get_temp_creds() look okay. However,

s3_file_obj = fs_s3.open(url, mode='rb')

throws an error; please see the attachment.

Appreciate any help. Thanks!
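For context, direct s3:// access to podaac-ops-cumulus-protected only works from compute running in AWS us-west-2; from anywhere else an access error is expected even with valid temporary credentials. A minimal sketch of the credential wiring, assuming get_temp_creds() returns the usual accessKeyId/secretAccessKey/sessionToken fields (the granule name in the usage note is a placeholder):

```python
def s3_url(bucket, key):
    """Build an s3:// URL for a bucket/key pair (pure string helper)."""
    return f"s3://{bucket}/{key}"

def open_s3(url, creds):
    """Open an object read-only using PO.DAAC temporary credentials."""
    import s3fs  # imported lazily so s3_url stays usable without s3fs installed
    fs = s3fs.S3FileSystem(
        key=creds["accessKeyId"],
        secret=creds["secretAccessKey"],
        token=creds["sessionToken"],
    )
    return fs.open(url, mode="rb")

# usage (from an instance in us-west-2), with a hypothetical granule name:
#   f = open_s3(s3_url("podaac-ops-cumulus-protected",
#                      "TELLUS_GRAC-GRFO_MASCON_CRI_GRID_RL06_V2/granule.nc"),
#               get_temp_creds())
```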

warnings and error from gdal.Warp in MODIS_L2P_SST_DataCube.ipynb

The MODIS_L2P_SST_DataCube.ipynb output file MODIS_SST.data-cube.nc is missing all gdal.Warp SST output, but that might just be caused by an ad hoc Python environment:

Currently Loaded Modules:

   1) CCconfig                5) mii/1.1.1          9) python/3.8.10 (t)         13) expat/2.2.9 (t)     17) gcccore/.9.3.0 (H)  21) hdf5/1.10.6 (io)
   2) gentoo/2020 (S)         6) ucx/1.8.0         10) ipykernel/2020b           14) udunits/2.2.26 (t)  18) gcc/9.3.0 (t)       22) netcdf/4.7.4 (io)
   3) imkl/2020.1.217 (math)  7) libfabric/1.10.1  11) scipy-stack/2020b (math)  15) antlr/2.7.7         19) openmpi/4.0.3 (m)   23) gdal/3.0.4 (geo)
   4) StdEnv/2020 (S)         8) libffi/3.3        12) proj/8.0.0 (geo)          16) libdap/3.20.6       20) gsl/2.6 (math)      24) nco/4.9.5 (io)

If not, are these warnings/error caused by an unexpected(?) use of ni,nj to store the L2 lat/lon arrays:

{'format': 'netCDF', 'copyMetadata': True, 'outputBounds': [-163, 15, -153, 25], 'xRes': 0.01, 'yRes': 0.01, 'dstSRS': '+proj=longlat +datum=WGS84 +no_defs', 'resampleAlg': 'bilinear'}
source filename: /project/riced/work/req/modis-datacube-output/20200702234001-JPL-L2P_GHRSST-SSTskin-MODIS_A-D-v02.0-fv01.0_subsetted.nc4
NETCDF:/project/riced/work/req/modis-datacube-output/20200702234001-JPL-L2P_GHRSST-SSTskin-MODIS_A-D-v02.0-fv01.0_subsetted.nc4:sea_surface_temperature
Warning 1: dimension #2 (ni) is not a Longitude/X dimension.
Warning 1: dimension #1 (nj) is not a Latitude/Y dimension.
Warning 1: dimension #2 (time) is not a Longitude/X dimension.
Warning 1: dimension #1 (ni) is not a Latitude/Y dimension.
Warning 1: 1-pixel width/height files not supported, xdim: 1 ydim: 903
Warning 1: No 1D variable is indexed by dimension nj
Warning 1: dimension #2 (time) is not a Longitude/X dimension.
Warning 1: dimension #1 (ni) is not a Latitude/Y dimension.
Warning 1: 1-pixel width/height files not supported, xdim: 1 ydim: 903
Warning 1: No 1D variable is indexed by dimension nj
ERROR 1: Too many points (529 out of 529) failed to transform, unable to compute output bounds.
Warning 1: Unable to compute source region for output window 0,0,1000,1000, skipping.

[2022-03-19 02:59:01.376839]  SUCCESS for: /project/riced/work/req/modis-datacube-output/subset_reproject-sea_surface_temperature-20200702234001-JPL-L2P_GHRSST-SSTskin-MODIS_A-D-v02.0-fv01.0_subsetted.nc4
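The warnings suggest GDAL is trying to treat the swath dimensions ni/nj as regular X/Y axes. One hedged workaround is to ask gdal.Warp for the geolocation-array transform (the geoloc option of gdal.WarpOptions), so the file's 2-D lat/lon arrays are used instead of axis detection. A sketch, with the option dict kept as a pure function so it can be inspected without GDAL installed:

```python
def build_warp_kwargs(bounds, res):
    """gdal.Warp keyword arguments as a plain dict (no GDAL import needed)."""
    return {
        "format": "netCDF",
        "copyMetadata": True,
        "outputBounds": bounds,          # [west, south, east, north]
        "xRes": res,
        "yRes": res,
        "dstSRS": "+proj=longlat +datum=WGS84 +no_defs",
        "resampleAlg": "bilinear",
        "geoloc": True,                  # use the file's 2-D lat/lon arrays
    }

def warp_sst(src_path, dst_path, bounds=(-163, 15, -153, 25), res=0.01):
    # Imported here so build_warp_kwargs stays usable without GDAL installed.
    from osgeo import gdal
    gdal.UseExceptions()
    src = f"NETCDF:{src_path}:sea_surface_temperature"
    return gdal.Warp(dst_path, src, **build_warp_kwargs(list(bounds), res))
```

Whether geoloc alone is sufficient depends on the subsetted file carrying usable lat/lon variables; if the subsetter collapsed them (the "xdim: 1" warnings hint at that), the environment is not the culprit.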

SWOT PIXCVec files with wrong Spatial Extents

I'm using earthaccess to download PIXC and PIXCVec SWOT products.
For the same args (bbox and time range), the PIXCVec search returns several files that are not within the AOI.

Here is an example:

results = earthaccess.search_data(
    short_name='SWOT_L2_HR_PIXC_2.0',
    temporal=('2023-11-06', '2023-11-07'),
    bounding_box=AOI.bounds
)
# Granules found: 1

results = earthaccess.search_data(
    short_name='SWOT_L2_HR_PIXCVec_2.0',
    temporal=('2023-11-06', '2023-11-07'),
    bounding_box=AOI.bounds
)
# Granules found: 115

When we check the results, we can see that several PIXCVec files have their Spatial Extent incorrectly (IMHO) set to -180, -90, 180, 90, and that is causing this discrepancy in the search.


I don't know if this is the best channel for such an issue; I've also reported it on the earthaccess GitHub.
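Until the granule metadata is fixed, a client-side filter can drop PIXCVec granules whose reported bounding box is the whole globe or does not intersect the AOI. This sketch assumes each earthaccess result exposes UMM-G metadata with a BoundingRectangles entry; granules described by a GPolygon instead are kept conservatively:

```python
def bboxes_intersect(a, b):
    """a and b are (west, south, east, north) tuples in degrees."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

GLOBAL = (-180.0, -90.0, 180.0, 90.0)

def filter_granules(results, aoi_bounds):
    """Drop granules with a global or non-intersecting bounding rectangle."""
    kept = []
    for g in results:
        try:
            r = (g["umm"]["SpatialExtent"]["HorizontalSpatialDomain"]
                  ["Geometry"]["BoundingRectangles"][0])
            box = (r["WestBoundingCoordinate"], r["SouthBoundingCoordinate"],
                   r["EastBoundingCoordinate"], r["NorthBoundingCoordinate"])
        except (KeyError, IndexError, TypeError):
            kept.append(g)  # no bounding rectangle (e.g. GPolygon): keep it
            continue
        if box != GLOBAL and bboxes_intersect(box, aoi_bounds):
            kept.append(g)
    return kept
```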

OPeNDAP Access

Hello,
I am following your MUR-OPeNDAP.ipynb jupyter notebook tutorial to access some L2 data via OPeNDAP.

  • I encountered the following error when getting a token, even though I put in the same username and password that log me in successfully at urs.earthdata.nasa.gov. (I got around this by directly generating and copying the token from my account portal.)
Error getting the token - check user name and password
---------------------------------------------------------------------------
UnboundLocalError                         Traceback (most recent call last)
Input In [96], in <cell line: 3>()
      1 # this function doesn't work for me somehow. 
      2 token_url="https://"+cmr+"/legacy-services/rest/tokens"
----> 3 token=get_token(token_url,'podaac-subscriber', "127.0.0.1",edl)

Input In [12], in get_token(url, client_id, user_ip, endpoint)
     13 except:
     14     print("Error getting the token - check user name and password")
---> 15 return token

UnboundLocalError: local variable 'token' referenced before assignment
  • Then, I encountered another error when requesting to get the data:
    response = requests.get(data_url, params=request_params, headers={'Accept-Encoding': 'identity'})
 requesting https://opendap.earthdata.nasa.gov/collections/C2075141524-POCLOUD/granules/ascat_20191130_223900_metopa_68055_eps_o_250_3202_ovw.l2.dap.nc4
 Request failed: HTTP Basic: Access denied.

When I check the response value, it shows <Response [401]>; I am not sure whether this 401 response is related to the first error about authentication.
Could you help me figure out what the problem is?

Thank you!
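On the first error: get_token's except clause only prints a message, so the function falls through to "return token" with the variable never assigned, hence the UnboundLocalError. The legacy-services token endpoint it targets has since been retired, so the token generated in the Earthdata Login profile page (the workaround described above) is the way to go; it can be sent as a Bearer header directly. A stdlib-only sketch, where EDL_TOKEN is a placeholder for that pasted token:

```python
import urllib.request

def bearer_headers(token):
    """Headers for an EDL-token-authenticated OPeNDAP request."""
    return {
        "Authorization": f"Bearer {token}",
        "Accept-Encoding": "identity",
    }

def fetch(data_url, token):
    """GET a URL with the EDL token; raises on HTTP errors such as 401."""
    req = urllib.request.Request(data_url, headers=bearer_headers(token))
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# usage, with EDL_TOKEN pasted from "Generate Token" on urs.earthdata.nasa.gov:
#   body = fetch("https://opendap.earthdata.nasa.gov/collections/"
#                "C2075141524-POCLOUD/granules/...dap.nc4", EDL_TOKEN)
```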

Update UAT Environments to Production Environments

A user mentioned issues with netrc and not knowing whether to use UAT or non-UAT instances in the tutorials and subscriber toolsets. Now that a lot of our data is in the cloud, we should update our tutorials and notebooks to the non-UAT instances to remove misunderstandings for our dev users. This will also be a good chance to ensure our tutorials still work after the transition from dev/test to production.

@ScienceCat18 I'm not saying that you have to own this and do the work; I think you're in the best position to assign, prioritize, and track it.
