s5p-tools's Issues

L3 Resampling and memory usage

Is it normal that the HARP tools for the L3 resampling use huge amounts of memory? I tried to download and process L2__NO2___ data with this command:

python s5p-request.py L2__NO2___ --date NOW-2HOUR NOW --num-workers 1

The download was quite small, but when it got to the L3 resampling, Python started to use huge amounts of RAM until it crashed the computer. Is this expected when using HARP tools?

(screenshot of memory usage, 2021-03-08, omitted)
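For context, the regridding step relies on HARP's `bin_spatial()` operation, which, as far as I can tell, allocates the full target latitude/longitude grid in memory for every product it processes, so peak RAM grows with the number of grid cells. Using a coarser grid or a smaller area of interest usually keeps memory bounded. A sketch of the HARP operations syntax (the arguments are edge count, origin, and step for latitude and then longitude; the 1°×1° global grid below is purely illustrative, not the value s5p-tools uses):

```text
bin_spatial(181, -90, 1.0, 361, -180, 1.0)
```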

NetCDF: One or more variable sizes violate format constraints

When running this command:
python s5p-request.py L2__AER_AI --num-workers 1

I am getting the following error during the conversion to L3.

```
Converting into L3 products

Launched 1 processes
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "/media/WRF_OUT_DATA/s5p-tools-2/s5p_tools/utils.py", line 68, in process_file
    file_format="netcdf"
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/harp/_harppy.py", line 1251, in export_product
    raise CLibraryError()
harp._harppy.CLibraryError: NetCDF: One or more variable sizes violate format constraints (L3_data/S5P_NRTI_L3__AER_AI_20210712T074237_20210712T074737_19407_02_020200_20210712T080918.nc)
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "s5p-request.py", line 294, in <module>
    num_workers=args.num_workers,
  File "s5p-request.py", line 142, in main
    total=len(filenames),
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/tqdm/std.py", line 1178, in __iter__
    for obj in iterable:
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/multiprocessing/pool.py", line 748, in next
    raise value
harp._harppy.CLibraryError: NetCDF: One or more variable sizes violate format constraints (L3_data/S5P_NRTI_L3__AER_AI_20210712T074237_20210712T074737_19407_02_020200_20210712T080918.nc)
```

Any idea what the cause might be, please @bilelomrani1? Thank you.
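The HARP error string comes straight from the NetCDF C library: the classic NetCDF format, which `file_format="netcdf"` (visible in the traceback at `s5p_tools/utils.py` line 68) selects, constrains fixed-size variables to slightly under 2 GiB each (only the last variable defined may be larger). A very fine L3 grid can exceed that. A rough stdlib-only diagnostic sketch (the helper name and the float64 assumption are mine, not from s5p-tools):

```python
def exceeds_classic_limit(shape, itemsize=8):
    """Rough check against the classic NetCDF (CDF-1) per-variable cap of
    2**31 - 4 bytes (~2 GiB).  Assumes float64 cells (itemsize=8); this is
    a diagnostic sketch, not part of s5p-tools."""
    classic_limit = 2**31 - 4
    n_cells = 1
    for dim in shape:
        n_cells *= dim
    return n_cells * itemsize > classic_limit
```

For example, a 0.01° global grid (18000 × 36000 cells) needs about 5.2 GB per float64 variable, well past the cap. If that is indeed the cause, coarsening the grid or exporting to HDF5 instead of classic NetCDF should avoid the error.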

Respected Sir, kindly guide me on the errors below.

```
Launched 4 processes

Traceback (most recent call last):
  File "C:\Users\pawar\.conda\envs\project1\lib\site-packages\xarray\backends\file_manager.py", line 199, in _acquire_with_cache_info
    file = self._cache[self._key]
  File "C:\Users\pawar\.conda\envs\project1\lib\site-packages\xarray\backends\lru_cache.py", line 53, in __getitem__
    value = self._cache[key]
KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('C:\Users\pawar\s5p-tools-master\L2_data\L2__O3____\S5P_NRTI_L2__O3_____20210928T081425_20210928T081925_20514_02_020201_20210928T090545.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\pawar\s5p-tools-master\s5p-request.py", line 401, in <module>
    main(
  File "C:\Users\pawar\s5p-tools-master\s5p-request.py", line 221, in main
    attributes = {
  File "C:\Users\pawar\s5p-tools-master\s5p-request.py", line 223, in <dictcomp>
    "time_coverage_start": xr.open_dataset(filename).attrs[
  File "C:\Users\pawar\.conda\envs\project1\lib\site-packages\xarray\backends\api.py", line 497, in open_dataset
    backend_ds = backend.open_dataset(
  File "C:\Users\pawar\.conda\envs\project1\lib\site-packages\xarray\backends\netCDF4_.py", line 551, in open_dataset
    store = NetCDF4DataStore.open(
  File "C:\Users\pawar\.conda\envs\project1\lib\site-packages\xarray\backends\netCDF4_.py", line 380, in open
    return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)
  File "C:\Users\pawar\.conda\envs\project1\lib\site-packages\xarray\backends\netCDF4_.py", line 328, in __init__
    self.format = self.ds.data_model
  File "C:\Users\pawar\.conda\envs\project1\lib\site-packages\xarray\backends\netCDF4_.py", line 389, in ds
    return self._acquire()
  File "C:\Users\pawar\.conda\envs\project1\lib\site-packages\xarray\backends\netCDF4_.py", line 383, in _acquire
    with self._manager.acquire_context(needs_lock) as root:
  File "C:\Users\pawar\.conda\envs\project1\lib\contextlib.py", line 119, in __enter__
    return next(self.gen)
  File "C:\Users\pawar\.conda\envs\project1\lib\site-packages\xarray\backends\file_manager.py", line 187, in acquire_context
    file, cached = self._acquire_with_cache_info(needs_lock)
  File "C:\Users\pawar\.conda\envs\project1\lib\site-packages\xarray\backends\file_manager.py", line 205, in _acquire_with_cache_info
    file = self._opener(*self._args, **kwargs)
  File "src\netCDF4\_netCDF4.pyx", line 2307, in netCDF4._netCDF4.Dataset.__init__
  File "src\netCDF4\_netCDF4.pyx", line 1925, in netCDF4._netCDF4._ensure_nc_success
FileNotFoundError: [Errno 2] No such file or directory: b'C:\Users\pawar\s5p-tools-master\L2_data\L2__O3____\S5P_NRTI_L2__O3_____20210928T081425_20210928T081925_20514_02_020201_20210928T090545.nc'
```

```
(project1) C:\Users\pawar\s5p-tools-master>python s5p-request.py L2__O3____
Traceback (most recent call last):
  File "C:\Users\pawar\s5p-tools-master\s5p-request.py", line 15, in <module>
    from s5p_tools import (
ImportError: cannot import name 'bounding_box' from 's5p_tools' (unknown location)
```

Exception during conversion to L3

My query requires a download of 160 GB and 1556 products. After downloading the first 4 files, it tries to convert them to L3, during which it crashes with the following error:

```
Traceback (most recent call last):
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/file_manager.py", line 199, in _acquire_with_cache_info
    file = self._cache[self._key]
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/lru_cache.py", line 53, in __getitem__
    value = self._cache[key]
KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/home/wrfchem/Desktop/s5p-tools/L2_data/L2__AER_AI/S5P_NRTI_L2__AER_AI_20210703T152705_20210703T153205_19284_01_010400_20210703T160645.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "s5p-request.py", line 411, in <module>
    num_workers=args.num_workers,
  File "s5p-request.py", line 228, in main
    for filename in L2_files_urls
  File "s5p-request.py", line 228, in
    for filename in L2_files_urls
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/api.py", line 500, in open_dataset
    **kwargs,
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 558, in open_dataset
    autoclose=autoclose,
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 380, in open
    return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 328, in __init__
    self.format = self.ds.data_model
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 389, in ds
    return self._acquire()
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 383, in _acquire
    with self._manager.acquire_context(needs_lock) as root:
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/file_manager.py", line 187, in acquire_context
    file, cached = self._acquire_with_cache_info(needs_lock)
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/file_manager.py", line 205, in _acquire_with_cache_info
    file = self._opener(*self._args, **kwargs)
  File "netCDF4/_netCDF4.pyx", line 2291, in netCDF4._netCDF4.Dataset.__init__
  File "netCDF4/_netCDF4.pyx", line 1855, in netCDF4._netCDF4._ensure_nc_success
FileNotFoundError: [Errno 2] No such file or directory: b'/home/wrfchem/Desktop/s5p-tools/L2_data/L2__AER_AI/S5P_NRTI_L2__AER_AI_20210703T152705_20210703T153205_19284_01_010400_20210703T160645.nc'
```

Re-running the command downloads another 4 files and then crashes again. Any idea what is wrong here, please @bilelomrani1? I can't quite follow what's happening.

Is downloading just 4 files and then converting to L3 normal behaviour? Or should it download all the files before converting to L3?

Thank you.
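The KeyError followed by the FileNotFoundError suggests the conversion step is opening an L2 file that was never actually written to disk (for example, a download that failed silently). Before converting, it may help to list which expected products are missing; a stdlib-only sketch (the helper is hypothetical, not part of s5p-tools):

```python
from pathlib import Path

def missing_products(filenames):
    """Return the expected L2 product paths that are absent from disk, so the
    conversion step can skip or re-download them (hypothetical helper)."""
    return [name for name in filenames if not Path(name).exists()]
```

Re-running s5p-request.py re-attempts the downloads, which would match the behaviour described above of fetching four more files on each run.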

No module named 's5p_tools'

Hello, I was wondering if I could have some guidance on an issue I encountered when trying to use the s5p-request.py script. When running the following command I get the error "ModuleNotFoundError: No module named 's5p_tools'".

I have not altered the s5p-request.py script in any way, and I followed the recommended setup instructions for the conda environment:

conda create --override-channels -c conda-forge -c stcorp --file requirements.txt --name SP5

Any advice would be greatly appreciated. Thank you.

Command and error:

```
python s5p-request.py L2__O3____ --date NOW-1MONTH --aoi map.geojson
Traceback (most recent call last):
  File "s5p-request.py", line 15, in <module>
    import s5p_tools
ModuleNotFoundError: No module named 's5p_tools'
```
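`s5p_tools` is a plain package directory that sits next to `s5p-request.py` in the repository, so Python can only import it when the current working directory (or an entry on `sys.path`) is the repository root. Running the script from inside the cloned s5p-tools directory is the usual fix; a minimal sketch of the same idea in code (the path argument is a placeholder for your clone location):

```python
import sys
from pathlib import Path

def add_repo_root(repo_root):
    """Put an s5p-tools checkout on sys.path so `import s5p_tools` resolves.

    `repo_root` must be the directory that contains the s5p_tools/ package
    (placeholder path; adjust to wherever you cloned the repository).
    """
    root = str(Path(repo_root).resolve())
    if root not in sys.path:
        sys.path.insert(0, root)
```

In practice, simply `cd` into the s5p-tools directory before invoking the script.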

NetCDF coordinates do not match given GeoJSON file

I'm running s5p-request with this command:

python s5p-request.py L2__AER_AI --aoi newmap.geojson --qa 80 --num-workers 1

The geojson file is this one: newmap.geojson.zip

The area covered by the geojson is: (screenshot, 2021-07-12, omitted)

But the output NetCDF coordinates are: (screenshot omitted)

I am 99% sure that I'm missing something here rather than this being a bug in the script; however, I'm not sure what I'm missing. @bilelomrani1, do you have any idea, please? Thank you.
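One way to narrow this down is to compare the bounding box of the GeoJSON with the coordinate range in the output NetCDF. Note that GeoJSON stores positions as [longitude, latitude], so a swapped axis order is a common cause of mismatches like this. A stdlib-only diagnostic sketch (the helper is mine, not part of s5p-tools):

```python
import json

def geojson_bounds(path):
    """Return (lon_min, lat_min, lon_max, lat_max) over every coordinate in a
    GeoJSON file (diagnostic sketch; assumes GeoJSON's [lon, lat] order)."""
    def walk(coords):
        # Recurse through nested coordinate arrays down to [lon, lat] pairs.
        if isinstance(coords[0], (int, float)):
            yield coords[:2]
        else:
            for part in coords:
                yield from walk(part)

    with open(path) as f:
        gj = json.load(f)
    features = gj["features"] if gj.get("type") == "FeatureCollection" else [gj]
    points = [p for feat in features for p in walk(feat["geometry"]["coordinates"])]
    lons = [p[0] for p in points]
    lats = [p[1] for p in points]
    return min(lons), min(lats), max(lons), max(lats)
```

If these bounds already disagree with the map view, the geojson itself is the problem; if they agree, the issue lies in how the script applies the AOI.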

Kindly help me sort out these errors

```
(base) C:\Users\bhushan.pawar\my project\me>python s5p-request.py L2_NO2_
Traceback (most recent call last):
  File "C:\Users\bhushan.pawar\my project\me\s5p-request.py", line 15, in <module>
    from s5p_tools import (bounding_box, convert_to_l3_products, get_filenames_request, request_copernicus_hub,)
  File "C:\Users\bhushan.pawar\my project\me\s5p_tools\__init__.py", line 2, in <module>
    from .preprocess import bounding_box, convert_to_l3_products
  File "C:\Users\bhushan.pawar\my project\me\s5p_tools\preprocess.py", line 6, in <module>
    import geopandas
  File "C:\Users\bhushan.pawar\Miniconda3\lib\site-packages\geopandas\__init__.py", line 1, in <module>
    from geopandas._config import options  # noqa
  File "C:\Users\bhushan.pawar\Miniconda3\lib\site-packages\geopandas\_config.py", line 126, in <module>
    default_value=_default_use_pygeos(),
  File "C:\Users\bhushan.pawar\Miniconda3\lib\site-packages\geopandas\_config.py", line 112, in _default_use_pygeos
    import geopandas.compat as compat
  File "C:\Users\bhushan.pawar\Miniconda3\lib\site-packages\geopandas\compat.py", line 202, in <module>
    import rtree  # noqa
  File "C:\Users\bhushan.pawar\Miniconda3\lib\site-packages\rtree\__init__.py", line 9, in <module>
    from .index import Rtree, Index  # noqa
  File "C:\Users\bhushan.pawar\Miniconda3\lib\site-packages\rtree\index.py", line 6, in <module>
    from . import core
  File "C:\Users\bhushan.pawar\Miniconda3\lib\site-packages\rtree\core.py", line 75, in <module>
    rt = finder.load()
  File "C:\Users\bhushan.pawar\Miniconda3\lib\site-packages\rtree\finder.py", line 67, in load
    raise OSError("could not find or load {}".format(lib_name))
OSError: could not find or load spatialindex_c-64.dll
```

s5p-compress "some chunks keys are not dimensions on this object"

Hello, apologies for raising a new issue, but I have come across a new error when running the s5p-compress script and wondered if you had any ideas on how to overcome it. I have tried everything I can think of to get around it, and I get the same error no matter what product or band name I use.

Any ideas would be greatly appreciated. Thank you again for all the help previously.

```
python s5p-compress.py L2_data/L2__NO2___/S5P_OFFL_L2__NO2____20210310T102908_20210310T121038_17650_01_010400_20210312T035825.nc nitrogendioxide_tropospheric_column

/home/ab/bw43776/anaconda3/envs/Sentinel/lib/python3.7/site-packages/rasterio/__init__.py:216: NotGeoreferencedWarning: Dataset has no geotransform set. The identity matrix may be returned.
  s = DatasetReader(path, driver=driver, sharing=sharing, **kwargs)
/home/ab/bw43776/anaconda3/envs/Sentinel/lib/python3.7/site-packages/rioxarray/_io.py:761: NotGeoreferencedWarning: Dataset has no geotransform set. The identity matrix may be returned.
  warnings.warn(str(rio_warning.message), type(rio_warning.message))
Traceback (most recent call last):
  File "s5p-compress.py", line 184, in <module>
    export_dir=EXPORT_DIR,
  File "s5p-compress.py", line 38, in main
    DS = rioxarray.open_rasterio(netcdf_file, chunks={"time": chunk_size})
  File "/home/ab/bw43776/anaconda3/envs/Sentinel/lib/python3.7/site-packages/rioxarray/_io.py", line 774, in open_rasterio
    mask_and_scale=mask_and_scale,
  File "/home/ab/bw43776/anaconda3/envs/Sentinel/lib/python3.7/site-packages/rioxarray/_io.py", line 552, in _load_subdatasets
    default_name=subdataset.split(":")[-1].lstrip("/").replace("/", "_"),
  File "/home/ab/bw43776/anaconda3/envs/Sentinel/lib/python3.7/site-packages/rioxarray/_io.py", line 865, in open_rasterio
    result = _prepare_dask(result, riods, filename, chunks)
  File "/home/ab/bw43776/anaconda3/envs/Sentinel/lib/python3.7/site-packages/rioxarray/_io.py", line 603, in _prepare_dask
    return result.chunk(chunks, name_prefix=name_prefix, token=token)
  File "/home/ab/bw43776/anaconda3/envs/Sentinel/lib/python3.7/site-packages/xarray/core/dataarray.py", line 1058, in chunk
    chunks, name_prefix=name_prefix, token=token, lock=lock
  File "/home/ab/bw43776/anaconda3/envs/Sentinel/lib/python3.7/site-packages/xarray/core/dataset.py", line 1916, in chunk
    "some chunks keys are not dimensions on this " "object: %s" % bad_dims
ValueError: some chunks keys are not dimensions on this object: {'time'}
```
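The ValueError at the bottom is xarray refusing a chunk key that is not a dimension of the dataset: the S5P L2 subdatasets that rioxarray opens expose dimensions such as `scanline` and `ground_pixel` rather than `time`, so `chunks={"time": ...}` has nothing to attach to. The underlying check can be illustrated with a plain-dict helper that keeps only chunk keys naming real dimensions (illustrative only, not s5p-tools code):

```python
def usable_chunks(requested, dims):
    """Drop chunk sizes whose keys do not name an actual dimension; older
    xarray raises ValueError for unknown keys, as in the traceback above."""
    return {key: size for key, size in requested.items() if key in dims}
```

For example, `usable_chunks({"time": 200}, {"scanline", "ground_pixel"})` returns `{}`, meaning no chunking would be applied at all for this file.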
