
fmiopendata's People

Contributors

antont, heikkil, jannylund, mikaelhg, pnuu, terop


fmiopendata's Issues

Forecast data download - ModuleNotFoundError: No module named 'eccodes'

Hello,
I'm having problems downloading FMI forecast data with Python (3.9).

I'm using your code example from GitHub:

from fmiopendata.wfs import download_stored_query

model_data = download_stored_query("fmi::forecast::harmonie::surface::grid",
                                   args=["starttime=2020-07-06T18:00:00Z",
                                         "endtime=2020-07-06T20:00:00Z",
                                         "bbox=18,55,35,75"])

and getting the following error message:

  File "c:/****/****.py", line 561, in <module>
    model_data = download_stored_query("fmi::forecast::harmonie::surface::grid",
  File "C:\***\****\Python\Python38-32\lib\site-packages\fmiopendata\wfs.py", line 120, in download_stored_query
    from fmiopendata.grid import download_and_parse
  File "C:\***\****\AppData\Local\Programs\Python\Python38-32\lib\site-packages\fmiopendata\grid.py", line 31, in <module>
    import eccodes
ModuleNotFoundError: No module named 'eccodes'

I have eccodes installed; here is the output of pip install eccodes:

pip install eccodes
Requirement already satisfied: eccodes in c:\python39\lib\site-packages (1.1.0)
Requirement already satisfied: cffi in c:\python39\lib\site-packages (from eccodes) (1.14.4)
Requirement already satisfied: attrs in c:\python39\lib\site-packages (from eccodes) (20.3.0)
Requirement already satisfied: numpy in c:\python39\lib\site-packages (from eccodes) (1.20.1)
Requirement already satisfied: pycparser in c:\python39\lib\site-packages (from cffi->eccodes) (2.20)
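One thing worth checking: the traceback references a 32-bit Python 3.8 site-packages path, while pip reports eccodes installed under Python 3.9, so the script and pip may be talking to different interpreters. A quick check (a minimal sketch, not fmiopendata code):

```python
import sys

# Print the interpreter actually running the script; compare it against the
# interpreter pip installs into (check with "pip -V" or "pip show eccodes").
print(sys.executable)
print(sys.version)

# Installing through the same interpreter that runs the script avoids the mismatch:
#     python -m pip install eccodes
```

If the two paths differ, installing with `python -m pip` from the interpreter that runs the script should make the import succeed.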

Non-timeseries data & HTTP Error 400: Bad Request

Testing with the non-timeseries data, the following code

from fmiopendata.wfs import download_stored_query

start_time = "2020-10-15T11:00:00Z"
end_time = "2020-10-22T12:00:00Z"

obs = download_stored_query("fmi::observations::weather::multipointcoverage",
                            args=["bbox=24.88,60.14,25.01,60.19",
                                  "starttime=" + start_time,
                                  "endtime=" + end_time])
for step in sorted(obs.data.keys()):
    print(step)

throws me the following error:
urllib.error.HTTPError: HTTP Error 400: Bad Request

It seems that when the interval from start_time to end_time is more than 7 days, I get the HTTP Error 400. With, for example, five days (start_time = "2020-10-17T11:00:00Z", end_time = "2020-10-22T12:00:00Z") the code runs fine and returns everything for those five days.

Is this how the library is supposed to work, i.e. that the non-timeseries download cannot return more than 7 days of weather data in a single query?
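The FMI open data service caps a single observation query at roughly one week (168 hours), so longer ranges have to be fetched in consecutive chunks and merged. A minimal sketch of the chunking (the 7-day cap is taken from the behaviour described above):

```python
import datetime as dt

def time_chunks(start, end, max_span=dt.timedelta(days=7)):
    """Split [start, end] into consecutive windows no longer than max_span."""
    chunks = []
    t = start
    while t < end:
        chunks.append((t, min(t + max_span, end)))
        t = min(t + max_span, end)
    return chunks

# Each (chunk_start, chunk_end) pair can then be formatted with
# isoformat() + "Z" and passed as starttime/endtime to download_stored_query,
# merging the resulting obs.data dicts between calls.
```

This keeps each request inside the server-side limit while still covering the full range.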


Moreover, when I use the script with the timeseries data, it only returns something like the last 12 hours of weather data (72 data "steps"), no matter what I set start_time and end_time to. Example of trying to download a year's worth of weather data:

start_time = "2017-10-22T05:44:16Z"
end_time = "2018-10-22T05:44:16Z"

obs = download_stored_query("fmi::observations::weather::multipointcoverage",
                            args=["bbox=24.87,60.14,25.01,60.19",
                                  "timeseries=True"])

print(len((obs.data["Helsinki Kaisaniemi"]["times"]))) 
# --> 72
for i in range(0, len((obs.data["Helsinki Kaisaniemi"]["times"]))):
    print(obs.data["Helsinki Kaisaniemi"]["times"][i])
# --> 2020-10-22 21:50:00
# 2020-10-22 22:00:00
# 2020-10-22 22:10:00
# ... (72 steps at 10-minute intervals) ...
# 2020-10-23 09:30:00
# 2020-10-23 09:40:00

(the latter script was run today, 23.10.2020, at 12:49 (UTC/GMT+2))

What could I do so that the timeseries script actually returns the timespan I have set with start_time and end_time? @pnuu
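Worth noting: in the snippet above, start_time and end_time are defined but never passed in args, so the server falls back to its default window (apparently the latest 72 steps). A sketch of passing them explicitly alongside timeseries=True, following the argument format used elsewhere on this page:

```python
import datetime as dt

start_time = dt.datetime(2017, 10, 22, 5, 44, 16)
end_time = dt.datetime(2018, 10, 22, 5, 44, 16)

# Format as the ISO-8601 "Z"-suffixed strings the WFS expects.
args = ["bbox=24.87,60.14,25.01,60.19",
        "starttime=" + start_time.isoformat(timespec="seconds") + "Z",
        "endtime=" + end_time.isoformat(timespec="seconds") + "Z",
        "timeseries=True"]
print(args[1])
# -> starttime=2017-10-22T05:44:16Z
```

Note that the roughly one-week-per-query limit discussed above still applies, so a year of data would additionally need to be fetched in chunks.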

Mast data parsing doesn't work

Parsing of fmi::observations::weather::mast::multipointcoverage fails with

data = download_stored_query("fmi::observations::weather::mast::multipointcoverage")
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-19-2fb36c180ad2> in <module>
----> 1 data = download_stored_query("fmi::observations::weather::mast::multipointcoverage")

~/Software/fmiopendata/fmiopendata/wfs.py in download_stored_query(query_id, args)
    124         raise NotImplementedError("No parser available for %s" % query_id)
    125 
--> 126     return download_and_parse(query_id, args=args)

~/Software/fmiopendata/fmiopendata/multipoint.py in download_and_parse(query_id, args)
    162         url = url + "&" + "&".join(args)
    163     xml = read_url(url)
--> 164     return MultiPoint(xml, query_id, timeseries=timeseries)

~/Software/fmiopendata/fmiopendata/multipoint.py in __init__(self, xml, query_id, timeseries)
     48             self._parse_radionuclide()
     49         else:
---> 50             self._parse(self._xml)
     51 
     52     def _parse_radionuclide(self):

~/Software/fmiopendata/fmiopendata/multipoint.py in _parse(self, xml)
     80         longitudes = positions[1::3]
     81         times = _parse_times(xml, positions)
---> 82         measurements = _parse_measurements(xml, (len(times), len(type2obs)))
     83 
     84         if self._timeseries:

~/Software/fmiopendata/fmiopendata/multipoint.py in _parse_measurements(xml, shape)
    126 def _parse_measurements(xml, shape):
    127     measurements = np.fromstring(xml.findtext(wfs.GML_DOUBLE_OR_NIL_REASON_TUPLE_LIST), dtype=float, sep=" ")
--> 128     return np.reshape(measurements, shape)
    129 
    130 

<__array_function__ internals> in reshape(*args, **kwargs)

~/miniconda3/envs/fmiopendata/lib/python3.8/site-packages/numpy/core/fromnumeric.py in reshape(a, newshape, order)
    296            [5, 6]])
    297     """
--> 298     return _wrapfunc(a, 'reshape', newshape, order=order)
    299 
    300 

~/miniconda3/envs/fmiopendata/lib/python3.8/site-packages/numpy/core/fromnumeric.py in _wrapfunc(obj, method, *args, **kwds)
     55 
     56     try:
---> 57         return bound(*args, **kwds)
     58     except TypeError:
     59         # A TypeError occurs if the object does have such a method in its

ValueError: cannot reshape array of size 8 into shape (2,6)

half an hour time shift?

If I ask data with

from fmiopendata.wfs import download_stored_query

starttime = "2022-08-30T23:00:00Z"
endtime = "2022-08-31T04:59:59Z"

snd = download_stored_query(
    "fmi::observations::weather::sounding::multipointcoverage",
    args=[
        "place=Sodankylä",
        f"starttime={starttime}",
        f"endtime={endtime}",
    ],
)

for sounding in snd.soundings:
    print(dir(sounding))
    print(f"name {sounding.name}")  # Name of the sounding station
    print(f"nominal_time {sounding.nominal_time}")
    print(f"start_time {sounding.start_time}")
    print(f"end_time {sounding.end_time }")

    print(f"pressures {sounding.pressures }")
    print(f"temperatures {sounding.temperatures }")
    print(f"times {sounding.times }")

I receive

name Sodankylä Tähtelä
nominal_time 2022-08-31 00:00:00
start_time 2022-08-30 23:30:08
end_time 2022-08-31 00:58:32
pressures [1003.44 1001.75 1000.28 ...   27.1    27.05   27.01]
temperatures [  3.3   3.8   4.  ... -52.1 -52.1 -52.1]
times [datetime.datetime(2022, 8, 31, 0, 0)
 datetime.datetime(2022, 8, 31, 0, 0, 2)
 datetime.datetime(2022, 8, 31, 0, 0, 4) ...

Why do the times not start from 2022-08-30 23:30:08 but from 2022-08-31 00:00:00?

This seems to hold more generally: there is the same roughly half-hour shift in the times (the balloons are launched at 05:30, 11:30, 17:30 and 23:30 UTC).

Best regards, Markus

Difficulties with correct time format and received data time

Could you add more info on how to format the timestamp, please?

The service gives back data with timestamps that are 3 hours ahead of the query time, and I cannot see the logic here. My original query time is in local time; I then need to convert it to UTC (with a "Z" suffix; "+00" does not work) to receive data in local time.

A query with local time works, but the returned data has unexpected timestamps:
arglist1 = ['fmisid=874863']

time1
datetime.datetime(2020, 5, 12, 12, 12, 43)
time2
datetime.datetime(2020, 5, 12, 12, 32, 43)
arglist1.append("starttime="+time1.isoformat(sep='T',timespec='auto'))
arglist1.append("endtime="+time2.isoformat(sep='T',timespec='auto'))
arglist1 = ['fmisid=874863', 'starttime=2020-05-12T12:12:43', 'endtime=2020-05-12T12:32:43']
obs1 = download_stored_query("fmi::observations::weather::multipointcoverage",args=arglist1)
otimes1 = obs1.data.keys()
otimes1
dict_keys([datetime.datetime(2020, 5, 12, 15, 20), datetime.datetime(2020, 5, 12, 15, 30)])

A UTC timestamp works only with "Z", not with "+00:00":
utctime1 = time1.astimezone(pytz.utc)
utctime2 = utctime1 + timedelta(minutes=20)
arglist2.append("starttime="+utctime1.isoformat(sep='T',timespec='auto'))
arglist2.append("endtime="+utctime2.isoformat(sep='T',timespec='auto'))

arglist2 = ['fmisid=874863', 'starttime=2020-05-12T09:12:43+00:00', 'endtime=2020-05-12T09:32:43+00:00']
obs2 = download_stored_query("fmi::observations::weather::multipointcoverage",args=arglist2)
Traceback (most recent call last):
File "", line 1, in
File "/home/user/.local/lib/python3.8/site-packages/fmiopendata/wfs.py", line 121, in download_stored_query
return download_and_parse(query_id, args=args)
File "/home/user/.local/lib/python3.8/site-packages/fmiopendata/multipoint.py", line 106, in download_and_parse
xml = read_url(url)
File "/home/user/.local/lib/python3.8/site-packages/fmiopendata/utils.py", line 27, in read_url
with urlopen(url) as response:
File "/usr/lib64/python3.8/urllib/request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "/usr/lib64/python3.8/urllib/request.py", line 531, in open
response = meth(req, response)
File "/usr/lib64/python3.8/urllib/request.py", line 640, in http_response
response = self.parent.error(
File "/usr/lib64/python3.8/urllib/request.py", line 569, in error
return self._call_chain(*args)
File "/usr/lib64/python3.8/urllib/request.py", line 502, in _call_chain
result = func(*args)
File "/usr/lib64/python3.8/urllib/request.py", line 649, in http_error_default
raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 400: Bad Request

this works:
arglist5 = ['fmisid=874863', 'starttime=2020-05-12T09:12:43Z', 'endtime=2020-05-12T09:32:43Z']
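A formatting helper matching the observations above (a minimal sketch, not fmiopendata's API; it produces the "Z"-suffixed UTC strings the server accepts, and assumes a naive input datetime is already in UTC):

```python
import datetime as dt

def to_wfs_time(t):
    """Format a datetime as the UTC 'Z'-suffixed string the FMI WFS accepts."""
    if t.tzinfo is not None:
        # Convert aware datetimes to UTC, then drop the offset so that
        # isoformat() does not append "+00:00".
        t = t.astimezone(dt.timezone.utc).replace(tzinfo=None)
    # Naive datetimes are assumed to already be in UTC.
    return t.isoformat(timespec="seconds") + "Z"

print(to_wfs_time(dt.datetime(2020, 5, 12, 9, 12, 43)))
# -> 2020-05-12T09:12:43Z
```

Local-time input would first need its tzinfo attached (e.g. via astimezone()) before passing it in.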

The data fields are somehow mixed

I got unreasonable figures for snow depth in May, and when I compare the data for the same place and time with the "Lataa havainnot" (download observations) service, I can see that the fields are somehow mixed.

arglist
['fmisid=874863', 'starttime=2020-05-13T12:10:00', 'endtime=2020-05-13T12:40:00']
obs = download_stored_query("fmi::observations::weather::multipointcoverage",args=arglist)
obs.data
{datetime.datetime(2020, 5, 13, 12, 10): {'Espoo Tapiola': {'Relative humidity': {'value': 45791.0, 'units': '%'}, 'Precipitation amount': {'value': 164.0, 'units': 'mm'}, 'Dew-point temperature': {'value': 0.0, 'units': 'degC'}, 'Air temperature': {'value': 4.1, 'units': 'degC'}, 'Present weather (auto)': {'value': 1.9, 'units': ''}, 'Gust speed': {'value': 0.0, 'units': 'm/s'}, 'Pressure (msl)': {'value': 1003.9, 'units': 'hPa'}, 'Cloud amount': {'value': 4.3, 'units': '1/8'}, 'Precipitation intensity': {'value': 7.0, 'units': 'mm/h'}, 'Wind speed': {'value': 1.4, 'units': 'm/s'}, 'Horizontal visibility': {'value': 81.0, 'units': 'm'}, 'Wind direction': {'value': nan, 'units': 'deg'}, 'Snow depth': {'value': 81.0, 'units': 'cm'}}}, datetime.datetime(2020, 5, 13, 12, 30): {'Espoo Tapiola': {'Relative humidity': {'value': 42893.0, 'units': '%'}, 'Precipitation amount': {'value': 171.0, 'units': 'mm'}, 'Dew-point temperature': {'value': 0.0, 'units': 'degC'}, 'Air temperature': {'value': 5.5, 'units': 'degC'}, 'Present weather (auto)': {'value': 3.1, 'units': ''}, 'Gust speed': {'value': 0.0, 'units': 'm/s'}, 'Pressure (msl)': {'value': 1003.8, 'units': 'hPa'}, 'Cloud amount': {'value': 4.2, 'units': '1/8'}, 'Precipitation intensity': {'value': 7.0, 'units': 'mm/h'}, 'Wind speed': {'value': 1.1, 'units': 'm/s'}, 'Horizontal visibility': {'value': 81.0, 'units': 'm'}, 'Wind direction': {'value': nan, 'units': 'deg'}, 'Snow depth': {'value': 80.0, 'units': 'cm'}}}, datetime.datetime(2020, 5, 13, 12, 20): {'Espoo Tapiola': {'Relative humidity': {'value': 40083.0, 'units': '%'}, 'Precipitation amount': {'value': 182.0, 'units': 'mm'}, 'Dew-point temperature': {'value': 0.0, 'units': 'degC'}, 'Air temperature': {'value': 4.5, 'units': 'degC'}, 'Present weather (auto)': {'value': 2.5, 'units': ''}, 'Gust speed': {'value': 0.0, 'units': 'm/s'}, 'Pressure (msl)': {'value': 1003.9, 'units': 'hPa'}, 'Cloud amount': {'value': 4.3, 'units': '1/8'}, 'Precipitation intensity': 
{'value': 7.0, 'units': 'mm/h'}, 'Wind speed': {'value': 1.3, 'units': 'm/s'}, 'Horizontal visibility': {'value': 81.0, 'units': 'm'}, 'Wind direction': {'value': nan, 'units': 'deg'}, 'Snow depth': {'value': 81.0, 'units': 'cm'}}}, datetime.datetime(2020, 5, 13, 12, 40): {'Espoo Tapiola': {'Relative humidity': {'value': 40529.0, 'units': '%'}, 'Precipitation amount': {'value': 149.0, 'units': 'mm'}, 'Dew-point temperature': {'value': 0.0, 'units': 'degC'}, 'Air temperature': {'value': 4.7, 'units': 'degC'}, 'Present weather (auto)': {'value': 2.8, 'units': ''}, 'Gust speed': {'value': 0.0, 'units': 'm/s'}, 'Pressure (msl)': {'value': 1003.8, 'units': 'hPa'}, 'Cloud amount': {'value': 5.0, 'units': '1/8'}, 'Precipitation intensity': {'value': 7.0, 'units': 'mm/h'}, 'Wind speed': {'value': 1.1, 'units': 'm/s'}, 'Horizontal visibility': {'value': 23.0, 'units': 'm'}, 'Wind direction': {'value': nan, 'units': 'deg'}, 'Snow depth': {'value': 76.0, 'units': 'cm'}}}}

The keys have been mixed. The values are the same as from https://cdn.fmi.fi/fmiodata-convert-api/

Thanks for help!

Add requests module to "install_requires"

Hi, a fresh install of fmiopendata threw an error for the missing requests module (utils.py); it might be added as a requirement in setup.py. I was not using Miniconda.

How to parse netcdf to use forecast::enfuser::airquality?

Hi - I hope it's OK to use this for support requests. I'd need to use the ENFUSER Air Quality data, which is available as a grid. Am doing model_data = download_stored_query("fmi::forecast::enfuser::airquality::helsinki-metropolitan::grid")

However, it is apparently returned as NetCDF (I don't yet know that format), and parsing it is not implemented, so I get:

fmiopendata\grid.py", line 69, in parse
raise NotImplementedError("No parser for %s" % format)
NotImplementedError: No parser for format=netcdf

How should I go about getting the data?

Can I just get it without real parsing? Or could I pass it to an external parser? There seems to be some Python NetCDF code at http://www.umr-cnrm.fr/gmapdoc/meshtml/EPYGRAM1.0.0/_modules/epygram/formats/netCDF.html

Or would it be better to add the support to fmiopendata somehow? Sorry for being clueless, but I think this can be solved somehow; some guidance would be welcome, as I'm not familiar with the format or this library.

TypeError in downloading and parsing the lightning data

Hello,
I'm getting the error TypeError: a bytes-like object is required, not 'NoneType' when trying to execute the script for downloading and parsing the lightning data, straight from the GitHub page, in Azure Databricks (Cluster Mode: Standard; Databricks Runtime: 6.5 ML, which includes Apache Spark 2.4.5, Scala 2.11 and Python 3).

This script:
import fmiopendata

from fmiopendata.wfs import download_stored_query
lightning1 = download_stored_query("fmi::observations::lightning::multipointcoverage")

lightning1.latitudes # Latitude of the lightning event [° North]
lightning1.longitudes # Longitude of the lightning event [° East]
lightning1.times # Time of the lightning event [datetime]
lightning1.cloud_indicator # Indicator for cloud flashes (1 == cloud lightning)
lightning1.multiplicity # Multiplicity of the lightning event
lightning1.peak_current # Maximum current of the lightning event [kA]
lightning1.ellipse_major # Location accuracy of the lightning event [km]

returns the following error:

TypeError: a bytes-like object is required, not 'NoneType'

TypeError Traceback (most recent call last)
in
4 # Lightning data
5 from fmiopendata.wfs import download_stored_query
----> 6 lightning1 = download_stored_query("fmi::observations::lightning::multipointcoverage")
7
8

/databricks/python/lib/python3.7/site-packages/fmiopendata/wfs.py in download_stored_query(query_id, args)
124 raise NotImplementedError("No parser available for %s" % query_id)
125
--> 126 return download_and_parse(query_id, args=args)

/databricks/python/lib/python3.7/site-packages/fmiopendata/lightning.py in download_and_parse(query_id, args)
110 xml = read_url(url)
111 mode = query_id.split("::")[-1]
--> 112 return Lightning(xml, mode)

/databricks/python/lib/python3.7/site-packages/fmiopendata/lightning.py in __init__(self, xml, mode)
     47             self.parse_simple()
     48         elif mode == "multipointcoverage":
---> 49             self.parse_multipoint()
     50         else:
     51             raise NotImplementedError("No parser for %s" % mode)

/databricks/python/lib/python3.7/site-packages/fmiopendata/lightning.py in parse_multipoint(self)
     90 
     91         """
---> 92         positions = np.fromstring(self.xml.findtext(wfs.GMLCOV_POSITIONS), dtype=float, sep=" ")
     93         self.latitudes = positions[::3]
     94         self.longitudes = positions[1::3]

TypeError: a bytes-like object is required, not 'NoneType'

If I try to run the "simple" version of the lightning script, the following code only returns empty results:

import fmiopendata

from fmiopendata.wfs import download_stored_query
lightning2 = download_stored_query("fmi::observations::lightning::simple")

print(lightning2.latitudes) # Latitude of the lightning event [° North]
print(lightning2.longitudes) # Longitude of the lightning event [° East]
print(lightning2.times) # Time of the lightning event [datetime]
print(lightning2.cloud_indicator) # Indicator for cloud flashes (1 == cloud lightning)
print(lightning2.multiplicity) # Multiplicity of the lightning event
print(lightning2.peak_current) # Maximum current of the lightning event [kA]
print(lightning2.ellipse_major) # Location accuracy of the lightning event [km]

The weird thing is that I have gotten the lightning1 script to work a couple of times, but I have no idea why it sometimes works and sometimes doesn't.

@pnuu

Download and calibrate latest radar reflectivity (dBZ) composite fails

I'm trying to run the example code under the paragraph Download and calibrate latest radar reflectivity (dBZ) composite.

The command composite.download() fails with the following error message:

Traceback (most recent call last):
  File "rasterio\_base.pyx", line 310, in rasterio._base.DatasetBase.__init__
  File "rasterio\_base.pyx", line 221, in rasterio._base.open_dataset
  File "rasterio\_err.pyx", line 221, in rasterio._err.exc_wrap_pointer
rasterio._err.CPLE_OpenFailedError: C:/Users/xxxx/AppData/Local/Temp/tmp42opk8v9: file used by other process

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<pyshell#3>", line 1, in <module>
    composite.download()
  File "C:\Users\xxxx\AppData\Local\Programs\Python\Python311\Lib\site-packages\fmiopendata\radar.py", line 77, in download
    img = rasterio.open(fid.name)
  File "C:\Users\xxxx\AppData\Local\Programs\Python\Python311\Lib\site-packages\rasterio\env.py", line 451, in wrapper
    return f(*args, **kwds)
  File "C:\Users\xxxx\AppData\Local\Programs\Python\Python311\Lib\site-packages\rasterio\__init__.py", line 320, in open
    dataset = DatasetReader(path, driver=driver, sharing=sharing, **kwargs)
  File "rasterio\_base.pyx", line 312, in rasterio._base.DatasetBase.__init__
rasterio.errors.RasterioIOError: C:/Users/xxxx/AppData/Local/Temp/tmp42opk8v9: file used by other process

Any ideas why the example code is not working? I'm running Windows 10 and Python 3.11.
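This looks like the Windows file-locking behaviour of NamedTemporaryFile: the temporary file is still held open when radar.py asks rasterio to reopen it by name, and Windows refuses the second open. A sketch of the pattern that avoids the lock (dummy bytes stand in for the real download; this is not fmiopendata's actual code):

```python
import os
import tempfile

# On Windows, a NamedTemporaryFile cannot normally be opened a second time
# by name while the first handle is still open, which matches the
# "file used by other process" error from rasterio.open(fid.name).
fid = tempfile.NamedTemporaryFile(suffix=".tif", delete=False)
fid.write(b"dummy bytes standing in for the downloaded GeoTIFF")
fid.close()  # release the handle (and the Windows lock) before reopening

# Another library (e.g. rasterio) can now reopen fid.name safely.
with open(fid.name, "rb") as f:
    data = f.read()

os.unlink(fid.name)  # clean up manually, since delete=False was used
```

With delete=False plus an explicit close() before the reopen, the same code works on both Windows and POSIX systems.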

The call crashes when there is no data available

print(arglist)
['fmisid=101004', 'starttime=2019-12-10T09:20:00Z', 'endtime=2019-12-10T09:50:00Z']

obs = download_stored_query("fmi::observations::weather::multipointcoverage",args=arglist)
Traceback (most recent call last):
File "", line 1, in
File "/home/pli/mypython38venv/lib/python3.8/site-packages/fmiopendata/wfs.py", line 123, in download_stored_query
return download_and_parse(query_id, args=args)
File "/home/pli/mypython38venv/lib/python3.8/site-packages/fmiopendata/multipoint.py", line 107, in download_and_parse
return MultiPoint(xml, query_id)
File "/home/pli/mypython38venv/lib/python3.8/site-packages/fmiopendata/multipoint.py", line 45, in init
self._parse(self._xml)
File "/home/pli/mypython38venv/lib/python3.8/site-packages/fmiopendata/multipoint.py", line 65, in _parse
positions = np.fromstring(xml.findtext(wfs.GMLCOV_POSITIONS), dtype=float, sep=" ")
TypeError: a bytes-like object is required, not 'NoneType'

When I load the data using the FMI web page, everything is missing, only '-' for the observations. I would prefer NaN or '-' here over a crash.

I tried to load Kaisaniemi weather for the same time period; it looks OK in the FMI web page, but it crashes with my Python query, so maybe it is something other than a no-data problem.

This seems to have been a temporary problem; the queries work now.
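Until the library guards against empty responses, the shape of a fix is simple: treat a missing positions element as an empty array instead of handing None to the parser. A sketch (parse_positions is a hypothetical helper, not fmiopendata's API; it also avoids the deprecated np.fromstring):

```python
import numpy as np

def parse_positions(text):
    """Parse a space-separated positions block, tolerating a missing element.

    findtext() returns None when the element is absent, i.e. when the server
    sent no observations for the requested window.
    """
    if text is None:
        # Empty response: return an empty array instead of crashing.
        return np.array([], dtype=float)
    return np.array(text.split(), dtype=float)
```

Downstream code would then see empty latitude/longitude/time arrays for a no-data window rather than a TypeError.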

README missing import datetime

Hi,
if I understood correctly, your (README ) examples are missing an import.
I'm talking about this part:

Download and parse observation data

from fmiopendata.wfs import download_stored_query

# Retrieve the latest hour of data from a bounding box
end_time = dt.datetime.utcnow()
start_time = end_time - dt.timedelta(hours=1)
# Convert times to properly formatted strings
start_time = start_time.isoformat(timespec="seconds") + "Z"
# -> 2020-07-07T12:00:00Z
end_time = end_time.isoformat(timespec="seconds") + "Z"
# -> 2020-07-07T13:00:00Z

obs = download_stored_query("fmi::observations::weather::multipointcoverage",
                            args=["bbox=18,55,35,75",
                                  "starttime=" + start_time,
                                  "endtime=" + end_time])

It only seems to work if I add import datetime as dt on top of it.
Am I correct about this?

@pnuu

Pandas interface for observation timeseries

Having the data output as a flat pandas DataFrame would greatly improve the usability and versatility of the open data output.

Example of turning a weather observations multipointcoverage result into a DataFrame:

import pandas as pd

rows = []
for date in obs.data.keys():
    for loc in obs.data[date].keys():
        row = obs.data[date][loc].copy()
        for key in row.keys():
            row[key] = row[key]['value']
        row['date'] = date
        row['location'] = loc
        rows.append(row)

df = pd.DataFrame(rows)
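Self-contained with mock data in the same nested shape ({time: {station: {variable: {'value': ..., 'units': ...}}}}; values here are illustrative), the flattening looks like:

```python
import datetime as dt
import pandas as pd

# Mock data mimicking the nested structure of obs.data.
obs_data = {
    dt.datetime(2020, 5, 13, 12, 10): {
        "Espoo Tapiola": {"Air temperature": {"value": 4.1, "units": "degC"}},
    },
    dt.datetime(2020, 5, 13, 12, 20): {
        "Espoo Tapiola": {"Air temperature": {"value": 4.5, "units": "degC"}},
    },
}

rows = []
for date, stations in obs_data.items():
    for loc, variables in stations.items():
        # Keep only the numeric value of each measurement.
        row = {name: meas["value"] for name, meas in variables.items()}
        row["date"] = date
        row["location"] = loc
        rows.append(row)

df = pd.DataFrame(rows).set_index("date").sort_index()
print(df)
```

Indexing by time and sorting makes the result directly usable for resampling, plotting, and joins.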

example does not work?

In Examples, in the section Download and parse grid data, the copy-pasted example code below produces just an empty dictionary. I also failed to download any grid data. Any suggestions?
Best regards, Markus

from fmiopendata.wfs import download_stored_query

model_data = download_stored_query("fmi::forecast::harmonie::surface::grid",
                                   args=["starttime=2020-07-06T18:00:00Z",
                                         "endtime=2020-07-06T20:00:00Z",
                                         "bbox=18,55,35,75"])
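One likely cause (an assumption, not confirmed): forecast files are only kept for recent model runs, so the hard-coded 2020 timestamps no longer match anything and the result is empty. Requesting a window relative to the current time avoids stale timestamps:

```python
import datetime as dt

# Build a forecast request window relative to "now" instead of
# hard-coded dates from 2020.
start = dt.datetime.utcnow().replace(microsecond=0)
end = start + dt.timedelta(hours=2)
args = ["starttime=" + start.isoformat() + "Z",
        "endtime=" + end.isoformat() + "Z",
        "bbox=18,55,35,75"]
print(args)
```

These args can then be passed to download_stored_query as in the example above.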

grid data fails due to "AttributeError: module 'eccodes' has no attribute 'GribFile'"

File "/home/hu-mka/miniconda3/envs/fmisaa/lib/python3.9/site-packages/fmiopendata/grid.py", line 80, in _parse_grib
   with eccodes.GribFile(self._fname) as grib:
AttributeError: module 'eccodes' has no attribute 'GribFile'

I have installed eccodes by

conda create -n fmisaa python=3.9
conda activate fmisaa
pip install fmiopendata
conda install -c conda-forge eccodes
pip install eccodes
conda install -c conda-forge cfgrib

Some sort of self check passes (from https://github.com/ecmwf/cfgrib#binary-dependencies )

python -m cfgrib selfcheck
Found: ecCodes v2.27.0.
Your system is ready.

The code that produced the error (below) tries to read 4-D gridded data. I guess there is no NetCDF option for the data?
A description of how to set up a working environment for this task would be appreciated.
Best regards, Markus

from fmiopendata.wfs import download_stored_query
import datetime

starttime = datetime.datetime.utcnow() + datetime.timedelta(hours=10)
endtime = starttime + datetime.timedelta(hours=50)
starttime = starttime.strftime("%Y-%m-%dT%H:%M:%SZ")
endtime = endtime.strftime("%Y-%m-%dT%H:%M:%SZ")

print(f"starttime {starttime}")
print(f"endtime {endtime}")

data = download_stored_query(
    "fmi::forecast::harmonie::surface::grid",
    args=[
        f"starttime={starttime}",
        f"endtime={endtime}",
        "bbox=25,60,26,61",
    ],
)

latest = max(data.data.keys())
data = data.data[latest]
print(f"downloading data for {latest}")
data.parse(delete=True)

Available parameters for fmiopendata.wfs download_stored_query

Hi,

Love this package. Thanks!

  1. Is there a list of available parameters for monthly observations?

  2. Can you request more than one parameter per query?

I can do this:

obs = download_stored_query('fmi::observations::weather::monthly::multipointcoverage',
                            args=['bbox=12,55,35,75',
                                  'starttime=2018-01-01',
                                  'endtime=2019-12-31',
                                  'timeseries=True',
                                  'parameters=rrmon'])

to get monthly precip. But I also want mean monthly temperature. Can I get both of them in one query?

Thanks,

David
GPM Deputy Project Scientist for Ground Validation
NASA Wallops Flight Facility
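For what it's worth, the WFS parameters argument generally accepts a comma-separated list, so both quantities can be requested in one query. A sketch (rrmon is taken from the question above; "tmon" as the monthly mean temperature parameter name is an assumption worth verifying against the FMI parameter listing):

```python
# Requesting several parameters in one query via a comma-separated list.
# "tmon" (monthly mean temperature) is an assumed parameter name; the
# authoritative list is in the FMI open data parameter documentation.
wanted = ["rrmon", "tmon"]
args = ["bbox=12,55,35,75",
        "starttime=2018-01-01",
        "endtime=2019-12-31",
        "timeseries=True",
        "parameters=" + ",".join(wanted)]
print(args[-1])
# -> parameters=rrmon,tmon
```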

Re-run examples and update README accordingly

There seem to be a few updates in the data shown in the README examples, and at least one typo. All the example commands should be re-run and the README text updated accordingly.

While at it, update the wfs.md file to correspond to the current state of the available data on FMI open data.

download_and_parse() alters args list

download_and_parse(query_id, args=None) removes "timeseries=True" from args if that item is in the list:

args.remove("timeseries=True")

This can lead to confusion if a developer reuses the args list in subsequent queries (for example, when requesting more than 168 hours of data).

I suggest making a copy of the list, something like this:

    if args is None:
        args = []
    else:
        args = args.copy()
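The aliasing problem and the proposed fix can be demonstrated without the library (the two functions below are mocks standing in for download_and_parse, not fmiopendata code):

```python
def parse_args_mutating(args=None):
    """Mimics the current behaviour: mutates the caller's list in place."""
    if args is None:
        args = []
    if "timeseries=True" in args:
        args.remove("timeseries=True")
    return args

def parse_args_copying(args=None):
    """Proposed fix: work on a copy so the caller's list is untouched."""
    args = [] if args is None else args.copy()
    if "timeseries=True" in args:
        args.remove("timeseries=True")
    return args

caller_args = ["bbox=18,55,35,75", "timeseries=True"]
parse_args_mutating(caller_args)
print(caller_args)   # the flag is gone: ['bbox=18,55,35,75']

caller_args = ["bbox=18,55,35,75", "timeseries=True"]
parse_args_copying(caller_args)
print(caller_args)   # unchanged: ['bbox=18,55,35,75', 'timeseries=True']
```

With the copying variant, a caller can safely reuse the same args list across chunked queries.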

Add support to the FMI timeseries endpoint, which provides CSV, JSON and XML formats

No dropsonde data?

Hi,
I'm trying to download dropsonde data from Jokioinen for 11.11.2020. The soundings data comes through fine, but I cannot get any dropsonde data.

import fmiopendata
fmiopendata.__version__
'0.3.4'

Would it be possible to get a working example of fetching dropsonde data? I tried, for example, with the code below:

from datetime import datetime
from fmiopendata.wfs import download_stored_query

start_time = datetime(year=2020, month=1, day=11, hour=11, minute=24, second=36)
end_time = datetime(year=2020, month=12, day=13, hour=15, minute=43, second=57)
# Convert times to properly formatted strings
start_time = start_time.isoformat(timespec="seconds") + "Z"
# -> 2020-07-07T12:00:00Z
end_time = end_time.isoformat(timespec="seconds") + "Z"
snd = download_stored_query("fmi::observations::weather::dropsonde::multipointcoverage",
                            args=["starttime=" + start_time,
                                  "endtime=" + end_time]
                            )

and the result is

/home/hu-mka/venvs/fmi/bin/python3 /home/hu-mka/git/fmi/fmi/demo_get_dropsonde_data.py
/home/hu-mka/venvs/fmi/lib/python3.8/site-packages/fmiopendata/utils.py:43: UserWarning: 

FMI servers responded with the following errors:

 - No handler for 'fmi::observations::weather::dropsondes::multipointcoverage' found!
 - URI: /wfs?endtime=2020-12-13T15%3A43%3A57Z&request=getFeature&service=WFS&starttime=2020-01-11T11%3A24%3A36Z&storedquery_id=fmi%3A%3Aobservations%3A%3Aweather%3A%3Adropsondes%3A%3Amultipointcoverage&version=2.0.0

  warnings.warn(exception_text)
No observations found

Process finished with exit code 0

Add more details to WFS html listing

There are more details available that could be added for the available datasets (StoredQueries) in the WFS info listing. A few that I've noticed at a quick glance are:

  • calibration coefficients
  • calibration function (e.g. y = ax + b)
  • calibrated units (e.g. mm)

download_stored_query & timeseries - KeyError

Hi, you seem to have a typo in your manual in the following part:

"It is also possible to collect the data to a structure more usable for timeseries analysis by adding "timeseries=True" to the arguments:

from fmiopendata.wfs import download_stored_query

obs = download_stored_query("fmi::observations::weather::multipointcoverage",
                            args=["bbox=25,60,25.5,60.5",
                                  "timeseries=True"])"

Here, "timeseries" should be "Timeseries" with a capital T.
If I write "timeseries" in my code as follows:

import datetime as dt
from fmiopendata.wfs import download_stored_query

end_time = dt.datetime.utcnow()
start_time = end_time - dt.timedelta(days=2)
start_time = start_time.isoformat(timespec="seconds") + "Z"
end_time = end_time.isoformat(timespec="seconds") + "Z"

obs = download_stored_query("fmi::observations::weather::multipointcoverage",
                            args=["bbox=24.88,60.14,25.01,60.19",
                                  "timeseries=True"])

latest_tstep = max(obs.data.keys())
print(sorted(obs.data[latest_tstep].keys()))
print(sorted(obs.data[latest_tstep]['Helsinki Kaisaniemi'].keys()))

it throws the following error: print(sorted(obs.data[latest_tstep]['Helsinki Kaisaniemi'].keys())) → KeyError: 'Helsinki Kaisaniemi'
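A likely explanation (inferred from the timeseries example earlier on this page, where obs.data["Helsinki Kaisaniemi"]["times"] is accessed directly): with timeseries=True the top-level keys of obs.data are station names rather than timestamps, so indexing first by time step raises the KeyError. With mock data (variable names and list layout are illustrative):

```python
import datetime as dt

# Without timeseries=True: {time: {station: {...}}}
plain = {dt.datetime(2020, 10, 23, 9, 40): {"Helsinki Kaisaniemi": {"t2m": 4.1}}}
latest = max(plain.keys())
print(sorted(plain[latest].keys()))             # ['Helsinki Kaisaniemi']

# With timeseries=True: {station: {"times": [...], variable: [...]}}
ts = {"Helsinki Kaisaniemi": {"times": [dt.datetime(2020, 10, 23, 9, 40)],
                              "t2m": [4.1]}}
print(sorted(ts["Helsinki Kaisaniemi"].keys()))  # ['t2m', 'times']
```

So the manual's lowercase "timeseries=True" may well be correct, and the code just needs to index by station name first.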

Possible bug when fetching data before January 1st 1970

The program uses the function utcfromtimestamp, which for some reason cannot convert negative seconds into dates (i.e. dates before January 1st, 1970).

When I tried to fetch daily data as a time series from the 1960s using fmi::observations::weather::daily::multipointcoverage, the program crashed at multipoint.py line 120. The conversion from seconds is done the same way in other files, so they might need fixing too.

Original line 120 in multipoint.py:
times = np.array([dt.datetime.utcfromtimestamp(t) for t in positions[2::3]])

Possible alternative to do the conversion without crashing the program:
times = np.array([dt.datetime(1970, 1, 1) + dt.timedelta(seconds=t) for t in positions[2::3]])
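The suggested workaround can be checked directly; timedelta arithmetic handles negative offsets fine:

```python
import datetime as dt

def from_epoch_seconds(t):
    """Convert seconds since the Unix epoch to a naive UTC datetime.

    Unlike datetime.utcfromtimestamp, this also works for negative values
    (pre-1970 dates) on platforms whose C library rejects them.
    """
    return dt.datetime(1970, 1, 1) + dt.timedelta(seconds=t)

print(from_epoch_seconds(-86400))   # 1969-12-31 00:00:00
print(from_epoch_seconds(0))        # 1970-01-01 00:00:00
```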


Fix deprecation warnings in WFS XML handling

The use of elem.getchildren() has been deprecated, and these calls should be fixed in the fmiopendata.wfs module:

fmiopendata/tests/test_wfs.py::test_get_stored_query_descriptions
  /home/pnuu/Software/fmiopendata/fmiopendata/wfs.py:100: DeprecationWarning: This method will be removed in future versions.  Use 'list(elem)' or iteration over elem instead.
    for f in root.getchildren():

fmiopendata/tests/test_wfs.py: 141 warnings
  /home/pnuu/Software/fmiopendata/fmiopendata/wfs.py:103: DeprecationWarning: This method will be removed in future versions.  Use 'list(elem)' or iteration over elem instead.
    f_ch = f.getchildren()
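The fix the warning suggests is mechanical: an Element is directly iterable, so list(elem) or a plain for loop replaces getchildren() (which was removed entirely in Python 3.9). A self-contained sketch:

```python
import xml.etree.ElementTree as ET

root = ET.fromstring("<root><a/><b/><c/></root>")

# Deprecated/removed: root.getchildren()
# Replacement: iterate the element itself, or materialize with list().
children = list(root)
print([child.tag for child in children])   # ['a', 'b', 'c']

for child in root:   # direct iteration works the same way
    pass
```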
