openclimatefix / uk-pv-national-gsp-api
API for hosting nowcasting solar predictions
License: MIT License
Based on our design review earlier this week, we should address the following comments:
- effective_time etc. - what is this called in the meteorology world? target_time?
- inputDataAge:
  {
    "pv": "2009-10-31T01:48:52Z",
    "satellite": "2009-10-31T01:48:52Z"
  }
- Include gb in the URL, e.g. /forecasts/gb/national, and also pv as a part of the URL. So the route might be v0/forecast/gb/pv/[national,gsp]
- Rename forecasted_values to forecast_values

Also see #24.
Change the zeros to random numbers, with a lower limit of zero, using numpy.
It would be useful to be able to plot these numbers - a plot of zeros is very dull.
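A minimal sketch of this using numpy (the mean, spread and array size are made-up placeholder values):

```python
import numpy as np

# Dummy forecast values: random, but clipped so the lower limit is zero.
values = np.clip(np.random.normal(loc=500, scale=200, size=48), a_min=0, a_max=None)
```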
Set up FastAPI with no actual routes yet.
https://fastapi.tiangolo.com/
Also worth updating the readme file on how to run the API locally.
Get GSP capacity details, so we can show the % of power being generated, and provide a list of GSP systems.
Need to add capacity to GSP systems in the data model, and update the GSP consumer to load capacity.
Add a few tests to make sure 'floor_30_minutes_dt' is working correctly.
This is needed in order to find the rounded-down half hour when querying the API for the latest forecast.
See the other tests for the structure to follow.
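A sketch of the helper under test and the kind of assertion the tests could make (the actual implementation may differ):

```python
from datetime import datetime

def floor_30_minutes_dt(dt: datetime) -> datetime:
    """Round a datetime down to the previous half hour."""
    return dt.replace(minute=(dt.minute // 30) * 30, second=0, microsecond=0)

# e.g. 13:47 rounds down to 13:30
example = floor_30_minutes_dt(datetime(2022, 1, 1, 13, 47))
```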
The current path is /v0/GB/solar/gsp/truth/one_gsp/{gsp_id}/.
But this is more of an estimate than the truth, so it should change to 'v0/GB/solar/gsp/sheffield_solar_estimate/one_gsp/{gsp_id}'.
Good to have a status GET method.
This will be handy for displaying to the user at some point.
We could have the following. This would involve making a new StatusSQL table and a Status pydantic model:

class StatusSQL(CreatedMixin):
    """SQL table for the API status."""

    __tablename__ = "status"
    id = Column(Integer, primary_key=True)
    status = Column(String)
    message = Column(String)

class Status(EnhancedBaseModel):
    """API status model."""

    status: str = Field(
        ...,
        description="The status of the API, e.g. 'ok'",
    )
    message: str = Field(
        ..., description="A message giving more detail about the status"
    )

Plus validation that status is one of 'ok', 'warning', 'error'.
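A minimal sketch of that validation using plain pydantic (the base class and field descriptions here are stand-ins, not the final wording):

```python
from pydantic import BaseModel, Field, validator

class Status(BaseModel):
    """API status model with the proposed validation."""

    status: str = Field(..., description="Status level: 'ok', 'warning' or 'error'")
    message: str = Field(..., description="A message giving more detail about the status")

    @validator("status")
    def status_must_be_known(cls, v):
        # Only allow the three agreed status levels
        if v not in ("ok", "warning", "error"):
            raise ValueError(f"unknown status: {v}")
        return v

ok = Status(status="ok", message="all good")
```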
Tests should be added too.
Deploy this API to AWS using terraform.
Would be great to have terraform to deploy this app to AWS.
As part of the 'Nowcasting' project we want to be able to deploy this automatically using terraform.
Currently our API returns snake_case in the JSON responses.
It's more common to use camelCase in JSON though, e.g. see Google's guidance on that.
In our shape definitions using pydantic, however, we want them to be snake_case, as that is common for Python.
There seems to be a possible fix via the approach outlined here: https://medium.com/analytics-vidhya/camel-case-models-with-fast-api-and-pydantic-5a8acb6c0eee
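A sketch of the alias-generator approach from the linked article (the field names here are just examples, not our real schema):

```python
from pydantic import BaseModel

def to_camel(string: str) -> str:
    # snake_case -> camelCase, e.g. "target_time" -> "targetTime"
    first, *rest = string.split("_")
    return first + "".join(word.capitalize() for word in rest)

class ForecastValue(BaseModel):
    """Fields stay snake_case in Python; JSON aliases are camelCase."""

    target_time: str
    expected_power_generation_megawatts: float

    class Config:
        alias_generator = to_camel

fv = ForecastValue(targetTime="2022-01-01T00:00:00Z", expectedPowerGenerationMegawatts=1.5)
```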
Add API security.
This can be done with Auth0. The Nowcasting App will need updating, as it will need to pull some credentials when calling the API.
TODO: add Auth0 reference docs.
Would be great to have a coverage badge on the main page
We need to expose the GSP GIS Boundaries (public domain) via our API, to make sure we are always using the same version of it across the application.
If not already in that shape, they should be exposed in the GeoJSON format, in line with our other endpoints returning JSON as well.
A possible endpoint for this could be:
GET /shapes/GB/gsp
Coveralls says that we "only" have 99% coverage, because in the function datetime_must_have_timezone the branch that raises the ValueError isn't covered.
For nothing but fun, it would be nice to write a test case to cover that branch as well and thus reach 100% coverage. 😊
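A sketch of the validator in question and the uncovered branch (the exact signature in the codebase is an assumption):

```python
from datetime import datetime, timezone

def datetime_must_have_timezone(v: datetime) -> datetime:
    """Raise ValueError if the datetime has no timezone."""
    if v.tzinfo is None:
        raise ValueError(f"{v} must have a timezone")
    return v

# The uncovered branch: a naive datetime should raise ValueError
try:
    datetime_must_have_timezone(datetime(2022, 1, 1))
    raised = False
except ValueError:
    raised = True
```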
Think about how to access the API securely.
https://fastapi.tiangolo.com/tutorial/security/
This could be done with
For the moment return dummy values
More details to follow here - @flowirtz
See openclimatefix/.github#6 (comment):
We mostly use flake8, isort and black on our repos, and we also use pre-commit hooks:
https://github.com/openclimatefix/nowcasting_dataloader/blob/main/.pre-commit-config.yaml
Load latest national forecast
linked with openclimatefix/nowcasting_forecast#20
Expose GSP level data.
Good for the front end to have access to this data.
This could be done through Sheffield Solar, but the intermediate PV values are not stored.
Get data from yesterday 00:00; don't worry about time searching.
Would be nice to have logging on when the routes are hit and with what variables. Would be great if the logs are piped through to AWS CloudWatch - I think this happens automatically if set up in the correct way.
On release to main, push the docker image to Docker Hub.
The docker image can then be used for deployment.
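The release step would boil down to something like this (the image name and tag are assumptions, not the agreed naming):

```shell
# Hypothetical sketch of the release step run by CI
docker build -t openclimatefix/nowcasting_api:latest .
docker push openclimatefix/nowcasting_api:latest
```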
Currently /v0/GB/solar/gsp/forecast/all with normalize == True takes ~10 seconds. This is too long.
Would be good for the FE if it were ~1 second.
The slow part is the looping in the pydantic model to normalize the data. We could:
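One option is to normalise with numpy in a single vectorised step instead of looping over pydantic objects; a sketch with made-up values:

```python
import numpy as np

# Hypothetical forecast values (MW) and a GSP capacity (MW)
expected_mw = np.array([120.0, 250.0, 0.0, 310.0])
capacity_mw = 400.0

# One vectorised division replaces the per-object loop
normalized_values = expected_mw / capacity_mw
```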
After reviewing, we shall go for this:
/v0/solar/GB/national/forecast/
/v0/solar/GB/national/pvlive/
/v0/solar/GB/gsp/forecast/all
/v0/solar/GB/gsp/forecast/{gsp_id}
/v0/solar/GB/gsp/forecast/{gsp_id}/{only_values} or other filter parameters
/v0/solar/GB/gsp/pvlive/{gsp_id}
/v0/system/GB/gsp/boundaries
/v0/system/GB/gsp/
TODOs (in the future)
/v0/wind/GB/
/v0/solar/site/
/v0/IT/
app.include_router(national_router, prefix=f"{v0_route}/national")
'/forecast/national' --> 'forecast'
'/pvlive/national' --> 'pvlive'
new route prefix is /v0/solar/GB/national
Split system functions into separate file and rename routes ( + update tests)
/v0/system/GB/gsp/boundaries
/v0/system/GB/gsp/
new route prefix is /v0/system/GB/
'/forecast/all/' --> 'forecast/all' (same)
'/forecast/one_gsp/{gsp_id}' --> 'forecast/{gsp_id}'
'/pvlive/one_gsp/{gsp_id}' --> 'pvlive/{gsp_id}'
same route prefix is /v0/solar/GB/gsp
The current docker image is ~600MB on Docker Hub.
It would be great to reduce this if possible.
Good to keep docker images small.
Perhaps don't install packages that aren't used. Might have to do something in nowcasting_forecast.
/v0/forecasts/GB/pv/gsp_boundaries
currently returns the response as a string.
It would be better to return it as JSON instead.
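The fix amounts to parsing the serialised string back into a structure before returning it, so the framework emits real JSON; a sketch with a stand-in payload:

```python
import json

# Hypothetical stand-in for the boundaries currently returned as a string
boundaries_str = '{"type": "FeatureCollection", "features": []}'

# Parse it back into a dict so the response body is real JSON, not a quoted string
boundaries = json.loads(boundaries_str)
```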
GSP history, only get data from yesterday, not everything
see openclimatefix/quartz-frontend#48 for graph
Add a docker file so this app can run.
Add an automatic GitHub action that builds the docker image on every push (move to #7).
This will be useful in order to deploy it.
add favicon to this api.
This might help - https://stackoverflow.com/questions/68786428/setting-favicon-for-fastapi
Add a route to expose the PV API data, so the front end can plot it.
Due to this bug we had to pin opencv (opencv/opencv-python#675 (comment)), but the pin can be removed now.
For our frontend to be able to send requests to the API we need to configure CORS.
There is a good guide on this in the docs.
For now, I propose that we should allow the following origins:
origins = [
"http://localhost:3002", # TODO: Remove this in production and only allow in dev
"https://app.nowcasting.io"
]
Would be great to see how long the routes take to run.
The example here would probably be good.
Good to add an endpoint that gives the latest forecast of the past.
Would be nice to see the forecast from yesterday, but just showing the last forecast made.
This is useful for comparing to pvlive 'truth' values.
Optional inputs could be 'dateFrom' and 'dateTo', defaulting to yesterday and tomorrow.
SELECT DISTINCT ON (forecast_value.target_time)
    forecast_value.created_utc AS forecast_value_created_utc,
    forecast_value.id AS forecast_value_id,
    forecast_value.target_time AS forecast_value_target_time,
    forecast_value.expected_power_generation_megawatts AS forecast_value_expected_power_generation_megawatts,
    forecast_value.forecast_id AS forecast_value_forecast_id
FROM forecast_value
JOIN forecast ON forecast.id = forecast_value.forecast_id
JOIN location ON location.id = forecast.location_id
WHERE forecast_value.target_time >= '2022-08-23T00:00:00'::timestamp
  AND forecast_value.created_utc >= '2022-08-22T00:00:00'::timestamp
  AND forecast.created_utc >= '2022-08-22T00:00:00'::timestamp
  AND location.gsp_id = 0
ORDER BY forecast_value.target_time, forecast_value.created_utc DESC
This query first appears when we load the web page.
Need to find out where and what calls this. It should use forecast_value_latest, not forecast_value.
Add elastic beanstalk files into github. This means anyone with correct access to AWS can deploy the app.
Provide a route to give latest values of the forecast for today and yesterday
Might need to get List of 'ForecastValues' Rather than a Forecast
Upgrade nowcasting_datamodel to 0.0.53.
This was a small change to do with making location.gsp_id a distinct column.
Change info text to say this is the Alpha release
This really isn't very important, and I'm being selfish by even mentioning this! I bring this up because it breaks my gmail filters 🙂
This issue is about the automated "Release" emails we receive (with subjects like "[openclimatefix/nowcasting_api] Release 0.1.14 - v0.1.14").
In all our other repositories, the "Release" emails are from "github-actions[bot]", as can be seen in this screenshot from gmail:
But, for some reason, nowcasting_api's "Release" emails are from "Peter Dudfield":
I was just wondering if we can change the "sender" from "Peter Dudfield" to "github-actions[bot]"? Or have GitHub changed something that's out of our control?! 🙂
Schedule the CI test runner to run every Monday lunchtime.
Context
Useful to regularly check nothing is broken.
Possible Implementation
on:
  push:
  schedule:
    - cron: "0 12 * * 1"
Make terraform code for deployment of this service on AWS Elastic Beanstalk.
This should include:
Good so we can deploy this service easily.
https://automateinfra.com/2021/03/24/how-to-launch-aws-elastic-beanstalk-using-terraform/
Read GSP results from the database.
The GSP results are made by the ML models.
Import nowcasting_forecast and use its database folder to get the sqlalchemy data model and connection details.
Option to normalize the forecast.
Good to deploy a version where it does nothing first, then normalize by the GSP capacity value. This is in the LocationSQL object.
The current docker image is ~600MB on Docker Hub.
It would be great to reduce this if possible.
Good to keep docker images small.
Perhaps don't install packages that aren't used. Might have to do something in nowcasting_dataset.
Automated deploy docker image to docker hub
Every push, a docker image should be made and pushed to docker hub
Would be great to have it versioned tag too
Useful for automatic deployment
For 2. we could try basing it off SatFlow's setup, which publishes to Docker Hub and GitHub Container Registry on every push.
Just need to update the description here
When loading the forecasts, load from latest table
This will have all the historic forecasted values + the latest ones
This follows from openclimatefix/nowcasting_datamodel#2
Good to add an automatic .github workflow that runs a linter.
Useful to have a general way to do this.
https://github.com/openclimatefix/nowcasting_utils/blob/main/.github/workflows/linters.yaml
Describe the bug
Currently we are not redirecting http traffic (port 80) to be https instead (port 443).
This is because of a missing configuration in our Elastic Beanstalk deployment.
The API can already be reached over SSL via https://api.nowcasting.io (or api-dev
respectively).
Additionally, however, we should be upgrading http
connections as well.
To Reproduce
Steps to reproduce the behaviour:
- See "Not Secure" in the browser navigation bar

Expected Behaviour
All connections to http://nowcasting-api...
get redirected to https://nowcasting-api...
Additional Context
This can be resolved by following this AWS Guide.