This repository is divided into two parts. Part 1 retrieves stock data from Alpha Vantage; Part 2 makes inferences from the collected data and serves them to the user via a RESTful API.
- Install the libraries from requirements.txt. This can be done by running the following command in the terminal:
pip install -r requirements.txt
- Run task1_main.py.
- Obtain a free API key from Alpha Vantage and use it to update API_key in task1.py.
- After updating the information in task1.py, simply run it; the data will be stored in the current directory as a .db file.
- Alternatively, if you have credentials for a PostgreSQL server, you can store the data there directly by updating the following line:
engine = create_engine('postgresql://username:password@localhost:5432/name_of_database')
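As a quick sketch of how the stored data can be round-tripped through the engine (the table and column names here are illustrative, not necessarily the project's actual schema; SQLite is used so the example runs without a PostgreSQL server):

```python
import pandas as pd
from sqlalchemy import create_engine

# Illustrative data; in the real script this DataFrame comes from the
# Alpha Vantage retrieval step.
df = pd.DataFrame({
    "ticker": ["AAPL", "MSFT"],
    "close": [150.0, 250.0],
})

# A SQLite file in the current directory (the default behaviour described above).
engine = create_engine("sqlite:///stock_data.db")
# For PostgreSQL, swap in the URL shown above:
# engine = create_engine("postgresql://username:password@localhost:5432/name_of_database")

# Write the DataFrame; if_exists="replace" overwrites the table on each run.
df.to_sql("stock_data", engine, if_exists="replace", index=False)

# Read it back to confirm the round trip.
print(pd.read_sql("SELECT * FROM stock_data", engine))
```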
- Run task2_main.py.
- The user should know the location of the stored data (the .db file or the database credentials).
- After updating the connection details in db_connect, start the server by simply running the script.
- Then requests can be made to the API using the tests provided, and the results observed.
- The stock data for each desired ticker is retrieved.
- A UUID is generated and the stock name is added to the retrieved data.
- The top row is taken, as it represents the latest data.
- This row is appended to a list.
- A delay of one minute is added after every 5 calls, because Alpha Vantage limits usage to 5 calls per minute.
- After all the required data has been gathered, it is converted into a pandas DataFrame.
- The DataFrame is then stored in a database.
- The script is scheduled to run daily at 10:30; this can be changed to the user's desired time.
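The collection steps above can be sketched as follows. `fetch_quote` is a hypothetical callable standing in for the actual Alpha Vantage request (which would keep only the top, most recent row); the field names are assumptions:

```python
import time
import uuid
import pandas as pd

def collect_quotes(tickers, fetch_quote, calls_per_minute=5, sleep=time.sleep):
    """Fetch one latest quote per ticker, respecting the API rate limit."""
    rows = []
    for i, ticker in enumerate(tickers):
        if i > 0 and i % calls_per_minute == 0:
            # Alpha Vantage's free tier allows 5 calls per minute,
            # so pause for a minute after every batch of 5.
            sleep(60)
        row = fetch_quote(ticker)          # e.g. {"close": ..., "change": ...}
        row["id"] = str(uuid.uuid4())      # unique identifier for this record
        row["stock_name"] = ticker
        rows.append(row)
    # Convert the accumulated rows into a single DataFrame for storage.
    return pd.DataFrame(rows)

# Usage (sketch): df = collect_quotes(["AAPL", "MSFT"], fetch_quote=my_av_fetch)
```

Passing `sleep` as a parameter keeps the rate-limit delay testable without actually waiting.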
- A connection to the database is established.
- Two functions, create_top_gainers_df and create_top_losers_df, were created to return the desired output.
- Three routes were made for the API:
- TopGainers returns the top 10 stocks with the largest positive change, in sorted order.
- TopLosers returns the top 10 stocks with the largest negative change, in sorted order.
- WeeklyReport returns the weekly report of the stocks. Due to ambiguity about whether the data is present in the database, an additional .py file, namely
generating_weekly_report.py,
has been provided, which returns the weekly report in the desired format.
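The two helper functions could look roughly like this in pandas. The column name `change` is an assumption about the stored schema, not confirmed by the repository:

```python
import pandas as pd

def create_top_gainers_df(df, n=10):
    # Largest positive change first; assumes a "change" column holds
    # the day's price change for each stock.
    return df.sort_values("change", ascending=False).head(n).reset_index(drop=True)

def create_top_losers_df(df, n=10):
    # Largest negative change first.
    return df.sort_values("change", ascending=True).head(n).reset_index(drop=True)
```

Each route can then serialize the returned DataFrame to JSON for the response.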
- Results have been provided in JSON format for a quick glance.
- Result of get_top_gainers
- Result of get_top_losers
- Result of weekly_report
- Alpha Vantage limits usage to 5 calls per minute, so a delay of one minute had to be added.
- Calling the get_daily function in Alpha Vantage returns over 20 years of daily data. We only need the current information, so this is a waste of resources.
- The complete names of the stocks were not available from the API, so they had to be hardcoded.
- Instead of using create_top_gainers_df and create_top_losers_df, the data could have been queried directly; but because direct queries did not produce optimal results, these functions were written instead.
- The weekly report is displayed directly, with no data manipulation, due to ambiguity about how the data would be received. However,
generating_weekly_report.py
is provided, which resolves the ambiguity and retrieves the data exactly as we want it.
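The direct-query alternative mentioned above could be as simple as an ORDER BY ... LIMIT statement. This sketch assumes a stock_data table with a change column and uses an in-memory SQLite database for illustration:

```python
import sqlite3
import pandas as pd

# Build a small illustrative table; in the real project the data already
# lives in the stock database.
conn = sqlite3.connect(":memory:")
sample = pd.DataFrame({"stock_name": ["A", "B", "C"], "change": [5.0, -2.0, 1.0]})
sample.to_sql("stock_data", conn, index=False)

# One query replaces create_top_gainers_df: sort by change, keep the top 10.
top_gainers = pd.read_sql(
    "SELECT * FROM stock_data ORDER BY change DESC LIMIT 10", conn
)
# For losers, the same query with ASC ordering.
top_losers = pd.read_sql(
    "SELECT * FROM stock_data ORDER BY change ASC LIMIT 10", conn
)
```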
- We can look into ways to work around the 5-calls-per-minute limit. Several other APIs were tried, but Alpha Vantage came out on top. Buying the premium plan is one possible solution.
- The scheduled task overwrites the existing database each day it runs, so we could give each .db file a unique identifier and keep the daily snapshots.
- A basic front end can be developed.
- Remove the redundant functions and use a direct query instead.
- Add user authorization.
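One simple way to implement the unique-identifier idea for daily snapshots is to date-stamp the file name (the naming scheme here is just an illustration):

```python
from datetime import date

def daily_db_name(run_date=None):
    """Return a date-stamped database file name so each daily run
    writes a new snapshot instead of overwriting the previous one."""
    run_date = run_date or date.today()
    return f"stock_data_{run_date.isoformat()}.db"
```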
Thank you for taking the time to review my submission.