A crawler that scrapes a website, generates OpenAI summaries of the data, and saves the results to a CSV file. Use https://www.scrapingcourse.com/ecommerce/ for tests.
git clone [email protected]:yevhenii-nevmyvako/clevervol_test.git
python3 -m venv venv
source venv/bin/activate
pip install -e .
Add your OpenAI `API_KEY` to `.env`.
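The script presumably reads that key at startup. A minimal stdlib-only sketch of loading a `.env` file (the variable name `API_KEY` is taken from the step above; the repo may well use python-dotenv instead):

```python
import os

def load_dotenv(path=".env"):
    """Read KEY=VALUE lines from a .env file into os.environ."""
    if not os.path.exists(path):
        return
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            # Skip blank lines and comments.
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            # Do not overwrite variables already set in the environment.
            os.environ.setdefault(key.strip(), value.strip())

load_dotenv()
# api_key = os.environ["API_KEY"]
```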
Arguments:
- Script name.
- URL of the target website.
- Path where the CSV output is written, e.g. path/to/dst_filepath.csv.
- -l 10: number of pages to crawl.
Example:
crawler.py https://www.scrapingcourse.com/ecommerce/ google_ads_2_data.csv
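The command line above maps onto a small argparse interface; a sketch under the assumption that the URL and CSV path are positional and `-l` limits the page count (names like `parse_args` here are illustrative, not the repo's actual code):

```python
import argparse

def parse_args(argv=None):
    """Parse the CLI arguments documented above for crawler.py."""
    parser = argparse.ArgumentParser(
        description="Crawl a website, summarize pages with OpenAI, save to CSV."
    )
    parser.add_argument("url", help="URL of the website to crawl")
    parser.add_argument("dst", help="destination CSV file, e.g. path/to/dst_filepath.csv")
    parser.add_argument("-l", "--limit", type=int, default=10,
                        help="number of pages to crawl (default: 10)")
    return parser.parse_args(argv)

if __name__ == "__main__":
    args = parse_args()
    print(args.url, args.dst, args.limit)
```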