Comments (5)
Not at the moment. The whole thing is designed to write to disk incrementally, and that is very helpful for performance: huge amounts of data don't have to be held and processed in memory.
I'm curious, why do you think this would be a good thing? What are you trying to achieve?
from advertools.
I am writing a frontend and would like to provide the status of the crawl as well as results. I am using FastAPI to trigger the crawl and was just wanting to avoid writing a bunch of code to read the file.
I was also under the impression that the contents were written to the file all at once. But now I see that's not the case.
Very interesting application. I'd love to know more.
> would like to provide the status of the crawl
As the crawl is happening, a new line is added for each URL crawled. You can easily count the number of lines in the output file and display that to the user ("X URLs crawled").
If crawling in list mode, you can also provide a percentage and/or "X URLs remaining".
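That status check can be sketched with the standard library alone. Here `crawl_status` is a hypothetical helper, and the file name is whatever you passed to `advertools.crawl` as the output file; `total` would come from the length of your URL list when crawling in list mode:

```python
from pathlib import Path

def crawl_status(output_file, total=None):
    """Count the lines of a .jl crawl file; each line is one crawled URL."""
    path = Path(output_file)
    if not path.exists():  # crawl not started yet
        return "0 URLs crawled"
    crawled = sum(1 for line in path.open() if line.strip())
    if total:  # known in list mode, where the URL list length is the total
        return f"{crawled}/{total} URLs crawled ({crawled / total:.0%})"
    return f"{crawled} URLs crawled"
```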
> wanting to avoid writing a bunch of code to read the file
This can be done with a single pandas command, read_json. Or am I missing something?
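For illustration, reading the crawl output really is one call; the only detail to know is that the file is JSON Lines, so `lines=True` is required (`read_crawl` is just a trivial wrapper around it, and the file name is an assumption):

```python
import pandas as pd

def read_crawl(output_file):
    # The crawl writes JSON Lines (one JSON object per line),
    # so lines=True is required when reading it back.
    return pd.read_json(output_file, lines=True)

# e.g. crawl_df = read_crawl('output_file.jl')
```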
Yeah, I am familiar with the pandas read_json command, and I've used the tool before. I was just looking to see if there was a way to stream things to a different place, say a Redis or Kafka topic. At the end of the day I need to persist the crawl data and return some of the information back to the client. I can definitely do this using the DataFrame method and writing those rows to some persistent location.
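Since the file grows line by line, one way to approximate streaming is to tail it and forward each new row to whatever sink you like. This is a hypothetical sketch, not part of advertools: `follow_jl` is an assumed helper name, and `publish` stands in for e.g. a Redis `rpush` or a Kafka `producer.send` call:

```python
import json
import time
from pathlib import Path

def follow_jl(output_file, publish, poll_interval=1.0, stop_after=None):
    """Tail a growing .jl file and call publish(row) for each new line.

    publish is any callable taking a dict, e.g. a Redis or Kafka push.
    stop_after bounds the loop for tests; omit it to tail indefinitely.
    """
    seen = 0
    while True:
        lines = Path(output_file).read_text().splitlines()
        for line in lines[seen:]:
            if line.strip():
                publish(json.loads(line))
        seen = len(lines)
        if stop_after is not None and seen >= stop_after:
            return seen
        time.sleep(poll_interval)
```

Polling by re-reading the whole file is simple but O(n) per poll; for very large crawls you'd keep a byte offset and `seek` instead.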
> This can be done with a single pandas command read_json. Or am I missing something?
Yes, I will go that route where I read the lines of the file to check crawl status. The only other thing might be calculating the number of potential pages/lines.
I am taking a domain URL as an argument, so I have to create the list dynamically. Does follow mode offer a way to determine how many pages will get crawled, or does advertools provide a good way to determine the pages I could put in the list? Perhaps crawling the sitemap would do.
> If crawling in list mode you can also provide a percentage, and/or "X URLs remaining".
I am building a suite of SEO tools that focus on automation.
> Very interesting application. I'd love to know more.
Cool. I think the XML sitemap can be downloaded quickly and can provide a generally good estimate of how big the site is and the number of URLs.
Of course there could be discrepancies, but most of the time I think it provides a good estimate.
Looking forward to seeing what you build!
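As a sketch of that estimate: advertools' `sitemap_to_df` returns a DataFrame with one URL per row in its `loc` column, so the site-size estimate is just a unique count. The `estimate_site_size` helper below is hypothetical, and since sitemap index files can repeat URLs, it counts unique values:

```python
import pandas as pd

def estimate_site_size(sitemap_df):
    """Estimate the crawlable URL count from a sitemap DataFrame.

    sitemap_df is the shape of what advertools.sitemap_to_df returns:
    a DataFrame whose 'loc' column holds one URL per row.
    """
    return sitemap_df['loc'].nunique()

# e.g. total = estimate_site_size(adv.sitemap_to_df('https://example.com/sitemap.xml'))
```

That `total` is exactly what the percentage/"X URLs remaining" status display needs, with the caveat from above that sitemaps and actual crawls can disagree.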
Related Issues (20)
- Getting NaN values for serp_goog function HOT 3
- File not found on crawl method HOT 4
- opening .jl file command doesn't show 'my_output_file.jl' HOT 7
- crawl dataFrame - jsonld objects HOT 2
- Suggestion - don't treat jsonld items in distinct script tags as distinct. HOT 8
- How to get initial url in output.jl ? HOT 2
- Bypass protection HOT 3
- Need some way to rate limit requests for sitemap_to_df HOT 3
- Scraps forever HOT 3
- Python 3.10/11 SSL: SSLV3_ALERT_HANDSHAKE_FAILURE HOT 5
- Advertools in Ubuntu in a Venv (Python 3.10.12 and Python 3.9.18) HOT 7
- browser can get https://zapier.com but when run scrape failed HOT 2
- logs_to_df() Limitation HOT 5
- Bypass a cookie wall HOT 4
- Pandas Futurewarning "fillna" in url_to_df() HOT 1
- request_url_df creates wide list? HOT 3
- How to get started with development? HOT 7
- Instagram Mentions Allows Periods HOT 1
- about this result in serp HOT 3