The most time-consuming part of both `transform()` and `fit_transform()` is the extraction of BERT-based embeddings. Even after the model has been trained, embeddings still have to be extracted for every unseen document. Unfortunately, this makes `transform()` hard to speed up, since its computational cost is dominated by embedding extraction.
Fortunately, in the case of topic modeling it is unlikely that you will frequently re-train the model, as that would produce new topics that need to be interpreted all over again. Typically, you run `fit_transform()` once on a large dataset and then use `transform()` for unseen documents, which is faster since batches of unseen documents are usually much smaller than the original training corpus.
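The fit-once / transform-many pattern described above can be sketched as follows. Note that `TinyTopicModel` is a self-contained, stdlib-only stand-in so the example runs anywhere; with the real library the call pattern is the same, but you would use `bertopic.BERTopic` and its `fit_transform()` would do the expensive BERT embedding extraction over the whole corpus.

```python
import hashlib

class TinyTopicModel:
    """Minimal stand-in for a topic model: the costly step is _embed()."""

    def __init__(self, n_topics=4):
        self.n_topics = n_topics

    def _embed(self, doc):
        # Stand-in for BERT embedding extraction (the expensive step).
        digest = hashlib.sha256(doc.encode("utf-8")).digest()
        return [b / 255 for b in digest[:8]]

    def fit_transform(self, docs):
        # Train once on the full corpus; embeds every document.
        self.embeddings_ = [self._embed(d) for d in docs]
        return [self._assign(e) for e in self.embeddings_]

    def transform(self, docs):
        # Reuse the trained model; only the new documents are embedded.
        return [self._assign(self._embed(d)) for d in docs]

    def _assign(self, emb):
        # Toy topic assignment: bucket by the first embedding dimension.
        return int(emb[0] * self.n_topics) % self.n_topics

corpus = ["topic modeling with BERT", "fast embeddings", "cluster quality"]
model = TinyTopicModel()
topics = model.fit_transform(corpus)          # expensive: whole corpus
new_topics = model.transform(["unseen doc"])  # cheaper: only new documents
```

The key point the sketch illustrates is that `transform()` still has to embed each incoming document, which is why its cost cannot easily be reduced without changing the embedding backend itself.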
Having said that, it would be nice to be able to swap out the BERT embeddings for another feature-extraction method. That could make the application much faster, although it might hurt the quality of the resulting clusters. Flair might be an interesting alternative.
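Swapping the embedding backend could look like the sketch below: the clustering step only depends on a callable that maps documents to vectors, so a heavy transformer-based embedder can be replaced by a cheaper one. Both embedders here are toy stand-ins for illustration; in BERTopic itself this corresponds to passing a different `embedding_model` (which also accepts backends such as Flair), not to these exact functions.

```python
def heavy_embedder(docs):
    # Stand-in for BERT-based extraction: higher quality, but slow.
    return [[len(d), sum(map(ord, d)) % 97] for d in docs]

def light_embedder(docs):
    # Stand-in for a cheaper method (e.g. bag-of-words features):
    # much faster, possibly lower cluster quality.
    return [[len(d), d.count(" ")] for d in docs]

def cluster(docs, embedder):
    # The pipeline only depends on the embedder's interface,
    # so backends can be swapped without touching the rest.
    vectors = embedder(docs)
    return [round(v[0]) % 2 for v in vectors]  # toy 2-cluster assignment

docs = ["short doc", "a considerably longer document"]
heavy_labels = cluster(docs, heavy_embedder)
light_labels = cluster(docs, light_embedder)
```

Because the embedder is just a parameter, benchmarking a faster backend against the BERT-based one becomes a one-line change, which makes the speed/quality trade-off easy to measure on your own data.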
Does this answer your question?