# MARTINI_enrich_BERTopic_QAnon17_Awakening
This is a BERTopic model. BERTopic is a flexible and modular topic modeling framework that allows for the generation of easily interpretable topics from large datasets.
## Usage

To use this model, please install BERTopic:

```
pip install -U bertopic
```
You can use the model as follows:

```python
from bertopic import BERTopic

topic_model = BERTopic.load("AIDA-UPM/MARTINI_enrich_BERTopic_QAnon17_Awakening")
topic_model.get_topic_info()
```
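Beyond the topic summary, the loaded model can also assign topics to new documents. The snippet below is a minimal sketch, continuing from the code above; the example text is hypothetical, and `transform` assumes the embedding model is available at inference time (if it was not serialized with the model, it can be passed via `BERTopic.load(..., embedding_model=...)`).

```python
# Assign topics to unseen documents (hypothetical example text).
# Requires the embedding model used at training time; pass it to
# BERTopic.load(..., embedding_model=...) if it was not saved with the model.
new_docs = ["example message about vaccines and ivermectin"]
topics, probs = topic_model.transform(new_docs)
print(topics)  # predicted topic id per document
print(probs)   # topic probabilities (calculate_probabilities=True)
```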
## Topic overview
- Number of topics: 7
- Number of training documents: 468
The table below gives an overview of all topics.
| Topic ID | Topic Keywords | Topic Frequency | Label |
|---|---|---|---|
| -1 | soros - thepentagonpapers - ukraine - global - banned | 25 | -1_soros_thepentagonpapers_ukraine_global |
| 0 | fauci - vaccinated - clots - ivermectin - deadly | 209 | 0_fauci_vaccinated_clots_ivermectin |
| 1 | 432hz - illuminati - pyramids - frequencies - resonate | 79 | 1_432hz_illuminati_pyramids_frequencies |
| 2 | republic - liberty - indivisible - flag - salvation | 50 | 2_republic_liberty_indivisible_flag |
| 3 | hillary - kamala - leaked - epstein - videos | 38 | 3_hillary_kamala_leaked_epstein |
| 4 | trump - raffle - 2500 - millionaire - bitcoin | 36 | 4_trump_raffle_2500_millionaire |
| 5 | supplements - liver - antioxidants - detoxification - retina | 31 | 5_supplements_liver_antioxidants_detoxification |
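In BERTopic, topic `-1` is the outlier topic collecting documents that were not assigned to any cluster. To inspect the full keyword/weight pairs behind one of the labels above, you can query the loaded model directly; a minimal sketch, continuing from the usage snippet:

```python
# Top words and their c-TF-IDF weights for topic 0 (the fauci/vaccinated/clots topic)
topic_model.get_topic(0)

# Topic sizes as a DataFrame, matching the frequency column above
topic_model.get_topic_freq()
```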
## Training hyperparameters
- calculate_probabilities: True
- language: None
- low_memory: False
- min_topic_size: 10
- n_gram_range: (1, 1)
- nr_topics: None
- seed_topic_list: None
- top_n_words: 10
- verbose: False
- zeroshot_min_similarity: 0.7
- zeroshot_topic_list: None
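These values map directly onto arguments of the `BERTopic` constructor. Below is a minimal sketch of a training call with the same settings; the document list and the embedding model are placeholders, since neither the training corpus nor the embedding model is listed on this card.

```python
from bertopic import BERTopic

# Placeholder corpus; the actual 468 training documents are not distributed with this card
docs = ["..."]

topic_model = BERTopic(
    embedding_model="sentence-transformers/all-MiniLM-L6-v2",  # assumption: not specified on the card
    calculate_probabilities=True,
    language=None,
    low_memory=False,
    min_topic_size=10,
    n_gram_range=(1, 1),
    nr_topics=None,
    seed_topic_list=None,
    top_n_words=10,
    verbose=False,
    zeroshot_min_similarity=0.7,
    zeroshot_topic_list=None,
)
topics, probs = topic_model.fit_transform(docs)
```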
## Framework versions
- Numpy: 1.26.4
- HDBSCAN: 0.8.40
- UMAP: 0.5.7
- Pandas: 2.2.3
- Scikit-Learn: 1.5.2
- Sentence-transformers: 3.3.1
- Transformers: 4.46.3
- Numba: 0.60.0
- Plotly: 5.24.1
- Python: 3.10.12
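To approximate this environment, the dependencies above can be pinned at install time. A sketch using the corresponding PyPI package names (the BERTopic version itself is not listed on the card, so it is left unpinned):

```
pip install bertopic "numpy==1.26.4" "hdbscan==0.8.40" "umap-learn==0.5.7" \
    "pandas==2.2.3" "scikit-learn==1.5.2" "sentence-transformers==3.3.1" \
    "transformers==4.46.3" "numba==0.60.0" "plotly==5.24.1"
```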