---
library_name: setfit
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: >-
Shakespeare's Macbeth stands as a timeless exploration of ambition, power,
and the corrupting influence of unchecked desire. As the playwright delves
into the psyche of its titular character, Macbeth, and his wife, Lady
Macbeth, he unravels a narrative that transcends its historical context to
reveal universal truths about human nature. Central to Shakespeare's
critique is the portrayal of ambition as a double-edged sword. Macbeth's
ascent from loyal subject to ruthless tyrant illustrates the seductive
allure of power and the devastating consequences of its pursuit. His
initial reluctance and moral turmoil give way to a relentless pursuit of
supremacy, driven by the prophecies of supernatural forces and the
machinations of his ambitious wife.
Moreover, Shakespeare critiques the role of gender and masculinity through
Lady Macbeth's character. Her manipulation and goading of Macbeth
challenge traditional gender norms, presenting a complex and compelling
portrait of a woman driven by ambition and the desire for power. The
play's exploration of guilt and conscience further deepens Shakespeare's
critique. Macbeth's descent into madness and paranoia, haunted by visions
of his victims and the consequences of his actions, reflects the
psychological toll of moral corruption. Ultimately, Shakespeare's Macbeth
serves as a cautionary tale about the dangers of unchecked ambition and
the moral complexities of human nature. Its enduring relevance lies in its
ability to provoke introspection and contemplation of universal themes
that resonate across time and cultures, making it a masterpiece of
literature that continues to captivate and challenge audiences worldwide.
- text: >-
Demonetization, implemented in India on November 8, 2016, aimed to curb
black money, counterfeit currency, and corruption while promoting digital
transactions. However, its impact and effectiveness have been subjects of
intense debate and scrutiny.
Proponents argue that demonetization disrupted illegal financial
activities, forcing unaccounted wealth into the formal banking system. It
encouraged digital payments, potentially reducing cash-based transactions
and improving transparency. Moreover, the move signaled a strong political
will to tackle corruption and parallel economy.
On the contrary, critics highlight several shortcomings. The sudden
withdrawal of high-denomination currency notes led to cash shortages,
particularly affecting rural areas and small businesses reliant on cash
transactions. The informal sector, comprising a significant portion of the
economy, faced severe disruptions, impacting livelihoods and economic
growth.
Moreover, demonetization did not significantly curb black money or
corruption, as evidenced by the return of almost all demonetized currency
to banks. The costs of implementation, including printing new currency and
managing logistical challenges, were substantial. The move also diverted
attention from other pressing economic reforms.
Looking ahead, lessons from demonetization underscore the need for
comprehensive planning, stakeholder consultation, and phased
implementation of economic policies. Future reforms should prioritize
inclusive growth, address structural issues, and leverage technology to
enhance financial transparency without causing undue hardship to
vulnerable populations.
In conclusion, while demonetization aimed to achieve noble objectives, its
outcomes were mixed. The policy's success in achieving its primary goals
remains contentious, highlighting the complexities of economic policy
formulation and implementation in a diverse and evolving economy.
- text: "1. Development and Purpose: • GPT Models: Developed by OpenAI, GPT (Generative Pre-trained Transformer) models, such as GPT-3, are designed for natural language processing tasks. They excel in tasks like text generation, language translation, and sentiment analysis through extensive training on large datasets. • Cohere Models: Cohere specializes in fine-tuned models optimized for specific NLP tasks, emphasizing efficiency and effectiveness in particular applications.\n2. Architecture and Training: • GPT Models: Utilize transformer architecture with attention mechanisms, enabling them to process and generate coherent text based on learned patterns from vast datasets. GPT models are pre-trained on diverse corpora and fine-tuned for specific applications. • Cohere Models: Employ transformer-based architectures, but with potential optimizations or customizations tailored for specific tasks, such as semantic search, question answering, or document classification. Cohere focuses on maximizing model efficiency and performance for targeted applications. 3. Performance and Applications: • GPT Models: Known for their general-purpose applications in natural language understanding and generation across various domains. GPT models are widely adopted in chatbots, content creation, and automated customer support systems. • Cohere Models: Specialize in specific applications where task efficiency and domain expertise are crucial. This includes tasks like semantic search, where Cohere's models excel in understanding complex queries and retrieving relevant information efficiently. 4. Accessibility and Integration: • GPT Models: OpenAI's GPT models are accessible through APIs and cloud services, facilitating easy integration into different platforms and applications. They are widely used in both research and commercial sectors due to their broad applicability.
• Cohere Models: Cohere's models are accessible through APIs and developer tools, focusing on providing tailored solutions for specific NLP challenges. Integration options depend on Cohere's partnerships and deployment strategies. 5. Innovation and Future Directions: • GPT Models: Continually advancing with research and development efforts to enhance language understanding, context awareness, and multi-modal capabilities. • Cohere Models: Innovating by fine-tuning models and exploring novel applications to optimize efficiency and performance in targeted NLP tasks.\nConclusion: GPT models and Cohere models represent distinct approaches to leveraging transformer-based architectures for natural language processing. While GPT models excel in general-purpose applications with broad adaptability, Cohere models specialize in optimizing efficiency and performance for specific NLP tasks, offering tailored solutions for semantic search and other targeted applications. Choosing between them depends on specific use case requirements, including task complexity, data efficiency, and integration needs."
- text: "Aerial Overview The attached aerial photograph of [Factory Name] offers a comprehensive view of the facility layout, including key operational areas: 1. Main Production Area: Located centrally, this section houses the primary manufacturing lines and assembly units.\n2. Warehouse: Positioned on the northeast side, this area is dedicated to storage and inventory management. 3. Loading Docks: Situated on the south end, these docks facilitate the efficient movement of goods in and out of the facility. 4. Administrative Offices: Found on the west side, this section includes offices for management, HR, and other administrative functions. 5. Maintenance Area: Located near the northwest corner, this area is equipped for routine equipment upkeep and repair tasks. 6. Employee Facilities: Including the cafeteria and break rooms, located adjacent to the administrative offices. Operational Details\n1. Production Process: • Raw Material Handling: o Raw materials are received at the loading docks and transported to the warehouse. o Quality checks are performed before materials are moved to production. • Manufacturing: o The main production area is divided into various sections based on product lines. o Each section is equipped with specialized machinery and staffed by trained operators. o Strict adherence to safety and quality standards is maintained throughout the production process. 2. Quality Control: • Inspection: o Continuous quality checks are performed at each stage of production. o Finished products undergo rigorous testing before approval. • Documentation: o Detailed records of quality inspections and tests are maintained for compliance and traceability. 3. Maintenance and Safety:\n• Routine Maintenance: o Scheduled maintenance is performed to ensure machinery operates efficiently and safely.
o Maintenance logs are kept for all equipment.\n• Safety Protocols: o All employees must follow safety guidelines, including the use of personal protective equipment (PPE). o Emergency procedures are in place, and regular drills are conducted. Conclusion This document provides an overview of [Factory Name]'s operations and layout. Ensuring the confidentiality of this information is crucial for the safety and efficiency of our operations. For further details or inquiries, please contact the facility manager."
- text: >-
Participating in the Government e-Marketplace (GeM) portal offers
significant advantages for vendors and buyers alike. GeM is an online
platform initiated by the Government of India to facilitate procurement of
goods and services by government departments, public sector units, and
autonomous bodies. For vendors, registering on GeM opens doors to a vast
market of government buyers, enhancing business opportunities and
visibility. The portal provides a transparent and efficient procurement
process, reducing paperwork and transaction costs. Vendors can showcase
their products/services, respond to bids electronically, and receive
timely payments, fostering ease of doing business.
On the buyer side, GeM streamlines procurement operations by offering a
wide range of products and services from verified sellers at competitive
prices. It ensures compliance with procurement rules and promotes
transparency through online tracking and monitoring of transactions.
Government entities benefit from cost savings, reduced procurement time,
and access to a diverse supplier base. Participation in GeM promotes
digital governance, aligning with the government's initiatives for
transparency, efficiency, and promoting local businesses. It empowers
vendors, particularly MSMEs, to compete on a level playing field and
contribute to India's economic growth. Embracing GeM facilitates a
seamless procurement experience while driving socio-economic development
through enhanced market access and operational efficiency.
inference: true
model-index:
- name: SetFit
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: Unknown
      type: unknown
      split: test
    metrics:
    - type: accuracy
      value: 1.0
      name: Accuracy
---

# SetFit
This is a SetFit model that can be used for Text Classification. A LogisticRegression instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
- Fine-tuning a Sentence Transformer with contrastive learning.
- Training a classification head with features from the fine-tuned Sentence Transformer.
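The second of these steps can be sketched in miniature. The snippet below is an illustrative toy, not the actual training code: random clustered vectors stand in for the embeddings a fine-tuned Sentence Transformer would produce, and a LogisticRegression head is fitted on them, mirroring how SetFit's classification head consumes embedding features.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Stand-in for sentence embeddings: 10 examples per class, 3 classes,
# drawn around per-class centers so the toy head has something to separate.
n_per_class, n_classes, dim = 10, 3, 16
centers = rng.normal(size=(n_classes, dim))
X = np.vstack([centers[c] + 0.1 * rng.normal(size=(n_per_class, dim))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

# Step 2 of the SetFit recipe: fit a classification head on the features.
head = LogisticRegression()
head.fit(X, y)
```

In the real pipeline, `X` would come from encoding the training texts with the contrastively fine-tuned Sentence Transformer body rather than a random generator.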
## Model Details

### Model Description
- Model Type: SetFit
- Classification head: a LogisticRegression instance
- Maximum Sequence Length: 512 tokens
- Number of Classes: 3 classes
### Model Sources

- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels

| Label | Examples |
|:------|:---------|
| 0 | |
| 1 | |
| 2 | |
## Evaluation

### Metrics

| Label | Accuracy |
|:------|:---------|
| all | 1.0 |
## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.
```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("amritzeon/setfit_finetuned_ide")
# Run inference
preds = model("""Participating in the Government e-Marketplace (GeM) portal offers significant advantages for vendors and buyers alike. GeM is an online platform initiated by the Government of India to facilitate procurement of goods and services by government departments, public sector units, and autonomous bodies. For vendors, registering on GeM opens doors to a vast market of government buyers, enhancing business opportunities and visibility. The portal provides a transparent and efficient procurement process, reducing paperwork and transaction costs. Vendors can showcase their products/services, respond to bids electronically, and receive timely payments, fostering ease of doing business.
On the buyer side, GeM streamlines procurement operations by offering a wide range of products and services from verified sellers at competitive prices. It ensures compliance with procurement rules and promotes transparency through online tracking and monitoring of transactions. Government entities benefit from cost savings, reduced procurement time, and access to a diverse supplier base. Participation in GeM promotes digital governance, aligning with the government's initiatives for transparency, efficiency, and promoting local businesses. It empowers vendors, particularly MSMEs, to compete on a level playing field and contribute to India's economic growth. Embracing GeM facilitates a seamless procurement experience while driving socio-economic development through enhanced market access and operational efficiency.""")
```
## Training Details

### Training Set Metrics

| Training set | Min | Median | Max |
|:-------------|:----|:---------|:----|
| Word count | 75 | 337.2667 | 706 |

| Label | Training Sample Count |
|:------|:----------------------|
| 0 | 10 |
| 1 | 10 |
| 2 | 10 |
### Training Hyperparameters
- batch_size: (30, 30)
- num_epochs: (2, 2)
- max_steps: -1
- sampling_strategy: oversampling
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: True
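As a rough guide, the hyperparameters listed above correspond to SetFit's `TrainingArguments`. The fragment below is an illustrative configuration sketch, not the exact script used to train this model; tuple-valued arguments set the embedding and classifier phases separately.

```python
from setfit import TrainingArguments

# Illustrative configuration mirroring the hyperparameters listed above.
args = TrainingArguments(
    batch_size=(30, 30),              # (embedding phase, classifier phase)
    num_epochs=(2, 2),
    body_learning_rate=(2e-5, 1e-5),  # Sentence Transformer body
    head_learning_rate=0.01,          # classification head
    sampling_strategy="oversampling",
    warmup_proportion=0.1,
    seed=42,
    load_best_model_at_end=True,
)
```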
### Training Results

| Epoch | Step | Training Loss | Validation Loss |
|:------|:-----|:--------------|:----------------|
| 0.05 | 1 | 0.1952 | - |
| 1.0 | 20 | - | 0.1326 |
| **2.0** | **40** | **-** | **0.0704** |

- The bold row denotes the saved checkpoint.
### Framework Versions
- Python: 3.10.12
- SetFit: 1.0.3
- Sentence Transformers: 3.0.1
- Transformers: 4.39.0
- PyTorch: 2.3.0+cu121
- Datasets: 2.20.0
- Tokenizers: 0.15.2
## Citation

### BibTeX

```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
  doi = {10.48550/ARXIV.2209.11055},
  url = {https://arxiv.org/abs/2209.11055},
  author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
  keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
  title = {Efficient Few-Shot Learning Without Prompts},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
```