Update README.md
README.md CHANGED
@@ -61,7 +61,7 @@ This model is based on the **Mistral-Large-Instruct-2411** foundation model and
 
 The model will be fine tuned to enhance its capabilities in handling advanced fluid dynamics and plasma physics scenarios relevant to Hall Effect Thrusters. Key enhancements include:
 
-1. **Domain-Specific Fine-Tuning**: Adjusting the model's parameters using the `Taylor658/
+1. **Domain-Specific Fine-Tuning**: Adjusting the model's parameters using the `Taylor658/Electrohydrodynamics` dataset to improve performance in electrohydrodynamics.
 2. **Validation and Testing**: Ensuring the model’s outputs are accurate and reliable by comparing them against established literature and computational benchmarks.
 3. **Iterative Refinement**: Continuously refining responses based on domain expert feedback and real-world problem sets.
 
@@ -94,7 +94,7 @@ The model will be fine tuned to enhance its capabilities in handling advanced fluid dynamics and plasma physics scenarios relevant to Hall Effect Thrusters. Key enhancements include:
 
 ### Acknowledgements
 
 - **Mistral AI**: For providing the Mistral-Large-Instruct-2411 foundation model.
-- **Dataset Contributors**:
+- **Dataset Contributors**: A Taylor
 - **Open-Source Community**: Gratitude for tools and libraries that supported the fine-tuning process.
 
 ---
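The "Validation and Testing" step in the README could be sketched as below: model-produced numerical answers are checked against benchmark values from the literature within a relative tolerance. This is a minimal illustrative sketch, not part of the commit; the quantity names, benchmark values, and the 5% tolerance are all hypothetical placeholders, not real Hall Effect Thruster data.

```python
# Hypothetical sketch of the validation step: score numerical answers
# against literature benchmarks within a relative tolerance.

def within_tolerance(predicted: float, reference: float, rel_tol: float = 0.05) -> bool:
    """Return True if `predicted` is within `rel_tol` relative error of `reference`."""
    if reference == 0.0:
        return abs(predicted) <= rel_tol
    return abs(predicted - reference) / abs(reference) <= rel_tol


def validate(answers: dict[str, float], benchmarks: dict[str, float]) -> dict[str, bool]:
    """Score each benchmarked quantity; a missing answer counts as a failure."""
    return {
        key: key in answers and within_tolerance(answers[key], ref)
        for key, ref in benchmarks.items()
    }


if __name__ == "__main__":
    # Illustrative placeholder values only (not real thruster benchmarks).
    benchmarks = {"exhaust_velocity_km_s": 16.0, "thrust_mN": 83.0}
    answers = {"exhaust_velocity_km_s": 16.4, "thrust_mN": 90.1}
    print(validate(answers, benchmarks))
    # → {'exhaust_velocity_km_s': True, 'thrust_mN': False}
```

Per-quantity pass/fail results like these could then feed the "Iterative Refinement" step, flagging which answer categories need further fine-tuning.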