---
base_model: microsoft/deberta-v3-base
datasets:
- tals/vitaminc
- allenai/scitail
- allenai/sciq
- allenai/qasc
- sentence-transformers/msmarco-msmarco-distilbert-base-v3
- sentence-transformers/natural-questions
- sentence-transformers/trivia-qa
- sentence-transformers/gooaq
- google-research-datasets/paws
language:
- en
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
- cosine_accuracy
- cosine_accuracy_threshold
- cosine_f1
- cosine_f1_threshold
- cosine_precision
- cosine_recall
- cosine_ap
- dot_accuracy
- dot_accuracy_threshold
- dot_f1
- dot_f1_threshold
- dot_precision
- dot_recall
- dot_ap
- manhattan_accuracy
- manhattan_accuracy_threshold
- manhattan_f1
- manhattan_f1_threshold
- manhattan_precision
- manhattan_recall
- manhattan_ap
- euclidean_accuracy
- euclidean_accuracy_threshold
- euclidean_f1
- euclidean_f1_threshold
- euclidean_precision
- euclidean_recall
- euclidean_ap
- max_accuracy
- max_accuracy_threshold
- max_f1
- max_f1_threshold
- max_precision
- max_recall
- max_ap
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:123245
- loss:CachedGISTEmbedLoss
widget:
- source_sentence: what type of inheritance does haemochromatosis
sentences:
- Nestled on the tranquil banks of the Pamlico River, Moss Landing is a vibrant
new community of thoughtfully conceived, meticulously crafted single-family homes
in Washington, North Carolina. Washington is renowned for its historic architecture
and natural beauty.
- '1 Microwave on high for 8 to 10 minutes or until tender, turning the yams once.
2 To microwave sliced yams: Wash, peel, and cut off the woody portions and ends.
3 Cut yams into quarters. 4 Place the yams and 1/2 cup water in a microwave-safe
casserole.ake the Yams. 1 Place half the yams in a 1-quart casserole. 2 Layer
with half the brown sugar and half the margarine. 3 Repeat the layers. 4 Bake,
uncovered, in a 375 degree F oven for 30 to 35 minutes or until the yams are glazed,
spooning the liquid over the yams once or twice during cooking.'
- Types 1, 2, and 3 hemochromatosis are inherited in an autosomal recessive pattern,
which means both copies of the gene in each cell have mutations. Most often, the
parents of an individual with an autosomal recessive condition each carry one
copy of the mutated gene but do not show signs and symptoms of the condition.Type
4 hemochromatosis is distinguished by its autosomal dominant inheritance pattern.With
this type of inheritance, one copy of the altered gene in each cell is sufficient
to cause the disorder. In most cases, an affected person has one parent with the
condition.ype 1, the most common form of the disorder, and type 4 (also called
ferroportin disease) begin in adulthood. Men with type 1 or type 4 hemochromatosis
typically develop symptoms between the ages of 40 and 60, and women usually develop
symptoms after menopause. Type 2 hemochromatosis is a juvenile-onset disorder.
- source_sentence: More than 273 people have died from the 2019-20 coronavirus outside
mainland China .
sentences:
- 'More than 3,700 people have died : around 3,100 in mainland China and around
550 in all other countries combined .'
- 'More than 3,200 people have died : almost 3,000 in mainland China and around
275 in other countries .'
- more than 4,900 deaths have been attributed to COVID-19 .
- source_sentence: The male reproductive system consists of structures that produce
sperm and secrete testosterone.
sentences:
- What does the male reproductive system consist of?
- What facilitates the diffusion of ions across a membrane?
- Autoimmunity can develop with time, and its causes may be rooted in this?
- source_sentence: Nitrogen gas comprises about three-fourths of earth's atmosphere.
sentences:
- What do all cells have in common?
- What gas comprises about three-fourths of earth's atmosphere?
- What do you call an animal in which the embryo, often termed a joey, is born immature
and must complete its development outside the mother's body?
- source_sentence: What device is used to regulate a person's heart rate?
sentences:
- 'Marie Antoinette and the French Revolution . Famous Faces . Mad Max:
Maximilien Robespierre | PBS Extended Interviews > Resources > For Educators
> Mad Max: Maximilien Robespierre Maximilien Robespierre was born May 6, 1758
in Arras, France. Educated at the Lycée Louis-le-Grand in Paris as a lawyer, Robespierre
became a disciple of philosopher Jean-Jacques Rousseau and a passionate advocate
for the poor. Called "the Incorruptible" because of his unwavering dedication
to the Revolution, Robespierre joined the Jacobin Club and earned a loyal following.
In contrast to the more republican Girondins and Marie Antoinette, Robespierre
fiercely opposed declaring war on Austria, feeling it would distract from revolutionary
progress in France. Robespierre''s exemplary oratory skills influenced the National
Convention in 1792 to avoid seeking public opinion about the Convention’s decision
to execute King Louis XVI. In 1793, the Convention elected Robespierre to the
Committee of Public Defense. He was a highly controversial member, developing
radical policies, warning of conspiracies, and suggesting restructuring the Convention.
This behavior eventually led to his downfall, and he was guillotined without trial
on 10th Thermidor An II (July 28, 1794), marking the end of the Reign of Terror.
Famous Faces'
- Devices for Arrhythmia Devices for Arrhythmia Updated:Dec 21,2016 In a medical
emergency, life-threatening arrhythmias may be stopped by giving the heart an
electric shock (as with a defibrillator ). For people with recurrent arrhythmias,
medical devices such as a pacemaker and implantable cardioverter defibrillator
(ICD) can help by continuously monitoring the heart's electrical system and providing
automatic correction when an arrhythmia starts to occur. This section covers everything
you need to know about these devices. Implantable Cardioverter Defibrillator (ICD)
- 'vintage cleats | eBay vintage cleats: 1 2 3 4 5 eBay determines this price through
a machine learned model of the product''s sale prices within the last 90 days.
eBay determines trending price through a machine learned model of the product’s
sale prices within the last 90 days. "New" refers to a brand-new, unused, unopened,
undamaged item, and "Used" refers to an item that has been used previously. Top
Rated Plus Sellers with highest buyer ratings Returns, money back Sellers with
highest buyer ratings Returns, money back'
model-index:
- name: SentenceTransformer based on microsoft/deberta-v3-base
results:
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts test
type: sts-test
metrics:
- type: pearson_cosine
value: 0.815847362588151
name: Pearson Cosine
- type: spearman_cosine
value: 0.8601639057015673
name: Spearman Cosine
- type: pearson_manhattan
value: 0.8516017944909535
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.8544723060240249
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.8535561946511334
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.8552219791548535
name: Spearman Euclidean
- type: pearson_dot
value: 0.7359966626313633
name: Pearson Dot
- type: spearman_dot
value: 0.7255299595618487
name: Spearman Dot
- type: pearson_max
value: 0.8535561946511334
name: Pearson Max
- type: spearman_max
value: 0.8601639057015673
name: Spearman Max
- task:
type: binary-classification
name: Binary Classification
dataset:
name: allNLI dev
type: allNLI-dev
metrics:
- type: cosine_accuracy
value: 0.71875
name: Cosine Accuracy
- type: cosine_accuracy_threshold
value: 0.8783560991287231
name: Cosine Accuracy Threshold
- type: cosine_f1
value: 0.6325167037861915
name: Cosine F1
- type: cosine_f1_threshold
value: 0.8004631996154785
name: Cosine F1 Threshold
- type: cosine_precision
value: 0.5144927536231884
name: Cosine Precision
- type: cosine_recall
value: 0.8208092485549133
name: Cosine Recall
- type: cosine_ap
value: 0.6019568147315763
name: Cosine Ap
- type: dot_accuracy
value: 0.6796875
name: Dot Accuracy
- type: dot_accuracy_threshold
value: 452.99359130859375
name: Dot Accuracy Threshold
- type: dot_f1
value: 0.5772357723577236
name: Dot F1
- type: dot_f1_threshold
value: 331.7955017089844
name: Dot F1 Threshold
- type: dot_precision
value: 0.445141065830721
name: Dot Precision
- type: dot_recall
value: 0.8208092485549133
name: Dot Recall
- type: dot_ap
value: 0.48922651308546194
name: Dot Ap
- type: manhattan_accuracy
value: 0.720703125
name: Manhattan Accuracy
- type: manhattan_accuracy_threshold
value: 178.9223175048828
name: Manhattan Accuracy Threshold
- type: manhattan_f1
value: 0.6370023419203747
name: Manhattan F1
- type: manhattan_f1_threshold
value: 235.0154571533203
name: Manhattan F1 Threshold
- type: manhattan_precision
value: 0.5354330708661418
name: Manhattan Precision
- type: manhattan_recall
value: 0.7861271676300579
name: Manhattan Recall
- type: manhattan_ap
value: 0.6028436956130316
name: Manhattan Ap
- type: euclidean_accuracy
value: 0.7265625
name: Euclidean Accuracy
- type: euclidean_accuracy_threshold
value: 9.467215538024902
name: Euclidean Accuracy Threshold
- type: euclidean_f1
value: 0.6327944572748269
name: Euclidean F1
- type: euclidean_f1_threshold
value: 13.209227561950684
name: Euclidean F1 Threshold
- type: euclidean_precision
value: 0.5269230769230769
name: Euclidean Precision
- type: euclidean_recall
value: 0.791907514450867
name: Euclidean Recall
- type: euclidean_ap
value: 0.6040519098735336
name: Euclidean Ap
- type: max_accuracy
value: 0.7265625
name: Max Accuracy
- type: max_accuracy_threshold
value: 452.99359130859375
name: Max Accuracy Threshold
- type: max_f1
value: 0.6370023419203747
name: Max F1
- type: max_f1_threshold
value: 331.7955017089844
name: Max F1 Threshold
- type: max_precision
value: 0.5354330708661418
name: Max Precision
- type: max_recall
value: 0.8208092485549133
name: Max Recall
- type: max_ap
value: 0.6040519098735336
name: Max Ap
- task:
type: binary-classification
name: Binary Classification
dataset:
name: Qnli dev
type: Qnli-dev
metrics:
- type: cosine_accuracy
value: 0.67578125
name: Cosine Accuracy
- type: cosine_accuracy_threshold
value: 0.8456684350967407
name: Cosine Accuracy Threshold
- type: cosine_f1
value: 0.678688524590164
name: Cosine F1
- type: cosine_f1_threshold
value: 0.7042285203933716
name: Cosine F1 Threshold
- type: cosine_precision
value: 0.553475935828877
name: Cosine Precision
- type: cosine_recall
value: 0.8771186440677966
name: Cosine Recall
- type: cosine_ap
value: 0.7168685131801189
name: Cosine Ap
- type: dot_accuracy
value: 0.63671875
name: Dot Accuracy
- type: dot_accuracy_threshold
value: 401.1481628417969
name: Dot Accuracy Threshold
- type: dot_f1
value: 0.684297520661157
name: Dot F1
- type: dot_f1_threshold
value: 335.7393798828125
name: Dot F1 Threshold
- type: dot_precision
value: 0.5609756097560976
name: Dot Precision
- type: dot_recall
value: 0.8771186440677966
name: Dot Recall
- type: dot_ap
value: 0.6371081438919197
name: Dot Ap
- type: manhattan_accuracy
value: 0.669921875
name: Manhattan Accuracy
- type: manhattan_accuracy_threshold
value: 216.93173217773438
name: Manhattan Accuracy Threshold
- type: manhattan_f1
value: 0.6819672131147542
name: Manhattan F1
- type: manhattan_f1_threshold
value: 302.77984619140625
name: Manhattan F1 Threshold
- type: manhattan_precision
value: 0.5561497326203209
name: Manhattan Precision
- type: manhattan_recall
value: 0.8813559322033898
name: Manhattan Recall
- type: manhattan_ap
value: 0.7144934262831777
name: Manhattan Ap
- type: euclidean_accuracy
value: 0.67578125
name: Euclidean Accuracy
- type: euclidean_accuracy_threshold
value: 12.61432933807373
name: Euclidean Accuracy Threshold
- type: euclidean_f1
value: 0.6787479406919276
name: Euclidean F1
- type: euclidean_f1_threshold
value: 16.70758819580078
name: Euclidean F1 Threshold
- type: euclidean_precision
value: 0.555256064690027
name: Euclidean Precision
- type: euclidean_recall
value: 0.8728813559322034
name: Euclidean Recall
- type: euclidean_ap
value: 0.7139502747319124
name: Euclidean Ap
- type: max_accuracy
value: 0.67578125
name: Max Accuracy
- type: max_accuracy_threshold
value: 401.1481628417969
name: Max Accuracy Threshold
- type: max_f1
value: 0.684297520661157
name: Max F1
- type: max_f1_threshold
value: 335.7393798828125
name: Max F1 Threshold
- type: max_precision
value: 0.5609756097560976
name: Max Precision
- type: max_recall
value: 0.8813559322033898
name: Max Recall
- type: max_ap
value: 0.7168685131801189
name: Max Ap
---
# SentenceTransformer based on microsoft/deberta-v3-base
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on the negation-triplets, [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc), [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail), [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail), xsum-pairs, [sciq_pairs](https://huggingface.co/datasets/allenai/sciq), [qasc_pairs](https://huggingface.co/datasets/allenai/qasc), openbookqa_pairs, [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3), [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions), [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa), [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq), [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws) and global_dataset datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Datasets:**
- negation-triplets
- [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc)
- [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail)
- [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail)
- xsum-pairs
- [sciq_pairs](https://huggingface.co/datasets/allenai/sciq)
- [qasc_pairs](https://huggingface.co/datasets/allenai/qasc)
- openbookqa_pairs
- [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3)
- [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions)
- [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa)
- [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq)
- [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws)
- global_dataset
- **Language:** en
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DebertaV2Model
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("bobox/DeBERTa3-base-STr-CosineWaves-checkpoints-tmp")
# Run inference
sentences = [
"What device is used to regulate a person's heart rate?",
"Devices for Arrhythmia Devices for Arrhythmia Updated:Dec 21,2016 In a medical emergency, life-threatening arrhythmias may be stopped by giving the heart an electric shock (as with a defibrillator ). For people with recurrent arrhythmias, medical devices such as a pacemaker and implantable cardioverter defibrillator (ICD) can help by continuously monitoring the heart's electrical system and providing automatic correction when an arrhythmia starts to occur. This section covers everything you need to know about these devices. Implantable Cardioverter Defibrillator (ICD)",
'vintage cleats | eBay vintage cleats: 1 2 3 4 5 eBay determines this price through a machine learned model of the product\'s sale prices within the last 90 days. eBay determines trending price through a machine learned model of the product’s sale prices within the last 90 days. "New" refers to a brand-new, unused, unopened, undamaged item, and "Used" refers to an item that has been used previously. Top Rated Plus Sellers with highest buyer ratings Returns, money back Sellers with highest buyer ratings Returns, money back',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
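The embeddings can also back a simple semantic search setup. The snippet below is a minimal sketch using the same checkpoint as above; the corpus and query are invented examples, and it relies on the standard `util.semantic_search` helper from Sentence Transformers.

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("bobox/DeBERTa3-base-STr-CosineWaves-checkpoints-tmp")

# Invented corpus and query, purely for illustration
corpus = [
    "A pacemaker continuously monitors the heart's electrical system.",
    "Nitrogen gas makes up about three-fourths of Earth's atmosphere.",
    "Moss Landing is a community of single-family homes in Washington, North Carolina.",
]
queries = ["What device is used to regulate a person's heart rate?"]

# Encode both sides and retrieve the top-2 corpus entries per query by cosine similarity
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embeddings = model.encode(queries, convert_to_tensor=True)
hits = util.semantic_search(query_embeddings, corpus_embeddings, top_k=2)

for hit in hits[0]:
    print(corpus[hit["corpus_id"]], round(hit["score"], 4))
```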
## Evaluation
### Metrics
#### Semantic Similarity
* Dataset: `sts-test`
* Evaluated with [EmbeddingSimilarityEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.8158 |
| **spearman_cosine** | **0.8602** |
| pearson_manhattan | 0.8516 |
| spearman_manhattan | 0.8545 |
| pearson_euclidean | 0.8536 |
| spearman_euclidean | 0.8552 |
| pearson_dot | 0.736 |
| spearman_dot | 0.7255 |
| pearson_max | 0.8536 |
| spearman_max | 0.8602 |
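These figures come from the `EmbeddingSimilarityEvaluator` linked above, which correlates the model's similarity scores with gold annotations. A minimal sketch of how such an evaluation could be reproduced on your own data (the sentence pairs and scores below are placeholders, not the actual STS test split):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("bobox/DeBERTa3-base-STr-CosineWaves-checkpoints-tmp")

# Placeholder pairs with gold similarity scores normalized to [0, 1]
evaluator = EmbeddingSimilarityEvaluator(
    sentences1=["A man is playing a guitar.", "A plane is taking off."],
    sentences2=["Someone plays the guitar.", "A dog runs through a field."],
    scores=[0.9, 0.1],
    name="sts-test",
)
print(evaluator(model))  # Pearson/Spearman correlations for cosine, dot, Euclidean and Manhattan
```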
#### Binary Classification
* Dataset: `allNLI-dev`
* Evaluated with [BinaryClassificationEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator)
| Metric | Value |
|:-----------------------------|:-----------|
| cosine_accuracy | 0.7188 |
| cosine_accuracy_threshold | 0.8784 |
| cosine_f1 | 0.6325 |
| cosine_f1_threshold | 0.8005 |
| cosine_precision | 0.5145 |
| cosine_recall | 0.8208 |
| cosine_ap | 0.602 |
| dot_accuracy | 0.6797 |
| dot_accuracy_threshold | 452.9936 |
| dot_f1 | 0.5772 |
| dot_f1_threshold | 331.7955 |
| dot_precision | 0.4451 |
| dot_recall | 0.8208 |
| dot_ap | 0.4892 |
| manhattan_accuracy | 0.7207 |
| manhattan_accuracy_threshold | 178.9223 |
| manhattan_f1 | 0.637 |
| manhattan_f1_threshold | 235.0155 |
| manhattan_precision | 0.5354 |
| manhattan_recall | 0.7861 |
| manhattan_ap | 0.6028 |
| euclidean_accuracy | 0.7266 |
| euclidean_accuracy_threshold | 9.4672 |
| euclidean_f1 | 0.6328 |
| euclidean_f1_threshold | 13.2092 |
| euclidean_precision | 0.5269 |
| euclidean_recall | 0.7919 |
| euclidean_ap | 0.6041 |
| max_accuracy | 0.7266 |
| max_accuracy_threshold | 452.9936 |
| max_f1 | 0.637 |
| max_f1_threshold | 331.7955 |
| max_precision | 0.5354 |
| max_recall | 0.8208 |
| **max_ap** | **0.6041** |
#### Binary Classification
* Dataset: `Qnli-dev`
* Evaluated with [BinaryClassificationEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator)
| Metric | Value |
|:-----------------------------|:-----------|
| cosine_accuracy | 0.6758 |
| cosine_accuracy_threshold | 0.8457 |
| cosine_f1 | 0.6787 |
| cosine_f1_threshold | 0.7042 |
| cosine_precision | 0.5535 |
| cosine_recall | 0.8771 |
| cosine_ap | 0.7169 |
| dot_accuracy | 0.6367 |
| dot_accuracy_threshold | 401.1482 |
| dot_f1 | 0.6843 |
| dot_f1_threshold | 335.7394 |
| dot_precision | 0.561 |
| dot_recall | 0.8771 |
| dot_ap | 0.6371 |
| manhattan_accuracy | 0.6699 |
| manhattan_accuracy_threshold | 216.9317 |
| manhattan_f1 | 0.682 |
| manhattan_f1_threshold | 302.7798 |
| manhattan_precision | 0.5561 |
| manhattan_recall | 0.8814 |
| manhattan_ap | 0.7145 |
| euclidean_accuracy | 0.6758 |
| euclidean_accuracy_threshold | 12.6143 |
| euclidean_f1 | 0.6787 |
| euclidean_f1_threshold | 16.7076 |
| euclidean_precision | 0.5553 |
| euclidean_recall | 0.8729 |
| euclidean_ap | 0.714 |
| max_accuracy | 0.6758 |
| max_accuracy_threshold | 401.1482 |
| max_f1 | 0.6843 |
| max_f1_threshold | 335.7394 |
| max_precision | 0.561 |
| max_recall | 0.8814 |
| **max_ap** | **0.7169** |
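Both binary-classification tables are produced by the `BinaryClassificationEvaluator` linked above, which searches for the best decision threshold under each similarity or distance function. A minimal sketch with invented pairs (label 1 for matching pairs, 0 otherwise):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import BinaryClassificationEvaluator

model = SentenceTransformer("bobox/DeBERTa3-base-STr-CosineWaves-checkpoints-tmp")

# Invented question/answer pairs, purely for illustration
evaluator = BinaryClassificationEvaluator(
    sentences1=[
        "What gas comprises about three-fourths of earth's atmosphere?",
        "Who painted the Mona Lisa?",
    ],
    sentences2=[
        "Nitrogen gas comprises about three-fourths of earth's atmosphere.",
        "The Eiffel Tower is located in Paris.",
    ],
    labels=[1, 0],
    name="Qnli-dev",
)
print(evaluator(model))  # accuracy, F1, precision, recall and average precision per similarity function
```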
## Training Details
### Training Datasets
#### negation-triplets
* Dataset: negation-triplets
* Size: 6,700 training samples
* Columns: `anchor`, `entailment`, and `negative`
* Approximate statistics based on the first 1000 samples:
| | anchor | entailment | negative |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string | string |
| details | min: 5 tokens<br>mean: 22.48 tokens<br>max: 83 tokens | min: 4 tokens<br>mean: 14.16 tokens<br>max: 45 tokens | min: 5 tokens<br>mean: 14.41 tokens<br>max: 40 tokens |
* Samples:
| anchor | entailment | negative |
|:---------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------|
| Géraudot is a commune of the Aube département in the north-central part of France . | Géraudot is a commune in the Aube department in north-central France . | Géraudot is a city in the Aube department in south-central France. |
| Jamie S. Rich from DVD Talk said , `` In addition to the solid writing , Avatar the Last Airbender also has amazing animation . | Jamie S. Rich from DVD Talk remarked , `` In addition to the solid writing , Avatar the Last Airbender also has amazing animation . | Jamie S. Rich from DVD Talk remarked, "In addition to the weak writing, Avatar the Last Airbender also has poor animation. |
| A person working with an artichoke in a bowl on a counter. | a person putting some grapes in a bowl | a person taking some grapes out of a bowl |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
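The same loss configuration is reused for every training dataset below: in-batch negatives filtered by a frozen guide model, with gradient caching to allow large effective batch sizes. As a rough sketch (the guide checkpoint here is a stand-in, since the card only shows that a BERT-based, CLS-pooled, normalized guide with `temperature=0.025` was used):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import CachedGISTEmbedLoss

model = SentenceTransformer("microsoft/deberta-v3-base")

# Stand-in guide model; the actual guide checkpoint is not named in this card
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# GIST-style contrastive loss: the guide filters likely false negatives,
# gradient caching keeps memory bounded for large batches
loss = CachedGISTEmbedLoss(model, guide=guide, temperature=0.025, mini_batch_size=32)
```

With the v3 `SentenceTransformerTrainer`, a dictionary of named datasets can be paired with a single loss like this one so that every dataset is trained with the same objective.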
#### vitaminc-pairs
* Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0)
* Size: 6,700 training samples
* Columns: `claim` and `evidence`
* Approximate statistics based on the first 1000 samples:
| | claim | evidence |
|:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
| type | string | string |
| details | min: 7 tokens<br>mean: 16.54 tokens<br>max: 44 tokens | min: 9 tokens<br>mean: 38.33 tokens<br>max: 184 tokens |
* Samples:
| claim | evidence |
|:---------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| William Ingraham Koch is an American billionaire . | William Ingraham Koch ( ; born May 3 , 1940 ) is an American billionaire businessman , sailor , and collector . |
| The single was number 9 on US Billboard Hot 100 and number 10 in the United Kingdom . | `` The album 's fourth single `` '' Loyal '' '' became its most successful , by peaking at number 9 on the US Billboard Hot 100 and at number 10 in the United Kingdom . '' |
| In Sucker Punch , Amber and Blondie are shot . | He shoots Amber and Blondie and attempts to rape Babydoll , but she stabs him with the kitchen knife and steals his master key . |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### scitail-pairs-qa
* Dataset: [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 6,700 training samples
* Columns: `sentence1` and `sentence2`
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | min: 7 tokens<br>mean: 15.9 tokens<br>max: 41 tokens | min: 6 tokens<br>mean: 15.07 tokens<br>max: 33 tokens |
* Samples:
| sentence1 | sentence2 |
|:-------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------|
| A heterocyclic compound contains atoms of two or more different elements in its ring structure. | What type of compound contains atoms of two or more different elements in its ring structure? |
| Tamiflu inhibits spread of virus. | What effect does tamiflu have on viruses and cells? |
| Dependent variable is the term for the affected factor in an experiment. | What is the term for the affected factor in an experiment? |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### scitail-pairs-pos
* Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 5,762 training samples
* Columns: `sentence1` and `sentence2`
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | min: 8 tokens<br>mean: 23.5 tokens<br>max: 67 tokens | min: 7 tokens<br>mean: 15.57 tokens<br>max: 39 tokens |
* Samples:
| sentence1 | sentence2 |
|:-----------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| This stored energy is called potential energy. | Energy that is stored in a person or object is called potential energy. |
| Although tornadoes may occur at any time of the year, peak tornado occurrence in Arkansas is during the spring. | Tornadoes can occur in any. |
| The sun is the ultimate source of energy in most terrestrial and marine ecosystems. | Ultimately, most life forms get their energy from the sun. |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### xsum-pairs
* Dataset: xsum-pairs
* Size: 6,700 training samples
* Columns: `document` and `summary`
* Approximate statistics based on the first 1000 samples:
| | document | summary |
|:--------|:-------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
| type | string | string |
| details | min: 55 tokens<br>mean: 217.03 tokens<br>max: 410 tokens | min: 12 tokens<br>mean: 25.26 tokens<br>max: 53 tokens |
* Samples:
| document | summary |
|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------|
| Ferry operator the Steam Packet Company said 29,241 people visited during the fortnight - up 6.3% on 2015.<br>Figures also showed 4,013 fans brought a motorbike - an increase of 4.5%.<br>The fortnight-long motorcycling festival, includes the Classic TT and Manx Grand Prix races - both held on the Mountain Course.<br>Steam Packet Chief Executive Mark Woodward said the figures were "very encouraging". | Nearly 30,000 passengers travelled to the Isle of Man by ferry for this year's Festival of Motorcycling, according to the latest figures. |
| Rakhat Aliyev, a former ambassador to Austria, is accused of killing two bank managers in his home country in 2007.<br>Kazakhstan has attempted to have him extradited to face trial, but Austria has twice refused because of the former Soviet republic's human rights record.<br>Instead, Austrian prosecutors opened their own murder investigation in 2011.<br>Mr Aliyev has denounced the case against him as politically motivated.<br>However, in June he flew voluntarily to Vienna from his home in Malta and handed himself in to the Austrian authorities. Since then, he has been held in "investigative custody".<br>On Tuesday, a court in Vienna said Mr Aliyev had been charged.<br>A spokeswoman for the court told the Reuters news agency that the judge had not set any bail option and that Mr Aliyev's lawyers had two weeks to appeal against the charges.<br>He faces at least 10 years in prison if found guilty of murder. If extradited to Kazakhstan he could face a sentence of up to 40 years.<br>Mr Aliyev was once married to Kazakh President Nursultan Nazarbayev's eldest daughter, Dariga.<br>A businessman with extensive contacts among the Kazakh elite, he spoke out against Mr Nazarbayev after being sacked as ambassador to Austria. | A former son-in-law of Kazakhstan's president who later became a prominent opponent has been charged with murder by prosecutors in Austria. |
| Four people are reported to have pushed three trolleys containing more than £1,350 worth of Lego out of a Toys R Us store in Poole, Dorset Police said.<br>Similar thefts were also reported at Smyths Toy Superstore in Bournemouth and a Tesco store in Poole.<br>Officers said the offences could be linked to similar thefts in surrounding counties and CCTV images of two men and two women have been released.<br>The Poole thefts at Tesco on Tower Park and Toys R Us on Nuffield Road took place within 10 minutes of each other on the evening of 15 May.<br>Batman Lego was among the items taken, police said.<br>The offenders made off in a silver Vauxhall Vectra with the registration BK56 XOB - known to be a false number plate from a stolen vehicle, police said.<br>The theft at Smyths Toy Superstore on Mallard Road Retail Park, Bournemouth was on the afternoon of 10 May.<br>Anyone who recognises the people in the images is urged to call the force. | Almost £3,000 worth of Lego has been stolen in targeted raids on toy shops. |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### sciq_pairs
* Dataset: [sciq_pairs](https://huggingface.co/datasets/allenai/sciq) at [2c94ad3](https://huggingface.co/datasets/allenai/sciq/tree/2c94ad3e1aafab77146f384e23536f97a4849815)
* Size: 6,700 training samples
* Columns: `sentence1` and `sentence2`
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:---------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
| type | string | string |
| details | min: 6 tokens<br>mean: 16.6 tokens<br>max: 57 tokens | min: 2 tokens<br>mean: 84.49 tokens<br>max: 512 tokens |
* Samples:
| sentence1 | sentence2 |
|:----------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| What is an upward force that fluids exert on any object that is placed in them? | Buoyant force is an upward force that fluids exert on any object that is placed in them. The ability of fluids to exert this force is called buoyancy . What explains buoyant force? A fluid exerts pressure in all directions, but the pressure is greater at greater depth. Therefore, the fluid below an object, where the fluid is deeper, exerts greater pressure on the object than the fluid above it. You can see in the Figure below how this works. Buoyant force explains why the girl pictured above can float in water. |
| The most abundant formed elements in blood, erythrocytes are red, biconcave disks packed with an oxygen-carrying compound called this? | 18.3 Erythrocytes The most abundant formed elements in blood, erythrocytes are red, biconcave disks packed with an oxygen-carrying compound called hemoglobin. The hemoglobin molecule contains four globin proteins bound to a pigment molecule called heme, which contains an ion of iron. In the bloodstream, iron picks up oxygen in the lungs and drops it off in the tissues; the amino acids in hemoglobin then transport carbon dioxide from the tissues back to the lungs. Erythrocytes live only 120 days on average, and thus must be continually replaced. Worn-out erythrocytes are phagocytized by macrophages and their hemoglobin is broken down. The breakdown products are recycled or removed as wastes: Globin is broken down into amino acids for synthesis of new proteins; iron is stored in the liver or spleen or used by the bone marrow for production of new erythrocytes; and the remnants of heme are converted into bilirubin, or other waste products that are taken up by the liver and excreted in the bile or removed by the kidneys. Anemia is a deficiency of RBCs or hemoglobin, whereas polycythemia is an excess of RBCs. |
| What is the process by which plants and animals increase in size? | |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### qasc_pairs
* Dataset: [qasc_pairs](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070)
* Size: 5,177 training samples
* Columns: `sentence1` and `sentence2`
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | min: 5 tokens<br>mean: 11.37 tokens<br>max: 23 tokens | min: 16 tokens<br>mean: 33.9 tokens<br>max: 63 tokens |
* Samples:
| sentence1 | sentence2 |
|:---------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| What is required before a chemical change? | All chemical reactions require activation energy to get started.. Chemical changes are a result of chemical reactions.<br>Activation energy must be used before a chemical change happens. |
| Endocrine hormones move from place to place through the | Endocrine hormones travel throughout the body in the blood.. For the tourists, there are many places to travel.<br>Endocrine hormones move from place to place in the body through the blood. |
| Soil nutrition can be what? | Soil can be depleted of nutrients.. For the nonrenewable resources, depletion means extraction of the available natural resources.<br>soil nutrition can be extracted |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### openbookqa_pairs
* Dataset: openbookqa_pairs
* Size: 3,029 training samples
* Columns: `question` and `fact`
* Approximate statistics based on the first 1000 samples:
| | question | fact |
|:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
| type | string | string |
| details | min: 4 tokens<br>mean: 13.72 tokens<br>max: 65 tokens | min: 5 tokens<br>mean: 11.4 tokens<br>max: 31 tokens |
* Samples:
| question | fact |
|:-------------------------------------------------------------|:---------------------------------------------------------|
| What causes direct damage to the lungs? | smoking causes direct damage to the lungs |
| Feces on the ground is an indicator of a nearby | an organism is a source of organic material |
| What time are you most likely to see a rainbow | sunlight and rain can cause a rainbow |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### msmarco_pairs
* Dataset: [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9)
* Size: 6,700 training samples
* Columns: `sentence1` and `sentence2`
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
| type | string | string |
| details | min: 4 tokens<br>mean: 8.51 tokens<br>max: 23 tokens | min: 16 tokens<br>mean: 75.53 tokens<br>max: 258 tokens |
* Samples:
| sentence1 | sentence2 |
|:----------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| can olive oil cure ear infection | Olive oil with all its below mentioned properties, helps to prevent the ear pain and infection very efficiently. 1 Olive oil, when applied on the ear, helps to decrease the irritation in the outer and inner ear to ease the pain. |
| who was william vale | Squadron Leader William Cherry Vale DFC & Bar, AFC (3 June 1914 – 29 November 1981) was a Royal Air Force (RAF) pilot who, during the Second World War, claimed 30 enemy aircraft shot down and shared in the destruction of three others, and also claiming 6 damaged and another two shared damaged. |
| what is swamp fever | Equine Infectious Anemia (Swamp Fever) Table of Contents. General Description. Equine infectious anemia (EIA), also known as swamp fever, is a potentially fatal disease caused by a virus that can infect all types of equines, including horses, mules, zebras and donkeys. |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### nq_pairs
* Dataset: [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 6,700 training samples
* Columns: `sentence1` and `sentence2`
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | min: 9 tokens<br>mean: 11.91 tokens<br>max: 23 tokens | min: 17 tokens<br>mean: 135.35 tokens<br>max: 512 tokens |
* Samples:
| sentence1 | sentence2 |
|:----------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| who plays harry from harry and the hendersons | Kevin Peter Hall Kevin Peter Hall (May 9, 1955 – April 10, 1991) was an American actor best known for his roles as the title character in the first two films in the Predator franchise and the title character of Harry in the film and television series, Harry and the Hendersons. He also appeared in the television series Misfits of Science and 227 along with the film, Without Warning. |
| where does the eustachian tube drain in the throat | Eustachian tube The Eustachian tube also drains mucus from the middle ear. Upper respiratory tract infections or allergies can cause the Eustachian tube, or the membranes surrounding its opening to become swollen, trapping fluid, which serves as a growth medium for bacteria, causing ear infections. This swelling can be reduced through the use of decongestants such as pseudoephedrine, oxymetazoline, and phenylephrine.[7] Ear infections are more common in children because the tube is horizontal and shorter, making bacterial entry easier, and it also has a smaller diameter, making the movement of fluid more difficult. In addition, children's developing immune systems and poor hygiene habits make them more prone to upper respiratory infections. |
| who played boss hogg on the dukes of hazzard | Sorrell Booke Sorrell Booke (January 4, 1930 – February 11, 1994) was an American actor who performed on stage, screen, and television. He is best known for his role as corrupt politician Jefferson Davis "Boss" Hogg in the television show The Dukes of Hazzard.[1] |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### trivia_pairs
* Dataset: [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa) at [a7c36e3](https://huggingface.co/datasets/sentence-transformers/trivia-qa/tree/a7c36e3c8c8c01526bc094d79bf80d4c848b0ad0)
* Size: 3,749 training samples
* Columns: `query` and `answer`
* Approximate statistics based on the first 1000 samples:
| | query | answer |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | min: 8 tokens<br>mean: 17.79 tokens<br>max: 64 tokens | min: 20 tokens<br>mean: 209.39 tokens<br>max: 498 tokens |
* Samples:
| query | answer |
|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Which singer and songwriter of the 1970s and 1980s was the first to have three consecutive double albums hit #1 on the Billboard charts, and was the first female artist to have four number-one singles in a thirteen-month period? | Donna Summer - Music on Google Play Donna Summer About the artist LaDonna Adrian Gaines, known by her stage name Donna Summer, was an American singer, songwriter, painter, and actress. She gained prominence during the disco era of the late-1970s. A five-time Grammy Award winner, she was the first artist to have three consecutive double albums reach No. 1 on the United States Billboard 200 chart and charted four number-one singles in the U.S. within a 12-month period. Summer has reportedly sold over 140 million records, making her one of the world's best-selling artists of all time. She also charted two number-one singles on the R&B charts in the U.S. and one number-one in the U.K. Summer earned a total of 32 hit singles on the U.S. Billboard Hot 100 chart in her lifetime, with 14 of those reaching the top ten. She claimed a top 40 hit every year between 1975 and 1984, and from her first top ten hit in 1976, to the end of 1982, she had 12 top ten hits; more than any other act. She returned to the Hot 100's top five in 1983, and claimed her final top ten hit in 1989 with "This Time I Know It's for Real". Her most recent Hot 100 hit came in 1999 with "I Will Go With You". |
| What type of car did Burt Reynolds drive in the 1977 film ‘Smokey and the Bandit’? | What Cars Did They Drive In 'Smokey and The Bandit' Share on Twitter Perhaps more famous then the actual actors themselves, was the car driven by Burt Reynold’s character Bo Darville the “Bandit”. We got a call this morning from a woman who had claimed they had a car for sale that was the same car that Jackie Gleason’s character Sheriff Buford T. Justice.We decided to do some digging around the Internet to see exactly what kind of cars they were driving. Bo Darville The “Bandit” Flickr Well it turns out there is some debate raging across the Internet about whether or not Burt Reynolds was driving a 1976 or a 1977. But there was never any doubt to the model. Burt cruised through the movie inside a Pontiac Trans Am, with the now iconic Black and Gold paint scheme. 1 Sheriff Buford T. Justice What’s The Bandit with out being chased by the Sheriff? Of course Buford had to have something that could almost keep up with the Bandit as they raced across several states. The Sheriff’s vehicle was a 1977 Ponitac LeMans. |
| In 1985, which funny man was the first UK citizen to make a mobile phone call? | The call that changed our world: Blue plaque to mark site where a mobile phone was first used in the UK – The Sun SMARTPHONES may have changed the way we communicate and view the world – but the first mobile phone, which was heavier than a new-born baby, was no less groundbreaking. On New Year’s Day in 1985, the UK’s first official mobile call was made by comedian Ernie Wise from St Katharine Docks in London to the Vodafone offices in Newbury, Berkshire 65 miles away. This historic moment, which marked the start of the mobile age, is to be commemorated with a blue plaque in the town of Berkshire where Vodafone was founded and still resides today. Vodafone Newbury Town Council has applied for formal planning permission for the plaque, which will be placed at Thames Court, and will feature the wording “The first official mobile telephone call in the UK was made to Vodafone offices close to this site on January 1, 1985.” The phone which TV favourite Ernie, one half of legendary double act Morecambe and Wise, used to call Vodafone’s Sir Michael Harrison bears little comparison to the sophisticated devices we carry today. Weighing an incredible 11lbs, the equivalent of five bags of sugar, and costing the equivalent of £5,000, no one standing next to the diminutive funny man could have known the significance of what they were witnessing. |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
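
The same loss configuration is repeated for every training and evaluation subset below: a `CachedGISTEmbedLoss` whose in-batch negatives are filtered by a separate guide encoder, with the temperature shown in the JSON above. As a minimal sketch of how such a loss is typically built with the `sentence-transformers` API (the guide checkpoint name is a placeholder; the card only shows that the guide is a BERT-based model with CLS pooling and normalization):

```python
# Sketch only, not the exact training script used for this card.
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import CachedGISTEmbedLoss

model = SentenceTransformer("your-base-checkpoint")  # placeholder: the model being trained
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder guide encoder

loss = CachedGISTEmbedLoss(
    model=model,
    guide=guide,        # guide model used to down-weight/remove false in-batch negatives
    temperature=0.025,  # matches the temperature shown in the loss parameters above
)
```

The cached variant lets the effective batch size stay large (here 96 per device) without holding all activations in memory at once.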
#### gooaq_pairs
* Dataset: [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
* Size: 6,700 training samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:
  | | sentence1 | sentence2 |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 8 tokens, mean: 11.34 tokens, max: 19 tokens | min: 15 tokens, mean: 57.57 tokens, max: 143 tokens |
* Samples:
  | sentence1 | sentence2 |
  |:---|:---|
  | why is my urine bright yellow when pregnant? | It can range from an intense bright yellow to a darker, almost orange-yellow colour. The colour of the urine is caused by the pigment urochrome, which is also known as urobilin. Urobilin is made when the body breaks down haemoglobin from dead red blood cells. |
  | are vw beetles being discontinued? | Volkswagen is discontinuing the iconic Beetle after 80 years on the market. Volkswagen announced it will stop worldwide production of the iconic Beetle by summer 2019. The move comes as the auto market continues its march toward SUVs and crossover vehicles. |
  | do seth and summer break up? | Summer and Seth broke up quite a few times for numerous reasons including Seth running away and Summer moving on and starting a relationship with Zach, Summer thinking Seth was seeing Anna again when really she was just trying to help him get into college to be near Summer, in the end they get back together and end up ... |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### paws-pos
* Dataset: [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws) at [161ece9](https://huggingface.co/datasets/google-research-datasets/paws/tree/161ece9501cf0a11f3e48bd356eaa82de46d6a09)
* Size: 6,700 training samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:
  | | sentence1 | sentence2 |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 9 tokens, mean: 25.71 tokens, max: 56 tokens | min: 9 tokens, mean: 25.7 tokens, max: 54 tokens |
* Samples:
  | sentence1 | sentence2 |
  |:---|:---|
  | The bay ends at Quinte West ( Trenton ) and the Trent River , both also on the north side . | The bay ends at Trenton ( Quinte West ) and the Trent River , both on the north side . |
  | The first main span was positioned in 1857 and the finished bridge was opened on 2 May 1859 by Prince Albert . | The first main span was positioned in 1857 and the completed bridge was opened by Prince Albert on 2 May 1859 . |
  | The reactance is defined as the imaginary part of electrical impedance and is equal , but generally not analogous to reversing the susceptance . | Reactance is defined as the imaginary part of Electrical impedance , and is equal but not generally analogous to the inverse of the susceptance . |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### global_dataset
* Dataset: global_dataset
* Size: 45,228 training samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:
  | | sentence1 | sentence2 |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 4 tokens, mean: 29.03 tokens, max: 329 tokens | min: 2 tokens, mean: 51.81 tokens, max: 512 tokens |
* Samples:
  | sentence1 | sentence2 |
  |:---|:---|
  | what is emla cream used for | EMLA cream for local anaesthesia. This leaflet is about the use of EMLA cream. The cream is used to make an area of skin numb, which is called local anaesthesia. It may be used before taking blood with a needle or putting in a drip (cannula), or before a small surgical procedure that might be painful. |
  | Adana Province , Turkey is a village in the District of Yüreğir , Düzce . | The province of Adana , Turkey is a village in the district of Yüreğir , Düzce . |
  | A second factor is the journalistic culture of the news conference, which rewards zinger questions that provoke news--and discourages anything that courts televised dullness. | One of the factors is the culture of journalistic news conferences which discourages dullness. |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
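
Each training subset above ("gooaq_pairs", "paws-pos", "global_dataset", and the other sections of this card) is trained jointly by passing a dictionary of named datasets to the trainer, with a loss per subset. The sketch below is illustrative only: the loading calls and any column handling are assumptions, and the subset sizes simply mirror the numbers listed in this card.

```python
# Illustrative sketch of the multi-dataset layout; not the exact preprocessing used here.
from datasets import load_dataset

train_dataset = {
    "gooaq_pairs": load_dataset("sentence-transformers/gooaq", split="train").select(range(6_700)),
    "paws-pos": load_dataset("google-research-datasets/paws", "labeled_final", split="train"),
    # ... plus the remaining subsets described in this card, including the mixed "global_dataset"
}

# One loss object can be shared across all subsets, or each name can map to its own loss.
losses = {name: loss for name in train_dataset}  # `loss` as in the CachedGISTEmbedLoss sketch above
```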
### Evaluation Datasets
#### vitaminc-pairs
* Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0)
* Size: 128 evaluation samples
* Columns: claim and evidence
* Approximate statistics based on the first 1000 samples:
  | | claim | evidence |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 9 tokens, mean: 21.42 tokens, max: 41 tokens | min: 11 tokens, mean: 35.55 tokens, max: 79 tokens |
* Samples:
  | claim | evidence |
  |:---|:---|
  | Dragon Con had over 5000 guests . | Among the more than 6000 guests and musical performers at the 2009 convention were such notables as Patrick Stewart , William Shatner , Leonard Nimoy , Terry Gilliam , Bruce Boxleitner , James Marsters , and Mary McDonnell . |
  | COVID-19 has reached more than 185 countries . | As of , more than cases of COVID-19 have been reported in more than 190 countries and 200 territories , resulting in more than deaths . |
  | In March , Italy had 3.6x times more cases of coronavirus than China . | As of 12 March , among nations with at least one million citizens , Italy has the world 's highest per capita rate of positive coronavirus cases at 206.1 cases per million people ( 3.6x times the rate of China ) and is the country with the second-highest number of positive cases as well as of deaths in the world , after China . |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### negation-triplets
* Dataset: negation-triplets
* Size: 128 evaluation samples
* Columns: anchor, entailment, and negative
* Approximate statistics based on the first 1000 samples:
  | | anchor | entailment | negative |
  |:---|:---|:---|:---|
  | type | string | string | string |
  | details | min: 8 tokens, mean: 14.66 tokens, max: 35 tokens | min: 6 tokens, mean: 12.3 tokens, max: 22 tokens | min: 6 tokens, mean: 12.62 tokens, max: 23 tokens |
* Samples:
  | anchor | entailment | negative |
  |:---|:---|:---|
  | An open door leading to a bathroom toilet, with a shower rack and bureau visible. | A bathroom has a red circular rug by the toilet. | A bathroom has no red circular rug by the toilet. |
  | A child wearing a red top is standing behind a blond headed child sitting in a wheelbarrow. | A child wearing a red top is standing behind a blond headed child | A child wearing a red top is standing far from a blond headed child |
  | Two women waiting at a bench next to a street. | A woman sitting on a bench and a woman standing waiting for the bus. | A woman sitting on a bench and a woman walking away from the bus stop. |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### scitail-pairs-pos
* Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 128 evaluation samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:
  | | sentence1 | sentence2 |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 9 tokens, mean: 20.28 tokens, max: 56 tokens | min: 8 tokens, mean: 15.48 tokens, max: 23 tokens |
* Samples:
  | sentence1 | sentence2 |
  |:---|:---|
  | humans normally have 23 pairs of chromosomes. | Humans typically have 23 pairs pairs of chromosomes. |
  | A solution is a homogenous mixture of two or more substances that exist in a single phase. | Solution is the term for a homogeneous mixture of two or more substances. |
  | Upwelling The physical process in near-shore ocean systems of rising of nutrients and colder bottom waters to the surface because of constant wind patterns along the shoreline. | Upwelling is the term for when deep ocean water rises to the surface. |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### scitail-pairs-qa
* Dataset: [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 128 evaluation samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:
  | | sentence1 | sentence2 |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 7 tokens, mean: 15.95 tokens, max: 37 tokens | min: 8 tokens, mean: 15.4 tokens, max: 34 tokens |
* Samples:
  | sentence1 | sentence2 |
  |:---|:---|
  | A lipid is one of a highly diverse group of compounds made up mostly of hydrocarbons. | A lipid is one of a highly diverse group of compounds made up mostly of what? |
  | Sugar crystals dissolving in water is an example of the formation of a mixture. | Which of the following is an example of the formation of a mixture? |
  | Starches are complex carbohydrates that are the polymers of glucose. | What complex carbohydrates are the polymers of glucose? |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### xsum-pairs
* Dataset: xsum-pairs
* Size: 128 evaluation samples
* Columns: document and summary
* Approximate statistics based on the first 1000 samples:
  | | document | summary |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 64 tokens, mean: 223.53 tokens, max: 399 tokens | min: 13 tokens, mean: 25.44 tokens, max: 46 tokens |
* Samples:
  | document | summary |
  |:---|:---|
  | Former Northampton South MP Tony Clarke said he feared that if the petition was successful the council could lose the £10.25m it is owed by the club. HM Revenue and Customs is owed £166,000 by the club. The borough council said it was seeking a meeting with HMRC. Last week it was revealed that players and staff at Northampton Town Football Club have not been paid due to its financial problems. Manager Chris Wilder paid tribute to the attitude of people who work at the League Two club, saying it has brought them all "much closer together". Mr Clarke, who is also a former director of the club, said: "It absolutely imperative the council objects to the winding-up petition in two weeks time. "If they don't and Northampton Town Football Club goes into receivership then the council won't recover any of its money." He called for the council to be "positive" and "ask for an adjournment" to the winding-up hearing so the club supporters and staff have more time to help save the club. On Monday, councillors backed a motion calling for it to do "whatever we can to help" the football club and the Supporters Trust. It also called for the £10.25m of public money to be "retrieved" and for its audit committee to review its policies and practices. | A former MP has called on Northampton Borough Council to oppose a winding-up petition sought by HM Revenue and Customs (HMCRC) against the town's football club. |
  | Tokyo's Nikkei 225 closed down 1.32% to 16,819.59 points as a stronger yen against the dollar hurt the country's big exporters for a second day. Toyota and Honda shares finished the day down about 2%, while Mazda shed nearly 5%. At the close of trade, Toyota reported a 4.7% rise in net income for the three months to December. However, the firm's operating profit for the quarter fell by 5.3%, missing forecasts. Australia's S&P/ASX 200 spent the day in negative territory and closed flat, down 0.08% to 16,819.59. The country's big lenders had weighed on the market and analysts said traders were being cautious ahead of a US jobs report due out later. Energy firms regained lost territory late in the day, however, with Woodside finishing up 0.41% and rival Santos up 2.2%. Mining giant BHP finished up close to 5%. Official numbers released earlier showed Australia's retail sales had come in flat for the month of December - a 0.4% gain was expected. But analysts said the numbers still supported economic growth. "December quarter real retail sales rose by 0.6%, which was less than expected, but similar to the last few quarters," said AMP Capital's head economist Shane Oliver. "It implies that consumer spending has again helped support December quarter GDP growth," he added. In Hong Kong, the Hang Seng was up 0.4% to 19,255.88 points in afternoon trade, while the mainland's benchmark Shanghai Composite closed down 0.63% to 2,763.49. South Korea's Kospi index closed flat, up just 0.08% to 1,917.79. | Shares in Asia were in mixed territory on Friday ahead of a closely watched US monthly jobs report. |
  | In February, Badminton was among seven Olympic sports to lose funding despite Chris Langridge and Marcus Ellis winning doubles bronze at Rio 2016. "We are working through an unprecedented financial situation as a consequence of the recent funding decisions," Badminton England performance director Jon Austin said. The Sudirman Cup starts on 21 May. The event in Gold Coast, Australia, is seen as an unofficial test event for English players ahead of next year's Commonwealth Games to be held in the same city. England finished ninth in the last edition of the World Mixed Team Championships, which take place every two years. "We have had to consider the investments we make very carefully," Austin added. "The pressures we are facing right now, both through the people resource and financial investment needed, means we are regrettably not in a position to commit to the Sudirman Cup this year." Badminton England received around £5.5m between London 2012 and 2016 and was left "staggered" when it had its funding pulled, despite beating a Rio Games performance target set by elite sport funding body UK Sport. | Badminton England has withdrawn from May's World Mixed Team Championships citing government funding cuts. |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### sciq_pairs
* Dataset: [sciq_pairs](https://huggingface.co/datasets/allenai/sciq) at [2c94ad3](https://huggingface.co/datasets/allenai/sciq/tree/2c94ad3e1aafab77146f384e23536f97a4849815)
* Size: 128 evaluation samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:
  | | sentence1 | sentence2 |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 8 tokens, mean: 16.32 tokens, max: 42 tokens | min: 2 tokens, mean: 89.04 tokens, max: 512 tokens |
* Samples:
  | sentence1 | sentence2 |
  |:---|:---|
  | Poetically speaking, nature reserves are islands of what, in a sea of habitat degraded by human activity? | |
  | Multiplying the linear momentum of a spinning object by the radius calculates what? | The angular momentum of a spinning object can be found in two equivalent ways. Just like linear momentum, one way, shown in the first equation, is to multiply the moment of inertia, the rotational analog of mass, with the angular velocity. The other way is simply multiplying the linear momentum by the radius, as shown in the second equation. |
  | What is the term for the intentional release or spread of agents of disease? | Bioterrorism is the intentional release or spread of agents of disease. |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### qasc_pairs
* Dataset: [qasc_pairs](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070)
* Size: 128 evaluation samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:
  | | sentence1 | sentence2 |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 6 tokens, mean: 11.82 tokens, max: 22 tokens | min: 17 tokens, mean: 34.68 tokens, max: 57 tokens |
* Samples:
  | sentence1 | sentence2 |
  |:---|:---|
  | Operating an automobile usually requires what from gasoline? | operating an automobile usually requires fossil fuels. Energy pollution Cars operate mainly on gasoline, a fossil fuel. Operating an automobile usually requires energy pollution from gasoline. |
  | Which unit could a graduated cylinder measure in? | a graduated cylinder is used to measure volume of a liquid. Liquid volume is measured using a unit called a liter. A graduated cyliner measures in liters |
  | What can decompose wood? | Fungi are the only organisms that can decompose wood.. Mushrooms are organisms known as fungi. mushrooms can decompose wood |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### openbookqa_pairs
* Dataset: openbookqa_pairs
* Size: 128 evaluation samples
* Columns: question and fact
* Approximate statistics based on the first 1000 samples:
  | | question | fact |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 3 tokens, mean: 13.98 tokens, max: 47 tokens | min: 4 tokens, mean: 11.78 tokens, max: 28 tokens |
* Samples:
  | question | fact |
  |:---|:---|
  | The thermal production of a stove is generically used for | a stove generates heat for cooking usually |
  | What creates a valley? | a valley is formed by a river flowing |
  | when it turns day and night on a planet, what cause this? | a planet rotating causes cycles of day and night on that planet |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### msmarco_pairs
* Dataset: [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9)
* Size: 128 evaluation samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:
  | | sentence1 | sentence2 |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 5 tokens, mean: 8.73 tokens, max: 20 tokens | min: 26 tokens, mean: 76.41 tokens, max: 182 tokens |
* Samples:
  | sentence1 | sentence2 |
  |:---|:---|
  | what is a normal osmolar gap | The osmolar gap is the difference between the calculated serum osmolarity and the measured serum osmolarity. The normal osmolar gap is 10-15 mmol/L H20 .The osmolar gap is increased in the presence of low molecular weight substances that are not included in the formula for calculating plasma osmolarity. |
  | how old is kendrick lamar | The 27-year-old rapper has a broad smile on his face. He seems almost as excited as the students, who just might be having their best day of school ... ever. Lamar is on top of the rap game at the moment. His latest album, To Pimp a Butterfly, came out earlier this year and debuted at No. 1 on Billboard's albums chart. |
  | modific definition | (ˌmɒdɪfɪˈkeɪʃən) n. 1. the act of modifying or the condition of being modified. 2. something modified; the result of a modification. 3. a small change or adjustment. 4. (Grammar) grammar the relation between a modifier and the word or phrase that it modifies. ˌmodifiˈcatory, ˌmodifiˈcative adj. |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### nq_pairs
* Dataset: [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 128 evaluation samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:
  | | sentence1 | sentence2 |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 10 tokens, mean: 11.68 tokens, max: 19 tokens | min: 25 tokens, mean: 135.38 tokens, max: 466 tokens |
* Samples:
  | sentence1 | sentence2 |
  |:---|:---|
  | patch and kayla on days of our lives | Steve Johnson and Kayla Brady Steve "Patch" Earl Johnson and Dr. Kayla Caroline Brady are a supercouple on the American soap opera Days of Our Lives. Steve is portrayed by Stephen Nichols and Kayla is portrayed by Mary Beth Evans. On the Internet message boards[5] the couple is often referred to by the portmanteau "Stayla" (for Steve and Kayla). The couple was initially popular from 1986 through 1990 until the "death" of Steve. Both characters have recently returned: after Steve being presumed dead for 16 years, Steve returned to the show on June 9, 2006; Kayla returned on June 12, 2006. Steve and Kayla were dropped off canvas in February 2009. Kayla returned in December 2011. In August 2015, Steve returned to Salem, and the couple reunited soon after |
  | when does the tour de france usually start | Tour de France Traditionally, the race is held primarily in the month of July. While the route changes each year, the format of the race stays the same with the appearance of time trials,[1] the passage through the mountain chains of the Pyrenees and the Alps, and the finish on the Champs-Élysées in Paris.[7][8] The modern editions of the Tour de France consist of 21 day-long segments (stages) over a 23-day period and cover around 3,500 kilometres (2,200 mi).[9] The race alternates between clockwise and counterclockwise circuits of France.[10] |
  | what's a dell from farmer in the dell | The Farmer in the Dell A dell is a wooded valley. In the Dutch language, the word deel means, among other things, the workspace in a farmer's barn. The use of dell in this song may be a bastardisation of this term.[citation needed] |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### trivia_pairs
* Dataset: [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa) at [a7c36e3](https://huggingface.co/datasets/sentence-transformers/trivia-qa/tree/a7c36e3c8c8c01526bc094d79bf80d4c848b0ad0)
* Size: 128 evaluation samples
* Columns: query and answer
* Approximate statistics based on the first 1000 samples:
  | | query | answer |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 8 tokens, mean: 18.45 tokens, max: 47 tokens | min: 38 tokens, mean: 207.95 tokens, max: 411 tokens |
* Samples:
  | query | answer |
  |:---|:---|
  | Messer is German for which item of cutlery? | messer knife \| eBay messer knife: 1 2 3 4 5 eBay determines this price through a machine learned model of the product's sale prices within the last 90 days. eBay determines trending price through a machine learned model of the product’s sale prices within the last 90 days. "New" refers to a brand-new, unused, unopened, undamaged item, and "Used" refers to an item that has been used previously. Top Rated Plus Sellers with highest buyer ratings Returns, money back Sellers with highest buyer ratings Returns, money back Please enter a minimum and/or maximum price before continuing. $ *Learn about pricing Amounts shown in italicized text are for items listed in currency other than U.S. dollars and are approximate conversions to U.S. dollars based upon Bloomberg's conversion rates. For more recent exchange rates, please use the Universal Currency Converter This page was last updated: Jan-19 18:55. Number of bids and bid amounts may be slightly out of date. See each listing for international shipping options and costs. |
  | American Presidents assassinated in office? | Question - Number of Presidents Who Were Assassinated All four of them were killed by shooting. Interestingly enough, they were also all victims of Tecumseh's Curse . Learn More: |
  | Which Chicago building was formerly known as the Sears Tower? | Willis Tower DISCOVER THE WARM PERSONALITY OF THIS IMPRESSIVE ADDRESS. Welcome to Willis Tower, where there is more than meets the skyline. A bustling community of business, tourism and culture, Willis Tower is so much more than North America's tallest building. It’s home to large well-known companies as well as hundreds of thriving businesses run by smart, passionate people. More than an office building, it’s a cultural landmark and iconic Chicago tourist attraction. Willis Tower is a pivotal point of reference – from across town, from financial centers on both coasts, and from Europe, Asia, and the Middle East. It’s a building with retail and commercial office space at heart, but also inspires tens of thousands of visitors to take in the amazing views of the city and experience the breathtaking Ledge. 110 FLOORS. COUNTLESS STORIES. |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### gooaq_pairs
* Dataset: [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
* Size: 128 evaluation samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:
  | | sentence1 | sentence2 |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 8 tokens, mean: 11.48 tokens, max: 18 tokens | min: 16 tokens, mean: 58.07 tokens, max: 108 tokens |
* Samples:
  | sentence1 | sentence2 |
  |:---|:---|
  | what is the different kinds of antivirus? | ['Malware signature antivirus. Malware, or malicious software, installs viruses and spyware on your computer or device without your knowledge. ... ', 'System monitoring antivirus. This is where system monitoring antivirus software comes into play. ... ', 'Machine learning antivirus.'] |
  | 1. what is the difference between quantitative and qualitative research proposal? | Qualitative research identifies abstract concepts while quantitative research collects numerical data. But the substantial difference is in the type of action applied and in the size of the sample (respondents). |
  | how to calculate efn finance? | Instead of preparing a set of forecasted financial statements, you can also calculate your external financing needs (EFN) by using a formula that looks at three changes: 1. Required increases to assets given a change in sales. Formula = (A/S) x (Δ Sales). |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### paws-pos
* Dataset: [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws) at [161ece9](https://huggingface.co/datasets/google-research-datasets/paws/tree/161ece9501cf0a11f3e48bd356eaa82de46d6a09)
* Size: 128 evaluation samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:
  | | sentence1 | sentence2 |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 10 tokens, mean: 25.72 tokens, max: 42 tokens | min: 10 tokens, mean: 25.55 tokens, max: 41 tokens |
* Samples:
  | sentence1 | sentence2 |
  |:---|:---|
  | They were there to enjoy us and they were there to pray for us . | They were there for us to enjoy and they were there for us to pray . |
  | After the end of the war in June 1902 , Higgins left Southampton in the `` SSBavarian '' in August , returning to Cape Town the following month . | In August , after the end of the war in June 1902 , Higgins Southampton left the `` SSBavarian '' and returned to Cape Town the following month . |
  | From the merger of the Four Rivers Council and the Audubon Council , the Shawnee Trails Council was born . | Shawnee Trails Council was formed from the merger of the Four Rivers Council and the Audubon Council . |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### global_dataset
* Dataset: global_dataset
* Size: 416 evaluation samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:
  | | sentence1 | sentence2 |
  |:---|:---|:---|
  | type | string | string |
  | details | min: 5 tokens, mean: 31.67 tokens, max: 399 tokens | min: 2 tokens, mean: 58.97 tokens, max: 503 tokens |
* Samples:
  | sentence1 | sentence2 |
  |:---|:---|
  | More than 478,000 cases and more than 21,500 deaths have been reported worldwide . | more than 478,000 cases have been reported worldwide ; more than 21,500 people have died and more than 114,000 have recovered. |
  | Solutions are homogenous mixtures of two or more substances. | Solution is the term for a homogeneous mixture of two or more substances. |
  | What celestial object has been visited by manned spacecraft and is easily seen from earth? | The Moon is easily seen from Earth. Early astronomers used telescopes to study and map its surface. The Moon has also seen a great number of satellites, rovers, and orbiters. After all, it is relatively easy to get spacecraft to the satellite. Also, before humans could be safely sent to the Moon, many studies and experiments had to be completed. |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 96
- `per_device_eval_batch_size`: 68
- `learning_rate`: 3.5e-05
- `weight_decay`: 0.0005
- `num_train_epochs`: 2
- `lr_scheduler_type`: cosine_with_min_lr
- `lr_scheduler_kwargs`: {'num_cycles': 3.5, 'min_lr': 1.5e-05}
- `warmup_ratio`: 0.33
- `save_safetensors`: False
- `fp16`: True
- `push_to_hub`: True
- `hub_model_id`: bobox/DeBERTa3-base-STr-CosineWaves-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `batch_sampler`: no_duplicates
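
Expressed in code, the non-default values above map directly onto `SentenceTransformerTrainingArguments`. This is a hedged sketch rather than the literal training script; in particular, `output_dir` is a placeholder and every other value is copied from the list above.

```python
# Sketch of the non-default hyperparameters as SentenceTransformerTrainingArguments.
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="checkpoints",  # placeholder
    eval_strategy="steps",
    per_device_train_batch_size=96,
    per_device_eval_batch_size=68,
    learning_rate=3.5e-5,
    weight_decay=0.0005,
    num_train_epochs=2,
    lr_scheduler_type="cosine_with_min_lr",
    lr_scheduler_kwargs={"num_cycles": 3.5, "min_lr": 1.5e-5},
    warmup_ratio=0.33,
    save_safetensors=False,
    fp16=True,
    push_to_hub=True,
    hub_model_id="bobox/DeBERTa3-base-STr-CosineWaves-checkpoints-tmp",
    hub_strategy="all_checkpoints",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```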
#### All Hyperparameters
Click to expand
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 96
- `per_device_eval_batch_size`: 68
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 3.5e-05
- `weight_decay`: 0.0005
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 2
- `max_steps`: -1
- `lr_scheduler_type`: cosine_with_min_lr
- `lr_scheduler_kwargs`: {'num_cycles': 3.5, 'min_lr': 1.5e-05}
- `warmup_ratio`: 0.33
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: False
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: True
- `resume_from_checkpoint`: None
- `hub_model_id`: bobox/DeBERTa3-base-STr-CosineWaves-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
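
Tying the pieces together, a `SentenceTransformerTrainer` takes the argument object, the dictionary of named training subsets, and the per-subset losses. The sketch below only illustrates the wiring; `model`, `train_dataset`, `losses`, and `args` refer to the earlier hedged sketches and are not the exact script behind this card. The `multi_dataset_batch_sampler: proportional` setting above means batches are drawn from the named subsets in proportion to their sizes.

```python
# Illustrative wiring of the trainer, reusing objects from the sketches above.
from sentence_transformers import SentenceTransformerTrainer

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,  # dict of named subsets
    loss=losses,                  # per-subset CachedGISTEmbedLoss
)
trainer.train()
```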
### Training Logs
Click to expand
| Epoch | Step | Training Loss | negation-triplets loss | vitaminc-pairs loss | sciq pairs loss | openbookqa pairs loss | msmarco pairs loss | paws-pos loss | qasc pairs loss | scitail-pairs-pos loss | trivia pairs loss | scitail-pairs-qa loss | nq pairs loss | gooaq pairs loss | global dataset loss | xsum-pairs loss | Qnli-dev_max_ap | allNLI-dev_max_ap | sts-test_spearman_cosine |
|:------:|:----:|:-------------:|:----------------------:|:-------------------:|:---------------:|:---------------------:|:------------------:|:-------------:|:---------------:|:----------------------:|:-----------------:|:---------------------:|:-------------:|:----------------:|:-------------------:|:---------------:|:---------------:|:-----------------:|:------------------------:|
| 0.0008 | 1 | 6.6871 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0016 | 2 | 5.8546 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0023 | 3 | 6.576 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0031 | 4 | 6.8301 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0039 | 5 | 5.2308 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0047 | 6 | 6.3588 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0054 | 7 | 6.8417 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0062 | 8 | 10.3916 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0070 | 9 | 5.5136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0078 | 10 | 7.4726 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0085 | 11 | 5.5414 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0093 | 12 | 4.8507 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0101 | 13 | 4.9058 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0109 | 14 | 6.2301 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0116 | 15 | 7.9813 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0124 | 16 | 5.5269 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0132 | 17 | 10.0786 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0140 | 18 | 9.6075 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0147 | 19 | 4.671 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0155 | 20 | 6.945 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0163 | 21 | 6.233 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0171 | 22 | 5.5828 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0178 | 23 | 6.6991 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0186 | 24 | 5.9928 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0194 | 25 | 4.8443 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0202 | 26 | 11.0738 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0209 | 27 | 6.4076 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0217 | 28 | 6.2668 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0225 | 29 | 5.092 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0233 | 30 | 5.9955 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0240 | 31 | 6.6071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0248 | 32 | 4.7984 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0256 | 33 | 6.173 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0264 | 34 | 5.9604 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0272 | 35 | 6.3435 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0279 | 36 | 6.0394 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0287 | 37 | 6.0135 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0295 | 38 | 8.0721 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0303 | 39 | 6.2283 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0310 | 40 | 5.365 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0318 | 41 | 6.0378 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0326 | 42 | 6.5136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0334 | 43 | 5.6955 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0341 | 44 | 6.1769 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0349 | 45 | 6.302 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0357 | 46 | 6.0792 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0365 | 47 | 5.4317 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0372 | 48 | 5.7632 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0380 | 49 | 4.6767 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0388 | 50 | 6.1871 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0396 | 51 | 5.572 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0403 | 52 | 7.351 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0411 | 53 | 5.6907 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0419 | 54 | 5.7986 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0427 | 55 | 5.9598 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0434 | 56 | 6.024 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0442 | 57 | 5.328 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0450 | 58 | 5.5545 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0458 | 59 | 5.3813 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0465 | 60 | 5.3023 | 5.3504 | 3.9951 | 0.9480 | 5.4622 | 8.8025 | 2.7421 | 6.7080 | 2.7031 | 6.3097 | 3.3195 | 7.8914 | 6.8511 | 5.3453 | 5.5659 | 0.6173 | 0.3812 | 0.2096 |
| 0.0473 | 61 | 3.5705 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0481 | 62 | 5.8784 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0489 | 63 | 8.5966 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0497 | 64 | 5.9225 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0504 | 65 | 4.6667 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0512 | 66 | 6.12 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0520 | 67 | 5.2713 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0528 | 68 | 6.9483 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0535 | 69 | 3.6892 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0543 | 70 | 7.5618 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0551 | 71 | 5.8086 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0559 | 72 | 6.517 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0566 | 73 | 5.5826 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0574 | 74 | 6.5543 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0582 | 75 | 6.5721 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0590 | 76 | 5.4454 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0597 | 77 | 6.3208 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0605 | 78 | 6.5299 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0613 | 79 | 7.1424 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0621 | 80 | 5.1799 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0628 | 81 | 6.3358 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0636 | 82 | 7.9256 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0644 | 83 | 5.4606 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0652 | 84 | 5.6861 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0659 | 85 | 3.6595 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0667 | 86 | 5.3626 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0675 | 87 | 5.7214 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0683 | 88 | 3.7724 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0690 | 89 | 6.8782 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0698 | 90 | 7.1858 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0706 | 91 | 6.1052 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0714 | 92 | 5.6906 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0721 | 93 | 6.3279 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0729 | 94 | 6.6561 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0737 | 95 | 5.634 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0745 | 96 | 5.6729 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0753 | 97 | 5.1085 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0760 | 98 | 6.7441 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0768 | 99 | 6.2151 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0776 | 100 | 6.6636 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0784 | 101 | 5.3276 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0791 | 102 | 5.1297 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0799 | 103 | 5.0139 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0807 | 104 | 5.029 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0815 | 105 | 5.605 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0822 | 106 | 5.0171 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0830 | 107 | 3.6544 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0838 | 108 | 5.9355 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0846 | 109 | 5.7908 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0853 | 110 | 5.2447 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0861 | 111 | 6.5699 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0869 | 112 | 4.8604 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0877 | 113 | 5.6599 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0884 | 114 | 4.9483 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0892 | 115 | 5.7634 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0900 | 116 | 5.0934 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0908 | 117 | 4.5253 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0915 | 118 | 4.6447 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0923 | 119 | 5.5944 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0931 | 120 | 4.4379 | 5.2776 | 3.9060 | 0.8786 | 5.2407 | 6.3303 | 2.6328 | 5.1705 | 2.5746 | 5.4142 | 3.3100 | 5.9142 | 5.5463 | 4.5579 | 5.1207 | 0.6152 | 0.4023 | 0.2319 |
| 0.0939 | 121 | 4.9112 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0946 | 122 | 4.6273 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0954 | 123 | 5.5262 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0962 | 124 | 4.837 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0970 | 125 | 6.3641 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0978 | 126 | 4.6542 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0985 | 127 | 3.2895 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0993 | 128 | 5.5488 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1001 | 129 | 6.3668 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1009 | 130 | 6.0549 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1016 | 131 | 4.7537 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1024 | 132 | 4.245 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1032 | 133 | 4.3526 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1040 | 134 | 2.9629 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1047 | 135 | 4.4471 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1055 | 136 | 4.5066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1063 | 137 | 4.7698 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1071 | 138 | 4.706 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1078 | 139 | 4.4883 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1086 | 140 | 6.0553 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1094 | 141 | 5.8958 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1102 | 142 | 4.1842 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1109 | 143 | 6.3887 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1117 | 144 | 4.0725 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1125 | 145 | 4.6545 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1133 | 146 | 6.1092 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1140 | 147 | 3.7272 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1148 | 148 | 4.1651 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1156 | 149 | 3.8952 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1164 | 150 | 4.6196 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1171 | 151 | 3.7705 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1179 | 152 | 6.073 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1187 | 153 | 3.7738 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1195 | 154 | 3.6523 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1202 | 155 | 5.4226 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1210 | 156 | 4.5508 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1218 | 157 | 3.9043 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1226 | 158 | 3.66 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1234 | 159 | 6.0984 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1241 | 160 | 3.9498 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1249 | 161 | 4.4688 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1257 | 162 | 3.6255 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1265 | 163 | 3.658 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1272 | 164 | 3.4856 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1280 | 165 | 5.3092 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1288 | 166 | 3.7321 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1296 | 167 | 3.2995 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1303 | 168 | 5.1161 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1311 | 169 | 3.614 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1319 | 170 | 4.0901 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1327 | 171 | 3.4437 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1334 | 172 | 5.0212 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1342 | 173 | 1.3904 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1350 | 174 | 5.6536 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1358 | 175 | 5.0981 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1365 | 176 | 4.7528 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1373 | 177 | 5.4556 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1381 | 178 | 2.8553 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1389 | 179 | 5.4703 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1396 | 180 | 4.7665 | 4.8691 | 3.7699 | 0.7544 | 5.3407 | 5.4312 | 0.5739 | 4.1874 | 1.6041 | 5.0353 | 1.6546 | 4.6816 | 4.4876 | 3.1652 | 4.0281 | 0.6266 | 0.4411 | 0.2932 |
| 0.1404 | 181 | 4.0213 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1412 | 182 | 4.6874 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1420 | 183 | 3.1823 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1427 | 184 | 3.1686 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1435 | 185 | 2.8957 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1443 | 186 | 4.5781 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1451 | 187 | 3.7329 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1458 | 188 | 3.4419 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1466 | 189 | 5.6953 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1474 | 190 | 3.1869 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1482 | 191 | 3.6055 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1490 | 192 | 4.6231 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1497 | 193 | 4.2417 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1505 | 194 | 5.2779 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1513 | 195 | 4.3213 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1521 | 196 | 3.1158 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1528 | 197 | 2.27 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1536 | 198 | 3.5041 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1544 | 199 | 2.6007 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1552 | 200 | 2.4875 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1559 | 201 | 5.3046 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1567 | 202 | 3.0582 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1575 | 203 | 4.9347 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1583 | 204 | 2.855 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1590 | 205 | 1.7434 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1598 | 206 | 3.4045 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1606 | 207 | 3.4427 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1614 | 208 | 3.3483 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1621 | 209 | 1.5883 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1629 | 210 | 5.3066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1637 | 211 | 0.6047 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1645 | 212 | 0.8018 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1652 | 213 | 2.8775 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1660 | 214 | 0.5198 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1668 | 215 | 3.1591 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1676 | 216 | 2.7381 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1683 | 217 | 5.5722 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1691 | 218 | 3.998 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1699 | 219 | 2.2858 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1707 | 220 | 1.556 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1715 | 221 | 2.5352 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1722 | 222 | 3.1682 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1730 | 223 | 2.9073 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1738 | 224 | 2.547 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1746 | 225 | 4.1815 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1753 | 226 | 3.7504 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1761 | 227 | 5.033 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1769 | 228 | 5.2809 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1777 | 229 | 2.5598 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1784 | 230 | 0.4476 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1792 | 231 | 3.4592 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1800 | 232 | 2.9202 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1808 | 233 | 1.9092 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1815 | 234 | 1.9204 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1823 | 235 | 2.083 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1831 | 236 | 3.0433 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1839 | 237 | 1.5429 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1846 | 238 | 0.3347 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1854 | 239 | 1.8698 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1862 | 240 | 0.3031 | 3.8406 | 3.9217 | 0.4412 | 3.9774 | 3.6929 | 0.1191 | 2.3104 | 0.5979 | 3.7268 | 0.5432 | 3.8184 | 3.0682 | 1.8652 | 2.5906 | 0.6314 | 0.5054 | 0.4946 |
| 0.1870 | 241 | 3.2076 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1877 | 242 | 2.684 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1885 | 243 | 2.234 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1893 | 244 | 3.0141 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1901 | 245 | 0.271 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1908 | 246 | 1.878 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1916 | 247 | 2.4108 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1924 | 248 | 3.6327 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1932 | 249 | 2.9108 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1939 | 250 | 4.425 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1947 | 251 | 2.6128 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1955 | 252 | 3.1872 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1963 | 253 | 1.9839 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1971 | 254 | 5.0317 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1978 | 255 | 1.8456 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1986 | 256 | 2.7799 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1994 | 257 | 2.4114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2002 | 258 | 1.2619 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2009 | 259 | 1.7154 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2017 | 260 | 2.9187 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2025 | 261 | 1.8434 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2033 | 262 | 1.7474 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2040 | 263 | 4.433 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2048 | 264 | 2.169 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2056 | 265 | 4.0286 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2064 | 266 | 3.6756 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2071 | 267 | 1.853 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2079 | 268 | 2.9833 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2087 | 269 | 0.3286 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2095 | 270 | 2.9449 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2102 | 271 | 2.3685 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2110 | 272 | 1.8971 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2118 | 273 | 2.2603 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2126 | 274 | 3.4866 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2133 | 275 | 3.3005 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2141 | 276 | 1.3361 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2149 | 277 | 1.8639 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2157 | 278 | 3.9011 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2164 | 279 | 3.8815 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2172 | 280 | 2.766 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2180 | 281 | 2.545 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2188 | 282 | 1.2585 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2196 | 283 | 2.3415 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2203 | 284 | 0.1715 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2211 | 285 | 1.8196 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2219 | 286 | 1.3654 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2227 | 287 | 3.4925 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2234 | 288 | 1.4732 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2242 | 289 | 2.6534 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2250 | 290 | 1.3829 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2258 | 291 | 1.5105 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2265 | 292 | 1.2445 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2273 | 293 | 1.5034 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2281 | 294 | 1.2241 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2289 | 295 | 3.0699 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2296 | 296 | 1.534 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2304 | 297 | 3.7891 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2312 | 298 | 1.3415 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2320 | 299 | 2.1582 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2327 | 300 | 2.9695 | 2.8057 | 4.0184 | 0.2524 | 3.0091 | 2.3361 | 0.0922 | 1.7759 | 0.3004 | 2.5259 | 0.2377 | 2.8418 | 1.8356 | 1.3260 | 1.5138 | 0.6585 | 0.5475 | 0.7333 |
| 0.2335 | 301 | 3.4647 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2343 | 302 | 1.4592 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2351 | 303 | 0.241 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2358 | 304 | 3.3644 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2366 | 305 | 1.2295 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2374 | 306 | 0.3028 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2382 | 307 | 2.1296 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2389 | 308 | 0.9409 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2397 | 309 | 0.9241 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2405 | 310 | 1.9657 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2413 | 311 | 1.6671 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2420 | 312 | 3.1371 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2428 | 313 | 2.0971 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2436 | 314 | 1.3007 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2444 | 315 | 0.1475 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2452 | 316 | 1.1716 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2459 | 317 | 1.0757 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2467 | 318 | 1.2618 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2475 | 319 | 1.3991 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2483 | 320 | 2.0074 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2490 | 321 | 2.9443 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2498 | 322 | 2.9287 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2506 | 323 | 2.096 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2514 | 324 | 1.5946 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2521 | 325 | 2.4606 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2529 | 326 | 0.8688 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2537 | 327 | 1.7387 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2545 | 328 | 2.1972 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2552 | 329 | 2.5069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2560 | 330 | 0.887 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2568 | 331 | 0.7386 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2576 | 332 | 0.6545 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2583 | 333 | 1.3643 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2591 | 334 | 1.776 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2599 | 335 | 1.4132 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2607 | 336 | 0.9845 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2614 | 337 | 1.0809 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2622 | 338 | 2.4095 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2630 | 339 | 0.6106 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2638 | 340 | 2.0685 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2645 | 341 | 1.0661 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2653 | 342 | 0.1213 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2661 | 343 | 0.1238 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2669 | 344 | 1.2497 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2676 | 345 | 1.1994 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2684 | 346 | 3.1447 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2692 | 347 | 1.1243 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2700 | 348 | 0.6845 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2708 | 349 | 0.0838 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2715 | 350 | 1.0022 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2723 | 351 | 1.0077 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2731 | 352 | 1.1412 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2739 | 353 | 0.4218 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2746 | 354 | 0.3458 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2754 | 355 | 2.3721 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2762 | 356 | 0.7093 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2770 | 357 | 1.5243 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2777 | 358 | 1.0516 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2785 | 359 | 2.0051 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2793 | 360 | 0.9223 | 2.4211 | 4.3379 | 0.1747 | 2.8008 | 1.9936 | 0.0599 | 1.2901 | 0.2207 | 2.3568 | 0.0666 | 2.7399 | 1.5900 | 1.1724 | 0.8802 | 0.6826 | 0.5625 | 0.7684 |
| 0.2801 | 361 | 2.7167 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2808 | 362 | 0.0458 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2816 | 363 | 0.1212 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2824 | 364 | 1.1448 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2832 | 365 | 1.4511 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2839 | 366 | 2.2643 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2847 | 367 | 2.4578 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2855 | 368 | 1.6065 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2863 | 369 | 2.4521 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2870 | 370 | 1.7531 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2878 | 371 | 2.1787 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2886 | 372 | 1.0957 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2894 | 373 | 0.0586 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2901 | 374 | 1.6929 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2909 | 375 | 1.0535 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2917 | 376 | 1.9976 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2925 | 377 | 2.4483 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2933 | 378 | 1.9593 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2940 | 379 | 0.8981 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2948 | 380 | 1.5796 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2956 | 381 | 0.8614 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2964 | 382 | 1.0546 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2971 | 383 | 0.729 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2979 | 384 | 1.0482 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2987 | 385 | 1.1714 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2995 | 386 | 2.4894 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3002 | 387 | 2.3088 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3010 | 388 | 0.9935 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3018 | 389 | 1.939 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3026 | 390 | 0.9026 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3033 | 391 | 1.8123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3041 | 392 | 0.874 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3049 | 393 | 2.1435 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3057 | 394 | 0.931 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3064 | 395 | 3.171 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3072 | 396 | 2.1008 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3080 | 397 | 0.0452 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3088 | 398 | 1.9875 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3095 | 399 | 0.3775 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3103 | 400 | 2.6522 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3111 | 401 | 2.5176 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3119 | 402 | 0.8288 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3126 | 403 | 0.0501 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3134 | 404 | 0.9043 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3142 | 405 | 0.9715 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3150 | 406 | 0.7641 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3157 | 407 | 0.6805 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3165 | 408 | 0.834 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3173 | 409 | 2.4305 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3181 | 410 | 3.5913 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3189 | 411 | 0.7332 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3196 | 412 | 1.2849 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3204 | 413 | 1.0282 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3212 | 414 | 2.2408 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3220 | 415 | 0.6605 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3227 | 416 | 0.0823 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3235 | 417 | 0.6969 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3243 | 418 | 0.0801 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3251 | 419 | 0.8736 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3258 | 420 | 1.9045 | 2.1232 | 4.4809 | 0.1369 | 2.3051 | 1.5919 | 0.0588 | 1.1582 | 0.1888 | 1.8373 | 0.0824 | 2.2808 | 1.4099 | 1.0108 | 0.6537 | 0.6820 | 0.5636 | 0.7990 |
| 0.3266 | 421 | 1.1833 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3274 | 422 | 0.9057 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3282 | 423 | 0.6268 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3289 | 424 | 0.4343 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3297 | 425 | 1.9356 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3305 | 426 | 0.7565 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3313 | 427 | 2.3958 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3320 | 428 | 2.2136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3328 | 429 | 0.9757 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3336 | 430 | 2.1771 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3344 | 431 | 1.4981 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3351 | 432 | 0.6229 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3359 | 433 | 3.4441 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3367 | 434 | 1.9118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3375 | 435 | 1.7004 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3382 | 436 | 0.7453 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3390 | 437 | 2.1001 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3398 | 438 | 1.4916 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3406 | 439 | 0.4525 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3413 | 440 | 1.4354 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3421 | 441 | 0.6315 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3429 | 442 | 0.8246 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3437 | 443 | 2.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3445 | 444 | 0.2752 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3452 | 445 | 1.8133 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3460 | 446 | 2.1664 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3468 | 447 | 1.0758 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3476 | 448 | 2.0718 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3483 | 449 | 1.5408 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3491 | 450 | 0.1437 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3499 | 451 | 0.9747 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3507 | 452 | 0.9244 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3514 | 453 | 1.0036 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3522 | 454 | 1.9532 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3530 | 455 | 1.425 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3538 | 456 | 0.7721 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3545 | 457 | 0.8837 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3553 | 458 | 0.8375 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3561 | 459 | 1.5555 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3569 | 460 | 1.9662 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3576 | 461 | 3.0869 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3584 | 462 | 1.8195 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3592 | 463 | 0.7404 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3600 | 464 | 1.0893 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3607 | 465 | 0.707 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3615 | 466 | 0.6543 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3623 | 467 | 0.804 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3631 | 468 | 0.8668 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3638 | 469 | 1.9092 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3646 | 470 | 0.8892 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3654 | 471 | 1.1507 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3662 | 472 | 0.7981 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3670 | 473 | 0.8476 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3677 | 474 | 0.8267 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3685 | 475 | 0.5709 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3693 | 476 | 0.0949 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3701 | 477 | 0.3 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3708 | 478 | 1.6049 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3716 | 479 | 0.5664 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3724 | 480 | 1.7042 | 2.0069 | 3.8618 | 0.1456 | 2.1599 | 1.5386 | 0.0582 | 0.8400 | 0.1834 | 1.2336 | 0.0442 | 1.7473 | 1.1999 | 0.8533 | 0.5437 | 0.6953 | 0.5709 | 0.7868 |
| 0.3732 | 481 | 0.9542 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3739 | 482 | 1.8103 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3747 | 483 | 0.7318 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3755 | 484 | 1.8031 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3763 | 485 | 0.8064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3770 | 486 | 0.9347 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3778 | 487 | 1.6457 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3786 | 488 | 0.6571 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3794 | 489 | 0.6678 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3801 | 490 | 1.2484 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3809 | 491 | 0.2608 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3817 | 492 | 1.5938 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3825 | 493 | 1.9322 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3832 | 494 | 1.966 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3840 | 495 | 1.9759 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3848 | 496 | 1.7347 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3856 | 497 | 1.4667 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3863 | 498 | 0.9997 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3871 | 499 | 0.9709 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3879 | 500 | 1.2046 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3887 | 501 | 1.4984 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3894 | 502 | 1.81 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3902 | 503 | 1.4561 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3910 | 504 | 0.6121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3918 | 505 | 1.4503 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3926 | 506 | 0.7027 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3933 | 507 | 0.9195 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3941 | 508 | 0.3975 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3949 | 509 | 0.6786 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3957 | 510 | 1.8421 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3964 | 511 | 0.4891 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3972 | 512 | 0.5814 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3980 | 513 | 0.9891 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3988 | 514 | 1.4632 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3995 | 515 | 1.6116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4003 | 516 | 1.1468 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4011 | 517 | 1.7156 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4019 | 518 | 1.397 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4026 | 519 | 0.7668 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4034 | 520 | 0.977 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4042 | 521 | 1.5923 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4050 | 522 | 3.4076 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4057 | 523 | 1.0647 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4065 | 524 | 1.1351 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4073 | 525 | 0.9047 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4081 | 526 | 0.5271 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4088 | 527 | 1.9047 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4096 | 528 | 0.5474 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4104 | 529 | 1.0393 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4112 | 530 | 0.5875 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4119 | 531 | 0.5365 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4127 | 532 | 1.3738 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4135 | 533 | 1.7889 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4143 | 534 | 0.2338 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4151 | 535 | 1.3742 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4158 | 536 | 1.7282 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4166 | 537 | 3.8249 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4174 | 538 | 0.626 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4182 | 539 | 2.1066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4189 | 540 | 0.7842 | 1.7571 | 3.8354 | 0.1055 | 1.8825 | 1.2333 | 0.0574 | 0.7839 | 0.2297 | 1.1567 | 0.0384 | 1.5782 | 1.1067 | 0.8099 | 0.5190 | 0.6947 | 0.5776 | 0.8264 |
| 0.4197 | 541 | 0.5988 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4205 | 542 | 0.091 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4213 | 543 | 0.2432 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4220 | 544 | 0.7155 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4228 | 545 | 1.8321 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4236 | 546 | 0.5349 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4244 | 547 | 0.5962 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4251 | 548 | 0.9328 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4259 | 549 | 1.491 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4267 | 550 | 1.4732 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4275 | 551 | 1.3947 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4282 | 552 | 0.8615 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4290 | 553 | 1.4688 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4298 | 554 | 1.794 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4306 | 555 | 0.7028 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4313 | 556 | 0.7697 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4321 | 557 | 1.019 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4329 | 558 | 0.7246 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4337 | 559 | 0.5204 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4344 | 560 | 1.4801 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4352 | 561 | 0.7505 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4360 | 562 | 0.657 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4368 | 563 | 0.0683 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4375 | 564 | 0.4504 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4383 | 565 | 0.1295 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4391 | 566 | 0.6033 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4399 | 567 | 0.9079 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4407 | 568 | 0.5326 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4414 | 569 | 1.2312 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4422 | 570 | 2.5394 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4430 | 571 | 0.1594 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4438 | 572 | 0.1987 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4445 | 573 | 1.8328 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4453 | 574 | 2.5802 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4461 | 575 | 0.9177 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4469 | 576 | 0.6669 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4476 | 577 | 1.4284 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4484 | 578 | 0.8634 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4492 | 579 | 1.9154 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4500 | 580 | 0.803 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4507 | 581 | 0.881 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4515 | 582 | 1.2738 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4523 | 583 | 0.2015 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4531 | 584 | 0.7598 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4538 | 585 | 0.712 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4546 | 586 | 1.4037 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4554 | 587 | 0.6935 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4562 | 588 | 0.5348 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4569 | 589 | 1.6486 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4577 | 590 | 3.1141 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4585 | 591 | 1.1971 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4593 | 592 | 0.1634 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4600 | 593 | 1.6485 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4608 | 594 | 0.7393 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4616 | 595 | 0.6824 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4624 | 596 | 1.4778 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4631 | 597 | 0.7145 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4639 | 598 | 1.2698 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4647 | 599 | 3.0779 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4655 | 600 | 0.3882 | 1.6521 | 4.2670 | 0.0936 | 1.7955 | 1.1523 | 0.0497 | 0.6716 | 0.1808 | 1.1815 | 0.0304 | 1.4328 | 0.9744 | 0.7299 | 0.4299 | 0.7173 | 0.5763 | 0.8036 |
| 0.4663 | 601 | 1.3776 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4670 | 602 | 0.4241 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4678 | 603 | 1.2092 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4686 | 604 | 0.3678 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4694 | 605 | 0.5244 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4701 | 606 | 0.7051 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4709 | 607 | 1.6367 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4717 | 608 | 0.0546 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4725 | 609 | 0.566 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4732 | 610 | 0.1035 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4740 | 611 | 1.5533 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4748 | 612 | 0.6382 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4756 | 613 | 0.6183 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4763 | 614 | 1.2768 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4771 | 615 | 0.4508 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4779 | 616 | 0.7905 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4787 | 617 | 1.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4794 | 618 | 0.4964 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4802 | 619 | 1.7319 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4810 | 620 | 1.6527 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4818 | 621 | 0.019 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4825 | 622 | 0.5658 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4833 | 623 | 0.9891 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4841 | 624 | 0.6243 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4849 | 625 | 0.0219 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4856 | 626 | 0.3771 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4864 | 627 | 0.5282 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4872 | 628 | 1.5618 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4880 | 629 | 0.7537 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4888 | 630 | 0.5743 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4895 | 631 | 1.431 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4903 | 632 | 0.4691 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4911 | 633 | 0.4524 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4919 | 634 | 0.35 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4926 | 635 | 1.4527 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4934 | 636 | 0.098 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4942 | 637 | 0.1671 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4950 | 638 | 0.0231 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4957 | 639 | 0.6049 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4965 | 640 | 1.5134 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4973 | 641 | 1.3231 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4981 | 642 | 0.6381 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4988 | 643 | 0.614 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4996 | 644 | 1.2059 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5004 | 645 | 1.1081 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5012 | 646 | 0.5197 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5019 | 647 | 0.5318 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5027 | 648 | 1.0322 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5035 | 649 | 1.1135 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5043 | 650 | 0.611 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5050 | 651 | 0.4931 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5058 | 652 | 0.4262 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5066 | 653 | 0.8081 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5074 | 654 | 1.2332 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5081 | 655 | 1.3815 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5089 | 656 | 0.4555 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5097 | 657 | 1.4717 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5105 | 658 | 0.6629 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5112 | 659 | 0.2861 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5120 | 660 | 0.4551 | 1.8001 | 3.9403 | 0.0920 | 1.7676 | 1.1054 | 0.0526 | 0.5906 | 0.1858 | 0.9786 | 0.0198 | 1.3685 | 1.0878 | 0.7199 | 0.4175 | 0.7008 | 0.5833 | 0.8224 |
| 0.5128 | 661 | 0.0764 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5136 | 662 | 0.7882 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5144 | 663 | 0.4966 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5151 | 664 | 1.3228 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5159 | 665 | 0.8 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5167 | 666 | 0.7538 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5175 | 667 | 1.3167 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5182 | 668 | 1.0212 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5190 | 669 | 1.2789 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5198 | 670 | 1.013 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5206 | 671 | 1.2767 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5213 | 672 | 0.945 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5221 | 673 | 1.0292 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5229 | 674 | 1.6485 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5237 | 675 | 1.0887 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5244 | 676 | 0.5656 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5252 | 677 | 0.9879 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5260 | 678 | 1.3261 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5268 | 679 | 1.2894 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5275 | 680 | 1.715 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5283 | 681 | 1.1868 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5291 | 682 | 0.9136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5299 | 683 | 0.1193 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5306 | 684 | 0.0927 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5314 | 685 | 1.9202 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5322 | 686 | 1.3207 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5330 | 687 | 0.585 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5337 | 688 | 1.5339 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5345 | 689 | 0.457 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5353 | 690 | 0.8198 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5361 | 691 | 0.5941 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5369 | 692 | 2.7123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5376 | 693 | 1.0445 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5384 | 694 | 0.7682 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5392 | 695 | 0.8947 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5400 | 696 | 1.3467 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5407 | 697 | 0.5136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5415 | 698 | 0.7952 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5423 | 699 | 1.0138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5431 | 700 | 0.3821 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5438 | 701 | 1.2379 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5446 | 702 | 1.5138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5454 | 703 | 0.3947 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5462 | 704 | 0.8279 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5469 | 705 | 1.0396 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5477 | 706 | 0.3773 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5485 | 707 | 0.3855 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5493 | 708 | 0.528 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5500 | 709 | 1.2608 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5508 | 710 | 0.8316 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5516 | 711 | 1.5927 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5524 | 712 | 0.5383 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5531 | 713 | 1.2653 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5539 | 714 | 0.6491 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5547 | 715 | 3.2608 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5555 | 716 | 0.4876 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5562 | 717 | 0.4794 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5570 | 718 | 1.6532 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5578 | 719 | 0.4491 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5586 | 720 | 0.6076 | 1.7167 | 3.8189 | 0.1107 | 1.5125 | 0.9112 | 0.0508 | 0.4799 | 0.2114 | 0.8705 | 0.0260 | 1.1345 | 0.8928 | 0.6976 | 0.4569 | 0.7014 | 0.5798 | 0.8372 |
| 0.5593 | 721 | 0.1445 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5601 | 722 | 0.5675 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5609 | 723 | 0.4063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5617 | 724 | 1.5336 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5625 | 725 | 1.1264 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5632 | 726 | 0.038 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5640 | 727 | 0.4946 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5648 | 728 | 0.4399 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5656 | 729 | 0.2663 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5663 | 730 | 0.5476 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5671 | 731 | 0.0714 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5679 | 732 | 1.2477 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5687 | 733 | 0.4663 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5694 | 734 | 0.3875 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5702 | 735 | 0.0636 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5710 | 736 | 0.7118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5718 | 737 | 0.5781 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5725 | 738 | 0.509 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5733 | 739 | 0.314 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5741 | 740 | 0.0644 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5749 | 741 | 0.4537 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5756 | 742 | 0.067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5764 | 743 | 0.63 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5772 | 744 | 0.9531 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5780 | 745 | 0.9424 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5787 | 746 | 0.5714 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5795 | 747 | 0.8998 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5803 | 748 | 2.9047 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5811 | 749 | 1.1596 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5818 | 750 | 0.4542 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5826 | 751 | 0.4563 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5834 | 752 | 0.7169 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5842 | 753 | 0.9529 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5849 | 754 | 1.8114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5857 | 755 | 0.6818 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5865 | 756 | 1.7219 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5873 | 757 | 0.483 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5881 | 758 | 0.3853 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5888 | 759 | 1.4371 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5896 | 760 | 0.6702 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5904 | 761 | 1.2968 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5912 | 762 | 0.5906 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5919 | 763 | 1.044 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5927 | 764 | 0.5166 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5935 | 765 | 0.5071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5943 | 766 | 0.6988 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5950 | 767 | 0.9081 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5958 | 768 | 0.3738 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5966 | 769 | 0.5956 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5974 | 770 | 1.1079 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5981 | 771 | 0.6008 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5989 | 772 | 1.522 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5997 | 773 | 1.3301 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6005 | 774 | 0.0772 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6012 | 775 | 0.0563 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6020 | 776 | 0.6185 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6028 | 777 | 0.465 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6036 | 778 | 1.1821 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6043 | 779 | 1.108 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6051 | 780 | 0.6595 | 1.6136 | 3.6305 | 0.0928 | 1.6852 | 0.9934 | 0.0470 | 0.4912 | 0.2305 | 0.8310 | 0.0155 | 1.2469 | 0.7959 | 0.6912 | 0.3482 | 0.6888 | 0.5779 | 0.8508 |
| 0.6059 | 781 | 0.051 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6067 | 782 | 0.0368 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6074 | 783 | 0.5663 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6082 | 784 | 0.5206 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6090 | 785 | 0.509 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6098 | 786 | 0.8124 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6106 | 787 | 0.6161 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6113 | 788 | 1.2831 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6121 | 789 | 0.059 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6129 | 790 | 0.5068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6137 | 791 | 0.5579 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6144 | 792 | 0.4626 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6152 | 793 | 0.0294 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6160 | 794 | 0.5729 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6168 | 795 | 0.4606 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6175 | 796 | 0.7513 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6183 | 797 | 1.2639 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6191 | 798 | 2.0057 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6199 | 799 | 1.0553 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6206 | 800 | 1.3601 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6214 | 801 | 0.498 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6222 | 802 | 0.0417 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6230 | 803 | 0.5607 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6237 | 804 | 0.4816 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6245 | 805 | 0.5568 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6253 | 806 | 0.1023 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6261 | 807 | 0.857 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6268 | 808 | 0.3304 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6276 | 809 | 0.38 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6284 | 810 | 1.2422 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6292 | 811 | 1.0591 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6299 | 812 | 0.4432 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6307 | 813 | 0.6122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6315 | 814 | 0.0425 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6323 | 815 | 1.5015 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6330 | 816 | 0.4955 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6338 | 817 | 1.3967 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6346 | 818 | 0.2778 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6354 | 819 | 0.4304 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6362 | 820 | 0.7362 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6369 | 821 | 0.5542 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6377 | 822 | 0.0701 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6385 | 823 | 0.5658 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6393 | 824 | 0.3145 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6400 | 825 | 0.2661 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6408 | 826 | 1.0098 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6416 | 827 | 0.1648 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6424 | 828 | 0.3011 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6431 | 829 | 0.2553 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6439 | 830 | 0.4249 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6447 | 831 | 0.9364 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6455 | 832 | 1.5279 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6462 | 833 | 0.8555 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6470 | 834 | 0.4081 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6478 | 835 | 0.0543 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6486 | 836 | 0.3788 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6493 | 837 | 0.7464 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6501 | 838 | 0.5038 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6509 | 839 | 1.0619 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6517 | 840 | 1.1649 | 1.5634 | 3.7601 | 0.0892 | 1.6154 | 0.9290 | 0.0438 | 0.5605 | 0.1683 | 0.8551 | 0.0048 | 1.1486 | 0.8120 | 0.6405 | 0.3080 | 0.7016 | 0.5949 | 0.8420 |
| 0.6524 | 841 | 0.0749 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6532 | 842 | 1.5064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6540 | 843 | 1.1311 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6548 | 844 | 0.8582 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6555 | 845 | 2.7894 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6563 | 846 | 1.0835 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6571 | 847 | 0.4166 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6579 | 848 | 0.8957 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6587 | 849 | 1.203 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6594 | 850 | 1.1378 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6602 | 851 | 1.144 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6610 | 852 | 0.5134 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6618 | 853 | 0.4396 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6625 | 854 | 0.5735 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6633 | 855 | 0.5101 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6641 | 856 | 0.4425 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6649 | 857 | 1.0071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6656 | 858 | 0.1042 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6664 | 859 | 0.6389 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6672 | 860 | 1.3249 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6680 | 861 | 0.4195 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6687 | 862 | 0.9902 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6695 | 863 | 0.4288 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6703 | 864 | 0.8166 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6711 | 865 | 1.4688 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6718 | 866 | 0.2178 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6726 | 867 | 0.9398 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6734 | 868 | 0.5052 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6742 | 869 | 0.4738 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6749 | 870 | 1.6114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6757 | 871 | 0.5192 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6765 | 872 | 1.3796 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6773 | 873 | 0.4999 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6780 | 874 | 0.607 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6788 | 875 | 0.7941 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6796 | 876 | 1.2409 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6804 | 877 | 1.1035 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6811 | 878 | 0.557 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6819 | 879 | 0.3722 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6827 | 880 | 0.6515 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6835 | 881 | 0.4847 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6843 | 882 | 0.4722 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6850 | 883 | 0.4156 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6858 | 884 | 1.2278 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6866 | 885 | 0.6314 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6874 | 886 | 1.5736 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6881 | 887 | 1.3482 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6889 | 888 | 1.2191 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6897 | 889 | 0.3178 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6905 | 890 | 0.4992 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6912 | 891 | 3.0199 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6920 | 892 | 1.4033 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6928 | 893 | 0.0986 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6936 | 894 | 0.6785 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6943 | 895 | 1.145 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6951 | 896 | 1.0562 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6959 | 897 | 0.6484 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6967 | 898 | 0.3522 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6974 | 899 | 0.9692 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6982 | 900 | 1.4612 | 1.5020 | 4.0341 | 0.0737 | 1.4769 | 0.9983 | 0.0465 | 0.5565 | 0.2297 | 0.7415 | 0.0181 | 1.0960 | 0.8145 | 0.6020 | 0.4034 | 0.6972 | 0.5850 | 0.8516 |
| 0.6990 | 901 | 0.2964 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6998 | 902 | 0.3666 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7005 | 903 | 0.3556 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7013 | 904 | 0.4661 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7021 | 905 | 0.3754 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7029 | 906 | 0.3114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7036 | 907 | 1.0232 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7044 | 908 | 1.1533 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7052 | 909 | 0.3414 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7060 | 910 | 0.5628 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7067 | 911 | 1.3947 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7075 | 912 | 0.4052 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7083 | 913 | 0.9239 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7091 | 914 | 0.5245 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7099 | 915 | 0.8049 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7106 | 916 | 1.1185 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7114 | 917 | 0.9616 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7122 | 918 | 0.3811 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7130 | 919 | 0.3467 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7137 | 920 | 1.2782 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7145 | 921 | 0.5922 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7153 | 922 | 0.9406 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7161 | 923 | 0.9116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7168 | 924 | 0.3033 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7176 | 925 | 0.5023 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7184 | 926 | 0.0792 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7192 | 927 | 0.4333 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7199 | 928 | 1.0547 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7207 | 929 | 0.232 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7215 | 930 | 0.9777 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7223 | 931 | 0.0284 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7230 | 932 | 0.7501 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7238 | 933 | 0.7604 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7246 | 934 | 0.5379 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7254 | 935 | 0.4961 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7261 | 936 | 1.1144 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7269 | 937 | 0.2896 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7277 | 938 | 0.6437 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7285 | 939 | 0.0315 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7292 | 940 | 0.4572 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7300 | 941 | 1.3591 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7308 | 942 | 1.3463 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7316 | 943 | 1.0494 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7324 | 944 | 0.4497 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7331 | 945 | 0.2641 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7339 | 946 | 0.9394 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7347 | 947 | 0.5147 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7355 | 948 | 0.3103 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7362 | 949 | 0.4554 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7370 | 950 | 0.0854 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7378 | 951 | 2.4657 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7386 | 952 | 0.3617 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7393 | 953 | 0.3915 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7401 | 954 | 0.33 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7409 | 955 | 0.6429 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7417 | 956 | 0.5283 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7424 | 957 | 0.3144 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7432 | 958 | 0.267 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7440 | 959 | 0.0828 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7448 | 960 | 0.2501 | 1.5219 | 3.7843 | 0.0832 | 1.2987 | 0.8858 | 0.0451 | 0.4570 | 0.1915 | 0.6730 | 0.0155 | 1.1649 | 0.7340 | 0.5743 | 0.2775 | 0.6987 | 0.5990 | 0.8620 |
| 0.7455 | 961 | 1.2049 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7463 | 962 | 1.4711 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7471 | 963 | 1.3998 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7479 | 964 | 0.8524 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7486 | 965 | 1.0452 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7494 | 966 | 0.4596 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7502 | 967 | 2.3406 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7510 | 968 | 0.3107 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7517 | 969 | 0.7968 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7525 | 970 | 1.0273 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7533 | 971 | 0.8244 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7541 | 972 | 0.386 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7548 | 973 | 0.5178 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7556 | 974 | 1.3655 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7564 | 975 | 1.6236 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7572 | 976 | 2.4522 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7580 | 977 | 1.0867 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7587 | 978 | 0.484 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7595 | 979 | 0.8955 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7603 | 980 | 1.0314 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7611 | 981 | 0.7964 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7618 | 982 | 0.4153 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7626 | 983 | 0.9775 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7634 | 984 | 0.339 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7642 | 985 | 1.5816 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7649 | 986 | 0.897 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7657 | 987 | 1.2145 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7665 | 988 | 0.3752 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7673 | 989 | 0.4954 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7680 | 990 | 0.2914 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7688 | 991 | 0.8471 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7696 | 992 | 0.374 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7704 | 993 | 0.277 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7711 | 994 | 0.9676 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7719 | 995 | 1.4121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7727 | 996 | 0.2757 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7735 | 997 | 0.5515 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7742 | 998 | 1.0891 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7750 | 999 | 0.9895 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7758 | 1000 | 0.2428 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7766 | 1001 | 0.3215 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7773 | 1002 | 0.9325 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7781 | 1003 | 0.3231 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7789 | 1004 | 0.2963 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7797 | 1005 | 0.5616 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7804 | 1006 | 0.3755 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7812 | 1007 | 0.8611 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7820 | 1008 | 0.0237 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7828 | 1009 | 1.222 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7836 | 1010 | 1.1614 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7843 | 1011 | 1.4014 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7851 | 1012 | 1.0261 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7859 | 1013 | 0.3906 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7867 | 1014 | 0.1874 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7874 | 1015 | 0.1013 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7882 | 1016 | 0.2955 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7890 | 1017 | 0.8856 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7898 | 1018 | 1.2317 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7905 | 1019 | 0.6771 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7913 | 1020 | 0.26 | 1.4637 | 3.8305 | 0.0762 | 1.2571 | 0.7521 | 0.0371 | 0.3657 | 0.1728 | 0.7280 | 0.0069 | 0.9747 | 0.5492 | 0.5248 | 0.2788 | 0.7135 | 0.6113 | 0.8531 |
| 0.7921 | 1021 | 0.4263 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7929 | 1022 | 0.8427 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7936 | 1023 | 0.2991 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7944 | 1024 | 0.0394 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7952 | 1025 | 0.9928 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7960 | 1026 | 0.3891 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7967 | 1027 | 0.2513 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7975 | 1028 | 1.1539 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7983 | 1029 | 1.0283 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7991 | 1030 | 0.7875 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7998 | 1031 | 0.2013 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8006 | 1032 | 1.0791 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8014 | 1033 | 0.7575 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8022 | 1034 | 0.3728 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8029 | 1035 | 0.6275 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8037 | 1036 | 0.0223 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8045 | 1037 | 0.7849 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8053 | 1038 | 0.9409 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8061 | 1039 | 0.4584 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8068 | 1040 | 0.0287 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8076 | 1041 | 0.008 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8084 | 1042 | 0.3427 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8092 | 1043 | 0.6172 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8099 | 1044 | 2.497 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8107 | 1045 | 1.0139 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8115 | 1046 | 0.0494 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8123 | 1047 | 0.0508 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8130 | 1048 | 0.8282 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8138 | 1049 | 0.3678 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8146 | 1050 | 1.0414 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8154 | 1051 | 0.2956 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8161 | 1052 | 0.6022 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8169 | 1053 | 0.6047 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8177 | 1054 | 0.9537 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8185 | 1055 | 0.4818 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8192 | 1056 | 0.9961 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8200 | 1057 | 0.3835 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8208 | 1058 | 0.7192 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8216 | 1059 | 0.3131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8223 | 1060 | 0.0402 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8231 | 1061 | 0.247 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8239 | 1062 | 0.7557 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8247 | 1063 | 0.0468 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8254 | 1064 | 0.0421 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8262 | 1065 | 0.0711 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8270 | 1066 | 0.3368 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8278 | 1067 | 0.5852 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8285 | 1068 | 0.999 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8293 | 1069 | 0.4071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8301 | 1070 | 0.4372 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8309 | 1071 | 0.6649 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8317 | 1072 | 0.4461 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8324 | 1073 | 0.7703 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8332 | 1074 | 0.3798 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8340 | 1075 | 0.4016 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8348 | 1076 | 0.3554 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8355 | 1077 | 1.0995 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8363 | 1078 | 1.0319 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8371 | 1079 | 0.3093 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8379 | 1080 | 0.3652 | 1.4324 | 3.8442 | 0.0814 | 1.2321 | 0.6611 | 0.0362 | 0.3024 | 0.2345 | 0.6610 | 0.0056 | 0.9349 | 0.6043 | 0.5297 | 0.2680 | 0.7204 | 0.6098 | 0.8614 |
| 0.8386 | 1081 | 0.2184 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8394 | 1082 | 0.18 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8402 | 1083 | 0.4299 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8410 | 1084 | 0.2402 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8417 | 1085 | 0.805 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8425 | 1086 | 0.459 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8433 | 1087 | 0.2368 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8441 | 1088 | 0.2282 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8448 | 1089 | 0.3175 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8456 | 1090 | 1.3646 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8464 | 1091 | 0.3908 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8472 | 1092 | 1.1518 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8479 | 1093 | 0.532 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8487 | 1094 | 0.2099 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8495 | 1095 | 0.9802 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8503 | 1096 | 0.6938 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8510 | 1097 | 1.0521 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8518 | 1098 | 0.9221 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8526 | 1099 | 0.3857 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8534 | 1100 | 0.2906 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8542 | 1101 | 0.3088 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8549 | 1102 | 0.9182 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8557 | 1103 | 0.1258 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8565 | 1104 | 0.1814 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8573 | 1105 | 0.3998 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8580 | 1106 | 0.4094 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8588 | 1107 | 0.5049 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8596 | 1108 | 0.4656 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8604 | 1109 | 1.186 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8611 | 1110 | 0.383 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8619 | 1111 | 1.0752 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8627 | 1112 | 0.4525 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8635 | 1113 | 0.1916 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8642 | 1114 | 0.0106 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8650 | 1115 | 0.2125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8658 | 1116 | 0.8533 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8666 | 1117 | 0.2105 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8673 | 1118 | 0.9959 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8681 | 1119 | 0.4655 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8689 | 1120 | 0.0331 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8697 | 1121 | 0.4181 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8704 | 1122 | 0.2587 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8712 | 1123 | 0.2525 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8720 | 1124 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8728 | 1125 | 0.2255 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8735 | 1126 | 0.2137 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8743 | 1127 | 0.6591 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8751 | 1128 | 0.1919 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8759 | 1129 | 0.496 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8766 | 1130 | 0.1756 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8774 | 1131 | 0.769 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8782 | 1132 | 0.8033 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8790 | 1133 | 0.7703 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8798 | 1134 | 0.1636 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8805 | 1135 | 0.1441 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8813 | 1136 | 1.2285 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8821 | 1137 | 0.4601 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8829 | 1138 | 0.74 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8836 | 1139 | 0.2197 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8844 | 1140 | 0.3052 | 1.4354 | 3.8809 | 0.0731 | 1.1856 | 0.6550 | 0.0346 | 0.2778 | 0.2487 | 0.7380 | 0.0037 | 0.9546 | 0.5969 | 0.5445 | 0.2624 | 0.7203 | 0.5990 | 0.8607 |
| 0.8852 | 1141 | 0.6587 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8860 | 1142 | 0.3953 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8867 | 1143 | 0.6124 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8875 | 1144 | 0.952 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8883 | 1145 | 0.9514 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8891 | 1146 | 0.2201 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8898 | 1147 | 0.3446 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8906 | 1148 | 0.7141 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8914 | 1149 | 0.7227 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8922 | 1150 | 0.7811 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8929 | 1151 | 0.3033 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8937 | 1152 | 0.7532 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8945 | 1153 | 0.0198 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8953 | 1154 | 1.0878 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8960 | 1155 | 0.9868 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8968 | 1156 | 0.1472 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8976 | 1157 | 0.8511 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8984 | 1158 | 0.3186 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8991 | 1159 | 0.7747 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8999 | 1160 | 0.0443 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9007 | 1161 | 0.2762 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9015 | 1162 | 0.0 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9022 | 1163 | 0.2446 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9030 | 1164 | 0.1827 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9038 | 1165 | 0.3043 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9046 | 1166 | 1.1194 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9054 | 1167 | 0.344 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9061 | 1168 | 1.1798 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9069 | 1169 | 0.9406 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9077 | 1170 | 0.0222 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9085 | 1171 | 0.1722 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9092 | 1172 | 0.2667 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9100 | 1173 | 0.2907 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9108 | 1174 | 0.6799 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9116 | 1175 | 2.0621 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9123 | 1176 | 0.396 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9131 | 1177 | 0.3186 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9139 | 1178 | 0.2532 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9147 | 1179 | 0.0561 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9154 | 1180 | 0.24 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9162 | 1181 | 0.0055 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9170 | 1182 | 1.0443 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9178 | 1183 | 0.5319 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9185 | 1184 | 0.1738 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9193 | 1185 | 1.0717 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9201 | 1186 | 0.0151 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9209 | 1187 | 0.2959 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9216 | 1188 | 1.1981 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9224 | 1189 | 0.1575 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9232 | 1190 | 0.5027 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9240 | 1191 | 0.9443 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9247 | 1192 | 0.7371 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9255 | 1193 | 0.6714 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9263 | 1194 | 0.0 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9271 | 1195 | 0.2574 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9279 | 1196 | 0.011 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9286 | 1197 | 0.2799 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9294 | 1198 | 1.117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9302 | 1199 | 0.5258 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9310 | 1200 | 0.0 | 1.4484 | 3.8139 | 0.0712 | 1.2239 | 0.6812 | 0.0364 | 0.2785 | 0.1758 | 0.6358 | 0.0023 | 0.9432 | 0.5713 | 0.5288 | 0.2603 | 0.7160 | 0.5824 | 0.8658 |
| 0.9317 | 1201 | 0.3638 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9325 | 1202 | 0.0419 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9333 | 1203 | 0.9931 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9341 | 1204 | 0.271 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9348 | 1205 | 0.6739 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9356 | 1206 | 1.0256 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9364 | 1207 | 0.9012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9372 | 1208 | 0.0282 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9379 | 1209 | 0.2159 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9387 | 1210 | 0.2428 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9395 | 1211 | 0.7847 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9403 | 1212 | 1.1893 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9410 | 1213 | 0.2918 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9418 | 1214 | 0.697 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9426 | 1215 | 0.7615 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9434 | 1216 | 0.0 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9441 | 1217 | 0.1928 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9449 | 1218 | 0.8963 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9457 | 1219 | 0.3505 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9465 | 1220 | 0.9786 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9472 | 1221 | 0.6061 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9480 | 1222 | 0.2607 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9488 | 1223 | 0.5026 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9496 | 1224 | 0.3365 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9503 | 1225 | 0.4172 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9511 | 1226 | 0.6302 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9519 | 1227 | 0.8007 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9527 | 1228 | 0.358 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9535 | 1229 | 1.1527 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9542 | 1230 | 0.2319 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9550 | 1231 | 0.4708 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9558 | 1232 | 0.6828 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9566 | 1233 | 0.3424 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9573 | 1234 | 0.933 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9581 | 1235 | 0.3672 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9589 | 1236 | 0.4457 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9597 | 1237 | 0.7641 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9604 | 1238 | 0.8338 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9612 | 1239 | 0.8975 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9620 | 1240 | 0.4062 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9628 | 1241 | 0.3386 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9635 | 1242 | 0.0 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9643 | 1243 | 0.2891 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9651 | 1244 | 0.0 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9659 | 1245 | 0.2005 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9666 | 1246 | 0.7817 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9674 | 1247 | 0.106 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9682 | 1248 | 0.0451 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9690 | 1249 | 0.25 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9697 | 1250 | 1.5482 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9705 | 1251 | 0.1751 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9713 | 1252 | 0.4074 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9721 | 1253 | 0.307 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9728 | 1254 | 0.9694 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9736 | 1255 | 1.0855 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9744 | 1256 | 0.36 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9752 | 1257 | 0.4696 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9760 | 1258 | 0.2981 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9767 | 1259 | 0.0108 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9775 | 1260 | 1.1752 | 1.5156 | 3.6225 | 0.0763 | 1.2131 | 0.8372 | 0.0338 | 0.2446 | 0.1437 | 0.6766 | 0.0021 | 0.8870 | 0.5494 | 0.5050 | 0.2076 | 0.7169 | 0.6041 | 0.8602 |
| 0.9783 | 1261 | 0.8216 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9791 | 1262 | 0.1737 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9798 | 1263 | 0.471 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9806 | 1264 | 0.4433 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9814 | 1265 | 0.7567 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9822 | 1266 | 0.9696 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9829 | 1267 | 0.0401 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9837 | 1268 | 0.5234 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9845 | 1269 | 0.6081 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9853 | 1270 | 0.2459 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9860 | 1271 | 0.6489 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9868 | 1272 | 0.0 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9876 | 1273 | 1.4879 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9884 | 1274 | 0.456 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9891 | 1275 | 0.3709 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9899 | 1276 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9907 | 1277 | 0.42 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9915 | 1278 | 0.3612 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9922 | 1279 | 0.0003 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9930 | 1280 | 0.1506 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9938 | 1281 | 0.9189 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9946 | 1282 | 0.4371 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9953 | 1283 | 0.0081 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9961 | 1284 | 1.0255 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9969 | 1285 | 0.5756 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9977 | 1286 | 0.3936 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9984 | 1287 | 0.627 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9992 | 1288 | 0.047 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0 | 1289 | 0.8975 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0008 | 1290 | 0.8482 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.0.1
- Transformers: 4.44.0
- PyTorch: 2.4.0
- Accelerate: 0.33.0
- Datasets: 2.21.0
- Tokenizers: 0.19.1
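
If you want to sanity-check that a local environment roughly matches the one used for training, a minimal sketch along these lines (assuming the packages listed above are importable under their usual module names) prints the installed versions for comparison:

```python
# Minimal sketch: compare locally installed versions against the ones listed above.
# Assumes sentence-transformers, transformers, and torch are already installed.
import sentence_transformers
import transformers
import torch

print("Sentence Transformers:", sentence_transformers.__version__)  # trained with 3.0.1
print("Transformers:", transformers.__version__)                    # trained with 4.44.0
print("PyTorch:", torch.__version__)                                # trained with 2.4.0
```

Exact version pinning is usually not required to load the model, but staying close to these versions should reduce the chance of behavioral differences.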
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```