Update GCP-ML-vA.json

questions/GCP-ML-vA.json  +4 -2  CHANGED
@@ -27,10 +27,12 @@
     "question": "You were asked to investigate failures of a production line component based on sensor readings. After receiving the dataset, you discover that less than 1% of the readings are positive examples representing failure incidents. You have tried to train several classification models, but none of them converge. How should you resolve the class imbalance problem?",
     "options": [
       "A. Use the class distribution to generate 10% positive examples.",
-      "B. Use a convolutional neural network with max pooling and softmax activation
+      "B. Use a convolutional neural network with max pooling and softmax activation",
+      "C. Downsample the data with upweighting to create a sample with 10% positive examples.",
       "D. Remove negative examples until the numbers of positive and negative examples are equal."
     ],
-    "correct": "",
+    "correct": "C. Downsample the data with upweighting to create a sample with 10% positive examples.",
+
     "explanation": "C. Downsample the data with upweighting to create a sample with 10% positive examples.\n\nExplanation:\n\nThe correct answer is C, Downsample the data with upweighting to create a sample with 10% positive examples. This approach involves reducing the number of negative examples while assigning a higher weight to the remaining negative examples. This helps to balance the class distribution and allows the model to converge.\n\nOption A is incorrect because generating positive examples artificially can lead to overfitting and poor generalization of the model.\n\nOption B is incorrect because convolutional neural networks with max pooling and softmax activation are typically used for image classification tasks, not for handling class imbalance problems.\n\nOption D is incorrect because removing negative examples can result in loss of valuable information and may not effectively address the class imbalance issue.\n\nIn summary, downsampling the data with upweighting is a suitable approach to handle class imbalance problems, especially when the number of positive examples is very small compared to the number of negative examples.",
     "references": ""
   },
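The explanation describes downsampling with upweighting only in prose. A minimal NumPy sketch of the technique follows; the synthetic dataset and all variable names are hypothetical illustrations, not part of the commit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical imbalanced dataset: ~1% positives, as in the question.
y = (rng.random(100_000) < 0.01).astype(int)
X = rng.normal(size=(y.size, 4))

pos_idx = np.flatnonzero(y == 1)
neg_idx = np.flatnonzero(y == 0)

# Downsample negatives so positives make up exactly 10% of the sample
# (1 positive : 9 negatives).
n_neg_keep = 9 * pos_idx.size
keep_neg = rng.choice(neg_idx, size=n_neg_keep, replace=False)
downsample_factor = neg_idx.size / n_neg_keep  # how much negatives were thinned

sample_idx = np.concatenate([pos_idx, keep_neg])
X_s, y_s = X[sample_idx], y[sample_idx]

# Upweight the surviving negatives by the downsampling factor so the
# weighted sample still reflects the original class frequencies.
weights = np.where(y_s == 1, 1.0, downsample_factor)
```

The `weights` array would then be passed to a trainer's example-weight argument (e.g. a `sample_weight` parameter), so the model sees a balanced batch while the loss remains calibrated to the original distribution.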