
小强爱英语 · 2023年05月30日

Why is the prediction Class "1" when the p-value exceeds 0.65, and Class "0" otherwise?

NO.PZ2021083101000008

Question:

Azarov requests that Bector apply the ML model to the test dataset for Dataset XYZ, assuming a threshold p-value of 0.65. Exhibit 2 contains a sample of results from the test dataset corpus.

Based on Exhibit 2, the accuracy metric for Dataset XYZ’s test set sample is closest to:

Options:

A. 0.67

B. 0.70

C. 0.75

Explanation:

B is correct.

Accuracy is the percentage of correctly predicted classes out of total predictions and is calculated as (TP + TN)/(TP + FP + TN + FN).

To obtain the counts of true positives (TP), true negatives (TN), false positives (FP), and false negatives (FN), the predicted sentiment for each instance is determined by comparing its target p-value with the threshold p-value of 0.65: if the target p-value is greater than 0.65, the predicted sentiment for that instance is positive (Class "1"); if it is less than 0.65, the predicted sentiment is negative (Class "0"). Actual sentiment and predicted sentiment are then classified as follows:
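The thresholding and classification rule above can be sketched in Python. The p-values and actual labels below are made up for illustration, since the Exhibit 2 data is not reproduced here:

```python
threshold = 0.65

# (target p-value, actual sentiment) pairs -- hypothetical, not Exhibit 2
samples = [(0.80, 1), (0.45, 0), (0.70, 1), (0.60, 1)]

def classify(p_value, actual, threshold=0.65):
    """Threshold the p-value into a predicted class, then label the outcome."""
    predicted = 1 if p_value > threshold else 0
    if predicted == 1 and actual == 1:
        return "TP"
    if predicted == 0 and actual == 0:
        return "TN"
    if predicted == 1 and actual == 0:
        return "FP"
    return "FN"  # predicted 0, actual 1

labels = [classify(p, a) for p, a in samples]
print(labels)  # ['TP', 'TN', 'TP', 'FN']
```

Note that an instance with p-value 0.60 and actual sentiment 1 becomes a false negative: the model's probability fell below the threshold, so it predicted Class "0" even though the true class was "1".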

Exhibit 2, with added “Predicted Sentiment” and “Classification” columns, is presented below:

Based on the classification data obtained from Exhibit 2, a confusion matrix can be generated:

Using the data in the confusion matrix above, the accuracy metric is computed as follows:

Accuracy = (TP + TN)/(TP + FP + TN + FN).

Accuracy = (3 + 4)/(3 + 1 + 4 + 2) = 0.70.
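The accuracy computation above, using the confusion-matrix counts from the solution (TP = 3, FP = 1, TN = 4, FN = 2), can be checked in a few lines:

```python
# Confusion-matrix counts taken from the solution above
tp, fp, tn, fn = 3, 1, 4, 2

# Accuracy = (TP + TN) / (TP + FP + TN + FN)
accuracy = (tp + tn) / (tp + fp + tn + fn)
print(accuracy)  # 0.7
```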

A is incorrect because 0.67 is the F1 score, not the accuracy metric, for the sample of the test set for Dataset XYZ, based on Exhibit 2. To calculate the F1 score, the precision (P) and recall (R) ratios must first be calculated. Precision and recall for the sample of the test set for Dataset XYZ, based on Exhibit 2, are calculated as follows:

Precision (P) = TP/(TP + FP) = 3/(3 + 1) = 0.75.

Recall (R) = TP/(TP + FN) = 3/(3 + 2) = 0.60.

The F1 score is calculated as follows:

F1 score = (2 × P × R)/(P + R) = (2 × 0.75 × 0.60)/(0.75 + 0.60) = 0.667, or 0.67.
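The precision, recall, and F1 calculations above can be verified with the same counts (TP = 3, FP = 1, FN = 2):

```python
tp, fp, fn = 3, 1, 2

precision = tp / (tp + fp)   # 3/4 = 0.75
recall = tp / (tp + fn)      # 3/5 = 0.60

# F1 is the harmonic mean of precision and recall
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 2))  # 0.67
```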

C is incorrect because 0.75 is the precision ratio, not the accuracy metric, for the sample of the test set for Dataset XYZ, based on Exhibit 2. The precision score is calculated as follows:

Precision (P) = TP/(TP + FP) = 3/(3 + 1) = 0.75.

Topic: Model Training - Performance Evaluation

Why is the prediction 1 when the p-value exceeds 0.65, and 0 otherwise?

1 answer

星星_品职助教 · 2023年05月31日

Hello,

In a logistic regression, Y can only take the values 1 and 0, but what the regression actually produces is a probability. The Y value is therefore determined as follows:

1) A threshold p-value is set in advance.

2) The logistic regression computes a probability (p-value); when that probability is above the preset threshold, Y is set to 1, and when it is below the threshold, Y is set to 0.

In this question the threshold p-value is given as 0.65 ("a threshold p-value of 0.65"), so whenever the probability computed by the logistic regression exceeds 0.65, Y is set to 1.
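The two-step rule described above can be sketched as follows. The regression coefficients here are made up for illustration; only the thresholding logic mirrors the question:

```python
import math

def logistic_probability(x, b0=-1.0, b1=2.0):
    """Logistic (sigmoid) output for a single feature x.
    Coefficients b0, b1 are hypothetical."""
    return 1 / (1 + math.exp(-(b0 + b1 * x)))

threshold = 0.65  # step 1: preset threshold, as in this question

# step 2: compute the probability, then convert it to a class
p = logistic_probability(1.2)  # about 0.80 for these coefficients
y = 1 if p > threshold else 0  # probability above 0.65, so Y = 1
```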

Related questions

NO.PZ2021083101000008 Why is the predicted result 1 whenever p is greater than 0.65? Is that always the case, and is this rule mentioned in the lecture notes?

2024-02-28 01:13 · 1 answer

NO.PZ2021083101000008 Why is Y set to 1 when the computed result (p-value) exceeds the threshold of 0.65? Is this condition a default convention?

2023-02-26 13:38 · 1 answer

I still don't see how TP is determined from comparing the p-values; could you give an example? Also, I've noticed that the images for several questions don't display on mobile devices (it isn't a network-speed issue); could this be fixed?

2022-05-29 23:20 · 1 answer

NO.PZ2021083101000008 Why is it that if an individual target p-value is greater than the threshold p-value of 0.65, the predicted sentiment for that instance is positive (Class "1"), and if it is less than 0.65, the predicted sentiment is negative (Class "0")?

2022-03-09 22:33 · 1 answer