
anyuna7 · March 9, 2022

Why is it Class 1 when p exceeds the threshold? How do you tell?

NO.PZ2021083101000008

Question:

Azarov requests that Bector apply the ML model to the test dataset for Dataset XYZ, assuming a threshold p-value of 0.65. Exhibit 2 contains a sample of results from the test dataset corpus.

Based on Exhibit 2, the accuracy metric for Dataset XYZ’s test set sample is closest to:

Options:

A. 0.67
B. 0.70
C. 0.75

Explanation:

B is correct.

Accuracy is the percentage of correctly predicted classes out of total predictions and is calculated as (TP + TN)/(TP + FP + TN + FN).

To obtain the counts of true positives (TP), true negatives (TN), false positives (FP), and false negatives (FN), the predicted sentiment for each instance is determined by comparing its target p-value with the threshold p-value of 0.65: if the target p-value is greater than 0.65, the predicted sentiment for that instance is positive (Class “1”); if it is less than 0.65, the predicted sentiment is negative (Class “0”). Actual sentiment and predicted sentiment are then classified as follows:
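The thresholding rule above can be sketched in a few lines of Python. The p-values below are hypothetical stand-ins, since Exhibit 2's actual figures are only shown in an image:

```python
# Classify each instance as Class 1 (positive) when its target p-value
# exceeds the threshold, otherwise as Class 0 (negative).
THRESHOLD = 0.65

def predict_class(p_value, threshold=THRESHOLD):
    """Return 1 (positive sentiment) if p_value > threshold, else 0."""
    return 1 if p_value > threshold else 0

# Hypothetical target p-values (not the actual Exhibit 2 data).
p_values = [0.80, 0.70, 0.30, 0.90, 0.60, 0.45, 0.75, 0.50, 0.55, 0.20]
predicted = [predict_class(p) for p in p_values]
print(predicted)  # [1, 1, 0, 1, 0, 0, 1, 0, 0, 0]
```

Note that 0.60 maps to Class 0 even though it is "high" in absolute terms; only the comparison against the 0.65 threshold matters.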

Exhibit 2, with added “Predicted Sentiment” and “Classification” columns, is presented below:

Based on the classification data obtained from Exhibit 2, a confusion matrix can be generated:

Using the data in the confusion matrix above, the accuracy metric is computed as follows:

Accuracy = (TP + TN)/(TP + FP + TN + FN).

Accuracy = (3 + 4)/(3 + 1 + 4 + 2) = 0.70.
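The tally from actual/predicted pairs to confusion-matrix cells and accuracy can be sketched as follows. The (actual, predicted) pairs are hypothetical but reproduce the counts from the explanation (TP = 3, FP = 1, TN = 4, FN = 2):

```python
# Hypothetical (actual, predicted) sentiment pairs reproducing the
# confusion-matrix counts used in the explanation.
pairs = [(1, 1), (1, 1), (1, 1),          # true positives
         (0, 1),                          # false positive
         (0, 0), (0, 0), (0, 0), (0, 0),  # true negatives
         (1, 0), (1, 0)]                  # false negatives

tp = sum(1 for a, p in pairs if a == 1 and p == 1)
fp = sum(1 for a, p in pairs if a == 0 and p == 1)
tn = sum(1 for a, p in pairs if a == 0 and p == 0)
fn = sum(1 for a, p in pairs if a == 1 and p == 0)

# Accuracy = correctly predicted classes / total predictions.
accuracy = (tp + tn) / (tp + fp + tn + fn)
print(tp, fp, tn, fn, accuracy)  # 3 1 4 2 0.7
```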

A is incorrect because 0.67 is the F1 score, not the accuracy metric, for the sample of the test set for Dataset XYZ, based on Exhibit 2. To calculate the F1 score, the precision (P) and recall (R) ratios must first be calculated. Precision and recall for the sample of the test set for Dataset XYZ, based on Exhibit 2, are calculated as follows:

Precision (P) = TP/(TP + FP) = 3/(3 + 1) = 0.75.

Recall (R) = TP/(TP + FN) = 3/(3 + 2) = 0.60.

The F1 score is calculated as follows:

F1 score = (2 × P × R)/(P + R) = (2 × 0.75 × 0.60)/(0.75 + 0.60) = 0.667, or 0.67.
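The precision, recall, and F1 calculations above can be checked with a short sketch using the confusion-matrix counts from the explanation:

```python
# Confusion-matrix counts from the explanation: TP=3, FP=1, FN=2.
tp, fp, fn = 3, 1, 2

precision = tp / (tp + fp)  # 3/4 = 0.75
recall = tp / (tp + fn)     # 3/5 = 0.60

# F1 is the harmonic mean of precision and recall.
f1 = (2 * precision * recall) / (precision + recall)
print(round(precision, 2), round(recall, 2), round(f1, 2))  # 0.75 0.6 0.67
```

This makes the distractors easy to see: 0.75 is precision (choice C) and 0.67 is F1 (choice A), neither of which is the accuracy metric asked for.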

C is incorrect because 0.75 is the precision ratio, not the accuracy metric, for the sample of the test set for Dataset XYZ, based on Exhibit 2. The precision score is calculated as follows:

Precision (P) = TP/(TP + FP) = 3/(3 + 1) = 0.75.

Topic: Model Training - Performance Evaluation

Why is that? "If an individual target p-value is greater than the threshold p-value of 0.65, the predicted sentiment for that instance is positive (Class “1”). If an individual target p-value is less than the threshold p-value of 0.65, the predicted sentiment for that instance is negative (Class “0”)."

1 answer
Accepted answer

星星_品职助教 · March 10, 2022

Hi,

This is the assignment rule for the Y variable in logistic regression.

The output Y of a logistic regression is a binary variable: it can only take the value 0 or 1, where 0 corresponds to negative and 1 to positive. In practice, however, the regression equation produces a probability, the p-value shown in the table, rather than a 0 or 1 directly.

So when the result (the p-value) exceeds the threshold of 0.65, we set Y = 1 (positive); conversely, when the result is below 0.65, we set Y = 0 (negative).

Accordingly, using the p-values in Exhibit 2 and the threshold of 0.65 given in the question, you can fill in the Predicted Sentiment column.

Comparing the predicted values against the Actual Sentiment column then gives the Classification column, i.e., the TP, TN, FN, and FP counts.

Substituting those counts into Accuracy = (TP + TN)/(TP + FP + TN + FN) yields the accuracy metric.
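The three steps just described (threshold the p-values, compare against actuals, compute accuracy) can be threaded together in one sketch. The (p-value, actual sentiment) rows are hypothetical examples chosen to be consistent with the question's counts, not the real Exhibit 2 data:

```python
THRESHOLD = 0.65

# Hypothetical (p_value, actual_sentiment) rows, chosen so that
# TP=3, FP=1, TN=4, FN=2 at the 0.65 threshold.
rows = [(0.80, 1), (0.75, 1), (0.90, 1),              # predicted 1, actual 1 -> TP
        (0.70, 0),                                    # predicted 1, actual 0 -> FP
        (0.30, 0), (0.45, 0), (0.50, 0), (0.20, 0),   # predicted 0, actual 0 -> TN
        (0.60, 1), (0.55, 1)]                         # predicted 0, actual 1 -> FN

def accuracy_at_threshold(rows, threshold):
    """Step 1: predict 1 if p > threshold, else 0.
    Step 2: compare each prediction with the actual sentiment.
    Step 3: accuracy = correct predictions / total predictions."""
    correct = sum(1 for p, actual in rows
                  if (1 if p > threshold else 0) == actual)
    return correct / len(rows)

print(accuracy_at_threshold(rows, THRESHOLD))  # 0.7
```

Because accuracy depends on the predictions, it changes if the threshold changes; 0.65 is simply the cutoff the question specifies.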

Related questions

NO.PZ2021083101000008: Why is the prediction 1 when p exceeds 0.65? Is that always the case? Is this rule mentioned in the lecture notes?

2024-02-28 01:13 · 1 answer

NO.PZ2021083101000008: Why choose 1 when p exceeds 0.65, and 0 otherwise?

2023-05-30 18:58 · 1 answer

NO.PZ2021083101000008: Teacher, why do we set Y = 1 when the result (p-value) exceeds the threshold of 0.65? Is this condition a default convention?

2023-02-26 13:38 · 1 answer

NO.PZ2021083101000008: Hi teacher, I still don't see how comparing p against the threshold determines the TP count; could you give an example? Also, I've found that the images for several questions don't display on mobile devices, and it isn't a network-speed issue. Could this be fixed?

2022-05-29 23:20 · 1 answer