
lcrcp3 · 2023-08-22

As stated in the title.

NO.PZ2015120204000038

Question:

Regarding neural networks (NNs), which of the following statements is least accurate?

Options:

A. NNs must have at least 10 hidden layers to be considered deep learning nets.

B. The activation function in a node operates like a light dimmer switch since it decreases or increases the strength of the total net input.

C. The summation operator receives input values, multiplies each by a weight, sums up the weighted values into the total net input, and passes it to the activation function.

Explanation:

A is correct. It is the least accurate answer because neural networks with many hidden layers—at least 3, but often more than 20 hidden layers—are known as deep learning nets.

B is incorrect, because the node’s activation function operates like a light dimmer switch which decreases or increases the strength of the (total net) input.

C is incorrect, because the node’s summation operator multiplies each (input) value by a weight and sums up the weighted values to form the total net input. The total net input is then passed to the activation function.
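The node mechanics described in the explanation (a weighted summation followed by an activation function) can be sketched in a few lines of Python. This is only an illustrative sketch: the sigmoid activation and the example inputs and weights are assumptions for demonstration, not part of the original question.

```python
import math

def node_output(inputs, weights):
    # Summation operator: multiply each input by its weight and
    # sum the weighted values into the total net input.
    total_net_input = sum(x * w for x, w in zip(inputs, weights))
    # Activation function (a sigmoid is assumed here): like a dimmer switch,
    # it rescales the total net input into a bounded output between 0 and 1.
    return 1.0 / (1.0 + math.exp(-total_net_input))

# Hypothetical example values
print(node_output([0.5, 1.2, -0.3], [0.4, 0.1, 0.7]))  # total net input = 0.11, output ≈ 0.527
```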

So what is the latest figure: a minimum of 2 hidden layers or 3? The basic class said 3, but the intensive class said 2.

1 answer
Accepted answer

星星_品职助教 · 2023-08-22

Hello,

CFA Institute has issued an erratum on this point; "at least 2" is the figure to go by.

Related questions

NO.PZ2015120204000038 (same question and explanation as above) As stated in the title.

2022-05-30 22:59 · 1 answer

NO.PZ2015120204000038 (same question and explanation as above) Doesn't this question ask for the least accurate statement? Why does the answer pick the only correct one?

2021-03-23 10:32 · 1 answer