
Limitations of deep learning

Deep neural networks are black boxes of weights and biases, trained over large amounts of data to find hidden patterns through inner representations; manually inspecting those representations would be impossible for humans, and even if it were possible, scalability would remain an issue. Every neuron is likely to have a different weight, and thus a different gradient.

Training happens during backpropagation. Thus, gradients always flow from the later layers (the output, on the right side) to the early layers (the input, on the left side). As a result, the later layers learn much better than the early layers, and the deeper the network gets, the worse this imbalance becomes. This gives rise to two possible problems associated with deep learning, which are:

  • The vanishing gradient problem
  • The exploding gradient problem
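Both problems can be illustrated with a minimal NumPy sketch (an assumption of this edition, not code from the text): we push a unit gradient backwards through a deep network and record its norm at every layer. With small weights and sigmoid activations, the repeated multiplication by weight matrices and sigmoid derivatives (at most 0.25) shrinks the gradient toward zero; with standard-deviation-1 weights and a linear activation, the same product of per-layer Jacobians grows instead. The function name `grad_norms` and all the sizes are illustrative choices.

```python
import numpy as np

def grad_norms(weight_scale, use_sigmoid, n_layers=10, width=32, seed=0):
    """Norm of the backpropagated gradient at each layer, output -> input."""
    rng = np.random.default_rng(seed)
    weights = [rng.normal(0.0, weight_scale, (width, width))
               for _ in range(n_layers)]
    # Forward pass, caching each layer's activation for the backward pass.
    a = rng.normal(size=(width, 1))
    acts = []
    for W in weights:
        z = W @ a
        a = 1.0 / (1.0 + np.exp(-z)) if use_sigmoid else z
        acts.append(a)
    # Backward pass: chain rule; sigmoid'(z) = a * (1 - a), identity' = 1.
    grad = np.ones((width, 1))
    norms = []
    for W, a in zip(reversed(weights), reversed(acts)):
        local = a * (1.0 - a) if use_sigmoid else 1.0
        grad = W.T @ (grad * local)
        norms.append(float(np.linalg.norm(grad)))
    return norms  # norms[0] = last layer, norms[-1] = first layer

vanish = grad_norms(0.1, use_sigmoid=True)    # small weights + sigmoid
explode = grad_norms(1.0, use_sigmoid=False)  # linear layers, std-1 weights
print(f"vanishing: {vanish[0]:.2e} -> {vanish[-1]:.2e}")
print(f"exploding: {explode[0]:.2e} -> {explode[-1]:.2e}")
```

Printing the two lists shows the early-layer gradients in the sigmoid net collapse by several orders of magnitude, while in the linear net they blow up by a similar factor, which is exactly why the early layers either barely train or destabilize training.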