Kullback-Leibler divergence

Kullback-Leibler divergence (KL divergence), also known as relative entropy, is a measure of the similarity between two probability distributions. It quantifies how much one probability distribution p diverges from a second, expected probability distribution q.

The equation used to calculate the KL divergence between two probability distributions p(x) and q(x) is as follows:

D_{KL}(p \| q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}

The KL divergence attains its minimum value of zero when p(x) is equal to q(x) at every point.
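As a rough illustration, the following sketch computes the discrete form of the KL divergence with NumPy. The function name kl_divergence and the example distributions are made up for this illustration; they are not part of the original text.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence: sum over x of p(x) * log(p(x) / q(x)).

    Assumes q(x) > 0 wherever p(x) > 0; otherwise the divergence is infinite.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum only over points where p(x) > 0, treating 0 * log(0) as 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = np.array([0.1, 0.4, 0.5])
q = np.array([0.2, 0.3, 0.5])

print(kl_divergence(p, q))  # small positive value, since p and q differ
print(kl_divergence(p, p))  # 0.0, since the two distributions are identical
```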

Because the KL divergence is asymmetric, that is, D_{KL}(p \| q) is generally not equal to D_{KL}(q \| p), it does not measure a true distance between two probability distributions and should not be used as a distance metric.
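A minimal check of this asymmetry, using SciPy's entropy function (which returns the KL divergence when a second distribution is passed; the example distributions are arbitrary):

```python
from scipy.stats import entropy  # entropy(p, q) computes sum(p * log(p / q))

p = [0.1, 0.4, 0.5]
q = [0.2, 0.3, 0.5]

print(entropy(p, q))  # D_KL(p || q), approximately 0.046
print(entropy(q, p))  # D_KL(q || p), approximately 0.052 -- a different value
```

Swapping the arguments changes the result, which is why the KL divergence fails the symmetry requirement of a distance metric.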
