
Jensen-Shannon divergence

The Jensen-Shannon divergence (also called the information radius (IRad) or the total divergence to the average) is another measure of similarity between two probability distributions. It is based on KL divergence. Unlike KL divergence, however, JS divergence is symmetric, so it can be used to measure the distance between two probability distributions. Taking the square root of the Jensen-Shannon divergence gives the Jensen-Shannon distance, which is a true distance metric.

The following equation represents the Jensen-Shannon divergence between two probability distributions, p and q:

$$ JS(p \,\|\, q) = \frac{1}{2} KL\!\left(p \,\Big\|\, \frac{p+q}{2}\right) + \frac{1}{2} KL\!\left(q \,\Big\|\, \frac{p+q}{2}\right) $$

In the preceding equation, $\frac{p+q}{2}$ is the midpoint measure, while $KL$ is the Kullback-Leibler divergence.
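
To make the formula concrete, here is a minimal NumPy sketch (the function names `kl_divergence` and `js_divergence` are illustrative, not from any particular library) that computes the Jensen-Shannon divergence between two discrete distributions and shows that it is symmetric:

```python
import numpy as np

def kl_divergence(p, q):
    # KL(p || q) for discrete distributions; terms where p is zero contribute nothing
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def js_divergence(p, q):
    # JS(p || q) = 1/2 KL(p || m) + 1/2 KL(q || m), where m = (p + q) / 2 is the midpoint measure
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Two discrete distributions over the same three outcomes
p = [0.1, 0.4, 0.5]
q = [0.8, 0.15, 0.05]

print(js_divergence(p, q))           # equals js_divergence(q, p) -- symmetric
print(np.sqrt(js_divergence(p, q)))  # the Jensen-Shannon distance (a metric)
```

Swapping `p` and `q` returns the same value, which is exactly the symmetry that KL divergence lacks.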

Now that we have learned about the KL divergence and the Jensen-Shannon divergence, let's discuss the Nash equilibrium for GANs.
