
Jensen-Shannon divergence

The Jensen-Shannon divergence (also called the information radius (IRad) or the total divergence to the average) is another measure of similarity between two probability distributions. It is based on KL divergence. Unlike KL divergence, however, JS divergence is symmetric and can be used to measure the distance between two probability distributions. If we take the square root of the Jensen-Shannon divergence, we get the Jensen-Shannon distance, which is a true distance metric.

The following equation represents the Jensen-Shannon divergence between two probability distributions, p and q:

JSD(p || q) = (1/2) D_KL(p || m) + (1/2) D_KL(q || m)

In the preceding equation, m = (p+q)/2 is the midpoint measure, while D_KL is the Kullback-Leibler divergence.
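To make the definition concrete, here is a minimal sketch in Python that computes the JS divergence for two discrete distributions. The helper function js_divergence and the example distributions are illustrative assumptions, not taken from the text; it relies on scipy.stats.entropy, which computes the KL divergence when given two distributions:

```python
import numpy as np
from scipy.stats import entropy  # entropy(p, q) computes D_KL(p || q)

def js_divergence(p, q):
    """Jensen-Shannon divergence between two discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()   # normalize to valid distributions
    m = 0.5 * (p + q)                 # midpoint measure
    return 0.5 * entropy(p, m) + 0.5 * entropy(q, m)

p = [0.1, 0.4, 0.5]
q = [0.3, 0.3, 0.4]

print(js_divergence(p, q))            # symmetric: same value as below
print(js_divergence(q, p))
print(np.sqrt(js_divergence(p, q)))   # Jensen-Shannon distance (a metric)
```

Unlike D_KL(p || q), swapping the arguments leaves the result unchanged, which is what makes the square root of the JS divergence usable as a distance.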

Now that we have learned about the KL divergence and the Jensen-Shannon divergence, let's discuss the Nash equilibrium for GANs.
