
Understanding cross-entropy: What is its corresponding importance?

王林
2024-01-23 09:54:09

Entropy quantifies the uncertainty of an event. In data science, cross-entropy and KL divergence apply to discrete probability distributions and measure how similar two distributions are. In machine learning, cross-entropy loss evaluates how close a model's predicted distribution is to the true distribution.

Given the true distribution p and the predicted distribution q, the cross-entropy between them is given by the following equation:

H(p, q) = -Σ_x p(x) · log q(x)

where p(x) is the true probability distribution (one-hot in classification) and q(x) is the predicted probability distribution.
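The formula above can be sketched directly in pure Python (no external libraries; the example distributions are made up for illustration):

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) * log(q(x)); terms with p(x) = 0 contribute nothing."""
    return -sum(px * math.log(qx) for px, qx in zip(p, q) if px > 0)

# One-hot true distribution and a hypothetical model prediction
p = [0.0, 1.0, 0.0]          # true class is index 1
q = [0.1, 0.7, 0.2]          # predicted class probabilities

loss = cross_entropy(p, q)   # reduces to -log(0.7) because p is one-hot
```

Because p is one-hot, only the predicted probability of the true class matters; the loss collapses to -log q(true class).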

In practice, predicted values differ from actual values; the KL divergence measures how far the predicted distribution diverges from the true one. Cross-entropy combines the entropy of the true distribution with this divergence: H(p, q) = H(p) + KL(p ‖ q).

Now let’s see how cross-entropy fits into the deep neural network paradigm using a classification example.

Each classification case has a known class label with probability 1.0 and the remaining labels with probability 0. The model outputs a probability for each candidate class given the case, and cross-entropy compares this predicted distribution against the one-hot label distribution.

Each predicted class probability is compared to the desired output of 0 or 1. The resulting score/loss penalizes a probability according to its distance from the expected value. The penalty is logarithmic: a predicted probability near 0 for the true class produces a large loss, while a probability near 1 produces a small one.
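A small sketch makes the logarithmic penalty concrete: the loss -log q for the probability q assigned to the true class barely penalizes a confident correct prediction but penalizes a confident wrong one heavily (the probe values below are chosen for illustration):

```python
import math

# -log(q) for several predicted probabilities of the true class
losses = {q: -math.log(q) for q in (0.99, 0.9, 0.5, 0.1, 0.01)}

for q, loss in losses.items():
    print(f"q = {q:5}: loss = {loss:.3f}")
```

Note the asymmetry: moving q from 0.9 to 0.99 changes the loss only slightly, while moving it from 0.1 to 0.01 roughly doubles it.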

Cross-entropy loss guides the adjustment of model weights during training, with the goal of minimizing the loss: the smaller the loss, the better the model.
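As a minimal illustration of that training loop, the sketch below runs gradient descent on the logits of a single toy example (pure Python; the learning rate and step count are arbitrary choices, not recommendations). It relies on the standard fact that for softmax followed by cross-entropy, the gradient with respect to the logits is simply q - p:

```python
import math

def softmax(z):
    """Numerically stable softmax over a list of logits."""
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

logits = [0.0, 0.0, 0.0]     # start with a uniform prediction
p = [0.0, 1.0, 0.0]          # one-hot true label (class index 1)
lr = 0.5                     # hypothetical learning rate

initial_loss = -math.log(softmax(logits)[1])   # -log(1/3)

for _ in range(100):
    q = softmax(logits)
    grad = [qi - pi for qi, pi in zip(q, p)]   # d(loss)/d(logits) = q - p
    logits = [z - lr * g for z, g in zip(logits, grad)]

final_loss = -math.log(softmax(logits)[1])     # shrinks toward 0
```

Each step nudges the true class's logit up and the others down, so the predicted probability of the correct class rises and the cross-entropy loss falls.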
