Binary cross-entropy loss
The binary cross-entropy loss is given by:
:<math>
L = - \frac{1}{m} \sum_{i=1}^{m} \left[ y_i \cdot \log{ (\hat{y_i}) } + (1-y_i) \cdot \log{ (1-\hat{y_i}) } \right]
</math>
for <math>m</math> training examples (indexed by <math>i</math>), where <math>y_i</math> is the class label (0 or 1) and <math>\hat{y_i}</math> is the prediction for that example (i.e. the predicted probability that it is a positive example). Thus <math>1-\hat{y_i}</math> is the predicted probability that it is a negative example. Note that we are adding two terms for each example, but since <math>y_i</math> is either 0 or 1, only one of the two terms is non-zero for any given example.
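As an illustration, here is a minimal NumPy sketch of this loss. The function name and the clipping constant <code>eps</code> are choices made for the example (clipping avoids taking <math>\log(0)</math>); they are not part of the definition above.

<syntaxhighlight lang="python">
import numpy as np

def binary_cross_entropy(y, y_hat, eps=1e-12):
    """Mean binary cross-entropy over m examples.

    y     : array of true labels (0 or 1)
    y_hat : array of predicted probabilities of the positive class
    eps   : small constant to keep log() finite (illustrative choice)
    """
    y = np.asarray(y, dtype=float)
    y_hat = np.clip(np.asarray(y_hat, dtype=float), eps, 1 - eps)
    return -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

# Example: two confident correct predictions and one poor one
print(binary_cross_entropy([1, 0, 1], [0.9, 0.1, 0.2]))  # ~0.607
</syntaxhighlight>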