Concept information
PREFERRED TERM
backpropagation
DEFINITION(S)
- statistical method used to calculate the error gradient for each neuron in a neural network, proceeding from the last layer back to the first. The gradient of a function of several variables at a given point is a vector that describes how the function varies in the neighborhood of that point. (Source: translated from https://ia.gdria.fr/Glossaire/retropropagation-du-gradient/ )
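The layer-by-layer chain-rule computation the definition describes can be sketched for a single sigmoid neuron. This is a minimal illustration only, not drawn from the cited sources: the parameter values and the squared-error loss are assumptions, and the analytic gradient is checked against a finite-difference estimate.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w, b, x, y):
    # forward pass: prediction and squared-error loss
    z = w * x + b
    a = sigmoid(z)
    loss = 0.5 * (a - y) ** 2
    return z, a, loss

def backward(w, b, x, y):
    # backward pass: apply the chain rule from the loss back to the parameters
    _, a, _ = forward(w, b, x, y)
    dloss_da = a - y               # derivative of 0.5*(a - y)^2 w.r.t. a
    da_dz = a * (1.0 - a)          # sigmoid derivative
    delta = dloss_da * da_dz       # error signal at the neuron
    return delta * x, delta        # gradients w.r.t. w and b

# sanity check: compare with a central finite-difference gradient
w, b, x, y = 0.5, -0.2, 1.5, 1.0
gw, gb = backward(w, b, x, y)
eps = 1e-6
num_gw = (forward(w + eps, b, x, y)[2] - forward(w - eps, b, x, y)[2]) / (2 * eps)
print(abs(gw - num_gw) < 1e-6)  # True: analytic and numerical gradients agree
```

In a multi-layer network the same error signal (`delta`) is propagated backward through each layer in turn, which is what gives the method its name.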
BROADER CONCEPT(S)
SYNONYM(S)
- backward propagation of errors
BIBLIOGRAPHIC CITATION(S)
- Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1986). Learning internal representations by error propagation. In D. E. Rumelhart & J. L. McClelland (Eds.), Parallel distributed processing. Vol. 1: Foundations (pp. 318–362). MIT Press.
IN OTHER LANGUAGES
- French
URI
http://data.loterre.fr/ark:/67375/23L-CHPG4H6H-D