Concept information
PREFERRED TERM
transformer
DEFINITION(S)
- A deep neural network architecture that processes sequential data, such as text, in parallel, using an attention mechanism to capture both nearby and long-range relationships between elements (adapted from Millière, 2024).
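The attention mechanism named in the definition can be sketched minimally as scaled dot-product self-attention (Vaswani et al., 2017, cited below). All dimensions, matrix names, and the random inputs here are illustrative assumptions, not part of the record:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the chosen axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention.

    X:          (seq_len, d_model) input embeddings
    Wq, Wk, Wv: (d_model, d_k) projection matrices
    Returns:    (seq_len, d_k) context vectors.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Every position attends to every other position in parallel,
    # so nearby and long-range dependencies are handled uniformly.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V

# Toy example with assumed sizes.
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

A full transformer stacks many such attention layers (with multiple heads, feed-forward blocks, and normalization); this sketch shows only the core mechanism the definition refers to.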
BROADER CONCEPT(S)
SYNONYM(S)
- self-attention model
- self-attention network
BIBLIOGRAPHIC REFERENCE(S)
- Millière, R. (2024). Transformers. In Open Encyclopedia of Cognitive Science. MIT Press. https://doi.org/10.21428/e2759450.d3acfbfb
- Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., & Polosukhin, I. (2017). Attention is all you need. Advances in Neural Information Processing Systems, 30. https://proceedings.neurips.cc/paper_files/paper/2017/hash/3f5ee243547dee91fbd053c1c4a845aa-Abstract.html
TRANSLATIONS
French
- modèle auto-attentif
- transformateur
- transformer
URI
http://data.loterre.fr/ark:/67375/23L-WCQC8ZPT-0