
Showing posts with the label "attention"

[Understanding Transformers] The Attention Function

Definition of the attention function, from Section 3.2 ("Attention"): "An attention function can be described as mapping a query and a set of key-value pairs to an output, where the query, keys, values, and output are all vectors. The output is computed as a weighted sum of the values, where the weight assigned to each value is computed by a compatibility function of the query with the corresponding key." Source: Attention Is All You Need. An analogy for the attention function.
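A minimal NumPy sketch of that definition, assuming the softmax-scaled dot-product compatibility function used in the cited paper; the function name, shapes, and example data below are illustrative, not the post's own code.

```python
# Sketch of an attention function: queries and key-value pairs map to outputs
# that are weighted sums of the values (scaled dot-product compatibility).
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v)."""
    d_k = Q.shape[-1]
    # Compatibility of each query with each key, scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the keys turns scores into weights that sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output vector is a weighted sum of the value vectors.
    return weights @ V

# Example: 2 queries attending over 3 key-value pairs.
Q = np.random.randn(2, 4)
K = np.random.randn(3, 4)
V = np.random.randn(3, 8)
print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 8)
```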

[Understanding Transformers] Multi-head Encoder-Decoder Attention

The attention function · Obtaining Q, K, V · Computing the attention score matrix · Computing the attention values
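A hedged sketch of how those steps could look for encoder-decoder (cross) attention, where Q is projected from decoder states and K, V from encoder outputs; the random matrices below stand in for learned projection weights, and all names and dimensions are illustrative assumptions.

```python
# Cross-attention sketch: decoder states attend over encoder outputs.
import numpy as np

rng = np.random.default_rng(0)
d_model, d_k, d_v = 16, 8, 8
enc_out = rng.normal(size=(5, d_model))   # encoder outputs (source length 5)
dec_in  = rng.normal(size=(3, d_model))   # decoder states (target length 3)

# 1. Obtain Q, K, V: Q from the decoder, K and V from the encoder.
W_q, W_k, W_v = (rng.normal(size=(d_model, d)) for d in (d_k, d_k, d_v))
Q, K, V = dec_in @ W_q, enc_out @ W_k, enc_out @ W_v

# 2. Attention score matrix: scaled query-key dot products, softmax over keys.
scores = Q @ K.T / np.sqrt(d_k)                       # (3, 5)
weights = np.exp(scores - scores.max(-1, keepdims=True))
weights /= weights.sum(-1, keepdims=True)

# 3. Attention values: weighted sums of the encoder value vectors.
attn_values = weights @ V
print(attn_values.shape)                              # (3, d_v)
```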

[Understanding Transformers] Multi-head Self-Attention

The attention function · Obtaining Q, K, V · Computing the attention score matrix · Computing the attention values
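For self-attention the same steps apply, except that Q, K, and V are all projected from one and the same input sequence, so every token attends to every token of that sequence. A minimal sketch under the same illustrative assumptions (random matrices standing in for learned weights):

```python
# Self-attention sketch: Q, K, V all come from a single input sequence.
import numpy as np

rng = np.random.default_rng(1)
d_model, d_k = 16, 8
X = rng.normal(size=(4, d_model))          # one input sequence of 4 tokens

# 1. Obtain Q, K, V from the same input X.
W_q, W_k, W_v = (rng.normal(size=(d_model, d_k)) for _ in range(3))
Q, K, V = X @ W_q, X @ W_k, X @ W_v

# 2. Attention score matrix: every token scored against every other token.
scores = Q @ K.T / np.sqrt(d_k)            # (4, 4)
weights = np.exp(scores - scores.max(-1, keepdims=True))
weights /= weights.sum(-1, keepdims=True)

# 3. Attention values: each output mixes value vectors from the whole sequence.
print((weights @ V).shape)                 # (4, d_k)
```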