Source paper: Attention is All You Need (NIPS 2017)
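The function below implements the scaled dot-product attention from the paper, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, where d_k is the key (depth) dimension; the original post left three blanks (A), (B), (C) to fill in.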
import tensorflow as tf


def scaled_dot_product_attention(q, k, v, mask):
    """Calculate the attention weights.

    q, k, v must have matching leading dimensions.
    k, v must have matching penultimate dimension, i.e.: seq_len_k = seq_len_v.
    The mask has different shapes depending on its type (padding or look ahead)
    but it must be broadcastable for addition.

    Args:
        q: query shape == (..., seq_len_q, depth)
        k: key shape == (..., seq_len_k, depth)
        v: value shape == (..., seq_len_v, depth_v)
        mask: Float tensor with shape broadcastable
            to (..., seq_len_q, seq_len_k). Defaults to None.

    Returns:
        output, attention_weights
    """
    matmul_qk = tf.matmul(q, k, transpose_b=True)  # (..., seq_len_q, seq_len_k)

    # scale matmul_qk
    dk = tf.cast(tf.shape(k)[-1], tf.float32)
    scaled_attention_logits = matmul_qk / tf.math.sqrt(dk)  # (A)

    # add the mask to the scaled tensor.
    if mask is not None:
        scaled_attention_logits += (mask * -1e9)

    # softmax is normalized on the last axis (seq_len_k) so that the scores
    # add up to 1.
    attention_weights = tf.nn.softmax(scaled_attention_logits, axis=-1)  # (B); (..., seq_len_q, seq_len_k)

    output = tf.matmul(attention_weights, v)  # (C); (..., seq_len_q, depth_v)

    return output, attention_weights
Answers to the blanks: (A) sqrt, (B) softmax, (C) matmul
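As a quick sanity check (a minimal sketch not in the original post, assuming TensorFlow 2.x and the completed function above), the function can be called with small random tensors to confirm the output shapes and that the attention weights sum to 1 along the key axis:

# Minimal usage sketch (not from the original post): small random tensors.
import tensorflow as tf

batch, seq_len_q, seq_len_k, depth = 2, 4, 6, 8

q = tf.random.normal((batch, seq_len_q, depth))
k = tf.random.normal((batch, seq_len_k, depth))
v = tf.random.normal((batch, seq_len_k, depth))   # depth_v == depth here

output, attention_weights = scaled_dot_product_attention(q, k, v, mask=None)

print(output.shape)             # (2, 4, 8)  -> (..., seq_len_q, depth_v)
print(attention_weights.shape)  # (2, 4, 6)  -> (..., seq_len_q, seq_len_k)
print(tf.reduce_sum(attention_weights, axis=-1))  # each entry ~1.0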