To quickly implementing RNN
18.06.24 Boseop Kim
Contents
Intro
Many to One
Many to Many
Seq2Seq
QnA
Intro
To keep things short.....
You can run every code snippet in these slides as-is,
or follow along in the notebook.
Download : http://tagme.to/aisolab/To_quickly_implementing_RNN
Click : To quickly implementing RNN.ipynb
Intro
We assume you already know how Recurrent Neural Networks work,
and only cover how to use the APIs.
We also look at how a model handles variable sequence lengths,
checking each case through examples.
# How do we feed the words of a sentence into an RNN?
# Split each sentence into words and treat every sentence as one sequence.
sentences = [['I', 'feel', 'hungry'],
             ['tensorflow', 'is', 'very', 'difficult'],
             ['tensorflow', 'is', 'a', 'framework',
              'for', 'deep', 'learning'],
             ['tensorflow', 'is', 'very',
              'fast', 'changing']]
Intro : Padding
To handle variable sequence lengths, TensorFlow generally
requires padding every sequence to the same length,
because before eager mode TensorFlow was a static
(define-and-run) framework.
Padding is done in Python, before the TensorFlow graph is built.
When padding, pad every sequence up to one fixed maximum length.
Intro : Padding
# word dic
word_list = []
for elm in sentences:
    word_list += elm
word_list = list(set(word_list))
word_list.sort()
# add a meaningless '<pad>' token
word_list = ['<pad>'] + word_list
word_dic = {word : idx for idx,
            word in enumerate(word_list)}
Intro : Padding
# sentences shorter than max_len are padded with <pad> up to max_len
def pad_seq(sequences, max_len, dic):
    seq_len, seq_indices = [], []
    for seq in sequences:
        seq_len.append(len(seq))
        seq_idx = [dic.get(char) for char in seq]
        # 0 is idx of meaningless token "<pad>"
        seq_idx += (max_len - len(seq_idx)) * [dic.get('<pad>')]
        seq_indices.append(seq_idx)
    return seq_len, seq_indices
Intro : Padding
max_length = 8
sen_len, sen_indices = pad_seq(sequences = sentences,
                               max_len = max_length,
                               dic = word_dic)
[3, 4, 7, 5]
[[1, 7, 10, 0, 0, 0, 0, 0],
 [13, 11, 14, 5, 0, 0, 0, 0],
 [13, 11, 2, 9, 8, 4, 12, 0],
 [13, 11, 14, 6, 3, 0, 0, 0]]
Intro : Look up
We padded so that variable-length inputs could be fed to the RNN...
Is it right to feed the padded indices into the RNN as-is?
If you want to replace each idx with a dense vector
(eg. word2vec), use tf.nn.embedding_lookup!
tf.nn.embedding_lookup(
    params,
    ids,
    partition_strategy='mod',
    name=None,
    validate_indices=True,
    max_norm=None)
Intro : Look up
import numpy as np
import tensorflow as tf

# placeholders for the params, ids args of tf.nn.embedding_lookup
seq_len = tf.placeholder(dtype = tf.int32, shape = [None])
seq_indices = tf.placeholder(dtype = tf.int32,
                             shape = [None, max_length])
one_hot = np.eye(len(word_dic)) # one-hot encoding for each word
# trainable = False, so the embedding vectors are not trained
one_hot = tf.get_variable(name='one_hot',
                          initializer = one_hot,
                          trainable = False)
seq_batch = tf.nn.embedding_lookup(params = one_hot,
                                   ids = seq_indices)
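If you instead want the embedding itself to be learned, a minimal
variant is sketched below; embedding_dim and the initializer are
assumptions, not taken from the slides.
# hypothetical trainable embedding matrix instead of the fixed one-hot matrix
embedding_dim = 8  # assumed embedding size
embedding = tf.get_variable(name='embedding',
                            shape=[len(word_dic), embedding_dim],
                            dtype=tf.float32,
                            initializer=tf.random_uniform_initializer(-1., 1.))
seq_batch = tf.nn.embedding_lookup(params=embedding, ids=seq_indices)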
Intro : Look up
with tf.Session() as sess:
sess.run(tf.global_variables_initializer())
tmp = sess.run(seq_batch,
feed_dict = {seq_indices : sen_indices})
print(np.shape(sen_indices))
print(np.shape(tmp))
(4, 8)
(4, 8, 15)
# tf.nn.dynamic_rnn, tf.contrib.seq2seq.TrainingHelper 煙
#  shape 讌覃伎 伎狩
11
Many to One
eg. an RNN (with GRU) that classifies each sentence as positive/negative
Example data
sentences = [['I', 'feel', 'hungry'],
             ['tensorflow', 'is', 'very', 'difficult'],
             ['tensorflow', 'is', 'a', 'framework', 'for',
              'deep', 'learning'],
             ['tensorflow', 'is', 'very', 'fast',
              'changing']]
label = [[0.,1.], [0.,1.], [1.,0.], [1.,0.]]
Example data
Build a dictionary that maps each word in sentences to an idx.
# word dic
word_list = []
for elm in sentences:
    word_list += elm
word_list = list(set(word_list))
word_list.sort()
# add a meaningless '<pad>' token
word_list = ['<pad>'] + word_list
word_dic = {word : idx for idx,
            word in enumerate(word_list)}
Example data
Using the dictionary, the sentences are processed as follows.
from pprint import pprint

max_length = 8
sen_len, sen_indices = pad_seq(sequences = sentences,
                               max_len = max_length,
                               dic = word_dic)
pprint(sen_len)
pprint(sen_indices)
[3, 4, 7, 5]
[[1, 7, 10, 0, 0, 0, 0, 0],
 [13, 11, 14, 5, 0, 0, 0, 0],
 [13, 11, 2, 9, 8, 4, 12, 0],
 [13, 11, 14, 6, 3, 0, 0, 0]]
Simple
tf.nn.dynamic_rnn
Used to run the cell; of its return values, only state is used.
Passing the pre-padding sentence lengths via sequence_length makes it
process only the valid timesteps of each sequence.
tf.losses.softmax_cross_entropy
Computes the usual NN classification loss.
tf.nn.dynamic_rnn(cell, inputs,
    sequence_length=None, initial_state=None,
    dtype=None, parallel_iterations=None,
    swap_memory=False, time_major=False,
    scope=None)
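A minimal sketch of how these pieces could fit together for the
sentence classifier; hidden_dim and all variable names below are
assumptions, not taken from the notebook.
hidden_dim = 16  # assumed hidden size
label_ph = tf.placeholder(dtype=tf.float32, shape=[None, 2])

gru_cell = tf.contrib.rnn.GRUCell(num_units=hidden_dim)
# state holds the final hidden state of each sequence: [None, hidden_dim]
_, state = tf.nn.dynamic_rnn(cell=gru_cell,
                             inputs=tf.cast(seq_batch, tf.float32),  # the one-hot lookup above is float64
                             sequence_length=seq_len, dtype=tf.float32)
score = tf.layers.dense(inputs=state, units=2)  # class logits
loss = tf.losses.softmax_cross_entropy(onehot_labels=label_ph,
                                       logits=score)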
Stacked
tf.contrib.rnn.MultiRNNCell
Used to stack cells.
tf.contrib.rnn.DropoutWrapper
Since the network gets deeper, used to prevent overfitting.
tf.contrib.rnn.MultiRNNCell(cells,
    state_is_tuple=True)
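A minimal stacking sketch under the same assumptions (two layers and
keep probability 0.5 are arbitrary choices):
cells = [tf.contrib.rnn.GRUCell(num_units=16) for _ in range(2)]
cells = [tf.contrib.rnn.DropoutWrapper(cell=cell, output_keep_prob=.5)
         for cell in cells]
stacked_cell = tf.contrib.rnn.MultiRNNCell(cells=cells)
# a stacked cell drops into tf.nn.dynamic_rnn exactly like a single cell;
# states is a tuple with one final state per layer
_, states = tf.nn.dynamic_rnn(cell=stacked_cell,
                              inputs=tf.cast(seq_batch, tf.float32),
                              sequence_length=seq_len, dtype=tf.float32)
top_state = states[-1]  # feed the top layer's state to the classifier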
Bi-directional
tf.nn.bidirectional_dynamic_rnn
Used to run a fw_cell and a bw_cell together; of its return values, only
output_states is used. output_states holds the final states of fw_cell
and bw_cell; concatenate the two before computing the loss.
tf.nn.bidirectional_dynamic_rnn(cell_fw, cell_bw,
    inputs, sequence_length=None,
    initial_state_fw=None, initial_state_bw=None,
    dtype=None, parallel_iterations=None,
    swap_memory=False, time_major=False, scope=None)
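A minimal bidirectional sketch, again with assumed sizes and names:
fw_cell = tf.contrib.rnn.GRUCell(num_units=16)
bw_cell = tf.contrib.rnn.GRUCell(num_units=16)
_, output_states = tf.nn.bidirectional_dynamic_rnn(
    cell_fw=fw_cell, cell_bw=bw_cell,
    inputs=tf.cast(seq_batch, tf.float32),
    sequence_length=seq_len, dtype=tf.float32)
# output_states = (fw final state, bw final state); concatenate them
final_state = tf.concat([output_states[0], output_states[1]], axis=1)
score = tf.layers.dense(inputs=final_state, units=2)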
Stacked Bi-directional
tf.contrib.rnn.stack_bidirectional_dynamic_rnn
Used to run lists of fw_cells and bw_cells.
Of its return values, output_state_fw and output_state_bw are
concatenated and used.
tf.contrib.rnn.stack_bidirectional_dynamic_rnn(
    cells_fw, cells_bw,
    inputs, initial_states_fw=None,
    initial_states_bw=None, dtype=None,
    sequence_length=None,
    parallel_iterations=None,
    time_major=False,
    scope=None)
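A minimal stacked bidirectional sketch (layer count and sizes assumed):
fw_cells = [tf.contrib.rnn.GRUCell(num_units=16) for _ in range(2)]
bw_cells = [tf.contrib.rnn.GRUCell(num_units=16) for _ in range(2)]
_, output_state_fw, output_state_bw = \
    tf.contrib.rnn.stack_bidirectional_dynamic_rnn(
        cells_fw=fw_cells, cells_bw=bw_cells,
        inputs=tf.cast(seq_batch, tf.float32),
        sequence_length=seq_len, dtype=tf.float32)
# concatenate the top layer's final state from each direction
final_state = tf.concat([output_state_fw[-1], output_state_bw[-1]], axis=1)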
Many to Many
eg. an RNN (with GRU) for part-of-speech tagging
Example data
sentences = [['I', 'feel', 'hungry'],
             ['tensorflow', 'is', 'very', 'difficult'],
             ['tensorflow', 'is', 'a', 'framework', 'for',
              'deep', 'learning'],
             ['tensorflow', 'is', 'very', 'fast',
              'changing']]
pos = [['pronoun', 'verb', 'adjective'],
       ['noun', 'verb', 'adverb', 'adjective'],
       ['noun', 'verb', 'determiner', 'noun',
        'preposition', 'adjective', 'noun'],
       ['noun', 'verb', 'adverb', 'adjective', 'verb']]
Example data
Build a dictionary that maps each word in sentences to an idx,
and a dictionary that maps each pos token to an idx.
# word dic
word_list = []
for elm in sentences:
    word_list += elm
word_list = list(set(word_list))
word_list.sort()
word_list = ['<pad>'] + word_list
word_dic = {word : idx for idx, word
            in enumerate(word_list)}
Example data
Build a dictionary that maps each pos token to an idx.
# pos dic
pos_list = []
for elm in pos:
    pos_list += elm
pos_list = list(set(pos_list))
pos_list.sort()
pos_list = ['<pad>'] + pos_list
pos_dic = {pos : idx for idx, pos in enumerate(pos_list)}
Example data
Using the dictionaries, the data are processed as follows.
sen_len, sen_indices = pad_seq(sequences = sentences,
                               max_len = max_length,
                               dic = word_dic)
_, pos_indices = pad_seq(sequences = pos,
                         max_len = max_length,
                         dic = pos_dic)
Example data
pprint(sen_len)
pprint(sen_indices)
[3, 4, 7, 5]
[[1, 7, 10, 0, 0, 0, 0, 0],
 [13, 11, 14, 5, 0, 0, 0, 0],
 [13, 11, 2, 9, 8, 4, 12, 0],
 [13, 11, 14, 6, 3, 0, 0, 0]]
Example data
pprint(pos_indices)
[[6, 7, 1, 0, 0, 0, 0, 0],
 [4, 7, 2, 1, 0, 0, 0, 0],
 [4, 7, 3, 4, 5, 1, 4, 0],
 [4, 7, 2, 1, 7, 0, 0, 0]]
Simple
Cells are wired as in the corresponding Many to One case,
except that of tf.nn.dynamic_rnn's return values, outputs is used.
tf.contrib.rnn.OutputProjectionWrapper
Used to classify at every step.
tf.sequence_mask
Used to compute the loss only over the valid timesteps.
tf.contrib.seq2seq.sequence_loss
Pass the tf.sequence_mask output via the weights arg;
the targets arg takes labels of shape [None, sequence_length].
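A minimal sketch for the per-step tagger; the hidden size and all
names below are assumptions:
n_pos = len(pos_dic)
pos_ph = tf.placeholder(dtype=tf.int32, shape=[None, max_length])

cell = tf.contrib.rnn.GRUCell(num_units=16)
# project the hidden state to pos scores at every timestep
cell = tf.contrib.rnn.OutputProjectionWrapper(cell=cell, output_size=n_pos)
outputs, _ = tf.nn.dynamic_rnn(cell=cell,
                               inputs=tf.cast(seq_batch, tf.float32),
                               sequence_length=seq_len, dtype=tf.float32)
# mask the padded timesteps out of the loss
masking = tf.sequence_mask(lengths=seq_len, maxlen=max_length,
                           dtype=tf.float32)
loss = tf.contrib.seq2seq.sequence_loss(logits=outputs, targets=pos_ph,
                                        weights=masking)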
Stacked
Cells are wired as in the corresponding Many to One case,
except that of tf.nn.dynamic_rnn's return values, outputs is used.
tf.contrib.rnn.OutputProjectionWrapper
Used to classify at every step.
tf.sequence_mask
Used to compute the loss only over the valid timesteps.
tf.contrib.seq2seq.sequence_loss
Pass the tf.sequence_mask output via the weights arg;
the targets arg takes labels of shape [None, sequence_length].
Bi-directional
Cells are wired as in the corresponding Many to One case, except that
of tf.nn.bidirectional_dynamic_rnn's return values, outputs is used.
tf.map_fn
Used to classify at every step.
tf.sequence_mask
Used to compute the loss only over the valid timesteps.
tf.contrib.seq2seq.sequence_loss
Pass the tf.sequence_mask output via the weights arg;
the targets arg takes labels of shape [None, sequence_length].
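A minimal sketch of the per-step projection with tf.map_fn over the
concatenated forward/backward outputs (sizes and names assumed):
fw_cell = tf.contrib.rnn.GRUCell(num_units=16)
bw_cell = tf.contrib.rnn.GRUCell(num_units=16)
outputs, _ = tf.nn.bidirectional_dynamic_rnn(
    cell_fw=fw_cell, cell_bw=bw_cell,
    inputs=tf.cast(seq_batch, tf.float32),
    sequence_length=seq_len, dtype=tf.float32)
# per-timestep concat of fw/bw outputs: [None, max_length, 32]
concat_outputs = tf.concat([outputs[0], outputs[1]], axis=2)
# apply one shared projection at every timestep with tf.map_fn
weights_proj = tf.get_variable(name='proj_w', shape=[32, n_pos],
                               dtype=tf.float32)
score = tf.map_fn(lambda elm: tf.matmul(elm, weights_proj),
                  concat_outputs)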
Stacked Bi-directional
Cells are wired as in the corresponding Many to One case, except that
of tf.contrib.rnn.stack_bidirectional_dynamic_rnn's return values,
outputs is used.
tf.map_fn
Used to classify at every step.
tf.sequence_mask
Used to compute the loss only over the valid timesteps.
tf.contrib.seq2seq.sequence_loss
Pass the tf.sequence_mask output via the weights arg;
the targets arg takes labels of shape [None, sequence_length].
Seq2Seq
eg. an RNN (with GRU) that translates sentences
Example data
targets = [['나는', '배가', '고프다'],
           ['텐서플로우는', '매우', '어렵다'],
           ['텐서플로우는', '딥러닝을', '위한', '프레임워크이다'],
           ['텐서플로우는', '매우', '빠르게', '변한다']]
sources = [['I', 'feel', 'hungry'],
           ['tensorflow', 'is', 'very', 'difficult'],
           ['tensorflow', 'is', 'a', 'framework', 'for', 'deep', 'learning'],
           ['tensorflow', 'is', 'very', 'fast', 'changing']]
Example data
Build a dictionary that maps each word in sources to an idx,
and a dictionary that maps each word in targets to an idx.
# word dic for sentences
source_words = []
for elm in sources:
    source_words += elm
source_words = list(set(source_words))
source_words.sort()
source_words = ['<pad>'] + source_words
source_dic = {word : idx for idx, word
              in enumerate(source_words)}
Example data
# word dic for translations
target_words = []
for elm in targets:
    target_words += elm
target_words = list(set(target_words))
target_words.sort()
# add 'start' and 'end' tokens marking the start and end of a translation
target_words = ['<pad>'] + ['<start>'] + ['<end>'] + \
               target_words
target_dic = {word : idx for idx, word
              in enumerate(target_words)}
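The deck defers the full model to the notebook; below is a minimal
sketch of the training side of such a seq2seq model, assuming a GRU
encoder and decoder. Every placeholder and size here (enc_batch,
dec_batch, dec_targets, source_len, target_len, the hidden size) is a
hypothetical name, not taken from the slides.
# hypothetical placeholders for the embedded source/target batches
enc_batch = tf.placeholder(dtype=tf.float32,
                           shape=[None, None, len(source_dic)])
dec_batch = tf.placeholder(dtype=tf.float32,
                           shape=[None, None, len(target_dic)])
dec_targets = tf.placeholder(dtype=tf.int32, shape=[None, None])
source_len = tf.placeholder(dtype=tf.int32, shape=[None])
target_len = tf.placeholder(dtype=tf.int32, shape=[None])

# encoder: same pattern as Many to One, keep only the final state
enc_cell = tf.contrib.rnn.GRUCell(num_units=16)
_, enc_state = tf.nn.dynamic_rnn(cell=enc_cell, inputs=enc_batch,
                                 sequence_length=source_len,
                                 dtype=tf.float32, scope='encoder')

# decoder: feed the ground-truth target tokens at training time
dec_cell = tf.contrib.rnn.GRUCell(num_units=16)
helper = tf.contrib.seq2seq.TrainingHelper(inputs=dec_batch,
                                           sequence_length=target_len)
decoder = tf.contrib.seq2seq.BasicDecoder(
    cell=dec_cell, helper=helper, initial_state=enc_state,
    output_layer=tf.layers.Dense(units=len(target_dic)))
dec_outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(decoder=decoder)

# mask padded target positions out of the loss
masking = tf.sequence_mask(lengths=target_len,
                           maxlen=tf.reduce_max(target_len),
                           dtype=tf.float32)
loss = tf.contrib.seq2seq.sequence_loss(logits=dec_outputs.rnn_output,
                                        targets=dec_targets,
                                        weights=masking)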
From here, straight to the code...
QnA
Reference
https://github.com/aisolab/CS20
https://github.com/HiJiGOO/tf_nmt_tutorial
https://github.com/hccho2/RNN-Tutorial
https://www.tensorflow.org/tutorials/seq2seq
https://github.com/golbin/TensorFlow-Tutorials/blob/master/10 - RNN/03 - Seq2Seq.py
Thank you for listening.
Github : github.com/aisolab
Blog : aisolab.github.io
E-mail : bsk0130@gmail.com