#Necessary Imports
from numpy import array
from keras.preprocessing.text import Tokenizer
from keras.utils import to_categorical
from keras.preprocessing.sequence import pad_sequences
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import LSTM
from keras.layers import Embedding
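Depending on your environment, these modules may need to come from the bundled tensorflow.keras package instead of standalone keras; the commented-out alternative below is an assumption about a TensorFlow 2.x setup, not part of the original notebook.
#Alternative imports for TensorFlow 2.x (assumption; uncomment if the keras.* imports above fail)
#from tensorflow.keras.preprocessing.text import Tokenizer
#from tensorflow.keras.utils import to_categorical
#from tensorflow.keras.preprocessing.sequence import pad_sequences
#from tensorflow.keras.models import Sequential
#from tensorflow.keras.layers import Dense, LSTM, Embedding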
#Define a simple text
data = """Jack and Jill went up the hill\n
To fetch a pail of water\n
Jack fell down and broke his crown\n
And Jill came tumbling after\n"""
#data = open("/Users/jerry/Downloads/shakespear.txt").read().lower()
Given one word as input, the model will learn to predict the next word in the sequence, e.g. Jack --> fell, fetch --> a.
The first step is to encode the text as integers: each lowercase word in the source text is assigned a unique integer, so we can convert the sequence of words into a sequence of integers.
#Keras provides the Tokenizer class that can be used to perform this encoding.
#First, the Tokenizer is fit on the source text to develop the mapping from words to unique integers.
#Then sequences of text can be converted to sequences of integers by calling the texts_to_sequences() function.
tokenizer = Tokenizer()
tokenizer.fit_on_texts([data])
encoded = tokenizer.texts_to_sequences([data])[0]
#Let's see the mappings
print(tokenizer.word_index)
print(tokenizer.word_counts)
print('')
{'and': 1, 'a': 10, 'his': 17, 'pail': 11, 'down': 15, 'of': 12, 'after': 21, 'crown': 18, 'up': 5, 'came': 19, 'fetch': 9, 'water': 13, 'to': 8, 'jill': 3, 'tumbling': 20, 'jack': 2, 'broke': 16, 'the': 6, 'went': 4, 'fell': 14, 'hill': 7} OrderedDict([('jack', 2), ('and', 3), ('jill', 2), ('went', 1), ('up', 1), ('the', 1), ('hill', 1), ('to', 1), ('fetch', 1), ('a', 1), ('pail', 1), ('of', 1), ('water', 1), ('fell', 1), ('down', 1), ('broke', 1), ('his', 1), ('crown', 1), ('came', 1), ('tumbling', 1), ('after', 1)])
We will need to know the size of the vocabulary later for two reasons: (a) one-hot encoding the output words, and (b) defining the word embedding layer in the model.
The size of the vocabulary can be retrieved from the trained Tokenizer by accessing the word_index attribute.
#We add 1 to the actual vocabulary size because word indices start at 1 (index 0 is reserved for padding), so arrays must be large enough to hold the highest index
vocab_size = len(tokenizer.word_index) + 1
print('Vocabulary Size: %d' % vocab_size)
print('')
Vocabulary Size: 22
Next, we need to create sequences of words to fit the model with one word as input and one word as output.
sequences = list()
for i in range(1, len(encoded)):
    sequence = encoded[i-1:i+1]
    sequences.append(sequence)
print('Total Sequences: %d' % len(sequences))
print('')
print(sequences)
Total Sequences: 24 [[2, 1], [1, 3], [3, 4], [4, 5], [5, 6], [6, 7], [7, 8], [8, 9], [9, 10], [10, 11], [11, 12], [12, 13], [13, 2], [2, 14], [14, 15], [15, 1], [1, 16], [16, 17], [17, 18], [18, 1], [1, 3], [3, 19], [19, 20], [20, 21]]
We will fit our model to predict a probability distribution across all words in the vocabulary.
That means we need to turn the output element from a single integer into a one-hot encoding: a vector with a 0 for every word in the vocabulary and a 1 for the word that the integer actually represents.
This gives the network a ground truth to aim for from which we can calculate error and update the model.
Keras provides the to_categorical() function that we can use to convert the integer to a one hot encoding while specifying the number of classes as the vocabulary size.
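As a quick sanity check of what to_categorical() produces, here is a tiny hypothetical example (not from the original run): encoding the integer 2 with 5 classes yields a vector whose only 1 sits at index 2.
# Tiny illustrative example (assumption: any integer label and class count would do)
print(to_categorical([2], num_classes=5))
# expected output: [[0. 0. 1. 0. 0.]]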
# Split into X and y elements (input, output)
sequences = array(sequences)
X, y = sequences[:,0],sequences[:,1]
# One-hot-encode outputs (it's a classification problem)
y = to_categorical(y, num_classes=vocab_size)
print(X.shape)
print(y.shape)
(24,) (24, 22)
The model uses a learned word embedding in the input layer. This has one real-valued vector for each word in the vocabulary, where each word vector has a specified length. In this case we will use a 10-dimensional projection. The input sequence contains a single word, therefore the input_length=1.
The model has a single hidden LSTM layer with 50 units, which is far more capacity than this tiny problem needs. The output layer has one neuron for each word in the vocabulary and uses a softmax activation so that the outputs can be interpreted as a probability distribution over the vocabulary.
# Finally, let's build the model (One word in - One word out)
model = Sequential()
model.add(Embedding(vocab_size, 10, input_length=1))
model.add(LSTM(50))
model.add(Dense(vocab_size, activation='softmax'))
print(model.summary())
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
embedding_7 (Embedding)      (None, 1, 10)             220
_________________________________________________________________
lstm_8 (LSTM)                (None, 50)                12200
_________________________________________________________________
dense_6 (Dense)              (None, 22)                1122
=================================================================
Total params: 13,542
Trainable params: 13,542
Non-trainable params: 0
_________________________________________________________________
None
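The parameter counts in the summary can be checked by hand: the Embedding layer holds 22 x 10 = 220 weights (one 10-dimensional vector per vocabulary entry), the LSTM layer holds 4 x ((10 + 50 + 1) x 50) = 12,200 weights (four gates, each with input, recurrent, and bias terms), and the Dense output layer holds (50 + 1) x 22 = 1,122 weights, giving 13,542 parameters in total.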
Next, we can compile and fit the network on the encoded text data. Technically, we are modeling a multi-class classification problem (predicting the next word from the vocabulary), so we use the categorical cross-entropy loss function. We use the efficient Adam implementation of gradient descent and track accuracy at the end of each epoch. The model is fit for 500 training epochs, again perhaps more than is needed.
# compile network
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
# fit network
model.fit(X, y, epochs=500, verbose=2)
Epoch 1/500 - 1s - loss: 3.0920 - acc: 0.0000e+00
Epoch 2/500 - 0s - loss: 3.0912 - acc: 0.0417
Epoch 3/500 - 0s - loss: 3.0904 - acc: 0.0417
...
Epoch 499/500 - 0s - loss: 0.2340 - acc: 0.8750
Epoch 500/500 - 0s - loss: 0.2338 - acc: 0.8750
<keras.callbacks.History at 0x10b83ad10>
After the model is fit, we test it by passing in a word from the vocabulary and having the model predict the next word. Here we pass in 'jill' by encoding it and calling model.predict_classes() to get the integer index of the predicted word. That index is then looked up in the vocabulary mapping to recover the associated word.
# Actually get to use the model!
in_text = 'jill'
print(in_text)
encoded = tokenizer.texts_to_sequences([in_text])[0]
encoded = array(encoded)
yhat = model.predict_classes(encoded, verbose=0)
for word, index in tokenizer.word_index.items():
    if index == yhat:
        print(word)
jill
came
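Note that model.predict_classes() has been removed in recent Keras releases; if it is unavailable in your version, a minimal equivalent (an assumption about your setup, taking numpy's argmax over the predicted probabilities) is:
# Alternative for newer Keras versions where predict_classes() no longer exists
import numpy as np
yhat = np.argmax(model.predict(encoded, verbose=0), axis=-1)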
# Function to generate a sequence from the model
def generate_seq(model, tokenizer, seed_text, n_words):
    in_text, result = seed_text, seed_text
    # generate a fixed number of words
    for _ in range(n_words):
        # encode the text as integer
        encoded = tokenizer.texts_to_sequences([in_text])[0]
        encoded = array(encoded)
        # predict a word in the vocabulary
        yhat = model.predict_classes(encoded, verbose=0)
        # map predicted word index to word
        out_word = ''
        for word, index in tokenizer.word_index.items():
            if index == yhat:
                out_word = word
                break
        # append to input
        in_text, result = out_word, result + ' ' + out_word
    return result
#Use the function to get cool outputs for more steps
print(generate_seq(model, tokenizer, 'after', 6))
after pail of water jack and jill
This model works, but it does not take full advantage of the LSTM's ability to use longer context. Another approach is to split the source text up line by line, then break each line down into a series of words that build up, e.g.
Input                               Output
_, _, _, _, _, Jack                 and
_, _, _, _, Jack, and               Jill
_, _, _, Jack, and, Jill            went
_, _, Jack, and, Jill, went         up
_, Jack, and, Jill, went, up        the
Jack, and, Jill, went, up, the      hill
This approach may allow the model to use the context of each line in those cases where a simple one-word-in, one-word-out model creates ambiguity.
This comes at the cost of not predicting words across lines, which might be fine for now if we are only interested in modeling and generating individual lines of text.
Note that in this representation we will need to pad the sequences so that they all share a fixed input length. This is a requirement when using Keras.
# create line-based sequences
sequences = list()
for line in data.split('\n'):
    encoded = tokenizer.texts_to_sequences([line])[0]
    for i in range(1, len(encoded)):
        sequence = encoded[:i+1]
        sequences.append(sequence)
print('Total Sequences: %d' % len(sequences))
print(sequences)
Total Sequences: 21 [[2, 1], [2, 1, 3], [2, 1, 3, 4], [2, 1, 3, 4, 5], [2, 1, 3, 4, 5, 6], [2, 1, 3, 4, 5, 6, 7], [8, 9], [8, 9, 10], [8, 9, 10, 11], [8, 9, 10, 11, 12], [8, 9, 10, 11, 12, 13], [2, 14], [2, 14, 15], [2, 14, 15, 1], [2, 14, 15, 1, 16], [2, 14, 15, 1, 16, 17], [2, 14, 15, 1, 16, 17, 18], [1, 3], [1, 3, 19], [1, 3, 19, 20], [1, 3, 19, 20, 21]]
#Next, we can pad the prepared sequences. We can do this using the pad_sequences() function provided in Keras.
#This first involves finding the longest sequence, then using that as the length by which to pad-out all other sequences.
max_length = max([len(seq) for seq in sequences])
sequences = pad_sequences(sequences, maxlen=max_length, padding='pre')
print('Max Sequence Length: %d' % max_length)
#Split into input and output elements
sequences = array(sequences)
X, y = sequences[:,:-1],sequences[:,-1]
y = to_categorical(y, num_classes=vocab_size)
Max Sequence Length: 7
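To see what the pre-padding did, we can inspect the first example (a small illustrative check, not part of the original output): the first line-based sequence [2, 1] is padded out to length 7, so its input is five leading zeros followed by the index for 'jack', and its output element is the index for 'and'.
# Illustrative check (hypothetical extra cell)
print(X[0])           # expected: [0 0 0 0 0 2]
print(y[0].argmax())  # expected: 1, the integer assigned to 'and'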
The model can then be defined as before, except that the input sequences are now longer than a single word. Specifically, they are max_length-1 in length (the -1 is because the maximum sequence length we calculated above counts both the input and the output elements).
#Define the model
model = Sequential()
model.add(Embedding(vocab_size, 10, input_length=max_length-1))
model.add(LSTM(50))
model.add(Dense(vocab_size, activation='softmax'))
print(model.summary())
#Compile network
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
#Fit network
model.fit(X, y, epochs=500, verbose=2)
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
embedding_8 (Embedding)      (None, 6, 10)             220
_________________________________________________________________
lstm_9 (LSTM)                (None, 50)                12200
_________________________________________________________________
dense_7 (Dense)              (None, 22)                1122
=================================================================
Total params: 13,542
Trainable params: 13,542
Non-trainable params: 0
_________________________________________________________________
None
Epoch 1/500 - 1s - loss: 3.0917 - acc: 0.0476
Epoch 2/500 - 0s - loss: 3.0905 - acc: 0.0476
Epoch 3/500 - 0s - loss: 3.0889 - acc: 0.0952
...
Epoch 499/500 - 0s - loss: 0.1082 - acc: 0.9524
Epoch 500/500 - 0s - loss: 0.1080 - acc: 0.9524
<keras.callbacks.History at 0x182b69c0d0>
#Generate a sequence from a language model
def generate_seq(model, tokenizer, max_length, seed_text, n_words):
    in_text = seed_text
    # generate a fixed number of words
    for _ in range(n_words):
        # encode the text as integers
        encoded = tokenizer.texts_to_sequences([in_text])[0]
        # pre-pad sequences to a fixed length
        encoded = pad_sequences([encoded], maxlen=max_length, padding='pre')
        # predict the index of the most likely next word (greedy decoding)
        yhat = model.predict_classes(encoded, verbose=0)
        # map the predicted word index back to a word
        out_word = ''
        for word, index in tokenizer.word_index.items():
            if index == yhat:
                out_word = word
                break
        # append to input
        in_text += ' ' + out_word
    return in_text
#Evaluate model
print(generate_seq(model, tokenizer, max_length-1, 'Jack', 4))
print(generate_seq(model, tokenizer, max_length-1, 'jill', 4))
Jack fell down and broke
jill fetch a pail water
What do you think about these two sentences? Does this make sense?
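One reason the generated text can read oddly is that predict_classes always picks the single most likely next word (greedy decoding), so a common continuation word like "and" can pull the model onto a locally likely but globally awkward path. A minimal sketch of an alternative that samples the next word from the predicted distribution is shown below; it assumes the same trained model and fitted tokenizer, and the function name and temperature argument are illustrative, not part of the original notebook:
import numpy as np

def generate_seq_sampled(model, tokenizer, max_length, seed_text, n_words, temperature=1.0):
    # variant of generate_seq that samples instead of taking the argmax
    in_text = seed_text
    index_word = {index: word for word, index in tokenizer.word_index.items()}
    for _ in range(n_words):
        encoded = tokenizer.texts_to_sequences([in_text])[0]
        encoded = pad_sequences([encoded], maxlen=max_length, padding='pre')
        # model.predict returns the full softmax distribution over the vocabulary
        probs = model.predict(encoded, verbose=0)[0]
        # temperature < 1 sharpens the distribution (closer to greedy), > 1 flattens it
        logits = np.log(probs + 1e-8) / temperature
        probs = np.exp(logits) / np.sum(np.exp(logits))
        yhat = np.random.choice(len(probs), p=probs)
        # index 0 is the padding slot and has no word, so fall back to an empty string
        in_text += ' ' + index_word.get(yhat, '')
    return in_text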
=Model 3
We can use a middle-ground approach between the one-word-in and whole-sentence-in framings and pass in a sub-sequence of words as input.
This provides a trade-off between the two framings, allowing new lines to be generated and allowing generation to be picked up mid-line.
We will use a window of two words as input to predict one word as output, so each training sample spans three consecutive words. The preparation of the sequences is much like the first example, except with a different offset in the source sequence array, as follows:
# encode 2 words -> 1 word
sequences = list()
for i in range(2, len(encoded)):
    sequence = encoded[i-2:i+1]
    sequences.append(sequence)
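Given the word-to-integer mapping printed earlier (and=1, jack=2, jill=3, went=4, up=5, ...), each triple this loop produces is two input words followed by their target word. A quick check, assuming the same fitted tokenizer as above:
print(sequences[:3])
# expected: [[2, 1, 3], [1, 3, 4], [3, 4, 5]]
# i.e. (jack, and) -> jill, (and, jill) -> went, (jill, went) -> up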
# source text
data = """ Jack and Jill went up the hill\n
To fetch a pail of water\n
Jack fell down and broke his crown\n
And Jill came tumbling after\n """
#data = open("/Users/jerry/Downloads/shakespear.txt").read().lower()
# integer encode sequences of words
tokenizer = Tokenizer()
tokenizer.fit_on_texts([data])
encoded = tokenizer.texts_to_sequences([data])[0]
# retrieve vocabulary size
vocab_size = len(tokenizer.word_index) + 1
print('Vocabulary Size: %d' % vocab_size)
# encode 2 words -> 1 word
sequences = list()
for i in range(2, len(encoded)):
    sequence = encoded[i-2:i+1]
    sequences.append(sequence)
print('Total Sequences: %d' % len(sequences))
# pad sequences
max_length = max([len(seq) for seq in sequences])
sequences = pad_sequences(sequences, maxlen=max_length, padding='pre')
print('Max Sequence Length: %d' % max_length)
# split into input and output elements
sequences = array(sequences)
X, y = sequences[:,:-1],sequences[:,-1]
y = to_categorical(y, num_classes=vocab_size)
# define model
model = Sequential()
model.add(Embedding(vocab_size, 15, input_length=max_length-1))
model.add(LSTM(128))
model.add(Dense(vocab_size, activation='softmax'))
print(model.summary())
# compile network
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
# fit network
model.fit(X, y, epochs=500, verbose=2)
Vocabulary Size: 22
Total Sequences: 23
Max Sequence Length: 3
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
embedding_9 (Embedding)      (None, 2, 15)             330
_________________________________________________________________
lstm_10 (LSTM)               (None, 128)               73728
_________________________________________________________________
dense_8 (Dense)              (None, 22)                2838
=================================================================
Total params: 76,896
Trainable params: 76,896
Non-trainable params: 0
_________________________________________________________________
None
Epoch 1/500 - 1s - loss: 3.0916 - acc: 0.0435
[epochs 2-499 abridged: loss falls from 3.0906 to 0.0619; accuracy climbs from 0.0870, reaches 0.9565 around epoch 107 and stays there]
Epoch 500/500 - 0s - loss: 0.0619 - acc: 0.9565
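As a quick sanity check that the split feeds the Embedding layer what it expects (two integer-encoded input words per sample, and a one-hot target over the 22-word vocabulary), two print statements could be added right after the split; the shapes in the comments are what this run should produce:
print(X.shape)  # expected (23, 2): 23 samples, 2 input words each
print(y.shape)  # expected (23, 22): one-hot target over the vocabulary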
# evaluate model
print(generate_seq(model, tokenizer, max_length-1, 'Jack and', 5))
print(generate_seq(model, tokenizer, max_length-1, 'And Jill', 3))
print(generate_seq(model, tokenizer, max_length-1, 'fell down', 5))
print(generate_seq(model, tokenizer, max_length-1, 'pail of', 5))
Jack and jill went up the hill
And Jill went up the
fell down and broke his crown and
pail of water jack fell down and
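To reuse a trained model without repeating the 500 training epochs, both the network and the fitted Tokenizer need to be saved, since generation depends on the exact word-to-integer mapping. A minimal sketch, with illustrative file names:
import pickle
from keras.models import load_model

# persist the trained network and the tokenizer that defines its vocabulary
model.save('word_lm.h5')
with open('tokenizer.pkl', 'wb') as f:
    pickle.dump(tokenizer, f)

# later: reload both and generate exactly as before (max_length-1 is 2 here)
model = load_model('word_lm.h5')
with open('tokenizer.pkl', 'rb') as f:
    tokenizer = pickle.load(f)
print(generate_seq(model, tokenizer, 2, 'Jack and', 5))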