
Chapter 5: Neural Networks


Neural Networks

1. Fundamentals

A neural network is a broadly parallel, interconnected network composed of simple adaptive units.

Perceptron

A perceptron consists of only two layers of neurons, and only the output-layer neurons are M-P neurons, i.e. functional neurons.

The back-propagation (BP) algorithm can be applied to multi-layer feedforward neural networks, and it can also be used to train recurrent neural networks.

When people speak of a "BP network", they usually mean a multi-layer feedforward neural network trained with the BP algorithm.

Basic deep-learning terminology

Convolutional neural network (CNN)

A CNN stacks multiple convolutional layers and sampling (pooling) layers to process the input signal, and finally realizes the mapping to the output targets in the connected layers.

Convolutional layer: contains multiple feature maps; each feature map is a plane composed of multiple neurons.

Sampling (pooling) layer: performs sub-sampling based on the principle of local correlation, reducing the amount of data while retaining useful information.

Seen from another angle, this means letting the machine replace the expert's hand-crafted "feature engineering".

Activation functions of neural networks

1. Logistic: the typical sigmoid activation function, very useful when computing classification probabilities.

$$f(z) = \frac{1}{1 + \exp(-z)}, \qquad 0 < f(z) < 1$$

2. Tanh:

$$f(z) = \tanh(z) = \frac{e^{z} - e^{-z}}{e^{z} + e^{-z}}, \qquad -1 < f(z) < 1$$

3. ReLU: the rectified linear unit. Its main purpose is to counter vanishing gradients: by the time the gradient has been back-propagated to the first layer, it easily shrinks toward 0 or a very small value. (A small numerical sketch of these three functions follows the formula below.)

$$f(z) = \max(0, z)$$
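To make the three definitions above concrete, here is a small numerical sketch (my own addition, not from the book) of the sigmoid, tanh, and ReLU functions using numpy:

# numerical sketch of the three activation functions defined above
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)          # (e^z - e^-z) / (e^z + e^-z)

def relu(z):
    return np.maximum(0, z)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(z))   # values in (0, 1)
print(tanh(z))      # values in (-1, 1)
print(relu(z))      # negative inputs clipped to 0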

Convolutional Neural Networks (CNN)

Convolution: the fusion of two functions along the time dimension.

$$(f * g)(t) = \int_{-\infty}^{\infty} f(\tau)\, g(t - \tau)\, d\tau$$
The convolution can be extended to the discrete domain, where its mathematical expression is
$$(f * g)[n] = \sum_{m=-\infty}^{\infty} f(m)\, g(n - m)$$
The most important ingredient of the convolution operation is the kernel: the kernel is multiplied point-wise with each local patch and the products are summed, giving one element of the next layer. (A small numerical check of the discrete formula follows.)
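As a sanity check (my own addition), the double loop below evaluates the discrete formula directly on a tiny signal and kernel and compares it with numpy's built-in np.convolve:

# direct evaluation of (f*g)[n] = sum_m f(m) g(n-m), compared with np.convolve
import numpy as np

f = np.array([1.0, 2.0, 3.0])      # example signal
g = np.array([0.5, 0.5])           # example kernel

out = np.zeros(len(f) + len(g) - 1)
for n in range(len(out)):
    for m in range(len(f)):
        if 0 <= n - m < len(g):
            out[n] += f[m] * g[n - m]

print(out)                  # [0.5 1.5 2.5 1.5]
print(np.convolve(f, g))    # same result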

2. Main Idea

Based on the training dataset, the connection weights between neurons and the threshold of each functional neuron are adjusted.

In other words, everything a neural network learns is contained in its connection weights and thresholds.

The parameters are determined by iterative updates that adjust the weights of the perceptron (neural network); a small worked example follows the update rule below.

$$\omega_i \leftarrow \omega_i + \Delta\omega_i$$

$$\Delta\omega_i = \eta\,(y - \hat{y})\,x_i$$
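A small worked example (my own addition) of this update rule: a perceptron with a bias term trained on the logical AND function. The step activation, learning rate 0.1, and 20 passes are arbitrary illustrative choices.

# perceptron learning rule w_i <- w_i + eta * (y - y_hat) * x_i on the AND function
import numpy as np

def step(z):
    return 1 if z >= 0 else 0

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])            # AND labels
w = np.zeros(2)
b = 0.0                               # threshold written as a bias term
eta = 0.1                             # learning rate

for _ in range(20):                   # a few passes over the data
    for xi, yi in zip(X, y):
        y_hat = step(np.dot(w, xi) + b)
        w += eta * (yi - y_hat) * xi  # the update rule above
        b += eta * (yi - y_hat)

print(w, b)
print([step(np.dot(w, xi) + b) for xi in X])   # -> [0, 0, 0, 1]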

First, the input example is fed to the input-layer neurons, and the signal is propagated forward layer by layer until the output layer produces its result.

The error at the output layer is then computed and propagated backwards to the hidden-layer neurons.

Finally, the connection weights and thresholds are adjusted according to the hidden-layer neurons' errors, and the whole procedure is iterated.

3. Algorithm Derivation

The BP algorithm:

Training set: $D = \{(x_1, y_1), (x_2, y_2), \dots, (x_m, y_m)\}$

Input: $d$ attributes

Output: an $l$-dimensional real-valued vector, with output-neuron thresholds $\theta_j$

Hidden layer: $q$ hidden neurons, with thresholds $\gamma_h$

$$b_h = f_1(\alpha_h - \gamma_h), \qquad \alpha_h = \sum_{i=1}^{d} v_{ih}\, x_i$$

$$y_j = f_2(\beta_j - \theta_j), \qquad \beta_j = \sum_{h=1}^{q} w_{hj}\, b_h$$

Update rule for an arbitrary parameter $v$:

$$v \leftarrow v + \Delta v$$

The BP algorithm adjusts the parameters based on a gradient-descent strategy.

Supplementary background: gradient descent

Gradient descent is a commonly used first-order optimization method, and one of the simplest and most classical approaches to unconstrained optimization problems.

Suppose $f(x)$ is a continuously differentiable function satisfying

$$f(x^{t+1}) < f(x^{t}), \qquad t = 0, 1, 2, 3, \dots$$

Then repeatedly executing this process converges to a local minimum. Expanding by Taylor's formula,

$$f(x + \Delta x) \approx f(x) + \Delta x^{T} \nabla f(x)$$
To ensure $f(x + \Delta x) < f(x)$, we can take
$$\Delta x = -\gamma \nabla f(x),$$
where $\gamma$ is the step size, a small positive constant (a tiny numerical example follows).
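A tiny numerical example (my own addition) of the procedure just described, minimizing the one-dimensional function $f(x) = (x-3)^2$:

# gradient descent on f(x) = (x - 3)^2, gradient 2(x - 3)
def f(x):
    return (x - 3.0) ** 2

def grad_f(x):
    return 2.0 * (x - 3.0)

x = 0.0        # starting point
gamma = 0.1    # step size
for t in range(100):
    x = x - gamma * grad_f(x)   # x_{t+1} = x_t - gamma * grad f(x_t)

print(x, f(x))  # x converges toward 3, f(x) toward 0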
Objective function (the mean squared error on training example $k$):
$$E_k = \frac{1}{2}\sum_{j=1}^{l}\left(\hat{y}_j^{k} - y_j^{k}\right)^2$$
Minimizing this objective, we derive the update formula for $\Delta v_{ih}$ by differentiating it. The hidden-layer gradient term is
$$e_h = -\frac{\partial E_k}{\partial b_h}\cdot\frac{\partial b_h}{\partial \alpha_h}
      = -\sum_{j=1}^{l}\frac{\partial E_k}{\partial \beta_j}\cdot\frac{\partial \beta_j}{\partial b_h}\, f'(\alpha_h - \gamma_h)
      = \sum_{j=1}^{l} w_{hj}\, g_j\, f'(\alpha_h - \gamma_h)
      = b_h(1 - b_h)\sum_{j=1}^{l} w_{hj}\, g_j ,$$
where $g_j$ is the gradient term of output neuron $j$ (computed from the output-layer error), so that $\Delta v_{ih} = \eta\, e_h\, x_i$.
Here the hidden layer and the output layer use the same activation function.
Global minimum & local minimum

In essence, the whole algorithm is a parameter-search process: it looks for an optimal set of parameters.

4. Implementation in Code

4.1 BP algorithm: train a single-hidden-layer neural network on watermelon dataset 3.0

Pseudocode:

Input: training set, learning rate
Process:
1. Randomly initialize all connection weights and thresholds in the network within (0, 1)
2. repeat
3.   for each training example (x_k, y_k) do
4.       compute the output of the current example from the current parameters
5.       compute the gradient term of the output-layer neurons
6.       compute the gradient term of the hidden-layer neurons
7.       update the connection weights and thresholds
8.   end for
9. until the stopping condition is reached
Output: a multi-layer feedforward neural network whose connection weights and thresholds have been determined

Note the distinction between the standard BP algorithm and the accumulated BP algorithm (accumulated error backpropagation):
Accumulated BP: updates the parameters only after reading through the entire training set once.
Standard BP: updates the parameters after each individual training example. (A structural sketch contrasting the two schemes follows.)
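The structural difference between the two schemes can be sketched as follows (my own toy illustration on a one-parameter linear model, not the author's code):

# toy contrast of standard vs. accumulated BP on y = w * x with squared error
import numpy as np

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]   # toy examples with true w = 2

def grad(w, x, y):
    # gradient of the per-example error 0.5 * (w*x - y)^2 with respect to w
    return (w * x - y) * x

def standard_bp(w, data, eta, epochs):
    # standard BP: one parameter update per training example
    for _ in range(epochs):
        for x, y in data:
            w -= eta * grad(w, x, y)
    return w

def accumulated_bp(w, data, eta, epochs):
    # accumulated BP: read the whole training set once, then update with the
    # gradient of the accumulated (average) error
    for _ in range(epochs):
        g = np.mean([grad(w, x, y) for x, y in data])
        w -= eta * g
    return w

print(standard_bp(0.0, data, eta=0.05, epochs=50))     # both approach 2.0
print(accumulated_bp(0.0, data, eta=0.05, epochs=50))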
# input() function
# read the watermelon dataset 3.0 (note: this name shadows the built-in input())
def input():
    """
    @param : none, or a file path
    @return : dataSet, a DataFrame built with pandas
    """
    try:
        import pandas as pd
    except ImportError:
        print("module import error")
    with open('/home/dengshuo/GithubCode/ML/CH05/watermelon3.csv') as data_file:
        df = pd.read_csv(data_file)
    return df
# learningRatio() function
# initialize the learning rate with a random double from random.uniform()
def learningRatio():
    """
    @return : learningRatio
    """
    try:
        import random
    except ImportError:
        print('module import error')
    learningRatio = random.uniform(0, 1)
    return learningRatio
ratio = learningRatio()
print(ratio)
input()   # read and display the watermelon dataset
0.8475765311660175
编号 色泽 根蒂 敲声 纹理 脐部 触感 密度 含糖率 好瓜
0 1 青绿 蜷缩 浊响 清晰 凹陷 硬滑 0.697 0.460
1 2 乌黑 蜷缩 沉闷 清晰 凹陷 硬滑 0.774 0.376
2 3 乌黑 蜷缩 浊响 清晰 凹陷 硬滑 0.634 0.264
3 4 青绿 蜷缩 沉闷 清晰 凹陷 硬滑 0.608 0.318
4 5 浅白 蜷缩 浊响 清晰 凹陷 硬滑 0.556 0.215
5 6 青绿 稍蜷 浊响 清晰 稍凹 软粘 0.403 0.237
6 7 乌黑 稍蜷 浊响 稍糊 稍凹 软粘 0.481 0.149
7 8 乌黑 稍蜷 浊响 清晰 稍凹 硬滑 0.437 0.211
8 9 乌黑 稍蜷 沉闷 稍糊 稍凹 硬滑 0.666 0.091
9 10 青绿 硬挺 清脆 清晰 平坦 软粘 0.243 0.267
10 11 浅白 硬挺 清脆 模糊 平坦 硬滑 0.245 0.057
11 12 浅白 蜷缩 浊响 模糊 平坦 软粘 0.343 0.099
12 13 青绿 稍蜷 浊响 稍糊 凹陷 硬滑 0.639 0.161
13 14 浅白 稍蜷 沉闷 稍糊 凹陷 硬滑 0.657 0.198
14 15 乌黑 稍蜷 浊响 清晰 稍凹 软粘 0.360 0.370
15 16 浅白 蜷缩 浊响 模糊 平坦 硬滑 0.593 0.042
16 17 青绿 蜷缩 沉闷 稍糊 稍凹 硬滑 0.719 0.103
17 18 青绿 蜷缩 浊响 清晰 凹陷 硬滑 0.697 0.460 NaN
# outputlayer() function
# compute the output value Yk of the output layer
def outputlayer(df):
    """
    @param df: the pandas DataFrame
    @return Yk: the output
    """
    pass   # left unimplemented; the class-based version below is used instead
# too many loose parameters to keep track of -- switch to a class instead
# define class()
# define the neural network structure: the skeleton of the whole algorithm
''' the definition of BP network class '''
class BP_network:
    def __init__(self):
        '''initial variables'''
        # node number of each layer
        self.i_n = 0
        self.h_n = 0
        self.o_n = 0
        # output value of each layer
        self.i_v = []
        self.h_v = []
        self.o_v = []
        # parameters (w, t)
        self.ih_w = []    # weight of each link
        self.ho_w = []
        self.h_t = []     # threshold of each neuron
        self.o_t = []
        # alternative activation functions and their derivatives
        self.fun = {
            'Sigmoid': Sigmoid,                    # logistic function
            'SigmoidDerivate': SigmoidDerivate,
            'Tanh': Tanh,                          # hyperbolic tangent
            'TanhDerivate': TanhDerivate,
        }
# CreateNN() function
# fill in the network skeleton
# written here as a standalone function; it is attached to BP_network in the usage sketch at the end of 4.1
def CreateNN(self, ni, nh, no, actfun):
    '''
    build a BP network structure and initialize its parameters
    @param ni, nh, no: the neuron number of each layer
    @param actfun: string, the name of the activation function
    '''
    # import module packages
    import numpy as np
    # assign the node number of each layer
    self.i_n = ni
    self.h_n = nh
    self.o_n = no
    # initial output value of each layer
    self.i_v = np.zeros(self.i_n)
    self.h_v = np.zeros(self.h_n)
    self.o_v = np.zeros(self.o_n)
    # initial weight of each link (random initialization)
    self.ih_w = np.zeros([self.i_n, self.h_n])
    self.ho_w = np.zeros([self.h_n, self.o_n])
    # loop over the links and assign random weights in (0, 1)
    for i in range(self.i_n):
        for h in range(self.h_n):
            self.ih_w[i][h] = rand(0, 1)   # call the rand() helper defined below
    for h in range(self.h_n):
        for j in range(self.o_n):
            self.ho_w[h][j] = rand(0, 1)
    # initial threshold of each neuron
    self.h_t = np.zeros(self.h_n)
    self.o_t = np.zeros(self.o_n)
    for h in range(self.h_n):
        self.h_t[h] = rand(0, 1)
    for j in range(self.o_n):
        self.o_t[j] = rand(0, 1)
    # initial activation function and its derivative, looked up from self.fun
    # (these are plain module-level functions, so no extra import is needed)
    self.af = self.fun[actfun]
    self.afd = self.fun[actfun + 'Derivate']
# definition of the random-value helper
''' the definition of random function '''
def rand(a, b):
    '''
    random value generation for parameter initialization
    @param a, b: the lower and upper limits of the random value
    '''
    from random import random
    return (b - a) * random() + a
# define the needed functions
# some activation functions
''' the definition of activation functions '''
def Sigmoid(x):
    '''the sigmoid (logistic) function'''
    from math import exp
    return 1.0 / (1.0 + exp(-x))

def SigmoidDerivate(y):
    '''derivative of sigmoid, written in terms of its output y'''
    return y * (1 - y)

def Tanh(x):
    '''the hyperbolic tangent function'''
    from math import tanh
    return tanh(x)

def TanhDerivate(y):
    '''derivative of tanh, written in terms of its output y'''
    return 1 - y * y
# predict process through the network: one forward pass
# written here as a standalone function; it is attached to BP_network in the usage sketch at the end of 4.1
def Pred(self, x):
    '''
    @param x: the input array for the input layer
    '''
    # activate input layer
    for i in range(self.i_n):
        self.i_v[i] = x[i]
    # activate hidden layer
    for h in range(self.h_n):
        total = 0.0
        for i in range(self.i_n):
            total += self.i_v[i] * self.ih_w[i][h]
        self.h_v[h] = self.af(total - self.h_t[h])
    # activate output layer
    for j in range(self.o_n):
        total = 0.0
        for h in range(self.h_n):
            total += self.h_v[h] * self.ho_w[h][j]
        self.o_v[j] = self.af(total - self.o_t[j])
**Another question: in what form should the watermelon data that has already been read in be fed to the network?
How should the discrete attributes of the watermelon dataset be handled, e.g. 色泽 {青绿, 乌黑, 浅白} = {0, 1, 2}?
If not like that, how can computation on discrete attributes be implemented? (One common encoding is sketched below.)**
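One common answer (my own sketch, not necessarily the author's choice; the column names follow the DataFrame shown earlier, and the tiny DataFrame here merely stands in for the one returned by input()) is to map each category to an integer code, or to expand each discrete attribute into one-hot columns with pandas:

# encoding discrete watermelon attributes as numbers
import pandas as pd

df = pd.DataFrame({
    '色泽': ['青绿', '乌黑', '浅白'],
    '根蒂': ['蜷缩', '蜷缩', '硬挺'],
    '密度': [0.697, 0.774, 0.245],
    '含糖率': [0.460, 0.376, 0.057],
})   # small stand-in slice of the watermelon DataFrame

# integer coding, e.g. 色泽 {青绿, 乌黑, 浅白} -> {0, 1, 2}
color_map = {'青绿': 0, '乌黑': 1, '浅白': 2}
df['色泽_code'] = df['色泽'].map(color_map)

# one-hot encoding, which avoids imposing an artificial order on the categories
onehot = pd.get_dummies(df[['色泽', '根蒂']])
X = pd.concat([onehot, df[['密度', '含糖率']]], axis=1).values
print(X)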
# the implementation of the BP algorithm on one sample
# BackPropagate() function: the backward pass
# written here as a standalone function; it is attached to BP_network in the usage sketch at the end of 4.1
def BackPropagate(self, x, y, lr):
    '''
    @param x, y: array, input and output of the data sample
    @param lr: float, the learning rate of the gradient-descent iteration
    '''
    # import needed module packages
    import numpy as np
    # get the current network output
    self.Pred(x)
    # calculate the gradient of the output layer based on the output
    o_grid = np.zeros(self.o_n)
    for j in range(self.o_n):
        # output-layer gradient term, see formula (5.10) in section 5.3 of the watermelon book
        o_grid[j] = (y[j] - self.o_v[j]) * self.afd(self.o_v[j])
        # self.afd() plays the role of y_k(1 - y_k)
    # calculate the gradient term e_h of the hidden layer
    h_grid = np.zeros(self.h_n)
    for h in range(self.h_n):
        for j in range(self.o_n):
            h_grid[h] += self.ho_w[h][j] * o_grid[j]
        h_grid[h] = h_grid[h] * self.afd(self.h_v[h])   # self.afd() plays the role of b_h(1 - b_h)
    # update the parameters
    for h in range(self.h_n):
        for j in range(self.o_n):
            # update formula
            self.ho_w[h][j] += lr * o_grid[j] * self.h_v[h]
    for i in range(self.i_n):
        for h in range(self.h_n):
            self.ih_w[i][h] += lr * h_grid[h] * self.i_v[i]
    for j in range(self.o_n):
        self.o_t[j] -= lr * o_grid[j]
    for h in range(self.h_n):
        self.h_t[h] -= lr * h_grid[h]
# define the TrainStandard() function
# standard BP training over the whole set; also returns the accumulated error
# written here as a standalone function; it is attached to BP_network in the usage sketch at the end of 4.1
def TrainStandard(self, data_in, data_out, lr=0.05):
    '''
    @param lr: learning rate, default 0.05
    @param data_in: the network's input data
    @param data_out: the target data of the output layer
    @return: e, accumulated error
    @return: e_k, error array of each step
    '''
    e_k = []
    for k in range(len(data_in)):
        x = data_in[k]
        y = data_out[k]
        self.BackPropagate(x, y, lr)
        # error on the training set for each step (mean squared error)
        y_delta2 = 0.0
        for j in range(self.o_n):
            y_delta2 += (self.o_v[j] - y[j]) * (self.o_v[j] - y[j])
        e_k.append(y_delta2 / 2)
    # total error of training:
    # first compute the accumulated error, then minimize it
    e = sum(e_k) / len(e_k)
    return e, e_k
# return the predicted label: good melon = 1, bad melon = 0
# written here as a standalone function; it is attached to BP_network in the usage sketch below
def PredLabel(self, X):
    '''
    predict process through the network
    @param X: the input sample set for the input layer
    @return: y, array, output set (0/1 class) based on "winner-takes-all" (competitive learning)
    '''
    import numpy as np
    y = []
    for m in range(len(X)):
        self.Pred(X[m])
        if self.o_v[0] > 0.5:
            y.append(1)
        else:
            y.append(0)
        # the multi-output (winner-takes-all) version would be:
        # max_y = self.o_v[0]
        # label = 0
        # for j in range(1, self.o_n):
        #     if max_y < self.o_v[j]: label = j
        # y.append(label)
    return np.array(y)
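Below is a minimal end-to-end sketch (my own addition) of how the pieces of 4.1 could be tied together. It assumes the standalone functions above are attached to BP_network, and it uses a tiny stand-in array X (density and sugar-content values taken from the table) with illustrative 0/1 labels y in place of the fully encoded watermelon data; the hidden-layer size, learning rate, and epoch count are arbitrary choices.

# attach the standalone method definitions to the class
BP_network.CreateNN = CreateNN
BP_network.Pred = Pred
BP_network.BackPropagate = BackPropagate
BP_network.TrainStandard = TrainStandard
BP_network.PredLabel = PredLabel

import numpy as np

# stand-in numeric data: X would normally come from encoding the watermelon attributes
# (see the encoding sketch above); y are illustrative 0/1 labels (1 = good melon)
X = np.array([[0.697, 0.460], [0.774, 0.376], [0.343, 0.099], [0.245, 0.057]])
y = np.array([1, 1, 0, 0])

nn = BP_network()
nn.CreateNN(ni=X.shape[1], nh=5, no=1, actfun='Sigmoid')   # 5 hidden neurons, arbitrary

Y = y.reshape(-1, 1)            # one output neuron per example
for epoch in range(500):
    e, e_k = nn.TrainStandard(X, Y, lr=0.1)

print('accumulated error:', e)
print('predicted labels :', nn.PredLabel(X))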
4.2 Implementing the BP algorithm with TensorFlow

First, learn how to implement the BP algorithm.

Modeling automobile fuel efficiency, a non-linear regression problem: we build a feedforward neural network with multi-variable input and a single-variable output.

1. Dataset description and loading

This is a famous, standard dataset. It is a very simple example; the main goal is to understand the main steps and methods.

Because this dataset comes pre-packaged in a standard form, no detailed data analysis is needed.

In general, a dataset would first be visualized and analyzed in detail.

2. Data preprocessing

Preprocessing is usually done by directly calling functions from the sklearn package.

The Pre-Processing module in sklearn

sklearn.preprocessing.StandardScaler
# Standardize features by removing the mean and scaling to unit variance
scaler = preprocessing.StandardScaler()
X_train = scaler.fit_transform(X_train)
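One detail worth keeping in mind (my own addition): the scaler is fitted on the training data only and then reused, without re-fitting, to transform the test data, which is what the scaler.transform(X_test) call in the accuracy test below relies on. A self-contained illustration with stand-in data:

# fit the scaler on the training set, reuse the same statistics for the test set
import numpy as np
from sklearn import preprocessing

X_train = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]])   # stand-in data
X_test = np.array([[1.5, 15.0]])

scaler = preprocessing.StandardScaler()
X_train = scaler.fit_transform(X_train)       # fit on the training set
X_test_scaled = scaler.transform(X_test)      # no re-fitting on the test set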
At this stage, I think this is the hardest and least straightforward part of algorithm analysis: processing the data so that it satisfies the algorithm's requirements. Usually the data is processed to fit the input conditions, i.e. the data moves toward the algorithm. Is there an approach where the algorithm moves toward the data instead? Isn't that really the initial question of algorithm selection?
3. Model architecture

A feedforward neural network with multiple inputs, two hidden layers, and a single output.

Seven input nodes, 10 neurons in the first hidden layer, 5 in the second hidden layer, and one output node.

This is fairly simple and could also be built by calling the skflow library bundled with TensorFlow directly; learning skflow is left as a separate topic.

4. Accuracy testing

Mean squared error is used to monitor accuracy.

Again, sklearn.metrics provides the model's performance measurement.

Doesn't this example need explicit parameter updates? The work is really the optimization of the loss function, which the optimizer handles internally and which is not shown explicitly in this example.

score = metrics.mean_squared_error(regressor.predict(scaler.transform(X_test)), y_test)
print("Total mean squared error: {}".format(score))
The code above is now assembled and the steps combined.
Complete source code:
from sklearn import datasets,cross_validation,metrics
from sklearn import preprocessing
from tensorflow.contrib import learn
import pandas as pd 
import matplotlib.pyplot as plt 
%matplotlib inline
%config InlineBackend.figure_format='svg'
from keras.models import Sequential
from keras.layers import Dense
Read the original dataset with the pandas package
df=pd.read_csv('mpg.csv',header=0)
df
mpg cylinders displacement horsepower weight acceleration model_year origin name
0 18.0 8 307.0 130 3504 12.0 70 1 chevrolet chevelle malibu
1 15.0 8 350.0 165 3693 11.5 70 1 buick skylark 320
2 18.0 8 318.0 150 3436 11.0 70 1 plymouth satellite
3 16.0 8 304.0 150 3433 12.0 70 1 amc rebel sst
4 17.0 8 302.0 140 3449 10.5 70 1 ford torino
5 15.0 8 429.0 198 4341 10.0 70 1 ford galaxie 500
6 14.0 8 454.0 220 4354 9.0 70 1 chevrolet impala
7 14.0 8 440.0 215 4312 8.5 70 1 plymouth fury iii
8 14.0 8 455.0 225 4425 10.0 70 1 pontiac catalina
9 15.0 8 390.0 190 3850 8.5 70 1 amc ambassador dpl
10 15.0 8 383.0 170 3563 10.0 70 1 dodge challenger se
11 14.0 8 340.0 160 3609 8.0 70 1 plymouth ‘cuda 340
12 15.0 8 400.0 150 3761 9.5 70 1 chevrolet monte carlo
13 14.0 8 455.0 225 3086 10.0 70 1 buick estate wagon (sw)
14 24.0 4 113.0 95 2372 15.0 70 3 toyota corona mark ii
15 22.0 6 198.0 95 2833 15.5 70 1 plymouth duster
16 18.0 6 199.0 97 2774 15.5 70 1 amc hornet
17 21.0 6 200.0 85 2587 16.0 70 1 ford maverick
18 27.0 4 97.0 88 2130 14.5 70 3 datsun pl510
19 26.0 4 97.0 46 1835 20.5 70 2 volkswagen 1131 deluxe sedan
20 25.0 4 110.0 87 2672 17.5 70 2 peugeot 504
21 24.0 4 107.0 90 2430 14.5 70 2 audi 100 ls
22 25.0 4 104.0 95 2375 17.5 70 2 saab 99e
23 26.0 4 121.0 113 2234 12.5 70 2 bmw 2002
24 21.0 6 199.0 90 2648 15.0 70 1 amc gremlin
25 10.0 8 360.0 215 4615 14.0 70 1 ford f250
26 10.0 8 307.0 200 4376 15.0 70 1 chevy c20
27 11.0 8 318.0 210 4382 13.5 70 1 dodge d200
28 9.0 8 304.0 193 4732 18.5 70 1 hi 1200d
29 27.0 4 97.0 88 2130 14.5 71 3 datsun pl510
368 27.0 4 112.0 88 2640 18.6 82 1 chevrolet cavalier wagon
369 34.0 4 112.0 88 2395 18.0 82 1 chevrolet cavalier 2-door
370 31.0 4 112.0 85 2575 16.2 82 1 pontiac j2000 se hatchback
371 29.0 4 135.0 84 2525 16.0 82 1 dodge aries se
372 27.0 4 151.0 90 2735 18.0 82 1 pontiac phoenix
373 24.0 4 140.0 92 2865 16.4 82 1 ford fairmont futura
374 23.0 4 151.0 0 3035 20.5 82 1 amc concord dl
375 36.0 4 105.0 74 1980 15.3 82 2 volkswagen rabbit l
376 37.0 4 91.0 68 2025 18.2 82 3 mazda glc custom l
377 31.0 4 91.0 68 1970 17.6 82 3 mazda glc custom
378 38.0 4 105.0 63 2125 14.7 82 1 plymouth horizon miser
379 36.0 4 98.0 70 2125 17.3 82 1 mercury lynx l
380 36.0 4 120.0 88 2160 14.5 82 3 nissan stanza xe
381 36.0 4 107.0 75 2205 14.5 82 3 honda accord
382 34.0 4 108.0 70 2245 16.9 82 3 toyota corolla
383 38.0 4 91.0 67 1965 15.0 82 3 honda civic
384 32.0 4 91.0 67 1965 15.7 82 3 honda civic (auto)
385 38.0 4 91.0 67 1995 16.2 82 3 datsun 310 gx
386 25.0 6 181.0 110 2945 16.4 82 1 buick century limited
387 38.0 6 262.0 85 3015 17.0 82 1 oldsmobile cutlass ciera (diesel)
388 26.0 4 156.0 92 2585 14.5 82 1 chrysler lebaron medallion
389 22.0 6 232.0 112 2835 14.7 82 1 ford granada l
390 32.0 4 144.0 96 2665 13.9 82 3 toyota celica gt
391 36.0 4 135.0 84 2370 13.0 82 1 dodge charger 2.2
392 27.0 4 151.0 90 2950 17.3 82 1 chevrolet camaro
393 27.0 4 140.0 86 2790 15.6 82 1 ford mustang gl
394 44.0 4 97.0 52 2130 24.6 82 2 vw pickup
395 32.0 4 135.0 84 2295 11.6 82 1 dodge rampage
396 28.0 4 120.0 79 2625 18.6 82 1 ford ranger
397 31.0 4 119.0 82 2720 19.4 82 1 chevy s-10

398 rows × 9 columns

# convert the displacement column as float
df['displacement']=df['displacement'].astype(float)
# we got the data columns from the dataset
# first and last (mpg and car names )are ignored for X
X=df[df.columns[1:8]]
y=df['mpg']
plt.figure()
f,ax1=plt.subplots()
for i in range(1, 8):
    number = 420 + i
    ax1.locator_params(nbins=3)
    ax1 = plt.subplot(number)              # 4 rows x 2 columns
    plt.title(list(df)[i])
    ax1.scatter(df[df.columns[i]], y)      # scatter plot of the data points
plt.tight_layout(pad=0.4,w_pad=0.5,h_pad=1.0)
plt.show()
(Figure: scatter plots of each input attribute against mpg, arranged in a 4 x 2 grid)
# split the datasets
X_train,X_test,y_train,y_test=cross_validation.train_test_split(X,y,test_size=0.25)
# Scale the data for convergency optimization
scaler=preprocessing.StandardScaler()
# set the transform parameters
X_train=scaler.fit_transform(X_train)
# bulid a 2 layer fully connected DNN with 10 and 5 units respectively
model=Sequential()
model.add(Dense(10,input_dim=7,init='normal',activation='relu'))
model.add(Dense(5,init='normal',activation='relu'))
model.add(Dense(1,init='normal'))
# compile the model ,with the mean squared error as lost function
model.compile(loss='mean_squared_error',optimizer='adam')
# fit the model in 1000 epochs
model.fit(X_train,y_train,nb_epoch=1000,validation_split=0.33,shuffle=True,verbose=2)
/home/dengshuo/anaconda3/lib/python3.6/site-packages/ipykernel_launcher.py:9: UserWarning: Update your `Dense` call to the Keras 2 API: `Dense(10, input_dim=7, activation="relu", kernel_initializer="normal")`
/home/dengshuo/anaconda3/lib/python3.6/site-packages/ipykernel_launcher.py:10: UserWarning: Update your `Dense` call to the Keras 2 API: `Dense(5, activation="relu", kernel_initializer="normal")`
/home/dengshuo/anaconda3/lib/python3.6/site-packages/ipykernel_launcher.py:11: UserWarning: Update your `Dense` call to the Keras 2 API: `Dense(1, kernel_initializer="normal")`
/home/dengshuo/anaconda3/lib/python3.6/site-packages/keras/models.py:942: UserWarning: The `nb_epoch` argument in `fit` has been renamed `epochs`.
Train on 199 samples, validate on 99 samples
Epoch 1/1000 - 2s - loss: 617.0525 - val_loss: 609.8485
Epoch 2/1000 - 0s - loss: 616.6131 - val_loss: 609.3912
Epoch 3/1000 - 0s - loss: 616.1424 - val_loss: 608.8852
Epoch 4/1000 - 0s - loss: 615.6107 - val_loss: 608.3354
Epoch 5/1000 - 0s - loss: 615.0266 - val_loss: 607.7320
...
(epoch-by-epoch log abridged: the training loss falls steadily from about 617 to about 6.9 and the validation loss from about 610 to about 8.7; the output shown in the original notebook breaks off at epoch 824)
...
Epoch 822/1000 - 0s - loss: 6.9100 - val_loss: 8.6734
Epoch 823/1000 - 0s - loss: 6.8999 - val_loss: 8.6742
Epoch 824/1000 - 0s - loss: 6.9077 - val_loss: 8.6939
Epoch 825/1000- 0s - loss: 6.9056 - val_loss: 8.6672
Epoch 826/1000- 0s - loss: 6.8989 - val_loss: 8.6262
Epoch 827/1000- 0s - loss: 6.8945 - val_loss: 8.6037
Epoch 828/1000- 0s - loss: 6.8901 - val_loss: 8.6088
Epoch 829/1000- 0s - loss: 6.9051 - val_loss: 8.5793
Epoch 830/1000- 0s - loss: 6.8804 - val_loss: 8.5822
Epoch 831/1000- 0s - loss: 6.8904 - val_loss: 8.5982
Epoch 832/1000- 0s - loss: 6.8975 - val_loss: 8.6576
Epoch 833/1000- 0s - loss: 6.9127 - val_loss: 8.6246
Epoch 834/1000- 0s - loss: 6.8894 - val_loss: 8.6199
Epoch 835/1000- 0s - loss: 6.8814 - val_loss: 8.5784
Epoch 836/1000- 0s - loss: 6.8837 - val_loss: 8.5410
Epoch 837/1000- 0s - loss: 6.9107 - val_loss: 8.4859
Epoch 838/1000- 0s - loss: 6.8848 - val_loss: 8.5173
Epoch 839/1000- 0s - loss: 6.8804 - val_loss: 8.4971
Epoch 840/1000- 0s - loss: 6.9009 - val_loss: 8.4524
Epoch 841/1000- 0s - loss: 6.8976 - val_loss: 8.5148
Epoch 842/1000- 0s - loss: 6.8670 - val_loss: 8.5860
Epoch 843/1000- 0s - loss: 6.9224 - val_loss: 8.6937
Epoch 844/1000- 0s - loss: 6.8973 - val_loss: 8.4884
Epoch 845/1000- 0s - loss: 6.8797 - val_loss: 8.4416
Epoch 846/1000- 0s - loss: 6.8801 - val_loss: 8.4058
Epoch 847/1000- 0s - loss: 6.8972 - val_loss: 8.2906
Epoch 848/1000- 0s - loss: 6.9202 - val_loss: 8.2737
Epoch 849/1000- 0s - loss: 6.9373 - val_loss: 8.3648
Epoch 850/1000- 0s - loss: 6.8793 - val_loss: 8.4798
Epoch 851/1000- 0s - loss: 6.8842 - val_loss: 8.5027
Epoch 852/1000- 0s - loss: 6.8715 - val_loss: 8.4512
Epoch 853/1000- 0s - loss: 6.8770 - val_loss: 8.4394
Epoch 854/1000- 0s - loss: 6.8744 - val_loss: 8.4472
Epoch 855/1000- 0s - loss: 6.8755 - val_loss: 8.4393
Epoch 856/1000- 0s - loss: 6.8792 - val_loss: 8.3980
Epoch 857/1000- 0s - loss: 6.8804 - val_loss: 8.4114
Epoch 858/1000- 0s - loss: 6.8524 - val_loss: 8.5629
Epoch 859/1000- 0s - loss: 6.8745 - val_loss: 8.6568
Epoch 860/1000- 0s - loss: 6.8859 - val_loss: 8.5767
Epoch 861/1000- 0s - loss: 6.8793 - val_loss: 8.3959
Epoch 862/1000- 0s - loss: 6.8933 - val_loss: 8.3341
Epoch 863/1000- 0s - loss: 6.9167 - val_loss: 8.3749
Epoch 864/1000- 0s - loss: 6.8659 - val_loss: 8.5399
Epoch 865/1000- 0s - loss: 6.8788 - val_loss: 8.5804
Epoch 866/1000- 0s - loss: 6.8750 - val_loss: 8.5788
Epoch 867/1000- 0s - loss: 6.8661 - val_loss: 8.5265
Epoch 868/1000- 0s - loss: 6.8834 - val_loss: 8.4946
Epoch 869/1000- 0s - loss: 6.8712 - val_loss: 8.5024
Epoch 870/1000- 0s - loss: 6.8632 - val_loss: 8.4812
Epoch 871/1000- 0s - loss: 6.8693 - val_loss: 8.4996
Epoch 872/1000- 0s - loss: 6.8648 - val_loss: 8.4457
Epoch 873/1000- 0s - loss: 6.8741 - val_loss: 8.3863
Epoch 874/1000- 0s - loss: 6.9042 - val_loss: 8.3653
Epoch 875/1000- 0s - loss: 6.8735 - val_loss: 8.4483
Epoch 876/1000- 0s - loss: 6.8764 - val_loss: 8.5491
Epoch 877/1000- 0s - loss: 6.8698 - val_loss: 8.5771
Epoch 878/1000- 0s - loss: 6.8601 - val_loss: 8.5636
Epoch 879/1000- 0s - loss: 6.8552 - val_loss: 8.5683
Epoch 880/1000- 0s - loss: 6.8534 - val_loss: 8.5752
Epoch 881/1000- 0s - loss: 6.8544 - val_loss: 8.5957
Epoch 882/1000- 0s - loss: 6.8548 - val_loss: 8.5939
Epoch 883/1000- 0s - loss: 6.8577 - val_loss: 8.5866
Epoch 884/1000- 0s - loss: 6.8773 - val_loss: 8.5952
Epoch 885/1000- 0s - loss: 6.8756 - val_loss: 8.5630
Epoch 886/1000- 0s - loss: 6.8668 - val_loss: 8.4512
Epoch 887/1000- 0s - loss: 6.8745 - val_loss: 8.4540
Epoch 888/1000- 0s - loss: 6.8641 - val_loss: 8.4412
Epoch 889/1000- 0s - loss: 6.8782 - val_loss: 8.5320
Epoch 890/1000- 0s - loss: 6.8415 - val_loss: 8.5606
Epoch 891/1000- 0s - loss: 6.8534 - val_loss: 8.5682
Epoch 892/1000- 0s - loss: 6.8858 - val_loss: 8.4739
Epoch 893/1000- 0s - loss: 6.8534 - val_loss: 8.4575
Epoch 894/1000- 0s - loss: 6.8581 - val_loss: 8.4104
Epoch 895/1000- 0s - loss: 6.8834 - val_loss: 8.4251
Epoch 896/1000- 0s - loss: 6.8710 - val_loss: 8.4780
Epoch 897/1000- 0s - loss: 6.8870 - val_loss: 8.4867
Epoch 898/1000- 0s - loss: 6.8274 - val_loss: 8.6073
Epoch 899/1000- 0s - loss: 6.8772 - val_loss: 8.7308
Epoch 900/1000- 0s - loss: 6.8722 - val_loss: 8.6132
Epoch 901/1000- 0s - loss: 6.8604 - val_loss: 8.6610
Epoch 902/1000- 0s - loss: 6.8541 - val_loss: 8.6173
Epoch 903/1000- 0s - loss: 6.8730 - val_loss: 8.5093
Epoch 904/1000- 0s - loss: 6.8426 - val_loss: 8.5159
Epoch 905/1000- 0s - loss: 6.8429 - val_loss: 8.5200
Epoch 906/1000- 0s - loss: 6.8439 - val_loss: 8.5554
Epoch 907/1000- 0s - loss: 6.8537 - val_loss: 8.7608
Epoch 908/1000- 0s - loss: 6.8801 - val_loss: 8.8564
Epoch 909/1000- 0s - loss: 6.9187 - val_loss: 8.8065
Epoch 910/1000- 0s - loss: 6.8853 - val_loss: 8.7571
Epoch 911/1000- 0s - loss: 6.8544 - val_loss: 8.6461
Epoch 912/1000- 0s - loss: 6.8342 - val_loss: 8.5683
Epoch 913/1000- 0s - loss: 6.8823 - val_loss: 8.4780
Epoch 914/1000- 0s - loss: 6.8524 - val_loss: 8.5182
Epoch 915/1000- 0s - loss: 6.8370 - val_loss: 8.5544
Epoch 916/1000- 0s - loss: 6.8490 - val_loss: 8.5250
Epoch 917/1000- 0s - loss: 6.8746 - val_loss: 8.6496
Epoch 918/1000- 0s - loss: 6.8511 - val_loss: 8.6746
Epoch 919/1000- 0s - loss: 6.8391 - val_loss: 8.6202
Epoch 920/1000- 0s - loss: 6.8378 - val_loss: 8.5823
Epoch 921/1000- 0s - loss: 6.8284 - val_loss: 8.6033
Epoch 922/1000- 0s - loss: 6.8513 - val_loss: 8.4982
Epoch 923/1000- 0s - loss: 6.8424 - val_loss: 8.4665
Epoch 924/1000- 0s - loss: 6.8490 - val_loss: 8.5250
Epoch 925/1000- 0s - loss: 6.8479 - val_loss: 8.5245
Epoch 926/1000- 0s - loss: 6.8417 - val_loss: 8.4306
Epoch 927/1000- 0s - loss: 6.8274 - val_loss: 8.4696
Epoch 928/1000- 0s - loss: 6.8407 - val_loss: 8.4810
Epoch 929/1000- 0s - loss: 6.8413 - val_loss: 8.4988
Epoch 930/1000- 0s - loss: 6.8362 - val_loss: 8.5352
Epoch 931/1000- 0s - loss: 6.8365 - val_loss: 8.6174
Epoch 932/1000- 0s - loss: 6.8309 - val_loss: 8.6056
Epoch 933/1000- 0s - loss: 6.8295 - val_loss: 8.5836
Epoch 934/1000- 0s - loss: 6.8431 - val_loss: 8.6144
Epoch 935/1000- 0s - loss: 6.8263 - val_loss: 8.5581
Epoch 936/1000- 0s - loss: 6.8532 - val_loss: 8.5515
Epoch 937/1000- 0s - loss: 6.8281 - val_loss: 8.5130
Epoch 938/1000- 0s - loss: 6.8655 - val_loss: 8.4709
Epoch 939/1000- 0s - loss: 6.8737 - val_loss: 8.5025
Epoch 940/1000- 0s - loss: 6.8258 - val_loss: 8.4765
Epoch 941/1000- 0s - loss: 6.8172 - val_loss: 8.4921
Epoch 942/1000- 0s - loss: 6.8489 - val_loss: 8.6057
Epoch 943/1000- 0s - loss: 6.8361 - val_loss: 8.5947
Epoch 944/1000- 0s - loss: 6.8388 - val_loss: 8.5395
Epoch 945/1000- 0s - loss: 6.8118 - val_loss: 8.5427
Epoch 946/1000- 0s - loss: 6.8248 - val_loss: 8.5310
Epoch 947/1000- 0s - loss: 6.8355 - val_loss: 8.5500
Epoch 948/1000- 0s - loss: 6.8282 - val_loss: 8.5621
Epoch 949/1000- 0s - loss: 6.8307 - val_loss: 8.6018
Epoch 950/1000- 0s - loss: 6.8149 - val_loss: 8.6919
Epoch 951/1000- 0s - loss: 6.8535 - val_loss: 8.8221
Epoch 952/1000- 0s - loss: 6.7969 - val_loss: 8.6478
Epoch 953/1000- 0s - loss: 6.8059 - val_loss: 8.5851
Epoch 954/1000- 0s - loss: 6.8304 - val_loss: 8.5123
Epoch 955/1000- 0s - loss: 6.8407 - val_loss: 8.5116
Epoch 956/1000- 0s - loss: 6.8188 - val_loss: 8.5680
Epoch 957/1000- 0s - loss: 6.8065 - val_loss: 8.6502
Epoch 958/1000- 0s - loss: 6.8422 - val_loss: 8.6930
Epoch 959/1000- 0s - loss: 6.8171 - val_loss: 8.5316
Epoch 960/1000- 0s - loss: 6.8234 - val_loss: 8.3590
Epoch 961/1000- 0s - loss: 6.8595 - val_loss: 8.3428
Epoch 962/1000- 0s - loss: 6.8903 - val_loss: 8.3553
Epoch 963/1000- 0s - loss: 6.8414 - val_loss: 8.4878
Epoch 96
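The per-epoch lines above follow the format that Keras prints when `model.fit` is called with `verbose=2` and a validation set, so one summary line with `loss` and `val_loss` appears per epoch. The sketch below is only an illustration of a run that would produce a log in this shape: the network architecture, the placeholder arrays `X_train`, `y_train`, `X_val`, `y_val`, and the mean-squared-error loss are assumptions, not details taken from the original run.

```python
# A minimal sketch (not the original model): a small Keras regression network
# whose fit() call prints one "Epoch k/1000 - ... - loss - val_loss" line per epoch.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Placeholder data; in the original run these would come from the dataset being modeled.
X_train, y_train = np.random.rand(100, 8), np.random.rand(100)
X_val, y_val = np.random.rand(30, 8), np.random.rand(30)

model = Sequential()
model.add(Dense(16, activation='relu', input_dim=8))  # single hidden layer
model.add(Dense(1))                                   # linear output for regression
model.compile(optimizer='adam', loss='mse')           # assumed loss; the original loss is not shown

# verbose=2 prints exactly one summary line per epoch, as in the log above.
history = model.fit(X_train, y_train,
                    epochs=1000,
                    validation_data=(X_val, y_val),
                    verbose=2)
```

A pattern like the one in this log, where the training loss keeps creeping down but the validation loss only oscillates around 8.5, is the typical case for stopping training early; a `keras.callbacks.EarlyStopping(monitor='val_loss', patience=...)` callback passed to `fit` would halt the run once `val_loss` stops improving.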