
Titanic: Machine Learning from Disaster - Kaggle practice

by KIha_Jung 2019. 7. 23.

titanic : Machine Learning from Disaster - kaggle

Reference: https://www.kaggle.com/daehungwak/guide-kor-dg https://www.kaggle.com/

Data analysis workflow

  1. Check the dataset

    • Check how the data is structured
    • Check for null data and fix it later on
  2. Exploratory Data Analysis (EDA)

    • Analyze individual features and check correlations
    • Use various visualization tools to gain insight
  3. Feature Engineering

    • Engineer the features to improve model performance
    • One-hot encoding, splitting into classes, binning, text data processing
  4. Model development and training

    • Build the model using sklearn and keras
  5. Model prediction and evaluation

    • Train the model on the train set, then run prediction on the test set

Libraries and packages

- Visualization tools: matplotlib, seaborn, plotly
- Data analysis tools: pandas, numpy
- Model development tools: sklearn, keras


Titanic data feature descriptions

- survival - survival status, the target value (0 = died, 1 = survived)
- pclass - ticket class (1 = 1st, 2 = 2nd, 3 = 3rd)
- sex - sex
- age - age in years
- sibsp - number of siblings and spouses aboard
- parch - number of parents and children aboard
- ticket - ticket number
- fare - passenger fare
- cabin - cabin number
- embarked - port of embarkation

1. Checking the dataset

In [1]:
import os # lets Python use functionality provided by the operating system
import numpy as np # matrix and array operations
import pandas as pd # row/column data objects that make large datasets easy to handle
import matplotlib.pyplot as plt # charting and plotting
import seaborn as sns # data-distribution visualization
import keras # deep learning
import sklearn # machine learning

plt.style.use('seaborn')
sns.set(font_scale=2.5)

import missingno as msno # missing-data visualization

import warnings
warnings.filterwarnings('ignore') # suppress warning messages

%matplotlib inline
Using TensorFlow backend.
In [84]:
os.listdir("./titanic_dataset") 
Out[84]:
['gender_submission.csv', 'test.csv', 'train.csv']

Check the list of files in the dataset directory.

In [85]:
df_train = pd.read_csv("./titanic_dataset/train.csv")
df_test = pd.read_csv("./titanic_dataset/test.csv")
df_submit = pd.read_csv("./titanic_dataset/gender_submission.csv")
In [86]:
df_train.shape
Out[86]:
(891, 12)
In [87]:
df_train.columns
Out[87]:
Index(['PassengerId', 'Survived', 'Pclass', 'Name', 'Sex', 'Age', 'SibSp',
       'Parch', 'Ticket', 'Fare', 'Cabin', 'Embarked'],
      dtype='object')

The data consists of 12 columns: 11 features, and the feature to predict is 'Survived'.

In [88]:
df_train.head()
Out[88]:
PassengerId Survived Pclass Name Sex Age SibSp Parch Ticket Fare Cabin Embarked
0 1 0 3 Braund, Mr. Owen Harris male 22.0 1 0 A/5 21171 7.2500 NaN S
1 2 1 1 Cumings, Mrs. John Bradley (Florence Briggs Th... female 38.0 1 0 PC 17599 71.2833 C85 C
2 3 1 3 Heikkinen, Miss. Laina female 26.0 0 0 STON/O2. 3101282 7.9250 NaN S
3 4 1 1 Futrelle, Mrs. Jacques Heath (Lily May Peel) female 35.0 1 0 113803 53.1000 C123 S
4 5 0 3 Allen, Mr. William Henry male 35.0 0 0 373450 8.0500 NaN S
In [89]:
df_train.describe()
Out[89]:
PassengerId Survived Pclass Age SibSp Parch Fare
count 891.000000 891.000000 891.000000 714.000000 891.000000 891.000000 891.000000
mean 446.000000 0.383838 2.308642 29.699118 0.523008 0.381594 32.204208
std 257.353842 0.486592 0.836071 14.526497 1.102743 0.806057 49.693429
min 1.000000 0.000000 1.000000 0.420000 0.000000 0.000000 0.000000
25% 223.500000 0.000000 2.000000 20.125000 0.000000 0.000000 7.910400
50% 446.000000 0.000000 3.000000 28.000000 0.000000 0.000000 14.454200
75% 668.500000 1.000000 3.000000 38.000000 1.000000 0.000000 31.000000
max 891.000000 1.000000 3.000000 80.000000 8.000000 6.000000 512.329200

The describe() method returns summary statistics for each numeric feature.

In [90]:
df_train.isnull().sum() / df_train.shape[0]
Out[90]:
PassengerId    0.000000
Survived       0.000000
Pclass         0.000000
Name           0.000000
Sex            0.000000
Age            0.198653
SibSp          0.000000
Parch          0.000000
Ticket         0.000000
Fare           0.000000
Cabin          0.771044
Embarked       0.002245
dtype: float64
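
The missingno package imported earlier can show the same missing-data pattern visually. A minimal sketch (the figsize and color values are just illustrative choices, not from the original cells):

# White gaps in each column mark the missing values (Age, Cabin, Embarked stand out)
msno.matrix(df=df_train, figsize=(8, 8), color=(0.8, 0.5, 0.2))
plt.show()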
In [91]:
fig, ax = plt.subplots(1, 2, figsize=(18, 8))

df_train['Survived'].value_counts().plot.pie(explode=[0, 0.1], 
                                             autopct='%1.1f%%', ax=ax[0], shadow=True)
ax[0].set_title('Pie plot - Survived')
ax[0].set_ylabel('')
sns.countplot('Survived', data=df_train, ax=ax[1])
ax[1].set_title('Count plot - Survived')

plt.show()

explode : how far each wedge is offset from the rest of the pie.
autopct : the format string for the numbers drawn on the chart.

2. Exploratory Data Analysis (EDA)

1. Survival rate by Pclass

In [92]:
df_train[['Pclass', 'Survived']].groupby(['Pclass'], as_index=True).count()
Out[92]:
Survived
Pclass
1 216
2 184
3 491
In [93]:
# Sum of survivors per Pclass group
df_train[['Pclass', 'Survived']].groupby(['Pclass'], as_index=True).sum()
Out[93]:
Survived
Pclass
1 136
2 87
3 119
In [94]:
# The same work as above can be done conveniently with pandas.crosstab - data restructuring
pd.crosstab(df_train['Pclass'], df_train['Survived'], margins=True)
Out[94]:
Survived 0 1 All
Pclass
1 80 136 216
2 97 87 184
3 372 119 491
All 549 342 891
In [95]:
df_train[['Pclass', 'Survived']].groupby(['Pclass'], as_index=True).mean().plot.bar()
Out[95]:
<matplotlib.axes._subplots.AxesSubplot at 0x29e619c5128>

2. Survival rate by Sex

In [96]:
fig, ax = plt.subplots(1, 2, figsize=(18,8))
df_train[['Sex', 'Survived']].groupby(['Sex'], as_index=True).mean().plot.bar(ax=ax[0])
ax[0].set_title('Survived vs Sex')

sns.countplot('Sex', hue='Survived', data=df_train, ax=ax[1])
ax[1].set_title('Sex : Survived vs Dead')
plt.show()

3. Both Sex and Pclass

In [97]:
sns.factorplot('Pclass', 'Survived', hue='Sex', data=df_train, size=6, aspect=1.5)
Out[97]:
<seaborn.axisgrid.FacetGrid at 0x29e619bce48>

4. Age

In [98]:
fig, ax = plt.subplots(1, 1, figsize=(9,5))
sns.kdeplot(df_train[df_train['Survived']==1]['Age'], ax=ax)
sns.kdeplot(df_train[df_train['Survived']==0]['Age'], ax=ax)
plt.legend(['Survived == 1', 'Survived == 0'])
plt.show()
In [99]:
plt.figure(figsize=(8, 6))
df_train['Age'][df_train['Pclass'] == 1].plot(kind='kde')
df_train['Age'][df_train['Pclass'] == 2].plot(kind='kde')
df_train['Age'][df_train['Pclass'] == 3].plot(kind='kde')

plt.xlabel('Age')
plt.title('Age Distribution within classes')
plt.legend(['1st class', '2nd class', '3rd class'])
Out[99]:
<matplotlib.legend.Legend at 0x29e5fd61da0>
In [100]:
cummulate_survival_ratio = []
for i in range(1, 80):
    cummulate_survival_ratio.append(df_train[df_train['Age'] < i]['Survived'].sum() 
                                    / len(df_train[df_train['Age'] < i]['Survived']))

plt.figure(figsize=(7,7))
plt.plot(cummulate_survival_ratio)
plt.title('Survival rate change depending on range of Age')
plt.ylabel('Survival rate')
plt.xlabel('Range of Age(0~x)')
plt.show()

5. Embarked

In [101]:
f, ax = plt.subplots(1, 1, figsize=(7,7))
df_train[['Embarked', 'Survived']].groupby(['Embarked'], as_index=True).mean().plot.bar(ax=ax)
Out[101]:
<matplotlib.axes._subplots.AxesSubplot at 0x29e5fca15f8>
In [102]:
fig, ax = plt.subplots(2,2,figsize=(20,15))
sns.countplot('Embarked', data = df_train, ax=ax[0,0])
ax[0,0].set_title('(1) No. of Passengers Boarded')
sns.countplot('Embarked', hue='Sex', data=df_train, ax=ax[0,1])
ax[0,1].set_title('(2) Male-Female Split for Embarked')
sns.countplot('Embarked', hue='Survived', data=df_train, ax=ax[1,0])
ax[1,0].set_title('(3) Embarked vs Survived')
sns.countplot('Embarked', hue='Pclass', data=df_train, ax=ax[1,1])
ax[1,1].set_title('(4) Embarked vs Pclass')
plt.subplots_adjust(wspace=0.2, hspace=0.5)
plt.show()

6. Family - SibSp (siblings, spouses) + Parch (parents, children)

In [103]:
df_train['FamilySize'] = df_train['SibSp'] + df_train['Parch'] + 1
df_test['FamilySize'] = df_test['SibSp'] + df_test['Parch'] + 1

print('Max: ', df_train['FamilySize'].max())
print('Min: ', df_train['FamilySize'].min())
Max:  11
Min:  1
In [104]:
fig, ax = plt.subplots(1, 3, figsize = (40,10))
sns.countplot('FamilySize', data = df_train, ax=ax[0])
ax[0].set_title('(1) No. Of Passengers Boarded')

sns.countplot('FamilySize', hue='Survived', data = df_train, ax=ax[1])
ax[1].set_title('(2) Survived countplot depending on FamilySize')

df_train[['FamilySize', 'Survived']].groupby(['FamilySize'], 
            as_index=True).mean().sort_values(by='Survived', ascending=False).plot.bar(ax=ax[2])
ax[2].set_title('(3) Survived rate depending on FamilySize')

plt.subplots_adjust(wspace=0.2, hspace=0.5)
plt.show()
7. Fare

In [105]:
fig, ax = plt.subplots(1, 1, figsize=(8,8))
g = sns.distplot(df_train['Fare'], label='Skewness', ax=ax)
In [106]:
df_test.loc[df_test.Fare.isnull(), 'Fare']= df_test['Fare'].mean()

df_train['Fare'] = df_train['Fare'].map(lambda i: np.log(i) if i > 0 else 0)
df_test['Fare'] = df_test['Fare'].map(lambda i: np.log(i) if i > 0 else 0)

fig, ax = plt.subplots(1, 1, figsize=(8, 8))
g = sns.distplot(df_train['Fare'], color='b', label='Skewness : {:.2f}'.format(df_train['Fare'].skew()), ax=ax)
g = g.legend(loc='best')
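
As a side note, np.log1p (which computes log(1 + x)) handles zero fares directly and is a common alternative to the lambda above. A small sketch of what would replace the two map calls, not something run in this notebook:

# log1p maps a fare of 0 to 0, so the explicit "if i > 0 else 0" is no longer needed
df_train['Fare'] = np.log1p(df_train['Fare'])
df_test['Fare'] = np.log1p(df_test['Fare'])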

8. Cabin

In [107]:
# Compute the null ratio of the Cabin feature
df_train['Cabin'].isnull().sum() / df_train.shape[0]
Out[107]:
0.7710437710437711

The null ratio is 77%, so this feature is not included in the model.

9. Ticket

In [108]:
df_train['Ticket'].value_counts()
Out[108]:
1601             7
CA. 2343         7
347082           7
3101295          6
347088           6
CA 2144          6
S.O.C. 14879     5
382652           5
347077           4
113781           4
4133             4
349909           4
19950            4
113760           4
2666             4
W./C. 6608       4
PC 17757         4
17421            4
LINE             4
29106            3
239853           3
110152           3
347742           3
248727           3
35273            3
345773           3
C.A. 34651       3
F.C.C. 13529     3
230080           3
C.A. 31921       3
                ..
111320           1
229236           1
350050           1
349253           1
347062           1
370377           1
367232           1
349210           1
28664            1
113788           1
2662             1
349221           1
248733           1
C.A. 6212        1
SC/PARIS 2167    1
33638            1
347081           1
244270           1
237671           1
243880           1
A/5. 13032       1
345777           1
349217           1
347063           1
F.C. 12750       1
8471             1
113051           1
237442           1
3460             1
2669             1
Name: Ticket, Length: 681, dtype: int64

Ticket numbers are extremely varied. What characteristics could we extract from them and tie to survival?
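
One simple idea (a sketch only; the prefix handling below is hypothetical and is not used later in this notebook) is to pull out the alphabetic prefix of each ticket and compare survival rates across prefixes:

# Hypothetical example: take the leading token of the ticket, strip punctuation,
# and mark purely numeric tickets as 'NONE', then look at survival per prefix.
ticket_prefix = (df_train['Ticket']
                 .str.replace(r'[./]', '', regex=True)
                 .str.split().str[0])
ticket_prefix = ticket_prefix.where(~ticket_prefix.str.isdigit(), 'NONE')
print(df_train.groupby(ticket_prefix)['Survived'].agg(['mean', 'count'])
      .sort_values('count', ascending=False).head(10))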

3. Feature Engineering


First, fill in the null data that exists in the dataset.
How the null data is filled can make or break model performance, so it deserves care.
Since feature engineering produces what the model will actually be trained on,
it must be applied to the test set exactly as it is applied to the train set.

3.1 Fill Null in Age using title

In [109]:
# Look at where the title sits inside the Name field
df_train[['Name']]
Out[109]:
Name
0 Braund, Mr. Owen Harris
1 Cumings, Mrs. John Bradley (Florence Briggs Th...
2 Heikkinen, Miss. Laina
3 Futrelle, Mrs. Jacques Heath (Lily May Peel)
4 Allen, Mr. William Henry
5 Moran, Mr. James
6 McCarthy, Mr. Timothy J
7 Palsson, Master. Gosta Leonard
8 Johnson, Mrs. Oscar W (Elisabeth Vilhelmina Berg)
9 Nasser, Mrs. Nicholas (Adele Achem)
10 Sandstrom, Miss. Marguerite Rut
11 Bonnell, Miss. Elizabeth
12 Saundercock, Mr. William Henry
13 Andersson, Mr. Anders Johan
14 Vestrom, Miss. Hulda Amanda Adolfina
15 Hewlett, Mrs. (Mary D Kingcome)
16 Rice, Master. Eugene
17 Williams, Mr. Charles Eugene
18 Vander Planke, Mrs. Julius (Emelia Maria Vande...
19 Masselmani, Mrs. Fatima
20 Fynney, Mr. Joseph J
21 Beesley, Mr. Lawrence
22 McGowan, Miss. Anna "Annie"
23 Sloper, Mr. William Thompson
24 Palsson, Miss. Torborg Danira
25 Asplund, Mrs. Carl Oscar (Selma Augusta Emilia...
26 Emir, Mr. Farred Chehab
27 Fortune, Mr. Charles Alexander
28 O'Dwyer, Miss. Ellen "Nellie"
29 Todoroff, Mr. Lalio
... ...
861 Giles, Mr. Frederick Edward
862 Swift, Mrs. Frederick Joel (Margaret Welles Ba...
863 Sage, Miss. Dorothy Edith "Dolly"
864 Gill, Mr. John William
865 Bystrom, Mrs. (Karolina)
866 Duran y More, Miss. Asuncion
867 Roebling, Mr. Washington Augustus II
868 van Melkebeke, Mr. Philemon
869 Johnson, Master. Harold Theodor
870 Balkic, Mr. Cerin
871 Beckwith, Mrs. Richard Leonard (Sallie Monypeny)
872 Carlsson, Mr. Frans Olof
873 Vander Cruyssen, Mr. Victor
874 Abelson, Mrs. Samuel (Hannah Wizosky)
875 Najib, Miss. Adele Kiamie "Jane"
876 Gustafsson, Mr. Alfred Ossian
877 Petroff, Mr. Nedelio
878 Laleff, Mr. Kristo
879 Potter, Mrs. Thomas Jr (Lily Alexenia Wilson)
880 Shelley, Mrs. William (Imanita Parrish Hall)
881 Markun, Mr. Johann
882 Dahlberg, Miss. Gerda Ulrika
883 Banfield, Mr. Frederick James
884 Sutehall, Mr. Henry Jr
885 Rice, Mrs. William (Margaret Norton)
886 Montvila, Rev. Juozas
887 Graham, Miss. Margaret Edith
888 Johnston, Miss. Catherine Helen "Carrie"
889 Behr, Mr. Karl Howell
890 Dooley, Mr. Patrick

891 rows × 1 columns

In [110]:
# let's extract the salutations
# use str.extract, which applies a regular expression
df_train['Initial'] = df_train.Name.str.extract(r'([A-Za-z]+)\.')
df_test['Initial'] = df_test.Name.str.extract(r'([A-Za-z]+)\.')
In [111]:
pd.crosstab(df_train['Initial'], df_train['Sex']).T.style.background_gradient(cmap='summer_r')
Out[111]:
Initial Capt Col Countess Don Dr Jonkheer Lady Major Master Miss Mlle Mme Mr Mrs Ms Rev Sir
Sex
female 0 0 1 0 1 0 1 0 0 182 2 1 0 125 1 0 0
male 1 2 0 1 6 1 0 2 40 0 0 0 517 0 0 6 1
In [112]:
df_train['Initial'].replace(['Mlle','Mme','Ms','Dr','Major','Lady','Countess','Jonkheer',
                             'Col','Rev','Capt','Sir','Don', 'Dona'],
                        ['Miss','Miss','Miss','Mr','Mr','Mrs','Mrs',
                         'Other','Other','Other','Mr','Mr','Mr', 'Mr'],inplace=True)

df_test['Initial'].replace(['Mlle','Mme','Ms','Dr','Major','Lady','Countess','Jonkheer',
                            'Col','Rev','Capt','Sir','Don', 'Dona'],
                        ['Miss','Miss','Miss','Mr','Mr','Mrs','Mrs',
                         'Other','Other','Other','Mr','Mr','Mr', 'Mr'],inplace=True)
In [113]:
df_train.groupby('Initial').mean()
Out[113]:
PassengerId Survived Pclass Age SibSp Parch Fare FamilySize
Initial
Master 414.975000 0.575000 2.625000 4.574167 2.300000 1.375000 3.340710 4.675000
Miss 411.741935 0.704301 2.284946 21.860000 0.698925 0.537634 3.123713 2.236559
Mr 455.880907 0.162571 2.381853 32.739609 0.293006 0.151229 2.651507 1.444234
Mrs 456.393701 0.795276 1.984252 35.981818 0.692913 0.818898 3.443751 2.511811
Other 564.444444 0.111111 1.666667 45.888889 0.111111 0.111111 2.641605 1.222222
In [114]:
df_train.groupby('Initial')['Survived'].mean().plot.bar()
Out[114]:
<matplotlib.axes._subplots.AxesSubplot at 0x29e61ec0160>
In [115]:
df_train.groupby('Initial').mean()
Out[115]:
PassengerId Survived Pclass Age SibSp Parch Fare FamilySize
Initial
Master 414.975000 0.575000 2.625000 4.574167 2.300000 1.375000 3.340710 4.675000
Miss 411.741935 0.704301 2.284946 21.860000 0.698925 0.537634 3.123713 2.236559
Mr 455.880907 0.162571 2.381853 32.739609 0.293006 0.151229 2.651507 1.444234
Mrs 456.393701 0.795276 1.984252 35.981818 0.692913 0.818898 3.443751 2.511811
Other 564.444444 0.111111 1.666667 45.888889 0.111111 0.111111 2.641605 1.222222
In [116]:
# Fill the null values with the mean age of each Initial group
df_train.loc[(df_train.Age.isnull())&(df_train.Initial=='Mr'),'Age'] = 33
df_train.loc[(df_train.Age.isnull())&(df_train.Initial=='Mrs'),'Age'] = 36
df_train.loc[(df_train.Age.isnull())&(df_train.Initial=='Master'),'Age'] = 5
df_train.loc[(df_train.Age.isnull())&(df_train.Initial=='Miss'),'Age'] = 22
df_train.loc[(df_train.Age.isnull())&(df_train.Initial=='Other'),'Age'] = 46

df_test.loc[(df_test.Age.isnull())&(df_test.Initial=='Mr'),'Age'] = 33
df_test.loc[(df_test.Age.isnull())&(df_test.Initial=='Mrs'),'Age'] = 36
df_test.loc[(df_test.Age.isnull())&(df_test.Initial=='Master'),'Age'] = 5
df_test.loc[(df_test.Age.isnull())&(df_test.Initial=='Miss'),'Age'] = 22
df_test.loc[(df_test.Age.isnull())&(df_test.Initial=='Other'),'Age'] = 46
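
The ages above are hard-coded from the group means printed earlier; the same fill could be computed directly from the data instead (a minimal alternative sketch using the Initial column built above):

# Fill missing ages with the mean age of the passenger's Initial group,
# computed on the fly rather than typed in by hand
df_train['Age'] = df_train['Age'].fillna(df_train.groupby('Initial')['Age'].transform('mean'))
df_test['Age'] = df_test['Age'].fillna(df_test.groupby('Initial')['Age'].transform('mean'))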

3.2 Fill Null in Embarked

In [117]:
# Check how many null values there are
df_train['Embarked'].isnull().sum()
Out[117]:
2
In [118]:
# fillna makes it easy to fill the null values
df_train['Embarked'].fillna('S', inplace=True)

3.3 Change Age (continuous to categorical)

Age is currently a continuous feature. We can split Age into groups and turn it into a categorical feature.
This risks some information loss, but for now we will proceed with the categorized version.

In [119]:
def category_age(x):
    if x < 10:
        return 0
    elif x < 20:
        return 1
    elif x < 30:
        return 2
    elif x < 40:
        return 3
    elif x < 50:
        return 4
    elif x < 60:
        return 5
    elif x < 70:
        return 6
    else:
        return 7  
    
df_train['Age_cat'] = df_train.Age.apply(category_age)
df_test['Age_cat'] = df_test.Age.apply(category_age)
In [120]:
df_train.groupby(['Age_cat'])['PassengerId'].count()
Out[120]:
Age_cat
0     66
1    102
2    256
3    304
4     89
5     48
6     19
7      7
Name: PassengerId, dtype: int64
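
The same binning can also be written with pd.cut instead of the hand-written if/elif chain. A short sketch (Age_cat2 is a throwaway column; the bin edges mirror category_age above):

# Equivalent binning: [0,10) -> 0, [10,20) -> 1, ..., [70, inf) -> 7
bins = [0, 10, 20, 30, 40, 50, 60, 70, np.inf]
df_train['Age_cat2'] = pd.cut(df_train['Age'], bins=bins, labels=range(8), right=False)
print((df_train['Age_cat2'].astype(int) == df_train['Age_cat']).all())  # should print True
df_train.drop('Age_cat2', axis=1, inplace=True)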

3.4 Change Initial, Embarked and Sex (string to numerical)

Convert these to numbers so the computer can work with them.
The map method makes this simple.

In [121]:
df_train['Initial'] = df_train['Initial'].map({'Master': 0, 'Miss': 1, 'Mr': 2, 'Mrs': 3, 'Other': 4})
df_test.Initial = df_test.Initial.map({'Master': 0, 'Miss': 1, 'Mr': 2, 'Mrs': 3, 'Other': 4})
In [122]:
df_train['Embarked'] = df_train['Embarked'].map({'C': 0, 'Q': 1, 'S': 2})
df_test['Embarked'] = df_test['Embarked'].map({'C': 0, 'Q': 1, 'S': 2})
In [123]:
df_train['Embarked'].isnull().any() , df_train['Embarked'].dtypes
Out[123]:
(False, dtype('int64'))
In [124]:
df_train['Sex'] = df_train['Sex'].map({'female': 0, 'male': 1})
df_test['Sex'] = df_test['Sex'].map({'female': 0, 'male': 1})

df_train.columns
Out[124]:
Index(['PassengerId', 'Survived', 'Pclass', 'Name', 'Sex', 'Age', 'SibSp',
       'Parch', 'Ticket', 'Fare', 'Cabin', 'Embarked', 'FamilySize', 'Initial',
       'Age_cat'],
      dtype='object')
In [125]:
# Compute the correlation between features
# values toward -1 mean a negative correlation
# values toward +1 mean a positive correlation
# 0 means no correlation
heatmap_data = df_train[['Survived', 'Pclass', 'Sex', 'Fare', 'Embarked', 
                         'FamilySize', 'Initial', 'Age_cat', 'Age']] 

# astype : cast to float, annot : print the numbers, annot_kws : font size
colormap = plt.cm.Blues
plt.figure(figsize=(14, 12))
plt.title('Pearson Correlation of Features', y=1.05, size=15)
sns.heatmap(heatmap_data.astype(float).corr(), linewidths=0.1, vmax=1.0,
           square=True, cmap=colormap, linecolor='white', annot=True, annot_kws={"size": 16})

del heatmap_data

3.5 One-hot encoding on Initial and Embarked

What is one-hot encoding? One-hot encoding is a way of representing a word as a vector whose dimension is the size of the set: the index of the word being represented is set to 1 and every other index is set to 0. A vector expressed this way is called a one-hot vector.
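
As a tiny illustration of the definition (a throwaway example, separate from the notebook's pipeline): with the three Embarked values, each one becomes a three-dimensional one-hot vector.

# 'C' -> [1, 0, 0], 'Q' -> [0, 1, 0], 'S' -> [0, 0, 1]
print(pd.get_dummies(pd.Series(['C', 'Q', 'S', 'S'])))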

In [126]:
# pd.get_dummies makes one-hot encoding easy
df_train = pd.get_dummies(df_train, columns=['Initial'], prefix='Initial')
df_test = pd.get_dummies(df_test, columns=['Initial'], prefix='Initial')
In [127]:
df_train.head()
Out[127]:
PassengerId Survived Pclass Name Sex Age SibSp Parch Ticket Fare Cabin Embarked FamilySize Age_cat Initial_0 Initial_1 Initial_2 Initial_3 Initial_4
0 1 0 3 Braund, Mr. Owen Harris 1 22.0 1 0 A/5 21171 1.981001 NaN 2 2 2 0 0 1 0 0
1 2 1 1 Cumings, Mrs. John Bradley (Florence Briggs Th... 0 38.0 1 0 PC 17599 4.266662 C85 0 2 3 0 0 0 1 0
2 3 1 3 Heikkinen, Miss. Laina 0 26.0 0 0 STON/O2. 3101282 2.070022 NaN 2 1 2 0 1 0 0 0
3 4 1 1 Futrelle, Mrs. Jacques Heath (Lily May Peel) 0 35.0 1 0 113803 3.972177 C123 2 2 3 0 0 0 1 0
4 5 0 3 Allen, Mr. William Henry 1 35.0 0 0 373450 2.085672 NaN 2 1 3 0 0 1 0 0
In [128]:
df_train = pd.get_dummies(df_train, columns=['Embarked'], prefix='Embarked')
df_test = pd.get_dummies(df_test, columns=['Embarked'], prefix='Embarked')

3.6 Drop columns

In [129]:
df_train.drop(['PassengerId', 'Name', 'SibSp', 'Parch', 'Ticket', 'Cabin'], axis=1, inplace=True)
df_test.drop(['PassengerId', 'Name', 'SibSp', 'Parch', 'Ticket', 'Cabin'], axis=1, inplace=True)
In [130]:
df_train.head()
Out[130]:
Survived Pclass Sex Age Fare FamilySize Age_cat Initial_0 Initial_1 Initial_2 Initial_3 Initial_4 Embarked_0 Embarked_1 Embarked_2
0 0 3 1 22.0 1.981001 2 2 0 0 1 0 0 0 0 1
1 1 1 0 38.0 4.266662 2 3 0 0 0 1 0 1 0 0
2 1 3 0 26.0 2.070022 1 2 0 1 0 0 0 0 0 1
3 1 1 0 35.0 3.972177 2 3 0 0 0 1 0 0 0 1
4 0 3 1 35.0 2.085672 1 3 0 0 1 0 0 0 0 1
In [131]:
df_train.dtypes
Out[131]:
Survived        int64
Pclass          int64
Sex             int64
Age           float64
Fare          float64
FamilySize      int64
Age_cat         int64
Initial_0       uint8
Initial_1       uint8
Initial_2       uint8
Initial_3       uint8
Initial_4       uint8
Embarked_0      uint8
Embarked_1      uint8
Embarked_2      uint8
dtype: object

4. Model Development and Training

In [132]:
# importing all the required ML packages
from sklearn.ensemble import RandomForestClassifier # the well-known RandomForestClassifier
from sklearn import metrics # used to evaluate the model
from sklearn.model_selection import train_test_split # convenient function for splitting the training set

4.1 Preparation - Split dataset into train, valid(dev), test set

First, separate the data that will be used for training from the target label (Survived).

In [133]:
X_train = df_train.drop('Survived', axis=1).values
target_label = df_train.Survived.values
X_test = df_test.values
In [134]:
X_train.shape, X_test.shape
Out[134]:
((891, 14), (418, 14))
In [135]:
# To build a good model, set aside a separate valid (dev) set and use it to evaluate the model
# train_test_split(arrays, test_size, train_size, random_state, shuffle, stratify)
# arrays : the data to split
# test_size : ratio or count of the test dataset
# random_state : seed for the shuffle performed when splitting the data
# shuffle : whether to shuffle (default = True)
# stratify : preserve the class ratio of the specified data
# Returns : X_train, X_test, Y_train, Y_test when both the data and labels are passed in arrays
X_tr, X_vld, y_tr, y_vld = train_test_split(X_train, target_label
                                            , test_size=0.2, random_state=2018)

4.2 Model generation and prediction

Create and train a random forest model, one of the sklearn algorithms.
A random forest is a decision-tree-based model: an ensemble of many decision trees.

In [138]:
# Create, train, and predict with the random forest
model = RandomForestClassifier()
model.fit(X_tr, y_tr)
prediction = model.predict(X_vld)
In [139]:
print('Predicted survival of {} passengers with {:.2f}% accuracy'.format(y_vld.shape[0], 100 * metrics.accuracy_score(prediction, y_vld)))
Predicted survival of 179 passengers with 82.12% accuracy
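
The 82.12% above comes from a single train/validation split, so it can shift with random_state. For a steadier estimate, k-fold cross-validation on the full training data is a common check (a sketch, assuming the X_train and target_label arrays built earlier):

from sklearn.model_selection import cross_val_score

# 5-fold cross-validation accuracy of the same model family on all training rows
cv_scores = cross_val_score(RandomForestClassifier(), X_train, target_label, cv=5, scoring='accuracy')
print('CV accuracy: {:.2f}% (+/- {:.2f}%)'.format(100 * cv_scores.mean(), 100 * cv_scores.std()))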
In [141]:
from pandas import Series

feature_importance = model.feature_importances_
Series_feat_imp = Series(feature_importance, index = df_test.columns)
In [142]:
plt.figure(figsize=(8,8))
Series_feat_imp.sort_values(ascending=True).plot.barh()
plt.xlabel("Feature Importance")
plt.ylabel("Feature")
plt.show()
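
To turn this into a Kaggle submission, the gender_submission.csv template loaded earlier (df_submit) can be filled with predictions on the preprocessed test set. A minimal sketch, reusing the random forest trained above (the output filename is arbitrary):

# Predict on the test set and write a submission file with PassengerId, Survived
submission = df_submit.copy()
submission['Survived'] = model.predict(X_test)
submission.to_csv('./titanic_dataset/my_submission.csv', index=False)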

5. NN Model Development with Keras

In [145]:
from keras.models import Sequential
from keras.layers.core import Dense, Dropout
from keras.optimizers import Adam, SGD
In [146]:
nn_model = Sequential()
nn_model.add(Dense(32,activation='relu',input_shape=(14,)))
nn_model.add(Dropout(0.2))
nn_model.add(Dense(64,activation='relu'))
nn_model.add(Dropout(0.2))
nn_model.add(Dense(32,activation='relu'))
nn_model.add(Dropout(0.2))
nn_model.add(Dense(1,activation='sigmoid'))

Loss = 'binary_crossentropy'
nn_model.compile(loss=Loss,optimizer=Adam(),metrics=['accuracy'])
nn_model.summary()
WARNING:tensorflow:From C:\Anaconda3\envs\venv\lib\site-packages\tensorflow\python\framework\op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Colocations handled automatically by placer.
WARNING:tensorflow:From C:\Anaconda3\envs\venv\lib\site-packages\keras\backend\tensorflow_backend.py:3445: calling dropout (from tensorflow.python.ops.nn_ops) with keep_prob is deprecated and will be removed in a future version.
Instructions for updating:
Please use `rate` instead of `keep_prob`. Rate should be set to `rate = 1 - keep_prob`.
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 32)                480       
_________________________________________________________________
dropout_1 (Dropout)          (None, 32)                0         
_________________________________________________________________
dense_2 (Dense)              (None, 64)                2112      
_________________________________________________________________
dropout_2 (Dropout)          (None, 64)                0         
_________________________________________________________________
dense_3 (Dense)              (None, 32)                2080      
_________________________________________________________________
dropout_3 (Dropout)          (None, 32)                0         
_________________________________________________________________
dense_4 (Dense)              (None, 1)                 33        
=================================================================
Total params: 4,705
Trainable params: 4,705
Non-trainable params: 0
_________________________________________________________________
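
The training below runs for a fixed 500 epochs; with only 712 training rows this can overfit, so an EarlyStopping callback is a common safeguard (a sketch only; the patience value is arbitrary and no callback is used in the original run):

from keras.callbacks import EarlyStopping

# Stop training once val_loss has not improved for 20 consecutive epochs
early_stop = EarlyStopping(monitor='val_loss', patience=20)
# It would then be passed to fit(), e.g. nn_model.fit(..., callbacks=[early_stop])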
In [147]:
history = nn_model.fit(X_tr,y_tr,
                    batch_size=64,
                    epochs=500,
                    validation_data=(X_vld, y_vld),
                    verbose=1)
WARNING:tensorflow:From C:\Anaconda3\envs\venv\lib\site-packages\tensorflow\python\ops\math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.cast instead.
Train on 712 samples, validate on 179 samples
Epoch 1/500
712/712 [==============================] - 2s 2ms/step - loss: 0.9941 - acc: 0.5562 - val_loss: 0.6499 - val_acc: 0.6592
Epoch 2/500
712/712 [==============================] - 0s 53us/step - loss: 0.8458 - acc: 0.5702 - val_loss: 0.5921 - val_acc: 0.6760
Epoch 3/500
712/712 [==============================] - 0s 83us/step - loss: 0.8033 - acc: 0.5941 - val_loss: 0.5985 - val_acc: 0.7374
Epoch 4/500
712/712 [==============================] - 0s 63us/step - loss: 0.7794 - acc: 0.5829 - val_loss: 0.5767 - val_acc: 0.7318
Epoch 5/500
712/712 [==============================] - 0s 64us/step - loss: 0.7144 - acc: 0.6025 - val_loss: 0.6200 - val_acc: 0.6983
Epoch 6/500
712/712 [==============================] - 0s 50us/step - loss: 0.7199 - acc: 0.5969 - val_loss: 0.5629 - val_acc: 0.7709
Epoch 7/500
712/712 [==============================] - 0s 56us/step - loss: 0.6799 - acc: 0.6306 - val_loss: 0.5534 - val_acc: 0.7709
Epoch 8/500
712/712 [==============================] - 0s 52us/step - loss: 0.6659 - acc: 0.6531 - val_loss: 0.5620 - val_acc: 0.7486
Epoch 9/500
712/712 [==============================] - 0s 46us/step - loss: 0.6386 - acc: 0.6756 - val_loss: 0.5529 - val_acc: 0.7430
Epoch 10/500
712/712 [==============================] - 0s 46us/step - loss: 0.6472 - acc: 0.6643 - val_loss: 0.5214 - val_acc: 0.7598
Epoch 11/500
712/712 [==============================] - 0s 50us/step - loss: 0.6199 - acc: 0.6798 - val_loss: 0.5305 - val_acc: 0.7430
Epoch 12/500
712/712 [==============================] - 0s 45us/step - loss: 0.6087 - acc: 0.6826 - val_loss: 0.5183 - val_acc: 0.7654
Epoch 13/500
712/712 [==============================] - 0s 50us/step - loss: 0.5993 - acc: 0.6728 - val_loss: 0.4962 - val_acc: 0.7933
Epoch 14/500
712/712 [==============================] - 0s 60us/step - loss: 0.6023 - acc: 0.7093 - val_loss: 0.5075 - val_acc: 0.7598
Epoch 15/500
712/712 [==============================] - 0s 49us/step - loss: 0.5993 - acc: 0.7065 - val_loss: 0.4939 - val_acc: 0.7709
Epoch 16/500
712/712 [==============================] - 0s 48us/step - loss: 0.5955 - acc: 0.7037 - val_loss: 0.4877 - val_acc: 0.7989
Epoch 17/500
712/712 [==============================] - 0s 50us/step - loss: 0.5610 - acc: 0.7247 - val_loss: 0.4830 - val_acc: 0.7933
Epoch 18/500
712/712 [==============================] - 0s 46us/step - loss: 0.5473 - acc: 0.7317 - val_loss: 0.4698 - val_acc: 0.8045
Epoch 19/500
712/712 [==============================] - 0s 60us/step - loss: 0.5448 - acc: 0.7514 - val_loss: 0.4689 - val_acc: 0.8045
Epoch 20/500
712/712 [==============================] - 0s 71us/step - loss: 0.5583 - acc: 0.7317 - val_loss: 0.4567 - val_acc: 0.7933
Epoch 21/500
712/712 [==============================] - 0s 63us/step - loss: 0.5203 - acc: 0.7500 - val_loss: 0.4449 - val_acc: 0.8212
Epoch 22/500
712/712 [==============================] - 0s 57us/step - loss: 0.5293 - acc: 0.7472 - val_loss: 0.4459 - val_acc: 0.8156
Epoch 23/500
712/712 [==============================] - 0s 48us/step - loss: 0.5293 - acc: 0.7542 - val_loss: 0.4569 - val_acc: 0.8045
Epoch 24/500
712/712 [==============================] - 0s 52us/step - loss: 0.5173 - acc: 0.7570 - val_loss: 0.4338 - val_acc: 0.8324
Epoch 25/500
712/712 [==============================] - 0s 45us/step - loss: 0.5126 - acc: 0.7584 - val_loss: 0.4366 - val_acc: 0.8380
Epoch 26/500
712/712 [==============================] - 0s 48us/step - loss: 0.5059 - acc: 0.7640 - val_loss: 0.4210 - val_acc: 0.8492
Epoch 27/500
712/712 [==============================] - 0s 48us/step - loss: 0.5045 - acc: 0.7809 - val_loss: 0.4315 - val_acc: 0.8436
Epoch 28/500
712/712 [==============================] - 0s 49us/step - loss: 0.4781 - acc: 0.7921 - val_loss: 0.4260 - val_acc: 0.8436
Epoch 29/500
712/712 [==============================] - 0s 64us/step - loss: 0.4927 - acc: 0.7865 - val_loss: 0.4089 - val_acc: 0.8547
Epoch 30/500
712/712 [==============================] - 0s 48us/step - loss: 0.4893 - acc: 0.7879 - val_loss: 0.4058 - val_acc: 0.8436
Epoch 31/500
712/712 [==============================] - 0s 46us/step - loss: 0.4803 - acc: 0.7949 - val_loss: 0.4051 - val_acc: 0.8492
Epoch 32/500
712/712 [==============================] - 0s 46us/step - loss: 0.4924 - acc: 0.7823 - val_loss: 0.4121 - val_acc: 0.8324
Epoch 33/500
712/712 [==============================] - 0s 48us/step - loss: 0.5155 - acc: 0.7879 - val_loss: 0.4097 - val_acc: 0.8547
Epoch 34/500
712/712 [==============================] - 0s 48us/step - loss: 0.4857 - acc: 0.7907 - val_loss: 0.4212 - val_acc: 0.8547
Epoch 35/500
712/712 [==============================] - 0s 48us/step - loss: 0.4755 - acc: 0.7978 - val_loss: 0.4203 - val_acc: 0.8268
Epoch 36/500
712/712 [==============================] - 0s 56us/step - loss: 0.4744 - acc: 0.7949 - val_loss: 0.4062 - val_acc: 0.8380
Epoch 37/500
712/712 [==============================] - 0s 48us/step - loss: 0.4966 - acc: 0.7697 - val_loss: 0.3933 - val_acc: 0.8547
Epoch 38/500
712/712 [==============================] - 0s 62us/step - loss: 0.4675 - acc: 0.7935 - val_loss: 0.4152 - val_acc: 0.8212
Epoch 39/500
712/712 [==============================] - 0s 64us/step - loss: 0.4747 - acc: 0.7893 - val_loss: 0.4024 - val_acc: 0.8547
Epoch 40/500
712/712 [==============================] - 0s 64us/step - loss: 0.4671 - acc: 0.7992 - val_loss: 0.3990 - val_acc: 0.8492
Epoch 41/500
712/712 [==============================] - 0s 62us/step - loss: 0.4875 - acc: 0.7921 - val_loss: 0.4114 - val_acc: 0.8268
Epoch 42/500
712/712 [==============================] - 0s 52us/step - loss: 0.4704 - acc: 0.7992 - val_loss: 0.4140 - val_acc: 0.8492
Epoch 43/500
712/712 [==============================] - 0s 50us/step - loss: 0.4492 - acc: 0.8020 - val_loss: 0.3997 - val_acc: 0.8436
Epoch 44/500
712/712 [==============================] - 0s 52us/step - loss: 0.4721 - acc: 0.8020 - val_loss: 0.4014 - val_acc: 0.8492
Epoch 45/500
712/712 [==============================] - 0s 56us/step - loss: 0.4561 - acc: 0.7921 - val_loss: 0.4042 - val_acc: 0.8436
Epoch 46/500
712/712 [==============================] - 0s 56us/step - loss: 0.4579 - acc: 0.7935 - val_loss: 0.4017 - val_acc: 0.8436
Epoch 47/500
712/712 [==============================] - 0s 63us/step - loss: 0.4594 - acc: 0.7921 - val_loss: 0.4036 - val_acc: 0.8547
Epoch 48/500
712/712 [==============================] - 0s 56us/step - loss: 0.4595 - acc: 0.8034 - val_loss: 0.4011 - val_acc: 0.8492
Epoch 49/500
712/712 [==============================] - 0s 46us/step - loss: 0.4287 - acc: 0.8216 - val_loss: 0.4051 - val_acc: 0.8492
Epoch 50/500
712/712 [==============================] - 0s 46us/step - loss: 0.4411 - acc: 0.8076 - val_loss: 0.4007 - val_acc: 0.8436
Epoch 51/500
712/712 [==============================] - 0s 49us/step - loss: 0.4418 - acc: 0.8132 - val_loss: 0.4043 - val_acc: 0.8380
Epoch 52/500
712/712 [==============================] - 0s 48us/step - loss: 0.4305 - acc: 0.7992 - val_loss: 0.4142 - val_acc: 0.8380
Epoch 53/500
712/712 [==============================] - 0s 66us/step - loss: 0.4481 - acc: 0.8020 - val_loss: 0.4109 - val_acc: 0.8380
Epoch 54/500
712/712 [==============================] - 0s 66us/step - loss: 0.4344 - acc: 0.8020 - val_loss: 0.3973 - val_acc: 0.8380
Epoch 55/500
712/712 [==============================] - 0s 53us/step - loss: 0.4297 - acc: 0.8160 - val_loss: 0.3924 - val_acc: 0.8380
Epoch 56/500
712/712 [==============================] - 0s 56us/step - loss: 0.4389 - acc: 0.8174 - val_loss: 0.4022 - val_acc: 0.8436
Epoch 57/500
712/712 [==============================] - 0s 48us/step - loss: 0.4566 - acc: 0.8104 - val_loss: 0.4041 - val_acc: 0.8380
Epoch 58/500
712/712 [==============================] - 0s 45us/step - loss: 0.4498 - acc: 0.8006 - val_loss: 0.4216 - val_acc: 0.8436
Epoch 59/500
712/712 [==============================] - 0s 46us/step - loss: 0.4408 - acc: 0.8034 - val_loss: 0.4329 - val_acc: 0.8212
Epoch 60/500
712/712 [==============================] - 0s 46us/step - loss: 0.4382 - acc: 0.8076 - val_loss: 0.3987 - val_acc: 0.8436
Epoch 61/500
712/712 [==============================] - 0s 48us/step - loss: 0.4544 - acc: 0.8034 - val_loss: 0.3960 - val_acc: 0.8436
Epoch 62/500
712/712 [==============================] - 0s 45us/step - loss: 0.4336 - acc: 0.8048 - val_loss: 0.3943 - val_acc: 0.8436
Epoch 63/500
712/712 [==============================] - 0s 46us/step - loss: 0.4412 - acc: 0.8090 - val_loss: 0.4021 - val_acc: 0.8380
Epoch 64/500
712/712 [==============================] - 0s 49us/step - loss: 0.4363 - acc: 0.8357 - val_loss: 0.3963 - val_acc: 0.8436
Epoch 65/500
712/712 [==============================] - 0s 46us/step - loss: 0.4200 - acc: 0.8132 - val_loss: 0.3980 - val_acc: 0.8380
Epoch 66/500
712/712 [==============================] - 0s 55us/step - loss: 0.4302 - acc: 0.8062 - val_loss: 0.4067 - val_acc: 0.8436
Epoch 67/500
712/712 [==============================] - 0s 50us/step - loss: 0.4222 - acc: 0.8132 - val_loss: 0.4105 - val_acc: 0.8324
Epoch 68/500
712/712 [==============================] - 0s 50us/step - loss: 0.4268 - acc: 0.8230 - val_loss: 0.3956 - val_acc: 0.8436
Epoch 69/500
712/712 [==============================] - 0s 52us/step - loss: 0.4395 - acc: 0.8062 - val_loss: 0.3926 - val_acc: 0.8436
Epoch 70/500
712/712 [==============================] - 0s 46us/step - loss: 0.4243 - acc: 0.8188 - val_loss: 0.3906 - val_acc: 0.8547
Epoch 71/500
712/712 [==============================] - 0s 53us/step - loss: 0.4219 - acc: 0.8202 - val_loss: 0.3921 - val_acc: 0.8436
Epoch 72/500
712/712 [==============================] - 0s 50us/step - loss: 0.4242 - acc: 0.8216 - val_loss: 0.3914 - val_acc: 0.8436
Epoch 73/500
712/712 [==============================] - 0s 53us/step - loss: 0.4478 - acc: 0.8020 - val_loss: 0.3947 - val_acc: 0.8492
Epoch 74/500
712/712 [==============================] - 0s 62us/step - loss: 0.4281 - acc: 0.8146 - val_loss: 0.4033 - val_acc: 0.8380
Epoch 75/500
712/712 [==============================] - 0s 49us/step - loss: 0.4134 - acc: 0.8329 - val_loss: 0.3945 - val_acc: 0.8380
Epoch 76/500
712/712 [==============================] - 0s 60us/step - loss: 0.4325 - acc: 0.8104 - val_loss: 0.3968 - val_acc: 0.8380
Epoch 77/500
712/712 [==============================] - 0s 62us/step - loss: 0.4230 - acc: 0.8329 - val_loss: 0.3935 - val_acc: 0.8380
Epoch 78/500
712/712 [==============================] - 0s 49us/step - loss: 0.4279 - acc: 0.8188 - val_loss: 0.3924 - val_acc: 0.8492
Epoch 79/500
712/712 [==============================] - 0s 50us/step - loss: 0.4161 - acc: 0.8301 - val_loss: 0.4003 - val_acc: 0.8324
Epoch 80/500
712/712 [==============================] - 0s 46us/step - loss: 0.4515 - acc: 0.8048 - val_loss: 0.3950 - val_acc: 0.8436
Epoch 81/500
712/712 [==============================] - 0s 45us/step - loss: 0.4434 - acc: 0.8160 - val_loss: 0.3957 - val_acc: 0.8324
Epoch 82/500
712/712 [==============================] - 0s 48us/step - loss: 0.4137 - acc: 0.8258 - val_loss: 0.4084 - val_acc: 0.8268
Epoch 83/500
712/712 [==============================] - 0s 46us/step - loss: 0.4207 - acc: 0.8076 - val_loss: 0.3932 - val_acc: 0.8547
Epoch 84/500
712/712 [==============================] - 0s 48us/step - loss: 0.4243 - acc: 0.8118 - val_loss: 0.4071 - val_acc: 0.8380
Epoch 85/500
712/712 [==============================] - 0s 50us/step - loss: 0.4415 - acc: 0.8104 - val_loss: 0.3956 - val_acc: 0.8380
Epoch 86/500
712/712 [==============================] - 0s 46us/step - loss: 0.4330 - acc: 0.8202 - val_loss: 0.3951 - val_acc: 0.8380
Epoch 87/500
712/712 [==============================] - 0s 52us/step - loss: 0.4116 - acc: 0.8301 - val_loss: 0.4185 - val_acc: 0.8324
Epoch 88/500
712/712 [==============================] - 0s 64us/step - loss: 0.4322 - acc: 0.8104 - val_loss: 0.4092 - val_acc: 0.8547
Epoch 89/500
712/712 [==============================] - 0s 80us/step - loss: 0.4261 - acc: 0.8202 - val_loss: 0.3990 - val_acc: 0.8492
Epoch 90/500
712/712 [==============================] - 0s 50us/step - loss: 0.4290 - acc: 0.8202 - val_loss: 0.3943 - val_acc: 0.8547
Epoch 91/500
712/712 [==============================] - 0s 45us/step - loss: 0.4314 - acc: 0.8202 - val_loss: 0.3931 - val_acc: 0.8547
Epoch 92/500
712/712 [==============================] - 0s 48us/step - loss: 0.4202 - acc: 0.8230 - val_loss: 0.3992 - val_acc: 0.8492
Epoch 93/500
712/712 [==============================] - 0s 48us/step - loss: 0.4319 - acc: 0.8160 - val_loss: 0.4097 - val_acc: 0.8380
Epoch 94/500
712/712 [==============================] - 0s 48us/step - loss: 0.4183 - acc: 0.8413 - val_loss: 0.3984 - val_acc: 0.8603
Epoch 95/500
712/712 [==============================] - 0s 48us/step - loss: 0.4327 - acc: 0.8174 - val_loss: 0.3958 - val_acc: 0.8492
Epoch 96/500
712/712 [==============================] - 0s 60us/step - loss: 0.4337 - acc: 0.8076 - val_loss: 0.3964 - val_acc: 0.8268
Epoch 97/500
712/712 [==============================] - 0s 49us/step - loss: 0.4196 - acc: 0.8301 - val_loss: 0.4045 - val_acc: 0.8547
Epoch 98/500
712/712 [==============================] - 0s 49us/step - loss: 0.4384 - acc: 0.8160 - val_loss: 0.4054 - val_acc: 0.8547
Epoch 99/500
712/712 [==============================] - 0s 46us/step - loss: 0.4285 - acc: 0.8160 - val_loss: 0.3919 - val_acc: 0.8380
Epoch 100/500
712/712 [==============================] - 0s 57us/step - loss: 0.4306 - acc: 0.8202 - val_loss: 0.3945 - val_acc: 0.8547
Epoch 101/500
712/712 [==============================] - 0s 48us/step - loss: 0.4015 - acc: 0.8174 - val_loss: 0.3997 - val_acc: 0.8380
Epoch 102/500
712/712 [==============================] - 0s 46us/step - loss: 0.4165 - acc: 0.8301 - val_loss: 0.3921 - val_acc: 0.8603
Epoch 103/500
712/712 [==============================] - 0s 46us/step - loss: 0.4090 - acc: 0.8329 - val_loss: 0.3855 - val_acc: 0.8603
Epoch 104/500
712/712 [==============================] - 0s 48us/step - loss: 0.4178 - acc: 0.8230 - val_loss: 0.3845 - val_acc: 0.8436
Epoch 105/500
712/712 [==============================] - 0s 55us/step - loss: 0.4111 - acc: 0.8244 - val_loss: 0.3860 - val_acc: 0.8603
Epoch 106/500
712/712 [==============================] - 0s 48us/step - loss: 0.4076 - acc: 0.8315 - val_loss: 0.3881 - val_acc: 0.8436
Epoch 107/500
712/712 [==============================] - 0s 46us/step - loss: 0.4125 - acc: 0.8329 - val_loss: 0.3840 - val_acc: 0.8436
Epoch 108/500
712/712 [==============================] - 0s 46us/step - loss: 0.4038 - acc: 0.8413 - val_loss: 0.3857 - val_acc: 0.8603
Epoch 109/500
712/712 [==============================] - 0s 49us/step - loss: 0.4228 - acc: 0.8258 - val_loss: 0.3860 - val_acc: 0.8380
Epoch 110/500
712/712 [==============================] - 0s 49us/step - loss: 0.4085 - acc: 0.8329 - val_loss: 0.3884 - val_acc: 0.8436
Epoch 111/500
712/712 [==============================] - 0s 62us/step - loss: 0.4274 - acc: 0.8202 - val_loss: 0.3880 - val_acc: 0.8547
Epoch 112/500
712/712 [==============================] - 0s 62us/step - loss: 0.4067 - acc: 0.8244 - val_loss: 0.3846 - val_acc: 0.8380
Epoch 113/500
712/712 [==============================] - 0s 57us/step - loss: 0.4182 - acc: 0.8062 - val_loss: 0.3907 - val_acc: 0.8547
Epoch 114/500
712/712 [==============================] - 0s 45us/step - loss: 0.4058 - acc: 0.8315 - val_loss: 0.3879 - val_acc: 0.8547
Epoch 115/500
712/712 [==============================] - 0s 56us/step - loss: 0.4051 - acc: 0.8174 - val_loss: 0.3871 - val_acc: 0.8547
Epoch 116/500
712/712 [==============================] - 0s 50us/step - loss: 0.4065 - acc: 0.8258 - val_loss: 0.3881 - val_acc: 0.8547
Epoch 117/500
712/712 [==============================] - 0s 64us/step - loss: 0.4178 - acc: 0.8202 - val_loss: 0.3860 - val_acc: 0.8380
Epoch 118/500
712/712 [==============================] - 0s 70us/step - loss: 0.4128 - acc: 0.8174 - val_loss: 0.3899 - val_acc: 0.8492
Epoch 119/500
712/712 [==============================] - 0s 53us/step - loss: 0.3966 - acc: 0.8272 - val_loss: 0.3877 - val_acc: 0.8547
Epoch 120/500
712/712 [==============================] - 0s 52us/step - loss: 0.4087 - acc: 0.8287 - val_loss: 0.3939 - val_acc: 0.8492
Epoch 121/500
712/712 [==============================] - 0s 48us/step - loss: 0.4124 - acc: 0.8301 - val_loss: 0.3901 - val_acc: 0.8380
Epoch 122/500
712/712 [==============================] - 0s 91us/step - loss: 0.4161 - acc: 0.8244 - val_loss: 0.4043 - val_acc: 0.8380
Epoch 123/500
712/712 [==============================] - 0s 76us/step - loss: 0.4101 - acc: 0.8329 - val_loss: 0.3956 - val_acc: 0.8436
Epoch 124/500
712/712 [==============================] - 0s 87us/step - loss: 0.4012 - acc: 0.8357 - val_loss: 0.3889 - val_acc: 0.8436
Epoch 125/500
712/712 [==============================] - 0s 83us/step - loss: 0.4213 - acc: 0.8301 - val_loss: 0.3920 - val_acc: 0.8436
Epoch 126/500
712/712 [==============================] - 0s 98us/step - loss: 0.4050 - acc: 0.8329 - val_loss: 0.3883 - val_acc: 0.8436
Epoch 127/500
712/712 [==============================] - 0s 122us/step - loss: 0.4243 - acc: 0.8216 - val_loss: 0.3930 - val_acc: 0.8492
Epoch 128/500
712/712 [==============================] - 0s 133us/step - loss: 0.4053 - acc: 0.8301 - val_loss: 0.3942 - val_acc: 0.8436
Epoch 129/500
712/712 [==============================] - 0s 151us/step - loss: 0.4286 - acc: 0.8174 - val_loss: 0.3907 - val_acc: 0.8436
Epoch 130/500
712/712 [==============================] - 0s 113us/step - loss: 0.4178 - acc: 0.8315 - val_loss: 0.3949 - val_acc: 0.8492
Epoch 131/500
712/712 [==============================] - 0s 97us/step - loss: 0.4076 - acc: 0.8301 - val_loss: 0.3959 - val_acc: 0.8436
Epoch 132/500
712/712 [==============================] - 0s 80us/step - loss: 0.4065 - acc: 0.8343 - val_loss: 0.3911 - val_acc: 0.8436
Epoch 133/500
712/712 [==============================] - 0s 78us/step - loss: 0.4057 - acc: 0.8357 - val_loss: 0.3866 - val_acc: 0.8436
Epoch 134/500
712/712 [==============================] - 0s 55us/step - loss: 0.3988 - acc: 0.8343 - val_loss: 0.3845 - val_acc: 0.8436
Epoch 135/500
712/712 [==============================] - 0s 154us/step - loss: 0.4026 - acc: 0.8329 - val_loss: 0.3925 - val_acc: 0.8380
Epoch 136/500
712/712 [==============================] - 0s 77us/step - loss: 0.4066 - acc: 0.8343 - val_loss: 0.3884 - val_acc: 0.8436
Epoch 137/500
712/712 [==============================] - 0s 106us/step - loss: 0.4157 - acc: 0.8244 - val_loss: 0.3818 - val_acc: 0.8492
Epoch 138/500
712/712 [==============================] - 0s 102us/step - loss: 0.4274 - acc: 0.8301 - val_loss: 0.3858 - val_acc: 0.8492
Epoch 139/500
712/712 [==============================] - 0s 105us/step - loss: 0.4094 - acc: 0.8329 - val_loss: 0.3856 - val_acc: 0.8547
Epoch 140/500
712/712 [==============================] - 0s 67us/step - loss: 0.4112 - acc: 0.8301 - val_loss: 0.3884 - val_acc: 0.8436
Epoch 141/500
712/712 [==============================] - 0s 71us/step - loss: 0.4225 - acc: 0.8272 - val_loss: 0.3941 - val_acc: 0.8436
Epoch 142/500
712/712 [==============================] - 0s 160us/step - loss: 0.4147 - acc: 0.8188 - val_loss: 0.3934 - val_acc: 0.8436
Epoch 143/500
712/712 [==============================] - 0s 90us/step - loss: 0.4003 - acc: 0.8441 - val_loss: 0.3901 - val_acc: 0.8380
Epoch 144/500
712/712 [==============================] - 0s 101us/step - loss: 0.4155 - acc: 0.8258 - val_loss: 0.3913 - val_acc: 0.8492
Epoch 145/500
712/712 [==============================] - 0s 83us/step - loss: 0.4088 - acc: 0.8230 - val_loss: 0.3913 - val_acc: 0.8492
Epoch 146/500
712/712 [==============================] - 0s 74us/step - loss: 0.3934 - acc: 0.8329 - val_loss: 0.3898 - val_acc: 0.8380
Epoch 147/500
712/712 [==============================] - 0s 77us/step - loss: 0.4096 - acc: 0.8357 - val_loss: 0.3857 - val_acc: 0.8436
Epoch 148/500
712/712 [==============================] - 0s 160us/step - loss: 0.4000 - acc: 0.8202 - val_loss: 0.3931 - val_acc: 0.8380
Epoch 149/500
712/712 [==============================] - 0s 87us/step - loss: 0.4106 - acc: 0.8160 - val_loss: 0.3805 - val_acc: 0.8492
Epoch 150/500
712/712 [==============================] - 0s 66us/step - loss: 0.3984 - acc: 0.8315 - val_loss: 0.3821 - val_acc: 0.8380
Epoch 151/500
712/712 [==============================] - 0s 84us/step - loss: 0.4000 - acc: 0.8301 - val_loss: 0.3831 - val_acc: 0.8603
Epoch 152/500
712/712 [==============================] - 0s 85us/step - loss: 0.3967 - acc: 0.8315 - val_loss: 0.3906 - val_acc: 0.8380
Epoch 153/500
712/712 [==============================] - 0s 88us/step - loss: 0.3992 - acc: 0.8272 - val_loss: 0.3878 - val_acc: 0.8436
Epoch 154/500
712/712 [==============================] - 0s 83us/step - loss: 0.3833 - acc: 0.8371 - val_loss: 0.3907 - val_acc: 0.8380
Epoch 155/500
712/712 [==============================] - 0s 150us/step - loss: 0.4059 - acc: 0.8230 - val_loss: 0.3854 - val_acc: 0.8492
Epoch 156/500
712/712 [==============================] - 0s 87us/step - loss: 0.3954 - acc: 0.8357 - val_loss: 0.3855 - val_acc: 0.8436
Epoch 157/500
712/712 [==============================] - 0s 104us/step - loss: 0.4077 - acc: 0.8244 - val_loss: 0.3888 - val_acc: 0.8436
Epoch 158/500
712/712 [==============================] - 0s 76us/step - loss: 0.4107 - acc: 0.8216 - val_loss: 0.3894 - val_acc: 0.8324
Epoch 159/500
712/712 [==============================] - 0s 80us/step - loss: 0.3785 - acc: 0.8329 - val_loss: 0.3876 - val_acc: 0.8380
Epoch 160/500
712/712 [==============================] - 0s 171us/step - loss: 0.4126 - acc: 0.8272 - val_loss: 0.3850 - val_acc: 0.8436
Epoch 161/500
712/712 [==============================] - 0s 80us/step - loss: 0.4055 - acc: 0.8258 - val_loss: 0.3843 - val_acc: 0.8547
Epoch 162/500
712/712 [==============================] - 0s 80us/step - loss: 0.3921 - acc: 0.8329 - val_loss: 0.3873 - val_acc: 0.8492
Epoch 163/500
712/712 [==============================] - 0s 105us/step - loss: 0.3955 - acc: 0.8315 - val_loss: 0.3861 - val_acc: 0.8436
Epoch 164/500
712/712 [==============================] - 0s 87us/step - loss: 0.4087 - acc: 0.8343 - val_loss: 0.3917 - val_acc: 0.8436
Epoch 165/500
712/712 [==============================] - 0s 355us/step - loss: 0.3933 - acc: 0.8315 - val_loss: 0.3868 - val_acc: 0.8492
Epoch 166/500
712/712 [==============================] - 0s 108us/step - loss: 0.4003 - acc: 0.8301 - val_loss: 0.3878 - val_acc: 0.8492
Epoch 167/500
712/712 [==============================] - 0s 92us/step - loss: 0.4152 - acc: 0.8287 - val_loss: 0.3885 - val_acc: 0.8436
Epoch 168/500
712/712 [==============================] - 0s 192us/step - loss: 0.3940 - acc: 0.8413 - val_loss: 0.3838 - val_acc: 0.8547
Epoch 169/500
712/712 [==============================] - 0s 143us/step - loss: 0.4030 - acc: 0.8272 - val_loss: 0.3852 - val_acc: 0.8603
Epoch 170/500
712/712 [==============================] - 0s 109us/step - loss: 0.4001 - acc: 0.8244 - val_loss: 0.3829 - val_acc: 0.8436
Epoch 171/500
712/712 [==============================] - 0s 74us/step - loss: 0.4017 - acc: 0.8287 - val_loss: 0.3835 - val_acc: 0.8436
Epoch 172/500
712/712 [==============================] - 0s 73us/step - loss: 0.3825 - acc: 0.8371 - val_loss: 0.3896 - val_acc: 0.8436
Epoch 173/500
712/712 [==============================] - 0s 111us/step - loss: 0.4054 - acc: 0.8216 - val_loss: 0.3836 - val_acc: 0.8380
Epoch 174/500
712/712 [==============================] - 0s 137us/step - loss: 0.4065 - acc: 0.8258 - val_loss: 0.3811 - val_acc: 0.8380
Epoch 175/500
712/712 [==============================] - 0s 83us/step - loss: 0.3955 - acc: 0.8371 - val_loss: 0.3776 - val_acc: 0.8603
Epoch 176/500
712/712 [==============================] - 0s 99us/step - loss: 0.4035 - acc: 0.8287 - val_loss: 0.3796 - val_acc: 0.8492
Epoch 177/500
712/712 [==============================] - 0s 102us/step - loss: 0.4077 - acc: 0.8371 - val_loss: 0.3836 - val_acc: 0.8436
Epoch 178/500
712/712 [==============================] - 0s 69us/step - loss: 0.3992 - acc: 0.8146 - val_loss: 0.3864 - val_acc: 0.8547
Epoch 179/500
712/712 [==============================] - 0s 80us/step - loss: 0.3981 - acc: 0.8357 - val_loss: 0.3904 - val_acc: 0.8547
Epoch 180/500
712/712 [==============================] - 0s 90us/step - loss: 0.3924 - acc: 0.8329 - val_loss: 0.3839 - val_acc: 0.8436
Epoch 181/500
712/712 [==============================] - 0s 126us/step - loss: 0.3948 - acc: 0.8329 - val_loss: 0.3850 - val_acc: 0.8436
Epoch 182/500
712/712 [==============================] - 0s 129us/step - loss: 0.3880 - acc: 0.8371 - val_loss: 0.3848 - val_acc: 0.8436
Epoch 183/500
712/712 [==============================] - 0s 98us/step - loss: 0.3926 - acc: 0.8301 - val_loss: 0.3912 - val_acc: 0.8492
Epoch 184/500
712/712 [==============================] - 0s 52us/step - loss: 0.4003 - acc: 0.8357 - val_loss: 0.3894 - val_acc: 0.8547
Epoch 185/500
712/712 [==============================] - 0s 91us/step - loss: 0.4040 - acc: 0.8441 - val_loss: 0.3954 - val_acc: 0.8436
Epoch 186/500
712/712 [==============================] - 0s 81us/step - loss: 0.3890 - acc: 0.8427 - val_loss: 0.3889 - val_acc: 0.8436
Epoch 187/500
712/712 [==============================] - 0s 179us/step - loss: 0.4088 - acc: 0.8272 - val_loss: 0.3857 - val_acc: 0.8436
Epoch 188/500
712/712 [==============================] - 0s 73us/step - loss: 0.3845 - acc: 0.8287 - val_loss: 0.3885 - val_acc: 0.8380
Epoch 189/500
712/712 [==============================] - 0s 76us/step - loss: 0.4025 - acc: 0.8371 - val_loss: 0.3939 - val_acc: 0.8436
Epoch 190/500
712/712 [==============================] - 0s 67us/step - loss: 0.3936 - acc: 0.8357 - val_loss: 0.3829 - val_acc: 0.8380
Epoch 191/500
712/712 [==============================] - 0s 52us/step - loss: 0.3838 - acc: 0.8399 - val_loss: 0.3824 - val_acc: 0.8380
Epoch 192/500
712/712 [==============================] - 0s 69us/step - loss: 0.3982 - acc: 0.8329 - val_loss: 0.3861 - val_acc: 0.8492
Epoch 193/500
712/712 [==============================] - 0s 49us/step - loss: 0.4021 - acc: 0.8329 - val_loss: 0.3889 - val_acc: 0.8436
Epoch 194/500
712/712 [==============================] - 0s 50us/step - loss: 0.4007 - acc: 0.8301 - val_loss: 0.3845 - val_acc: 0.8436
Epoch 195/500
712/712 [==============================] - 0s 181us/step - loss: 0.4093 - acc: 0.8301 - val_loss: 0.3819 - val_acc: 0.8436
Epoch 196/500
712/712 [==============================] - 0s 106us/step - loss: 0.3910 - acc: 0.8329 - val_loss: 0.3833 - val_acc: 0.8436
Epoch 197/500
712/712 [==============================] - 0s 71us/step - loss: 0.3969 - acc: 0.8385 - val_loss: 0.3846 - val_acc: 0.8436
Epoch 198/500
712/712 [==============================] - 0s 63us/step - loss: 0.3939 - acc: 0.8371 - val_loss: 0.3834 - val_acc: 0.8436
Epoch 199/500
712/712 [==============================] - 0s 185us/step - loss: 0.4102 - acc: 0.8202 - val_loss: 0.3874 - val_acc: 0.8380
Epoch 200/500
712/712 [==============================] - 0s 132us/step - loss: 0.3924 - acc: 0.8357 - val_loss: 0.3869 - val_acc: 0.8436
Epoch 201/500
712/712 [==============================] - 0s 101us/step - loss: 0.3991 - acc: 0.8301 - val_loss: 0.3880 - val_acc: 0.8436
Epoch 202/500
712/712 [==============================] - 0s 116us/step - loss: 0.3825 - acc: 0.8357 - val_loss: 0.3846 - val_acc: 0.8436
Epoch 203/500
712/712 [==============================] - 0s 60us/step - loss: 0.3901 - acc: 0.8357 - val_loss: 0.3850 - val_acc: 0.8380
Epoch 204/500
712/712 [==============================] - 0s 70us/step - loss: 0.3990 - acc: 0.8343 - val_loss: 0.3866 - val_acc: 0.8436
Epoch 205/500
712/712 [==============================] - 0s 83us/step - loss: 0.3833 - acc: 0.8427 - val_loss: 0.3897 - val_acc: 0.8380
Epoch 206/500
712/712 [==============================] - 0s 109us/step - loss: 0.3941 - acc: 0.8357 - val_loss: 0.3843 - val_acc: 0.8436
Epoch 207/500
712/712 [==============================] - 0s 148us/step - loss: 0.3903 - acc: 0.8455 - val_loss: 0.3780 - val_acc: 0.8380
Epoch 208/500
712/712 [==============================] - 0s 87us/step - loss: 0.3997 - acc: 0.8371 - val_loss: 0.3777 - val_acc: 0.8547
Epoch 209/500
712/712 [==============================] - 0s 70us/step - loss: 0.4068 - acc: 0.8076 - val_loss: 0.3885 - val_acc: 0.8492
Epoch 210/500
712/712 [==============================] - 0s 49us/step - loss: 0.3940 - acc: 0.8329 - val_loss: 0.3875 - val_acc: 0.8603

(Training log for epochs 211-499 omitted: training loss keeps drifting down slowly while validation accuracy plateaus around 0.84-0.86, peaking at 0.8659 at epoch 444.)

Epoch 500/500
712/712 [==============================] - 0s 76us/step - loss: 0.3499 - acc: 0.8497 - val_loss: 0.4051 - val_acc: 0.8492
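Validation loss stops improving long before epoch 500, so the run could be cut short with Keras callbacks instead of training for a fixed 500 epochs. The cell below is only a sketch: model, X_tr, y_tr, the validation split and the patience value are placeholders standing in for whatever was actually passed to fit above, not the exact call used in this post.

from keras.callbacks import EarlyStopping, ModelCheckpoint

# Stop once val_loss has not improved for 20 epochs, and keep the best weights on disk.
early_stop = EarlyStopping(monitor='val_loss', patience=20)
checkpoint = ModelCheckpoint('best_model.h5', monitor='val_loss', save_best_only=True)

# X_tr / y_tr and validation_split are placeholders for the arrays and split used in the original fit call.
history = model.fit(X_tr, y_tr,
                    validation_split=0.2,
                    epochs=500,
                    callbacks=[early_stop, checkpoint])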
In [148]:
# Collect the Keras History object(s) into one DataFrame indexed by epoch number.
hists = [history]
hist_df = pd.concat([pd.DataFrame(hist.history) for hist in hists], sort=True)
hist_df.index = np.arange(1, len(hist_df) + 1)

# Top panel: training vs. validation accuracy / bottom panel: training vs. validation loss.
fig, axs = plt.subplots(nrows=2, sharex=True, figsize=(16, 10))
axs[0].plot(hist_df.val_acc, lw=5, label='Validation Accuracy')
axs[0].plot(hist_df.acc, lw=5, label='Training Accuracy')
axs[0].set_ylabel('Accuracy')
axs[0].set_xlabel('Epoch')
axs[0].grid()
axs[0].legend(loc=0)
axs[1].plot(hist_df.val_loss, lw=5, label='Validation MLogLoss')
axs[1].plot(hist_df.loss, lw=5, label='Training MLogLoss')
axs[1].set_ylabel('MLogLoss')
axs[1].set_xlabel('Epoch')
axs[1].grid()
axs[1].legend(loc=0)

# Save the figure to disk and display it.
fig.savefig('hist.png', dpi=300)
plt.show();
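The trained network can also be used to score the Kaggle test set and write a submission file, using the df_submit frame loaded earlier from gender_submission.csv as the template. This is a minimal sketch under two assumptions: the test features were preprocessed into an array X_test with the same columns as the training input (a placeholder name here), and the model ends in a single sigmoid output unit.

# X_test: test-set features, preprocessed exactly like the training input (placeholder name).
pred = model.predict(X_test)                      # shape (n, 1) for a single sigmoid unit
df_submit['Survived'] = (pred.ravel() > 0.5).astype(int)
df_submit.to_csv('keras_submission.csv', index=False)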