Python/Keras question: the loss decreases and then stops decreasing (is this overfitting?)

import tensorflow as tf
import numpy as np
import pandas as pd

from pandas import DataFrame
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import SGD
from tensorflow.keras.layers import Dropout

#sys.stdout = open('model.csv', 'w')

data = pd.read_csv('SW_model.csv', header=None)

x_train = np.array(data.iloc[:2968, 1:-1])
y_train = np.array(data.iloc[:2968, [-1]])

#To use validation_split=0.2 instead, x_train was widened to include the validation rows:
# x_train = np.array(data.iloc[:3293, 1:-1])
# y_train = np.array(data.iloc[:3293, [-1]])

x_valid = np.array(data.iloc[2968:3293, 1:-1])
y_valid = np.array(data.iloc[2968:3293, [-1]])

x_test = np.array(data.iloc[3293:, 1:-1])
y_test = np.array(data.iloc[3293:, [-1]])

print('x_train.shape= ',x_train.shape, 'y_train.shape= ', y_train.shape)
print('x_test.shape= ',x_test.shape, 'y_test.shape= ', y_test.shape)

#Build the model
model = Sequential()
model.add(tf.keras.layers.Dense(1, input_shape=(x_train.shape[1],), activation='sigmoid'))  #input layer = hidden layer 1: 3 input features, 1 node
model.add(Dropout(0.5))

#Add hidden layer 2
model.add(tf.keras.layers.Dense(1, input_shape=(x_train.shape[1],), activation='sigmoid'))  #hidden layer 2: 1 node (input_shape is ignored on non-first layers)
model.add(Dropout(0.5))
#Hidden layer 3 (currently disabled)
#model.add(tf.keras.layers.Dense(1, input_shape=(x_train.shape[1],), activation='sigmoid'))

#Output layer
model.add(tf.keras.layers.Dense(1, activation='linear'))  #1 output node


#Compile the model
model.compile(tf.keras.optimizers.SGD(learning_rate=0.001), loss='mse')


model.summary()

#Train the model (5000 epochs)
hist = model.fit(x_train, y_train, epochs=5000, validation_data=(x_valid, y_valid))

#Evaluate the model
# loss = model.evaluate(x_test, y_test)
# print('loss= ', loss)
#model.evaluate(x_train, y_train)

#Predict
predict_val = model.predict(x_test)
print(predict_val)

The output of this code is:

Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                (None, 1)                 4         
_________________________________________________________________
dropout (Dropout)            (None, 1)                 0         
_________________________________________________________________
dense_1 (Dense)              (None, 1)                 2         
_________________________________________________________________
dropout_1 (Dropout)          (None, 1)                 0         
_________________________________________________________________
dense_2 (Dense)              (None, 1)                 2         
=================================================================
Total params: 8
Trainable params: 8
Non-trainable params: 0
_________________________________________________________________
Epoch 1/5000
93/93 [==============================] - 0s 2ms/step - loss: 438.9651 - val_loss: 326.8637
Epoch 2/5000
93/93 [==============================] - 0s 672us/step - loss: 283.4008 - val_loss: 201.4716
Epoch 3/5000
93/93 [==============================] - 0s 672us/step - loss: 170.4947 - val_loss: 107.5530
Epoch 4/5000
93/93 [==============================] - 0s 693us/step - loss: 98.0244 - val_loss: 55.6279
Epoch 5/5000
93/93 [==============================] - 0s 613us/step - loss: 65.7289 - val_loss: 30.6671
Epoch 6/5000
93/93 [==============================] - 0s 529us/step - loss: 48.3818 - val_loss: 18.2332
Epoch 7/5000
93/93 [==============================] - 0s 672us/step - loss: 38.8079 - val_loss: 11.6329
Epoch 8/5000
93/93 [==============================] - 0s 672us/step - loss: 31.3868 - val_loss: 8.0862
Epoch 9/5000
93/93 [==============================] - 0s 672us/step - loss: 28.1317 - val_loss: 5.7093
Epoch 10/5000
93/93 [==============================] - 0s 672us/step - loss: 24.3784 - val_loss: 4.1870
Epoch 11/5000
93/93 [==============================] - 0s 672us/step - loss: 20.7344 - val_loss: 3.2152
Epoch 12/5000
93/93 [==============================] - 0s 672us/step - loss: 18.0395 - val_loss: 2.5161
Epoch 13/5000
93/93 [==============================] - 0s 504us/step - loss: 15.3073 - val_loss: 2.0311
Epoch 14/5000
93/93 [==============================] - 0s 672us/step - loss: 13.0903 - val_loss: 1.6734
Epoch 15/5000
93/93 [==============================] - 0s 672us/step - loss: 11.8138 - val_loss: 1.3088
Epoch 16/5000
93/93 [==============================] - 0s 672us/step - loss: 10.4758 - val_loss: 0.9973
Epoch 17/5000
93/93 [==============================] - 0s 672us/step - loss: 8.6119 - val_loss: 0.8217
Epoch 18/5000
93/93 [==============================] - 0s 672us/step - loss: 7.6953 - val_loss: 0.6511
Epoch 19/5000
93/93 [==============================] - 0s 672us/step - loss: 6.7097 - val_loss: 0.5196
Epoch 20/5000
93/93 [==============================] - 0s 672us/step - loss: 5.9342 - val_loss: 0.4080
Epoch 21/5000
93/93 [==============================] - 0s 636us/step - loss: 5.2448 - val_loss: 0.3121
Epoch 22/5000
93/93 [==============================] - 0s 613us/step - loss: 4.6371 - val_loss: 0.2399
Epoch 23/5000
93/93 [==============================] - 0s 634us/step - loss: 3.9841 - val_loss: 0.1931
Epoch 24/5000
93/93 [==============================] - 0s 645us/step - loss: 3.5638 - val_loss: 0.1550
Epoch 25/5000
93/93 [==============================] - 0s 504us/step - loss: 3.1868 - val_loss: 0.1274
Epoch 26/5000
93/93 [==============================] - 0s 672us/step - loss: 2.7959 - val_loss: 0.1099
Epoch 27/5000
93/93 [==============================] - 0s 672us/step - loss: 2.4933 - val_loss: 0.0998
Epoch 28/5000
93/93 [==============================] - 0s 672us/step - loss: 2.2309 - val_loss: 0.0964
Epoch 29/5000
93/93 [==============================] - 0s 672us/step - loss: 2.0461 - val_loss: 0.0985
Epoch 30/5000
93/93 [==============================] - 0s 672us/step - loss: 1.7789 - val_loss: 0.1048
Epoch 31/5000
93/93 [==============================] - 0s 504us/step - loss: 1.5659 - val_loss: 0.1123
Epoch 32/5000
93/93 [==============================] - 0s 504us/step - loss: 1.4383 - val_loss: 0.1237
Epoch 33/5000
93/93 [==============================] - 0s 672us/step - loss: 1.3180 - val_loss: 0.1366
Epoch 34/5000
93/93 [==============================] - 0s 672us/step - loss: 1.1840 - val_loss: 0.1507
Epoch 35/5000
93/93 [==============================] - 0s 672us/step - loss: 1.0662 - val_loss: 0.1658
Epoch 36/5000
93/93 [==============================] - 0s 672us/step - loss: 0.9697 - val_loss: 0.1793
Epoch 37/5000
93/93 [==============================] - 0s 504us/step - loss: 0.9103 - val_loss: 0.1948
Epoch 38/5000
93/93 [==============================] - 0s 790us/step - loss: 0.8118 - val_loss: 0.2139
Epoch 39/5000
93/93 [==============================] - 0s 645us/step - loss: 0.7339 - val_loss: 0.2254
Epoch 40/5000
93/93 [==============================] - 0s 645us/step - loss: 0.7010 - val_loss: 0.2406
Epoch 41/5000
93/93 [==============================] - 0s 621us/step - loss: 0.6372 - val_loss: 0.2554
Epoch 42/5000
93/93 [==============================] - 0s 672us/step - loss: 0.5877 - val_loss: 0.2682
Epoch 43/5000
93/93 [==============================] - 0s 504us/step - loss: 0.5491 - val_loss: 0.2830
Epoch 44/5000
93/93 [==============================] - 0s 672us/step - loss: 0.5140 - val_loss: 0.2963
Epoch 45/5000
93/93 [==============================] - 0s 672us/step - loss: 0.4654 - val_loss: 0.3053
Epoch 46/5000
93/93 [==============================] - 0s 672us/step - loss: 0.4516 - val_loss: 0.3154
Epoch 47/5000
93/93 [==============================] - 0s 672us/step - loss: 0.4140 - val_loss: 0.3258
Epoch 48/5000
93/93 [==============================] - 0s 504us/step - loss: 0.4060 - val_loss: 0.3372
Epoch 49/5000
93/93 [==============================] - 0s 672us/step - loss: 0.3826 - val_loss: 0.3481
Epoch 50/5000
93/93 [==============================] - 0s 672us/step - loss: 0.3716 - val_loss: 0.3546
Epoch 51/5000
93/93 [==============================] - 0s 672us/step - loss: 0.3497 - val_loss: 0.3612
Epoch 52/5000
93/93 [==============================] - 0s 672us/step - loss: 0.3336 - val_loss: 0.3679
Epoch 53/5000
93/93 [==============================] - 0s 504us/step - loss: 0.3114 - val_loss: 0.3783
Epoch 54/5000
93/93 [==============================] - 0s 819us/step - loss: 0.2944 - val_loss: 0.3843
Epoch 55/5000
93/93 [==============================] - 0s 613us/step - loss: 0.2869 - val_loss: 0.3929
Epoch 56/5000
93/93 [==============================] - 0s 645us/step - loss: 0.2819 - val_loss: 0.3996
Epoch 57/5000
93/93 [==============================] - 0s 667us/step - loss: 0.2685 - val_loss: 0.4036
Epoch 58/5000
93/93 [==============================] - 0s 645us/step - loss: 0.2602 - val_loss: 0.4109
Epoch 59/5000
93/93 [==============================] - 0s 623us/step - loss: 0.2519 - val_loss: 0.4178
Epoch 60/5000
93/93 [==============================] - 0s 504us/step - loss: 0.2400 - val_loss: 0.4235
Epoch 61/5000
93/93 [==============================] - 0s 672us/step - loss: 0.2281 - val_loss: 0.4286
Epoch 62/5000
93/93 [==============================] - 0s 672us/step - loss: 0.2272 - val_loss: 0.4321
Epoch 63/5000
93/93 [==============================] - 0s 672us/step - loss: 0.2151 - val_loss: 0.4361
Epoch 64/5000
93/93 [==============================] - 0s 672us/step - loss: 0.2082 - val_loss: 0.4416
Epoch 65/5000
93/93 [==============================] - 0s 672us/step - loss: 0.2055 - val_loss: 0.4450
Epoch 66/5000
93/93 [==============================] - 0s 504us/step - loss: 0.2026 - val_loss: 0.4476
Epoch 67/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1969 - val_loss: 0.4515
Epoch 68/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1945 - val_loss: 0.4539
Epoch 69/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1841 - val_loss: 0.4585
Epoch 70/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1839 - val_loss: 0.4599
Epoch 71/5000
93/93 [==============================] - 0s 724us/step - loss: 0.1819 - val_loss: 0.4628
Epoch 72/5000
93/93 [==============================] - 0s 602us/step - loss: 0.1755 - val_loss: 0.4656
Epoch 73/5000
93/93 [==============================] - 0s 602us/step - loss: 0.1779 - val_loss: 0.4684
Epoch 74/5000
93/93 [==============================] - 0s 677us/step - loss: 0.1686 - val_loss: 0.4713
Epoch 75/5000
93/93 [==============================] - 0s 624us/step - loss: 0.1696 - val_loss: 0.4718
Epoch 76/5000
93/93 [==============================] - 0s 442us/step - loss: 0.1629 - val_loss: 0.4736
Epoch 77/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1626 - val_loss: 0.4736
Epoch 78/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1605 - val_loss: 0.4758
Epoch 79/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1565 - val_loss: 0.4773
Epoch 80/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1566 - val_loss: 0.4791
Epoch 81/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1518 - val_loss: 0.4805
Epoch 82/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1524 - val_loss: 0.4825
Epoch 83/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1473 - val_loss: 0.4827
Epoch 84/5000
93/93 [==============================] - 0s 504us/step - loss: 0.1500 - val_loss: 0.4839
Epoch 85/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1476 - val_loss: 0.4837
Epoch 86/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1454 - val_loss: 0.4848
Epoch 87/5000
93/93 [==============================] - 0s 744us/step - loss: 0.1436 - val_loss: 0.4880
Epoch 88/5000
93/93 [==============================] - 0s 552us/step - loss: 0.1452 - val_loss: 0.4892
Epoch 89/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1406 - val_loss: 0.4900
Epoch 90/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1376 - val_loss: 0.4915
Epoch 91/5000
93/93 [==============================] - 0s 504us/step - loss: 0.1407 - val_loss: 0.4911
Epoch 92/5000
93/93 [==============================] - 0s 504us/step - loss: 0.1360 - val_loss: 0.4914
Epoch 93/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1355 - val_loss: 0.4926
Epoch 94/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1377 - val_loss: 0.4933
Epoch 95/5000
93/93 [==============================] - 0s 504us/step - loss: 0.1360 - val_loss: 0.4945
Epoch 96/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1344 - val_loss: 0.4951
Epoch 97/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1318 - val_loss: 0.4956
Epoch 98/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1335 - val_loss: 0.4961
Epoch 99/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1309 - val_loss: 0.4973
Epoch 100/5000
93/93 [==============================] - 0s 504us/step - loss: 0.1321 - val_loss: 0.4979
Epoch 101/5000
93/93 [==============================] - 0s 840us/step - loss: 0.1320 - val_loss: 0.4983
Epoch 102/5000
93/93 [==============================] - 0s 504us/step - loss: 0.1271 - val_loss: 0.4997
Epoch 103/5000
93/93 [==============================] - 0s 504us/step - loss: 0.1296 - val_loss: 0.5004
Epoch 104/5000
93/93 [==============================] - 0s 755us/step - loss: 0.1305 - val_loss: 0.5009
Epoch 105/5000
93/93 [==============================] - 0s 627us/step - loss: 0.1283 - val_loss: 0.5014
Epoch 106/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1295 - val_loss: 0.5018
Epoch 107/5000
93/93 [==============================] - 0s 504us/step - loss: 0.1256 - val_loss: 0.5035
Epoch 108/5000
93/93 [==============================] - 0s 504us/step - loss: 0.1260 - val_loss: 0.5046
Epoch 109/5000
93/93 [==============================] - 0s 504us/step - loss: 0.1265 - val_loss: 0.5051
Epoch 110/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1258 - val_loss: 0.5058
Epoch 111/5000
93/93 [==============================] - 0s 714us/step - loss: 0.1260 - val_loss: 0.5056
Epoch 112/5000
93/93 [==============================] - 0s 593us/step - loss: 0.1259 - val_loss: 0.5065
Epoch 113/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1263 - val_loss: 0.5076
... (epochs 114-246 omitted; loss drifts from ~0.125 down to ~0.118 while val_loss creeps up from ~0.508 to ~0.524) ...
Epoch 247/5000
93/93 [==============================] - 0s 672us/step - loss: 0.1183 - val_loss: 0.5237

As shown above, the loss stops decreasing partway through training (a quick way to visualize this is sketched below).
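For reference, plotting the two curves in hist.history makes the pattern easier to see: in the log above, val_loss bottoms out around epoch 28 and then climbs while the training loss keeps falling. A minimal sketch, assuming matplotlib is installed:

import matplotlib.pyplot as plt

# hist is the History object returned by model.fit above
plt.plot(hist.history['loss'], label='train loss')
plt.plot(hist.history['val_loss'], label='val loss')
plt.xlabel('epoch')
plt.ylabel('MSE loss')
plt.legend()
plt.show()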

  1. I thought this was caused by overfitting, so I added Dropout, but nothing changed. Is there anything else I can try? (One standard option is sketched after this list.)

  2. To check the MSE between the predictions (predict_val) and the ground truth (y_test), I added mse = model.evaluate(y_test, predict_val); print('mse= ', mse) at the end, but it raises an error. It seems I need a function other than evaluate — which one should I use? (See the sketch after this list.)
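On question 1, one commonly used option (a hedged sketch, not part of the original code; patience=20 is an arbitrary choice) is to let Keras stop training automatically once val_loss stops improving, via the EarlyStopping callback:

from tensorflow.keras.callbacks import EarlyStopping

# stop once val_loss has not improved for 20 epochs, keeping the best weights seen
early_stop = EarlyStopping(monitor='val_loss', patience=20, restore_best_weights=True)
hist = model.fit(x_train, y_train, epochs=5000,
                 validation_data=(x_valid, y_valid),
                 callbacks=[early_stop])

On question 2, evaluate() runs the model itself, so it takes inputs and targets in that order: model.evaluate(x_test, y_test). To compare already-computed predictions against the ground truth, plain NumPy is enough; a minimal sketch:

# MSE via evaluate (the model recomputes the predictions internally)
test_mse = model.evaluate(x_test, y_test)
print('test mse =', test_mse)

# equivalent MSE computed directly from predict_val
mse = np.mean((predict_val - y_test) ** 2)
print('mse =', mse)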
