Convolution Neural Network (using FASHION-MNIST data)
[Code]

```python
import numpy as np
import time
import mnist_reader  # helper module from the fashion-mnist repository (see Reference)
from matplotlib import pyplot as plt
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation
from keras.layers.normalization import BatchNormalization
from keras.utils import np_utils

# fix the random seed for reproducibility
seed = 7
np.random.seed(seed)

# start the timer
start_time = time.time()

# load the Fashion-MNIST data
x_train, y_train = mnist_reader.load_mnist('data/', kind='train')
x_test, y_test = mnist_reader.load_mnist('data/', kind='t10k')

# flatten 28*28 images to a 784-vector for each image
num_pixels = 784
x_train = x_train.reshape(x_train.shape[0], num_pixels).astype('float32')
x_test = x_test.reshape(x_test.shape[0], num_pixels).astype('float32')

# normalize inputs from 0-255 to 0-1
x_train = x_train / 255
x_test = x_test / 255

# one-hot encode outputs
y_train = np_utils.to_categorical(y_train)
y_test = np_utils.to_categorical(y_test)
num_classes = y_test.shape[1]

# define a simple DNN model
def baseline_model():
    model = Sequential()
    # six fully connected blocks: Dense -> BatchNorm -> Dropout -> ReLU
    for i, units in enumerate([1000, 1500, 2000, 2500, 3000, 3000]):
        if i == 0:
            model.add(Dense(units, input_dim=num_pixels, init='normal'))
        else:
            model.add(Dense(units, init='normal'))
        model.add(BatchNormalization())
        model.add(Dropout(0.25))
        model.add(Activation('relu'))
    model.add(Dense(num_classes, init='normal', activation='softmax'))
    # compile the model
    model.compile(loss='categorical_crossentropy', optimizer='adam',
                  metrics=['accuracy'])
    return model

# build and fit the model
model = baseline_model()
hist = model.fit(x_train, y_train, validation_data=(x_test, y_test),
                 nb_epoch=12, batch_size=128, verbose=2)

# final evaluation of the model
scores = model.evaluate(x_test, y_test, verbose=0)
print("DNN Error: %.2f%%" % (100 - scores[1] * 100))

# visualize training: loss on the left axis, accuracy on the right
fig, loss_ax = plt.subplots()
acc_ax = loss_ax.twinx()

loss_ax.plot(hist.history['loss'], 'y', label='train loss')
loss_ax.plot(hist.history['val_loss'], 'r', label='val loss')
acc_ax.plot(hist.history['acc'], 'b', label='train acc')
acc_ax.plot(hist.history['val_acc'], 'g', label='val acc')

loss_ax.set_xlabel('epoch')
loss_ax.set_ylabel('loss')
acc_ax.set_ylabel('accuracy')
loss_ax.legend(loc='upper left')
acc_ax.legend(loc='lower left')
plt.show()

# elapsed time
print("--- %s seconds ---" % (time.time() - start_time))
```
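The listing above is actually a fully connected (DNN) baseline on Fashion-MNIST; the convolutional model the title refers to would replace the Dense stack with convolution and pooling layers. Below is a minimal sketch of such a variant, assuming the Keras 2 `Conv2D`/`MaxPooling2D` API; the filter counts and dense width are illustrative choices, not settings from this post.

```python
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout

def cnn_model(num_classes):
    model = Sequential()
    # two conv/pool stages extract local features from the 28x28 grayscale image
    model.add(Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Conv2D(64, (3, 3), activation='relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    # flatten the feature maps, then classify
    model.add(Flatten())
    model.add(Dense(128, activation='relu'))
    model.add(Dropout(0.25))
    model.add(Dense(num_classes, activation='softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='adam',
                  metrics=['accuracy'])
    return model

# a CNN needs image-shaped inputs rather than flat 784-vectors:
# x_train = x_train.reshape(-1, 28, 28, 1)
# x_test = x_test.reshape(-1, 28, 28, 1)
# (newer Keras releases also bundle the dataset as keras.datasets.fashion_mnist)
```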
Reference
- https://github.com/zalandoresearch/fashion-mnist
- https://keras.io