Fashion MNIST is a dataset with 60,000 training images and 10,000 test images. In this program we will be performing a CNN operation on the Fashion MNIST dataset using Keras with TensorFlow 2 as the backend.

1 Answer

Best answer

CNN in Fashion MNIST dataset using Keras



Fashion-MNIST is a dataset of Zalando's article images, consisting of a training set of 60,000 examples and a test set of 10,000 examples. Each example is a 28x28 grayscale image, and there are 10 classes in total. In this example we will build and train a model on this dataset.



Importing Libraries 

  • Firstly, import the required libraries.

  • We will import tensorflow and tf.keras, which is TensorFlow's high-level API for building and training deep learning models. The numpy and matplotlib libraries will also be imported for array operations and for plotting.

# TensorFlow and tf.keras
import tensorflow as tf
from tensorflow import keras

import numpy as np
import matplotlib.pyplot as plt



Loading Data

  • We will store the data into fashion_mnist using keras.

  • Then, we split the data into training data and testing data. The test data is used to evaluate how well the trained model generalizes.

  • In class_names, we will be providing the class names. The dataset has 10 classes.

fashion_mnist = keras.datasets.fashion_mnist
(train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()

class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
               'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']



  • Now we will view the training data shape, the length of the training labels, the training labels themselves, the test data shape, and the length of the test labels.

print(train_images.shape)
print(len(train_labels))
print(train_labels)
print(test_images.shape)
print(len(test_labels))
(60000, 28, 28)     # Train image shape
60000     # Print length of training labels 
array([9, 0, 0, ..., 3, 0, 5], dtype=uint8)     # Print training labels
(10000, 28, 28)     # Test image shape
10000     # Print length of test labels
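
The labels are integers from 0 to 9, and class_names maps each integer to a human-readable name. As a quick check (a small addition, not part of the original listing), you can look up the name of the first training label:

# Map the first training label (9, per the output above) to its class name
print(train_labels[0], "->", class_names[train_labels[0]])   # 9 -> Ankle boot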


  • Let's see what the 0th image of the training set looks like using the matplotlib library.

  • plt.figure() creates a figure object.

  • imshow() displays an image from a 2-dimensional numpy array.

  • plt.colorbar() adds a colorbar showing the scale of the pixel values.

  • Inside plt.grid we pass False so that no grid is drawn.

  • plt.show() displays the current figure.

plt.figure()
plt.imshow(train_images[0])
plt.colorbar()
plt.grid(False)
plt.show()

[Output: a Fashion MNIST ankle boot image with a colorbar showing pixel values from 0 to 255]



Pre-Processing of data

  • Now, we will pre-process the data before training.

  • The pixel values fall in the range 0 to 255. We will scale them to a range of 0 to 1 by dividing them by 255.

  • The training and testing datasets must be pre-processed in the same way.

train_images = train_images / 255.0
test_images = test_images / 255.0
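
As a quick sanity check (a minimal sketch, not part of the original tutorial), you can confirm that the pixel values now lie between 0 and 1:

# Pixel values should now be floats in [0, 1]
print(train_images.min(), train_images.max())   # expected: 0.0 1.0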



  • We will display the first 25 images from the training dataset along with their class names, in order to verify that the data is ready for training.

  • plt.figure() creates a figure object; figsize sets its size.

  • plt.subplot() places each image at the given index position on a grid with nrows rows and ncols columns (here a 5x5 grid).

  • plt.xticks([]) and plt.yticks([]) set the tick locations and labels of the x-axis and y-axis. We pass empty lists so that no ticks are drawn.

  • Inside plt.grid we pass False so that no grid is drawn.

  • imshow() displays an image from a 2-dimensional numpy array; cmap=plt.cm.binary renders it in grayscale.

  • plt.xlabel() accepts a string as an argument; here we use it to label each image with its class name.

  • plt.show() displays the current figure.

plt.figure(figsize=(10,10))
for i in range(25):
    plt.subplot(5,5,i+1)
    plt.xticks([])
    plt.yticks([])
    plt.grid(False)
    plt.imshow(train_images[i], cmap=plt.cm.binary)
    plt.xlabel(class_names[train_labels[i]])
plt.show()

[Output: a 5x5 grid of the first 25 training images labeled with their class names]



Creating the model

  • Now we will configure the layers of the model before compiling it.

  • We create a Sequential model containing all the layers.

  • The first layer, Flatten, reshapes each 28x28 input image into a one-dimensional array of 784 pixels.

  • A Dense layer is a classic fully connected neural network layer, where each input node is connected to each output node.

  • We pass relu as the activation function of the first Dense layer.

  • The first Dense layer has 128 neurons. The second Dense layer returns a logits array of length 10, one score per class.

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(10)   # outputs logits; softmax is handled by the loss (from_logits=True)
])
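
Note that although the heading mentions a CNN, the model above is a plain dense network. For reference, a minimal convolutional variant could look like the sketch below; the layer sizes are illustrative assumptions on my part, and it requires adding a channel dimension to the images (e.g. train_images[..., None]). The rest of this tutorial uses the dense model above.

# Illustrative CNN variant (not used in the rest of this tutorial).
# Conv2D expects inputs of shape (28, 28, 1), so reshape the data first:
#   train_images_cnn = train_images[..., None]
cnn_model = keras.Sequential([
    keras.layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    keras.layers.MaxPooling2D((2, 2)),
    keras.layers.Conv2D(64, (3, 3), activation='relu'),
    keras.layers.MaxPooling2D((2, 2)),
    keras.layers.Flatten(),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(10)   # logits, as in the dense model
])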



Compiling the model

  • Now we will compile the model.

  • The loss function measures how accurate the model is during training. We want to minimize this function.

  • The optimizer defines how the model is updated based on the data it sees and its loss function.

  • Metrics are used to monitor the training and testing steps. The following example uses accuracy, the fraction of the images that are correctly classified.

 

model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])
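
To inspect the layer output shapes and parameter counts before training, you can print a model summary (a small addition, not in the original listing):

# Print layer output shapes and parameter counts
model.summary()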



Training the model

  • Now we will train the model using model.fit().

  • We have passed 10 epochs. An epoch is one complete pass of the entire training dataset forward and backward through the neural network.

  • You can change the number of epochs to try to improve the accuracy.

model.fit(train_images, train_labels, epochs=10)

 Epoch 1/10
60000/60000 [==============================] - 6s 101us/sample - loss: 0.5007 - acc: 0.8248
Epoch 2/10
60000/60000 [==============================] - 6s 97us/sample - loss: 0.3725 - acc: 0.8659
Epoch 3/10
60000/60000 [==============================] - 6s 97us/sample - loss: 0.3336 - acc: 0.8787
Epoch 4/10
60000/60000 [==============================] - 6s 97us/sample - loss: 0.3095 - acc: 0.8864
Epoch 5/10
60000/60000 [==============================] - 6s 98us/sample - loss: 0.2917 - acc: 0.8915
Epoch 6/10
60000/60000 [==============================] - 6s 98us/sample - loss: 0.2789 - acc: 0.8960
Epoch 7/10
60000/60000 [==============================] - 6s 98us/sample - loss: 0.2669 - acc: 0.9011
Epoch 8/10
60000/60000 [==============================] - 6s 98us/sample - loss: 0.2547 - acc: 0.9068
Epoch 9/10
60000/60000 [==============================] - 6s 98us/sample - loss: 0.2451 - acc: 0.9089
Epoch 10/10
60000/60000 [==============================] - 6s 98us/sample - loss: 0.2369 - acc: 0.9117
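
If you want to keep the trained model for later use, you can save and reload it; a minimal sketch (the filename here is just an example):

# Save the trained model and reload it later (filename is illustrative)
model.save('fashion_mnist_model.h5')
restored_model = keras.models.load_model('fashion_mnist_model.h5')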


Test Accuracy

  • Now, evaluate the model's accuracy on the test dataset. The test accuracy (0.8824) is a little lower than the training accuracy (0.9117); this gap is a sign of slight overfitting.

test_loss, test_acc = model.evaluate(test_images, test_labels, verbose=2)
print('\nTest accuracy:', test_acc)

10000/10000 - 1s - loss: 0.3439 - acc: 0.8824

Test accuracy: 0.8824


Prediction

  • The output layer returns logits. Generally a softmax activation is applied to turn these scores into probabilities; for picking the predicted class, np.argmax works directly on the logits.

  • Here, we will predict the class of the first test image and also check its test label.

  • The model is most confident that this image is an ankle boot, or class_names[9].

predictions = model.predict(test_images)
print("Predicting the class : ", np.argmax(predictions[0]))
print("Test Label : ", test_labels[0])

Predicting the class :  9
Test Label :  9
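
Because the loss was built with from_logits=True, the raw predictions are logits rather than probabilities. If you want probabilities, one option (a sketch, assuming the model defined above) is to append a Softmax layer:

# Wrap the trained model with a Softmax layer to get class probabilities
probability_model = keras.Sequential([model, keras.layers.Softmax()])
probs = probability_model.predict(test_images)
print(np.argmax(probs[0]), probs[0].max())   # predicted class and its probability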

Prediction and Plotting First 10 images from Test Set

plt.figure(figsize=(10,10))
for i in range(10):
    plt.subplot(5,5,i+1)
    plt.xticks([])
    plt.yticks([])
    plt.grid(False)
    plt.imshow(test_images[i], cmap=plt.cm.binary)
    img = test_images[i]
    img = img.reshape(1, 28, 28)          # add a batch dimension for model.predict
    predictions = model.predict(img)
    #print(np.argmax(predictions))
    plt.xlabel(class_names[np.argmax(predictions)])
plt.show()
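
Calling model.predict once per image works, but it is slow. A more idiomatic sketch (same result, assuming the variables defined above) predicts all 10 images in a single batch:

# Predict the first 10 test images in one call instead of one image at a time
batch_predictions = model.predict(test_images[:10])
predicted_names = [class_names[np.argmax(p)] for p in batch_predictions]
print(predicted_names)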

