
JOURNAL OF COMPUTER SCIENCE AND SOFTWARE APPLICATION

Volume 1, Number 2, December 2014


Image Classification using Backpropagation Algorithm
Kesari Verma 1*, Ligendra Kumar Verma 2, Priyanka Tripathi 1
1 Computer Applications, NIT Raipur, India.
2 Department of Computer Science, NIT Raipur, India.
*Corresponding author: keshriverma@gmail.com

Abstract:
Automatic image classification is one of the challenging problems of the recent era. Several
diverse research disciplines have converged on this important theme in search of powerful
solutions. In this paper we implement and test the neural network backpropagation algorithm
on a large image dataset. The backpropagation algorithm is a powerful technique that works
efficiently on large numeric datasets. We also compare the performance of our algorithm with
existing machine learning algorithms such as the naive Bayes classifier, decision trees, lazy
classifiers and others. We found backpropagation to be one of the best solutions to this problem,
giving 97.02% accuracy.
Keywords:
Image Classification; Neural Network; Machine Learning; Backpropagation Algorithm

1. INTRODUCTION

With the increasing use of electronic devices such as laptops, iPads and smartphones, we are drowning in data. Image
classification, or image categorization, has been intensively studied in several fields such as image processing,
computer vision, pattern recognition, machine learning, and data mining. Many techniques are available in the
literature to categorize or classify images, such as minimum distance classifiers, the parallelepiped
classifier, maximum-likelihood classification, fuzzy classifiers and many others. Initially,
global features like intensity histograms, texture features, edge information and corner information were
used to describe and classify images [3, 4, 6, 10]. The main objective of this research is to improve the
accuracy of the classification process by applying knowledge extracted from spatial image data using
an Artificial Neural Network. In the present study we use 5 kinds of images, namely building, person, shoes,
car and flower, as shown in Figure 2.

1.1 Problem Definition

Let D be the training data set with n attributes (columns) A1, A2, ..., An and |D| rows. Let C be the list of
class labels; specific values of the attributes Ai determine the class. For a new test data set T, extract the features A1, A2, ..., An,

identify the discriminating class, and assign the instance to a class in C.


Discriminant function di(x): a function of the measurement vector x which provides a measure of the
probability that x belongs to the i-th class. A distinct discriminant function is defined for each class under
consideration, as shown in Figure 1.

Figure 1. Classification system diagram: discriminant functions perform computations on the input feature vector
x using a technique K obtained from training and pass the results on for classification.
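
To make this formulation concrete, the following is a minimal Python sketch of a discriminant-function classifier; the linear form of di(x) and all names in it are illustrative assumptions, not the classifier developed later in this paper.

```python
import numpy as np

def classify(x, discriminants):
    """Assign x to the class whose discriminant function scores highest."""
    scores = [d(x) for d in discriminants]
    return int(np.argmax(scores))

# Hypothetical linear discriminants d_i(x) = w_i . x + b_i, one per class.
rng = np.random.default_rng(0)
n_features, n_classes = 4, 3
W = rng.normal(size=(n_classes, n_features))
b = rng.normal(size=n_classes)
discriminants = [lambda x, w=w, bi=bi: float(w @ x + bi) for w, bi in zip(W, b)]

x = rng.normal(size=n_features)   # a feature vector for one image
print("predicted class:", classify(x, discriminants))
```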

The remaining part of the paper is organized as follows. Section 2 reviews the related work. Section 3
discusses feature extraction and elaborates the backpropagation neural network algorithm. The experimental
work is described in Section 4, and the paper is concluded in Section 5. The model used in the present
system is shown in Figure 3.

Figure 2. Different classes of images

2. RELATED WORK

The related work done in this area is summarized in Table 1.


Figure 3. Taxonomy of Model

Author                    | Document features                                                                            | Class model and classification algorithm                                                                   | Learning mechanism
Bagdanov and Worring [2]  | Various image features, including global image features, zone features and a text histogram | A variety of statistical classifiers (such as 1-NN, Nearest Mean, Linear Discriminant, Parzen classifier) | Statistical classifiers
Hu et al. [7]             | Block information of the segmented document                                                 | Hidden Markov Model (HMM)                                                                                  | Learn the probabilities of the HMM (manually defined model topology)
Héroux et al. [8]         | Image features before block segmentation; various levels of pixel densities in a form       | K Nearest Neighbour (KNN)                                                                                  | Automatically populate the NN space and learn the weights for the NN distance computation
Ciaran et al. [5]         | Histogram and spatiogram                                                                     | Histogram                                                                                                  | Based on the intensity distribution of pixels at gray level
Sangkyum Kim et al. [13]  | Sequential patterns of images                                                                | DisIClass: Discriminative Frequent Pattern-Based Image Classification                                      | SPM+B2S and DisIClass
Chapelle et al. [4]       | Histogram                                                                                    | Histogram for Support Vector Machine                                                                       | Machine learning

Table 1. Characterization of classifiers according to image features and recognition stage, feature representations,
class models, classification algorithms, and learning mechanisms


3. FEATURE EXTRACTION

Measurement space: the measurement space can be thought of as an n-dimensional scatterplot
whose axes represent the gray values in the original bands. If some or all of the original bands are replaced
with other variables, it is called feature space.

Feature vector: the feature (or measurement) vector x = (k1, k2, k3, ..., kn) locates a
pixel in feature (or measurement) space.

Classification is a procedure for sorting pixels and assigning them to specific categories, where x is the feature (or
measurement) vector and di(x) is the discriminant function.
The extraction of effective features is the fundamental step in image classification. Image features
can be classified as texture features, shape features, statistical features of pixels, and transform coefficient features.
Haralick et al. [11] describe 14 statistical texture features; the features used in this work are described below.

Texture features can be described by the statistical moments of the intensity histogram of the image. Let
p(z_i), i = 0, 1, 2, ..., L-1, be the corresponding normalized histogram, where z_i denotes an intensity value and L is the
number of distinct intensity levels.

Mean

m = \sum_{i=0}^{L-1} z_i \, p(z_i)   (1)

Variance

\sigma^2 = \sum_{i=0}^{L-1} (z_i - m)^2 \, p(z_i)   (2)

Skewness

\mu_3 = \frac{1}{\sigma^3} \sum_{i=0}^{L-1} (z_i - m)^3 \, p(z_i)   (3)

Kurtosis

\mu_4 = \frac{1}{\sigma^4} \sum_{i=0}^{L-1} (z_i - m)^4 \, p(z_i) - 3   (4)

The remaining features are computed from the normalized gray-level co-occurrence matrix p_{ij}, i, j = 1, ..., K,
where K is the number of gray levels.

Maximum probability: measures the strongest response, i.e. the most probable gray-level pair.

\max_{i,j} (p_{ij})   (5)

Correlation: a measure of how correlated a pixel is with its neighbour over the entire image.

\sum_{i=1}^{K} \sum_{j=1}^{K} \frac{(i - m_r)(j - m_c) \, p_{ij}}{\sigma_r \sigma_c}   (6)

where m_r, m_c and \sigma_r, \sigma_c are the means and standard deviations computed along the rows and columns of p_{ij}.

Contrast: a measure of intensity contrast between a pixel and its neighbour.

\sum_{i=1}^{K} \sum_{j=1}^{K} (i - j)^2 \, p_{ij}   (7)

Uniformity: a measure of uniformity in the image; uniformity is 1 for a constant image.

\sum_{i=1}^{K} \sum_{j=1}^{K} p_{ij}^2   (8)

Homogeneity: a measure of the spatial closeness of the distribution of the elements of p_{ij} to its diagonal.

\sum_{i=1}^{K} \sum_{j=1}^{K} \frac{p_{ij}}{1 + |i - j|}   (9)

Entropy: a measure of the randomness of the elements of p_{ij}.

-\sum_{i=1}^{K} \sum_{j=1}^{K} p_{ij} \log_2 p_{ij}   (10)

All these selected features were stored in the form of a two-dimensional matrix: for each image, the features are
stored as a vector (x1, x2, x3, ..., xn) together with the corresponding class label C.
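
As an illustration of how these features might be computed, a minimal NumPy sketch follows; it is not the authors' MATLAB code, and the quantization to 8 gray levels and the horizontal neighbour offset for the co-occurrence matrix are assumptions made for brevity.

```python
import numpy as np

def histogram_moments(gray, levels=256):
    """Histogram statistics (Eqs. 1-4) of a grayscale image given as a 2-D uint8 array."""
    hist = np.bincount(gray.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()                           # normalized histogram p(z_i)
    z = np.arange(levels, dtype=float)
    m = np.sum(z * p)                               # mean
    var = np.sum((z - m) ** 2 * p)                  # variance
    sigma = np.sqrt(var)
    skew = np.sum((z - m) ** 3 * p) / sigma ** 3    # skewness
    kurt = np.sum((z - m) ** 4 * p) / sigma ** 4 - 3  # kurtosis
    return m, var, skew, kurt

def cooccurrence_features(gray, levels=8):
    """Co-occurrence features (Eqs. 5-10) for the horizontal neighbour at distance 1."""
    q = (gray.astype(int) * levels) // 256          # quantize to 'levels' gray levels
    P = np.zeros((levels, levels))
    np.add.at(P, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)
    P /= P.sum()                                    # normalized GLCM p_ij
    i, j = np.indices(P.shape)
    mr, mc = (i * P).sum(), (j * P).sum()
    sr = np.sqrt((((i - mr) ** 2) * P).sum())
    sc = np.sqrt((((j - mc) ** 2) * P).sum())
    return {
        "max_probability": P.max(),
        "correlation": (((i - mr) * (j - mc) * P).sum()) / (sr * sc),
        "contrast": (((i - j) ** 2) * P).sum(),
        "uniformity": (P ** 2).sum(),
        "homogeneity": (P / (1.0 + np.abs(i - j))).sum(),
        "entropy": -(P[P > 0] * np.log2(P[P > 0])).sum(),
    }

gray = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in image
print(histogram_moments(gray))
print(cooccurrence_features(gray))
```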

3.1 Backpropagation Algorithm

In the present study we implement the backpropagation neural network algorithm for classifying the
image dataset. The algorithm, following Simon Haykin [12], is given in Algorithm 1.

Preliminaries

Training set: (x_p, d_p), 1 \le p \le P

P = number of input patterns

x_p = (x_{p,0}, ..., x_{p,N})  // input pattern

d_p = (d_{p,0}, ..., d_{p,K})  // desired output for x_p

N = input space dimension

K = number of output neurons

o_p = (o_{p,0}, ..., o_{p,K})  // actual output

e_{p,j} = |d_{p,j} - o_{p,j}|  // error

x_{p,i}: value at the i-th input node

x_{p,j}^{(i)}: output of the j-th node of the i-th layer for pattern x_p

w_{k,j}^{(i+1,i)}: weight from the j-th node in the i-th layer to the k-th node of the (i+1)-th layer

net_j^{(1)} = \sum_{i=0}^{N} w_{j,i}^{(1,0)} x_{p,i}  // net input of the j-th node in the hidden layer

x_{p,j}^{(1)} = \phi\left(\sum_{i=0}^{N} w_{j,i}^{(1,0)} x_{p,i}\right)  // output of the j-th node in the hidden layer

net_k^{(2)} = \sum_{j} w_{k,j}^{(2,1)} x_{p,j}^{(1)}  // net input of the k-th node in the output layer

o_{p,k} = \phi\left(\sum_{j} w_{k,j}^{(2,1)} x_{p,j}^{(1)}\right)  // output of the k-th node in the output layer

Activation function: logistic sigmoid \phi(net) = 1 / (1 + e^{-net})

\delta_k: proportional to the actual error |d_k - o_k| multiplied by the derivative of the output node, i.e.
\delta_k = (d_k - o_k) \, o_k (1 - o_k)

\delta_j: proportional to the weighted sum of the errors coming into the hidden node from the nodes in the upper layer, i.e.
\delta_j = \left(\sum_k \delta_k \, w_{k,j}^{(2,1)}\right) x_j^{(1)} (1 - x_j^{(1)})

Objective (cost function): minimize the cumulative error defined in Equation 11.

Error = \sum_{p=1}^{P} Err(d_p, o_p)   (11)

Err is a metric that measures the distance between the actual output and the expected output. In the backpropagation
algorithm, x_p = (x_{p,0}, ..., x_{p,N}) is an input feature vector and a random initial weight is assigned to each
connection; after the forward computation the actual output o_p = (o_{p,0}, ..., o_{p,K}) is obtained, whereas
the desired output is expressed by d_p = (d_{p,0}, ..., d_{p,K}). Backpropagation [12] is an iterative process that
keeps updating the weights in each iteration in order to approach the desired output. The cost function is
expressed by the error, and the loop terminates when it completes the given number of epochs or when it satisfies the
cost criterion, i.e. minimum error. A small sketch of this error criterion is given below; the whole training process is
then expressed in Algorithm 1.
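
As a minimal illustration of Equation 11, the snippet below assumes squared distance as the metric Err and computes both the cumulative error and the MSE used as the stopping criterion; the array names and toy data are placeholders.

```python
import numpy as np

def cumulative_error(desired, actual):
    """Cumulative squared error over all P patterns (Eq. 11 with Err = squared distance)."""
    desired, actual = np.asarray(desired), np.asarray(actual)
    return float(np.sum((desired - actual) ** 2))

def mse(desired, actual):
    """Mean squared error, used here as the training stopping criterion."""
    desired, actual = np.asarray(desired), np.asarray(actual)
    return float(np.mean((desired - actual) ** 2))

# Toy example: 3 patterns, 5 output neurons (one per class).
rng = np.random.default_rng(0)
d = np.eye(5)[[0, 2, 4]]                            # one-hot desired outputs
o = np.clip(d + 0.1 * rng.normal(size=d.shape), 0, 1)  # hypothetical network outputs
print(cumulative_error(d, o), mse(d, o))
```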

Algorithm 1 Backpropagation Algorithm

Start with initial random weights w;
repeat
    for each input pattern x_p do
        { Propagate x_p (forward pass) }
        from the first hidden layer to the output layer do
            Compute hidden node activations: net_{p,j}^{(1)}
            Compute hidden node outputs: x_{p,j}^{(1)}
            Compute output node activations: net_{p,k}^{(2)}
            Compute network outputs: o_{p,k}
            Compute network error: e_{p,k} = d_{p,k} - o_{p,k}
        { Back-propagate e_p (backward pass) }
        from the output layer to the first hidden layer do
            Update the output layer weights:
                \delta_{p,k} = (d_{p,k} - o_{p,k}) \, o_{p,k} (1 - o_{p,k})
                \Delta w_{k,j}^{(2,1)} = \eta \, \delta_{p,k} \, x_{p,j}^{(1)}
            Update the hidden layer weights:
                \mu_{p,j} = \left(\sum_k \delta_{p,k} \, w_{k,j}^{(2,1)}\right) x_{p,j}^{(1)} (1 - x_{p,j}^{(1)})
                \Delta w_{j,i}^{(1,0)} = \eta \, \mu_{p,j} \, x_{p,i}
until MSE(w) is minimal

4. RESULT AND DISCUSSION

The proposed neural network classifier was implemented in MATLAB (version 2012a) on a Pentium IV
computer running at 3.4 GHz with 4 GB of RAM, and the algorithm was applied to the image data set.
Table 2 shows the network parameters and the numbers of patterns used in the training and testing phases,
while Table 3 reports the mean squared error (MSE) after different numbers of epochs. The activation
function is sigmoidal with scalar output in the range (0,1), and it is the same for all the neurons.

Figure 4. Analysis of different algorithms

All the machine learning classifiers, namely Lazy.IB1, Rules.NNge, Tree.Randomtree, Lazy.IBK,
Rules.CART, Tree.BFTree, Tree.RandomForest, Meta.FilteredClassifier, Rules.Ridor, Tree.SimpleCart,
meta.ClassificationViaRegression, meta.LogitBoost, meta.Bagging, tree.REPTree, tree.LADTree,
rules.OneR and bayes.NaiveBayesMultinomial, were run in the machine learning software Weka [15].
The experimental results show that the Random Forest algorithm performs well. A limitation of the Weka
software is that it is not able to classify the separate test data directly, so in all cases we used k-fold
cross-validation in order to obtain accurate results. The results are shown in Figure 4 and Table 4; a rough
scripted equivalent of this comparison is sketched after Table 4. We implemented the neural network
backpropagation algorithm in MATLAB 2012a, and no other applications were running while this program
was executing. In the BPA model we found that as the number of iterations increases the mean squared
error decreases, but after a certain point it becomes constant. The BPA model takes more time than the
other machine learning classifiers.

Parameters                       Value
Size of Training Set             1000
Size of Test Data                500
Number of Classes                5
Length of Feature Vector         239
Number of Input Neurons          239
Number of Hidden Layer Neurons   10
Learning parameter               0.05
Goal                             1e-05

Table 2. Parameters for the BPA

S.No.   Epochs   Error

1       1000     0.10651
2       2000     0.10023
3       3000     0.096386
4       4000     0.08768
5       5000     0.0567

Table 3. Network Parameters


Algorithm                          Correct (%)   Incorrect (%)


Lazy.IB1 96 4
Rules.NNge 96 4
Tree.Randomtree 96 4
Lazy.IBK 96 4
Rules.PART 95.8 4.2
Tree.J48 94.6 5.4
Meta.END 97.2 2.8
Tree.BFTree 90 10
Tree.RandomForest 96.4 1.6
Meta.FilteredClassifier 88.2 11.8
Rules.Ridor 81 19
Tree.SimpleCart 83.6 16.4
meta.ClassificationViaRegression 90.2 9.8
meta.LogitBoost 83.4 16.6
meta.Bagging 86.4 13.6
tree.REPTree 72.4 27.6
tree.LADTree 72.6 27.4
rules.OneR 56.4 43.
bayes.NaiveBayesMultinomial 56.8 43.2
Backpropagation 97.2 2.8

Table 4. Experimental Results
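
The Weka experiments summarized in Table 4 were run through the Weka toolkit itself; the sketch below shows a roughly equivalent k-fold comparison in Python using scikit-learn as a stand-in (not the tool used in the paper), with random placeholder data in place of the real 239-dimensional feature vectors.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import MultinomialNB

# Placeholder data standing in for the 239-dimensional feature vectors and 5 class labels.
rng = np.random.default_rng(0)
X = rng.random((500, 239))
y = rng.integers(0, 5, size=500)

classifiers = {
    "RandomForest (Tree.RandomForest)": RandomForestClassifier(n_estimators=100, random_state=0),
    "DecisionTree (Tree.Randomtree-like)": DecisionTreeClassifier(random_state=0),
    "1-NN (Lazy.IB1)": KNeighborsClassifier(n_neighbors=1),
    "NaiveBayesMultinomial": MultinomialNB(),
}

# 10-fold cross-validation accuracy, analogous to the k-fold protocol used with Weka.
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
    print(f"{name}: {scores.mean() * 100:.1f}% +/- {scores.std() * 100:.1f}")
```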

5. CONCLUSION

In this paper a backpropagation neural network model is implemented for an image dataset; the problem is a
multiclass image classification problem. We found the BPA to be one of the most suitable techniques for the
categorization of images, because all the feature vectors are available in numerical form. In the experiments
we found that the classifier accuracy also depends on the size of the hidden layer and the number of epochs.
We also experimented with other machine learning algorithms in Weka on the same dataset. The algorithm
fails if an image contains two class values simultaneously, for example a building with flowers. We will
extend our algorithm to handle this type of image.

ACKNOWLEDGEMENTS

We are very thankful to the Indian Institute of Science, Bangalore, for providing the image data. We are
also thankful to the Weka project for providing the machine learning implementations used in this work.

References

[1] Ery Arias-Castro and David L. Donoho. Does Median Filtering Truly Preserve Edges Better than
Linear Filtering? The Annals of Statistics, 2009, Vol. 37, No. 3, pp. 1172-1206. DOI: 10.1214/08-A.
[2] Bagdanov, A.D., Worring, M.: Fine-grained document genre classification using first order random
graphs. In: Proceedings of the 6th International Conference on Document Analysis and Recognition,
Seattle, USA, 10-13 September 2001, pp. 79-90 (2001).
[3] A. Vailaya, M. Figueiredo, A. Jain, and H. Zhang. Image classification for content-based indexing.
IEEE Transactions on Image Processing, 10(1):117-130, 2001.
[4] O. Chapelle, P. Haffner, and V. Vapnik. Support vector machines for histogram-based image classifi-
cation. In IEEE Transactions on Neural Networks, special issue on Support Vectors, 1999.
[5] Ciaran O Conaire, Noel E. O Connor, Alan F. Smeaton. An Improved Spatiogram Similarity Measure
for Robust Object Localisation. IEEE International Conference on Acoustics, Speech, and Signal
Processing (ICASSP), 2007.
[6] J. Huang, S. R. Kumar, and R. Zabih. An automatic hierarchical image classification scheme. In
MULTIMEDIA '98: Proceedings of the Sixth ACM International Conference on Multimedia, 1998.
[7] Hu, J., Kashi, R., Wilfong, G.: Document classification using layout analysis. In: Proceedings of the
1st International Workshop on Document Analysis and Understanding for Document Databases,
Florence, Italy, September 1999, pp. 556-560 (1999).
[8] Héroux, P., Diana, S., Ribert, A., Trupin, E.: Classification method study for automatic form class
identification. In: Proceedings of the 14th International Conference on Pattern Recognition, Brisbane,
Australia, 16-20 August 1998, pp. 926-929 (1998).
[9] Matthias Kirchner and Jessica Fridrich. On Detection of Median Filtering in Digital Images. Proc.
SPIE 7541, Media Forensics and Security II, 754110. DOI: 10.1117/12.839100.
[10] M. Szummer and R. W. Picard. Indoor-outdoor image classification. In IEEE International Workshop
on Content-based Access of Image and Video Databases, 1998.
[11] Robert M. Haralick, K. Shanmugam and Itshak Dinstein (1973). Textural Features for Image Classifi-
cation. IEEE Transactions on Systems, Man, and Cybernetics.
[12] Simon Haykin. Neural Networks: A Comprehensive Foundation. Pearson Education.
[13] Sangkyum Kim, Xin Jin, Jiawei Han. DisIClass: Discriminative Frequent Pattern-Based Image
Classification. MDMKDD '10, July 25, 2010, Washington, DC, USA.
[14] J. Z. Wang, J. Li, and G. Wiederhold. SIMPLIcity: Semantics-sensitive integrated matching for picture
libraries. In VISUAL '00: Proceedings of the 4th International Conference on Advances in Visual
Information Systems, 2000.
[15] Machine Learning Implementation using Weka. www.weka.org
