Abstract:
Automatic image classification is one of the challenging problems of the recent era. Several diverse research disciplines have converged on this important theme in search of powerful solutions. In this paper we implement and test the Neural Network Backpropagation algorithm on a large image dataset. The Backpropagation algorithm is a powerful technique that works efficiently on numeric and large datasets. We also compare the performance of our algorithm with existing machine learning algorithms such as the naive Bayes classifier, decision trees, lazy classifiers, and others. We found Backpropagation to be one of the best solutions for this problem; it gives 97.02% accuracy.
Keywords:
Image Classification; Neural Network; Machine Learning; Backpropagation Algorithm
1. INTRODUCTION
With the increasing use of electronic devices such as laptops, iPads, and smartphones, we are drowning in data. Image classification, or image categorization, has been intensively studied in several fields such as image processing, computer vision, pattern recognition, machine learning, and data mining. Many techniques are available in the literature to categorize or classify images, such as minimum distance classifiers, the parallelepiped classifier, maximum-likelihood classification, fuzzy classifiers, and many others. Initially, global features like intensity histograms, texture features, edge information, and corner information were used to describe and classify images [3, 4, 6, 10]. The main objective of this research is to improve the accuracy of the classification process by applying knowledge extracted from the spatial image data using an Artificial Neural Network. In the present study we used 5 kinds of images: building, person, shoes, car, and flower, as shown in Figure 2.
Let D be the training data set with n attributes (columns) $A_1, A_2, \ldots, A_n$ and |D| rows. Let C be the list of class labels; specific values of the attributes $A_i$ map to a class label in C. For a new test data set T, we extract the features $A_1, A_2, \ldots, A_n$ and predict the corresponding class label.
JOURNAL OF COMPUTER SCIENCE AND SOFTWARE APPLICATION
Figure 1. Classification system diagram: discriminant functions perform some computation on the input feature vector x using some technique K learned from training and pass the results on for classification.
The remaining part of the paper is organized as follows. In Section 2 the related work is elaborated. In Section 3 we discuss feature extraction. The Backpropagation Neural Network algorithm is elaborated in Section 4. The experimental work is described in Section 5. The paper is concluded in Section 6.
2. RELATED WORK
Image Classification using Backpropagation Algorithm
Table 1. Characterization of classifiers according to image features, recognition stage, feature representations, class models, classification algorithms, and learning mechanisms
3. FEATURE EXTRACTION
Feature vector (x): $x = (k_1, k_2, k_3, \ldots, k_n)$. The feature (or measurement) vector locates a pixel in feature (or measurement) space.
Classification is a procedure for sorting pixels and assigning them to specific categories, where x is the feature (or measurement) vector and $d_i(x)$ is the discriminant function for class i.
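To make the roles of x and $d_i(x)$ concrete, the sketch below implements a minimum distance classifier (one of the classical techniques named in the introduction) as a discriminant function: each class gets a prototype mean, and $d_i(x)$ is the negative squared distance to it. This is an illustrative NumPy sketch, not the paper's implementation; the function names are ours.

```python
import numpy as np

def train_minimum_distance(X, y):
    """Learn one prototype (class mean) per label from training feature vectors."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def discriminant(x, mean):
    """d_i(x): negative squared Euclidean distance to the class prototype."""
    return -np.sum((x - mean) ** 2)

def classify(x, means):
    """Assign x to the class whose discriminant value is largest."""
    return max(means, key=lambda c: discriminant(x, means[c]))

# Two toy classes in a 2-D feature space
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
means = train_minimum_distance(X, y)
print(classify(np.array([0.1, 0.0]), means))
```

The same train/discriminate/decide structure carries over when the discriminant computation K is replaced by a trained neural network.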
The extraction of efficient features is the fundamental step in image classification. Image features can be classified as texture features, shape features, statistical features of pixels, and transform coefficient features. Haralick et al. [11] describe 14 statistical features, some of which are given below.
Texture features can be described by the statistical moments of the intensity histogram of an image. Let $p(z_i)$, $i = 0, 1, 2, \ldots, L-1$, be the corresponding histogram, where $L$ is the number of distinct intensity levels.
Mean
$$ m = \sum_{i=0}^{L-1} z_i \, p(z_i) \tag{1} $$
Variance
$$ \sigma^2 = \sum_{i=0}^{L-1} (z_i - m)^2 \, p(z_i) \tag{2} $$
Skewness
$$ \mu_3 = \frac{1}{\sigma^3} \sum_{i=0}^{L-1} (z_i - m)^3 \, p(z_i) \tag{3} $$
Kurtosis
$$ \mu_4 = \frac{1}{\sigma^4} \sum_{i=0}^{L-1} (z_i - m)^4 \, p(z_i) - 3 \tag{4} $$
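The four histogram moments can be computed in a few lines. The paper's implementation is in MATLAB; this NumPy sketch (the function name `histogram_moments` is ours) only restates Eqs. (1)-(4) for an 8-bit grayscale image and assumes the image is not constant (otherwise $\sigma = 0$).

```python
import numpy as np

def histogram_moments(image, L=256):
    """Mean, variance, skewness and kurtosis of the intensity histogram, Eqs. (1)-(4)."""
    hist = np.bincount(image.ravel(), minlength=L).astype(float)
    p = hist / hist.sum()                             # normalized histogram p(z_i)
    z = np.arange(L, dtype=float)                     # intensity levels z_i
    m = np.sum(z * p)                                 # Eq. (1): mean
    var = np.sum((z - m) ** 2 * p)                    # Eq. (2): variance
    sigma = np.sqrt(var)
    skew = np.sum((z - m) ** 3 * p) / sigma ** 3      # Eq. (3): skewness
    kurt = np.sum((z - m) ** 4 * p) / sigma ** 4 - 3  # Eq. (4): (excess) kurtosis
    return m, var, skew, kurt

img = np.random.randint(0, 256, size=(64, 64))
print(histogram_moments(img))
```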
Correlation: a measure of how correlated a pixel is with its neighbour over the entire image. Let $p_{ij}$ be the normalized gray-level co-occurrence matrix over $K$ levels, with row and column means $m_r, m_c$ and standard deviations $\sigma_r, \sigma_c$.
$$ \sum_{i=0}^{K}\sum_{j=0}^{K} \frac{(i - m_r)(j - m_c)\, p_{ij}}{\sigma_r \sigma_c} \tag{6} $$
Contrast
$$ \sum_{i=0}^{K}\sum_{j=0}^{K} (i - j)^2 \, p_{ij} \tag{7} $$
Energy
$$ \sum_{i=0}^{K}\sum_{j=0}^{K} p_{ij}^2 \tag{8} $$
Homogeneity
$$ \sum_{i=0}^{K}\sum_{j=0}^{K} \frac{p_{ij}}{1 + |i - j|} \tag{9} $$
Entropy
$$ -\sum_{i=0}^{K}\sum_{j=0}^{K} p_{ij} \log_2 p_{ij} \tag{10} $$
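The co-occurrence features in Eqs. (6)-(10) can be sketched as follows. This is an illustrative NumPy version, not the paper's MATLAB code: `glcm` builds a normalized co-occurrence matrix for a single horizontal offset (real Haralick features average several offsets), and the feature names follow the standard Haralick definitions.

```python
import numpy as np

def glcm(image, levels, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one pixel offset (dx, dy)."""
    P = np.zeros((levels, levels))
    h, w = image.shape
    for r in range(h - dy):
        for c in range(w - dx):
            P[image[r, c], image[r + dy, c + dx]] += 1
    return P / P.sum()

def glcm_features(P):
    """Correlation, contrast, energy, homogeneity and entropy of a GLCM, Eqs. (6)-(10)."""
    K = P.shape[0]
    i, j = np.indices((K, K))
    mr, mc = np.sum(i * P), np.sum(j * P)       # row and column means
    sr = np.sqrt(np.sum((i - mr) ** 2 * P))     # row standard deviation
    sc = np.sqrt(np.sum((j - mc) ** 2 * P))     # column standard deviation
    corr = np.sum((i - mr) * (j - mc) * P) / (sr * sc)
    contrast = np.sum((i - j) ** 2 * P)
    energy = np.sum(P ** 2)
    homogeneity = np.sum(P / (1 + np.abs(i - j)))
    entropy = -np.sum(P[P > 0] * np.log2(P[P > 0]))
    return corr, contrast, energy, homogeneity, entropy

img = np.array([[0, 1], [1, 0]])
print(glcm_features(glcm(img, levels=2)))
```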
All the selected features were stored in the form of a two-dimensional matrix: the features of each image are stored as a vector $(x_1, x_2, x_3, \ldots, x_n)$ together with the corresponding class label C.
In the present study we implement the backpropagation Neural Network algorithm for classifying the image dataset. The algorithm is explained in Algorithm 1 and is taken from Simon Haykin [12].
Preliminaries
Training set: $\{(\mathbf{x}_p, \mathbf{d}_p)\}$, $1 \le p \le P$
$\mathbf{x}_p = (x_{p,0}, \ldots, x_{p,N})$ // input patterns
$\mathbf{d}_p = (d_{p,0}, \ldots, d_{p,K})$ // desired output for $\mathbf{x}_p$
$\mathbf{o}_p = (o_{p,0}, \ldots, o_{p,K})$ // actual output
$e_{p,j} = |d_{p,j} - o_{p,j}|$ // error
$\delta_k$: proportional to the actual error $|d_k - o_k|$ multiplied by the derivative of the output node, i.e. $\delta_k = (d_k - o_k)\, o_k\, (1 - o_k)$
$\delta_j$: proportional to the weighted sum of the errors coming into the hidden node from the nodes in the upper layer, i.e. $\delta_j = \left(\sum_k \delta_k^{(2)} w_{k,j}^{(1)}\right) x_j^{(1)} (1 - x_j^{(1)})$
$$ \text{Error} = \sum_{p=1}^{P} \mathrm{Err}(\mathbf{d}_p, \mathbf{o}_p) \tag{11} $$
Err is a metric that measures the distance between the actual output and the expected output. In the Backpropagation algorithm, $\mathbf{x}_p = (x_{p,0}, \ldots, x_{p,N})$ is an input feature vector, and random weights $\mathbf{w}_p = (w_{p,0}, \ldots, w_{p,N})$ are initially assigned to the features. After the computation, the actual output $\mathbf{o}_p = (o_{p,0}, \ldots, o_{p,N})$ is obtained, whereas the desired output is expressed by $\mathbf{d}_p = (d_{p,0}, \ldots, d_{p,N})$. Backpropagation [12] is an iterative process: it keeps updating the weights in each iteration in order to approach the desired output. The cost function is expressed by the error, and the loop terminates when it completes the given number of epochs or when it satisfies the cost-function criterion, i.e. minimum error. The whole process is expressed in the algorithm below.
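The training loop just described (forward pass, delta rules $\delta_k$ and $\delta_j$, weight updates, stopping at the error goal or the epoch limit) can be sketched for a single hidden layer as follows. The paper's implementation uses MATLAB's neural network tooling; this NumPy sketch with our own function name `train_bpa` only illustrates the update rules on a toy problem.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bpa(X, D, hidden=10, eta=0.05, epochs=5000, goal=1e-5, seed=0):
    """One-hidden-layer backpropagation: forward pass, delta rules, weight updates."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))   # input -> hidden weights
    W2 = rng.normal(scale=0.5, size=(hidden, D.shape[1]))   # hidden -> output weights
    for _ in range(epochs):
        h = sigmoid(X @ W1)                      # hidden activations x_j
        o = sigmoid(h @ W2)                      # actual outputs o_k
        err = D - o
        if np.mean(err ** 2) <= goal:            # stop once the MSE goal is met
            break
        delta_k = err * o * (1 - o)              # output delta: (d_k - o_k) o_k (1 - o_k)
        delta_j = (delta_k @ W2.T) * h * (1 - h) # hidden delta from weighted errors
        W2 += eta * h.T @ delta_k                # gradient-descent weight updates
        W1 += eta * X.T @ delta_j
    return W1, W2

# Toy separable problem: learn logical OR of two binary inputs
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
D = np.array([[0], [1], [1], [1]], dtype=float)
W1, W2 = train_bpa(X, D, hidden=4, eta=0.5, epochs=20000)
out = sigmoid(sigmoid(X @ W1) @ W2)
```

The same loop scales to the paper's setting by widening the input layer to the feature-vector length and the output layer to the number of classes.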
The proposed neural network classifier was implemented in MATLAB (version 2012a) on a Pentium IV computer with a 3.4 GHz processor and 4 GB of RAM, and the algorithm was applied to the image dataset. Table 3 shows the network parameter structure: epochs, MSE, and the numbers of patterns used in the training and testing phases. The activation function is sigmoidal with scalar output in the range (0, 1), and it is the same for all neurons. The machine learning classifiers Lazy.IB1, Rules.NNge, Tree.RandomTree, Lazy.IBk, Rules.CART, Tree.BFTree, Tree.RandomForest, Meta.FilteredClassifier, Rules.Ridor, Tree.SimpleCart, meta.ClassificationViaRegression, meta.LogitBoost, meta.Bagging, tree.REPTree, tree.LADTree, rules.OneR, and bayes.NaiveBayesMultinomial were tested in the Weka machine learning software. The experimental results show that the Random Forest algorithm performs well. One limitation of the Weka software is that it was not able to classify our separate test data. In all cases we used k-fold cross-validation in order to obtain accurate results. The results are shown in Figure 4. We implemented the Neural Network backpropagation algorithm using MATLAB 2012a; no other applications were running while this program ran. In the BPA model we found that as the number of iterations increases, the mean square error decreases, but after a certain point it becomes constant. The BPA model takes more time than the other machine learning classifiers.
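The k-fold cross-validation procedure used above (split the data into k folds, hold out each fold once for testing, average the accuracies) can be sketched as follows. This generic NumPy helper is our own illustration, not the Weka or MATLAB implementation; `train_fn` and `predict_fn` stand in for any classifier.

```python
import numpy as np

def k_fold_accuracy(X, y, train_fn, predict_fn, k=10, seed=0):
    """Average accuracy over k folds; each fold serves once as the held-out test split."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))            # shuffle before splitting into folds
    folds = np.array_split(idx, k)
    scores = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        model = train_fn(X[train], y[train])
        pred = predict_fn(model, X[test])
        scores.append(np.mean(pred == y[test]))
    return float(np.mean(scores))

# Dummy classifier that always predicts class 0, evaluated on an all-zero-label set
always0 = lambda Xt, yt: 0
pred0 = lambda model, Xt: np.zeros(len(Xt), dtype=int)
print(k_fold_accuracy(np.zeros((20, 2)), np.zeros(20, dtype=int), always0, pred0, k=5))
```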
Parameters                  Value
Size of Training Set        1000
Size of Test Data           500
Number of Classes           5
Feature Vector Length       239
Number of Input Neurons     239
Number of Hidden Neurons    10
Learning parameter          0.05
Goal                        1e-05
5. CONCLUSION
In this paper a Backpropagation Neural Network model is implemented for an image dataset; the problem is a multiclass image classification problem. We found BPA to be one of the most suitable techniques for the categorization of images because all the feature vectors are available in numerical form. In our experiments we found that the classifier accuracy also depends on the hidden layer size and the number of epochs. We also experimented with other machine learning classifiers in Weka on the same dataset. The algorithm fails if an image contains two class values simultaneously, such as a building with flowers. We will extend our algorithm to handle this type of image.
ACKNOWLEDGEMENTS
We are very thankful to the Indian Institute of Science, Bangalore, for providing the image data. We are also thankful to the Weka project for providing the machine learning code used in our implementation.
References
[1] Ery Arias-Castro and David L. Donoho. Does Median Filtering Truly Preserve Edges Better than Linear Filtering? The Annals of Statistics, 2009, Vol. 37, No. 3, 1172-1206. DOI: 10.1214/08-A.
[2] Bagdanov, A.D., Worring, M.: Fine-grained document genre classification using first order random graphs. In: Proceedings of the 6th International Conference on Document Analysis and Recognition,