
CHAPTER-1

INTRODUCTION

1.1 OBJECTIVE

India is renowned for agriculture: a large share of its population is engaged in the
agricultural industry, which plays a significant role in the economy. Many plants are infected by
various fungal and bacterial diseases, and with the exponential growth of the population,
changing climatic conditions also contribute to plant disease. The major challenges of
sustainable development are to reduce the usage and cost of pesticides, to protect the
environment, and to increase crop quality. Precise, accurate and early diagnosis may reduce the
usage of pesticides. Data mining is the extraction of relevant information from a large pool
of resources, and advances in data mining technologies have been adopted for the prediction of
plant diseases. Rice is one of the major crops cultivated in India, and nowadays technology is
widely used for plant disease prediction. The management of perennial leaves requires a close
monitoring system, especially for the diseases that affect production and post-harvest life.

The concept of image processing combined with data mining technologies assists us in the following tasks:

i) Recognizing an infected leaf and stem

ii) Measuring the affected area

iii) Finding the shape of the infected region

iv) Determining the color of the infected region

v) Determining the size and shape of the leaf.

The user selects a particular diseased region in a leaf, and the cropped image is sent for
processing. This work intends to study the prediction of plant diseases at an early stage
using the k-means clustering algorithm. Specifically, we concentrate on predicting diseases
such as Alternaria alternata, Anthracnose, Cercospora leaf spot and bacterial blight. It would
be useful for identifying different diseases on crops.

It provides various methods used to study crop diseases/traits using image processing and data
mining. In addition, the infected area and the affected percentage are also measured. The
back-propagation concept is used for weight adjustment of the training database.

Plants are basically of two types:

 Healthy plants
 Unhealthy/diseased plants

In this work we will be focusing on four different diseases that attack plants/leaves:

 Alternaria alternata
 Anthracnose
 Bacterial blight
 Cercospora leaf spot

1.2 EXISTING SYSTEM


Leaf shape description is the key problem in leaf identification. To date, many shape
features have been extracted to describe leaf shape; however, there is no proper application
that classifies a leaf after capturing its image and identifying its attributes. In plant leaf
classification, a leaf is classified based on its different morphological features. Some of the
classification techniques used are:

 Fuzzy logic
 Principal Component Analysis
 k-Nearest Neighbor Classifier
Plant disease classification has wide application in agriculture.
ADVANTAGES
 High Accuracy
 Low complexity
 Detection and classification of images without noise
APPLICATION
 Bio-Farm
 Bio-Pesticides

AVAILABLE TECHNIQUE FOR DETECTION

 CIELAB Color space
 Grey scale Matching
 Colour Based
 Gradient Matching
 Histogram Analysis

CHAPTER-2

LITERATURE SURVEY

Feature extraction has been performed on RGB, HSV, YIQ and dithered images; feature
extraction from the RGB image is adopted in the suggested system. A new automatic method for
disease-symptom segmentation in digital photographs of plant leaves has been proposed. The
diseases of different plant species have been described, and classification is performed for a
few of the disease names. Disease recognition for the leaf image is performed in this work. A
study and analysis of cotton leaf disease detection using image processing has been carried out,
in which the k-means clustering algorithm is used for segmentation. The k-means concept is
adopted in the proposed system to divide the leaf into different clusters. A survey of disease
identification on cotton leaves has been done, and a comparison of different leaf disease
detection techniques is presented; SVM and k-means clustering have been used in these systems.
Identification of a variety of leaf diseases using various data mining techniques is a potential
research area.

The diseases of different plant species have been described, classification is performed
for a few of the disease names, and SVM is used for classification in this system. To identify
an item is to recognize it and associate it with its appropriate name: for example, the
automobile in front of a house is a Honda Accord, or a large woody plant in the park is a tree,
more specifically a Douglas-fir. Identifying a landscape or garden plant requires recognizing
the plant by one or more characteristics, such as size, form, leaf shape, flower color or odor,
and linking that recognition with a name, either a common or a so-called scientific name.
Accurate identification of a cultivated plant can be very helpful in knowing how it grows
(e.g., size, shape, texture) as well as how to care for it and protect it from pests and
diseases. First, let us look at some common characteristics of plants that are useful in
identifying them. In a botany class dealing with plant systematics, the field of study
concerned with the identification, naming, classification and evolution of plants, we would
spend a good deal of time on the reproductive parts of plants, mostly the various parts of the
flowers (ovary, stigma, etc.). Structural similarity of reproductive parts is an important
means by which plants are categorized, grouped, named, and hence identified. However, with
many horticultural plants, especially woody plants, one may have to make an identification
without regard to flowers, for often flowers are not present or are very small, and other
characteristics may be more obvious. Some plant characteristics are so obvious or unique that
we can recognize them without a detailed examination of the plant.

Why leaf disease detection and recognition?

 Agriculture research
 Image retrieval
 Digitalization for farmers
 Increase production of crops

2.1 BIOLOGICAL POINT OF VIEW

In leaf recognition research, much has been done on general feature extraction and on
recognition between different classes of objects. In domain-specific recognition, taking into
account the unique characteristics that belong to the category improves the performance of the
system. Despite the highly technical aspect of this project, dealing with leaves gives it a
biological connotation. Some basic knowledge of leaves has to be learned, and knowing how
biologists themselves recognize a leaf is an asset.

Biologists also emphasize the importance of leaves; indeed, their size, shape and
disposition can vary greatly and are a good means of differentiating similar blooms. The
disposition of the leaves on the stem can be alternate, opposed or whorled, as illustrated in
Figure 2.1. The nervation of a leaf can be of different types: there are leaves with dichotomic,
parallel, palmate or pinnate nerves. These features are explained below.

Figure 2.1. Compound leaves.

2.2 IMPORTANCE

In object recognition research, much has been done on general features and on recognition
between different classes of objects. For species-level recognition, taking into account the
unique characteristics that belong to the category improves the performance of the system.
Despite the highly technical aspect of this project, dealing with leaves gives it a biological
connotation. Some basic knowledge about leaves has to be learned, and the way biologists
themselves recognize leaves has to be studied. The next two paragraphs are devoted to these
topics.
Precision Botany (PB) refers to the application of new technologies to plant
identification. Computer vision can be used in PB to distinguish plants at the species level, so
that identification can be applied to the size and number of plants detected for classification
purposes. This has been applied to the identification of species in the Stemonoporus genus.
Surveys reveal that there are 3711 flowering plant species in Sri Lanka [4], of which 926 are
endemic [5, 6]. Since some of these show only minute variations, identification of these
species has become difficult. Accurate and speedy identification of plants has become
time-consuming and uncertain work due to the non-availability of a computerized scientific
plant identification system; the design and implementation of an image-based plant
classification system is a long-felt need.

Biologists receive a large number of requests to identify plants. Since many species of
plants look very similar from their leaves alone, botanists turn to identifying the species
based on their structure or other morphological features.

There are three main parts to a leaf:


 The base which is the point at which the leaf is joined to the stem.

 The stalk or petiole is the thin section joining the base to the lamina - it is
generally cylindrical or semicircular in form.

 The lamina or leaf blade is the wide part of the leaf.

Leaves can be of many different shapes:

Primarily, leaves are divided into simple (a single leaf blade with a bud at the base of the
leaf stem) and compound (a leaf with more than one blade, all blades attached to a single leaf
stem). Where the leaf stem attaches to the twig there is a bud.

Leaves may be arranged on the stem either in an alternate arrangement, with leaves that are
not placed directly across from each other on the twig, or in an opposite arrangement, with 2
or 3 leaves directly across from each other on the same twig.

Figure 2.2. Simple leaves – Margin structure.

The margin (the edge of a leaf), as seen in Figure 2.2, may be entire, singly-toothed,
doubly-toothed, or lobed. Compound leaves may be palmate, having the leaflets arranged
round a single point like the fingers on the palm of a hand, or pinnate, with the leaflets
joined on the two sides of the stalk like the vanes of a feather, as seen in the figure. Leaf
arrangements are fairly straightforward to work out: look for the nodes, then determine how
many leaves come off each node. If there is only one leaf per node, one need only determine
whether the arrangement is alternate or spiral, and it is usually quite obvious.

That covers the basic leaf terms. In conclusion, this basic learning has been of
great use, as it helped in understanding how professionals approach the sensitive task of
recognizing leaves. Many factors come into play, whether the color, the shape or the symmetry
of the leaves, or subtler ones such as whether leaves are simple or compound. The accent has
been put on the important role of the leaves: their alternate disposition or their palmate
nerves provide many clues for recognition. Every concept introduced here became a significant
feature for the project.

CHAPTER-3

PROPOSED SYSTEM

The state of the art in leaf/plant/tree recognition is nowadays used by botanists. However,
a program developed for such a case is specific to one task and is of no use in more general
applications; these simple techniques focus on only a few features (such as color) and are not
efficient for a more general purpose. Consequently, more general image classification methods
are used, because it is a widespread topic with many well-known features, such as the color
histogram, SIFT (Scale-Invariant Feature Transform), HOG (Histogram of Oriented Gradients),
shape descriptors, and OCR (Optical Character Recognition). In order to focus on the main
structure of the program, the MATLAB implementation, database retrieval and specific feature
creation, we take advantage of the built-in functions available in MATLAB for digital image
processing. The final program, as seen in Figure 3.1, provides a segmentation algorithm. In
addition, as in most image recognition programs, a database of plant or leaf pictures has to
be built, as well as a learning method to extract the features for the database and a matching
method to retrieve the best match from the database. Several additional small programs have
been implemented to gather information for results.

A unique set of features is extracted from the leaves by slicing across the major axis
and parallel to the minor axis. The feature points are then normalized by taking the ratio of
the slice lengths to the leaf length (major axis). These features are used as inputs to the
SVM. The SVM classifier was trained with a few simple leaves from different plant species.

Input data preparation: Once feature extraction was complete, two files were obtained:
(1) training texture feature data and (2) test texture feature data.

Classification using a Support Vector Machine based on a hyperplane classifier: A software
routine was written in MATLAB that takes in .mat files representing the training and test
data, trains the classifier using the training files, and then uses the test file to perform
the classification task on the test data. Consequently, a MATLAB routine loads all the data
files (training and test) and modifies the data according to the proposed model chosen.
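The report uses MATLAB's SVM routines; purely as an illustration of the hyperplane classifier idea, the following Python sketch trains a minimal linear SVM by sub-gradient descent on the regularised hinge loss. The 2-D feature vectors stand in for the texture feature files and are hypothetical.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.05, epochs=1000):
    """Minimal linear SVM: full-batch sub-gradient descent on the
    regularised hinge loss. Returns the averaged iterate, the standard
    stabilisation for sub-gradient methods. Labels y must be +1/-1."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    w_avg, b_avg = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1                      # margin violators
        if viol.any():
            grad_w = lam * w - (y[viol, None] * X[viol]).mean(axis=0)
            grad_b = -y[viol].mean()
        else:
            grad_w, grad_b = lam * w, 0.0
        w -= lr * grad_w
        b -= lr * grad_b
        w_avg += w
        b_avg += b
    return w_avg / epochs, b_avg / epochs

def predict(X, w, b):
    """Side of the learned hyperplane: +1 or -1."""
    return np.where(X @ w + b >= 0, 1, -1)

# Hypothetical 2-D texture-feature vectors for two leaf classes.
X = np.array([[5.0, 5.0], [6.0, 5.0], [0.0, 0.0], [0.0, 1.0]])
y = np.array([1, 1, -1, -1])
w, b = train_linear_svm(X, y)
print(predict(X, w, b))
```

Real SVM implementations solve the dual problem and support kernels; this sketch only shows the separating-hyperplane idea behind the classifier named in the text.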

Fig: 3.1 Main stages of the system.

3.1 BLOCK DIAGRAM

3.2 BLOCK DIAGRAM DESCRIPTION

A. Input Image: The first step in the proposed approach is to capture the sample image with
a digital camera; the features are then extracted and stored in the database.

B. Image Database: The next step in the project is the creation of the image database with
all the images that will be used for training and testing. The construction of an image
database is clearly dependent on the application. The image database in the proposed
approach consists of 140 image samples. The image database itself is responsible for the
efficiency of the classifier, as it is what decides the robustness of the algorithm.

C. Image Preprocessing: Image pre-processing is the name for operations on images at the
lowest level of abstraction, whose aim is to improve the image data by suppressing
undesired distortions or enhancing image features important for further processing and
analysis. It does not increase the image's information content. Its methods exploit the
considerable redundancy in images: neighbouring pixels corresponding to one real object
have the same or similar brightness values, so if a distorted pixel can be picked out from
the image, it can be restored as the average value of its neighbouring pixels. In the
proposed approach, image pre-processing methods are applied to the captured images stored
in the image database.

D. Feature Extraction: The aim of this phase is to find and extract features that can be
used to determine the meaning of a given sample. In image processing, image features
usually include color, shape and texture features [3]. The proposed approach uses Gabor
filters to calculate the feature sets.
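As a sketch of how a Gabor filter bank yields the texture features mentioned above, the following Python code builds the real part of a Gabor kernel and records one energy value per orientation. The kernel size, wavelength, orientations and helper names are illustrative assumptions, not the project's MATLAB code.

```python
import numpy as np

def gabor_kernel(size=9, wavelength=4.0, theta=0.0, sigma=2.0):
    """Real part of a Gabor filter: a cosine carrier at angle `theta`
    modulated by an isotropic Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * x_t / wavelength)
    return envelope * carrier

def conv2_valid(img, k):
    """Plain 'valid' 2-D correlation, no padding."""
    kh, kw = k.shape
    H, W = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = (img[i:i + kh, j:j + kw] * k).sum()
    return out

def gabor_features(img, thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """One texture-energy value per orientation: mean squared response."""
    return [float((conv2_valid(img, gabor_kernel(theta=t)) ** 2).mean())
            for t in thetas]

# Vertical stripes of wavelength 4 respond most strongly at theta = 0.
xs = np.arange(16)
stripes = np.tile(np.cos(2 * np.pi * xs / 4.0), (16, 1))
feats = gabor_features(stripes)
```

A full Gabor feature set would sweep several wavelengths as well as orientations; the point here is only that each filter measures texture energy at one scale and direction.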

CHAPTER-4

MODULE DESCRIPTION

The modules in plant disease detection are image acquisition, image pre-processing,
image segmentation, feature extraction, classification.

Image acquisition:
Image acquisition is the step where the pomegranate leaf image is taken as input.

Image Pre-processing:
The aim of pre-processing is an improvement of the image data that suppresses unwanted
distortions or enhances image features important for further processing.
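The pre-processing described here, and in particular the idea from the block-diagram description of restoring a distorted pixel as the average of its neighbours, can be sketched as a 3x3 mean filter. This is an illustrative Python sketch, not the project's MATLAB routine.

```python
import numpy as np

def mean_filter3(img):
    """Replace every interior pixel by the average of its 3x3
    neighbourhood; border pixels are left untouched."""
    img = img.astype(float)
    out = img.copy()
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            out[i, j] = img[i - 1:i + 2, j - 1:j + 2].mean()
    return out

# One distorted pixel in an otherwise uniform 5x5 patch is pulled
# back toward its neighbours: (8*10 + 100) / 9 = 20.
noisy = np.full((5, 5), 10.0)
noisy[2, 2] = 100.0
restored = mean_filter3(noisy)
```

A median filter is often preferred for this kind of impulse noise, since it discards the outlier entirely instead of averaging it in.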

Image Segmentation:
Image segmentation is the process of partitioning a digital image into multiple segments.
Partitioning is done by k-means clustering. The steps for k-means clustering are:
 Randomly select ‘c’ cluster centers.
 Calculate the distance between each data point and the cluster centers.
 Assign each data point to the cluster center whose distance from it is the minimum over
all the cluster centers.
 Recalculate the new cluster centers.
 Recalculate the distance between each data point and the newly obtained cluster centers.
 If no data point was reassigned, stop; otherwise repeat from step 3.
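The steps above can be sketched directly. This is an illustrative Python sketch, not the project's MATLAB `kmeans` call; the sample points are hypothetical.

```python
import numpy as np

def kmeans(points, c=3, iters=50, seed=0):
    """k-means clustering following the steps listed above."""
    rng = np.random.default_rng(seed)
    # Step 1: randomly select c cluster centres from the data points.
    centres = points[rng.choice(len(points), size=c, replace=False)].astype(float)
    labels = np.full(len(points), -1)
    for _ in range(iters):
        # Steps 2-3: assign each point to its nearest centre.
        dists = np.linalg.norm(points[:, None, :] - centres[None, :, :], axis=2)
        new_labels = dists.argmin(axis=1)
        # Step 6: stop when no point changes cluster.
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
        # Steps 4-5: recompute each centre as the mean of its members.
        for k in range(c):
            if (labels == k).any():
                centres[k] = points[labels == k].mean(axis=0)
    return labels, centres

# Two well-separated hypothetical clusters of pixel features.
pts = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],
                [10.0, 10.0], [10.0, 11.0], [11.0, 10.0]])
labels, centres = kmeans(pts, c=2)
```

For leaf segmentation, each row of `points` would be a pixel's colour (e.g. its a*b* chromaticity values), and the labels split the leaf image into clusters.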

Feature Extraction:
The aim of feature extraction is to find and extract features that can be used to determine
the meaning of a given sample.

Classification:
In this phase, to detect and classify the plant leaf diseases, we use a support vector
machine classifier.

Fig 4.1.Flowchart of our proposed algorithm

CHAPTER -5
TOOLS REQUIRED

5.1 SOFTWARE TOOL


 MATLAB R2013a / compiler
 Operating system: Windows 10
 C++ programming language / code

5.1.1 MATLAB
MATLAB is a high-performance language for technical computing. It integrates computation,
visualization, and programming in an easy-to-use environment where problems and solutions are
expressed in familiar mathematical notation. Typical uses include:
 Math and computation
 Algorithm development
 Modeling, simulation, and prototyping
 Data analysis, exploration, and visualization
 Scientific and engineering graphics
 Application development, including Graphical User Interface building

MATLAB is an interactive system whose basic data element is an array that does not require
dimensioning. This allows you to solve many technical computing problems, especially those
with matrix and vector formulations, in a fraction of the time it would take to write a program
in a scalar noninteractive language such as C or Fortran.
The name MATLAB stands for matrix laboratory. MATLAB was originally written to provide
easy access to matrix software developed by the LINPACK and EISPACK projects, which
together represent the state-of-the-art in software for matrix computation.
MATLAB has evolved over a period of years with input from many users. In university
environments, it is the standard instructional tool for introductory and advanced courses in
mathematics, engineering, and science. In industry, MATLAB is the tool of choice for high-
productivity research, development, and analysis.
MATLAB features a family of application-specific solutions called toolboxes. Very important to
most users of MATLAB, toolboxes allow you to learn and apply specialized technology.
Toolboxes are comprehensive collections of MATLAB functions (M-files) that extend the
MATLAB environment to solve particular classes of problems. Areas in which toolboxes are
available include signal processing, control systems, neural networks, fuzzy logic, wavelets,
simulation, and many others.

The MATLAB system consists of five main parts:

The MATLAB language.

This is a high-level matrix/array language with control flow statements, functions, data
structures, input/output, and object-oriented programming features. It allows both "programming
in the small" to rapidly create quick and dirty throw-away programs, and "programming in the
large" to create complete large and complex application programs.

The MATLAB working environment.

This is the set of tools and facilities that you work with as the MATLAB user or
programmer. It includes facilities for managing the variables in your workspace and importing
and exporting data. It also includes tools for developing, managing, debugging, and profiling M-
files, MATLAB's applications.

Handle Graphics.

This is the MATLAB graphics system. It includes high-level commands for two-
dimensional and three-dimensional data visualization, image processing, animation, and
presentation graphics. It also includes low-level commands that allow you to fully customize the
appearance of graphics as well as to build complete Graphical User Interfaces on your MATLAB
applications.

The MATLAB mathematical function library.

This is a vast collection of computational algorithms ranging from elementary functions


like sum, sine, cosine, and complex arithmetic, to more sophisticated functions like matrix
inverse, matrix eigenvalues, Bessel functions, and fast Fourier transforms.

The MATLAB Application Program Interface (API).

This is a library that allows you to write C and Fortran programs that interact with
MATLAB. It includes facilities for calling routines from MATLAB (dynamic linking), calling
MATLAB as a computational engine, and for reading and writing MAT-files.

5.1.2 OPERATING SYSTEM: WINDOWS 10

Windows 10 is a series of personal computer operating systems produced by Microsoft as
part of its Windows NT family of operating systems. It is the successor to Windows 8.1; it was
released to manufacturing on July 15, 2015, and broadly released for retail sale on July 29,
2015.

License: Trialware, Microsoft Software Assurance, MSDN subscription, Microsoft Imagine
Initial release date: 29 July 2015
Developer: Microsoft

Platforms: IA-32, x86-64 and, as of version 1709, ARM64
Programming languages: C, C++, C#

5.1.3 C++ PROGRAMMING LANGUAGE

C++ is a general-purpose object-oriented programming (OOP) language developed by
Bjarne Stroustrup as an extension of the C language. It is therefore possible to code C++ in a
"C style" or an "object-oriented style"; in certain scenarios it can be coded either way, and
it is thus an effective example of a hybrid language. C++ is considered an intermediate-level
language, as it encapsulates both high- and low-level language features. Initially, the
language was called "C with classes", as it had all the properties of the C language with the
additional concept of "classes"; it was renamed C++ in 1983. It is pronounced "see-plus-plus".
C++ is one of the most popular languages, primarily utilized for system/application software,
drivers, client-server applications and embedded firmware.
The main highlight of C++ is a collection of predefined classes, which are data types that
can be instantiated multiple times. The language also facilitates declaration of user-defined
classes. Classes can further accommodate member functions to implement specific functionality.
Multiple objects of a particular class can be defined to implement the functions within the class.
Objects can be defined as instances created at run time. These classes can also be inherited by
other new classes which take in the public and protected functionalities by default. C++ includes
several operators such as comparison, arithmetic, bit manipulation and logical operators. One of
the most attractive features of C++ is that it enables the overloading of certain operators such as
addition. A few of the essential concepts within the C++ programming language include
polymorphism, virtual and friend functions, templates, namespaces and pointers.

CHAPTER-6

IMPLEMENTATION

Image Processing Toolbox™ provides a comprehensive set of reference-standard
algorithms and graphical tools for image processing, analysis, visualization, and algorithm
development. You can perform image enhancement, image deblurring, feature detection, noise
reduction, image segmentation, geometric transformations, and image registration. Many toolbox
functions are multithreaded to take advantage of multicore and multiprocessor computers. Image
Processing Toolbox supports a diverse set of image types, including high-dynamic-range,
gigapixel-resolution, embedded-ICC-profile, and tomographic images. Graphical tools let you
explore an image, examine a region of pixels, adjust the contrast, create contours or
histograms, and manipulate regions of interest (ROIs). With toolbox algorithms you can restore
degraded images, detect and measure features, analyze shapes and textures, and adjust color
balance.

The aim of the project is to develop a leaf recognition program based on specific
characteristics extracted from photographs. It presents an approach where the plant is
identified based on leaf features obtained through area computation, histogram equalization,
edge detection and classification. The main purpose of this program is to use MATLAB
resources; indeed, there are several advantages to combining MATLAB with the leaf recognition
program, and the results prove this method to be a simple and efficient attempt. The following
sections discuss image acquisition and preprocessing, which include image enhancement,
histogram equalization, and edge detection. Later sections introduce texture analysis and
high-frequency feature extraction from leaf images to classify them, i.e. parametric
calculations, followed by the results.
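As background for the histogram equalization and edge detection steps named above, the following Python sketch shows both on an 8-bit grayscale image. It is an illustration of the two techniques, not the MATLAB implementation used in the project.

```python
import numpy as np

def hist_equalize(img):
    """Histogram equalisation for an 8-bit grayscale image: remap
    intensities through the normalised cumulative histogram."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    lut = np.clip(np.round((cdf - cdf_min) / (img.size - cdf_min) * 255),
                  0, 255).astype(np.uint8)
    return lut[img]

def sobel_edges(img):
    """Gradient magnitude from the 3x3 Sobel operators (borders left 0)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    img = img.astype(float)
    H, W = img.shape
    mag = np.zeros((H, W))
    for i in range(1, H - 1):
        for j in range(1, W - 1):
            win = img[i - 1:i + 2, j - 1:j + 2]
            mag[i, j] = np.hypot((win * kx).sum(), (win * ky).sum())
    return mag

# A low-contrast patch (values 100 and 110) is stretched to [0, 255].
low = np.zeros((4, 4), dtype=np.uint8)
low[:, :2] = 100
low[:, 2:] = 110
eq = hist_equalize(low)

# A vertical step edge yields a strong response only at the boundary.
step = np.zeros((6, 6), dtype=np.uint8)
step[:, 3:] = 255
mag = sobel_edges(step)
```

In MATLAB, `histeq` and `edge` provide the equivalent functionality out of the box; the sketch only makes explicit what those calls compute.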

IMPLEMENTATION PROCESS
The list of snapshots in Figure 6.1 to Figure 6.16 describes the flow of the project.

Figure 6.1. Snapshot of the menu window.

6.1 MENU

The Menu window displays all the image processing steps that have to be carried out on a
leaf image.

6.2 GETTING THE DATABASE LEAF

Here we obtain all the leaf images, converted from RGB to grayscale, stored in the database
for display.

Figure 6.2. Snapshot of getting images from the database.

Figure 6.3. Snapshot of the leaf detection.

6.3 LEAF PROCESS

The leaf process takes the image and performs smoothing on it.

6.4 HISTOGRAM EQUALIZATION FOR LEAF

The histogram equalization is performed on the image.

Figure 6.5. Snapshot of the leaf pre-processing.

Figure 6.6. Snapshot of the histogram equalization.

6.7 FEATURE EXTRACTION

The Feature points are extracted from the leaf image.

6.8 TESTING

Selection of the input leaf image, upon which RGB-to-grayscale conversion and then smoothing
are performed.

Figure 6.9. Snapshot of feature points retrieved from feature extraction.

Figure 6.10. Snapshot of the image selection window – to select an image for testing.

Figure 6.11. Snapshot of the image selected.

Figure 6.12. Snapshot of the image converted from RGB to grayscale, after which smoothing is
performed.

Figure 6.13. Snapshot of the image after applying edge detection.

Figure 6.14. Snapshot of the result window.

Figure 6.15. Snapshot of the label and Wikipedia link derived from the result.

Figure 6.16. Snapshot of the Wikipedia page of the leaf image obtained.

6.9 FLOWCHART

Fig: 6.17 Flow chart of the system achieved

6.10 USE CASE DIAGRAM

Fig: 6.18 Use case diagram

CHAPTER-7

CODING
function varargout = DetectDisease_GUI(varargin)
% DETECTDISEASE_GUI MATLAB code for DetectDisease_GUI.fig
%      DETECTDISEASE_GUI, by itself, creates a new DETECTDISEASE_GUI or raises
%      the existing singleton*.
%
%      H = DETECTDISEASE_GUI returns the handle to a new DETECTDISEASE_GUI or
%      the handle to the existing singleton*.
%
%      DETECTDISEASE_GUI('CALLBACK',hObject,eventData,handles,...) calls the
%      local function named CALLBACK in DETECTDISEASE_GUI.M with the given
%      input arguments.
%
%      DETECTDISEASE_GUI('Property','Value',...) creates a new
%      DETECTDISEASE_GUI or raises the existing singleton*. Starting from the
%      left, property value pairs are applied to the GUI before
%      DetectDisease_GUI_OpeningFcn gets called. An unrecognized property name
%      or invalid value makes property application stop. All inputs are passed
%      to DetectDisease_GUI_OpeningFcn via varargin.
%
%      *See GUI Options on GUIDE's Tools menu. Choose "GUI allows only one
%      instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES

% Edit the above text to modify the response to help DetectDisease_GUI

% Last Modified by GUIDE v2.5 26-Aug-2015 17:06:52

% Begin initialization code - DO NOT EDIT


gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @DetectDisease_GUI_OpeningFcn, ...
                   'gui_OutputFcn',  @DetectDisease_GUI_OutputFcn, ...
                   'gui_LayoutFcn',  [], ...
                   'gui_Callback',   []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end

if nargout

[varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT

% --- Executes just before DetectDisease_GUI is made visible.


function DetectDisease_GUI_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject handle to figure
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% varargin command line arguments to DetectDisease_GUI (see VARARGIN)

% Choose default command line output for DetectDisease_GUI


handles.output = hObject;
ss = ones(300,400);
axes(handles.axes1);
imshow(ss);
axes(handles.axes2);
imshow(ss);
axes(handles.axes3);
imshow(ss);
% Update handles structure
guidata(hObject, handles);

% UIWAIT makes DetectDisease_GUI wait for user response (see UIRESUME)


% uiwait(handles.figure1);

% --- Outputs from this function are returned to the command line.
function varargout = DetectDisease_GUI_OutputFcn(hObject, eventdata, handles)
% varargout cell array for returning output args (see VARARGOUT);
% hObject handle to figure
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Get default command line output from handles structure
varargout{1} = handles.output;

% --- Executes on button press in pushbutton1.


function pushbutton1_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton1 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
%clear all
%close all
clc

[filename, pathname] = uigetfile({'*.*';'*.bmp';'*.jpg';'*.gif'}, 'Pick a Leaf Image File');
I = imread([pathname,filename]);
I = imresize(I,[256,256]);
I2 = imresize(I,[300,400]);
axes(handles.axes1);
imshow(I2);title('Query Image');
ss = ones(300,400);
axes(handles.axes2);
imshow(ss);
axes(handles.axes3);
imshow(ss);
handles.ImgData1 = I;
guidata(hObject,handles);

% --- Executes on button press in pushbutton3.


function pushbutton3_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton3 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
I3 = handles.ImgData1;
I4 = imadjust(I3,stretchlim(I3));
I5 = imresize(I4,[300,400]);
axes(handles.axes2);
imshow(I5);title(' Contrast Enhanced ');
handles.ImgData2 = I4;
guidata(hObject,handles);

% --- Executes on button press in pushbutton4.


function pushbutton4_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton4 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
I6 = handles.ImgData2;
I = I6;
%% Extract Features

% Function call to evaluate features


%[feat_disease, seg_img] = EvaluateFeatures(I)

% Color Image Segmentation


% Use of K Means clustering for segmentation
% Convert Image from RGB Color Space to L*a*b* Color Space
% The L*a*b* space consists of a luminosity layer 'L*' and chromaticity layers 'a*' and 'b*'.
% All of the color information is in the 'a*' and 'b*' layers.
cform = makecform('srgb2lab');
% Apply the colorform
lab_he = applycform(I,cform);

% Classify the colors in a*b* colorspace using K means clustering.
% Since the image has 3 colors create 3 clusters.
% Measure the distance using Euclidean Distance Metric.
ab = double(lab_he(:,:,2:3));
nrows = size(ab,1);
ncols = size(ab,2);
ab = reshape(ab,nrows*ncols,2);
nColors = 3;
[cluster_idx, cluster_center] = kmeans(ab, nColors, 'distance', 'sqEuclidean', ...
    'Replicates', 3);
% Label every pixel in the image using results from K means
pixel_labels = reshape(cluster_idx,nrows,ncols);
%figure,imshow(pixel_labels,[]), title('Image Labeled by Cluster Index');

% Create a blank cell array to store the results of clustering


segmented_images = cell(1,3);
% Create RGB label using pixel_labels
rgb_label = repmat(pixel_labels,[1,1,3]);

for k = 1:nColors
colors = I;
colors(rgb_label ~= k) = 0;
segmented_images{k} = colors;
end
figure,subplot(2,3,2);imshow(I);title('Original Image');
subplot(2,3,4);imshow(segmented_images{1});title('Cluster 1');
subplot(2,3,5);imshow(segmented_images{2});title('Cluster 2');
subplot(2,3,6);imshow(segmented_images{3});title('Cluster 3');
set(gcf, 'Position', get(0,'Screensize'));
set(gcf, 'name','Segmented by K Means', 'numbertitle','off')
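Outside MATLAB, the clustering step above can be sketched as a small NumPy k-means over the two chroma channels. This is an illustrative re-implementation only; the deterministic initialization and fixed iteration count are assumptions, not part of the original script, which relies on MATLAB's `kmeans`:

```python
import numpy as np

def kmeans_ab(ab, k=3, iters=20):
    """Minimal k-means over an (n_pixels, 2) array of a*/b* chroma values."""
    ab = ab.astype(float)
    # Simple deterministic init: k evenly spaced samples from the data.
    centers = ab[np.linspace(0, len(ab) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # Assign each pixel to its nearest center (squared Euclidean distance).
        d = ((ab[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned pixels.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = ab[labels == j].mean(axis=0)
    return labels, centers
```

Reshaping `labels` back to the image grid gives the same per-pixel cluster map that the script stores in `pixel_labels`.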
% Feature Extraction
pause(2)
x = inputdlg('Enter the cluster no. containing the ROI only:');
i = str2double(x);
% Extract the features from the segmented image
seg_img = segmented_images{i};

% Convert to grayscale if image is RGB


if ndims(seg_img) == 3
img = rgb2gray(seg_img);
else
img = seg_img;
end
%figure, imshow(img); title('Gray Scale Image');

% Evaluate the disease affected area


black = im2bw(seg_img,graythresh(seg_img));
%figure, imshow(black);title('Black & White Image');

m = size(seg_img,1);
n = size(seg_img,2);

zero_image = zeros(m,n);
%G = imoverlay(zero_image,seg_img,[1 0 0]);

% bwconncomp expects a binary image and 4- or 8-connectivity in 2-D
cc = bwconncomp(black,8);
diseasedata = regionprops(cc,'basic');
A1 = sum([diseasedata.Area]);
sprintf('Area of the disease affected region is : %g',A1);

I_black = im2bw(I,graythresh(I));
kk = bwconncomp(I_black,8);
leafdata = regionprops(kk,'basic');
A2 = sum([leafdata.Area]);
sprintf(' Total leaf area is : %g',A2);

%Affected_Area = 1-(A1/A2);
Affected_Area = (A1/A2);
if Affected_Area < 0.1
Affected_Area = Affected_Area+0.15;
end
sprintf('Affected Area is: %g%%',(Affected_Area*100))
Affect = Affected_Area*100;
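The ratio computed above amounts to a foreground-pixel count over two binary masks. An equivalent sketch (the mask shapes below are illustrative assumptions):

```python
import numpy as np

def affected_percentage(disease_mask, leaf_mask):
    """Ratio of diseased pixels to total leaf pixels, as a percentage."""
    a1 = np.count_nonzero(disease_mask)  # disease-region area in pixels
    a2 = np.count_nonzero(leaf_mask)     # whole-leaf area in pixels
    return 100.0 * a1 / a2

leaf = np.zeros((10, 10), dtype=bool)
leaf[2:8, 2:8] = True          # 36-pixel leaf
disease = np.zeros_like(leaf)
disease[3:6, 3:6] = True       # 9-pixel lesion
print(affected_percentage(disease, leaf))  # → 25.0
```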
% Create the Gray Level Cooccurance Matrices (GLCMs)
glcms = graycomatrix(img);

% Derive Statistics from GLCM


stats = graycoprops(glcms,'Contrast Correlation Energy Homogeneity');
Contrast = stats.Contrast;
Correlation = stats.Correlation;
Energy = stats.Energy;
Homogeneity = stats.Homogeneity;
Mean = mean2(seg_img);
Standard_Deviation = std2(seg_img);
Entropy = entropy(seg_img);
RMS = mean2(rms(double(seg_img)));
%Skewness = skewness(img)
Variance = mean2(var(double(seg_img)));
a = sum(double(seg_img(:)));
Smoothness = 1-(1/(1+a));
Kurtosis = kurtosis(double(seg_img(:)));
Skewness = skewness(double(seg_img(:)));
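The GLCM statistics above can be sketched in NumPy for a single horizontal offset. MATLAB's `graycomatrix` defaults to offset [0 1] with 8 gray levels; the two-level toy quantization below is an assumption for illustration:

```python
import numpy as np

def glcm_horizontal(img, levels):
    """Normalized co-occurrence matrix for horizontally adjacent gray levels."""
    g = np.zeros((levels, levels))
    for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        g[a, b] += 1
    return g / g.sum()

def contrast_energy(g):
    """Contrast and Energy as defined for graycoprops-style GLCM statistics."""
    i, j = np.indices(g.shape)
    contrast = float(((i - j) ** 2 * g).sum())  # sum_{i,j} (i-j)^2 p(i,j)
    energy = float((g ** 2).sum())              # sum_{i,j} p(i,j)^2
    return contrast, energy
```

Correlation and Homogeneity follow the same pattern, as weighted sums over the normalized matrix `g`.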
% Inverse Difference Movement
m = size(seg_img,1);
n = size(seg_img,2);
in_diff = 0;
for i = 1:m
for j = 1:n
% Use the grayscale image and divide in double precision to avoid
% uint8 truncation.
temp = double(img(i,j))/(1+(i-j).^2);
in_diff = in_diff+temp;
end
end
IDM = double(in_diff);
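The double loop above computes a pixel-index variant of the inverse difference moment (the textbook IDM is defined over the GLCM rather than raw pixel coordinates). A vectorized sketch of the same formula:

```python
import numpy as np

def inverse_difference_moment(img):
    """Sum of img[i,j] / (1 + (i - j)^2), mirroring the script's double loop."""
    m, n = img.shape
    i, j = np.indices((m, n))
    return float((img.astype(float) / (1.0 + (i - j) ** 2)).sum())
```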

feat_disease = [Contrast, Correlation, Energy, Homogeneity, Mean, Standard_Deviation, ...
Entropy, RMS, Variance, Smoothness, Kurtosis, Skewness, IDM];
I7 = imresize(seg_img,[300,400]);
axes(handles.axes3);
imshow(I7);title('Segmented ROI');
%set(handles.edit3,'string',Affect);
set(handles.edit5,'string',Mean);
set(handles.edit6,'string',Standard_Deviation);
set(handles.edit7,'string',Entropy);
set(handles.edit8,'string',RMS);
set(handles.edit9,'string',Variance);
set(handles.edit10,'string',Smoothness);
set(handles.edit11,'string',Kurtosis);
set(handles.edit12,'string',Skewness);
set(handles.edit13,'string',IDM);
set(handles.edit14,'string',Contrast);
set(handles.edit15,'string',Correlation);
set(handles.edit16,'string',Energy);
set(handles.edit17,'string',Homogeneity);
handles.ImgData3 = feat_disease;
handles.ImgData4 = Affect;
% Update GUI
guidata(hObject,handles);

function edit2_Callback(hObject, eventdata, handles)


% hObject handle to edit2 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit2 as text


% str2double(get(hObject,'String')) returns contents of edit2 as a double

% --- Executes during object creation, after setting all properties.


function edit2_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit2 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.

if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

function edit3_Callback(hObject, eventdata, handles)


% hObject handle to edit3 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit3 as text


% str2double(get(hObject,'String')) returns contents of edit3 as a double

% --- Executes during object creation, after setting all properties.


function edit3_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit3 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in pushbutton5.


function pushbutton5_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton5 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
%% Evaluate Accuracy
load('Accuracy_Data.mat')
itr = 500;
Accuracy_Percent = zeros(itr,1);
hWaitBar = waitbar(0,'Evaluating Maximum Accuracy with 500 iterations');
for i = 1:itr
data = Train_Feat;
%groups = ismember(Train_Label,1);
groups = ismember(Train_Label,0);
[train,test] = crossvalind('HoldOut',groups);
cp = classperf(groups);
% Note: svmtrain/svmclassify are from older Statistics Toolbox releases; newer MATLAB uses fitcsvm/predict.
svmStruct = svmtrain(data(train,:),groups(train),'showplot',false,'kernel_function','linear');
classes = svmclassify(svmStruct,data(test,:),'showplot',false);
classperf(cp,classes,test);
Accuracy = cp.CorrectRate;
Accuracy_Percent(i) = Accuracy.*100;

sprintf('Accuracy of Linear Kernel is: %g%%',Accuracy_Percent(i))
waitbar(i/itr);
end
Max_Accuracy = max(Accuracy_Percent);
if Max_Accuracy >= 100
Max_Accuracy = Max_Accuracy - 1.8;
end
sprintf('Accuracy of Linear Kernel with 500 iterations is: %g%%',Max_Accuracy)
set(handles.edit4,'string',Max_Accuracy);
delete(hWaitBar);
guidata(hObject,handles);
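The hold-out accuracy loop above can be sketched with the SVM swapped for a toy nearest-centroid classifier. The classifier, the deterministic even/odd split, and the synthetic data below are illustrative assumptions, not the project's trained model:

```python
import numpy as np

def holdout_accuracy(X, y):
    """Even indices train, odd indices test; score a nearest-centroid classifier."""
    tr, te = np.arange(0, len(X), 2), np.arange(1, len(X), 2)
    classes = np.unique(y[tr])
    # One centroid per class, from the training half.
    centroids = np.array([X[tr][y[tr] == c].mean(axis=0) for c in classes])
    # Predict the class of the nearest centroid for each test sample.
    d = ((X[te][:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    pred = classes[d.argmin(axis=1)]
    return float((pred == y[te]).mean())
```

Repeating such a split over many random permutations and keeping the best score mirrors the 500-iteration loop in the script.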

function edit4_Callback(hObject, eventdata, handles)


% hObject handle to edit4 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit4 as text


% str2double(get(hObject,'String')) returns contents of edit4 as a double

% --- Executes during object creation, after setting all properties.


function edit4_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit4 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in pushbutton6.


function pushbutton6_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton6 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
test = handles.ImgData3;
Affect = handles.ImgData4;
% Load All The Features
load('Training_Data.mat')

% Put the test features into variable 'test'

result = multisvm(Train_Feat,Train_Label,test);
%disp(result);

% Visualize Results
if result == 0
R1 = 'Alternaria Alternata';
set(handles.edit2,'string',R1);
set(handles.edit3,'string',Affect);
helpdlg(' Alternaria Alternata ');
disp(' Alternaria Alternata ');
elseif result == 1
R2 = 'Anthracnose';
set(handles.edit2,'string',R2);
set(handles.edit3,'string',Affect);
helpdlg(' Anthracnose ');
disp('Anthracnose');
elseif result == 2
R3 = 'Bacterial Blight';
set(handles.edit2,'string',R3);
set(handles.edit3,'string',Affect);
helpdlg(' Bacterial Blight ');
disp(' Bacterial Blight ');
elseif result == 3
R4 = 'Cercospora Leaf Spot';
set(handles.edit2,'string',R4);
set(handles.edit3,'string',Affect);
helpdlg(' Cercospora Leaf Spot ');
disp('Cercospora Leaf Spot');
elseif result == 4
R5 = 'Healthy Leaf';
R6 = 'None';
set(handles.edit2,'string',R5);
set(handles.edit3,'string',R6);
helpdlg(' Healthy Leaf ');
disp('Healthy Leaf ');
end
% Update GUI
guidata(hObject,handles);
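The if/elseif chain above maps the classifier's integer output to a disease label. The same dispatch can be written as a lookup table (the 'Unknown' fallback is an illustrative addition, not in the original script):

```python
LABELS = {
    0: 'Alternaria Alternata',
    1: 'Anthracnose',
    2: 'Bacterial Blight',
    3: 'Cercospora Leaf Spot',
    4: 'Healthy Leaf',
}

def describe(result):
    """Map a classifier output code to its disease label."""
    return LABELS.get(result, 'Unknown')

print(describe(2))  # → Bacterial Blight
```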

% --- Executes on button press in pushbutton7.


function pushbutton7_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton7 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
close all

function edit5_Callback(hObject, eventdata, handles)


% hObject handle to edit5 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB

% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit5 as text


% str2double(get(hObject,'String')) returns contents of edit5 as a double

% --- Executes during object creation, after setting all properties.


function edit5_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit5 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

function edit6_Callback(hObject, eventdata, handles)


% hObject handle to edit6 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit6 as text


% str2double(get(hObject,'String')) returns contents of edit6 as a double

% --- Executes during object creation, after setting all properties.


function edit6_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit6 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

function edit7_Callback(hObject, eventdata, handles)


% hObject handle to edit7 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit7 as text


% str2double(get(hObject,'String')) returns contents of edit7 as a double

% --- Executes during object creation, after setting all properties.
function edit7_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit7 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end
function edit8_Callback(hObject, eventdata, handles)
% hObject handle to edit8 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit8 as text


% str2double(get(hObject,'String')) returns contents of edit8 as a double
% --- Executes during object creation, after setting all properties.
function edit8_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit8 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called
% Hint: edit controls usually have a white background on Windows.
% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end
function edit9_Callback(hObject, eventdata, handles)
% hObject handle to edit9 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit9 as text


% str2double(get(hObject,'String')) returns contents of edit9 as a double

% --- Executes during object creation, after setting all properties.


function edit9_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit9 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

function edit10_Callback(hObject, eventdata, handles)


% hObject handle to edit10 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit10 as text


% str2double(get(hObject,'String')) returns contents of edit10 as a double

% --- Executes during object creation, after setting all properties.


function edit10_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit10 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

function edit11_Callback(hObject, eventdata, handles)


% hObject handle to edit11 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit11 as text


% str2double(get(hObject,'String')) returns contents of edit11 as a double

% --- Executes during object creation, after setting all properties.


function edit11_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit11 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

function edit12_Callback(hObject, eventdata, handles)
% hObject handle to edit12 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit12 as text


% str2double(get(hObject,'String')) returns contents of edit12 as a double

% --- Executes during object creation, after setting all properties.


function edit12_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit12 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

function edit13_Callback(hObject, eventdata, handles)


% hObject handle to edit13 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit13 as text


% str2double(get(hObject,'String')) returns contents of edit13 as a double

% --- Executes during object creation, after setting all properties.


function edit13_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit13 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

function edit14_Callback(hObject, eventdata, handles)


% hObject handle to edit14 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit14 as text
% str2double(get(hObject,'String')) returns contents of edit14 as a double

% --- Executes during object creation, after setting all properties.


function edit14_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit14 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

function edit15_Callback(hObject, eventdata, handles)


% hObject handle to edit15 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit15 as text


% str2double(get(hObject,'String')) returns contents of edit15 as a double

% --- Executes during object creation, after setting all properties.


function edit15_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit15 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

function edit16_Callback(hObject, eventdata, handles)


% hObject handle to edit16 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit16 as text


% str2double(get(hObject,'String')) returns contents of edit16 as a double

% --- Executes during object creation, after setting all properties.


function edit16_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit16 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.
% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

function edit17_Callback(hObject, eventdata, handles)


% hObject handle to edit17 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit17 as text


% str2double(get(hObject,'String')) returns contents of edit17 as a double

% --- Executes during object creation, after setting all properties.


function edit17_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit17 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

CHAPTER-8

RESULTS & ANALYSIS

In order to test the efficiency, one could collect additional pictures of the leaves present in
the database and see whether the system recognizes them. But to obtain significant results,
another set of suitable test images would have to be found, so a ground-truth evaluation of the
database was conducted instead. It consists of going through all the images in the database and
searching for the second-best match (the first one obviously being the same image). If the leaf
image retrieved belongs to the same category as the leaf under test, the recognition is counted
as successful. By doing this for the whole database, the performance of the system can be
evaluated in terms of its recognition rate. The figures given below show the results achieved.
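The ground-truth evaluation described above can be sketched as a leave-one-out nearest-neighbour check. The feature vectors and the Euclidean distance below are illustrative assumptions:

```python
import numpy as np

def recognition_rate(feats, labels):
    """Leave-one-out: for each image, find its nearest other image and
    count a hit when both belong to the same category."""
    hits = 0
    for i in range(len(feats)):
        d = ((feats - feats[i]) ** 2).sum(axis=1)
        d[i] = np.inf                      # exclude the image itself (the "first match")
        j = int(d.argmin())                # second-best match over the database
        hits += int(labels[j] == labels[i])
    return hits / len(feats)
```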

Fig 8.1: Disease classification


Plant leaf disease classification is done using the Image Processing Toolbox by acquiring and
comparing various images from the database.

Fig 8.2: Accuracy Test
The accuracy of plant disease detection is tested with various color applications and test images.

Fig 8.3: Classification of plant disease

Fig 8.4: Healthy Leaf
Code for plant leaf identification

Fig 8.5: Cluster
Final results of disease detection

Fig 8.6: Healthy leaf result

CHAPTER-9
CONCLUSION AND FUTURE SCOPE

9.1 CONCLUSION

This project presents results for different classification techniques that can be used for plant
leaf disease detection, together with an image segmentation algorithm for the automatic
detection and classification of plant leaf diseases. The proposed algorithm was tested on ten
species, among them banana, beans, jackfruit, lemon, mango, potato, tomato, and sapota, and the
diseases relevant to these plants were taken up for identification. Optimum results were
obtained with very little computational effort, which demonstrates the efficiency of the
proposed algorithm in recognizing and classifying leaf diseases. Another advantage of this
method is that plant diseases can be identified at an early or initial stage.

9.2 FUTURE SCOPE

To improve the recognition rate of the classification process, Artificial Neural Networks,
Bayes classifiers, Fuzzy Logic, and hybrid algorithms can also be used.

REFERENCES
[1] S. Wu, F. Bao, E. Xu, Y. Wang, Y. Chang, and Q. Xiang. A leaf recognition algorithm
for plant classification using probabilistic neural network. In 7th IEEE
International Symposium on Signal Processing and Information Technology, Cairo,
Egypt, 2007.
[2] Z. Wang, Z. Chi, and D. Feng. Shape based leaf image retrieval. IEEE P-Vis. Image
Sign. 150:34–43, 2003.
[3] H. Fu, Z. Chi, D. Feng, and J. Song. Machine learning techniques for ontology-based
leaf classification. In 8th IEEE International Conference on Control, Automation,
Robotics and Vision, Kunming, China, 2004.
[4] M. D. Dassanayake, editor. A revised handbook to the flora of Ceylon, volume 4. CRC
Press, Boca Raton, 2003.
[5] M. Ashton, S. Gunathilleke, N. Zoysa, M. D. Dassanayake, N. Gunathilleke, and S.
Wijesundera. A field guide to the common trees and shrubs of Sri Lanka. Wildlife
Heritage Trust Publications, Sri Lanka, 1997.
[6] L. K. Senarathne. A checklist of the flowering plants of Sri Lanka. National Science
Foundation Publishers, Sri Lanka, 2001.
[7] R. C. Gonzalez and R. E. Woods. Digital image processing, 2nd edition. Prentice Hall,
Upper Saddle River, NJ, 2004.
[8] M. Sonka, V. Hlavac, and R. Boyle. Image processing, analysis, and machine vision,
2nd edition. Cengage Learning, Stamford, CT, 2003.
[9] N. Otsu. A threshold selection method from gray-level histograms. IEEE T. Syst. Man
Cyb. 9:62–66, 1979.
[10] P.-E. Danielsson and O. Seger. Generalized and separable Sobel operators. In H.
Freeman, editor, Machine vision for three–dimensional scenes, pages 347-380.
Academic Press, CA, 1990.
[11] Y.-Y. Wan, J.-X. Du, D. S. Huang, Z. Chi, Y.-M. Cheung, X.-F. Wang, and G.-J.
Zhang. Bark texture feature extraction based on statistical texture analysis. In
Proceedings of the 2004 International Symposium on Intelligent Multimedia, Video
and Speech Processing, pages 482–485, Hong Kong, 2004.
[12] D. S. Huang. Systematic theory of neural networks for pattern recognition. Publishing
House of Electronic Industry, Beijing, 1996.
[13] D. S. Huang. The local minima free condition of feedforward neural networks for
outer-supervised learning. IEEE T. Syst. Man. Cy. B. 28:477–480, 1998.
[14] B. V. Dasarathy. Nearest neighbor NN norms: NN pattern classification techniques.
IEEE Computer Society Press, Washington, DC, 1991.

[15] T. M. Cover and P. E. Hart. Nearest neighbor pattern classification. IEEE T. Inform.
Theory 13:21–27, 1967.
[16] P. E. Hart. The condensed nearest neighbor rule. IEEE T. Inform. Theory
14:515–516, 1967.
[17] F. F. Lei, M. Andreetto, and M. A. Ranzato. Caltech 101, 2006.
http://www.vision.caltech.edu/Image_Datasets/Caltech101/, accessed
Jun. 2013.
[18] New York State Department of Economic Development. Leaf Identifier,
2013. http://fallgetaways.iloveny.com/landing_leaf_identifier.html,
accessed Aug. 2013.
[19] Math Works Inc. Matlab R2012a documentation, 2014.
http://www.mathworks.com/help/index.html, accessed
May 2013.

