Introduction
Mid-Level Vision
Gestalt Theory & Grouping
K-means & EM
Mean Shift
Normalized Cuts
Slide Credits:
A. Efros, S. Palmer, B. Leibe, S. Lazebnik, K. Grauman, S. Seitz, C. Bishop, I. Kokkinos
Mid-level vision
Half-way between the image and the objects
Scalability: Recognition-by-components
Mid-level vision
How can we abstract from the image observations?
Too many pixels, edgels, blobs, junctions
Replace with representative, higher-level structures
Fewer and amenable to subsequent processing
Core problem: Grouping
Region grouping (Segmentation)
Illusory/subjective contours
Occlusion
Familiar configuration
Similarity
Common Fate
Proximity
Symmetry
Parallelism
Continuity
Common fate: motion estimation
Parallelism: texture analysis
Symmetry: ridge detection
Similarity: region-based segmentation
Closure, continuity: boundary-based segmentation
(Figure: grouping examples for symmetry, continuity, color similarity, and motion estimation)
This Lecture
Introduction
Mid-Level Vision
Gestalt Theory & Grouping
K-means & EM
Mean Shift
Normalized Cuts
Segmentation Problem
Task: Partition image into homogeneous regions
(Example: intensity image and its partition into homogeneous regions)
Feature Space
At each pixel, form a vector of measurements describing
image properties: image features
Map observations into feature space
Group pixels based on color similarity.
(Figure: sample pixels with RGB values such as R=255 G=200 B=250, plotted as points in R-G-B feature space)
Feature Space
At each pixel, form a vector of measurements describing
image properties: image features
Perform segmentation by clustering the feature space
Grouping pixels based on texture similarity.
(Figure: responses F1 … F24 of a filter bank of 24 filters form the feature vector)
What if these points lie on lines?
(Figure: EM line fitting: two lines are fit to the points; the weights assigned to line 1 and line 2 evolve over the iterations)
Application to segmentation
Invented models: image segments.
Modeling the features separately within each segment is substantially easier than modeling the whole image.
(Figure: segmentation into brown, blue, and yellow regions)
12
white
pixels
black pixels
gray
pixels
input image
intensity
Pixel count
25
Input image
Intensity
Pixel count
Input image
Intensity
26
13
190
Intensity
255
27
Clustering
Clustering is a chicken-and-egg problem: if we knew the cluster centers, we could assign each point to its nearest center; if we knew the assignments, we could compute the centers.
K-Means algorithm
Input: features
d: feature vector dimensionality
N: number of pixels
Output: centers & assignments
K-Means Clustering
Randomly initialize k cluster centers ci.
Iterate:
1. Assign each point to its nearest cluster center.
2. Recompute each center ci as the mean of the points assigned to it.
3. If the ci have changed, go to 1.
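The loop above can be written in a few lines; a minimal sketch assuming NumPy (the convergence test and the empty-cluster guard are implementation choices, not part of the slides):

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Basic k-means: X is (N, d); returns (centers, assignments)."""
    rng = np.random.default_rng(seed)
    # Randomly initialize k cluster centers from the data points.
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iters):
        # 1. Assign each point to its nearest center (Euclidean distance).
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        # 2. Recompute each center as the mean of its assigned points
        #    (keeping the old center if a cluster becomes empty).
        new_centers = np.array([X[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        # 3. Stop when the centers no longer change.
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels
```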
Intensity-based clusters
Color-based clusters
Limitations of k-means
Euclidean distance-based criterion
Desired
K-means
(Figure: generative view: sample points with density values P(x) = .1, .2, .5)
Mixture of Gaussians
E-step: compute soft assignments of points to components, given the current Gaussian parameters.
M-step: re-estimate the Gaussian parameters, given the soft assignments.
K-means vs. EM
k-means: hard assignment (index of the closest center); isotropic distance (Euclidean)
EM: soft assignment R; anisotropic likelihood (covariance-based, Mahalanobis)
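The E/M alternation for a mixture of Gaussians can be sketched as follows; a minimal version assuming NumPy (the deterministic farthest-point initialization and the small covariance regularizer are choices made here, not part of the slides):

```python
import numpy as np

def em_gmm(X, k, n_iters=50):
    """EM for a Gaussian mixture: soft assignments and full covariances."""
    N, d = X.shape
    # Deterministic farthest-point initialization of the means
    # (a reproducibility choice; random initialization also works).
    mu = [X[0]]
    for _ in range(k - 1):
        d2 = np.min([((X - m) ** 2).sum(axis=1) for m in mu], axis=0)
        mu.append(X[int(d2.argmax())])
    mu = np.array(mu, dtype=float)
    cov = np.stack([np.eye(d)] * k)
    pi = np.full(k, 1.0 / k)
    R = np.empty((N, k))
    for _ in range(n_iters):
        # E-step: R[n, j] proportional to pi_j * N(x_n | mu_j, cov_j),
        # using the (anisotropic) squared Mahalanobis distance.
        for j in range(k):
            diff = X - mu[j]
            maha = np.einsum('ni,ij,nj->n', diff, np.linalg.inv(cov[j]), diff)
            norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(cov[j]))
            R[:, j] = pi[j] * np.exp(-0.5 * maha) / norm
        R /= R.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and covariances from R.
        Nj = R.sum(axis=0)
        pi = Nj / N
        mu = (R.T @ X) / Nj[:, None]
        for j in range(k):
            diff = X - mu[j]
            # Small regularizer keeps the covariance invertible.
            cov[j] = (R[:, j, None] * diff).T @ diff / Nj[j] + 1e-6 * np.eye(d)
    return pi, mu, cov, R
```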
Problems of K-Means/EM
Number of clusters
Initialization/local minima
Mismatch with data distribution
This Lecture
Introduction
Mid-Level Vision
Gestalt Theory & Grouping
K-means & EM
Mean Shift
Normalized Cuts
Mean-Shift
Place a region of interest (window) in feature space and compute the center of mass of the points inside it. The mean shift vector points from the window's center to this center of mass; translating the window along it and repeating moves the window uphill on the density until it converges on a mode.
(Figure: several iterations showing the region of interest, its center of mass, and the mean shift vector)
The blue data points were traversed by the windows towards the mode.
Slide by Y. Ukrainitz & B. Sarel
Mean-Shift Clustering
Cluster: all data points in the attraction basin of a mode
Attraction basin: the region for which all trajectories
lead to the same mode
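The procedure can be sketched with a flat (uniform) kernel; a minimal version assuming NumPy (the mode-merging tolerance of half a bandwidth is a choice made here):

```python
import numpy as np

def mean_shift(X, bandwidth, n_iters=100, tol=1e-4):
    """Flat-kernel mean shift: move every point to the mode of its basin."""
    modes = X.astype(float).copy()
    for _ in range(n_iters):
        shifted = np.empty_like(modes)
        for i, x in enumerate(modes):
            # Region of interest: all data points within `bandwidth` of x.
            window = ((X - x) ** 2).sum(axis=1) <= bandwidth ** 2
            # Mean-shift step: move x to the window's center of mass.
            shifted[i] = X[window].mean(axis=0)
        if np.abs(shifted - modes).max() < tol:
            modes = shifted
            break
        modes = shifted
    # Attraction basins: points whose trajectories end at the same mode
    # form one cluster (modes within half a bandwidth are merged).
    centers, labels = [], np.empty(len(X), dtype=int)
    for i, m in enumerate(modes):
        for j, c in enumerate(centers):
            if ((m - c) ** 2).sum() <= (bandwidth / 2) ** 2:
                labels[i] = j
                break
        else:
            centers.append(m)
            labels[i] = len(centers) - 1
    return np.array(centers), labels
```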
More Results
Summary Mean-Shift
Pros
Does not assume any particular cluster shape
A single parameter (the window bandwidth)
Finds a variable number of modes; no need to fix k in advance
Robust to outliers
Cons
Output depends on the bandwidth, and selecting it is non-trivial
Computationally expensive
Does not scale well to high-dimensional feature spaces
This Lecture
Introduction
Mid-Level Vision
Gestalt Theory & Grouping
K-means & EM
Mean Shift
Normalized Cuts
Images as Graphs
Node for every pixel; edge between every pair of pixels (p, q), with affinity weight wpq: a fully-connected graph.
(Figure: pixels p and q connected by an edge of weight wpq)
Measuring Affinity
Distance
Intensity
Color (some suitable color space distance)
Texture (vectors of filter outputs)
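These distances are commonly turned into affinities with a Gaussian kernel; a minimal sketch assuming NumPy (the scale sigma is a free parameter, not specified in the slides):

```python
import numpy as np

def affinity_matrix(features, sigma=0.1):
    """Gaussian affinity: w_pq = exp(-||f_p - f_q||^2 / (2 sigma^2))."""
    # Pairwise squared distances between all feature vectors.
    d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))
```

Similar pixels get affinity near 1, dissimilar ones near 0; sigma controls how quickly affinity falls off with feature distance.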
Graph Cut
A graph cut partitions the nodes into two disjoint sets; the cost of the cut is the sum of the weights of the edges it removes.
(Figure: example cuts of the pixel graph)
Affinity matrix: with pixels ordered by segment, segmentation corresponds to detecting blocks in the affinity matrix.
Minimum Cut
We can do segmentation by finding the minimum cut in a graph (next lecture).
Drawback: the minimum cut favors cutting off small, isolated groups of nodes, since such cuts have lesser weight than the ideal cut.
(Figure: minimum cuts isolating single nodes vs. the ideal cut)
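The small-segment bias motivates normalizing each cut by its side's total connection to the whole graph, giving the normalized cut criterion of Shi and Malik:

```latex
\mathrm{cut}(A,B) = \sum_{p \in A,\ q \in B} w_{pq}, \qquad
\mathrm{assoc}(A,V) = \sum_{p \in A,\ q \in V} w_{pq},

\mathrm{Ncut}(A,B) = \frac{\mathrm{cut}(A,B)}{\mathrm{assoc}(A,V)}
                   + \frac{\mathrm{cut}(A,B)}{\mathrm{assoc}(B,V)}.
```

Cutting off a tiny segment makes one of the assoc terms small, so its Ncut score is large; the criterion therefore prefers balanced partitions.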
Optimization
Original problem: partition the similarity graph so as to minimize the normalized cut.
Mathematically equivalent to minimizing a Rayleigh quotient: with the indicator values constrained to two discrete labels (e.g. 1 and -1) this is a partitioning problem; relaxing the values to be continuous turns it into an embedding problem, solved by the generalized eigensystem (D - W) y = lambda D y.
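The relaxed problem can be solved numerically via the symmetric normalized Laplacian, which has the same spectrum as the generalized system after the substitution y = D^{-1/2} z; a sketch assuming NumPy:

```python
import numpy as np

def ncut_eigenvectors(W):
    """Relaxed NCut: eigenpairs of D^{-1/2} (D - W) D^{-1/2}.

    Equivalent to the generalized problem (D - W) y = lambda D y.
    Returns eigenvalues in ascending order and the y vectors as columns.
    """
    d = W.sum(axis=1)                       # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L_sym = D_inv_sqrt @ (np.diag(d) - W) @ D_inv_sqrt
    vals, Z = np.linalg.eigh(L_sym)         # eigh: ascending eigenvalues
    Y = D_inv_sqrt @ Z                      # map back: y = D^{-1/2} z
    return vals, Y
```

The second-smallest eigenvector (the first is trivial, with eigenvalue 0) carries the bipartition.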
NCuts Example
Smallest eigenvectors
NCuts segments
Discretization
Problem: eigenvectors take on continuous values
(Figure: image, eigenvector, NCut scores)
Possible procedures
a) Threshold the eigenvector, e.g. at zero or at its median
b) Try several thresholds and keep the split with the lowest NCut score
c) Cluster several eigenvectors jointly, e.g. with k-means
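A threshold sweep over the eigenvector's values can be sketched as follows; a minimal version assuming NumPy and a precomputed affinity matrix W (the function name is a choice made here):

```python
import numpy as np

def best_threshold_split(y, W):
    """Sweep thresholds over eigenvector y; keep the split with lowest NCut."""
    d = W.sum(axis=1)                       # node degrees
    best_A, best_score = None, np.inf
    # Each distinct eigenvector value (except the largest) is a candidate.
    for t in np.unique(y)[:-1]:
        A = y <= t
        B = ~A
        cut = W[A][:, B].sum()              # weight crossing the partition
        ncut = cut / d[A].sum() + cut / d[B].sum()
        if ncut < best_score:
            best_A, best_score = A.copy(), ncut
    return best_A, best_score
```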
Cons:
Solving the eigensystem is expensive for large images
Bias towards balanced partitions
Results depend on the chosen affinity measure and its parameters
Lecture Summary
Introduction
Mid-Level Vision
Gestalt Theory & Grouping
K-means & EM
Mean Shift
Normalized Cuts