
1

Perceptually Based Methods for Robust Image Hashing

Vishal Monga
Committee Members:

Prof. Ross Baldick

Prof. Brian L. Evans (Advisor)

Prof. Wilson S. Geisler

Prof. Joydeep Ghosh

Prof. John E. Gilbert

Prof. Sriram Vishwanath

Ph.D. Qualifying Exam
Communications, Networks, and Systems Area
Dept. of Electrical and Computer Engineering

The University of Texas at Austin
April 14, 2004

2

Outline

• Introduction

• Related work
  – Digital signature techniques for image authentication
  – Robust feature extraction from images
  – Open research issues

• Expected contributions
  – Framework for robust image hashing using feature points
  – Clustering algorithms for feature vector compression
  – Image authentication under geometric attacks via structure matching

• Conclusion

3

Hash Example
Introduction

• Hash function: projects a value from a set with a large (possibly infinite) number of members to a set with a fixed (smaller) number of members, in an irreversible manner
  – Provides a short, simple representation of a large digital message
  – Hash scheme: sum of the ASCII codes of the characters in a name, computed modulo N (= 7), a prime number

Database name search example

  Name         Hash value
  Ghosh        1
  Monga        2
  Baldick      3
  Vishwanath   3
  Evans        5
  Geisler      5
  Gilbert      6
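A minimal Python sketch of this toy scheme; the names and the modulus N = 7 come from the slide, though the computed values may differ slightly from the table depending on which characters are summed:

```python
def toy_hash(name: str, n: int = 7) -> int:
    """Sum of the ASCII codes of the characters in a name, modulo a prime N."""
    return sum(ord(c) for c in name) % n

for name in ["Ghosh", "Monga", "Baldick", "Vishwanath", "Evans", "Geisler", "Gilbert"]:
    print(f"{name:12s} {toy_hash(name)}")
```

Collisions (two names mapping to the same value) are expected and harmless here: the hash only narrows the database search to one bucket.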

4

Image Hashing: Motivation

Introduction

• Hash functions
  – Fixed length binary string extracted from a message
  – Used in compilers, database searching, cryptography
  – Cryptographic hash: security applications, e.g. message authentication, ensuring data integrity

• Traditional cryptographic hash
  – Not suited for multimedia: very sensitive to the input, i.e. a change in one input bit changes the output dramatically

• Need for robust perceptual image hashing

– Perceptual: based on human visual system response

– Robust: hash values for “perceptually identical” images must be the same (with a high probability)

5

Image Hashing: Motivation

• Applications
  – Image database search and indexing

– Content dependent key generation for watermarking

– Robust image authentication: hash must tolerate incidental modifications yet be sensitive to content changes

Introduction

Figure: the original image and its JPEG-compressed version map to the same hash value h1; the tampered version maps to a different hash value h2.

6

Perceptual Hash: Desirable Properties

Introduction

• Perceptual robustness:  Pr( H(I) = H(I_sim) ) ≈ 1

• Fragility to distinct inputs:  Pr( H(I) ≠ H(I_diff) ) ≈ 1

• Randomization:  Pr( H(I) = v ) ≈ 1/2^m  for all v ∈ {0, 1}^m
  – Necessary in security applications to minimize vulnerability against malicious attacks

  Symbol   Meaning
  H(I)     Hash value extracted from image I
  I_sim    Image identical in appearance to I
  I_diff   Image clearly distinct in appearance w.r.t. I
  m        Length of the hash (in bits)

7

Outline

• Introduction

• Related work
  – Digital signature techniques for image authentication
  – Robust feature extraction from images
  – Open research issues

• Expected contributions
  – Framework for robust image hashing using feature points
  – Clustering algorithms for feature vector compression
  – Image authentication under geometric attacks via structure matching

• Conclusion

8

Content Based Digital Signatures

Related Work

• Goal– Authenticate image based on extracted signature

• Image statistics based on
  – Intensity histograms of image blocks [Schneider et al., 1996]
  – Mean, variance and kurtosis of intensity values extracted from image blocks, compared to the statistics of a reference image [Kailasanathan et al., 2001]

• Drawbacks
  – Easy to modify the image without altering its intensity histogram, so the scheme is less secure

– Intensity statistics can be altered easily without significantly changing the image appearance

9

Content Based Digital Signatures…

Related Work

• Feature point based methods
  – Wavelet based corner detection [Bhattacharjee et al., 1998]
  – Canny edge detection [Dittmann et al., 1999]

– Apply public key encryption on the features to arrive at the digital signature

• Relation based methods [Lin & Chang 2001]

– Invariant relationship between discrete cosine transform (DCT) coefficients of two different blocks

• Common characteristic of the above methods
  – Work well for some attacks, viz. JPEG compression
  – Still sensitive to several incidental modifications that do not alter the image appearance

10

Robust Image Hashing: Method # 1

Related Work

• Image statistics vector from wavelet decomposition of image [Venkatesan et al., 2000]

– Averages of wavelet coefficients in coarse sub-bands and variances in other sub-bands

Pipeline: wavelet decomposition (coarse sub-band plus horizontal, vertical and diagonal detail sub-bands) → extract statistics vector and quantize, e.g. [00100101 | 01110100 | 1 … 00111001 | 001010] → error correction decoding → hash value, e.g. [00100101 … 011]
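A rough sketch of this statistics-vector stage, assuming PyWavelets for the decomposition; the randomized tiling and the error-correction decoding stage of [Venkatesan et al., 2000] are omitted here:

```python
import numpy as np
import pywt

def wavelet_stats_vector(image: np.ndarray, levels: int = 3) -> np.ndarray:
    """Average of the coarse sub-band and variances of the detail sub-bands."""
    coeffs = pywt.wavedec2(image.astype(float), "haar", level=levels)
    stats = [np.mean(coeffs[0])]                 # coarse sub-band: average
    for (ch, cv, cd) in coeffs[1:]:              # horizontal, vertical, diagonal details
        stats.extend([np.var(ch), np.var(cv), np.var(cd)])
    return np.asarray(stats)

def quantize_stats(stats: np.ndarray, bits_per_stat: int = 3) -> str:
    """Uniformly quantize each statistic to a few bits and concatenate."""
    lo, hi = stats.min(), stats.max()
    q = np.round((stats - lo) / (hi - lo + 1e-12) * (2 ** bits_per_stat - 1)).astype(int)
    return "".join(format(int(v), f"0{bits_per_stat}b") for v in q)

image = np.random.rand(256, 256)                 # stand-in for a real grayscale image
print(quantize_stats(wavelet_stats_vector(image)))
```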

11

Robust Image Hashing: Method # 2

Related Work

• Preserve magnitude of low frequency DCT coefficients [Fridrich et al., 2001]

– Survives JPEG compression, linear filtering attacks

– Very sensitive to geometric distortions (local & global)

• Randomize using a secret key K
  – Generate N random smooth patterns P(i), i = 1, …, N
  – Take the vectorized dot product of the low frequency DCT coefficients (in block B) with each random pattern and use a threshold Th to obtain N bits b_i:

    if |B · P(i)| < Th, then b_i = 0
    if |B · P(i)| ≥ Th, then b_i = 1
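A simplified sketch of this projection-and-threshold step, assuming SciPy's DCT and a key-seeded NumPy generator; the pattern smoothing and the threshold choice are illustrative stand-ins rather than the exact construction of [Fridrich et al., 2001]:

```python
import numpy as np
from scipy.fft import dctn
from scipy.ndimage import gaussian_filter

def dct_projection_bits(image, key=42, n_bits=50, lowfreq=16):
    """Project low-frequency DCT coefficients onto key-dependent smooth random
    patterns; threshold |B . P(i)| to obtain one bit per pattern."""
    rng = np.random.default_rng(key)                  # secret key K seeds the patterns
    B = dctn(image.astype(float), norm="ortho")[:lowfreq, :lowfreq]
    patterns = [gaussian_filter(rng.standard_normal(B.shape), sigma=2) for _ in range(n_bits)]
    projections = np.array([abs(np.vdot(B, P)) for P in patterns])
    th = np.median(projections)                       # stand-in for the fixed threshold Th
    return (projections >= th).astype(int)
```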

12

Robust Image Hashing: Method # 3

Related Work

• Invariance of coarse wavelet coefficients [Mihcak et al., 2001]

• Key observation
  – Main geometric features of the image stay invariant under small perturbations to the image

• Hash algorithm
  – Threshold the wavelet coefficients of the DC sub-band (coarse robust features) to obtain a binary matrix
  – Perform filtering and re-thresholding to iteratively arrive at a binary map, which is then used as the hash
  – The iterative procedure is designed so as to preserve significant image geometry

Figure: 3-level Haar wavelet decomposition with the DC sub-band highlighted.
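A compact sketch of this thresholding-and-filtering iteration, assuming PyWavelets and simple median/averaging filters; the parameter values are illustrative rather than those of [Mihcak et al., 2001]:

```python
import numpy as np
import pywt
from scipy.ndimage import median_filter, uniform_filter

def coarse_wavelet_hash(image, levels=3, iters=10):
    """Threshold the DC sub-band of a Haar decomposition, then filter and
    re-threshold so the surviving binary map follows significant image geometry."""
    dc = pywt.wavedec2(image.astype(float), "haar", level=levels)[0]
    binary = (dc >= np.median(dc)).astype(float)
    for _ in range(iters):
        smoothed = uniform_filter(median_filter(binary, size=3), size=3)
        binary = (smoothed >= 0.5).astype(float)
    return binary.astype(np.uint8)       # binary map used as the hash
```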

13

Robust Digital Signature: Method # 4

Related Work

• Interscale relationship of wavelet coefficients [Lu & Liao, 2003]
  – The magnitude difference between a parent node and its four child nodes is difficult to destroy (alter) under content-preserving manipulations:

      |w_{s,o}(x, y)| − |w_{s+1,o}(2x + i, 2y + j)|

  – s – wavelet scale, o – orientation, 0 ≤ i, j ≤ 1

Figure: 2-D wavelet decomposition tree; parent w_{0,0}(x, y) with children w_{1,0}(2x, 2y), w_{1,1}(2x+1, 2y), w_{1,2}(2x, 2y+1), w_{1,3}(2x+1, 2y+1)

14

Open Issues

Related Work

• A robust feature point scheme for hashing
  – Inherent sensitivity to content-changing manipulations, e.g. could be useful in authentication
  – Representation of image content robust to both global and local geometric distortions
  – Preferably use properties of the human visual system

• Trade-offs in image hashing
  – Robustness vs. fragility, randomness
  – Question: minimum length of the final hash value (binary string) needed to meet the above goals?

• Randomized algorithms for secure image hashing


15

Outline

• Introduction

• Related work
  – Digital signature techniques for image authentication
  – Robust feature extraction from images
  – Open research issues

• Expected contributions
  – Framework for robust image hashing using feature points
  – Clustering algorithms for feature vector compression
  – Image authentication under geometric attacks via structure matching

• Conclusion

16

Hashing Framework

Expected Contribution #1

• Proposed two-stage hash algorithm

  Input image I → Extract visually robust feature vector → Clustering of similar feature vectors → Compression → Final hash

• Feature vectors extracted from "perceptually identical" images must be close in a distance metric

17

Hypercomplex or End-stopped Cells

• Cells in the visual cortex that help in object recognition

• Respond strongly to line end-points, corners and points of high curvature [Hubel et al. 1965, Dobbins 1989]

• Develop filters/kernels that capture this behavior

• To maintain robustness to changes in image resolution, a wavelet based approach is needed

“End-stopping and Image Geometry”, Dobbins, 1989

18

End-Stopped Wavelet Basis

• Morlet wavelets [Antoine et al., 1996]

– To detect linear (or curvilinear) structures having a specific orientation

• End-stopped wavelet [Vandergheynst et al., 2000]

– Apply First Derivative of Gaussian (FDoG) operator to detect end-points of structures identified by Morlet wavelet

    ψ_M(x) = ( exp(j k_o · x) − exp(−|k_o|²/2) ) exp(−|x|²/2)

  x – (x, y), 2-D spatial co-ordinates
  k_o – (k_0, k_1), wave-vector of the mother wavelet
  Orientation control: θ = tan⁻¹(k_1/k_0)
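A sketch of how such a kernel can be built numerically, assuming the Morlet expression above and approximating the FDoG step by a derivative-of-Gaussian filter along the direction perpendicular to the wavelet orientation (the y axis for θ = 0):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def morlet_kernel(size=33, k0=3.0, theta=0.0):
    """2-D Morlet wavelet with wave-vector k_o = k0 * (cos(theta), sin(theta))."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1] / (half / 3.0)   # grid over ~[-3, 3]
    kx, ky = k0 * np.cos(theta), k0 * np.sin(theta)
    carrier = np.exp(1j * (kx * x + ky * y)) - np.exp(-0.5 * (kx ** 2 + ky ** 2))
    return carrier * np.exp(-0.5 * (x ** 2 + y ** 2))

def end_stopped_kernel(size=33, k0=3.0, sigma=1.0):
    """FDoG along y applied to a Morlet wavelet oriented along x (theta = 0)."""
    m = morlet_kernel(size, k0, theta=0.0)
    fdog = lambda a: gaussian_filter1d(a, sigma=sigma, axis=0, order=1)
    return fdog(m.real) + 1j * fdog(m.imag)
```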

19

End-Stopped Wavelets…Example

• Morlet wavelet along the u-axis
  – Detects vertically oriented linear structures

• FDoG operator along the frequency axis v
  – Applied on the Morlet wavelet to detect end-points and corners

Figures: synthetic L-shaped image; response of the Morlet wavelet (orientation = 0 degrees); response of the end-stopped wavelet

20

Computing Wavelet Transform

• Generalize the end-stopped wavelet

    ψ_E(x) = FDoG( ψ_M(x; θ) )

• Employ the wavelet family

    { ψ_E( 2^{-i}(x, y), θ_k ) },  i ∈ Z

  – Scale parameter = 2, i – scale of the wavelet
  – Discretize the orientation range [0, π] into M intervals, i.e. θ_k = kπ/M, k = 0, 1, …, M − 1

• Finally, the wavelet transform is given by

    W_i(x, y, θ) = ∫∫ I(x_1, y_1) ψ*_E( 2^{-i}(x − x_1, y − y_1), θ ) dx_1 dy_1

Expected Contribution #1

21

Proposed Feature Detection Method [Monga & Evans, 2004]

1. Compute wavelet transform at suitably chosen scale i for several different orientations

2. Significant feature selection: locations (x, y) in the image that are identified as candidate feature points satisfy

    | W_i(x, y, θ) | = max_{(x', y') ∈ N(x, y)} | W_i(x', y', θ) |

3. Avoid trivial (and fragile) features: qualify a location as a final feature point if

    max_θ | W_i(x, y, θ) | > T

Expected Contribution #1

• Randomization: Partition the image into N random regions using a secret key K, extract features from each random region

• Probabilistic Quantization: Quantize feature vector based on distribution (histogram) of image feature points to enhance robustness
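A sketch of steps 1–3, assuming a bank of oriented (e.g. end-stopped) kernels and a square search neighborhood in place of the circular one used in the experiments; the default threshold T is purely illustrative:

```python
import numpy as np
from scipy.ndimage import maximum_filter
from scipy.signal import fftconvolve

def detect_features(image, kernels, neighborhood=11, T=None):
    """Feature points: local maxima of the wavelet response magnitude (over all
    orientations) that also exceed a global threshold T."""
    responses = [np.abs(fftconvolve(image, k, mode="same")) for k in kernels]
    strength = np.max(responses, axis=0)                     # max over orientations
    if T is None:
        T = np.percentile(strength, 99)                      # illustrative choice of T
    local_max = maximum_filter(strength, size=neighborhood)  # neighborhood maximum
    ys, xs = np.where((strength == local_max) & (strength > T))
    return list(zip(xs, ys))
```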

22

Iterative Feature Extraction Algorithm [Monga & Evans, 2004]

1. Extract a feature vector f of length P from image I; quantize f probabilistically to obtain a binary string b_f1 (increase count*)

2. Remove "weak" image geometry: compute 2-D order statistics (OS) filtering of I to produce I_os = OS(I; p, q, r)

3. Preserve "strong" image geometry: perform low-pass linear shift invariant (LSI) filtering on I_os to obtain I_lp

4. Repeat step 1 with I_lp to obtain b_f2

5. IF (count = MaxIter) go to step 6
   ELSE IF D(b_f1, b_f2) < ρ go to step 6
   ELSE set I = I_lp and go to step 1

6. Set fv(I) = b_f2

Expected Contribution #1

MaxIter, ρ and P are algorithm parameters. * count = 0 to begin with

fv(I) denotes quantized feature vector

D(.,.) – normalized Hamming distance between its arguments
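A skeleton of this loop, assuming median filtering for the order-statistics step, a Gaussian low pass for the LSI step, and a hypothetical extract_and_quantize function standing in for step 1:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter

def normalized_hamming(a: np.ndarray, b: np.ndarray) -> float:
    """D(., .): fraction of positions where two binary strings differ."""
    return float(np.mean(a != b))

def iterative_feature_vector(I, extract_and_quantize, max_iter=20, rho=0.001):
    """Iterate OS filtering + LSI low-pass filtering until the quantized feature
    vector stabilizes (D(bf1, bf2) < rho) or MaxIter is reached."""
    for _ in range(max_iter):
        bf1 = extract_and_quantize(I)
        I_os = median_filter(I, size=5)            # remove "weak" image geometry
        I_lp = gaussian_filter(I_os, sigma=2.0)    # preserve "strong" image geometry
        bf2 = extract_and_quantize(I_lp)
        if normalized_hamming(bf1, bf2) < rho:
            break
        I = I_lp
    return bf2                                     # fv(I)
```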

23

Preliminary Results: Feature Extraction

Expected Contribution #1

Figures: image features at algorithm convergence for the original image, the JPEG (QF = 10) version, and the AWGN (σ = 20) version

24

Preliminary Results: Feature Extraction

• Quantized feature vector comparison
  D(fv(I), fv(Isim)) < 0.2
  D(fv(I), fv(Idiff)) > 0.3

  *Attack                 Lena   Bridge   Peppers
  JPEG, QF = 10           0.04   0.04     0.06
  AWGN, σ = 20            0.04   0.03     0.02
  Contrast enhancement    0      0.06     0.04
  Gaussian smoothing      0.01   0.03     0.05
  Median filtering        0.02   0.03     0.07
  Scaling by 50%          0.08   0.14     0.11
  Rotation by 2°          0.12   0.15     0.14
  Rotation by 5°          0.18   0.20     0.19
  Cropping by 10%         0.12   0.13     0.15
  Cropping by 20%         0.21   0.22     0.24

Table 1. Comparison of quantized feature vectors

Normalized Hamming distance between quantized feature vectors of original and attacked images

*Attacked images generated by Stirmark benchmark software

Expected Contribution #1

25

Preliminary Results: Feature Extraction

Expected Contribution #1

  Attack                         Mihcak et al.             Fridrich et al.          Proposed feature
                                 (thresholding of coarse   (preserve low-freq.      point detector
                                 wavelet coefficients)     DCT coefficients)
  JPEG, QF = 10                  YES                       YES                      YES
  AWGN, σ = 20                   YES                       NO                       YES
  Gaussian smoothing             YES                       YES                      YES
  Median filtering               YES                       NO                       YES
  Scaling 50%                    YES                       YES                      YES
  Rotation 2 degrees             YES                       NO                       YES
  Cropping 10%                   YES                       NO                       YES
  Cropping 20%                   YES                       NO                       NO
  *Small object addition         NO                        YES                      NO
  *Tamper with facial features   YES                       YES                      NO

YES = survives the attack, i.e. the hash was invariant
*Content changing manipulations; should be detected

26

Highlights

Expected Contribution # 1

• Framework for image hashing using feature points
  – Two stage hash algorithm
  – Any visually robust feature point detector is a good candidate to be used with the iterative algorithm

• Trade-offs facilitated
  – Robustness vs. fragility: select feature points such that

      T_1 ≤ max_θ | W_i(x, y, θ) | ≤ T_2

    T_1, T_2 large enough ensures that features are retained in several attacked versions of the image, else removed easily
  – Robustness vs. randomization: number of random regions N. Until N < N_max, robustness is largely preserved; beyond that the random regions shrink to the extent that they do not contain significant chunks of image geometry

27

Feature Vector Compression

Expected Contribution # 2

• Goals in compressing to a final hash value
  – Cancel small perturbations between feature vectors of "perceptually identical" images
  – Maintain fragility to distinct inputs
  – Retain and/or enhance randomness properties for secure hashing

• Problem statement: retain perceptual significance
  – Let (l_i, l_j) denote vectors in the metric space V of feature vectors and 0 < ε < δ; then it is desired that

      if D(l_i, l_j) < ε then C(l_i) = C(l_j)    (1)
      if D(l_i, l_j) > δ then C(l_i) ≠ C(l_j)    (2)

28

Possible Solutions

• Error correction decoding [Venkatesan et al., 2000]
  – Applicable to binary feature vectors
  – Break the vector down into segments close to the length of codewords in a suitably chosen error-correcting code

• More generally, vector quantization/clustering
  – Minimize an "average distance" to achieve compression close to the rate distortion limit

      min Σ_{k=0}^{K−1} Σ_{l ∈ S_k} P(l) D(l, c_k)

  – P(l) – probability of occurrence of vector l; D(.,.) – distance metric defined on the feature vectors
  – c_k – codewords/cluster centers; S_k – kth cluster

Expected Contribution # 2

29

Is Average Distance the Appropriate Cost for the Hashing Application?

• Problems with average distance VQ
  – No guarantee that "perceptually distinct" feature vectors indeed map to different clusters; no straightforward way to trade off between the two goals
  – Must decide the number of codebook vectors in advance
  – Must penalize some errors harshly, e.g. when vectors that are really close are not clustered together, or vectors very far apart are compressed to the same final hash value

• Define an alternate cost function for hashing
  – Develop a clustering algorithm that tries to minimize that cost

Expected Contribution # 2

30

Cost Function for Feature Vector Compression

• Define joint cost matrices C1 and C2 (n x n)
  – n – total number of vectors to be clustered; C(l_i), C(l_j) denote the clusters that these vectors are mapped to

    c_1(i, j) = Γ^{−α D(l_i, l_j)}  if D(l_i, l_j) < ε and C(l_i) ≠ C(l_j),  0 otherwise

    c_2(i, j) = Γ^{α D(l_i, l_j)}   if D(l_i, l_j) > δ and C(l_i) = C(l_j),  0 otherwise

• Exponential cost
  – Ensures that a severe penalty is associated if feature vectors that are far apart, and hence "perceptually distinct", are clustered together

Expected Contribution # 2

α > 0, Γ > 1 are algorithm parameters

31

Cost Function for Feature Vector Compression

Expected Contribution # 2

• Further define S1 as

    S1 = Σ_i Σ_j s_1(i, j),  where  s_1(i, j) = Γ^{−α D(l_i, l_j)} if D(l_i, l_j) < ε, 0 otherwise

  (*S2 is defined similarly from c_2, with the condition D(l_i, l_j) > δ)

• Normalize to get  c̄_1(i, j) = c_1(i, j) / S1  and  c̄_2(i, j) = c_2(i, j) / S2

• Then, minimize the "expected" cost

    E[ C̄1 + C̄2 ] = Σ_i Σ_j p(i) p(j) [ c̄_1(i, j) + c̄_2(i, j) ]

  – p(i) = p(l_i), p(j) = p(l_j)
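A small sketch that evaluates this cost for a candidate clustering. The exponential forms of c1 and c2 below follow the reconstruction above and should be read as placeholders, since the exact exponents are not fully legible in the slides:

```python
import numpy as np

def expected_cost(D, p, labels, eps, delta, alpha=1.0, gamma=2.0):
    """E[C1_bar + C2_bar]: penalize close pairs that were split apart (c1) and far
    pairs that were clustered together (c2), each normalized by s1/s2."""
    n = len(p)
    c1 = c2 = s1 = s2 = 0.0
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if D[i, j] < eps:                       # "perceptually identical" pair
                w = gamma ** (-alpha * D[i, j])
                s1 += w
                if labels[i] != labels[j]:
                    c1 += p[i] * p[j] * w
            elif D[i, j] > delta:                   # "perceptually distinct" pair
                w = gamma ** (alpha * D[i, j])
                s2 += w
                if labels[i] == labels[j]:
                    c2 += p[i] * p[j] * w
    return c1 / max(s1, 1e-12) + c2 / max(s2, 1e-12)
```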

32

Image Authentication Under Geometric Attacks

Expected Contribution # 3

• Basic premise
  – Feature points of a reference image and a geometrically attacked image are related by a suitable transformation
  – An affine transformation models the geometric distortion:

      y = A(x) = R x + t

    x = (x_1, x_2), y = (y_1, y_2); R – 2 x 2 matrix, t – 2 x 1 vector

• Hausdorff distance to compare feature points from two images [Atallah, 1983; Rote, 1991]
  – Used in computer vision for locating objects in an image
  – Relatively insensitive to perturbations in feature points; can tolerate errors due to occlusion or feature detector failure

33

Image Authentication Under Geometric Attacks

Expected Contribution # 3

• Hausdorff distance between point sets A and B
  – A = {a_1, …, a_p} and B = {b_1, …, b_q}

      H(A, B) = max( h(A, B), h(B, A) ),  where  h(A, B) = max_{a ∈ A} min_{b ∈ B} || a − b ||

  – Measures the degree of mismatch between the two sets

• Employ structure matching algorithms [Huttenlocher et al. 1993, Rucklidge 1995]
  – To determine G such that

      G = arg min_{A: affine transform} H( A(f_r), f_c )

  – Here, f_r and f_c denote the feature point sets from the reference image and the candidate image to be authenticated
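A direct NumPy sketch of these two definitions, with feature point sets stored as arrays of (x, y) rows; the small rotation-plus-translation in the usage example is only for illustration:

```python
import numpy as np

def directed_hausdorff(A: np.ndarray, B: np.ndarray) -> float:
    """h(A, B) = max over a in A of the distance to the nearest b in B."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)   # all pairwise distances
    return float(d.min(axis=1).max())

def hausdorff(A: np.ndarray, B: np.ndarray) -> float:
    """H(A, B) = max( h(A, B), h(B, A) )."""
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

f_ref = np.array([[10.0, 12.0], [40.0, 35.0], [70.0, 20.0]])
R = np.array([[0.98, -0.17], [0.17, 0.98]])          # ~10 degree rotation
f_cand = f_ref @ R.T + np.array([3.0, -2.0])         # affine model: y = R x + t
print(hausdorff(f_ref, f_cand))
```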

34

Conclusion & Future Work

Conclusion

• Feature point based hashing framework
  – Iterative feature detector that preserves significant image geometry; features invariant under several attacks
  – Trade-offs facilitated between hash algorithm goals

• Algorithms for feature vector compression
  – Novel cost function for the hashing application
  – Heuristic clustering algorithm(s) to minimize this cost
  – Randomized clustering for secure hashing

• Image authentication under geometric attacks
  – Affine transformation to model geometric distortions
  – Hausdorff distance and structure matching algorithms to determine the affine transformation and authenticate

35

Proposed Schedule

Conclusion

  Semester       Work Plan
  Summer 2004    Perform extensive tests on the feature extraction algorithm; implement the solution to stage 1
  Fall 2004      Develop and finalize the clustering algorithm for feature vector compression; compare with other approaches, viz. error correction decoding
  Spring 2005    Finalize the design and implement the scheme for image authentication under geometric attacks
  Summer 2005    Implement the two-step hash algorithm
  Fall 2005      Write and defend dissertation

36

Backup Slides

37

Hash: Illustrative Example
Introduction

• Parsing in compiling a program

• Variable names kept in a data structure
  – Array of pointers, each pointer points to a linked list
  – Index into the array is a hash value

• Example: variable name "university"
  – Hashing scheme: sum of the ASCII codes of the characters in a variable name, computed modulo N, a prime number
  – Check the linked list at the array index; add the string to the linked list if it has not been previously parsed

38

End-Stopped Wavelets…Example

Expected Contribution #1

• Morlet wavelet along the u-axis
• FDoG operator along the frequency axis v

  Spatial domain:    ψ_E(x, y) ∝ y · exp( −(x² + y²)/4 ) · exp( j k_0 x )
  Frequency domain:  ψ̂_E(u, v) ∝ j v · exp( −((u − k_0)² + v²)/2 ) · exp( −(u² + v²)/2 )

Figures: synthetic L-shaped image; response of the Morlet wavelet (orientation = 0 degrees); response of the end-stopped wavelet

39

Content Changing Manipulations
Feature Detection

Figures: original image; maliciously manipulated image

40

Algorithm Parameters
Results

• Image conditioning
  – All images resized to 512 x 512 via triangular interpolation prior to feature extraction
  – Intensity planes of color images were used

• Pixel neighborhood
  – Circular, to detect isotropic features
  – Radius of 5 pixels

• Iterative feature extraction
  – Wavelet scale i = 3
  – MaxIter = 20, ρ = 0.001, P = 128
  – LSI filter: zero-phase low pass filter (11 x 11) designed using McClellan transformations
  – Order statistics filtering: median with 5 x 5 window

41

Experimental Results
Feature Detection

Figures: AWGN, σ = 20; 90 degree rotation

42

Trade-offs

Expected Contribution # 1

• Perceptual robustness vs. fragility
  – Size of the search neighborhood: a larger neighborhood gives feature points that are more robust
  – Select feature points such that

      T_1 ≤ max_θ | W_i(x, y, θ) | ≤ T_2

  – T_1, T_2 large enough implies features are retained in several attacked versions of the image, else removed easily

• Robustness vs. randomization
  – Until N < N_max, robustness is largely retained; beyond that the random regions shrink to the extent that they do not contain significant chunks of image geometry

43

Relation Based Scheme: DCT Coefficients

Digital Signature Techniques

• Discrete Cosine Transform (DCT)
  – Typically employed on 8 x 8 blocks

      B(k_1, k_2) = 4 Σ_{i=0}^{N−1} Σ_{j=0}^{N−1} I(i, j) cos( π k_1 (2i + 1) / 2N ) cos( π k_2 (2j + 1) / 2N )

• Digital signature by Lin
  – F_p, F_q – DCT coefficients at the same positions in two different 8 x 8 blocks
  – F̂_p, F̂_q – DCT coefficients in the compressed image
  – Invariant relationship:  F_p − F_q ≥ 0  ⟺  F̂_p − F̂_q ≥ 0

Figure: two 8 x 8 blocks p and q in an N x N image
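A quick numerical check of this invariant, assuming SciPy's DCT and one common quantization step applied to both blocks (as with a shared JPEG quantization table); because rounding is monotonic, the ordering of F_p and F_q survives quantization:

```python
import numpy as np
from scipy.fft import dctn

rng = np.random.default_rng(0)
block_p = rng.integers(0, 256, (8, 8)).astype(float)
block_q = rng.integers(0, 256, (8, 8)).astype(float)

Fp, Fq = dctn(block_p, norm="ortho"), dctn(block_q, norm="ortho")
Q = 20.0                                             # common quantization step
Fp_hat, Fq_hat = np.round(Fp / Q) * Q, np.round(Fq / Q) * Q

mask = (Fp - Fq) >= 0
print(np.all(Fp_hat[mask] - Fq_hat[mask] >= 0))      # True: the relation is preserved
```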

44

Multi-Resolution Approximations

Wavelet Decomposition

45

46

Examples of Perceptually Identical Images

Wavelet Decomposition

Figures: original image; contrast enhanced; JPEG, QF = 10; 10% cropping; 3 degree rotation; 2 degree rotation


47

Iterative Hash Algorithm

Expected Contribution # 1

Figure: input image → extract feature vector → probabilistic quantization; then order statistics filtering → linear shift invariant low-pass filtering → extract feature vector → probabilistic quantization; iterate until D(b1, b2) < ρ

48

Probabilistic Quantization

Quantization

• Feature vector
  – f_mn = m + H·n

• Quantization scheme
  – L quantization levels
  – Design quantization bins [l_{i−1}, l_i) such that

      ∫_{l_{i−1}}^{l_i} p_f(x) dx = 1/L,   1 ≤ i ≤ L

  – Quantization rule: if l_{i−1} ≤ f(k) < l_i, then q(f(k)) = i
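A sketch of this equal-probability binning, assuming empirical quantiles of the observed feature values stand in for the distribution p_f(x):

```python
import numpy as np

def probabilistic_quantize(f: np.ndarray, L: int = 8) -> np.ndarray:
    """Design bins [l_{i-1}, l_i) that each hold ~1/L of the probability mass of f,
    then map every feature value to its bin index."""
    edges = np.quantile(f, np.linspace(0.0, 1.0, L + 1))
    edges[0], edges[-1] = -np.inf, np.inf            # catch boundary values
    return np.digitize(f, edges[1:-1])               # bin index in {0, ..., L-1}

f = np.random.gamma(2.0, 3.0, size=128)              # skewed stand-in feature vector
print(np.bincount(probabilistic_quantize(f), minlength=8))   # roughly equal counts
```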

49

Feature Vector Extraction
Feature Detection

• Randomization
  – Partition the image into N regions using k-means segmentation; extract feature points from each region
  – The secret key K is used to generate the initial guesses for the clusters (centroids of the random regions)
  – Avoid very small regions, since they would not yield robust image features

50

Preliminary Results

Expected Contribution #1

  Attack                         Mihcak et al.             Proposed feature
                                 (thresholding of coarse   point detector
                                 wavelet coefficients)
  JPEG, QF = 10                  0.01                      0.04
  AWGN, σ = 20                   0.03                      0.04
  Gaussian smoothing             0.00                      0.01
  Median filtering               0.04                      0.02
  Scaling 50%                    0.02                      0.08
  Rotation 2 degrees             0.09                      0.12
  Cropping 10%                   0.12                      0.14
  Cropping 20%                   0.16                      0.24
  *Small object addition         0.17                      0.54
  *Tamper with facial features   0.14                      0.42

Table 1. Comparison of quantized feature vectors

Normalized Hamming distance between quantized feature vectors of original and attacked images

51

Minimizing the Cost

Clustering Algorithms

• Decision version of the clustering problem
  – For a fixed number of clusters k, is there a clustering with cost less than a constant?
  – Shown to be NP-complete via a reduction from the k-way graph cut problem [Monga et al., 2004]

• Polynomial time greedy heuristic to solve the problem
  – Select cluster centers based on the probability mass of the vectors in V; minimize error probabilities in a rigorous sense
  – Trade-offs: exclusive minimization of E(C̄1) would compromise E(C̄2), and vice-versa
  – Basic algorithm with variations to facilitate trade-offs

52

Basic Clustering Algorithm

Clustering Algorithms

1. Obtain ε, δ; set k = 1. Select the data point associated with the highest probability mass and label it l_1

2. Make the first cluster by including all unclustered points l_j such that D(l_1, l_j) < ε/2

3. k = k + 1. Select the highest probability data point l_k amongst the unclustered points such that

     min_{S ∈ C} D(l_k, S) ≥ 3ε/2

   where S is any cluster, C is the set of clusters formed till this step, and D(x, S) = min_{y ∈ S} D(x, y)

4. Form the kth cluster S_k by including all unclustered points l_j such that D(l_k, l_j) < ε/2

5. Repeat steps 3–4 till no more clusters can be formed
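A sketch of this greedy pass, assuming a precomputed distance matrix D and probability masses p over the n vectors; points left unclustered here are handled by Approaches 1 and 2 on the later slides:

```python
import numpy as np

def basic_clustering(D: np.ndarray, p: np.ndarray, eps: float):
    """Repeatedly pick the highest-probability unclustered point that is at least
    3*eps/2 from every existing cluster and grow an eps/2 ball around it."""
    n = len(p)
    labels = -np.ones(n, dtype=int)                  # -1 means "unclustered"
    k = 0
    while True:
        candidates = [i for i in np.argsort(-p)
                      if labels[i] < 0 and
                      all(D[i, labels == c].min() >= 1.5 * eps for c in range(k))]
        if not candidates:
            break
        center = candidates[0]                       # highest probability mass
        labels[(labels < 0) & (D[center] < eps / 2)] = k
        k += 1
    return labels, k
```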

53

Visualization of the Clustering Algorithm

Clustering Algorithms

54

Observations

Clustering Algorithms

• For any (l_i, l_j) in cluster S_k

    D(l_i, l_j) ≤ D(l_i, l_k) + D(l_k, l_j) < ε/2 + ε/2 = ε

• No errors till this stage of the algorithm
  – Each cluster is at least ε away from any other cluster, hence there are no errors by violating (1)
  – Within each cluster the maximum distance between any two points is at most ε, and because 0 < ε < δ there are no errors by violation of (2)
  – The data points that are left unclustered are at least 3ε/2 away from each of the existing clusters

• Next
  – Two different approaches to handle the unclustered points

55

Hashing Framework
Expected Contribution #1

• Two-stage hash algorithm

  Input image I → Extract visually robust feature vector → Compress features → Final hash value

  Feature vectors extracted from "perceptually identical" images must be close in a distance metric

56

Approach 1

Clustering Algorithms

1. Select the data point l* amongst the unclustered data points that has the highest probability mass

2. For each existing cluster S_i, i = 1, 2, …, k compute

     d_i = max_{x ∈ S_i} D(l*, x)

   Let S(δ) = { S_i such that d_i ≤ δ }

3. IF S(δ) = {Φ} THEN k = k + 1 and S_k = {l*} is a cluster of its own
   ELSE for each S_i in S(δ) define

     F(S_i) = Σ_{l ∈ S̄_i} p(l*) p(l) c_1(l*, l)

   where S̄_i denotes the complement of S_i, i.e. all clusters in S(δ) except S_i. Then l* is assigned to the cluster S* = arg min F(S_i)

4. Repeat steps 1 through 3 till all data points are exhausted

57

Approach 2

Clustering Algorithms

1. Select the data point l* amongst the unclustered data points that has the highest probability mass

2. For each existing cluster S_i, i = 1, 2, …, k define

     F(S_i) = β Σ_{l ∈ S̄_i} p(l*) p(l) c_1(l*, l) + (1 − β) Σ_{l ∈ S_i} p(l*) p(l) c_2(l*, l)

   where β lies in [1/2, 1] and S̄_i denotes the complement of S_i, i.e. all existing clusters except S_i. Then l* is assigned to the cluster S* = arg min F(S_i)

3. Repeat steps 1 and 2 till all data points are exhausted
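A sketch of this assignment rule, assuming hypothetical cost functions c1(i, j) and c2(i, j) consistent with the earlier definitions; Approach 1 corresponds to first restricting the candidate clusters to those within δ of l* and keeping only the c1 term:

```python
import numpy as np

def assign_leftovers(D, p, labels, k, c1, c2, beta=0.5):
    """Approach 2: assign each unclustered point (highest probability first) to the
    cluster minimizing F(S_i) = beta * sum_{l not in S_i} p p c1 + (1 - beta) * sum_{l in S_i} p p c2."""
    clustered = np.where(labels >= 0)[0]
    for i in sorted(np.where(labels < 0)[0], key=lambda idx: -p[idx]):
        costs = []
        for s in range(k):
            inside = clustered[labels[clustered] == s]
            outside = clustered[labels[clustered] != s]
            F = beta * sum(p[i] * p[j] * c1(i, j) for j in outside) \
                + (1.0 - beta) * sum(p[i] * p[j] * c2(i, j) for j in inside)
            costs.append(F)
        labels[i] = int(np.argmin(costs))            # S* = arg min F(S_i)
        clustered = np.append(clustered, i)
    return labels
```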

58

Summary

Clustering Algorithms

• Approach 1
  – Tries to minimize E(C̄1) conditioned on E(C̄2) = 0

• Approach 2
  – Smoothly trades off the minimization of E(C̄1) vs. E(C̄2) via the parameter β
  – β = ½ : joint minimization
  – β = 1 : exclusive minimization of E(C̄1)

• Final hash length determined automatically!
  – Given by ⌈log2 k⌉ bits, where k is the total number of clusters formed
  – The proposed clustering can be used to compress feature vectors in any metric space, e.g. Euclidean, Hamming

59

Randomized Clustering for Secure Hashing

Clustering Algorithms

• Heuristic for the deterministic map
  – Select the highest probability data point amongst the unclustered data points

• Randomization scheme
  – Normalize the probabilities of the existing unclustered data points to define a new probability mass

      β_i^(s) = (p_i)^s / Σ_j (p_j)^s

    where i and j run over the unclustered points and s ≥ 0 is a tuning parameter
  – Employ a uniformly distributed random variable in [0, 1] (generated via a secret key) to select the data point i as a cluster center with probability β_i^(s)
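A sketch of this keyed selection, assuming the exponent s and a secret key that seeds the random number generator:

```python
import numpy as np

def pick_center(p_unclustered: np.ndarray, s: float, key: int) -> int:
    """Select the next cluster center with probability proportional to p_i^s:
    s = 0 gives a uniform choice, large s approaches the deterministic rule."""
    rng = np.random.default_rng(key)                 # secret key drives the randomness
    weights = p_unclustered ** s
    beta = weights / weights.sum()
    return int(rng.choice(len(beta), p=beta))

print(pick_center(np.array([0.5, 0.25, 0.125, 0.125]), s=1.0, key=7))
```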

60

Randomized Clustering: Illustration

Clustering Algorithms

• Example: s = 1
  – 4 data points with probabilities 0.5, 0.25, 0.125, 0.125
  – Cumulative intervals 0 – 0.5 – 0.75 – 0.875 – 1: a uniform random number in [0, 1] selects the corresponding data point

• Key observations
  – s = 0: β^(0) is uniform, i.e. any point is selected as the cluster center with the same probability
  – s = ∞: deterministic clustering, i.e. β_i^(∞) = 1 for the highest probability point and 0 otherwise

61

Clustering: Results

Clustering Algorithms

• Compress a binary feature vector of L = 240 bits
  – Final hash length = 46 bits, with Approach 2, β = 1/2

• *Average distance VQ at the same rate
  – The value of the cost function is orders of magnitude lower for the proposed clustering

  Clustering Algorithm     E(C̄1)          E(C̄2)
  Approach 1               7.64 × 10^-8    0
  Approach 2, β = ½        7.43 × 10^-9    7.464 × 10^-10
  Approach 2, β = 1        7.17 × 10^-9    4.87 × 10^-9
  *Average distance VQ     5.96 × 10^-4    3.65 × 10^-5

62

Conclusion & Future Work

Clustering Algorithms

• Perceptual image hashing via feature points
  – Extract feature points that preserve significant image geometry
  – Based on properties of the human visual system (HVS)
  – Robust to local and global geometric distortions

• Clustering algorithms for compression
  – Randomized to minimize vulnerability against malicious attacks generated by an adversary
  – Trade-offs facilitated between robustness and randomness, fragility

• Future work
  – Authentication under geometric attacks
  – Information theoretically secure hashing

63

Perceptual Image Hashing via Feature Points
Image Hashing via Feature Points

• Feature points are required to be invariant across "perceptually identical" images
  – Primary geometric features of the image are largely preserved under small perturbations [Mihcak et al., 2001]
  – i.e. extract feature points that preserve significant image geometry
  – Identify what the human eye perceives as "robust" or "invariant" geometric features

• Edge based detection is not suited
  – Has problems with high compression ratios, quantization and scaling [Zheng and Chellappa, 1993]
  – Human recognition performance is not impeded even when much edge information is lost [Biederman, 1987]

64

ES2 Wavelet
End-stopping and Image Features

• Example wavelets
  – SDoG operator applied to the Morlet wavelet

• Wavelet behavior
  – Produces a strong response at the center of any oriented linear stimulus of a particular length determined by σ

Spatial-domain expression ψ_E2(x, y) and frequency-domain expression ψ̂_E2(u, v) of the ES2 wavelet

65

Clustering: Dependence on the Source Distribution

Clustering Algorithms

• Source distributions may be very "skewed"
  – Trivial clusters may be formed, i.e. clusters that include very low probability points
  – For efficient compression, the number of clusters formed should accurately represent the statistics of the source

• Solution
  – Consider the algorithm when m clusters have been formed, m < k, and i < n points have already been clustered
  – Assign the remaining points, i.e. {i + 1, …, n}, to the existing clusters in a fashion similar to the basic algorithm
  – Compare the expected cost of this clustering vs. the one with k clusters as formed by the algorithm described before; if the increase is not significant, terminate with the current number of clusters
