Fusion here, there and almost EVERYWHERE in computer vision - driving new advances in fuzzy integrals

Derek T. Anderson
Associate Professor, Robert D. Guyton Chair
Co-Director of Sensor Analysis and Intelligence Laboratory (SAIL)
Electrical and Computer Engineering Department
Mississippi State University, MS, USA

FUZZ-IEEE, July 2017

Outline: Goal, Data/Information fusion, Computer vision, Fuzzy integral, Examples, Conclusion



What I hope to achieve today

Convince you that
- fusion is needed EVERYWHERE in computer vision (CV)
- fuzzy integrals (FI) are a flexible tool for numerous CV challenges
- Most importantly ... CV presents new, interesting theoretical and applied challenges in fusion that need solving

Disclaimer: focus is on (1) fusion AND (2) CV, not just CV
Resource: open source Octave/Matlab FI and CV library
http://derektanderson.com/FuzzyLibrary/


“Fusion” in a nutshell


Big picture


Where does fusion fit in?


Fuzzy measure

What is the question? e.g., wittiness


Fuzzy measure

Discrete (finite X) fuzzy measure (FM)

Let X = {x1, x2, . . . , xN} be a set of N inputs from sources such as experts, algorithms and/or sensors. A FM is a monotonic function defined on the power set of X, 2^X, as µ : 2^X → R+, that satisfies the following:

- (boundary condition) µ(∅) = 0,
- (monotonicity) if A, B ⊆ X and A ⊆ B, then µ(A) ≤ µ(B).

Remark: Often, an additional constraint is imposed on the FM in practice to limit the upper bound to 1, i.e., µ(X) = 1.
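The two FM conditions above are easy to check mechanically. Below is a minimal Python sketch (not from the author's Octave/Matlab library; the measure values are hypothetical, chosen only for illustration) that stores a FM as a dict from subsets to values and verifies the boundary and monotonicity conditions:

```python
from itertools import chain, combinations

def powerset(X):
    """All subsets of X, as frozensets."""
    return [frozenset(s) for s in chain.from_iterable(
        combinations(X, r) for r in range(len(X) + 1))]

def is_fuzzy_measure(mu, X, tol=1e-9):
    """Check the FM conditions on mu: dict mapping frozenset -> value."""
    if abs(mu[frozenset()]) > tol:               # boundary: mu(empty) = 0
        return False
    subsets = powerset(X)
    for A in subsets:
        for B in subsets:
            if A <= B and mu[A] > mu[B] + tol:   # monotonicity: A subset of B
                return False
    return True

X = ['x1', 'x2', 'x3']
# hypothetical normalized measure (mu(X) = 1), for illustration only
mu = {frozenset(): 0.0,
      frozenset({'x1'}): 0.3, frozenset({'x2'}): 0.4, frozenset({'x3'}): 0.2,
      frozenset({'x1', 'x2'}): 0.8, frozenset({'x1', 'x3'}): 0.5,
      frozenset({'x2', 'x3'}): 0.6, frozenset({'x1', 'x2', 'x3'}): 1.0}
print(is_fuzzy_measure(mu, X))  # True
```

Note the O(4^N) pairwise check is only feasible for small N; it is meant to make the definition concrete, not to scale.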


Fuzzy integral

Discrete (finite X) fuzzy Choquet integral (ChI)

Let h(xi) ∈ R+ be the data/information (e.g., sensor readings, CV algorithm outputs, etc.) from input i. The ChI is

∫ h ◦ µ = Cµ(h) = Σ_{i=1}^{N} h(xπ(i)) [µ(Ai) − µ(Ai−1)],

where π is a permutation of X such that

h(xπ(1)) ≥ h(xπ(2)) ≥ . . . ≥ h(xπ(N)),

Ai = {xπ(1), . . . , xπ(i)} and µ(A0) = 0.
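The sorted-difference form above is a few lines of code. The following Python sketch (a stand-in for the author's Octave/Matlab library; the additive measure and inputs are hypothetical) implements it directly, and uses an additive FM so the result can be sanity-checked against a plain weighted average:

```python
def choquet(h, mu):
    """Discrete Choquet integral of h (dict input -> value) with respect
    to the fuzzy measure mu (dict frozenset -> value)."""
    # pi sorts inputs so h(x_pi(1)) >= h(x_pi(2)) >= ... >= h(x_pi(N))
    order = sorted(h, key=h.get, reverse=True)
    total, prev, A = 0.0, 0.0, set()     # mu(A_0) = 0
    for x in order:
        A.add(x)                          # A_i = {x_pi(1), ..., x_pi(i)}
        m = mu[frozenset(A)]
        total += h[x] * (m - prev)        # h(x_pi(i)) [mu(A_i) - mu(A_{i-1})]
        prev = m
    return total

# additive measure mu(A) = sum of singleton weights: ChI == weighted mean
w = {'x1': 0.5, 'x2': 0.3, 'x3': 0.2}
mu = {frozenset(S): sum(w[x] for x in S)
      for S in [(), ('x1',), ('x2',), ('x3',), ('x1', 'x2'),
                ('x1', 'x3'), ('x2', 'x3'), ('x1', 'x2', 'x3')]}
h = {'x1': 0.9, 'x2': 0.5, 'x3': 0.1}
print(round(choquet(h, mu), 4))  # 0.62 = 0.5*0.9 + 0.3*0.5 + 0.2*0.1
```

Recovering the weighted mean for an additive µ is a useful unit test: the interesting behavior only appears once µ is non-additive.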


Fuzzy integral

Single instance of FI: e.g., N = 3 and h(x2) ≥ h(x1) ≥ h(x3)

[Figure: the sort-induced chain through the FM lattice: {} → {x2} → {x1, x2} → {x1, x2, x3}]


Fuzzy integral

What, use a method from Calculus ... Truly innovative

- In particular, when X ≠ R
  - Sensors: let X be x1 = Radar, x2 = EMI, x3 = LWIR
  - CV: let X be ten different image features
  - CV: let X be three different deep learning architectures
  - Experts: let X be x1 = Derek, x2 = Muhammad, x3 = Tony
  - What is µ({Derek, Muhammad}) ... ?
  - Correlation, belief, utility, etc.
  - Are h and µ subjective or objective ...?
- Where does µ come from?
  - Pick µ, get a specific aggregation operator
  - Continuous (|X| is (uncountably) infinite) and analytical
  - Discrete (|X| is finite) and data-driven or expert


Fuzzy integral

How I like to conceptualize the discrete FI

Set of aggregation functions: one weight vector w⃗i per sort order πi, ∀i ∈ [1, . . . , N!]

fi = wi(1) h(xπi(1)) + wi(2) h(xπi(2)) + . . . + wi(N) h(xπi(N)),   Σ_{k=1}^{N} wi(k) = 1

Example: let N = 3 and sort order h(x1) ≥ h(x2) ≥ h(x3)

w(1) = µ({x1}) − 0,
w(2) = µ({x1, x2}) − µ({x1}),
w(3) = µ({x1, x2, x3}) − µ({x1, x2}).
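The "one linear operator per sort order" view above can be made concrete by extracting, for each permutation, the weight vector of successive measure differences. A minimal Python sketch (the normalized measure values are hypothetical, for illustration only):

```python
from itertools import permutations

def sort_weights(order, mu):
    """Weight vector induced by one sort order pi_i:
    w_i(k) = mu(A_k) - mu(A_{k-1}), A_k = first k inputs in sorted order."""
    w, prev, A = [], 0.0, set()
    for x in order:
        A.add(x)
        w.append(mu[frozenset(A)] - prev)
        prev = mu[frozenset(A)]
    return w

# hypothetical normalized FM on N = 3 inputs
mu = {frozenset(): 0.0,
      frozenset({'x1'}): 0.3, frozenset({'x2'}): 0.4, frozenset({'x3'}): 0.2,
      frozenset({'x1', 'x2'}): 0.8, frozenset({'x1', 'x3'}): 0.5,
      frozenset({'x2', 'x3'}): 0.6, frozenset({'x1', 'x2', 'x3'}): 1.0}

# one weight vector per sort order: N! = 6 "hidden" linear operators
for order in permutations(['x1', 'x2', 'x3']):
    w = sort_weights(order, mu)
    assert abs(sum(w) - 1.0) < 1e-9   # each vector sums to mu(X) = 1
    print(order, [round(v, 2) for v in w])
```

For the sort order (x1, x2, x3) this prints the weights (0.3, 0.5, 0.2), exactly the w(1), w(2), w(3) differences in the example above.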


Fuzzy integral

Visualization and visitation: single instance¹

[Figure: one sort path through the FM lattice: {} → {x2} → {x1, x2} → {x1, x2, x3}]

¹ Visualization and Learning of the Choquet Integral With Limited Training Data, FUZZ-IEEE 2017


Fuzzy integral

Visualization and visitation: data-driven2

[Figure: FM lattice for N = 10, annotated with regions of lots of visitations and regions of low-to-no visitations by the training data]

² Visualization and Learning of the Choquet Integral With Limited Training Data, FUZZ-IEEE 2017


Fuzzy integral

Simplifications - Case 1: LCOS, S-Decomp., λ-FM, etc.

A linear combination of order statistics (LCOS) is a ChI s.t.

µ(A) = µ(B), ∀A, B ∈ 2^X such that |A| = |B|.

So, N! operators reduce to one operator. For example,

w⃗^t = (1, 0, . . . , 0)^t

is the maximum operator:

µ(∅) = 0 and µ(A) = 1, ∀A ∈ 2^X except ∅,

h(xπ(1)) [µ(A1) − µ(A0)] + . . . + h(xπ(N)) [µ(AN) − µ(AN−1)] = h(xπ(1)) [1] + 0 + . . . + 0.
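The maximum-operator special case above is easy to verify numerically. A short Python sketch (reusing the same sorted-difference ChI as before; inputs are hypothetical) builds the measure that is 1 on every nonempty set and confirms the integral returns the largest input:

```python
from itertools import chain, combinations

def choquet(h, mu):
    """Discrete Choquet integral (sorted-difference form)."""
    order = sorted(h, key=h.get, reverse=True)
    total, prev, A = 0.0, 0.0, set()
    for x in order:
        A.add(x)
        total += h[x] * (mu[frozenset(A)] - prev)
        prev = mu[frozenset(A)]
    return total

X = ['x1', 'x2', 'x3']
subsets = [frozenset(s) for s in chain.from_iterable(
    combinations(X, r) for r in range(len(X) + 1))]

# LCOS measure from the slide: mu depends only on |A|; here mu(A) = 1
# for every nonempty A, which yields the maximum operator
mu_max = {A: (0.0 if not A else 1.0) for A in subsets}

h = {'x1': 0.2, 'x2': 0.7, 'x3': 0.4}
print(choquet(h, mu_max))  # 0.7, i.e., max(h)
```

Setting µ(A) = 0 for all A ≠ X instead would give the minimum; intermediate |A|-dependent values give the other order statistics and their linear combinations (OWA operators).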


Fuzzy integral

Simplifications - Case 2: Binary FM/FI³,⁴

Simple: µ(A) ∈ {0, 1}, ∀A ∈ 2^X, versus [0, 1].

- Efficient: few variables and just one ChI term
- Maximum number of terms is C(N, N/2), likely MUCH smaller
- Trivial to prove that Σ_{i=1}^{N} (µ(Ai) − µ(Ai−1)) = 1
- The ChI collapses to Σ_{i∈A1} (0 − 0) hπ(i) + (1 − 0) hπ(|A1|+1) + Σ_{k∈A2} (1 − 1) hπ(k)
- Properties: Sugeno integral == ChI
- Understanding: best pessimistic agreement
- Example: let x1 = radar, x2 = LWIR, x3 = EMI
  - Let µ({x1}) = µ({x2}) = µ({x3}) = 0 and µ({x2, x3}) = 0, else 1
  - h(x1) = 0.8, h(x2) = 0.5, h(x3) = 0.01; the BChI is 0.5
  - If radar is highest, need LWIR confirmation; disregard EMI

³ Binary Fuzzy Measures and Choquet Integration for Multi-Source Fusion, ICMT, 2017
⁴ Multiple Instance Choquet Integral for Classifier Fusion, FUZZ-IEEE, 2016
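The radar/LWIR/EMI example above can be reproduced with the same sorted-difference ChI code: a Python sketch (not the authors' implementation) encoding the binary measure from the slide, where only the zero-sets need to be listed explicitly:

```python
from itertools import chain, combinations

def choquet(h, mu):
    """Discrete Choquet integral (sorted-difference form)."""
    order = sorted(h, key=h.get, reverse=True)
    total, prev, A = 0.0, 0.0, set()
    for x in order:
        A.add(x)
        total += h[x] * (mu[frozenset(A)] - prev)
        prev = mu[frozenset(A)]
    return total

X = ['x1', 'x2', 'x3']   # x1 = radar, x2 = LWIR, x3 = EMI
subsets = [frozenset(s) for s in chain.from_iterable(
    combinations(X, r) for r in range(len(X) + 1))]

# binary FM from the slide: the singletons and {x2, x3} map to 0, else 1
zeros = {frozenset(), frozenset({'x1'}), frozenset({'x2'}),
         frozenset({'x3'}), frozenset({'x2', 'x3'})}
mu = {A: (0.0 if A in zeros else 1.0) for A in subsets}

h = {'x1': 0.8, 'x2': 0.5, 'x3': 0.01}
print(choquet(h, mu))  # 0.5: radar is highest, but LWIR must confirm
```

Walking the sort x1 ≥ x2 ≥ x3 through the measure shows why only one term survives: A1 = {x1} has µ = 0, A2 = {x1, x2} is the first set with µ = 1, so the BChI returns h(x2) = 0.5 and the EMI reading never contributes.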


Example 1: feature fusion

Context: humanitarian demining

Find dangerous materials and save lives using multiple sensors.
Sensors: ground penetrating radar and infrared imagery.

Problem: low-level (feature) fusion

- Supervised pattern recognition/machine learning.
- Observation i (e.g., an image ROI) has features x_{i,k} \in \Re^{d_k}.
- Kernel: \phi : x \to \phi(x) \in \Re^D, with \kappa(x_{i,k}, x_{j,k}) = \phi(x_{i,k}) \cdot \phi(x_{j,k}).
- Kernel matrix (n objects): K_k = [K_{ij} = \kappa(x_{i,k}, x_{j,k})]_{n \times n}.
- Mercer kept all the good secrets ...

Solution: multiple kernel learning (MKL)

- Search for f(K_1, ..., K_M) (the building blocks).
- Via a fixed rule, heuristics, or optimization.
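As a concrete (hypothetical) instance of a fixed-rule f(K_1, ..., K_M), the sketch below builds one RBF Gram matrix per feature source and fuses them with a linear convex sum; the feature arrays, weights, and helper names are made up for illustration, not taken from the cited work.

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    """Gram matrix K_k = [kappa(x_i, x_j)]_{n x n} for one feature source (RBF kernel)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return np.exp(-gamma * d2)

def lcs_fuse(kernels, weights):
    """Fixed-rule f(K_1, ..., K_M): a linear convex sum (LCS) of Gram matrices."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                       # enforce convexity: w_k >= 0, sum w_k = 1
    return sum(wk * Kk for wk, Kk in zip(w, kernels))

rng = np.random.default_rng(0)
X_gpr = rng.normal(size=(6, 4))           # hypothetical GPR features, d_1 = 4
X_ir = rng.normal(size=(6, 8))            # hypothetical IR features,  d_2 = 8
K = lcs_fuse([rbf_gram(X_gpr), rbf_gram(X_ir)], [0.7, 0.3])
print(K.shape)                            # (6, 6); a convex sum of PSD kernels is PSD
```

Heuristic and optimization-based MKL differ only in how the weights (or a nonlinear aggregation) are chosen; the per-source Gram matrices are built the same way.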


Example: fusion of learned iECO features on IR

[Figure: a candidate chip feeds three iECO feature populations: Population 1 (HOG, C1-C5), Population 2 (EHD, C6-C10), and Population 3 (SD, C11-C15).]


Different MKL approaches^5

- Not a comprehensive list.
- Simple linear convex sum (LCS).
- Xu et al.: MKL by group lasso (MKLGL).
- Varma and Babu: generalized MKL (Gaussians).
- Cortes et al.: polynomial kernels.
- Us: FI and genetic algorithm (FIGA).
- Us: GA MKL p-norm (GAMKL_p).
- Fuzzy integral (nonlinear):
  - Us: decision-level FI MKL p-norm (DeFIMKL_p).
  - Us: decision-level least squares MKL (DeLSMKL).

5. Efficient Multiple Kernel Classification using Feature and Decision Level Fusion, TFS, 2016


DeFIMKL algorithm^6

- f_k(x_i) is the decision on x_i by the kth classifier:
  - \eta_k(x) = \sum_{i=1}^{n} \alpha_{ik} y_i \kappa_k(x_i, x) - b_k
  - f_k(x) = \eta_k(x) / \sqrt{1 + \eta_k^2(x)}
- The fuzzy integral is
  - f_\mu(x_i) = \sum_{k=1}^{m} f_{\pi(k)}(x_i) [\mu(A_k) - \mu(A_{k-1})]
- Sum of squared error (SSE):
  - E^2 = \sum_{i=1}^{n} (f_\mu(x_i) - y_i)^2
  - E^2 = \sum_{i=1}^{n} (H_{x_i}^T u - y_i)^2
  - E^2 = \sum_{i=1}^{n} (u^T H_{x_i} H_{x_i}^T u - 2 y_i H_{x_i}^T u + y_i^2)
  - E^2 = u^T D u + f^T u + \sum_{i=1}^{n} y_i^2
  - where D = \sum_{i=1}^{n} H_{x_i} H_{x_i}^T and f = -\sum_{i=1}^{n} 2 y_i H_{x_i}
- QP subject to monotonicity constraints:
  - \min_u \; 0.5 u^T \hat{D} u + f^T u + \lambda \|u\|_p, subject to Cu \le 0, (0, 1)^T \le u \le 1

6. Efficient Multiple Kernel Classification using Feature and Decision Level Fusion, TFS, 2016
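The algebra above can be sketched in code. This is an illustrative construction only (assuming a lexicographic layout of the measure vector u: singletons, then pairs, then the full set); `H_vector`, the subset table, and the toy outputs F and y are assumptions, and the monotonicity-constrained QP itself would be handed to an off-the-shelf solver with Cu <= 0.

```python
import numpy as np
from itertools import combinations

M = 3  # number of classifiers/kernels

# Lexicographic layout of the measure vector u: singletons, pairs, ..., full set.
SUBSETS = [frozenset(c) for r in range(1, M + 1)
           for c in combinations(range(M), r)]

def H_vector(f_x):
    """Build H_x so the Choquet integral is the inner product f_mu(x) = H_x^T u."""
    H = np.zeros(len(SUBSETS))
    order = np.argsort(-np.asarray(f_x))       # pi: sort classifier outputs descending
    for k in range(M):
        # mu(A_k) picks up the coefficient f_pi(k) - f_pi(k+1), with f_pi(M+1) = 0
        nxt = f_x[order[k + 1]] if k + 1 < M else 0.0
        H[SUBSETS.index(frozenset(order[: k + 1]))] += f_x[order[k]] - nxt
    return H

# Toy classifier outputs f_k(x_i) and labels; assemble the quadratic's D and f.
F = np.array([[0.8, 0.5, 0.1],
              [0.2, 0.9, 0.4]])
y = np.array([1.0, -1.0])
Hs = np.stack([H_vector(row) for row in F])
D = sum(np.outer(h, h) for h in Hs)             # D = sum_i H_xi H_xi^T
f = -sum(2.0 * yi * h for yi, h in zip(y, Hs))  # f = -sum_i 2 y_i H_xi
print(D.shape)                                  # (7, 7): 2^3 - 1 measure variables
```

A sanity check on the layout: plugging in the additive uniform measure (mu(A) = |A|/M) should make H_x^T u equal the mean of the classifier outputs.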


Big Data: Nystrom approximation and linearization^7

- MKL can be difficult-to-impossible to apply to large data.
  - Full MKL with m matrices costs mn^2.
- The Gram matrix K \in \Re^{n \times n} is approximated by
  - \tilde{K} = K_z K_{zz}^{\dagger} K_z^T, where
  - z holds the indices of the |z| sampled columns of K, and
  - K_{zz}^{\dagger} is the Moore-Penrose pseudoinverse of K_{zz}.
- Now aggregate m matrices of size n \times |z|, so mn|z|:
  - \tilde{K}_z = \sum_{k=1}^{m} (w_k K_k)_z is positive semi-definite (PSD).
- Linearize via the eigendecomposition of the fused \tilde{K}_{zz}:
  - \tilde{K}_{zz}^{\dagger} = U_z \Lambda_z^{-1} U_z^T
  - The linearized model becomes \tilde{X} = \tilde{K}_z U_z \Lambda_z^{-1/2}.
- Put \tilde{X} into a linear SVM vs. a kernel SVM (faster!).

7. Efficient Multiple Kernel Classification using Feature and Decision Level Fusion, TFS, 2016
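A numpy sketch of this pipeline, under stated assumptions: the data, weights, and kernels are illustrative, and the full Gram matrices are formed here only for demonstration (in practice only the |z| sampled columns would ever be computed).

```python
import numpy as np

def nystrom_linearize(K_list, w, z):
    """Nystrom-approximate the fused kernel and linearize it.

    K_list : m full Gram matrices (n x n), formed here only for illustration
    w      : nonnegative fusion weights w_k
    z      : indices of the sampled columns
    """
    Kz = sum(wk * Kk[:, z] for wk, Kk in zip(w, K_list))  # fused K_z, (n, |z|)
    Kzz = Kz[z, :]                                        # fused K_zz block
    lam, U = np.linalg.eigh(Kzz)                          # eigendecomposition
    keep = lam > 1e-10                                    # drop numerically null modes
    lam, U = lam[keep], U[:, keep]
    # X_tilde = K_z U_z Lambda_z^{-1/2}; rows are features for a linear SVM
    return Kz @ U @ np.diag(lam ** -0.5)

rng = np.random.default_rng(1)
A = rng.normal(size=(50, 5))
K_rbf = np.exp(-0.5 * np.linalg.norm(A[:, None] - A[None, :], axis=2) ** 2)
K_lin = A @ A.T
z = np.arange(10)                             # sample 10 of the 50 columns
X_tilde = nystrom_linearize([K_rbf, K_lin], [0.6, 0.4], z)
print(X_tilde.shape)                          # (50, r) with r <= |z| = 10
```

Since X_tilde X_tilde^T = K_z K_zz^+ K_z^T, the approximation is exact on the sampled block, which is a quick way to sanity-check an implementation.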

Fusion here, there and almost EVERYWHERE in computer vision - driving new advances in fuzzy integrals MSU


Example 1: feature fusion

Complexity: information theoretic indices⁸

- $2^N - 2$ "free parameters"; exclude $\mu(\emptyset) = 0$ and $\mu(X) = 1$
- Numerous questions
  - How important is each individual input?
  - How are tuples of the individuals interacting?
- Data-driven learning: $E(D, \Theta) = f_1(f_2(D, \Theta), f_3(\Theta))$
  - $E$ is error, $D$ is data, $\Theta$ are our parameters
  - What should $f_1$, $f_2$ and $f_3$ be?
  - $f_2$ = SSE w.r.t. the ChI, $f_3$ = a $\mu$ index, and $f_1 = a + \lambda b$

⁸ Measure of the Shapley Index for Learning Lower Complexity Fuzzy Integrals, Granular Computing, 2017


Example 1: feature fusion

Complexity: information theoretic indices⁹,¹⁰,¹¹

- Labreuche's entropy: $-\sum_{i=1}^{N} \left( \sum_{K \subseteq X \setminus \{i\}} \zeta_{X,1}(K) \left| A_i - \frac{1}{N} \right| \right)$, where $A_i = \mu(K \cup \{i\}) - \mu(K)$
- Marichal's entropy: $-\sum_{i=1}^{N} \left( \sum_{K \subseteq X \setminus \{i\}} \zeta_{X,1}(K)\, A_i \ln(A_i) \right)$, where $A_i = \mu(K \cup \{i\}) - \mu(K)$
- Kojadinovic's variance: $\frac{1}{N} \sum_{i=1}^{N} \left( \sum_{K \subseteq X \setminus \{i\}} \zeta_{X,1}(K) \left( A_i - \frac{1}{N} \right)^2 \right)$, where $A_i = \mu(K \cup \{i\}) - \mu(K)$
- Shapley index: $\Phi_\mu(i) = \sum_{K \subseteq X \setminus \{i\}} \zeta_{X,1}(K) \left( \mu(K \cup \{i\}) - \mu(K) \right)$
- Interaction index: $I_\mu(i, j) = \sum_{K \subseteq X \setminus \{i, j\}} \zeta_{X,2}(K) \left( \mu(K \cup \{i, j\}) - \mu(K \cup \{i\}) - \mu(K \cup \{j\}) + \mu(K) \right)$
- Shannon entropy of Shapley values: $-\sum_{i=1}^{N} \Phi_\mu(i) \ln(\Phi_\mu(i))$
- $k$-additive index: $\sum_{A \subseteq X} f(|A|)\,|M(A)|$
- $\ell_0$-norm of the Shapley vector: $\|\Phi_\mu\|_0 = |\{i : \Phi_\mu(i) \neq 0\}|$, $\Phi_\mu = (\Phi_\mu(1), \ldots, \Phi_\mu(N))^t$
- $\ell_1$-norm of the Shapley vector: $\|\Phi_\mu\|_1 = \sum_{i=1}^{N} |\Phi_\mu(i)| = \sum_{i=1}^{N} \Phi_\mu(i) = 1$
- Gini-Simpson of Shapley values: $1 - \sum_{i=1}^{N} \Phi_\mu(i)^2$

⁹ $u$: lexicographic encoding of $\mu$; $u^t = (\mu_1, \ldots, \mu_{12}, \ldots, \mu_{12 \ldots N})^t$
¹⁰ Measure of the Shapley Index for Learning Lower Complexity Fuzzy Integrals, Granular Computing, 2017
¹¹ $\zeta_{X,1}(K) = \frac{(|X| - |K| - 1)!\,|K|!}{|X|!}$ and $\zeta_{X,2}(K) = \frac{(|X| - |K| - 2)!\,|K|!}{(|X| - 1)!}$
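The Shapley index above is directly computable for small $N$; a minimal pure-Python sketch (the toy measure values are illustrative only), which also exercises the $\ell_1$ identity $\sum_i \Phi_\mu(i) = 1$:

```python
from itertools import combinations
from math import factorial

def shapley(mu, X):
    """Shapley importance of each input under fuzzy measure mu.

    mu : dict mapping frozenset of sources -> measure value
    X  : list of sources
    """
    N = len(X)
    phi = {}
    for i in X:
        others = [j for j in X if j != i]
        total = 0.0
        for r in range(N):                      # |K| = 0 .. N-1
            for K in combinations(others, r):
                K = frozenset(K)
                # zeta_{X,1}(K) = (|X|-|K|-1)! |K|! / |X|!
                zeta = (factorial(N - len(K) - 1) * factorial(len(K))
                        / factorial(N))
                total += zeta * (mu[K | {i}] - mu[K])
        phi[i] = total
    return phi

# toy measure on X = {a, b} (illustrative values only)
mu = {frozenset(): 0.0, frozenset('a'): 0.4,
      frozenset('b'): 0.7, frozenset('ab'): 1.0}
print(shapley(mu, ['a', 'b']))  # Phi(a) = 0.35, Phi(b) = 0.65
```

The loop is $O(2^N)$ per input, which is exactly why these indices double as complexity penalties rather than something you would compute for large $N$ at every training step.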

Example 1: feature fusion

Summary of results

- iECO features on infrared imagery
  - 173 buried targets; various days, depths, types, etc.
  - Compared fixed, heuristic, and optimization-based MKL
  - Compared to the state-of-the-art in ML/CV (important!)
  - DeFIMKLp led to a rise in PDR and a lower FAR (124%, NAUC)
  - Specifically, it generalized very well (regularization)
- Ground penetrating radar
  - Energy-based (no phase) spatial features
  - Performance gain for DeFIMKL (138%) and GAMKLp (147%)
- Efficiency
  - We were able to keep only [2%, 10%] of the kernel data, relative to a 5% drop in performance


Example 1: feature fusion

Unsolved challenges

- Computational and storage efficiency
  - Millions of training samples and many base kernels
- Non-linear SISO/FIFO MKL
  - $n!$ possibilities, each a feature space
  - $K_{ij} = \langle \phi_\sigma(x_i), \phi_\sigma(x_j) \rangle = \sum_{k=1}^{m} \sigma_k (K_k)_{ij} = \left( \sqrt{\sigma_1}\,\phi_i^1, \ldots, \sqrt{\sigma_m}\,\phi_i^m \right)^t \left( \sqrt{\sigma_1}\,\phi_j^1, \ldots, \sqrt{\sigma_m}\,\phi_j^m \right)$
- Heterogeneous kernels and normalization
- What $E(D, \Theta)$ ...
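The stacked-feature identity in the $K_{ij}$ line above (a weighted sum of Gram matrices equals the Gram matrix of concatenated, $\sqrt{\sigma_k}$-scaled feature maps) can be checked numerically; a small numpy sketch with illustrative toy feature maps and weights:

```python
import numpy as np

rng = np.random.default_rng(1)
# two explicit feature maps for 5 samples (hypothetical toy features)
phi1 = rng.normal(size=(5, 3))
phi2 = rng.normal(size=(5, 4))
sigma = np.array([0.3, 0.7])

# weighted sum of the two Gram matrices
K = sigma[0] * phi1 @ phi1.T + sigma[1] * phi2 @ phi2.T

# equivalently: Gram matrix of the stacked, sqrt(sigma)-scaled features
phi = np.hstack([np.sqrt(sigma[0]) * phi1, np.sqrt(sigma[1]) * phi2])
print(np.allclose(K, phi @ phi.T))  # True
```

This is why linear MKL stays a valid kernel; the open problems above concern what happens once the combination is no longer a simple weighted sum.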


Example 2: DIDO fusion

Context: humanitarian demining

Find dangerous materials and save lives using multiple sensors.
Sensors: ground penetrating radar (GPR) and electromagnetic induction (EMI)

Problem: one size does NOT fit all

- Typically, learn a single $C_\mu(h)$
- Instead, $f(C_{\mu_1}(h_1), \ldots, C_{\mu_k}(h_k))$, where $h_i$ is a subset of $X$
- Different sensors/features/algorithms/etc. require different aggregation philosophies within and across inputs

Solution: genetic programming-based ChI (GPChI)

- Composition and arithmetic combination of ChIs


Example 2: DIDO fusion

Multi-sensor fusion¹²,¹³,¹⁴,¹⁵,¹⁶

¹² Genetic Programming Based Choquet Integral for Multi-Source Fusion, FUZZ-IEEE, 2017
¹³ Fusion of Choquet integrals for explosive hazard detection in EMI and GPR for handheld platforms, 2017
¹⁴ Background adaptive division filtering for hand-held ground penetrating radar, 2016
¹⁵ Curvelet filter-based prescreener for explosive hazard detection in hand-held ground penetrating radar, 2016
¹⁶ Binary Fuzzy Measures and Choquet Integration for Multi-Source Fusion, 2017


Example 2: DIDO fusion

Ground penetrating radar


Example 2: DIDO fusion

Electromagnetic induction

[Figure: EMI sensing diagram; a transmitter coil emits a magnetic field, and the induced magnetic field from a buried object is sensed by a receiver coil]

Example 2: DIDO fusion

Fusion of fusions of CV information

- Multiple CV algorithms on GPR and EMI
- GP for candidates, GA for $\{\mu_1, \ldots, \mu_k\}$
- Proofs
  - The ChI w.r.t. the LCS of FMs is equal to the LCS of ChIs
  - The LCS of FMs is an FM
  - A ChI of ChIs is not guaranteed to be a ChI
  - GPChI boundedness conditions
  - A GPChI is not guaranteed to be a ChI
- Interesting
  - Can write out the GPChI, but understanding it ...
  - Not the machine's problem ...
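The "fusion of fusions" idea above can be sketched concretely: evaluate several ChIs (each with its own measure) and combine their outputs through an arithmetic expression of the kind GP would evolve. A minimal sketch; the two toy measures and the particular expression in `gpchi` are illustrative only, not the learned trees from the paper:

```python
def choquet(h, mu):
    """Discrete Choquet integral of inputs h w.r.t. fuzzy measure mu."""
    order = sorted(h, key=h.get, reverse=True)
    total, prev = 0.0, frozenset()
    for src in order:
        cur = prev | {src}
        total += h[src] * (mu[cur] - mu[prev])
        prev = cur
    return total

# two toy fuzzy measures over the same inputs (illustrative values only)
mu1 = {frozenset(): 0.0, frozenset('a'): 0.3,
       frozenset('b'): 0.6, frozenset('ab'): 1.0}
mu2 = {frozenset(): 0.0, frozenset('a'): 0.8,
       frozenset('b'): 0.5, frozenset('ab'): 1.0}

def gpchi(h):
    """One hypothetical GP expression tree: an arithmetic mix of two ChIs."""
    c1, c2 = choquet(h, mu1), choquet(h, mu2)
    return 0.5 * (c1 + c2) + 0.25 * c1 * c2   # illustrative combination

print(gpchi({'a': 0.9, 'b': 0.2}))
```

Note the output of `gpchi` is not itself guaranteed to be a ChI (for instance, it need not be idempotent), which matches the proofs summarized above.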


Goal Data/Information fusion Computer vision Fuzzy integral Examples Conclusion

Example 2: DIDO fusion

Receiver operating characteristic (ROC)

[Figure: ROC curves per sensor; axes annotated "Lower FAR" and "Higher PDR". EMI does better.]

Translation: higher PDR / lower FAR.
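PDR and FAR here are threshold-dependent rates read off a detector's scores. A minimal sketch with illustrative data and names (and FAR taken as a false-positive rate over non-target encounters, rather than the per-area false-alarm rate often reported in landmine detection):

```python
def pdr_far(scores, labels, thresh):
    """Probability of detection rate and false alarm rate at one threshold.
    labels: 1 = target, 0 = non-target."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= thresh and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= thresh and y == 0)
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    return tp / n_pos, fp / n_neg

# Sweeping the threshold over the distinct scores traces the ROC curve
scores = [0.9, 0.8, 0.65, 0.3, 0.2]
labels = [1, 1, 0, 1, 0]
roc = [pdr_far(scores, labels, t) for t in sorted(set(scores), reverse=True)]
```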


Example 2: DIDO fusion

[Figure: ranked results (1st through 5th) comparing single ChIs on GPR and on EMI against the GpChI with 2 inputs and with 4 inputs.]

Translation: fusion of fusions beats a single ChI.


Example 2: DIDO fusion

Unsolved challenges

- Conditioning
  - Even if inputs are in [0, 1], is the "trend" the same (is h(xi) = 0.5 comparable to h(xj) = 0.5)?
  - Can we explain it (a physics explanation)?
- GP and bloating (it likes long-winded, overfit solutions)
- Multiple µ learning (lots of constraints!)
- Mathematically
  - What MONSTER did we create?!?


Wrap it up

Big Challenges (AS I SEE IT!) for fusion in CV

- Get the word out there
  - Get wonderful FST aggregation work to the CV community
  - Beat them at their own game, and new metrics
  - Hybridization of ML and FST
- Explainable
  - Yes, we can learn an optimal µ from data
  - Can we UNDERSTAND what µ (and the FI) is doing?
- Big Data
  - (Variety) Multi-source: human, algorithm, sensor
    - Information heterogeneity and uncertainties (new extensions)
  - (Volume, Velocity) Efficient and scalable ideas
  - (Veracity) Incomplete (missing, noisy, etc.) data/information
    - The Fuzzy Integral for Missing Data, FUZZ-IEEE, 2017
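One reason learning µ from data is hard is that a full FM has 2^n − 1 free values under monotonicity constraints. The classic workaround is the Sugeno λ-measure, which extends n densities to a full measure. A minimal sketch, with illustrative function names, using bisection (one of several ways) to solve the defining polynomial:

```python
def sugeno_lambda(g, iters=200):
    """Solve prod(1 + lam*g_i) = 1 + lam for the unique root lam != 0
    in (-1, inf); densities g_i assumed in (0, 1)."""
    if abs(sum(g) - 1.0) < 1e-12:
        return 0.0                       # additive case: lam = 0
    def f(lam):
        p = 1.0
        for gi in g:
            p *= 1.0 + lam * gi
        return p - 1.0 - lam
    if sum(g) < 1.0:                     # root is positive
        lo, hi = 1e-12, 1.0
        while f(hi) < 0.0:
            hi *= 2.0
    else:                                # root lies in (-1, 0)
        lo, hi = -1.0 + 1e-12, -1e-12
    for _ in range(iters):               # plain bisection
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def mu_lambda(A, g, lam):
    """Sugeno lambda-measure of the index set A."""
    if lam == 0.0:
        return sum(g[i] for i in A)
    p = 1.0
    for i in A:
        p *= 1.0 + lam * g[i]
    return (p - 1.0) / lam
```

With all densities 0.3 (summing below 1), λ is positive and the measure of the full set comes out to 1 by construction.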


- Uncertainty
  - Spatial, spectral and temporal
    - Multiple Instance Choquet Integral for Classifier Fusion, FUZZ-IEEE, 2016
- Data-driven
  - Overfitting; learning algorithms and indices
    - Visualization and Learning of the Choquet Integral With Limited Training Data, FUZZ-IEEE, 2017
    - Measure of the Shapley Index for Learning Lower Complexity Fuzzy Integrals, Granular Computing, 2017
  - How do we handle data-unsupported variables?
- Is the FI enough?
  - Genetic Programming Based Choquet Integral for Multi-Source Fusion, FUZZ-IEEE, 2017
  - The Arithmetic Recursive Average as an Instance of the Recursive Weighted Power Mean, FUZZ-IEEE, 2017
- Accountable fusion in deep learning ...
  - Where is it happening, how, and what is it doing?
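The Shapley index cited above summarizes each source's average worth under µ, which is one route to the "can we UNDERSTAND µ" question. A minimal sketch, using an additive measure as a sanity check so the indices equal the densities (names and values are illustrative):

```python
from itertools import combinations
from math import factorial

def shapley_index(mu, X):
    """Shapley value of each source in X w.r.t. fuzzy measure mu
    (frozenset -> [0, 1]); the values sum to mu(X)."""
    n = len(X)
    phi = {}
    for i in X:
        others = [x for x in X if x != i]
        total = 0.0
        for r in range(n):
            for A in combinations(others, r):
                A = frozenset(A)
                # Average marginal contribution of i over all coalitions A
                w = factorial(n - r - 1) * factorial(r) / factorial(n)
                total += w * (mu[A | {i}] - mu[A])
        phi[i] = total
    return phi

# Sanity check: for an additive measure the Shapley values are the densities
dens = {'a': 0.5, 'b': 0.3, 'c': 0.2}
mu = {frozenset(S): sum(dens[x] for x in S)
      for r in range(4) for S in combinations(dens, r)}
```

For a learned, non-additive µ the same function reports how much each sensor or algorithm actually contributes to the fused decision.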


Numerous colleagues and community

- Wife: Melissa Anderson (FI for forensic anthropology)
- MU: Jim Keller and Xiaoxiao Du
- MTU: Tim Havens and Tony Pinar
- Nottingham, UK: Christian Wagner
- MSU: Muhammad Islam and Ryan Smith (students)
- UF: Alina Zare
- NRL: Fred Petry and Paul Elmore (heterogeneous fusion)
- TSU: Daniel Wescott (forensic anthropology)
- Many others: Choquet, Sugeno, Höhle, Schmeidler, Murofushi, Grabisch, Yager, Mesiar, Labreuche, Beliakov, ...


Thank you! Questions?
