
Page 1

Math tools: basics on graph theory

Maria Prandini

Page 2

• Modeling agent-to-agent communication as a graph

• Undirected and directed graphs

• Connectivity and strong connectivity

• Application to distributed averaging

• Extension to the time-varying case

Outline

Page 3

Modeling

[Figure: example communication graph on nodes 1, …, 6]

agent 𝑖 ↔ node 𝑖

agent 𝑗 communicates with agent 𝑖 ↔ edge 𝑒 = (𝑗, 𝑖)

Page 4

Modeling

[Figure: example communication graph on nodes 1, …, 6]

agent 𝑖 ↔ node 𝑖

agent 𝑗 communicates with agent 𝑖 ↔ edge 𝑒 = (𝑗, 𝑖)

Graph modeling the agents' communication: 𝐺 = (𝑉, 𝐸)

𝑉 = {1, 2, … , 𝑚},  𝐸 = {(𝑗, 𝑖) : 𝑗 communicates with 𝑖} ⊆ 𝑉 × 𝑉

Page 5

Modeling

[Figure: example communication graph on nodes 1, …, 6]

agent 𝑖 ↔ node 𝑖

agent 𝑗 communicates with agent 𝑖 ↔ edge 𝑒 = (𝑗, 𝑖)

Graph modeling the agents' communication: 𝐺 = (𝑉, 𝐸)

𝑉 = {1, 2, … , 𝑚},  𝐸 = {(𝑗, 𝑖) : 𝑗 communicates with 𝑖} ⊆ 𝑉 × 𝑉

Neighbors of agent 𝑖: 𝑁𝑖 = {𝑗: (𝑗, 𝑖) ∈ 𝐸}

Page 6

Modeling

agent 𝑖 ↔ node 𝑖

agent 𝑗 communicates with agent 𝑖 ↔ edge 𝑒 = (𝑗, 𝑖)

Graph modeling the agents' communication: 𝐺 = (𝑉, 𝐸)

𝑉 = {1, 2, … , 𝑚},  𝐸 = {(𝑗, 𝑖) : 𝑗 communicates with 𝑖} ⊆ 𝑉 × 𝑉

Neighbors of agent 𝑖: 𝑁𝑖 = {𝑗: (𝑗, 𝑖) ∈ 𝐸}

[Figure: example communication graph on nodes 1, …, 6]

𝑁2 = {1,5}
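
A minimal Python sketch of this edge-set representation and of the neighbor sets may help; the specific edge list is an assumption, reconstructed from the edge-weight labels 𝑤𝑗𝑖 that appear on the weighted-graph slide, and it reproduces 𝑁2 = {1, 5}:

```python
# Hypothetical reconstruction of the 6-node example: edge (j, i) means
# "agent j communicates with agent i".
V = {1, 2, 3, 4, 5, 6}
E = {(1, 2), (1, 3), (3, 1), (2, 3), (2, 4), (3, 6), (4, 5), (5, 2), (6, 5)}

def neighbors(i, E):
    """Neighbors of agent i: N_i = {j : (j, i) in E}."""
    return {j for (j, k) in E if k == i}

print(neighbors(2, E))  # {1, 5}, matching N_2 on the slide
```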

Page 7

Path

[Figure: example communication graph on nodes 1, …, 6]

agent 𝑖 ↔ node 𝑖

agent 𝑗 communicates with agent 𝑖 ↔ edge 𝑒 = (𝑗, 𝑖)

Graph modeling the agents' communication: 𝐺 = (𝑉, 𝐸)

Path is a subgraph π = (𝑉𝜋, 𝐸𝜋) ⊂ 𝐺 with

𝑉𝜋 = {𝑖1, 𝑖2, … , 𝑖𝑘} ⊂ 𝑉,  𝐸𝜋 = {(𝑖1, 𝑖2), (𝑖2, 𝑖3), … , (𝑖𝑘−1, 𝑖𝑘)} ⊂ 𝐸

Page 8

Path

[Figure: example communication graph on nodes 1, …, 6]

agent 𝑖 ↔ node 𝑖

agent 𝑗 communicates with agent 𝑖 ↔ edge 𝑒 = (𝑗, 𝑖)

Graph modeling the agents' communication: 𝐺 = (𝑉, 𝐸)

Path is a subgraph π = (𝑉𝜋, 𝐸𝜋) ⊂ 𝐺 with

𝑉𝜋 = {𝑖1, 𝑖2, … , 𝑖𝑘} ⊂ 𝑉,  𝐸𝜋 = {(𝑖1, 𝑖2), (𝑖2, 𝑖3), … , (𝑖𝑘−1, 𝑖𝑘)} ⊂ 𝐸

example of a path of length 2

Page 9

Path

[Figure: example communication graph on nodes 1, …, 6]

agent 𝑖 ↔ node 𝑖

agent 𝑗 communicates with agent 𝑖 ↔ edge 𝑒 = (𝑗, 𝑖)

Graph modeling the agents' communication: 𝐺 = (𝑉, 𝐸)

Path is a subgraph π = (𝑉𝜋, 𝐸𝜋) ⊂ 𝐺 with

𝑉𝜋 = {𝑖1, 𝑖2, … , 𝑖𝑘} ⊂ 𝑉,  𝐸𝜋 = {(𝑖1, 𝑖2), (𝑖2, 𝑖3), … , (𝑖𝑘−1, 𝑖𝑘)} ⊂ 𝐸

example of a path of length 5

Page 10

Path

[Figure: example communication graph on nodes 1, …, 6]

agent 𝑖 ↔ node 𝑖

agent 𝑗 communicates with agent 𝑖 ↔ edge 𝑒 = (𝑗, 𝑖)

Graph modeling the agents' communication: 𝐺 = (𝑉, 𝐸)

Path is a subgraph π = (𝑉𝜋, 𝐸𝜋) ⊂ 𝐺 with

𝑉𝜋 = {𝑖1, 𝑖2, … , 𝑖𝑘} ⊂ 𝑉,  𝐸𝜋 = {(𝑖1, 𝑖2), (𝑖2, 𝑖3), … , (𝑖𝑘−1, 𝑖𝑘)} ⊂ 𝐸

example of a path of length 5

shortest path defines the distance between two nodes
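
A breadth-first search makes the notion of distance concrete; below is a minimal Python sketch (the function name and the edge-set representation are illustrative, not from the slides):

```python
from collections import deque

def distance(E, source, target):
    """Length of a shortest path from source to target along the edges in E,
    or None if target cannot be reached."""
    succ = {}
    for (j, i) in E:
        succ.setdefault(j, set()).add(i)
    dist, queue = {source: 0}, deque([source])
    while queue:
        u = queue.popleft()
        if u == target:
            return dist[u]
        for v in succ.get(u, ()):
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return None
```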

Page 11

Undirected graph and connectivity

graph 𝐺 = (𝑉, 𝐸) is undirected if (𝑗, 𝑖) ∈ 𝐸 ⇒ (𝑖, 𝑗) ∈ 𝐸

[Figure: example undirected graph on nodes 1, …, 6]

Page 12

Undirected graph and connectivity

graph 𝐺 = (𝑉, 𝐸) is undirected if (𝑗, 𝑖) ∈ 𝐸 ⇒ (𝑖, 𝑗) ∈ 𝐸

an undirected graph is connected if there exists a path π between any two distinct nodes

[Figure: example undirected graph on nodes 1, …, 6]

Page 13

Undirected graph and connectivity

graph 𝐺 = (𝑉, 𝐸) is undirected if (𝑗, 𝑖) ∈ 𝐸 ⇒ (𝑖, 𝑗) ∈ 𝐸

an undirected graph is connected if there exists a path π between any two distinct nodes

the diameter of a connected graph is the maximum distance between two nodes

[Figure: example undirected graph on nodes 1, …, 6]
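
Connectivity and diameter of an undirected graph can be checked with the same breadth-first idea; a minimal Python sketch, assuming the edge set E is stored symmetrically (both (j, i) and (i, j)):

```python
from collections import deque

def bfs_distances(V, E, source):
    """Distances (number of edges) from source to every reachable node."""
    succ = {v: set() for v in V}
    for (j, i) in E:
        succ[j].add(i)
    dist, queue = {source: 0}, deque([source])
    while queue:
        u = queue.popleft()
        for v in succ[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def is_connected(V, E):
    """Undirected graph: one BFS must reach every node."""
    return len(bfs_distances(V, E, next(iter(V)))) == len(V)

def diameter(V, E):
    """Maximum distance between two nodes of a connected undirected graph."""
    return max(max(bfs_distances(V, E, v).values()) for v in V)
```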

Page 14

Directed graph and connectivity

graph 𝐺 = (𝑉, 𝐸) is directed if ∃ (𝑗, 𝑖) ∈ 𝐸 such that (𝑖, 𝑗) ∉ 𝐸

[Figure: example directed graph on nodes 1, …, 6]

Page 15

Directed graph and connectivity

graph 𝐺 = (𝑉, 𝐸) is directed if ∃ (𝑗, 𝑖) ∈ 𝐸 such that (𝑖, 𝑗) ∉ 𝐸

a directed graph is strongly connected if there exists a directed path π between any two distinct nodes

a directed graph is weakly connected if the undirected graph obtained by making all edges bidirectional is connected

[Figure: example directed graph on nodes 1, …, 6]

Page 16

Directed graph and connectivity

graph 𝐺 = (𝑉, 𝐸) is directed if ∃ (𝑗, 𝑖) ∈ 𝐸 such that (𝑖, 𝑗) ∉ 𝐸

a directed graph is strongly connected if there exists a directed path π between any two distinct nodes

a directed graph is weakly connected if the undirected graph obtained by making all edges bidirectional is connected

[Figure: example directed graph on nodes 1, …, 6]
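
Strong and weak connectivity of a directed graph can be checked directly from these definitions; a minimal Python sketch (helper names are illustrative):

```python
from collections import deque

def reachable(V, E, source):
    """Set of nodes reachable from source by a directed path."""
    succ = {v: set() for v in V}
    for (j, i) in E:
        succ[j].add(i)
    seen, queue = {source}, deque([source])
    while queue:
        u = queue.popleft()
        for v in succ[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def is_strongly_connected(V, E):
    """A directed path exists between every ordered pair of distinct nodes."""
    return all(reachable(V, E, v) == set(V) for v in V)

def is_weakly_connected(V, E):
    """Connected once every edge is made bidirectional."""
    return is_strongly_connected(V, set(E) | {(i, j) for (j, i) in E})
```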

Page 17

a weighted graph is a graph 𝐺 = (𝑉, 𝐸) together with

a map φ: 𝐸 → ℝ that assigns a weight 𝑤𝑗𝑖 = φ(𝑗, 𝑖) to an edge (𝑗, 𝑖) ∈ 𝐸

We can then define the 𝑚 × 𝑚 weight matrix 𝑊 such that

𝑊(𝑖, 𝑗) = 𝑤𝑗𝑖 if (𝑗, 𝑖) ∈ 𝐸, and 𝑊(𝑖, 𝑗) = 0 otherwise

Weighted graph

[Figure: example weighted graph on nodes 1, …, 6 with edge weights 𝑤12, 𝑤13, 𝑤31, 𝑤23, 𝑤24, 𝑤36, 𝑤45, 𝑤52, 𝑤65]
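
The weight matrix can be assembled directly from this definition; a minimal NumPy sketch (the function name is illustrative; nodes are numbered 1, …, m as in the slides):

```python
import numpy as np

def weight_matrix(m, weights):
    """m x m matrix with W(i, j) = w_ji for (j, i) in E and 0 otherwise.
    `weights` maps each edge (j, i) to its weight w_ji."""
    W = np.zeros((m, m))
    for (j, i), w_ji in weights.items():
        W[i - 1, j - 1] = w_ji   # array indices are 0-based
    return W
```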

Page 18

𝑊 is row-stochastic if 𝑊(𝑖, 𝑗) ≥ 0 and ∑_{𝑗} 𝑊(𝑖, 𝑗) = 1, 𝑖 = 1, … , 𝑚

𝑊 is column-stochastic if 𝑊(𝑖, 𝑗) ≥ 0 and ∑_{𝑖} 𝑊(𝑖, 𝑗) = 1, 𝑗 = 1, … , 𝑚

𝑊 is doubly-stochastic if it is both row and column stochastic

If 𝟏 = [1 1 ⋯ 1]ᵀ, then 𝑊𝟏 = 𝟏 (row-stochastic) and 𝟏ᵀ𝑊 = 𝟏ᵀ (column-stochastic)

Weighted graph

[Figure: example weighted graph on nodes 1, …, 6 with edge weights 𝑤12, 𝑤13, 𝑤31, 𝑤23, 𝑤24, 𝑤36, 𝑤45, 𝑤52, 𝑤65]
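
These stochasticity conditions are easy to test numerically; a minimal NumPy sketch of the three checks (function names are illustrative):

```python
import numpy as np

def is_row_stochastic(W, tol=1e-9):
    # nonnegative entries and each row summing to 1, i.e. W @ 1 = 1
    return bool(np.all(W >= 0) and np.allclose(W.sum(axis=1), 1.0, atol=tol))

def is_column_stochastic(W, tol=1e-9):
    # nonnegative entries and each column summing to 1, i.e. 1^T W = 1^T
    return bool(np.all(W >= 0) and np.allclose(W.sum(axis=0), 1.0, atol=tol))

def is_doubly_stochastic(W, tol=1e-9):
    return is_row_stochastic(W, tol) and is_column_stochastic(W, tol)
```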

Page 19

• 𝑚 agents communicate along a set of links described by a graph 𝐺 = (𝑉, 𝐸) (by definition each agent communicates with itself, and hence belongs to its neighborhood)

• each agent 𝑖 has a scalar state 𝑥𝑖 with initial value 𝑥𝑖(0), and the agents aim at jointly reaching consensus on the average of their initial states

x̄(0) = (1/𝑚) ∑_{𝑖} 𝑥𝑖(0)

Application: consensus protocols

Page 20

Distributed averaging solution:

• Associate a weight 𝑊(𝑖, 𝑗) > 0 to (𝑗, 𝑖) ∈ 𝐸

• Let each agent compute in parallel the weighted average

𝑥𝑖(𝑘 + 1) = ∑_{𝑗∈𝑁𝑖} 𝑊(𝑖, 𝑗) 𝑥𝑗(𝑘)

Application: consensus protocols

Page 21

Distributed averaging solution:

• Associate a weight 𝑊(𝑖, 𝑗) > 0 to (𝑗, 𝑖) ∈ 𝐸

• Let each agent compute in parallel the weighted average

𝑥𝑖(𝑘 + 1) = ∑_{𝑗∈𝑁𝑖} 𝑊(𝑖, 𝑗) 𝑥𝑗(𝑘)

Theorem

If the weight matrix 𝑊 is doubly-stochastic and the communication graph 𝐺 = (𝑉, 𝐸) is (strongly) connected, then

lim_{𝑘→∞} 𝑥𝑖(𝑘) = x̄(0),  𝑖 = 1, … , 𝑚

Application: consensus protocols
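
A quick numerical illustration of the theorem in NumPy; the doubly-stochastic matrix below (a ring of 6 agents with self-loops) is an illustrative choice, not the example graph from the slides:

```python
import numpy as np

m = 6
# Doubly-stochastic W on a strongly connected ring with self-loops:
# each agent averages its own state with its two ring neighbors.
W = 0.5 * np.eye(m) + 0.25 * np.roll(np.eye(m), 1, axis=1) \
                    + 0.25 * np.roll(np.eye(m), -1, axis=1)

rng = np.random.default_rng(0)
x = rng.normal(size=m)      # initial states x_i(0)
target = x.mean()           # the average of the initial states

for _ in range(200):        # iterate x(k+1) = W x(k)
    x = W @ x

print(np.allclose(x, target))  # True: every agent converges to the average
```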

Page 22

Proof of the distributed average consensus theorem

Let 𝑥 = [𝑥1 𝑥2 ⋯ 𝑥𝑚]ᵀ be the collection of the states of the 𝑚 agents. Then, we have

𝑥(𝑘 + 1) = 𝑊𝑥(𝑘)

Application: consensus protocols

Page 23

Proof of the distributed average consensus theorem

Let 𝑥 = [𝑥1 𝑥2 ⋯ 𝑥𝑚]ᵀ be the collection of the states of the 𝑚 agents. Then, we have

𝑥(𝑘 + 1) = 𝑊𝑥(𝑘)

Property 1: the average x̄(𝑘) = (1/𝑚) ∑_{𝑖} 𝑥𝑖(𝑘) is preserved throughout the iterations

Since 𝟏ᵀ𝑊 = 𝟏ᵀ, we have that

x̄(𝑘 + 1) = (1/𝑚) 𝟏ᵀ𝑥(𝑘 + 1) = (1/𝑚) 𝟏ᵀ𝑊𝑥(𝑘) = (1/𝑚) 𝟏ᵀ𝑥(𝑘) = x̄(𝑘) = x̄(0)

Application: consensus protocols

Page 24

Proof of the distributed average consensus theorem

Let 𝑥 = [𝑥1 𝑥2 ⋯ 𝑥𝑚]ᵀ be the collection of the states of the 𝑚 agents. Then, we have

𝑥(𝑘 + 1) = 𝑊𝑥(𝑘)

Property 1: x̄(𝑘) = (1/𝑚) 𝟏ᵀ𝑥(𝑘) = x̄(0), 𝑘 > 0

Note also that since

• 𝑊𝟏x̄(𝑘) = 𝟏x̄(𝑘)

• (1/𝑚) 𝟏𝟏ᵀ(𝑥(𝑘) − 𝟏x̄(𝑘)) = 𝟏((1/𝑚) 𝟏ᵀ𝑥(𝑘) − (1/𝑚) 𝟏ᵀ𝟏x̄(𝑘)) = 0

Application: consensus protocols

Page 25

Proof of the distributed average consensus theorem

Let 𝑥 = [𝑥1 𝑥2 ⋯ 𝑥𝑚]ᵀ be the collection of the states of the 𝑚 agents. Then, we have

𝑥(𝑘 + 1) = 𝑊𝑥(𝑘)

Property 1: x̄(𝑘) = (1/𝑚) 𝟏ᵀ𝑥(𝑘) = x̄(0), 𝑘 > 0

Note also that since

• 𝑊𝟏x̄(𝑘) = 𝟏x̄(𝑘)

• (1/𝑚) 𝟏𝟏ᵀ(𝑥(𝑘) − 𝟏x̄(𝑘)) = 𝟏((1/𝑚) 𝟏ᵀ𝑥(𝑘) − (1/𝑚) 𝟏ᵀ𝟏x̄(𝑘)) = 0

the dynamics of the disagreement vector is given by:

𝑥(𝑘 + 1) − 𝟏x̄(𝑘 + 1) = 𝑊𝑥(𝑘) − 𝟏x̄(𝑘) = 𝑊(𝑥(𝑘) − 𝟏x̄(𝑘)) = (𝑊 − (1/𝑚) 𝟏𝟏ᵀ)(𝑥(𝑘) − 𝟏x̄(𝑘))

Application: consensus protocols

Page 26

Proof of the distributed average consensus theorem

𝑥(𝑘 + 1) − 𝟏x̄(𝑘 + 1) = (𝑊 − (1/𝑚) 𝟏𝟏ᵀ)(𝑥(𝑘) − 𝟏x̄(𝑘))

We now just need to prove that 𝑊 − (1/𝑚) 𝟏𝟏ᵀ has all eigenvalues with |λ| < 1.

Application: consensus protocols

Page 27

Proof of the distributed average consensus theorem

𝑥(𝑘 + 1) − 𝟏x̄(𝑘 + 1) = (𝑊 − (1/𝑚) 𝟏𝟏ᵀ)(𝑥(𝑘) − 𝟏x̄(𝑘))

We now just need to prove that 𝑊 − (1/𝑚) 𝟏𝟏ᵀ has all eigenvalues with |λ| < 1.

Preliminary results:

Application: consensus protocols

Page 28

Proof of the distributed average consensus theorem

𝑥(𝑘 + 1) − 𝟏x̄(𝑘 + 1) = (𝑊 − (1/𝑚) 𝟏𝟏ᵀ)(𝑥(𝑘) − 𝟏x̄(𝑘))

We now just need to prove that 𝑊 − (1/𝑚) 𝟏𝟏ᵀ has all eigenvalues with |λ| < 1.

Preliminary results:

• since 𝑊 is doubly-stochastic, it has all eigenvalues with |λ| ≤ 1

λ𝑣 = 𝑊𝑣 → |λ| max_{𝑖} |𝑣𝑖| = |λ| |𝑣𝑖∗| = |∑_{𝑗} 𝑊(𝑖∗, 𝑗) 𝑣𝑗| ≤ ∑_{𝑗} 𝑊(𝑖∗, 𝑗) |𝑣𝑗| ≤ |𝑣𝑖∗|

Application: consensus protocols

Page 29

Proof of the distributed average consensus theorem

𝑥(𝑘 + 1) − 𝟏x̄(𝑘 + 1) = (𝑊 − (1/𝑚) 𝟏𝟏ᵀ)(𝑥(𝑘) − 𝟏x̄(𝑘))

We now just need to prove that 𝑊 − (1/𝑚) 𝟏𝟏ᵀ has all eigenvalues with |λ| < 1.

Preliminary results:

• since 𝑊 is doubly-stochastic, it has all eigenvalues with |λ| ≤ 1

λ𝑣 = 𝑊𝑣 → |λ| max_{𝑖} |𝑣𝑖| = |λ| |𝑣𝑖∗| = |∑_{𝑗} 𝑊(𝑖∗, 𝑗) 𝑣𝑗| ≤ ∑_{𝑗} 𝑊(𝑖∗, 𝑗) |𝑣𝑗| ≤ |𝑣𝑖∗|

• each eigenvector 𝑣 associated with an eigenvalue λ with |λ| < 1 is orthogonal to 𝟏: 𝟏ᵀ𝑊𝑣 = 𝟏ᵀλ𝑣 → 𝟏ᵀ𝑣 = λ𝟏ᵀ𝑣 → 𝟏ᵀ𝑣 = 0

Application: consensus protocols

Page 30

Proof of the distributed average consensus theorem

𝑥(𝑘 + 1) − 𝟏x̄(𝑘 + 1) = (𝑊 − (1/𝑚) 𝟏𝟏ᵀ)(𝑥(𝑘) − 𝟏x̄(𝑘))

We now just need to prove that 𝑊 − (1/𝑚) 𝟏𝟏ᵀ has all eigenvalues with |λ| < 1.

Preliminary results:

• since 𝑊 is doubly-stochastic, it has all eigenvalues with |λ| ≤ 1

λ𝑣 = 𝑊𝑣 → |λ| max_{𝑖} |𝑣𝑖| = |λ| |𝑣𝑖∗| = |∑_{𝑗} 𝑊(𝑖∗, 𝑗) 𝑣𝑗| ≤ ∑_{𝑗} 𝑊(𝑖∗, 𝑗) |𝑣𝑗| ≤ |𝑣𝑖∗|

• each eigenvector 𝑣 associated with an eigenvalue λ with |λ| < 1 is orthogonal to 𝟏: 𝟏ᵀ𝑊𝑣 = 𝟏ᵀλ𝑣 → 𝟏ᵀ𝑣 = λ𝟏ᵀ𝑣 → 𝟏ᵀ𝑣 = 0

• λ = 1 is an eigenvalue of 𝑊 and 𝟏 is an eigenvector associated with it: 𝑊𝟏 = 𝟏 → λ = 1

Application: consensus protocols

Page 31

Proof of the distributed average consensus theorem

𝑥(𝑘 + 1) − 𝟏x̄(𝑘 + 1) = (𝑊 − (1/𝑚) 𝟏𝟏ᵀ)(𝑥(𝑘) − 𝟏x̄(𝑘))

We now just need to prove that 𝑊 − (1/𝑚) 𝟏𝟏ᵀ has all eigenvalues with |λ| < 1.

Preliminary results:

• since 𝑊 is doubly-stochastic, it has all eigenvalues with |λ| ≤ 1

λ𝑣 = 𝑊𝑣 → |λ| max_{𝑖} |𝑣𝑖| = |λ| |𝑣𝑖∗| = |∑_{𝑗} 𝑊(𝑖∗, 𝑗) 𝑣𝑗| ≤ ∑_{𝑗} 𝑊(𝑖∗, 𝑗) |𝑣𝑗| ≤ |𝑣𝑖∗|

• each eigenvector 𝑣 associated with an eigenvalue λ with |λ| < 1 is orthogonal to 𝟏: 𝟏ᵀ𝑊𝑣 = 𝟏ᵀλ𝑣 → 𝟏ᵀ𝑣 = λ𝟏ᵀ𝑣 → 𝟏ᵀ𝑣 = 0

• λ = 1 is an eigenvalue of 𝑊 and 𝟏 is an eigenvector associated with it: 𝑊𝟏 = 𝟏 → λ = 1

• since the graph is strongly connected, by the Perron–Frobenius theorem only one eigenvalue of 𝑊 satisfies |λ| = 1; it is equal to 1 and simple, with eigenvector 𝟏 (recall that 𝑊 has positive diagonal entries, since each agent communicates with itself)

Application: consensus protocols

Page 32

Proof of the distributed average consensus theorem

𝑥(𝑘 + 1) − 𝟏x̄(𝑘 + 1) = (𝑊 − (1/𝑚) 𝟏𝟏ᵀ)(𝑥(𝑘) − 𝟏x̄(𝑘))

We now just need to prove that 𝑊 − (1/𝑚) 𝟏𝟏ᵀ has all eigenvalues with |λ| < 1.

To this end, using the preliminary results we shall show that

1. the eigenvalue λ = 1 of 𝑊 with eigenvector 𝟏 is shifted to 0

2. all the other eigenvalues of 𝑊 (with |λ| < 1) are preserved

Application: consensus protocols

Page 33

Proof of the distributed average consensus theorem

𝑥(𝑘 + 1) − 𝟏x̄(𝑘 + 1) = (𝑊 − (1/𝑚) 𝟏𝟏ᵀ)(𝑥(𝑘) − 𝟏x̄(𝑘))

We now just need to prove that 𝑊 − (1/𝑚) 𝟏𝟏ᵀ has all eigenvalues with |λ| < 1.

To this end, using the preliminary results we shall show that

1. the eigenvalue λ = 1 of 𝑊 with eigenvector 𝟏 is shifted to 0

(𝑊 − (1/𝑚) 𝟏𝟏ᵀ) 𝟏 = 𝑊𝟏 − (1/𝑚) 𝟏𝟏ᵀ𝟏 = 𝟏 − 𝟏 = 0 ∙ 𝟏

Application: consensus protocols

Page 34

Proof of the distributed average consensus theorem

𝑥(𝑘 + 1) − 𝟏x̄(𝑘 + 1) = (𝑊 − (1/𝑚) 𝟏𝟏ᵀ)(𝑥(𝑘) − 𝟏x̄(𝑘))

We now just need to prove that 𝑊 − (1/𝑚) 𝟏𝟏ᵀ has all eigenvalues with |λ| < 1.

To this end, using the preliminary results we shall show that

1. the eigenvalue λ = 1 of 𝑊 with eigenvector 𝟏 is shifted to 0

2. all the other eigenvalues of 𝑊 (with |λ| < 1) are preserved

Let 𝑣 be an eigenvector associated with an eigenvalue λ of 𝑊 with |λ| < 1; then it is orthogonal to 𝟏 and, hence,

(𝑊 − (1/𝑚) 𝟏𝟏ᵀ) 𝑣 = 𝑊𝑣 − (1/𝑚) 𝟏𝟏ᵀ𝑣 = 𝑊𝑣 = λ𝑣

Application: consensus protocols
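
The two facts above can also be checked numerically; a minimal NumPy sketch using the same illustrative ring-with-self-loops weight matrix as before: the spectrum of 𝑊 − (1/𝑚) 𝟏𝟏ᵀ equals that of 𝑊 with the eigenvalue 1 replaced by 0.

```python
import numpy as np

m = 6
W = 0.5 * np.eye(m) + 0.25 * np.roll(np.eye(m), 1, axis=1) \
                    + 0.25 * np.roll(np.eye(m), -1, axis=1)  # doubly stochastic

ones = np.ones((m, 1))
M = W - ones @ ones.T / m                # W - (1/m) 1 1^T

print(np.linalg.eigvalsh(W))             # contains λ = 1, all others |λ| < 1
print(np.linalg.eigvalsh(M))             # same values with 1 replaced by 0
```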

Page 35

Let the time-varying graphs 𝐺𝑘 = (𝑉, 𝐸𝑘), 𝑘 = 0, 1, … with

𝑉 = {1, 2, … , 𝑚},  𝐸𝑘 = {(𝑗, 𝑖) : 𝑗 communicates with 𝑖 at time 𝑘}

model a time-varying communication network

Introduce the weight matrices 𝑊𝑘 , k = 0,1, … associated with 𝐺𝑘 , 𝑘 = 0,1, … and

consider the distributed algorithm

𝑥(𝑘 + 1) = 𝑊𝑘 𝑥(𝑘)

Time-varying setting

Page 36

Assumptions:

• Connectivity

(𝑉, 𝐸∞) strongly connected, where 𝐸∞ = {(𝑗, 𝑖) : 𝑗 communicates with 𝑖 infinitely often}

• Bounded intercommunication time (partial asynchronism)

there exists 𝑇 ≥ 1 such that, for every (𝑗, 𝑖) ∈ 𝐸∞,

(𝑗, 𝑖) ∈ 𝐸𝑘 ∪ 𝐸𝑘+1 ∪ ⋯ ∪ 𝐸𝑘+𝑇−1, 𝑘 ≥ 0

i.e., agent 𝑖 receives information from a neighboring agent 𝑗 at least once every 𝑇 consecutive iterations

• Weights rules

each 𝑊𝑘 is doubly-stochastic and there exists 𝜂 > 0 such that

𝑊𝑘(𝑖, 𝑖) ≥ 𝜂, ∀𝑖, ∀𝑘

𝑊𝑘(𝑖, 𝑗) ≥ 𝜂, ∀(𝑗, 𝑖) ∈ 𝐸𝑘

Time-varying setting

Page 37

Let the time-varying graphs 𝐺𝑘 = (𝑉, 𝐸𝑘), 𝑘 = 0, 1, … with

𝑉 = {1, 2, … , 𝑚},  𝐸𝑘 = {(𝑗, 𝑖) : 𝑗 communicates with 𝑖 at time 𝑘}

model a time-varying communication network

Introduce the weight matrices 𝑊𝑘 , k = 0,1, … associated with 𝐺𝑘 , 𝑘 = 0,1, … and

consider the distributed algorithm

𝑥(𝑘 + 1) = 𝑊𝑘 𝑥(𝑘)

Theorem

Under the previous assumptions on the communication network and the weights, the agents asymptotically reach consensus on the average

lim_{𝑘→∞} 𝑥𝑖(𝑘) = x̄(0),  𝑖 = 1, … , 𝑚

Time-varying setting
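
A small NumPy illustration of the time-varying result; the alternating pairwise-averaging matrices below are an illustrative choice that satisfies the assumptions (each 𝑊𝑘 is doubly stochastic with entries ≥ η = 0.5, every edge of the union graph reappears within 𝑇 = 2 steps, and the union graph, a ring with self-loops, is strongly connected). Agents are indexed 0, …, 5 here:

```python
import numpy as np

def pairwise_averaging_matrix(m, pairs):
    """Doubly-stochastic W_k: each listed pair of agents averages its two
    values; every other agent keeps its own value."""
    W = np.eye(m)
    for i, j in pairs:
        W[i, i] = W[j, j] = 0.5
        W[i, j] = W[j, i] = 0.5
    return W

m = 6
W_even = pairwise_averaging_matrix(m, [(0, 1), (2, 3), (4, 5)])
W_odd  = pairwise_averaging_matrix(m, [(1, 2), (3, 4), (5, 0)])

x = np.array([1.0, 4.0, 2.0, 8.0, 5.0, 7.0])   # initial states x(0)
target = x.mean()                               # average of the initial states
for k in range(300):
    x = (W_even if k % 2 == 0 else W_odd) @ x   # x(k+1) = W_k x(k)
print(np.allclose(x, target))                   # True: consensus on the average
```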