
Network Coding Theory
Raymond W. Yeung
Department of Information Engineering
The Chinese University of Hong Kong



Course Outline

Introduction
Linear Network Coding
1. Acyclic Networks
2. Cyclic Networks
Network Coding for Multiple Sources
Concluding Remarks

Introduction

Multicast vs Broadcast

Broadcast – One-to-many communication without specific receiver addresses

Multicast – One-to-many communication with specific receiver addresses

What is Network Coding?

The traditional approach (store-and-forward) to multicasting is to find an efficient multicast tree in the network, with the nodes acting as switches which can route and/or replicate information.

Network coding is a new theory which changes the way we think of transmitting information in a network.

What is Network Coding?

In the paradigm of network coding, nodes can encode information received from the input links before it is transmitted on the output links.

Switching is a special case of network coding.

A Network Coding Example

The Butterfly Network

[Figure: the butterfly network. The source multicasts two bits b1 and b2. The side channels forward b1 and b2 unchanged, while the bottleneck node transmits the sum b1+b2; each sink then recovers both b1 and b2.]
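The butterfly example above can be sketched in a few lines of Python. This is an illustrative simulation, not from the slides: addition in GF(2) is XOR, so the coded bottleneck edge carries b1 XOR b2, and each sink recovers the missing bit by XORing again.

```python
def butterfly(b1, b2):
    # Middle node encodes: XOR of the two incoming bits (addition in GF(2)).
    coded = b1 ^ b2
    # Sink 1 hears b1 directly plus the coded bit on the bottleneck.
    sink1 = (b1, b1 ^ coded)        # second component recovers b2
    # Sink 2 hears b2 directly plus the coded bit on the bottleneck.
    sink2 = (b2 ^ coded, b2)        # first component recovers b1
    return sink1, sink2

for b1 in (0, 1):
    for b2 in (0, 1):
        assert butterfly(b1, b2) == ((b1, b2), (b1, b2))
print("both sinks recover (b1, b2) in every case")
```

Without the coded edge, the single bottleneck channel could carry only one of the two bits per use, so store-and-forward cannot serve both sinks at rate 2.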

A Network Coding Example with Two Sources

[Figure: a two-source variant. One source generates b1 and the other generates b2; an intermediate node transmits b1+b2, and each sink recovers both bits.]

Wireless/Satellite Application

[Figure: two terminals exchange bits b1 and b2 through a satellite. At t = 1 the satellite receives b1; at t = 2 it receives b2; at t = 3 it broadcasts b1+b2, from which each terminal recovers the other's bit.]

50% saving for downlink bandwidth!

Two Themes of Network Coding

When there is 1 source to be multicast in a network, store-and-forward may fail to optimize bandwidth.

When there are 2 or more independent sources to be transmitted in a network (even for unicast), store-and-forward may fail to optimize bandwidth.

Why Surprising?

Classical information theory (C. E. Shannon, 1948) tells us that for point-to-point communication, all information can ultimately be represented as independent (i.i.d.) bits through data compression.

Independent bits cannot be further compressed; they are a commodity.

Why Surprising?

The folklore was that the commodity view of information continues to apply in networks.

For a long time, processing of information at the intermediate nodes in a network for the purpose of transmission was not considered.

Applications of Network Coding

Computer Networks
Wireless/Satellite Communications
Distributed information storage/dissemination (e.g., BitTorrent)
Robust Network Management
Network Error Correction

Recent Activities

Over 120 papers have been written on the subject in the context of
– Information and Coding Theory
– Networking
– Wireless Communications
– Computer Science (algorithms, computational complexity)
– Graph Theory, Matrix Theory

Recent Activities

Joint special issue of IEEE Transactions on Information Theory and IEEE/ACM Transactions on Networking.

Calls for papers for various conferences (PODC 2004, ICC 2005, NetCod 2005/06, WirelessCom 2005, ISIT 2006, International Symposium on Turbo Coding 2006, etc.)

Tutorials at Globecom 2004, ISIT 2005.

Model of a Point-to-Point Network

A network is represented by a graph G = (V, E) with node set V and edge (channel) set E.

A symbol from an alphabet F can be transmitted on each channel.

There can be multiple edges between a pair of nodes.

Single-Source Network Coding

The source node S generates an information vector

x = (x1 x2 … xω) ∈ F^ω.

What is the condition for a node T to be able to receive the information vector x?

Max-Flow Min-Cut Theorem

Consider a graph G = (V, E). There is a source node S and a sink node T in the set V.

A cut in the graph G is a partition of the node set V into two subsets V1 and V2 such that S is in V1 and T is in V2.

The capacity of a cut is the total number of edges across the cut from V1 to V2.

Max-Flow Min-Cut Theorem

The max-flow from S to T, denoted by maxflow(T), is the maximum rate at which a fluid (e.g., water) can flow from S to T, provided that the rate of flow in each edge does not exceed 1 and the flow is conserved at each node.

By the Max-Flow Min-Cut Theorem, maxflow(T) is equal to the minimum value of the capacity of a cut in G.
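The max-flow just defined can be computed with BFS augmenting paths (the Edmonds-Karp method). Below is a minimal Python sketch for the unit-capacity setting of this slide, applied to the butterfly network; the node numbering is an assumption for illustration.

```python
from collections import deque

def maxflow(n, edges, s, t):
    # Residual capacity matrix; each listed edge carries at most 1 unit.
    cap = [[0] * n for _ in range(n)]
    for u, v in edges:
        cap[u][v] += 1
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if cap[u][v] > 0 and parent[v] == -1:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:          # no augmenting path: flow is maximal
            return flow
        v = t
        while v != s:                # push 1 unit along the path found
            u = parent[v]
            cap[u][v] -= 1
            cap[v][u] += 1
            v = u
        flow += 1

# Butterfly network: S=0, intermediate nodes 1..4, sinks 5 and 6.
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (1, 5),
         (2, 6), (3, 4), (4, 5), (4, 6)]
print(maxflow(7, edges, 0, 5))  # → 2 (and likewise for sink 6)
```

By the theorem, this value of 2 equals the capacity of a minimum cut separating the source from each sink.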

The Max-Flow Bound

What is the condition for a node T to be able to receive the information vector

x = (x1 x2 … xω)?

Obviously, if maxflow(T) < ω, then T cannot possibly receive x.

The Basic Result

Network coding was first proposed in

R. Ahlswede, N. Cai, S.-Y. R. Li, and R. W. Yeung, "Network Information Flow," IEEE Transactions on Information Theory, vol. 46, pp. 1204-1216, July 2000.

Their result says that a node T can receive the information vector x iff maxflow(T) ≥ ω, i.e., the max-flow bound can be achieved simultaneously by all such nodes T (using network coding).

Linear Network Coding

In a linear network code, messages are represented as symbols in a finite field, and encoding/decoding is by means of linear operations in the finite field.

Linear codes greatly simplify the encoding/decoding process.

Linear Network Coding

It was proved that the max-flow bound can be achieved by linear network codes:

S.-Y. R. Li, R. W. Yeung, and N. Cai, "Linear Network Coding," IEEE Transactions on Information Theory, vol. 49, no. 2, pp. 371-381, 2003 (vector space approach).

R. Koetter and M. Medard, "An Algebraic Approach to Network Coding," IEEE/ACM Transactions on Networking, October 2003 (matrix approach).

Linear Network Code Constructions

A construction of a generic linear network code was provided in LYC.

A generic linear network code is considerably stronger than a linear multicast, a linear network code that achieves the max-flow bound in the weakest sense.

The LYC algorithm has been refined for constructing a linear multicast:

S. Jaggi, P. Sanders, et al., "Polynomial time algorithms for multicast network code construction," submitted to IEEE Transactions on Information Theory.

Multi-Source Network Coding

Point-to-point communication
– Classical information theory says that if 2 independent sources X and Y are to be transmitted from one point to another point, coding X and Y separately or jointly does not make any difference asymptotically, because H(X,Y) = H(X) + H(Y).
– A multi-source problem can be decomposed into a number of single-source problems.

Multi-Source Network Coding

Network multicast
– To multicast 2 independent sources X and Y, sometimes bandwidth optimality cannot be achieved by coding X and Y separately (Y 95).
– General acyclic networks have been studied (Y and Zhang 99; Song, Y and Cai 04).
– A taxonomy of network coding problems has been reported by Lehman and Lehman (no coding, polynomial-time solvable, NP-hard).

Some Ramifications

Robust network management
Separation of network coding and channel coding
Secure network coding
Network error correction
Network coding on undirected networks

Robust Network Management

Koetter and Medard (03) proved the existence of static network codes in the presence of link failures, as long as the max-flow of each receiving node in the resulting network continues to exceed the information rate.

Algorithms are designed for efficient network recovery.

Explicit construction of static network codes to be discussed.

Separation of Network Coding and Channel Coding

Adjacent nodes are connected by a noisy channel.

All the channels are assumed memoryless and independent.

Separation of network coding and channel coding is asymptotically optimal for single-source networks (Borade 02, Song and Y 04).

Separation of Network Coding and Channel Coding

This is a generalization of Shannon's celebrated result that the capacity of a memoryless channel cannot be increased by feedback.

Secure Network Coding

A collection of subsets of links is given.

An eavesdropper is allowed to access any one (but not more than one) of these subsets.

The eavesdropper must not be able to obtain any information about the information source transmitted.

Secure Network Coding

This model is a network generalization of secret sharing in cryptography (Cai and Y 02).

Network Error Correction

Error correction in existing communication networks is done on a link-by-link basis.

Network error correction achieves error correction in a network in a distributed manner, in conjunction with network coding.

Network generalizations of the Hamming bound and the Gilbert-Varshamov bound have been obtained (Cai and Y 02).

Network Coding on Undirected Networks

The total bandwidth of a channel can be time-shared between the 2 directions (e.g., walkie-talkie).

It is conjectured that network coding does not help in undirected networks (Li and Li 04).

Linear Network Coding
Acyclic Networks

Acyclic Networks

The following type of networks is considered:
– Cycle-free. The nodes can be ordered in an upstream-to-downstream manner.
– There is only one information source in the whole network, generated at the node S.
– Transmission is assumed to be instantaneous.

The max-flow bound can be achieved by linear network coding at different levels.

Notation

Let F be a finite field.

The information source generates a vector x = (x1 x2 … xω) in F^ω.

The value of the max-flow from S to a node T is denoted by maxflow(T).

In(T) denotes the set of edges incoming at T.

Out(T) denotes the set of edges outgoing from T.

Imaginary Channels

The information vector contains ω symbols from F.

By convention, we assume that there are ω imaginary channels ending at the source node S when ω is specified.

Linear Mapping

Let f: F^ω (row vectors) → F. If f is linear, then there exists a column vector a ∈ F^ω such that

f(x) = x·a

for all row vectors x ∈ F^ω.
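As a quick numerical illustration (not from the slides), the claim can be checked over a small field, say GF(5), with an arbitrary choice of the vector a:

```python
# Sketch: a linear map on GF(5)^3 given by a dot product with a fixed
# vector a. Both ω = 3 and a = [2, 0, 4] are illustrative choices.
p = 5
a = [2, 0, 4]

def f(x):
    # f(x) = x·a, computed modulo p.
    return sum(xi * ai for xi, ai in zip(x, a)) % p

# Linearity: f(x + y) = f(x) + f(y) and f(c·x) = c·f(x) over GF(5).
x, y, c = [1, 2, 3], [4, 0, 1], 3
assert f([(xi + yi) % p for xi, yi in zip(x, y)]) == (f(x) + f(y)) % p
assert f([(c * xi) % p for xi in x]) == (c * f(x)) % p
print("f(x) = x·a is linear over GF(5)")
```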

Adjacent Pair of Channels

A pair (d, e) of channels is called an adjacent pair if there exists a node T with d ∈ In(T) and e ∈ Out(T).

Local Description of a Linear Network Code

Definition 2.4. Let F be a finite field and ω a positive integer. An ω-dimensional F-valued linear network code on an acyclic network consists of a scalar k_{d,e}, called the local encoding kernel, for every adjacent pair of channels (d, e). The local encoding kernel at the node T refers to the |In(T)| × |Out(T)| matrix K_T = (k_{d,e})_{d∈In(T), e∈Out(T)}.

Local Description (cont.)

Let y_e be the symbol sent on the channel e.

The local input-output relation at a node T is given by

y_e = Σ_{d∈In(T)} k_{d,e} y_d

for all e ∈ Out(T), i.e., an output symbol from T is a linear combination of the input symbols at T.
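The input-output relation at a single node can be sketched directly. The kernel matrix K_T below is an illustrative choice over GF(2) (two inputs, two outputs, the second output being the XOR of the inputs), not one taken from the slides.

```python
# Sketch of y_e = Σ_{d∈In(T)} k_{d,e} y_d at one node, over GF(2).
p = 2
K_T = [[1, 1],   # row d = 0: kernels k_{d,e} toward each outgoing channel e
       [0, 1]]   # row d = 1

def node_outputs(y_in, K):
    # One output symbol per outgoing channel: a linear combination
    # of the input symbols, with coefficients from the column of K.
    n_out = len(K[0])
    return [sum(K[d][e] * y_in[d] for d in range(len(y_in))) % p
            for e in range(n_out)]

print(node_outputs([1, 1], K_T))  # → [1, 0]: second output is 1 XOR 1
```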

Global Encoding Kernel

Given the local description of a linear network code, the code maps the information vector x to each symbol y_e sent on the channel e.

Obviously, such a mapping is linear.

Therefore, y_e = x·f_e for some column vector f_e ∈ F^ω.

The vector f_e, called the global encoding kernel for the channel e, is like a transfer function.

Global Description of a Linear Network Code

Definition 2.5. An ω-dimensional F-valued linear network code consists of a scalar k_{d,e} for every adjacent pair of channels (d, e) in the network, as well as an ω-dimensional column vector f_e for every channel e, such that:

f_e = Σ_{d∈In(T)} k_{d,e} f_d for e ∈ Out(T).

The vectors f_e corresponding to the ω imaginary channels e ∈ In(S) form the natural basis of the vector space F^ω.

The global description of a linear network code incorporates the local description.

The relation f_e = Σ_{d∈In(T)} k_{d,e} f_d for e ∈ Out(T) says that the global encoding kernel of an output channel at T is a linear combination of the global encoding kernels of the input channels at T.

From this, we have

y_e = x·f_e = x·(Σ_{d∈In(T)} k_{d,e} f_d) = Σ_{d∈In(T)} k_{d,e} (x·f_d) = Σ_{d∈In(T)} k_{d,e} y_d

as in the local description.
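The relation f_e = Σ_{d∈In(T)} k_{d,e} f_d can be computed by a single upstream-to-downstream pass. The sketch below does this for the butterfly network over GF(2), with all local kernels set to 1 (an assumed choice matching the classic example) and illustrative edge names, starting from the natural basis on the two imaginary channels.

```python
# Sketch: propagate global encoding kernels on the butterfly network.
p = 2

def combine(vectors):
    # f_e as a GF(2) sum of the incoming global kernels (all kernels = 1).
    return [sum(col) % p for col in zip(*vectors)]

f = {"im1": [1, 0], "im2": [0, 1]}           # natural basis at the source
f["S->T"] = f["im1"]
f["S->U"] = f["im2"]
f["T->W"] = f["T->Y"] = f["S->T"]            # forwarding nodes
f["U->W"] = f["U->Z"] = f["S->U"]
f["W->X"] = combine([f["T->W"], f["U->W"]])  # the coded bottleneck edge
f["X->Y"] = f["X->Z"] = f["W->X"]

# Check y_e = x·f_e: the bottleneck carries b1 + b2, here for x = (1, 0).
x = (1, 0)
def y(e):
    return sum(xi * fi for xi, fi in zip(x, f[e])) % p

print(f["W->X"], y("W->X"))  # → [1, 1] 1
```

The global kernel [1, 1] on the bottleneck encodes precisely the b1+b2 operation from the butterfly example.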

Example 2.6

Figure 2.2 shows the relation between the local description and the global description.

Example 2.7

Figure 2.3 shows the same network as in Example 2.6, with all the local encoding kernels being indeterminates.

Desirable Properties of a Linear Desirable Properties of a Linear Network CodeNetwork Code

Definition 2.8. Definition 2.8. Let Let VVTT = = ⟨⟨{{ffdd: d : d ∈∈ In(TIn(T)})}⟩⟩. .

An An ωω--dimensional Fdimensional F--valued linear network code valued linear network code qualifies as aqualifies as aLinear multicastLinear multicast dim(Vdim(VTT) = ) = ωω for every nonfor every non--source source

node T with node T with maxflow(Tmaxflow(T) ) ≥≥ ωω..Linear broadcastLinear broadcast dim(Vdim(VTT) = min{) = min{ωω, , maxflow(Tmaxflow(T)} for )} for

every nonevery non--source node T. source node T. Linear dispersionLinear dispersion

dimdim⟨∪⟨∪TT∈℘∈℘ VVTT⟩⟩ = min{= min{ωω, , maxflowmaxflow((℘℘)} for every )} for every collection collection ℘℘ of nonof non--source nodes.source nodes.

Linear Multicast

For every node T, if maxflow(T) ≥ ω, then T receives the whole vector space, so that the information vector x can be recovered.

If maxflow(T) < ω, possibly no useful information can be recovered.

A linear network code may not be a linear multicast. See Figure 2.4(d).
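Decoding at a qualifying sink amounts to inverting the matrix formed by the global kernels of its input channels. A minimal GF(2) sketch for a butterfly sink, assuming the classic kernels [1, 0] (direct bit) and [1, 1] (coded bit), which span F^2:

```python
# Sketch: sink decoding when dim(V_T) = ω = 2 over GF(2).
# Received symbols: y1 = x·[1,0] = b1 and y2 = x·[1,1] = b1 + b2,
# so Gaussian elimination reduces to b1 = y1 and b2 = y1 XOR y2.
def decode(y1, y2):
    b1 = y1
    b2 = y1 ^ y2
    return b1, b2

for b1 in (0, 1):
    for b2 in (0, 1):
        y1, y2 = b1, b1 ^ b2        # what this sink actually receives
        assert decode(y1, y2) == (b1, b2)
print("sink recovers x whenever its global kernels span F^2")
```

If the two kernels were linearly dependent (e.g., both [1, 1]), the matrix would be singular and x could not be recovered, which is exactly the failure mode of a code that is not a linear multicast.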

Linear Broadcast

For every node T, if maxflow(T) ≥ ω, then T receives the whole vector space, so that the information vector x can be recovered.

If maxflow(T) < ω, then T receives maxflow(T) dimensions, but which dimensions are received is not guaranteed.

A linear broadcast is stronger than a linear multicast. See Figure 2.4(c).

Application of Linear Broadcast

For a one-time information vector x, a linear broadcast allows each node T to receive x at a rate equal to maxflow(T), the maximum possible rate.

See Example 2.22.

Linear Dispersion

Example of a binary linear dispersion: see Figure 2.4(a).

For every collection ℘ of nodes, if maxflow(℘) ≥ ω, then ℘ receives the whole vector space, so that the information vector x can be recovered.

If maxflow(℘) < ω, then ℘ receives maxflow(℘) dimensions.

Linear dispersion is stronger than a linear broadcast. See Figure 2.4(b).

Application of Linear Dispersion

A collection of nodes, each of which has maxflow < ω, can be pooled together to receive the whole information vector x.

This facilitates network expansion.

Example 2.10

The linear network code in Figure 2.2 is a linear dispersion regardless of the choice of the field F.

Hence, it is also a linear broadcast and a linear multicast.

Example 2.11

The more general linear network code in Figure 2.3 is a linear multicast when
– f_{TW} and f_{UW} are linearly independent,
– f_{TY} and f_{XY} are linearly independent,
– f_{UZ} and f_{WZ} are linearly independent.

Equivalently, the criterion says that s, t, u, y, z, nr − pq, and npsw + nrux − pnsw − pquw are all nonzero. Example 2.6 is the special case with n = r = s = t = u = v = w = x = y = z = 1 and p = q = 0.

How to Construct Linear Network Codes with Desirable Properties

We want the global encoding kernels to be as independent as possible.

How can this be formulated algebraically?

Generic Linear Network Code

Definition 2.12. An ω-dimensional F-valued linear network code on an acyclic network is said to be generic if:

Let {e1, e2, …, en} be an arbitrary set of channels, where n ≤ ω and each e_j ∈ Out(T_j). Then the vectors f_{e1}, f_{e2}, …, f_{en} are linearly independent provided that

V_{T_j} ⊄ ⟨{f_{ek} : k ≠ j}⟩ for 1 ≤ j ≤ n.

Main Idea of a Generic Linear Network Code

In the collection of channels {e1, e2, …, en}, if each T_j knows something "not transmitted by" the other channels, then the channels e1, e2, …, en "transmit" independent global encoding kernels.

The converse always holds.

Roughly, in a generic linear network code, every collection of at most ω global encoding kernels that can possibly be linearly independent is linearly independent.

A Consequence of Genericity

For any collection of no more than dim(V_T) outgoing channels from a node T, the corresponding global encoding kernels are linearly independent.

In particular, if |Out(T)| ≤ dim(V_T), then the global encoding kernels of all the outgoing channels from T are linearly independent.

Example 2.13

Figure 2.5 shows a 2-dimensional linear dispersion which is not a generic linear network code.

The Effect of Field Size

To construct a linear network code with desirable properties, the field F often needs to be sufficiently large.

When the field size is small, certain polynomial equations are unavoidable. E.g., z^p − z = 0 for all z ∈ GF(p).

Figure 2.6 in Example 2.6 shows a network on which there exists a ternary linear multicast but not a binary linear multicast.

This is similar to the construction of a Vandermonde matrix.
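The unavoidable identity z^p − z = 0 over GF(p) is just Fermat's little theorem, and is easy to verify numerically:

```python
# Check that z^p = z for every z in GF(p), i.e., the nonzero polynomial
# z^p - z evaluates to 0 everywhere on a field of only p elements.
for p in (2, 3, 5, 7):
    assert all(pow(z, p, p) == z for z in range(p))
print("z^p - z vanishes identically on GF(p) for p = 2, 3, 5, 7")
```

This is why a small field can force a desired determinant polynomial to vanish at every assignment, even though the polynomial itself is nonzero.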

Lemma 2.15

Let g(z1, z2, …, zn) be a nonzero polynomial with coefficients in a field F. If |F| is greater than the degree of g in every z_j, then there exist a1, a2, …, an ∈ F such that

g(a1, a2, …, an) ≠ 0.

Proof: By induction on n, invoking the fundamental theorem of algebra.
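The lemma can be illustrated by brute force over a small field. The polynomial g below is an arbitrary illustrative choice of degree 1 in each variable, so the lemma applies as soon as |F| > 1; here F = GF(3).

```python
from itertools import product

# Sketch: find a point where the nonzero polynomial g(z1, z2) = z1*z2 + z1
# does not vanish over GF(3), as Lemma 2.15 guarantees.
p = 3

def g(z1, z2):
    return (z1 * z2 + z1) % p

witnesses = [(a1, a2) for a1, a2 in product(range(p), repeat=2)
             if g(a1, a2) != 0]
assert witnesses, "the lemma guarantees a nonvanishing point"
print(witnesses[0])  # → (1, 0): g = 1·0 + 1 = 1 ≠ 0 in GF(3)
```

In the code-construction setting, g is the product of sink determinants, and the nonvanishing point is an assignment of encoding kernels yielding a working code.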

Example 2.16Example 2.16Refer to Figure 2.3, with all the global encoding Refer to Figure 2.3, with all the global encoding kernels being kernels being indeterminatesindeterminates..LLWW, L, LYY, and L, and LZZ are the matrices of the global are the matrices of the global encoding kernels received at W, Y, and Z, encoding kernels received at W, Y, and Z, respresp..Clearly, Clearly, det(Ldet(LWW), ), det(Ldet(LYY), ), det(Ldet(LZZ) are not the zero ) are not the zero polynomial, or equivalently, polynomial, or equivalently,

P(n, p, q, …) = det(LW) · det(LY) · det(LZ) is not the zero polynomial.
When F is sufficiently large, by Lemma 2.15, it is possible to assign the global encoding kernels so that P(n, p, q, …) does not evaluate to 0.

Construction of Generic Network Code: Improved LYC Algorithm

Algorithm 2.17. Let F be a field with more than (n+ω−1 choose ω−1) elements.
{
  for (every channel e in the network except for the imaginary channels)
    fe = the zero vector;
  for (every node T, following an upstream-to-downstream order)
  {
    for (every channel e ∈ Out(T))
    {
      choose a vector w in the space VT such that w ∉ ⟨{fd : d ∈ ξ}⟩ for every
        collection ξ of ω−1 channels (possibly including imaginary channels in
        In(S) but excluding e) with VT ⊄ ⟨{fd : d ∈ ξ}⟩;
      fe = w;
    }
  }
}
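The key step of the algorithm is choosing a vector w outside a bounded number of proper subspaces. A small sketch of that step over GF(p), using rank comparisons to test span membership (all names and the toy subspaces are illustrative, not from the text):

```python
from itertools import product

# Sketch of the vector-choice step: pick w in F_p^k lying outside every
# given span <{f_d : d in xi}>. Membership is tested by comparing ranks
# over GF(p) via Gaussian elimination.

def rank_mod_p(rows, p):
    """Rank of a list of vectors over GF(p)."""
    m = [list(r) for r in rows]
    rank, cols = 0, len(m[0]) if m else 0
    for c in range(cols):
        pivot = next((r for r in range(rank, len(m)) if m[r][c] % p), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        inv = pow(m[rank][c], -1, p)          # modular inverse (Python 3.8+)
        m[rank] = [(x * inv) % p for x in m[rank]]
        for r in range(len(m)):
            if r != rank and m[r][c] % p:
                f = m[r][c]
                m[r] = [(x - f * y) % p for x, y in zip(m[r], m[rank])]
        rank += 1
    return rank

def choose_w(subspaces, k, p):
    """Return some nonzero w in F_p^k outside every span, or None."""
    for w in product(range(p), repeat=k):
        if any(w) and all(
            rank_mod_p(list(span) + [list(w)], p) > rank_mod_p(span, p)
            for span in subspaces
        ):
            return w
    return None

# Two 1-dimensional subspaces of GF(2)^2; (1, 1) avoids both.
print(choose_w([[[1, 0]], [[0, 1]]], 2, 2))
```

The counting argument in the next slide guarantees that, when |F| exceeds the number N of spans, the loop above always finds such a w.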

Justification of Field Size
To see the existence of such a vector w, denote dim(VT) = k and let
  N = (n+ω−1 choose ω−1).
If ξ is any collection of channels with VT ⊄ ⟨{fd : d ∈ ξ}⟩, then
  dim( VT ∩ ⟨{fd : d ∈ ξ}⟩ ) ≤ k−1.
There exist at most N such collections ξ. Thus
  | VT ∩ ( ∪ξ ⟨{fd : d ∈ ξ}⟩ ) | ≤ N |F|^(k−1) < |F|^k = |VT|.
Note: When F is very large, a generic code can be constructed randomly with high probability.

Justification of Algorithm
We need to show that the network code constructed is indeed generic. Refer to the paper.
In fact, the code constructed is somewhat stronger than what is required by a generic linear network code.
So we have proved Theorem 2.18: Given a positive integer ω and an acyclic network, there exists an ω-dimensional F-valued generic linear network code for a sufficiently large base field F.

Complexity Analysis
Recall that N = (n+ω−1 choose ω−1), which is polynomial in n for a fixed ω.
For each T, at most N collections ξ of ω−1 channels need to be processed.
Throughout the algorithm, at most nN collections of ω−1 channels need to be processed.
This can be implemented in time polynomial in n.

A Generic Linear Network Code is a Linear Dispersion

Theorem 2.27. For an ω-dimensional generic linear network code, for every set ℑ of non-source nodes,

dim⟨{fe : e ∈ cut(ℑ)}⟩ = min{ω, maxflow(ℑ)}.
Consequently, the generic linear network code is a linear dispersion.
Generic Linear Network Code ⇒ Linear Dispersion ⇒ Linear Broadcast ⇒ Linear Multicast

Consequences of the Existence of a Generic Linear Network Code

Corollary 2.19. Given a positive integer ω and an acyclic network, there exists an ω-dimensional F-valued linear dispersion for a sufficiently large base field F.

Corollary 2.20. Given a positive integer ω and an acyclic network, there exists an ω-dimensional F-valued linear broadcast for a sufficiently large base field F.

Corollary 2.21. Given a positive integer ω and an acyclic network, there exists an ω-dimensional F-valued linear multicast for a sufficiently large base field F.

Alternative Proof of Corollary 2.21

Koetter-Medard 03 (matrix approach)
For any node T such that maxflow(T) ≥ ω, there exist ω edge-disjoint paths connecting the ω imaginary channels in In(S) to T.
Let LT be the matrix of the global encoding kernels of the channels of these paths ending at T.
Claim: det(LT) is not the zero polynomial.

Proof: Set the local encoding kernels to emulate the routing of ω symbols from S to T.
With these settings of the local encoding kernels, LT evaluates to the ω×ω identity matrix and det(LT) evaluates to 1.
Therefore, det(LT) cannot be the zero polynomial.
Consider Π_{T : maxflow(T) ≥ ω} det(LT), which is not the zero polynomial.
For a sufficiently large base field F, it is possible to set the local encoding kernels so that this polynomial does not evaluate to 0.

The feasible local encoding kernels can in principle be found by exhaustive search over the base field F, but not in polynomial time.

Refinement of Algorithm
The original LYC algorithm for constructing a generic linear network code has been refined by Jaggi, Sanders et al. for the construction of a linear multicast in polynomial time.
See Algorithm 2.29.
The complexity is lower than that of the improved LYC algorithm for constructing a generic linear network code.

Static Linear Network Code
In the event of link failure, as long as maxflow(T) ≥ ω for every receiving node T in the resulting network, the information vector x can still be transmitted to all the receiving nodes.
But this involves a change of the network code throughout the whole network.
Koetter and Medard 03 propose the static code to alleviate this problem.
The idea is to use the bandwidth spared for rerouting to achieve seamless recovery in case of network failure.

In a static code, in the case of link failure, an unreceived symbol is treated as a 0, and coding is performed as usual.
The receiving nodes either have to know the link failure pattern or obtain the necessary decoding information from the packet header.
Koetter and Medard show the existence of a static linear multicast.
A static network code can be applied even when there is only one receiving node.

Construction of Static Generic Linear Network Code

The improved LYC algorithm can be further enhanced for constructing a static generic linear network code.
The algorithm can be made polynomial if only a certain subset of link failure patterns is considered.
See Algorithm 2.34.

Linear Network Coding: Cyclic Networks

Unit-Delay Cyclic Networks

We consider the following type of networks:
– There may be directed cycles. It may not be possible to order the nodes in an upstream-to-downstream manner.
– There is only one information source, generated at the node S.
– There is a unit delay at each node to avoid information looping.

Information Source
At each discrete time t ≥ 0, the information source generates a row vector
  xt = ( x1,t, x2,t, …, xω,t ),
where each symbol xi,t is in some finite field F.
For 1 ≤ i ≤ ω, the pipeline of symbols xi,0, xi,1, xi,2, … can be conveniently represented by a power series (z-transform)
  Xi(z) = xi,0 + xi,1 z + xi,2 z² + …
So the information source can be represented by
  x(z) = ( X1(z), X2(z), …, Xω(z) ).

Rational Power Series
Functions of the form p(z)/(1 + z q(z)), where p(z) and q(z) are polynomials, can be expanded into power series at z = 0 by long division.
Such functions are called rational power series.
Denote the integral domain of all rational power series by F⟨z⟩.
Denote the integral domain of all power series by F[[z]].
F⟨z⟩ is a subdomain of F[[z]].
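The long-division expansion of p(z)/(1 + z q(z)) can be sketched in a few lines; here coefficients are taken over GF(2) for concreteness (the coefficient-list encoding is an illustrative convention):

```python
# Expand p(z) / (1 + z*q(z)) into a power series by long division.
# The denominator has constant term 1, so it is invertible in F[[z]].
# Coefficient lists run lowest degree first; arithmetic is mod `field`.
def expand(p_coeffs, q_coeffs, n_terms, field=2):
    d = [1] + list(q_coeffs)             # denominator d(z) = 1 + z*q(z)
    out = []
    rem = list(p_coeffs) + [0] * n_terms
    for t in range(n_terms):
        c = rem[t] % field
        out.append(c)
        # subtract c * z**t * d(z) from the running remainder
        for i, dc in enumerate(d):
            if t + i < len(rem):
                rem[t + i] = (rem[t + i] - c * dc) % field
    return out

# 1 / (1 + z) over GF(2) = 1 + z + z^2 + ...
print(expand([1], [1], 6))
```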

The Global Description of a Convolutional Network Code

Definition 3.5. An ω-dimensional F-valued convolutional network code on a unit-delay network consists of an element kd,e(z) ∈ F⟨z⟩ for every adjacent pair of channels (d,e) as well as an ω-dimensional column vector fe(z) ∈ F⟨z⟩^ω for every channel e such that:
– fe(z) = z Σ_{d∈In(T)} kd,e(z) fd(z) for e ∈ Out(T);
– for the ω imaginary channels in In(S), the vectors fe(z) consist of scalar components and form the natural basis of the vector space F^ω.
The vector fe(z) is called the global encoding kernel for the channel e, and kd,e(z) is called the local encoding kernel for the adjacent pair (d,e). The local encoding kernel at the node T refers to the |In(T)|×|Out(T)| matrix
  KT(z) = [kd,e(z)]_{d∈In(T), e∈Out(T)}.

The equation
  fe(z) = z Σ_{d∈In(T)} kd,e(z) fd(z)
gives the relation between the global encoding kernels and the local encoding kernels.
In the time domain, it becomes the convolutional equation
  fe,t = Σ_{d∈In(T)} ( Σ_{0≤u<t} kd,e,u fd,t−1−u )
for all t > 0, allowing the global encoding kernels to be computed from the local encoding kernels.
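The time-domain recursion can be run directly. A minimal sketch for a toy unit-delay chain S → T with channels e1 and e2 (this example network and all names are illustrative, not the text's Figure 3.4):

```python
# Compute global encoding kernels f_{e,t} from local kernels via
#   f_{e,t} = sum_{d in In} sum_{0<=u<t} k_{d,e,u} * f_{d,t-1-u},  t > 0.
# Toy chain: imaginary channel d0 -> S -e1-> T -e2-> ; omega = 1, GF(2).

OMEGA, T_MAX, P = 1, 4, 2

# local kernels as coefficient lists [k_{d,e,0}, k_{d,e,1}, ...]
local = {('d0', 'e1'): [1], ('e1', 'e2'): [1]}
in_channels = {'e1': ['d0'], 'e2': ['e1']}   # inputs of the node emitting e

# the imaginary channel carries the constant kernel (1): f_{d0,0} = [1]
f = {'d0': [[1]] + [[0]] * (T_MAX - 1)}

for e in ['e1', 'e2']:                       # upstream-to-downstream order
    f[e] = [[0] * OMEGA for _ in range(T_MAX)]
    for t in range(1, T_MAX):
        for d in in_channels[e]:
            k = local[(d, e)]
            for u in range(min(t, len(k))):
                for i in range(OMEGA):
                    f[e][t][i] = (f[e][t][i] + k[u] * f[d][t-1-u][i]) % P

print(f['e1'])   # f_{e1}(z) = z   -> [[0], [1], [0], [0]]
print(f['e2'])   # f_{e2}(z) = z^2 -> [[0], [0], [1], [0]]
```

Each hop contributes one factor of z, so the kernel on the second channel is delayed by two time units, as the unit-delay model predicts.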

Input-Output Relation
Let ye(z) be the power series representing the pipeline of symbols transmitted on channel e, given by x(z) fe(z).
Then for e ∈ Out(T),
  ye(z) = x(z) fe(z)
        = x(z) [ z Σ_{d∈In(T)} kd,e(z) fd(z) ]
        = z Σ_{d∈In(T)} kd,e(z) [ x(z) fd(z) ]
        = z Σ_{d∈In(T)} kd,e(z) yd(z).
In other words, an output power series is the sum of the convolutions of the input power series with the corresponding local encoding kernels, with a unit delay.

Example 3.6
Figure 3.4 shows the relation between the local encoding kernels and the global encoding kernels of a unit-delay network.

The Local Description of a Convolutional Network Code
For any given local encoding kernels {kd,e}, the global encoding kernels {fe} can be calculated recursively, so fe ∈ F[[z]] for all e.
By definition, the vectors {fe} need to be in F⟨z⟩ in order to qualify as global encoding kernels. This is necessary for the purpose of decoding.
The next theorem shows that this is indeed the case.

Theorem 3.9

Let kd,e(z) ∈ F⟨z⟩ be given for every adjacent pair of channels (d,e) on a unit-delay network. Then there exists a unique ω-dimensional F-valued convolutional network code with kd,e(z) as the local encoding kernel for every (d,e).

Proof of Theorem 3.9
Let [fe(z)] be the ω×n matrix formed by putting all the global encoding kernels in juxtaposition (excluding the ω imaginary channels).
Let HS(z) be the ω×n matrix formed by appending n−|Out(S)| columns of zeros to the local encoding kernel KS(z).
Then
  [fe(z)] = z [fe(z)] [kd,e(z)] + z HS(z),
so that
  [fe(z)] = z det(In − z [kd,e(z)])^(−1) HS(z) A(z),
where A(z) is the adjoint matrix of In − z [kd,e(z)].
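The closed form can be checked by hand on a toy two-channel chain (ω = 1, n = 2; all names illustrative). Taking the example acyclic, [kd,e] is nilpotent, so the inverse expands as a finite geometric series and [fe(z)] = z HS + z² HS K + z³ HS K² + …:

```python
# Verify [f_e(z)] = z * H_S * (I - z*K)^{-1} on a toy chain S -e1-> T -e2->.
# K is nilpotent here, so the geometric-series expansion terminates and
# the coefficient of z**t is H_S @ K**(t-1).

K = [[0, 1],   # k_{e1,e2} = 1 (constant local kernels); rows/cols = (e1, e2)
     [0, 0]]
H = [1, 0]     # K_S with n - |Out(S)| zero columns appended

def vec_mat(v, M):
    return [sum(v[i] * M[i][j] for i in range(len(v))) for j in range(len(M))]

# coeffs[t][j] is the coefficient of z**t in f_{e_j}(z)
coeffs, term, t = {}, H[:], 1
while any(term) and t < 10:
    coeffs[t] = term[:]          # z**t * (H @ K**(t-1))
    term = vec_mat(term, K)
    t += 1

print(coeffs)   # {1: [1, 0], 2: [0, 1]}: f_e1(z) = z, f_e2(z) = z**2
```

This agrees with the unit-delay intuition: the kernel picks up one factor of z per hop.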

Therefore, the global encoding kernels can be expressed in closed form in terms of the local encoding kernels (in the frequency domain).
Moreover, it is seen that the fe(z) are in F⟨z⟩, i.e., they are rational.

Convolutional Multicast
Definition 3.10. An ω-dimensional F-valued convolutional network code qualifies as an ω-dimensional convolutional multicast if, for every non-source node T with maxflow(T) ≥ ω, there exists an |In(T)|×ω matrix DT(z) over F⟨z⟩ and a positive integer t such that
  [fe(z)]_{e∈In(T)} · DT(z) = z^t Iω,
where t depends on the node T and Iω is the ω×ω identity matrix. The matrix DT(z) is called the decoding kernel at the node T.
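A small sketch of what a decoding kernel looks like, for a hypothetical node T with ω = 2 and In(T) = {e1, e2} (the global kernels below are illustrative, not taken from the text):

```python
# Illustrative global kernels at node T, as the 2x2 polynomial matrix
#   F(z) = [[z, z**2], [z, 0]],
# and a candidate decoding kernel
#   D(z) = [[0, z], [1, -1]].
# One checks F(z) * D(z) = z**2 * I_2, so T decodes with delay t = 2.

def F(z):
    return [[z, z**2], [z, 0]]

def D(z):
    return [[0, z], [1, -1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# the polynomial identity is checked here by evaluation at sample points
for z in [1, 2, 3, 5, 7]:
    assert matmul(F(z), D(z)) == [[z**2, 0], [0, z**2]]
print("F(z) * D(z) = z^2 * I_2 at all sample points")
```

Since both sides are polynomial matrices of bounded degree, agreement at enough sample points confirms the identity.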

Decoding of a Convolutional Multicast

Theorem 3.14. Given a unit-delay network, a finite field F and a positive integer ω, there exists an ω-dimensional F-valued convolutional multicast. Furthermore, if E is a sufficiently large subset of F⟨z⟩, then the local encoding kernels of the convolutional multicast can be chosen to take values from E.

Koetter and Medard 03 have shown that the local encoding kernels can be chosen in a sufficiently large finite field F.
Theorem 3.14 says that the local encoding kernels can, for example, also be chosen in the set of all binary polynomials up to a sufficiently large degree.

Network Coding for Multiple Sources

Formulation
Information can be generated at possibly different nodes in the network and at possibly different rates.
Each information source is to be transmitted to a specified subset of nodes.
Given a network, can such a set of communication requirements be accommodated?

Non-Decomposability

It has been shown (Y 95) that it is not always possible to decompose a multi-source problem into single-source problems.
The problem is very difficult in general.
Decomposability has been shown for special classes of networks (Roche, Hau, Y, Zhang 97-99).

Characterization of the Achievable Information Rate Region

Involves information theory heavily.
Specifically, it involves the framework of information inequalities developed by Y 97.
General acyclic networks have been studied (Y and Zhang 99; Song, Y and Cai 04).
The problem is still open.

Concluding Remarks

Network coding theory reveals that information cannot be regarded as a commodity.
Besides information and coding theory, network coding theory is quickly propagating into networking, wireless communications, cryptography, computer science (complexity and algorithms), operations research (graph theory and optimization), and matrix theory.
Many issues, theoretical and practical, are yet to be resolved.
There are many ramifications, too.

Resources
Network Coding Homepage: http://www.networkcoding.info

R. W. Yeung, S.-Y. R. Li, N. Cai and Z. Zhang, "Theory of Network Coding," preprint, to appear in Foundations and Trends in Communications and Information Theory.
R. W. Yeung, A First Course in Information Theory, Kluwer Academic/Plenum Publishers, 2002, http://www.ie.cuhk.edu.hk/IT_book/
Tutorial by R. Koetter, M. Medard, and P. A. Chou (Globecom 04, ISIT 05): http://www.comm.csl.uiuc.edu/~koetter/NWC/Course/NWC_complete.pdf