
Map Reduce

Last Time …

Parallel Algorithms

Work/Depth Model

Spark

Map Reduce

Assignment 1

Questions ?

Today …

Map Reduce

Matrix multiplication

Similarity Join

Complexity theory

MapReduce – word counting

Input: a set of documents

Map:

Reads a document and breaks it into a sequence of words w_1, w_2, …, w_n

Generates (k, v) pairs: (w_1, 1), (w_2, 1), …, (w_n, 1)

System:

Groups all (k, v) pairs by key

Given r reduce tasks, assigns keys to reduce tasks using a hash function

Reduce:

Combines the values associated with a given key

Adds up all the values associated with a word to produce the total count for that word
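A minimal Python sketch of this pipeline (not tied to any particular framework): the map and reduce functions mirror the steps above, and the system's group-by-key step is simulated with a dictionary.

```python
from collections import defaultdict

def map_doc(doc):
    # Map: break a document into words and emit (w, 1) for each word
    for word in doc.split():
        yield (word, 1)

def reduce_counts(word, counts):
    # Reduce: add up all the values associated with one word
    return (word, sum(counts))

def word_count(docs):
    groups = defaultdict(list)          # System: group all (k, v) pairs by key
    for doc in docs:
        for k, v in map_doc(doc):
            groups[k].append(v)
    return [reduce_counts(k, vs) for k, vs in groups.items()]

print(word_count(["the cat sat", "the dog sat on the cat"]))
# [('the', 3), ('cat', 2), ('sat', 2), ('dog', 1), ('on', 1)]
```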

Matrix-vector multiplication

๐‘› ร— ๐‘› matrix ๐‘€ with entries ๐‘š๐‘–๐‘—

Vector ๐’— of length ๐‘› with values ๐‘ฃ๐‘—

We wish to compute

๐‘ฅ๐‘– =

๐‘—=1

๐‘›

๐‘š๐‘–๐‘—๐‘ฃ๐‘—

If ๐’— can fit in memory

Map: generate (๐‘–,๐‘š๐‘–๐‘—๐‘ฃ๐‘—)

Reduce: sum all values of ๐‘– to produce (๐‘–, ๐‘ฅ๐‘–)
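A small Python sketch of the in-memory case, with M supplied as (i, j, m_ij) triples and v as a plain list; the group-by-key step is again simulated locally.

```python
from collections import defaultdict

def mat_vec(M_entries, v):
    groups = defaultdict(list)
    for i, j, m_ij in M_entries:        # Map: emit (i, m_ij * v_j)
        groups[i].append(m_ij * v[j])
    # Reduce: x_i is the sum of all values associated with key i
    return {i: sum(vals) for i, vals in groups.items()}

M_entries = [(0, 0, 1.0), (0, 1, 2.0), (1, 0, 3.0), (1, 1, 4.0)]
print(mat_vec(M_entries, [10.0, 1.0]))  # {0: 12.0, 1: 34.0}
```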

If ๐’— is too large to fit in memory? Stripes? Blocks?

What if we need to do this iteratively?

Matrix-Matrix Multiplication

๐‘ƒ = ๐‘€๐‘ ๐‘๐‘–๐‘˜ = ๐‘—๐‘š๐‘–๐‘—๐‘›๐‘—๐‘˜

2 mapreduce operations

Map 1: produce ๐‘˜, ๐‘ฃ , ๐‘—, ๐‘€, ๐‘–, ๐‘š๐‘–๐‘— and ๐‘—, ๐‘, ๐‘˜, ๐‘›๐‘—๐‘˜

Reduce 1: for each ๐‘— ๐‘–, ๐‘˜, ๐‘š๐‘–๐‘— ร— ๐‘›๐‘—๐‘˜

Map 2: identity

Reduce 2: sum all values associated with key ๐‘–, ๐‘˜
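A toy Python sketch of the two passes, with both matrices given as sparse (row, column, value) triples and each shuffle simulated by a dictionary; it is only meant to mirror the four steps above.

```python
from collections import defaultdict

def two_pass_multiply(M, N):
    # M: triples (i, j, m_ij); N: triples (j, k, n_jk)
    by_j = defaultdict(lambda: ([], []))
    for i, j, m_ij in M:                # Map 1: key every element by the shared index j
        by_j[j][0].append((i, m_ij))
    for j, k, n_jk in N:
        by_j[j][1].append((k, n_jk))
    partials = []
    for j, (ms, ns) in by_j.items():    # Reduce 1: emit ((i, k), m_ij * n_jk) for every pair
        for i, m_ij in ms:
            for k, n_jk in ns:
                partials.append(((i, k), m_ij * n_jk))
    P = defaultdict(float)
    for key, val in partials:           # Map 2 is the identity; Reduce 2 sums per key (i, k)
        P[key] += val
    return dict(P)

M = [(0, 0, 1.0), (0, 1, 2.0), (1, 0, 3.0), (1, 1, 4.0)]
N = [(0, 0, 5.0), (0, 1, 6.0), (1, 0, 7.0), (1, 1, 8.0)]
print(two_pass_multiply(M, N))
# {(0, 0): 19.0, (0, 1): 22.0, (1, 0): 43.0, (1, 1): 50.0}
```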

Matrix-Matrix multiplication

In one MapReduce step

Map:

Generate (k, v) pairs ((i, k), (M, j, m_ij)) for all k, and ((i, k), (N, j, n_jk)) for all i

Reduce:

Each key (i, k) will have values (M, j, m_ij) and (N, j, n_jk) for all j

Sort all values by j

Extract m_ij and n_jk for each j, multiply, and accumulate the sum

Complexity Theory for MapReduce

Communication cost

Communication cost of a task is the size of the input to the task

We do not consider the amount of time it takes each task to execute when estimating the running time of an algorithm

The algorithm output is rarely large compared with the input or the intermediate data produced by the algorithm

Reducer size & Replication rate

Reducer size ๐‘ž

Upper bound on the number of values that are allowed to appear in the list associated with a single key

By making the reducer size small, we can force there to be many reducers

High parallelism low wall-clock time

By choosing a small ๐‘ž we can perform the computation associated with a single reducer entirely in the main memory of the compute node

Low synchronization (Comm/IO) low wall clock time

Replication rate (๐‘Ÿ)

number of (๐‘˜, ๐‘ฃ) pairs produced by all the Map tasks on all the inputs, divided by the number of inputs

๐‘Ÿ is the average communication from Map tasks to Reduce tasks
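Restated as a formula (equivalent to the definition above): r = (total number of (k, v) pairs produced by all Map tasks) / (number of inputs); equivalently, the total Map → Reduce communication is r × (number of inputs).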

Example: one-pass matrix mult

Assume matrices are n × n

r – replication rate

Each element m_ij produces n keys

Similarly, each n_jk produces n keys

Each input produces exactly n keys → load balance, and r = n

q – reducer size

Each key has n values from M and n values from N

q = 2n

Example: two-pass matrix mult

Assume matrices are n × n

r – replication rate

Each element m_ij produces 1 key (1st pass)

Similarly, each n_jk produces 1 key

Each input produces exactly 1 key (2nd pass), so r = 1 in both passes

q – reducer size

Each key has n values from M and n values from N (1st pass)

q = 2n (1st pass), n (2nd pass)

Real world example: Similarity Joins

Given a large set of elements X and a similarity measure s(x, y)

Output: pairs whose similarity exceeds a given threshold t

Example: given a database of 10⁶ images of size 1 MB each, find pairs of images that are similar

Input: (i, P_i), where i is an ID for the picture and P_i is the image

Output: (P_i, P_j), or simply (i, j), for those pairs where s(P_i, P_j) > t

Approach 1

Map: for each input (i, P_i), generate (k, v) pairs

({i, j}, P_i) for every other picture ID j, so the reducer for key {i, j} receives both P_i and P_j

Reduce:

Apply the similarity function to the pair of images received for key {i, j}

Output the pair if similarity is above threshold t
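A toy Python sketch of Approach 1; similarity() is a hypothetical stand-in for the real measure s, and the small image set and shuffle dictionary exist only to make the sketch self-contained.

```python
from collections import defaultdict

THRESHOLD = 0.9

def similarity(p, q):
    # Hypothetical stand-in for s(x, y); a real measure would compare image content
    return 1.0 if p == q else 0.0

def approach1(images):
    # images: dict mapping picture ID i -> picture P_i
    groups = defaultdict(list)
    for i, pic in images.items():       # Map: send P_i to the key {i, j} for every other ID j
        for j in images:
            if j != i:
                groups[frozenset((i, j))].append((i, pic))
    out = []
    for key, pair in groups.items():    # Reduce: each key holds exactly two pictures (q = 2)
        (i, p_i), (j, p_j) = pair
        if similarity(p_i, p_j) > THRESHOLD:
            out.append((i, j))
    return out

images = {0: "imgA", 1: "imgB", 2: "imgA", 3: "imgB"}   # 4 toy "images"
print(approach1(images))                # [(0, 2), (1, 3)]
```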

Reducer size โ€“ ๐‘ž 2 (2MB)

Replication rate โ€“ ๐‘Ÿ 106 โˆ’ 1

Total communication from mapreduce tasks?

106 ร— 106 ร— 106 bytes 1018 bytes 1 Exabyte (kB MB GB TB PB EB)

Communicate over GigE 1010 sec 300 years

Approach 2: group images

Group images into g groups with 10⁶/g images each

Map: take input element (i, P_i) and generate

g − 1 keys {u, v}, where P_i belongs to group u and v ∈ {1, …, g} ∖ {u}

The associated value is (i, P_i)

Reduce: consider key {u, v}

The associated list will have 2 × 10⁶/g elements (j, P_j)

Take each (i, P_i) and (j, P_j) where i and j belong to different groups and compute s(P_i, P_j)

Comparing pictures belonging to the same group needs a heuristic for which reducer does it, say the reducer for key {u, u + 1}
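A sketch of the Approach-2 Map function in Python; group() is a hypothetical assignment of picture IDs to the g groups (here, hashing by ID), and only the Map side is shown since the Reduce side is the pairwise comparison described above.

```python
G = 10   # number of groups g (would be ~1000 in the 10^6-image example)

def group(i):
    # Hypothetical group assignment: hash the picture ID into one of the g groups
    return i % G

def map_approach2(i, pic):
    # Map: emit g - 1 keys {u, v} - the picture's own group u paired with every other group v
    u = group(i)
    for v in range(G):
        if v != u:
            yield (frozenset((u, v)), (i, pic))

pairs = list(map_approach2(7, "raw-bytes"))
print(len(pairs))                       # 9 key-value pairs, i.e. g - 1 copies of the picture
```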

Approach 2: group images

Replication rate: ๐‘Ÿ = ๐‘” โˆ’ 1

Reducer size: ๐‘ž = 2 ร— 106/๐‘”

Input size: 2 ร— 1012/๐‘” bytes

Say ๐‘” = 1000,

Input is 2GB

Total communication: 106 ร— 999 ร— 106 = 1015 bytes 1 petabyte

Graph model for MapReduce problems

A set of inputs

A set of outputs

A many-many relationship between the inputs and outputs, which describes which inputs are necessary to produce which outputs

Mapping schema

Given a reducer size q

No reducer is assigned more than q inputs

For every output, there is at least one reducer that is assigned all inputs related to that output

[Figure: inputs P1, P2, P3, P4 connected to the outputs {P1, P2}, {P1, P3}, {P1, P4}, {P2, P3}, {P2, P4}, {P3, P4}]

Grouping for Similarity Joins

Generalize the problem to N images

g equal-sized groups of N/g images each

Number of outputs is C(N, 2) ≈ N²/2

Each reducer receives 2N/g inputs, so q = 2N/g

Replication rate r = g − 1

r ≈ 2N/q

The smaller the reducer size, the larger the replication rate, and therefore the higher the communication

communication ↔ reducer size

communication ↔ parallelism
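Substituting q = 2N/g into r = g − 1 makes the trade-off explicit: g = 2N/q, so r = 2N/q − 1 ≈ 2N/q.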

Lower bounds on Replication rate

1. Prove an upper bound on how many outputs a reducer with q inputs can cover. Call this bound g(q)

2. Determine the total number of outputs produced by the problem

3. Suppose that there are k reducers, and the i-th reducer has q_i ≤ q inputs. Observe that Σ_{i=1..k} g(q_i) must be no less than the number of outputs computed in step 2

4. Manipulate the inequality in 3 to get a lower bound on Σ_{i=1..k} q_i

5. The sum in 4 is the total communication from Map tasks to Reduce tasks. Divide it by the number of inputs to get the replication rate

Applying the recipe to the similarity join of N images (one output per pair of images):

๐‘ž2โ‰ˆ๐‘ž2

2

๐‘2โ‰ˆ๐‘2

2

๐‘–=1

๐‘˜๐‘ž๐‘–2

2โ‰ฅ๐‘2

2

๐‘ž

๐‘–=1

๐‘˜

๐‘ž๐‘– โ‰ฅ ๐‘2

๐‘–=1

๐‘˜

๐‘ž๐‘– โ‰ฅ๐‘2

๐‘ž๐‘Ÿ โ‰ฅ๐‘

๐‘ž
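The grouping algorithm above achieves r ≈ 2N/q, so it is within a factor of two of this lower bound.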

Matrix Multiplication

Consider the one-pass algorithm; the version above (one reducer per output) is an extreme case

Let's group rows/columns into bands: g groups of n/g rows/columns each

[Figure: M × N = P, all matrices n × n]

Matrix Multiplication

Map:

For each element of M, N generate g (k, v) pairs

Key: the element's own band paired with each of the g bands of the other matrix

Value: (i, j, m_ij) or (i, j, n_ij)

Reduce:

Reducer corresponds to a key (i, j): the i-th band of M paired with the j-th band of N

It holds all the elements in the i-th band of M and the j-th band of N

Each reducer gets n²/g elements from each of the 2 matrices

q = 2n²/g,  r = g,  so  r = 2n²/q
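For this banded one-pass algorithm the total Map → Reduce communication is therefore r × 2n² = 4n⁴/q key-value pairs.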

Lower bounds on Replication rate

Applying the same recipe to one-pass matrix multiplication (2n² inputs, n² outputs):

Each reducer receives s rows from M and s columns from N (the maximizing case), so q = 2ns, and it produces s² outputs

g(q) = q²/(4n²)

Total outputs: n²

Σ_{i=1..k} q_i²/(4n²) ≥ n²

Σ_{i=1..k} q_i² ≥ 4n⁴

q Σ_{i=1..k} q_i ≥ 4n⁴   (using q_i ≤ q),  so  Σ_{i=1..k} q_i ≥ 4n⁴/q

r = (1/(2n²)) Σ_{i=1..k} q_i ≥ 2n²/q
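The banded one-pass algorithm above achieves exactly r = 2n²/q, so it matches this lower bound.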

Matrix Multiplication – let us revisit the two-pass approach

Matrix-Matrix Multiplication

๐‘ƒ = ๐‘€๐‘ ๐‘๐‘–๐‘˜ = ๐‘—๐‘š๐‘–๐‘—๐‘›๐‘—๐‘˜

2 mapreduce operations

Map 1: produce ๐‘˜, ๐‘ฃ , ๐‘—, ๐‘€, ๐‘–, ๐‘š๐‘–๐‘— and ๐‘—, ๐‘, ๐‘˜, ๐‘›๐‘—๐‘˜

Reduce 1: for each ๐‘— ๐‘–, ๐‘˜, ๐‘š๐‘–๐‘— ร— ๐‘›๐‘—๐‘˜

Map 2: identity

Reduce 2: sum all values associated with key ๐‘–, ๐‘˜

Grouped two-pass approach

[Figure: P = MN with each matrix divided into g² square blocks]

Divide each matrix into g² square blocks of n²/g² elements each

First pass: compute the product of block (I, J) of M with block (J, K) of N

Second pass: for every (I, K), sum over all J

Grouped two-pass approach

Replication rate for Map 1 is g ⇒ 2gn² total communication

Each reducer gets 2n²/g² elements: q = 2n²/g², i.e. g = n√(2/q)

Total communication for the first pass: 2gn² = 2√2 · n³/√q

Assume Map 2 runs on the same nodes as Reduce 1 ⇒ no extra communication there

Communication into Reduce 2: gn² = √2 · n³/√q

Total communication: 3√2 · n³/√q

Comparison

๐‘›4

๐‘ž

๐‘›3

๐‘ž๐‘ž < ๐‘›2<

If ๐‘ž is closer to the minimum of 2๐‘›, two pass is better by a factor of ๐’ช ๐‘›
