EECS 150 - Components and Design
Techniques for Digital Systems
Lec 13 – Project Overview
David Culler
Electrical Engineering and Computer Sciences
University of California, Berkeley
http://www.eecs.berkeley.edu/~culler
http://www-inst.eecs.berkeley.edu/~cs150
10/12/2004 EECS 150, Fa04, Lec 13-Project 2
Traversing Digital Design

[Diagram: EE 40 and CS 61C feed into EECS 150 weeks 1-6, then EECS 150 weeks 6-15 (the project) — "You Are Here"]
Caveats
• Today’s lecture provides an overview of the project.
• Lab will cover it in MUCH more detail.
• Where there are differences, the lab information is correct!
• Names of many components are different from what is used in lab, so you won’t be confused…
Basic Pong Game
[Diagram: player-0 and player-1 inputs drive the game; the output is composite video]
• Court = set of obstacles
  – Fixed position
• Paddle = moving obstacle
  – Position & vertical velocity
  – Function of joystick
  – P' = f ( P, j )
• Ball
  – 2D position & velocity
  – [ spin, acc ]
  – Bounces off walls and paddles
  – B' = f ( B, j, C )
• Score
  – Ball hitting sides
• Effects
  – Display, audio, …
Calinx Board

• Flash card & Microdrive port
• Video encoder & decoder
• AC '97 codec & power amp
• Video & audio ports
• Four 100 Mb Ethernet ports
• 8 Meg x 32 SDRAM
• Quad Ethernet transceiver
• Xilinx Virtex 2000E
• Seven-segment LED displays
• Prototype area
Add-on card
Project Design Problem
• Map this application (the Pong game) to this technology (the Calinx board)
Input-Output Constraints
[Diagram: two N64 controllers (player-0 and player-1 inputs) connect through an N64 controller interface to the FPGA; the FPGA drives the ADV7194 over an 8-bit ITU 601/656 bus to produce composite video]
• Ball moves within a court
• Players control movement of the paddles with joysticks
• Observe game as rendered on video display
• Bounces off walls and paddles till point is scored
• I/O devices provide design constraints
• Other I/O on the board: switches, LEDs, LCD
Input/Output Support
[Diagram: Joystick Interface and Render Engine blocks inside the FPGA; Game Physics feeds the Render Engine, whose video stream goes through the ADV7194 (ITU 601/656, 8-bit) to composite video; player-0 and player-1 inputs arrive via the N64 controller interface]
• Digitize and abstract messy analog input
• Rendering pipeline to translate display objects into byte stream
• Off-chip device to translate digital byte stream into composite video
“Physics” of the Game
[Same block diagram as the previous slide: Joystick Interface → Game Physics → Render Engine → ADV7194 → composite video]
• Court = set of obstacles
  – Fixed position
• Paddle = moving obstacle
  – Position & vertical velocity
  – Function of joystick
  – P' = f ( P, j )
• Ball
  – 2D position & velocity
  – [ spin, acc ]
  – Bounces off walls and paddles
  – B' = f ( B, j, C )
• Score
  – Ball hitting sides
• Effects
  – Display, audio, …
Representing State

• State of the game
  – Court obstacles
  – Paddles
  – Ball
  – Score
• Additional data
  – Display blocks
    » Paddle & ball image
  – Numerals
  – Frame buffer
• SDRAM holds frame buffer
  – Rendered to frame buffer
  – Spooled to video encoder
• SDRAM has sophisticated interface
  – Grok the data sheet, design a bus controller
• FPGA block RAM holds board state
  – Also registers, counters, …
  – Timing sequence, controller state
  – FIFOs, packet buffers
[Diagram: full datapath — Joystick Interface, Game Physics (board state), Render Engine, and SDRAM Control inside the FPGA; external SDRAM (32-bit data plus control) holds the frame; the ADV7194 outputs composite video]
N64 Interface (cp 1)

• Continually poll N64 and report state of buttons and analog joystick
  – Issue 8-bit command
  – Receive 32-bit response
• Each button response is a 32-bit value containing button state and 8-bit signed horizontal and vertical velocities
• Serial interface protocol
  – Multiple cycles to perform each transaction
• Bits obtained serially
  – Framing (packet start/stop)
  – Bit encoding
    » start | data | data | stop
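As a software sketch, unpacking the 32-bit response might look like the following. The exact field layout here is an assumption (button bits in the upper half, signed X/Y stick bytes in the low half) — check the lab documentation for the real N64 bit assignment.

```python
# Sketch of decoding the N64 controller's 32-bit status response.
# Field positions are illustrative, not the exact N64 layout.

def decode_n64_response(word: int):
    """Split a 32-bit poll response into buttons and signed stick axes."""
    buttons = (word >> 16) & 0xFFFF        # one bit per button (assumed layout)
    x_raw = (word >> 8) & 0xFF             # horizontal stick byte
    y_raw = word & 0xFF                    # vertical stick byte

    def to_signed8(b: int) -> int:
        # interpret an 8-bit value as two's-complement
        return b - 256 if b >= 128 else b

    return buttons, to_signed8(x_raw), to_signed8(y_raw)
```

The two's-complement step matters: the joystick reports signed velocities, so 0x80 must come out as -128, not 128.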
[Diagram: Joystick Interface block inside the FPGA — inputs: 27 MHz clock, reset; outputs: start, pause, 8-bit velocity; bidirectional DQ line to the N64 controller interface]
Video Encoder (cp 2)

• Rendering engine processes display objects into the frame buffer
  – Renders rectangles, image blocks, …
• Drive the ADV7194 video encoder device so that it outputs correct NTSC video
• Gain experience reading data sheets
• Dictates the 27 MHz operation rate
  – Used throughout the graphics subsystem
[Same system diagram, highlighting the path from the Render Engine through the 8-bit ITU 601/656 bus to the ADV7194 and composite video]
Announcements
• Midterm will be returned in section
• Solutions available on-line
• Reading: – Video In a Nutshell (by Tom Oberheim) on class web page
– Lab project documents (as assigned)
Digital Video Basics – a little detour

• Pixel array:
  – A digital image is represented by a matrix of values, where each value is a function of the information surrounding the corresponding point in the image. A single element in an image matrix is a picture element, or pixel. A pixel includes info for all color components.
  – The array size varies for different applications and costs. Some common sizes are shown below.
• Frames:
  – The illusion of motion is created by successively flashing still pictures called frames.
Common array sizes:
  1920 x 1080   High-Definition Television (HDTV), 2 Mpx
  1280 x 720    High-Definition Television (HDTV), 1 Mpx
  1152 x 900    Workstation, 1 Mpx
  800 x 600     PC/Mac, 1/2 Mpx
  640 x 480     Video, 300 Kpx
  352 x 240     SIF, 82 Kpx
Refresh Rates & Scanning
• The human perceptual system can be fooled into seeing continuous motion by flashing frames at a rate of around 20 frames/sec or higher.
– Much lower and the movement looks jerky and flickers. TV in the US uses 30 frames/second (originally derived from the 60Hz line current frequency).
• Images are generated on the screen of the display device by “drawing” or scanning each line of the image one after another, usually from top to bottom.
• Early display devices (CRTs) required time to get from the end of a scan line to the beginning of the next. Therefore each line of video consists of an active video portion and a horizontal blanking interval portion.
• A vertical blanking interval corresponds to the time to return from the bottom to the top.
– In addition to the active (visible) lines of video, each frame includes a number of non-visible lines in the vertical blanking interval.
– The vertical blanking interval is used these days to send additional information such as closed captions and stock reports.
Interlaced Scanning
• Early inventors of TV discovered that they could reduce the flicker effect by increasing the flash rate without increasing the frame rate.
• Interlaced scanning forms a complete picture, the frame, from two fields, each comprising half the scan lines. The second field is delayed half the frame time from the first.
• Non-interlaced displays are called progressive scan.
• The first field, the odd field, displays the odd scan lines; the second, the even field, displays the even scan lines.
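The field split above is easy to state precisely; a minimal sketch (helper name is mine):

```python
# Interlaced scanning: a frame is split into an odd field and an even
# field, which are flashed one after the other.

def split_fields(frame):
    """Split a frame (list of scan lines) into odd and even fields.

    Lines are numbered from 1, as in TV practice, so frame[0] is
    line 1 and belongs to the odd field.
    """
    odd_field = frame[0::2]    # lines 1, 3, 5, ...
    even_field = frame[1::2]   # lines 2, 4, 6, ...
    return odd_field, even_field
```

With 30 frames/sec, displaying the two fields in sequence yields 60 fields/sec — double the flash rate with no new picture data.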
Pixel Components

• A natural way to represent the information at each pixel is with the brightness of each of the primary color components: red, green, and blue (RGB).
  – In the digital domain we could transmit one number for each of red, green, and blue intensity.
• Engineers had to deal with this issue when transitioning from black-and-white TV to color. The signal for black-and-white TV contains the overall pixel brightness (a combination of all color components).
  – Rather than adding three new signals for color TV, they decided to encode the color information in two extra signals, used in conjunction with the B/W signal by color receivers and ignored by the older B/W sets.
• The color signals (components) are color differences, defined as Y-B and Y-R, where Y is the brightness signal (component).
• In the digital domain the three components are called:
  – Y: luma, overall brightness
  – CB: chroma, Y-B
  – CR: chroma, Y-R
• Note that it is possible to reconstruct the RGB representation if needed.
• One reason this representation survives today is that the human visual perceptual system is less sensitive to spatial information in chrominance than in luminance. Therefore the chroma components are usually subsampled with respect to the luma component.
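To make the idea concrete, here is a sketch of the luma/color-difference transform and its inverse. The luma weights are the standard ITU-R BT.601 ones; the difference signals follow the slide's simple Y-B / Y-R definition (real encoders use offset, scaled versions of B-Y and R-Y), and the function names are mine.

```python
# Luma plus two color differences, per the slide's definition.

def rgb_to_ycc(r: float, g: float, b: float):
    """Convert normalized RGB (0..1) to luma plus two color differences."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # overall brightness (BT.601 weights)
    cb = y - b                               # "Y - B" chroma component
    cr = y - r                               # "Y - R" chroma component
    return y, cb, cr

def ycc_to_rgb(y: float, cb: float, cr: float):
    """Invert the transform -- RGB is fully recoverable."""
    b = y - cb
    r = y - cr
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b
```

The round trip is exact (up to floating-point error), which is the point of the bullet above: no information is lost by switching representations.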
Chroma Subsampling
[Figure: sample grids for a 2x2 block of pixels under each scheme]

• RGB 4:4:4 — every pixel carries R, G, and B samples.
• Y CR CB 4:4:4 — every pixel carries Y, CB, and CR samples.
• 4:2:2 (ITU-601) — every pixel carries Y; each horizontal pair of pixels shares one CB and one CR.
• 4:2:0 (MPEG-1, MPEG-2) — every pixel carries Y; each 2x2 block of pixels shares one CB and one CR (the two variants differ in where the chroma samples are sited).
• Variations include subsampling horizontally only, or both vertically and horizontally.
• Chroma samples are either coincident with alternate luma samples or sited halfway between alternate luma samples.
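A sketch of the 4:2:2 case: keep every luma sample, halve the horizontal chroma resolution. Simple pair averaging stands in for the proper filtering a real system would use; the function names are mine.

```python
# 4:2:2 chroma subsampling sketch: full-resolution Y, half-width CB/CR.

def subsample_422(y, cb, cr):
    """y, cb, cr are equal-size 2-D lists (rows of samples).

    Returns y unchanged and cb/cr with half the horizontal resolution.
    """
    def halve_rows(plane):
        out = []
        for row in plane:
            # average each pair of adjacent samples into one
            out.append([(row[i] + row[i + 1]) / 2
                        for i in range(0, len(row) - 1, 2)])
        return out
    return y, halve_rows(cb), halve_rows(cr)
```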
Common Interchange Format (CIF)

• Developed for low- to medium-quality applications: teleconferencing, etc.
• Variations:
  – QCIF, 4CIF, 16CIF
• Examples of component streaming:
  line i:   Y CR Y Y CR Y Y …
  line i+1: Y CB Y Y CB Y Y …
  Alternate (different packet types):
  line i:   Y CR Y CB Y CR Y CB Y …
  line i+1: Y Y Y Y Y …
• Bits/pixel:
  – 6 components / 4 pixels
  – 48/4 = 12 bits/pixel
• Example: commonly used as the output of MPEG-1 decoders.
Frame size            352 x 288
Frame rate            30 /sec
Scan                  progressive
Chroma subsampling    4:2:0, 2:1 in both X & Y
Chroma alignment      interstitial
Bits per component    8
Effective bits/pixel  12
ITU-R BT.601 Format

• Formerly CCIR-601. Designed for digitizing broadcast NTSC (National Television System Committee) signals.
• Variations:
  – 4:2:0
  – PAL (European) version
• Component streaming:
  line i:   Y CB Y CR Y CB Y CR Y …
  line i+1: Y CB Y CR Y CB Y CR Y …
• Bits/pixel:
  – 4 components / 2 pixels
  – 40/2 = 20 bits/pixel
• The Calinx board video encoder supports this format.
Frame size            720 x 487
Frame rate            29.97 /sec
Scan                  interlaced
Chroma subsampling    4:2:2, 2:1 in X only
Chroma alignment      coincident
Bits per component    10
Effective bits/pixel  20
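The bits/pixel figures on the CIF and ITU-601 slides follow from counting components per pixel group; a quick sketch of that arithmetic (function name is mine):

```python
# Bits-per-pixel: components carried per group of pixels, times bits per
# component, divided by the pixels in the group.

def bits_per_pixel(components_per_group, pixels_per_group, bits_per_component):
    return components_per_group * bits_per_component / pixels_per_group

# CIF, 4:2:0 at 8 bits: 4 Y + 1 CB + 1 CR per 4 pixels -> 48/4
cif = bits_per_pixel(6, 4, 8)

# ITU-601, 4:2:2 at 10 bits: 2 Y + 1 CB + 1 CR per 2 pixels -> 40/2
itu601 = bits_per_pixel(4, 2, 10)
```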
Calinx Video Encoder

• Analog Devices ADV7194
• Supports:
  – Multiple input formats and outputs
  – Operational modes, slave/master
  – VidFX project will use the default mode: ITU-601 as slave, s-video output
• Digital input side connected to Virtex pins.
• Analog output side wired to on-board connectors or headers.
• I2C interface for initialization:
  – Wired to Virtex.
ITU-R BT.656 Details

• Interfacing details for ITU-601:
  Pixels per line        858
  Lines per frame        525
  Frames/sec             29.97
  Pixels/sec             13.5 M
  Viewable pixels/line   720
  Viewable lines/frame   487
• With 4:2:2 chroma subsampling, need to send 2 words/pixel (1 Y and 1 C).
• Words/sec = 27 M; therefore the encoder runs off a 27 MHz clock.
• Control information (horizontal and vertical sync) is multiplexed on the data lines.
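The 27 MHz figure can be checked from the table above — total (not just viewable) pixels per line, times lines per frame, times the exact NTSC frame rate:

```python
# Word rate for BT.656: total pixels/sec, times 2 words/pixel (4:2:2).
PIXELS_PER_LINE = 858
LINES_PER_FRAME = 525

# The exact NTSC frame rate is 30000/1001 (~29.97 frames/sec); keeping
# it as a fraction makes the arithmetic exact.
pixel_rate = PIXELS_PER_LINE * LINES_PER_FRAME * 30000 // 1001
word_rate = 2 * pixel_rate

# 858 * 525 = 450450 pixels/frame; 450450 * 30000/1001 = 13.5 M exactly,
# so the word rate is exactly 27 M and the encoder clock is 27 MHz.
```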
• Encoder data stream (ITU-R BT.656, Figure 1, "Composition of interface data stream"):

[Figure: one scan line of the interface data stream. During active video, luminance (Y) and chrominance (CB, CR) samples are interleaved as CB Y CR Y …; the last sample of the digital active line is followed by the EAV timing reference signal, then digital blanking data, then the SAV timing reference signal before the first sample of the next active line. Sample numbers in parentheses are for 625-line systems where they differ from 525-line systems (see Recommendation ITU-R BT.803).]
ITU-R BT.656 Details

• Control is provided through "End of Active Video" (EAV) and "Start of Active Video" (SAV) timing references.
• Each reference is a block of four words: FF, 00, 00, <code>
• The <code> word encodes the following bits:
  – F = field select (even or odd)
  – V = indicates vertical blanking
  – H = 1 if EAV, else 0 for SAV
• Horizontal blanking section consists of the repeating pattern 80 10 80 10 …
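Pulling the F, V, and H flags out of the code word is a small bit-slicing exercise. In the standard the byte is laid out as 1 F V H P3 P2 P1 P0, where the low four bits are error-protection bits (ignored in this sketch); the function name is mine.

```python
# Decode the fourth word of an FF 00 00 XY timing reference.

def decode_timing_ref(code: int):
    """Return (F, V, H) flags from a BT.656 timing reference code word."""
    assert code & 0x80, "bit 7 of the code word is always 1"
    f = (code >> 6) & 1    # field select
    v = (code >> 5) & 1    # 1 during vertical blanking
    h = (code >> 4) & 1    # 1 for EAV, 0 for SAV
    return f, v, h
```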
Calinx Video Decoder (not this term)

• Analog Devices ADV7185
• Takes an NTSC (or PAL) video signal on the analog side and outputs ITU-601/ITU-656 on the digital side.
  – Many modes and features not used by us.
  – VidFX project will use the default mode: no initialization needed.
• Generates a 27 MHz clock synchronized to the output data.
• Digital output side connected to Virtex pins.
• Analog input side wired to on-board connectors or headers. Camera connection through composite video.
SDRAM Interface (cp 3)

• Memory protocols
  – Bus arbitration
  – Address phase
  – Data phase
• DRAM is large, but has few address lines and is slow
  – Row & column address
  – Wait states
• Synchronous DRAM provides fast synchronous access to the current block
  – A little like a cache in the DRAM
  – Fast burst of data
• Arbitration for shared resource
[Same system diagram: SDRAM Control inside the FPGA arbitrates between the Render Engine (writing the frame) and the video stream path (reading it); the SDRAM connects via a 32-bit data bus plus control lines]
SDRAM READ burst timing (for later)
Rendering Engine

• Fed a series of display objects
  – Obstacles, paddles, ball
  – Each defined by bounding box
    » Top, bottom, left, right
• Renders object into frame buffer within that box
  – Bitblt color for rectangles
  – Copy pixel image
• Must arbitrate for SDRAM and carry out bus protocol
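The rectangle case can be sketched as a simple bitblt into a frame buffer modeled as a list of rows (the real design streams words to SDRAM instead; the function name is mine).

```python
# Fill a bounding box in the frame buffer with a solid color (bitblt).

def render_rect(frame, top, bottom, left, right, color):
    """Fill the box [top..bottom) x [left..right) with color."""
    for y in range(top, bottom):
        row = frame[y]
        for x in range(left, right):
            row[x] = color
    return frame

# A paddle at rows 2..5, columns 1..3 of a tiny 6x4 frame:
frame = [[0] * 4 for _ in range(6)]
render_rect(frame, 2, 5, 1, 3, 7)
```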
[Same system diagram: the Render Engine sits between the board state (Game Physics) and the SDRAM Control / video encoder interface]
Game Physics

• Divide time into two major phases
  – Render
  – Compute new board
• Compute phase is divided into 255 ticks
• Each tick is small enough that paddles and ball can only move a small amount
  – Makes fixed-point arithmetic easy
  – New paddle pos/vel based on old pos/vel and joystick velocity
  – New ball is based on old ball pos/vel and all collisions
  – Stream all obstacles and paddles by the ball next-state logic to determine bounce
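One compute tick can be sketched with the update functions from the slides, P' = f(P, j) and B' = f(B, j, C). The numeric details (fixed-point formats, the bounce rule) are placeholders of mine, not the project's specification.

```python
# One small time step of the game physics.

def tick(paddles, ball, joysticks, court):
    """paddles: list of (y, v); ball: ((x, y), (vx, vy));
    joysticks: one velocity per paddle; court: list of boxes."""
    # paddle: P' = f(P, j) -- velocity follows the joystick
    new_paddles = []
    for (y, v), j in zip(paddles, joysticks):
        v = j
        new_paddles.append((y + v, v))

    # ball: B' = f(B, j, C) -- integrate velocity, bounce off obstacles
    (bx, by), (vx, vy) = ball
    bx, by = bx + vx, by + vy
    for (top, bottom, left, right) in court:
        if left <= bx <= right and top <= by <= bottom:
            vy = -vy                 # simplistic bounce: flip vertical velocity
    return new_paddles, ((bx, by), (vx, vy))
```

Because each tick is tiny, the ball can be tested against every obstacle with simple comparisons — no continuous collision detection is needed.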
[Same system diagram: Game Physics updates the board state from the Joystick Interface inputs; the Render Engine and video path display the result]
More when we look at arithmetic
Network Multiplayer Game

[Diagram: two complete FPGA boards, each with its own SDRAM (board state, 32-bit data bus) and ADV7194 composite video output, connected to each other over the network]
Rendezvous & Mode of Operation

• Player with host device publishes channel and ID
  – Write it on the white board
• Set DIP switches to select channel
• Start game as host
  – Wait for guest attach
• Start game as guest
  – Send out attach request
• Host computes all the game physics
  – Local joystick and remote network joystick as inputs
    » Receive joystick movement packets and translate to the equivalent of local input
  – Determines new ball and paddle position
    » Transmits court update packets
  – Network remote device must have fair service
• Both devices render the display locally
Host Device (player 0)

[Diagram: full host datapath — Joystick Interface (player-0 input), Game Physics (board state), Render Engine, SDRAM Control, and the ADV7194 video path as before, plus a CC2420 Network Interface controller on an SPI bus; a Board encoder sends court updates out, and a Joystick decoder turns received packets into the player-1 input]
Guest Device (player 1)

[Diagram: the guest has no Game Physics block — the CC2420 Network Interface controller (SPI) feeds a Board decoder that supplies the board state, while a Joystick encoder sends the local player-1 input to the host; rendering and video output are the same as on the host]
Protocol Stacks

• The usual case is that the MAC protocol encapsulates IP (Internet Protocol), which in turn encapsulates TCP (Transmission Control Protocol), which in turn encapsulates the application layer. Each layer adds its own headers.
• Other protocols exist for other network services (ex: printers).
• When the reliability features (retransmission) of TCP are not needed, UDP/IP is used: gaming and other applications where reliability is provided at the application layer.

  Layer 5: application layer (ex: http)    Layer 5: streaming (ex: MPEG-4)
  Layer 4: TCP                             Layer 4: UDP
  Layer 3: IP                              Layer 3: IP
  Layer 2: MAC                             Layer 2: MAC
Standard Hardware Network Interface

• Usually divided into three hardware blocks. (Application-level processing could be either hardware or software.)
  – MAG. "Magnetics" chip is a transformer providing electrical isolation.
  – PHY. Provides serial/parallel and parallel/serial conversion and encodes the bit stream for the Ethernet signaling convention. Drives/receives analog signals to/from the MAG. Recovers the clock signal from the data input.
  – MAC. Media access layer processing. Processes Ethernet frames: preambles, headers; computes CRC to detect errors on receive and to complete packets for transmission. Buffers (stores) data for/from the application level.
• Application-level interface
  – Could be a standard bus (ex: PCI)
  – or designed specifically for application-level hardware.
• MII is an industry standard for connecting the PHY to the MAC.

[Diagram: Ethernet connection → MAG (transformer) → PHY (Ethernet signal) → MII → MAC (MAC layer processing) → application-level interface]

• Calinx has no MAC chip; MAC processing must be handled in the FPGA.
• You have met Ethernet. IEEE 802.15.4 will look similar… yet different.
802.15.4 Frames
Packet protocols
• Framing definitions– IEEE 802.15.4
• Packet formats– Request game
– Joystick packet
– Board Packet
Schedule of Checkpoints

• CP1: N64 interface (this week)
• CP2: Digital video encoder (week 8)
• CP3: SDRAM controller (two parts, weeks 9-10)
• CP4: IEEE 802.15.4 (CC2420) interface (weeks 11-12)
  – unless we bail out to Ethernet
  – Overlaps with midterm II
• Project CP: game engine (weeks 13-14)
• Endgame
  – 11/29 early checkoff
  – 12/6 final checkoff
  – 12/10 project report due
  – 12/15 midterm III