
Autonomous Tracking Unit

CPSC 483 – Section 501

Final Report

Spring ‘99

John Berglund
Randy Cuaycong
Wes Day
Andrew Fikes
Kamran Shah

Professor: Dr. Rabi Mahapatra

Table of Contents

Abstract ................................................ 4
Applications ............................................ 5
System Overview ......................................... 6
Xilinx Modules .......................................... 7
    Camera Interface .................................... 7
    Servo Control ....................................... 11
    Memory and Camera Control Module .................... 15
    Algorithm ........................................... 19
External Components ..................................... 23
    Major Parts List .................................... 23
    SRAM ................................................ 23
    Servo Platform ...................................... 23
    Prototype Board ..................................... 24
System Integration ...................................... 25
Possible Improvements ................................... 28
References .............................................. 29
Appendix A – Verilog Files .............................. 30
    algor.v ............................................. 30
    cont.v .............................................. 38
    control.v ........................................... 42
    intrface.v .......................................... 44
    pulsgen.v ........................................... 48
    rtox.v .............................................. 49
    rtoy.v .............................................. 50
Appendix B – Project Schematics ......................... 52
    Main Schematic ...................................... 52
    Part A – Algorithm and Servo Control Modules ........ 52
    Part B – Camera Interface Module .................... 53
    Part C – Clocks ..................................... 53
    Part D – Memory and Camera Control Module ........... 54
    Part E – Debug ...................................... 55
Appendix C – Test Drivers and Stubs ..................... 56
    test.bs2 ............................................ 56
    cammem.bs2 .......................................... 57
    cam_stub.v .......................................... 59
    wessim.v ............................................ 60
Appendix D – Algorithm Development Files ................ 63
    ATU.java ............................................ 63
    Algorithms.java ..................................... 63
    MainWindow.java ..................................... 64
    Picture.java ........................................ 65
    WindowDisposer.java ................................. 68
Appendix E – Project Documents .......................... 69
Appendix F – Xilinx Labs Developed During the Project ... 70
    Chapter 15 – Controlling Servos ..................... 70
    Chapter 19 – Basic Stamp 2 to FPGA .................. 72
    Chapter 22 – Interfacing a Connectix QuickCam ....... 75
Appendix G – ATU User’s Manual .......................... 86
    Hardware Setup ...................................... 86
    Software Setup ...................................... 86
    Using the Unit ...................................... 86


Abstract

Our objective is to develop an autonomous camera that can identify and track an object in motion. Our implementation uses a Connectix QuickCam interfaced to a Xilinx FPGA. The Xilinx logic is divided into four components and is responsible for gathering images from the camera, storing them to memory, deciding whether movements are necessary, and controlling the servos. The algorithm for detecting motion is based on histograms obtained from individual images.


Applications

Although our original idea was to develop a system that would track and kill lab ops with a dart gun, our camera system has many practical applications. A few of the real-life applications that we have considered are as follows:

Sports Camera Operator: Televised sports are an excellent application for our camera system. Imagine the scenario in which the camera could track the action of a ball as it flies through the air. When Tiger Woods strikes a golf ball, the camera could automatically follow the ball’s flight down the course. Also, when Rickey Henderson steals second during a baseball game, the camera operator would not have to worry about accidentally missing another record-setting steal. The camera would automatically detect that the player is moving and follow him on his attempt to steal a base.

Surveillance: Department stores and casinos contain hundreds of surveillance cameras to record and monitor actions of their building’s occupants. As a result, security officers constantly monitor each individual camera, even if the scene is empty. With a motion-sensing camera, a computer could selectively filter which images are important for the officer to view, reducing the chance of missing important images.

Population Control: The camera has environmental applications for areas with excessive populations of a particular species. The system could be customized to detect specific species in order to cull them. The Galapagos Islands, for example, have a large population of introduced goats that threaten the survival of the islands’ endemic species. Mounted on a gun, the camera could selectively filter out the goats and eliminate them.


System Overview

Our system can be divided into seven discrete components (see Figure 0).

The four Xilinx modules, which operate together to control all functionality, are described below:

Camera Interface - This module is intended to provide a simple interface in FPGA for the Connectix QuickCam. The interface provides a means for both initialization and image capture from the QuickCam.

Servo Control - Two servos are used to physically orient the QuickCam. The servo control system accepts a simple offset value, which is translated by the servo control mechanism into pulses that turn the camera in two dimensions.

Memory and Camera Control - This module is responsible for storing the captured images in SRAM and controlling image collection from the camera. It also provides a means by which the motion detection hardware can read from memory.

Algorithm - The fourth and final module is responsible for the image analysis. This module is the actual brains behind our system, and is responsible for detecting objects and sending the appropriate signals to move the camera. A Java application was developed to allow us to visually compare possible algorithms. Because of CLB limitations, a simple histogram approach to edge detection was chosen and implemented.
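The histogram idea can be illustrated with a short host-side sketch (Python, not the project's Verilog; the function names and the column-difference reading of the approach are ours):

```python
def column_histogram(frame):
    # Sum the pixel intensities down each column of a frame
    # (frame is a list of rows of grayscale values).
    return [sum(col) for col in zip(*frame)]

def detect_motion(prev, curr, threshold=0):
    # Compare the column histograms of two frames and return the
    # column whose histogram changed the most, or None if no change
    # exceeds the threshold.
    diffs = [abs(a - b) for a, b in
             zip(column_histogram(prev), column_histogram(curr))]
    peak = max(diffs)
    return diffs.index(peak) if peak > threshold else None
```

Presumably the hardware performs the analogous comparison over rows as well, so that both servos receive an offset.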


[Figure 0: System Diagram. Blocks: Camera, Camera Interface, Memory & Camera Control, SRAM, Algorithm, Servo Control, Servos]

Xilinx Modules

Camera Interface

When we first developed our project idea, our intentions were to reuse the camera interface developed by the NetCam team. Unfortunately, as we began reviewing their camera interface, we quickly realized that a more robust, general-purpose camera interface was needed. With this in mind, we decided not to try and adapt the NetCam interface, but to build a comprehensive QuickCam interface solution of our own. Our hope was to develop a “drag-and-drop” solution to QuickCam interfacing that future groups could take advantage of.

Because our project design changed, we actually ended up developing two separate interfaces. The first interface we developed used a complex handshaking protocol to provide asynchronous data exchange. Although this interface worked incredibly well, we quickly realized that it would be a performance bottleneck to our system. For this reason, we developed a second interface that removed the handshaking, and simply streamed the data out on a clock edge. With the second interface, we built a separate module to handle the configuration and multi-step initialization of the camera. The result is a super clean interface that is easily reconfigured to any situation.

Interface I Design

Figure 1 shows the signals of the original module. Notice that it is extremely complex and requires 25 bits of data to be exchanged between the controlling logic and the camera interface module. During a data read, the original module used the Rdy, Valid and Send signals to handshake data between the interface and controlling logic. The controlling logic would assert Send during a read to indicate that it was requesting data. It then asserted Rdy every time it was ready to receive data. The interface responded to these requests by asserting Valid whenever the data from the QuickCam was on the data bus. The controlling logic would deassert Rdy when it no longer required the data to be valid on the data bus.

Interface I Testing

The testing was completed in three parts:

1. Testing in FPGA. Our initial tests were designed to verify that our module behaved as planned. To do this, we implemented the module in FPGA, and set up switches and LEDs so that we could physically see the behavior of the machine. Test cases were developed to exercise the machine’s logic, and all passed with flying colors.

[Figure 1: Camera Interface I. Signals between the Memory and Camera Control module, the Camera Interface module, and the QuickCam: Instruction (4), Parameter (8), Start, Hold, Send, Rdy, Valid, Data (8), Nibble (4), PCAck, Reset, Command (8), CamRdy]

2. Connect a BSII to the Xilinx. After we completed our first round of tests, we realized that we really needed to see an image from the QuickCam. For this, we decided to use a BSII to simulate the controlling logic, and used its debug feature to display the returned data to the screen.

3. Connect both the BSII and the QuickCam to the Xilinx. The final goal of our testing was to display the binary image data returned by the camera. This testing was successful and an image of a star was correctly captured and returned.

Interface II Design

Our goal with the new interface (see Figure 2) was to remove the byte-by-byte handshaking and clear up any hidden complexities. We began by removing the Rdy signal, so that the new interface no longer waits on the controlling module. Instead, the new implementation only requires that the controlling logic assert Send when it is ready to receive data. The camera interface responds by asserting Valid when data is on the bus. The positive edge of the Valid signal can then be used by the controlling logic to latch or store the data. The result is that the interface module can stream the data out as quickly as it arrives from the QuickCam, providing a much improved transfer rate over the original handshaking interface.

On another note, the Instruction (4) and Parameter (8) bits have been replaced by a one-bit Command control. The original interface required the controlling logic not only to pass the desired instruction, but also to pass the parameters for each instruction. This required the controlling logic to exert a significant amount of effort to operate the camera, since the initialization process requires seven to eight instructions.

If Command is low (See Figure 3) and Start is asserted, the interface will initialize the QuickCam and assert Hold until the process is completed. If Command is high (See Figure 4) and Start is asserted, the new interface will synchronously stream data out on a Valid clock, and assert Hold until the process is finished.
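The Valid-edge latching can be modeled in a few lines (a behavioral Python sketch, not the Verilog):

```python
def stream_read(samples):
    # Behavioral model of an Interface II read: after Send is asserted,
    # the controlling logic latches Data on each rising edge of Valid.
    # samples is a sequence of (valid, data) pairs, one per clock.
    latched, prev_valid = [], 0
    for valid, data in samples:
        if valid and not prev_valid:   # rising edge of Valid
            latched.append(data)
        prev_valid = valid
    return latched
```

Edge-triggered latching like this is exactly what the BSII struggled to emulate during testing, since it cannot trigger on a signal edge.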


[Figure 2: Camera Interface II. Signals between the Memory and Camera Control module, the Camera Interface module, and the QuickCam: Start, Hold, Send, Valid, Data (8), Nibble (4), PCAck, Reset, Command, CamRdy, Command (8)]

Interface II Implementation

Figure 5 shows the schematic for Interface II. The two important additions to Interface II can be seen in this diagram. First, the module labeled “control” was added to reduce the complexity of the controlling circuit. Second, the ROM was added to store the command and parameter pairs to send to the camera. When an initialization command is sent to the interface, the control module steps through the first 8 entries in the ROM. The 9th position in the ROM is reserved for the command that takes a picture.

As a result, it is very easy to alter camera parameters such as image size, contrast and brightness. For example, when we originally designed our project, we had intended to use a 256x243 image size to detect motion. During the last stages of development, we realized that this was more data than we needed. To fix this, we simply changed the ROM file to reduce the image size to 128x120. As a result of the ROM approach, no other changes were required.
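The control module's ROM walk amounts to the following (a Python sketch; the actual command/parameter values live in the project's ROM file and are not reproduced here):

```python
INIT_ENTRIES = 8        # entries 0-7 hold the initialization sequence
TAKE_PICTURE_SLOT = 8   # the 9th entry is the take-picture command

def commands_to_send(rom, init):
    # On an init request, step through the first eight ROM entries in
    # order; on a picture request, send only the reserved 9th entry.
    if init:
        return [rom[i] for i in range(INIT_ENTRIES)]
    return [rom[TAKE_PICTURE_SLOT]]
```

The design payoff described above falls out of this structure: changing image size, contrast or brightness touches only the ROM contents, never the control logic.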


[Figure 3: Camera Interface II Initialization Sequence (CMD, HOLD, and START waveforms)]

[Figure 4: Camera Interface II GetPicture Sequence (VALID, CMD, HOLD, and START waveforms)]

Interface II Testing

The testing of the second implementation was divided into two parts:

1. Simulation. Simulation was done to ensure that the interaction between the camera interface and the QuickCam was behaving properly.

2. BSII and QuickCam. After simulating the new interface, we programmed the BSII to produce the controlling signals for the interface. A number of issues arose during testing that are fairly significant:

a) The BSII cannot trigger on the edge of a signal. This is a point of concern as Valid is asserted to indicate that data has been placed on the bus. The implementation requires that the positive edge of Valid be used to latch or store data. To overcome this, we had to test for both a high and a low before we could display the returned data. This works only marginally, and only if the BSII is running at a much faster rate than the camera interface.

b) While trying to keep the camera interface running at a slower rate than the BSII, we actually ran it too slow. We spent well over two weeks trying to figure out why the camera was returning washed out images. By chance, we increased the speed of the clock, and the image magically appeared. It is our hypothesis that the internal CCD of the camera was decaying faster than we could download the image.

The most interesting thing that came out of our testing was the frame rate of our interface. For the image size that we used in our project (128x120), we can capture images at a rate in excess of 30 fps. This caught our group completely off guard. We had only hoped to get maybe 5 frames per second from the camera. We adjusted the module’s clock several times to see if we could improve the rate even more, but we discovered that the interface saturates at around 4 MHz. This seems to be the point at which the camera cannot respond any faster.
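A rough throughput check makes the surprise plausible (a Python sketch; the one-nibble-per-clock figure is our assumption, not a measured value):

```python
def frame_nibbles(width, height, bits_per_pixel):
    # Number of 4-bit transfers needed for one frame.
    return width * height * bits_per_pixel // 4

def transfer_bound_fps(clock_hz, width, height, bits_per_pixel,
                       cycles_per_nibble=1):
    # Upper bound on frame rate if the bus moves one nibble every
    # cycles_per_nibble clocks. The camera itself saturates well
    # below this bound, which is consistent with the ~4 MHz ceiling.
    return clock_hz / (frame_nibbles(width, height, bits_per_pixel)
                       * cycles_per_nibble)
```

A 128x120x4 frame is only 15,360 nibble transfers, so even at several clocks per transfer the interface clock is not the limiting factor; the camera is.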

Interface II Documentation

Interface II has been thoroughly documented in the appended Xilinx Chapter 22. Lab 22 describes in detail how to restore, test and integrate our camera interface into any project.


Figure 5: QUICKCAM.sch

Servo Control

Introduction

For this project, we used two Futaba FP-S148 servos. The specifications are given in Table 0.

Control system:      Pulse-width control, 1520 µs neutral
Power supply:        4.8 V or 6.0 V (shared with the receiver)
Operating angle:     Rotary system, 45° or greater to one side (including trim)
Power consumption:   8 mA at idle (6.0 V)
Operating speed:     0.22 sec / 60°
Output torque:       42 oz·in (3 kg·cm)
Dimensions:          1.58 x 0.77 x 1.4 in (40.4 x 19.8 x 36 mm)
Weight:              1.5 oz (44.4 g)

Table 0: FP-S148 Specifications

The servos have a range of approximately 180 degrees, though in reality only 160 degrees of motion are available. The servos are controlled by pulse-width modulation: they expect a pulse between 1 and 2 ms wide, repeated with a frame period of 17 to 20 ms. In our case, the neutral position is established by sending a pulse 1520 µs wide.

Design 1

Design 1, our simplest design, has a counter that counts up to 10000 (enough clock cycles for the 20 ms frame period) and then resets. It used a 500 kHz clock (period = 2 µs). During the first 625 to 875 clock cycles, the servo control output is driven high. If the output goes low after 625 cycles, the servo moves to the 0° position; if it stays high until 875 cycles, the servo moves to the 160° position. Figure 6 illustrates the servo control output. This design was abandoned because the 14-bit counter required to count to 10000 consumed too many CLBs.
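The cycle counts translate to pulse widths with simple arithmetic (Python sketch):

```python
CLOCK_HZ = 500_000                 # Design 1 clock
PERIOD_US = 1_000_000 / CLOCK_HZ   # 2 microseconds per cycle

def cycles_to_ms(cycles):
    # Convert a cycle count at 500 kHz into milliseconds of pulse width.
    return cycles * PERIOD_US / 1000
```

So 625 cycles give the 1.25 ms minimum, 875 cycles the 1.75 ms maximum, 10000 cycles the 20 ms frame, and 760 cycles the 1.52 ms neutral pulse.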

Design 2

Design 2 was based on a 500 kHz clock and a 6-bit angle value. Through the use of clock dividers, we produced a 22 ms frame clock and a 0.02 ms clock. The 0.02 ms clock was used to drive the “pulsegen” module, which is responsible for delivering the proper signal to the servo. When the frame clock goes high, a 7-bit counter starts incrementing and the servo pulse signal goes high. When the MSB of the counter goes high, the output will have been high for 64 clock cycles, or 1.28 ms, which is the required minimum length. The design continues to count until the six lower bits of the counter are equal to the angle value. The length of the pulse resulting from this algorithm has a range of 1.28 to 1.74 ms. This implementation was designed to provide a pulse of 1.25 to 1.75 milliseconds to the servo every 20 ms. The use of the 7-bit counter is illustrated in Figure 7.

[Figure 6: Servo Control Output. A 17-20 ms frame (10000 clock cycles) contains a pulse of 1.25 ms (625 cycles, 0°) to 1.75 ms (875 cycles, 160°)]

This design addressed the CLB usage problem encountered in Design 1, as a 14-bit counter is not used. Unfortunately, this design had two other problems. According to data figures retrieved from the Internet, a pulse length between 1.25 ms and 1.75 ms should control the full range of the servo. These timing figures were found to be incorrect; the actual values for the maximum and minimum pulse length are approximately 1 to 2 ms. The second problem was that this design only allowed for 25 increments from minimum to maximum servo position, which does not provide the accuracy we require.

These two problems work against each other. To increase the number of increments between minimum and maximum, we need to add bits to the angle input; this raises the MSB and in turn increases the minimum pulse length. On the other hand, decreasing the minimum pulse length reduces the number of increments.

Design 3

Design 3 is a modification of Design 2. Instead of testing the MSB for the minimum pulse length, an 8-bit counter first counts to the minimum pulse length. Once the minimum pulse length is reached, the counter resets to 0 and counts until it equals the 8-bit input angle. Once the angle length is reached, the pulse signal is lowered. A flag is used to differentiate between the count to the minimum pulse length and the count to the input angle length.

This design allows for a shorter minimum pulse length and provides 8 bits of increment. Unfortunately, the number of CLBs required increased, due to the 8-bit comparator.

Final Design

For the final design, we used an internal 9-bit counter and an 8-bit input angle. We reverted to the idea of using the MSB to determine when the minimum pulse length has been reached. Since we are using 8 bits instead of 5 for the input angle, we have successfully increased the number of increments over Design 2.

A simple solution to our minimum pulse length problem was devised. The 9-bit counter has a value of 1 0000 0000 (binary) when the minimum pulse length is reached. To allow for shorter minimum pulse lengths, the counter starts at 256 minus the number of clock cycles needed to produce a minimum pulse length. This allows us to use a 1-bit comparator for the minimum pulse length and to test for the angle using the lower 8 bits.


If (bit 6 == 0) {
    Minimum pulse not reached
} else {
    Minimum pulse reached
    Test for angle
}

If (bit 5:0 == angle) {
    Minimum + angle pulse length generated
    Pulse = 0;
}

Bit:  6 5 4 3 2 1 0

Figure 7: Bit Field Usage

Our algorithm works by first setting the counter to 2^8 (256) minus the minimum pulse length (60). The counter is incremented on each clock cycle. After 60 clock cycles, the counter equals 256 (1 0000 0000 in binary). When this condition is met, the counter continues to increment. The output pulse is high from the time the counter begins incrementing until the lower 8 bits of the counter equal the angle value. When this occurs, the counter stops incrementing and the output pulse is lowered. When the frame clock goes low, we reset the counter and angle value, and wait for the rising edge of the frame clock to begin again.
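A behavioral model of this counter (Python; min_pulse=60 is the figure from the text, and we assume the pulse is high for the entire count) confirms that the high time always comes out to the minimum pulse plus the requested angle:

```python
def pulse_high_cycles(angle, min_pulse=60):
    # Model of the final 9-bit counter: start at 256 - min_pulse and
    # count up; the pulse stays high until bit 8 is set (minimum pulse
    # reached) and the low 8 bits equal the requested angle.
    counter = 256 - min_pulse
    high = 0
    while True:
        counter += 1
        high += 1
        if counter >= 256 and (counter & 0xFF) == angle:
            return high
```

This is the "1-bit comparator" trick in action: the hardware only checks bit 8 and an 8-bit equality, never a full-width magnitude compare.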

The interface defined between the Algorithm module and the Servo Control module calls for the Algorithm module to pass a 6-bit offset from the current position and a direction bit. The servo works on an absolute value system, so a separate Verilog module, named “RtoA” is needed.

“RtoA” has two major responsibilities. First, it has to maintain the current position of the servo by means of a registered angle output to the “pulsegen” module. This allows us to take a direction and offset for inputs and translate them to an absolute value to send to “pulsegen”. Second, the module has to reset the angle to the normal position when the angle value is incremented out of the range of the servo. In previous designs, “RtoA” also prevented the angle from changing while “pulsegen” is producing a pulse. We decided that this was better implemented in the Algorithm module.
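The first responsibility can be sketched as follows (Python model; the range and center constants are placeholders, since the real values were determined experimentally as described later in this section):

```python
SERVO_MIN, SERVO_MAX, SERVO_CENTER = 0, 255, 128  # placeholder values

def rto_a(current_angle, offset, direction):
    # Model of "RtoA": apply a signed offset to the registered absolute
    # angle; if the result falls outside the servo's range, reset the
    # angle to the center (normal) position.
    new = current_angle + offset if direction else current_angle - offset
    if new < SERVO_MIN or new > SERVO_MAX:
        return SERVO_CENTER
    return new
```

The registered output is what "pulsegen" consumes, so the Algorithm module never needs to know the servo's absolute position.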

The clock speeds in the Xilinx FPGA are not perfectly accurate. Our current design can easily be modified to accept clocks of various speeds, as long as they are close to 100 kHz. The closest frequency we achieved was 88.7 kHz, obtained by dividing the 500 kHz oscillator by 5. This clock was used to drive the logic within the “pulsegen” module.

Our design needs another clock to define our frame. This clock was initially set to provide a pulse every 20 ms. During the final stages of integration, however, we found that the algorithm module was wasting valuable time waiting for the frame clock to be deasserted. Driving the servos with a 17 ms frame clock rather than a 20 ms frame clock increased our overall system performance (this is discussed in the System Integration section on page 19). The final servo control schematic is shown in Figure 8.


Figure 8: Schematic for Servo Control

To prevent damage to the servos during testing, we used the oscilloscope to verify that the pulse generated would set the servo in the normal position. We then attached the servos to the control signal and measured the pulse lengths at the “physical” minimum and maximum ranges. Since we also knew the clock speed, simple mathematics determined how many pulse clock periods were required for the extreme values of the servo. These values were initially used in the “RtoA” module to control the reset conditions. Further experimentation provided the final values for maximum and minimum ranges and the center.


Memory and Camera Control Module

The Metamorphosis of a Module

In our initial proposal, we described the development of the Memory and Camera Control (MCC) module with a short and innocent definition of “Memory System”:

“Memory System: This will involve implementing an SRAM interface that will store image representations and allow for them to be accessed by other system components.”

The MCC is the tool we developed to accomplish this objective. Our goal was to develop a memory system that shielded the algorithm module from the specific details of interfacing with the QuickCam and memory. We felt that this modular approach would greatly facilitate possible research and experimentation with different algorithms. Despite facing a considerable number of challenges, we were able to accomplish this goal. The following details the development of the MCC.

MCC Initial Design

In the initial design of the MCC, we intended to use asynchronous communications with the Camera Control module. Figure 9 shows the handshaking protocol we planned on using. Figure 10 illustrates the asynchronous handshaking we planned to use to transfer data from the Camera Control module.

We invested a considerable amount of our resources in designing this initial module and implementing it in Verilog. The initial design required approximately 75 CLBs. We had the module simulating correctly, but revised it for a number of reasons. We felt that the extra cycles introduced by the asynchronous interface with the camera control module could be avoided. This module also spent many states going through a complex procedure to initialize the camera. We felt these details could be taken care of in the Camera Control module, and moved them there.


[Figure 9: Initial Instruction Execution between the MCC and the Camera Control. Signals: Cam_instr, Cam_param, Cam_start, Cam_hold, Data Valid]

Design 2

We decided to move the camera initialization sequence into the Camera Control module and to stream data from the camera synchronously. Removing the long camera initialization sequence and the asynchronous handshaking from the Memory Control module only lowered the CLB count to approximately 68. We also decided that it would be best to unpack the packed 6-bit pixels the camera was sending before storing them in memory. We had to use a 24-bit buffer to unpack every 3 transfers from the camera into four memory writes with one 6-bit pixel in each. The complexity and buffer size made this design expensive in terms of clock cycles and CLBs. We began testing our implementation in simulation. Realizing that the CLB count would be too large for the module to be successfully integrated into the entire unit, we revised this design as well.

Design 3

In response to the unpacking problem mentioned above, we decided to simplify things by using 4-bit pixels instead of 6-bit pixels. The Camera Control module allowed us to easily make this modification. Using 4-bit packed pixels and storing them directly in memory allowed us to remove the pixel decompression logic and the 24-bit buffer, and to use half of the previous amount of external memory.

To reduce complexity, we also decided to use only 64K of memory, which would give us room to store two full 256x243x4 pictures, leaving 3K for the algorithm to use. All of this reduction dropped the CLB count to approximately 28 for this module. Using a memory space of 256x256x4 for each frame, where we pack two pixels into each byte, memory access by row or by column becomes simpler. Each row takes 256x4 bits, or 128 bytes, which makes it byte-addressable with 7 address lines. To get a column of data, we add 1 to the 8th position of the address for each pixel as the data is read in. Since each byte contains two pixels, the column returned has a width of two pixels. Since each image capture occupies a memory frame of 256x256/2 bytes, the frame select became the top bit of the memory address. Figure 11 shows the memory-addressing scheme we planned on using. The advantage of this memory management scheme is that accessing it requires only 8-bit adders and no multiplication, which saves a considerable amount of resources in terms of CLBs and maximum clock frequency.
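The address arithmetic can be written out directly (a Python sketch of our reading of the scheme: 7 low bits for the byte-column, 8 bits for the row, and the frame select on top of the 64K space):

```python
def byte_address(frame, row, col):
    # 256x256x4 scheme: two 4-bit pixels per byte, so 128 bytes per row.
    # Column bits occupy the low 7 address bits, the row the next 8,
    # and the frame select is the top bit of the 64K space.
    return (frame << 15) | (row << 7) | (col >> 1)

def column_read_addresses(frame, col, rows=256):
    # A column read adds 1 to bit 7 (the "8th position") per pixel pair,
    # i.e. it walks down the rows at a fixed byte-column.
    return [byte_address(frame, r, col) for r in range(rows)]
```

Both access patterns reduce to shifts, ORs and an increment, which is why no multiplier is needed.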


[Figure 10: Original Asynchronous Picture Transfer between the MCC and the Camera Control. Signals: Send, Cam_rdy, Cam_valid, Cam_data, Data Valid]

This module also added a feature that allowed the algorithm to read and write specific bytes to memory. At this point, we determined that we did not need the resolution we were using to track objects.

Design 4

After timing the frame capture rates with the Camera Control module (see the Camera Control section), we decided to reduce our frame size from 256X243X4 to 128X120X4 (number of rows X number of columns X pixel size in bits). This resulted in a significant increase in frame capture rate without sacrificing too much resolution. We felt the gains in speed far outweighed the benefit of the higher resolution, given the overall objective of our project. We made some minor modifications to the MCC module to account for this new resolution. Our new address scheme is shown in Figure 12. Since the Frame portion of the address now takes up 3 bits, we could potentially store up to 8 frames in memory at once.
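Under this scheme, each frame slot spans 2^(7+6) = 8192 byte addresses, so eight slots exactly fill the 64K of SRAM. A hypothetical Java model (names ours, not the report's code):

```java
// Hypothetical model of the Design 4 address layout:
// Frame (3 bits) | Vertical Offset (7 bits) | Horizontal Offset (6 bits).
class Design4Addr {
    static final int FRAME_BYTES = 1 << 13;  // 8 Kbytes addressable per frame slot

    // Byte address for frame (0-7), row (0-119), byte-within-row (0-63).
    static int byteAddress(int frame, int row, int byteCol) {
        return (frame << 13) | (row << 6) | byteCol;
    }
}
```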

We began testing this module by integrating it with the Camera Control module (which had already been unit tested) and driving it externally with the Basic Stamp II to simulate the Algorithm module. The BS2 asked the MCC to initialize the camera, get a video frame, store it in memory, and return the picture by rows and by columns. The results we obtained were sporadic at best. Occasionally we would get entire picture frames from the camera and things seemed to work very well; on other occasions, we would get garbage. We later found out that this garbage was mainly due to a faulty cord coming from the camera, but we spent a great amount of time investigating its source. After becoming frustrated with our results, we decided to redesign the MCC to make it more straightforward and easier to debug.

Final Design

The final implementation of the MCC was mostly a re-coding of Design 4. We made state and signal transitions easier to follow, in the hope that this would allow us to debug the module more easily. We also removed the operations that allowed the algorithm to read and write bytes to memory, which let us connect the camera's data pins directly to the memory output buffers, and its valid signal to the read/write line.

Our efforts were in vain, though, as the performance of this re-coded module was about the same as the previous one. After spending some time fighting with it, we decided to make a separate project to unit test the MCC, replacing the Camera Control module with a stub that wrote a sequence of numbers to memory. This unit test still used the BS2 to simulate the algorithm. The results of the unit test were very good, though not perfect. We were able to write a stream of numbers to memory and read them back fairly well. However, when performing a consistency test where we read the same row back from memory over and over again, the read would occasionally miss a pixel. We were unable to account for this inconsistency, but decided to move on with our project in spite of it.

Frame (1 bit) | Vertical Offset (8 bits) | Horizontal Offset (7 bits)

Figure 11: MCC Design 3 Image Memory Addressing Scheme.

Frame (3 bits) | Vertical Offset (7 bits) | Horizontal Offset (6 bits)

Figure 12: MCC Design 4 Image Memory Addressing Scheme.

After re-integrating with the Camera Control module, we were able to get pictures again. At this point we discovered that some problems were due to the camera sending bad data; if we jiggled the camera's cord, operation improved. We had been aware of this problem before, but for some reason did not think of it when running these tests. We also attributed the inconsistencies seen during unit testing to internal problems with the tools we were using. After discovering how to synchronously debug our designs, we found these kinds of problems to be common. Occasionally, state transitions would not occur as we specified them, and in several cases we had to make changes to keep our state machines from going into invalid states. In spite of these difficulties, we were able to complete our Memory and Camera Control module.

The final timing diagrams for the interfaces between the algorithm and the MCC, and the camera and the MCC are available in those respective sections.

Possible Improvements

Find the Missing Byte - We could improve the memory interface to get rid of the occasional misses encountered when reading. These misses were noticed when running our unit tests and reading the same row back over and over again. We believe the tools we were using are a contributing factor to this problem, and using the synchronous debugger could lead to possible solutions.

Wasted Space - Since the Frame portion of the memory address now takes up 3 bits, we could potentially store up to 8 frames in memory at once. Ironically, the algorithm we finally decided on needs only one frame. We could have completed our project with only 2^(6+7) = 8 Kbytes of memory, instead of the 64 Kbytes currently wired to our board.


Algorithm

JAVA Testing Platform

Several algorithms were proposed during our investigation of motion tracking. Since testing algorithms directly in hardware would have been difficult, we decided to implement our proposed algorithms in JAVA. We chose JAVA because of its extensive graphical user interface libraries, which allowed us to create a basic application to test various algorithms efficiently.

Screen shots are used to simulate input from the camera. The source code for the JAVA application can be found in Appendix D. The algorithms compared used either one or two screen shots to perform their calculations. When the "Find" button is selected, the application performs its calculations and places a red crosshair on the resulting "center".

Unfortunately, the majority of our proposed algorithms were difficult to implement in hardware because of the limited number of CLBs and the complexity involved. We chose the simplest algorithm, which uses edge detection to find the center of a solid, uniform object.

Algorithm Problems and Solutions

The algorithm that we chose to implement operates by calculating delta values for consecutive pixel rows and columns of an image frame. The maximum and minimum column deltas represent the left and right sides of the object, and the maximum and minimum row deltas represent the top and bottom of the object. Given the boundaries of the object, the offset from the center of the object to the center of the frame can be calculated. While the basics of this algorithm are simple, the actual implementation is much more complicated because of the interfacing required to get the frame data. Before the Verilog implementation was completed, several problems were noticed and corrected in the algorithm.

The logic of the original algorithm failed to take into account two common cases. First, our algorithm failed to deal with the instance where no object was in the frame. To solve this problem, we initialized the registers that store the maximum and minimum deltas to a non-zero number. This acts as a buffer, so that only significant changes in the deltas are identified as edges of an object. Our original algorithm also did not handle the case where an object was not completely within the frame. Initializing the minimum row position (which represents the bottom of the object) to the bottom of the frame (the last row) and the maximum row position (which represents the top of the object) to the top of the frame (the first row) solves this problem. The same solution was applied to the column operations.
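In the spirit of the group's JAVA testing platform, the corrected row logic can be sketched as follows. The 50-count delta buffer and the boundary initialization come from the report; the names, structure, and toy frame sizes are our illustration, not the report's code:

```java
// Illustrative model of the row portion of the edge-detection algorithm:
// consecutive row sums are differenced, and deltas must exceed a threshold
// (the "buffer") before they count as object edges.
class RowScan {
    static final int THRESHOLD = 50;  // initial max/min delta; filters out noise

    // Returns the row index of the object's vertical center, given per-row sums.
    static int verticalCenter(int[] rowSum) {
        int lastRow = rowSum.length - 1;
        int maxDelta = THRESHOLD, minDelta = THRESHOLD;
        int maxPos = 0;        // one edge defaults to the first row
        int minPos = lastRow;  // the other edge defaults to the last row
        for (int row = 1; row <= lastRow; row++) {
            int prev = rowSum[row - 1], cur = rowSum[row];
            if (cur > prev) {  // positive delta: candidate for the maximum
                if (cur - prev > maxDelta) { maxDelta = cur - prev; maxPos = row; }
            } else {           // negative delta: candidate for the minimum
                if (prev - cur > minDelta) { minDelta = prev - cur; minPos = row; }
            }
        }
        return (maxPos + minPos) / 2;  // center of the object
    }
}
```

With no object present, no delta beats the threshold, so the defaults average to the middle of the frame and the camera holds still; an object hanging off one edge keeps that edge's default position, as the report describes.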

We also encountered several problems that had nothing to do with the logic of the motion detection. These problems were due to timing issues, CLB usage, and unexpected and inconsistent behavior of Verilog. The majority of our debugging effort went into addressing these problems.

The timing problems were caused by changing a variable more than once, or by changing a variable and using comparisons with that variable, in a single state. These timing issues caused the system to hang when trying to interface with the memory module. Spreading the operations across several states solved this timing problem. Another issue involved the use of a nested state machine. We had previously encountered problems with a similar design, so the nested state machine was replaced with a single-layer state machine. This switch required more state names, but the number of states remained the same, so no extra CLBs were required. In the course of system integration it became necessary to reduce the number of CLBs used. Most of the reduction came from shrinking register sizes. The largest reduction was in the x and y offsets sent to the servo module, which were reduced from 8 bits to 2 bits. We also reduced the internal summing registers and the maximum and minimum position registers.

Final Algorithm Module

The final Verilog implementation of the Algorithm module has four routines. The first of these initializes the camera. This routine executes only once, when the system first comes online, and requires four states. In the first state, the servo module is sent a signal to initialize the servos; since no values have been sent to the servos yet, this centers the camera. Also in this state, the "initialize camera" instruction is placed on the instruction bus. In the next state, a signal is sent to indicate that the data on the instruction bus is valid. In the third state, we wait for the memory module to acknowledge our instruction. The fourth state monitors the memory module's acknowledgement and waits for the instruction to complete.

Once the camera has been initialized we are ready to enter the next routine, which takes a picture and stores the information into memory. This routine is essentially the same as the camera initialization except the servos are not updated and the "get picture" instruction is placed on the instruction bus. Figure 13 illustrates the handshaking protocol for these two routines. After a picture is stored in memory, we begin the row routine.

The first state of the row routine initializes all of the variables necessary to determine which row the center of the object lies on. The previous sum and the parameter registers are initialized to zero. The maximum and minimum deltas are set to 50, which serves as the buffer mentioned earlier. The maximum position, which stores the top of the object, is set to 0, and the minimum position, which stores the bottom of the object, is set to 119 (the last row). With the deltas and positions initialized to these values, we can handle the cases of no object, or only part of an object, in the frame. Next, we begin summing the current row.

The next state places the instruction and parameter on the appropriate busses to tell the memory module which row to retrieve. In this state, the current sum and counter are initialized to zero. When the memory module acknowledges that the data on the bus is valid, the data is added to the current sum. Each memory access retrieves one byte of data, which represents the intensities of two pixels, so the upper and lower halves of the byte are added to the sum separately. This requires two states to avoid timing problems. Once this data has been summed, we increment the counter and tell the memory module that we are done with the data, which causes it to load the next byte. Until our counter reaches 64, we have not yet summed the entire row, so we return to the state where we wait for memory to acknowledge that the data on the bus is valid. When our counter reaches 64, we have processed 128 pixels worth of data, so we are done with the current row. Unless this is the first row, we then generate a delta value.

Figure 13: Handshaking Protocol for Camera Initialization and Image Capture (timing diagram showing the Instruction, Send, and Valid signals).
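The two-pixels-per-byte accumulation can be modeled as below (a hypothetical Java helper; the Verilog does the two adds in separate states to avoid the timing problems mentioned above):

```java
// Model of summing one packed byte: two 4-bit pixel intensities per byte.
class PixelSum {
    // Adds both nibbles of a packed byte to a running row sum.
    static int addPackedByte(int sum, int packed) {
        int upper = (packed >> 4) & 0xF;  // first pixel of the pair
        int lower = packed & 0xF;         // second pixel of the pair
        return sum + upper + lower;
    }
}
```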

To generate a delta, we need values for the current row's sum and the previous row's sum. The delta should equal the current sum minus the previous sum. Since we cannot represent negative numbers, the delta is set to the larger of the two sums minus the smaller sum. We know that only a positive delta could be a maximum and only a negative delta could be a minimum. So if the previous sum is larger than the current sum we know that the delta should be negative, and we test it against the minimum delta. If the current sum is greater than the previous sum, we check the delta against the maximum delta. If the delta is a new maximum or minimum, then the max or min position is set to the value of the parameter variable because the parameter represents the row number we are currently testing.

After generating and testing the delta, we increment parameter and set the previous sum variable equal to the current sum. Until parameter equals 120 we have not finished processing every row of the current frame, so we return to the state that puts the instruction and parameter on the busses to the memory module.

After all of the rows have been summed, we calculate the center of the object by adding the maximum and minimum positions and dividing by two. To decide which direction to move the servo, we compare the center of the object to the middle row (row 60): if the center is higher than the middle row, we tilt the camera up; if it is lower, we tilt the camera down. Once we know the direction, we need to decide how far to tilt the camera, which requires the number of rows from the center of the object to the middle row of the frame. If this distance is 15 rows or less, we don't move the camera; if it is greater than 15, we move the camera one increment; and if it is greater than 31, we move the camera two increments. By not moving the camera for small distances, we stabilize the camera and eliminate hunting of the object. Now that we have the distance and direction, we can update the camera's position. Before we send the direction and increment values to the Servo Control module, we have to ensure it is not currently driving the servo, so we wait for the frame clock, which controls the servo module, to go high before updating it. After updating the Servo Control module responsible for the vertical movement servo, we proceed with the column operation.
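The vertical deadband logic can be sketched as follows (hypothetical Java; the handling of the exact threshold boundaries is our reading of the report, and the column operation uses smaller thresholds because it has half as many positions):

```java
// Model of the vertical deadband: small errors from the middle row are
// ignored so the camera does not hunt around the object.
class TiltControl {
    static final int MIDDLE_ROW = 60;

    // Returns 0, 1 or 2 servo increments for a given object-center row.
    static int increments(int centerRow) {
        int dist = centerRow > MIDDLE_ROW ? centerRow - MIDDLE_ROW
                                          : MIDDLE_ROW - centerRow;
        if (dist <= 15) return 0;  // inside deadband: hold still
        if (dist <= 31) return 1;  // moderate error: one increment
        return 2;                  // large error: two increments
    }
}
```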

The main difference between the column and row operations involves the way we get data from memory. Instead of getting two pixels from the same row, we get one pixel from each of two different columns. Because of this, we sum two columns worth of data at a time, so we need another register to store the second column's sum. Despite this difference, the procedure for the column operation is very similar to that of the row operation. In the first state, the previous sum and parameter are initialized to zero, the maximum and minimum deltas are set to 50, and the maximum position, which stores one side of the object, is set to 0. However, we address the columns differently: since we compute two columns at a time, we set the minimum position, which stores the other side of the object, to 63 rather than 127.

The first state involved in summing columns initializes the current sum and the next sum. It also places the parameter and the "get column" instruction on the appropriate busses; the parameter tells the Memory and Camera Control module which pair of columns to get. When the data is valid, the two sums are updated with the data. Next, we increment the counter and tell the memory to get the next byte of data. If the counter is 63, we are done with the current pair of columns; otherwise we continue summing. Unless this is the first pair of columns, we then generate delta values.


The column operation uses the same basic procedure as the row operation to calculate delta values and compare them to the maximum and minimum values. The first column's delta is computed using the current sum and the previous sum. After this delta has been compared to the maximum and minimum deltas, we compute and compare the second column's delta before moving on to the next pair of columns. The second column's delta is computed using the first column's sum instead of the previous sum. After the second column's delta has been compared to the maximum and minimum deltas, we increment parameter and set the previous sum equal to the second column's sum. Until parameter equals 64, we continue to compute the sums for the next pair of columns.

After all columns have been summed, the maximum and minimum positions represent the left and right sides of the object respectively. The procedure for computing the center of the object is the same: add the maximum and minimum positions and divide by two. Finding the direction and magnitude to move the camera is similar to the row operation; the main difference is determining how far to move the camera. Because of the way we address the columns, there are about half as many column positions as row positions. If the distance from the center of the object to the middle column is less than 4, we don't move the camera; if the distance is greater than 4 but less than 8, we move the camera one increment; and if the distance is 8 or greater, we move the camera two increments. We want the Servo Control module to be updated only when it is not driving a servo. In the row operation we waited for the frame clock to go high; to avoid drawing too much current, the horizontal (x) and vertical (y) servos are active on opposite edges of the frame clock. So, for the column operation, we wait for the frame clock to go low and then send the signal to update the Servo Control module controlling the x servo with the direction and increment values.

After updating the x servo, we are done processing the current frame. The next iteration of the algorithm begins with the "get picture" operation and the process described above is repeated. Figure 14 illustrates the handshaking used when transferring data for the row and column operations.


Figure 14: Sample Data Transfer (timing diagram showing the Instruction, Parameter, Ready, Send, Valid, and Data signals, annotated with Data Valid and Transfer Complete).

External Components

Major Parts List

Part Name                 | Quantity
Xilinx XC40010E PC84 FPGA | 1
Connectix QuickCam        | 1
Winbond W24257AX-15 SRAM  | 2
Futaba FP-S148 Servos     | 2

SRAM

We used two Winbond W24257AK-15 SRAM chips. We made a separate project to unit test this hardware. The memory test wrote a different number to every memory address and then read all of these numbers back. If at any point the number read back did not match the number written, a sequence of lights indicated the failure. We were able to get this test to run perfectly at up to 8MHz, the maximum internal frequency we could achieve in our FPGA.
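The write-then-verify sweep can be modeled as below. The report does not say which pattern was written, so the low byte of the address is our assumption, and the array stands in for one 32K x 8 SRAM:

```java
// Model of the SRAM unit test: write a pattern to every address, then read
// everything back and verify. A mismatch corresponds to the failure lights.
class SramTest {
    static boolean writeVerify(int[] sram) {
        for (int addr = 0; addr < sram.length; addr++)
            sram[addr] = addr & 0xFF;          // write phase (assumed pattern)
        for (int addr = 0; addr < sram.length; addr++)
            if (sram[addr] != (addr & 0xFF))   // read/verify phase
                return false;                  // would trigger the failure lights
        return true;
    }
}
```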

Servo Platform

The servo platform design went through several changes. Unique to each design were the materials used and how the servos were attached. Steel, brass, aluminum, and plastic were all considered. The issues addressed in the platform design were stability, ample viewing angles, weight, balance, and cost. The final design, illustrated below in Figure 15, best balances these requirements. By using PVC piping with diameters of approximately 7 ½ and 4 ¾ inches, cost could be kept to a minimum while providing the stiffness required to limit unwanted movement caused by the servos. Weight was of no concern because the PVC pipe, camera, and left/right servo are relatively light.

A few minor modifications were made to our design while building the platform. The servo that moves the camera in the Y direction was mounted differently: it was mounted into the "outer ring" and connected to the center rod with an extra horn and thin metal rods, which provided better stability and a cleaner design. It was also difficult to find a 7 ½ inch PVC pipe at a reasonable cost, so a plastic plant pot was cut, reinforced with a metal bracket, and used in place of the outer PVC pipe. To allow more ample viewing angles, we mounted the platform on an ordinary camera tripod. There was some discussion about the stability of the finished product, and methods to reinforce the platform were suggested, but after completing the first round of tests on the servos, the platform proved to be stable.

Figure 15: Platform (the servo is mounted to a solid object, with the camera on the platform).

Prototype Board

The pins from the FPGA to our components are shown in Figure 16.

[Pin diagram omitted: Figure 16 shows the bottom view of the XC40010E PC84 package, mapping FPGA pins to the Camera control and data lines (CamRdy, PCAck, Reset, CamValid, CamHold, Com0-Com7, Data0-Data3), the SRAM address (Addr0-Addr15), data (Data0-Data7) and CE lines, the XServo and YServo outputs, the Xchecker download/debug signals, and the buttons and switches.]

Figure 16: FPGA Pin Assignments


System Integration

Overview

The wire wrapping of our prototype board was completed shortly after our mid-term presentation. This was a big step, since it enabled us to leave the constraints of the demo board and begin integrating the different system modules.

Before we combined modules, we took the time to verify the current consumption of each of the components on the newly built board. Using an ammeter, we found that the QuickCam draws 80mA and each SRAM chip draws 20mA. This is very close to what we had expected, and is well within appropriate limits.

The first two modules that we combined were the Camera Interface module and the Memory and Camera Control module. This combination was first simulated on the PC to ensure that the handshaking was operating correctly. We then used a BSII to physically test the combined interfaces. The BSII simulated the Algorithm module and sent instructions to initialize and store images from the camera. The BSII then retrieved the stored images in both row and column oriented fashions, and the results were compared to test the consistency of the values being returned.

The consistency of returned data was not up to the standard we were expecting, and although changes were made that improved it greatly, consistency still remains an open issue. Running the Camera Interface module and the Camera Control module at different clock rates produced the best results. The final implementation uses a 4MHz clock for the Camera Interface module and an 8MHz clock for the Memory and Camera Control module. Although the data was not perfectly consistent, it was our opinion that these were minor problems and should not affect the overall performance of the system. This assumption has proved correct after integrating the entire unit and observing the unit’s performance.

The next stage of integration involved combining the Algorithm module and the Servo Control module with the already combined Camera Interface module and the Memory and Camera Control module. We first attempted to integrate all of the modules in one go, without ever having done any testing of the Algorithm module. This, of course, did not go well. As a result, we returned to unit testing each module.

It did not take us long to realize that most of our problems were in the untested algorithm component. To make testing easier, we cut the algorithm module down so that it was only responsible for calculating the vertical center of an object. We then developed a Verilog stub module, wessim.v, which returned a white image with a black rectangular object. By changing parameters within the module, we oriented this object in different positions and used it to verify the correctness of the algorithm module. Our algorithm module was completely bug-ridden, and in several instances our design fell short. Once we were confident that the Algorithm module was stable, we added the Servo Control module to the project and tested the signals being sent to the servos.

We then proceeded to reintegrate the Algorithm and Servo Control modules with the Camera Interface and Memory and Camera Control modules. Almost immediately, we observed correct operation in the vertical direction using actual images from the QuickCam. Exuberant from our success, we repeated the simulation approach for the horizontal direction, and once we verified its correctness, we combined it with the already working vertical direction.

Problems

During integration of all of the modules, we encountered several problems related to clock rates. Our first attempt at integration clocked the Camera Interface, Memory and Camera Control, Algorithm, and Servo Control modules at 4MHz, 8MHz, 500kHz and 500kHz respectively. During unit testing, we found that the Camera Interface and Memory and Camera Control modules were optimally clocked at 4MHz and 8MHz respectively, and the Servo Control module used the 500kHz clock to generate the pulse widths for the servo control signals. For this reason, the only module whose speed we could possibly increase was the Algorithm module.

Unfortunately, increasing the clock rate to the Algorithm module resulted in choppy movement of the camera. We discovered that this behavior was directly related to the manner in which the servos operate. The pulse being sent to the servo was being updated while the signal to the servo was still high. To ensure that the Algorithm module did not issue a request to update the servos prematurely, the module was altered to monitor the frame clock and to send updates to the horizontal and vertical movement servos only when the servo signal is low. This change greatly improved the smoothness of camera motion, and at the 8MHz Algorithm clock, we processed nearly 16 frames per second.

To improve the processing rate even further, we concentrated on improving the behavior of the Algorithm module, which controls the operation of the entire unit. After monitoring signals between the different modules, we measured the times taken for each phase of image processing. These times are shown in Table 1.

Operation Phase                     | Time (ms) | Description
Download & Store Image              | 33.20     | Time for the image to be retrieved from the camera and stored in SRAM.
Process Vertical Position           | 8.5       | Time to retrieve image data from SRAM and calculate the object's vertical position.
Wait to Process Horizontal Position | 8.3       | Wait for frame clock.
Process Horizontal Position         | 7.5       | Time to retrieve image data from SRAM and calculate the object's horizontal position.
Wait to Capture Next Image          | 2.6       | Wait for frame clock.
Total                               | 60.1      | Time to process one image.

Table 1: Image Processing Times with a 20.20ms Frame Clock.

The 33.2ms taken by the Camera Interface to capture an image is close to the physical limits of the QuickCam. Note, however, that waiting for the frame clock accounts for nearly 10.9ms of the total 60.1ms needed to process an image. If we could reduce this wait time, we could increase our image-processing rate. We achieved this by reducing our frame clock period to 17.00ms, the smallest servo pulse period allowable. This small change nearly eliminated the wait time and increased our frame rate from 16.64 to 19.73 frames per second. Table 2 displays the timing results with a 17.00ms frame clock.
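As a sanity check on these figures, the phase times and frame rates can be recomputed with a small helper of our own (not part of the report's code):

```java
// Checking the report's frame-rate arithmetic: sum the per-phase times
// and convert a total processing time in milliseconds to frames per second.
class FrameTiming {
    static double totalMs(double[] phases) {
        double total = 0;
        for (double p : phases) total += p;
        return total;
    }

    static double fps(double totalMs) {
        return 1000.0 / totalMs;
    }
}
```

Summing the Table 2 phases (33.20 + 8.5 + 0.640 + 7.5 + 0.840) reproduces the 50.68ms total, and 1000/60.1 and 1000/50.68 give the quoted 16.64 and 19.73 frames per second.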


Operation Phase                     | Time (ms) | Description
Download & Store Image              | 33.20     | Time for the image to be retrieved from the camera and stored in SRAM.
Process Vertical Position           | 8.5       | Time to retrieve image data from SRAM and calculate the object's vertical position.
Wait to Process Horizontal Position | 0.640     | Wait for frame clock.
Process Horizontal Position         | 7.5       | Time to retrieve image data from SRAM and calculate the object's horizontal position.
Wait to Capture Next Image          | 0.840     | Wait for frame clock.
Total                               | 50.68     | Time to process one image.

Table 2: Image Processing Times with a 17.00ms Frame Clock.

After successfully integrating the unit, we tested its ability to track different objects. With the camera stationary, we have been able to track objects of various shapes, including a black square, a black roll of electrical tape and a blue football. The unit is also able to track other dark objects like a hand or watch. We then proceeded to observe the unit’s performance when the camera mount is in motion as well. We found that the unit will still center an object.


Possible Improvements

We have considered two possible ways to improve the current design; both should result in increased performance. The first involves image buffering: by maintaining two images in two different SRAM chips, the unit could process one image while acquiring the next, and that image could then be processed while another is stored in place of the first. The operational limit of this implementation would be determined primarily by the physical limits of the QuickCam interface. We can expect it to process an image in a little over 33.20ms, giving a maximum processing rate of around 30 frames per second, a 52% improvement over the current implementation.
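The throughput claim can be verified by a back-of-envelope calculation (our own helper, not the report's code): with double buffering, throughput is bounded by the 33.20ms capture time rather than the 50.68ms serial total.

```java
// Estimate of the double-buffering improvement: frame rate becomes limited
// by the QuickCam capture time alone, then compare against the current rate.
class BufferingEstimate {
    static double maxFps(double captureMs) {
        return 1000.0 / captureMs;  // capture fully overlaps processing
    }

    static double improvementPercent(double newFps, double oldFps) {
        return (newFps / oldFps - 1.0) * 100.0;
    }
}
```

1000/33.20 is about 30.1 frames per second, and 30.1/19.73 is roughly a 53% gain, matching the report's "around 30 frames, a 52% improvement".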

The control mechanism for this system would be more complex than the current implementation's. A shortage of pins is likely, due to the separate data and address pins needed for each of the SRAM chips. A possible workaround would be to stream the data from the QuickCam directly to the SRAM chips: by limiting the QuickCam's interaction with the Xilinx chip to control functions only, the pins currently used for the QuickCam data lines could be freed.

The second design involves using the internal Xilinx RAM to store the image data. Since we use a histogram approach, we can generate the histogram information as the data is transferred from the camera. Because the data arrives in row order, the y (vertical) position can be calculated as the image is being received, and we need to store no more than two values; more storage is required for the horizontal position calculation. This approach should run at around 30 fps and should eliminate the unit's external memory hardware.

The difficulty with implementing the second approach lies in the number of CLBs remaining in the chip. Calculating the vertical position requires only two 11-bit registers to be maintained, while calculating the horizontal position requires 128 11-bit registers. The Xilinx chip can store 32 bits per CLB when implementing RAM, so with the current algorithm implementation, an additional 45 CLBs would be needed. By optimizing the Verilog code of the current system and by integrating the modules into a single dedicated controlling module, it may be possible to free up these CLBs. The current implementation uses 399 of the 400 available CLBs in the Xilinx chip. Using a chip with a larger number of CLBs is also a possible solution.



Appendix A – Verilog Files

algor.v

//Group 1:  Autonomous Motion Tracking
//CPSC 483 - Spring 1999
//File:     algor.v
//Authors:  Randy Cuaycong and John Berglund
//Description: Algorithm module
//  This module interfaces with the memory/control module to issue the following
//  commands: InitCamera -> GetFrame -> ColOp -> RowOp -> GetFrame....
//  In essence, the module will add all the grey scale values in a column and
//  compare column(n)-column(n-1) values. The most positive and most negative
//  values represent the largest change, which should be the object's edge. We
//  then can determine the X coordinate of the center. This is also performed
//  on the Rows to determine the Y coordinate.

module algmodule (clk, valid, data, ydir, yenable, ymagnitude, xdir, xenable,
                  xmagnitude, param, inst, ready, send, frameclk);

//INPUTS
//Xilinx
input clk;

//Memory and Camera Control
input valid;             // valid signal from the memory module
input [7:0] data;        // Data from the memory module

//Servo
input frameclk;          // Used to prevent servo update while servo
                         // frame clock is asserted and incrementing

//OUTPUTS
//To Servo
output ydir;             // Direction of movement Y
output yenable;          // Strobes the servo module, RTOA, to accept values
output [1:0] ymagnitude; // How much to move in the Y direction
output xdir;             // Direction of movement X
output xenable;          // Strobes the servo module, RTOA, to accept values
output [1:0] xmagnitude; // How much to move in the X direction

//To Memory and Camera Control
output [6:0] param;      // Parameter for command. Mainly used to increment
                         // current row or column read from memory.
output [1:0] inst;       // Instruction bus to the memory module
output ready;            // Used in data handshaking with memory module
output send;             // High when param and inst are valid

//DECLARATIONS
reg ydir;
reg yenable;
reg [1:0] ymagnitude;
reg xdir;
reg xenable;
reg [1:0] xmagnitude;
reg [6:0] param;
reg [1:0] inst;
reg ready;
reg send;

//INTERNAL VARIABLES
reg [10:0] prev_sum;     // Contains the sum of the prev col in relation to
                         // the current sum. For use with ColOp
reg [10:0] cur_sum;      // Contains the sum of current Row or Col.
reg [10:0] next_sum;     // Contains the sum of the next column in relation
                         // to the current sum. For use with ColOp.
                         // NOTE: prev_sum and next_sum are needed because
                         // in get column the memory returns two column values
                         // at a time. Example: 0 returns columns 0,1 / 1 returns
                         // columns 2,3. Therefore registers are needed to
                         // perform calculations of two columns at the same time.
reg [10:0] max;          // Holds the most positive summed value
reg [10:0] min;          // Holds the most negative summed value
reg [7:0] minpos;        // Holds the position of the most negative row/col
                         // MUST be 8 bits to hold minpos+maxpos
reg [6:0] maxpos;        // Holds the position of the most positive row/col
reg [6:0] count;
reg [5:0] com_state;     // State machine internal variable

//PARAMETERS
//Instructions to the main control
parameter InitCamera=0, GetVideoFrame=1, GetRow=2, GetColumn=3;

//Define values for state
parameter Initialize=0, RowOp=1, ColOp=2, Frame=3;

//Define values for com_state
parameter I_A=0, I_B=1, I_C=2, I_D=3,
          F_A=5, F_B=6, F_C=7, F_D=8,
          R_INIT=9, R_A=10, R_B=11, R_C=12, R_D=13, R_E=14, R_F=15, R_G=16,
          R_H=17, R_I1=18, R_I2=19, R_J=20, R_K=21, R_L=22, R_M=23, R_N=24,
          R_O=25, R_P=26, R_Q=50,
          C_INIT=27, C_A=28, C_B=29, C_C=30, C_D=31, C_F=33, C_G=34, C_H=35,
          C_I1=36, C_I2=37, C_J=38, C_K1=39, C_K2=40, C_L=41, C_M=42, C_N=43,
          C_O=44, C_P=45, C_Q=46, C_R=47, C_S=48, C_T=49;

//Internal constants
parameter num_rows=120, num_2cols=64;
parameter y_center=60, x_center=31;

always @(posedge clk)
begin
  case(com_state)

    I_A:
      begin
        yenable=1;                 //Send signal to reset y servo
        xenable=1;                 //Send signal to reset x servo
        if(valid)                  //Wait until mem_module is ready for instruction
          begin
            inst=InitCamera;       //Place instruction on instruction bus
            com_state=I_B;         //Go to next state
          end
      end
    I_B:
      begin
        yenable=0;                 //Lower servo signals
        xenable=0;
        send=1;                    //Validate the instruction on instruction bus (InitCamera)
        com_state=I_C;
      end
    I_C:
      begin
        if(!valid)                 //Wait for mem_module to acknowledge instruction
          begin
            com_state=I_D;
          end
      end
    I_D:
      begin
        send=0;                    //Take the instruction off of the bus (Unvalidates instruction)
        if(valid)                  //Wait for instruction to finish (valid=1 when instruction complete)
          begin
            com_state=F_A;         //Begin GetVideoFrame
          end
      end
    F_A:
      begin
        if(valid)                  //Wait until mem_module is ready for an instruction
          begin
            inst=GetVideoFrame;    //Place instruction on instruction bus
            com_state=F_B;
          end
      end
    F_B:
      begin
        send=1;                    //Validate instruction on bus (GetVideoFrame)
        com_state=F_C;
      end
    F_C:
      begin
        if(!valid)                 //Wait for mem_module to acknowledge the instruction
          begin
            com_state=F_D;
          end
      end
    F_D:
      begin
        send=0;                    //Take the instruction off of the bus (Unvalidates instruction)
        if(valid)                  //Wait for instruction to complete
          begin
            com_state=R_INIT;
          end
      end

    R_INIT:
      begin                        //Initialize variables
        param=0;                   //param represents the current row number
        prev_sum=0;
        ydir=0;
        yenable=0;
        ymagnitude=0;
        maxpos=0;                  //Represents the top of the object
        minpos=119;                //Represents the bottom of the object
        max=50;                    //Set buffer for max
        min=50;                    //Set buffer for min
        com_state=R_A;
      end
    R_A:                           //This is the beginning state for each row. We return here
                                   //until param = 120, at which time all rows have been summed.
      begin
        if(valid)                  //Wait until mem_module is ready for instruction
          begin
            inst=GetRow;           //Place instruction on instruction bus
            count=0;               //count is the number of bytes received from memory
            cur_sum=0;
            com_state=R_B;
          end
      end
    R_B:
      begin
        send=1;                    //Validate instruction on bus
        com_state=R_C;
      end
    R_C:
      begin
        if(valid==0)               //Wait for mem_module to acknowledge instruction
          begin
            com_state=R_D;
          end
      end
    R_D:
      begin
        ready=1;                   //Acknowledge acknowledgement and wait for data
        if(valid==1)               //Wait for data to be valid
          begin
            cur_sum = cur_sum + data[3:0];  //Add pixel value to cur_sum
            com_state=R_E;
          end
      end
    R_E:
      begin
        cur_sum=cur_sum+data[7:4]; //Add pixel value to cur_sum
        com_state=R_F;
      end
    R_F:
      begin
        ready=0;                   //Tell mem_module we are done with data
        count=count+1;             //Increment counter (number of bytes received)
        com_state=R_G;
      end
    R_G:
      begin
        if(valid==0)               //Wait for mem_module to acknowledge our completion of data transfer
          begin
            if(count==num_2cols)   //Test if we have summed all 64 bytes (128 pixels)
              com_state=R_H;       //Finished with current row
            else
              com_state=R_C;       //Not finished, grab next data
          end
      end
    R_H:
      begin
        send=0;                    //Take instruction off of instruction bus (Unvalidate it)
        if(param!=0)               //Make sure prev_sum is initialized correctly
          begin
            if(prev_sum>cur_sum)   //Negative Delta
              begin
                prev_sum=prev_sum-cur_sum;  //prev_sum=delta
                com_state=R_I1;
              end
            else                   //Positive Delta
              begin
                prev_sum=cur_sum-prev_sum;  //prev_sum=delta
                com_state=R_I2;
              end
          end
        else
          begin
            com_state=R_J;
          end
      end
    R_I1:                          //Negative Delta
      begin
        if(prev_sum>min)           //prev_sum = delta for this row
                                   //Test if delta is a minimum
          begin
            min=prev_sum;          //Set min to current delta
            minpos=param;          //Set minpos to current row
          end
        com_state=R_J;
      end
    R_I2:                          //Positive Delta
      begin
        if(prev_sum>max)           //Test if delta is a maximum
          begin
            max=prev_sum;          //Set max to current delta
            maxpos=param;          //Set maxpos to current row
          end
        com_state=R_J;
      end
    R_J:
      begin
        param=param+1;             //Proceed to next row
        com_state=R_K;
      end
    R_K:
      begin
        prev_sum = cur_sum;        //Set prev_sum for next row
        if(param==num_rows)        //When param == 120, we are done with all rows
          com_state=R_L;           //Finish this frame
        else
          com_state=R_A;           //Start a new row
      end
    R_L:
      begin
        minpos=minpos+maxpos;      //Compute middle of object
        com_state=R_M;
      end
    R_M:
      begin
        minpos=minpos >> 1;        //minpos=sum/2
        com_state=R_N;
      end
    R_N:
      begin
        if(minpos>y_center)        //Determine which direction to move the servo
          begin
            minpos=minpos-y_center; //Compute offset from center of object and frame
            ydir = 0;              //Move servo up
          end
        else
          begin
            minpos=y_center-minpos; //Compute offset from center of object and frame
            ydir = 1;              //Move servo down
          end
        com_state=R_O;
      end
    R_O:
      begin                        //Determine how far to move the servo
        if(minpos[6]==1)
          ymagnitude=3;
        else if(minpos[5]==1)
          ymagnitude=2;
        else if(minpos[4]==1)
          ymagnitude=1;
        else
          ymagnitude=0;
        com_state=R_P;
      end
    R_P:
      begin
        if(!frameclk)              //Wait until servo_module isn't sending a pulse
          com_state=R_Q;
      end
    R_Q:
      begin
        yenable=1;                 //Tell servo to update position
        com_state=C_INIT;          //Begin column operation
      end
    C_INIT:
      begin
        param=0;
        prev_sum=0;
        xdir=0;
        xenable=0;
        xmagnitude=0;
        maxpos=0;                  //Represents the left side of the object
        minpos=63;                 //Represents the right side of the object
        max=50;                    //Set buffer for max
        min=50;                    //Set buffer for min
        com_state=C_A;
      end
    C_A:
      begin
        if(valid)                  //Wait until mem_module is ready for an instruction
          begin
            inst=GetColumn;
            count=0;
            cur_sum=0;             //Column 1's sum
            next_sum=0;            //Column 2's sum
            com_state=C_B;
          end
      end
    C_B:
      begin
        send=1;                    //Validate instruction on instruction bus
        com_state=C_C;
      end
    C_C:
      begin
        if(valid==0)               //Wait for mem_module to acknowledge instruction
          begin
            com_state=C_D;
          end
      end
    C_D:
      begin
        ready=1;                   //Acknowledge acknowledgement and wait for data
        if(valid==1)               //Wait until mem_module validates data
          begin
            cur_sum=cur_sum+data[7:4];   //Add data to cur_sum and next_sum
            next_sum=next_sum+data[3:0];
            com_state=C_F;
          end
      end
    C_F:
      begin
        ready=0;                   //Tell mem_module we are done with the data
        count=count+1;             //Increment counter (number of bytes received)
        com_state=C_G;
      end
    C_G:
      begin
        if(valid==0)               //Wait for mem_module to acknowledge our completion of data transfer
          begin
            if(count==num_rows)    //Test if we are done with this pair of columns
              com_state=C_H;       //Finished with current pair of columns
            else
              com_state=C_C;       //Grab next data
          end
      end
    C_H:
      begin
        send=0;                    //Take instruction off of instruction bus (Unvalidate)
        if(param!=0)
          begin                    //Generate delta for first column of this set
            if(prev_sum>cur_sum)   //Negative Delta
              begin
                prev_sum=prev_sum-cur_sum;  //Generate delta
                com_state=C_I1;
              end
            else                   //Positive Delta
              begin
                prev_sum=cur_sum-prev_sum;  //Generate delta
                com_state=C_I2;
              end
          end
        else
          begin
            com_state=C_J;
          end
      end
    C_I1:                          //Negative Delta
      begin
        if(prev_sum>min)           //Test if delta is a minimum
          begin
            min=prev_sum;          //Set min to current delta
            minpos=param;          //Set minpos to current column set
          end
        com_state=C_J;
      end
    C_I2:                          //Positive Delta
      begin
        if(prev_sum>max)           //Test if delta is a maximum
          begin
            max=prev_sum;          //Set max to current delta
            maxpos=param;          //Set maxpos to current column set
          end
        com_state=C_J;
      end
    C_J:
      begin                        //Generate the delta for the second column of the set
        if(cur_sum>next_sum)       //Negative delta
          begin
            cur_sum=cur_sum-next_sum;
            com_state=C_K1;
          end
        else                       //Positive delta
          begin
            cur_sum=next_sum-cur_sum;
            com_state=C_K2;
          end
      end
    C_K1:                          //Negative Delta
      begin
        if(cur_sum>min)            //Test if delta is a minimum
          begin
            min=cur_sum;           //Set min to the current delta
            minpos=param;          //Set minpos to the current column set
          end
        com_state=C_L;
      end
    C_K2:                          //Positive Delta
      begin
        if(cur_sum>max)            //Test if delta is a maximum
          begin
            max=cur_sum;           //Set max to the current delta
            maxpos=param;          //Set maxpos to the current column set
          end
        com_state=C_L;
      end
    C_L:
      begin
        param=param+1;             //Proceed to the next column
        com_state=C_M;
      end
    C_M:
      begin
        prev_sum = next_sum;       //Set prev_sum to second column's value
        if(param==num_2cols)       //Test if we are done with all column sets
          com_state=C_N;           //Finish this frame
        else
          com_state=C_A;           //Proceed to next column set
      end
    C_N:
      begin
        minpos=minpos+maxpos;      //Compute middle of object
        com_state=C_P;
      end
    C_P:
      begin
        minpos=minpos >> 1;        //minpos=sum/2
        com_state=C_Q;
      end
    C_Q:
      begin
        if(minpos>x_center)        //Determine which direction to move
          begin
            minpos=minpos-x_center; //Generate offset
            xdir=0;                //Move right
          end
        else
          begin
            minpos=x_center-minpos; //Generate offset
            xdir=1;                //Move left
          end
        com_state=C_R;
      end
    C_R:                           //Determine how far to move the servo
      begin
        if(minpos[4]==1)
          xmagnitude=2;
        else if(minpos[3]==1)
          xmagnitude=2;
        else if(minpos[2]==1)
          xmagnitude=1;
        else
          xmagnitude=0;
        com_state=C_S;
      end
    C_S:
      begin
        if(frameclk)               //Wait until servo_module isn't sending pulse to servo
          com_state=C_T;
      end
    C_T:
      begin
        xenable=1;                 //Tell servo_module to update position
        com_state=F_A;             //Time to get a new picture
      end
  endcase
end

endmodule
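The delta scheme that the algmodule header describes can be sketched as software (a hypothetical Python model, not part of the delivered Verilog): sum each row or column, take adjacent differences, treat the largest positive and negative deltas as the object's edges, and take the midpoint of their positions as the object center. The `threshold` default mirrors the "buffer" value of 50 loaded into `max` and `min` in R_INIT.

```python
# Hypothetical Python model of the edge-finding scheme in algmodule: the
# strongest positive and negative adjacent-sum deltas mark the object's
# edges, and the midpoint of their positions approximates its center.

def find_center(sums, threshold=50):
    """sums: per-row or per-column grey-scale totals.
    Returns the index midway between the strongest rising and falling edges."""
    max_delta, min_delta = threshold, threshold   # 'buffer' values, as in R_INIT
    maxpos, minpos = 0, len(sums) - 1
    for i in range(1, len(sums)):
        delta = sums[i] - sums[i - 1]
        if delta > max_delta:                     # strongest positive change
            max_delta, maxpos = delta, i
        elif -delta > min_delta:                  # strongest negative change
            min_delta, minpos = -delta, i
    return (maxpos + minpos) // 2                 # middle of the object
```

For example, a bright band occupying rows 40-59 of an otherwise dark frame yields a rising edge at 40 and a falling edge at 60, so the returned center is 50.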

cont.v

//Group 1: Autonomous Motion Tracking
//CPSC 483 - Spring 1999
//File: Cont.v
//Authors: Wesley Day and Andrew Fikes
//Description: Memory and Camera Control Module (MCCM)
//   This module interfaces with the algorithm module to execute the following
//   commands: InitCamera, GetFrame, GetRow, and GetColumn.
//   It insulates the algorithm module from the details of interfacing with
//   the camera control module and memory.
//   At start-up the algorithm should first send the CmdInitCam command to
//   initialize the camera. Then it should send CmdGetVideoFrame to get a
//   picture frame dumped into memory. It should then send CmdGetRow or
//   CmdGetColumn to asynchronously read the picture from memory one row or
//   one column at a time. The parameter alg_param indicates which row or
//   which column will be retrieved.

module control(clk, cam_data, cam_hold, cam_valid, cam_cmd, cam_start,
               alg_instr, alg_rdy, alg_frame, alg_param, alg_valid,
               alg_data_out, alg_send, sram_data_in, sram_address, chip_enable);
//SRAM address format: [FRAME[15:13], Y[12:6], X[5:0]]

//INPUTS
input clk;

//QuickCam Interface
input cam_hold;           //High when the camera control module is busy
input cam_valid;          //High when good data is on the bus
input [7:0] cam_data;     //Pixel data

//Algorithm
input alg_send;           //send flag from algorithm module
input alg_rdy;            //ready flag from algorithm module
input [1:0] alg_instr;    //instruction from algorithm module
input [2:0] alg_frame;    //frame number from algorithm module
input [6:0] alg_param;    //parameter from algorithm module

//SRAM
input [7:0] sram_data_in; //data from SRAM

//OUTPUTS
//QuickCam Interface
output cam_cmd;           //Command to the Camera Control Module
output cam_start;         //Send an instruction to the Camera Control Module

//Algorithm
output alg_valid;         //valid signal to algorithm module
output [7:0] alg_data_out; //data to the algorithm module

//SRAM
output chip_enable;       //chip enable pin to one of the SRAM chips
output [15:0] sram_address; //address for SRAM

//DECLARATIONS
reg cam_cmd;
reg cam_start;
reg alg_valid;
reg [7:0] alg_data_out;
reg chip_enable;
reg [15:0] sram_address;

//INTERNAL VARIABLES
reg [3:0] state;          //state machine variable

//PARAMETERS
//Instructions from the Algorithm
parameter CmdInitCam=0, CmdGetVideoFrame=1, CmdGetRow=2, CmdGetColumn=3;

//States
parameter Idle=0, InitCam1=1, InitCam2=2, InitCam3=3, GetVideoFrame1=4,
          GetVideoFrame2=5, GetVideoFrame3=6, GetVideoFrame4=7, GetRow=8,
          GetColumn=9, ReadData1=10, ReadData2=11;

//IMPLEMENTATION
always @(posedge clk)
begin
  chip_enable=!(sram_address[15]);  //Always select one chip or the other

  case(state)

    Idle:                 //Idle state, waiting for something to do
      begin
        alg_valid=1;      //Indicates module is idle in this context
        if(alg_send)      //If the algorithm is sending a command
          case(alg_instr)
            CmdInitCam:        //Initialize the camera
              state=InitCam1;
            CmdGetVideoFrame:  //Get a picture from the camera and put it in memory
              state=GetVideoFrame1;
            CmdGetRow:         //Send a row to the algorithm
              state=GetRow;
            CmdGetColumn:      //Send a column to the algorithm
              state=GetColumn;
          endcase
      end

    InitCam1:
      begin
        cam_cmd=CmdInitCam;  //Command sent to the camera module
        alg_valid=0;         //For handshaking with the algorithm module
        state=InitCam2;      //Move on to the next state
      end

    InitCam2:
      begin
        cam_start=1;         //Send the instruction to the camera module
        if(cam_hold)         //Wait for the camera module to raise hold
          begin
            state=InitCam3;
          end
      end

    InitCam3:
      begin
        cam_start=0;         //Lower start, prevents multiple instruction execution
        if(!alg_send && !cam_hold)  //If this instruction is done executing,
          begin                     //go wait for another one
            alg_valid=1;
            state=Idle;
          end
      end

    GetVideoFrame1:
      begin                  //Calculate memory base address
        sram_address[15:13]=alg_frame[2:0];
        sram_address[12:0]=0;        //First pixel in the frame
        cam_cmd=CmdGetVideoFrame;    //Command for the camera module
        alg_valid=0;                 //For handshaking with the algorithm module
        state=GetVideoFrame2;
      end

    GetVideoFrame2:
      begin
        cam_start=1;         //Send the instruction to the camera
        if(cam_hold)         //Wait for the camera control module to raise hold
          begin
            state=GetVideoFrame3;
          end
      end

    GetVideoFrame3:
      begin
        cam_start=0;         //Lower start, prevents multiple instruction execution
        if(cam_valid || !cam_hold)  //Wait for a valid byte of data or end of frame
          begin
            state=GetVideoFrame4;
          end
      end

    GetVideoFrame4:
      begin
        if(!cam_valid && cam_hold)  //Wait for valid to drop
          begin
            sram_address[12:0]=sram_address[12:0]+1;  //Prepare to write to the next address
            state=GetVideoFrame3;
          end
        else if(!cam_valid && !cam_hold && !alg_send)  //If done getting the frame
          begin
            alg_valid=1;
            state=Idle;
          end
      end

    GetRow:                  //Send a row to the algorithm one byte at a time
      begin                  //Calculate memory base address
        sram_address[15:13]=alg_frame[2:0];
        sram_address[12:6]=alg_param[6:0];  //Set Y position of first pixel
        sram_address[5:0]='b000000;         //Set X position of first pixel
        alg_valid=0;                        //Not ready with a byte yet
        state=ReadData1;
      end

    GetColumn:               //Send a column to the algorithm one byte at a time
      begin                  //Calculate memory base address
        sram_address[15:13]=alg_frame[2:0];
        sram_address[12:6]='b0000000;       //Set Y position of first pixel
        sram_address[5:0]=alg_param[5:0];   //Set X position of first pixel
        alg_valid=0;                        //Not ready with a byte yet
        state=ReadData1;
      end

    ReadData1:               //Send data to the algorithm module
      begin
        if(!alg_send)        //If done sending data, wait for next instruction
          begin
            alg_valid=1;
            state=Idle;
          end
        else
          begin
            alg_valid=0;     //For handshaking with the algorithm module
            if(alg_rdy)      //If the algorithm is ready for the next byte
              begin
                alg_data_out=sram_data_in;  //Latch the data from the SRAM
                state=ReadData2;
              end
          end
      end

    ReadData2:
      begin
        alg_valid=1;         //Indicate the data is ready
        if(!alg_rdy)         //Wait for the algorithm to indicate it has latched the data
          begin
            if(alg_instr==CmdGetRow)          //If sending a row, increment the column
              begin
                sram_address[5:0]=sram_address[5:0]+'b1;
              end
            else if(alg_instr==CmdGetColumn)  //If sending a column, increment the row
              begin
                sram_address[12:6]=sram_address[12:6]+'b1;
              end
            state=ReadData1;
          end
      end
  endcase  //(state)
end  //always
endmodule

control.v

//Group 1: Autonomous Motion Tracking
//CPSC 483 - Spring 1999
//File: control.v
//Authors: Andrew Fikes and Kamran Shah
//Description: Control to QuickCam Interface
//Last Edit: 04/02/99; K.S.

module intr_control(cntr_clk, cntr_cmd, cntr_start, cntr_intrHold, cntr_hold,
                    cntr_rom_select, cntr_intrStart, cntr_intrSend);

//PARAMETERS
//States
parameter init=0, init_camera_1=1, init_camera_2=2, init_camera_3=3,
          init_camera_4=4, send_video_1=5, send_video_2=6, send_video_3=7;

//INPUTS
//Xilinx
input cntr_clk;
input cntr_cmd;
input cntr_start;
//Interface module
input cntr_intrHold;

//OUTPUTS
//Xilinx
output cntr_hold;
output [3:0] cntr_rom_select;
//Interface module
output cntr_intrStart;
output cntr_intrSend;

//DECLARATIONS
//Outputs
reg cntr_hold;
reg [3:0] cntr_rom_select;
reg cntr_intrStart;
reg cntr_intrSend;

//INTERNAL VARIABLES
reg [2:0] state;
reg [3:0] counter;

always @(posedge cntr_clk)
begin
  case(state)

    //Initialization state
    init:
      begin
        //If command is 0, initialize camera
        if((cntr_start==1)&&(cntr_cmd==0))
          begin
            cntr_hold=1;
            counter=0;
            state=init_camera_1;   //Next state
          end
        //If command is 1, get a video frame
        if((cntr_start==1)&&(cntr_cmd==1))
          begin
            cntr_hold=1;
            //Specifies the proper instruction from the ROM
            //in the controlling module so a send video
            //instruction is processed.
            cntr_rom_select=9;
            state=send_video_1;    //Next state
          end
      end

    //Specifies the proper instruction from the
    //ROM in the controlling module.
    init_camera_1:
      begin
        cntr_rom_select=counter;
        state=init_camera_2;       //Next state
      end

    //Send start signal to the camera interface module.
    init_camera_2:
      begin
        cntr_intrStart=1;
        state=init_camera_3;       //Next state
      end

    //Wait until the controlling module indicates that
    //the instruction is being processed, then deassert
    //the start signal.
    init_camera_3:
      begin
        if(cntr_intrHold==1)
          begin
            cntr_intrStart=0;
            state=init_camera_4;   //Next state
          end
      end

    //Wait until the interface module indicates that the
    //instruction has been completed. This is indicated
    //by deasserting its hold signal.
    init_camera_4:
      begin
        if(cntr_intrHold==0)
          //If this was the last instruction in the initialization
          //sequence, inform the controlling logic that we are
          //done by deasserting hold.
          if(cntr_rom_select==8)
            begin
              cntr_hold=0;
              state=init;          //Next state
            end
          //If this is not the last instruction in the initialization
          //sequence, increment the counter for selecting the ROM in
          //the controlling logic. This will specify the next instruction
          //to be sent to the camera.
          else
            begin
              counter=counter+1;
              state=init_camera_1; //Next state
            end
      end

    //Initialize for receiving a video frame. Set the start signal high
    //so the interface module will start processing the instruction.
    send_video_1:
      begin
        cntr_intrStart=1;
        state=send_video_2;        //Next state
      end

    //Wait until the interface module indicates that the
    //instruction is being processed (by asserting its hold
    //line), then deassert the start signal.
    send_video_2:
      begin
        if(cntr_intrHold==1)
          begin
            cntr_intrStart=0;
            state=send_video_3;    //Next state
          end
      end

    //Wait until the video frame has been received, indicated by
    //the controlling logic deasserting hold.
    send_video_3:
      begin
        if(cntr_intrHold==0)
          begin
            cntr_hold=0;
            state=init;            //Next state
          end
      end
  endcase
end
endmodule

intrface.v

//Group 1: Autonomous Motion Tracking
//CPSC 483 - Spring 1999
//File: intrface.v
//Authors: Andrew Fikes and Kamran Shah
//Description: Low-Level QuickCam Interface to Xilinx FPGA

module intrface(I_Clk, I_Start, I_Instr, I_Param, I_Nibble, I_Camrdy, I_Transfers,
                I_Hold, I_Valid, I_Command, I_Data, I_Reset, I_Pcack);

//PARAMETERS
//States
parameter init=0, instr_send=1, wait_nb1=2, valid_nb1=3, wait_nb2=4,
          valid_nb2=5, param_send_rcv=6, trans_video=7, deassert_valid=8,
          video_end_pulse=9, instr_reset=10, wait_not_start=11,
          deassert_valid2=12;
//Commands
parameter SendVideoFrame=7, SetExposure=11, SetTop=13, SetLeft=15,
          SetNumV=17, SetNumH=19, SendVersion=23, SetContrast=25,
          AutoAdjustOffset=27, BytePortEcho=29, SetOffset=31,
          GetOffset=33;

//3usec definition (see video_end_pulse state for explanation)
parameter three_usec=2;

//Instruction types
parameter write_param=0, read_param=1, nyb_video=2;

//INPUTS
//Xilinx
input I_Clk;
input I_Start;
input [3:0] I_Instr;
input [7:0] I_Param;
//QuickCam
input [3:0] I_Nibble;
input I_Camrdy;
input [14:0] I_Transfers;

//OUTPUTS
//Xilinx
output I_Hold;
output I_Valid;
//QuickCam
output [7:0] I_Command;
output [7:0] I_Data;
output I_Reset;
output I_Pcack;

//DECLARATIONS
//Outputs
reg I_Hold;
reg I_Valid;
reg [7:0] I_Command;
reg [7:0] I_Data;
reg I_Reset;
reg I_Pcack;

//INTERNAL VARIABLES
reg [3:0] state;
reg [1:0] instr_type;
reg instr_pass;
reg video_pass;
reg [1:0] count;
reg [14:0] transfers;

//IMPLEMENTATION
always @(posedge I_Clk)
begin
  case(state)

    //The init state looks for the I_Start signal, and then asserts I_Hold
    //to indicate that the machine will accept no more instructions. Then,
    //if the instruction is not RESET, it calculates the appropriate camera
    //command and sets the instruction type variable to indicate
    //whether the instruction is write, read, or video.
    init:
      begin
        //Make sure I_Reset is always held high.
        I_Reset=1;

        //Handle all instructions except for RESET.
        if((I_Start==1)&&(I_Instr!=14))
          begin
            //Prevent another instruction from being sent.
            I_Hold=1;

            //Calculate I_Command. Command = (Instr*2)+7;
            I_Command=I_Instr;
            I_Command=I_Command<<1;
            I_Command=I_Command+7;

            //Determine instruction type.
            case(I_Command)
              SendVideoFrame: instr_type=nyb_video;
              GetOffset:      instr_type=read_param;
              SendVersion:    instr_type=read_param;
              default:        instr_type=write_param;
            endcase

            //Set next state.
            state=instr_send;
          end

        //Handle RESET instruction.
        if((I_Start==1)&&(I_Instr==14))
          begin
            I_Hold=1;
            state=instr_reset;  //Set next state
          end
      end

    //The instr_send state sets the instr_pass variable to indicate that
    //this is the first pass through the state machine. It then asserts
    //I_Pcack to tell the camera that the command on the command bus is valid.
    instr_send:
      begin
        instr_pass=0;
        I_Pcack=1;
        state=wait_nb1;  //Set next state
      end

    //The param_send_rcv state sets the instr_pass variable to indicate that
    //this is the last pass through the state machine. If the instruction
    //type is write_param or nyb_video, it places the parameter on the command bus.
    //It then asserts I_Pcack to tell the camera that the data on the bus is valid.
    param_send_rcv:
      begin
        instr_pass=1;
        I_Pcack=1;
        I_Command=I_Param;
        state=wait_nb1;  //Set next state
      end

    //The trans_video state controls the transfer of video data. The number
    //of transfers is specified in outside logic through the I_Transfers input.
    //This is used as a counter by the interface module. transfers is set to
    //I_Transfers in the valid_nb2 state. It is decremented in this state if it
    //is not equal to zero.
    trans_video:
      begin
        if(transfers==0)
          begin
            video_pass=0;
            I_Pcack=1;
            state=video_end_pulse;  //Set next state
          end
        else
          begin
            video_pass=1;
            I_Pcack=1;
            transfers=transfers-1;
            state=wait_nb1;  //Set next state
          end
      end

    //The deassert_valid states conclude data transfer between the intrface
    //module and outside logic by deasserting valid. Previously (in another
    //state), valid was asserted to indicate that the data on the bus was valid.
    //I_Valid is provided for two cycles so that outside logic can latch it properly.
    deassert_valid:
      begin
        state=deassert_valid2;
      end
    deassert_valid2:
      begin
        I_Valid=0;
        if(instr_type==read_param)
          state=wait_not_start;  //Set next state
        if(instr_type==nyb_video)
          state=trans_video;     //Set next state
      end

    //After a video transfer occurs, an I_Pcack pulse of length 3usec or greater
    //has to be sent to the camera to reset it properly. The video_end_pulse
    //state implements a count variable which is incremented on each clock tick.
    //When the counter equals the number of clocks needed for 3usec,
    //defaults are reset and control is transferred to the init state.
    //three_usec is a user-defined parameter and should be changed depending
    //on the frequency of the clock. Note: Maximum count == 2^(10)
    video_end_pulse:
      begin
        if(count==three_usec)
          begin
            I_Pcack=0;
            count=0;
            //Set next state
            state=wait_not_start;
          end
        count=count+1;
      end

    //The wait_nb1 state waits for the first nibble from the camera.
    //The camera places the nibble on the nibble bus, and then asserts
    //I_Camrdy to indicate that the data is valid. The returned
    //nibble is the most significant nibble of either an echoed
    //command/parameter or a returned value from a read_param
    //instruction.
    wait_nb1:
      begin
        if(I_Camrdy==1)
          state=valid_nb1;  //Set next state
      end

    //The valid_nb1 state is entered when the camera indicates that the
    //first nibble is valid. The nibble returned is placed on the upper
    //4 bits of the data line. I_Pcack is deasserted to indicate that the
    //nibble has been read. The wait_nb2 state is then entered to wait
    //for the second nibble.
    valid_nb1:
      begin
        I_Data[7:4]=I_Nibble[3:0];
        I_Pcack=0;
        state=wait_nb2;  //Set next state
      end

    //The wait_nb2 state is used to wait for the second nibble from the
    //camera. This state has many functions. When sending a command the
    //camera will echo the command back, and this state is used to wait
    //for the second nibble of this echo. When sending parameters the
    //camera will echo the parameter, and this state is used to wait for
    //the second nibble of this echo. When retrieving data this state is
    //used to wait for the second nibble of data.
    wait_nb2:
      begin
        if(I_Camrdy==0)
          state=valid_nb2;  //Set next state
      end

    //The valid_nb2 state is entered when the camera indicates that the
    //second nibble is valid. The nibble returned is placed on the lower
    //4 bits of the data line. I_Pcack is deasserted to indicate that the
    //nibble has been read. The next state entered depends on the
    //instruction pass and the instruction type.
    valid_nb2:
      begin
        //Store returned or echoed parameter
        I_Data[3:0]=I_Nibble[3:0];

        if(instr_pass==0)
          state=param_send_rcv;

        if(instr_pass==1)
          case(instr_type)
            read_param:
              begin
                I_Valid=1;
                state=deassert_valid;  //Set next state
              end
            nyb_video:
              begin
                if(video_pass==0)
                  begin
                    transfers=I_Transfers;
                    state=trans_video;  //Set next state
                  end
                if(video_pass==1)
                  begin
                    I_Valid=1;
                    state=deassert_valid;  //Set next state
                  end
              end
            default:
              begin
                state=wait_not_start;  //Set next state
              end
          endcase
      end

    //The instr_reset state lowers I_Reset to provide the falling edge
    //necessary to reset the camera. The camera must be reset before
    //any programming can occur.
    instr_reset:
      begin
        I_Reset=0;
        state=wait_not_start;  //Set next state
      end

    //The wait_not_start state is a safety state which checks that
    //the outside logic has dropped the start signal. If it has not, the
    //camera might accidentally reexecute the same instruction.
    wait_not_start:
      begin
        //Make sure I_Reset is high before going to the init state.
        if(I_Instr==14)
          I_Reset=1;

        if(I_Start==0)
          begin
            I_Hold=0;
            state=init;
          end
      end
  endcase
end

endmodule

pulsgen.v

//Group 1: Autonomous Motion Tracking
//CPSC 483 - Spring 1999
//File: Pulsegen.v
//Authors: Randy Cuaycong and John Berglund
//Description: Pulse-width modulated signal generator for servos
//   Given the constantly updated (absolute) input position from the rtox/rtoy
//   modules, pulsegen generates the minimum pulse length and adds extra length
//   depending on the value of the input. This module requires two clocks. The
//   first is a very fast clock which runs the pulse high-length counter. The
//   second is a frame clock, which determines how far apart the pulses occur.

module pulsegen (clk, fclk, angle, pulse);

//INPUTS
input clk;             // Clock input
input fclk;            // Frame clock input
input [7:0] angle;     // Input for the pulse length

//OUTPUTS
output pulse;          // Output signal to the servo

//DECLARATIONS
reg [8:0] pclk;        // Internal counter for length of pulse
reg pulse;             // Output to the servos themselves

parameter minpulse=60; // Defines minimum pulse length

always @(posedge clk)
begin
  if (fclk == 0)                  // Reset condition
    begin
      pclk=256-minpulse;          // We want pclk to equal 1|0000|0000
                                  // when pclk reaches minpulse
      pulse=0;                    // Pull the servo pulse low when minpulse reached
    end
  else if (pclk[8]==1)            // minpulse reached. Extend pulse for angle length
    begin
      if (pclk[7:0] == angle[7:0])  // Angle length reached
        pulse = 0;
      else                        // Increment towards angle length
        begin
          pclk = pclk + 1;
          pulse = 1;
        end
    end                           // minpulse reached
  else                            // Min pulse not reached
    begin                         // Note: fclk should be high
      pclk = pclk + 1;
      pulse = 1;
    end
end
endmodule
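The counting trick in pulsegen can be checked numerically (a hypothetical Python model of the counter, not part of the design): starting `pclk` at 256 - minpulse means bit 8 sets exactly when the minimum pulse length has elapsed, after which the low byte counts up from zero until it equals `angle`.

```python
# Hypothetical model of pulsegen's counter: starting pclk at 256 - minpulse
# makes bit 8 (pclk[8]) set exactly after minpulse fast-clock ticks, after
# which the pulse is extended until the low byte of pclk equals 'angle'.

MINPULSE = 60

def pulse_width(angle):
    """Count fast-clock ticks for which the output pulse stays high."""
    pclk = 256 - MINPULSE
    ticks = 0
    while True:
        if pclk & 0x100:                 # pclk[8]: minimum pulse reached
            if (pclk & 0xFF) == angle:   # angle length reached: drop the pulse
                return ticks
        pclk += 1                        # otherwise keep the pulse high
        ticks += 1
```

So the high time comes out to minpulse + angle fast-clock ticks; with the rtox/rtoy center value of 90, the pulse is high for 150 ticks.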

rtox.v

//Group 1: Autonomous Motion Tracking
//CPSC 483 - Spring 1999
//File: rtox.v
//Authors: Randy Cuaycong and John Berglund
//Description: "Reference to Absolute" converter.
//   The servos are pulse-width modulated; the duty cycle depends on the position
//   needed. Unfortunately, it is easier for the algorithm to work relative to the
//   current position of the camera. This module is responsible for keeping track
//   of the servo's current position and determining the absolute value when given
//   a direction and magnitude. Later, it was decided that this would also be a
//   good module in which to place bounds checking.
//Last Edited 4/23/99 Randy Cuaycong and John Berglund

module rtox (x_input, x_enable, x_output, x_dir);

//INPUTS From algorithm module
input [1:0] x_input;   // Input offset (reference)
input x_enable;        // Input from the algorithm module which determines
                       // whether or not the input values are valid
input x_dir;           // Determines direction

//OUTPUTS To pulsegen
output [7:0] x_output; // Output offset (absolute)

//DECLARATIONS
reg [7:0] x_output;

//INTERNAL VARIABLES
reg start;             // Flag used to center camera on start-up

//PARAMETERS
parameter x_min=45, x_max=132, x_center=90; // Range values - can be tweaked

always @(posedge x_enable)
begin
  if(!start)           // On power-up, start=0. Center and set start=1
    begin
      x_output=x_center;
      start=1;
    end
  else
    begin
      if(x_dir)        // Add or subtract the magnitude from the current position
        x_output = x_output + x_input;
      else
        x_output = x_output - x_input;

      // Bounds checking. We used a series of or'ed equalities because Xilinx does
      // not seem to like multiple comparisons on the same clock cycle.
      if((x_output==x_max)||(x_output==x_max+1)||(x_output==x_max+2)||
         (x_output==x_min)||(x_output==x_min-1)||(x_output==x_min-2))
        begin
          x_output=x_center;  // If the camera has reached its maximum viewing range
                              // on either X or Y, center the camera and continue.
        end
    end
end
endmodule

rtoy.v//Group 1: Autonomous Motion Tracking//CPSC 483 - Spring 1999//File: rtoy.v//Authors: Randy Cuaycong and John Berglund//Description: "Reference to Absolute" converter.// The servos are pulse width modulated where its duty cycle depends on the position// needed. Unfortunately, it is easier for the algorithm to work in reference to the// current position of the camera. This module is responsible for keeping track// of the servo's current position and determing the absolute value when given a // direction and magnitude. Later, it was decided that this also would be a good// module to place bounds checking.//Last Edited 4/23/99 Randy Cuaycong and John Berglund

module rtoy (y_input,y_enable,y_output,y_dir);

//INPUTS From algorithm module
input [1:0] y_input; // Input offset (reference) from algorithm
input y_enable;      // Input from the algorithm module which will
                     // determine whether or not the input values are valid.
input y_dir;         // Determines direction

//OUTPUTS To pulsegen
output [7:0] y_output; // output offset (absolute) to pulsegen

//DECLARATIONS
reg [7:0] y_output;

//INTERNAL VARIABLES
reg start; // Flag used to center camera at powerup


//PARAMETERS
parameter y_min=45, y_max=132, y_center=90; // range values - can be tweaked

always @(posedge y_enable)
begin
  if(!start) // On power-up, start=0. Center and set start=1
  begin
    y_output=y_center;
    start=1;
  end
  else
  begin
    if(y_dir) // Add or subtract the magnitude from the current position
      y_output = y_output + y_input;
    else
      y_output = y_output - y_input;
    // Bounds checking. We used a series of or'ed equalities because Xilinx does
    // not seem to like multiple comparisons in the same clock cycle.
    if((y_output==y_max)||(y_output==y_max+1)||(y_output==y_max+2)||
       (y_output==y_min)||(y_output==y_min-1)||(y_output==y_min-2))
    begin
      y_output=y_center; // If camera has reached maximum viewing range on either
                         // X or Y, center the camera and continue.
    end
  end
end
endmodule


Appendix B – Project Schematics

Main Schematic

Part A – Algorithm and Servo Control Modules



Part B – Camera Interface Module

Part C – Clocks


Part D – Memory and Camera Control Module


Part E – Debug


Appendix C – Test Drivers and Stubs

test.bs2
'Group 1: Autonomous Motion Tracking
'CPSC 483 - Spring 1999
'File: cameraII.bs2
'Author: Andrew Fikes
'Description: BS2 test code for Camera Interface II

'NOTE: This file was written based on a 76x60x4 image.

'Name Output Pins
command con 04
start con 05

'Name Input Pins
enable con 00
hold con 06
valid con 07
data0 con 08
data1 con 09
data2 con 10
data3 con 11
data4 con 12
data5 con 13
data6 con 14
data7 con 15

'Assign Output Pins
output command
output start

'Assign Input Pins
input enable
input hold
input valid
input data0
input data1
input data2
input data3
input data4
input data5
input data6
input data7

counter var word

'******************************************************************
'0. WAIT FOR ENABLE
'******************************************************************
loop0:
if IN0=0 then loop0
serout 16,16390,["ENABLED",10,13]

'******************************************************************
'1. INITIALIZE CAMERA
'******************************************************************
'Send "Initialize" command
low command

'Send start signal
high start

'Wait for confirmation of hold and drop start
wait_hold_high1:
if IN6=0 then wait_hold_high1
low start


'Wait for hold to drop
wait_hold_low1:
if IN6=1 then wait_hold_low1

serout 16,16390,["INITIALIZED",10,13]

'******************************************************************
'2. WAIT FOR NOT ENABLE
'******************************************************************
loop2:
if IN0=1 then loop2
serout 16,16390,["ENABLED",10,13]

'******************************************************************
'3. PICTURE COMMAND
'******************************************************************
'Send "Picture" command
high command

'Send start signal
high start

'Wait for confirmation of hold and drop start
wait_hold_high3:
if IN6=0 then wait_hold_high3
low start

'Print picture
print_picture3:

'Test for valid high
wait_valid_high3:
if IN6=0 then finish_picture3
if IN7=0 then wait_valid_high3
serout 16,16390,[hex INH]

'Test for valid low
wait_valid_low3:
if IN6=0 then finish_picture3
if IN7=1 then wait_valid_low3
counter=counter+1

'Insert line breaks
if counter//38<>0 then print_picture3
serout 16,16390,[10,13]
if IN6=1 then print_picture3

finish_picture3:
serout 16,16390,["PICTURE 1 FINISHED",10,13]

cammem.bs2
'Group 1: Autonomous Motion Tracking
'CPSC 483 - Spring 1999
'File: cam_mem.bs2
'Author: Andrew Fikes & Kamran Shah
'Description: Code to test camera & memory modules

'Name Output Pins
alg_instr0 con 01
alg_instr1 con 02
alg_rdy con 04
alg_send con 05
cnt_incr con 06

'Name Input Pins
enable con 00
alg_valid con 07
alg_data4 con 12


alg_data5 con 13
alg_data6 con 14
alg_data7 con 15

'Assign Output Pins
output alg_instr0
output alg_instr1
output alg_rdy
output alg_send
output cnt_incr

'Assign Input Pins
input enable
input alg_valid
input alg_data4
input alg_data5
input alg_data6
input alg_data7

'Internal variables
i var word
j var word

'******************************************************************
'* INITIALIZE OUTPUTS
'******************************************************************
low alg_instr0
low alg_instr1
low alg_rdy
low alg_send
low cnt_incr

'******************************************************************
'0. WAIT FOR ENABLE
'******************************************************************
loop0:
if IN0=0 then loop0
serout 16,16390,["ENABLED",10,13]

'******************************************************************
'2. INITIALIZE CAMERA
'******************************************************************
'Send "Initialize" command
low alg_instr0
low alg_instr1
high alg_send

'Wait for "valid" low and drop "alg_send"
wait_valid_low2:
if IN7=1 then wait_valid_low2
low alg_send

'Wait for "valid" high
wait_valid_high2:
if IN7=0 then wait_valid_high2

serout 16,16390,["CAMERA INITIALIZED",10,13]

'******************************************************************
'3. WAIT FOR NOT ENABLE
'******************************************************************
loop3:
if IN0=1 then loop3
serout 16,16390,["ENABLED",10,13]

'******************************************************************
'4. TAKE AND STORE PICTURE IN FRAME 0
'******************************************************************
'Send "PICTURE" command


high alg_instr0
low alg_instr1
high alg_send

'Wait for "valid" low and drop "alg_send"
wait_valid_low4:
if IN7=1 then wait_valid_low4
low alg_send

'Wait for "valid" high
wait_valid_high4:
if IN7=0 then wait_valid_high4

serout 16,16390,["PICTURE STORED",10,13]

'******************************************************************
'5. RETRIEVE FRAME 0 BY ROWS
'******************************************************************
i=0
j=0

loop5:
'Send "ROW" command
low alg_instr0
high alg_instr1
high alg_send

'Wait for "valid" low
wait_valid_low5a:
if IN7=1 then wait_valid_low5a

data_loop5:
high alg_rdy

'Wait for "valid" high
wait_valid_high5:
if IN7=0 then wait_valid_high5

'Print data and drop alg_rdy
serout 16,16390,[hex IND]
low alg_rdy

'Wait for "valid" low
wait_valid_low5b:
if IN7=1 then wait_valid_low5b

i=i+1
if (i<>64) then data_loop5

low alg_send
serout 16,16390,[10,13]

i=0
j=j+1
high cnt_incr
low cnt_incr
if (j<>120) then loop5

cam_stub.v
//Group 1: Autonomous Motion Tracking
//CPSC 483 - Spring 1999
//File: cam_stub.v
//Authors: Wesley Day
//Description: Test stub which emulates camera interface handshaking

module cam_stub (clock,hold,start,valid,cmd) ;

input clock;


input start ;

output hold ;
output valid ;
input cmd ;

reg hold, valid;
reg [12:0] count;
reg [4:0] state;

parameter idle=0, init=1, send_frame1=2, send_frame2=3, send_frame3=4, FRAME_SIZE=7680;

always @(posedge clock)
begin
  case(state)
    idle:
      begin
        hold=0; count=0; valid=0;
        if (start)  // receiving a command
          if (!cmd) // initialize
            state=init;
          else
            state=send_frame1;
      end //idle
    init: // wait for start to drop
      begin
        hold=1;
        if (!start)
          state=idle;
      end
    send_frame1:
      begin
        hold=1;
        if (count == FRAME_SIZE)
          state=idle;
        else
          begin
            count=count+1;
            valid=1;
            state=send_frame2;
          end
      end
    send_frame2:
      begin
        state=send_frame3;
      end
    send_frame3:
      begin
        valid=0;
        state=send_frame1;
      end
  endcase
end
endmodule

wessim.v
//Group 1: Autonomous Motion Tracking
//CPSC 483 - Spring 1999
//File: wessim.v
//Authors: Andrew Fikes and Kamran Shah
//Description: Test stub which returns a "fake" image to the Algorithm module

module sim(sim_clk,sim_inst,sim_param,sim_send,sim_rdy,sim_valid,sim_data,sim_state);


// Variables that specify where the rectangle is.
parameter TOP=48,BOTTOM=50,LEFT=60,RIGHT=80;

//states
parameter init=0,s1=1,s2=2,s3=3,s4=4;

//INPUTS
input sim_clk;
input [1:0] sim_inst;
input [6:0] sim_param;
input sim_send;
input sim_rdy;

//OUTPUTS
output sim_valid;
output [7:0] sim_data;
output [2:0] sim_state;

//DECLARATIONS
reg sim_valid;
reg [7:0] sim_data;
reg [2:0] sim_state;

//INTERNAL VARIABLES
reg [6:0] count;
reg [3:0] i;

always@(posedge sim_clk)
begin
  case(sim_state)
    init:
      begin
        sim_valid=1;
        count=0;
        if(sim_send && (sim_inst==2))
          begin
            sim_state=s1;
          end
      end
    s1:
      begin
        sim_valid=0;
        if(sim_rdy)
          begin
            i=0;
            sim_state=s2;
          end
        if(!sim_send)
          begin
            sim_state=init;
          end
      end
    s2:
      begin
        if(sim_param > TOP) i[3]=1;
        if(sim_param < BOTTOM) i[2]=1;
        if(count > LEFT) i[1]=1;
        if(count < RIGHT) i[0]=1;
        sim_state=s3;
      end


    s3:
      begin
        if(i==15)
          sim_data='b11111111;
        else
          sim_data='b00010001;
        sim_state=s4;
      end
    s4:
      begin
        sim_valid=1;
        if(!sim_rdy)
          begin
            count=count+1;
            sim_state=s1;
          end
      end
  endcase
end
endmodule


Appendix D – Algorithm Development Files

ATU.java
//Group 1: Autonomous Motion Tracking
//CPSC 483 - Spring 1999
//File: ATU.java
//Author: Andrew Fikes
//Description:
//Related Files: MainWindow.java, Picture.java, WindowDisposer.java

public class ATU{
  public static void main (String[] args) {
    MainWindow mw = new MainWindow();
  }
}

Algorithms.java
//Group 1: Autonomous Motion Tracking
//CPSC 483 - Spring 1999
//File: Algorithms.java
//Author: Randy Cuaycong
//Description:

public class Algorithms{

  public int temp_direction;

  public void alg1(Picture Picture2){
    int x_coord=0;
    int y_coord=0;
    x_coord=alg1_calc(Picture2.x_histogram, Picture2.width);
    y_coord=alg1_calc(Picture2.y_histogram, Picture2.height);
    x_coord = Picture2.offset+x_coord;
    y_coord = Picture2.offset+y_coord;
    Picture2.draw_crosshair(x_coord, y_coord);
  }

  private int alg1_calc (int [] histogram, int length) {
    int count=1;
    int low_flag;
    int poschange=0;
    int poslocation=0;
    int negchange=0;
    int neglocation=0;
    int difference[]=new int[length];
    int center;

    difference[0]=0;
    for (count=1;count<length;++count) {
      difference[count]=histogram[count]-histogram[count-1];
    }
    // find greatest positive and negative changes
    for(count=0;count<length;++count){
      if (difference[count] > poschange) {
        poschange = difference[count];
        poslocation = count;
      }
      if (difference[count] < negchange) {
        negchange = difference[count];
        neglocation = count;
      }
    }
    center=(poslocation + neglocation)/2;
    return center;
  }
}
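The center-finding step in alg1_calc can be exercised in isolation: it takes first differences of a histogram, locates the strongest rising and falling edges, and returns their midpoint. The sketch below is hypothetical test code (the class name Alg1Demo and the standalone method are invented for illustration, not part of the project sources) that re-implements the same search on a small synthetic histogram.

```java
public class Alg1Demo {
    // Same computation as alg1_calc: find the largest positive and largest
    // negative first differences of the histogram, return their midpoint.
    static int center(int[] histogram) {
        int poschange = 0, poslocation = 0;
        int negchange = 0, neglocation = 0;
        int[] difference = new int[histogram.length];
        difference[0] = 0;
        for (int c = 1; c < histogram.length; ++c)
            difference[c] = histogram[c] - histogram[c - 1];
        for (int c = 0; c < histogram.length; ++c) {
            if (difference[c] > poschange) { poschange = difference[c]; poslocation = c; }
            if (difference[c] < negchange) { negchange = difference[c]; neglocation = c; }
        }
        return (poslocation + neglocation) / 2;
    }

    public static void main(String[] args) {
        // An object spanning columns 3..6 produces a rising edge at index 3
        // and a falling edge at index 7; the reported center is (3+7)/2 = 5.
        int[] hist = {0, 0, 0, 9, 9, 9, 9, 0, 0, 0};
        System.out.println(center(hist)); // prints 5
    }
}
```

This makes the behavior on a clean target easy to see: the crosshair lands midway between the object's two edges in each axis.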

MainWindow.java
//Group 1: Autonomous Motion Tracking
//CPSC 483 - Spring 1999
//File: MainWindow.java
//Author: Andrew Fikes
//Description:
//Related Files: ATU.java, Picture.java, WindowDisposer.java

import java.awt.*;
import java.awt.event.*;

public class MainWindow{
  MainWindow(){
    Algorithms alg_picture2 = new Algorithms();

    /******************** GUI LAYOUT ********************/
    /* Create main window */
    Frame gui = new Frame("Autonomous Tracking Unit");

    /* West Panel */
    Panel path1_panel = new Panel();
    Label path1_label = new Label("Picture 1:");
    TextField path1_field = new TextField(40);
    path1_panel.add(path1_label);
    path1_panel.add(path1_field);
    Panel picture1_panel=new Panel();
    Picture picture1 = new Picture();
    picture1.setSize(325,325);
    picture1_panel.add(picture1);
    Panel west_panel = new Panel();
    west_panel.setLayout(new BorderLayout());
    west_panel.add("North",path1_panel);
    west_panel.add("South",picture1_panel);
    gui.add("West",west_panel);

    /* East Panel */
    Panel path2_panel = new Panel();
    Label path2_label = new Label("Picture 2:");
    TextField path2_field = new TextField(40);
    path2_panel.add(path2_label);
    path2_panel.add(path2_field);
    Panel picture2_panel=new Panel();
    Picture picture2 = new Picture();
    picture2.setSize(325,325);
    picture2_panel.add(picture2);
    Panel east_panel = new Panel();
    east_panel.setLayout(new BorderLayout());
    east_panel.add("North",path2_panel);
    east_panel.add("South",picture2_panel);
    gui.add("East",east_panel);

    /* South Panel */
    Panel navigation_panel = new Panel();
    Button load=new Button("Load");
    Button alg1=new Button("Alg1");
    Button quit=new Button("Quit");
    navigation_panel.add(load);
    navigation_panel.add(alg1);
    navigation_panel.add(quit);
    gui.add("South",navigation_panel);

    /* Adjust Window */
    gui.setBackground(SystemColor.control);
    gui.show();
    gui.pack();

    /****************** EVENT HANDLING *****************/
    /* Handle window destruction event */
    gui.addWindowListener(new WindowDisposer(gui));

    /* Handle a button event */
    class ButtonHandler implements java.awt.event.ActionListener{
      private TextField textfield1;
      private TextField textfield2;
      private Picture picture1;
      private Picture picture2;
      Algorithms center = new Algorithms();
      ButtonHandler(){
      }
      ButtonHandler(TextField textfield1,TextField textfield2,Picture picture1,Picture picture2){
        this.textfield1=textfield1;
        this.textfield2=textfield2;
        this.picture1=picture1;
        this.picture2=picture2;
      }
      public void actionPerformed(ActionEvent e){
        if(e.getActionCommand().equals("Load")){
          picture1.load((textfield1.getText()).replace('/','\\'));
          picture2.load((textfield2.getText()).replace('/','\\'));
        }
        if(e.getActionCommand().equals("Alg1")){
          center.alg1(picture2);
        }
        if(e.getActionCommand().equals("Quit")){
          System.exit(1);
        }
      }
    }
    load.addActionListener(new ButtonHandler(path1_field,path2_field,picture1,picture2));
    alg1.addActionListener(new ButtonHandler(path1_field,path2_field,picture1,picture2));
    quit.addActionListener(new ButtonHandler());
  }
}

Picture.java
//Group 1: Autonomous Motion Tracking
//CPSC 483 - Spring 1999
//File: Picture.java
//Author: Andrew Fikes
//Description:
//Related Files: ATU.java, MainWindow.java, WindowDisposer.java


import java.awt.*;
import java.io.*;
import java.util.*;

public class Picture extends java.awt.Canvas{
  public int max_color;
  public int width;
  public int height;
  public int data [][];
  public int x_histogram [];
  public int y_histogram [];
  public int show_crosshair;
  public int x_crosshair;
  public int y_crosshair;
  public int show_direction;
  public int x_direction;
  public int y_direction;
  public int offset;

  public void paint(Graphics g){
    offset=165;
    draw_image(g);
    draw_x_histogram(g);
    draw_y_histogram(g);
    if(show_crosshair==1){
      g.setColor(Color.red);
      g.drawLine(x_crosshair-5,y_crosshair,x_crosshair+5,y_crosshair);
      g.drawLine(x_crosshair,y_crosshair-5,x_crosshair,y_crosshair+5);
    }
    if(show_direction==1){
      g.setColor(Color.green);
      g.drawLine(offset+80, offset-20, offset+80+x_direction, offset-20+y_direction);
    }
  }

  public void draw_crosshair(int x, int y){
    x_crosshair=x;
    y_crosshair=y;
    show_crosshair=1;
    repaint();
  }

  public void draw_direction(int x, int y) {
    x_direction=x;
    y_direction=y;
    show_direction=1;
    repaint();
  }

  public void load(String filename){
    int x,y;
    BufferedReader br;
    String buffer = new String();
    StringTokenizer st;
    try{
      br = new BufferedReader(new FileReader(filename));
      //Read width,height
      buffer=br.readLine();
      st=new StringTokenizer(buffer);
      width=Integer.parseInt(st.nextToken());
      height=Integer.parseInt(st.nextToken());
      buffer=br.readLine();
      //Create data array
      data = new int [width][height];
      //Read max_color
      st=new StringTokenizer(buffer);
      max_color=(Integer.parseInt(st.nextToken()));

      //Read data values
      buffer=br.readLine();
      st=new StringTokenizer(buffer);
      for(y=0;y<height;y++)
        for(x=0;x<width;x++){
          while(st.hasMoreTokens()==false){
            buffer=br.readLine();
            st=new StringTokenizer(buffer);
          }
          data[x][y]=Integer.parseInt(st.nextToken());
        }
    }catch(IOException ioe){}
    x_histogram=calc_x_histogram(width,height,data);
    y_histogram=calc_y_histogram(width,height,data);
    show_crosshair=0;
    repaint();
  }

  public int [] calc_x_histogram(int width, int height, int [][] data){

    int x,y;
    int x_histogram[]=new int[width];
    for (x=0;x<width;x++)
      for (y=0;y<height;y++)
        x_histogram[x]=f(x_histogram[x],data[x][y]);
    return x_histogram;
  }

  public int [] calc_y_histogram(int width, int height, int [][] data){
    int x,y;
    int y_histogram[]=new int[height];
    for (y=0;y<height;y++)
      for (x=0;x<width;x++)
        y_histogram[y]=f(y_histogram[y],data[x][y]);
    return y_histogram;
  }

  public int f(int a, int b){
    return a+(255-b);
  }

  public void draw_x_histogram(Graphics g){
    int x;
    g.setColor(Color.black);
    g.drawLine(offset,offset-5,offset+width,offset-5);
    for(x=0;x<width;x++)
      g.drawLine(offset+x,offset-5,offset+x,offset-5-x_histogram[x]/max_color);
  }

  public void draw_y_histogram(Graphics g){
    int y;
    g.setColor(Color.black);
    g.drawLine(offset-5,offset,offset-5,offset+height);
    for(y=0;y<height;y++)
      g.drawLine(offset-5,offset+y,offset-5-y_histogram[y]/max_color,offset+y);
  }

  public void draw_image(Graphics g){
    int x,y;
    int grayvalue;
    for(x=0;x<width;x++){
      for(y=0;y<height;y++){
        grayvalue=(data[x][y])*(256/(max_color+1));
        g.setColor(new Color(grayvalue,grayvalue,grayvalue));
        g.drawLine(offset+x,offset+y,offset+x,offset+y);
      }
    }
  }

}

WindowDisposer.java
//Group 1: Autonomous Motion Tracking
//CPSC 483 - Spring 1999
//File: WindowDisposer.java
//Author: Andrew Fikes
//Description:
//Related Files: ATU.java, MainWindow.java, Picture.java

import java.awt.*;
import java.awt.event.*;

public class WindowDisposer extends WindowAdapter{
  Window window;

  WindowDisposer(Window window) {
    super();
    this.window = window;
  }

  public void windowClosing(WindowEvent e) {
    window.dispose();
    System.exit(1);
  }
}


Appendix E – Project Documents

The following documents are available in our web-enabled version of this document:
- Proposal
- Proposal Presentation
- Mid-term Report
- Mid-term Presentation
- Final Report
- Final Presentation
- Bi-weekly 1
- Bi-weekly 2
- Bi-weekly 3


Appendix F – Xilinx Labs Developed During the Project

Chapter 15 – Controlling Servos

Servos are very useful in projects such as remote control airplanes and simple electronic robots. They come in all sorts of sizes and capabilities. A typical servo is the Futaba S-148, pictured below:

Here are some specifications for the Futaba S148:

Control System:      +Pulse width control, 1520 μs neutral
Power Supply:        4.8V or 6.0V (shared with the receiver)
Operation Angle:     Rotary system, one side 45° or greater (including trim)
Power Consumption:   6.0V, 8mA (at idle)
Operating Speed:     0.22 sec / 60°
Output Torque:       42 oz-in (3 kg-cm)
Dimensions:          1.58 x 0.77 x 1.4 in (40.4 x 19.8 x 36 mm)
Weight:              1.5 oz (44.4 g)

Three wires extend outside the unit: red, black, and white. Red supplies the power and black is ground. White is the control wire, which carries the pulse-width modulated signal that determines the servo's position. The servo expects a pulse every 17 to 20 ms, and the length of the pulse determines the position of the servo.


WARNING: The values above are just examples. The real values differ from one servo unit to the next.

Here is a simple method to produce the required pulse:

module pulsegen (clk, fclk, angle, pulse) ;

// INPUTS
input clk;          // Clock input
input fclk;         // Frame clock input
input [7:0] angle ; // Input for the pulse length

// OUTPUT
output pulse ; // Output signal to the servo
reg pulse;     // Output to the servos themselves

// INTERNAL
reg [8:0] pclk ; // Internal counter for length of pulse

// PARAMETERS
parameter minpulse=60; // defines minimum pulse length = produces a 0 degree position

always @(posedge clk)
begin
  if (fclk == 0) // reset condition
    begin
      pclk=256-minpulse; // I want pclk to equal 1|0000|0000 when pclk reaches minpulse
      pulse=0;           // Pull the servo pulse low when minpulse reached
    end
  else if (pclk[8]==1) // minpulse reached. Extend pulse for the length required for angle
    begin
      if (pclk[7:0] == angle[7:0]) // Length for requested angle reached
        pulse = 0;
      else // increment towards angle length
        begin
          pclk = pclk + 1;
          pulse = 1;
        end
    end // minpulse reached
  else // min pulse not reached
    begin // note: fclk should be high
      pclk = pclk + 1;
      pulse = 1;
    end
end
endmodule

We want the 9-bit counter to have the bit value 1|0000|0000b when the minimum pulse length is reached. To allow for shorter minimum pulse lengths, we had our counter start at 256 (1|0000|0000b) minus the number of clock cycles needed to produce a minimum pulse length. This allows us to have a 1-bit comparator for the minimum pulse length and test for the angle using the lower 8 bits.

Our algorithm works by first setting the counter to 2^8 (256) minus the minimum pulse length (60), so the counter begins at 196. Each pulse clock cycle, the counter is incremented by one. After 60 clock cycles, the counter equals 256 (1|0000|0000b). When this condition is met, the counter continues to increment. The output pulse is high from the time the counter begins to increment until the lower 8 bits of the counter equal the angle value. When this occurs, the counter stops incrementing and the output pulse is pulled to zero. All of this happens while the frame clock is high, so when the frame clock goes low, the counter and angle value can be reset and the module can wait for the rising edge of the frame clock to begin once again.
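The counter arithmetic above can be sanity-checked in software. The following hypothetical Java sketch (class and method names invented for illustration, not project code) steps a counter the same way 'pulsegen' does and counts how many clock cycles the output pulse stays high for a given 8-bit angle value.

```java
public class PulseGenSim {
    static final int MINPULSE = 60; // clock cycles for the minimum (0-degree) pulse

    // Returns the number of clock cycles the pulse output stays high
    // for a given 8-bit angle value, mirroring the pulsegen counter.
    static int highCycles(int angle) {
        int pclk = 256 - MINPULSE; // counter starts at 196
        int cycles = 0;
        while (true) {
            if ((pclk & 0x100) != 0) {               // bit 8 set: minimum pulse served
                if ((pclk & 0xFF) == (angle & 0xFF))
                    return cycles;                   // pulse drops when low byte == angle
            }
            pclk++;   // otherwise keep counting with the pulse held high
            cycles++;
        }
    }

    public static void main(String[] args) {
        // angle=0 yields the minimum pulse; each extra unit of angle adds one cycle
        System.out.println(highCycles(0));  // prints 60
        System.out.println(highCycles(90)); // prints 150
    }
}
```

This confirms the 1-bit comparator trick: 60 cycles are spent reaching 256, and the angle value then extends the pulse cycle-for-cycle through the lower 8 bits.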

This module provides the interface to the servos, but additional logic will be needed for control. Since 'pulsegen' continuously uses the input 'angle', the input values should be driven at all times. The input also should not be changed while the frame clock is high: changing the input angle value while the servos are updating can cause erratic movement. Furthermore, the logic should also check the physical bounds of the servo. You should not force a servo to go below the 0° or above the 180° position; this can damage the servo. The frame clock should not have a period of less than 17 ms, to avoid internal damage. A frame clock with a period greater than 20 ms is acceptable, but the servos will perform with less torque.

Chapter 19 – Basic Stamp 2 to FPGA

To demonstrate how to interface the Basic Stamp 2 with the Xilinx FPGA, we will create a simple counter using both devices. The counter itself will be incremented within the BS2, transferred to a register in the Xilinx FPGA, and then displayed on a row of LEDs.

The first thing we must do is create the counter module within the BS2. The BS2 is programmed on the computer using a language called PBASIC. Information on PBASIC’s constructs can be found in the Basic Stamp Programming Manual. The programming manual provides in depth information on each construct, and also provides numerous examples of BS2 applications. A shorter reference guide that contains a brief description of the commands is also available from your instructor.

The following is the pinout for the BS2:

To set up the BS2:
1. Power off the breadboard to prevent damaging the BS2.
2. Connect PWR (pin 24) to the 5V power source on your breadboard.
3. Connect GND (pin 23) to the ground on your breadboard.


4. Using the serial cable and connectors provided by your instructor, hook pins 2-5 of the serial cable to pins 1-4 of the BS2. Make sure that the pins from the serial cable match up to the BS2 as follows:

Serial Cable   BS2
Pin 2          Pin 1
Pin 3          Pin 2
Pin 4          Pin 3
Pin 5          Pin 4

Once you have properly set up the BS2, start the stampw program and select the proper COM port for your serial cable. If your COM port is not listed, add it by typing its number in the COM # box. Once the stampw program has opened, you are ready to program the BS2. The following code should properly implement the counter in the BS2.

Basic Stamp II Code:

'******************************************************************
'File: bs2demo.bs2
'******************************************************************

'Name inputs
ENABLE con 07

'Declare input pins
input ENABLE

'Name outputs
CLK con 10
LED0 con 12
LED1 con 13
LED2 con 14
LED3 con 15

'Declare output pins
output CLK
output LED0
output LED1
output LED2
output LED3

'Variables
counter var nib

'Check for ENABLE signal
CHK_ENABLE:
if IN7<>1 then CHK_ENABLE

'Pause to make LED change visible
pause 500

CHK_BIT0:
if counter.bit0=IN12 then CHK_BIT1
toggle LED0

CHK_BIT1:
if counter.bit1=IN13 then CHK_BIT2
toggle LED1

CHK_BIT2:
if counter.bit2=IN14 then CHK_BIT3
toggle LED2

CHK_BIT3:
if counter.bit3=IN15 then CHK_COUNT
toggle LED3

CHK_COUNT:
if counter<>15 then INC_COUNT

RST_COUNT:
counter=0
high CLK
low CLK
goto CHK_ENABLE

INC_COUNT:
counter=counter+1
high CLK
low CLK
goto CHK_ENABLE
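The CHK_BITx ladder above brings the LED output pins into agreement with the counter one bit at a time, toggling only the bits that differ. A hypothetical Java model of that per-bit update (the class and method names are invented for illustration, not part of the lab materials) shows that a single pass always leaves the LEDs equal to the low 4 bits of the counter:

```java
public class BitSyncDemo {
    // Mirror of the CHK_BITx logic: toggle each LED bit that does not
    // match the corresponding counter bit, leaving matching bits alone.
    static int syncLeds(int leds, int counter) {
        for (int bit = 0; bit < 4; bit++) {
            int mask = 1 << bit;
            if ((leds & mask) != (counter & mask))
                leds ^= mask; // toggle LEDx
        }
        return leds;
    }

    public static void main(String[] args) {
        System.out.println(syncLeds(0b0000, 0b1011)); // prints 11
        System.out.println(syncLeds(0b1111, 0b0101)); // prints 5
    }
}
```

Toggling rather than assigning matters on the BS2, where output pins are driven individually; the model just confirms the ladder converges in one pass regardless of the LEDs' previous state.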

The implementation of the Xilinx module is extremely simple. The module has 5 inputs: 1 for the clock and 4 for the counter data. When the clock is asserted, the Xilinx should place the counter data from the BS2 into a register and display that data on the 4 LEDs. The following shows the schematic for the Xilinx module. Note that the register was created using the Xilinx LOGIBLOX tool, and the clock enable input to the register was removed.

Xilinx Schematic:

At this point, it is a good idea to test the modules separately to make sure they work as expected. Note that the IPADs in the Xilinx schematic are connected to switches. This was done so that the Xilinx module can be tested as a separate component. After you finish testing, however, make sure that you return the switches to the “0” position before hooking the modules together.

If you are confident that both modules work separately, you can hook the two modules together as follows:
1. Make sure that both the BS2 and Xilinx are powered off and are hooked up properly.
2. Both the BS2 and Xilinx have compatible source and sink currents, which means that we can hook them together without any intermediate buffering. Your instructor should have cute little cables with black ends that should make the whole process fairly simple. Using these cables, connect the BS2 and Xilinx together as follows:

BS2       Xilinx
Pin 20    Pin 24
Pin 19    Pin 23
Pin 18    Pin 20
Pin 17    Pin 19
Pin 15    Pin 25

3. The only thing that remains is to hook up a switch to work the enable. The following diagram was taken from page 250 of the Basic Stamp Programming Manual and shows the proper way to hook up both active-high and active-low switches to the BS2.

4. Now, make sure that the enable is set low, and download your modules to the BS2 and Xilinx. Set your enable high, and the LEDs should begin counting.

Chapter 22 – Interfacing a Connectix QuickCam

Written 05/05/99 by Andrew Fikes ([email protected])

The CPSC 483 lab contains several Connectix QuickCams that can be easily added into a Xilinx project. The interface between the camera and the FPGA was developed as part of the Autonomous Tracking Unit project (Spring 1999). The following document describes how to restore, test and integrate that interface.

Interface highlights:
- Supports images up to 324x243 pixels
- Supports both 16-color grays (4-bit depth) and 64-color grays (6-bit depth)
- Supports sampling at 1:1, 2:1 and 4:1 ratios
- Observed frame rates exceeding 30 frames per second
- ROM-stored instructions allow for easy configuration
- Requires only 15 I/O pins for operation
- Requires only 56 CLBs

Restoring the QuickCam Project

The QuickCam project is archived in the file camera.zip. If you are reading the .doc or .ps format of this document, a hyperlink to the file is available in the .html version.

To unarchive the project:
1. Copy the camera.zip file into your project directory.
2. In the main Xilinx window, select File->Restore Project.
3. In the dialog box, select the camera.zip file and press Next.
4. Specify your project directory, and press Next.
5. When the project finishes decompressing, press Finish.


You should now have a directory named camera in your project directory.

Testing the Project’s Functionality

The camera project included in the archive is set up to allow you to quickly test the functionality of the camera and learn more about the hardware components. The tests use a simple BS2 program to simulate a control module and also to serially transfer the data back to a terminal session. As a result, you will be able to see the data returned by the QuickCam and get a good idea of the quality of the returned images.

Step 1: Connect the Camera to the Xilinx FPGA

The cameras available in lab make use of the parallel interface to transfer data. The parallel interface supports two modes of data transfer, a nibble mode and a byte mode. The interface currently supports only nibble mode transfers from the camera. Table 1 shows the pins used by the interface and the signals transferred by each pin.

DB-25 Pin #   Pin Signal     Xilinx Pin #
2             Command (0)    P39
3             Command (1)    P38
4             Command (2)    P36
5             Command (3)    P35
6             Command (4)    P29
7             Command (5)    P40
8             Command (6)    P44
9             Command (7)    P37
13            Nibble (0)     P3
12            Nibble (1)     P4
10            Nibble (2)     P5
11            Nibble (3)     P6
15            CamRdy         P8
16            Reset          P49
17            PCAck          P7

Table 1: Parallel Port Connections

To connect the camera:
1. Construct a female parallel port to breadboard connection. For our project we constructed a gray ribbon cable with a port on one end and a twenty-four pin connector on the other. WARNING: It is important to test every pin's connectivity.

2. Connect each of the pins listed in Table 1 to the corresponding Xilinx pin. Your TA should have cables that are designed for this purpose.



Step 2: Power the QuickCam

A direct result of the QuickCam's parallel interface is that it has to derive its power from another source. Connectix solved this problem by tapping the keyboard power supply. Since it is important that the QuickCam and the Xilinx share the same power source, you must add a female DIN connector to your breadboard to provide power. Table 2 shows the pins used by the QuickCam to derive power.

DIN Pin #   Connect To
4           Gnd
5           +5V

Table 2: Keyboard Port Connections

To connect the keyboard port:
1. Construct or locate a female keyboard port.
2. Connect pins 4 and 5 of the port to GND and +5V as shown in Table 2.

WARNING: It is VERY important that you do not reverse these connections.

Once you have connected both the parallel and keyboard ports, and are satisfied that they have been connected correctly, go ahead and plug in the QuickCam.

Step 3: Set up the BS2 to Transfer Data

The BS2 provides a painless way by which we can transfer data from the QuickCam to the computer. Table 3 shows the pinout of the BS2, and describes the signals assigned to each pin.

BS2 Pin   Signal     Connect To
1         TX         Serial Cable
2         RX         Serial Cable
3         ATN        Serial Cable
4         GND        Serial Cable
5         Enable     Active High Switch
9         Cmd        P25
10        Start      P26
11        Hold       P50
12        Valid      P48
13        Data (0)   P61
14        Data (1)   P62
15        Data (2)   P65
16        Data (3)   P66
17        Data (4)   P57
18        Data (5)   P58
19        Data (6)   P59
20        Data (7)   P60
22        RES        Active Low Button
23        GND        Gnd
24        PWR        +5V

Table 3 : Basic Stamp Connections



To connect the BS2:
1. Obtain a BS2. The lab has several available.
2. Power off the breadboard to prevent damaging the BS2.
3. Connect pins 23 and 24 to GND and +5V respectively.
4. Connect pins 1-4 to a serial cable modified for BS2 purposes. Your TA should have several available, but more information on constructing this connection can be found in the BS2 manual.

5. Connect pin 22 to an active-low button. Pin 22 controls the RESET signal of the BS2. When the button is depressed, it pulls the voltage low, restarting the BS2.

6. Connect pin 5 to an active high switch. This switch will be used by the BS2 program to control program execution.

7. Connect each of the remaining pins listed in Table 3 to the corresponding Xilinx pin. Your TA should have cables that are designed for this purpose.

Step 4: Program the BS2

The next step is to transfer the test program, test.bs2, to the BS2. The file should have been included in the archived project.

To program the BS2:
1. Locate and start the program stampw.
2. Select Edit->Preferences, and under the Editor Operation tab, set Default Comport to COM2.
3. Open the file test.bs2 in the BS2 editor.
4. Make sure the active-high switch on pin 5 of the BS2 is turned off.
5. Download the file to the BS2.
6. Close the stampw program.

The BS2 should now be programmed and ready to go!

Step 5: Set Up the Terminal Session

In order to receive data from the BS2, you will need to start a terminal session listening on COM2. Unfortunately, starting a terminal session on COM2 will reprogram the BS2. To counteract this, simply remove the connection to BS2 pin 3.

To set up the terminal session:
1. Open the program HyperTerminal.
2. Create a new connection named BS2.
3. Change the COM port settings to:
   Port: COM2
   Bits Per Second: 38400
   Data Bits: 8
   Parity: None
   Stop Bits: 1

Step 6: Program the Xilinx FPGA
At this point you should program the Xilinx. A bitstream for the project was included in the archive. If you are not using a 4003, you will need to reimplement the project.


Step 7: Run the Program
Switch to your terminal session and flip the enable switch connected to BS2 pin 5. The following should appear on your screen:

ENABLED
CAMERA INITIALIZED

Now, flip the enable switch the other way, and the camera should take a picture and the image should start being returned to the screen. Below is part of a sample picture of a black star on a white background:

ENABLEDCAMERA INITIALIZEDENABLED0111111111111111111111111111111111111111111111111111111111111111111111111102111111111111111111111111111111111111111111111111111111111111111111111111F01111111111111111111111111111111111111111111111111111111111111111111111111E01111111111111111111111111111111111111111111111111111111111111111111111111DF1111111111111111111111111111111111111117111111111111111111111111111111111C0F1111111111111111111111111111111111111119511111111111111111111111111111111BF111111111111111111111111111111111111114B6111111111111111111111111111111119FE111111111111111111111111111111111111119A8111111111111111111111111111111118EF11111111111111111111111111111111111111AA7611111111111111111111111111111115EE11111111111111111111111111111111111116AA7811111111111111111111111111111113DE11111111111111111111111111111111111118A98641111111111111111111111111111111DE11111111111111111111111111111111111168AA7761111111111111111111111111111111CE111111111111111111111111111111111111799A6771111111111111111111111111111111CD11111111111111111111111111111111111498966685111111111111111111111111111111BD11111111111111111111111111111111111698956687111111111111111111111111111111BD11111111111111111111163787657244355797957587311111111111111111111111111111BD11111111111111111111111698767355276787956678547966796566273111111111111111AD11111111111111111111111187757354266387856579558977787676311111111111111111AD11111111111111111111111112737354175387855559557967797651111111111111111111AC111111111111111111111111111172541653778564685588668871111111111111111111119C11111111111111111111111111111245165377846467647866741111111111111111111111AB111111111111111111111111111111331652868463675387621111111111111111111111119B111111111111111111111111111111111553767554675476111111111111111111111111119B111111111111111111111111111111112552758453665361111111111111111111111111119B111111111111111111111111111111131642667544484444111111111111111111111111119B1111111111111111111111111111111315527584525754551111111111111111111111111
18B111111111111111111111111111111331542657344474457111111111111111111111111118A11111111111111111111111111111142154266635356526631111111111111111111111111591111111111111111111111111111122316125111135653564111111111111111111111111169111111111111111111111111111121321411111111264266611111111111111111111111115911111111111111111111111111112132111111111111345751111111111111111111111111581111111111111111111111111111312111111111111111566111111111111111111111111158111111111111111111111111111131111111111111111117511111111111111111111111115811111111111111111111111111111111111111111111111131511111111111111111111111481111111111111111111111111111111111111111111111111131111111111111111111111158111111111111111111111111111111111111111111111111111111111111111111111111114811111111111111111111111111111111111111111111111111111111111111111111111111471111111111111111111111111111111111111111111111111111111111111111111111111146111111111111111111111111111111111111111111111111111111111111111111111111114511111111111111111111111111111111111111111111111111111111111111111111111111351111111111111111111111111111111111111111111111111111111111111111111111111PICTURE FINISHED
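Each row of the dump above is one line of 4-bit pixel intensities printed as hex digits. A quick way to eyeball a captured dump is to threshold each nibble into dark and light characters. The sketch below is purely illustrative (it is not part of the project); the polarity and threshold are assumptions, since the mapping of nibble value to brightness depends on the camera's contrast and offset settings.

```python
def render_row(row, threshold=4):
    """Map one row of hex nibbles to ASCII art. In the sample dump the
    background reads as 1s and the star as higher digits, so values at
    or above the threshold are drawn dark ('#') and the rest light ('.').
    Treat both the polarity and the threshold as tunable guesses."""
    out = []
    for ch in row.strip():
        if ch in "0123456789abcdefABCDEF":
            out.append('#' if int(ch, 16) >= threshold else '.')
    return ''.join(out)


def render_dump(lines):
    """Render every pixel row of a dump, skipping short status lines
    such as ENABLED or PICTURE FINISHED (rows here are 76 pixels wide)."""
    return [render_row(line) for line in lines if len(line.strip()) >= 76]
```

Piping a saved capture through render_dump makes the star shape visible at a glance in any fixed-width terminal.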


What to Do if Something Goes Wrong…

Q: What do all of these blinking lights mean? Will they help me diagnose my problem?
A: Yes, as a matter of fact, they will. Most of the important outputs are tied to LEDs on the demoboard. Table 4 describes the LEDs and the signals they carry.

LED         Signal
Right A     Reset*
Right B     Valid*
Right C     State (0)
Right D     State (1)
Right E     State (2)
Right F     Hold*
Right G     State (3)
Right OFL   State (4)
Left A      Command (0)*
Left B      Command (1)*
Left C      Command (2)*
Left D      Command (3)*
Left E      Command (4)*
Left F      Command (5)*
Left G      Command (6)*
Left OFL    Command (7)*
9-16        Data (7) - Data (0)*

* Indicates that the signal is inverted

Table 4: LED Descriptions

Q: The camera halts in an intermediate state. What is wrong?
A: The state machine is most likely waiting for a signal from the camera. Check to make sure that the camera is properly plugged into the parallel and keyboard ports, and that your wires between the ports and the Xilinx FPGA are connected properly.

Q: Why is nothing being returned by the BS2?
A: The most likely cause is that you forgot to unplug the third pin of the serial connection to the BS2 prior to starting your Hyperterminal session. As a result, your BS2 program was cleared from memory by Hyperterminal. Go back and reload your BS2 and Xilinx. If this doesn't work, check the settings in your Hyperterminal session. Make sure that they are set to COM2, 38400, 8 bits, and no parity. Also, check to make sure that you connected the BS2 to the Xilinx properly.

Q: I adjusted the module's clock and now the image doesn't appear!
A: The module's clock speed was not chosen arbitrarily. The limitations of the BS2 require the module clock to sit in a narrow window, significantly slower than the rate at which the BS2 samples: lower clock speeds are too slow to retrieve data before the camera's CCD charge decays, while higher clock speeds outrun the BS2, which then fails to capture all of the data.



Importing the Camera Module Into Your Schematic

The archived project contains two schematics, test.sch and quickcam.sch. The quickcam.sch schematic contains the actual camera interface and control modules. It was turned into a macro before the quickcam component was added to the test.sch schematic. To import the camera interface into your own project, you will need to reconstruct this sheet in your project and create a macro from it. Figure 1 shows the completed schematic.

Figure 1: QUICKCAM.sch

To import the camera module:
1. Move quickcam.sch, 76x60x4.mem, intrface.v, control.v from the camera project directory to your project directory.
2. Open the file intrface.v in the HDL editor.
3. Create a macro by selecting Project -> Create Macro.
4. Open the file control.v in the HDL editor.
5. Create a macro by selecting Project -> Create Macro.
6. Return to your main Xilinx window, choose Document -> Add, and add quickcam.sch.
7. Open quickcam.sch. The two Verilog modules should already be in place. WARNING: It is very possible that the macro creator rearranged the pins on the modules. If this is the case, open the symbol editor and rearrange the pins on the modules so that they line up properly.

8. Double click on where the ROM was previously. Create a LOGIBLOX ROM with the following properties:
   Name: thingy
   Data Bus Width: 12
   Mem File: 76x60x4.mem
   Multiplexer Style: Normal
   Gates Trim: True

9. Double click on where the constant num_trans was previously. Create a LOGIBLOX constant with value 2280.

10. The schematic should be complete. Select Hierarchy -> Create Macro Symbol from Current Sheet. You are now ready to drop it on any schematic.


NOTE: Nibble(3) of a standard parallel port interface is an active-low signal. As a result, when you tap the Nibble bus, make sure to invert the Nibble(3) signal.

Configuring the ROM

One of the benefits of our camera interface is that you can easily configure the camera. The camera instructions are stored in a ROM, and affect properties such as size and contrast. Each instruction is 4-bits long, and is paired with an 8-bit parameter. The ROM contains space for up to 8 initialization instructions, and 1 picture instruction.
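Each ROM word therefore packs a 4-bit instruction and an 8-bit parameter into 12 bits. The following helper is only an illustration of that packing (it is not part of the project); the opcode nibbles are the ones listed in the ROM Configuration Worksheet (Table 6):

```python
# Instruction opcodes from the ROM Configuration Worksheet (Table 6).
OPCODES = {
    "Reset": 0b1110, "SetContrast": 0b1001, "SetOffset": 0b1010,
    "SetExposure": 0b0010, "SetTop": 0b0011, "SetLeft": 0b0100,
    "SetNumVertical": 0b0101, "SetNumHorizontal": 0b0110,
    "GetFrame": 0b0000,
}


def rom_word(instruction, parameter):
    """Pack a 4-bit opcode and an 8-bit parameter into one 12-bit ROM
    word, returned as the binary string used in the .mem file."""
    if not 0 <= parameter <= 255:
        raise ValueError("parameter must fit in 8 bits")
    return format((OPCODES[instruction] << 8) | parameter, "012b")
```

For example, rom_word("SetContrast", 104) reproduces the SetContrast entry in the sample .mem file shown later in this section.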

Figure 2: QuickCam Image Window

Before you can modify these instructions, you must make decisions about the image that you want to gather. Figure 2 shows the QuickCam image window. There are three main image options that you need to consider:
- Image Size - The camera supports images up to 324x243.
- Color Depth - The camera supports both 4-bit and 6-bit color.
- Sampling Mode - The camera supports sampling at 1:1, 2:1 and 4:1 ratios. A 2:1 ratio implies that the camera samples every other pixel in both x and y directions. This is an important technique if you are trying to set up a wide field of view but want a reduced image size.

Once you have made your decisions, you can use Tables 5 and 6 to help you quickly calculate and encode the ROM contents. The Parameter Calculation Worksheet (Table 5) contains the formula and valid range for each instruction supported by the camera interface. The ROM Configuration Worksheet (Table 6) is set up so that you can record your ROM setup. Once you have finished calculating the necessary values, you can update the ROM’s configuration.
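The worksheet arithmetic can also be sketched in code. The helper below is a hypothetical convenience, not part of the project; it applies the SetNumVertical and SetNumHorizontal formulas from Table 5 for a chosen width, height, color depth, and sampling mode, and also yields the num_trans constant (SetNumVertical x SetNumHorizontal) needed later:

```python
def camera_params(width, height, depth, sampling):
    """Apply the Table 5 formulas. depth is 4 or 6 (bits per pixel);
    sampling is 1, 2, or 4 (the n in an n:1 ratio). Returns the
    SetNumVertical parameter, the SetNumHorizontal parameter, and the
    num_trans transfer count."""
    if depth not in (4, 6) or sampling not in (1, 2, 4):
        raise ValueError("unsupported depth or sampling mode")
    num_vertical = height // sampling
    # The horizontal divisor starts at 2 for 4-bit 1:1 and doubles with
    # each sampling step, and doubles again for 6-bit color.
    divisor = 2 * sampling * (2 if depth == 6 else 1)
    num_horizontal = width // divisor
    return num_vertical, num_horizontal, num_vertical * num_horizontal
```

For the sample 76x60x4 configuration used throughout this section (a 304x240 window at 4:1, 4-bit), this yields SetNumVertical 60, SetNumHorizontal 38, and num_trans 2280, matching the sample .mem file and the constant value given earlier.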


[Figure 2 shows the image window annotated with Width, Height, and the (Left, Top) corner; the labeled coordinates run from (13,0) to (336,243).]

Parameter Calculation Worksheet

Instruction        Formula                       Valid Range
Reset              -                             -
SetContrast        -                             1-255
SetOffset          -                             1-254
SetExposure        -                             1-255
SetTop             Top                           1-243
SetLeft            Left/2                        1-168
SetNumVertical     Height       (1:1)            1-243
                   Height*1/2   (2:1)
                   Height*1/4   (4:1)
SetNumHorizontal   Width*1/2    (4-bit, 1:1)     1-162
                   Width*1/4    (6-bit, 1:1)
                   Width*1/4    (4-bit, 2:1)
                   Width*1/8    (6-bit, 2:1)
                   Width*1/8    (4-bit, 4:1)
                   Width*1/16   (6-bit, 4:1)
GetFrame           0            (4-bit, 1:1)     0,2,4,6,8,10
                   2            (6-bit, 1:1)
                   4            (4-bit, 2:1)
                   6            (6-bit, 2:1)
                   8            (4-bit, 4:1)
                   10           (6-bit, 4:1)

Table 5: Parameter Calculation Worksheet

ROM Configuration Worksheet

Instruction        Instruction (Binary)   Parameter (Decimal)   Parameter (Binary)
Reset              1110
SetContrast        1001
SetOffset          1010
SetExposure        0010
SetTop             0011
SetLeft            0100
SetNumVertical     0101
SetNumHorizontal   0110
GetFrame           0000

Table 6: ROM Configuration Worksheet

To update the configuration of the ROM:
1. Select the ROM in the schematic.
2. Open up its configuration window.
3. Open up the .mem file, and replace the contents of the ROM with the values in your ROM configuration worksheet.

Below is a sample .mem file created for a 76x60x4 image:

;; Header Section
RADIX 10
DEPTH 16
WIDTH 12
DEFAULT 0
;; Data Section
; Specifies data to be stored in different addresses
; e.g., DATA 0:A, 1:0
RADIX 2
DATA 111000000000, ;Reset
     100101101000, ;SetContrast 104
     110010000000, ;SetOffset 128
     001010000111, ;SetExposure 135
     001100000001, ;SetTop 1
     010000000001, ;SetLeft/2 2
     010100111100, ;SetNumV 60
     011000100110, ;SetNumH 38
     000000001000  ;SendVideoFrame 1:4,4bit
; end of LogiBLOX memfile
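Hand-edited 12-bit words are easy to get wrong, so it can help to decode each entry back into its opcode and parameter and compare against your worksheet. This checker is a hypothetical aid, not part of the project; the opcode nibbles follow the ROM Configuration Worksheet (Table 6):

```python
# Opcode nibble -> instruction name, per the ROM Configuration Worksheet.
NAMES = {
    0b1110: "Reset", 0b1001: "SetContrast", 0b1010: "SetOffset",
    0b0010: "SetExposure", 0b0011: "SetTop", 0b0100: "SetLeft",
    0b0101: "SetNumVertical", 0b0110: "SetNumHorizontal",
    0b0000: "GetFrame",
}


def decode_word(bits):
    """Split a 12-character binary string from the .mem file into its
    (instruction, parameter) pair."""
    if len(bits) != 12 or set(bits) - {"0", "1"}:
        raise ValueError("expected a 12-bit binary string")
    return NAMES[int(bits[:4], 2)], int(bits[4:], 2)
```

Running the sample file's SetContrast entry through decode_word, for instance, should recover the instruction name and the decimal parameter 104 noted in its comment.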

When you alter the image size, depth or sampling mode, you also have to change the value of the LOGIBLOX constant connected to the intrface.v module. This constant tells the intrface module the number of transfers necessary to download a complete image.

To configure the num_trans constant:
1. Select the constant in the schematic.
2. Open up its configuration window.
3. Multiply the SetNumVertical and SetNumHorizontal parameters.
4. Set the constant to this value.


Controlling the Interface

The new interface provides a simple one bit Command control. If this bit is low (See Figure 3) and Start is asserted, the interface will initialize the QuickCam and assert Hold until the process is completed. If Command is high (See Figure 4) and Start is asserted, the new interface will synchronously stream data out on a Valid clock, and assert Hold until the process is finished.
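The handshake described above can be modeled as a tiny state machine. The toy model below is not the actual Verilog, only an illustration of the protocol: a Start pulse with Command low runs initialization with Hold asserted throughout, while Command high streams data items, pulsing Valid once per item, until Hold drops.

```python
def run_command(cmd, data=()):
    """Toy model of the interface handshake. cmd=0 initializes the
    camera; cmd=1 streams len(data) items, pulsing Valid with each.
    Returns the sequence of (hold, valid, item) tuples observed from
    the Start pulse until Hold deasserts -- illustrative only."""
    trace = []
    if cmd == 0:
        # Initialize: Hold stays high while the camera is configured;
        # Valid never pulses.
        trace.append((1, 0, None))
    else:
        # Picture: each transfer raises Valid for one clock while Hold
        # remains asserted.
        for item in data:
            trace.append((1, 1, item))
    trace.append((0, 0, None))  # Hold drops when the operation completes
    return trace
```

The two traces correspond to the "Initialize" and "Picture" timing diagrams in Figures 3 and 4.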

Figure 3: Timing Diagram for "Initialize" Command

Figure 4: Timing Diagram for "Picture" Command



Appendix G – ATU User’s Manual

Using the Autonomous Tracking Unit should be a simple procedure. Once you have set up the necessary hardware and software components, you should be ready to roll.

Hardware Setup
Three major hardware components are involved: the mounted QuickCam, the prototype board, and a 5.0V D.C. power supply. The instructions below illustrate how to set up these components.

1) Turn off the power supply.
2) Connect the QuickCam's parallel (DB25) port to the corresponding port on the prototype board.
3) Connect the QuickCam's keyboard port (DIN 5) to the corresponding port on the prototype board.
4) Connect the Xchecker cable to the prototype board. This connection is the same as for the Demo Board.
5) The servo cable consists of two different connectors. The prototype side (wrapped in black tape) has four connected wires and two hanging wires (blocked with green wire). To connect, match up the four wires according to Figure 1, with the green plug hanging off the board.
6) The camera side has six exposed wires. To connect, match the yellow wire to the yellow marker on the servo connectors and match the green wire to the green marker on the servo connectors.
7) Connect the red (hot) and black (ground) leads of the prototype to the power supply.
8) Recheck all connections before continuing.

Software Setup
The final project is archived in "yetagain.zip". This project must be implemented after it has been restored into a Xilinx project.

Using the Unit
Once you have completed hardware and software setup, two steps remain before the unit will become operational.

1) Turn the power supply on.
2) Use the Xilinx hardware debugger to download the bit stream to the Xilinx FPGA on the prototype board.

The unit should now be operational.


Figure 1: Prototype Board Layout
[Figure shows: the Xilinx FPGA and two SRAMs; the DB25 parallel and DIN5 keyboard ports; the servo port, servo connector, and servo cable with its hanging green plug; the Xilinx port pins (RST hanging, INIT, PROG, DIN, D/P, CCLK missing, GROUND, VCC); and the red (hot) and black (ground) leads to the power supply, toward the bottom edge of the board.]