Week 05: Camera

MIT Fall 2024


In Week 5 we're going to do quite a few things. First we'll do some reflection on timing in digital circuits. Then we'll look into how to render more complicated images and files in our graphics pipeline; this will involve using BRAMs, but doing so intelligently so as to not waste space. Then we'll write some logic to interface with a camera. (The camera gives us pixel information along with vsync and hsync signals, and we need to use those collectively to reassemble the pixel values along with their corresponding hcount and vcount.) Then we'll integrate that camera into our video pipeline and do some interesting color masking work, which will let us track features based on color. Then we'll write a center-of-mass module to track those features and display a few entities to demonstrate that tracking. Finally, we'll make sure our system is properly pipelined and timed. Exciting.
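To make the pixel-reconstruction step concrete, here is a behavioral sketch (in Python, not HDL) of recovering hcount/vcount from sync-style signals. The signal names, polarities, and edge conventions below are assumptions for illustration; the actual camera's datasheet and the lab spec govern the real protocol.

```python
def reconstruct(samples):
    """Behavioral model of a pixel reconstructor.

    samples: iterable of (valid, hsync, vsync, pixel) tuples, one per
    pixel clock. Yields (hcount, vcount, pixel) for each valid pixel.
    Assumed conventions: vsync pulses at frame start and resets both
    counters; a falling edge of hsync marks the end of a row.
    """
    hcount = vcount = 0
    prev_hsync = False
    for valid, hsync, vsync, pixel in samples:
        if vsync:
            # New frame: reset both counters.
            hcount = vcount = 0
        elif prev_hsync and not hsync:
            # Falling edge of hsync: advance to the next row.
            vcount += 1
            hcount = 0
        if valid:
            yield (hcount, vcount, pixel)
            hcount += 1
        prev_hsync = hsync
```

In hardware this becomes two counters plus an edge detector on hsync; the generator above just makes the counting rules easy to check against a testbench-style stimulus.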

If you're looking for things to do before/outside of the lab space, the clocking exercise and the pixel reconstructor can both be done anywhere, though your pixel reconstructor will get used on the board. The first half of popcat can also be testbenched to verify that it works. Checkoff 2 (the Camera I page) gives the specification for the center-of-mass module you need to write; you can develop that remotely as well to make the in-lab parts go more smoothly.
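As a rough behavioral model of what the center-of-mass module computes (the precise interface and port names come from the Checkoff 2 spec, not from this sketch): accumulate running sums of hcount and vcount for every pixel that passes the color mask during a frame, then divide by the count once at frame end. In hardware that maps to a few adders plus a divider at end-of-frame.

```python
def center_of_mass(pixels):
    """Behavioral model of per-frame center-of-mass tracking.

    pixels: iterable of (hcount, vcount, in_mask) tuples, one per
    valid pixel in a frame. Returns (x_com, y_com) using integer
    division, or None if no pixel matched the mask.
    """
    x_sum = y_sum = count = 0
    for hcount, vcount, in_mask in pixels:
        if in_mask:
            x_sum += hcount
            y_sum += vcount
            count += 1
    if count == 0:
        return None  # nothing matched; hardware would deassert valid
    return (x_sum // count, y_sum // count)
```

Running this over a synthetic frame (say, four masked pixels at the corners of a square) is an easy way to sanity-check a testbench for the real module before you get to lab.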

I slightly updated the build.tcl file you should be using for this lab. Use this variant here.