Streaming Pixel Interface
What Is a Streaming Pixel Interface?
In hardware, processing an entire frame of video at one time has a high cost in memory and area. To save resources, serial processing is preferable in HDL designs. Vision HDL Toolbox™ blocks and System objects operate on a pixel, line, or neighborhood rather than a frame. The blocks and objects accept and generate video data as a serial stream of pixel data and control signals. The control signals indicate the relative location of each pixel within the image or video frame. The protocol mimics the timing of a video system, including inactive intervals between frames. Each block or object operates without full knowledge of the image format, and can tolerate imperfect timing of lines and frames.
All Vision HDL Toolbox blocks and System objects support single pixel streaming (1 pixel per cycle). Some blocks and System objects also support multipixel streaming (2, 4, or 8 pixels per cycle) for high-rate or high-resolution video. Multipixel streaming uses more hardware resources to support higher video resolutions at the same hardware clock rate as a lower-resolution video. HDL code generation for multipixel streaming is not supported for System objects. Use the equivalent blocks to generate HDL code for multipixel algorithms.
How Does a Streaming Pixel Interface Work?
Video capture systems scan video signals from left to right and from top to bottom. As these systems scan, they generate inactive intervals between lines and frames of active video.
The horizontal blanking interval is made up of the inactive cycles between the end of one line and the beginning of the next line. This interval is often split into two parts: the front porch and the back porch. These terms come from the synchronization pulse between lines in analog video waveforms. The front porch is the number of samples between the end of the active line and the synchronization pulse. The back porch is the number of samples between the synchronization pulse and the start of the active line.
The vertical blanking interval is made up of the inactive cycles between the ending active line of one frame and the starting active line of the next frame.
The scanning pattern requires start and end signals for both horizontal and vertical directions. The Vision HDL Toolbox streaming pixel protocol includes the blanking intervals, and allows you to configure the size of the active frame and the inactive regions around it.
In the frame diagram, the blue shaded area to the left and right of the active frame indicates the horizontal blanking interval. The orange shaded area above and below the active frame indicates the vertical blanking interval.
Why Use a Streaming Pixel Interface?
Format Independence
The blocks and objects using this interface do not need a configuration option for the exact image size or the size of the inactive regions. In addition, if you change the image format for your design, you do not need to update each block or object. Instead, update the image parameters once at the serialization step. Some blocks and objects still require a line buffer size parameter to allocate memory resources.
By isolating the image format details, you can develop a design using a small image for faster simulation. Then, once the design is correct, update to the actual image size.
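For instance, only the serialization parameters change when you move from a small simulation frame to the real format. This is a minimal MATLAB sketch, assuming the '240p' and '1080p' presets of the visionhdl.FrameToPixels object and a synthetic grayscale test frame; the per-pixel loop never references the frame size.

```matlab
% Sketch: only the serializer changes between simulation and the final format.
frm2pixSim  = visionhdl.FrameToPixels('VideoFormat','240p');   % small frame for fast simulation
frm2pixFull = visionhdl.FrameToPixels('VideoFormat','1080p');  % switch here for the real format

frm = uint8(randi([0 255],240,320));          % 240p active frame (lines-by-pixels)
[pixel,ctrl] = frm2pixSim(frm);               % serial pixel stream + control structures

for p = 1:numel(ctrl)                         % format-independent per-pixel processing
    if ctrl(p).valid
        % ... operate on pixel(p,:) here ...
    end
end
```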
Error Tolerance
Video can come from various sources such as cameras, tape storage, digital storage, or switching and insertion gear. These sources can introduce timing problems. Human vision cannot detect small variations in video signals, so the timing of a video system does not need to be perfect. Therefore, video processing blocks must tolerate variable timing of lines and frames.
By using a streaming pixel interface with control signals, each Vision HDL Toolbox block or object starts computation on a fresh segment of pixels at the start-of-line or start-of-frame signal. The computation occurs whether or not the block or object receives the end signal for the previous segment.
The protocol tolerates minor timing errors. If the number of valid and invalid cycles between start signals varies, the blocks or objects continue to operate correctly. Some Vision HDL Toolbox blocks and objects require minimum horizontal blanking regions to accommodate memory buffer operations.
Pixel Stream Conversion Using Blocks and System Objects
In Simulink®, use the Frame To Pixels block to convert framed video data to a stream of pixels and control signals that conform to this protocol. The control signals are grouped in a nonvirtual bus data type called pixelcontrol. You can configure the block to return a pixel stream with 1, 2, 4, or 8 pixels per cycle.
In MATLAB®, use the visionhdl.FrameToPixels object to convert framed video data to a stream of pixels and control signals that conform to this protocol. The control signals are grouped in a structure data type. You can configure the object to create a pixel stream with 1, 2, 4, or 8 pixels per cycle.
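The following sketch shows the basic MATLAB round trip, assuming default single pixel streaming, the '240p' preset, and the companion visionhdl.PixelsToFrame object; the frame contents are illustrative.

```matlab
% Sketch: serialize one frame, then rebuild it, assuming the '240p' preset.
frm2pix = visionhdl.FrameToPixels('VideoFormat','240p');
pix2frm = visionhdl.PixelsToFrame('VideoFormat','240p');

frmIn = uint8(randi([0 255],240,320));     % one grayscale input frame

[pixel,ctrl] = frm2pix(frmIn);             % pixel values + pixelcontrol structures
% A pixel-streaming algorithm would process pixel/ctrl element by element here.
[frmOut,frmValid] = pix2frm(pixel,ctrl);   % deserialize the stream back to a frame

assert(frmValid && isequal(frmIn,frmOut))  % the stream round trip preserves the image
```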
If your input video is already in a serial format, you can design your own logic to generate pixelcontrol control signals from your existing serial control scheme. For an example, see Integrate Vision HDL Blocks Into Camera Link System.
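As a sketch of that approach, flags derived from your own serial control scheme can be packed into the control structure with the pixelcontrolstruct function, assuming it accepts the five flags in the order hStart, hEnd, vStart, vEnd, valid; the flag names below are hypothetical placeholders for your own signals.

```matlab
% Hypothetical per-cycle flags derived from an existing serial control scheme.
firstPixelOfLine = true;
lastPixelOfLine  = false;
firstLineOfFrame = true;
lastLineOfFrame  = false;
pixelIsActive    = true;

% Pack the flags into a pixel control structure for one cycle.
ctrl = pixelcontrolstruct( ...
    firstPixelOfLine && pixelIsActive, ...                      % hStart
    lastPixelOfLine  && pixelIsActive, ...                      % hEnd
    firstLineOfFrame && firstPixelOfLine && pixelIsActive, ...  % vStart
    lastLineOfFrame  && lastPixelOfLine  && pixelIsActive, ...  % vEnd
    pixelIsActive);                                             % valid
```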
Supported Pixel Data Types
Vision HDL Toolbox blocks and objects include ports or arguments for streaming pixel data. Each block and object supports one or more pixel formats. The supported formats vary depending on the operation the block or object performs. This table details common video formats supported by Vision HDL Toolbox.
| Type of Video | Pixel Format |
| --- | --- |
| Binary | Each pixel is represented by a single boolean or logical value. Used for true black-and-white video. |
| Grayscale | Each pixel is represented by luma, which is the gamma-corrected luminance value. This pixel is a single unsigned integer or fixed-point value. |
| Color | Each pixel is represented by 2 to 4 unsigned integer or fixed-point values representing the color components of the pixel. Vision HDL Toolbox blocks and objects use gamma-corrected color spaces, such as R'G'B' and Y'CbCr. To process multicomponent streams with blocks that do not support multicomponent input, replicate the block for each component. To set up multipixel streaming for color video, configure the Frame To Pixels block to return a multicomponent and multipixel stream. |
Vision HDL Toolbox blocks have an input or output port, pixel, for the pixel data. Vision HDL Toolbox System objects expect or return an argument representing the pixel data. This table describes the format of the pixel data.
| Port or Argument | Description | Data Type |
| --- | --- | --- |
| pixel | Scalar that represents one pixel value, or a vector of 2, 4, or 8 pixel values for multipixel streaming. You can simulate System objects with a multipixel streaming interface, but you cannot generate HDL code for System objects that use multipixel streams. To generate HDL code for multipixel algorithms, use the equivalent Simulink blocks. | Supported data types can include boolean or logical, unsigned integer, and fixed-point values, depending on the block or object. The software supports double and single data types for simulation, but not for HDL code generation. |
Note
The blocks in this table support multipixel input, but not multicomponent pixels. The table shows the number of input pixels each block supports.
| Block | Number of Pixels |
| --- | --- |
|  | 2, 4, or 8 |
|  | 2, 4, or 8 |
|  | 2, 4, or 8 |
|  | 2, 4, or 8 |
|  | 2, 4, or 8 |
|  | 2, 4, or 8 |
|  | 2, 4, or 8 |
|  | 2, 4, or 8 |
| Binary morphology blocks | 4 or 8 |
These blocks support multipixel-multicomponent pixel streams. The table shows the number of pixels and components each block supports.
| Block | Number of Pixels | Number of Components |
| --- | --- | --- |
|  | 2, 4, or 8 | 1, 3, or 4 |
|  | 2, 4, or 8 | 3 |
|  | 2, 4, or 8 | 3 (output only) |
|  | 2, 4, or 8 | 1, 3, or 4 |
|  | 2, 4, or 8 | 1, 3, or 4 |
Streaming Pixel Control Signals
Vision HDL Toolbox blocks and objects include ports or arguments for control signals relating to each pixel. These five control signals indicate the validity of a pixel and its location in the frame. For multipixel streaming, each vector of pixel values has one set of control signals.
In Simulink, the control signal port is a nonvirtual bus data type called pixelcontrol. In MATLAB, the control signal argument is a structure. Both contain the same five signals: hStart, hEnd, vStart, vEnd, and valid.
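For example, the structure fields can be read directly when walking a serialized stream in MATLAB. This sketch assumes the '240p' preset of visionhdl.FrameToPixels and only demonstrates field access.

```matlab
% Sketch: read the control fields while walking a serialized stream.
frm2pix = visionhdl.FrameToPixels('VideoFormat','240p');
[pixel,ctrl] = frm2pix(uint8(randi([0 255],240,320)));

validCount = 0;
for p = 1:numel(ctrl)
    if ctrl(p).vStart
        fprintf('Start of frame at cycle %d\n',p);
    end
    validCount = validCount + ctrl(p).valid;   % hStart, hEnd, and vEnd are read the same way
end
fprintf('%d active pixels streamed\n',validCount);   % 320 x 240 = 76800
```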
Sample Time
Because the Frame To Pixels block creates a serial stream of the pixels of each input frame, the sample time of your video source must match the total number of pixels in the frame. The total number of pixels is Total pixels per line × Total video lines, so set the sample time to this value.
If your frame size is large, you may reach the fixed-step solver step size limit for sample times in Simulink, and receive an error like this:
The computed fixed step size (1.0) is 1000000.0 times smaller than all the discrete sample times in the model.
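For instance, the sample time calculation for a 1080p source looks like this, assuming the common 1080p totals of 2200 pixels per line and 1125 lines per frame; your video timing may use different totals.

```matlab
% Sketch: source sample time for a 1080p stream (common 1080p totals assumed).
totalPixelsPerLine = 2200;
totalVideoLines    = 1125;
sourceSampleTime   = totalPixelsPerLine * totalVideoLines   % 2475000 samples per frame
```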
Timing Diagram of Single Pixel Serial Interface
To illustrate the streaming pixel protocol, this example converts a frame to a sequence of control and data signals. Consider a 2-by-3 pixel image. To model the blanking intervals, configure the serialized image to include inactive pixels in these areas around the active image:
1-pixel-wide back porch
2-pixel-wide front porch
1 line before the first active line
1 line after the last active line
You can configure the dimensions of the active and inactive regions with the Frame To Pixels block or the visionhdl.FrameToPixels object.
In the figure, the active image area is in the dashed rectangle, and the inactive pixels surround it. The pixels are labeled with their grayscale values.
The block or object serializes the image from left to right, one line at a time. The timing diagram shows the control signals and pixel data that correspond to this image, which is the serial output of the Frame To Pixels block for this frame, configured for single-pixel streaming.
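As a companion to the timing diagram, this MATLAB sketch serializes a 2-by-3 frame with the blanking settings listed above, using the custom video format of visionhdl.FrameToPixels; the grayscale values are illustrative.

```matlab
% Sketch: serialize the 2-by-3 frame with the blanking settings listed above.
frm2pix = visionhdl.FrameToPixels( ...
    'VideoFormat','custom', ...
    'ActivePixelsPerLine',3, ...
    'ActiveVideoLines',2, ...
    'TotalPixelsPerLine',6, ...   % 1 back porch + 3 active + 2 front porch
    'TotalVideoLines',4, ...      % 1 blank line + 2 active lines + 1 blank line
    'StartingActiveLine',2, ...
    'FrontPorch',2);

frm = uint8([30 60 90; 120 150 180]);   % illustrative grayscale values
[pixel,ctrl] = frm2pix(frm);
% pixel is a 24-element stream (6 pixels per line x 4 lines); ctrl(k).valid
% is true only for the 6 active pixels.
```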
Timing Diagram of Multipixel Serial Interface
This example converts a frame to a multipixel stream with 4 pixels per cycle and corresponding control signals. Consider a 64-pixel-wide frame with these inactive areas around the active image:
4-pixel-wide back porch
4-pixel-wide front porch
4 lines before the first active line
4 lines after the last active line
The Frame To Pixels block configured for multipixel streaming returns pixel vectors formed from the pixels of each line in the frame, from left to right. This diagram shows the top-left corner of the frame. The gray pixels show the active area of the frame, and the zero-value pixels represent blanking pixels. The label on each active pixel represents the location of the pixel in the frame. The highlighted boxes show the sets of pixels streamed on one cycle. The pixels in the inactive region are also streamed four at a time. The gray box shows the four blanking pixels streamed the cycle before the start of the active frame. The blue box shows the four pixel values streamed on the first valid cycle of the frame, and the orange box shows the four pixel values streamed on the second valid cycle of the frame. The green box shows the first four pixels of the next active line.
This waveform shows the multipixel streaming data and control signals for the first line of the same frame, streamed with 4 pixels per cycle. The pixelcontrol signals that apply to each set of four pixel values are shown below the data signals. Because the vector has only one valid signal, the pixels in the vector are either all valid or all invalid. The hStart and vStart signals apply to the pixel with the lowest index in the vector. The hEnd and vEnd signals apply to the pixel with the highest index in the vector.
Prior to the time period shown, the initial vertical blanking pixels are streamed four at a time, with all control signals set to false. This waveform shows the pixel stream of the first line of the image. The gray, blue, and orange boxes correspond to the highlighted areas of the frame diagram. After the first line is complete, the stream has two cycles of horizontal blanking that contain 8 invalid pixels (front and back porch). Then, the waveform shows the next line in the stream, starting with the green box.
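A corresponding MATLAB sketch for 4-pixel streaming of the 64-pixel-wide frame is shown below; the active line count (48) is an assumed value, and the blanking settings match the list above.

```matlab
% Sketch: 4 pixels per cycle for a 64-pixel-wide frame. The 48 active lines
% are an assumption; the blanking values match the list in this section.
frm2pix = visionhdl.FrameToPixels( ...
    'NumPixels',4, ...
    'VideoFormat','custom', ...
    'ActivePixelsPerLine',64, ...
    'ActiveVideoLines',48, ...
    'TotalPixelsPerLine',72, ...   % 4 back porch + 64 active + 4 front porch
    'TotalVideoLines',56, ...      % 4 blank lines + 48 active + 4 blank lines
    'StartingActiveLine',5, ...
    'FrontPorch',4);

frm = uint8(randi([0 255],48,64));
[pixel4,ctrl] = frm2pix(frm);
% Each row of pixel4 holds the 4 pixel values streamed on one cycle, and
% ctrl(k) holds the single set of control signals for that row.
```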
For an example model that uses multipixel streaming, see Filter Multipixel Video Streams.