Adjusting sensor clock phase
- 1 Elphel cameras architecture - separate computer and sensor boards
- 2 Inter-board cable
- 3 Phase shift of the sensor clock input
- 4 Effect of different cable lengths
- 5 On-chip DCM (Digital Clock Management) for sensor clock phase adjustment
- 6 Software interface for adjusting the sensor clock phase (firmware 8.0.x)
- 7 Old instructions - valid for 7.1.x firmware
Elphel cameras architecture - separate computer and sensor boards
Elphel cameras are designed to support multiple different sensors, including those that were not yet available when a particular camera model was first released. That is possible because the camera functionality is split between the sensor-independent computer board (see 10353) and the sensor front end. For CMOS image sensors such as the 10338 this board is very simple - basically just the image sensor itself, the connector for the cable to the camera main board and the power conditioning circuitry. In other cameras (like Model 363) the high-resolution CCD-based sensor front end is much more complex (10347+10342), but they all interface the main board through the same 30-conductor flex cable, which on the 10353 side is connected directly to the FPGA pins (all but the power supply/ground pins). That makes the interface universal: the functions of the FPGA pins can be changed with just new code, no hardware changes are needed.
The inter-board connection uses flex cables (available in different lengths) that provide freedom in the mutual orientation of the main board and the sensor front end, allowing the smaller sensor board to fit into tight spaces. But that 30-conductor cable does not have enough conductors to make nice transmission lines - there are not enough return wires, and the number of ground+power lines is much smaller than the number of signal lines. That is not a big issue for most camera applications where the cable is really short - less than 2 inches (50mm) long - but the signals (for the 5MPix sensor the 12-bit output data runs at 96 Mpix/s, and the clock lines run at 96MHz in both directions) are driven with reduced drive strength to reduce EMI. Fortunately, both the FPGA and the sensor allow control over the drive strength. With the low output currents on the pins it takes some time to recharge the parasitic capacitance of the cable wires, and the signals look far from a "digital" rectangular shape - most of the pixel period is spent just transitioning from "0" to "1" or back, so the data on the other end has to be sampled at a precise moment during each pixel period.
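A back-of-the-envelope slew-rate model illustrates why a weakly driven line spends most of the pixel period in transition. All the numbers below (per-line cable capacitance, effective drive current, voltage swing) are assumptions for illustration, not values from the camera or sensor datasheets:

```python
# Illustrative estimate: time for a current-limited output to slew the
# parasitic capacitance of one cable conductor between logic levels.
# c_pf, i_ma and swing_v below are ASSUMED values, not datasheet figures.
def transition_time_ns(c_pf, i_ma, swing_v):
    """Time (ns) to slew swing_v volts into c_pf picofarads at i_ma milliamps."""
    return c_pf * 1e-12 * swing_v / (i_ma * 1e-3) * 1e9

pixel_period_ns = 1e9 / 96e6  # 96 MHz pixel clock -> ~10.4 ns per pixel
t = transition_time_ns(c_pf=8.0, i_ma=2.0, swing_v=2.5)  # assumed numbers
print(f"pixel period: {pixel_period_ns:.1f} ns, transition: {t:.1f} ns")
```

With these assumed values the transition takes on the order of the whole pixel period, which is exactly the regime the text describes - the signal never settles long before the next edge, so the sampling moment matters.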
Phase shift of the sensor clock input
In the camera this is achieved by adding a controlled phase shift between the clock that synchronizes the sensor-related portion of the FPGA and the clock signal that is sent to the sensor. The sensor output signals have a certain delay from that clock (unfortunately, in the 5MPix sensor that delay is very different for the pixel data and the sync signals), so by varying the phase of the clock sent to the sensor it is possible to adjust the phase of the sensor data with respect to the FPGA pixel clock. The sensor pixel output clock itself is not used - it is more convenient when the FPGA clock does not depend on the state of the sensor. But that sensor output clock is connected to the FPGA and is sampled at different phases of the FPGA clock - that data can (and will) be used for automatic phase adjustment, but the current software does not support that yet.
Effect of different cable lengths
For the standard camera configuration the phase shift is programmed as a predefined constant, but that does not work when the cable length differs from the original one. If you try the camera with a different cable length you may see an image that looks monochrome or purple - when the pixels are sampled at the wrong time, red and green (also green and blue) pixels can swap. You may also see horizontal lines when the sampling moment is close to the pixel-to-pixel transition: in some lines the pixels were sampled correctly, in others the colors are swapped (and remember - each pixel is transferred over 12 individual data lines that can have slightly different delays relative to the clock). For reasonable cable lengths (tested with 8" / 200mm) it is possible to re-adjust the phase shift, sample the pixels in the middle of the pixel window and so restore nice images.
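A rough estimate shows how much the sampling point moves with cable length. The clock travels out to the sensor and the data travels back, so the round-trip delay grows with twice the length difference; the propagation speed used below (~6 ns/m) is an assumed typical figure for a flex cable, not a measured value:

```python
# Rough estimate of the extra round-trip delay a longer cable adds,
# expressed as a fraction of the 96 MHz pixel clock period.
PROP_NS_PER_M = 6.0           # ASSUMED ~6 ns/m propagation delay in flex cable
PIXEL_PERIOD_NS = 1e9 / 96e6  # ~10.4 ns at 96 MHz

def extra_round_trip_ns(new_len_m, old_len_m):
    """Added delay: clock out plus data back, so twice the length difference."""
    return 2 * (new_len_m - old_len_m) * PROP_NS_PER_M

d = extra_round_trip_ns(0.200, 0.050)  # 8" (200mm) cable vs a standard 50mm one
print(f"extra delay: {d:.2f} ns = {360 * d / PIXEL_PERIOD_NS:.0f} deg of pixel clock")
```

Even under these assumptions the 200mm cable shifts the data by a large fraction of the pixel period, which is why the predefined phase constant no longer lands in the middle of the pixel window.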
On-chip DCM (Digital Clock Management) for sensor clock phase adjustment
Sensor phase control in the camera is achieved with one of the FPGA on-chip DCM (digital clock management) blocks, combining fine phase adjustment with a 4-to-1 multiplexer selecting one of four outputs shifted by 90 degrees from each other. The fine shift step is not precisely known - for these chips it is specified as 20-40 ps/step, while the guaranteed number of phase shift steps (for a 96MHz clock) is +/-140 (so more than +/-90 degrees even in the worst case).
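The arithmetic above can be sketched in a few lines - a coarse quadrant selection plus a fine shift whose step size is only known to lie in the 20-40 ps range (the function name and structure here are illustrative, not taken from the camera code):

```python
# Sketch of the DCM phase arithmetic: quadrant multiplexer (0-3, 90 deg
# each) plus a fine shift of 20-40 ps per step at a 96 MHz pixel clock.
PIXEL_PERIOD_PS = 1e12 / 96e6  # ~10417 ps

def phase_degrees(quadrant, fine_steps, ps_per_step):
    """Total clock phase shift in degrees for a given DCM setting."""
    return quadrant * 90 + 360 * fine_steps * ps_per_step / PIXEL_PERIOD_PS

# Worst case (20 ps/step): the guaranteed +/-140 fine steps still cover
# more than +/-90 degrees of the 96 MHz clock, as the text states.
print(phase_degrees(0, 140, 20))  # ~96.8 degrees
```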
Software interface for adjusting the sensor clock phase (firmware 8.0.x)
Open the following URL in the browser (replace 192.168.0.9 if the camera has a different IP): http://192.168.0.9/parsedit.php?embed=0.15&title=Parameters+for+groups:+phase&SENSOR_PHASE&SENSOR_REGS7&SENSOR_REGS7__0310&SENSOR_REGS7__0307&WOI_HEIGHT&refresh
That page combines several controls:
- SENSOR_PHASE - phase shift between the FPGA pixel clock and the sensor pixel clock. Normal adjustment range is 0xff80 (-64) to 0x0080 (+64)
- SENSOR_REGS7 - sensor register #7 that, among other things, controls the drive strengths of the sensor outputs. It is safer to modify selected bit fields of this register:
- SENSOR_REGS7__0307 (bits 9:7) is the drive strength of the clock output (not used in the current setup)
- SENSOR_REGS7__0310 (bits 12:10) is the drive strength of all other outputs - pixel data, horizontal and vertical sync
- WOI_HEIGHT - window of interest height. It is not related to sensor phase adjustment itself, but it can be used to restore system (sensor-to-FPGA) synchronization after it is lost when adjusting the sensor phase (which involves restarting the sensor). Just reduce/increase it by 0x10 and press the "refresh" button - that should restore the "broken" image caused by losing system sync.
When you change the phase and press "apply" the image will not reflect the new settings (changing the phase requires a complete sensor reset, and the regular image acquisition sequence is aborted), so the following sequence is recommended:
- modify SENSOR_PHASE value
- press "apply"
- change WOI_HEIGHT (alternate between 0x790 and 0x780 for the full frame)
- press "apply"
- monitor the image quality - look for horizontal color stripes
- repeat with different phase value
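The manual sequence above could in principle be scripted. The sketch below assumes (without verification against every firmware build) that parsedit.php accepts name=value assignments in the URL the same way the interactive page submits them; CAMERA_IP and the WOI_HEIGHT toggle values follow the text, and the helper names are illustrative:

```python
# Hypothetical sweep of SENSOR_PHASE over the normal adjustment range,
# toggling WOI_HEIGHT after each change to restore sensor-to-FPGA sync.
# ASSUMPTION: parsedit.php applies name=value pairs passed in the URL.
from urllib.request import urlopen

CAMERA_IP = "192.168.0.9"

def set_param_url(name, value):
    """Build a parsedit.php URL that assigns one camera parameter."""
    return f"http://{CAMERA_IP}/parsedit.php?{name}={value:#x}"

def sweep_phase(phases=range(-64, 65, 8)):
    toggle = [0x790, 0x780]  # full-frame WOI_HEIGHT values from the text
    for i, phase in enumerate(phases):
        urlopen(set_param_url("SENSOR_PHASE", phase & 0xFFFF))
        # changing the phase restarts the sensor; nudging WOI_HEIGHT
        # restores synchronization as described above
        urlopen(set_param_url("WOI_HEIGHT", toggle[i % 2]))
        input(f"phase {phase}: check the image for color stripes, Enter for next")
```

Image quality still has to be judged by eye at each step (look for the horizontal color stripes), since the automatic phase adjustment is not supported by the current software.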
If at some point the camera stops responding to HTTP requests (actually, to requests to PHP scripts) you may restart PHP by issuing
from the telnet session - it kills the web server, which will be restarted together with the instances of PHP.
The default phase settings in the camera are optimized for a short cable (30 to 50mm); for a longer cable they may need adjustment. You may also need to replace the /etc/x353.bit file with one generated with a higher drive strength of the clock line from the FPGA to the sensor (that will also increase the radiated EMI from the camera).
Old instructions - valid for 7.1.x firmware
The phase adjustment value passed to the driver is split into parts: the lower 16 bits represent the (signed short) fine shift in steps (20..40ps each), the next 2 bits select the additional 90-degree shift, and the higher bits are "don't care". In the default configuration bit 18 (0x40000) is set, because only non-zero values change the phase: 0x40000 means "set the 90-degree phase to 0 and the fine phase to zero", while 0x0 means "no phase changes".
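The bit packing described above can be sketched as a pair of helper functions (the function names are illustrative, not part of the driver API; the -60 example matches the default fine shift mentioned below):

```python
# Sketch of the 7.1.x phase-value layout: bits 15:0 signed fine shift,
# bits 17:16 extra 90-degree quadrant, bit 18 (0x40000) marks the value
# as a deliberate setting (a plain 0 means "no phase changes").
def pack_phase(fine_steps, quadrant=0):
    """Pack a signed fine shift and 90-degree quadrant into the driver value."""
    return 0x40000 | (quadrant & 3) << 16 | (fine_steps & 0xFFFF)

def unpack_phase(value):
    """Recover (fine_steps, quadrant) from a packed driver value."""
    fine = value & 0xFFFF
    if fine >= 0x8000:  # sign-extend the 16-bit fine-shift field
        fine -= 0x10000
    return fine, (value >> 16) & 3

print(hex(pack_phase(32)))            # 0x40020
print(unpack_phase(pack_phase(-60)))  # (-60, 0)
```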
The phase control parameter has index 513 (0x201); you may read the current value (see Driver_parameters) with the following command:
Default value is:
That is: no 90-degree shifts, fine shift is -60 steps.
Suppose you would like to set the new phase shift to +32:
1. Write the data to the driver "write" parameter
You may verify it by reading
it will return
2. The driver "looks" at the phase shift value only when there is a request to change the sensor clock frequency. Even though we do not need to change it, we write 96000000 (frequency in Hz) to the appropriate register - it will be replaced with zero (no changes required) when applied:
3. Now - apply changes:
That will cause the sensor to be reset and then re-programmed with the current parameters (the current settings are erased during reset). The sensor has to be reset after any clock changes, as it "assumes" that the clock is stable once the reset signal (input pin or software command) is released. Experimentally we've found that the Micron 5MPix sensor can be confused if the clock frequency changes, and in about 50% of instances it develops a 1-pixel delay in the pixel output (effectively producing wrong colors). Strangely, that applies only to real sensor data - the simulated test patterns in the sensor are not affected.