https://wiki.elphel.com/api.php?action=feedcontributions&user=Pfavr&feedformat=atomElphelWiki - User contributions [en]2024-03-28T12:18:14ZUser contributionsMediaWiki 1.28.0https://wiki.elphel.com/index.php?title=User:Pfavr&diff=1969User:Pfavr2005-10-27T20:59:58Z<p>Pfavr: </p>
<hr />
<div>Hi, I'm an engineer who just bought an Elphel model 333 camera. I hope to use it for some industrial purpose, e.g. quality control in packaging lines.<br />
<br />
I have barely done any VHDL, and am totally new to Verilog.<br />
<br />
My C is pretty decent - in 1994 I did a small multitasking kernel for the MC68000 - and I have been programming since childhood (1985). My main strength lies in realtime and low-level software. Recently I've done some web applications using AJAX, LAMP and XUL (the Mozilla graphical toolkit). Perl, Python, Bash, Awk or whatever is on the menu today :-)<br />
<br />
Oh, yes, I also know a little Digital Signal Processing.</div>Pfavrhttps://wiki.elphel.com/index.php?title=Ccam.cgi&diff=1968Ccam.cgi2005-10-27T20:58:49Z<p>Pfavr: /* Image Quality, Gamma correction, Color Saturation */ tried making it more clear and yes: I understand it now thanks!</p>
<hr />
<div>== overview ==<br />
The interface described below and all the links are for the Model 333 camera; the interface for the 313 is approximately (but not completely) the same.<br />
<br />
ccam.cgi (source - [http://cvs.sourceforge.net/viewcvs.py/elphel/camera333mjpeg/333/apps/ccam/ccam.c?view=markup ccam.c]) is currently the main interface to the camera functionality. It uses the GET method to pass parameters and receive the data back, so you may call it as<br />
<nowiki>http://<camera-ip-address>/admin-bin/ccam.cgi?parameter1=value1&parameter2=value2&...</nowiki><br />
Most parameters are persistent, so if a value is not specified it is assumed to remain the same.<br />
These parameters are approximately related to the pairs of parameters passed from the user space to the main camera driver [http://cvs.sourceforge.net/viewcvs.py/elphel/camera333mjpeg/os/linux/arch/cris/drivers/cc333.c?view=markup cc333.c] (which uses a specific sensor driver - for [http://www.micron.com/products/imaging/products/MT9T001.html Micron sensors] it is [http://cvs.sourceforge.net/viewcvs.py/elphel/camera333mjpeg/os/linux/arch/cris/drivers/mt9x001.c?view=markup mt9x001.c]) with IOCTL. The list of these 63 driver parameters is defined in [http://cvs.sourceforge.net/viewcvs.py/elphel/camera333mjpeg/os/linux/include/asm-cris/c313a.h?view=markup c313a.h] (names starting with "P_"); most of the values come in pairs: desired and actual:<br />
ioctl(devfd, _CCCMD(CCAM_WPARS , P_''name''), ''desired_value''); //set new value of the parameter <br />
''current_actual_value''=ioctl(devfd, _CCCMD(CCAM_RPARS , P_''name'' ), 0); // read current actual value - driver modifies the set value if needed to match the valid range.<br />
<br />
Writing these parameters will not cause immediate action; an additional write needs to be performed to make the driver process the new values. Some parameters can be updated without interrupting the sensor operation and the video stream output if active (i.e. exposure time, panning without window resizing, analog gains, color saturation). Changes to other parameters (such as window size or decimation) will not be applied until the sensor is stopped.<br />
<br />
ioctl(devfd, _CCCMD(CCAM_WPARS , P_UPDATE ), 3); // "on the fly"<br />
 ioctl(devfd, _CCCMD(CCAM_WPARS , P_UPDATE ), 1); // stop the sensor if needed, write new parameters, start the sensor and wait a sensor-dependent number (usually 2) of potentially "bad" frames before sending images through the FPGA compressor.<br />
<br />
It is possible to read the current values of CCAM_RPARS using a special request to ccam.cgi, as an HTML table, a set of JavaScript assignments or XML data.<br />
<br />
There is only one copy of these kernel-space variables - they reflect the current state of a single sensor and a single compressor. <br />
<!-- http://192.168.0.44/admin-bin/ccam.cgi?opt=vhcxy&dv=1&dh=1&iq=70&kga=63&kgm=6&gr=17&gg=14&ggb=14&gb=17&csb=200&csr=200&bit=8&gam=57&pxl=10&pxh=254&e=25&ww=2048&wh=1536&wl=0&wt=0&fps=0&_time=1127851252626 --><br />
<br />
== ccam.cgi parameters ==<br />
Not all of the parameters are applicable to all sensors/cameras; some are obsolete.<br />
=== opt ===<br />
The opt value is an unordered string of characters:<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Character || Description || Working?<br />
|-<br />
| h || Use hardware compression || Y<br />
|-<br />
| c || Treat the sensor as a color one; if absent, skip Bayer color filter processing || Y<br />
|-<br />
| x || Flip (mirror) image horizontally (uses in-sensor capabilities) || Y<br />
|-<br />
| y || Flip (mirror) image vertically (uses in-sensor capabilities) || Y<br />
|-<br />
| p || test pattern (ramp) instead of an image (for Micron sensors - same as "f" below) || Y<br />
|-<br />
| f || test pattern (ramp) generated in FPGA || Y<br />
|-<br />
| b || buffer file || N?<br />
|-<br />
| m || restart exposure after sending || N?<br />
|-<br />
| s || software trigger (for image intensifiers) - trigger if sum of pixels in a line > threshold || N?<br />
|-<br />
| t || external trigger - wait for external trigger input || N?<br />
|-<br />
| v || video mode - currently only means that it is not a reload from memory || Y<br />
|-<br />
| g || use background image || N?<br />
|-<br />
| q || return a quicktime movie clip || Y<br />
|-<br />
| u || updates (some) parameters "on the fly", returns 1x1 pix dummy image || Y<br />
|-<br />
| * || ignore lock file, recover from "camera in use" || Y<br />
|-<br />
|}<br />
<br />
----<br />
=== Frame size and resolution ===<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Key || Value range (3MPix sensor)|| Description || Working? ||"on the fly"? ||Notes<br />
|-<br />
| ww || 2..2048 || Sensor active window width (before decimation) || Y || N ||1<br />
|-<br />
| wh || 2..1536 || Sensor active window height (before decimation) || Y || N ||1<br />
|-<br />
| wl || 0..(2047-ww) || Sensor active window left margin (before decimation) || Y || Y || 2<br />
|-<br />
| wt || 0..(1535-wh) || Sensor active window top margin (before decimation) || Y || Y || 2<br />
|-<br />
| dh || 1..8 || Horizontal decimation (resolution/image size reduction) || Y || N || 3<br />
|-<br />
| dv || 1..8 || Vertical decimation (resolution/image size reduction) || Y || N || 3<br />
|-<br />
|}<br />
<br />
Notes:<br />
# Has to be (or will be truncated to) a multiple of a macroblock (16x16 pixels) after the decimation<br />
# Even value<br />
# Decimation for the MT9T001 3MPix sensor can be any integer from 1 to 8; for most other sensors - only 1/2/4/8. Because of the Bayer color filter mosaic, pixels are decimated in pairs, so decimation "4" means that for each pair of pixels used, six pixels are skipped.<br />
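<br />
As an illustration of notes 1 and 3 above, the effective output image size can be computed as a sketch (this helper is not taken from the camera sources, just an illustration): the window size is divided by the decimation factor and truncated to a multiple of a 16x16 macroblock.<br />

```c
#include <assert.h>

/* Illustrative sketch (not from the camera sources): output image
 * dimension after decimation, truncated down to a multiple of a
 * 16-pixel macroblock edge, per note 1 above. */
static int out_dim(int window, int decimation) {
    return ((window / decimation) / 16) * 16;
}
```

For example, ww=2048 with dh=4 gives a 512-pixel-wide output, and wh=1536 with dv=4 gives 384 lines.<br />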
<br />
=== Exposure controls ===<br />
There are multiple factors that influence image pixel values for the same lighting conditions, one is exposure time.<br />
<br />
Most CMOS image sensors (including the Micron sensors used in Elphel cameras) use an [[Electronic Rolling Shutter]]. <br />
<br />
<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Key || Value range (3MPix sensor)|| Description || Working? ||"on the fly"? ||Notes<br />
|-<br />
| e || 0..600000 || exposure time (0.1 msec step) || Y || Y || 1<br />
|-<br />
| vw || ? || virtual frame width || Y || ? || 2<br />
|-<br />
| vh || ? || virtual frame height || Y || ? || 3<br />
|-<br />
| fps= || xx.xx || desired frame rate || Y || ? || 4<br />
|-<br />
| sclk= || 6..48 || sensor clock (in MHz) || Y || N || 5<br />
|-<br />
| fclk= || 0..127 || FPGA clock (in MHz) || Y || N || 6<br />
|-<br />
| xtra= || 0..?? || extra frame time || Y || N || 7<br />
|-<br />
<br />
|}<br />
Notes:<br />
# The sensor driver will calculate the number of exposure lines and will increase the virtual frame height (vertical blanking) if needed (but currently not the virtual frame width - horizontal blanking). For longer exposures you may want to do that manually or decrease the sensor clock frequency. ''Update'' - for the MT9T001 sensor that might not be needed - I'll fix the driver --[[User:Andrey.filippov|Andrey.filippov]] 12:39, 29 September 2005 (MDT). '''Done in version 6.4.9''' - now the frame time (for MT9T001 only) can be as long as 0xfffff (approximately 1 million) scan lines - nearly a full minute with the full frame and 48MHz clock.--[[User:Andrey.filippov|Andrey.filippov]] 11:27, 11 October 2005 (MDT)<br />
# It is possible to extend line readout time, but is not normally needed/used.<br />
# Explicitly specified virtual frame height - this parameter (if present) overrides the exposure setting. Not normally needed.<br />
# The driver will try to reduce the frame rate by adding vertical blanking - limited by the maximal blanking time<br />
# Sensor clock; may be used with 1.3 and 2 MPix sensors to allow longer exposure times (not needed with the MT9T001 with rev. 6.4.9 or later). It can also make sense to reduce the frequency when the maximal frame rate is not needed, to reduce the sensor noise visible as horizontal lines in early revisions of the MT9T001 sensor. You may read the sensor chip ID (revision/type) from telnet as "hello -IR ba 0" ("hello -IR" will read all the sensor registers). The current FPGA code uses the sensor clock to synchronize the sensor power supply, so sensor power can be lost if this clock is too low; 6MHz is safe to use. On the upper side, 48MHz is the maximal clock frequency for these sensors; the driver limits this value.<br />
# FPGA clock frequency (it drives the compressor and the frame buffer memory). For the model 313 the practical limit was about 95MHz and you could easily change it "on the fly". The Model 333 camera uses DDR SDRAM, and the implemented FPGA interface to DDR SDRAM needs a clock phase adjustment for the memory when you change the frequency. Currently it can be done manually through telnet as "fpcf -phase 0 200". Initial values for the sensor and FPGA clock frequencies might be set in the [http://cvs.sourceforge.net/viewcvs.py/elphel/camera333mjpeg/packages/initscripts/333/fpga?view=markup /etc/init.d/fpga] initialization script of the camera.<br />
# For debugging purposes (probably needed only for the model 313 camera) the frame period might be increased by the specified number of pixel clock periods. It was intended to fine-tune the frame period (which depends on multiple sensor settings) and make sure it is not shorter than the compressor can handle (the 333 compressor is faster).<br />
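<br />
Since the e parameter is specified in 0.1 msec steps (see the table above), converting a desired exposure time to the parameter value is a one-line calculation; the helper below is just an illustration, not part of ccam.cgi.<br />

```c
#include <assert.h>

/* Illustrative helper: convert an exposure time in seconds to the "e"
 * parameter value, which is in 0.1 msec (1/10000 s) steps. */
static long exposure_to_e(double seconds) {
    return (long)(seconds * 10000.0 + 0.5); /* round to nearest step */
}
```

For example, e=25 (as in the commented example URL earlier on this page) corresponds to a 2.5 msec exposure.<br />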
<br />
===Binning ===<br />
<br />
Binning effectively increases the sensor sensitivity when it is operating with reduced resolution (decimation). Decimation still determines the resolution; binning defines how many pixel pairs are added together.<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Key || Value range (3MPix sensor)|| Description || Working? ||"on the fly"? ||Notes<br />
|-<br />
| bh || 1..dh || Horizontal binning (sensitivity for lower resolution) || Y || Y || 1<br />
|-<br />
| bv || 1..dv || Vertical binning (sensitivity for lower resolution) || Y || Y || 1<br />
|-<br />
|}<br />
Notes:<br />
# Currently for the MT9T001 sensor only; works for all vertical binning values, but not all of the horizontal ones (some have no effect, others produce vertical lines). I would expect these glitches to be fixed in newer sensors by Micron. <br />
<br />
Here are two examples:<br />
<br />
1. Full frame with decimation by 4 in each direction will result in an image of 512*384 pixels, with pixel values the same as at full resolution (only 2x2 pixels out of each 8x8 are used, the others are discarded)<br />
 ww=2048&wh=1536&wl=0&wt=0&dv=4&dh=4&bh=1&bv=1<br />
2. Full frame with decimation by 4 in each direction will result in an image of 512*384 pixels, with pixel values 16 times higher than at full resolution (all 8x8 pixels are used; the values are added together following the Bayer RG/GB mosaic - reds with reds, greens with greens, blues with blues)<br />
ww=2048&wh=1536&wl=0&wt=0&dv=4&dh=4&bh=4&bv=4<br />
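<br />
A request like example 2 can also be assembled programmatically. A minimal sketch (build_url is a hypothetical helper, not part of the camera software, and the camera IP address is a placeholder):<br />

```c
#include <stdio.h>
#include <string.h>
#include <assert.h>

/* Hypothetical helper: build a ccam.cgi request for a full frame with
 * decimation and binning of 4 in each direction (example 2 above). */
static int build_url(char *buf, size_t n, const char *ip) {
    return snprintf(buf, n,
        "http://%s/admin-bin/ccam.cgi"
        "?ww=2048&wh=1536&wl=0&wt=0&dv=4&dh=4&bh=4&bv=4", ip);
}
```

The resulting string can then be fetched with any HTTP client (wget, curl, a browser).<br />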
<br />
=== Analog Gains ===<br />
<br />
Most sensors have some controls for the analog signal gains before the pixel data is digitized. Some sensors (such as the now-discontinued Kodak KAC-1310) have individual color gains and a separate global gain; others (such as the Micron ones) have only color gains. Usually there are two "green" gains, as with Bayer mosaic filters there are two green pixels in each 2x2 pixel cell (RG/GB).<br />
Gain values can be far from linear, and too low a gain setting might not be enough to saturate the pixel value to 1023 (usually 255 after conversion) even with very bright light.<br />
<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Key || Value range (3MPix sensor)|| Description || Working? ||"on the fly"? ||Notes<br />
|-<br />
| gr || 0..63 || analog gain RED (or mono) || Y || Y || <br />
|-<br />
| gg || 0..63 || analog gain GREEN (or green in "red" line) || Y || Y || <br />
|-<br />
| gb || 0..63 || analog gain BLUE || Y || Y ||<br />
|-<br />
| ggb || 0..63 || analog gain GREEN (green in "blue" line) || Y || Y ||<br />
|-<br />
| kga || 63 || Kodak KAC1310 analog gain Y (all colors) || Y || Y || 1<br />
|-<br />
| kgb || ? || Kodak KAC1310 analog gain ? (all colors) || ? || Y || 1<br />
|-<br />
| kgm || 6 || Kodak KAC1310 mode || Y || Y || 1<br />
|-<br />
|}<br />
# Used in Kodak KAC-1310 (now obsolete) sensors. For MT9?001 sensors the driver just multiplies gr, gg, gb and ggb by kga/63. It is better to keep it at 63 (or not use it at all) for this family of sensors.<br />
<br />
=== Image Quality, Gamma correction, Color Saturation ===<br />
<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Key || Value range (3MPix sensor)|| Description || Working? ||"on the fly"? ||Notes<br />
|-<br />
| iq || 1..99 || JPEG Quality (%) || Y || ? || 1<br />
|-<br />
| gam || 0.13 .. 10 || Gamma correction value (%) || Y || Y || 2<br />
|-<br />
| pxl || 0..255 || Black level || Y || Y || 3<br />
|-<br />
| pxh || 0..255 || White level || Y || Y || 3<br />
|-<br />
| csb || 0..710 || Color Saturation (%), Blue || Y || Y || 4<br />
|-<br />
| csr || 0..562 || Color Saturation (%), Red || Y || Y || 4<br />
|-<br />
|}<br />
# Standard JPEG compression quality in (%). Earlier, negative values were used (in software compression mode only) to generate BMP images: "-1" meant non-compressed BMP and "-2" - RLE-compressed BMP. The code is likely rotten by now.<br />
# The camera implements virtually arbitrary table-based conversions from e.g. 10-bit sensor data to the 8 bits used for compression. You may think of it as 256-entry tables of single-byte values that are used to convert 10 (or more) bit sensor data to 8-bit format using linear interpolation between entries. There are four such tables T: one for each color (including the 2 greens - RG/GB). The interpolation is done as follows:<br />
<br />
Y= T[C][x]+ (((T[C][x+1]-T[C][x]) * (X & ((1<<D)-1))) >> D)<br />
where<br />
X - W-bit input (sensor) data,<br />
Y - 8-bit output<br />
x - X (input) truncated to 8 bits: x = X>>(W-8)<br />
D = W-8 (number of bits to be truncated, i.e. 2 for a 10-bit sensor) <br />
C - color (0..3)<br />
and T is a 4 x 256 byte table (one for each color).<br />
<br />
For a 10-bit sensor (W=10, D=2) it will be<br />
Y= T[C][x]+ (((T[C][(x)+1]-T[C][x]) * (X & 3)) >> 2)<br />
<br />
An implementation detail: (T[C][(X>>2)+1]-T[C][X>>2]) is calculated by software in advance and stored in a separate table. For practical reasons this table is combined with the main one as 16-bit values: the 8 MSBs store the difference R[C][x]=T[C][x+1]-T[C][x] and the 8 LSBs store the T[C][x] value, so the value written to the table is (R[C][x]<<8)+T[C][x].<br />
The 8-bit difference field imposes a restriction on the granularity of the table: the difference between consecutive entries needs to be in the range -128..127 (this is enough).<br />
All this could change in the future (use the source, Luke) --[[User:Andrey.filippov|Andrey.filippov]] 17:18, 11 October 2005 (MDT)<br />
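<br />
The interpolation and the combined 16-bit table format described above can be sketched in C as follows (assuming a 10-bit sensor, i.e. W=10 and D=2; this is an illustration, not the actual FPGA or driver code):<br />

```c
#include <assert.h>
#include <stdint.h>

#define D 2  /* W-8 for a 10-bit sensor (W=10) */

/* Combined table entry as described above: the 8 LSBs hold T[C][x],
 * the 8 MSBs hold the precomputed signed difference T[C][x+1]-T[C][x]
 * (which must fit in -128..127). One such 256-entry table per color. */
static uint8_t interpolate(const uint16_t t[256], unsigned X) {
    unsigned x = X >> D;                 /* input truncated to 8 bits */
    int base = t[x] & 0xff;              /* T[C][x] */
    int diff = (int8_t)(t[x] >> 8);      /* T[C][x+1]-T[C][x] */
    return (uint8_t)(base + ((diff * (int)(X & ((1u << D) - 1))) >> D));
}
```

With a linear ("gam=100") table, t[x] = (1<<8) | x, the output is simply X>>2, i.e. the 8 MSBs of the sensor data.<br />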
<br />
(How about adding dither before the final truncate? Or maybe the signal already has enough noise to make dither unnecessary/harmful? --[[User:Pfavr|Pfavr]] 15:58, 27 October 2005 (CDT))<br />
<br />
The Model 313 camera had a single table for all colors; the Model 333 FPGA has room for bigger tables, so each color has its own 256-entry table. Currently, for the simplicity of the web interface, ccam.cgi only calculates a single table from a gamma value (100 - linear, 47 - standard gamma setting for video cameras) and 2 of the values (pxl and pxh) below. Together they make something like "levels" in image manipulation programs (such as [http://www.gimp.org/ GIMP]), but the camera hardware (FPGA code) allows more flexible "curves" control.<br />
<br />
# Values for the 8 MSBs of the sensor data that map to total black (0x00) and total white (0xff) of the output signal. Sensors have different modes of auto-zero; with default settings the MT9T001 sensor adjusts the black level so that in complete darkness each pixel would output 0x18 (8 MSBs are 0x0a or 10 decimal). Other sensors have different values; it is also possible to reprogram sensors to change the "hardware" black value if needed.<br />
# Color saturation values for the blue (B-G) and red (R-G) color components, in (%). In linear mode (gam=100) true colors will be produced with color saturations of 1.0 (100), but for lower gamma settings the color saturation should be increased to compensate for the lowered contrast of the image - with the mosaic color filter pattern, a lower relative difference between the pixels will be decoded as a less intense color.<br />
<br />
Verilog code that implements such conversion is [http://cvs.sourceforge.net/viewcvs.py/elphel/camera333mjpeg/fpga/x3x3/sensorpix333.v?view=markup here].<br />
<br />
=== Histograms ===<br />
<br />
Model 333 camera calculates histograms (individually for each of 4 colors (including 2 greens). Histograms are calculated inside a specified window - the following parameters are written directly to FPGA - now shadows in kernel space yet (so no way to read back the current values). As the sensors use 2x2 pixel mosaic, these 4 values are made even (by truncating LSB).<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Key || Value range (3MPix sensor)|| Description || Working? ||"on the fly"? ||Notes<br />
|-<br />
| hl || 0..2046 || Histogram window left margin || Y || Y ||<br />
|-<br />
| ht || 0..1534 || Histogram window top margin || Y || Y ||<br />
|-<br />
| hw || 0..2048 || Histogram window width || Y || Y ||<br />
|-<br />
| hh || 0..1536 || Histogram window height || Y || Y ||<br />
|-<br />
|}<br />
<br />
hl= distance from the left active window border to the left edge of the histogram calculation window, default=0<br />
<br />
ht= distance from the top active window border to the top edge of the histogram calculation window, default=0<br />
<br />
hw= histogram calculation window width, default=0xffe (will extend to the right edge of the active window)<br />
<br />
hh= histogram calculation window height, default=0xffe (will extend to the bottom edge of the active window)<br />
<br />
Currently the position and size of the histogram window are truncated to even values (LSB ignored).<br />
<br />
Histogram calculation is always on when the sensor is running (it normally is, even if no stream is output). The FPGA uses<br />
two pages of internal memory and switches between them when ready. For each frame it first writes zero to each histogram<br />
value (4x256) and then adds pixels (after converting from 10-bit sensor data to 8 bits using the "curves" tables), limiting<br />
each value to 2^18-1 (hardware limitation). If you read the histogram table asynchronously, it is likely that the sum will<br />
differ from the total number of pixels, as the FPGA could switch pages while you were reading. But it switches only at<br />
the end of a frame, so no partial sums will be read out.<br />
<br />
There are two ways to read the histogram table now:<br />
<br />
* manually (through telnet) using "fpcf -histogram" - it will print the data as hex values, or<br />
* read binary file (4*256*32bits=4KB) /dev/histogram or through a symlink at <nowiki>http://<camera_ip>/histogram</nowiki><br />
<br />
The four (as number of color filters in Bayer mosaic) tables will be read out in the following order:<br />
<br />
R (256 values), Gr (green in the "red" row - 256 values), Gb (green in the "blue" row - 256 values), B (256 values)<br />
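<br />
The 4KB binary file can thus be parsed as four consecutive arrays of 256 32-bit values in that order. A sketch (assuming native-endian 32-bit values as read on the camera itself; hist_sum is an illustrative helper, not part of the camera software):<br />

```c
#include <assert.h>
#include <stdint.h>

enum { HIST_BINS = 256 };  /* bins per color; readout order: R, Gr, Gb, B */

/* Illustrative helper: sum the bins of one color (0=R, 1=Gr, 2=Gb, 3=B)
 * from the 4x256x32-bit histogram buffer read from /dev/histogram.
 * The sum should equal the number of pixels of that color inside the
 * histogram window, unless the FPGA switched pages mid-read (note that
 * individual bins saturate at 2^18-1). */
static uint64_t hist_sum(const uint32_t *buf, int color) {
    uint64_t sum = 0;
    for (int i = 0; i < HIST_BINS; i++)
        sum += buf[color * HIST_BINS + i];
    return sum;
}
```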
<br />
This is the same order in which the "curves" tables are written to the FPGA. Currently ccam.cgi can only fill these tables from<br />
the gamma value, the same for all colors. But you can experiment with it by creating a text file with 1024 hex values<br />
(ccam.c shows how to build it), copying it to the camera file system and using "fpcf -table 400 <path_to_table>"<br />
to transfer it to the FPGA. If you then reacquire an image without changing the gamma value, ccam.cgi will not overwrite<br />
the table you've just loaded.<br />
<br />
It seems that colors work correctly with all image orientations and decimations for all Micron sensors. If not,<br />
let me know; you can temporarily compensate for wrong colors by adding "&byr=<0..3>" (Bayer phase shift) to the image URL - it reassigns<br />
the RG/GB mosaic in different ways, but the sequence of the colors R,Gr,Gb,B in the FPGA tables ("curves", histogram)<br />
will still correspond to the colors in the JPEG output.<br />
<br />
=== HTML, XML or VRML ===<br />
<br />
ccam.cgi can send HTML, XML or VRML (the VRML code is broken and needs to be restored) files, not just images, if any of the html, htmlr, htmll or htmlj parameters are present in the URL.<br />
<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Key || Value range (3MPix sensor)|| Description || Working? ||"on the fly"? ||Notes<br />
|-<br />
| html || 0 || no output || Y || Y ||<br />
|-<br />
| || 1 || all sensor parameters as javaScript || Y || Y || 1, 4<br />
|-<br />
| || 2 || all sensor parameters as html || Y || Y || 2, 4<br />
|-<br />
| || 3 || beam data as javaScript || Y || Y || 1, 8<br />
|-<br />
| || 4 || beam data as html || Y || Y || 2, 8<br />
|-<br />
| || 5 || state (5 -picture ready) as javaScript || Y || Y || 1,5<br />
|-<br />
| || 6 || state (5 -picture ready) as html || Y || Y || 2,5<br />
|-<br />
| || 7 || start image acquisition (option "s" or "t" should be present) || ? || Y || 6<br />
|-<br />
| || 8 || reset waiting for trigger || ? || Y || 7<br />
|-<br />
| || 10 || all sensor parameters as XML || Y || Y || 3<br />
|-<br />
| || 11 || beam data as XML || Y || Y || 3,8<br />
|-<br />
| || 12 || state (5 -picture ready) as XML || Y || Y || 3, 5<br />
|-<br />
| || 13 || start image acquisition (option "s" or "t" should be present), return XML || Y || Y || 3, 6<br />
|-<br />
| || 14 || reset waiting for trigger, return XML || Y || Y || 3,7<br />
|-<br />
| htmlr || n || Refresh each n seconds || Y || Y || 9<br />
|-<br />
| htmll || escaped string || command to be executed onLoad in <nowiki><body></nowiki> tag || Y || Y || 10<br />
|-<br />
| htmlj || escaped string || include javaScript file || Y || Y || 11<br />
|-<br />
<br />
<br />
|}<br />
<br />
Notes:<br />
# The head section of the html output file will have JavaScript assignments "document.''variable_name''=value;" for each parameter. There are no visible elements in the file - it was intended to be used in a frame set before XMLHttpRequest was supported in most browsers.<br />
# Parameters are output as a two-column html table (first column - name, second - value).<br />
# Parameters and their values are output as XML file.<br />
# Sensor-related parameters are output<br />
# Only the sensor/compressor state is output. State 7 - the sensor is running, constant compression is off (single frame mode); state 8 - the compressor is in constant compression mode (such as during streaming), static images can not be acquired and some acquisition parameters can not be changed without stopping the compression.<br />
# This was designed for sensors with asynchronous reset (such as the now-obsolete Zoran ones). I don't remember what it will do (or how to use it) with the Micron ones.<br />
# Reset waiting for an external trigger (not sure if it still works)<br />
# Output beam parameters (center of gravity, half width in x, y, etc.). This code is broken now, but might be repaired.<br />
# Instruct the html page to refresh itself each specified number of seconds.<br />
# Value is an "escaped" string that contains a JavaScript command to be executed when the page is loaded (body onLoad).<br />
# Value is an "escaped" string with the path of the external JavaScript file to be included inside the <nowiki><head></nowiki> tag of the page<br />
<br />
<br />
== below is yet unedited text from ccam.c comments ==<br />
<br />
<br />
<br />
<br />
* vrmld - decimation to make a grid (>=8 for full frame) (default = 16)<br />
* vrmlb - number of individual blocks in each x/y (default=2)<br />
* vrmlm - maximal pixel. 1023 - full scale, less - increase contrast, 0 - automatic (default =1023)<br />
<br />
* vrmli - indentation (default=1)<br />
* vrmlf - format - 0 - integer, 1 - one digit after "." (default 0)<br />
* vrmll - number of countours to build (default = 32)<br />
* vrmlo - options for isolines - e - elevated, f - flat (default=ef)<br />
* vrmlz - 0..9 output (gzip) compression level (0 - none, 1 - fastest, default - 6, best -9)<br />
<br />
* hist=n - read frame from "history"; applies only to rereading from memory after acquisition of a clip<br />
n<=0 - from the end of clip (0 - last), n>0 - from the start (1 - first)<br />
<br />
<br />
<br />
* pfh - photofinish mode strip height (0 - normal mode, not photofinish). In this mode each frame will consist of multiple<br />
pfh-high horizontal (camera should be oriented 90 deg. to make vertical) strips, and no extra lines will be added to the frames<br />
for demosaic<br />
for now: +65536 - timestamp for normal frames, +131072 - timestamps for photo-finish mode<br />
* ts - time stamp mode: 0 - none, 1 - in upper-left corner, 2 - added to the right of the image (photo-finish mode) <br />
* fsd - frame sync delay (in lines) from the beginning of a frame (needed in photofinish mode - 3 lines?)<br />
<br />
<br />
<br />
* _time=t (ms) will try to set current system time (if it was not set already. _stime - will always set)<br />
<br />
<br />
<br />
<br />
<br />
<br />
* fpns - 0..3 fpga background subtraction:<br />
* 0 - none,<br />
* 1 (fine) - subtract 8-bit FPN from 10-bit pixel<br />
* 2 - multiply FPN by 2 before subtracting<br />
* 3 - multiply FPN by 4 before subtracting (full scale)<br />
* note: negative result is replaced by 0, decrease FPN data before applying for "fat 0"<br />
* fpnm - multiply by inverse sensitivity (sensitivity correction) mode:<br />
* 0 - no correction<br />
* 1 - fine (+/- 12.5%)<br />
* 2 - medium (+/- 25%)<br />
* 3 - maximal (+/- 50%)<br />
* pc - pseudo color string. Applies to monochrome images and vrml<br />
<br />
* any of vrml* specified - vrml instead of a picture/html<br />
*<br />
* background measurement/subtraction will (now) work only with 10-bit images<br />
* gd = "digital gain" 0..5 (software)<br />
* byr =0..3 Overwrite Bayer phase shift, =4 - use the value calculated by the driver.<br />
<br />
* bit - pixel depth (10/4/8)<br />
* shl - shift left (FPGA in 8 and 4-bit modes) - obsolete<br />
* clk - MCLK divisor - 80MHz/(2..129) - obsolete?<br />
<br />
<br />
* bg = n - calculate background 1-2-4..16 times (does not need option s/t/v)<br />
* parameters for "instant on" quicktime<br />
* qfr = n - number of frames to send in a quicktime clip<br />
* qpad = % to leave for the frame size to grow (initial size = 1-st frame * (100 - 1.5*qpad)/100)<br />
* qdur = frame duration in 1/600 of a second<br />
* parameters for quicktime clips (send after shooting)<br />
* qsz = n - clip size in KB (w/o headers) (<=0 will use "instant on") - will be obsolete<br />
* qcmd= (mandatory for videoclip)<br />
1 - start constant compression of all acquired frames<br />
2 - stop constant compression.<br />
3 - acquire the whole buffer and stop<br />
4 - read movie from buffer<br />
6 (and 5?) - stop, then read<br />
7 - acquire buffer, then read<br />
<br />
* qts = t - playback time/real time</div>Pfavrhttps://wiki.elphel.com/index.php?title=User:Pfavr&diff=1967User:Pfavr2005-10-27T20:25:26Z<p>Pfavr: about myself ;-)</p>
<hr />
<div>Hi, I'm an engineer who just bought an Elphel model 333 camera. I hope to use it for some industrial purpose, e.g. quality control in packaging lines.<br />
<br />
I have barely done any VHDL, and am totally new to Verilog.<br />
<br />
My C is pretty decent - in 1994 I did a small multitasking kernel for the MC68000 - and I have been programming since childhood (1985). My main strength lies in realtime and low-level software. Recently I've done some web applications using AJAX, LAMP and XUL (the Mozilla graphical toolkit). Perl, Python, Bash, Awk or whatever is on the menu today :-)</div>Pfavrhttps://wiki.elphel.com/index.php?title=Camera_Synchronization&diff=1966Camera Synchronization2005-10-27T20:15:21Z<p>Pfavr: summary of the discussion so far</p>
<hr />
<div>There are several parts of the camera synchronization task.<br />
# The camera should receive a synchronizing event. This can be done either by special '''hardware inputs''' or just over the '''network'''. In most cases, if you want to synchronize 2 or more networked cameras, you do not need extra wires, so network synchronization is the most convenient. But sometimes you would like to be able to trigger the camera without the network - i.e. from some contact closure.<br />
# The camera should be able to start the image acquisition process when required - generally not possible with most CMOS sensors. ''/is this used with "external trigger" in the FPGA API? - Spectr/''<br />
# And (in some cases) the camera should be able to keep time precisely, so the in-sync state of two or more cameras will last longer.<br />
<br />
Here is a [https://sourceforge.net/forum/forum.php?thread_id=1068147&forum_id=371579 Thread on sf] about synchronizing two 313 cameras using [http://mhonarc.axis.se/dev-etrax/msg02121.html sntpdate client]: "Synchronizing the cameras ended up being incredibly simple, I didn't have to do anything special at all. I decided to try the easiest solution first, keeping the 2 cameras at the exact same settings. I had the client I wrote request image captures from both cameras at approx. the same time. I havn't benchmarked it to see the exact amount of jitter between matching frames, but I can set FPS to any value and not see any noticable difference."<br />
<br />
How about a digital phase locked loop using the [[RTC]] timer in the FPGA? NTP is basically a digital phase locked loop in software, but it also adds a lot of code for robustness against "malicious" NTP servers - something we probably don't need for the cameras --[[User:Pfavr|Pfavr]] 15:15, 27 October 2005 (CDT)</div>Pfavrhttps://wiki.elphel.com/index.php?title=Current_events&diff=1961Current events2005-10-26T22:00:28Z<p>Pfavr: /* Quick RTSP server */ minor typos and language</p>
<hr />
<div>= 2005-10-23 =<br />
<br />
== GenReS plugin for Mozilla v 0.6 released ==<br />
Added support for recording with mencoder. Video is split into files with a specified number of frames.<br />
The new feature will be used in the HTML video surveillance system.<br />
<br />
= 2005-10-20 =<br />
<br />
ElphelWiki is now hosted at [http://www.siteground.com SiteGround]<br />
<br />
= 2005-10-19 =<br />
<br />
Wiki will be moved to the new location shortly <br />
<br />
= 2005-10-15 =<br />
== GenReS v0.5 Released ==<br />
Bugfix release.<br />
Several issues with tag parameters were fixed. The SRC tag parameter can now be used as an alias for HREF.<br />
The [http://sourceforge.net/projects/genres/ GenReS] plugin for Mozilla/Firefox is used in our [[HTML Video Surveillance]].<br />
<br />
= 2005-09-24 =<br />
== Live.com was renamed ==<br />
Ross Finlayson informs:<br />
Live.com was renamed to [http://Live555.com Live555.com]<br />
<br />
= 2005-09-23 =<br />
== New Video Surveillance System written in HTML ==<br />
See [[HTML Video Surveillance]].<br />
== Quick RTSP server ==<br />
Added to the old CVS repository.<br />
The RTSP server (qrtsp) is written in C and is much quicker than the old RTSP server, which was a shell script.<br />
With the new server used with the RTP streamers, mplayer starts in less than one second.<br />
The program will be added to the new branches.<br />
<br />
== GenReS v0.4.1 Released ==<br />
[http://sourceforge.net/projects/genres/ GenReS] plugin for Mozilla/Firefox used in our [[HTML Video Surveillance]].<br />
Fixed bug in the player watchdog.</div>Pfavrhttps://wiki.elphel.com/index.php?title=Using_the_cameras&diff=1960Using the cameras2005-10-26T21:38:22Z<p>Pfavr: /* Video Controls */ small typo</p>
<hr />
<div><div style="display:inline; color:blue">'''''Elphel Network Cameras Manual'''''</div><br />
----<br />
<div style="display:inline; color:green">Using the cameras</div> | [[Camera software]] | [[Camera hardware]] | [[Diagnostic and repair]] | [[Live CD]] | [[Development documentation]] | [[333 prices]] | [[Information]] | [[Glossary]] | [[About Elphel, Inc]]<br />
----<br />
<br />
== Introduction ==<br />
<br />
Elphel network cameras are complex devices whose development involves advanced technologies and non-standard software solutions. We constantly improve our products so that users can operate the cameras with the least expense of time and effort. In this section of the Elphel Network Cameras Manual you can find detailed information about using the Elphel cameras.<br />
<br />
== Complete set ==<br />
<br />
The camera delivery set includes the Elphel-313/323/333 network camera itself and a Live-CD or Live-DVD with a special edition of the Debian-based GNU/Linux operating system Knoppix, bundled with the necessary camera software. <br />
<br />
For an additional payment the delivery set can also include lenses, an external power supply and connection cables.<br />
<br />
== Appearance ==<br />
<br />
[[Image:Cam side.jpg]] The Elphel-313/333 camera enclosure is a folding design made of anodized aluminium. Overall dimensions are 116×45×45 mm, weight approximately 150 g (without lens). <br />
<br />
[[Image:Cam front.jpg]] The front side of the camera enclosure carries a threaded socket for standard C/CS-mount lenses. <br />
<br />
[[Image:Cam back.jpg]] The back side of the camera enclosure carries a socket for the network cable, a push button that switches the camera into the software reloading mode, and the camera's model number mark. We constantly improve our products, so the number and appearance of the elements on the back panel can change. <br />
<br />
The bottom side of the camera enclosure carries a standard threaded socket for mounting the camera on a support.<br />
<br />
== Connection ==<br />
<br />
== Main Control Page ==<br />
<br />
The Main Control Page uses JavaScript to process user input (input fields and a window of interest that can be selected by dragging a frame with the mouse) and combines all camera acquisition parameters into a single GET request to a CGI program in the camera through the embedded web server (Boa). Currently the camera does not support simultaneous access, as each request actually controls the camera operation mode (including sensor resolution and frame rate) and does not just connect to the camera output. <br />
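In script form, "a single GET request with all acquisition parameters" is just a URL-encoded parameter list (a sketch; the parameter names and the IP address below are hypothetical placeholders - see [http://cvs.sourceforge.net/viewcvs.py/elphel/camera333mjpeg/333/apps/ccam/ccam.c?view=markup ccam.c] for the real parameter names):

```python
from urllib.parse import urlencode

def ccam_url(camera_ip, **params):
    """Build a ccam.cgi request URL from a dict of acquisition parameters."""
    return "http://%s/admin-bin/ccam.cgi?%s" % (camera_ip, urlencode(params))

# Parameter names below are placeholders, not actual ccam.cgi parameters.
url = ccam_url("192.168.0.9", exposure=10, quality=80)
```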
<br />
=== Page layout ===<br />
<br />
At the top of the page you should see the preview image. It is always 640x512, 800x600 or 512x386 pixels (1/2 or 1/4 of the camera resolution, depending on the sensor) and does not change with the selected decimation and window of interest (red rectangle over the image). <br />
<br />
Below the preview image there are camera controls:<br />
<br />
=== General Controls ===<br />
<br />
[[Image:Control panel 1.jpg]]<br />
<br />
In the '''Window''' section there are 2 buttons at the lower-right corner of the window. You may specify the window of interest (WOI) by dragging these buttons with the mouse. You may also change it numerically by entering data into the '''W'''(idth), '''H'''(eight), '''L'''(eft) and '''T'''(op) input fields below. These numbers are rounded according to the selected sensor decimation, so that JPEG compression uses an integer number of 16×16-pixel MCUs. <br />
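The rounding works out as follows: the decimated output must contain a whole number of 16×16 MCUs, so each WOI dimension ends up a multiple of 16 times the decimation (an illustrative sketch of the rule, not the page's actual JavaScript):

```python
def round_woi(size, decimation):
    """Round a WOI dimension down to a multiple of 16 * decimation, so the
    decimated image is a whole number of 16x16 JPEG MCUs."""
    step = 16 * decimation
    return max(step, (size // step) * step)

# round_woi(333, 1) -> 320 (20 MCUs wide at full resolution)
# round_woi(333, 2) -> 320 (10 MCUs wide after 1/2 decimation)
```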
<br />
The '''Exposure''' section has an input field to specify the frame exposure time (in ms); it is possible to specify fractions. You may also change this parameter by dragging the slider with the mouse. <br />
<br />
The '''Saturation''' section allows adjusting the color saturation. <br />
<br />
The '''Gamma''' field controls contrast by adjusting the intensity conversion; gamma=1.0 corresponds to a linear response (high contrast). Values less than 1 increase the input dynamic range by expanding low intensity values (low contrast). You may change this parameter by dragging the slider with the mouse. <br />
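Numerically, the gamma control applies a power-law transfer curve to each intensity (illustrative formula only; the camera's actual table-based conversion may differ in detail):

```python
def gamma_curve(x, gamma):
    """Map an input intensity 0..255 through a power-law transfer curve."""
    return round(255 * (x / 255) ** gamma)

# With gamma = 1.0 the mapping is linear (64 -> 64); with gamma = 0.5 the
# low intensities are lifted (64 -> 128), expanding shadow detail.
```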
<br />
JPEG '''Quality''' sets the standard JPEG compression quality. The higher the quality, the bigger the resulting file. It does not change the camera frame rate. <br />
<br />
The '''White balance''' section is a drop-down menu containing four static presets: '''Sunlight''', '''Cloudy''', '''Incandescent''' and '''Fluorescent'''. You can choose one of these presets to achieve the best image quality for the given illumination conditions. <br />
<br />
In the '''Image size''' drop-down menu you may choose from a few static window sizes. <br />
<br />
In the '''Resolution''' section there are radio buttons to select the sensor decimation (resolution). Selecting '''1''' uses all pixels in the WOI, '''1/2''' uses every other pixel (in both directions) - 1/4 of the total, '''1/4''' uses 1 in 4 in each direction - 1/16 of the total, and so on. Both WOI and decimation control the actual sensor operation that runs at a 20 MHz pixel clock, so the smaller the total number of pixels, the higher the frame rate (it is actually somewhat lower, as the sensor has some "margins" around the active area). <br />
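The pixel-count arithmetic behind this can be sketched as follows (a rough estimate; the 10% overhead for the sensor "margins" is a made-up placeholder, the real value depends on the sensor model and WOI position):

```python
PIXEL_CLOCK = 20e6                   # 20 MHz pixel clock, as above

def est_fps(width, height, decimation, overhead=0.10):
    """Rough frame rate: pixels read out per frame (after decimation),
    plus some sensor overhead ('margins'), at 20 Mpix/s."""
    pixels = (width // decimation) * (height // decimation)
    return PIXEL_CLOCK / (pixels * (1 + overhead))

# Halving the resolution in both directions reads ~1/4 of the pixels,
# so the frame rate goes up roughly 4x.
```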
<br />
[[Image:Color.jpg]] In monochrome mode (when the '''Color''' radio button is not checked) the "R" gain settings are used for all channels and color conversion is disabled. <br />
<br />
Below are 3 buttons: '''Preview''', '''Reset''' and '''Apply'''. The '''Preview''' button refreshes the image in the top portion of the page (ignoring the selected WOI and decimation), the '''Reset''' button restores the initial adjustment parameters, and the '''Apply''' button applies the new changes. <br />
<br />
Also there are two links after '''Photo''': '''New''' is a link to a camera CGI program with all acquisition parameters attached. You may just click it to open the image in a new window, or right-click it and select "Save link target as..." (or equivalent) to save the image on your computer. The second link, '''Last''', points to the last image already in the camera memory; you may use it to save an already acquired image to your hard drive. <br />
<br />
[[Image:Help.png]] button opens this help page. <br />
<br />
[[Image:Info.png]] button opens an information window with a list of the acquisition parameters used during the last image/clip acquisition. <br />
<br />
=== Video Controls ===<br />
<br />
The '''Video Controls''' in this section allow users to record a video clip in the camera memory and send it over the LAN/Internet as a QuickTime video file. All settings for the image size, resolution, compression quality, analog gains and exposure are the same as for the still images described above; this link adds two extra parameters - '''Frame Rate''' and '''Time Scale'''. The first sets the desired frame rate (leave blank or set to 0 for the maximal rate), and the second sets how much longer the playback time is than the acquisition time, i.e. a setting of "10" makes the clip play 10 times slower than it was shot. <br />
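The '''Time Scale''' arithmetic, in sketch form (variable names are illustrative):

```python
def playback_seconds(frames, acq_fps, time_scale):
    """Acquisition lasts frames/acq_fps seconds; playback is stretched by
    time_scale, e.g. 10 -> the clip plays 10x slower than it was shot."""
    return (frames / acq_fps) * time_scale

# 150 frames shot at 50 fps is 3 s of action, played back over 30 s.
```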
<br />
The total size of the clip (6-7MB) is determined by the camera's internal buffer used for compressed image storage and by the total number of frames, as the frame headers are attached during the clip output and are not stored in the buffer. There are 2 ways to shoot a clip (usual duration 3-5 sec, depending on compression quality). The first is to press the '''Start stream''' button, wait for the event to occur and then press '''Acquire'''. In this mode, after '''Start stream''' the camera continuously writes to the buffer, overwriting footage when all the available space is used. Pressing the '''Stop stream''' button ends data saving, so the buffer contains the last data recorded. The other way is to press '''Acquire''' before the event. In that mode the clip recording starts after the trigger and stops when the whole buffer is full. <br />
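The clip length follows from the buffer size and the compressed frame size (a rough estimate with made-up example numbers; real frame sizes vary with scene content and JPEG quality):

```python
def clip_seconds(buffer_bytes, avg_frame_bytes, fps):
    """Roughly how many seconds of video fit in the circular buffer."""
    return (buffer_bytes // avg_frame_bytes) / fps

# A ~6.5 MB buffer at ~45 KB/frame and 30 fps holds a clip of a few
# seconds, consistent with the 3-5 s mentioned above.
```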
<br />
There are additional controls to preview the video clip from the camera memory. Frame number 1 is the first one, 2 the second, etc. Frame number 0 is the last one in the clip, -1 the one before last, etc. Any image acquisition (including '''Preview''') erases the stored clip and makes it one frame long. <br />
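The frame numbering convention (1 = first, 0 = last, -1 = one before last) maps onto a 0-based index like this (illustrative sketch):

```python
def clip_frame(n, total):
    """Map the page's frame number to a 0-based index into the clip:
    n >= 1 counts from the start, n <= 0 counts back from the end."""
    return n - 1 if n >= 1 else total - 1 + n

# clip_frame(1, 100) -> 0 (first frame), clip_frame(2, 100) -> 1,
# clip_frame(0, 100) -> 99 (last), clip_frame(-1, 100) -> 98.
```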
<br />
Clicking on the '''Video Clip''' link opens the clip in the Apple QuickTime ™ (or compatible) player plugin (if installed); you may also use this link to save the clip as a file on your computer.<br />
<br />
=== Advanced Panel ===<br />
<br />
[[Image:Control panel 2.jpg]]<br />
<br />
The '''Advanced Panel''' tab opens additional image adjustment parameters. In this section you may specify more exact adjustments of the color levels of the image. <br />
<br />
The '''Gains''' section controls the sensor analog gain settings. The '''R''', '''G''' and '''B''' input fields control the color component gains. Using high gain settings increases sensor noise but is required to view fast events with a 1 ms exposure time under the lighting used in the current setup. <br />
<br />
'''R-Y''' specifies the color saturation for the red color component. <br />
<br />
'''B-Y''' specifies the color saturation for the blue color component. <br />
<br />
'''B''' specifies the color saturation for the blue color component. <br />
<br />
'''W''' specifies the level saturation for the white color component. <br />
<br />
The '''Flip-X''' button flips the image horizontally. <br />
<br />
The '''Flip-Y''' button flips the image vertically. <br />
<br />
''NOTE: Some parameters (i.e. window size) cannot be changed without stopping the acquisition/compression.''<br />
<br />
----<br />
''Free Software and Open Hardware. Elphel, Inc., 2005''</div>Pfavrhttps://wiki.elphel.com/index.php?title=Camera_Synchronization&diff=1959Camera Synchronization2005-10-26T21:15:33Z<p>Pfavr: </p>
<hr />
<div>There are several parts of the camera synchronization task.<br />
# The camera should receive a synchronizing event. This can be done either by special '''hardware inputs''' or just over the '''network'''. In most cases, if you want to synchronize 2 or more networked cameras you do not need extra wires, so network synchronization is the most convenient. But sometimes you would like to be able to trigger the camera without the network - i.e. from some contact closure.<br />
# The camera should be able to start the image acquisition process when required - generally not possible with most CMOS sensors. ''/is this used with "external trigger" in the FPGA API? - Spectr/''<br />
# And (in some cases) the camera should be able to keep time precisely, so the in-sync state of two or more cameras will last longer.<br />
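For the network option in point 1, the simplest scheme is to fire a trigger request at every camera at (nearly) the same moment, e.g. one thread per camera. A sketch (the trigger parameter name and the addresses are hypothetical placeholders - see [[ccam.cgi]] for the real interface):

```python
from urllib.parse import urlencode

CAMERAS = ["192.168.0.9", "192.168.0.10"]       # example addresses

def trigger_url(ip, **params):
    """Hypothetical trigger request; parameter names are placeholders."""
    return "http://%s/admin-bin/ccam.cgi?%s" % (ip, urlencode(params))

def fire_all(ips):
    """Build one request per camera; real code would launch them from
    parallel threads (threading.Thread + urlopen) so the requests leave
    the host as close together in time as possible."""
    return [trigger_url(ip, trig=1) for ip in ips]

urls = fire_all(CAMERAS)
```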
<br />
How about something like ntp - network time protocol? --[[User:Pfavr|Pfavr]] 16:15, 26 October 2005 (CDT)</div>Pfavrhttps://wiki.elphel.com/index.php?title=Roadmap&diff=1956Roadmap2005-10-26T20:19:13Z<p>Pfavr: /* HTML Video Surveillance */ minor typo</p>
<hr />
<div>== Background ==<br />
I (Andrey Filippov) quit my job and started Elphel, Inc. in 2001 (Magna, UT USA); all the projects were covered in [http://www.linuxdevices.com LinuxDevices] (a complete list of the articles is available [http://www.elphel.com/articles/index.html here]). For several years Elphel was a one-man company; in January 2004 I wrote an article [http://www.computerra.ru/hitech/tech/31862/ Taming of the Iron Penguin (Russian)] in the largest Russian computer-related magazine [http://www.computerra.ru Computerra] and announced there a competition among software developers for the best video streamer to run in the camera. That was a good idea, and after the competition itself was over most of the developers remained in the Elphel team - at first as volunteers, later as full/part-time employees.<br />
<br />
Not all of these developers live in Russia - two, including the winner of the competition, are from Kiev, Ukraine. But still all of them know Russian much better than English, so most of our technical discussions were on our private Russian-language forum. So far I have failed to move these discussions to the broader audience, but I believe that Wiki technology can help. Here we will maintain most of the site in English but still have some pages/discussions in Russian, translating documents as we go - or when somebody else needs it and is not satisfied with the [http://babelfish.altavista.com Babelfish] automatic translation. We will try to keep the English pages current - anyway, even in Elphel not everybody knows Russian.<br />
<br />
Please excuse not-so-good English of our developers and feel free to fix the errors if you see them.<br />
<br />
== Software Architecture of Elphel 3x3 cameras ==<br />
Software in the Elphel cameras started from [http://developer.axis.com/ Axis Developer Boards Software] and was amended for the camera specific functions. It was modified to work with newer hardware (models 303-313/323-333), support more features and now it seems to be a good time to make a major redesign instead of applying incremental changes.<br />
<br />
Some discussion already started in Russian here - [[Nc3x3]]<br />
<br />
Related to the architecture are the [[#Camera Interface]] and the [[#Client Software]]<br />
<br />
Elphel will continue developing a web browser based user interface with the [http://en.wikipedia.org/wiki/AJAX AJAX] technique. That will require developing/modifying player plugins controllable from [http://en.wikipedia.org/wiki/Javascript JavaScript] and implementing specific features needed for video surveillance applications - multiple camera views on the same page, digital PTZ (inside the hi-res incoming stream) and temporal decimation (reducing the frame rate) that uses as few CPU resources as possible.<br />
<br />
A web-based user interface can be especially useful for open hardware, as it lowers the entry threshold for a developer who would like to customize the camera's functionality - regular web development tools are sufficient for the job.<br />
<br />
=== Camera Interface ===<br />
<br />
Camera now has two alternative APIs:<br />
<br />
==== ccam.cgi ====<br />
<br />
Original interface that supports most camera features - [[ccam.cgi]]<br />
<br />
and<br />
<br />
==== API compatible with Axis cameras ====<br />
<br />
This ([[AxisAPI]]) makes Elphel cameras work with some third-party software<br />
<br />
==== JavaScript library ====<br />
We will create a set of JavaScript routines to control the cameras, which can be used in different AJAX applications.<br />
See [[JavaScript API]]<br />
<br />
=== Camera Software ===<br />
==== File systems ====<br />
[[333_File_System]]<br />
<br />
=== Client Software ===<br />
==== [[MPlayer]] ====<br />
We have MPlayer patched for use with our cameras. Patches for the source code are available on SourceForge, but the compiled package is only for Debian on the i386 architecture. We plan to make compiled packages for the PowerPC architecture and also for Slackware.<br />
<br />
==== [[HTML Video Surveillance]] ====<br />
[http://sourceforge.net/project/showfiles.php?group_id=105686&package_id=138717&release_id=358392 Multiple camera view HTML page] is based on [http://sourceforge.net/projects/genres/ GenReS plugin] for [http://www.mozilla.org/ Mozilla/FireFox].<br />
Now works: scrolling by picture dragging (digital PTZ), camera selection, zoom switch, automatic detection of stream stop by timeout.<br />
The list of camera addresses is currently editable manually. It will be automatically generated in the [[Live CD]].<br />
The page will run recording software at the user's request. Video will be saved to a fixed directory and split into separate files with a tunable number of frames.<br />
The main parameters of video capture will be changeable from the page.<br />
The page later can be used in the [[#Video Server]].<br />
<br />
==== [[Live CD]] ====<br />
The Elphel live Linux CD contains software for camera users. We will also make a live DVD for developers.<br />
The live CD is based on [http://knopper.net/knoppix/ Knoppix].<br />
<ul>The software which should be included in future releases of the Live CD<br />
<li>[[HTML Video Surveillance]]<br />
<li>[http://lives.sourceforge.net LiVES] video editor<br />
<li>client software packages with simple installation for different distributions of GNU/Linux<br />
</ul><br />
Currently we have CD only for i386 architecture.<br />
<br />
We have plans to make Live CD for PowerPC too.<br />
<br />
We should move to DVD distribution as most of the disks are anyway provided with the hardware, not downloaded.<br />
<br />
The idea of keeping Knoppix as full as possible was to introduce GNU/Linux to camera users who never had this experience before. But these users will get the DVD in a box; the downloadable CD version can have more standard packages removed and replaced with camera-specific software.<br />
<br />
One such major addition will be a preinstalled camera development environment (possibly based on [http://www.eclipse.org Eclipse]) to simplify modification of the camera code. Again - don't forget that many of those future developers now use only Visual Studio (or however exactly it is called?) and GNU/Linux can be somewhat alien to them. With this environment they might start playing with their code without prior knowledge of the GNU/Linux software development process.<br />
<br />
It can be useful for the hardware/FPGA developers too - to be able to write some code to support the hardware features without spending much time on mastering the software development process.<br />
<br />
=== Video Server ===<br />
A PC-based video server that will archive incoming Ogg Theora streams from several cameras and transcode them on the fly to lower resolution (binary decimation, windowing) and frame rate (i.e. using only key frames), presenting multiple streams (real time and recorded) to the operator. The external interface of the server might be one of the industry standards, compatible with 3rd-party legacy software.<br />
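The temporal part of that transcoding can be as simple as forwarding every N-th frame, or only the key frames (a sketch of the selection logic alone; container and codec handling are omitted):

```python
def decimate(frames, keep_every=5):
    """Keep every N-th frame to cut the frame rate by a factor of N."""
    return frames[::keep_every]

def key_frames_only(frames):
    """Keep only key (intra-coded) frames; in this toy model a frame is
    an (is_key, payload) pair."""
    return [f for f in frames if f[0]]
```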
<br />
== Camera hardware ==<br />
=== RTC ===<br />
[[RTC]]<br />
=== 10331 ===<br />
[[10331]]<br />
===10332 ===<br />
[[10332]]<br />
=== 10334 ===<br />
[[10334]]<br />
<br />
== Active Projects ==<br />
=== Synchronization of the Cameras ===<br />
<br />
Sometimes you need to acquire images triggered by an external event, or several cameras need to be synchronized with each other. [[Camera Synchronization]] is all about it.<br />
<br />
=== Photo-finish ===<br />
Photo-finish device made of Elphel model 333 camera with additional FPGA code and software - [[Photo-finish]]<br />
<br />
=== Zeroconf for Elphel cameras ===<br />
[[zeroconf for Elphel cameras]]<br />
=== Elphel cameras and Zoneminder ===<br />
We plan to make model 333 camera work with [http://www.zoneminder.com Zoneminder]<br />
=== USB host interface ===<br />
<br />
daughter board with USB and DC-DC power for the lens control board [[10334]]<br />
<br />
=== Motorized lens control ===<br />
I'll try to retrieve what was written before on the motorized lens control. In short - C/CS mount is rather old and does not work well for interchangeable motorized lenses. We are trying to build an adapter from C/CS-mount to a bayonet type connector, and place a tiny 5mm wide PCB ring in that adapter. This [[10331]] PCB has a reprogrammable microcontroller and uses just 2 connections to the camera, with power and data signals combined. It provides all the necessary connections for most types of motorized lenses. <br />
<br />
lens control board [[10331]]<br />
DC-DC power board for motorized lens control board [[10332]]<br />
lens control board In System Programmer [[lbcontrol]]<br />
<br />
=== Outdoor enclosure ===<br />
<br />
<br />
'''Step Zero'''<br />
<br />
Determine working setup<br />
- Does the system need a control board <br />
- CCD board needs a longer cable for a minimal package when stacking the lens on top of the board [http://www.maartenmenheere.nl/blog/images/014-001-0.jpg Camera casing]<br />
<br />
'''Step one'''<br />
Test setup. Assemble all components in a setup that can record video<br />
<br />
Components in test setup<br />
- Lens (Computar H3Z4512CS varifocal lens? using power)<br />
- Elphel USB setup. Is it possible to directly plug in a USB drive? Where does the power come from?<br />
- Battery<br />
- USB cable or network cable<br />
- USB external hard drive or flash drive<br />
- ON/off switch<br />
<br />
Objective: Does it work at all?<br />
Secondary: Battery life? Video quality?<br />
<br />
'''Step two'''<br />
<br />
Wooden box. Test setup 1 integrated in outside video testing setup.<br />
<br />
Components added in test 2<br />
- Hardboard casing<br />
<br />
Objective: Optimize the video recording setup for ease of use<br />
Secondary: optimal settings? correct lens?<br />
<br />
'''Step three'''<br />
<br />
Building of waterproof casing<br />
- Amphenol plugs<br />
- Camera window<br />
- Casing camera (fibre reinforced composite)<br />
- Casing base station (battery + storage) (fibre reinforced composite)<br />
<br />
[http://www.maartenmenheere.nl/blog/images/outdoorvideosystem.jpg Schematic]<br />
[http://www.maartenmenheere.nl/blog/images/014-001-0.jpg Camera casing]<br />
[http://www.maartenmenheere.nl/blog/images/camera.jpg Camera casing]<br />
<br />
=== Current enclosure design ===<br />
<br />
We are switching to an extruded aluminum tube (actually the original 303/313 was also designed for a standard aluminum profile). The Model 333 RJ-45 connector is designed to fit into an RJField shell [http://www.rjfield.com/ethernet_connectors_rjf_en.htm].<br />
<br />
[[Image:Tube_section.jpg]]</div>Pfavrhttps://wiki.elphel.com/index.php?title=Roadmap&diff=1954Roadmap2005-10-26T19:56:01Z<p>Pfavr: /* Background */ small typo fixed</p>
<hr />
<div>== Background ==<br />
I (Andrey Filippov) quit my job and started Elphel, Inc. in 2001 (Magna, UT USA); all the projects were covered in [http://www.linuxdevices.com LinuxDevices] (a complete list of the articles is available [http://www.elphel.com/articles/index.html here]). For several years Elphel was a one-man company; in January 2004 I wrote an article [http://www.computerra.ru/hitech/tech/31862/ Taming of the Iron Penguin (Russian)] in the largest Russian computer-related magazine [http://www.computerra.ru Computerra] and announced there a competition among software developers for the best video streamer to run in the camera. That was a good idea, and after the competition itself was over most of the developers remained in the Elphel team - at first as volunteers, later as full/part-time employees.<br />
<br />
Not all of these developers live in Russia - two, including the winner of the competition, are from Kiev, Ukraine. But still all of them know Russian much better than English, so most of our technical discussions were on our private Russian-language forum. So far I have failed to move these discussions to the broader audience, but I believe that Wiki technology can help. Here we will maintain most of the site in English but still have some pages/discussions in Russian, translating documents as we go - or when somebody else needs it and is not satisfied with the [http://babelfish.altavista.com Babelfish] automatic translation. We will try to keep the English pages current - anyway, even in Elphel not everybody knows Russian.<br />
<br />
Please excuse not-so-good English of our developers and feel free to fix the errors if you see them.<br />
<br />
== Software Architecture of Elphel 3x3 cameras ==<br />
Software in the Elphel cameras started from [http://developer.axis.com/ Axis Developer Boards Software] and was amended for the camera specific functions. It was modified to work with newer hardware (models 303-313/323-333), support more features and now it seems to be a good time to make a major redesign instead of applying incremental changes.<br />
<br />
Some discussion already started in Russian here - [[Nc3x3]]<br />
<br />
Related to the architecture are the [[#Camera Interface]] and the [[#Client Software]]<br />
<br />
Elphel will continue developing a web browser based user interface with the [http://en.wikipedia.org/wiki/AJAX AJAX] technique. That will require developing/modifying player plugins controllable from [http://en.wikipedia.org/wiki/Javascript JavaScript] and implementing specific features needed for video surveillance applications - multiple camera views on the same page, digital PTZ (inside the hi-res incoming stream) and temporal decimation (reducing the frame rate) that uses as few CPU resources as possible.<br />
<br />
A web-based user interface can be especially useful for open hardware, as it lowers the entry threshold for a developer who would like to customize the camera's functionality - regular web development tools are sufficient for the job.<br />
<br />
=== Camera Interface ===<br />
<br />
Camera now has two alternative APIs:<br />
<br />
==== ccam.cgi ====<br />
<br />
Original interface that supports most camera features - [[ccam.cgi]]<br />
<br />
and<br />
<br />
==== API compatible with Axis cameras ====<br />
<br />
This ([[AxisAPI]]) makes Elphel cameras work with some third-party software<br />
<br />
==== JavaScript library ====<br />
We will create a set of JavaScript routines to control the cameras, which can be used in different AJAX applications.<br />
See [[JavaScript API]]<br />
<br />
=== Camera Software ===<br />
==== File systems ====<br />
[[333_File_System]]<br />
<br />
=== Client Software ===<br />
==== [[MPlayer]] ====<br />
We have MPlayer patched for use with our cameras. Patches for the source code are available on SourceForge, but the compiled package is only for Debian on the i386 architecture. We plan to make compiled packages for the PowerPC architecture and also for Slackware.<br />
<br />
==== [[HTML Video Surveillance]] ====<br />
[http://sourceforge.net/project/showfiles.php?group_id=105686&package_id=138717&release_id=358392 Multiple camera view HTML page] is based on [http://sourceforge.net/projects/genres/ GenReS plugin] for [http://www.mozilla.org/ Mozilla/FireFox].<br />
Now works: scrolling by picture dragging (digital PTZ), camera selection, zoom switch, automatic detection of stream stop by timeout.<br />
The list of camera addresses is currently editable manually. It will be automatically generated in the [[Live CD]].<br />
The page will run recording software at the user's request. Video will be saved to a fixed directory and split into separate files with a tunable number of frames.<br />
The main parameters of video capture will be changeable from the page.<br />
The page later can be used in the [[#Video Server]].<br />
==== [[Live CD]] ====<br />
The Elphel live Linux CD contains software for camera users. We will also make a live DVD for developers.<br />
The live CD is based on [http://knopper.net/knoppix/ Knoppix].<br />
<ul>The software which should be included in future releases of the Live CD<br />
<li>[[HTML Video Surveillance]]<br />
<li>[http://lives.sourceforge.net LiVES] video editor<br />
<li>client software packages with simple installation for different distributions of GNU/Linux<br />
</ul><br />
Currently we have CD only for i386 architecture.<br />
<br />
We have plans to make Live CD for PowerPC too.<br />
<br />
We should move to DVD distribution as most of the disks are anyway provided with the hardware, not downloaded.<br />
<br />
The idea of keeping Knoppix as full as possible was to introduce GNU/Linux to camera users who never had this experience before. But these users will get the DVD in a box; the downloadable CD version can have more standard packages removed and replaced with camera-specific software.<br />
<br />
One such major addition will be a preinstalled camera development environment (possibly based on [http://www.eclipse.org Eclipse]) to simplify modification of the camera code. Again - don't forget that many of those future developers now use only Visual Studio (or however exactly it is called?) and GNU/Linux can be somewhat alien to them. With this environment they might start playing with their code without prior knowledge of the GNU/Linux software development process.<br />
<br />
It can be useful for the hardware/FPGA developers too - to be able to write some code to support the hardware features without spending much time on mastering the software development process.<br />
<br />
=== Video Server ===<br />
A PC-based video server that will archive incoming Ogg Theora streams from several cameras and transcode them on the fly to lower resolution (binary decimation, windowing) and frame rate (i.e. using only key frames), presenting multiple streams (real time and recorded) to the operator. The external interface of the server might be one of the industry standards, compatible with 3rd-party legacy software.<br />
<br />
== Camera hardware ==<br />
=== RTC ===<br />
[[RTC]]<br />
=== 10331 ===<br />
[[10331]]<br />
===10332 ===<br />
[[10332]]<br />
=== 10334 ===<br />
[[10334]]<br />
<br />
== Active Projects ==<br />
=== Synchronization of the Cameras ===<br />
<br />
Sometimes you need to acquire images triggered by an external event, or several cameras need to be synchronized with each other. [[Camera Synchronization]] is all about it.<br />
<br />
=== Photo-finish ===<br />
Photo-finish device made of Elphel model 333 camera with additional FPGA code and software - [[Photo-finish]]<br />
<br />
=== Zeroconf for Elphel cameras ===<br />
[[zeroconf for Elphel cameras]]<br />
=== Elphel cameras and Zoneminder ===<br />
We plan to make model 333 camera work with [http://www.zoneminder.com Zoneminder]<br />
=== USB host interface ===<br />
<br />
daughter board with USB and DC-DC power for the lens control board [[10334]]<br />
<br />
=== Motorized lens control ===<br />
I'll try to retrieve what was written before on the motorized lens control. In short - C/CS mount is rather old and does not work well for interchangeable motorized lenses. We are trying to build an adapter from C/CS-mount to a bayonet type connector, and place a tiny 5mm wide PCB ring in that adapter. This [[10331]] PCB has a reprogrammable microcontroller and uses just 2 connections to the camera, with power and data signals combined. It provides all the necessary connections for most types of motorized lenses. <br />
<br />
lens control board [[10331]]<br />
DC-DC power board for motorized lens control board [[10332]]<br />
lens control board In System Programmer [[lbcontrol]]<br />
<br />
=== Outdoor enclosure ===<br />
<br />
<br />
'''Step Zero'''<br />
<br />
Determine working setup<br />
- Does the system need a control board <br />
- CCD board needs a longer cable for a minimal package when stacking the lens on top of the board [http://www.maartenmenheere.nl/blog/images/014-001-0.jpg Camera casing]<br />
<br />
'''Step one'''<br />
Test setup. Assemble all components in a setup that can record video<br />
<br />
Components in test setup<br />
- Lens (Computar H3Z4512CS varifocal lens? using power)<br />
- Elphel USB setup. Is it possible to directly plug in a USB drive? Where does the power come from?<br />
- Battery<br />
- USB cable or network cable<br />
- USB external hard drive or flash drive<br />
- ON/off switch<br />
<br />
Objective: Does it work at all?<br />
Secondary: Battery life? Video quality?<br />
<br />
'''Step two'''<br />
<br />
Wooden box. Test setup 1 integrated into an outdoor video testing setup.<br />
<br />
Components added in test 2<br />
- Hardboard casing<br />
<br />
Objective: Optimize the video recording setup for ease of use<br />
Secondary: Optimal settings? Correct lens?<br />
<br />
'''Step three'''<br />
<br />
Building of waterproof casing<br />
- Amphenol plugs<br />
- Camera window<br />
- Casing camera (fibre reinforced composite)<br />
- Casing base station (battery + storage) (fibre reinforced composite)<br />
<br />
[http://www.maartenmenheere.nl/blog/images/outdoorvideosystem.jpg Schematic]<br />
[http://www.maartenmenheere.nl/blog/images/014-001-0.jpg Camera casing]<br />
[http://www.maartenmenheere.nl/blog/images/camera.jpg Camera casing]<br />
<br />
=== Current enclosure design ===<br />
<br />
We are switching to an extruded aluminum tube (actually, the original 303/313 was also designed for a standard aluminum profile). The model 333 RJ-45 connector is designed to fit into an RJField shell [http://www.rjfield.com/ethernet_connectors_rjf_en.htm].<br />
<br />
[[Image:Tube_section.jpg]]</div>Pfavrhttps://wiki.elphel.com/index.php?title=Ccam.cgi&diff=1953Ccam.cgi2005-10-26T19:14:11Z<p>Pfavr: /* Image Quality, Gamma correction, Color Saturation */ request for better explanation of table base conversions</p>
<hr />
<div>== overview ==<br />
The interface described below and all the links are for the Model 333 camera, interface for the 313 is approximately (but not completely) the same.<br />
<br />
ccam.cgi (source - [http://cvs.sourceforge.net/viewcvs.py/elphel/camera333mjpeg/333/apps/ccam/ccam.c?view=markup ccam.c]) is currently the main interface to the camera functionality that uses GET method to pass parameters and receive the data back, so you may call it as<br />
<nowiki>http://<camera-ip-address>/admin-bin/ccam.cgi?parameter1=value1&parameter2=value2&...</nowiki><br />
Most parameters are persistent, so if the value is not specified it will be assumed to remain the same.<br />
These parameters are approximately related to the pairs of parameters passed to the main camera driver [http://cvs.sourceforge.net/viewcvs.py/elphel/camera333mjpeg/os/linux/arch/cris/drivers/cc333.c?view=markup cc333.c], which uses a specific sensor driver (for [http://www.micron.com/products/imaging/products/MT9T001.html Micron sensors] - [http://cvs.sourceforge.net/viewcvs.py/elphel/camera333mjpeg/os/linux/arch/cris/drivers/mt9x001.c?view=markup mt9x001.c]), from user space to the driver with IOCTL. The list of these 63 driver parameters is defined in [http://cvs.sourceforge.net/viewcvs.py/elphel/camera333mjpeg/os/linux/include/asm-cris/c313a.h?view=markup c313a.h] (names starting with "P_"); most of the values come in pairs, desired and actual:<br />
ioctl(devfd, _CCCMD(CCAM_WPARS , P_''name''), ''desired_value''); //set new value of the parameter <br />
''current_actual_value''=ioctl(devfd, _CCCMD(CCAM_RPARS , P_''name'' ), 0); // read current actual value - driver modifies the set value if needed to match the valid range.<br />
<br />
Writing these parameters does not cause immediate action; an additional write is needed to make the driver process the new values. Some parameters can be updated without interrupting the sensor operation and the video stream output if active (i.e. exposure time, panning without window resizing, analog gains, color saturation). Changes in other parameters (such as window size or decimation) will not be applied until the sensor is stopped.<br />
<br />
ioctl(devfd, _CCCMD(CCAM_WPARS , P_UPDATE ), 3); // "on the fly"<br />
 ioctl(devfd, _CCCMD(CCAM_WPARS , P_UPDATE ), 1); // stop the sensor if needed, write new parameters, start the sensor and wait a sensor-dependent number (usually 2) of potentially "bad" frames before sending images through the FPGA compressor.<br />
<br />
It is possible to read the current values of the CCAM_RPARS parameters using a special request to ccam.cgi, as an HTML table, a set of JavaScript assignments, or XML data<br />
<br />
There is only one copy of these kernel-space variables - they reflect the current state of a single sensor and a single compressor. <br />
<!-- http://192.168.0.44/admin-bin/ccam.cgi?opt=vhcxy&dv=1&dh=1&iq=70&kga=63&kgm=6&gr=17&gg=14&ggb=14&gb=17&csb=200&csr=200&bit=8&gam=57&pxl=10&pxh=254&e=25&ww=2048&wh=1536&wl=0&wt=0&fps=0&_time=1127851252626 --><br />
<br />
== ccam.cgi parameters ==<br />
Not all of the parameters are applicable to all sensors/cameras; some are obsolete.<br />
=== opt ===<br />
opt value is an unordered string of characters:<br />
{| border="1" cellpadding="2"<br />
|-<br />
| character || Description || Working?<br />
|-<br />
| h || Use hardware compression || Y<br />
|-<br />
| c || Treat the sensor as a color one; if not set, Bayer color filter processing is skipped || Y<br />
|-<br />
| x || Flip (mirror) image horizontally (uses in-sensor capabilities) || Y<br />
|-<br />
| y || Flip (mirror) image vertically (uses in-sensor capabilities) || Y<br />
|-<br />
| p || test pattern (ramp) instead of an image (for Micron sensors - same as "f" below) || Y<br />
|-<br />
| f || test pattern (ramp) generated in FPGA || Y<br />
|-<br />
| b || buffer file || N?<br />
|-<br />
| m || restart exposure after sending || N?<br />
|-<br />
| s || software trigger (for image intensifiers) - trigger if sum of pixels in a line > threshold || N?<br />
|-<br />
| t || external trigger - wait for external trigger input || N?<br />
|-<br />
| v || video mode - currently only means that it is not a reload from memory || Y<br />
|-<br />
| g || use background image || N?<br />
|-<br />
| q || return a quicktime movie clip || Y<br />
|-<br />
| u || updates (some) parameters "on the fly", returns 1x1 pix dummy image || Y<br />
|-<br />
| * || ignore lock file, recover from "camera in use" || Y<br />
|-<br />
|}<br />
<br />
----<br />
=== Frame size and resolution ===<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Key || Value range (3MPix sensor)|| Description || Working? ||"on the fly"? ||Notes<br />
|-<br />
| ww || 2..2048 || Sensor active window width (before decimation) || Y || N ||1<br />
|-<br />
| wh || 2..1536 || Sensor active window height (before decimation) || Y || N ||1<br />
|-<br />
| wl || 0..(2047-ww) || Sensor active window left margin (before decimation) || Y || Y || 2<br />
|-<br />
| wt || 0..(1535-wh) || Sensor active window top margin (before decimation) || Y || Y || 2<br />
|-<br />
| dh || 1..8 || Horizontal decimation (resolution/image size reduction) || Y || N || 3<br />
|-<br />
| dv || 1..8 || Vertical decimation (resolution/image size reduction) || Y || N || 3<br />
|-<br />
|}<br />
<br />
Notes:<br />
# Has to be (or will be truncated to) a multiple of a macroblock (16x16 pixels) after the decimation<br />
# Even value<br />
# Decimation for the MT9T001 3MPix sensor can be any integer from 1 to 8; for most other sensors - only 1/2/4/8. Because of the Bayer color filter mosaic, pixels are decimated in pairs, so decimation "4" means that for each pair of pixels used, 6 pixels are skipped.<br />
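To illustrate note 1 above, here is a small sketch (a hypothetical helper, not part of ccam.c) of the output size after decimation and macroblock truncation:<br />

```c
/* Hypothetical illustration of note 1 (not actual camera code):
 * output image dimension after decimation, truncated down to a
 * whole number of 16-pixel macroblocks. */
static int out_size(int window, int decimation)
{
    return (window / decimation) & ~15;  /* round down to a multiple of 16 */
}
```

For example, the full 2048x1536 window with dh=dv=4 yields a 512x384 image, matching the binning examples further down this page.<br />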
<br />
=== Exposure controls ===<br />
There are multiple factors that influence image pixel values under the same lighting conditions; one is exposure time.<br />
<br />
Most CMOS image sensors (including the Micron sensors used in Elphel cameras) use an [[Electronic Rolling Shutter]]. <br />
<br />
<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Key || Value range (3MPix sensor)|| Description || Working? ||"on the fly"? ||Notes<br />
|-<br />
| e || 0..600000 || exposure time (0.1 msec step) || Y || Y || 1<br />
|-<br />
| vw || ? || virtual frame width || Y || ? || 2<br />
|-<br />
| vh || ? || virtual frame height || Y || ? || 3<br />
|-<br />
| fps= || xx.xx || desired frame rate || Y || ? || 4<br />
|-<br />
| sclk= || 6..48 || sensor clock (in MHz) || Y || N || 5<br />
|-<br />
| fclk= || 0..127 || FPGA clock (in MHz) || Y || N || 6<br />
|-<br />
| xtra= || 0..?? || extra frame time || Y || N || 7<br />
|-<br />
<br />
|}<br />
Notes:<br />
# The sensor driver will calculate the number of lines of exposure and will increase the virtual frame height (vertical blanking) if needed (but currently - not the virtual frame width - horizontal blanking). For longer exposures you may want to do that manually or decrease the sensor clock frequency. ''Update'' - for the MT9T001 sensor that might not be needed - I'll fix the driver --[[User:Andrey.filippov|Andrey.filippov]] 12:39, 29 September 2005 (MDT). '''Done in version 6.4.9''' - now the frame time (for MT9T001 only) can be as long as 0xfffff (approximately 1 million) scan lines - nearly a full minute with the full frame and a 48MHz clock.--[[User:Andrey.filippov|Andrey.filippov]] 11:27, 11 October 2005 (MDT)<br />
# It is possible to extend line readout time, but is not normally needed/used.<br />
# Explicitly specified virtual frame height - this parameter (if present) overwrites exposure setting. Not normally needed.<br />
# The driver will try to reduce the frame rate by adding vertical blanking - limited by the maximal blanking time<br />
# Sensor clock; may be used with the 1.3 and 2 MPix sensors to allow longer exposure times (not needed with the MT9T001 with rev. 6.4.9 or later). It can also make sense to reduce the frequency when the maximal frame rate is not needed, to reduce sensor noise visible as horizontal lines in early revisions of the MT9T001 sensor. You may read the sensor chip ID (revision/type) from telnet as "hello -IR ba 0" ("hello -IR" will read all the sensor registers). The current FPGA code uses the sensor clock to synchronize the sensor power supply, so the sensor power can be lost if this clock is too low; 6MHz is safe to use. On the upper side, 48MHz is the maximal clock frequency for these sensors; the driver limits this value.<br />
# FPGA clock frequency (drives the compressor and frame buffer memory). For the model 313 the practical limit was about 95MHz and you could easily change it "on the fly". The model 333 camera uses DDR SDRAM, and the implemented FPGA interface to the DDR SDRAM needs a clock phase adjustment for the memory when you change the frequency. Currently it can be done manually through telnet as "fpcf -phase 0 200". Initial values for the sensor and FPGA clock frequencies may be set in the [http://cvs.sourceforge.net/viewcvs.py/elphel/camera333mjpeg/packages/initscripts/333/fpga?view=markup /etc/init.d/fpga] initialization script of the camera.<br />
# For debugging purposes (probably needed only for the model 313 camera) the frame period may be increased by the specified number of pixel clock periods. It was intended to fine-tune the frame period (which depends on multiple sensor settings) and make sure it is not shorter than the compressor can handle (the 333 compressor is faster).<br />
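As a rough illustration of note 1, the driver's exposure arithmetic can be sketched as follows (hypothetical helper names and simplified math - this is not the actual mt9x001.c code):<br />

```c
/* Sketch of note 1 (simplified, hypothetical): the "e" parameter is in
 * 0.1 ms units; the driver converts it to scan lines using the line
 * readout time (virtual width / sensor clock) and extends the virtual
 * frame height (vertical blanking) when the exposure exceeds the frame. */
static int exposure_lines(long long e_tenths_ms, int virtual_width, int sclk_mhz)
{
    long long line_ns = (long long)virtual_width * 1000 / sclk_mhz; /* ns per line */
    long long exp_ns  = e_tenths_ms * 100000;                       /* 0.1 ms in ns */
    return (int)(exp_ns / line_ns);
}

static int needed_frame_height(int exp_lines, int window_height)
{
    /* vertical blanking is added when the exposure is longer than the frame */
    return exp_lines > window_height ? exp_lines : window_height;
}
```

For example, at the full 2048-pixel virtual width and a 48MHz sensor clock one line takes about 42.7 microseconds, so e=10 (1 ms) is roughly 23 lines - far less than the 1536-line frame, so no extra blanking is needed.<br />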
<br />
===Binning ===<br />
<br />
Binning effectively increases the sensor sensitivity when operating at reduced resolution (decimation). Decimation still determines the resolution; binning defines how many pixel pairs are added together.<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Key || Value range (3MPix sensor)|| Description || Working? ||"on the fly"? ||Notes<br />
|-<br />
| bh || 1..dh || Horizontal binning (sensitivity for lower resolution) || Y || Y || 1<br />
|-<br />
| bv || 1..dv || Vertical binning (sensitivity for lower resolution) || Y || Y || 1<br />
|-<br />
|}<br />
Notes:<br />
# Currently for the MT9T001 sensor only; works for all vertical binning values, but not all of the horizontal ones (some have no effect, others produce vertical lines). I would expect these glitches to be fixed by Micron in newer sensors. <br />
<br />
Here are two examples:<br />
<br />
1. Full frame with decimation by 4 in each direction will result in an image of 512*384 pixels, with pixel values the same as for the full resolution (only 2x2 pixels of each 8x8 are used, the others are discarded)<br />
 ww=2048&wh=1536&wl=0&wt=0&dv=4&dh=4&bh=1&bv=1<br />
2. Full frame with decimation by 4 in each direction will result in an image of 512*384 pixels, with pixel values 16 times higher than for the full resolution (all 8x8 pixels are used; values are added together following the Bayer RG/GB mosaic - reds with reds, greens with greens, blues with blues)<br />
ww=2048&wh=1536&wl=0&wt=0&dv=4&dh=4&bh=4&bv=4<br />
<br />
=== Analog Gains ===<br />
<br />
Most sensors have some controls for the analog signal gains before the pixel data is digitized. Some sensors (such as the now-discontinued Kodak KAC-1310) have individual color gains plus a separate global gain; others (such as the Micron ones) have only color gains. Usually there are two "green" gains, as with Bayer mosaic filters there are two green pixels in each 2x2 pixel cell (RG/GB).<br />
The gain scale can be far from linear; too low a gain setting might not be enough to saturate the pixel value to 1023 (usually 255 after conversion) even in very bright light.<br />
<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Key || Value range (3MPix sensor)|| Description || Working? ||"on the fly"? ||Notes<br />
|-<br />
| gr || 0..63 || analog gain RED (or mono) || Y || Y || <br />
|-<br />
| gg || 0..63 || analog gain GREEN (or green in "red" line) || Y || Y || <br />
|-<br />
| gb || 0..63 || analog gain BLUE || Y || Y ||<br />
|-<br />
| ggb || 0..63 || analog gain GREEN in "blue" line) || Y || Y ||<br />
|-<br />
| kga || 63 || Kodak KAC1310 analog gain Y (all colors) || Y || Y || 1<br />
|-<br />
| kgb || ? || Kodak KAC1310 analog gain ? (all colors) || ? || Y || 1<br />
|-<br />
| kgm || 6 || Kodak KAC1310 mode || Y || Y || 1<br />
|-<br />
|}<br />
# Used in Kodak KAC-1310 (now obsolete) sensors. For the MT9?001 sensors the driver just multiplies gr, gg, gb and ggb by kga/63. It is better to keep it at 63 (or not use it at all) for this family of sensors.<br />
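The interaction described in note 1 amounts to one line of (hypothetical, illustrative) code:<br />

```c
/* Illustration of note 1 (hypothetical helper, not driver code): for the
 * MT9x001 family the driver scales every color gain by kga/63, so
 * kga=63 (the recommended setting) leaves the gains unchanged. */
static int effective_gain(int color_gain, int kga)
{
    return color_gain * kga / 63;
}
```

This is why leaving kga at its maximum of 63 is the neutral choice for the Micron sensors: any lower value just attenuates all four color gains.<br />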
<br />
=== Image Quality, Gamma correction, Color Saturation ===<br />
<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Key || Value range (3MPix sensor)|| Description || Working? ||"on the fly"? ||Notes<br />
|-<br />
| iq || 1..99 || JPEG Quality (%) || Y || ? || 1<br />
|-<br />
| gam || 0.13 .. 10 || Gamma correction value (%) || Y || Y || 2<br />
|-<br />
| pxl || 0..255 || Black level || Y || Y || 3<br />
|-<br />
| pxh || 0..255 || White level || Y || Y || 3<br />
|-<br />
| csb || 0..710 || Color Saturation (%), Blue || Y || Y || 4<br />
|-<br />
| csr || 0..562 || Color Saturation (%), Red || Y || Y || 4<br />
|-<br />
|}<br />
# Standard JPEG compression quality (%). Earlier, negative values were used (in software compression mode only) to generate BMP images: "-1" meant uncompressed BMP and "-2" - RLE-compressed BMP. That code is likely bit-rotten by now.<br />
# The camera implements virtually arbitrary table-based conversion from 10-bit sensor data to the 8 bits used for compression. Each table has 256 entries of 2 bytes each; the tables are indexed by the 8 MSBs of the pixel data. One byte holds the base value for linear interpolation - the output value when the remaining LSBs (2 for 10-bit sensors) are zero. The other byte is the signed (-128..+127) increment between consecutive base values: for 10-bit data this difference is multiplied by the pixel's 2 LSBs and divided by 4 (for 2 bits), and the result is added to the base value. Of course, these second bytes could be calculated from the base ones, but in the current FPGA code there are no extra cycles to retrieve 2 values and subtract them from each other (maybe will do that later --[[User:Andrey.filippov|Andrey.filippov]] 17:18, 11 October 2005 (MDT)), so both bytes in each table entry have to be filled by software. The model 313 camera had a single table for all colors; the model 333 FPGA has bigger tables, so each of the 4 colors (including 2 greens) has an individual 256-entry table. Currently, for simplicity of the interface, ccam.cgi calculates a single table from the gamma value (100 - linear, 47 - standard gamma setting for video cameras) and 2 of the values below (pxl and pxh). Together they make something like "levels" in image manipulation programs (such as [http://www.gimp.org/ GIMP]), but the camera hardware (FPGA code) allows more flexible "curves" control. (I didn't understand the preceding paragraph, maybe some figures would help? --[[User:Pfavr|Pfavr]] 14:14, 26 October 2005 (CDT))<br />
# Values for the 8 MSBs of the sensor data that map to total black (0x00) and total white (0xff) of the output signal. Sensors have different modes of auto-zero; with default settings the MT9T001 sensor adjusts the black level so that in complete darkness each pixel would output 0x18 (8 MSBs are 0x0a or 10 decimal). Other sensors have different values, and it is also possible to reprogram sensors to change the "hardware" black value if needed.<br />
# Color saturation values for the blue (B-G) and red (R-G) color components, in (%). In linear mode (gam=100) true colors are produced with color saturations of 1.0 (100), but for lower gamma settings the color saturation should be increased to compensate for the reduced contrast of the image - with the mosaic color filter pattern, a lower relative difference between pixels is decoded as a less intense color.<br />
<br />
=== Histograms ===<br />
<br />
The model 333 camera calculates histograms individually for each of the 4 colors (including 2 greens). Histograms are calculated inside a specified window; the following parameters are written directly to the FPGA - there are no shadows in kernel space yet (so no way to read back the current values). As the sensors use a 2x2 pixel mosaic, these 4 values are made even (by truncating the LSB).<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Key || Value range (3MPix sensor)|| Description || Working? ||"on the fly"? ||Notes<br />
|-<br />
| hl || 0..2046 || Histogram window left margin || Y || Y ||<br />
|-<br />
| ht || 0..1534 || Histogram window top margin || Y || Y ||<br />
|-<br />
| hw || 0..2048 || Histogram window width || Y || Y ||<br />
|-<br />
| hh || 0..1536 || Histogram window height || Y || Y ||<br />
|-<br />
|}<br />
<br />
hl= distance from the left active window border to the left edge of the histogram calculation window, default=0<br />
<br />
ht= distance from the top active window border to the top edge of the histogram calculation window, default=0<br />
<br />
hw= histogram calculation window width, default=0xffe (will extend to the bottom right corner)<br />
<br />
hh= histogram calculation window height, default=0xffe (will extend to the bottom right corner)<br />
<br />
Currently position and size of the histogram window is truncated to even values (LSB ignored).<br />
<br />
Histogram calculation is always on when the sensor is running (normally it is, even if no stream is output). The FPGA uses<br />
two pages of internal memory and switches between them when ready. For each frame it first writes zero to each histogram<br />
value (4x256) and then adds pixels (after converting from 10-bit sensor data to 8 bits using the "curves" tables), limiting<br />
each value to 2^18-1 (a hardware limitation). If you read the histogram table asynchronously, it is likely that the sum will<br />
differ from the total number of pixels, as the FPGA could switch pages while you were reading. But it switches only at<br />
the end of a frame, so no partial sums will be read out.<br />
<br />
There are two ways to read the histogram table now:<br />
<br />
* manually (through telnet) using "fpcf -histogram" - it will print data as hex values or<br />
* read binary file (4*256*32bits=4KB) /dev/histogram or through a symlink at <nowiki>http://<camera_ip>/histogram</nowiki><br />
<br />
The four (as number of color filters in Bayer mosaic) tables will be read out in the following order:<br />
<br />
R (256 values), Gr (green in the "red" row - 256 values), Gb (green in the "blue" row - 256 values), B (256 values)<br />
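A minimal sketch of parsing that 4KB dump (a hypothetical helper; the page only specifies the sizes, the 32-bit counter width and the R, Gr, Gb, B order):<br />

```c
#include <stdint.h>

/* Hypothetical sketch (not actual camera code): the /dev/histogram dump
 * is 4 colors x 256 bins of 32-bit counters, in R, Gr, Gb, B order.
 * Each counter is clipped to 2^18-1 by the FPGA. */
enum { HIST_R, HIST_GR, HIST_GB, HIST_B };

/* Sum of all bins of one color; with a frame-synchronous read this
 * should equal the pixel count of that color in the histogram window. */
uint64_t hist_color_total(const uint32_t buf[4 * 256], int color)
{
    uint64_t total = 0;
    for (int bin = 0; bin < 256; bin++)
        total += buf[color * 256 + bin];
    return total;
}
```

As the text above notes, an asynchronous read may straddle a page switch, so such a total may not match the frame's pixel count exactly.<br />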
<br />
This is the same order in which the "curves" tables are written to the FPGA. Currently ccam.cgi can only fill these tables from<br />
the gamma value, the same for all colors. But you can experiment by creating a text file with 1024 hex values<br />
(ccam.c shows how to build it), copying it to the camera file system and using "fpcf -table 400 <path_to_table>"<br />
to transfer it to the FPGA. If you then reacquire an image without changing the gamma value, ccam.cgi will not overwrite<br />
the table you've just downloaded.<br />
<br />
It seems that colors work correctly with all image orientations and decimations for all Micron sensors. If not -<br />
let me know; you can temporarily compensate for wrong colors by adding "&byr=<0..3>" (Bayer phase shift) to the image URL - it reassigns the<br />
RG/GB mosaic in different ways, but the sequence of the colors R, Gr, Gb, B in the FPGA tables ("curves", histogram)<br />
will still correspond to the colors in the JPEG output.<br />
<br />
=== HTML, XML or VRML ===<br />
<br />
ccam.cgi can send HTML, XML or VRML (code broken, needs to be restored) files, not just images, if any of the html, htmlr, htmll or htmlj parameters are present in the URL.<br />
<br />
{| border="1" cellpadding="2"<br />
|-<br />
| Key || Value range (3MPix sensor)|| Description || Working? ||"on the fly"? ||Notes<br />
|-<br />
| html || 0 || no output || Y || Y ||<br />
|-<br />
| || 1 || all sensor parameters as javaScript || Y || Y || 1, 4<br />
|-<br />
| || 2 || all sensor parameters as html || Y || Y || 2, 4<br />
|-<br />
| || 3 || beam data as javaScript || Y || Y || 1, 8<br />
|-<br />
| || 4 || beam data as html || Y || Y || 2, 8<br />
|-<br />
| || 5 || state (5 -picture ready) as javaScript || Y || Y || 1,5<br />
|-<br />
| || 6 || state (5 -picture ready) as html || Y || Y || 2,5<br />
|-<br />
| || 7 || start image acquisition (option "s" or "t" should be present) || ? || Y || 6<br />
|-<br />
| || 8 || reset waiting for trigger || ? || Y || 7<br />
|-<br />
| || 10 || all sensor parameters as XML || Y || Y || 3<br />
|-<br />
| || 11 || beam data as XML || Y || Y || 3,8<br />
|-<br />
| || 12 || state (5 -picture ready) as XML || Y || Y || 3, 5<br />
|-<br />
| || 13 || start image acquisition (option "s" or "t" should be present), return XML || Y || Y || 3, 6<br />
|-<br />
| || 14 || reset waiting for trigger, return XML || Y || Y || 3,7<br />
|-<br />
| htmlr || n || Refresh each n seconds || Y || Y || 9<br />
|-<br />
| htmll || escaped string || command to be executed onLoad in <nowiki><body></nowiki> tag || Y || Y || 10<br />
|-<br />
| htmlj || escaped string || include javaScript file || Y || Y || 11<br />
|-<br />
<br />
<br />
|}<br />
<br />
Notes:<br />
# The head section of the html output file will have javascript assignments "document.''variable_name''=value;" for each parameter. There are no visible elements in the file - it was intended to be used in a frame set before XMLHttpRequest was supported in most browsers.<br />
# Parameters are output as a two-column html table (first column - name, second - value).<br />
# Parameters and their values are output as XML file.<br />
# Sensor-related parameters are output<br />
# Only the sensor/compressor state is output. State 7 - the sensor is running, constant compression is off (single-frame mode); state 8 - the compressor is in constant compression mode (such as during streaming): static images cannot be acquired, and some acquisition parameters cannot be changed without stopping the compression.<br />
# This was designed for sensors with asynchronous reset (such as the now-obsolete Zoran ones). Don't remember what it will do (or how to use it) with the Micron ones.<br />
# Reset waiting for an external trigger (not sure if it still works)<br />
# Output beam parameters (center of gravity, half width in x, y, etc.). This code is broken now, but might be repaired.<br />
# Instruct the html page to refresh itself each specified number of seconds.<br />
# Value is an "escaped" string that contains a javaScript command to be executed when the page is loaded (body onLoad).<br />
# Value is an "escaped" string that has the path of the external javaScript file to be included inside the <nowiki><head></nowiki> tag of the page<br />
<br />
<br />
== below is yet unedited text from ccam.c comments ==<br />
<br />
<br />
<br />
<br />
* vrmld - decimation to make a grid (>=8 for full frame) (default = 16)<br />
* vrmlb - number of individual blocks in each x/y (default=2)<br />
* vrmlm - maximal pixel. 1023 - full scale, less - increase contrast, 0 - automatic (default =1023)<br />
<br />
* vrmli - indentation (default=1)<br />
* vrmlf - format - 0 - integer, 1 - one digit after "." (default 0)<br />
* vrmll - number of countours to build (default = 32)<br />
* vrmlo - options for isolines - e - elevated, f - flat (default=ef)<br />
* vrmlz - 0..9 output (gzip) compression level (0 - none, 1 - fastest, default - 6, best -9)<br />
<br />
* hist=n - read frame from "history" applies only to rereading from memory after acquisition of a clip<br />
n<=0 - from the end of clip (0 - last), n>0 - from the start (1 - first)<br />
<br />
<br />
<br />
* pfh - photofinish mode strip height (0 - normal mode, not photofinish). In this mode each frame will consist of multiple<br />
pfh-high horizontal (camera should be oriented 90 deg. to make vertical) strips, and no extra lines will be added to the frames<br />
for demosaic<br />
for now: +65536 - timestamp for normal frames, +131072 - timestamps for photo-finish mode<br />
* ts - time stamp mode: 0 - none, 1 - in upper-left corner, 2 - added to the right of the image (photo-finish mode) <br />
* fsd - frame sync delay (in lines) from the beginning of a frame (needed in photofinish mode - 3 lines?)<br />
<br />
<br />
<br />
* _time=t (ms) will try to set current system time (if it was not set already. _stime - will always set)<br />
<br />
<br />
<br />
<br />
<br />
<br />
* fpns - 0..3 fpga background subtraction:<br />
* 0 - none,<br />
* 1 (fine) - subtract 8-bit FPN from 10-bit pixel<br />
* 2 - multiply FPN by 2 before subtracting<br />
* 3 - multiply FPN by 4 before subtracting (full scale)<br />
* note: negative result is replaced by 0, decrease FPN data before applying for "fat 0"<br />
* fpnm - multiply by inverse sensitivity (sensitivity correction) mode:<br />
* 0 - no correction<br />
* 1 - fine (+/- 12.5%)<br />
* 2 - medium (+/- 25%)<br />
* 3 - maximal (+/- 50%)<br />
* pc - pseudo color string. Applies to monochrome images and vrml<br />
<br />
* any of vrml* specified - vrml instead of a picture/html<br />
*<br />
* background measurement/subtraction will (now) work only with 10-bit images<br />
* gd = "digital gain" 0..5 (software)<br />
* byr =0..3 Overwrite Bayer phase shift, =4 - use the value calculated by the driver.<br />
<br />
* bit - pixel depth (10/4/8)<br />
* shl - shift left (FPGA in 8 and 4-bit modes) - obsolete<br />
* clk - MCLK divisor - 80MHz/(2..129) - obsolete?<br />
<br />
<br />
* bg = n - calculate background 1-2-4..16 times (does not need option s/t/v)<br />
* parameters for "instant on" quicktime<br />
* qfr = n - number of frames to send in a quicktime clip<br />
* qpad = % to leave for the frame size to grow (initial size 1-st frame * (100- 1.5*qpad)/100<br />
* qdur = frame duration in 1/600 of a second<br />
* parameters for quicktime clips (send after shooting)<br />
* qsz = n - clip size in KB (w/o headers) (<=0 will use "instant on") - will be obsolete<br />
* qcmd= (mandatory for videoclip)<br />
1 - start constant compression of all acquired frames<br />
2 - stop constant compression.<br />
3 - acquire the whole buffer and stop<br />
4 - read movie from buffer<br />
6 (and 5?) - stop, then read<br />
7 - acquire buffer, then read<br />
<br />
* qts = t - playback time/real time</div>Pfavr