Roadmap
Background
Elphel was started in 2001 (Magna, UT, USA) when I quit my job, inspired by the possibilities that Free Software gives to developers (it was not so obvious to my employer at the time). All the projects were covered in LinuxDevices (a complete list of the articles is available here). For several years Elphel was a one-man company. In January 2004 I wrote an article, Taming of the Iron Penguin (Russian), in the largest Russian computer-related magazine Computerra and announced there a competition among software developers for the best video streamer to run in the camera. That was a good idea, and after the competition was over most of the developers remained in the Elphel team - at first as volunteers, later as full/part time employees.
Not all of these developers live in Russia - two, including the winner of the competition, are from Kiev, Ukraine. Still, all of them know Russian much better than English, so most of our technical discussions were held on our private Russian-language forum. So far I have failed to move these discussions to a broader audience, but I believe that Wiki technology can help. Here we will maintain most of the site in English but still have some pages/discussions in Russian, translating documents as we go, or when somebody else needs it and is not satisfied with Babelfish automatic translation. We will try to keep the English pages current - in any case, even within Elphel not everybody knows Russian.
Please excuse the not-so-good English of our developers and feel free to fix errors if you see them.
--Andrey.filippov 15:34, 22 September 2005 (CST)
Software Architecture of Elphel 3x3 cameras
The software in the Elphel cameras started from the Axis Developer Boards Software and was amended for camera-specific functions. It was modified to work with newer hardware (models 303-313/323-333) and to support more features, and now it seems to be a good time to make a major redesign instead of applying incremental changes.
Some discussion has already started in Russian here - Nc3x3
Related to the architecture are the #Camera Interface and the #Client Software
Elphel will continue developing a web browser based user interface using AJAX techniques. That will require developing/modifying player plugins controllable from JavaScript and implementing specific features needed for video surveillance applications - multiple camera views on the same page, digital PTZ (within the hi-res incoming stream) and temporal decimation (reducing the frame rate) that uses as little CPU as possible.
A web-based user interface can be especially useful for open hardware, as it lowers the entry barrier for developers who would like to customize the camera's functionality - regular web development tools are sufficient for the job.
Camera Interface
The camera now has two alternative APIs:
ccam.cgi
Original interface that supports most camera features - ccam.cgi
and
API compatible with Axis cameras
This (AxisAPI) makes Elphel cameras work with some third-party software
JavaScript library
We will create a set of JavaScript routines to control the cameras, which can be used in different AJAX applications. See JavaScript API
Camera Software
File systems
Client Software
Recorder
We are starting a new recorder for the MJPEG RTP stream. It will be a small standalone program which takes a multicast IP address and port, receives the stream, and records it to files or to stdout. The most important requirement for the recorder is to avoid frame drops (maybe something can be done with network buffering) and to track them using timestamps. The recorder should also produce manageable files (under 2GB) with zero drops between them; a minimal sketch of the receive-and-split loop is shown below.
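The sketch below only illustrates that loop, assuming the stream can be read as plain UDP datagrams from a multicast group; the group address, default port and output file names are placeholders, and RTP depacketization and timestamp-based drop accounting are left out.

```c
/* Sketch of the planned recorder: receive a multicast UDP stream and
 * write it to files, starting a new file before the 2GB limit is reached. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>

#define FILE_LIMIT (2000LL * 1024 * 1024)   /* keep each file under 2GB */

int main(int argc, char **argv)
{
    const char *group = argc > 1 ? argv[1] : "239.0.0.1"; /* placeholder group */
    int port = argc > 2 ? atoi(argv[2]) : 20000;          /* placeholder port  */

    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) { perror("socket"); return 1; }

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(port);
    if (bind(sock, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("bind");
        return 1;
    }

    struct ip_mreq mreq;                     /* join the multicast group */
    mreq.imr_multiaddr.s_addr = inet_addr(group);
    mreq.imr_interface.s_addr = htonl(INADDR_ANY);
    setsockopt(sock, IPPROTO_IP, IP_ADD_MEMBERSHIP, &mreq, sizeof(mreq));

    long long written = 0;
    int file_no = 0;
    char name[64];
    snprintf(name, sizeof(name), "record%04d.mjpeg", file_no);
    FILE *out = fopen(name, "wb");

    char buf[65536];
    for (;;) {
        ssize_t n = recv(sock, buf, sizeof(buf), 0);
        if (n <= 0)
            break;
        /* a real recorder would strip the RTP header here and check
         * sequence numbers/timestamps to detect and log dropped frames */
        fwrite(buf, 1, (size_t)n, out);
        written += n;
        if (written >= FILE_LIMIT) {         /* split into manageable files */
            fclose(out);
            snprintf(name, sizeof(name), "record%04d.mjpeg", ++file_no);
            out = fopen(name, "wb");
            written = 0;
        }
    }
    fclose(out);
    close(sock);
    return 0;
}
```

Splitting on a byte limit keeps every file safely under 2GB regardless of frame size; the real recorder would split on a frame boundary so that no frame is lost between files.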
Plugins for browsers
MPlayer
We have MPlayer patched for use with our cameras. Patches for the source code are available on SourceForge, but the compiled package is only for Debian on the i386 architecture. We plan to make compiled packages for the PowerPC architecture and also for Slackware.
HTML Video Surveillance
The multiple camera view HTML page is based on the GenReS plugin for Mozilla/Firefox. Working now: scrolling by dragging the picture (digital PTZ), camera selection, zoom switching, and automatic detection of a stopped stream by timeout. The list of camera addresses is currently edited manually; it will be generated automatically in the Live CD. The page will run the recording software on user request. Video will be saved to a fixed directory and split into separate files by a tunable number of frames. The main video capture parameters will be changeable from the page. The page can later be used in the #Video Server.
Live CD
The Elphel live Linux CD contains software for camera users. We will also make a live DVD for developers. The live CD is based on Knoppix.
The software which should be included in future releases of the Live CD:
- HTML Video Surveillance
- LiVES video editor
- client software packages with simple installation for different distributions of GNU/Linux
Currently we have CD only for i386 architecture.
We have plans to make Live CD for PowerPC too.
We should move to DVD distribution, as most of the disks are provided with the hardware anyway rather than downloaded.
The idea of keeping Knoppix as complete as possible was to introduce GNU/Linux to camera users who have never had this experience before. But these users will get the DVD in a box, while the downloadable CD version can have more of the standard packages removed and replaced with camera-specific software.
One such major addition will be a preinstalled camera development environment (possibly based on Eclipse) to simplify modification of the camera code. Again - don't forget that many of those future developers currently use only Visual Studio (or whatever it is called exactly) and GNU/Linux can be somewhat alien to them. With this environment they might start playing with the code without prior knowledge of the GNU/Linux software development process.
It can be useful for hardware/FPGA developers too - they would be able to write some code to support the hardware features without spending much time mastering the software development process.
See LiveCD Release Notes for the schedule.
Video Server
A PC-based video server that will archive incoming Ogg Theora streams from several cameras and transcode them on the fly to lower resolution (binary decimation, windowing) and lower frame rate (e.g. using only key frames), presenting multiple streams (real time and recorded) to the operator. The external interface of the server might follow one of the industry standards so that it is compatible with third-party legacy software. A sketch of the binary decimation step is shown below.
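As an illustration of the binary decimation step only (not of the server, Theora decoding or windowing), the sketch below halves a decoded 8-bit grayscale frame by averaging 2x2 blocks of pixels; the function name and pixel format are assumptions.

```c
#include <stdint.h>

/* Halve a decoded 8-bit grayscale frame in both dimensions by averaging
 * each 2x2 block of input pixels (one step of binary decimation). */
void decimate2x2(const uint8_t *in, int width, int height, uint8_t *out)
{
    int ow = width / 2, oh = height / 2;
    for (int y = 0; y < oh; y++) {
        for (int x = 0; x < ow; x++) {
            int sum = in[(2 * y)     * width + 2 * x]
                    + in[(2 * y)     * width + 2 * x + 1]
                    + in[(2 * y + 1) * width + 2 * x]
                    + in[(2 * y + 1) * width + 2 * x + 1];
            out[y * ow + x] = (uint8_t)((sum + 2) / 4);  /* rounded average */
        }
    }
}
```

Repeating the same step gives further power-of-two reductions, while keeping only key frames reduces the frame rate without having to re-encode the stream.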
Camera hardware
RTC
10331
10332
10334
Active Projects
Synchronization of the Cameras
Sometimes you need to acquire images triggered by an external event, or several cameras need to be synchronized with each other. Camera Synchronization is all about that.
Photo-finish
A photo-finish device made of an Elphel model 333 camera with additional FPGA code and software - Photo-finish
Zeroconf for Elphel cameras
Elphel cameras and Zoneminder
We plan to make model 333 camera work with Zoneminder
USB host interface
Daughter board with USB and DC-DC power for lens control board 10334
Proposal for a USB audio solution: PCM2903
Motorized lens control
I'll try to retrieve what was written before on motorized lens control. In short - the C/CS mount is rather old and does not work well for interchangeable motorized lenses. We are trying to build an adapter from the C/CS mount to a bayonet-type connector, and to place a tiny 5mm-wide PCB ring in that adapter. This 10331 PCB has a reprogrammable microcontroller and uses just 2 connections to the camera, with power and data signals combined. It provides all the necessary connections for most types of motorized lenses.
- lens control board 10331
- DC-DC power board for motorized lens control board 10332
- lens control board In System Programmer lbcontrol
Outdoor enclosure
Step Zero
Determine the working setup:
- Does the system need a control board?
- CCD board needs a longer cable for a minimal package when stacking the lens on top of the board
- Camera casing
Step one
Test setup. Assemble all components in a setup that can record video.
Components in the test setup:
- Lens (Computar H3Z4512CS varifocal lens? using power)
- Elphel USB setup. Is it possible to directly plug in a USB drive? Where does the power come from?
- Battery
- USB cable or network cable
- USB external hard drive or flash drive
- On/off switch
Objective: does it work at all? Secondary: battery life? Video quality?
Step two
Wooden box. Test setup 1 integrated into an outdoor video testing setup.
Components added in test 2:
- Hardboard casing
Objective: optimize the video recording setup for ease of use. Secondary: optimal settings? Correct lens?
Step three
Building of the waterproof casing:
- Amphenol plugs
- Camera window
- Camera casing (fibre reinforced composite)
- Base station casing (battery + storage) (fibre reinforced composite)
Schematic of the camera casing
Current enclosure design
We are switching to an extruded aluminum tube (actually the original 303/313 was also designed for a standard aluminum profile). The model 333 RJ-45 connector is designed to fit into an RJField shell [1].
Removal of distortion
Removal of the distortions caused by the non-instantaneous exposure of the frame can be done in LiVES. But first some other infrastructure must be in place:
- Camera must start a videojack server on the host machine, with the correct fps, width, height and frame palette
- Camera must activate the videojack receiver in LiVES
- Camera can start to send unpackaged compressed frames to videojack server, along with an array of floats
- Floats will be in pairs for each horizontal band: the vertical compression/expansion (1.0 means no compression) and the horizontal shift (+-shift/width)
- LiVES will pull these frames and the float array from the videojack server
- LiVES will decode the frame and pass it, along with the float array, to a Weed effect which will apply the compression/expansion and shift (see the sketch after this list)
- LiVES will receive the altered frame and save it to a stream
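The Weed plugin interface itself is not shown here; the following is only a rough sketch of the geometric step, assuming a decoded 8-bit grayscale frame divided into equal horizontal bands, with one (vscale, hshift) pair per band as described above: vscale is the vertical compression/expansion factor (1.0 means no change) and hshift is the horizontal shift as a fraction of the frame width. The function name, band layout and nearest-neighbour sampling are assumptions, not the actual effect.

```c
#include <stdint.h>

/* Apply the per-band correction to a decoded 8-bit grayscale frame.
 * bands[2*b]     - vertical compression/expansion for band b (1.0 = none)
 * bands[2*b + 1] - horizontal shift for band b as a fraction of the width */
void correct_bands(const uint8_t *in, uint8_t *out,
                   int width, int height,
                   const float *bands, int nbands)
{
    int band_h = height / nbands;                 /* rows per band */
    if (band_h < 1)
        band_h = 1;

    for (int y = 0; y < height; y++) {
        int b = y / band_h;
        if (b >= nbands)
            b = nbands - 1;
        float vscale = bands[2 * b];
        float hshift = bands[2 * b + 1];
        if (vscale <= 0.0f)                       /* guard against bad data */
            vscale = 1.0f;

        /* undo the vertical compression/expansion within the band */
        int band_top = b * band_h;
        int sy = band_top + (int)((y - band_top) / vscale);
        if (sy < 0) sy = 0;
        if (sy >= height) sy = height - 1;

        /* undo the horizontal shift (whole pixels, nearest neighbour) */
        int shift_px = (int)(hshift * (float)width);
        for (int x = 0; x < width; x++) {
            int sx = x - shift_px;
            out[y * width + x] =
                (sx >= 0 && sx < width) ? in[sy * width + sx] : 0;
        }
    }
}
```

In the planned pipeline, something like this would run inside the Weed effect on the frame LiVES pulls from the videojack server, with the float array passed in as the effect's parameters.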