Eyesis4Pi data structure


Intro

Eyesis4Pi stores images and GPS/IMU logs independently.

Data           Stored on                                      Comments
Images         Host PC or (9x) internal SSDs (if equipped)
IMU/GPS logs   Internal Compact Flash cards (2x16GB)



IMU/GPS logs

Description

A sensor's log is a list of registered events from various sources:

  • Trigger for starting image exposure (fps)
  • IMU sentence received (2460 samples per second)
  • GPS sentence received (5 samples per second) - in NMEA or other configured format.

Logs are stored in a binary format to keep them small. There is also a file size limit: when it is reached, a new file with an auto-incremented index is started.

Raw log examples

All the raw *.log files are found here

Parsed log example

parsed_log_example.txt (41.3MB) at the same location

[localTimeStamp,usec]: IMU: [gyroX] [gyroY] [gyroZ] [angleX] [angleY] [angleZ] [accelX] [accelY] [accelZ] [veloX] [veloY] [veloZ] [temperature]

[localTimeStamp,usec]: GPS: $GPRMC,231112.8,A,4043.36963,N,11155.90617,W,000.00,089.0,250811,013.2,E

[localTimeStamp,usec]: SRC: [masterTimeStamp,usec]=>1314335482848366 [localTimeStamp,usec]=>1314335474855775

Tools for parsing logs

Download one of the raw logs.
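
The raw *.log files themselves are binary, but once converted to the text form shown in the parsed example above they can be handled with plain string parsing. Below is a minimal Python sketch (not part of the Elphel toolchain) that walks such a parsed log and collects the IMU samples and GPS fixes; the field layout and timestamp formatting are assumptions taken from the sample records above, so they may need adjusting for a particular log.

#!/usr/bin/env python3
# Minimal sketch for reading a *parsed* (text) Eyesis4Pi log, based on the
# sample records shown above. Field order and timestamp formatting are
# assumptions taken from that example, not a specification.
import sys

def parse_gprmc(sentence):
    """Extract (latitude, longitude) in decimal degrees from a $GPRMC sentence."""
    f = sentence.split(',')
    if len(f) < 7 or f[0] != '$GPRMC' or f[2] != 'A':    # 'A' means a valid fix
        return None
    lat = float(f[3][:2]) + float(f[3][2:]) / 60.0       # ddmm.mmmmm -> degrees
    lon = float(f[5][:3]) + float(f[5][3:]) / 60.0       # dddmm.mmmmm -> degrees
    if f[4] == 'S':
        lat = -lat
    if f[6] == 'W':
        lon = -lon
    return lat, lon

def read_parsed_log(path):
    imu_samples, gps_fixes = [], []
    with open(path) as log:
        for line in log:
            parts = line.split()
            if len(parts) < 3:
                continue
            timestamp = parts[0].rstrip(':')    # local timestamp (sec,usec)
            kind = parts[1].rstrip(':')         # IMU / GPS / SRC
            if kind == 'IMU':
                # gyro(3), angle(3), accel(3), velocity(3), temperature
                imu_samples.append((timestamp, [float(v) for v in parts[2:]]))
            elif kind == 'GPS':
                fix = parse_gprmc(parts[2])
                if fix:
                    gps_fixes.append((timestamp, fix))
    return imu_samples, gps_fixes

if __name__ == '__main__':
    imu, gps = read_parsed_log(sys.argv[1])
    print(len(imu), 'IMU samples,', len(gps), 'GPS fixes')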




Images

Samples

Footage samples

Description

The pictures from the image sensors are stored in 8 triplets in the RAW JP4 format (3 sensors are connected to a single system board in the 24-sensor camera). The ImageJ plugin deals with the triplet structure and does all the reorientation automatically.

  • JP4 file opened as JPEG - sample from the master camera
  • JP4 converted to JPEG - sample from the master camera (each triplet is a vertical segment)
  • Eyesis4Pi image set and its location on the resulting panorama


File names

An image file name is the timestamp of when it was taken plus the index of the camera (seconds_microseconds_index.jp4):

1334548426_780764_1.jp4
1334548426_780764_2.jp4
...
1334548426_780764_9.jp4
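
The shared timestamp is what groups the nine files of a single shot together. A small illustrative sketch (Python; split_jp4_name is a hypothetical helper, not part of the Elphel tools) for splitting such a name into its parts:

import os

def split_jp4_name(path):
    """Split 'seconds_microseconds_index.jp4' into a float timestamp and camera index."""
    name, _ = os.path.splitext(os.path.basename(path))
    seconds, microseconds, index = name.split('_')
    return int(seconds) + int(microseconds) / 1e6, int(index)

print(split_jp4_name('1334548426_780764_1.jp4'))   # (1334548426.780764, 1)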


EXIF headers

The JP4 images from the 1st (master) camera have a standard EXIF header that contains all the acquisition-related information and is geotagged, so the GPS coordinates are present both in the GPS/IMU log and in the EXIF headers of the 1st camera's images. Images from the other cameras are not geotagged.
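
For a quick check of the geotag outside the ImageJ toolchain, any generic EXIF reader can be pointed at a master-camera JP4, since the EXIF header follows the ordinary JPEG conventions. A minimal sketch assuming the third-party exifread Python package (an assumption, not part of the Elphel tools):

import exifread   # third-party package: pip install exifread

def gps_from_jp4(path):
    """Read the GPS tags from a master-camera JP4 (standard JPEG-style EXIF)."""
    with open(path, 'rb') as f:
        tags = exifread.process_file(f, details=False)
    lat = tags.get('GPS GPSLatitude')
    lon = tags.get('GPS GPSLongitude')
    return lat, lon    # both None for the non-geotagged cameras

print(gps_from_jp4('1334548426_780764_1.jp4'))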



Post-Processing

Requirements

  • Linux OS (preferably Kubuntu).
  • ImageJ.
  • Elphel ImageJ Plugins.
  • Put loci_tools.jar into ImageJ/plugins/.
  • Put tiff_tags.jar into ImageJ/plugins/.
  • Hugin tools - enblend.
  • ImageMagick - convert.
  • PHP.
  • Download the calibration kernels for the current Eyesis4Pi. Example kernels and sensor files can be found here (~78GB, download everything).
  • Download default_config.corr-xml from the same location.
  • Download footage samples from here.
  • Processed files are available for downloading from here (ready for the stitching step).
  • Stitched results are found here.

Instructions (subject to change soon)

  • Launch ImageJ -> Plugins -> Compile & Run. Find and select EyesisCorrection.java.
Eyesis corrections plugin interface
  • Restore button -> browse for default_config.corr-xml.
  • Configure correction button - make sure that the following paths are set correctly (if not, mark the checkboxes and a dialog for each path will pop up):

Source files directory                - directory with the footage images
Sensor calibration directory          - [YOUR-PATH]/calibration/sensors
Aberration kernels (sharp) directory  - [YOUR-PATH]/calibration/aberration_kernels/sharp
Aberration kernels (smooth) directory - [YOUR-PATH]/calibration/aberration_kernels/smooth
Equirectangular maps directory (may be empty) - [YOUR-PATH]/calibration/equirectangular_maps (created automatically if the write permissions of [YOUR-PATH]/calibration allow)

  • Configure warping -> rebuild map files - this creates the maps in [YOUR-PATH]/calibration/equirectangular_maps and takes ~5-10 minutes.
  • Select source files -> select all the footage files to be processed.
  • Process files to start the processing. Depending on the PC's power it can take ~40 minutes for a panorama of (24+2) images.
  • After the processing is done, only the blending step remains - the following script scans a directory for the *.tiff files produced by ImageJ, uses enblend to stitch them into 16-bit TIFFs, and converts them into JPEGs. In a terminal:

php stitch.php [source_directory] [destination_directory]
(no trailing slashes at the end of the paths)
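
stitch.php itself is not reproduced here; the Python sketch below only illustrates the same chain of external calls described above (enblend to blend into a 16-bit TIFF, then ImageMagick's convert to produce a JPEG) for a single panorama. Treating all *.tiff files in the source directory as one panorama is a simplifying assumption; the real script groups them per shot.

#!/usr/bin/env python3
# Illustrative equivalent of the blending step described above (not stitch.php itself).
# Assumes all *.tiff files in the source directory belong to one panorama.
import glob
import os
import subprocess
import sys

src_dir, dst_dir = sys.argv[1], sys.argv[2]   # no trailing slashes, as noted above
tiffs = sorted(glob.glob(os.path.join(src_dir, '*.tiff')))

blended = os.path.join(dst_dir, 'panorama.tiff')
# enblend merges the overlapping projections into a single 16-bit TIFF
subprocess.run(['enblend', '-o', blended] + tiffs, check=True)

# ImageMagick's convert turns the blended TIFF into a JPEG for viewing
subprocess.run(['convert', blended, os.path.join(dst_dir, 'panorama.jpeg')], check=True)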

Previewer

Note: This step is done independently of the processing and is not necessary if all the footage is to be post-processed. It only needs a KML file generated from the footage.

Here's an example of previewing the footage. If used on a local PC, it requires these tools to be installed.

Previewer snapshot. 24 head sensors
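
If the KML has to be produced by hand, a minimal file with one placemark per shot (coordinates taken from the EXIF geotags or the GPS log) can look like the sketch below; the actual previewer may expect additional fields, so treat this only as a starting point.

def write_kml(points, path):
    """points: iterable of (name, longitude, latitude); writes a minimal KML file."""
    with open(path, 'w') as kml:
        kml.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        kml.write('<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n')
        for name, lon, lat in points:
            kml.write('  <Placemark><name>%s</name>'
                      '<Point><coordinates>%.6f,%.6f,0</coordinates></Point>'
                      '</Placemark>\n' % (name, lon, lat))
        kml.write('</Document></kml>\n')

# Illustrative values only (longitude/latitude converted from the $GPRMC sample above)
write_kml([('1334548426_780764', -111.931770, 40.722827)], 'footage_preview.kml')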

Stitched panoramas Editor/Viewer

Stitched results WebGL Editor/Viewer snapshot


Links