Get Support
Frequently Asked Questions

We've created a knowledge base with Frequently Asked Questions so you can get help quickly.

Measurement Ranges

What are the measurement ranges of the 3D cameras?

For the individual measurement ranges of the 3D cameras, please see the data sheets on the website.

Closer distances are possible, at the cost of reduced data quality.

How are accuracy and repeatability defined?

Accuracy is the precision of a distance measurement over several measurement frames.
It is a measure of how precisely the distance is measured, e.g. while changing the exposure time.

Repeatability is the precision of a frame-by-frame distance measurement.
It is a measure of how noisy the distance measurement is.

The values in the data sheet include both accuracy and repeatability.
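
To illustrate the distinction, here is a minimal C++ sketch (the reference distance and the per-frame values are hypothetical, not taken from a data sheet): the deviation of the averaged measurement from a known reference distance corresponds to accuracy, the frame-to-frame standard deviation to repeatability.

    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Hypothetical example: repeated distance measurements (in metres) of a
    // flat target placed at a known reference distance.
    int main() {
        const double referenceDistance = 1.000;  // ground truth [m]
        const std::vector<double> frames = {1.004, 0.998, 1.006, 1.001, 0.997};

        double sum = 0.0;
        for (double d : frames) sum += d;
        const double mean = sum / frames.size();

        // Accuracy: deviation of the averaged measurement from the true distance.
        const double accuracy = mean - referenceDistance;

        // Repeatability: frame-to-frame noise (standard deviation).
        double var = 0.0;
        for (double d : frames) var += (d - mean) * (d - mean);
        const double repeatability = std::sqrt(var / frames.size());

        std::printf("accuracy: %+.4f m, repeatability: %.4f m\n", accuracy, repeatability);
        return 0;
    }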

What about the reflectivity of targets?

The reflectivity coefficient describes the percentage of light returned from the target. Targets with high reflectivity can be detected at long distances.

The values in the data sheet are based on a Lambertian reflector with 75% reflectivity.

Why do I need to consider the unambiguity range?

Due to the periodicity of the frequency modulation, the unambiguous range for the distance calculation is limited. The 3D cameras of the pico family provide some flexibility in the unambiguity range; it can be changed via the use cases provided in the SW.

In use cases with high frame rates and low range, the unambiguity range is ~2.5 m.
In use cases with frame rates up to 25 fps, the unambiguity range is 7.5 m.
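
The unambiguity range follows from the modulation frequency as d = c / (2 * f_mod), where the factor 2 accounts for the round trip of the light. A small C++ sketch with illustrative frequencies (the exact frequencies used by the cameras may differ):

    #include <cstdio>

    // Unambiguous range of a phase-based ToF measurement: d = c / (2 * f_mod).
    double unambiguousRange(double modulationFrequencyHz) {
        const double c = 299792458.0;  // speed of light [m/s]
        return c / (2.0 * modulationFrequencyHz);
    }

    int main() {
        // Roughly 2.5 m at ~60 MHz and 7.5 m at 20 MHz.
        std::printf("60 MHz -> %.2f m\n", unambiguousRange(60e6));
        std::printf("20 MHz -> %.2f m\n", unambiguousRange(20e6));
        return 0;
    }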

How does background light influence the measurement ranges?

If not noted differently, the measurement ranges in the data sheets are defined for indoor lighting conditions with moderate background light.

Outdoor light conditions: Under strong ambient light (e.g. sunlight), some light from the sun will still pass through the optical filter of the camera. In general, the system is quite robust against it due to the patented circuitry (SBI – Suppression of Background Illumination).
a. In bright sunlight, the measurement range of the 3D camera is reduced.
b. In bright sunlight, the noise of the distance measurement increases.

General Information

Function of the 3D cameras

The 3D cameras of the pmd 3D Sensing Family provide a digital stream of raw data. This data is used for depth calculation on the host system, using SW (Royale SDK).

Output provided by the Royale SDK:
a. Royale provides a 3D point cloud of x, y, z data, gray value and confidence.
b. The gray image provides an IR image of the amplitudes. The amplitude values represent the amount of reflected light from the VCSEL laser.

Working Principle

The devices of the CamBoard pico family are 3D camera development kits working based on Time-of-Flight (ToF). Each device consists of VCSEL lasers and a 3D imager.

The VCSEL laser emits modulated IR light, and the reflection of the modulated light is received by the 3D imager. The 3D imager measures the phase shift of the modulated light and thereby the distance to the target.
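
As a worked example of this relation, the distance follows from the measured phase shift as d = c * delta_phi / (4 * pi * f_mod); the modulation frequency below is an illustrative assumption, not a device parameter.

    #include <cstdio>

    // Phase-shift ToF: d = c * deltaPhi / (4 * pi * f_mod).
    // The factor 4 * pi (instead of 2 * pi) accounts for the round trip of the light.
    double phaseToDistance(double phaseShiftRad, double modulationFrequencyHz) {
        const double c  = 299792458.0;            // speed of light [m/s]
        const double pi = 3.14159265358979323846;
        return c * phaseShiftRad / (4.0 * pi * modulationFrequencyHz);
    }

    int main() {
        const double fMod = 60e6;  // illustrative modulation frequency [Hz]
        // A phase shift of pi/2 at 60 MHz corresponds to roughly 0.62 m.
        std::printf("d = %.3f m\n", phaseToDistance(3.14159265358979323846 / 2.0, fMod));
        return 0;
    }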

Technology Side Effects

Saturation

Saturation happens when the exposure time is too long. No distance information is generated. The easiest way to avoid it is to reduce the exposure time or to use auto exposure.
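
The idea behind an automatic exposure adjustment can be sketched as follows; this is a naive illustration and not Royale's actual auto exposure algorithm (the threshold and step size are arbitrary assumptions).

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Naive illustration: if too many pixels are saturated, shorten the exposure time.
    uint32_t adjustExposure(const std::vector<bool> &saturatedMask, uint32_t exposureUs) {
        std::size_t saturated = 0;
        for (bool s : saturatedMask) {
            if (s) ++saturated;
        }
        const double ratio = saturatedMask.empty()
            ? 0.0 : static_cast<double>(saturated) / saturatedMask.size();
        if (ratio > 0.01) {                                        // more than 1 % saturated
            exposureUs = static_cast<uint32_t>(exposureUs * 0.8);  // reduce by 20 %
        }
        return exposureUs;
    }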

Low signal

In case of a low signal (caused by targets with low reflectivity), no distance information is generated. The best way to avoid this is to switch to a use case with a longer exposure time.

Stray light

Stray light is light that scatters randomly in the optical lens system. That means that not only the light coming from a specific point in the scene reaches the corresponding pixel on the image sensor, but also a mixture of light reflected from other targets. A typical scenario for stray light is a highly reflective target close to the camera (e.g. a white table). Therefore, the most effective countermeasure is to remove the objects causing stray light from the scene.

In addition, there is SW filtering to mitigate this effect.

Multi path interference

Multi path interference occurs when modulated light scatters around within the scene. In such cases, besides the direct reflection from the target, some reflections from other targets mix into the distance measurement.

A typical scenario for multi path interference is a highly reflective target close to targets with low reflectivity. By measuring on two modulation frequencies, this effect can be detected.
In Royale, pixels with multi path interference are filtered out by default.
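
The two-frequency detection can be sketched as a simple consistency check; this is an illustration of the idea, not pmd's actual algorithm (the tolerance is an arbitrary assumption).

    #include <cmath>

    // Illustration only: with two modulation frequencies the same target should
    // yield the same distance; a large disagreement hints at multi path interference.
    bool isMultiPathSuspect(double distanceFreq1, double distanceFreq2,
                            double toleranceMeters = 0.05) {
        return std::fabs(distanceFreq1 - distanceFreq2) > toleranceMeters;
    }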

Interference from sunlight

pmd's Time-of-Flight technology is very robust against ambient light due to the patented SBI (Suppression of Background Illumination).
The SBI is an in-pixel circuitry that subtracts ambient light and thereby prevents the pixels from saturating.

Further improvement of ambient light robustness can be achieved with optical filters (bandpass) and / or coated lenses.

The illumination power plays an important role, so the rule of thumb is: the more active illumination is used, the more range can be achieved and the more robust the system is against ambient light. The pmd 3D sensing family cameras have been optimized for indoor use. They will also work in sunlight, but please note that the data will be noisier or, in other words, the usable range will decrease.

Changing the light source to a 940 nm wavelength: The sunlight spectrum has a dip in amplitude around 940 nm; therefore a performance increase is possible.

Interference with other 3D cameras

For 3D cameras of the pico family, interference is minimized based on a technique called SSC.

SSC (Spread Spectrum Clock): the modulation frequencies are shifted permanently to minimize the chance that two cameras work with the same modulation frequency.

Motion artefacts

Since the distance image is calculated from 5 to 9 raw images, distance changes within this measuring time will lead to signal inconsistency.

The best mitigation is to shorten the time needed to capture the raw images. This can be done by reducing the exposure time. If range and fast-moving objects need to be captured in parallel, the "mixed mode" use cases are recommended. They provide two data streams in parallel: one stream is used for range and the other for fast-moving objects.

Flying pixel

When a pixel lies directly on the edge of an object, the reflected light is a mix of light from the object and from the background. As a result, the calculated distance lies between the object and the background. Flying pixels are detected by software and filtered out.
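
A simple version of such a filter can be sketched as follows; this is an illustration of the idea, not the actual Royale flying pixel filter (the jump threshold is an arbitrary assumption).

    #include <cmath>

    // Illustration only: a pixel whose depth differs strongly from both of its
    // neighbours is likely a mixed ("flying") pixel on an object edge.
    bool isFlyingPixel(float left, float center, float right,
                       float maxJumpMeters = 0.10f) {
        return std::fabs(center - left)  > maxJumpMeters &&
               std::fabs(center - right) > maxJumpMeters;
    }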

Depth of focus

Below a distance of 10 cm, the image gets blurry. In general, the technology can handle distances below this threshold, but an adaptation of the 3D camera would be needed.

Software

Which operating systems are supported?

Visit the Software page for more information.

What kind of Software is provided with the pico family devices?

The 3D sensing development kits come with pmd's powerful software suite Royale, containing all the logic to operate the 3D camera.

Royale is cross platform compatible and runs on Windows, Linux/ARM, Ubuntu Linux, macOS and Android/ARM.
The package includes a visualization tool, the Royale Viewer.

The SDK to develop your own applications is C++ based and also supports several programming languages and libraries such as ROS, OpenCV, OpenNI2, Matlab, C and DotNet.

What kind of data is provided with the Royale API?

  • 3D point cloud: X, Y and Z values for every pixel result in a point cloud of the observed scene.
  • Gray value: In addition to the Z value, every pixel provides a gray value, which represents the signal strength (amplitude) of the active illumination, so this is an IR gray value image. It can be used for standard 2D image processing and is perfectly aligned with the depth image. It is also not affected by background light, so it is a very robust 2D image in all lighting conditions. This data also corresponds directly to the depth data quality, so it gives a performance indication for the depth data.
  • Confidence value: This value provides information whether the pixel measured a valid 3D value or whether the 3D data is not reliable due to saturation, underexposure or other reasons.
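
A minimal sketch of how this data can be consumed with the C++ API, following the pattern of the Royale SDK sample code (class and member names should be verified against the documentation of your Royale version):

    #include <royale.hpp>

    // Listener that receives each depth frame and skips invalid pixels.
    class PointCloudListener : public royale::IDepthDataListener {
        void onNewData(const royale::DepthData *data) override {
            for (const auto &pt : data->points) {
                // Confidence 0 marks pixels without a reliable 3D value
                // (saturation, underexposure, ...).
                if (pt.depthConfidence == 0) {
                    continue;
                }
                // pt.x, pt.y, pt.z : 3D point [m]
                // pt.grayValue     : IR gray value (amplitude)
            }
        }
    };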

What can I do with the Royale viewer?

The Royale viewer is a tool to control the camera and to view the 3D and gray values provided by the Royale API. With this viewer, different modes and settings can be explored. In addition, the viewer can export the 3D data in the PLY format.

Programming interfaces

The Royale API to develop your own applications is C++ based and also supports several programming languages and libraries such as ROS, Python, OpenCV, OpenNI2, Matlab, C and DotNet.

Which platforms are supported?

Raspberry Pi:
The Linux ARM 32-bit binaries are tested with Raspbian. Due to the computational limitations of the Raspberry Pi, the number of frames per second is limited.

Source code

In principle, pmd is open to licensing software commercially. Details are to be discussed under an NDA.

Can I export data in a common format?

Yes, each measurement frame can be exported to the polygon file format (PLY).
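
For reference, the exported file is a plain-text (or binary) PLY file. The following C++ sketch writes a minimal ASCII PLY point cloud; it only illustrates the structure of the format, and the export of the Royale viewer may contain additional properties.

    #include <cstdio>
    #include <vector>

    struct Point { float x, y, z; };

    // Minimal ASCII PLY export of a point cloud (vertices only).
    bool writePly(const char *path, const std::vector<Point> &points) {
        FILE *f = std::fopen(path, "w");
        if (!f) return false;
        std::fprintf(f, "ply\nformat ascii 1.0\n");
        std::fprintf(f, "element vertex %zu\n", points.size());
        std::fprintf(f, "property float x\nproperty float y\nproperty float z\n");
        std::fprintf(f, "end_header\n");
        for (const auto &p : points)
            std::fprintf(f, "%f %f %f\n", p.x, p.y, p.z);
        std::fclose(f);
        return true;
    }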

Troubleshooting

Partially missing distance information

Near range

  • Saturation
  • Low signal
  • Depth of focus
  • Stray light

Far range

  • Stray light (see "Stray light" above)
  • Low signal (see "Low signal" above)
    • Another way to avoid a low signal is to add more light. One option is to switch the HW to the pico monstar.
  • Strong ambient light

Open a ticket
Ask a pmd 3D Specialist

If you experience technical issues with your Development Kit or have questions on how to use the 3D camera or the Royale SDK, write us a ticket. A pmd Technical Support Specialist will get in touch with you soon.