Software Synchronization of Projector and Camera for Structured Light 3D

Write here about construction, ideas, equipment, tips n tricks etc. related to structured light scanning
Curiousjeff
Posts: 121
Joined: 16 Nov 2016, 22:31

Software Synchronization of Projector and Camera for Structured Light 3D

Post by Curiousjeff »

I came across this article.
I find it brilliant, especially the way he determines the lag between projector and camera.
If this method were implemented in HP 3D, it would be a major leap forward.

http://www.hometrica.ch/publ/2016_3dbody_petkovic.pdf
Micr0
Posts: 586
Joined: 15 Nov 2016, 15:20
Location: New York City

Re: Software Synchronization of Projector and Camera for Structured Light 3D

Post by Micr0 »

I wish we still had access to more technical answers from the David people. There is some of this in David, but I don't know if it's the same thing or just looks like it. I'd love to know if the David people have tried directly syncing the camera to the HDMI stream or just to the projected image, and likewise whether they know of any benefit. I'd also really like to know if David really did modify the firmware of the SLS equipment and, if so, how.
µ
Curiousjeff
Posts: 121
Joined: 16 Nov 2016, 22:31

Re: Software Synchronization of Projector and Camera for Structured Light 3D

Post by Curiousjeff »

I have posted this on the HP site. We will see.

As you say, the only possibility that this is implemented now in David would be through a specific camera driver or camera firmware.

They might as well hire this guy. He gets a scan in less than a second! With nearly the same equipment as us (ok, his projector is 120hz, but still, it's great).

I imagine this solves not only the one-to-one sync between projected image and captured image but also the actual frequency sync, avoiding those horrible waves/lines we all hate.
Curiousjeff
Posts: 121
Joined: 16 Nov 2016, 22:31

Re: Software Synchronization of Projector and Camera for Structured Light 3D

Post by Curiousjeff »

I noticed this in David 5.

See image below.

I find the software solution in the article more interesting because it is more universal. I wonder what hardware sync they are referring to.
Attachments
Sync-option.JPG
Micr0
Posts: 586
Joined: 15 Nov 2016, 15:20
Location: New York City

Re: Software Synchronization of Projector and Camera for Structured Light 3D

Post by Micr0 »

Curiousjeff wrote: 17 Aug 2017, 12:51
I noticed this in David 5.

See image below.

I find the software solution in the article more interesting because it is more universal. I wonder what hardware sync they are referring to.
You should ask on the "other" forum. It may be an acknowledgement that there needs to be some camera/projector sync for better results.
µ
Curiousjeff
Posts: 121
Joined: 16 Nov 2016, 22:31

Re: Software Synchronization of Projector and Camera for Structured Light 3D

Post by Curiousjeff »

I wrote to the author of the paper above.

Here is his answer with his permission:


------------------------------
thank you for your e-mail.

It is always nice to read a message from someone
who is enthusiastic about what you do.


> If I understand correctly your article, you resolve two
> problems at the same time:
>
> 1) With your VBLANK technique, you can trigger the capture of
> the image precisely when a new pattern is projected, thus
> allowing for the fastest scan possible.
>
> 2) Capture of the frame is in perfect sync with the frequency
> of the projector, thus avoiding ugly artifacts.
>
> Is my understanding correct ?

It is correct on 1), but not entirely correct on 2).

Causes of scanning artifacts are various, and so are
the steps to avoid them:
a) ambient light - use dark room or structured light
pattern insensitive to ambient light;
b) inter-reflections in the scene - use high-frequency
sinusoidal fringe patterns;
c) non-compensated projector gamma - perform
colorimetric calibration of the projector;
d) highly specular objects - apply anti-reflective
coating/scanning spray;
e) demosaicing or any pre-processing done on the
camera - use camera which supplies raw sensor data;
f) for DLP projectors: camera shutter (exposure time)
is not matched to DLP color wheel period - use camera
which allows precise shutter control;
etc.

Major synchronization errors most often produce a
completely unusable scan. Fortunately, it is easy to
check if synchronization is the cause simply by
extending the delay between projection and acquisition:
if artifacts exist when projecting each image for more
than 1 second then synchronization is most probably
not the cause of artifacts.

Therefore, even if synchronization is perfect artifacts
may still appear.
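The diagnostic he describes (lengthen the per-pattern delay and see whether the artifacts survive) can be sketched as a small test loop. `project_pattern` and `grab_frame` are hypothetical stand-ins for whatever projection and capture calls your pipeline uses; here they are stubs so the sketch runs:

```python
import time

def project_pattern(index):
    """Hypothetical stand-in: send pattern `index` to the projector."""
    pass

def grab_frame():
    """Hypothetical stand-in: capture one frame from the camera."""
    return object()

def capture_sequence(num_patterns, settle_delay_s):
    """Project each pattern, wait, then capture. With settle_delay_s > 1.0
    any surviving artifacts are most likely NOT a synchronization problem."""
    frames = []
    for i in range(num_patterns):
        project_pattern(i)
        time.sleep(settle_delay_s)  # generous delay removes sync as a variable
        frames.append(grab_frame())
    return frames
```

Run it once with a delay above one second; if the waves remain, look at gamma, shutter, or ambient light instead of synchronization.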

> Do you think point number 2 could be used with the David SLS
> software by running a software prior to starting David SLS. I
> don't think that opening the video stream in David changes
> anything to the settings of the camera (?). This would allow a
> near perfect scan though not as fast as the one you have
> developed.

I'm not sure what you mean by "running some software prior
to starting David SLS".

Generally speaking, any 3D scanning software may be changed
to use VBLANK interrupts for synchronization if a camera which
supports software triggering is attached.

> One last question: Could this be done using Directshow or will
> it only work with a SDK from the camera manufacturer ?

Any camera which supports proper software triggering may be used.

Consumer grade cameras used in Windows are mostly UVC
(USB Video Class) compliant. In theory such cameras may
be used if they support still image capture using a dedicated
software trigger (so-called method 2 in the UVC 1.5 standard).
This may be done using DirectShow or some other API.

If DLP projectors are used then besides precise software
triggering you also need precise shutter control.
This is the reason we are mostly using industrial cameras.
For example, using a consumer-grade Canon or Nikon DSLR with a
120 Hz 3D DLP projector may be problematic, as the shutter cannot
be set to exactly 1/119.909 s, which will produce scanning
artifacts even if synchronization is perfect. In other
words: our article describes how to synchronize the
start of frame while the end of frame is synchronized
only if the shutter speed is correctly set.
Curiousjeff
Posts: 121
Joined: 16 Nov 2016, 22:31

Re: Software Synchronization of Projector and Camera for Structured Light 3D

Post by Curiousjeff »

> Do you mind if I post your reply on the forum: "3D Scanning
> Forum" ?

I don't mind. Feel free to share with anyone who is interested.


> I understand all your remarks regarding surface artifacts.
> On the "old" forum, many of these problems were discussed.
> We also noticed that Texas Instrument had implemented a
> "diamond" shape pixel on some of their DLP, which also
> creates artifacts.

Interesting. For such a projector I would suggest using
multiple phase-shifting patterns and simply slightly defocusing
the projector. Defocusing hides the exact pixel shape,
and multiple phase-shifting is generally insensitive
to projector defocus as it only affects signal amplitude
and does not affect signal phase.
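The claim that defocus only scales the fringe amplitude without touching the recovered phase is easy to verify numerically. This sketch uses the standard N-step phase-shifting estimator (a textbook formulation, not code from the paper):

```python
import math

def recover_phase(intensities):
    """Standard N-step estimator for I_n = A + B*cos(phi + 2*pi*n/N):
    phi = atan2(-sum I_n*sin(d_n), sum I_n*cos(d_n)), d_n = 2*pi*n/N."""
    n = len(intensities)
    s = sum(I * math.sin(2 * math.pi * k / n) for k, I in enumerate(intensities))
    c = sum(I * math.cos(2 * math.pi * k / n) for k, I in enumerate(intensities))
    return math.atan2(-s, c)

def simulate(phi, amplitude, steps=4, offset=0.5):
    """Intensities one camera pixel would see across the shifted patterns."""
    return [offset + amplitude * math.cos(phi + 2 * math.pi * n / steps)
            for n in range(steps)]

phi = 1.234
sharp = recover_phase(simulate(phi, 0.40))    # in-focus projector
blurred = recover_phase(simulate(phi, 0.08))  # defocused: amplitude down 5x
# Both estimates equal the true phase: the amplitude cancels inside atan2.
```

The amplitude B appears in both the numerator and denominator of the arctangent, so a defocus-induced drop in fringe contrast leaves the phase estimate unchanged (until noise dominates).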


> But we often have lines crossing the surface. We believe
> this is a sync problem:
>
>

When saying "lines crossing the surface" I assume you mean the
fine Moiré-like pattern which is especially prominent in the
topmost scan (yellow-orange one). If the scan is obtained by
temporal phase shifting (PS) then the effect is most probably
caused by interference due to unwanted sinusoidal frequencies.

The causes behind these unwanted frequencies may vary:

1) Gamma-correction in projector and in camera may introduce
unwanted frequencies. Ideally, camera should be operated
in linear mode and sinusoidal fringes sent to the projector
should be pre-compensated for projector gamma correction
so projector is also operating in linear mode.
If this is not valid then both camera and projector introduce
unwanted frequencies which may alias over the baseband
sinusoidal fringe and produce ripple-like effect on the
surface of an object, e.g. flat plane will not be flat but
will have waves. If gamma correction cannot be turned off,
another way to correct for this is to increase
the number of phase shifts so that the frequencies introduced
by the gamma non-linearity are not aliased over the baseband
sinusoidal fringe.
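Pre-compensating the fringes for projector gamma can be sketched as follows. The gamma value 2.2 is an assumption; a real setup would measure the projector's actual response during colorimetric calibration:

```python
import math

def precompensated_fringe(x, period, gamma=2.2):
    """Desired linear sinusoidal intensity in [0, 1] at pixel column x,
    raised to 1/gamma so the projector's gamma curve cancels it out."""
    linear = 0.5 + 0.5 * math.cos(2 * math.pi * x / period)
    return linear ** (1.0 / gamma)

def projector_output(v, gamma=2.2):
    """Simple power-law model of the projector's gamma non-linearity."""
    return v ** gamma

# After passing through the projector, the projected intensity is the
# intended pure sinusoid again, so no harmonics alias onto the baseband.
x, period = 13, 64
projected = projector_output(precompensated_fringe(x, period))
intended = 0.5 + 0.5 * math.cos(2 * math.pi * x / period)
```

Without the `1/gamma` step, the projected fringe is `sin**2.2`-shaped, and those extra harmonics are exactly what produces the ripple on flat surfaces.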

2) If DLP projector is used then shutter (exposure) time
may be incorrectly set. I have observed this on DSLR cameras
as there is a difference between nominal and true shutter,
for example see http://scantips.com/lights/fstop2.html.
The problem lies in the fact that the DLP wheel rotates at 120 Hz
(or at 119.909 Hz) but a nominal 1/60 s shutter on a DSLR is actually
a 1/64 s shutter. This means that during each exposure the camera does
not integrate a whole period of the color wheel, which in
turn yields unwanted fringe harmonics and alters the scan.
If exact shutter time which matches the period of color
wheel rotation of a DLP projector cannot be set then
to correct it either use an LCD projector (for which the shutter
time need not match the projector refresh rate) or increase the
shutter time so it is as close as possible to some high multiple
of the period of the DLP wheel rotation (this decreases the effect
but does not eliminate the error).
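The mismatch is easy to quantify. Assuming the 119.909 Hz wheel and the nominal-versus-true DSLR shutter values mentioned above:

```python
wheel_hz = 119.909
wheel_period = 1.0 / wheel_hz    # ~8.34 ms per color-wheel revolution

nominal_shutter = 1.0 / 60.0     # what the DSLR menu says
true_shutter = 1.0 / 64.0        # what the shutter actually does

def wheel_periods(exposure):
    """How many color-wheel revolutions fit into one exposure."""
    return exposure / wheel_period

# 1/60 s would integrate ~1.9985 revolutions (already not a whole number),
# but the true 1/64 s integrates only ~1.8736, leaving ~12.6% of a
# revolution unbalanced in every frame -- the source of fringe harmonics.
missing_fraction = 1.0 - (wheel_periods(true_shutter) % 1.0)
```

This is why a camera with exposure programmable in microsecond steps (typical of industrial cameras) can be matched to the wheel while a DSLR with a fixed shutter ladder cannot.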

Also note that some DLP projectors always use the same rotation
speed of the color wheel regardless of the input signal,
e.g. Acer X1260 always rotates the wheel at 119.909Hz.
You cannot be certain about how a particular DLP projector
behaves without measuring color wheel frequency and
recording DLP mirror sequences first.

3) Some ambient illumination flickers. Depending on the flicker
frequency and the number of phase shifts of the sinusoidal
fringe, the flickering frequency may alias over the baseband
fringe. As light flickering is not synchronized
to the camera, the effect in the scan would not have a nice
regular periodic structure, so this is probably not the cause.
Obviously, use dark room to eliminate this effect.

Depending on the equipment and the structured light pattern
used, the reason why such an effect appears may differ.
You would have to test further to identify the exact cause.


> David does not trigger anything. It just grabs the incoming
> video stream. At the same time, I don't believe it changes
> anything to the setting that were adjusted by Flycapture. It
> just opens the video stream.

So it simply selects appropriate frame from the input video
stream. Such approach is simple but may be tricky as certain
cameras introduce different delays due to different buffer
depths: some cameras may cache 1 or 2 and some may
cache up to 10 frames. I can see how this may lead to
synchronization problems when using a non-standard camera
or projector (i.e., one other than those the software was
developed for) if one wants short scanning times.
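One defensive trick when grabbing from a free-running stream is to drain whatever the driver has buffered and keep only frames time-stamped after the pattern switch. This is a toy simulation (the deque stands in for the driver's frame queue; real code would use the camera API's own timestamps):

```python
from collections import deque

def fresh_frame(buffer, pattern_switch_time):
    """Discard frames captured before the pattern changed; return the
    first frame whose timestamp proves it saw the new pattern."""
    while buffer:
        timestamp, frame = buffer.popleft()
        if timestamp > pattern_switch_time:
            return frame
    return None  # buffer exhausted before a fresh frame arrived

# Simulated driver queue: three stale frames cached before the
# pattern switch at t=100, then one frame exposed afterwards.
buffer = deque([(97.0, "stale"), (98.5, "stale"), (99.9, "stale"),
                (101.2, "fresh")])
frame = fresh_frame(buffer, pattern_switch_time=100.0)  # -> "fresh"
```

The cost of this approach is exactly what the reply warns about: with a 10-frame-deep buffer you throw away up to 10 frame periods per pattern, which rules out fast scanning.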


> My point was to have a utility program that would follow your
> sync procedure to adjust correctly fps, shutter and to sync
> the start of the frame with the start of the projected image.
> Once this is adjusted correctly. Opening in David should not
> modify the relationship camera-projector and would maybe
> provide the best sync possible without extra hardware ?

You may match the FPS and shutter speed to the projector,
but full synchronization is not possible.

From your description the David software simply grabs
frames from the video stream which is sent by the camera.
This means the camera operates using its internal trigger at
the preselected FPS; however, the exact timing of each trigger
cannot be controlled. To trigger the camera exactly when
needed the computer must send the software trigger
to the camera some time after corresponding VBLANK interrupt.
Software synchronization must be implemented in the
David software.
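The triggering scheme he describes (a software trigger issued some delay after the VBLANK interrupt) reduces to timing arithmetic once the VBLANK wait and the camera SDK's trigger call are available. Centering the exposure in the projection interval is my own assumption of a reasonable placement, not the paper's prescription:

```python
def trigger_time(vblank_time, refresh_hz, exposure_s):
    """Compute when to issue the software trigger so the exposure sits
    centered inside the projection interval that starts at this VBLANK.
    Assumption: the new pattern is shown for one full refresh period."""
    frame_period = 1.0 / refresh_hz
    if exposure_s > frame_period:
        raise ValueError("exposure longer than one projected frame")
    return vblank_time + (frame_period - exposure_s) / 2.0

# 120 Hz projector, 4 ms exposure: trigger ~2.17 ms after the VBLANK so
# the 4 ms exposure is centered in the 8.33 ms frame.
t = trigger_time(vblank_time=0.0, refresh_hz=120.0, exposure_s=0.004)
```

In a real loop this value would be fed to the OS vertical-blank wait and the camera's software-trigger call, plus a measured trigger-to-exposure lag for the particular camera.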
Micr0
Posts: 586
Joined: 15 Nov 2016, 15:20
Location: New York City

Re: Software Synchronization of Projector and Camera for Structured Light 3D

Post by Micr0 »

Thanks Jeff. This was very educational. In some ways it's the answer I've always hoped for from the David team. It also seems to confirm our suspicion that the surface waves are caused by the sync between the camera and the projector, both directly and indirectly. That said, my IS cameras have sync inputs and can be triggered from a signal in the HDMI stream. However, without support in David I don't see this being an improvement worth the time and expense of implementing. I think there would have to be some way of compensating for the latency in the camera shutter/buffer, which would have to be measured and would vary depending on manufacturer. If David were able to implement a trigger-out function....

Narmella's Letscan eliminates this by capturing a full exposure of each pattern. And now that I think of it, this could also help eliminate the geometric scan distortion errors that also seem to be a problem around here. Not that I've actually used FlexScan (so I'm speculating based on what I've seen in their demos), but they seem to have eliminated or reduced both of these problems, and it looks like they did so by modifying the projector and possibly syncing the system. Does anyone know more about this?

Some higher-end workstation graphics cards can produce and/or slave to sync signals (Nvidia G-Sync). I think I'm going to do some research here.....
µ
Micr0
Posts: 586
Joined: 15 Nov 2016, 15:20
Location: New York City

Re: Software Synchronization of Projector and Camera for Structured Light 3D

Post by Micr0 »

This is from the TI LightCrafter forum:

https://e2e.ti.com/support/dlp__mems_mi ... 4/t/360874
µ
Curiousjeff
Posts: 121
Joined: 16 Nov 2016, 22:31

Re: Software Synchronization of Projector and Camera for Structured Light 3D

Post by Curiousjeff »

Micr0,

Regarding Narmella's Letscan, I tested it a lot. I have the same camera (Canon 60D) and I could never get the same surface quality as Narmella while using the Acer K135.

I came to the conclusion that part of my problem was coming from the Acer (maybe the diamond-shaped pixels?). I did purchase nearly the same projector as Narmella (NEC), but its size makes it less practical in my setup.

Following Petkovic's recommendation, I have slightly defocused the projector. The surface is better, but one must be careful not to move the object into the focus zone. This is positive, since it would mean that a good DOF for the projector is not necessary and is actually counterproductive.

I have not noticed any softening of the scan. I still need to make sure this does not induce any deformation that would affect alignment and fusion.

I am interested in the part of his message concerning linear gamma calibration of the projector, but I don't know how to go about it.

Jeff
