
Quantum Light Detection and Ranging

Jiuxuan Zhao 1    Ashley Lyons 2    Arin Can Ulku 1    Hugo Defienne 2    Daniele Faccio 2,∗    Edoardo Charbon 1,∗
1Advanced Quantum Architecture Laboratory (AQUA), École Polytechnique Fédérale de Lausanne (EPFL), 2002 Neuchâtel, Switzerland
2School of Physics and Astronomy, University of Glasgow, Glasgow G12 8QQ, UK  
(December 31, 2024)
Abstract

Single-photon light detection and ranging (LiDAR) is a key technology for depth imaging through complex environments. Despite recent advances, an open challenge is the ability to isolate the LiDAR signal from other spurious sources including background light and jamming signals. Here we show that a time-resolved coincidence scheme can address these challenges by exploiting spatio-temporal correlations between entangled photon pairs. We demonstrate that a photon-pair-based LiDAR can distill desired depth information in the presence of both synchronous and asynchronous spurious signals without prior knowledge of the scene and the target object. This result enables the development of robust and secure quantum LiDAR systems and paves the way to time-resolved quantum imaging applications.

Corresponding authors:
daniele.faccio@glasgow.ac.uk
edoardo.charbon@epfl.ch

Light detection and ranging (LiDAR) systems, with their ability to reach long distances at high speed and accuracy, have emerged as a key technology in autonomous driving, robotics, and remote sensing Schwarz (2010). Today miniaturised LiDARs are integrated in many consumer electronics devices, e.g. smartphones. Moving beyond depth sensing, the LiDAR technique has also been used for non-line-of-sight imaging Velten et al. (2012); Gariepy et al. (2016); O’Toole et al. (2018); Liu et al. (2019a); Faccio et al. (2020), imaging through scattering media Lyons et al. (2019) and biophotonics applications Bruschini et al. (2019). A typical LiDAR system records the time-of-flight, t, of light back-reflected from a scene, from which the distance is estimated as d = ct/2, where c is the speed of light Sun et al. (2016). Thanks to their single-photon sensitivity, picosecond temporal resolution and low cost, single-photon avalanche diodes (SPADs) are widely used as detectors in LiDAR Shin et al. (2016); Tachella et al. (2019). In this respect, two well-established techniques can be used to achieve timing information at picosecond resolution: time-correlated single-photon counting (TCSPC), which operates by recording a time-stamp for each individual photon Zhang et al. (2018); Ronchini Ximenes et al. (2019); Seo et al. (2021), or time gating, in which a gate window is finely shifted Ulku et al. (2018); Morimoto et al. (2020); Ren et al. (2018); Chan et al. (2019).
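As a minimal numerical illustration of the ranging relation d = ct/2 (our own sketch, not code from this work):

```python
# Round-trip time-of-flight to range: d = c * t / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_distance(t_seconds):
    """Range in metres for a measured round-trip time-of-flight."""
    return C * t_seconds / 2.0

# A 10 ns round trip corresponds to a target about 1.5 m away;
# an 18 ps timing step corresponds to a depth step of about 2.7 mm.
print(round(tof_to_distance(10e-9), 3))         # -> 1.499
print(round(tof_to_distance(18e-12) * 1e3, 2))  # -> 2.7
```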
Despite recent advances, interference is a major challenge for robust and secure LiDAR operation in complex environments. In our work, the term ‘interference’ refers to the detection by the LiDAR sensor of any optical signal other than those emitted by the LiDAR source. These may originate from ambient light, other LiDAR systems operating concurrently and intentional spoofing signals. Beyond depth distortion, i.e. degradation in accuracy and precision, interference can result in misleading information, causing the system to make incorrect decisions. Over the past several years, several approaches addressing LiDAR interference have been proposed. One technique isolates the signal based on temporal correlations between two or more photons Niclass et al. (2013), which effectively suppresses noise due to ambient light. Another technique, based on laser phase modulation, can reduce both ambient light and mutual interference Ronchini Ximenes et al. (2019); Seo et al. (2021). However, these approaches have limitations. For example, an external signal can still spoof the LiDAR detector if it copies the temporal correlation or phase modulation of the illumination source, which is easily achievable by placing a photodiode close to the target object. To date, there is no LiDAR system immune to all types of interference.
The use of non-classical optical states can also improve object detection in the presence of spurious light and noise. In a quantum illumination protocol, a single photon is sent towards a target object while its entangled pair is retained and used as an ancilla Lloyd (2008). Coincidence detection between the returned photon and its twin increases the effective signal-to-noise ratio compared to classical illumination, an advantage that persists even in the presence of noise and losses. Recently, practical quantum illumination schemes have been experimentally demonstrated using spatially entangled photon pairs for target detection Lopaeva et al. (2013); Zhang et al. (2020) and for imaging objects England et al. (2019); Defienne et al. (2019); Gregory et al. (2020) in the presence of background light and spurious images. These approaches rely on the ability to measure photon coincidences between many spatial positions, which is conventionally performed using single-photon-sensitive cameras such as electron-multiplying charge-coupled device (EMCCD) cameras Moreau et al. (2012); Edgar et al. (2012); Defienne et al. (2018), intensified CCD (iCCD) cameras Chrapkiewicz et al. (2014, 2016) and SPAD cameras Eckmann et al. (2020); Ianzano et al. (2020); Ndagano et al. (2020); Defienne et al. (2021). However, while a handful of works have reported the use of photon pairs for target detection at a distance Liu et al. (2019b); Frick et al. (2020); Ren et al. (2020), no imaging LiDAR experiments with absolute range and interference or background rejection have been reported.

Figure 1: Experimental setup and principle. (a) An object O1 placed in the far field of a 1-mm-thick β-Barium Borate (BBO) nonlinear crystal is illuminated by spatially entangled photon pairs produced via type-I spontaneous parametric down conversion (SPDC), while an object O2 is illuminated by diffused classical light. A lens f1 = 50 mm is positioned after the crystal to direct the photon pairs towards O1. Both objects are composed of an absorptive pattern layer on a reflective surface. They are imaged onto the SPAD camera using lenses f2 = 100 mm, f3 = 50 mm, f4 = 100 mm and an unbalanced beam splitter (0.1R/0.9T). (b) When the SPAD gate window is set to capture only photon-pair pulses reflected by the quantum object, the “skater-shape” object appears in the intensity image and a peak is detected in the spatially-averaged correlation image, which shows the number of photon coincidences spatially averaged over all pairs of pixels r1 and r2 separated by a given distance r1 + r2. The correlation peak confirms the presence of photon pairs among the detected photons. (c) When the SPAD gate window is set to capture only classical light, the “car-shape” object appears in the intensity image, whereas no peak is visible in the spatially-averaged correlation image. Intensity and spatially-averaged correlation images were reconstructed from N = 2000 frames (8-bit) acquired using an exposure time of 350 ns (1-bit). Intensity image coordinate units are in pixels.

In this work, we demonstrate a quantum LiDAR system immune to any type of classical interference by using a pulsed light source of spatially entangled photon pairs and a time-resolved SPAD camera. We use spatial anti-correlations between photon pairs as a unique identifier to distinguish them from any other light source in the target scene. In particular, we show that our LiDAR system successfully images objects and retrieves their depths in two different interference scenarios mimicking the presence of spoofing or additional classical LiDAR signals. In the first case, spurious light from a synchronised laser is used to demonstrate robustness against intentional spoofing attacks. In the second case, the interference takes the form of asynchronous pulses imitating the presence of multiple background LiDAR systems running in parallel. The results show that our approach enables imaging with high depth resolution while offering immunity to classical light interference.
Imaging system. The experimental setup is shown in Fig. 1a. Spatially entangled photon pairs are produced by type-I spontaneous parametric down conversion (SPDC) with a β-Barium Borate (BBO) nonlinear crystal pumped by a 355 nm pulsed laser with a repetition frequency of 20 MHz. The objects to be imaged are masks placed on a reflective mirror. One object O1 (“skater”) is placed in the far field of the crystal, so that the down-converted photon pairs are spatially anti-correlated at the object plane. Another object O2 (“car”) is illuminated by a diffused 780 nm laser, also pulsed at 20 MHz, to produce the interference. Both objects are imaged onto the SPAD camera SwissSPAD2 Ulku et al. (2018) (see Methods). As in a typical LiDAR scheme, the pump laser is synchronised with the camera, while the spoofing signal generated by a classical pulsed laser can be synchronous or asynchronous.

Figure 2: Results with synchronous classical light interference. (a) The reflected light from objects O1 (“person”) and O2 (“bike”) is both synchronous with the camera. (b) Selected intensity and spatially-averaged correlation images (9×9 central data) at the gate positions 0.09 ns, 7.2 ns, 14.58 ns and 23.4 ns, covering no reflected light, reflected light from only O2, from both O1 and O2, and from only O1, respectively. Correlation peaks are obtained at the 14.58 ns and 23.4 ns gate positions, where quantum light is reflected to the camera. The measurement is implemented over a time range of 27 ns, corresponding to 1500 continuous gate positions with a proper initial time offset to the pump laser trigger. (c) Average intensity over all pixels (blue curve) and correlation peak values (red curve) along the measured time range. The four positions in (b) are also marked on the horizontal axis. (d) The subtracted intensity image of O2 (classical) and its arrival time (16.110 ns) at the camera, obtained by locating the first falling edge of the average intensity profile. (e) The subtracted intensity image of O1 (quantum) and its arrival time (24.462 ns) at the camera, obtained by locating the falling edge of the correlation peak profile. Experiments are performed with N = 5000 frames (8-bit) acquired in 13.5 s at each gate position, using an exposure time of 350 ns per 1-bit frame. The time step between two successive gate windows is 18 ps. Intensity image coordinate units are in pixels.
Figure 3: Measurement of spatially-resolved correlation images over time. The shape of the object illuminated by photon pairs (“person”) is retrieved by measuring spatially-resolved correlation images at the gate positions 0.09 ns, 7.2 ns, 14.58 ns and 23.4 ns. Each image is obtained by acquiring 5 million frames (8-bit), which corresponds to approximately 3.8 hours of acquisition. See Methods for more details.

As in conventional time-gated LiDAR, backscattered photons with a specific time-of-flight are detected by scanning a gate window (15 ns wide) in 18 ps time steps, which corresponds to a depth resolution of 2.7 mm Ulku et al. (2018); Morimoto et al. (2020). At each gate position, a set of 8-bit frames (∼10^3 frames) is acquired to reconstruct two different types of images: (i) a classical intensity image, obtained by summing all frames, and (ii) a spatially-averaged photon correlation image, computed by identifying photon coincidences in the frame set using a technique detailed in Defienne et al. (2018) (see Methods). The intensity image retrieves the shape of the objects in the scene at a given depth, while the spatially-averaged correlation image measures spatial correlations between detected photons to identify the presence of photon pairs. For example, if only reflected photon pairs are captured within the gate window (Fig. 1b), the intensity image shows the “skater” object and an intense peak is observed at the center of the spatially-averaged correlation image. The presence of such a correlation peak above the noise level confirms the presence of photon pairs among the detected photons. If only classical light is detected (Fig. 1c), the intensity image shows the “car” object illuminated by the pulsed laser and the spatially-averaged correlation image is flat.

Figure 4: Results with asynchronous spurious light. (a) Photon pairs reflected by object O1 (“STOP traffic sign”) are synchronous with the camera, while classical photons reflected by O2 (“50 traffic sign”) arrive at the camera in temporally random sequences, as the classical laser is asynchronous. (b) The camera is scanned over a time range of 27 ns (1500 continuous gate positions). Intensity and spatially-averaged correlation images (central 9×9 pixel area) are shown for three different gate positions (2.16 ns, 13.5 ns and 24.66 ns). The correlation peak only appears at the gate window covering the photon pairs reflected by O1. (c) The corresponding three gate positions are marked on the curves of the average intensity and correlation peak responses over the detected time range. (d) Intensity image reconstructed by subtracting the intensity image at 24.66 ns from that at 13.5 ns. At each gate position, N = 3000 frames (8-bit) were acquired in 8.1 s using an exposure time of 350 ns (1-bit). The time step between two successive gate positions is 18 ps. Intensity image coordinate units are in pixels.

Synchronous classical light interference. First, we consider the case of a spurious classical source of light that is synchronised with the SPAD camera, i.e. photons reflected by both O1 (“person”) and O2 (“bike”) are synchronous with the camera (Fig. 2a). This scenario corresponds to a spoofing attack. To operate the LiDAR, the gate window is continuously shifted over a range of 27 ns, which corresponds to 1500 gate positions. Figure 2b shows the intensity and spatially-averaged correlation images (9×9 pixel zoom in inset) measured at four specific gate positions: 0.09 ns, 7.2 ns, 14.58 ns and 23.4 ns. At the early gate position (0.09 ns), the camera records only noise, such as dark counts, crosstalk, afterpulsing and ambient light. As the gate window is shifted, O2 appears in the intensity image (7.2 ns), and the absence of a peak in the spatially-averaged correlation image shows that it originates from classical light alone. When the reflected quantum light starts to be collected in the gate window together with the classical light (14.58 ns), O1 and O2 are superposed in the intensity image and a correlation peak becomes visible. For the late gate window (23.4 ns) the classical laser pulse vanishes and only quantum light is detected, as shown by the peak persisting in the spatially-averaged correlation image, and only O1 is visible in the intensity image.
To acquire depth information and distinguish classical interference, the spatially-averaged intensity and correlation peak values, plotted as a function of the gate position in Fig. 2c, are analyzed. The two-step average intensity profile reflects the reflections from both O1 and O2, while the correlation peak profile only reveals the trend of the quantum light over the given time range. By locating the falling edges of the intensity profile, the arrival times of all the objects can be obtained Morimoto et al. (2021); Chan et al. (2019). By searching for the last falling edge of the correlation peak profile alone, the arrival time of the quantum object is extracted. As shown in Fig. 2d, the arrival time of the classical object, 16.110 ns, and its intensity image are obtained. The arrival time of the quantum object, 24.462 ns, is located at the last fitted falling edge of the correlation peak profile, and the corresponding intensity image is obtained by subtraction in Fig. 2e. Refer to the Supplementary Video for the scanned results over the entire detected range. The proposed dual-profile locating method enables locating and distinguishing objects illuminated by quantum or classical light.
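A minimal sketch of locating a falling edge in a gate-scan profile follows; the threshold-crossing rule and all names are our illustrative choices (the paper fits the edge shape), with the 18 ps gate step taken from the text:

```python
import numpy as np

def last_falling_edge_ns(profile, step_ps=18.0, frac=0.5):
    """Return the time (in ns) of the last falling edge of a gate-scan
    profile, taken here as the last crossing from above to below a
    fraction `frac` of the profile maximum. Simple sketch; the paper
    fits the edge for sub-step precision."""
    profile = np.asarray(profile, dtype=float)
    above = profile >= frac * profile.max()
    # indices i where the profile is above threshold at i and below at i+1
    drops = np.flatnonzero(above[:-1] & ~above[1:])
    return drops[-1] * step_ps / 1000.0

# Synthetic two-step profile: both pulses present, then one, then none.
profile = np.r_[np.zeros(100), np.full(300, 2.0),
                np.full(300, 1.0), np.zeros(100)]
print(last_falling_edge_ns(profile))  # -> 12.582
```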
The anti-spoofing capability works as described as long as the time delay between the two objects is larger than the temporal resolution of the SPAD camera. One may then enhance the removal of temporally overlapping interference by increasing the number of frames (e.g. up to ∼10^6 8-bit frames) for each gate delay, so as to retrieve a spatially-resolved correlation image instead of a spatially-averaged one Defienne et al. (2018, 2019); Gregory et al. (2020); Defienne et al. (2021). An example is shown in Fig. 3: a spatially-resolved correlation image directly retrieves the shape of the object illuminated by photon pairs and remains insensitive to classical interference (classical background noise was added in the experiment). In fact, such a spatially-resolved correlation image could potentially be measured at all gate positions of the LiDAR scan. However, the acquisition time is much longer than that required to retrieve a spatially-averaged correlation image (a few hours instead of seconds for a single time gate delay), so it is better to limit its use to gate positions at which the objects cannot otherwise be distinguished. In addition, note that because of the anti-symmetric spatial structure of the photon pairs illuminating the object, each spatially-resolved correlation image contains both the object and its symmetric image, which means that the object must interact with only half of the photon-pair beam to be imaged through correlations without ambiguity.
Asynchronous classical light interference. In real-world applications, another possible scenario is interference coming from ambient light and other LiDAR systems. We therefore consider a classical source of light that is not synchronised with the SPAD camera but still runs at the same repetition frequency (20 MHz) and illuminates the object O2 (“50 traffic sign”) (Fig. 4a). Figure 4b shows the intensity and correlation images at three example gate positions: 2.16 ns, 13.5 ns and 24.66 ns. The object O2 is visible in the intensity images at all gate positions as background noise. When the gate is shifted to 13.5 ns, the SPAD also captures photon pairs reflected by O1 (“STOP traffic sign”) and both objects are superimposed in the intensity image. We now also observe a peak in the spatially-averaged correlation image, which highlights the presence of photon pairs. By locating the falling edge of the spatially-averaged correlation peak shown in Fig. 4c, the arrival time of the quantum object is located at 20.682 ns, and its intensity image is obtained by subtraction, as shown in Fig. 4d. See the Supplementary Video for the entire measured results.
Discussion. We demonstrated a LiDAR system based on spatially entangled photon pairs showing robustness against interference from classical sources of light. In particular, we showed its successful use in the presence of (i) a spoofing attack (synchronous classical light interference) and (ii) background light and another LiDAR system operating in parallel (asynchronous classical light interference). Note also that because the quantum LiDAR harnesses anti-correlations between photon pairs to retrieve images, it is also immune to classically-correlated sources of light such as thermal and pseudo-thermal sources, in which photons are position-correlated Valencia et al. (2005).
In our current implementation, each time gate position is acquired in several seconds, so that the full scan takes several hours (5.6 hours for the synchronous case and 3.4 hours for the asynchronous case). This total acquisition time can however be significantly decreased by reducing (i) the acquisition time per gate position and (ii) the number of gate positions needed to detect the falling edge of the quantum light (currently 1500 steps scanned linearly). For example, in the case of the synchronous classical light shown in Fig. 2, the quantum-illuminated object could be located by measuring only 300 frames at 8 different gate positions using a correlation-driven scanning and falling-edge fitting algorithm, which would reduce the total acquisition time to 7 seconds (see details in the Supplementary Information). In addition, the speed of the SPAD camera in our experiment was limited to 370 fps by the readout architecture, but it has been demonstrated that similar cameras can be operated at frame rates up to 800,000 fps Gasparini et al. (2018), which would further reduce the total acquisition time and potentially reach real-time acquisition. Furthermore, in the current quantum LiDAR prototype the target object is a two-dimensional ‘co-operative’ object attached to a mirror, which ensures that enough photon pairs are reflected and collected by the camera. However, the proposed scheme can be extended to scattering materials with three-dimensional profiles by using brighter photon-pair sources and more sensitive SPAD cameras, which are currently under development. Looking forward, these results could enable the development of robust and secure LiDAR systems and more general time-resolved quantum imaging applications.
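The quoted scan durations follow directly from the per-gate acquisition times; a quick arithmetic check (our own, using the numbers stated in the figure captions):

```python
# Total scan time = per-gate acquisition time x number of gate positions.
gates = 1500                          # gate positions over the 27 ns range
sync_hours = 13.5 * gates / 3600.0    # 5000 frames in 13.5 s per gate (Fig. 2)
async_hours = 8.1 * gates / 3600.0    # 3000 frames in 8.1 s per gate (Fig. 4)
print(round(sync_hours, 1), round(async_hours, 1))  # -> 5.6 3.4
```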

Methods

Details of the experimental setup. The 355 nm pump laser used in this work is a VisUV-355-HP (PicoQuant GmbH) with 27 mW average output power at a repetition frequency of 20 MHz. The non-linear crystal, cut for type-I SPDC, is a β-Barium Borate crystal of size 5 × 5 × 1 mm with a half opening angle of 3 degrees (Newlight Photonics). A 710 nm bandpass filter is placed after the BBO crystal to filter out the spurious pump light. The 780 nm classical laser (PiL079XSM, ALS GmbH), running at a repetition rate of 20 MHz with 40 ps (FWHM) pulse width, is coupled into a fiber and then connected to a collimator, the output power of which can be finely tuned. A diffuser (ED1-C50-MD, Thorlabs) is used to create even illumination of the classical light over the object. The 355 nm pump laser always operates as a master, triggering only the camera in the asynchronous measurement, and both the camera and the 780 nm laser in the synchronous experiment.
Details of the SPAD camera. The camera used in our study is the SwissSPAD2 with on-chip microlenses. It is composed of 512 × 512 pixels with a pitch of 16.38 μm, a native fill factor of 10.5% and a photon detection probability (PDP) of approximately 25% at 700 nm. The camera runs in time-gating mode by scanning gate windows (∼15 ns wide) continuously with a time step of 18 ps. The starting gate position is tuned to be prior to the first synchronous light reflection by adding an appropriate initial offset to the laser trigger. No initial calibration is implemented in this work to compensate the arrival-time skew from the electrical signals (cable length, trigger circuit) and the optical signal (fiber), as we only focus on the relative range of the different objects. At each gate window, N 8-bit frames are transferred to the computer over a USB 3 connection and photon correlations are processed on a GPU before the gate shifts to the next position to repeat the same operation. Each 8-bit frame is accumulated from 255 successive 1-bit measurements with 350 ns exposure time. The overall acquisition speed is 370 fps (with 10.2 μs readout time for each bit) and the post-processing time is less than 1 ms per 8-bit frame when running on a GPU. The central 150 × 150 pixels are selected for all experiments to minimize the skew due to electrical propagation across the SPAD array. The gate control signal is injected in the middle of the pixel array, resulting in symmetrical time propagation towards the right and left pixels. To remove the hot pixels, we define a threshold at 200 dark counts and set all pixel values above this threshold in each frame to 0, as described in Defienne et al. (2021). The hot pixels are then smoothed using their neighboring pixels Shin et al. (2016) for all intensity images shown in this work.
The crosstalk effects are also removed by setting the correlation values from direct neighbour pixels to 0.
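The hot-pixel handling described above can be sketched as follows; the 200-count threshold is from the text, while the function names and the 4-neighbour smoothing kernel are our illustrative choices:

```python
import numpy as np

HOT_THRESHOLD = 200  # dark-count threshold quoted in the text

def mask_hot_pixels(frames, dark_counts, threshold=HOT_THRESHOLD):
    """Zero out hot pixels (dark count above threshold) in every frame
    of an (N, H, W) stack before correlation processing."""
    hot = dark_counts > threshold
    cleaned = frames.copy()
    cleaned[:, hot] = 0
    return cleaned, hot

def smooth_hot_pixels(image, hot):
    """For displayed intensity images, replace each hot pixel by the
    mean of its four direct neighbours (edge-padded)."""
    out = image.astype(float).copy()
    p = np.pad(out, 1, mode='edge')
    neighbour_mean = (p[:-2, 1:-1] + p[2:, 1:-1] +
                      p[1:-1, :-2] + p[1:-1, 2:]) / 4.0
    out[hot] = neighbour_mean[hot]
    return out
```

Crosstalk between direct neighbours is handled separately, by zeroing the corresponding correlation entries as stated above.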
Spatial correlation image calculation. The photon coincidence processing model used in this study is detailed in Defienne et al. (2018), in which the spatial joint probability distribution (JPD) Γ(r_i, r_j) of entangled photon pairs is measured by multiplying the value measured at pixel i in each frame by the difference between the values measured at pixel j in two successive frames:

Γ(r_i, r_j) = (1/N) Σ_{ℓ=1}^{N} [ I_ℓ(r_i) I_ℓ(r_j) − I_ℓ(r_i) I_{ℓ−1}(r_j) ],   (1)

where N is the number of frames acquired at each gate position, and I_ℓ(r_i) and I_ℓ(r_j) are the photon-count values at pixels i and j (at positions r_i and r_j) of the ℓ-th frame (ℓ ∈ [1; N]). The genuine coincidences, which originate only from correlations between entangled photon pairs, are obtained by removing the accidental coincidences resulting from dark counts, after-pulsing, hot pixels, crosstalk, detection of multiple photon pairs and stray light. Both the (i) spatially-averaged correlation image and the (ii) spatially-resolved correlation image used in our study can be extracted from the JPD:
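Equation (1) maps directly onto array operations; below is a minimal numpy sketch (our own, under the assumption that the frames are stored as an (N, H, W) array, and normalising by the N−1 usable frame pairs):

```python
import numpy as np

def joint_probability_distribution(frames):
    """Estimate the spatial JPD of Eq. (1) from an (N, H, W) frame stack.
    Products within the same frame count genuine plus accidental
    coincidences; products between successive frames count accidentals
    only and are subtracted. Returns Gamma of shape (H*W, H*W)."""
    n = frames.shape[0]
    flat = frames.reshape(n, -1).astype(float)
    same = flat[1:].T @ flat[1:]          # sum_l I_l(r_i) I_l(r_j)
    accidental = flat[1:].T @ flat[:-1]   # sum_l I_l(r_i) I_{l-1}(r_j)
    return (same - accidental) / (n - 1)

# Synthetic check: pixels (0,0) and (1,1) fire together (photon-pair-like),
# so Gamma should show a positive entry for that pixel pair only.
rng = np.random.default_rng(1)
s = rng.binomial(1, 0.3, size=5000)
frames = np.zeros((5000, 2, 2))
frames[:, 0, 0] = s
frames[:, 1, 1] = s
Gamma = joint_probability_distribution(frames)
```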
(i) The spatially-averaged correlation image (noted Γ+\Gamma_{+}) is calculated from JPD Γ\Gamma using the formula:

Γ_+(r_1 + r_2) = Σ_r Γ(r_1 + r_2 − r, r).   (2)

This represents the average number of photon coincidences detected between all pairs of pixels r_1 and r_2 separated by a distance r_1 + r_2.
(ii) The spatially-resolved correlation image is defined as the anti-diagonal component of the JPD, Γ(r, −r). It represents the number of photon coincidences detected between symmetric pairs of pixels.
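Both projections can be computed from the JPD once it is reshaped to (H, W, H, W); a minimal sketch with our own indexing conventions, taking the pixel symmetric to (y, x) as (H−1−y, W−1−x):

```python
import numpy as np

def sum_coordinate_projection(gamma4):
    """Spatially-averaged correlation image, Eq. (2): accumulate
    Gamma(r1, r2) into the sum coordinate r1 + r2 over all pixel
    pairs. Input shape (H, W, H, W); output shape (2H-1, 2W-1)."""
    H, W = gamma4.shape[:2]
    out = np.zeros((2 * H - 1, 2 * W - 1))
    for y1 in range(H):
        for x1 in range(W):
            # Gamma(r1, .) lands at sum coordinates r1 + r2
            out[y1:y1 + H, x1:x1 + W] += gamma4[y1, x1]
    return out

def anti_diagonal_image(gamma4):
    """Spatially-resolved correlation image: Gamma(r, -r), i.e.
    coincidences between pixel pairs symmetric about the centre."""
    H, W = gamma4.shape[:2]
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing='ij')
    return gamma4[ys, xs, H - 1 - ys, W - 1 - xs]

# Perfectly anti-correlated toy JPD: every pixel pairs with its mirror,
# so all coincidences share the same sum coordinate (a single peak).
g = np.zeros((3, 3, 3, 3))
for y in range(3):
    for x in range(3):
        g[y, x, 2 - y, 2 - x] = 1.0
```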

References

  • Schwarz (2010) B. Schwarz, Nature Photonics 4, 429 (2010).
  • Velten et al. (2012) A. Velten, T. Willwacher, O. Gupta, A. Veeraraghavan, M. G. Bawendi,  and R. Raskar, Nature communications 3, 1 (2012).
  • Gariepy et al. (2016) G. Gariepy, F. Tonolini, R. Henderson, J. Leach,  and D. Faccio, Nature Photonics 10, 23 (2016).
  • O’Toole et al. (2018) M. O’Toole, D. B. Lindell,  and G. Wetzstein, Nature 555, 338 (2018).
  • Liu et al. (2019a) X. Liu, I. Guillén, M. La Manna, J. H. Nam, S. A. Reza, T. H. Le, A. Jarabo, D. Gutierrez,  and A. Velten, Nature 572, 620 (2019a).
  • Faccio et al. (2020) D. Faccio, A. Velten,  and G. Wetzstein, Nature Reviews Physics 2, 318 (2020).
  • Lyons et al. (2019) A. Lyons, F. Tonolini, A. Boccolini, A. Repetti, R. Henderson, Y. Wiaux,  and D. Faccio, Nature Photonics 13, 575 (2019).
  • Bruschini et al. (2019) C. Bruschini, H. Homulle, I. M. Antolovic, S. Burri,  and E. Charbon, Light: Science & Applications 8, 1 (2019).
  • Sun et al. (2016) M.-J. Sun, M. P. Edgar, G. M. Gibson, B. Sun, N. Radwell, R. Lamb,  and M. J. Padgett, Nature communications 7, 1 (2016).
  • Shin et al. (2016) D. Shin, F. Xu, D. Venkatraman, R. Lussana, F. Villa, F. Zappa, V. K. Goyal, F. N. Wong,  and J. H. Shapiro, Nature communications 7, 1 (2016).
  • Tachella et al. (2019) J. Tachella, Y. Altmann, N. Mellado, A. McCarthy, R. Tobin, G. S. Buller, J.-Y. Tourneret,  and S. McLaughlin, Nature communications 10, 1 (2019).
  • Zhang et al. (2018) C. Zhang, S. Lindner, I. M. Antolović, J. M. Pavia, M. Wolf,  and E. Charbon, IEEE Journal of Solid-State Circuits 54, 1137 (2018).
  • Ronchini Ximenes et al. (2019) A. Ronchini Ximenes, P. Padmanabhan, M. Lee, Y. Yamashita, D. Yaung,  and E. Charbon, IEEE Journal of Solid-State Circuits 54, 3203 (2019).
  • Seo et al. (2021) H. Seo, H. Yoon, D. Kim, J. Kim, S. J. Kim, J. H. Chun,  and J. Choi, IEEE Journal of Solid-State Circuits , 1 (2021).
  • Ulku et al. (2018) A. C. Ulku, C. Bruschini, I. M. Antolović, Y. Kuo, R. Ankri, S. Weiss, X. Michalet,  and E. Charbon, IEEE Journal of Selected Topics in Quantum Electronics 25, 1 (2018).
  • Morimoto et al. (2020) K. Morimoto, A. Ardelean, M.-L. Wu, A. C. Ulku, I. M. Antolovic, C. Bruschini,  and E. Charbon, Optica 7, 346 (2020).
  • Ren et al. (2018) X. Ren, P. W. Connolly, A. Halimi, Y. Altmann, S. McLaughlin, I. Gyongy, R. K. Henderson,  and G. S. Buller, Optics express 26, 5541 (2018).
  • Chan et al. (2019) S. Chan, A. Halimi, F. Zhu, I. Gyongy, R. K. Henderson, R. Bowman, S. McLaughlin, G. S. Buller,  and J. Leach, Scientific reports 9, 1 (2019).
  • Niclass et al. (2013) C. Niclass, M. Soga, H. Matsubara, S. Kato,  and M. Kagami, IEEE Journal of Solid-State Circuits 48, 559 (2013).
  • Lloyd (2008) S. Lloyd, Science 321, 1463 (2008).
  • Lopaeva et al. (2013) E. Lopaeva, I. R. Berchera, I. P. Degiovanni, S. Olivares, G. Brida,  and M. Genovese, Physical review letters 110, 153603 (2013).
  • Zhang et al. (2020) Y. Zhang, D. England, A. Nomerotski, P. Svihra, S. Ferrante, P. Hockett,  and B. Sussman, Physical Review A 101, 053808 (2020).
  • England et al. (2019) D. G. England, B. Balaji, and B. J. Sussman, Physical Review A 99, 023828 (2019).
  • Defienne et al. (2019) H. Defienne, M. Reichert, J. W. Fleischer, and D. Faccio, Science Advances 5, eaax0307 (2019).
  • Gregory et al. (2020) T. Gregory, P.-A. Moreau, E. Toninelli, and M. J. Padgett, Science Advances 6, eaay2652 (2020).
  • Moreau et al. (2012) P.-A. Moreau, J. Mougin-Sisini, F. Devaux, and E. Lantz, Physical Review A 86, 010101 (2012).
  • Edgar et al. (2012) M. P. Edgar, D. S. Tasca, F. Izdebski, R. E. Warburton, J. Leach, M. Agnew, G. S. Buller, R. W. Boyd, and M. J. Padgett, Nature Communications 3, 1 (2012).
  • Defienne et al. (2018) H. Defienne, M. Reichert, and J. W. Fleischer, Physical Review Letters 120, 203604 (2018).
  • Chrapkiewicz et al. (2014) R. Chrapkiewicz, W. Wasilewski, and K. Banaszek, Optics Letters 39, 5090 (2014).
  • Chrapkiewicz et al. (2016) R. Chrapkiewicz, M. Jachura, K. Banaszek, and W. Wasilewski, Nature Photonics 10, 576 (2016).
  • Eckmann et al. (2020) B. Eckmann, B. Bessire, M. Unternährer, L. Gasparini, M. Perenzoni, and A. Stefanov, Optics Express 28, 31553 (2020).
  • Ianzano et al. (2020) C. Ianzano, P. Svihra, M. Flament, A. Hardy, G. Cui, A. Nomerotski, and E. Figueroa, Scientific Reports 10, 1 (2020).
  • Ndagano et al. (2020) B. Ndagano, H. Defienne, A. Lyons, I. Starshynov, F. Villa, S. Tisa, and D. Faccio, npj Quantum Information 6, 1 (2020).
  • Defienne et al. (2021) H. Defienne, J. Zhao, E. Charbon, and D. Faccio, Physical Review A 103, 042608 (2021).
  • Liu et al. (2019b) H. Liu, D. Giovannini, H. He, D. England, B. J. Sussman, B. Balaji, and A. S. Helmy, Optica 6, 1349 (2019b).
  • Frick et al. (2020) S. Frick, A. McMillan, and J. Rarity, Optics Express 28, 37118 (2020).
  • Ren et al. (2020) X. Ren, S. Frick, A. McMillan, S. Chen, A. Halimi, P. W. Connolly, S. K. Joshi, S. Mclaughlin, J. G. Rarity, J. C. Matthews, et al., in CLEO: Applications and Technology (Optical Society of America, 2020) pp. AM3K–6.
  • Morimoto et al. (2021) K. Morimoto, M.-L. Wu, A. Ardelean, and E. Charbon, Physical Review X 11, 011005 (2021).
  • Valencia et al. (2005) A. Valencia, G. Scarcelli, M. D’Angelo, and Y. Shih, Physical Review Letters 94, 063601 (2005).
  • Gasparini et al. (2018) L. Gasparini, M. Zarghami, H. Xu, L. Parmesan, M. M. Garcia, M. Unternährer, B. Bessire, A. Stefanov, D. Stoppa, and M. Perenzoni, in 2018 IEEE International Solid-State Circuits Conference (ISSCC) (2018) pp. 98–100.
  • Kim et al. (2021) B. Kim, S. Park, J.-H. Chun, J. Choi, and S.-J. Kim, in 2021 IEEE International Solid-State Circuits Conference (ISSCC), Vol. 64 (IEEE, 2021) pp. 108–110.


Acknowledgements. D.F. is supported by the Royal Academy of Engineering under the Chairs in Emerging Technologies scheme and acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grants EP/T00097X/1 and EP/R030081/1). This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 754354. H.D. acknowledges support from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 840958.

Author contributions. D.F. and E.C. conceived the research. H.D., A.L. and J.Z. designed the experimental setup. J.Z. and A.L. performed the experiment. J.Z. and A.L. analysed the data. A.U. and E.C. developed SwissSPAD2, and H.D. developed the coincidence counting algorithm. E.C. supervised the project. All authors contributed to the manuscript.

Data availability. The experimental data and codes that support the findings presented here are available from the corresponding authors upon reasonable request.

Supplementary Information

.1 Details on data processing

The intensity images shown in this work are pixel-wise sums over the N acquired frames. Hot pixels are set to 0 for the correlation calculation and then interpolated from their neighboring pixels to produce a cleaner intensity image. The spatially-averaged correlation images shown in the main text are cropped from the full 300×300 data for visualization purposes. The results shown in Fig. 5 are the full spatially-averaged correlation data for the 4 gate positions selected in Fig. 2. In Fig. 5c and d, the correlation peaks are clearly visible, while the background fluctuations in Fig. 5d are smaller because no classical light is collected at that gate position Defienne et al. (2019).
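The pixel-wise summation and hot-pixel handling described above can be sketched as follows. This is a minimal illustration, not the actual processing code: the frame layout, the hot-pixel criterion (a firing-rate threshold) and the 4-neighbour interpolation are our assumptions.

```python
import numpy as np

def sum_intensity(frames, hot_threshold=0.9):
    """Pixel-wise sum of N binary SPAD frames with simple hot-pixel handling.

    frames: (N, H, W) array of binary frames (hypothetical layout).
    Pixels firing in more than hot_threshold of the frames are treated as
    hot: zeroed for the correlation calculation, then interpolated from
    their 4-connected neighbours for the displayed intensity image.
    """
    intensity = frames.sum(axis=0).astype(float)  # pixel-wise sum over N frames
    rate = intensity / frames.shape[0]
    hot = rate > hot_threshold                    # hot-pixel mask
    corr_image = intensity.copy()
    corr_image[hot] = 0.0                         # zeroed for correlations
    # interpolate hot pixels from neighbours (image edges handled by padding)
    padded = np.pad(corr_image, 1, mode="edge")
    neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1]
             + padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    display = corr_image.copy()
    display[hot] = neigh[hot]
    return corr_image, display
```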

Figure 5: Full spatially-averaged correlation images. (a), (b), (c) and (d) The full 300×300 spatially-averaged correlation images at the 4 gate positions corresponding to those in Fig. 2. The red curves on the walls are projections of the data across the correlation peak.

The arrival time of light from the quantum-illuminated object can be located by finding the falling edge of the correlation-peak profile. To locate the midpoint of the falling edge, we fitted the fluctuating correlation-peak data with the error function erf(·) Chan et al. (2019). From the fitted curves shown in Fig. 6a and b, we obtained a fall time (from 90% to 10%) of 0.882 ns for the synchronous case and 0.864 ns for the asynchronous case. This small variation (1 gate position) confirms the reliability of the correlation-peak profile for object ranging. As the falling edge is not perfectly sharp, we recovered the intensity images of the classical and quantum objects by subtracting the intensity image at the gate position 45 gates before the midpoint from the one 45 gates after it. This yields a subtracted intensity image with better contrast by avoiding images on the falling edge. The falling-edge profile could be further improved by optimizing the electrical gate shape of the camera and decreasing the laser pulse width.
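An erf-based falling-edge fit of this kind can be sketched as below. The model parametrization and the conversion from the fitted edge width to a 90%-to-10% fall time are our assumptions; the fitting code used for the paper may differ.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf, erfinv

def falling_edge(t, t0, sigma, amp, base):
    """Error-function model of the correlation-peak falling edge:
    high level (base + amp) before t0, low level (base) after it."""
    return base + 0.5 * amp * (1.0 - erf((t - t0) / (np.sqrt(2.0) * sigma)))

def fit_edge(t, peak):
    """Fit the falling edge; return the midpoint t0 and the 90%-10% fall time."""
    p0 = [t[len(t) // 2], 0.3, peak.max() - peak.min(), peak.min()]
    (t0, sigma, amp, base), _ = curve_fit(falling_edge, t, peak, p0=p0)
    # for an erf edge, the 90%-to-10% fall time is 2*sqrt(2)*erfinv(0.8)*sigma
    fall_time = 2.0 * np.sqrt(2.0) * erfinv(0.8) * abs(sigma)
    return t0, fall_time
```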

Figure 6: Correlation-peak data and fitted curves at the falling edge. (a) The correlation-peak data from 21.6 ns to 27 ns corresponding to the synchronous result in Fig. 2 and its fitted falling edge. (b) The correlation-peak data from 18 ns to 23.4 ns corresponding to the asynchronous result in Fig. 4 and its fitted falling edge.

.2 Correlation peak SNR analysis

To analyse the visibility of the correlation peak in the spatially-averaged correlation image, we calculate the signal-to-noise ratio (SNR), defined as the correlation-peak value divided by the standard deviation of the background noise surrounding it. Fig. 7 shows the SNR of the corresponding result in Fig. 2, with the average intensity plotted as a reference. The SNR rises above 1 when the quantum light arrives at the camera at 11 ns (with the classical light already present). The SNR improves further when the classical light disappears at 17 ns, because the background noise decreases.
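This SNR definition can be written down in a few lines. A minimal sketch, assuming the correlation peak sits at the centre of the spatially-averaged correlation image and that "background" means everything outside a small central window (the window half-width is a hypothetical parameter):

```python
import numpy as np

def correlation_snr(corr, half=4):
    """SNR of the correlation peak: the central peak value divided by the
    standard deviation of the background surrounding it.

    corr: 2D spatially-averaged correlation image, peak assumed at centre.
    half: half-width of the central window excluded from the background.
    """
    cy, cx = corr.shape[0] // 2, corr.shape[1] // 2
    peak = corr[cy, cx]
    background = np.ones_like(corr, dtype=bool)
    background[cy - half:cy + half + 1, cx - half:cx + half + 1] = False
    return peak / corr[background].std()
```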

Figure 7: Correlation-peak SNR profile. The SNR of the synchronous result corresponding to Fig. 2. The average intensity profile is plotted as a reference.
Figure 8: SNR analysis for various classical-to-quantum light intensity ratios. (a) The intensity images and spatially-averaged correlation images with I_c/I_q = 0.75. (b) The intensity images and spatially-averaged correlation images with I_c/I_q = 3.48. The insets in the correlation images show the central 9×9 data, and the curves are vertical profiles across the center. (c) Measured correlation SNR values for different I_c/I_q ratios, with the curve fitted using a theoretical model.

To further evaluate the visibility of the correlation peak in the spatially-averaged correlation image, we measure the correlation-peak SNR for various classical-to-quantum light intensity ratios (I_c/I_q), keeping the pump laser power constant and tuning the classical laser power. The experiments are performed with the same camera configuration as in Fig. 4. As shown in Fig. 8a and b, the superimposed images are acquired at the 13.5 ns gate position, where the gate window captures both classical and quantum light. The traffic stop sign is difficult to distinguish in the superimposed image when I_c/I_q is high (3.48), while it is better resolved after subtracting the classical light acquired at the 24.66 ns gate position. The spatially-averaged correlation images show that the background is noisier when the classical light intensity is higher, as seen in the profiles across the center. Fig. 8c shows the measured correlation-peak SNR values for various I_c/I_q, together with the curve fitted using the theoretical model described in Defienne et al. (2019). All results are based on measurements with a fixed N = 3000 8-bit frames.

Figure 9: Results with a reduced number of frames per gate. The average intensity and the correlation peak over 1500 gate positions are measured for the scenario in Fig. 2. At each gate position, N = 300 frames are acquired with an acquisition time of 0.81 seconds. The fitted curve of the correlation-peak falling edge is shown as a green line.
Figure 10: Correlation-driven scanning approach and its result. (a) The correlation-driven scanning principle. (b) Only 8 scanning gate positions (300 frames acquired per gate) are required to locate the range of the object, which takes about 6.56 seconds. (c) The spatially-averaged correlation image at scanning position 2 in (b). The inset shows the central 9×9 data. (d) The subtracted intensity image and the arrival time of the reflected quantum light from the object, 25.524 ns.

.3 Improving the quantum LiDAR acquisition speed

.3.1 Reducing the number of frames

In Figure 2, 5000 frames are acquired in 13.5 s at each gate position to measure a spatially-averaged correlation image and identify the peak with an SNR on the order of 30. As shown in Figure 9, the peak remains clearly visible (SNR on the order of 6) if the number of acquired frames per gate is reduced to 300, which lowers the acquisition time to 0.81 s per gate.

.3.2 Using a correlation-driven algorithm

To further improve the quantum LiDAR speed, we developed a coincidence-driven algorithm inspired by the binary-search process used in successive-approximation register (SAR) analog-to-digital converters. To cover the range of the object, the scanning time range should be larger than the laser pulse period; here we use 2800 gate positions, corresponding to 50.4 ns. As shown in Fig. 10a, we initially scan 3 gate positions at intervals of 700 gates (12.6 ns), dividing the scanning range into 4 equal parts. As the width of the gate window is 15.066 ns, at least one of the initially scanned gates is guaranteed to capture the reflected quantum light pulse, yielding a correspondingly higher correlation peak. Since the falling edge is not perfectly sharp, a follow-up scan checks whether the gate position with the higher correlation peak lies on the falling edge. To avoid such false locating, we scan the gate positions 45 gates before and 45 gates after the target position from the previous scan. The target range for the next scan is then defined between the last gate position with the higher correlation peak and the scanned gate position just after it. When the target range is narrow enough, the fitting method can be applied to recover the falling edge from the discrete scanned points. An example is depicted in Fig. 10b, where only 6.56 seconds and 8 scanning points are needed to locate the range of the object. Fig. 10c shows the projected coincidences at scanning point 2, in which the coincidence peak is clearly visible. Fig. 10d shows the subtracted intensity image; the measured relative range is 25.524 ns. Note that the initial offsets for the measurements in Fig. 2 and Fig. 10 are different, resulting in different relative time ranges.
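The scanning logic above can be sketched as a binary search over gate positions. This is a simplified illustration: `measure_peak` is a hypothetical callback wrapping one 300-frame acquisition and correlation-peak measurement, and the threshold and guard values are illustrative assumptions rather than the exact procedure.

```python
def locate_falling_edge(measure_peak, n_gates=2800, coarse_step=700,
                        guard=45, threshold=0.5):
    """Coincidence-driven gate search, sketched as a binary search.

    measure_peak(gate) -> correlation-peak value at one gate position
    (hypothetical callback). Returns a narrow (lo, hi) gate interval
    bracketing the falling edge of the correlation-peak profile.
    """
    # coarse scan: 3 gate positions dividing the range into 4 equal parts
    coarse = [coarse_step, 2 * coarse_step, 3 * coarse_step]
    values = {g: measure_peak(g) for g in coarse}
    ref = max(values.values()) * threshold
    # last coarse gate still on the correlation-peak plateau
    lo = max(g for g in coarse if values[g] >= ref)
    # guard scan: check +/- guard gates in case lo sits on the falling edge
    for g in (lo - guard, lo + guard):
        if 0 <= g < n_gates and measure_peak(g) >= ref:
            lo = max(lo, g)
    hi = min(lo + coarse_step, n_gates - 1)
    # binary search between lo (on the peak) and hi (past the edge)
    while hi - lo > guard:
        mid = (lo + hi) // 2
        if measure_peak(mid) >= ref:
            lo = mid
        else:
            hi = mid
    return lo, hi
```

Once the interval is narrow, the erf fit described earlier can be applied to the gates measured inside it.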

By reducing the number of acquired frames to 300 and using the correlation-driven algorithm, we retrieve the quantum-illuminated object and its depth in 7 seconds. In the current implementation, it is not necessary to consider the variation of the gating profile across pixels, since the object is a two-dimensional mask; this simplifies the processing and makes the proposed algorithm effective. The correlation-driven algorithm can also be extended to quantum LiDAR with three-dimensional objects by applying in-pixel successive approximation. A similar approach has been implemented for conventional TCSPC-based LiDAR to reduce the output bandwidth Kim et al. (2021).

.4 Analysis of different scenarios

Depending on the arrival times of the reflected photons from the classical and quantum light, different scenarios arise, as shown in Fig. 11. In the first three scenarios (Fig. 11a, b and c), the target quantum object can be located from the falling edge of the correlation-peak profile and recovered by subtracting the intensity image acquired after the falling edge, regardless of whether the spurious signal is background light (a and b) or classical light (c). However, if the distance between the classical and quantum objects is smaller than the depth resolution of the camera, spatially-resolved correlation imaging must be used for distillation, which takes more time.

Figure 11: Different synchronous scenarios. (a) The response profiles of the classical and quantum light are separated. (b) The two profiles overlap, with the classical light arriving at the camera first, similar to the scenario in Fig. 2. (c) The two profiles overlap, with the quantum light arriving at the camera first. (d) The two profiles overlap entirely, or their separation is smaller than the depth resolution of the camera; in this case they can only be distinguished with a spatially-resolved correlation image, as shown in Fig. 3.