1. Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an 710119, China
2. University of Chinese Academy of Sciences, Beijing 100049, China
3. CAS Key Laboratory of Space Precision Measurement Technology, Xi'an 710119, China
ZHANG Haiyang, zhanghaiyang21@mails.ucas.ac.cn
ZHOU Liang, zhouliang@opt.ac.cn
Received: 2025-10-28; Revised: 2026-03-08; Accepted: 2026-03-09; Published in print: 2026-03-25
ZHANG Haiyang, LV Yuanyuan, LI Jing, et al. Research on Event Camera Imaging Methods for High Dynamic Range Static Scenes[J]. Acta Photonica Sinica, 2026, 55(3): 0311003. DOI: 10.3788/gzxb20265503.0311003. CSTR: 32255.14.gzxb20265503.0311003.
To address the problem that event cameras cannot respond to static scenes, a high dynamic range imaging method based on shutter mapping is proposed. In a static scene, the method triggers scene brightness changes by controlling the rotation of a mechanical shutter, thereby producing an event stream, and finally reconstructs the discrete events obtained at different shutter rotation speeds into a high dynamic range image of the static scene. Experimental results show that the method successfully enables an event camera to image static scenes with a dynamic range of up to 112 dB. Its advantage is that it does not alter the stillness of the scene, effectively extending the applicability of event cameras to static scenes without sacrificing imaging quality. The proposed shutter-mapping method provides a simple and feasible route to imaging high dynamic range static scenes under complex illumination, and offers a new direction for broadening event camera applications.
Event cameras are a novel type of visual sensor inspired by the biological retina, capable of asynchronously capturing pixel-level brightness changes with microsecond temporal resolution. Unlike traditional cameras that capture frames at fixed intervals, event cameras trigger an event only when the change in pixel brightness exceeds a preset threshold. This mechanism endows event cameras with an ultra-wide dynamic range of up to 140 dB and exceptional temporal resolution, granting them significant advantages in high-speed, high-dynamic-range scenarios such as object tracking, motion detection, and Simultaneous Localization and Mapping (SLAM). These capabilities enable event cameras to capture rapid movements and subtle changes in dynamic scenes that conventional cameras might miss. However, event cameras have a significant limitation: they cannot directly capture grayscale images in static scenes. When a scene is stationary and light intensity remains constant, the camera detects no change in intensity, so the condition for event triggering is never met. This limitation restricts their application in scenarios requiring complete brightness information, such as recognition and measurement of static objects or scenes in traditional image processing tasks. Consequently, despite their ultra-high dynamic range, temporal resolution, and exceptional performance in dynamic environments, event cameras still face challenges in applications demanding detailed, continuous brightness data under static conditions.
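The triggering rule described above can be sketched in a few lines of Python. This is an idealized per-pixel model, not the paper's sensor: the contrast threshold of 0.2 in log intensity and the sampled waveforms are illustrative assumptions.

```python
import numpy as np

def generate_events(log_intensity, threshold=0.2):
    """Emit (sample_index, polarity) events whenever the log-intensity
    trace moves more than `threshold` away from the level at the last event."""
    events = []
    ref = log_intensity[0]                 # reference level at last event
    for i, L in enumerate(log_intensity[1:], start=1):
        while L - ref >= threshold:        # brightness increase -> ON event
            ref += threshold
            events.append((i, +1))
        while ref - L >= threshold:        # brightness decrease -> OFF event
            ref -= threshold
            events.append((i, -1))
    return events

# A static scene crosses no threshold, so it produces no events ...
static = np.log(np.full(100, 50.0))
assert generate_events(static) == []

# ... while a brightness-modulated scene produces a dense event stream.
t = np.linspace(0.0, 1.0, 100)
modulated = np.log(50.0 * (1.0 + 0.5 * np.sin(2 * np.pi * 5 * t)))
assert len(generate_events(modulated)) > 0
```

The empty output for the constant trace is exactly the failure mode the paper targets: with no intensity change, no events fire and no image can be formed.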
To address the challenge of capturing static scenes with event cameras, this paper introduces an imaging approach based on mechanical shutter mapping. The core innovation lies in modulating the brightness of a static scene through the rotation of a mechanical shutter, triggering events even in the absence of natural light intensity changes. By rotating the shutter at varying speeds, discrete events are generated and then reconstructed into High Dynamic Range (HDR) grayscale images. This technique overcomes a key limitation of event cameras, which fail to capture static scenes because no brightness variation occurs. A central element of the method is the calibration process, which aligns the generated event data with the actual light intensity of the scene, enabling accurate grayscale reconstruction. In this setup, the mechanical shutter acts as an active driving mechanism, inducing event generation by modulating the scene's brightness; the event camera can thus capture information in static scenes where it would otherwise register no changes. As a result, the method significantly enhances the imaging capabilities of event cameras, broadening their potential applications in static environments with minimal or no natural light variation.
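As a rough illustration of the shutter-mapping idea (not the paper's calibrated pipeline), one can model the ON-event count fired when the shutter uncovers a pixel as proportional to the log ratio of scene intensity to the occluded dark level; inverting that count recovers intensity up to one threshold step of log-domain quantization. The constants `C` and `I_DARK` below are assumed values, not parameters from the paper.

```python
import numpy as np

C = 0.2        # contrast threshold in log intensity (assumed)
I_DARK = 1.0   # effective level a pixel settles to while occluded (assumed)

def events_per_opening(intensity):
    """ON events fired when the shutter opens: one event per step of C
    that log intensity sits above the occluded dark level."""
    return np.floor((np.log(intensity) - np.log(I_DARK)) / C).astype(int)

def reconstruct_intensity(event_counts):
    """Invert the event count back to a (quantized) intensity estimate."""
    return I_DARK * np.exp(event_counts * C)

scene = np.array([2.0, 50.0, 5000.0, 400000.0])   # spans roughly 106 dB
counts = events_per_opening(scene)
recon = reconstruct_intensity(counts)
# Reconstruction is exact up to one threshold step in the log domain.
assert np.all(recon <= scene) and np.all(scene < recon * np.exp(C))
```

Because the event count grows with the logarithm of intensity, a modest number of events per shutter cycle spans an enormous intensity range, which is what makes an HDR static image recoverable from the event stream.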
Experimental results show that the reconstructed HDR images achieve a dynamic range of up to 112 dB. This range allows the event camera to capture fine details in both dark and bright areas of the scene, overcoming the limitations of conventional cameras with fixed dynamic ranges. The calibration process further improves the accuracy of grayscale image reconstruction. Our experiments demonstrate that the shutter-based technique successfully reconstructs grayscale images of static scenes under extreme lighting conditions. Compared to prior methods, this approach significantly improves image quality, reduces grayscale distortion, and preserves details often lost in conventional event reconstruction techniques. By generating events through controlled brightness variations, this method enhances the imaging capabilities of event cameras in static environments. Furthermore, the mechanical shutter mechanism facilitates portability, enabling adaptation to multiple event camera models and enhancing the system's versatility and practical applicability. Beyond expanding event cameras' capabilities in static scenes, this approach also provides a solution for event cameras operating under complex lighting conditions. Our shutter mapping technology offers a controlled event-triggering mechanism, enabling event cameras to function in scenarios where traditional methods fail. This opens new possibilities for event cameras in fields requiring high-quality grayscale images, such as industrial inspection.
This paper proposes and validates a shutter-mapped event camera method for grayscale imaging in high-dynamic-range static scenes. The core innovation of this approach lies in introducing a mechanical shutter as an active driving mechanism, which effectively overcomes the limitation that event cameras cannot generate event streams in static scenes due to the absence of brightness changes. By modulating the scene's brightness using controlled shutter rotations, the event camera can trigger events even in stationary conditions. Experimental results demonstrate that this method successfully reconstructs grayscale images of high-dynamic-range static scenes, achieving a dynamic range of up to 112 dB under extreme illumination conditions, all using a frameless event camera. Compared to existing methods, our approach significantly improves reconstruction quality, reduces grayscale distortion, and preserves fine details. Additionally, the proposed mechanical shutter triggering mechanism offers versatility and portability, providing a controllable active refresh mechanism that is compatible with various event camera models. In summary, the shutter mapping method extends the capabilities of event cameras in static scenes, enabling high-quality HDR grayscale image reconstruction. By inducing brightness variations through mechanical shutter actuation, event cameras can generate event streams and reconstruct HDR images, even in static scenes with extreme lighting conditions.