Workshops and Tutorials
Monday 19 September 2022
Wafer-level 3D Stacked Imagers: Technologies and Sensor Architectures
(Room - Aula 1E)
David Stoppa (Sony Semiconductor Solutions - Sony Europe B.V.)
David Stoppa (M’97-SM’12) received the Laurea degree in Electronics Engineering from the Politecnico di Milano, Italy, in 1998, and the Ph.D. degree in Microelectronics from the University of Trento, Italy, in 2002. Since 2021 he has been the head of the Sony Europe Technology Development Centre in Trento, Italy, working on the research and development of next-generation sensor architectures, technologies, and systems. From 2017 to 2021 he was with ams-OSRAM, leading a team developing range-sensor and imager products. From 2014 to 2017 he was the head of the Integrated Radiation and Image Sensors research unit at FBK, where he had worked as a research scientist since 2002 and as group leader of the Smart Optical Sensors and Interfaces group from 2010 to 2013.
He was a Guest Editor for the IEEE Journal of Solid-State Circuits special issue on ISSCC’14 in 2015, and he has served as an Associate Editor since 2017. Dr. Stoppa received the 2006 European Solid-State Circuits Conference Best Paper Award.
The advent of 3D-stacking technologies has had a dramatic impact on the performance of CMOS image sensors (CIS), opening the way to new sensor architectures and alternative process optimization solutions. 3D-stacking is now a mature manufacturing process exploited by most of the state-of-the-art CIS used in consumer, automotive and industrial applications. This technology enables several design optimization options: a fully optimized photo-sensing top-wafer process in combination with deep-submicron digital nodes for the bottom logic wafer, and the adoption of back-side-illuminated photodiodes with virtually 100% fill factor even for complex pixel readout architectures. Pixel-level Through-Silicon Via (TSV) interconnections enable fully parallel implementation of new pixel readout solutions and have unlocked the full potential of SPAD-based imagers for ToF/LiDAR applications. This workshop will address state-of-the-art 3D-stacking BSI-CIS technologies and manufacturing processes, and provide an overview of different sensor architectures exploiting stacking to implement advanced intensity (RGB) and 3D imagers. The goal is to gather experts in the field to discuss the next challenges and future perspectives in this fast-evolving field.
08:20 - 08:30
Welcome and introduction
08:30 - 09:15
CIS STACKING TECHNOLOGY: overview and future outlook
In the opening presentation of the workshop, a historical overview and technical background of CMOS image sensor stacking technology will be given. Stacking technology extends the volume available for the pixel's electronic circuitry into the third dimension. Although the idea of stacking multiple layers of active silicon together already existed in the previous century, it was introduced into the imaging world only about ten years ago. Despite the complicated technology, the step from the first scientific publications to a mature technology for consumer products was made in a very short time. Back-side illumination, deep-trench isolation, through-silicon vias and hybrid bonding were all prerequisites for establishing a cost-effective, reliable and high-performance stacking technology.
Now that double-layer stacking in combination with hybrid bonding is available (even in the foundry business), the first experiments and even the first products with triple-layer stacking have been announced. In which direction is this technology going to evolve, and what will be the added value of having three or maybe even more layers of silicon for imaging devices? Which new applications can be addressed? What are the technological consequences for the performance of the imaging devices?
Albert Theuwissen (Harvest Imaging, BE)
Albert Theuwissen received the MSc and PhD degrees in electrical engineering from the Catholic University of Leuven (Belgium) in 1977 and 1983, respectively. In 1983 he joined Philips Research Labs (the Netherlands), and in 2002 he started working for DALSA. He holds several patents and is author or coauthor of 250+ technical papers as well as the textbook "Solid State Imaging with Charge Coupled Devices". He acted as general chairman of the International Image Sensor Workshop in ’97, ’03, ’09 and ’15, and as International Technical Program Chair of ISSCC 2010.
In 2001, he became a part-time professor at the Delft University of Technology, the Netherlands. After he left DALSA in 2007, he founded Harvest Imaging. Since then he has focused fully on training, teaching and consulting in the field of solid-state imaging technology.
In 2006 he co-founded (together with his peers Eric Fossum and Nobukazu Teranishi) ImageSensors, Inc. (a California non-profit public benefit company) to address the needs of the image sensor community. From 2017 to 2021 he served as president of the International Image Sensor Society (IISS).
In 2008, he received the SMPTE’s Fuji Gold Medal. In 2011 he was elected “Electronic Imaging Scientist of the Year”, in 2013 he received the Exceptional Service Award of the International Image Sensor Society, and in 2014 he received the SEMI Award. Albert is an IEEE Life Fellow.
The Evolution of Front-Side-Illuminated Direct Time-of-Flight Sensors towards Back-Side-Illuminated Stacked Mid/High-Resolution 3D Imaging Systems at ams-OSRAM: Technical Challenges and Opportunities
This talk will cover the development of SPAD-based direct Time-of-Flight (dToF) systems through a discussion of multiple generations of sensors, starting from front-side-illuminated implementations, with their inherent limitations in pixel resolution and performance, and moving to the challenges and opportunities of introducing back-side-illuminated 3D-stacking technology to dToF systems. The focus will be on the technical aspects of the SPAD pixel architecture, readout electronics, time-resolving unit and histogram memory, and on the requirements in terms of data handling and processing in both cases.
In addition to the sensing part, the laser module, which is equally essential for the overall system performance, will be discussed. It needs to generate high-power, sub-ns optical pulses and therefore also benefits from 3D-stacked integration into the module.
To complete the picture, a reconfigurable mid/high-resolution 3D imaging system, using back-side-illuminated SPADs, with real-time on-chip depth-map computation will be presented. It consists of an emitter and a receiver IC, which allows very flexible implementation of the different algorithms performing the key functions necessary for dToF depth-map generation, such as: i) run-time monitoring of the histogram, ii) target peak detection, and iii) histogram-to-distance conversion. The emitter module is able to generate high peak power without exceeding the Class 1 eye-safety limit. Finally, the main features and key parameters of the system, as well as options for future optimization of dToF performance, will be discussed.
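The three key functions listed above can be illustrated with a simple sketch. The code below is the author's hypothetical illustration of the principle (not the sensor's actual on-chip implementation): photon arrival times are accumulated into a histogram, the peak bin is detected, and its time of flight is converted to distance via d = c·t/2.

```python
# Illustrative sketch of the dToF histogram-to-distance principle.
# Assumptions (not from the talk): uniform time bins, a single target,
# and a simple argmax as the peak detector.

C = 299_792_458.0  # speed of light, m/s

def histogram_to_distance(histogram, bin_width_s):
    """Return the distance (m) of the strongest return in a dToF histogram."""
    # ii) target peak detection: pick the bin with the most photon counts.
    peak_bin = max(range(len(histogram)), key=lambda i: histogram[i])
    # iii) histogram-to-distance conversion: light travels out and back,
    # so the one-way distance is half the round-trip path.
    tof = (peak_bin + 0.5) * bin_width_s  # bin centre as time of flight
    return C * tof / 2.0

# Example: 64 bins of 1 ns; a laser return piles up in bin 20 (~3 m away),
# on top of a flat ambient background.
hist = [2] * 64
hist[20] = 50
print(histogram_to_distance(hist, 1e-9))
```

A real pixel array would additionally apply run-time histogram monitoring (function i) to reject cycles dominated by ambient light before the peak search.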
Robert Kappel (ams OSRAM, AT)
Robert Kappel received his Master's and Ph.D. degrees from Graz University of Technology in 2012 and 2015, respectively. In 2015, Robert joined the Advanced Optical Sensors Business Line of ams-OSRAM, where he was deeply involved in the development of SPAD-based Time-of-Flight technology covering front-side- and back-side-illuminated sensors. He is currently working as a principal design engineer with a focus on product development of 3D sensors for mobile devices.
10:30 - 11:15
2D, or not 2D, that is the question: how 3D-stacking is shaping the future of SPAD image sensors
Photon counting has entered the realm of image sensing with the creation of deep-submicron CMOS SPAD technology. The format of SPAD image sensors has expanded from the 8x4 pixels of our first LIDAR in 2004 to our recent megapixel camera in 2019, and the applications have exploded in the last few years with the introduction of proximity sensing and portable telemeters. SPAD image sensors are today in almost every smartphone and will likely be in every car by 2022. The introduction of SPADs into 3D-stacked ICs in 2015 has created a great opportunity for this computationally intensive technology. Inherently digital in nature, SPADs can now take advantage of the additional processing and computation over multiple silicon layers offered by 3D integration. More silicon real estate will enable deep-learning processors and neural networks directly on chip, thus enabling complex processing in situ and reducing the overall power consumption. Another recent trend has been the use of SPADs in qubit readout and control; thanks to their capability of operating at cryogenic temperatures, SPADs can interface with quantum processors. The talk will conclude with a technical and economic perspective on SPAD imagers and the vision for massively parallel solid-state photon counting in scientific and consumer applications.
Edoardo Charbon (EPFL, CH)
Edoardo Charbon (SM’00 F’17) received the Diploma from ETH Zurich, the M.S. from the University of California at San Diego, and the Ph.D. from the University of California at Berkeley in 1988, 1991, and 1995, respectively, all in electrical engineering and EECS. He has consulted with numerous organizations, including Bosch, X-Fab, Texas Instruments, Maxim, Sony, Agilent, and the Carlyle Group. He was with Cadence Design Systems from 1995 to 2000, where he was the architect of the company's initiative on information hiding for intellectual property protection. In 2000, he joined Canesta Inc. as Chief Architect, where he led the development of wireless 3-D CMOS image sensors. Since 2002 he has been a member of the faculty of EPFL, where he has been a full professor since 2015. From 2008 to 2016 he was with Delft University of Technology as Chair of VLSI Design. Dr. Charbon has been the driving force behind the creation of deep-submicron CMOS SPAD technology, which has been mass-produced since 2015 and is present in telemeters, proximity sensors, and medical diagnostics tools. His interests span from 3-D vision, LiDAR, FLIM, FCS, and NIROT to super-resolution microscopy, time-resolved Raman spectroscopy, and cryo-CMOS circuits and systems for quantum computing. He has authored or co-authored over 400 papers and two books, and he holds 23 patents. Dr. Charbon is a distinguished visiting scholar of the W. M. Keck Institute for Space at Caltech, a fellow of the Kavli Institute of Nanoscience Delft, a distinguished lecturer of the IEEE Photonics Society, and a fellow of the IEEE.
11:15 - 12:00
Pixel-Parallel Image Sensor Architectures with 3D-Stacking Technology
This talk introduces the pixel-parallel image sensor architectures that have rapidly evolved in recent years with the advent of 3D-stacking technology. Advances in hybrid-bonding process technology for image sensors are further shrinking the contact pitch while maintaining stable connections. This trend suggests the possibility of realizing complex pixel circuit architectures with practical pixel sizes and resolutions. The pixel-parallel configuration contributes not only to high-frame-rate and high-dynamic-range imaging, but also to performance improvements in depth sensing, event-based vision sensing, and invisible-light imaging with non-Si photodetectors. The next challenge will require more scalability, further optimization of process technologies, and greater area efficiency to further evolve the imaging and sensing world.
Yusuke Oike (Sony Semiconductor, JP)
Yusuke Oike has been involved in research and development of architectures, circuits, and devices for image sensors since joining Sony Corporation in 2005. Currently, he is responsible for the development of CMOS image sensors as Deputy Senior General Manager of Sony Semiconductor Solutions. He has also been appointed as Distinguished Engineer of Sony Group Corporation.
13:30 - 14:15
From 3D stacking to 3D CMOS image sensors: evolution, challenges and directions for manufacturing technologies and pixel architectures
CMOS image sensors have evolved beyond passive 2D image capture to active decision making. Depth sensing has increased the accuracy of information and opened new application opportunities.
Pixel architectures have diversified from rolling-shutter pixels to differentiated sensors:
• global-shutter pixels embedding memory
• direct and indirect time-of-flight pixels based on single-photon avalanche detection or fast photodiodes
• native high-dynamic-range pixels combining both electron and hole sensing.
The optical spectrum has also spread from visible to infrared, towards longer wavelengths for safety and power saving reasons.
The new pixels deliver highly sought-after pitch reduction and can be optimized for new market opportunities.
Some of the fundamental technologies for high-performance pixels include deep isolation trenches, wafer stacking, optimized diodes and transfer gates, and QE enhancement through silicon structuration. We are also researching opportunities to increase sensitivity through new materials, such as quantum dots, to improve QE and density.
Hélène Wehbe-Alause (STMicroelectronics, FR)
Hélène Wehbe-Alause joined ST in 2000 after obtaining her PhD in Physics and working for Thales Avionics. She took over 2.2µm pixel development in 2006 and has been managing the ST process integration team for CIS since 2011. In 2021, she was appointed Director of Technology for Optical Sensors.
14:15 - 15:00
3D stacking meets 3D imaging: Vertically Integrated SPAD Sensor Architectures
3D stacking and backside-illuminated (BSI) CMOS image sensor (CIS) manufacturing technologies offer a means to unleash the full potential of single-photon avalanche diode (SPAD) sensors. Optimised BSI SPADs with small pitch, high detection efficiency, low noise and broad spectral range have recently been developed. Digital processing placed under these SPADs has leaped to advanced digital 40/22nm nodes, matching those adopted in state-of-the-art stacked-BSI CIS. Unlike CIS, SPAD pixels require no analogue processing and offer relatively large areas for embedded digital signal processing. This offers new potential for image sensor designers to implement direct Time-of-Flight (dToF) imaging and time-correlated single-photon counting (TCSPC) on a massively parallel scale. Power efficiency and timing uniformity represent a particular challenge of these devices, given the high charge per pulse of the detectors and the large simultaneous power draw of time-to-digital converters operating at GHz frequencies. The huge data volumes generated by SPADs firing collectively at Giga-events/s in high-ambient scenarios must be handled by novel sharing and time multiplexing of digital pixel processing resources. This talk will review some of these recent advances in stacked 3D SPAD pixel and sensor architectures, including examples drawn from the author's own work. Technology trends and future pixel scaling possibilities will be discussed.
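The idea of sharing digital pixel processing resources can be pictured with a toy model. The sketch below is the author's illustration under simplified assumptions (not taken from the talk): a cluster of SPAD pixels shares one time-to-digital converter, and only the first photon event per laser cycle wins the arbitration and is timestamped.

```python
# Toy model of TDC sharing across a SPAD pixel cluster.
# Assumption (hypothetical): first-arrival-wins arbitration, one
# timestamp recorded per laser cycle for the whole cluster.

def shared_tdc_cycle(events):
    """events: list of (pixel_id, arrival_time_s) for one laser cycle.
    Returns the (pixel_id, timestamp) the shared TDC records,
    or None if no pixel fired in this cycle."""
    if not events:
        return None
    # The earliest-firing pixel in the cluster captures the shared TDC;
    # later events in the same cycle are dropped (the "sharing" cost).
    return min(events, key=lambda e: e[1])

# One laser cycle in which three of the cluster's pixels fire:
cycle = [(3, 8.2e-9), (0, 5.1e-9), (7, 5.9e-9)]
print(shared_tdc_cycle(cycle))  # pixel 0 fired first and wins the TDC
```

In a real sensor the dropped later events are the price of multiplexing; architectures trade cluster size against this loss and against TDC area and power.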
Robert K. Henderson (University of Edinburgh, UK)
Robert K. Henderson is a Professor of Electronic Imaging in the School of Engineering at the University of Edinburgh. He designed the first SPAD image sensors in nanometer CMOS technologies leading to high volume SPAD time-of-flight products. He is a Fellow of the IEEE and the Royal Society of Edinburgh.
15:30 - 16:15
Panel and open discussions