TECHNOLOGY FOCUS

Digital cameras are an essential part of daily life, for example in mobile phones, camcorders, and in imaging applications for the medical, industrial and broadcasting industries. All these camera applications rely on solid-state image sensors. 

However, if consumers had to choose a digital camera on the basis of the raw data produced by the imager, it is doubtful that anyone would buy one. The data produced by a solid-state image sensor is contaminated by noise, defects, inconsistencies and many other error sources. To make matters worse, solid-state image sensors do not themselves produce a colour image: it is the data processing that must correct all these potential errors and even reconstruct the colour information in the post-processing stage. So what a person actually sees on a display or hard copy is not the same as what the imager has captured: what you see is not what you got!

As we can foresee that our homes, offices and cars will soon be fully equipped with cameras to make life safer and more enjoyable and to reduce our workload, digital camera technology can be recognized as a forefront technology. Even today, for many applications imaging is still in an embryonic stage of development.

COURSE CONTENT

The objective of this course is to provide theoretical familiarity and hands-on experience with digital cameras and associated topics, with a focus on overall system aspects.

The path from incoming photons to outgoing digital numbers is long and complex. The course will describe this path with all its shortcomings, and provide examples of how these can be overcome.

Hands-on sessions will form a strong backbone of the course, complemented by a number of tutorial lectures. Many example images will be used to explain the various details. The course is not aimed at demonstrating existing products; instead, computer animations and simulations will be used throughout the course to achieve a realistic experience.

WHO SHOULD ATTEND

No detailed knowledge of device physics is assumed. The course is designed to give engineers who are active in the field an in-depth understanding of digital camera systems and technologies, and to give those with a theoretical background an opportunity to learn more about the practical issues. The course will provide a valuable update on the latest developments in this rapidly changing technology.

The main difference from course #013, Digital Imaging: Image Capturing, Image Sensors - Technologies and Applications, is that course #013 focuses primarily on the sensor, while this course targets the system around the image sensor. The two courses can be seen as twins: they share the same parents, but they are fraternal twins. Beyond their common background, the two courses are completely different from each other, yet complementary, and they can be attended fully independently of each other.

Due to Covid-19 and the uncertain travel recommendations for Autumn 2020, it has been decided that this course will run online only. The daily schedule will be adjusted to fit remote training, with fewer hours per day spread over additional days. Make a preliminary booking and we will keep you updated.


Day 1

The Image Sensor, the Optics, Image Processing: A Review 
The course begins with a brief overview of the basic theory of solid-state image sensors. The imager is of course only a small, but vital, component of the complete camera system. The effect of the spectral content of the light sources will be discussed. Lectures on optics and on digital image processing are included to form a strong backbone for the remaining parts of the course. 

Image Quality 
Noise, defects, irregularities in the video signal and inconsistencies can all degrade the quality of the image. We will discuss where these problems come from and how they can be corrected, as well as the dilemma that correcting one effect can have a negative impact on other camera parameters. 

Day 2

Digital Camera Systems

  • Dark Current Compensation: The average value of the dark current can be corrected by using dark-reference lines/pixels, and fixed-pattern noise can be corrected by means of dark frame subtraction. How effective are these techniques? What is their influence on signal-to-noise performance, and what about temperature effects? (A small dark-frame sketch follows this list.)
  • Colour Interpolation: Bayer-pattern sampling is used extensively in digital imaging, but the sampling is only half of the story; the other half is the demosaicing, or interpolation. Several methods will be discussed and compared with each other (a bilinear example is sketched after this list).
  • White Balancing: The human eye adapts easily and quickly to the spectrum of a light source; image sensors do not adapt at all! How can we deal with this "shortcoming" of the imagers? (One simple strategy is sketched after this list.)
  • Defect Correction: How can defective pixels be corrected without any visible effect? Can similar techniques also be applied to correct defective columns?
  • Noise Filtering: A very important issue in data processing is the filtering of any remaining noise. This can be done in a non-adaptive or an adaptive way. What are the pros and cons of the various techniques?
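
As an illustration of the dark-current topic above, here is a minimal sketch of dark frame subtraction in Python/NumPy. The function name, the array shapes and the 12-bit white level are assumptions made for this example only, not material taken from the course:

    import numpy as np

    def subtract_dark_frame(raw, dark_frames, white_level=4095):
        """Remove the dark-current offset and its fixed pattern by
        subtracting an averaged dark frame (captured with closed shutter)."""
        # Averaging several dark frames suppresses their temporal noise,
        # so essentially only the fixed pattern is subtracted.
        master_dark = np.mean(dark_frames, axis=0)
        corrected = raw.astype(np.float64) - master_dark
        # Dark current only adds signal, so negative results are noise.
        return np.clip(corrected, 0, white_level)

Averaging many dark frames keeps the subtraction from injecting extra temporal noise into the image, which ties in directly with the signal-to-noise question raised above.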
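
In the same spirit, a bilinear interpolation sketch for the colour-interpolation topic, assuming an RGGB Bayer layout; this is one of the simplest possible methods, and the layout and names are illustrative only:

    import numpy as np
    from scipy.ndimage import convolve

    def demosaic_bilinear(raw):
        """Bilinear demosaicing of an RGGB Bayer mosaic (H x W) into RGB."""
        h, w = raw.shape
        rgb = np.zeros((h, w, 3))
        masks = np.zeros((h, w, 3))
        # RGGB layout: R at (even, even), G at (even, odd) and (odd, even),
        # B at (odd, odd).
        masks[0::2, 0::2, 0] = 1   # red
        masks[0::2, 1::2, 1] = 1   # green on red rows
        masks[1::2, 0::2, 1] = 1   # green on blue rows
        masks[1::2, 1::2, 2] = 1   # blue
        kernel = np.ones((3, 3))
        for c in range(3):
            samples = raw * masks[..., c]
            # Sum of the known neighbours divided by their count gives
            # the bilinear estimate where the channel sample is missing.
            num = convolve(samples, kernel, mode='mirror')
            den = convolve(masks[..., c], kernel, mode='mirror')
            rgb[..., c] = np.where(masks[..., c] > 0, samples, num / den)
        return rgb

More advanced, edge-aware interpolation schemes trade extra computation for fewer colour artefacts along edges.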
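
One deliberately simple answer to the white-balancing question is the grey-world assumption: the scene is assumed to average out to grey, so the channel gains are chosen to equalize the channel means. The sketch below illustrates the idea only; it is not necessarily the method advocated in the course:

    import numpy as np

    def grey_world_white_balance(rgb):
        """Scale the channels so that all channel means match the green mean."""
        means = rgb.reshape(-1, 3).mean(axis=0)
        gains = means[1] / means      # green acts as the reference channel
        return rgb * gains            # gains broadcast over all pixels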

Day 3

Digital Camera Systems (cont'd)

  • Colour Matrixing: Nobody is perfect, and neither are the imagers, which suffer from optical cross-talk and from imperfections in the transmission characteristics of the colour filters. Colour matrixing takes care of these issues. The question is how to find the optimum correction matrix coefficients (a small sketch follows this list).
  • Contouring: This is a technique to "regain" detail, edges and sharpness in an image, but quite often not only the details are enhanced but also the noise in the image. Various contouring techniques will be discussed and compared with each other (an unsharp-mask example is sketched after this list).
  • Lens Vignetting: Lenses show a strong fall-off of intensity and sharpness towards the edges, and on top of that the image sensor adds an extra fall-off of intensity. Is correction possible? How complicated does the correction need to be to become invisible to the observer?
  • Auto-focusing: How can the data of the image sensor itself be used to drive the auto-focusing function?
  • Auto-exposure: How can the data of the image sensor itself be used to optimize the exposure time of the imager?
  • Gamma Correction: How can the brightness of the output device be adapted to the characteristics of the human eye? (A small sketch follows this list.)
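
To make the colour-matrixing topic concrete, here is a minimal sketch: a 3x3 correction matrix is applied to every pixel, and its coefficients can, for example, be found by a least-squares fit between the measured and the reference colours of a set of test patches. The function names and patch arrays are hypothetical placeholders:

    import numpy as np

    def apply_colour_matrix(rgb, ccm):
        """Apply a 3x3 colour correction matrix to every pixel (H x W x 3)."""
        return rgb @ ccm.T

    def fit_colour_matrix(measured, reference):
        """Least-squares fit of the 3x3 matrix that maps measured patch
        colours (N x 3) onto their reference values (N x 3)."""
        solution, *_ = np.linalg.lstsq(measured, reference, rcond=None)
        return solution.T

A common extra constraint, not shown here, is to force each row of the matrix to sum to one so that neutral grey remains neutral after the correction.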
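
The contouring topic can be illustrated with unsharp masking, one common contour-correction technique (the blur radius and gain below are arbitrary illustrative values): a low-pass filtered copy is subtracted from the image to isolate the detail, and a scaled amount of that detail is added back. The same operation amplifies noise, which is precisely the trade-off mentioned above:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def unsharp_mask(luma, sigma=1.5, gain=0.7):
        """Contour enhancement: add back a scaled high-pass component."""
        blurred = gaussian_filter(luma, sigma=sigma)
        detail = luma - blurred       # high-pass part: edges and fine detail
        return luma + gain * detail   # the gain also amplifies the noise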
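
Finally, a gamma-correction sketch: the sensor responds linearly to light, while displays and the human visual system do not, so the linear signal is passed through a power law before display. A pure power law with gamma = 2.2 is used here for simplicity; real standards such as sRGB add a small linear segment near black:

    import numpy as np

    def gamma_encode(linear, gamma=2.2):
        """Map linear sensor values in [0, 1] to display values in [0, 1]."""
        return np.clip(linear, 0.0, 1.0) ** (1.0 / gamma)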

What previous participants have said about the course:

"Good examples and illustrations."

"Deep knowledge of the topic."

"Very good teacher and material gave excellent overview to imaging area."

"Not too specific details so easy to follow."

"Good examples, also theory behind these were explained well."

"Would recommend this course to my colleagues."

"Step-by-step process through the whole chain."

"Good explanations of each step."