Phosphene

Shutter Exposure

Let’s look at a camera’s shutter mechanism. The shutter has two tasks. The first is to expose the photosensitive medium for a preset amount of time. This provides exposure control for creative purposes, regulated by a parameter common in photography: the shutter speed. The second task is to perform this exposure at the correct time in order to capture the desired moment. This part is a bit more nuanced, as we shall see.

Focal plane shutter mechanism of an Olympus E-620 with a closed shutter curtain. The shutter curtain consists of multiple blades that collapse over each other when the curtain retracts upwards to expose the imaging sensor.

Consider a common shutter design found in most interchangeable-lens cameras: the focal plane shutter. The shutter consists of two curtains positioned directly above the imaging plane. The first curtain initially covers the imaging medium, and is pulled away to start the exposure. The second curtain is initially pulled away and moves to cover the imaging medium, ending the exposure. The curtains take time to move across the imaging plane. This leads to an exposure that starts earlier at one end of the frame compared to the other. The two curtains follow the same path, ensuring an even exposure duration across the frame. This is known as a rolling shutter, where the image is exposed progressively from one side to the other over time. As different parts of the image are exposed at different times, this leads to warping of moving objects.

Our goal for today is to analyse how the shutter curtains move across the frame during exposure, and where in time each part of the frame is exposed.

A wobbly image of a straight fence due to rolling shutter distortion. The image is taken using an electronic shutter, which is prone to rolling shutter artifacts.

As some of the concepts we will discuss are confusingly similar in name, I’ll try to stick to the following terminology:

  • Exposure time: The period of time each part of the frame is exposed for. This determines the brightness of the captured image, and whether motion is blurred. This is universally referred to as the shutter speed. However, for a focal plane shutter the curtains’ velocity does not change as the exposure time is altered; only the time between the first and the second curtain’s movement does.

  • Shutter travel time: The time it takes the shutter curtain to move from fully closed to fully open, or the reverse. The exposure initiation or termination does not occur instantaneously, but rather progresses across the frame over this amount of time. Objects moving across the captured image are thus geometrically distorted.

From a photographer’s perspective, motion within the exposure time will result in a blurred image, while motion within the shutter travel time will result in a warped image.

Camera specifications

Let’s have a look at my trusty Olympus E-M10 Mark II from 2015. We can already get an idea of what to expect based on the user manual alone:

  • The image height is 3456 px, or 13.0 mm on the 4/3″ sensor.
  • The exposure time range is limited to 1/4000 s on the short end.

There is no mention of anything relating to the shutter travel time, but we can find clues, specifically in the flash synchronization time: 1/200 s when using an Olympus FL-600R flash with a full power flash duration of 1/500 s. That means that with an exposure time of 1/200 s, or 5.0 ms, there is a period of at least 1/500 s, or 2.0 ms, during which the full imaging sensor is exposed and the flash can be fired without obstruction by the shutter curtains. Subtracting the latter from the former leaves 3.0 ms for the shutter to travel without interfering with the flash exposure. A shutter travel time of 3.0 ms or less is as good an estimate as we’ll get for now.

With this in mind, what happens when an image is captured with an exposure time of 1/4000 s, or 0.25 ms? While each pixel sees an exposure of 0.25 ms locally, globally the image would only be fully exposed after 3.25 ms. The last row of the image will start its exposure 3.0 ms later than the first row due to the shutter travel time.
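This timing model can be sketched in a few lines of Python. It is a toy model assuming perfectly linear curtain motion, an assumption we will question later; the numbers are the estimates from above.

```python
# Per-row exposure timing for a focal plane shutter, assuming linear
# curtain motion. Numbers are the E-M10 II estimates from the text.
HEIGHT_PX = 3456     # image height in pixels
TRAVEL_MS = 3.0      # estimated shutter travel time
EXPOSURE_MS = 0.25   # 1/4000 s exposure time

def row_exposure_window(row):
    """Return the (start, end) of a row's exposure, in ms after the
    first curtain is released."""
    start = TRAVEL_MS * row / (HEIGHT_PX - 1)  # first curtain passes
    return start, start + EXPOSURE_MS          # second curtain passes

print(row_exposure_window(0))     # first row: starts at 0.0 ms
print(row_exposure_window(3455))  # last row: ends at 3.25 ms
```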

Image formation diagram detailing the relationship between image position and time of exposure due to the movement of the shutter curtains across the frame.

Method

The rolling shutter effect is not often noticeable in photographs, but it can easily be observed in an artificial setting. We just need a pulsing light source with a controlled frequency. We’ll use the camera’s own sensor as the actual measurement device; no high-speed camera needed.

We will observe banding in images of such a light source, as the light turns on and off while the sensor is progressively exposed. The shutter travel time will determine how time progresses across the frame: 3456 px over an estimated 3.0 ms, or 1152 px/ms assuming linear motion. For a light frequency of 1 kHz we know the duration of the pulses: a period of 1.0 ms, or 0.5 ms on and 0.5 ms off for a rectangular signal with 50 % duty cycle. This translates to alternating light and dark bands of 576 px wide each at 1152 px/ms. We will not observe sharp transitions between the light and dark bands, as each pixel captures an average over the exposure time. At 1/4000 s we expect a ramp up/down of 0.25 ms, or 288 px at 1152 px/ms, as both light states are partially observed when a pulse’s rising/falling edge occurs within this 0.25 ms exposure.
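The expected profile can be simulated directly: a square light wave sampled along the frame, smoothed by a box filter representing the 0.25 ms exposure. A sketch in NumPy, using the estimates above:

```python
import numpy as np

HEIGHT_PX = 3456     # image height
PX_PER_MS = 1152     # estimated shutter velocity: 3456 px / 3.0 ms
FREQ_KHZ = 1.0       # light pulse frequency
EXPOSURE_MS = 0.25   # 1/4000 s

# Time at which the curtain edge reaches each row, assuming linear motion.
t_ms = np.arange(HEIGHT_PX) / PX_PER_MS
# 50 % duty cycle square wave: the light state as each row starts exposing.
light = ((t_ms * FREQ_KHZ) % 1.0 < 0.5).astype(float)
# Each pixel averages the light over its 0.25 ms exposure: a 288 px box filter.
box = np.ones(int(EXPOSURE_MS * PX_PER_MS))
profile = np.convolve(light, box / box.size, mode="same")
```

The resulting `profile` shows 576 px bands with 288 px ramps between them, matching the expected banding profile plotted below.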

Expected banding profile across the image's vertical axis. Light switching at 1 kHz captured with an exposure time of 1/4000 s. Full image height of 3456 px.

The light pulse frequency should be somewhere in the right range to observe this banding properly. It should be high enough to create preferably multiple bands across the image, but not so high as to create bands less than a few pixels wide. With the estimated 3.0 ms shutter travel time we expect to see a single light cycle across the frame at around 333 Hz, while at 576 kHz each light cycle would span just two pixels1 of a 3456 px high image. However, if each pixel is exposed for 0.25 ms, the individual bands will already overlap at frequencies above 2 kHz.
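These bounds all follow from the same two estimates, as a quick check in plain Python (frequencies in kHz):

```python
HEIGHT_PX = 3456     # image height
PX_PER_MS = 1152     # estimated shutter velocity
EXPOSURE_MS = 0.25   # 1/4000 s

travel_ms = HEIGHT_PX / PX_PER_MS   # 3.0 ms shutter travel
f_single_cycle = 1.0 / travel_ms    # one cycle per frame: ~0.333 kHz
f_two_px = PX_PER_MS / 2.0          # one cycle per two pixels: 576 kHz
f_overlap = 0.5 / EXPOSURE_MS       # half-period equals the exposure: 2 kHz
```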

Setup

The light pulses are generated using an LED on a programmable development board2. The camera is set to its shortest exposure time of 1/4000 s and pointed directly at the LED. This creates a bright image of the LED with very little ambient light. A wide aperture with a long lens3 focused at infinity creates a de-focused image of the nearby point light. That way the complete sensor area is illuminated, and we observe banding across the whole frame.

The image processing is kept to a minimum4. The image is demosaiced just so we don’t have to worry about color filter arrays. The output is saved in a linear format to preserve light intensity values. No geometric distortion correction should be applied5. The data is averaged across the horizontal dimension and color channels to obtain a one-dimensional signal for analysis.

Results

1 kHz light

Let’s start with the LED driven at 1 kHz and see if we get the expected banding.

Image with banding across the frame due to light pulses. Light switching at 1 kHz captured with an exposure time of 1/4000 s.

This looks good. From this image we get the following light intensity profile across the vertical direction:

Observed banding profile across the image's vertical axis. Light switching at 1 kHz captured with an exposure time of 1/4000 s. Full image height of 3456 px.

At first this appears to be what we’d expect. We see the light pulses creating alternating bright and dark bands, and the rising and falling edges are distinctly sloped. The amplitude varies a bit across the frame, but that’s to be expected in a minimally controlled setting and is not really an issue. We see just under three complete cycles across the image. With each cycle being 1.0 ms wide, the shutter travel time must then be a bit shorter than the originally estimated 3.0 ms.

We could take the exact pulse width and derive the shutter velocity to infer the total shutter travel time… Except these bands are not all the same size! It appears the shutter curtains do not move at a constant velocity across the frame. The narrower band at the left indicates that the shutter curtain travels more slowly initially, accelerating across the frame as the following bands become wider.

We can do better than this, right?

24 kHz light

As each band constitutes a single measurement, we would ideally have more bands covering the full image. As mentioned previously, we observe distinct bands only up to a pulse frequency of 2 kHz; beyond that the bands overlap. We lose the definition of single pulses, the peak amplitude decreases due to the 1/4000 s moving average, and the signal-to-noise ratio unfortunately drops, as seen below:

Observed banding profile across the image's vertical axis. Light switching at 2, 4, 8 and 16 kHz captured with an exposure time of 1/4000 s. Full image height of 3456 px.

However, as long as the oscillations in the resulting signal are still perceptible we should be good. It seems this signal holds up to about 24 kHz before the bands blend into the noise, so we’ll continue with 24 kHz.

Observed banding profile across the image's vertical axis. Light switching at 24 kHz captured with an exposure time of 1/4000 s. Full image height of 3456 px.

We count a total of 64 cycles6 of 1/24 ms each captured across the image. That’s 2.67 ms for the shutter curtains to travel across the frame. Our initial estimate of 3.0 ms was not far off. In practice this means that significant motion across the frame within 1/375 s will be distorted7, not the 1/4000 s the exposure time would imply.

Velocity variability

What about the variable shutter curtain velocity? Let’s look at our signal in the frequency domain. We would expect a single sharp peak at 64 cycles per image height if the velocity was constant across the frame. Instead, we see a frequency band of increased magnitude somewhere between 50 and 100 cycles per image height. It’s plausible that this band is caused by the variability in the observed light pulse frequency.

Frequency domain spectrum for a light switching at 24 kHz captured with an exposure time of 1/4000 s. The high magnitude at the very low end is caused by signal drift across the frame.

A short-time Fourier transform calculates how the measured signal frequency changes over time. We use a sliding window of 512 px8, and observe a much more localized peak within each window. Tracking this frequency peak across the frame, we indeed see the frequency change from 85 to 52 cycles per image height.

Peak frequency across the image's vertical axis. Averaged across a 512 px moving window due to short time Fourier transform.

Let’s convert this to more relatable px/ms units, given that one cycle is 1/24 ms and the frame is 3456 px high.
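The conversion is straightforward (plain Python; the endpoint frequencies of 85 and 52 cycles per image height are read off the tracked peak above):

```python
PULSE_KHZ = 24.0    # light pulse frequency
HEIGHT_PX = 3456    # image height in pixels
HEIGHT_MM = 13.0    # sensor height

def curtain_velocity(cycles_per_frame):
    """Convert a peak frequency in cycles per image height into the
    curtain velocity, in px/ms and m/s."""
    px_per_ms = PULSE_KHZ * HEIGHT_PX / cycles_per_frame
    m_per_s = px_per_ms * HEIGHT_MM / HEIGHT_PX  # px/ms -> mm/ms = m/s
    return px_per_ms, m_per_s

print(curtain_velocity(85))  # start of travel: ~3.67 m/s
print(curtain_velocity(52))  # end of travel: ~6.0 m/s
print(curtain_velocity(64))  # frame average: ~4.9 m/s
```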

Shutter curtain velocity across the image's vertical axis. Averaged across a 512 px moving window due to short time Fourier transform.

Here’s a mechanical perspective. The shutter curtains travel upwards9 at an average velocity of 4.88 m/s for 2.67 ms across the 13.0 mm frame10. They accelerate from 3.67 m/s at the bottom to 6.0 m/s at the top of the frame.

Shutter curtain motion

To wrap things up, let’s plot the observed relation between time and image position being exposed11. Note that the shutter curtain movement path is curved due to variable shutter curtain velocity.

Shutter curtain position over time during an exposure of 0.25 ms. The exposed sensor area is indicated in gray.

We see again that the shutter curtains travel across the frame in 2.67 ms. For an exposure time of 1/4000 s, or 0.25 ms, the whole image is captured only after 2.92 ms.

And we’re done!

Conclusion

We’ve seen that an image is not captured at one moment in time. There can be quite a difference between the exposure time and the actual time it takes to form a complete image, about a factor of ten in this case. In practice, using a mechanical focal plane shutter should be mostly OK. One exception that requires special attention is very fast motion, such as airplane propellers, where distortion will be an issue. Another exception, as we have seen, is artificial lighting, which is prone to banding.

Bottom line, high shutter speeds and short exposure times may just be a bit less impressive in reality. They do offer the full advertised brightness and motion blur control. However, an attempt at capturing rapid motion might end up looking like the molten clocks in Salvador Dalí’s The Persistence of Memory.


  1. Due to the color filter array, we’re sampling each channel only every other pixel. One cycle every four pixels seems more appropriate then, or 288 kHz. It is best to keep far from that limit either way. ↩︎

  2. An Arrow CYC1000 board with an Intel Cyclone 10 LP FPGA and a 12 MHz oscillator. A bit excessive for our needs, but it has eight on-board directly controllable LEDs which will be handy later on. Any cheap microcontroller development board with an LED will do for now. ↩︎

  3. Sigma 56mm F1.4 at f/1.4. ↩︎

  4. Dcraw is used to convert the raw .ORF file to a linear 16 bit .PPM file:
    dcraw -4 -o 0 -M -r 1 1 1 1 *.ORF ↩︎

  5. The banding is generated by the shutter, positioned just in front of the imaging sensor, thus we assume a linear relationship between the shutter position and the exposed pixels. Geometric distortion correction, if enabled, would correct the projection of an image onto the sensor by the lens, and distort the relationship between the shutter position and the processed image coordinates. ↩︎

  6. For the lazy: filter the signal and detect zero-crossings. ↩︎

  7. Horizontal motion will be skewed across the frame, while vertically moving areas will be compressed or expanded depending on direction. ↩︎

  8. The window size determines the number of samples considered for each local frequency spectrum, and acts as an averaging moving window over time. There’s a trade-off: a smaller window will measure the frequency less accurately, while a larger window will average over a larger area and smooth out local variations. ↩︎

  9. The shutter starts at the top of the observed image and accelerates as it travels towards the bottom of the image. In reality the mechanical shutter curtains travel upwards, as the image is projected onto the imaging plane upside-down. ↩︎

  10. In reality the focal plane shutter is not located directly above the imaging sensor. Most digital cameras have a filter stack consisting of an infrared cut-off filter and an anti-aliasing filter located between the sensor and the shutter. This stack is about 4 mm thick on Micro Four Thirds cameras, offsetting the shutter location by at least that amount. Depending on the lens geometry, the direction of the incoming light will not always be perpendicular to the sensor plane, thus this offset affects the sensor position to shutter position mapping. ↩︎

  11. The performed measurement is an average of the first and second curtain motions; we cannot differentiate between the two. This measured average is used for both the first and second curtain paths in the image. We do not measure the exact exposure time; the time between the first and second shutter paths is assumed constant. ↩︎



May 4, 2020 Galin Bajlekov CC BY-NC-SA 4.0