View Full Version : Larger prints with FF given the same resolution?



Markus Jais
07-18-2012, 02:57 PM
Hi,

let's assume I have 3 cameras with the same 18 MP resolution

- one FF with 18 MP
- one 1.5/1.6 crop with 18 MP
- one compact camera with a 4x or 5x crop and 18 MP

Given current sensor technology, can I make larger prints with images taken with the FF and see a difference in resolution in the
print? Let's assume a really large print like 60x90 cm or 80x120cm (about 32x48 inches).
Of course noise will be different but will resolution be different?

Or is this a stupid question that doesn't make any sense?

Given so much marketing about sensors, so many different opinions and so much nonsense on some web pages, I am still confused :-)

Markus

Roger Clark
07-19-2012, 12:11 AM
Hi Markus,

I would say that the main difference in the 3 scenarios would be the quality of the lens, with the exception that the 4x or 5x crop camera would have a tough time even equalling the larger sensor cameras, assuming decent lenses on all three. Diffraction will hurt the smaller sensor camera the most, and could impact the 1.5/1.6 crop relative to the FF camera. So it really depends on f/ratio and lens quality. Is it a different focal length on each camera (same field of view), or a constant focal length (focal-length-limited situation)?

So there are a lot of variables beyond the sensor. Probably most important (and another variable) is the experience and ability of the person behind the camera, and how familiar they are with the camera. Finally, if the Etendue were equalized, there would be no difference. It may not be possible to equalize the 4x or 5x crop with the FF, but it often can be done with 1.5 or 1.6 versus FF.

Roger

Markus Jais
07-19-2012, 01:14 AM
Hi Roger, thanks for the explanation. Let's make it more concrete.
Let's ignore the 4x crop and just compare a 1.6 crop and a FF camera with both 18 MP.

Let's assume I photograph a Puma with a 4/500L IS and the Puma is approachable and doesn't move (OK, I am dreaming here) and I can get the same frame-filling composition with the same out-of-focus green BG. And let's assume I shoot both at ISO 100 with proper long lens technique and get a technically perfect picture of the Puma in perfect light. In both shots the Puma covers the same number of pixels. And let's assume the noise is as low as technically possible in both shots.

Would I see more details in the Puma in a large print when the image was taken with the FF camera? Or are there still too many variables?

Markus

Roger Clark
07-19-2012, 08:30 AM
Hi Roger, thanks for the explanation. Let's make it more concrete.
Let's ignore the 4x crop and just compare a 1.6 crop and a FF camera with both 18 MP.

Let's assume I photograph a Puma with a 4/500L IS and the Puma is approachable and doesn't move (OK, I am dreaming here) and I can get the same frame-filling composition with the same out-of-focus green BG. And let's assume I shoot both at ISO 100 with proper long lens technique and get a technically perfect picture of the Puma in perfect light. In both shots the Puma covers the same number of pixels. And let's assume the noise is as low as technically possible in both shots.

Would I see more details in the Puma in a large print when the image was taken with the FF camera? Or are there still too many variables?

Markus

Hi Markus,

Because you are moving to get the same pixels on subject, there would be essentially no difference in detail if pixel limited. If you were stopped down so the lens is diffraction limited, at the same f/ratio in both cases, the FF camera image would have slightly higher contrast in the fine details.

Note you put the crop camera at a disadvantage because you would have to move further away. Thus, you would get less light per pixel (so a lower signal-to-noise ratio). The other case is that you make both images from the same distance, and then the crop sensor camera would show more detail on the subject (again with lower S/N, because you chop up the available light into smaller bins (pixels)).
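Roger's first case (same lens, crop body moved back to keep the framing) can be sketched with made-up numbers; everything below is illustrative, not measured:

```python
import math

# Illustrative sketch: same 500 mm lens (same aperture area), same
# framing, so the 1.6-crop body stands 1.6x further back. Total light
# collected from the subject scales as 1 / distance^2, and it lands on
# the same number of pixels in both shots.
crop_factor = 1.6
photons_ff = 2_560  # photons per pixel on the full-frame body (made up)

photons_crop = photons_ff / crop_factor**2
print(photons_crop)          # ~1000 photons/pixel on the crop body

# Shot-noise-limited S/N per pixel goes as sqrt(photons collected):
snr_gain = math.sqrt(photons_ff / photons_crop)
print(round(snr_gain, 2))    # FF advantage equals the crop factor
```

So, with these assumed numbers, the full-frame shot ends up with a per-pixel S/N advantage equal to the crop factor, purely because of the extra shooting distance.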

Roger

Markus Jais
07-19-2012, 02:34 PM
Hi Roger,

thanks for the detailed explanation, this is very interesting and helpful.

Markus

Stuart Hill
08-20-2012, 04:29 PM
Really sorry to ask Roger, but why, when moving further away with a crop sensor, does it mean less light?

Regards.
Stu.

Graeme Sheppard
08-21-2012, 09:05 AM
Stu,
The image is created on the sensor from light reflecting off the subject, which reflects at random angles.
Light generally follows an inverse-square law, which basically means it spreads out the further it travels from the source (the same amount of light covers a larger area).

This is easily seen with light bulbs. If you get very close they can be blindingly bright, but moving away makes them appear dimmer. Less of the light is reaching your eye.

Stuart Hill
08-21-2012, 04:43 PM
Thanks Graeme. That makes sense. Bit of a DOH! moment for me.

So if I took a spot meter reading off an A4 sheet of white paper at 4ft and then metered the same sheet from 40ft the reading would be different?

Ed Cordes
08-22-2012, 08:15 PM
Thanks Graeme. That makes sense. Bit of a DOH! moment for me.

So if I took a spot meter reading off an A4 sheet of white paper at 4ft and then metered the same sheet from 40ft the reading would be different?

I don't think that is correct. I believe the reading as described would be the same as long as the distance from the light source to the A4 paper were constant. It is the light source to subject distance that is the critical factor.

Graeme Sheppard
08-23-2012, 06:16 AM
I'm not really sure.
Let's assume that you are going back to 40 ft but still taking exactly the same photo; then you would need to change from an ultrawide-angle lens to a telephoto one.

How would that affect metering and exposure?

Roger Clark
08-23-2012, 08:15 AM
I don't think that is correct. I believe the reading as described would be the same as long as the distance from the light source to the A4 paper were constant. It is the light source to subject distance that is the critical factor.

Try this example. In a large room, walk from one side of the room to the other, staring at a point on the wall. Does the wall appear brighter as you get closer? No. As you get closer, a small spot on the wall will show the 1/r squared law, but that is balanced by the fact that moving closer we see a smaller spot. The two cancel (more light via the 1/r squared law versus more spatial resolution).

Same with a camera and lens. While exposure per pixel will be the same as you change distance, the area on the paper per pixel also changes, balancing the two. If you change lenses to a longer focal length at the same f/ratio, one actually collects more light even though the exposure is the same. This is because the longer focal length (same f/ratio) collects more light with its larger aperture diameter, but the longer focal length also spreads it out, so the increased light is spread over more pixels, again balancing the light per pixel. Sum all the light from all the pixels on the paper and one would see that more light was collected by the bigger lens.

Roger

Graeme Sheppard
08-23-2012, 09:58 AM
Thanks Roger, that helped me a lot. Now I can think it through more mathematically, for those inclined.
Correct me if I'm wrong.

You take a photo of a sheet of A4 paper that fills the frame exactly.

Using a 32mm lens you'd be pretty close. At f/4 the aperture is 8mm wide (32/4).
Using a 320mm lens you'd back off a lot, so light drops off following that 1/d^2 rule. But at f/4, the aperture is 80mm wide.
Since the area of the aperture is given by (pi * r^2), we can therefore see that the effect of increasing distance is balanced by increasing aperture size, as Roger's analogy showed. I'm not fussed about following this through more rigorously.
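Graeme's numbers can be checked with a quick sketch. The helper `aperture_area_mm2` is my own illustrative function, and the distances are in arbitrary units; only the 1-vs-10 ratio matters:

```python
import math

def aperture_area_mm2(focal_length_mm, f_stop):
    """Area of the entrance pupil: diameter = focal length / f-number."""
    diameter = focal_length_mm / f_stop
    return math.pi * (diameter / 2) ** 2

area_32 = aperture_area_mm2(32, 4)    # 8 mm wide pupil
area_320 = aperture_area_mm2(320, 4)  # 80 mm wide pupil

# Relative light reaching the sensor from the framed sheet of paper:
# (aperture area) / (distance^2), with distances 1 vs 10 units.
light_near = area_32 / 1**2
light_far = area_320 / 10**2

print(area_320 / area_32)      # ~100: aperture area grows 100x
print(light_far / light_near)  # ~1: falloff and aperture area cancel
```

The 10x longer lens at the same f/stop has a 10x wider pupil, hence 100x the area, which exactly cancels the 100x inverse-square falloff from standing 10x further away.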

Graeme Sheppard
08-23-2012, 10:18 AM
Actually, thinking about my calculation again, the light gathered by both lenses should be exactly the same at the same f/stop, shouldn't it?

I wonder if Roger's explanation went a little bit wrong at the end. The intensity of light passing through the aperture, and therefore hitting the sensor, is always the same for a particular f/stop regardless of the lens used.

Stuart Hill
08-24-2012, 03:50 PM
My head hurts! So, basically, the longer the lens I can use and the wider the aperture, the better the image as far as S/N ratio goes? Therefore less noise?

Graeme Sheppard
08-24-2012, 06:19 PM
My head hurts! So, basically, the longer the lens I can use and the wider the aperture, the better the image as far as S/N ratio goes? Therefore less noise?

I think that lenses don't really have anything to do with noise; that is controlled by the sensor. The FF sensor will have better S/N characteristics.

Here we were considering changing sensor size and using different lenses to allow for the crop factor of a small sensor.

Roger Clark
08-26-2012, 12:25 AM
Actually, thinking about my calculation again, the light gathered by both lenses should be exactly the same at the same f/stop, shouldn't it?

Hi Graeme,
In the example given, the light gathered is exactly the same. But in that example, with the longer lens one moved further away.



I wonder Roger's explanation went a little bit wrong at the end. The intensity of light passing through the aperture, and therefore hitting the sensor, is always the same for a particular f/stop regardless of the lens used.

This concept is incorrect, and is a broader case of the example given above. In the above example, we moved further away, which reduced the light from the subject. The other case is to not change distance. Then the larger aperture collects more light even at the same f/ratio. Examples:

1) Photograph stars with a 50 mm lens at f/4 and a 500 mm lens at f/4; for example, a 10 second exposure of the celestial pole (e.g. Polaris in the northern hemisphere). The 500 mm lens collects more light and shows fainter stars. Stars 100 times fainter!

2) A bird small in the frame: say the bird filled 100 pixels with the 50 mm lens, and one recorded 1000 photons/pixel, thus collecting 100*1000 = 100,000 photons from the bird. With the 500 mm lens, the bird would fill 10,000 pixels with 1000 photons/pixel for a total light from the bird of 10,000*1000 = 10 million photons, or 100 times more light from the bird.
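Roger's bird arithmetic can be checked directly; the pixel and photon counts below are his illustrative numbers, not measurements:

```python
# Same f/ratio, same exposure, same distance. The 500 mm lens puts 10x
# more linear pixels on the bird, so 100x the pixels, each still
# receiving the same ~1000 photons.
photons_per_pixel = 1000

pixels_50mm = 100        # bird covers 100 pixels with the 50 mm lens
pixels_500mm = 10_000    # 10x focal length -> 100x pixels on the bird

total_50mm = pixels_50mm * photons_per_pixel    # 100,000 photons
total_500mm = pixels_500mm * photons_per_pixel  # 10,000,000 photons

print(total_500mm // total_50mm)  # 100 -- 100x more light from the bird
```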

Roger

Roger Clark
08-26-2012, 12:32 AM
I think that lenses don't really have anything to do with noise; that is controlled by the sensor. The FF sensor will have better S/N characteristics.

Here we were considering changing sensor size and using different lenses to allow for the crop factor of a small sensor.

Actually, lenses have everything to do with noise, as the noise we see in our images is the square root of the number of photons collected, and it is the lens that collects the light. The sensor is merely the bucket that holds the collected light. It is the sensor that has little to do with noise, unless the pixels are too small to hold all the electrons generated from the photons hitting the sensor. Basically, aperture times exposure time sets the signal and the signal-to-noise ratio.
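The square-root relationship Roger describes is photon shot noise; a minimal sketch (the `snr` helper is my own illustrative function):

```python
import math

# Photon shot noise: for photon counting, noise ~= sqrt(signal), so
# S/N = N / sqrt(N) = sqrt(N). Collecting 100x more photons (bigger
# aperture and/or longer exposure) improves S/N by 10x.
def snr(photons):
    return photons / math.sqrt(photons)  # equals sqrt(photons)

print(snr(10_000))     # 100.0
print(snr(1_000_000))  # 1000.0
```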

See the discussion on what ISO is:
http://www.birdphotographers.net/forums/showthread.php/100597-What-is-ISO-on-a-digital-camera

Roger

Graeme Sheppard
08-26-2012, 07:32 AM
Hi Graeme,
In the example given, the light gathered is exactly the same. But in that example, with the longer lens one moved further away.



This concept is incorrect, and is a broader case of the example given above. In the above example, we moved further away, which reduced the light from the subject. The other case is to not change distance. Then the larger aperture collects more light even at the same f/ratio. Examples:

1) Photograph stars with a 50 mm lens at f/4 and a 500 mm lens at f/4; for example, a 10 second exposure of the celestial pole (e.g. Polaris in the northern hemisphere). The 500 mm lens collects more light and shows fainter stars. Stars 100 times fainter!

2) A bird small in the frame: say the bird filled 100 pixels with the 50 mm lens, and one recorded 1000 photons/pixel, thus collecting 100*1000 = 100,000 photons from the bird. With the 500 mm lens, the bird would fill 10,000 pixels with 1000 photons/pixel for a total light from the bird of 10,000*1000 = 10 million photons, or 100 times more light from the bird.

Roger

I'm not quite following this. I think we may be talking from slightly different angles, confusing one another. I'm also learning and broadening my knowledge :)

For (1), the stars are distant point sources of light, which is quite different to other examples, so I'm not sure of its relevance (I agree with it, though).

For (2), I think this is true only if you assume that the bird is the only thing providing light to the lens. For the 50mm lens, under most normal circumstances, the bird may only fill 100 pixels, but the other 9,900 would be filled, to some extent, by something else. So yes, we're getting more light from the bird in this example, but we're not actually getting more light overall (give or take other factors in the composition).

Cheers,
Graeme.

Graeme Sheppard
08-26-2012, 07:46 AM
Actually, lenses have everything to do with noise, as the noise we see in our images is the square root of the number of photons collected, and it is the lens that collects the light. The sensor is merely the bucket that holds the collected light. It is the sensor that has little to do with noise, unless the pixels are too small to hold all the electrons generated from the photons hitting the sensor. Basically, aperture times exposure time sets the signal and the signal-to-noise ratio.
Roger

Roger,

I still maintain that in modern optics the lens makes no significant difference to the S/N ratio, although I agree that its settings do (aperture and exposure time).
As I understand the other thread to say, the noise we see in our images is determined by how the sensor deals with the light that reaches it, so isn't it the sensor that controls things (for a fixed setup)?

Note that I'm following on from my last post, and its assumptions, and possible misconceptions.

Cheers again,
Graeme.