# A convex lens forms a real image on a screen placed 60 cm from the object. When the lens is shifted towards the screen by 20 cm, another image is formed on the screen. Determine the focal length of the lens.

I kinda forgot how to solve this one.

Update:

Is there an algebraic way to solve this question?


f = 13.3333 cm

Both images being real, and the object-image distance being constant, mean the numerical values of di and do are swapped when the lens is moved. This means the distance between the two lens positions = |di-do|. For this question we can remove the absolute-value symbols since we know the lens moves "toward the screen".

So we are given:

di-do = deltad = 0.2 m, and

di+do = td = 0.6 m

di = (td+deltad)/2 = 0.4 m

do = td-di = 0.2 m

f = 1/(1/do+1/di) = 1/(5+2.5) = 1/7.5 = 0.133333 m
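If it helps, here's a quick numerical sketch of the arithmetic above, just plugging the given td and deltad into the same equations (variable names mirror the ones used in the answer):

```python
# Solve for di, do, and f from the two given distances,
# using the thin-lens equation 1/f = 1/do + 1/di.
td = 0.6      # object-to-screen distance (m)
deltad = 0.2  # lens displacement (m)

di = (td + deltad) / 2   # image distance at the first lens position
do = td - di             # object distance at the first lens position
f = 1 / (1/do + 1/di)    # thin-lens equation rearranged for f

print(di, do, f)  # di = 0.4 m, do = 0.2 m, f ~ 0.1333 m
```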

EDIT: I wondered if the thumbs-down had a basis in reality so I checked:

Starting with di = 0.4 m,

do = 1/(1/f-1/di) = 1/(1/0.133333-1/0.4) = 0.2 m

Moving the lens 20 cm toward the screen so di = 0.2 m,

do = 1/(1/0.133333-1/0.2) = 0.4 m

In both cases, do+di = 0.6 m, verifying the above answer.
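The same sanity check in a few lines of throwaway code, looping over the two lens positions (f is written here as di*do/(di+do), which equals the 0.133333 m above):

```python
# For each lens position, recover do from the thin-lens equation
# and confirm that do + di stays equal to the 0.6 m screen distance.
f = 0.4 * 0.2 / (0.4 + 0.2)   # f = di*do/(di+do) = 2/15 m

for di in (0.4, 0.2):
    do = 1 / (1/f - 1/di)     # thin-lens equation solved for do
    print(di, do, di + do)    # di + do should be 0.6 m both times
```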

Perhaps the TD-er is just nostalgic for the good old days at the Colosseum.

EDIT: Re your added question, I believe my answer does provide an algebraic solution. But you first have to use logic based on the clues in the question, namely that both images are real ("on a screen") and that di+do is constant. That leads to the conclusion that moving the lens interchanges the values of di and do, and thus that di-do = 20 cm. The solution then uses the thin-lens equation, 1/f = 1/do+1/di, rearranged to f = 1/(1/do+1/di).
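In case a single closed-form expression is what you're after: substituting di = (td+deltad)/2 and do = (td-deltad)/2 into 1/f = 1/do+1/di collapses to f = (td^2 - deltad^2)/(4*td), the standard displacement-method result. A quick sketch checking it against the numbers above:

```python
# Closed-form displacement-method formula:
#   f = (td**2 - deltad**2) / (4 * td)
# obtained by substituting di = (td+deltad)/2, do = (td-deltad)/2
# into the thin-lens equation 1/f = 1/do + 1/di.
td = 0.6      # object-to-screen distance (m)
deltad = 0.2  # lens displacement (m)

f = (td**2 - deltad**2) / (4 * td)
print(f)  # ~0.1333 m, i.e. 13.33 cm
```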