Electrical power is measured in watts; consumption over time is measured in watt-hours.
Voltage is equivalent to the pressure of the electricity (110 volts, 220 volts, etc.).
Amperage (amps) is the current of the electricity.
The power company bills you for the energy you consume, in kilowatt-hours. A kilowatt-hour is 1,000 watts used continuously for one hour.
So a 40 watt light bulb (rated at 110 volts) draws 40 watts of power, which works out to 40 watt-hours for each hour it's on. If the 40 watt bulb were rated at 220 volts, it would still use 40 watts of electricity, but it wouldn't draw the same amperage.
Now, if you're trying to figure out the amperage that 40 watt bulb will draw, there's a simple formula: the P.I.E. formula, P = I × E, where P = wattage, I = amperage, and E = voltage.
To figure out the amperage of your 40 watt bulb, take the wattage (40) and divide it by the voltage (110 volts, for example): 40 / 110 ≈ 0.364 amps. So on a 15 amp circuit you could run about 41 of those bulbs before you pop the breaker.
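That bulb arithmetic can be double-checked with a quick Python sketch (the 40 W / 110 V / 15 A numbers are just the example values from above):

```python
# One bulb's current draw: I = P / E
bulb_watts = 40
line_volts = 110
amps_per_bulb = bulb_watts / line_volts  # about 0.364 A

# How many such bulbs a 15 amp breaker can carry
breaker_amps = 15
max_bulbs = int(breaker_amps // amps_per_bulb)

print(round(amps_per_bulb, 3))  # 0.364
print(max_bulbs)                # 41
```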
If you want to figure out your wattage and you know the amperage draw and the voltage, multiply the voltage by the amperage to get the wattage.
Finally, if you know the amperage draw and the wattage, divide the wattage by the amperage and that gives you the voltage.
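All three rearrangements of the P.I.E. formula can be sketched as small Python helpers (the function names are my own, just for illustration):

```python
def amps_from(watts, volts):
    """I = P / E: current drawn at a given wattage and voltage."""
    return watts / volts

def watts_from(amps, volts):
    """P = I * E: power from current and voltage."""
    return amps * volts

def volts_from(watts, amps):
    """E = P / I: voltage from power and current."""
    return watts / amps

# Example: a 110 W load drawing 0.5 A must be on a 220 V line.
print(volts_from(110, 0.5))  # 220.0
```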
This is a lot of info for such a simple question....but now you know.