Ok, at the risk of being blunt, if you need to ask this, you might be better off leaving things well alone - or doing as the other person suggested and using a transformer to lower the voltage to begin with.
First of all, you really need to plonk a diode in line: LEDs do not like AC, so the diode will turn things into "raggy DC". Using a bridge rectifier would be better and would reduce the flickering, as would plonking a capacitor over the DC output. Either way, rate that capacitor at 400V or more - once a cap is fitted it can charge toward the peak of the mains, which is around 311V on a 220V supply.
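As a quick sanity check on that rating, here is the peak-voltage sum in Python (a rough sketch, assuming a 220V RMS supply):

```python
import math

def peak_voltage(v_rms):
    """Peak of a sine wave, given its RMS value."""
    return v_rms * math.sqrt(2)

# A smoothing capacitor can charge toward the mains peak,
# so its voltage rating needs to sit above this with some margin.
print(round(peak_voltage(220)))  # around 311V
```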
Still with me?
You now need to work out the voltage drop over the LEDs - this varies according to type, and mainly colour: about 1.8V for a typical red LED and around 4V for a white one.
You then need to look at how many volts you have "left" after you have taken into account all of the LEDs in series.
EG: You take 220V, use a single diode (roughly half, call it 110V) and then pop a cap over to stop the flicker (call it around 160V under load).
You now have 30 red, 20 green and 10 blue LEDs in series. Assume a 1.8V drop on red, 2.2V on green and 4.1V on blue.
54V + 44V + 41V = 139V
This gives us about 21V to "burn off" with a resistor.
Let us assume you want to drive the LEDs at 20mA.
R = V/I, therefore R = 21/0.02 = 1050 Ohms
Power = I x I x R (current squared times resistance), so the resistor will be dissipating about 0.42W. Allowing for plenty of play, and wanting a warm rather than hot resistor, 1W ideally.
So a 1K2 resistor - or 1K for a bit brighter - would be your answer here (for this number and type of LEDs), rated at 1W or better.
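The whole calculation above can be sketched in a few lines of Python - the supply voltage and forward drops are the same assumed figures as in the example, so swap in your own LEDs' datasheet values:

```python
# Assumed smoothed supply and per-colour forward drops from the example.
SUPPLY_V = 160.0
LEDS = [(30, 1.8), (20, 2.2), (10, 4.1)]  # (count, forward drop in volts)
CURRENT_A = 0.020                          # 20mA target drive current

total_drop = sum(count * vf for count, vf in LEDS)  # series drop across all LEDs
leftover = SUPPLY_V - total_drop                    # volts the resistor must burn off
resistance = leftover / CURRENT_A                   # R = V/I
power = CURRENT_A ** 2 * resistance                 # P = I^2 * R

print(f"Series drop: {total_drop:.0f}V, resistor: {resistance:.0f} Ohms, "
      f"dissipating {power:.2f}W")
```

Then pick the nearest standard resistor value above (or just below, for a bit more brightness) and at least double the calculated power rating.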
As I said at the beginning, if ONE step of this doesn't make GOOD sense to you, PLEASE leave WELL alone.
PLEASE DO NOT use this as a way to hook up the LEDs exactly as shown WITHOUT first checking their forward voltage drops. I have used typical examples; different models can vary considerably.