It might save you $20-$30 per year, depending on how much you play and how much of the PSU's full power you actually use.
OK, let's take the worst-case scenario: a computer running at 100% load, 24/7, all year, on the highest tariff in the mainland USA of $0.178/kWh.
The difference between Bronze, Gold and Titanium works out to about $60-$100 per year in electricity costs, and ONLY if you run the computer at 100% load every second of the year. That may apply to miners, but everyone else would be running at a tenth of that load or less, and usually on a lower tariff.
So for gaming and normal PCs it's about $10 per year if you're lucky (a tenth of the $100 worst-case gap). You'll never get a return on investment.
Definitions:
A_L = maximum wattage required by the computer (e.g. 800 W)
A_W = power (W) consumed from the grid
Tariff = $0.178/kWh (17.8c per kWh)
T = 24/7 operation (8,760 hours per year)
E_f = PSU efficiency (e.g. 82%)
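
If you want to sanity-check the arithmetic yourself, here's the same formula as a minimal Python sketch. Variable names mirror the definitions above; note that it implements the post's approximation for grid draw (load plus the loss fraction on top), whereas the exact figure would be A_L / E_f:

```python
def grid_draw_watts(load_w: float, efficiency: float) -> float:
    """A_W per the post's formula: ((1 - E_f) * A_L) + A_L."""
    return (1 - efficiency) * load_w + load_w

def annual_cost_usd(grid_w: float, tariff: float = 0.178, hours: int = 8760) -> float:
    """Cost ($) = (T * Tariff * A_W) / 1000."""
    return hours * tariff * grid_w / 1000

# Bronze worst case: 82% efficiency, 800 W load
a_w = grid_draw_watts(800, 0.82)                                 # 944.0 W
print(f"{a_w:.0f} W -> ${annual_cost_usd(a_w):,.0f} per year")   # 944 W -> $1,472 per year
```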
Calculation (82% efficiency, Bronze):
A_W = ((1 - E_f) * A_L) + A_L = (2 - E_f) * A_L
A_W = (2 - 0.82) * 800 W
A_W = 1.18 * 800 W
A_W = 944 W
Cost ($) = (T * Tariff * A_W) / 1000
Cost ($) = (8760 * $0.178 * 944 W) / 1000
Cost ($) = $1,472 per year
Calculation (87% efficiency, Gold):
A_W = ((1 - E_f) * A_L) + A_L
A_W = (2 - 0.87) * 800 W
A_W = 1.13 * 800 W
A_W = 904 W
Cost ($) = (T * Tariff * A_W) / 1000
Cost ($) = (8760 * $0.178 * 904 W) / 1000
Cost ($) = $1,410 per year
Calculation (90% efficiency, Titanium):
A_W = ((1 - E_f) * A_L) + A_L
A_W = (2 - 0.90) * 800 W
A_W = 1.10 * 800 W
A_W = 880 W
Cost ($) = (T * Tariff * A_W) / 1000
Cost ($) = (8760 * $0.178 * 880 W) / 1000
Cost ($) = $1,372 per year
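
And all three tiers side by side, to confirm the $60-$100 spread quoted at the top. This is a self-contained sketch using the same worst-case inputs (800 W load, $0.178/kWh, 8,760 hours):

```python
A_L, TARIFF, T = 800, 0.178, 8760   # worst-case load (W), $/kWh, hours per year

def yearly_cost(e_f: float) -> float:
    a_w = (2 - e_f) * A_L           # the post's ((1 - E_f) * A_L) + A_L, factored
    return T * TARIFF * a_w / 1000  # Cost ($) = (T * Tariff * A_W) / 1000

tiers = {"Bronze": 0.82, "Gold": 0.87, "Titanium": 0.90}
costs = {name: yearly_cost(e_f) for name, e_f in tiers.items()}
for name, cost in costs.items():
    print(f"{name:<8} ${cost:,.0f} per year")
print(f"Bronze vs Gold:     ${costs['Bronze'] - costs['Gold']:.0f} per year")      # ~$62
print(f"Bronze vs Titanium: ${costs['Bronze'] - costs['Titanium']:.0f} per year")  # ~$100
```

Scale those deltas by a realistic ~10% duty cycle and you land back at the $6-$10 per year figure from the top of the post.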