Interestingly, CPU use and CPU power drain are non-linearly related:
100% CPU use = 62W
75% CPU use = 60W
25% CPU use = 35W
12% CPU use = 22W
0% CPU use = 7W (!)
Sadly, I didn't manage to rig things for 50% use, but I suspect it is not much less than 60W. It's a four-core CPU with hyperthreading, and running it with HT doesn't use much more electricity than running without it. These numbers were obtained by running the ECM subproject only, with a varying number of work units. Percent CPU use was monitored with Windows Task Manager and the CPU power draw with Core Temp. My CPU is an i7 4770S at 3,9GHz.
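For what it's worth, here is a quick sketch in Python of a straight-line interpolation between the measured points (not something I ran on the machine); it puts 50% use at around 47W, probably an underestimate given how the curve flattens out towards full load:

# Piecewise-linear interpolation of the measured (% CPU use, watts) points.
# The real curve bends over towards full load, so this is likely a low estimate.
measured = [(0, 7), (12, 22), (25, 35), (75, 60), (100, 62)]

def power_at(load):
    """Interpolate package power in watts at a given CPU load in percent."""
    for (x0, y0), (x1, y1) in zip(measured, measured[1:]):
        if x0 <= load <= x1:
            return y0 + (y1 - y0) * (load - x0) / (x1 - x0)
    raise ValueError("load outside the measured range")

print(power_at(50))  # 47.5
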
By using my electricity "smart meter" and turning BOINC off for a week, I discovered that the difference in power consumption is about 1,6kWh per day, and this must be equal to the difference in CPU power consumption plus the difference for the rest of the system, mostly VRM. Consuming 1,6kWh per day corresponds to a 67W continuous load.
So my PC consumes 67W, including the differential for VRM, chipset and memory, over and above what it consumes in my normal use scenario. 2016 was a leap year, so over five years of crunching there are five 365-day years plus one day to account for. Yes, a totally unjustifiable degree of precision, but never mind. My electricity price is about 20c per kWh.
20c per kWh x 0,067kW x 24h x (5 x 365 + 1) days = €590 to two significant figures, for roughly 3MWh, spread over five years.
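The same arithmetic as a few lines of Python, in case anyone wants to plug in their own tariff and load (the 67W and 20c figures are mine from above):

# Back-of-envelope cost of the extra load while crunching, using my figures.
price_eur_per_kwh = 0.20   # my electricity tariff
extra_load_kw = 0.067      # 67 W continuous extra draw
days = 5 * 365 + 1         # five years, including the 2016 leap day

energy_kwh = extra_load_kw * 24 * days       # ~2936 kWh, just under 3 MWh
cost_eur = energy_kwh * price_eur_per_kwh    # ~587 EUR, 590 to two significant figures
print(f"{energy_kwh:.0f} kWh, {cost_eur:.2f} EUR")
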
The figure for the ECM subproject is less than this because I also crunched for Muon. Let's split it according to the subproject credits. This is not as simple as it should be. I commented some time ago on the fact that the subproject totals do not add up!
Muon 2001517
ECM 21627226
Evolution 2035
Odd Weird 1747
Crunch 1005
Total 23633530
BUT my total score according to the stats is 23888617. There is a discrepancy of 23888617 - 23633530 = 255087 credits too many, and this is too big to be accounted for by credits in the pipeline. What to do? Let's just assume that we are dealing with Muon and ECM only and round the subproject numbers, ignoring the stats total. Obviously this is wrong, but I don't see a practical alternative. An error of 250k in 24M is only about 1%, but it's irritating that in such a mathematics-intensive project the credit numbers don't add up. Yes, I'm complaining about having too many credits!
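The bookkeeping, spelled out, in case anyone wants to check their own totals the same way (the credit figures are the ones listed above):

# Subproject credits as listed above, plus the overall total from the stats page.
subproject_credits = {
    "Muon": 2001517,
    "ECM": 21627226,
    "Evolution": 2035,
    "Odd Weird": 1747,
    "Crunch": 1005,
}
stats_total = 23888617

summed = sum(subproject_credits.values())   # 23633530
discrepancy = stats_total - summed          # 255087 credits unaccounted for
print(summed, discrepancy, f"{discrepancy / stats_total:.1%}")  # about 1%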

Muon 2.000.000
ECM 21.600.000
So ECM is 21,6/23,6 = 91,5%, and 91,5% of €587,24 = €537,50; divided by the four factors that is €134, so let's call it €135 per prime factor, or roughly 0,7MWh each.
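Or in code, the same split (again my numbers; the kWh total is the one from the cost snippet above):

# Apportion the total cost and energy between ECM and Muon by credit share,
# then divide the ECM share by the four factors found.
total_cost_eur = 587.24
total_kwh = 2936
ecm_share = 21_600_000 / 23_600_000                # ~91.5 %

cost_per_factor = total_cost_eur * ecm_share / 4   # ~134 EUR
kwh_per_factor = total_kwh * ecm_share / 4         # ~672 kWh, roughly 0.7 MWh
print(f"{cost_per_factor:.0f} EUR and {kwh_per_factor:.0f} kWh per factor")
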
Hope it was worth it! The bare minimum for these four factors is about €540, or some 2,7MWh!
47188825045808432173014562283034244106868084583502567641
131781922673757513206895612112252415462126150388431
3386341295396416572865813774391866633970590764225481
1465420354666397042855818942073858707531680887191
As a final note, the energy expenditure and carbon footprint are not anything like 100% wasted. This PC sits in my living room and consequently serves as an "intelligent room heater", reducing the load on the central heating during the colder part of the year by perhaps 2,5kWh per day (a guess) while it is running BOINC, which is a significant contribution. I don't have air conditioning, so this is not counterbalanced by a cooling bill in the warmer months. Perhaps intelligent room heaters are a way forward in reducing environmental impact: ordinary heaters only convert fuel to heat, whereas computers produce data as a useful byproduct.

It'd be interesting to see cost comparisons from other users based on their own CPU and subproject credit breakdowns. I suspect my figures are towards the top end of the spread: looking at my position in the ECM subproject, most participants around me have considerably more factors for similar credit totals, so the mean cost per factor for the subproject overall is probably lower than for my system.
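To keep any such comparisons consistent, something like this little helper (hypothetical, just a generalisation of the arithmetic above) would do the job:

def cost_per_factor(extra_watts, days, price_per_kwh, subproject_share, factors_found):
    """Electricity cost per factor: a continuous extra draw in watts over a number
    of days, priced per kWh, scaled by the subproject's share of total credit."""
    kwh = extra_watts / 1000 * 24 * days
    return kwh * price_per_kwh * subproject_share / factors_found

# My figures from above as a sanity check: ~134 EUR per factor.
print(round(cost_per_factor(67, 5 * 365 + 1, 0.20, 21.6 / 23.6, 4)))
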
Happy crunching!
Dunckx