Published on  29.08.2021

The Greenhouse Defect - The most disruptive site on climate science

The 2xCO2 forcing disaster

In this article I am going to analyze CO2 forcing estimates, put them into perspective, account for the major flaws in them, and finally produce an accurate estimate. It is quite a significant achievement, I would think, as for the first time in the history of “climate science” this kernel of climate modelling will be done correctly. Brace for impact!

It is one of the most profound questions of climate science: what is the ECS (equilibrium climate sensitivity) for a given delta in CO2? In practice the term is commonly applied to a doubling of the CO2 concentration in the atmosphere, though it could be applied to any kind of perturbation. Since we are doing an audit of "climate science", we will just stick with that.

The starting point of ECS is how much forcing a doubling of CO2 (2xCO2) would cause per se. The literature gives us 3.7W/m2. The question then is what amount of warming this forcing will translate into, and the term Lambda (~0.3) is usually used as a multiplier (confusingly both to convert radiation to temperature, but also vice versa, in which case it would be 1/0.3 ≈ 3.33). So 2xCO2 should warm the surface by 3.7 x 0.3 = 1.11K. We can make this more comprehensible by simply assuming that 240W/m2 emissions TOA correspond to a 288K surface temperature and then checking for the difference if we add 3.7W/m2: 288 - ((240-3.7)/240)^0.25 x 288 = 1.12K. This part is not complicated and is deeply rooted in the SB law. For any percent change in emissivity, a quarter of a percent change in temperature will compensate, due to the power-4 law. So we can calculate Lambda as (1/240)/4 x 288 = 0.3. It is so trivial that arguably the term “Lambda” has only been introduced to make it look more “scientific”.
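For anyone who wants to verify this arithmetic, here is a minimal Python sketch of the two calculations above (all numbers are taken straight from the text; nothing else is assumed):

```python
# Warming implied by a 3.7 W/m2 forcing, scaled off 240 W/m2 TOA
# emissions and a 288 K reference temperature (the article's figures).
TOA = 240.0  # emissions at top of atmosphere, W/m2
T = 288.0    # reference temperature, K
F = 3.7      # 2xCO2 forcing, W/m2

# temperature delta from removing F out of the emission budget (SB law)
dT = T - ((TOA - F) / TOA) ** 0.25 * T

# "Lambda": a quarter percent temperature change per percent radiation change
lam = (1.0 / TOA) / 4.0 * T

print(round(dT, 2))   # 1.12
print(round(lam, 2))  # 0.3
```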

Per se a doubling of CO2 would cause a warming of 1.1K, and that figure is largely undisputed, though some sources name slight deviations such as 1K or 1.2K. By undisputed I mean there are plenty of "critical" climate sites (and sides) which all agree up to this point. From there on, since 1.1K is not quite enough to cause any climate emergency, it is all about feedbacks. The most significant feedback of course would be vapor, as a) it is assumed to be the most important GHG and b) its abundance depends on temperature. For every Kelvin of additional temperature, the atmosphere can (and will) hold roughly 6.5% more vapor. Generally it is assumed that vapor would roughly double any warming due to CO2 alone.

From there on the gloves are off and the art of climate modelling begins. Is cloud feedback positive or negative? What if the albedo changes due to less ice and snow? If the permafrost warms, it could release unknown amounts of methane and CO2, causing a tipping point of no return. Will oceans behave like warm lemonade and quickly release all of their CO2? Questions upon questions, and it is easy to see how different models will project different ECSs. Yet the consensus has built around an ECS of some 1.5-4.5K. Note: this is not even with 95% confidence, but far less.

When you understand the nature of a thing, you know what it’s capable of (Blade)

Of course I only explained all this to give an overview of the status quo of "climate science". It does not mean I agree with it. Given I have already explained how GHGs hardly cause any GHE (though that is subject to definitions), you might already guess the above logic contains plenty of blunders. Sorting them out will be fun and revealing. Also, we should honor the irony in the way that critical thinkers have been trying to crack this nut for decades now, but all failed more or less, apart from some mediocre criticism. So I guess the mission, should I accept it, will be impossible! Just kidding..

Indeed, by now it is one of the easier missions to decode and disassemble the core of climate models, fix the bugs, and put it back together. This article will start with 2xCO2 and its forcing. Btw. some of the older literature is quite useful, since it tends to explain certain logics which are so “understood” today that they will hardly get mentioned.

The dirty Lambda Trick

Cess et al (1990)1:


Or we can calculate it ourselves: (289/288)^4 x 240 - 240 = 3.35 and 1/3.35 = 0.2985, fair enough. There are a couple of alternative ways to calculate Lambda, but really it is a no-brainer. It is just about the relation between radiation and temperature within the given system. Due to the SB law, any minimal change in temperature will result in a four times larger relative change in radiation, or vice versa. So if the emission temperature were to increase from 288K to 289K, this would require a forcing of 3.35W/m2. Umm, wait a moment!

288K is the surface temperature, not the emission temperature! The emission temperature is usually assumed to be only 255K, and that is what the GHE is all about. You know, emissions occur higher up in the atmosphere where it is colder. I am afraid we have already stumbled over one significant mistake (or trick)! Percentage wise, going from 255 to 256K is a larger increase than going from 288 to 289K, meaning it will require a stronger forcing: (256/255)^4 x 240 - 240 = 3.79W/m2 and 1/3.79 = 0.264. So Lambda is not 0.3 but only 0.264. Of course I have pointed out many times that nothing really emits like a perfect black body, and this will also be true for GHGs and clouds, meaning I will openly accept a more realistic emission temperature of maybe 260K. In this case Lambda will slightly grow to 0.27.
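The corrected numbers can be verified the same way; this sketch simply repeats the power-4 arithmetic at the emission temperature instead of the surface temperature:

```python
TOA = 240.0  # emissions at top of atmosphere, W/m2

def forcing_per_K(T_e, toa=TOA):
    """Forcing needed to raise the emission temperature T_e by 1 K (SB law)."""
    return ((T_e + 1.0) / T_e) ** 4 * toa - toa

print(round(forcing_per_K(255), 2))      # 3.79 W/m2 at 255 K
print(round(1 / forcing_per_K(255), 3))  # Lambda ~0.264
print(round(1 / forcing_per_K(260), 2))  # Lambda ~0.27 at 260 K
```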

Note how subtle this trick is and what it does. By introducing this little error, any radiative forcing will produce 11% (and almost 14%, if “climate science” were consequent and went with the 255K figure) more warming in the end. Sure, this could only be an innocent typo in a 30-year-old paper, totally unsuspicious of any intended malpractice. But then, why is it still being used today? Has no one ever recognised the problem? Or did it even help to “outsource” the immanent logic into the term “Lambda”, so that the flaw would be less obvious? Certainly it shows the dangers of quoting figures without knowing the context.

For now we can conclude that a 3.7W/m2 forcing due to 2xCO2 will produce exactly 1K warming (= 3.7 x 0.27) and nothing more. That is, not 1.1K or 1.2K. I am not going to use Lambda anyway, since there is no reasonable scope for it; it only obscures the context.

Analyzing 2xCO2 forcing

Next we need to get a handle on the 3.7W/m2 figure. If you read a paper or an IPCC report and it gives you this figure, it is nothing but a black box. You may choose to believe it or not, that's all. For analytic purposes such little information is absolutely useless. However, at least in theory, modtran should be able to reproduce it. So in a first step I play through the different “localities” the uchicago2 version offers, set all other GHGs to zero (no clouds either), and check for emissions with 0, 400 and 800ppm.

The different locations yield a 2xCO2 forcing of between 1.853 and 4.71W/m2. The GHE of CO2 (at a given concentration) should logically depend on surface temperature and the lapse rate of the atmosphere. The lapse rate given here is true for the troposphere and is only meant to give some orientation. The model output is pretty consistent with what we would expect. The strongest GHE comes with the highest surface temperatures, while a smaller lapse rate will mitigate it. Also, the 2xCO2 forcing is largely proportionate to the GHE, or the base magnitude so to speak.

Notably the US std model has parameters very close to the global average, and certainly not coincidentally so. Both surface temperature and lapse rate represent common base assumptions for the global average. Accordingly, the 2xCO2 forcing figure of 3.768W/m2 sounds very familiar. In fact it is such a good match that we do not really need to consider other localities, except for “back checking” what might occur at the boundaries of the model.

Then of course I would like to know what surface emissivity the model is using. This can be done by lowering the (observation) altitude to 0km, removing all GHGs (which oddly enough still makes a little difference) and reading the emissions, which will be 381.51W/m2 @ 288.2K. A perfect emitter at this temperature would emit 288.2^4 x 5.67e-8 = 391.164W/m2. So we should have a surface emissivity of 381.51 / 391.164 = 0.975.
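The emissivity inference is essentially a one-liner against the Stefan-Boltzmann law; a quick sketch with the modtran readings quoted above:

```python
SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W/m2/K^4
T_s = 288.2        # surface temperature, K
observed = 381.51  # modtran surface emissions with all GHGs removed, W/m2

blackbody = SIGMA * T_s ** 4  # what a perfect emitter would radiate
eps = observed / blackbody    # inferred surface emissivity

print(round(blackbody, 1))  # 391.2 W/m2
print(round(eps, 3))        # 0.975
```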

The severe effect of surface emissivity

Next I am going to switch to a different modtran installation3, as this one allows us to adjust surface emissivity (denoted as “ground albedo”) to a certain extent. The question is how the modelled parameter for surface emissivity impacts both the GHE of CO2 and the 2xCO2 forcing. As shown, I already have reasonable estimates on it, but it is certainly worthwhile back checking. With emissions in the CO2 absorption band being roughly 45% lower than those of a perfect black body, we might assume a 1% drop in the inferred surface emissivity should reduce the inferred CO2 GHE by 1/0.45 = 2.2%. The same should be true for the 2xCO2 forcing.

This modtran installation is regrettably restricted in the far-IR. The maximum wavelength you can pick is 20µm, while the minimum wavenumber is 250, corresponding to 40µm, so I will go with the latter option. Also I put the upper limit at wavenumber 2000, to avoid solar reflected IR, which the model takes into account. The emissions TOA figure is found next to “Upward Diffuse (100 km)” in W/cm2, which needs to be multiplied by 10,000 to give the usual dimension of W/m2. Also, it will still retain some minor GHGs and aerosols you cannot modify, which impair emissions a little bit (~5W/m2). The table below gives the results for the US std atmosphere, all other GHGs set to zero.

This modtran version behaves a bit differently from the uchicago one. The GHE of CO2 is a little higher, which is to be expected with a surface emissivity of 1, as opposed to 0.975. Other than I projected, a 10% reduction of surface emissivity does not reduce the CO2 GHE by 22%, but by over 23%. Fair enough. Somewhat unlike the projected linearity between the effect on the GHE and on 2xCO2 forcing, the model suggests a far larger impact on CO2 forcing, with almost 26%, or 2.6 times the delta in surface emissivity. I think it is a very powerful and impressive demonstration of the importance of surface emissivity for ECS estimates.
Despite modtran suggesting a 2.6 multiplier for any inferred surface emissivity, which obviously endorses the significance of the point I am making, I will prefer to stick with a more conservative 2.2 multiplier. The reasons are that a) I have done a lot of considerations here back and forth, which led me to the lower figure, b) I do not want to fully rely on modtran where it could produce some unanticipated errors, and c) for the sake of being conservative.

Over most of the Earth’s surface the thermal infrared surface emissivity is observed to be in the interval [0.9, 1]. Negligible error is introduced by setting surface emissivity = 1 in spectral regions of high atmospheric opacity.4

Recalling the above statement (and again, it is no specific criticism of Will Happer, as it only represents the status quo of “climate science”), I have to make a little correction here. Rather than a “negligible” error, it introduces an error maximus. Ignoring surface emissivity is a devastating mistake! Oh the irony..




We could include vapor in the above approach and so account for the overlap with it, and for surface emissivity, at the same time. When doing so, a 10% reduction in surface emissivity would still yield an impressive 9.5% reduction in 2xCO2 forcing, as shown in the table.
The problem is, we need to consider what the model does and whether it even makes sense. The charts below demonstrate the issue. When removing all GHGs except for traces of vapor, there will be a few spectral lines reducing emissions against an otherwise perfectly emitting surface. Once we drop surface emissivity by 10%, the “GHE” of vapor within the model reverses. That is because the emissivity of vapor will still be assumed to be 1, which it certainly is not in reality. It is an imprecision that does not really matter as long as you do not bother about surface emissivity, so that every deviation from perfect emissivity can be blamed on some GHG elevating the emission level. Here the model is just too simple, or too restricted in its parameters, to give reasonable results.


Using this information we can interpolate the figures from the uchicago modtran and put them into perspective. As if the 3.768W/m2 figure were not a good enough match for the circulating 3.7W/m2 figure, we can test some modifications. If a 1% change in inferred surface emissivity produces a 2.2% change in 2xCO2 forcing, then with a perfectly emitting surface we should get 3.768 x (1 + 0.025 x 2.2) = 4W/m2. So that would be the origin of the 4W/m2 forcing with 2xCO2, which is another figure very common in the literature.
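This rescaling is easy to sketch in Python; note that the 2.2%-per-% multiplier is the assumption derived earlier in the article, not something modtran outputs directly:

```python
BASE_F = 3.768    # uchicago modtran 2xCO2 forcing at emissivity 0.975, W/m2
BASE_EPS = 0.975  # the emissivity inferred for that installation
MULT = 2.2        # assumed % change in forcing per % change in emissivity

def forcing_at(eps):
    """Rescale the 2xCO2 forcing to another assumed surface emissivity."""
    return BASE_F * (1.0 + (eps - BASE_EPS) * 100.0 * MULT / 100.0)

print(round(forcing_at(1.0), 1))   # 4.0  (perfect emitter)
print(round(forcing_at(0.97), 1))  # 3.7  (the common 0.97 assumption)
```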

Alternatively, assuming a surface emissivity of 0.97, apparently a common base assumption to “refine” ECS estimates, you get 3.73, or 3.7 straight, which explains the preferred IPCC figure. Then Wijngaarden, Happer (2018) name a 3W/m2 forcing including the overlap with vapor.

Doubling the standard concentration of CO2 (from 400 to 800 ppm) would cause a forcing increase (the area between the black and red lines) of ∆Fi = 3.0 W m−2, as shown in Table 25

That would be quite a significant finding on its own, if they were able to exploit it. Regrettably they are not, and so this little shimmer of hope remains without consequences. Anyhow, we can reproduce this figure as well. With all GHGs back in the game, based on the US std atmosphere scenario, modtran gives us 267.842W/m2 in TOA emissions with 400ppm, and 264.859W/m2 with 800ppm. That is a difference of 2.983W/m2, spot on.

So with the help of modtran we can reproduce all these figures, and we now know which assumptions they are based on. With a perfectly emitting surface and no overlaps at all, it is 4W/m2 for 2xCO2. If we allow for a somewhat more realistic surface emissivity of 0.97, the figure drops to 3.7W/m2. Allowing also for the overlap with vapor, it will only be 3W/m2.

Considering cloud overlaps

Beyond these shenanigans, it is now time to investigate the real figure for 2xCO2, since we can. All the foundations are there; it is only about executing. Logically the first thing to do is to allow for clouds. Clouds mean another significant overlap, and as overlaps work, this will reduce the 2xCO2 forcing by a significant margin.

Adding “Stratus/Strato Cu Base .66km Top 2.0km” clouds, which means a LWCRE of 25W/m2, drops the 2xCO2 forcing to only 2.292W/m2 (= 242.91 - 240.618). Temperature wise this corresponds to 0.61K! Remember the “Lambda” issue? A Lambda of 0.3 would wrongly give us 2.292 x 0.3 = 0.69K. Modtran however rightfully applies the forcing to the emission temperature, and so it gets (1 - (240.618/242.91)^0.25) x 256 = 0.606K. Why 256K? Because 242.91W/m2 corresponds to a black body emission temperature of 256K.
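Modtran's implicit forcing-to-warming conversion can be reproduced as below; here the emission temperature is derived from the SB law rather than rounded to 256 K, so the result differs slightly in the third decimal:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m2/K^4

def warming(toa_before, toa_after):
    """Warming implied by a drop in TOA emissions, applied at the
    emission temperature, as modtran effectively does."""
    T_e = (toa_before / SIGMA) ** 0.25  # ~255.8 K for 242.91 W/m2
    return (1.0 - (toa_after / toa_before) ** 0.25) * T_e

# the stratus cloud scenario from the text: 242.91 -> 240.618 W/m2
print(round(warming(242.91, 240.618), 2))  # 0.61 K
```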

Either way, we understand fairly well what modtran does, and why. Yet a LWCRE of only 25W/m2 is way too low, as the global average is inferred to be 30W/m2, and equally 242.91W/m2 emissions TOA is way above the usual 238-240W/m2 figure. Thus we need to run the same procedure with a larger LWCRE (Cumulus Cloud Base .66km Top 2.7km) of 39W/m2.

The Synthesis

In this case the 2xCO2 forcing is 1.853W/m2, or 0.52K respectively. With this we have two waypoints, and so we can infer what we would get with accurate figures for the global average. The LWCRE should be 30W/m2, as opposed to 25 and 39W/m2 in the named scenarios. Equally, emissions TOA should be in the 238-240W/m2 range, as opposed to 242.91 and 228.875W/m2. Interpolation gives us some 2.15W/m2 in forcing, or 0.58K respectively.
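The exact interpolation method is not stated, but a simple linear interpolation on LWCRE between the two cloud waypoints lands very close to the quoted figure:

```python
# the two modtran cloud scenarios: (LWCRE in W/m2, 2xCO2 forcing in W/m2)
lo = (25.0, 2.292)   # stratus scenario
hi = (39.0, 1.853)   # cumulus scenario
target_lwcre = 30.0  # the inferred global average

t = (target_lwcre - lo[0]) / (hi[0] - lo[0])
f30 = lo[1] + t * (hi[1] - lo[1])
print(round(f30, 2))  # ~2.14 W/m2, close to the ~2.15 quoted
```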

Finally we need to consider the impact of real-life surface emissivity. As shown, within the main CO2 absorption band surface emissivity is only about 0.895, as opposed to the 0.975 the uchicago modtran assumes. It is a difference of 8 percentage points which, as explained above, will have 2.2 times the effect on 2xCO2 forcing, meaning minus 17.6%. Yet this will only impact the result insofar as clouds and vapor do not block surface emissions anyway. If we assume a 40% opaque cloud cover and a 25% reduction due to vapor, we get an adjustment of 0.6 x 0.75 x 17.6% ≈ 8%. Really these are just ballpark figures, but we need some reasonable estimate here. This gives a best estimate of 2.15 x (1 - 0.08) = 2W/m2, and 0.58 x (1 - 0.08) = 0.53K, for a 2xCO2 forcing.
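The final adjustment chain, with the ballpark assumptions stated above (8 percentage points of emissivity, a 2.2 multiplier, 40% opaque cloud cover, 25% blocked by vapor), works out as follows:

```python
f_base, dT_base = 2.15, 0.58  # interpolated 2xCO2 forcing (W/m2) / warming (K)
delta_eps = 0.975 - 0.895     # ~8 percentage points within the CO2 band
raw_cut = delta_eps * 100 * 2.2 / 100  # 17.6% reduction without overlaps
effective = 0.6 * 0.75 * raw_cut       # clouds block 40%, vapor another 25%

print(round(effective, 3))                  # ~0.079, i.e. ~8%
print(round(f_base * (1 - effective), 2))   # 1.98 W/m2, i.e. roughly 2
print(round(dT_base * (1 - effective), 2))  # 0.53 K
```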


For the first time in history we have an accurate assessment of the forcing of 2xCO2, excluding feedbacks. With the help of modtran we were able to precisely reconstruct the figures given in the literature, be it 4, 3.7 or 3W/m2. Therefore we also understand under which circumstances these results were obtained. It is also compelling evidence that modtran gives accurate figures within the restrictions of the model.

In the wider context of a highly superior understanding of the physics, as presented on an ongoing basis on this site, it is easy to identify the severe shortcomings of common forcing estimates. I do not know, and cannot understand, how the idea came up that such profound issues as surface emissivity and overlaps with vapor and clouds could be ignored. It may well be due to a general lack of intellect among “climate scientists”, as well as to intended misinformation. Without a doubt the “science” has been severely corrupted by politics, to a level where it represents the opposite of science.

As the evidence shows, allowing for surface emissivity and overlaps is a conditio sine qua non for climate modelling and ECS estimates. All papers and models failing in this regard are not worth the paper they may be written on, or the computers they are running on. To my knowledge that is true for ALL of them, as this article is the first of its kind to reveal the named defect. The concluding table summarizes the results and the assumptions they are based on. Only the last result is accurate.

Comments (8)

  • Peter Smith
    Peter Smith
    at 07.09.2021
    Simply outstanding and sensible. Thank you.
  • NH4Kx3
    at 19.09.2021
    Thank you; very interesting.

    I do not see any mention of H2O vapor. My understanding of Chicago Modtran is that it defaults to 1 cm of column water, whereas, again as I understand, it is more like 2. Would adoption of 2 cm or higher column water change anything?
    • GHD
      at 25.09.2021
      Vapor scale = 1 only means the preset amount of vapor for the respective scenario at a 100%. To find out how much vapor it actually means you need to look up the "raw model output".
  • JJW
    at 08.10.2021
    This indeed seems a tiny little number compared to the scaremongering IPCC that come up with 5.7ºC by the year 2100.
    IMO, this makes perfect sense. Considering current run rates we'll hit 2xCO2 by ~2200 if we don't run out of fossils earlier anyway.

    I am wondering if you have a view on the impact that ocean temperatures have. HADSST3 shows an increase of >1ºC since 1900, apparently having been measured in somewhat deeper layers of water before the advent of satellites.
    And surely, if my physics doesn't let me down, the oceans warm the atmosphere. Not the other way round.

    Also, have you considered bringing your findings up to Wijngaarden and Happer? They seem to be quite reasonable folks.
    • GHD
      at 09.10.2021
      a) that is without feedbacks
      b) I do not deal with oceans, except for surface properties. And there are good reasons for that.
      c) Not just considered. I told him about the issue, he replied, but regrettably failed to comprehend. It is ironic since WN2020 (and other versions) do feature the impact of the CO2/vapor overlap, and it should be only one more logical step to see the cloud overlap and surface emissivity issue. However, Happer is 83 years old and he might have trouble understanding the one big blunder in climate science that he failed to recognise throughout his scientific career.
    • JJW
      at 11.10.2021
      a) understood. Your blog write-up was clear. But, even if you applied that rule of thumb of doubling the 0.53º figure to account for forcings you'd still be at only 1º instead of IPCC 5.7º

      b) I am not challenging that there might be good reasons for not dealing with oceans on your side. What looks strange to me (as a non-physicist) is that average "global" ocean temperatures are said to be 17ºC whereas average "global" atmosphere temps (close to surface) are always calculated with 288K/15ºC. How can colder matter heat up warmer matter (as in "mankind is boiling the ocean")?

      c) Happer might be unwilling or too old to realise his blunder but you would expect Wijngaarden to be a bit more open. I understand they "do something with clouds" these days. If I can help you hammer home your findings and raise your points (again), then just let me know.
  • Simon
    at 03.04.2022
    Great work, I see it must be getting read by those who disagree by the number of down votes some comments get, but they never post any rebuttal. Says it all really.

    Are you working on getting your work published?
    • GHD
      2 weeks ago
      Well, I am publishing here so that I don't have to do it anywhere else. There is so much junk science, especially with regard to climate, published in both reputable and pay-to-publish journals. "The science" is in a dreadful state. What I can do is to take it back to its roots and restart from there. After all it is just a mental, not an institutional game.

