Friday 15 May 2015

Climate is a heat engine with input energy from the sun and exhaust to outer space. It sustains all living organisms by supplying them with the necessary energy.

Genuine climate science models are the numerical models used for weather forecasting.

They consist of a coupling of the latest mathematical representations of the atmosphere and ocean, into which are fed experimental observations made with calibrated instruments over definite time periods at every level of the atmosphere. The most modern computers permit observations of the climate properties at ever smaller time and space increments. An additional essential feature of all these models is a representation of the position of the sun over each region, plus a range of other local properties and influences such as mountains, events like monsoons, ocean oscillations and the extent of industrialisation. Even after all of this is done, using the most advanced computers as shown in Chapter 1, the accuracy of local or regional temperature forecasts is only ±2ºC with a bias of ±1ºC, and other forecasts are similar.

A global climate may be represented by an assembly of local models in which each individual local variability and correction is preserved. If these are ignored, as is proposed by the IPCC climate change models, the accuracy of any forecast is much less than that of a fully adjusted local model.

An increasing range and variety of climate measurements and observations are now incorporated into weather forecasting models. The importance of water in all its forms is involved in many of them, but so far, there is no use made of measurements of carbon dioxide concentrations or those of other trace gases. Indeed, no effort is made to measure concentrations of these gases over most land surfaces.

The IPCC models begin with the assumption that climate is exclusively controlled by the so-called greenhouse gases, and that their increase is responsible for widespread harm to the climate, notably warming.

Most scientists would agree that the greenhouse gases have infra red absorption bands which cause a temperature rise in the atmosphere. There is, however, no evidence that this rise is detectable or could be considered as harmful. By contrast, carbon dioxide is responsible for the mere existence of most living organisms and there is evidence any increase is beneficial.

In order to promote their fraudulent climate theory, the IPCC models lump together all of the many observations made by meteorologists into parametric simulations of what they term natural variability, based on averages of properties over past periods. This task is impossible because no account is taken of the chaotic behaviour of all of them, and none of the figures possess the necessary symmetry from which a scientifically acceptable average or measure of variability could be derived. The diurnal, seasonal and irregular radiant input from the sun imposes heavily skewed populations of parameters. Any attempt of this kind leads to the IPCC absurdity of a sun with the same intensity day and night.

Despite these impossibilities, the IPCC models assume that these simulated phoney averages are constants, so that the only influence on the climate can be claimed to be provided by increases in human-induced greenhouse gases. Models based on these dubious principles are then combined with scenarios of the future, dominated by the pipe dreams of committed environmentalists, to provide projections which are evaluated into levels of likelihood and confidence by scientists whose opinions are closely monitored and controlled by the anonymous Government Representatives who approve the IPCC Reports.

The models typically calculate the radiative forcing consequences of doubling the effective atmospheric carbon dioxide concentration (the climate sensitivity) to provide a future projection: this must be combined with a futures scenario, which estimates how long this process will take.

The IPCC has run into trouble with its approved pseudo temperature record, which has now been unchanged for 18 years. They have therefore begun to project by decades. Their conclusions are not predictions, but projections, which are evaluated into levels of likelihood by approved and financially dependent IPCC scientists. None of the projections has ever been confirmed.

Their attitude is typically expressed by the following Frequently Asked Question1.

FAQ 11.1 | If You Cannot Predict the Weather Next Month, How Can You Predict Climate for the Coming Decades?

Although weather and climate are intertwined, they are in fact different things. Weather is defined as the state of the atmosphere at a given time and place, and can change from hour to hour and day to day. Climate, on the other hand, generally refers to the statistics of weather conditions over a decade or more.

They define weather as what climate generally refers to,  not what it is.

As discussed in Chapter 1, climate is a general term and weather is the actual condition at a particular place and time. All the same, climate is always essentially local and is different from one place to another. It is perfectly possible over a limited period to provide rough average local weather statistics. Averages with a plausible level of accuracy can only be provided by populations of figures that are approximately symmetrical. This might apply to maximum and minimum daily temperature, wind direction and strength, sunshine hours or rainfall. The frequency of extreme events cannot be judged because of the small numbers involved and the overall climate changes that undoubtedly take place.


Figure 6.1 Global Radiation Budget
according to Climate Change 19902
THE  IPCC MODELS

The earliest model is from the first IPCC Report2. 


This Figure is extremely crude. It is like the moon: there is no atmosphere, no life, no heat transfer, no rotation, and an apparent equilibrium.


The maximum solar irradiance is 1370 Wm-2, whereas their climate system receives an average of only 240 Wm-2, even on a spherical earth.
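The factor separating these two figures follows from simple geometry: the earth intercepts sunlight as a disc but spreads it over the surface of a sphere, and part of the incoming beam is reflected. A minimal sketch of that arithmetic, assuming a planetary albedo of 0.3 (a commonly quoted value, not a figure taken from this chapter):

```python
# Why an average of ~240 W/m^2 is quoted when the solar constant is ~1370 W/m^2:
# the earth intercepts sunlight over a disc (pi * R^2) but its surface is a
# sphere (4 * pi * R^2), so the mean incoming flux is the solar constant / 4.
# Roughly 30% of that is reflected straight back (the assumed planetary albedo).
SOLAR_CONSTANT = 1370.0   # W/m^2 at the top of the atmosphere
ALBEDO = 0.3              # assumed planetary albedo (illustrative)

mean_incoming = SOLAR_CONSTANT / 4.0        # ~342 W/m^2 averaged over the sphere
absorbed = mean_incoming * (1.0 - ALBEDO)   # ~240 W/m^2 after reflection
print(f"Mean absorbed solar flux: {absorbed:.0f} W/m^2")
```

Of course, as the text stresses, this average conceals the fact that half of the sphere receives no sunlight at all at any given moment.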

The 2nd IPCC Report 1995 even provided a fairly realistic diagram of the climate3.

Figure 6.2 Diagram of the climate from 2nd IPCC Report3.

But the very next page provided their real model, which differs from Figure 6.2 in every possible respect.


Figure 6.3 Earth’s Annual Global Mean Energy Budget4.
                                      
It was described in detail by Kiehl and Trenberth (1997)5.

The climate is considered to be balanced, with the entering energy equal to the energy leaving. In practice this amounts to a belief that the whole climate system is in equilibrium. The whole object of this concept is to assume that any extra energy (radiative forcing) that can be detected by a rise in temperature is the result of increases in human-induced greenhouse gases.

Heat input (radiative forcing) is assumed to be exclusively radiative, and thus to be instantaneous. It is measured in units of Watts per square meter, Wm-2.

The model assumes that the earth can be considered to be flat with mean quantities providing input and output of energy. The apparent curvature of the earth in the diagram is illusory, as all quantities operate strictly up and down.

The heat engine that is the climate provides the energy for living organisms, plus energy for erosion, mountain building and changes in topography. The model ignores this and essentially treats the earth as dead and uninhabited. Yet somehow the dead earth succeeds in emitting greenhouse gases.

No part of the earth is ever in equilibrium or could possess an energy balance. There is excess energy in daytime and in the summer and a deficit at night and in the winter. The imbalance can change with every breath of wind, every cloud, and every change of weather. The earth has a large thermal capacity to absorb energy or lose it, for short or long periods.
           
The figures on the diagram do turn out to be balanced, but since they are mean figures there should be large uncertainties attached to each of them, which destroy the balance. The calculated effects of greenhouse gases could not be distinguished from the uncertainties in these figures.

The figure for the radiant energy emitted by the earth is given as 390 Wm-2.

If the earth is assumed to be a black body, then the radiation intensity from the earth, E, in Wm-2, is related to the absolute temperature T in K by the Stefan-Boltzmann equation:

E = σ × T4

where σ is the Stefan-Boltzmann constant, 5.67 x 10-8 Wm-2K-4.

The average temperature of the earth is thought to be 288K (15ºC) (although there is no reliable evidence that this is so). If you put 288 into the Stefan-Boltzmann equation, you get 390 Wm-2, the amount shown in Figure 6.3. This means that they have assumed that the earth is an ideal black body with a constant temperature of 288K (15ºC). Such an assumption is absurd.
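The arithmetic behind the 390 Wm-2 figure is easy to check. A short sketch of the Stefan-Boltzmann calculation, using the assumed 288 K mean temperature quoted above:

```python
# Black-body emission for the assumed 288 K (15 C) mean surface temperature,
# via the Stefan-Boltzmann law E = sigma * T^4.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T = 288.0         # assumed global mean surface temperature, K

E = SIGMA * T ** 4
print(f"Black-body emission at {T:.0f} K: {E:.0f} W/m^2")  # ~390 W/m^2
```

The same formula also shows how sensitive the figure is: near 288 K, each 1ºC of temperature corresponds to roughly 5 Wm-2 in E, so the ±2ºC forecast uncertainty quoted earlier already spans more than the claimed greenhouse forcings.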

Trenberth et al.6 admit that there are inter-annual variations in their atmospheric heat budget, and they give details of local and temporal variability, dependent on changes in Sea Surface Temperature and ENSO events. The variability, as well as the errors, frequently exceeds the supposed perturbations from greenhouse gas emissions.

The diagram and the models also assume other inappropriate averages, such as the supposed well-mixed concentrations of all the greenhouse gases, where distribution curves are actually necessary.

The 3rd IPCC Report (2007) had a slightly different diagram7 in which several of the figures were changed, but there was still an unbelievable balance.

Figure 6.4 Earth's Mean Energy Budget in 3rd IPCC Report7

Fasullo and Trenberth (2008) have extended the model to cover changes in latitude and in seasons and the difference between land and ocean. It still remains an unrealistic static concept, unable to cope with the fact that half of the earth at any one time has no solar radiation at all, and that equilibrium and balance are never established.

Then, in a paper by Trenberth, Fasullo and Kiehl (2009)8, a complete revision was made.
Figure 6.5  Revision by Trenberth Fasullo & Kiehl8.

This version represents the period from March 2000 to May 2004. It caused something of a sensation when it first appeared, as for the first time there is no balance but a surplus, one much greater than the 1.6 Wm-2 which is supposed to be the total effect of greenhouse gas emissions since 1750. The figures are all uncertain and variable, and this surely means that they are incapable of telling whether increases in greenhouse gases affect the climate.

The latest version, in the 5th IPCC Report9, is:

Figure 6.6 The Earth’s Annual Global Mean Energy Budget9

This version makes it far more obvious that the models cannot show the influence of greenhouse emissions: for the first time, uncertainty figures are given for each energy estimate.
RADIATIVE FORCING

The models are used to calculate the additional global input energy, or radiative forcing, caused since the year 1750. Its breakdown is shown in Figure 6.710.


Figure 6.7 Components of radiative forcing since 175010

Their current estimate of 2.3 Wm-2 (1.33 to 3.33) for the radiative forcing caused by greenhouse gas increases since 1750 is well below the uncertainties of this model. The maximum forcing “expected” in 2100, of nearly 8 Wm-2, is still smaller than the possible extra energy that could come from the model's uncertainties. The model is therefore worthless as a means of estimating the influence of increases in greenhouse gases. It does not seem to be worth arguing about different values of “climate sensitivity”, as they are all much less than the uncertainties.

It does not include the main greenhouse gas, water vapour, which is treated as a feedback. This is ridiculous, since water vapour has a huge, largely unknown range of concentration and variability. It is simply assumed to be related only to the assumed global temperature: since carbon dioxide causes warming, an increase in water vapour is taken to enhance the carbon dioxide warming. This is the tail wagging the dog.

Column water vapour can now be measured by the AIRS instrument on NASA satellites; Figure 6.811 shows an example. Obtaining a global average from this diagram is rather a challenge.

Figure 6.8 Column Water vapour as measured by AIRS11.

My book12 pointed out the large range of assumptions for cloud feedback in the Third IPCC Report6. Figure 6.7 shows that this has not changed. Model assumptions as a function of latitude are also not very impressive (Figure 6.9)13.

EVALUATION OF MODELS

The IPCC does not make future predictions, only projections, whose value depends on the assumptions of the models themselves and also on the scenarios, which make a range of assumptions about how fast the additional emissions of greenhouse gases may happen in the future. These projections combine two different levels of uncertainty: those of the models and those of the scenarios.

Instead of trying to estimate the uncertainty of these calculations by direct comparison between projections and actuality, they carry out the following procedure14:

for AR5, the three IPCC Working Groups use two metrics to communicate the degree of certainty in key findings:

(1) Confidence is a qualitative measure of the validity of a finding, based on the type, amount, quality and consistency of evidence (e.g., data, mechanistic understanding, theory, models, expert judgment) and the degree of agreement; and (2) Likelihood provides a quantified measure of uncertainty in a finding expressed probabilistically (e.g., based on statistical analysis of observations or model results, or both, and expert judgement).
Each key finding is based on an author team’s evaluation of associated evidence and agreement. The confidence metric provides qualitative synthesis of an author team’s judgement about the validity of a finding, as determined through evaluation of evidence and agreement. If uncertainties can be quantified probabilistically, an author team can characterize a finding using the calibrated likelihood language or a more precise presentation of probability. Unless otherwise indicated, high or very high confidence is associated with findings for which an author team has assigned a likelihood term.
Figure 6.10 Evaluating Models14

All these are nothing more than personal opinions of the dragooned zombie scientists who are under the absolute control of the environmentally motivated government representatives who control the IPCC. This control is now absolute. It was displayed when the 1995 Final Draft was made to agree with the government dictated Summary for Policymakers and more recently when the Final Draft was not approved until it had been thoroughly checked for reliability. The penalty for independent thinking is instant dismissal.

THE IPCC SCENARIOS


There have been three sets of scenarios.

The first, launched with Climate Change 19902, consisted of four scenarios: A, B, C, and D. Scenario A was termed “Business as Usual” (BaU), or SA90.

The next series of six scenarios was developed for the Supplementary Report, Climate Change 199215, and the details are available from a further supplementary report16. The scenarios were designated IS92a, IS92b, IS92c, IS92d, IS92e and IS92f.

I published a paper17 which showed that these scenarios were not plausible. Its publication contributed to the dismissal not only of the Editor but of the entire Editorial Board, as shown in Chapter 8 on Climategate.

A new set of scenarios was used in Climate Change 20016.

The scenarios were prepared by a special committee of IPCC Working Group III. Their Report, Special Report on Emission Scenarios18, was produced without any input from the scientists involved with Working Group I, and its conclusions were foisted on Climate Change 2001 without opportunity for discussion.

Six teams of specialists from 18 countries drew up a total of 40 scenarios, all based on four “storylines” claimed to represent different views of what might happen in the future. The 40 scenarios were originally summarised in the form of four Marker Scenarios, A1, A2, B1 and B2, but, as time went on, A1 was split into three: A1FI, A1B and A1T. These scenarios were also used for the Fourth IPCC Report (2007)9.

In my book19 I showed that these scenarios not only failed to predict the future, but in some respects failed to predict the past.

A series of intercomparison exercises was carried out with the object of providing a set of approved models. These were the result of the Coupled Model Intercomparison Project (CMIP), which had reached its third version by the 4th Report (2007)9. CMIP5 models were used for the calculations of the 5th Report1.

A new approach to scenarios is a feature of the Fifth Report, which now has a set of Representative Concentration Pathways (RCPs). Instead of a detailed history of emissions, each pathway is defined by a target level of radiative forcing in the year 2100. The four RCPs, RCP2.6, RCP4.5, RCP6, and RCP8.5, are named after a possible range of radiative forcing values in the year 2100 relative to pre-industrial values (+2.6, +4.5, +6.0, and +8.5 Wm-2 respectively).
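The naming convention is mechanical enough to set down directly; a minimal sketch, using only the labels and forcing targets listed above:

```python
# The four RCPs are labelled by their nominal year-2100 radiative forcing
# relative to pre-industrial values, in W/m^2, as described in the text.
rcp_forcing_2100 = {
    "RCP2.6": 2.6,
    "RCP4.5": 4.5,
    "RCP6": 6.0,
    "RCP8.5": 8.5,
}

for name, forcing in sorted(rcp_forcing_2100.items(), key=lambda kv: kv[1]):
    print(f"{name}: nominal forcing of +{forcing} W/m^2 by 2100")
```

Note that these are forcing targets, not emission histories: many different emission pathways can lead to the same year-2100 forcing.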

The relationship between the various scenarios is shown in Figure 6.1120.
Figure 6.11 The relationship between the Scenarios20


CLIMATE SENSITIVITY


Most scientists would agree that carbon dioxide and other trace gases cause a warming of the global climate as a result of absorption of the infra red radiation from the earth by their spectral bands.

Weather forecasting meteorologists measure the many properties of the climate and provide a daily presentation of their influence on the global climate, but they have never found evidence that trace gas concentrations are important enough in forecasting even to require regular measurement.

Scientists involved with the Intergovernmental Panel on Climate Change (IPCC) argue that carbon dioxide and other trace gases are not only important, but are the sole cause of climate warming since 1750, and will be responsible for further warming as the concentrations rise.

The equilibrium climate sensitivity quantifies the response of the climate system to constant radiative forcing on multi-century time scales. It is defined as the change in global mean surface temperature at equilibrium that is caused by a doubling of the atmospheric CO2 concentration.

It may be defined thus:

ΔTCS = ΔTT × ΔFCS / ΔFT

where
ΔTCS is the climate sensitivity,
ΔTT is the temperature change since 1750,
ΔFCS is the radiative forcing from doubling carbon dioxide, and
ΔFT is the radiative forcing since 1750.
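As a numerical check on this formula, here is a small sketch with illustrative inputs: the ~0.85ºC warming and ~3.7 Wm-2 per doubling are commonly quoted figures, and the 2.3 Wm-2 total forcing since 1750 is the estimate quoted earlier in this chapter. None of these inputs are endorsed by the text as correct.

```python
def climate_sensitivity(dT_total, dF_doubling, dF_total):
    """The text's formula dT_CS = dT_T * dF_CS / dF_T: scale the observed
    warming by the ratio of the forcing from a CO2 doubling to the total
    forcing over the same period."""
    return dT_total * dF_doubling / dF_total

# Illustrative inputs (assumptions, not figures endorsed by this chapter):
dT_T = 0.85    # K, claimed warming since 1750
dF_CS = 3.7    # W/m^2, commonly quoted forcing per CO2 doubling
dF_T = 2.3     # W/m^2, total forcing since 1750 quoted earlier

print(f"Implied sensitivity: {climate_sensitivity(dT_T, dF_CS, dF_T):.2f} K")
```

On these inputs the formula gives about 1.4 K per doubling, and the result scales directly with whatever fraction of the warming is attributed to greenhouse gases, which is precisely the attribution this chapter disputes.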

The current IPCC range of values for equilibrium climate sensitivity is shown in Figure 6.1221.

They regard a range between 1.5ºC and 4.5ºC as likely (high confidence), a value below 1ºC as extremely unlikely (high confidence), and a value above 6ºC as very unlikely (medium confidence).
Figure 6.12 Current IPCC figures for equilibrium climate sensitivity21

Annexe II of the IPCC 5th Report22 includes figures for the atmospheric concentrations of CO2 and other greenhouse gases from 1750 to 2011, in order to calculate the forcing from 1750 to 2000 and from 1850 to 2000, and hence the climate sensitivity, on the assumption that all the temperature change from 1850 to 2000 was caused by greenhouse gases.

Geologists know that there are changes of climate from natural causes in every geological period, short or long, whether or not human influences existed. There have been several periods in which carbon dioxide concentrations were not related to the assumed temperatures.

The FCCC assumption that all natural climate properties did not change from 1850 to 2000, but were merely variable, may not be true. Perhaps some or all of the claimed temperature change since 1850 had natural causes. The First IPCC Report2 (1990) suggested that recent temperature increases could have been a recovery from the Little Ice Age (1550-1850). Some of the increase was due to urban development, and some to the persistent attempts to manipulate the record, as summarised in Chapter 7.

The Mean Global Annual Surface Temperature Anomaly, whose origins are described in Chapter 7, is now incompatible with all the current models, as shown by this diagram from the Technical Summary of AR522:

Figure 6.12 Comparison between the IPCC Mean Annual Global
Surface Temperature Anomaly and the current IPCC climate models22

Because of this failure, and the fact that the IPCC Mean Annual Global Temperature Anomaly has not changed for the past 18 years, they have decided to treat it on a decadal basis instead, as follows23:

Figure 6.13 IPCC  Decadal Temperatures23
                                                                                                                                            

This irregularity is simply not compatible with a theory that it is caused by a steadily increasing concentration of greenhouse gases.

There is a much more plausible temperature anomaly record from measurements in the lower atmosphere since 1978 by Microwave Sounding Units (MSU) on NASA satellites, which measure the microwave emission spectrum of oxygen. Their results are confirmed by weather balloons, which have been providing a record since 1958 (Figure 6.14)24.

Figure 6.14 Comparison between MSU Satellite and weather
balloon temperature measurements and 73 CMIP Climate models24.

The IPCC models project a hotspot in the tropical troposphere. Figure 6.14 shows that this hotspot does not exist in the temperature observations made by satellites and weather balloons.

All the measurements are now beginning to resemble each other, as shown in Figure 6.15. They provide further evidence that the IPCC models are currently incapable of predicting climate properties. These results suggest that the IPCC figures for climate sensitivity are far too high, and that the opinion of the meteorologists, that its value is negligible, is confirmed.
Figure 6.15 Comparison between surface and satellite temperature records and  IPCC climate models25

The IPCC models have rejected all of the measurements used by meteorologists except those used as feedbacks for their radiative forcing. The forcing is based on the belief that all the temperature rise since 1750 was caused by greenhouse gas emissions, and it was calibrated from the temperature trend established by their own flawed and biased temperature anomaly record, which has now been forced to become decadal because of the unexpected pause of the past 18 years.

They have then attempted to simulate properties by using parameters based on their radiative forcing. A case of the unnecessary tail wagging the scientifically determined dog.

As might be expected, this gives a reasonably successful simulation of their annual temperature record, at least up to 2000, as well as of diurnal and regional variability. The rest depends on the plausibility of the parameterization. Precipitation comes out poorly until it is modified regionally. Ocean oscillations and cyclones depend on their supposed regularity. Sea level is dependent on the belief that the ice is melting.

They have compiled projections to the year 2100, and map changes using all 42 of the CMIP5 models and all the RCP scenarios, for temperature and precipitation change in different seasons for some 21 different regions. These are presented as time series and global maps in Annex 1 and in four Supplementary Reports to Annex 1. The temperature graphs all start with Figure 1 above, which makes them equally implausible, and then continue with very large uncertainties.

They are quite frank about the adequacy of their models, which they insist are not forecasts, in some of the Supplementary Reports to Annex 1 of the Fifth Report26:
Projections of future climate change are conditional on assumptions of climate forcing, affected by shortcomings of climate models and inevitably also subject to internal variability when considering specific periods. Projected patterns of climate change may differ from one climate model generation to the next due to improvements in models. Some model inadequacies are common to all models, but so are many patterns of change across successive generations of models, which gives some confidence in projections. The information presented is intended to be only a starting point for anyone interested in more detailed information on projections of future climate change and complements the assessments in Chapters 11, 12 and 14.

It is surely evident that the models have no predictive value and are therefore a complete failure. The estimates of climate sensitivity given by the IPCC are grossly exaggerated. The true figure is most probably near to that which is already assumed by the weather forecast meteorologists, negligibly low.

REFERENCES

1. IPCC, 2013: Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change [Stocker, T.F., D. Qin, G.-K. Plattner, M. Tignor, S.K. Allen, J. Boschung, A. Nauels, Y. Xia, V. Bex and P.M. Midgley (eds.)]. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA, 1535 pp. Page 964.
2. Houghton, J.T., G.J. Jenkins & J.J. Ephraums (Editors), 1990. Climate Change: The IPCC Scientific Assessment. Cambridge University Press. Page 78.
3. Houghton, J.T., L.G. Meira Filho, B.A. Callander, N. Harris, A. Kattenberg & K. Maskell (Editors), 1996. Climate Change 1995: The Science of Climate Change. Cambridge University Press. Page 57.
4. Ibid. Page 58.
5. Kiehl, J.T. and K.E. Trenberth, 1997: Earth's Annual Global Mean Energy Budget. Bull. Amer. Meteor. Soc., 78, 197–208.
6. Houghton, J.T., Y. Ding, D.J. Griggs, M. Noguer, P.J. van der Linden, X. Dai, K. Maskell & C.A. Johnson (Editors), 2001. Climate Change 2001: The Scientific Basis. Cambridge University Press. Page 90.
7. Fasullo, J.T. and K.E. Trenberth, 2008. The Annual Cycle of the Energy Budget. Part I: Global Mean and Land–Ocean Exchanges; Part II: Meridional Structures and Poleward Transports. J. Climate, 21, 2297–2325.
8. Trenberth, K.E., J.T. Fasullo and J. Kiehl, 2009. Earth's Global Energy Budget. Bull. Amer. Meteor. Soc., 90, 311–323. http://journals.ametsoc.org/doi/pdf/10.1175/2008BAMS2634.1
9. Solomon, S., D. Qin, M.R. Manning, M. Marquis, K. Averyt, M.H. Tignor, H.L. Miller and Z. Chen (Editors), 2007. Climate Change 2007: The Physical Science Basis (IPCC). Cambridge University Press.
9. Reference 1, Chapter 2, Figure 2.11, page 181.
10. Reference 1, Summary for Policymakers, page 12.
11. AIRS: http://airs.jpl.nasa.gov/resources/todays_earth_maps/water_vapor_total_column
12. Gray, V.R., 2002. The Greenhouse Delusion: A Critique of 'Climate Change 2001'. Multi-Science Publishing, Essex, UK. Page 38.
13. San Jose State University: http://www.sjsu.edu/faculty/watkins/watervapor01.htm
14. Reference 1, pages 139 and 141, Section 1.4.1, Figure 1.11.
15. Houghton, J.T., B.A. Callander and S.K. Varney (Editors), 1992. Climate Change 1992: The Supplementary Report. Cambridge University Press.
16. Pepper, W. et al., 1992. Emission Scenarios for the IPCC: An Update. IPCC.
17. Gray, V.R., 1998. The IPCC future projections: are they plausible? Clim. Res., 10, 155–162.
18. Nakicenovic, N. and R. Swart (Editors), 2000. IPCC Special Report: Emissions Scenarios. Cambridge University Press.
19. Gray, V.R., 2002. The Greenhouse Delusion: A Critique of 'Climate Change 2001'. Multi-Science Publishing, UK. Page 67.
20. Reference 1, Chapter 1, Figure 1.15, page 146.
21. Reference 1, Technical Summary, Figure TFE6, page 83.
22. Reference 1, Technical Summary, page 87.
23. Reference 1, Observations, page 192.
26. Reference 1, Annex I Supplementaries.

























