In 2007, the Fourth Assessment Report of the Intergovernmental Panel on Climate Change declared the warming of the climate system to be unequivocal. However, despite the evidence that the climate is changing, it remains uncertain how the environment, people, and infrastructure will experience these changes. This is because there is a high level of uncertainty in forecasts and projections of how soon the earth will experience changes in sea levels and temperatures, and this uncertainty varies across systems and scales. It is therefore important that policy makers understand these changes, because they present serious biological, social, and physical threats. However, the diversity of climate change projections poses a problem for many policy makers because the projections often conflict with one another (Lemos and Rood, 2010).
Many funded research programs now concentrate their efforts on reducing the level of uncertainty in climate change projections and predictions. The research process itself can expose new uncertainty, thereby making the decision-making process more complex. There is therefore a need to refine climate change predictions by trying to improve climate change projections, because scientific knowledge is perceived differently by different people, which can hinder its full utilization. Different types of climate change models are used to predict and project climate change, and these models use different methodologies to draw their conclusions (Barron, 1995). This is why there is confusion and conflict in decision making and policy making: the models produce conflicting outcomes.
Climate Change Models
In the last 40 years, scientists have been making climate change projections and predictions using climate models of steadily increasing complexity. Most of these models are based on theories from biochemistry and atmospheric physics and play a crucial role in helping us understand the earth's climate and how it may change in the future. Climate model projections have been produced since 1973 and have been used to project and forecast both past and future global temperatures. These models have produced different outcomes, some projecting more global warming and others less; nevertheless, all of them have exhibited rising temperatures over the period 1970-2016. These projections were not far off the mark given the rising temperatures observed in that period, especially when one takes into account the differences in projected future emissions. Past climate change forecasts benefited greatly from knowledge of volcanic eruptions, atmospheric greenhouse gas concentrations, and a number of other radiative forcings that affected the earth's climate (Hickman, 2018).
Forecasting future global climate change, however, remains highly uncertain. Climate models can be evaluated both on their ability to forecast future temperatures and to hindcast past ones. Hindcasts are important because the radiative forcings are already known and can be controlled for. Forecasts, on the other hand, are useful because they cannot be tuned, implicitly or otherwise, to match observations. When models are run over historical temperatures, modelers already know the outcome, and that knowledge can inform their choice of model parameters such as aerosol effects and cloud physics (Barron, 1995).
The first climate change forecast was made by John Sawyer at the UK's Meteorological Office in 1973. In a published paper, he hypothesized that the globe would warm by 0.6 °C over the period 1969-2000 and that atmospheric carbon dioxide would rise by about 25%. He estimated climate sensitivity, the amount of long-term warming associated with a doubling of atmospheric carbon dioxide, at about 2.4 °C. His forecasts and projections were not far off the mark compared with the best estimate of 3 °C provided by the Intergovernmental Panel on Climate Change (IPCC) in recent years. He did not, however, give year-by-year estimates as most scientists now do, only an expected value for the year 2000. His estimate of 0.6 °C compared favorably with the warming observed over that period, which stood between 0.51 and 0.56 °C. His atmospheric carbon dioxide concentration for the year 2000 was nevertheless an overestimate: he had projected 375-400 ppm, whereas the actual value stood at about 370 ppm (Hickman, 2018).
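To make the arithmetic of climate sensitivity concrete, the short sketch below converts a concentration change into the equilibrium warming it implies, using the standard logarithmic relationship between carbon dioxide concentration and radiative forcing. The function name and the 320-to-400 ppm figures are illustrative assumptions for a 25% rise, not Sawyer's actual calculation.

```python
import math

def equilibrium_warming(sensitivity_c, co2_start_ppm, co2_end_ppm):
    """Equilibrium warming implied by a given climate sensitivity.

    Radiative forcing grows with the logarithm of CO2 concentration,
    so warming scales with log2 of the concentration ratio: a full
    doubling yields exactly `sensitivity_c` degrees.
    """
    return sensitivity_c * math.log2(co2_end_ppm / co2_start_ppm)

# A 25% rise in CO2 is about a third of a doubling (log2(1.25) ~ 0.32),
# so with a sensitivity of 2.4 C the implied equilibrium warming is
# roughly 0.8 C; realised warming over the period would be lower
# because the oceans delay the response.
print(round(equilibrium_warming(2.4, 320, 400), 2))  # ~0.77
```

The gap between this equilibrium figure and Sawyer's 0.6 °C forecast reflects the fact that he was estimating the warming realised by 2000, not the eventual equilibrium response.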
Prof Wally Broecker published some of the earliest projections of future temperatures due to global warming, in an academic journal in 1975. He used an energy balance model to forecast what would happen to global temperatures if atmospheric carbon dioxide continued to rise after 1975. His forecast stayed close to observations for several decades but has deviated considerably in recent years, mainly because he overestimated how carbon dioxide emissions and atmospheric concentrations would rise. The forecast was fairly accurate up to the year 2000, for which he had projected a carbon dioxide concentration of 373 ppm compared with the 370 ppm observed. He also estimated that carbon dioxide would reach 424 ppm in 2016, but only 404 ppm was actually observed. His model did not take other greenhouse gases into consideration, although these would not have made a great difference. Like Sawyer, he used an equilibrium climate sensitivity of 2.4 °C per doubling of atmospheric carbon dioxide. He assumed that the globe warms immediately in step with atmospheric carbon dioxide; modern models do not make this assumption and instead account for the lag between how the oceans and the atmosphere warm up, since the oceans take up heat more slowly, a property known as the thermal inertia of the climate system. His projections were made in an era when many forecasters were of the opinion that observations indicated a slow cooling of the earth, and Broecker, a scientist of repute, put a strong case for his model (Barron, 1995).
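A minimal sketch of a Broecker-style projection is shown below: carbon dioxide compounds at a fixed annual growth rate, and under his no-lag assumption warming simply tracks the concentration curve. The growth rate, baseline, and starting concentration are illustrative assumptions chosen to land near the 373 ppm figure quoted above, not Broecker's actual inputs.

```python
import math

def co2_projection(start_ppm, annual_growth, years):
    """CO2 concentrations under compound annual growth."""
    return [start_ppm * (1 + annual_growth) ** t for t in range(years + 1)]

def no_lag_warming(co2_path, baseline_ppm, sensitivity_c=2.4):
    """Broecker-style response: warming tracks CO2 with no ocean lag."""
    return [sensitivity_c * math.log2(c / baseline_ppm) for c in co2_path]

# Illustrative: ~330 ppm in 1975 growing ~0.5% a year reaches
# roughly 374 ppm by 2000, close to the 373 ppm projection above.
path = co2_projection(330, 0.005, 25)
warming = no_lag_warming(path, baseline_ppm=280)
print(round(path[-1], 1), round(warming[-1] - warming[0], 2))  # ppm, C
```

A modern model would damp this warming with ocean heat uptake, which is exactly the refinement Hansen introduced next.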
In 1981, Dr. James Hansen of NASA and his colleagues made climate change projections using an energy balance model, published in an academic paper. Their model differed from Broecker's in that it accounted for the thermal inertia caused by ocean heat uptake. They used a climate sensitivity of 2.8 °C per doubling of carbon dioxide and also considered a range of 1.4-5.6 °C per doubling. They introduced a number of scenarios combining different future emissions and climate sensitivities: a fast-growth scenario in which carbon dioxide emissions increased at 4% every year after 1981, and a slow-growth scenario in which emissions increased at 2% every year. The fast-growth scenario overestimates current emissions, but when combined with the other scenario it provides an estimate similar to the warming observed in the year 2000. Their overall rate of warming for the period 1970-2016 in the fast-growth scenario was about 20% lower than what was observed over that period.

In 1988, Hansen and colleagues published another paper on a new model that is regarded as among the first modern climate change models. This model divided the globe into grid cells of ten degrees longitude by eight degrees latitude, with nine vertical layers representing the atmosphere, and it included cloud dynamics, aerosols, carbon dioxide, and several other greenhouse gases. This time they had three scenarios with different future greenhouse gas emissions. The first scenario showed exponential growth in emissions, with carbon dioxide and other GHG concentrations projected to be considerably higher than today's. The second scenario represented a slowdown in carbon dioxide emissions, with a concentration of 401 ppm by the year 2016, close to the observed 404 ppm; this scenario, however, projected continuous growth in emissions of several halocarbons, which are potent greenhouse gases. The third scenario had emissions nearing zero after 2000. The second scenario came closest to the observed radiative forcing, although it was about 10% higher than the actual value. The model had a climate sensitivity of 4.2 °C per doubling of carbon dioxide, and as a result of these combined factors, the second scenario's warming projection was about 30% higher than what was actually observed over the stated period.
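Hansen's key refinement over Broecker, ocean thermal inertia, can be sketched with a one-box energy balance model in which an effective heat capacity delays the response to forcing. The heat capacity, the linear forcing ramp, and the 3.7 W m^-2 doubling forcing below are illustrative textbook values, not the configuration of Hansen's 1981 or 1988 models.

```python
def one_box_response(forcings, sensitivity_c=2.8, heat_capacity=8.0):
    """Transient warming from a one-box energy balance model.

    Integrates C * dT/dt = F(t) - (F_2x / S) * T with yearly Euler
    steps; the heat-capacity term is what delays warming relative
    to the equilibrium (no-lag) response.
    """
    f2x = 3.7                      # W m^-2 forcing for doubled CO2
    lam = f2x / sensitivity_c      # feedback parameter, W m^-2 K^-1
    temp, temps = 0.0, []
    for f in forcings:
        temp += (f - lam * temp) / heat_capacity
        temps.append(temp)
    return temps

# A linear forcing ramp reaching 2 W m^-2 after 60 years.
ramp = [2.0 * (year + 1) / 60 for year in range(60)]
transient = one_box_response(ramp)[-1]
equilibrium = 2.8 * ramp[-1] / 3.7  # the no-lag, Broecker-style answer
print(round(transient, 2), "C transient vs", round(equilibrium, 2), "C at equilibrium")
```

The transient figure runs a few tenths of a degree behind equilibrium; that lag is the thermal inertia of the climate system described above.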
In 1990, the IPCC First Assessment Report (FAR) used a simple energy balance/diffusion ocean model to project changes in global air temperatures. Its scenario assumed rapid atmospheric carbon dioxide growth, reaching 418 ppm by 2016, whereas the observed concentration stood at 404 ppm. The FAR also assumed that atmospheric halocarbon concentrations would continue to grow faster than before. Its best estimate of climate sensitivity was 2.5 °C of warming per doubling of carbon dioxide, with a range of about 1.5-4.5 °C. This best estimate was lower than the roughly 3 °C used today. The rate of warming between 1970 and 2016 was overestimated by about 17%, mainly because the projected atmospheric carbon dioxide concentrations were much higher than what actually occurred.
The Second Assessment Report (SAR) by the IPCC was published in 1995 and only applied to projections from 1990 onwards. Climate sensitivity was again pegged at 2.5 °C, with a range of between 1.5 and 4.5 °C. Its mid-range scenario projected carbon dioxide levels of 405 ppm in 2016, nearly matching the observed concentrations. The SAR model was much improved and included anthropogenic aerosols, which have a cooling effect on climate. The SAR projections were about 28% lower than actual observations for the period between 1990 and 2016. This was mainly caused by its climate sensitivity of 2.5 °C being lower than the roughly 3 °C of modern estimates, as well as an overestimation of the radiative forcing of carbon dioxide.
The Third Assessment Report (TAR) by the IPCC was published in 2001 and relied heavily on atmosphere-ocean general circulation models (GCMs) from about seven modeling groups. It introduced a new set of socioeconomic emission scenarios (the SRES), which began in the year 2000, with emissions and warming before that date based on historical trajectories. One scenario projected a 2016 atmospheric carbon dioxide concentration of 406 ppm, very close to what was actually observed. A simple climate model, tuned to fit the mean output of several GCMs, was used for the headline projections. It had a climate sensitivity of 2.8 °C per doubling of carbon dioxide, with a range of between 1.5 and 4.5 °C. The TAR's warming rate, however, was about 14% lower than what was observed.
The Fourth Assessment Report (AR4), published in 2007, featured models with considerably improved atmospheric dynamics and model resolution. AR4 made full use of Earth System Models that integrate carbon cycles, and it showed improvements in the simulation of land surface and ice processes. It used the same SRES scenarios as the TAR, with historical atmospheric concentrations and emissions up to the year 2000. The models used in this projection had a mean climate sensitivity of 3.26 °C, with a range of between 2.1 °C and 4.4 °C (Hickman, 2018).
The Fifth Assessment Report (AR5) by the IPCC was published in 2013 and featured a number of refinements to the models, including better treatment of future model uncertainty compared with AR4. Its projections drew on the Coupled Model Intercomparison Project 5 (CMIP5), in which modeling groups from all over the world ran climate models using the same scenarios and inputs. AR5 introduced a new group of greenhouse gas concentration scenarios known as Representative Concentration Pathways, representing future projections from the year 2006 onwards, with historical data before that year. Comparisons between the models and observations are complicated, however, because the models report global surface air temperature while the observed records combine surface air temperatures over land with sea surface temperatures over the ocean. To compensate for this, researchers have created blended model fields that combine these variables so that what is modeled matches what is actually measured. In CMIP5 models, global surface air temperatures have warmed about 16% faster than observations since 1970. A large chunk of this difference, about 40%, is caused by air temperatures over the ocean warming faster than the sea surface itself. The blended model fields show warming only about 9% faster than observations. Iselin Medhaug and colleagues published a paper in Nature suggesting that the remaining divergence can be accounted for by short-term natural variability, lower-than-expected solar output, and small volcanic eruptions that were not featured in the post-2005 projections (Hausfather, 2017).
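The blending step can be illustrated with a toy example: model surface air temperature is kept over land, sea surface temperature is kept over the ocean, and the two are combined with a land mask before comparison with observations. The variable names follow common CMIP conventions (tas, tos), but the grid, the 30% land fraction, and the anomaly values are fabricated for illustration; a real comparison would also weight grid cells by area.

```python
import numpy as np

def blend_fields(tas, tos, land_frac):
    """Blend air temperature (tas) over land with sea surface
    temperature (tos) over ocean, mimicking how observational
    records are actually assembled."""
    return land_frac * tas + (1.0 - land_frac) * tos

# Toy anomaly fields on an 18x36 grid: air over the ocean has
# warmed slightly faster than the sea surface beneath it.
tas = np.full((18, 36), 1.00)                        # surface air anomaly, C
tos = np.full((18, 36), 0.90)                        # sea surface anomaly, C
land = (np.random.rand(18, 36) < 0.3).astype(float)  # ~30% land mask
blended = blend_fields(tas, tos, land)
print(tas.mean(), round(blended.mean(), 2))  # blended field warms less
```

Because the ocean covers most of the grid, the blended mean sits closer to the cooler sea surface value, which is why blended model fields track observations more closely than raw air temperatures do.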
Conclusion
Despite the different types of climate change forecasts and projections, it is important to note that the climate is indeed changing. Increased anthropogenic activity and the emission of greenhouse gases have an impact on climate systems, and this impact can produce negative changes such as rising ocean temperatures, melting ice, rising surface temperatures, and an increased chance of flooding. However, the variety of climate change forecasts and projections has made it hard for policy makers and communities to plan for the future. It is therefore important that scientists improve their forecasting methods by devising universally acceptable climate change models to support decision making, planning, and efforts to mitigate the dangerous effects of climate change. Most climate models have provided fair forecasting and projection abilities, albeit with a certain degree of variability: some have projected less warming than has been observed, while others have projected more. These discrepancies need to be bridged by developing climate models that do not show considerable variability among themselves. The different climate forecasting models are far from perfect and should continue to be improved over time.
References
Barron, E. J. (1995). Climate models: How reliable are their predictions? National Emergency Training Center.
Hausfather, Z. (2017). Analysis: How well have climate models projected global warming? Carbon Brief. Retrieved from: https://www.carbonbrief.org/analysis-how-well-have-climate-models-projected-global-warming
Hickman, L. (2018). Timeline: The history of climate modelling. Carbon Brief. Retrieved from: https://www.carbonbrief.org/timeline-history-climate-modelling
Lemos, M. C., & Rood, R. B. (2010). Climate projections and their impact on policy and practice. Wiley Interdisciplinary Reviews: Climate Change, 1(5), 670-682.