The amount of predicted global warming varies from one computer climate model to another. What probably causes this variation?
I'll give a few illustrations of why they vary. There are many, many other reasons. The bottom line, though, is that there is a lot of continuing refinement going on all the time. There is no single font or oracle of perfect knowledge.
Temperature data isn't homogeneous, either spatially or in time. At sea, for example, measurements are not taken at nice, cleanly spaced locations or at nice, cleanly timed moments. Satellites may not cover the entire atmosphere, and instead have significant "blind spots." Or satellite sensors may make measurements that are confounded by different emissions at different altitudes, or that need different calibration adjustments and error bounds at different latitudes. Global climate models are simpler when they have a nice, simple grid and regular sampling. There are groups that do little else but process sporadic data, turning it into regularized datasets that are easier for the models to work with. This is loosely called renormalization. Some models can "cope with" sporadic datasets and have their own methods for doing so. No matter what, though, the models do not incorporate all of the data sets. There are datasets they cannot and do not process. Some variation results from that, too.
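To make that concrete, here is a minimal sketch of the simplest possible regularization step: averaging sporadic point observations into regular grid cells. The function name, cell size, and sample values are all my own illustrative choices, not any group's actual algorithm (real regridding also handles time binning, quality control, and uncertainty weighting).

```python
from collections import defaultdict

def regrid(observations, cell_deg=5.0):
    """Average sporadic (lat, lon, temp) observations into regular grid cells.

    Each observation is dropped into the 5-degree box containing it, and
    every box reports the mean of whatever measurements landed there.
    """
    sums = defaultdict(lambda: [0.0, 0])  # cell -> [running sum, count]
    for lat, lon, temp in observations:
        cell = (int(lat // cell_deg), int(lon // cell_deg))
        sums[cell][0] += temp
        sums[cell][1] += 1
    return {cell: total / count for cell, (total, count) in sums.items()}

# Three ship readings: two fall in the same box and get averaged.
obs = [(10.2, 40.1, 27.5), (11.8, 41.9, 27.9), (52.3, -10.0, 14.1)]
print(regrid(obs))  # two grid cells: one averaged, one with a single reading
```

Even in this toy version you can see where variation creeps in: a different cell size, a different averaging rule, or a different way of handling empty boxes yields a different "regularized" dataset from the same raw observations.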
Temperature measurements are composed from a variety of sensors, with a variety of error sources associated with them. Different groups may apply different corrections. For example, Dr. Spencer and Dr. Christy (University of Alabama in Huntsville, aka UAH) publish a dataset of MSU T2LT data from satellites. Remote Sensing Systems (RSS) took on the task of discovering and publishing errors in the UAH processing algorithms and correcting them. Now, both UAH and RSS separately publish global datasets from the same MSU T2LT satellite sensor. (There were other groups publishing data as well, though they've since stopped.) So there are two datasets for the same sensors and periods from two different sources. RSS maintains theirs is better. UAH says the same. And that's just one sensor system. There are hundreds of different dimensions of data, thousands of sources of data, and many, many groups researching sensor errors, correction, and renormalization.
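A tiny sketch of how two groups can end up with different datasets from the same raw sensor readings: each applies its own hypothesized drift and calibration corrections. The numbers and correction form here are made up for illustration; they are not the actual UAH or RSS algorithms, which are far more involved.

```python
def apply_corrections(raw, drift_per_step, offset):
    """Correct a raw anomaly series for assumed sensor drift and calibration bias.

    drift_per_step: estimated spurious warming per time step (e.g. orbital decay)
    offset: estimated constant calibration bias
    """
    return [t - drift_per_step * i + offset for i, t in enumerate(raw)]

raw = [0.10, 0.12, 0.15, 0.13, 0.18]            # same raw anomalies (degC), both groups
group_a = apply_corrections(raw, 0.005, 0.00)   # group A's drift/bias estimates
group_b = apply_corrections(raw, 0.002, 0.01)   # group B's drift/bias estimates

# Same instrument, different published series and different apparent trends:
print(group_a[-1] - group_a[0], group_b[-1] - group_b[0])
```

Both series are defensible given each group's error analysis, which is exactly why the two can look at the same sensor and disagree about the trend.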
Some global climate models are highly experimental and in the research stage -- not ready for "prime time," but important to explore because they may uncover something important. (Models exploring medium-term projections on the order of a year or two, for example, combine weather and climate and are still in exploratory stages.) Models are in various stages of incorporating best practices, as well. Some investigations don't require complex models and can afford to use simpler models to explore ideas more quickly, or over long simulation periods that would otherwise be impossible if more complex and more detailed model systems were applied. Different horses for different courses.
And we don't have enough compute power, even counting every single computer on the planet, to process everything that is known. Cloud physics has advanced a great deal, but the best it has been able to do is make predictions over small geographic areas of a few dozen square kilometers and over a few weeks' time. Even then, it took the best supercomputers available to come close to making those predictions in real time. And these were not coupled global models... just a bare physics-based cloud modeling system. As a result, various levels of cloud parameterization are used so that the models can produce results at all. These parameterizations vary to some degree, too.