


It’s all about I/O

In previous posts I've been talking about data dissemination and how crucial post-processing and exporting the data are. Producing a huge amount of data is a useless effort if it can't be refined into and delivered as information. But a weather model is not the source of all information. Weather models are based on many kinds of observations of the atmosphere. Before the model is run, those observations need to be processed and read in, and that is far from a trivial task.

A few months ago there was a buzz about Panasonic’s new weather model1. The TV maker has been developing its own weather model alongside its TAMDAR2 (Tropospheric Airborne Meteorological Data Reporting) services, a weather observation system attached to airplanes. While providing weather services for airlines, the company gathers a large amount of data useful for weather forecasting. Instead of selling all of that data away, Panasonic took the GFS3 weather model and developed its own version to take full advantage of the observations it had.

It is the first time a private company has been able to develop and run a global weather model that can compete with governments’ models. I don’t find the news itself very surprising. Creating a well-functioning weather model core certainly requires a huge amount of research, but the physics is commonly known. Many global weather models are also open source, which gives private companies a good starting point for further development.

But there is another notable aspect in the article. One of the main advantages Panasonic says it has is the large amount of TAMDAR observations. Still, it’s not that governments’ modellers haven’t had enough observations available; they just haven’t been able to use them. Weather forecasting has been a driving force in IT development for decades. It created the Global Telecommunication System (GTS)4 for sharing weather observations before the World Wide Web5 existed. Today, the most powerful computers are used for weather and climate modelling6. But a long and glorious history has its disadvantages as well. Models are typically written in Fortran7, which has great mathematical power but lacks modern data I/O capabilities. Model developers are traditionally physicists, not IT specialists, with more interest and skill in a model’s scientific performance than in creating the most sophisticated data flow. Public sector weather modellers are also typically running and developing an operational system with a large amount of legacy, which slows down development.
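To give a flavour of what “reading observations in” means in practice, here is a minimal Python sketch that parses a simplified, fixed-width surface observation record. The field layout, the record and the names (SurfaceObservation, parse_record) are invented purely for illustration; real observation traffic on the GTS uses formats such as SYNOP and BUFR, and decoding those is considerably more involved.

from dataclasses import dataclass

@dataclass
class SurfaceObservation:
    station_id: str
    latitude: float       # degrees north
    longitude: float      # degrees east
    temperature_k: float  # air temperature in kelvin
    pressure_hpa: float   # station pressure in hectopascals

def parse_record(line: str) -> SurfaceObservation:
    # Column boundaries below are purely illustrative, not any real format.
    return SurfaceObservation(
        station_id=line[0:5].strip(),
        latitude=float(line[5:12]),
        longitude=float(line[12:20]),
        temperature_k=float(line[20:27]),
        pressure_hpa=float(line[27:34]),
    )

# One record in the invented layout: station 02974, 60.32N 24.96E, 281.4 K, 1002.3 hPa
record = "02974  60.32   24.96  281.4 1002.3"
print(parse_record(record))

Multiply this by dozens of observation types and formats, each with its own quality control, and the scale of the I/O problem starts to become visible.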

But regardless of the partly inevitable disadvantages that traditional public sector weather forecasters have, the article emphasises how crucial data assimilation (reading data in) is. Having a good data assimilation process8 is also one reason why the ECMWF model is better than GFS9. (These training course lecture notes10 give an idea of how complicated a data assimilation process can be.) Having the best available IT resources for creating fluent data I/O may also be one advantage that companies like Panasonic have.
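For readers unfamiliar with the term: at its core, data assimilation blends a model forecast (the background) with new observations, weighting each by how much it is trusted. A minimal sketch of a single analysis update in the optimal interpolation spirit, written in Python with NumPy, could look like the following; the matrices and numbers are made up for illustration and bear no relation to any operational system.

import numpy as np

# Background (prior) state from the previous forecast, e.g. temperature at 3 grid points.
x_b = np.array([280.0, 282.0, 285.0])

# Background error covariance B: how much we trust the forecast (illustrative values).
B = np.array([[1.0, 0.5, 0.1],
              [0.5, 1.0, 0.5],
              [0.1, 0.5, 1.0]])

# Observations y and the observation operator H mapping model space to observation space.
# Here two stations observe grid points 1 and 3 directly.
y = np.array([281.2, 284.1])
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])

# Observation error covariance R: how much we trust the observations.
R = np.diag([0.5, 0.5])

# Gain K = B H^T (H B H^T + R)^(-1)
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)

# Analysis: background corrected towards the observations, weighted by the gain.
x_a = x_b + K @ (y - H @ x_b)

print("background:", x_b)
print("analysis:  ", x_a)

In an operational system the state vector has hundreds of millions of elements, B cannot be stored explicitly and the problem is solved iteratively, which is exactly why the assimilation step is such a heavy piece of software and data engineering.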


1. http://arstechnica.com/science/2016/04/tv-maker-panasonic-says-it-has-developed-the-worlds-best-weather-model/

2. https://en.wikipedia.org/wiki/TAMDAR

3. https://www.ncdc.noaa.gov/data-access/model-data/model-datasets/global-forcast-system-gfs

4. https://www.wmo.int/pages/prog/www/TEM/index_en.html

5. https://en.wikipedia.org/wiki/World_Wide_Web

6. http://www.wired.com/2016/06/fastest-supercomputer-sunway-taihulight/

7. https://en.wikipedia.org/wiki/Fortran

8. http://www.ecmwf.int/en/research/data-assimilation

9. http://arstechnica.com/science/2016/03/the-european-forecast-model-already-kicking-americas-butt-just-improved/

10. http://www.ecmwf.int/sites/default/files/Data%20assimilation%20concepts%20and%20methods.pdf