The COSMO-Model: Operational Applications within the COSMO Consortium

Last updated: 23 Feb 2021

The COSMO-Model was the operational and research & development software of the COSMO Consortium until 2022, when ICON became the new recommended and supported model. This page gives a short account of the operational use of the COSMO-Model until then.

Short History

The first operational application of the COSMO-Model was at DWD, where the model became operational on 1 December 1999. At that time it was still called LM, for Local Model. This first application ran on a domain covering Central Europe with 325 x 325 x 32 grid points and a resolution of 0.0625 degree (about 7 km). This resolution is still used by all partners of the Consortium for Small-Scale Modelling except DWD and MeteoSwiss: at DWD this application was replaced by a nest of the global model ICON (called ICON-EU) in December 2016, and MeteoSwiss switched to a full ensemble prediction system at higher resolutions in 2020.
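For reference, the rough conversion from grid spacing in degrees to kilometres quoted here and in the tables below can be checked in a few lines of Python; the mean Earth radius of 6371 km used below is an assumed value for a back-of-the-envelope estimate, not a quantity taken from the model itself.

```python
import math

EARTH_RADIUS_KM = 6371.0  # assumed mean Earth radius for a rough estimate

def grid_spacing_km(delta_deg: float) -> float:
    """Approximate great-circle grid spacing in km for a spacing given in degrees."""
    return math.radians(delta_deg) * EARTH_RADIUS_KM

# Resolutions quoted on this page
for delta in (0.0625, 0.045, 0.025, 0.02, 0.01):
    print(f"{delta:6.4f} deg  ->  {grid_spacing_km(delta):4.1f} km")
# 0.0625 deg -> 7.0 km, 0.0250 deg -> 2.8 km, 0.0200 deg -> 2.2 km, 0.0100 deg -> 1.1 km
```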

In 2003, work started on developing and implementing a version of the COSMO-Model capable of running at very high resolutions, down to the convection-resolving scale of 1-3 km. The main development required for this was the Runge-Kutta dynamical core. The boundary conditions for these high-resolution runs are derived from forecasts of the coarser-grid COSMO applications. Since April 2007, COSMO-DE has been running at DWD with a resolution of 2.8 km; other centres implemented applications with similar resolutions in the following years.
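The basic time-stepping idea behind such a Runge-Kutta core can be illustrated with the three-stage scheme of Wicker-Skamarock type that is widely used in split-explicit NWP dynamics; the sketch below shows only that skeleton and leaves out the acoustic sub-stepping and advection operators of a real core, and the function name rk3_step is purely illustrative.

```python
import numpy as np

def rk3_step(x: np.ndarray, f, dt: float) -> np.ndarray:
    """One three-stage Runge-Kutta step of Wicker-Skamarock type.

    x  : model state vector
    f  : function returning the tendency dx/dt for a given state
    dt : time step in seconds
    """
    x1 = x + dt / 3.0 * f(x)   # first stage: one third of the step
    x2 = x + dt / 2.0 * f(x1)  # second stage: half step from the original state
    return x + dt * f(x2)      # final stage: full step

# Toy use on a simple decay equation dx/dt = -0.01 * x with a 20 s step
state = np.array([1.0])
for _ in range(10):
    state = rk3_step(state, lambda x: -0.01 * x, dt=20.0)
```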

The next step in the further development of NWP applications was the introduction of convection-permitting ensemble systems to represent uncertainties in the forecast process. Based on the very-high-resolution application, DWD developed COSMO-DE-EPS, which has been running operationally since May 2012.

Even before convection-permitting ensembles were developed, ARPAE-SIMC installed COSMO-LEPS (Limited-area Ensemble Prediction System), based on the COSMO-Model at 7 km resolution and on ECMWF ensemble forecasts, at the ECMWF computing centre. Sixteen operational model runs are performed (7 km grid spacing, 40 levels, forecast range of 132 hours), starting at 12 UTC from initial and boundary conditions of 16 representative members of an ECMWF-EPS super-ensemble.
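The idea of selecting a small number of representative members from a larger ensemble can be sketched with a cluster analysis. The snippet below uses k-means from scikit-learn purely as an illustration; the operational COSMO-LEPS selection relies on its own clustering procedure applied to ECMWF ensemble fields, and the array shapes and names here are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def representative_members(ens_fields: np.ndarray, n_clusters: int = 16) -> np.ndarray:
    """Return the index of one representative member per cluster.

    ens_fields : array of shape (n_members, n_gridpoints), e.g. flattened
                 upper-air fields of all super-ensemble members.
    """
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(ens_fields)
    reps = []
    for c in range(n_clusters):
        members = np.flatnonzero(km.labels_ == c)
        # pick the member closest to the cluster centroid as its representative
        dists = np.linalg.norm(ens_fields[members] - km.cluster_centers_[c], axis=1)
        reps.append(members[np.argmin(dists)])
    return np.array(reps)

# Toy example: 51 members described by 1000 grid-point values each
fields = np.random.rand(51, 1000)
print(representative_members(fields, n_clusters=16))
```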

From 2010 on, several national weather services that had been using DWD's former hydrostatic model HRM migrated to the COSMO-Model. They can use the COSMO-Model for a yearly license fee of 20,000 €; for countries classified as lower-middle-income economies in the World Bank list, the license fee is waived.

Initial and Boundary Conditions

Boundary conditions for operational runs at moderate resolutions were derived from the forecasts of a global model, e.g. ICON at DWD; IFS data from ECMWF could also be used for boundary conditions. The forecasts from the moderate-resolution applications could then be used to derive boundary conditions for the higher-resolution applications.
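In spirit, deriving lateral boundary data comes down to interpolating the driving model's fields onto the limited-area grid. The sketch below shows only a horizontal bilinear interpolation with SciPy and ignores rotated coordinates, vertical interpolation and all consistency checks of the real preprocessing; the grid extents and the field used here are placeholders.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Placeholder coarse global field (e.g. temperature) on a regular 0.5 degree grid
coarse_lat = np.arange(-90.0, 90.5, 0.5)
coarse_lon = np.arange(0.0, 360.0, 0.5)
coarse_field = np.random.rand(coarse_lat.size, coarse_lon.size)

# Placeholder limited-area grid with 0.0625 degree spacing over Central Europe
fine_lat = np.arange(42.0, 58.0, 0.0625)
fine_lon = np.arange(2.0, 22.0, 0.0625)

interp = RegularGridInterpolator((coarse_lat, coarse_lon), coarse_field, method="linear")
points = np.array(np.meshgrid(fine_lat, fine_lon, indexing="ij")).reshape(2, -1).T
lbc_field = interp(points).reshape(fine_lat.size, fine_lon.size)
print(lbc_field.shape)  # (256, 320)
```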

Initial conditions could also be derived from interpolated global analyses. When doing this, it is possible to smooth the initial fields using the digital filtering scheme of Lynch et al. (1997).
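The core of such a digital filtering initialization is a weighted time average of model states around the initial time that damps high-frequency gravity-wave noise. The following sketch illustrates only that averaging step with a generic windowed-sinc low-pass filter; the window choice, filter span and function names are assumptions for illustration, not the actual settings of the Lynch et al. (1997) scheme.

```python
import numpy as np

def dfi_weights(n_steps: int, dt_s: float, cutoff_period_s: float) -> np.ndarray:
    """Low-pass weights for 2*n_steps+1 model states centred on the initial time
    (windowed-sinc filter; purely illustrative)."""
    k = np.arange(-n_steps, n_steps + 1)
    omega_c = 2.0 * np.pi / cutoff_period_s      # cutoff angular frequency
    h = np.sinc(omega_c * k * dt_s / np.pi)      # ideal low-pass response
    window = np.sinc(k / (n_steps + 1))          # Lanczos window to reduce ringing
    w = h * window
    return w / w.sum()                           # preserve the mean of the states

def filter_initial_state(states: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Weighted average over the time dimension (axis 0) of a stack of model states."""
    return np.tensordot(weights, states, axes=(0, 0))

# Toy example: 31 states at 60 s spacing, damping oscillations faster than 1 hour
w = dfi_weights(n_steps=15, dt_s=60.0, cutoff_period_s=3600.0)
states = np.random.rand(31, 100, 100)            # (time, y, x) placeholder fields
x0_filtered = filter_initial_state(states, w)
```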

A better choice for producing initial conditions for the COSMO-Model was to run a data assimilation system. At DWD a comprehensive assimilation system for the model was installed, comprising the analysis of the atmospheric fields, a sea surface temperature (SST) analysis and a snow depth analysis. For the 7 km application COSMO-EU, a soil moisture analysis according to Hess (2001) had also been implemented, but it was not used for the higher-resolution application COSMO-DE.

From the beginning of the COSMO-Model, the assimilation of the atmospheric fields was done with the nudging technique. In recent years an ensemble-based approach has been developed in the COSMO Priority Project KENDA (km-scale ensemble data assimilation), and since April 2017 DWD has been using the KENDA-based assimilation system.
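Nudging adds a Newtonian relaxation term that pulls the model state towards observations during the forward integration. A minimal sketch of a single explicit step is given below; the nudging coefficient, weights and numbers are illustrative assumptions rather than operational settings.

```python
import numpy as np

def nudging_step(x, model_tendency, obs_increment, obs_weight, g_nudge, dt):
    """One explicit time step with Newtonian relaxation towards observations.

    x              : model field at the current time step
    model_tendency : tendency from dynamics and physics, same shape as x
    obs_increment  : observation-minus-model values spread to the grid
    obs_weight     : spatio-temporal observation weights in [0, 1]
    g_nudge        : nudging coefficient in 1/s (illustrative value below)
    dt             : time step in s
    """
    return x + dt * (model_tendency + g_nudge * obs_weight * obs_increment)

# Toy example with made-up numbers (not operational settings)
temp = np.array([280.0, 281.5])          # K
tend = np.array([1.0e-4, -2.0e-4])       # K/s from the model
incr = np.array([0.8, -0.5])             # obs minus model, K
wgt = np.array([1.0, 0.5])
temp_new = nudging_step(temp, tend, incr, wgt, g_nudge=6.0e-4, dt=60.0)
```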

Applications within the Consortium

The following Table 1 gives a brief overview of the configurations and the major specifications of the operational coarse-grid systems in the COSMO centres. Through the links in the headings you can get a more detailed description of the different configurations, including a detailed NAMELIST setting of each application.

Please note that DWD and MeteoSwiss do not run a 7 km application any more.

Table 2 gives a brief overview of the major specifications of the very-high-resolution setups.

Table 3 provides the major specifications of the ensemble systems. The perturbed parameters and their values are listed in a separate PDF document.

Table 1: Major specifications of COSMO-Model coarse grid applications

Specification | ARPAE-SIMC | HNMS | IMWM-NRI | NMA | Roshydromet | ITAF-Met | LEPS
Domain size (grid points) | 1083 x 559 | 1001 x 601 | 415 x 460 | 201 x 177 | 2000 x 1000 | 1083 x 559 | 511 x 415
Hor. grid spacing (degree/km) | 0.045 / 5 | 0.04 / 4 | 0.0625 / 7 | 0.0625 / 7 | 0.06 / 6.6 | 0.045 / 5 | 0.0625 / 7
Number of layers | 45 | 80 | 40 | 40 | 40 | 45 | 40
Time step (s) | 45 | 30 | 60 | 66 | 40 | 45 | 66
Forecast range (h) | 72 | 72 | 84 | 78 | 120, 78, 120, 78 | 72 | 132
Initial times of model runs (UTC) | 00, 12 | 00, 12 | 00, 06, 12, 18 | 00, 06, 12, 18 | 00, 06, 12, 18 | 00, 06, 12, 18 | 00, 12
Lateral boundary conditions | IFS | IFS | ICON | ICON | ICON | IFS | IFS-ENS members
LBC update frequency (h) | 3 | 3 | 3 | 3 | 3 | 3 | 3
Initial state | CNMCA-LETKF | IFS | DAC/ICON | ICON | ICON | CNMCA-LETKF | IFS-ENS members
External analysis | SST, snow cover mask | None | None | SYNOP | None | SST, snow cover mask | Soil fields from ICON-EU
COSMO version | 5.05 | 5.04e | 5.01 | 5.03 | 5.05 | 5.06 | 5.03
Hardware | Lenovo Broadwell Intel cluster | Cray XC30 | Intel & HP based cluster | IBM cluster | Cray XC40-LC | Hybrid CPU/GPU cluster (Intel CPU, NVIDIA GPU) | Cray XC (at ECMWF)
No. of processors used | 704 (22 nodes) | 1260 (35 nodes) | 140 | 56 (of 112) | 1946 (of 35136) | 576 (24 nodes) | 720 tasks (on 20 nodes)
Table 2: Major specifications of COSMO-Model high resolution applications

Specification | DWD | IMWM-NRI | ITAF-Met | NMA | Roshydromet | IMS | ARPAE-SIMC
Domain size (grid points) | 651 x 716 | 380 x 405 | 576 x 701 | 361 x 291 | 1200 x 1400 | 561 x 401 | 576 x 701
Hor. grid spacing (degree/km) | 0.02 / 2.2 | 0.025 / 2.8 | 0.02 / 2.2 | 0.025 / 2.8 | 0.02 / 2.2 | 0.025 / 2.8 | 0.02 / 2.2
Number of layers | 65 | 50 | 65 | 50 | 50 | 50 | 65
Time step (s) | 20 | 20 | 20 | 25 | 20 | 20, 25 | 20
Forecast range (h) | 27 | 48 | 30, 48 | – | 48 | 24 | 48
Initial times of model runs (UTC) | 00, 03, 06, 09, 12, 15, 18, 21 | 00, 06, 12, 18 | 00, 06, 12, 18 | 00, 06, 12, 18 | 00, 06, 12, 18 | 00, 12, hourly when rainy | 00, 03, 06, 09, 12, 15, 18, 21
Lateral boundary conditions | ICON-EU | COSMO-PL7 | COSMO-ME | COSMO-Ro2 | COSMO-Ru6ENA | IFS | COSMO-5km
LBC update frequency (h) | 1 | 1 | 1 | 1 | 1 | 1 | 1
Initial state | KENDA + LHN | COSMO-PL7 | KENDA-LETKF | SYNOP/Radar | COSMO-Ru6ENA + Nudging + LHN | IFS + Nudging + LHN | KENDA-LETKF
External analysis | SST, snow depth | None | SST | None | None | SST | SST
COSMO version | 5.05b_1 | 5.01 | 5.06 | 4.18 | 5.05 | 5.05 | 5.05
Hardware | Cray XC40 | Intel & HP based cluster | Hybrid CPU/GPU cluster (Intel CPU, NVIDIA GPU) | IBM cluster | Cray XC40-LC | SGI Altix | Lenovo Broadwell Intel cluster
No. of cores used | 43x54+6 = 2328 (of 29952) | 160 | 576 (on 24 nodes) | 90 (of 144) | 2880 (of 35136) | 576 (of 1408) | 256 (8 nodes)
Table 3: Major specifications of ensemble applications of the COSMO-Model (perturbed parameters)

Specification | ICON-D2-EPS (DWD) | COSMO-1E (MeteoSwiss) | COSMO-2E (MeteoSwiss) | TLE-MVE (IMWM-NRI) | COSMO-2I-EPS (Arpae-SIMC, pre-operational) | COSMO-ME-EPS (ITAF-Met) | COSMO-IT-EPS (ITAF-Met) | COSMO-LEPS (Arpae-SIMC)
Domain size (grid points) | R19B07, 542040 cells | 1170 x 786 | 582 x 390 | 380 x 405 | 576 x 701 | 779 x 401 | 576 x 701 | 511 x 415
Number of members | 20 | 11 | 21 | 21 | 20 | 40 | 20 | 20
Hor. grid spacing (degree/km) | ~2.1 km | 0.01 / 1.1 | 0.02 / 2.2 | 0.025 / 2.8 | 0.02 / 2.2 | 0.0625 / 7 | 0.02 / 2.2 | 0.0625 / 7
Number of layers | 65 | 80 | 60 | 50 | 65 | 45 | 65 | 40
Time step (s) | 20 | 10 | 20 | 20 | 18 | 60 | 20 | 66
Forecast range (h) | 27 (45 for 03 UTC run) | 33 (45 for 03 UTC run) | 120 | 48 | 51 | 72 | 48 | 132
Initial times of model runs (UTC) | 00, 03, 06, 09, 12, 15, 18, 21 | 00, 03, 06, 09, 12, 15, 18, 21 | 00, 06, 12, 18 | 00, 06, 12, 18 | 21 | 00, 12 | 00, 12 | 00, 12
Lateral boundary conditions | ICON-EU-EPS | ECMWF HRES and ENS | ECMWF ENS | COSMO-PL7 (time-lagged) | COSMO-ME-EPS | IFS-ENS | COSMO-ME-EPS | IFS-ENS (time-lagged + cluster analysis)
LBC update frequency (h) | 1 | 1 | 3 | 1 | 3 | 3 | 3 | 3
Initial state | KENDA (+LHN) | KENDA 1.1 km (+LHN) | KENDA 1.1 km, upscaled to 2.2 km (+LHN) | COSMO-PL7 (time-lagged) | KENDA (+LHN) | CNMCA-LETKF | KENDA | IFS-ENS (time-lagged + cluster analysis) + soil from ICON-EU
External analysis | SST, snow height | SST (ECMWF HRES and ENS), snow depth | SST (ECMWF ENS), snow depth | None | None | None | None | None
Physics perturbation | Parameter perturbation (randomized draw from a selection of predefined perturbed values) | SPPT | SPPT | c_soil | None | SPPT | None | Parameter perturbation (randomized draw from a selection of predefined perturbed values)
COSMO version | - | 5.07+ (GPU/SP) | 5.07+ (GPU/SP) | 5.01 | 5.05 | 5.05b, 5.0+ (GPU) | 5.0+ (GPU) | 5.03 in single precision
Hardware | – | Cray CS-Storm | Cray CS-Storm | Intel & HP based cluster | Lenovo Broadwell Intel cluster | ECMWF Cray, HP | HP | ECMWF Cray
No. of cores used | – | 14 + 8 GPUs per member | 6 + 4 GPUs per member | 240 per member | 192 cores per member | 432 every 4 members, 24 + 4 GPUs per member | 24 + 8 GPUs per member | 720