Operational Applications within COSMO

Last updated: 3 Sep 2020

Contents

1. Introduction
2. Initial and Boundary Conditions
3. Applications within the Consortium
4. Licensees of the COSMO-Model

1. Introduction

The first operational application of the COSMO-Model was at DWD, where the model became operational on 1 December 1999. At that time it was still called LM, for Local Model. This first application ran on a domain covering Central Europe with a size of 325 x 325 x 32 grid points and a resolution of 0.0625 degrees (about 7 km). This resolution is nowadays used by all partners of the Consortium for Small-Scale Modelling; only at DWD has this application been replaced, in December 2016, by a nest of the global model ICON (called ICON-EU).
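
The kilometre values quoted throughout this page follow from the angular grid spacings by multiplying with the Earth's radius. A minimal Python check of this conversion, assuming a spherical Earth of radius 6371 km (the names in the snippet are ours, not COSMO code):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius; spherical approximation

def spacing_deg_to_km(spacing_deg):
    """Length of one grid interval along a meridian (great-circle arc)."""
    return EARTH_RADIUS_KM * math.radians(spacing_deg)

for deg in (0.0625, 0.045, 0.02, 0.01):
    print(f"{deg} deg ~ {spacing_deg_to_km(deg):.1f} km")
# prints ~6.9, 5.0, 2.2 and 1.1 km: the rounded values quoted on this page
```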

In 2003, work started on developing and implementing a version of the COSMO-Model capable of running at very high resolutions, down to the convection-resolving scale of 1-3 km. The main effort here was the development of the Runge-Kutta dynamical core. The boundary conditions for these high-resolution runs are derived from forecasts of the coarser-grid COSMO applications. Since April 2007, COSMO-DE has been running at DWD with a resolution of 2.8 km. Other centres implemented applications with similar resolutions in the following years; since March 2016, MeteoSwiss has been running an application with a resolution of 1.1 km.
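
At its heart, the Runge-Kutta core mentioned above is a three-stage time-integration scheme. Below is a conceptual Python sketch of a Wicker-Skamarock-type three-stage Runge-Kutta step, applied here to simple 1-D upwind advection. The operational dynamical core is far more elaborate (split-explicit treatment of sound waves, higher-order advection, full 3-D fields), so all names and values are illustrative only:

```python
import numpy as np

def tendency(q, u, dx):
    """Upwind finite-difference tendency for 1-D advection dq/dt = -u dq/dx (u > 0)."""
    return -u * (q - np.roll(q, 1)) / dx

def rk3_step(q, dt, u, dx):
    """Three-stage Runge-Kutta step of Wicker-Skamarock type:
    each stage restarts from q at time n with a growing fraction of dt."""
    q1 = q + dt / 3.0 * tendency(q, u, dx)
    q2 = q + dt / 2.0 * tendency(q1, u, dx)
    return q + dt * tendency(q2, u, dx)

# Toy setup, loosely mimicking a 2.2 km grid with a 20 s time step (cf. Table 2)
dx, dt, u = 2200.0, 20.0, 50.0  # metres, seconds, m/s (Courant number ~0.45)
q = np.exp(-((np.arange(200) - 50.0) ** 2) / 50.0)  # Gaussian tracer blob
for _ in range(100):
    q = rk3_step(q, dt, u, dx)
```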

The next step in the development of NWP applications was the introduction of convection-permitting ensemble systems to represent uncertainties in the forecast process. Based on its very high resolution application, DWD developed COSMO-DE-EPS, which has been running operationally since May 2012.

Even before convection-permitting ensembles were developed, ARPAE-SIMC installed COSMO-LEPS (Limited-area Ensemble Prediction System), based on the COSMO-Model at 7 km resolution and on ECMWF ensemble forecasts, at the ECMWF computing centre. Sixteen operational model runs are performed (7 km grid spacing, 40 levels, forecast range of 132 hours), starting at 12 UTC from initial and boundary conditions of 16 representative members of an ECMWF-EPS super-ensemble.
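
The 16 representative members are obtained from a cluster analysis of the ECMWF super-ensemble. As a purely conceptual illustration of that selection step, the sketch below clusters flattened ensemble states with k-means and keeps, for each cluster, the member closest to the centroid. The operational COSMO-LEPS procedure uses its own clustering variables and algorithm, so the function names and data here are hypothetical:

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def representative_members(states, n_rep=16, seed=0):
    """Cluster ensemble states (n_members x n_features) and return, per
    cluster, the index of the member closest to the cluster centroid."""
    centroids, labels = kmeans2(states, n_rep, minit="++", seed=seed)
    reps = []
    for c in range(n_rep):
        members = np.flatnonzero(labels == c)
        if members.size == 0:  # guard against an empty cluster
            continue
        dist = np.linalg.norm(states[members] - centroids[c], axis=1)
        reps.append(int(members[np.argmin(dist)]))
    return sorted(reps)

# 51 hypothetical global ensemble members, each flattened to 500 "grid" values
ens = np.random.default_rng(1).normal(size=(51, 500))
print(representative_members(ens))
```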

From 2010 on, several national weather services that had been using DWD's former hydrostatic model HRM migrated to the COSMO-Model. They can use the COSMO-Model for an annual license fee of €20,000. For countries classified as lower-middle-income economies in the World Bank list, the license fee is waived.

2. Initial and Boundary Conditions

Boundary conditions for operational runs at moderate resolutions are derived from forecasts of DWD's global model ICON. Only the subset of the ICON data covering the respective COSMO domain is provided by DWD via the Internet, which ensures efficient and timely transmission. Alternatively, global IFS data from ECMWF can be used as boundary conditions. The forecasts of the moderate-resolution applications can in turn be used to derive boundary conditions for the higher-resolution applications.
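
The subsetting can be pictured as cutting, from each global field, just the lat-lon box that covers the COSMO domain plus a margin for the subsequent interpolation. A minimal sketch of this idea with hypothetical names and a made-up global grid (the operational transfer works on GRIB data, not NumPy arrays):

```python
import numpy as np

def crop_to_domain(field, lats, lons, lat_bounds, lon_bounds, margin_deg=1.0):
    """Cut the sub-grid of a global lat-lon field that covers the target
    domain, with a margin so the interpolation to the COSMO grid has support."""
    la = (lats >= lat_bounds[0] - margin_deg) & (lats <= lat_bounds[1] + margin_deg)
    lo = (lons >= lon_bounds[0] - margin_deg) & (lons <= lon_bounds[1] + margin_deg)
    return field[np.ix_(la, lo)], lats[la], lons[lo]

# Hypothetical 0.125-degree global grid, cropped to a central-European box
lats = np.arange(-90.0, 90.001, 0.125)
lons = np.arange(-180.0, 180.0, 0.125)
field = np.random.default_rng(0).normal(size=(lats.size, lons.size))
sub, sublat, sublon = crop_to_domain(field, lats, lons, (43.0, 58.0), (0.0, 22.0))
print(sub.shape)  # a small fraction of the global array
```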

Initial conditions can also be derived from interpolated global analyses. In this case it is possible to smooth the initial fields using the digital filtering scheme of Lynch et al. (1997).
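
The scheme of Lynch et al. (1997) removes high-frequency noise by averaging a short sequence of model states around the initial time with Dolph-Chebyshev window weights. Below is a minimal sketch of that weighting step, assuming the sequence of states is already available from a short backward and forward integration spanning the initial time (a simplified view of one DFI variant):

```python
import numpy as np
from scipy.signal.windows import chebwin

def dfi_filter(states, attenuation_db=60.0):
    """Weighted average of 2N+1 model states centred on the initial time,
    using a Dolph-Chebyshev window as filter weights (cf. Lynch et al. 1997).
    states: array of shape (2N+1, ...) ordered in time."""
    w = chebwin(states.shape[0], at=attenuation_db)
    w = w / w.sum()
    return np.tensordot(w, states, axes=(0, 0))

# Toy example: a slowly varying signal plus high-frequency noise
t = np.arange(-20, 21)  # 41 "model states" around t = 0
states = np.sin(0.01 * t) + 0.5 * np.sin(2.5 * t)
print(states[20], dfi_filter(states))  # noisy initial value vs. filtered one
```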

A better way to produce initial conditions for the COSMO-Model is to run a data assimilation system. At DWD, a comprehensive assimilation system for the model has been installed, comprising an analysis of the atmospheric fields, a sea surface temperature (SST) analysis and a snow depth analysis. For the 7 km application COSMO-EU, a soil moisture analysis following Hess (2001) had also been implemented, but it is not used for the higher-resolution application COSMO-DE.

From the beginning of the COSMO-Model, the atmospheric fields were assimilated with the nudging technique. In recent years, however, an ensemble-based approach has been developed in the COSMO Priority Project KENDA (km-scale ensemble data assimilation), and since April 2017 DWD has been using the KENDA-based assimilation system.
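
Nudging adds a relaxation term proportional to the observation-minus-model increments to the model equations during the forward integration, whereas KENDA is built around a Local Ensemble Transform Kalman Filter (LETKF). The following is a minimal, un-localized sketch of the LETKF analysis step following Hunt et al. (2007); the operational KENDA adds observation localization, covariance inflation and much more, and the synthetic data here are purely illustrative:

```python
import numpy as np
from scipy.linalg import sqrtm

def letkf_analysis(Xb, y_obs, H, R, inflation=1.1):
    """One (un-localized) LETKF analysis step after Hunt et al. (2007).
    Xb: background ensemble (n_state, n_members); y_obs: observations (n_obs,)
    H: linear observation operator (n_obs, n_state); R: obs error covariance."""
    n, m = Xb.shape
    xb = Xb.mean(axis=1, keepdims=True)
    Xp = Xb - xb                       # background perturbations
    Yp = H @ Xp                        # perturbations in observation space
    C = Yp.T @ np.linalg.inv(R)
    Pa_tilde = np.linalg.inv((m - 1) / inflation * np.eye(m) + C @ Yp)
    w_mean = Pa_tilde @ C @ (y_obs[:, None] - H @ xb)   # mean update weights
    W_pert = np.real(sqrtm((m - 1) * Pa_tilde))         # symmetric square root
    return xb + Xp @ (w_mean + W_pert)                  # analysis ensemble

# Tiny synthetic test: 40 members, 100 state variables, 30 direct observations
rng = np.random.default_rng(2)
Xb = rng.normal(size=(100, 40))
H = np.eye(30, 100)
Xa = letkf_analysis(Xb, rng.normal(size=30), H, 0.5 * np.eye(30))
print(Xa.shape)  # (100, 40)
```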

3. Applications within the Consortium

Table 1 below gives a brief overview of the configurations and the major specifications of the operational coarse-grid systems in the COSMO centres. The links in the table headings (on the original web page) lead to more detailed descriptions of the individual configurations, including the complete NAMELIST settings; a schematic NAMELIST fragment is sketched below.
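
To give a flavour of such NAMELIST settings: domain size, grid spacing and run control are specified in Fortran NAMELIST groups read by the model. The fragment below is an illustrative sketch only, loosely based on the IMWM-NRI column of Table 1; the rotated-pole and corner coordinates and the date are invented placeholders, and the authoritative parameter list is the official COSMO namelist documentation:

```
! Illustrative sketch only: how Table 1 figures map onto NAMELIST groups
 &LMGRID
    ie_tot = 415, je_tot = 460, ke_tot = 40,     ! domain size and layers
    dlon = 0.0625, dlat = 0.0625,                ! grid spacing (deg), ~7 km
    pollat = 40.0, pollon = -170.0,              ! rotated north pole (placeholder)
    startlat_tot = -11.0, startlon_tot = -13.0,  ! lower-left corner (placeholder)
 /
 &RUNCTL
    dt = 60.0,                                   ! time step (s)
    hstart = 0.0, hstop = 84.0,                  ! forecast range (h)
    ydate_ini = '2020090300',                    ! initial date/time (placeholder)
 /
```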

Please note that DWD no longer runs a 7 km application.

Table 2 gives a corresponding overview of the major specifications of the very high resolution setups.

Table 3 provides the major specifications of the ensemble systems. The perturbed parameters and their values are listed in a separate document (PDF); a sketch of how such a randomized parameter draw can work is given after Table 3.

Table 1: Major specifications of COSMO-Model coarse grid applications

| | ARPAE-SIMC | HNMS | IMWM-NRI | MeteoSwiss | NMA | Roshydromet | ITAF-ReMet | LEPS |
|---|---|---|---|---|---|---|---|---|
| Domain size (grid points) | 1083 x 559 | 1001 x 601 | 415 x 460 | 393 x 338 | 201 x 177 | 2000 x 1000 | 1083 x 559 | 511 x 415 |
| Hor. grid spacing (degree / km) | 0.045 / 5 | 0.04 / 4 | 0.0625 / 7 | 0.06 / 6.6 | 0.0625 / 7 | 0.06 / 6.6 | 0.045 / 5 | 0.0625 / 7 |
| Number of layers | 45 | 80 | 40 | 60 | 40 | 40 | 45 | 40 |
| Time step (s) | 45 | 30 | 60 | 60 | 66 | 40 | 45 | 66 |
| Forecast range (h) | 72 | 72 | 84 | 72 | 78 | 120, 78, 120, 78 | 72 | 132 |
| Initial times of model runs (UTC) | 00, 12 | 00, 12 | 00, 06, 12, 18 | 00, 06, 12, 18 | 00, 06, 12, 18 | 00, 06, 12, 18 | 00, 06, 12, 18 | 00, 12 |
| Lateral boundary conditions | IFS | IFS | ICON | IFS | ICON | ICON | IFS | IFS-ENS members |
| LBC update frequency (h) | 3 | 3 | 3 | 1 | 3 | 3 | 3 | 3 |
| Initial state | COMET-LETKF | IFS | DAC/ICON | Nudging scheme | ICON | ICON | COMET-LETKF | IFS-ENS members |
| External analysis | SST, snow cover mask | None | None | SST (IFS), snow depth | SYNOP | None | SST, snow cover mask | Soil fields from ICON-EU |
| COSMO version | 5.05 | 5.04e | 5.01 | 5.0+ (GPU/SP) | 5.03 | 5.05 | 5.06 | 5.03 |
| Hardware | Lenovo Broadwell Intel cluster | Cray XC30 | Intel & HP based cluster | Cray CS-Storm | IBM cluster | Cray XC40-LC | Hybrid CPU/GPU cluster (Intel CPUs, NVIDIA GPUs) | Cray XC (at ECMWF) |
| No. of processors used | 704 (22 nodes) | 1260 (35 nodes) | 140 | 2 + 16 GPUs (of 192) | 56 (of 112) | 1946 (of 35136) | 576 (24 nodes) | 720 tasks (on 20 nodes) |

Table 2: Major specifications of COSMO-Model high resolution applications

| | DWD | IMWM-NRI | MeteoSwiss | ITAF-ReMet | NMA | Roshydromet | IMS | ARPAE-SIMC |
|---|---|---|---|---|---|---|---|---|
| Domain size (grid points) | 651 x 716 | 380 x 405 | 1158 x 774 | 576 x 701 | 361 x 291 | 1200 x 1400 | 561 x 401 | 576 x 701 |
| Hor. grid spacing (degree / km) | 0.02 / 2.2 | 0.025 / 2.8 | 0.01 / 1.1 | 0.02 / 2.2 | 0.025 / 2.8 | 0.02 / 2.2 | 0.025 / 2.8 | 0.02 / 2.2 |
| Number of layers | 65 | 50 | 80 | 65 | 50 | 50 | 50 | 65 |
| Time step (s) | 20 | 20 | 10 | 20 | 25 | 20 | 20, 25 | 20 |
| Forecast range (h) | 27 | 48 | 33, 45 | 30, 48 | | 48 | 24 | 48 |
| Initial times of model runs (UTC) | 00, 03, 06, 09, 12, 15, 18, 21 | 00, 06, 12, 18 | 00, 03, 06, 09, 12, 15, 18, 21 | 00, 06, 12, 18 | 00, 06, 12, 18 | 00, 06, 12, 18 | 00, 12 (hourly when rainy) | 00, 03, 06, 09, 12, 15, 18, 21 |
| Lateral boundary conditions | ICON-EU | COSMO-PL7 | IFS | COSMO-ME | COSMO-Ro2 | COSMO-Ru6ENA | IFS | COSMO-5km |
| LBC update frequency (h) | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
| Initial state | KENDA + LHN | COSMO-PL7 | Nudging + LHN | KENDA-LETKF | SYNOP/Radar | COSMO-Ru6ENA + Nudging + LHN | IFS + Nudging + LHN | KENDA-LETKF |
| External analysis | SST, snow depth | None | SST (IFS), snow depth | SST | None | None | SST | SST |
| COSMO version | 5.05b_1 | 5.01 | 5.0+ (GPU/SP) | 5.06 | 4.18 | 5.05 | 5.05 | 5.05 |
| Hardware | Cray XC40 | Intel & HP based cluster | Cray CS-Storm | Hybrid CPU/GPU cluster (Intel CPUs, NVIDIA GPUs) | IBM cluster | Cray XC40-LC | SGI Altix | Lenovo Broadwell Intel cluster |
| No. of cores used | 43x54+6 = 2328 (of 29952) | 160 | 18 + 144 GPUs (of 192) | 576 (on 24 nodes) | 90 (of 144) | 2880 (of 35136) | 576 (of 1408) | 256 (8 nodes) |

Table 3: Major specifications of ensemble applications of the COSMO-Model

| | COSMO-D2-EPS (DWD) | COSMO-E (MeteoSwiss) | TLE-MVE (IMWM-NRI) | COSMO-2I-EPS (Arpae-SIMC, pre-operational) | COSMO-ME-EPS (ITAF-ReMet) | COSMO-IT-EPS (ITAF-ReMet) | COSMO-LEPS (Arpae-SIMC) |
|---|---|---|---|---|---|---|---|
| Domain size (grid points) | 651 x 716 | 582 x 390 | 380 x 405 | 576 x 701 | 779 x 401 | 576 x 701 | 511 x 415 |
| Number of members | 20 | 21 | 21 | 20 | 40 | 20 | 20 |
| Hor. grid spacing (degree / km) | 0.02 / 2.2 | 0.02 / 2.2 | 0.025 / 2.8 | 0.02 / 2.2 | 0.0625 / 7 | 0.02 / 2.2 | 0.0625 / 7 |
| Number of layers | 65 | 60 | 50 | 65 | 45 | 65 | 40 |
| Time step (s) | 20 | 20 | 20 | 18 | 60 | 20 | 66 |
| Forecast range (h) | 27 (45 for the 03 UTC run) | 120 | 48 | 51 | 72 | 48 | 132 |
| Initial times of model runs (UTC) | 00, 03, 06, 09, 12, 15, 18, 21 | 00, 12 | 00, 06, 12, 18 | 21 | 00, 12 | 00, 12 | 00, 12 |
| Lateral boundary conditions | ICON-EU-EPS | IFS-ENS | COSMO-PL7 (time-lagged) | COSMO-ME-EPS | IFS-ENS | COSMO-ME-EPS | IFS-ENS (time-lagged + cluster analysis) |
| LBC update frequency (h) | 1 | 3 | 1 | 3 | 3 | 3 | 3 |
| Initial state | KENDA (+LHN) | KENDA (+LHN) | COSMO-PL7 (time-lagged) | KENDA (+LHN) | COMET-LETKF | KENDA | IFS-ENS (time-lagged + cluster analysis) + soil from ICON-EU |
| External analysis | SST, snow depth | SST (IFS-ENS), snow depth | None | None | None | None | None |
| Physics perturbation | Parameter perturbation (randomized draw from a selection of predefined perturbed values) | SPPT | c_soil | None | SPPT | None | Parameter perturbation (randomized draw from a selection of predefined perturbed values) |
| COSMO version | 5.05b_1 | 5.0+ (GPU/SP) | 5.01 | 5.05 | 5.05b; 5.0+ (GPU) | 5.0+ (GPU) | 5.03 (single precision) |
| Hardware | Cray XC40 | Cray CS-Storm | Intel & HP based cluster | Lenovo Broadwell Intel cluster | ECMWF Cray; HP | HP | ECMWF Cray |
| No. of cores used | 21x29+3 = 612 (on Broadwell); 20x25+4 = 504 (on Haswell) | 10+8 GPUs per member | 240 per member | 192 cores per member | 432 every 4 members; 24+4 GPUs per member | 24+8 GPUs per member | 720 |
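
Regarding the "parameter perturbation (randomized draw from a selection of predefined perturbed values)" entries in Table 3: the idea is that each ensemble member receives one value per physics parameter, drawn from a small predefined list. A minimal sketch of such a draw, with hypothetical parameter names and values (the real lists are in the perturbed-parameters document referenced above):

```python
import numpy as np

# Hypothetical parameters and values; illustrative only, the authoritative
# lists are given in the "Perturbed Parameters" document referenced above.
PERTURBED_VALUES = {
    "rlam_heat": [0.25, 1.0, 4.0],
    "tur_len":   [150.0, 500.0],
    "q_crit":    [1.6, 4.0],
}

def draw_member_parameters(member_id):
    """Draw one predefined value per parameter for a given ensemble member,
    using the member id as seed so the draw is reproducible per member."""
    rng = np.random.default_rng(seed=member_id)
    return {name: float(rng.choice(vals)) for name, vals in PERTURBED_VALUES.items()}

for m in range(1, 4):
    print(m, draw_member_parameters(m))
```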

4. Licensees of the COSMO-Model

Besides universities and research institutes, which can use the COSMO-Model under a free research license, the model can also be used by national weather services that do not belong to the COSMO consortium. They pay an annual license fee of €20,000; this fee is waived for developing countries.

Figure 1: Countries where the national weather service or a research institute is using the COSMO-Model.

For model users in tropical areas, a special tropical setup is provided. This setup has a higher model top than usual to allow a proper simulation of convective systems, which can reach much greater heights in the tropics than elsewhere.