
WWW Technical Progress Report on the Global Data Processing System 2000

THE NATIONAL CENTERS FOR ENVIRONMENTAL PREDICTION

NATIONAL WEATHER SERVICE, U.S.A.

1. Highlights For Calendar Year 2000

Following the fire at the Federal Office Building (FOB) 4 Central Computer Facility in September 1999, which damaged the Cray C90 computer, NCEP’s operational focus has been on improving the timeliness and reliability of numerical weather forecast model products. These efforts were carried out on the “Phase I” IBM RS/6000 SP massively parallel processor (MPP) computer system and then on the upgraded “Phase II” IBM RS/6000 SP system located in a modern computer facility at the Bowie Computer Center (BCC) in Bowie, Maryland.

Much of the improvement in the timeliness and reliability of operations has come from a combination of factors: sufficient system resources to thoroughly test changes in an operational environment prior to implementation, resulting in fewer program failures; progress in optimizing processing, which enabled product generation schedules to be met even as processing was enhanced; redundant communications; and a high availability architecture for data ingest, combined with broader oversight of processing to improve fault tolerance. All of these elements were pulled together, unified and strengthened by the high reliability of the MPP components, the redundancy gained by the system’s flexible configuration, and the quality and robustness of IBM’s hardware and system support.

Other highlights for 2000 include improvements in model resolution, increases in the range of forecast products and the replacement of the operational wave model with an improved model. Specifically, the resolution of the Eta model increased from 32 km to 22 km while that of the Global Model went from T126 to T170. The Hurricane model run was extended from 72 hours to 120 hours and the Wave Model (WAM) was replaced by the Global Wave Watch 3 model.


Figure 1 reflects the significant improvement and stability in generating model guidance products on-time (within 15 minutes of the schedule) achieved by NCEP’s new high performance computer system at the modern BCC facility.

Figure 1. NCEP Model Guidance Product Generation Performance.

2. Equipment

2.1 IBM RS/6000 SP

NCEP’s high performance computer contract was awarded to IBM in October 1998 and provided for a phased implementation. The IBM RS/6000 SP Phase I system was accepted in June 1999. Following the fire at the Central Computer Site, the entire Phase I system was dismantled and physically relocated to the new computer facility at the BCC in Bowie, Maryland. A staged implementation of operational processing started again on November 17, 1999. The system became operational on January 17, 2000.

Under NCEP’s contract, the final Phase II system was skillfully staged so that some components from the Phase I installation could be used for the Phase II system. By configuring the system with a gigabit, high speed ethernet ring, it was possible to fence operations for highest throughput performance and still share an NFS file system to address all data files, a back-end shared Hierarchical Storage Manager to access 0.9 TB of disk cache, and two Storage Tek Silos with a capacity of about 200 TB. The Phase II system was installed in September 2000 and became operational on December 7, 2000. Since installation, improvements have been made to the system to support new NCEP programming requirements by adding nodes and enhancing storage. NCEP manages the IBM SP so that one portion of the system is used primarily for operational work and the rest is used primarily for development. The distinction between the operational and development parts of the system is interchangeable, and this flexibility makes operations more reliable while at the same time supporting and enabling system enhancements.

The Phase I system ran benchmarked numerical applications more than five times faster than NCEP’s previous operational system, a Cray C90. The Phase II system’s performance on the benchmark codes was in excess of forty-two times the Cray C90, which is about six to seven times faster than the Phase I system. A rough estimate of the theoretical peak performance of NCEP’s Phase II system is about three teraflops.

Table 1. Comparison of the high performance Phase I and Phase II Systems.

  Configuration       “Phase I” IBM RS/6000 SP            “Phase II” IBM RS/6000 SP
  Processors          768                                 2176
  Memory              208 GB                              1126.4 GB
  Operating System    AIX                                 AIX
  Disk Storage        4.6 TB disk subsystem               14.1 TB disk subsystem
  CPU                 Power-3 Winterhawk I,               Power-3 Winterhawk II,
                      200 MHz, 2 CPUs/node, 4 MB cache    375 MHz, 4 CPUs/node, 8 MB cache

2.2 Additional Components

Independent of the IBM RS/6000 SP, two additional computers are part of NCEP’s Central Computer System: an SGI Origin 2000/12 system and an Origin 3000/16 system, which share over one TB of disk space. They also share a third Storage Tek silo with eight 9840 drives. These systems are used to run several non-operational projects.


2.3 Communications

Figure 2 shows the current communications network used by NCEP at the World Weather Building (WWB) in Camp Springs, Maryland to exchange information within the NWS and to provide analysis and forecast products to users via the internet. NCEP sends and receives data from the BCC in Bowie, Maryland via a high speed OC3 (155 Mbits/sec) circuit. Communications between WWB, BCC and the Telecommunication Gateway computer system at the Silver Spring Metro Center (SSMC) in Silver Spring, Maryland are provided by a Fast Network Server (FNS) running at 10 Mbits/sec. The WWB site is connected to three remote NCEP centers, the Storm Prediction Center (SPC) in Norman, Oklahoma, the Aviation Weather Center (AWC) in Kansas City, Missouri, and the Tropical Prediction Center (TPC) in Miami, Florida, by dual T1 lines, each with a capacity of 1.544 Mbits/sec. Access to the internet from WWB is provided by FDDI (a fiber optic circuit) to Federal Office Building 4 (FOB4), then by the Metropolitan Area Network (MAN) ATM cloud to SSMC. From there, information is sent to the internet over a commercial communications line to the Internet Service Provider (ISP).


2.4 Future Plans

Figure 3 shows the configuration of the upgraded communications network scheduled to be operational by Fall 2001. The current FNS link between SSMC, WWB and Bowie will be retained temporarily as a backup; however, it will be superseded operationally by high speed OC3 lines from the ATM cloud. The dual T1 lines between WWB and the remote centers will be replaced by single T3 lines, each with an initial capacity of 12 Mbits/sec. The multiple circuit path that feeds the internet will be replaced by a single, 45 Mbits/sec direct link between WWB and SSMC through the MAN ATM cloud.

3. Observational Data Ingest and Access System


3.1 Status at the End of 2000

3.1.1 Observational Data-Ingest

NCEP receives the majority of its data from the Global Telecommunications System (GTS) and the National Environmental Satellite, Data, and Information Service (NESDIS). The GTS and aviation circuit bulletins are transferred from the NWS Telecommunications Gateway (NWSTG) to NCEP's Central Operations (NCO) over two 56 kbits/sec lines. Each circuit is interfaced through an X.25 pad connected to a PC running a Linux operating system with software to accumulate the incoming data-stream in files. Each file is open for 20 seconds, after which the file is queued to the Distributive Brokered Network (DBNet) server for distributive processing. Files containing GTS observational data are networked to one of two IBM workstations. There the data-stream file is parsed for bulletins, which are then passed to the Local Data Manager (LDM). The LDM controls continuous processing of a bank of on-line decoders by using a bulletin header pattern-matching algorithm. Files containing GTS gridded data are parsed on the Linux PC, “tagged by type” for identification, and then transferred directly to the IBM SP by DBNet. There, all observations for assimilation are stored in accumulating data files according to the type of data. Some observational data and gridded data from other producers (e.g., satellite observations from NESDIS) are similarly processed in batch mode on the IBM SP as the data become available. Observational files remain on-line for up to 10 days before migration to offline cartridges. While online, there is open access to them for accumulating late arriving observations and for research and study.
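The bulletin-routing step above, in which the LDM drives a bank of decoders by pattern-matching WMO bulletin headers, can be sketched as follows. The header patterns and decoder names are illustrative stand-ins, not NCEP's actual LDM configuration.

```python
import re

# Illustrative WMO abbreviated-heading patterns mapped to hypothetical
# decoder names; the real LDM uses its own pattern-action tables.
DECODER_TABLE = [
    (re.compile(r"^S[IM]"), "synop_decoder"),      # surface synoptic bulletins
    (re.compile(r"^U[SKL]"), "upperair_decoder"),  # TEMP/PILOT bulletin parts
    (re.compile(r"^SA"), "metar_decoder"),         # aviation METAR bulletins
]

def dispatch(bulletin):
    """Route a bulletin to a decoder by pattern-matching its WMO
    abbreviated heading (the TTAAii group on the first line)."""
    header = bulletin.split("\n", 1)[0].strip()
    for pattern, decoder in DECODER_TABLE:
        if pattern.match(header):
            return decoder
    return None  # unmatched bulletins are left for other handlers
```

Because the table is scanned in order, more specific patterns can simply be placed ahead of more general ones.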

3.1.2 Data Access

The process of accessing the observational data base and retrieving a select set of observational data is accomplished in several stages by a number of FORTRAN codes. This retrieval process is run operationally many times a day to assemble “dump” data for model assimilation. The script that manages the retrieval of observations provides users with a wide range of options. These include observational date/time windows, specification of geographic regions, data specification and combination, duplicate checking and bulletin “part” merging, and parallel processing. The primary retrieval software performs the initial stage of all data dumping by retrieving subsets of the database that contain all the database messages valid for the time window requested by a user. The retrieval software looks only at the date in BUFR Section One to determine which messages to copy. This results in an observing set containing possibly more data than was requested, but allows the software to function very efficiently. A final 'winnowing' of the data to an observing set with the exact time window requested is done by the duplicate checking and merging codes applied to the data as the second stage of the process. Finally, manual quality marks are applied to the data extracted. The quality marks are provided by personnel in two NCEP groups: the NCO Senior Duty Meteorologists (SDMs) and the Marine Prediction Center (MPC).
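The two-stage retrieval described above, a fast copy keyed only on the BUFR Section One date followed by an exact winnowing and duplicate check, can be sketched as follows; simple dictionaries with hypothetical field names stand in for BUFR messages.

```python
def coarse_dump(messages, t_start, t_end):
    """Stage 1: copy every database message whose BUFR Section One date
    falls in the requested window.  Fast, but may keep observations whose
    exact report time lies outside the window."""
    return [m for m in messages if t_start <= m["section1_date"] <= t_end]

def winnow(messages, t_start, t_end):
    """Stage 2: exact time-window check plus duplicate removal, keeping
    the first copy of each (station, report time) pair."""
    seen, kept = set(), []
    for m in sorted(messages, key=lambda m: m["obs_time"]):
        key = (m["station"], m["obs_time"])
        if t_start <= m["obs_time"] <= t_end and key not in seen:
            seen.add(key)
            kept.append(m)
    return kept
```

The design choice mirrors the text: stage 1 trades precision for speed, and stage 2 pays the exact-window and duplicate-check cost only on the much smaller subset.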

3.1.3 Observational Data Ingest Platforms

Currently, observational data ingest is performed on two IBM workstations. However, at the beginning of the year 2000, this functionality was performed on a pair of Silicon Graphics (SGI) Origin 200 workstations. In between, it is worth noting that this processing was migrated, in March 2000, to two dedicated nodes (separated from the switching fabric) on the Phase I IBM SP system, joined for backup through High Availability Cluster MultiProcessing (HACMP) software. With this configuration, if one of the data ingest servers fails, processing is automatically transferred to the other server. This design worked exceedingly well; there was no instance of interruption of the data ingest processing. However, it soon became clear that the Phase II system was to be implemented in a configuration that would require the use of four dedicated nodes to carry this design forward, and this was not viewed as an efficient use of resources. It was therefore decided to return to the paradigm of using two workstations, and thus be independent from the IBM SP Phase II system. In October 2000, the observational data ingest was migrated again, back to a workstation environment, thereby returning to the original design. Nevertheless, this “double hop” did serve to prove that separate processing modules can be removed and successfully installed and executed on dedicated nodes within the overall high performance MPP system, and it was therefore viewed as valuable experience.

3.2 Future Plans

There are a couple of major enhancements planned for the observational data ingest in 2001. The first involves replacing the two 56 kbps circuits with network socket connections between the NWS Telecommunications Gateway (NWSTG) and NCEP Central Operations (NCO), and the second involves replacing the two Linux PCs with similar machines containing faster processors. Both enhancements are for the purpose of improving the overall speed and reliability of the data flow between NWSTG and NCO, thereby lessening the amount of recovery time required after outages.

4. Quality Control System

4.1 Status at the End of 2000

Quality control (QC) of data is performed at NCEP, but the quality controlled data are not disseminated on the GTS. QC information is included in various monthly reports disseminated by NCEP. The data quality control system for numerical weather prediction at NCEP has been designed to operate in two phases: interactive and automated. The nature of the quality control procedures is somewhat different for the two phases.


4.1.1 Interactive Phase

During the first phase, interactive quality control is accomplished by the Marine Prediction Center (MPC) and the Senior Duty Meteorologists (SDMs). The MPC personnel use a graphical interactive system called CREWSS (Collect, Review, Edit, Weather data from the Sea Surface) which provides an evaluation of the quality of the marine surface data provided by ships, buoys (drifting and moored), Coastal Marine Automated Network (CMAN) stations, and tide gauge stations, based on comparisons with the model first guess, buddy checks against neighboring platforms, the platform’s track, and a one week history for each platform. The MPC personnel can flag the data as to their quality or correct obvious errors in the data, such as an incorrect hemisphere, a misplaced decimal, etc. The quality flags and corrections are then uploaded to the IBM SP and are stored there in an ASCII file for use during the data retrieval process. The SDM performs a similar process of quality assessment for rawinsonde temperature and wind data, aircraft temperature and wind reports, and satellite wind reports. The SDMs use an interactive program which initiates the offline execution of two of the automated quality control programs (described in the next section) and then review the programs’ decisions before making assessment decisions. The SDMs use satellite pictures, meteorological graphics, continuity of data, input from reporting stations, past station performance and horizontal data comparisons (buddy checks) to decide whether or not to override automatic data QC flags.
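One of the checks named above, the buddy check against neighboring platforms, can be sketched as follows: each observation's departure from the first guess is compared with the mean departure of nearby platforms, and a lone outlier is flagged. The neighbor radius and tolerance here are illustrative, not CREWSS's actual settings.

```python
import math

def buddy_check(obs, radius_km=300.0, tol=6.0):
    """Flag observations whose first-guess departure differs from the
    mean departure of neighboring platforms by more than `tol`
    (illustrative units).  Each ob is a (lat, lon, departure) tuple."""

    def dist_km(a, b):
        # Equirectangular approximation; adequate for a sketch.
        x = math.radians(b[1] - a[1]) * math.cos(math.radians((a[0] + b[0]) / 2))
        y = math.radians(b[0] - a[0])
        return 6371.0 * math.hypot(x, y)

    flags = []
    for i, o in enumerate(obs):
        buddies = [p[2] for j, p in enumerate(obs)
                   if j != i and dist_km(o, p) <= radius_km]
        if buddies:
            flags.append(abs(o[2] - sum(buddies) / len(buddies)) > tol)
        else:
            flags.append(False)  # no neighbors: leave the ob unflagged
    return flags
```

An isolated platform with no buddies is left unflagged here; an operational system would fall back on other checks (first guess, track, history) for such cases.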

4.1.2 Automated Phase

In the automated phase, the first step is to include any manual quality marks attached to the data by MPC personnel and the SDMs. This occurs when time-windowed BUFR (Binary Universal Form for the Representation of meteorological data) data retrieval files are created from the NCEP BUFR observational data base. Next is the preprocessing program, which makes some simple quality control decisions to handle problems and to re-code the data in a BUFR format with added descriptors to handle and track quality control changes. In the process, certain classes of data, e.g., surface marine reports over land, are flagged for non-use in the data assimilation but are included for monitoring purposes. The preprocessing program also includes a step which applies the global first guess background and observational errors to the observations. Under special conditions (e.g., data too far under the model surface), observations are flagged for non-use in the data assimilation.

Separate automated quality control algorithms for rawinsonde, non-automated aircraft, wind profiler, and NEXRAD Vertical Azimuth Display (VAD) reports are executed next. The purpose of these algorithms is to eliminate or correct erroneous observations that arise from location, transcription or communications errors. Attempts are made, when appropriate, to correct commonly occurring types of errors. Rawinsonde temperature and height data pass through a complex quality control program for heights and temperatures (Gandin, 1989), which makes extensive hydrostatic, baseline, and horizontal and vertical consistency checks based upon differences from the 6-hour forecast. Corrections and quality values are then applied to the rawinsonde data. In April 1997, a new complex quality control algorithm was installed that performs the quality control for all levels as a whole, rather than considering the mandatory levels first and then the significant levels. In addition, an improvement was made to the way in which the hydrostatic residuals are calculated and used (Collins, 1998). AIREP, PIREP, and AMDAR (Aircraft Report, Pilot Report, Aircraft Meteorological Data Relay) aircraft reports are also quality controlled through a track-checking procedure by an aircraft quality control program. In addition, AIREP and PIREP reports are quality controlled in two ways: isolated reports are compared to the first guess, and groups of reports in close geographical proximity are inter-compared. Both of the above quality control programs are run offline by the SDM. Finally, wind profiler reports are quality controlled with a complex quality control program using multiple checks based on differences from a 6-hour forecast, including a height check, and VAD wind reports are quality controlled with a similar type of program which also includes an algorithm to account for contamination by bird migration.
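The hydrostatic check at the heart of the rawinsonde program compares the reported thickness of a pressure layer with the thickness implied hydrostatically by the layer-mean temperature; a large residual points to a height or temperature error. A minimal version is sketched below; the 50 m tolerance is illustrative, and the operational program uses level-dependent limits and attempts corrections rather than simply flagging.

```python
import math

RD = 287.05    # gas constant for dry air, J kg^-1 K^-1
G0 = 9.80665   # standard gravity, m s^-2

def hydrostatic_residual(p_lower, z_lower, t_lower, p_upper, z_upper, t_upper):
    """Reported minus hydrostatically implied thickness (m) of the layer.
    Pressures in hPa, heights in m, temperatures in K."""
    t_mean = 0.5 * (t_lower + t_upper)
    implied = (RD * t_mean / G0) * math.log(p_lower / p_upper)
    return (z_upper - z_lower) - implied

def layer_suspect(p1, z1, t1, p2, z2, t2, tol=50.0):
    """Flag the layer when the residual exceeds `tol` metres."""
    return abs(hydrostatic_residual(p1, z1, t1, p2, z2, t2)) > tol
```

For example, an 850-700 hPa layer with a mean temperature of 277.5 K implies a thickness of about 1577 m; a reported 700 hPa height inflated by some 170 m produces a residual well beyond the tolerance.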

The final part of the quality control process is for all data types to be checked using an optimum interpolation based quality control algorithm, which uses the results of both phases of the quality control. As with the complex quality control procedures, this program operates in a parallel rather than a serial mode. That is, a number of independent checks (horizontal, vertical, geostrophic) are performed using all admitted observations. Each observation is subjected to the optimum interpolation formalism using all observations except itself in each check. A final quality decision (keep, toss, or reduced confidence weight) is made based on the results from all individual checks and any manual quality marks attached to the data. Results from all the checks are kept in an annotated observational data base.
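The parallel decision structure described above, several independent checks whose results are combined into one keep/reduce/toss decision, can be sketched as follows; the normalized suspicion scores, thresholds, and manual-mark override are illustrative, not the operational algorithm's actual formulation.

```python
def final_decision(check_scores, manual_mark=None):
    """Combine independent QC checks into keep / reduce / toss.
    `check_scores` maps check name -> suspicion in [0, 1]; the
    thresholds and the manual-mark override are illustrative."""
    if manual_mark == "toss":      # manual quality marks take precedence
        return "toss"
    worst = max(check_scores.values())
    if worst < 0.3:
        return "keep"
    if worst < 0.7:
        return "reduce"            # keep, with reduced confidence weight
    return "toss"
```

Running the checks in parallel means no single check can toss an observation before the others have seen it, which is exactly the property the text contrasts with a serial mode.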

4.2 Future Plans

A review of the quality control system will be conducted in 2001 with the purpose of planning upgrades to various parts of the system. One part of this review will be to further evaluate the performance of quality control functions within the three-dimensional variational (3DVAR) analysis which was implemented in Fall 2000. This method unifies the analysis and the quality control and is applicable to all data types. If the variational analysis continues to perform adequately, a separate, automated quality control step for all data will not be needed, although separate QC for rawinsonde heights and temperatures, aircraft, wind profiler data and VAD wind reports will remain useful.


5. Monitoring System

5.1 Status at the End of 2000

5.1.1 Real-time Monitoring

As mentioned in the previous section, “real-time” monitoring of the incoming GTS and satellite data is performed by a number of computer programs which run automatically during each data assimilation cycle, or are run interactively by the NCEP Central Operations SDMs, and provide information on possible action. If there are observational types or geographic areas for which data were not received, the SDM will request Washington DC Regional Telecommunications Hub (RTH) assistance in obtaining the observations. The SDM may also delay starting a numerical weather prediction (NWP) model to ensure sufficient data are available. Four times a day, a web site, “www.ncep.noaa.gov/NCO/DMQAB/QAP/thanks”, is updated with reports on what data have been received from US supported upper air sites. Daily average data input counts for January 2001 are shown in Tables 2 and 3.
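A count-based alert of the kind this monitoring relies on can be sketched as: compare each category's received count with its expected daily average and flag categories that fall below a threshold fraction. The 75% threshold is illustrative, and the counts in the example are taken from Table 2.

```python
def data_deficiencies(received, expected, min_fraction=0.75):
    """Return the data categories whose received count fell below
    `min_fraction` of the expected daily average (threshold illustrative)."""
    return sorted(cat for cat, count in received.items()
                  if count < min_fraction * expected.get(cat, 0))
```

The SDM would then follow up on each flagged category, e.g. by requesting RTH assistance or delaying a model start as described above.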

5.1.2 Next-day Monitoring

"Next-day" data assessment monitoring is accomplished by routinely running a variety of diagnostics on the previous day's data assimilation, the operational quality control programs, and the NWP analyses to detect problems. When problems are detected, steps are taken to determine the origin of the problem(s), to delineate possible solutions for improvement, and to contact appropriate data providers if it is an encoding or instrument problem.


Table 2. Average Daily Non-Satellite Data Counts for January 2001.

  Category         Subcategory       Total Input   Percent Input
  Land Surface     Synoptic                61590
                   METAR                  106452
                   subtotal               168042        12.44
  Marine Surface   Ship                     2877
                   Drifting Buoy            8514
                   Moored Buoy              2902
                   CMAN                     1586
                   Tide Gauge               2351
                   subtotal                18230         1.35
  Land Soundings   Fixed RAOB               1493
                   Mobile RAOB                98
                   Dropsonde                   7
                   Pibal                     296
                   Profiler                  788
                   VAD Winds                6027
                   subtotal                 8709         0.64
  Aircraft         AIREP                    3874
                   PIREP                    1022
                   AMDAR                   20101
                   ACARS                   64298
                   RECCO                      10
                   subtotal                89305         6.61
  Total Non-Satellite                     284286        21.05


Table 3. Average Daily Satellite Data Counts for January 2001.

  Category             Subcategory          Total Input               Percent Input
  Satellite Soundings  GOES                       81406
                       Ozone                       1193
                       subtotal                   82599                    6.12
  Satellite Winds      US High Density            96557
                       US Picture Triplet          1349
                       Japan                       2472
                       Europe                      4988
                       subtotal                  105366                    7.80
  DMSP                 SSM/I Neural Net         4330368 (67119 superobs)   4.97
  Satellite Surface    ERS Scatterometer       not used
                       QuikSCAT winds          evaluating
  TOVS 1B              HIRS                      148965
                       HIRS3                          0
                       MSU                        35559
                       AMSU-A                    223717
                       AMSU-B                    402713
                       subtotal                  810954                   60.06
  Total Satellite                               1066038                   78.95

5.1.3 Delayed-time Monitoring

“Delayed-time” monitoring includes a twice weekly automated review of the production reject list and monthly reports on the quantity and quality of data (in accordance with the WMO/CBS) which are shared with other Global Data Processing System (GDPS) centers. A monthly report is prepared showing the quality, quantity, and timeliness of U.S. supported sites. Monthly statistics on hydrostatic checks and guess values of station pressure are used to help find elevation or barometric problems at upper air sites. This monitoring system includes statistics on meteorological data which can be used for maintaining the reject list and for contacting sites with problems. For global surface marine data, monthly statistics are generated for those platforms which meet specific criteria (e.g., at least 20 observations in a given month). These statistics are forwarded to the U.K. Meteorological Office in Bracknell, England (the lead center for marine data monitoring) and are also uploaded to an NCO web site for use by U.S. Port Meteorological Officers. Separate monthly statistics produced for global moored and drifting buoys are sent to the Data Buoy Cooperation Panel, which may then contact the appropriate parties to have faulty buoy data removed from GTS distribution until the data problems are fixed.
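The platform-selection rule above (monthly statistics only for platforms with at least 20 observations) can be sketched as follows; the report tuples and the mean-departure statistic are illustrative, not the actual exchange format.

```python
from collections import Counter

def monthly_platform_stats(reports, min_obs=20):
    """Mean first-guess departure per platform, restricted to platforms
    with at least `min_obs` reports in the month (the criterion above).
    Each report is a hypothetical (platform_id, departure) pair."""
    counts = Counter(pid for pid, _ in reports)
    sums = Counter()
    for pid, departure in reports:
        sums[pid] += departure
    return {pid: sums[pid] / n for pid, n in counts.items() if n >= min_obs}
```

The minimum-count criterion keeps single bad reports from dominating a platform's monthly statistics.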

5.2 Future Plans

The operational capability to find current upper air reports that are in reality duplicates of old data, and to track-check ACARS aircraft data, will be improved. New software will be developed to automatically diagnose deficiencies in the numbers of reports within various data categories and subcategories and to alert the SDMs to those deficiencies. New procedures and software will be added to improve real-time monitoring.
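One simple way to catch a "current" upper air report that is really a retransmission of old data is to fingerprint each report's content and reject later arrivals of the same content under a newer time. This is a sketch of the idea only, not NCEP's actual algorithm.

```python
import hashlib

class DuplicateDetector:
    """Reject a report whose payload was already received earlier:
    a retransmitted old sounding relabeled with a current time."""

    def __init__(self):
        self.first_seen = {}  # content digest -> first report time

    def is_stale_duplicate(self, station, payload, report_time):
        digest = hashlib.sha256((station + "|" + payload).encode()).hexdigest()
        if digest in self.first_seen and report_time > self.first_seen[digest]:
            return True
        self.first_seen.setdefault(digest, report_time)
        return False
```

Hashing the content rather than comparing timestamps catches the failure mode described above: the data are old even though the report time looks current.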

6. Forecasting System

6.1 Global Forecast System

6.1.1 Status of the Global Forecasting System at the End of 2000

Global Forecast System Configuration: The global forecasting system consists of:

a) The final (FNL) Global Data Assimilation System (GDAS), an assimilation cycle with 6-hourly updates and late data cut-off times;

b) The aviation (AVN) analyses, the 126-hour forecasts run at 0000 and 1200 UTC, and the 84-hour forecasts run at 0600 and 1800 UTC with a data cut-off of 2 hours and 45 minutes, using the 6-hour forecast from the FNL as the first guess;

c) A once per day 16-day medium-range forecast (MRF) from 0000 UTC using FNL initial conditions and producing high resolution T170L42 predictions to 7 days and lower-resolution T62 predictions from 7 to 16 days;


d) Ensembles of global 16-day forecasts from perturbed FNL initial conditions (five forecasts from 1200 UTC, and twelve forecasts from 0000 UTC). Ensembles are run at T126 for the first 84 hours and at T62 after that.

Global Data Assimilation System: Global data assimilation for the FNL and AVN is done using a multivariate Spectral Statistical Interpolation (SSI) analysis scheme, a 3-dimensional variational technique in which a linear balance constraint is incorporated, eliminating the need for a separate initialization step. The analyzed variables are the associated Legendre spectral coefficients of temperature, vorticity, divergence, water vapor mixing ratio, and the natural logarithm of surface pressure (ln psfc). All global analyses are done on 42 sigma levels at a T170 spectral truncation. European Research Satellite (ERS) winds were dropped from the analysis because the quality of the data had become erratic due to a failing sensor. Data cut-off times are 0600, 1200, 2100 and 0000 UTC for the 0000, 0600, 1200, and 1800 UTC FNL analyses, respectively, and 0245, 0845, 1445, and 2045 UTC for the 0000, 0600, 1200, and 1800 UTC AVN analyses.

Global Forecast Model: The global forecast model (Sela 1980, 1982) has the associated Legendre coefficients of ln psfc, temperature, vorticity, divergence and water vapor mixing ratio as its prognostic variables. The vertical domain extends from the surface to 2 mb and is discretized with 42 sigma layers. The Legendre series for all variables are truncated (triangular) at T170L42 for the FNL and the first seven days of the MRF, and at T62 for the ensemble forecasts and days 8 through 16 of the MRF. The AVN uses T170L42 out to 126 hours at 0000 and 1200 UTC to provide guidance and boundary forcing for the Hurricane model. The AVN also uses T170L42 for the 0600 and 1800 UTC runs but only forecasts out to 84 hours. A semi-implicit time integration scheme is used. The model includes a full set of parameterizations for physical processes, including moist convection, cloud-radiation interactions, stability dependent vertical diffusion, evaporation of falling rain, similarity theory derived boundary layer processes, land surface vegetation effects, surface hydrology, and horizontal diffusion. See the references in Kalnay et al. (1994) for details.

Global Forecast System Products: Products from the global system include:

a) Gridded (GRIB) sea level pressure (SLP) and height (H), temperature (T), zonal wind component (U), meridional wind component (V), and relative humidity (R) at a number of constant pressure levels every 6 hours for the first 60 hours of all four runs of the AVN, and at 72 hours for the 0000 and 1200 UTC AVN forecasts;

b) Specialized aviation grids (GRIB) with tropopause H, T, and pressure, as well as fields depicting the altitude and intensity of maximum winds;

c) Extended forecasts (3.5-10 days, every 12 hrs) of SLP, H, U, V, and R at 1000, 850 and 500 hPa, issued once per day;

d) A large number of graphic products.

6.1.2 Future Plans for the Global Forecasting System

Near term changes planned for the production suite include:

a) Higher vertical resolution (60 levels) to make better use of satellite radiances; higher horizontal resolution to T254 (55 km);

b) Modifying convection and tropical storm initialization procedures to reduce false alarms and improve guidance for tropical storms;

c) Include GOES-10 soundings, QuikSCAT surface wind observations, and TRMM estimated precipitation;

d) Forecast cloud water in the MRF model.

e) Implement an improved quality control procedure for rawinsondes and AMSU radiances;

f) Refine the hurricane relocation algorithm.

6.2 Regional Forecast System

6.2.1 Status of the Regional Forecasting System at the End of 2000

Regional Forecast System Configuration: The Regional Systems are:

a) The Mesoscale Eta Forecast Model, which provides high resolution (22 km and 50 levels) forecasts over North America out to 60 hours at 0000 and 1200 UTC and 48 hours at 0600 and 1800 UTC;

b) The Rapid Update Cycle (RUC) System, which generates (40 km and 40 level) analyses and 3-hour forecasts for the contiguous United States every hour, with 12-hr forecasts eight times per day on a 3-hourly cycle; and

c) The Nested Grid Model (NGM), whose North American grid has approximately 90 km resolution on 16 layers, and which generates twice daily 48-hr forecasts for the Northern Hemisphere.

Regional Forecast System Data Assimilation: Initial conditions for the four Meso Eta forecasts are produced by a multivariate 3-dimensional variational (3DVAR) analysis which uses as its first guess a 3-hour Meso Eta forecast from the Eta-based Data Assimilation System (EDAS; Rogers, et al., 1996). The EDAS is a fully cycled system using 3-hour Meso Eta forecasts as a background and global fields only for lateral boundary conditions. Data cut-off is at 1 hour and 10 minutes past the nominal analysis times. No initialization is applied. In 2000, direct assimilation of GOES and TOVS-1B radiance data was included. In addition, a new non-linear quality control algorithm was applied.
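The fully cycled character of the EDAS, in which each analysis starts from the previous cycle's own short forecast rather than from external fields, can be illustrated with a toy scalar cycle. The nudging "analysis" here is only a stand-in for the 3DVAR step, and the weight is illustrative.

```python
def analyze(background, obs, weight=0.25):
    """Toy scalar 'analysis': nudge the background toward the observation.
    This stands in for the 3DVAR step; the weight is illustrative."""
    return background + weight * (obs - background)

def cycled_assimilation(state, obs_sequence, forecast):
    """Fully cycled DA: every background is the previous cycle's own
    short forecast, never a cold start from external fields."""
    for obs in obs_sequence:
        state = analyze(forecast(state), obs)
    return state
```

For example, with a damped-persistence forecast `lambda x: 0.9 * x`, repeated cycles draw the state toward the observations while each background retains memory of all earlier data, which is the practical benefit of full cycling over the RDAS-style 12-hour restarts described below.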

Until March 2000, initial conditions for the twice-daily NGM forecasts came from a hemispheric optimum interpolation analysis which used as its first guess a 3-hour NGM forecast from the Regional Data Assimilation System (RDAS). The RDAS performed 3-hour updates during a 12-hour pre-forecast period, but started from the global fields every 12 hours and so was not a fully cycled system. Data cut-off times were 2 hours past the synoptic time. An implicit normal mode initialization was used.

In March 2000, the initialization procedure for the NGM was changed. The hemispheric RDAS used to provide initial conditions to the NGM was replaced by the Eta analysis over North America and by a 6-hr forecast from the GDAS over the rest of the Northern Hemisphere. After the Eta analysis is interpolated to the NGM grid, an implicit normal mode initialization is performed.

Regional Forecast System Models: The Mesoscale Eta forecast model (Black et al., 1993 and 1994; Mesinger et al., 1988) has surface pressure, temperature, u, v, turbulent kinetic energy, water vapor mixing ratio and cloud water/ice mixing ratio as its prognostic variables. The vertical domain is discretized with 50 eta layers, with the top of the model currently set at 25 mb. The horizontal domain is a 22 km semi-staggered Arakawa E-grid covering all of North America. An Euler-backward time integration scheme is used. The model is based on precise dynamics and numerics (Janjic 1974, 1979, 1984; Mesinger 1973, 1977) and a step-mountain terrain representation (Mesinger 1984), and includes a full set of parameterizations for physical processes, including Janjic (1994) modified Betts-Miller convection, Mellor-Yamada turbulent exchange, Fels-Schwartzkopf (1974) radiation, a land surface scheme with 4 soil layers (Chen et al. 1996) and a predictive cloud scheme (Zhao and Carr 1997; Zhao et al. 1997). The lateral boundary conditions are derived from the prior global AVN forecast at a 3-hour frequency. In 2000, the power provided by the IBM SP high-performance computer allowed an improvement in resolution (from 32 km and 45 layers to 22 km and 50 layers) and an extension of the domain 900 km farther west of Hawaii.
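The step-mountain (eta) vertical coordinate cited above modifies the usual sigma coordinate so that coordinate surfaces remain quasi-horizontal over terrain. In the standard notation of Mesinger (1984), reproduced here for reference:

```latex
\eta = \frac{p - p_T}{p_S - p_T}\,\eta_S, \qquad
\eta_S = \frac{p_{\mathrm{ref}}(z_S) - p_T}{p_{\mathrm{ref}}(0) - p_T}
```

where p_T is the model-top pressure, p_S the surface pressure, z_S the terrain elevation, and p_ref(z) a reference pressure profile. Mountains are represented as discrete steps whose tops coincide with model layer interfaces.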

The RUC system was developed by the NOAA Forecast Systems Laboratory under the name of Mesoscale Analysis and Prediction System (MAPS) (Benjamin et al., 1991). The RUC run provides high-frequency, short-term forecasts on a 40-km resolution domain covering the lower 48 United States and adjacent areas of Canada, Mexico, and ocean. Run with a data cut-off of 18 minutes, the analysis relies heavily on asynoptic data from surface reports, profilers, and especially ACARS aircraft data. One of its unique aspects is its use of a hybrid vertical coordinate that is primarily isentropic. Most of its 40 levels are isentropic, except for layers in the lowest 1-2 km of the atmosphere where terrain-following coordinates are used. The two types of surfaces change smoothly from one to the other. A full package of physics is included, with 5 cloud/precipitation species carried as historic variables of the model.

The NGM model (Phillips, 1979) uses a flux formulation of the primitive equations, and has surface pressure, sigma u, sigma v, sigma theta, and sigma q as prognostic variables, where theta is potential temperature and q specific humidity. The finest of the nested grids has a resolution of 85 km at 45°N and covers North America and the surrounding oceans. The coarser hemispheric grid has a resolution of 170 km. Fourth-order horizontal differencing and a Lax-Wendroff time integration scheme are used. Vertical discretization is done using 16 sigma levels. Parameterized physical processes include surface friction, grid-scale precipitation, dry and moist convection, and vertical diffusion.

Regional Forecast System Products: Products from the various regional systems include:

a) Heights, winds, temperatures:
(1) Meso Eta (to 48 hours at 0600 and 1800 UTC; to 60 hours at 0000 and 1200 UTC) every 25 hPa and every 3 hours at winds aloft altitudes;
(2) RUC (to 12 hours) every 25 hPa and hourly; and
(3) NGM (to 48 hours) every 50 hPa and every 6 hours.

b) 3, 6 and 12 hour precipitation totals;

c) Freezing level;

d) Relative humidity;

e) Tropopause information;

f) Many model fields in GRIB format;

g) Hourly soundings in BUFR; and

h) Hundreds of graphical output products and alphanumeric bulletins.

Operational Techniques for Application of Regional Forecast System Products: Model Output Statistics (MOS) forecasts of an assortment of weather parameters, such as probability of precipitation, maximum and minimum temperatures, and indicators of possible severe convection, are generated from NGM model output. These forecasts are made using regression techniques based on statistics from many years of NGM forecasts.
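As an illustration of the regression approach, a single-predictor MOS-style equation can be fit by ordinary least squares. This is a minimal sketch with hypothetical data, not the operational multi-predictor MOS equations:

```python
def fit_mos(predictor, observed):
    """Fit observed = a + b * predictor by ordinary least squares.

    predictor: model forecast values (e.g. a forecast 2-m temperature)
    observed:  verifying station observations
    Returns the intercept a and slope b of the regression equation.
    """
    n = len(predictor)
    mean_x = sum(predictor) / n
    mean_y = sum(observed) / n
    # Covariance and variance sums for the least-squares slope.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(predictor, observed))
    var = sum((x - mean_x) ** 2 for x in predictor)
    b = cov / var
    a = mean_y - b * mean_x
    return a, b

# Hypothetical training sample: the model runs a uniform 2 degrees warm,
# so the fitted equation is obs = forecast - 2.
model_t = [10.0, 15.0, 20.0, 25.0]
obs_t = [8.0, 13.0, 18.0, 23.0]
a, b = fit_mos(model_t, obs_t)
# A new forecast value f is then corrected as a + b * f.
```

The operational MOS equations use many predictors and long developmental samples, but each reduces to a multiple-regression generalization of this fit.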

6.2.2 Future Plans for the Regional Forecast System

Year 2001 Goals

a) Extension of the 22-km Eta to 84 hours at 0000 and 1200 UTC

b) Eta resolution increased to 12 km.

c) Assimilation of WSR-88D radial velocities on an hourly basis

d) Regional Ensembles – 48 km / 10 members

e) Threats (nested Eta 8 km resolution)

f) 20 km/50 level RUC with 3DVAR initialization

6.3 Specialized Forecasts

Specialized forecasts and systems include the following:

a) A Hurricane (HCN) run is performed when requested by NCEP's Tropical Prediction Center (TPC). The HCN forecast model is the Geophysical Fluid Dynamics Laboratory (GFDL) Hurricane Model (GHM), a triple-nested model with resolutions of 1, 1/3, and 1/6 degree latitude and 18 vertical levels. The outermost domain extends 75 degrees in the meridional and longitudinal directions. Initial conditions are obtained from the current AVN run. Input parameters for each storm are provided by the TPC and include the latitude and longitude of the storm center, current storm motion, the central pressure, and the radii of 15 m/s and 50 m/s winds. Output from the model consists primarily of forecast track positions and maximum wind speeds, but also includes various horizontal fields on pressure surfaces, such as winds and sea-level pressure, and some graphic products, such as a swath of maximum wind speeds and total precipitation throughout the 72-hour forecast period.

b) The Regional Spectral Model (RSM) provides forecasts over the Hawaiian Islands at a very high resolution (10 km) from 0000 and 1200 UTC out to 48 hours, for distribution to Hawaii via FTP. The RSM uses spectral basis functions to represent forecast variables in a manner similar to the MRF model. Initial conditions for this run are interpolated from the AVN initial conditions. The model was moved to the IBM SP early in 2000. A 10 km nested version of the Eta is being prepared as a replacement for the RSM.

c) The global Wave Model (WAM) ran twice a day with AVN forecast forcing on a 2.5 degree grid, making wave height predictions out to 126 hours. A new global ocean wave model, NOAA WAVEWATCH III (NWW3), replaced it operationally in March 2000. This model runs twice a day on a 1.25 x 1.00 degree lat/lon grid covering a band from 78°N to 78°S. Wave forcing is provided by T170 winds from the AVN model. The NWW3 makes forecasts of wave direction, frequency and height out to 126 hours.

d) Daily global Sea Surface Temperature (SST) analyses are made with an optimum interpolation technique which combines seven days' worth of in-situ and satellite observations. Weekly SST analyses derived with this system are used as lower boundary conditions in the global assimilation and forecasts.
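In the usual matrix shorthand for optimum interpolation (a generic statement of the technique, not the exact operational configuration), the analysis is a weighted correction to a background field:

```latex
\mathbf{x}_a = \mathbf{x}_b + \mathbf{B}\mathbf{H}^{\mathrm{T}}
  \bigl(\mathbf{H}\mathbf{B}\mathbf{H}^{\mathrm{T}} + \mathbf{R}\bigr)^{-1}
  \bigl(\mathbf{y} - \mathbf{H}\mathbf{x}_b\bigr)
```

where y holds the window's in-situ and satellite observations, H maps the gridded field to the observation locations, and the background- and observation-error covariances B and R determine the interpolation weights.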

e) A storm surge model makes twice-daily predictions for the Atlantic coast and Northwest Pacific coast of the United States out to 48 hours. When needed, the model has also been applied to the Bering Sea and the Arctic coast of Alaska.

f) Wave models are run twice daily to provide sea state forecasts for the Gulf of Mexico and the Gulf of Alaska. Two regional models are used: one covering the western half of the Atlantic Ocean and the Gulf of Mexico, and the other covering the Gulf of Alaska and the Bering Sea. Based on the new NWW3, these models replaced the old operational models in early 2000.

g) A forecast of the Ultraviolet Index (UVI) is produced once per day (Long et al., 1996).

h) A seasonal ensemble climate forecast consisting of a 20-member ensemble of an atmospheric general circulation model (AGCM). The forecasts are run once per month with 28 levels and a horizontal resolution of approximately 200 km (T62). The system produces seasonally averaged forecasts out to 7 months and is scheduled for operational implementation in 2001.

i) The sea ice analysis has a resolution of 1/2 degree, which allows it to capture the rapid retreat and advance of the pack ice edge in spring and fall. This analysis is used as input to the Global Forecast System. A sea ice drift model provides guidance on drift distance and direction over the Northern Hemisphere and along the ice edges in both hemispheres. This year the guidance was extended from day 7 out to day 16.

6.4 Verification of Forecast Products for Year 2000

Annual verification statistics are calculated for NCEP's global models by comparing the model forecast to the verifying analysis, and by comparing the forecast interpolated to rawinsonde positions to the verifying rawinsonde observations (see Tables 4 and 5).
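The scores in Tables 4 and 5 are root-mean-square errors; for wind, the RMS vector error (RMSVE) combines both components of the vector difference. A minimal sketch of the two formulas (illustrative, not the operational verification code):

```python
import math

def rmse(forecast, analysis):
    """Root-mean-square error of a scalar field (e.g. 500 hPa height)."""
    n = len(forecast)
    return math.sqrt(sum((f - a) ** 2 for f, a in zip(forecast, analysis)) / n)

def rmsve(fu, fv, au, av):
    """RMS vector error of a wind field: each point contributes the
    squared magnitude of the vector difference between the forecast
    (fu, fv) and verifying (au, av) winds."""
    n = len(fu)
    total = sum((fu[i] - au[i]) ** 2 + (fv[i] - av[i]) ** 2 for i in range(n))
    return math.sqrt(total / n)

# Hypothetical check: a uniform 3 m/s u-error and 4 m/s v-error
# gives an RMS vector error of 5 m/s.
```

In the operational statistics the sums run over grid points (Table 4) or rawinsonde stations (Table 5), accumulated over the year.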


Table 4. Verification against the Global Analysis for 2000.

Statistic                        AVN 24 hr   AVN 72 hr   MRF 120 hr

500 hPa Geopotential RMSE (m)
  Northern Hemisphere               11.8        33.9        60.0
  Southern Hemisphere               16.4        44.1        73.7

250 hPa Wind RMSVE (m/s)
  Northern Hemisphere                4.7        10.6        16.1
  Southern Hemisphere                5.1        11.7        17.4
  Tropics                            4.3         7.5         9.3

850 hPa Wind RMSVE (m/s)
  Tropics                            2.9         4.7         5.7


Table 5. Verification against rawinsondes for 2000.

Statistic                        AVN 24 hr   AVN 72 hr   MRF 120 hr

500 hPa Geopotential RMSE (m)
  North America                     15.2        38.1        63.2
  Europe                            15.4        34.9        63.6
  Asia                              16.7        31.4        49.6
  Australia/New Zealand             12.3        25.8        40.7

250 hPa Wind RMSVE (m/s)
  North America                      7.1        13.2        14.0
  Europe                             6.2        11.7        18.4
  Asia                               7.1        11.6        15.3
  Australia/New Zealand              6.8        10.8        15.0
  Tropics                            6.5         8.4         9.8

850 hPa Wind RMSVE (m/s)
  Tropics                            4.6         5.9         6.7