Have any nerdy Congresspeople wanted to solve this ridiculous inefficiency in recent years? Seems like a natural "the Europeans kick our ass at forecasting" message would go over well and potentially negate the lobbying from The Weather Channel and others with a vested interest in perpetuating this stupid situation.


Yes, Congress passed the Weather Research and Forecasting Innovation Act back in 2017, which not only gave a massive funding injection to model development activities, data procurement, and more, but also formalized the new next-generation community modeling initiatives focused on tackling the dominance of the EC models in the 2-5 day forecast space.

The diagnosis of the problems of the American forecast modeling community here is based on flawed premises. There are three major factors that led to the ECMWF leapfrogging the US in day-ahead forecasting capability. The first is consolidated investment in supercomputing resources; the WRFIA tackles this by earmarking a much larger allocation of funding for NOAA's next-gen supercomputer, but it still pales in comparison to ECMWF's investments.

The second factor is the fragmentation of the research and operational weather modeling communities, driven by divergent investment from NOAA and the USAF in the '90s and 2000s; the USAF, in conjunction with NCAR, sponsored the development of the WRF model, which was widely adopted by the research community, while NOAA continued investing in the GFS lineage of models. The bifurcation of these communities slowed the ability to matriculate advances in model development into operations, and this was exacerbated by an old, closed-off approach at NOAA that made it extraordinarily difficult to run and develop the GFS on anything other than NOAA hardware.

Finally, the ECMWF went all-in on 4DVAR data assimilation in the late '90s, whereas the American community pursued a diversity of other approaches ranging from 4DVAR to ensemble Kalman filters. 4DVAR necessitates advances to core weather model software (e.g. you need to write a model's adjoint or its tangent linear model in order to actually use 4DVAR), and the US's failure to adopt it led, in my opinion, to a "double-edged sword" effect of (a) failing to provide the impetus to greatly improve the US modeling software suite and supporting tools, and (b) being left with a worse assimilation technique unless advanced EnKF methods are employed using very large ensembles of models (expensive).
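
To make the adjoint/tangent-linear point concrete, here is a minimal, toy strong-constraint 4DVAR sketch (this is not GFS or IFS code; the scalar model, error variances, and observation times are all invented purely for illustration). The gradient of the cost function comes from running the model forward, then sweeping the adjoint of the tangent linear model backward through the stored trajectory:

    import numpy as np
    from scipy.optimize import minimize

    # Toy nonlinear forecast model: scalar logistic growth.
    dt, a = 0.1, 1.5
    def step(x):       # nonlinear forward model M
        return x + dt * a * x * (1.0 - x)
    def step_tlm(x):   # tangent linear model: dM/dx evaluated at x
        return 1.0 + dt * a * (1.0 - 2.0 * x)

    n_steps, r, b = 40, 0.05**2, 0.2**2   # window length, obs and background error variances
    truth = [0.3]
    for _ in range(n_steps):
        truth.append(step(truth[-1]))
    rng = np.random.default_rng(0)
    obs = {k: truth[k] + rng.normal(0.0, np.sqrt(r)) for k in range(5, n_steps + 1, 5)}
    xb = 0.5   # background (prior) guess of the initial condition

    def cost_and_grad(x0_arr):
        x0 = float(x0_arr[0])
        traj = [x0]                        # forward pass: run the model, store the trajectory
        for _ in range(n_steps):
            traj.append(step(traj[-1]))
        J = 0.5 * (x0 - xb) ** 2 / b
        J += sum(0.5 * (traj[k] - y) ** 2 / r for k, y in obs.items())
        lam = 0.0                          # backward (adjoint) pass: push sensitivities to t=0
        for k in range(n_steps, 0, -1):
            if k in obs:
                lam += (traj[k] - obs[k]) / r
            lam *= step_tlm(traj[k - 1])   # apply the adjoint (the TLM's transpose; trivial for a scalar)
        return J, np.array([(x0 - xb) / b + lam])

    res = minimize(cost_and_grad, np.array([xb]), jac=True, method="L-BFGS-B")
    print("background:", xb, "4DVAR analysis:", res.x[0], "truth:", truth[0])

The backward sweep is the whole point: for a real NWP model, writing and maintaining that adjoint (and keeping it consistent with the evolving forward model) is an enormous software engineering effort, which is exactly the impetus I'm describing. An EnKF sidesteps the adjoint by estimating those sensitivities statistically from an ensemble, but doing that well requires a large ensemble, which is where the expense comes in.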

The other problem, as others have pointed out, is that there is no accountability in the US private-sector weather market. Virtually every player is re-transmitting raw model guidance or statistically post-processed forecasts using DiCast, _maybe_ with some manual tweaking of forecast fields by humans. But this is not transparent, and many companies - if we're being charitable here - are not honest about what they're actually doing to produce their forecasts. Put another way - there are a lot of BS claims out there, and it seems that investors have been more than happy to fund them over the past few years.
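
For a sense of what "statistical post-processing of raw guidance" means in its simplest form, here's a toy MOS-style sketch (this is not DiCast's method; the numbers and variable names are invented): fit a linear correction of past station observations against past raw model output, then apply it to today's guidance:

    import numpy as np

    rng = np.random.default_rng(1)
    raw_guidance = rng.normal(20.0, 5.0, size=200)                    # past raw 2 m temperature guidance (degC)
    observed = 0.9 * raw_guidance + 1.5 + rng.normal(0.0, 1.0, 200)   # matching station observations

    slope, intercept = np.polyfit(raw_guidance, observed, 1)          # simple linear "MOS" bias correction
    todays_raw = 23.4
    print("raw guidance:", todays_raw, "corrected forecast:", slope * todays_raw + intercept)

Real post-processing systems are far more elaborate (many predictors, per-site and per-lead-time fits, blending of multiple models), but the basic shape - raw guidance in, statistically corrected forecast out - is the same, which is part of why it's so hard for a consumer to tell where a given company's "proprietary forecast" actually comes from.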


I disagree that my (extremely broad) diagnosis of the problems with forecasting in America is based on flawed premises. You've provided thorough, correct, and important details here, but my comment was aimed at the broader HN audience, not at writing the central argument for a discussion of the historical timeline. I think what I said - "Horrible oversight by the federal govt (read, congress) of our technical/scientific forecasting resources means that our forecasting ability is extremely fragmented and poorly organized." - is a very concise summary of the things you've laid out here.

As for the details in your comments, the only thing I disagree with strongly is the comment about investment in computing. While sufficient computing resources are central to good forecasting, the lack of investment by NOAA in computing (I sit at NOAA) is a red herring. ECMWF is significantly better than either of the two available American forecasts because they are just better at what they do, all around - in particular with respect to data assimilation. I've sat in meetings with ECMWF forecasters who have asked for access to my in-situ data products, and their pipeline is as simple as "point us to the data, please". Their data assimilation pipeline is so much more sophisticated and thorough that catching up on that alone would be a huge, huge leap. Mind you, not just '4DVAR' the methodology, but literally the way that the community finds and integrates observational data.

ECMWF, the organization, is quite literally structured strictly to accomplish the goal of 'improve the forecast'. The American institutions, again broadly, are much more a conglomerate of associated researchers doing individual science projects while small teams work on specific segments of the forecast. Yes, we are attempting to fix this. No, we haven't fixed it yet.

This is not to say I don't think we should fund computing or that computing won't help. But we are quite literally 5-10 years of research behind on multiple fronts.


The thing about the computing is that it has shaped the culture surrounding NWP model development within the American modeling community. At ECMWF, there is capacity in the primary systems to support R&D, so the total cost to the community of maintaining this capability is much lower than in the US, where everything is fragmented. Had there been greater capacity for researchers to run the GFS on the limited number of systems with first-class (or any) support, it might have helped consolidate the community.

Totally acknowledge that there are other takes here, and I have a bit of skepticism about how much EPIC will really achieve and what it can do to resolve these issues. But I don't necessarily agree that the science at EC is 5-10 years ahead of the American community's. What has matriculated through R2O is definitely a few years ahead of us, especially for medium-range forecasting. But the US likely still maintains a competitive edge in mesoscale/rapid-refresh forecasting, and even though we've lost key developers to the private sector recently, the NBM seems (in my admittedly limited experience) to compare favorably to similar products out of ECMWF or other European national weather services.

Your point about ECMWF being fundamentally structured with the singular goal of improving the forecast is super important - I 100% agree with that, and the US has yet to do much of anything to address this.


Extremely valid points. Thanks for sharing your perspective, counters - it's much appreciated. One thing I think is fascinating, every time this comes up in a place like HN, is how detached the conversation often is from the meat and potatoes of forecasting. Which is to say, I've seen many a Googler think that weather forecasting is a simple problem and handwave the discussion away with "throw compute at the problem". It's always great when there are people around to ground the discussion.

Last point, on the specifics: you're very right that the American teams have nailed mesoscale/rapid-refresh forecasting. Which is, if we wanted to really divert the discussion, interesting and arguably more advanced because of its industrial applications (wind farms, etc.).

I think your most salient point is about how the extra compute resources foster a culture of improvement on the models themselves. I'm on the compute task team, and it's something I argue for every day. Are you a researcher? If so, send me a message. I used to have my NOAA email in my profile; it would probably be neat to connect professionally.



