This billion-dollar supercomputer will create super-accurate weather forecasts

As usual, the latest supercomputer is x times faster than the present incumbent; in this latest iteration, let x equal six. The thing is, this one is costing a whopping £1.2 billion, the largest amount of money ever injected into what remains of the 167-year-old organisation founded by Vice-Admiral Robert FitzRoy. I notice that a lot of the UNIX diehards are upset that the Met Office have got into bed with Microsoft on this project, with dire warnings of ‘blue screens of death’.
I, like a lot of others, will find it hard to believe that they are able to tweak greater accuracy out of their existing weather and climate models when they can’t even forecast what next month’s weather will be like. Take today’s long-range forecast for next month as an example of what we can do with the current supercomputer and the models it’s running.

[Long-range forecast chart, courtesy UKMO]

I’m not convinced that the speed of the supercomputer is as vitally important as the accuracy of the NWP model it’s running. The supercomputer itself may be six times faster, but if that just means we arrive six times faster at an inferior forecast, what have we gained?

8 thoughts on “This billion-dollar supercomputer will create super-accurate weather forecasts”

  1. To pay for the computing power the Met Office has to employ fewer staff, so there is very little human interaction with the forecast product. On the written forecasts there seems to be no proofreading before the forecast is issued, so at times they make little sense. The civilian TAFs are computer generated, at times with several PROBs in each one, and at times making little sense. I wonder what pilots think of them. As for long-range forecasting, I think it is pointless even trying to do it. There are so many variables that the further out the forecast goes, the less likely it is to be correct.

    1. I completely agree with you. There must come a point of diminishing returns with ‘weather’ forecasts (rather than ‘climate’ predictions). That point at the moment seems to be at around the T+96 range, depending on the weather set-up, and beyond that we enter the world of science fiction. No matter how many ensemble members are run, they won’t produce any more precision; if they did, the long-range forecasts would be much more detailed and accurate, but thanks to chaos they aren’t. In the years to come we will need a paradigm shift in our NWP models, and I don’t think, even with the advances in AI, that day will ever come, because weather is too chaotic to model beyond a certain point.
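The chaos point above can be seen in miniature with the Lorenz-63 equations, the classic toy model of atmospheric convection. The sketch below is purely illustrative (it is not any operational NWP scheme; the time step, parameter values and perturbation size are my own arbitrary choices): two runs differing by one part in a billion in a single variable stay indistinguishable for a while, then diverge completely, which is why adding ensemble members cannot buy unlimited lead time.

```python
# Toy demonstration of sensitivity to initial conditions: Lorenz-63
# integrated with a simple forward-Euler step. Illustration only; dt,
# the parameters and the 1e-9 perturbation are arbitrary choices.
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def run(state, steps):
    for _ in range(steps):
        state = lorenz_step(state)
    return state

control = (1.0, 1.0, 1.0)
member = (1.0 + 1e-9, 1.0, 1.0)  # perturbed far below any observation error

spread = {}
for steps in (100, 1000, 2500):
    a, b = run(control, steps), run(member, steps)
    spread[steps] = abs(a[0] - b[0])
    print(f"after {steps:4d} steps, spread in x = {spread[steps]:.2e}")
```

The spread stays tiny at short range and then grows until the two runs are effectively unrelated, regardless of how precisely the perturbation was chosen.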

  2. It’s (correctly) not looked like raining much (if at all) here at T+120 for weeks; indeed, it doesn’t look like it will rain much (if at all) here at T+120, T+240 or T+360 at the moment! I think it’s a fair bet that will be correct.

  3. So they are moving from GIGO (garbage in, garbage out) weather models to fast GIGO weather models; as you say, “weather is too chaotic to model after a certain point”.

  4. What point would that be? It’s clear that the ‘certain point’ has been pushed further into the future, and I don’t think we know where that point actually is. Do you? It sounds like you do.

    Slow-moving situations are clearly difficult to predict even a few days ahead, but with a strong zonal flow advance warning of major systems can now appear at T+144, and I don’t see why that might not be pushed further. Maybe more computing power will make periods like now (not much movement, so forecasting is difficult) easier? Or maybe more computing power will help with forecasting thunderstorms?

    1. In my opinion computing power is not the problem; a lack of quality observations is, and currently there’s a dire shortage of surface observations going into the models from across the Atlantic. I don’t know whatever happened to weather reports from ships, which seem to have dried up entirely, although it is possible they are being kept private for fear of piracy in some parts of the world. There’s no reason why a network of North Atlantic buoys can’t be established. Until the 1970s we had weather ships A, B, C, D, I, J, K and M positioned at strategic locations across the North Atlantic. We have some buoys now, but often half of them are not reporting for some reason. Some of that billion pounds should be used to create that network. An accurate NWP forecast starts with good coverage of accurate observations at T+0.

  5. With good data and good programming you can get reliable results on an old BBC computer;
    with a billion-pound computer and either bad data or poor programming you’ll get fast, all-singing, all-dancing … rubbish, but people will be impressed and believe it because, ‘well, look at the price tag, how could it be wrong?’!

    1. Looking into my crystal ball, and very slightly changing the subject at the same time, I would have thought that jobs at the Met Office are safe this side of the COP26 conference, but beyond that there could well be swingeing cuts down in Exeter as the government tries to claw back some of the hundreds of billions spent on the COVID-19 crisis. I’m sure that in time the old organisation will gradually morph into a supercomputing centre that issues occasional dire warnings and projections about global heating, with just enough ancillary staff to ensure there’s someone to change the fuses and turn off the lights at night.

