"Angel hair pasta rather than fine-tuned code"
-
Trust the scientists, they said...
The Covid-19 modelling that sent Britain into lockdown, shutting the economy and leaving millions unemployed, has been slammed by a series of experts.
Professor Neil Ferguson's computer coding was derided as “totally unreliable” by leading figures, who warned it was “something you wouldn't stake your life on”.
The model, credited with forcing the Government to make a U-turn and introduce a nationwide lockdown, is a “buggy mess that looks more like a bowl of angel hair pasta than a finely tuned piece of programming”, says David Richards, co-founder of British data technology company WANdisco.
“In our commercial reality, we would fire anyone for developing code like this and any business that relied on it to produce software for sale would likely go bust.”
The comments are likely to reignite a row over whether the UK was right to send the public into lockdown, with conflicting scientific models having suggested people may have already acquired substantial herd immunity and that Covid-19 may have hit Britain earlier than first thought. Scientists have also been split over Covid-19's fatality rate, which has resulted in vastly different models.
Up until now, though, significant weight has been attached to Imperial's model, which placed the fatality rate higher than others and predicted that 510,000 people in the UK could die without a lockdown.
It was said to have prompted a dramatic change in policy from the Government, causing businesses, schools and restaurants to be shuttered immediately in March. The Bank of England has predicted that the economy could take a year to return to normal, after facing its worst recession for more than three centuries.
The Imperial model works by using code to simulate transport links, population size, social networks and healthcare provisions to predict how coronavirus would spread. However, questions have since emerged over whether the model is accurate, after researchers released the code behind it, which in its original form was “thousands of lines” developed over more than 13 years.
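For anyone curious what "code to simulate ... how coronavirus would spread" looks like in miniature, here is a hedged sketch of that general class of model: an individual-based simulation in which infection passes between randomly chosen contacts. It is not the Imperial/CovidSim code, and every name and parameter value below is invented purely for illustration.

```python
# Not the Imperial/CovidSim code: a deliberately tiny sketch of the same
# general idea, an individual-based simulation in which infection spreads
# through random contacts. Every parameter value here is made up.
import random

def simulate(population=1000, initial_infected=5, contacts_per_day=8,
             p_transmit=0.03, days_infectious=7, days=120, seed=42):
    random.seed(seed)
    # 0 = susceptible, n > 0 = infectious for n more days, -1 = recovered
    state = [0] * population
    for person in random.sample(range(population), initial_infected):
        state[person] = days_infectious

    infectious_per_day = []
    for _ in range(days):
        newly_infected = []
        for s in state:
            if s > 0:  # each infectious person meets some random contacts
                for contact in random.choices(range(population), k=contacts_per_day):
                    if state[contact] == 0 and random.random() < p_transmit:
                        newly_infected.append(contact)
        for person, s in enumerate(state):  # advance disease progression
            if s > 0:
                state[person] = s - 1 if s > 1 else -1
        for person in newly_infected:
            if state[person] == 0:
                state[person] = days_infectious
        infectious_per_day.append(sum(1 for s in state if s > 0))
    return infectious_per_day

if __name__ == "__main__":
    curve = simulate()
    peak = max(curve)
    print("peak infectious:", peak, "on day", curve.index(peak))
```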
A broader question: how are those models actually working out?
-
Sensitive dependence on initial conditions.
-
@Aqua-Letifer said in "Angel hair pasta rather than fine-tuned code":
Sensitive dependence on initial conditions.
Chaos theory.
I'd post a picture of Jeff Goldblum but then I'd have to murder my computer screen.
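For anyone who wants to see what that phrase means in practice, here's a hedged, textbook-style illustration (nothing to do with any epidemic model) using the logistic map, a classic chaotic system: two starting values that differ only in the ninth decimal place stop resembling each other within a few dozen steps.

```python
# A textbook illustration of "sensitive dependence on initial conditions"
# using the logistic map x -> r*x*(1-x) in its chaotic regime (r = 4.0).
# Nothing here comes from any epidemic model; the point is only how fast
# two nearly identical starting values diverge.
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000000)
b = logistic_trajectory(0.200000001)  # differs in the 9th decimal place

for step in (0, 10, 20, 30, 40, 50):
    print(f"step {step:2d}: |difference| = {abs(a[step] - b[step]):.6f}")
```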
-
So, what did the UK do differently from pretty much every other country based on this buggy code?
More importantly, why would I listen to somebody who called his company WANdisco?
-
"Fine-tuned" for what? For today's software developers, that usually means getting the code to run efficiently within the given execution environment, usually along the line of running with the least amount of memory or storage, getting results within the least amount of time, or consuming the least amount of power to complete certain tasks.
Scientists and statisticians are not typically trained to write "fine-tuned code."
Computer code need not be "fine-tuned" to produce correct results.
"Fine-tuned" code can still produce wrong results if the model or data fed into the code are wrong.