I’ve worked a lot with research organisations and there is a trick they use to look clever. When a report is looking a bit weak, you insert some sourced graphs to make it look more intelligent.
How many needless graphs and charts have you seen in media?
Graphs are summarised information; they make us look as if we have real knowledge of a topic.
We must stop treating visual representation of data as something that is special and clever – or it will easily baffle us.
Why am I ranting about this? Because data analytics is still treated as if it’s a dark art.
Decision-making engines – i.e. analytics apps that make decisions based on what the data says and the rules they are given – are dumb.
They do what you tell them to do, and without your rules and boundaries of ‘if this then that’ they struggle to grasp the context of the data.
Decision-making engines can do a lot of good. They can send you an email reminding you to book an appointment with the doctor if your blood pressure is high. They can call an engineer to a fault on a train. Or they can summon the police to an area if they recognise gunshots fired.
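To make the point concrete, here is a minimal sketch of what such an ‘if this then that’ engine amounts to. The thresholds, field names and actions are entirely hypothetical – the point is that the engine only compares inputs against rules it was handed, nothing more.

```python
def decide(reading):
    """Map a sensor reading to an action using fixed rules.

    The engine has no understanding of context: it only checks
    values against thresholds someone else chose for it.
    """
    if reading.get("systolic_bp", 0) > 140:  # hypothetical threshold
        return "email: book a doctor's appointment"
    if reading.get("train_fault_code") is not None:
        return "dispatch engineer"
    if reading.get("audio_pattern") == "gunshot":
        return "alert police"
    return None

print(decide({"systolic_bp": 155}))  # → email: book a doctor's appointment
```

Feed it a reading outside its rules and it does nothing – it has no notion of ‘something looks off’ beyond what it was explicitly told.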
To make these decisions they need input, which is more frequently coming from sensor technology.
And this is where things can go tits up.
Take Volkswagen, which is still cleaning up its own mess after it programmed its diesel cars to switch the emissions controls into a ‘test’ mode, reducing the pollution testers would be able to find.
The cars were put into another mode purely to pass the emissions test. But the data showed the cars passing – so they got the green light.
We are about to see many more cases of cheating decision-making systems.
Take gunshot sensors, which alert police to an area when they hear a shot fired.
What if you could spoof that noise next to a sensor? Or what if you could hack the system and relay the same pattern (in ones and zeros) to the decision-making engine? It is programmed to alert police when it sees that data pattern – not when it hears gunfire, because it doesn’t know what gunfire is.
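The replay problem can be sketched in a few lines. Everything here is made up – the ‘signature’ is just an invented byte pattern – but it shows why the engine can’t tell a genuine capture from a replayed one: both contain the bits it was told to look for.

```python
# Hypothetical fingerprint the engine was trained/configured to match.
GUNSHOT_SIGNATURE = b"\x01\x0f\x3a\x9c"

def engine(payload):
    # The engine only knows the pattern, not what gunfire is.
    return GUNSHOT_SIGNATURE in payload

real = b"(mic noise)" + GUNSHOT_SIGNATURE + b"(mic noise)"  # genuine capture
replayed = GUNSHOT_SIGNATURE                                # attacker replays the bits

print(engine(real), engine(replayed))  # → True True: indistinguishable
```

From the engine’s point of view the two inputs are identical, which is exactly the attacker’s point.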
So how do you ensure your sensors are telling the truth?
I’m going to blog about the new technologies coming onto the market that tackle this, but until then… we must assume data is often wrong and, no matter how pretty the graphics, understand that machines are dumb.
Otherwise we are dumber.