Why, over the past year, “information security” has become another cargo cult.
Let’s imagine a person coming to a casino. The croupier tells him: here is the roulette wheel; red has come up 20 times in a row. What will you bet on? The correct answer: it makes no difference.
Every time the croupier throws the ball, either red or black comes up with equal odds, and previous results affect nothing. Intuition does not work in a casino; only probability theory does. The same is true in security, and of course in information security.
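The independence argument above is easy to check empirically. Below is a minimal sketch in Python, assuming a simplified wheel with only red and black at equal odds (the zero pocket is ignored) and a streak of 5 rather than 20, since 20-red streaks are too rare to sample quickly. The function names and parameters are illustrative, not from the original text.

```python
import random

def spin() -> str:
    """One spin of a simplified wheel: red or black with equal odds (no zero pocket)."""
    return random.choice(["red", "black"])

def red_after_streak(target_streak: int = 5, trials: int = 100_000) -> float:
    """Estimate P(next spin is red), conditioned on the previous
    `target_streak` spins all having been red."""
    streak = 0   # current run of consecutive reds
    hits = 0     # how many qualifying "next spins" we have observed
    reds = 0     # how many of those came up red
    while hits < trials:
        result = spin()
        if streak >= target_streak:   # the preceding spins were all red
            hits += 1
            if result == "red":
                reds += 1
        streak = streak + 1 if result == "red" else 0
    return reds / hits

# The estimate stays near 0.5 no matter how long the preceding streak was.
print(red_after_streak())
```

However long the streak, the conditional frequency of red hovers around 0.5: the wheel has no memory, exactly as the croupier’s “what’s the difference?” implies.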
Information security
We run into companies that say: “We’re protected. We haven’t been attacked in 20 years.” This is the illusion of control. If you haven’t been hacked yet, that is not your achievement; the hackers simply haven’t gotten around to you. Not hacked in 20 years? They will be tomorrow. The success of a future hack does not depend on past security experience. It is the same with an airplane: every flight is independent, and it does not matter whether that airline has had planes crash before or not.
I don’t want to paint the wrong picture. Computer crime exists, it has become organized, and it is a huge problem. We talk about it a lot in our industry. But the point is that you have to assess the risks properly in every situation, and do it scientifically. Our business is forensics and investigative analytics, and all of our investigations are done for large, wealthy companies.
Corporations and banks are thought to be well protected: they spend a lot of money on information security and buy new technology. Yet in practice they act only once an incident has already happened.
So what is wrong here? The main reason is that people and companies do not understand what to protect themselves from, because they do not understand what computer crime is.
The very concept of “security” is defined by its opposite: danger. And danger comes not from technology but from people. To understand what to protect yourself from, you have to understand who the computer criminals are and what they want. That is the foundation.
Pseudoscience triumph
2016 was the year of pseudoscience’s triumph, and information security became surrounded by myths and legends.
If the media and some industry experts are to be believed, “Russian hackers” have practically taken over the world. They allegedly hacked the servers of the Democratic Party in the United States and brought Donald Trump to power. They stole the WADA database. They left Ukraine without electricity, cyberattacked the Bundestag and the Polish Foreign Ministry, and are preparing to attack nuclear power plants and factories.
Fear sells well. Show someone 12 slides of plane crashes and flying will seem terribly dangerous. In reality this is overgeneralization: an unfounded transfer of isolated cases onto the whole picture of the world; in plain language, a “conspiracy theory.”
Our Threat Intelligence monitoring statistics show that Russian-speaking criminals really are behind 80% of the most complex and high-profile cases in the world. But 99% of all incidents are theft of money: hackers are interested only in money. The remaining roughly 1% of crimes we split in half between espionage and cyberterrorism. Granted, a single attack like Stuxnet (the virus that attacked Iran’s uranium enrichment plant) could bring the world to the brink of World War III, but such an event is still extremely unlikely.
We are told that hackers will hack nuclear power plants and factories en masse. But why would a computer criminal attack a nuclear power plant? It earns him no money and sics the intelligence services of the whole world on him. Computer criminals seek maximum monetization at minimal risk: it is much easier to attack a bank account than to break into a nuclear power plant.
Failure to understand the attack vector leads to a paradoxical result: scattered budgets. It is as if Belarus bought several nuclear-powered submarines. Fine weapons, powerful ones. But what would it do with those submarines? Belarus has no access to the sea!
The wonderful Nobel laureate in economics Daniel Kahneman wrote a book called “Thinking, Fast and Slow.” It is a very good book for people in the security business. It is about cognitive biases: about how our brains misjudge risks and cannot cope with probability theory.
One of the most common traps the brain falls into is amplification: building endless possible defenses on the basis of incorrect data. I once saw a report from a well-known consulting company in which they were pushing a risk assessment built around huge sums of money that never existed! We often see even the Big Four base their risk assessments on surveys.
I closely follow the latest scientific research that helps people work and live better. We live in an insanely interesting time: a vast array of sources of information and research that answer questions almost about the meaning of human life. Yet people continue to believe in homeopathy and remain captive to myths. I hope none of you is being treated with homeopathy, because homeopathy does not help. To build the right security strategy, you have to know who the enemy is, how he operates, and what tools he uses. Knowledge is the foundation of fighting the enemy. If you do not know whom you are defending against, it is useless to defend yourself.