During a session at Compliance Week Europe 2017 in Amsterdam today, Mark Johnson, CEO of The Risk Management Group, ran an interesting exercise to show just how easy it is to commit cyber-fraud. He produced an iPhone 7 and passed it around to members of the audience, encouraging them to activate Siri and ask it a few basic questions, such as “Who am I?” “Who did I call last?” and “E-mail my wife.” From those simple questions alone, one could glean a few kernels of information about Mark that, once supplemented by a few adroit Google searches, soon revealed his full name, his phone number, his home address, a picture of his house, the name of his wife, the birth date of his daughter, and other details. To a dedicated fraudster, capitalizing on a lost iPhone would have taken all of five minutes.

Johnson then explained how passwords really work. When we submit a password to a site, the site doesn’t actually store the password itself. Instead, the password is run through one of a number of commonly used hashing algorithms that turns it into a string of seemingly random characters. That string is called a “hash.” Because so many people use “password” as their password, however, attackers can reverse hashes back to their source passwords simply by precomputing the hashes of common passwords and looking up matches. And since password databases get compromised all the time, large lists of hashes can be easily obtained online (again through Google), showing that even when we think we have strong passwords, we really don’t. Companies can mix random data into each password before it is hashed (known as a “salt”), which defeats those precomputed lookups, but even that isn’t foolproof, because salts often can be compromised or accidentally exposed along with the hashes they protect. Johnson’s bottom line: passwords are essentially worthless, no matter what they are. “Mixlplyk1492!” is basically no better than “Password” once the hash is obtained and cracked.

Then Johnson gave a primer on the dark Web, and how purloined data gets bundled, packaged, and sold. He showed where a few sites on the most superficial levels of the dark Web might be found, but cautioned that nobody should visit them, because they are malware factories looking to infect computers. Among the criminal fraternity online, there really is no honor among thieves.

All of this pointed to a security regimen that is pretty good at giving people ignorant of the inner workings of IT security (read: 99.99 percent of the population) the sense that they are protecting themselves. But that protection is essentially meaningless. What’s worse, even strict new laws like the forthcoming GDPR will count for little if the people they are supposed to protect don’t know to ask for their personally identifiable data to be scrubbed from company lists, or if they ask nine companies and overlook the tenth, which then sells that information to a big data merchant, which might resell it or leave it on a laptop forgotten on a subway. The end result is that one’s personally identifiable information is out of one’s control, bundled with other data, making identity theft something limited not by technical difficulty, but by the energy and drive of thieves. That is not an encouraging prognosis.

The answer, Johnson suggested, was meaningful IT security law that is fairly simple in scope but deep in effect, such as requiring device makers to ship with data security features defaulted to “on,” so that users would have to opt out of heightened security (and acknowledge the increased risk and liability for doing so), rather than opt in to it. Device makers who don’t comply simply would not be able to sell their products in jurisdictions that require default security options. Just imagine if the European Union required that of all smartphone makers. It may seem heavy-handed, but as Johnson noted, the government installs the stop lights on the streets, not the drivers. And strong security measures that protect users’ privacy actually undermine the business models of search engines, social media companies, and other data merchants, so we can’t expect them to step up here and do what’s right for their users.

Meaningful data security is still a long way away, in large part because legislators with the power to enact tough new laws—those that go well beyond GDPR—lack the energy, intent, and focus to do so. Until they do, the best bet might be for companies themselves—and their compliance officers in particular—to highlight best practices for good data security, design protocols to ensure their use, and then enforce the adoption of those protocols to keep the company safe. A single compliance officer can’t save the world here. But he or she can help a company save itself. It’s not like Facebook or Google are ever going to.