I have an unmanned drone in my office. How the drone came to be there is a story for another day. The drone’s mere existence is our story for this week, since drones are coming to symbolize the difficulty, nuance, and sheer inanity of today’s regulatory challenges. Get used to it.

An acquaintance walked into my office last week and saw the drone. “You know,” he said, “I was standing in front of my apartment last week and saw some guy with an unmanned drone. It was more sophisticated than yours, with a camera and controls on his iPad and a display to stream the camera’s images. We watched as he flew the camera up a few hundred feet, right over Massachusetts General Hospital. Really cool.”

My colleague merely wanted to talk about unmanned drones as a hobby, and like he said, they are indeed really cool. But let’s flash back to torts class in first-year law school and start playing out the preposterous trouble unmanned drones can cause—because eventually they will cause trouble, on a wide scale. And if you are unclear on which corporate officer will get the call to help companies clean up this mess, look in the mirror.

Let’s say the drone hovered over MGH, camera running, just as a helicopter landed on the roof with a patient. The patient is wheeled across the landing pad to the elevator, and the drone’s camera beams that personal tragedy across the Internet in full-color high definition.

I’m sure compliance executives in the healthcare sector are already thinking in the right direction: “Uh-oh. HIPAA violation.” The patient’s personally identifiable information—his face—was exposed for all to see. Nursing homes have already suffered regulatory grief for posting photos of patients onto company Facebook pages and other social media sites, where viewers might piece together the patient’s medical condition by examining equipment or injuries visible in the photo. Our unmanned drone scenario isn’t a big leap to make.

So there’s our drone, hovering over the hospital helipad, exposing the hospital to a HIPAA violation as it films a patient wheeled across the roof. We all know the proper response to a regulatory compliance failure is to impose a control that will prevent future failures—that’s Compliance Theory 101, the kind of thing everyone learns at Compliance Officer School.

Well, what control can you establish to stop the unmanned drone, exactly? Do you put shielding over the helipad to block the drone’s view, and risk making the helicopter’s landing more difficult? Do you shoot the drone down? What if it then falls to the ground and hits someone on the head? Is the hospital liable for that injury, or the drone operator?

Those may sound like foolish hypothetical questions, but they lead to a real and powerful point: there is no effective control to address this situation. You can’t put a roof over a helipad. You can’t let private parties shoot down objects from the sky. The compliance officer at our hypothetical hospital has no means to keep his or her institution in full regulatory compliance. It can’t be done without creating another problem.

The point of this column (you were probably wondering about that) isn’t to get mired in drone regulation. Drones are simply a good example for raising a broader discussion about the limits of internal control as technology marches forward and gives individuals the means to create situations—like observing a hospital helipad—that the rest of us cannot control. How do we deal with that?

In theory, regulators will intervene with policies for how we all might use unmanned drones or similar newfangled tools that can cause trouble. The Federal Aviation Administration is studying how to regulate drones, for example. The fundamental problem, however, is that strict regulations won’t matter much without strict enforcement—since preventive controls won’t really solve anything, and when they don’t, enforcement is all you have left.

That is, if a preventive control like a roof over a helipad isn’t feasible, then who cares how strict FAA rules for unmanned drones are? The rules themselves will only be words, and won’t offer any help unless we somehow create a tough enforcement mechanism too. I can’t even imagine what that enforcement world might look like, let alone whether I’d want to live there.