Security teams carefully monitor potential threat activity, but incidents aren’t always black and white.
It is just as tiring for security teams to keep saying “No” as it is for every other department to keep hearing it. To preserve some level of smooth operations in an organization, security teams need to find a way to let employees move data around while still protecting digital assets like IP and customer data.
Imagine a colleague starts uploading sensitive IP files to the cloud – is he planning to share them with a competitor or with an approved agency developing messaging for an upcoming release? What if an executive is accessing personnel info from the HR system while traveling overseas – is she handling critical tasks from the road or does someone else have access to her laptop?
The problem is that many risky behaviors live in a “grey space”: they could easily be malicious, innocent, or even unintentional. Blocking all of these “grey” activities simply forces employees to find workarounds. The key is to use contextual clues and patterns of behavior to determine whether an activity is malicious or innocent, and whether the user apparently performing it is really the one behind it.
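As a rough illustration of combining contextual clues, consider a minimal scoring sketch. Everything here is an assumption for illustration – the signal names, the weights, and the thresholds are hypothetical, not any real product's logic:

```python
# Hypothetical sketch: scoring a "grey" activity from contextual signals.
# Signal names, weights, and thresholds are illustrative assumptions only.

def risk_score(activity: dict) -> float:
    """Combine contextual clues into a single risk score in [0, 1]."""
    signals = {
        "destination_unknown": 0.35,  # upload target not on an approved list
        "off_hours": 0.15,            # activity outside the user's usual hours
        "volume_spike": 0.30,         # far more files than the user's baseline
        "sensitive_label": 0.20,      # files tagged as IP or customer data
    }
    return sum(weight for name, weight in signals.items() if activity.get(name))

def triage(activity: dict) -> str:
    """Clear cases are automated; grey cases go to a human analyst."""
    score = risk_score(activity)
    if score >= 0.7:
        return "block"   # strong converging evidence of risk
    if score >= 0.3:
        return "review"  # grey space: route to an analyst with context attached
    return "allow"       # routine behavior

# A colleague uploading labeled IP to an unapproved destination lands in
# analyst review rather than a hard block (score 0.55).
print(triage({"sensitive_label": True, "destination_unknown": True}))
```

The point of the middle band is exactly the article's argument: instead of forcing every ambiguous case into allow-or-block, the system hands the grey cases to a human with the context attached.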
Knowing What You Don’t Know
The cybersecurity industry can’t avoid dealing with the ambiguity of the actions its machines observe. There is no wall that can be built to protect data, efforts to thwart social engineering often fail, and attack surfaces continue to grow. As information systems become increasingly complex, we are demanding that our technology “make sense” of a vast volume of vague environmental inputs, many of which are associated with human behavior.
We are asking systems to seamlessly balance information generated by innumerable sensors with varying degrees of sensitivity, and to automatically prioritize and reason through these inputs. But machine-centric tools struggle to cope with ambiguity: algorithms cannot always weigh every salient variable and make a confident decision about whether to allow or block a risky action. Humans, on the other hand, are better at balancing multiple variables and context to make decisions – especially when dealing with the unexpected – but cannot reckon with the massive amounts of data security requires.
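One way to picture a machine that cannot make a confident decision is an “abstain when unsure” rule: the system auto-decides only when its confidence clears a bar, and defers everything else to a person. The probability input and threshold below are stand-ins for any model's output, not a real API:

```python
# Illustrative sketch of confidence-gated decisions. The probability and
# threshold values are hypothetical stand-ins for a real model's output.

def decide(p_malicious: float, auto_threshold: float = 0.9) -> str:
    """Auto-block or auto-allow only with high confidence; otherwise defer."""
    if p_malicious >= auto_threshold:
        return "block"
    if p_malicious <= 1 - auto_threshold:
        return "allow"
    return "defer_to_analyst"  # the machine admits ambiguity instead of guessing

print(decide(0.97))  # block
print(decide(0.05))  # allow
print(decide(0.55))  # defer_to_analyst
```

This is the division of labor the paragraph describes: the machine handles the clear-cut volume, and the human handles the unexpected middle.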
The Odd Couple of Man and Machine
Striking the right balance requires that we leverage the power of technology while simultaneously capitalizing on human insight. Computers are uniquely capable of coping with and summarizing large amounts of data. Technology can also detect things that humans cannot, such as the volume of files a person sends from an email account, file access patterns, or how data flows within a network. Tools can automatically alert human analysts when a person’s or a network’s behavior falls within the ambiguous “grey space,” but these efforts typically result in a frustrating number of false alarms.
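A concrete example of a signal a machine can track but a human cannot watch by eye is a user's own outbound-file baseline. The sketch below flags a day as normal, grey, or alert by comparing it to the user's recent history; the z-score thresholds are assumptions for illustration:

```python
# Sketch: compare today's outbound file count against a user's own rolling
# baseline. Thresholds and the sample data are illustrative assumptions.

from statistics import mean, stdev

def grey_space_alert(history: list[int], today: int) -> str:
    """Classify a day as normal, grey-space review, or alert via z-score."""
    mu, sigma = mean(history), stdev(history)
    z = (today - mu) / sigma if sigma else float("inf")
    if z > 6:
        return "alert"   # extreme spike: notify an analyst immediately
    if z > 3:
        return "review"  # grey space: queue with context, don't auto-block
    return "normal"

uploads = [4, 6, 5, 7, 5, 6, 4]  # files sent per day over the past week
print(grey_space_alert(uploads, 40))  # a 40-file day is an extreme outlier
```

Note that the false-alarm problem the paragraph mentions lives in these thresholds: set the bars too low and analysts drown in "review" tickets, which is why baselines are computed per user rather than globally.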
The meaning that we make from the world around us depends heavily on our ability to sense things, and our subsequent ability to make sense of things. This is complicated by living in a world full of ambiguities. Despite the challenges, there are many positives to embracing ambiguity. It allows us to question our preconceived notions and explore new possibilities for understanding the world. We can ask why something exists or why something happens – and there may not be one right answer.
Humans may not always be comfortable embracing ambiguity, but we are better at coping with it than machines are – especially when presented with something we’ve never experienced before. We have exceedingly advanced sensory integration capabilities, allowing us to automatically and immediately transform information from our five senses into rich symbolic meaning.
In a world full of cybersecurity threats that morph faster than our technology can adapt, it’s not enough to build technology based on yesterday’s understanding of danger. It’s not enough to rely on outmoded categories of black or white. We must create technology that can adjust to changing conditions and make meaning out of the wide range of behaviors and activities that fall into that “grey space.”
We can start by understanding that humans excel at identifying ambiguity and dealing with abstract constructs. We’re able to devise strategies for exploring or tackling ambiguity without supervision and can create a reasoning approach to deal with whatever is occurring in the environment at any given moment. By combining our human ability to adjust and adapt to changing conditions with technology’s power and scope, we can start to uncover the truth to be found in the grey space, and begin to prepare for the tremendous cybersecurity threats to come. – Threat Post