In the Words of President Ronald Reagan, "Trust but Verify"

Behavioral Analytics Enables Verification That Users Are Doing the Right Thing and Promotes Transparency

Trust underlies almost every interaction we have with people and companies. Whether it's a buyer and seller, employee and employer, client and vendor, elected official and constituent, or a person lending a tool to a neighbor, we are all banking on the other party to act in good faith and do the right thing. When face-to-face interactions were commonplace, trust was easier to manage. We interacted with fewer people and companies, which allowed for well-established relationships, and we transacted using slow, less volatile paper. With the internet and the gig economy, we interact with far more entities and exchange far more data, which is all too easy to lose or expose. The concept and mechanisms of trust have changed in dramatic ways compared to just a few years ago.

The world is shifting to a transparency-based trust model, in which it is up to the community to validate trustworthiness and to exclude those who behave badly. Blockchain, the technology underlying cryptocurrency, is a good example of a community-based trust model (if not one completely based on transparency). On the flip side, the recent revelation that Uber hid the loss of some 57 million personal records will most likely be met with a strong reaction from its customers and the industry at large. Losing the data was bad enough, though unfortunately not that unusual these days. Hiding the breach for a year and paying the hackers $100k to keep it quiet could be the company's undoing. The Watergate-era lesson that the cover-up is worse than the crime recalls the words of the author Zig Ziglar: "If people like you, they'll listen to you, but if they trust you, they'll do business with you." However, before we start throwing stones from our glass house, if we are honest with ourselves as an industry, we know that this is probably not an isolated incident.

Given the ever-growing crisis of trust on the part of consumers, and upcoming regulations like GDPR that require rapid notification of loss events, it is time for companies to come clean about their cyber breach skeletons. Six months from now, losing those records would likely cost the company over $20M in GDPR fines, not to mention lost business and lawsuits.

Another good example of the trust model is the SWIFT (Society for Worldwide Interbank Financial Telecommunication) Customer Security Controls Framework. After several significant cyber heists at its member banks, SWIFT realized that the network is only as secure as the weakest bank that subscribes to it, and it subsequently established the controls framework, which was published earlier this year. A key part of the implementation process is a self-attestation by every participant on the network of their compliance with the framework, the first of which is due in January 2018. While this certification/self-attestation process is not a first (New York State's Department of Financial Services cyber regulation already has this requirement), what is unique is the publication of those attestations, and potentially of noncompliance, to all other SWIFT users. That kind of transparency promotes trust between counterparties who are acting in good faith and motivates participants to get their ducks in a row.

President Ronald Reagan taught us to “trust but verify,” meaning trust is great, but blind trust is dangerous.  

In the SWIFT framework, one of the eight principles is to “detect anomalous activity on systems or transaction records.” Detecting anomalous activity is all about using behavioral analytics to identify malicious or careless behavior amongst the vast volume of transactions occurring in the environment.  

While the concept of monitoring user behavior may seem to contradict the trust model, it actually complements it. Behavioral analytics enables verification that users are doing the right thing and promotes transparency. For users who are not putting the community at risk, behavioral analytics has no impact. For users who do something unusual, behavioral analytics determines whether the behavior indicates risky activity, whether malicious, careless, or compromised, that requires further investigation. If the investigation confirms the activity is business-justified, it can be "whitelisted" so it is not flagged in the future. Behavioral analytics has become a necessary tool in the trust-model arsenal, detecting insider threats and cyber breaches even when the user is not tripping alarms along the way, as the sketch below illustrates.
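To make the idea concrete, here is a minimal sketch of that flag-investigate-whitelist loop, not any particular product's implementation: each user is compared against their own historical baseline, behavior an analyst has already cleared is skipped, and only statistical outliers are escalated. The data, field names, and threshold are illustrative assumptions.

```python
from statistics import mean, stdev

# Hypothetical per-user history of daily transaction counts, plus a whitelist of
# (user, behavior) pairs that investigation has already confirmed as business-justified.
history = {"alice": [12, 15, 11, 14, 13], "bob": [3, 2, 4, 3, 2]}
whitelist = {("bob", "bulk_export")}

def is_anomalous(user, todays_count, threshold=3.0):
    """Flag activity that deviates sharply from the user's own baseline."""
    baseline = history.get(user, [])
    if len(baseline) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return todays_count != mu
    return abs(todays_count - mu) / sigma > threshold

def needs_investigation(user, behavior, todays_count):
    """Skip whitelisted behavior; otherwise escalate statistical outliers."""
    if (user, behavior) in whitelist:
        return False
    return is_anomalous(user, todays_count)

print(needs_investigation("alice", "bulk_export", 90))  # True: far above her baseline
print(needs_investigation("bob", "bulk_export", 40))    # False: already whitelisted
```

Real deployments model many more behavioral dimensions than a single count, but the principle is the same: trusted, ordinary behavior passes through untouched, and only the unusual is verified.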

I expect we will continue to see transparency- and community-based trust models showing up in commerce and regulations. It's a good time to get your cyber house in order so you are prepared when they knock on your door.

Original author: Steven Grossman