Cybersecurity and the AI Arms Race

We had a very productive field trip to the University of Waterloo for their Cybersecurity and Privacy Conference last week. From a teacher’s point of view, I had to do a mad dance trying to work out how to be absent from the classroom, since our school’s needs days got cut and suddenly any enrichment I’m looking for seemingly isn’t possible. I managed to find some board support through our Specialist High Skills Major and pathways programs and was able not only to arrange getting thirty students and two teachers out to this event, but also to do it without touching the school’s diminished cache of teacher out-of-the-classroom days.

We arrived at the conference after the opening keynote had started. The only open tables were the ones up front (adults are the same as students when it comes to where you sit in a room). Sarah Tatsis, BlackBerry’s VP of Advanced Technology Development Labs, kindly stopped things and got the students seated. The students were nervous about being there, but the academic and industry professionals were nothing but approachable and interested in their presence.


What followed was an insightful keynote on BlackBerry’s work developing secure systems in an industry famous for failing fast and early. Companies that take a more measured approach to digital technology can sometimes seem out of step with the rock-star Silicon Valley crowd, but after a day of listening to software engineers from various companies lamenting ‘some companies’ (no one said the G-word) who throw unfinished software out and then iterate (and consider that a virtue), the hard work of securing a sustainable digital ecosystem seems further and further out of reach. The frustration in the air was palpable, and many expressed a wish for more stringent engineering in online applications.

From Sarah Tatsis I learned about Cylance, BlackBerry’s AI-driven cybersecurity system. This reminded me of an article I read in WIRED recently about Mike Beck, a (very) experienced cybersec analyst working on a system called Darktrace that uses artificial intelligence to mimic his skills and experience in tracking down intrusions.

I spent a good chunk of this past summer becoming the first high school teacher in Canada qualified to teach Cisco’s CCNA Cyber Operations course which, as you can gather from the name, is focused on the operational side of cybersecurity. After spending that time learning about the cyber-threatscape, I became more and more conscious of how attackers have automated the attack process. Did you know criminals with little or no skill or experience can buy an exploit kit that gives them a software dashboard? From that easy-to-use dashboard, complex attacks on networks are a button push away.

So, bad actors can perform automated attacks on networks with little or no visibility or experience. On the other side of the fence you’ve got people in a SOC (so much of this field is acronyms – that’s a Security Operations Centre) picking through anomalies in the system and then analyzing them as potential threats. That threat analysis is based on intuition developed from years of experience. Automating the response to automated attacks only makes sense.

In the WIRED article they make a lot of hay about how AI-driven systems like Darktrace or Cylance could reduce the massive shortage of cybersecurity professionals (because education seems singularly uninterested in helping), but I don’t think that will happen. In an inflationary technology race like this, when everyone ups their technology it amplifies the complexity and importance of the jobs involved; it doesn’t make them go away. I think a better way to look at this might be with an analogy to one of my other favourite things.

Automating our tech doesn’t reduce our effort. If anything it amplifies it. The genius of Marc Marquez can only be really understood in slow motion as he drifts a 280hp bike at over 100mph. That’s what an AI arms race in cybersec will look like too – you’ll only be able to watch it played back in slow motion to understand what is happening.

What’s been happening to date is that bad actors have automated much of their work, sort of like how a bicycle automated the pedaling by turning into a motorcycle. If you’re trying to race a bicycle (human-based cyber-defence) against a motorcycle (bad actors using automated systems) you’re going to quickly find yourself dropping behind – much like cybersecurity has. As the defensive side of things automates, it will amplify the importance of an experienced cybersec operator, not make them irrelevant. The engines will take on the engines, but the humans at the controls become even more important and have to be even more skilled, since the crashes are worse. Ironically, charging cyber-defence with artificial intelligence will mean fewer clueless script kiddies running automated attack software and more crafty cybercriminals who can ride around the AI.

I’ve also been spending a bit of time working with AI in my classroom and can appreciate the value of machine learning, but it’s a data-driven thing, and when it’s working with something it has never seen before you quickly come to see its limitations. AI is going to struggle, especially with things like zero-day threats. There’s another vocab piece for you – zero-day threats are attacks that have never been seen before, so there is no established defence!
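To make that limitation concrete, here is a minimal sketch in Python of the data-driven, baseline-and-threshold thinking these tools are built on. To be clear, this isn’t BlackBerry’s or Darktrace’s actual approach, and the traffic numbers are invented for illustration; it just shows why a model that only knows ‘normal’ only catches what looks abnormal:

```python
# A toy, data-driven detector: learn what "normal" traffic looked like, then
# flag anything that drifts too far from that baseline. Feature values are
# invented for illustration: [packets/sec, avg packet size, failed logins/min].
import numpy as np

# "Training data": a tiny, made-up sample of normal traffic on our network.
normal_traffic = np.array([
    [120, 540, 0],
    [115, 560, 1],
    [130, 530, 0],
    [125, 550, 1],
], dtype=float)

baseline_mean = normal_traffic.mean(axis=0)
baseline_std = normal_traffic.std(axis=0) + 1e-9  # avoid dividing by zero

def looks_suspicious(sample, threshold=3.0):
    """Flag a sample if any feature sits too many standard deviations from baseline."""
    z_scores = np.abs((np.asarray(sample, dtype=float) - baseline_mean) / baseline_std)
    return bool((z_scores > threshold).any())

# A noisy brute-force attempt stands out: failed logins are far above baseline.
print(looks_suspicious([118, 545, 40]))  # True

# A zero-day crafted to blend in looks statistically "normal" and slips through.
print(looks_suspicious([122, 548, 1]))   # False
```

The real products are vastly more sophisticated than a few z-scores, but the underlying weakness is the same: the model only knows the data it was trained on, which is exactly why the experienced human in the SOC still matters.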

Once a vulnerability is found in software it’s often held back and sold to the highest bidder. If you discovered a backdoor into banking software, imagine what that might sell for. Did you know there is a huge online market for zero-day exploits? Between zero-day attacks, nation-state cyberwar on a level never seen before and increasingly sophisticated cybercriminals (some of whom were trained in those nation-state cyberwar operations), the digital space we spend so much of our time in, and that more and more of our critical infrastructure relies on, is only going to get more fraught. If you feel like our networked world and all this cybersecurity stuff is coming out of nowhere, you ain’t seen nothing yet. AI may very well help shore up the weakest parts of our cyber-defence, but the need for people going into this underserved field isn’t going away any time soon.


***


Where did the Cybersecurity & Privacy Conference turn next?  To privacy!  Which is (like most things) more complicated than you think.  The speakers on stage ranged from legal experts to sociologists and tackled the concept from many sides, with an eye to exposing how our digitally networked world is eroding our expectations of privacy.


I found the discussion fascinating, as did my business colleague, but many of the students found this lecture-style information delivery exhausting. When I asked who wanted to stick around in the afternoon for the industry panel on ‘can we fix the internet’, only a handful had the will and interest. We had an interesting discussion afterwards about whether or not university is a good fit for most students. Based on our time at the conference, I’d say it isn’t – or they just haven’t grown into the brain they need to manage it yet. What’s worrying is that in our increasingly student-centred, digital classrooms we’re not graduating students who can handle this kind of information delivery. That kind of metacognitive awareness is gold if you can find it in high school, and field trips like this one are a great way to highlight it.


The conference (for us anyway) wrapped up with an industry panel asking the question, “Can the Internet be saved?” In the course of the discussion, big ideas like public, secure internet for all (i.e. treating our critical ICT infrastructure with the same level of intent as we do our water, electrical and gas systems) were bandied about. One of my students pointed out that people don’t pirate software or media for fun, they do it because they can’t afford it, which exposes them to all sorts of hazards. There was no immediate answer for this, but many of the people up there were frustrated by the digital divide. As William Gibson so eloquently said, “the future is already here – it’s just not evenly distributed.” That lack of equity in entering our shared digital space, and the system insecurity this desperation causes, was a recurring theme. One speaker pointed out that a company fixated only on user numbers has a dangerously single-minded obsession, one that is undermining the digital infrastructure that increasingly manages our critical systems. If society is going to embrace digital, then that future had better reach everyone, or there will always be people upsetting the boat because they aren’t afforded a seat on it. That’s also assuming the people building the boats are more interested in including everyone than in chasing next quarter’s earnings.


This conversation wandered in many directions, yet it always came back to something that should be self-evident to everyone: if we had better-educated users, most of our problems would disappear. I’ve been trying to drive this ‘education is the answer’ approach for a while now, but interest in picking up that responsibility seems to slip off everyone, from students and teachers to administration at all levels. We’re all happy to use digital tools to save money and increase efficiency, but we want to take no individual responsibility for them.


I’ve been banging this drum to half-empty rooms for over a year now. You say the c-word (cybersecurity) and people run away, then get back on their networked devices and keep doing the same silly things they’ve always done. Our ubiquitous use of digital technology is like everyone getting a new car that’s half finished and full of safety hazards, then driving it on roads where no one can be bothered to learn the rules. We could do so much better. How digital skills isn’t a mandatory course in Ontario high schools is a mystery, especially when every class uses the technology.



I was surprised to bump into Diana Barbosa, ICTC’s Director of Education and Standards, at the conference. She was thrilled to see a troop of CyberTitans walk in and interrupt the opening keynote. The students, including a number of Terabytches from last year’s national finalist team who met Diana in Ottawa, were excited to have a chat and catch up. This kind of networking is yet another advantage of getting out of the classroom on field trips like this. If our pathways lead at the board hadn’t helped us out, all of that would have been lost.


We left the conference early to get everyone back in time for the end of the school day. When I told them on the bus ride home that we’d been invited back, they all gave out a cheer. Being told you belong in a foreign environment like an industry and academic conference full of expert adults is going to change even more student trajectories. If our goal is to open up new possibilities for students, this opportunity hit the mark.


From a professional point of view, I’m frustrated with the lack of cohesion and will in government and industry to repair the fractured digital infrastructure they’ve made. Lots of people have made a lot of money driving our society onto the internet. The least they could do is ensure that the technology we’re using is as safe as it can be, but there seems to be no short-term gain in it.


The US hacked a drone out of the sky this summer.

Some governments have militarized their cyber-capabilities and are building weapons-grade hacks that will trickle down into civilian and criminal organizations. In this inflationary threat-scape, cybersecurity is gearing up with AI and operational improvements to better face these threats, but it’s still a very asymmetrical situation. The bad actors have a lot more going for them than the too few people trying to protect our critical digital infrastructure.


Western governments have stood by and let this happen with little oversight, and the result has been a wild west of fake news, election tampering, destabilizing hacks and hackneyed software. There are organizations in this space playing a long game. If this digital revolution is to become a permanent part of our social structure, a part that runs our critical infrastructure, then we all need to start treating networked infrastructure as something more than an entertaining diversion.


One of the most poignant moments for me was when one of the speakers asked the audience, full of cybersecurity experts, who they should call, police-wise, if their company gets hacked. There was silence. In a room full of experts no one could answer, because there is no answer. That tells you something about just how asymmetrical the threat-scape is these days. Criminals and foreign powers can hack at will, knowing there will be no repercussions, because there are none.


Feel safer now?  Reading this?  Online?  I didn’t even tell you about how many exploit kits drop hidden iframe links into web pages without their owners even knowing, silently infecting any machine that loads the page.  Or about the explosion of tracking cookies designed to sell your browsing habits to any interested party.
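For the curious (or the newly worried), here is a toy Python sketch of the kind of check a page owner could run on their own HTML to spot that sort of injection. The page and the URL in it are made up for illustration; a real injected iframe would be buried in thousands of lines of legitimate markup:

```python
# Scan HTML for iframes styled to be invisible (zero size or hidden via CSS),
# the classic footprint of an injected exploit-kit redirect.
from html.parser import HTMLParser

class HiddenIframeFinder(HTMLParser):
    """Collect the src of any iframe that is sized to zero or hidden with CSS."""
    def __init__(self):
        super().__init__()
        self.suspicious = []

    def handle_starttag(self, tag, attrs):
        if tag != "iframe":
            return
        attrs = dict(attrs)
        style = (attrs.get("style") or "").replace(" ", "").lower()
        hidden = (
            attrs.get("width") == "0"
            or attrs.get("height") == "0"
            or "display:none" in style
            or "visibility:hidden" in style
        )
        if hidden:
            self.suspicious.append(attrs.get("src") or "(no src)")

# A made-up page with one injected, invisible iframe hiding among normal content.
sample_page = """
<html><body>
  <h1>Our perfectly ordinary school fundraising page</h1>
  <p>Nothing to see here...</p>
  <iframe src="http://example.invalid/landing.php" width="0" height="0"></iframe>
</body></html>
"""

finder = HiddenIframeFinder()
finder.feed(sample_page)
print(finder.suspicious)  # ['http://example.invalid/landing.php']
```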

***

I’m beating this drum again at ECOO’s #BIT19 #edtech Conference in Niagara Falls on November 6, 7 and 8…
