Cyber 9/12: Why We Fight – Learning from competition. By Robert Carolina, Senior Visiting Fellow

Once again, 2020 was a great year for CDT student participation in the Atlantic Council “Cyber 9/12 Strategy Challenge.” The third annual competition in London was the toughest to date, starting with a competitive entry process. Of the more than 30 UK-based teams that applied, only 17 (including two teams from RHUL CDT) were selected to compete. One of our teams went on to the Final Round of this year’s competition, placing Third.

In each competition, convened in different locales around the world, teams of four students simulate the high-pressure task of analysing available information about cyber security threats, synthesising it, and briefing senior government officials on their findings and recommendations. The competition relies upon information sources assembled into a briefing pack, such as (real) research reports, (real and simulated) online media, (real and simulated) private sector threat analysis, (simulated) classified government intelligence reports, and even a (simulated) television news report.

Our first CDT student team competed in Geneva in 2017, advancing to the Semi-Finals. Our next team placed First in the 2018 inaugural London competition. Three teams competed in 2019: two in London and one in Geneva. And now two more in London, including our second appearance in the Final Round. That’s a lot of competition.

Students who wish to compete organise themselves into teams and recruit a coach. Participating is entirely voluntary and brings no formal academic credit.

So WHY do they do it?

It’s been my privilege to coach CDT teams three times: twice in London and once in Geneva. From this vantage point, I have seen a number of benefits students can take from the competition.

Each competition forces students to make use of a wide variety of disciplines they might not otherwise encounter on their academic journey. Teams must prepare to justify their recommendations within the emerging framework of international law which now pervades state cyber operation decision-making. They are required to appreciate the risks and potential impact of hostile cyber operations and countermeasures.

Teams are encouraged to think holistically about the needs of an entire society; to prioritise both domestic and international responses; and to consider non-cyber impacts and responses. Their chances of success go up tremendously if they exhibit an appreciation of the practicalities needed to implement their recommendations, such as the length of time necessary to adopt new laws or procedures, to commission new offensive cyber programmes, to task or redeploy limited civil service resources, to leverage support from non-state actors such as the community of CISOs and security vendors, or to persuade international partners to participate in multilateral action.

Teams are forced to confront the reality of decision-making in an atmosphere of less-than-complete, potentially inaccurate, and sometimes conflicting information. They must sift through messy and diverse sources of intelligence and synthesise a picture of threats that can be explained to non-expert decision-makers within minutes – all while being careful to assign appropriate degrees of confidence to different elements of their report. They must learn the difference between acting as an honest broker of available evidence (which is the job of an analyst) and acting as an advocate for a specific outcome (which is not).

The best teams learn and demonstrate good teamwork skills. They face difficult choices in how to allocate tasks among themselves. The time pressure of the competition begins at a relaxed pace, with weeks available to produce and deliver Round 1 submissions. Those selected to advance to Round 2 are thrown into a scenario in which they have a single overnight window to absorb significant new intelligence and revise their view of the situation. The very few teams that advance to the Final Round face the highest-pressure component – only 20 minutes in which to absorb a few bits of critical additional intelligence before briefing the judging panel simulating government leaders – often including people who have served in the senior civil service roles the students are now simulating.

The competition is a labour of love for a large group of volunteers from industry, government and academia (including CDT graduate and former competitor Dr Andreas Haggman, RHUL ’19, who remains heavily involved in the London competition). The effort required to develop each competition’s intelligence pack is considerable, as is the effort to recruit and coordinate large numbers of judges.

Each competition strongly reflects local values, methods, and standards. Judges in London simulate UK government officials; in Washington, DC they simulate US federal government officials; and in Geneva they simulate a multinational “task force of European leaders” including heads of government and defence. Competitors must be prepared to make recommendations fit for the relevant environment.

Of course, no competition is perfect, no simulation is perfect, and the process being simulated is itself far from perfect. Judges and competition officials are ultimately required to rank teams. Reasonable people can disagree about aspects of the competition process, as well as the results.

But I find that the students who take the most from the competition are those who embrace it and invest in it for the learning opportunity it represents. I’ve watched students climb and conquer steep learning curves. I’ve seen cryptography students gain a better understanding of politics. I’ve watched students of law and international relations learn to appreciate the practicalities of cyber operations. I’ve seen computer science students learn how international law continues to influence this sphere of operation. And I’ve watched as all of them learn more about how the decision-making “sausage” is made.

These are all good reasons to compete. And in the context of the competition, this is, I believe, why we fight.

 

