Cyber security training for the Board of Directors



The Board and the CEO must have the knowledge and skills necessary to assess cybersecurity risks, challenge security plans, discuss activities, formulate opinions, and evaluate policies and solutions that protect the assets of their organization. The failure to maintain adequate risk oversight can expose companies, officers, and directors to liability.

Directors owe fiduciary duties to the company and its shareholders, and have a significant role in overseeing the risk management of the company. The failure to exercise appropriate oversight in the face of known risks can constitute a breach of the duty of loyalty. A cybersecurity decision that was “ill-advised or negligent” can constitute a breach of the duty of care.

The Board and the CEO must also assess whether and how to disclose a cyberattack internally and externally to customers and investors. After a successful cyberattack, companies and organizations must provide evidence that they have an adequate and tested cybersecurity program in place that meets international standards, and that they are prepared to respond to a security breach properly and quickly.

We provide short, comprehensive briefings on key issues the board needs to be informed about in order to exercise professional judgment and adequate risk oversight.

Our Briefings for the Board:

Please feel free to discuss your needs with us. We can create custom briefings for the board, focusing on your required topic(s) and tailored to your specific needs. Our briefings can be as short as 30 minutes, or longer, depending on the needs, the content of the program, and the case studies.

Alternatively, you may choose one of our existing briefings:


1. State-sponsored but independent hacking groups. The long arm of countries that exploit legal pluralism and make the law a strategic instrument

Overview

According to Article 51 of the U.N. Charter: “Nothing in the present Charter shall impair the inherent right of individual or collective self-defense if an armed attack occurs against a Member of the United Nations, until the Security Council has taken measures necessary to maintain international peace and security.”

But is a cyber-attack comparable to an armed attack?

There is no international consensus on a precise definition of a use of force, in or out of cyberspace. Nations assert different definitions and apply different thresholds for what constitutes a use of force.

For example, if cyber operations cause effects that, if caused by traditional physical means, would be regarded as a use of force under jus ad bellum, then such cyber operations would likely also be regarded as a use of force.

Important weaknesses of international law include the assumption that military and civilian targets can be isolated with sufficient clarity, and that a tangible military objective can be distinguished behind every attack.

More than 20 countries have announced their intent to use offensive cyber capabilities, in line with Article 2(4) and Article 51 of the United Nations (UN) Charter.

Unfortunately, these capabilities will not help when the attackers are State-sponsored groups, and the States supporting them claim not only that they are not involved, but also that their adversaries (the victims) have fabricated the evidence. This is a very effective disinformation operation.

Adversaries have already successfully exploited weaknesses of non-authoritarian societies, especially the divergent political and legal interpretations of facts by different political parties. It is difficult to use offensive cyber capabilities in line with democratic principles and international law, as it is almost impossible to distinguish with absolute certainty between attacks from States and attacks from State-sponsored independent groups.

Even when intelligence services know that an attack comes from a State that uses a State-sponsored independent group, they cannot disclose the information and the evidence that supports their assessment, as disclosures about technical and physical intelligence capabilities and initiatives can undermine current and future operations. This is the “second attribution problem” – they know but they cannot disclose what they know.

As an example, we will discuss the data breach at the U.S. Office of Personnel Management (OPM). OPM systems had information related to the background investigations of current, former, and prospective federal government employees, U.S. military personnel, and those for whom a federal background investigation was conducted. The attackers now have access to information about federal employees, federal retirees, and former federal employees. They have access to military records, veterans' status information, addresses, dates of birth, job and pay history, health insurance and life insurance information, pension information, data on age, gender, race, even fingerprints.

But why?

Aldrich Ames, a former intelligence officer turned mole, has said: “Espionage, for the most part, involves finding a person who knows something or has something that you can induce them secretly to give to you. That almost always involves a betrayal of trust.”

Finding this person is much easier if you have data that can easily be converted to intelligence, like the data stolen from the U.S. Office of Personnel Management (OPM). This breach is a direct risk to critical infrastructure.

There are questions to be answered, and decisions to be made, not only about tactics and strategy, but also about political and legal interpretation.

We tailor the program to meet specific requirements. You may contact us to discuss your needs.


Target Audience

The program is beneficial to the Board of Directors, the CEO, and senior management of firms and organizations in the private and public sectors.


Duration

One hour to half a day, depending on the needs, the content of the program, and the case studies. We always tailor the program to the needs of each client.


Instructor

George Lekatis. His background and some testimonials: https://www.cyber-risk-gmbh.com/George_Lekatis_Testimonials.pdf

2. Deception, disinformation, misinformation, propaganda, and direct democracy

Overview

Misinformation is incorrect or misleading information.

Disinformation is false information, deliberately and often covertly spread, in order to influence public opinion, or obscure the truth.

Propaganda is a broader and older term. Propaganda uses disinformation as a method. While the French philosopher Jacques Driencourt asserted that everything is propaganda, the term is most often associated with political persuasion and psychological warfare.

Psychological warfare is the use of propaganda against an enemy (or even a friend that could become an enemy in the future), with the intent to break his will to fight or resist, or to render him favorably disposed to one's position.

In deception (according to Bell and Whaley), someone is showing the false and hiding the real. Hiding the real is divided into masking, repackaging, and dazzling, while showing the false is divided into mimicking, inventing, and decoying.

People are remarkably bad at detecting deception and disinformation.

They often trust what others say, and usually they are right to do so. This is called the “truth bias”. People also tend to believe something when it is repeated. They tend to believe something they learn for the first time, and subsequent rebuttals may reinforce the original information, rather than dissipate it.

Humans have an unconscious preference for things they associate with themselves, and they are more likely to believe messages from users they perceive as similar to themselves. They believe that sources are credible if other people consider them credible. They trust fake user profiles with images and background information they like.

Citizens must understand that millions of fake accounts follow thousands of real and fake users, creating the perception of a large following. This large following enhances perceived credibility, and attracts more human followers, creating a positive feedback cycle.

People are more likely to believe others who are in positions of power. Fake accounts have false credentials, like false affiliation with government agencies, corporations, activists, and political parties, to boost credibility.

Freedom of information and expression are of paramount importance in many cultures. The more freedom of information we have, the better. But the more information we have, the more difficult it becomes to understand what is right and what is wrong. The right of expression and the freedom of information can be turned against citizens. We often see the weaponization of information.

The Internet and social media are key game-changers in exploiting rights and freedoms. In the past, a secret service had to work hard to plant disinformation in the press. Today, the Internet and social media offer the opportunity to spread limitless fake photos, reports, and "opinions". Many secret services wage online wars using Twitter, Facebook, LinkedIn, Google+, Instagram, Pinterest, Viber, etc. Only imagination is the limit.

Social media platforms, autonomous agents, and big data are directed towards the manipulation of public opinion. Social media bots (computer programs that mimic human behaviour and conversations using artificial intelligence) allow for the massive amplification of political views; they manufacture trends, game hashtags, add content, spam the opposition, and attack journalists and people who tell the truth.

In the hands of State-sponsored groups these automated tools can be used to both boost and silence communication and organization among citizens.

Over 10 percent of content across social media websites, and 62 percent of all web traffic, is generated by bots, not humans. Over 45 million Twitter accounts are bots, according to researchers at the University of Southern California.

Machine-driven communications tools (MADCOMs) use cognitive psychology and persuasive techniques based on artificial intelligence. These tools spread information, messages, and ideas online for influence, propaganda, counter-messaging, disinformation, espionage, and intimidation. They use human-like speech to dominate the information space and capture the attention of citizens.

Artificial intelligence (AI) technologies enable computers to simulate cognitive processes, such as elements of human thinking. Machines can make decisions, perceive data or the environment, and act to satisfy objectives.

The rule of the people, by the people, and for the people requires citizens who can make decisions in areas they do not always understand. When citizens understand the online environment, they will be far better prepared to protect their families, their working environment, and their country.


Target Audience

The program is beneficial to the Board of Directors, the CEO, and senior management of firms and organizations in the private and public sectors.


Duration

One hour to half a day, depending on the needs, the content of the program, and the case studies. We always tailor the program to the needs of each client.


Instructor

George Lekatis. His background and some testimonials: https://www.cyber-risk-gmbh.com/George_Lekatis_Testimonials.pdf

3. Social engineering: the targeting and victimization of key people through weaponized psychology

Overview

Threat actors are not interested in attacking everyone and anyone in an organization. High-value individuals are those with elevated access to information, assets, and systems. Board members and the C-suite are therefore, by default, high-risk targets for cyberattacks.

The most effective and most frequent method of attacking high-value individuals is weaponized psychology. Board members and C-level executives must learn the answers to the following questions:

- What is the advanced psychological game that threat actors use to compromise their targets?

- How do they find their targets’ vulnerabilities?

- What can we do to avoid being exploited by a determined adversary with a carefully planned attack?

High-value individuals must understand the threat in order to protect themselves and their organization from cyberattacks, industrial espionage, competitors, and other threat actors lurking online and offline.


Target Audience

The program is beneficial to the Board of Directors, the CEO, and senior management of firms and organizations in the private and public sectors.


Duration

One hour to half a day, depending on the needs, the content of the program, and the case studies. We always tailor the program to the needs of each client.


Instructor

Christina Lekati. You can learn about her at: https://www.cyber-risk-gmbh.com/About_Christina_Lekati.html




Our Services

Cyber security is often boring for employees. We can make it exciting.


Online Training

Recorded on-demand training and live webinars.

In-house Training

Engaging training classes and workshops.

Social Engineering

Developing the human perimeter to deal with cyber threats.


For the Board

Short and comprehensive briefings for the board of directors.


Assessments

Open source intelligence (OSINT) reports and recommendations.


High Value Targets

They have the most skilled adversaries. We can help.





What is the next step?

1. You contact us

2. We meet and discuss

3. Our proposal

4. Changes and approval

5. We deliver







Cyber Risk GmbH, Cyber Risk Awareness and Training in Switzerland, Germany, Liechtenstein