Social engineering, in the context of information security, is the psychological manipulation of people into performing actions or divulging confidential information.
This differs from social engineering within the social sciences, which
does not concern the divulging of confidential information. A type of confidence trick
for the purpose of information gathering, fraud, or system access, it
differs from a traditional "con" in that it is often one of many steps
in a more complex fraud scheme.
It has also been defined as "any act that influences a person to take an action that may or may not be in their best interests."
Information Security Culture
Employee
behavior can have a significant impact on information security in
organizations. Cultural concepts can help different segments of the
organization work effectively toward information security, or can work
against it. "Exploring the Relationship
between Organizational Culture and Information Security Culture"
provides the following definition of information security culture: "ISC
is the totality of patterns of behavior in an organization that
contribute to the protection of information of all kinds."
Andersson and Reimers (2014) found that employees often do not
see themselves as part of their organization's information security "effort"
and often take actions that disregard the organization's information
security interests.
Research shows that information security culture needs to be improved
continuously. In "Information Security Culture from Analysis to Change,"
the authors commented that "it's a never ending process, a cycle of
evaluation and change or maintenance." They suggest that to manage
information security culture, five steps should be taken:
Pre-evaluation, strategic planning, operative planning, implementation,
and post-evaluation.
- Pre-Evaluation: to identify employees' awareness of information security and to analyse the current security policy.
- Strategic Planning: to come up with a better awareness program, clear targets need to be set; clustering people into groups is helpful to achieve them.
- Operative Planning: establish a good security culture based on internal communication, management buy-in, and a security awareness and training program.
- Implementation: four stages should be used to implement the information security culture. They are commitment of the management, communication with organizational members, courses for all organizational members, and commitment of the employees.
- Post-Evaluation: to assess the effectiveness of the prior steps and build on continuous improvement.
Techniques and Terms
All social engineering techniques are based on specific attributes of human decision-making known as cognitive biases.
These biases, sometimes called "bugs in the human hardware", are
exploited in various combinations to create attack techniques, some of
which are listed below. Social engineering attacks can be used to steal
employees' confidential information. The most common type of social
engineering happens over the phone. Other examples of social engineering
attacks include criminals posing as exterminators, fire marshals, and
technicians to go unnoticed as they steal company secrets.
One example of social engineering is an individual who walks into
a building and posts an official-looking announcement on the company
bulletin board saying that the help desk's number has changed. When
employees call for help, the individual asks them for their passwords and
IDs, thereby gaining the ability to access the company's private
information.
In another example, the hacker contacts the target on a social
networking site and starts a conversation with them. Gradually the
hacker gains the target's trust and then uses that trust to get access
to sensitive information such as passwords or bank account details.
Social engineering relies heavily on the six principles of influence established by Robert Cialdini.
Cialdini's theory of influence is based on six key principles:
reciprocity, commitment and consistency, social proof, authority,
liking, and scarcity.
Six Key Principles
- Reciprocity – People tend to return a favor, thus the pervasiveness of free samples in marketing. In his conferences, he often uses the example of Ethiopia providing thousands of dollars in humanitarian aid to Mexico just after the 1985 earthquake, despite Ethiopia suffering from a crippling famine and civil war at the time. Ethiopia had been reciprocating the diplomatic support Mexico provided when Italy invaded Ethiopia in 1935. The good cop/bad cop strategy is also based on this principle.
- Commitment and consistency – If people commit, orally or in writing, to an idea or goal, they are more likely to honor that commitment because they have stated that that idea or goal fits their self-image. Even if the original incentive or motivation is removed after they have already agreed, they will continue to honor the agreement. Cialdini notes Chinese brainwashing of American prisoners of war to rewrite their self-image and gain automatic unenforced compliance. Another example is marketers who make the user close popups by saying "I'll sign up later" or "No thanks, I prefer not making money".
- Social proof – People will do things that they see other people are doing. For example, in one experiment, one or more confederates would look up into the sky; bystanders would then look up into the sky to see what they were missing. At one point this experiment was aborted, as so many people were looking up that they stopped traffic. See conformity, and the Asch conformity experiments.
- Authority – People will tend to obey authority figures, even if they are asked to perform objectionable acts. Cialdini cites incidents such as the Milgram experiments in the early 1960s and the My Lai massacre.
- Liking – People are easily persuaded by other people whom they like. Cialdini cites the marketing of Tupperware in what might now be called viral marketing. People were more likely to buy if they liked the person selling it to them. Some of the many biases favoring more attractive people are discussed. See physical attractiveness stereotype.
- Scarcity – Perceived scarcity will generate demand. For example, saying offers are available for a "limited time only" encourages sales.
Four Social Engineering Vectors
Vishing
Vishing, otherwise known as "voice phishing", is the criminal practice of using social engineering over a telephone system
to gain access to private personal and financial information from the
public for the purpose of financial reward. It is also employed by
attackers for reconnaissance purposes to gather more detailed intelligence on a target organization.
Phishing
Phishing is a technique of fraudulently obtaining private
information. Typically, the phisher sends an e-mail that appears to come
from a legitimate business—a bank, or credit card company—requesting "verification" of information and warning of some dire consequence
if it is not provided. The e-mail usually contains a link to a
fraudulent web page that seems legitimate—with company logos and
content—and has a form requesting everything from a home address to an ATM card's PIN or a credit card number. For example, in 2003, there was a phishing scam in which users received emails supposedly from eBay claiming that the user's account was about to be suspended unless they clicked a provided link to update their credit card information
(information that the genuine eBay already had). By mimicking a
legitimate organization's HTML code and logos, it is relatively simple
to make a fake Website look authentic. The scam tricked some people into
thinking that eBay was requiring them to update their account
information by clicking on the link provided. By indiscriminately spamming
extremely large groups of people, the "phisher" counted on gaining
sensitive financial information from the small percentage (yet large
number) of recipients who already had eBay accounts and also fell prey
to the scam.
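Defensive tools often look for exactly this kind of mismatch between what a link claims to be and where it actually points. The sketch below is a minimal illustration rather than a production filter: the sample URL is invented, and real mail scanners combine many more signals. It flags HTML links whose visible text names one domain while the underlying href resolves to a different host.

```python
# Minimal sketch: flag HTML links whose visible text names one domain while
# the underlying href points elsewhere -- the mismatch that phishing emails
# such as the eBay example rely on going unnoticed. Illustrative only.
from html.parser import HTMLParser
from urllib.parse import urlparse
import re

class LinkMismatchDetector(HTMLParser):
    def __init__(self):
        super().__init__()
        self._href = None          # href of the <a> tag currently open, if any
        self._text = []            # visible text collected inside that tag
        self.suspicious = []       # (visible text, actual href) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            text = "".join(self._text).strip()
            href_host = urlparse(self._href).hostname or ""
            # If the anchor text itself looks like a domain, compare hosts.
            match = re.search(r"([a-z0-9-]+\.)+[a-z]{2,}", text.lower())
            if match and match.group(0) not in href_host:
                self.suspicious.append((text, self._href))
            self._href = None

detector = LinkMismatchDetector()
detector.feed('<a href="http://198.51.100.7/update">www.ebay.com/account</a>')
print(detector.suspicious)  # [('www.ebay.com/account', 'http://198.51.100.7/update')]
```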
Smishing
Smishing is the act of using SMS text messages to lure victims into a specific course of action. Like phishing, it can involve clicking a malicious link or divulging information.
Impersonation
Pretending or pretexting to be another person with the goal of gaining
physical access to a system or building. Impersonation is used in the "SIM swap scam" fraud.
Other Concepts
Pretexting
Pretexting (adj. pretextual) is the act of creating and using an invented scenario (the pretext)
to engage a targeted victim in a manner that increases the chance the
victim will divulge information or perform actions that would be
unlikely in ordinary circumstances. An elaborate lie, it most often involves some prior research or setup and the use of this information for impersonation (e.g., date of birth, Social Security number, last bill amount) to establish legitimacy in the mind of the target.
This technique can be used to fool a business into disclosing customer information as well as by private investigators
to obtain telephone records, utility records, banking records and other
information directly from company service representatives. The information can then be used to establish even greater legitimacy under tougher questioning with a manager, e.g., to make account changes, get specific balances, etc.
Pretexting can also be used to impersonate co-workers, police,
bank staff, tax authorities, clergy, insurance investigators—or any other
individual who could have perceived authority or right-to-know in the
mind of the targeted victim. The pretexter must simply prepare answers
to questions that might be asked by the victim. In some cases, all that
is needed is a voice that sounds authoritative, an earnest tone, and an
ability to think on one's feet to create a pretextual scenario.
Vishing
Phone phishing (or "vishing") uses a rogue interactive voice response
(IVR) system to recreate a legitimate-sounding copy of a bank or other
institution's IVR system. The victim is prompted (typically via a
phishing e-mail) to call in to the "bank" via an (ideally toll-free)
number provided in order to "verify" information. A typical "vishing"
system will reject log-ins continually, ensuring the victim enters PINs
or passwords multiple times, often disclosing several different
passwords. More advanced systems transfer the victim to the
attacker/defrauder, who poses as a customer service agent or security expert for further questioning of the victim.
Spear Phishing
Although similar to "phishing", spear phishing is a technique that
fraudulently obtains private information by sending highly customized
emails to few end users. It is the main difference between phishing
attacks because phishing campaigns focus on sending out high volumes of
generalized emails with the expectation that only a few people will
respond. On the other hand, spear phishing emails require the attacker
to perform additional research on their targets in order to "trick" end
users into performing requested activities. The success rate of
spear-phishing attacks is considerably higher than phishing attacks with
people opening roughly 3% of phishing emails when compared to roughly
70% of potential attempts. Furthermore, when users actually open the
emails phishing emails have a relatively modest 5% success rate to have
the link or attachment clicked when compared to a spear-phishing
attack's 50% success rate.
Spear Phishing success is heavily dependent on the amount and quality of OSINT (Open Source Intelligence) that the attacker can obtain. Social media account activity is one example of a source of OSINT.
Water Holing
Water holing is a targeted social engineering strategy that
capitalizes on the trust users have in websites they regularly visit.
The victim feels safe to do things they would not do in a different
situation. A wary person might, for example, purposefully avoid clicking
a link in an unsolicited email, but the same person would not hesitate
to follow a link on a website they often visit. So, the attacker
prepares a trap for the unwary prey at a favored watering hole. This
strategy has been successfully used to gain access to some (supposedly)
very secure systems.
The attacker may set out by identifying a group or individuals to
target. The preparation involves gathering information about websites
the targets often visit from the secure system. The information
gathering confirms that the targets visit the websites and that the
system allows such visits. The attacker then tests these websites for
vulnerabilities to inject code that may infect a visitor's system with malware.
The injected code trap and malware may be tailored to the specific
target group and the specific systems they use. In time, one or more
members of the target group will get infected and the attacker can gain
access to the secure system.
Baiting
Baiting is like the real-world Trojan horse: it uses physical media and relies on the curiosity or greed of the victim. In this attack, attackers leave malware-infected floppy disks, CD-ROMs, or USB flash drives
in locations where people will find them (bathrooms, elevators, sidewalks,
parking lots, etc.), give them legitimate and curiosity-piquing labels,
and wait for victims.
For example, an attacker may create a disk featuring a corporate
logo, available from the target's website, and label it "Executive
Salary Summary Q2 2012". The attacker then leaves the disk on the floor
of an elevator or somewhere in the lobby of the target company. An
unknowing employee may find it and insert the disk into a computer to
satisfy their curiosity, or a good Samaritan may find it and return it
to the company. In any case, just inserting the disk into a computer
installs malware, giving attackers access to the victim's PC and,
perhaps, the target company's internal computer network.
Unless computer controls block infections, merely inserting the media compromises PCs that "auto-run" it. Hostile devices can also be used. For instance, a "lucky winner" is sent a free digital audio player that compromises any computer it is plugged into. A "road apple" (the colloquial term for horse manure, suggesting the device's undesirable nature) is any removable media with malicious software left in opportunistic or conspicuous places. It may be a CD, DVD, or USB flash drive,
among other media. Curious people take it and plug it into a computer,
infecting the host and any attached networks. Again, hackers may give
them enticing labels, such as "Employee Salaries" or "Confidential".
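One example of such a computer control is disabling AutoRun so that removable media cannot launch programs automatically. The snippet below is a hedged sketch rather than a complete hardening baseline: it assumes a Windows host, administrator rights, and that the policy is not already managed centrally (e.g., via Group Policy), and it sets the well-known NoDriveTypeAutoRun policy value to cover all drive types.

```python
# Illustrative sketch: disable Windows AutoRun for all drive types by writing
# the NoDriveTypeAutoRun policy value (0xFF). Assumes Windows and admin rights.
import winreg

POLICY_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, POLICY_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
    # 0xFF disables AutoRun on every drive type, including removable media.
    winreg.SetValueEx(key, "NoDriveTypeAutoRun", 0, winreg.REG_DWORD, 0xFF)
```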
One study done in 2016 had researchers drop 297 USB drives around
the campus of the University of Illinois. The drives contained files on
them that linked to webpages owned by the researchers. The researchers
were able to see how many of the drives had files on them opened, but
not how many were inserted into a computer without having a file opened.
Of the 297 drives that were dropped, 290 (98%) of them were picked up
and 135 (45%) of them "called home".
Quid pro quo
Quid pro quo means something for something:
- An attacker calls random numbers at a company, claiming to be calling back from technical support. Eventually this person will hit someone with a legitimate problem, grateful that someone is calling back to help them. The attacker will "help" solve the problem and, in the process, have the user type commands that give the attacker access or launch malware.
- In a 2003 information security survey, 91% of office workers gave researchers what they claimed was their password in answer to a survey question in exchange for a cheap pen. Similar surveys in later years obtained similar results using chocolates and other cheap lures, although they made no attempt to validate the passwords.
Tailgating
An attacker, seeking entry to a restricted area secured by unattended, electronic access control, e.g. by RFID
card, simply walks in behind a person who has legitimate access.
Following common courtesy, the legitimate person will usually hold the
door open for the attacker or the attackers themselves may ask the
employee to hold it open for them. The legitimate person may fail to ask
for identification for any of several reasons, or may accept an
assertion that the attacker has forgotten or lost the appropriate
identity token. The attacker may also fake the action of presenting an
identity token.
Other Types
Common confidence tricksters
or fraudsters also could be considered "social engineers" in the wider
sense, in that they deliberately deceive and manipulate people,
exploiting human weaknesses to obtain personal benefit. They may, for
example, use social engineering techniques as part of an IT fraud.
A more recent type of social engineering technique involves spoofing or hacking the IDs of people who have popular e-mail accounts such as Yahoo!, Gmail, or Hotmail. Among the many motivations for deception are:
- Phishing credit-card account numbers and their passwords.
- Cracking private e-mails and chat histories, and manipulating them by using common editing techniques before using them to extort money and create distrust among individuals.
- Cracking websites of companies or organizations and destroying their reputation.
- Computer virus hoaxes
- Convincing users to run malicious code within the web browser via self-XSS attack to allow access to their web account
Countermeasures
Organizations reduce their security risks by:
Employee Training
Training employees in security protocols relevant to their position
(e.g., in situations such as tailgating, if a person's identity cannot
be verified, then employees must be trained to politely refuse).
Standard Framework
Establishing frameworks of trust on an employee/personnel level (i.e.,
specify and train personnel when/where/why/how sensitive information
should be handled)
Scrutinizing Information
Identifying which information is sensitive and evaluating its exposure
to social engineering and breakdowns in security systems (building,
computer system, etc.)
Security Protocols
Establishing security protocols, policies, and procedures for handling sensitive information.
Event Test
Performing unannounced, periodic tests of the security framework.
Inoculation
Preventing social engineering and other fraudulent tricks or traps by
instilling a resistance to persuasion attempts through exposure to
similar or related attempts.
Review
Reviewing the above steps regularly: no solutions to information integrity are perfect.
Waste Management
Using a waste management service that has dumpsters with locks on them,
with keys to them limited only to the waste management company and the
cleaning staff. Locating the dumpster either in view of employees so
that trying to access it carries a risk of being seen or caught, or
behind a locked gate or fence where the person must trespass before they
can attempt to access the dumpster.
The Life Cycle of Social Engineering
- Information gathering – Information gathering is the first and foremost step; it requires much patience and keen observation of the victim's habits. In this step the attacker gathers data about the victim's interests and personal information. It determines the success rate of the overall attack.
- Engaging with the victim – After gathering the required amount of information, the attacker smoothly opens a conversation with the victim, without the victim finding anything suspicious.
- Attacking – This step generally occurs after a long period of engaging with the target, during which information is retrieved from the target using social engineering. In this phase, the attacker obtains the results from the target.
- Closing interaction – This is the last step, in which the attacker slowly shuts down communication without arousing any suspicion in the victim. In this way, the motive is fulfilled and the victim rarely comes to know that an attack even happened.
Notable Social Engineers
Frank Abagnale Jr.
Frank Abagnale Jr.
is an American security consultant known for his background as a former
con man, check forger, and impostor while he was between the ages of 15
and 21. He became one of the most notorious impostors,
claiming to have assumed no fewer than eight identities, including an
airline pilot, a physician, a U.S. Bureau of Prisons agent, and a
lawyer. Abagnale escaped from police custody twice (once from a taxiing
airliner and once from a U.S. federal penitentiary) before turning 22
years old.
Kevin Mitnick
Kevin Mitnick is an American computer security consultant, author and hacker,
best known for his high-profile 1995 arrest and subsequent five-year
sentence for various computer and communications-related crimes.
He now runs the security firm Mitnick Security Consulting, LLC which
helps test companies' security strengths, weaknesses, and potential
loopholes. He is also the Chief Hacking Officer of the security
awareness training company KnowBe4, as well as an active advisory board
member at Zimperium, a firm that develops a mobile intrusion prevention system.
Susan Headley
Susan Headley was an American hacker active during the late 1970s and early 1980s, widely respected for her expertise in social engineering, pretexting, and psychological subversion.
She was known for her specialty in breaking into military computer
systems, which often involved going to bed with military personnel and
going through their clothes for usernames and passwords while they
slept. She became heavily involved in phreaking with Kevin Mitnick and Lewis de Payne in Los Angeles,
but later framed them for erasing the system files at US Leasing after a
falling out, leading to Mitnick's first conviction. She retired to
professional poker.
Badir Brothers
Brothers
Ramy, Muzher, and Shadde Badir—all of whom were blind from
birth—managed to set up an extensive phone and computer fraud scheme in Israel in the 1990s using social engineering, voice impersonation, and Braille-display computers.
Law
In common law, pretexting is an invasion of privacy tort of appropriation.
Pretexting of Telephone Records
In December 2006, United States Congress approved a Senate sponsored bill making the pretexting of telephone records a federal felony
with fines of up to $250,000 and ten years in prison for individuals
(or fines of up to $500,000 for companies). It was signed by President
George W. Bush on 12 January 2007.
Federal Legislation
The 1999 "GLBA" is a U.S. Federal
law that specifically addresses pretexting of banking records as an
illegal act punishable under federal statutes. When a business entity
such as a private investigator, SIU insurance investigator, or an
adjuster conducts any type of deception, it falls under the authority of
the Federal Trade Commission
(FTC). This federal agency has the obligation and authority to ensure
that consumers are not subjected to any unfair or deceptive business
practices. US Federal Trade Commission Act, Section 5 of the FTCA
states, in part:
"Whenever the Commission shall have reason to believe that any such
person, partnership, or corporation has been or is using any unfair
method of competition or unfair or deceptive act or practice in or
affecting commerce, and if it shall appear to the Commission that a
proceeding by it in respect thereof would be to the interest of the
public, it shall issue and serve upon such person, partnership, or
corporation a complaint stating its charges in that respect."
The statute states that when someone obtains any personal,
non-public information from a financial institution or the consumer,
their action is subject to the statute. It relates to the consumer's
relationship with the financial institution. For example, a pretexter
using false pretenses either to get a consumer's address from the
consumer's bank, or to get a consumer to disclose the name of their
bank, would be covered. The determining principle is that pretexting
only occurs when information is obtained through false pretenses.
While the sale of cell telephone records has gained significant
media attention, and telecommunications records are the focus of the two
bills currently before the United States Senate,
many other types of private records are being bought and sold in the
public market. Alongside many advertisements for cell phone records,
wireline records and the records associated with calling cards are
advertised. As individuals shift to VoIP telephones, it is safe to
assume that those records will be offered for sale as well. Currently,
it is legal to sell telephone records, but illegal to obtain them.
1st Source Information Specialists
U.S. Rep. Fred Upton (R-Kalamazoo,
Michigan), chairman of the Energy and Commerce Subcommittee on
Telecommunications and the Internet, expressed concern over the easy
access to personal mobile phone records on the Internet during a House
Energy & Commerce Committee hearing on "Phone Records For Sale: Why Aren't Phone Records Safe From Pretexting?" Illinois
became the first state to sue an online records broker when Attorney
General Lisa Madigan sued 1st Source Information Specialists, Inc., a
spokeswoman for Madigan's office said. The Florida-based company
operates several Web sites that sell mobile telephone records, according
to a copy of the suit. The attorneys general of Florida and Missouri
quickly followed Madigan's lead, filing suits, respectively, against 1st
Source Information Specialists and, in Missouri's case, one other
records broker – First Data Solutions, Inc.
Several wireless providers, including T-Mobile, Verizon, and
Cingular filed earlier lawsuits against records brokers, with Cingular
winning an injunction against First Data Solutions and 1st Source
Information Specialists. U.S. Senator Charles Schumer
(D-New York) introduced legislation in February 2006 aimed at curbing
the practice. The Consumer Telephone Records Protection Act of 2006
would create felony criminal penalties for stealing and selling the records of mobile phone, landline, and Voice over Internet Protocol (VoIP) subscribers.
HP
Patricia Dunn,
former chairwoman of Hewlett Packard, reported that the HP board hired a
private investigation company to delve into who was responsible for
leaks within the board. Dunn acknowledged that the company used the
practice of pretexting to solicit the telephone records of board members
and journalists. Chairman Dunn later apologized for this act and
offered to step down from the board if it was desired by board members.
Unlike Federal law, California law specifically forbids such
pretexting. The four felony charges brought against Dunn were dismissed.
Preventive measures
Taking some precautions reduces the risk of falling victim to social
engineering fraud. Precautions that can be taken include the following:
- Be aware of offers that seem "too good to be true".
- Avoid clicking on attachments from unknown sources.
- Avoid giving out personal information to anyone via email, phone, or text message.
- Use spam filter software, such as Spam box (a minimal filtering sketch follows this list).
- Avoid befriending people whom you do not know in real life.
- Teach kids to contact a trusted adult if they are being bullied over the internet (cyberbullying) or feel threatened by anything online.
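As an illustration of the spam-filter item above, the sketch below shows the kind of simple keyword and link heuristics such software applies. The phrases, weights, and example message are invented for illustration; real filters combine many additional signals (sender reputation, SPF/DKIM results, URL blocklists, Bayesian scoring).

```python
# Toy illustration of spam/phishing heuristics; all rules and weights are made up.
import re

SUSPICIOUS_PHRASES = [
    "verify your account", "account will be suspended",
    "limited time only", "you have won", "confirm your password",
]

def spam_score(subject: str, body: str) -> int:
    """Return a crude score; higher means more likely spam/phishing."""
    text = f"{subject} {body}".lower()
    score = sum(2 for phrase in SUSPICIOUS_PHRASES if phrase in text)
    # Links to raw IP addresses are rarely used by legitimate senders.
    if re.search(r"https?://\d{1,3}(\.\d{1,3}){3}", text):
        score += 3
    # A little weight for urgency markers.
    score += min(text.count("!"), 3)
    return score

if __name__ == "__main__":
    subject = "Urgent: verify your account!"
    body = "Click http://203.0.113.9/login or your account will be suspended!"
    print(spam_score(subject, body))  # prints 9 with these toy rules
```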