Cyber Security: The Human Factor

You’ve installed state-of-the-art security equipment on your control system to keep out the bad guys. But do your people leave the front door open?
By Peter Welander, Control Engineering November 1, 2007

One of the first scenes in the season opener of NBC’s sitcom The Office shows the corporate IT guy sitting at Pam’s computer, trying to flush out a virus as Jim and Pam look on.
IT guy: “Generally, it’s not a good idea to click on any internet offers that you haven’t requested. What was the exact offer?”
Pam: “It was for a video.”
IT guy: “What kind of video?”
Pam: (embarrassed) “A celebrity sex tape.”
Jim: “Really? What kind of celebrity?”
Pam: “Not relevant.”
Jim: “How much did you pay for it?”
Pam: “Not relevant.”
Jim: “You paid for it?!?”
Pam: “It all happened so fast!”
Their situation, while humorous, is all too real: your people can be the weakest security link. An unhappy employee can commit deliberate sabotage, but such cases are less common than people simply doing careless things, or falling for a social engineering scam that opens the door to a virus or hacker.

Four examples:

No. 1: Imagine your office phone rings. The caller says he’s from IT and asks you to help solve a network problem by changing your password. If you’ve had any cyber security training, you know that you really shouldn’t do that sort of thing, right? Treasury auditors recently ran just such a test on the IRS. Here is a brief excerpt from their report:
“The IRS has nearly 100,000 employees and contractors on approximately 240 computer systems and over 1,500 databases. Using social engineering tactics, we determined IRS employees, including managers, are not complying with the rudimentary computer security practices of protecting their passwords. As a result, the IRS is at risk of providing unauthorized persons access to taxpayer data that could be used for identity theft and other fraudulent schemes.
“We made 102 telephone calls to IRS employees, including managers and a contractor, and posed as computer support helpdesk representatives. Under this scenario, we asked for each employee’s assistance to correct a computer problem and requested that the employee provide his or her username and temporarily change his or her password to one we suggested. We were able to convince 61 (60%) of the 102 employees to comply with our requests. Only 8 of the 102 employees in our sample contacted…the IRS computer security organization to validate our test as being part of an official audit.”

No. 2:
A hacker who was caught and convicted for breaking into VoIP systems said his job wasn’t all that hard. Finding Web interfaces on devices using Google search strings was simple, but he still had to get past a password to do anything. As the hacker put it, “The way we got into them is that most of the telecom administrators were using the most basic password, ‘Cisco’ or ‘admin.’ They weren’t hardening their boxes at all.”
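The lesson lends itself to a simple automated check. The sketch below, in Python, audits a set of devices against a list of well-known factory defaults; the device names and the password list are illustrative assumptions, not drawn from any real network or vendor.

```python
# Minimal sketch of a default-credential audit: flag any device whose
# password appears on a list of well-known factory defaults.
# The password list here is illustrative, not exhaustive.
COMMON_DEFAULTS = {"cisco", "admin", "password", "root", "1234", ""}

def audit_passwords(devices):
    """Return the names of devices still using a default password.

    `devices` maps a device name to its current password.
    """
    return [name for name, pw in devices.items()
            if pw.lower() in COMMON_DEFAULTS]

if __name__ == "__main__":
    # Hypothetical inventory: two VoIP gateways and a router.
    devices = {
        "voip-gw-1": "Cisco",
        "voip-gw-2": "S7r0ng!pass",
        "router-3": "admin",
    }
    print(audit_passwords(devices))  # flags the two devices still on defaults
```

A real audit would pull the inventory from a configuration database and compare against a much larger default-credential list, but the principle is the one the hacker described: the weakest password in the fleet is the way in.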

No. 3:
One attack vector is to scatter thumb drives around the parking lot and grounds of the target company. Employees arriving at work find them and can’t resist plugging one in. The file names that appear sound interesting (a celebrity sex tape, for example), so someone opens one out of curiosity. A program launches that silently connects the person’s computer to the hacker, opening a way in. It all happens so fast.
Is that really possible? “A warning has been released about a family of worms that spreads by copying itself onto removable drives such as USB memory sticks, and then automatically runs when the device is next connected to a computer. The W32/SillyFD-AA worm hunts for removable drives such as floppy disks and USB memory sticks, and then creates a hidden file called autorun.inf to ensure a copy of the worm is run the next time it is connected to a Microsoft Windows PC.” (IT Network and Computer Security, May 16, 2007.)
Given this, the handy thumb drive is considered a serious threat: it is a very effective medium for stealing data or injecting malware. Some suggest disabling auto-run for thumb drives and CD-ROMs, but others feel that isn’t nearly enough protection. A more drastic but effective approach is to fill unused USB ports on a server with epoxy, or to fit locked covers over any exposed computer ports.

No. 4:
More sinister are unhappy employees who deliberately set out to cause damage. Information Week has followed the story of a former systems administrator at Medco Health Solutions who created a logic bomb and planted it in the company’s network. The bomb would have disabled records in Medco’s customer database that allow pharmacists to check a customer’s existing medications before filling a new prescription.
Yung-Hsun Lin, the administrator, allegedly wrote the code in October 2003, when he was expecting to be laid off, and set it to execute the following April. Lin was not laid off, but he left the bomb in place. When the day arrived, the bomb fizzled, apparently due to coding errors. Lin fixed the problem and reset the bomb to go off in April 2005.
In January 2005, a co-worker stumbled across the malicious code, and the IT department removed it safely. The code was eventually traced back to Lin, and FBI agents arrested him in December 2006. Medco estimates that cleaning up the problem cost between $70,000 and $120,000. Had the bomb worked, the harm to patients from medication errors would have been impossible to measure. Lin pleaded guilty and will be sentenced in January.
The moral of these stories is that technical solutions alone cannot secure a system. On the other hand, even the best-trained and most conscientious people cannot stop a determined hacker from invading a weak system. Hardening involves both people and systems, and the two must work together to minimize vulnerabilities.
Peter Welander is process industries editor. Reach him at .
