Security Ranch

February 15, 2021

Auditing Industrial Control Systems

Filed under: Uncategorized — Ken @ 8:00 pm


Some of the first Industrial Control Systems (ICS) came about in the 1960s, when computers were first becoming widespread.  These early ICS ran on mainframe computers and controlled machines and sensors in industries such as oil and gas and the electric grid.  As technology advanced and computers became smaller and more powerful, ICS evolved and became integrated into just about every aspect of life.

ICS is a general term that covers several different types of systems: Supervisory Control and Data Acquisition (SCADA) systems, Distributed Control Systems (DCS), and Programmable Logic Controllers (PLC).  These systems, along with sensors, give companies the ability to control everything from power generation on the electric grid, to drilling in the oil and gas industry, to the manufacturing of raw materials such as metals and plastics.

Lately, the security weaknesses of the United States’ power grid have come into the spotlight due to the prevalence of cyber attacks.  An excellent example of what could happen during a cyber attack on the U.S. is the Aurora Generator Test.  In 2007, at the Idaho National Laboratory, a test was conducted to simulate a cyber attack on an electric grid.  A generator was connected to a power substation for the test.  To simulate the attack, specially designed code was used to open and close the generator’s circuit breakers out of sync with the grid.  Opening and closing the breakers out of sync put an enormous amount of stress on the generator, enough to break parts off of it.  Within about three minutes of the test starting, the generator had been destroyed and was left smoking (Swearingen, Brunasso, 2013).  A cyber attack destroying a single generator may not sound like a big deal, but the implications are significant: this experiment targeted only one generator.  Imagine the same attack carried out against tens or hundreds of generators at the same time.  Within three minutes, entire cities could be blacked out, and the resulting surge in demand on other generating stations could knock them out as well.

To prevent this from happening, it is imperative to continually test ICS for vulnerabilities and correct them as soon as possible.  Auditing helps by keeping companies honest and preventing them from becoming complacent.  In the Marine Corps, there is a famous saying: “Complacency Kills.”  When a person gets lazy doing the same task over and over, they begin to take shortcuts and skip steps, and when that happens, accidents occur and people sometimes get killed.  The same is true for industrial control systems.  Auditing is becoming essential for the U.S. government and society as the country relies more and more on the benefits of ICS.

Two organizations specialize in protecting ICS.  The first is the Industrial Control Systems Cyber Emergency Response Team (ICS-CERT), which operates within the National Cybersecurity and Communications Integration Center (NCCIC), a division of the Department of Homeland Security (ICS-CERT, n.d.).  ICS-CERT, along with the NCCIC, has created a document called the “Seven Strategies to Defend ICSs.”

The first step calls for implementing application whitelisting (NCCIC, 2015).  By whitelisting which applications are allowed to run on the network, companies can detect malware that attackers attempt to upload.  This is especially effective when the environment is static and does not change much.
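
As an illustration of how application whitelisting can be checked in practice, here is a minimal Python sketch that compares a file's SHA-256 hash against an approved list.  The file path and hash value are hypothetical placeholders, not anything prescribed by the NCCIC document.

import hashlib
from pathlib import Path

# Hypothetical allowlist of SHA-256 hashes for approved executables.
APPROVED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # placeholder hash
}

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in 64 KB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_whitelisted(path: Path) -> bool:
    """Return True only if the file's hash appears on the allowlist."""
    return sha256_of(path) in APPROVED_HASHES

if __name__ == "__main__":
    candidate = Path("C:/ics/hmi/operator_console.exe")  # hypothetical path
    if candidate.exists():
        verdict = "allowed" if is_whitelisted(candidate) else "BLOCKED: not on allowlist"
        print(candidate, verdict)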

Step two is to ensure proper configuration and patch management.  By updating and patching systems in a timely fashion, companies can avoid attacks that would have been easily prevented (NCCIC, 2015).  The 2017 attack on Equifax, for example, exploited a vulnerability for which a patch had been available for months before the breach (Newman, 2017).
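
A patch-management audit often boils down to comparing an inventory of installed software against the minimum versions that contain known fixes.  The sketch below is a minimal example of that comparison, assuming a hypothetical inventory and baseline; a real audit would pull this data from a configuration management database or vulnerability scanner.

# Hypothetical inventory: software name -> installed version.
installed = {
    "apache-struts": (2, 3, 31),
    "hmi-runtime": (4, 2, 0),
}

# Hypothetical baseline: minimum version that includes the security fix.
minimum_patched = {
    "apache-struts": (2, 3, 32),
    "hmi-runtime": (4, 1, 5),
}

def audit_patch_levels(installed, minimum_patched):
    """Yield (name, installed, required) for any package below its patched version."""
    for name, version in installed.items():
        required = minimum_patched.get(name)
        if required is not None and version < required:
            yield name, version, required

for name, version, required in audit_patch_levels(installed, minimum_patched):
    print(f"{name}: installed {version} is older than patched version {required}")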

The third step is to reduce the attack surface.  By disabling or uninstalling any services or programs that are not used, companies limit what is available for a hacker to exploit.  Companies should also isolate ICS networks from all untrusted networks, including the internet (NCCIC, 2015).
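
One way an auditor might verify that unused services really are disabled is to probe a host for listening TCP ports and flag anything outside the documented baseline, as in the sketch below.  The address, port list, and baseline are made-up examples, and active scanning of live control systems should only be done with permission and during maintenance windows, since even a simple connection attempt can upset fragile devices.

import socket

TARGET = "192.0.2.10"            # hypothetical ICS host (TEST-NET address)
PORTS_TO_CHECK = [21, 22, 23, 80, 443, 502, 3389]
EXPECTED_OPEN = {502}            # e.g. only Modbus/TCP is documented as required

def is_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for port in PORTS_TO_CHECK:
    if is_open(TARGET, port) and port not in EXPECTED_OPEN:
        print(f"Port {port} is open on {TARGET} but not in the documented baseline")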

The fourth step is to build a defendable network.  By segmenting networks, companies can limit the damage if part of the network is compromised.  If every device can reach every other device, a single compromised computer can affect the entire network; if the network is divided into smaller segments, the damage is contained to the affected segment (NCCIC, 2015).
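
To make the segmentation idea concrete, the following sketch checks a hypothetical export of firewall permit rules to confirm that nothing other than an approved jump host can reach the ICS subnet.  The subnets, addresses, and rule format are invented for illustration.

from ipaddress import ip_network

ICS_SUBNET = ip_network("10.10.20.0/24")          # hypothetical control-system segment
ALLOWED_SOURCES = [ip_network("10.10.1.5/32")]    # hypothetical jump host

# Hypothetical export of firewall "permit" rules as (source, destination) strings.
permit_rules = [
    ("10.10.1.5/32", "10.10.20.15/32"),
    ("172.16.4.0/22", "10.10.20.0/24"),   # corporate LAN allowed into ICS: a finding
]

def violations(rules):
    """Yield permit rules whose destination touches the ICS segment but whose
    source is not contained in an approved source network."""
    for src, dst in rules:
        src_net, dst_net = ip_network(src), ip_network(dst)
        if not dst_net.overlaps(ICS_SUBNET):
            continue
        if not any(src_net.subnet_of(allowed) for allowed in ALLOWED_SOURCES):
            yield src, dst

for src, dst in violations(permit_rules):
    print(f"Rule {src} -> {dst} exposes the ICS segment to a non-approved source")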

The fifth step is to manage authentication.  To accomplish this, companies should follow best practices for managing authentication, such as strong password policies, multi-factor authentication, and the principle of “least privilege” (NCCIC, 2015).
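
As a small illustration of a strong password policy, the check below validates length and character variety and rejects passwords from a short blocklist.  The thresholds and blocklist are example values, not figures taken from the NCCIC document.

import string

COMMON_PASSWORDS = {"password", "123456", "admin", "operator"}  # illustrative blocklist

def password_is_acceptable(password: str, min_length: int = 14) -> bool:
    """Return True if the password meets the example policy:
    minimum length, at least three character classes, and not on the blocklist."""
    if len(password) < min_length:
        return False
    if password.lower() in COMMON_PASSWORDS:
        return False
    classes = [
        any(c in string.ascii_lowercase for c in password),
        any(c in string.ascii_uppercase for c in password),
        any(c in string.digits for c in password),
        any(c in string.punctuation for c in password),
    ]
    return sum(classes) >= 3

print(password_is_acceptable("admin"))                  # False
print(password_is_acceptable("Turbine-Hall-7-Relay!"))  # True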

The sixth step is to implement secure remote access.  The best strategy is not to allow remote access at all.  If that is not possible, the next best option is to allow monitoring only, with no ability to execute commands.  If users must have execute permissions, access should be restricted to a single access point and all other pathways should be blocked.  Again, companies should require multi-factor authentication (NCCIC, 2015).
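
The sketch below shows one way to audit remote-access accounts against those recommendations: every account should require multi-factor authentication, and every session should come through the single approved gateway.  The account records and gateway name are hypothetical.

APPROVED_GATEWAY = "ra-gateway-01"   # hypothetical single point of remote access

# Hypothetical remote-access account records pulled from an access-management system.
accounts = [
    {"user": "k.smith", "mfa_enabled": True,  "entry_point": "ra-gateway-01",   "can_execute": False},
    {"user": "vendor7", "mfa_enabled": False, "entry_point": "ra-gateway-01",   "can_execute": True},
    {"user": "j.doe",   "mfa_enabled": True,  "entry_point": "plant-dsl-modem", "can_execute": True},
]

def remote_access_findings(accounts):
    """Yield human-readable findings for accounts that violate the example policy."""
    for acct in accounts:
        if not acct["mfa_enabled"]:
            yield f'{acct["user"]}: multi-factor authentication is not enabled'
        if acct["entry_point"] != APPROVED_GATEWAY:
            yield f'{acct["user"]}: connects through {acct["entry_point"]}, not the approved gateway'
        if acct["can_execute"] and not acct["mfa_enabled"]:
            yield f'{acct["user"]}: has execute permissions without MFA'

for finding in remote_access_findings(accounts):
    print(finding)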

The seventh and final step is to monitor and respond.  By installing Intrusion Detection Systems (IDS) and Intrusion Prevention Systems (IPS) and continuously monitoring log files, companies can catch problems before they become security incidents.  Companies should also develop incident response plans and regularly test them (NCCIC, 2015).
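
Continuous log monitoring can start out as simply as watching authentication logs for repeated failures.  The sketch below scans a syslog-style file for failed logins and flags any source address that exceeds a threshold; the log path, message format, and threshold are assumptions made for illustration.

import re
from collections import Counter

LOG_PATH = "/var/log/auth.log"        # hypothetical log location
FAILED_LOGIN = re.compile(r"Failed password for .* from (\d+\.\d+\.\d+\.\d+)")
THRESHOLD = 10                        # alert after this many failures from one address

def failed_logins_by_source(path):
    """Count failed-login lines per source IP address."""
    counts = Counter()
    with open(path, errors="ignore") as log:
        for line in log:
            match = FAILED_LOGIN.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts

if __name__ == "__main__":
    for source, count in failed_logins_by_source(LOG_PATH).items():
        if count >= THRESHOLD:
            print(f"ALERT: {count} failed logins from {source}")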

The second organization is the National Institute of Standards and Technology (NIST), which has produced several important publications on cybersecurity, most notably NIST SP 800-82 Revision 2, Guide to Industrial Control Systems (ICS) Security, and the Framework for Improving Critical Infrastructure Cybersecurity.  The SP 800-82 R2 guide covers ICS risk management, ICS security architectures, and how to apply security controls to ICS (NIST, 2015).

NIST’s Risk Management Framework (RMF) is a six-step process: Categorize Information Systems, Select Security Controls, Implement Security Controls, Assess Security Controls, Authorize Information Systems, and Monitor Security Controls.  By working through this cycle, companies can identify vulnerabilities and select controls to mitigate them based on the company’s priorities (NIST, 2015).

NIST’s Framework for Improving Critical Infrastructure Cybersecurity is another essential document for improving ICS security.  The basic framework consists of five functions: Identify, Protect, Detect, Respond, and Recover (NIST, 2014).  Each function is divided into categories and subcategories that get more specific and technical.  The most useful part of the framework for auditors is the Informative References, which tie specific sections of standards, guidelines, and practices to each subcategory.  These references include COBIT, ISA, ISO/IEC 27001, and NIST SP 800-53 R4, along with the specific sections that apply (NIST, 2014).  This makes the framework an indispensable guide for ICS auditors.

The framework also has a tiered model that helps companies understand how developed and mature their risk management practices are.  The four tiers are Partial (Tier 1), Risk-Informed (Tier 2), Repeatable (Tier 3), and Adaptive (Tier 4).  Tier 1 describes companies that have only a basic understanding of risk management and are mostly reactive.  Tier 2 is slightly more mature, with some formal processes in place.  At Tier 3, a company actively uses and monitors its risk management processes and improves them when needed.  Finally, Tier 4 describes a company with fully mature risk management processes that learns from its own and others’ mistakes and continually adapts as the security situation changes (NIST, 2014).
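
To show how an auditor might put the Informative References to work, the snippet below stores an abbreviated, illustrative mapping from framework subcategories to the standards that can be tested against.  The entries are placeholders that show the structure of the mapping rather than authoritative citations from the framework.

# Abbreviated, illustrative mapping of CSF subcategories to informative references.
# The real tables in the framework (NIST, 2014) are far more complete; treat these
# entries as placeholders showing the structure, not as authoritative citations.
informative_references = {
    "PR.AC-1": ["NIST SP 800-53 Rev. 4 AC-2", "ISO/IEC 27001:2013 A.9.2.1"],
    "DE.CM-1": ["NIST SP 800-53 Rev. 4 SI-4", "COBIT 5 DSS05.07"],
}

def references_for(subcategory: str):
    """Return the standards an auditor could test against for a given subcategory."""
    return informative_references.get(subcategory, [])

print(references_for("PR.AC-1"))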

There are some special considerations when working with and auditing Industrial Control Systems.  It is essential to understand that many of these systems were designed and installed before the internet became common.  Often, as technology advanced, components were “bolted on” to connect the control systems to the internet.  Systems that were never designed to be connected to the internet suddenly became extremely vulnerable.  Some of the problems that became apparent were that most communications were sent in plain text and that systems shipped with hard-coded default passwords that could not be changed.  While these look like glaring errors today, five to ten years ago they were not considered problems because the systems were not connected to the internet.

Even today there are issues with network-connected medical devices that have hard-coded passwords.  An alert issued by ICS-CERT, ICS-ALERT-13-164-01, reported a hard-coded password vulnerability in roughly 300 medical devices spanning about 40 vendors (ICS-CERT, 2013).  Another issue with industrial control systems is that the hardware is often specified with just enough capacity to run the control software, leaving no headroom to add encryption.  Encryption and control systems do not mix well (Sample, 2006); encryption often requires more bandwidth and memory than the ICS can provide.  A final issue that comes up when auditing control systems is that they often run vulnerable software and protocols (Sample, 2006).  Windows XP stopped being supported years ago, yet many ICS still run Windows XP and are incapable of upgrading to more secure versions of Windows.
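
An auditor reviewing these legacy issues can at least enumerate which hosts run operating systems that no longer receive security updates and should be isolated or otherwise compensated for.  The inventory and the list of unsupported versions below are illustrative.

# Hypothetical host inventory: hostname -> operating system.
inventory = {
    "hmi-station-1": "Windows XP SP3",
    "historian-01": "Windows Server 2016",
    "plc-eng-laptop": "Windows 7",
}

# Operating systems that no longer receive vendor security updates.
UNSUPPORTED = ("Windows XP", "Windows 7", "Windows Server 2003")

unsupported_hosts = {
    host: os_name
    for host, os_name in inventory.items()
    if os_name.startswith(UNSUPPORTED)
}

for host, os_name in sorted(unsupported_hosts.items()):
    print(f"{host} runs {os_name}, which is no longer supported; isolate or compensate")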

Many of these vulnerabilities can be mitigated with a proper security architecture that blocks insecure control systems from untrusted networks and the internet.  Port management can be implemented, along with any other controls that make it more difficult for hackers to gain access.  Companies can look into upgrading their control systems, and where they cannot, they can upgrade the surrounding network components and software to ones with better security capabilities.

After reviewing some of the standards and vulnerabilities that apply to ICS, it is worth looking at an actual audit.  In February 2017, the NASA Office of Inspector General (IG) published an audit of NASA’s critical and supporting infrastructure.  The IG office conducted this audit, along with 21 other audit reports, because NASA has moved steadily from older, isolated, manually controlled operational technology to more modern systems that are controlled over networks (Leatherwood, 2017).  What the IG office found is that NASA still has several deficiencies and significant issues concerning its critical infrastructure.  There are two main concerns in the audit report.  The first is that NASA lacks comprehensive security planning for managing risk to its Operational Technology (OT) systems; examples of OT systems include HVAC systems, tracking and telemetry systems, and command and control systems.  The second is that NASA’s critical infrastructure assessment and protection could benefit from improved OT security (Leatherwood, 2017).

Regarding comprehensive security planning for managing risk to its OT systems, there are several areas where NASA could improve.  First, NASA needs to do a better job of defining which OT systems it has.  The auditors found inconsistency in how OT systems were defined across different NASA Centers: a system might be listed as critical infrastructure at one location and not listed at all at another.  There needs to be consistency in how the OT systems are defined (Leatherwood, 2017).

Another finding is that NASA did not follow the NIST guidance on how to categorize its OT systems.  NASA failed to make a distinction between OT systems and IT systems.  Without that distinction, NASA ends up grouping systems with different security risks into a single group, which makes it more challenging to perform risk assessments and implement the right security controls (Leatherwood, 2017).

Awareness and training is another area where NASA could improve.  The auditors visited five NASA centers, including NASA Headquarters, and interviewed more than two dozen employees.  They discovered that NASA does not require role-based, OT-specific training; most employees receive only general IT security training.  Without OT training alongside the regular IT security training, employees will be less able to identify vulnerable systems.  Take a building HVAC system as an example: if an employee did not recognize the HVAC system as a high-risk system, they might not take the proper steps to prevent it from being compromised, such as locking or securing the door that leads to the HVAC controls.  If a hacker gained access to the HVAC system, they could shut it off, potentially putting the IT systems it cools at risk (Leatherwood, 2017).

Lastly, NASA had several easily exploitable risks that could be prevented with administrative controls.  For the OT systems, there was a lack of internal monitoring, auditing, and log file management; most of the systems were checked manually by NASA personnel.  There were no controls in place to monitor physical or logical isolation from the main networks.  NASA also used group accounts, which creates vulnerability in two ways.  First, with group accounts there is no way to know who accessed a system, so if anything goes wrong there is no way to attribute the action to a single employee.  Second, group accounts enable insider attacks: if an employee is fired and the credentials are not changed, that former employee can still gain access to the OT systems.  Most of these issues can be identified and corrected by applying known best practices and implementing the proper security controls.  There just needs to be a centralized and coordinated effort so that all of the NASA offices are using the same language and playbook (Leatherwood, 2017).

Industrial Control Systems have come a long way since the 1960s, and they will continue to evolve and become more complicated as time goes on.  Luckily, there are organizations that provide several excellent documents on how to protect ICS from hackers, and there are plenty of examples of what could go wrong.  Companies just need to use the information that is available and implement it in their networks.  If companies fail to take ICS security seriously, there will eventually be a significant attack on the nation’s critical infrastructure that could put thousands of lives at risk.

References

Swearingen, M., Brunasso, S., et al.  (September 2013).  What you need to know (and don’t) about the Aurora Vulnerability.  Retrieved from http://www.powermag.com/what-you-need-to-know-and-dont-about-the-aurora-vulnerability/?printmode=1

ICS-CERT. (n.d.).  About the Industrial Control Systems Cyber Emergency Response Team.  Retrieved from https://ics-cert.us-cert.gov/About-Industrial-Control-Systems-Cyber-Emergency-Response-Team

NCCIC.  (December 2015).  Seven Strategies to Defend ICSs.  Retrieved from https://ics-cert.us-cert.gov/sites/default/files/documents/Seven%20Steps%20to%20Effectively%20Defend%20Industrial%20Control%20Systems_S508C.pdf

Newman, Lily Hay.  (September 2017).  Equifax Officially Has No Excuse.  Retrieved from https://www.wired.com/story/equifax-breach-no-excuse/

NIST.  (February 2014).  Framework for Improving Critical Infrastructure Cybersecurity.  Retrieved from https://www.nist.gov/sites/default/files/documents/cyberframework/cybersecurity-framework-021214.pdf

NIST.  (May 2015).  NIST Special Publication 800-82 Revision 2:  Guide to Industrial Control Systems (ICS) Security.  Retrieved from http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-82r2.pdf

ICS-CERT. (June 2013).  Alert (ICS-ALERT-13-164-01) Medical Devices Hard-Coded Passwords.  Retrieved from https://ics-cert.us-cert.gov/alerts/ICS-ALERT-13-164-01

Sample, James.  (2006).  Challenges of Securing and Auditing Control Systems.  Retrieved from http://www.isacala.org/doc/ISACALA_SCADA_Presentation_FinalJamey.pdf

Leatherwood, James.  (February 2017).  Industrial Control System Security Within NASA’s Critical And Supporting Infrastructure.  Retrieved from https://oig.nasa.gov/audits/reports/FY17/IG-17-011.pdf
