Impacts of Modern Computing on Legacy Mainframe Systems: Key Considerations for Securing Sensitive Information

Center for Long-Term Cybersecurity
CLTC Bulletin
Jun 27, 2023


By Shaitaj Dhaliwal

The author of this article, Shaitaj (Shai) Dhaliwal, is a graduate student researcher in Information Management & Systems with the UC Berkeley School of Information, specializing in cybersecurity and systems engineering.

This article explores the contributions of modern cloud computing in replacing IBM mainframes for large-scale information processing, and examines how recent innovations in cloud and quantum computing pose a unique risk to critical legacy systems. At the heart is a central question about whether the shift from on-premises data storage to cloud-based platforms increases the risk of critical data being compromised.

In other words, how might advances in cloud and quantum computing impact personally identifiable information (PII) and personal health information (PHI) stored within mainframe information systems that run critical transactions for the financial services and healthcare industries, and what happens when organizations attempt to replatform mainframe-hosted applications to the cloud?

The article identifies (1) the impact of cloud computing on how information is stored; (2) modern security and privacy risks imposed by quantum computing for information hosted in a distributed environment, including the role of key regulations such as European Union’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA); and (3) recommended strategies for effectively managing these emergent risks.

Introduction

With the introduction of cloud computing in 2006, a new model emerged that enabled organizations to “pay as you go” for on-demand use of computer system resources, storage, and computing power, without having to manage these resources themselves.

Many believed this to be the end of the mainframe era, which was characterized by decades of efficiency, large storage, computing power, and global scalability. Prior to the cloud, mainframe computers were the most resilient platform for large-scale data processing, used for such purposes as simulating rocket engines, scientific computing in research laboratories, or, more commonly, credit card transaction and insurance claims database processing. The cloud disrupted this model with the ability to perform similar activities while offering significant cost reductions for organizations.

However, mainframes are still widely used today, and they have in many cases evolved and adapted, rather than being replaced by cloud computing. Indeed, cloud computing is itself a modern form of mainframe computing, with many of the concepts and principles from the mainframe era influencing the cloud model, such as the consumption-based pricing model, by which a customer pays according to the resources used; Linux virtual machines, or computing environments that combine various IT components and isolate them from the rest of the system; and multi-tenancy, an architecture in which a single instance of a software application serves multiple customers.

While cloud computing has not killed off mainframes, it has pushed mainframe system engineers to optimize performance, scalability, and fault tolerance, in many cases by using a hybrid environment, in which mainframes are partially replatformed to the cloud, meaning parts of the legacy system are modified to work optimally in the cloud without rewriting the system’s core architecture.

The concern is that a hybrid architecture, in which the mainframe remains on-premises but virtualized workloads are replatformed to cloud service providers such as Microsoft Azure, introduces new risks, including those posed by advances in quantum computing, which could upend traditional encryption and enable sophisticated cyberattacks for which mainframes may not be prepared.

As many organizations that use mainframes are undergoing a cloud migration to reduce costs, cyber adversaries may develop new techniques to identify vulnerabilities and intercept data flowing between on-premises systems and the cloud. These developments raise questions about whether user privacy for sensitive PII and PHI may be at risk, and whether the GDPR and CCPA sufficiently account for emerging risks in hybrid environments.

Key Challenges Leading Organizations to Change

Mainframes continue to be used extensively in the financial and health insurance industries. According to IBM, 71% of Fortune 500 organizations are still leveraging a mainframe as of 2022. As these systems run legacy applications that were developed two or three decades ago, a variety of challenges are inevitable, including a skills gap. IT professionals who learned COBOL programming and understand mainframe z/OS are retiring; as of 2019, the average age of such programmers was 58, and according to TechChannel, roughly 10 percent are retiring per year. This research also estimated that in 2020, there were 84,000 unfilled positions for mainframe programmers. Young computer science graduates are no longer taught about mainframe software in their courses. Unless these companies decommission their mainframes altogether, upgrading or moving to a new cloud solution is the only alternative.

Through interviews with representatives from a large healthcare company in the U.S. that still uses a mainframe, I identified the following four challenges that may be common to many mainframe users:

  1. Increasing MIPS cost: The number of MIPS (millions of instructions per second) is a general measure of computing performance and, by implication, the amount of work a larger computer can do. For large servers or mainframes, MIPS is a way to measure the cost of computing: the more MIPS delivered for the money, the better the value. Every year, the MIPS cost increases based on higher utilization of the mainframe by legacy applications, driven by ongoing maintenance, patches, and upgrade costs to sustain baseline performance. Measures taken to reduce the consumption of MIPS, such as limiting execution at specific off-peak periods, are forcing delays in delivery cycles or test cycles. Internal chargeback models, where internal consumers (such as departments or functional units) are charged for the IT services they used, are placing increased pressure on application owners, forcing them to reevaluate priorities of implementing new features, as this would be an added cost.
  2. Implementing new features and staying compliant with regulations: When introducing new features based on market needs, it can take several months to enhance and update legacy applications due to the lack of modern development operations processes, understaffed mainframe development and test teams, and overall system complexity. Long manual testing cycles introduce further delays in implementing new enhancements. Privacy regulations such as the GDPR and CCPA may not be applied to mainframes given their black-box nature, and few modern professionals understand the mainframe or receive training in managing such systems, causing potential compliance concerns.
  3. Complex architecture: The complex architecture of mainframes makes it difficult for application owners to introduce changes with multiple dependencies, and enhancements sitting in the backlog over years can add to the complexity of maintaining these systems. Most of these architectures have been in existence for several decades. There is also limited documentation available, and administrators maintaining mainframes may push application owners to continue with the status quo, rather than attempt to innovate the system for simplicity and greater efficiency.
  4. Mainframe skills availability: The labor market for mainframe administrators is tightening, due in part to attrition or retirement of the older workforce. Limited documentation is available, and knowledge about these systems is often retained by individuals who have been maintaining the legacy applications for decades. In some cases, legacy applications are maintained by staff with limited knowledge or familiarity with the underlying software. This can only continue for so long, pressuring organizations to move toward a hybrid approach that combines mainframe and cloud-based resources.
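The internal chargeback pressure described in challenge 1 can be made concrete with a small sketch. The per-MIPS rate and departmental consumption figures below are hypothetical, chosen only to illustrate how MIPS usage translates into costs that application owners must justify:

```python
# Illustrative sketch (hypothetical numbers): how an internal chargeback
# model translates mainframe MIPS consumption into departmental cost.

MIPS_COST_PER_UNIT = 3000.0  # hypothetical annual cost in dollars per MIPS


def chargeback(department_mips: dict[str, float]) -> dict[str, float]:
    """Charge each internal consumer for the MIPS its applications used."""
    return {dept: mips * MIPS_COST_PER_UNIT for dept, mips in department_mips.items()}


usage = {"claims_processing": 450.0, "billing": 120.0, "reporting": 80.0}
bills = chargeback(usage)
print(bills["claims_processing"])  # 1350000.0 for 450 MIPS at $3,000/MIPS
print(sum(bills.values()))         # total annual chargeback across departments
```

Under a model like this, any new feature that raises a department’s MIPS consumption shows up directly on its bill, which is exactly the dynamic that pushes application owners to defer enhancements or shift execution to off-peak windows.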

Security and Privacy for Information Hosted in Distributed Environments

According to a Gartner report published in 2014, “The IBM z/OS mainframe continues to be an important platform for many enterprises, hosting about 90% of their mission critical applications. Enterprises may not take the same steps to address configuration errors and poor identity and entitlements administration on the mainframe as they do on other operating systems. Thus, the incidence of high-risk vulnerabilities is astonishingly high, and enterprises often lack formal programs to identify and remediate these.”¹

Broadcom’s Michael Dickson wrote in 2022 that the mainframe is the most important IT asset for 71% of organizations today, essential for keeping operations up and running and for supplying key data required by modern technologies like machine learning and artificial intelligence. According to Database Trends and Applications, the security of the mainframe is often taken for granted, especially by modern chief information officers (CIOs) and chief information security officers (CISOs). According to the Verizon Data Breach Investigations Report, lost or stolen credentials, which are inherently composed of a user’s identity attributes (first name, last name, ID, email, username, password, and in some cases Social Security number), have consistently been the most common avenue for hackers to access an organization’s critical data, web applications, and infrastructure (e.g., servers and databases). HelpNet Security reports that mainframe security is a top priority for 85% of IT professionals, yet few adequately protect their systems.

Quantum Computing Accelerates The Need for Change

There are clear reasons for organizations to strengthen the security of the mainframe beyond the currently known risks. Future risks, such as those introduced through quantum computing, may be able to compromise highly resilient systems at a speed yet unknown. Quantum computing, according to IBM, is a rapidly emerging technology that harnesses the laws of quantum mechanics to solve problems too complex for classical computers. There are clear benefits to quantum computing, especially as our global data estate continues to grow, with approximately 2.5 quintillion bytes of data produced every day. The issues arise, as with many innovative and profound technologies, when there are associated harms that must be accounted for and addressed in equal measure.

Quantum computing has the potential to solve once-unsolvable complex computing problems, but it also could break widely used global security protocols. Although IBM has been a major contributor to both the mainframe z/OS and emerging quantum computing solutions, it has failed to address the organizational risk that arises when these two systems are mixed together. The authors of the article Cybersecurity in an Era with Quantum Computers: Will We Be Ready? argue that industries such as financial services and healthcare will be the most vulnerable, given that they store almost all of their critical data within mainframe systems, and quantum computers could break most of the encryption algorithms that protect them. Kilber et al. expand on this claim, illustrating that nation-state actors, cybercriminals, and hacktivists could use quantum computing to break common cloud-based protocols and the public key infrastructure (PKI) used to protect certificates, secrets, and SSH keys. Data passed from on-premises systems to cloud-based servers would be vulnerable at a level never seen before.

In summary, the following key risks could arise in a model where mainframes are undergoing modernization and quantum computing is introduced:

1. Data loss due to ineffective encryption for data-at-rest and data-in-transit

  • Cyber adversaries will target the confidentiality, integrity, and availability of mainframe transformations, cloud environments, and quantum services to try to take these systems down.
  • Cyber adversaries will seek financial gain through the sabotage of these systems and by extortion and data theft.

2. Application code vulnerabilities

  • As a mainframe application is undergoing modernization, if proper static application security testing (SAST), dynamic application security testing (DAST), and software composition analysis (SCA) scans are not taking place with continual monitoring, cyber adversaries may be able to intercept plain-text credentials or secrets used to gain access to the mainframe system.
  • If the mainframe system is not air-gapped, a security measure that isolates a computer or network and prevents it from establishing external connections, the adversary could escalate through the internal network. Likewise on the cloud side, if a mainframe application is being replatformed to Azure, the Azure AD environment may not yet have strong cloud security controls in place to detect an intrusion and alert the tenant owner in time.

3. Cloud misconfigurations

  • During the replatform, human error, such as misconfiguration of cloud resources, S3 buckets, infrastructure security, service or data integration, and lower environment security, can put the organization at greater risk of insecurity and data loss.

4. Broken authentication or access control

  • Every mainframe z/OS system has a built-in identity and access management component called the Resource Access Control Facility (RACF), which handles common authentication, authorization, and lifecycle management functions. During a cloud replatform, the cloud system takes over basic authentication; in the case of Azure AD, this allows Microsoft to run Azure MFA. If there is a gap in the process and a user is not prompted to authenticate (i.e., prove they are who they say they are), or if the system fails to authorize them to access only the information they are permitted to access and nothing more, the resulting sensitive data exposure or cross-site scripting may allow a cyber adversary to hijack user accounts, access browser histories, or spread malware.
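Risk 2 above warns that plain-text credentials left in application code can be intercepted during modernization if SAST-style scanning is absent. The following minimal sketch shows the kind of pattern check a secrets-scanning pass performs; the regex rules and the sample source text are illustrative assumptions, far simpler than a real scanner’s rule set:

```python
import re

# Hypothetical detection rules for common plain-text secrets; real SAST and
# secret-scanning tools ship far more comprehensive and tuned rule sets.
SECRET_PATTERNS = [
    re.compile(r'password\s*=\s*["\'][^"\']+["\']', re.IGNORECASE),
    re.compile(r'(api[_-]?key|secret)\s*=\s*["\'][^"\']+["\']', re.IGNORECASE),
]


def find_plaintext_secrets(source: str) -> list[str]:
    """Return each line of `source` that appears to embed a credential."""
    findings = []
    for line in source.splitlines():
        if any(pattern.search(line) for pattern in SECRET_PATTERNS):
            findings.append(line.strip())
    return findings


code_under_review = '''
db_host = "mainframe.example.internal"
password = "hunter2"
timeout = 30
'''
print(find_plaintext_secrets(code_under_review))  # flags only the password line
```

Running a check like this continuously in the modernization pipeline, alongside full SAST, DAST, and SCA tooling, is what prevents converted code from carrying plain-text mainframe credentials into the cloud environment.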

Organizations undergoing mainframe modernization efforts must ensure they are properly diagnosing the risk and building a clear assessment model to proactively catalog and secure each stage of the transformation.

The next section concludes with a recommendation for when organizations should act, what people, process, and technology capabilities are required, and why financial services and healthcare executives should treat this as a call to action.

Recommendations for Securing Data in a Distributed or Hybrid Environment: Strategies for Organizations to Mitigate Risk

Organizations considering mainframe modernization should weigh the following recommendations to protect their critical data during and after a cloud modernization program.

  1. Catalog every critical mainframe application in the environment and conduct a corresponding risk assessment. This will offer insight into which systems are “crown jewels” and cannot undergo any immediate changes, and which applications have the least dependency across the enterprise.
  2. A pilot or proof of concept is recommended for the identified application, in which an organization should consider modern code conversion from COBOL to Java. Using a tool such as JANUS Studio, for example, organizations can transform COBOL into maintainable Java, enabling a modern security architecture and automatically introducing cloud-native services. As code is converted, apply application security scans such as static application security testing (SAST) and dynamic application security testing (DAST). Remediate any identified vulnerabilities and prepare to push the application to a cloud environment, such as Microsoft Azure or Amazon Web Services (AWS).
  3. As mainframe application workloads are rehosted to the cloud, ensure the cloud environment is up to date with strong cloud security controls in place, turn on monitoring services, and ensure the system is compliant with NIST 800–53 and any industry regulations such as HIPAA or SOX. At this stage, organizations may determine which identity and access management controls they would like to test, typically starting with cloud-based multi-factor authentication such as Azure MFA to ensure a secure “front door.” Enabling quantum-resistant encryption for data-in-transit and data-at-rest will be critical to defend against future malicious actors.
  4. Finally, knowing that the systems in question — mainframe z/OS, Top Secret, and ACF2 — were first developed over 50 years ago, modern regulations such as the EU’s General Data Protection Regulation (GDPR) may not apply to certain legacy systems that require extensive knowledge and expertise. How can we prove that legacy mainframes are being audited against the same set of controls and regulations to which all modern platforms are held liable? The GDPR is the toughest privacy and security law in the world, but there are exceptions to the rule. It levies harsh fines against those who violate its privacy and security standards, with penalties reaching into the tens of millions of euros, but only for organizations that process the personal data of EU citizens or residents or offer goods or services to such people. In the case of healthcare organizations that may serve only US-based residents, there is no widely adopted regulation that audits mainframe systems to the extent the GDPR does.
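Recommendation 1, cataloging applications and risk-assessing them to separate “crown jewels” from low-dependency pilot candidates, can be sketched as a simple scoring pass. The fields and weights below are illustrative assumptions, not a standard risk model:

```python
from dataclasses import dataclass

# Sketch of recommendation 1: catalog mainframe applications and score risk.
# Field names and weights are hypothetical; a real assessment would use the
# organization's own risk taxonomy.


@dataclass
class MainframeApp:
    name: str
    stores_pii_or_phi: bool
    dependency_count: int          # downstream systems that rely on this app
    last_security_review_years: int


def risk_score(app: MainframeApp) -> int:
    """Higher score = higher risk; the highest scorers are 'crown jewels'."""
    score = 0
    if app.stores_pii_or_phi:
        score += 5                               # sensitive data raises risk
    score += min(app.dependency_count, 10)       # cap the dependency weight
    score += 2 * app.last_security_review_years  # stale reviews raise risk
    return score


catalog = [
    MainframeApp("claims-engine", True, 14, 3),
    MainframeApp("nightly-report", False, 1, 1),
]

# Pilot candidate: the lowest-risk, least-dependent application migrates first.
pilot = min(catalog, key=risk_score)
print(pilot.name)  # nightly-report
```

The point of the sketch is the ordering it produces: heavily depended-upon PII/PHI systems score high and stay put until controls are proven, while a low-scoring application becomes the proof-of-concept candidate from recommendation 2.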

Given that the mainframe systems in use today were originally configured and installed 30 to 40 years ago for many organizations, before security-by-design or privacy-by-design were adopted as best practices, there may be significant amounts of data stored on users, including PHI and PII, that are unknown to the user and vulnerable. If a user is looking to gain information about their data, such as receiving a report on how their data is being used, it is not easy to do on the mainframe, as much of this information is encrypted and stored in a relational database called Db2. In some cases, it is not easily correlated back to a given identity profile, as is done on most modern platforms today. If an organization uses a mainframe and is planning to migrate workloads to cloud-based platforms in the near term, it is useful to conduct an in-depth assessment and data clean-up process prior to starting this effort.

Future Work/Other Considerations

This analysis introduces the field of mainframe modernization, as well as issues related to cloud computing and quantum computing when applied to legacy systems. Research was conducted with a small sample of organizations within two industries, financial services and healthcare. It is recommended that a second study be conducted that includes a greater number of study participants. There may also be potential to run a proof of concept on a volunteer organization to test and prove the recommended framework.

Conclusion

Mainframe systems have enjoyed a 50-plus-year reign in enterprise information systems, but a variety of pressures are weighing on these systems’ viability. As the cloud grows in dominance, key challenges will continue to force organizations still using mainframes to rethink how they run large batch processes and assess whether a replatform is the next step.

In doing so, I encourage organizations to conduct due diligence by extending the duration of their to-be state design phase. During this time, organizations can conduct valuable proofs of concept to discover how data stored in their mainframe is being protected throughout the course of the replatform, and finally once it is moved into the cloud. With strict guardrails in place that abide by the GDPR and CCPA and protect against quantum risk, organizations can improve how they secure and protect user PHI and PII over the long run.

About the Author

Shaitaj (Shai) Dhaliwal is a graduate student researcher in Information Management & Systems with the UC Berkeley School of Information, specializing in cybersecurity and systems engineering. Other contributors to the analysis described in this article include advisors Professor Clifford Lynch and Professor Michael Buckland. This research was conducted as part of the research seminar at the School of Information and was co-sponsored by the UC Berkeley Center for Long-Term Cybersecurity.

References

Viljoen, S. (2021). A Relational Theory of Data Governance. Yale Law Journal.

Kunz, M., Puchta, A., Groll, S., Fuchs, L., & Pernul, G. (2019). Attribute quality management for dynamic identity and access management.

IBM Security. (2021). Cost of a Data Breach Report 2021.

Mosca, M. (2018). Cybersecurity in an Era with Quantum Computers: Will We Be Ready? IEEE Security & Privacy.

Kilber, N., Kaestle, D., & Wagner, S. (2021). Cybersecurity for Quantum Computing. Cornell University.

Grassi, P. A., Garcia, M. E., & Fenton, J. L. (2017). Digital Identity Guidelines: Revision 3. NIST.

IBM. What is RACF? https://www.ibm.com/docs/en/zos-basic-skills?topic=zos-what-is-racf

GDPR.eu. (2020). “What is GDPR, the EU’s new data protection law?” and “Everything you need to know about the ‘Right to Be Forgotten.’” Horizon 2020 Framework.

Cummings, G. The Myth of Mainframe Security. IBM. (Citing Gartner Research Note G00172909.)

Indu, I., Anand, P. M. R., & Bhaskar, V. (2018). Identity and access management in cloud environment: Mechanisms and challenges. Engineering Science and Technology, an International Journal, 21(4), 574–588.

Denning, D. E. (2019). Is Quantum Computing a Cybersecurity Threat? The Scientific Research Society.

Routray, S. K., Jha, M. K., Sharma, L., Nyamangoudar, R., Javali, A., & Sarkar, S. (2017). Quantum cryptography for IoT: A Perspective. 2017 International Conference on IoT and Application (ICIOT).

Buecker, A., Kanke, M., Mohanan, M., Oliveira, V., Ramalingam, V., Rowley, D., Thalouth, B., & Theilmann, J. (2015). Reduce Risk and Improve Security on IBM Mainframe: Volume 3, Mainframe Subsystem and Application Security. IBM Redbooks.
