Y2K, The Sequel

Dov Keshet
Jan 3, 2021

Intro

In PART 1 I wrote about the potential Y2.1K leap year problem that should interest anyone who is involved with Risk Management, Business Continuity, Regulatory Compliance, and IT Governance.

As I looked into the Y2.1K leap year issue, I came across something much more closely related to the original Y2K bug, and with far more urgent implications for the near future — Date Windowing.

Date Windowing — in a nutshell

Date Windowing is a method used in systems where the two digits representing the year within a century were stored, but the two digits representing the century itself (such as 19, 20, or 21) were not stored; instead, the century was inferred from the two digits of the year.

The century value could only be inferred if, based on the nature of an organization's business and historical records, a "pivot year" could be defined to mark a split: all years from 00 up to the pivot year were associated with one century, and all years from the pivot year through 99 with the other. If the records spanned more than one hundred years, the same two-digit value could belong to either century, and no pivot could tell them apart.

Here’s an example taken from IBM documentation relating to a pivot year of 40, which was widely adopted:

  • If the 2-digit year is greater than or equal to 40, the century used is 1900. In other words, 19 becomes the first 2 digits of the 4-digit year.
  • If the 2-digit year is less than 40, the century used is 2000. In other words, 20 becomes the first 2 digits of the 4-digit year.
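The IBM rule above can be sketched in a few lines of Python (the function name is mine, for illustration):

```python
def expand_year(yy, pivot=40):
    """Infer the century for a 2-digit year using a pivot year.

    With the widely adopted pivot of 40: values >= 40 map to the
    1900s, values < 40 map to the 2000s.
    """
    if not 0 <= yy <= 99:
        raise ValueError("expected a 2-digit year (0-99)")
    return 1900 + yy if yy >= pivot else 2000 + yy

print(expand_year(95))  # -> 1995
print(expand_year(21))  # -> 2021
```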

There was no single standard regarding the pivot year — some organizations used 20 as their pivot year, others 30, 40, 50 or 60. And sometimes an organization would use one pivot year when relating to birth dates and another pivot year when relating to interest calculations.

Then came Y2K

The default recommended approach to fixing the Y2K bug was Date Expansion — a "technical" approach whose goal was to physically expand every 2-digit year field in a system, without exception, whether in program memory or stored externally in files, to 4 digits that included the value of the century. As part of this effort, the century value (such as 19 or 20) had to be calculated as well. The end result, for example, would be going from a 2-digit field containing the value 95 to a 4-digit field containing 1995. This was considered the "purest" method, providing a future-proof solution. It was also the most costly option, requiring massive development and testing resources, as its technical approach affected entire systems.
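As a rough sketch of what Date Expansion meant at the data level, consider widening the year field in a fixed-width record; the record layout here is hypothetical, and real conversions had to touch every file and program that held a date:

```python
def expand_record(record, yy_start, yy_end, pivot=40):
    """Rewrite one fixed-width record, widening its 2-digit year to 4 digits.

    The century is calculated once, at conversion time, using the same
    pivot-year rule the system would otherwise apply at runtime.
    """
    yy = int(record[yy_start:yy_end])
    century = "19" if yy >= pivot else "20"
    return record[:yy_start] + century + record[yy_start:]

# Hypothetical account record with the year in columns 10-11:
print(expand_record("ACCT0001 95 1200.00", 9, 11))  # -> "ACCT0001 1995 1200.00"
```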

In cases where Date Expansion was considered too high a risk — whether because of cost (an estimated $500 billion was spent on Y2K worldwide), time constraints (changes had to be completed with ample time to test before the end of 1999), or other factors — Date Windowing was used instead. It was not a straightforward "technical" approach: it usually required in-house staff who were familiar with the business logic embedded in the system and could pinpoint where windowing needed to be put in place and where the software could continue to function unchanged. This approach was less costly, less invasive, took less time, and allowed for more focused testing.

Problem is

The Date Windowing solution could only be used to discern between two centuries, for example between the 20th century (where the century value is 19) and the 21st century (where the century value is 20). As time progressed, if the need arose to identify a third century, such as the 22nd (where the century value is 21), then Date Windowing would no longer be a sustainable solution.

Looking back at the IBM example above: if the pivot year is 40, then a year value from 40 to 99 is associated with the 20th century — meaning the century value is 19. That worked back in 2000. However, over 20 years have passed since then, and programs that perform future projections for interest, savings, loans and so on are already reaching the pivot year and beyond. A program that performs a 30-year mortgage calculation in 2020 will cross the pivot year 40, and may end up treating years 40, 41 and 42 as 1940, 1941 and 1942 instead of 2040, 2041 and 2042.
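The failure mode is easy to demonstrate. Here is a small Python sketch of a projection that stores years as 2 digits and then re-infers the century with the pivot-year-40 rule:

```python
def window(yy, pivot=40):
    """Date Windowing per the IBM example: pivot year 40."""
    return 1900 + yy if yy >= pivot else 2000 + yy

# A multi-year projection started in 2020, with years stored as 2 digits.
for year in range(2038, 2043):
    yy = year % 100              # only 2 digits are stored
    inferred = window(yy)        # century re-inferred on the way back out
    flag = "" if inferred == year else "  <-- wrong century!"
    print(f"stored {yy:02d} -> inferred {inferred}{flag}")
```

Everything is fine through 2039; from 2040 on, the stored value crosses the pivot and the inferred dates snap back a full century.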

I checked with a few colleagues of mine in different countries. One reported that they had already had an ugly run-in with a batch job that got its centuries crossed.

What now, what next

It seems that if there are places in your organization's core legacy (aka Cobol) systems that still implement the Date Windowing method as a Y2K remedy, now would be a very good time to seek out an alternative solution. Date Windowing is going to fail more and more often, sooner than we'd like to think.

It would be a good idea to begin exploring, analyzing and deciding if and how your organization will meet this challenge. A good first step would be to start scanning the relevant system sources to get a feel for the magnitude and effort that may be needed to mitigate the issue.
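As a very rough starting point for such a scan, here is a Python sketch. The search patterns are hypothetical heuristics — 2-digit numeric PIC clauses and hard-coded century constants — and real windowing code varies widely, so a scan like this only narrows down where to look by hand:

```python
import re
from pathlib import Path

# Hypothetical heuristics; tune these to your shop's coding conventions.
PATTERNS = [
    re.compile(r"PIC\s+9\(2\)", re.IGNORECASE),  # 2-digit numeric fields
    re.compile(r"\b(19|20)00\b"),                # hard-coded century constants
]

def scan(source_dir, extensions=(".cbl", ".cob", ".cpy")):
    """Yield (path, line number, line) for lines matching a suspect pattern."""
    for path in Path(source_dir).rglob("*"):
        if path.is_file() and path.suffix.lower() in extensions:
            text = path.read_text(errors="ignore")
            for n, line in enumerate(text.splitlines(), 1):
                if any(p.search(line) for p in PATTERNS):
                    yield path, n, line.strip()

# Example usage:
# for path, n, line in scan("src/"):
#     print(f"{path}:{n}: {line}")
```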

Acknowledgement

I would like to acknowledge Bob Bemer, considered to be the Grandfather of Cobol, who worked together with Grace Hopper (“Grandma Cobol”), and was the first person known to publicly address the Y2K issue as early as 1958, continuously urging governments and organizations for decades to recognize and fix the problem.

Dov Keshet has been designing and developing award-winning core business and infrastructural solutions in large organizations for over 35 years. He has served as technical lead and architect in the development, integration and migration of information systems predominantly on the IBMi (aka iSeries or AS/400) platform.

Feel free to contact me at dovk@cobol.ninja or via LinkedIn. If you have any specific R&D or training needs, you can contact me via Fiverr.
