“Y2K was the first global challenge caused by information technology. Left unaddressed it would have significantly disrupted everyday life in many parts of the world.”[1]
Definition
The term Y2K problem (also referred to as the "Year 2000 problem", the "Year 2000 computer bug" and the "Millennium bug") "refers to the inability of certain computer software to accurately process dates after December 31, 1999."[2]
Overview
“At 12:01 on New Year’s morning of the year 2000, many computer systems could either fail to run or malfunction — thereby producing inaccurate results — simply because the equipment and software were not designed to accommodate the change of date to the new millennium.”[3]
"The use of two digits to represent a four-digit year, and the inherent fault of '00' being interpreted as 1900 instead of 2000, was a standard programming practice throughout the computer industry that had the potential to affect millions of information systems around the world."[4] For instance, 1968 or 1974 would be stored and processed as 68 and 74, respectively. The number 19, indicating years in the 1900s, was implied.
This worked smoothly until users began to input dates after December 31, 1999. Computers ran into trouble when required to calculate a figure based on the difference between two dates, such as the interest due on a mortgage loan. Because they continued to assume the implied prefix 19, years entered as 00 or 01 were treated as 1900 and 1901, so computers could not correctly calculate the span between a year in the 1900s and a year in the 2000s.
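A minimal sketch of the failure mode, in modern Python rather than the COBOL of the era (the function and values here are hypothetical, for illustration only):

```python
# Two-digit year fields, with the century prefix "19" implied,
# as stored in many legacy records.
def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Difference between two 2-digit years, assuming both fall in the 1900s."""
    return (1900 + end_yy) - (1900 + start_yy)

print(years_elapsed(95, 99))  # 4: a loan opened in 1995, computed in 1999
print(years_elapsed(95, 0))   # -95: in 2000, "00" is read as 1900, so the
                              # loan appears to end 95 years before it began
```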
"The mechanics involved in making any one of these systems capable of correctly processing the Year 2000 date were fairly straightforward, but the scope of the work — identifying, fixing, and testing millions of systems and data exchange points in a global economy — was daunting."[5]
Reasons for the Y2K problem
There were many reasons for the problem, some historical, some sociological, and some anecdotal. The bottom line, however, was that unless it was fixed, the Y2K problem would manifest itself in computer errors, crashes, data corruption, and other failures.
Among the reasons computer system designers and programmers abbreviated the date field were the prevalent use of punched cards, the expense of storage, and the programming methodologies used in these early systems.
Punched cards
In the late 1800s, the U.S. government was facing a data processing crisis of sorts. It was taking the U.S. Census Bureau more than a decade to perform and report the results of the census, which is required by law to be taken every ten years. Something had to be done to tame the flood of census data that was overwhelming the manual tabulation methods used.
Herman Hollerith was a Census Bureau employee who developed an electromechanical tabulating system using punched tapes, and later punched cards.[6] His system represented information as a series of holes on a punched card.[7] Each column of the card could represent a single fact, a yes-or-no answer, or a number. In 1884, Hollerith filed a patent on the punched tape system, and later on the punched card system. Hollerith’s company (The Tabulating Machine Company), through a series of acquisitions and name changes, eventually became IBM.
The Hollerith machines used what became the ubiquitous 80-column “IBM Hollerith cards.”[8] Unfortunately, the 80-column cards imposed limitations on the amount of data that could be recorded on each card. Only 80 bytes of data (1 byte equals 1 character of data) were available. If the data could be shortened, then there would be more available space for additional information.[9]
One way to squeeze more data onto each card was to drop the two century digits, so that 1915 would be entered simply as “15.” If there were several dates on one card, two bytes could be saved for each date. As long as the century remained the 20th, there would be no problem, and few programmers or managers imagined that such systems would stay in service long enough to approach the Year 2000.
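To make the byte arithmetic concrete, here is a hypothetical fixed-width card record (the field layout is invented for illustration, with Python standing in for the card-processing code of the era):

```python
# Hypothetical 80-column record: NAME in columns 1-20, then two dates.
# With YYYYMMDD dates the fields span 36 columns; with YYMMDD, only 32,
# saving two columns per date for other data.
record = "SMITH JOHN          150412470301"
name  = record[:20]    # "SMITH JOHN"
dob   = record[20:26]  # "150412" -> April 12, (19)15
hired = record[26:32]  # "470301" -> March 1, (19)47
print(name.strip(), dob, hired)
```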
Computer storage
In the early decades of computing (officially dating from 1948, though the relevant period here begins in the 1960s), resources were very limited and very expensive. Computer memory cost, for instance, $1.00 per byte on a Univac mainframe in 1971. That cost, coupled with the limits of the technology, meant that even the largest computers had only a modest amount of memory. Programmers nevertheless had to create complex programs, instructions, and data structures to meet the intricate business needs of their clients and employers, and many learned to build remarkably capable programs in as little as 4,000 bytes.
Early programmers faced challenges, and developed habits of mind, very different from those of today’s programmers. They devised tricks to coax a great deal of logic out of very few resources, and still with acceptable performance. Today’s programmers more often simply buy more memory, faster processors, and more disk storage to avoid performance and space problems.
In addition to the cost of memory, storing information on disk drives in the 1970s and early 1980s was also very expensive: a megabyte of disk space could cost as much as 10,000 times what it costs today.
The cost of memory and storage encouraged programmers to take shortcuts as well. One was to truncate four-digit years to two digits, just as had been done on punched cards. Again, most programmers in the 1970s and 1980s had no idea that the programs they were developing would still be in use a quarter of a century later.
Programming methodology
Before there were structured programming, CASE tools, fuzzy logic, and automated testing suites, programmers were a very asocial breed. With virtually no tools and little history before them, they took on the challenge of developing complex software in tremendously small amounts of memory and with very expensive, limited database capability. To get the job done, they had to make the computer do things its resources were otherwise incapable of doing, developing shortcuts and tricks that let their programs survive and operate in the smallest of machines while using every available computer resource. Documentation took up precious space and was anathema to most programmers.
These men and women were brilliant, but the code they left behind was virtually impossible to maintain: a beehive of complex, non-intuitive logic intended to make the computer perform beyond its intended capabilities. It was this unstructured, clever, idiosyncratic spaghetti code that accounted for much of the difficulty of finding, fixing, and testing Year 2000 problems.
Legacy systems
"Information technology hardware and software have evolved so continuously over the past 25 years that new applications have been incorporated by modifying slightly older versions of programming and records. The result is that very few systems are completely devoid of code or records from the 1970s and 1980s, when the century turnover seemed too distant to worry about. Further, in the rigorous and non-reflective manner in which microprocessors and computers operate, even one line of code that has not been touched in decades can disrupt or shut down a system, or produce error-laden results."[10]
Embedded chips
"Billions of microprocessors produced over the past 30 years include clock chips . . . set at 'absolute time' beginning with a two-digit year-date followed by months, days, hours, seconds and sometimes milliseconds. Most of these chips have been used in watches or other applications that do not carry risks of causing a cascade of problems. Many . . . have been embedded in a wide range of equipment as convenient timing devices for industrial applications. By subtracting one absolute time from another, for instance, a typical device derives elapsed time for the flow rates in a pump, the timing circuit in a traffic light, or the interval between moving trains. If the device is operating when the year date for the end time shifts from '99 to '00, a malfunction may occur as the device reads an extremely long interval of time or time running backwards. However, if the device is not in operation when the clock turns from 99 to 00, both the start time and the end time may register the same century and the device could continuing operating correctly. These vagaries in results mean that diagnostic costs for this problem are often steep and the consequences unknown. For these reasons, many businesses are dealing with embedded chips by replacing the equipment or by adopting an approach of simply fixing devices after they fail."[11]
Systems affected
The Y2K problem affected two general classes of equipment. The first comprised business systems or mainframe systems. The second went by several common names, including embedded chips, embedded processors, and embedded control systems.
Impact on organizations
Y2K-related failures in business systems would generally cause an enterprise to lose partial or complete control of critical processes. In the private sector, loss of business systems meant that a company might have difficulty managing its finances, making or receiving payments, and tracking inventory, orders, production, or deliveries.
“To be certain that a system is Y2K compliant, a programmer must often search for the date error in millions of lines of software code, and in data sets where only 2 digits are available for birth years and the like. In an industrial application, technicians must determine whether embedded chips — often the proverbial black boxes of modern machinery — are susceptible to the date error, and apply one of several possible fixes to remove or work around the problem. Remediated software and hardware must be extensively checked to insure that new problems are not introduced by the correction. The majority of the total costs of fixing Y2K are entailed in this search for errors and testing the remedy applied. The actual fixes and work-arounds are relatively inexpensive.”[12]
In the public sector, government organizations might be severely hindered in performing basic functions such as paying retirement and medical benefits, maintaining military readiness, responding to state and local emergencies, controlling air traffic, collecting taxes and customs, and coordinating law enforcement efforts.
“Solutions to any individual Y2K problem were usually not technically challenging. Given the sheer volume of computerized systems and the lack of proper documentation, however, the work became complex, time consuming, and expensive. The tasks involved (a) evaluating the potential impact of Y2K on each of millions of different digital systems around the world, (b) fixing or replacing each essential system, (c) testing the fixed or new systems, and (d) developing and testing contingency plans in case problems were missed or fixes did not work properly. On a global scale, the task of simultaneously correcting millions of software problems was an enormous management challenge.”[13]
References
1. Y2K: Starting the Century Right!, at ii.
2. Peerless Wall & Window Coverings, Inc. v. Synchronics, Inc., 85 F.Supp.2d 519, 522 n.1 (W.D. Pa. 2000).
3. High-Risk Series: Information Management and Technology, at 37-38.
4. The Journey to Y2K: Final Report of the President’s Council on Year 2000 Conversion, at 2.
5. The Journey to Y2K: Final Report of the President’s Council on Year 2000 Conversion, at 2.
6. For additional information on Hollerith, see Geoffrey Austrian, Herman Hollerith (Columbia Univ. Press 1982).
7. The concept of punched cards did not originate with Hollerith, but with the French inventor Jacquard, who in 1804 devised punched cards to control the selection of strands on a loom. See generally Punched Cards (Robert S. Casey & James W. Perry eds. 1951).
8. To avoid litigation, Univac Corporation developed a 90-column punched card, which had the same limitations as the 80-column card. IBM later introduced a 96-column card for its System/3 minicomputer line.
9. If data for a particular record spanned more than one card, the developer/programmer had to number the cards sequentially so that they could be sorted and reorganized correctly if the cards were accidentally dropped. This meant that even more space was consumed for control purposes, rather than for data unique to a particular record.
10. The Economics of Y2K and the Impact on the United States, at 1.
11. Id.
12. The Economics of Y2K and the Impact on the United States, at 1.
13. The infoDev Y2K Initiative: Scope, Impact, and Lessons Learned, at 5.
See also
- The Economics of Y2K and the Impact on the United States
- Year 2000 compliant
- Year 2000 Computing Challenge: Federal Business Continuity and Contingency Plans and Day One Strategies
- Year 2000 Computing Challenge: Noteworthy Improvements in Readiness But Vulnerabilities Remain
- Year 2000 Computing Crisis: An Assessment Guide
- Year 2000 Computing Crisis: A Testing Guide
- Year 2000 Computing Crisis: Business Continuity and Contingency Planning
- Year 2000 Computing Crisis: Customs Has Established Effective Year 2000 Program Controls
- Year 2000 Computing Crisis: Time is Running Out for Federal Agencies to Prepare for the New Millennium
- The Year 2000 Problem
- Year 2000 Project
- Y2K
- Y2K Action Weeks
- Y2K Computing Challenge: Day One Planning and Operations Guide
- Y2K: Starting the Century Right!
- 100 Days to Y2K: A Resource Guide for Small Organizations