Y2K problem

[Image: a computerized display sign showing the Y2K bug on January 3, 2000.]
The Y2K problem is a flaw in computer hardware and software in which unintended results may occur when a year represented as two digits is interpreted incorrectly. There are two basic aspects to the Y2K problem:
- Determining the correct value of a 2-digit year. Example: does "20" represent 1920 or 2020?
- Performing date-based calculations correctly. Example: if you subtract "97" (1997) from "07" (2007), will you get 10 years, or minus 90?
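Both failure modes are easy to reproduce. The following is a minimal Python sketch, illustrative only (the actual affected systems were typically written in COBOL or assembler), showing how a hard-coded "19" century produces both misreadings:

    # Naive legacy logic that hard-codes the century as "19".
    def naive_year(two_digit_year):
        return 1900 + two_digit_year

    print(naive_year(20))  # 1920, not 2020: the wrong century is assumed
    print(7 - 97)          # -90: the 1997-to-2007 span computed on raw 2-digit years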

The origins of the problem lie in the high cost of data storage in the early days of computing, when storing a date in 6 digits (mmddyy) rather than 8 (mmddyyyy) could yield significant savings depending on volume. If the specifications for a piece of software were well designed, this was not a problem, but in many cases the century component of a 2-digit year was assumed rather than explicitly accounted for in the programming.

As the 1990s progressed, it became clear that a significant base of legacy software and hardware might not work correctly once the default century portion of current dates changed from "19" to "20". While the code to handle 2-digit years correctly is not complex, the challenge lay in identifying every instance of bad code, then correcting and retesting it before January 1, 2000.

The potential risks of devices failing to work after that date were often exaggerated, with visions of aircraft falling from the sky and nuclear missiles launching by mistake. The more practical concerns were a loss of integrity and confidence in the financial system, and the potential liabilities faced by those who did not ensure that their systems were Y2K compliant.

Approaches to Y2K Remediation

The optimal approach to Y2K remediation is to change all date references to use 4-digit years instead of 2-digit ones, and revise any date-based calculations accordingly.

If this is not possible or practical, then the century must be inferred from the value of a 2-digit year and its context, a technique known as windowing. For example, a system tracking new 30-year mortgages with maturity dates from "00" to "49" can safely assume the century to be "20", with "19" used for the other values. For birth dates the logic is not as simple: a birth year of "99" could mean 1999 or 1899, so you need to know whether your system deals with adults, infants or both before settling on a fix. With this approach, date calculations also have to be checked so they work properly when the dates in a formula span centuries, as in the sketch below.
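A minimal Python sketch of such a windowing rule (the pivot value of 50 is an assumption chosen to match the mortgage example above; real systems picked pivots suited to their own data):

    def expand_year(two_digit_year, pivot=50):
        """Infer the century from a pivot: values below the pivot
        are read as 20xx, the rest as 19xx."""
        if two_digit_year < pivot:
            return 2000 + two_digit_year
        return 1900 + two_digit_year

    print(expand_year(7) - expand_year(97))  # 10: 2007 - 1997, correct across the century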

Another option for Y2K remediation is to use serial dates instead of calendar dates. In computing, a serial date is the number of days since a base date such as 1/1/1900 or 01/01/0001. While these dates have to be converted to and from Gregorian dates for display on screens and reports, they allow thousands of years' worth of dates to be stored in the original 6-digit storage space, and date-based math becomes simple arithmetic.
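Python's ordinal dates happen to use the second of those base dates (day 1 is 01/01/0001), which makes for a convenient illustration of the idea:

    from datetime import date

    start = date(1997, 6, 15).toordinal()   # days since 01/01/0001
    end = date(2007, 6, 15).toordinal()

    print(end - start)             # 3652: plain subtraction spans the century safely
    print(date.fromordinal(end))   # 2007-06-15: convert back only for display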

Challenges in Y2K Remediation

  1. Lack of documented source code. Programmers develop software in readable languages, and that code is then compiled into the binary executable objects that actually run on servers and mainframes. In many cases, the source code corresponding to the object code in use had been lost, was poorly documented, or was found to work differently when recompiled. Companies and government agencies spent considerable sums on establishing proper software life-cycle management protocols and verifiable code bases to work against.
  2. Availability of qualified programmers. Much of the non-compliant software was written in assembler or in outdated languages such as COBOL and FORTRAN, for which skilled programmers were scarce by the late 1990s. People with those skills charged a premium for their services, driving up the cost of remediation work significantly.
  3. Firmware and embedded devices. Software is found not only in standalone computers but also embedded in hardware and 'smart' devices that are sub-components of complex systems such as cars, aircraft and power grids. Testing the code used in these devices is simple, but updating or replacing them in the field is not.

Cost of Y2K-related Efforts

There are varying estimates of the total cost of the hardware and software analysis, remediation and testing performed for the Y2K issue. Some estimates in the late 1990s put the worldwide cost in the trillions of dollars, but this has been dismissed as hyperbole. A typical exaggeration was counting hardware/software upgrades that happened to be Y2K-compliant as Y2K-related expenditures when they would have taken place anyway. A more accurate estimate of the truly Y2K-driven analysis and cleanup effort would be in the tens of billions of dollars.[1]

In many cases involving computer hardware and packaged software, Y2K remediation was achieved by replacing or upgrading non-compliant versions with compliant ones. The upgrade-related technology spending in the last half of the 1990s was a key contributor to the economic growth during that period.

Legacy of Y2K

Prior to the Y2K issue surfacing, many entities had considered software and computer systems an expense rather than an asset, and few resources were dedicated to ensuring that these systems remained documented, well managed and auditable after their initial launch. The cost of achieving Y2K compliance prompted most firms and government agencies to improve the quality of their software life-cycle management.

Since Y2K analysis and remediation usually involved repeated testing of the same set of code, many companies and government agencies developed or improved their competency in software quality assurance.

The shortage of programmers trained in legacy languages like COBOL in the late 1990s provided an opportunity for companies in low-cost nations such as India to offer outsourcing services to new markets. Companies found that commodity-type assignments like Y2K remediation were a good fit for offshore outsourcing under these circumstances.[2] The momentum behind offshore programming continued and accelerated after January 1, 2000 had passed.

A similar problem will be encountered in the year 2038, when systems that store time as a signed 32-bit count of seconds since January 1, 1970 (the convention used by the UNIX operating system) will exceed their time storage capacity. Systems using 64-bit time values will not run into a similar problem for literally billions of years.
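The arithmetic is easy to check with a short Python sketch (the wraparound helper below is illustrative; the affected systems store the counter in C's time_t type):

    from datetime import datetime, timedelta, timezone

    EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

    def as_int32(n):
        """Reinterpret n as a signed 32-bit integer (two's-complement wrap)."""
        n &= 0xFFFFFFFF
        return n - 2**32 if n >= 2**31 else n

    max_time_t = 2**31 - 1
    print(EPOCH + timedelta(seconds=max_time_t))  # 2038-01-19 03:14:07+00:00
    # One second later the 32-bit counter wraps negative, which naive
    # code reads as a date in December 1901.
    print(EPOCH + timedelta(seconds=as_int32(max_time_t + 1)))  # 1901-12-13 20:45:52+00:00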

It was reported in 2017 that the federal government still required Y2K reports.[3]

References

  1. Everyone pays a price for Y2K hype
  2. http://www.lib.utk.edu/news/readyfortheworld/archives/the_world_is_flat/001189.html
  3. Bedard, Paul (June 15, 2017). "Shock: Feds still require Y2K impact reports 17 years later". Washington Examiner. Retrieved June 15, 2017.