Malicious Life Podcast: The Y2K Bug Pt. 1

In the 1950s and 60s - and even into the 1990s - the cost of storage was so high that using a 2-digit field for dates in software, instead of 4 digits, could save an organization between $1.2 and $2 million per GB of data. From this perspective, programming computers in the 1950s to record four-digit years would’ve been outright malpractice. But 40 years later, this shortcut became a ticking time bomb which one man, computer scientist Bob Bemer, was trying to defuse before it was too late.

 


About the Host

Ran Levi

Born in Israel in 1975, Malicious Life Podcast host Ran studied Electrical Engineering at the Technion - Israel Institute of Technology, and worked as an electronics engineer and programmer for several high-tech companies in Israel.

In 2007, he created the popular Israeli podcast Making History. He is the author of three books (all in Hebrew): Perpetuum Mobile: About the history of Perpetual Motion Machines; The Little University of Science: A book about all of Science (well, the important bits, anyway) in bite-sized chunks; Battle of Minds: About the history of computer malware.


Perry Chen

Founder of Kickstarter

Principally, Chen’s work revolves around systems. His studio practice often explores how we negotiate a world of growing complexity and uncertainty, using research and archival material as entry points for engagement. Time Magazine named him one of the 100 most influential people in the world in 2013.

About The Malicious Life Podcast

Malicious Life by Cybereason exposes the human and financial powers operating under the surface that make cybercrime what it is today. Malicious Life explores the people and the stories behind the cybersecurity industry and its evolution. Host Ran Levi interviews hackers and industry experts, discussing the hacking culture of the 1970s and 80s, the subsequent rise of viruses in the 1990s and today’s advanced cyber threats.

Malicious Life theme music: ‘Circuits’ by TKMusic, licensed under Creative Commons License. Malicious Life podcast is sponsored and produced by Cybereason. Subscribe and listen on your favorite platform:

All Posts by Malicious Life Podcast

Transcript

Peterson Air Force Base, Colorado Springs, Colorado. December 27th, 1999.

Around 20 Russian soldiers arrive at the second floor of Building 1840, a US Space Command office complex inside the facility.

They occupy a stuffy, windowless room packed with dozens of desks, computers, and monitors, with a large video display at the front. Also: one red telephone, connected by a secure line to Russia. At this very moment, approximately 2,440 nuclear-tipped American missiles are primed and ready to launch at a moment’s notice. In Russia, 2,000 more are pointed in the opposite direction.

Both of these countries have decades of experience building, pointing, and threatening to launch rockets. This time, though, they’re extra worried that somebody might just press a big, red button. And they have good reason to be.

15 miles southwest of Peterson Air Force Base lies a dark steel tunnel leading directly into a 10,000-foot-high mountain. Burrowed inside this mountain, through a military checkpoint and behind a giant steel door, lie top secret operations centers for the Air Force and North American Aerospace Defense Command–NORAD–plus a command-and-control center and an ultra-sophisticated missile detection system.

Thanks to decades of Cold War, America’s turn-of-the-century network of satellites, radars, and sensors can detect a SCUD missile launch from well over 20,000 miles up in space, then instantly feed that data to operators at Cheyenne Mountain. Millions and millions of dollars have gone into building, modifying, and keeping this system up-to-date and resilient to small bugs.

Russia, by contrast, does not have quite as sophisticated a system, and they haven’t fully invested in updating it. For this reason, both Russian and US leaders are worried about an imminent, potentially catastrophic situation.

At the stroke of midnight on December 31st, 1999, as Russia enters its next millennium, there exists a non-zero chance that its missile detection system could glitch or fail in any number of unexpected ways. Such an occurrence could inspire what the Pentagon’s joint staff call “opportunistic engagements”–mistaken (or malicious) interpretations of the false data, which Russian agents might respond to by actually launching missiles.

Intro to Bob Bemer

Eight months before Space Command welcomed Russian troops to Colorado Springs, a reporter for the Baltimore Sun took a two-hour drive from Dallas-Fort Worth, Texas, to visit a large house on top of a cliff, overlooking a lake. Inside this hilltop home lived an old, charismatic, surly, legendary computer programmer.

“He’s not like normal people,” one friend warned the reporter, “He’s eccentric — but bright as hell.”

Upon arriving, the reporter got a quick tour of the house, which confirmed as much. Quote:

“There’s his obsession with lists, page after page detailing every country he’s ever visited, every flight he’s ever taken (complete with latitude and longitude of the destination and total miles), and the date of every visit to his parents. There’s his bedside collection of 40-year-old Pogo Possum comic books — “My hero,” he says with a smile — whose nuggets of wisdom he likes to quote. His favorite Pogoism: “The depths of human stupidity are as yet unplumbed.” There are his scrapbooks overflowing with yellowed newspaper clips and fading interoffice memos, documenting a half-century at computer giants such as IBM, GE, Univac, and Honeywell.”

Robert Bemer was born in Sault Ste. Marie, Michigan, in 1920. He earned degrees in mathematics and aeronautical engineering as a young adult, then took on a series of rather odd jobs. First he was a machinist, then a furniture maker, even a movie set designer, before being enlisted to work as an aerodynamicist for the Douglas Aircraft Company during World War II. He worked for the RAND Corporation, Marquardt Aircraft, and then Lockheed, until in 1955 he landed as an assistant manager of programming research at IBM.

And that’s the least important part of his resumé.

In 1960, he was part of the committee which created ASCII, the character encoding format which remains ubiquitous today. He’s been called “the father of ASCII,” in fact, for his role there, and for having invented a number of characters we take for granted today, like the backslash, curly brackets, and the Escape key. So you can thank Bob Bemer the next time you use Escape to exit out of an annoying program.

He also developed a business programming equivalent to FORTRAN, called COMTRAN. Then as a member of CODASYL–the Committee on Data Systems Languages–he integrated elements of COMTRAN into what would become COBOL, a 65-year-old programming language still widely used in mainframe computers today.

It was his magnum opus. It was also, in some ways, a catalyst for an issue which would only rear its head four decades later.

The 00 Problem

Bob may well have been the very first person to discover that issue, thanks to none other than the Mormon church.

Working for IBM in the late 50s, he was put on a job helping the Church of Jesus Christ of Latter-day Saints computerize all of its reams of genealogical records. It sounds easy enough, but, in the process, he ran into a technical roadblock. The records dated back hundreds of years, but he couldn’t actually reflect that in the data. In the format he was working with, years were only represented by two digits.

The reason for that traced back to pre-computer times, even as far back as the late 19th century, when people used electromechanical equipment and punch cards to record their data. Punch cards outlived that era: IBM computers still relied on cards with space for just 80 characters. Over time, the inherent limit to how much data could be recorded by these machines and on these cards inspired the people who used them to come up with every creative means they could to abbreviate.

Dates were an easy one: 1957 could easily be represented as 57, and you’d save two characters. That doesn’t sound like much today, maybe, but it represented 2.5% of an 80-character IBM punch card.

The same calculus applied even if you were working with cutting-edge digital memory. We’re in 1957 now, but punch cards will stick around for years to come, and computer memory will remain so expensive that it’s hardly a better option.

Take 1963, when leasing just one megabyte of mainframe storage cost around $175 per month. That’s $2,100 a year, and remember: we’re talking 1963 dollars. Accounting for inflation, that one megabyte of memory cost the equivalent of roughly $21,500 per year in 2024 dollars.
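If you want to sanity-check those figures, the arithmetic is short. Here’s a minimal Python sketch; the roughly 10x multiplier for converting 1963 dollars into 2024 dollars is an approximation on my part, not an official CPI figure.

# Back-of-the-envelope check on the 1963 storage figures quoted above.
monthly_lease_per_mb = 175                       # dollars per megabyte per month, 1963
annual_1963 = monthly_lease_per_mb * 12          # $2,100 per megabyte per year
inflation_1963_to_2024 = 10.2                    # approximate multiplier, not an official CPI figure
annual_in_2024_dollars = annual_1963 * inflation_1963_to_2024
print(f"${annual_1963:,} per MB per year in 1963 dollars")
print(f"~${annual_in_2024_dollars:,.0f} per MB per year in 2024 dollars")  # prints ~$21,420, in line with the $21,500 figure above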

Let’s fast forward through a dozen more years of technological advancement. In 1975, Hewlett Packard ran an ad for a 30% price cut on RAM capable of storing 8,000 words–a paltry figure to us today, of course. What was the new, low price of this budget offering? $990. And that was for bulk orders of 50–if you wanted just one for yourself, it actually cost $1,500, down from its prior price tag of $2,150. $1,500 in 1975 is equivalent to around $9,000 today.

So how much money did it save to chop the 19- off of dates? Two researchers did that math in a 1995 article titled “Accrued Savings of the Year 2000 Computer Date Problem.” Quote:

“Depending on the industry and application, about 3% to 6% of the data stored in organizational databases is dates. Using 4-digit, rather than 2-digit year fields would have required 33% more storage for dates […]. Using the lower 3% figure for date density, we conservatively estimate that 1% more disk storage would have been required through the years if we had stored all 4 digits of the year field.

One percent doesn’t seem to be very much until you consider several important factors: the large volumes of data stored by organizations, the historically high cost of storage, and the rate of inflation. So for every gigabyte (GB) of organizational data stored, a 1% savings represents 10 MB. Designing systems to save 1% of storage costs since 1963 has saved between $1.2 and $2 million per GB (in 1995 dollars)[.]”
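The core of that estimate is simple proportion arithmetic. Here’s a minimal sketch in Python, using only the authors’ own low-end figures from the quote above:

# Reproducing the paper's low-end estimate with the figures quoted above.
date_density = 0.03              # ~3% of stored organizational data is dates
extra_for_four_digits = 1 / 3    # 4-digit years need ~33% more storage for date fields
overall_overhead = date_density * extra_for_four_digits   # ~0.01, i.e. about 1% of all storage

mb_per_gb = 1_000                # the article treats 1 GB as 1,000 MB
saved_mb_per_gb = overall_overhead * mb_per_gb
print(f"{overall_overhead:.1%} of total storage, or ~{saved_mb_per_gb:.0f} MB for every GB stored")
# -> 1.0% of total storage, or ~10 MB for every GB stored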

From this perspective, it feels like programming computers in the 1950s to record four-digit years would’ve been outright malpractice.

Before the Senate Banking Committee in 1998, then-Federal Reserve Chair Alan Greenspan reflected on this, admitting that, quote:

“I’m one of the culprits who created this problem. I used to write those programs back in the 1960s and 1970s, and was proud of the fact that I was able to squeeze a few elements of space out of my program by not having to put a 19 before the year. Back then, it was very important. We used to spend a lot of time running through various mathematical exercises before we started to write our programs so that they could be very clearly delimited with respect to space and the use of capacity.”

 It totally made sense to cut off the 19-.

On the other hand there was Bob Bemer, trying to computerize Mormon ancestry records in the late 50s and failing to distinguish between people born in 1950, 1850, or 1750, thanks to a two-digit format that was IBM’s own doing.
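To make the problem concrete, here’s a minimal Python sketch, purely an illustration rather than anything Bemer wrote: once only the last two digits are stored, 1750, 1850 and 1950 all collapse into the same value, and any code that quietly assumes a missing “19” breaks the moment the calendar rolls past 1999.

# Illustration only: what a two-digit year field throws away.
def two_digit_year(year):
    """Keep only the last two digits of the year, as space-starved records did."""
    return f"{year % 100:02d}"

print(two_digit_year(1750), two_digit_year(1850), two_digit_year(1950))  # '50' '50' '50' -- indistinguishable

# And the assumption that usually papered over the gap:
def age_in_2000(stored_year):
    birth_year = 1900 + int(stored_year)   # the hidden "19" prefix
    return 2000 - birth_year

print(age_in_2000("60"))   # 40  -- correct for someone born in 1960
print(age_in_2000("00"))   # 100 -- wrong for a newborn in 2000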

“It was the moment of awareness to me,” he told the Baltimore Sun. “I realized that using two-digit years wasn’t going to hold up.”

It was a ticking time bomb and, as one of the pioneers of business computing, he’d been complicit in building it. Now he would have to defuse it, before it was too late.

To be fair, though, he did have 40 whole years to work with. More than enough time to fix everything, with decades to spare. Right?

Conflict with the Pentagon

Not long after the Mormon incident, in the 60s, the US government recruited Bob Bemer to a committee alongside a few dozen other computer experts, and tasked them with creating standards for the entire industry.

It was as yet a nascent industry–only a few thousand or so computers existed in the entire country, which was already far more than in any other country. But this was still urgent work since, prior to this point, there were no common rules for engineers to follow. We complain sometimes about interoperability today but, in the 50s, everyone did things however they wanted. Going from one system to another must’ve been a nightmare.

Bemer was, perhaps, partially responsible for that mess. His baby, COBOL, universalized business programming, but it also had an unintended side effect. Being so user-friendly–with simple, human-readable commands–it opened up programming to, as he told the Washington Post, “any jerk.”

“I thought it would open up a tremendous source of energy,” he recalled in 1999. “It did. But what we got was arson.”

In the face of everyone’s bespoke, incompatible software and hardware, Bemer and his colleagues created things like ASCII, and other common coding standards which became utterly foundational to computing to this day. But not all of the committee’s ideas were equally successful.

One task it took on was to decide how computers should represent dates. Bemer, keenly aware of the problem posed by centuries, knew that they needed to implement a four-digit system, even if it came with significant costs.

And, by this time, he wasn’t the only one who knew that. Besides the Mormon church, there were at least a few organizations which used four-digit years for computerized recordkeeping, because they needed to account for the pre-1900s. The Veterans Administration, which represented plenty of former soldiers born before the century mark, did it. The Smithsonian needed more characters than anyone to distinguish centuries in BC and AD.

Bemer and a colleague proposed standardizing the four-digit year (BC and AD markers, they figured, could be skipped to save space).

The thing is: most of the world’s computers were being run by the US government, and most of the US government’s computers were being run by the Department of Defense, which had a different view on the matter.

“We would have had to change every stinking file,” Bill Robertson, the director of data standards for the Office of the Secretary of Defense, told the Washington Post decades later. “We would have had a revolt.”

Not only were all of DoD’s programs already using two-digit years, but they already had a system for accounting for the century. Robertson had helped develop and implement it during his time in the Air Force. Besides the day, month, and year, there was an optional element in the system for indicating which century it was. It was flexible, so that anyone could save space or provide that extra information as needed.
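The transcript doesn’t spell out the exact layout Robertson’s system used, but the idea of an optional century element might look something like the hypothetical sketch below, where the extra digits can be included when a record needs them and dropped when it doesn’t. The field widths and names here are assumptions for illustration, not the actual DoD format.

# Hypothetical sketch of an "optional century" date field -- the real DoD
# layout isn't described here, so these widths and names are assumptions.
def encode_date(year, month, day, with_century=False):
    """Pack a date as YYMMDD, or as CCYYMMDD when the century matters."""
    body = f"{year % 100:02d}{month:02d}{day:02d}"
    if with_century:
        return f"{year // 100:02d}{body}"    # 1868-11-01 -> "18681101"
    return body                              # 1968-11-01 -> "681101"

print(encode_date(1968, 11, 1))                     # "681101"   -- saves two characters
print(encode_date(1868, 11, 1, with_century=True))  # "18681101" -- keeps the century explicit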

Two Digits Formalized

Overhauling their systems would’ve been time-consuming, costly, unnecessary, and risky–since they handled top-secret files and, you know, missiles. They rejected the four-digit proposal.

And so on November 1st, 1968, the National Bureau of Standards–known today as The National Institute of Standards and Technology, or NIST for short–published its Federal Information Processing Standards Publication 4, formalizing the two-digit year under Section 4.1. The standard became a requirement for US government agencies beginning on January 1st, 1970.

(Try to remember that date–it’ll be relevant in our next episode.)

Following the government’s example, and with an eye to being compatible with its systems, most computer companies adopted the two-digit standard.

Bemer lost his fight, but he still knew he was right, and that the consequences could be dire. He just had to convince everyone else of that.

Bemer’s Crusade

He wrote to the American Medical Association and to politicians, and stormed into the offices of the Postmaster General.

In the February 1979 edition of “Interface Age,” a computer magazine, he wrote what might’ve been the earliest published story on a coming millennium bug, warning that “There are many horror stories about programs, working for years, that died on some significant change in the date.”

He even got his message to The White House — or, rather, the science advisor to the president.

“I discussed it with my staff,” the advisor told The Washington Post. But they and other agencies, quote, “wagged their head sagely and said this problem is simply not on the radar screen. [. . .] It’s 30 years in the future. We’ll be out of office. Leave it to the civil servants. They’ll still be here.”

Eventually, Bemer gave up. In 1982, at the age of 62, he retired.

14 years passed and then, one day, Bob picked up a copy of the Wall Street Journal. A headline immediately stuck out: “Businesses Make A Date to Battle Year 2000 Problem.” As he remembered, quote:

“I read that and reread it and reread it again and said, ‘My God, they didn’t take my advice.’ “

Epilogue

One last thing before we go: I want to introduce you to Perry Chen, who in 2014 began collecting books, media, and other cultural remnants left over from the Y2K scare.

“[Chen] I went into the investigation myself as a person who lived through it when I was young, but I went through it, I think, when I was kind of like in my early 20s.”

Today, Perry’s best known as the founder and former CEO of Kickstarter. For our purposes, though, he’s most relevant for his Y2K exhibition “Computers in Crisis,” for New York City’s New Museum. You can check it out at computersincrisis.com.

Among all of the silly survival books, government records, conspiracies, and events he recorded, Perry noticed a strange pattern to what happened just after the millennium turned.

 “[Chen] As I investigated this because I was so intrigued by the fact that we went through this kind of very strange and very loud, very alarmist period [. . .] I kind of want to like look back at it because it was this thing where it’s like we all decided or, or basically the people with the microphones, the press and the government decided that, like, right after it happened, like to never speak of it again.”

It was like a giant cover-up, without any need for coordination, or even stating it out loud.

“[Chen] And so I was like, this is fascinating, like, things that are 1,000th of the size of this get post mortems. And yet this thing, the biggest story of the years leading up to the millennium, literally, like it is like people want to pretend it didn’t occur.”

Y2K is really only ever remembered today as a joke. Some of the reason for that is that nobody wanted to talk about what happened after it was over.

And part of the reason is that few people actually understood what happened that day. It’s still a source of debate, in fact, how serious the millennium bug really was.

Was it way overblown, and not actually a problem to begin with, like most people now believe? Or was it only stopped thanks to the hundreds of millions of dollars, and thousands upon thousands of hours of effort invested into preventing the worst possible outcomes?

In our next episode, Bob Bemer develops a solution for how to prevent a Y2K disaster, the Americans and the Russians team up to stop nuclear launches, and much more. Civilization as we know it doesn’t collapse, but it does change in some major ways. And it also fails to account for a second Y2K-style bug–one which is now just 14 years away…