Well, I was programming professionally from 1969, and we designed a number of systems with two-digit years (as well as maintaining existing ones). Our motivation was simple: we needed to save space.
This early work was done on an IBM 1401. Our shop had disk drives: removable-pack, 5-platter drives which, if I'm not misremembering, stored one million characters of information. One pack had to store the entire data for a "system" (such as alumni, student information, or admissions). The college had about 1700 students, a roughly similar number of applicants each year, and rather more alumni than that. Update transactions were keypunched onto 80-column cards and then submitted in batches, so the compactness of the data representation on the cards was also valuable to us.
I will say that we saw the problem coming and tried to code to handle it; we had a breakpoint below which dates were considered to be in the previous century, and we tried to centralize that number within each program rather than hard-coding it at every place it was used. We also tried to code date comparisons, for example, so that they took this into account. I'm sure we missed a lot, though; there was no systematic concept of testing in that shop at that time.
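For readers who haven't seen the technique, here's a rough sketch in C (not the original 1401 code) of that breakpoint, or "pivot year", idea. The pivot value of 30, the function names, and the 19xx/20xx mapping are assumptions for the example only, not the original shop's scheme:

 #include <stdio.h>

 /* Illustrative pivot-year ("breakpoint") window for two-digit years.
  * The value 30 and the 19xx/20xx mapping are assumptions for this
  * example, not the original shop's actual breakpoint. */
 #define YEAR_PIVOT 30

 /* Expand a two-digit year to four digits, keeping the breakpoint in
  * one place instead of hard-coding it at every use. */
 int expand_year(int yy)
 {
     return (yy >= YEAR_PIVOT) ? 1900 + yy : 2000 + yy;
 }

 /* Comparisons go through the expanded form, so 02 sorts after 98. */
 int compare_years(int a, int b)
 {
     return expand_year(a) - expand_year(b);
 }

 int main(void)
 {
     printf("%d %d\n", expand_year(98), expand_year(2));  /* 1998 2002 */
     printf("%d\n", compare_years(2, 98) > 0);            /* 1 */
     return 0;
 }

The point of centralizing the pivot is that when it eventually has to change, it changes in one place per program rather than everywhere a date is touched.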
[This is my first contribution, not sure yet if I'll be sticking around but there are some neat conversations going on. Also not sure if it's necessary or appropriate to explicitly sign this contribution; if it's inappropriate I'm sure the housekeepers will take care of the problem. I'm David Dyer-Bennet, dd-b@dd-b.net.]
BertrandMeyer has argued that the problem was not really that people used 2-digit fields for years; in 1969, given the cost of storage, this was a reasonable engineering decision. However, the real problem was that the resulting systems often weren't flexible enough to cope with change. In a well-refactored system, going from two digits to four digits should be reasonably easy.
However, I guess most of this stuff was done in CobolLanguage, and I really don't know how easy it is in Cobol to have an AbstractDataType with readers, writers, accessors, etc. (something like the sketch below).
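For what it's worth, here's a minimal sketch of Meyer's point, in C rather than COBOL since that's easier to show here: if calling code touches the year only through a reader and a writer, widening the stored field from two digits to four stays a local change. The names and the windowing rule are made up for the example:

 #include <stdio.h>

 /* Hypothetical date "abstract data type": callers use only the
  * accessors below, so the stored representation can change from a
  * two-digit to a four-digit year without touching calling code. */
 struct record_date {
     int year;   /* stored internally as a full four-digit year */
     int month;
     int day;
 };

 /* Writer: accepts a two-digit or four-digit year; the windowing rule
  * lives here and nowhere else. */
 void date_set_year(struct record_date *d, int y)
 {
     if (y < 100)
         d->year = (y >= 30) ? 1900 + y : 2000 + y;
     else
         d->year = y;
 }

 /* Reader: callers always see a four-digit year. */
 int date_get_year(const struct record_date *d)
 {
     return d->year;
 }

 int main(void)
 {
     struct record_date d = { 0, 6, 15 };
     date_set_year(&d, 69);
     printf("%d\n", date_get_year(&d));   /* 1969 */
     return 0;
 }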
COBOL barely has types, never mind abstract ones.
On CobolLanguage it's argued that the main reason behind such decisions was the expensive and limited hardware and processing power of the time, not a lack of abstraction. Computers were just frippen expensive back then; younglings tend to take computing power for granted. Only around 1973 did the labor costs of software development and maintenance start to overshadow hardware costs. Until then, every byte, card column, and CPU cycle was treated like a chunk of precious metal.
Whilst working for a very large British international corporation I was required to make a few Visual Basic applications Y2K compliant. These apps were written in VB5 in 1997, not 1970. The answer 'We couldn't afford the memory' is total BullS**t; it may have been true in the '60s and '70s, but more recently it's just the fault of very bad programmers with no understanding of software engineering. -- BryanDollery
And RealProgrammers understand software engineering. Riiiight. One of those SoftwareLies.
No, RealProgrammers understand programming; experienced software engineers with accounting and business knowledge understand SoftwareEngineering.