A friend and colleague introduced me to a 94-year-old gentleman with a rare tale to tell. John McCallister was recruited during World War II to be a US Army liaison officer at “Station X,” the UK’s highly secret codebreaking operation at Bletchley Park. Station X collected intercepted German radio messages, all encrypted with the supposedly unbreakable Enigma cipher, and broke the encryption. The resulting data was distributed to a handful of senior UK and US military commanders.
At first, McCallister worked at Bletchley and learned about the codebreaking operation. He met Alan Turing, now recognized as a giant in computer science. Turing developed codebreaking machines at Bletchley, including the “bombe” (left). Then McCallister prepared for his own role: to handle and distribute the highly secret information to senior US military commanders.
Following the war, McCallister left the crypto world. After college and reserve service during the Korean War, he applied his mathematical skills to business accounting at General Electric and Zenith Electronics. He retired in 1984.
Continue reading A Yank at Bletchley Park
CNET recently published a list of cables to keep and cables to discard. I like to keep things for historical interest as well as for practical reasons. Historical examples allow me to show students different ways of doing the same thing. The picture on the left illustrates “serial vs parallel,” and I use a similar image in my textbook. I don’t collect ancient types of wire for investment purposes: the values don’t justify it.
You need to decide why you want to keep cables, and keep the cables accordingly. Like most Web journalism, CNET largely ignored that question. Here are some reasons:
- I have equipment that uses a particular cable.
- I’ll probably buy equipment that uses a particular cable.
Let’s look at those reasons and consider CNET’s recommendations.
Continue reading Which cables to keep, which to discard?
I’ve been looking at the evolution of electronic funds transfer (EFT) and payment systems recently. My research uncovered a gem: about two years ago, David Stearns completed a dissertation that looks at the early evolution of the Visa card (originally the “BankAmericard”) in the context of other evolving electronic payment systems. Stearns’ work is both readable and filled with interesting information.
Continue reading Pragmatic Security: the history of the Visa card
(circa 1970-85, maybe later)
The Forth programming system was developed in the late 1960s by Chuck Moore. It provided a very powerful, text-based mechanism for controlling a computer and writing programs when RAM and hard drive space were extremely tight. Early implementations were routinely restricted to 8KB of RAM. Some early implementations relied exclusively on diskette drives that stored less than half a megabyte of data.
Starting in the 1970s, typical Forth systems treated hard drives as consisting of a linear set of numbered blocks, each 1KB in size. The first block on the drive (block 0) contained the bootstrap program to get Forth started, and a small number of subsequent blocks might also contain binary executable code that was loaded into RAM when Forth started.
Following the blocks of executable code, the remaining hard drive blocks generally contained ASCII text and were referred to by number. If a programmer needed to modify part of a Forth program, he would edit the hard drive block that contained that program, and refer to the block by its number.
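This block-numbered view of the drive is simple enough to sketch in a few lines of Python. The class and method names below are my own illustration; standard Forth exposed the same idea through words such as BLOCK, UPDATE, and FLUSH:

```python
BLOCK_SIZE = 1024  # Forth's traditional 1KB block

class BlockDevice:
    """In-memory stand-in for a drive seen as a linear array of numbered 1KB blocks."""

    def __init__(self, num_blocks):
        self.blocks = bytearray(num_blocks * BLOCK_SIZE)

    def read_block(self, n):
        # Roughly what Forth's BLOCK word did: fetch block n into a buffer.
        off = n * BLOCK_SIZE
        return bytes(self.blocks[off:off + BLOCK_SIZE])

    def write_block(self, n, data):
        # Roughly UPDATE + FLUSH: write an edited buffer back to block n.
        # Short data is padded with zeros; long data is truncated to 1KB.
        data = data.ljust(BLOCK_SIZE, b"\x00")[:BLOCK_SIZE]
        off = n * BLOCK_SIZE
        self.blocks[off:off + BLOCK_SIZE] = data

dev = BlockDevice(num_blocks=64)
dev.write_block(5, b": SQUARE DUP * ;")  # "edit block 5" -- by number, no file name
print(dev.read_block(5)[:16])            # prints b': SQUARE DUP * ;'
```

Note that there is no directory, no file name, and no length field anywhere: the programmer’s memory of “the squaring word lives in block 5” is the entire namespace.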
Here is an assessment of Forth’s file system in the context of the eight concepts noted above:
- File storage – each file was essentially of a fixed, 1KB size. Other measures had to be used to link multiple blocks together into longer sequences of data.
- Locating files – files were referred to by block number, and the number had to be remembered by the programmers.
- Free space management – programmers had to remember what blocks had not been used, or which blocks contained obsolete text that could be erased so the block could be reused.
- Easy to implement – Yes, yes, yes!
- Speed – Very fast at the hardware level, since no hard drive searching had to take place.
- Direct vs. sequential – supported both.
- Storage sizes – no built-in limit, but this obviously became impractical as drives increased dramatically in size.
- Robustness – there is little on-disk structure to destroy, so robustness becomes a social issue having to do with the reliability of the people using the system: will they forget, or resign, or otherwise be unavailable? Can others fill in the gaps left by missing people? Will the hard drive get too big for humans to keep track of its contents without a more conventional file system?
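The first point above – linking multiple 1KB blocks into longer sequences – was left to whatever convention the programmer invented. Here is one illustrative scheme (my own sketch, not a historical Forth practice): reserve the first two bytes of each block as a “next block” number, with block 0 – already reserved for the bootstrap – doubling as an end-of-chain marker:

```python
BLOCK_SIZE = 1024
HEADER = 2                       # bytes reserved for the next-block number
PAYLOAD = BLOCK_SIZE - HEADER

disk = {}  # block number -> 1KB of bytes (stand-in for the drive)

def write_block(n, data):
    disk[n] = data.ljust(BLOCK_SIZE, b"\x00")[:BLOCK_SIZE]

def read_block(n):
    return disk.get(n, bytes(BLOCK_SIZE))

def store_chain(free_blocks, data):
    """Split data across blocks taken from free_blocks; return the first block number."""
    chunks = [data[i:i + PAYLOAD] for i in range(0, len(data), PAYLOAD)] or [b""]
    used = [free_blocks.pop(0) for _ in chunks]
    for i, (blk, chunk) in enumerate(zip(used, chunks)):
        nxt = used[i + 1] if i + 1 < len(used) else 0  # 0 = end of chain
        write_block(blk, nxt.to_bytes(2, "big") + chunk)
    return used[0]

def load_chain(first):
    """Follow next-pointers from block `first`, concatenating the payloads."""
    out, blk = b"", first
    while True:
        raw = read_block(blk)
        nxt = int.from_bytes(raw[:HEADER], "big")
        out += raw[HEADER:]
        if nxt == 0:
            # Caller must strip the zero padding of the last block, so this
            # simple scheme cannot store data that itself ends in zero bytes.
            return out
        blk = nxt
```

Notice that the free list handed to store_chain is exactly the “free space management” burden from the list above: nothing on the disk records which blocks are available, so the programmer must carry that knowledge around.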