Cryptanalysis of the Lorenz cipher

Cryptanalysis of the Lorenz cipher was the process that enabled the British to read high-level German army messages during World War II. The British Government Code and Cypher School (GC&CS) at Bletchley Park decrypted many communications between the Oberkommando der Wehrmacht (OKW, German High Command) in Berlin and their army commands throughout occupied Europe, some of which were signed "Adolf Hitler, Führer". These were intercepted non-Morse radio messages that had been enciphered by the Lorenz SZ teletypewriter rotor stream cipher attachments. Decrypts of this traffic became an important source of "Ultra" intelligence.

For its high-level secret messages, the German armed services enciphered each character using various online Geheimschreiber (secret writer) stream cipher machines at both ends of a telegraph link using the 5-bit International Telegraphy Alphabet No. 2 (ITA2). These machines were the Lorenz SZ (SZ for Schlüsselzusatz, meaning "cipher attachment") machine for the army, the Siemens and Halske T52 for the air force and the Siemens T43, which was little used and never broken by the Allies.

Bletchley Park decrypts of messages enciphered with the Enigma machines revealed that the Germans called one of their wireless teleprinter transmission systems "Sägefisch" (sawfish), which led British cryptographers to refer to encrypted German teleprinter traffic as "Fish". "Tunny" was the name given to the first non-Morse link, and it was subsequently used for the Lorenz SZ machines and the traffic enciphered by them.

As with the entirely separate cryptanalysis of the Enigma, it was German operational shortcomings that allowed the initial diagnosis of the system, and a way into decryption. Unlike Enigma, no physical machine reached Allied hands until the very end of the war in Europe, long after wholesale decryption had been established. Initially, operator errors produced a number of pairs of messages sent with the same keys, giving a "depth", which often allowed manual decryption to be achieved. One long depth also allowed the complete logical structure of the machine to be worked out, a quite remarkable cryptanalytical feat on which the subsequent comprehensive decrypting of Tunny messages relied.

When depths became less frequent, decryption was achieved by a combination of manual and automated methods. The first machine to automate part of the decrypting process was called "Heath Robinson" and it was followed by several other "Robinsons". These were, however, slow and unreliable, and were supplemented by the much faster and more flexible "Colossus", the world's first electronic, programmable digital computer, ten of which were in use by the end of the war.

Albert W. Small, an American cryptographer from the US Signal Corps who was seconded to Bletchley Park and worked on Tunny, said in his December 1944 report back to Arlington Hall that: "Daily solutions of Fish messages at GC&CS reflect a background of British mathematical genius, superb engineering ability, and solid common sense. Each of these has been a necessary factor. Each could have been overemphasised or underemphasised to the detriment of the solutions; a remarkable fact is that the fusion of the elements has been apparently in perfect proportion. The result is an outstanding contribution to cryptanalytic science."

The German Tunny machines


The Lorenz SZ cipher attachments implemented a Vernam stream cipher, using a complex array of twelve wheels that delivered what should have been a cryptographically secure pseudorandom sequence as a key stream. The key stream was combined with the plaintext to produce the ciphertext at the transmitting end using the exclusive or (XOR) function. At the receiving end, an identically configured machine produced the same key stream, which was combined with the ciphertext to produce the plaintext; i.e. the system implemented a symmetric-key algorithm.

The right hand five wheels, the chi ($$\chi$$) wheels, changed the five impulses (bits) of the incoming character, advancing one position every time. The left hand five, the psi ($$\psi$$) wheels, further changed the result of the chi transform, but they did not always move on with each new character.

The central two mu ($$\mu$$) or "motor" wheels determined whether or not the psi wheels rotated with a new character. The SZ42A and SZ42B machines had a more complex arrangement for advancing the psi wheels than the original SZ40.

Each wheel had a number of cams that could be set in one of two positions. The numbers of cams on the set of wheels were co-prime with each other giving an extremely long period before the key sequence repeated. The process of working out which of the 501 cams were in the raised position was called "wheel breaking" at Bletchley Park. Deriving the start positions of the wheels for a particular message was termed "wheel setting" or simply "setting". The fact that the psi wheels all moved together, but not with every input character, was a major weakness of the machines that led to cryptanalytical success.

Secure telegraphy
Electro-mechanical telegraphy was developed in the 1830s and 1840s, well before telephony, and was in worldwide use by the time of the Second World War. An extensive system of cables was used within and between countries, with a standard voltage of −80 V indicating a "mark" and +80 V indicating a "space". Where cable transmission was impracticable or inconvenient, such as for mobile German Army units, radio transmission was used.

Teleprinters at each end of the circuit consisted of a keyboard and printing mechanism, and very often a five-hole perforated paper tape reading and punching mechanism. When used online, pressing an alphabet key at the transmitting end caused the relevant character to be printed at the receiving end. Commonly, however, the communication system involved the transmitting operator preparing a message offline by punching it onto paper tape, and then going online only for the transmission of the message recorded on the tape. Typically this would be at some ten characters per second, and so occupy the line or radio channel for a shorter time than for online typing.

The characters of the message were represented by the codes of the International Telegraphy Alphabet No. 2 (ITA2). The transmission medium, either wire or radio, used asynchronous serial communication with each character signalled by a start (space) impulse, 5 data impulses and 1½ stop (mark) impulses. At Bletchley Park mark impulses were signified by x and space impulses by •. For example, the letter "H" would be coded as ••x•x.
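This notation can be sketched in a few lines of Python; the impulse pattern for "H" is taken from the example above, and the function name is our own:

```python
# Render a 5-impulse ITA2 character in the Bletchley Park notation used
# here: a mark impulse is written "x", a space impulse "•".
def to_bp_notation(impulses):
    """impulses: five 0/1 values in transmission order, 1 = mark, 0 = space."""
    return "".join("x" if bit else "•" for bit in impulses)

# The letter "H" from the example above: space, space, mark, space, mark.
print(to_bp_notation((0, 0, 1, 0, 1)))  # → ••x•x
```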

The figure shift (FIGS) and letter shift (LETRS) characters determined how the receiving end interpreted the string of characters up to the next shift character. Because of the danger of a shift character being corrupted, some operators would type a pair of shift characters when changing from letters to numbers or vice versa. So they would type 55M88 to represent a full stop. Such doubling of characters was very helpful for the statistical cryptanalysis used at Bletchley Park. After encipherment, shift characters had no special meaning.

Unlike Morse-coded signals, a human listener could not interpret a radio telegraph message. A standard teleprinter, however, would produce the text of the message. The Lorenz cipher attachment changed the plaintext of the message into ciphertext that was uninterpretable to those without an identical machine identically set up. This was the challenge faced by the Bletchley Park codebreakers.

Interception
Intercepting Tunny transmissions presented substantial problems. As the transmitters were directional, most of the signals were quite weak at receivers in Britain. Furthermore, there were some 25 different frequencies used for these messages, and the frequency would sometimes be changed in mid-message. After the initial discovery of the non-Morse signals in 1940, a Y-station was set up on a hill at the Ivy Farm Communications Centre at Knockholt in Kent, specifically to intercept this traffic. The centre was headed by Harold Kenworthy, had 30 receiving sets and employed some 600 staff. It became fully operational early in 1943. Because a single missed or corrupted character could make decryption impossible, the greatest accuracy was required. The undulator technology used to record the impulses had originally been developed for high-speed Morse. It produced a visible record of the impulses on narrow paper tape. This was then read by people employed as "slip readers" who interpreted the peaks and troughs as the marks and spaces of ITA2 characters. A perforated paper tape was then produced for telegraphic transmission to Bletchley Park, where it was punched out again.

The Vernam cipher
This cipher uses the Boolean "exclusive or" (XOR) function, symbolised by ⊕ and verbalised as "A or B but not both". This is represented by the following "truth table", where x represents "true" and • represents "false":

 A   B   A ⊕ B
 •   •     •
 •   x     x
 x   •     x
 x   x     •

Other names for this function are: not equal (NEQ), modulo 2 addition (without "carry") and modulo 2 subtraction (without "borrow"). Note that modulo 2 addition and subtraction are identical. Some descriptions of Tunny decryption refer to addition and some to differencing, i.e. subtraction, but they mean the same thing.

A desirable feature of a machine cipher is that the same machine with the same settings can be used either for enciphering or for deciphering. The Vernam cipher achieves this reciprocity, as combining the stream of plaintext characters with the key stream produces the ciphertext, and combining the same key with the ciphertext regenerates the plaintext.

Symbolically:

 * Plaintext ⊕ Key = Ciphertext

and

 * Ciphertext ⊕ Key = Plaintext
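A minimal Python sketch of this reciprocity, with integers 0–31 standing in for 5-bit ITA2 characters (the particular values are arbitrary):

```python
# Vernam encipherment and decipherment are the same XOR operation.
def vernam(stream, key):
    return [s ^ k for s, k in zip(stream, key)]

plaintext = [0b10100, 0b00011, 0b01110]   # arbitrary 5-bit characters
key       = [0b11010, 0b00101, 0b10001]   # key stream of the same length

ciphertext = vernam(plaintext, key)           # Plaintext ⊕ Key = Ciphertext
assert vernam(ciphertext, key) == plaintext   # Ciphertext ⊕ Key = Plaintext
```

The same function serves both directions because XOR is its own inverse.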

Vernam's original idea was to use conventional telegraphy practice, with a paper tape of the plaintext combined with a paper tape of the key at the transmitting end, and an identical key tape combined with the ciphertext signal at the receiving end. Each pair of key tapes would have been unique (a one-time tape), but generating and distributing such tapes presented considerable practical difficulties. In the 1920s four men in different countries invented rotor Vernam cipher machines to produce a key stream to act instead of a key tape. Lorenz SZ40/42 was one of these.

Security features


A monoalphabetic substitution cipher such as the Caesar cipher can easily be broken, given a reasonable amount of ciphertext. This is achieved by frequency analysis of the different letters of the ciphertext, and comparing the result with the known distribution of letters in the language of the plaintext. With a polyalphabetic cipher, however, such as the Lorenz cipher, there is a different substitution alphabet for each successive character. So a frequency analysis shows an approximately uniform distribution, such as that obtained from a (pseudo) random number generator. By trying multiple putative chi-component partial key streams against the ciphertext, the Bletchley Park cryptanalysts were able to detect some of the underlying non-uniformity and so identify which partial key stream was likely to be correct.

The total number of cams on the twelve wheels of the SZ machines was 501. Each cam could either be in a raised position, in which case it contributed x to the logic of the system, or in the lowered position, in which case it generated •. The total possible number of patterns of raised cams was $$2^{501}$$, an astronomically large number. In practice, however, about half of the cams on each wheel were in the raised position. Later, the Germans realised that if the number of raised cams was not very close to half and there were runs of xs and •s, a cryptographic weakness existed. Indeed, this weakness was one of the two factors that led to the system being diagnosed.

The pattern of raised and lowered cams was changed daily on the motor wheels ($$\mu$$37 and $$\mu$$61). The psi wheel patterns were changed quarterly until October 1942 when the frequency was increased to monthly, and then to daily on 1 August 1944, when the chi wheel patterns were also changed from their original monthly frequency to daily.

The number of start positions of the wheels was 43×47×51×53×59×37×61×41×31×29×26×23, which is approximately $$1.6\times10^{19}$$, far too large a number for cryptanalysts to try an exhaustive "brute-force attack". As the numbers of positions of the wheels are co-prime with each other, this number is also the period before the key sequence repeated. Sometimes the Lorenz operators disobeyed instructions and two messages were transmitted with the same start positions, a phenomenon termed a "depth". The method by which the transmitting operator told the receiving operator the wheel settings that he had chosen for the message which he was about to transmit was termed the "indicator" at Bletchley Park.
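The figure can be checked directly; a short sketch using the wheel position counts given above:

```python
from math import gcd, prod

# Numbers of positions on the twelve SZ wheels, in the order given above.
positions = [43, 47, 51, 53, 59, 37, 61, 41, 31, 29, 26, 23]

# The counts are pairwise co-prime, so the key period is the plain product.
assert all(gcd(a, b) == 1 for i, a in enumerate(positions)
           for b in positions[i + 1:])

print(f"{prod(positions):.1e}")   # → 1.6e+19 start positions
```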

In August 1942, the stereotyped starts to the messages, which were useful to cryptanalysts, were replaced by some irrelevant text, which made identifying the true message somewhat harder. This new material was dubbed quatsch (German for "nonsense") at Bletchley Park.

During the phase of the experimental transmissions, the indicator consisted of twelve German forenames, the initial letters of which indicated the position to which the operators turned the twelve wheels. As well as showing when two messages were fully in depth, it also allowed the identification of partial depths where two indicators differed only in one or two wheel positions. From October 1942 the indicator system changed to the sending operator transmitting the unenciphered letters QEP followed by a two-digit number. This number was taken serially from a code book that had been issued to both operators and gave, for each QEP number, the settings of the twelve wheels. The books were replaced when they had been used up, but between replacements, complete depths could be identified by the re-use of a QEP number on a particular Tunny link.

Diagnosis
The first step in breaking a new cipher is to diagnose the logic of the processes of encryption and decryption. In the case of a machine cipher such as Tunny, this entailed establishing the logical structure and hence functioning of the machine. This was achieved without the benefit of seeing a machine, which only happened in 1945, shortly before the Allied victory in Europe.

During the experimental period of Tunny transmissions when the twelve-letter indicator system was in use, John Tiltman, Bletchley Park's veteran and remarkably gifted cryptanalyst, studied the Tunny ciphertexts and identified that they used a Vernam cipher.

When two messages (a and b) are transmitted with the same key, i.e. they are in depth, combining them eliminates the effect of the key. Let us call the two ciphertexts Za and Zb, the key K and the two plaintexts Pa and Pb. We then have:
 * Za ⊕ Zb = Pa ⊕ Pb

If the two plaintexts can be worked out, the key can be recovered from either ciphertext-plaintext pair e.g.:
 * Za ⊕ Pa = K or Zb ⊕ Pb = K
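These identities can be illustrated with a small sketch, again with arbitrary 5-bit values standing in for characters:

```python
def xor_stream(a, b):
    return [x ^ y for x, y in zip(a, b)]

key = [0b01101, 0b11000, 0b00111]
pa  = [0b10010, 0b00001, 0b11100]   # plaintext a
pb  = [0b01011, 0b10101, 0b00010]   # plaintext b

za = xor_stream(pa, key)            # ciphertext a
zb = xor_stream(pb, key)            # ciphertext b

# The key cancels out of the combination of the two ciphertexts:
assert xor_stream(za, zb) == xor_stream(pa, pb)
# and a recovered plaintext yields the key:
assert xor_stream(za, pa) == key
```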

On 31 August 1941, two long messages were received that had the same indicator HQIBPEXEZMUG. The first seven characters of these two ciphertexts were the same, but the second message was shorter. The first 15 characters of the two messages were as follows: John Tiltman tried various likely pieces of plaintext, i.e. "cribs", against the Za ⊕ Zb string and found that the first plaintext message started with the German word SPRUCHNUMMER (message number). In the second plaintext, the operator had used the common abbreviation NR for NUMMER. There were more abbreviations in the second message, and the punctuation sometimes differed. This allowed Tiltman to work out, over ten days, the plaintext of both messages, as a sequence of plaintext characters discovered in Pa could then be tried against Pb and vice versa. In turn, this yielded almost 4000 characters of key.

Members of the Research Section worked on this key to try to derive a mathematical description of the key generating process, but without success. Bill Tutte joined the section in October 1941 and was given the task. He had read chemistry and mathematics at Trinity College, Cambridge before being recruited to Bletchley Park. At his training course, he had been taught the Kasiski examination technique of writing out a key on squared paper with a new row after a defined number of characters that was suspected of being the frequency of repetition of the key. If this number was correct, the columns of the matrix would show more repetitions of sequences of characters than would be expected by chance alone.

Tutte knew that the Tunny indicators used 25 letters (excluding J) for 11 of the positions, but only 23 letters for the other. He therefore tried Kasiski's technique on the first two impulses of the key characters using a repetition of 25 × 23 = 575. Tutte did not observe a large number of repetitions in the columns with this period, but he did observe the phenomenon on a diagonal. He therefore tried again with 574, which showed up repeats in the columns. Recognising that the prime factors of this number are 2, 7 and 41, he tried again with a period of 41 and "got a rectangle of dots and crosses that was replete with repetitions".

It was clear, however, that the first impulse of the key was more complicated than that produced by a single wheel of 41 positions. Tutte called this component of the key $$\chi$$1 (chi). He figured that there was another component, which was XOR-ed with this, that did not always change with each new character, and that this was the product of a wheel that he called $$\psi$$1 (psi). The same applied for each of the five impulses—indicated here by subscripts. So for a single character, the key K consisted of two components:
 * K = $$\chi$$ ⊕ $$\psi$$.

For a stream of characters, the psi component of the key stream did not change with each new character and is referred to as the extended psi, symbolised by $$\psi$$ ':
 * K = $$\chi$$ ⊕ $$\psi$$ '.
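A sketch of how the extended psi arises; the wheel outputs and stepping pattern below are invented purely for illustration:

```python
def extend_psi(psi_outputs, stepped):
    """Build the extended psi stream ψ'. psi_outputs are the successive
    psi-wheel characters; stepped[i] says whether the psi wheels moved
    on before producing output character i+1. When the wheels stand
    still, the previous psi character is repeated in the key stream."""
    out = [psi_outputs[0]]
    i = 0
    for moved in stepped:
        if moved:
            i += 1
        out.append(psi_outputs[i])
    return out

# Psi characters 5, 9, 3; the wheels stand still before the 3rd and 5th outputs:
print(extend_psi([5, 9, 3], [True, False, True, False]))  # → [5, 9, 9, 3, 3]
```

The repeated characters in ψ' are exactly the non-uniformity that the cryptanalysts later exploited.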

Tutte's derivation of the $$\psi$$ component was made possible by the fact that dots were more likely than not to be followed by dots, and crosses more likely than not to be followed by crosses. This was a product of a weakness in the German key setting, which they later stopped. Once Tutte had made this breakthrough, the rest of the Research Section joined in to study the other impulses, and it was established that the five $$\psi$$ wheels all moved together under the control of two $$\mu$$ (mu or "motor") wheels.

Diagnosing the functioning of the Tunny machine in this way was a truly remarkable cryptanalytical achievement.

Turingery
In July 1942 Turing spent a few weeks in the Research Section. He had become interested in the problem of breaking Tunny from the keys that had been obtained from depths. That month, he developed a method of deriving the cam settings from a length of key. It became known as "Turingery" or "Turing's Method" (playfully dubbed "Turingismus" by Peter Ericsson, Peter Hilton and Donald Michie) and introduced the important method of "differencing", on which much of the rest of the breaking of Tunny messages in the absence of depths was based.

Differencing
The search was on for a process that would manipulate the ciphertext or key to produce a frequency distribution of characters that departed from the uniformity that the enciphering process aimed to achieve. Turing worked out that the XOR combination of the values of successive characters in a stream of ciphertext or key emphasised any departures from a uniform distribution. The resultant stream was called the difference (symbolised by the Greek letter "delta" Δ) because XOR is the same as modulo 2 subtraction. So, for a stream of characters S, the difference ΔS was obtained as follows, where underline indicates the succeeding character:


 * ΔS = S ⊕ S̲
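A minimal sketch of differencing at one, with integers standing in for 5-bit characters:

```python
# ΔS[i] = S[i] XOR S[i+1] ("differenced at one").
def delta(stream):
    return [a ^ b for a, b in zip(stream, stream[1:])]

s = [0b10100, 0b10100, 0b00011]
print(delta(s))   # → [0, 23]: a repeated character gives the null character (0)
```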

The stream S may be ciphertext Z, plaintext P, key K or either of its two components $$\chi$$ and $$\psi$$. The relationship amongst these elements still applies when they are differenced. For example, as well as:


 * K = $$\chi$$ ⊕ $$\psi$$

It is the case that:
 * ΔK = Δ$$\chi$$ ⊕ Δ$$\psi$$

Similarly for the ciphertext, plaintext and key components:


 * ΔZ = ΔP ⊕ Δ$$\chi$$ ⊕ Δ$$\psi$$

So:
 * ΔP = ΔZ ⊕ Δ$$\chi$$ ⊕ Δ$$\psi$$
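These identities follow because differencing distributes over XOR, a property that can be checked numerically (random 5-bit streams with a fixed seed):

```python
import random

def delta(stream):
    return [a ^ b for a, b in zip(stream, stream[1:])]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

rng = random.Random(0)
a = [rng.randrange(32) for _ in range(50)]
b = [rng.randrange(32) for _ in range(50)]

# Δ(A ⊕ B) = ΔA ⊕ ΔB, the property underlying ΔK = Δχ ⊕ Δψ and the rest.
assert delta(xor(a, b)) == xor(delta(a), delta(b))
```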

The reason that differencing provided a way into Tunny was that although the frequency distribution of characters in the ciphertext could not be distinguished from a random stream, the same was not true for a version of the ciphertext from which the chi element of the key had been removed. This is because, where the plaintext contained a repeated character and the psi wheels did not move on, the differenced psi character (Δ$$\psi$$) would be the null character (' / ' at Bletchley Park). When XOR-ed with any character, this character has no effect, so in these circumstances, Δ$$\chi$$ = ΔK. The ciphertext modified by the removal of the chi component of the key was called the de-chi D at Bletchley Park, and the process of removing it was known as "de-chi-ing". Removing the psi component was similarly known as "de-psi-ing" (or "deep sighing" when it was particularly difficult).

So the delta de-chi ΔD was:
 * ΔD = ΔZ ⊕ Δ$$\chi$$

Repeated characters in the plaintext were more frequent both because of the characteristics of German (EE, TT, LL and SS are relatively common), and because telegraphists frequently repeated the figures-shift and letters-shift characters as their loss in an ordinary telegraph message could lead to gibberish.

To quote the General Report on Tunny: "Turingery introduced the principle that the key differenced at one, now called ΔK, could yield information unobtainable from ordinary key. This Δ principle was to be the fundamental basis of nearly all statistical methods of wheel-breaking and setting."

As well as applying differencing to the full 5-bit characters of the ITA2 code, it was also applied to the individual impulses (bits). So, for the first impulse, which was enciphered by wheels $$\chi$$1 and $$\psi$$1, differenced at one:
 * ΔK1 = K1 ⊕ K̲1

And for the second impulse:
 * ΔK2 = K2 ⊕ K̲2

And so on.

It is also worth noting that the periodicity of the chi and psi wheels for each impulse (41 and 43 respectively for the first impulse) is reflected in its pattern of ΔK. However, given that the psi wheels did not advance for every input character as the chi wheels did, ΔK1 was not simply a repetition of a pattern every 41 × 43 = 1763 characters, but a more complex sequence.

Turing's method
Turing's method of deriving the cam settings of the wheels from a length of key obtained from a depth involved an iterative process. Given that the delta psi character was the null character ' / ' half of the time on average, an assumption that ΔK = Δ$$\chi$$ had a 50% chance of being correct. The process started by treating a particular ΔK character as being the Δ$$\chi$$ for that position. The resulting putative bit pattern of x and • for each chi wheel was recorded on a sheet of paper that contained as many columns as there were characters in the key, and five rows representing the five impulses of the Δ$$\chi$$. Given the knowledge, from Tutte's work, of the periodicity of each of the wheels, this allowed the propagation of these values at the appropriate positions in the rest of the key.

A set of five sheets, one for each of the chi wheels, was also prepared. These contained a set of columns corresponding in number to the cams for the appropriate chi wheel, and were referred to as a 'cage'. So the $$\chi$$3 cage had 29 such columns. Successive 'guesses' of Δ$$\chi$$ values then produced further putative cam state values. These might either agree or disagree with previous assumptions, and a count of agreements and disagreements was made on these sheets. Where disagreements substantially outweighed agreements, the assumption was made that the Δ$$\psi$$ character was not the null character ' / ', so the relevant assumption was discounted. Progressively, all the cam settings of the chi wheels were deduced, and from them, the psi and motor wheel cam settings.

As experience of the method developed, improvements were made that allowed it to be used with much shorter lengths of key than the original 500 or so characters.

Testery
The Testery was the section at Bletchley Park that performed the bulk of the work involved in decrypting Tunny messages. By July 1942, the volume of traffic was building up considerably. A new section was therefore set up, led by Ralph Tester—hence the name. The staff consisted mainly of ex-members of the Research Section, and included Peter Ericsson, Peter Hilton, Denis Oswald and Jerry Roberts. The Testery's methods were almost entirely manual, both before and after the introduction of automated methods in the Newmanry to supplement and speed up their work.

The first phase of the work of the Testery ran from July to October, with the predominant method of decryption being based on depths and partial depths. After ten days, however, the stereotyped opening of the messages was replaced by nonsensical quatsch, making decryption more difficult. This period was nevertheless productive, although each decryption took considerable time. Then, in September, a depth was received that allowed Turing's method of wheel breaking, "Turingery", to be used, and current traffic started to be read. Extensive data about the statistical characteristics of the language of the messages was compiled, and the collection of cribs extended.

In late October 1942 the original, experimental Tunny link was closed and two new links (Codfish and Octopus) were opened. With these and subsequent links, the 12-letter indicator system of specifying the message key was replaced by the QEP system. This meant that only full depths could be recognised—from identical QEP numbers—which led to a considerable reduction in traffic decrypted.

Once the Newmanry became operational in June 1943, the nature of the work performed in the Testery changed, with decrypts and wheel breaking no longer relying on depths.

British Tunny
The so-called "British Tunny Machine" was a device that exactly replicated the functions of the SZ40/42 machines. It was used to produce the German cleartext from the Testery's ciphertext, after the settings had been determined. The functional design was produced at Bletchley Park, and the machine was designed and built in Tommy Flowers' laboratory at the General Post Office Research Station at Dollis Hill by Gil Hayward, "Doc" Coombs, Bill Chandler and Sid Broadhurst. It was mainly built from standard British telephone exchange electro-mechanical equipment such as relays and uniselectors. Input and output was by means of a teleprinter with paper tape reading and punching. These machines were used in both the Testery and later the Newmanry; ten Testery Tunnies were in use by the end of the war. Dorothy Du Boisson, a machine operator and a member of the Women's Royal Naval Service (a "Wren"), described plugging up the settings as being like operating an old-fashioned telephone exchange, and said that she received electric shocks in the process.

When Flowers was invited by Hayward to try the first British Tunny machine at Dollis Hill by typing in the standard test phrase: "Now is the time for all good men to come to the aid of the party", he much appreciated that the rotor functions had been set up to provide the following Wordsworthian output:

Additional features were added to the British Tunnies to simplify their operation. Further refinements were made for the versions used in the Newmanry.

Newmanry
The Newmanry was a section set up under Max Newman in December 1942 to look into the possibility of assisting the work of the Testery by automating parts of the processes of decrypting Tunny messages. Newman had been working with Gerry Morgan, head of the Research Section, on ways of breaking Tunny when Bill Tutte approached them in November 1942 with the idea of what became known as the "1+2 break in". This was recognised as being feasible, but only if automated.

Newman produced a functional specification of what was to become "Heath Robinson". He recruited the Post Office Research Station at Dollis Hill, and Dr C. E. Wynn-Williams at the Telecommunications Research Establishment (TRE) at Malvern, to implement his idea. Work on the engineering design started in January 1943 and the first machine was delivered in June. The staff at that time consisted of Newman, Donald Michie, Jack Good, two engineers and 16 Wrens. By the end of the war the Newmanry contained three Robinson machines, ten Colossus computers and a number of British Tunnies. The staff comprised 26 cryptographers, 28 engineers and 275 Wrens.

The automation of these processes required the processing of large quantities of punched paper tape such as those on which the enciphered messages were received. Absolute accuracy of these tapes and their transcription was essential, as a single character in error could invalidate or corrupt a huge amount of work. Jack Good introduced the maxim "If it's not checked it's wrong".

Tutte's "1+2 break in"
The essence of this method was to find the initial settings of the chi component of the key by exhaustively trying all positions of its combination with the ciphertext, and looking for evidence of the non-uniformity that reflected the characteristics of the original plaintext. The wheel breaking process had to have successfully produced the current cam settings to allow the relevant sequence of characters of the chi wheels to be generated. It was totally impracticable to generate the 22 million characters from all five of the chi wheels, so it was initially limited to 41 × 31 = 1271 from the first two.

Given that for each of the five impulses i:
 * Zi = $$\chi$$i ⊕ $$\psi$$i ⊕ Pi

and hence
 * Pi = Zi ⊕ $$\chi$$i ⊕ $$\psi$$i

for the first two impulses:
 * (P1 ⊕ P2) = (Z1 ⊕ Z2) ⊕ ($$\chi$$1 ⊕ $$\chi$$2) ⊕ ($$\psi$$1 ⊕ $$\psi$$2)

Calculating a putative P1 ⊕ P2 in this way for each starting point of the $$\chi$$1 ⊕ $$\chi$$2 sequence would yield xs and •s with, in the long run, a greater proportion of •s when the correct starting point had been used. Tutte knew, however, that using the differenced (∆) values amplified this effect, because any repeated characters in the plaintext would always generate •, and similarly ∆$$\psi$$1 ⊕ ∆$$\psi$$2 would generate • whenever the psi wheels did not move on, and about half of the time when they did, some 70% of the time overall.

Tutte analysed a decrypted ciphertext with the differenced version of the above function:
 * (∆Z1 ⊕ ∆Z2) ⊕ (∆$$\chi$$1 ⊕ ∆$$\chi$$2) ⊕ (∆$$\psi$$1 ⊕ ∆$$\psi$$2)

and found that it generated • some 55% of the time. Given the nature of the contribution of the psi wheels, the alignment of chi-stream with the ciphertext that gave the highest count of •s from (∆Z1 ⊕ ∆Z2 ⊕ ∆$$\chi$$1 ⊕ ∆$$\chi$$2) was the one that was most likely to be correct. This technique could be applied to any pair of impulses and so provided the basis of an automated approach to obtaining the de-chi (D) of a ciphertext, from which the psi component could be removed by manual methods.
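The counting step can be illustrated with a toy simulation. Everything here is simplified and invented for illustration: single impulses are plain bit lists, tiny cam patterns stand in for the 41- and 31-cam chi wheels, and the repeat-heavy plaintext and sometimes-stationary psi contributions are both modelled as bit streams biased towards repeating their previous value (in the real machine the psi wheels for all impulses stood still together). Whether the true start pair actually tops the scores depends on the statistics of the run:

```python
import random

def delta(bits):
    return [a ^ b for a, b in zip(bits, bits[1:])]

def chi_stream(cams, start, n):
    return [cams[(start + i) % len(cams)] for i in range(n)]

def dot_count(z1, z2, c1, c2):
    """Count the • (0) positions in ΔZ1 ⊕ ΔZ2 ⊕ Δχ1 ⊕ Δχ2."""
    combined = [a ^ b ^ c ^ d for a, b, c, d in
                zip(delta(z1), delta(z2), delta(c1), delta(c2))]
    return combined.count(0)

rng = random.Random(1)
n = 500
chi1_cams = [1, 0, 1, 1, 0]        # toy stand-in for the 41-cam χ1 wheel
chi2_cams = [0, 1, 1, 0, 1, 0, 1]  # toy stand-in for the 31-cam χ2 wheel

def biased_bits(repeat_prob):
    """A bit stream that often repeats its last value, mimicking the
    repeat-heavy plaintext and the sometimes-stationary psi wheels."""
    out = [rng.randrange(2)]
    for _ in range(n - 1):
        out.append(out[-1] if rng.random() < repeat_prob else rng.randrange(2))
    return out

p1, p2 = biased_bits(0.7), biased_bits(0.7)       # plaintext impulses 1 and 2
psi1, psi2 = biased_bits(0.7), biased_bits(0.7)   # extended-psi impulses

true1, true2 = 2, 5                               # the chi settings to recover
z1 = [p ^ s ^ c for p, s, c in zip(p1, psi1, chi_stream(chi1_cams, true1, n))]
z2 = [p ^ s ^ c for p, s, c in zip(p2, psi2, chi_stream(chi2_cams, true2, n))]

# Score every possible pair of chi start positions. At the true settings the
# chi contribution cancels exactly, so the •-count is inflated by the biased
# plaintext and psi streams, while wrong settings tend towards n/2.
scores = {(s1, s2): dot_count(z1, z2, chi_stream(chi1_cams, s1, n),
                              chi_stream(chi2_cams, s2, n))
          for s1 in range(len(chi1_cams)) for s2 in range(len(chi2_cams))}
best = max(scores, key=scores.get)
print(best, scores[best])
```

At the correct pair the score equals the dot count of the differenced plaintext-plus-psi combination, which is exactly the excess of •s that the Robinsons and Colossus were built to detect.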

Robinsons
Heath Robinson was the first machine produced to automate Tutte's 1+2 method. It was given the name by the Wrens who operated it, after cartoonist William Heath Robinson, who drew immensely complicated mechanical devices for simple tasks, similar to Rube Goldberg in the USA.

The functional specification of the machine was produced by Max Newman. The main engineering design was the work of Frank Morrell at the Post Office Research Station at Dollis Hill in North London, with his colleague Tommy Flowers designing the "Combining Unit". Dr C. E. Wynn-Williams from the Telecommunications Research Establishment at Malvern produced the high-speed electronic valve and relay counters. Construction started in January 1943, the prototype machine was delivered to Bletchley Park in June and was first used to help read current encrypted traffic soon afterwards. The main parts of the machine were:
 * a tape transport and reading mechanism (dubbed the "bedstead" because of its resemblance to an upended metal bed frame) that ran the looped key and message tapes at between 1000 and 2000 characters per second;
 * a combining unit that implemented the logic of Tutte's method;
 * a counting unit that counted the number of •s, and if it exceeded a pre-set total, displayed or printed it.

The prototype machine was effective despite a number of serious shortcomings. Most of these were progressively overcome in the development of what became known as "Old Robinson". A later development was a machine called "Super Robinson".

Colossus


Tommy Flowers' experience with Heath Robinson, and his previous, unique experience of thermionic valves (vacuum tubes), led him to realize that a better machine could be produced using electronics. Instead of the key stream being read from a punched paper tape, an electronically generated key stream could allow much faster and more flexible processing. Flowers' suggestion that this could be achieved with a machine that was entirely electronic and would contain between one and two thousand valves was treated with incredulity at both the Telecommunications Research Establishment and at Bletchley Park, as it was thought that it would be "too unreliable to do useful work". He did, however, have the support of the Controller of Research at Dollis Hill, W. Gordon Radley, and he implemented these ideas, producing Colossus, the world's first electronic, digital computing machine that was at all programmable, in the remarkably short time of ten months. In this he was assisted by his colleagues at the Post Office Research Station at Dollis Hill: Sidney Broadhurst, William Chandler, Allen Coombs and Harry Fensom.

The main parts of the machine were:
 * a tape transport and reading mechanism (the "bedstead") that ran the message tape in a loop at 5000 characters per second;
 * a unit that generated the key stream electronically;
 * a combining unit that implemented the logic of Tutte's method;
 * a counting unit that counted the number of •s, and if it exceeded a pre-set total, printed it out.
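The difference from the Robinsons can be pictured in a sketch (Python; the cam patterns here are randomly generated stand-ins, not real wheel settings): an electronically generated key stream is just a set of free-running ring counters, one per chi wheel, rather than a second looped tape.

```python
import random

CHI_SIZES = [41, 31, 29, 26, 23]   # number of cams on the five chi wheels

def chi_stream(wheels, starts):
    """Yield successive 5-bit chi characters, one per sprocket-hole pulse,
    generated electronically instead of being read from a looped key tape."""
    positions = list(starts)
    while True:
        yield [w[p] for w, p in zip(wheels, positions)]
        # every wheel steps once per character
        positions = [(p + 1) % len(w) for w, p in zip(wheels, positions)]

random.seed(0)
wheels = [[random.getrandbits(1) for _ in range(n)] for n in CHI_SIZES]
gen = chi_stream(wheels, [0, 0, 0, 0, 0])
stream = [next(gen) for _ in range(42)]
assert stream[41][0] == stream[0][0]   # impulse 1 repeats with period 41
```

Because the stream is computed rather than stored, trial start positions can be changed instantly between runs, which is what made the much faster and more flexible processing possible.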

The prototype Mark 1 Colossus (Colossus I), with its 1500 valves, was shown to be working at Dollis Hill in December 1943 and was operational at Bletchley Park by February 1944. This processed the message at 5000 characters per second using the impulse from reading the tape's sprocket holes to act as the clock signal. It quickly became evident that this was a huge leap forward in cryptanalysis of Tunny. Further Colossus machines were ordered and the orders for more Robinsons cancelled.

An improved Mark 2 Colossus (Colossus II) contained 2400 valves and first worked at Bletchley Park on 1 June 1944, just in time for the D-day Normandy landings. This processed the message at an effective speed of 25,000 characters per second by the use of circuitry invented by Flowers that would now be called a shift register. Donald Michie worked out a method of using Colossus to assist in wheel breaking as well as for wheel setting. This was then implemented in special hardware on later Colossi.
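Colossus II's actual circuitry is not reproduced here, but the effect of the shift register can be illustrated with a simplified sketch: one pass over the differenced message scores five adjacent trial alignments at once, with a five-stage register holding the next five differenced key bits and each tap feeding its own counter.

```python
from collections import deque

def five_counts_one_pass(dz, dkey):
    """Score five adjacent key alignments in a single pass over the
    differenced message stream dz.  A 5-stage shift register holds the
    next five differenced key bits; tap j feeds counter j, which tests
    the trial start position offset by j.  dkey must supply at least
    len(dz) + 5 bits (trivial when the key is generated electronically)."""
    dkey = iter(dkey)
    reg = deque((next(dkey) for _ in range(5)), maxlen=5)
    counts = [0] * 5
    for bit in dz:
        for j, kbit in enumerate(reg):
            counts[j] += bit == kbit     # a • is a match at tap j
        reg.append(next(dkey))           # shift the register one place on
    return counts

# Sanity check against five separate passes, one per alignment:
import random
random.seed(2)
dz = [random.getrandbits(1) for _ in range(200)]
key = [random.getrandbits(1) for _ in range(205)]
counts = five_counts_one_pass(dz, key)
assert counts == [sum(dz[i] == key[i + j] for i in range(200))
                  for j in range(5)]
```

The five-fold parallelism, multiplied by the 5,000 characters per second at which the tape could safely be read, gives the effective 25,000 characters per second.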

A total of ten Colossus computers were in use by the end of the war.

Special machines
As well as the commercially produced teleprinters and re-perforators, a number of other machines were built to assist in the preparation and checking of tapes in the Newmanry and Testery. The approximate complement as at May 1945 was as follows.

Steps in Wheel Setting
Working out the start position of the chi ($$\chi$$) wheels required first that their cam settings had been determined by "wheel breaking". Initially, this was achieved by two messages having been sent in depth.

The number of start positions for the first two wheels, $$\chi$$1 and $$\chi$$2, was 41×31 = 1271. The first step was to try all of these start positions against the message tape. This was Tutte's "1+2 break in", which involved computing (∆Z1 ⊕ ∆Z2 ⊕ ∆$$\chi$$1 ⊕ ∆$$\chi$$2), which gives a putative (∆D1 ⊕ ∆D2), and counting the number of times this gave •. Both Heath Robinson, which was developed into what became known as "Old Robinson", and Colossus were designed to automate this process. Statistical theory allowed the derivation of measures of how far any count was from the random situation expected with an incorrect starting point for the chi wheels. For this step, the measure of deviation from randomness was called sigma. Starting points that gave a count of less than 2.5 × sigma, named the "set total", were not printed out. In the ideal case there was a single large value for sigma that identified the start positions of $$\chi$$1 and $$\chi$$2. An example of the output from such a run on a Mark 2 Colossus with its five counters a, b, c, d and e is given below.
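This step can be sketched in software (Python; the wheel patterns, start positions and message model are invented for illustration, with ∆D1 ⊕ ∆D2 set to • 55% of the time as in Tutte's estimate, and this is not the Colossus output referred to above, which was produced by special-purpose hardware):

```python
import math
import random

random.seed(1)
N = 6000                               # message characters examined
L1, L2 = 41, 31                        # chi-1 and chi-2 wheel sizes
wheel1 = [random.getrandbits(1) for _ in range(L1)]
wheel2 = [random.getrandbits(1) for _ in range(L2)]
true1, true2 = 7, 19                   # hypothetical true start positions

# Model ∆D1 ⊕ ∆D2 as • (0) about 55% of the time, then integrate it
# back into a D1 ⊕ D2 stream.
dd = [0 if random.random() < 0.55 else 1 for _ in range(N)]
d12 = [0]
for bit in dd[:-1]:
    d12.append(d12[-1] ^ bit)

# Observed Z1 ⊕ Z2 = (D1 ⊕ D2) ⊕ chi1 ⊕ chi2, then difference it.
z12 = [d12[i] ^ wheel1[(true1 + i) % L1] ^ wheel2[(true2 + i) % L2]
       for i in range(N)]
dz12 = [a ^ b for a, b in zip(z12, z12[1:])]

# Differenced wheel patterns (each wheel is a ring, so ∆ wraps round).
dw1 = [wheel1[i] ^ wheel1[(i + 1) % L1] for i in range(L1)]
dw2 = [wheel2[i] ^ wheel2[(i + 1) % L2] for i in range(L2)]

def dots(s1, s2):
    # •-count of ∆Z1 ⊕ ∆Z2 ⊕ ∆chi1 ⊕ ∆chi2 at trial starts (s1, s2).
    return sum(1 for i, dz in enumerate(dz12)
               if dz == dw1[(s1 + i) % L1] ^ dw2[(s2 + i) % L2])

# Set total: random-case mean plus 2.5 sigma (count ~ Binomial(n, 1/2),
# so sigma = sqrt(n)/2); lower counts were not printed out.
n = len(dz12)
set_total = n / 2 + 2.5 * math.sqrt(n) / 2

scores = {(s1, s2): dots(s1, s2)
          for s1 in range(L1) for s2 in range(L2)}     # all 1271 trials
printed = {p: c for p, c in scores.items() if c > set_total}
best = max(scores, key=scores.get)   # expected to be the true alignment
```

Because roughly one incorrect trial in 160 exceeds a 2.5-sigma threshold by chance, a run typically printed a handful of candidates besides the true one, which is why a single clearly dominant value was the ideal rather than the rule.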

Having identified possible $$\chi$$1 $$\chi$$2 start positions, the next step was to try to find the start positions for the other chi wheels. In the example given above, there is a single setting of $$\chi$$1 = 36 and $$\chi$$2 = 21 whose sigma value makes it stand out from the rest. This was not always the case and there were many different actions that could be taken. Small enumerates 36 different further runs that might be done. At first the choice was made by the cryptanalyst sitting at the typewriter output, and calling out instructions to the Wren operators. Max Newman devised a decision tree and then set Jack Good and Donald Michie the task of devising others. These were used by the Wrens without recourse to the cryptanalysts if certain criteria were met.

In one of Small's examples, the next run was with the first two chi wheels set to the start positions found, and three separate parallel explorations of the remaining three chi wheels.

Once the probable start positions for the chi wheels had been derived, they had to be verified before the de-chi ( D ) message was passed to the Testery. This involved performing a count of the frequency of the characters in ∆D. Small describes the check of the frequency count of the ∆D characters as being the "acid test", and that practically every cryptanalyst and Wren in the Newmanry and Testery knew the contents of the following table by heart.
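The check can be sketched as follows (Python; the expected-frequency table itself is not reproduced here, so the comparison against its contents is left out):

```python
from collections import Counter

def delta_chars(chars):
    # ∆ of a character stream: successive 5-bit ITA2 characters XORed.
    return [a ^ b for a, b in zip(chars, chars[1:])]

def delta_d_frequencies(de_chi):
    """Percentage frequency of each of the 32 possible ∆D characters
    in a putative de-chi stream."""
    counts = Counter(delta_chars(de_chi))
    n = len(de_chi) - 1
    return {c: 100 * counts[c] / n for c in range(32)}
```

With correct chi settings the ∆D distribution is markedly non-uniform, because it reflects the statistics of plaintext and the extended psi stream; with wrong settings every character sits near the flat 100/32 ≈ 3.1%.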

If the derived start points of the chi wheels passed this test, the de-chi-ed message was passed to the Testery where manual methods were used to derive the psi and motor settings. As Small remarked, the work in the Newmanry took a great amount of statistical science, whereas that in the Testery took much knowledge of language and was of great interest as an art. Jerry Roberts makes the point that this Testery work was a greater load on staff than the automated processes in the Newmanry.