Cryptanalysis of the Lorenz cipher

Timeline of key events
Time Event
September 1939 War breaks out in Europe.
Second half of 1940 First non-Morse transmissions intercepted.
June 1941 First experimental SZ40 Tunny link started with alphabetic indicator.
August 1941 Two long messages in depth yielded 3700 characters of key.
January 1942
  • Tunny diagnosed from key.
  • August 1941 traffic read.
July 1942
  • Turingery method of wheel breaking.
  • Testery established.
  • First reading of up-to-date traffic.
October 1942
  • Experimental link closed.
  • First two of eventual 26 links started with QEP indicator system.
November 1942 The "1+2 break in" invented by Bill Tutte.
February 1943 More complex SZ42A introduced.
May 1943 Heath Robinson delivered.
June 1943 Newmanry founded.
December 1943 Colossus I working at Dollis Hill prior to delivery to Bletchley Park.
February 1944 First use of Colossus I for a real job.
March 1944 Four Colossi (Mark 2) ordered.
April 1944 Order for further Colossi increased to 12.
June 1944 Colossus II (Mark 2) working, in time for the D-Day Normandy landings.
August 1944 Cam settings on all Lorenz wheels changed daily.
May 1945 War in Europe ends (VE Day).

Cryptanalysis of the Lorenz cipher was the process that enabled the British to read high-level German army messages during World War II. The British Government Code and Cypher School (GC&CS) at Bletchley Park decrypted many communications between the Oberkommando der Wehrmacht (OKW, German High Command) in Berlin and their army commands throughout occupied Europe, some of which were signed "Adolf Hitler, Führer". These were intercepted non-Morse radio transmissions that had been enciphered by the Lorenz SZ teleprinter rotor stream cipher attachments. Decrypts of this traffic became an important source of "Ultra" intelligence, which contributed significantly to Allied victory.

For its high-level secret messages, the German armed services enciphered each character using various online Geheimschreiber (secret writer) stream cipher machines at both ends of a telegraph link using the 5-bit International Telegraphy Alphabet No. 2 (ITA2). These machines were subsequently discovered to be the Lorenz SZ (SZ for Schlüssel-Zusatz, meaning "cipher attachment") for the army, the Siemens and Halske T52 for the air force and the Siemens T43, which was little used and never broken by the Allies.

Bletchley Park decrypts of messages enciphered with the Enigma machines revealed that the Germans called one of their wireless teleprinter transmission systems "Sägefisch" (sawfish), which led British cryptographers to refer to encrypted German radiotelegraphic traffic as "Fish". "Tunny" (tunafish) was the name given to the first non-Morse link, and it was subsequently used for the cipher machines and their traffic.

As with the entirely separate cryptanalysis of the Enigma, it was German operational shortcomings that allowed the initial diagnosis of the system, and a way into decryption. Unlike Enigma, no physical machine reached allied hands until the very end of the war in Europe, long after wholesale decryption had been established. The problems of decrypting Tunny messages led to the development of "Colossus", the world's first electronic, programmable digital computer, ten of which were in use by the end of the war, by which time some 90% of selected Tunny messages were being decrypted at Bletchley Park.

Albert W. Small, a cryptanalyst from the US Army Signal Corps who was seconded to Bletchley Park and worked on Tunny, said in his December 1944 report back to Arlington Hall that:

Daily solutions of Fish messages at GC&CS reflect a background of British mathematical genius, superb engineering ability, and solid common sense. Each of these has been a necessary factor. Each could have been overemphasised or underemphasised to the detriment of the solutions; a remarkable fact is that the fusion of the elements has been apparently in perfect proportion. The result is an outstanding contribution to cryptanalytic science.

German Tunny machines

The Lorenz SZ machines had 12 wheels, each with a different number of cams (or "pins").

OKW/Chi wheel name:     A    B    C    D    E    F    G    H    I    J    K    L
BP wheel name:          ψ1   ψ2   ψ3   ψ4   ψ5   μ37  μ61  χ1   χ2   χ3   χ4   χ5
Number of cams (pins):  43   47   51   53   59   37   61   41   31   29   26   23

The Lorenz SZ cipher attachments implemented a Vernam stream cipher, using a complex array of twelve wheels that delivered what should have been a cryptographically secure pseudorandom sequence as a key stream. The key stream was combined with the plaintext to produce the ciphertext at the transmitting end using the exclusive or (XOR) function. At the receiving end, an identically configured machine produced the same key stream, which was combined with the ciphertext to produce the plaintext; i.e. the system implemented a symmetric-key algorithm.
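The reciprocity of this scheme can be seen in a minimal Python sketch; the key here is illustrative random data standing in for the output of the twelve wheels, not a real Lorenz key stream.

import random

def vernam(chars, key):
    """XOR a sequence of 5-bit characters with a key stream."""
    return [c ^ k for c, k in zip(chars, key)]

random.seed(42)
plaintext = [0b01010, 0b10001, 0b00101]        # arbitrary 5-bit ITA2 values
key = [random.getrandbits(5) for _ in plaintext]
ciphertext = vernam(plaintext, key)
assert vernam(ciphertext, key) == plaintext    # the same operation deciphers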

The key stream was generated by ten of the twelve wheels: it was the XOR of the 5-bit character generated by the right-hand five wheels, the chi (χ) wheels, and that generated by the left-hand five, the psi (ψ) wheels. The chi wheels always moved on one position for every incoming ciphertext character, but the psi wheels did not.

Cams on wheels 9 and 10 showing their raised (active) and lowered (inactive) positions. An active cam reversed the value of a bit (x ↔ •).

The central two mu (μ) or "motor" wheels determined whether or not the psi wheels rotated with a new character. After each letter was enciphered, either all five psi wheels moved on, or they remained still and the same letter of psi-key was used again. Like the chi wheels, the μ61 wheel moved on after each character. When μ61 had its cam in the active position and so generated x (before moving), μ37 moved on once; when the cam was in the inactive position (before moving), μ37 and the psi wheels stayed still. On all but the earliest machines, an additional factor played into whether or not the psi wheels moved on. These factors were of four different types and were called "Limitations" at Bletchley Park. All involved some aspect of the previous positions of the machine's wheels.

The numbers of cams on the set of twelve wheels of the SZ42 machines totalled 501 and were co-prime with each other, giving an extremely long period before the key sequence repeated. Each cam could either be in a raised position, in which case it contributed x to the logic of the system, reversing the value of a bit, or in the lowered position, in which case it generated •. The total possible number of patterns of raised cams was 2⁵⁰¹, an astronomically large number. In practice, however, about half of the cams on each wheel were in the raised position. Later, the Germans realized that if the number of raised cams was not very close to 50% there would be runs of xs and •s, a cryptographic weakness.

The process of working out which of the 501 cams were in the raised position was called "wheel breaking" at Bletchley Park. Deriving the start positions of the wheels for a particular transmission was termed "wheel setting" or simply "setting". The fact that the psi wheels all moved together, but not with every input character, was a major weakness of the machines that contributed to British cryptanalytical success.
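The stepping logic described above can be sketched in Python. This is a simplified model under stated assumptions: the cam patterns are random stand-ins rather than historical settings, and the later "Limitations" are omitted.

import random

CHI_SIZES = [41, 31, 29, 26, 23]
PSI_SIZES = [43, 47, 51, 53, 59]

random.seed(1)
chi  = [[random.randint(0, 1) for _ in range(n)] for n in CHI_SIZES]
psi  = [[random.randint(0, 1) for _ in range(n)] for n in PSI_SIZES]
mu61 = [random.randint(0, 1) for _ in range(61)]
mu37 = [random.randint(0, 1) for _ in range(37)]

def keystream(length):
    c_pos, p_pos = [0] * 5, [0] * 5
    m61_pos = m37_pos = 0
    for _ in range(length):
        # key character = chi XOR psi, impulse by impulse
        k = 0
        for i in range(5):
            k = (k << 1) | (chi[i][c_pos[i]] ^ psi[i][p_pos[i]])
        yield k
        # the chi wheels and mu61 step with every character ...
        move_37  = mu61[m61_pos] == 1      # active cam on mu61 (before moving)
        move_psi = mu37[m37_pos] == 1      # active cam on mu37 (before moving)
        c_pos = [(p + 1) % n for p, n in zip(c_pos, CHI_SIZES)]
        m61_pos = (m61_pos + 1) % 61
        # ... mu37 steps only when mu61's cam was active, and all five
        # psi wheels step together only when mu37's cam was active
        if move_37:
            m37_pos = (m37_pos + 1) % 37
        if move_psi:
            p_pos = [(p + 1) % n for p, n in zip(p_pos, PSI_SIZES)]

print([format(k, '05b') for k in keystream(8)])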

A Lorenz SZ42 cipher machine with its covers removed at The National Museum of Computing on Bletchley Park

Secure telegraphy

Electro-mechanical telegraphy was developed in the 1830s and 1840s, well before telephony, and operated worldwide by the time of the Second World War. An extensive system of cables linked sites within and between countries, with a standard voltage of −80 V indicating a "mark" and +80 V indicating a "space". Where cable transmission became impracticable or inconvenient, such as for mobile German Army Units, radio transmission was used.

Teleprinters at each end of the circuit consisted of a keyboard and a printing mechanism, and very often a five-hole perforated paper-tape reading and punching mechanism. When used online, pressing an alphabet key at the transmitting end caused the relevant character to print at the receiving end. Commonly, however, the communication system involved the transmitting operator preparing a set of messages offline by punching them onto paper tape, and then going online only for the transmission of the messages recorded on the tape. The system would typically send some ten characters per second, and so occupy the line or the radio channel for a shorter period of time than for online typing.

The characters of the message were represented by the codes of the International Telegraphy Alphabet No. 2 (ITA2). The transmission medium, either wire or radio, used asynchronous serial communication with each character signalled by a start (space) impulse, 5 data impulses and 1½ stop (mark) impulses. At Bletchley Park mark impulses were signified by x ("cross") and space impulses by • ("dot"). For example, the letter "H" would be coded as ••x•x.
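In Python, the Bletchley Park cross-and-dot rendering of a 5-bit code can be sketched as follows (code values as in the table below, most significant impulse first):

def bp_notation(code):
    """Render a 5-bit ITA2 value in BP's cross/dot notation, MSB first."""
    return ''.join('x' if (code >> i) & 1 else '•' for i in range(4, -1, -1))

assert bp_notation(0b00101) == '••x•x'   # the letter "H", as in the text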

Binary teleprinter code (ITA2) as used at Bletchley Park, arranged in reflection order whereby each row differs from its neighbours by only one bit.
Pattern of impulses (Mark = x, Space = •)  Binary  Letter shift  Figure shift  BP 'shiftless' interpretation
••.••• 00000 null null /
••.x•• 00100 space space 9
••.x•x 00101 H # H
••.••x 00001 T 5 T
••.•xx 00011 O 9 O
••.xxx 00111 M . M
••.xx• 00110 N , N
••.•x• 00010 CR CR 3
•x.•x• 01010 R 4 R
•x.xx• 01110 C : C
•x.xxx 01111 V ; V
•x.•xx 01011 G & G
•x.••x 01001 L ) L
•x.x•x 01101 P 0 P
•x.x•• 01100 I 8 I
•x.••• 01000 LF LF 4
xx.••• 11000 A - A
xx.x•• 11100 U 7 U
xx.x•x 11101 Q 1 Q
xx.••x 11001 W 2 W
xx.•xx 11011 FIGS FIGS + or 5
xx.xxx 11111 LTRS LTRS - or 8
xx.xx• 11110 K ( K
xx.•x• 11010 J Bell J
x•.•x• 10010 D WRU D
x•.xx• 10110 F ! F
x•.xxx 10111 X / X
x•.•xx 10011 B ? B
x•.••x 10001 Z " Z
x•.x•x 10101 Y 6 Y
x•.x•• 10100 S ' S
x•.••• 10000 E 3 E

The figure shift (FIGS) and letter shift (LTRS) characters determined how the receiving end interpreted the string of characters up to the next shift character. Because of the danger of a shift character being corrupted, some operators would type a pair of shift characters when changing from letters to numbers or vice versa, so they would type 55M88 to represent a full stop. Such doubling of characters was very helpful for the statistical cryptanalysis used at Bletchley Park. After encipherment, shift characters had no special meaning.

The speed of transmission of a radio-telegraph message was three or four times that of Morse code, and a human listener could not interpret it. A standard teleprinter, however, would produce the text of the message. The Lorenz cipher attachment changed the plaintext of the message into ciphertext that was uninterpretable to those without an identical machine identically set up. This was the challenge faced by the Bletchley Park codebreakers.

Interception

Intercepting Tunny transmissions presented substantial problems. As the transmitters were directional, most of the signals were quite weak at receivers in Britain. Furthermore, there were some 25 different frequencies used for these transmissions, and the frequency would sometimes be changed part way through. After the initial discovery of the non-Morse signals in 1940, a radio intercept station called the Foreign Office Research and Development Establishment was set up on a hill at Ivy Farm at Knockholt in Kent, specifically to intercept this traffic. The centre was headed by Harold Kenworthy, had 30 receiving sets and employed some 600 staff. It became fully operational early in 1943.

A length of tape, 12 millimetres (0.47 in) wide, produced by an undulator similar to those used during the Second World War for intercepted 'Tunny' wireless telegraphic traffic at Knockholt, for translation into ITA2 characters to be sent to Bletchley Park

Because a single missed or corrupted character could make decryption impossible, the greatest accuracy was required. The undulator technology used to record the impulses had originally been developed for high-speed Morse. It produced a visible record of the impulses on narrow paper tape. This was then read by people employed as "slip readers" who interpreted the peaks and troughs as the marks and spaces of ITA2 characters. Perforated paper tape was then produced for telegraphic transmission to Bletchley Park, where it was punched out.

The Vernam cipher

The Vernam cipher implemented by the Lorenz SZ machines utilizes the Boolean "exclusive or" (XOR) function, symbolised by ⊕ and verbalised as "A or B but not both". This is represented by the following truth table, where x represents "true" and • represents "false".

INPUT           OUTPUT
A       B       A ⊕ B
•       •       •
•       x       x
x       •       x
x       x       •

Other names for this function are: exclusive disjunction, not equal (NEQ), modulo 2 addition (without "carry") and modulo 2 subtraction (without "borrow"). Modulo 2 addition and subtraction are identical; some descriptions of Tunny decryption refer to addition and some to differencing, i.e. subtraction, but they mean the same thing. The XOR operator is both associative and commutative.

Reciprocity is a desirable feature of a machine cipher, so that the same machine with the same settings can be used either for enciphering or for deciphering. The Vernam cipher achieves this: combining the stream of plaintext characters with the key stream produces the ciphertext, and combining the same key with the ciphertext regenerates the plaintext, because (P ⊕ K) ⊕ K = P ⊕ (K ⊕ K) = P.

Symbolically:

Plaintext ⊕ Key = Ciphertext

and

Ciphertext ⊕ Key = Plaintext

Vernam's original idea was to use conventional telegraphy practice, with a paper tape of the plaintext combined with a paper tape of the key at the transmitting end, and an identical key tape combined with the ciphertext signal at the receiving end. Each pair of key tapes would have been unique (a one-time tape), but generating and distributing such tapes presented considerable practical difficulties. In the 1920s four men in different countries invented rotor Vernam cipher machines to produce a key stream to act instead of a key tape. The Lorenz SZ40/42 was one of these.

Security features

A typical distribution of letters in English language text. Inadequate encipherment may not sufficiently mask the non-uniform nature of the distribution. This property was exploited in cryptanalysis of the Lorenz cipher by weakening part of the key.

A monoalphabetic substitution cipher such as the Caesar cipher can easily be broken, given a reasonable amount of ciphertext. This is achieved by frequency analysis of the different letters of the ciphertext, and comparing the result with the known letter frequency distribution of the plaintext.

With a polyalphabetic cipher, there is a different substitution alphabet for each successive character. So a frequency analysis shows an approximately uniform distribution, such as that obtained from a (pseudo) random number generator. However, because one set of Lorenz wheels turned with every character while the other did not, the machine did not disguise the pattern in the use of adjacent characters in the German plaintext. Alan Turing discovered this weakness and invented the differencing technique described below to exploit it.

The pattern of which cams were in the raised position and which in the lowered position was changed daily on the motor wheels (μ37 and μ61). The chi wheel cam patterns were initially changed monthly. The psi wheel patterns were changed quarterly until October 1942, when the frequency was increased to monthly, and then to daily on 1 August 1944, when the frequency of changing the chi wheel patterns was also made daily.

The number of start positions of the wheels was 43×47×51×53×59×37×61×41×31×29×26×23, which is approximately 1.6×10¹⁹ (16 billion billion), far too large a number for cryptanalysts to attempt an exhaustive "brute-force attack". Sometimes the Lorenz operators disobeyed instructions and two messages were transmitted with the same start positions, a phenomenon termed a "depth". The method by which the transmitting operator told the receiving operator the wheel settings that he had chosen for the message he was about to transmit was termed the "indicator" at Bletchley Park.
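The arithmetic in this paragraph is easy to check; a small Python sketch confirms the product and the pairwise co-primality of the wheel sizes.

from math import gcd, prod
from itertools import combinations

sizes = [43, 47, 51, 53, 59, 37, 61, 41, 31, 29, 26, 23]
assert all(gcd(a, b) == 1 for a, b in combinations(sizes, 2))
print(f"{prod(sizes):.2e}")   # ~1.60e+19 start positions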

In August 1942, the formulaic starts to the messages, which were useful to cryptanalysts, were replaced by some irrelevant text, which made identifying the true message somewhat harder. This new material was dubbed quatsch (German for "nonsense") at Bletchley Park.

During the phase of the experimental transmissions, the indicator consisted of twelve German forenames, the initial letters of which indicated the position to which the operators turned the twelve wheels. As well as showing when two transmissions were fully in depth, it also allowed the identification of partial depths where two indicators differed only in one or two wheel positions. From October 1942 the indicator system changed to the sending operator transmitting the unenciphered letters QEP followed by a two digit number. This number was taken serially from a code book that had been issued to both operators and gave, for each QEP number, the settings of the twelve wheels. The books were replaced when they had been used up, but between replacements, complete depths could be identified by the re-use of a QEP number on a particular Tunny link.

Diagnosis

Notation
Letters can represent character streams, individual 5-bit characters or, if subscripted, individual bits of characters.
P    plaintext
K    key – the sequence of characters XOR-ed (added) to the plaintext to give the ciphertext
χ    chi component of key
ψ    psi component of key
ψ′   extended psi – the actual sequence of characters added by the psi wheels, including those when they do not advance
Z    ciphertext
D    de-chi – the ciphertext with the chi component of the key removed
Δ    any of the above XOR-ed with its successor character or bit
⊕    the XOR operation

The first step in breaking a new cipher is to diagnose the logic of the processes of encryption and decryption. In the case of a machine cipher such as Tunny, this entailed establishing the logical structure, and hence the functioning, of the machine. This was achieved without the benefit of seeing a machine, which only happened in 1945, shortly before the allied victory in Europe. The enciphering system was very good at ensuring that the ciphertext Z contained no statistical, periodic or linguistic characteristics to distinguish it from random. However, this did not apply to K, χ, ψ′ and D, which was the weakness that meant that Tunny keys could be solved.

During the experimental period of Tunny transmissions when the twelve-letter indicator system was in use, John Tiltman, Bletchley Park's veteran and remarkably gifted cryptanalyst, studied the Tunny ciphertexts and identified that they used a Vernam cipher.

When two transmissions (a and b) use the same key, i.e. they are in depth, combining them eliminates the effect of the key. Let us call the two ciphertexts Za and Zb, the key K and the two plaintexts Pa and Pb. We then have:

Za ⊕ Zb = Pa ⊕ Pb

If the two plaintexts can be worked out, the key can be recovered from either ciphertext-plaintext pair e.g.:

Za ⊕ Pa = K or
Zb ⊕ Pb = K

On 31 August 1941, two long messages were received that had the same indicator HQIBPEXEZMUG. The first seven characters of these two ciphertexts were the same, but the second message was shorter. The first 15 characters of the two messages were as follows (in Bletchley Park interpretation):

Za JSH4N ZYZY4 GLFRG
Zb JSH4N ZYMFS /884I
Za ⊕ Zb ///// //FOU GFL3M

John Tiltman tried various likely pieces of plaintext, i.e. "cribs", against the Za ⊕ Zb string and found that the first plaintext message started with the German word SPRUCHNUMMER (message number). In the second plaintext, the operator had used the common abbreviation NR for NUMMER. There were more abbreviations in the second message, and the punctuation sometimes differed. This allowed Tiltman to work out, over ten days, the plaintext of both messages, as a sequence of plaintext characters discovered in Pa could then be tried against Pb and vice versa. In turn, this yielded almost 4000 characters of key.

Members of the Research Section worked on this key to try to derive a mathematical description of the key-generating process, but without success. Bill Tutte joined the section in October 1941 and was given the task. He had read chemistry and mathematics at Trinity College, Cambridge before being recruited to Bletchley Park. In his training course, he had been taught the Kasiski examination technique of writing out a key on squared paper, starting a new row after a defined number of characters that was suspected of being the frequency of repetition of the key. If this number was correct, the columns of the matrix would show more repetitions of sequences of characters than chance alone would produce.

Tutte thought that it was possible that, rather than using this technique on the whole letters of the key, which were likely to have a long frequency of repetition, it might be worth trying it on the sequence formed by taking only one impulse (bit) from each letter, on the grounds that "the part might be cryptographically simpler than the whole". Given that the Tunny indicators used 25 letters (excluding J) for 11 of the positions, but only 23 letters for the twelfth, he tried Kasiski's technique on the first impulse of the key characters using a repetition of 25 × 23 = 575. This did not produce a large number of repetitions in the columns, but Tutte did observe the phenomenon on a diagonal. He therefore tried again with 574, which showed up repeats in the columns. Recognising that the prime factors of this number are 2, 7 and 41, he tried again with a period of 41 and "got a rectangle of dots and crosses that was replete with repetitions".
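The effect Tutte was looking for can be sketched in Python. The key impulse here is synthetic: a period-41 wheel pattern with some noise added, which makes trial widths that are multiples of 41 show far more column agreement than neighbouring widths.

import random

random.seed(7)
wheel = [random.randint(0, 1) for _ in range(41)]           # hidden period 41
bits = [wheel[i % 41] ^ (random.random() < 0.15)            # 15% noise
        for i in range(574 * 6)]

def column_agreement(bits, width):
    """Fraction of vertically adjacent cells that agree when the stream is
    written out in rows of the given width."""
    rows = [bits[i:i + width] for i in range(0, len(bits) - width, width)]
    agree = sum(a == b for r1, r2 in zip(rows, rows[1:])
                for a, b in zip(r1, r2))
    return agree / (width * (len(rows) - 1))

for width in (40, 41, 42, 574):
    print(width, round(column_agreement(bits, width), 3))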

It was clear, however, that the sequence of first impulses was more complicated than that produced by a single wheel of 41 positions. Tutte called this component of the key χ1 (chi). He figured that there was another component, which was XOR-ed with this, that did not always change with each new character, and that this was the product of a wheel that he called ψ1 (psi). The same applied for each of the five impulses—indicated here by subscripts. So for a single character, the key K consisted of two components:

K = χ ⊕ ψ.

The actual sequence of characters added by the psi wheels, including those when they do not advance, was referred to as the extended psi, and symbolised by ψ′.

K = χ ⊕ ψ′.

Tutte's derivation of the ψ component was made possible by the fact that dots were more likely than not to be followed by dots, and crosses more likely than not to be followed by crosses. This was a product of a weakness in the German key setting, which they later eliminated. Once Tutte had made this breakthrough, the rest of the Research Section joined in to study the other impulses, and it was established that the five ψ wheels all moved together under the control of two μ (mu or "motor") wheels.

Diagnosing the functioning of the Tunny machine in this way was a truly remarkable cryptanalytical achievement, and was described, when Tutte was inducted as an Officer of the Order of Canada in October 2001, as "one of the greatest intellectual feats of World War II".

Turingery

In July 1942 Alan Turing spent a few weeks in the Research Section. He had become interested in the problem of breaking Tunny from the keys that had been obtained from depths. In July, he developed a method of deriving the cam settings ("wheel breaking") from a length of key. It became known as "Turingery" (playfully dubbed "Turingismus" by Peter Ericsson, Peter Hilton and Donald Michie) and introduced the important method of "differencing", on which much of the rest of the solving of Tunny keys in the absence of depths was based.

Differencing

The search was on for a process that would manipulate the ciphertext or key to produce a frequency distribution of characters that departed from the uniformity that the enciphering process aimed to achieve. Turing worked out that the XOR combination of the values of successive (adjacent) characters in a stream of ciphertext or key emphasised any departures from a uniform distribution. The resultant stream was called the difference (symbolised by the Greek letter "delta" Δ) because XOR is the same as modulo 2 subtraction. So, for a stream of characters S, the difference ΔS was obtained as follows, where S′ indicates the succeeding character:

ΔS = S ⊕ S′

The stream S may be ciphertext Z, plaintext P, key K or either of its two components χ and ψ. The relationship amongst these elements still applies when they are differenced. For example, as well as:

K = χ ⊕ ψ

It is the case that:

ΔK = Δχ ⊕ Δψ

Similarly for the ciphertext, plaintext and key components:

ΔZ = ΔP ⊕ Δχ ⊕ Δψ

So:

ΔP = ΔZ ⊕ Δχ ⊕ Δψ

The reason that differencing provided a way into Tunny was that, although the frequency distribution of characters in the ciphertext could not be distinguished from a random stream, the same was not true for a version of the ciphertext from which the chi element of the key had been removed. This is because, where the plaintext contained a repeated character and the psi wheels did not move on, the differenced psi character (Δψ) would be the null character ('/' at Bletchley Park). When XOR-ed with any character, this character has no effect, so in these circumstances, ΔK = Δχ. The ciphertext modified by the removal of the chi component of the key was called the de-chi (D) at Bletchley Park, and the process of removing it was known as "de-chi-ing". Similarly, the removal of the psi component was known as "de-psi-ing" (or "deep sighing" when it was particularly difficult).

So the delta de-chi ΔD was:

ΔD = ΔZ ⊕ Δχ
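A minimal Python sketch of differencing, showing the identity ΔD = ΔZ ⊕ Δχ on arbitrary values:

def delta(stream):
    """Difference a stream of 5-bit characters at one."""
    return [a ^ b for a, b in zip(stream, stream[1:])]

Z   = [0b10111, 0b10111, 0b00011, 0b01001]   # arbitrary ciphertext characters
chi = [0b01010, 0b11001, 0b00111, 0b10101]   # arbitrary chi key characters

D = [z ^ c for z, c in zip(Z, chi)]          # the de-chi
assert delta(D) == [a ^ b for a, b in zip(delta(Z), delta(chi))]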

Repeated characters in the plaintext were more frequent both because of the characteristics of German (EE, TT, LL and SS are relatively common), and because telegraphists frequently repeated the figures-shift and letters-shift characters as their loss in an ordinary telegraph transmission could lead to gibberish.

To quote the General Report on Tunny:

Turingery introduced the principle that the key differenced at one, now called ΔK, could yield information unobtainable from ordinary key. This Δ principle was to be the fundamental basis of nearly all statistical methods of wheel-breaking and setting.

Differencing was applied to each of the impulses of the ITA2-coded characters. So, for the first impulse, which was enciphered by wheels χ1 and ψ1, differenced at one:

ΔK1 = K1 ⊕ K1′

And for the second impulse:

ΔK2 = K2 ⊕ K2′

And so on.

The periodicity of the chi and psi wheels for each impulse (41 and 43 respectively for the first impulse) is also reflected in the pattern of ΔK. However, given that the psi wheels did not advance for every input character, as did the chi wheels, it was not simply a repetition of the pattern every 41 × 43 = 1763 characters for ΔK1, but a more complex sequence.

Turing's method

Turing's method of deriving the cam settings of the wheels from a length of key obtained from a depth involved an iterative process. Given that the delta psi character was the null character '/' half of the time on average, an assumption that ΔK = Δχ had a 50% chance of being correct. The process started by treating a particular ΔK character as being the Δχ for that position. The resulting putative bit pattern of x and • for each chi wheel was recorded on a sheet of paper that contained as many columns as there were characters in the key, and five rows representing the five impulses of the Δχ. Given the knowledge from Tutte's work of the periodicity of each of the wheels, this allowed the propagation of these values at the appropriate positions in the rest of the key.

A set of five sheets, one for each of the chi wheels, was also prepared. These contained a set of columns corresponding in number to the cams for the appropriate chi wheel, and were referred to as a 'cage'. So the χ3 cage had 29 such columns. Successive 'guesses' of Δχ values then produced further putative cam state values. These might either agree or disagree with previous assumptions, and a count of agreements and disagreements was made on these sheets. Where disagreements substantially outweighed agreements, the assumption was made that the Δψ character was not the null character '/', so the relevant assumption was discounted. Progressively, all the cam settings of the chi wheels were deduced, and from them, the psi and motor wheel cam settings.

As experience of the method developed, improvements were made that allowed it to be used with much shorter lengths of key than the original 500 or so characters.
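The tallying idea can be sketched in Python. This is only an illustration of the vote-counting, not Turing's full procedure: guesses about Δχ bits for one impulse are accumulated, cam by cam, at positions that recur with the wheel's period (41 for χ1), and the margin of crosses over dots (or vice versa) indicates how well supported each cam value is.

from collections import defaultdict

def tally(guesses, wheel_size):
    """guesses: (key position, guessed delta-chi bit) pairs for one impulse."""
    votes = defaultdict(lambda: [0, 0])   # cam index -> [dot votes, cross votes]
    for pos, bit in guesses:
        votes[pos % wheel_size][bit] += 1
    return {cam: ('x' if crosses > dots else '•', abs(crosses - dots))
            for cam, (dots, crosses) in votes.items()}

# Positions 0, 41 and 82 all fall on cam 0 of a 41-cam wheel:
print(tally([(0, 1), (41, 1), (82, 0), (1, 0), (42, 0)], 41))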

Testery

The Testery was the section at Bletchley Park that performed the bulk of the work involved in decrypting Tunny messages. By July 1942, the volume of traffic was building up considerably. A new section was therefore set up, led by Ralph Tester—hence the name. The staff consisted mainly of ex-members of the Research Section, and included Peter Ericsson, Peter Hilton, Denis Oswald and Jerry Roberts. The Testery's methods were almost entirely manual, both before and after the introduction of automated methods in the Newmanry to supplement and speed up their work.

The first phase of the work of the Testery ran from July to October, with the predominant method of decryption being based on depths and partial depths. After ten days, however, the formulaic start of the messages was replaced by nonsensical quatsch, making decryption more difficult. This period was productive nonetheless, even though each decryption took considerable time. Finally, in September, a depth was received that allowed Turing's method of wheel breaking, "Turingery", to be used, leading to the ability to start reading current traffic. Extensive data about the statistical characteristics of the language of the messages was compiled, and the collection of cribs extended.

In late October 1942 the original, experimental Tunny link was closed and two new links (Codfish and Octopus) were opened. With these and subsequent links, the 12-letter indicator system of specifying the message key was replaced by the QEP system. This meant that only full depths could be recognised—from identical QEP numbers—which led to a considerable reduction in traffic decrypted.

Once the Newmanry became operational in June 1943, the nature of the work performed in the Testery changed, with decrypts and wheel breaking no longer relying on depths.

British Tunny

A rebuilt British Tunny at the National Museum of Computing, Bletchley Park. It emulated the functions of the Lorenz SZ40/42, producing printed cleartext from ciphertext input.

The so-called "British Tunny Machine" was a device that exactly replicated the functions of the SZ40/42 machines. It was used to produce the German cleartext from a ciphertext tape, after the cam settings had been determined. The functional design was produced at Bletchley Park where ten Testery Tunnies were in use by the end of the war. It was designed and built in Tommy Flowers' laboratory at the General Post Office Research Station at Dollis Hill by Gil Hayward, "Doc" Coombs, Bill Chandler and Sid Broadhurst. It was mainly built from standard British telephone exchange electro-mechanical equipment such as relays and uniselectors. Input and output was by means of a teleprinter with paper tape reading and punching. These machines were used in both the Testery and later the Newmanry. Dorothy Du Boisson who was a machine operator and a member of the Women's Royal Naval Service (Wren), described plugging up the settings as being like operating an old fashioned telephone exchange and that she received electric shocks in the process.

When Flowers was invited by Hayward to try the first British Tunny machine at Dollis Hill by typing in the standard test phrase: "Now is the time for all good men to come to the aid of the party", he much appreciated that the rotor functions had been set up to provide the following Wordsworthian output:

Input NOW IS THE TIME FOR ALL GOOD MEN TO COME TO THE AID OF THE PARTY
Output I WANDERED LONELY AS A CLOUD THAT FLOATS ON HIGH OER VALES AND H

Additional features were added to the British Tunnies to simplify their operation. Further refinements were made for the versions used in the Newmanry, the third Tunny being equipped to produce de-chi tapes.

Newmanry

The Newmanry was a section set up under Max Newman in December 1942 to look into the possibility of assisting the work of the Testery by automating parts of the processes of decrypting Tunny messages. Newman had been working with Gerry Morgan, head of the Research Section, on ways of breaking Tunny when Bill Tutte approached them in November 1942 with the idea of what became known as the "1+2 break in". This was recognised as being feasible, but only if automated.

Newman produced a functional specification of what was to become the "Heath Robinson" machine. He recruited the Post Office Research Station at Dollis Hill, and Dr C.E. Wynn-Williams at the Telecommunications Research Establishment (TRE) at Malvern to implement his idea. Work on the engineering design started in January 1943 and the first machine was delivered in June. The staff at that time consisted of Newman, Donald Michie, Jack Good, two engineers and 16 Wrens. By the end of the war the Newmanry contained three Robinson machines, ten Colossus Computers and a number of British Tunnies. The staff were 26 cryptographers, 28 engineers and 275 Wrens.

The automation of these processes required the processing of large quantities of punched paper tape such as those on which the enciphered messages were received. Absolute accuracy of these tapes and their transcription was essential, as a single character in error could invalidate or corrupt a huge amount of work. Jack Good introduced the maxim "If it's not checked it's wrong".

The "1+2 break in"

W. T. Tutte developed a way of exploiting the non-uniformity of bigrams (adjacent letters) in the German plaintext using the differenced ciphertext and key components. His method was called the "1+2 break in", or "double-delta attack". The essence of this method was to find the initial settings of the chi component of the key by exhaustively trying all positions of its combination with the ciphertext, and looking for evidence of the non-uniformity that reflected the characteristics of the original plaintext. The wheel-breaking process had to have successfully produced the current cam settings to allow the relevant sequence of characters of the chi wheels to be generated. It was totally impracticable to generate the 22 million characters from all five of the chi wheels, so it was initially limited to 41 × 31 = 1271 from the first two.

Given that for each of the five impulses i:

Zi = χi ⊕ ψi ⊕ Pi

and hence

Pi = Zi ⊕ χi ⊕ ψi

for the first two impulses:

(P1 ⊕ P2) = (Z1 ⊕ Z2) ⊕ (χ1 ⊕ χ2) ⊕ (ψ1 ⊕ ψ2)

Calculating a putative P1 ⊕ P2 in this way for each starting point of the χ1 ⊕ χ2 sequence would yield xs and •s with, in the long run, a greater proportion of •s when the correct starting point had been used. Tutte knew, however, that using the differenced (∆) values amplified this effect, because any repeated characters in the plaintext would always generate •, and similarly ∆ψ1 ⊕ ∆ψ2 would generate • whenever the psi wheels did not move on, and about half of the time when they did: some 70% overall.

Tutte analyzed a decrypted ciphertext with the differenced version of the above function:

(∆Z1 ⊕ ∆Z2) ⊕ (∆χ1 ⊕ ∆χ2) ⊕ (∆ψ1 ⊕ ∆ψ2)

and found that it generated • some 55% of the time. Given the nature of the contribution of the psi wheels, the alignment of the chi-stream with the ciphertext that gave the highest count of •s from (∆Z1 ⊕ ∆Z2 ⊕ ∆χ1 ⊕ ∆χ2) was the one most likely to be correct. This technique could be applied to any pair of impulses, and so provided the basis of an automated approach to obtaining the de-chi (D) of a ciphertext, from which the psi component could be removed by manual methods.
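A Python sketch of the scan over all 41 × 31 = 1271 start positions follows. The streams here are random, so every alignment hovers near 50% dots; on real traffic the correct alignment stood out at roughly 55%. Only the shape of the computation is being illustrated.

import random

random.seed(3)
dZ1   = [random.randint(0, 1) for _ in range(2000)]   # differenced impulse 1
dZ2   = [random.randint(0, 1) for _ in range(2000)]   # differenced impulse 2
dchi1 = [random.randint(0, 1) for _ in range(41)]     # differenced chi1 wheel
dchi2 = [random.randint(0, 1) for _ in range(31)]     # differenced chi2 wheel

def dot_count(s1, s2):
    """Dots (zeros) in dZ1 + dZ2 + dchi1 + dchi2 for one pair of starts."""
    return sum((dZ1[i] ^ dZ2[i] ^ dchi1[(i + s1) % 41] ^ dchi2[(i + s2) % 31]) == 0
               for i in range(len(dZ1)))

best = max((dot_count(s1, s2), s1, s2)
           for s1 in range(41) for s2 in range(31))
print(best)   # (highest dot count, chi1 start, chi2 start)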

Robinsons

Heath Robinson was the first machine produced to automate Tutte's 1+2 method. It was given the name by the Wrens who operated it, after cartoonist William Heath Robinson, who drew immensely complicated mechanical devices for simple tasks, similar to the American cartoonist Rube Goldberg.

The functional specification of the machine was produced by Max Newman. The main engineering design was the work of Frank Morrell at the Post Office Research Station at Dollis Hill in North London, with his colleague Tommy Flowers designing the "Combining Unit". Dr C. E. Wynn-Williams from the Telecommunications Research Establishment at Malvern produced the high-speed electronic valve and relay counters. Construction started in January 1943 and the prototype machine was in use at Bletchley Park in June.

The main parts of the machine were:

  • a tape transport and reading mechanism (dubbed the "bedstead" because of its resemblance to an upended metal bed frame) that ran the looped key and message tapes at between 1000 and 2000 characters per second;
  • a combining unit that implemented the logic of Tutte's method;
  • a counting unit that counted the number of •s and, if it exceeded a pre-set total, displayed or printed it.

The prototype machine was effective despite a number of serious shortcomings. Most of these were progressively overcome in the development of what became known as "Old Robinson".

Colossus

A Mark 2 Colossus computer. The Wren operators are (left to right) Dorothy Du Boisson and Elsie Booker. The slanted control panel on the left was used to set the pin patterns on the Lorenz. The "bedstead" paper tape transport is on the right.
 
In 1994, a team led by Tony Sale (right) began a reconstruction of a Mark 2 Colossus at Bletchley Park. Here, in 2006, Sale and Phil Hayes supervise the solving of an enciphered message with the completed machine.

Tommy Flowers' experience with Heath Robinson, and his previous, unique experience of thermionic valves (vacuum tubes), led him to realize that a better machine could be produced using electronics. Instead of the key stream being read from a punched paper tape, an electronically generated key stream could allow much faster and more flexible processing. Flowers' suggestion that this could be achieved with a machine that was entirely electronic and would contain between one and two thousand valves was treated with incredulity at both the Telecommunications Research Establishment and at Bletchley Park, as it was thought that it would be "too unreliable to do useful work". He did, however, have the support of the Controller of Research at Dollis Hill, W Gordon Radley, and he implemented these ideas, producing Colossus, the world's first electronic digital computing machine with any degree of programmability, in the remarkably short time of ten months. In this he was assisted by his colleagues at the Post Office Research Station Dollis Hill: Sidney Broadhurst, William Chandler, Allen Coombs and Harry Fensom.

The prototype Mark 1 Colossus (Colossus I), with its 1500 valves, became operational at Dollis Hill in December 1943 and was in use at Bletchley Park by February 1944. This processed the message at 5000 characters per second using the impulse from reading the tape's sprocket holes to act as the clock signal. It quickly became evident that this was a huge leap forward in cryptanalysis of Tunny. Further Colossus machines were ordered and the orders for more Robinsons cancelled. An improved Mark 2 Colossus (Colossus II) contained 2400 valves and first worked at Bletchley Park on 1 June 1944, just in time for the D-day Normandy landings.

The main parts of this machine were:

  • a tape transport and reading mechanism (the "bedstead") that ran the message tape in a loop at 5000 characters per second;
  • a unit that generated the key stream electronically;
  • five parallel processing units that could be programmed to perform a large range of Boolean operations;
  • five counting units that each counted the number of •s or xs and, if it exceeded a pre-set total, printed it out.

The five parallel processing units allowed Tutte's "1+2 break in" and other functions to be run at an effective speed of 25,000 characters per second by the use of circuitry invented by Flowers that would now be called a shift register. Donald Michie worked out a method of using Colossus to assist in wheel breaking as well as for wheel setting. This was then implemented in special hardware on later Colossi.

A total of ten Colossus computers were in use and an eleventh was being commissioned at the end of the war in Europe (VE-Day).

Special machines

As well as the commercially produced teleprinters and re-perforators, a number of other machines were built to assist in the preparation and checking of tapes in the Newmanry and Testery. The approximate complement as of May 1945 was as follows.

Machines used in deciphering Tunny as of May 1945 (name, number in use, and function):

Super Robinson (2): used for crib runs in which two tapes were compared in all positions. Contained some valves.
Colossus Mk.2 (10): counted a condition involving a message tape and an electronically generated key character stream imitating the various Tunny wheels in different relative positions ("stepping"). Contained some 2,400 valves.
Dragons (2): used for setting short cribs by "crib-dragging" (hence the name).
Aquarius (1): a machine under development at the war's end for the "go-backs" of the SZ42B, which stored the contents of the message tape in a large bank of capacitors that acted as an electronic memory.
Proteus: a machine for utilising depths that was under construction at the war's end but was not completed.
Decoding machines (13): translated from ciphertext typed in to plaintext printed out. Some of the later ones were speeded up with the use of a few valves, and a number of modified machines were produced for the Newmanry.
Tunnies (3): see British Tunny above.
Miles (3): a set of increasingly complex machines (A, B, C, D) that read two or more tapes and combined them in a variety of ways to produce an output tape.
Garbo (3): similar to Junior, but with a delta'ing facility – used for rectangling.
Juniors (4): for printing tapes via a plug panel to change characters as necessary; used to print de-chis.
Insert machines (2): similar to Angel, but with a device for making corrections by hand.
Angels (4): copied tapes.
Hand perforators (2): generated tape from a keyboard.
Hand counters (6): measured text length.
Stickers, hot (3): Bostik and benzene were used for sticking tapes to make a loop; the tape to be stuck was inserted between two electrically heated plates and the benzene evaporated.
Stickers, cold (6): stuck tapes without heating.

Steps in wheel setting

Working out the start position of the chi (χ) wheels required first that their cam settings had been determined by "wheel breaking". Initially, this was achieved by two messages having been sent in depth.

The number of start positions for the first two wheels, χ1 and χ2, was 41×31 = 1271. The first step was to try all of these start positions against the message tape. This was Tutte's "1+2 break in", which involved computing (∆Z1 ⊕ ∆Z2 ⊕ ∆χ1 ⊕ ∆χ2)—which gives a putative (∆D1 ⊕ ∆D2)—and counting the number of times this gave •. Incorrect starting positions would, on average, give a dot count of 50% of the message length. On average, the dot count for a correct starting point would be 54%, but there was inevitably a considerable spread of values around these averages.

Both Heath Robinson, which was developed into what became known as "Old Robinson", and Colossus were designed to automate this process. Statistical theory allowed the derivation of measures of how far any count was from the 50% expected with an incorrect starting point for the chi wheels. This measure of deviation from randomness was called sigma. Starting points that gave a count of less than 2.5 × sigma, named the "set total", were not printed out. The ideal for a run to set χ1 and χ2 was that a single pair of trial values produced one outstanding value for sigma thus identifying the start positions of the first two chi wheels. An example of the output from such a run on a Mark 2 Colossus with its five counters: a, b, c, d and e, is given below.
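One reading of that set-total arithmetic, sketched in Python with an assumed, illustrative message length: an incorrect setting gives a dot count that is approximately binomial, with mean N/2 and standard deviation √N/2.

from math import sqrt

N = 9800                        # assumed message length, for illustration
mean = N / 2                    # expected dots for a wrong setting
sigma = sqrt(N) / 2             # binomial standard deviation at p = 0.5
set_total = mean + 2.5 * sigma  # counts below this are not printed
print(f"mean {mean:.0f}, sigma {sigma:.1f}, set total {set_total:.0f}")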

Output table abridged from Small's "The Special Fish Report". The set total threshold was 4912.
χ1 χ2 Counter Count Operator's notes on the output
06 11 a 4921
06 13 a 4948
02 16 e 4977
05 18 b 4926
02 20 e 4954
05 22 b 4914
03 25 d 4925
02 26 e 5015 ← 4.6 σ
19 26 c 4928
25 19 b 4930
25 21 b 5038 ← 5.1 σ
29 18 c 4946
36 13 a 4955
35 18 b 4926
36 21 a 5384 ← 12.2 σ ch χ1 χ2  ! !
36 25 a 4965
36 29 a 5013
38 08 d 4933

With an average-sized message, this would take about eight minutes. However, by utilising the parallelism of the Mark 2 Colossus, the number of times the message had to be read could be reduced by a factor of five, from 1271 to 255. Having identified possible χ1, χ2 start positions, the next step was to try to find the start positions for the other chi wheels. In the example given above, there is a single setting, χ1 = 36 and χ2 = 21, whose sigma value makes it stand out from the rest. This was not always the case, and Small enumerates 36 different further runs that might be tried according to the result of the χ1, χ2 run. At first the choices in this iterative process were made by the cryptanalyst sitting at the typewriter output and calling out instructions to the Wren operators. Max Newman devised a decision tree and then set Jack Good and Donald Michie the task of devising others. These were used by the Wrens without recourse to the cryptanalysts if certain criteria were met.

In Small's example above, the next run had the first two chi wheels set to the start positions found, with three separate parallel explorations of the remaining three chi wheels. Such a run was called a "short run" and took about two minutes.

Output table adapted from Small's "The Special Fish Report". The set total threshold was 2728.

χ1 χ2 χ3 χ4 χ5 Counter Count Operator's notes on the output
36 21 01 –  –  a 2938 ← 6.8 σ  ! χ3 !
36 21 –  01 –  b 2763
36 21 –  –  01 c 2803
36 21 –  02 –  b 2733
36 21 –  –  04 c 3003 ← 8.6 σ  ! χ5 !
36 21 06 –  –  a 2740
36 21 –  –  07 c 2750
36 21 –  09 –  b 2811
36 21 11 –  –  a 2751
36 21 –  –  12 c 2759
36 21 –  –  14 c 2733
36 21 16 –  –  a 2743
36 21 –  19 –  b 3093 ← 11.1 σ  ! χ4 !
36 21 20 –  –  a 2785
36 21 –  22 –  b 2823
36 21 24 –  –  a 2740
36 21 –  25 –  b 2796

So the probable start positions for the chi wheels are: χ1 = 36, χ2 = 21, χ3 = 01, χ4 = 19, χ5 = 04. These had to be verified before the de-chi (D) message was passed to the Testery. This involved Colossus performing a count of the frequency of the 32 characters in ΔD. Small describes the check of the frequency count of the ΔD characters as being the "acid test", and that practically every cryptanalyst and Wren in the Newmanry and Testery knew the contents of the following table by heart.

Relative frequency count of characters in ΔD.

Char. Count   Char. Count   Char. Count   Char. Count
/  1.28       R  0.92       A  0.96       D  0.89
9  1.10       C  0.90       U  1.24       F  1.00
H  1.02       V  0.94       Q  1.01       X  0.87
T  0.99       G  1.00       W  0.89       B  0.82
O  1.04       L  0.92       5  1.43       Z  0.89
M  1.00       P  0.96       8  1.12       Y  0.97
N  1.00       I  0.96       K  0.89       S  1.04
3  1.13       4  0.90       J  1.03       E  0.89

If the derived start points of the chi wheels passed this test, the de-chi-ed message was passed to the Testery where manual methods were used to derive the psi and motor settings. As Small remarked, the work in the Newmanry took a great amount of statistical science, whereas that in the Testery took much knowledge of language and was of great interest as an art. Cryptanalyst Jerry Roberts made the point that this Testery work was a greater load on staff than the automated processes in the Newmanry.
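A sketch in Python of the frequency count behind this "acid test": the observed count of each of the 32 characters is divided by the uniform expectation, for comparison against the table above. The input stream here is a placeholder for the delta of a putative de-chi.

from collections import Counter

def relative_frequencies(dd):
    """Map each 5-bit character to observed count / uniform expectation."""
    uniform = len(dd) / 32
    counts = Counter(dd)
    return {c: counts.get(c, 0) / uniform for c in range(32)}

print(relative_frequencies([0, 0, 4, 31, 0, 9])[0])   # '/' (00000) stands out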

Gravity of Mars


The gravity of Mars is a natural phenomenon, due to the law of gravity, or gravitation, by which all things with mass around the planet Mars are brought towards it. It is weaker than Earth's gravity due to the planet's smaller mass. The average gravitational acceleration on Mars is 3.72076 m/s² (about 38% of that of Earth) and it varies laterally. In general, topography-controlled isostasy drives the short-wavelength free-air gravity anomalies, while convective flow and the finite strength of the mantle lead to long-wavelength, planetary-scale free-air gravity anomalies over the entire planet. Variations in crustal thickness, magmatic and volcanic activity, impact-induced Moho uplift, seasonal variation of the polar ice caps, atmospheric mass variation and variation in the porosity of the crust can also correlate with the lateral variations.

Over the years, models consisting of an increasing but limited number of spherical harmonics have been produced. Maps produced include the free-air gravity anomaly, the Bouguer gravity anomaly, and crustal thickness. In some areas of Mars there is a correlation between gravity anomalies and topography, so, given the known topography, a higher-resolution gravity field can be inferred. Tidal deformation of Mars by the Sun or Phobos can be measured by its gravity; this reveals how stiff the interior is, and shows that the core is partially liquid. The study of the surface gravity of Mars can therefore yield information about different features and provide beneficial information for future landings.

Measurement

Rotating spherical harmonics, with degrees 0 to 4 for the vertical and orders 0 to 4 for the horizontal. The Martian C20 and C30 coefficients vary with time because of the seasonal variation of the mass of the polar ice caps through the annual sublimation-condensation cycle of carbon dioxide.

To understand the gravity of Mars, its gravitational field strength g and gravitational potential U are often measured. Simply, if Mars is assumed to be a static, perfectly spherical body of radius RM, and provided that there is only one satellite revolving around Mars in a circular orbit and that this gravitational interaction is the only force acting in the system, the equation would be

GMm/r² = mrω²,

where G is the universal constant of gravitation (commonly taken as G = 6.674 × 10⁻¹¹ m³ kg⁻¹ s⁻²),[10] M is the mass of Mars (most updated value: 6.41693 × 10²³ kg), m is the mass of the satellite, r is the distance between Mars and the satellite, and ω is the angular velocity of the satellite, which is also equivalent to 2π/T (T is the orbital period of the satellite).

Therefore, g = GM/RM² = 4π²r³/(T²RM²), where RM is the radius of Mars. With proper measurement, r, T and RM are obtainable parameters from Earth.
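These two relations can be checked numerically in Python; the mean radius used below is an assumed published value, and the result reproduces the roughly 3.7 m/s² quoted in the lead.

from math import pi

G = 6.674e-11                  # m^3 kg^-1 s^-2
M = 6.41693e23                 # kg, the value quoted above
R_M = 3.3895e6                 # m, assumed mean radius of Mars

g = G * M / R_M**2
print(f"g = {g:.2f} m/s^2")    # ~3.73 m/s^2

# GM can instead be inferred from a satellite's orbit: GM = 4 pi^2 r^3 / T^2,
# with r and T measurable from Earth.  For an illustrative 7-hour orbit:
T = 7.0 * 3600                 # s
r = (G * M * T**2 / (4 * pi**2)) ** (1 / 3)
print(f"r = {r / 1e3:.0f} km for a {T / 3600:.0f}-hour circular orbit")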

However, as Mars is a generic, non-spherical planetary body and is influenced by complex geological processes, the gravitational potential is more accurately described with spherical harmonic functions, following the convention in geodesy (see geopotential model):

U(r, λ, φ) = (GM/r) [1 + Σ (R/r)^l Plm(sin φ) (Clm cos mλ + Slm sin mλ)],

where the sum runs over degrees l ≥ 2 and orders 0 ≤ m ≤ l, and (r, λ, φ) are the spherical coordinates of the test point; λ is longitude and φ is latitude. Clm and Slm are dimensionless harmonic coefficients of degree l and order m. Plm is the Legendre polynomial of degree l when m = 0, and the associated Legendre polynomial when m > 0. These are used to describe solutions of Laplace's equation. R is the mean radius of the planet. The zonal coefficient Cl0 is sometimes written as −Jl.
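A Python sketch of evaluating this expansion at a point, using SciPy's associated Legendre function and a single toy, unnormalised J2-like coefficient (real models such as GMM-3 carry normalised coefficients to degree and order 120):

from math import sin, cos, radians
from scipy.special import lpmv          # associated Legendre P_lm

GM = 4.2827e13                          # m^3 s^-2, from the values above
R = 3.3895e6                            # m, assumed mean radius of Mars
C = {(2, 0): -1.96e-3}                  # toy, unnormalised J2-like term only
S = {}

def potential(r, lon_deg, lat_deg, l_max=2):
    """Truncated spherical harmonic gravitational potential U(r, lam, phi)."""
    lam, phi = radians(lon_deg), radians(lat_deg)
    total = 1.0
    for l in range(2, l_max + 1):
        for m in range(l + 1):
            Clm = C.get((l, m), 0.0)
            Slm = S.get((l, m), 0.0)
            total += (R / r) ** l * lpmv(m, l, sin(phi)) * (
                Clm * cos(m * lam) + Slm * sin(m * lam))
    return GM / r * total

print(potential(R + 400e3, lon_deg=0.0, lat_deg=45.0))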

  1. The lower the degree l and order m, the longer the wavelength of anomaly it represents; in turn, long-wavelength gravity anomalies are influenced by global geophysical structures.
  2. The higher the degree l and order m, the shorter the wavelength of anomaly it represents. For degrees over 50, it has been shown that those variations correlate highly with the topography. Geophysical interpretation of surface features can further help to derive a more complete picture of the Martian gravity field, though misleading results can be produced.

The oldest technique in determining the gravity of Mars is through Earth-based observation. Later with the arrival of unmanned spacecraft, subsequent gravity models were developed from radio tracking data.

Earth-based observation

Before the arrival of the Mariner 9 and Viking orbiter spacecraft at Mars, only an estimate of the Mars gravitational constant GM, i.e. the universal constant of gravitation times the mass of Mars, was available for deducing the properties of the Martian gravity field. GM could be obtained through observations of the motions of the natural satellites of Mars (Phobos and Deimos) and spacecraft flybys of Mars (Mariner 4 and Mariner 6).

Long-term Earth-based observations of the motions of Phobos and Deimos provide physical parameters including the semi-major axis, eccentricity and inclination angle to the Laplacian plane, which allow calculation of the ratio of solar mass to the mass of Mars, the moment of inertia and the coefficient of the gravitational potential of Mars, and give initial estimates of the gravity field of Mars.

Inferred from radio tracking data

Three-way Doppler, with signal transmitter and receiver separated

Precise tracking of spacecraft is of prime importance for accurate gravity modeling, as gravity models are developed from observing tiny perturbations of spacecraft, i.e. small variations in velocity and altitude. The tracking is done mainly by the antennae of the Deep Space Network (DSN), with one-way, two-way and three-way Doppler and range tracking applied. One-way tracking means the data is transmitted one way to the DSN from the spacecraft, while two-way and three-way tracking involve transmitting signals from Earth to the spacecraft (uplink), which are then transponded coherently back to Earth (downlink). The difference between two-way and three-way tracking is that the former has the signal transmitter and receiver at the same place on Earth, while the latter has them at different locations. The use of all three types of tracking data enhances the coverage and quality of the data, as one can fill in the data gaps of another.

Doppler tracking is a common technique for tracking spacecraft, utilizing the radial-velocity method, which involves detection of Doppler shifts: as the spacecraft moves away from us along the line of sight, the signal is redshifted, while for the reverse it is blueshifted. The same technique has also been applied to observing the motion of exoplanets. Range tracking, by contrast, measures the round-trip propagation time of the signal. Combining Doppler shift and range observations yields higher tracking accuracy.
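The Doppler relation behind this is simple; a Python sketch with illustrative numbers (an assumed X-band downlink frequency, not data from a real pass):

c = 299_792_458.0               # speed of light, m/s
f = 8.4e9                       # assumed X-band downlink frequency, Hz

def radial_velocity(doppler_shift_hz):
    """Radial velocity implied by an observed Doppler shift (receding > 0)."""
    return -c * doppler_shift_hz / f

print(radial_velocity(-280.0))  # ~ +10 m/s: spacecraft receding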

The tracking data are then converted to develop global gravity models using the spherical harmonic equation displayed above. However, the effects of solid tides, various relativistic effects due to the Sun, Jupiter and Saturn, and non-conservative forces (e.g. angular momentum desaturations (AMD), atmospheric drag and solar radiation pressure) have to be eliminated, otherwise considerable errors result.

History

The latest gravity model for Mars is the Goddard Mars Model 3 (GMM-3), produced in 2016, with a spherical harmonic solution up to degree and order 120. This model is developed from 16 years of radio tracking data from Mars Global Surveyor (MGS), Mars Odyssey and the Mars Reconnaissance Orbiter (MRO), as well as the MOLA topography model, and provides a global resolution of 115 km. A free-air gravity anomaly map, a Bouguer gravity anomaly map and a map of crustal thickness were produced along with this model. Compared with MRO110C and other previous models, the major improvement in the estimation of the gravity field comes from more careful modeling of the non-conservative forces applied to the spacecraft.

Gravity solutions, listing authors, year, degree and order of the spherical harmonic solution [surface resolution], and data source:

(unnamed), JP Gapcynski, RH Tolson and WH Michael Jr, 1977: degree and order 6. Tracking data of the Mariner 9, Viking 1 and 2 spacecraft.
Geoide martien, G Balmino, B Moynot and N Vales, 1982: degree and order 18 [~600 km]. Tracking data of the Mariner 9, Viking 1 and 2 spacecraft.
GMM-1, DE Smith, FJ Lerch, RS Nerem, MT Zuber, GB Patel, SK Fricke and FG Lemoine, 1993: degree and order 50 [200–300 km]. Tracking data of the Mariner 9, Viking 1 and 2 spacecraft.
Mars50c, AS Konopliv and WL Sjogren, 1995: degree and order 50. Tracking data of the Mariner 9, Viking 1 and 2 spacecraft.
GMM-2B, FG Lemoine, DE Smith, DD Rowlands, MT Zuber, GA Neumann, DS Chinn and DE Pavlis, 2001: degree and order 80. Tracking data of Mars Global Surveyor (MGS), and MOLA-derived topography data.
GGM1041C, FG Lemoine, 2001: degree and order 90. Tracking data of MGS and Mars Odyssey, and MOLA-derived topography data.
MGS95J, AS Konopliv, CF Yoder, EM Standish, DN Yuan and WL Sjogren, 2006: degree and order 95 [~112 km]. Tracking data of MGS and Mars Odyssey, and MOLA-derived topography data.
MGGM08A, JC Marty, G Balmino, J Duron, P Rosenblatt, S Le Maistre, A Rivoldini, V Dehant and T Van Hoolst, 2009: degree and order 95 [~112 km]. Tracking data of MGS and Mars Odyssey, and MOLA-derived topography data.
MRO110B2, AS Konopliv, SW Asmar, WM Folkner, Ö Karatekin, DC Nunes, SE Smrekar, CF Yoder and MT Zuber, 2011: degree and order 110. Tracking data of MGS, Mars Odyssey and the Mars Reconnaissance Orbiter (MRO), and MOLA-derived topography data.
MGM2011, C Hirt, SJ Claessens, M Kuhn and WE Featherstone, 2012: [3 km (equator) – 125 km]. Gravity solution MRO110B2, and MOLA-derived topography data.
GMM-3, A Genova, S Goossens, FG Lemoine, E Mazarico, GA Neumann, DE Smith and MT Zuber, 2016: degree and order 120 [115 km]. Tracking data of MGS (SPO-1, SPO-2, GCO, MAP), Mars Odyssey (ODYT, ODYM) and MRO (MROT, MROM).

The spacecraft-tracking techniques and the geophysical interpretation of surface features together determine the achievable resolution of the gravity field; better techniques favor spherical harmonic solutions of higher degree and order. Independent analyses of Mariner 9 and Viking orbiter tracking data yielded a spherical harmonic solution of degree and order 6. Combining the two data sets, along with the correlation of anomalies with volcanic features (positive anomalies) and deep depressions (negative anomalies) assisted by image data, allowed a solution of degree and order 18 to be produced. Further use of a spatial a priori constraint method, which took the topography into account when applying the Kaula power-law constraint, favored a model of up to degree and order 50 at global resolution (Goddard Mars Model-1, or GMM-1), followed by subsequent models of higher completeness and degree and order, up to 120 for the latest GMM-3.

Mars free-air gravity map produced along with the GMM-3 gravity solution (Red: gravity high; Blue: gravity low) (Credit: NASA's Scientific Visualization Studio)

Gravity models are therefore not produced today by directly transferring the measured gravity data into a spatial information system, because it is difficult to produce a model of sufficiently high resolution that way. Topography data obtained by the MOLA instrument aboard Mars Global Surveyor thus become a useful tool for producing a more detailed short-scale gravity model, exploiting the gravity–topography correlation at short wavelengths. However, not all regions of Mars show such correlation, notably the northern lowlands and the poles; there, misleading results could easily be produced, leading to wrong geophysical interpretation.

Later refinements of the gravity models take other non-conservative forces acting on the spacecraft into account, including atmospheric drag, solar radiation pressure, solar radiation pressure reflected from Mars, Mars thermal emission, and spacecraft thrusting that despins or desaturates the angular momentum wheels. In addition, Martian precession and third-body attraction due to the Sun, Moon and planets, which can perturb the spacecraft orbit, as well as relativistic effects on the measurements, must also be corrected. These factors offset the solution from the true gravity field; accurate modeling is thus required to eliminate the offset, and such work is still ongoing.

Static gravity field

Many researchers have outlined the correlation between short-wavelength (locally varying) free-air gravity anomalies and topography. For regions with higher correlation, free-air gravity anomalies can be expanded to higher degree strength through geophysical interpretation of surface features, so that the gravity map offers higher resolution there. The southern highlands show high gravity/topography correlation, but the northern lowlands do not. Therefore, free-air gravity anomaly models typically have higher resolution in the southern hemisphere, as fine as about 100 km.

Free-air gravity anomalies are relatively easy to measure compared with Bouguer anomalies, as long as topography data are available, because they do not require eliminating the gravitational effect of the mass surplus or deficit of the terrain after gravity is reduced to sea level. To interpret the crustal structure, however, such a gravitational effect must be eliminated, so that the reduced gravity is the result only of the core, mantle and crust below the datum; the product of this elimination is the Bouguer anomaly. The density of the material building up the terrain is the most important constraint in the calculation; it may vary laterally across the planet and is affected by the porosity and geochemistry of the rock. Relevant information can be obtained from Martian meteorites and in-situ analysis.
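As a minimal sketch of the reduction (an infinite-slab approximation with an assumed terrain density; published maps use full terrain corrections), the Bouguer anomaly follows from the free-air anomaly and the elevation:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def bouguer_anomaly(free_air_mgal: float, elevation_m: float,
                    terrain_density: float = 2900.0) -> float:
    """Infinite-slab Bouguer reduction: subtract the attraction of the
    terrain mass between the datum and the station (1 mGal = 1e-5 m/s^2)."""
    slab = 2.0 * math.pi * G * terrain_density * elevation_m  # m/s^2
    return free_air_mgal - slab / 1e-5

# Illustrative: a +300 mGal free-air anomaly on 5 km of terrain
print(bouguer_anomaly(300.0, 5000.0))  # ~ -308 mGal
```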

Local gravity anomalies

Variations of the crust-mantle boundary, intrusion, volcanism and topography can all affect the orbit of a spacecraft, owing to the higher density of mantle and volcanic material and the lower density of the crust. (Not to scale) +ve: positive anomaly; -ve: negative anomaly

Since Bouguer gravity anomalies are strongly linked to the depth of the crust-mantle boundary, a region with a positive Bouguer anomaly may have a thinner crust composed of lower-density material and be influenced more strongly by the denser mantle, and vice versa. However, the anomaly could also be contributed by differences in the density of erupted volcanic loads and sedimentary loads, as well as by subsurface intrusion and removal of material. Many of these anomalies are associated with either geological or topographic features. Few exceptions exist; one is the anomaly at 63°E, 71°N, which may represent an extensive buried structure, over 600 km across, predating the early-Noachian buried surface.

Topography anomalies

Strong correlation between topography and short-wavelength free-air gravity anomalies has been shown in studies of the gravity fields of both the Earth and the Moon, and it can be explained by the wide occurrence of isostasy. High correlation is expected above degree 50 (short-wavelength anomalies) on Mars, and it can be as high as 0.9 for degrees between 70 and 85. Such correlation can be explained by flexural compensation of topographic loads. It is noted that older regions on Mars are isostatically compensated, while younger regions are usually only partially compensated.
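The degree-by-degree correlation quoted here can be computed directly from the spherical-harmonic coefficients of the gravity and topography expansions. A sketch, assuming both coefficient sets share the same normalization and are supplied as nested arrays:

```python
import math

def degree_correlation(Cg, Sg, Ct, St, l):
    """Correlation at degree l between two harmonic fields: the cross-power
    divided by the geometric mean of the per-degree powers."""
    cross = sum(Cg[l][m] * Ct[l][m] + Sg[l][m] * St[l][m] for m in range(l + 1))
    power_g = sum(Cg[l][m] ** 2 + Sg[l][m] ** 2 for m in range(l + 1))
    power_t = sum(Ct[l][m] ** 2 + St[l][m] ** 2 for m in range(l + 1))
    return cross / math.sqrt(power_g * power_t)
```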

Anomalies from volcanic constructs

Mars Bouguer gravity map, produced along with GMM-3 gravity solution in 2016 (Red: gravity high; Blue: gravity low) (Credit: NASA's Scientific Visualization Studio)

Different volcanic constructs behave differently in terms of gravity anomalies. The volcanoes Olympus Mons and the Tharsis Montes produce the largest positive free-air gravity anomalies in the Solar System. Alba Patera, also a volcanic rise north of the Tharsis Montes, however, produces a negative Bouguer anomaly, though its extent is similar to that of Olympus Mons. For Elysium Mons, the center shows a slight increase in Bouguer anomaly within an overall broad negative-anomaly context in the Elysium rise.

Knowledge of the anomalies of volcanoes, along with the density of the volcanic material, is useful in determining the lithospheric composition and crustal evolution of different volcanic edifices. It has been suggested that the extruded lava could range from andesite (low density) to basalt (high density) and that the composition could change during the construction of the volcanic shield, which contributes to the anomaly. Another scenario is the intrusion of high-density material beneath the volcano. Such a setting has already been observed over Syrtis Major, which is inferred to have an extinct magma chamber of density 3,300 kg m−3 underlying the volcano, evident from its positive Bouguer anomaly.

Anomalies from depressions

Different depressions also behave differently in their Bouguer anomalies. Giant impact basins such as the Argyre, Isidis, Hellas and Utopia basins exhibit very strong positive Bouguer anomalies in a circular pattern. The impact origin of these basins has been debated; if they are impact basins, the positive anomalies may be due to uplift of the Moho, crustal thinning, and modification by sedimentary and volcanic surface loads after impact.

At the same time, there are some large basins that are not associated with such positive Bouguer anomalies, for example Daedalia, northern Tharsis and Elysium, which are believed to be underlain by the northern lowland plains.

In addition, certain portions of Coprates, Eos Chasma and Kasei Valles are also found to have positive Bouguer anomalies, though they are topographic depressions. This may suggest that these depressions are underlain by shallow, dense intrusive bodies.

Global gravity anomalies

Global gravity anomalies, also termed long-wavelength gravity anomalies, are the low-degree harmonics of the gravity field, which cannot be attributed to local isostasy but rather to the finite strength of the mantle and density differences in the convection current. For Mars, the largest component of the Bouguer anomaly is the degree-one harmonic, which represents the mass deficit in the southern hemisphere and the excess in the northern hemisphere. The second largest component corresponds to the planetary flattening and the Tharsis bulge.

Early study of the geoid in the 1950s and 1960s focused on the low-degree harmonics of the Earth's gravity field in order to understand its interior structure. It was suggested that such long-wavelength anomalies on Earth could be contributed by sources located in the deep mantle rather than in the crust, for example by the density differences driving the convection current, which evolves with time. The correlation between certain topographic anomalies and long-wavelength gravity anomalies, for example the Mid-Atlantic Ridge and the Carlsberg Ridge, which are topographic highs and gravity highs on the ocean floor, became the argument for the convection-current idea on Earth in the 1970s, though such correlations are weak in the global picture.

Another possible explanation for the global-scale anomalies is the finite strength of the mantle (in contrast to zero stress), which makes the gravity field deviate from hydrostatic equilibrium. Under this theory, because of the finite strength, flow may not exist in most regions, which are understressed. Variations in the density of the deep mantle could then be the result of chemical inhomogeneities associated with continent separations and of scars left on Earth after the tearing away of the Moon, cases suggested to work when slow flow is allowed to happen under certain circumstances. However, it has been argued that the theory may not be physically feasible.

Time-variable gravity field

A sublimation-condensation cycle occurs on Mars that results in carbon dioxide exchange between the cryosphere and the atmosphere. In turn, mass is exchanged between the two spheres, which gives a seasonal variation of gravity. (Courtesy NASA/JPL-Caltech)

Seasonal change of gravity field at the poles

The sublimation-condensation cycle of carbon dioxide on Mars between the atmosphere and the cryosphere (the polar ice caps) operates seasonally. This cycle is almost the only variable accounting for changes in the gravity field of Mars. The measured gravitational potential of Mars from orbiters can be generalized as the equation below, with the time dependence carried by the harmonic coefficients:

U(r, \varphi, \lambda, t) = \frac{GM}{r}\left[1 + \sum_{l=2}^{l_{\max}} \sum_{m=0}^{l} \left(\frac{R}{r}\right)^{l} \bar{P}_{lm}(\sin\varphi)\left(\bar{C}_{lm}(t)\cos m\lambda + \bar{S}_{lm}(t)\sin m\lambda\right)\right]

In turn, when there is more mass in the seasonal caps due to greater condensation of carbon dioxide from the atmosphere, the mass of the atmosphere drops: the two have an inverse relationship. The change in mass has a direct effect on the measured gravitational potential.

The seasonal mass exchange between the northern polar cap and the southern polar cap exhibits a long-wavelength gravity variation with time. Long years of continuous observation have found that determining the even zonal normalized gravity coefficient C(l=2, m=0) and the odd zonal normalized gravity coefficient C(l=3, m=0) is crucial for outlining the time-variable gravity due to such mass exchange, where l is the degree and m is the order. More commonly, they are represented in the form Clm in research papers.

If we regard the two poles as two distinct point masses, then their masses are defined as,

Data have indicated that the maximum mass variation of the southern polar cap is approximately 8.4 × 10^15 kg, occurring near the autumnal equinox, while that of the northern polar cap is approximately 6.2 × 10^15 kg, occurring between the winter solstice and the spring equinox.
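Under the point-mass idealization above, the contribution of the cap masses to the unnormalized zonal coefficients follows from P_l(±1) = (±1)^l: the even zonal coefficient tracks the sum of the two masses and the odd zonal their difference. A sketch with the cap masses quoted above (taking the caps to lie on the reference sphere for simplicity):

```python
M_MARS = 6.4171e23  # mass of Mars, kg

def delta_zonal(l: int, m_north: float, m_south: float) -> float:
    """Change in the unnormalized zonal coefficient C_{l,0} from two point
    masses on the surface at the poles: P_l(+1) = 1, P_l(-1) = (-1)**l."""
    return (m_north + (-1) ** l * m_south) / M_MARS

print(delta_zonal(2, 6.2e15, 8.4e15))  # even zonal ~ (M_N + M_S)/M ~ 2.3e-8
print(delta_zonal(3, 6.2e15, 8.4e15))  # odd zonal  ~ (M_N - M_S)/M ~ -3.4e-9
```

Note that the two cap maxima occur in different seasons, so the printed values only bound the size of the seasonal signal.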

In the long term, the mass of ice stored at the North Pole has been found to increase by (1.4 ± 0.5) × 10^11 kg, while that at the South Pole decreases by (0.8 ± 0.6) × 10^11 kg. In addition, the atmosphere shows a long-term decrease in the mass of carbon dioxide of (0.6 ± 0.6) × 10^11 kg. Owing to the uncertainties, it is unclear whether migration of material from the South Pole to the North Pole is ongoing, though such a possibility cannot be ruled out.

Tide

The two major tidal forces acting on Mars are the solar tide and the Phobos tide. The Love number k2 is an important dimensionless proportionality constant that relates the tidal field applied to a body to the multipolar moment resulting from the body's mass distribution; k2 usually characterizes the quadrupolar deformation. Finding k2 is helpful in understanding the interior structure of Mars. The most up-to-date k2, obtained by Genova's team, is 0.1697 ± 0.0009. Since a k2 smaller than 0.10 would indicate a solid core, this value tells us that at least the outer core of Mars is liquid, and the predicted core radius is 1520–1840 km.

However, the current radio tracking data from MGS, ODY and MRO do not allow the phase lag of the tides to be detected, because it is too weak; this will require more precise measurement of the perturbation of spacecraft in the future.

Geophysical implications

Crustal thickness

Histogram of percentage area against crustal thickness of Mars: 32 km and 58 km are the two major peaks of the histogram.
 
Comparison of topography, free-air gravity anomaly and crustal density map – Red: gravity high; Blue: gravity low

No direct measurement of the crustal thickness of Mars is currently available. Geochemical implications from the SNC meteorites and the orthopyroxenite meteorite ALH84001 suggest that the mean crustal thickness of Mars is 100–250 km. Viscous relaxation analysis suggests that the maximum thickness is 50–100 km; such a thickness is critical for maintaining the hemispheric crustal variations and preventing channel flow. Combined geophysical and geochemical studies suggest that the average crustal thickness could be as low as 50 ± 12 km.

Measurement of the gravity field by different orbiters allows a higher-resolution global Bouguer potential model to be produced. With local shallow density anomalies and the effect of core flattening eliminated, the residual Bouguer potential is obtained, as indicated by the following equation,

The residual Bouguer potential is contributed by the mantle. The undulation of the crust-mantle boundary, or Moho surface, with the mass of the terrain corrected for, should result in a varying residual anomaly; in turn, if an undulating boundary is observed, there should be changes in crustal thickness.
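To first order, the mapping from residual anomaly to boundary relief can be sketched with an infinite-slab approximation across the crust-mantle density contrast (the published models instead invert the full Bouguer potential; the contrast value used here is an assumption):

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def moho_relief(residual_mgal: float, density_contrast: float = 600.0) -> float:
    """First-order Moho undulation (m) from a residual Bouguer anomaly;
    a positive anomaly maps to an uplifted (shallower) crust-mantle boundary."""
    return residual_mgal * 1e-5 / (2.0 * math.pi * G * density_contrast)

# Illustrative: a +150 mGal residual over a basin center
print(moho_relief(150.0))  # ~6 km of Moho uplift
```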

Global study of residual Bouguer anomaly data indicates that the crustal thickness of Mars varies from 5.8 km to 102 km. Two major peaks, at 32 km and 58 km, are identified in an equal-area histogram of crustal thickness. These two peaks are linked to the crustal dichotomy of Mars. Almost all of the crust thicker than 60 km lies in the southern highlands, with generally uniform thickness, while the northern lowlands in general have thinner crust. The crustal thickness of the Arabia Terra region and the northern hemisphere is found to be latitude-dependent: the farther south toward Sinai Planum and Lunae Planum, the more the crust thickens.

Among all regions, Thaumasia and Claritas contain the thickest portions of crust on Mars, accounting for the histogram values above 70 km. The Hellas and Argyre basins are observed to have crust thinner than 30 km, the exceptionally thin areas of the southern hemisphere. Isidis and Utopia also show significant crustal thinning, with the center of the Isidis basin believed to have the thinnest crust on Mars.

Crust redistribution by impacting and viscous relaxation

After the initial impact, high heat flux and high water content would have favored viscous relaxation. The crust becomes more ductile, and the basin topography of the craters is subjected to greater stress due to self-gravitation, which further drives crustal flow and the decay of relief. However, this analysis may not work for giant impact craters such as the Hellas, Utopia, Argyre and Isidis basins.

Crustal thinning is believed to have taken place underneath almost all of the major impact craters. Crustal excavation, modification through the emplacement of volcanic material, and crustal flow in the weak lithosphere are the possible causes. With the pre-impact crust excavated, gravitational restoration takes place through central mantle uplift, so that the mass deficit of the cavity is compensated by the mass of the uplifted, denser material.

The giant impact basins Utopia, Hellas, Argyre and Isidis are some of the most prominent examples. Utopia, an impact basin located in the northern lowlands, is filled with light, water-deposited sedimentary material and has a slightly thickened crust at its center, potentially due to the large resurfacing process in the northern lowlands. The Hellas, Argyre and Isidis basins, by contrast, have great uplifted Moho relief and exhibit annuli of diffuse thickened crust beyond their crustal rims.

On the contrary, almost all of the Martian basins with diameters of 275 km < D < 1000 km are associated with low-amplitude surface relief and low-amplitude Moho relief. Many are even found to have negative free-air gravity anomalies, though evidence indicates that all of them should once have exhibited gravity highs (positive free-air anomalies). This cannot be explained by erosion and burial alone, as adding material to a basin would in fact increase rather than decrease the gravity strength; viscous relaxation, through the mechanism described above, must therefore have been taking place. The giant impact basins are the exceptions that have not experienced viscous relaxation, since crustal thinning made the crust too thin to sustain sub-solidus crustal flow.

Low bulk crustal density

The most recent crustal density model, RM1, developed in 2017, gives a bulk crustal density of 2582 ± 209 kg m−3 for Mars, a global average value. Lateral variation of the crustal density should exist; for example, over the volcanic complexes the local density is expected to be as high as 3231 ± 95 kg m−3, which matches the meteorite data and previous estimates. In addition, the density of the northern hemisphere is in general higher than that of the southern hemisphere, which may imply that the latter is more porous than the former.

Porosity could play an important role in achieving this bulk value. If the mineral grain density is taken to be 3100 kg m−3, a porosity of 10% to 23% could give a 200 kg m−3 drop in the bulk density. If the pore spaces are filled with water or ice, a decrease in bulk density is also expected. A further drop in bulk density could be explained by density increasing with depth, with the surface layer more porous than the deeper interior; the increase of density with depth also has geographical variation.
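The porosity effect quoted here follows from a simple two-component mixture; a sketch, with the grain density and pore fill as the assumed inputs:

```python
def bulk_density(grain_density: float, porosity: float,
                 pore_fill_density: float = 0.0) -> float:
    """Two-component mixing: pores are empty (0), ice-filled (~920 kg/m^3)
    or water-filled (~1000 kg/m^3)."""
    return (1.0 - porosity) * grain_density + porosity * pore_fill_density

print(bulk_density(3100.0, 0.17))         # ~2573 kg/m^3, near the RM1 bulk value
print(bulk_density(3100.0, 0.17, 920.0))  # ~2729 kg/m^3 with ice-filled pores
```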

Engineering and scientific applications

Areoid

The areoid is a planetary geoid that represents the gravitational and rotational equipotential figure of Mars, analogous to the concept of the geoid ("sea level") on Earth. It has been set as the reference frame for developing the MOLA Mission Experiment Gridded Data Records (MEGDRs), a global topography model. The topography model is important for mapping geomorphological features and understanding different kinds of processes on Mars.

Deriving the areoid requires two parts of work. First, since gravity data are essential for identifying the position of the center of mass of the planet, which is largely affected by the distribution of mass in the interior, radio tracking data of spacecraft are necessary; this was largely done by Mars Global Surveyor (MGS). Then, the MOLA 2 instrument aboard MGS, operating in a 400-km-altitude orbit, measured the range (distance) between the spacecraft and the ground surface by counting the round-trip time of flight of its pulses. Combining these two parts of work allows both the areoid and the MEGDRs to be constructed. Based on the above, the areoid takes as its radius the mean radius of the planet at the equator, 3396 km.
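The altimetric part reduces to a time-of-flight calculation: the range follows from the round-trip time, and the surface radius from subtracting the range from the spacecraft's radial position. A sketch with illustrative numbers (real processing also corrects for pointing, orbit and clock errors):

```python
C = 299_792_458.0  # speed of light, m/s

def laser_range(round_trip_s: float) -> float:
    """One-way range from the round-trip time of flight of a laser pulse."""
    return C * round_trip_s / 2.0

def surface_radius(spacecraft_radius_m: float, round_trip_s: float) -> float:
    """Planetary radius at the ground point, assuming nadir pointing."""
    return spacecraft_radius_m - laser_range(round_trip_s)

# A ~400 km orbit above a ~3396 km reference radius
r_sc = 3.396e6 + 4.0e5
print(surface_radius(r_sc, 2 * 4.0e5 / C))  # recovers ~3.396e6 m
```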

The topography model MEGDRs was developed from the range (distance) measurements made by the MOLA 2 instrument and the radio tracking data of Mars Global Surveyor (MGS). The highest point is located on Olympus Mons, while the deepest point is located within the Hellas basin. (Brown-red: topography high; green-blue: topography low) (Credit: NASA/JPL-Caltech)

Surface landing

As there is a large distance between Mars and Earth, immediate commanding of a lander is almost impossible, and landing relies heavily on the lander's autonomous systems. It has been recognized that, to avoid failure, precise understanding of the gravity field of Mars is essential for landing projects, so that offsetting factors and uncertainties in gravitational effects can be minimized, allowing for a smooth landing. The first man-made object to land on Mars, the Mars 2 lander, crashed for an unknown reason. Since the surface environment of Mars is complex, composed of laterally varying morphological patterns, the landing process should be further assisted by the employment of LIDAR on site to determine the exact landing position, and by other protective measures, in order to avoid rock hazards.
