Codebreaking: A History and Explanation of Frequency Analysis

== Overview ==

As many of us probably realized when trying to access the first quest hub, the hardest part of decoding a cipher is often knowing where to start. Frequency analysis, a technique in practice since medieval times, gives us a logical way to get the ball rolling. The technique is based on knowledge of the most frequently used letters and combinations of letters in a given language. For example, in English, the most commonly used single letter is E, the most commonly used group of two letters (or //bigram//) is TH, and the most commonly used group of three letters (or //trigram//) is THE. When presented with a substitution cipher, such as the one we decoded to enter the quest hub, one can use this knowledge to make educated guesses about which plaintext letter each cipher letter might correspond to. All it takes is counting the frequency of letters, bigrams, and trigrams in the cipher and checking whether any of them make sense when substituted for the most common letters, bigrams, and trigrams in the language. Once things start to make some sense, such as when we recognize a fragment of a word, we can make further educated guesses at the letters surrounding those fragments. Decoding the cipher then becomes progressively simpler as we go, ultimately thanks to the frequency analysis that got the ball rolling.
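The counting step described above is mechanical enough to sketch in code. The following is a minimal illustration (not a full cipher solver): it tallies single letters, bigrams, and trigrams in a ciphertext, then makes a naive first-pass guess by aligning the cipher's most frequent letters with a standard English frequency order. The ciphertext, the `frequency_analysis` function, and the frequency order used are all hypothetical choices for this example; a real attack would refine the initial guesses by hand, as the paragraph describes.

```python
from collections import Counter

# Approximate frequency order of letters in English text, most to least
# common (ETAOIN...). Exact orders vary slightly by corpus.
ENGLISH_ORDER = "ETAOINSHRDLCUMWFGYPBVKJXQZ"

def frequency_analysis(ciphertext):
    """Tally letters, bigrams, and trigrams, and propose a first-pass
    substitution by matching cipher frequencies to English frequencies."""
    # Keep letters only; this toy version ignores word boundaries,
    # so some bigrams/trigrams span adjacent words.
    letters = [c for c in ciphertext.upper() if c.isalpha()]
    singles = Counter(letters)
    bigrams = Counter(a + b for a, b in zip(letters, letters[1:]))
    trigrams = Counter(a + b + c for a, b, c
                       in zip(letters, letters[1:], letters[2:]))

    # Naive starting hypothesis: most frequent cipher letter -> E,
    # next -> T, and so on. These are guesses to refine, not answers.
    guess = {cipher: plain for (cipher, _), plain
             in zip(singles.most_common(), ENGLISH_ORDER)}
    return singles, bigrams, trigrams, guess

# Example: a Caesar shift of 3 applied to a well-known pangram.
cipher = "WKH TXLFN EURZQ IRA MXPSV RYHU WKH ODCB GRJ"
singles, bigrams, trigrams, guess = frequency_analysis(cipher)
print(trigrams.most_common(1))  # -> [('WKH', 2)], a good candidate for THE
```

The repeated trigram WKH is exactly the kind of fragment the technique looks for: guessing WKH = THE immediately fixes three letters, and the surrounding text can then be filled in by further educated guesses.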

== History ==

Frequency analysis is as old as codebreaking itself. The first explanation of any cryptanalytic technique was given in the 9th century by Al-Kindi, an Arab mathematician, in //A Manuscript on Deciphering Cryptographic Messages//, the first known text of its kind. It is probable that he obtained his knowledge of Arabic letter frequency from study of the Qur'an. By the Renaissance, the technique had spread to Europe. As knowledge of how to easily decode substitution ciphers became more widespread, techniques were developed to try to fortify the substitution system; these included designating more than one cipher letter for each common plaintext letter, using multiple alphabets in the code, and substituting groups of letters for one letter as opposed to a one-to-one relationship. The main failure of these methods was that, while they made codebreaking harder, they also made code-making harder and led to more mistakes in transcription. Nevertheless, substitution ciphers, and frequency analysis techniques for breaking them, remained in use until the dawn of computers, when breaking these ciphers became so easy and quick that new forms of code-making and codebreaking were needed.

Info taken from []

**Opinion**
(There is no text here yet.)

**Future Trends?**
(There is no text here yet.)