This article looks at how an algorithm can read just 1 bit of data yet recover the values of 3 bits.

The concept behind this idea is quite simple. We know that bits are sent as an encoded signal from the CPU of an information system. It is also known that, at the base of all CPUs, the machine language is binary: each bit is represented as a value of 1 or 0.

Here is an example of how the bit string 0110 might look if viewed as an encoded square wave:
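(The original figure isn't reproduced here; as a rough stand-in, the square wave for 0110 can be sketched in ASCII with a few lines of Python. The `ascii_wave` helper is hypothetical, written only for this illustration.)

```python
def ascii_wave(bits, width=4):
    """Render bits as a two-row ASCII square wave: the top row is the
    high level (1), the bottom row the low level (0); each bit occupies
    `width` characters."""
    top = "".join(("_" * width) if b else (" " * width) for b in bits)
    bottom = "".join((" " * width) if b else ("_" * width) for b in bits)
    return top + "\n" + bottom

print(ascii_wave([0, 1, 1, 0]))
```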

If we created an algorithm that could read 1 bit of data from an encoded square wave, it is easy to see how a 1:3 read-to-decode ratio becomes possible.

Imagine that we are able to read the signal of exactly 1 bit of data. Think of reading the signal in a linear fashion: rather than reading just one point on a line, we would read the span of signal equivalent to exactly 1 bit. In mathematical terms, this could be represented as the interval [0.5, 1.5]. The following representation of an encoded square wave breaks 1 bit of data down further.

The 4 bits of data (0110) in this wave form are represented in a linear fashion: [ [0.5, 1.5] | [1.5, 2.5] | [2.5, 3.5] | [3.5, 4.5] ]
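In code, the mapping from a (1-based) bit position to its window is just a half-integer offset on each side. A minimal sketch (the `bit_interval` name is my own):

```python
def bit_interval(n):
    """Time window [n - 0.5, n + 0.5] occupied by the n-th bit (1-based)."""
    return (n - 0.5, n + 0.5)

# The four bits of 0110 map to the windows listed above:
print([bit_interval(n) for n in range(1, 5)])
# → [(0.5, 1.5), (1.5, 2.5), (2.5, 3.5), (3.5, 4.5)]
```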

Now let's break this down a bit further by focusing on only 1 bit of data in this wave form.

Our focus is going to turn to the 3rd bit of data in our signal (0 1 **1** 0).

Note that we captured 1 full bit of the wave form, the interval [2.5, 3.5]. At the beginning of our window lies the end of the 2nd bit, and at the end of our window lies the beginning of the 4th bit. Because the 2nd bit has a value of 1, the same as the 3rd, the wave is a flat line at the point 2.5. Because the 4th bit has a value of 0, the line starts to drop at 3.5. The final result is simple: the bit before our window is a 1 and the bit after is a 0.
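The reasoning above can be sketched in Python. This is a toy model, not the article's actual algorithm: the wave is given a finite rise/fall time (`ramp`), so a flat level at a window edge means the neighboring bit matches the captured bit, while a mid-level (ramping) sample means it differs. All names (`wave`, `read_three`) are illustrative assumptions:

```python
def wave(bits, t, ramp=0.2):
    """Encoded square wave for `bits` (1-based cells [n - 0.5, n + 0.5]);
    transitions between differing bits are linear ramps of width `ramp`
    centered on each cell boundary."""
    b = round(t - 0.5) + 0.5                            # nearest cell boundary (half-integer)
    left = bits[max(int(b - 0.5) - 1, 0)]               # bit just left of that boundary
    right = bits[min(int(b + 0.5) - 1, len(bits) - 1)]  # bit just right of it
    if abs(t - b) >= ramp / 2:
        return left if t < b else right                 # flat region inside a cell
    return left + (right - left) * (t - (b - ramp / 2)) / ramp  # on the ramp

def read_three(bits, n, ramp=0.2):
    """Capture bit n's window and infer bits n-1 and n+1 from its edges:
    a flat edge level means the neighbor equals bit n; a mid-level
    (ramping) edge means it differs."""
    lo, hi = n - 0.5, n + 0.5
    current = round(wave(bits, n, ramp))                # level at the cell's center
    prev_bit = current if wave(bits, lo, ramp) == current else 1 - current
    next_bit = current if wave(bits, hi, ramp) == current else 1 - current
    return prev_bit, current, next_bit

print(read_three([0, 1, 1, 0], 3))                      # read bit 3, recover bits 2-4
# → (1, 1, 0)
```

Reading the 3rd bit of 0110 this way yields one captured bit plus two inferred neighbors, the 1:3 ratio described above.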

Do you have feedback (good, bad or indifferent, all feedback is welcome) about this article? Please feel free to drop me a message!
