Invitation to Cybersecurity

2. The Context of Cybersecurity: Cyberspace 7

With every electric pulse of a computer, electricity flows through the circuits based on the state of the transistors and creates a new state, with each transistor set to either 1 or 0. This state then acts as the input that determines the next state, and so on. A computer’s clock speed determines how quickly it moves between states. The ticks of the clock that trigger the state changes take place on the order of gigahertz (GHz), or billions per second.

To help put this kind of speed in perspective, think about a digital stopwatch where the last two digits represent tenths and hundredths of a second. When we look at such a stopwatch, we see the seconds go by clearly, and each tenth of a second also registers, but just barely (try it!). The hundredths of a second, on the other hand, cycle through so quickly that we see only a blur of numbers. This demonstrates that, at best, our minds operate on the timescale of tenths of a second. This makes sense: we use the expression “in the blink of an eye” to mean instantaneously, but in fact an eye blink takes around three tenths of a second. Computers, however, operate on the timescale of billionths of a second, so to a computer the minuscule span of time between each blurry hundredth of a second on the stopwatch feels, in “human time,” like a span of weeks. To a computer, each tenth of a second seems like months, and each second, years. We can get a lot of things done over the course of weeks, just as a computer can accomplish a lot of work between every seemingly instantaneous hundredth of a second on a stopwatch. Again, incomprehensible!

You may be wondering: what is the big deal about Boolean logic? What can possibly be accomplished by such simple operations, even given the ability to do billions of them per second? Well, remarkably, any computable problem can be solved with Boolean logic.
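To make that claim a little more concrete, here is a minimal Python sketch (an illustration, not part of the text’s own material) showing how the primitive Boolean operations compose into arithmetic: a one-bit “half adder” built from nothing but AND, OR, and NOT. Chaining such adders together is, in essence, how circuits add binary numbers.

```python
def xor(a, b):
    # XOR expressed using only the primitive Boolean operations AND, OR, NOT
    return (a or b) and not (a and b)

def half_adder(a, b):
    """Add two single bits; return (sum_bit, carry_bit)."""
    return xor(a, b), (a and b)

# Truth table: every combination of two input bits
for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(int(a), int(b), "->", "sum:", int(s), "carry:", int(c))
```

Running the loop prints the familiar binary addition table: 1 + 1 yields a sum bit of 0 and a carry of 1.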
We will not get into the theory of computability and the difference between tractable and intractable problems here, but suffice it to say, the class of tractable computable problems with real-world applications is huge: it includes much of mathematics and all kinds of information processing tasks.

2.1.2 Data Encoding

“Computers are just 1s and 0s.” - Common saying

Claude Shannon also invented information theory, paving the way for computer processing. He proved that all types of information can be represented using just a pair of differentiable signals or symbols. This means binary strings, composed of the base-two number symbols 1 and 0, can be made to represent any kind of information whatsoever, including colors, letters, images, and sounds. This is called data encoding. Needless to say, pairing blazing clock speeds and minuscule transistors with the power of Boolean logic and data encoding gives rise to an incredible phenomenon, and we are witnesses to it every time we use a computer.
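As a small illustration of data encoding (a sketch supplied here, assuming the standard ASCII character code as one example among many possible encodings), the Python snippet below maps the letters of a word to 8-bit binary strings and then decodes them back.

```python
def encode(text):
    # Each character becomes an 8-bit string of 1s and 0s (its ASCII code point)
    return " ".join(format(ord(ch), "08b") for ch in text)

def decode(bits):
    # Reverse the mapping: parse each 8-bit group back into a character
    return "".join(chr(int(group, 2)) for group in bits.split())

encoded = encode("Hi")
print(encoded)          # 01001000 01101001
print(decode(encoded))  # Hi
```

The same idea, with different agreed-upon mappings, lets binary strings stand in for pixel colors, audio samples, and any other kind of information.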

RkJQdWJsaXNoZXIy MTM4ODY=