The Importance of Checking and Debugging Code

The year is 1962. NASA’s Mariner I spacecraft has just taken off aboard a rocket from Cape Canaveral, destined for a flyby of Venus. Only a few minutes into the flight, the rocket veers off course, and a self-destruct order is sent. Mariner I explodes. $80 million (almost $630 million in today’s money) falls into the sea.

As it turns out, the reason for Mariner I’s failure was a simple one – a lone hyphen that was accidentally left out of the mathematical code by a NASA programmer.

NASA official Richard B. Morrison explained to Congress:
“[The hyphen] gives a cue for the spacecraft to ignore the data the computer feeds it until radar contact is once again restored. When that hyphen is left out, false information is fed into the spacecraft control systems. In this case, the computer fed the rocket in hard left, nose down and the vehicle obeyed and crashed.”

So, when writing code, CHECK IT – or else you too might end up costing someone tens of millions of dollars.

Source: https://priceonomics.com/the-typo-that-destroyed-a-space-shuttle/

Packet Switching

Information isn’t always broken up into easily digestible bites. Imagine having a conversation with someone who doesn’t pause between sentences. Things would get pretty confusing pretty quickly, right? You might ask the other person to slow down a little so you can figure out what they’re talking about. In the digital world, when two entities “speak” to each other, the data being transferred is broken up into packets, which makes it easier for them to understand one another. Just as we use pauses in conversation to make things easier to follow, computers break data up and send it in packets containing special instructions on how they’re to be ordered and read. You can watch this short video for a visual representation of the process.
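To make the idea concrete, here is a minimal Python sketch, purely for illustration; real packets (such as those used on the Internet) carry far richer header information, and the to_packets and from_packets helper names below are made up for this example. Each chunk of the message is tagged with a sequence number so the receiver can put the pieces back in order, even if they arrive scrambled.

# Toy illustration of packet switching, not a real network protocol.

def to_packets(message, size=8):
    # Break the message into chunks of at most `size` characters,
    # tagging each one with a sequence number and the total count.
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [{"seq": n, "total": len(chunks), "payload": chunk}
            for n, chunk in enumerate(chunks)]

def from_packets(packets):
    # Sort by sequence number and stitch the payloads back together.
    ordered = sorted(packets, key=lambda p: p["seq"])
    return "".join(p["payload"] for p in ordered)

packets = to_packets("Computers break data into packets.")
shuffled = packets[::-1]          # simulate packets arriving out of order
print(from_packets(shuffled))     # prints the original message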

Foolish Tech Prediction of the Day

“There is no reason anyone would want a computer in their home.”

-Ken Olsen, founder of Digital Equipment Corporation, 1977

Source: https://www.pcworld.com/article/155984/worst_tech_predictions.html

Let’s think about this quote for a second. Do you know anyone who doesn’t own a computer? It’s crazy to think how much has changed in the world of computing over the past forty years. But let’s just go back to 1977 for a second.

Personal computers as we know them today weren’t really a thing back in 1977. Anything small enough to fit on your desk was either prohibitively expensive for home use or not user-friendly enough for the average person. The Internet wasn’t even a thing yet. So, could you really blame Mr. Olsen for making that statement?

Technology is constantly evolving, and we’re finding new uses for it every single day. I wonder if, many decades from now, an article similar to this one will be published online. I’m sure some of the super-cool tech we’re using today will look pretty lame then, too.

Decoding Encoding

In the digital world, all data is represented as numbers – it’s the universal language of computers. Practically every character in every language is encoded according to one of two standards: ASCII and Unicode (most commonly stored as UTF-8). The following definitions will help you understand this concept further.

Binary encoding: The representation of symbols in a source alphabet by strings of binary digits, i.e. a binary code (Source: encyclopedia.com)

ASCII: ASCII stands for American Standard Code for Information Interchange. Computers can only understand numbers, so an ASCII code is the numerical representation of a character such as ‘a’ or ‘@’ or an action of some sort (Source: http://www.pld.ttu.ee/~marek/PA_R4/ascii.html)

Unicode: The Unicode Standard provides a unique number for every character, no matter what platform, device, application or language (Source: http://unicode.org/standard/WhatIsUnicode.html)

UTF-8: UTF-8 encodes each Unicode character as a variable number of 1 to 4 octets, where the number of octets depends on the integer value assigned to the Unicode character (Source: http://www.utf-8.com)

For reference: Unicode Character Table, ASCII Table
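If you have Python handy, a few lines are enough to see these definitions in action. This is only an illustrative sketch: it prints each character’s Unicode code point and the one to four bytes UTF-8 uses to store it; for basic Latin characters such as ‘a’ and ‘@’, the code point is the same as the ASCII code.

# ord() gives a character's Unicode code point; .encode("utf-8") shows
# the actual bytes used to store it.

for ch in ["a", "@", "é", "€"]:
    code_point = ord(ch)              # the character's Unicode number
    utf8_bytes = ch.encode("utf-8")   # how UTF-8 stores it (1-4 bytes)
    print(ch, code_point, list(utf8_bytes), len(utf8_bytes), "byte(s)")

# Output:
# a 97 [97] 1 byte(s)
# @ 64 [64] 1 byte(s)
# é 233 [195, 169] 2 byte(s)
# € 8364 [226, 130, 172] 3 byte(s)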

What Are Bits? An Explanation of the Binary System

It’s important that the integrity of the original message is preserved when a signal is transmitted. To ensure this, a universal encoding system called binary was developed. Every signal, at its most basic level, is made up of individual bits.

Well, what are bits? Bits, or binary digits, are the smallest unit of data. The binary system represents electrical signals as bits (either a “1” or a “0”), which is much simpler than a decimal system that works with values from 0 to 9. Binary makes encoding and reading information much more efficient for computers and greatly reduces errors in the transmission of data.
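As a quick illustration (a small Python sketch, nothing more), here is what the bits behind two ordinary ASCII characters look like:

# format(value, "08b") shows the eight binary digits (one byte)
# behind each character's numeric code.

for ch in "Hi":
    print(repr(ch), "decimal", ord(ch), "binary", format(ord(ch), "08b"))

# Output:
# 'H' decimal 72 binary 01001000
# 'i' decimal 105 binary 01101001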

Ensuring that data is free of errors is important. Even minuscule coding mistakes, for example, can be costly. Take NASA’s Mariner 1 spacecraft. Launched aboard an Atlas-Agena rocket on July 22, 1962, Mariner 1 never made it into orbit. The rocket’s guidance program was transcribed by hand, and a single character was accidentally omitted in the process. This costly mistake sent the rocket off course, and it was subsequently destroyed, costing NASA $80 million.

Intro To The Digital World

Let’s face it: technology dominates our day-to-day lives. It gives us access to an unlimited source of information, gets us from place to place, and allows us to communicate with one another instantaneously. Even though we’ve surrounded ourselves with this connected world, many of us take for granted the technologies that made this way of living possible. The truth is, it’s really not as complicated as it seems.

Take your cell phone, for example: you speak into one end and your voice comes out the other. At its core, today’s cell phone isn’t all that different from the rotary phones of 100 years ago – both technologies serve the same function of transmitting a signal between two or more people. What was once a purely analog technology has entered the digital world of today and transformed into something infinitely more convenient and reliable.

While this is a very simple analogy, I hope you’ll use that same thought process when reading about our exploration of space – from Yuri Gagarin’s first spaceflight in April 1961 to our future voyages to Mars and beyond. The technologies that have opened our eyes to the heavens took many years and many brilliant minds to make a reality.

It all started somewhere.
Welcome to the Digital World.