Today, the world is in the midst of the transformative and ever-developing Digital Age, otherwise known as the “Age of Information.” It has been an unprecedented, remarkable, and explosive era, marked by social media and computer-generated imagery (and, with it, deepfakes), among other novel, previously unimaginable concepts. The bulky monitors and blocky towers of early personal computers and laptops, once considered fashionable, futuristic contraptions, have since given way to a sleek and stylish array of minimalist devices, both multi-functional and specialized, ranging from smartphones and tablets to lightweight laptops and full-fledged gaming setups packed with powerhouse processors.
While much of this is familiar, and a recent movie revived interest in Alan Turing’s computing achievements during World War II, it was Charles Babbage who first conceived the notion of a programmable, automatic universal computer, a machine that, beyond its ability to perform any calculation at unmatched speed, could also be put to a seemingly infinite number of other applications. In other words, he envisioned the precursor to the modern computer.
At first blush, Babbage hardly seemed the type; in many ways, he was the antithesis of the debonair, silver-tongued, and effortlessly charismatic CEOs of present-day tech giants. He was a quirky individual, to say the least: highly observant, yet at the same time a habitual daydreamer, often caught in a trance of deep thought. He spoke with a stutter and cared little about his appearance, often sporting stained collars and rumpled coats, and in his later years he became something of an agoraphobe, developing a disdain for crowds and music.