A Brief History of Cloud Technology

If you ‘Google’ the term Cloud technology, the search returns over 2 billion results. But with the wealth of Cloud solutions now available, who were the original thought leaders, and where did the idea of an interconnected ‘digital’ world come from?

What is ‘The Cloud’?

‘The Cloud’ refers to servers, located in data centres all over the world, that can be accessed over the internet. These servers host software and databases, allowing users and businesses to run applications from their own devices and limiting the need for local server infrastructure.

This also allows individuals and businesses to store, share, and retrieve files without having to send them directly to the other parties involved.

Cloud computing takes things a step further, allowing users to process and manipulate the data stored in the Cloud. It usually requires some form of Cloud storage from the outset to be effective, although some systems offer a basic storage package to get the user started.

In short, the Cloud can be summarised as a combination of three fundamental concepts – illustrated in the short sketch after this list – that aim to:

  • Deliver a service, such as storage or computing, as a utility
  • Provide a shared computing resource that multiple people can work on, referred to as virtualisation
  • Give access to those services via a network
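
To make these ideas a little more concrete, below is a minimal, hypothetical sketch in Python of the first and third points: consuming a storage service as a utility over a network. The endpoint URL, API key and upload path are placeholders invented for illustration, not any real provider’s API.

    import urllib.request

    # Hypothetical endpoint and credentials – placeholders for illustration only.
    STORAGE_ENDPOINT = "https://storage.example-cloud.com/v1/files"
    API_KEY = "replace-with-your-key"

    def upload_file(local_path, remote_name):
        """Send a local file to the (imaginary) Cloud storage service over HTTPS."""
        with open(local_path, "rb") as f:
            data = f.read()
        request = urllib.request.Request(
            url=f"{STORAGE_ENDPOINT}/{remote_name}",
            data=data,
            method="PUT",
            headers={
                "Authorization": f"Bearer {API_KEY}",
                "Content-Type": "application/octet-stream",
            },
        )
        # The provider's servers store the file; any device with network access
        # and the right credentials can later retrieve it.
        with urllib.request.urlopen(request) as response:
            return response.status  # e.g. 200 or 201 on success

    # Example usage:
    # upload_file("report.pdf", "reports/report.pdf")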

The precursor

Back in the 1950s, computers were enormous and incredibly expensive to set up and maintain. Huge banks of mainframes were required to house and operate a company’s computer network. As a result, this set-up only really made financial and operational sense for large organisations, governments or institutions that could afford their own mainframe infrastructure.

Early theories around Cloud computing are thought to have originated in the 1950s with John McCarthy – the computer scientist who coined the term ‘artificial intelligence’ (AI).

McCarthy’s theory revolved around the idea of computational ‘time sharing’, whereby smaller companies – who couldn’t afford their own mainframes – could share mainframe ‘time’ and so enjoy some of the computing benefits available to larger organisations.

Further ideas around computational ‘sharing’ came from American computer scientist J.C.R. Licklider, who posited the idea of an interconnected system of computers. This idea would later influence the creation of ARPANET (Advanced Research Projects Agency Network).

Commonly regarded as the precursor to the internet, this network allowed digital resources to be shared amongst multiple computers that didn’t need to reside in the same location.

Despite working some two decades before the actual inception of the World Wide Web, Licklider and the team behind ARPANET – having envisioned a world where everyone was connected no matter their location – paved the way for the modern ‘digital’ world that we now live in (and probably couldn’t live without).

Further developments throughout the 20th century

Over the next two decades, the field saw an explosion in new ideas and developments in information sharing and network infrastructure.

These advancements were further highlighted in 1976 when Queen Elizabeth II sent one of the first emails (although the very first is widely accepted to have been sent by Ray Tomlinson, a computer programmer who worked on the ARPANET system).

The 1980s continued the snowball of progress. During this decade, the White House installed its first computers; network operating systems were launched, allowing computers to ‘talk’ to each other; and storage tapes became available that could hold up to 200MB of data.

By the end of the 80s, over 100,000 computers would be connected to the internet and able to complete basic tasks. Around this time, computers were also becoming ubiquitous in office life, so the question of how to enable these devices to ‘talk’ to each other was becoming ever more pressing.

Early iterations of modern Cloud computing

With the public release of the World Wide Web in the 1990s came the dot-com revolution and the widening popularity of e-commerce.

Throughout most of the 90s, ‘the Cloud’ referred simply to the space between the end user and the provider. This perception was later challenged by Professor Ramnath Chellappa of Emory University, who would go on to define Cloud computing as a “computing paradigm, where the boundaries of computing will be determined by economic rationale, rather than [the] technical limits alone.”

The first usage of the term ‘Cloud computing’ is thought to have come from the offices of Compaq Computer in 1996. While contemplating the future of internet businesses, a small group of Compaq employees (including Sean O’Sullivan and George Favaloro) theorised that all business software would eventually move to the Web. They also suggested that ‘Cloud computing-enabled software’ would soon become the norm.

The modern Cloud

There’s still some debate around where the term definitively came from.

As we’ve seen, early ideas around the concept can be dated as far back as the 1950s, but the term in its modern sense is generally agreed to have been introduced by Eric Schmidt – Google’s CEO at the time – at an industry conference in 2006.

The modern Cloud was born, and advancements in this space have gone from strength to strength ever since.

As the technology became more affordable and accessible, companies began to offer business applications over the internet, introducing the Software-as-a-Service (SaaS) model.

Notable early examples were Salesforce, which brought its SaaS offering to market in 1999, and Amazon Web Services (AWS), which launched its Cloud infrastructure services in 2006.

Around the same time, Google launched their Google Docs service. Just a year later, they partnered with IBM and a number of American universities to develop a server farm that offered faster processing and a place to house larger data sets. That same year, Netflix released their Cloud streaming service, giving rise to on-demand video.

Further notable Cloud advancements came in the form of OpenNebula – released in 2008 as the first open-source software for deploying Private and Hybrid Clouds.

IBM then introduced IBM SmartCloud in 2011, with Oracle following a year later with Oracle Cloud.

At Nuvias UC, we work with a number of Cloud technology vendors to bring next-generation digital technologies to the Channel, enabling end users to work and communicate as effectively as possible.


There have been many more developments since, and Cloud technology continues to grow to meet the needs of an increasingly interconnected world.

In a world where even our watches and doorbells are connected to the Cloud, the possibilities seem to be endless.

To find out more about what Cloud technology is bringing to the IT Channel, take a look at our Cloud solutions guide.


Sources:

DATAVERSITY

ecpi.edu

BCS.org

scality.com

itchronicles.com

servermania.com

Cloudflare

MIT Technology Review