A brief history of edge computing

Edge computing is one of the most important technologies of modern times. The edge allows organizations to leverage data-hungry innovations such as artificial intelligence, biometrics, the Internet of Things, and endpoint management.

Combined with 5G and the cloud, the edge lets companies bring processing closer to where data is generated, perform operations in real time, and reduce latency and IT costs. But where did it all start? What is the history of edge computing?

What was before the edge?

To understand the early days of the edge, we have to go back to the history of computers. The origins of computing stretch back more than 200 years. However, data-processing computers only really took shape in the years around World War II, with devices like MIT’s 1931 mechanical analog computer, the differential analyzer, and the 1936 Turing machine, a principle for a universal machine described by British scientist Alan Turing.

As Live Science’s timeline reveals, the ’40s, ’50s, and ’60s saw steady computer improvements, but all of these machines had one thing in common: they were large, often occupying entire rooms, and they processed all data on site. They were, in effect, data servers. These huge computers were expensive, rare and difficult to build, and they were used mainly by the military, governments and major industries.

SEE: Don’t hold back your enthusiasm: Trends and challenges in edge computing (TechRepublic)

By the late 1970s, technology companies such as Intel, Microsoft, and Apple were rising alongside established players like IBM, and microprocessors and other microtechnology shaped the first personal computers. In the 1980s, iconic machines like the 1984 Apple Macintosh found their way into homes. These personal computers enabled new applications, but like the large machines of the past, they processed all data on the device.

It wasn’t until 1989 that a significant shift in data computing began, when Tim Berners-Lee invented the World Wide Web, along with the first web server, the first web browser, and the formatting language called Hypertext Markup Language.

Data shifted from being processed on devices to being processed by servers, giving rise to the server-computing model. But even in the web’s earliest days, Berners-Lee knew this model had a major problem: congestion. He realized that as more devices connected to the Internet, the servers that supplied the data would come under increasing pressure. Eventually a breaking point would be reached, and applications and sites would fail and crash.

From centralized servers to the first edge

Nearly a decade after the web was created, a small group of computer scientists from MIT presented a business proposal at the 1998 MIT $50K competition and was selected as one of that year’s finalists. Out of that group came a company that would change the way data is managed around the world. Its name: Akamai.

Today, Akamai is a content delivery network, cybersecurity and cloud services company with annual revenues of $3.5 billion, more than 355,000 servers in more than 135 countries and more than 1,300 networks worldwide. But in 1998, it was a small group of scientists working to solve the traffic congestion problem that plagued the early World Wide Web. Foreseeing how congestion would cripple the internet, they developed an innovative concept to keep data flowing smoothly without crashing sites. The first edge computing architecture was born.

The model moved away from centralized servers managing every data transfer in a direct server-device relationship. The edge would decentralize this arrangement, creating thousands of networks and servers that ease bandwidth demands and reduce data-processing latency.

SEE: 20 Good Habits Network Admins Need — And 10 Habits To Break (Free PDF) (TechRepublic)

Akamai’s 2002 paper, titled Globally Distributed Content Delivery, revealed how the company deployed its 12,000-server system across more than 1,000 networks to combat service bottlenecks and shutdowns by delivering content from the edge of the Internet.

“Providing web content from one location can pose serious problems to site scalability, reliability, and performance,” explains Akamai. “By caching content at the edge of the Internet, we reduce demand for the site’s infrastructure and provide faster service to users whose content comes from servers nearby.”

When it launched in 1999, the Akamai system focused on delivering static web objects such as images and documents. It soon evolved to distribute dynamically generated pages and applications, handling flash crowds by allocating more servers to heavily loaded sites. With automatic network control and mapping, the edge computing concept Akamai introduced is still in use today.
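
To make the caching idea concrete, here is a minimal sketch of the cache-at-the-edge pattern described in the quote above. It is not Akamai’s actual implementation; the content, paths and TTL are invented for illustration.

```python
import time

# Toy stand-in for a site's central servers (illustrative only;
# a real origin would be an HTTP server reached over the internet).
ORIGIN_CONTENT = {"/logo.png": b"<image bytes>", "/index.html": b"<html>...</html>"}

def fetch_from_origin(path: str) -> bytes:
    """Simulates the slow, congested round trip back to the origin."""
    time.sleep(0.1)  # stand-in for cross-internet latency
    return ORIGIN_CONTENT[path]

class EdgeCache:
    """A toy edge node: serve cached copies, go to the origin only on a miss."""

    def __init__(self, ttl_seconds: float = 60.0):
        self.ttl = ttl_seconds
        self.store: dict[str, tuple[float, bytes]] = {}  # path -> (expires_at, body)

    def get(self, path: str) -> bytes:
        entry = self.store.get(path)
        if entry and entry[0] > time.monotonic():
            return entry[1]  # cache hit: the origin is never contacted
        body = fetch_from_origin(path)  # cache miss: one trip to the origin
        self.store[path] = (time.monotonic() + self.ttl, body)
        return body

edge = EdgeCache(ttl_seconds=60)
edge.get("/logo.png")  # the first request pays the trip to the origin
edge.get("/logo.png")  # repeat requests are served from the edge, near the user
```

The point of the pattern is the hit path: repeat requests never leave the edge node, which is what reduces both the demand on the site’s infrastructure and the latency seen by nearby users.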

Edge computing: from content data to business use

Shortly after Akamai’s edge network emerged, major tech companies and vendors began offering similar content delivery networks to meet the demands of the Internet’s global rise. For the next decade, the edge focused mainly on data management for websites, but new technology would soon find new uses for it.

The central servers-edge servers-device model would see another shift as IoT, smart devices and new endpoints emerged. Today’s edge network adds devices and nodes that can process data on the machine itself, so their function is no longer limited to distributing Internet content.

Businesses use the edge to process data on site, avoid expensive and time-consuming cloud transfers, and improve their operations. Retailers use IoT devices connected via 5G for instant payment options, inventory tracking and customer experience, while industrial companies use IoT and endpoint devices to improve performance, insight, security and operations.
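
As a rough illustration of processing data on site, the sketch below aggregates sensor readings on an edge device and ships only a compact summary to the cloud. The readings, threshold and summary fields are all invented for the example.

```python
from statistics import mean

# Hypothetical temperature readings from a 5G-connected sensor (illustrative).
readings = [21.1, 21.3, 21.2, 45.0, 21.4, 21.2]

ANOMALY_THRESHOLD = 30.0  # invented cutoff, not a real product setting

def process_at_edge(samples: list[float]) -> dict:
    """Aggregate locally so only a small summary ever leaves the device."""
    anomalies = [s for s in samples if s > ANOMALY_THRESHOLD]
    return {
        "count": len(samples),
        "mean": round(mean(samples), 2),
        "anomalies": anomalies,  # flagged in real time, without a cloud round trip
    }

summary = process_at_edge(readings)
print(summary)  # {'count': 6, 'mean': 25.2, 'anomalies': [45.0]}
```

Sending one summary instead of every raw sample is what cuts the cloud-transfer cost, while the local anomaly check is what makes a real-time response possible.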

While use of the edge has expanded beyond online content distribution and is now tailored to each business, its essence remains the same: storing, processing, managing and distributing data at the edge of the network.

The history of edge computing is still being written: remarkable developments have taken place over the past 30 years, and innovation shows no signs of slowing down. The edge will continue to drive progress where centralized servers and the cloud cannot compete on speed, latency, cost, security and data management.
