
The Internet: From ARPANET to the World Wide Web

From a two-letter message on ARPANET in 1969 to the World Wide Web, social media, and the smartphone revolution — how the Internet transformed human civilization in half a century.

James Harrington · Monday, February 9, 2026 · 10 min read

On October 29, 1969, a graduate student at UCLA named Charley Kline attempted to send the first message over a new computer network called ARPANET. He was trying to type "LOGIN" to a computer at the Stanford Research Institute, 350 miles away. The system crashed after just two letters. The first message ever sent over what would become the Internet was: "LO." It was an accidental prophecy — within decades, this stuttering experiment would transform human civilization more profoundly than any invention since the printing press.

The Cold War Origins

The Internet's origins lie in the Cold War paranoia of the 1960s. The Advanced Research Projects Agency (ARPA), a Pentagon research agency created in response to the Soviet launch of Sputnik, funded a network to link its research computers. The underlying concept, developed independently by Paul Baran at the RAND Corporation (who framed it as a way for communications to survive a nuclear attack) and Donald Davies at the UK's National Physical Laboratory, was packet switching: breaking data into small packets that could travel independently across multiple routes and be reassembled at their destination.

This was revolutionary. Traditional telephone networks used circuit switching — a dedicated connection between two points. If the connection was broken, communication ceased. Packet switching meant there was no single point of failure. If one route was destroyed, packets would find another way through. It was a network designed to be indestructible.
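The mechanics described above can be sketched in a few lines. This is a toy model, not a real protocol: it splits a message into sequence-numbered packets, shuffles them to simulate independent routes delivering them out of order, and reassembles them at the destination.

```python
import random

def packetize(message: str, size: int = 4) -> list[tuple[int, str]]:
    """Split a message into fixed-size packets, each tagged with a sequence number."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Sort packets by sequence number and rejoin their payloads."""
    return "".join(payload for _, payload in sorted(packets))

message = "PACKETS FIND ANOTHER WAY"
packets = packetize(message)
random.shuffle(packets)  # simulate packets arriving out of order via different routes
assert reassemble(packets) == message
```

The sequence numbers are what make the order of arrival irrelevant: any route that delivers a packet is as good as any other.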

"The Internet is the first thing that humanity has built that humanity doesn't understand, the largest experiment in anarchy that we have ever had." — Eric Schmidt

ARPANET, the network built on these principles, went live in 1969 with four nodes: UCLA, the Stanford Research Institute, UC Santa Barbara, and the University of Utah. By the early 1970s, it had grown to dozens of nodes, connecting researchers at universities and government labs across the United States.

The Protocols: TCP/IP

The critical technical breakthrough came in the 1970s with the development of TCP/IP (Transmission Control Protocol/Internet Protocol) by Vint Cerf and Bob Kahn. TCP/IP established a universal set of rules (protocols) that allowed any computer network to communicate with any other. This was the key insight: rather than building a single monolithic network, TCP/IP created a "network of networks" — the inter-net.
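The legacy of that insight is visible in every modern operating system, where TCP/IP is exposed through the sockets API. As a minimal sketch (an echo server and client on localhost, not anything from the historical protocols themselves), the same few calls work regardless of what networks sit between the two endpoints:

```python
import socket
import threading

def echo_once(server_sock: socket.socket) -> None:
    """Accept one connection and echo back whatever arrives."""
    conn, _ = server_sock.accept()
    with conn:
        conn.sendall(conn.recv(1024))

# SOCK_STREAM requests TCP: ordered, reliable delivery over IP's best-effort packets.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
server.listen(1)
threading.Thread(target=echo_once, args=(server,), daemon=True).start()

client = socket.create_connection(server.getsockname())
client.sendall(b"LO")  # the famous truncated first message
reply = client.recv(1024)
client.close()
server.close()
print(reply)  # b'LO'
```

The point is not the echo itself but the uniformity: the client neither knows nor cares how many networks, routers, or physical links the bytes crossed.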

On January 1, 1983 — known as "Flag Day" — ARPANET officially adopted TCP/IP. This date is often considered the birthday of the Internet as we know it. The protocol allowed ARPANET to connect with other networks, including those at universities, international research institutions, and eventually commercial providers.

Email and Early Culture

Email was the Internet's first killer app. Developed by Ray Tomlinson in 1971 (who chose the @ symbol to separate user names from computer names), email quickly became the dominant use of ARPANET — far outstripping the scientific computing it was designed for. By the late 1970s, email accounted for 75 percent of ARPANET traffic.
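Tomlinson's @ convention survives unchanged today: everything before the @ names the mailbox, everything after names the host. A one-line illustration (the address here is in the early style and purely illustrative):

```python
# Split an address at the @ separator, as Tomlinson's convention prescribes.
address = "tomlinson@bbn-tenexa"  # illustrative early-style address
user, _, host = address.partition("@")
print(user)  # 'tomlinson'
print(host)  # 'bbn-tenexa'
```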

The early Internet culture was academic, collaborative, and informal. Usenet (1980) created a distributed discussion system. Internet Relay Chat (IRC) (1988) enabled real-time group communication. The ethos was open, decentralized, and suspicious of commercial influence — values that would be severely tested as the Internet grew.

The World Wide Web

The Internet and the World Wide Web are not the same thing — a distinction frequently confused. The Internet is the global network infrastructure. The World Wide Web is an application built on top of it — a system for accessing and linking documents using hypertext.

The Web was invented by Tim Berners-Lee, a British computer scientist working at CERN (the European particle physics laboratory near Geneva), in 1989–1991. Berners-Lee created three fundamental technologies: HTML (HyperText Markup Language, for creating web pages), HTTP (HyperText Transfer Protocol, for transmitting them), and the URL (Uniform Resource Locator, for addressing them). He also built the first web browser and the first web server.
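All three inventions meet in a single web address. Taking the real address of the first website at CERN as an example, the standard library's URL parser shows how the pieces divide: the scheme names the transfer protocol (HTTP), the host names the server, and the path names the document (which is itself HTML).

```python
from urllib.parse import urlsplit

# The address of the first website, still online at CERN.
url = "http://info.cern.ch/hypertext/WWW/TheProject.html"
parts = urlsplit(url)
print(parts.scheme)  # 'http'  -> the transfer protocol
print(parts.netloc)  # 'info.cern.ch'  -> the server hosting the page
print(parts.path)    # '/hypertext/WWW/TheProject.html'  -> the document
```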

Berners-Lee's crucial decision was to make the Web free and open — he did not patent or monetize his invention. "Had the technology been proprietary, and in my total control," he later wrote, "it would probably not have taken off." This decision, arguably the most consequential act of technological generosity in history, ensured that anyone could create a website, anyone could build a browser, and the Web would grow without centralized control.

The Browser Wars and Commercialization

The Web remained largely academic until 1993, when Marc Andreessen and colleagues at the University of Illinois created Mosaic — the first widely adopted graphical web browser, and the one that made the Web accessible to non-technical users. Andreessen went on to co-found Netscape, whose Navigator browser (1994) brought the Web to millions.

The commercialization of the Internet accelerated rapidly. Amazon (1994), Yahoo (1994), eBay (1995), and Google (1998) were founded in quick succession. The dot-com boom of the late 1990s saw frenzied investment in Internet companies, many of which had no viable business model. The dot-com bust of 2000–2001 destroyed hundreds of companies and wiped out trillions of dollars in market value — but the surviving companies emerged stronger, and the Internet's underlying growth continued.

The Social Internet

The 2000s brought the rise of social media and user-generated content. Wikipedia (2001) demonstrated that collaborative, open-source knowledge creation could work at scale. Facebook (2004), YouTube (2005), Twitter (2006), and Instagram (2010) transformed the Internet from a medium of consumption to one of participation. Smartphones, beginning with the iPhone (2007), made the Internet portable and ubiquitous.

The consequences were transformative and contradictory. Social media connected people across distances, enabled political organizing (the Arab Spring), and democratized media. It also enabled surveillance capitalism, disinformation campaigns, cyberbullying, and the erosion of shared factual reality.

Legacy

The Internet is arguably the most transformative technology since the printing press — and its revolution is still in its early stages. It has reorganized commerce (e-commerce now accounts for a significant share of global retail), transformed communication (email, messaging, video calls), disrupted media (newspapers, television, music), enabled new forms of work (remote work, the gig economy), and created entirely new industries (cloud computing, social media, streaming).

But the Internet's utopian promise — a global village of free information, open communication, and democratic empowerment — has been complicated by corporate consolidation, government surveillance, algorithmic manipulation, and the weaponization of information. The Internet that Cerf, Kahn, and Berners-Lee built was decentralized and open. The Internet of the 2020s is increasingly dominated by a handful of giant corporations and shaped by forces its creators neither intended nor desired.

The first message on the Internet was "LO" — an accident, a fragment, a beginning. The story of what comes next is still being written.

Tags: internet, arpanet, world-wide-web, tim-berners-lee, digital-revolution


About the Author

James Harrington

James Harrington is a public historian and former museum curator who makes history accessible to general audiences. He is passionate about American history and revolutionary movements.
