The machine often called the world’s first computer, Herman Hollerith’s punch-card tabulating machine, began operating in 1890. It performed calculations for the U.S. Census Bureau.
Computer science has come a long way in the 125 years since. The industry achieved extraordinary progress by rapidly increasing the capacity, speed, and interactivity of computing equipment.
Five historical events made an outsized difference in that evolution:
1. Algorithmic Programming Introduced
In 1945, German engineer Konrad Zuse finished work on Plankalkül (“plan calculus”). It was the first programming language built around algorithms, according to the Computer History Museum. Although modern languages bear little resemblance to it, today’s programs continue to make extensive use of algorithms. Zuse had also created the first programmable digital computer, the Z3, in 1941. Mathematician Grace Hopper developed A-0, an early compiler, in the early 1950s. Unlike Plankalkül, her later language FLOW-MATIC let programmers control computers by entering statements that resembled English sentences, and it became a major influence on COBOL.
2. ARPANET Networking Plan Disseminated
Lawrence Roberts created and published his plan for the first large-scale computer network in 1967. He called for the system to use efficient packet-switching technology, according to the New Media Institute. Leonard Kleinrock, then a researcher at MIT, had laid the theoretical groundwork for this switching concept. Roberts’ paper helped make it possible to establish ARPANET two years later. Computer scientists installed the initial node at UCLA, and the network soon expanded to link computers at four research sites in California and Utah. Today, schools and businesses still use networks that incorporate some of ARPANET’s technology.
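The core packet-switching idea can be illustrated in a few lines of code. This is a loose, hypothetical sketch of the concept, not ARPANET’s actual protocol; the function names `packetize` and `reassemble` are invented for this example. A message is split into small, independently routable packets, each tagged with a sequence number so the receiver can rebuild the message even if packets arrive out of order.

```python
def packetize(message: str, size: int = 8) -> list[tuple[int, str]]:
    """Split a message into (sequence_number, payload) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Rebuild the original message, sorting by sequence number to
    tolerate out-of-order arrival."""
    return "".join(payload for _, payload in sorted(packets))

# Packets may traverse different routes and arrive shuffled;
# sequence numbers let the receiver restore the original order.
shuffled = list(reversed(packetize("HELLO ARPANET", size=4)))
original = reassemble(shuffled)
```

Because each packet is self-describing, no single dedicated circuit has to stay open for the whole conversation, which is what made the approach so efficient for shared networks.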
3. Personal Computers Invented
Compact computing equipment began to enter homes and small businesses during the 1970s. John Blankenbaker developed the original personal computer, the Kenbak-1, in 1971 and built it in his garage. It cost $750 and used switches and lights rather than a keyboard, and its memory could hold just 256 bytes. Customers ordered only about 40 units before Blankenbaker discontinued it. Unlike modern PCs, his device lacked a microprocessor. The Micral N became the first personal computer to incorporate this space-saving technology. This $1,700 French unit went into production in 1973, the year Kenbak went out of business.
4. Graphical User Interface Unveiled
Xerox introduced the Alto computer system in 1973. Although it was a research machine that never sold in volume, it offered a graphical user interface with many of the features that remain common in today’s operating systems: windows, pointers, menus, and icons. The Xerox Star and Apple Lisa also had GUIs, but they remained quite expensive and attracted few buyers. In 1984, Apple’s Macintosh became the first popular computer with a graphical interface. Digital Research and Microsoft introduced similar software the following year. By the early 1990s, most personal computers came with GUIs rather than relying on text-based command lines.
5. World Wide Web Developed
While working at CERN, the Swiss laboratory, in 1990, English computer scientist Tim Berners-Lee created the Web. He introduced URLs, the Hypertext Transfer Protocol (HTTP), and the markup language HTML. People continue to use these technologies to operate and access millions of websites. During the same year, Berners-Lee developed the original Web browser and established the world’s first website. His contributions made the Internet far more useful for people without extensive technical training. He still advocates for universal Web standards and a decentralized Internet without censorship.
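One reason these technologies endured is that HTTP is a simple, plain-text protocol. The sketch below is illustrative only (the helper name `build_get_request` is invented for this example, and real programs should use an HTTP library); it composes the kind of request a browser still sends today: a request line, headers, and a blank line. The host `info.cern.ch`, where Berners-Lee ran the first web server, is used as the example address.

```python
def build_get_request(host: str, path: str = "/") -> str:
    """Compose a minimal HTTP/1.0-style GET request: a request line,
    a Host header, and the blank line that ends the header section.
    (The Host header only became mandatory later, in HTTP/1.1.)"""
    return f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n"

# The raw text a client would send over a TCP connection to port 80:
request = build_get_request("info.cern.ch")
```

Because requests and responses are human-readable text, early developers could debug and extend the Web with nothing more than a terminal, which helped the protocol spread quickly.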
All of these historic developments greatly contributed to the advancement of computer science. As the technology continues to evolve, users benefit from ever greater portability and versatility. A machine that once filled an entire building can now fit inside a watch, smartphone, or graphing calculator.