
The Quickening of Science Communication

Robert Lucky*

In the 16th century, science progressed at the pace of the postal system. Often it would take 6 months for one scientist to learn of the ongoing results of another. It took even more time for scientists to build on one another's accomplishments. The great Danish astronomer Tycho Brahe meted out his results more carefully than anyone before him, yet it was only after Brahe's death that Johannes Kepler was able to inherit the observational data he used to discern his laws of planetary motion. Today astronomers post data on the Web for instant worldwide access, and they routinely manipulate telescopes remotely through the Internet. Communication has always been the circulatory system of science, if not its very heartbeat. Yet even as progress in communications technology speeds the progress of science, there is a recursive relationship in which science improves communications. There is both a communication of science and a science of communications.

Communication Before Electricity
Today communications technology is almost synonymous with the electrical means of communicating: radio and television, the telephone and Internet. Yet science itself--defined as the quest for knowledge about nature--far predates the electrical era of the last century. The practitioners of science were already at work during the early evolution of written language, which is the fundamental underpinning of modern human communication. After all, the builders of Stonehenge demonstrated a remarkable knowledge of engineering and astronomy around 3100 B.C. This is almost contemporary with the beginnings of the Sumerian written language and surely predates the first syllabic writings, in the scripts Linear A and B. Usage of Linear A, the older of the two, dates to about 1800 B.C.

After the development of written language, the next stage of communications was the emergence of postal systems, which could transport that recorded language farther than the loudest voice. Historical references to postal systems date back to 2000 B.C. in ancient Egypt and to about 1000 B.C. in China. These ancient systems invariably employed relays of messengers, either mounted or on foot, and were a natural outgrowth of the need for administration of states and kingdoms. The most celebrated ancient postal system was the cursus publicus, which is credited with holding together the vast Roman Empire over a number of centuries before its disintegration.

Although a postal system provides the rudiments of individual communication, it does not enable broadcast or widespread knowledge dispersal. For that purpose, books have been a remarkable social instrument, and they've remained unchanged in concept from nearly the time written language arose until today. There are early books on clay tablets in Mesopotamia and Egyptian papyruses that date from the third millennium B.C. Without a system for access, however, the power to communicate via books was limited, which is why the rise of the library was so important. This, too, was an ancient innovation, dating back to at least the third century B.C. with the founding of the most famous library of antiquity, the Library of Alexandria.

Books had existed for thousands of years before the invention of movable type and the publication of the Gutenberg Bible in 1454. After the conception of written language, this may be the most important invention in the history of communications in general, and in science communication in particular. The printing press made possible the widespread dissemination of books, which had previously been the exclusive property of the rich and powerful. The subsequent history of science is filled with books of singular fame and influence, such as Newton's Principia Mathematica and Darwin's Origin of Species. Even in the fast and transient world of today, science is still marked by the publication of a succession of celebrated books that summarize the wisdom emerging from the ongoing multitude of research papers.

Books and journals have been the carriers of information in science over the centuries, but there is, in addition, a social infrastructure necessary for effective communications in any community. It may seem remarkable in reading histories of science that the scientists working in a particular field always seem to know each other and to correspond regularly, in spite of the seemingly insuperable barriers between countries and continents in the ages before airplanes, telephones, and radios. The world of a particular scientific field has always seemed to be a small place. The rule of six degrees of separation has been especially effective in science, where fame and acceptance of one's views by peers have been powerful motivating forces. Even today the millions of scientists worldwide naturally subdivide themselves into specialties where everyone knows everyone else. Science has become a busy system of small worlds.

The social infrastructure for the communication of science was first formalized with the founding of the Royal Society in London in 1660. This provided a venue and process for the discussion of scientific issues. Its publication, Philosophical Transactions, which debuted in 1665, was one of the earliest scientific periodicals in the West. The Royal Society's members, such as Isaac Newton, Edmond Halley, Robert Hooke, and Christopher Wren, are still remembered as the giants of scientific history. In contrast, science today is brimming with societies and journals in which most of the names on the mastheads are relatively unknown outside of their own fields of specialization.

Communication in the Age of Electricity
The science of communications is generally understood to have begun as late as 1837 with the invention of the telegraph by Samuel Morse.* (Curiously, Morse was a professor of painting; some of his artworks are on display in the world's leading museums.)

The major significance of the telegraph was that it severed the bond between transportation and communication. Before the invention of the telegraph, information could move only as fast as the physical means of transportation would allow. It took weeks before what was known in Europe became known in America. The telegraph enabled a kind of space-time convergence that brought countries and continents together, causing profound effects on economics and government policies. At one stroke, the life force of science--information--was freed of its leaden feet and allowed to fly at the speed of light. Conceptually, at least, advances in communications since then have simply made instant communication easier and more convenient.

In fact, the telegraph was awkward to use. Morse's famous code, which is a relatively efficient representation of the English alphabet, was a practical help, but it required transcribing natural language into another symbol system. (More importantly, it presaged an outpouring of theoretical studies of coding in the field of information theory nearly a century later.) Still, the telegrapher's experience was not much different from that of the writer using the postal service: Both had to compose carefully crafted messages and take them to the nearest telegraph or post office. In contrast, Alexander Graham Bell's telephone brought interactive and easily accessible communications directly to the user. At the time of Bell's invention in 1875, the United States was already densely wired with telegraph lines, and the Western Union Co. was one of the largest corporations on Earth.

The telegraph, of course, was undone by the telephone. Bell's invention was basically the idea of analog--that is, the transmitted voltage should be proportional to the air pressure from speech. In that era it was the right thing to do. After all, the world we experience appears to be analog. Quantities such as time, distance, voltage, and sound pressure seem to be continuous in value, whereas bits--ones and zeros, or dots and dashes--are artificial and seemingly unrepresentative of reality.

For over a century thereafter, the wires that marched across the nation would be connected to telephones, and the transmission of voiced information would be analog. Progress in telephony had mainly to do with extending coverage across the nation and meeting the social goal of universal service. The functionality of the telephone remained almost unchanged until the latter part of the 20th century.

Whether the pace of science was immediately changed by the telegraph and telephone is difficult to determine. The story of the British expeditions mounted during the solar eclipse of 1919 to measure the deflection of starlight at the sun's perimeter (a test of Albert Einstein's predictions based on his general theory of relativity) illustrates how the pace of science generally continued at a postal rate and yet could, on occasion, be electrified. Although the eclipse was on 29 May of that year, Einstein had still not heard of the results as late as 2 September. Then on 27 September he received a telegram from Hendrik Lorentz informing him that Arthur Eddington had found the predicted displacement of starlight. On 6 November there was a famous meeting of the Royal Society proclaiming the results more widely, but this time, because of the telegraphed news of that meeting, Einstein awoke in Berlin on the morning of 7 November to find himself instantly famous in the newspapers of the world.

While the telephone companies were wiring the world, there was a parallel evolution of wireless media beginning with Guglielmo Marconi's experiments in 1894. Marconi made James Clerk Maxwell's theoretical work a practical reality with the first demonstration of radio transmission. In 1901 he defied the prevailing understanding of line-of-sight transmission by successfully transmitting a signal across the Atlantic Ocean from Cornwall on the English coast to St. John's, Newfoundland. At that time there was no understanding of the importance of the reflective properties of the ionosphere in radio propagation. Viewed in retrospect, this was the kind of remarkable achievement sometimes made by determined amateurs who refuse to accept expert opinion.

In one of those curious juxtapositions of historical incidents, the sinking of the Titanic propelled radio into the limelight. Signals from the spark transmitter on the Titanic, picked up by neighboring ships, resulted in many lives being saved. Even more could have been saved had the nearest ship been attending to its radio, a fact that soon led to governmental regulations. A young telegrapher, David Sarnoff, came to fame by broadcasting the news of survivors; he later went on to head RCA and to play a major role in the world-changing developments of both radio and television.

Television was certainly the most influential communications medium of the last century. There came to be more television receivers in the world than telephones, and television broadcasts have purveyed culture to the farthest corners of Earth. The popularity and politics of science were affected by this new medium. The television broadcast of Neil Armstrong stepping on the moon on 20 July 1969 was a singular event in history, when people throughout the world were unified in their celebration of science. Moreover, educational television made heroes of individual scientists, such as Jacob Bronowski in The Ascent of Man, but especially Carl Sagan in Cosmos. Sagan became an icon of science, and with his death in 1996 the last popularly known scientist disappeared from public view.

As the wireless media expanded, so did the capacity of the terrestrial telecommunications facilities. What began as a single telephone conversation on a pair of copper wires has evolved into several hundred thousand conversations being carried by optical pulses over a glass fiber. Along the way, the copper wires were first supplanted by microwave radio relay systems. These relied on technology developed for radar during World War II, incorporating klystron and magnetron generators, the latter first developed in England. However, the microwave radio systems were soon surpassed in capacity by transmission over buried coaxial cables. Then, around 1960, Bell Labs began developing a millimeter waveguide system that was projected to meet all the demands for communications capacity well into the next millennium.

The millimeter waveguide system used a cylindrical metal pipe about 5 cm in diameter, with the transmission mode such that the electric field was zero at the inside circumference of the guide. In theory, such a mode suffered no loss with distance, but that held true only as long as there were no bends or imperfections. In those days, the capacity of this system was considered enormous, and such capacity would be needed for what was considered to be its dominant future usage--the Picturephone, which was undergoing final development at Bell Labs during the same period, the late 1960s. Neither of these major developments was ever a commercial success. The Picturephone, introduced commercially by AT&T in 1971, was a market failure. And the millimeter waveguide system was abandoned when it suddenly became evident that a fantastic, newly developed optical fiber offered higher capacity and lower cost.

Those optical fibers were key components of a leapfrog technology that had been waiting in the wings for more than a decade. The laser had been conceived by Arthur Schawlow and Charles Townes in 1958, and Theodore Maiman had built the first practical laser 2 years later. At first its use for communications had been envisioned as modulated beams in free space. At the time of the millimeter waveguide development, engineers considered using laser sources for systems of guided optical beams within pipes harboring lenses. Then in 1966, Charles Kao studied guided optical wave phenomena and predicted that low-loss optical fiber could be made, a prediction that was fulfilled by a team of materials researchers at Corning in 1970. That development changed the world, and by the end of the century the vast majority of all long-distance transmission was carried over optical fibers at data rates approaching a terabit per second (a trillion bits per second) per fiber.

The Era of Information
The focus of research in communications prior to 1960 had been on the physical media for transmission. Little attention had been paid to the content of that transmission--the information being conveyed. But in the last half-century, three themes came together to focus attention on that informational content--the development of microelectronics, the shift to a digital representation of information, and the rise of an information economy.

The technological seeds of the Information Age were sown in a fertile period after the end of World War II. In 1947, the transistor was invented at Bell Labs by William Shockley, John Bardeen, and Walter Brattain. At the same time, John Mauchly and John Eckert were assembling 18,000 vacuum tubes into the ENIAC computer at the University of Pennsylvania. Meanwhile, Claude Shannon at Bell Labs was writing a landmark paper on a theory of information, and telecommunications engineers were just beginning to consider the possible advantages of digital transmission using pulse code modulation, whose potential importance for encrypted transmission became apparent during the war.

At first the transistor was seen as a direct replacement for the vacuum tube, albeit smaller and requiring less power. But it became the first step on a pathway that constituted possibly the most significant technological development of the century. In 1958, Jack Kilby at Texas Instruments fabricated a chip with several transistors on a substrate, and the integrated circuit was born. Perhaps the transistors themselves were less important than the development of photolithographic fabrication, which enabled engineers to mass-produce on a microscale the wires and passive components that connected the active devices. Following Kilby's work was an inevitable evolution from calculator chips to microprocessors to personal computers. The availability, power, and digital form of this hardware enabled and shaped its applications in the subsequent Information Age.

In 1965 Gordon Moore, one of the founders of Intel and a pioneer of the Information Age, made an observation that has since become fundamental to business planning about technology evolution. This observation, called Moore's Law, states that there is a factor of 2 improvement in semiconductor technology every 18 months. This steady exponential improvement in the cost effectiveness of integrated circuits has maintained its pace almost exactly for more than 3 decades. Since 1965, this represents a gain of 2^24, or more than seven orders of magnitude. Compare this with, for example, the difference between walking and flying in a jet plane, which is only two orders of magnitude. This unprecedented scaling of technology has made possible the computer and Internet technology on which science relies today.

It seems conceivable that technological progress has always been exponential, but that Moore's Law brought it to our attention because an exact measure of progress--the dimensions of circuit features--became possible with the advent of microelectronics. Progress in a number of related technical fields is also exponential, with various doubling periods. For example, optical capacity doubles every 12 months, Internet traffic doubles every 6 months, wireless capacity doubles every 9 months, and so forth.
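To make the arithmetic behind these doubling periods concrete, here is a brief sketch in Python (an illustration added here, not part of the original argument; the doubling periods are simply the figures quoted above):

    def cumulative_gain(years, doubling_period_years):
        """Multiplicative gain after `years` of steady doubling."""
        return 2 ** (years / doubling_period_years)

    # Moore's Law: a doubling every 18 months, from 1965 to 2000.
    print(f"Moore's Law gain over 35 years: {cumulative_gain(35, 1.5):.2e}")  # about 1e7

    # The other doubling periods cited above, compounded over a single decade.
    for name, period_years in [("optical capacity", 1.0),
                               ("Internet traffic", 0.5),
                               ("wireless capacity", 0.75)]:
        print(f"{name}: x{cumulative_gain(10, period_years):,.0f} over 10 years")

Run as written, the sketch reproduces the roughly seven-orders-of-magnitude gain of Moore's Law since 1965 and shows how even a modestly shorter doubling period compounds into a vastly larger gain over a single decade.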

The electronics revolution fostered the development of computers, the rise of computer networks, and the digitization of information and media. Together they created the present digital networked economy. It is hard to separate these themes and to say where one leaves off and the other begins; their evolution continues unabated. What's more, today's World Wide Web has been cited as a counterexample to the well-known thesis that all major technological developments require 25 years for widespread availability, as was the case with radio, television, and the telephone. The Web, by contrast, became overwhelmingly popular in only a few years. Of course, the Web needed the ubiquitous infrastructure of the Internet, which in turn required widespread availability of computers, which required the microprocessor, which required integrated circuits, and so forth. Certainly, it all goes back to the transistor, although it seems possible to make an argument that takes everything back to some development in antiquity. To paraphrase Newton, "We are always standing on the shoulders of giants."

In their influence on how science is transacted, the Internet and World Wide Web have had the greatest impact of any communications medium since possibly the printing press. The telegraph, telephone, and wireless were not different in concept from the postal system, except that the modern technologies were so much faster. The postal system, telephone, and telegraph are also one-to-one topologies, connecting a single user to another single, predesignated user. On the other hand, radio and television are one-to-many topologies for the broadcast of a small number of common channels to a great many receivers. The Internet and Web are something else entirely.

The beauty and power of these new media are that they allow the formation of spontaneous communities of unacquainted users. Their topology is neither one to one nor one to many, but rather many to many. They allow the sharing of information in textual, graphic, and multimedia formats across these communities, and they empower users within these communities to build their own applications. It is this empowerment of the periphery that has opened the floodgates of innovation to millions. In all previous communications technologies the ability to innovate and craft new systems and applications was confined to a small number of industrial engineers who tended the centralized intelligence and functionality.

The key idea of the Internet--a simple, common core protocol with intelligence at the periphery--was the critical ingredient of the culture from which the Internet arose. In the 1960s, computer communications centered upon the development of modems to enable shared access to expensive central computers over the existing telephone infrastructure, which was circuit-switched and designed entirely around voice transmission. Packet transmission and the open architecture that characterized the U.S. Defense Department's experimental network, ARPAnet, at its inception in 1969 had to come from outside the traditional industry. We are fortunate today that government and academia led this development. That's a big part of the reason why today's Internet is not fragmented into proprietary subnets but is instead open to all on an equal basis. It has been said that the greatest invention of the computer industry was not the PC, but rather the idea of an open platform that allows different innovators to mix and match their hardware and software. The Internet did the same thing for the telecommunications industry.

The protocol that defines the Internet, TCP/IP, was written by Robert Kahn and Vinton Cerf in 1973. Its genius, perhaps better understood in retrospect, was that it perfectly obeyed the maxim of being as simple as possible, but no simpler. Consequently, the Internet is often viewed as an hourglass, with the multitude of physical media at the wide bottom and the plethora of applications at the wide top. The two are connected by the narrow neck of the Internet Protocol, IP, through which all communications must flow. It is a beautiful, flexible, and extensible architecture.

The most important application in the early days of the Internet turned out not to be access to time-shared computers but rather the simple e-mail that flowed between networked researchers. E-mail today remains a mainstay, but the World Wide Web became the sociotechnical invention that facilitated the kind of communications most important to science. It is perhaps not a coincidence that the Web came from a physics laboratory, CERN, where it was pioneered by Tim Berners-Lee in 1989. Subsequently, the first widely used graphical browser, Mosaic, was conceived at the National Center for Supercomputing Applications (NCSA) at the University of Illinois, Urbana-Champaign. This was followed by the commercial development of browsers and of search engines, which had originated at Carnegie Mellon, Stanford, and Berkeley. All of the ingredients of an information environment to promote scientific collaboration fell into place.

Science and the Web Today
The infrastructure for research and collaboration in science today through the Internet and Web is rich with power and promise. Yet at the same time it is often filled with frustration. Who could have dreamed a decade ago that we would have instant access to a billion documents from the comfort and privacy of our office or laptop? What a pleasure it is to do library research today! However, we all frequently have the experience of starting a search for some topic, only to get sidetracked into interesting but irrelevant material and never find what we initially were seeking.

It is a wonder that search engines, imperfect as they are, exist at all. What a daunting job--crawling the Web daily to retrieve and inventory those billion pages! Using various relevance criteria, search engines winnow those pages by about two-thirds, but then they must answer user queries that average only about two English words. It is not surprising that the links they return often overwhelm and undersatisfy the desires of the searcher. Some of the search improvements being pursued today include the use of information on context or popularity with other users, human categorization, and interactive search dialogs--improvements that are coming from the insight and innovation of scientists and engineers of many different disciplines.

The information access dream has often been stated as having the Library of Congress online. Unfortunately, the reality is very different. Web pages are full of junk and questionable material (as is the Library of Congress, for that matter) and often do not include the detailed material necessary for scientific research. Even though scientists are, as a group, entirely comfortable with computers and networks, they generally have been slow to provide online access to their journals.

One of the main obstacles has been the existing publication system, which depends on the sale of journals and magazines (particularly to libraries) to support its operation. Among Internet users a strong culture has evolved that believes that "information wants to be free." Like other businesses that rely on the sale and distribution of intellectual property--such as the publishing, music, and movie industries--science has yet to evolve a satisfactory economic model that defines and protects the rights of intellectual property owners in the face of the perfect copying and widespread distribution enabled by digital networking technology.

Several other characteristics of the scientific establishment have hindered Web access to current research results. One is the need for peer review in order to establish some judgment on the material. In a world increasingly filled with questionable and irrelevant material, the guidance of peers regarding what is genuinely worth our time to read and examine has become more critical than ever. Even though Web publication can be nearly instantaneous, peer review still proceeds at a human pace. Another serious obstacle has been the tenure committees at universities and their reluctance to give full credit to online publication. In spite of these obstacles, there are a number of branches of science where fellow researchers exchange their latest research results--through listservs and other mechanisms--nearly in real time.

The Internet also is providing new mechanisms to enable scientists to collaborate at a distance. Programs that implement "whiteboards," where participants can sketch on a shared visual space, have been available for several years. Remote sharing and manipulation of instruments and lab facilities, such as telescopes and microscopes, is commonly done today. Videoconferencing over the Internet is relatively simple at low resolution, but higher resolutions and multiparty calls remain difficult with today's technology. Considerable work is being done to establish architectures and protocols for efficient video broadcast trees. Nonetheless, telepresence may always remain, in many respects, a poor substitute for traditional face-to-face interaction.

In the current sociology of the Net, the notion of portals is popular. These are single Web sites that serve as gateways integrating the material of a particular field. In science, a good example of taking this concept to the next level is the Biology Workbench at the NCSA (biology.ncsa.uiuc.edu). Web visitors to the Biology Workbench can access a number of worldwide biology databases and can choose among the collected application programs to perform operations on these databases. Users also can view their results in customized ways, without needing to know where the applications are actually running.

The Net is also increasingly important for continuing education, the lifeblood of the scientific profession. A number of courses are being put on the Web, and both universities and commercial enterprises are getting into the business of packaging educational material for distance learning. The advantages are ease of access and the ability to search and speed through material for the most efficient use of time. Typically, these Web-based courses offer a "talking head" (which enhances involvement, if nothing else), graphics for slides, and a moving transcript of the lecture.

It is worth reflecting upon the momentous improvements in the infrastructure for scientific work that have occurred in the last several decades because of information technology. The ability to simulate and graph results, to ask "what ifs" that might never be answerable with traditional lab-bench experimentation, to share results instantaneously, to search the world's information archives, and to be educated at one's own pace remotely are just a few of those improvements. What a pleasure it is to work in science today, and how far we have come from the days of libraries and slide rules!

Future Surprises
No futurist predicted the emergence of the World Wide Web. Yet in retrospect it seems to be an obvious idea. This epitomizes the way information technology evolves. Although the underlying technology trends--the exponential increases in processing power and bandwidth--are predictable, the applications are not. While the technologists speak of bandwidth and dream of video-intensive applications, society focuses on e-mail and the Web. The ways in which we use information processing and telecommunications appear to emerge from the chaos of the social milieu.

Technologically, we are headed to a time when bandwidth and processing will be unlimited and free. Latency will approach the speed-of-light delay, and even that will pose a problem for distributed computing applications. Service quality will approach the "five nines" (99.999%) availability of the traditional telephone network. Encryption will finally be freed of its political restraints, and security and privacy will be assured.
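To put that availability figure in perspective, a back-of-the-envelope calculation (a Python sketch added for illustration; only the 99.999% figure comes from the text) shows that "five nines" permits only about five minutes of downtime per year:

    # Downtime allowed by "five nines" availability.
    availability = 0.99999
    minutes_per_year = 365.25 * 24 * 60              # about 525,960 minutes
    allowed_downtime_minutes = (1 - availability) * minutes_per_year
    print(f"{allowed_downtime_minutes:.1f} minutes of downtime per year")  # roughly 5.3 minutes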

The way we tap into the network will also change. Much of the access to future online environments will be wireless and untethered, as will be our lifestyles. Except for voice conversations, which will be seen as just another application of the Net, most communication will be asynchronous, happening at the convenience of the users, who are increasingly turning into nomads who need to tap into media wherever they happen to be. But these technical problems are relatively simple compared with the sociotechnical engineering required to improve the three dimensions of communications--human to information, human to human, and human to computer.

In improving access to information, greater digitization of archived material and better search methodologies are necessary, and we can expect significant improvements in both areas. However, science has always been about the murky chain of converting data to information, to knowledge, and finally to wisdom. This process is likely to remain human-centered.

In the second dimension, human to human, the aim of communications has always been to replicate the experience of a face-to-face relationship at a distance. No communications technology has yet preserved the important nuances of a face-to-face interaction. There is, however, no reason to believe that perfect replication of such interactions is either achievable or desirable. Conceivably, by mediating the human interaction, communications at a distance could, in some ways, be better than face to face. A simple example would be simultaneous language translation, or the augmentation of the dialog with computer information retrieved in real time by agents that automatically roam the Web in search of relevant data. Nor are all the nuances of a face-to-face dialog helpful. For example, early experiments with videoconferencing showed that it was easier to deceive others using video than with audio alone--gestures and facial expressions could be used to draw attention away from lies or faulty logic.

Finally, communication between humans and computers can be improved by speech technology, natural language understanding, and machine implementation of commonsense reasoning. However, even though next year, 2001, is the year of HAL from Stanley Kubrick's famous movie, we are nowhere near realizing HAL's capabilities for speech interaction. It will likely be some time before we start talking to computers like friends, and more time still before we think of computers as fellow scientists. Considering how HAL turned out in the movie, maybe this is just as well.

Further Reading

J. Abbate, Inventing the Internet (MIT Press, Cambridge, MA, 1999).

T. Berners-Lee, M. Fischetti, M. Dertouzos, Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web by Its Inventor (HarperSanFrancisco, San Francisco, CA, 1999).

J. Brooks, Telephone: The First Hundred Years (Harper & Row, New York, NY, 1976).

R. W. Clark, Einstein: The Life and Times (Avon Books, New York, NY, reissue 1999).

G. B. Dyson, Darwin Among the Machines: The Evolution of Global Intelligence (Perseus Books, New York, NY, 1998).


Robert Lucky is Corporate Vice President, Applied Research, at Telcordia Technologies. He joined Telcordia in 1992 after an extensive career at Bell Labs. He is an inventor and author in the field of communications, and is well known for his bimonthly columns in IEEE Spectrum Magazine and his numerous public speaking engagements.
*As with many great inventions, the genealogy of the telegraph is complex, and the attribution to a single inventor at a particular moment in time is a great oversimplification.


A Timeline of Science and Communications
B.C.
3500-3100 B.C.
Sumerians develop cuneiform writing using a stylus to etch wedge-shaped symbols into soft clay. Meanwhile, Egyptians develop hieroglyphic writing and use papyrus, the physical and etymological precursor of paper.
2000 B.C.
Early postal system is established in Egypt.
c. 1800 B.C.
Minoans write using Linear A, the first known syllabic script.
300 B.C.
The founding of the Library of Alexandria helps establish the practice of collecting knowledge and making it more widely accessible.
A.D.
1400s
1454
Johann Gutenberg uses movable type and a press to produce 300 bibles.
1500s
1500s-1600s
Postal systems proliferate in Europe.
1600s
1609
First regularly published newspaper appears in Germany.
1660
The Royal Society is founded and begins publishing its Philosophical Transactions 5 years later. They become models for other scientific societies.
1700s
1700
Gottfried Leibniz shows that a binary numeric system of 1's and 0's can be used to denote any number.
1732
Benjamin Franklin starts a circulating library.
1800s
1832
Charles Babbage conceives of the "Analytical Engine," forerunner to the computer.
1837
Samuel Morse patents his version of the telegraph and a year later creates his now-famous code of long and short electrical pulses to represent letters. In 1844, a telegraph connects Washington and Baltimore, and the following message is sent: "What hath God wrought?"
1847
George Boole mathematizes logical arguments.
1866
After a short-lived transatlantic cable laid in 1858 fails, the first reliable one is installed.
1873
James Clerk Maxwell publishes theory of electromagnetism.
1876
Alexander Bell patents the telephone.
1888
Heinrich Hertz observes radio waves.
1894
Guglielmo Marconi invents wireless telegraphy--radio. Seven years later, he sends a radio signal across the Atlantic Ocean.
1900s
1924
With 2.5 million radio sets in the United States, radio enters its Golden Age.
1925
Commercial picture facsimile radio service is established across the U.S.
1928
Television sets are put into three homes and programming begins. Regular TV broadcasts begin in 1939.
1946
John Mauchly and John Eckert build the first practical electronic digital computer--ENIAC, which stands for Electronic Numerical Integrator and Computer. It weighs nearly 30 tons and takes up 140 square meters of space.
1947
William Shockley, John Bardeen, and Walter Brattain invent the transistor at Bell Labs.
1948
Claude Shannon publishes landmark work on information theory, which becomes a theoretical cornerstone for the subsequent Information Age.
1958
Jack Kilby demonstrates the integrated circuit by fabricating several transistors onto a single substrate.
1960
Theodore Maiman builds the first practical laser, which had been conceived of earlier by Charles Townes, Arthur Schawlow, and others.
1962
The U.S. launches Telstar, the first true communications satellite. It could receive and then amplify and resend radio signals.
1965
Gordon Moore articulates what has become a famous law of technological development: There is a factor of 2 improvement in semiconductor technology every 18 months.
1969
ARPAnet, the seed of the Internet, becomes operational.
1970
A team of Corning researchers makes practical optical fibers that are transparent enough for communications.
1971
Intel introduces the first microprocessor, "a computer on a chip."
1973
Robert Kahn and Vinton Cerf formulate TCP/IP, the protocol suite that defines the Internet.
1975
The first personal computer, the Altair 8800, is marketed.
1980
CNN begins 24-hour news channel.
1980s to present
As computational power increases, computational science ascends to become a third arm of science, joining theoretical science and experimental science.
1981
IBM introduces the Personal Computer.
1983
Cellular phone network starts in U.S.
1989
Tim Berners-Lee and colleagues at CERN, the international particle physics laboratory near Geneva, Switzerland, create the Hypertext Transfer Protocol (HTTP), a standardized communication mode for computer networks. The World Wide Web is launched.
1993
Mosaic, the first user-friendly graphical Web browser, is released; it greatly accelerates the proliferation of Web users.
1990s
The Internet rapidly becomes a socially transforming technology.
For a much more extensive timeline, see www.mediahistory.com/time/alltime.html



Volume 289, Number 5477, Issue of 14 Jul 2000, pp. 259-264.
Copyright © 2000 by The American Association for the Advancement of Science.