Technology is the application of conceptual knowledge to achieve practical goals, especially in a reproducible way. The word technology can also mean the products resulting from such efforts, including both tangible tools such as utensils or machines, and intangible ones such as software. Technology plays a critical role in science, engineering, and everyday life.
Technological advancements have led to significant changes in society. The earliest known technology is the stone tool, used during prehistory, followed by the control of fire, which in turn contributed to the growth of the human brain and the development of language during the Pleistocene, according to the cooking hypothesis. The invention of the wheel in the Bronze Age allowed greater travel and the creation of more complex machines. More recent technological inventions, including the printing press, telephone, and the Internet, have lowered barriers to communication and ushered in the knowledge economy.
While technology contributes to economic development and improves human prosperity, it can also have negative impacts like pollution and resource depletion, and can cause social harms like technological unemployment resulting from automation. As a result, philosophical and political debates about the role and use of technology, the ethics of technology, and ways to mitigate its downsides are ongoing.
In the 19th century, continental Europeans began using the terms Technik (German) or technique (French) to refer to a 'way of doing', which included all technical arts, such as dancing, navigation, or printing, whether or not they required tools or instruments. At the time, Technologie (German and French) referred either to the academic discipline studying the "methods of arts and crafts", or to the political discipline "intended to legislate on the functions of the arts and crafts." The distinction between Technik and Technologie is absent in English, and so both were translated as technology. The term was previously uncommon in English and mostly referred to the academic discipline, as in the Massachusetts Institute of Technology.
In the 20th century, as a result of scientific progress and the Second Industrial Revolution, technology stopped being considered a distinct academic discipline and took on its modern meaning: the systematic use of knowledge to practical ends.
The discovery of fire was described by Charles Darwin as "possibly the greatest ever made by man". Archaeological, dietary, and social evidence point to "continuous human fire-use" at least 1.5 Mya. Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the number of foods that could be eaten. The cooking hypothesis proposes that the ability to cook promoted an increase in hominid brain size, though some researchers find the evidence inconclusive. Archaeological evidence of hearths was dated to 790 kya; researchers believe this is likely to have intensified human socialization and may have contributed to the emergence of language.
Other technological advances made during the Paleolithic era include clothing and shelter. No consensus exists on the approximate time of adoption of either technology, but archaeologists have found evidence of clothing from 90–120 kya and of shelter from 450 kya. As the Paleolithic era progressed, dwellings became more sophisticated and elaborate; as early as 380 kya, humans were constructing temporary wood huts. Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa around 200 kya, initially moving to Eurasia.
With this increase in population and availability of labor came an increase in labor specialization. What triggered the progression from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer, is not specifically known; however, the emergence of increasingly hierarchical social structures and specialized labor, of trade and war among adjacent cultures, and the need for collective action to overcome environmental challenges such as irrigation, are all thought to have played a role.
The invention of writing led to the spread of cultural knowledge and became the basis for history, libraries, schools, and scientific research.
Continuing improvements led to the furnace and bellows and provided, for the first time, the ability to smelt and forge gold, copper, silver, and lead (native metals found in relatively pure form in nature). The advantages of copper tools over stone, bone, and wooden tools were quickly apparent to early humans, and native copper was probably used from near the beginning of Neolithic times (about 10 kya). Native copper does not naturally occur in large amounts, but copper ores are quite common and some of them produce metal easily when burned in wood or charcoal fires. Eventually, the working of metals led to the discovery of alloys such as bronze and brass (about 4,000 BCE). The first use of iron alloys such as steel dates to around 1,800 BCE.
Archaeologists estimate that the wheel was invented independently and concurrently in Mesopotamia (in present-day Iraq), the Northern Caucasus (Maykop culture), and Central Europe. Time estimates range from 5,500 to 3,000 BCE, with most experts putting it closer to 4,000 BCE. The oldest artifacts with drawings depicting wheeled carts date from about 3,500 BCE. As of 2024, the oldest known wooden wheel in the world is one found in the Ljubljana Marsh of Slovenia; Austrian experts have established that it is between 5,100 and 5,350 years old.
The invention of the wheel revolutionized trade and war. It did not take long to discover that wheeled wagons could be used to carry heavy loads. The ancient Sumerians used a potter's wheel and may have invented it. A stone pottery wheel found in the city-state of Ur dates to around 3,429 BCE, and even older fragments of wheel-thrown pottery have been found in the same area. Fast (rotary) potters' wheels enabled early mass production of pottery, but it was the use of the wheel as a transformer of energy (through water wheels, windmills, and even treadmills) that revolutionized the application of nonhuman power sources. The first two-wheeled carts were derived from travois and were first used in Mesopotamia and Iran in around 3,000 BCE.
The oldest known constructed roadways are the stone-paved streets of the city-state of Ur, dating to circa 4,000 BCE, and timber roads leading through the swamps of Glastonbury, England, dating to around the same period. The first long-distance road, which came into use around 3,500 BCE, spanned 2,400 km from the Persian Gulf to the Mediterranean Sea, but was not paved and was only partially maintained. In around 2,000 BCE, the Minoans on the Greek island of Crete built a 50 km road leading from the palace of Gortyn on the south side of the island, through the mountains, to the palace of Knossos on the north side of the island. Unlike the earlier road, the Minoan road was completely paved.

Ancient Minoan private homes had running water. A bathtub virtually identical to modern ones was unearthed at the Palace of Knossos. Several Minoan private homes also had toilets, which could be flushed by pouring water down the drain. The ancient Romans had many public flush toilets, which emptied into an extensive sewage system. The primary sewer in Rome was the Cloaca Maxima; construction began on it in the sixth century BCE and it is still in use today.
The ancient Romans also had a complex system of aqueducts, which were used to transport water across long distances. The first Roman aqueduct was built in 312 BCE. The eleventh and final ancient Roman aqueduct was built in 226 CE. Put together, the Roman aqueducts extended over 450 km, but less than 70 km of this was above ground and supported by arches.
The Renaissance era produced many innovations, including the introduction of the movable type printing press to Europe, which facilitated the communication of knowledge. Technology became increasingly influenced by science, beginning a cycle of mutual advancement.
The 20th century brought a host of innovations. In physics, the discovery of nuclear fission in the Atomic Age led to both nuclear weapons and nuclear power. Analog computers were invented and asserted dominance in processing complex data. While the invention of vacuum tubes allowed for digital computing with computers like the ENIAC, their sheer size precluded widespread use until innovations in quantum physics allowed for the invention of the transistor in 1947, which significantly compacted computers and led to the digital transition. Information technology, particularly optical fiber and optical amplifiers, allowed for simple and fast long-distance communication, which ushered in the Information Age and the birth of the Internet. The Space Age began with the launch of Sputnik 1 in 1957 and was followed by crewed missions to the Moon in the 1960s. Organized efforts to search for extraterrestrial intelligence have used radio telescopes to detect signs of technology use, or technosignatures, given off by alien civilizations. In medicine, new technologies were developed for diagnosis (CT, PET, and MRI scanning), treatment (like the dialysis machine, defibrillator, pacemaker, and a wide array of new pharmaceutical drugs), and research (like interferon cloning and DNA microarrays).
Complex manufacturing and construction techniques and organizations are needed to make and maintain more modern technologies, and entire industries have arisen to develop succeeding generations of increasingly more complex tools. Modern technology increasingly relies on training and education – their designers, builders, maintainers, and users often require sophisticated general and specific training. Moreover, these technologies have become so complex that entire fields have developed to support them, including engineering, medicine, and computer science; and other fields have become more complex, such as construction, transportation, and architecture.
Technologies have contributed to human welfare through increased prosperity, improved comfort and quality of life, and medical progress, but they can also disrupt existing social hierarchies, cause pollution, and harm individuals or groups.
Recent years have brought about a rise in social media's cultural prominence, with potential repercussions on democracy, and economic and social life. Early on, the internet was seen as a "liberation technology" that would democratize knowledge, improve access to education, and promote democracy. Modern research has turned to investigate the internet's downsides, including disinformation, polarization, hate speech, and propaganda.
Since the 1970s, technology's impact on the environment has been criticized, leading to a surge in investment in solar power, wind power, and other forms of clean energy.
Initially, technology was seen as an extension of the human organism that replicated or amplified bodily and mental faculties. Marx framed it as a tool used by capitalists to oppress the proletariat, but believed that technology would be a fundamentally liberating force once it was "freed from societal deformations". Second-wave philosophers like Ortega later shifted their focus from economics and politics to "daily life and living in a techno-material culture", arguing that technology could oppress "even the members of the bourgeoisie who were its ostensible masters and possessors." Third-stage philosophers like Don Ihde and Albert Borgmann represent a turn toward de-generalization and empiricism, and considered how humans can learn to live with technology.
Early scholarship on technology was split between two arguments: technological determinism, and social construction. Technological determinism is the idea that technologies cause unavoidable social changes. It usually encompasses a related argument, technological autonomy, which asserts that technological progress follows a natural progression and cannot be prevented. Social constructivists argue that technologies follow no natural progression, and are shaped by cultural values, laws, politics, and economic incentives. Modern scholarship has shifted towards an analysis of sociotechnical systems, "assemblages of things, people, practices, and meanings", looking at the value judgments that shape technology.
Cultural critic Neil Postman distinguished tool-using societies from technological societies and from what he called "technopolies", societies that are dominated by an ideology of technological and scientific progress to the detriment of other cultural practices, values, and world views. Herbert Marcuse and John Zerzan suggest that technological society will inevitably deprive us of our freedom and psychological health.
Prominent debates have surrounded genetically modified organisms, the use of robotic soldiers, algorithmic bias, and the issue of aligning AI behavior with human values.
Technology ethics encompasses several key fields: Bioethics looks at ethical issues surrounding biotechnologies and modern medicine, including cloning, human genetic engineering, and stem cell research. Computer ethics focuses on issues related to computing. Cyberethics explores internet-related issues like intellectual property rights, Internet privacy, and censorship. Nanoethics examines issues surrounding the alteration of matter at the atomic and molecular level in various disciplines including computer science, engineering, and biology. And engineering ethics deals with the professional standards of engineers, including software engineers and their moral responsibilities to the public.
A wide branch of technology ethics is concerned with the ethics of artificial intelligence: it includes robot ethics, which deals with ethical issues involved in the design, construction, use, and treatment of robots, as well as machine ethics, which is concerned with ensuring the ethical behavior of artificially intelligent agents.
Other fields of ethics have had to contend with technology-related issues, including military ethics, media ethics, and educational ethics.
In 2005, futurist Ray Kurzweil claimed the next technological revolution would rest upon advances in genetics, nanotechnology, and robotics, with robotics being the most impactful of the three technologies. Genetic engineering will allow far greater control over human biological nature through a process called directed evolution. Some thinkers believe that this may shatter our sense of self, and have urged for renewed public debate exploring the issue more thoroughly; others fear that directed evolution could lead to eugenics or extreme social inequality. Nanotechnology will grant us the ability to manipulate matter "at the molecular and atomic scale", which could allow us to reshape ourselves and our environment in fundamental ways. Nanobots could be used within the human body to destroy cancer cells or form new body parts, blurring the line between biology and technology. Autonomous robots have undergone rapid progress, and are expected to replace humans at many dangerous tasks, including search and rescue, bomb disposal, firefighting, and war.
Estimates on the advent of artificial general intelligence vary, but half of machine learning experts surveyed in 2018 believe that AI will "accomplish every task better and more cheaply" than humans by 2063, and automate all human jobs by 2140. This expected technological unemployment has led to calls for increased emphasis on computer science education and debates about universal basic income. Political science experts predict that this could lead to a rise in extremism, while others see it as an opportunity to usher in a post-scarcity economy.
The transhumanism movement is founded upon the "continued evolution of human life beyond its current human form" through science and technology, informed by "life-promoting principles and values."
Singularitarians believe that machine superintelligence will "accelerate technological progress" by orders of magnitude and "create even more intelligent entities ever faster", which may lead to a pace of societal and technological change that is "incomprehensible" to us. This event horizon is known as the technological singularity.
Major figures of techno-utopianism include Ray Kurzweil and Nick Bostrom. Techno-utopianism has attracted both praise and criticism from progressive, religious, and conservative thinkers.
The earliest known revolt against technology was Luddism, a pushback against early automation in textile production. Automation had resulted in a need for fewer workers, a process known as technological unemployment.
Between the 1970s and 1990s, American terrorist Ted Kaczynski carried out a series of bombings across America and published the Unabomber Manifesto denouncing technology's negative impacts on nature and human freedom. The essay resonated with a large part of the American public. It was partly inspired by Jacques Ellul's The Technological Society.
Some subcultures, like the off-the-grid movement, advocate a withdrawal from technology and a return to nature. The ecovillage movement seeks to reestablish harmony between technology and nature.
The direction of causality between scientific discovery and technological innovation has been debated by scientists, philosophers and policymakers. Because innovation is often undertaken at the edge of scientific knowledge, most technologies are not derived from scientific knowledge, but instead from engineering, tinkering and chance. For example, in the 1940s and 1950s, when knowledge of turbulent combustion or fluid dynamics was still crude, jet engines were invented through "running the device to destruction, analyzing what broke ... and repeating the process". Scientific explanations often follow technological developments rather than preceding them. Many discoveries also arose from pure chance, like the discovery of penicillin as a result of accidental lab contamination. Since the 1960s, the assumption that government funding of basic research would lead to the discovery of marketable technologies has lost credibility. Probabilist Nassim Taleb argues that national research programs that implement the notions of serendipity and convexity through frequent trial and error are more likely to lead to useful innovations than research that aims to reach specific outcomes.
Despite this, modern technology is increasingly reliant on deep, domain-specific scientific knowledge. In 1975, there was an average of one citation of scientific literature in every three patents granted in the U.S.; by 1989, this increased to an average of one citation per patent. The average was skewed upwards by patents related to the pharmaceutical industry, chemistry, and electronics. A 2021 analysis shows that patents that are based on scientific discoveries are on average 26% more valuable than equivalent non-science-based patents.