▶ Your Answer:
The history of humanity has evolved around the problems that affect the human context of living, and around efforts to ameliorate them. In different historical periods we have turned to different disciplines of the arts and sciences for answers. For many centuries it was God who supposedly answered everything we asked. Since the monumental era in which Nietzsche declared that He is dead, we have turned to science. The past century in particular has seen a soaring tendency to configure solutions around one specific paradigm: technology, a broad term encompassing engineering applications and advances derived from findings in the natural sciences. Beginning in the late 19th century, we first observed the rise of mechanical inventions such as automobiles, telephones, and film cameras. Now, entering the era of the digital revolution, we not only witness but fully absorb ourselves into the digital innovations of the late 20th and 21st centuries: personal computers, the internet, smartphones, drones, self-driving cars, and a list that shows no sign of ending anytime soon.
It is in this contemporary context of a ceaseless, almost obsessive culture of technological innovation that the statement connects the prevalence of technology in our modes of living to “the ability of humans to think for themselves.” The connection assumes that technology stands in a fundamentally antithetical position to the cerebral capacity of humans. This assumption is not groundless when we consider, for instance, the day of an average American. He wakes to an alarm, not to a built-in natural body clock; a mechanical device then transports him from home to school or work; and for most of his education, work, and leisure, his eyes are likely fixed on screens varying in size from 5 to 20 inches that seem to supply all the knowledge and connections contemporary humans seek. From physiology to transportation, from leisure to labor, technological devices are ubiquitous. Given the prevalence of technology in a contemporary developed society, the statement, admittedly, is not without reason.
However, the sheer omnipresence of technological inventions does not necessarily debilitate the human capacity to think; in other words, technology is not antithetical to that capacity. In all the aforementioned areas of the human context that technology has permeated, from physiology to transportation to labor, human cerebral functions have not been precluded by technological innovation; instead, technology engages the human intellect on a more proactive, dynamic level. Consider transportation, for example. As cities expand with networks of highways, public transit, and even airways, the sheer range of human mobility has increased tremendously. This expansion exposes a person to a myriad of cultures, contexts, and other people to interact with, a phenomenon preposterously unimaginable only a century ago. Humans of the 19th century, preoccupied with the operational and logistical concerns of transportation, presumably by horse-drawn carriage, would have spent an equal amount of time merely moving from point A to point B. Freed from such mechanical obstacles, humans of the 21st century can focus on projects that require truly creative thinking, such as global inequality, ecological sustainability, and extraterrestrial investigation, in all of which humanity has demonstrated laudable progress unforeseen in history.
Further, in configuring the novel paradigms and aesthetics of the millennium, technology is not a hindrance but the chief impetus to imagine and think. Beginning in fin-de-siècle Europe, the invention of chemical photography pushed against the limit and preconception that had dominated the medium of painting for a dozen centuries: representation. As photographic gadgets replaced the traditional role of painting, the old medium began to configure its own strategy for expressing itself in the larger cultural landscape of art. The birth of modern art was prompted by photographic technology. Not only did photographers start to translate creativity into photographic expression, but traditional artists did as well. Painters imagined Impressionist, Cubist, and Surrealist worlds in which the real appearances of objects and surroundings were elevated into visual alternatives, where cerebral faculties such as perception and the unconscious chiefly guided artistic creation. Now, amid a set of phenomena collectively termed the digital revolution, the active exploration of human imagination has never relished more freedom. We not only watch movies of animated 3D figures roaming a supposedly prehistoric Pangaea, but also expect goods to be delivered door-to-door by self-operating flying machines. How could our intellectual lives possibly be any wilder?
The contention about the “side effects” of technology is a widely claimed illusion. What makes us human has always been our use of tools, whether neolithic blades or brain-freezing biomedical “miracles.” One should, however, be reminded that distraction that numbs human minds has also long existed in every human society, across all geographical and historical borders. In contemporary society, distractions surely exist as ubiquitously as technology itself. Yet a crucial reminder to our concerned contemporaries is what now reads as a farcical worry of the 17th century: the widespread circulation of literary fiction, prompted by mechanical advances in printing technology, was condemned for dumbing down humans. When technology so wildly and widely frees human capacity, both operationally and intellectually, one should not mistake the methodology for the argument.