MBA. Hellen Ruiz Hidalgo
Strategic Communicator
Foreign Trade Observatory (OCEX)
Vice-Rectory for Research - Distance Learning State University (UNED)
Introduction. Trade has always been the fundamental form of relationship between peoples. Its functioning has been linked to transformations in production, transport logistics, and communication. In these three areas, each revolution has resulted in corresponding changes in commercial operations. The introduction of computers has progressively revolutionized social, political, economic, educational, and productive life. Its impact on trade has been formidable, as has its effect on social cohesion, human relationships, and interpersonal and international communication. Hence the importance of looking back at the scientific, technological, and practical evolution of computing from its beginnings to the present day, highlighting its impact on trade at every step.
We will divide our journey into four sections:
1) In this first section, we will take a historical tour of the technological processes of computing, up to the beginnings of the great changes heralded by Artificial Intelligence.
2) A second section will focus on the direct and indirect impacts of computing on international trade.
3) In a later section, we will address the “State of Affairs” of Artificial Intelligence today.
4) We will address the geopolitics of computing and artificial intelligence in the final section of this series, where we will focus on the race between the United States and China for dominance in Artificial Intelligence.
General impacts of computing on modern life and trade. Without attempting to be exhaustive in any way, we will take a brief look at several fields of activity that would be unthinkable without the presence of computing or whose scope has been revolutionized by it.
- Let's start with the most basic administrative activities, where word processors, spreadsheets, and presentation templates have facilitated and revolutionized office work, as well as academic and educational tasks, which would be unthinkable without these tools.
- In the field of health, the use of computers in electronic instruments, together with big-data management and advances in robotics, has led to applications ranging from the early detection of skin diseases to computed axial tomography, a decisive diagnostic tool. Computing also supports laboratory analysis and even surgery: remotely operated “surgical robots” such as “Da Vinci” allow cardiac operations to be performed with greater precision than the unaided human hand, while offering the surgeon a three-dimensional view.
- In sociology and the formation of human relationships, participation in interactive social networks has reached demographically universal levels. Politics, too, has seen networks and computer-based communication systems play an important role in the manipulation of electoral decisions. A relevant case was that of “Cambridge Analytica,” a firm that used knowledge of voters' internet and social media preferences to direct their attention toward like-minded community bubbles for electoral purposes. It is said to have been a decisive factor in Donald Trump's first victory in 2016.
- In today's political electoral marketing, the use of computer-based tools is universal.
- In the field of advertising and consumer preference management, computing and its applications are common tools for segmenting markets or targeting product proposals.
- In manufacturing, computing has been introduced not only into automated production processes but also into the components of the manufactured items themselves. Cars, washing machines, refrigerators, and so on, made in computer-automated processes, owe their functional advances to the chips in their mechanisms; indeed, the chip supply crisis caused by the transport shutdown during the COVID-19 pandemic brought many industries to a standstill.
- Interpersonal communication via smartphones has been taken to levels that were unimaginable just a few years ago in terms of cost and distance.
- In the area of file storage and management, the ability to store documents digitally on distant, massive computer servers has created the innovative concept of the “cloud,” a cyber space for personal and business storage.
- When it comes to immediate and widespread access, we cannot fail to mention internet search engines, which provide universal access to the accumulated store of human historical, literary, musical, and visual culture.
The case of the universal encyclopedia Wikipedia. The concept that sums up this whole transformation is that of the “digital society,” but also that of the “knowledge society.” A tool such as Wikipedia has contributed greatly to the latter designation. It is an encyclopedia created collaboratively by tens of thousands of users and fed with all kinds of material likely to arouse interest. Wikipedia has largely replaced, and arguably improved upon, reference works once produced exclusively by professional thinkers and academics.
It is worth briefly mentioning its analog antecedents. In 1751, during the French Enlightenment, under the leadership of Denis Diderot and Jean le Rond d'Alembert, more than 200 scientists, writers, philosophers, and university professors of the highest caliber began to write and publish “The Encyclopedia, or Reasoned Dictionary of the Sciences, Arts, and Crafts,” completing it 21 years later, in 1772.
Since then, almost all developed countries have imitated this effort, improving and updating it, most famously in the Encyclopaedia Britannica, updated annually and counting personalities such as Marie Curie, Albert Einstein, and Henry Ford among its contributors. However, it suspended its print edition in 2012 in the face of competition from Wikipedia, though it still maintains a digital edition.
Wikipedia is considered to have surpassed all printed encyclopedias, in both number of articles and scientific quality. By one count, it adds some 500 new articles per day, currently totaling more than 5 million, and receives around 21 edits and updates per minute.
The link between computing and industry and commerce. Computing applications have been developed in every industry, and these same applications have generated industries of their own. They have created new needs that have resulted in new demands, from which a continuous stream of specialized market niches has emerged. But computing itself has also revolutionized the forces of production, increasing competitiveness and generating new forms of comparative advantage. In turn, international trade, both wholesale and retail, has needed to develop its own applications, revolutionizing the relationships between countries, as well as between producer, distributor, and consumer.
From this introduction, we can deduce the importance of knowing the history and development of computing to the present day. Its trajectory, its components, and its rapid introduction into production processes and human life are not public knowledge. Its impacts are known, and its products are used without questioning their origin. Computing has become an internalized assumption of social life, which has robbed humanity of its capacity for wonder. That is why we have decided to dedicate this capsule to the topic of its birth, evolution, and impact, particularly on commerce.
The productive and commercial premises that preceded computing. A long time passes between the introduction of new technological systems and their full productive use. For example, at the end of the 17th century and the beginning of the 18th, the first productive uses of the steam engine were developed: it was used to pump water out of coal mines, giving faster access to the coal. However, it took more than a century for it to be widely used in European countries. When it was, it marked a turning point in economic and commercial history, changing and influencing all aspects of social and political life in what came to be called the “First Industrial Revolution.”
Industrial and agricultural production multiplied, and new social classes, such as the proletariat and the bourgeoisie, emerged from the new industrial development. The new social divisions that arose gave rise to the political transformation of the old feudal regime, and after great political revolutions, liberal democracy was born. The national social and economic transformations were accompanied by commercial revolutions. Trade dominated by the industrial powers determined the division of the world into industrialized countries and dependent countries. Colonialism emerged, extracting raw materials to feed the industrial machinery of the colonial metropolises. At times, trade wars broke out: what was called “gunboat trade,” where military force compelled countries to open up to unequal trade. These were the times that preceded the introduction of computing into the productive and social universe.
The prelude to computing and its first steps. It took a long time, practically 200 years, before new technological, productive, and social factors appeared that would again revolutionize all systems of organization and production. The culmination of this new transformation is the development and widespread application of computing systems, whose most advanced stage has been called Artificial Intelligence.
In 1937, Claude Shannon completed his master's thesis (published the following year), which laid the mathematical foundation for developments in programming. In it, Shannon linked Boole's mathematical logic with the possibility of generating instructions for electrical circuits. Boole's system was based on binary true-false operations, which could be mapped onto an electrical switch: connected or disconnected. This was the mathematical and technological basis that would enable computing by linking a binary system (programming, software) to an electrical circuit (processor, hardware).
This brilliant concept, which connects a mathematical instruction with an electrical circuit in which a switch represents the two states of the binary system, demonstrated that Boole's logic could enable a machine to perform complex logical and algebraic operations. This relationship between binary programming and electrical circuits became the basis of computing; hence the term “digital,” that is, binary.
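To make the idea concrete, here is a minimal sketch in Python (the function names are ours, purely illustrative). Boole's operations behave exactly like switch circuits: AND is two switches in series, OR is two switches in parallel, and a handful of such gates already yields arithmetic, as in the classic half-adder that sums two bits.

```python
# Boole's binary operations modeled as switch circuits:
# a switch is either closed (1) or open (0).

def AND(a: int, b: int) -> int:
    """Two switches in series: current flows only if both are closed."""
    return a & b

def OR(a: int, b: int) -> int:
    """Two switches in parallel: current flows if either is closed."""
    return a | b

def NOT(a: int) -> int:
    """An inverting relay: closed becomes open and vice versa."""
    return 1 - a

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Binary addition of two bits built purely from the gates above."""
    sum_bit = AND(OR(a, b), NOT(AND(a, b)))  # this combination is XOR
    carry = AND(a, b)
    return sum_bit, carry

# 1 + 1 = 10 in binary: sum bit 0, carry bit 1.
print(half_adder(1, 1))  # -> (0, 1)
```

Chaining half-adders into full adders yields multi-bit arithmetic, which is essentially what Shannon showed switching circuits could compute.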
In the beginning, it was only a matter of designing simple instructions to perform calculations or execute commands on electromechanical devices. But, even in that elementary form, it was already a way of reproducing in a machine the processes the human mind performs to execute a task. Hence, from the outset, the concept of “intelligence” was attached to this work.
In fact, the earliest chapter of modern computing was born of the need for statistical methods to solve practical problems, such as the storage and retrieval of statistical data on the population. At that stage, however, the binary mathematical logic of Boole's system had not yet been linked to the simple mechanics of electrical circuits. This made implementation impossible, but it left the problem on the table, and with it half of its solution.
War, maritime trade, and computing. The link between the development of computing and maritime transport of goods is interesting. World War II created the political conditions for the state to finance the investment necessary for the construction of electromechanical devices that would allow the decoding of encrypted messages.
England received commercial and military support from the United States by sea, with large merchant ships traveling in convoys across the Atlantic. Germany attacked and sank them with its submarine fleet. The convoys were detected from the air, and their positions were transmitted to Berlin, which relayed them to its submarines; all of this traffic was encrypted. For the aid to reach England, the encryption of German military messages had to be broken.
The private market, on its own, would never have made such an investment because it would take many years before computing yielded returns capable of recouping large investments. This is the type of expenditure that only the state is capable of making in emergency situations.
In 1936, three years before the outbreak of the world conflict, the mathematical genius Alan Turing published the theoretical foundations of computing; by the start of the war he was working for the British Intelligence Service's Government Code & Cypher School (GC&CS), and he was familiar with Shannon's work. At the Bletchley Park mansion, Turing, in collaboration with Gordon Welchman and the best English electrical engineers, developed an electromechanical machine called the “Bombe” that could decipher the messages exchanged between Berlin and the German submarines, which were encrypted with a complex rotor machine called “Enigma.”
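Enigma's real mechanism was far more elaborate (several rotors, a plugboard, a reflector), but a toy single-rotor cipher, sketched below in Python with an illustrative wiring of our own choosing, conveys the core difficulty: each key press shifts the substitution alphabet, so the same letter encrypts differently every time, and breaking the cipher means recovering the wiring and the rotor's starting position.

```python
import string

ALPHABET = string.ascii_uppercase

# One fixed permutation of the alphabet standing in for rotor wiring
# (illustrative only; the real Enigma combined several such rotors).
WIRING = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"

def rotor_encrypt(message: str, start: int = 0) -> str:
    """Toy single-rotor cipher: substitute each letter through the wiring,
    then step the rotor so the next letter uses a shifted mapping."""
    out, pos = [], start
    for ch in message:
        if ch in ALPHABET:
            out.append(WIRING[(ALPHABET.index(ch) + pos) % 26])
            pos += 1  # the rotor advances on every key press
        else:
            out.append(ch)
    return "".join(out)

# The repeated T's and A's encrypt to different letters each time.
print(rotor_encrypt("ATTACK AT DAWN"))
```

The Bombe, in essence, automated the search through possible rotor configurations until the decrypted traffic looked like plausible German.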
The first computers and their limitations. On that basis, work began in 1943, still during the war, on what is often considered the first modern programmable electronic computer, ENIAC (Electronic Numerical Integrator And Computer), completed in 1945. It was a huge technological device weighing 30 tons, with 18,000 vacuum tubes, occupying 170 m².
In 1945, the mathematician John von Neumann summarized the basics of computing in three basic principles (a toy sketch follows the list):
- having a memory that integrates instructions and data,
- having a fixed internal unit for calculating logical operations, and
- controlling data management to decide on actions based on the instructions received and the results of previous operations.
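As a rough illustration, and using a toy instruction set of our own rather than any historical machine's, the three principles fit in a few lines of Python: one memory array holds both the program and its data, a fixed arithmetic unit does the calculating, and a control loop fetches each instruction and decides what happens next.

```python
def run(memory: list) -> list:
    """Toy von Neumann machine: a single memory holds both instructions
    and data, a fixed unit performs arithmetic, and the control loop
    chooses the next action from instructions and previous results."""
    pc = 0  # program counter: the control unit's position in memory
    while True:
        op, a, b, dst = memory[pc]  # fetch the next instruction from memory
        if op == "ADD":
            memory[dst] = memory[a] + memory[b]  # the arithmetic unit at work
        elif op == "JUMP_IF_ZERO":
            if memory[a] == 0:  # control depends on a previous result
                pc = b
                continue
        elif op == "HALT":
            return memory
        pc += 1

# Addresses 0-1 hold code; addresses 2-4 hold data. Same memory for both.
program = [
    ("ADD", 2, 3, 4),    # memory[4] = memory[2] + memory[3]
    ("HALT", 0, 0, 0),
    2, 40, 0,            # data: two operands and a result cell
]
print(run(program)[4])   # -> 42
```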
With that, computing could develop. It was not surprising that, after the war, and in parallel with the logical, mathematical, and technological development of computing, discussions began about the possibility of building “machines” with the ability to mimic human mental processes.
The electrical circuits on which the first computers were based were composed of fragile glass vacuum tubes with a precarious operating life of some 48 hours. This presented the first obstacle: unaffordable costs and enormous energy consumption. Added to this, from a commercial point of view, was the absence of market demand, as initial applications were limited and costs prohibitive.
In 1947, this obstacle began to be overcome with the invention of the transistor, a semiconductor device much smaller than the tubes it replaced, with a longer life and lower energy consumption. Later, transistors were flattened and “printed” for mass production, and printed circuits were designed into which they were inserted.
Personal computers and their impact on international trade. Transistors laid the initial technical foundations for producing machines that could be “instructed” (programmed) to perform simple processes similar to human mental processes.
Even so, computing power was limited by the size of transistors. This constraint eased as it became possible to make transistors smaller and smaller: integrated circuits (chips) could then contain a greater number of them, and with more transistors per chip, computers could perform more operations per second.
It was the evolution of chips that drove down the cost and size of computers, making them accessible to the public and to businesses in general, creating a new international market in which the most developed countries would compete, and an increasingly universal consumer market.
Moore's Law. This evolution has been astonishing and has so far followed the deductive-intuitive prediction of the famous Moore's Law. In 1965, Gordon Moore, then director of research and development at Fairchild Semiconductor, wrote an article in Electronics magazine titled “Cramming more components onto integrated circuits.” In it, he showed that, from 1959 until then, the number of transistors (components) that could fit on an integrated circuit, i.e., on a chip, had doubled every year, and he predicted that this trend would continue for at least another 10 years.
This obviously forms a geometric progression that quickly reaches enormous figures and, from a physical point of view, cannot continue to infinity. What Moore stated was a simple extrapolation from his own observation: it rested on no technological premise, only on what had happened over the previous few years, and there was no particular reason to think it would continue. But his intuition proved correct, and that is why this progression in the number of transistors that fit on a chip is called “Moore's Law.” It is thought that the law is now reaching its physical limits, at least with current technologies, but so far it has held true.
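The arithmetic behind the law is a simple geometric progression: if the count doubles every T years, then after t years it grows by a factor of 2^(t/T). A minimal sketch in Python, using the rough figures from Moore's 1965 article (on the order of 64 components on the densest chip, doubling yearly since 1959):

```python
def moore(n0: float, year0: int, year: int, doubling_years: float = 1.0) -> float:
    """Components per chip projected under a fixed doubling period."""
    return n0 * 2 ** ((year - year0) / doubling_years)

# Starting from Moore's 1965 data point of roughly 64 components:
for year in (1959, 1965, 1975):
    print(year, round(moore(64, 1965, year)))
# 1959 -> 1, 1965 -> 64, 1975 -> 65536: the progression explodes quickly.
```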
Evolution of the number of transistors on a chip and its commercial impact. The result has been a systematic decrease in the size of transistors, allowing more of them to “fit” on the same chip, whose computational architecture has also evolved. The figures are simply astonishing.
The first such chip, which came to be called a microprocessor (its transistors measured in microns), was produced by Intel Corporation in 1971. Intel had been founded just a few years earlier by Moore himself, who understood the market potential of his law: ever-cheaper components would cut the cost of building computers.
This advance laid the technological foundations for the mass distribution of computers and the creation of a new market niche. This was followed by the corresponding programming that would create new needs for an increase in brands, industries, applications, and jobs.
In the 1970s, relatively inexpensive computers designed for individual users began to be developed, such as the Kenbak-1 and the Altair 8800. But it was not until 1977, when Apple released the Apple II, that the popularization of easy-to-use computers accessible to the general public truly began. However, Apple's closed model held back the development of mass software applications for different tasks, which slowed its spread.
This began to be overcome in 1981, when IBM launched a model that ended up becoming the de facto standard for personal computers. Its open architecture, running operating systems such as MS-DOS and later Windows, marked a significant change in the industry, as it facilitated free programming for different tasks, many of them linked to production. From there, a huge market was created, and the mass adoption of computers for personal, commercial, industrial, and other uses followed.
The process continued. Chips with more and more transistors boosted data processing speed and enabled increasingly sophisticated applications. We will not follow that process in detail, but we can jump to the present to compare the path taken.
Comparison of advances in the number of transistors in early chips with current chips. Intel's first chip contained 2,300 transistors and could perform 60,000 operations per second at a clock speed of 740 kHz. Today, Apple's Mac Pro computer carries the M1 Ultra processor (chip), with 114 billion transistors and a processing speed of up to 22 trillion operations per second at a clock speed of 3.2 GHz. The M1 Ultra is manufactured on a 5-nanometer process by TSMC (Taiwan Semiconductor Manufacturing Company), the leading company in the miniaturized production of transistors.
When considering this comparison, it should be noted that we have compared Intel's first chip with one of TSMC's latest, both destined for small personal computers. Supercomputer chips go much further: a modern supercomputer may run on chips such as the Wafer Scale Engine (WSE), which integrates no less than 1.2 trillion transistors on a single wafer-sized chip, roughly ten times as many as the M1 Ultra.
That number of transistors can fit on a chip because they have become incredibly small. Transistors in 1971 measured 10 micrometers (a micrometer is one millionth of a meter). Today's smallest transistors are built on 2-nanometer processes (a nanometer is one thousandth of a micrometer), making them some 5,000 times smaller than the transistors of 1971.
To compare that with the thickness of a human hair: a hair's diameter is between 80,000 and 100,000 nanometers. Building materials this small means the process can only be carried out by specialized machinery that is itself built and controlled by software.
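A back-of-the-envelope check in Python, using only the figures quoted in this section, shows how closely this trajectory tracks the two-year variant of Moore's Law, and recovers the 5,000-fold size reduction just mentioned:

```python
import math

# From Intel's first microprocessor (1971, 2,300 transistors)
# to Apple's M1 Ultra (2022, 114 billion transistors).
growth = 114e9 / 2300
doublings = math.log2(growth)  # how many times the count doubled
print(f"growth factor:      {growth:.1e}")      # ~5.0e+07
print(f"doublings:          {doublings:.1f}")   # ~25.6
print(f"years per doubling: {(2022 - 1971) / doublings:.1f}")  # ~2.0

# Size ratio: a 10-micrometer transistor (1971) versus a 2-nanometer one.
print(10_000 / 2)  # 10 um = 10,000 nm -> 5,000 times smaller
```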
It should be noted that such technological feats, with current techniques, push the limits of what is possible. They depend on just two large companies: the Dutch ASML (Advanced Semiconductor Materials Lithography), the dominant supplier of integrated-circuit lithography systems, and the Taiwanese TSMC (Taiwan Semiconductor Manufacturing Company), which fabricates the chips.
It is also worth mentioning that this chip production chain supplies all the major computing companies, large and small, such as Apple, Nvidia, Qualcomm, AMD, etc. This creates a contrast between the high concentration of high-tech chip manufacturing and the gigantic, ever-expanding value chain of international trade, which has universalized the use of computing in all areas of productive, social, and commercial life.
Computing has had a profound and transformative impact on international trade. The adoption of digital technologies and efficient information management are now mandatory components for all types of companies and all forms of international trade. They bring particular dynamism to trade in services, where the impact has been even greater than in trade in goods.
By boosting efficiency, reducing costs, and connecting businesses and consumers globally, digitization has facilitated process automation, supply chain optimization, and the expansion of e-commerce.
The second part of this capsule will analyze this relationship between computing and international trade in greater depth, focusing on the quantitative analyses carried out by the World Trade Organization, which examine the effects of three trends:
- the impact of robotization and digitization on the redistribution of tasks in production,
- increased use of ICT services in the rest of the economy with their expansion into production processes, and
- the fall in trade costs as a result of technological changes.
References:
- Negocios Internacionales: Historia, Importancia y Principales Relaciones Comerciales. Revista Raia Diplomática, November 18, 2024. Available at: https://raiadiplomatica.info/2024/11/18/negocios-internacionales-historia-importancia-y-principales-relaciones-comerciales/#
- El impacto del avance tecnológico en el comercio internacional de México. Revista Debate Económico, Vol. 6 (2), No. 17, May-August 2017, pp. 95-111. Available at: https://debateeconomico.org/wp-content/uploads/2023/02/17-05-avance-tecnologico.pdf
- Las enciclopedias: el caso particular de la Wikipedia. ResearchGate, 2010. Available at: https://www.researchgate.net/publication/216292893_Cap_9_Las_enciclopedias_el_caso_particular_de_la_wikipedia
- The Nature of Information Technology related Industrial Attachment in Kenya. International Journal of Advanced Research in Computer and Communication Engineering, 2023. Available at: https://www.researchgate.net/publication/375355280_The_Nature_of_Information_Technology_related_Industrial_Attachment_in_Kenya
- Fundamentos de la Computación: Evolución de las Computadoras. Universidad Nacional Autónoma de México, 2010. Available at: https://gc.scalahed.com/recursos/files/r161r/w23942w/unidad_1/Evolucion_de_las_computadoras.pdf
- Desarrollo histórico de los dispositivos de cómputo. Conalep mexicano italiano. Available at: https://www.geocities.ws/gildardoaguilar/iisc1.pdf
- Las generaciones de ordenadores. ACTA (Autores científico-técnicos y académicos). Available at: https://www.acta.es/medios/articulos/informatica_y_computacion/031035.pdf
- Impacto de las tecnologías digitales en la transformación del comercio internacional. Economía Internacional: cuadernos de información económica. Available at: https://www.iberglobal.com/files/2019-1/tecnologias_digitales_comercio_poncela.pdf
- Ley de Moore, nanotecnología y nanociencias: síntesis y modificación de nanopartículas mediante la implantación de iones. Revista Digital Universitaria UNAM, Vol. 6, No. 7, July 10, 2005. Available at: https://www.revista.unam.mx/vol.6/num7/art65/jul_art65.pdf
- Entorno histórico y social de la aparición del microchip. Universidad Politécnica de Madrid, Escuela universitaria de ingeniería técnica de telecomunicación, May 2014. Available at: https://oa.upm.es/34023/1/PFC_maria_gonzalez_mondaza.pdf
- Circuitos Electrónicos: Estado Actual, Innovaciones y Perspectivas Futuras. ResearchGate, March 2025. Available at: https://www.researchgate.net/publication/389696025_Circuitos_Electronicos_Estado_Actual_Innovaciones_y_Perspectivas_Futuras
- El futuro del comercio mundial: cómo las tecnologías digitales están transformando el comercio mundial. World Trade Report 2018, World Trade Organization (WTO). Available at: https://www.wto.org/spanish/res_s/publications_s/world_trade_report18_s.pdf