Facing the Anthropocene

ESOF 2020: discussions on our future

Claudio Tuniz


1 Introduction

ESOF 2020 was held in the old port of Trieste. Tergeste’s origins go back to the military fortifications built in the second century B.C. to secure the north-eastern borders of the Roman Republic. About 1500 years later, Trieste was a multi-ethnic Italian-speaking city in the Habsburg monarchy, with a thriving shipping industry and the status of a free port. Its key role ceased with the collapse of the Austro-Hungarian empire after World War I and its incorporation into the Kingdom of Italy. After World War II, Trieste again became a border city, this time along the Iron Curtain. Its position encouraged the development of the Abdus Salam International Centre for Theoretical Physics (ICTP), with the support of the International Atomic Energy Agency. The mission of the ICTP is to foster advanced studies in physics and mathematics and to promote dialogue among nations on the common ground of scientific cooperation. According to the Nobel Prize speech of its first director, “scientific thought and its creation is the common and shared heritage of humankind”.

During the last 60 years, a number of international research institutions have flourished in Trieste, such as the World Academy of Sciences, the International Centre for Genetic Engineering and Biotechnology and the International School for Advanced Studies. They have all been integrated into the so-called Trieste Science System, which includes the University of Trieste, the Area Science Park, the National Institute of Oceanography and Applied Geophysics, and Sincrotrone Trieste.

Today, Trieste is a “port” where scientists from all over the world, particularly from developing countries, can safely dock to study basic science and its applications in many fields of research, from theoretical physics and mathematics to genetics, from wireless technologies to synchrotron radiation imaging. The Trieste Science System also promotes interdisciplinary studies on sustainable development, global climate change, the predictability of extreme environmental events and pandemics. The lofty vision that motivated the creation of these institutions holds that science is both the mainstream culture of our era and a tool for economic development and international peace. The choice of Trieste as the venue for ESOF 2020 thus evokes the spirit of the old imperial port and the international role of the city.

ESOF 2020 overcame the Covid-19 restrictions by integrating in-person and digital events. About 2400 people registered for the Forum: 1000 were present in Trieste and 1400 connected remotely every day. Its hybrid format was a metaphor for our way of life in a world where cyberspace pervades physical space, blurring boundaries between our material and virtual domains.

The main aim of the programme, organised into 18 themes (fig. 1), was to foster a debate on how science and technology can help boost sustainable development. This review focuses mainly on topics from the following ESOF themes: Sustainable Future; Science and Society; I Compute Therefore I Am; and Breaking Frontiers in Science and Knowledge. Participants discussed how to further enhance human capabilities through artificial intelligence and high-performance computers. Intelligent technologies are crucial to face the challenges of the Anthropocene, but they can be disruptive. We need to be aware of their possible ethical, social, political and cognitive impacts.

To get ready for the future, we also need to understand how humans became the rulers of the Earth. The origins and evolution of Homo sapiens’ mind can shed light on the drivers of our behaviour, both as individuals and societies. A uniquely human cognitive trait is symbolic thinking. It emerged between 100000 and 50000 years ago, promoting pro-social behaviour and energy consumption. Our impact on the biosphere increased dramatically during the last 10000 years. In the last few centuries, as industrialised societies flourished, they have had a major global impact on the atmosphere, the hydrosphere, the cryosphere, the lithosphere and the biosphere. To these we have added our own domain, the technosphere, which emerged in the “geological blink of an eye, and it is evolving at a furious pace”. Human ecological disruptions are – among other things – implicated in the origin of pandemics and their rapid spread.

2 Of viruses and men

The impact of Covid-19 on humans has been widely discussed from medical, epidemiological, psychological, political, economic and social viewpoints. A less explored topic is our relationship with the virus from the perspective of evolutionary biology. Let’s focus on this matter.

The origin of viruses is still debated. They probably evolved more than 3 billion years ago and thus had a long time to perfect their strategies to survive and reproduce. They are the most common and abundant biological entity on Earth. Viruses come in diverse shapes and sizes, with RNA or DNA genomes (single- or double-stranded) sealed in a protein capsid, but they all need a host cell to reproduce. They make copies of themselves using the cells of other organisms, which become the vehicles for their diffusion. The RNA genome of SARS-CoV-2, responsible for the Covid-19 pandemic, includes sequences that inhibit the immune system of host cells and the degradation processes that affect the virus, increasing its replication efficiency and virulence.

In the deep past, viruses first infected bacteria and other unicellular organisms, then multicellular organisms of increasing complexity, until they eventually reached our own species, Homo sapiens, which emerged in Africa between 300000 and 200000 years ago. This ancient biological history is recorded in our own DNA, which contains about 100000 pieces of viral DNA. They were included in the primordial genetic material of our ancient ancestors many millions of years ago, and then inherited by subsequent species, down to modern humans. Some of this viral genetic material helps improve our health; some raises our risk of developing cancer or other diseases. A fraction of the genetic material that we inherited from the Neanderthals (between 1.5% and 4% of our genome) has also been associated with specific diseases. It seems that we might have reciprocated by infecting the Neanderthals with our own African pathogens, perhaps contributing to their extinction. Interestingly, a recent study shows that we inherited from the Neanderthals a genomic fragment that increases the risk of respiratory failure after infection with Covid-19.

Since the dawn of our evolution, we have had many chances of being infected by unfriendly viruses, but this risk greatly increased during the Late Pleistocene, when we became an intensely pro-social, self-domesticated species. It was at this time that we developed extraordinary capacities for imitation learning and the cultural transmission of knowledge.

Our new behaviour was supported by rituals that fed our brain’s reward mechanisms with dopamine, serotonin and other neuro-activators of pleasure. The remains of musical instruments, ornaments and rock paintings, found in the archaeological record of the Late Pleistocene, suggest that sociality was based on rituals and representations capable of creating emotions and feelings of belonging. Rich ritual burials, found alongside simple ones, are consistent with a pattern of emerging societies based on inequality and submission. The new habits were probably instrumental in the creation of a hierarchical society.

After moving out of Africa, Homo sapiens groups spread around the planet like pandemic waves, carrying their ideas, tools, weapons and pathogens. They were able to replace all other human species, wipe out many animal species and change entire ecosystems. If the Anthropocene is defined as the era in which humanity leaves major indelible marks in the geological record, it probably started back then.

The human impact on ecological systems intensified during the Holocene, thanks to agriculture and animal domestication. However, it is in the last few centuries that mankind’s footprint has grown most dramatically. While Palaeolithic societies consumed 2000 kWh/year per capita, current industrialised countries have reached 50000 kWh/year per capita. In addition, by dramatically increasing our numbers and our contacts, we turned into a fantastic host for viruses. And by enthusiastically continuing our ancient practice of environmental disruption and biodiversity annihilation, we have shrunk the natural habitats and the number of species that buffer us from the spread of viruses. Add poaching, the trade in exotic animals and Chinese wet markets, and you get a recipe for disaster.

In conclusion, pandemics are closely related to the construction of our own ecological niche. For tens of thousands of years we adapted to the environment by augmenting our body with clothes, shoes, instruments and weapons. Then we created multifunctional shelters of increasing complexity to make the world more suitable for our survival and reproduction. Finally, we built complex infrastructures, road systems, bridges and skyscrapers that require an increasing amount of energy. Compared with all other species, we are by far the most successful niche creators, thanks to our cognitive capacities and our extreme pro-social behaviour. By building our ecological niches, we pass down to future generations genes, cultural innovations and ecological changes (fig. 2). Unfortunately, human niche construction – with its disruptive impacts on the environment and biodiversity – has recently turned into an increasingly dangerous game. What benefited previous generations can put future survival at risk.

3 The environmental pandemic

Global climate change could be viewed as a “global environmental pandemic”. “Fever” is a key symptom of climate change. The average global surface temperature increased by 1 °C during the last 100 years – a very rapid change in geologic terms. Temperatures are rising faster than linearly, tracking the concentration of CO2 in the atmosphere, which rose by 45% over the last 100 years.

Geographic maps with different shades of red represent the coronavirus pandemic’s dispersion around the planet. Similar maps describe the rapid spread of climate-induced disasters, affecting wildlife, marine ecosystems, food production, human health, economics and social systems.

Many scientists believe we have already passed a tipping point on global warming. Since 1980, extreme weather events – droughts, floods, intense storms, heat waves and wildfires – have doubled in number every 20 years. In the last 20 years there were 7348 major recorded disaster events, 6681 of which were related to climate change. They affected 4.2 billion people, causing 1.23 million deaths and economic losses of US$ 2.97 trillion.
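
As a back-of-the-envelope illustration (assuming, as an extrapolation, that the 20-year doubling time stays constant), the yearly count of extreme events then grows exponentially:

```latex
N(t) \;=\; N_0 \, 2^{\,t/20\,\mathrm{yr}} \;=\; N_0 \, e^{\,t\ln 2 / 20\,\mathrm{yr}},
\qquad\text{so that, e.g., } N(40\,\mathrm{yr}) = 4\,N_0 .
```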

A tipping point is reached when a small change tips the balance of a system, triggering a large-scale change. Beyond this point, an epidemic that previously evolved at a slow, roughly linear pace starts spreading very rapidly, creating social unrest and havoc in the health system. It soon becomes difficult to monitor the infection, manage the recovery and make people follow the rules. The Covid-19 pandemic shows that outbreaks can be managed only if measures are clear, rapid and strictly followed. Unfortunately, we do not know in advance when we are approaching a tipping point.
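
This threshold behaviour can be illustrated with a toy SIR epidemic model (a minimal sketch; all parameter values are invented for illustration). The reproduction number R0 = beta/gamma marks the tipping point: below 1 an outbreak fizzles out, above 1 it explodes.

```python
# Toy SIR model illustrating an epidemic tipping point. The reproduction
# number R0 = beta/gamma is the threshold: outbreaks die out for R0 < 1
# and grow explosively for R0 > 1. All values here are illustrative.

def sir_peak(beta, gamma=0.1, days=365, dt=0.1):
    """Integrate the SIR equations and return the peak infected fraction."""
    s, i = 0.999, 0.001                  # susceptible and infected fractions
    peak = i
    for _ in range(int(days / dt)):
        ds = -beta * s * i               # susceptibles becoming infected
        di = beta * s * i - gamma * i    # new infections minus recoveries
        s += ds * dt
        i += di * dt
        peak = max(peak, i)
    return peak

for beta in (0.05, 0.09, 0.11, 0.30):
    print(f"R0 = {beta / 0.1:.1f} -> peak infected fraction {sir_peak(beta):.3f}")
```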

Some environmental phenomena induced by global climate change are also characterised by tipping points, with thresholds above which the processes cannot be stopped. Think of the melting of the Antarctic ice sheets or the collapse of the oceans’ thermohaline circulation. The 2015 Paris Agreement, the first legally binding global climate change deal, aims to limit global warming to well below 2 °C above pre-industrial levels. It also promotes adaptation measures to counter the impacts of climate change. The “business as usual” scenario that would lead to a 5 °C temperature increase by the end of this century is probably overestimated. Recent assessments suggest that, without any intervention, the global surface temperature would increase by 3 °C above pre-industrial levels by that time. However, this is a significant rise, considering that during the Holocene the global temperature varied by less than about 1 °C.

Humans have to adapt to environmental conditions they have caused (or co-generated). Evolution is a circular process: if we want to survive while changing the world, we need to adapt to its pace of change. If it changes faster, we have to adapt faster. As Lewis Carroll put it, in the words of the Red Queen: “Now, here, you see, it takes all the running you can do, to keep in the same place”.

Covid-19 is a wake-up call, urging us to find new ways to prevent and manage global disasters, pandemics and climate catastrophes.

4 The race in cyberspace

New forces are emerging in the human technosphere, including deep-learning machines, supercomputers and robots. They are having a huge impact on our society, government and business. The question is whether they will help us achieve global sustainability (fig. 3), or foster deeper and wider conflicts, further social inequality and environmental disruption.

The Space Race of the 20th century was a technological competition between the United States and the Soviet Union to prevail in spaceflight capabilities. Today’s competition in new technologies has enlarged the number of players and moved to “cyberspace”, an environment where Big Data feed intelligent algorithms trained on high-performance computers (HPCs). To win this race, one needs the fastest HPCs, the most sophisticated artificial neural networks and the most comprehensive datasets. The 5th-generation technology standard for broadband cellular networks (5G) will connect humans and devices at dramatically increasing speed. More futuristic technologies, like quantum computing, will come later, and will depend on major advances in basic research. If going to the Moon served “to organize and measure the best of our energies and skills”, cyberspace is the new terrain for measuring one’s scientific, technological and political power.

This race is having a revolutionary impact on a range of manufacturing sectors. Until a few years ago, Information Technology (IT) developed new products every 1-2 years; for the automotive sector the product lifetime was 10 years, for aircraft 40 years. Intelligent technologies and digitalisation are now dramatically speeding up product development. IT delivers new products every 6 months, whereas the lifetime of automotive products is 3-4 years. Furthermore, the automotive sector is considering the future market of flying cars. New alliances like Aston Martin/Rolls-Royce and Porsche/Boeing are exploring the 3D control of traffic through swarm motion.

A big revolution is occurring in the aircraft industry thanks to HPC simulations. Digital avatars of helicopters, airplanes and satellites can be efficiently designed and tested in cyberspace before their physical construction, skipping the slow process of working on physical prototypes. Digital design will be more effective and sustainable, avoiding the waste of energy, raw materials and other resources.

During flight, terabytes of data are collected by airplane sensors, but traditionally they have been used only when a specific problem, typically an accident, occurs. Now these Big Data can be analysed with AI and HPCs, making maintenance a more predictive process.

Another important issue in the aircraft industry is certification, which presently requires complex procedures based on prototype testing. If this task is performed on digital twins, it will save time, costs and energy. Finally, pilots can be trained by applying Augmented Reality to precise digital descriptions of the flying system.

The development of new materials is critical in the technological race. Materials have powered human societies since the Pleistocene, when instruments made from stone, wood and bone extended our bodies, giving us new capacities. At some point we started using fire to heat up stones and improve their flaking properties. These pyrotechnologies required a sophisticated knowledge of the effect of fire on specific minerals. At the end of the Ice Ages, in the Holocene, humans extended the use of high temperatures to treat minerals and produce new materials, like ceramics, glass and various metals. More recently, we became capable of producing synthetic products like plastics, by treating petroleum with very complex chemical and physical procedures. Today we can exploit a plethora of new materials, such as semiconductors, superconductors, metamaterials and nanomaterials, designed over many years of extended scientific research. We are collecting the fruits of the quantum physics revolution, producing materials that empower us in a very pervasive way, from the chips in our cell phones and computers to systems for data storage and solar energy production.

New revolutions in materials science are now driven by both quantum-mechanical models and computer simulations. HPCs allow us to analyse complex atomic systems with electronic clouds that change and interact. Thanks to first-principles calculations, the properties of materials can be predicted without spending years in a laboratory.
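
As a hedged illustration of this workflow, here is a minimal sketch using the Atomic Simulation Environment (ASE): the equilibrium volume and bulk modulus of copper are predicted by fitting an equation of state to computed energies. The cheap EMT potential stands in for the density-functional-theory engines used in genuine first-principles work.

```python
from ase.build import bulk
from ase.calculators.emt import EMT
from ase.eos import EquationOfState
from ase.units import kJ

# Compute the energy of fcc copper at several scaled lattice constants,
# then fit an equation of state to extract equilibrium properties.
volumes, energies = [], []
for scale in (0.95, 0.98, 1.00, 1.02, 1.05):
    atoms = bulk("Cu", "fcc", a=3.6 * scale)   # scaled lattice constant
    atoms.calc = EMT()                         # cheap stand-in for DFT
    volumes.append(atoms.get_volume())
    energies.append(atoms.get_potential_energy())

eos = EquationOfState(volumes, energies)
v0, e0, bulk_modulus = eos.fit()               # eV and Angstrom^3 units
print(f"equilibrium volume: {v0:.2f} A^3/atom")
print(f"bulk modulus: {bulk_modulus / kJ * 1.0e24:.0f} GPa")
```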

Materials science is also relevant to the study of biomolecular systems of interest to the life sciences and medicine. HPCs are used to capture structural and dynamic information on the molecular machinery of organisms. Extending all-atom molecular dynamics simulations from single proteins to organelle- and cell-scale simulations is challenging. The application of computational molecular microscopy to hundreds of millions of atoms requires large computational power and efficient codes. Simulations are used to interpret data, suggest new experiments and give information on living systems at the atomic level. Computational molecular microscopy is being used to study SARS-CoV-2 (fig. 4) and HIV, and provides high-resolution insights into plant processes like photosynthesis.
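
To make the idea concrete, here is a deliberately minimal molecular-dynamics sketch in Python: a toy Lennard-Jones system in reduced units, integrated with the velocity Verlet scheme. Production all-atom simulations of viruses rely on dedicated engines and HPC resources, not on code like this.

```python
import numpy as np

# Toy molecular dynamics: 8 Lennard-Jones particles on a cubic lattice,
# integrated with velocity Verlet in reduced units (epsilon = sigma = 1).

n, dt, steps = 8, 1e-3, 1000
pos = 1.5 * np.array([[x, y, z] for x in range(2)
                                for y in range(2)
                                for z in range(2)], dtype=float)
vel = np.zeros((n, 3))                       # start from rest

def forces(pos):
    """Pairwise Lennard-Jones forces: F = 24 (2 r^-12 - r^-6) r_vec / r^2."""
    f = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = r @ r
            inv6 = 1.0 / d2**3
            fij = 24.0 * (2.0 * inv6**2 - inv6) / d2 * r
            f[i] += fij
            f[j] -= fij
    return f

f = forces(pos)
for _ in range(steps):
    pos = pos + vel * dt + 0.5 * f * dt**2   # position update
    f_new = forces(pos)
    vel = vel + 0.5 * (f + f_new) * dt       # velocity update
    f = f_new

print("kinetic energy:", 0.5 * float((vel * vel).sum()))
```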

Some wonder whether scientific research based on intelligent technologies might supersede the traditional scientific method. This method, pioneered by Galileo Galilei, is based on a self-reinforcing circular process between experimental data and theoretical models, born in the minds of scientists. Hypotheses are tested to confirm or falsify theories about how the “real” world functions.

In the Petabyte Age, supercomputers and AI analytics can extract information and knowledge from the Big Data emerging from a vast array of disciplines, such as physics, biology, linguistics and sociology. It would seem that hypotheses and theories are now sometimes superfluous: patterns describing a system could be extracted directly from Big Data, bypassing the good old scientific method. This is particularly appealing for the life sciences, where complexity hinders mathematical modelling. Think of the implications of the Big Data generated by shotgun gene sequencing for generating new knowledge of biological species.

On the other hand, integrating the traditional scientific method with Big Data analytics is even more promising and could provide a paradigm shift in the multi-scale modelling of complex systems characterised by nonlinearity, non-locality and high dimensionality. In any case, supercomputers, Big Data and AI are set to play a crucial role in scientific research.

5 Artificial Intelligence: friend or foe?

There are two definitions of AI. The first (strong AI) considers it a discipline that aims to create a machine that is “like us” in every respect, including intelligent behaviour and a capacity for understanding that extends further towards “being creative”, “having feelings”, etc. The second definition (weak AI) considers AI “the discipline that aims to demonstrate the truth of the assumption that implemented Turing machines can be sufficiently powerful for exhibiting intelligent behaviour that is indistinguishable from human intelligent behaviour”. This latter definition covers all current forms of AI (deep learning, reinforcement learning, Bayesian methods, robots, etc.).

AI is constantly improving its performance and is present in all aspects of our life: from industrial production to education and scientific research; from leisure and social life to health and environmental monitoring. AI can replace us in many boring and unpleasant tasks and in difficult places like space and the deep sea. It is superior to us when it comes to very fast decisions, accuracy, parallel processing and classification. AI can also compensate for some perceived human limitations, such as weakness of will, forgetfulness and incorrect reasoning.

In the following we will provide a brief summary of recent advances in AI methods and applications and mention some possible risks.

The most popular AI approach is machine learning (ML), where the machine is “trained” with the input of a large number of “relevant” data (i.e., data representative of the conditions the machine will analyse in its applications) and tunes many ML parameters to give the required output. These machines are built with artificial neural networks, where signals propagate as in the neurons of the human brain, which are connected through axons and synapses. The ML parameters weigh the signals that come from a group of neurons and act as input for other neurons. Each neuron combines its inputs, and if the weighted sum exceeds a threshold, the signal is propagated further to other neurons. In deep neural networks, neurons are organised in layers: an input layer, an output layer and further hidden layers. It is possible to build systems with hundreds of layers, each with 100000 neurons and millions of ML parameters to weigh the signals. The training of deep neural networks needs large data sets labelled with their content. One example is ImageNet, a database of more than 14 million images used for training algorithms in visual object recognition. In the last few years, the algorithms trained with ImageNet have begun to outperform humans in accuracy. However, there are still problems in ML applications with other data sets, which, unlike benchmark data sets such as ImageNet, are always characterised by bias and contextual effects.
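
A minimal sketch in Python of the mechanism just described (all sizes and weights are illustrative, not taken from any real trained model): each neuron forms a weighted sum of its inputs and propagates a signal only when activated.

```python
import numpy as np

rng = np.random.default_rng(42)

def layer(x, W, b):
    """One dense layer: weighted sum plus bias, then ReLU activation.
    ReLU plays the role of the 'threshold' described in the text: a
    neuron outputs nothing unless its weighted input is positive."""
    return np.maximum(0.0, W @ x + b)

# Toy network: 4 input features -> 5 hidden neurons -> 3 output classes.
x = rng.normal(size=4)                        # illustrative input signal
W1, b1 = rng.normal(size=(5, 4)), np.zeros(5) # hidden-layer parameters
W2, b2 = rng.normal(size=(3, 5)), np.zeros(3) # output-layer parameters

h = layer(x, W1, b1)                          # hidden-layer activations
logits = W2 @ h + b2                          # raw class scores
probs = np.exp(logits) / np.exp(logits).sum() # softmax over classes
print(probs)
```

Training consists of adjusting the weights W1, b1, W2, b2 so that the output matches the labels of the training data; with hundreds of layers this same forward pass becomes a deep neural network.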

AI has been around for over 70 years, but real practical progress started only in 2012, when AI based on deep neural networks demonstrated its capabilities in image and speech recognition. Since then, progress in these areas has been enormous. There are also many new applications, from the analysis of data generated by particle accelerators to strategy board games. Image recognition and segmentation are crucial for developing algorithms for self-driving cars and trucks.

In medical applications, deep neural networks can distinguish between benign and malignant skin melanomas. The variability of skin lesion images makes automated classification a challenging task. In a recent application, the deep neural network required pretraining in general object recognition with 1.4 million images; it was then fine-tuned on a dataset of 130000 clinical images of both benign and malignant skin lesions. It seems that AI can classify skin cancer with a competence comparable to that of dermatologists. In the near future, smartphones outfitted with deep neural networks could provide low-cost access to these important diagnoses.
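
The recipe just described (pretraining on general object recognition, then fine-tuning on clinical images) is known as transfer learning. Below is a hedged sketch in PyTorch; the dataset, labels and training details are placeholders, not the study’s actual setup.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a network pretrained on ImageNet (general object recognition).
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)

# Freeze the pretrained feature extractor...
for param in model.parameters():
    param.requires_grad = False

# ...and replace the final layer with a new 2-class head
# (benign vs. malignant) to be trained on clinical images.
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 8 RGB images;
# real fine-tuning would loop over a labelled clinical dataset.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```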

AI can listen to our voice, understand the meaning of what we are saying and learn about our traits and our psychological or physical condition. The human voice is a biological mechanism that carries a lot of information about our personality, health, gender, age, emotions and also our state (e.g., sleepiness, tiredness, boredom, interest). AI can identify alcohol intoxication better than any human. Speech analysis can give information on facial movements, heart rate, cortisol levels and blood pressure. It is also useful for the early diagnosis of autism, for security applications and for counselling (e.g., on couple relationships).

Researchers have recently been exploring how to use AI-based voice analysis to diagnose coronavirus infections. They collect the recorded voices of thousands of people who have tested positive for Covid-19 and use them to train ML systems. The aim is to identify a voice fingerprint for the disease. Other groups are analysing audio recordings of coronavirus coughs and designing voice-analysis algorithms that can detect whether somebody is wearing a mask. There is hope that in the future the Alexa or Siri on your smartphone will tell you whether you have caught Covid-19 or just have a bad cold. During the last decade, scientists have been using deep neural networks to identify potential vocal biomarkers for other illnesses, including dementia, autism and depression.
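
As an illustration of how such a pipeline might look (a minimal sketch; the file names, labels and choice of classifier are hypothetical, and real studies use far richer features and models), recordings can be summarised by acoustic features such as MFCCs and fed to a standard classifier:

```python
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def voice_features(path):
    """Summarise a recording by the mean of its MFCC coefficients."""
    signal, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

# Hypothetical dataset of (recording, Covid-19 test result) pairs;
# file names are placeholders to be replaced with real data.
dataset = [("voice_0001.wav", 1), ("voice_0002.wav", 0)]  # ...many more

X = np.array([voice_features(path) for path, _ in dataset])
y = np.array([label for _, label in dataset])

# Train a simple classifier and check it on held-out recordings.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2)
clf = RandomForestClassifier(n_estimators=200).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```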

AI will continue to advance, with new applications in medical diagnostics, the identification of pathogens, the development of new drugs, environmental monitoring, pandemic tracing, the optimisation of the movement of people and goods, robotics for elderly people, case-law research, etc.

There are different opinions on the future of our relationship with AI: optimists believe that it will transform us into superhumans, while pessimists are convinced that increasingly autonomous intelligence might pose an existential risk for the human species. Others believe that AI will be immoral or that it will suffer. What are the realistic and concrete dangers that could derive from the forms of AI currently under development?

One possible risk is linked to the unpredictability and bias of AI. As we said, sub-symbolic algorithms, like deep learning, can only be as fair as the data upon which they have been trained. For example, biases in databases can produce racial discrimination in worker recruitment and in the judicial system. Other problems come from the “amorality” of (weak) AI, which does not have feelings or emotions, and can reason only in terms of goals. Even if algorithms do not have basic drives and pursue only the assigned objectives, these could be misaligned with our own goals.

Other disruptive dangers would affect the functioning of our societies. While the machines of the industrial revolution made their social impact by replacing many types of physical labour, the intelligent machines of the 21st century will increasingly compete with labour based on reasoning and decisions. Given the fast timescales characterising the digitalisation process, our economic and social systems will find it hard to adapt to these changes.

Immediate and real risks derive from the misuse of currently available AI systems by repressive and aggressive regimes and in the development of new autonomous weapons. Particular concerns regard the erosion of privacy linked to the application of intelligent technologies to trace individual actions and opinions for social control (e.g., in the Chinese Social Credit System). People’s personal data, collected through social networks, browsers and other digital applications, are increasingly used to influence not only our shopping trends, but also our political decisions, and thus interfere with democratic processes. In the next section we will follow an evolutionary perspective to evaluate our relationship with all the instruments that extend the limited capabilities of our body and mind.

6 Extending body and mind into digital technologies

The extension of our body and mind does not necessarily imply implanting chips in our brain and connecting intelligent prostheses that change our anatomy and our perceptions. We can augment our mental capacities by resorting to external tools such as smartphones and their apps, which increase our capacity to socialize, memorise, calculate, orientate and share images. Our individual cognitive capacity derives both from (more or less) intelligent tools and from the social networks we belong to and rely on as a source of information.

However, there are some possible risks. We mentioned the impact of intelligent technologies on social, political and ethical issues. Now we consider an evolutionary perspective to understand their impact on our cognitive traits. The biological mechanisms behind our “prosthetic capacity” and pro-social behaviour show the constraints and potentialities of our relationships with digital technologies.

A few questions arise. What are the specific regions of our brain that became crucial to supporting an enhanced prosthetic capacity? How do we use this capacity to expand our body and mind into the environment? How do we integrate tools into the brain’s sensory machinery? And finally, what are the main issues related to the development of intelligent technologies to extend human capabilities?

Comparative neuroanatomy and paleoneurology show that modern humans have evolved large and complex parietal regions, compared with other primates and extinct hominids. This suggests a specialisation of visuospatial cognition, projecting body and vision into a spatial and temporal frame, which is a prerequisite for an enhanced prosthetic capacity. Technologies are readily integrated into body schemes and incorporated into the cognitive process. Some components of the cognitive process are thus exported into out-of-the-body elements, offloading the information flow to devices entrusted with physical, sensorial, storage and computational functions.

In human evolution, natural selection has been integrated with an interactive expansion of ecological, neural and cognitive niches, extending the mind/body into the environment. The neurobiological mechanisms that supported this extension developed through three phase transitions. The first transition occurred with the evolution of the primate brain, around 60 million years ago; the second with the first human use of instruments, around 3 million years ago; and the third with the evolution of symbolic thought and complex language in H. sapiens, about 100000 years ago. A fourth phase transition may now be induced by the current explosion of artificial intelligence, introducing a novel model of mind-body-environment interactions.

It is worth remembering that technology extends the boundary of the self. Recent experimental evidence suggests that we integrate tools into the brain’s extended sensory machinery. Such changes can be detected through quantitative signals, showing the influence of tools on the brain-body cognitive system. The brain’s strategy for sensing with tools is to recruit primary somatosensory dynamics otherwise dedicated to the body.

In conclusion, the extension of our bodies and minds into digital technologies has very deep evolutionary roots (fig. 5). This process needs to be carefully monitored by relying on our “social intelligence”.

7 Final remarks

The European Union Green Deal and the UN Agenda for Sustainable Development assign a crucial role to intelligent technologies. They are being proposed to monitor public health and the environment, to predict extreme weather events and pandemic peaks, to support remote working, to manage transport, and to optimise energy efficiency and renewable energy management. The Internet of Things paradigm will have a crucial role in helping to address complex socio-environmental challenges and shape the future of our communities.

The intelligent power that increasingly permeates our society appears very friendly and does not make explicit its intention to rule our lives. Yet, some believe we have been infected by an invisible digital virus that might threaten our freedom. Not only our communications, but also our bodies and our health will be under digital surveillance.

Nevertheless, we should be optimistic and assume that we still have the capacity to limit and direct our new technologies. We could exploit them for another kind of progress, more respectful of human health, biodiversity and sustainability. We should also consider the disruptive effects of artificial intelligence, e.g. in military technologies such as lethal autonomous weapons systems, which “irreversibly alter the nature of warfare, detaching it further from human agency”.

It is difficult to forecast the combined impact of climatic, viral and digital pandemics on the future of humankind. But looking at the past, it all began when we started behaving like a virus.

Acknowledgements

Sincere thanks to Patrizia Tiberi Vipraio and Davide Fiocco for their critical review of the manuscript and to Michaela Jarvis for her careful editorial work.