How close are we to the technical singularity?

The Destructive Potential of Big Data and Artificial Intelligence for Democracy


The approach of collecting, storing, scouring, combining and evaluating mass data ("Big Data") with today's powerful, exponentially growing computing capacities and suitable methods has also given new impulses and opportunities to Artificial Intelligence (AI). On the one hand, these bring old AI dreams closer to the realm of the real; on the other hand, they can unfold great destructive potential. This is accompanied by new threats: to data integrity, personal rights and privacy, to the independence of knowledge and information, and to social cohesion. New media such as Facebook or Twitter, overpowering IT corporations such as Google or Amazon, and the advancing digitization of everyday processes and objects ("Internet of Things") play a decisive mediating role.

Repeatedly announced visions of the "technical singularity", i.e. the possible takeover of power by machines over people, raise questions about the primacy of politics over economy and technology and about the existence and viability of democracy, if not of humanity.

Big data - a new hype or more?

In December 2016, the Swiss publication Das Magazin reported on the method of the psychologist Michael Kosinski for creating personal psychograms from the collected data of large groups of Facebook users and using them to carry out targeted campaigns tailored to the smallest groups of voters [9]. In spring 2018 this news finally found its way into the big media world. Whether these methods were decisive for the election of Donald Trump as US President is controversial, but the possibility of creating customized personality profiles from the (voluntarily expressed) "likes" of unsuspecting social media users and exploiting them for commercial or political ends is, to say the least, impressive.

This is just one of the innumerable uses of Big Data, an approach in information technology (IT) that is not new but has been very popular for some time, applied to complex and often not even precisely defined problems. The success of this approach rests essentially on the tremendous increase in the computing speed and storage capacity of computer hardware over the last few decades.

For several years now, "Big Data" has haunted the IT press as a fashionable term. It actually stands, quite simply, for mass data and thus by no means for a new phenomenon. However, with expanded IT applications, including digital photography and the processing of moving images, the amount of data has increased tremendously. So-called Moore's Law has held for over 50 years: the computing power (speed, storage density) of computers doubles roughly every 18-20 months. This means that today approximately 2^30 (i.e. over a billion) times more data can be stored than in 1965, when Gordon Moore first formulated this law.
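The arithmetic behind this figure can be checked in a few lines. The following Python sketch only assumes the doubling period and baseline year given above; with an 18-month period the exponent even comes out somewhat higher than 2^30:

```python
# Back-of-the-envelope check of the Moore's Law figure:
# capacity doubles roughly every 18-20 months.
def doublings(years: float, period_months: float = 18.0) -> int:
    """Number of complete doubling periods within the given span."""
    return int(years * 12 // period_months)

years_since_1965 = 2020 - 1965
n18 = doublings(years_since_1965)        # 18-month period: 36 doublings
n20 = doublings(years_since_1965, 20.0)  # 20-month period: 33 doublings

# Either way, the growth factor exceeds 2^30 (over a billion).
print(n18, n20, 2 ** n20 > 2 ** 30)
```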

Not only the storage density but above all the processing speed could be increased to a similar extent. This means, among other things, that huge amounts of data can be searched, sorted, recombined or filtered in a very short time. Hence the modern super-fast search engines and so-called agent systems, in which myriads of programmed virtual robots (so-called bots, short for robots) plough through the Internet to perform assigned tasks, e.g. to carry out detailed travel planning with flights, accommodation, meals, excursions, etc. The success of such applications, as offered today by every smartphone in the palm of a hand, would be inconceivable without this miniaturization and super-acceleration.

In this respect, "Big Data" is not just a new buzzword: the increase in quantity is so enormous that one can rightly speak of a new quality in the storage and processing of data.

Artificial intelligence - an established area with new impulses

The long-established (and at times already declared dead) field of Artificial Intelligence benefits from this development. The American logician John McCarthy had coined the term Artificial Intelligence back in the 1950s as the name for a research project whose aim was to automate human thinking and decision-making processes.

The eventful history of AI is rich in successes, but also in setbacks. For example, the initially obvious approach of designing AI algorithms on the basis of logical inference reached a certain dead end in the 1990s. The approach of simulating human cognitive, coordination and intellectual abilities by means of neural networks (a simulation of human brain structures) and machine learning, on the other hand, achieved considerable successes, which culminated in 2016 in the sensational victory of the computer program AlphaGo over the Korean Go master Lee Sedol.

Recent AI draws its strength not least from advances in the treatment of Big Data, where approaches such as raster search and pattern recognition play a greater role than classic reasoning based on logical rules and calculations. In board games such as chess or Go, for example, the comparison with huge amounts of stored position patterns goes much further than the analysis of individual possible moves and variants.

In 2016, Yvonne Hofstetter published a book about AI and its relationship to democracy: The End of Democracy: How Artificial Intelligence Takes Over Politics and Incapacitates Us. It says, among other things: "Big Data stores our behavior, artificial intelligences analyze our intentions" ([8], blurb). Statements like these raise questions about the political aspects of recent technological developments, such as the primacy of politics over economy and technology and the viability of democracy.

Computer-based technology (embodied by Big Data), the associated science (embodied by AI) and politics, which is already highly interconnected with the economy, form a triangle of closely linked interrelationships, shown schematically in the following graphic (Fig. 1). The theses it contains on these areas and their relationships are elaborated and explained in more detail below. I will mainly draw on three recent publications: The Smart Dictatorship by Harald Welzer [16], The Numbered World by Colin Crouch [3] and Robocracy by Thomas Wagner [14].

Our democracy on the way to a smart dictatorship?

In his 2016 book The Smart Dictatorship, Harald Welzer sees in the technology-fueled convergence of consumption and surveillance an attack on our freedom [16]. The new center of economic activity is formed by global Internet companies acting as global players, such as Google, Apple, Microsoft, Facebook, Amazon or PayPal.

All-round surveillance

The new treasures of the IT age are the data of computer users. An estimated 3 billion people use Google, for example, and 1.5 billion use Facebook. For the most part, personal data is provided by the users themselves, carelessly and "voluntarily". This data only has to be filtered and sifted (e.g. with algorithms that follow the principle of the raster search). Many IT users today thus make themselves accomplices in their own surveillance and commercial exploitation.
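The raster-search principle mentioned here can be illustrated with a deliberately simple sketch; the records and criteria below are invented for illustration and stand in for the attribute filters applied to real user data:

```python
# Toy "raster search" (dragnet): narrow a data set step by step
# by applying one matching criterion after another.
users = [
    {"id": 1, "age": 34, "city": "Munich", "likes_politics": True},
    {"id": 2, "age": 29, "city": "Berlin", "likes_politics": True},
    {"id": 3, "age": 61, "city": "Munich", "likes_politics": False},
]

criteria = [
    lambda u: u["city"] == "Munich",
    lambda u: u["likes_politics"],
]

matches = users
for criterion in criteria:  # each pass tightens the raster
    matches = [u for u in matches if criterion(u)]

print([u["id"] for u in matches])  # only user 1 passes both filters
```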

Many of the effects are felt only indirectly or not at all. If, for example, the aforementioned Michael Kosinski uses his method of psychometrics and the OCEAN model to create tailor-made personality profiles merely from the "likes" of Facebook users, which are then used for targeted election or product advertising, the recipients of the advertising messages do not even suspect their origin or how they came about.
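The scoring principle behind such profiles can be sketched in strongly simplified form. The weights below are pure invention for illustration; Kosinski's actual models are regression models trained on millions of profiles against OCEAN questionnaire results:

```python
# Toy psychometric scoring from "likes" (illustrative weights only).
TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]

# Hypothetical per-like weight vectors, one value per OCEAN trait.
LIKE_WEIGHTS = {
    "poetry_page":   [0.8, 0.0, -0.1, 0.2, 0.1],
    "party_events":  [0.1, -0.2, 0.9, 0.3, -0.1],
    "todo_list_app": [0.0, 0.7, 0.0, 0.1, 0.0],
}

def profile(likes):
    """Sum the weight vectors of all observed likes, per trait."""
    scores = [0.0] * len(TRAITS)
    for like in likes:
        for i, w in enumerate(LIKE_WEIGHTS.get(like, [0.0] * len(TRAITS))):
            scores[i] += w
    return dict(zip(TRAITS, scores))

print(profile(["poetry_page", "todo_list_app"]))
```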

Even with apparently well-intentioned offers such as petition platforms, one can by no means be sure that they really serve the announced well-meaning goals. Rather, there is a well-founded suspicion that behind them may stand global data collection companies that deal wholesale in e-mail addresses, use modern algorithms to classify the petitioners, pass this data on and expose those affected to further targeted actions. One such platform received a Big Brother Award for this business practice in 2016 [4].

Social bots

Agent technology (see above) provides, with social bots, a new form of mass yet targeted advertising and influence. Social bots are computer-controlled (pseudo) actors in social media that are used specifically, for example, to give automatic answers, to disseminate advertising messages or to spread political opinions. The special thing about these "social" robots is that addressees in social networks can hardly tell them apart from "real" participants. In the last US election campaign, such automated tweets are said to have been used on a large scale by Trump and the Republicans, but in part also by the Democrats [13].


The author Welzer sees in this a new form of the Panopticon, the scheme of total surveillance of prison inmates from a single point designed by the English philosopher Jeremy Bentham towards the end of the 18th century [16, p. 53]. The technical potential for acquiring and collecting user data is by no means exhausted; on the contrary, ongoing developments such as the smart house, the Internet of Things, digitized highways, the share economy, "smart" toys, and body sensors with data recording (life loggers, self loggers) will deliver gigantic amounts of further personal data. These may initially seem like huge heaps of data garbage, but with sophisticated modern evaluation methods they yield valuable treasures for total commercialization and surveillance. And all of this does not happen through the grasp of a total state in Big Brother manner, but is delivered voluntarily and mostly carelessly by naive IT users who are lured with supposed convenience applications, yet incapacitate themselves in the process.


In addition to this progressive surrender to technical systems and devices, we face a gradual erosion of states and democratic institutions (e.g. associations and unions) in favor of anonymous actors, markets and technology companies. This form of "neo-feudalism" (H. Welzer) is a creeping process that has been going on for almost a generation, largely unnoticed by the public. In its unmistakable effects such as privatization, deregulation, squandering of state assets, and financial and environmental crises, it is a consequence of economic neoliberalism, which is only perceived more clearly from time to time and then sometimes leads to spontaneous reactions, as vocabulary such as "Occupy Wall Street", "political disenchantment", "locusts", "predatory capitalism", "angry citizens" etc. shows. That economic liberalization does not necessarily go hand in hand with political liberalism is shown in particular by the example of China; but authoritarian tendencies and the increasing threat to civil liberties are also clearly visible in the USA and in European countries such as Hungary and Poland.

The creeping dissolution of democracy

It is a widespread belief that current developments in ubiquitous technologization and digitization are less dangerous as long as there is no regime change towards a totalitarian state. Indeed, a combination of a totalitarian state and a technology that today already far exceeds Orwell's visions of 1984 would be truly terrifying. In China, however, the authorities are already in the process of introducing a nationwide point system that evaluates citizens according to their social and political good behavior, punishes non-compliant behavior with point deductions and makes access to social benefits dependent on the current number of points [7].

But here, too, we are already dealing with a more subtle, yet at least as effective form of indirect exercise of power by largely invisible and therefore almost invulnerable commercial rulers. This change towards a "smart dictatorship" (as Welzer calls it) takes place within our defined and (still) guaranteed scope of freedom: "the dissolution of democracy takes place within the framework of democracy" [16, p. 200].

Libertarianism - less state and more monopolies

A prominent representative of this new understanding of democracy is Peter Thiel. The son of German immigrants to the USA made a large fortune through skilful investments and financial transactions and is now a multi-billionaire. According to Thiel, democracy and freedom do not go well together; of course not, because democratic laws and regulations hinder free enterprise and could reduce possible profits achieved at the expense of the general public [16, pp. 186-187]. This fits the tendency of current free trade agreements to remove "trade barriers" (i.e. justified restrictions, e.g. to protect local actors or the environment).

Thiel founded, among other things, the Internet payment service PayPal, invested in start-ups at an early stage and made immense profits with the rise of Facebook. He is involved, among other things, in the Seasteading Institute, a project for the construction of artificial islands and cities outside of any state sovereignty, as well as in visions of "hard AI" such as life extension and the achievement of immortality by means of AI [14, p. 64].

Politically, he opposes property taxes and the limitation of monopolies; he considers monopoly groups like Google to be good because they are maximally efficient. In an essay, Thiel reveals that he no longer believes that "democracy and freedom go together". He sees political salvation "in the hands of a single person who creates or spreads the mechanism of freedom that we need to make the world a safe place for capitalism" (quoted in [16, p. 188]).

These ideas largely coincide with those of new (right-wing) libertarian movements, which are particularly popular in the USA. Privileges, regulations and opportunities for intervention by states are to be reduced to a minimum (or, in the extreme anarchic variant, abolished entirely); freedom of property and entrepreneurship takes precedence over demands for equality or fraternity.


In the opinion of critical observers, the goals of this creeping dismantling of democracy have already been at least partly achieved. The British sociologist Colin Crouch thus sees the age of post-democracy as already arrived: elections, democratic institutions and displays of power retain only a sham and spectacle function for the people, while the decisive decisions are brought about behind closed doors by business and information magnates or their lobbyists. The rule of large corporations over information and knowledge plays a decisive role:

The democratic community suffers particularly from this abundance of power, because reliable information is its elixir of life. As soon as the holders of large spheres of influence have the power to withhold information or to supply the public with one-sided, misleading or otherwise manipulated information, the affected community becomes a hostage to their self-interests [3, p. 16].

The author emphatically opposes the neoliberal thesis according to which the market alone is able to make all goods and services comparable by means of a uniform evaluation scale (the price) and thus allegedly embodies the best possible, all-encompassing social knowledge. Rather, he identifies neoliberalism as the "enemy of knowledge" and substantiates his criticism with numerous examples from the financial economy (e.g. the financial crisis of 2008), the public service, private consumption, health, education, etc.

Big Data plays a key role in this: as the central instrument of a continuous, all-encompassing, always up-to-date evaluation machinery, as an almost unassailable, pseudo-rational decision-making authority and thus ultimately as a power factor that undermines democracy. "Pseudo-rational" because the evaluation criteria and rules are selected and designed by the dominant forces and the whole process is largely self-referential. Crouch's conclusion:

In their endeavor to transform us into a new type of person who has to know exactly who he is, the neoliberals have come very close to the totalitarian ideologues whose diametrical opposite they so gladly believe themselves to be ([3], blurb).

Former US Vice President and 2000 presidential candidate Al Gore advances similar theses:

The more the power to make future decisions shifts from political systems to markets, and the stronger the invisible hand becomes thanks to increasingly sophisticated techniques, the more the muscles of self-government atrophy ([5], chap. "The Discomfort in Democratic Capitalism", p. 23).

The theses of artificial intelligence

AI is playing an increasingly important role in the assessment processes mentioned, which are now largely automated. Great increases in efficiency in pattern recognition, artificial neural networks and automatic learning, together with the use of big data, have helped AI to new heights.

Claus and Schwill define AI in the "Duden Informatik" [1] as follows:

AI investigates how intelligent behavior of computers can be captured and understood, or how, in general, problems that require intelligence can be solved with the help of computers (quoted from [11, p. 13]).

In a very clever way, this definition allows various interpretations and thus forms a suitable roof for two schools of thought that have fought bitter battles over the history of AI. For followers of the so-called weak AI thesis, computers can only simulate human intelligence, while for followers of the strong thesis, AI is itself intelligent behavior.

If the thought of an AI entity that itself exhibits intelligent behavior is spun further, far-reaching consequences soon become visible and possible: a strong AI (and ultimately computers, i.e. machines) can make decisions for which it should consequently also take responsibility. From there, it is only one further, consistent step to view and treat it like an autonomous subject. It is obvious that AI understood and possibly accepted in this way brings not only assistance but also great dangers. Even ardent followers such as Elon Musk, quoted below, are well aware of these dangers.

Robocracy: Vision of a global techno dictatorship

The cultural sociologist and publicist Thomas Wagner, in his book Robocracy: Google, Silicon Valley and Humans as Obsolete Models, deals with the old and new theses and prophecies of leading AI protagonists and examines them critically [14].

Hostility towards the state and belief in technology

Here an (almost) consensus between libertarianism and the leading infotech forges in Silicon Valley and elsewhere becomes apparent: distrust of the state and an unhindered primacy of technology. Not politics, but computers, Big Data and Artificial Intelligence are seen as the most important levers for improving the world. The primary goal is radical economic liberalization, accompanied by unhindered further technical development.

In Silicon Valley, both right and left welcomed a technological development which they believed would increase the power and personal freedom of the individual and radically reduce that of the state [14, p. 28].

Man as a discontinued model

Here the new AI draws on the theses of the old "hard AI" and the dream of a self-reproducing "ultra-intelligent machine" (Irving John Good, quoted in [14, p. 72]), i.e. a machine that itself designs intelligent machines and puts them into the world. Machines of this type could, among other things, upload brain content and download it to humans or other human-like beings, connect with humans physically and intellectually to form cyborgs, achieve immortality and surpass humans of the old kind in almost every respect. Ideas of this kind were formulated by (hard) AI researchers like Hans Moravec as early as the end of the 20th century and were then propagated by, among others, the Google chief developer, author and futurist Ray Kurzweil.

Kurzweil is one of the most famous representatives of transhumanism, a newer philosophical current that foresees a future stage of human evolution brought about by scientific and technical upheavals and the associated fundamental changes in human nature. For the Israeli historian and bestselling author Yuval Noah Harari, such a development could mean the end of humanism, which has determined the worldview of at least the occidental peoples for more than 200 years. Instead of the familiar Homo sapiens, a new evolutionary stage could emerge: Homo deus, a human "godlike" equipped and elevated through the use of technology. This would usher in a new era of "dataism" or "techno-humanism" [6].

The machines take power - the singularity is at hand

From the "super-intelligent" machines just mentioned, it is not far to the final triumphant advance of artificial intelligence: computers take over power, enslave people or ultimately make them completely superfluous. Futurists describe the moment when the intelligence of machines outperforms that of humans as the "technical singularity". At that moment history would become unpredictable: "... then man will have to fundamentally rethink his own role in the world. The world will be completely different. ..." (Moravec 1999, quoted from [14, p. 34]).

If one believes authors like Ray Kurzweil or Nick Bostrom, this moment of an "intelligence explosion" (Bostrom) could be imminent. Kurzweil's book title already announces it: Humanity 2.0. The Singularity Is Near [10]. It predicts the convergence of genetic research, nanotechnology, AI and robotics as well as a self-reinforcing, exponential growth of technology (first "trans-biological", then "post-biological"). The goal is "becoming God by means of technology" [14, p. 38].

These are by no means just the statements of individual crazy nuts, but the research subjects of well-equipped and generously funded institutes and organizations, such as the Singularity University founded in 2008 by Ray Kurzweil and Peter Diamandis on the NASA campus in Silicon Valley, the Machine Intelligence Research Institute (MIRI) in Berkeley, the Future of Humanity Institute (FHI) at Oxford University or the transhumanist association Humanity+. The Singularity University, for example, offers 10-week courses for $30,000 with lectures by Kurzweil, Diamandis (molecular geneticist and aerospace engineer) and Larry Page (Google) as well as team seminars, and is considered the "cadre forge of Silicon Valley" [14, p. 83].


It goes without saying that visions or utopias such as those mentioned above also attract much criticism. The British sociologist Richard Barbrook, for example, states that this belief in technical progress has a downright "religious character", and the artist and computer scientist Jaron Lanier sees in it a "new form of religion based on the pursuit of immortality" [14, pp. 40-41].

Stephen Hawking also warned against (hard) AI and saw it as a threat to humanity: AI could usher in the end of mankind. Whether machines will take control at some point, the future will tell. But it is already clear today that machines are increasingly displacing people from the labor market.

Even tech entrepreneurs like PayPal co-founder and Tesla investor Elon Musk have issued warnings (however one may rate them):

The advancement of artificial intelligence (I don't mean simple artificial intelligence) is incredibly fast. ...

and further:

Artificial intelligence can be more dangerous than nuclear weapons (quoted in [14, p. 17]).

Yuval Harari outlines the vision of a new "data religion" and points to the great political visions of the 20th century with their devastating results: "The combination of godlike technology with megalomaniac politics would open the door to catastrophe." In connection with the "Internet of Things" currently under construction, dataism could spread across the entire planet Earth (and possibly beyond into the universe): "this cosmic data processing system would then be like God ..." and "people ... are doomed to be absorbed in it" [6, pp. 509, 515].

The proclaimed singularity as the gravedigger of democracy

Even if one considers such dystopias exaggerated, the much closer quintessence of the author Wagner remains as a warning: "The real danger that threatens us is not the robots' takeover of power, but man's self-abandonment" [14, p. 28].

Thought a little further, this self-abandonment could proceed as follows: the singularity does not simply take place (how should it even be determined?), but rather, entirely in keeping with its quasi-religious character, it is proclaimed. Super-intelligences are then the gods of this new data religion. Singularity, machine omnipotence and transhumanism form the new doctrines of salvation. "Enlightened" and knowledgeable people like Ray Kurzweil, Nick Bostrom, Peter Diamandis or Peter Thiel function as the prophets/ayatollahs/gurus of a new "state of God", and the masses of the earth's inhabitants have to bow to an all-embracing technocracy in which they must still count themselves happy to survive.

Ultimately, the two apparently contradictory goals of man's self-abandonment on the one hand and man's becoming God through technology on the other connect with each other in a clever way: while a privileged caste of (data) high priests commands the global digital infrastructure and its use, the rest of humanity has to submit to the total control of a doctrinally established machine rule.

Even a kind of (pseudo) democracy could continue to run underneath, for example on the Iranian model: elections are held regularly, but the last instance is always the new gods (= machines), who have unfortunately now taken over power and who are merely represented by the new prophets. The latter could even take on the role of mediator and comforter: we warned you.

What can you do?

In view of the close interdependencies between an apparently inexorable advance of technology and science on the one hand and a willingly following politics and economy on the other, securing democracy and protecting it from dangerous tendencies of dissolution represents a huge challenge, perhaps even the central one of the current century.

How can one face this challenge and establish effective countertendencies? A few suggestions shall conclude this treatise as an outlook.

Education instead of (forced) digitization

"Digital first. Concerns second." That was the 2017 federal election campaign slogan of a party allegedly committed to "liberality". The constant calls for more and faster digitization, also cultivated by other parties, are hardly suited to stopping or even slowing the dangerous tendencies outlined above. Rather, these are promoted more than sufficiently by financially strong companies and their lobbyists without any state or public intervention; additional state aid would merely pour oil into the already blazing fire.

What is needed instead is education in critical thinking. The digital sector plays a prominent role in this today: How do I handle digital media and my data? What do I want to and may I reveal about myself, and what are the consequences of such disclosures? How do I protect myself against the tapping and possible disclosure of my data by "social" media and other Internet providers? How are the protection and security of my data guaranteed, and how do I avoid my (mostly reckless) self-exposure and incapacitation? What (possibly hidden, often difficult or impossible to quantify) price do I pay for apparently convenient and often free online applications? What are the social effects of innovations such as driverless cars, the Internet of Things, life loggers, "smart" toys, houses, apartments and surroundings, and what consequences can they (in addition to the advertised capabilities and amenities) have for me personally?

For all of these questions, broad, open and fair public education is indispensable. Discussions must not be conducted one-sidedly, driven by the interests of entrepreneurs and providers, but must always involve data and consumer advocates, critical associations, NGOs, etc.

At the level of government action, the challenges sit one level higher, not least because of the diverse international interdependencies and the long-term effects of a neoliberal deregulation policy that has now lasted almost 40 years. Here, sensible and well-coordinated regulations must ensure that states and public institutions retain or regain their capacity to act.

In the IT sector, for example, social media and online providers must be obliged to handle the data of their users carefully. Aggressive advertising must be prevented, users must be better informed about their rights and the possible consequences of their careless actions, and party advertising and the influencing of democratic elections must be excluded. Large monopolies in the information and communication market must be checked for their compatibility with democracy and, if necessary, broken up. International corporations must be made subject to national law (if necessary, in fragmented form) and appropriately taxed under it.

In the area of science and research, an ethics council must monitor the allocation of public funds and the use of third-party funding and, if necessary, refuse approval for inhumane AI or genetic research. Private universities, private research institutes and think tanks must be checked for their compatibility with the constitution and basic democratic principles and, if necessary, warned or closed.

In addition to the special educational content related to the digital world mentioned above, the "old" subjects and areas committed to humanism should not be neglected. The focus here is on historical and political education. Works like George Orwell's 1984, Aldous Huxley's Brave New World and The Power of Computers and the Impotence of Reason by Joseph Weizenbaum [15] should, for example, be required reading for all adolescents.

The old issue of religious versus ethics lessons in schools could be resolved Solomonically through a common subject Ethics and Religions, in which, in addition to the history and principles of ethics and humanism, fundamental knowledge of the major world religions is imparted. This would also include digital ethics as well as a critical examination of utopias such as immortality, the idolatry of machines or transhumanism.

And, last but not least, despite the obvious need for science, mathematics and computer science, it would be a big mistake to sacrifice the humanities to a glittering zeitgeist. They stand for a culture thousands of years old and for respect for the human mind as opposed to a hastily machined rationality. They preserve and convey to us immortal works of literature, philosophy, art and music that, even in the brave new transhuman world, no products of poetry-writing AIs, download surrogates or immortal cyborgs will be able to match.


  1. Claus V, Schwill A (2006) Duden Informatik A-Z. Bibliographisches Institut, Mannheim

  2. Crouch C (2008) Post-Democracy. Suhrkamp, Frankfurt am Main

  3. Crouch C (2015) The Numbered World: How the Logic of the Financial Markets Threatens Knowledge. Suhrkamp; also: From the Loss of Democracy and the Withdrawal of Knowledge, Deutschlandfunk 2015. Accessed June 17, 2020

  4. FoeBuD (2016) Accessed June 17, 2020

  5. Gore A (2014) The Future: Six Forces That Are Changing Our World. Siedler, Munich

  6. Harari YN (2017) Homo Deus: A Brief History of Tomorrow. C.H. Beck, Munich (translated from English by Andreas Wirthensohn)

  7. Heise (ed) (2018) Accessed June 17, 2020

  8. Hofstetter Y (2016) The End of Democracy: How Artificial Intelligence Takes Over Politics and Incapacitates Us. Bertelsmann, Munich

  9. Kosinski (2016) Brille-dass-es-die-bombe-gibt/. Accessed March 6, 2018

  10. Kurzweil R (2014) Humanity 2.0: The Singularity Is Near, 2nd edn. Lola Books, Berlin

  11. Lämmel U, Cleve J (2012) Artificial Intelligence. Hanser, Munich

  12. Moravec H (1999) Computers Take Power: On the Triumph of Artificial Intelligence. Hoffmann und Campe, Hamburg

  13. Tagesspiegel (ed) (2016) Accessed June 18, 2020

  14. Wagner T (2015) Robocracy: Google, Silicon Valley and Humans as Obsolete Models. PapyRossa, Cologne

  15. Weizenbaum J (1977) The Power of Computers and the Impotence of Reason. Suhrkamp, Frankfurt am Main

  16. Welzer H (2016) The Smart Dictatorship: The Attack on Our Freedom. S. Fischer, Frankfurt am Main



Open Access funding provided by the DEAL project.

Author information


  1. LMU Munich, Munich, Germany

    Wolfgang Hesse

Corresponding author

Correspondence to Wolfgang Hesse.

Additional information

This article is a slightly revised reprint of a book chapter from the volume:

Wiegerling K, Nerurkar M, Wadephul Ch (Eds) (2020) Datafication and Big Data. Ethical, anthropological and epistemological perspectives. Springer, Wiesbaden, pp. 213-228

Rights and permissions

Open Access This article is published under the Creative Commons Attribution 4.0 International License, which permits use, copying, editing, distribution and reproduction in any medium and format, provided you properly credit the original author(s) and source, include a link to the Creative Commons license and indicate whether changes have been made.

The images and other third-party material contained in this article are also subject to the named Creative Commons license, unless otherwise stated in the caption. If the material in question is not covered by the named Creative Commons license and the intended use is not permitted under statutory provisions, the consent of the respective rights holder must be obtained for the uses listed above.




Cite this article

Hesse, W. The Destructive Potential of Big Data and Artificial Intelligence for Democracy. Informatik Spektrum 43, 339-346 (2020).
