After taking Yuval Noah Harari's brilliant course, A Brief History of Humankind, now available on YouTube, I decided to read his 2014 book, Sapiens: A Brief History of Humankind. The book brought numerous unexpected pleasures, including Harari's fluid, trenchant writing style.
I am now reading Harari's most recent book, Nexus: A Brief History of Information Networks from the Stone Age to AI, which brings new surprises, both in content and style. Note these two sentences from page 119 of Nexus:
To summarize, a dictatorship is a centralized information network, lacking strong self-correcting mechanisms. A democracy, in contrast, is a distributed information network, possessing strong self-correcting mechanisms.
The premise of Nexus is that since the beginning of time, no single person has been able to change much in our world; rather, communities (or information networks) are necessary to disseminate, interpret, and act on data for society to flourish. As Dennis Duncan's mostly unfavorable review of Nexus in the New York Times puts it, "In a nutshell, Harari's thesis is that the difference between democracies and dictatorships lies in how they handle information. Dictatorships are more concerned with controlling data than with testing its truth value; democracies, by contrast, are transparent information networks in which citizens are able to evaluate and, if necessary, correct bad data."
You can see why the two sentences quoted above from Nexus are essential to understanding Harari's proposition. Taken together, those sentences sing because of the parallelism in both parts of the statement. The first part of each sentence describes the type of information network belonging to each form of government, and the second part reflects each government's relationship with what Harari calls self-correction. Communities that commit to the infallibility of a central guiding doctrine, whether economic, political, or religious, do not take well to self-correcting. Such systems are intolerant of people who question their doctrine. By contrast, true scientific communities are by definition self-correcting. They have no infallible doctrine and continually seek to advance knowledge and human prospects, meaning they will be quick to overrule an existing standard if someone can disprove it and propose a more fitting one.
The second part of the 500-page book covers what will happen to self-correcting communities as AI exerts greater influence on the world. I'm not there yet, but I continue to delight in Harari's thinking and expressiveness.