Koestler, Kurzweil, Wakefulness

I first looked into the work of Arthur Koestler after I heard that he had something particularly interesting to say about Pythagoras.  Koestler is one of those writers who, sadly, lies largely forgotten. This is striking given that he was one of the most popular writers of the mid-20th century, and showed a degree of versatility almost unheard of today, writing not only the great anti-Stalinist novel Darkness at Noon, but also penetrating works on the history of science such as The Sleepwalkers.  The value of the man’s work is obscured not just by forgetfulness, but by personal scandal, the most damning of which is an allegation of rape made by the wife of one of Koestler’s friends. An allegation, it must be said, that was made after Koestler had died, and against which he, therefore, was never able to defend himself.

For anyone interested, the CBC has a detailed biography of Koestler over at IDEAS.

There are several lessons to be taken away from The Sleepwalkers, written in the 1950s, that are extremely relevant to the present, most especially when one combines the insights of Koestler with those of a current mystic/scientist- Ray Kurzweil.

Koestler’s The Sleepwalkers reminds us of the spiritual origins of science. This was the case from Pythagoras, the first person to conceive of the universe as a mathematical model, right down through Kepler and Newton, who were both on a spiritual quest to uncover the “mind of God”.

Koestler also brings to our attention the zig-zag nature of what we would call progress.  There are long dark spells in the history of human knowledge, such that “the world in 1500 knew less mathematics than at the time of Archimedes”, and also long plateaus where nothing really interesting seems to happen.   Periods of progress are the exception, not the norm, though such periods can be spectacular in how they revolutionize human life.

If we step back to take in the big picture we do indeed see something like an almost linear advance of human knowledge and power over nature- the progress from “cavemen to spacemen” in Koestler’s corny characterization.  What we do not see, and this would have been especially apparent to Koestler writing just after the horror-show of World War II, is anything like a corresponding moral progress of humankind.  In fact, the periods of greatest technological and scientific advancement also saw the greatest extension of human cruelty.  And it is this technical progress coupled with moral immaturity that Koestler sees as our greatest danger.

What does all of this have to do with Ray Kurzweil? Well, Kurzweil is the best known prophet of what is known as the Singularity- a distinction which won him a profile in Time Magazine a few years back. The Singularity movement essentially holds that the exponential growth in computer power will eventually result in machines that match and then quickly exceed human intelligence.  This world of post-human intelligence will solve the perennial problems that are now largely in the realm of religion- especially the problem of death. Human beings will be able to integrate themselves with superior machine intelligences, and be able to avoid death by uploading to a digital “cloud”.

Kurzweil is like an Al Gore from the other side of the sun. He is famous for his slideshows and graphs, which show the exponential progress of almost everything, but especially of the power of computers. Exponential curves grow very slowly at first and then explode with incredible impact, like the proverbial lilies on a pond that seem to cover almost none of the water until their final few doublings, when they expand to cover everything.  By the middle of this century technological progress will become so rapid and so overwhelming that we will leap into a post-human world seemingly overnight.
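The lily-pond arithmetic is easy to make concrete. A minimal sketch (the 30-day horizon and the coverage function are illustrative assumptions of the proverb, not anything taken from Kurzweil’s actual charts):

```python
# Lily-pond doubling: coverage doubles each day and reaches 100% on day 30.
# Working backwards, the fraction covered on day d is 2 ** (d - 30).

def coverage(day, full_day=30):
    """Fraction of the pond covered on `day`, fully covered on `full_day`."""
    return 2.0 ** (day - full_day)

# Even three days before the pond is completely covered, only 1/8 of it is;
# five days before, barely 3% - which is why the explosion feels sudden.
for day in (20, 25, 27, 29, 30):
    print(f"day {day}: {coverage(day):.4%} covered")
```

The point of the illustration is that an observer sampling the pond at almost any moment of its history would conclude nothing much was happening.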

Members of the Singularity movement certainly bear a striking resemblance to the mystic-scientists of Koestler’s imagining- the very people who gave birth to the scientific world-view in the first place.  It would be a colossal mistake to dismiss Singularians as a bunch of sci-fi addicted kooks. Kurzweil himself is a genius inventor, and figures at some of the very largest tech companies, such as Sergey Brin at Google, are disciples of the movement. Indeed some of the best and brightest in the fields of computer and genetic science are consciously pursuing the religious goals that are at the root of the Singularity movement.

Believers in the Singularity will be the first to insist that theirs is not a utopian movement, which is the best giveaway that we are dealing with a truly utopian line of thought. Indeed, from where I sit, it is hard to see the Singularity as anything other than the mother of all utopias, with its promise of immortality, universal abundance, machine sentience, and omniscient intelligence. Like other utopian ideologies we’ve seen before, Singularians exhibit a strange mix of determinism and human freedom. The Singularity is said to be written in the stars, the destiny of the universe once intelligent life emerges, and at the same time is relentlessly pursued by individual inventors and thinkers.

The Sleepwalkers should provide a cautionary tale for true believers in the Singularity. Koestler makes us aware of the non-linear nature of progress when viewed in the time frame of an individual life, or even of centuries. Sometimes humanity gets stuck and just spins its wheels, or even lurches backward. In fact, some of the best critiques of the assumptions Singularians hold today about current rates of progress come from the school that might be called “where is my jetpack?!” These thinkers argue not only that present reality has failed to live up to all of the hype from the middle of the last century- instead of bases on Mars and cold fusion we have the iPhone- but that this very lack of technological progress is the true source of our current economic ills. There is mounting evidence that we may even have hit a plateau in terms of scientific discovery.

Singularians appear almost fanatically driven by the desire to make their vision come true right now. One wonders why someone would push so hard to reach what they consider an inevitable destination, especially when trying to get there so fast potentially puts humanity in such grave danger- how could the earth possibly survive if people physically lived not just a century, but centuries, and under conditions of hyper-abundance? What will people do to sustain themselves if we ever actually do manage to create sentient machines? And these are only the material questions; the moral questions aren’t dealt with at all, including the existential value of the fact that we die, so beautifully articulated by one of the few giants of the tech industry who didn’t believe in the Singularity- Steve Jobs.  One wonders, what is the rush?  Only to realize that the hurry stems from the fact that the Singularians themselves hope to defeat death. They are terrified of death, in fact so terrified that they are willing to risk humanity itself so that they personally will not have to die.

And this is another thing that The Sleepwalkers points out to us: that technical and scientific knowledge does not entail our moral development- quite the opposite. Technological change, especially rapid technological change, seems to go hand in hand with the periods when human beings treat each other the worst, from Iron Age warfare, to the religious wars fueled by the Gutenberg printing press, to the industrial revolution and total war.  For Koestler, we were at a crossroads, hurtling towards utopia or dystopia, and The Sleepwalkers was meant as a warning.  He wrote:

Thus within the foreseeable future, man will either destroy himself or take off for the stars. It is doubtful whether reasoned argument will play a role in the ultimate decision, but if it does, a clearer insight into the evolution of the ideas that led us to the current predicament perhaps may serve of some value.

(It) may serve as a cautionary tale against the hubris of science, or rather, the philosophical outlook based on it.

Our hypnotic enslavement to the numerical aspects of reality has dulled our perception of non-quantitative moral value: the resultant end-justifies-the-means ethics may be a major factor in our undoing.

(Koestler hoped his tale) may have some sobering effect on the worshipers of the new Baal lording it over the moral vacuum with his electronic brain.  *

Perhaps we could have avoided all the carnage and dislocation that occurred in past periods of technological change had we kept our wits about us and thought things through before we acted.  Kurzweil himself has acknowledged that there may be some “bumps in the road” as we approach the Singularity, but working as a consultant for the US military, and founding a “university” in search of ways to contain the ill effects of a reality he himself is trying to create, seems a little like sub-contracting strategies for climate change adaptation out to ExxonMobil.

Kurzweil himself does not believe in regulating technology. The future is for the technologist, not the government, to decide. But to the extent that, in a democracy, we are the government, he leaves no role for the rest of us to have a say in the future world that both we and our children will inhabit.

I have no idea how we might choose our technology in a reflective way, something that has never been done before, but I do know one thing: while I may not have time, we have time to think about what we are doing before we cross what may be very dangerous thresholds- we have the chance to finally cease being sleepwalkers and wake up.

* The Sleepwalkers, 1959, Arthur Koestler, pp. 552-553

20 comments on “Koestler, Kurzweil, Wakefulness”

  1. James Cross says:

    As I point out in a post on my blog, the underlying psychic force behind the singularity movement is the ongoing effort of humanity to deal with the problem of mortality. It is no surprise that Ray Kurzweil has written not only The Age of Spiritual Machines, in which he hails the coming age of AI, but also Fantastic Voyage: Live Long Enough to Live Forever, in which he hopes, through biotechnology and the eventual downloading of his mind to a machine, to achieve immortality. The project to create intelligent machines and the project for immortality are two sides of the same coin.

    I don’t think that either one of these goals is achievable or even desirable. We may extend life far beyond its current span, but we will not achieve immortality. We may create very intelligent, but not sentient, machines, but we will not be able to download our consciousness to them. Ultimately I think consciousness will be shown to be a property only of living matter. Neither of these achievements will ultimately satisfy us. In the end, what will satisfy must be some new appreciation of the moral value that Koestler writes about.

    Thanks for bringing Koestler to my attention again. It has been a while since I have read him and I may want to do a refresher.

    • Rick Searle says:

      Hello James:
      Thanks for commenting on my post. I certainly share your doubts as to whether the goals of the singularity movement are truly possible or even desirable. I think the points you make in your post “Why the Future Needs Us” dovetail nicely with the point that Koestler was trying to make in The Sleepwalkers (though for him the background was the risk of nuclear war), and that I was trying to make in my post.

      The Singularians are not thinking through the possible consequences of what they are doing, the real consequences of which are unpredictable, and might result in disaster even if the destination turns out to be far short of their goals.

      I look forward to your future posts regarding consciousness as limited to “wetware”, and your other posts as well.

  2. Charles says:

    I just finished a book by Kevin Kelly, ‘What Technology Wants’ which claims that technology evolves and regardless of what individuals think (or even act to stop certain technologies from coming into existence), they are going to be around in our lives. While that perspective is too techno-optimistic for my taste, he did make an important point – in the long run, technological (and scientific) changes are going to alter the way we lead our lives (whether we like it or not). I doubt the existing generation is likely to survive to witness the Singularity but I suspect we are getting close to it. As it is, we are already handing over massive amounts of personal details to Information Technology companies and social networks such as Twitter and Facebook… The debate isn’t whether we want to embrace these technologies but how to ensure our privacy rights are still protected…

    • Rick Searle says:

      Hello Charles:

      Thanks for commenting on my post. I actually wrote a post on Kelly’s book a while back, if you want to check it out: https://utopiaordystopia.com/2011/12/29/what-humanity-wants/

      I’d like to know what your take on the book was too.

      The problem I see with Kelly’s reasoning is that the only way he thinks we might be able to control technology is through individual choice- I CHOOSE to bring this or that gadget into my life. But the counter-intuitive example he gives of people who are able to effectively control technology- the Amish- don’t do so as individuals but as a society. He never speculates on whether some more collective, democratic system of figuring out which technologies are of net benefit or net harm is even possible. Not that I am certain such a thing is possible myself.

      • Charles says:

        I would agree that individual choice has very little impact when a new technology overtakes its predecessors and becomes dominant. A smartphone, for instance, lets one tweet, Facebook, IM, and read documents so that one is constantly on the ‘job’.

        I’m also not entirely sure if democratic platforms could temper or modify technologies to ensure they don’t create harm. For one, consensus might be difficult to achieve. More likely, the boundaries of what is acceptable (or not) for a particular technology (and whether society might decide to discard it) might evolve over time through multiple forums, which could vary depending on factors such as the context of its use, the nature of the technology, its potential harm, etc.

      • Rick Searle says:

        My concern, Charles, is less with “gadgets”, or platforms such as Twitter- though I think they are enormously important. My concern is with technologies that would fundamentally alter what it means to be human- the kinds of things that you find with transhumanism. I think it might be possible, though I am pretty pessimistic that it will work, to create a democratic means of choosing what thresholds should be crossed and when.

        It is one thing to have people be forced to adopt this or that technology in order to function effectively in a society and in my view quite another to force them to transform elements of their humanity. I am talking about technologies likely to appear over the next generation such as the genetic engineering of children before birth, enhancement technologies that permit better cognitive and emotional performance (and perhaps even morality), silicon based implants that allow direct communication between persons and artificial intelligence- that sort of used to be sci-fi thing.

  3. James Cross says:

    I am mostly with Rick’s view on this.

    I think perhaps there is a bigger problem in that we are often slowly lured into technologies without understanding their full impact. I doubt if anyone would have imagined the wide impact of the automobile at the time of its invention – pollution, global warming, urban sprawl, and finally the entire oil driven geopolitics.

    When we come to genetic engineering and other enhancement technologies, each step down the road might look completely benign but when we look back a hundred years from now we may discover ourselves and our society changed in ways we can’t imagine today.

    • Rick Searle says:

      Totally agree, and this is a much better fit with Koestler’s warning about “sleep walking” than my original post. The question I’d like to ask you, James, is whether or not you think there is a way out of this predicament?

      • James Cross says:

        I am really not sure.

        I am going to try to address some of this in the Part II post on Why the Future Needs Us but I don’t claim to have any definitive answer.

        Here is a little of my line of thought on this, which I am still working through. I think for the next 100 to 200 years or so we are in a very dangerous period. I think we could easily slip into any number of traps that transform us into something much different from what any of us would now want to become, or see us selected out of existence by the machines we create. If we get through this period in some way, I think we will create a future technology that will be in some way “living”. It will be conscious but it will not be in a power struggle with us. It will be an equal partner to us. We can step back from merging with this technology and return to our natural biological roots – we must accept death and cease striving for immortality. This technology, although “living”, may not be based on DNA and it may continue to evolve long after we as a species have died away. We will follow a different path that leads us to an Enlightened Society based on the fundamental ethical principles underlying all the great religions. And we will, with Blake, “see a world in a grain of sand And a heaven in a wild flower, Hold infinity in the palm of your hand And eternity in an hour”.

        If this sounds vague and “off the wall”, I agree it is. It might be overly optimistic- perhaps the Enlightened Society is a thousand years off instead of a few hundred. And I don’t think at all that it is guaranteed we get through the next 100 to 200 years without damaging ourselves beyond repair.

        You will need to see the context of this in my post.

      • Rick Searle says:

        My next post will be about the novel Accelerando by Charles Stross, which deals with very similar questions and possibilities to the ones you raise. Keep an eye out for it; I will certainly keep an eye out for your Why the Future Needs Us, Part II.

  4. This was an interesting find. You’ve mentioned two of my favorite thinkers in one post. Koestler is one of my all-time favorites, whereas Kurzweil is a recent fascination. I’d agree with much of your assessment of his outlook, but the one thing I do take away from the Singularitarian manifesto is that it’s a likely scenario.

    Kurzweil himself and many others might be desperate to see it happen within their lifetime, but it happening sometime in this or the next century seems somewhat inevitable. I say this not out of endorsement but out of an appraisal of human history from the last 100,000 years. And I agree with your moral assessment, hence why it scares me to think it could happen sooner rather than later.

    But getting to Koestler, have you read “The God that Failed” or “The Ghost in the Machine”? They are both excellent, especially the first. In case you haven’t heard of it, it’s an autobiographical piece where he and other former communists tell of their experiences during the 1930s, what led them to communism and what led to their eventual break with the Party. It’s marvelous, especially Koestler’s contribution.

    The Ghost in the Machine is also very good and contains his thoughts on human psychology, biology, and the stupidities of the behaviorist movement. That one inspired me to write a freaking novel! Again, I’m assuming you haven’t read these, so forgive me for wasting your time if you have. In that case, would love to hear your thoughts on them.

    • Rick Searle says:

      Hello Matt,

      For my part, I am not so certain that the singularity is as inevitable as proponents propose. That was one lesson I took away from Koestler, who challenged the idea that science had any inevitable course. (I suppose this grew out of his experience with communism, which also saw history as deterministic.) This point was also, I think, made by Charles Stross in his Accelerando, which I reviewed on my blog. To me Stross was offering a possible explanation for Fermi’s Paradox- the idea that if intelligent extraterrestrial life exists, where the hell are they? He was suggesting that perhaps the most likely outcome of approaching the singularity is for civilization to collapse rather than reach some god-like stage.

      I also think that taking the singularity as a determined outcome of history ignores the amount of active pursuit going on to bring it about by people like Kurzweil, and also by technology heavyweights such as Sergey Brin at Google. In addition, it discourages attempts to give the singularity a shape that is consistent with humanistic values.

      I am sure you explore some of these questions in your books, which I’d love to check out. All of them look pretty awesome, but is there one you would recommend I start out with?

      Both The Ghost in the Machine and The God that Failed are on my summer reading list. (I am late in discovering Koestler). I’d love to exchange thoughts on them as soon as I’ve got them under my belt.

      • Of my books, not sure which I’d recommend. I can synopsize them for ya and see if any tickles your fancy: Source is a space operatic piece that deals with resource control and the relationship between an environment and its people. It’s also got some space colonization going on, not to mention some aliens and interstellar war. It began as a collection of short stories that turned full-length.

        Eyes in the Dark, Flight of the Icarus, Turncoats and Vega Rising are all set in the same universe and kind of prequel my first full-length novel. Eyes is about generational ships on their way to a new world, Icarus is about the development of FTL, Turncoats is about espionage and warfare, and Vega is about war and insurgency on an occupied world. If any of those sound interesting, let me know. I’ll see if I can hook you up with a free ebook.

      • Rick Searle says:

        They all sound really interesting, but I think I’ll start with Source.

        Thanks for your generous offer, but I know the book will be well worth my money. And besides, maybe once I finish my own book you’ll do me the honor of buying mine as well. ;>)

      • I didn’t know you had a title! Where might I find it?

      • Rick Searle says:

        The book’s in process, but I hope to be done by next summer, at least with the first installment. It’s mostly a non-fiction book and kind of a hybrid of political philosophy, history of science and religion, science-fiction, history of real-world utopian experiments and why they ultimately failed, and contemporary political and social commentary. The tentative title is: Utopia: The Traveler’s and Builder’s Guide.

        In parallel, I am also working on a second book this one on dystopia. The tentative title for that is: Dystopia: The Survivor’s and Rebel’s Guide. Hopefully I will be able to finish that by spring 2013.

        I work full-time and have two young daughters, so it’s tough to get writing in- but I’m pretty determined. Hopefully I can count you as a future reader.

      • Yeah, sounds mighty interesting. Not to mention right up my alley!

  5. There are two major problems, in my estimate, with the entire singularity movement and its expectations.
    1. It assumes that human intelligence is basically a computational and measurement process. In my view, it is not. Instead, it is the fusion of many forces, some quite physical and others spiritual and largely undefinable in metric terms- much as the “introspectionists” of the early 20th century wrongly assumed that with close attention it was possible to “introspect” the mind’s processes, understanding them in a way that would allow us to improve the species.
    2. It assumes, or is willing to accept, that the move toward ever greater computing power will be allowed to progress to a logical and largely benign singularity end. Nothing in human history suggests that this would be the case. Instead, the acolytes of singularity would very quickly impute godlike qualities to themselves, building machines that, while far from human, were capable of enslaving great segments of humanity. In short, flawed humanity’s ability to build something without human flaws must be viewed as highly unlikely. Moreover, as suggested in 1. above, no matter how humanlike, machines wouldn’t be likely to possess human compassion, viewing their tasks in the cold light of logic, eliminating everything they considered inferior.

    • Rick Searle says:

      Thank you Barry, for reading this post. I almost forgot it was there.

      I certainly agree with your first point that human intelligence is a “fusion of many forces”, though I don’t think we even need to bring up the “spiritual” here. How about the complex way we are shaped by parental relationships, culture, friends, etc.? I don’t agree that it is theoretically impossible to replicate this complexity in some other substrate, but the Singularians’ timeline- 2030-2045- I find WAY WAY too optimistic.

      As to your point that “flawed humanity’s ability to build something without human flaws must be viewed as highly unlikely”, I agree 100%.

      In light of your observation:

      “no matter how humanlike, machines wouldn’t be likely to possess human compassion, viewing their tasks in the cold light of logic, eliminating everything they considered inferior”

      You might like this post:

      Psychobot