In 1974 or so, Lewis Thomas wrote an interesting essay titled “Computers,” which he reprinted in his book The Lives of a Cell. The whole essay is worth reading, but I especially like this bit:
Even when technology succeeds in manufacturing a machine as big as Texas to do everything we recognize as human, it will still be, at best, a single individual. This amounts to nothing, practically speaking. To match what we can do, there would have to be 3 billion of them with more coming down the assembly line, and I doubt that anyone will put up the money, much less make room. And even so, they would all have to be wired together, intricately and delicately, as we are, communicating with each other, talking incessantly, listening. If they weren't at each other this way, all their waking hours, they wouldn't be anything like human, after all. I think we're safe, for a long time ahead.
It is in our collective behavior that we are most mysterious. We won't be able to construct machines like ourselves until we've understood this, and we're not even close. All we know is the phenomenon: we spend our time sending messages to each other, talking and trying to listen at the same time, exchanging information. This seems to be our most urgent biological function; it is what we do with our lives.
The first paragraph seems to get quoted a lot for both its prescience and its lack of prescience; he seemed to think that we would get a single powerful AI before we had networking. (Which probably just means he didn't anticipate Moore's Law; it looks like he thought that building multiple powerful computers would be too expensive. And he didn't anticipate the Internet either, nor how hard AI would turn out to be.) The second paragraph seems to get quoted a lot in more philosophical contexts, with a focus on human communication instead of on computers. I like the juxtaposition of the two paragraphs, as they appeared in the original essay.