The article ranges across subjects from psychology to philosophy, and its most interesting passage comes in the midst of a discussion with philosopher Daniel Dennett about the neurophysiological processes behind consciousness and language:
"This is what I've meant over the years when I've said that the brain is a syntactic engine mimicking a semantic engine." By that, Dennett presumably means that consciousness produces orderly, grammatical representations of something out there in the world that is meaningful, but it does not create meaning. It is not necessary to meaning.
This argument rejects the notion that a Turing machine cannot actually be conscious merely because it mimics the understanding of language, by suggesting that this same mechanical procedure is the only system at work within the human mind. When we reduce the physical processes in the brain to their fundamental components, we find that they are no more remarkable than the internal workings of a digital computer. Critics of AI argue that machines and algorithms cannot be conscious because we can observe precisely how they reproduce human behavior. Perhaps once human behavior itself is demystified to the point where our consciousness is understood completely, those critics will be willing to apply the label of conscious thought more broadly.
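The idea of a syntactic engine mimicking a semantic one can be made concrete with a toy sketch. The following hypothetical example (in the spirit of ELIZA-style pattern matching, not Dennett's own formulation) rearranges symbols by rule, producing grammatical-looking replies while having no access to what any of the words mean:

```python
import re

# Rewrite rules: each pattern captures a fragment of the input and
# splices it into a grammatical response template. The program never
# represents what "worried" or "machines" mean -- it only shuffles symbols.
RULES = [
    (re.compile(r"i am (.*)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"i think (.*)", re.I), "What makes you think {0}?"),
]

def respond(utterance: str) -> str:
    """Apply the first matching rewrite rule; fall back to a stock reply."""
    for pattern, template in RULES:
        m = pattern.match(utterance.strip())
        if m:
            return template.format(*m.groups())
    return "Tell me more."

print(respond("I am worried about machines"))
# A reader who receives these replies may attribute understanding to the
# system, yet its entire operation is mechanical substitution.
```

The point of the sketch is that the output is syntactically well-formed and can pass, briefly, for understanding, which is exactly the gap the critics of AI lean on and which Dennett's line collapses.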