We publish humans, not machines.

There’s nothing “intelligent” about AI or Large Language Models, there’s definitely nothing creative about them, and they’re really, really awful for the world.

Besides gluttonously consuming absurd amounts of energy, water, and other resources so that a handful of really awful rich people can get even richer, they’re making us all dumber, more reliant on computers, and more distrustful of our own capacity to understand and shape the world we all share.

AI operates on prediction. Each word or pixel that a chatbot or image generator spits out is a computer’s guess at what should follow the previous ones, based on all the human writing and images fed into its training sets. That’s how it generates sentences that sound human and images that look “real,” fooling some into believing there’s a real intelligence in there.

There isn’t.

Real writing, the kind that we call “art” and “literature,” the stuff that conveys actual truth and meaning, the stuff that changes your world, is never going to be predictable. Ask a writer how a particularly beautiful sentence or a soul-shaking truth appeared in a manuscript, and most will give you a confused and perhaps even panicked stare. That’s because real writing and real art come from otherworldly and unconscious realms that only artists, writers, mystics, and madmen ever visit, and never for very long.

Pagan and animist people believed that thoughts and stories came from the gods and spirits. That’s why it’s called inspiration: a spirit got into you. And when you hear a really good song, read really good writing, or gaze upon really good art, it feels like something spiritual just happened to you; that’s because it did.

But there’s no spirit in AI. There’s no ghost in the machine.

And that’s why we don’t use it.

Sul Books only publishes humans, not machines. All our artists, authors, and editors have agreed not to use AI, and we’ll never use it in anything we offer to the world.