“We’re now seeing the transition we’ve been expecting. After five years, ebooks is a multi-billion dollar category for us and growing fast — up approximately 70 percent last year. In contrast, our physical book sales experienced the lowest December growth rate in our 17 years as a book seller, up just 5 percent.”
The book publishing conference season is in full swing and “discovery” is the buzzword du jour, driven by the curious notion that, with the decline of physical bookstores, readers supposedly can’t easily find books online. There’s even new research that claims “frequent book buyers visit sites like Pinterest and Goodreads regularly, but those visits fail to drive actual book purchases.”
Unfortunately, in the spirit of “lies, damned lies, and statistics,” that research is skewed partly by its authors’ underlying agenda (“Physical retail works if you protect it.”) but, more importantly, by its flawed methodology. Specifically, it depends on what’s known as last-click attribution, in which the final interaction before a sale gets 100% of the credit for the conversion, ignoring the reality of multiple touchpoints and myriad potential influencers.
The problem is that this assumes that people are waaaay less complicated than they really are. Very few people buy anything after one brand interaction. We’re comparison shoppers. We want the best deals. I don’t buy anything until I’m sure I’ve found the best item at the best price.
–“The Death of Last Click Attribution,” Kimm Lincoln
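To make the distinction concrete, here is a minimal sketch (with a hypothetical reader journey, not real data) contrasting last-click attribution with a simple linear multi-touch model, which splits credit evenly across every interaction:

```python
from collections import defaultdict

# Hypothetical journey: the touchpoints one reader hits before buying a book.
journey = ["Pinterest pin", "Goodreads review", "author blog", "Amazon search"]

def last_click(touchpoints):
    """Give 100% of the conversion credit to the final interaction."""
    return {touchpoints[-1]: 1.0}

def linear(touchpoints):
    """Split the conversion credit evenly across every interaction."""
    share = 1.0 / len(touchpoints)
    credit = defaultdict(float)
    for t in touchpoints:
        credit[t] += share
    return dict(credit)

print(last_click(journey))  # only the Amazon search registers
print(linear(journey))      # Pinterest and Goodreads get their share
```

Under last-click, the Pinterest and Goodreads visits contribute nothing to the tally, which is exactly how a study built on that model would conclude those sites “fail to drive actual book purchases.”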
Never mind the folly of dismissing Goodreads, a social network dedicated to books with 13m+ members and steadily growing, or even Pinterest, where Random House has inexplicably attracted 1.5m followers. The very idea that “something is really, chronically missing in online retail discovery” is arguably contradicted by Amazon’s 2012 results, which suggest that “online retail discovery” isn’t really a problem for readers.
It’s a problem for publishers.
METADATA: NOT THE UNICORN YOU’RE LOOKING FOR
Metadata is important. Few would argue otherwise.
But it’s a foundational piece of the puzzle, like saying “publish good books.” If we can’t get these basic steps right, then game over, turn off the lights, go home, and let the algorithms have at it.
Publishers will never beat Amazon at SEO. Hell, B&N can barely keep up with them, though I’ve noticed Goodreads is often in the top 5 results in my own searches. Where publishers might have a shot is in narrow niches that are typically ignored because they don’t generate bestsellers, but they’re more likely to lose out to their own authors and organic interest communities anyway, which isn’t necessarily a bad thing.
Meanwhile, every eager startup with some bootstrap funding and an angle on getting books indexed by Google that suggests otherwise is either lying to itself, or lying to the publishers it’s attempting to “partner” with as it positions itself for acquisition.
At Book^2 Camp yesterday, I asked point blank: what happens when all publishers have ideal metadata and all of their books are indexed by Google? The answer was, effectively, nothing. The playing field is once again leveled (though Amazon and Google will still live at the top of the hill), and the underlying problem, an ever-increasing glut of content, will remain unaddressed.
THAT THING WE SHOULD STOP TALKING ABOUT, AND JUST DO IT
“Fewer, better books.”
At some point in every conference, often more than once, somebody says it, we all nod in agreement, and yet, the output from “traditional” publishers continues to grow, or at least remain steady, each year. In the final years of the “bigger is better” era, investors demand it, budgets quantify it, and the bad decisions continue to pile up.
One of the first articles I published back when I was running Digital Book World was written by F+W Media’s CEO, David Nussbaum, wherein he made the bold claim himself: “Produce fewer, but better books.”
Not surprisingly, he caught a lot of flak for that post because his proposed solution, “eliminate the mid-list,” seemed rather draconian. While he clarified in the comments that what he meant was “middling product,” I think many would argue that publishers have mostly leaned in the direction of his original wording.
More Snooki, less mid-list, and the ice gets thinner and thinner.
KNOW THY READERS; SERVE THY READERS
One of my biggest frustrations with publishing conferences is the preference for harping on what publishers are supposedly not doing, instead of spotlighting those who are doing plenty, and doing it successfully.
Some forward thinking publishers (Osprey, Sourcebooks, Constable & Robinson, F+W Media and the like) have already begun to segment their lists into easily definable verticals. The next stage is to target the consumers who inhabit those verticals and to do that they will need to have strategies in place to leverage the market intelligence derived from ‘big data’ by using modern marketing techniques such as SEO and social media management.
–“Vertical Publishing. Take it to the people.” Chris McVeigh
Of course, “verticalization” gets thrown around like some kind of unicorn, too, and while I’ve noted in the past that there are plenty of pitfalls along that path, at least it’s a viable path, with plenty of success stories, and it is far less reliant on the hit-or-miss, gamble-by-committee, no-accountability approach that seemingly drives a lot of acquisition decisions these days. At least in the big houses.
The publishers who have a direct relationship with their readers — not necessarily via direct sales, but via direct engagement — are the ones who will not simply survive the “digital shift,” but will thrive, being less prone to the whims of Amazon, Apple, Google, or whomever the next big tech player might be. Readers won’t have any trouble discovering their books, old or new, nor will they have any obstacles to spreading the word to their friends about those books.
Publishers lacking that relationship are the ones with a discovery problem, and the clock is ticking…