UX in publishing meetup
The ‘UX in publishing’ meetup was put on by OpenAthens, the academic single sign-on service whose goal is to create a seamless end-user journey for people accessing password-protected e-resources across all platforms and publications.
We talked about:
- RA21 – the publishing and medical industry’s initiative to improve the user experience of people accessing academic articles whilst ensuring that only those entitled to access get it. They recently published their ‘Recommended Practices for Improved Access to Institutionally-Provided Information Resources’. It’s slightly ironic that their own website doesn’t have great UX, but it’s interesting that, depending on your point of view, they are either working to make it easier for people to get access or tightening their grip on controlling it.
- Publishers using ‘impact factor’ as a metric for understanding how successful an article is, and how that is made up of things like how many times the article has been cited (a rough sketch of the usual calculation is below this list). I think the idea behind a single metric like this is to give librarians and customers a guide for purchasing decisions. I wondered if there was any use for a similar approach with Standards that uses socially-driven measures from other customers to help potential customers make purchasing decisions, something like ‘x number of businesses have used this standard’. This relates to how publishing as a concept communicates its value proposition when the purchaser doesn’t know if they are going to get value from what they read until they’ve read it, but has to pay up front in order to read it. It’s a commercial model weighted in favour of the supplier and using a traditional optimised-for-production approach. I wonder what publishing (books, articles, standards, or any communication of ideas) might look like if it took a more modern optimised-for-consumption approach.
- Chest Agreements, which are negotiated preferential licence agreements for software and online resources for the academic sector. The business model here is that universities are judged by how much money they save rather than how much they spend, and so an organisation that negotiates with the likes of Adobe to agree bulk purchase prices on behalf of academic institutions can corner that market. They then become the default place to go for purchasing access to software and other digital products, such as books published by the American Psychological Association. They serve as an intermediary and aggregator, and are attempting to tackle the issues around access control of digital content and software products.
- Sci-hub, the pirate website “that provides free access to millions of research papers otherwise locked behind paywalls. Widespread dissatisfaction with scholarly communications has led many to overlook or dismiss concerns over the site’s legality, praising its disruptive technology and seeing justification in the free access it affords people all over the world.”
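As an aside, here’s a rough sketch of how the standard two-year impact factor is usually calculated. This is my own illustration rather than anything from the talk, and the numbers are made up:

```python
# Two-year impact factor: citations received this year by items published in
# the previous two years, divided by the number of citable items published in
# those two years. Strictly a journal-level metric, often used as a proxy for
# the quality of the articles inside it.

def impact_factor(citations_this_year: int, citable_items_prev_two_years: int) -> float:
    return citations_this_year / citable_items_prev_two_years

# Hypothetical journal: 420 citations in 2019 to the 150 articles it
# published across 2017 and 2018.
print(impact_factor(420, 150))  # 2.8
```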
So, clearly there is a theme running through all of this that is about controlling access to content, on individual and institutional levels. There is some thinking around shifting away from the commodity approach of accessing individual articles, standards, etc., towards accessing the service that provides those things. I wonder how much research has been done on how accepting the market is of that shift (although I probably wouldn’t be able to access it even if there was research about it).
I also read Open Source Beyond The Market, DHH’s keynote on open source software, markets, debts and purpose from RailsConf 2019. He reflects a similar line of thinking: that selling digital products is often based on the unit economics of traditional commodity commerce, where each individual thing produced had a cost and so the selling price had to be based on that, but with software and access to digital content there is no unit production cost, and so the old way of thinking breaks down (a toy illustration is below). As he’s talking about open source software, he’s arguing for allowing free access rather than developing different pricing models, but he presents some really interesting thoughts on the context of it all.
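Here’s a toy example of what the breakdown looks like. This is my illustration with made-up figures, not DHH’s:

```python
# Break-even price per copy: fixed costs (writing, editing) spread over the
# copies sold, plus whatever it costs to produce each individual copy.

def break_even_price(fixed_costs: float, unit_cost: float, copies_sold: int) -> float:
    return fixed_costs / copies_sold + unit_cost

# Say it costs £20,000 to write and edit a book either way, and you sell
# 10,000 copies. A printed copy costing £3 to produce and ship needs a £5
# floor price; a download, with ~£0 unit cost, needs only £2, and that floor
# keeps falling towards zero as more copies are sold.
print(break_even_price(20_000, 3.00, 10_000))  # 5.0
print(break_even_price(20_000, 0.00, 10_000))  # 2.0
```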
The Domino Project
What happens when a publisher has a tight, direct connection with readers, is able to produce intellectual property that spreads, and can do both quickly and at low cost? A new kind of publishing, the brainchild of Seth Godin, and powered by Amazon.
The Domino Project is named after the domino effect—one powerful idea spreads down the line, pushing from person to person. The Project represents a fundamental shift in the way books (and digital media based on books) have always been published. Eventually consisting of a small cadre of stellar authors, this is a publishing house organized around a new distribution channel, one that wasn’t even a fantasy when most publishers began.
We are reinventing what it means to be a publisher, and along the way, spreading ideas that we’re proud to spread. Our core beliefs:
- Exceptionally high quality ideas, created without regard for what bookstores and middlemen want.
- Ideas packaged with cogency and urgency in mind, not a word wasted, no filler.
- Permission at the heart of the model. Ideas for our readers, not more readers for our ideas.
- Virality first. An idea that requires a direct sale won’t thrive in a world where the most powerful ideas spread from hand to hand. Create content that works best when spread, and then package it so it’s easy to spread.
- Reward the sneezers who stand up and spread these ideas.
- No patience for obsolete institutions. Bestseller lists are not worth compromising for.
- Speed triumphs. Rapid time to market, rapid evolution, rapid response to reader feedback.
- Format agnostic. Kindle, audiobook, paperback, collectible… all good.
- Different products for different customers. A variety of price points and formats to match audience desires.