wiki books

With my purchase of an iPad I have finally made the move to actually reading ebooks – about time, since I have (for theoretical reasons) long been a proponent of electronic reading. However, in hunting down an obscure book (a history of Byzantium written in 1892, in ePub format), I discovered that it was full of typos and layout errors (probably because it was scanned using OCR). My first reaction was to correct these (a reflex for someone who spends his life working with digital texts) and, in truth, I may well have been tempted to do so had the ebook reader I was using allowed me to edit the text easily. Assuming such editing facilities exist – and I can’t imagine that, if they don’t already, they will take long in coming – and assuming I were motivated to correct the whole text, my next and natural step would be to return the corrected book to the internet so that other people might read my ‘cleaner’ version… (much as I already do with CD tags)

And this set me thinking. Even though my intention might be to correct typos only, I could well make corrections in error – that is, I could change the text. And what if I were more ambitious and deliberately set out to improve the text? This, after all, is what is being done all the time in various wikis. Surely it would take only a short step to consider a book as just another digital text that could benefit from improvement. Before you know it, you would have countless versions of each book drifting around the internet.

Now, as an author, this somewhat alarms me (I have considered ongoing corrections of my own texts, but that’s another story) because I put a lot of effort into bringing my books to a state that I am satisfied with and I have all kinds of intentions behind what I write that might not be immediately obvious to a reader. What a reader might imagine to be an error, for example, might in fact be something deliberate that only comes into play somewhere deeper into the text. I don’t think it would be hard to knock up a list of what could go wrong with a wiki approach to books – and that’s without even considering deliberate defacement. My question is: how would it be possible to make certain this does not happen?

I would suggest that keeping a text ‘authentic’ could become much harder than it might at first appear. You might argue, for example, that the versions of a text produced by a publisher will remain ‘quality assured’. But that claim rests on assumptions about security – and we all know that the moment something is traded digitally on the internet it becomes vulnerable to all kinds of ‘interventions’, both accidental and deliberate. It seems probable to me that texts may be even more vulnerable than other digital objects: if a word changes, who is going to notice? Unlike a program, the text is not suddenly going to stop working.

Beyond this, as the recent recall of Jonathan Franzen’s Freedom shows, errors can creep in even during the publishing process. Although this article doesn’t appear to say so, my understanding was that these errors occurred because an earlier version of the text was printed – and that this was discovered entirely fortuitously, by Franzen leafing through a published copy and noticing something in it he knew he had corrected. (I think this is likely to be a cause of nightmares for authors – it certainly is for me.)

So, it seems to me that though the untethering of books from their fixed, papery form frees them in all kinds of beneficial ways, it also carries a risk: texts are going to become more malleable, and perhaps we are going to lose the notion of a definitive version of a text.

Before the advent of printing, books copied by hand were prone to countless errors and accidental changes. Before writing, the content of books was passed from one mind to another orally. I wonder if what we are witnessing is a return to that far more fluid form of ‘storytelling’…?

Posted by Ricardo

writer and blogger

6 Replies to “wiki books”

  1. […] in to correct any errors. (I deal with the notion of direct reader correction of digital texts in this post.) On balance, I feel it is likely that an author would wish to retain control of his text. However, […]

  2. Daniel Cardoso 17th May 2011 at 6:59 pm

    And yet in the pirating world, working and legitimate versions consistently prove more accessible than the others.

    I do get your point, and see reasons for concern in it, but we must not discount:
    – crowd-sourcing (of the volunteer kind) for quality assurance
    – the increased difficulty of tampering with a text as compared with simply copying it seamlessly.

    After all, what Gutenberg gave us – and what computer technology refined to its absolute power – is the possibility of replicating one thing exactly as it is.
    Which isn’t to say that ill intent should be factored out, of course. 🙂

  3. Surely you just have to become more specific in stating which version you are talking about, preferably a version controlled by some reputable source.

    I.e., The Third God, Wikipedia revision 9864734. Of course, if you can’t easily see the changes between one source and another without reading both – say, the Wikipedia and Amazon versions – it gets harder; but if we can edit these things, we should be able to diff them too…
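    Purely as an illustration of that last point – the filenames here are made-up placeholders, not real sources – comparing two versions of the same text is trivial with standard tooling. A minimal sketch in Python, assuming both versions exist as plain-text files:

    ```python
    # Hypothetical example: diff two text dumps of the same book.
    # Both filenames below are placeholders.
    import difflib

    with open("third_god_wikipedia_rev_9864734.txt") as a:
        reference = a.readlines()
    with open("third_god_amazon.txt") as b:
        suspect = b.readlines()

    # unified_diff emits only the changed lines plus a little context,
    # so even a one-word alteration stands out immediately.
    for line in difflib.unified_diff(reference, suspect,
                                     fromfile="wikipedia-rev-9864734",
                                     tofile="amazon-edition"):
        print(line, end="")
    ```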

    1. there could be certified editions, of course, but the problem is in keeping a certified edition ‘closed’. Security would have to be maintained somehow along the entire delivery chain… from delivery by the author, through processing by the publisher (editing, copyediting, layout etc), transfer into the various ebook formats, and delivery of those digital objects to distributors – Amazon, for example, or Apple… I suppose this could be an argument for direct delivery from a publisher – or even an author – without any middlemen… (a sketch of how such a certified edition might be checked follows below)

      however, many editions will not be obtained this way – there will be a vast exchange of pirated texts – and I can’t see how those could be quality controlled… even diffing them raises the question: what would you diff them against? Besides, what if other texts arise that people prefer to the originals…?
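      to make the ‘certified edition’ idea concrete – and this is only a sketch, with the digest value and filename as placeholders – a publisher could publish a cryptographic hash of each official file; anyone could then check a downloaded copy against it, and the certified file also gives you a canonical object to diff against:

      ```python
      # Sketch: verify a downloaded ebook against a publisher-issued digest.
      # Both the digest and the filename are hypothetical placeholders.
      import hashlib

      PUBLISHER_SHA256 = "<digest published by the publisher>"

      def sha256_of(path: str) -> str:
          """Hash the file in chunks so large ebooks need not fit in memory."""
          h = hashlib.sha256()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(65536), b""):
                  h.update(chunk)
          return h.hexdigest()

      if sha256_of("the_third_god.epub") == PUBLISHER_SHA256:
          print("file matches the certified edition")
      else:
          print("file differs from the certified edition")
      ```

      of course this only tells you that a file differs, not how – and it does nothing for the pirated copies, which is rather my point…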

  4. Deliberate defacement would have political implications, too – sure, someone might deface a text by replacing every instance of the word “hat” with “emu”, but I imagine bigger problems involving far more subtle changes – tiny changes of a word or two which alter the whole emphasis of what is said…

    Which is not to say that I want to argue “the technology could be mis-used; therefore we mustn’t proliferate the technology”, but I do wonder about the implications in, say, repressive regimes… or would they just find it easier to ban a text altogether than twist it into misinformation?

    1. yes indeed… this is one thing I fear… the other is that with books that are out of copyright – in other words ‘free’ – who is going to quality-control them? I just can’t see how this isn’t going to lead to a proliferation of versions of any given text… and that, as I have pointed out above, is really leading us back to the way things were in the past – even the bible, one of the texts in which attempts at absolute accuracy were most fanatically maintained, exists in many versions…
