wiki books

With my purchase of an iPad I have finally made the move to actually reading ebooks – about time, since I have (for theoretical reasons) long been a proponent of electronic reading. However, in hunting down an obscure book (a history of Byzantium written in 1892, now in ePub format), I discovered that it was full of typos and layout errors (probably because it was scanned using OCR). My first reaction was to correct these (a reflex for someone who spends his life working with digital texts) and, in truth, I might well have done so had the ebook reader I was using allowed me to edit the text easily. Assuming such editing facilities existed – and I can't imagine that, if they don't already, they will take long in coming – and assuming I were motivated to correct the whole text, my next and natural step would be to return the corrected book to the internet so that other people could read my 'cleaner' version… (just as I do with CD tags).

And this set me thinking. Even though my intention might be to correct typos only, I could well make corrections in error – that is, I could change the text. And what if I were more ambitious and deliberately set out to improve the text? This, after all, is what is being done all the time in various wikis. Surely it would take only a short step to treat a book as just another digital text that could benefit from improvement. Before you knew it, there would be countless versions of each book drifting around the internet.

Now, as an author, this somewhat alarms me (I have considered making ongoing corrections to my own texts, but that's another story), because I put a lot of effort into bringing my books to a state I am satisfied with, and there are all kinds of intentions behind what I write that might not be immediately obvious to a reader. What a reader might take for an error might in fact be something deliberate that only comes into play somewhere deeper into the text. I don't think it would be hard to knock up a list of what could go wrong with a wiki approach to books – and that's without even considering deliberate defacement. My question is: how could we make certain this does not happen?

I would suggest that keeping a text 'authentic' could prove much harder than it might at first appear. You might argue, for example, that the versions of a text produced by a publisher will remain 'quality assured'. But that argument rests on assumptions about security – and we all know that the moment something is traded digitally on the internet it becomes vulnerable to all kinds of 'interventions', both accidental and deliberate. It seems probable to me that texts are even more vulnerable than other digital objects: if a single word changes, who is going to notice? Unlike a program, the text is not suddenly going to stop working.
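(A technical aside, for those who like such things: there is, in principle, a way to notice. A publisher could publish a cryptographic 'fingerprint' of the definitive file alongside the book, and any copy whose fingerprint differs has been altered somewhere along the way. Here is a minimal sketch in Python, using the standard hashlib library; the filename and the published fingerprint are, of course, hypothetical.)

    import hashlib

    def fingerprint(path):
        """Return the SHA-256 fingerprint (hex digest) of a file's bytes."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):  # read in 8 KB pieces
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical values: the ebook file, and the fingerprint the
    # publisher would print alongside the definitive edition.
    EBOOK_FILE = "byzantium_1892.epub"
    PUBLISHED_FINGERPRINT = "<hex digest published with the definitive edition>"

    if fingerprint(EBOOK_FILE) == PUBLISHED_FINGERPRINT:
        print("This copy matches the definitive version.")
    else:
        print("This copy has been altered (or is a different edition).")

Even a single changed word would produce an entirely different fingerprint, so the alteration the eye would miss, the check would not. Whether readers would ever bother to run such a check is, of course, another question.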

Beyond this, as the recent recall of Jonathan Franzen's Freedom shows, errors can creep in even during the publishing process. Although this article doesn't appear to say so, my understanding is that the errors occurred because an earlier version of the text was printed – and that this was discovered entirely by chance, when Franzen, leafing through a published copy, noticed something he knew he had corrected. (I imagine this is the stuff of nightmares for authors – it certainly is for me.)

So it seems to me that, though the untethering of books from their fixed papery form frees them in all kinds of beneficial ways, it also carries a risk: texts are going to become far more malleable, and we may lose the very notion of a definitive version of a text.

Before the advent of printing, books copied by hand were prone to countless errors and accidental changes. Before writing, the content of books was passed from one mind to another orally. I wonder if what we are witnessing is a return to that far more fluid form of ‘storytelling’…?

Posted by Ricardo, writer and blogger
