Too Big To Know

Getaneh Alemu (University of Portsmouth)

New Library World

ISSN: 0307-4803

Article publication date: 13 July 2012


Citation

Alemu, G. (2012), "Too Big To Know", New Library World, Vol. 113 No. 7/8, pp. 408-408. https://doi.org/10.1108/03074801211245156

Publisher: Emerald Group Publishing Limited

Copyright © 2012, Emerald Group Publishing Limited


David Weinberger, the thought‐provoking author who called for a total rethink not only of classification systems but also of the very definition of metadata in his previous book, “Everything is Miscellaneous” (2007), has come out with a new book that reinforces his radical position. His latest work, entitled “Too Big to Know: Rethinking Knowledge Now That the Facts Aren't the Facts, Experts Are Everywhere, and the Smartest Person in the Room Is the Room”, was released in the first week of January 2012. Like its predecessor, this book is highly relevant for librarians and information science professionals, as it addresses a number of issues of topical interest.

In it, the author presents, yet again, his staunchest critique of well‐established conceptual and theoretical foundations of knowledge, including the DIKW (Data‐Information‐Knowledge‐Wisdom) pyramid, which has been used and abused by theoreticians and practitioners alike in computing and in library and information science. His position on DIKW stems from his contention that knowledge is better conceived as fuzzy rather than discrete, unsettled rather than settled, in constant change rather than permanent, and “intertwingled” rather than linear – making it all the more difficult to differentiate clearly between data, information, knowledge and wisdom. Hence, he argues, DIKW rests on a mistaken assumption that has persisted for millennia: the notion of knowledge as justified, logically provable true belief. The old paradigm, Weinberger reasons, leads one to attempt to sift knowledge and derive a final truth, which can then be published as long‐form argument in books.

The author contrasts this long‐form argument approach, typical of the “Age of Books”, with the loosely connected webs of “truths” that characterize the “Age of Networks”, in which the long‐form argument remains a constraint inherited from the medium of print. Furthermore, the author argues, our thought process does not operate in a simplistic, linear, long‐form fashion but functions as an intricate web of links and associations, which the “Age of Networks” reflects far better. During the “Age of Books”, scientists were accustomed to working individually or in small groups, and after‐the‐fact peer review was the norm. In the “Age of Networks”, Weinberger points out, the filtering process is immediate, open and takes place within the network itself: both junk and quality content are generated in abundance by the network, and the network itself filters them. During the “Age of Books”, Weinberger notes, only the best could join a university, while only a few “reputed” authorities could get their works published after passing a rigorous peer‐review process. Very often, only a few of those who were published could go “live” on television to give definitive answers, while laymen and the general public referenced these authorities when settling disagreements amongst themselves. The “Age of Networks”, however, has removed the need for anybody, laymen included, to undergo peer review before getting published, as each person takes it upon himself or herself to publish, edit, tag, review, post, tweet and re‐tweet. This setup has lowered the barriers to entry so far that contributors face few constraints when co‐creating content. This capability for co‐creation, together with the virtue of openness, has enabled some of the biggest projects of the twenty‐first century, such as Wikipedia and arXiv.org.

Another issue addressed by the author is information overload, also known as info glut, data smog, or information tsunami. Popularized by the technology futurist Alvin Toffler, the phenomenon preoccupies the minds of librarians, especially those who have long based their value proposition on solving the problems of “too much information”. As Weinberger notes, information overload has come to be considered a serious handicap warranting special attention in scientific and, especially, psychiatric discourse, resulting in the identification and characterization of such disorders as information anxiety and information fatigue syndrome. After crediting Clay Shirky with several interesting insights on the topic, the author argues that the fear of information overload has been with us for far too long and that too much information is not necessarily bad. The problem, he contends, is not information overload per se, but filter failure.

While reading this book, one can surmise that Weinberger endorses Open Access. Going further, one can say that he is for an open internet, for open data, and for linked data. Such an open ecology, the author argues, provides fertile ground for innovation and creativity. To summarize, carrying little baggage from either computer science or library science, he suggests that the influence of the “Age of Books” is fading and the “Age of Networks” has dawned; consequently, knowledge now resides in the network, not in the mind of any one person, however intelligent he or she might be. Extending much of the discussion of Web 2.0, the author fully embraces the vision held by Sir Tim Berners‐Lee – opening and publishing data in accordance with Linked Open Data principles. He advocates the use of loosely joined, smaller vocabularies and imperfect metadata structures in place of big, complex ontologies.

This book is a fascinating and stimulating read for anyone and has particular importance for those engaged in library and information science, for a number of reasons. The discussion of the value of metadata, especially metadata generated by users in the form of tagging, ratings, reviews, filtering and recommendations, is crucial for librarians, particularly those at the crossroads of choosing between old and new metadata paradigms. Although the author's discussion is pitched at a philosophical level, the book illuminates new ideas about library collections and offers a glimpse of what the future of libraries might look like. Metadata is ultimately one of the solutions to filter failure; as he asserts, “the solution to the information overload problem is to create more metadata”. This is a notion, one believes, that all those engaged in library and information science should promote.
