Reflections on Wilson Award for Excellence in Indexing


Judging, 2012

After careful consideration of all the submissions, the 2012 Wilson Award Committee decided unanimously that there was no Wilson Award winner. I want to take this opportunity to share with you how the judging went and to reflect a bit on the judges' conclusions.

As I have mentioned before, the Wilson Committee's work is both collegial and collaborative. It is appropriate that I first thank those who were part of this year's committee. We could not have had a more diverse and experienced group of indexers, and I really appreciate their efforts. I also want to thank Caryl Wenzel, who ably took on the registrar's role. My gratitude extends to EBSCO Publishing, which has continued and will continue to support this important award.

I want to emphasize three points about the Wilson Award:

  • First, the registrar is the only person who knows who submitted indexes. None of the judges knew before, during, or after judging. That includes the chair. With no winner, every sealed envelope containing that information went back unopened to the registrar. This is how we maintain anonymity.
  • Second, this is not a "Best of Show" award. It is given only if we have a submission that meets the Wilson Award criteria for an excellent index.
  • Third, while there is subjectivity in the indexing process, the Wilson Award judging process is not subjective. All the judges apply the same criteria to each entry and draw on their collective indexing knowledge. This process provides the most objective result possible.

On the day of judging, we started about 9:00 in the morning. The first step is for every judge to evaluate every submission using the Wilson Award criteria (at the end of the criteria is a link to a one-page short form that is easy to use). We each found a spot where we were comfortable and set to work. There is no discussion of the submissions during this process or at lunch. With every submission that we turned to, each of us hoped to find a Wilson Award winner.

By mid-afternoon, we gathered to discuss each submission, with every judge having an opportunity to comment. After discussing each submission, we decided whether it would remain on the table as a possible winner or was out of the running. Once all of the submissions had been discussed, none were left standing, so to speak. We sat and looked at each other as we all realized that there was no winner this year. Disappointment showed on every face.

These were, by and large, okay indexes, but none of them was of Wilson Award-winning quality. Our next step, then, was to summarize the kinds of problems we were seeing. Let me assure you, in the strongest terms, that not a single submission was rejected for ticky-tack things like a couple of typos, a few missed cross-references, or incomplete double-posts.

Several of the entries might have been considered after the first round if they had had one more careful, substantive edit. Overall, we found a lack of application of indexing standards, or best practices. All evidenced systemic problems in at least two of the following areas:

  • metatopic handling and structural issues;
  • missing topics that, as users who had perused the table of contents and the book, we expected to find;
  • usability issues in general;
  • poorly handled and incomplete cross-referencing;
  • incomplete double-posting (or flips);
  • clunky phraseology;
  • awkward main headings;
  • subheadings that were puzzling or that didn't clarify the relation of subheading to main heading;
  • orphan subheadings;
  • unbalanced analysis (overanalysis of some things and underanalysis of others of similar importance);
  • strings of undifferentiated locators as well as unruly locators;
  • and problems in readability, such as single entries that ran on for several pages and a lack of continued lines.

Not every submission had every one of these problems, but each had two or more. And these are all interconnected in the structure of an index. This is not to say there weren't any good qualities in the submissions, because there were some, but there were never enough to overcome the negatives. So there you have it. We did our best. We hoped for a winner, but there wasn't one.

I would like to share very briefly the Committee's response to this. We felt several of these errors might be due to a lack of understanding of the essence of the book's subject. That argues strongly for indexing within your knowledge base. That is not to say that most experienced indexers could not produce a reasonable index for a book outside of their knowledge base, especially if it is fairly well structured. But it does suggest the following: To produce an excellent index, the underlying knowledge must be there.

I was thinking about this in relation to the Renaissance cartography book I recently did. I didn't know anything about island books or globe gores, but I had a substantive background in European history and literature on which I could draw as well as a fascination with maps, not to mention a willingness to research if needed during indexing.

The other conclusion we came to was this: not many indexers know how to evaluate an index. This was evident in last year's judging, too. Evaluating an index is an entirely different process from editing an index. But once you know how to evaluate an index carefully, you can put that skill to work in your everyday indexing practice.

In relation to this, I would like to make two suggestions. First, let a highly experienced indexer take the proverbial red pencil (or tracked changes in MS Word, as it were) to one of your best indexes, using the Wilson criteria (a summary of best practices plus elegance) and marking not just what might be wrong or could be improved but also why. This will provide you with an experienced indexer's evaluation. Completion of an indexing course is just the beginning. You must continue to fine-tune your skills, learning to implement those best practices with each index you create.

Second, every indexer should take time to evaluate an index, whether one that s/he created or one off the bookshelf. Pick one that at first glance looks good, then work through it examining each of the criteria. As you learn to evaluate indexes, your own will improve as well.

Thank you,

Margie Towery
Wilson Award Committee Chair, 2011-2012

 
