ways of adding metadata
I was digging around in my files this weekend and found this table I made once comparing different approaches to applying metadata to content. At first glance the volunteers example looks like it is only relevant to charities, but a lot of scenarios described as users tagging are actually volunteers tagging. The difference is between doing something for your own benefit (users) and contributing to a greater cause (volunteers).
| | users | volunteers | staff-authors | staff-specialists | automatic-rules | automatic-training sets |
|---|---|---|---|---|---|---|
| **What it is** | Users apply metadata to their own content, or to content they have gathered for their own use | Unpaid volunteers apply metadata to content produced by others, e.g. Freebase | The paid author applies metadata to their own content | Paid metadata specialists apply metadata to content produced by others | Software applies metadata to content based on rules defined by specialists (see the code sketch after this table) | Software applies metadata to content based on training sets chosen by specialists |
| **Strengths** | Cheap; real user language; subjective value judgements; highly reactive; picks up the latest trend vocabulary | Depending on how it is handled, can be more predictable and reliable than users; may be close to user language; can be guided more like staff, e.g. asked to go back and revise earlier tagging | Small commitment required from each staff member; expert knowledge of the content | Highly motivated; their objectives are likely to be tied to the quality of this work | More efficient than the staff options | More efficient than the staff options |
| **Weaknesses** | No guarantee of contributions; the same tag can mean different things; different tags can mean the same thing; cryptic personal tags; smaller interpretations get drowned out; hardly anyone goes back and changes out-of-date tagging | Can require more management/attention than users; smaller numbers, so may not add up to enough hours; probably not viable in most commercial enterprises, although it can still work if the company offers a free-at-consumption service that may be perceived as a public good | Low motivation and interest; may be too close to the content to understand user needs; more likely to be formal/objective | Cost; they need to read the content first; not necessarily user focused; more likely to be formal/objective | Needs operational staffing | Hard to control; can be a 'black box'; needs a mechanism for addressing errors |
| **Recommended environment** | A large user base with a *selfish* motivation for users, often gathering/collecting; a reasonably shared vocabulary. Rarely works on a single site where the user could instead aggregate links or content on a generic site like delicious | Where you can rely on lots of goodwill. Probably in combination with another approach, unless a large number of volunteers is likely | You have good historical examples of imposing new activities on authors and getting them to follow them. Probably a quite process- and guideline-driven organisation. Bad where your authors think of themselves as creatives…they'll think metadata is beneath them | Strong information management skills in the organisation. The project needs to be resourced on an ongoing basis. The business probably needs to see a very close correlation between the quality of the metadata and profit | As for staff-specialists | Strong technical and information management skills in the organisation. Management understand the ongoing need for operational staffing, and do not believe the vendors' promises |
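The two automatic columns are easier to compare with something concrete in front of you. Here is a minimal, hypothetical Python sketch (the rule list, training examples, and class names are all invented for illustration, not taken from any real system): a keyword-rule tagger of the kind a specialist might hand-maintain, next to a tiny naive Bayes classifier learned from a specialist-chosen training set.

```python
from collections import Counter, defaultdict
import math

# --- automatic-rules ------------------------------------------------
# Specialists hand-write keyword -> tag rules; software applies them.
# (Hypothetical rules, purely for illustration.)
RULES = {
    "mortgage": "finance",
    "loan": "finance",
    "vaccine": "health",
    "diet": "health",
}

def tag_by_rules(text: str) -> set[str]:
    """Return every tag whose trigger keyword appears in the text."""
    words = set(text.lower().split())
    return {tag for keyword, tag in RULES.items() if keyword in words}

# --- automatic-training sets ----------------------------------------
# Specialists choose labelled examples; software learns from them.
# A tiny multinomial naive Bayes, just enough to show the idea.
class NaiveBayesTagger:
    def __init__(self):
        self.word_counts = defaultdict(Counter)  # tag -> word frequencies
        self.tag_counts = Counter()              # tag -> number of docs
        self.vocab = set()

    def train(self, examples):
        for text, tag in examples:
            words = text.lower().split()
            self.word_counts[tag].update(words)
            self.tag_counts[tag] += 1
            self.vocab.update(words)

    def tag(self, text: str) -> str:
        words = text.lower().split()
        total_docs = sum(self.tag_counts.values())

        def log_score(tag):
            # log P(tag) + sum of log P(word | tag), Laplace-smoothed
            score = math.log(self.tag_counts[tag] / total_docs)
            denom = sum(self.word_counts[tag].values()) + len(self.vocab)
            for w in words:
                score += math.log((self.word_counts[tag][w] + 1) / denom)
            return score

        return max(self.tag_counts, key=log_score)

if __name__ == "__main__":
    doc = "Fixed rate mortgage advice for first-time buyers"
    print(tag_by_rules(doc))  # {'finance'} -- predictable and auditable

    nb = NaiveBayesTagger()
    nb.train([  # a specialist-chosen training set (invented examples)
        ("interest rates and mortgage repayments", "finance"),
        ("savings accounts and loan terms", "finance"),
        ("flu vaccine rollout this winter", "health"),
        ("balanced diet and exercise tips", "health"),
    ])
    print(nb.tag(doc))  # 'finance' -- but harder to explain *why*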