Reviews microformat call for implementors

The microformats community is currently working on a reviews microformat and is interested in collecting feedback and acquired knowledge from implementers. We have collected some of the existing review formats present on the web today, but may not have captured the entire corpus.

Microformats allow authors to mark up content with a specific format and structure, using XHTML, for discovery by others. You control the content but make that structured content discoverable through a microformat such as hCalendar, hCard, XFN, or tags. A review microformat will use the existing elements and attributes of XHTML to define the essential components of a review, making them easy for authors and tool builders to identify, discover, and retrieve, with the flexibility to publish to your own site, a community site, or both.
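To make the discovery side concrete, here is a rough sketch of how a tool builder might pull review fields out of class-based XHTML markup. Note the class names used here ("item", "rating", "reviewer", "description") are hypothetical placeholders, not the format's final vocabulary, which is still under discussion:

```python
# Sketch: collect text from elements whose class attribute names a review
# field. The field names below are illustrative assumptions only.
from html.parser import HTMLParser

REVIEW_FIELDS = {"item", "rating", "reviewer", "description"}

class ReviewExtractor(HTMLParser):
    """Collects text inside elements whose class attribute names a review field."""
    def __init__(self):
        super().__init__()
        self.fields = {}
        self._current = None  # field currently being captured, if any

    def handle_starttag(self, tag, attrs):
        for cls in dict(attrs).get("class", "").split():
            if cls in REVIEW_FIELDS:
                self._current = cls
                self.fields[cls] = ""

    def handle_data(self, data):
        if self._current is not None:
            self.fields[self._current] += data

    def handle_endtag(self, tag):
        self._current = None

extractor = ReviewExtractor()
extractor.feed('<div class="review"><span class="item">Acme Camera</span> '
               'rated <span class="rating">4</span> of 5 by '
               '<span class="reviewer">Jane</span></div>')
```

Because the structure rides on plain XHTML class attributes, the same page works unchanged in a browser while an aggregator can harvest the fields.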

I have spent a lot of time thinking about gathering community reviews through my experiences at shopping comparison sites PriceGrabber and NexTag. Consumers want the best information available about a company or product before making a decision, but the best information is difficult to discover and strewn across the entire web. I sometimes turn to specialist sites such as Digital Photography Review for camera reviews, Chowhound or Zagat for restaurants and Amazon for book reviews. I am fortunate to know about some passionate experts in the world of weblogs and check Russell Beattie’s blog for information about Series 60 cell phones and Slashdot for reviews of programming books from people who should know.

What if all these research resources could be brought together? What if you could publish select content to multiple locations for discovery by large communities? It’s exciting and empowering to think about all the possibilities of the semantic web.


3 comments

Commentary on "Reviews microformat call for implementors":

  1. Greg Linden wrote:

    Hi, Niall. Great idea on aggregating reviews.

    However, once you’ve aggregated the reviews, an even bigger problem comes up. How do you determine which reviews are objective and authoritative? Some of the reviews will be shills. Others will be useless garbage. How do you find and focus attention on the most useful reviews?

    It starts to get into questions of reputation. Perhaps authoritative reviews are written by those who are considered authoritative by the community? Or by those considered authoritative by other authoritative people in the community?

    It’s a hard problem, but one that meshes nicely with your interest in social software.

  2. Niall Kennedy wrote:

    Hi Greg,
    Determining the motivations, authority, and even identity of an author is an emerging space, but tools exist to help us mine the data and make sense of it all in our own unique ways. Technorati tracks outbound links from weblogs, and Bloglines has a list of the subscribed feeds of you and your friends.

    Authoritative reviews are open to interpretation, but the data exist to create our own interpretations as well as a community interpretation, which Technorati may provide on some level. Objectivity is more difficult to measure and is usually best judged through first-hand interactions surfaced by social networks of trust. Participants in efforts such as the Silicon Valley 100 could be swayed by receiving the reviewed item for free, in pristine condition, with a friendly sales rep walking them through its use. When I test drive a new car the manufacturer has representatives onsite, and the car has 18 miles on the odometer and is maintained daily.

    How do you measure objectivity?

  3. Greg Linden wrote:

    I think you’re on to something when you talk about discovery through social networks of trust. But building these networks and learning who is reputable is a cumbersome and time-consuming process.

    What I was trying to suggest was propagating reputation through an implicit network. The authoritative people are those considered authoritative by other authoritative people.

    Not a new idea. In fact, I just bumped into it again reviewing the TrustRank paper (http://dbpubs.stanford.edu:8090/pub/2004-17).

    The key problem in building a good review aggregator is identifying good reviews. Perhaps this is one way to get there.
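The recursive definition in this thread (authoritative people are those vouched for by other authoritative people) can be sketched as a TrustRank-style power iteration. The vouching graph, seed set, damping factor, and iteration count below are all illustrative assumptions, not details from the TrustRank paper:

```python
# Sketch: trust flows from a hand-picked seed set along
# "X considers Y authoritative" links, TrustRank-style.

def propagate_trust(vouches, seeds, damping=0.85, iterations=50):
    """vouches: dict mapping each person to the people they vouch for.
    seeds: people trusted a priori (e.g. reviewers verified by hand)."""
    people = set(vouches) | {p for targets in vouches.values() for p in targets}
    # A fixed share of trust restarts at the seeds every round.
    base = {p: (1 - damping) / len(seeds) if p in seeds else 0.0 for p in people}
    score = {p: 1.0 / len(people) for p in people}
    for _ in range(iterations):
        nxt = dict(base)
        for voucher, targets in vouches.items():
            if targets:  # split the voucher's current trust among targets
                share = damping * score[voucher] / len(targets)
                for t in targets:
                    nxt[t] += share
        score = nxt
    return score

# People vouched for (directly or transitively) by a seed accumulate trust;
# a self-promoting "shill" with no inbound vouches decays toward zero.
scores = propagate_trust(
    {"seed": ["expert"], "expert": ["seed"], "shill": ["shill"]},
    seeds={"seed"},
)
```

The interesting property is exactly the one Greg describes: a shill who vouches only for himself gains nothing, because no trust flows into his cluster from the seed-connected part of the graph.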