Clever Turks
Crowd-sourcing offers great promise in the knowledge industry. At 365 Media we've been studying it for a couple of years now, and we've used online services such as MTurk on a number of projects (though mostly for testing). Now crowd-sourcing can consider itself a real industry, because there's a conference about it in San Francisco today. Congratulations to everyone (in the crowd), I guess...
The core question of crowd-sourcing is whether you can outsource some types of service work to a large, distributed, sometimes unknown workforce and get a better return than by building a dedicated, trained, expert team. We've built an application that enables us to engage and use outside knowledge workers. In this application we run a rigorous series of tests and screenings, and we have to maintain quite a high ratio of internal QA staff to external workers to be sure that nothing slips through that could damage the integrity of the information product being supported.
The original Mechanical Turk, after which Amazon named its trail-blazing online service, was basically a box with a robot - a Turk - sitting at a chessboard. The Turk would play chess against anyone from the audience and generally would win. The audience was thrilled - not only because it was a machine (it wasn't; there was a guy inside the box) but also because IT WON the games. The guy inside the machine was good at chess - that was as important as anything else in the whole ruse. Imagine if the guy in the box hadn't known the rules of chess!
When crowd-sourcing a task that needs any level of expertise, the crowd needs to have at least that level of expertise. At least.
Labels: crowd-sourcing, crowds, distributed workforces