One of the interesting questions in debating the value of crowdsourcing is whether the concept can be imported across disciplines. The most successful incarnation of crowdsourcing has undoubtedly been in the world of the news media, especially online. With the addition of iReport to CNN's endless repertoire of excessively tech-y news delivery systems, the crowd has been legitimized by an established media organization – a legitimization long in coming, ever since newsblogs, often run by "ordinary citizens," overtook newspaper subscriptions in popularity. But what happens when we try to port crowdsourcing outside of the news arena and into other common disciplines?
Different disciplines require different levels of proficiency and training before a person can partake competently in their activities. While it takes a significant level of skill to craft a well-written presentation of information, the actual gathering of information and observing of events requires very little occupational education. On the other hand, it takes years of rigorous schooling and practice to approach competence in the field of medicine – no ordinary person can simply hop into an operating room or behind an X-ray machine and deliver acceptable results. The same is true of engineering – but not necessarily of graphic design. This variation is key to the question of whether crowdsourcing can be ported into a new discipline: the greater the expertise required, the less readily the work can be crowdsourced. And in the case of education, where we count on teachers to offer an authoritative voice of reason, discussion, excellence, maturity, and understanding, perhaps crowdsourcing falls flat. But what if we change the parameters of the "crowd"?
An experimental surgery hits a dead end. The chief surgeon is called in, but has no ideas that might bring about progress. What if this situation were crowdsourced? Dozens, or even hundreds, of surgeons weighing in through a live video feed of the procedure would generate ideas far more quickly than the lone surgery team at this particular hospital. Similarly, if a crucial engineering decision needed to be made about the integrity of a structure in a critical situation, what would be better than bringing several creative engineering minds together to make it? When the crowd is made up of experts in that particular field, progress can be made. And perhaps we can turn to this idea in academia.
Grading in school is ultimately subjective: no two classes are graded on the same basis, with the same standards, requiring the same level of mastery. Even math classes – where we expect one right answer – can be nuanced, as the process is sometimes given as much or more weight than the end result. Many of us have even reached a level of comfort with a class where we know we can write an A or a B paper for that teacher. What if grading were crowdsourced to other teachers familiar with the subject? Of course there would be logistical stipulations – what teacher has time to deal with projects from other classes when they have their hands full with work of their own? – but let's say these have been taken care of. Would things be better? Fairer? As rhetoricians, we're programmed to write for our audience. If we wrote for a collection of extremely experienced writing scholars, each with a unique perspective, rather than the individual teacher we've spent several weeks picking apart, wouldn't we create stronger work?
Crowdsourcing won't be a passing craze – it will work its way further into our lives every day. The question is how it integrates into different disciplines, and perhaps crowdsourcing to experts is the answer.
I think it's an intriguing idea - crowdsourcing to experts - but I'm not sure it would work. You pointed out that grading was subjective, and that different teachers have different methods. But different teachers also have different concepts of truth.
On one hand, say you crowdsource to experts. One professor is generally competent, but has pitfalls in some areas. The more experts you incorporate, the fewer pitfalls remain (because people make up for the shortcomings of their peers). At some point, there would be a general consensus on the "right" way to grade something, or on what the "right" answer is.
On the other hand, maybe it doesn't work that way. Say there's a question about evolution. If you crowdsource to biology professors, you may find an answer. But what about those who oppose the teaching of evolution? How do you exclude them? They make up a significant portion of our culture (though maybe not many are 'experts'), and clearly they would not agree with most of the bio profs.
What about government? A liberal professor and a conservative professor may have different interpretations of how a certain policy works, or of whether it's desirable. Outsourcing to the crowd may result in deadlock - much like our current political sphere.
So maybe crowdsourcing to experts only works with skills, like surgery or engineering, and not strictly with knowledge, like biology or political science. But even with your surgery example, I'm not sure crowdsourcing would work. Sure, you'd get hundreds of different opinions, most of them backed by experience and knowledge of surgery. But how do you determine whether to continue with the surgery or close the patient up and treat with medication? How do you determine whether to remove organ X or organ Y? There would be different opinions, and maybe several possible "correct" solutions, but it seems impossible to settle on one given the time constraint. You ask us to assume that the logistical aspects are taken care of, but I think that's a massive task that may not be possible.
Ultimately, without a time constraint, crowdsourcing to experts in a particular skill may be more beneficial than the current method, but I can't think of any examples that seem plausible right now.
This presents an interesting point. By redefining the "crowd," you redefine the way we look at the issue. What if we had an online social community specializing in technical fields, where only experts in those fields could create accounts? For instance, a live video feed of a patient's symptoms could go out to a crowd of doctors for diagnosis on an M.D.-only networking site.
The possibilities are mind-boggling.
I think you make an interesting point about crowdsourcing via CNN's iReport. When one compares this to crowdsourcing grades, the two seem relatively similar, especially in that we usually defer to experts on the subject. My issue with this comparison, though, is that we often think of news as coming from those who are there. Eyewitness reports are seen as some of the best stories the experts can bring us. It may just be that I'm skeptical of the practice in general, but while I think iReports are an interesting enhancement, I have serious reservations about crowdsourcing grading.
I really like the idealized version of this idea, but I wonder if it would lead to battles of wits. Then again, perhaps even that would be productive in achieving the end goal. I especially like the idea for academia: as you mentioned, papers are graded so subjectively. But with already stressed workloads, how could a teacher endure an even lengthier process? How could a surgeon put a dying patient on hold to consult? Time is a serious concern with implementing this type of reform, but I think in some ways it could be really helpful.