Notes from our session on Engaging 'Crowds' & Impact Assessment


 * Here's a blog post that Beth Kanter wrote synthesizing her presentation and our conversation: []
 * To kick off the call, participants were asked to share in the web “chat” two words that come to mind when thinking about crowdsourcing. Their responses follow:
 * Phrases like “group wisdom,” “group trends,” and “collective” came up with greatest frequency, followed by concepts relating to “diversity” and “diverse opinions.”
 * Additional phrases included “crowds chaos,” “experiment,” “better ideas,” “case foundation,” “sharing & people,” and “emergent.”


 * After discussing a definition of crowdsourcing, participants shared their experiences with crowdsourcing (both supporting it and running their own experiments):
 * Nitrogen Wiki was referenced most frequently: a Packard Foundation project from 3.5 years ago that used a wiki to facilitate public contribution to the foundation’s formulation of a strategy to reduce nitrogen. The project continues to be heavily referenced many years later.
 * Island Innovation Fund: the Hawai‘i Community Foundation’s recently launched open grant process.
 * Public Insight Network: designed to engage the community in local reporting for public media.


 * A variety of crowdsourcing examples were then shared, beginning with the earliest example of the Audubon Society, which began crowdsourcing bird cataloging 100 years ago. A few examples:
 * Crowds for Wisdom / Collective Intelligence: Clouds of information that others can contribute to or distribute.
 * The aforementioned Public Insight Network, supported by the Knight Foundation.
 * Maine Health Access Foundation: circulated an RFP through the usual channels, but also via its Facebook page, to generate greater interest in applying and to increase the volume and diversity of letters of inquiry. Letters of inquiry were then posted back on Facebook to allow community commentary. More than 150 comments were provided, and applicants were encouraged to reflect this advice in their proposals.
 * HCF’s Island Innovation Fund: a website functioning as a portal for RFPs and commentary. 180 proposals were submitted, receiving more than 300 comments and enabling much greater interaction among the grantee community. The aspiration is that the program will enable participants to feed off each other and increase adoption of these innovations.
 * Ushahidi: mapping disasters, crime incidents, etc. through cell phone reports by citizens.
 * YourOpera: engaging the crowd in creating an opera story. Since then, crowdsourcing has been used to create novels, choreography, plays, poems, etc. The program was facilitated and moderated, and did not incorporate all suggestions.
 * Pepsi’s VOTENOW: started by the Case Foundation as a Pepsi cause-marketing project with a goal of enabling social impact while spreading the Pepsi brand. Pepsi repurposed its Super Bowl ad budget, allowing nonprofits to submit proposals and compete to rally votes for funds. To date, 46 million people have voted, but the process remains controversial. This remains among the most open models, where votes directly led to funding.
 * In discussing these examples, a few questions and comments arose from the group:
 * What is the role of curator? Does open strategy fit into crowd curation?
 * The Nitrogen Wiki had the potential to be crowd creation, but did not get there. Do you have to cede more control to really co-create?
 * Examples of Hybrid Approaches, with varying degrees of crowd participation, were also discussed:
 * Knight News Challenge: open call for proposals and crowd rating, ultimately narrowed down by a panel of experts.
 * Brooklyn Museum Click!: open call for a photography exhibit. Almost 400 entries, with a call to the public to rate the images. Over 3,000 responses came from the general public as well as expert photography curators. Ultimately, comparisons were made between expert and general-population rankings, which proved remarkably similar.
 * Next Stop Design: public voting on the design of bus stop facilities. More than 11,000 votes were received, with experts making the final choices.
 * NTEN: the public served as a conference committee, voting on ideas for sessions at an upcoming conference.
 * Spot.US: exploring whether individuals will “crowdfund” reporting if it better reflects their community interests. Individuals fill out market research surveys and earn points, which they can allocate to funding local stories. Will Spot.US inspire a new level of audience engagement? Will it provide a new way of sustaining local news and information?


 * Participants then discussed how to most effectively tap the intelligence and power of crowds, and surfaced clarity about decision rights and the role of influence as one of the most important considerations:
 * Be intentional and clear about the role of the public’s participation (e.g., whether or not all input will be accepted, how decisions will be made, etc.)
 * Some organizations have faced challenges when the process was not clearly communicated, too difficult to understand, or when rules were changed midway through the process.


 * The conversation then turned to assessing impact:
 * Goals / types of impact:
 * Impact on the outcome the foundation is trying to create / output being developed
 * Impact on the individual participating, by virtue of their participation and enhanced sense of commitment
 * Impact on field-level knowledge
 * Questions raised:
 * How to assess impact when the desired outcome / innovation is unclear?
 * How to assess impact at the field level, when impact may be broadly distributed?


 * Questions to consider when assessing the impact of projects that use crowdsourcing techniques:
 * Motivations
 * Why are people coming to the table?
 * What metrics matter most given the objectives?
 * How can you gather diverse perspectives on these metrics?
 * Efficiency
 * How efficient is crowdsourcing as a means to getting to the end goal?
 * How time intensive are efforts by foundation staff? Crowdsourcing may be more efficient for certain types of interactions (making a decision versus advancing an innovation).
 * How customized is the platform being used? (This determines how much manual work is involved; e.g., Twitter is much more labor intensive in the data-aggregation phase.)
 * Connectivity
 * Is there a change in the connectivity among people (e.g., comments on other people’s comments, transition towards lighter touch from the convener / facilitator, etc.)?
 * Who is included in crowd, and how is their input included?
 * A group of already known constituents or broader group?
 * Input can be open, while decision remains more closed or moderated.
 * Explore demographic differences between heavier and lighter participants. Whose voices are the loudest?
 * Who is not present, and why (e.g., technical accessibility)?