Peer Assists Jan 14

===//In this peer assist, Stephanie McAuliffe and Kathy Reich from the Packard Foundation engaged the group in a discussion of the Goldmine Project, which they started in 2010 in an effort to make use of learnings gained from over 25 years of grant-making. The specific question they posed to the group was: how to seek input for the project beyond the usual suspects and usual approaches? //===
 * **Outreach efforts**:
 * There are multiple channels and social media tools Packard could use to engage practitioners during the research process. For example, they could post questions on Twitter, in blog posts, during webinars, or in any other venue that is open to the public and where users can contribute input.
 * It’s better not to post data that is too polished, as this may dissuade stakeholders from giving input. While an outside consulting firm may be disinclined to publish or print something unpolished, the foundation can disseminate such information on the web (e.g., on blogs), where it is not held to the same standard.
 * Even if the project ends in a polished report, it would be beneficial to have a site where the content could live on in perpetuity, maintained by a designated person. That person could release parts of the report to bloggers (e.g., philanthropy bloggers who are committed to responding). In this way, the foundation would leverage bloggers with established audiences to disseminate the information, ceding control over the “report” as it becomes a topic on the web.
 * How the findings are presented is really important. They may choose to organize them by theme.
 * **Data sharing**:
 * The foundation could make grantee survey data available to the same grantees who were interviewed, and pose the question: //what other things would you have asked?//
 * The foundation could share the results with funders as well, to make the lessons learned a recognized practice in the field. That way, if and when practitioners adopt these practices, funders would recognize and acknowledge their efforts. The foundation could even turn to the GEO listserv to reach these funders.
 * **Data analysis**:
 * It would be insightful to segment survey responses based on how successful the grants were, and to assess the distinctions between highly successful vs. less successful projects. This might even lead to a nuanced statement about what makes a really good consulting intervention.
 * **Acting on information about consultants**:
 * If a funder discovers that a grantee wants to hire a consultant whose projects consistently fail, the funder could call the grantee, ask what the interview process was like and whether the grantee checked the consultant’s references, and expose the mismatch (e.g., the consultant lacks the expertise required for the project). As a last resort, the funder could deny the grant.
 * Funders could make information about excellent consultants transparent. For example, they could publish a list of consultants who are highly skilled at a particular project type, e.g., scaling an organization. They could even pull out case studies for some of these consultants and engage different stakeholders to publish them. This would also benefit other foundations interested in consultants with specific expertise.