8 Mar Case Meeting

Feb 9 Webinar: Network Metrics, conversation with iScale

 * **Overview**: Sanjeev Khagram of iScale shared findings from the report he co-authored, "Next Generation Network Evaluation," and led a conversation on network metrics.

 * **Presentation Material:**
 * **Conversation Highlights:**

//The relationship between complexity science & network impact assessment [Slide #6]//
 * Practical tools for network impact assessment are few and far between and could benefit from applications drawn from complexity science and systems thinking, particularly with regard to ToC development. Currently, adoption of complexity science / systems thinking tools and methods mostly takes the form of mental models (e.g., understanding non-linearity) rather than practical tools.
 * Applying complexity science and systems thinking tools to people and the relations between them is no easy feat: in science, a four-variable model is considered complex, and here the number of variables is exponentially greater.

//Using the network lifecycle as an evaluation tool [slides #10 and #19]//
 * The network lifecycle diagram was developed through iScale’s observation of more than 100 networks.
 * **Case Study**: The network lifecycle was part of the evaluation for the Campaign to End Pediatric HIV/AIDS (CEPA), a networked campaign spanning six African countries, with coordination at the regional, national, and global levels. Part of CEPA’s meta-ToC was to achieve a large, vibrant, and connected network.
 * The campaign started 18-24 months ago; the uncertainty around when the network actually started illustrates the difficulty we often face in establishing a "baseline" for a network.
 * iScale determined the degree to which global, national, and regional networks within CEPA were vibrant and connected, across geographic locations. iScale also evaluated how far along the network lifecycle each was, and identified the challenges & opportunities they faced.
 * **Integrating learning from across the CEPA network / campaign**: Conducting evaluations at different levels within CEPA surfaced differences and similarities across the network and helped iScale make recommendations by region, for each country, and for the network overall. In this way, using the lifecycle model when assessing network impact can be helpful for creating comparative data.
 * **Highlighting successes**: It also allowed them to highlight not only outcomes achieved (e.g., heightened awareness of the problem of Pediatric HIV/AIDS) but also the progress made in terms of network connectivity (e.g., convening of diverse groups that would not have otherwise come together). Discussing early successes of the network was particularly important because many funders were first-time advocacy funders.
 * **Getting everyone on the same page**: Understanding the various lifecycles of the network helped create a shared mental model of the network, including the gaps that existed, and it helped align network members around collective action.

//Articulating a theory of change is difficult, but can be extremely useful for monitoring the network [slides #14-17]//
 * A clearly articulated ToC can be helpful for getting stakeholder buy-in and engagement—a ToC can also serve as a touchstone for monitoring and course correcting.
 * Several tools can be used in ToC development, including:
 * Linear causal mapping, which is easier for people to access;
 * Outcomes mapping, which tends to be thin on ultimate / intermediate outcome identification; and
 * Certain systems approaches, like feedback tools, which often get underplayed.

//How are folks managing the tension between the network as a means and the network as an end?//
 * There is often tension between measuring network vibrancy and effects vs. field-level outcomes and programmatic evaluation. //How do you communicate and measure those different aspects of the network?//
 * Some evaluators and funders try to think of creative ways to combine Next Gen network evaluation with more standard programmatic evaluation. For them, the Holy Grail is showing that higher levels of network vibrancy, connectivity, and diversity are linked to increased impact and achievement of outcomes. But we’re not there yet. That’s where we’re headed.
 * Other funders focus instead on the **value-add of the network to some measurable outcome**, so as to assess whether the time spent on network strategy paid off.
 * E.g., Lawrence Community Works found that retention rates of workers recruited via the network were higher than those of workers recruited through other channels. They concluded that the network was actually cost-effective compared with the recruiting / turnover costs associated with low retention rates (i.e., it’s cheaper to fund the network than to pursue the traditional recruiting strategy).
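To make the "connectivity" and "diversity" metrics discussed above more concrete, here is a minimal, hypothetical sketch in Python. The metric definitions and the toy five-organization network are illustrative assumptions for this summary, not measures taken from iScale's report:

```python
# Hypothetical sketch of two simple network metrics: tie density
# (how connected the network is) and sector diversity (how mixed
# its membership is). Names and data are illustrative only.

def density(nodes, edges):
    """Fraction of possible ties that actually exist (0 = no ties, 1 = fully connected)."""
    possible = len(nodes) * (len(nodes) - 1) / 2
    return len(edges) / possible if possible else 0.0

def diversity(affiliations):
    """Share of members whose sector differs from the most common sector."""
    counts = {}
    for sector in affiliations.values():
        counts[sector] = counts.get(sector, 0) + 1
    return 1 - max(counts.values()) / len(affiliations)

# Toy network: five member organizations and the ties observed between them.
nodes = ["A", "B", "C", "D", "E"]
edges = {("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")}
affiliations = {"A": "health", "B": "health", "C": "advocacy",
                "D": "funder", "E": "health"}

print(density(nodes, edges))      # 4 of 10 possible ties -> 0.4
print(diversity(affiliations))    # 3 of 5 members in "health" -> 0.4
```

Tracking metrics like these at successive points in time is one way to show the kind of "progress in network connectivity" credited to CEPA above, alongside programmatic outcomes.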