Measure
How do we know we are successful?
Measuring the success of our conservation and restoration efforts must be tackled on a project-by-project basis. Coastal Resilience takes a multi-faceted, science-based approach both to designing local projects on the ground and to tracking global indicators of success. To consistently track project outcomes and identify areas for improvement in tool effectiveness and the communication of results, we have developed a list of performance measures for the Coastal Resilience program.
As part of this development, we have defined the program within four themes:
- Decision Support Technology: the suite of Coastal Resilience tools used to support decision-making that reduces ecological and socioeconomic risks from coastal hazards
- Science Application and Modeling: scientific research and modeling efforts on effective conservation and restoration to reduce risk
- People and Practitioners: the network of communities, practitioners, and policymakers that use the tools to protect people and ecosystems in coastal areas through the conservation and restoration of habitats
- Business Impacts: an emerging stakeholder group of interest that will use the tools to factor nature-based solutions into decisions that benefit coastal communities, their bottom line, and their overall success
Within each of these themes, we have identified priority resource inputs and regular activities, as well as short-, medium-, and long-term outcomes. From the full list of outcomes, our team, with review from the Coastal Resilience network, developed 17 priority measures according to criteria including significance, added value, span of control, feasibility, clarity, comparability, and verifiability. The measures will be collected through a combination of analytics and surveys of both internal and external partners. Analytics will be collected semi-annually, compiled, and then shared annually alongside the complete suite of measures of tool effectiveness across the four themes.
Priority Measures:
Measure | Type | Primary Theme | Methodology Category | Unit of Measure | Frequency of Collection |
---|---|---|---|---|---|
Practitioners on network listserv and newsletter recipients | Output | People | Analytics | Number | Quarterly; assess annually |
Peer-reviewed and other publications | Output | Science application | Analytics | Number | Quarterly; assess annually |
Twitter followers | Output | People | Analytics | Number | Quarterly; assess annually |
Visitors to mapping sites | Output | Technology | Analytics | Number | Monthly; assess annually |
Visitors to website | Output | People | Analytics | Number | Monthly; assess annually |
Tool applications and project sites | Output | Technology | Analytics | Number | Annually |
Tool applications developed for business audience | Output | Business | Analytics | Number | Annually |
Collaborative apps developed | Output | Business | Analytics | Number | Annually |
Area covered relative to habitats of concern | Output | Technology | Survey and analysis | Number | Annually |
New business partners | Short-term outcome | Business | Analytics | Number | Annually |
Agencies and organizations using CR-DST for planning | Short-term outcome | Technology | Analytics | Number | Annually |
Local and enterprise partnerships | Short-term outcome | People | Analytics | Percentage and qualitative | Annually |
Training participant understanding of CR-DST | Short-term outcome | Technology | Analytics and interview | Number, narrative | Annually |
Individuals trained | Short-term outcome | Technology | Survey and analysis | Project examples | Annually |
Stakeholder awareness of CR-DST | Short-term outcome | Technology | Survey and analysis | Percentage | Annually |
Examples/case studies of CR-DST use | Intermediate outcome | Technology | Analytics, survey | Number | Annually |
Demonstrated tool application for site implementation | Long-term outcome | Technology | Interviews | Percentage | Annually |
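To illustrate how the priority measures break down across the four themes and outcome types, the table above can be treated as a small data set and tallied. This is an illustrative sketch only: the field names and record layout are our own choices for the example, not an official schema, and the methodology, unit, and frequency columns are omitted for brevity.

```python
# Tally the 17 priority measures by primary theme and by measure type.
# (measure, type, primary_theme) triples are taken from the table above.
from collections import Counter

measures = [
    ("Practitioners on network listserv and newsletter recipients", "Output", "People"),
    ("Peer-reviewed and other publications", "Output", "Science application"),
    ("Twitter followers", "Output", "People"),
    ("Visitors to mapping sites", "Output", "Technology"),
    ("Visitors to website", "Output", "People"),
    ("Tool applications and project sites", "Output", "Technology"),
    ("Tool applications developed for business audience", "Output", "Business"),
    ("Collaborative apps developed", "Output", "Business"),
    ("Area covered relative to habitats of concern", "Output", "Technology"),
    ("New business partners", "Short-term outcome", "Business"),
    ("Agencies and organizations using CR-DST for planning", "Short-term outcome", "Technology"),
    ("Local and enterprise partnerships", "Short-term outcome", "People"),
    ("Training participant understanding of CR-DST", "Short-term outcome", "Technology"),
    ("Individuals trained", "Short-term outcome", "Technology"),
    ("Stakeholder awareness of CR-DST", "Short-term outcome", "Technology"),
    ("Examples/case studies of CR-DST use", "Intermediate outcome", "Technology"),
    ("Demonstrated tool application for site implementation", "Long-term outcome", "Technology"),
]

by_theme = Counter(theme for _, _, theme in measures)
by_type = Counter(mtype for _, mtype, _ in measures)

print(f"Total measures: {len(measures)}")
print("By theme:", dict(by_theme))
print("By type:", dict(by_type))
```

Tallied this way, Technology carries the most priority measures (9), consistent with its central role in the program, while the Business theme is the smallest as an emerging stakeholder group.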