AmeriCorps Texas
Data Collection
Data Collection Slide Deck
View and follow along with all slides from our Data Collection trainings.
Overview & Requirements
Data can be defined as reliable information collected in a systematic way that provides a documented record of performance. Data can provide insight into what a program is doing well and where it might need improvement. It can also inform decisions about strengthening or changing program activities and help determine where to allocate member service or other resources.
Data also holds us accountable to our current funders and allows us to show our impact to the community and to future funders.
Programs should establish and provide outcome objectives, along with a strategy for achieving them, to determine the extent to which the program has a positive impact on communities, members, project participants, and other areas determined by the program or by AmeriCorps, the agency.
Types of Data to Collect
Quantitative data is numbers-based, countable, and measurable. Start with data that is easy to manage and that you readily have access to. Some examples of quantitative data might be the count of things done by members, the count of individuals or things that improve due to member service, or member recruitment and retention data.
Qualitative data is interpretation-based, descriptive, and informative. When thinking about where to start with qualitative data, think about what you can observe or gain from listening to others. This might be member observations, things that beneficiaries say about member service, or what partner organizations have to say about member service.
As we think about what types of data to collect and where we might source them, it is also helpful to think about some of the different purposes for data that we have for our programs. These purposes will drive the types of data that we collect.
Common Types of Data Collected by AmeriCorps Programs
Evaluation and Evidence Data
Member Recruitment & Retention Data
Performance Measure Data
Informational or Impact Data
The data that programs collect is also informed by what needs to be reported to OneStar. Data is collected in both the mid-year (spring) and end-of-year (August/September) progress reports. The Mid-Year APR is a mid-program-year check-in to assess performance measure progress and provide an opportunity for us to brainstorm and problem solve if data collection challenges are occurring. The problem solving that occurs here helps ensure that data is reported in alignment with the performance measures and can be reported to OneStar and AmeriCorps, the agency, in the End of Year APR.
Mid-Year AmeriCorps Progress Report Components
Program design and beneficiaries
Performance measure data actuals to date
Source documentation for performance measure data
End of Year AmeriCorps Progress Report Components
Performance measurement data actuals
Source documentation for performance measure data
Performance indicators (i.e., enrollment, retention, 8-day enrollment cycle time, 30-day exit cycle time)
Performance data elements
Narrative prompts
Data Collection Considerations
There are also several things to consider in the data collection process, such as the access a program has to data, partners in data collection, and the tools needed to collect data. Some of these are contingent on agreements with outside parties, on creating surveys or instruments, and on staff member involvement in the process.
Access to data: Individuals served, outcomes of member service, and survey responses
Partners in data collection: Software used, partner organizations, and staff members
Tools: Surveys, forms, or software
Mid-Year and End of Year AmeriCorps Progress Report Worksheets
Use these Word documents to preview questions and draft responses prior to completing your reports online in the AmeriCorps Texas Grantee Portal.
Performance Measures & Data Collection
Performance Measures are part of your grant award and should reflect significant program activities whose outputs and outcomes are consistent with the program design. You can find these in eGrants or reach out to your OneStar Program Officer for a copy.
All programs have at least one aligned performance measure, which is an output paired with an outcome.
Most data collected for performance measures will be quantitative.
Most National Performance Measures have guidance on measuring and collecting data.
Key Takeaways on Performance Measures
Unduplicated counts. Ensure that each person served is counted only once within an aligned output and outcome measure (see the sketch after this list).
Those counted in outcomes should come from the same group counted in outputs. One way to remember this: the output is all students in the class who attend every weekday, while the outcome is the ten of those same students who also attend on Saturday.
All performance measures should reflect clear engagement that represents meaningful service for those who are served.
All performance measures need a clear output definition of what it means to be served or counted and a clear definition of the indicator of improvement for the outcome.
Clearly identified instruments or assessment tools are critical to data collection for performance measures.
Data collected towards progress on target goals for performance measures is reported to AmeriCorps State & National in progress reports. In some cases, OneStar will request and review source documentation in support of reliable and verifiable data.
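For programs that track beneficiaries by a unique identifier (for example, in a spreadsheet export), a small script can make unduplicated counts and output/outcome alignment easy to check. The sketch below is illustrative only; the IDs, session names, and Python code are hypothetical and are not part of any AmeriCorps or OneStar requirement.

```python
# Minimal sketch: unduplicated counts and output/outcome alignment.
# Assumes a hypothetical roster where each beneficiary has a unique ID;
# the column names and records are illustrative only.

output_records = [
    {"beneficiary_id": "S001", "session": "Mon tutoring"},
    {"beneficiary_id": "S001", "session": "Wed tutoring"},  # same student, second session
    {"beneficiary_id": "S002", "session": "Mon tutoring"},
    {"beneficiary_id": "S003", "session": "Wed tutoring"},
]

outcome_records = [
    {"beneficiary_id": "S001", "improved": True},
    {"beneficiary_id": "S002", "improved": True},
]

# Unduplicated output count: count each beneficiary once, not each session.
output_ids = {r["beneficiary_id"] for r in output_records}
print("Output actual (unduplicated):", len(output_ids))  # 3, not 4

# Aligned measure check: everyone counted in the outcome must also
# appear in the output for the same measure.
outcome_ids = {r["beneficiary_id"] for r in outcome_records if r["improved"]}
not_in_output = outcome_ids - output_ids
if not_in_output:
    print("Review these IDs; they appear in the outcome but not the output:", not_in_output)
else:
    print("Outcome actual:", len(outcome_ids))
```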
Source Documentation for Performance Measure Data Collection
Source documentation refers to the documentation that supports the data and the process by which your program has aggregated or analyzed the raw data and reported it to OneStar. OneStar does not approve instruments that programs use to collect data for performance measures. In cases where OneStar is reviewing source documentation, programs will need to ensure that the source documentation aligns with the instrument indicated in the program’s performance measures.
Data collected on AmeriCorps program performance is an important component of reporting on the impact AmeriCorps programs make across the state and across the country. Much of this data is compiled at the state and national level for AmeriCorps funding allocations. For this reason, it’s important that we take the time to ensure that the source documentation verifies the data actual reported.
Source Documentation Review Common Issues
Not being able to attribute data to reporting periods and/or member service
Instrument used not aligned with performance measures
Calculation on source documentation doesn’t match the data actual reported (see the sketch after this list)
Program uses a calculation method on source documentation that doesn’t align with the raw data submitted
Missing raw data as indicated in performance measures
Duplication errors
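Several of these issues can be caught before submission by re-deriving the reported data actual directly from the raw data. The sketch below is a hypothetical Python example; the column names, scores, and the 10-point improvement threshold are stand-ins for whatever indicator of improvement your own performance measure defines.

```python
# Minimal sketch: re-deriving a reported data actual from raw data before submission.
# The threshold, column names, and scores are hypothetical; substitute the
# indicator of improvement defined in your own performance measure.

raw_data = [
    {"beneficiary_id": "S001", "pre": 60, "post": 75},
    {"beneficiary_id": "S002", "pre": 70, "post": 72},
    {"beneficiary_id": "S003", "pre": 55, "post": 68},
]

IMPROVEMENT_THRESHOLD = 10  # hypothetical: "improved" means a gain of 10+ points

recomputed_actual = sum(
    1 for r in raw_data if r["post"] - r["pre"] >= IMPROVEMENT_THRESHOLD
)

reported_actual = 2  # the number entered in the progress report

if recomputed_actual != reported_actual:
    print(f"Mismatch: raw data supports {recomputed_actual}, report says {reported_actual}")
else:
    print(f"Calculation verified: {recomputed_actual} beneficiaries met the indicator")
```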
Source Documentation Tips
Review, review, review data and source documentation prior to submission
Be transparent about your data and your source documentation. What isn’t working? How can we be partners in addressing data issues?
Don’t provide data or information that OneStar does not need. And if you aren’t sure about what we need to see, ask us!
Share what you have learned from the data and the source documentation that you’ve collected. Are there program improvements to be made?
If needed, revise your Data Collection Policy & Procedure.
Data Collection Policy & Procedures and Best Practices
At a minimum, programs should have policies and procedures which address:
How and when data is collected, aggregated, and analyzed
How data is reviewed and verified for reporting
A replicable process for data collection
How data aligns with significant program activities as outlined in the program’s current outputs and outcomes
How staff and members are trained in data collection
Data Collection Policy & Procedures Template
Customize this sample data collection and quality policy and procedure to fit your program's particular systems.
Data Collection Best Practices
Read and know your program’s performance measures.
Share and talk about the data you are collecting often with:
- AmeriCorps members
- supervisors
- stakeholders
Create data sharing agreements with partners, host sites, and/or staff.
Learn from data. Not meeting a target goal is not the end of the story; what you do with that information is what matters.
Data Collection Training for Members & Staff
Data collection training is an important component of data collection. Train members and staff (including site/partner staff) who participate in data collection on the program’s data collection procedures. Be sure to explain what the tool is, why it’s being used, and/or what it tells you about member service and beneficiaries. Use scenarios and examples where possible, and ensure members and staff know how to explain the tool and provide instructions to others about how to complete it.
Another important component of training is providing your team with the performance measures for your program. Share the target goal for the service members are providing and explain how you determined it; addressing this with members and staff may help them understand that it is achievable. Even if members and staff aren't directly involved in administering instruments, explaining which instruments you will use to measure progress toward your target goal can help them understand the full picture of their service and work.
Provide your team with training on how to produce and gather qualitative data. For example:
Have members create an elevator speech (a quick three-minute description of their service)
Practice talking about service and beneficiary impact
Practice writing impact stories in monthly submissions
Data Collection Training Best Practices
Offer regular opportunities to retrain or review data with staff and members.
Follow up to correct data that is inaccurate. Follow up with team members when they express concerns or confusion about data collection.
Provide both written and recorded information for training purposes. Remember that while some perform well with written instructions, others may need a recording they can reference later.
Remember to obtain feedback from those that attend training.