On behalf of the Center for Applied Research and Educational Improvement (CAREI), welcome back to the start of another school year! Whether you are just beginning your career as a school administrator or are a seasoned veteran, the start of a new school year is always an exciting time filled with new opportunities and challenges. For those of you who are not familiar with CAREI, we are a research center that serves as the link between research and practice in Minnesota schools PreK-16 and other agencies interested in applied educational research. CAREI partners with local, state, and national service agencies and policy makers to improve outcomes for all learners. We have set an ambitious goal that we want our services to impact 80% of school districts within three years!

The 2015-16 school year was an exciting year for CAREI! Here is a look back at what we accomplished "by the numbers":
- 43 districts joined the CAREI Assembly.
- 4 CAREI Assembly meetings were held with a 96% satisfaction rating.
- 175 people attended CAREI Assembly meetings in person.
- 60 people attended CAREI Assembly meetings via Webex.
- 1 Standards-Based Grading Literature Review was written and disseminated to CAREI members.
- 1 math resource guide was developed and disseminated to CAREI members.
- 1 Statewide Needs Assessment on research, evaluation, and assessment was completed and disseminated statewide.
- 13 presentations were given to professional organizations about the importance of research, evaluation, and assessment.
- 8 Research Watch electronic newsletters were disseminated and opened 848 times throughout the year.
- 1 Twitter account (@CAREIUMN) was created, with 32 tweets.
- 10 CAREI affiliates were added.
- 2 bills were authored in the MN Legislature to provide funding for statewide technical assistance in the areas of research, evaluation, and assessment.
- 1 bill, #3275 (Dahle), received a hearing (also see: slides & one-pager).
- 14 new external sales projects were awarded.
The basis for much of our work last year, and moving into future years, was the completion of a Statewide Needs Assessment focusing on research, evaluation, and assessment. A large percentage of survey respondents indicated that their school's or district's capacity to effectively use data to guide educational decisions was fair or poor. Despite substantial motivation and effort to use data, most educational systems in Minnesota lack the capacity to meet their own needs for data-based decision making. In addition, those who responded to the interviews and surveys consistently indicated a lack of resources and expertise to support their efforts. To truly leverage state and local investments, professional educators require infrastructure that builds capacity and efficiencies for using data to improve educational outcomes.

Historically, CAREI coordinated with educational agencies located in the Twin Cities metropolitan area. The proposed solution will expand the mission and accessibility of CAREI. It will provide resources to policy makers and educational agencies throughout the state, especially those in rural and high-need communities that have historically been underserved. CAREI will enable the use of evidence and data at all levels of the education system and foster high-value partnerships. In its expanded role, CAREI will continue as an impartial and independent hub for applied research and educational improvement. It will bring others together to define their values, goals, objectives, policies, and programs, and it will provide services and resources to facilitate high-quality research, evaluation, and assessment practices among its partners.
The Importance of Evaluation in Education
One finding from the needs assessment was that 51% of administrators rated their capacity to evaluate policies and programs as poor. High-quality program evaluation was rated as infrequent due to lack of time (78%), inadequate staffing/expertise (63%), and cost (53%). Why is evaluating programs and policies so important? The answer is fairly simple: to determine whether the program or policy had the intended effect in order to guide decision-making. Large-scale evaluations in education help us improve policy. Smaller-scale evaluations at the local level help guide decision-making and the allocation of resources based on outcome data. Learning how or why a policy or program does or does not work is central to program improvement.
These days, we have a tendency to want a quick turnaround on data to answer our questions. Most people want to conduct program evaluations quickly and with minimal expense. However, in education, quicker isn't always better. We need to consider the logistics of the program to be studied and what we hope to learn. Many educational programs or frameworks are multi-faceted and complex, requiring several years before all of their components are fully implemented. In addition, new programs sometimes take time to achieve the desired outcomes. That means we need to collect data, often from multiple sources, over an extended period of time. Yet educators can be impatient. The field of education has a long history of "swinging pendulums": adopting new programs and practices one year and abandoning them after a year or two of implementation to move on to the next "educational fad." The result is that programs are not given enough time to demonstrate the intended results, and staff suffer from "initiative fatigue."
What's the solution to the swinging pendulum of initiatives? Program evaluation is often used as part of implementation science, the study of methods that influence the integration of evidence-based interventions into practice settings. Implementation science helps answer questions such as: Why do established programs lose effectiveness over days, weeks, or months? Why do tested programs sometimes exhibit unintended effects when transferred to a new setting? The real message of implementation science is that effective intervention practices or models, coupled with ineffective or inefficient implementation, will result in ineffective, unsustainable programs and outcomes! Implementation science focuses on stages of implementation over time, along with implementation "drivers" that provide the infrastructure needed for high-fidelity, effective, and sustainable programs.
CAREI uses an implementation science framework to assist districts in their program evaluation efforts. Districts that belong to the CAREI District Assembly have access to four high-quality professional development and networking sessions per year, with either on-site or remote access. Along with discussing and disseminating applied educational research across a variety of important areas, we intend to focus on program evaluation in the upcoming year to build capacity within our member districts. For more information on CAREI or joining the CAREI District Assembly, please visit our new website at www.cehd.umn.edu/carei/. Please contact me at firstname.lastname@example.org if you want more information about CAREI or if you have topics you would like covered in future newsletters! •