The 2012 OSEP Project Directors' Conference was held July 23-25, 2012, at the Marriott Wardman Park Hotel in Washington, DC.
The focus of this year's SPDG Program agenda was on implementation fidelity and coaching. Participants had an opportunity to learn about effective resources and tools, as well as to problem-solve issues that projects are addressing in these two topical areas. The list of resources to be shared has been posted to the website.
Date: Monday, July 23
ICAT Tools®: A Comprehensive Web-Based Management System for IC Teams. The integrated system includes program fidelity and student outcome measures. IC school, district, and state staff have secure access to track and monitor all program activities. School-level and aggregated district reports are available to inform current progress.
Target audience/level: State, district, and school leaders responsible for (IC Team) program implementation and evaluation.
Presenter: Jane Splean, Nevada SPDG Director
ICAT Tools (PDF)
District Capacity Assessment. This assessment is intended to help district leadership teams complete a self-assessment of their readiness with elements related to tiered interventions for academics and behavior and implementation science.
Target audience/level: District leadership teams and state professional development providers
Presenter: Martha Buenrostro, Oregon SPDG Coordinator
District Capacity Assessment (DRAFT, DOC, 2012)
Mini Grant Program for IHEs
Target audience: States and Institutions of Higher Education (IHEs)
Presenter: Linda Krantz, Wisconsin SPDG Director
Wisconsin Department of Public Instruction - FY 2011-2012 IHE Mini-Grant Application Guidance and Instructions (PDF)
Wisconsin Department of Public Instruction SPDG Mini-Grant Application (PDF)
Wisconsin SPDG IHE Mini-Grant Initiative Abstract (DOC)
Innovation Adoption Readiness Model (IARM). Provides a means to identify and track schools through the implementation process. An Innovation Configuration Map (ICM) is a diagnostic tool for measuring the fidelity of implementation of an innovation over time. Both of these tools can be used with the new SPDG Performance Measurement 2 to demonstrate improvement in implementation of SPDG-supported practices over time.
Target audience: SPDG project directors, school staff and state personnel.
Presenter: Ed Caffarella, Nevada SPDG Evaluator
Significant Support Needs Quality Indicators Fidelity Tools
Target Audience: Building staff
Presenters: Cyndi Boezio, Colorado SPDG Director and Dan Jorgensen, Colorado SPDG
Online Activity Tracking System for Annual SPDG Reporting
Target Audience: SPDG Directors, Coordinators, Evaluator and other project staff
Presenters: Howie Knoff, Arkansas SPDG Director and Jennifer Huisken LaPointe, Arkansas SPDG Evaluator
Culturally Responsive PBIS: Modifications of Tools to Include Cultural Responsiveness. Drs. Skiba and Cole will showcase three tools Indiana is using — the EMS Rubric, the Cultural Responsiveness Assessment (CRA), and the 5 X 5 Walkthrough — the most important one being the EMS Rubric. The rubric shows how they have modified standard PBIS fidelity measures to include issues of culture. The CRA and the 5 X 5 Walkthrough are interesting in that they are different methods for getting at some of the same dimensions of cultural responsiveness in a school: the CRA is self-report, while the 5 X 5 is based on observation by staff.
Target Audience: State, District and school staff
Presenters: Russ Skiba and Sandi Cole, Indiana SPDG Co-Directors
Supporting Schools Based on Stages of Implementation
Target Audience: District leadership teams, state professional development providers.
Presenter: Steve Goodman, Michigan MiBLSi Director
The Stages of Professional Development: A Resource for ALL Teachers Responsible for the Achievement of Students with Disabilities. This document was developed in 2005-2006 as a tool for self-assessment and for monitoring a teacher's growth in implementing instructional practices that are effective with students with disabilities. The tool can also be used to assist mentors and teachers in the self-assessment process and the creation of professional development plans.
Target Audience: School leadership, teachers, mentors, IHE faculty, special education department chairs, director of special education and other SEAs
Presenter: Karla Marty, Maryland
The Stages of Professional Development: A Resource for ALL Teachers Responsible for the Achievement of Students with Disabilities (PDF)
Implementation and Utilization Guide (PDF)
Self Assessment Form (PDF)
Coaching and School Data Collection App: The Classroom Mosaic App, the 20-Minute Target Survey and Fidelity Checklist Tools
Target Audience: SPDG project staff, administrators and instructional coaches
Presenter: Susan Beck, South Carolina SPDG Director
Online Integrity Rubric from the NCRTI, used as a self-assessment for all Washington schools, along with training modules built around the rubric to train "evaluators" for RTI systems.
Target Audience: School Staff
Presenter: Leslie Pyper, SPDG Director, Washington
Washington's online Integrity Rubric for RTI implementation. Anonymous users can click the "Explore RTI Data" link to browse the data that has been entered so far. Washington launched the system in August 2012, so there may not be many users yet. To see the actual rubric web form, you must create an account. After creating a username and providing an email address, you will be able to flip through the form and see how it actually works. Users click on stars to "rate" an item; a comment box then opens so they can insert "evidence" for selecting that rating. The optimal browser is Google Chrome; they are working to resolve some bugs when the form is displayed in Internet Explorer. Based on feedback from their July training sessions, they are tweaking the integrity rubric training modules. The revised modules will be posted once they are made available (sometime after September 10, 2012).
The RTI Essential Components Integrity Rubric (PDF, 8/2011) and the RTI Essential Components Integrity Worksheet (PDF, 8/2011) were developed by the National Center on Response to Intervention for use by individuals responsible for monitoring the school-level fidelity of Response to Intervention (RTI) implementation, or for self-appraisal. The rubric and the worksheet are designed to be used together and are aligned with the essential components of RTI.
NOTE: The Integrity Rubric for evaluation of implementation has been used by Washington for two years (since 2010) with their pilot sites. Washington has begun entering the data summer/fall 2012.
RTI Essential Components Integrity Worksheet: Examples of Possible Evidence (DOC, 8/2012). This accompanying resource was developed by Washington's Office of Superintendent of Public Instruction (OSPI).
Most SEAs and LEAs understand the value of evidence-based professional development, which includes effective coaching systems. However, establishing infrastructures (funding sources, training, policies, etc.) designed to support such systems appears to be a challenge. Some LEAs have been much more successful than others at implementing these systems. This consultancy discussion will focus on identifying ways to develop and support a cadre of coaches (systems and instructional) who can help facilitate LEA implementation of MTSS/evidence-based practices with fidelity.
Presenter: Lowell K. Oswald, Utah SPDG Director
Facilitator: Peg Sullivan, Florida SPDG Project Director
Bringing about change in an educational setting is a difficult process that takes place over time as potential users of the change move through the Levels of Use from Orientation through Mechanical to Routine. Many educators see change as an event that takes place by administrative action such as “Starting on November 1, our school will use the XYZ model for Response to Intervention.” How do we get educators to understand and utilize the research on the change process and to move beyond viewing change as a simple administrative directive?
Presenter: Ed Caffarella, Nevada SPDG Evaluator
Facilitator: Jane Splean, Nevada SPDG Director
The SEA provides training and coaching to SPDG initiative staff on the competency drivers. However, there is little evidence of buy-in and consistent application before initial implementation of new practices. What strategies do you propose to gain buy-in for attending to selection, training and coaching to get short-term wins as well as accomplish long-term outcomes?
Presenter: Jeanna Mullins, Mid-South Regional Resource Center, TA Specialist
Facilitator: Monica Ballay, Louisiana SPDG Evaluator
It is an accepted principle that having a building-level coach is key to successful implementation of RtI/PBIS processes in the school. However, often the building-level coach position does not exist, or the role is extremely limited (e.g., developing a new token economy, a 5-minute data share-out). How can we encourage our school teams to utilize the building-level coach in a capacity that will truly develop sustainability?
Presenter: Susan Shipley, Wyoming SPDG Director and Christine Revere, Wyoming Department of Education, Educational Program Consultant
Facilitator: Debrajean Scheibel, Maine SPDG Coordinator
Establishing the technological capacity and infrastructure to build capacity for online coaching demands extensive preparation, research, and planning prior to implementation. However, the ‘human factor’ of teacher avoidance and reluctance to participate in the online coaching process may not be fully considered as a barrier to implementation. When conducting online coaching, how do you persuade reluctant teachers to participate in the process? This consultancy discussion will focus on ways to develop systems of support that facilitate teacher participation in online coaching implementation.
Presenter: Susan Williamson, Alabama SPDG Director
Facilitator: Gary Cates, Illinois SPDG Evaluator
There are expectations that high-quality professional development relies on adult learning principles; is skill-based; and collects, analyzes, and uses data to inform future professional development. At times it is difficult to get adults to fully participate in skill-based training, whether as a presenter or a participant. What strategies could we use to evaluate adult learning principles and skill-based professional development so that data are available for decision making for future PD?
Presenter: Brent Garrett, Vermont, Kentucky, Mississippi and New Hampshire SPDG Evaluator
Facilitator: Leslie Pyper, Washington SPDG Director
As a result of personnel changes in SEA leadership and project staffing across the five-year SPDG, competing priorities gain importance for state and local professional development. Maintaining the SPDG initiatives as a priority in LEAs through professional development, technical assistance, and coaching is vital to their success. With various initiatives (e.g., Common Core, changes to the Teacher Evaluation System, a new state assessment) competing for the time LEAs have to participate in training and coaching, how can the SEA ensure sustainability in the implementation and coaching of SPDG activities at the local level? Staff who serve as RTI, Instructional Support, Literacy, or Math coaches often find their roles redefined to meet immediate school, district, or state priorities. Evaluating the effectiveness of ongoing PD and coaching is difficult, such that when faced with financial reductions, coaching and support roles are eliminated. What possible solutions would you suggest for sustaining initiatives in this ever-changing world of educational priorities?
Presenter: Karen Jones
Facilitator: Melanie Lemoine
The Group Meet Up is an annual event where SPDG project personnel, partners, and friends gather for a little socializing at a nearby restaurant.
Date: Tuesday, July 24
Locale: Zoo Bar Cafe, 3000 Connecticut Avenue. It's a 10-15 minute walk from the hotel, directly across from the Zoo entrance.
Date: Wednesday, July 25
Topics discussed: Program Measures, Data Management, and the Evaluator's role in using SISEP's Implementation Assessments.
Facilitator: Jennifer Huisken LaPointe, Evaluator, Arkansas SPDG
The Implementation Assessment Suite of Resources can be found on the June Evaluators' Webinar page.
This year's event would not have been possible without our Planning Committee members. We extend our thanks to: