Disposition Assessment: Revising an Assessment Instrument for the University and Field - School of Education, Indiana University Southeast - EPP Unit-Wide Created Assessment
Prior to revising the disposition instrument, the School of Education gathered disposition data, but collection was inconsistent across programs; for the sake of our candidates, we knew we needed to move to a Unit-wide assessment. The dispositions document therefore went through multiple feedback reviews, and the faculty agreed that seven dispositions were superior to the previous nine. The School of Education voted in a 2016 faculty meeting to adopt the seven dispositions. At the same time, School of Education personnel continued to explore efficient ways of tracking dispositions from entrance to exit within all programs and to identify ways to clarify various components of the instrument. Ongoing discussions and meetings took place to explore possible changes to the instrument.
In the fall of 2017, the School of Education began using Taskstream to collect, enter, and house data in response to feedback from our stakeholders, faculty, and teacher candidates. Once these data were examined, the faculty found a great need to revise the disposition instrument to ensure reliability and validity. Because the instrument was used to make education majors aware of the school's expectations for their dispositions in both the university classroom and the field, it was determined that the instrument needed better directions, clarification, and examples of indicators. At the same time, CAEP began to require educator preparation programs to develop appropriate assessment instruments to measure and document dispositions. The School of Education therefore used this opportunity to revise and create a new Unit-wide instrument to assess dispositions and track progress within Taskstream. This later became our EPP-created assessment.
In the summer of 2017, the School of Education's CAEP Coordinator was awarded a summer research fellowship to revise the dispositions assessment and establish its validity and reliability. The study was designed to create an instrument with clear directions and ratings based on the seven dispositions. The current Dean joined the study as Co-Principal Investigator.
All supporting evidence from this study is uploaded with this document. One cycle of data is submitted here; by the time of the visit, we will have three cycles of data.
Documentation for EPP created assessment which meets CAEP Assessment Evaluation rubric:
- Administration and Purpose
- Content of Assessment
- Data Validity and Reliability
- Survey Content
1. Administration and Purpose
During which part of the candidate's experience is the assessment used? Is the assessment used just once or multiple times during the candidate's preparation?
This disposition assessment is used as a self-rating at decision point one in all programs Unit-wide, and it is reassessed at subsequent decision points in every program.
Therefore, every program monitors and tracks candidates' dispositions from entrance to exit.
Clear directions and expectations are shared with candidates at program orientation and at every decision point thereafter.
Criteria are clearly defined in the assessment instrument.
2. Content of Assessment
Dispositions are clearly aligned with CAEP, InTASC, and the School of Education conceptual framework. Dispositions are clearly defined, with indicators that serve as examples of behaviors. This work was based on the yearlong study that began with the CAEP Coordinator's summer faculty fellowship.
Each level is defined by specific criteria aligned with indicators. Feedback provided to candidates is actionable and is followed by a meeting with the program coordinator or faculty. Monitoring and feedback help the SOE track progress and use the results for continuous improvement.
3. Data Validity and Reliability
Since 2016, the School of Education faculty have revised the dispositions several times and established inter-rater reliability for scoring. Informal training was completed through faculty meetings when the new instrument was presented. In addition, an IRB-approved study was conducted to take into account feedback from stakeholders, candidates, and faculty. A great deal of research was completed for the study, and the instrument was piloted, with data collected as of spring 2019. We are still in the early stages of tracking and monitoring; we are reviewing the piloted data, interpreting the results, and looking for improvements.
In reviewing the literature on disposition assessments, the following study was especially helpful in giving the team a foundation for the design of our study. Although we have completed our study, we are still in the process of submitting the manuscript for publication.
Dispositions Assessment in Teacher Education: Developing an Assessment Instrument for the College Classroom and the Field
4. Survey Content
The survey is aligned with all standards. Its content was reviewed several times with experts and stakeholders. The assessment instrument is presented to candidates at orientation and in the classroom with clear expectations.
Evidence for this document
- Old dispositions
- New dispositions
- Old disposition assessment
- Feedback from stakeholders
- New disposition assessment (with alignment to all standards)
- Proposal for the summer fellowship grant
- IRB approval document
- Article on dispositions used as a reference for this study
- Presentation on the study, data, and other materials
- Raw data
- Disposition reports from all programs (one cycle or two)