The School of Education
Teaching for the Next Generation
NCATE Standard 2 | Assessment System & Unit Evaluation
2a. 1. How does the unit ensure that the assessment system collects information on candidate proficiencies outlined in the unit's conceptual framework, state standards, and professional standards?
Demographic, GPA, background check, and admissions data is collected and monitored by the Office of Admissions.
Each program area has an assessment system that includes assessments common to the unit and assessments that are program specific. (See program reports) Most content, pedagogical, or dispositional assessments are tied to courses. Templates for these assessments and related syllabi are housed in Livetext (access at www.livetext.com with visitor pass) and automatically appear in faculty accounts, supporting consistency in data collection. Assessments, syllabi, and course texts have been selected, developed, and agreed upon by program faculty (see program meeting minutes) and are reviewed or revised as needed. Dispositional data and classroom performance data is collected from mentor teachers and faculty supervisors through Livetext. NGCSU has an on-line course and instructor evaluation system that students complete at the end of each course.
Key assessment data for the unit is drawn and aggregated from Livetext.
Alumni and employer data is tracked by the University System of Georgia Board of Regents (BOR) utilizing student identification numbers that become employee identification numbers for employed teachers. This data is collected for the first three years of teacher employment and disseminated to universities.
Graduating candidates in teacher education at the initial and advanced levels (except the Ed.S.) are administered Educational Benchmarking (EBI) surveys that provide program evaluation information each year. EBI surveys also provide longitudinal and cross-university comparison data. Graduating candidates also complete open ended surveys regarding perceptions of program effectiveness.
The School of Education contracts with Eduventures to provide surveys of mentor teachers, principals, superintendents, and graduates to provide additional data regarding candidate effectiveness.
GACE licensure data is collected and disseminated to all programs after each administration of the exams.
Course evaluation data from the on-line NGCSU system is available to faculty at the end of each term, and is utilized as one source of evidence of teaching effectiveness that comprises 60% of faculty annual evaluations. Faculty also have access to candidate performance assessment reports from their Livetext accounts. The coordinator of Assessment runs annual assessment reports (see www.livetext.com with visitor pass) from Livetext of candidate performance and inter-rater reliability across multiple sections of courses.
Each year the SOE faculty meet as a whole for at least two days to review all of the data above and form goals for program and unit improvement. (See Annual Retreat) One day is spent reviewing unit data, and programs each spend an additional 1-2 days reviewing their data. The results of these meetings and on-going curricular program or workgroup meetings are documented in the NGCSU SAINT (institutional) report as well as in reports for GAPSC, SACS, and NCATE. Significant changes to programs are discussed with Arts and Sciences and P-12 partners in the Professional Education Committee (PEC) and Academic Affairs and/or Graduate Studies. (See Meeting Minutes)
2a. 2. What are the key assessments used by the unit and its programs to monitor and make decisions about candidate performance at transition points such as those listed in Table 6?
2a. 3. How is the unit assessment system evaluated? Who is involved and how?
The unit assessment system undergoes on-going evaluation and analysis from a variety of constituencies. Common assessments across the unit are developed and monitored by faculty workgroups as well as the faculty as a whole. Changes to key assessments (entry requirements, portfolios, teacher work samples, evaluation of dispositions, GACE content evaluation data, course grades, and graduate core data points) are reviewed by faculty. Course-based evaluations of candidate performance outcomes are always available to faculty, and aggregate course assessment data is compiled and distributed for analysis and proposed programmatic changes at the annual retreat. This also includes a program-by-program analysis of GACE content pass rates. GACE content licensure assessment data is also disseminated to faculty, including arts and sciences faculty, deans, and department heads, as it is received. Review of program changes, including assessment outcomes, is offered to Academic Affairs, program coordinators, relevant faculty (including arts and sciences), deans, and partner school district personnel once each term as a part of the Professional Education Committee functions. Questions, concerns, and suggestions for unit and program improvement occur at this time.
Unit data is also reported to the faculty and administration each year at the opening of the annual School of Education retreat held each August. Aggregate data from standardized and open ended unit assessments, as well as data from graduating seniors, mentor teachers, employers, and NGCSU teachers employed at least one year, is reviewed. Data and proposed course and program changes are reviewed each term with the Professional Education Committee (PEC), which is comprised of program coordinators, Arts and Sciences faculty and department heads, representation from Academic Affairs, principals, superintendents, and other P-12 partners. As a part of our movement to more comprehensive professional development school relationships, curriculum and assessments have been reviewed with partner groups for the purpose of receiving their input on courses, assessments, and program sequences. (See PDC agenda/meeting minutes)
The NGCSU School of Education also contracts with Eduventures to collect data utilized for unit and program improvement from graduates and employers, including area superintendents. Other utilization of data: course and instructor evaluation data derived from an institutional on-line system is analyzed by faculty and their immediate supervisors as a part of faculty evaluation, and information literacy skills are measured through an institution-level process as a part of the Quality Enhancement Plan (QEP), with that information disseminated to program coordinators. The Georgia Board of Regents tracks graduates through their first three years (2012 is to be the third year) of employment and seeks perceptions of preparation levels from teachers who are alumni of our programs and from their employing principals. This is a newly developed system and is steadily capturing higher rates of return each year.
2a. 4. How does the unit ensure that its assessment procedures are fair, accurate, consistent, and free of bias?
The fairness and accuracy of assessments are ensured in a number of ways. Assessments are modeled on state and national research-based standards, thereby drawing on accepted indicators of professional performance. Candidate performance in any category is measured using multiple types of assessments with input from multiple assessors, including the candidate, classroom faculty, field supervisory faculty, mentor teachers, and school administrators. If discrepancies in assessment of candidate performance occur, there are multiple sources of documentation that can inform a comprehensive evaluation of the candidate. Consistency in assessment is supported by the utilization of the Livetext system. The standards-based course objectives and assessments are loaded into each faculty member’s account each term, ensuring awareness of the agreed-upon common assessment(s) for the course. The Livetext system also permits a faculty member to compare the performance of groups of candidates over time and to analyze inter-rater reliability outcomes across multiple sections of the same course.
In addition to the utilization of standards-based assessment instrumentation developed by faculty and K-12 partners, there are many elements of the system that provide for utilization of objective standardized assessments, including the GACE basic skills and content (licensure) examinations, GRE and MAT admissions data, utilization of the Teacher Performance Record (TPR) in clinical evaluations, Board of Regents surveys, and Educational Benchmarking surveys. In addition to providing low-inference and/or objective data on current teacher candidates, each of these elements of the system provides an opportunity to track program and individual candidate outcomes longitudinally. The validity of the TPR, teacher work sample, GACE, GRE, EBI, and dispositional evaluations has been established (see supporting research in the exhibit room).
Training in the use of assessments also helps to ensure unbiased, accurate, and consistent assessment of candidates. Program faculty meet to discuss requirements for course assessments. The coordinator of field placements conducts training for utilization of the TPR in clinical supervision each term. The TPR has training modules that allow the calculation of the reliability of the “scorers” to ensure that faculty and supervisors new to the instrumentation are qualified to use it consistently. Additionally, new faculty and supervisors receive training in the use of the internship evaluation and the requirements and scoring of the teacher work sample (TWS). Faculty work within their program workgroups to establish expectations for common course assessments. The holistic analysis of this data during the annual retreat reveals inconsistencies within and among programs. The most common source of inconsistency is the development of new faculty, and it is addressed through continued training and support.
2a. 5. What assessments and evaluation measures are used to manage and improve the operations and programs of the unit?
The Dean of the SOE is evaluated yearly by the faculty, associate deans and program coordinators. These evaluations become a part of an annual review conducted by the Vice President for Academic Affairs. Associate Deans are evaluated by program coordinators and faculty as a part of annual review conducted by the Dean. Staff positions are evaluated by faculty and administrators logically served by those positions and reviewed by Associate Deans. Annual faculty evaluations are conducted by program coordinators, who review institutional teaching evaluations, service and scholarship. During faculty evaluations coordinators work with each faculty member to identify goals for personal professional development and for course and program improvement.
The budget operations are conducted by the Dean of the SOE in collaboration with the SOE budget officer. The Dean is accountable to the Vice President of Academic Affairs and Vice President for Business and Finance for budget planning and expenditure. A unique characteristic of operations is the transparency and collaborative nature of budget planning. When state budget cuts were threatened (and partially materialized) two years ago the Dean conducted extensive analysis with coordinators across programs to identify a prioritized list of possible opportunities for savings. As a result, the SOE was able to manage significant reductions without loss of instructional positions.
The NGCSU School of Education operations are administrated as a whole. This permits fiscal flexibility in times of shifting enrollments and budget reductions and keeps curricular program decisions within workgroups. This also supports curricular collaboration across programs, since many faculty teach in more than one initial, advanced or endorsement program.
Teacher candidates have opportunities at the time of graduation to evaluate operational aspects of the unit and/or programs on completion of the EBI survey, on such items as administration, quality of advisement, availability of instructors, library and technology resources, and career services. In 2010, EBI data indicated that administrative services constituted a significant predictor of overall program effectiveness and were ranked in the “excellent” range by graduating teacher candidates at all levels.
2b. 1. What are the processes and timelines used by the unit to collect, compile, aggregate, summarize, and analyze data on candidate performance, unit operations, and program quality?
Please see the chart describing the Unit assessment system that describes how and from whom data are collected, and the frequency of collection.
Faculty are responsible for analyzing course performance data through the assessment reporting function of Livetext, as well as the course and instructor evaluations completed by students in the electronic institutional system. The Dean and coordinator of assessment annually provide assessment reports that aggregate common course data when there are multiple sections of a course, for the purpose of examining inter-rater reliability. These data are provided for program improvement discussions that take place among faculty at SOE retreats each August.
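The inter-rater reliability comparison described above can be sketched with a simple agreement statistic. The following is an illustrative example only: the rubric scores and section data are hypothetical, and the specific statistic Livetext reports is not stated in this document. It computes percent exact agreement and Cohen's kappa for two raters scoring the same set of work samples.

```python
# Illustrative sketch only: hypothetical rubric scores from two raters
# (e.g., instructors of two course sections scoring a shared anchor set).
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Proportion of items on which the two raters gave the same score."""
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    po = percent_agreement(rater_a, rater_b)  # observed agreement
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Expected chance agreement from each rater's marginal score distribution
    pe = sum((counts_a[c] / n) * (counts_b[c] / n)
             for c in set(counts_a) | set(counts_b))
    return (po - pe) / (1 - pe)

# Hypothetical scores on a 1-3 rubric for six shared work samples
section_1 = [3, 3, 2, 3, 1, 2]
section_2 = [3, 2, 2, 3, 1, 3]
print(percent_agreement(section_1, section_2))  # 2/3 exact agreement
print(cohens_kappa(section_1, section_2))       # ~0.45
```

A kappa near 0 would indicate agreement no better than chance; values approaching 1 indicate strong consistency between sections, which is the pattern the annual reports are designed to surface.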
The Dean or Coordinator of Assessment disseminates unit and program data (including key unit assessments; EBI, informal graduate survey, and employer data; data derived from Eduventures special reports examining specific populations; TPR; internship evaluations; and mentor teacher evaluations of candidates) to faculty prior to the August retreat.
GACE content examination pass rates are disseminated by program to all relevant faculty and coordinators (including Arts and Sciences partners) as they become available after each administration.
Aggregated Livetext data is disseminated in the form of a Livetext assessment report (see www.livetext.com with visitor pass).
GACE pass rates are disseminated via system reports that include information regarding aggregate rates for specific exams and individual performance data.
TPR data is summarized from Excel spreadsheets and disseminated in chart form.
EBI data is summarized by unit and by program and placed in reports for the August retreat, as are informal exit survey data by the SOE Associate Dean of Assessment.
PAAR, AIMS and PRS data are uploaded in summarized table formats.
The Georgia Board of Regents requires an annual narrative summary report.
2b. 2. How does the unit disaggregate candidate assessment data for candidates on the main campus, at off-campus sites, in distance learning programs, and in alternate route programs?
Data is collected in Livetext for all programs, so disaggregated data is always available by program; aggregated data is derived from the computation of means weighted by the number of candidates in each data set. Post-Bac candidates are assessed utilizing the same instrumentation that is utilized for other initial candidates. MAT candidates complete critical assessments that mirror the critical assessments of undergraduates (TWS, internship evaluation, dispositions evaluation) in year one, and critical assessments from the graduate program in year two. Data is derived by program and later aggregated for unit purposes. Demographic data is derived from admissions records and Banner.
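The aggregation described above, unit means computed from program-level data sets of different sizes, amounts to a weighted mean. The following is an illustrative sketch only; the program labels and rubric values are hypothetical.

```python
# Illustrative sketch: aggregating program-level rubric means into a unit
# mean, weighted by the number of candidates in each program's data set.
# Program labels and values are hypothetical.

def unit_mean(program_stats):
    """program_stats: list of (mean_score, n_candidates) tuples, one per program."""
    total_candidates = sum(n for _, n in program_stats)
    weighted_sum = sum(mean * n for mean, n in program_stats)
    return weighted_sum / total_candidates

programs = [
    (3.4, 20),  # e.g., one program: mean rubric score 3.4 across 20 candidates
    (3.0, 10),  # e.g., another program: mean 3.0 across 10 candidates
]
print(round(unit_mean(programs), 2))  # 3.27
```

Weighting by candidate counts ensures that a small program does not pull the unit mean as strongly as a large one; a simple unweighted average of the two program means above (3.2) would understate the unit result.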
2b. 3. How does the unit maintain records of formal candidate complaints and their resolutions?
Formal complaints from candidates are generally related to grade appeals or faculty performance. Grade appeals are addressed through the regular appeals process developed by the university faculty. The records of the outcome of these appeals are reflected on the student’s transcript and are kept in the Office of Academic Affairs.
Formal complaints about faculty performance are rare, but when they are received, the faculty member is informed. As with any appeals process, the first action suggested to the student is that s/he meet with the faculty member to discuss the concerns that precipitated the complaint. If the student is not comfortable doing this, then the student is advised that another faculty member or administrator can be present at the meeting. A persistent and significant conflict could involve the intervention of the dean or a representative from the Office of Student Affairs. (Complaint records are kept in the student and/or faculty files.)
2c. 1. In what ways does the unit regularly and systematically use data to evaluate the efficacy of and initiate changes to its courses, programs, and clinical experiences?
The SOE conducts an annual retreat to review and discuss program and unit data and develop goals for improvement.
Workgroups (program or subgroup) meet at least twice per term. The faculty review coursework outcomes and candidate performance together to inform the collaborative development of course assessments.
The inclusion of faculty contributions to course and program improvement in faculty evaluation and promotion and tenure communicates the importance of faculty commitment to program improvement.
Administrative or curricular changes are generated or approved in coordinator’s meeting before going before Dean’s Council, Academic Affairs or the Professional Education Committee.
2c. 2. What data-driven changes have occurred over the past three years?
Faculty identified problems in candidate performance on the teacher work sample. Faculty changed the language used in classes, assessments and assignments to be consistent with the language used in the teacher work sample.
Faculty moved lesson and unit planning activities forward in the curriculum of all programs and included “practice” work sample activities in the assignments of relevant parts of courses. When some problems continued, faculty generated modifications to the content, instructions, and grading rubrics of the work sample so that they aligned more closely (all initial programs).
The ECE/SPED program responded to graduating senior open ended exit interviews to move one course from fall of senior year to spring of junior year to better distribute candidate work requirements.
In response to candidate difficulties in classroom management, the ECE/SPED program moved the Applied Behavior Analysis course to junior year so that a continuum of classroom, instructional, behavioral and school wide intervention content could be linked by taking Classroom Management term #1, Applied Behavior Analysis term #2, and Working with Students in At Risk Situations term #3 in the program.
Faculty developed a portfolio process for each program to ensure that candidates had opportunities to view their work holistically in the context of state and national standards.
Unit-level data suggesting that candidates have difficulty utilizing data to inform instructional decision making led faculty to include August experiences designed to improve candidates’ goals and understanding of the “big picture” decision-making processes in which teachers engage at the beginning of the year. Additionally, at PDC sites candidates are being included in professional development opportunities and participation in the statewide testing protocols.
In order to improve ECE/SPED candidate performance on group school-wide improvement projects, faculty elaborated the research assignment to include collection of data on an intervention effort and a report of outcomes to faculty and/or leadership teams within schools.
Faculty in the ECE/SPED program changed a textbook for SPED 3100 Introduction to Mild Disabilities when candidate performance on foundational knowledge on SPED portions of GACE content examinations was not at desired levels of mastery.
Examination of middle grades candidates’ ability to support content literacy led to the inclusion of additional reading and special education coursework for middle grades.
Examination of teacher work sample data in all initial programs generated further faculty development of the differentiation components of instruction and curriculum classes. The Middle Grades program is developing a secondary-focused special education class to further support candidate expertise in this area.
Examination of data in the TESOL endorsement program resulted in termination of a faculty member and suspension of TESOL endorsement while a search for qualified faculty and redesign of the program to meet state and national standards occurs.
Inter-rater reliability data on the teacher work sample, TPR clinical evaluations, and professional portfolios has led to more frequent and more intensive training on the TWS, portfolio, and TPR.
Performance of 6-12 History Education candidates on GACE content evaluations resulted in the requirement of inclusion of an economics course in the program of study.
Portfolio performance data related to reflections on the standards led to the initiation of professional portfolios being moved forward into Area F pre-education courses.
Internship evaluation data for Middle Grades and P-12/6-12 resulted in changes to improve internship performance through more structured field experiences earlier in the program, beginning with content area tutoring in structured situations in Area F pre-education courses for Middle Grades and P-12/6-12 candidates.
“Cohorting” occurred in all programs to improve the coherence and progressive rigor of the curriculum, clinical performance, and long-term retention in the field of education. There are still efforts to make further program changes to ensure graduate candidate adherence to the program sequence, possibly through consolidation of masters degrees into curriculum and instruction degrees with content emphases.
Inconsistencies in performance data for advanced candidates resulted in portfolio requirements for graduate students.
Middle Grades faculty completed a redesign of their program two years ago to be more closely aligned with NMSA standards.
The TPR was adopted as one form of instrumentation utilized to improve the quality of feedback to candidates based on their display of categories of research-based behaviors.
2c. 3. What access do faculty members have to candidate assessment data and/or data systems?
Faculty members have unlimited access to their own course data at any time in the Livetext system or institutional course evaluation system.
Program coordinators and workgroup members can receive aggregated data for multiple sections of courses through request to the associate dean of assessment. They receive program and unit data annually prior to retreat.
GACE licensure data are disseminated as received.
Key unit assessments, TPR, EBI and survey outcomes are disseminated annually by program and by unit before the retreat.
2c. 4. How are assessment data shared with candidates, faculty, and other stakeholders to help them reflect on and improve their performance and programs?
Candidate performance data is shared individually through the grading of performance rubrics in the Livetext system and through end-of-term individual conferences in terms with field experience components. Through this system candidates also receive on-going formative feedback on comprehensive pieces of work, including case studies, teacher work samples, and portfolio development.
Faculty access to data sharing is described above and also includes the development of program reports. (See program reports)
P-12 partners formally receive access to data during dissemination at Professional Education Committee meetings. Informally P-12 partners receive and contribute to candidate performance evaluation through PDC group meetings. Institutional administration receives data through the PEC meetings, the Dean’s Council, NGCSU SAINT (institutional) reports and direct dissemination of copies of program and unit accreditation reports.
1. How does your unit do particularly well related to Standard 2?
The unit has developed a comprehensive system of assessment that includes multiple indicators across each professional domain, is standards based and is effectively archived, analyzed and distributed through formal and informal means to a variety of stakeholders and constituencies. The foundation of the system is the culture of assessment and on-going program improvement by faculty willing to collaboratively develop common course and unit assessments and commitment to conduct the sometimes difficult conversations regarding prioritization of content and clinical objectives. The recent and successful inclusion of P-12 partners in a much more intensive manner in those conversations regarding candidate needs and assessments in the context of school needs and objectives has created daunting levels of complexity in program planning and assessment. As in all successful ventures in our programs, the commitment of the faculty has been key in the development of a comprehensive assessment system.
2. What research related to Standard 2 is being conducted by the unit or its faculty?
A significant research question for the NGCSU SOE will relate to the practical development of common assessment systems with Professional Development School Partners that link candidate performance and student achievement and inform teacher preparation program development.