Evidence for the OnSite BOE Team to validate during the onsite visit:
As seen in the tables for Exhibit I.5.a, Teacher Preparation Programs, the unit comprises 25 initial licensure and 11 advanced programs. Standard 1 Addendum Exhibit 1: Professional Education Unit Program Chart includes the following information for each program: state licensure area, program level, college and departmental affiliation, accrediting or review bodies, and number of candidates.
Some programs are offered at the bachelor's level, some at the master's level. Some, like Special Education K-12: General Curriculum, are offered at both levels (a 5-Year MAT program and a Post-Baccalaureate program), but the programs are distinct and approved as such. One program, Inclusive Early Childhood Education, is a dual licensure program that prepares candidates in both Early Childhood Education and Early Childhood Special Education. This program is relatively new; the programs it replaces, Early Childhood Education (BS) and Early Childhood Special Education (5-Year MAT), are being phased out as it has been developed.
All licensure programs have been reviewed and approved by the state. Programs in School Counseling, Speech Pathology, Dance, Theater, Music (initial and advanced), and Art (initial and advanced) have also been reviewed and approved through professional accrediting organizations. All other licensure programs have been approved through the NCATE-State of Virginia Partnership Agreement that went into effect January 2010 (January 2010-2016). This Agreement acknowledges that NCATE defers to the state's review of the unit's programs if the teacher education program standards or licensing standards and the state's review processes are sufficiently similar to NCATE's, as determined by the State Partnership Board (SPB). One tenet of the Agreement is that SPA review of programs in Virginia is optional. Institutions that elect not to submit program reports are reviewed using the state process outlined in the NCATE-Virginia state partnership protocol.
Because of significant budgetary constraints at the time, the unit chose the option of having its NCATE-accredited programs reviewed under Virginia's NCATE/State Partnership Protocol rather than seeking national recognition during this most recent accreditation process. Therefore, the information requested in many of the team's questions was not included in IR Exhibits 1.3.c-h of the Institutional Report. It was explained to us that Virginia's endorsement area competencies had been reviewed by specialized professional associations (SPAs) and accepted by NCATE as aligned with the respective SPA standards. We were pleased to be notified that "on November 17, 2011, the Virginia Board of Education approved the Advisory Board on Teacher Education and Licensure's (ABTEL) recommendation to grant education programs at James Madison University the 'Approved' status." [Department of Education Letter – January 3, 2012.]
1. Content knowledge for teacher candidates: How many candidates do not pass Praxis II on their first attempt? What are the remediation policies for these students?
Successfully passing the Praxis II is a state licensure requirement for most initial licensure programs. For programs such as K-12 Special Education: General Curriculum and School Psychology, which do not have it as a licensure requirement, passing a Praxis II test is a program completion requirement. Thus, any program completer has, by default, passed the Praxis II. Programs have different timelines for when a candidate must pass the Praxis II. For example, candidates in the Elementary Education program must pass the Praxis II as part of their admission into the graduate portion of the program, while candidates in Secondary Education programs must pass it before completing 12 hours of graduate work. There is no limit on the number of times a candidate may take the Praxis II. Standard 1 Addendum Exhibit 2: Praxis II Pass Rates shows the number of candidates who did not pass the Praxis II over the last three years.
Candidates, particularly those in the Middle and Secondary Education programs, may take it multiple times, often during or toward the completion of their undergraduate content coursework. During one semester, 4 of 20 candidates may not pass; the following semester, all 4 of those candidates may have passed their retake while an additional 2 or 3 candidates were unsuccessful. Note that not passing the Praxis II may have different ramifications depending on the stage of the program the candidate is in and the precise nature of his or her difficulties. Remediation efforts are individualized to a candidate's specific needs. Program faculty work with candidates to provide advice, direction, and, at times, individual tutoring.
For example, in Art Education, faculty have candidates take a mock Praxis test in ARED 400 and archive study guides for candidates to use as resources. Dr. Steve Purcell, Department Head of Middle, Secondary, and Mathematics Education, meets with every candidate who has had sustained difficulty passing the Praxis II to help assess their needs and provide guidance. For example, a candidate who had taken the Spanish Praxis II repeatedly (she did not pass on her first 7 attempts) was encouraged to get more experience speaking Spanish. She went abroad to work and live in Spain and has since passed the Praxis II. She plans to resume graduate studies at JMU in Secondary Education in Fall 2012. More recently, individual tutoring was provided to a Secondary Education Mathematics candidate and advisee during the fall of 2011. She retook and passed the Mathematics: Content Knowledge exam during winter break and is continuing in the program. Thus, all candidates seeking assistance with retaking the Praxis II are afforded appropriate individual or group remediation.
2. Candidate knowledge for teacher candidates & Professional knowledge and skills for other school personnel: In one part of the IR it is indicated “if a candidate has challenges in any of the subsets, faculty may provide remediation opportunities.” What are the remediation policies?
Each program monitors candidate performance on key assessments. As reflected in the key assessment data summaries in IR Exhibit 1.3. c-h, the overwhelming majority of candidates in each program meet the expected competencies. Program faculty work with candidates from a mastery learning perspective; thus, candidates are provided multiple opportunities for feedback and correction on key assessments as needed.
In IR Exhibits 1.3.c-h, each cell represents the candidates in each program cohort who attempted the key assessment during the specified semester. All candidates in the class complete the key assessment. Scoring scales used in evaluating the key assessments vary across the programs. Many of the key assessments have subsets, and candidate performance is assessed on the range of items for each subset. The data summaries in Exhibits 1.3.c-g indicate when a candidate did not meet expectations on one or more subsets; the data may reflect individual candidate performance on a specific item or set of items rather than performance on the entire key assessment. So a candidate's overall score may be acceptable, but program faculty would still address individual items and/or subsets.
When key assessments are scored, program faculty assess candidate performance on sections of the key assessment and will provide focused remediation for a specific section. Program faculty then determine whether a candidate has demonstrated the competencies measured by that key assessment. Remediation may take many different forms. At times it may be targeted feedback to candidates; at other times it may require resubmission of some or all of the key assessment assignment.
For example, in the spring of 2010, 51/70 Elementary Education candidates met or exceeded all the criteria for the Case Study, the Pedagogical and Content Knowledge key assessment at the time. However, as noted in the footnote, 19/70 did not meet all the criteria. The footnote indicates that most of those candidates had used a school name in the case study and thus did not meet the criterion for maintaining confidentiality. Program faculty gave feedback to all of these candidates but did not require resubmission, determining that each candidate's overall score reflected the program competencies. Another example is from the spring of 2009, when 16/17 Secondary English candidates met or exceeded the criteria for the Teacher Work Sample assessment. The one candidate who did not meet the needed criteria for all sections reworked the section and resubmitted it, scoring "Acceptable" upon resubmission.
3. Dispositions for all candidates: What were the results of the new dispositions rubric during Fall 2011? What implications, if any, are raised from the results?
The new dispositions rubric was piloted in three Middle/Secondary Education 571 classes in fall 2011, alongside an existing program dispositions form. Candidates in those sections demonstrated strong professional behaviors, which made it difficult to tell whether either form adequately distinguishes acceptable performance from unacceptable performance. All three forms (the unit dispositions rubric and two program forms, one completed by the host teacher and one by the faculty member) rated candidates as Acceptable or Target on all criteria. The new dispositions form will be used again in Fall 2012, the next time the course is offered. Faculty conversations about the form should bring greater clarity about its validity.
The rubric was also used in several courses in the Learning, Technology, and Leadership Education Department: one section of Foundations of Education (EDUC 300), two educational technology classes, a TESOL course, and three Educational Leadership courses. Faculty reported that certain sections of the rubric ask about behaviors that pertain more to teacher education candidates (e.g., confidentiality of student records); faculty recognized that they would use the response option "No Chance to Observe" for such statements. In most settings, the rubric was applicable. However, the assessment committee will discuss whether the rubric should be revised with wording more appropriate for non-classroom-based programs.
The SPED K-12 program will use the rubric in EXED 376 (the first practicum experience, spring of junior year). In a preceding course, PSYC 270 (fall semester, the first course in the program), the rubric is reviewed with candidates; they sign a statement that they will uphold those standards as part of their induction. In Spring 2012, each candidate in the class will be in three placements. Cooperating teachers will be asked to complete a rubric for each candidate, so by semester's end we will have three sets of ratings for each candidate.
The unit dispositions rubric is part of the assessment plan for the IECE program. Candidates will be assessed at the end of every semester, and the data will be discussed in department or program meetings. Program faculty will begin using the rubric in Spring 2012.
The READ program will use the rubric in READ 602, which is required before the practicum can be started. The first administration of the rubric in Fall 2011, with eight candidates taking the course, provided evidence that the rubric has utility with candidates at that stage of the program. All rubric criteria were appropriate, and the rater was able to provide ratings for the cohort. The unit dispositions rubric will also be used in the practicum course, READ 660, which is offered during the summer term.
The unit recently reviewed the results of the pilots and decided on common points of administration for both initial and advanced programs. By establishing a common timeline for administration, data can be aggregated for all candidates allowing longitudinal comparisons of candidates’ dispositions and providing both programs and the unit with consistent data to monitor candidate performance and make program improvement decisions.
4. Student learning for teacher candidates: The department of Middle, Secondary, and Math Education reported efforts to support TWS development through the use of full time faculty as supervisors. Has there been any impact on candidate performance from these efforts?
Candidates in Middle and Secondary Education (MSME) complete the Teacher Work Sample (TWS) during their student teaching internships. It is a key assessment that helps program faculty understand whether or not our candidates have forged important connections between instructional planning, implementation, and assessment that ultimately shape students' learning success in the classroom. The TWS is a comprehensive examination of "core" proficiencies that in concert permit the department to confidently recommend our MAT graduates for licensure in Virginia's public schools.
Given the comprehensive nature of the TWS, MSME program faculty, particularly the MSSE 650 Student Teaching Seminar instructors, invest significant time and energy scaffolding our candidates' success through a process of iterative refinement for each of the six TWS sections (Contextual Factors, Learning Goals and Assessment Plan, Design for Instruction, Instructional Decision Making, Analysis of Student Learning, and Reflection and Self Evaluation). The department felt it was important that the MSSE 650 instructors and other full-time departmental faculty supervise our candidates’ student teaching internships so that they could provide constructive feedback and guidance to our candidates as they planned and taught their 5-day units. The net effect of this decision was twofold. First, those full-time faculty supervisors who were not teaching MSSE 650 gained valuable first-hand experience supporting candidates’ TWS development so that they were much more knowledgeable, conversant, and participatory in the department’s ongoing refinement of TWS assignment descriptions, rubrics, and writing prompts. This allowed the department to make enhancements and adjustments more efficiently with less duplication and re-working of previous efforts that translated into candidates’ reduced frustration and confusion as they authored each section of the TWS.
A second benefit of utilizing full-time faculty as supervisors (related to impact on candidate performance) is an outgrowth of the first. Since writing prompts, assignment descriptions, and rubrics were improved, MSSE 650 instructors received higher quality initial drafts at each stage of the TWS submission process. Consequently, MSSE 650 instructors were able to focus seminar class sessions on issues and topics drawn from and related directly to our candidates' internship experiences. The improvements the department made on the front end (assignment descriptions, rubrics, and writing prompts) also led to higher levels of candidate satisfaction with the TWS itself and helped cement the importance and relevance of this project to our candidates' lives as teachers, as evidenced in the following comments from recent MSME graduates in a 2011 survey:
“I think the TWS had a lot of portions that were time consuming but necessary. I do not feel like there was one section of the project that could have been left out because they are all related in some way. I think I learned a lot about myself as a teacher and how to deal with students under pressure and I have also learned that I do make a difference in their lives. The analysis of student learning was by far my favorite part because I felt like I saw all of my hard work pay off. Students were actually improving and learning about a certain topic.”
“After completing the TWS I see how the project allows for clear examination of how teaching impacted student learning. Analyzing the instructional decisions that were made, why they were made, and how they [affected] students was a revealing process that will be helpful for development as a teacher in the future.”
“The TWS has taught me how to change my instruction based on students’ needs and/or how they did on assessments within a unit. It was difficult, but I believe it helped me be able to better make these decisions throughout the rest of my first placement (after I completed the TWS unit), and throughout my second placement.”
“I appreciated having to do this mostly for the evaluation I had to complete. It let me reflect on myself as a teacher and also as a student. It also forced me to use all of the teaching and assessment techniques I learned at JMU. It is easy to fall into a routine of teaching, but it is important to remember what we learned.”
“TWS was valuable for making me research my students' backgrounds as thoroughly as I did and for making me evaluate my students' learning. TWS reinforced the concept of using formative assessment to drive instruction and held me accountable not only for measuring my students' learning but also for doing something about any discovered gaps in understanding.”
“I learned a great deal about looking at a unit as a "whole." Now that I have a job as a full time teacher, I feel like I have experience planning a unit from start to finish, while including adequate forms of assessment and actually carrying them out. This experience was invaluable to me. It is also nice to have the TWS on hand for future interviews.”
“I know that I will surely be referencing my TWS during my own teaching experiences. It will be helpful in developing certain types of assessments and how to evaluate those assessments for my students and myself.”
“It certainly helped me to think through the teaching process. I had to truly process everything step by step in order to do the TWS, which will help me to better implement all the things we've learned at JMU (learning goals, assessment, etc.)”
“I think that the TWS is a good way for teachers to remember what it feels like to find something difficult; I think it was very humbling to be in our students' shoes. I am also proud that I completed it and feel that I learned a lot throughout the process, despite a few frustrating bumps along the way (i.e. snow days). I learned that I need to avoid choosing materials/lessons that appeal to me specifically, and remember to always put my students' needs first.”
“I think the TWS was a great value to me. I think the biggest thing I took away from it is assessment, specifically how to look at results and figure out what needs to be done to make instruction more effective.”
“It especially helped me to analyze student data to help prepare for further lessons and assessments.”
5. Professional knowledge and skills for other school personnel: Programs in Educational Technology and Education Leadership indicated the development of new survey structures. Have these surveys been administered? If yes, when? What were the results?
Both programs have surveyed their graduates. Survey forms and an executive summary of the results for each program can be found in Standard 1 Addendum Exhibit 3: Ed Leadership and Ed Tech Surveys.
Response to Questions 6, 7, and 8:
6. Candidate knowledge for teacher candidates & Professional knowledge and skills for other school personnel: Clarify, for all programs, initial and advanced, which NCATE, state, and/or national standards are measured, how they are assessed, and when they are assessed in each program.
7. Candidate knowledge for teacher candidates & Professional knowledge and skills for other school personnel: While rubrics present the standards to be assessed in some cases, as with reading, it is not clear in all cases. Clarify which criteria of the rubric are used to measure each standard for each program.
8. Candidate knowledge for teacher candidates & Professional knowledge and skills for other school personnel: When presented, the data are not broken out by specific standards but are presented in aggregate. Clarify how faculty and candidates in a program know performance on a particular standard when data are not presented by specific standards.
As noted in the introduction to this Standard, programs chose the option to be reviewed under Virginia's NCATE/State Partnership; thus, the Institutional Report focused on providing the information required by that process. However, the absence of SPA alignment information in the exhibits is not an indication of a lack of attention to the alignment of our programs with state and national standards. To make these alignments more explicit, we have added a row showing the alignment to state standards (VDOE Competencies) and a row indicating the alignment to SPA standards to the Program Key Assessment Tables (Unit and Program Key Assessments for Initial Licensure Programs and Key Assessments for Advanced Programs). The amended tables can be found in Standard 1 Addendum Exhibit 4: Initial Programs and Standard 1 Addendum Exhibit 5: Advanced Programs. In addition, as seen in the examples for the Teacher Work Samples in Standard 1 Addendum Exhibit 6: TWS Rubric and Alignments, scoring guides for program assessments explicitly show the linkages. Additional program rubrics will be available in the on-site exhibits.
The foundation for programs' assessment of candidate knowledge, skills, and professional dispositions is our Conceptual Framework. Standard 1 Addendum Exhibit 7: CF Alignment shows the alignment of the Conceptual Framework (CF) to the InTASC and NBPTS standards. Unit courses and key assessments align CF candidate competencies with external standards in several ways to ensure that candidates successfully demonstrate all of the expected competencies prior to completion of their programs. Because our unit includes such a wide variety of initial and advanced programs, the unit chose to focus our assessment efforts under the elements that comprise NCATE Standard 1. This supports unit-wide discussions and helps us frame our efforts around key issues. Thus, all programs in the unit aligned their program assessments with the five unit focus areas: content knowledge, professional and pedagogical content knowledge, impact on students, diversity, and dispositions. Each focus area is aligned with one or more of the 11 Conceptual Framework competencies. Programs identified key assessments for each focus area. As with IR Exhibits 1.3.c-h, Standard 1 Addendum Exhibit 4: Initial Programs and Standard 1 Addendum Exhibit 5: Advanced Programs show the above alignments.
In addition, each program completed an alignment of courses with required state competencies (Exhibit 1.3.a, Virginia Department of Education Endorsement Competencies). We also aligned our CF candidate competencies with SPA standards as appropriate and indicated which courses in each program cover those standards. (IR Exhibit 2.3.c)
All initial programs in the unit use the Assessment of Student Teaching (ST-9) as a formative assessment of candidates' performance at the end of their programs. The ST-9 is aligned with the Conceptual Framework, and individual items on the ST-9 have been identified as indicators for each of the unit focus areas (see Standard 1 Addendum Exhibit 4). Using state competencies and SPA standards, program-specific reference guides were developed by clinical and university faculty within the MidValley Consortium. As SPA standards are updated, the reference guides are updated as well; the most recent example is the PHETE reference guide, reviewed and revised based on the NASPE/NCATE standards. These guides are used by university supervisors and clinical faculty and encourage a performance-based process for supporting the professional growth of pre-service teachers over time. (Standard 1 Addendum Exhibit 8: Reference Guides)
9. Candidate knowledge for teacher candidates & Professional knowledge and skills for other school personnel Clarify how candidates are informed about which unit (CF) NCATE professional standards, state and/or national standards are to be measured in a specific course if they are not listed in the course syllabus.
Many of the current syllabi reference Conceptual Framework, state, and/or national standards and will be available for review on site. Some syllabi embed these standards; others include a link to a program, state, or SPA website. Other courses use the Blackboard Course Management System to communicate this information to candidates. Many programs introduce candidates to professional standards during orientation or induction meetings and reference them again throughout the program.
For example, in PSYC 270, their first course in the program, candidates in the K-12 Special Education: General Curriculum program are introduced to NCATE and CEC standards in one class session; then all Special Education endorsement program students (IECE and K-12 SPED) participate in an induction ceremony where the candidates are "capped." During this ceremony, program standards (CEC and NCATE) are reviewed, and candidates sign a document agreeing to uphold those standards and the PEU's professional behavior dispositions and standards. Course syllabi in subsequent courses also include assignments and links to CEC and NCATE standards.
10. Candidate knowledge for teacher candidates & Professional knowledge and skills for other school personnel: When the data are presented for each assessment only, it is difficult to determine the exact number of candidates in a specific program. Clarify the number of candidates enrolled in each program. Explain how enrollment data relates to the numbers on the summary table.
Standard 1 Addendum Exhibit 1: Professional Education Unit Program Information Chart provides this information. Programs work on a cohort model; candidates go through classes in groups of 5-75, depending on the size of the program. Some cohorts may have an enrollment of 5 in one class; others may have three sections of 25 candidates per section. In the tables for IR Exhibits 1.3.c-h, the numbers shown for each semester do not equate to the total number of candidates in a program; rather, the number in each cell represents the candidates in each program cohort who attempted the key assessment during the specified semester. All candidates in a class complete the key assessment if one is embedded in that class. As referenced in our response to question 2 above, unit faculty approach assessment of candidate progress from a formative assessment perspective. Thus, candidates are given multiple opportunities for corrective feedback and subsequent revision of key assessment projects if needed.
11. Some syllabi were presented for 2008-2009 academic year. Are more current examples available?
The 2008-2009 syllabi are those that were submitted to the VDOE with the State Approved Program matrices in 2008. Current syllabi will be available in the on-site exhibits.