Professional Development
part of the Education Reform Network

Evaluation Rationale

Download: Evaluation.doc

The quality of staff development experienced by many teachers and administrators varies considerably from year to year and even from teacher to teacher in the same school. As a result, many educational leaders and policy makers are skeptical about the value of staff development in improving teaching and student learning. Well-designed staff development evaluation can address this skepticism by serving two broad purposes: (1) improving the quality of current staff development efforts, and (2) determining the effects of staff development in terms of its intended outcomes.

Evaluation design is determined by the purpose of the evaluation (to improve something or to judge its worth) and by the audience for the evaluation's findings. The evaluation process begins in the planning stages and is based on clarity of thought regarding outcomes, the adult learning processes that will be used, and the evidence that is required to guide decision making. It asks and answers significant questions, gathers both quantitative and qualitative information from various sources, and provides specific recommendations for future action.

If staff development is to improve student learning, many levels of change are required, each with its own particular evaluation challenges. Unfortunately, a great deal of staff development evaluation begins and ends with the assessment of participants' immediate reactions to workshops and courses. While this information may be helpful to staff development planners, good evaluation design also gathers additional information. Beyond the (1) initial collection of data on participants' reactions, evaluation must focus on (2) teachers' acquisition of new knowledge and skills, (3) how that learning affects teaching, and in turn (4) how those changes in practice affect student learning. In addition, evaluators may also be asked to provide evidence of (5) how staff development has affected school culture and other organizational structures.

Staff development leaders must also recognize that different audiences require different evidence. The vast majority of decisions about staff development are made in district offices and at school improvement team meetings, and the urgent pressure many school leaders feel to improve student learning means they want to know now whether staff development, as it is practiced with their teachers and administrators, is making a difference. They are not willing to wait several months for the district to receive the results of its standardized testing. Likewise, teachers want to know if staff development is making their work more effective and efficient, particularly whether improvements in student learning justify the often difficult changes they are being asked to make.

School board members and state legislators, however, want to know if their increased investment in staff development is paying off in improvements on state measures. While state and local policy makers may prefer evidence derived from more rigorous evaluation designs, it is important to remember that they may also be influenced by anecdotes and other informal assessments they hear from teachers or principals at meetings or in other settings.

Staff development evaluation must take into consideration each group's needs with regard to evaluation data. It must ensure the process is in place to collect the needed data and that the audience has the prerequisite knowledge and skills to interpret and use the information.

  • Cataloged: Sep-28-2003
  • Country: USA

