Slide 1: Models of Curriculum Evaluation
Teaching & Learning Commons Presentation
http://talc.ukzn.ac.za

Slide 2: Overview
1. Concept of a model
2. Need for models
3. Models of curriculum evaluation:
   1. Tyler's model
   2. CIPP model
   3. Stake's model
   4. Roger Kaufman's model
   5. Scriven's model
   6. Kirkpatrick's model
4. Criteria for judging evaluation studies

Slide 3: Concept of a model
A theory explains a process (why?); a model describes a process (how?). A model is a representation of reality, presented with a degree of structure and order.

Slide 4: (diagram only)

Slide 5: Types of models
- Conceptual models describe what is meant by a concept.
- Procedural models describe how to perform a task.
- Mathematical models describe the relationships between the various elements of a situation or process.

Slide 6: Why do we need a model for curriculum evaluation?
To provide a conceptual framework for designing a particular evaluation, depending on the specific purpose of the evaluation.

Slide 7: 1. Tyler's Model (1949)
Key emphasis: instructional objectives.
Purpose: to measure students' progress towards objectives.
Method:
1. Specify instructional objectives.
2. Collect performance data.
3. Compare the performance data with the objectives/standards specified.

Slide 8: Limitations of Tyler's model
1. It ignores the process.
2. It is not useful for diagnosing the reasons why a curriculum has failed.

Slide 9: Tyler's Planning Model (1949)
The sequence runs: objectives, selecting learning experiences, organising learning experiences, evaluation of student performance. Each step answers a question:
1. What educational goals should the school seek to attain?
2. How can learning experiences be selected which are likely to be useful in attaining these objectives?
3. How can learning experiences be organised for effective instruction?
4. How can the effectiveness of learning experiences be evaluated?
[Print, M. (1993), p. 65]
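Tyler's three steps translate naturally into a small comparison routine. The sketch below is a minimal illustration, not part of Tyler's work: the objective names, mastery thresholds, and scores are all hypothetical. It shows the logic of step 3, comparing collected performance data against the specified objectives.

```python
# Illustrative sketch of Tyler's objectives-based evaluation.
# Objective names and threshold values are hypothetical examples.

objectives = {                       # step 1: specify instructional objectives
    "solve linear equations": 0.80,  # required mastery (proportion correct)
    "interpret graphs": 0.70,
}

performance = {                      # step 2: collect performance data
    "solve linear equations": 0.85,
    "interpret graphs": 0.62,
}

# Step 3: compare performance data with the objectives/standards specified.
for objective, standard in objectives.items():
    score = performance[objective]
    status = "met" if score >= standard else "not met"
    print(f"{objective}: scored {score:.2f} vs standard {standard:.2f} -> {status}")
```

Note how the sketch also exposes the limitation listed on slide 8: it reports that an objective was not met, but says nothing about why, because the process itself is never examined.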
Slide 10: 2. CIPP Model (1971)
The CIPP model of evaluation concentrates on:
- Context of the programme
- Input into the programme
- Process within the programme
- Product of the programme

Slide 11: The model focuses on decision making.

Slide 12: Types of decisions
- Intended ends (goals)
- Intended means (procedural designs)
- Actual means (procedures in use)
- Actual ends (attainments)

Slide 13: CIPP (diagram only)

Slide 14: Types of evaluation
- Context evaluation
- Input evaluation
- Process evaluation
- Product evaluation

Slide 15: Context evaluation
Objective: to determine the operating context, to identify and assess needs and opportunities in the context, and to diagnose problems underlying the needs and opportunities.
Method: by comparing the actual and the intended inputs and outputs.
Relation to decision making: for deciding upon the settings to be served and the changes needed in planning.

Slide 16: Context evaluation considers, for example:
- Needs of industry and society
- Future technological developments
- Mobility of the students

Slide 17: Input evaluation
Objective: to identify and assess system capabilities, available input strategies, and designs for implementing the strategies.
Method: analysing resources, solution strategies, and procedural designs for relevance, feasibility, and economy.
Relation to decision making: for selecting sources of support, solution strategies, and procedural designs for structure-changing activities.

Slide 18: Input evaluation considers, for example:
- Entry behaviour of students
- Curriculum objectives
- Detailed contents
- Methods and media
- Competencies of the teaching faculty
- Appropriateness of teaching/learning resources

Slide 19: Process evaluation
Objective: to identify defects in the procedural design or its implementation.
Method: by monitoring the procedural barriers, remaining alert to unanticipated ones, and describing the actual process.
Relation to decision making: for implementing and refining the programme design and procedure, i.e. for effective process control.

Slide 20: Process evaluation gives feedback to judge:
- The effectiveness of teaching-learning methods
- Utilisation of physical facilities
- Utilisation of the teaching-learning process
- Effectiveness of the system for evaluating student performance

Slide 21: Product evaluation
Objective: to relate outcome information to objectives and to the context, input, and process information.
Method: measuring outcomes against standards and interpreting the outcome.
Relation to decision making: for deciding whether to continue, terminate, modify, build on, or refocus a change activity.
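The pairings in slides 12 to 21 can be summarised as a simple lookup structure. This is a minimal sketch, assuming the evaluation-to-decision pairings described above; the dictionary keys and the decision labels (planning, structuring, implementing, recycling) follow Stufflebeam's usual terminology but are worded by us, not quoted from the slides.

```python
# Sketch of the CIPP structure from slides 12-21: each evaluation type
# examines one kind of decision object and informs one kind of decision.
# Key and label wording is illustrative, not quoted from the slides.

CIPP = {
    "context": {
        "examines": "intended ends (goals)",
        "informs": "planning decisions: settings to be served, changes needed",
    },
    "input": {
        "examines": "intended means (procedural designs)",
        "informs": "structuring decisions: strategies, designs, sources of support",
    },
    "process": {
        "examines": "actual means (procedures in use)",
        "informs": "implementing decisions: refining design and procedure",
    },
    "product": {
        "examines": "actual ends (attainments)",
        "informs": "recycling decisions: continue, terminate, modify, or refocus",
    },
}

for stage, info in CIPP.items():
    print(f"{stage:>7} evaluation examines {info['examines']} "
          f"and informs {info['informs']}")
```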
Slide 22: Product evaluation considers, for example:
- Employability of technician engineers
- Social status of technician engineers
- Comparability of wage and salary structures
- Job adaptability and mobility

Slide 23: Stufflebeam's CIPP Model (1971): Context, Input, Process and Product evaluation
- Key emphasis: decision making
- Purpose: to facilitate rational and continuing decision making
- Strengths: (a) sensitive to feedback; (b) rational decision making among alternatives
- Evaluation: identify potential alternatives; set up quality-control systems for activities

Slide 24: Limitations of the CIPP model
It overvalues efficiency but undervalues student aims.

Slide 25: CIPP view of institutionalized evaluation (diagram only)

Slide 26: The CIPP approach recommends:
- Multiple observers and informants
- Mining existing information
- Multiple procedures for gathering data, cross-checking qualitative and quantitative findings
- Independent review by stakeholders and outside groups
- Feedback from stakeholders

Slide 27: 3. Stake's Model (1969)
An antecedent is any condition existing prior to teaching and learning which may relate to outcomes.
Transactions are the countless encounters of students with teachers, student with student, author with reader, parent with counsellor.
Outcomes include measurements of the impact of instruction on learners and others.

Slide 28: Stake's framework
A rationale, plus two matrices whose rows are antecedents, transactions, and outcomes:
- Description matrix: intents | observations
- Judgement matrix: standards | judgements

Slide 29: Antecedents
Conditions existing prior to curriculum evaluation:
- Students' interests or prior learning
- Learning environment in the institution
- Traditions and values of the institution

Slide 30: Transactions
Interactions that occur between teachers and students, students and students, students and curricular materials, and students and the educational environment. Transactions = the process of education.

Slide 31: Outcomes
- Learning outcomes
- Impact of curriculum implementation on students, teachers, administrators, and the community
- Immediate outcomes vs long-range outcomes

Slide 32: Three sets of data
- Antecedents: conditions existing before implementation
- Transactions: activities occurring during implementation
- Outcomes: results after implementation
Describe the programme fully; judge the outcomes against external standards.

Slide 33: Stake's model
Key emphasis: description and judgement of data.
Purpose: to report the ways different people see the curriculum.
The focus is on responsive evaluation, which:
1. Responds to audiences' needs for information
2. Orients more toward programme activities than results
3. Presents all audiences' viewpoints (multi-perspective)
Limitations:
1. It stirs up value conflicts
2. It ignores causes
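Stake's two matrices (slide 28) lend themselves to a small tabular representation. Below is a minimal sketch under stated assumptions: the row and column labels are Stake's, while every cell entry is a hypothetical example invented for illustration.

```python
# Sketch of Stake's framework (slide 28): a description matrix
# (intents vs. observations) and a judgement matrix (standards vs. judgements),
# each with rows for antecedents, transactions, and outcomes.
# All cell contents below are hypothetical examples.

rows = ["antecedents", "transactions", "outcomes"]

description = {  # what was intended vs. what was actually observed
    "antecedents":  {"intents": "students have prerequisite algebra",
                     "observations": "30% lack prerequisites"},
    "transactions": {"intents": "weekly lab sessions",
                     "observations": "labs held fortnightly"},
    "outcomes":     {"intents": "80% pass the final exam",
                     "observations": "68% passed"},
}

judgement = {    # standards to judge against, and the judgements reached
    "antecedents":  {"standards": "entry-test cut-off score",
                     "judgements": "admission screening too lax"},
    "transactions": {"standards": "contact hours in comparable programmes",
                     "judgements": "lab time below standard"},
    "outcomes":     {"standards": "national pass-rate benchmark",
                     "judgements": "below benchmark; revise delivery"},
}

# Walk the rows, pairing each description with its judgement.
for row in rows:
    d, j = description[row], judgement[row]
    print(f"{row}:")
    print(f"  intended: {d['intents']} | observed: {d['observations']}")
    print(f"  standard: {j['standards']} | judgement: {j['judgements']}")
```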
Slide 34: 4. Roger Kaufman's Model
Needs assessment asks: Where are we now? Where are we to be?
A need is the discrepancy between the current status and the desired status. Discrepancies should be identified in terms of products or actual behaviours (ends), not in terms of processes (means).
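Kaufman's discrepancy amounts to simple arithmetic over measured results. In this minimal sketch the outcome measures and all figures are hypothetical; the point is that each quantity is an end (a result), not a means (a process).

```python
# Sketch of Kaufman's needs assessment: a need is the discrepancy between
# current status ("where are we now?") and desired status ("where are we
# to be?"), stated in terms of ends/results, never processes.
# All measures and figures below are hypothetical.

current = {  # measured results today
    "graduates employed within 6 months": 0.55,
    "graduates passing licensure exam":   0.70,
}

desired = {  # results the programme should attain
    "graduates employed within 6 months": 0.80,
    "graduates passing licensure exam":   0.90,
}

for end, target in desired.items():
    gap = target - current[end]
    print(f"{end}: current {current[end]:.0%}, desired {target:.0%}, "
          f"need = {gap:.0%}")
```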
Slide 35: Deduction and induction
- Deduction: drawing a particular truth from a general, antecedently known rule (rule to examples).
- Induction: rising from particular truths to a generalisation (examples to rule).

Slide 36: 5. Goal-Free Evaluation (1973)
Proponent: Michael Scriven.
Goals are only a subset of anticipated effects. Effects may be intended or unintended, and both fall within the evaluation's scope.

Slide 37: Roles of curriculum evaluation
Scriven differentiates between two major roles of curriculum evaluation: the "formative" and the "summative".
- Formative evaluation: during the development of the programme.
- Summative evaluation: at its conclusion.

Slide 38: Formative evaluation
It is carried out during the process of curriculum development, and its results may contribute to the modification or formation of the curriculum. For example, results of formative evaluation may help in:
- Selection of programme components
- Modification of programme elements

Slide 39: Summative evaluation
It is carried out after the curriculum has been offered once or twice, and it summarises the merits and demerits of the programme. A curriculum that operates satisfactorily over a period of time may still become obsolete; to prevent this, permanent follow-up of the curriculum and quality control of the programme should be maintained.

Slide 40: Methodology of goal-free evaluation
- Determine what effects the curriculum had, and evaluate them whether or not they were intended.
- Evaluate the actual effects against a profile of demonstrated needs.
- Notice what everyone else has overlooked, or produce a novel overall perspective.
- Do not work under the control of management; choose the variables of the evaluation independently.

Slide 41: Criteria for judging evaluation studies
- Validity
- Reliability
- Objectivity / credibility
- Importance / timeliness
- Relevance
- Scope
- Efficiency

Slide 42: Rationale; goals and intents
The rationale provides the philosophic background and basic purpose of the programme. Intents include the planned-for environmental conditions, demonstrations, coverage of criteria, subject matter, and student behaviour.

Slide 43: Observations
The events and their subsequent consequences form the observations. Observations are derived from:
- Inventory schedules
- Biographical data sheets
- Interview routines
- Checklists
- Opinionnaires
- All kinds of psychometric tests

Slide 44: Standards
Standards refer to benchmarks of performance having widespread reference value. They also refer to the level of competence at which the student performs essential scholastic tasks.

Slide 45: Judgement
Judgements may be made on two bases:
- With respect to absolute standards
- Relative to the characteristics of alternative programmes

Slide 46: 6. Kirkpatrick's Four Levels of Evaluation
In Kirkpatrick's four-level model, each successive evaluation level is built on information provided by the level below. Assessing training effectiveness often entails using the four-level model developed by Donald Kirkpatrick (1994).

Slide 47: According to this model, evaluation should always begin with level one and then, as time and budget allow, move sequentially through levels two, three, and four. Information from each prior level serves as a base for the next level's evaluation.

Slide 48: Level 1 - Reaction
Evaluation at this level measures how participants in a training programme react to it. It attempts to answer questions regarding the participants' perceptions: was the material relevant to their work? This type of evaluation is often called a "smile sheet". According to Kirkpatrick, every programme should at least be evaluated at this level to provide for the improvement of the training programme.

Slide 49: Level 2 - Learning
Assessment at this level moves the evaluation beyond learner satisfaction and attempts to assess the extent to which students have advanced in skills, knowledge, or attitude.

Slide 50: To assess the amount of learning that has occurred due to a training programme, level-two evaluations often use tests conducted before training (pretest) and after training (post-test).

Slide 51: Level 3 - Transfer
This level measures the transfer that has occurred in learners' behaviour due to the training programme. Are the newly acquired skills, knowledge, or attitudes being used in the everyday environment of the learner?

Slide 52: Level 4 - Results
This level measures the success of the programme in terms that managers and executives can understand: increased production, improved quality, decreased costs, reduced frequency of accidents, increased sales, and even higher profits or return on investment.

Slide 53: Level-four evaluation attempts to assess training in terms of business results. In this case, sales transactions improved steadily after training for sales staff occurred in April 1997. (chart)

Slide 54: Methods for long-term evaluation
- Send post-training surveys
- Offer ongoing, sequenced training and coaching over a period of time
- Conduct follow-up needs assessments
- Check metrics to measure whether participants achieved training objectives
- Interview trainees and their managers, or their customer groups

Thank you
firstname.lastname@example.org
http://talc.ukzn.ac.za