Monitoring and Evaluation

Presentation Transcript

What is Monitoring? : 

Monitoring is the routine assessment (e.g., daily, monthly, or quarterly) of information or indicators from ongoing activities.

Why Monitor : 

Monitoring tracks progress toward set program targets or performance standards. It identifies aspects of the program that are working according to plan and those that need midcourse corrections, so that timely improvements or changes can be made.

What is Evaluation? : 

Evaluation refers to the measurement of how much things have changed because of the intervention(s) implemented.

Why Evaluate : 

Because there are many factors that cause things to change, a formal evaluation tries to demonstrate how much a specific intervention contributed to the change.

Definition of M & E : 

M & E is a management tool built around a formal process for measuring performance and impact, using indicators that help measure progress toward achieving intermediate targets or goals. Monitoring systems comprise procedural arrangements for data collection, analysis, and reporting.

Purpose of M & E : 

- To determine how program funds are being spent
- To determine if programs are being implemented as planned
- To determine what the effects are on public health
- To determine if programs need to be changed to be more effective
- To provide reasons for success or failure

Indicators : 

Indicator: a value on a scale of measurement (a number, percentage, ratio, or fraction) derived from a series of observed facts that reveals relative change as a function of time. E.g., in 2000 there were no ARVs available; in 2005, 5% of known HIV-infected persons were on ARVs.
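As a minimal sketch (not from the slides), an indicator can be represented as a value tracked over time; the ARV example above might be captured like this, with the figures taken from that example:

```python
# Minimal sketch: an indicator as a value tracked over time.
# Figures are from the ARV example above.

arv_coverage = {2000: 0.0, 2005: 5.0}  # % of known HIV-infected persons on ARVs

def change_over_time(series: dict, start: int, end: int) -> float:
    """Relative change in an indicator between two reporting periods."""
    return series[end] - series[start]

print(change_over_time(arv_coverage, 2000, 2005))  # 5.0 percentage points
```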

Understanding Indicators : 

The components used to document an indicator:
- Definition
- Rationale
- Numerator
- Denominator
- Measurement
- Strengths and limitations

Definition : 

Number and percentage of CSOs trained in HIV/AIDS prevention among youth aged 15-24 (the prevention approach may vary depending on the CSO, e.g., a church or SLPPA).

Rationale : 

- The NAP is reliant on CSOs to reach subgroup populations
- It is important to assess the CSO resources available to address prevention
- Knowledge and approval of CSOs' prevention activity
- Are CSOs meeting the needs of the populations concerned?

What is measured : 

This indicator quantifies CSOs with human resources that are trained in HIV prevention and/or CSOs that provide prevention services.

Numerator/Denominator : 

Numerator: number of CSO workers trained or retrained in prevention methods
Denominator: total number of CSO workers identified as able to provide prevention methods
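As a minimal sketch (the counts below are hypothetical, not from the slides), the indicator is simply the numerator expressed as a percentage of the denominator:

```python
# Minimal sketch: computing the CSO training indicator.
# Counts are hypothetical placeholders.

trained_workers = 45    # numerator: CSO workers trained or retrained
eligible_workers = 120  # denominator: CSO workers able to provide prevention

coverage_pct = 100 * trained_workers / eligible_workers
print(f"{trained_workers}/{eligible_workers} = {coverage_pct:.1f}% trained")  # 37.5%
```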

Strength : 

This indicator tracks the number of CSOs trained for prevention of HIV infection. It attempts to document increasing capacity to deliver preventive interventions.

Limitation : 

No conclusions should be drawn regarding quality, because quality is affected by the practices employed rather than by the mere existence of trained personnel.

Indicator Examples : 

% of HIV patients on ARVs:
2000 - 0%
2005 - 5%
2008 - 20% = improvement

Reported HIV cases in newborns:
2000 - 2
2005 - 26 = stop and review
2008 - 41 = program failure
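A minimal sketch of how such trend readings might be automated; the rule below is a hypothetical stand-in, since the slides assign these labels by judgment rather than by formula:

```python
# Minimal sketch: flagging an indicator trend.
# Hypothetical rule: rising coverage is good; rising case counts are not.

def flag_trend(series: dict, higher_is_better: bool = True) -> str:
    """Compare the first and last values of a {year: value} series."""
    years = sorted(series)
    delta = series[years[-1]] - series[years[0]]
    improving = delta > 0 if higher_is_better else delta < 0
    return "improvement" if improving else "stop and review"

arv_coverage = {2000: 0, 2005: 5, 2008: 20}    # % on ARVs
newborn_cases = {2000: 2, 2005: 26, 2008: 41}  # reported cases

print(flag_trend(arv_coverage))                           # improvement
print(flag_trend(newborn_cases, higher_is_better=False))  # stop and review
```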

Collecting and Using Data : 

M & E of CSOs : 

- Unique in that M & E is done on an individual basis
- May be ongoing or one-time
- Results are usually reflected in surveys or records

Active M & E : 

- Examine each module to select an indicator (e.g., CSOs, PMTCT, OVC, TC, blood safety)
- Select data elements from the indicator, e.g., for "# of females 15-24 on ARVs": sex, age, treatment
- Design the data capture (see the sketch below)
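A minimal sketch of what a data capture record for that example indicator might look like; the field names and structure are assumptions, not from the slides:

```python
# Minimal sketch: a data capture record for the example indicator
# "# of females 15-24 on ARVs". Field names are hypothetical.

from dataclasses import dataclass

@dataclass
class ClientRecord:
    sex: str      # "F" or "M"
    age: int      # age in years
    on_arv: bool  # currently receiving ARV treatment

def matches_indicator(r: ClientRecord) -> bool:
    """True if the record counts toward the example indicator."""
    return r.sex == "F" and 15 <= r.age <= 24 and r.on_arv

records = [ClientRecord("F", 19, True), ClientRecord("M", 22, True)]
print(sum(matches_indicator(r) for r in records))  # 1
```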

What is good data : 

- Understand the data: make sure that those responsible for collecting information clearly understand what is being asked for.
- Record the data every time: encourage those responsible for collecting information to record it on the appropriate form.

What is good data : 

- Record all the data: make sure all the information requested on the form is completed.
- Record the data in the same way every time: when possible, use the same definitions and the same rules for reporting the same piece of information over time.
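As a minimal sketch (the field names and allowed codes are assumptions), these data-quality rules could be enforced with simple completeness and consistency checks at entry time:

```python
# Minimal sketch: checking the data-quality rules above.
# Field names and allowed codes are hypothetical assumptions.

REQUIRED_FIELDS = ("sex", "age", "on_arv")  # record ALL the data
VALID_SEX_CODES = {"F", "M"}                # record it the SAME WAY every time

def check_record(record: dict) -> list:
    """Return a list of data-quality problems (an empty list means good data)."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS if record.get(f) is None]
    if record.get("sex") not in VALID_SEX_CODES:
        problems.append(f"nonstandard sex code: {record.get('sex')!r}")
    return problems

print(check_record({"sex": "female", "age": 19}))
# ['missing field: on_arv', "nonstandard sex code: 'female'"]
```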

M & E through data collection : 

Mid-course and end-of-program evaluation can be determined by : 

- Reviewing available records and reports
- Conducting supervisory assessments
- Conducting self-assessments
- Conducting peer assessments
- Obtaining client feedback (exit interviews)
- Polling community perceptions
- Benchmarking (comparing the site's services with others)

Your Role in M & E : 

- To ensure that proposal objectives are met
- To seek assistance if necessary
- To be honest about deliverables
- To give feedback/reports
- To allow review of work

My Role in M & E : 

- To review approved proposals
- To define indicator(s)
- To prepare a reporting format
- To provide assistance where necessary
- To review periodically (depending on the proposal)
- To review accomplishments

Thank You : 

For any questions, contact 712-3474.