GUIDELINES FOR FINAL EVALUATION OF TLC DISTRICTS


Evaluation of Total Literacy Campaigns
I. Introduction
The launching of the NLM on 5th May 1988 marked a new beginning of an effort to place “Functional Literacy for All” on the national agenda. Today 590 districts are covered under Total Literacy Campaigns. Of these, 227 districts have moved to continuing education after successfully completing Total Literacy Campaign (TLC) and Post Literacy Programme (PLP).

Until recently all efforts were made in TLC districts to complete the literacy activities and declare the district totally literate. The various evaluation studies conducted by evaluation agencies showed marked differences between the literacy claimed and actual literacy achieved. The main reason for this discrepancy was competition among the districts to declare themselves fully literate. To curb this trend, the first step undertaken by the NLM was to discontinue the practice of declaration of total literacy. The second major step was to subject each district to rigorous evaluation to assess the reality with regard to literacy achievement. For this it was essential to make evaluating agencies understand the objectives behind final evaluation. Therefore, it was planned to conduct Regional Workshops for Eastern, Northern, Western and Southern regions for evaluating agencies and Directors of State/UT Directorate of Adult Education and SRCs with the following objectives:

1. To orient the evaluation agencies on the aims and objectives of the National Literacy Mission and on the current status of Literacy and Post Literacy campaigns;

2. To sensitize the agencies to evaluation procedures and methodologies being adopted as per the recommendations of the expert group;

3. To deliberate on the strengths and weaknesses of ongoing evaluation studies and find ways to remove bottlenecks;

4. To develop a systematic mechanism for bringing uniformity and a scientific approach to the evaluation studies;

II. Evaluation Systems Under National Literacy Mission (NLM)
Self-evaluation of learning outcomes of the enrolled learners has been built into the body of the three primers. Each primer contains three tests and it has been assumed that if a learner attempts these tests he/she will have a fairly reliable idea of his/her learning weaknesses. This self-evaluation would enable the learner to perceive his/her own pace and progress of learning and should heighten his/her motivation.

Besides self-evaluation of learners, every campaign district is subjected to two more evaluations, namely “Concurrent Evaluation”, which is to be carried out by agencies within the State, and “Summative or Final Evaluation”, to be carried out by agencies outside the State. Concurrent Evaluation will focus on various activities in the process of implementation of the programme, such as survey, environment building, training etc., so as to detect bottlenecks, shortfalls and deficiencies and suggest corrective measures to ensure optimum efficiency. Summative Evaluation, which is normally executed at the end of the programme, will mainly focus on learning outcomes, the success rate vis-à-vis the target and the impact of the campaign on the social, cultural and economic environment of the project area. The new approach to evaluation adopted by the NLM is aimed at ensuring complete transparency and thus enhancing the credibility of the results declared.

III. The Purpose And Objectives Of External Evaluation
It is important to understand the broad objectives and purposes of external evaluation so that such evaluation may be conducted not only with a credible methodology but also in the right spirit:
(a) To provide an objective and a reliable assessment of the literacy and social impact of the campaign in the TLC district;
(b) To provide feedback to local organisers about the outcome of the campaign, its strengths and weaknesses, and suggest remedial measures;
(c) To provide academic inputs into the policy and planning of literacy campaigns (in other districts) at the State and Central levels.

IV. Agreed Upon Minimum Evaluation Process
1. FOCUS
The main focus will be on learners’ evaluation. If concurrent evaluation has already been done, the evaluating agency should procure a copy of its report from the ZSS; if it has not been done, the inputs may be studied as objectively as possible. The agency may not have time to do impact evaluation in depth and in detail; however, observations and claims must be reported.

2. PROCEDURE
(a) The Zilla Saksharata Samiti (ZSS) will initiate the evaluation procedure when, in its estimation, about 50% of the targeted learners have completed/almost completed primer-III.

(b) The ZSS will approach the State Directorate of Adult Education (SDAE) to assign to it an agency to carry out the learners’ evaluation. The State Directorate will assess the readiness of the TLC district and approach the National Literacy Mission for a panel of agencies for undertaking the evaluation. The National Literacy Mission will recommend three agencies from its panel, from outside the State, to the State DAE and the ZSS. The ZSS will select one of them and enter into a contract with it. The format of the contract is annexed at Annexure ‘C’. However, in Southern India, because of language variations, NLM may consider agencies within the respective State but outside the district.

(c) If the district has completed concurrent evaluation before the External evaluation, it will make the report available to the evaluating agency.

Preparing the Evaluation Design
A design is a plan for conducting a study. It may be brief and simple; it may be long, detailed, and complex. Lack of a design or a poorly formulated design can lead only to inefficiency and waste.

Minimum essentials for any study design are:
1. A clear understanding (preferably in the form of a written statement) of the problem to be investigated.

2. A clear understanding (preferably in the form of a written statement) of the specific objectives of the study.

3. A clear understanding of the ways in which the data collected will be expected to contribute to the solution of the problem.

4. A carefully worked out plan for collecting data.

5. A carefully worked out plan for handling the data collected.

6. A carefully worked out plan for analyzing the data collected

3. UNIVERSE
A Universe is the total aggregation of people, events, or objects from which a sample is drawn. The universe for the purpose of drawing the sample units for testing the learners will be:
Primer III completers and current learners at Primer III

Necessary data for drawing up the sample as shown in Annexure ‘D’ would be procured from the District in advance along with an outline map of the district showing Block boundaries only, for showing the spread of sample units (villages and urban areas). Please refer Annexure ‘H’.

The Sampling Frame
Before we draw a sampling frame, let us be clear about sampling, the criteria for a good sample, randomness, etc. Sampling is a procedure by which some members of a population - people or things (or events) - are selected as representative of the entire population. The object of this selection is to make some further observations or measurements on each of the individual members so selected and, on the basis of these observations, to draw conclusions regarding the entire population. The sub-group selected to represent the population is known as a sample. A measure computed from a sample is known as a statistic. Corresponding measures for the population (which generally have to be estimated) are parameters. (The mean - the arithmetical average - of a population is a population parameter; the mean of a sample drawn from that population is a sample statistic.) Sample statistics are used to estimate population parameters.
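To make the distinction between a sample statistic and a population parameter concrete, here is a minimal illustrative sketch in Python; the learner scores and sample size are invented for the example and are not prescribed anywhere in these guidelines.

```python
import random

# Hypothetical population: test scores of every learner in a district.
population_scores = [random.randint(0, 100) for _ in range(50000)]

# Population parameter: the true mean score (normally unknown to the evaluator).
population_mean = sum(population_scores) / len(population_scores)

# Sample statistic: the mean of a simple random sample drawn from the population.
sample = random.sample(population_scores, 2500)
sample_mean = sum(sample) / len(sample)

# The sample statistic is used to estimate the population parameter.
print(f"Population mean (parameter): {population_mean:.2f}")
print(f"Sample mean (statistic):     {sample_mean:.2f}")
```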

Criteria for a good sample
1. The sample should yield the highest amount of accuracy possible for its cost. An early question to be answered is: how accurate would we like our estimate to be? (There is no point in saying absolutely accurate, because that is not attainable by any means). In practice, this can be answered by specifying how much sample error we are willing for our sample to have. Less error means more cost; we must also remember that some error will arise from factors other than sampling.

2. The sample should be designed in such a way as will make it possible later to compute the sampling error.

3. The sample design should be practical in order to avoid unnecessary procedures and problems.

4. The sample should be representative. That is, it should include about the same distribution of variables of interest to the study, as does the population from which it is drawn, so that the sample statistics yield values approximating those of the population parameter. In order to assure this:

5. The sample must be random.

4. SAMPLING
(a) Village/ward will be the last unit of sampling. Village means the ‘Panchayat Village’ and not the “Revenue Village”.

(b) Stratification will be necessary if there are pockets, having predominant (more than 50%) SC/ST/minority learners.

(c) The sample size would be 5% of the universe subject to a maximum of ten thousand learners. A higher sample should be drawn to take care of sample loss.

(d) At least one or two Contingent villages in each Block should be selected randomly.
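As an illustration of paragraph (c) above, the following Python sketch shows one way the sample size rule (5% of the universe, subject to a maximum of ten thousand learners, with a cushion for sample loss) and the random drawing of whole villages might be worked out. The village data, the 10% loss margin and the helper names are assumptions for the example, not part of the guidelines.

```python
import random

def target_sample_size(universe, fraction=0.05, cap=10000, loss_margin=0.10):
    """Sample size as per para 4(c): 5% of the universe, subject to a maximum of
    10,000 learners, plus a cushion (10% assumed here) for expected sample loss."""
    return round(min(universe * fraction, cap) * (1 + loss_margin))

# Hypothetical universe: Primer-III completers / current Primer-III learners per village.
learners_per_village = {f"village_{i}": random.randint(30, 200) for i in range(1, 601)}
universe = sum(learners_per_village.values())
target = target_sample_size(universe)

# The village/ward is the last unit of sampling, so whole villages are drawn at
# random until the cumulative number of learners reaches the target.
selected, covered = [], 0
for village in random.sample(list(learners_per_village), k=len(learners_per_village)):
    if covered >= target:
        break
    selected.append(village)
    covered += learners_per_village[village]

print(f"Universe: {universe} learners, target sample: {target}")
print(f"Villages selected: {len(selected)}, learners covered: {covered}")
```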

5. CONSTRUCTION OF THE TEST PAPER
The test paper will measure all the competencies as given in the Dave Committee report. The Model Test Paper at Annexure ‘E’ could be adapted/adopted.

6. TEST ADMINISTRATION/TEST PAPER
(a) In the selected sample villages, in principle, all the learners (100%) must be tested. However, conceding the possibility of the absence of learners on the day of the evaluation for various reasons (e.g. temporary/permanent out-migration, marriage, sickness, etc.), an attempt must be made to cover at least 70% of the learners. For this purpose, if necessary, villages may be revisited by the evaluation team.

(b) Cause of absenteeism must be ascertained and indicated in the report.

(c) Absentees who fail to take the test due to some valid reason will not be treated as 100% failures or successes. The success rate will be calculated according to the Ghosh Committee recommendations (see T.2 and Annexure ‘B’). The various options as per the Ghosh Committee report are as follows:
(i) Assume the absentees to have ‘failed’ the test; this may be a little harsh;
(ii) Assume the absentees to have ‘passed’; this would be highly optimistic;
(iii) Assume the percentage of success to be the same as for those tested; and
(iv) Assume the percentage of success (among the absentees) to be somewhere between (i) & (ii), and work out the average of the success rates in terms of (i) & (iii) above.

Alternatively, treat willful and non-willful absentees separately. Willful absentees are those learners who were present in the village/area when the evaluation team visited but did not turn up for the test without any valid reason (such as sickness, a death in the family, marriage, etc.). These would be treated as having failed. Non-willful absentees are those who had attended the classes but at the time of evaluation were sick, or had migrated for economic reasons or marriage and gone away to other areas. Such learners would be treated as having passed at the same rate as those who had taken the test.

The status of absentees can easily be ascertained from VTs, local leaders and supervisors. Apply formula (iii) for non-willful absentees and formula (i) for willful absentees.
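A minimal sketch, in Python, of the absentee treatment described above: willful absentees counted under formula (i) as failed, and non-willful absentees counted under formula (iii) at the pass rate of those actually tested. The figures are invented; the exact tabulation prescribed in T.2 and Annexure ‘B’ should be followed in the actual report.

```python
def success_rate(tested, passed, willful_absent, non_willful_absent):
    """Success rate with absentees treated as in the guidelines: willful absentees
    counted as failed (formula i); non-willful absentees assumed to pass at the
    same rate as those actually tested (formula iii)."""
    rate_among_tested = passed / tested
    assumed_passes = non_willful_absent * rate_among_tested
    total = tested + willful_absent + non_willful_absent
    return (passed + assumed_passes) / total

# Hypothetical figures for one sampled village.
print(f"{success_rate(tested=400, passed=300, willful_absent=20, non_willful_absent=80):.1%}")
```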

(d) In the numeracy test, simple problem questions involving only money would be given.

(e) Use of parallel test papers is desirable but optional. In case parallel test papers are constructed, it would have to be ensured that they are of equal difficulty.

7. HOW TO CONTROL PROXY LEARNERS AND UNDUE OUTSIDE HELP
The first rule is: don’t jump to conclusions or go in with an attitude of policing. You could make serious mistakes if you do so. Here are a few examples:

(a) Don’t judge a learner as proxy learner simply because he/she is well dressed. Some of them do come to the testing place well dressed; to them it is an occasion to celebrate.

(b) Don’t come to the conclusion that a learner is a fake learner because he/she does not look the right age, especially if he/she happens to belong to the Non Formal Education group or even the TLC group. Some of them, though of the right age, have stunted growth.

(c) Don’t consider a learner a proxy learner simply because he/she is writing fast in excellent handwriting. Interview the learner. He/she may have had previous schooling or regular attendance, or may be studying at home as well.

(d) A Test Administrator saw a well dressed young woman sitting among the learners, holding their test papers in her hand. He jumped to the conclusion that she was a proxy learner solving the TPs on behalf of genuine learners. On close interview it was discovered that she was a VT and nobody had instructed her not to sit with her learners. The test papers she was holding belonged to genuine learners who had been called away because of some problem at home.

(e) When a Test Administrator asked a learner her name, she answered ‘Chandni’, whereas on her test paper she had written her name as ‘Jamila’. He concluded that she was impersonating. On enquiry she replied innocently, “but this is what they call me at home, Chandni”.

(f) To check the genuineness of learners, one Test Administrator used to ask, “What is the name of the primer?” This was an unfair question. Many people read a book but do not care to remember its name.

(g) Some learners, especially the young ones, equate a primer with a class. So, when pressed, if they reply “class three”, it may mean that they are referring to primer three and not to class three of a primary school.

However proxy learners should be checked and VTs/MTs should be stopped from giving undue help or solving the papers themselves. But this is a ticklish affair. If the VTs are asked to bring with them the attendance registers and a roll call is taken first, it takes a long time. Moreover the VTs and learners both feel humiliated in public if impersonation is discovered and the learner is asked to leave. Some proxy learners write down the name and father’s name of the genuine learner on the palm of their hand or on a chit of paper. They generally remember the name of the learner whom they are impersonating but forget the name of the father. Similarly if the VTs are asked to stop giving undue help, some of them retort, “what is the harm in helping my learners a little, when copying goes on everywhere.” Therefore such situations have to be controlled with humour, tact and patience.

The following approach has been found useful in this respect:-
(a) Ask the learners of each VT to sit in a row, with the VT standing in front of it. Approach the suspect learner casually and ask his/her name and father’s name. Then go to the VT and ask him to name the learner and the learner’s father; usually he won’t be able to do so if the learner is not genuine. In that case you can make an agreed-upon mark on the learner’s TP.

(b) Move among the learners while they are solving the test paper. If you observe a learner writing very fast or with ‘pucca’ handwriting, interview him/her closely. If he/she is a proxy learner you will discover it easily.

(c) There can be a large number of VTs and MTs moving among the learners and insisting on helping them. You can take them, along with non-genuine learners, to a different place and discuss with them the post-literacy programme, the reasons for low enrolment or low turnout, or any other relevant matter.

(d) Ask for the statistics register, kept with the full-time NP level worker. Check the names, especially of young learners, when the testing is over. If you don’t find their names in the register, check with the VT and treat them as proxy learners in the absence of a convincing explanation.

However there is no foolproof recipe to control such situations.

8. ESSENTIAL TABLES
The report shall contain the following tables:
-- Villages/wards selected in the sample with target and current learners (Table-1)
-- Success rate of the district, including tested and absentee learners, supported with a calculation table as at Annexure ‘B’ (Table-2)
-- Percentage achievements as per NLM norms (Table-2A)
-- Standard error showing the results of the 2 sub-samples (Table-3); a sketch of one way to compute this follows the list
-- Status of primers completed (Table-4)
-- Achievement by primers completed (Table-5)
-- Achievements by caste, age and sex (Table-6)
-- Distribution of sample and total current learners according to marks obtained (Table-7)
-- District literacy scenario (Table-8)
-- Percentage and average marks obtained in Reading, Writing and Arithmetic (Table-9)
-- Comparison of success rate between male and female learners (Table-10)
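For Table-3, one common way to report a standard error from two sub-samples is sketched below in Python. The split-half figures are invented and the simple-random-sampling formula is an assumption; the computation actually prescribed in the annexures (not reproduced here) should take precedence.

```python
import math

def se_of_proportion(p, n):
    """Standard error of an estimated success rate p based on n tested learners
    (simple-random-sampling approximation)."""
    return math.sqrt(p * (1 - p) / n)

# Hypothetical results of the two sub-samples: (success rate, learners tested).
sub_samples = [(0.71, 4800), (0.69, 4700)]

for i, (p, n) in enumerate(sub_samples, start=1):
    print(f"Sub-sample {i}: success rate {p:.1%}, standard error {se_of_proportion(p, n):.3%}")

# Difference between the two sub-sample estimates, as a rough consistency check.
(p1, n1), (p2, n2) = sub_samples
se_diff = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
print(f"Difference between sub-samples: {abs(p1 - p2):.1%} (s.e. {se_diff:.3%})")
```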

9. SELECTION AND TRAINING OF TEST ADMINISTRATORS
-- The TAs should have an unbiased yet sympathetic attitude. They should be experienced and reliable.
-- They should fully understand that the purpose of testing is to find out what the learners know and not what they do not know.
-- The TAs should be well trained and guided to understand the problem of proxy learners, the reasons thereof and approaches to detect them.
-- A guideline for TAs is given at Annexure ‘F’.
-- A marking code should be developed involving the TAs.

10. PARTICIPATORY APPROACH
It is highly desirable that the ZSS functionaries participate in the evaluation process. The following approach will be adopted:

-- The ZSS shall handle all boarding/lodging arrangements and the scheduling of village visits in consultation with the agency.
-- The evaluation procedure shall be fully explained to the ZSS.
-- The Secretary ZSS will draw the sample according to the given methodology.
-- The ZSS may check the marked papers if it so desires.
-- The ZSS will not be involved in actual test administration (except in detecting proxy learners) or in the marking of TPs.

11. PRESENTATION OF THE REPORT
-- It should clearly show the achievement of the district, both on the basis of sample and target learners. Achievement of target learners may be calculated as shown in table-2 and Annexure ‘B’

-- It should be short and to the point. Unnecessary details such as geography of the district, income, caste of VTs and learners etc. are to be strictly avoided. Administrators and planners should be able to read the report quickly so that they may respond to the findings.

-- The first page of the report should contain highlights. It should specifically mention the percentage achievement against target learners. After this, the background data should be provided.

CONCURRENT EVALUATION OF TLC DISTRICTS

1. INTRODUCTION

The National Literacy Mission was set up in 1988 with the objective of making 80 million persons in the 15-35 age-group functionally literate by the year 1995. Subsequently, changes were made in the target, which now stands at 100 million people to be made literate in the same age group by the year 1999, and full literacy to be achieved by 2005. So far NLM has sanctioned 419 literacy projects which are being implemented in 427 districts. Out of these, 187 districts have moved to the post literacy phase after successfully completing the literacy phase.

2. EVALUATION SYSTEMS UNDER NLM

Self evaluation of learning outcomes of the enrolled learners has been built into the body of the three primers. Each primer contains three tests and it has been assumed that if a learner attempts these tests he will have a fairly reliable idea of his learning weaknesses. This self-evaluation would enable the learner to perceive his own pace and progress of learning and should heighten his motivation.

Besides self evaluation of learners, every campaign district is subjected to two more evaluations, namely 'Concurrent Evaluation', which is to be carried out by agencies within the State, and 'Summative Evaluation', to be carried out by agencies outside the State. Concurrent Evaluation will focus on various activities in the process of implementation of the programme, such as survey, environment building, training etc., so as to detect bottlenecks, shortfalls and deficiencies and suggest corrective measures to ensure optimum efficiency. Summative Evaluation, which is normally executed at the end of the programme, will mainly focus on learning outcomes, the success rate vis-à-vis the target and the impact of the campaign on the social, cultural and economic environment of the project area. The new approach to evaluation adopted by the NLM is aimed at ensuring complete transparency and thus enhancing the credibility of the results declared.

3. WHAT IS CONCURRENT EVALUATION?

Concurrent Evaluation, also called 'formative' or 'process' evaluation, is the evaluation of all the activities undertaken to achieve programme objectives. The information generated through it can be used for improving the health of the programme by focusing on mid-course correctives. For reasons of convenience it has been decided that Concurrent Evaluation is to be undertaken at two stages during the course of implementation of the programme. The overall purpose is to conduct a broad SWOT (strengths, weaknesses, opportunities and threats) analysis of the programme so that mid-course correction is initiated at appropriate points of time.

4. PURPOSE, NEED AND OBJECTIVES

PURPOSE: The purpose and spirit behind Concurrent Evaluation is quite different from that of Summative Evaluation. In the latter we declare the final result, the outcome, the level of goal attainment on the basis of objectives, as accurately as possible. In a sense, we pass judgment. In Concurrent Evaluation, however, there is no question of passing judgment, as the basic purpose is to study the bottlenecks, difficulties, problems and obstacles and discuss them with the ZSS so as to enable them to improve the programme. The role of the evaluator is to help and guide. He should consider himself part and parcel of the programme, with the only difference that he is not there to 'cover up' but to 'unearth', and to help the ZSS and NLM to weed out obstacles and facilitate the healthy growth of the programme. It is a qualitative assessment of the activities, supported of course by data wherever necessary. Therefore, it will have to be carried out by knowledgeable and qualified personnel of evaluation agencies.

NEED: Every campaign district is supposed to design and implement an effective Management Information System (MIS) which can generate useful information to facilitate effective decision making. But, in many districts, although a good MIS has been designed, monitoring has remained the weakest link. Figures reported by many campaign districts indicate that the MIS has mainly served the purpose of data collection and recording. In most cases it seems that the data reported are not authentic and are sometimes inconsistent. It also appears that they have lost their educative purpose, for rarely have they been used to generate correctives and improve the programme. Some districts do not attach importance to sending even this type of routine report regularly. It is, therefore, clear that the health of the programme may not be properly judged, weak points detected and improvements effected through MIS reports alone, which at best give only quantitative data, and that too sometimes unreliable.

The success or failure of literacy programmes may be attributed to factors such as motivation and supervision, environment building efforts, training of functionaries, the conduct of teaching/learning activities, etc. The only way to ascertain the effectiveness of the activities or inputs essential for the attainment of the stated goal is to evaluate them during the process of implementation itself, so that appropriate remedial measures can be taken at the right time. Thus concurrent evaluation of activities becomes unavoidable and crucial to goal attainment.

OBJECTIVES: Thus the specific objectives of concurrent evaluation are:

  1. To examine the operational strategies and implementation processes in the context of approved plan of action and having regard to district - specific factors.
  2. To identify the strengths and weaknesses of the project.
  3. To identify the factors responsible for such strengths and weaknesses.
  4. To suggest corrective and remedial measures.


5. WHO SHOULD CONDUCT CONCURRENT EVALUATION & WHEN - TIMING OF VISITS BY AGENCY

WHO:

- Concurrent Evaluation will be done by qualified agencies located within the State.
- All State Directorates of Adult Education will prepare a panel of agencies having good infrastructure such as computer facilities , experienced faculty members etc.
- These agencies are to be properly oriented by the State Directorates before assigning to them the task of Concurrent Evaluation. A list of these agencies must also be sent to the Directorate of Adult Education, Delhi and to the Director General, NLM.
- The Chairman, ZSS should approach the State Directorate of Adult Education as soon as the district becomes eligible for concurrent evaluation for nominating a panel of three evaluation agencies.
- The State Directorate will forward a panel of three agencies to the district. One of these agencies will be selected by the Chairman ZSS on the criteria of suitability and response. No tenders need be called . Financial criteria to be adopted are set out in section 9.
- A Contract/Agreement (as provided at Annexure 'D') must be entered into between the ZSS and the chosen evaluating agency.

Concurrent Evaluation should be conducted at two stages during the course of implementation. These are:

I Stage: When 50% or more of the enrolled learners have completed Primer I.
II Stage: The second stage must be completed within three months after the I stage evaluation.
TIMING OF VISITS BY AGENCY :

FIRST STAGE:

The agency may plan its visit to the district as soon as the letter from ZSS is received by them. They may complete the process of first stage concurrent evaluation in one or maximum two visits to the district depending on the convenience, distance, time etc.

The approximate time required for first stage concurrent evaluation is 24 days. However the brief report on first stage evaluation should be submitted to the district collector within twenty five days from the date of signing the contract (i.e. if the contract is signed on 1st January the report should be submitted by 25th January).
SECOND STAGE:
The agency may visit the district for conducting the second stage evaluation after forty days from the date of conclusion of the last visit. The agency may undertake two visits to the district during the second stage evaluation. During the first visit the agency may hold preliminary meetings with the district magistrate and other key functionaries; the basic necessary data regarding learners may be collected, the sampling procedures and sample size may be finalized, and the actual date of test administration in the field may be decided.
During the second visit the agency may complete the field visits, test administration and the process of data collection.

The approximate time required for second stage evaluation has also been worked out: 48 days. The agency must follow the suggested schedule given at item 9.

The final comprehensive and consolidated report should be submitted to the district collector within a period of four months from the date of signing the contract. (i.e. if the date of contract is 1st January the final report should be submitted by 30th April).
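The deadlines in the two worked examples above can be computed mechanically. The Python sketch below reproduces them; the inclusive counting of the contract date and the end-of-fourth-month reading of "four months" are inferred from the examples (1st January giving 25th January and 30th April), not stated rules.

```python
from datetime import date, timedelta
import calendar

def stage_one_deadline(contract: date) -> date:
    """Brief first-stage report: within twenty-five days of signing the contract,
    counting the contract date itself as day one (1 Jan -> 25 Jan)."""
    return contract + timedelta(days=24)

def final_report_deadline(contract: date) -> date:
    """Final consolidated report: within four months of signing, read here as the
    end of the fourth month counted from the contract month (1 Jan -> 30 Apr)."""
    month = (contract.month - 1 + 3) % 12 + 1
    year = contract.year + (contract.month - 1 + 3) // 12
    return date(year, month, calendar.monthrange(year, month)[1])

print(stage_one_deadline(date(2024, 1, 1)))     # 2024-01-25
print(final_report_deadline(date(2024, 1, 1)))  # 2024-04-30
```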

6. BACKGROUND DATA TO BE PROVIDED BY THE DISTRICT TO THE AGENCY BEFORE CONCURRENT EVALUATION


After selecting an agency for concurrent evaluation, the district must provide necessary background data (See Annexure-A) to the agency. This data must be supplied to the agency within a week's time from the date of signing the contract. (See Annexure-D) It is essential to adhere to the time frame. If collection of certain items is found to be time consuming, these items may be submitted to the agency during their visit to the district.

7. ACTIVITIES AND ASPECTS TO BE EVALUATED DURING FIRST STAGE:

All the following activities should be evaluated, as all of them are important milestones in the attainment of campaign goals. There are several aspects of each activity which could be studied. Only the important ones are listed here.

It should be noted that there is no need to administer a formal test paper to evaluate literacy skills acquired by learners at this stage. However, the activities and aspects to be evaluated are given below:





ACTIVITIES / PURPOSE / ASPECTS TO BE EVALUATED
1. Organisation & Management structure of ZSS and people committees at various levels. - to execute a time bound and result oriented campaign
- to ensure involvement of community
a) Composition of General body, Executive Committee core group and other sub-Committees.
b) frequency of meetings of ZSS Executive,
attendance, specially that of non-officials. Their role in decision making.
c) decision making process and steps taken to solve problems .
d) application of financial rules & regulations. Do they hinder or help the pace of the campaign.

e) role of Village Education Committees and people's participatory committees at Block and village levels, especially in E.B., training and monitoring. Do the members visit the field and hold dialogue with functionaries and learners? Practical steps taken to remove observed obstacles. Level of involvement of Panchayats.

a) ZSS Secretary - to see his capabilities and his commitment to the programme
a) Basis for selection, educational qualifications, experience, financial and administrative powers, strength of second-level leadership, extent of delegation of powers.
2. Environment Building (E.B.) (** please refer to note below) - involvement of public / community
- spread of campaign
- information, motivation of learners



a) Organisation of different E.B. activities, specially Kala Jatha, Nukkad Natak, etc. and use of electronic & print media
b) frequency

c) which of them were people's activities with public contribution and which were paid shows.

d) usefulness of different activities in spreading the message, moulding people's opinion and enlisting their participation; whether the message was understood and noticed by the intended audience.
e) E.B.'s role in motivating learners.
f) effectiveness of E.B. items in enlisting people's participation.

** Using posters, stickers and hoardings, showing cinema slides, holding meetings and conventions, wall writing, enacting plays, and taking out processions and 'Kala Jathas' are the usual E.B. techniques. Among these, Kala Jathas, if organised properly, have been found the most effective in involving the public in the campaign. However, they are seldom organised properly. The manner in which they should be organised to yield the desired result is described below:

A workshop of writers, poets and playwrights should be held at the district level to develop scripts, songs and catchy slogans. This is a technique to involve the intelligentsia of the district in the campaign.

A 'paidal yatra' should be taken out consisting, if possible, of educationists, scientists, district level activists and the general public. The route of the yatra should be chalked out in advance and villages on the route should be informed of the time and date the yatra will be passing through or halting overnight.

Villages should be encouraged to receive the yatra by erecting gates, garlanding the yatris, etc. Again, it is not that the yatris are manipulating their own welcome; this is but a technique of spreading the message to every home and involving the village in the campaign.

While in the village, a meeting should be held propagating the message, listening to people's problems, explaining some scientific truth or certain health hazards, etc. A 'workshop' of local talent could be organised to develop slogans and songs and to prepare drama scripts. An organised group of youth could be formed under local leadership to carry on E.B. activities, including wall writing, door-to-door canvassing and, if necessary, forming mohalla-wise VT groups, and so on.

The above activities could be carried out in greater depth in the villages where the yatra halts overnight. Thus the caravan marches on.

3. Survey - to determine the exact number of non-literates.
- to serve an educational purpose as well i.e. to motivate non-literates through persuasion/small group discussions.
a) whether done door to door or number obtained from records/ledgers.
b) it should be done at one go to create and spread information about the campaign. How was it done, that is, in a prolonged manner or at one go?
c) checking the authenticity of the data collected and areas left out, if any. Check a few survey sheets.
d) did it serve the educational purpose?
e) Quality of matching & batching.
4. Monitoring (MIS) and supervision - obtaining feedback & taking corrective measures

a) Monitoring structure, monitoring tools, frequency, only data collection or examination of data and feedback at VEC, Block, ZSS and State level. Built in Checks. Innovative methods if any .

a) Visits by district level officials - solution of people's and learners' problems
b) Frequency of field visits by DM, ADMs, BDOs etc.; reporting system; action taken

*** The VTs desire that their efforts are taken notice of and appreciated by district level officials. Also, people have several legitimate grievances. Timely redressal of these grievances would be helpful for community participation and learner motivation. This would be possible if district level officials visit the field frequently. The nature of the visit must be to discuss and help, rather than merely routine.






5. Selection of committed functionaries
- KRPs, MTs - to evolve a group of effective trainers: a) method of selection, e.g. preferring those who took interest in E.B. activities; through persuasion or through orders.
- Full Timers - to build a team of committed supervisors: b) method of selection, ratio of government and non-government officials, adequacy, commitment, training, reporting system.
- VTs - willing VTs: VTs' profile.
6. Training / orientation of functionaries.
- ZSS Committee members
- ADMs and BDOs involved
- Contact persons
- KRPs, MTs, VTs
- effective guidance of the campaign
- effective teaching.
a) understanding of the campaign strategy-time bound, area specific, cost effective, societal mission.
b) attitude towards the strategy.

c) time allotted in VTs training curriculum to the development of crucial skill of how to teach the lessons, how to develop comprehension, how to involve the learners in teaching / learning; effective hours of training of each type of functionary; replacement of volunteers and the training/ reading materials provided , no of trainees and trainers in each batch; ability to persuade learners.

d) method of training by SRCs, KRPs and MTs , whether problem oriented or based on documents/guidelines, participatory (different from just asking occasional questions as a breather) or mainly one sided straight lecture . Audio-visual materials used.
e) recall of main training inputs e.g. teaching steps.

7. Development of teaching/learning materials & VT's guide - relevant teaching/learning materials are very essential
a) District-specific or SRC primers.
b) Printed by District or bought.
c) Languages of Primer, (How many)
d) Approved by IPCL Committee or not.
e) Whether all primers printed together or one by one . The latter to be ensured.
f) primers & VT's guide printed and supplied in time or delayed; reasons for delay; excess or shortage of primers.
g) Relevancy of primers to the district.
8. Teaching/learning phase - This is the reason for the organisation of all other activities. The attainment of the main stated goal, of making a large number of non-literate adults literate, will depend upon it.
a) No. of present non-literates in the villages/wards as against the survey figure, and No. participating.
b) No. of centres functioning regularly.
c) average attendance.
d) No. of learners on different primers.
e) expected level of literacy skills according to the primer being studied/completed.
f) lighting arrangements.
g) supply status of materials.
h) VTs method of teaching
i) suggestions by leaders/ VEC for improving the programme.
j) reasons of non-literates not participating.
- means of persuading non-participants to join and reducing irregularity.
k) supplies of materials actually received by learners/centres and assessment of additional supplies.
9. Support of VT - the frontline soldier - enhancing his teaching ability.
- encouragement.
preventing drop out

a) guidance received from supervisors in teaching.
b) assistance received from VEC/village leaders/supervisors in persuading learners to attend centres.
c) Solution of his personal problems.
d) incentive by community / administration.
10. Feedback from Media - support from the media is very essential
a) Media reports on the campaign in local and popular dailies,
- positive or negative.
b) Criticism if any on misappropriation of funds or any other issue.
11. Finance
- To see whether expenditure is made as planned.

a) Budget approved and sanctioned by NLM and State Government.
b) Delay in sanction by NLM/State , if any
c) Procedures followed for depositing and spending; financial powers given to whom, etc.
d) Arrangements for timely audit.
12. Involvement of SDAE, SRC and others - what type of support was received from these institutions; did it make any difference?
a) Role of State Directorate of Adult Education, Visits by officials, frequency, tour reports, etc, Who attends State level monthly meetings from the district.
b) Role of SRC, limited only to training or any extra inputs given?
c) Involvement of Primary School Teachers.
d) Involvement of NGOs.
13. Others - to demonstrate innovative methods/ideas, if any.
a) Documentation efforts, special features, breakthroughs, breakdowns, other highlights.

A. Methodology
1. By the very nature of the activities and their purpose, it will be a mixed process using sampling and non-sampling techniques. Aspects and responses to be checked in the 'field' will be checked in randomly selected villages/wards. The focus in the first stage of Concurrent Evaluation will be mainly on the 'processes' or the 'activities'.

2. It should be participatory, with participation by the functionaries concerned with each activity. This is an excellent educational technique, leading to learning under guided observation. Non-participation, on the other hand, may lead to a defensive attitude.
3. The findings should be discussed with the ZSS and other relevant functionaries.

B. The Evaluation Process
A suggested process of evaluating different activities is given below:


ACTIVITY / EVALUATION PROCESS
1. Organisation and Management Structure of ZSS and people's committees at various levels.
a) ZSS Secretary
Aspects (a), (b), (c), (d) & (e) - Discussion with official and non-official members. Discussion with non-official members to be held separately. Checking the responses to (b) and (c) with those concerned and in the field, if necessary.

Aspects (a) Discussion with ZSS secretary, DM, District core group and other functionaries.
2. Environment Building (E.B.) Aspects (a), (b)- Discussion with ZSS
Aspects (c)- Discussion with ZSS and with some participants. Examination of budgeted expenditure.
Aspect (d)- interviewing a few general members of the public and learners exposed to E.B. items, to assess the views and opinion about campaign and readiness to participate.
Aspect (e) - Interviewing learners in selected villages/wards.
Aspect (f) - Interviewing some members of the public and studying cost vis-à-vis expenditure
3. Survey - Aspects (a), (b), (c) - Discussion with a number of surveyors; checking in the field with village leaders; discussion with ZSS; checking a few survey sheets.
Aspects (d) (e) Discussion with learners and VTs in the field.
4. Monitoring (MIS) and supervision - Aspects (a), (b) and (c) - Discussion with the persons concerned. Checking the responses to (a) and (b) in the field; checking MIS forms and entries in the visitors' book.
Aspect (c) - Discussion with VTs, learners and contact persons in the randomly selected villages/wards.
5. Selection of functionaries
a) full timers
b) VTs
Aspects (a), (b), (c) - Interviewing a few KRPs , MTs, concerned ZSS Committee members, Interviewing full timers responsible for selected villages /wards. Interviewing VTs in selected villages.

6. Training / orientation of functionaries - ZSS Committee members, ADMs BDOs, Contact Persons, KRPs, MTs, VTs
Aspect (a), (b), - Discussion with Committee members and district/block level functionaries.
Aspect (c)- Procurement and study of syllabus from SRCs, KRPs and MTs. Discussion with them. Participation in training if going on.
7. Development of Teaching / learning materials & VTs Guide Aspect (e)- Interviewing VTs in selected villages.
Aspects (a), (b), (c), (d) and (e) - discussion with committee members and others connected with the primers.

Aspect (f) - scrutiny of primers and discussion with some knowledgeable people and some good VTs.
8. Teaching/learning phase - Aspects (a), (b) - From the records of contact persons. Present numbers to be estimated by interviewing VTs and villagers. Checking of the reported number of functioning centres.
Aspect (c)- From attendance register if maintained otherwise from VTs and learners.
Aspect (d)- from VT.
Aspect (e)- The printed tests in Primers may be administered . Suitable scores will have to be allotted to different questions . Cut off points same as recommended by Dave Committee. The evaluators to take some fresh primers with them in case some tests are already attempted in the primers available with learners.
Aspect (f), (g)- Discussion with VT and learners.
Aspect (h)- Observation
Aspect (i) - Discussion
Aspect (k) - Checking with VTs and learners.
9. Support of VTs - Aspects (a), (b), (c) - Discussion with VTs and campaign functionaries.
10. Feedback from Media - Aspects (a), (b) - Discussion with committee members; meet some local journalists; refer to the newspaper clippings documented.
11. Finance - Aspects (a), (b), (c) - Discussion with finance committee members; collection of financial data; discussion with the treasurer.
12. Involvement of SDAE, SRC and others - Aspects (a), (b), (c) - Discussion with programme managers, SDAE and SRC officials, primary school teachers and NGOs.
13. Other aspects - Aspect (a) - Refer to documentation and discuss with the DM and other functionaries in the field.


8. ACTIVITIES AND ASPECTS TO BE EVALUATED DURING SECOND STAGE

During the second stage of Concurrent Evaluation only the most crucial activities, i.e. the status of teaching/learning and the support of VTs, may be evaluated, along with the mid-course corrective actions taken and their outcome.

  • Which of the teaching/learning and VT-support aspects should be evaluated, and how, has been described in detail under the sub-heading 'Activities and aspects to be evaluated during first stage' (Activity Nos. 8 and 9).

  • In addition, the corrective measures taken according to the recommendations of the First Concurrent evaluation should also be studied.

  • A group of learners must be selected through random sampling for administration of the test (see Annexure B). The sample size should, as a thumb rule, be 2.5% of the enrolled learners. However, the actual sample size should not be less than 2,500 or more than 3,000 (a sketch of this rule follows).
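A minimal Python sketch of the sample-size rule in the last bullet above: 2.5% of the enrolled learners, clamped between 2,500 and 3,000. The enrolment figures are invented for illustration.

```python
def second_stage_sample_size(enrolled_learners: int) -> int:
    """2.5% of the enrolled learners, but never fewer than 2,500
    nor more than 3,000 learners."""
    return min(max(round(enrolled_learners * 0.025), 2500), 3000)

# Hypothetical enrolment figures.
for enrolled in (80000, 110000, 150000):
    print(enrolled, "->", second_stage_sample_size(enrolled))
```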



A. Methodology

The methodology of the second stage concurrent evaluation will be somewhat different from that of the first, which covered all activities. The main source of information on the status of teaching/learning and VT-support activities will be in the field, i.e. in the villages/wards where the action is taking place. Since it would take considerable time to study all the villages and learners, there is no way to do a quick evaluation other than to follow the technique of random sampling.

B. The Evaluation Process

  1. Discussion with ZSS to understand the total teaching/learning situation and to get the views of ZSS regarding their areas of interest. Enlisting the participation of district and Block level officers responsible for each of the Block in the sample.

  2. Issue of instructions to village/ward in-charges to keep all necessary records ready and to ask learners in sample village/wards to come to the place of study with their Primers. The evaluation team to carry some extra primers with them or ensure their supply locally.

  3. Issue of instructions regarding the purpose of Concurrent Evaluation: It has been experienced in some districts that once the learners have been tested they get the impression that the examination is over, and they stop coming to the centres. Therefore, a clear and strong message should go to the field that the purpose is not to 'test' learners but to study problems. This message may also make the practice of presenting non-genuine learners unnecessary.

  4. Test paper to be developed and printed in advance on the lines suggested in Model T6. (see Annex.B)

  5. Marking Code in consultation with TA, for each item of the test should be worked out. (see Annex. C)

  6. Thorough training of TA in interviewing and data collection.

  7. Evaluating other aspects of teaching/learning first and testing the learners later to avoid examination atmosphere.


C. Sampling
Sampling Frame
Collection of the following data:
  1. Total No. of Blocks
  2. Total No. of villages (including hamlets) in each Block and total No. of wards in urban areas.
  3. Block-wise total No. of:
    • target learners in each village/ward
    • sex-wise learners continuing at the time of evaluation in each village/ward, studying Primer I and Primer II

  4. Villages/wards having predominantly SC/ST/minority learners.


D. Selection of Blocks and villages

It has been recommended that a 2.5% sample of enrolled learners who have completed PII will be adequate for our purpose. Basing our calculation on the returns showing the number of PII learners in different districts, it appears that we would get 2.5% of PII learners from just about 2 villages in a Block. However, the study of only 2 or 3 villages in a single Block may not give a total picture of the difficulties and problems facing the district in respect of teaching/learning. It is therefore recommended that two villages be selected from each Block, strictly through the random selection technique. Thus if there are 10 Blocks in a district, the study area will comprise 20 villages. Proportionate representation should be given to predominantly SC/ST and minority villages.
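One possible way to implement the block-wise random selection described above (two villages per block, with proportionate representation of predominantly SC/ST/minority villages) is sketched below in Python. The block and village lists are invented, and the proportional-allocation rule is an assumption about how "proportionate representation" might be operationalised.

```python
import random

def select_villages(blocks, per_block=2):
    """Randomly pick `per_block` villages from each block, drawing predominantly
    SC/ST/minority villages in rough proportion to their share of the block."""
    selection = {}
    for block, villages in blocks.items():
        special = [name for name, flag in villages if flag]
        general = [name for name, flag in villages if not flag]
        n_special = min(len(special), per_block,
                        round(per_block * len(special) / len(villages)))
        picks = random.sample(special, n_special)
        picks += random.sample(general, per_block - n_special)
        selection[block] = picks
    return selection

# Hypothetical block/village lists; True marks a predominantly SC/ST/minority village.
blocks = {
    "Block-1": [("Rampur", False), ("Dhanora", True), ("Khairi", False), ("Basoda", False)],
    "Block-2": [("Pipalgaon", True), ("Sonpur", True), ("Amgaon", False), ("Kelwara", False)],
}
print(select_villages(blocks))
```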

  1. The universe - The universe on which the size of the sample will be based will be the No. of learners who have completed or almost completed PII. The size of the sample should not be less than 2.5% of the universe.
  2. Effort should be made to test as many PII-completed learners in the sample villages as possible. If a few PI-completed learners and those on PIII also appear for the test, they may also be tested.
  3. The testing tool - For PII-completed learners it will be T6, to be developed by the evaluating agency.
  4. The marking system will be the same as recommended by the Dave Committee and as given in the Model T6.
9. TIME AND COST

Time and cost are calculated on the following assumptions and data.

I. First Stage Concurrent Evaluation.

A. Data
1. No. of Blocks: all (maximum 10 blocks)
2. No. of villages/wards: 2 per block
3. No. of well trained investigators: 5
4. Chief evaluators: 2
5. Approximate No. of VTs, MTs and village leaders to be interviewed: 100

B. Time

1. Preparation and printing of forms and schedules: 7 days
2. Interviewing District and Block level Committee members and other high level functionaries by chief evaluators: 3 days
3. Interview and testing at field level: 7 days
4. Submission of brief report: 7 days
-------
Total: 24 days
II. Second Stage Concurrent Evaluation

A. Data


1. No. of Blocks: all (maximum 10 blocks)
2. No. of villages/wards: 3 per block
3. No. of well trained investigators: 12
4. Chief evaluators: 2
5. Approximate No. of learners to be tested: 2.5%, or a minimum of 2,500 learners
6. Approximate No. of VTs, MTs and village leaders to be interviewed: 200
B. Time

1. Preparation, selection, training and printing of test papers, forms and schedules: 14 days
2. Interviewing District and Block level committee members and other functionaries: 6 days
3. Interview and testing at field level: 10 days
4. Marking of 2,000 test papers: 4 days
5. Analysis of data, report writing, typing etc.: 14 days
-------------
Total: 48 days (about 7 weeks)
-------------


COST

The maximum amount envisaged for concurrent evaluation (for both the first and second stages) is Rs. 250 lakh. The broad components of expenditure for both the first and second stages of concurrent evaluation are given below to facilitate the preparation of the budget.

a) First Stage

  • Chief Evaluator's fee (minimum 2)
  • Investigators' fee (minimum 5)
  • Preparation and production of formats and interview schedules
  • Travel cost, including local travel
  • Stationery
  • Miscellaneous


b) Second Stage
  • Chief Evaluators' fee (minimum 2)
  • Investigators' fee (minimum 12)
  • Preparation and printing of test tools and other schedules
  • Travel cost, including local travel
  • Computer fee for analysis of data and secretarial services
  • Stationery
  • Miscellaneous
10. REPORTING

A. As soon as the first stage of concurrent evaluation is completed, the evaluating agency has a fair idea of the weaknesses, problems and bottlenecks hampering the progress of teaching/learning. It should, therefore, hold discussions with the ZSS immediately to brief them about the findings. The objective is to see that the corrective measures emerge out of collective judgment. Besides, the agency must also submit a short, compact report within 7 days of data collection so that the district can initiate action. This report should be diagnostic in nature and to the point. The main focus should be on suggesting concrete and clear steps to be taken by the ZSS to overcome problems, remove bottlenecks and strengthen the campaign. A copy each of this report must also be sent to the State Directorate, the Directorate of Adult Education, New Delhi, and the Director General, National Literacy Mission.

B. At the conclusion of the second stage of concurrent evaluation the agency is expected to submit a more comprehensive and detailed report. The focus should be on the actions taken by the ZSS to improve the health of the programme, the degree of improvement achieved, the future prospects of the programme in the district concerned, and the steps to be taken by the ZSS to achieve the desired objectives. One of the most important aspects to be covered in this report is the analysis of learners' progress and the scores obtained by learners on different items, i.e. reading, writing and numeracy, and the likely achievement rate of the district at the conclusion of the programme. This report must be submitted within 15 days of data collection. Copies of this report must be sent to the State Directorate, the Directorate of Adult Education, New Delhi, and the Director General, National Literacy Mission.

11. SOME COMMON WEAKNESSES

Some of the main weaknesses of campaigns which have been observed are listed below to enable the reports to focus particularly on them.

  1. Irregular meetings of the centres and low attendance. This happens mainly because learners are insufficiently motivated to overcome such hurdles as lack of time, opposition by family members, etc. Very often, too, the entire burden of persuading learners falls upon the VT. He seldom receives any community help or assistance from the Contact Person or district/block level officers.

  2. Lack of facilities, such as non-receipt of books, writing materials and blackboards, causes discouragement. Though the supplies are usually adequate, they are sometimes of poor quality and sometimes delayed or irregular.

  3. Lack of requisite full time structure.
  4. Lack of regular meetings of Executive Committee of ZSS and its sub committees.
  5. Lack of imaginative and sustained environment building.
  6. Lack of retraining and refresher courses for VTs.
  7. Lack of lighting, or poor lighting arrangements, sometimes closes down the centres and seriously slows down the progress of learning. Learners have been found staring at their books, sitting in the open under a 40 candle-power bulb hung about 10 feet high. This should not happen when specific provision has been made for kerosene oil and lanterns.

  8. VTs, migrating from the village or student VTs getting busy in their exams and not being replaced immediately. This weakens the interest of learners.
  9. Essential teaching steps, such as the following, not followed by the VT:
  • reading the lessons himself first with proper pauses and emphasis, asking the learners to follow in their books and then asking them to read as he had read;
  • not asking comprehension questions;
  • not involving them in word or sentence building;
  • making the learner spend too much time in copying instead of helping them to develop the habit and ability of creative writing;
  • skipping the exercises and tests almost altogether;
  • absence of entertainment like singing songs, story and joke telling, narrating significant experiences, bhajans, etc.


Such problems should be studied by the evaluating agencies and possible solutions indicated in the report.