
PME in your pocket


Table of Contents
Introduction
What is this booklet for?
1. PM&E
Definitions - What’s in a name?
What is PM&E?
Why do PM&E?
2. Planning - Focus on goals
Big plans - Mission, vision, strategic plans, project and programme plans
Results chain
Logical Framework
* Indicators
* Assumptions
3. Monitoring and Evaluation
Preparing for PM&E
Do we really need a “system”?
* What exactly do we monitor?
PM&E instruments
Evaluation
Reporting
Innovative methods
4. Learning
What is a learning organisation?
Kolb’s experiential learning cycle
5. What's next?
Have fun!
Read more?

Introduction


What is this booklet for?
This short booklet is designed especially for participants of Simavi’s PM&E Clinic. During the clinic, you (a staff member of
one of Simavi’s partner organisations) have been reintroduced
to basic theories about Planning, Monitoring and Evaluation.
You have also received input from a consultant who has studied
PM&E practices at your organisation and exchanged ideas about
his or her findings, which has led to planned, concrete steps you
will take to improve PM&E practices within your organisation.
This guide is meant to be used as a reference to refresh your
memory on the material used during the clinic and is certainly
not exhaustive. Thick books have been written about the subject
(some of which are referred to in the bibliography); this booklet
is purposely thin, yet gives you a thorough overview of basic
theories and tools – and some space for your own notes. After
the clinic, you will have PM&E in your pocket.
In the first chapter, we will review what PM&E actually is and
why it is useful. In the following two chapters, the focus will be on how you can use PM&E. In chapter 2, we will
review how to plan a project using the logical framework as a basis, and chapter 3 discusses the tools and theories you can use
for monitoring and evaluation. Finally, we will review theories on
learning organisations.

1. PM&E
Definitions – What’s in a name?
Many people think that Planning, Monitoring and Evaluation
(PM&E) is both complicated and boring. Handbooks and manuals about the topic are often tough to read as they contain a lot of
jargon and many abbreviations. You have probably seen various
uses of the abbreviation PM&E, such as: Project Management
and Evaluation, Participatory Monitoring and Evaluation, or Performance Monitoring and Evaluation. In the clinic (and thus in
this booklet), we have tried to simplify the language of PM&E
and focus on what it is really about. So don’t be discouraged!
You too can learn how to demystify the jargon and use it to your
advantage.

PM&E or just M&E?
Often organisations separate the P for planning from
M&E. Try to Google 'M&E' instead of 'PM&E' or 'PME' and you will find hundreds of sites (if not more). The reason for this is that planning and M&E are often seen as two different processes. In the planning phase you are formulating your
goals and designing the programme that you will be monitoring and evaluating later. We, however, chose to keep
the three together in order to stress that in the planning
phase you already need to be thinking of M&E, and while
in the M&E phase, you should review what you initially
planned. Without planning, you cannot monitor and evaluate. This sounds obvious, but in practice, using planning
as a basis for monitoring and evaluation often gets overlooked.

Monitoring and Evaluation
= keeping track of your plans and adjusting along the
way so that you always move towards your goals.

What is PM&E?
We have defined Planning, Monitoring and Evaluation as: knowing what you want to achieve, planning your steps to get there, keeping track of your plans and adjusting along the way, and finally looking back to prove and improve your work. The basic assumption underlying PM&E is that you always know what your ultimate goals are; that is what matters most. If you know what you want to achieve and what your strategies are, you can make your PM&E system (the way you track your progress) as complicated or as simple as you want.

Why do PM&E? For proving and improving
People often debate whether it is really necessary to always work according to a bigger plan. Many organisations employ a responsive work approach, and their staff members feel that PM&E limits their ability to respond quickly. They say they cannot plan their activities in detail in advance, because there are always unexpected occurrences. Surely the context we work in is complex, but especially for organisations that incorporate responsive work, it is beneficial to always keep track of how the work is going, adjust when needed, and check whether the programme is still moving towards its goals.

So, why then do PM&E? First of all, PM&E serves for proving. By
tracking the work we have done and measuring it against the results
we hoped for, we can tell the world around us whether our work
is useful and important. We can prove to the communities we
work with and for that we are acting in their favour. Also, we can
show the people that enable us to perform the work (donors for
instance), that we are using their money in an accountable way
and that they should continue to support us. Through PM&E, we
can demonstrate that progress is being made and that funds are
well spent.

Secondly, PM&E focuses on improving, which mostly means
that we can adjust our plans along the way. We can keep activities on schedule and review our estimations of time and money
spent to explain why things went differently, if they did. By monitoring our work we analyse why certain activities worked out and
why others did not. We can try to understand the effects of our
work and learn from them. Our experiences can then become
lessons learned for the future, so that an organisation’s way of
operating can be improved. We can also make sure that the communities we work with benefit from our work.

Experience is not what happens
to a man. It is what a man does with what happens
to him.
Aldous Leonard Huxley

2. Planning - Focus on goals
Obviously, the process of working on a project or large programme starts with planning. Planning a programme is like planning a trip: first you decide where you want to go; then you think of the route you will take; what means of transportation you will use (car, bus, train); what you will bring with you; and finally you
think about how you will know you are still heading in the right
direction (bring a map, ask people on the way). In other words,
planning means knowing where you are going.
Often, staff members at organisations are so busy working they
do not want to take time to thoroughly plan a project. They do not
see the use, as plans are always adjusted anyway. Yes, plans are
often (always!) changed along the way. The exercise of planning, however, makes us think of the things that may possibly happen.
This way we can anticipate and be ready for the unexpected.
Do not forget that plans should be dynamic, but do not be forced
into unrealistic planning. Demand flexible space from your
funders, show them that you need to be able to respond quickly
to new developments, and prove you can be accountable when
you are responsive. Adapting and changing your plan is always
acceptable, as long as you can explain the reasons for the corrective actions taken.


Big plans - Mission, vision, strategic plans, project and programme plans
For organisations you can say that planning is: knowing what you
want to achieve. This means formulating your dream about what
you want to change into a clear idea, including the long- and short-term goals you aim to reach as an organisation.

Planning = knowing what you want to achieve
Ideally, we write a strategic plan and then draft annual plans
based on our strategies. Most organisations conduct a strategic planning session every three to five years. It’s best to go
somewhere far away from the office with the entire staff and
preferably representatives of the board and other stakeholders
(such as beneficiaries or representatives of organisations you
work together with). All participants should collectively commit
to focus and be active during the meeting. What should be discussed here? Well, first of all, we must look back. It is important
to ask yourselves: What did our previous strategic plan look like,
what did we want to achieve, and did we? Make sure to use
the sources you have to review this: reports from PM&E meetings, evaluations of separate projects and progress reports sent
to donors.
Then, the vision and mission of the organisation need to be reviewed. Don't go and change these every three years! The vision and mission statements form the foundation of your organisation, so try to stay true to the core message. Check whether the
language still appeals to you and see if you collectively think the
statements are clear. If this is not completely the case, you can
of course edit the language a bit. Naturally, the same goes for the
values and principles.
Next, during the strategic planning session, we can map the
problems that worry us and figure out which ones we want to
tackle. This is when you make sure your programme is relevant:
whether you are doing the right things. One tool for this is the
‘Problem tree method’. By looking at cause and effect relationships, we can separate the core problem and its symptoms, and
formulate our solutions for the core problems. We map the different parties involved by performing a stakeholder analysis. This
will enable us to clearly identify which individuals we should pay attention to. Also, we can do a SWOT analysis to map our internal strengths and weaknesses and the external opportunities and threats, to determine what we intend to do. By bringing these elements together, we draft
our strategy. Try to keep the size of your strategy document to a
minimum, otherwise no one will read it. From this strategic plan,
we can then write annual and specific project plans.
Results chain
After you integrate your big dreams about changes you want
to make into concrete plans, you can check if the sequence of
expected changes makes sense. Your activities should lead to
outputs (the results). These outputs should lead to something
bigger (outcomes), which should contribute to your even bigger
goal (impact). These three levels can be seen as results; therefore the sequence of results is called the results chain. An easy
question to ask yourself when checking the logical sequence
is, ‘If….then’. E.g.,if I train teachers, I expect to increase their
knowledge on topic X. If teachers are more aware of X, then…
and so on. Do note that results chains are usually less linear (and
simple) allowing us to come up with a complicated web of results that may or may not lead to another.
A big difference between outputs and outcomes is that we are
directly and fully responsible for the outputs of our activities.
When we perform an activity, we know for sure that the output will be achieved. However, outcomes also depend on other
people´s behaviour. It is about what they do with the outputs that
we have offered them. We can only hope that the participants of
our sanitary trainings will actually use the knowledge. The quality
of trainings may increase the chances that they will, but in the
end it is up to them whether they use the knowledge or not.

[Results chain: activities/resources → outputs → outcomes → impact]

Be aware of words
Keep in mind that different donors often use different wordings to refer to the same things. Don't be scared
off by their jargon. The word ‘result’ is used by some to
refer to results on the output level whereas others mean
outcomes. The higher goals are sometimes named objectives. Project purpose, final goal, overall goal could all
refer to the same results depending on who designed the
project. Try to find out to what level of result they are
referring. What’s important is that you know and understand the logic of your own programme.


A metaphor often used is that of the happy horse (first used by
Dutch consultancy MDF). Imagine walking in the mountains and
bumping into a clearly unhappy horse. Something is wrong with
the horse and after doing some research we find out that the
horse is unhappy because it is thirsty. Our solution is to bring the
horse to the well. This is our activity. The output of our activity is
that the horse has access to water. If the horse wants the water,
it can drink. The outcome we hope for is that the horse will actually drink the water. We cannot guarantee that it will however.
We can only assume that the horse wants to drink water (and
not something else) and that we have provided the right kind
of help to the horse. The most basic chain of results would look
like this:
Goal: The horse is happy
Outcome: The horse drank water and is no longer thirsty
Output: The horse is present at the well and has access to water
Activity: We bring the horse to the well
Input: A person to bring the horse to the well; a well
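If it helps to see the chain spelled out, the happy-horse example can also be written as a short, purely illustrative Python sketch of the 'If… then' reading described above; the wording of each level is taken from the chain above, and nothing about the format is prescribed.

# Purely illustrative: the happy-horse results chain as a list of (level, result) pairs.
results_chain = [
    ("Input", "a person to bring the horse to the well, and a well"),
    ("Activity", "we bring the horse to the well"),
    ("Output", "the horse is present at the well and has access to water"),
    ("Outcome", "the horse drank water and is no longer thirsty"),
    ("Goal", "the horse is happy"),
]

# Read the chain as a series of "if ... then ..." statements to check its logic.
for (level, result), (next_level, next_result) in zip(results_chain, results_chain[1:]):
    print(f"If the {level.lower()} holds ({result}),")
    print(f"  then we expect the {next_level.lower()}: {next_result}.")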

Logical Framework
To organise our thoughts about how we can solve the problems we face, we can choose from a wide range of tools. The logical framework approach, using a logframe matrix, is one of these tools. Despite the fact that this approach is debated (for simplifying reality), it can help structure our ideas about what we want to do. The logframe shows the hierarchy of results we
expect from our programme or project and links our main goals
to the activities we want to perform.
Naturally, your activities will not always produce the expected
results, which in turn are supposed to lead to further expected results. The
example of the happy horse is a very simple one. In the real
world problems are a lot more complex. This is why we can only
use the logframe as a tool for wrapping our thoughts around the
results we aim to achieve and as the basis for drafting our programme. It can serve as a management tool while implementing the programme. When monitoring and evaluating, we should
aim to be more flexible and look at more than just the logframe.
Indicators
For each level of results, we will draft indicators. An indicator is
an instrument that helps us measure change in terms of quantity
or quality.

An indicator = an instrument that specifies what we want to see and feel.

A logframe usually looks like this. The example below comes from a school and village sanitation project; an 'x' marks a value still to be filled in for the specific project.

Overall objective: To contribute to improving the quality of life in 4 villages
- Indicator: At least x% of respondents say their quality of life has improved
- Source of verification: Surveys

Specific objective: Pupils' school attendance and academic performance improved
- Indicator: The school attendance of pupils increased by at least x% and test scores by x% on average
- Source of verification: School records
- Assumption: Children will be more likely to go (and be sent) to school if they have access to clean sanitary facilities

Outcomes
- Improved management of sanitation facilities and services
  Indicator: 4 out of 5 defects are repaired within x days after being reported (depends on the baseline!)
  Source of verification: Monitoring reports / photos
- Improved environmental sanitation at household level, schools and dispensaries
  Indicator: X% less complaints about y
- Assumptions at this level: Those trained will feel more ownership of the facilities; those trained will use their knowledge; knowledge is shared and maintained; the community can see it is a long-term investment

Outputs
- 1,744 pupils in 4 schools have access to improved latrines
- Sixty villagers trained in improved hygiene and sanitation practices
  Indicator: At least 70% of the people trained have higher scores on post- than pre-tests (min. 10%)
  Source of verification: Pre- and post-test scores
- Sixty members of health clubs in 4 schools trained on sanitation matters
  Indicator: At least 70% of the health club members trained say they gained knowledge during the training
  Source of verification: Evaluation forms / oral evaluation session after training
- Eight teachers have gained knowledge while training on sanitation matters
  Indicator: At least five teachers state they feel confident doing training
  Source of verification: Evaluation forms / oral evaluation session after training
- Five dispensary workers and 12 village health committee members trained on sanitation matters
- Assumption at this level: The basic requirements for the sanitation facilities to be used are fulfilled (plumbing, maintenance)

Activities
- Conduct training toward school health clubs
- Train 8 teachers to become trainers of school peer educators
- Conduct sanitation training to dispensary staff
- Upgrading of sanitation facilities in 4 primary schools
- Source of verification: Activity logs
- Assumption at this level: Participants capable of understanding the content of the training and willing to commit to doing the training can be found

During the formulation phase (in which a programme is planned)
we consider good planning indicators (sometimes called objectively verifiable indicators). We formulate indicators for the objective, outcome and output levels. For each of the indicators, we
also define where we will find the information about the indicators, in other words our sources of verification.
Defining indicators is not easy, as we have to think about what
will actually change when we reach our goals. Also, we need to
be able to present evidence and identify sources of verification.
A great indicator is one that is SMART and SPICED, meaning:

SMART
S - Specific
M - Measurable
A - Achievable - or: acceptable, applicable, appropriate, attainable or agreed upon (to stress the importance of common understanding)
R - Relevant - or: reliable, realistic (when achievable/attainable is not used)
T - Timebound

An indicator is specific when it refers to the element discussed.
For instance, if the output is to place 4 toilets, the specific indicator to measure this would be the number of toilets. A good indicator is measurable when two different people would be able to
measure it in the same way. The indicators we choose must be
achievable in the sense that we can easily collect the data with
the resources we have. The indicator must be relevant to our
project. Finally, indicators need to reflect the timeframe in which
we expect the change.

SPICED
S - Subjective
P - Participatory
I - Interpreted and communicable
C - Cross-checked
E - Empowering
D - Diverse and disaggregated

Subjective means that the indicator can measure someone's
personal experience (for instance, whether participants thought
the training was useful). Participatory refers to the fact that indicators should be defined together, preferably with the communities we work with. Indicators that are defined in the field may not
be clear to people who do not know the reality in the field. That
is why the indicators need to be explained, interpreted, and communicable. In order for the indicator to be trustworthy, it needs
to be cross-checked. The indicators should empower communities to reflect on the changes they are going through. Finally,
indicators should be varied and include data from different perspectives and different stakeholders.
Assumptions
In addition to indicators, we include assumptions into our logframe. Assumptions are external factors which are important for
our programme. If they do not materialise, our programme will
fail. An assumption about the horse is that the horse will want
to drink water and not something else. When we formulate assumptions, we should always ask ourselves how likely it is that
the opposite will happen. In the example of the horse: is it likely that the horse will not want to drink water? If so, we should ask ourselves whether or not this problem can be solved. If it cannot, we should redesign our programme to address this issue. So, if we think it is likely the horse will want to drink something other than water, we have to change our project plan (by offering the horse other drinks, for instance). If we are sure the issue can be solved, it is not a good assumption (since it is more a fact than an assumption). If we believe it is likely to be solved, then we include the assumption in our logframe.

3. Monitoring and Evaluation

Preparing for PM&E
Monitoring is often defined as the consistent checking of the
programme’s progress through the continuous and systematic
collection of information. Monitoring is looking into the mirror
you keep in your purse to see if your hairdo is still intact. Monitoring is keeping track of your plans. As said above, monitoring and
evaluation serve two purposes: improving – learning internally from your successes and failures and adjusting the way you work – and proving externally that you are doing a good job. It is not only about doing things right, but also about checking whether you are doing the right things.

Monitoring = keeping track of your plans.
Do we really need a “system”?
PM&E seems to be the solution to all accountability issues that exist in international development these days. NGOs are constantly advised to set up a 'PM&E system'. This system is often seen as something on paper – a table including a workflow, planning matrices, and forms to fill out. But in fact, PM&E is not only about these forms. Yes, you should have all these tools, but you should remember that these are tools, nothing more. PM&E is about people. PM&E is the analysis of all that we plan to do
and have done. This analysis can only be performed by people,
not by systems. PM&E thus means that the people working on a
programme will take the time and energy to discuss their work.
This is why it is important to involve the staff when you're setting up a PM&E system. Even if there is a dedicated PM&E staff
member, make sure that those implementing the programme
are involved in monitoring and evaluation.

What exactly do we monitor?
First of all, we monitor time. We compare our plans to the implementation to see if we are still on schedule, and if not, we
adjust our planning. Secondly, we monitor money. We ascertain
whether the actual spending is in line with what we estimated
beforehand. If there are differences between the two, find out
why. Also, we will monitor our outputs and especially look at the
quality of these outputs. Finally, we also monitor beyond the output level, looking at outcomes. Often, reports only contain data
about the number of people that participated in a course or the number of books published. This means that there is only information about the horse being at the well, and no information about whether the horse drank. That, however, is what we are after,
because we want to know if our work produced the expected
results.
PM&E instruments

The mirror in your purse

Monitoring is like looking into the small mirror you keep in your purse. While you're on the road, you check how you look and adjust your hairdo or re-button your shirt. Evaluation is standing still in front of your big mirror at home after you get back and really taking the time to analyse yourself.

All of the above implies that monitoring is part of the plan. Therefore, whenever a project or programme is planned, we should
include a monitoring plan. This plan defines, for each of our results, how we will measure our indicators and who is responsible for collecting the data. Even when there are specific staff
members working on M&E, make sure that the individuals who
are actually working on the projects are also involved in monitoring – or at least in drafting the indicators and ensuring that
monitoring is being well performed. Don’t make your plans too
complicated though; the monitoring plan should only be an element of the overall project or programme plan. Before you begin
to write a monitoring plan, discuss what kind of information is
needed, how you intend to use the information and who will be the main users of the information. Be aware of the danger of collecting too much data; collect information that you really need.
The monitoring plan should then contain data about:

- Who collects the data and who analyses the data
- How the information will be collected
- When and how often monitoring will be performed
- How much the monitoring will cost

Monitoring consists of collecting data, analysing the data, learning from the data, and adjusting plans. There are many tools available for data collection. The most common ones include:

- Surveys – pre- and post-questionnaires to measure progress
- Semi-structured interviews – participants' opinions
- Systematic observation – performed by a team of people in order to cross-check observations
- Focus Group Discussions
- Storytelling
- Case studies
- Informal conversations
- Most Significant Change Method
- Multimedia monitoring
- Journaling (as used in Outcome mapping)

Most important is to be critical and ask the right questions. Once we have gathered information, we try to make sense of it by putting all the data together and analysing it. We try to interpret its meaning to understand why things went a certain way. We then attach our judgment to it: was the thing that happened good or bad? Finally, we come up with recommendations on how to improve our monitoring plan in the future. This analysing phase is very important and should be planned carefully.

A PM&E system will establish the frequency with which your organisation sits together to discuss the data you have collected – every three months, for instance. Ask yourself: did things go according to plan, or has nothing that was planned actually been done? Talk about why it went the way it did and adjust the planning for the rest of the year so that it is more realistic. Then discuss which activities succeeded and which ones failed. What is the secret of your success? If you conduct these meetings more often, you can begin to ask yourself whether there is a pattern – determine the unwritten rules of your work method and write them down.
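Purely as an illustration, one row of such a monitoring plan can also be written down as a simple data structure. The sketch below (in Python) reuses one of the training outputs from the logframe example earlier; the roles and frequency filled in here are assumptions made for the sake of the example, not a prescribed format.

# Illustrative only: one row of a simple monitoring plan as a plain dictionary.
# The field names, roles and frequency are examples, not a prescribed format.
monitoring_plan_row = {
    "result": "Sixty villagers trained in improved hygiene and sanitation practices",
    "indicator": "At least 70% of those trained score higher on the post-test than on the pre-test",
    "source_of_verification": "Pre- and post-test scores",
    "who_collects_the_data": "the training facilitator",  # example role
    "who_analyses_the_data": "the programme officer",     # example role
    "when_and_how_often": "after each training",          # example frequency
    "estimated_cost": "included in the training budget",  # example estimate
}

for field, value in monitoring_plan_row.items():
    print(f"{field.replace('_', ' ')}: {value}")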

Guidelines for successful monitoring meetings

- Make sure all levels of your organisation are represented
- Have the notes of earlier meetings available
- Be critical! Ask the others why they think an event was a success or a failure
- Be open to criticism and look for solutions
- Create a list of things to do to improve your way of working

Evaluation
The difference between monitoring and evaluation is that monitoring is performed continuously while the programme is implemented, and evaluation is performed after (a part of) a project has
ended in order to measure the performance of the programme or
project. It is designed to answer the question "why?". Often, organisations only perform evaluations when outside parties (such as donors) expect them to. That is a pity, because an evaluation is again an excellent opportunity to learn, especially a mid-term evaluation, which can formulate recommendations for the remaining project time. Through evaluation you can determine why certain
activities you conducted had such great results and why others
failed. You can see whether your assumptions were true or false.
So, look critically at yourself and the strategies you used.
Part of the information gathered for monitoring can be used here. Evaluations look at the bigger picture and try to map which changes
have occurred and the extent to which the project has influenced
or contributed to these changes. Depending on the funding organisation, an evaluation is done internally (often participatory)
or externally through a consultant. In both cases, there are five
areas that deserve attention:

- Impact – The project strategy's impact on communities
- Effectiveness – The degree to which the outputs led to the predicted outcomes
- Efficiency – The cost- and time-effectiveness of the strategy in turning the means into results
- Relevance – The degree to which the project corresponds to the needs of the communities
- Sustainability – The degree to which the results of the project will last

There are many ways of conducting evaluations. Most common
is to combine different methods and theories. Most important
is the concept of triangulation, which means that the evaluators
collect information from different perspectives. This method ensures that the evaluation is not influenced by a single informant
or one particular opinion.
Reporting
For any type of report (both for monitoring and for evaluations), it is important to talk about its overall purpose. Within your organisation, discuss the following points:

- How the report is used
- Who prepares the report
- Who gets the report
- How the project will be reviewed
- How we will learn

Innovative methods
Most Significant Change
MSC is a form of participatory monitoring and evaluation which
uses stories from project participants explaining changes they
have observed in their lives as a result of the project. With this method, individuals from the communities are interviewed using one open question. Their stories are written up in a similar way and collected. The stories can be clustered into different themes.
If relevant, you could go back to the participants and further
analyse how changes related to a theme have been brought
about, explicitly focusing on the contextual factors at play as well as the project's contribution to the observed changes. A team will then vote on the story which best represents the change that was intended within the programme.
Outcome mapping
This method is based on the idea that development is about
people and how they relate to each other and their environment. This approach focuses more on behavioural changes
and on the relationships between the people, groups, and
other organisations with which an organisation works directly.
Outcome mapping tries to help organisations become more
specific about the people and groups they intend to reach, the changes they expect to see, and the strategies they use, and eventually to become more effective in achieving results. Especially valuable are the levels of results, which are set as 'results we expect to see', 'results we like to see', and 'results we love to see'.

4. Learning
What is a learning organisation?
Due to criticism of international development aid and increasing competition, there has been a growing focus on concrete results. The public understandably wants to see results. NGOs needed to 'prove' more than 'improve'. Learning as an organisation is, however, very important. Why? International development
is a complex field. Nobody has ready-made solutions which can
be applied in every situation. We thus have to implement new
strategies to find out what works. An organisation that actively
learns can be more effective, as it knows what to work on. It
will also develop its capacity as an organisation to continuously
improve. Also, learning closes the gap between M&E and planning. It takes time to stand still and reflect (look into the mirror in your purse a bit longer, look at some pictures of when you were young and compare, talk to people who know you about how they like your hair best).

It is not the strongest of the species that survives, nor the
most intelligent, but the one the most responsive to change.
Charles Darwin




A learning organisation is successful when it:

- Knows something has to change
- Provides learning opportunities
- Links individual to organisational learning
- Creates a safe space for sharing
- Embraces creativity for innovation
- Is aware of its environment

Linking individual to organisational learning refers to the fact that sometimes staff members gain knowledge (through courses or experience) which is not shared with their colleagues. In
a learning organisation there is space and time for sharing this
knowledge. Being aware of its environment means that an organisation knows which other organisations are active in its field
and what the developments are on theories about relevant issues.
Kolb’s experiential learning cycle

This learning cycle, designed by Kolb, describes the different stages to go through in order to learn. The main idea is that it is not enough to just experience something in order to learn; we need to reflect on our experiences and try to come up with a general rule, which we can then test again.

[The cycle: Concrete Experience → Reflective Observation → Abstract Conceptualisation → Active Experimentation → back to Concrete Experience]

Concrete Experience (doing / having an experience)
The organisation is undertaking a programme on water – these are its experiences.

Reflective Observation (reviewing / reflecting on the experience)
This step is about analysing and judging events, activities or results. It discusses how and why. We can compare our own reflections with those of participants, partners, external evaluators or observers.

Abstract Conceptualisation (concluding / learning from the experience)
Now that we understand better what went well or wrong, and why, we can try to come to a conclusion (a hypothetical rule) about what that means on a larger scale. We can for instance say 'Placing new latrines only works when people are assigned to do maintenance', and look for other information (experiences of other organisations, literature about the topic) to complement our conclusion.

Active Experimentation (planning / trying out what you have learned)
The rule or hypothesis we formulated is tested in this step. We are trying out what we have learned, only to continue to reflect on our experiences and go through the cycle again.


Room for thought


5. What's next?
Have fun!
Now that you have PM&E in your pocket, you will hopefully be
able to get started on your own. Try to keep in mind that everyone uses different words to refer to the same things. Make sure
you agree on the definitions before you start working on plans
within your own organisation and when working with external
parties. Most of all remember that PM&E can be fun, as it allows
you to learn from your experiences and eventually improve your
programme.
Read more?
In case you are hungry to learn more about PM&E, here are two
very good publications:

- 'The Barefoot Guide to Working with Organisations and Social Change'. They also have a book specifically about learning. You can download these for free on their website: http://www.barefootguide.org.

- 'Integrated Monitoring: A Practical Guide for Organisations that Want to Achieve Results', by InProgress, downloadable at: http://www.inprogressweb.com.

Text: Amis Boersma
Lay-out: Miriam van Oort
Supervision: Edith Kroese / Avance b.v.
Made for Simavi

