
Illustration 33. Outcomes of the follow-up phase: national system for evaluation and monitoring of the NISP

2.7. Permanent evaluation: a key element in the whole process

Working on an NISP does not finish with the final report or action plan. In fact, the work on an NISP continues through monitoring and permanent evaluation. The main evaluation criterion should be verification of the achievement of the goals and objectives laid down in the NISP. These criteria should be relevant to each of the goals and objectives.

There are many methodologies for carrying out assessments and evaluations. One of them is outcome mapping, a methodology endorsed by the International Development Research Centre (IDRC), Canada. Outcome mapping provides not only a guide to essential evaluation map-making, but also a guide to learning and increased effectiveness, and an affirmation that being attentive along the journey is as important as, and critical to, arriving at a destination.

It helps a programme be specific about the actors it targets, the changes it expects to see and the strategies it employs and, as a result, be more effective in terms of the results it achieves.20

Evaluation of an NISP also provides an assessment of the NISP's relevance, effectiveness and impact, efficiency and utility. A key aim of the evaluation is to assess the added value of these initiatives to the country, their impacts at the national level and the lessons to be learned that may inform work-programme development over the agreed timeline.

The process of monitoring and evaluating progress in achieving the goals of an Information Society policy is decisive in actually implementing the chosen goals. Without some indication, signals, even warnings of how all elements of society are adapting to the installation and application of the NISP, there can be no way of understanding whether the shift towards the construction of an Information Society or its permanent updating is actually taking place or working in positive ways. Moreover, there can be no understanding of future policy steps without reference to the current status of the NISP implementation and application procedures.

A multistakeholder commission may be designated to periodically monitor and assess the NISP's efficiency and impacts.

Example 26. eEurope 2005 Final Evaluation. This evaluation covers the eEurope 2005 Action Plan, complementing the evaluation of the multi-annual MODINIS programme (2003-2005). Its assessment applies three evaluation criteria:

20 See Outcome Mapping: Building Learning and Reflection into Development Programs, 2002, by Sarah Earl, Fred Carden and Terry Smutylo. This publication explains the various steps of the outcome mapping approach and provides detailed information on workshop design and facilitation. It includes numerous worksheets and examples.

1. Relevance and utility: whether the objectives of the programme corresponded to the needs, opportunities and challenges of society
2. Efficiency: examining the level of resource use (inputs) required to produce outputs and generate results
3. Impact: whether the intervention has created the intended effects

Within each of these criteria, a set of evaluation questions has been formulated to make the scope of the evaluation operational. The methodological approach is based on four types of analysis conducted in consecutive phases and makes use of multiple data sources: programme analysis, peer-group analysis, country analysis and an impact analysis developing an impact model.

Source: EC

The use of indicators to monitor these objectives is critically important, particularly in developing countries, where the digital divide is a prominent political issue. Indicators provide feedback with regard to national policy-making and investment, and also in terms of external participation in projects and investments. In order to design the assessment methodology, the appointed commission will need to build a set of indicators (ESCWA, 2005).

Surveys can be restricted to selected groups or open to the public. In the latter case, web surveys can be extremely useful, as shown by the Web-Based Survey on Electronic Public Services in Poland:

Example 27. Web-Based Survey on Electronic Public Services in Poland (3rd edition, 2004). Conducted by the Ministry of Interior and Administration and the Ministry of Science and Information Society Technologies (public), together with Capgemini Poland (a private consulting company). The survey is carried out regularly as part of the "eEurope 2002" and "eEurope 2005" strategies. The report evaluates the development of public services in Poland in comparison with other European countries. It points out the strengths and weaknesses of Polish eGovernment and helps to build a sound development strategy to bring Poland up to the EU level.

Source: MRR

Based on the evaluation findings, the assessment report may suggest that several aspects of both the management and the content of the given NISP can be improved when developing successive phases and updates.

In the case of the eEurope 2005 Final Evaluation, the assessment was conducted with a mixture of quantitative and qualitative methods. The mixture was chosen to meet the requirement that the evaluation be exploratory and forward-thinking in order to provide lessons for the future. The methodology applied is more system- and model-oriented than what is commonly considered standard evaluation practice. The soundness and validity of the analyses and data elaborations were secured through triangulation of findings from multiple sources.

The scope of the data collection was wide, and different data were linked to each other in the analysis. The methodology contains four types of analysis:

- Programme analysis
- Peer-group analysis
- Country analysis
- Impact analysis: impact cases and development of an impact model

The overall objective of the programme analysis was to establish a preliminary description and analysis of the programme. The analysis primarily provided the basis for the assessment of efficiency, but also provided input for the assessment of the programme's relevance, in particular regarding its relationship to other programmes. The data supporting the analysis were collected through desk research and interviews with programme-related personnel, both within the Commission services and in the member states.

Interviews were conducted both face to face and over the phone. The selection of interview partners was made in cooperation with the Commission, DG INFSO.

Another example from Poland is the ePolska 2004-2006 monitoring report (MRR 2008).

Conducted by the Ministry of Interior and Administration (MI&A) and the Ministry of Science and Information Society Technologies (public), this report was the first in a series to be produced regularly to monitor progress in developing the Information Society in Poland. Based on the information given by all departments responsible for implementing the strategy, it deals with the following issues: providing cheap, broadband, safe Internet access for all citizens; creating online public services and eLearning platforms; supporting a widespread ability to use PCs; and fighting eExclusion.

Illustration 34 shows the complete map of the procedures to formulate, implement, monitor and evaluate an NISP:

Illustration 34. NISP Map: development of a National Information Society Policy (NISP). The map traces the inputs, processes, outputs and milestones of each phase: formulation of the NISP (starting point), implementation of the NISP (implementation stage), and NISP follow-up, monitoring, control and adaptation (follow-up phase). These unfold within the strategic frameworks of the national policy for the Information Society, the political will to support the proposed goals, national contexts and internal factors, external political and economic factors, and sectoral interests. Recurring elements include the governmental body or civil servants in charge of the NISP, experts teams, diagnostics, goals and beneficiaries, legislation and system changes, citizen participation, the choice or creation of an intersectoral and multistakeholder body (an agency or organization) to carry out the policies and strategies proposed by the NISP, new or updated legislation, agencies for monitoring and assessment, and the national system for evaluation and monitoring of the NISP.

2.8. REMINDERS FOR GOVERNMENTAL OFFICERS, POLICY MAKERS AND EXPERTS TEAMS

There is no general recipe for successful ICT policies and e-strategies. Governmental officers, experts teams and policy makers in countries at different levels of development may identify examples of successes or best practices, either within their own territories or in other similar countries, in order to adapt them as necessary to fit their nation's unique circumstances.

Nevertheless, a few principles are common to most, if not all, successful approaches. In crafting ICT policies, experts groups and policy makers face nine major challenges21:

1. A need for vision and leadership;

2. Consistency with other national development goals;

3. Coordination within government;

4. Consultation for consensus on objectives and approaches;

5. Implementation of articulated and realistic plans of action;

6. Resources prioritized and not based on mere wishful thinking;

7. A supportive legal framework to enable ICT policies;

8. Supportive policy frameworks to facilitate implementation; and

9. Objectives against which to monitor progress and produce defined results.

In view of these challenges it may be useful to consider these suggestions:

1. Knowing the degree of e-readiness is fundamental to setting future goals and implementing realistic policies. Carry out a reliable diagnostic of your country's status regarding the Information Society.

2. Establish a baseline of indicators that characterize the present and the historical trends leading up to it. Be precise in setting goals. Based on the previous diagnostic, and on the set of indicators you have used, formulate goals and monitor progress towards achieving them.

3. Be informed about international best practices. Use the Internet and other ICTs to research and identify best practices from other areas, which can eventually be replicated or adapted to your country's needs and context.

4. Prioritize your objectives, as well as the participating actors' goals and interests.

5. Engage stakeholders as early as possible with consultative and participatory workshops and seminars with the private sector, academia and civil society. For the general public, awareness campaigns and educational programmes may be the best tools for appropriate and productive adoption of ICTs.

6. Enlist the participation of federal, regional and local governments in your country in planning National Information Society policies and strategies from the early stages. Participating in the creation and updating of policies and strategies will not only provide the necessary information about local needs, goals and demands, but will also facilitate the involvement of provinces and regional states, as well as the implementation of the policies in their regions.

7. Consider that some of these actors and interests may be in conflict with other areas; others may deserve simultaneous but separate approaches.

8. Keep a long-term vision. Some policies generally only influence decisions over the medium to long term.

9. Be alert to leapfrogging opportunities. Analyse the stages through which other countries' successful ICT policies and industries have passed and find out whether opportunities exist for leapfrogging these stages with cutting-edge or emerging ICTs.

10. Let government coordinate ICT initiatives with investments but, most importantly, with conducive policies and legislation to encourage private capital and entrepreneurship. Governments are also model users: by using ICTs themselves, for example in e-government applications, they promote their appropriation by citizens.

11. Let the private sector drive ICT initiatives, with investments, entrepreneurship, and coordination with the state and other stakeholders. While governments set the policies and the planning, much of the implementation falls upon the private sector. As such, private companies and organizations have a stake in ensuring that ICT policies and e-strategies match their priorities.

12. Engage the active participation of the science and technology or academic sector. This sector provides knowledge to be applied in ICT production and dissemination, as well as the human resources to work in state and private ICT-related enterprises. Academic institutions can play a relevant role in helping design and evaluate ICT projects that may involve technically demanding research. In addition, their corporate research counterparts are also active in developing standards that are revolutionizing the spread and use of ICT.

13. Involve civil society organizations. ICT strategies should balance economic and social concerns to combine sectoral growth with the development of society. In the economic arena, the private sector drives progress, but in the social arena, civil society organizations (CSOs) and local communities should assume importance, particularly in rural areas far from the reach of central governments.

14. Ensure that policies and strategies are periodically monitored, evaluated, updated and modified as necessary to yield the desired results.

21 Based on Ulrich, Chacko and Sayo (2004).
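The baseline-and-goals suggestion above can be made concrete with a small sketch. The snippet below (all indicator names and figures are hypothetical illustrations, not data from any country) computes, for each indicator, the fraction of the distance from its baseline toward its target that has already been covered, which is the kind of progress signal a monitoring commission would review periodically:

```python
# Minimal sketch of indicator-based progress monitoring for an NISP.
# All indicator names, baselines, current values and targets are
# hypothetical examples chosen purely for illustration.

def progress(baseline: float, current: float, target: float) -> float:
    """Fraction of the distance from baseline to target already covered."""
    if target == baseline:
        return 1.0  # the goal was already met at the baseline
    return (current - baseline) / (target - baseline)

# Each entry maps an indicator name to (baseline, current, target).
indicators = {
    "households_with_broadband_pct": (10.0, 25.0, 40.0),
    "online_public_services_count": (5.0, 14.0, 20.0),
    "adults_with_basic_ict_skills_pct": (30.0, 42.0, 60.0),
}

for name, (base, now, goal) in sorted(indicators.items()):
    print(f"{name}: {progress(base, now, goal):.0%} of the way to target")
```

A real indicator set would be drawn from the diagnostic phase and from internationally agreed ICT indicator lists, so that progress can also be compared across countries.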

i. Index of Illustrations

Illustration 1. Basic scope of an NISP
Illustration 2. Milestones

