Monitoring and evaluation are related concepts in the policy arena that are often mistakenly used interchangeably. They share a common goal: contributing to policy learning, something that is easy to say and difficult to implement. Governments at different administrative levels have long pushed the implementation of monitoring practices at the system, policy and programme level, but in most cases for accountability purposes.
The discourse about learning is there, even in the European Commission’s communications and publications, but it is the same EU that then gives priority to accountability approaches, such as those behind the ERDF, the funds that finance Smart Specialisation Strategies. Aligning discourse with practice is the main challenge for advancing towards policy learning in this framework.
In the Manumix project, in which Orkestra participates as an advisory partner, the participating regions exchanged their experiences of monitoring and evaluating programmes directed at advanced manufacturing, both individually and in combination (policy mixes), in order to advance towards policy-mix evaluation. Last month, all the partners took part in a Learning Journey in Vilnius and discussed the main challenges of monitoring from their own experience.
First of all, when we hear the word ‘monitoring’, the first thing that comes to mind is ‘indicators’. We need indicators for measuring, since what is not measured does not exist. Is this true? Do we need indicators for every single intervention, initiative or government action? I think it depends on the purpose of the indicator. Indicators for learning about policy instruments should be established so as to capture the relevant information needed to make decisions about the programme. Is the programme investing the planned inputs? What are the signs of programme outputs and outcomes? And, most importantly, is the programme contributing to the strategy as a whole?
Partners from the Basque Country and Lithuania presented their approaches to monitoring, which can be taken as a sample of the common European approach: many indicators measuring effort and fewer measuring impact, a bias towards quantitative indicators, and still a long way to go towards a policy-mix monitoring system. In this respect, Piedmont introduced its approach to monitoring programmes, which combines qualitative techniques with quantitative ones.
One of the main failures of monitoring systems is the difficulty of collecting data for the selected indicators. It is therefore important not only to define reliable indicators but also indicators that are realistic with respect to the available data. Issues such as the lack of data for smart specialisation priorities that differ from standard sectors, such as advanced manufacturing at the regional level, or how up to date the data is, constitute big challenges for monitoring systems oriented towards learning.
Can policy-makers make decisions about advanced manufacturing if they do not know the industrial performance of their region? Some initiatives, such as the European Cluster Observatory, try to overcome this data gap. In addition, is it accurate and reliable to make decisions when the data reflect the situation of three years ago, given the huge dynamism of the economy? Using big data and other sources of information could be an option for facing this problem and incorporating more up-to-date data into monitoring. This is the path that NESTA and the Welsh Government are following with the Arloesiadur pilot project.
Finally, even an accurate and holistic monitoring system means nothing if the information is not fed into policy-making. Visualisation of the results and communication are therefore key issues, and they have to be adjusted to the different types of stakeholders, who have different and sometimes vested interests in the monitoring exercise.
As this Learning Journey showed, monitoring is considered really important and is taken seriously in regions across Europe despite the challenges, but there is still a long way to go.
Edurne Magro is a researcher at Orkestra-Basque Institute of Competitiveness. She holds a Ph.D. in Business Competitiveness and Economic Development with a European Mention from the University of Deusto, after having spent time at the Manchester Institute of Innovation Research at the University of Manchester, United Kingdom. She holds a degree in Business Administration and Management and a Master’s in Innovation and Technology Management from the University of Deusto.