OECD Blue Sky 3 Summary Presentation
Transcript of OECD Blue Sky 3 Summary Presentation
INFORMING SCIENCE AND INNOVATION POLICIES
TOWARDS THE NEXT GENERATION OF DATA AND INDICATORS
19-21 September 2016, Ghent, Belgium
Luc Soete - WE NEED TO DEVELOP STI INDICATORS USING A SYSTEMS APPROACH
Luc Soete - WE NEED TO MEASURE INNOVATION IN ALL SECTORS, EXTENDING THE OSLO MANUAL
BLUE SKY KEYNOTE LECTURE
WHAT BIG QUESTIONS ARE BEGGING FOR AN ANSWER?
KEY MESSAGES ON: human-centered policy design; the role of participatory processes
We need human-centered policy design – Minister Manuel Heitor
• Need routinely to collect data on placement outcomes of highly trained individuals
• Need to collect systematic cross-country data concerning mobility of scientists and engineers
• Need new indicators on migration of refugees, especially student refugees and refugees trained in science and engineering
• Improve measurement of scientific knowledge flow and its impact by learning directly from scientists and engineers through surveys and case studies
• Need data on collaborative mechanisms for innovation such as de jure standards
Debate 1: What big questions are begging for an answer?
We need participatory processes
• We need to better characterize participatory processes of R&D agenda setting to help engage scientific institutions and actors with civil society
• We need collaboration with scientists, engineers and users to understand the knowledge production process and its impact
• We need to increase citizen participation in science and public support for science: the role of storytelling
SCOPE AND LIMITS OF INDICATOR USE BY STI POLICY
KEY MESSAGES ON: the problem of research evaluation; advantages and pitfalls of composite indicators
In the quest for serviceable metrics, it is important to keep reminding ourselves about the limit of our imagination and what people really value – Stephen Curry
The problem of evaluation
For far too long we have focused on placements in academe and used bibliometric measures to assess outcomes of education and funding - Paula Stephan
Oversimplification – e.g. rankings, impact factors – can create perverse effects
The problem of composite indicators
If applied insensitively, they result in indicator-driven policy, which is certainly NOT tantamount to evidence-based policy - Wolfgang Polt
TOWARDS MORE INCLUSIVE SCIENCE AND INNOVATION
KEY MESSAGES ON: what and whom need to be included; what new metrics in this space
What and whom need to be included?
• Include new performers of STI activities: e.g. user/consumer innovators, free/open innovation
• Include geographical and cognitive peripheries: locally relevant R&D, invisible science, agriculture, social sciences and humanities
• Include new sources of information: e.g. big data, social media… altmetrics
• Include new stakeholders in the community of practice of STI indicators: e.g. citizens (public engagement)
Altmetrics have not been the panacea that we hoped for and they do not measure social impacts – Cassidy Sugimoto
The promise of altmetrics
THE FREE INNOVATION PARADIGM – Eric von Hippel
Change the definition of innovation in the Oslo Manual to enable it to ALSO apply to household sector innovation and other economic sectors as well.
Measure household sector innovation
“Inclusive” in terms of who uses the indicators – Ismael Rafols
Develop toolkits that allow exploration of choices in landscapes and allow users’ participation in decision making
MONDAY 19 - PARALLEL SESSIONS: KEY MESSAGES
• Data analytics for science and innovation
• Technology diffusion and breakthroughs
• Developing novel indicators from scientometrics
• Capturing innovation in firms: do we get it right?
• Leveraging the potential of administrative data for science and innovation policy
DATA ANALYTICS FOR SCIENCE AND INNOVATION
• Text mining tools promise to alleviate some of the common challenges facing STI statistics, e.g. survey fatigue and unfit-for-purpose classification systems that are applied differently by human coders (e.g. patent examiners applying the USPC).
• Theory-driven text mining offers new opportunities for generating STI indicators, e.g. through near real-time monitoring and online media monitoring for sentiment analysis.
• Text mining depends on vocabularies, ontologies and other linguistic techniques. These can be defined manually or automatically, and deductively (e.g. through topic modelling) or inductively (e.g. through machine learning algorithms) – or in a combination of these approaches.
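The vocabulary-driven side of this can be sketched in a few lines. This is an illustrative example, not from the presentation: a crude “automatic” domain vocabulary derived by comparing term frequencies in a target corpus against a background corpus. All documents and thresholds are invented.

```python
from collections import Counter
import re

def tokenize(text):
    # Lowercase and keep alphabetic tokens only.
    return re.findall(r"[a-z]+", text.lower())

def extract_vocabulary(target_docs, background_docs, ratio=2.0, min_count=2):
    """Keep terms that are markedly more frequent in the target corpus
    than in the background corpus (a crude automatic vocabulary)."""
    target = Counter(t for d in target_docs for t in tokenize(d))
    background = Counter(t for d in background_docs for t in tokenize(d))
    n_t = sum(target.values()) or 1
    n_b = sum(background.values()) or 1
    vocab = []
    for term, count in target.items():
        if count < min_count:
            continue
        rel_t = count / n_t
        rel_b = (background.get(term, 0) + 1) / n_b  # add-one smoothing
        if rel_t / rel_b >= ratio:
            vocab.append(term)
    return sorted(vocab)

# Invented toy corpora: patent-like abstracts vs. general news text.
patents = ["graphene transistor device", "graphene sensor device layer",
           "transistor gate layer design"]
news = ["election results announced today", "weather forecast rain today",
        "sports team wins match"]
print(extract_vocabulary(patents, news))  # ['device', 'graphene', 'layer', 'transistor']
```

Real pipelines would add stemming, stop-word handling and statistical significance tests, but the contrast-against-background idea is the same.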
MODERATOR: Katy Börner, Indiana University
DEVELOPING NOVEL INDICATORS FROM SCIENTOMETRICS
• Traditional bibliometric indicators should be reviewed to add meaning and international comparability. Oversimplification – e.g. rankings, impact factors – can have negative implications.
• A quality dimension of the validation process – indicators of peer review – should be integrated. Adding more dimensions could capture real author contributions as well as novelty.
• We should provide a more informed role to the users of bibliometric information.
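As a hedged illustration of what “adding meaning and international comparability” can look like in practice (not from the presentation), the sketch below computes a mean-normalized citation score: each paper's citations divided by the average for its field and year, so raw counts from fields with very different citation cultures become comparable. All records are invented.

```python
from collections import defaultdict
from statistics import mean

def normalized_citation_scores(papers):
    """Divide each paper's citation count by the average citations of its
    (field, year) group -- a simple mean-normalized citation score.
    `papers` is a list of dicts with 'id', 'field', 'year', 'citations'."""
    groups = defaultdict(list)
    for p in papers:
        groups[(p["field"], p["year"])].append(p["citations"])
    baselines = {k: mean(v) for k, v in groups.items()}
    return {p["id"]: p["citations"] / baselines[(p["field"], p["year"])]
            for p in papers}

# Invented toy records: the same raw count means different things
# in a high-citation field than in a low-citation one.
papers = [
    {"id": "a", "field": "biomed", "year": 2015, "citations": 30},
    {"id": "b", "field": "biomed", "year": 2015, "citations": 10},
    {"id": "c", "field": "math", "year": 2015, "citations": 6},
    {"id": "d", "field": "math", "year": 2015, "citations": 2},
]
scores = normalized_citation_scores(papers)
print(scores)  # 'a' and 'c' both score 1.5 despite different raw counts
```

Production indicators use much larger reference sets and document-type controls, but the normalization principle is the same.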
MODERATOR: Laura Cruz, Institute of Public Goods and Policies, Spain
CAPTURING INNOVATION IN FIRMS: DO WE GET IT RIGHT?
• Overall… NO! We are capturing something, but improvement is needed.
• Survey design, question design, content and implementation matter for data quality and international comparability.
• Respondent characteristics also have significant impacts (e.g. their expertise in innovation at the individual and firm level, micro-firm status, whether or not they buy in their major innovations, translation/cultural aspects, etc.)
MODERATOR: Louise Earl, Statistics Canada
LEVERAGING THE POTENTIAL OF ADMINISTRATIVE DATA FOR SCIENCE AND INNOVATION POLICY
• Metadata for research projects is inherently complicated; data access does not solve the problem; need to consistently identify and measure R&D projects
• Tremendous potential in using machine-learning techniques to organize the large, unstructured data and make it amenable for analysis
• Tremendous interest in networks and linkages, and this raises difficult problems in disambiguation; potentially interesting work going forward on this
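The disambiguation problem mentioned above can be illustrated with a deliberately crude sketch (not from the presentation): block author records on surname and first initial, then match within a block on affiliation-word overlap. All names and the threshold are invented; real systems use far richer features (co-authors, topics, ORCID identifiers).

```python
import re

def jaccard(a, b):
    """Jaccard similarity between two word lists."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def same_author(rec1, rec2, threshold=0.3):
    """Crude disambiguation: records must share surname and first initial
    (the 'blocking' step), then are matched on affiliation-word overlap."""
    if rec1["surname"].lower() != rec2["surname"].lower():
        return False
    if rec1["given"][0].lower() != rec2["given"][0].lower():
        return False
    words1 = re.findall(r"\w+", rec1["affiliation"].lower())
    words2 = re.findall(r"\w+", rec2["affiliation"].lower())
    return jaccard(words1, words2) >= threshold

# Invented records.
r1 = {"surname": "Smith", "given": "Jane",
     "affiliation": "Dept of Economics, Ghent University"}
r2 = {"surname": "Smith", "given": "J.",
     "affiliation": "Ghent University, Economics"}
r3 = {"surname": "Smith", "given": "John",
     "affiliation": "MIT Media Lab"}
print(same_author(r1, r2))  # True: same block, overlapping affiliation
print(same_author(r1, r3))  # False: same block, disjoint affiliations
```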
MODERATOR: Adam Jaffe , MOTU, New Zealand
TECHNOLOGY DIFFUSION AND BREAKTHROUGHS
• Appropriate reference frames / reference data sets / benchmarks are important requisites for the assessment of technology diffusion.
• A long-term (funding/analytical/strategical) focus is beneficial in the assessment of technology diffusion, in order to allow for the recognition of long-term dynamics and changes within the field.
• In advanced assessments of technology diffusion it is of great value to allow for an agile / dynamic approach to data collection, as opposed to dependence upon a static data repository.
MODERATOR: Mosahid Khan, WIPO
SCIENCE AND INNOVATION POLICY-MAKING IN AN ERA OF BIG DATA
KEY MESSAGES ON: “big”, “promising” or “uncomfortable” data? The potential for science and innovation policy making
We want informed story telling that captures the essence of the underlying data
• We are here to create data and metrics to gain shared understanding and evaluate policy alternatives and identify gaps
• Leverage digitisation to deliver new metrics
• Need to develop a granular capability to capture the dynamics of innovation
• We don’t know where the next data will come from!
BLUE SKY KEYNOTE LECTURE – Scott Stern
STI POLICY MAKING IN THE ERA OF BIG DATA
New data – big data, web data and open data – data combinations and interactive mapping and reporting tools: exciting opportunities or “uncomfortable data”?
Exciting? Yes, BUT:
• What is the right amount of data?
• Need for complementary investment in capabilities to deal with the data
• Need to think of the human in the loop: how do we present the results? Can we take big data and create a narrative?
• Remove uncertainty through experiments
• Look at these methods/data as “toolkits” rather than “silver bullet” answers for policy makers
Challenges for the use of “big data” in companies: trust; availability of platforms; technical skills; access to complementary data (who owns the data?)
NEW MODELS AND TOOLS FOR MEASURING SCIENCE AND INNOVATION IMPACTS
KEY MESSAGES ON: building on a wide range of available tools; embedding measurement and evaluation into all of our work
NEW MODELS AND TOOLS
Now available: high-quality, high-coverage, interlinked data; cost-effective storage and computation; validated, scalable algorithms; visualization and animation capabilities – but are we using them?
Some old ideas have not sunk in yet – need to build understanding and a culture for evaluating everything
Quality of data is key – we need to understand what the data tell us
Measuring the role of innovation in economic performance and productivity is like measuring the contribution of butter to the cake
OECD role: microdata analysis, building and sharing understanding, developing standards, …
NEW DATA AND FRONTIER TOOLS: THE CHALLENGE FOR OFFICIAL STATISTICS IN SCIENCE AND INNOVATION
KEY MESSAGES ON: opportunities and challenges for national statistical offices
BIG DATA: OPPORTUNITIES AND CHALLENGES FOR OFFICIAL STATISTICIANS
Surveys and administrative data are complementary methods that ensure the representativeness of the population and can be used to measure new phenomena (e.g. survey data to analyse the disruptive impact of digital platform services)
Big data techniques are used by NSOs for analytical/statistical purposes (e.g. use of geo-spatial data, hydrographic and weather data to forecast agricultural yields, scanner data to replace price collection, web scraping to improve frames for surveys, crowdsourcing of information to improve the design of policies), but also for operational ones (to cut down cost of processing data).
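One concrete instance of the “scanner data to replace price collection” use case above can be sketched as follows (an illustration, not an NSO implementation): a Laspeyres-type price index computed directly from product-level price/quantity records. All figures are invented.

```python
def laspeyres_index(base_period, current_period):
    """Laspeyres price index: current prices weighted by base-period
    quantities, divided by base prices weighted by base-period quantities,
    times 100. Each period maps product_id -> (unit price, units sold).
    Only products present in both periods enter the index."""
    common = base_period.keys() & current_period.keys()
    num = sum(current_period[p][0] * base_period[p][1] for p in common)
    den = sum(base_period[p][0] * base_period[p][1] for p in common)
    return 100.0 * num / den

# Invented scanner records: product -> (unit price, units sold).
jan = {"milk": (1.00, 500), "bread": (2.00, 300)}
feb = {"milk": (1.10, 480), "bread": (2.00, 310), "eggs": (3.00, 100)}
print(round(laspeyres_index(jan, feb), 1))  # 104.5
```

Actual scanner-data CPI work must also handle product churn, relaunches and chaining, which is where the methodological effort goes.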
Opportunities relate to “timeliness” and “higher granularity”, better “accuracy” and reduction of respondent burden
Institutional Challenges: access to privately owned data/privacy issues; content stability; replicability
Technical challenges: need to invest in infrastructure, software, capabilities/expertise
In the era of the Internet of Things there is a need for “smart statistics”, partnerships with the private sector and incentives for data sharing
What can the OECD/the international community do? Common data standards; algorithmic transparency and accountability (dealing with automation); a trusted third party for certification; “labelling” activities; clearing houses for data programmes; deal with international comparability of the new data sources; collect initiatives on the use of big data to develop indicators; share best practices
TUESDAY 20 - PARALLEL SESSIONS
• Innovation and IP: what data gaps limit policy discussion?
• Researchers on the move
• Interaction and impacts of STI policies
• Capturing hidden innovators
• STI actors: the potential of direct surveys
INNOVATION AND IP: WHAT DATA GAPS LIMIT POLICY DISCUSSION?
• IPRs beyond patents: need for holistic view (for instance, exploiting data on other IP - TMs, utility models…). More information needed about trade secrets in particular.
• Better understanding of the use of IP by end users in products. Here we need better data, for instance product-patent pairs (de Rassenfosse). Licensing data would be particularly useful. So far we have been limited to just a few sectors, like pharma.
• Better understanding of the mechanism of knowledge flows. Again, better data is needed, for instance the diffusion from the scientific literature to practitioners (via the "enlightenment literature," Hicks).
MODERATOR: Alan Marco, U.S. Patent and Trademark Office, USA
RESEARCHERS ON THE MOVE
• Bibliometric data can provide a wealth of information on mobility. Data can provide levels of aggregation from the country to the region, institution and individual.
• Combining different sources of data can provide larger opportunities on a global scale. However, linking challenges need to be resolved.
• Technology now provides new tools to scrape/mine the Internet (e.g. CVs), such as Natural Language Processing (NLP).
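A minimal sketch of the CV-mining idea (not from the presentation, and far simpler than real NLP pipelines): extract (period, position) pairs from a plain-text CV with a regular expression, which is enough to reconstruct a crude mobility trajectory. The CV text is invented.

```python
import re

def extract_positions(cv_text):
    """Pull (start, end, role) triples from lines like
    '2014-2017  Postdoc, Ghent University' in a plain-text CV."""
    pattern = re.compile(r"(\d{4})\s*[-–]\s*(\d{4}|present)\s+(.+)")
    positions = []
    for line in cv_text.splitlines():
        m = pattern.search(line.strip())
        if m:
            start, end, role = m.groups()
            positions.append((start, end, role.strip()))
    return positions

# Invented CV.
cv = """Jane Smith
2010-2014  PhD student, University of Antwerp
2014-2017  Postdoc, Ghent University
2017-present  Assistant Professor, KU Leuven"""
for p in extract_positions(cv):
    print(p)
```

Real CV mining has to cope with free-form layouts, multiple languages and institution-name disambiguation; this only shows the extraction step at its simplest.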
MODERATOR: Emilda B. Rivers, National Science Foundation, USA
INTERACTION AND IMPACTS OF STI POLICIES
• Program evaluation - significant progress in both techniques and availability of linked datasets since Blue Sky 2
• More to be done to assess the efficiency of programs and the joint impact of policies (but a unique identifier for each firm using government support programs and complete information on each support measure received by the firm are needed)
• STI System evaluation – complex, no appropriate model currently available. Operational definition of STI system and internationally comparable proxies of policy levers are needed
MODERATOR: Pierre Therrien, Innovation, Science and Econ Dev, Canada
CAPTURING HIDDEN INNOVATORS
• Go beyond the definition of the formal private sector/market.
• Need to extend the definition of innovation to cover households and the public sector, but also the informal business sector, especially as the geography of innovation is changing. Social innovation is more problematic at this stage: the definition is still confusing.
• Need to investigate further the methodologies to capture innovation beyond the formal private sector. Need to define a survey methodology different from the private sector one, as the characteristics are quite different.
• Even the public sector is a controversial definition: is it only public administration? Does it include universities? Hospitals? Need more research to see whether all parts of the public sector innovate in the same way or whether there are substantial differences.
MODERATOR: Vladimir Lopez-Bassols, S&T policy consultant, USA
STI ACTORS: THE POTENTIAL OF DIRECT SURVEYS
• Bibliographic information is not sufficient to explain research and innovation processes. Surveys are useful and necessary to understand motivations driving research and research orientations.
• Surveys are necessary and useful to measure perceptions and opinions of actors regarding the development of the STI system, how the institutional setting affects their behaviour, or the impact of institutional reforms
• OECD should focus on global issues but still work with local researchers to increase the quality of the data
• OECD should consider implementing direct surveys to address policy gaps and when data is not sufficient to answer key policy questions
MODERATOR: Fernando Galindo-Rueda, OECD
LOOKING FORWARD: WHAT DATA INFRASTRUCTURES AND PARTNERSHIPS?
KEY MESSAGES ON: research data “infrastructures” that are reusable and sharable; the need for granular and interoperable data
• Share the data so that it is reusable
• Need to directly involve researchers to collect the data about researchers
• Create standards for persistent identifiers in datasets
• Communities should come together to develop the common infrastructure
• Ensure policy continuity in this area
WEDNESDAY 21 - PARALLEL SESSIONS
• Beyond indicators: the innovation and productivity nexus
• Towards standards for a common research infrastructure
• Trust, culture and citizens' engagement in science and innovation
• Developing novel approaches to measure human capital and innovation
• Surveying innovation in different contexts
BEYOND INDICATORS: THE INNOVATION AND PRODUCTIVITY NEXUS
• Micro-level: production functions are a useful tool and provide a conceptual framework for estimating rates of return on investments. However, it is relevant for public policy to estimate the existence of complementarities (or substitution effects)
• Macro-level: governments (and society) need to know what the rates of return are from different public investments, and to measure spillovers from all intangible investments (education, training and R&D), including by the public sector
• Improving the productivity-innovation nexus needs a better macro-micro nexus:
– Need for more micro-level measures to better understand aggregate dynamics and determinants
– Encourage linking across different datasets (macro- and micro-level datasets including firm, bibliometric and patent data, etc.)
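The micro-level production-function approach above can be sketched as follows (an illustrative simulation, not from the presentation): generate firm data from a known log-linear (Cobb-Douglas) production function with an R&D term, then recover the output elasticities by least squares. All coefficients and data are invented; the coefficient on log R&D is the output elasticity from which rates of return are typically derived.

```python
import numpy as np

# Invented data-generating process: Y = A * L^0.6 * K^0.3 * RD^0.1,
# i.e. log Y = 1.0 + 0.6*log L + 0.3*log K + 0.1*log RD (no noise term,
# so OLS recovers the elasticities exactly).
rng = np.random.default_rng(0)
n = 200
log_L = rng.normal(4.0, 1.0, n)   # log employment
log_K = rng.normal(6.0, 1.0, n)   # log physical capital
log_RD = rng.normal(3.0, 1.0, n)  # log R&D stock
log_Y = 1.0 + 0.6 * log_L + 0.3 * log_K + 0.1 * log_RD

# OLS via least squares on the log-linearised production function.
X = np.column_stack([np.ones(n), log_L, log_K, log_RD])
coef, *_ = np.linalg.lstsq(X, log_Y, rcond=None)
print(np.round(coef, 3))  # constant and the three elasticities
```

With real firm data the hard part is what this sketch assumes away: measurement error, endogenous input choices, and the construction of R&D capital stocks.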
MODERATOR: Mariagrazia Squicciarini, OECD
TOWARDS STANDARDS FOR A COMMON RESEARCH INFRASTRUCTURE
• Big potential in linking data on researchers (inputs, outputs of research, affiliations, geographical information etc.) for a better understanding of their behaviour and for a better informed policy making
• Advances are being made towards data integration but many concepts remain black boxes. More dialogue is needed between different communities to promote mutual understanding
• Models and experimentation to monitor open science are emerging. What are the metrics for open science, being aware of the fact that open science is more than open access and open data? What role for the OECD?
MODERATOR: Cecilia Cabello, Spanish Foundation for S&T, Spain
TRUST, CULTURE AND CITIZENS' ENGAGEMENT IN SCIENCE AND INNOVATION
• Although science is global, ‘science culture’ remains local; innovation is a collective process and depends on social, spatial and historical contexts
• Develop metrics to account for culture in public understanding of and attitudes to science and innovation: not country rankings, but cluster analysis across a set of variables
• Policy making could be helped by considering different approaches to segmenting populations in surveys. Disengaged people have different, but valid, attitudes
• Scientists often don’t communicate what the public wants to know
• Could the OECD become the curator of existing subjective databases around the world? Develop a “Frascati manual” on public attitudes to science and innovation – a “Ghent Manual”?
MODERATOR: Carthage Smith, OECD
DEVELOPING NOVEL APPROACHES TO MEASURE HUMAN CAPITAL AND INNOVATION
• An R&D sample survey in Germany shows that gender, education and nationality diversity can make a difference in research teams and is positively related to innovative capacity. More historical data are needed to determine causality
• Mobility across research fields leads to less valuable inventions (loss of specialisation) but more novel inventions (cross-fertilisation of ideas). Collaboration and access to scientific publications can help balance the shortcomings of mobility.
• The Oslo Manual provides clear guidelines on how to collect data but overlooks issues related to human capital, impact on outcomes and regional innovation. Linking data from different sources could give new insights without running new surveys.
MODERATOR: John Gawalt, NSF, USA
SURVEYING INNOVATION IN DIFFERENT CONTEXTS
• More comprehensive and different indicators of innovation are needed to capture innovation practices in non-traditional sectors and in developing economies.
=> These need to better capture incremental and non-technological innovations, the sourcing of external knowledge and sectoral specificities
• There is a bias towards manufacturing in much of the analysis of innovation. Information on innovation in rural areas, in mining, utilities and agriculture needs to be collected more comprehensively.
• Surveys have to aim for more objective, comparable information on innovation to capture those innovating incrementally. The framing of surveys matters for responses.
MODERATOR: Tomohiro Ijichi NISTEP, Japan