July, 2016
Abstract
The paper reviews selected recommendations of the US National Research Council report, Capturing
Change in Science, Technology and Innovation: Improving Indicators to Inform Policy, and places the
recommendations in a broader context as part of the international effort to set a research agenda that will
produce indicators that guide policy effectively.
OECD Blue Sky Forum III, Ghent, Belgium, 19-21 September 2016
Introduction
In 2010, the National Science Foundation's National Center for Science and Engineering
Statistics (NCSES)1 commissioned the National Academy of Sciences to convene a
Panel to review the future of science, technology and innovation indicators that could be
produced by the NCSES, using its own and other data sources, in order to inform the
policy process. An interim report was published in 2012, and the final report, entitled
Capturing Change in Science, Technology, and Innovation: Improving Indicators to
Inform Policy (National Research Council 2014), was published in July 2014.
While the report was intended for NCSES, it has far broader applications for institutions
that fund research on the development of indicators, their use in the development of
policy, and on the monitoring and evaluation of policies that have been implemented.
Although some of the indicators discussed have wider applicability to a variety of
economic sectors, this paper focuses on those related to science, technology and
innovation (ST&I). Policy relevance and international comparability of ST&I indicators
were catalytic elements in the study's approach.
The 14-member National Academies' Panel, comprising economists, policy analysts,
managers of businesses, physical scientists, engineers, geographers, and statisticians
from several nations, made 30 recommendations in the Capturing Change report to the
sponsor. Chapter 8 of the report brought these recommendations from throughout the
report together to form a strategic programme of work. Recommendations regarding
strategic planning and prioritization were organized as follows:
1
Under congressional mandate since the 1950s, the National Science Foundation's National Center for
Science and Engineering Statistics is one of 13 U.S. federal statistical agencies. As codified in Section 505
of the America Creating Opportunities to Meaningfully Promote Excellence in Technology, Education,
and Science (America COMPETES) Reauthorization Act of 2010, NCSES is charged with collecting,
acquiring, analyzing, reporting, and disseminating data on the status and trends in R&D, and on the
science and engineering workforce, and science, technology, engineering, and mathematics (STEM)
education. (National Research Council 2014, p. 1).
While data quality has clear universal applicability, the other four recommendations in
Chapter 8 provided a framework for the recommendations in Chapters 2-7 of the report,
particularly on measuring innovation, on knowledge generation, networks and flows,
and on human capital activities. Although the framework for a strategic plan was created
for a specific agency, it could be applied to institutions that fund research on measuring
ST&I activities, and that collect data, produce indicators and publish analyses that are
used as part of the policy process.
This paper is organized in line with the recommendations in Chapter 8 of the Capturing
Change report. Discussion of pathways for building a strategic program of work
concludes this paper. The main conclusion is that there cannot be development of Blue
Sky indicators without a framework in which to put them.
The framework that is developed in this paper, while quite broad, is focused on the
current areas of interest of the NCSES. Given that the NCSES is a statistical agency, the
next step would be to relate the NCSES framework to that of the System of National
Accounts (SNA) (EC et al. 2009).2
Data Quality
Data quality is a principal priority for statistical agencies, though there are many
challenges to maintaining or improving standards. Increasing the reliability of existing
indicators and developing new indicators that reflect the changing global economy,
while resource constraints tighten, are critical factors affecting a statistical agency's
day-to-day activities. Chapter 7 of the Capturing Change report outlines alternatives for
developing or augmenting ST&I indicators from non-traditional sources, some of which
are already used by statistical agencies. Snijkers et al. (2014) also argue that
modernization of data generation and manipulation processes presents both opportunities
and challenges to the statistical system, including: globalization of the location of
innovation activities; increased availability of and demand for business practice data by
traditional users of outputs from statistical agencies; recent advances in technologies for
survey data collection; and multi-source/mixed-mode data collection strategies that have
been adopted by National Statistical Institutes.
Recommendation 8-1 of the Capturing Change report focuses on data quality measures
and the need to publish regularly these indicators from survey data or from other means
of collecting data. Formally:
Given the fundamental importance of data quality and reporting of
quality measures, the National Center for Science and Engineering
Statistics should review its data quality framework, establish a set of
indicators for all of its surveys, and publish the results at regular intervals
and at least annually. (National Research Council 2014, p. 103)
2
For a discussion of measuring innovation in all sectors of the SNA, see Gault (2015).
Accuracy, relevance, timeliness, and accessibility are all essential and interrelated data
quality elements for ST&I indicators (National Research Council 2014, pp. 101-103).
The Quality Diamond developed by Snijkers et al. (2014) places all four of these
elements under user evaluation (see Figure 1 below).
Figure 1: The Quality Diamond
A key point that the Panel made regarding data quality focuses on transparency in the
data dissemination process. There are few ST&I data sets that come with information on
the percentage of data that have been imputed as a result of respondents not answering
questions in surveys or failing to respond to the entire survey. The Panel found that
quality deficiencies should be identified and made transparent to data users. Doing so
would itself be an improvement in the quality of data statistics, an improvement in
data clarity. In addition, methods of improving the data should also be published along
with the data. Previous National Research Council reports (including Principles and
Practices for a Federal Statistical Agency, National Research Council 2013) advise on
the quality indicators that agencies should monitor and disseminate. Important
statistics to disclose are: unit nonresponse; item nonresponse; and population coverage.
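For illustration, the disclosure statistics just named can be derived directly from a survey micro-data file. The sketch below is hypothetical: the field name, frame size, and response counts are invented for the example, not drawn from any NCSES survey.

```python
# Illustrative computation of two of the quality indicators named above:
# unit nonresponse and item nonresponse. All names and data are hypothetical.

def quality_indicators(frame_size, responses, item):
    """responses: one dict per responding unit; a None value for `item`
    means the question was left unanswered (and would need imputation)."""
    unit_nonresponse = 1 - len(responses) / frame_size
    missing = sum(1 for r in responses if r.get(item) is None)
    item_nonresponse = missing / len(responses) if responses else 0.0
    return {"unit_nonresponse": round(unit_nonresponse, 3),
            "item_nonresponse": round(item_nonresponse, 3)}

# Example: a frame of 200 sampled firms, 150 responses,
# 30 of which left the R&D-expenditure question blank.
responses = [{"rd_expenditure": 1.0}] * 120 + [{"rd_expenditure": None}] * 30
print(quality_indicators(200, responses, "rd_expenditure"))
# {'unit_nonresponse': 0.25, 'item_nonresponse': 0.2}
```

Publishing figures of this kind alongside each release, as Recommendation 8-1 envisages, is what makes imputation and coverage limitations transparent to users.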
Indicators used to support evidence-based policy development or evaluation have
to meet quality standards, and that applies to existing indicators as well as to Blue Sky
indicators, however sourced. In addition, it is important to communicate data quality to
users in a manner that allows for comparison with other sources of ST&I statistics.
Transparency is one of the hallmarks of maintaining high-quality ST&I indicators that
are used to inform public policy decisions.
Data Linking and Sharing
Linkage of data sets is a recurring theme in the Capturing Change report. The second
programmatic recommendation, 8-2, deals with data sharing among agencies and the
linking of data sets to produce products that would not be possible if the agencies
worked independently (National Research Council 2014, p. 104). This is an important
recommendation because the resulting products can be relevant to policy and to
administrative objectives such as reducing respondent burden and improving value-for-money in an era of budgetary constraints.
As discussed in the previous section on data quality, linkage of survey data to business
practice data and unstructured data was recommended to support near-term estimates of
indicators that could provide more timely statistics for policymakers. Developing
statistical methodologies necessary to incorporate unstructured data into national
statistics was also discussed. With appropriate guidelines, these alternative statistics
could be used for near-term indicators and to augment survey-based data.
Furthermore, consider data produced by an innovation survey, which is a business
survey, being linked to data from industry surveys used to produce inputs to the
indicators produced by the System of National Accounts. Such a linkage would allow
comparison of the characteristics of innovative and non-innovative firms, by industry,
and, over time, the characteristics of high-growth innovative firms could be compared
with those of low-growth innovative firms. As different pictures emerged from the
analysis, policy measures could be considered to promote innovation-driven growth
and, therefore, jobs.
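The mechanics of such a linkage can be sketched in a few lines. Everything below is hypothetical: the firm identifiers, industry codes, and growth figures are invented, and a real linkage would involve matching keys across confidential micro-data rather than toy dictionaries.

```python
# Hypothetical sketch of the linkage described above: innovation-survey
# records joined to industry-survey records by a common firm identifier,
# then summarized by industry and innovation status.
from collections import defaultdict

innovation = {101: True, 102: False, 103: True}   # firm_id -> innovator?
industry_survey = [                                # firm_id, industry, turnover growth
    {"firm_id": 101, "industry": "C26", "growth": 0.12},
    {"firm_id": 102, "industry": "C26", "growth": 0.01},
    {"firm_id": 103, "industry": "J62", "growth": 0.30},
]

groups = defaultdict(list)
for rec in industry_survey:
    if rec["firm_id"] in innovation:               # the record-linkage step
        key = (rec["industry"], innovation[rec["firm_id"]])
        groups[key].append(rec["growth"])

for (industry, innovator), growth in sorted(groups.items()):
    print(industry, "innovators" if innovator else "non-innovators",
          round(sum(growth) / len(growth), 2))
```

Grouping the linked records by industry and innovation status is what makes the comparison of innovative and non-innovative firms, described in the paragraph above, possible.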
However, linked data sets can present in one place information about a firm that could
be commercially damaging in the wrong hands. The agencies responsible for the
measurement and the linkages have to be able to guarantee confidentiality of economic
data and privacy of personal data.
Policy in the United States is managed by many departments and agencies of
government. Data linkage across agencies is a programmatic recommendation of the
Capturing Change report. By engaging in this activity of coordination and collaboration,
the policy needs of agencies can be discerned and more effective statistical outputs
produced. This is a challenge in the United States, and may indeed be impossible to
replicate in other countries. However, the resulting policy development and
implementation in the United States could provide a direction for indicator work leading
to international comparison.
Methodological Research
The 2010 America COMPETES Reauthorization Act charged NCSES with supporting
research that uses its data and with improving its statistical methods. In that vein, the
third recommendation of the Capturing Change report, 8-3, deals with increasing the
knowledge needed to produce and use indicators, and it is specific to the NCSES. The
recommendation suggests that NSF use its capacity to provide and manage grant and
fellowship programmes in order to support methodological research relevant to the
needs of the NCSES.
For example, support of research on methods of linking different types of data from
administrative records or from unstructured data sources would be a productive
investment of these funds, as it would provide a public good for broader use at other
statistical agencies. While this would work for the NCSES, which is a statistical office
embedded in NSF, a research funding agency, it would require considerable
coordination in other countries where data gathering agencies are separate from research
funding agencies. However, as science, technology and innovation are seen as priorities
for the economy and the society in most countries, interagency collaboration could be
seen as an integral part of science, technology and innovation policy.
Krosnick et al. (2015) summarize the proceedings of two NSF workshops on the future
of survey research. Though mainly focused on the American National Election Studies,
the Panel Study of Income Dynamics, and the General Social Survey, researchers
discussed issues encountered when linking data from a variety of sources. Joseph
Sakshaug's paper, reported in the summary by Krosnick et al. (2015) and entitled
Linking Survey Data to Official Government Records, offered areas where further
research is needed to improve statistical processes for linking data sets. The Capturing
Change report, for its part, notes the breadth of nontraditional sources already in use,
including:
Billion Prices Project data; retail scanner data; the J.D. Power and
Associates used car frame; stock exchange bid and ask prices and trading
volume data; universe data on hospitals from the American Hospital
Association; diagnosis codes from the Agency for Healthcare Research
and Quality used to develop the producer price index; Energy
Information Agency administrative data on crude petroleum for the
International Price Program; Department of Transportation
administrative data on baggage fees and the Sabre data, used to construct
airline price indices; insurance claims data, particularly Medicare Part B
reimbursements to doctors, used to construct health care indices; and
many more sources of administrative records data from within the U.S.
government, as well as web-based data. (National Research Council
2014, p. 88)
Horrigan (2013) cautions, however, that statistically determined weights are necessary
to ensure that bedrock statistics are produced, particularly when blended data sources
are used.
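Horrigan's point can be illustrated with a minimal, hypothetical example: when survey responses are blended with administrative records, each record carries a weight so that the combined estimate still refers to the target population, rather than over-representing whichever source happens to contribute more records. The values and weights below are invented for the example.

```python
# Minimal, hypothetical illustration of weighting blended data sources.
# Without weights, the larger source would dominate a naive average.

def blended_mean(records):
    """records: (value, weight) pairs from any mix of sources."""
    total_weight = sum(w for _, w in records)
    return sum(v * w for v, w in records) / total_weight

survey = [(10.0, 50.0), (12.0, 50.0)]   # survey responses with design weights
admin = [(11.0, 25.0), (9.0, 25.0)]     # administrative records, calibrated weights
print(blended_mean(survey + admin))     # weighted estimate, not a naive average
```

In practice the weights would be statistically determined (for example, by calibration to known population totals); the arithmetic above only shows where they enter the estimate.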
The Panel envisioned NCSES and other statistical agencies publishing ST&I indicators
that are based on traditional surveys, administrative records and web-based sources, all
of which will rely on methodological advances from the community of practice. The
Panel recommended that NCSES, together with the user community, develop ways of
structuring unstructured data, dealing with challenges of scale, negotiating access to
data while protecting sensitive data, and validating nontraditional data sources; these
challenges are common to many potentially useful but nontraditional datasets. All of
this will require financial and human resources, and leadership that is willing to work
across organizations to effect change.
Position of Chief Analyst
The fifth recommendation, 8-5, addresses management needs for dealing with data
quality, user liaison to ensure that the policy community is being well served, advising
the NCSES of changes that are likely to have impact on the statistical programme, and
the assessment of new datasets and tools that could be used to develop and to improve
indicators. In other words, the agency needs a curator and strategic decision-maker for
indicator development. This recommendation, like others in the Capturing Change
report, is specific to NCSES. It is presented here because it emphasises analysis of
statistical data, in addition to its collection, tabulation and dissemination, and user
consultation to ensure that the indicators produced are relevant to user needs. It also
proposes a foresight or horizon-scanning function to anticipate new types of data and
tools for managing data that could change the way in which the statistical system
operates. These are issues of significance in any statistical system.
This is an important function in development of ST&I indicators, particularly since there
is high demand for indicators by policymakers and demand for data by the research
community. This has international scope. While strengthening its analytical capabilities,
the agency must improve its abilities to: gauge the changing international environment
of innovation activities and evolving methods for developing new, timely metrics; and
anticipate changing demand for ST&I indicators, agilely prioritizing the release of
indicators that address current issues.
Pathways for Building a Strategic Program
Turning to new indicators, the Capturing Change report calls for the measurement of
innovation outcomes (Recommendation 4-1), which could include measures of the four
types of innovationproduct, process, organisational, and marketingalong with the
use of intellectual property instruments directly related to innovation, and the percentage
of turnover attributed to product innovation. There are others, but there is no reliable set
of such indicators that follows directly from the activity of innovation. There are many
indicators of intellectual property, but these are not linked directly to innovation.
These are matters for consideration as part of the process of revision of the Oslo Manual
(OECD/Eurostat 2005).
Tracking general purpose technologies (Recommendation 5-4) using the NCSES
Business Research and Development and Innovation Survey would produce indicators
of use and planned use of information and communications technology, biotechnology,
nanotechnology, and green technologies. Once established, the same approach could be
applied to organizational technologies like knowledge management and innovation
management.
Recommendations in Chapters 6 and 7 lean heavily on the need for more methodological
research: data acquisition using administrative records and possibly some of the more
experimental methods, such as web scraping. The Panel discussed business practice data
that are produced day-to-day at firms and other types of institutions. Another gap in
indicator development is related to longitudinal data sets which can be used to produce
indicators of business and human resource activity (Recommendations 6-1 and 6-2) over
time. An example is the tracking of doctoral recipients through their career path. There
is much work to be done to understand how to use these methods. That is why using the
grants program to get input from the research community could improve the quality of
ST&I indicators. Such indicators produced and used in the United States for policy
purposes would provide leadership for the development of internationally comparable
indicators in the rest of the world. This activity could also be relevant to the
implementation of the recently revised Frascati Manual which deals with the
measurement of research and development (OECD 2015).
Data linking is not easy by any means, but developing new measures based on linked
data over time would be a powerful deliverable for NCSES, which has as part of its
mandate the clearinghouse3 function among statistical agencies regarding science and
engineering statistics.
3
The America COMPETES Act states that: (a) Establishment. There is established within the
Foundation a National Center for Science and Engineering Statistics that shall serve as a central Federal
clearinghouse for the collection, interpretation, analysis, and dissemination of objective data on science,
engineering, technology, and research and development.
The main conclusion is that there cannot be development of Blue Sky indicators without
a framework in which to put them. The framework discussed here, while it is developed
for the NCSES, has wider applicability.
References
EC, IMF, OECD, UN and the World Bank (2009), System of National Accounts, 2008, New
York: United Nations.
Snijkers et al. (2014), paper presented at Q2014, the European Conference on Quality
in Official Statistics, 2-5 June 2014, Vienna, Austria.