
Use and relevance of Web 2.0 resources for researchers


European Bioinformatics Institute
Mendeley
Open Knowledge Foundation
Science and Technology Facilities Council
SPARC
UKOLN

1. Executive Summary
The application of Web 2.0 tools to the practice of research is an area with immense promise but
where evidence of real value is limited. We have assembled a team with a wide range of experience
in developing, using, and critically analysing such services. The team is deeply embedded within
the community that utilises and builds these services. However, this community remains small and
unrepresentative of the research community at large. We are therefore interested in examining both
the successes of these approaches and the reasons for lack of adoption.

We will undertake four main activities to qualitatively and quantitatively analyse the extent of use
of Web 2.0 tools in research.

1. First, a literature review and aggregation of research material will be carried out to define
the state of the art internationally.
2. Second, qualitative interviews and case studies will be used to identify common themes in
successful and unsuccessful applications of Web 2.0 approaches, and barriers both perceived
and real.
3. Along with the literature review, this will form the basis for designing and carrying out the
third activity: a large-scale empirical study. The resulting data will be analyzed using a
structural equation modeling approach, which will allow us to go beyond a qualitative,
anecdotal, or phenomenal understanding. It will enable us to quantify the strength of the
effect of each promoter and inhibitor of the adoption of Web 2.0 tools, as well as the relative
importance of the factors vis-à-vis each other. This can lead to prescriptions as to which
inhibitors to tackle first, and which promoters to focus on. It will also provide empirical
evidence as to the degree to which the use of Web 2.0 tools influences scholarly communication.
4. Fourth, because the self-reported survey is likely to overestimate the use of Web 2.0 tools
due to self-selection bias, we will validate our data externally by undertaking a
sampling-based survey. This will involve searching a range of Web 2.0 services for scientists
for a randomised list of researchers drawn from a range of disciplines and environments.
As this approach will systematically underestimate the use of these tools, we will be able to
establish upper and lower bounds on the actual use of Web 2.0 tools in research.

2. Background
In recent years much has been made of the potential of Web 2.0 tools, applications and services
to transform the way research is performed and disseminated. There are numerous high profile
examples of technologies that facilitate effective collaboration and working practices that could
make the lives of researchers easier - from communication and messaging tools to collaborative
authoring, public review and rating sites, from social networking services to community driven
tools for resource discovery. Twitter can provide instant updates of what is happening in the lab.
Google Docs provides an environment for writing papers that can solve the problems inherent in
emailing around documents. Wikis can capture and preserve the collective expertise of a research
group. Digg-like mechanisms could replace peer review with social networking sites providing a
"social search" mechanism, bringing the research you need to be aware of to your attention as well
as the opportunity to find new collaborators, as and when you need them.

There are exemplars showing that all of these approaches are possible, and that they can offer an
improvement over traditional approaches. However for every success there are many failures, and
scratching beneath the surface, you will find many of the same names re-appearing in these
examples. The degree to which these approaches have penetrated general academic practice appears
to be extremely limited. Broadly, there are three reasons why this might be the case. First, adoption
could simply be expected to be slow as practice in research does not change rapidly, systems need
to change, tools need to be built. Second, there may be specific cultural or social reasons why these
tools are not appropriate or are perceived as not appropriate. This may be because the wrong tools
are being built, or because scientists brought up in the pre-web age are unable to "get it",
requiring a new generation of scientists (the so-called "Google Generation") to exploit
them effectively. Finally, it is possible that these tools simply do not, on average, work well in
research settings.

The overall picture is likely to be complex and a combination of factors. We have therefore elected
to apply a model-based approach to disentangle the contribution of these different factors, both to
uptake and to intention to adopt. We will apply a qualitative approach to identify web applications
and services to enable the design of a quantitative survey which will be used to probe the
relationships between model components (see methodology).

3. The Team
We have assembled a team with a wide expertise in Web 2.0 technologies and their application to
research. The team includes both commercial and academic developers, users and analysts, as well
as community and policy advocates. The team is strongly motivated to take a close and critical look
at the effectiveness of such tools and inhibiting factors concerning usage in research - as developers,
as advocates, and as researchers.

Collectively, the team has a broad range of experience of research 'on the ground' in different
domains - from experimental and computational sciences, to the humanities and social sciences. A
fine-grained understanding of research environments in different domains will be particularly
advantageous when performing interviews and building case studies, and in designing a larger scale
empirical study which will allow meaningful inter-disciplinary comparison.

We have a rich perspective on the growth and adoption of new internet technologies in different
areas - from policy analysts, funding bodies, commercial developers, government, academia and the
technical community. Hence, debates about the adoption, transformational potential, network effects
and perceptions of Web 2.0 tools are familiar territory and we are keen to develop a richer,
evidence-based picture.


4. Key Deliverables
The key deliverables for the project will be:
• Literature review
• Transcripts, record and analysis of qualitative interviews
• Design of model for quantitative survey
• Results and analysis of quantitative survey
• Anonymised data and analysis of "adoption sampling" survey of service penetration
• Summary report and website to present data and findings

Moving forward we propose publishing the report and associated material on a website with
recommendations and summaries of key findings catered to different stakeholders, including
researchers, institutions, funders, software developers, service providers, librarians, information
professionals and publishers. In the longer term we anticipate this could act as a central point of
reference for parties interested in utilising or developing Web 2.0 tools for research.

There have been numerous surveys on Web 2.0 and its usage and relevance in different domains,
from education to libraries and archives. The JISC-funded SPIRE project is a relatively recent
example of British funded research in this area. Our project would go beyond this kind of work in
that we hope to develop a compelling and dynamic evidence base that could be explored and
contributed to - giving a thorough and up-to-date overview of existing usage of Web 2.0 tools (by
harnessing existing networks, communities and publicity channels), as well as giving a
representative picture of UK research as a whole (overcoming selection bias with survey incentives
and adoption sampling).

5. Theoretical background and model development


The overarching theoretical framework for this study will be the Unified Theory of Acceptance and
Use of Technology (UTAUT, Venkatesh et al. 2003). The UTAUT was designed to explain the
Behavioral Intentions to use and the Use Behavior of information systems. The theory holds that
four key constructs are direct determinants of usage intention and behaviour:

• Performance Expectancy: The degree to which an individual believes that using the system
will help him or her to attain gains in job performance.
• Effort Expectancy: The degree of effort an individual associates with the use of the system.
• Social Influence: The degree to which an individual perceives that important others believe
he or she should use the information system.
• Facilitating Conditions: The degree to which an individual believes that an organizational
and technical infrastructure exists to support use of the system.

The effect of these four key constructs on usage intention and behaviour is moderated by Gender,
Age, Experience (with the system), and Voluntariness of Use, i.e. these moderating variables specify
when the effects of the key constructs on the dependent variables will be weaker or stronger (Baron
and Kenny 1986).
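The moderation logic can be made concrete with a small simulation. The sketch below is illustrative only - the variable names and effect sizes are invented, not drawn from any study - but it shows the standard way such moderation is tested: an ordinary least squares regression with an interaction term between a key construct and its moderator.

```python
import numpy as np

# Illustrative sketch only: testing moderation via an interaction term.
# "pe" stands in for Performance Expectancy; effect sizes are invented.
rng = np.random.default_rng(0)
n = 2000
pe = rng.normal(size=n)    # performance expectancy (standardised)
age = rng.normal(size=n)   # moderator (standardised age)
# Simulated truth: the effect of pe on intention weakens as age increases.
intention = 0.6 * pe - 0.3 * pe * age + rng.normal(scale=0.5, size=n)

# Fit intention ~ pe + age + pe:age by ordinary least squares.
X = np.column_stack([np.ones(n), pe, age, pe * age])
beta, *_ = np.linalg.lstsq(X, intention, rcond=None)
b0, b_pe, b_age, b_inter = beta
# A non-zero interaction coefficient b_inter indicates moderation:
# the pe -> intention effect at a given age is b_pe + b_inter * age.
```

A significantly negative interaction coefficient here would be read as "the effect of performance expectancy on intention is weaker for older respondents", which is exactly the kind of statement the moderated UTAUT model licenses.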

The UTAUT was developed through a review and empirical consolidation of eight models which
had previously been employed to explain information systems usage. It is most closely related to
the Theory of Reasoned Action (Fishbein and Ajzen 1975) and its extension, the Theory of Planned
Behaviour (Ajzen 1991). These are two of the leading theories of action in the social sciences,
which form the theoretical basis of over 800 studies published in the PsycINFO and Medline
databases (Francis et al. 2004).

In a longitudinal study which applied the UTAUT to research software adoption in the financial
services industry and customer service management software adoption in the retail electronics
industry, the R² of Behavioral Intentions and Use Behaviour reached .77 and .53, respectively
(Venkatesh et al. 2003). That is, the model was able to explain 77% and 53% of the statistical
variance of these dependent variables, which indicates a very good model fit and external
(predictive) validity of the theory.
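For readers unfamiliar with the statistic, R² is simply the share of the variance of the dependent variable that the model's predictions account for. A minimal sketch with toy numbers (not data from the cited study):

```python
import numpy as np

# Toy illustration of R-squared as "variance explained".
y = np.array([2.0, 3.5, 5.1, 6.4, 8.2])      # observed values
y_hat = np.array([2.2, 3.3, 5.0, 6.8, 7.9])  # model predictions
ss_res = np.sum((y - y_hat) ** 2)            # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)         # total sum of squares
r2 = 1 - ss_res / ss_tot                     # close to 1 = good fit
```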

By itself, the UTAUT will form the basis for understanding the reasons, facilitating conditions and
inhibitors of adoption of Web 2.0 tools for researchers. However, in order to also understand the
implications for scholarly communication, the model needs to be extended. Thus, the Use
Behaviour of Web 2.0 tools for researchers is hypothesized to influence Sharing and Re-Use
Behaviour of scientific data, Discovery Techniques of scientific data and literature, Publication
Behaviour, and Research Findings Communication Behaviour.

6. Model operationalization
In order to operationalize the constructs and adapt the theory’s measurement model to the context of
the present study, we will perform triangulation, i.e. a qualitative enquiry which incorporates three
different viewpoints and methods.


First, we will draw on the combined expertise of the research team. The team members, who are
active researchers in the humanities, and the natural, social and library sciences, as well as
practitioners and promoters of Open Science, digital curators and architects of Web 2.0 tools for
researchers, will have in-depth exploratory discussions on the meaning of each of the constructs in
the model to achieve a common understanding of them. Second, we will review existing literature,
weblog postings, essays and online discussion on Web 2.0 tools for researchers, with specific
attention paid to reasons for and inhibitors to adoption, and implications for scholarly
communication behaviour. Third, we will conduct a limited number of semi-structured, qualitative
interviews with members of our target group. Interviewees will be selected to represent a
heterogeneous spread across disciplines, affiliations (academic, corporate, government, ...) and
other demographic criteria.

The insights generated by this triangulation will be used as a pre-test of the internal validity of our
theoretical model, to find ways to quantify the Scholarly Communication measures, and finally to
formulate the wording of the scale items of the measurement model. For example, as a result of this
triangulation, Performance Expectancy could be a multi-item construct measured as a perceived
gain (or decrease) in scholarly merit, and as a perceived improvement in access to high-quality
information; Facilitating Conditions could be a multi-item construct measured as the perceived
support and training provided by the researcher's library, the financial cost of using these tools,
and the IT budget available to the researcher.

7. Empirical study and quantitative analysis


Following the model operationalization, we will conduct a large-scale empirical study. The rationale
for doing this is that it will enable us to perform statistical analysis of the collected data. Using a
structural equation modeling approach (described further below) will allow us to go beyond a
qualitative, anecdotal, or phenomenal understanding. It will enable us to quantify the strength of the
effect of each promoter, inhibitor and facilitator of the adoption of Web 2.0 tools, as well as the
relative importance of the factors vis-à-vis each other. This can lead to prescriptions as to which
inhibitors to tackle first, and which promoters and facilitators to focus on. It will also provide
empirical evidence as to the degree to which the use of Web 2.0 tools influences scholarly
communication.

The study design will be online-survey based and cross-sectional. In general, a longitudinal study
design would be preferable to a cross-sectional one for studies like these, as it would measure the
intra-individual changes in the independent variables and their effects on the dependent variables
over time, instead of differences between respondents and the effects of these differences. A
longitudinal design would thus be better at ruling out the effects of “hidden moderator variables”
that could account for the differences between respondents in a cross-sectional design. However,
based on our experience, we feel that the time frame for the empirical study of (at most) six months
is realistically not long enough to capture both significant changes in the respondents' attitudes
towards and use of Web 2.0 tools, and the effects of these changes on long-term variables like
publication and communication behaviour. Thus, the aforementioned qualitative triangulation will be
even more important for detecting potential hidden variables before conducting the empirical survey.
Following on from this study, the team will investigate the feasibility of carrying out such a
longitudinal study based on the results of the current programme.


The sampling technique will be a stratified random sample, with participants randomly filtered to
achieve representative quotas for different academic disciplines, age groups, gender, and
affiliations. Using appropriate incentives will be instrumental in securing a sufficient response rate.
Moreover, the wording for the survey invitations will have to be general enough (and not refer
specifically to “Web 2.0 tools for researchers”) so as not to create a self-selection bias, in which
researchers who are a priori more interested in these tools than the average researcher are strongly
overrepresented in the sample. To control for such a self-selection bias, we will perform a second
“adoption sampling” study: We will select a smaller, random representative sample of researchers
who have not participated in the survey, and search for these people on the Web 2.0 resources
included in the study to obtain a real-world measure of adoption. Because some people do not use
their real names on these services, or elect to keep their profiles hidden, this "adoption sampling"
approach will systematically underestimate the "true" adoption, whereas the "self-reported usage"
will systematically overestimate it. Together, these two data points will help to establish a
minimum vs. maximum extent of use. They will also provide a rich database for descriptive
statistics on the demographics, disciplines, and subject areas of the researchers who use
these Web 2.0 tools.
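The bounding logic can be stated in a few lines. The figures below are entirely hypothetical and serve only to show how the two surveys combine:

```python
# Hypothetical figures for illustration only.
found_on_services = 42     # adoption sample: researchers located on the services
adoption_sample_size = 300
self_reported_users = 450  # survey respondents reporting Web 2.0 tool use
survey_sample_size = 900

# Adoption sampling underestimates (hidden or pseudonymous profiles are
# missed); self-reporting overestimates (interested researchers self-select).
lower_bound = found_on_services / adoption_sample_size  # 14%
upper_bound = self_reported_users / survey_sample_size  # 50%
# "True" adoption lies somewhere between lower_bound and upper_bound.
```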

The relationship between the constructs in our extended UTAUT model will then be analysed using
structural equation modeling techniques. Specifically, we will be using Partial Least Squares (PLS)
analysis (Fornell and Cha 1994; Chin 1998). PLS allows a simultaneous testing of hypotheses,
taking indirect and moderating model effects into account. When compared to covariance-based
structural equation modeling approaches such as LISREL and AMOS, PLS enables single-item
measurement as well as multi-item measurement and the modeling of constructs as either reflective
or formative. As a distribution-free method, PLS has fewer constraints and statistical specifications
than covariance-based techniques.
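As a rough intuition for what PLS-style estimation does, the snippet below is a heavily simplified stand-in, not actual PLS: the construct name, loadings and effect sizes are invented, and a real analysis would use dedicated PLS software. It shows the core idea: score a latent construct as a weighted combination of its reflective indicators, then estimate the structural path between constructs from those scores.

```python
import numpy as np

# Simplified stand-in for PLS path estimation (illustration only):
# score a latent construct from its reflective indicators, then
# estimate one structural path as a standardised regression weight.
rng = np.random.default_rng(1)
n = 500
latent = rng.normal(size=n)  # unobserved "performance expectancy"
# Three reflective survey items loading on the construct, plus noise.
items = latent[:, None] * np.array([0.8, 0.7, 0.9]) \
        + rng.normal(scale=0.4, size=(n, 3))

# Construct score: first principal component of the centred items.
centred = items - items.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
w = vt[0] if vt[0].sum() > 0 else -vt[0]  # fix the arbitrary PC sign
score = centred @ w

# Structural path: intention depends on the latent construct.
intention = 0.5 * latent + rng.normal(scale=0.6, size=n)
score_std = (score - score.mean()) / score.std()
int_std = (intention - intention.mean()) / intention.std()
path = float(np.mean(score_std * int_std))  # standardised path coefficient
```

The standardised path coefficient recovered this way is attenuated by measurement noise in the items; proper PLS iterates the indicator weights to mitigate exactly this, which is one reason it is preferred over simple composite scores.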

The model will then also be calculated for sub-samples, divided by different classes of Web 2.0
tools. This means that we can pinpoint the promoters, inhibitors, and facilitating conditions for
specific Web 2.0 tools (e.g. only social networks, or only folksonomy/tagging/literature
management tools), and the effect of only these tools on scholarly communication behaviour vis-à-
vis other tools.

References

● Ajzen, Icek (1991), "The Theory of Planned Behavior," Organizational Behavior and
Human Decision Processes, 50, 179-211.
● Baron, Reuben M. and David A. Kenny (1986), “The Moderator-Mediator Variable
Distinction in Social Psychological Research: Conceptual, Strategic, and Statistical
Considerations,” Journal of Personality and Social Psychology, 51 (December), 1173-1182.
● Chin, Wynne W. (1998), "The Partial Least Squares Approach for Structural Equation
Modeling," in Modern Methods for Business Research, George A. Marcoulides ed. New
York: Lawrence Erlbaum Associates, 295-336.
● Fishbein, Martin and Icek Ajzen (1975), Belief, Attitude, Intention and Behavior: An
Introduction to Theory and Research. Reading, MA: Addison-Wesley.
● Fornell, Claes and Jaesung Cha (1994), “Partial Least Squares,” in Advanced Methods of
Marketing Research, Richard P. Bagozzi, ed. Cambridge: Blackwell, 52–78.
● Francis, Jillian J., Martin P. Eccles, Marie Johnston, Anne Walker, Jeremy Grimshaw, Robbie
Foy, Eileen F.S. Kaner, Liz Smith, and Debbie Bonetti (2004), Constructing Questionnaires
Based on the Theory of Planned Behaviour: A Manual for Health Services Researchers.
University of Newcastle.
● Venkatesh, Viswanath, Michael G. Morris, Gordon B. Davis and Fred D. Davis (2003),
“User Acceptance of Information Technology: Toward a Unified View”, Management
Information Systems Quarterly, 27 (September), 425-478.

8. Management
For logistical reasons the project will be managed by Mendeley Ltd who will provide the financial
arrangements and act as the contracting party for the research programme. The remainder of the
team will be managed via subcontracts from Mendeley to the relevant person or organisation.

The project is both multisite and multidisciplinary and will require communication and transfer of
data and analysis between research workers on several sites. Data integrity and management will be
a key aspect of the project and we have extensive expertise in this area. The investigators and
research workers will meet three times over the course of the project and group meetings will be
held monthly via video conference. Progress towards specific objectives on the full project plan (see
work plan) will be the basis for monitoring the status of the project. Progress will be monitored by
short written reports presented at the monthly meetings and made public on the website as part of
the record of the project (see above).

Day to day communication will be managed via a (public) messaging stream that will additionally
provide part of the record of the conduct of the project. FriendFeed (www.friendfeed.com) will be
used to aggregate messages, material, and documents relevant to the project as it proceeds, and this
material will then be re-aggregated to the project website.

9. Risk Assessment
The main areas of risk fall into three categories: technical or logistical inability of the team to
deliver the project on time or within budget (management failure); insufficient collection of data
due to poor uptake on the main survey or difficulties in obtaining subjects for the qualitative
interviews (recruitment failure); a risk of not providing sufficient new insights due to repeating or
complementing previous work in the area (scope failure).

Management failure: We have budgeted a significant sum for researcher time on this project as that
will be the key resource required to deliver effectively and on time. We have also explicitly
included a budget line to buy out a proportion of CN’s time for project management and
coordination. CN has significant experience in managing multidisciplinary and multi-site projects.
Strong reporting and communication protocols are envisaged as well as scheduled face to face
meetings throughout the project. In addition an advantage of carrying out the project “in the open”
is that the material collected will still remain available if the project fails for unforeseen
circumstances. The key researchers (JG, VH, NH) are all available for significant blocks of time
within the next six months.

Recruitment failure: A significant challenge in obtaining survey data on penetration of services is
recruiting a sufficient number of survey subjects who do not use these services. By definition they
will be difficult to contact via our existing set of contacts. It is for this reason that we have
budgeted a significant sum for survey incentives and prizes. This constitutes a small sum for each
survey participant (in the form of online vouchers) as well as a small number of relatively high
value prizes. We will mitigate the risks by using access to institutional and subject based mailing
lists, personal contacts at a range of levels, as well as utilising available social network services.
The team is deeply embedded within the community that utilises and builds these web services.
This provides us with contact both with advocates and successful users and with sceptics. We will
use all of these contacts to obtain the best possible set of survey and interview participants.

Scope failure: We will conduct an in depth “due diligence” literature survey as the first part of the
research programme. This will provide a much greater depth of understanding of previous work that
has been carried out. The team is well versed in the current literature but not yet at sufficient depth
to fully optimise the research programme to provide complementary outcomes. Nonetheless our
proposed programme is unique in a number of ways. We will focus on the UK research community.
Our programme is based around a structural equation modelling approach over a wide range of
research disciplines. Most previous work, both in the UK and internationally, has been qualitative,
and focussed on a specific discipline or even department. The proposed sampling survey will, to our
knowledge, be unique, as will the collection of case studies. Further we plan to engage the
community in commentary through the website which will provide an ongoing and unique resource
in the longer term. As noted, the preferred methodology to address the questions posed would be a
longitudinal study. The team will actively seek funding to enable us to continue the project so as to
carry out such a study.

10. Budget
We estimate that we would require £70,000 in order to complete the project as planned. This is
£20,000 less than the proposed budget.

• Tasks and personnel:
• Project co-ordinator (1 day/month for 8 months) - £2k
• Communications and publicity - £3k
• Literature review (1 month) - £9k
• Qualitative survey preparation and analysis (1 month) - £10k
• Planning and execution of quantitative survey (1.5 months) - £10k
• Analysis of quantitative survey (1 month) - £8k
• Sampling survey (0.5 months) - £4k
• Project write up (1 month) - £9k
• Survey incentives/prizes - £5k
• Travel and accommodation for co-ordination meetings - £5k
• Web hosting - £3k
• Printing - £2k
• Total: £70k


11. Team CVs

Cameron Neylon

Senior Scientist, Biomolecular Sciences, Science and Technology Facilities Council

Blog: http://blog.openwetware.org/scienceintheopen
Online: http://openwetware.org/wiki/Users:Cameron_Neylon

Cameron Neylon (STFC Rutherford Appleton Laboratory) is a biophysicist, an internationally
recognised advocate of open practices in research and a developer and critic of web based tools for
supporting research. After undergraduate studies in metabolic biochemistry he pursued a PhD in
molecular biology and protein biophysics at the Research School of Chemistry, ANU, before
continuing work as a molecular biologist within the Chemistry Department at Bath. In 2001 he took
up the position of Lecturer in Combinatorial Chemistry at the University of Southampton and in
2005 he commenced a joint appointment (80%) as Senior Scientist in Biomolecular Sciences at the
ISIS Pulsed Neutron and Muon Facility. Dr Neylon has extensive experience of managing and
coordinating multi-disciplinary and inter-disciplinary projects on a range of scales.

Through involvement in the development of web based electronic laboratory recording systems Dr
Neylon has become a well known advocate and speaker on the subject of web based tools in
research, and an advocate of Open Access publication and Open Science more generally. His
research group is working towards making the entirety of their research record available online, as it
is recorded. He has given four invited talks at international meetings in the area in the past twelve
months.

Relevant online and non-peer reviewed material

1. Bradley JC, Neylon C, Data on Display: An interview by Katherine Sanderson, Nature,
455(7211), 273, 2008 - An interview recorded following a talk at Nature Publishing Group
that appeared in Nature in October 2008
2. Wu S, Neylon C, Open Science: Tools, approaches, and implications. Available from Nature
Precedings <http://dx.doi.org/10.1038/npre.2008.1633.1> (2008)
3. Neylon C, A personal view of Open Science. Available at Science in the Open
<http://blog.openwetware.org/scienceintheopen/2008/09/30/a-personal-view-of-open-
science-part-i/> (2008) - A four part essay
4. Neylon C, Science in the open or How I learned to stop worrying and love my blog, Science
in the 21st Century, Perimeter Institute, Waterloo, September 2008, Perimeter Institute
Seminar Archive #08090038 <http://pirsa.org/08090038/>

Peer-reviewed publications

1. Guppy M, Abas L, Neylon C, Whisson ME, Whitham S, Pethick DW, Niu X, Fuel choices
by human platelets in human plasma, Eur. J. Biochem. 244, 1997, 161-167
2. Neylon C, Brown SE, Kralicek AV, Miles CS, Love CA, Dixon NE, Interaction of the
Escherichia coli Replication Terminator Protein (Tus) with DNA: A Model derived from
DNA-binding studies of mutant proteins by surface plasmon resonance, Biochemistry, 39,
2000, 11989-11999
3. Wood RJ, Pascoe DD, Brown ZK, Medlicott EM, Kriek M, Neylon C, Roach PL, Optimised
conjugation of a fluorescent label to proteins via intein mediated activation and ligation,
Bioconjugate Chemistry, 15, 2004, 366-372
4. Neylon C, Chemical and biochemical strategies for the randomisation of protein encoding
DNA sequences: Library construction methods for directed evolution, Nucleic Acids
Research, 32, 2004, 1448-1459
5. Neylon C, Kralicek AV, Hill TM, Dixon NE, Replication termination in E. coli: Structure
and anti-helicase activity of the Tus-Ter complex, Microbiology and Molecular Biology
Reviews, 69, 2005, 501-526
6. Kriek M, Clark IP, Parker AW, Neylon C, Roach PL, Simple setup for Raman Spectroscopy
of microvolume frozen samples, Review of Scientific Instruments, 76, 2005,
104301-104303
7. Whiteford N, Haslam N, Weber G, Prügel-Bennett A, Essex JW, Roach PL, Bradley M,
Neylon C, An analysis of the feasibility of short read sequencing, Nucleic Acids Research,
33, 2005, e171
8. Weber G, Whiteford N, Haslam N, Prügel-Bennett A, Essex JW, Neylon C, Thermal
equivalence of DNA duplexes without calculating melting temperature, Nature Physics, 2,
2006, 55-59
9. Mulcair M, Schaffer P, Cross HFC, Neylon C, Hill TM, and Dixon NE, A Molecular
Mousetrap Determines Polarity of Termination of DNA Replication. Cell, 125, 2006,
1309-1319.
10. Cavalli G, Banu S, Ranasinghe RT, Broder GR, Martins HFP, Neylon C, Morgan H, Bradley
M, Roach PL, Multistep synthesis on SU-8: Combining microfabrication and solid-phase
chemistry on a single material. Journal of Combinatorial Chemistry, 9, 2007, 462-472
11. Chan L, Cross HF, She JK, Cavalli G, Martins HFP, Neylon C, Covalent attachment of
proteins to solid supports and surfaces via Sortase-mediated ligation, PLoS ONE 2(11),
2007, e1164 doi:10.1371/journal.pone.0001164
12. Neylon C, Small angle neutron and X-ray scattering in structural biology: Recent examples
from the literature, European Biophysics Journal, 37(5), 2008, 531-541
13. Broder GR, Ranasinghe RT, She JK, Banu S, Birtwell SW, Cavalli G, Galitonov GS,
Holmes D, Martins HFP, Neylon C, Zheludev N, Roach PL, Morgan H, Diffractive micro-
barcodes for encoding of biomolecules in multiplexed assays, Analytical Chemistry 80(6),
2008, 1902-9
14. Whiteford N, Haslam NJ, Weber G, Prügel-Bennett A, Essex JW, Neylon C, Visualising the
repeat structure of genomic sequences, Complex Systems 17(4), 2008, 381-398
15. Haslam NJ, Whiteford NE, Weber G, Prügel-Bennett A, Essex JW, Neylon C, Optimal
Probe Length Varies for Targets with High Sequence Variation: Implications for Probe
Library Design for Resequencing Highly Variable Genes, PLoS ONE 3(6), 2008, e2500
doi:10.1371/journal.pone.0002500
16. Teixeira SCM et al., New sources and instrumentation for neutrons in biology. Chemical
Physics 345(2-3), 2008, 133-151
17. Telling MTF, Neylon C, Kilcoyne SH, Arrighi V, An-harmonic behaviour in the multi-
subunit protein apoferritin as revealed by quasi-elastic neutron scattering, J Phys Chem B,
112(35), 2008, 10873-8


Victor Henning

Founder & Director, Mendeley

Doctoral Student, Bauhaus-University of Weimar

Blog: http://www.mendeley.com/blog
Online: http://www.mendeley.com/profiles/victor-henning

Victor Henning is an empirical social scientist and a co-founder of Mendeley, a Web 2.0 tool for
researchers. After completing his MBA in 2004, he became a lecturer and doctoral student at the
School of Media at the Bauhaus-University of Weimar. His PhD research on the role of emotions in
decision-making was funded by the German Federal Ministry of Education and Research and the
Foundation of the German Economy (SDW). In July 2005, for a paper studying the antecedents and
consequences of file-sharing technology adoption for the film industry, he won the Best Conference
Paper Award at the largest academic conference in the field of marketing, the AMA's Summer
Conference in San Francisco.

Since 2007 he has also been the co-founder and director of Mendeley, a combined cross-platform desktop
software and website which helps researchers manage and share research papers. Information on
research paper usage and tagging is anonymously aggregated on Mendeley Web to create an open,
semantic database of research papers, research statistics, and (in the future) reading
recommendations. Mendeley is funded by some of the key personnel who built Skype and Last.fm.
Victor Henning's main responsibility at Mendeley is the conceptual design of the entire application,
as well as keeping close contact with interdisciplinary academic communities to better understand
its requirements for software/web tools. As such, he has been invited to give talks at international
academic conferences and institutions such as Princeton University, Cornell University, New York
University, University of Bath, and Cold Spring Harbor Laboratory.

Relevant peer-reviewed publications:

1. Henning V, Reichelt J, Mendeley - A Last.fm for Research?, Proceedings of the 4th IEEE
International Conference on e-Science, 2008, Indianapolis: IEEE. | A paper discussing the
potential implications of Web 2.0 tools like Mendeley on research collaborations, open
databases, and reputation metrics in science.
2. Henning V, Hennig-Thurau T, The Theory of Reasoned Action: Does it Lack Emotion?,
Enhancing Knowledge Development in Marketing: Proceedings of the 2008 AMA Summer
Educators' Conference, 2008, Chicago: American Marketing Association. | A survey-based
cross-sectional empirical study proposing extensions to the Theory of Reasoned Action, on
which the Unified Theory of Acceptance and Use of Technology (the theoretical framework
proposed for the present study) is based.
3. Hennig-Thurau T, Henning V, Sattler H, Consumer File Sharing of Motion Pictures, Journal
of Marketing, 71(October), 2008, 1-18. | A survey-based longitudinal empirical study of the
antecedents and consequences of file-sharing adoption, using a Partial Least Squares
structural equation model similar to the one proposed in this study. Published as the Lead
Article of the journal issue.

Other peer-reviewed publications:


1. Hennig-Thurau T, Henning V, Sattler H, Eggers F, Houston M, The Last Picture Show? Timing and Order of Movie Distribution Channels, Journal of Marketing, 71(October), 2008, 63-83.
2. Henning V, The Role of Anticipated Emotions in Hedonic Consumption, Cognition and
Emotion in Economic Decision Making, 2007, Rovereto: Università degli Studi di Trento.
3. Hennig-Thurau T, Henning V, Sattler H, Eggers F, Houston M, Optimizing the Sequential
Distribution Model for Motion Pictures, Enhancing Knowledge Development in Marketing:
Proceedings of the 2006 AMA Summer Educators’ Conference, 2006, Chicago: American
Marketing Association.
4. Henning V, Hennig-Thurau T, Consumer file sharing of motion pictures: Consequences and
Determinants, Enhancing Knowledge Development in Marketing: Proceedings of the 2005
AMA Summer Educators’ Conference. 2005, Chicago: American Marketing Association.
5. Henning V, Alpar A, Public aid mechanisms in feature film production: the EU MEDIA Plus
Programme, Media, Culture & Society, 27(2), 2005, 229-250.


Liz Lyon

Director of UKOLN, University of Bath

Online: http://www.ukoln.ac.uk/ukoln/staff/e.j.lyon/

Liz Lyon is Director at UKOLN where she supports the development and implementation of the
Information Environment, promoting synergies between digital libraries and e-Research. She has
led the eBank UK project, and is a partner in the eCrystals Federation. She is also Associate
Director (Community Development), UK Digital Curation Centre, in which UKOLN is a partner.

In 2007, Liz published the direction-setting report "Dealing with Data", which has been used to
advance the digital repository development agenda within the JISC Capital programme
(2006-2009), to assist in the co-ordination of research data repositories, and to inform an
emerging Vision and Roadmap.

In May 2008, she co-authored the Scaling Up Report, which presents the results of a JISC-funded
scoping study to assess the feasibility of a federated model for data repositories in the domain of
crystallography.

Liz is a member of the ESRC Research Resources Board, the Thomson Scientific Strategic
Advisory Board and the US National Science Foundation Advisory Committee for
CyberInfrastructure.

Her academic background was in Biological Sciences and she has a doctorate in cellular
biochemistry.


Niall Haslam

Post-Doctoral Research Fellow, EMBL Heidelberg

Niall Haslam is a computational biologist who builds software tools for biologists. He is currently
employed by the European ProteomeBinders project to create a resource for the selection of
epitopes for the design of binding reagents against the human proteome. He has contributed to the
development of an ontology to describe protein affinity reagents (http://www.ebi.ac.uk/ontology-lookup/browse.do?ontName=PAR);
this work is currently under review in the Human Proteome
Organisation's document review process before final admission into the Proteomics Standards
Initiative. Niall trained as a biologist, with a degree in Human Genetics from Nottingham. After a
year working at the pharmaceutical company Galen, he took a Masters in Bioinformatics at Exeter
University, where he completed a project with Chris Southan (then at Oxford Glycosciences) on
DNA sequence databases, aiming to discover sequences published in patent databases but not
found in the more common sequence databases. After his Masters, Niall undertook his PhD at the
University of Southampton on DNA sequencing, focusing on the potential of novel sequencing
methods to uncover variation in highly variable genomes. Since the end of 2006 Niall has been
working at EMBL Heidelberg.

At EMBL Heidelberg, in the group of Toby Gibson, Niall has worked on the use of web services
and other internet-based technologies to bring together multiple sources of information for
researchers in a single resource. As a software developer, he has been interested in the potential of
these new technologies to share information in a more open manner, and through this work he has
gained insight into the requirements of the developer community for the next generation of
scientific data-sharing tools.

Peer-reviewed publications:

1. An Analysis of the Feasibility of Short Read Sequencing, N. E. Whiteford, N. Haslam, G. Weber, A. Prügel-Bennett, J. W. Essex, P. L. Roach, M. Bradley and C. Neylon, 2005, Nucleic Acids Research, 33(19), e171
2. Thermal Equivalence of DNA Duplexes Without Melting Temperature Calculation, G.
Weber, N. Haslam, N. Whiteford, A. Prügel-Bennett, J. W. Essex and C. Neylon 2006, Nat.
Phys. 2:55–59
3. Optimal probe length varies for targets with high sequence variation: Implications for probe
library design for resequencing highly variable genes N. Haslam, N. Whiteford, G.Weber, A.
Prügel-Bennett, J. W. Essex and C. Neylon, 2008, PLoS One, 3(6):e2500
4. A Novel Method for Whole Genome Repeat Visualisation, N. E. Whiteford, N. Haslam, G. Weber, A. Prügel-Bennett, J. W. Essex and C. Neylon, 2008, Journal of Complex Systems, 17(4), 381-398
5. Understanding eukaryotic linear motifs and their role in cell signaling and regulation. Diella,
F., Haslam, N., Chica, C., Budd, A., Michael,S., Brown, N.P., Trave, G. and Gibson, T.J.
2008 Front Biosci. May 1;13:6580-603.
6. Thermal equivalence of DNA duplexes for probe design. Gerald Weber, Niall Haslam,
Jonathan W. Essex and Cameron Neylon. 2008 J. Phys.: Condens. Matter in press.


Gavin Baker

Outreach Fellow, The Scholarly Publishing and Academic Resources Coalition

Blog: http://www.gavinbaker.com/

Gavin Baker is an Outreach Fellow at SPARC (the Scholarly Publishing and Academic Resources
Coalition), an international alliance of academic and research libraries dedicated to promoting
openness in the scholarly communications system. He is also Assistant Editor of Open Access News,
Peter Suber's comprehensive blog covering the Open Access movement.

He holds a BA in Political Science from the University of Florida.

Relevant online and non-peer reviewed material:

1. Baker G, "Public Science", Science Progress, 28 January 2008, <http://www.scienceprogress.org/2008/01/public-science/>
2. SPARC, The Right to Research, January 2008, <http://www.arl.org/sparc/bm%7Edoc/rr2008_pages.pdf>
3. Baker G, "How students use the scholarly communication system", College & Research Libraries News, November 2007, <http://www.ala.org/ala/mgrps/divs/acrl/publications/crlnews/2007/nov/Student_activism.cfm>
4. Baker G, "Open Access Journal Literature is an Open Educational Resource", Terra
Incognita - A Penn State World Campus Blog, 5 September 2007.
<http://blog.worldcampus.psu.edu/index.php/2007/09/05/open-access-journal/>


Jonathan Gray

Operations Manager, The Open Knowledge Foundation


Research Student, Royal Holloway

Blog: http://blog.okfn.org

Jonathan Gray is Operations Manager at the Open Knowledge Foundation, a not-for-profit
organisation founded in 2004 and dedicated to promoting open knowledge in all its forms. The
Foundation is a European leader in this field and prominent on the international stage.

He studied Philosophy at Corpus Christi College, Cambridge University, Social Sciences at the
Open University and is currently doing research in the German department at Royal Holloway,
University of London. He has a background in the library sector. As well as being invited to speak
at numerous events, he recently sat on the programme committees of I-SEMANTICS '08, Graz and
Linked Data on the Web (LDOW2008), Beijing, and is currently on the programme committee for
the First Open Source GIS UK Conference, University of Nottingham. He is co-ordinating the EU-
funded 4th COMMUNIA Workshop on "Accessing, Using and Reusing Public Sector Content and
Data".

He has run several domain-specific workshops on how people find and re-use the material they
work with, specifically looking at CKAN, an open-source, community-driven tool for resource
discovery developed by the Open Knowledge Foundation. More generally he has a strong interest in
how new internet technologies are being integrated into the existing work patterns of researchers
and practitioners in different fields. He has participated in the EU COST Action 32, "Open
Scholarly Communities on the Web" which is dedicated to creating a research infrastructure for
humanities scholarship on the Web and he is currently involved in setting up a new open-access
journal based at Oxford University as part of the eContentplus Discovery project. He recently co-
ordinated a series of informal, inter-disciplinary workshops looking at best-of-breed visualisation
technologies and their applications - from genomics to modelling medieval bookbinding techniques.

Relevant online and non-peer reviewed material:

1. Gray J, Pollock R, Walsh J, "Open Knowledge: Promises and Challenges", from 1st
COMMUNIA Workshop, Torino, January 2008, <http://cms.communia-project.eu/node/58>
2. Documentation of Workshop on Finding and Re-using Open Scientific Resources
<http://blog.okfn.org/2008/11/12/after-the-workshop-on-open-scientific-resources/>
3. Documentation of Workshop on Finding and Re-using Public Information
<http://blog.okfn.org/2008/11/04/after-the-workshop-on-public-information/>

Peer reviewed publications:

1. Gray J, Hamann, Nietzsche and Wittgenstein on the language of philosophers, Northwestern University Press, Forthcoming (2009)

12. Project Plan

The project runs from December to August. Tasks are grouped into four phases:

Organisational tasks:
- Negotiate project plan and details with sponsor
- Begin qualitative discussions within group, building on internal expertise
- Comprehensive literature review based on internal group discussions

Qualitative data collection and analysis:
- Aggregate and compare existing survey data
- Prepare qualitative interviews
- Carry out in-depth qualitative interviews
- Commence writing up in-depth interviews and create illustrative case studies
- Plan survey recruitment process and optimisation
- Carry out "adoption sampling" study on Web 2.0 sites for researchers

Quantitative data collection and analysis:
- Create measurement model for cross-sectional survey, based on qualitative interviews and literature review
- Run survey over three to four weeks (depending on response rates)
- Statistical analysis of survey results

Results and publication:
- Finalise analysis, manuscript write-up and submission to RIN
- Jointly prepare publication and dissemination of results
