IN DEMOCRACY,
INNOVATION, AND
ENTREPRENEURSHIP
FOR GROWTH
ANALYTICS, INNOVATION,
AND EXCELLENCE-DRIVEN
ENTERPRISE SUSTAINABILITY
Series Editor
Elias G. Carayannis
School of Business
George Washington University
Washington, DC, USA
The central theme of this series is to explore why some geographic areas
grow and others stagnate over time, and to measure the effects and impli-
cations in a trans-disciplinary context that takes both historical evolution
and geographical location into account. In other words, when, how, and
why do the nature and dynamics of a political regime inform and shape
the drivers of growth, especially innovation and entrepreneurship? In this
socio-economic, socio-political, and socio-technical context, how could we
best achieve growth, financially and environmentally? This series aims to
address key questions framing policy and strategic decision-making at firm,
industry, national, and regional levels, such as:
How does technological advance occur, and what are the strategic
processes and institutions involved?
How are new businesses created? To what extent is intellectual prop-
erty protected?
Which cultural characteristics serve to promote or impede innovation?
In what ways is wealth distributed or concentrated?
Innovation,
Political Regime, and
Economic and Social Development.
Analytics, Innovation,
and Excellence-
Driven Enterprise
Sustainability
Editors
Elias G. Carayannis
Department of Information Systems and Technology Management
George Washington University
Washington, District of Columbia, USA

Stavros Sindakis
School of Business
American University in Dubai
Dubai, UAE
FOREWORD
This book is a complete work that will no doubt make you rethink
your business approach in this new challenging and interesting time. I am
pleased to commend it to readers.
Miguel Dias Costa,
THM School
London, UK
List of Contributors
ing company. His research interests include business analytics, business intelligence, eXtensible Business Reporting Language (XBRL), and e-science.
Birte Freudenreich is a researcher at the Centre for Sustainability Management
(CSM) at the Leuphana University of Lüneburg. Her research focuses on business
models and on managing their contribution to sustainable development. She holds
degrees in both Environmental Sciences and Strategic Leadership towards
Sustainability. Having worked in the sustainability management field for several
years, Freudenreich takes a keen interest in the applicability of her research in busi-
ness management practice.
Myropi Garri is Senior Lecturer in Strategic Management at Portsmouth
Business School, University of Portsmouth. Her scientific interests focus on strate-
gic management, internationalization strategies, business administration, human
resources management, and public policy. Her studies and articles have been pub-
lished in scientific journals and in edited volumes.
Herodotos Herodotou is a tenure-track lecturer in the Department of Electrical
Engineering, Computer Engineering and Informatics (EECEI) at the Cyprus
University of Technology. He received his PhD in Computer Science from Duke
University in May 2012. His research interests are in large-scale data processing
systems and database systems. In particular, his work focuses on ease-of-use, man-
ageability, and automated tuning of both centralized and distributed data-inten-
sive computing systems. In addition, he is interested in applying database
techniques in other areas like scientific computing, bioinformatics, and numerical
analysis. His work experience includes research positions at Microsoft Research,
Yahoo! Labs, and Aster Data as well as software engineering positions at Microsoft
and RWD Technologies. He is the recipient of the SIGMOD Jim Gray Doctoral
Dissertation Award Honorable Mention, the Outstanding PhD Dissertation
Award in Computer Science at Duke, Steele Endowed Fellowship, and the Cyprus
Fulbright Commission Scholarship.
Thorhildur Jetzek is a postdoctoral research fellow at the Department of IT
Management, Copenhagen Business School. She has an M.Sc. in Economics and a
PhD in Information Systems Management. During her PhD she worked at the IT
company KMD in Denmark, studying the implementation of an open data infra-
structure in the Danish public sector. Her current research focuses on value creation
through open and big data, with special attention to the role of digital platforms.
Building on 15 years of experience from the IT industry, Thorhildur strives to find
synergies between academic research and practical experiences in order to further
our understanding of how data and IT can be used to create value for society.
Nikolaos Konstantopoulos is an associate professor in the Department of Business
Administration at the University of the Aegean Business School. His research
interests include small business management, entrepreneurship, and strategic decision-making.
Stavros Sindakis
S. Sindakis (*)
American University in Dubai, School of Business, Dubai, UAE
e-mail: ssindakis@aud.edu
form that jointly optimizes resilience and robustness. Whenever there are differences in the sets of strategies and actions maximizing resilience and robustness, the organization should exercise care to make informed choices among the trade-offs between resilience and robustness that ultimately constrain any choice of strategies, actions, and organization design. Overall, this book provides a unique perspective on how
knowledge, information, and data analytics create opportunities and chal-
lenges for sustainable enterprise excellence. It also illustrates the impor-
tance of knowledge, information, and data analytics for organizational
intelligence and entrepreneurial competitiveness.
This volume consists of 11 chapters, exploring and discussing the
importance of business intelligence and analytics and their impact on mar-
ket research, business ventures, organizational sustainability, and enter-
prise excellence. An alternative perspective of strategic planning is also
discussed, considering the power of information on foreign markets as well
as the dynamics of open data, innovation, and sustainable value genera-
tion. Finally, we investigate the role of data science in the decision-making
process, discuss and assess the novel business model of sustainability ori-
entation, and investigate the correlation of sustainability and excellence
in higher education institutions under a given operational programme
for competitiveness. More specifically, the second chapter explores the
value of converting big data into useful information and knowledge, and
examines the design principles and core features of systems for analysing
large datasets for business purposes, aiming at meeting the demand for
interactive analytics, a new class of systems that combine analytical and
transactional capabilities. Chapter 3 explores the underdeveloped field of
business analytics for price trend forecasting through the utilization of textual data. The study aims at identifying methods of exploiting data analytics that enable and support traders in maximizing their business profits.
Developing various assumptions and evaluating existing solutions in price
trend forecasting, the study introduces a novel approach of applying news
tickers for price trend forecasts in the energy market, a method applicable in any domain where important events have to be considered instantly. Considering the value of data analytics from another viewpoint, Chapter 4 discusses the benefits of marketing analytics and metrics in female-owned business enterprises, focusing on customer behaviour and market
behaviour patterns. The study reveals that specific marketing analytics have significant value for both customer behaviour and marketing behaviour in female-owned business ventures. Chapter 5 reviews the current
Herodotos Herodotou
2.1 Introduction
Modern industrial, government, and academic organizations are collecting
massive amounts of data (big data) at an unprecedented scale and pace.
Many enterprises continuously collect records of customer interactions,
product sales, results from advertising campaigns on the Web, and other
types of information. Powerful telescopes in astronomy, particle accelera-
tors in physics, and genome sequencers in biology are putting massive vol-
umes of data into the hands of scientists (Cohen et al. 2009; Thusoo et al.
2009). The ability to perform timely and cost-effective analytical process-
ing of such large datasets to extract deep insights is now a key ingredient
for success. These insights can drive automated processes for advertise-
ment placement, improve customer relationship management, and lead to
major scientific breakthroughs (Frankel and Reid 2008).
H. Herodotou (*)
Department of Electrical Engineering, Computer Engineering and
Informatics (EECEI), Cyprus University of Technology, Limassol, Cyprus
e-mail: herodotos.herodotou@cut.ac.cy
The set of techniques, systems, and tools that transform raw data into
meaningful and useful information for business analysis purposes is collec-
tively known as Business Intelligence (BI) (Chen et al. 2012). In addition
to the underlying data processing and analytical techniques, BI includes
business-centric practices and methodologies that can be applied to vari-
ous high-impact applications such as e-commerce, market intelligence,
healthcare, and security. The more recent explosion of data has led to the
development of advanced and unique data storage, management, analysis,
and visualization technologies, termed big data analytics, in order to serve applications that are so large (from terabytes to exabytes) and complex (from sensor to social media data) that they could not be served effectively with previous technologies. Big data analytics can give organizations
an edge over their rivals and lead to business rewards, including more
potent promotion and enhanced revenue.
Existing database systems are adapting to the new status quo while
large-scale data analytical systems, like MapReduce (Dean and Ghemawat
2008) and Dryad (Isard et al. 2007), are becoming popular for analytical
workloads on big data. Industry leaders such as Teradata, SAP, Oracle,
and EMC/Greenplum have addressed this explosion of data volumes
by leveraging more powerful and parallel hardware in combination with
sophisticated parallelization techniques in the underlying data manage-
ment software. Internet service companies such as Twitter, LinkedIn,
Facebook, Google, and others address the scalability challenge by leverag-
ing a combination of new technologies in their clusters: key-value stores,
columnar storage, and the MapReduce programming paradigm (Wu et al. 2012; Thusoo et al. 2010; Lee et al. 2012; Melnik et al. 2010). Finally,
small and medium enterprises are slowly adopting the new technologies to
satisfy their needs for identifying, developing, and otherwise creating new
strategic business opportunities.
This monograph is an attempt to cover the design principles and core
features of systems for analyzing very large datasets for business purposes.
We organize systems into four main categories (Parallel Databases, MapReduce, Dataflow, and Interactive Analytics), each with multiple
subcategories, based on some major and distinctive technological innova-
tions. The categories loosely correspond to the chronological evolution of
systems as the requirements for large-scale analytics have evolved over the
last few decades. Table 2.1 lists all categories and subcategories we discuss
along with some example systems for each subcategory.
Table 2.1 The system categories, subcategories, and example systems (in alphabetical order) for large-scale data analytics

Parallel databases
- Row-based parallel databases: Aster nCluster, DB2 Parallel Edition, Greenplum, Netezza, Teradata
- Columnar databases: C-Store, Infobright, MonetDB, ParAccel, Sybase IQ, VectorWise, Vertica

MapReduce
- Distributed file systems: Ceph, GFS, HDFS, Kosmos, MapR, Quantcast
- MapReduce execution engines: Google MapReduce, Hadoop, HadoopDB, Hadoop++
- MapReduce-based platforms: Cascading, Clydesdale, Hive, Jaql, Pig

Dataflow
- Generalized MapReduce: ASTERIX, Hyracks, Nephele, Stratosphere
- Directed acyclic graph systems: Dryad, DryadLINQ, SCOPE, Shark, Spark
- Graph processing systems: GraphLab, GraphX, HaLoop, Pregel, PrIter, Twister

Interactive analytics
- Mixed analytical and transactional: Bigtable, HBase, HyPer, HYRISE, Megastore, SAP HANA, Spanner
- Distributed SQL query engines: Apache Drill, Cloudera Impala, Dremel, Presto, Stinger.next
- Stream processing systems: Aurora, Borealis, Muppet, S4, Storm, STREAM
Having selective overlap among the nodes (or the group) on which
the partitions of two or more tables are stored can be beneficial, espe-
cially for join processing. Consider two tables R(a, b) and S(a, c), where
a is a common attribute. Suppose both tables are hash partitioned on the
respective attribute a using the same hash function and the same number
of partitions. Further, suppose the partitions of tables R and S are both
stored on the same group of nodes. In this case, there will be a one-to-one
correspondence between the partitions of both tables that can join with
one another on attribute a. That is, any pair of joining partitions will be
stored on the same node of the group. Under these conditions, the two
tables R and S are said to be collocated. The advantage of collocation is
that tables can be joined without the need to move any data from one
node to another.
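Under the stated assumptions (same hash function, same number of partitions, partitions stored on the same nodes), a collocated join degenerates into independent local joins. The following toy Python sketch, which is illustrative rather than drawn from any of the systems discussed here, makes this concrete:

```python
# A toy sketch of collocated hash partitioning: tables R(a, b) and S(a, c)
# are partitioned on the common attribute `a` with the same hash function
# and partition count, so every pair of joining partitions lands on the
# same node. The cluster size and table contents are invented.

NUM_NODES = 4  # assumed degree of declustering

def hash_partition(table, key_index, num_partitions=NUM_NODES):
    """Assign each tuple to a partition by hashing its key attribute."""
    partitions = [[] for _ in range(num_partitions)]
    for row in table:
        partitions[hash(row[key_index]) % num_partitions].append(row)
    return partitions

R = [(1, "b1"), (2, "b2"), (3, "b3")]   # R(a, b)
S = [(1, "c1"), (2, "c2"), (4, "c4")]   # S(a, c)

r_parts = hash_partition(R, key_index=0)
s_parts = hash_partition(S, key_index=0)

# Collocated join: each node joins only its local pair of partitions,
# so no tuple ever crosses node boundaries.
result = []
for r_part, s_part in zip(r_parts, s_parts):
    for (a1, b) in r_part:
        for (a2, c) in s_part:
            if a1 == a2:
                result.append((a1, b, c))

print(sorted(result))  # [(1, 'b1', 'c1'), (2, 'b2', 'c2')]
```

Because both tables use the same hash function and partition count, a tuple with a given value of a can only meet its join partners in the local partition pair, which is exactly the property the text describes.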
In addition to collocation, data replication can often provide perfor-
mance benefits, both for join processing and for the concurrent execution
of multiple queries. Replication is usually done at the table level in two
scenarios. When a table is small, it can be replicated on all nodes in the
cluster or a group. Such replication is common for dimension tables in
star and snowflake schemas so that they can easily join with the partitions
of the distributed fact table(s). Replication can also be done such that dif-
ferent replicas are partitioned differently. For example, one replica of the
table may be hash partitioned while another may be range partitioned for
speeding up multiple workloads with different access and join patterns.
Apart from performance benefits, replication also helps reduce unavail-
ability or loss of data when faults arise in the parallel database system (e.g.,
a node fails permanently or becomes disconnected temporarily from other
nodes due to a network failure).
The diverse mix of partitioning, declustering, collocation, and replica-
tion techniques available can make it confusing for users of parallel database
systems to identify the best data layout for their workload. This problem
has motivated research on automated ways to recommend good data lay-
outs based on the workload (Mehta and DeWitt 1997; Rao et al. 2002)
and on partition-aware optimization techniques to generate efficient plans
for SQL queries over partitioned tables (Herodotou et al. 2011).
Collocated join: When tables R and S are collocated and both partitioned on the joining attribute a, the collocated join is the most efficient way to perform the join because it performs the join in parallel on each node while avoiding the need to transfer data between nodes.
Directed join: Suppose tables R and S are both partitioned on attri-
bute a but the respective partitions are not collocated. In this case,
a directed join can transfer each partition of one table (say R) to the
node where the joining partition of the other table is stored. Once
a partition from R is brought to where the joining partition in S is
stored, a local join can be performed. Compared to a collocated join,
a directed join incurs the cost of transferring one of the tables across
the network.
Repartitioned join: If tables R and S are not partitioned on the
joining attribute, then the repartitioned join is used. This join sim-
ply repartitions the tuples in both tables using the same partitioning
condition (e.g., hash). Joining partitions are brought to the same
node where they can be joined. This operator incurs the cost of
transferring both tables across the network.
Broadcast join: When tables R and S are not partitioned on the
joining attribute but one of them (say R) is very small, then the
broadcast join will transfer R in full to every node where any par-
tition of the other table (S) is stored. The join is then performed
locally. This operator incurs a data transfer cost equal to the size of R
times the degree of declustering of S.
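As a rough illustration of the broadcast join's cost model, the following toy Python sketch (tables, partitioning, and node count are invented for illustration) ships the small table R in full to every node holding a partition of S and counts the tuples transferred:

```python
# A toy sketch of the broadcast join: the small table R is copied in full
# to every node where a partition of S is stored, and the join then runs
# locally. The data transfer cost equals |R| times the degree of
# declustering of S, as stated in the text.

R = [(1, "r1"), (2, "r2")]                                     # small table R(a, b)
s_parts = [[(1, "s1")], [(2, "s2"), (3, "s3")], [(4, "s4")]]   # S(a, c) on 3 nodes

# cost in tuples shipped = |R| * number of nodes holding S partitions
transfer_cost = len(R) * len(s_parts)
print(transfer_cost)  # 6

result = []
for s_part in s_parts:          # each node receives a full copy of R...
    for (a1, b) in R:
        for (a2, c) in s_part:  # ...and joins it with its local S partition
            if a1 == a2:
                result.append((a1, b, c))

print(sorted(result))  # [(1, 'r1', 's1'), (2, 'r2', 's2')]
```

A repartitioned join would instead rehash both tables, costing roughly |R| + |S| in transfers; broadcasting wins only while R stays small relative to S's degree of declustering.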
2.2.2 Columnar Databases
Columnar systems excel at data-warehousing-type applications, where (a)
data is loaded in bulk but typically not modified much and (b) the typical
access pattern is to scan through large parts of the data to perform aggre-
gations and joins. The first columnar database systems that appeared in the
1990s were MonetDB (Boncz et al. 2006) and Sybase IQ (MacNicol and
French 2004). The 2000s saw a number of new columnar database systems
such as C-Store (Stonebraker et al. 2005), Infobright (Infobright 2013),
ParAccel (ParAccel 2013), VectorWise (Zukowski and Boncz 2012), and
Vertica (Lamb et al. 2012). Similar to the row-based databases discussed
above, we focus on the data storage and query execution of columnar
database systems.
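The appeal of the columnar layout for the scan-and-aggregate access pattern described above can be illustrated with a small Python sketch; the schema and data are invented for illustration:

```python
# Toy illustration of row versus column storage for an aggregation query
# such as SELECT SUM(price) FROM sales: the columnar layout touches only
# the `price` column instead of materializing every full row.

rows = [("2013-01-01", "widget", 9.99),
        ("2013-01-02", "gadget", 24.50),
        ("2013-01-03", "widget", 9.99)]

# Column-oriented layout: one contiguous array per attribute.
columns = {
    "date":    [r[0] for r in rows],
    "product": [r[1] for r in rows],
    "price":   [r[2] for r in rows],
}

# A row store reads all three attributes of every tuple; the column store
# scans a single array, which is also far more compressible.
total = sum(columns["price"])
print(round(total, 2))  # 44.48
```

With wide tables the saving grows with the number of untouched columns, which is why data-warehousing workloads that scan a few attributes of many tuples favour this design.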
2.3.1 Distributed Storage
The storage layer of a typical MapReduce cluster is an independent distrib-
uted file system. Typical Hadoop deployments use HDFS running on the cluster's compute nodes (Shvachko et al. 2010). Alternatively, a Hadoop cluster can process data from other file systems like the MapR File System (MapR 2013), Ceph (Weil et al. 2006), Amazon Simple Storage Service (S3) (Amazon S3 2013), and Windows Azure Blob Storage (Calder et al. 2011).
As HDFS focuses more on batch processing rather than interactive use,
it emphasizes high throughput of data access rather than low latency. An
HDFS cluster employs a master-slave architecture consisting of a single
NameNode (the master) and multiple DataNodes (the slaves), usually
one per node in the cluster (see Fig. 2.3). The NameNode manages the
file system namespace and regulates access to files by clients, whereas the
DataNodes are responsible for serving read and write requests from the file
system's clients. HDFS is designed to reliably store very large files across
machines in a large cluster. Internally, a file is split into one or more blocks
that are replicated for fault tolerance and stored in a set of DataNodes.
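The NameNode/DataNode split described above can be sketched in a few lines of Python; the block size, replication factor, node names, and placement policy are illustrative assumptions, not HDFS's actual values or interfaces:

```python
# A miniature model of HDFS's master-slave design: the NameNode keeps only
# the namespace (file -> block list) and block locations, while the
# DataNodes hold the block contents themselves.
import itertools

BLOCK_SIZE = 4          # bytes; tiny for illustration (HDFS uses megabytes)
REPLICATION = 3         # assumed replication factor
DATANODES = ["dn1", "dn2", "dn3", "dn4"]

namespace = {}          # file name -> list of block ids  (NameNode metadata)
block_locations = {}    # block id -> datanodes holding a replica
block_store = {}        # (datanode, block id) -> bytes   (DataNode contents)

def put(name, data, counter=itertools.count()):
    """Split a file into blocks and replicate each block on REPLICATION nodes."""
    namespace[name] = []
    for i in range(0, len(data), BLOCK_SIZE):
        block_id = next(counter)
        namespace[name].append(block_id)
        # simple round-robin placement across datanodes
        replicas = [DATANODES[(block_id + r) % len(DATANODES)]
                    for r in range(REPLICATION)]
        block_locations[block_id] = replicas
        for dn in replicas:
            block_store[(dn, block_id)] = data[i:i + BLOCK_SIZE]

def get(name):
    """Read each block from the first available replica."""
    return b"".join(block_store[(block_locations[b][0], b)]
                    for b in namespace[name])

put("/logs/a.txt", b"hello big data")
print(get("/logs/a.txt"))             # b'hello big data'
print(len(namespace["/logs/a.txt"]))  # 4 blocks of up to BLOCK_SIZE bytes
```

Even in this toy form, the key property is visible: losing one datanode leaves at least two replicas of every block, and the master holds only metadata, never block contents.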
A number of other distributed file systems are viable alternatives to
HDFS and offer full compatibility with Hadoop MapReduce. The MapR
File System (MapR 2013) and Ceph (Weil et al. 2006) have similar architectures to HDFS, but both offer a distributed metadata service as opposed
to the centralized NameNode on HDFS. In MapR, metadata is shared
across the cluster and collocated with the data blocks, whereas Ceph uses
dedicated metadata servers with dynamic subtree partitioning to avoid
metadata access hot spots. The Quantcast File System (QFS) (Ovsiannikov
et al. 2013), which evolved from the Kosmos File System (KFS) (KFS
2013), employs erasure coding rather than replication as its fault tolerance
mechanism. Erasure coding enables QFS to not only reduce the amount
of storage but also accelerate large sequential write patterns common to
MapReduce workloads.
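The storage saving from erasure coding is easy to quantify with back-of-the-envelope arithmetic; the (6, 3) Reed-Solomon configuration below is an illustrative choice, not necessarily QFS's exact parameters:

```python
# Back-of-the-envelope comparison of the storage overhead of 3-way
# replication versus a Reed-Solomon style (data, parity) erasure code.

def replication_overhead(copies):
    """Extra storage relative to raw data when keeping `copies` full copies."""
    return copies - 1

def erasure_overhead(data_stripes, parity_stripes):
    """Extra storage relative to raw data for an erasure-coded stripe."""
    return parity_stripes / data_stripes

print(replication_overhead(3))   # 2  (200% extra; tolerates 2 lost copies)
print(erasure_overhead(6, 3))    # 0.5 (50% extra; tolerates 3 lost stripes)
```

Cutting overhead from 200% to 50% while tolerating more failures is what allows erasure-coded systems to both reduce storage and, by striping writes across nodes, speed up the large sequential writes typical of MapReduce.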
Distributed file systems are primarily designed for accessing raw files
and, therefore, lack any advanced features found in the storage layer of
database systems. This limitation has inspired a significant amount of
research for introducing (a) indexing, (b) collocation, and (c) columnar
capabilities into such file systems.
2.3.1.1 Indexing
Hadoop++ (Dittrich et al. 2010) provides indexing functionality for data
stored in HDFS using the so-called Trojan Indexes. The indexing informa-
tion is created during the initial loading of data onto HDFS and is stored as
additional metadata in the data blocks. Hence, targeted data retrieval can
be very efficient at the expense of increased data loading time. This prob-
lem is addressed by HAIL (Dittrich et al. 2012), which improves query
processing speeds over Hadoop++. HAIL creates indexes during the I/O-
bound phases of writing to HDFS so that it consumes CPU cycles that are
otherwise wasted. In addition, HAIL builds a different clustered index in
each replica maintained by HDFS for fault tolerance purposes. The most
suitable index for a query is then selected at run-time, and the correspond-
ing replicas are read during the MapReduce execution over HAIL.
2.3.1.2 Collocation
In addition to indexing, Hadoop++ provides a data collocation technique
in MapReduce systems. Specifically, Hadoop++ allows users to co-partition
and collocate data at load time while writing metadata in the data blocks
(Dittrich et al. 2010). Hence, blocks of HDFS can now contain data from
multiple tables. With this approach, collocated joins can be processed
at each node without the overhead of sorting and shuffling data across
nodes. CoHadoop (Eltabakh et al. 2011) provides a different collocation
strategy by adding a file-locator attribute to HDFS files and implementing
a file layout policy such that all files with the same locator are placed on
the same set of nodes. Using this feature, CoHadoop can collocate any
related pair of files, for example, every pair of joining partitions across two
tables that are both hash partitioned on the join key, or a partition and an
index on that partition. CoHadoop can then run joins in a similar manner
as collocated joins in parallel database systems.
run-time costs for tuple reconstruction. Unlike Llama, CIF uses an exten-
sion of HDFS to enable collocation of columns corresponding to the same
tuple on the same node and supports some late materialization techniques
for reducing tuple reconstruction costs (Floratou et al. 2011).
Cheetah (Chen 2010), RCFile (He et al. 2011), and Hadoop++ (Dittrich et al. 2010) use a hybrid row-column design based on PAX (Ailamaki et al. 2001). In particular, each file is horizontally partitioned into blocks
but a columnar format is used within each block. Since HDFS guarantees
that all the bytes of an HDFS block will be stored on a single node, it is
guaranteed that tuple reconstruction will not require data transfer over the
network. The intra-block data layouts used by these systems differ in how
they use compression, how they treat replicas of the same block, and how
they are implemented. For example, Hadoop++ can use different layouts
in different replicas and choose the best layout at query processing time.
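A PAX-style block can be modelled in a few lines of Python; the block capacity and table contents below are illustrative, and the sketch shows only the layout idea, not any specific file format:

```python
# A toy sketch of the PAX-style hybrid layout: the table is horizontally
# partitioned into fixed-size blocks, and inside each block the tuples are
# stored column by column. Because a whole block lives on one node (as an
# HDFS block would), tuple reconstruction never crosses the network.

ROWS_PER_BLOCK = 2  # assumed block capacity

def to_pax_blocks(rows, num_cols):
    """Horizontally partition rows into blocks, columnar within each block."""
    blocks = []
    for i in range(0, len(rows), ROWS_PER_BLOCK):
        chunk = rows[i:i + ROWS_PER_BLOCK]
        # one list per attribute inside the block
        blocks.append([[row[c] for row in chunk] for c in range(num_cols)])
    return blocks

def reconstruct_tuple(blocks, row_idx):
    """Rebuild a full tuple from the single block that holds it."""
    block = blocks[row_idx // ROWS_PER_BLOCK]
    offset = row_idx % ROWS_PER_BLOCK
    return tuple(col[offset] for col in block)

rows = [(1, "a"), (2, "b"), (3, "c")]
blocks = to_pax_blocks(rows, num_cols=2)
print(blocks)                        # [[[1, 2], ['a', 'b']], [[3], ['c']]]
print(reconstruct_tuple(blocks, 2))  # (3, 'c')
```

The hybrid gets columnar scan and compression benefits within a block while preserving the row store's cheap, local tuple reconstruction.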
2.3.2 MapReduce-based Platforms
The MapReduce model, although highly flexible, has been found to be
too low-level for routine use by practitioners such as data analysts, statisti-
cians, and scientists (Olston et al. 2008; Thusoo et al. 2009). As a result,
the MapReduce framework has evolved into a MapReduce ecosystem shown
in Fig. 2.2, which includes a number of (a) high-level interfaces added over
the core MapReduce engine, (b) application development tools, (c) work-
flow management systems, and (d) data collection tools.
Partitions are then created using subdirectories while the actual data is
stored in files. Hive also includes a system catalog, called Metastore,
containing schema and statistics, which are useful in data exploration and
query optimization. In particular, Hive employs rule-based approaches for
a variety of optimizations such as filter and projection pushdown, shared
scans of input datasets across multiple operators from the same or different
analysis tasks (Nykiel et al. 2010), reducing the number of MapReduce
jobs in a workflow (Lee et al. 2011), and handling data skew in sorts and
joins.
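The partitions-as-subdirectories scheme can be sketched in Python; the warehouse paths, partition column, and pruning rule below are simplified assumptions for illustration, not Hive's actual implementation:

```python
# A sketch of how a Hive-style table maps partitions onto subdirectories,
# and how a filter on the partition column prunes whole directories before
# any MapReduce job is launched.

# Partitioned layout: one subdirectory per partition value of column `dt`.
table_layout = {
    "/warehouse/sales/dt=2013-01-01": ["part-00000", "part-00001"],
    "/warehouse/sales/dt=2013-01-02": ["part-00000"],
    "/warehouse/sales/dt=2013-01-03": ["part-00000"],
}

def prune(layout, column, value):
    """Keep only the directories whose partition key matches the filter."""
    suffix = f"{column}={value}"
    return {d: files for d, files in layout.items() if d.endswith(suffix)}

# WHERE dt = '2013-01-02' reads a single directory instead of the table.
pruned = prune(table_layout, "dt", "2013-01-02")
print(list(pruned))  # ['/warehouse/sales/dt=2013-01-02']
```

Because pruning happens at the metadata level, the saving is proportional to the number of partitions eliminated, independently of how large each partition's files are.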
and Hyracks differ mainly on the type of operators and connectors that
they support.
Nephele uses the Parallelization Contracts (PACT) programming
model (Alexandrov et al. 2010), a generalization of the well-known
MapReduce programming model. The PACT model extends MapReduce
with a total of five second-order functions: Map, Reduce, Cross, Match, and CoGroup.
The main function of the Job Manager is to construct the run-time DAG from its
logical representation and execute it in the cluster. The Job Manager is
also responsible for scheduling the vertices on the processing nodes when
all the inputs are ready, monitoring progress, and re-executing vertices
upon failure. A Dryad cluster has a Name Server that enumerates all the
available compute nodes and exposes their location within the network
so that scheduling decisions can take better account of locality. There is
a processing Daemon running on each cluster node that is responsible
for creating processes on behalf of the Job Manager. Each process cor-
responds to a vertex in the graph. The Daemon acts as a proxy so that the
Job Manager can communicate with the remote vertices and monitor the
state and progress of the computation.
DryadLINQ (Isard and Yu 2009) is a hybrid declarative and imperative language layer that targets the Dryad run-time and uses the Language INtegrated Query (LINQ) model (Meijer et al. 2006). DryadLINQ provides a set of .NET constructs for programming with datasets. A
DryadLINQ program is a sequential program composed of LINQ expres-
sions that perform arbitrary side-effect-free transformations on datasets.
SCOPE (Zhou et al. 2012), on the other hand, offers a SQL-like declara-
tive language with well-defined but constrained semantics. In particular,
SCOPE supports writing a program using traditional nested SQL expres-
sions as well as a series of simple data transformations.
Spark (Zaharia et al. 2012) is a similar DAG-based execution engine.
However, the main difference of Spark from Dryad is that it uses a memory
abstraction, called Resilient Distributed Datasets (RDDs), to explicitly
store data in memory. An RDD is a distributed shared memory abstraction
that represents an immutable collection of objects partitioned across a set
of nodes. Each RDD is either a collection backed by an external storage
system, such as a file in HDFS, or a derived dataset created by applying
various data-parallel operators (e.g., map, group-by, hash join) to other
RDDs. The elements of an RDD need not exist in physical storage or
reside in memory explicitly; instead, an RDD can contain only the lineage
information necessary for computing the RDD elements starting from
data in reliable storage. This notion of lineage is crucial for achieving fault
tolerance in case a partition of an RDD is lost as well as managing how
much memory is used by RDDs. Currently, RDDs are used by Spark with
HDFS as the reliable back-end store.
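The lineage idea can be illustrated with a minimal Python model; the class and method names below are invented for illustration and are not the Spark API:

```python
# A minimal model of the RDD idea: a derived dataset is represented by its
# lineage (parent dataset plus the transformation applied), so a lost
# partition can be recomputed from reliable storage instead of being
# replicated or checkpointed.

class ToyRDD:
    def __init__(self, source=None, parent=None, transform=None):
        self.source = source        # base data (stands in for a file in HDFS)
        self.parent = parent        # lineage: the parent dataset
        self.transform = transform  # lineage: how this dataset is derived

    def map(self, fn):
        return ToyRDD(parent=self,
                      transform=lambda data: [fn(x) for x in data])

    def filter(self, pred):
        return ToyRDD(parent=self,
                      transform=lambda data: [x for x in data if pred(x)])

    def compute(self):
        """Materialize by replaying the lineage chain from the base data."""
        if self.parent is None:
            return list(self.source)
        return self.transform(self.parent.compute())

base = ToyRDD(source=[1, 2, 3, 4, 5])
derived = base.map(lambda x: x * 10).filter(lambda x: x > 20)

# Nothing is stored for `derived`; losing it costs only a recomputation.
print(derived.compute())  # [30, 40, 50]
```

This is why the text notes that RDD elements need not exist in physical storage: the lineage alone suffices both for fault tolerance and for evicting cached partitions under memory pressure.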
Shark (Xin et al. 2013b) is a higher-level system implemented over Spark and uses HiveQL as its query interface. Shark supports dynamic query optimization at run-time.

2.4.3 Graph Processing Systems
For a growing number of applications, the data takes the form of graphs
that connect many millions of nodes. The growing need for managing
graph-shaped data comes from applications such as (a) identifying influ-
ential people and trends propagating through a social-networking com-
munity, (b) tracking patterns of how diseases spread, and (c) finding
and fixing bottlenecks in computer networks. Graph processing systems,
such as Pregel (Malewicz et al. 2010), GraphLab (Low et al. 2012), and
GraphX (Xin et al. 2013a), use graph structures with nodes, edges, and
their properties to represent and store data.
Many graph processing systems such as Pregel (Malewicz et al. 2010) use the
Bulk Synchronous Parallel (BSP) computing model. A typical Pregel com-
putation consists of (a) initializing the graph from the input, (b) perform-
ing a sequence of iterations separated by global synchronization points
until the algorithm terminates, and (c) writing the output. Similar to
DAG-based systems, each vertex executes the same user-defined function
that expresses the logic of a given algorithm. Within each iteration, a ver-
tex can modify its state or that of its outgoing edges, receive messages
sent to it in the previous iteration, send messages to other vertices (to be
received in the next iteration), or even mutate the topology of the graph.
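The BSP iteration structure of a Pregel-style computation can be sketched as follows; the graph, the initial vertex values, and the max-propagation algorithm are illustrative choices, not drawn from the chapter:

```python
# A toy vertex-centric BSP computation in the spirit of Pregel: in each
# superstep every vertex sends its current value along its outgoing edges,
# then (after a global synchronization barrier) keeps the maximum value it
# has received. The computation terminates when no vertex changes.

graph = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}   # adjacency lists
values = {"a": 3, "b": 6, "c": 2}                   # initial vertex values

changed = True
while changed:
    # superstep: each vertex sends its value to its neighbours
    inbox = {v: [] for v in graph}
    for v, neighbours in graph.items():
        for n in neighbours:
            inbox[n].append(values[v])
    # global barrier, then each vertex processes its received messages
    changed = False
    for v, messages in inbox.items():
        best = max(messages, default=values[v])
        if best > values[v]:
            values[v] = best
            changed = True

print(values)  # every vertex converges to the maximum: 6
```

The same skeleton (message exchange, barrier, local update, vote to halt) underlies PageRank, connected components, and shortest-path algorithms in BSP systems.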
GraphLab (Low et al. 2012) uses similar primitives (called PowerGraph)
but directly targets asynchronous, dynamic, graph-parallel computations
in the shared-memory setting. In addition, GraphLab contains several
performance optimizations such as using data versioning to reduce net-
work congestion and pipelined distributed locking to mitigate the effects
of network latency. GraphX (Xin et al. 2013a) runs on Spark and intro-
duces a new abstraction called Resilient Distributed Graph (RDG). Graph
algorithms are specified as a sequence of transformations on RDGs, where
a transformation can affect nodes, edges, or both, and yields a new RDG.
Techniques have also been proposed to support the iterative and recursive computational needs of graph analysis in MapReduce systems.
Many modern applications involve both OLTP and OLAP workloads, which led to the development of new systems that can
support both. On one hand, multiple distributed storage systems like
Bigtable (Chang et al. 2008) and Megastore (Baker et al. 2011) provide
various degrees of transactional capabilities, enabling them to serve as the
data store for online services while making the data available concurrently
in the same system for analytics. On the other hand, processing systems
like SAP HANA (Färber et al. 2012a, b) and HYRISE (Grund et al.
2012) can execute both OLTP and OLAP workloads.
Cloudera Impala uses the same metadata, SQL syntax (HiveQL), and user interface as Apache Hive, providing a unified platform for batch-oriented or real-time queries.
Unlike Cloudera Impala, which was developed to fit nicely with the Hadoop
ecosystem, Apache Drill is meant to provide distributed query capabilities
across multiple big data platforms including MongoDB, Cassandra, Riak,
and Splunk. Finally, Presto (Traverso 2013) is a distributed SQL query
engine developed at Facebook and, unlike Cloudera Impala and Apache
Drill, supports standard ANSI SQL, including complex queries, aggrega-
tions, joins, and window functions.
2.6 Conclusions
A major part of the challenge in data analytics today comes from the sheer
volume of data available for processing. Data volumes that many compa-
nies want to process in timely and cost-efficient ways have grown steadily
from the multigigabyte range to terabytes and now to many petabytes. All
data storage and processing systems that we presented in this monograph
were aimed at handling such large datasets. This challenge of dealing
with very large datasets has been termed the volume challenge. There are
two other related challenges, namely, those of velocity and variety (Laney
2001).
The velocity challenge refers to the short response-time require-
ments for collecting, storing, and processing data. Most of the sys-
tems in the MapReduce and Dataflow categories are batch systems. For
latency-sensitive applications, such as identifying potential fraud and
recommending personalized content, batch data processing is insuf-
ficient. The data may need to be processed as it streams into the system
in order to extract the maximum utility from the data. Systems for
interactive analytics are typically optimized for addressing the velocity
challenge.
The variety challenge refers to the growing list of data types (relational, time series, text, graphs, audio, video, images, and genetic codes) as well
as the growing list of analysis techniques on such data. New insights are
found while analyzing more than one of these data types together using
a variety of analytical techniques such as linear algebra, statistical machine
learning, text search, signal processing, natural language processing, and
iterative graph processing.
Several higher-level systems and tools have been built on top of the
systems described in this monograph for implementing these techniques,
which drive automated processes for spam and fraud detection, advertise-
ment placement, Web site optimization, and customer relationship man-
agement. BI tools, such as SAS, SAP Business Objects, IBM Cognos, SPSS
Modeler, Oracle Hyperion, and Microsoft BI, provide support for reporting,
online analytical processing, data mining, process mining, and predictive
analytics based on data stored primarily in data warehouses. Other soft-
ware platforms such as Tableau and Spotfire specialize in interactive data
visualization of business data. In particular, these platforms query rela-
tional databases, cubes, cloud databases, and spreadsheets to generate a
number of graph types that can be combined into analytical dashboards
and applications. Both platforms also support visualizing large-scale data
stored in distributed file systems such as HDFS. On the other hand, com-
panies like Datameer, Karmasphere, and Platfora offer BI solutions that
specifically target the Hadoop ecosystem.
42 H. HERODOTOU
References
Abadi, Daniel J., Don Carney, Ugur Cetintemel, Mitch Cherniack, Christian Convey, Sangdon Lee, Michael Stonebraker, Nesime Tatbul, and Stan Zdonik. 2003. Aurora: A new model and architecture for data stream management. The VLDB Journal: The International Journal on Very Large Data Bases 12(2): 120–139.
Abadi, Daniel J., Yanif Ahmad, Magdalena Balazinska, Ugur Cetintemel, Mitch Cherniack, Jeong-Hyon Hwang, Wolfgang Lindner, et al. 2005. The design of the Borealis stream processing engine. CIDR 5: 277–289.
Abadi, Daniel J., Daniel S. Myers, David J. DeWitt, and Samuel R. Madden. 2007. Materialization strategies in a column-oriented DBMS. In Data Engineering, IEEE 23rd International Conference on, 466–475.
Abadi, Daniel J., Peter A. Boncz, and Stavros Harizopoulos. 2009. Column-oriented database systems. Proceedings of the VLDB Endowment 2(2): 1664–1665.
Abouzeid, Azza, Kamil Bajda-Pawlikowski, Daniel Abadi, Avi Silberschatz, and Alexander Rasin. 2009. HadoopDB: An architectural hybrid of MapReduce and DBMS technologies for analytical workloads. Proceedings of the VLDB Endowment 2(1): 922–933.
Agrawal, Sanjay, Vivek Narasayya, and Beverly Yang. 2004. Integrating vertical and horizontal partitioning into automated physical database design. In Proceedings of the 2004 ACM SIGMOD International Conference on Management of Data, 359–370.
Ailamaki, Anastassia, David J. DeWitt, Mark D. Hill, and Marios Skounakis. 2001. Weaving relations for cache performance. VLDB 1: 169–180.
Alexandrov, Alexander, Max Heimel, Volker Markl, Dominic Battré, Fabian Hueske, Erik Nijkamp, Stephan Ewen, Odej Kao, and Daniel Warneke. 2010. Massively parallel data analysis with PACTs on Nephele. Proceedings of the VLDB Endowment 3(1–2): 1625–1628.
Alexandrov, Alexander, Rico Bergmann, Stephan Ewen, Johann-Christoph Freytag, Fabian Hueske, Arvid Heise, Odej Kao, et al. 2014. The Stratosphere platform for big data analytics. The VLDB Journal: The International Journal on Very Large Data Bases 23(6): 939–964.
Amazon. 2013. Amazon simple storage service (S3). Accessed 2013. http://aws.amazon.com/s3/
Babu, Shivnath, and Jennifer Widom. 2001. Continuous queries over data streams. ACM SIGMOD Record 30(3): 109–120.
Bajda-Pawlikowski, Kamil, Daniel J. Abadi, Avi Silberschatz, and Erik Paulson. 2011. Efficient processing of data warehousing queries in a split execution environment. In Proceedings of the 2011 ACM SIGMOD International Conference on Management of Data, 1165–1176.
Baker, Jason, Chris Bond, James C. Corbett, J. J. Furman, Andrey Khorlin, James Larson, Jean-Michel Leon, Yawei Li, Alexander Lloyd, and Vadim Yushprakh. 2011. Megastore: Providing scalable, highly available storage for interactive services. CIDR 11: 223–234.
Baru, Chaitanya K., Gilles Fecteau, Ambuj Goyal, H. Hsiao, Anant Jhingran, Sriram Padmanabhan, George P. Copeland, and Walter G. Wilson. 1995. DB2 parallel edition. IBM Systems Journal 34(2): 292–322.
Battré, Dominic, Stephan Ewen, Fabian Hueske, Odej Kao, Volker Markl, and Daniel Warneke. 2010. Nephele/PACTs: A programming model and execution framework for web-scale analytical processing. In Proceedings of the 1st ACM Symposium on Cloud Computing, 119–130.
Behm, Alexander, Vinayak R. Borkar, Michael J. Carey, Raman Grover, Chen Li, Nicola Onose, Rares Vernica, Alin Deutsch, Yannis Papakonstantinou, and Vassilis J. Tsotras. 2011. Asterix: Towards a scalable, semistructured data platform for evolving-world models. Distributed and Parallel Databases 29(3): 185–216.
Biem, Alain, Eric Bouillet, Hanhua Feng, Anand Ranganathan, Anton Riabov, Olivier Verscheure, Haris Koutsopoulos, and Carlos Moran. 2010. IBM InfoSphere Streams for scalable, real-time, intelligent transportation services. In Proceedings of the 2010 ACM SIGMOD International Conference on Management of Data, 1093–1104.
Boncz, Peter A., Marcin Zukowski, and Niels Nes. 2005. MonetDB/X100: Hyper-pipelining query execution. CIDR 5: 225–237.
Boncz, Peter, Torsten Grust, Maurice Van Keulen, Stefan Manegold, Jan Rittinger, and Jens Teubner. 2006. MonetDB/XQuery: A fast XQuery processor powered by a relational engine. In Proceedings of the 2006 ACM SIGMOD International Conference on Management of Data, 479–490.
Borkar, Vinayak, Michael Carey, Raman Grover, Nicola Onose, and Rares Vernica. 2011. Hyracks: A flexible and extensible foundation for data-intensive computing. In 2011 IEEE 27th International Conference on Data Engineering (ICDE), 1151–1162.
Borthakur, Dhruba, Jonathan Gray, Joydeep Sen Sarma, Kannan Muthukkaruppan, Nicolas Spiegelberg, Hairong Kuang, Karthik Ranganathan, et al. 2011. Apache Hadoop goes realtime at Facebook. In Proceedings of the 2011 ACM SIGMOD International Conference on Management of Data, 1071–1080.
Bu, Yingyi, Bill Howe, Magdalena Balazinska, and Michael D. Ernst. 2010. HaLoop: Efficient iterative data processing on large clusters. Proceedings of the VLDB Endowment 3(1–2): 285–296.
Protocol Buffers. 2012. Developer guide. Accessed 2012.
Calder, Brad, Ju Wang, Aaron Ogus, Niranjan Nilakantan, Arild Skjolsvold, Sam McKelvie, Yikang Xu, et al. 2011. Windows Azure storage: A highly available
Herodotou, Herodotos, Nedyalko Borisov, and Shivnath Babu. 2011. Query optimization techniques for partitioned tables. In Proceedings of the 2011 ACM SIGMOD International Conference on Management of Data, 49–60.
Hoffman, Steve. 2015. Apache Flume: Distributed log collection for Hadoop. Birmingham: Packt Publishing.
Hsiao, Hui-I, and David J. DeWitt. 1990. Chained declustering: A new availability strategy for multiprocessor database machines. Madison: University of Wisconsin-Madison, Computer Sciences Department.
IBM Corporation. 2007. IBM knowledge center: Partitioned tables. Accessed 2007. http://publib.boulder.ibm.com/infocenter/db2luw/v9r7/topic/com.ibm.db2.luw.admin.partition.doc/doc/c0021560.html
IBM Netezza. 2012. IBM Netezza data warehouse appliances. Accessed 2012. http://www-01.ibm.com/software/data/netezza/
Idreos, Stratos, Fabian Groffen, Niels Nes, Stefan Manegold, K. Sjoerd Mullender, and Martin L. Kersten. 2012. MonetDB: Two decades of research in column-oriented database architectures. IEEE Data Engineering Bulletin 35(1): 40–45.
Infobright. 2013. Infobright: Analytic database for the internet of things. Accessed 2013. http://www.infobright.com/
Isard, Michael, and Yuan Yu. 2009. Distributed data-parallel computing using a high-level programming language. In Proceedings of the 2009 ACM SIGMOD International Conference on Management of Data, 987–994.
Isard, Michael, Mihai Budiu, Yuan Yu, Andrew Birrell, and Dennis Fetterly. 2007. Dryad: Distributed data-parallel programs from sequential building blocks. ACM SIGOPS Operating Systems Review 41(3): 59–72.
Islam, Mohammad, Angelo K. Huang, Mohamed Battisha, Michelle Chiang, Santhosh Srinivasan, Craig Peters, Andreas Neumann, and Alejandro Abdelnur. 2012. Oozie: Towards a scalable workflow management system for Hadoop. In Proceedings of the 1st ACM SIGMOD Workshop on Scalable Workflow Execution Engines and Technologies, 4.
Kemper, Alfons, Thomas Neumann, Florian Funke, Viktor Leis, and Henrik Mühe. 2012. HyPer: Adapting columnar main-memory data management for transactional and query processing. IEEE Data Engineering Bulletin 35(1): 46–51.
KFS. 2013. Kosmos distributed file system. Accessed 2013. http://code.google.com/p/kosmosfs/
Lakshman, Avinash, and Prashant Malik. 2010. Cassandra: A decentralized structured storage system. ACM SIGOPS Operating Systems Review 44(2): 35–40.
Lam, Wang, Lu Liu, S. T. S. Prasad, Anand Rajaraman, Zoheb Vacheri, and AnHai Doan. 2012. Muppet: MapReduce-style processing of fast data. Proceedings of the VLDB Endowment 5(12): 1814–1825.
Lamb, Andrew, Matt Fuller, Ramakrishna Varadarajan, Nga Tran, Ben Vandiver, Lyric Doshi, and Chuck Bear. 2012. The Vertica analytic database: C-Store 7 years later. Proceedings of the VLDB Endowment 5(12): 1790–1801.
Laney, Doug. 2001. 3D data management: Controlling data volume, velocity and variety. META Group Research Note 6: 70.
Lee, Rubao, Tian Luo, Yin Huai, Fusheng Wang, Yongqiang He, and Xiaodong Zhang. 2011. YSmart: Yet another SQL-to-MapReduce translator. In 2011 31st International Conference on Distributed Computing Systems (ICDCS), 25–36.
Lee, George, Jimmy Lin, Chuang Liu, Andrew Lorek, and Dmitriy Ryaboy. 2012. The unified logging infrastructure for data analytics at Twitter. Proceedings of the VLDB Endowment 5(12): 1771–1780.
Lin, Yuting, Divyakant Agrawal, Chun Chen, Beng Chin Ooi, and Sai Wu. 2011. Llama: Leveraging columnar storage for scalable join processing in the MapReduce framework. In Proceedings of the 2011 ACM SIGMOD International Conference on Management of Data, 961–972.
Low, Yucheng, Danny Bickson, Joseph Gonzalez, Carlos Guestrin, Aapo Kyrola, and Joseph M. Hellerstein. 2012. Distributed GraphLab: A framework for machine learning and data mining in the cloud. Proceedings of the VLDB Endowment 5(8): 716–727.
MacNicol, Roger, and Blaine French. 2004. Sybase IQ Multiplex: Designed for analytics. In Proceedings of the Thirtieth International Conference on Very Large Data Bases, vol. 30, 1227–1230. Seoul: VLDB Endowment.
Malewicz, Grzegorz, Matthew H. Austern, Aart J. C. Bik, James C. Dehnert, Ilan Horn, Naty Leiser, and Grzegorz Czajkowski. 2010. Pregel: A system for large-scale graph processing. In Proceedings of the 2010 ACM SIGMOD International Conference on Management of Data, 135–146.
MapR. 2013. MapR file system. Accessed 2013. http://www.mapr.com/products/apache-hadoop
Mehta, Manish, and David J. DeWitt. 1997. Data placement in shared-nothing parallel database systems. The VLDB Journal: The International Journal on Very Large Data Bases 6(1): 53–72.
Meijer, Erik, Brian Beckman, and Gavin Bierman. 2006. LINQ: Reconciling object, relations and XML in the .NET framework. In Proceedings of the 2006 ACM SIGMOD International Conference on Management of Data, 706.
Melnik, Sergey, Andrey Gubarev, Jing Long, Geoffrey Romer, Shiva Shivakumar, Matt Tolton, and Theo Vassilakis. 2010. Dremel: Interactive analysis of web-scale datasets. Proceedings of the VLDB Endowment 3(1–2): 330–339.
Morales, Tony. 2007. Oracle database VLDB and partitioning guide 11g release 1 (11.1). Oracle, July 2007.
Neumeyer, Leonardo, Bruce Robbins, Anish Nair, and Anand Kesari. 2010. S4: Distributed stream computing platform. In 2010 IEEE International Conference on Data Mining Workshops (ICDMW), 170–177.
Nykiel, Tomasz, Michalis Potamias, Chaitanya Mishra, George Kollios, and Nick Koudas. 2010. MRShare: Sharing across multiple queries in MapReduce. Proceedings of the VLDB Endowment 3(1–2): 494–505.
Olston, Christopher, Benjamin Reed, Utkarsh Srivastava, Ravi Kumar, and Andrew Tomkins. 2008. Pig Latin: A not-so-foreign language for data processing. In Proceedings of the 2008 ACM SIGMOD International Conference on Management of Data, 1099–1110.
Ovsiannikov, Michael, Silvius Rus, Damian Reeves, Paul Sutter, Sriram Rao, and Jim Kelly. 2013. The Quantcast file system. Proceedings of the VLDB Endowment 6(11): 1092–1101.
ParAccel. 2013. ParAccel analytic platform. Accessed 2013. http://www.paraccel.com/
Rabkin, Ariel, and Randy H. Katz. 2010. Chukwa: A system for reliable large-scale log collection. LISA 10: 1–15.
Rao, Jun, Chun Zhang, Nimrod Megiddo, and Guy Lohman. 2002. Automating physical database design in a parallel database. In Proceedings of the 2002 ACM SIGMOD International Conference on Management of Data, 558–569.
Shvachko, Konstantin, Hairong Kuang, Sanjay Radia, and Robert Chansler. 2010. The Hadoop distributed file system. In 2010 IEEE 26th Symposium on Mass Storage Systems and Technologies (MSST), 1–10.
Stonebraker, Mike, Daniel J. Abadi, Adam Batkin, Xuedong Chen, Mitch Cherniack, Miguel Ferreira, Edmond Lau, et al. 2005. C-Store: A column-oriented DBMS. In Proceedings of the 31st International Conference on Very Large Data Bases, 553–564. Seoul: VLDB Endowment.
Apache Storm. 2013. Storm, distributed and fault-tolerant real-time computation.
Sumbaly, Roshan, Jay Kreps, and Sam Shah. 2013. The big data ecosystem at LinkedIn. In Proceedings of the 2013 ACM SIGMOD International Conference on Management of Data, 1125–1134.
Talmage, Ron. 2009. Partitioned table and index strategies using SQL Server 2008. MSDN Library, March 2009.
Teradata. 2012. Teradata enterprise data warehouse. Accessed 2012. http://www.teradata.com
Thusoo, Ashish, Joydeep Sen Sarma, Namit Jain, Zheng Shao, Prasad Chakka, Suresh Anthony, Hao Liu, Pete Wyckoff, and Raghotham Murthy. 2009. Hive: A warehousing solution over a map-reduce framework. Proceedings of the VLDB Endowment 2(2): 1626–1629.
Thusoo, Ashish, Zheng Shao, Suresh Anthony, Dhruba Borthakur, Namit Jain, Joydeep Sen Sarma, Raghotham Murthy, and Hao Liu. 2010. Data warehousing and analytics infrastructure at Facebook. In Proceedings of the 2010 ACM SIGMOD International Conference on Management of Data, 1013–1020.
Traverso, Martin. 2013. Presto: Interacting with petabytes of data at Facebook. Retrieved February 4, 2014.
Wanderman-Milne, Skye, and Nong Li. 2014. Runtime code generation in Cloudera Impala. IEEE Data Engineering Bulletin 37(1): 31–37.
Weil, Sage A., Scott A. Brandt, Ethan L. Miller, Darrell D. E. Long, and Carlos Maltzahn. 2006. Ceph: A scalable, high-performance distributed file system. In Proceedings of the 7th Symposium on Operating Systems Design and Implementation, 307–320. Berkeley, CA: USENIX Association.
White, Tom. 2010. Hadoop: The definitive guide. Sunnyvale, CA: Yahoo.
Wu, Lili, Roshan Sumbaly, Chris Riccomini, Gordon Koo, Hyung Jin Kim, Jay Kreps, and Sam Shah. 2012. Avatara: OLAP for web-scale analytics products. Proceedings of the VLDB Endowment 5(12): 1874–1877.
Xin, Reynold S., Joseph E. Gonzalez, Michael J. Franklin, and Ion Stoica. 2013a. GraphX: A resilient distributed graph system on Spark. In First International Workshop on Graph Data Management Experiences and Systems, 2.
Xin, Reynold S., Josh Rosen, Matei Zaharia, Michael J. Franklin, Scott Shenker, and Ion Stoica. 2013b. Shark: SQL and rich analytics at scale. In Proceedings of the 2013 ACM SIGMOD International Conference on Management of Data, 13–24.
Zaharia, Matei, Mosharaf Chowdhury, Tathagata Das, Ankur Dave, Justin Ma, Murphy McCauley, Michael J. Franklin, Scott Shenker, and Ion Stoica. 2012. Resilient distributed datasets: A fault-tolerant abstraction for in-memory cluster computing. In Proceedings of the 9th USENIX Conference on Networked Systems Design and Implementation, 2. Berkeley, CA: USENIX Association.
Zhang, Yanfeng, Qixin Gao, Lixin Gao, and Cuirong Wang. 2011. PrIter: A distributed framework for prioritized iterative computations. In Proceedings of the 2nd ACM Symposium on Cloud Computing, 13.
Zhou, Jingren, Nicolas Bruno, Ming-Chuan Wu, Per-Åke Larson, Ronnie Chaiken, and Darren Shakib. 2012. SCOPE: Parallel databases meet MapReduce. The VLDB Journal: The International Journal on Very Large Data Bases 21(5): 611–636.
Zukowski, Marcin, and Peter A. Boncz. 2012. Vectorwise: Beyond column stores. IEEE Data Engineering Bulletin 35(1): 21–27.
CHAPTER 3
Marco Pospiech and Carsten Felden
3.1 Introduction
Living in the era of Big Data (Labrinidis and Jagadish 2012) means having
an increasing amount of structured and unstructured data accessible and
useful for different business needs (Gartner 2013). Price prediction is one
business analytics scenario among several that illustrates the need to
integrate heterogeneous data (Pospiech and Felden 2014).
In all business domains, markets react sensitively and rapidly to relevant
news (Chan et al. 2001). News tickers, for example, cover a broad range
of edited topics from geopolitical to financial data, and are currently
evaluated and integrated with structured (usually internal) data manually
in decision processes. Since the value of information decreases quickly and
decision-makers are overwhelmed by the perceived information flood,
business analytics can be applied to automate data and information
analysis. Existing business analytics approaches consider both data types
separately, but the analytical benefit comes from an integrated perspective,
which is the essence of Big Data. Taking only one kind of data set into
account means that a complete market overview is not possible. In
addition, available approaches do not consider real-time events; as a
consequence, forecasts are calculated at fixed time intervals.
Using the named example of price prediction, we illustrate how the
combination of unstructured and structured data sources can generate value.
This chapter presents a general forecasting approach based on news tickers
and market-related indicators, applying data mining algorithms. A classi-
fication is performed to predict positive and negative price trends, and
forecasting models are deduced on this basis. Patterns are extracted from
historical price movements caused by various attributes, so that similar
characteristics can be understood as the repetition of similar trends in the
future. The functionality is demonstrated in two different case studies.
The chapter is organized as follows: after a literature review, we intro-
duce the forecasting process in general. Here, unstructured textual
data and various environmental conditions such as currency exchange rates
are mapped and classified. The approach is applied to the natural gas
and electricity markets, whereby the specific environmental conditions differ
between the two. Real projects show the implementation of the forecast
system for trading floors. The chapter ends with a conclusion.
one, the example will be put into the class DOWN. The class STABLE
represents a no-mover status: no remarkable price change occurred between
two trade transactions (Lavrenko et al. 2000). Because markets
differ from each other, domain experts have to specify how many price
points a movement must span to represent a meaningful move-
ment. The historical movements then need an explanation. In cooperation with
domain experts, variables and data sources explaining the developments
should be identified.
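As a minimal sketch of this labeling step (the class names come from the text; the 0.1-point threshold is the value the chapter's case studies use, and the helper names are our own):

```python
# Label the trend between consecutive trade prices as UP, DOWN, or STABLE.
# The threshold is market-specific and must come from domain experts;
# 0.1 price points is the value used in the case studies of this chapter.
THRESHOLD = 0.1

def label_trend(previous_price: float, current_price: float,
                threshold: float = THRESHOLD) -> str:
    """Classify the movement between two trade transactions."""
    delta = current_price - previous_price
    if delta > threshold:
        return "UP"
    if delta < -threshold:
        return "DOWN"
    return "STABLE"  # "no mover": no remarkable price change

def label_series(prices):
    """One label per pair of consecutive trades."""
    return [label_trend(p, c) for p, c in zip(prices, prices[1:])]

print(label_series([20.0, 20.3, 20.25, 19.9]))  # ['UP', 'STABLE', 'DOWN']
```

Anything below the threshold is deliberately treated as noise, which is why the experts' choice of threshold directly controls the class distribution.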
A possible explanation can be found in historical news tickers (Chan
et al. 2001). The number of news tickers is growing, so the filtering
step aims to select as many relevant articles as possible. Depending on
the market, specific topics, keywords, or time intervals are potential fil-
ters. This step is of major importance, because irrelevant articles contain
no explanation for a price movement and will decrease model accuracy
(Khandar and Dani 2010). The next step maps historical news tickers, his-
torical market data, and historical prices in order to investigate their effect
on the price trend. It is, in fact, uncertain how quickly a trend responds
to a message event: some adjustments need hours, others minutes. A
time interval therefore needs to be specified by domain experts. This
mapping is one of the biggest challenges in prediction from
text documents (Lavrenko et al. 2000).
Figure 3.2 shows two possible mapping hypotheses (the time interval is
set to two minutes).
tion. Market data and news tickers are joined by the same timestamp.
The forward mapping procedure requires a new trend calcula-
tion, because the news ticker itself forms the central artifact. Thus,
the trend is no longer estimated between two trade transactions,
but between the price of the trade transaction preceding the news
ticker and the price two minutes after publication.
The forward mapping has one important drawback: all news tickers
are used, even irrelevant ones.
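The forward-mapping idea can be sketched as follows (toy data and helper names are our own; real inputs would be the trade archive and the filtered ticker stream, and the two-minute horizon is the experts' value from the text):

```python
import bisect
from datetime import datetime, timedelta

THRESHOLD = 0.1                  # market-specific trend threshold
HORIZON = timedelta(minutes=2)   # reaction interval set by domain experts

# Toy trade archive, sorted by time: (timestamp, price).
trades = [
    (datetime(2013, 5, 1, 12, 0, 0), 20.0),
    (datetime(2013, 5, 1, 12, 2, 30), 20.3),
    (datetime(2013, 5, 1, 12, 10, 0), 20.25),
]
trade_times = [t for t, _ in trades]

def price_at(ts):
    """Price of the last trade at or before ts (None if no trade yet)."""
    i = bisect.bisect_right(trade_times, ts)
    return trades[i - 1][1] if i else None

def forward_map(published):
    """Trend between the trade preceding a ticker's publication and the
    price HORIZON after it (the forward-mapping hypothesis)."""
    before = price_at(published)
    after = price_at(published + HORIZON)
    if before is None or after is None:
        return None
    delta = after - before
    if delta > THRESHOLD:
        return "UP"
    if delta < -THRESHOLD:
        return "DOWN"
    return "STABLE"

print(forward_map(datetime(2013, 5, 1, 12, 1, 0)))  # UP
print(forward_map(datetime(2013, 5, 1, 12, 8, 0)))  # STABLE
```

Because every ticker is mapped, irrelevant tickers simply end up labeled STABLE, which is exactly the drawback the text mentions.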
However, the mapping procedures are not limited to text docu-
ments. Videos, audio, or images could affect the market as well
and are possible events, too. A short-lived price movement (e.g., a price
that drops and rises immediately) implies only a small effect of the news
ticker or market data that perhaps caused the change. Relevant events will lead to
58 M. POSPIECH AND C. FELDEN
The testing sample is assigned to the class with the highest posterior
probability (Ni and Luh 2001). Here, the term naive refers to the assump-
tion that all features are independent of each other and Gaussian distributed.
The estimation from the training data occurs through kernel smoothing
(Mitchell 1997).
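A minimal sketch of the idea, under the Gaussian assumption (all helper names are our own; the kernel-smoothed variant used in the chapter would replace the per-feature Gaussian density with a kernel density estimate):

```python
import math

def fit_naive_bayes(X, y):
    """Estimate per-class feature means/variances and class priors.
    The 'naive' assumption: features are independent given the class."""
    model = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        cols = list(zip(*rows))
        means = [sum(col) / len(col) for col in cols]
        varis = [max(sum((v - m) ** 2 for v in col) / len(col), 1e-9)
                 for col, m in zip(cols, means)]
        model[c] = (means, varis, len(rows) / len(y))
    return model

def predict(model, x):
    """Assign the class with the highest posterior probability
    (computed in log space to avoid underflow)."""
    def log_posterior(c):
        means, varis, prior = model[c]
        loglik = sum(-0.5 * math.log(2 * math.pi * v)
                     - (xi - m) ** 2 / (2 * v)
                     for xi, m, v in zip(x, means, varis))
        return math.log(prior) + loglik
    return max(model, key=log_posterior)

# Toy one-feature example: small values behave like DOWN, large like UP.
model = fit_naive_bayes([[1.0], [1.2], [0.9], [3.0], [3.2], [2.8]],
                        ["DOWN", "DOWN", "DOWN", "UP", "UP", "UP"])
print(predict(model, [1.1]), predict(model, [3.1]))  # DOWN UP
```

The log-space trick matters in practice: with thousands of TF-IDF features, multiplying raw probabilities would underflow immediately.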
The number of available news tickers is growing, and a decision-maker
would be overwhelmed by considering all documents. In this context, a
first model is trained that identifies only relevant situations. Therefore,
DOWN and UP examples are temporarily relabeled as UNSTABLE, while
STABLE examples remain STABLE. Within the live system, only UNSTABLE
examples are forwarded to the next model. In a second step, the original
labels are restored and a second model is trained to predict whether
an example belongs to UP or DOWN. Ten percent of all examples are held out
for testing; this data is unknown to the trained models, which
reflects reality. The model evaluation uses the most popular
classification performance measure, accuracy (Zhang and Luh 2002).
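The two-stage cascade described above can be sketched as follows (the trivial NearestMean stand-in classifier and the toy features are our own; in the case studies the stages are SVM, KNN, or Naive Bayes models):

```python
class NearestMean:
    """Trivial stand-in classifier: pick the class whose mean feature
    vector is closest. Any classifier with fit/predict would work."""
    def fit(self, X, y):
        self.means = {}
        for c in set(y):
            rows = [x for x, label in zip(X, y) if label == c]
            self.means[c] = [sum(col) / len(col) for col in zip(*rows)]
        return self

    def predict(self, x):
        return min(self.means,
                   key=lambda c: sum((a - b) ** 2
                                     for a, b in zip(x, self.means[c])))

def fit_cascade(X, y):
    # Stage 1: temporarily relabel UP/DOWN as UNSTABLE to learn relevance.
    relevance = NearestMean().fit(
        X, ["STABLE" if t == "STABLE" else "UNSTABLE" for t in y])
    # Stage 2: restore original labels, learn direction on UP/DOWN only.
    moving = [(x, t) for x, t in zip(X, y) if t != "STABLE"]
    direction = NearestMean().fit([x for x, _ in moving],
                                  [t for _, t in moving])
    return relevance, direction

def predict_cascade(relevance, direction, x):
    # Only UNSTABLE examples are forwarded to the second model.
    return direction.predict(x) if relevance.predict(x) == "UNSTABLE" \
        else "STABLE"

# Toy features: (movement magnitude, direction sign).
X = [[0.1, 0], [0.2, 0], [2.0, 1], [2.1, 1], [2.0, -1], [1.9, -1]]
y = ["STABLE", "STABLE", "UP", "UP", "DOWN", "DOWN"]
rel, dirn = fit_cascade(X, y)
print(predict_cascade(rel, dirn, [2.05, 1]))  # UP
print(predict_cascade(rel, dirn, [0.15, 0]))  # STABLE
```

The design choice here is that the first model acts as a relevance filter, so the heavily imbalanced STABLE majority never reaches the direction model.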
Hereby, the assignment of a class is a true positive (TP) if the item
belongs to the positive class and the algorithm classified it correctly.
In contrast, a true negative (TN) is the correct assignment of an
item that belongs to the negative class. False positives (FP) and false nega-
tives (FN) are both wrong classifications. The accuracy represents the ratio
of all correct classifications to all assignments. Whenever the
distribution of classes is unequal, or specific classes are of more interest than
others, different measures are applied. Recall indicates how many elements
of a specific class are identified, and precision measures how correct the
prediction was for a specific class (Miner et al. 2012):

Accuracy = (TP + TN) / (TP + FP + FN + TN)
Recall = TP / (TP + FN)
Precision = TP / (TP + FP)
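These measures can be computed directly from the four counts; a small sketch (helper names are our own):

```python
def confusion_counts(y_true, y_pred, positive):
    """Count TP, TN, FP, FN for one class treated as 'positive'."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    tn = sum(t != positive and p != positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    return tp, tn, fp, fn

def accuracy(tp, tn, fp, fn):
    return (tp + tn) / (tp + fp + fn + tn)

def recall(tp, fn):
    return tp / (tp + fn)    # share of the class that was found

def precision(tp, fp):
    return tp / (tp + fp)    # share of the predictions that were right

y_true = ["UP", "UP", "DOWN", "DOWN", "UP"]
y_pred = ["UP", "DOWN", "DOWN", "UP", "UP"]
tp, tn, fp, fn = confusion_counts(y_true, y_pred, positive="UP")
print(accuracy(tp, tn, fp, fn), recall(tp, fn), precision(tp, fp))
```

On this toy example the counts are TP=2, TN=1, FP=1, FN=1, giving an accuracy of 0.6 and recall and precision of 2/3 each, which illustrates why per-class recall and precision matter when classes are imbalanced.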
Sect. 3.3 was applied. Both scenarios were conducted with the data
mining tool RapidMiner (Rapid-I Incorporation 2013). For both sce-
narios, 56 GB of main memory and four 3.07 GHz processors were available.
3.4.1 Electricity Market
Many countries have restructured their electrical power industry and
introduced deregulation and competition by unbundling generation,
transmission, trading, and distribution of electricity. Market participants
need to forecast the price development to be able to maximize their prof-
its or hedge against risks of price volatility as well as to ensure safety of
investments (Li et al. 2007). Text documents have a verifiable impact
on improving decision-making (Chang et al. 2010). Pospiech and Felden
(2014) used news tickers to forecast the electricity price. They focused
on the year-ahead product, where one electricity product is traded for
the whole year. The product is sufficiently liquid and traded within seconds,
so that market participants are able to react instantly to published
messages.
The historical price data cover October 2009 until December
2012 and were obtained from a German utility company. Trends were
estimated, whereby the class remains STABLE as long as the price
change does not exceed 0.1 price points. The transactions were mapped
to news tickers from Thomson Reuters through forward mapping to
investigate the effect of a message on the price trend. In consulta-
tion with domain experts, a mapping time interval of two minutes was
applied. The English-language news tickers are categorized by Thomson
Reuters into specific topics, and messages not relevant to the electricity price
were deleted: out of 1,532 topics, only 192 were selected as rele-
vant. In the end, 1,442 items remained as input data. The electricity price
is influenced by various factors (Duarte et al. 2009). Through
expert interviews, Pospiech and Felden (2014) identified valuable input
factors (see Table 3.1). These factors form the market data and are linked
to the news tickers.
The news tickers were transformed into a machine-readable format
through text mining: TF-IDF values were calculated for all terms, yielding
11,107 features/terms in total. The final input vectors were forwarded
to the data mining stage.
BUSINESS ANALYTICS FOR PRICE TREND FORECASTING THROUGH TEXTUAL... 65
The items were split into training, validation, and test data to serve as
input for the chosen algorithms SVM, KNN, and Naive Bayes. During
model development, various parameter settings were tried until each
model reached its optimal result. Ten
percent of the examples remained for the evaluation. The SVM achieved
an accuracy of 59.03 percent; the best results (64.58 percent) came from
KNN and Naive Bayes (Kernel). Following Roomp et al. (2010), these
results are weak. As shown in Fig. 3.6, the class STABLE is predicted
incorrectly too often. In this context, Pospiech and Felden (2014) investigated,
with interesting results, whether the results improve when STABLE exam-
ples are removed: the accuracy increases to 93.33 percent. However,
in reality, STABLE news tickers (no price movement after publication) are
possible and cannot be removed. For this reason, the general forecast
approach introduces a second modeling stage that identifies irrelevant
tickers so that they can be removed from the set.
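The TF-IDF transformation used above can be sketched as follows (a toy version on token lists; production text mining would also tokenize, stem, and prune terms, and the helper names are our own):

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """One sparse weight vector per document: term frequency scaled by
    the log inverse document frequency of each term."""
    n = len(docs)
    doc_freq = Counter(term for doc in docs for term in set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({term: (count / len(doc)) * math.log(n / doc_freq[term])
                        for term, count in tf.items()})
    return vectors

docs = [["gas", "price", "up"], ["gas", "supply", "down"]]
vecs = tfidf_vectors(docs)
# "gas" appears in every document, so its weight is zero;
# document-specific terms such as "price" get positive weight.
print(vecs[0]["gas"], vecs[0]["price"] > 0)
```

This weighting is what lets frequent but uninformative market words fade out while discriminative ticker terms dominate the feature vectors.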
3.4.2 Gas Market
In Europe, natural gas has a high strategic impact on electricity and
heat supply. As a consequence of the extensive liberalization of the European
gas market, natural gas is nowadays freely tradable on open exchanges.
The gas price is an indicator used to adjust strategic behavior as well as
risk and investment management in organizations (Lin and Wesseh 2013).
Thus, the interest in predicting the natural gas price is high (Malliaris and
Malliaris 2005). The forecast is complex and depends on various factors
such as currency exchange rates, liquefied natural gas, temperatures, or text
documents (Linn and Zhu 2004; Busse et al. 2012). However, for
the gas market, automatic trend prediction from unstructured news
tickers has not yet been reported; traders have to analyze them manually.
The more news is published, the higher the probability of missing relevant
items. Moreover, the gas market is one of the most volatile
markets in the world (Lin and Wesseh 2013). As a result, reaction times
are short and processing all relevant information in real time is indis-
pensable; for this reason, a forecast system is needed. The scenario
is set in the British gas market. The forecast product is the month-ahead.
This product is highly volatile, and text documents can contain valuable
information over this product horizon (Linn and Zhu 2004). As shown
in Fig. 3.1, three data sources are needed. Examples from November 2011
until April 2013 are used as training and validation data; the months May
until August 2013 are used as test data.
Historical prices are obtained from an archive that includes bids, offers,
asks, and deals for the month-ahead product. The deals are
extracted and trends (UP, DOWN, STABLE) are calculated. According
to domain experts, price movements not exceeding 0.1 price points should
be labeled STABLE. At first, the relevant examples need to be identified;
thus, the trends are temporarily relabeled as STABLE and UNSTABLE.
Overall, 97,637 trade transactions remain and were used in this business
analytics process. The news tickers are obtained from Thomson Reuters,
which provided more than 3,500,000 tickers. Non-English-lan-
guage documents are removed, and only tickers published on weekdays dur-
ing trading hours are kept. Thomson Reuters categorizes news tickers into
specific topics. In consultation with a domain expert, 8 out of 1,532 topics
remained as relevant for the gas market. Tickers not containing one of these topics
are filtered out. In addition, various keyword filters are applied: documents con-
taining terms like soy or wheat are removed. Filtering thus reduced the
set to 117,699 relevant tickers. Besides the news tickers, relevant market
data were identified through expert interviews. In sum, 322 attributes
that might cause a price development are considered relevant. These
market data are available every 15 seconds. Table 3.2 provides a selec-
tive overview.
According to the domain expert, the reaction time of the gas mar-
ket to an event is a two-minute interval. Both mapping paradigms are
applied. Backward mapping leads to 34,653 mappings, of which only
6,687 belong to UNSTABLE. This distribution makes sense: only a few
articles will cause a price change. Still, assuming 20 working days
per month, approximately 3.4 important tickers are published per hour.
feature vector of training and test data has to be similar. The scenario
simulates practical usage, because the models were trained with data
from November 2011 until April 2013. Each model prediction was
checked as to whether it was true after two minutes.
Out of 10,000 examples, only 360 are UNSTABLE; the model should
therefore identify the relevant items and label the remaining
examples STABLE. The accuracy results of the different models and map-
ping hypotheses are shown in Table 3.3.
Overall model accuracy is of minor importance: a STABLE classification
of all items would already yield an accuracy of 96.40 percent. It is the
accurate identification of relevant examples that needs to be addressed,
so the UNSTABLE precision and recall are of major interest. Comparing both
mapping methods, forward mapping generates the best results, but only
minor differences to backward mapping are observed. Thus, in future,
both methods have to be applied to choose the best approach. The Naive
Bayes (Kernel) model achieves the best results. Figure 3.8 illustrates the
details of the best model. A total of 23.61 percent of UNSTABLE cases are
identified. In this context, out of 10,000 test cases and 360 possible hits, at
least 85 are correctly identified. A total of 9,206 items are correctly identified
as irrelevant. Only 518 cases are forwarded to the second model, and
433 of them caused no price movement. However, this does not negate
the impact of these messages: it is conceivable that a price change needs
more than two minutes. In fact, a manual evaluation by a domain expert
points out that 16 percent of the 433 tickers are relevant. Nevertheless,
most of the irrelevant items are identified and almost a quarter of the relevant
cases are found. Thus, the results are practicable.
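The reported figures can be cross-checked with a short confusion-matrix calculation. The sketch below uses only the counts quoted in the text (the variable names are ours; note that the published counts sum to 9,999 rather than 10,000):

```python
# Model 1 counts as reported in the text (UNSTABLE = positive class).
tp = 85           # UNSTABLE tickers correctly forwarded
fp = 518 - 85     # forwarded tickers that caused no price movement (433)
fn = 360 - 85     # truly UNSTABLE tickers the model missed (275)
tn = 9206         # tickers correctly labeled STABLE

precision = tp / (tp + fp)                  # 85 / 518
recall = tp / (tp + fn)                     # 85 / 360
accuracy = (tp + tn) / (tp + fp + fn + tn)

print(f"precision={precision:.4f}")  # ~0.1641
print(f"recall={recall:.4f}")        # ~0.2361, the 23.61 percent quoted above
print(f"accuracy={accuracy:.4f}")
```

The recall of 85/360 reproduces the 23.61 percent of UNSTABLE cases identified, and the precision of roughly 16 percent matches the domain expert's relevance estimate for the forwarded items.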
The performance of the second model is excellent (Roomp et al.
2010). Again, the best results are obtained when forward mapping
is selected during model training. The Naive Bayes (Kernel) predicts
UP and DOWN examples 91.76 percent correctly; just seven examples
are wrong. Note, however, that this accuracy is estimated on the 85
correctly forwarded items of Model 1. The 433 wrongly forwarded STABLE
cases are not counted, because they belong neither to UP nor to DOWN,
so a correct prediction for them is impossible (Table 3.4).
The case study is implemented as a prototype (see Fig. 3.9) within a
trading floor and follows the live process (Fig. 3.5). News tickers are
preprocessed and filtered. Market data and price data are joined with the
remaining tickers through forward mapping. Text mining is applied and
item vectors are forwarded to the models. The models process in real
time. Only UNSTABLE predictions will be labeled as UP or DOWN. The
calculated trends are stored in the database. The dashboard lists updates
and changes, which are immediately moved to the user interface. The most
recent news ticker is presented as a headline on top of the table. Besides
the news ticker, users can obtain additional market information through
the details on the right-hand side. Here, all information used during the
model prediction is highlighted through a pop-up table. In addition,
the full text is provided at the bottom right-hand-side text box. Users can
select historic predictions within the table. Based on the selected item, a
chart illustrates the market price before and after the publication. A slider
allows an interactive time-interval selection so that different horizons can
be observed. In this context, traders can analyze the impact of current
and historic items to gain knowledge of the market behavior. Lastly,
the confidence column indicates how certain the model's prediction was.
A confidence filter can be applied to reduce the amount of news tickers
within the user interface, so that only predictions reaching a minimum
confidence are shown.
3.5 Conclusion
Business analytics provides a wide field of possible application scenarios.
One of them is the prediction of price trends. In recent years, great
progress has been made; in particular, the rethinking driven by the
term Big Data has increased the interest in business analytics. Thus, new
data sources are combined to allow an extended market understanding
(Pospiech and Felden 2013). This section provided such an application
scenario and introduced a generic forecast approach to integrate unstructured
news tickers and structured market data. The approach was applied
within two different markets, whereby other scenarios are imaginable. The
results of the predictions are practicable and comparable to state-of-the-art
research. Even the drawbacks of Big Data are addressed by this business
analytics approach: the requirement of a more task-oriented provision of
data, arising from the increasing availability, variety, and complexity of new
data sources, is fulfilled, preventing an information flood (Pospiech
and Felden 2012). Out of 10,000 examples, just 518 tickers are forwarded
to the user, which yields a benefit in the context of decision-making. In
contrast to other approaches, the given prototype is event based: changes
published by news tickers are immediately processed. Nevertheless, one
drawback remains. New information that is not published as a text document
is not perceived by the system, because audio and video formats
are not in the system's scope. Additionally, if there are no news tickers, no
price forecast will happen. It also has to be understood that not all news
tickers pulled by the dashboard are relevant, and the decision-maker has
still to decide how to handle the given information. Regarding the process,
it has to be stated that text and market data are weighted equally.
Thus, a prediction is perhaps not caused by a news ticker, but rather by
the market variables themselves. However, the calculated forecast does not
lose its validity.
References
Breiman, Leo, Jerome Friedman, Charles J. Stone, and Richard A. Olshen. 1984.
Classification and Regression Trees. Belmont: Wadsworth.
Busse, Sebastian, Patrick Helmholz, and Markus Weinmann. 2012. Forecasting
day-ahead spot price movements of natural gas: An analysis of potential influence
factors on basis of a NARX neural network. Paper presented at the
Multikonferenz Wirtschaftsinformatik, Braunschweig, Germany.
Chan, Yue-cheong, Andy C.W. Chui, and Chuck C.Y. Kwok. 2001. The impact of
salient political and economic news on the trading activity. Pacific-Basin
Finance Journal 9(3): 195–217.
Chang, Yin-Wen, Cho-Jui Hsieh, and Kai-Wei Chang. 2010. Training and testing
low-degree polynomial data mappings via linear SVM. Journal of Machine
Learning Research 11(4): 1471–1490.
Chawla, Nitesh V., Nathalie Japkowicz, and Aleksander Kolcz. 2004. Editorial:
Learning from imbalanced datasets. SIGKDD Explorations Newsletter 6(1):
1–6.
Davenport, Thomas, and Jeanne Harris. 2007. Competing on Analytics: The New
Science of Winning. Boston: Harvard Business School Press.
Duarte, Andre, Jose Nuno Fidalgo, and Joao Tome Saraiva. 2009. Forecasting
electricity prices in spot markets: One week horizon approach. Paper presented
at the IEEE PowerTech, Bucharest, Romania.
Fayyad, Usama, Gregory Piatetsky-Shapiro, and Padhraic Smyth. 1996. From data
mining to knowledge discovery. In Advances in Knowledge Discovery and Data
Mining, ed. Usama Fayyad, Gregory Piatetsky-Shapiro, and Padhraic Smyth.
Menlo Park: AAAI Press.
Felden, Carsten, and Peter Chamoni. 2003. Web farming and data warehousing
for energy tradefloors. Paper presented at the IEEE Web Intelligence (WI).
Gartner. 2013. Hype cycle for big data. https://www.gartner.com/doc/2574616.
Accessed 28 April 2014.
Geva, Tomer, and Jacob Zahavi. 2010. Predicting intraday stock returns by integrating
market data and financial news reports. Paper presented at the
Mediterranean Conference on Information Systems (MCIS).
Khandar, Punam V., and Sugandha V. Dani. 2010. Knowledge discovery and sampling
techniques with data mining for identifying trends in data sets.
International Journal on Computer Science and Engineering (IJCSE) (Special
Issue): 7–11.
D. Anthony Miles
4.1 Introduction
The use of analytics is becoming popular due to the popularity of such films
as Moneyball, from which emerged a philosophy of statistical thinking.
Using predictive analytics in business is not a secret anymore; it is now
becoming a big part of decision-making in companies. Using analytics
to study and predict patterns in businesses is important in this era of big
data.
The field of marketing has long suffered because it was hard to
determine the effectiveness and return on investment (ROI) of
advertisements and promotional campaigns. The use of analytics in
business is now becoming a standard practice among corporations
and businesses. A major point of using analytics is to help researchers
4.2.2 Analytic Modeling
The use of analytic modeling is a further evolution of analytics in
marketing. Surprisingly, analytic models are also used in conceptualizing
marketing analytic endeavors. They are characterized by precision of
expression. Furthermore, the use of analytic models is especially valuable
when they generate insights that are conditional or strategic in nature
as opposed to first-order or main effects. Such effects can be very diffi-
cult to document empirically, either because they cannot be disentangled
from the web of factors interacting in a complicated real-world market
or because their incremental effect on outcomes may not be measurably
large (Coughlan et al. 2010). Analytics have been historically rooted in
mathematical and statistical models (Chen etal. 2010; Drye 2011; Dufour
etal. 2012; Furness 2011; Gnatovich 2007; Marsella et al. 2005; Steinley
and Henson 2005).
4.2.3 Predictive Analytics
The transition from traditional analytics to predictive analytics has
been critical in the evolution of marketing and business intelligence (BI);
indeed, the latest shift in the BI market is the move from traditional
analytics to predictive analytics. Although predictive analytics historically
belongs to the BI family, it is emerging as a distinct new software sector
(Zaman 2003).
Some marketing managers are traditionalists and still fight the use of
analytics as a necessary measurement tool. Many skeptics' preference is the
4.2.5 Marketing Metrics
Many marketing professionals have had some conflict with using metrics
to measure the effectiveness of advertising or marketing efforts.
Few marketers recognize the extraordinary range of metrics now available
for evaluating their strategies and tactics. Companies are now using frameworks
for presenting marketing metrics. There are basically five types of
marketing metrics companies use: (a) Customer and Market Share-based,
(b) Revenue and Cost-based, (c) Product-based and Portfolio-based, (d)
Customer Profitability-based, and (e) Sales Force and Channel-based (Farris
et al. 2006). A marketing metrics framework must demonstrate how mar-
2009). Finance and marketing have traditionally been on different pages, talk-
ing different languages and unable to establish common goals (See 2006).
As with analytics, a marketing manager must be careful in the use of
metrics. Many traditional-school marketing professionals are still not
convinced by marketing metrics, as with marketing analytics; they still
cling to the old way of doing marketing. However, there is a dark side
to metrics: like anything, overuse of marketing metrics can lead to
disastrous results. The use of metrics can lead to an over-reliance on
statistical modeling techniques (Ozimek 2010).
4.3 Methodology
4.3.1 Population and Sample
The data were collected through an internet questionnaire and a paper
questionnaire. The participants were FBEs, selected from the yellow pages,
local women's chambers of commerce (with assistance from local contacts),
and the Small Business Development Center (SBDC). The participants were
able to complete the Marketing Activity and Customer Activity Scale
(MACS) survey from their offices via the internet.
A total of 11 industry sectors were examined for this study. For each
market, a combined convenience and random sample of approximately
123 FBEs was drawn from a population of 12,256. The questions
about brand relation dealt with this particular brand. A five-point Likert
scale was used, ranging from 1 (Strongly Agree) to 5 (Strongly Disagree).
The data were collected over the duration of one year (2012–2013).
4.3.2 Research Hypotheses
Four statistical hypotheses were tested for this study. The general hypothesis
is that there is significance in FBEs based on four marketing analytics.
The hypotheses can be segregated and studied as follows: the first two
hypotheses suggest that customer behavior analytics are made significant
by the customer behavior activity in the FBEs, and the second two
hypotheses suggest that market behavior and competition analytics are
made significant by the market behavior activity in the FBEs. Researchers
suggest that an emphasis on one or more metrics within each analytic is
necessary for examining customer behavior and marketing behavior.
Based on the models presented, the research analytics are given in Fig. 4.1.
on marketing analytics. The validity and reliability of the scale were assessed
using principal component factor analysis (PCA) and structural equation
modeling (SEM). The researcher conducted validity tests on the MACS
instrument through additional tests (see later in the results), such as
internal consistency using Cronbach's alpha and multivariate techniques.
business enterprise and thus leads to enhanced market and economic
growth.
behavior activity in the business enterprise and thus leads to enhanced
market and economic growth (see Table 4.1).
activity with SMEs. The MACS instrument was adapted to ensure that
it was appropriate for use in analyzing marketing analytics for FBEs.
The questionnaire instrument consisted of two sections: (a) Section 1,
sociodemographic characteristics information, and (b) Section 2,
marketing and economic characteristics information. The MACS instrument
used a five-point Likert scale ranging from 1 (Strongly Disagree) to
5 (Strongly Agree); the participants were asked to rate the importance of
each of the ten marketing metrics and economic metrics to determine the
significant analytics.
Data Analysis. The study used statistical analyses to examine the
data from the sample. First, descriptive statistical methods such as
frequencies and distribution analysis were used to analyze the
characteristics of the FBEs. Second, an exploratory factor analysis
(EFA) was used, followed by a Pearson correlation. Lastly, a structural
equation model (SEM) was used for a path analysis of the data.
Statistical Analyses Tools. The statistical analyses for the data in
the research were performed using SPSS (Statistical Package for
the Social Sciences) Version 21.0. AMOS (Analysis of Moment
Structures) Version 21.0 software (Arbuckle 1995) was used for the
SEM. First, a data screening was conducted to inspect the variables
for the multivariate analyses. SPSS was used for computing the
descriptive, inferential, and multivariate statistics; AMOS was used
for computing the SEM. The sample (N = 123) of FBEs was selected
to test the psychometric properties of the 18-item MACS. First, the
exploratory factor analysis (EFA) was performed. Then a path analysis
was conducted to assess the model fit, confirming multivariate
normality and the refined marketing analytics and metrics items.
4.5 Results
This section of the study presents the results of the statistical analyses.
The purpose of this study is to examine marketing analytics with FBEs.
Four hypotheses were tested with four marketing analytics. First, a
descriptive statistical analysis was conducted on the sociodemographics;
data such as age, gender, ethnicity, and so on were examined in the study.
Second, an EFA was conducted to determine the factor structure of the
analytic metrics and variables. Third, a path analysis was conducted using
an SEM, which was used to test the validity of the factor structure. This
was conducted using AMOS to determine which path structure adjusts
better to the MACS instrument, and its fit was measured through the
following indices. Lastly, a Cronbach's alpha was computed to measure
internal consistency in the MACS instrument (see Table 4.2).
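Cronbach's alpha, used here to gauge internal consistency, can be computed directly from an item-score matrix. A minimal sketch with made-up Likert responses (illustrative data only, not the study's):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items matrix of Likert scores."""
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return k / (k - 1) * (1 - sum_item_var / total_var)

# Five respondents rating four hypothetical items on a 5-point scale
scores = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 5, 4],
    [1, 2, 1, 1],
    [3, 3, 4, 3],
])
print(round(cronbach_alpha(scores), 3))  # → 0.976
```

Items that move together inflate the total-score variance relative to the sum of item variances, which is what pushes alpha toward 1 for a consistent scale.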
[Fig. 4.2 Conceptual model of study: Path analysis of firm variables on analytics. The diagram links the FBE firm variables to Analytics 1 through 4 via hypotheses H1 through H4, each with a positive expected effect on FBEs.]
Characteristic                               n      %
Owner ethnicity
  Asian (Pacific Islander)                   4    3.3
  Black (non-Hispanic)                      25   20.3
  Hispanic                                  56   45.5
  Native American Indian                     2    1.6
  White (non-Hispanic)                      33   27.0
  Other                                      3    2.4
Industry type
  Agriculture                                3    2.4
  Communications                             3    2.4
  Construction                              10    8.1
  Finance                                    4    3.3
  Manufacturing                              4    3.3
  Retail Trade                              12    9.8
  Services                                  49   40.0
  Technology                                 6    4.9
  Transportation                             1    0.8
  Wholesale                                  5    4.1
  Other Industry                            26   21.1
Business entity type
  Corporation                               28   22.8
  Limited Liability Corp. or Partnership    15   12.2
  Partnership                                9    7.3
  Sole Proprietorship                       66   53.7
  Other                                      5    4.1
Employee number
  1–10                                     112   91.1
  11–20                                      8    6.5
  21–30                                      1    0.8
  51–100                                     1    0.8
  101–200                                    1    0.8
Franchise
  Franchise                                 13   10.6
  Non-franchise                            110   89.4
(N = 123)
Four hypotheses were tested on a theoretical model based on four
different marketing analytic categories: (a) Customer Turnover Analytic,
(b) Customer Credit Analytic, (c) Market Potential Analytic, and (d)
Competition and Economics Analytic.
Note: Extraction Method: Principal Axis Factoring. Rotation Method: Varimax with Kaiser Normalization.
Rotation converged in 12 iterations. As a benchmark for this study, a minimum coefficient of 0.3
will be used as the standard.
[Fig. 4.3 SEM path analysis results for the MACS instrument (k = 10 items): a path diagram linking the firm variables (V15, V16) and metric items V17 through V24 to Analytics 1 through 4, with standardized path coefficients and error terms.]
0.031, IFI = 0.932, AIC = 194.330, and BCC = 209.122. A notable
observation from goodness-of-fit theory is that a model demonstrating
good to marginal fit is not thereby the best path model, but only a
plausible one (Kline 1998). The cross-validation with the sample examined
the psychometric properties of the measurement model. A chi-square
difference test (χ² = 96.330, df = 86, p < 0.0000001) further suggests the
measurement model was invariant, and scale constructs were perceived in a
similar manner across the sample (Kline 1998) (see Fig. 4.3 and Table 4.4).
4.5.4 Correlation and Analytics
Correlation analyses were also used to examine relationships among
marketing metrics within the analytics. The MACS instrument measures
the market analytics in the FBEs, and Table 4.5 shows strong correlations
among the analytic variable metrics. First, in the Customer Credit Analytic,
the results indicate a significant relationship between the metrics
V17-Customer Credit and V18-Line of Credit (r = 0.465, p < 0.01).
Second, in the Market Potential Analytic, the results indicate a significant
relationship between the metrics V20-Barriers to Entry and
V23-Government Regulation (r = 0.381, p < 0.01). Third, in the Customer
Turnover Analytic, there was a significant relationship between the metrics
V15-Velocity of Profit and V16-Customer Activity/Turnover (r = 0.306, p <
0.01). Lastly, the Competition and Economics Analytic showed no significant
correlations in the data. In summary, the Market Potential Analytic was
found to be a potent predictor of market potential (see Table 4.5).
Table 4.5 Correlations of observed analytics and metric items and covariates
Analytics and metric items | Means | SD | V17¹ | V18¹ | V20² | V23² | V24² | V15³ | V16³ | V19⁴ | V21⁴ | V22⁴
Note: ** denotes correlation is significant at p < 0.01 (two-tailed). * denotes correlation is significant at p < 0.05 (two-tailed)
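The significance judgments above (e.g., r = 0.465, p < 0.01 for V17/V18) follow from the usual t test on a Pearson coefficient. A sketch with synthetic data of the study's sample size (n = 123; the variables and effect size here are illustrative, not the study's data):

```python
import numpy as np

def pearson_r_and_t(x, y):
    """Pearson r and its t statistic (df = n - 2)."""
    r = np.corrcoef(x, y)[0, 1]
    n = len(x)
    t = r * np.sqrt((n - 2) / (1 - r * r))
    return r, t

rng = np.random.default_rng(42)
x = rng.normal(size=123)             # stand-in for one metric item
y = 0.5 * x + rng.normal(size=123)   # correlated stand-in for another

r, t = pearson_r_and_t(x, y)
# For n = 123 (df = 121), |t| > 2.62 corresponds to p < 0.01 two-tailed
print(f"r={r:.3f}, t={t:.2f}, p<0.01: {abs(t) > 2.62}")
```

With n = 123, even a moderate correlation of roughly 0.3, like the ones reported here, clears the p < 0.01 threshold.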
Table 4.6 Linear regression model of the firm variables' effect on Analytic 1: Customer Credit
Analytic 1: Customer Credit | Regression coefficient | SE | t | p
Table 4.7 Linear regression model of the firm variables' effect on Analytic 2: Market Potential
Analytic 2: Market Potential Analytic | Regression coefficient | SE | t | p
Second, although the measures were drawn from the marketing literature,
it must be noted that some of the analytics revealed interesting results,
specifically concerning the Competition and Economics Analytic
(CEAn = CPI1 + ECR2 + GOR3). Considering that customer behavior is
difficult to measure in firms, a pivotal aspect of customer behavior is
examining how analytics can be used by
Table 4.8 Linear regression model of the firm variables' effect on Analytic 3: Customer Turnover
Analytic 3: Customer Turnover | Regression coefficient | SE | t | p
patterns in the data and directed at specific target industries (Bailey et al.
2009; Kridel and Dolk 2013; Hair 2007). Therefore, the specific research
design was warranted to determine whether such marketing and customer
behaviors meet the noted condition for establishing that the marketing
analytics effect has indeed occurred.
Third, the findings of the study underscore several key points. Four
hypotheses were tested on a theoretical model based on four different
marketing analytic model categories: (a) Customer Turnover Analytic,
(b) Customer Credit Analytic, (c) Market Potential Analytic, and
(d) Competition and Economics Analytic. The most salient finding from
the results is that half of the analytics supported the four hypotheses.
The findings indicate that H1 was supported, H2 was supported, H3 was
not supported, and H4 was
Table 4.9 Linear regression model of the firm variables' effect on Analytic 4: Competition and Economic
Analytic 4: Competition and Economic | Regression coefficient | SE | t | p
supported. Interestingly, the findings of the study could not support the
marketing analytic models concerning the predictor variables' (ethnicity,
industry type, franchise, and employee number) influence on FBEs.
Lastly, the results of the first three analytic models prove most relevant
to situations involving FBEs' ability to borrow money from financial
institutions. The model also shows that the fourth analytic (Competition
and Economic Analytic) needs to be modified due to the low coefficients.
With this modification, the analytic model could possibly bear more
statistical significance in the data; thus the researcher would consider
modifications to the metric items.
the economic competition dynamics and the effects of other forms
of competition metrics and analytics. This type of analysis could lead to
establishing new theoretical models for further refinement of marketing
models. It could also provide much richer theories that could assist
firms in different industries or market sectors.
References
Apte, C.V., Ramesh Natarajan, Edwin P.D. Pednault, and F.A. Tipu. 2002. A
probabilistic estimation framework for predictive modeling analytics. IBM
Systems Journal 41(3): 438–448.
Arbuckle, J.L. 1995. AMOS User's Guide. Chicago: Smallwaters Publishing.
Bailey, Christine, Paul R. Baines, Hugh Wilson, and Moira Clark. 2009.
Segmentation and customer insight in contemporary services marketing practice:
Why grouping customers is no longer enough. Journal of Marketing
Management 25(3–4): 227–252.
Bijmolt, Tammo H.A., Peter S.H. Leeflang, Frank Block, Maik Eisenbeiss, Bruce
G.S. Hardie, Aurélie Lemmens, and Peter Saffert. 2010. Analytics for customer
engagement. Journal of Service Research 13(3): 341–356.
Brooks, Neil, and Lyndon Simkin. 2011. Measuring marketing effectiveness: An
agenda for SMEs. The Marketing Review 11(1): 3–24.
Chen, Yongmin. 2006. Marketing innovation. Journal of Economics & Management
Strategy 15(1): 101–123.
Chen, Chun-An, Ming-Huang Lee, and Ya-Hui Yang. 2010. Branding Taiwan for
tourism using Decision Making Trial and Evaluation Laboratory and Analytic
Network Process methods. The Service Industries Journal 32(8): 1355–1373.
Coughlan, Anne T., S. Chan Choi, Wujin Chu, Charles A. Ingene, Sridhar
Moorthy, V. Padmanabhan, Jagmohan S. Raju, David A. Soberman, Richard
Staelin, and Z. John Zhang. 2010. Marketing modeling reality and the realities
of marketing modeling. Marketing Letters 21(3): 317–333.
Dash, Debi Prasad, and Alok Sharma. 2012. B2B marketing through social media
using web analytics. PRIMA 3(2): 22. Publishing India Group.
Drye, Tim. 2011. Neighbourhood effects and their implications for analytics and
targeting. Journal of Direct, Data and Digital Marketing Practice 13(2):
119–131.
Farris, Paul W., Neil T. Bendle, Phillip E. Pfeifer, and David J. Reibstein. 2006.
Marketing Metrics: 50+ Metrics Every Executive Should Master. Pearson
Education.
Fluss, Donna. 2010. Why marketing needs speech analytics. Journal of Direct,
Data and Digital Marketing Practice 11(4): 324–331.
Myropi Garri and Nikolaos Konstantopoulos
5.1 Introduction
During the last four decades, a wide range of researchers have explored
the processes of developing and implementing successful strategies for
national and international markets. Strategic management, as a scientific
field, has grown quickly and today encompasses a wide plurality of research
questions, units of analysis, and modeling tools, as a plethora of theories
(e.g., Power of Competition, Resource-Based View (RBV), Dynamic
Capabilities) and factors (macro- and micro-external, internal) have been
interrelated to it.

M. Garri (*)
University of Portsmouth, School of Business, Portsmouth, UK
e-mail: myropi.garri@port.ac.uk

N. Konstantopoulos (*)
University of the Aegean Business School, Chios, Greece
e-mail: nkonsta@aegean.gr

However, in the light of interconnectedness, the high complexity of
analysis we have reached, instead of enlightening and puzzling out the
process of developing strategy, has in contrast increased its level of
complexity, vagueness, and fragmentation. Furthermore, during the
contemporary times of turbulence, uncertainty about possible evolutions
and future trends of all these factors has increased dramatically.
In addition to the complicated and complex business world that we have
to take into account while shaping strategy, other contextual evolutions,
such as technological improvements and the rise of social media and big
data, gradually change the value-adding operations of the firm. Thus, we
note that in the field of strategic management a need has been created:
instead of integrating new approaches and theories into the field, further
increasing the level of complexity and uncertainty of strategic decision
making, we should start to revisit the foundations of the strategy
formulation process. By revisiting one by one the different stages of
strategy formulation, we will understand whether and how much each
process of strategy development has been reshaped by current evolutions.
Reviewing the underpinnings of strategy formulation and adopting a
pragmatic rather than theoretical approach will decrease vagueness,
providing clearer answers on the real contemporary practices used by
managers and entrepreneurs to develop successful strategies in foreign
markets. Taking into account the above-described framework, in this
chapter we are going to revisit strategy in use.
Given that the current business environment worldwide is under
constant and radical change, we feel that one of the most important issues
companies have to address while formulating strategy is to develop
processes to capture the current trends in the market and the industry
they belong to, so as to incorporate them in their strategic planning.
Therefore, this chapter concentrates on one of the most important
areas of the traditional approach to strategy formulation: information
acquisition and processing. Understanding the exploitation of the latest
technology for market research purposes as a value-adding element for
the firm, leading to the creation of successful strategies in foreign markets,
we examine the evolution of technology's effect on the process
of information obtainment and processing. Specifically, we are going to
review the process internationalized enterprises implement in practice in
order to obtain information for market and industry research purposes,
investigating the types of information obtained, the sources of information,
and the processes of handling the acquired information.
At the same time, we cannot ignore the fact that there is no magical
recipe or golden rule of success valid for all enterprises. The characteristics
that make each company a unique entity lead to the generation of dynamic
capabilities, competitive advantage, and higher levels of performance
(Eriksson et al. 2014; Villar et al. 2014).
Another similar viewpoint holds that organizational culture affects
decision making in a company; the corporate orientation, combined with
the knowledge that the firm absorbs from the market, constitutes the
crucial component leading the enterprise to sustainable development and
growth, and may redetermine its course (Dornberger and Nabi 2008).
The effective distribution of resources leverages the internalization of the
external environment's effect, a fact that drives strategic development
and organizational adjustment (Marciano 2011).
In summary, the transition of the company to a completely new business
environment calls for, and is managed by, the adjustment of business
strategy and policy to the newly emerging conditions. In addition, the
successful implementation of an internationalization strategy presupposes
the ownership or development of appropriate structures and processes by
the firm, in order to leverage the adaptation of the enterprise to the new
market settings in which it is called upon to operate and develop. Thus, a
first main research hypothesis can be formed:
H1: The characteristics, structures, and past strategies of the firm
constitute the main factors that facilitate or prevent the formulation and
implementation of future strategies.
this direction, CRM provides valuable insights into the way the company's
products can be modified and promoted effectively, as it integrates
management of clusters of customers, offering tailor-made solutions that
address their personalized needs.
At the same time, information mined and explored supports the
development of various growth strategies, such as an internationalization
strategy. CRM has also been identified as a crucial part of internationalization
strategy development. Although there has not been wide convergence on the
concept, Ngai (2005) underlined the significance of understanding CRM
as a comprehensive set of strategies for managing those relationships with
customers that relate to the overall process of marketing, sales, service,
and support within the organization (p. 583). The effective use of CRM
software not only allows companies to build and maintain strong
relationships with domestic customers but also broadens the ability of the
firm to reach new market potential in foreign markets (Harrigan et al.
2008), an important step toward the company's internationalization.
As proposed by McGowan and Durkin (2002), CRM increases both the
internal and external organizational efficiency throughout all stages of the
supply chain. However, it should not be employed autonomously but in
alignment with the overall internationalization strategy (Harrigan et al. 2008).
The fact that a company owns and uses customer information software
can be a sign of having adopted a proactive and strategically alert
behavior in its operation in the local and foreign markets. To explore in
practice (a) the relationship between the implementation of CRM and
the marketing and organizational strategies of the firm, representing its
strategic thinking and strategic behavior, and (b) the relationship between
the use of customer information acquisition software and the level of
internationalization activity of the company, we develop the following
hypotheses:
H4: We expect companies owning a CRM system to design and apply
multiple, direct, and indirect marketing strategies in the foreign markets
in which they are active.
H5: We expect companies owning a CRM system to be actively involved
in and highly committed to the markets in which they develop their
business activity. We expect these companies to have a higher grade of
internationalization involvement.
H6: We expect companies owning a CRM system to have a high level of
strategic complexity so as to support their internationalization activities.
STRATEGIC PLANNING REVISITED: ACQUISITION ANDEXPLOITATION... 119
5.3 Methods
As proposed by many researchers in the field (e.g., Ramamurti 2004),
our research methodology involves both qualitative (Ritossa and Bulgacov
2009) and quantitative (Hutchinson et al. 2009) research methods. The
integration of important constructs proposed by prior research, along with
the variables deriving from qualitative research, assembles a valid research
framework of the examined field. We conduct in-depth personal interviews
to refine our constructs and to develop a closed questionnaire. Employing
interviews before embarking on the questionnaire gives a feel for the key
issues and confidence that the most important issues are addressed
(Saunders et al. 2011). Then, we test our hypotheses using the
survey data. To measure the internal cohesion of the questionnaire, we
use the reliability coefficient Cronbach's α. The value of the coefficient
is 0.967, indicating that the reliability of the questionnaire is very high
(96.7 %). We decided to exclude all internationalized companies providing
services, as they constitute a unique case, which requires the development
of a different theoretical background.
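Cronbach's α can be computed directly from the item-score matrix; the formula below is the standard one, but the score data in this sketch are invented for illustration, not taken from the study:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)
```

Perfectly correlated items yield α = 1; a coefficient of 0.967 therefore indicates item responses that move almost in lockstep.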
A total of 1400 internationalized manufacturing companies are identified
via the HEPO (Hellenic Foreign Trade Board) directory. We apply
a multi-industry stratified sampling design so as to broaden the generalizability
of the findings. We address the questionnaires to internationalization
managers/directors. Of the 165 questionnaires received, 158 were
usable, corresponding to 11.29 % of the population. An effective
response rate of 36.66 % was attained. We compare responding and non-responding
companies in terms of size and mode of internationalization.
We do not find any significant differences between these two groups,
which suggests that there is no response bias.
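The population-coverage figure can be checked with one line of arithmetic, using the counts reported above:

```python
# Sample coverage figures as reported in the text.
population = 1400   # internationalized manufacturing companies (HEPO directory)
usable = 158        # usable questionnaires received

coverage = 100 * usable / population  # share of the population covered
round(coverage, 2)                    # 11.29
```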
5.3.1 Measures
Dependent Variables. Integrating information mined from the literature
review and the results of interviews with internationalization
directors, we decided to include the following dependent variables in
our questionnaire (Table 5.1).
A five-grade Likert scale was used to measure the impact of each motive
on the company's internationalization decision-making process. The
operationalization of the independent variables, which are the structural,
strategic, and contextual characteristics of the firm, is available at http://
tinyurl.com/p352ygz.
120 M. GARRI AND N. KONSTANTOPOULOS
Table 5.1 Dependent variables (two-column layout reconstructed from the flattened text)

Types of market information sought:
- Information about the behavior of the product
- Information about the consumer behavior of the foreign market
- Information about the competitive products
- Information about cultural dimensions and socio-political dimensions of the market
- Information about the other factors affecting the market

Information sources:
- Local state's assistance (consulates, embassies, commercial attachés)
- Compatriot companies that operate in the foreign market
- Co-operating companies
- Institutional bodies that offer assistance/information
- Visits in the foreign country/market from managers of the company
- Company's executives seeking information in foreign markets
- Trade shows/exhibitions
- Company's representatives abroad
- Consultancy firms, research agencies

CRM:
- The company owns and uses CRM software
For the first cluster of companies, we observe higher means for every
variable compared to the total mean. Conversely, for the firms of the second
cluster, we observe lower means for every variable compared to the
total mean. The means of the first cluster range at almost the same value
for all variables (max: 3.90, min: 3.33), except for the variable Information
about the Competitive Products. The mean of this variable is the highest
(4.20), indicating that most enterprises primarily care about the level
of competition in the industry. It is the highest
in both clusters and in total. Even for the second cluster of enterprises, the
one that tends not to acquire market information, the mean of this vari-
able is 2.46, while the total mean fluctuates around 3.67. That indicates
that for every enterprise, even for those that do not widely gather market
information, competition is their main concern before and after entering
a market. They collect information about the competitive products (price,
quality, distribution channels used, promotion strategy, etc.), even if they
do not care much to acquire any other type of market information. In a
sense, our findings highlight competition as a regulatory force, as it may
enforce or prevent the entrepreneurial decision for activating in a mar-
ket, as well as a market strategy shaping element (Menon and Varadarajan
1992).
The second cluster's means range from 1.54 to 2.46, revealing that
about 30 % of the entrepreneurs seek little or no information about the
Table 5.3 Logistic Regression Results for Types of Information Obtained, and
Characteristics, Strategies, and Structures of the Firm (columns: B, S.E., Wald, df, Sig., Exp(B))

Table 5.4 Compare Means for the Institutional Information Sources Variables
(rows by cluster; columns: Local State's Institutional Assistance (Consulates,
Embassies, Commercial Attachés); Companies operating in the foreign market;
Institutional Bodies of the Foreign Country; Consultancy Firms/Research
Agencies; Institutional Bodies that offer assistance/information)
for all variables (max: 3.58, min: 2.39), while the second cluster's means
range from 2.10 to 1.35. We assume that the first cluster of companies has
obtained information for its internationalization activity mainly from
institutional information sources, while the second cluster of companies
has not. The Compare Means table produced by SPSS for the Inter-organizational
and Market Information Sources factor provides the mean of each variable
belonging to this group of variables per cluster (Table 5.5). Combining this
information, we can determine the status of the two clusters of the factor.
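The SPSS "Compare Means" step described above amounts to grouping scores by cluster label and comparing each cluster's mean with the total mean; a minimal sketch with invented Likert scores:

```python
import numpy as np

# Toy data (all scores hypothetical): each firm has a cluster label and a
# Likert score on one information-source variable.
clusters = np.array([1, 1, 1, 2, 2, 2, 2])
scores = np.array([4.0, 4.0, 3.0, 2.0, 1.0, 2.0, 1.0])

mean_cluster1 = scores[clusters == 1].mean()  # expected above the total mean
mean_cluster2 = scores[clusters == 2].mean()  # expected below the total mean
total_mean = scores.mean()
```

The pattern reported in the chapter is exactly this: one cluster sits above the total mean on every variable, the other below it.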
As shown, companies belonging to the first cluster (70 in total, 44.30 %)
have higher observed means for every variable compared to the total
mean. Conversely, firms belonging to the second cluster (82 in total, 55.70 %)
have lower observed means for every variable compared to the total
mean. The means of the first cluster range at almost the same value for all
variables (max: 3.99, min: 3.23), while the second cluster's means range
from 3.15 to 1.61. Cluster analysis results reveal that some companies
obtain information about the foreign market from institutional information
sources, while other companies do not extensively use this kind
of information source. In addition, there is a group of companies that
acquire information so as to support their internationalization activity from
Inter-organizational Factors and Market Factors, while another group
of companies does not. Results show that there is a group of companies
Table 5.5 Compare Means for the Inter-organizational and Market Information
Sources Variables (rows by cluster; columns: Compatriot companies;
Co-operating Companies; Visits in the foreign country/market from managers
of the company; Trade Shows/Exhibition; Company's Representatives Abroad)

Phi correlations between ownership of customer information software and the
examined variables (Approx. Sig. in parentheses): 0.257 (0.001) and 0.159
(0.046); 0.244 (0.002), 0.169 (0.034), 0.245 (0.002), 0.208 (0.009), and
0.238 (0.012); 0.230 (0.004) and 0.355 (0.000).
Table 5.3 (continued). Columns: B, S.E., Wald, df, Sig., Exp(B)
Internationalization via Distributors/Wholesalers
Internationalization via Mergers & Acquisitions: 20.325; 11307.471; 0.000; 1; 0.999; 671781900.133
Multiple strategies applied: 1.017; 0.388; 6.867; 1; 0.009; 2.766
Constant: 44.736; 22614.941; 0.000; 1; 0.998; 0.000
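The Exp(B) column of Table 5.3 reports odds ratios: Exp(B) = 2.766 for "Multiple strategies applied" means that applying multiple strategies multiplies the odds of the modeled outcome by roughly 2.8. For a single binary predictor, the same quantity can be read directly off a 2×2 table; the counts below are invented for illustration:

```python
import math

# Hypothetical 2x2 counts relating CRM ownership to applying multiple strategies.
#                       multiple strategies   single strategy
# owns CRM                     40                  20
# does not own CRM             10                  30
a, b, c, d = 40, 20, 10, 30

odds_ratio = (a * d) / (b * c)  # the Exp(B) of a one-predictor logistic model
B = math.log(odds_ratio)        # the corresponding coefficient B
```

An odds ratio of 1 (B = 0) would mean the predictor carries no information; the huge Exp(B) and Sig. = 0.999 on the Mergers & Acquisitions row are symptoms of a near-empty cell, not of a strong effect.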
ized as the development of multiple targets for each foreign market. The
existence of a special internationalization department enables the com-
pany to study foreign markets and develop multiple, structured, market-
targeted strategies. Findings are aligned with Ngai (2005), who underlined
the significance of understanding CRM as "a comprehensive set of strategies
for managing those relationships with customers that relate to the
overall process of marketing, sales, service, and support within the
organization" (p. 583). These empirical results verify our research hypothesis:
companies owning a CRM system are expected to be market-oriented,
designing and applying multiple strategies for each market they wish to
penetrate, develop, and compete in.
5.5 Conclusion
This chapter looks at the strategic behavior of internationalized enterprises
on three fronts: the kind of information they select about the
foreign market, the sources of information they use, and the use of
relevant software to organize and exploit this information. Market information
acquisition is a crucial component of internationalization, given
that it is important for the company to be informed about the overall
market conditions before deciding to invest in any market. We find
evidence interconnecting the adoption of the market information acquisition
process to the development of active marketing strategies. Enterprises,
which develop and implement complex foreign market penetration and
development strategies, tend to obtain all kinds of market information in
order to maximize their business performance in the market. Regarding
the strategic behavior of companies regarding the sources of information they
use, results showed that a wide range of information sources is used
by internationalized companies. In detail, companies mainly use institutional
information sources and inter-organizational and market
information sources. Companies using institutional sources seem to be
rather reactive, while companies using inter-organizational and market
information sources seem to be more proactive and more engaged in
internationalization. Companies of the second category show evidence of
higher internationalization intensity, of outward-looking strategies, and
of higher commitment to the internationalization vision.
This chapter also examined the interconnection between the use of
CRM software in internationalized enterprises and the development of
market-targeted strategies.
References
Barney, Jay. 1991. Firm resources and sustained competitive advantage. Journal of Management 17(1): 99–120.
Bose, Ranjit. 2002. Customer relationship management: Key components for IT success. Industrial Management & Data Systems 102(2): 89–97.
Cavusgil, S. Tamer. 1985. Differences among exporting firms based on their degree of internationalization. Journal of Business Research 12(2): 195–208.
Cepeda-Carrion, Gabriel, Juan G. Cegarra-Navarro, and Daniel Jimenez-Jimenez. 2012. The effect of absorptive capacity on innovativeness: Context and information systems capability as catalysts. British Journal of Management 23(1): 110–129.
Dornberger, Utz, and Md. Noor Un Nabi. 2008. Internationalization dynamic of Eastern German SMEs. In Proceedings of the International Council for Small Business World Conference, Halifax, Nova Scotia, Canada, 22–25 June 2008. Available at: http://sbaer.uca.edu/research/sbi/2008/creak18f.html.
Eisenhardt, Kathleen M., and Filipe M. Santos. 2002. Knowledge-based view: A new theory of strategy. Handbook of Strategy and Management 1: 139–164.
Eriksson, Taina, Niina Nummela, and Sami Saarenketo. 2014. Dynamic capability in a small global factory. International Business Review 23(1): 169–180.
Fink, Dieter. 2006. Value decomposition of e-commerce performance. Benchmarking: An International Journal 13(1/2): 81–92.
Fortune, Annetta, and Will Mitchell. 2012. Unpacking firm exit at the firm and industry levels: The adaptation and selection of firm capabilities. Strategic Management Journal 33(7): 794–819.
Hagel, John, and Jeffrey F. Rayport. 1997. The coming battle for customer information. McKinsey Quarterly (3): 64–77.
Håkansson, Håkan, Virpi Havila, and Ann-Charlott Pedersen. 1999. Learning in networks. Industrial Marketing Management 28(5): 443–452.
Harrigan, Paul, Elaine Ramsey, and Patrick Ibbotson. 2008. e-CRM in SMEs: An exploratory study in Northern Ireland. Marketing Intelligence & Planning 26(4): 385–404.
Henisz, Witold J., and Andrew Delios. 2002. Learning about the institutional environment. Advances in Strategic Management 19: 339–372.
Hitt, Michael, R. Duane Ireland, and Robert Hoskisson. 2012. Strategic Management Cases: Competitiveness and Globalization. Boston: Cengage Learning.
Hult, G. Tomas M., and David J. Ketchen. 2001. Does market orientation matter?: A test of the relationship between positional advantage and performance. Strategic Management Journal 22(9): 899–906.
Hutchinson, Karise, Barry Quinn, Nicholas Alexander, and Anne Marie Doherty. 2009. Retailer internationalization: Overcoming barriers to expansion. The International Review of Retail, Distribution and Consumer Research 19(3): 251–272.
Ingram, Paul, and Tal Simons. 2002. The transfer of experience in groups of organizations: Implications for performance and competition. Management Science 48(12): 1517–1533.
Jordá-Albiñana, Begoña, Olga Ampuero-Canellas, Natalia Vila, and José Ignacio Rojas-Sola. 2009. Brand identity documentation: A cross-national examination of identity standards manuals. International Marketing Review 26(2): 172–197.
Kajalo, Sami, Risto Rajala, and Mika Westerlund. 2007. Approaches to strategic alignment of business and information systems: A study on application service acquisitions. Journal of Systems and Information Technology 9(2): 155–166.
Knight, Gary A., and Peter W. Liesch. 2002. Information internalisation in internationalising the firm. Journal of Business Research 55(12): 981–995.
Kutsikos, Konstadinos, and Gregoris Mentzas. 2012. Managing value creation. Knowledge Service Engineering Handbook, 123.
Lagos, Dimitrios, and Konstadinos Kutsikos. 2011. The role of IT-focused business incubators in managing regional development and innovation. European Research Studies Journal 14(3): 33–50.
Leonidou, Leonidas C. 1997. Finding the right information mix for the export manager. Long Range Planning 30(4): 479–584.
Li, Tiger, and S. Tamer Cavusgil. 2000. Decomposing the effects of market knowledge competence in new product export: A dimensionality analysis. European Journal of Marketing 34(1/2): 57–80.
Liesch, Peter W., and Gary A. Knight. 1999. Information internalization and hurdle rates in small and medium enterprise internationalization. Journal of International Business Studies 30(2): 383–394.
Marciano, Alain. 2011. Ronald Coase, "The problem of social cost" and the Coase theorem: An anniversary celebration. European Journal of Law and Economics 31(1): 1–9.
Marinagi, C.C., and C.K. Akrivos. 2011. Strategic alignment of ERP, CRM and e-business: A value creation. Advances on Integrated Information Conference Proceedings, 347–350.
McGowan, Pauric, and Mark G. Durkin. 2002. Toward an understanding of Internet adoption at the marketing/entrepreneurship interface. Journal of Marketing Management 18(3–4): 361–377.
Melewar, T.C. 2003. Determinants of the corporate identity construct: A review of the literature. Journal of Marketing Communications 9(4): 195–220.
Menon, Anil, and P. Rajan Varadarajan. 1992. A model of marketing knowledge use within firms. The Journal of Marketing 56(4): 53–71.
Miles, Morgan P., and Danny R. Arnold. 1991. The relationship between marketing orientation and entrepreneurial orientation. Entrepreneurship Theory and Practice 15(4): 49–65.
Morgan, Robert E., and Constantine S. Katsikeas. 1998. Exporting problems of industrial manufacturers. Industrial Marketing Management 27(2): 161–176.
Ngai, E.W.T. 2005. Customer relationship management research (1992–2002): An academic literature review and classification. Marketing Intelligence & Planning 23(6): 582–605.
Ramamurti, Ravi. 2004. Developing countries and MNEs: Extending and enriching the research agenda. Journal of International Business Studies 35(4): 277–283.
Raub, Steffen, and Daniel Von Wittich. 2004. Implementing knowledge management: Three strategies for effective CKOs. European Management Journal 22(6): 714–724.
Ritossa, Claudia Monica, and Sergio Bulgacov. 2009. Internationalization and diversification strategies of agricultural cooperatives: A quantitative study of the agricultural cooperatives in the state of Parana. BAR-Brazilian Administration Review 6(3): 187–212.
Saunders, Mark N.K., Philip Lewis, and Adrian Thornhill. 2011. Research Methods for Business Students, 5th ed. Pearson Education India.
Souchon, Anne L., and Adamantios Diamantopoulos. 1999. Export information acquisition modes: Measure development and validation. International Marketing Review 16(2): 143–168.
Teece, D.J., G. Pisano, and A. Shuen. 1997. Dynamic capabilities and strategic management. Strategic Management Journal 18(7): 509–533.
Trivellas, Panagiotis, and Ilias Santouridis. 2009. TQM and innovation performance in manufacturing SMEs: The mediating effect of job satisfaction. In Industrial Engineering and Engineering Management, 2009 (IEEM 2009), IEEE International Conference on, 458–462. USA: IEEE.
Villar, Cristina, Joaquín Alegre, and José Pla-Barber. 2014. Exploring the role of knowledge management practices on exports: A dynamic capabilities view. International Business Review 23(1): 38–44.
Thorhildur Jetzek
6.1 Introduction
"The miracle is this: The more we share the more we have." (Leonard Nimoy, 1931–2015)
Our world is at an inflection point where technological advances and
boundary-crossing social challenges have come together to create a para-
digm shift. Our societies are facing multiple and urgent social challenges,
ranging from economic inequality, unemployment, and poor social condi-
tions to chronic diseases and climate change. Given the complexity and
cross-boundary nature of these challenges, a new approach where social
T. Jetzek (*)
Copenhagen Business School, Copenhagen, Denmark
e-mail: tj.itm@cbs.dk
well for distribution of goods and services, the shifts toward an economy
centered on information and the move to a networked Internet-based
environment have caused significant attenuation of the limitations that
market-based production places on the pursuit of value (Benkler 2006).
We must examine different types of mechanisms that facilitate shared or
sustainable value generation, and then highlight not only the
economic implications of innovation but the social and environmental
implications as well.
Another mechanism has already become a foundation for generating
value from open data, that is, the network mechanism, which we define as
a mechanism that generates value from actionable insights gained through
information sharing and re-use over networks. The network mechanism
refers to the actions of what we can call information creators and infor-
mation consumers, but in fact, it is not simple to distinguish between
who creates and who consumes information. In many current business
models, the information consumers are also generating valuable data for
platforms owners that are crowdsourced to create new or improved
information. However, the main distinction between the market and the
network mechanisms is that in the latter, there is no monetary exchange
and the relationships are many-to-many, instead of the traditional one-to-
one relationship between buyers and sellers. We propose that intermediar-
ies can play a valuable role in leading the market and network mechanisms
together, thus creating a structure around these complex relationships
that allows for synergistic value generation.
are still underdeveloped and underused, and there are heterogeneous for-
mats and a lack of metadata, as well as limited network activity (Mayer-Schönberger
and Zappia 2011; Martin et al. 2013). For most individuals
and smaller developers, these issues come together to create a substantial
barrier to entry, as the efforts involved in acquiring, manipulating, and
analyzing these disparate data are simply too extensive, in comparison to
an uncertain and potentially noneconomic gain.
In most of the world, governments are already struggling with bud-
getary restraints and increased demand for services. Making data open is
never an effortless task, and these constraints limit governments' aspirations
for open data, even if the potential for value generation may be clear
to them. As governments may not be able to do everything on their own,
data intermediaries could play a crucial role in the open data ecosystem by
facilitating data and information access for smaller organizations that may
not have the capacity and capabilities to store, integrate, and analyze large
and heterogeneous datasets. Intermediaries might also contribute directly
to value generation by augmenting and amplifying the circulation of open
data by sanitizing and curating data coming from both public and private
sources. By making data easier to access, manipulate, and use, intermediar-
ies will drive information creation and product, service, or process innova-
tion based on these data.
Having easy, one-stop access to data services offers a value proposition
for companies striving to create a competitive advantage in an increasingly
data-driven world (Lindman etal. 2014). However, a large share of data-
driven services is provided for free, oftentimes in exchange for access to
personal data (OECD 2014). Data intermediaries need to adapt to mar-
ket conditions where users are accustomed to having free access to data,
information, and information services. To enable the ongoing generation
of valuable but free information, data intermediaries must implement
business models that allow them to generate economic profit by capitalizing
on the positive network externalities that arise when multiple stakeholders
interact on the provided platforms to gain access to the services offered
by these intermediaries and their affiliates.
and Daniel Yates, founded the company Opower (then Positive Energy)
in 2007. Opower is an energy tech company with a mission to help
everyone, everywhere save energy. By the end of 2014, Opower worked
with over 95 energy utilities servicing more than 50 million homes.2 In
February 2015, the Opower home energy reports had helped people
around the world save over six terawatt hours of energy and more than
$700 million on their energy bills. Opower successfully went through an
initial public offering (IPO) in April 2014 and was acquired by Oracle in
May 2016.
As utilities deploy smart grid technologies, the volume of data they pro-
duce each day increases more than 3000-fold. Furthermore, as customers
begin to interact more with their utilities online, these interactions create
even more data.3 Opower's MSP can store and process 15-minute interval
data from smart meters and from millions of in-home devices at large scale and
high speed, currently spanning more than 52 million households and businesses,
and growing at a rate of more than 100 billion meter reads per year.
Opower's data analytics engine sits on top of this huge repository of data.
The engine runs hundreds of algorithms that process utility data, third-
party data, and customer behavioral data to power millions of personalized
communications with utility customers on the platform.4 Opower merges
the data streams from utilities with open data from the government to
create personalized energy-use profiles. In the USA, they use data from
the Residential Energy Consumption Survey (RECS) to understand how
households are using energy. The survey provides region-specific data on
end-use energy consumption patterns, such as the type and efficiency of
appliances used by the consumers and the systems and energy sources they
use to heat and cool homes, among other topics. Opower also uses data
from the US Census Bureau on the mix of gas and electric heating sources
in a given county in order to create location-specific profiles to use when
analyzing an individual's home energy consumption.
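The profile-building step described above, merging a household's meter data with RECS-style and Census-style open data, can be sketched as a simple key-based join; all names and numbers here are invented for illustration, not Opower's actual schema:

```python
# Hypothetical region- and county-level open data (all values invented).
recs_by_region = {"south": {"avg_cooling_kwh": 2800}}            # RECS-style survey data
census_by_county = {"county_x": {"electric_heat_share": 0.61}}   # Census-style data

# A single household's meter-derived record.
household = {"region": "south", "county": "county_x", "annual_kwh": 11500}

# Join open data onto the household record by region and county keys.
profile = {**household,
           **recs_by_region[household["region"]],
           **census_by_county[household["county"]]}
```

The resulting profile combines the household's own consumption with the regional context needed for a location-specific comparison.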
Opower's products are designed to enhance the utility's interactions
with its customers in order to both reduce demand and improve relationships.
When designing the way the utilities interact with energy users,
Opower has utilized findings from behavioral science that predict
how people react to information about their own use, as compared
to the use of others (Bos et al. 2012). These results have highlighted
the importance of a feedback mechanism to drive behavioral change, cre-
ating a subtle aspect of peer pressure (Jetzek et al. 2014a). The energy
reports that Opower creates for each energy user offer a component where
INNOVATION IN THE OPEN DATA ECOSYSTEM: EXPLORING THE ROLE... 149
this individual's energy use is compared to the use of other similar households,
complete with a smiley token to indicate approval of good behavior
(Jetzek et al. 2014a). When provided with better information and
suggestions on how to decrease energy consumption, as well as a token
of appreciation for their efforts, customers are empowered to take greater
control of the way they use energy. On the other side of the platform, the
utilities benefit through increased customer engagement and better target-
ing of specific customer segments for efficiency. Opower has also created
an API to allow utility clients to run their own internal analytics programs
using the data in their analytics engine. Government might be labeled
an indirect third party on the platform, as it provides open data and in
return gains greater energy efficiency; however, there is no direct interaction
between government and the other two sides.
6.4.2 Example Two: INRIX
According to the Texas Transportation Institute, the cost of congestion
in the USA in 2012 was more than $120 billion, nearly $820 for every
commuter, who is said to spend over 60 hours per year on average stuck in
traffic.5 Similar problems are endured by most of the world's biggest cities.
However, estimates suggest that since 2009, the global pool of personal
geo-location data is growing yearly by 20% and by 2020 this data pool
could provide $500 billion in value worldwide in the form of time and fuel
savings, or 380 megatons (million tons) of CO2 emissions saved (OECD
2014).
INRIX is a leading provider of traffic services worldwide, with the
vision to solve traffic, empower drivers, inform planning, and enhance
commerce.6 INRIX provides historical, real-time traffic information,
traffic forecasts, travel times, and travel time polygons to businesses and
individuals in 40 countries (as of September 2014).7 INRIX also gathers,
curates, and reports roadway incidents such as accidents, road closures,
and road works. INRIX was founded by former Microsoft employees,
Bryan Mistele and Craig Chapman, in July 2004. INRIX has not yet been
through an IPO, but Porsche recently invested $55 million in the com-
pany, which in July 2014 employed around 350 people. As of September
2014, INRIX collected data about roadway speeds from over 175 mil-
lion real-time anonymous mobile phones, connected cars, and other fleet
vehicles equipped with GPS locator devices. They also get data from cam-
eras and government road sensors. Moreover, INRIX keeps a database
externalities are created when information that is partly created from open
government data draws users to information platforms. This information
production and consumption activity can be utilized to create economic
value, which will attract even more players to the ecosystem. More use of
data will also eventually benefit the data providers, even without them get-
ting monetary reimbursements, as market participants collectively address
various social challenges, which can rarely be solved by governments
alone. However, before this scenario can happen, the government data
must be open enough and of high quality to be of use for entrepreneurs
and the ecosystem must contribute other factors, like various important
skills, technologies, and low-rate funding. It is important that both data
providers on the supply side and prospective data users on the demand side
have a relatively clear idea about potential gains from future use and what
is needed in order for those benefits to be harvested.
Data providers must appropriately recognize two important elements
of this ecosystem. (1) A significant share of the value generated is intan-
gible, resulting from improved decision-making and changed behavior,
which can impact society and the environment as a whole. (2) Those that
finally appropriate the value might be far removed from those that pro-
vide the resources. Therefore, a public sector organization might invest
in gathering data, ensuring the data quality and making the data open
across different dimensions, but future use of these data will in a minority
of cases directly benefit the organization itself. The organization depends
on the yearly budget allocations from government to sustain their activi-
ties. If top-level decision-makers do not sufficiently understand the com-
plicated mechanisms that explain how much value is generated from the
data, they might reduce this funding if the data are not being used, and
the level of openness and quality of data might become compromised as
a result. This is an "open data value paradox", describing a situation where
entrepreneurs do not use the data because the data are not usable enough
and there is too much uncertainty over the future provision of open data.
However, data providers are not willing to invest in the people and tech-
nology necessary to make the data more usable and sustainable unless
they observe some evidence of value generation, which again depends on
data being used.
In order to resolve this paradox, which might lead to a downward spiraling
cycle, as discussed in the introduction, we propose to use the valuation
methods and ideas from the economics of real options. Uncertainty and the
is present but has not been realized to become real options when exer-
cised (Bowman and Hurry 1993). After recognizing an option as such,
the holder of an option typically makes a small initial investment, holds it
open until an opportunity arrives, and then exercises a choice to strike the
option and capture the value inherent in that opportunity (Bowman and
Hurry 1993, Gosh and Li 2013). The identification of real options is, to a
significant extent, subject to contingencies such as the firms technological
capabilities, experience, and absorptive capacity, making the identification
of real options virtually unique to every firm (Saarikko 2014). The value of
holding an option becomes magnified especially when the options holder
has preferential advantages in exploiting the opportunity provided by the
option (Sambamurthy etal. 2003).
Within the Information Systems research field, real options have been
used to offer a novel perspective called digital options. Digital options
can be described as a set of IT-enabled capabilities in the form of digi-
tized enterprise work processes and knowledge systems which create value
through increased reach and richness of digitized processes and digitized
knowledge (Sambamurthy etal. 2003, Overby etal. 2006). Digital options
are at once a means of not only preserving the opportunity to capitalize
on a new technology or practice but also of mitigating the risks induced
by technological and market uncertainty (Woodard et al. 2013). While
the concept of digital options has been applied in studies on enterprise
resource planning (ERP) systems investments, it has also received criticism
for its apparent lack of detail in certain key aspects (Saarikko 2014). It
has been argued that restricting digital options to process and knowledge
reach and richness limits the concept's generative potential as well as its
relevance to IT capabilities (Sandberg et al. 2014).
Fichman (2004) compares and contrasts IT platform valuation through
the lens of discounted cash flow (DCF) analysis, on the one hand, and
through the lens of real options valuation, on the other, showing how
real options thinking will capture mechanisms that are important to the
firm's competitive advantage, although the value might be intangible and
neglected through methods such as DCF (Fichman 2004, 139). Two of the
discussed real option value determinants in Fichman's model have a special
relevance to open data: susceptibility to network externalities (the extent to
which a technology increases in value to individual adopters with the size
of the adoption network) and interpretive flexibility (the extent to which a
technology permits multiple interpretations on the part of adopters about
how it should be implemented and used).
raise the upside potential, while the maximum loss for the government
is the investment made in making data fit for re-use (given that these
data have already been collected) and the eventual loss of income from
data. Of course, the decision to invest in open data in the first place also
depends on the perceived option value of data. We argue that if govern-
ments recognize the option value of open data for potential users, they will
be more willing to continue to provide high-quality open data to users,
even if doing so does demand some further investments in people and
technology.
As the value of data is dependent on network externalities, we propose
that the open data real option value increases with more use of the data,
but with diminishing marginal returns due to the market being at some
point saturated. This effect is not directly reflected in the model below,
but for those that would like to calculate the potential impact, we suggest
using growth formulas, such as that of von Bertalanffy (von Bertalanffy
1938). In that case, the growth factor (K) would be dependent on the
enabling factors we present below and, as suggested above, governments
could influence the option value of data by focusing on these factors, as
they have been found to increase use of the data (Jetzek et al. 2013).
Different growth factors will lead to different outcomes, which can be
assigned probabilities for a more accurate estimation of the distribution
of possible outcomes. Of course, this valuation is based on an estimation
of the base value (or use) of the data (as the underlying asset). The
option holders (i.e., all those that can access and use the data) are influ-
enced by these same factors, but also by their relative abilities as compared
to others. Hence, the value of the option is unique to them, reflecting
their own capabilities. We do not model the organizational level factors
here, in order to preserve clarity of representation. We propose only that
the eventual users will be influenced by the perceived value of the option
they hold, and leave more detailed organizational level modeling to future
research.
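To make the suggestion above concrete, the following sketch combines a von Bertalanffy growth curve for data use with probability-weighted growth-factor scenarios. Everything here is a hypothetical illustration: the function names, parameters, and all numeric inputs are invented, not the chapter's own model.

```python
import math

# Hypothetical sketch: model growth in data use with a von Bertalanffy
# curve (diminishing marginal returns toward market saturation) and weight
# different growth factors K by their assumed probability. All parameter
# names and numbers below are invented for illustration.

def von_bertalanffy(t: float, saturation: float, k: float, t0: float = 0.0) -> float:
    """Data use at time t, approaching the saturation level as t grows."""
    return saturation * (1.0 - math.exp(-k * (t - t0)))

def expected_option_value(scenarios, value_per_use: float,
                          fixed_cost: float, horizon: float,
                          saturation: float) -> float:
    """Probability-weighted option value over growth-factor scenarios.
    In each scenario the option pays off only if use-driven value exceeds
    the cost of making the data fit for re-use."""
    total = 0.0
    for k, prob in scenarios:
        use = von_bertalanffy(horizon, saturation, k)
        total += prob * max(use * value_per_use - fixed_cost, 0.0)
    return total

# Enabling factors would shift probability mass toward higher growth factors K.
scenarios = [(0.05, 0.3), (0.2, 0.5), (0.5, 0.2)]  # (K, probability)
value = expected_option_value(scenarios, value_per_use=2.0,
                              fixed_cost=40.0, horizon=10.0,
                              saturation=100.0)  # roughly 110 with these inputs
```

In this sketch, a government intervention that raises K in the likelier scenarios raises the expected option value, which is the mechanism the enabling factors are proposed to work through.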
To identify potential determinants of sustainable value of open data,
we have looked to previous research, in which the most important enablers
and barriers of open data have been analyzed, and have relied on inter-
views and participation in an open data initiative in Denmark. The under-
lying assumption we make is that people are generally willing (intrinsic
motivation) to use data for sustainable value generation if they are given
the opportunity and they have the ability to do so. Additionally, certain
structures in the economy can influence extrinsic motivation, both nega-
individual privacy and guides those that want to use data to generate
value. Less risk of data fraud will motivate data users to actively participate
in responsible data use and re-use, thereby positively influencing use
of data. Accordingly, uncertainty over rights and responsibilities and data
ownership is likely to negatively influence the motivation to use data,
thereby negatively impacting the perceived option value.
P6: Collaboration Positively Influences Perceived Option Value of Open Data
(Motivation) We propose that data are used more if government actively
engages and collaborates with external stakeholders in order to motivate
private and public stakeholders to use data for various uses and subsequent
value generation. This collaboration can happen via public-private part-
nerships, hackathons, living labs, or other types of formal and informal
interactions between different stakeholders in the open data ecosystem.
P7: The Risk-Free Rate Will Negatively Influence Perceived Option Value
of Open Data (Motivation) The higher the risk-free rate, the more likely
it is that money will be used for riskless investment rather than high-risk
investment. Therefore, a high risk-free rate negatively influences the prob-
ability of investment and use, and thereby the perceived option value of
data.
P8: Perceived Option Value of Open Data Positively Influences Investment
in MSPs The higher the perceived option value of data, the more likely it
is that intermediaries will invest in MSPs.
P9: Investment in MSPs Supports the Generation of Information, Products
and Services Based on Data and Therefore Positively Influences Sustainable
Value Various stakeholders can provide information, products, and ser-
vices based on the data through these platforms and use the network and
market mechanisms to generate valuable synergies. The more the data are
used and the more synergy is created between the network mechanisms
and market mechanisms that facilitate dissemination of information, on
the one hand, and data-driven products and services on the other, the
more sustainable value will be generated and appropriated. The model
itself is presented in Fig. 6.1.
The various enabling factors are like the roots of the open data eco-
system plant and their main role is to provide nourishment so that the
seed-like data can grow into something of value. Each of these factors will
influence the opportunity, ability, or motivation of stakeholders in the eco-
Fig. 6.1 Model of sustainable value generation in the open data ecosystem
160 T. JETZEK
6.7 Discussion
Our societies are changing fast, faster than many of us realize in the
midst of things. The interaction between technological and social ele-
ments is a big driver in these changes, influencing not only our ability
to generate value but also the way people perceive and think about
value (which might at the individual level be more accurately described
as values). We have not discussed these individual trends in depth here,
but suffice it to say that technology and network capabilities have come
together to create vast amounts of data that are currently being trans-
formed into information and used as a resource in new products and
services by a multiplicity of stakeholders. This new data-driven ecosys-
tem is highly dependent on unstructured many-to-many relationships
where data and information are flowing through networks without any
monetary transactions taking place, as opposed to the structured value
chains of the industrial economy. Network capabilities have allowed for
much more complex interactions between stakeholders, and old inter-
mediaries have been cut out while new ones have been created. The
new intermediaries are effectively playing the network mechanisms and
market mechanisms against each other, using network externalities as
a tool to generate the income that is necessary to sustain invest-
ments in people and technology, while simultaneously contributing to
sustainable value.
In the context of this chapter, we have used the term open data mostly
for data generated and disseminated by governments, as they are currently
the biggest distributors of open data in the world. We have proposed
that openness is in itself an important enabler to the creation of sustain-
able value from data. Openness enables both generation and appropria-
tion of value, not only by the organization that owns the data but also
by external stakeholders. However, while openness of data might be a
necessary condition for external stakeholders that want to effectively uti-
lize the vast amounts of government data, it is insufficient on its own.
Just as governments aim to provide the necessary infrastructure for effi-
cient markets, they should be aware of the factors that are needed for a
thriving data ecosystem. Such an ecosystem relies to a large degree on
the generation of relevant information, which is further disseminated
through network-based mechanisms to generate value throughout society.
The network mechanisms do facilitate the appropriation of value by
society's stakeholders but operate under different rules than the traditional
market mechanisms.
We have made a few propositions about how sustainable value can be
created and how MSPs are enabling such value generation, as they are not
completely bound by rent seeking; rather, they gain from stakeholders that
together are addressing complicated societal challenges, previously the
responsibility of governments alone. Governments have started to realize
the power of these models, which thrive on sharing and interactions, and
are even creating their own platforms where public sector, businesses, and
citizens can meet and interact to create superior sustainable value (Janssen
and Estevez 2013). However, as in any complicated market, there are vari-
ous challenges present. One barrier that has been identified in prior work
on MSPs is the chicken-and-egg problem, describing the need to build up
a sufficient number of participants on one side of the platform in order to
attract the other side, which, in the case of open data, is usually the paying
side. In the case of government provision of open data, this translates to
government attracting enough users to justify the investments required
for making data open. When the users come, value will be generated, but
the users will not participate unless they have a current perception of the
future value to be gained.
The economics of real options help us conceptualize the worth of per-
ceived future value by building on the same ideas that underpin the finan-
cial options markets: The limited risk and the unlimited upside as well
as the ability to wait and see before striking the option. Using this type
of thinking might help resolve the open data value paradox. If supplying
open data is conceptualized as the act of writing an option that is handed
out to all market participants, we gain a tool that can help us evaluate the
potential gain, viewing unpredictability and variability as a positive factor
rather than as a negative one and focusing on the flexibility provided as the
data are out there when the company in question needs them. Trusting
that the companies will value the option they are given, governments can
focus on making data more open and create a nurturing environment for
interested stakeholders, which might in turn raise the option value even
further. The potential users will eventually pay back, not only by creating
jobs and paying taxes but also by finding innovative solutions to some of
our most pressing societal problems.
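The point that option thinking treats unpredictability and variability as a positive factor can be illustrated with a small Monte Carlo sketch: two prospects with the same expected value but different uncertainty, where only the favorable outcomes are ever exercised. The setup, function names, and all parameters here are hypothetical.

```python
import random

# Hedged illustration of why, under option thinking, more uncertainty can
# raise rather than lower the value of open data to a prospective user:
# only favorable outcomes are exercised, so extra variance widens the
# upside while the downside stays capped at zero. Parameters are invented.

def simulated_option_value(mean: float, spread: float, exercise_cost: float,
                           trials: int = 100_000, seed: int = 42) -> float:
    """Monte Carlo estimate of E[max(V - cost, 0)], where V is the
    uncertain value a user could realize from the data."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        v = rng.gauss(mean, spread)
        total += max(v - exercise_cost, 0.0)
    return total / trials

low_uncertainty = simulated_option_value(mean=50.0, spread=10.0, exercise_cost=60.0)
high_uncertainty = simulated_option_value(mean=50.0, spread=40.0, exercise_cost=60.0)
# With the same expected value, the more uncertain prospect is worth more
# as an option, because only the upside is ever exercised.
```

This mirrors the standard result from financial option pricing that option value increases with the volatility of the underlying asset, here transplanted, as a rough analogy, to the uncertain value of re-using open data.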
Notes
1. http://www.epa.gov/ghgreporting/ghgdata/reported/index.html
2. http://opower.com/
3. http://opower.com/platform/data-science
4. http://opower.com/platform/data-science
5. http://d2dtl5nnlpfr0r.cloudfront.net/tti.tamu.edu/documents/
tti-umr.pdf
6. http://www.prnewswire.com/news-releases/inrix-partners-with-
san-francisco-on-expanding-traffic-information-services-for-bay-
area-drivers-229643681.html
7. http://en.wikipedia.org/wiki/INRIX
8. http://www.imf.org/external/np/speeches/2014/060514.htm
9. http://www.zillow.com/blog/zillow-mobile-2013-year-in-review-141305/
10. http://www.zillow.com/corp/About.htm
11. http://priceonomics.com/the-seo-dominance-of-zillow/
References
Adner, Ron, and Daniel A. Levinthal. 2004. What is not a real option: Considering boundaries for the application of real options to business strategy. Academy of Management Review 29(1): 74–85.
Bailey, Joseph P., and Yannis Bakos. 1997. An exploratory study of the emerging role of electronic intermediaries. International Journal of Electronic Commerce 1(3): 7–20.
Bakici, Tuba, Esteve Almirall, and Jonathan Wareham. 2013. The role of public open innovation intermediaries in local government and the public sector. Technology Analysis & Strategic Management 25(3): 311–327.
Barney, Jay. 1991. Firm resources and sustained competitive advantage. Journal of Management 17(1): 99–120.
Benkler, Yochai. 2006. The Wealth of Networks: How Social Production Transforms
Markets and Freedom. New Haven: Yale University Press.
Bharosa, Nitesh, Marijn Janssen, Bram Klievink, and Yao-hua Tan. 2013. Developing multi-sided platforms for public-private information sharing: Design observations from two case studies. In Proceedings of the 14th Annual International Conference on Digital Government Research, 146–155.
Bharadwaj, Anandhi, Omar A. El Sawy, Paul A. Pavlou, and N. Venkatraman. 2013. Digital business strategy: Toward a next generation of insights. MIS Quarterly 37(2): 471–482.
Black, Fischer, and Myron Scholes. 1973. The pricing of options and corporate liabilities. The Journal of Political Economy 81(3): 637–654.
Bos, Maarten W., Amy J.C. Cuddy, and Kyle T. Doherty. 2012. OPOWER: Increasing energy efficiency through normative influence (B). Harvard Business School NOM Unit Case, 911-061.
Bowman, Edward H., and Dileep Hurry. 1993. Strategy through the option lens: An integrated view of resource investments and the incremental-choice process. Academy of Management Review 18(4): 760–782.
Brynjolfsson, Erik, and JooHee Oh. 2012. The attention economy: Measuring the
value of free digital services on the Internet. In the Proceedings of the 33rd
International Conference on Information Systems (ICIS), Orlando.
Buytendijk, Frank. 2014. Hype cycle for big data. https://www.gartner.com/
doc/2814517/hype-cycle-big-data-
Caillaud, Bernard, and Bruno Jullien. 2003. Chicken and egg: Competition among intermediation service providers. RAND Journal of Economics 34(2): 309–328.
Cannon, Sarah, and Lawrence H. Summers. 2014. How Uber and the sharing economy can win over regulators. Harvard Business Review. https://hbr.org/2014/10/how-uber-and-the-sharing-economy-can-win-over-regulators/
Conradie, Peter, and Sunil Choenni. 2014. On the barriers for local government releasing open data. Government Information Quarterly 31: S10–S17.
Capgemini Consulting. 2013. The open data economy: Unlocking economic value by opening government and public data. https://www.capgemini-consulting.com/resource-file-access/resource/pdf/opendata_pov_6feb.pdf
Davies, Tim. 2013. Open data barometer: 2013 global report. World Wide Web Foundation and Open Data Institute. http://www.opendataresearch.org/dl/odb2013/Open-Data-Barometer-2013-Global-Report.pdf
Fichman, Robert G. 2004. Real options and IT platform adoption: Implications for theory and practice. Information Systems Research 15(2): 132–154.
Ghosh, Suvankar, and Xiaolin Li. 2013. A real options model for generalized meta-staged projects: Valuing the migration to SOA. Information Systems Research 24(4): 1011–1027.
Hagiu, Andrei, and Julian Wright. 2011. Multi-sided Platforms. Boston, MA:
Harvard Business School.
Hagiu, Andrei. 2014. Strategic decisions for multisided platforms. MIT Sloan Management Review 55(2): 71.
Janssen, Marijn, and Elsa Estevez. 2013. Lean government and platform-based governance: Doing more with less. Government Information Quarterly 30: S1–S8.
Janssen, Marijn, and Anneke Zuiderwijk. 2014. Infomediary business models for connecting open data providers and users. Social Science Computer Review 32(5): 694–711.
Janssen, Marijn, Yannis Charalabidis, and Anneke Zuiderwijk. 2012. Benefits, adoption barriers and myths of open data and open government. Information Systems Management 29(4): 258–268.
Jetzek, Thorhildur, Michel Avital, and Niels Bjørn-Andersen. 2013. Generating value from open government data. In The 34th International Conference on Information Systems, ICIS 2013.
Jetzek, Thorhildur, Michel Avital, and Niels Bjørn-Andersen. 2014a. Data-driven innovation through open government data. Journal of Theoretical and Applied Electronic Commerce Research 9(2): 100–120.
Jetzek, Thorhildur, Michel Avital, and Niels Bjørn-Andersen. 2014b. Generating sustainable value from open data in a sharing society. In Creating Value for All Through IT, 62–82. Berlin: Springer.
Katz, Michael L., and Carl Shapiro. 1985. Network externalities, competition, and compatibility. The American Economic Review 75(3): 424–440.
Katz, Michael L., and Carl Shapiro. 1986. Technology adoption in the presence of network externalities. The Journal of Political Economy 94(4): 822–841.
Lee, Young-Chan, and Seung-Seok Lee. 2011. The valuation of RFID investment using fuzzy real option. Expert Systems with Applications 38(10): 12195–12201.
Lindman, Juho, Tomi Kinnari, and Matti Rossi. 2014. Industrial open data: Case studies of early open data entrepreneurs. In 2014 47th Hawaii International Conference on System Sciences (HICSS), 739–748. USA: IEEE.
Martin, Sébastien, Muriel Foulonneau, Slim Turki, and Madjid Ihadjadene. 2013. Risk analysis to overcome barriers to open data. Electronic Journal of e-Government 11(1): 348–359.
Mayer-Schönberger, Viktor, and Zarino Zappia. 2011. Participation and power: Intermediaries of open data. In 1st Berlin Symposium on Internet and Society, October.
McKinsey & Company. 2013. Open data: Unlocking innovation & performance with liquid information. McKinsey Global Institute, McKinsey Center for Government & McKinsey Business Technology Office.
Nilsen, Kirsti. 2010. Economic theory as it applies to public sector information. Annual Review of Information Science and Technology 44(1): 419–489.
OECD. 2011. Fostering innovation to address social challenges. http://www.oecd.org/sti/inno/47861327.pdf
OECD. 2014. Data-driven innovation for growth and well-being: Interim synthesis report. http://www.oecd.org/sti/inno/data-driven-innovation-interim-synthesis.pdf
Overby, Eric, Anandhi Bharadwaj, and V. Sambamurthy. 2006. Enterprise agility and the enabling role of information technology. European Journal of Information Systems 15(2): 120–131.
Parker, Geoffrey, and Marshall Van Alstyne. 2010. Innovation, openness & platform control. In Proceedings of the 11th ACM Conference on Electronic Commerce, 95–96. New York, NY: ACM.
Porter, Michael E. 2008. Competitive Advantage: Creating and Sustaining Superior Performance. New York, NY: Simon and Schuster.
Porter, Michael E., and Mark R. Kramer. 2011. Creating shared value. Harvard Business Review 89(1/2): 62–77.
Resnick, Paul, Richard Zeckhauser, and Chris Avery. 1995. Roles for electronic brokers. In Toward a Competitive Telecommunication Industry: Selected Papers from the 1994 Telecommunications Policy Research Conference, 289–304. Mahwah, NJ: Lawrence Erlbaum Associates.
Rochet, Jean-Charles, and Jean Tirole. 2006. Two-sided markets: A progress report. The RAND Journal of Economics 37(3): 645–667.
Saarikko, Ted. 2014. Here today, here tomorrow: Considering options theory in digital platform development. In Creating Value for All Through IT, 243–260. Berlin: Springer.
Sambamurthy, Vallabh, Anandhi Bharadwaj, and Varun Grover. 2003. Shaping agility through digital options: Reconceptualizing the role of information technology in contemporary firms. MIS Quarterly 27(2): 237–263.
Sandberg, Johan, Lars Mathiassen, and Nannette Napier. 2014. Digital options theory for IT capability investment. Journal of the Association for Information Systems 15(7): 422–453.
van Osch, W., and M. Avital. 2010. The road to sustainable value: The path-dependent construction of sustainable innovation as sociomaterial practices in the car industry. Advances in Appreciative Inquiry 3(1): 99–116.
van Veenstra, Anne Fleur, and Tijs A. van den Broek. 2013. Opening moves: Drivers, enablers and barriers of open data in a semi-public organization. In Electronic Government, 50–61. Berlin: Springer.
von Bertalanffy, Ludwig. 1938. A quantitative theory of organic growth (Inquiries on growth laws. II). Human Biology 10(2): 181–213.
Wade, Michael, and John Hulland. 2004. Review: The resource-based view and information systems research: Review, extension, and suggestions for future research. MIS Quarterly 28(1): 107–142.
Wernerfelt, Birger. 1984. A resource-based view of the firm. Strategic Management Journal 5(2): 171–180.
West, Joel, and Scott Gallagher. 2006. Challenges of open innovation: The paradox of firm investment in open-source software. R&D Management 36(3): 319–331.
Woodard, C.J., N. Ramasubbu, F.T. Tschang, and V. Sambamurthy. 2013. Design capital and design moves: The logic of digital business strategy. MIS Quarterly 37(2): 537–564.
Zuiderwijk, Anneke, and Marijn Janssen. 2014a. Barriers and development directions for the publication and usage of open data: A socio-technical view. In Open Government, 115–135. New York: Springer.
Zuiderwijk, Anneke, and Marijn Janssen. 2014b. Open data policies, their implementation and impact: A framework for comparison. Government Information Quarterly 31(1): 17–29.
Zuiderwijk, Anneke, Marijn Janssen, Sunil Choenni, Ronald Meijer, and R. Sheikh Alibaks. 2012. Socio-technical impediments of open data. Electronic Journal of eGovernment 10(2): 156–172.
Zuiderwijk, Anneke, Marijn Janssen, Sunil Choenni, and Ronald Meijer. 2014. Design principles for improving the process of publishing open data. Transforming Government: People, Process and Policy 8(2): 185–204.
CHAPTER 7
Florian Lüdeke-Freund, Birte Freudenreich,
Iolanda Saviuc, Stefan Schaltegger, and Marten Stock
7.1 Introduction
Corporate sustainability has long left its academic niche and has become an
integral part of today's business world. While companies around the globe
are trying to position themselves as economically competitive and at the
F. Lüdeke-Freund (*)
University of Hamburg, Hamburg, Germany; Research Fellow at Centre
for Sustainability Management (CSM), Leuphana University; and Governing
Responsible Business Fellow, Copenhagen Business School
e-mail: Florian.Luedeke-Freund@wiso.uni-hamburg.de
B. Freudenreich (*) • S. Schaltegger
Leuphana University, Lüneburg, Germany
e-mail: freudenreich@leuphana.de; schaltegger@uni.leuphana.de
I. Saviuc
University of Antwerp, Antwerp, Belgium
e-mail: iolanda.saviuc@uantwerpen.be
M. Stock
ifu Hamburg GmbH, Hamburg, Germany
e-mail: m.stock@ifu.com
same time as ecologically and socially sound, for example, through more
efficient production processes or increasing product responsibility, some
pioneers are looking at new ways to meet this challenge on a more systemic
level: the development of sustainable business models (e.g. Beltramello et al.
2013; Bisgaard et al. 2012; Wells 2013a; Schaltegger et al. 2016). Empirical
studies show that this approach is already one of the most important top-
ics of sustainability and innovation management in practice (e.g. Kiron
et al. 2013). The underlying assumption and expectation is that con-
sciously managed business models can lead to more effective ways of solv-
ing ecological and social problems, while maintaining or even enhancing an
organisation's competitiveness (Schaltegger et al. 2012, 2016). Corporate
sustainability has arrived in the world of business models, and vice versa.
Corporate sustainability takes into account the risk of negative business
impacts on the natural environment and society as well as the challenge of
surviving as an organisation in partly radically changing ecological, social, and
economic contexts (Schaltegger and Burritt 2005; see also McElroy and van
Engelen 2012). But corporate sustainability is also about creating positive
effects in support of a prospering natural environment and human society;
a perspective that is emphasised in the emerging research field of sustainable
entrepreneurship (Schaltegger and Wagner 2011) and sometimes referred
to as "flourishing" (Ehrenfeld and Hoffman 2013). A business model can be
understood as the rationale, or logic, of organisational value creation. The
conventional business model perspective defines value mainly in financial
terms, whereas softer concepts are also discussed, for example, referring to
customer value, jobs-to-be-done, or knowledge gains (cf. e.g. Beattie and
Smith 2013; Chesbrough 2010; Johnson 2010). From a corporate sustain-
ability perspective, business models should be developed and transformed in
ways that secure the long-term viability of an organisation, that is, maintain
and improve its competitiveness through managerial and innovative capabili-
ties, while satisfying customers' and other stakeholders' needs within the limits
of the ecological and social systems in which every human activity is embed-
ded (cf. Boons and Lüdeke-Freund 2013; Schaltegger et al. 2012, 2016).
Whether and how sustainable business models really act as business models
for sustainability, that is, whether they effectively contribute to sustainable
development, is not just a matter of business model design but also of mea-
surability and manageability. Being able to measure and manage business
model effects is an essential prerequisite for targeted activities to improve the
sustainability performance and long-term prospects of organisations, espe-
cially in rapidly and radically changing business environments. However,
appropriate management approaches for the assessment and management
SUSTAINABILITY-ORIENTED BUSINESS MODEL ASSESSMENT... 171
The activity must create a positive business effect, that is, positively
contribute to the economic success of the company, which can be
measured or at least argued for in a convincing way. Such effects can
include cost savings, increased sales, improved competitiveness, prof-
itability, customer retention, or reputation, for example.
A clear and logically convincing argumentation must exist that a
deliberate management or entrepreneurial activity has led to both the
intended ecological or social effect and the economic business effect.
[Figure: economic performance (vertical axis) plotted against social and/or ecological performance (horizontal axis), with points B, C, D, and F and performance levels ESP*, ESP1, and ESP0]
2013; Cohen and Kietzmann 2014; Johnson and Suskewicz 2009; Wells
2013b), approaches to marketing renewable energies (e.g. Loock 2012;
Lüdeke-Freund 2014; Richter 2012, 2013; Wüstenhagen and Boehnke
2008), and different forms of social enterprises (e.g. Seelos and Mair
2005, 2007; Yunus et al. 2010; Zeyen et al. 2014).
In these contexts, the effect of new business models is often described
as the breakup of dominant and purely financially oriented paradigms of
value creation (Lüdeke-Freund 2009, 2010). This can be achieved, for
example, through establishing closed-loop and zero-waste production
models (e.g. McDonough and Braungart 2013) that replace linear "fire
and forget" models and allow for the creation of ecological value (Wells
2008). Another role is the introduction of new ways of value distribu-
tion. For example, some social businesses distinguish between those who
pay for a value propositionlike access to nutrition, health care, or edu-
cationand those who benefit from it, thus creating additional social
value (e.g. Grassl 2012; Yunus et al. 2010). With regard to changing the
ways of producing and consuming services and goods that are culturally
and economically embedded and institutionalised, Wells argues that only
radical and sustainability driven innovations are capable of challenging
the persistent and continuously self-reproducing status quo (Wells 2008,
2013a, b; Charter et al. 2008; Hansen et al. 2009). However, such inno-
vations often start in niches and struggle either to create new markets
or to penetrate the existing mass marketstake e-mobility as a prime
example (cf. Bidmon and Knab 2014; Hockerts and Wüstenhagen 2010;
Schaltegger and Wagner 2011; Tukker et al. 2008). Creating business
models that not only bridge this gap between niche and mass markets
for radical and sustainability driven innovations and hereby deliver eco-
logical and social benefits, but are also economically viable, is the major
challenge for sustainable entrepreneurs dealing with business models and
their innovation (cf. Bocken et al. 2014; Carayannis et al. 2014; Charter
et al. 2008; Laukkanen and Patala 2014; Lüdeke-Freund 2013; Upward
2013).
The ability to deliberately provide market access for radical and sus-
tainability driven innovations, either by connecting to existing markets or
creating completely new markets, is the crucial feature that distinguishes
sustainable business models from what might be referred to as conven-
tional or mainstream business models (Boons and Lüdeke-Freund 2013).
The relationship between business models and sustainability innovations
has been described from two major perspectives: One sees the business
176 F. LÜDEKE-FREUND ET AL.
Model Canvas (Osterwalder and Pigneur 2009), clearly show that the per-
formance of a business model and changes to it are to be expressed in
terms of financial costs and revenues. Magretta (2002) described this con-
ceptual kinship very clearly in her seminal article "Why Business Models
Matter": they matter because they tie narratives, that is, business ideas,
to numbers, that is, expected financial results (see also Doganova and
Eyquem-Renault 2009):
While this feature allows for flexible and rich descriptions of empirical
phenomena and supports systemic thinking, its downside is that a thorough
assessment would lead to unmanageable data requirements. Therefore,
the SUST-BMA framework builds on an approach that circumvents this
impasse: since the business model, as it was defined by Osterwalder (2004),
is partly based on Kaplan and Norton's (1992, 1996) BSC, which can be
used for assessment purposes, a (re-)alignment of these two concepts would
support business model assessments. This approach provides structural guid-
ance and eliminates the need to develop an assessment framework from
scratch. Moreover, the conventional BSC has been further developed as
an SBSC that supports the management of sustainability information (e.g.
Figge et al. 2002; Schaltegger and Dyllick 2002), which is key to a sustain-
ability oriented assessment. Therefore, our conceptual approach is based on
the alignment of the business model concept with the SBSC. It is assumed
that this approach provides a manageable SUST-BMA framework.
The following sections introduce the business model concept within
our approach (Sect. 7.3.2), provide an overview of the SBSC (Sect. 7.3.3),
and explain how both merge to form the basis of the SUST-BMA frame-
work (Sect. 7.3.4).
[Figure: management levels (architectural level; business and money-making logic; business model management) and business model processes (business model design, e.g. translating strategy into a structural template of business logic to provide sustainability driven value propositions)]
Fig. 7.2 The location of the business model within management levels and processes (Lüdeke-Freund 2009: 18)
[Figure: business model logics, with the marketing logic, financial logic, and contextual logic among the labeled elements]
2003; Upward 2013). The five secondary logics are basically defined as
follows:
The marketing logic describes the interaction between a firm and its tar-
get customers. It comprises the value proposition and customer interface,
and describes what is offered, how it is offered and delivered, and how
the firm interacts with its customers. It is enabled by a firm's production
and resources and capabilities logic and represents the primary source of
market-based revenues. The marketing logic relates to the value delivery
function of a business model.
The financial logic describes the costs incurred within the production
logic, as well as within the resources and capabilities logic. It also includes
the revenues generated within the marketing logic. It focuses on cost driv-
ers and how a firm generates revenues to cover costs and remain financially
viable. The financial logic relates to the value capture function of a busi-
ness model.
The capabilities and resources logic describes the foundations of the pro-
duction and marketing logic. It comprises the requirements in terms of
infrastructure, people, knowledge, and capabilities in order to enable pro-
duction and marketing activities. The development, acquisition, and main-
tenance of resources and capabilities incur costs that are captured by the
financial logic. The capabilities and resources logic, together with the pro-
duction logic, relates to the value creation function of a business model.
The production logic describes the activities that need to be performed
to create a business model's value proposition. In addition to activities
within the firm's boundaries, it includes activities carried out by partners.
186 F. LÜDEKE-FREUND ET AL.
Costs associated with the production logic are taken into account by the
financial logic. The production logic, together with the capabilities and
resources logic, relates to the value creation function of a business model.
The contextual logic comprises aspects that are crucial for the function-
ing of the business model but are situated outside the other four business
model logics and maybe even outside the market. This includes, for exam-
ple, legal requirements, technological changes, and societal aspects like a
firm's public reputation. The contextual logic expresses a business model's
value framing with regard to its socio-cultural, political, legal, economic,
and technological spheres.
These logics are interdependent (Fig. 7.3). That is, a complete description
of a business model's fundamental value creation logic requires considering
all five secondary logics. An exception is the contextual logic,
which frames the other logics. It describes the context within which the
others are embedded and which cannot, or only to a limited extent, be
influenced by the firm itself.
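Read as a data structure, the requirement that all five secondary logics be considered before a business model description counts as complete can be sketched as follows; the class and field names are our illustration, not part of the SUST-BMA framework:

```python
from dataclasses import dataclass, field

@dataclass
class BusinessModelDescription:
    """Illustrative container for the five secondary logics (names are ours)."""
    marketing: list = field(default_factory=list)               # value delivery
    financial: list = field(default_factory=list)               # value capture
    capabilities_resources: list = field(default_factory=list)  # value creation
    production: list = field(default_factory=list)              # value creation
    contextual: list = field(default_factory=list)              # value framing

    def is_complete(self) -> bool:
        # A complete description considers all five secondary logics.
        return all([self.marketing, self.financial,
                    self.capabilities_resources, self.production,
                    self.contextual])

bm = BusinessModelDescription(
    marketing=["value proposition", "customer interface"],
    financial=["cost drivers", "revenue streams"],
    capabilities_resources=["infrastructure", "people", "knowledge"],
    production=["in-house activities", "partner activities"],
    contextual=["legal requirements", "public reputation"],
)
print(bm.is_complete())  # True
```

A description missing any one logic, including the contextual logic that frames the others, would be flagged as incomplete.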
mental and social performance (cf. Hansen and Schaltegger 2016; Figge
etal. 2002; Schaltegger 2011; Schaltegger and Dyllick 2002; Schaltegger
and Wagner 2006a, 2006b).
Fig. 7.4 shows these four basic perspectives. Their hierarchical relation-
ships become clear when distinct indicators and causal chains are devel-
oped as part of a strategy map (Kaplan and Norton 2000).
[Figure: four perspectives, each listing objectives, measures, targets, and initiatives, arranged around vision and strategy, with the financial perspective at the top]
Fig. 7.4 Basic perspectives of the balanced scorecard concept (Kaplan and Norton 1996: 9)
[Figure: the non-market, financial, customer, and internal process perspectives, each listing objectives, measures, targets, and initiatives, arranged around vision and strategy]
Fig. 7.5 Basic layout of an SBSC with fifth, non-market perspective (Figge et al. 2002)
[Figure: the business model, as a design and management concept, frames implementation through the sustainability balanced scorecard; implementation delivers created value as an outcome and frames performance measurement]
7.4.3 Assessing Performance
Once material aspects have been identified and mapped to the business
model, appropriate indicators need to be chosen. While the organisation's
goals will be defined based on its strategy, measures for the configuration of
the SBSC can be derived from the GRI framework (e.g. GRI's G4-EN1 indicator
for material usage or G4-PR5 for customer satisfaction; GRI 2013a,
2013b). However, the interplay between the business model concept, SBSC,
and GRI indicators needs to be carefully adapted for the kind of performance
assessment proposed by the SUST-BMA framework: GRI's default scenario
is a whole organisation, an entrepreneurial firm or larger corporation,
and not a business model, which are very different units of analysis.
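As an illustration of how the two GRI indicators named above might be attached to business model logics, consider the following sketch; the indicator-to-logic assignment is our assumption for illustration, not a GRI or SUST-BMA prescription:

```python
# Hypothetical mapping of GRI G4 indicators (named in the text) to the
# business model logic where each would plausibly be measured; the
# assignment of indicator to logic is an illustrative assumption.
GRI_TO_LOGIC = {
    "G4-EN1": {"aspect": "materials used", "logic": "production"},
    "G4-PR5": {"aspect": "customer satisfaction", "logic": "marketing"},
}

def indicators_for_logic(logic: str) -> list:
    """Return the GRI indicator codes mapped to one business model logic."""
    return [code for code, meta in GRI_TO_LOGIC.items()
            if meta["logic"] == logic]

print(indicators_for_logic("production"))  # ['G4-EN1']
```

Such a lookup table would let an assessment team trace each SBSC measure back to the business model logic it illuminates.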
However, the GRI framework, its aspects, indicators, and materiality
matrix should be applicable to the purpose of SUST-BMAs. Assuming
that it is possible to identify material aspects and their location within a
business model, accordingly adapted indicators could be implemented.
Other sources of indicators, for example, industry- or product-specific standards,
[Figure: material aspects plotted by influence on stakeholder assessments and decisions (low to high) against significance for the business that is being assessed (low to high)]
Fig. 7.7 Illustration of a materiality matrix (Source: Adapted from GRI 2013b: 12)
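A materiality screen along the two axes of such a matrix could be sketched as a small helper; the 0.5 threshold and the either-axis decision rule below are illustrative assumptions, not GRI requirements:

```python
def is_material(stakeholder_influence: float, business_significance: float,
                threshold: float = 0.5) -> bool:
    """Classify an aspect as material when it scores high on either axis.
    The threshold and either-axis rule are illustrative assumptions."""
    return (stakeholder_influence >= threshold
            or business_significance >= threshold)

# Aspects scored on both axes (0 = low, 1 = high); the labels are hypothetical.
aspects = {"water use": (0.9, 0.3), "office recycling": (0.2, 0.1)}
material = [name for name, (infl, sig) in aspects.items()
            if is_material(infl, sig)]
print(material)  # ['water use']
```

In practice, the threshold and weighting of the two axes would be set through the stakeholder consultation the GRI process prescribes.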
logics. The advantage, or the very nature, of the business model concept is
that systemic linkages across several logics can be captured, described, understood,
and shared, which is another crucial function when it comes to the
evaluation and communication of a business model's sustainability performance.
Based on such a performance assessment, sustainable business model
innovations can be initiated (cf. Bocken et al. 2014; Boons and Lüdeke-Freund
2013; Schaltegger et al. 2012, 2016), the outcome of which can in
turn be evaluated using the SUST-BMA framework. In consequence, the
SUST-BMA framework and process will result in the continuous improvement
and development of more consistent and sustainable business models.
Notes
1. The term 'sustainable entrepreneur' or 'sustainable entrepreneurship'
is meant to include any form of leadership, entrepreneurship,
or managerial activity, mainly pursued in business organisations,
that deliberately aims at the integration of ecological, social, and
economic aspects and the creation of accordingly multiple kinds of
value for the natural environment, society, and the business organisation
itself (see e.g. Schaper 2010).
2. Knowing that the literature offers far more definitions and concepts
(e.g. Zott et al. 2011), our review includes only those which
explicitly define business model elements and their relationships, provide
minimum definitions of business model functions, and give information
about the theoretical or practical context (e.g. ICT, organisation,
or strategy).
3. http://database.globalreporting.org/
References
Abdelkafi, Nizar, Sergiy Makhotin, and Thorsten Posselt. 2013. Business model
innovations for electric mobility – What can be learned from existing business
model patterns? International Journal of Innovation Management 17(1): 1340003.
Afuah, Allan. 2004. Business Models: A Strategic Management Approach. New York:
McGraw-Hill.
Al-Debei, Mutaz M., and David Avison. 2010. Developing a unified framework of
the business model concept. European Journal of Information Systems 19(3):
359–376.
Alt, Rainer, and Hans-Dieter Zimmermann. 2014. Editorial 24/4: Electronic
markets and business models. Electronic Markets 24(4): 231–234.
Amit, Raphael, and Christoph Zott. 2001. Value creation in e-business. Strategic
Management Journal 22(6/7): 493–520.
Arend, Richard J. 2013. The business model: Present and future – Beyond a
skeuomorph. Strategic Organization 11(4): 390–402.
Baden-Fuller, Charles, Benoît Demil, Xavier Lecocq, and Ian MacMillan. 2010.
Editorial. Long Range Planning 43(2–3): 143–145.
Baden-Fuller, Charles, and Mary S. Morgan. 2010. Business models as models.
Long Range Planning 43(2): 156–171.
Beattie, Vivien, and Sarah Jane Smith. 2013. Value creation and business models:
Refocusing the intellectual capital debate. The British Accounting Review 45(4):
243–254.
Beltramello, Andrea, Linda Haie-Fayle, and Dirk Pilat. 2013. Why New Business
Models Matter for Green Growth. Paris: OECD.
Bidmon, Christina Melanie, and Sebastian Knab. 2014. The three roles of business
models for socio-technical transitions. In The Proceedings of XXV ISPIM
Conference – Innovation for Sustainable Economy and Society, 8–11.
Bieger, Thomas, and Stephan Reinhold. 2011. Das wertbasierte Geschäftsmodell –
Ein aktualisierter Strukturierungsansatz. In Innovative Geschäftsmodelle, 13–70.
Berlin: Springer.
Bisgaard, Tanja, K. Henriksen, and M. Bjerre. 2012. Green business model
innovation – Conceptualisation, next practice and policy. Nordic Innovation, Oslo.
Bocken, N.M.P., S.W. Short, P. Rana, and S. Evans. 2014. A literature and practice
review to develop sustainable business model archetypes. Journal of Cleaner
Production 65: 42–56.
Boons, Frank, and Florian Lüdeke-Freund. 2013. Business models for sustainable
innovation: State-of-the-art and steps towards a research agenda. Journal of
Cleaner Production 45: 9–19.
Boons, Frank, Carlos Montalvo, Jaco Quist, and Marcus Wagner. 2013. Sustainable
innovation, business models and economic performance: An overview. Journal
of Cleaner Production 45: 1–8.
Breuer, Henning, and Florian Lüdeke-Freund. 2017a. Values-Based Innovation
Management: Innovating by What We Care About. Houndmills: Palgrave.
———. 2017b. Values-based network and business model innovation. International
Journal of Innovation Management 21(3): Art. 1750028.
Burritt, Roger L., and Stefan Schaltegger. 2010. Sustainability accounting and
reporting: Fad or trend? Accounting, Auditing & Accountability Journal 23(7):
829–846.
Camponovo, Giovanni, Yves Pigneur, and S. Lausanne. 2004. Information systems
alignment in uncertain environments. Proceedings of Decision Support
Systems (DSS).
Carayannis, Elias G., Stavros Sindakis, and Christian Walter. 2014. Business model
innovation as lever of organizational sustainability. The Journal of Technology
Transfer 40(1): 1–20.
Casadesus-Masanell, Ramon, and Joan Enric Ricart. 2010. From strategy to business
models and onto tactics. Long Range Planning 43(2): 195–215.
Charter, Martin, Casper Gray, Tom Clark, and Tim Woolman. 2008. Review: The
role of business in realizing sustainable consumption and production. In System
Innovation for Sustainability 1: Perspectives on Radical Changes to Sustainable
Consumption and Production, 46–69. Sheffield, UK: Greenleaf Publishing in
association with GSE Research.
Chesbrough, Henry. 2010. Business model innovation: Opportunities and barriers.
Long Range Planning 43(2): 354–363.
Chesbrough, Henry, and Richard S. Rosenbloom. 2002. The role of the business
model in capturing value from innovation: Evidence from Xerox Corporation's
technology spin-off companies. Industrial and Corporate Change 11(3):
529–555.
Clinton, L., and R. Whisnant. 2014. Model behavior – 20 business model innovations
for sustainability. SustainAbility Report.
Cohen, Boyd, and Jan Kietzmann. 2014. Ride on! Mobility business models for
the sharing economy. Organization & Environment 27(3): 279–296.
Cohen, Boyd, and Monika I. Winn. 2007. Market imperfections, opportunity and
sustainable entrepreneurship. Journal of Business Venturing 22(1): 29–49.
del Mar Alonso-Almeida, María, Josep Llach, and Frederic Marimon. 2014. A
closer look at the Global Reporting Initiative sustainability reporting as a tool
to implement environmental and social policies: A worldwide sector analysis.
Corporate Social Responsibility and Environmental Management 21(6):
318–335.
Demil, Benoît, and Xavier Lecocq. 2010. Business model evolution: In search of
dynamic consistency. Long Range Planning 43(2): 227–246.
Doganova, Liliana, and Marie Eyquem-Renault. 2009. What do business models
do?: Innovation devices in technology entrepreneurship. Research Policy
38(10): 1559–1570.
Ehrenfeld, John, and Andrew Hoffman. 2013. Flourishing: A Frank Conversation
about Sustainability. California: Stanford University Press.
Figge, Frank, Tobias Hahn, Stefan Schaltegger, and Marcus Wagner. 2002. The
sustainability balanced scorecard – Linking sustainability management to business
strategy. Business Strategy and the Environment 11(5): 269–284.
Grassl, Wolfgang. 2012. Business models of social enterprise: A design approach
to hybridity. ACRN Journal of Social Entrepreneurship Perspectives 1(1): 37–60.
GRI. 2013a. Reporting Principles and Standard Disclosures. Amsterdam: Global
Reporting Initiative.
———. 2013b. G4 Sustainability Reporting Guidelines – Implementation Manual.
Amsterdam: Global Reporting Initiative.
Hahn, Tobias, Frank Figge, Jonatan Pinkse, and Lutz Preuss. 2010. Trade-offs in
corporate sustainability: You can't have your cake and eat it. Business Strategy
and the Environment 19(4): 217–229.
Hamel, Gary. 2000. Leading the Revolution: How to Thrive in Turbulent Times by
Making Innovation a Way of Life. Boston: Harvard Business School Press.
Hansen, Erik, and Stefan Schaltegger. 2016. The sustainability balanced scorecard:
A systematic review of architectures. Journal of Business Ethics 133(2):
193–221.
Hansen, Erik G., Friedrich Grosse-Dunker, and Ralf Reichwald. 2009.
Sustainability innovation cube – A framework to evaluate sustainability-oriented
innovations. International Journal of Innovation Management 13(4):
683–713.
Hedman, Jonas, and Thomas Kalling. 2003. The business model concept:
Theoretical underpinnings and empirical illustrations. European Journal of
Information Systems 12(1): 49–59.
Hockerts, Kai, and Rolf Wüstenhagen. 2010. Greening Goliaths versus emerging
Davids – Theorizing about the role of incumbents and new entrants in sustainable
entrepreneurship. Journal of Business Venturing 25(5): 481–492.
Johnson, Mark W. 2010. Seizing the White Space: Business Model Innovation for
Growth and Renewal. Brighton: Harvard Business Press.
Johnson, H. Thomas, and Robert S. Kaplan. 1987. The rise and fall of management
accounting. IEEE Engineering Management Review 3(15): 36–44.
Johnson, Mark W., and Josh Suskewicz. 2009. How to jump-start the clean
economy. Harvard Business Review 87(11): 52–60.
Johnson, Mark W., Clayton M. Christensen, and Henning Kagermann. 2008.
Reinventing your business model. Harvard Business Review 86(12): 50–59.
Kaplan, Robert S., and David P. Norton. 1992. The balanced scorecard – Measures
that drive performance. Harvard Business Review 70(1): 71–79.
———. 1996. Using the balanced scorecard as a strategic management system.
Harvard Business Review 74(1): 75–85.
———. 2000. Having trouble with your strategy? Then map it. Harvard Business
Review 78(5): 167–176.
Kiron, David, Nina Kruschwitz, Knut Haanaes, Martin Reeves, and Eugene
Goh. 2013. The innovation bottom line. MIT Sloan Management Review
54(3): 1.
Kraut, Marla, Philip Dennis, and Heidi Connole. 2012. The efficacy of voluntary
disclosure: A study of water disclosures by mining companies using the global
reporting initiative framework. Academy of Accounting and Financial Studies
17(2): 23.
Lambert, Susan Christine. 2010. Progressing business model research towards
mid-range theory building. PhD diss., University of South Australia.
———. 2012. A Multi-Purpose Hierarchical Business Model Framework. Centre
for Accounting, Governance and Sustainability, School of Commerce,
University of South Australia.
Lankoski, Leena. 2006. Environmental and economic performance: The basic links.
In Managing the Business Case for Sustainability, eds. S. Schaltegger and
M. Wagner, 32–46. Sheffield: Greenleaf Publishing.
Upward, Antony. 2013. Towards an ontology and canvas for strongly sustainable
business models: A systemic design science exploration. PhD diss., York
University, Toronto.
Verhulst, Elli, Ivo Dewit, and Casper Boks. 2012. Implementation of sustainable
innovations and business models. Entrepreneurship Innovation Sustainability
1(25): 32–66.
Wells, Peter. 2008. Alternative business models for a sustainable automotive
industry, 80–98.
———. 2013a. Business Models for Sustainability. Cheltenham: Edward Elgar
Publishing.
———. 2013b. Sustainable business models and the automotive industry: A
commentary. IIMB Management Review 25(4): 228–239.
Wells, Peter, and Margarete Seitz. 2005. Business models and closed-loop supply
chains: A typology. Supply Chain Management: An International Journal
10(4): 249–251.
Wirtz, Bernd W. 2011. Business Model Management: Design – Instruments – Success
Factors. Wiesbaden: Gabler.
Wüstenhagen, Rolf, Jost Hamschmidt, Sanjay Sharma, and Mark Starik, eds. 2008.
Sustainable Innovation and Entrepreneurship, New Perspectives in Research on
Corporate Sustainability. Cheltenham: Edward Elgar.
Wüstenhagen, Rolf, and Jasper Boehnke. 2008. Business models for sustainable
energy. In Perspectives on Radical Changes to Sustainable Consumption and
Production, 70–79. Sheffield, UK: Greenleaf Publishing in association with
GSE Research.
Yunus, Muhammad, Bertrand Moingeon, and Laurence Lehmann-Ortega. 2010.
Building social business models: Lessons from the Grameen experience. Long
Range Planning 43(2): 308–325.
Zeyen, Anica, Markus Beckmann, and Roya Akhavan. 2014. Social entrepreneurship
business models: Managing innovation for social and economic value creation.
In Managementperspektiven für die Zivilgesellschaft des 21. Jahrhunderts,
107–132. Wiesbaden: Springer Fachmedien.
Zott, Christoph, and Raphael Amit. 2007. Business model design and the performance
of entrepreneurial firms. Organization Science 18(2): 181–199.
———. 2008. The fit between product market strategy and business model:
Implications for firm performance. Strategic Management Journal 29(1): 1–26.
———. 2010. Business model design: An activity system perspective. Long Range
Planning 43(2): 216–226.
———. 2013. The business model: A theoretically anchored robust construct for
strategic analysis. Strategic Organization 11(4): 403–411.
Zott, Christoph, Raphael Amit, and Lorenzo Massa. 2011. The business model:
Recent developments and future research. Journal of Management 37(4):
1019–1042.
CHAPTER 8
Alexander Rayner
8.1 Introduction
We live in a world overloaded with data, often referred to as the Big
Data era, where both organizations and individuals are overwhelmed
with an abundance of existing data. Nobody knows how much of the data
collected and stored is being used effectively to make data-driven deci-
sions, let alone how much of the data is actually understood. Yet, more
and more new data continues to be created from new sources such as
mobile positioning data, wearable technologies such as the Apple Watch,
and the Internet of Things (IoT), which, according to Gartner, is
expected to interconnect nearly 26 billion devices by 2020.
Data-driven decisions can create and sustain a competitive advantage,
but it must be recognised that competing in a data-driven world is about
people being able to collaborate effectively around an ecosystem of data.
To create and maintain a competitive advantage using data, the data first
needs to be understood, and it needs to be assessed and analyzed quickly,
A. Rayner (*)
SmartData.travel Limited, Hong Kong
e-mail: alex@smartdata.travel
to enable faster decisions. The key is speed, and technology is a tool avail-
able for quickly making data useful. One example is the use of visualization
to communicate data clearly and efficiently to users by using graphics such
as graphs and charts. Effective visualization helps users to quickly analyze
data and identify trends, making complex data easier to understand and
more useful.
Global tourism continues to grow every year: in 2015 the number
of international tourist arrivals (overnight visitors) reached 1.184 billion,
with spending of US$1.4 trillion, according to the United
Nations World Tourism Organisation (UNWTO). By 2030, arrivals are
expected to reach 1.8 billion, which means on average 5 million people
will be crossing international borders every day. The important economic
and social impact of international visitors is recognized by more and more
governments; consequently, competition between destinations to attract
visitors is extremely strong and continues to intensify.
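The per-day figure follows from simple arithmetic on the UNWTO forecast:

```python
# UNWTO forecast: 1.8 billion international arrivals per year by 2030.
arrivals_per_year = 1.8e9
arrivals_per_day = arrivals_per_year / 365
print(round(arrivals_per_day / 1e6, 1))  # 4.9, i.e. roughly 5 million per day
```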
Every stage of the travel journey, from dreaming, planning, and booking
to experiencing and sharing, creates an abundance of data. Consequently, an
emerging strategy by National Tourism Organizations (NTOs) is to use
data metrics and analytics for decision-making, enabling marketing to be
more targeted and focused to attract high value-adding visitors, with a
shift away from using the number of visitor arrivals as a performance indi-
cator, to more meaningful indicators such as visitor expenditure, length of
stay and number of jobs in tourism.
Many segments of the travel and tourism sector have started
to adopt data-driven decision-making, the most advanced being the airline,
hotel, and online travel agent segments; however, there is plenty of scope
and opportunity to improve when compared with sectors beyond travel and
tourism.
The Pacific Asia Travel Association (PATA) is a not-for-profit travel
trade membership association, established in 1951, to act as a catalyst
for developing the Asia Pacific travel and tourism industry. In partner-
ship with private and public sector members, from all segments of travel
and tourism, PATA's mission is to enhance the sustainable growth, value
and quality of travel and tourism to, from and within the Asia Pacific
region.
At its inception, PATA pioneered the way in which travel and tourism
was managed and promoted by 'thinking outside the box', a key element
of which was accurate research and intelligence. In 2010, upon reflecting
on PATA's achievements, founding member Matt Lurie said strategic
SMART DECISION-MAKING AND PRODUCTIVITY IN THE DIGITAL... 209
intelligence was, and should remain, a core focus of PATA, particularly for
smaller destinations that lack the resources to do it themselves.
Since 1964, PATA's Annual Statistical Report, later renamed the
Annual Tourism Monitor (ATM), has aggregated and disseminated data
about the Asia Pacific travel and tourism sector to PATA members and
has always been considered a core member service and key membership
benefit (Figs. 8.1 and 8.2).
Recognizing the evolving importance of data, research and intelligence,
the PATA Strategic Intelligence Centre (SIC) was established in 1997 to
focus on producing a wide range of publications and market intelligence
reports, including the ATM and Forecasts that were distributed in print,
CD and DVD formats.
As the Internet gained popularity and widespread usage, it changed the
way data is communicated and also created an expectation for data access
on demand, at any time, and from anywhere. PATA provided a unique
member service by aggregating data from NTOs, which in the past was
Fig. 8.1 Left: Cover page of the PATA 1st annual statistical report
Fig. 8.2 Right: Cover page of the PATA annual tourism monitor 2015 early
edition
210 A. RAYNER
difficult to access and obtain unless you personally knew the person to
contact to get the data. With the Internet, however, many NTOs made
their data available on their websites so that anyone could access and
download it. PATA members were able to access and download travel and
tourism data from websites at their convenience, 24/7, and no longer
needed to rely as heavily on PATA.
The massive amounts of data available on the Internet continue to
increase exponentially, making the task of searching for and finding the
right data extremely time-consuming and frustrating. The challenge today
is to ensure the data is valid and comes from a trusted and credible source.
After the most recent Global Financial Crisis (GFC), new regulations
and governance requirements were introduced requiring organizations
and their management to become more accountable. This resulted in
the growth of data-driven decision-making, increasing demand for more
detailed data and more frequent updates. PATA's highest-paying members,
particularly governments and carriers, wanted more comprehensive
data, more frequently, with the on-demand access that the Internet facilitated.
In 2009, PATA appointed a new CEO, and in 2010 PATA management
made the decision to create a data dashboard: a web-based software
platform for data aggregation and dissemination, in an effort to provide
more benefits relevant to its existing and potential members.
A consultant was engaged to develop the data dashboard, but there
was no budget available for the technical development, which created
limitations for the solution, content and functionality. Working within the
given constraints, a partnership was created with a software developer that
made technical services available in exchange for exposure and promotion
to the travel and tourism sector through PATA's network reach.
The vision of the data dashboard was to enable better decisions by
PATA member travel and tourism professionals, by aggregating data and
allowing faster access to it through a web-based 'One Stop Shop': a dynamic
reporting mechanism allowing users to select what information they
needed, when they needed it.
The initial objective was to create a prototype to showcase at the
PATA Annual General Meeting (AGM) in April 2010. It was anticipated that
if members recognized value in an operational proof of concept, they
would want to continue the project and, hopefully, resource it.
PATA's website had many limitations, so the quickest and most effective
way to integrate a dashboard was to create an iframe within the existing
website. The dashboard was embedded without users knowing that a link
was actually taking them to another website, hosted elsewhere on a
different server that had the capacity and capability to host large amounts of
data and quickly process complex queries.
The Chinese Year of the Tiger that commenced on 14 February 2010
was poised to become a 'Year of Transformation', with a spirit of innovation
and changing market boundaries. As the web-based software platform was
developed and launched during the Year of the Tiger, and since it was an
innovative transformation for PATA, it was named TIGA, an acronym for
Travel Intelligence Graphic Architecture.
A key issue raised was: with hundreds of PATA members paying different
prices for their membership, how could the benefit of TIGA be
aligned with the value of the membership investment?
Another concern arose from Apple's release of the iPad tablet on 3
April 2010: because Flash software was not supported, TIGA could
not be viewed on the iPad. If the iPad became significantly popular,
TIGA would be limited in terms of usage, and further development
would be needed to make it usable on the iPad.
process that began with identifying and validating value to users, identify-
ing the various sources, contacting the sources, negotiating the rights to
display supplier data on TIGA, reaching agreement on the collection method
including legal agreements, establishing a data input process, testing, and
finally promoting the indicator to users and PATA members.
To overcome the challenge of obtaining third-party data without incur-
ring cost, TIGA was positioned as a promotion tool for data suppliers,
and the opportunity was provided to showcase a selection of high-level
indicators that provided insight, but with greater value gained if the user
subscribed to the third party's data. Although this became a successful
win-win scenario, it took substantial time to convince and gain approvals,
especially from legal departments of the various data suppliers, who were
concerned about the risk of losing potential revenue. A new PATA mem-
bership category was introduced called Preferred Partner that was offered
to third-party suppliers when the data provided had a value, and PATA
products and services were bartered for the value equivalent.
Formal arrangements and agreements for data collection and dissemi-
nation were put in place for each data source including the NTOs.
Harmonization issues then arose from the PATA definition of the Asia
Pacific region and sub-regions and the destinations contained therein, as it
was different from that of many other organizations. In effect, each orga-
nization grouped destinations into their own sub-regional and regional
clusters. The term PATA Region was adopted, and a comprehensive cod-
ing system was developed. When data from different sources was input to
TIGA, the destinations had to be loaded separately to ensure alignment
with PATA Region definitions and thereby allow for direct comparison.
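The alignment step can be pictured as a lookup table that maps each supplier's destination labels onto a unified coding scheme; the codes and groupings below are invented for illustration and are not PATA's actual coding system:

```python
# Hypothetical unified coding table: each source label maps to one
# (code, sub-region) pair in the "PATA Region" scheme. The codes and
# groupings are illustrative, not PATA's real definitions.
PATA_CODES = {
    "Thailand": ("TH", "Southeast Asia"),
    "Siam": ("TH", "Southeast Asia"),    # legacy label used by one source
    "New Zealand": ("NZ", "Pacific"),
}

def to_pata_region(source_label: str) -> dict:
    """Align one supplier's destination label with the unified scheme."""
    code, subregion = PATA_CODES[source_label]
    return {"code": code, "subregion": subregion}

# Two suppliers using different labels now resolve to the same destination,
# allowing direct comparison of their data.
print(to_pata_region("Thailand") == to_pata_region("Siam"))  # True
```

Loading each source's destinations through such a table is what makes sub-regional aggregates from different organizations directly comparable.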
The next issue was determining how to provide different value to
the members of PATA, many of whom paid different membership dues,
ranging from $250 to $50,000 per annum. After extensive member
consultation, debate and feedback, it was agreed to create a member login
mechanism and to provide three levels of access with varying data
indicators for members. Regional and sub-regional data remained available to
the public; however, when users wanted to access data at a destination
level, then PATA membership was required.
PATA members would receive access to Indicators and Destination data
based on the amount of their investment in PATA membership (Table 8.1).
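The tiering logic can be sketched as a simple mapping from dues to access level; only the under-$1000 Local tier is stated in the text, so the second threshold and the other tier names below are hypothetical:

```python
def access_level(annual_dues: int) -> str:
    """Map membership dues (USD per annum) to one of three access levels.
    Only the under-$1000 'Local' tier is stated in the chapter; the
    remaining threshold and tier names are illustrative assumptions."""
    if annual_dues < 1000:
        return "Local"         # all indicators except forecasts and source
                               # markets, for a single destination
    elif annual_dues < 10000:  # hypothetical cut-off
        return "Regional"
    return "Global"

print(access_level(500))    # Local
print(access_level(50000))  # Global
```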
Members paying under $1000 per annum received Local access that
provides all indicators except forecasts and source markets for a single destination.
It was better also in the type of data and indicators, presented in ways that
saved time, enabled collaboration and provided immediate insights. PATA
members could easily use the Internet to search for data, so TIGA had to
offer a higher value-added proposition.
Getting data from third parties was a challenge, especially with no budget,
even with partnership propositions, as PATA needed to select indicators
that were meaningful to the user without diminishing the role of
the data supplier's product and making it redundant or obsolete.
Strategic integration of the various data indicators was the magic for-
mula to create resilience, robustness and sustainability.
Smartphones became popular after the iPhone was introduced in 2007
and started to become mainstream. This was compounded by the popularity
of tablets, in particular the iPad, so PATA management made the
decision in late 2011 to redevelop TIGA in HTML5 so that it would work
on all platforms, including mobiles. To reflect the change, TIGA was
rebranded and renamed PATAmPOWER.
8.5 PATAmPOWER
PATAmPOWER1 was launched in April 2012 at the PATA AGM in Kuala
Lumpur, Malaysia, in alignment with a PATA rebranding campaign called
PATA Next Gen.
PATAmPOWER could now be accessed on any device connected to the
Internet, including tablets and smartphones that many PATA members
were using, and helped to make PATA more appealing to the younger and
emerging travel and tourism leaders especially in the Asia region.
Travel and tourism professionals could now access travel and tourism
data on demand from their mobile devices, whether at a lunch meeting,
at an event, or when meeting clients at a coffee shop. Travel and tourism
data was now available beyond a desktop computer or a notebook, and
this enhanced mobility enabled faster and smarter decisions anywhere,
anytime, 24/7, at the convenience of travel and tourism professionals
worldwide.
8.5.1 Value
The value created by PATAmPOWER is a combination of many factors
that include:
8.5.2 Marketing
PATAmPOWER is promoted throughout PATA's marketing materials
and is highlighted as a key membership benefit. The PATA website
www.PATA.org features PATAmPOWER banners and links to
PATAmPOWER, as well as links to a PDF flyer that provides a high-level
overview of PATAmPOWER. PATA's weekly newsletter, PATA Voice,
features new insights from PATAmPOWER in every edition.
A video highlighting the benefits of PATAmPOWER was produced and posted on PATA's YouTube channel, PATA TV, and was complemented by several webinars explaining how to use PATAmPOWER, its data indicators and its functionality.
Promotion at PATA events, using pull-up stands, workshops and live demonstrations, has been an effective way to raise awareness.
The media occasionally acknowledge PATAmPOWER as a data source
when reporting about the travel and tourism sector; however, there
8.5.4 PATAmPOWER Software as a Service
There are many sources of data available, and using several sources usually means multiple login usernames and passwords, which is inconvenient and time-consuming. The preference is for a single platform containing all the data; hence the emerging opportunity for the PATAmPOWER platform to become a Software as a Service (SaaS) offering that can be customized with all the data an organization wants to use.
Australia's Queensland Government was the launch customer of the PATAmPOWER white-label SaaS solution, licensing the software to develop a customized version for Tourism and Events Queensland (TEQ).2 TEQ identified the opportunity to leverage PATAmPOWER by customizing the content with Queensland regional data indicators, enabling the Queensland visitor economy to make smarter decisions by using metrics and data as a basis for decision-making.
Tourism Malaysia signed an agreement to purchase the PATAmPOWER SaaS at the World Travel Mart in November 2014 to create a customised platform, TMmPOWER, containing data about Malaysia's visitor economy. The platform was developed during 2015 and launched in 2016 as MyTourismData, available at http://www.MyTourismData.tourism.gov.my
8.5.5 Future of PATAmPOWER
PATAmPOWER is a key PATA membership benefit, so awareness is high among the PATA community; however, a huge market opportunity exists among the millions of tourism professionals and organizations that may not even be aware of PATA.
PATA has the opportunity to promote and to drive data-driven decision-
making, and position PATAmPOWER as a central and core component,
especially in the support of advocacy issues where data can add credibility,
increase understanding and help stakeholders to decide about their posi-
tion on advocacy issues.
PATA's advocacy and use of the term 'visitor economy' recognizes that tourism impacts an extensive value chain beyond travel and tourism.
PATAmPOWER data about the Asia Pacific visitor economy can be rel-
evant and useful to enterprises beyond travel and tourism, and a large
market opportunity exists with businesses that rely on the visitor economy
for revenue and their existence, such as restaurants, suppliers and retailers
to name just a few.
A challenge that remains is to determine how each PATA member organization can gain more value from more effective use of PATAmPOWER. Some PATA member organization employees who could benefit from PATAmPOWER remain unaware of its existence or of their entitlement to use it. This may be because their designated contact or liaison person for PATA has not shared the benefits of PATAmPOWER within their organization, for reasons ranging from not making the time, to a limited understanding of the value of data and analytics, to simply having no interest at all in this area.
Getting people to use PATAmPOWER remains a challenge. Usage of PATAmPOWER is similar to a gym membership: you have access, and it is available for your use. If you use it, you benefit, but unfortunately many people don't make the time, or are not sure how to use it.
Engagement can be increased by consistent education and training
about the meaning of the content indicators, and how they can be used to
generate insights and value.
By identifying and targeting new and emerging data-focused job roles, such as the Data Scientist positions that are beginning to gain prominence among travel and tourism organizations, PATAmPOWER can be positioned as a very relevant support tool.
Constant development is another key challenge. In today's fast-changing world, the success of PATAmPOWER will depend on consistent investment in the development of both the technology and the content.
Content development in particular should remain an ongoing activity: monitoring the usage of existing indicators and identifying new ones will ensure that the content is what users need. It is clear, for example, that data is increasingly being demanded at the city level as well as at the national level. The challenge is securing the data at city level and the resources to regularly aggregate it.
Technology improvements need to be constantly monitored and evaluated, as innovation will not only continue but also accelerate in the future.
222 A. RAYNER
8.6 Conclusion
As the Big Data era becomes more complex and the velocity of data continues to accelerate, the role of technology will become critical to making sense of data, understanding it, and making data-driven decisions quickly, whether to gain competitive advantage or merely to retain existing customers.
Smart data is already the key ingredient for strategies that drive con-
stant improvement and innovation, and in creating and maintaining com-
petitive advantage for many segments of the travel and tourism sector.
The PATAmPOWER system creates value by unleashing the potential buried in data and by stimulating the use of data for better decisions.
Notes
1. http://mpower.pata.org/
2. http://teq.queensland.com/teqmpower
CHAPTER 9
Konstantinos Biginas
9.1 Introduction
The business environment of the next decades will be significantly different to what might have been expected just two years ago. Over the next 10–15 years, businesses will face major changes in finance and capital conditions. Finance will be more expensive and its availability will be constrained by regulation and changes to the banking market. After an era in which finance was cheap and readily available, these changes will be a significant driver of adjustments to corporate finance models and investment behavior. The next decade will almost certainly be characterized by a higher level of economic volatility and increased risk, clouding the certainty required for long-term planning. The financial crisis has accelerated three other existing drivers of change or has changed their character. Public trust in business and markets, already in decline, is now at a low ebb. The profit motive is distrusted, and the onus is now on businesses to demonstrate their ethical credentials. There is greater skepticism
K. Biginas (*)
London College of International Business Studies, London, UK
e-mail: konstantinos.biginas@lcibs.org
about the capitalistic economic model and its ability to deliver desirable
and efficient outcomes; greater political activism, government interven-
tion and supervision can be expected. Businesses' approach to social and
demographic change will also alter as a result of the recession. Retirement
will still accentuate existing shortages of critical skills, but plugging these
gaps will have to be the responsibility of business rather than govern-
ment, whose spending will be constrained. In addition, pension problems
will force some to work longer, requiring businesses to manage staff with
wider age ranges, expectations and motivations than before. Lastly, the
recession has altered the economic climate in which business needs to
move to a low-carbon economy and improve resource use. The ability and
preferences of government and some consumers to pay for this movement
have been compromised, raising new questions about the role of business.
Energy costs will continue to increase in the medium term, affecting the
basic profit structure of many companies. At the same time, trends in tech-
nology change are set to continue, and as over the last decade, will have a
significant impact on business models and ways of working.
9.1.1 Future Implications
So, what will our world be like in 2030? This chapter aims to identify seven of the leading drivers of change that will affect our future: public trust and confidence in businesses and markets, sustainability and resource issues, climate change, energy, demographics, urbanization, and technology and e-commerce. These seven drivers will be analyzed in terms of their social, technological, environmental, economic and political aspects.
These seven drivers have been chosen based on contemporary challenges. The climate on Earth began changing billions of years ago, but the extent and speed at which this phenomenon has occurred since the industrial revolution, and more specifically over the last 60 years, is alarming. Climate change has been selected due to its significant influence on our lives and the changes it could bring to human and ecological systems. Energy is closely related to climate change. There is a continuous increase in both the demand for and the supply of energy. Energy can be considered partly responsible for climate change, due to its wide use, but at the same time the energy sector is the one that will help to overcome the impacts and threats that climate change has brought. In the not-too-distant future, energy and its alternative sources will most likely become the prime driver of change.
CHANGE MANAGEMENT: PLANNING FOR THE FUTURE... 227
9.2.3 Climate Change
The climate on Earth is stable due to a continuous supply of energy from the sun. This energy passes through the Earth's atmosphere, warming the surface. The warmed Earth sends infrared radiation (heat energy) back towards the atmosphere, where part of this heat is absorbed by atmospheric gases. These gases give rise to the greenhouse effect (see Appendix 1). According to Sharma, in the 1950s few people were aware of the greenhouse effect. John Tyndall had analyzed the gases of the atmosphere to see which of them have the most powerful greenhouse effect (Sharma 2006), and in 1865 he postulated that, within the atmospheric envelope, water vapor and CO2 retain heat. In the 1970s, the phenomenon known as atmospheric warming was renamed global warming. In an effort to keep greenhouse gas concentrations and global warming stable, the UN Framework Convention on Climate Change (UNFCCC) was established. In 1997, the UNFCCC agreed on the Kyoto Protocol (see Appendix 2), which set binding targets for 37 industrialized countries to reduce greenhouse gas (GHG) emissions by an average of 5.2% against 1990 levels over 2008–2012 (Kyoto Protocol 1998).
There are many aspects from which climate change can be analyzed. In terms of the political aspect, international cooperation on per capita emissions and the political view of the rise of sea levels have to be considered. The economic, social and technological aspects are equally important. The environmental aspect is directly linked to climate change. First of all, the most severe impact that global warming has on the Earth is the disappearance of land and sea ice. A 50-year government study found that the world's glaciers are melting at an alarming rate. Glaciers worldwide are melting faster than anyone had predicted they would just a few years ago (Erdman 2009). Furthermore, climate change has already affected natural systems in ways that may eventually drive species to extinction. According to Professor Will Steffen of the Australian National University, the extinction of species is now 100–1000 times faster than it used to be and is expected to increase further this century (Falvon-Lang 2011). Additionally, climate change will affect the oceans as well. Oceans absorb carbon dioxide emissions from human activities. This absorption causes the pH of the water to decrease and leads to chemical changes in the oceans (ocean acidification). The average pH at the surface of the oceans has decreased by 0.1 (from 8.2 to 8.1) since the industrial revolution, and according to predictions it will drop by an additional 0.3 by the end of the century. Corals and mollusks will be among the worst affected. The long-term consequence of ocean acidification will be changes in the stability of a number of ecosystems (National Academy of Sciences 2010).
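The pH figures above can be unpacked with a short calculation. Because pH is a logarithmic scale, even small drops imply large changes in hydrogen-ion concentration; the sketch below uses only the 8.2, 8.1 and further-0.3 figures cited in the text.

```python
# pH is defined as -log10 of the hydrogen-ion concentration [H+], so a pH
# drop of d means [H+] grows by a factor of 10**d. The figures below are
# the ones cited in the text (8.2 -> 8.1, with a further 0.3 drop predicted).

def hydrogen_ion_ratio(ph_before, ph_after):
    """Factor by which [H+] changes when pH moves from ph_before to ph_after."""
    return 10 ** (ph_before - ph_after)

since_industrial = hydrogen_ion_ratio(8.2, 8.1)   # drop of 0.1 so far
by_2100 = hydrogen_ion_ratio(8.2, 8.1 - 0.3)      # total drop of 0.4 predicted

print(f"acidity increase since the industrial revolution: {since_industrial - 1:.0%}")
print(f"projected acidity vs pre-industrial by 2100: {by_2100:.1f}x")
```

A drop of 0.1 thus already corresponds to roughly a 26% rise in acidity, which is why a seemingly small pH change matters for corals and mollusks.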
Finally, deforestation together with climate change can have harmful effects for the planet. In his documentary Confronting Climate Change (see Appendix 3), Al Gore clearly states that human activities like deforestation are changing our climate in ways that pose increasing threats to human well-being, in both developing and industrialized nations (Al Gore, n.d.). Deforestation is a process by which natural forests are logged or burnt in order to use the timber or the land differently. An extent of 12–15 million hectares of forest is lost each year, the equivalent of 36 football fields per minute (wwf.panda.org).
230 K. BIGINAS
9.2.4 Energy
Energy is fundamental to people's everyday needs. Energy used to be cheap, and people believed it would last forever, covering present and future needs. Demand for global energy is therefore constantly increasing. The supply of energy is important to developing countries for their economic development. Generally, the demand for energy can be seen as the combination of population, the economic activity that the population produces, and the energy needed for that activity. Lately, the belief that climate change is being caused by humans has been growing, and energy is considered the main culprit; at the same time, the energy sector is one of the most important in overcoming the challenges that climate change has brought. As the impacts of climate change grow, so does the need to make drastic changes to the energy sources used. In order to make these changes and satisfy future demand, the main fuels first have to be identified. The majority of the energy produced comes from fossil fuels like coal, oil and natural gas.
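The demand decomposition described above (population, activity per person, energy per unit of activity) can be sketched numerically. All figures below are hypothetical, chosen only to show how the three factors multiply together.

```python
# Energy demand as the product of population, activity per person, and the
# energy needed per unit of activity, as described in the text. All numbers
# here are hypothetical and purely illustrative.

def energy_demand(population, activity_per_person, energy_per_activity):
    """Total demand = population * economic activity per person * energy intensity."""
    return population * activity_per_person * energy_per_activity

# Hypothetical country: 50 million people, $20,000 of activity per person,
# 5 megajoules of energy per dollar of activity.
now = energy_demand(50e6, 20_000, 5.0)

# Population +10%, activity +20%, but energy intensity improves by 15%:
later = energy_demand(50e6 * 1.10, 20_000 * 1.20, 5.0 * 0.85)

print(f"demand still rises by {later / now - 1:.1%}")
```

Efficiency gains alone do not guarantee falling demand: here intensity improves by 15%, yet total demand still grows because population and activity grow faster.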
Energy functions and interacts in five broad categories: social, technological, economic, environmental and political. Only the technological aspects will be analyzed further; these concern the changes that need to be made. First of all, electricity can be produced in coal-fired power stations, and efforts are being made to reduce the environmental impact of such stations. Power stations operate by harnessing suitable raw energy sources and transforming them into electrical energy, which in turn is sent to houses and industries. The most widely used fuel is coal. The global production of coal is expected to increase by 30% in the coming decades, mainly in Australia, China, Russia, Ukraine, Kazakhstan and South Africa (Zittel and Schindler 2007). Another way to generate electricity more efficiently is the hydrogen economy, in which energy is stored as hydrogen and used to balance the electrical grid load and in mobile applications.
According to Friedemann, wind turbines alone can generate electricity at 30–40% efficiency, producing hydrogen at an overall 25% efficiency. That is, when the wind is blowing (Friedemann 2005). Demand-side management and micro-generation are two further ways to save energy. Demand-side management refers to signals sent to a household to warn occupants of high consumption. Micro-generation is the production of energy or heat on a small scale, by individuals, small businesses and communities, to meet their own needs: electricity generated at a capacity below 50 kW and heat below 300 kW, for example from heat pumps, solar panels, biomass boilers and micro-wind turbines. The UK government and the Department of Energy and Climate Change have launched a consultation on the micro-generation strategy they are following (Froley 2010).
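The Friedemann figures quoted above imply a simple efficiency chain in which each conversion step multiplies the losses. The sketch below uses an assumed midpoint of the 30–40% turbine range; the resulting hydrogen-step figure is an inference for illustration, not a quoted value.

```python
# In an energy conversion chain, overall efficiency is the product of the
# per-stage efficiencies. Using the midpoint (35%) of the 30-40% turbine
# range quoted above, an overall 25% implies the electricity-to-hydrogen
# step runs at roughly 71% (an inference for illustration, not a quoted figure).

def overall_efficiency(*stages):
    """Multiply per-stage efficiencies along a conversion chain."""
    result = 1.0
    for eff in stages:
        result *= eff
    return result

turbine = 0.35                    # assumed midpoint of the quoted 30-40% range
to_hydrogen = 0.25 / turbine      # implied efficiency of the hydrogen step

print(f"implied hydrogen-step efficiency: {to_hydrogen:.0%}")
print(f"overall chain efficiency: {overall_efficiency(turbine, to_hydrogen):.0%}")
```

The multiplication explains why long conversion chains are costly: two stages at 50% each already leave only a quarter of the original energy.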
Finally, there are energy technologies based on renewable sources (e.g. wind, solar, biomass, geothermal) for our future energy generation, allowing a move away from conventional fossil fuel-based energy sources. These technologies have seen rapid change in recent years, more specifically in the range of their implementation and their public and commercial use (O'Keefe et al. 2010). Unfortunately, for now, the electricity generated from renewable sources accounts for only one-fifth of total energy consumption (see Appendix 4) (Hodgson 2010).
9.2.5 Demographics
The demographer Ronald Lee defines demography as the study of the causes and consequences of demographic rates and structures. The demographic rates are fertility, mortality and migration, whereas the structures include size and distribution by age, sex, race-ethnicity and geographic location (Birks 2007). Demographics as a driver of change can be seen as the combination of interrelated social, economic, political, technological and environmental factors. There are three processes related to demography: natural population development (fertility and mortality) and migration. Fertility can be defined as the mean number of children (alive) that a woman will give birth to during her lifetime. The fertility rate in European countries lies at 2.1, much lower than in Africa, Asia, Oceania, and Latin and Northern America. In the last decades of the twentieth century, continuous changes in fertility were observed: the global fertility level of 4.6 in 1970–1980 fell to half that in 1994–2005 (United Nations 2007). This reduction can be attributed mostly to social changes that had a fundamental impact on sustaining development.
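The three processes above combine in the demographic balancing equation: next year's population equals this year's plus births, minus deaths, plus net migration. A minimal sketch, with all rates hypothetical and chosen only for illustration:

```python
# Demographic balancing equation: P(t+1) = P(t) + births - deaths + net migration.
# Crude birth/death rates are per person per year; all values are hypothetical.

def project_population(population, birth_rate, death_rate, net_migration, years):
    """Project a population forward year by year and return the trajectory."""
    trajectory = [population]
    for _ in range(years):
        births = population * birth_rate
        deaths = population * death_rate
        population = population + births - deaths + net_migration
        trajectory.append(population)
    return trajectory

# Hypothetical country: 10 million people, 1.2% birth rate, 0.9% death rate,
# and 20,000 net migrants per year.
path = project_population(10_000_000, 0.012, 0.009, 20_000, years=10)
print(f"population after 10 years: {path[-1]:,.0f}")
```

Even this crude model shows how a small gap between birth and death rates compounds year on year, which is why fertility trends dominate long-run projections.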
9.2.6 Urbanization
Urbanization is the process by which large numbers of people become permanently concentrated in relatively small areas, forming cities. Urbanization can be caused by natural increase, migration or the reclassification of rural areas as urban. Urban areas are subject to an increasing component of regional climate change. Most of the increase of CO2 in
9.2.7 E-commerce
E-commerce has changed the way companies do business today. It is not something companies are merely considering, but something they must do if they want to compete in the global environment and sustain or gain a competitive advantage. Today e-commerce links nations and organizations both locally and globally. It is all about speed, connectivity, and the sharing and exchanging of goods, services and information. New technology development has accelerated in recent years and has changed our lives. Innovations and new products have been developed at great speed. One of these innovations is the Internet. The growth of the Internet and its increased use are forcing companies to evaluate their current distribution channels and redesign their strategies, since they are now able to target customers differently.
We could describe e-commerce as computer-to-computer, individual-to-computer, or computer-to-individual business relationships enabling an exchange of information or value (Rao 2000). A form of e-commerce has existed between a significant number of large companies for about two decades in the form of Electronic Data Interchange (EDI). Ninety percent of e-commerce is EDI, which is unlikely to vanish (Nemzow 2002). Trading over the Internet, mainly in the USA and at an increasing rate in Europe, is accelerating the pace of change and for the first time providing the conditions for seriously free markets. The Internet provides affordable, accessible technology to bring together buyers and sellers, large and small. People from any location on the planet can enter competitive markets (Rao 2000). A World Trade Organization (WTO) study on e-commerce emphasizes the growth of opportunities that e-commerce offers, including for developing countries; it predicts that more than 300 million users will be transacting over the Net and estimates e-commerce worth $300 billion (Rao 2000).
9.2.8 Planning for the Future: Competing in the Contemporary Markets
Forecasts and predictions do not always materialize. It is important to keep in mind that forecasts are just forecasts, suggesting the likelihood of something happening unless measures are taken to avoid it. For climate, the predictions for the twenty-first century show warming in the Arctic, an increase in precipitation extremes, a decrease in snow and ice cover, and a rise in sea level (IPCC). The magnitude of future climate change has to be limited, and limiting climate change is a global issue. A fundamental strategy is to reduce GHGs globally. For energy, on the other hand, it is impossible to make accurate predictions for the future. Nonetheless, there are pressures to develop new trajectories to reduce carbon emissions, such as a hydrogen energy economy, carbon trading for vehicles, or other technological developments like biofuels. But people's lifestyles and the way energy is used are the major issues. Despite the uncertainty of the future for technological advancements or climate change, only one thing is certain: births and deaths. Economic and social policies should be implemented based on demographic trends, and fertility rates and employment should be increased. Finally, there is no doubt that the global urban population will increase in the future. What needs to be done for urban settlements to become more sustainable and to reduce poverty is for the world's population to be concentrated on less than 3% of the land area.
Structure:
1. Number of firms
2. Barriers to entry
3. Nature of products
4. Knowledge of products
Conduct:
Performance:
1. Profitability
2. Efficiency
3. Equity
4. Innovation
5. Consumer choice
3. It does not allow for risk and does not take account of alliances and
networks in industries.
4. It is qualitative and hence does not give accurate measurement.
1. Threat of entry
It is clear that each supplier offers a similar but not identical product. Each supplier therefore does not face a perfectly elastic demand curve, as it would under perfect competition.
9.5.2 Efficiency of Competition
Consumers are efficient at all points on the demand curve D, whereas producers are efficient at all points on the supply curve S. Moreover, the demand curve is the marginal benefit curve MB, while the supply curve is also the marginal cost curve MC. It is important to underline that resources are used efficiently at the point A, where marginal benefit equals marginal cost and the sum of producer surplus and consumer surplus is maximized.
Therefore, the efficient use of resources requires that prices (P) equal marginal costs (MC). When P = MC, the price of each product reflects its cost of production, which means that resources are allocated according to people's demand while also reflecting the true costs of producing them.
In a competitive environment, efficiency is promoted and consumers are charged lower prices. Perfectly competitive firms have incentives to use the best available technology. With full knowledge of existing technologies, firms will choose the technology that produces the output they want at the least cost, and each firm uses inputs such that the marginal value of each input is just equal to its market price.
Hence, as the price-marginal cost equality (P = MC) condition defines the efficiency of resource allocation, it is obvious that only the existence of competitive markets will allow the promotion of perfect resource allocation. Competitive markets can ensure that in the long-run equilibrium condition (production level X0) P = MC. Katz and Rosen reveal that 'a Pareto-efficient allocation of resources requires that prices be in the same ratios as marginal costs, and competition guarantees this condition will be met' (1998: 391).
Furthermore, competitive markets lead to efficiency because when consumers decide the quantities they will buy of each product, they will equate the marginal utility (MU) they get from consuming an extra unit of the product with the marginal cost (MC) of buying that extra unit, which is the price they pay. So, in perfect competition, as P = MC and P = MU, it follows that MC = MU as well. Conversely, as the optimal allocation of resources is ensured by the maintenance of competitive markets, it is obvious that the absence of competitive conditions will lead to a suboptimal allocation (misallocation) of resources. According to Katz and Rosen (1998: 398), an economy with freely operating markets may fail to generate an efficient allocation of resources for two general reasons: the existence of market power among businesses (monopolistic/oligopolistic power) and the nonexistence of markets at all. In the first case, the existence of market power allows firms to set prices higher than the competitive ones while supplying less output than competitive markets would. As prices cannot reflect the real costs of production (P ≠ MC), this violates the fundamental condition of optimal resource allocation and, furthermore, all the other conditions of efficiency.
For example, in a monopoly there is a misallocation of resources: at the production level where the monopoly maximizes its profits (X0), the equilibrium price (P0) is much higher than the marginal cost (MC). As Katz and Rosen argue, 'if all of the other goods in the economy are sold in perfectly competitive markets at prices equal to their marginal costs, then the monopolist violates the condition for allocation efficiency because it sets the price of its product greater than its marginal costs' (1998: 430). Another example of resource misallocation is given by a monopolistically competitive market. There, although in long-run equilibrium the firm has zero profits (point B), this is not a sign of economic efficiency. Again the price is higher than the marginal cost (P0 > MC0), indicating the misallocation of resources. 'As long as P > MC we know that there is someone who is willing to pay more for an extra unit of output than it costs to produce that extra unit' (Varian, 1996: 413). Furthermore, beyond non-competitive markets, the worst problem is the nonexistence of markets, that is, the possibility that there is no supplier at all for a demanded good or service.
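The misallocation argument above can be made concrete with a linear example. Assume, purely for illustration, demand P = a − b·Q and a constant marginal cost c: perfect competition prices at MC, while a monopolist equates marginal revenue with MC, restricting output, raising price above MC, and creating a deadweight loss.

```python
# Linear-demand sketch of the P = MC argument: demand P = a - b*Q, constant
# marginal cost c. All parameter values below are hypothetical.

def competitive_outcome(a, b, c):
    """Perfect competition: price = marginal cost, so a - b*Q = c."""
    quantity = (a - c) / b
    return c, quantity

def monopoly_outcome(a, b, c):
    """Monopoly: marginal revenue a - 2*b*Q = marginal cost c."""
    quantity = (a - c) / (2 * b)
    price = a - b * quantity
    return price, quantity

def deadweight_loss(a, b, c):
    """Surplus lost on the output the monopolist withholds (triangle area)."""
    _, q_comp = competitive_outcome(a, b, c)
    p_mono, q_mono = monopoly_outcome(a, b, c)
    return 0.5 * (p_mono - c) * (q_comp - q_mono)

a, b, c = 100, 1, 20
print(competitive_outcome(a, b, c))   # price equals MC; large output
print(monopoly_outcome(a, b, c))      # price above MC; output restricted
print(deadweight_loss(a, b, c))       # the resulting lost surplus
```

With these hypothetical numbers the monopolist halves output and prices well above marginal cost, and the P > MC wedge is exactly the violation of the allocation condition discussed above.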
The appearance of market power or the absence of market mechanisms will result in the phenomenon of market failure. Market failure occurs when resources are misallocated or allocated inefficiently, and the result is waste or lost value. Evidence of market failure is revealed by the existence of imperfect market structures, external costs and benefits (externalities), and imperfect (or asymmetric) information. The appearance of market failure violates the first welfare theorem; nonetheless, market failure is a very common problem in real economies.
Even the provision of public goods can damage competition, as providing these goods to some people who need them cannot prevent others from consuming them. As Katz and Rosen state, 'that the market-generated allocation is imperfect does not mean that the government can do better' (1998: 399), as, for example, the cost of setting up an agency to deal with externalities can be higher than the cost of the externality itself.
In conclusion, only the promotion of competitive markets can ensure the efficient allocation of resources. In the case of market failure, the government has to develop policies and mechanisms that will diminish the negative effects of this failure. However, the government's intervention in the allocation process must be restricted; otherwise, it will create more problems than it may solve.
9.7 Conclusion
The next decades will bring fundamental changes for businesses around the globe, and the actions businesses take will begin to have a significant impact on the shape of each economy. In the short-term future, businesses will
typically be involved in a range of collaborations, partnerships and joint
ventures, supporting investment finance, R&D and innovation, train-
ing and new organizational structures. There will be much more rigor in
identifying investment and innovation projects for funding and businesses
will have outsourced the next level of activities, including many special-
ist tasks. The workforce will be more diverse, highly flexible and mobile,
making the most of new ways of working and using more business-relevant
professional skills. This will leave organizations focused on a smaller core
of people and projects, supported by a much wider range of individuals
and businesses around the periphery. Building and maintaining trust with
business partners and the public will become critical to the smooth opera-
tion of these structures, and compliance with governance and sustainabil-
ity standards will be a major objective.
Effective management and the wise use and allocation of available resources will be the key determinants of survival and success. Globalization has resulted in increased competition and expanded international operations for many US and European companies, and these effects will likely continue in the new era. In fact, the rapidly accelerating development of Brazil, Russia, India and China (BRIC), with Africa next on the horizon, will likely drive global economic growth, as well as many companies' financial prospects, in the new business environment. Accordingly, boards will need to continue to address these factors, especially adapting corporate governance practices to take into account the ethical business practices of, and companies' relationships with, foreign governments. Government intervention and market regulation/deregulation in different market structures will be of great significance in the process of change and the transition to the new era. Factors such as human resources, culture change, creativity and innovation are vital to a business; if they are not properly delivered, the company could falter. Strategic intervention techniques would assist the company to carry on in certain situations, survive and continue to prosper. Resistance is to be expected in any change process. Transition is a painful period for companies and economies in general. The crucial aspect will be the identification of the actual sources of resistance and the creation of an effective action plan based on communication, flexibility and appropriate leadership styles and managerial techniques.
9.9.1 Article 1
For the purposes of this Protocol, the definitions contained in Article 1 of
the Convention shall apply. In addition:
9.9.2 Article 28
The original of this Protocol, of which the Arabic, Chinese, English,
French, Russian and Spanish texts are equally authentic, shall be deposited
with the Secretary-General of the United Nations.
DONE at Kyoto this eleventh day of December one thousand nine
hundred and ninety-seven.
IN WITNESS WHEREOF the undersigned, being duly authorized
to that effect, have affixed their signatures to this Protocol on the dates
indicated.
9.9.3 Annex A
9.9.4 Annex B
9.9.4.1 Party: Quantified Emission Limitation or Reduction Commitment (Percentage of base year or period)
Australia 108, Austria 92, Belgium 92, Bulgaria* 92, Canada 94,
Croatia* 95, Czech Republic* 92, Denmark 92, Estonia* 92, European
Community 92, Finland 92, France 92, Germany 92, Greece 92, Hungary*
94, Iceland 110, Ireland 92, Italy 92, Japan 94, Latvia* 92, Liechtenstein
92, Lithuania* 92, Luxembourg 92, Monaco 92, Netherlands 92, New
Zealand 100, Norway 101, Poland* 94, Portugal 92, Romania* 92,
Russian Federation* 100, Slovakia* 92, Slovenia* 92, Spain 92, Sweden
92, Switzerland 92, Ukraine* 100, United Kingdom of Great Britain and
Northern Ireland 92, United States of America 93
* Countries that are undergoing the process of transition to a market
economy.
Source: http://www.kyotoprotocol.com/resource/kpeng.pdf
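The Annex B percentages translate into assigned amounts by simple arithmetic: a party may emit its base-year emissions multiplied by the listed percentage. A minimal sketch in Python, using a few of the commitments listed above (the base-year figure of 1,000 units is an illustrative placeholder, not real inventory data):

```python
# Annex B commitments: emissions allowed as a percentage of base-year
# emissions (figures taken from the list above).
commitments = {
    "Iceland": 110,
    "Australia": 108,
    "Russian Federation": 100,
    "United States of America": 93,
    "Germany": 92,
}

def assigned_amount(base_year_emissions: float, percentage: int) -> float:
    """Emissions a party may emit, as a share of its base-year level."""
    return base_year_emissions * percentage / 100

# Illustrative only: 1,000 units of base-year emissions.
for party, pct in commitments.items():
    print(f"{party}: {assigned_amount(1000, pct):.0f} units ({pct}% of base year)")
```

Parties marked with an asterisk in the source (economies in transition) could also choose a different base year, which this sketch ignores.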
CHANGE MANAGEMENT: PLANNING FORTHEFUTURE... 253
Human activities like deforestation and the burning of fossil fuels such as coal, oil, or gas are changing our climate in ways that pose increasing threats to human well-being in both developing and industrialized nations. We are already experiencing the harmful effects of the climate crisis, and we know that more severe damage lies ahead unless we act quickly. The good news is that we can still avoid the most severe impacts of global warming by reducing our emissions of heat-trapping gases and by halting and reversing deforestation. However, as we work to reduce these emissions of global-warming pollution, by investing in renewable energy and by protecting our forests and soils, we must also begin to prepare for the changes already coming, by working to better understand the risks and by integrating these needs into our development planning. Since the beginning of the industrial revolution, when we began to put heat-trapping gases into the atmosphere in large volumes, the global average temperature has risen almost one degree Celsius, and another eight-tenths of a degree is already in store for us because it is stored in the ocean and will be released into the atmosphere. If we do not dramatically reduce our emissions, the global average temperature is expected to rise by as much as four or more degrees Celsius by the end of this century, which would cause severe damage to natural systems and to human health and well-being. Sustained warming of this magnitude could make hundreds of millions of people climate refugees because of coastal flooding, and put as many as a billion or more people at risk of increased water stress. Sustained warming of this magnitude could also cause large-scale irreversible changes, including the extinction of up to 20-30% of the world's plant and animal species. Some of the regions most at risk of species extinction are areas expected to have the most species turnover due to the changing climate. In addition, the destabilization and extensive melting of the Greenland and West Antarctic ice sheets is a warning sign: the number of days of Greenland ice-sheet melting has increased dramatically since 1979, and the disappearance of one of Antarctica's ice shelves, with a dozen others at risk, warns us that the melting of large areas of West Antarctica and Greenland could cause sea level to rise between 4 and 12 meters, with each meter causing roughly another one hundred million refugees.
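The closing arithmetic, 4 to 12 meters of sea-level rise at roughly one hundred million refugees per meter, can be checked directly. A sketch using only the figures quoted in the text:

```python
# Rough figure cited in the text: ~100 million people displaced per meter
# of sea-level rise.
REFUGEES_PER_METER = 100_000_000

def displaced(rise_m: float) -> int:
    """Rough count of people displaced by a given sea-level rise in meters."""
    return int(rise_m * REFUGEES_PER_METER)

print(f"{displaced(4):,} to {displaced(12):,} people")
# 400,000,000 to 1,200,000,000 people
```

This is a back-of-the-envelope check of the passage's own numbers, not a model of coastal displacement.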
CHAPTER 10
Petr Svoboda and Jan Cerny
10.1 Introduction
Sustainable development assumes a long-term vision of consistency in pursuing goals by implementing measures that change the patterns of economic, social, and environmental interaction. Key prerequisites for sustainability are highly qualified professionals and more comprehensive and operational information support. Securing these prerequisites requires unprecedented financial resources invested in education and research, a sector that has little connection with revenue-producing activities. Aware of this situation, the countries of the European Union have developed specific operational programs that aim at overcoming this separation gap.
Generally, sustainability rests on three pillars: environmental, social, and economic. For aspiring entrepreneurs, the third is the most important, because it creates the resources for the other two.
ing. Such input should encourage changes in behavior that will create a more sustainable future concerning society, economic viability, and environmental integrity for present and future generations. The philosophy of sustainable development has evolved to include more than just recycling and constructing buildings with solar panels, recognizing that human behavior can be altered to limit harmful effects on the environment; it now also covers how individuals and communities behave and interact with the Earth.
The Decade of Education for Sustainable Development includes all
levels of formal and informal education, but formal higher education is
considered fundamental to the strategy for achieving sustainability as it
influences graduates who go on to become leaders in their organizations
and countries.
Two unique opportunities for HEIs to engage in sustainable development were identified by UNESCO (2004). The first is that universities form a link between knowledge generation and the transfer of knowledge to society, preparing graduates for their entry into the labor market. This includes the education of teachers, who play the most important role in providing education at all levels. The second is that universities actively contribute to development through outreach and service to society. Cortese (2003)
underlined this notion by stating that universities bear a moral responsibility to increase the awareness, knowledge, skills, and values needed to create a just and sustainable future. Cortese also stated that higher education plays a critical, though often overlooked, role in making this vision a reality, and that it prepares most of the professionals who teach, manage, lead, work in, and influence society's institutions. Thus, universities have a tangible and critical role in developing the principles and qualities needed to improve the awareness and delivery of the sustainable development philosophy.
said about the other two pillars. This can be illustrated by the example of
Czech universities.
10.2.3 Economic Situation of Czech Universities
Almost every Czech university consists of highly autonomous parts. Such a part is called a fakulta in Czech, with a meaning similar to the English words "college" or "department". For example, the famous Charles University in Prague includes the Faculty of Law, the Faculty of Mathematics and Physics, the Faculty of Philosophy, and 24 other schools. The economic situation of the various faculties of the same university may differ considerably, because the income of each faculty is composed of three parts:
(1) Sometimes faculties do not meet their target numbers of students, and capitation income is therefore lower.
(2) The teachers of one faculty may have much less significant scientific results than the teachers of another.
(3) Additional income can also differ greatly.
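The three components can be expressed as a simple income model; the capitation part explains point (1), since students beyond the target are not funded. All names, rates, and figures below are illustrative assumptions, not actual Czech funding rules:

```python
def faculty_income(enrolled: int, target: int, capitation_rate: float,
                   research_income: float, additional_income: float) -> float:
    """Total faculty income: capitation is paid per student only up to the target."""
    funded_students = min(enrolled, target)
    return funded_students * capitation_rate + research_income + additional_income

# A faculty missing its enrollment target earns less capitation income:
full = faculty_income(enrolled=1000, target=1000, capitation_rate=50.0,
                      research_income=20_000, additional_income=5_000)
short = faculty_income(enrolled=850, target=1000, capitation_rate=50.0,
                       research_income=20_000, additional_income=5_000)
print(full, short)  # 75000.0 67500.0
```

The same structure makes points (2) and (3) visible: two faculties with identical enrollment can still differ widely in total income through the research and additional terms.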
R1: Applicants for study are recruited from ever-smaller birth cohorts. Since primary and secondary schooling in the Czech Republic takes 13 years (5 + 8) and starts at age 6, most of the applicants in the year 2013 were born in 1994. The numbers of live births in 1985-1996 were, successively, 135,881; 133,356; 130,921; 132,667; 128,356; 130,564; 129,354; 121,705; 121,025; 106,579; 96,097; and 90,446. Over the past 10 years, the pool of potential students thus shrank from 135,881 to 106,579, that is, by 22%, and this development will continue for at least the next 2 years. Teaching capacities, on the other hand, have remained constant or even grown slightly.
R2: The decision-making of applicants is a complex process that is difficult to describe exactly. Applicants take into account fuzzy information such as the image of the faculty and the picture of the faculty drawn by its current students, mainly concerning the quality of education, facilities, and information and communication services.
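R1's 22% figure follows directly from the birth series quoted above; a quick check:

```python
# Live births in the Czech Republic, 1985-1996, as listed in R1.
births = [135_881, 133_356, 130_921, 132_667, 128_356, 130_564,
          129_354, 121_705, 121_025, 106_579, 96_097, 90_446]

# Decline from the 1985 cohort to the 1994 cohort (the 2013 applicants):
decline = (births[0] - births[9]) / births[0]
print(f"Cohort decline: {decline:.0%}")  # Cohort decline: 22%
```

The tail of the series (96,097 and 90,446 births) shows why the decline was expected to continue for at least two more admission years.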
The first reason cannot be influenced by HEIs; the authors therefore focus on the second. Most deans try to adopt traditional marketing measures such as advertising, open houses, and so on. The Faculty of Management of the University of Economics is trying to extend this spectrum in the way thoroughly described below.
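One simple way to formalize the fuzzy decision process of R2 is a weighted score over the criteria the text lists; the weights and ratings below are invented purely for illustration:

```python
# Criteria from R2 with hypothetical importance weights (sum to 1.0).
criteria_weights = {"image": 0.3, "student_opinion": 0.3,
                    "education_quality": 0.2, "facilities": 0.1,
                    "ict_services": 0.1}

def faculty_score(ratings: dict) -> float:
    """Weighted sum of an applicant's 0-10 ratings of a faculty."""
    return sum(criteria_weights[c] * ratings[c] for c in criteria_weights)

ratings = {"image": 7, "student_opinion": 8, "education_quality": 6,
           "facilities": 5, "ict_services": 9}
print(round(faculty_score(ratings), 2))
```

A faculty's marketing measures can then be read as attempts to raise the ratings on the most heavily weighted criteria; real applicant behavior is of course fuzzier than any fixed weighting.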
Gaining the Support from the European Social Fund
The second point of the Faculty of Management's approach to overcoming the problems outlined in R2 was gaining support from the ESF for the project.
262 P. SVOBODA AND J. CERNY
10.4.1 Basic Approaches
European society must adapt to the dynamically evolving conditions of the global economy and to the related demands on the flexibility, knowledge, and skills of each individual. A growing number of people will be forced to engage in lifelong learning due to global competition and frequent changes in the labor market. The urgency of these challenges is intensified by the fact that the population of many European countries is in demographic decline, and the labor market needs to replace the people who are retiring. A lack of readiness or an inability to respond effectively to these conditions may cause serious problems for the competitiveness of European countries.
The European Union is aware of the importance of supporting the
development of human potential as one of the fundamental factors for
sustainable economic growth of the knowledge economies of developed
countries. In this context, the educational system has to be perceived as
one of the main pillars of future economic success and social cohesion.
Taking into account the high degree of complexity of education issues, the Czech Republic has prepared a multi-objective operational program, Education for Competitiveness.
groups of students and academic staff. It also sets the fundamental axis for long-term sustainability. The total amount allocated to this project was approximately EUR 300,000 (Trojan 2011).
10.5.4 Experience of the University of Economics in Prague
The main idea of the project was the development and innovation of educational methods and curricula. The project, however, has also had a significant positive impact on other aspects of education. One of its positive effects was that it provided an opportunity, or even created the need, to implement innovations within individual subjects and the related study aids. Thus, outdated aspects of particular subjects could be eliminated, the content of subjects adjusted, and support materials extended for more effective teaching. All this, however, stands beside the main idea of the project already outlined.
EU OPERATIONAL PROGRAM EDUCATION FORCOMPETITIVENESS ANDITS... 267
The difficulties of the innovation lie mainly in the fact that the new modular subjects carry a large number of credits, which corresponds to a larger allocation of teaching hours. This brings a number of complications. For example, there are higher demands on cooperation among teachers than they were used to, since several teachers alternate for each group of students, which places greater demands on coordination. Another difficulty is the more complicated final evaluation of students.
Some of the ideas and objectives of the project reflect the desired modern trends in higher education related to the Bologna Declaration and European Union frameworks. However, although the idea is good, its implementation is currently precarious and not fully thought out in all aspects. Simply said, it cannot yet be stated that everything works well or that the implemented innovation is a positive step in every respect. To sum up, the realized project brought, in addition to a number of benefits, some complications, and it will take some time before things settle down.
Students used these aids extensively and achieved good study results.
More than 80% passed an exam in mathematics at the first attempt, which
had never happened before.
10.6 Conclusion
The goal of the chapter consists of three main points: (1) to present the relation between sustainability and the excellence of higher education institutions (HEIs), (2) to show their impact on the sustainability of enterprise excellence, and (3) to outline the role of the EU Operational Program Education for Competitiveness in the Czech Republic concerning (1) and (2). This goal was fulfilled: after the introductory Chap. 1, Chaps. 2, 3 and 5 presented how the sustainable development of an HEI influences its educational excellence and what measures can be used to enhance it, with the Czech experience described in detail. In Chap. 4, the EU Operational Program Education for Competitiveness was introduced and the role of education in improving excellence and competitiveness was discussed.
The chapter maps the possibilities of using the Structural Funds of the European Union for the innovation of educational processes at HEIs. It shows that the ESF plays the key role in financing tertiary education, with billions allocated to innovative programs such as the OPEC described above. The OPEC, whose second priority axis focuses on tertiary education, aims to improve the quality and further diversification of universities with an emphasis on the requirements of the knowledge economy. This leads to greater flexibility and creativity of graduates employable in the labor market.
The article further presents practical examples of the innovation process and documents comprehensive approaches that allow efficient utilization of resources throughout the innovation cycle while meeting the current objectives and priorities of the European Union. A detailed presentation of the original solution to the problem of improving the quality of teaching at one of the leading Czech universities illustrates these facts in the last chapter.
References
Aly, Nael, and Joseph Akpovi. 2001. Total quality management in California public higher education. Quality Assurance in Education 9(3): 127-131.
Brundtland Commission. 1987. Our Common Future. Oxford: Oxford University Press.
Cortese, Anthony D. 2003. The critical role of higher education in creating a sustainable future. Planning for Higher Education 31(3): 15-22.
FM VSE Innovation. 2009. http://projekty.fm.vse.cz/projekt-isovp/ (in Czech).
Kanji, Gopal K., and Abdul Malek Bin A. Tambi. 1999. Total quality management in UK higher education institutions. Total Quality Management 10(1): 129-153.
Kotler, Philip, and Karen F.A. Fox. 1985. Strategic Marketing for Educational Institutions. USA: Prentice-Hall.
Kubiszewski, Ida, and Cutler J. Cleveland. 2012. United Nations Conference on Environment and Development (UNCED), Rio de Janeiro, Brazil. The Encyclopedia of Earth. http://www.eoearth.org/view/article/156773/, 20 November 2014.
Kurikova, Veronika, and Monika Ulrichova. 2012. Innovation of the Study Program Transcultural Communication Taught in English (in Czech). Czech Ministry of Education and Sport.
Svoboda, Petr, and Jan Cerny. 2013a. Customer satisfaction and loyalty in higher education: A case study over a five-year academic experience. In KDIR/KMIS, 431-436.
Svoboda, Petr, and Jan Cerny. 2013b. Quality of higher education institutions as a factor of students' decision-making process. In International Conference on Intellectual Capital and Knowledge Management and Organizational Learning, 622. UK: Academic Conferences International Limited.
Svoboda, Petr, Jan Voracek, and Michal Novak. 2012. Online marketing in higher education. In Knowledge Management, 1145-1152. Cartagena: Universidad Politécnica.
Trojan, Jan. 2011. Innovation of the study program of the college of business and hotel management under the European Social Fund. In Proceedings of the 4th International Conference on Teaching of Tourism, Hotel and Restaurant Services. Brno: VSOH.
UNESCO. 2004. Higher education for sustainable development. In Education for Sustainable Development Information Brief. Paris: UNESCO.
UNESCO. 2005. United Nations Decade of Education for Sustainable Development 2005-2014: Draft International Implementation Scheme. Paris: UNESCO.
CHAPTER 11
Stavros Sindakis
S. Sindakis (*)
American University in Dubai, School of Business, Dubai, UAE
e-mail: ssindakis@aud.edu
data editing options like PATAmPOWER in the travel and tourism sector, as referred to in Chap. 8. Finally, it is worth noting that the European Union has focused its education policies on innovation and on comprehension of the competitive business environment. The EU Operational Program Education for Competitiveness is a testament to the investment strategy followed. The knowledge economy and flexibility establish a new agenda, adapted to current data.