
MANAGEMENT OF CHANGE TO ENSURE IS SUCCESS:

A LONGITUDINAL STUDY
A Dissertation
Presented to the Faculty of the College of Business
of Trident University International
in Partial Fulfillment of the Requirements for the Degree of
Doctor of Philosophy in Business Administration
by

Pauline Ash Ray


Cypress, California 90630
2011
Defended September 2, 2011

Approved by:
Office of Academic Affairs
Date: October 10, 2011
Dean: Dr. Scott Amundsen
Director, PhD Program: Dr. Joshua Shackman
Committee Chair: Dr. Wenli Wang
Committee Member: Dr. Jerry Cha-Jan Chang
Committee Member: Dr. Roger McHaney

© 2011 Pauline Ash Ray

BIOGRAPHICAL SKETCH

Pauline Ash Ray is an assistant professor of business at Thomas University, Thomasville,
Georgia. Prior to her doctoral studies at Trident University, she earned her B.S. in Chemical Engineering
from Mississippi State University. During her career in industry, she earned her M.S. in Business and her
B.S. in Accounting at Mississippi University for Women. She entered an academic career in 2003,
teaching as an adjunct at Southwest Georgia Technical College and Thomas University. She earned her
Master's Certificate in Accounting from Brenau University. In further study she completed 18
graduate hours in Management and, at TUI, in Information Systems. Pauline served as the Blackboard
Coordinator for four years. She taught as an adjunct for Brenau University and Trident University in
finance and accounting.


I dedicate this work to the memory of my loving and supportive husband Albert Ash, who
believed in and encouraged me in this pursuit. I would also like to thank my mother Zella Mathews
for her wonderful work ethic along with my family who has given moral support and
encouragement throughout the program. And I thank the new love of my life and husband
Richard Ray, who has given support and encouragement to complete the task so long in the
making. To God be the glory for His wondrous grace and sustaining guidance.


ACKNOWLEDGMENTS

Foremost I want to thank my dissertation committee members Dr. Jerry Cha-Jan Chang,
Dr. Roger McHaney, and particularly my mentor and chair Dr. Wenli Wang, for their continuous
encouragement and efforts on my behalf. Throughout the years of this endeavor, I have learned
to appreciate patience, perseverance, and new friendships thanks to Dr. Wang's genuine
mentorship. I also want to thank the Director of the PhD program, Dr. Joshua Shackman, for his
expertise and suggestions for improving this dissertation.
There are many supporters from both universities who helped me throughout my studies
and I want to thank them for their work, especially: Jenny Swearingen, Theresa Reese, Carolyn
Treadon, Robin Ouzts, Gary Bonvillian, Ann Landis, Denae Johnson, Crissie Grove and all of
those who made this possible by participating in my research study. In addition, I want to thank
Dr. Geoffrey Hubona for his assistance and teaching in SmartPLS.

MANAGEMENT OF CHANGE TO ENSURE IS SUCCESS:


A LONGITUDINAL STUDY
TABLE OF CONTENTS

List of Tables
List of Figures
Abstract

Chapter I: INTRODUCTION
Section 1.1 Problem Statement and Gap of Knowledge
Section 1.2 Research Questions
Section 1.3 Significance of Study

Chapter II: LITERATURE REVIEW AND THEORETICAL FRAMEWORK
Section 2.1 Literature Review
Section 2.1.1 Users' Perception of Management of Change Effectiveness
Section 2.1.2 Readiness for Change and Resistance to Change
Section 2.1.3 End-user Computing Satisfaction
Section 2.2 Theoretical Development
Section 2.3 Hypotheses

Chapter III: METHODOLOGY
Section 3.1 Research Design
Section 3.2 Data Collection
Section 3.2.1 Human Subject Concerns
Section 3.2.2 Population and Sample
Section 3.3 Measurement Development
Section 3.4 Method of Analysis

Chapter IV: Data Analysis and Research Findings
Section 4.1 Measurement Validation
Section 4.2 Data Analysis and Results
Section 4.3 Hypotheses Testing

Chapter V: Implications and Conclusions
Section 5.1 Discussion
Section 5.2 Implications for Research
Section 5.3 Implications for Practice
Section 5.4 Limitations and Future Research
Section 5.5 Conclusion

References

APPENDICES
Appendix A Detailed Instrument Items
Appendix B Descriptive Statistics by Item
Appendix C Cross Loadings
Appendix D Survey Invitation Emails
Appendix E Timeline-Qualitative Data
Appendix F Interviews

LIST OF TABLES

Table 1. Timeline on Implementation
Table 2. Sample Sizes
Table 3. Descriptive Statistics (n=145)
Table 4. Assessment of the Measurement Model
Table 5. Discriminant Validity (Inter-correlations) of Latent Variable Constructs
Table 6. Sample Processing
Table 7. Descriptive Statistics (n=56)
Table 8. Question Order
Table 9. Combined PLS Sample Results
Table 10. Comparison of PLS Sample Results
Table 11. Best Item Scores on EUCS/RES/MOC/REA
Table 12. Lowest Item Scores on MOC/REA
Table 13. Comment Summary
Table 14. Descriptive Statistics by Item
Table 15. Cross-Loadings

LIST OF FIGURES

Figure 1. Management of Change Research Model
Figure 2. Model for Testing Longitudinal Effects
Figure 3. Results at Time 1 (n=145)
Figure 4. Results at Time 2 (n=145)
Figure 5. Results at Time 3 (n=145)
Figure 6. PLS Results of Full Model Testing for n=145
Figure 7. PLS Results of Full Model Testing for n=56 (Matched Respondent Cases)
Figure 8. Modified Management of Change Research Model

MANAGEMENT OF CHANGE TO ENSURE IS SUCCESS:


A LONGITUDINAL STUDY
Pauline Ash Ray

Trident University International, 2011

This dissertation aims to understand the effect of management of change on the success
of information system (IS) implementation. A research model is developed drawing on change
management research. Data collected from a longitudinal field survey before, during, and after
an enterprise-wide IS implementation are analyzed to test the proposed hypotheses. The results
indicate that management of change can be used to increase readiness for change and end-user
computing satisfaction during and after the implementation. Readiness for change positively
affects satisfaction during an implementation but not after. Contrary to the literature, no
significant relationship between resistance to change and satisfaction exists. The paper
contributes to IS research and practice by drawing attention to the importance of management of
change and readiness for change for IS success.

CHAPTER I
Introduction
Enterprise-wide information systems support and integrate multiple
organizational business functional areas. They achieve greater efficiency in the transfer
and use of information, preventing the entry of redundant data and duplication of effort.
This type of technology enhances business performance in support of the organization's
business strategy by improving the efficiency of information use and controlling its
access. Organizations have made significant investments in these systems. In order to
realize a return on investment, it is necessary to functionally integrate the technology
into workflow and job routines (Xue et al., 2009), support effective system use, and
satisfy users (Nelson, 2003). This study seeks to understand the relationships among
users' perceptions of management of change strategies, readiness for change, resistance
to change, and end-user computing satisfaction before, during, and after an enterprise-wide
information system (IS) implementation and how such an understanding facilitates
a successful system implementation.
Successful implementation practices cannot be overlooked when investing in a
new enterprise-wide information system. Kwahk and Lee (2008, p. 474) cite that new
enterprise resource planning (ERP) implementations have a 60-90% failure rate due to
resistance to change, while Vollman (1996) attributes the high failure rate to
management's failure to understand the skills necessary to manage the change. Self and
Schraeder (2009) agree that many contributing factors add to these poor results but
suggest a primary reason for the failure could be organizational managers' inability to
fully understand what is necessary to guide their organizations through a change
initiative. The system implementation must be managed both as a technological
innovation and an organizational change. Proper planning of an implementation process
can reduce the likelihood of failure and help prevent other undesirable consequences
such as reduced employee morale (Self and Schraeder, 2009).
A variety of change management strategies have been reported in the literature.
One of the key models put forward to help managers and leaders successfully
implement change is Lewin's three steps to change: unfreeze, moving, and
refreezing (Lewin, 1951). In the first stage, emphasis is placed on efforts to minimize
obstacles to change and maximize the change effort. The next stage seeks recognition
that the change is needed and acceptance of the proposed change. In the final stage, the
new system has to be reinforced and consolidated (Lewin, 1951). Refreezing implies
stasis with innovation ending with improved organizational routines. However, research
in IS suggests that while state of the art technology rapidly advances there is no end to
the implementation process (Bikson, 1987; Bikson et al., 1985). Another key model is
Kotter's "8 steps to transforming your organization," which emphasizes the need for
continual communication and having a shared vision (Kotter, 1995).

A critical user attitude for a successful implementation is "change orientation": the extent to which participants in an innovation process view the change as a positive,
problem-solving, and achievable goal that benefits the entire organization. This attitude
was a highly significant predictor of success in a cross-sectional study of organizations
introducing computerized information tools (Bikson et al., 1987). Hence, in addition to
communication and a shared vision, the organization's need for and ability to
implement the change must be imparted to users.
Bikson et al. (1987) suggest that important aspects of user attitudes toward the
new system include affective assessment (user satisfaction), cognitive assessment
(discrepancy between old and new system), and user resistance to the change. Davis et
al. (1989) addresses the ability to predict users' computer acceptance from a
measurement of their intentions, and the ability to explain their intentions in terms of
their attitudes, subjective norms, perceived usefulness, perceived ease of use, etc. Davis
et al. (1989, p. 982) stated that "These results suggest the possibility of useful models
of the determinants of user acceptance, with practical value for evaluating systems and
guiding managerial interventions aimed at reducing problems." Venkatesh and Davis
(2000) conducted a longitudinal study before, during, and after an implementation, a
design which creates the opportunity for managerial interventions informed by data analysis.
Taken holistically, these studies imply that for a successful implementation changes
must be managed to reduce resistance and increase readiness for the change in a
dynamic manner with interventions if necessary.

Complementary research on management of change implementation exists in the
Information Technology field for antecedents of IS acceptance (Capaldo and Rippa,
2009; Joshi, 1991; Shang and Su, 2004). Common threads exist in different areas of
2009, Joshi 1991, Shang and Su 2004). Common threads exist in different areas of
research that address overcoming resistance, creating readiness, and enhancing
acceptance. Organizational change and IS acceptance research builds upon
organizational behavior research citing many of the same studies (Bikson et al. 1987;
Davis et al., 1989; Judson 1991; Kotter, 1995; Lewin, 1951). In the IS acceptance area,
Capaldo and Rippa (2009) propose the evaluation of organizational capabilities when
selecting appropriate implementation strategies and change management interventions
during the implementation. Some of their example strategies include communication,
management support, modification, and training. Joshi (1991) posits that individuals
evaluate change for the expected outcome and then decide to either react favorably or
resist. A pre-implementation analysis of the potential impact of a new system for
identified user groups and an attempt to address their concerns in training and
communication programs as part of the implementation strategy in the change
management process is recommended (Joshi, 1991). Change strategies that can
overcome resistance and create readiness assist in successful implementation (Shang
and Su, 2004). Other research in the IS acceptance area has also been conducted on
how to prevent, reduce, or overcome resistance to change (Bhattacherjee and Hikmet,
2007; Hirschheim and Newman, 1988). Other research addresses how to prepare the
organization for change through strategies to increase readiness for change (Kwahk and
Kim, 2007; Kwahk and Lee, 2008). Change management strategies in these studies
include communication, training, management support, and technical resource
availability. The conclusion from this brief research review is that the precursors for IS
success must involve the users' attitudes toward the change management process as
well as toward the change itself.
Section 1.1 Problem Statement and Gap of Knowledge
Realizing the importance of this area, even after reviewing the prior research, it
is still not clear how change should be managed during the change process and how
change management strategies can enhance implementation success. There is a lack
of longitudinal studies in change management. It is also unclear whether reducing
resistance or creating readiness is more effective to ensure a successful
implementation. After much research in these areas, the question still persists whether
readiness and resistance are opposite ends of a continuum or separate states of attitude.
Although common threads exist in organizational behavior, change
management, and IS acceptance research (that addresses overcoming resistance,
creating readiness, and enhancing acceptance), no study combines these particular
constructs with users' perception of management of change effectiveness in a
comprehensive model to explain their relationships. Further, no clear indication exists
on whether it is more important to overcome resistance or build readiness for change.
Research has not determined how early in the change process management strategies
should be introduced or how effective they are throughout the implementation. It is
important then, and a goal of this study, to research a model representing the
relationships between the key constructs of readiness, resistance, users' perception of
management of change effectiveness, and end-user satisfaction; to explore the relative
importance of resistance and readiness to creating user satisfaction; and to develop an
instrument that gives an early indication of the management of change effectiveness as
it surfaces issues from feedback in the dynamic change process.
A search was conducted in organizational change, human behavior, IS
acceptance, and other literature for factors that influence successful IS implementations.
While studies that reduce resistance and build readiness to accept change provide a
basis for this study, no combined model that also includes users' perception of
management of change effectiveness and an appropriate acceptance measure in
mandatory situations was discovered. The literature search did not reveal any existing
study that addresses early detection of issues including readiness, resistance, and users'
perception of management of change effectiveness as antecedents of user satisfaction.
A plethora of research regarding readiness, resistance, and their relationships exists.
However, the literature is inconclusive about which one has more impact or if they
interact as opposite ends of a continuum on IS implementation success.
Change is a process (Orlikowski and Hofman, 1997). Hence, change
management and its impacts should be studied along with the change, preferably
before, during, and after a change. Although much research has been conducted on
management of change, readiness, resistance, end-user computing satisfaction, and their
respective relationships with one another, no research has looked closely inside the
change management process and explicitly examined the relationship among all of them
longitudinally. Venkatesh and Davis (2000) is the most relevant longitudinal study;
however, their study extending the Technology Acceptance Model (TAM2) mainly
captured snapshots of use characteristics at three time frames and did not introduce any
process measures for the change. They tested technology acceptance in both mandatory
and voluntary settings over a period spanning three months during which they measured
the effects of perceived usefulness and ease of use on usage intention and actual usage.
This research, however, studies only the mandatory use of technology, and
argues that end-user computing satisfaction is a better measure of true technology
acceptance in mandatory settings than use intention and actual use. In mandatory
settings, use intention can be influenced by compliance requirements (Xue et al., 2009)
and the actual use depends on the role, needs, and the proficiency of the user. Therefore,
user satisfaction with the system is a better indication of the system success than use
intention and actual use.
Venkatesh and Davis (2000) recommended further research to determine how
early in system development one can reliably measure key user reactions as indicators
of post-implementation success of the system. Venkatesh et al. (2003) studied
antecedents of acceptance of new systems as indicated by usage intention and usage
behavior. They recommended additional research to understand the drivers of
acceptance in order to proactively design interventions targeted at populations of users
that may be less inclined to adopt and use new systems.

This research investigates the causal relationships among users' perception of
management of change effectiveness (MOC), readiness to change (REA), resistance to
change (RST), and end-user computing satisfaction (EUCS) before, during and after an
IS implementation. Data are collected at the three points of an IS implementation: after
a decision is made about a new IS implementation but before its initiation, during the
implementation after the first major modules are implemented, and after the entire
implementation is complete with the new system in use for a while.
This study also represents an effort to understand the relative importance of
resistance and readiness in creating user satisfaction and whether these relationships change
over the course of the implementation. This research studies only the mandatory use of
technology and argues that EUCS is a better measure of true technology acceptance in
mandatory settings than the intention and actual use measures of IS acceptance studies
under voluntary IT settings. It also expands the tools available for management of
change during a new IS implementation, particularly those for early detection,
intervention, and the prediction of success. The example case is based on longitudinal
data and observations taken at three points in time as the Comprehensive Academic
Management System (CAMS Enterprise), an integrated web-based Academic
Enterprise Resource Planning System for higher education, is introduced to replace
several separate, un-integrated legacy systems.
Section 1.2 Research Questions
The following research questions are being investigated in this study:
1. How does end-user satisfaction with an existing system affect management of
change to a new IS implementation?
2. How does management of change in a new IS implementation affect readiness
for the new IS?
3. How does management of change in a new IS implementation affect resistance
to the new IS?
4. How does readiness for change affect the success of the new IS implementation
as evidenced by end-user computing satisfaction?
5. How does resistance to change affect the success of the new IS implementation
as evidenced by end-user computing satisfaction?
6. How does management of change in a new IS implementation affect the
success of the new IS implementation as evidenced by end-user computing
satisfaction?
Section 1.3 Significance of the Study
This study draws from the streams of literature from change management,
organizational behavior, and information technology acceptance, and intends to
contribute value to these areas. The findings of this study can help to understand how
management of change effectiveness can foster increased user satisfaction, an indicator
of IS implementation success. Training, communication, and management support are
some of the strategies used in management of change that are expected to change
resistance, readiness, and end-user satisfaction over the course of an IS implementation.
This study adds to the body of knowledge by introducing an explanatory model
of how management of change effectiveness during an IS implementation can promote
user satisfaction in mandatory IT settings. The research model includes both readiness
and resistance, exploring how they are affected by management of change strategies
and, in turn, how they may affect IS implementation success as indicated by end-user
computing satisfaction. For the theorists, this study contributes to the understanding of
the relationships between the readiness and resistance constructs longitudinally during
an implementation and the relative importance of them. No known prior study has
combined these constructs to evaluate their interactions, their relative importance to the
implementation success, or if any of these relationships change during an
implementation.
Managers may believe that they are being supportive in communication but are
unaware of the perceptions and attitudes of their employees at the operational level
(Bonvillian, 1997). Managers need tools to identify implementation issues early on and
to adapt the management of change strategies so as to better achieve a successful
implementation by reducing resistance and increasing readiness to accept system changes.

For the practitioners, this study contributes an early indicator to capture issues and to
provide feedback to enable management to adapt their strategies and direct their
resources during the implementation.
The rest of the paper is organized as follows. Chapter II introduces the literature
review, the research model, and the hypotheses. Chapter III describes the survey
research process and Chapter IV reports the results of the data analysis and research
findings. Chapter V discusses the implications for research and practice and concludes.


CHAPTER II
Literature Review and Theoretical Framework
This chapter presents the process model in Figure 1, then the literature overview.
The model was derived from the more specific literature review beginning with the
users' perception of management of change effectiveness (MOC) followed by resistance
to change (RST) and readiness for change (REA), and ending with end-user computing
satisfaction (EUCS).
Figure 1 Management of Change Research Model

[Figure 1 is a path diagram: End-User Satisfaction of Old System relates to Users' Perception of Management of Change Effectiveness (H1 [-]); Users' Perception of Management of Change Effectiveness relates to Readiness for Change (H2 [+]), to Resistance to Change (H3 [-]), and to End-User Satisfaction of New System (H6 [+]); Readiness for Change relates to End-User Satisfaction of New System (H4 [+]); Resistance to Change relates to End-User Satisfaction of New System (H5 [-]); an H7 [+] path also appears in the diagram.]

Figure 1 depicts the process model for a longitudinal study. The feedback from
data collected at a survey point serves as input to management to adapt management of
change strategies for greater resulting implementation success (whether they entail
behavior modification of the user, technical support, or modification of a technological
application, etc.). The longitudinal model portraying the three survey points for full
model testing is pictured in Figure 2 in section 2.2.
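
To make the feedback loop concrete, the following minimal sketch (in Python) shows how construct scores collected at one survey point could be summarized and flagged for management attention before the next wave. The item counts, item labels, and threshold are illustrative assumptions, not the study's actual instrument coding.

```python
import pandas as pd

# Illustrative item counts per construct; the actual instrument items
# appear in Appendix A, so these column labels are hypothetical stand-ins.
CONSTRUCTS = {"MOC": 12, "REA": 6, "RST": 6, "EUCS": 12}

def summarize_wave(responses: pd.DataFrame) -> pd.Series:
    """Average each construct's items (columns like 'MOC1'..'MOC12')
    into a single mean score for the survey wave."""
    scores = {}
    for construct, n_items in CONSTRUCTS.items():
        items = [f"{construct}{i}" for i in range(1, n_items + 1)]
        scores[construct] = responses[items].mean(axis=1).mean()
    return pd.Series(scores)

def feedback_report(wave_scores: pd.Series, threshold: float = 3.0) -> list[str]:
    """Flag constructs whose wave mean falls below the assumed scale
    midpoint, signaling where management might adapt its strategies."""
    return [f"Low {name}: {score:.2f}" for name, score in wave_scores.items()
            if score < threshold]
```

In the study itself this role is played by the collated survey results and qualitative comments forwarded to management after each survey point.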
Section 2.1 Literature Review
In general, management of change strategies that (1) enhance perceptions of ease
of use and perceived usefulness; (2) provide sufficient information to enable
comparison of the before and after processes; (3) introduce new interfaces; and (4)
create an empowering vision of the desired end will illuminate the need for change
(Davis et al., 1989; Kotter, 1995). Additionally, adjustment of cognitive assessment of
the change can be important and should include both clear descriptions of advantages
offered by the changes and the expected system improvements to be gained. Cognitive
adjustment prepares current system users by communicating the need for change prior
to change process initiation (Armenakis, 1993). These strategies must include
elements of communication, training, and management support.
Self-Determination Theory (SDT) is a relevant organizational behavior research
stream that addresses change in the workplace and its acceptance (Deci, 1972; Deci and
Ryan, 1985; Deci and Ryan, 1989). SDT research explores the consequences of work
climates that enhance intrinsic motivation as well as the integration of extrinsic
motivation, which contributes to important work outcomes (Gagne and Deci, 2005).
Baard, et al. (2004) focused on workplace factors that support autonomy and facilitate
internalization of extrinsic motivation. A workplace exhibiting such factors with
timely, effective communications and training can serve to internalize the external
motivation for IS implementation (Baard et al., 2004; Kirner, 2006). Understanding and
applying this theory of motivation helps the manager assess and use strategies to assist
in the implementation of change (Armenakis, 1993).
In other related organizational change research, Holt et al. (2003, p. 262) posits
that "the extent to which the organization achieves the benefits at the end of the process
is affected by the influence strategies used by organizational leaders to encourage
adoption and implementation of the change." Such change management helps achieve
the success of an IS implementation indicated by user satisfaction with the system, the
information generated by the system, and its ease of use (Venkatesh and Davis, 1996).
Examples of research include: (1) the organizational change area on how to prevent,
reduce, or overcome resistance to change (Armenakis, et al., 1993, 1999; Folger and
Skarlicki, 1999; Henry, 1994; Holt, Self, Thal, Lo, 2003; Judson, 1991; Self, 2007; Self
and Schraeder, 2009); and (2) motivation's place in preparing the organization for
change through the application and measurement of strategies to increase readiness for
change (REA) (Armenakis, 1993; Self, 2007; Self and Schraeder, 2009). These
studies, for the most part, explored how to create readiness or reduce resistance as the
dependent variable, but this study seeks to determine their relative effects on IS success.

To quantify and measure the effects of change management, it is necessary to
have an indicator of success. IS benefits are sometimes intangible and the literature
contains many examples of user satisfaction serving as a surrogate measure for IS
success (Ives, Olson, Baroudi, 1984; Baroudi and Orlikowski, 1988; Straub, 1989;
DeLone and McLean, 2003). Even in a mandatory system, the preparation of an
organization for change by enhancing readiness and reducing resistance is important to
achieve not only usage but also user satisfaction.
Orlikowski and Barley (2001) introduce the importance of organizational
behavior theories in information technology research. Further, Orlikowski and
Hofman (1997) stress that change is a dynamic process which cannot be predetermined
without adaptation during implementation. Yet, no model was given on
how to study such a dynamic change process. Hence, this study suggests that change
management should be studied along with the change, and proposes a process model
to enable the longitudinal study of a change process.
A process model should address the dynamic nature of a change and contain a
feedback loop which allows for adaptation. Feedback should be measured scientifically
so that meaningful inputs are injected into the adaptation in the change process.
Usefulness and ease of use are the two precursors for IS success in studies by
Venkatesh and Davis (2000). Nelson (2003) avers the advantages of using the End-User
Computing Satisfaction instrument (EUCS) to measure the success of an IS
implementation operationalized with subscales in content, accuracy, format, and
timeliness to measure usefulness of an IS and a subscale called "ease of use." The
model for this study is derived from and combines the theories in Orlikowski and
Hofman (1997) and Nelson (2003) using EUCS as the determinant of a successful IS
implementation during and after a change and measuring EUCS during an
implementation to provide feedback for adaptation during the change process. Such a
longitudinal process model (with measurements before, during and after
implementation) provides a solution to what Orlikowski and Hofman (1997) proposed.
This completes the process overview; the process framework model pictured
in Figure 1 is derived and expanded from the extensive literature review of the four
major constructs, which follows.
Section 2.1.1 Users' perception of management of change effectiveness
From the organizational behavior area, a person acts to achieve, or to avoid, a
desired or an undesired consequence (Baard et al., 2004). In order to manage change
effectively, information must be shared with employees, and their concerns must be
addressed as they surface (Parker, 2009). Management of change must motivate
employees by creating a work climate that satisfies basic psychological needs to
enhance intrinsic motivation. A mandatory system can apply introjection, which entails
taking in a value or regulatory process but not accepting it as one's own (Deci, et al.,
1994). Research findings imply that when people are coerced into doing something
without a clear rationale, they generally become less interested in the task and will
perform it only as long as some form of surveillance is in place. On the other hand,
16

when people are provided with reasons and choices for doing the task, they generally
become more interested in it and are more likely to continue engaging in it, even after
external demands are removed (Koestner, Ryan, Bernieri, and Holt, 1984). Thus,
management of change strategies can encourage integration, through which the
regulation is assimilated internally resulting in self-determination and intrinsic
motivation (Deci et al., 1994; Armenakis et al., 1993; Gagne et al., 2000; Gagne and
Deci, 2005; Self, 2007; Self and Schraeder, 2009).
According to Orlikowski and Hofman (1997), changes associated with technology
implementations are an ongoing process and cannot all be anticipated ahead of time.
Management of change strategies such as training that increase self-efficacy
and commitment to the change increase in importance as the amount of simultaneous
and overlapping change in the surroundings increase (Herold, Fedor, and Caldwell,
2007). Examples of related management of change strategies include communication to
share information with employees while addressing their concerns, and provision of
additional training when needed. Armenakis, et al., (1999) proposes that the
communication introducing the change should address key questions to set the stage for
the change and to create readiness in the change participants. Providing a meaningful
rationale for doing the task, acknowledging that people might not find the activity
interesting, and emphasizing choice rather than control are change management
strategies that promote internalization and satisfaction (Deci et al., 1994; Gagne and
Deci, 2005).
17

Top management support, business involvement, communication, and training
are important factors in managing these changes successfully in enterprise systems
(Shang and Su, 2004). Many researchers have been interested in how to promote user
satisfaction for successful implementations (Chau, 1996; Davis, 1989; Igbaria et al.,
1997). The level of satisfaction depends on the motivation and ability to change
(Judson, 1991; Kotter, 1995; Lewin, 1951). Empathy and concern, two elements of
communication, are also conducive to satisfaction with organizational change and apply to
management of change during IS implementations (Kirkpatrick, 1985; Gagne et al.,
2000; Gagne and Deci, 2005). Published research has studied these elements and their
influence on the users' resistance/readiness for change to the system (Herold, et al.,
2007). Users who did not perceive a positive outcome would not express acceptance
through satisfaction with the new system.
Objective measures for the number or extent of activities executed that
demonstrate management of change strategies are prohibitive to obtain. This research defines
users' perception of management of change effectiveness (MOC) as the user's
evaluative opinion of the dynamic use of those strategies and techniques practiced by
management to introduce and facilitate an organizational change. Our research explores
the users' perception of management of change effectiveness and whether the strategies
employed have persuaded users that the change is beneficial and that they should act to
achieve desired consequences. Specifically, in this research the goal is to enhance
readiness and overcome resistance, resulting in greater end-user computing satisfaction
during and post-implementation of an integrated information system (e.g., an academic
enterprise resource planning (ERP) system). Feedback from users is included in the model to
surface the concerns and allow adaptation of management of change strategies to
modify user behavior, strengthen needed support, or modify the IS technological
application if needed. This concept is discussed in research although not formally
modeled in the literature (Benn and Baker, 2009; Folger and Skarlicki, 1999; Orlikowski
and Hofman, 1997; Parker, 2009). This research intends to fill in the gap.
Section 2.1.2 Readiness for change and resistance to change
Readiness for change is related to one's attitude toward change, and the
respondent's belief of how others view their attitude toward change (Kwahk and Kim,
2007). This study adopts the definition that readiness "collectively reflects the extent
to which an individual or individuals are cognitively and emotionally inclined to accept,
embrace, and adopt a particular plan to purposefully alter the status quo" (Holt et al.,
2007, p. 235). Readiness is reflected in organizational members' beliefs, attitudes, and
intentions regarding the need and the organization's capacity to implement changes.
Strategies of the management of change, change agent credibility, and interpersonal and
social dynamics are important in the readiness creation process (Armenakis, et al.,
1993). Readiness creation is often discussed in conjunction with prescriptions for
resistance reduction (Piderit, 2000). Other research has been conducted on overcoming
resistance to change by creating readiness with management strategies matched to the
sources of resistance. The most influential readiness factors are (a) discrepancy (i.e.,
the belief that a change was necessary), (b) efficacy (i.e., the belief that the change
could be implemented), (c) organizational valence (i.e., the belief that the change would
be organizationally beneficial), (d) management support (i.e., the belief that the
organizational leaders were committed to the change), and (e) personal valence (i.e., the
belief that the change would be personally beneficial) (Holt, et al., 2003; Self, 2007;
Self and Schraeder, 2009). The underlying assumption is that organizations will move
through the stages of readiness, adoption, and institutionalization of change when
organizational members recognize that the change is appropriate, beneficial, and
supported (Holt, et al., 2003).
Similarly, Armenakis et al. (1999) proposed that the communication introducing
the change should address five key questions to set the stage for the change and to
create readiness in the change participants:
(1) Is the change necessary?
(2) Is the change being introduced the right change to make?
(3) Are key organizational members supportive of the change?
(4) Do I or we (the organizational members) have the ability to
successfully implement the change?
(5) What is in it for me if we change? (Self and Schraeder, 2009, p. 172)
Holt et al. (2007, p. 235) observed that readiness to change scales usually assess
four dimensions: (1) the content of the change; (2) the context (environment); (3) the
process of the change; and (4) the factors related to individuals involved in the changes.
Strategies for the communication elements in each dimension were reported to create
readiness and prevent resistance (Self and Schraeder, 2009). Piderit (2000) proposed that
the first step in implementation of change is to create readiness for the change rather
than merely overcoming resistance. Management of change is also applied to overcome
resistance that develops during the implementation as issues arise resulting from the
change. Self and Schraeder (2009) emphasize the continuing management of readiness
efforts across all stages of implementation, not just at the beginning, to increase the
likelihood of success. Therefore, management of change is a dynamic process during
the implementation (Orlikowski and Hofman, 1997). In this study the users' perception
of management of change effectiveness reflects how well they believe that the change
process has been managed to achieve these goals: whether the elements of
communication, management support, technical availability, and training needed to
create readiness and/or reduce resistance have led to the subsequent end-user computing
satisfaction.
Dent and Goldberg (1999) credit Kurt Lewin with the concept of resistance to
change. Lewin believed that the status quo was equilibrium between barriers to change
and forces driving change. He believed it was more effective to weaken the barriers
than to strengthen the drivers to bring about the change. Kwahk and Lee (2008) cited
resistance to change as a contributing factor to high failure rates of new IS
implementations. Resistance has been defined as any conduct that tries to keep the
status quo, i.e., resistance is equivalent to inertia, the persistence to avoid change
(Maurer, 1996). Oreg defines it as "an individual's tendency to resist or avoid making
changes, to devalue change generally, and to find change aversive across diverse
contexts and types of change" (Oreg, 2003). This study adopts the definition of
resistance as "a generalized opposition to change engendered by the expected adverse
consequences of change" (Bhattacherjee and Hikmet, 2007). Whether a user is satisfied
or dissatisfied with the system leads to either positive or negative behaviors. Hultman
(1995) argues that resistance consists of two dimensions: active and passive. Active
resistance includes behaviors such as being critical, selective use of facts, sabotaging,
and starting rumors. Passive resistance is displayed by behaviors such as public
support, but failure to implement the change, procrastinating, and withholding
information or support. Jiang, Muhanna, and Klein (2000) summarized the seven
reasons employees resist new technology:
(1) Change in job content.
(2) Loss of status.
(3) Interpersonal relationships altered.
(4) Loss of power.
(5) Change in decision-making approach.
(6) Uncertainty/unfamiliarity/misinformation.
(7) Job insecurity.
Factors identified as causing resistance include innate resistance to change, lack
of involvement in the change process, lack of management support, poor system
quality, and the lack of designer-user interaction (Hirschheim and Newman, 1988).

Harvey's 16 resistance factors, for which he develops antidotes, indicate the importance
of management support and communication, two elements of the management of
change actions that increase readiness and prevent/reduce resistance (Harvey, 1995).
Henry (1995) states that, "Researchers have found that resistance can be
categorized according to whether or not end users attribute their problems to specific
features of the technology, are computer anxious, or have a negative attitude toward
computers" (p. 20). Specific features of the technology causing the end user's resistance
can be identified and assessed for validity. If the complaint is valid, one approach is to
modify the technology to increase acceptance/satisfaction. If the complaint is based in
anxiety, and the end user cannot be reassigned, special training to reduce anxiety can be
conducted. Involvement in the design or early training can provide the end user with a
sense of participation and a feeling of vested interest.
The Jiang, et al. (2000) study further explores strategies used to reduce
resistance to change through five key activities: involving employees,
addressing concerns about IS development using open communication, sharing change
information, showing sympathy, and retraining. Negative behaviors are related to
resistance, which can occur at any stage in implementation (Cooper and Zmud, 1990).
Change managers, therefore, need to delve into the reasons for user resistance and to
learn effective strategies for managing different states of changes. A complete model of
user resistance would lead to better implementation strategies and desired
implementation outcomes (Joshi, 1991).

Folger and Skarlicki (1999) claim that resistance to the change may result from
some legitimate issues that need to be addressed. Benn and Baker (2009) examine a
model that incorporates input from resisting employees and channels conflict into
innovative outcomes to modify change. This co-evolutionary perspective fosters
institutional change to integrate with the human systems of the organization. The
change is then more easily integrated into the processes, procedures, and norms of the
organization. This perspective indicates that change management is a dynamic process
requiring recognition, evaluation, and reconciliation of issues throughout the change
implementation not only to lower resistance but also to benefit the organization. This
research strengthens the concept of the feedback loop in the model for this study to
allow issues to be surfaced and examined for corrective action, as in the co-evolutionary
perspective mentioned by Benn and Baker (2009).
Research on the acceptance and resistance to change follows two predominant
approaches. One approach views acceptance and resistance to change as opposite ends
of a continuum. By this view, low scores on acceptance instrument items indicate
resistance (Venkatesh and Davis, 2000). Self and Schraeder (2009) incorporate readiness
measurements in the resistance scale continuum. However, the other approach
considers acceptance separately from resistance to change. Self avers that "resistance
and readiness are not polar opposites on a linear continuum. Instead, resistance and
readiness represent complex states, impacted by numerous individual and organizational
factors" (2007, p. 11). Lauer and Rajagopalan (2003) treat resistance and acceptance
as separate constructs but analyzed cases post hoc by identified behaviors using a
framework rather than a measurement instrument. Holt, et al. (2003) adds empirical
support to previous anecdotal recommendations for implementing change, still without
measuring resistance. This study regards and measures readiness, resistance, and
acceptance/satisfaction separately using the resistance measurement instrument
developed by Bhattacherjee and Hikmet (2007). An analysis of the data determines
whether readiness and resistance to change are separate constructs as pictured in the
research model and if so, which one has more prominent effects.
Section 2.1.3 End-user computing satisfaction
In the literature on finite measures of IS performance, IS benefits are sometimes
intangible, and hence, user satisfaction is utilized as a surrogate measure (Ives, Olson,
Baroudi, 1984; Baroudi and Orlikowski, 1988; Straub, 1989; DeLone and McLean,
2003). A survey of the sensitivities to user needs, participation, and communication
was used to examine satisfaction as a measure of how well the change was being
managed (Davis et al., 1989). Chin and Lee (2000, p. 554) define end-user satisfaction
with an information system as "the overall affective evaluation an end-user has
regarding his or her experience related with the information system." This study
defines success of an information system as the extent to which users are satisfied with
the system; the information generated; and its ease of use. Some of the research
addressing how to increase user acceptance/satisfaction includes the Technology
Acceptance Model (TAM), which posits that user acceptance/satisfaction is predicted
by user perceptions regarding the ease of use and usefulness of the new system (Chau,
1996; Davis, 1989; Igbaria, et al., 1997; Szajna, 1996; Taylor and Todd, 1995;
Venkatesh and Davis, 2000). However, earlier studies (Judson, 1991; Kotter, 1995;
Lewin, 1951) suggest the level of acceptance/satisfaction depends on the motivation and
ability to change. Martins and Kellermanns (2004) focus on motivating factors and
enabling factors, which influence user acceptance/satisfaction. In their study, change
motivators, such as the explanation of realized benefits, positively influence perceived
usefulness. Change enablers, such as training, positively influence perceived ease of
use of the system. Accordingly, it can be acknowledged that management of change
strategies regarding communication and training promote change
acceptance/satisfaction.
Of the different instruments to measure user satisfaction, the primary measure
used in this study was the well-known instrument, the End-User Computing Satisfaction
instrument, in part because it has been validated for overall correlations (Doll and
Torkzadeh, 1988, 1989; McHaney, Hightower, and Pearson, 2002). The EUCS
instrument has been used extensively in a variety of workplace settings and continues to
be tested to extend its use in current practice (internet web services: Abdinnour-Helm,
Chaparro, and Farmer, 2005; public sector: Aladwani, 2002; Doll and Torkzadeh,
1989; Harper, Slaughter, and Norman, 1997; Taiwanese business: McHaney,
Hightower, and Pearson, 2002; on-line banking: Pikkarainen, Pikkarainen, and Pahnila,
2006; and ERP application: Somers, Nelson, and Karimi, 2003). This instrument has
been validated for measurement across subgroups using invariance analysis (to verify
that the 5 first-order factors have equivalent item-factor loadings across population
subgroups). Researchers have used EUCS as a standardized measure of advanced
information technologies and propose it to practitioners for evaluating ERP
implementations (Nelson, 2003).
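
As an illustration of how the twelve-item EUCS instrument is commonly scored, the sketch below computes the five subscale means (content, accuracy, format, ease of use, timeliness) and an overall mean. The column labels are hypothetical stand-ins; the actual items used in this study are listed in Appendix A.

```python
import pandas as pd

# Doll and Torkzadeh's (1988) EUCS structure: 12 items loading on five
# first-order factors. Column names here are illustrative labels only.
EUCS_SUBSCALES = {
    "content": ["C1", "C2", "C3", "C4"],
    "accuracy": ["A1", "A2"],
    "format": ["F1", "F2"],
    "ease_of_use": ["E1", "E2"],
    "timeliness": ["T1", "T2"],
}

def eucs_scores(responses: pd.DataFrame) -> pd.DataFrame:
    """Per-respondent subscale means plus an overall EUCS mean."""
    out = pd.DataFrame(index=responses.index)
    all_items = []
    for subscale, items in EUCS_SUBSCALES.items():
        out[subscale] = responses[items].mean(axis=1)
        all_items.extend(items)
    out["eucs_overall"] = responses[all_items].mean(axis=1)
    return out
```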
Section 2.2 Theoretical development
Figure 1 presents the management of change research model. Management of
change is critical to the success of enterprise-wide IS implementations. It is important
to understand the effects of change management on creating readiness and overcoming
resistance in order to improve end-user satisfaction, which is often used as the surrogate
measure of IS success.
Research has been conducted on the impacts of both resistance and readiness on
satisfaction from the self-determination theory research (Deci et al., 1994; Gagne and
Deci, 2005; Self and Schraeder, 2009), from change management research (Piderit,
2000), and from information systems research (Chin and Lee, 2000; Kwahk and Lee,
2008), but with inconclusive results. It is unclear whether readiness and resistance are
simply the reverse of each other. This study seeks to examine if they are both important
antecedents of user satisfaction, and if not, which one plays a more prominent role. As
discussed in the significance of the study, the research model is a proposed explanation
of how management of change can enhance and support information systems during
implementation. The longitudinal samples and instrument wording (REA is future-oriented,
MOC evaluates past action, and EUCS evaluates current state) are used to
establish time sequence and allow testing of a causal model of some of the constructs.
Results and qualitative comments from each survey point serve as feedback input to
management to adapt the change process strategies. Our combined model to test
longitudinally for causation is presented in Figure 2.
Figure 2 Model for Testing Longitudinal Effects

[Figure 2 is a path diagram across the three survey points: EUCS1 at Time 1; MOC2, REA2, RST2, and EUCS2 at Time 2; MOC3, REA3, RST3, and EUCS3 at Time 3. Labeled paths include H1 (-) from EUCS1 to MOC2; H2 (+), H3 (-), H4 (+), H5 (-), and H6 (+) among MOC, REA, RST, and EUCS within Time 2 and again within Time 3; and an H7 (+) path.]
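
Written as structural equations, the Time 2 portion of the hypothesized model can be sketched as follows; the coefficient signs restate the hypothesized directions, the same H2-H6 relations repeat for the Time 3 constructs, and the H7 path drawn in the figure is not written out here because its endpoints are not stated in the surrounding text:

```latex
\begin{align*}
\mathrm{MOC}_2  &= \beta_1\,\mathrm{EUCS}_1 + \zeta_1,
  & \text{H1: } \beta_1 < 0\\
\mathrm{REA}_2  &= \beta_2\,\mathrm{MOC}_2 + \zeta_2,
  & \text{H2: } \beta_2 > 0\\
\mathrm{RST}_2  &= \beta_3\,\mathrm{MOC}_2 + \zeta_3,
  & \text{H3: } \beta_3 < 0\\
\mathrm{EUCS}_2 &= \beta_4\,\mathrm{REA}_2 + \beta_5\,\mathrm{RST}_2
                  + \beta_6\,\mathrm{MOC}_2 + \zeta_4,
  & \text{H4: } \beta_4 > 0;\ \text{H5: } \beta_5 < 0;\ \text{H6: } \beta_6 > 0
\end{align*}
```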

Data and comments collected at Time 1, Time 2, and Time 3 were collated,
analyzed, summarized, and forwarded to management, which did improve its strategies
accordingly. By providing a feedback loop, the survey itself changed MOC in later
periods, which may have consequently affected REA and RST.
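
A minimal sketch of how the three waves could be aligned for the matched-respondent analysis follows; the file names and the anonymized identifier column are assumptions for illustration, not the study's actual artifacts:

```python
import pandas as pd

# Hypothetical per-wave files: one respondent per row, an anonymized
# identifier, and that wave's construct scores (MOC, REA, RST, EUCS).
waves = {t: pd.read_csv(f"wave{t}.csv") for t in (1, 2, 3)}

# Suffix score columns with the time index so EUCS becomes EUCS1, EUCS2, EUCS3.
for t, df in waves.items():
    waves[t] = df.rename(columns={c: f"{c}{t}" for c in df.columns
                                  if c != "respondent_id"})

# Inner joins keep only respondents who answered all three waves,
# yielding the matched-case sample used for the longitudinal tests.
matched = (waves[1]
           .merge(waves[2], on="respondent_id")
           .merge(waves[3], on="respondent_id"))
print(len(matched))  # size of the matched-respondent sample
```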
Armenakis, et al. (1993) posits a model separating resistance and readiness and
discusses methods to reduce resistance and build readiness. Although that study
considers readiness as a precursor for the user to decide to resist or support the change,
it was the first model found that separates the constructs and proposes that readiness
could be managed. This is an important contribution to constructing a model to test
how management of change relates to resistance and readiness, and how they in turn relate to
end-user satisfaction. The techniques recommended by Armenakis, et al. to increase
readiness align with increasing perceived ease of use and perceived usefulness to
increase acceptance posited by Davis, et al. (1989). Orlikowski and Hofman (1997)
contributes the dynamic aspect of management of change requiring adaptation during
the implementation. The concept of using feedback is reinforced by the
recommendation that members' concerns should be acknowledged, exploring strategy
effectiveness further to identify when managers should embrace resistance rather than
trying to avoid it (Holt, et al., 2003). From Venkatesh and Davis (2000) we draw the idea
of a longitudinal approach testing before, during and after implementation, but we use
satisfaction as an indication of success in the mandatory IT setting rather than time
usage in the voluntary IT setting.
Nelson (2003) contributes the validation of using the End-User Computing
Satisfaction instrument (EUCS) as a measure of the success of newly implemented ERP
applications. Although there is no one model that this study builds upon, these concepts
contribute to and synthesize into the proposed model: that management of change
strategies could affect successful IS implementations by creating readiness, reducing
resistance, as well as directly affecting end-user satisfaction.

With the feedback loop and longitudinal application this research design meets
the criteria recommended by Holt, et al. (2007, p. 253):
It would be useful to change agents to know how the employees feel about
proposed changes. Knowing whether the employees (a) felt the change was
appropriate, (b) believed management supported the change, (c) felt capable of
making the change successful, and (d) believed the change was personally
beneficial would alert them to needed attention about the change. Periodic
assessment of these sentiments may provide the necessary information to take
whatever actions may be needed to make the change successful.
The research model depicted in Figure 1 seeks to understand the effect of
the users' perception of management of change effectiveness on readiness, resistance,
and directly on end-user satisfaction. It integrates user feedback of satisfaction or
concerns to surface issues for evaluation of relevance and importance, used in deciding
whether to adapt the management of change processes or to improve an
element of the IT itself. The longitudinal testing before, during, and after
implementation allows testing of the variables and comparison of their relationships over
the implementation period. Resistance and readiness are studied for their effects on
user satisfaction and to evaluate which is a better precursor. This process model offers
periodic assessment of the sentiments, which may provide the necessary information to
take whatever actions are needed to make the change successful, as proposed by Holt, et al.
(2007).

Section 2.3 Hypotheses


This research explores the relationship among users' perception of management
of change effectiveness, readiness for change, resistance to change, and end-user
computing satisfaction before, during, and after a new IS implementation. Satisfaction
with the old system is seen as decreasing the subject's readiness for change to a new
system. Doubtful attitudes inhibit favorable reactions and promote resistance to IS
change (Joshi, 1991). It is assumed that users who are satisfied with the old existing
information system are not motivated to use a different information system. Those
users who do not see the discrepancy of a new desired end-state or the efficacy to achieve it
will not have a positive attitude toward the change (Armenakis, et al., 1993). These
users are less collaborative in the change process. Those users who are very dissatisfied
with the old system should welcome the change resulting in a more favorable
perception of management of change effectiveness of the new system. Hence, change
managers should assess users attitudes towards the replaced information system and
adjust their strategies accordingly (Armenakis, et al., 1993).
H1. End-user computing satisfaction with the old system negatively affects the users'
perception of management of change effectiveness for the new implementation.

Management of change includes: 1) communication of the need for change; 2)
promoting the expected benefits of the new system; 3) management support for the
planned change; and 4) training to promote ease of use and to diminish uncertainty
(Deci et al., 1994; Gagne and Deci, 2005). The readiness message should incorporate
two issues: (a) the need for change, that is, the discrepancy between the desired end-state
(which must be appropriate for the organization) and the present state; and (b) the
individual and collective efficacy (i.e., the perceived ability to change) of parties
affected by the change effort (Martins and Kellermanns, 2004). These strategies aim to
inform users of the benefits of the change and encourage them to favorably respond to
the change. Bentley's (2005) seventh prerequisite for successful implementation, called
"Education," is defined as the ability to understand the solution (technology), why the
business needs it, how the technology works, what one can expect from it, and what
changes are required. These objectives are attained through communication and training
to establish realistic user expectations. Creating discrepancy in the user's mind
between the old system and the new increases the user's readiness to accept the change.
High ratings on MOC should result from effective efforts to prepare users to accept the
change.
H2. Users' perception of management of change effectiveness positively affects
Readiness for Change.

IS researchers also recognize users' acceptance of a system as a major objective
of system implementation and the organizational change it entails. Understanding and
effectively managing resistance are, therefore, important determinants of the system
success (Jiang et al., 2000). Resistance to change can be managed by communicating
32

the rationale for the change (Deci et al., 1994; Gagne and Deci, 2005). Resistance is
reduced as the ease of using the new system and the expected utilization benefits are
enhanced.
If a user's perception of the management of change effectiveness is high, then the
user's resistance to change is expected to decrease. Low MOC measurements
would indicate a negative opinion of change management effectiveness, which
increases user resistance.
H3. Users' perception of management of change effectiveness negatively affects
resistance to change.

Kwahk and Lee (2008) found that readiness for change had an indirect, positive
effect on behavioral intention to use an enterprise-wide system through the influences of
perceived usefulness and perceived ease of use; both are important causal antecedents
of acceptance/satisfaction according to Venkatesh and Davis (1996). Venkatesh and
Davis (2000) suggested that interventions to increase the comparative effectiveness
between the new and old systems may produce increased leverage to promote user
acceptance/satisfaction. Training represents an obvious opportunity and is one of the
major elements of management of change. Training impacts the user's beliefs regarding
both ease of use and usefulness and is one management strategy to create readiness and
prepare users to accept the change (Venkatesh and Davis, 1996). If creating readiness
has a positive effect on perceived usefulness and ease of use, then it should increase user
satisfaction, which indicates a successful implementation.
H4. Readiness for change positively affects end-user computing satisfaction with the new
system.

Changes that are considered favorable are not resisted and may even be sought
after and welcomed, while changes considered unfavorable are likely to be resisted.
More resistance deters internalization of the benefits of change and reduces satisfaction
with the change. MIS researchers recognize that better theories or models of user
resistance would lead to better implementation strategies and desired implementation
outcomes (Joshi, 1991). Overcoming resistance should lead to greater acceptance, or
EUCS. Readiness for change is expected to positively impact satisfaction with the new
system, whereas resistance to change is expected to lower satisfaction (Piderit, 2000).
H5. Resistance to change negatively affects end-user computing satisfaction with the new
system.

Change management is critical to successful IS implementation. Top
management support, business involvement, communication, and training are important
factors in managing these IS changes successfully (Shang and Su, 2004). Many
researchers have been interested in how to promote user satisfaction for successful
implementations (Chau, 1996; Davis, 1989; Igbaria et al., 1997; Venkatesh and Davis,
2000). The level of satisfaction depends on the motivation and ability to change
(Judson, 1991; Kotter, 1995; Lewin, 1951). Motivating factors and enabling factors
influence user satisfaction. Change motivators, such as the explanation of realized
benefits, positively influenced perceived usefulness. Change enablers, such as training,
positively influenced perceived ease of use of the system (Martins and Kellermanns,
2004; Venkatesh et al., 2000). It is recognized that satisfaction can be enhanced by
"giving managers a tool to proactively design interventions targeted at populations of
users that may be less inclined to adopt and use new systems" (Doll, 2004, p. 426). An
instrument that helps managers identify weak areas in change strategies can supply
feedback to adapt the change process during the implementation to promote end-user
satisfaction. This expected effect is indicated by the longitudinal model in Figure 2. It is
expected that as the perception of the effectiveness of the change management increases,
so does the user's satisfaction with the system.
H6. Users' perception of management of change effectiveness positively affects end-user
computing satisfaction with the new system.
If feedback is collected on users' concerns about the change or technology,
and acted upon by adapting management of change strategies to address those
concerns, then users' perceptions of how well the change is managed should improve
(Holt et al., 2003; Holt et al., 2007; Jiang et al., 2000).


H7. Feedback from end-user computing satisfaction with the new system positively affects
users' perception of management of change effectiveness.

The theoretical model in Figure 1 and the longitudinal testing model in Figure 2
depict the hypothesized relationships among users' perception of the
management of change, readiness for change, resistance to change, and end-user
computing satisfaction. Although the literature streams of management of change and IS
acceptance contain research on these constructs, no study was found with a model that
contained them all. This research investigates them together longitudinally in a process
model.

CHAPTER III
Methodology
Section 3.1 Research Design
This study is a non-experimental, quantitative, explanatory, longitudinal design,
since the independent variables are not manipulated. To establish causation, variables
must be correlated, the independent variable must precede the dependent variable in time
order, and the observed relationship must not be due to a third, confounding variable. A
longitudinal design, rather than a single cross-sectional design, can be used to establish
time order. Techniques to establish time order in this study include collection of
samples at three sequential points in time and wording of constructs such that MOC
refers to change actions that have already occurred and EUCS refers to the current level of
satisfaction (Johnson and Christensen, 2006).
The purpose of this research is to analyze causal relationships among the four
main variables. The research design is a longitudinal study with surveys at three points
in time; at each point it is a cross-sectional observational study using a web-based
survey. The research setting is a small university replacing multiple separate
systems with a new, integrated, mandatory-use student information system.
This study differs from the longitudinal study of Venkatesh and Davis (2000) by
using feedback from two of the three survey points spanning 15 months to surface
issues as input for management of the change process. It measures end-user satisfaction
in this mandatory setting rather than usage time in a voluntary setting. The concepts
applied in this study's model are prevalent in the literature, but no previous study could be
found that investigates this combination of variables longitudinally in a process model.
Although studies were found that treated resistance and readiness as conceptually
separate variables, only one study was found with a measurement instrument for resistance
separate from readiness (Bhattacherjee and Hikmet, 2007). This study treats readiness
and resistance separately and seeks to answer which one plays the more prominent
role in the change process.
This study is based on data and observations taken as the Comprehensive Academic
Management System (CAMS Enterprise), an integrated, web-based academic
Enterprise Resource Planning system for higher education, is introduced.
CAMS, marketed as an academic Enterprise Resource Planning system, is
similar to an Enterprise Resource Planning (ERP) system for a business. First, it
provides a student (i.e., customer) portal, allowing students to access email, financial
data, grade reports, and the course management system, similar to how customers
remotely access a business. Second, the online testing in CAMS parallels what is
typically used for employee training by a business human resource department.
Third, the CAMS faculty and staff portals offer functionalities similar to those a business
offers its employees. Faculty conduct classes in an online environment interfaced
with the backend data management system. They can access appropriate student
records, advise students, complete course registration, and post grades. Staff, depending
on the department where they work and their job titles, interface with email, accounts
payable, admissions, financial aid, registrar, and the student databases. Both faculty and
staff can conduct their respective functions serving students in CAMS. CAMS, as a
campus (enterprise)-wide management system, interfaces and integrates academic
(business) functions and eliminates the unnecessary duplicated data entries and inconsistent
data management of the old legacy information systems that served only a specific
business function or audience. CAMS, just as it has been marketed, indeed functions as
an ERP system does in a general business. Therefore, results of this study may be
generalizable to other enterprise-wide integrated software implementations facing
similar integration and change management challenges.
Section 3.2 Data Collection
The university community, in target groups from administration/staff, faculty,
and students, responded to emails soliciting their participation in the survey, which
was placed on a controlled-access web site. Follow-up emails were sent to maximize
the response rate and enable comparison of late respondents to earlier ones. A note at
the beginning of the survey explained the purpose of the study and the procedure for
handling the data. It was emphasized that the data would be kept confidential and used
only for research purposes. All constructs were measured using the survey (see
Appendix D for email invitations). The data were collected with the survey instrument
contained in Appendix A using SurveyGold software (www.surveygold.com). Several
techniques were used to encourage participation. First, it was explicitly stated in the
instructions that participation was voluntary and that no identifying information would
be shared. Additionally, to encourage participation, upon completion of the
survey respondents were directed to a password-protected site that collected their
information for a $100 incentive drawing held at the end of each survey collection
period. A final $100 drawing was held for those who had participated in all three
surveys.
See Table 1, Time-line on Implementation, for the dates and phases of
implementation at each sample point. Qualitative data were collected to establish the
timeline (Appendix E).

Table 1 Time-line on Implementation

Milestone                                                                | Date              | Survey Date
Employee training on CAMS and Blackboard                                | Feb. 17-20        | Feb. 26 - Mar. 10 (Time 1)
New hosted Blackboard implemented (less than 1 week before Fall classes) | Aug. 14           |
CAMS training; Student/Faculty Portal open                               | Oct. 29 - Dec. 10 | Nov. 25 - Dec. 24 (Time 2)
Technical help desk / Student g-mail                                     | Jan. 13           |
Implementation complete; register and submit grades online               | March 20          | Apr. 15 - May 3 (Time 3)

Harvey (1994) recommended that users' complaints about technology during change
implementation be examined and, if a need is revealed, the technology be modified.
Accordingly, when faculty complained about the inadequacy of the proposed course
management module included in the CAMS ERP, the decision was announced to employ
an updated, hosted, integrated Blackboard instead. This decision was made and
announced prior to the Time 1 survey.
Data were collected at three points: in March 2009 (referred to as Time 1), at
the initiation of the new system; in November 2009 (Time 2), after the registrar's
module and upgraded course management system were implemented; and in April 2010
(Time 3), after the implementation of all systems was complete and in operation for a
month. As new modules were implemented, the parallel older system modules were
completely displaced and taken offline, except the old student information database,
which remained read-accessible for a short period before being taken offline. By Time 3
all modules and integration were complete, and the old student records database, email,
and un-integrated Blackboard were completely displaced.
The survey instrument was modified slightly at each time to reflect specific
needs at that time. Issues identified by the survey comments were collated and forwarded
to management as input for adapting change management strategies. Communications
from management, comments on improved workflow enabled by the new IS, priority
changes, and other issues indicated in the survey comments were collected as qualitative
data and are useful in interpreting the results.
Management communications announced a compromise: an upgrade of the existing
course management platform (Blackboard), integrated with the new information system,
in lieu of the CAMS module that faculty considered inferior. Emails announced expected
benefits, time-lines for the implementation, and periodic updates as modules were
implemented. Instruction, training schedules, and the technical support structure were
announced. Clarifying emails were sent to address rumors and unrest.
Control variables are chosen to account for variance in the dependent variables
that might be explained by factors other than the hypothesized variables. Agarwal and
Prasad (1990) posit that individual difference factors affect beliefs about usefulness and
ease of use in IT acceptance. Individual differences in that study are defined as user
factors that include traits such as personality and demographic variables, as well as
situational variables that account for differences attributable to circumstances such as
experience and training (1990, p. 2). Gender was examined as a control variable in a
study of how specific change messages and change facilitation strategies relate to
perceptions of the change benefits (Holt et al., 2003). Venkatesh and Davis (2000) take
into account certain variables that might determine acceptance factors tied to social
context and individual characteristics (such as age, level of income, or education). A
number of demographic variables, including age and education, have been studied and
shown to influence system use. Dillon and Morris (1996) aver that it is not surprising
that age influences the use of technology within broad parameters, though not in a strong
relationship. Demographic factors of age and gender are therefore collected in this study
to test whether age or gender plays a role in the relationships under study and to help
examine whether there is bias in the sample.
In addition, Nelson (1990) suggests that investigations of individual adjustment
to technological innovations should include job characteristics as potential influences on
attitudes and behavior, and should do so in a longitudinal, multiple-measures design.
Objective job content along with perceived job characteristics should be studied. Palm,
Colombet, Sicotte, and Degoulet (2006) investigated the effect of functional group
(medical secretaries, nurses, and physicians) on acceptance and user satisfaction with a
clinical IS. They focused on user characteristics, user satisfaction, and perceived
usefulness and concluded that satisfaction is highest among medical secretaries,
who are the most frequent users of the computer IS functions and the only users of the
appointment and scheduling functions. Laerum, Karlsen, and Faxvaag (2004), in a
separate study, reached a similar result: secretaries generally use hospital IS
functionalities more frequently in their daily tasks and are more satisfied than nurses or
doctors. Therefore, based on the literature, the group factor of students, staff, and
faculty is also considered as a control variable in this study due to differences in
work assignments, computer modules used, training, and function. It is expected that the
different university groups of students, staff, and faculty may also react differently to
the implemented change.

Section 3.2.1 Human subject concerns

It was emphasized to participants that the data would be kept confidential and used
only for research purposes. To track respondents, each survey was assigned a unique
code, and respondents did not need to provide their identity on the survey. A list of
codes matched to the email addresses of respondents was created from the incentive
drawing survey link, to which only the researcher had access.
Section 3.2.2 Population and sample
A small, private university was the setting for sampling during implementation
of a new integrated student information system. Permission was granted by the
President and Vice-President of Academic Affairs to conduct the study on user
satisfaction as it relates to users' perception of management of change effectiveness and
its impact on resistance/readiness to change. Initial interviews were approved and
conducted to explore the proposed model. Employees' names and email addresses were
obtained by functional group of the school. The university community, in target groups
from administration, faculty, students, and staff (advisers, registration, financial aid,
admissions, advancement, academic support, and business office), was sent emails
soliciting participation in the survey with a link to access the survey web site.
Despite the limited population size of this small university (approximately 100
faculty, 50 staff, and 1,000 students), the response rates for the survey across all three
points are consistently satisfactory. Initially, 181, 325, and 207 surveys were completed at
Time 1, Time 2, and Time 3, respectively. However, after pre-processing for
missing data, each data set was reduced to 145 records: all surveys with greater than
10% N/A (Not Applicable) responses or missing data were eliminated, and remaining
missing values (10% or less per record) were replaced with the average value. The
Partial Least Squares (PLS) testing required the same number of cases at each point.
Time 1 retained 145 cases, which determined the number for the other two points. After
this stringent elimination, Time 2 still exceeded 145 cases, so random number generation
was used to eliminate cases down to the required level.
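These pre-processing rules can be summarized in a short script. The sketch below is illustrative only: it assumes a hypothetical data frame `responses` whose survey items are Likert-scaled columns, and it reads "the average value" as the per-item mean; the column names, function name, and random seed are invented for the example, not taken from the study.

```python
import pandas as pd

def preprocess(responses: pd.DataFrame, item_cols: list, target_n: int,
               seed: int = 42) -> pd.DataFrame:
    """Apply the study's pre-processing rules to one survey wave.

    1. Drop records with more than 10% missing/N/A item responses.
    2. Mean-impute the remaining (<= 10%) missing values per item.
    3. Randomly trim the wave down to target_n cases so all waves match.
    """
    items = responses[item_cols]
    # Share of missing (N/A) answers per respondent.
    missing_share = items.isna().mean(axis=1)
    kept = responses.loc[missing_share <= 0.10].copy()
    # Replace the remaining gaps with the item (column) average.
    kept[item_cols] = kept[item_cols].fillna(kept[item_cols].mean())
    # PLS testing required equal n at each wave: down-sample randomly if needed.
    if len(kept) > target_n:
        kept = kept.sample(n=target_n, random_state=seed)
    return kept
```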
Table 2 indicates the sample size in each group, with a total of 145 at Time 1,
Time 2, and Time 3, for data analysis.

Table 2 Sample Sizes

Group     | Time 1 | Time 2 | Time 3
Students  |   86   |  102   |   87
Faculty   |   31   |   23   |   28
Staff     |   28   |   20   |   30
Total (n) |  145   |  145   |  145
The descriptive statistics of each group show that the samples of each group at each
point are representative of the respective population (Table 6), indicating no sample
bias. The sample size of 145 at each point satisfies the minimum sample size
requirements of this particular study at the desired effect size and power.

To determine the minimum sample size, the following factors were considered: a
power analysis with power of 0.8 at the 95% confidence level and 0.5 effect size
requires a sample size of 102, and SmartPLS requires ten times the largest number of
items measuring a latent variable, which is 120 for this study (12 items x 10).
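Both requirements are simple to check programmatically. The sketch below shows the ten-times rule directly and, as a hedged illustration, a generic power calculation with statsmodels; the exact sample size a power analysis yields depends on the statistical test assumed, so the t-test shown here will not necessarily reproduce the study's figure of 102.

```python
from statsmodels.stats.power import TTestIndPower

# Rule-of-thumb minimum for PLS: ten times the largest number of
# indicators attached to any one latent variable (12 items in this study).
max_indicators = 12
pls_minimum = 10 * max_indicators   # -> 120

# Generic power calculation: required n for power = 0.8, alpha = 0.05,
# effect size 0.5. A two-sample t-test is assumed here purely for
# illustration; the study's reported minimum (102) presumably derives
# from a different underlying test.
n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(pls_minimum, round(n_per_group))
```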
In this study, not all individuals use all applications or perform exactly the same
tasks; so, although the unit of measure is the individual, the unit of analysis is the
aggregated experience, which represents the organizational level. The goal of using the
organization as the unit of analysis is to provide findings useful to organizations
assessing their current state of readiness, resistance, users' perception of management of
change effectiveness, and end-user satisfaction, so that they can adjust their management
of change strategies to effect a successful enterprise implementation.
Fifty-six subjects responded to the surveys at all three points in time spanning
15 months. The data from these longitudinally matched respondent records were
analyzed for comparison to the results from the larger sample of 145 respondents.
Students were not as heavily represented in the smaller matched respondent group,
since the student population changed, with one class graduating and another entering
between sample points. The staff and faculty were more stable groups, and a larger
portion of these two groups participated at all three survey points.
Section 3.3 Measurement Development
After conducting a literature review and developing a tentative research model,
administrative personnel were interviewed to finalize the appropriate research model
and necessary instruments for assessment. A total of seven interviews were conducted
with key administrative individuals having titles such as "Vice President of Academic
Affairs," "Assistant Dean," "Vice President of Finance" (also responsible for technical
services), "Executive Director of Enrollment Management and Student Life," "Director
of Institutional Assessment," "Registrar," and "Assistant Registrar." Each interview
was recorded using a digital recorder. The transcriptions contain a total of 18,104
words (Appendix F).
Exploratory questions were asked concerning why the change was being made,
expected benefits, expected resistance, participation in selection, communications to the
organization, and important elements for successful implementation. Communication,
training, management support, and technical support were all listed by interviewees as
important. They each expressed trust in the new system, with benefits of integration and
improved accuracy. Change agent credibility and good data migration were also listed
as important to the success of the new system. The interviews also served to remind
interviewees of critical elements for successful change. No new elements
surfaced that would not fit into the existing constructs and model.
At the end of one interview, the Vice President of Finance offered to list
improvements in task procedures after implementation for the accounting-area
employees. This type of data was recommended for information systems research in the
literature review materials and was deemed valuable. From these conversations the idea
developed to add a comment area on the survey ("Please comment on any job tasks that
have improved or worsened with the change"). The comments from the initial survey
surfaced issues that needed addressing and were collated and sent to management. At
the second survey point, the email invitation included the statement: "Your comments
will be anonymous but your concerns will be passed on to administration."
Four constructs are measured in this study. All instrument items are detailed in
Appendix A. Some instrument items were modified at the different data collection points
to refer specifically to the information system under examination at that point. For
instance, at the pre-implementation of the new IS, CAMS, the EUCS
measurements specifically refer to the old information system composed of fx Scholar,
ACT, Response Plus, etc. Questions are carefully worded with the proper tense. For
instance, MOC measures the user's perception of how well change has been managed
before the date of the survey, and EUCS measures the user's present satisfaction, on the
date of the survey, with the current information system. All four constructs' items were
measured using a five-point Likert scale, anchored at 1 = strongly disagree and 5 =
strongly agree.
The operationalization of users' perception of management of change
effectiveness (MOC) combines survey items from three existing MOC instruments.
MOC is measured by the user's opinion of the fairness, management support (Caldwell et
al., 2004), technical availability (Martins and Kellermanns, 2004), communication, and
training (Herold et al., 2007) exhibited by the university's management.
The order of question arrangement for the three survey collection times is given in
Table 8.
Readiness for change (REA) is measured using thirteen items from Kwahk and
Kim (2009). These readiness items address the general attitude toward change and how
others perceive the respondent's attitude toward change. Items from Holt et al. (2007)
were added addressing perceived benefits to the respondent and the organization. Several
items were reworded to relate specifically to the context of CAMS, while common-issue
items were consolidated to render the instrument more concise.
Resistance to change (RST) is measured using four items from the instrument in
Bhattacherjee and Hikmet (2007). These items address users' general resistance to
change with regard to how they input data, receive reports, interact with others, and
other general work methods. The item "I like the CAMS system" was added to test for
understanding of the four negatively worded resistance questions, but it was placed later
in the Time 2 survey.
For the Time 3 survey, the Section III "Attitude toward change" introduction
was modified to further clarify the scoring of negative questions: "Agreeing to the first
four questions means you don't like change."
In this study, satisfaction is measured by the well-published End-User Computing
Satisfaction instrument as the extent to which users are satisfied with the system,
the information it generates, and its ease of use (Doll and Torkzadeh, 1988, 1989;
Abdinnour-Helm et al., 2005; Aladwani, 2002; McHaney et al., 2002; Somers et al.,
2003). The instrument consists of five dimensions: content, accuracy, format,
timeliness, and ease of use. Each dimension is measured with Likert-scaled responses
regarding the user's frequency-based belief that the response is true.
The final instrument was further refined by pilot testing, with the intent of
clarifying the wording and layout of the instrument. The following alterations were
made in response to input from six survey testers:
- Added an N/A column for Not Applicable.
- Added color, which may improve the response rate.
- Changed the title to "The University Pre-Survey."
- Added a phrase on informed consent to the invitation email with the link,
explaining that "clicking the link and taking the survey grants permission to use
the data."
- Revised the EUCS Section II introductory phrase as suggested to "Regarding the
current Information systems" and added "=" marks between each number and its
descriptive phrase.
- Revised the attitude Section III introductory phrase as suggested: "These questions
ask about your attitude toward changes. Indicate the extent that you agree or
disagree with the following statements."
- Revised the Management of change Section IV introductory phrase as
suggested: "The following statements ask you to assess how the university is
preparing you for the move to CAMS."
- Demographics: changed to ask the respondent to type in "years and months."

Other than adjustments of wording for tense and references to the current systems,
the surveys consisted of the same questions at each survey point.
Section 3.4 Method of analysis
As described in Section 3.1, this study is a non-experimental, quantitative,
explanatory, longitudinal design whose purpose is to analyze causal relationships among
the four main variables. Structural equation modeling (SEM) is a statistical technique for testing
main variables. Structural equation modeling (SEM) is a statistical technique for testing
and estimating those causal relationships based on statistical data and qualitative causal
assumptions. SEM allows the researcher to simultaneously consider relationships
among multiple independent and dependent constructs answering a set of interrelated
research questions in a single, systematic, and comprehensive analysis (Gefen et al.,
2000). SEM also supports unobservable latent variables (LVs) through use of
observable and empirically measurable indicator variables (also referred to as manifest
51

variables (MVs)) to estimate LVs in the model (Urbach and Ahlemann, 2010). Thus, the
relationships can be analyzed between theoretical constructs.
Partial least squares (PLS) is a component-based approach for testing structural
equation models. The PLS approach does not require a normal distribution of the
measured variables, as the traditional SEM approach does (Chin, 1998b). PLS will
neither produce inadmissible solutions nor suffer factor indeterminacy (Fornell and
Bookstein, 1982). It works with relatively small sample sizes (Cassel et al., 1999) and
generates LV estimates for all cases in the data set. Neither independent observations
(Wold, 1980) nor identical distributions of residuals are needed for PLS (Chin and
Newsted, 1999; Lohmöller, 1989). The model quality improves as more indicators are
used to explain the LV's variance ("consistency at large") (Huber et al., 2007; Lyttkens,
1973). The power analysis is based on the portion of the model with the largest number
of predictors. Minimal recommendations range from 30 to 100 cases (10 times the
largest number of MVs for any LV in the model), whereas traditional covariance-based
SEM analysis has recommended sample sizes from 200 to 800 (Urbach and Ahlemann,
2010). Parametric significance tests of model parameters are not possible with PLS;
significance is instead assessed through resampling procedures such as bootstrapping.
Three of the constructs in this research model had 12 MVs, which requires a minimum
sample size of 120. PLS was chosen to analyze the data since the available 145 cases
exceeded the minimum required for its use but were too small for traditional SEM
analysis (< 200 cases).

In PLS path modeling, parameter estimation is accomplished through a multi-stage
algorithm. The stages involve a sequence of regressions in terms of weight vectors;
iteration leads to convergence on a final set of weights, and the weight vectors obtained
at convergence satisfy fixed-point equations. SmartPLS simultaneously assesses the
psychometric properties of the measurement model (e.g., the reliability and validity
of the scales used to measure each latent variable construct) and the parameters
of the structural model (e.g., the magnitudes and significance levels of the beta
coefficients for the paths between the latent variables). SmartPLS is not
constrained to data sets that meet homogeneity and normality requirements, and it can
handle smaller sample sizes relative to other structural techniques (Chin et al., 2003).
Hence, SmartPLS is used to analyze the data from this study.
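To make the iterative weighting scheme concrete, the sketch below implements a bare-bones version of the PLS path-modeling outer/inner estimation loop in Python for a single structural path between two reflective (Mode A) constructs. It is a minimal illustration of the algorithm's structure under common textbook choices (centroid inner weighting, Mode A outer updates), not the SmartPLS implementation; the function and variable names are assumptions for the example.

```python
import numpy as np

def pls_two_blocks(X, Y, tol=1e-8, max_iter=300):
    """Minimal PLS path-modeling loop for two reflective blocks X -> Y.

    X, Y: (n_cases, n_indicators) arrays of indicator scores.
    Returns the outer weights of each block and the structural path coefficient.
    """
    def standardize(a):
        return (a - a.mean(axis=0)) / a.std(axis=0)

    X, Y = standardize(X), standardize(Y)
    wx = np.ones(X.shape[1])                 # initial outer weights
    wy = np.ones(Y.shape[1])
    for _ in range(max_iter):
        # Outer estimation: LV scores as standardized weighted sums.
        lx = standardize(X @ wx)
        ly = standardize(Y @ wy)
        # Inner estimation (centroid scheme): each LV's inner proxy is the
        # sign-weighted score of the LV it is connected to.
        sign = np.sign(np.corrcoef(lx, ly)[0, 1])
        zx, zy = sign * ly, sign * lx
        # Mode A outer update: correlate each indicator with the inner proxy.
        wx_new = X.T @ zx / len(zx)
        wy_new = Y.T @ zy / len(zy)
        wx_new /= np.linalg.norm(wx_new)
        wy_new /= np.linalg.norm(wy_new)
        converged = max(np.abs(wx_new - wx).max(), np.abs(wy_new - wy).max()) < tol
        wx, wy = wx_new, wy_new
        if converged:
            break
    lx, ly = standardize(X @ wx), standardize(Y @ wy)
    beta = np.corrcoef(lx, ly)[0, 1]  # single-predictor path = correlation
    return wx, wy, beta
```

With standardized indicators, the converged scores yield the kind of latent variable estimates and path coefficients reported in Chapter IV; production analyses would, of course, rely on SmartPLS itself.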


CHAPTER IV
Data Analysis and Research Findings
Section 4.1 Measurement Validation
The research model depicted in Figure 1 was analyzed using SPSS for
descriptive statistics of the MVs and LVs, with average values depicted in Table 3. The
relationships were analyzed using the Partial Least Squares (PLS) path modeling technique.
Specifically, the model was tested using linear PLS path modeling as implemented in
the freely available SmartPLS software (Ringle et al., 2005).
Table 3 shows the descriptive statistics of the construct measurements for each
group and for the total sample of 145. (Construct averages are calculated with equally
weighted items.) The descriptive statistics by item are found in Table 14 in Appendix B
and indicate the same trends. Note that MOC2 for the total is higher than MOC1.
Comment summaries from the surveys were sent to management to identify issues for
strategic action. By providing a feedback loop, the survey itself changed MOC in
later periods, which may have consequently affected REA and RST
(substantiated by the memo of 12/03/09 in Appendix E, p. 118). MOC3, unfortunately,
was rated slightly lower than MOC2; qualitative comments showed that users were
complaining about needing more than one login to the student portal and that management
did not respond promptly and effectively to address these complaints. This failure by
management to respond to these concerns was a special case for this particular project.
Table 3 Descriptive Statistics (n=145)

Group   | Construct | Time 1 | Time 2 | Time 3
Student | MOC       |  3.4   |  3.9   |  3.8
        | REA       |  3.5   |  4.0   |  3.7
        | RST       |  3.3   |  3.5   |  3.4
        | EUCS      |  4.0   |  4.2   |  4.2
Staff   | MOC       |  3.7   |  3.9   |  3.5
        | REA       |  4.0   |  4.1   |  3.7
        | RST       |  2.4   |  2.7   |  3.0
        | EUCS      |  3.1   |  4.0   |  4.0
Faculty | MOC       |  3.7   |  3.6   |  3.6
        | REA       |  4.0   |  4.2   |  3.8
        | RST       |  2.4   |  2.6   |  2.9
        | EUCS      |  3.1   |  4.0   |  4.0
Total   | MOC       |  3.5   |  3.9   |  3.7
        | REA       |  3.7   |  4.0   |  3.7
        | RST       |  2.9   |  3.3   |  3.3
        | EUCS      |  3.6   |  4.1   |  4.1
1 = Strongly disagree, 2 = Disagree, 3 = Neither agree nor disagree, 4 = Agree, 5 = Strongly agree

In Table 3 it should be noted that at Time 1, students as a group tested higher in
end-user satisfaction (4.0 vs. 3.1 for both staff and faculty), lower in readiness (3.5 vs.
4.0 for staff and faculty), and greater in resistance (3.3 vs. 2.4 for staff and faculty).
Prior to Time 1, students were not aware of the introduction of a new system and had not
received communication concerning the change; the staff and faculty, however, had
been notified prior to Time 1. After Time 1 and prior to Time 2, students were
experiencing the benefits of automatic identification creation and course enrollment
from the integration with the course management system. Students were benefiting
from the student portal's access to transcripts, financial information, and schedules prior
to the Time 3 survey. Students could access and reset common component passwords
through the email system, alleviating great irritation and access delays without the
necessity of contacting technical support.
Before the implementation, at Time 1, there were significant differences between
students and the other two groups. The students were not involved in the imminent
changes and had received no communication yet. The measurements for satisfaction
referred to the old information system. Only the EUCS data from the Time 1 survey are
used in the analysis of the implementation of the new information system.
As discussed in Section 3.4, the method of analysis for this study is SmartPLS.
The recommended criteria discussed below are examined to validate the
measurements and the quality of the model as analyzed by SmartPLS. Reliability
results from testing the measurement model with the combined Time 1, Time 2, and
Time 3 data for the large sample of 145 respondents are reported in Table 4. The
composite reliabilities, Cronbach's alphas, and the average variance extracted (AVE) for
each of the first-order latent variable constructs are reported. The data indicate that the
measures are robust in terms of their internal consistency reliability as indexed by the
composite reliability. The composite reliabilities of the different measures in the model
(Dillon-Goldstein's rho) range from 0.94 to 0.97, which exceeds the recommended
threshold value of 0.70 (Nunnally, 1978), indicating excellent internal consistency
reliability of each block of manifest variable indicators for each of the latent constructs.
Table 4 Assessment of the Measurement Model

Construct | Cronbach's Alpha | Composite Reliability | AVE
EUCS1     | 0.97             | 0.97                  | 0.76
EUCS2     | 0.95             | 0.96                  | 0.64
EUCS3     | 0.95             | 0.96                  | 0.67
MOC2      | 0.94             | 0.95                  | 0.62
MOC3      | 0.94             | 0.95                  | 0.62
REA2      | 0.94             | 0.95                  | 0.65
REA3      | 0.93             | 0.94                  | 0.63
RST2      | 0.92             | 0.94                  | 0.80
RST3      | 0.95             | 0.97                  | 0.88

In addition, consistent with the guidelines of Fornell and Larcker (1981), the
average variance extracted (AVE) for each measure well exceeds 0.50, which suggests
that the principal constructs capture much more construct-related variance than error
variance (Hair et al., 1998).
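These three reliability statistics can be computed directly from the raw item scores and the standardized outer loadings. The sketch below shows the textbook formulas in Python; the array names are placeholders, and in practice the loadings come out of the PLS estimation rather than being known in advance.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha from an (n_cases, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def composite_reliability(loadings: np.ndarray) -> float:
    """Composite reliability (Dillon-Goldstein's rho) from standardized loadings."""
    error_vars = 1 - loadings ** 2
    return loadings.sum() ** 2 / (loadings.sum() ** 2 + error_vars.sum())

def ave(loadings: np.ndarray) -> float:
    """Average variance extracted: the mean squared standardized loading."""
    return (loadings ** 2).mean()
```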
Table 5 Discriminant Validity (Inter-correlations) of Latent Variable Constructs

      | EUCS1 | EUCS2 | EUCS3 | MOC2  | MOC3  | REA2  | REA3 | RST2 | RST3
EUCS1 |  0.87 |       |       |       |       |       |      |      |
EUCS2 |  0.04 |  0.80 |       |       |       |       |      |      |
EUCS3 |  0.07 |  0.11 |  0.82 |       |       |       |      |      |
MOC2  |  0.07 | 0.52* |  0.12 |  0.79 |       |       |      |      |
MOC3  | -0.10 |  0.00 | 0.55* |  0.05 |  0.79 |       |      |      |
REA2  | -0.05 |  0.47 |  0.01 | 0.62* |  0.05 |  0.81 |      |      |
REA3  | -0.02 |  0.03 |  0.33 |  0.07 | 0.61* |  0.07 | 0.79 |      |
RST2  |  0.00 | -0.08 |  0.09 | -0.06 |  0.07 | -0.21 | 0.08 | 0.90 |
RST3  |  0.10 |  0.05 |  0.15 |  0.13 |  0.23 |  0.04 | 0.14 | 0.04 | 0.94
Notes: The diagonal elements (in bold) are square roots of the AVE.
* > recommended 0.50

The correlations among all of the constructs are well below the 0.90
threshold, suggesting that the constructs are distinct from each other (Bagozzi
et al., 1991). Therefore, resistance and readiness are separate, distinct
constructs.
Table 5 presents the results of testing the discriminant validity of the
measurement scales for the larger sample of 145 respondents. The bolded elements in
the matrix diagonals, representing the square roots of the AVEs, are greater in all cases
than the off-diagonal elements in their corresponding rows and columns, providing
evidence of the discriminant validity of the scales (Chin, 1998; Fornell and Larcker,
1981). Four off-diagonal elements exceed the recommended 0.50 but are still acceptable
(0.52 to 0.62).
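A programmatic version of this Fornell-Larcker check simply compares each construct's square root of AVE against its correlations with every other construct. A minimal sketch, assuming the AVEs from Table 4 and the LV correlation matrix from Table 5 are already available as arrays:

```python
import numpy as np

def fornell_larcker_ok(ave: np.ndarray, lv_corr: np.ndarray) -> bool:
    """True if each construct's sqrt(AVE) exceeds all of its correlations
    with the other constructs (Fornell and Larcker, 1981)."""
    sqrt_ave = np.sqrt(ave)
    off_diag = lv_corr - np.diag(np.diag(lv_corr))   # zero out the diagonal
    # For construct i, every |correlation| in its row/column must be smaller.
    return all(sqrt_ave[i] > np.abs(off_diag[i]).max() for i in range(len(ave)))

# Illustration with two of the study's constructs: for RST2 and RST3,
# sqrt(0.80) = 0.89 and sqrt(0.88) = 0.94 both exceed their 0.04 correlation.
```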
Table 15 in Appendix C presents the factor loadings and cross-loadings for the
combined data (Time 1, Time 2, and Time 3 combined model of the 145-case sample).
These factor loadings and cross-loadings indicate good convergent and discriminant
validity with their respective corresponding (and non-corresponding) latent variable
constructs. All factor loadings on the underlying constructs equal or exceed the
recommended threshold of 0.70 (Chin, 1998), indicating good indicator reliability,
with the exception of Explained2 (0.691), which is acceptable. All cross-loadings are
significantly lower in magnitude than the corresponding factor loadings, with some
cross-loadings exceeding the recommended 0.50 (averaging 0.55, with only one above
0.6, at 0.65). None of these exceptions are cross-loadings between readiness and resistance.

Additionally, each item's factor loading on its respective construct is statistically
significant (p < 0.001). The latent constructs' item loadings and cross-loadings
presented in Appendix C, and their levels of statistical significance, serve to affirm the
convergent validity of these indicators as representing distinct latent constructs in the
research model.
Section 4.2 Data Analysis and Results
Table 6 shows the sample sizes in each group at each point, as collected and as
used in the data analysis, with a total of 145 at each time for the final data analysis. It
also shows the number of longitudinally matched respondent cases in each group (those
that responded to all three surveys) and the population size. The descriptive statistics of
each group show that the samples in each group at each point are representative of the
respective population.

Table 6 Sample Processing

Group   | Responses collected | Used (n=145 sample) | n=56 sample | Population
        | T1   T2   T3        | T1   T2   T3        | T1 to T3    |
Student | 115  252  134       | 86   102  87        | 18          | 1043
Staff   | 33   42   40        | 31   23   28        | 13          | 46
Faculty | 33   57   33        | 28   20   30        | 25          | 82
Male    | 50   82   43        | 36   37   32        | 13          | 354
Female  | 131  269  164       | 109  108  113       | 43          | 817
< 30    | 49   112  45        | 36   52   29        | 5           | Not available
30s     | 45   69   50        | 36   21   32        | 13          | Not available
40s     | 66   105  65        | 47   47   48        | 22          | Not available
> 50    | 31   65   47        | 26   25   36        | 16          | Not available
Total   | 181  351  207       | 145  145  145       | 56          | 1171

Records with greater than 10% missing data or Not Applicable (N/A) answers
were eliminated, along with outliers. Records with less than 10% missing were processed
with average field values substituted. Student responses contained more N/A answers,
resulting in more record eliminations during pre-processing. Almost half of the same
full-time faculty members answered at each survey point. Adjunct faculty members vary
in terms and length of service. Nine adjunct faculty members responded at Time 3, and
five of those responded at all three points. Hence, the faculty group, including the
adjunct faculty members, is a stable group, as is the staff group.
A sample size of 56 can be used in PLS, although a couple of the constructs have 12
manifest variable indicator items. These matched respondent surveys are a subset of
the larger sample and are analyzed to compare the relationships of linked surveys
completed by the same respondents to the larger sample, where each data point has
records arranged roughly chronologically. The university's staff, faculty, and students
range from 24% to 33% male. Over 44% of students are over the age of 25. No finer
divisions of demographics are available for the sample population. The descriptive
statistics for each group in Table 7 show that the samples in each group at each point are
representative of the respective population in the larger sample. The smaller matched
respondent group has fewer of the younger respondents, since it has a smaller proportion
of students that responded to all three surveys.
Table 7 shows the descriptive statistics of the construct measurements for each
group for the sample size of 56. Note that MOC2 for the total is higher than MOC1, as
data and comments collected during the first survey point of the study were collated and
analyzed. The summaries were sent to management to identify issues for strategic
action. By providing a feedback loop, the survey itself changed MOC in
later periods, which may have consequently affected REA and RST.
Table 7 Descriptive Statistics (n=56)

Group   | Construct | Time 1 | Time 2 | Time 3
Student | MOC       |  2.6   |  3.8   |  3.7
        | REA       |  3.2   |  3.6   |  3.8
        | RST       |  2.6   |  3.2   |  2.9
        | EUCS      |  4.0   |  4.2   |  4.2
Staff   | MOC       |  3.7   |  3.7   |  3.4
        | REA       |  4.0   |  4.1   |  3.7
        | RST       |  2.3   |  2.5   |  2.2
        | EUCS      |  3.1   |  4.2   |  4.0
Faculty | MOC       |  3.5   |  3.9   |  3.8
        | REA       |  3.8   |  3.8   |  3.8
        | RST       |  2.6   |  2.7   |  3.0
        | EUCS      |  3.1   |  3.9   |  4.1
Total   | MOC       |  3.3   |  3.8   |  3.7
        | REA       |  3.6   |  3.8   |  3.7
        | RST       |  2.5   |  2.8   |  2.8
        | EUCS      |  3.4   |  4.1   |  4.1
1 = Strongly disagree, 2 = Disagree, 3 = Neither agree nor disagree, 4 = Agree, 5 = Strongly agree

MOC3, as in the larger sample of 145, was rated slightly lower than MOC2,
with the largest drop for the staff group. The descriptive statistics of construct
measurements for the total sample of 145 in Table 3 are comparable to the average
values reported here in Table 7 for the matched group of 56. In general, the larger
sample's average values are equal to or greater than those of the smaller matched sample.
Noted differences are the average values for faculty at Time 3 in MOC, REA, and
EUCS, and at Time 1 in RST, which are lower than those in the larger sample.
It should be noted that at Time 1, similar to the larger sample, students as a
group tested higher in end-user satisfaction (4.0 vs. 3.1 for both staff and faculty), lower
in readiness (3.2 vs. 4.0 for staff and 3.8 for faculty), and greater in resistance (2.6 vs.
2.3 for staff and 2.6 for faculty). In the larger sample at Time 1, the students averaged
even greater in resistance, at 3.3.
Section 4.3 Hypothesis Testing
PLS testing revealed that at Time 1, significant differences existed
between students and the other two groups, according to equal-variance t-tests
comparing model path coefficients. Therefore, MOC1, REA1, and RST1
were not used in the inferential statistical analysis. At Time 2 and Time 3, no
significant differences existed among the three groups. Figure 3 represents the
relationships of the total 145-case sample at Time 1.

Figure 3 Results: at Time 1 (n=145)

MOC1 → REA1:  β = 0.43***  (R² for REA1 = 0.19)
MOC1 → RST1:  β = 0.45***  (R² for RST1 = 0.20)
MOC1 → EUCS1: β = 0.10
REA1 → EUCS1: β = -0.10
RST1 → EUCS1: β = -0.00    (R² for EUCS1 = 0.01)
Significance levels: * p < 0.05, ** p < 0.01, *** p < 0.001

Figure 4 represents the relationships of the corresponding sample at Time 2.

Figure 4 Results: at Time 2 (n=145)

MOC2 → REA2:  β = 0.62***  (R² for REA2 = 0.38)
MOC2 → RST2:  β = -0.06    (R² for RST2 = 0.004)
MOC2 → EUCS2: β = 0.37***
REA2 → EUCS2: β = 0.24
RST2 → EUCS2: β = -0.002   (R² for EUCS2 = 0.30)
Significance levels: * p < 0.05, ** p < 0.01, *** p < 0.001

Significant differences existed in the overall relationships between Time 1
and Time 2. The EUCS measured at Time 1 related to the old IS, which explained 20%
of the variability of RST1. EUCS1 was non-significant with respect to REA1, although
REA1 raised the opinion of how MOC1 was assessed. Therefore, MOC1, REA1, and
RST1 were not used in the inferential statistical analysis. At Time 2, RST2 had a
non-significant effect on EUCS2, but readiness exerted a positive effect on EUCS2 with
the new IS.
Figure 5 shows the results for Time 3. At Time 2 and Time 3, no significant
differences existed among the groups. As stated previously, prior to Time 1 students
were not aware of the introduction of a new system and had not received
communication concerning the change, although the staff and faculty had been notified.

Figure 5 Results: Time 3 (n=145)

MOC3 → REA3:  β = 0.61***  (R² for REA3 = 0.37)
MOC3 → RST3:  β = 0.23*    (R² for RST3 = 0.05)
MOC3 → EUCS3: β = 0.56***
REA3 → EUCS3: β = -0.02
RST3 → EUCS3: β = 0.03     (R² for EUCS3 = 0.31)
Significance levels: * p < 0.05, ** p < 0.01, *** p < 0.001

By Time 2, students were seeing benefits from the integration with the
course management system. By Time 3, students were benefiting from the student
portal to the student information system, where they could access grades, schedules,
financial information, gmail, and Blackboard log-in links (see Table 1, Time-line on
Implementation). There was no significant difference between early and late
responders at any survey time, so the results are assumed to be representative of the
population.

Table 8 Question Order

#  | Time 1 and Time 3                                                          | Time 2
22 | I like the CAMS system.                                                    | I find most changes with information systems pleasing.
23 | I am inclined to try new ideas in information systems.                     | I find most changes with information technology benefits the organization.
24 | Changes with information systems tend to stimulate me.                     | I am inclined to try new ideas in information systems.
25 | Changes with information systems often help my performance.                | Changes with information systems tend to stimulate me.
26 | Most coworkers will benefit from CAMS.                                     | Changes with information systems often help my performance.
27 | I usually support new ideas in information systems.                        | I usually support new ideas in information systems.
28 | Other people think I support the change to CAMS.                           | Other people think I support the change to CAMS.
29 | I find most changes with information systems pleasing.                     | I often suggest new approaches in information systems.
30 | I often suggest new approaches in information systems.                     | I like the CAMS system.
31 | I usually benefit from change in information systems.                      | I usually benefit from change in information systems.
32 | I will benefit from the CAMS system.                                       | I will benefit from the CAMS system.
33 | I find most changes with information technology benefits the organization. | Most coworkers will benefit from CAMS.
34 | I intend to support the change to CAMS.                                    | I intend to support the change to CAMS.
(The Time 1 and Time 3 surveys used the same item order.)

As seen in Table 8, some items' order was varied during the Time 2 survey
from the order of the first and last surveys, with no apparent effect on results.
The results of the PLS model are presented in Figure 6. The research model
presented as Figure 1 was tested in a path-analytic framework using the segregated data
for Time 1, Time 2, and Time 3. The data from Time 1 are represented in the
EUCS1 construct (user satisfaction with the existing old system). The data from
Time 2 and Time 3 are represented using the complete research model (Figure 2).

Figure 6 PLS Results of Full Model Testing for n=145

Time 1:  EUCS1 → MOC2:  β = 0.10     (R² for MOC2 = 0.01)
Time 2:  MOC2 → REA2:   β = 0.62***  (R² for REA2 = 0.38)
         MOC2 → RST2:   β = -0.06    (R² for RST2 = 0.00)
         MOC2 → EUCS2:  β = 0.37***
         REA2 → EUCS2:  β = 0.24*
         RST2 → EUCS2:  β = -0.002   (R² for EUCS2 = 0.30)
         EUCS2 → MOC3:  β = -0.01    (R² for MOC3 = 0.01)
Time 3:  MOC3 → REA3:   β = 0.61***  (R² for REA3 = 0.37)
         MOC3 → RST3:   β = 0.23*    (R² for RST3 = 0.05)
         MOC3 → EUCS3:  β = 0.55***
         REA3 → EUCS3:  β = -0.02
         RST3 → EUCS3:  β = 0.03     (R² for EUCS3 = 0.31)
Significance levels: * p < 0.05, ** p < 0.01, *** p < 0.001

The hypotheses were evaluated by assessing the sign and significance of the
structural path coefficients using one-tailed t-tests. The sample size of 145 exceeds both
the minimum calculated for a power of 0.8 at the 0.95 confidence level and 0.5 effect size
(102) and the SmartPLS requirement of ten times the number of items measuring a
latent variable (120 in this study). SmartPLS does not calculate any goodness-of-fit
values. The amount of variance explained (R²) for the predicted endogenous latent
constructs was evaluated to assess the ability of the various proposed relationships to
provide a significant degree of explanatory power for each construct. The t-values were
assessed against threshold values of 1.98, 2.62, and 3.38 to determine the strength of the
various paths, as indicated in Figure 6 and Table 9 (Fisher and Yates, 1963).
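Those thresholds correspond to critical t-values for p < 0.05, p < 0.01, and p < 0.001 at roughly the degrees of freedom available here. A quick sketch with scipy reproduces them, assuming df = 120 purely for illustration:

```python
from scipy import stats

df = 120  # assumed degrees of freedom, roughly n minus estimated parameters
for p in (0.05, 0.01, 0.001):
    # Two-sided critical value of the t distribution at significance level p.
    print(p, round(stats.t.ppf(1 - p / 2, df), 2))
# -> 1.98, 2.62, and 3.37 (about 3.38 in the Fisher and Yates tables,
#    depending on the exact df used)
```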

Table 9 PLS longitudinal results (n=145)

Hypothesis | Path        | T2      | T3
H1 (-)     | EUCS1→MOC2  | 0.10    | N/A
H2 (+)     | MOC→REA     | 0.62*** | 0.61***
H3 (-)     | MOC→RST     | -0.06   | 0.23* OP
H4 (+)     | REA→EUCS    | 0.24*   | -0.02
H5 (-)     | RST→EUCS    | -0.002  | 0.03
H6 (+)     | MOC→EUCS    | 0.39*** | 0.56***
H7 (+)     | EUCS2→MOC3  | N/A     | -0.01
OP = hypothesis has a significantly positive relationship instead of the negative one as theorized
* p < 0.05  ** p < 0.01  *** p < 0.001

Satisfaction at Time 1 (EUCS1) has an insignificant impact on users' perception of
management of change effectiveness (MOC2) at Time 2, failing to support H1. At
Time 2 and Time 3, users' perception of management of change effectiveness
does have a significant positive impact on readiness (MOC2→REA2: β = 0.62, p <
0.001, R² = 38%; MOC3→REA3: β = 0.61, p < 0.001, R² = 37%), which supports H2.
MOC2 has an insignificant impact on resistance (RST2) at Time 2 and a small,
significant positive impact on resistance at Time 3 (MOC3→RST3: β = 0.23, p < 0.05,
R² = 5%), failing to support H3. At Time 2, readiness has a significant positive impact
on user satisfaction (REA2→EUCS2: β = 0.24, p < 0.05), which supports H4 during the
implementation, but at Time 3 the impact is insignificant, failing to support H4
post-implementation.
At both Time 2 and Time 3, resistance does not have a significant negative
impact on user satisfaction, failing to support H5. At Time 2 and Time 3, users'
perception of management of change effectiveness does have a significant positive
impact on user satisfaction (MOC2→EUCS2: β = 0.37, p < 0.001; MOC3→EUCS3:
β = 0.56, p < 0.001). Both are statistically significant, thus supporting H6. User
satisfaction from Time 2 has no impact on MOC3 in the larger sample of 145.
The method of assessing non-response bias is to compare early responders to
late responders of the survey. T-tests did not show significant differences at the 0.05
level of significance between early and late responders at each point, which
suggests that non-response bias is unlikely to be a serious concern.
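The non-response bias check is a straightforward two-sample comparison. A minimal sketch, with hypothetical arrays of construct scores split by response date (the variable names and values are invented for illustration):

```python
import numpy as np
from scipy import stats

# Hypothetical construct scores for respondents who answered before vs.
# after the follow-up email (values invented for illustration only).
early = np.array([3.8, 4.1, 3.6, 4.0, 3.9])
late = np.array([3.7, 4.2, 3.5, 3.9, 4.0])

# Standard independent-samples t-test; a non-significant p-value (> 0.05)
# suggests late responders resemble early ones, i.e., low non-response bias.
t_stat, p_value = stats.ttest_ind(early, late)
print(round(t_stat, 3), round(p_value, 3))
```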


Gefen, Straub, and Boudreau (2000) recommend that testing for a group effect
on a given model requires running the theoretical model on a sub-sample of one
group first and then running the same model with the sub-sample of the other group.
Hence, the sample of this study is sorted by group, segmented, and analyzed separately
to determine the relationship path coefficients and standard errors. Since the numbers of
staff and faculty are small, their descriptive statistics are similar, and they received
similar levels of communications and training during the change management, the staff
and faculty subgroups are combined into one group and compared to the student group.
The F-test ratio of the squared standard errors is used to determine whether to use the
formula for equal or unequal variance to calculate the t-value. The resulting t-values
show that there are no significant differences between the staff/faculty and student
groups in either Time 2 or Time 3, as calculated using the formula for unequal variance
(Fornell and Larcker, 1981). Hence, there is no evidence of a group effect as a control
variable. Neither is any significant difference detected between groups based on
age or gender in the 145-case sample.
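The group comparison just described can be sketched as follows: an F-ratio of the squared standard errors decides between the pooled (equal-variance) and unpooled (unequal-variance) form of the t-statistic for the difference between two path coefficients. The function below is a hedged illustration of that logic, loosely following the parametric style of PLS multigroup comparisons, not the dissertation's exact computation; its inputs are the path coefficient, standard error, and sample size of each group.

```python
import math
from scipy import stats

def compare_paths(b1, se1, n1, b2, se2, n2, alpha=0.05):
    """Compare one structural path across two groups (a sketch).

    An F-ratio of the squared standard errors selects the equal- or
    unequal-variance t-statistic for the difference b1 - b2.
    """
    f_ratio = max(se1, se2) ** 2 / min(se1, se2) ** 2
    f_crit = stats.f.ppf(1 - alpha, n1 - 1, n2 - 1)
    if f_ratio <= f_crit:
        # Pooled (equal-variance) standard error of the difference.
        sp2 = ((n1 - 1) * se1**2 + (n2 - 1) * se2**2) / (n1 + n2 - 2)
        se_diff = math.sqrt(sp2 * (1 / n1 + 1 / n2))
        df = n1 + n2 - 2
    else:
        # Unpooled form with Smith-Satterthwaite degrees of freedom.
        se_diff = math.sqrt(se1**2 + se2**2)
        df = (se1**2 + se2**2) ** 2 / (se1**4 / (n1 - 1) + se2**4 / (n2 - 1))
    t = (b1 - b2) / se_diff
    p = 2 * (1 - stats.t.cdf(abs(t), df))
    return t, p
```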
Additional work was done on the 56 respondents who took the surveys at all three
points, to compare the results to those of the total sample. It can be seen in Table 10
that the proportions of staff and faculty, the groups with more continuity, are higher
among the matched responders than in the main sample. The sample period extended
over 15 months. Each semester some students graduate and new students enter, with
the greatest difference expected between the Spring 2009 (Time 1) point and the Fall
2009 (Time 2) point. Many of the new students at the Time 3 point could not compare the
new system then in place to the older system and selected the N/A (Not Applicable)
answer (records with greater than 10% missing data or N/A were eliminated from the sample).
For the collective sample of 168 records from the 56 matched respondent cases,
the data also indicate that the measures are robust. Each latent variable has an AVE >
0.50, which exceeds the minimum threshold; AVE indicates the level of convergent
validity of each latent factor. The composite reliabilities (and Cronbach's alphas) all
exceed 0.90, indicating excellent internal consistency reliability of each block of
indicators for each latent construct. According to Fornell and Larcker, the square root
of each AVE should be higher than any corresponding cross-correlation with another
latent construct (i.e., the square root of the AVE should be greater than any
cross-correlation in the corresponding row and/or column). This requirement for
discriminant validity is met. The same qualities were verified for the individual points
of the matched respondent cases, recognizing that there was less power.
Figure 7 PLS Results of Full Model Testing for n=56 (Matched Respondent Cases)

Time 1:  EUCS1 → MOC2:  β = 0.22     (R² for MOC2 = 0.05)
Time 2:  MOC2 → REA2:   β = 0.43***  (R² for REA2 = 0.18)
         MOC2 → RST2:   β = -0.25    (R² for RST2 = 0.06)
         MOC2 → EUCS2:  β = 0.37*
         REA2 → EUCS2:  β = 0.39**
         RST2 → EUCS2:  β = 0.00     (R² for EUCS2 = 0.31)
         EUCS2 → MOC3:  β = 0.46***  (R² for MOC3 = 0.21)
Time 3:  MOC3 → REA3:   β = 0.60***  (R² for REA3 = 0.36)
         MOC3 → RST3:   β = 0.01
         MOC3 → EUCS3:  β = 0.48**
         REA3 → EUCS3:  β = 0.10
         RST3 → EUCS3:  β = -0.04    (R² for EUCS3 = 0.30)
Significance levels: * p < 0.05, ** p < 0.01, *** p < 0.001

In order to maintain consistency, two readiness construct items were deleted
from all analyses ("I find most changes with information technology benefits the
organization" and "I find most changes with information systems pleasing") due to
factor loadings below 0.50 at Time 1 in the matched respondent group of 56.
The path coefficients and significance levels of the 145-case sample in Figure 6
and the results of the matched respondent cases in Figure 7 are compared in Table 10
below. The 56-case sample is too small to test demographic groups in the matched
respondent cases. The MOC2→REA2 relationship is strong and significant in both
samples, although the larger sample explains 38% of the variability of REA2 where the
smaller sample explains only 18%. The REA2→EUCS2 relationships are significant at
Time 2, with the smaller sample somewhat stronger. Neither sample testing the full
model was significant at Time 3 for REA3→EUCS3.
The MOC→EUCS relationship at Time 2 and Time 3 is strong and, similarly,
the models explain 30% of the variability of EUCS2 and 31% of EUCS3.
EUCS1→MOC2 was not significant in either of the samples, because EUCS1 refers to
the old information system and does not appear to increase users' opinions of how well
the change had been managed up to the Time 2 point. One difference between the
results is that the matched respondent cases in Figure 7 indicate a significant effect of
EUCS2→MOC3. The smaller sample has a greater portion of the stable workforce,
whose comments and concerns are expressed not only through survey comments but
also through other venues supplying feedback on issues during the implementation.
This access to offer input would increase users' perception of management of
change effectiveness.

Table 10 Comparison of PLS Results

           |             | n=145 sample        | n=56 sample
Hypothesis | Path        | T2       | T3       | T2      | T3
H1 (-)     | EUCS1→MOC2  | 0.1      | N/A      | 0.22    | N/A
H2 (+)     | MOC→REA     | 0.62***  | 0.61***  | 0.43*** | 0.60***
H3 (-)     | MOC→RST     | -0.06    | 0.23* OP | -0.25   | 0.01
H4 (+)     | REA→EUCS    | 0.24*    | -0.02    | 0.39**  | 0.1
H5 (-)     | RST→EUCS    | 0.01     | 0.03     | 0.01    | -0.04
H6 (+)     | MOC→EUCS    | 0.39***  | 0.56***  | 0.37*   | 0.48***
H7 (+)     | EUCS2→MOC3  | N/A      | -0.01    | N/A     | 0.46***
OP = hypothesis has a significantly positive relationship instead of the negative one as theorized
* p < 0.05  ** p < 0.01  *** p < 0.001

New students at the Time 3 point were familiar only with the systems then in
place, and (for those respondents who did not use the N/A answers) their MOC3
responses would not be related to the EUCS2 readings of the prior period. In the
smaller sample, we do not see the unexpected positive relationship between users'
perception of management of change effectiveness and resistance at Time 3.


The question of whether readiness and resistance are opposite ends of one scale
has been raised. Some confusion may result from researchers reasoning about
resistance without measuring it directly: innate resistance to change, lack of
involvement in the change process, lack of management support, poor system quality,
and lack of designer-user interaction have all been identified as factors causing
resistance (Hirscheim and Newman, 1988). Those factors are elements of the MOC and
EUCS measures in this study, which posits that MOC increases readiness and directly
increases EUCS. The RST instrument measures innate resistance to change.
The MOC instrument measures the opinion of how well this change has been managed;
RST measures an inherent attitude of resisting changes of methods and interaction.
Although the average MOC data was qualitatively slightly lower at Time 3, average
RST was slightly higher and average REA was lower. Readiness at Time 3 looks toward
future change and no longer considers the completed CAMS implementation.
In an effort to interpret this data qualitatively, the average values of each of the
individual indicator items for all the latent constructs were sorted by numeric value
(all items were rated on a 1-to-5 Likert scale). The top ten ratings in the best and worst
areas were selected for attention. Examination of the sorted item scores indicates that,
qualitatively, the best ratings were EUCS items at Time 2 and Time 3 (all items
averaged between 3.9 and 4.2). The best ratings for the RST item scores in Table 11 for
the three points are those with the lowest values. All three of the RST items with
the lowest average values fall in Time 1.
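The sorting step can be illustrated with a short Python sketch; the item names and means below are placeholders for the full set of indicator averages, not the study data.

# Hypothetical subset of item averages on the 1-5 Likert scale.
item_means = {
    "ECurrent2": 4.2, "EClear2": 4.2, "ETimely2": 4.2,
    "LikeCams3": 3.2, "MOCTraining1": 3.0, "RReports_1": 2.8,
}

ranked = sorted(item_means.items(), key=lambda kv: kv[1], reverse=True)
best, worst = ranked[:10], ranked[-10:]  # top/bottom ten in the full data
print("Best-rated items:", best)
print("Lowest-rated items:", worst)
# For the RST construct "best" means lowest, so it is ranked ascending instead.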

The highest average-scored items for MOC and REA qualitatively suggest the
areas where the change process went well, as seen in Table 11 below. The highest seven
ratings for REA items and the highest three ratings for MOC items were all at Time 2.
Table 11 Best Item Scores on EUCS/RST/MOC/REA

Best EUCS            Best RST (lowest)        Best MOC/REA
ECurrent2     4.2    RData_3          3.2     REAPeerBenefit2     4.2
ECurrent3     4.2    RInteraction_3   3.2     REASupportCams2     4.2
EAccurate3    4.2    RInteraction2    3.0     REASupportive2      4.2
EClear2       4.2    RData2           3.0     MOCDesignee2        4.1
EAccurate2    4.2    RInteraction_1   3.0     REABenefitCams2     4.1
ESufficient2  4.2    RReports2        3.0     MOCReaction2        4.1
EPrecise2     4.2    RMethods2        3.0     MOCAssistance2      4.1
ETimely2      4.2    RData_1          2.9     REABenefitUsually2  4.0
EContent2     4.2    RMethods_1       2.9     REAHelp2            4.0
EAccuracy3    4.2    RReports_1       2.8     REASuggestions2     4.0

The lowest average item values for the EUCS/MOC/REA constructs qualitatively
indicate areas needing improvement (Table 12). The lowest ratings in Table 12
contained five MOC items, two REA items, and the user-friendliness of the old IS at
Time 1, plus one REA item and "Like CAMS" at Time 3. Interpretation of the lowest
item scores for MOC/REA is aided by the item descriptions.
Since students were not included in communications in the first period, yet
contributed over half of the responses at Time 1, it is understandable that the five users'
perception of management of change effectiveness items and the two readiness items
fell at Time 1. It is puzzling that after the implementation, at Time 3, "I like CAMS"
rates among the lowest readings while system satisfaction rates among the highest
scores (although lower than EUCS2 at Time 2) qualitatively.

Table 12 Lowest Item Scores on MOC/REA

Worst MOC/REA     Score   Survey Item
MOCResources1     3.5     Sufficient resources were available to support the change.
REATryNew3        3.5     I am inclined to try new ideas in information systems.
MOCFair1          3.5     People affected negatively by the change were treated fairly.
EUCSFriendly1     3.5     Is the system user friendly?
REAStimulate1     3.5     I find most changes with information systems pleasing.
MOCAffected1      3.4     Those affected by change had ample opportunities for input.
REASuggestions1   3.4     I often suggest new approaches in information systems.
MOCInformed1      3.4     The organization kept everyone fully informed during change.
LikeCams3         3.2     I like the CAMS system.
MOCTraining1      3.0     I received adequate training in using CAMS.

Further information can be gained by viewing the survey comments, which were
sent to management with no identifying information after each survey period was
completed. The categories of comment from the three surveys can be seen in Table 13,
grouped by those deemed positive, negative, or neutral ("no comment," "not
applicable," or "none"). At Time 1, few comments were submitted that contained
specific problems (144 neutral, 29 positive, 10 with specific content). Much more
content was contained in the comments collected at Time 2 and Time 3. Some of the
comments given to management present areas of concern that needed further action
even after the implementation, whether through communication, technology support, or
further development by the Blackboard course management platform or the CAMS
supplier. The categories designated as BB problems, Log-ins, CAMS function,
Training, Reports, User-friendly, and Stress listed below represent the users' areas of concern.
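A minimal Python sketch, with invented example records, illustrates how each comment could be grouped by period and sentiment before being summarized in Table 13; this is an illustration of the tally, not the study's actual coding procedure.

from collections import Counter

NEUTRAL = {"no comment", "not applicable", "none"}
comments = [  # (period, category, sentiment, text) -- illustrative records only
    ("Time 2", "Log-ins", "negative", "Too many clicks to reach email."),
    ("Time 3", "Grades", "positive", "Easy to view grades online."),
    ("Time 1", None, "neutral", "none"),
]

tally = Counter()
for period, category, sentiment, text in comments:
    if text.strip().lower() in NEUTRAL:
        sentiment = "neutral"
    tally[(period, sentiment)] += 1
print(tally)  # counts per survey period and sentiment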


With this additional information, it can be seen that, although most users may be
satisfied with the functionality of the CAMS system compared to the old systems
(the EUCS elements of timeliness, format, and accuracy), they may still have complaints
about perceived inconveniences such as the number of clicks for faculty to access
information or for students to access email and Blackboard inside the student portal.
At Time 3, two-thirds of the respondents added comments, split essentially
evenly between positive and negative in nature. Some positive comments concerned
the faculty's internet-based access to the student information system to enter grades
and to retrieve information needed for advising and registering students. Students made
positive comments about the ability to view their financial and grade information
online through the student portal.
Blackboard was integrated with CAMS, and the integration functioned well to
assign instructors and to add or withdraw students in Blackboard classes. Some students
complained that the new edition of Blackboard had issues with students being randomly
"kicked out" during online exams. Other comments highlighted the remaining
un-integrated areas of Advancement and Financial Aid, as well as the limited
"pre-programmed" reports, as issues for management to reconcile with the software
supplier, which had promised to develop any report needed. Some complaints of stress
were due to the timing of the implementation stages just as the school session started or
as students were taking midterm exams.


Table 13 Comment Summary

[Matrix of comment counts: Positive and Negative columns for each of Time 1,
Time 2, and Time 3, with Total, N/A comments, and % of Total rows. Comments
deemed neutral ("no comment," "not applicable," or "none") were counted as N/A:
142 at Time 1, 83 at Time 2, and 76 at Time 3. As a percentage of the total,
positive/negative comments ran 18%/3% at Time 1, 50%/25% at Time 2, and
40%/22% at Time 3.]

Categories: Grades(a), Log-ins(b), Advising(c), Stress/timing(d), Access(e),
Bookstore(f), BB problems(g), Reports(h), CAMS function(i), Student email(j),
User-friendly(k), Communication, Confusion, Like CAMS, Slower, Ok, Training.

a Grades -- faculty's ease to enter, and students' ability to view, grades, financials, and schedules.
b Log-ins -- students must log in again inside the portal to access email or Blackboard.
c Advising -- students' internet-based online access; negative comments concerned being unable to log into the system.
d Stress -- expressed due to the change or the timing of implementation stages.
e Access -- via internet to the Student and Faculty portals, CAMS, and Blackboard.
f Bookstore -- closed and went online.
g BB problems -- Blackboard mechanics/reliability/bugs; Blackboard was recently upgraded.
h Reports -- from the CAMS system for staff/faculty.
i CAMS function -- complaints about CAMS.
j Student email -- comments about the new student email; no change on faculty email.
k User-friendly -- comments about how user-friendly the new system is for the user.


CHAPTER V
Discussion and Implications of the Research
Section 5.1 Discussion
The results of the analysis illustrate the strength of the proposed model. While
some hypotheses, such as resistance to change negatively affecting end-user
satisfaction, were not supported, some key hypotheses were supported. This research
hypothesized that lower satisfaction with the old system would create a greater
discrepancy between the new and old systems, resulting in a more favorable users'
perception of management of change effectiveness for the new system. However, the
empirical results show that satisfaction with the old system has no impact on the
perception of how well the change to the new system has been managed. Armenakis
et al. (1993) stress that it is not enough to create discrepancy between the current and
desired end-state; the efficacy of organizational members regarding their ability to
implement the proposed changes to attain the new end-state must also be bolstered.
The users' perception of management of change effectiveness exerts a stable and
positive effect on readiness for change, as well as on end-user computing
satisfaction, both during and after an implementation. When users believe that
management has been fair, supported the change, communicated well, and provided
good training for the new system, they are more prepared for and satisfied with the
implementation (Deci et al., 1994; Gagne and Deci, 2005). Readiness for change
positively affects end-user computing satisfaction during the implementation but not
post-implementation. After the implementation, readiness, because it looks forward to
change, is no longer meaningful since the change is completed. However, the
evaluation of how well the change was managed through the implementation is still
relevant, as the strategies in change management (communication, training,
management support, and technology resource availability) are still contributing to the
ease of use and the usefulness of the new system. If dissonance had existed from
unrealized expectations, then high readiness at an earlier point and low satisfaction at a
later point would have occurred (Wirtz and Lee, 2003). This study did not see such
occurrences.
Qualitative data was collected demonstrating the management of change strategies.
Self and Schraeder (2009) prescribed effective, timely communications and training to
internalize the external motivation for an IS implementation. On this campus,
management communicated coming change phases, change status, and issue resolution
through email. Additionally, scheduled training, the establishment of a formal technical
support system, and management support of the new IS implementation were
communicated during the implementation. Due to faculty resistance to the course
management module of the new IS, Blackboard was retained and upgraded to allow
integration with CAMS. Confirmation that management received comment input from
the surveys and acted to resolve issues establishes the use of the model as a tool for
management of change during the implementation process. A selection of
qualitative data from these communications is presented in Appendix E.
Contrary to expectations, resistance does not negatively affect end-user
computing satisfaction either during or post-implementation. Resistance does positively
correlate with end-user satisfaction with the old system at the pre-implementation point.
Student participants, the larger portion of the Time 1 sample, made no use of the old
student information system and were satisfied with their email and Blackboard course
management systems. At Time 3, students tested lower in resistance and readiness, with
the same average satisfaction values as at Time 2. The students' satisfaction with the new
system could account for the lower readiness for further change. Responses of all three
groups had lower REA3 post-implementation, which may relate to further change rather
than to the CAMS implementation. It has been suggested that, if the ease of use
and the usefulness of the new system are rated high, then even a user with high
resistance to change may be more than satisfied with the new system. MOC does not
negatively affect RST: no significant relationship exists during the implementation.
Surprisingly, MOC has a significant positive relationship to RST post-implementation
in the larger sample, although it explained only 5% of the variability of RST3. This
positive relationship is not evident in the smaller matched sample. A possible
explanation is that users of the new system, after its implementation, have stabilized
their familiarity with it; hence, their inherent resistance to change (to another
system) has no clear connection with the change management conducted for the newly
implemented system. It has also been suggested that change management activities,
such as involvement, enable a user to take stances for or against certain aspects of the
new technology, and hence MOC may lead to both RST and REA.
Respondents at Time 3 evidently did not interpret the inconvenience of multiple
log-ins or clicks/windows as difficulty of use, since EUCS3 items still averaged between
4.0 and 5.0. This longitudinal study indicates that measuring resistance post-implementation
no longer represents the resistance attitude toward the already
implemented system. Hence, the resistance measure should either be deleted from the
survey or be more specifically defined.
This study contributes to the literature on how to successfully implement
technological change and how to adapt the change process for better success. The
comment collection is important to the interpretation of the data and, as input to
management, for highlighting areas of dissonance.
Section 5.2 Implications for Research
This study contributes to the literature on how to successfully implement
technological change, how to plan for change, how to measure and evaluate progress,
and how to acquire valuable interim inputs to adapt the change process for better IS
success.
Our findings have important implications for IS research. The revised
management of change research model based on our findings is shown in Figure 8
below. The users' perception of management of change effectiveness exerts a strong
effect on readiness for change and on end-user satisfaction. Data analysis indicates
that resistance plays no important role during or after the implementation, nor is there
any significant, consistent relationship between management of change and resistance,
so resistance is eliminated from the revised model in Figure 8. No significant
relationship exists between end-user satisfaction with the old system and users'
perception of management of change effectiveness during the implementation, so
pre-implementation satisfaction has also been eliminated from Figure 8. The feedback
loop from the end-user satisfaction measurement in an earlier period to the users'
perception of management of change effectiveness remains as a critical procedure that
allows adaptation of the change strategies, which in turn strengthens the positive
reinforcement between change perception and user satisfaction.
Figure 8 Modified Management of Change Research Model

[Path diagram: Users' Perception of Management of Change Effectiveness →
Readiness for Change (H2 [+]); Readiness for Change → End-User Satisfaction of
New System (H4 [+]); Users' Perception of Management of Change Effectiveness →
End-User Satisfaction of New System (H6 [+]); End-User Satisfaction of New System
→ Users' Perception of Management of Change Effectiveness (H7 [+], feedback loop).]

Resistance and readiness are not simply the reverse of each other. In the quality
assessment of the SmartPLS model, the correlations among all of the constructs are well
below the 0.90 threshold, suggesting that the constructs are distinct from each other
(Bagozzi et al., 1991). The users' perception of management of change has a strong
positive effect on readiness both during the implementation and post-implementation.
However, the users' perception of management of change effectiveness has no
consistent impact on resistance, especially during the implementation. Readiness has a
significant positive effect on end-user satisfaction during the implementation, but
resistance has no effect on satisfaction either during or post-implementation. Therefore,
readiness, rather than resistance, has the more prominent role in predicting IS success.
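For illustration, a brief Python sketch of the Bagozzi et al. (1991) check follows; the correlation matrix shown is hypothetical rather than the study's actual latent variable correlations.

import numpy as np

names = ["MOC", "REA", "RST", "EUCS"]
R = np.array([[ 1.00,  0.62, -0.06,  0.56],
              [ 0.62,  1.00, -0.20,  0.45],
              [-0.06, -0.20,  1.00,  0.05],
              [ 0.56,  0.45,  0.05,  1.00]])  # illustrative correlations only

# Constructs are treated as distinct when every inter-construct
# correlation stays below the 0.90 threshold.
i, j = np.triu_indices(len(names), k=1)
for a, b, r in zip(i, j, R[i, j]):
    verdict = "distinct" if abs(r) < 0.90 else "possible overlap"
    print(f"{names[a]}-{names[b]}: r = {r:+.2f} -> {verdict}")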
Section 5.3 Implications for Practice
This study makes a significant contribution to IS practice. It illustrates that, in
mandatory settings, organizations that decide to implement enterprise-wide systems
should not ignore the importance of management of change strategies. Contrary to what
has been suggested in the literature, management should focus on measuring and
enhancing users' readiness for change instead of on measuring and reducing users'
resistance to change. Strategies that communicate about the change, deal fairly with
users, demonstrate management support, supply technical availability, and provide
training can increase perceived ease of use and thereby positively affect both readiness
for change and user satisfaction.


Nelson (2003) avers the advantages of using the EUCS as a measure of the
success of newly implemented ERP applications. In addition to an overall assessment
of end-user satisfaction, the EUCS enables analysis of which aspects of the ERP
implementation effort are most problematic. The relative importance of areas of
dissatisfaction can be determined by comparing the magnitudes of the subscale path
coefficients. Managers could then focus on the factors that contribute most to
overall satisfaction in order to improve ERP system effectiveness.
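A short Python sketch, with invented coefficients, illustrates the ranking Nelson (2003) suggests; the subscale names and values are hypothetical.

# Hypothetical path coefficients of EUCS subscales on overall satisfaction.
subscale_paths = {"content": 0.31, "accuracy": 0.24, "format": 0.18,
                  "ease of use": 0.35, "timeliness": 0.22}

for subscale, beta in sorted(subscale_paths.items(), key=lambda kv: -abs(kv[1])):
    print(f"{subscale:12s} beta = {beta:.2f}")
# The largest coefficients mark the subscales that contribute most to overall
# satisfaction and are therefore the first candidates for managerial attention.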
The survey instrument in Appendix A is a tool that measures readiness for
change, user satisfaction, and the perception of how well change has been managed, and
that surfaces issues for resolution; it can assist management in the change process to
assure a successful IS implementation. Users' comments assist in the interpretation of
the data. Taking longitudinal measurements offers the opportunity for analysis and
valuable input to adapt MOC strategies to achieve a successful implementation. The
combined research model seeks to become what Davis et al. (1989) recommended: a
simple but powerful model of the determinants of user acceptance, with practical value
for evaluating systems and guiding managerial interventions.
Section 5.4 Limitations and Future Research
The study may be limited by its reliance on self-reports, which may be unduly
biased by a single cross-sectional test method. This limitation is balanced by multiple
samples collected at three points in time. The data were collected in the manner
developed and detailed here due to the imminent system implementation.

Since this study was conducted in a single academic setting during and after the
implementation of one integrated system, and since the study carefully addresses its
validity, the research results should be generalizable to other academic settings with
similar user groups and similar integrated IS implementations. Since the model itself is
not limited to academic IS or academic organizations, the research model is general
enough to study other integrated IT (such as ERP) systems with common change
management issues. But further research is recommended to confirm the validity of this
model in non-academic settings or with different IS implementations.
The new information system in this study is mandatory, and caution should be
exercised when generalizing results to users of voluntary systems. In mandatory
settings, use intention can be influenced by compliance requirements (Xue et al., 2009),
but the amount of actual use depends on the role, needs, and proficiency of the user.
Therefore, user satisfaction with the system is a better indication of system success
than use intention and actual use.
Although the impacts of the different factors in management of change on
satisfaction, resistance, or readiness have not been studied individually here, the use of
SmartPLS does allow the simultaneous examination of the different change management
factors' downstream impacts. Further research should be conducted to evaluate
the interactions or the relative importance of the change management factors to the
downstream constructs.


This research directly measures end-user computing satisfaction at points in
time during and after an IS implementation. It did not measure the dissonance between
what users may have expected and what they experienced. A user may have
higher-than-usual expectations of the new system and hence low satisfaction, even
though the system is implemented and operating as designed and delivers the promised
benefits. Further research can be conducted to study users' dissonance.
While some questions in the readiness instrument specifically refer to the
CAMS system, the four items in the resistance instrument of Bhattacherjee and Hikmet
(2007) touch more on the inherent attitude toward any change than on the CAMS
implementation. Further study should exclude resistance or use a more comprehensive
instrument for resistance, with items modified to refer specifically to the particular IS
implementation. A more comprehensive resistance instrument, consisting of a greater
number of manifest variables, would also increase the power of a SmartPLS analysis
for that construct. A more comprehensive instrument would not increase the minimum
required sample size for this model unless the number exceeded 12 manifest variables.
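As an illustration of the sample-size heuristic invoked here (the common PLS "10 times" rule), consider the following sketch; the function and its inputs are an assumption for illustration, not a SmartPLS feature.

def min_pls_sample(max_indicators: int, max_inbound_paths: int) -> int:
    # Minimum n is ten times the larger of (a) the largest indicator block
    # and (b) the largest number of structural paths aimed at one construct.
    return 10 * max(max_indicators, max_inbound_paths)

# With the 12-item EUCS block as the largest, the heuristic gives n >= 120,
# so enlarging the 4-item RST block is harmless until it exceeds 12 items.
print(min_pls_sample(max_indicators=12, max_inbound_paths=3))  # -> 120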
Section 5.5 Conclusion
The objective of this study is to understand the impact of users' perception of
management of change effectiveness on resistance to change, readiness for change, and
end-user computing satisfaction in an IS implementation. The results indicate that
end-user satisfaction, as a surrogate for IS success, can be increased by management of
change. The study draws attention to the role users' perception of management of
change effectiveness has in building user readiness and end-user satisfaction. Readiness
for change has a significant positive effect on end-user computing satisfaction during,
but not after, an implementation; resistance to change has no significant effect
on end-user satisfaction during or post-implementation. The study contributes to the IS
literature by providing a new perspective that complements the extant IS adoption as
well as change management research. Management strategies that raise users' opinions
of fairness, management support, technical availability, communication, and training
increase the users' readiness for change and their computing satisfaction with the
content, format, timeliness, accuracy, and ease of use of the new enterprise-wide
implementation.
Problems with user acceptance of new technologies can be overcome by
establishing mechanisms for user feedback (Heichler, 1995, p. 12). Change
management is an adaptive process. The survey instrument in Appendix A can be used
to gather inputs for management to identify issues faced before or during a change
process and to adapt management of change strategies for the purpose of increasing
users' readiness for change and ultimately enhancing end-user computing satisfaction.
By analyzing the magnitudes of the path coefficients, the relative importance of the
subscales of EUCS and MOC effectiveness can be assessed to determine which areas to
focus on in adapting the change process for greater ERP system effectiveness.
Comparing the average values of the MOC effectiveness item responses at a point in
time can indicate where efforts are needed (communication, fairness, management
support, technical support, or training): items rated lower than the average value mark
those areas, while items rated higher than the average value indicate successful strategy
areas. Additional qualitative comments offer more details to identify specific issues.
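A minimal Python sketch of this item-level diagnostic follows, using invented item means rather than the study data.

# Hypothetical MOC-effectiveness item means at one survey point.
moc_items = {"Training": 3.3, "Informed": 3.9, "Fair": 3.8,
             "Resources": 3.8, "MgmtSupport": 3.8}

overall = sum(moc_items.values()) / len(moc_items)
needs_work = [k for k, v in moc_items.items() if v < overall]
going_well = [k for k, v in moc_items.items() if v > overall]
print(f"mean = {overall:.2f}; focus on {needs_work}; reinforce {going_well}")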


REFERENCES
Abdinnour-Helm, S. F., Chaparro, B. S., & Farmer, S. M. (2005). Using the End-User
Computing Satisfaction (EUCS) instrument to measure satisfaction with a web
site. Decision Sciences, 36(2), pp. 341-365.
Aladwani, A. M. (2002). Organizational actions, computer attitudes, and end-user
satisfaction in public organizations: An empirical study. Journal of End User
Computing, 14(1), pp. 42-50.
Agarwal, R., & Prasad, J. (1999). Are individual differences germane to the
acceptance of new information technologies? Decision Sciences, 30(2),
pp. 361-392.
Armenakis, A., Harris, S., & Mossholder, K., (1993). Creating readiness for
organizational change. Human Relations 46(6), pp. 681-704.
Baard, P. P., Deci, E. L., & Ryan, R. M. (2004). Intrinsic need satisfaction: A
motivational basis of performance and well-being in two work settings. Journal
of Applied Social Psychology, 34, pp. 2045-2068.
Bagozzi, R., Yi, Y., & Phillips, L. (1991). Assessing construct validity in organizational
research. Administrative Science Quarterly, 36, pp. 421-458.
Baron, R. M., & Kenny, D. A. (1986). The moderator-mediator variable distinction in
social psychological research: Conceptual, strategic, and statistical
considerations. Journal of Personality and Social Psychology, 51, pp. 1173-1182.
Baroudi, J. J., & Orlikowski, W. J. (1988). A short-form measure of user information
satisfaction: A psychometric evaluation and notes on use. Journal of
Management Information Systems, 4(4), pp. 44-59.
Benn, S., & Baker, E. (2009). Advancing sustainability through change and innovation:
A co-evolutionary perspective. Journal of Change Management, 9(4), pp. 383-397.
Bentley, R. (2005). Get with the program. Financial Management (London), May 2005,
pp. 19-22.
Bhattacherjee, A., & Hikmet, N. (2007). Physicians' resistance toward healthcare
information technology: A theoretical model and empirical test. European
Journal of Information Systems, 16, pp. 725-737.


Bikson, T. K., Gutek, B. A., & Mankin, D. A. (1987). Implementing computerized
procedures in office settings: Influences and outcomes (Report No. R-3077-NSF).
The Rand Corporation, Santa Monica, CA.
Bonvillian, G. (1997). Managing the messages of change: Lessons from the field.
Industrial Management, 39(1), pp. 20-25.
Caldwell, S. D., Herold, D. M., & Fedor, D. B., (2004). Toward an understanding of the
relationships among organizational change, individual differences, and changes
in person-environment fit: A cross-level study. Journal of Applied Psychology
89(5), pp. 868-882.
Capaldo, Guido & Rippa, Pierluigi (2009). A planned-oriented approach for ERP
implementation strategy selection. Journal of Enterprise Information
Management 22(6), pp. 642-659.
Cassel, C., Hackl, P., & Westlund, A. H. (1999). Robustness of partial least-squares
method for estimating latent variable quality structures. Journal of Applied
Statistics, 26(4), pp. 435-446.
Chau, P.Y.K. (1996) An empirical investigation on factors affecting the acceptance of
CASE by systems developers, Information and Management 30(6), pp. 269-280.
Chin, W. W. (1998). The partial least squares approach to structural equation modeling.
In G. A. Marcoulides (Ed.), Modern Methods for Business Research. Lawrence
Erlbaum Associates, pp. 295-336.
Chin, W. W., Marcolin, B. L., & Newsted, P. R. (2003). A partial least squares latent
variable modeling approach for measuring interaction effects: Results from a
Monte Carlo simulation study and an electronic-mail emotion/adoption study.
Information Systems Research, 14(2), pp. 189-217.
Chin, W. W., & Newsted, P. R. (1999). Structural equation modeling analysis with small
samples using partial least squares. In R. Hoyle (Ed.), Statistical Strategies for
Small Sample Research. Sage Publications, Thousand Oaks, CA, pp. 307-341.
Chin, W., & Lee, M. (2000). A proposed model and measurement instrument for the
formation of IS satisfaction: The case of end-user computing satisfaction.
ICIS 2000 Proceedings, Paper 57, pp. 553-564.
http://aisel.aisnet.org/icis2000/57

Cooper, R.B. & Zmud, R.W. (1990) Information technology implementation research:
A technology diffusion approach. Management Science, 36(2), 123-139.


Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer
technology: A comparison of two theoretical models. Management Science,
35(8), August, pp. 982-1003.
Deci, E. L. (1972). Intrinsic motivation, extrinsic reinforcement, and inequity. Journal of
Personality and Social Psychology, 22, pp. 113-120.
Deci, E. L., Eghrari, H., Patrick, B. C., & Leone, D. (1994). Facilitating internalization:
The self-determination theory perspective. Journal of Personality, 62, pp. 119-142.
Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in
human behavior. New York: Plenum.
DeLone, W. H., & McLean, E. R. (2003). The DeLone and McLean model of
information systems success: A ten-year update. Journal of Management
Information Systems, 19(4), pp. 9-30.
Dent, E. B., & Goldberg, S. G. (1999). Challenging "resistance to change." The Journal
of Applied Behavioral Science, 35(1), pp. 25-42.
Dillon, A., & Morris, M. G. (1996). User acceptance of information technology:
Theories and models. Annual Review of Information Science and Technology,
31, pp. 3-32.
Doll, W. J., Deng, X., Raghunathan, T. S., Torkzadeh, G., & Xia, W. (2004). The
meaning and measurement of user satisfaction: A multigroup invariance analysis
of the end-user computing satisfaction instrument. Journal of Management
Information Systems, 21(1), pp. 227-262.
Doll, W. J., & Torkzadeh, G. (1989). A discrepancy model of end-user computing
involvement. Management Science, 35(10), pp. 1151-1171.
Doll, W. J., & Torkzadeh, G. (1988). The measurement of end-user computing
satisfaction. MIS Quarterly, 12(2), pp. 259-274.
Dworkin, G. (1988). The Theory and Practice of Autonomy. Cambridge: Cambridge
University Press.
Fisher, R. A., & Yates, F. (1963). Table III from Statistical Tables for Biological,
Agricultural, and Medical Research (6th ed.). Edinburgh: Oliver and Boyd.


Folger, R., & Skarlicki, D. (1999). Unfairness and resistance to change: Hardship as
mistreatment. Journal of Organizational Change Management, 12, pp. 35-50.
Fornell, C., & Bookstein, F. L. (1982). Two structural equation models: LISREL and
PLS applied to consumer exit-voice theory. Journal of Marketing Research, 19,
pp. 440-452.
Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with
unobservable variables and measurement error. Journal of Marketing Research,
18, February, pp. 39-50.
Gagne, M., Koestner, R., & Zuckerman, M. (2000). Facilitating acceptance of
organizational change: The importance of self-determination. Journal of Applied
Social Psychology, 30(9), pp. 1843-1852.
Gagne, M., & Deci, E. L. (2005). Self-determination theory and work motivation.
Journal of Organizational Behavior, 26, pp. 331-362.
Gefen, D., Straub, D., & Boudreau, M.-C. (2000). Structural equation modeling and
regression: Guidelines for research practice. Communications of the AIS, 4,
Article 7, pp. 1-79.
Hair, J., Anderson, R., Tatham, R., & Black, W. (1998). Multivariate Data Analysis
(5th ed.). Prentice Hall, Englewood Cliffs, NJ.
Harper, B., Slaughter, L. & Norman, K. (1997). Questionnaire administration via the
WWW: A validation & reliability study for a user satisfaction questionnaire.
QUIS http://www.lap.umd.edu/webnet/paper.html
Henry, J. W. (1994). Resistance to computer-based technology in the workplace:
Causes and solutions. Executive Development 7(1), pp.20-24.
Herold, D. M., Fedor, D. B., & Caldwell, S. D. (2007). Beyond change management: A
multilevel investigation of contextual and personal influences on employees'
commitment to change. Journal of Applied Psychology 92(4), pp. 942-951.
Heichler, E. (1995). Move to groupware sparks user resistance. Computerworld,
29(11), p. 12.
Hirscheim, R., & Newman, M. (1988). Information systems and user resistance: Theory
and practice. The Computer Journal, 31(9), pp. 398-408.
Holt, D. T.; Self, D. R.; Thal, A. E., Jr.; & Lo, S. W. (2003). Facilitating organizational
change: A test of leadership strategies. Leadership & Organization Development
Journal. Bradford, 24(5/6) pp. 262-273.


Holt, D. T., Armenakis, A. A., Feild, H. S., & Harris, S. G. (2007). Readiness for
organizational change: The systematic development of a scale. The Journal of
Applied Behavioral Science, 43(2), pp. 232-255.
Huber, F., Herrmann, A., Frederik, A. M., Vogel, J., & Vollhardt, K. (2007).
Kausalmodellierung mit Partial Least Squares: Eine anwendungsorientierte
Einführung. Wiesbaden: Gabler.
Hultman, K. E. (1995). Scaling the wall of resistance, Training and Development,
October: pp. 15-18.
Igbaria, M., & Parasuraman, S. (1989). A path analytic study of individual characteristics,
computer anxiety and attitudes toward microcomputers. Journal of Management,
15, pp. 373-388.
Igbaria, M., Zinatelli, N., Cragg, P. & Cavaye, A.L.M. (1997). Personal Computing
Acceptance Factors in Small Firms: A Structural Equation Model. MIS
Quarterly 21(3), pp.279-305.
Ives, B., Olson, M., & Baroudi, J. (1983). The measurement of user information system
satisfaction. Communications of the ACM, 26(10), pp. 785-793.
Jiang, J. J., Muhanna, W. A. & Klein, G. (2000). User resistance and strategies for
promoting acceptance across system types, Information & Management, 37(1)
pp. 25-41.
Johnson , B. & Christensen, L. (2008). Educational Research: Quantitative, Qualitative
and Mixed Approaches (Los Angeles, CA et al.,: Sage Publications).
Joshi, K. (1991). A model of users' perspective on change: the case of information
systems technology implementation. MIS Quarterly 15(2), pp. 229-242.
Judson, A. S. (1991). Changing behavior in organizations: Minimizing resistance to
change, Blackwell, Cambridge, MA.
Kemp, M.J., & Low, G.C., (2008). ERP innovation implementation model
incorporating change management, Business Process Management Journal 14(2),
pp. 228-242.
Kirner, R. J. (2006). Performance appraisal: A descriptive multiple case study of the
abolishment theory through the lens of intended purposes. Doctoral
dissertation, University of La Verne, UMI No. 3218805.
Kirkpatrick, D. L. (1985). How to manage change effectively: Approaches, methods, and
case examples. Jossey-Bass, San Francisco, CA.
Kotter, J.P. (1995). Leading edge: Why transformation efforts fail. Harvard Business
Review, March-April. pp.59-67.

Kwahk, K.-Y., & Kim, H.-W. (2007). Managing readiness in enterprise systems-driven
organizational change. Behaviour & Information Technology, 27(1), pp. 79-87.
Kwahk, K.-Y., & Lee, J.-N. (2008). The role of readiness for change in ERP
implementation: Theoretical bases and empirical validation. Information and
Management, 45(7), pp. 474-481.
Laerum, H., Karlsen, T. H., & Faxvaag, A. (2004). Use of and attitudes to a hospital
information system by medical secretaries, nurses and physicians deprived of the
paper-based medical record: A case report. BMC Medical Informatics and
Decision Making, 4:18.
Lauer, T. & Rajagopalan, B. (2003). Conceptualization of use acceptance and resistance
in system implementation research: A re-examination of constructs. Working
Paper November.
Lewin, K.(1951). Field Theory in Social Science, Harper & Row, New York.
Lohmöller, J.-B. (1989). Latent Variable Path Modeling with Partial Least Squares.
Heidelberg: Physica-Verlag.
Lyttkens, E. (1973). The fixed-point method for estimating interdependent systems
with the underlying model specification. Journal of the Royal Statistical Society,
A136, pp. 353-394.
Martins, L.L. & Kellermanns, F.W. (2004). A model of business school students
acceptance of a web-based course management system. Academy of
Management Learning and Education 3(1), pp. 7-26.
McHaney, R., Hightower, R., & Pearson, J.(2002). A validation of the end-user
computing satisfaction instrument in Taiwan. Information & Management.
Amsterdam: 39(6) pp. 503-507.
Nelson, D. (1990) Individual adjustment to information-driven technologies: A critical
review. MIS Quarterly; Mar 1990 14(1), pp. 79-99.
Nelson, K. (2003). Confirmatory factor analysis of the End-User Computing
Satisfaction instrument: Replication within an ERP domain. Decision Sciences,
July 1, p 11. Retrieved from internet site:
http://www.allbusiness.com/management/996262-1.html, November 20, 2009.
Nunnally, J.C., (1978). Psychometric Theory, McGraw Hill: New York.
Orlikowski, W.J. & Barley, S.R.(2001). Technology and Institutions: What can
research on information technology and research on organizations learn from
each other? MIS Quarterly 25(2). pp. 145-165.


Orlikowski, W. J., & Hofman, J. D. (1997). An improvisational model of change
management: The case of groupware technologies. Sloan Management Review,
38(2), pp. 11-21.
Palm, J.-M., Colombet, I., Sicotte, C., & Degoulet, P. (2006). Determinants of user
satisfaction with a clinical information system. AMIA Annual Symposium
Proceedings, 2006, pp. 614-618.
Parish, J. T., Cadwallader, S., & Busch. P. (2008). Want to, need to, ought to: employee
commitment to organizational change. Journal of Organizational Change
Management 21(1), pp. 32-52.
Parker, Judith (2009). Using informal networks to encourage change at BP. Strategic
Communication Management 13(1), pp. 24-28.
Piderit, S. K. (2000). Rethinking resistance and recognizing ambivalence: A
multidimensional view of attitudes toward an organizational change. Academy
of Management Review, 25(4), pp. 783-794.
Pikkarainen, K., Pikkarainen, T., & Pahnila, S. (2006). The measurement of end-user
computing satisfaction of online banking services: Empirical evidence from
Finland. The International Journal of Bank Marketing, 24(2/3), pp. 158-173.
Ringle, C. M., Wende, S., & Will, A. (2005). SmartPLS 2.0 (Beta). University of
Hamburg, Hamburg, Germany. Retrieved June 2010 from http://www.smartpls.de
Self, D. R., & Schraeder, M. (2009). Enhancing the success of organizational change:
Matching readiness strategies with sources of resistance. Leadership &
Organization Development Journal, 30(2), pp. 167-182.
Self, D. R. (2007). Organizational change - overcoming resistance by creating
readiness. Development and Learning in Organizations, 21(5), pp. 11-13.
Shang, S., & Su, T. (2004). Managing user resistance in enterprise systems
implementation. AMCIS 2004 Proceedings, Paper 23, pp. 148-153.
http://aisel.aisnet.org/amcis2004/23
Smith, P. C., Kendall, L., & Hulin, C. L. (1969). The Measurement of Satisfaction in
Work and Retirement. Chicago: Rand McNally.
Somers, T. M., Nelson, K., & Karimi, J. (2003). Confirmatory factor analysis of the
end-user computing satisfaction instrument: Replication within an ERP domain.
Decision Sciences 34(3), pp. 595-621.
Straub, D. W. (1989). Validating instruments in MIS research. MIS Quarterly, 13(2),
pp. 147-169.
SurveyGold [survey software]. Widely used and accessible at www.surveygold.com;
the university holds licenses for its use.
Szajna, B. (1996). Empirical evaluation of the revised technology acceptance model.
Management Science, 42(1), pp. 85-92.
Taylor, S., & P.A. Todd (1995). Understanding information technology usage: A test of
competing models. Information Systems Research 6(2), pp.145-176.
Urbach, N., & Ahlemann, F. (2010). Structural equation modeling in information
systems research using partial least squares. Journal of Information Technology
Theory and Application, 11(2), pp. 5-40.
Venkatesh, V., & Davis, F. D. (1996). A model of the antecedents of perceived ease of
use: Development and test. Decision Sciences, 27, pp. 451-481.
Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology
acceptance model: Four longitudinal field studies. Management Science, 46(2),
pp. 186-204.
Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of
information technology: Toward a unified view. MIS Quarterly, 27(3),
pp. 425-478.
Vollman, T.E. (1996). The Transformation Imperative: Achieving market dominance
through radical change, Harvard Business School Press, Boston, MA.
Wold, H. (1980). Model construction and evaluation when theoretical knowledge is
scarce: Theory and application of partial least squares. In Evaluation of
Econometric Models, pp. 47-74.
Xue, Y., Liang, H., & Wu, L. (2009). Punishment, justice, and compliance in mandatory
IT settings. Information Systems Research, published online before print
February 19, 2010, DOI: 10.1287/isre.1090.0266.


APPENDIX A
The University Pre-survey
Instructions
Study title: Management of change to lower resistance during information system
implementation
The email containing the link to this survey serves as your informed consent. By completing the
survey and submitting it you are giving permission to use the data in the study. By way of
appreciation for your help, you will be asked to enter your name and contact information for a
drawing for $100 at the end. Your name will not be stored with your data. It is only for the
drawing.
Pledge of Confidentiality
All information gathered in this survey will be kept completely confidential. You will not be
identified in any instance except to enter your name in an incentive drawing. Your data will not
be associated with your name.
This questionnaire is designed to assess the performance of the Information Systems (IS)
function in your organization. It includes those systems you use to enter, process, or access
reports. It also includes the systems used to support online/hybrid classes. The new system,
CAMS Enterprise (Comprehensive Academic Management System) is a completely
integrated, 100% web-based Academic Enterprise Resource Planning System for higher
education. As a user of information systems/technology, it is the performance of your IS
function that should be addressed here.
Please complete the survey before leaving the web site.
Claudia Pauline Ash, University Professor
The University
City, GA 31792
pash@xxxxu.edu email address

Answer questions as they relate to you.

Please provide the following (*required)
Email*

SECTION I. BACKGROUND INFORMATION
If you are both employee and student, please answer all questions from your viewpoint as an
employee, submit the survey, and return to the site again to answer all questions as a student.
You can then be entered twice for the drawing. For most answers, choose the answer most
applicable to you or fill in the blanks.

How would you describe your position in the university?
Freshman / Sophomore / Junior / Senior / Graduate Student / Full Professor /
Associate Professor / Asst Professor / Adjunct Professor / Other

Area of employment:
Auxiliary Support / Financial Aid / Registration / Administration / Instruction / Student / Other

Age group:
Under 20 / 20-29 / 30-39 / 40-49 / 50 or over

Gender:
Male / Female

Highest educational level:
High school graduate / 2 year degree / 4 year degree / Masters degree / Doctorate degree

SECTION II. SATISFACTION WITH THE CURRENT INFORMATION SYSTEM

Regarding the current information systems (fx Scholar, ACT, Response Plus, etc.), choose the
number that most closely indicates how often the statement is true:
1 = Almost never, 2 = Not usually, 3 = Sometimes, 4 = Mostly, 5 = Almost always

6. Does the system provide the precise information you need?
7. Does the information content meet your needs?
8. Does the system provide reports that seem almost exactly what you need?
9. Does the system provide sufficient information?
10. Is the system accurate?
11. Are you satisfied with the accuracy of the system?
12. Do you think output is provided in a useful format?
13. Is the output information clear?
14. Is the system user friendly?
15. Is the system easy to use?
16. Do you get the information you need in time?
17. Does the system provide up-to-date information?

SECTION III. ATTITUDE TOWARD CHANGE

The following questions ask about your attitude toward change in general and the change from
the current information systems to CAMS Enterprise (Comprehensive Academic
Management System). Indicate the extent to which you agree or disagree with the following
statements:
1 = Strongly disagree, 2 = Disagree, 3 = Neither agree nor disagree, 4 = Agree,
5 = Strongly agree, N/A = Not applicable

18. I don't want to change how I generate reports/retrieve information.
19. I don't want to change how I enter data.
20. I don't want to change the way I currently work.
21. I don't want to change how I interact with others.
22. I find most changes with information systems pleasing.
23. I find most changes with information technology benefits the organization.
24. I am inclined to try new ideas in information systems.
25. Changes with information systems tend to stimulate me.
26. Changes with information systems often help my performance.
27. I usually support new ideas in information systems.
28. Other people think I support the change to CAMS.
29. I often suggest new approaches in information systems.
30. I like the CAMS system.
31. I usually benefit from change in information systems.
32. I will benefit from the CAMS system.
33. Most coworkers will benefit from CAMS.
34. I intend to support the change to CAMS.
SECTION IV. CHANGE TO CAMS

The following statements ask you to assess how the university is preparing you for the move to
CAMS. Indicate the extent to which you agree or disagree with the following statements:
1 = Strongly disagree, 2 = Disagree, 3 = Neither agree nor disagree, 4 = Agree,
5 = Strongly agree, N/A = Not applicable

35. Sufficient notice was given to those affected by the change.
36. Those affected by the change had ample opportunities for input.
37. Sufficient resources were available to support the change.
38. I received adequate training in using CAMS.
39. All levels of management are committed to the change.
40. The organization kept everyone fully informed during the change.
41. People affected negatively by the change were treated fairly.
42. Assistance is readily available to help me with using CAMS.
43. An adequate explanation was given for why the change was necessary.
44. There is a designated person to contact for help on using CAMS.
45. When I request help with CAMS someone gets back to me quickly.
46. Management dealt quickly and effectively with surprises during the change.
SECTION V. EXPERIENCE INFORMATION

47. Approximately how long have you been in your current position? Type in years and months.
48. How long have you used the Basic Blackboard? Type in years and months.
49. How long have you used the current information system (fx, ACT, Response Plus, etc.)?
Type in years and months.
50. Number of hours of training you received on the current system of fx, ACT, Response Plus,
and Blackboard Basic Learning?
51. Number of hours per week you use the current information system.
52. How did the announcement that Blackboard would continue to be used affect your attitude
toward changing to CAMS? Please choose one and then explain why.
Explain:
53. Please comment on any job tasks that have improved or worsened with the change.

Thank you for participating in the survey. Please enter your name and phone number for the
$100 drawing.

Section III. Attitude Toward Change:

For the last survey, a clarification of the resistance answer values was added:
"The following questions ask about your attitude toward change in general
and the change from the old information systems to CAMS Enterprise
(Comprehensive Academic Management System, known to students as
Hawklink). Indicate the extent to which you agree or disagree with the following
statements: 1 = Strongly disagree, 2 = Disagree, 3 = Neither agree nor
disagree, 4 = Agree, 5 = Strongly agree, N/A = Not applicable. Agreeing with
the first four questions means you don't like change."


APPENDIX B
Table 14. Descriptive Statistics by Item (average responses)

Item/Group           Students   Staff   Faculty   Total

Time 1 (n=145)             86      28        31     145
Eprecise                  3.2     3.4       3.7
Econtent                  3.3     3.2       3.7
Eneeds                    3.9     2.9       3.1     3.5
Esufficient               3.2     3.2       3.7
Eaccurate                 4.1     3.2       3.4     3.8
Eaccuracy                 2.9     3.3       3.6
Eformat                   3.9     3.3       3.6
EClear                    3.9     3.3       3.4     3.7
Efriendly                 2.9     2.5       3.5
Eeasy                     3.1     2.7       3.5
Etimely                   3.9     3.1       3.2     3.6
Ecurrent                  3.2     3.6
Rreports                  3.2     2.2       2.2     2.8
Rdata                     3.3     2.2       2.4     2.9
Rmethods                  3.3     2.4       2.3     2.9
Rinteraction              3.3     2.7       2.5
LikeCams                  3.5     3.6       3.6     3.5
TryNew                    3.6     4.1       4.2     3.9
Stimulate                 3.3     3.8       3.5
Help                      3.5     4.1       3.7
Supportive                3.8     4.1       3.9
Opinion                   3.4     3.9       3.6
Suggestions               3.5     3.6       3.2     3.4
BenefitUsually            3.6     3.7
BenefitCams               3.6     3.9       4.1     3.7
PeerBenefit               3.4     4.1       4.2     3.7
SupportCams               3.5     4.3       4.3     3.8
MNotice                   3.5     4.1       3.9     3.7
mAffected                 3.4     3.5       3.5     3.4
mResources                3.5     3.5       3.5     3.5
mTraining                 3.1     2.9
mMgmtSupport              3.4     3.9       3.7     3.6
mInformed                 3.3     3.4       3.5     3.4
mFair                     3.4     3.6       3.5     3.5
mAssistance               3.6     3.8       3.9     3.7
mExplained                3.3     4.3       4.1     3.7
mDesignee                 3.5     3.9       4.1     3.7
mResponse                 3.4     3.6       3.7     3.5
mReaction                 3.4     3.7       3.6     3.5

Time 2 (n=145)            102      20        23     145
EPrecise2                 4.2     3.9       4.2     4.2
EContent2                 4.2     3.9       4.2     4.2
ENeeds2                   4.2     3.9       3.8     4.1
ESufficient2              4.2     4.1       4.2
EAccurate2                4.3     4.1       4.2
EAccuracy2                4.2     3.9       4.2     4.2
EFormat2                  4.2     4.1       3.8     4.1
EClear2                   4.3     4.2       3.9     4.2
EFriendly2                4.2     3.9       3.4
EEasy2                    4.2     3.3
ETimely2                  4.2     4.1       4.2
ECurrent2                 4.2     4.1       4.4     4.2
RReports2                 3.2     2.3       2.5
RData2                    3.3     2.3       2.5
RMethods2                 3.2     2.4       2.5
RInteraction2             3.2     2.8       2.7
LikeCams2                 3.6     3.5       3.6
TryNew2                   3.8     3.8       3.6     3.8
Stimulate2                3.8     3.7       3.9     3.8
Help2                     3.9     4.2       4.1
Supportive2               4.1     4.1       4.4     4.2
Opinion2                  3.9     3.9       4.3     3.9
Suggestions2              4.1     3.9
BenefitUsually2           4.3
BenefitCams2              4.4     4.3       4.1
PeerBenefit2              4.1     4.4       4.4     4.2
SupportCams2              4.1     4.5       4.5     4.2
Notice2                   3.9     3.7       3.9
Affected2                 3.8     3.8       3.3     3.7
Resources2                3.9     3.7       3.6     3.8
Training2                 3.4     3.2       3.3     3.3
MgmtSupport2              3.8     3.8       4.0     3.8
Informed2                 3.9     4.0       3.7     3.9
Fair2                     3.8     3.7       3.6     3.8
Assistance2               4.1     3.9       3.9     4.0
Explained2                3.7     4.2       4.1     3.8
Designee2                 4.2     4.2       3.8     4.1
Response2                 3.9     3.7       3.9
Reaction2                 4.1     4.1       4.1

Time 3 (n=145)             88      30        27     145
EPrecise3_1               4.2     4.1       4.2     4.2
EContent3_1               4.2     4.2       4.2
ENeeds3_1                 3.8     3.9       3.9
ESufficient3_1            4.2     4.1       4.1
EAccurate3_1              4.2     4.2       4.2     4.2
EAccuracy3_1              4.2     4.1       4.1     4.2
EFormat3_1                4.1     3.8       4.1
EClear3_1                 4.2     4.1       4.1     4.2
EFriendly3_1              3.7     3.7       3.9
EEasy3_1                  3.9     3.7       3.9
ETimely3_1                4.2     4.2       4.1
ECurrent3_1               4.3     4.2       4.1     4.2
RReports3_1               3.5     3.1       2.8     3.3
RData3_1                  3.4     3.1       2.9     3.2
RMethods3_1               3.4     3.3
RInteraction3_1           3.4     2.9       2.9     3.2
LikeCams3_1               3.3     3.1       3.2
TryNew3_1                 3.5     3.4       3.6     3.5
Stimulate3_1              3.5     3.8       3.8     3.6
Help3_1                   3.8     3.6       3.8     3.7
Supportive3_1             3.9     3.9       4.0     3.9
Opinion3_1                3.5     3.8       3.8     3.6
Suggestions3_1            3.6     3.7       3.4     3.6
BenefitUsually3_1         3.7     3.6       3.7     3.7
BenefitCams3_1            3.8     3.6       3.9     3.8
PeerBenefit3_1            3.9     3.8       4.0     3.9
SupportCams3_1            3.9     3.9       4.3     3.9
Notice3_1                 3.9     3.9       3.6     3.8
Affected3_1               3.8     3.3       3.3     3.6
Resources3_1              3.8     3.3       3.4     3.6
Training3_1               3.4     2.9       3.6     3.3
MgmtSupport3_1            3.6     3.8       3.9     3.7
Informed3_1               3.8     3.3       3.5     3.6
Fair3_1                   3.6     3.5       3.6     3.6
Assistance3_1             3.8     3.5       4.0     3.8
Explained3_1              3.6     4.0       3.9     3.7
Designee3_1               4.0     3.6       3.6     3.8
Response3_1               3.8     3.6       3.7     3.7
Reaction3_1               3.8     3.6       3.7     3.7

APPENDIX C
Table 15. Cross Loadings
Variable

EUCS1

EUCS2

EUCS3

MOC2

MOC3

REA2

REA3

RST2

RST 3

EAccuracy

0.92

-0.01

0.07

0.03

-0.10

-0.07

-0.03

0.01

0.08

EAccurate

0.90

0.00

0.09

0.05

-0.08

-0.10

-0.04

0.06

0.08

EClear

0.87

0.02

0.04

0.05

-0.16

-0.05

-0.09

0.02

0.06

EContent

0.90

0.04

0.08

0.05

-0.07

-0.06

-0.01

-0.04

0.13

ECurrent

0.89

0.08

0.09

0.04

-0.12

-0.03

0.01

-0.05

0.08

EEasy

0.78

0.10

0.06

0.01

-0.14

-0.05

-0.09

-0.03

0.07

EFormat

0.85

-0.03

0.04

0.02

-0.12

-0.06

-0.01

0.02

0.06

EFriendly

0.75

0.04

0.05

-0.04

-0.17

-0.11

-0.06

-0.06

0.09

ENeeds

0.88

-0.05

0.06

-0.02

-0.11

-0.08

-0.06

0.04

0.10

EPrecise

0.90

0.01

0.05

0.04

-0.05

-0.04

-0.03

-0.03

0.10

ESufficient

0.90

0.01

0.02

0.06

-0.07

-0.06

0.00

-0.01

0.10

ETimely

0.89

0.07

0.06

0.06

-0.12

-0.01

-0.02

0.00

0.10

EAccuracy2

-0.03

0.79

0.04

0.41

-0.07

0.37

0.01

-0.03

-0.02

EAccurate2

0.01

0.76

0.12

0.29

-0.06

0.33

-0.02

-0.04

-0.06

EClear2

0.01

0.86

0.00

0.41

-0.02

0.43

0.01

-0.08

0.01

EContent2

0.04

0.83

0.12

0.40

0.02

0.35

Table 15. Continued

Variable           EUCS1   EUCS2   EUCS3    MOC2    MOC3    REA2    REA3    RST2    RST3
(row cont. from previous page)                                       0.02   -0.06    0.04
ECurrent2           0.00    0.76    0.04    0.39   -0.05    0.41    0.03   -0.18   -0.01
EEasy2              0.04    0.74    0.12    0.46    0.03    0.37    0.04   -0.08    0.14
EFormat2            0.00    0.84   -0.06    0.42   -0.10    0.38   -0.06   -0.03   -0.10
EFriendly2          0.06    0.77    0.16    0.42    0.05    0.29    0.03   -0.02    0.15
ENeeds2             0.01    0.80    0.19    0.44    0.09    0.35    0.07    0.00   -0.04
EPrecise2           0.06    0.78    0.12    0.41    0.02    0.41    0.03   -0.03    0.05
ESufficient2        0.10    0.87    0.08    0.42    0.00    0.36    0.08   -0.04   -0.05
ETimely2            0.04    0.81    0.15    0.45    0.05    0.40    0.05   -0.13    0.04
EAccuracy3          0.02    0.11    0.88    0.11    0.49    0.01    0.24    0.08    0.10
EAccurate3          0.00    0.05    0.86    0.04    0.49   -0.03    0.28    0.04    0.09
EClear3             0.16    0.10    0.86    0.10    0.48   -0.05    0.32    0.06    0.08
EContent3           0.04    0.10    0.80    0.10    0.41    0.00    0.21    0.08    0.15
ECurrent3           0.01    0.13    0.83    0.12    0.49    0.06    0.34    0.07    0.09
EEasy3              0.05    0.03    0.82    0.04    0.45    0.06    0.32    0.13    0.06
EFormat3            0.11    0.04    0.79    0.16    0.42    0.00    0.20   -0.03    0.23
EFriendly3          0.02    0.02    0.79    0.07    0.51    0.03    0.31    0.08    0.05
ENeeds3             0.11    0.14    0.80    0.12    0.35   -0.01    0.17    0.03    0.10
EPrecise3           0.11    0.14    0.80    0.12    0.35   -0.01    0.17    0.03    0.10
ESufficient3        0.03    0.11    0.86    0.10    0.48   -0.03    0.30    0.17    0.19
ETimely3            0.10    0.16    0.81    0.13    0.43    0.05    0.31    0.08    0.12
MAffected2          0.03    0.35    0.03    0.84   -0.04    0.46    0.02    0.05    0.17
MAssistance2        0.17    0.44    0.23    0.79    0.08    0.46    0.05   -0.06    0.08
MDesignee2         -0.03    0.48    0.09    0.73    0.05    0.50    0.14    0.04    0.00
MExplained2        -0.07    0.28    0.04    0.69    0.21    0.47    0.19   -0.22    0.17
MFair2              0.10    0.39    0.07    0.72   -0.04    0.45   -0.07    0.04    0.03
MInformed2          0.07    0.43    0.13    0.87    0.08    0.55    0.09   -0.07    0.12
MMgmtSupport2       0.06    0.39    0.05    0.80    0.08    0.57    0.03   -0.12    0.11
MNotice2            0.04    0.41    0.08    0.85   -0.05    0.56    0.06   -0.03    0.12
MReaction2          0.09    0.48    0.13    0.85    0.02    0.53    0.05   -0.07    0.07
MResources2         0.03    0.34    0.11    0.80    0.01    0.39   -0.01    0.06    0.12
MResponse2          0.09    0.44    0.11    0.76    0.02    0.43    0.05   -0.12    0.03
MTraining2          0.04    0.40    0.06    0.73    0.09    0.43    0.07   -0.05    0.22
MAffected3         -0.11    0.02    0.45    0.04    0.80    0.00    0.48    0.07    0.13
MAssistance3       -0.02   -0.10    0.41    0.03    0.77    0.00    0.44   -0.01    0.26
MDesignee3         -0.04   -0.12    0.38    0.01    0.77    0.01    0.39   -0.02    0.33
MExplained3        -0.12    0.00    0.36    0.01    0.77    0.02    0.52    0.03    0.10
MFair3             -0.07   -0.01    0.38    0.06    0.75    0.06    0.48   -0.06    0.12
MInformed3         -0.17    0.04    0.43    0.04    0.81    0.04    0.48    0.02    0.16
MMgmtSupport3      -0.15    0.02    0.44    0.03    0.70    0.02    0.49    0.14    0.07
MNotice3           -0.15    0.02    0.44    0.03    0.70    0.02    0.49    0.14    0.07
MReaction3          0.03   -0.02    0.48    0.04    0.85    0.03    0.55    0.05    0.16
MResources3        -0.11    0.03    0.47    0.06    0.86    0.06    0.54    0.10    0.25
MResponse3         -0.01   -0.05    0.42    0.10    0.77    0.09    0.39    0.03    0.31
MTraining3         -0.02    0.13    0.46    0.02    0.79    0.04    0.46    0.10    0.15
BenefitCams2       -0.03    0.35    0.01    0.59    0.02    0.84    0.01   -0.25    0.07
BenefitUsually2     0.02    0.42    0.10    0.47    0.04    0.82    0.12   -0.22    0.13
Help2              -0.11    0.33    0.04    0.52    0.03    0.77    0.01   -0.18    0.09
Opinion2           -0.04    0.34   -0.11    0.45    0.00    0.81    0.03   -0.14    0.02
PeerBenefit2       -0.07    0.48    0.10    0.55    0.13    0.84    0.18   -0.18    0.01
Stimulate2         -0.04    0.38   -0.01    0.40    0.02    0.81   -0.01   -0.13   -0.07
Suggestions2        0.00    0.39   -0.02    0.53    0.03    0.80    0.03   -0.11    0.10
SupportCams2       -0.07    0.36    0.01    0.57    0.04    0.84    0.05   -0.23    0.03
Supportive2         0.03    0.39    0.01    0.47    0.01    0.81    0.02   -0.18   -0.08
TryNew2            -0.05    0.31   -0.04    0.40    0.03    0.74    0.04   -0.09   -0.02
BenefitCams3        0.02   -0.04    0.43    0.08    0.65    0.07    0.84    0.01    0.22
BenefitUsually3     0.04    0.04    0.26    0.05    0.44    0.01    0.83    0.02    0.15
Help3              -0.09    0.00    0.24    0.06    0.55    0.07    0.79    0.03    0.20
Opinion3           -0.08    0.03    0.21    0.06    0.43    0.04    0.75    0.11    0.07
PeerBenefit3        0.10    0.00    0.30    0.05    0.50    0.05    0.83    0.19    0.09
Stimulate3         -0.05    0.07    0.14    0.02    0.39    0.02    0.77    0.12   -0.02
Suggestions3       -0.09    0.06    0.15    0.01    0.38    0.01    0.72    0.07    0.12
SupportCams3        0.01    0.02    0.37    0.10    0.59    0.08    0.85    0.07    0.09
Supportive3        -0.06    0.09    0.14    0.08    0.37    0.09    0.78    0.02   -0.02
TryNew3            -0.03    0.07    0.14    0.04    0.32    0.04    0.74   -0.01    0.05
RData2             -0.01   -0.05    0.12   -0.05    0.09   -0.21    0.05    0.89    0.06
RInteraction2       0.03   -0.11    0.02   -0.06    0.02   -0.16    0.08    0.89   -0.02
RMethods2          -0.02   -0.05    0.15   -0.01    0.07   -0.24    0.07    0.90    0.10
RReports2          -0.04   -0.01    0.11   -0.09    0.08   -0.20    0.08    0.89    0.06
RData3              0.06    0.01    0.13    0.12    0.20    0.05    0.12    0.05    0.95
RInteraction3       0.12    0.02    0.17    0.09    0.25    0.02    0.14    0.02    0.93
RMethods3           0.11    0.03    0.11    0.12    0.19    0.02    0.12    0.04    0.95
RReports3           0.07    0.01    0.15    0.15    0.20    0.06    0.12    0.04    0.92
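For readers who wish to reproduce a cross-loading matrix such as Table 15, the sketch below shows the standard computation: each cell is the correlation between one survey item and one latent variable score. This is a minimal illustration, not the dissertation's actual analysis script; it assumes the item responses and the latent scores (as exported from a PLS tool such as SmartPLS) are available as CSV files, and the file names and column layout are hypothetical.

import pandas as pd

# Minimal sketch of a cross-loading computation (assumed inputs).
# "indicators.csv" holds the survey items (columns such as ECurrent2,
# MInformed3, RData2); "latent_scores.csv" holds the latent variable
# scores (EUCS1-EUCS3, MOC2, MOC3, REA2, REA3, RST2, RST3).
indicators = pd.read_csv("indicators.csv")
scores = pd.read_csv("latent_scores.csv")

# Cross-loading of item i on construct j = corr(item i, score j).
cross = pd.DataFrame(
    {lv: indicators.corrwith(scores[lv]) for lv in scores.columns}
).round(2)

print(cross)                   # the full cross-loading matrix
print(cross.idxmax(axis=1))    # each item's highest-loading construct

In a cross-loading check of discriminant validity, each item should load highest on its own construct; Table 15 shows this pattern, for example the RST3 items load between 0.92 and 0.95 on their own construct and no higher than 0.25 on any other.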

APPENDIX D: Survey Invitation Emails


Initial Invitation:
From: Pauline Ash
Sent: Thursday, February 26, 2009 11:08 AM
To: Adjunct Faculty; Everyone Email; Students
Subject: FW: Research survey request from Pauline Ash

Please complete the survey as soon as possible. The link to the survey is the bottom line of the
email.
Please consider taking this survey for me. It should take only 5 to 10 minutes and will enter
you in a drawing for $100. I will be repeating it later after CAMS (the new student information
system to be called Hawklink) is implemented for your reactions also. This data will be used as
baseline data in research for a Ph.D. dissertation. There will be a drawing for $100 for each
survey period. The drawing for this first round of surveys will be done on March 11th. Thanks so
much!
Follow-up Invitation
From: Pauline Ash
Sent: Monday, March 09, 2009 10:56 AM
To: Everyone Email
Subject: FW: Research survey request from Pauline Ash

Thanks to all of you that took the survey for me. I need a few more responses, especially from
the staff and faculty which are such small groups to begin with. Please do one for me if you
have not already done so. Your name will be entered for the $100 drawing which will be done
this Friday, March 13th to allow time for the added responses.
I appreciate your help.
11/11/2009 to all community
Subject: Coming Soon the Chance to Win $100 drawing for survey participants!
Now that the HawkLink student portal is operative and most of The University community has had exposure to part of the new student information system, you will soon be receiving an invitation to participate in a survey as part of the research for my doctorate work. Please consider taking part as a favor to me and for a chance at $100.
Thank you for considering this. The email invitation will contain a link to the survey.
Pauline Ash
From: Pauline Ash
Sent: Wed 11/25/2009 8:41 AM
To: Adjunct Faculty; Everyone Email; Students
Subject: FW: Research survey request from Pauline Ash
Many of you off-campus know me only through the Blackboard help number. I teach in the Business Division also.


During your Thanksgiving holiday, I would deeply appreciate it if you would take five minutes
to complete a research survey for me on the implementation of CAMS, Hawklink (student
portal), and integration of Blackboard. It is located at this link. Your comments will be
anonymous but your concerns will be passed on to administration.
Thank you,
Pauline Ash
12/8/2009 to community
Subject: Can you use an extra $100 for Christmas shopping? If you have not done my 5-minute survey, please do so now
I'll draw a name from those taking the 5-minute survey for $100. If you have not taken it yet,
please consider doing it for me.
I would deeply appreciate it if you would take five minutes to complete a research survey for
me on the implementation of CAMS, Hawklink (student portal), and integration of Blackboard.
It is located at this link. Your comments will be anonymous but your concerns will be passed
on to administration.
http://www.surveygoldplus.com/s/xxxxxxx.htm
Thanks,
Pauline Ash
Please complete the survey as soon as possible. The link to the survey is also at the bottom of the email.
By clicking on the link to the survey site, completing and submitting the survey, you provide consent to allow the data (i.e., survey, interview, and/or observation) collected by me for the purposes of the dissertation to be used in potential future research publications.

Thank you for your participation and assistance in this important step to collect data for my dissertation.
http://www.surveygoldplus.com/s/xxxxxxxx.htm
12/24/2009 to community

Subject: Thank you each and every one


To:

Everyone Email; Students; Adjunct Faculty

I appreciate each of you that completed the survey for me. I just wanted you all to know that
the $100 drawing went to xxxxxxxxxx, a student.
There will be another opportunity when the final survey data is collected during Spring
semester.
Wishing you each a Merry Christmas and Happy New Year.
Pauline Ash
Assistant Professor
From: Pauline Ash
Sent: Thursday, April 15, 2010 9:32 PM
To: Students; Everyone Email; Faculty Email; Adjunct Faculty
Subject: Research survey request from Pauline Ash
Please complete the survey for my research. The link to the survey is the bottom line of the
email.
Please consider taking this survey for me. It should take only 5 to 10 minutes and will enter
you in a drawing for $100. I am very grateful to each of you who help with this.
This is the set of data to be taken after CAMS (the new student information system to be called Hawklink) is implemented. This data will be used in research for a Ph.D. dissertation. There will be a drawing for $100 for this survey, just as we did for the two earlier surveys.

In addition, there will be an additional $100 drawing for those who have participated in each of the
three surveys as a reward for your loyalty and kindness. The drawings for this final round of
surveys will be done on May 1st. Realizing that students may not check their email after final
exams, I will call the phone number you enter on the last page of the linked form to notify the
winners and arrange delivery of your money.
Thanks so much!
Pauline Ash


Tuesday, April 27, 2010 6:08 AM


I know this week is extremely busy for each of you, but it is the last chance to get your input on
this research survey and I do not have enough responses yet. We need to do the drawing
Saturday before everyone scatters. If you have already done it, I appreciate it greatly. If you
have not, please take five minutes to complete it for me and express your opinion. This is the
last survey in the series for implementing all the changes with Blackboard, CAMS (Hawklink),
and email to integrate the information systems.
There will be two drawings for $100: one for everyone who responds to this survey and one
for everyone who has responded to all three surveys.
The link is at the bottom of the email and the issue about page advancing has been corrected.
Thank you,
Pauline Ash

Study title: Management of change to lower resistance during information system implementation
The email containing the link to this survey serves as your informed consent. By completing the survey and submitting it, you are giving permission to use the data in the study. By way of appreciation for your help, you will be asked to enter your name and contact information for a drawing for $100 at the end. Your name will not be stored with your data; it is only for the drawing.
Pledge of Confidentiality: All information gathered in this survey will be kept completely confidential. You will not be identified in any instance except to enter your name in an incentive drawing. Your data will not be associated with your name.
This questionnaire is designed to assess the performance of the Information Systems (IS) function in your organization. It includes those systems you use to enter, process, or access reports. It also includes the systems used to support online/hybrid classes. The new system, CAMS Enterprise (Comprehensive Academic Management System), is a completely integrated, 100% web-based academic Enterprise Resource Planning system for higher education. It includes the integrated student portal Hawklink. As a user of information systems/technology, it is the performance of your IS function that should be addressed here. Please complete the survey before leaving the web site. Claudia Pauline Ash, University Professor.
Answer questions as they relate to you. For most answers, check the boxes most applicable to you or fill in the blanks.


APPENDIX E: Timeline - Qualitative Data

1. 2/13/2009 Subject: RE: Blackboard Enterprise selected.

2. February 16, 2009 Subject: CAMS training this week; data migration by Wednesday, February 25th. The last day we can enter data into the FX system will be close of business Tuesday, February 24th.
We will be live with CAMS by the first week in March.
Discuss next configuration steps, follow-up and final requirements for going live, and ongoing support.

3. From: xxxxxxx
Sent: Tue 2/17/2009 5:17 PM
Subject: new integrated system and course management system news
Update on our progress with our new integrated student records system (CAMS). The migration of records from FX to CAMS is complete, and within two weeks we will go live with our new system, HawkLink. The primary benefit of the new system is that it will enable a smooth flow of information between Admissions, the Registrar, the Business Office, and Faculty Advisors without repeated re-entry of student data. We will also be able to register students for summer and fall in the new system beginning in April 2009. You will receive detailed instructions about the new system and how to
access information on March 20 (Reserved Friday meeting).

We have also made the important decision to retain Blackboard as our course management system. We are upgrading our Blackboard license to include integration with HawkLink and managed hosting through Blackboard. What this means is:
1) Blackboard and HawkLink will talk to one another: students will be automatically enrolled in the Blackboard course site; when a student withdraws, they will be withdrawn on your Blackboard roster; and you will be able to submit your final grades from your Blackboard gradebook; and
2) managed hosting means that our courses will be housed on Blackboard's servers rather than locally at TU, which brings greater reliability (24/7), system technical support, and virtually unlimited storage space for video uploads, etc.

The upgraded version of Blackboard with HawkLink will go live beginning in August
2009 (Fall semester). Prior to that transition, we will provide you with training about
the new features.
xxx
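The integration described in this email (automatic enrollment and withdrawal flowing from HawkLink into the Blackboard roster) can be pictured with a small illustrative sketch. The actual integration was vendor-provided; the record layout and function below are hypothetical, for exposition only.

from dataclasses import dataclass

@dataclass
class Enrollment:
    student_id: str
    course_id: str
    status: str  # "registered" or "withdrawn" in this sketch

def sync_rosters(enrollments, bb_roster):
    """Mirror HawkLink registration status onto a Blackboard course roster."""
    for e in enrollments:
        key = (e.student_id, e.course_id)
        if e.status == "registered":
            bb_roster.add(key)       # auto-enroll on registration
        elif e.status == "withdrawn":
            bb_roster.discard(key)   # auto-withdraw on drop

roster = set()
sync_rosters([Enrollment("s1", "BUS101", "registered"),
              Enrollment("s1", "BUS101", "withdrawn")], roster)
print(roster)  # set() -- the student was enrolled, then withdrawn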

4. 5/21/2009 Subject: Immediate Blackboard Course Clean-up

To migrate our Blackboard course shells/materials to the new Blackboard Enterprise 9.0 edition, which will now be hosted by Blackboard, we need your help eliminating unused, unnecessary course shells.

5. 5/28/2009 Subject: RE: Blackboard Managed Hosting


1) We are planning to migrate next week only the course shells that will support the courses on our Fall semester 2009 schedule. In August, we will migrate course shells for the Spring 2010 semester.
3) We could migrate faculty users, so that they can go into the new environment to work on Fall courses.

6. 6/16/2009
Subject: Blackboard Update Progress and Instructions
To: faculty
We are making great progress upgrading our Blackboard license to the newest version
9.0 and integrating our Blackboard sites with HawkLink so that students are enrolled in
our Blackboard courses as they register for courses.
Schedule of events: We are taking every precaution to retain past course files and are asking you to take some precautions as well. The following is somewhat detailed in an effort to answer questions.
Calendar of Events:

Any Day: Go into your Blackboard courses and save them to your computer or a flash drive.
July 1: In early July, documents introducing you to the upgraded version of Blackboard 9.0: one a video overview of new features; the other will highlight what has changed.
July 15: Instructions for logging into the new Blackboard platform. Each course shell with its previous content will be in the new environment. You can now work with the course shell to develop your courses for Fall semester. Students already registered for your course will already be listed in the Grade Center roster automatically.
August 3: All course shells/content from Summer courses will be exported and saved.
Also by August 3: Tutorial information for students to make them aware of the changes they will see and any different processes for using Blackboard site tools. We hope to have the HawkLink Student Portals up and running by the end of the summer; students would then simply log in to their TU student portal, and the specific Blackboard course sites for the courses in which they are enrolled will be linked there for them, not requiring any other login.
August 17: We will provide everyone with clear Expectations and Instructions for Managing, Protecting, and Retaining Blackboard Course Materials.


Late September: When the Spring 2010 course schedule is entered into HawkLink by the Registrar, new Blackboard course shells will be automatically generated. Watch for email updates and tutorial materials.

7. 8/14/2009


Subject: Good news!
You should now be able to log in successfully to the new Blackboard.

Proceed with course development. On Monday, make your course available to students.

8. 9/08/2009
Subject: Re: New Features
I agree training is important in helping people navigate the new system, and I have been doing so many group and individual trainings, and people say, "Ah, thanks, that is easy" or "That makes sense." I will work on incorporating your good ideas into trainings.
Thanks, xxxxxxxx

From: Pauline Ash


Subject: new features
Acceptance is increased by training that lets you see what you have gained.
I would be sure to include new features, problem areas, workarounds, good surprise
features.
Thanks for doing this. I think it will make a big difference.
Pauline Ash

9. 10/23/2009 Subject: Advising for Spring 2010


Registration for spring semester 2010 is November 2-5, 2009.
The new Student Portal will be up and running any day. Students will be notified when it goes live and will be able to access academic records and see course offerings from this portal.
The faculty portal will not be far behind! More news on this soon.

From: xxxxxxxx
Sent: Wed 12/2/2009 4:40 PM
To: Faculty Email
Subject: Distance Learning and Technology Questions - All call

At Faculty Senate today for a meeting with the President regarding the university's increasing emphasis and presence with online courses.


10. 12/3/2009 to all faculty


Subject: Important Required Password Information for Faculty


MONDAY, DEC. 7 at NOON. If you do not change your password by this time,
you will be locked out of your email and Blackboard and will be unable to process
final grades in Hawklink.

11. From: xxx xxxxxx


Sent: Fri 1/1/2010 9:19 PM
To: Everyone Email
Subject: IMPORTANT! New Technical Support HELP DESK
New Technical Support Help Desk

EFFECTIVE JANUARY 5, 2010, we are implementing a new Technical Support system for faculty and students with questions and concerns about Blackboard,
HawkLink Portals, and email access and functions. (Onsite hardware/software and
classroom technology will continue to be supported by Blough Tech.)
The first line of support for all students and faculty is the Technical Support Help
Desk. To reach the Help Desk, call the direct line 2xx-227-xxxx, or email
helpdesk@xxxu.edu. Technical specialists will assist callers during the following
peak hours:


Calls will be returned within 75 minutes during Help Desk open hours. During "off-peak" hours, calls/emails will be queued. Emergency technical support is still available
through xxxxx Tech.

The new Blackboard Administrator is xxxxxxxxxx, a computer science professor.


Only faculty may call the Blackboard Administrator at xxx-221-xxxx during normal
business hours or email xxxxx@xxxxu.edu for assistance with Blackboard function
questions, problems, or for assistance with integrating publisher materials. Students
must contact the Help Desk for assistance. The Academic Technology Specialist will
also continue to provide training and tutorials for students and faculty on how to use
Blackboard. She can be contacted at xxx-221-xxxx or xxxxx@xxxxu.edu.

12. 12/2/2009 to all faculty


Subject: Follow-up to Faculty Senate meeting
Dear Colleagues,
In the Faculty Senate meeting today, a question was raised regarding some apparent
"rumors" that may be circulating about the future of the traditional class model (face-toface). Please let me assure everyone that our core value of providing personal attention
to students has not changed. The traditional model will always remain an important
method of teaching and learning here. At the same time, we are fortunate to have such
an innovative and dedicated faculty that has taken the lead in developing such creative

and rigorous teaching models, from hybrid to fully online courses, in order to expand
our reach to a greater number of students. Our success, not only as teachers, but as a
university, depends upon our ability to continuously improve and utilize a variety of
teaching strategies in order to maximize our students' learning. For many students, a
traditional, face-to-face model provides the best learning environment. For others, a
fully online program has opened the doors to a college education that would not have
otherwise been possible. And, for a great many of our students, the hybrid model has
proven to be an ideal mix of the best of both worlds. This variety, coupled with your
strong commitment to quality and rigor, has been the foundation for our success and
growth in recent years.

If you have any questions regarding our academic mission, including current or future
plans, please do not hesitate to call or stop by my office. I would also be happy to
provide a more formal summary / update to the Faculty Senate if requested.
Thank you,
Vice President of Academic Affairs

13. 12/3/2009 To: Dissertation Committee


The update of the student information system is to improve infrastructure to support the
University's growing hybrid and online programs since their physical footprint is very
small. All the change in IT has been accompanied by additional training in conducting

and enriching class content aimed at the online areas but appropriate to any class. Some
faculty feel like they are not in the "know" on the direction the school is going and
worry that all courses will be forced online. Some are concerned for themselves and do
not want to teach online. Some for the students they may be forced into online. In
general, the campus is muddy and messy and pouring rain and everyone is grumpy. It is
the end of the semester and the natives are restless. The issue was taken to the steering
committee for faculty senate and raised during the meeting. The VP attends these
meetings and was immediately concerned and asking questions.
Item 12 is the VP's communiqué to faculty, written after returning to her office and thinking about it. She had received an email from me with the opinions from the first 170 or so surveys, with the qualitative comments on what was going well and what was not. I think this helped set the stage for yesterday. She did quite well in her email to reassure and express openness, etc. I think you will agree it was a good response as "management of change" goes when faced with this situation. When we come back in January, all the grounds and sidewalks should be restored, with paved parking again.

We were introduced yesterday to the faculty portal where we will enter grades online at
the end of next week. We can see our list of advisees, their grade record, our courses,
roster, see each student's advisor and major from the class roster. They are working on
the function to email from there and to approve registration online requested by the
student. A master course list with descriptions and prerequisites is there also. This is a major improvement from our paper system, in which we had to be on campus to use any IS, and it should affect the final survey results. I have 207 surveys so far; a couple are duplicate names. I have sent two blanket requests and one targeted to non-responding faculty and staff. I will send one last appeal before awarding the $100 at the end of exam week, before they all scatter for holidays, to attempt to get more students. This time I got more students and fewer faculty/staff, which is why I targeted them, since they are such a small group.

Probably more than you wanted to know, but the "laboratory" here is cooking in this interim period. The VP is receptive to using survey info to help manage change and solve problems to increase user satisfaction ... like the model.

Thanks to all of you.


Pauline Ash


APPENDIX F: Interviews

These interviews are with six key decision makers in the selection process for the ERPS to be implemented at The University. The interviews explore why the system is being changed, how the selection was made, whether the decision is supported, expected benefits, implementation plans, and any areas not covered in the theoretical model.
The University's migration progress has been slower than expected, and issues are developing with the ERPS supplier that will have to be mitigated by use of their contract technology firm.
Interview 1.
What would be the motivation to change from the old system?
A lot of the processes that we do are unnecessarily manual, and each is extremely time-consuming and extremely tedious, and it doesn't have to be that way. Reports that people need have to be written because the current system does not give us the information that we need. It would manage time better to have a system that does what we need it to do; man-hours are wasted trying to get information that should be a simple process. And we could be doing so much more if we had the aid of a better system.
The benefits that you anticipate from the new system are mostly from integration?
Through integration I see myself working more efficiently, see myself moving along and doing
more things because my time is not eaten up in doing things that have to be done manually.
Have you had any problem as far as accuracy and the generation of reports or the way they are
packaged? As far as FX, we have had problems in all those areas.
From the demonstrations that you have seen of the new system, do you think there will be any
improvement? I think there will be a great improvement.
Do you think there will be very much resistance to this change from people training on the new
system? I think that some people who are not really comfortable with computers might be more
nervous. I personally have never been a big fan of change, but I am looking forward to this. I
have no resistance to change in this area at all.


Other people that work here, do you think they will be resistant to changing from FX and
Blackboard? I think that, unfortunately, the current system we have has caused us so many
problems and has been such a big headache that I don't know anybody that is not going to be
excited about changing.
So they are so dissatisfied with the old system that almost anything would be better? What are
some of the things that you think the management should do to try to make a successful
implementation? Definitely make sure all employees are well trained on how to use the system
and retrieve information that they need. I think it is important since we are doing a migration
process to put in the correct information in the first place and that will be the key to success. It
will require a lot of man-hours just to get the information in there correctly.
Do you think they involved anyone in the exposure and selection of the program? I think that some years ago, when everyone was talking about systems, different people from different departments were invited to see whether other systems met our needs. I think they did a good selection and took key people to make the selection based on what our needs are.

Interview 2.
Some of the things that I would like to ask you, xxx: why did we consider getting a new
information system? Well, that's a big question. As we began to move our programs online, we
found we needed web-based accessibility for staff, students, faculty; that was one big issue, of
course. We've had problems with the integration of financial aid, Bursar, and the other
departments, alumni relations, etc. We've had a lot of difficulties. Our current system is not integrated for the most part. For example, if Admissions does a piece in the system, I have to literally transfer that information over into my module. Each area is a silo, sort of an island unto itself. So that's a problem. We've just had a lot of difficulties, and that's notwithstanding the problems we've had with corruption of data, loss of data. Reporting is a huge issue. As a matter of fact, I would say one of the other aspects that drove this whole project was institutional assessment: the Integrated Postsecondary Education Data System (IPEDS), Thompson Peterson, the College Board, and maintaining accreditation with our accrediting board, SACS. For all of these we needed timely data, accurate data, and FX was not doing that for us.

So it became very apparent that we needed an ERPS (Enterprise Resource Planning System): basically a system for unifying and integrating all the heterogeneous parts of our institution, of which there are many. That's kind of how it went. That's just my take on it.
Other factors driving it would be? So as far as customer satisfaction or user satisfaction with the old, unintegrated system, based on quality, format, content, appearance, ease of use, that would be? Very low. And that's a good point. Really, our department is very customer-service driven. We really want to be able to give people the information they need. But we haven't been able to without a huge amount of manual processing and dual entry. That's been a huge problem. Any report for institutional assessment requires a huge amount of manual work. That's primarily due to the shortcomings of our current system.
What do you anticipate with the new system? I am very optimistic, because it is integrated, because we won't have these islands or silos unto themselves. That was really what drew us to the product. The database is not separated or divided into all these components; it's one database. You make a change in a class and I can see that change. I make a change in registration, and you can see that change happening in real time as we make those changes. Currently I have to transfer a little bit of work that I've done over to the Bursar's module, and then they have to transfer it back over to my module.
So as far as efficiency and accuracy? Right, reducing redundancies, increasing efficiency; it is vital. Again, this is my opinion. I would also add the customer service part: in order to do our job right, we need a system that is accurate in reporting and allows all the departments to talk, and not have everyone isolated in their little area, their little chiefdom. That's real important. We've also had trouble with FX. The people are very responsive. XXXXXX and I went to Boston and worked with them on some of the problems. They were very vocal about wanting to fix the problems, but they couldn't fix the problems. That's one part of wanting this web-based accessibility, the ability to access the data outside the campus. We want our faculty and staff to be able to access it remotely. For example, Eurasia will not be able to come to campus. We have online offerings, our programs are getting larger online, and it's important.
So you need a much higher level of stable infrastructure? Exactly! And if you will recall, when we first implemented the FX portal we wanted to use the existing software and thought there was just a bug that could be worked out, so we worked with them on each of these issues. Basically, when we tried to implement the portal, we got everything ready; we thought we had solved all the problems. The programmers said everything is ready. Let's do it. Let's do some testing. Testing was important. So we tested with many faculty members, simulating registration. And boy, did the fur start to fly. It was a disaster. We started to recognize that although we were working really hard to try to make this program work, perhaps it was not advanced enough to be able to handle what we were asking it to do. So we started looking around. We did try to work problems out with FX. That lack of support has been a big issue; again, not that they weren't responsive, but they could not solve our problems.
So it was technology capability? Right. It was, in a sense, very antiquated technology; the software was developed in the 1980s. We were one of their larger customers, and we really pushed the software beyond its limits.
Once you put all this effort into transferring the data into the new system, are you going to trust the results? That's a good point. It's a big problem that I've been discussing with the CAMS people. We are concerned. The problem is that, due to the corruption of our data, some of the data we cannot migrate. For what we really need for all these reporting functions that are required for IPEDS and some of the things that I mentioned earlier, it is going to be difficult to get that past data into a format where we can do reporting. We are thinking of having the FX data put into an Access format and having the technical people run reports based on that data. We tell them what we need and then they get it out of Access. We will bring a limited selection of our data into the new system so that, as we have to bring new students in and bring old data in, we can put it into the proper format that will be required to generate the ad hoc reports that are needed, and also to meet the huge reporting need for our entire institution.
In other words, all the past history for The University would not be in CAMS? Say, if you graduated 10 years ago? It would only be for the more recent past? Yes and no. Say you have a current student who has a history going back to the '80s. All of that data will be in CAMS. We are going to have to go through very carefully to be sure that all of that data is properly migrated and that we got everything that we needed. But there are a lot of pieces that we won't have, because we've never reported on demographic data such as information on veterans' affairs, international students, that sort of thing. There are a lot of pieces that we will have to plug in as we go. If the student is current, all of their history should be included. Other students we will keep available for reporting in the Access piece, so that their history can be retrieved and entered into CAMS as needed. The final piece will be to scan all the old data and transcripts and put them into a format so that we can access them. If an old student comes back from a long time ago, you would re-enter the data into CAMS at that point? Exactly. We would make a new file for them in the system, and we would still have their hard file here or in secure records. There are a lot of logistics related to these questions that you are asking that we will have to be very cautious about to ensure the integrity of our information in the future. If we migrated 30,000 records from FX, it would be a mess for ages, so we are going to take a small subset. The president wants to get reports from Fall 2006 on, so we are going to make sure that we have that data for him. But that does not mean that we are going to just dump it all into CAMS. We are having to be very cautious, so that we are not migrating corrupt data and we are not migrating information that will not be useful for reporting. It's a lot more complicated than let's just transfer everything over from FX into CAMS.
You're drawing a line in the sand and saying that from this point forward, we are going to sift through the data to ensure the best data that we can import, and that will be our beginning? Exactly. The future will be very bright, and I have a lot of confidence in this application and the way they have designed it and the best practices that it will help us to implement here, but dealing with the past is going to be difficult at first, because we need to ensure that we have what we need and that we keep separate, aside from CAMS, the data we don't need but that just needs to be available in case it is needed in the future, so that we don't get this huge mélange or mess in CAMS. So that's going to be interesting.
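The selective migration strategy the interviewee describes (a clean subset from Fall 2006 forward goes into CAMS, while older records stay in an Access archive for ad hoc reporting) can be sketched as a simple routing rule. This is an illustration of the stated strategy only; the record layout and the exact Fall 2006 cutoff date are assumptions.

from datetime import date

CUTOFF = date(2006, 8, 1)  # assumed start of the Fall 2006 term

def route_record(record, cams, archive):
    """Route a student record to CAMS or to the reporting archive."""
    if record["is_current"] or record["last_term"] >= CUTOFF:
        cams.append(record)      # vetted, report-ready data enters the new system
    else:
        archive.append(record)   # kept retrievable (e.g., in Access), re-entered only if needed

cams, archive = [], []
route_record({"id": "s1", "last_term": date(2009, 1, 10), "is_current": True}, cams, archive)
route_record({"id": "s2", "last_term": date(1999, 5, 1), "is_current": False}, cams, archive)
print(len(cams), len(archive))  # 1 1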
You were instrumental in researching the correct packages and selecting which one was recommended to implement. Why did we choose CAMS? Interestingly, we chose from many good applications. It is hard to narrow down from literally hundreds of specifically tailored, academically based ERPS applications to something that works for The University. It was a committee involving a lot of people, not just me. There was me, zzz, nnn, fff. We started to get a few applications; we had demos to get a feel for the program. We wanted to see the program. Show us the program. We would talk to them on the phone, ask them questions, look at their demo, and then we gradually narrowed down the playing field to two contenders.

One was SunGard PowerCampus and the other CAMS. PowerCampus was included because Banner is pervasive in our region and it was a "baby Banner." As far as their approach, they would not allow us to view their application early like the other ones. They would not show us the program online. They would not answer a lot of our questions. They asked us a bunch of questions, which I found infuriating after a while. We can only answer so many questions without seeing their application. They got a bunch of information from us and came down for one demo, much to our frustration. Let us see the program online and talk through some of our problems. We wanted to share with them some of our issues and not just give them information so that they could come down and do a custom demo. We tried to be very open-ended and open-minded, but frankly, it came down to the fact that they could not answer the hard questions the way that CAMS did. That was the most important, crucial decision-making aspect of choosing the software. They couldn't answer the hard questions. With CAMS, literally, I would throw complicated reports at them and say, tell me how you would do this. The people would get online and they would talk about it and click around, find their reporting tool, and build the report and say, this is how you could do this. And we felt confident that it was accurate, whereas with SunGard we did not have that interaction. That is my view again. For me that was pivotal, because they could answer the hard questions. That's where we are: we live in the hard questions. How do we generate this report? Because they did that, that generated trust in the new system? Absolutely, to the point where we became a little enthusiastic about their software. We sort of took it through its paces. Here is a report I am having trouble with. How would you do this? They are clicking around and talking to people and say, here is how we can do it, and show us on the screen. We appreciated that approach. That was much better than asking us a bunch of questions, showing up on campus, and giving us a dog and pony show.
And were more than just the committee invited to view the demos? I know faculty and staff were invited if they wanted to be; the Bursar helped with the selection process, Angela for Financial Aid, and Advancement was there some for the alumni relationship aspect. The people that were making the choice wanted everybody in on the decision. I wish the response would have been greater, but I think we had a lot of people involved. If it turns out to be a disaster ... you got that on tape, so ...
What do you think are going to be some of the most important factors as far as getting a successful implementation? Wow! I think it's a little early, because there are so many questions that I have. I want to get in there and play and want to start learning the program so that I can assist with the actual implementation. My biggest concern right now is the whole migration process. Everything is sort of waiting for technology to complete the data transfer.
Let's say that we can check that one off and we get good data migration. As far as implementing the whole process over the whole campus, what would you think, if you were responsible for implementing, would be the key things that have to be in place and can affect a successful implementation and usage a year from now? Well, first I hope we have faculty and staff buy-in. I feel that we do. I hope everyone is completely aware of the importance of making this application work correctly. I think that the tools are in place. I'll email you a copy of the rap sheet on what the software can do. It's very extensive. It has a lot of pieces that I think we need. As long as we get the faculty buy-in, as long as everyone's willing to deal with a little bit of frustration in making the transition from the horror that we have now to hopefully a bright future.
I am encouraged and optimistic that this whole process will work.

How would you go about engaging those faculty that may have been unable to go to the presentation? Or even a little bit reluctant? I hope they will start to pick up on the enthusiasm, and I have a great deal of faith in the actual presentation in July, when they come down and start showing what this can do for you as a faculty member, how this can help you, how it's going to reduce the long and laborious processes that we all have to endure in registration and grade changes, and how this is going to be a more efficient way of doing things if you will simply take the time to learn the application. I really don't foresee any problems.
So you see the presentation and the training as key? And the training, training, training, yes. Because once they get over the initial concern about it being complicated, when you can bring it up in your browser, I think that then they are going to say, "Oh wow, I can do this and this," and then you've got them. And they are going to be in there and they're going to be working with it and making things happen. But training, and the presentation from CAMS, because they know the product. As much as I want to share, there are aspects of the product that I don't know yet, because I haven't been trained. I know a lot from interacting with them, but there is still a great deal to be learned.


Up to this point, what kind of communications have the normal faculty and staff received if they weren't involved in the choice? Have there been any communications that you are aware of? Most people are aware that we have been in the selection process. Everyone that I have spoken to knows about the process, that we are changing to a new system. I think the word is on the street. Are they aware of where we are in the process now? No, not that I am aware of right now. I need to get on the phone in a minute and find out exactly where we are in the migration process. It changes by the minute. At this stage there are some new challenges that need to be addressed to make this happen. No, that information has not been given. I don't think anyone would care at this point. "We are going to take this old data and put it in this system" is not all that interesting to most people. They just want it in there and they are ready to roll. But that's going to change, I believe, in July, when we train and people see this on their desk and they start playing with it.
So you would anticipate that management would communicate periodically as far as progress once we get to the training point? Right. Also, all the scheduling needs to be done to keep faculty up to the minute on what's going on, where you need to be, what's available. Certainly during the training, for example, everyone needs to be there the whole time, needs to know when their area is going to be addressed. Scheduling aspects of it, dates, have changed. XXXXXX was to handle the scheduling piece. I am hoping, if we hire the new Assessment Director, they will handle the training coordination.
I believe when faculty see some of the little parts, like the degree audit functionality and the "what if" scenarios that you can play with once we have gotten all the behind-the-scenes setup, solutions to course management, and master scheduling, you'll be able to ask: what if I got an interdisciplinary studies degree, where would I be? What if I went with a Criminal Justice degree, where would I be? I think faculty will find that fascinating. It will help in the advising process. Our progression sheets and advising in the past have been all on paper, with a huge paper trail. A lot of that is going to move into the system, so there's no question of where did that paper, or that napkin I filled out during lunch, go. I am very encouraged that they are really going to get on board with this when they see what it can do once we get our work done. Up to this point it has been a pretty close-knit group that has been looking at this. Do you feel like you have had management support through this process? Yes, I do feel like we had management support. In addition to some of the senior management being at the demo meetings, there were follow-up meetings, conference calls, etc. Allen Towns and the President asked some pretty hard questions and got very good answers. I think management has been involved in and supporting the whole process. I don't see how it could be done without that. If you have some people in the background saying we really need to make a change, but without support from management, you can't even begin this process. There's going to be a lot of upheaval in moving from where we are in the Stone Age to the twentieth century, maybe the 21st century.
As you go through the process of implementation and more and more people get involved, do you think you are going to see the management support demonstrated to the whole campus? Yes. I am somewhat of an optimistic person, but I think definitely, because management knows the importance of reporting, the importance to students of being able to register online, and the importance to faculty of being able to submit grades and have accuracy in that sort of reporting. I think there is going to be overwhelming support when they see what we can do with this application. I know management has been supportive through this process, and I only foresee it increasing as we move forward because of the importance to our accreditation, our day-to-day business, our operations, and also, as we grow, we want to offer that to our students, that quality of service.
Do you think it will make any difference to faculty and staff, as far as their acceptance or resistance, to see management support demonstrated rather than it just being turned over to somebody to implement? Absolutely. I think the difference would be between black and white. For example, if they said, "Okay, let's change over to this product," and I was just out there saying, "Let's do this," with no support whatsoever, no, it's not going to fly. We have to have the management support. I am glad we are not in that situation.
It is very clear how you feel about the change in the system; is that the common sentiment on campus as far as their satisfaction with the old system? Absolutely. Primarily with the quality of the data that is coming out of the departments, I am not sure that in the past they realized the extent of the inaccuracies, redundancies, and corruption of the data, because, as one of the XXXXX programmers said, "It's going to be almost impossible for you to transfer it to another system since the corruption has been there for years." Most faculty members have dealt with this corruption, resulting in weird information in FX, but I don't think the extent was known until we started looking at the Institutional Assessment Director's ability to generate data on a planning basis. To make a decision based on our data, how accurate is it? It isn't accurate. We have been scrambling around taking data from FX through manual processes to make it accurate, so it is vital, very important. And in general, I think everyone does know that FX has not been cutting it on so many levels, not just the reporting aspect but also day-to-day working: student hours, students dropped just randomly, schedules changing as they go from the Registrar module to the Bursar module. I don't know how aware of the problem they were years ago. From what I have seen, FX is really horrible. I think everyone is pretty much on board with "we need to change."
Once you have made the change, there is usually some little resistance to have to overcome; do you think that dissatisfaction with the old system is going to go a long way toward overcoming resistance to the new? I sure hope so. It certainly seems to me to make complete sense. I think people have had so many problems arise working with this old software that if they are resistant to change, they may turn it around and say, "I don't like this technology stuff, but I know this is just absolutely not working, so I am going to get on board with this." If we do find someone who absolutely doesn't want to spend the time, we will work with them. I hope we can express to them the importance of the system, because I think it is incredibly important on so many levels, not just to accreditation and reporting but also to our future as an institution. You think that small amount of resistance will be overcome when they have the training and see they can do it, that it is not going to be that difficult for them? I hope so, but I won't know until we deal with them. I have not had that problem yet. I think it will be a lot easier for them to use the system than to create a class. For faculty and administration I think it is going to be a very easy transition. It is going to be hard for people doing the nuts-and-bolts work, like the registrar, because there are a lot of details that most people just don't care about. They just want the results, and that's fine. If we do get any of that resistance, I would be interested in how we deal with it. I don't know how we will deal with it yet, because I haven't seen it yet. Do you think there is any value in exploring the cause of the resistance to see if they are raising a valid issue? Oh, yes. Absolutely. I don't know about technological resistance, but anytime you get resistance to a procedure, sometimes that comes from just a great idea of why this new policy isn't going to work, and then you have to reassess, drop back, and say, how can we address that? Most people here are not resistant to be cantankerous; they are resistant for a reason: this policy is not going to work, and here's why. I kind of hope for that. I would be so thrilled if we could become a best-practices institution, so that we are a paradigm of policy perfection. This is the way it is done because it is the best practice. Maybe with CAMS we can reach that goal. That is my objective.
Executive Director of Enrollment Management and Student Life:
The research that I am looking into on the CAMS implementation involves management of change; the success of the system is indicated by user satisfaction. Whenever you make a change there is some resistance. We'll have different measurements for resistance and acceptance, so that, depending on how well management manages the change, you will actually see those levels change over time. We want to look first at before you ever start implementing, then during, and then post-implementation.
So some of the things we are interested in are: why do you think we are motivated to change from the old system? First of all, no one here at The University is completely trained on the old system. That was my first problem. The second problem was that when I contacted the company, they could never give me a list of other schools that were using it or give me any training over the phone. I can't pull reports from the old system. I cannot report, in general, retention and attrition, two of the most important things I need to know at all times, without using a calculator and doing a manual report. That's my complaint with the system. For the most part there are no canned reports. You've got to create reports, and I have not found anybody that really knows how to do that well.
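The retention and attrition figures the interviewee computes by hand reduce to a simple ratio: retention is the share of a starting cohort still enrolled at the end of a period, and attrition is its complement. A tiny worked example, with invented cohort numbers:

# Invented cohort numbers, for illustration only.
cohort_start = 500      # students enrolled at the start of the period
cohort_retained = 430   # still enrolled at the end of the period

retention = cohort_retained / cohort_start   # 0.86
attrition = 1 - retention                    # 0.14
print(f"retention {retention:.1%}, attrition {attrition:.1%}")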
What do you anticipate that you'll get out of the new system? I anticipate it being user-friendly for the students, to be able to go in and look at their classes and look at their accounts and look at their own progression without having to call on staff or faculty to help them. I think every faculty and staff member should be able to do the same thing and look up all the numbers without a report having to be pulled. For reports that do need to be pulled, I think it should be accessible. Do you think there's going to be very much resistance on campus to changing from the old system? No. There's so much dissatisfaction with the old system.


What are some of the critical things that you think management needs to do to ensure that we get a good, smooth implementation and satisfied users a year from now? Heavy-duty cross-training for everybody involved: all the employees, all the staff, and faculty to an extent, as much as they need to know. That's the problem with the old system. There was one person here that was really trained on the whole program from beginning to end, and she left The University, and with her went all that knowledge. It doesn't make sense to have one person knowing everything. Everybody in this area, in admissions, the registrar, and the business office, needs to be able to run every program in it. And with the training CAMS says they will provide for us, that should be easy to do. We'll all be trained together; we'll all know everything.
So if somebody leaves, the rest of us can fill in, and together we can train the new person hired to take that person's place. If there is somebody a little resistant to change because of fear of the new system, you think that will be overcome by ...? I think that will be overcome quickly. I can't imagine who that would be ... and I guess you are talking about Blackboard mostly when you are talking about a new system, and I can't really speak to that system. It could be Blackboard, or it could be some people who are just resistant when something is new. Well, I guess you are going to have a small percentage of that regardless of what you do. That's just life, right? You just pick up, and shake it off, and move on.
So you anticipate that will be overcome by the training, so they can see it modeled and see "I can do it"? Right.
How important do you think communications will be during the implementation? Communicating to the campus and letting them know where we are in the process? Pivotal; it's a must. We are all going to have to communicate what we are doing and talk to each other. Absolutely.
Have you seen management support demonstrated all the way through the process of the selection? Yes, absolutely. We had one person in a high-salaried position spend part of her time in the last year shopping for the system and reporting back to the rest of the administration, at first monthly, then biweekly, and then weekly. And we all had a say-so in it and a choice.
And you think that was important to everybody involved, that they were asked their opinion? I do, because I think through the administration everybody's opinion was counted ... because

everyone on the administration represents a different share of the employees here and we all had
a say-so, on the academic side and on the staff side of the house.
Now, if you put yourself in the place of people that might not be intimately involved, other faculty, staff, and all: during the implementation process, do you think it is going to be very important to them to see that management does support this and is intimately involved, and not just passing it off for somebody to go out and install it? Sure I do, and I think that's easily done with this group of people. I think we are going to have to have some big training groups for faculty, and that's our biggest constituency here. That will be after the administration is already trained; then the staff will be trained, and then the faculty will have to be trained. I think each step of the way, people in the last group have to be involved in the next group. And I think that will happen. Knowing this administration, that will happen.
So they'll see them in training and see that they support it? Yes. And when the faculty see that she's training at the same time her Dean is training, I think they'll appreciate that.
Other than communications and management support are there any other critical elements that
you can think of in management of this change that should be planned in? I don't. This is a
first for me...first go-round for me, other than a smaller software system that we used for
development in the Advancement office and I went through the beginning, selecting, training,
all the way through with that. Mostly where we missed out on that was we didn't do enough
cross-training. Two people went to Chicago for two days and trained on it and then those two
people had to come back and train the rest of the group. Of those two people, only one if left up
there and they didn't do such a good job training the rest of the group. So its all about training.
Its all about cross-training, Everybody has to learn everything. Some people are going to be
quicker than others at different parts of this new software system.
So you are saying the fact that you have the support of other trained people to help you? Right.
Is going to help as far as bringing in other new people later, plus reinforcing it to someone that
might have difficulty? Right. Some of us will have more difficulty. We will all have more
difficulty. I will be depending on others to remind me of what I learned in the training.
So the support structure that we have here will be very important? It will be and what we have
here is sufficient.
What we are doing in this research is measuring, prior to implementation, during implementation, and then after implementation, the level of satisfaction and the amount of resistance and acceptance, so that we think we will be able to see those vary and change depending on how well we manage the change. We anticipate, as Dr. XXX says, that we probably will skew the data because we are so dissatisfied with the old system. That's what I think. Everybody is so frustrated with the old system, I think they'll do just about anything to make this new one work.
We'll see. We'll find out how many calls there are to the help desk. It'll be easy because we'll
know how many calls we get. These things will be traced. We'll know who is calling and
asking what questions. You will be able to go to those people calling and see if they were
satisfied with the help they got.
That is a very good point, that they'll have a help desk and they will be recording that. Well, we'll be charged if we go over so many calls per month, so CAMS will keep track of how much we use our helpdesk, which will be a good tool for you. After I have trained
on it, I'll be our helpdesk to assist whoever our new ATS person is. They have a list of issues on FX where they have logged problems, until they got to the point where they gave up. There's
some data there. Nothing is going to be perfect. There will be a few things that will be
recorded, but we don't anticipate that there is going to be anything like the amount of problems.
Right. Last July I called the people with FX and asked a question. We are at the beginning of July now; they called me about 5 weeks ago to see if I had solved it or gotten my answers.
Yourself, or had they sent it? Me. I have called them periodically since then, saying that I never heard back from you on this, leaving messages with a person named XXX. She called me
recently to see if I was satisfied with the answer that I got and I told her that I had never gotten
an answer and that I was no longer interested in an answer....that I had decided to do something
different. I didn't tell her that we had gone to CAMS because I didn't know if it were my place
or not. I told her I was no longer interested in the information that I was looking for. It was
something simple. I wanted to run a mailing list of everyone who is registered or been accepted
as a student at The University but never registered for classes. You would think that would be
possible. You'd think that would be possible but they can't do it. We certainly can't do it, no
one here can but when she was visiting last summer she said give me what you want and I'll
make it happen. I wanted to use that as a mailing list for recruiting. Hello, Mr. So and So, at
one time you showed an interest in The University. Just wanted to check in with you and see
how things are going. Did you get your college degree? Have your dreams come true? That
sort of letter. It would be a huge mailing list, way in the thousands. A lot of people apply for 5 colleges. One Ivy League that you know you are not going to get into, 2 or 3 that you really
want to get into, and then a catchall at the bottom that you know that you can get into if none of
these work. So we get applications and application fees and transcripts on about 5 to 10 times
as many students as plan on attending here every semester. So that's a mailing list. That's
important information.
They have gone to quite a bit of trouble to get to the accepted stage? Right, they have paid the
fee, got their transcripts from high school, their SAT scores, or whatever, depending on their
age, where they are in life, already sent to us, got it on file here. Can't I just run a mailing list without having to pull all those paper files, and we are talking thousands and thousands, and paying people to have them sit there and type up those names and addresses? But with the old system the software people couldn't do that for me. And there's a lot of lists like that I want to be able to pull: everyone who took one class here, or one semester here.
And from what you've seen on the demonstrations, do you think you're going to have that capability? Yes. I think each one of us individually will be able to do it. I don't want to have to go to one person on campus like the Registrar and say I need this. I want to be able to do it myself. We have a lot of students for one reason or another that get close to graduation and then they don't
graduate due to a death in the family or an emergency. You know when Hurricane Kate came
through a lot of people were dislocated temporarily or permanently so you want to be able to
keep an eye on those people and you want to communicate with them. You want to be able to
pull those lists of people that got up to 110 hours and didn't actually graduate (120 hrs).
That would be a valuable tool? Valuable, very valuable. A little encouragement might get
them back to finish. You need three classes, that's all you need and you're done. And the old
system can't pull those lists.
Interview 3.
We talked a little about what my project is going to be involving the CAMS implementation.
Some of the things that I wanted to find out: Why do you feel like we considered making a
change in the first place? Mostly because our registration and record system right now is not
accurate. We definitely have problems within the system with corrupt data that we cannot fix
so that affects our data, our perception of data, our frustration, keeping accurate records, and
also because all of our units are not using the same program. So admissions uses one and
financial aid uses one, and Registrar uses another one, and business uses another one. So we
would like to get it all integrated....that was our big goal.
So it was to eliminate double entry? Exactly, it will eliminate a lot of double entry.
On the course management, was there dissatisfaction there or was it an opportunity to integrate that? Same thing....an opportunity to integrate. As far as I know we are not dissatisfied with Blackboard, but the course management system came with it, so it's an opportunity to save money by not purchasing Blackboard since this came with it, and also it is integrated so that when the student registers it will automatically dump them right into the course, so that will save another manual process.
And how did we go about making a choice on what new system we were going to go to? We compared. XXX was actually in charge of that, and she looked at several systems for small colleges, what was available to us, because we didn't want to use something like Banner that U of G uses, because that's not what we need. So we wanted to see what was out there for smaller colleges, and we boiled it down to 2: CAMS was one, and a smaller version of Banner called Power Campus was the other one. We compared. We brought those people on campus. We talked to them. We looked at their product.
And were you just comparing price or? Oh, no, all the elements of it. And another consideration was that we wanted an Internet based program so that it was available, not computer based, not campus based, so that it's available no matter where you are. Did that have to do with infrastructure for future growth of online? Yes, yes, and online faculty, the whole thing. Power Campus, that was one thing, they had an Internet version but it was still not as powerful. The main thing was still computer based.
Do you think there's going to be very much resistance to change on our campus? I don't think
there's going to be resistance to change. I think some people don't invest as much in learning a
new product as they should at the beginning. I don't think they will say, Oh, I don't want to learn it, but maybe they won't bother to learn it. Kind of more of a, I've got so much to do that I really can't worry about this registration program right now. I'll just find out when I need to know....on a need to know basis only. Although we are planning faculty training, staff training,
all that stuff. We will have it. But just in the way that I feel like there are some people who still
don't know how to use FX and that's been around what? 10 years at least.
Has there been very much communication from management as far as the need for the change and ... I think so. Now, I am mostly in contact with division chairs and a few faculty, and all of them definitely know the need for change and the problems with FX, so I guess my assumption is that the faculty also are going to be glad.
And was there any variety of level of people involved in reviewing the program and making the
decision and getting opinions? It was a mix in that there were faculty involved in it,
administrators, and staff people and somebody from every area was involved.
And they all made the same recommendation? In the end? Yes.
Once all this data is converted to CAMS are you going to trust it? That's the plan. That is our
biggest hope that we will have accurate data. Our old system has such corrupt data that we may
have to start from even a ....we are going from Fall 07, taking all that academic year and then
archiving the rest of the records so that they can be accessed but it will take xxxx xxxx to access
it. So we are just moving forward. XXXXXX and I have talked about even if we just had to
move forward from Fall 08 we would do that. I mean because we just really have no....
So you would have confidence in any new data from this point forward? Yes. Yes.
Your question would be how dependable the data that had been .... Right. That we need sometimes for a 10 year report, a 5 year report. We don't have confidence in that data.
So I understand it will just be put into like an Access database? Right. Yes, that's it.
The technology, the confidence in the technology? We have a dedicated person with XXX Tech
that's involved in it? Um uh. XX is the Project Manager. XXXX was. They were both Project
Managers together. Then she resigned. He took the brunt of it. From what I've seen everybody
would have a lot of confidence in XX. He has very high credibility. Do you feel like that is
sufficient whether they know anything about that person at XXX or not?
For those on campus? I don't know whether anybody on campus knows that XX has the technical savvy or knowledge, or that he was a programmer and that he really knows his stuff. I don't know that everybody knows that. But I think they trust that we have studied this, and it has taken us so long to do it, that they have confidence in the administration and the people that have made this decision. ZZ being one of them, ZZ. And you were also in on it. People that they do know. LLLL was very interested in it. So it's not just XX. I have confidence because XX is involved, but I don't know that the faculty know what an expert he is.
To this point the management support has been evidenced by the people that you have involved. One of the things that is in the model is the evidence of management support from top down all the way through implementation, versus just saying, here it is, go put it in. No, everybody is very involved. In fact we have update meetings with everybody in all areas like the business office, financial aid, admissions, Registrar; somebody from academics, the V.P. or I; somebody from all areas goes to this update meeting, so it's not just XX making the decision, or XX and I
making the decision and that is one thing I know that did not happen last time. We were not all
involved. In fact I don't think academics were involved at all in the decision when we switched
over from one record keeping system to the other. The Registrar was involved but no input
from the actual users of the system.
Will you be doing any communications to your faculty? Yes. Yes, as soon as we get the data migrated, and we are going to have training.... You'll roll out the programming and estimated timing? Right. Right. Though it might slip; that's just our caution, because some people are like, I'll believe it when I see it. So we are not saying anything until it actually is here. And everybody also is going to be able to sample it on their own, without training. They'll be able to look at it and ...
What the program is like? Right. You'll be able to look at all the areas just as a faculty
member. You won't be able to do anything. It won't be real data but you'll be able to look at the
business side, admissions, everything it can do. That's what we were told Thursday. I thought
that was good.
During the training, you mean, or after? No, just like as soon as they give the word that they have loaded it on our system and you can get in. You'll be able to look at it prior to training, and then we'll have training.
But after it is fully implemented everybody will have just what they need to have access to? Right. But they'll have a chance to see an overall view of how everything ties together? So that you might say, as the course management help desk person, that you might really need to see something that we didn't think about, so that you have a picture of the whole. Now, I don't think everybody will take advantage of that, seeing the whole system, but some people will. So division chairs, when they want to do reports, they are going to need to see whatever they need to do the report on. You know, they can just kind of play around with that and then we'll have training on it. That sounds like a good idea. You know we had to find out about FX reports. You did. You just had to call the company and say "How do you do this?" There's got to be a better way. So hopefully we won't have to dig so much ourselves. It'll be very user friendly.
And the comment that Dr. YYYY made was that in doing this research we would probably skew the data so far to one side, because we were so dissatisfied with what we had, that he wasn't sure; that I would need other places to take data also. I don't think he realized how great our dissatisfaction really was, because the data was not accurate. You can't trust it really, at all. Not just that it is hard to retrieve it, which we can work around. But then when you retrieve it, after going to all that trouble, then it's not accurate. It's not right. Then, when in your head you know that the number of freshmen that you are looking at is not the number of freshmen that were there, then, you know......
That was one of the main things in measurements of satisfaction with the system; the accuracy and the quality is one of the main things. Right. Timeliness and format, how easy it was to pull out....all those things were factors of it. Right. But if you can't trust it after you go to all that trouble...Right! Exactly. So I think once he got...once he understood that, then he understood our frustrations. Initially he just thought that it was an older program that was harder to use and doesn't..you know.
In this one study that I had found on Self-Determination Theory on management of change, it was like 3 key things: the explanation of the need for change, the training that would empower them to see themselves as being successful in it and able to do it, and they have that self-efficacy
that I guess you mentioned being increased. I can't remember it all, but communications was a big deal all the way through. Right. Once we get far enough along .... That's a favorite of mine, too, because we haven't really sent out all-faculty emails, because it just hasn't happened yet. Really they are just doing the internal data migration. All that stuff. I mean, nothing has actually happened.
Well, when I had people that had problems with Blackboard or whatever, I've said "this is going to be so much better whenever we get our CAMS." When's that going to be? We will be working on it this Fall and we anticipate that it will probably be in the Spring. So even just to tell them that much will probably give them a...... I would say now that most resistance or apprehension will come with the course management....the Blackboard.....because people are happy with Blackboard for the most part. We have our glitches but we work around them.
I think, maybe, as far as management support, what you can do to help facilitate change, the fact that we can zip up the Blackboard courses and bring them over and put them into CAMS, and they don't have to do it, will be a huge part of it. Right. If that works well. Right. But we're all anxious to see if that is really going to work. Angel did the same thing, but all that was there was just the test questions. So that is why I am happy to hear about you and XXXXX....
Going from the old Blackboard to the new Blackboard, the majority of everything did transfer
fine. We're hoping... We are hopeful.
So if they do well with this data and all, you're expecting that you would actually be able to look into this system not too far down the road? Yes, like soon, like this week, next week. Oh great! And you will too. I mean this whole list of user names, and that included everybody. Does that mean that when we are able to look in there we can find out, maybe from Pam Nutt, once that step is made, how it sets up a master schedule from the past data of courses that were in there last year? That is what we are hopeful for.
Can one test course be zipped and pulled over? You mean for the course management? Yes. That'll be something you've got to be in on. Well, and ZZZ, because what we are going to do is use one of ZZZ's, which has Softchalk, to test it. And, boy, that would be very soon and very good if we could do that. That would tell us what we then needed to plan on for having to do for everything else. They keep not giving me a lot of information about the course management.
So... because our primary concern right now is all the other records. I mean, that's the SSSS
tech person's primary consideration, but....
One of the things my Chair was interested in was why we gave such credibility to things that they are developing and don't already have. They showed us some things they already had, but like on course management they can't bring in any packages from the publisher. They can bring them in from Blackboard. And their answer is that they are going to work on it once they get finalized on the specifications for the Common package....things put together by a committee of all these different people. Once these are finalized, then we will work on it and it won't take us long. She said, why did you give such trust and confidence to them when you haven't seen this?
Well, I guess our backup is that we haven't discarded Blackboard yet, and if it proved to be a big problem we won't move it until they develop it. But it did all the other things that we needed. This was not part of our original plan, that course management. That was not one of our considerations that we were looking for. You've still got a failsafe in that you haven't let go of Blackboard...that's a wonderful point. Right. That's what I feel good about.
And whenever I asked XXXXXX about her confidence in this system, she said they had given her references and she had talked to people that had already made the transition from Blackboard, from Banner. One gave us a detailed report that they had turned in to their university, that was great, about how.... So you need that report? I can see your eyes. About the change and how effective it was, and do you have that? From XXXXX? Yes, she did give me that. Yes, it is very informative. It talks about it.
So they were very forthcoming, showing you their system, demonstrating it, where the other people weren't, and they gave you references where it was highly endorsed? All of that. And then XXXXX has some of her own references somehow. I am thinking they didn't all come from just the company, which might have been skewed, you know. I think she found out some other ways about people who used it, I think. But she did check and actually talk to people. And there might be another report. I'll have to look, and I'll look in the folder.
And maybe one thing. Everything sounds great at the training and all. Is the training going to
be videotaped for like new hires that come after this? Well, I think that we might end up doing
the faculty training ourselves because if they do that training any time within the next three or
four weeks, faculty won't be here.
Can it be videotaped so that you have that, since it will help put together the faculty training? Right. That's a good idea. And maybe their materials can be re-used. The people that contact you about the little shortcut things for Blackboard say there don't appear to be any for CAMS yet, but they may be willing, if there is a large market, to develop them for CAMS. The shortcuts? You know those little Blackboard guides? Oh, Oh, Oh. Maybe CAMS will have something similar to that, that we can get for not $4 apiece. Right. We could be sure that we could always get them printed or something, if we think about the faculty that's going to come in the future. We know how it is to try to get people to train on the Blackboard now. If we could have a canned training, or tutorials, or whatever, anything that we could get like that, it would be a big help and save ZZZ from having to develop it. Right. I slipped out of interview mode, didn't I? That's alright.
You had some very good comments and information in there. Good.
Interview 4.
My project got changed around a little bit now. What we are looking at is the way you can facilitate or manage a change in IT implementation so that you can maximize acceptance and minimize resistance of the people to the change. And we will be measuring the satisfaction with the system before, during, and after, and try to see if the levels are changing. Okay. And some of the factors that we have come up with so far have been communication and management support. What I am trying to do with this interview is find out what your thinking is on where we are so far and those things that you think are important during the implementation, to see if maybe I have completely overlooked some area that I should be including in my model. Okay. XXX gave me a real good one.
What was the purpose of us making a change? Well, the archaic quality of our prior system, and then beyond that it was beginning to deconstruct, besides the unreliability of it, and it wasn't a product that had kept pace with what products need to do. And in particular, since we have moved into an online arena, we need to meet the needs of students who are not typically present, and it gives us some facility that we need to offer some student services anywhere. So it's a real step ahead in the infrastructure? Right. It meets those student records needs and student affairs or
student services needs in one swoop... And those were the areas that drove it? Right. To start
with? Right.
So why did we choose the one that we are going with? It is manufactured for smaller scale colleges and universities, so it doesn't give us features that we don't need. It's not a product made for a large university where they then try to saw off corners to make it fit a smaller university. We have the feeling that it really is a product made for an enrollment about our size, and because of that, technologically it gives us an unlimited number of users, but for our scale we're not having to pay a price that would be for up to 50,000. You know, it's more like it's up to 5,000. I don't know. It doesn't have a cap on it, but it's just structured where it's not a.....
So cost is one element but not the over-riding element? Fundamentally, I guess it was, although we didn't want to get the cheapest thing out there. We needed something that was scaled to our size university, because if we had to deal with Banner or something like that, we never would have been able to consider it. It is still very expensive as far as outlay of money goes at the beginning. And the other thing of why we are going with this product is because of the technical support and the customer attitude, where they seemed to function more like smaller institutions, where they will know certain users who will pick up the phone and say "This isn't working" or "Could we add this feature?" Instead of charging us a customization rate for every suggestion that we have that will make the product work for us, a lot of those things will be assimilated and given to us on the annual renewal. To me that was a feature that recognizes that smaller schools have more limited resources and might have more particular ways they need to adapt the product, too.
I know you considered more than one product even in that category? Yes. What were the overriding things that convinced you that this one was the better one? I am trying to remember exactly.
Some of it did have to do with the people that we talked to, their availability to answer
questions and even having Vice-Presidents of various areas of technology talk with us like the
one who talked with us about the Course Management System. We want to feel like they can
make somebody like that available to us. And yet Allan did research on the organization itself
and that it is not just made up of three people. You know it has a sizable staff, a stable business
in its own right. Sungard, the other one we were looking at, is a scaled-down product of Banner.
It is a much larger organization and so in some ways in our dealings with them it had more of
that package deal.
Were they as responsive? They weren't as responsive. They did more of "I am coming to town and this will be the day. Who'll be there to see me?" than "We would like to schedule a time to come; when can people be available?" That was CAMS' approach.
And on our side how did we involve people in making a decision? XXXX, the Director of Assessment, was the lead on the search project, but she met regularly with the Administrative team to give us feedback on what she was finding, and then she involved people in all the operational parts of the course management... of what we could do with the course management system. In all these times we had phone interviews. We had a whole day of that, where they demonstrated different aspects of the product; admissions and financial aid, business office too, and academic areas, and assessment and the registrar were there all day for that. And then we had continual follow-ups to look at different modules more closely, the one on course management in particular, so anybody who was involved in implementing one of these things had at least the opportunity to be involved, a part of those demonstrations. Then we were going to collect that back to the administrative team. Early on, we weeded out the ones....We called references, got reference accounts from everybody Heather talked to initially, and she called the schools and got them to send information, and then the President called reference accounts for the two that it boiled down to, President to President, to kind of take a President off guard and say, does this work? That wouldn't be the person that you'd typically think calls would go to for a reference account. He got really good feedback on CAMS.
In the research one of the things in service is the confidence that people have in their IT group, in their technology group, which of course ours is contracted. But when we look at this new program that's coming in, do you think the confidence is going to be in the program's technology group rather than so much in our local technology group? Yes, I think it will be the product itself that sells people on it. And I am very optimistic that the product will have that...be convincing in its own right. If it's not, then we've got problems, because we are really going to depend on it, and, you know, it's like shopping for any new car or anything: you can do all your homework on reports and stuff, but ultimately you drive it for 10 minutes and then you have to
sign away your salary for it. So, I think it's kind of like that. Now I hope it really plays out on a
day to day basis to be as user friendly as it looks like it will be.
Where the financial and record-keeping end of it were the drivers in the beginning, this one also includes the course management? Right. So the majority of the users, what they are going to see and feel will probably be with that end of it? That's right. And obviously the kind of daily page that's in it, where students have a calendar and faculty can post due dates for projects in the student's calendar. I think that kind of main page like that will be useful. Their go-to page for, I need to go check on my bill, or ....
Especially useful for when you want to make an announcement outside of the course structure? Right. That could be a big help. Right, and we have lots of those things, and it should be good for a sense of campus life even though it's remote too. This is going on. Don't forget about this.
One of the things my Chair found interesting was the confidence and the trust that we had....that
we immediately awarded when there was a feature that wasn't existing yet but they said "Well, it
won't take but a couple of months" and we said "Okay". Yes, right. Again we are counting on
them to operate in good faith. The other reason I would say we weren't naively duped by that is
that our references said that that was CAMS' reputation. They really had come through doing
that for others. And new developments too? Yes. That if they asked for something, it was put
into practice.
Most of the things that we talked about that were missing would be valuable to every one of
their users, not just to us? Right. And it's kind of surprising that they had not pulled some of those in when they predicated it on Blackboard. But if they....I think this is the thing that we ran into with the one guy that demonstrated things: you could tell he had not taught using Blackboard himself, so he was impressed with some things where we were like, Yes, Yes, Yes, that is commonplace, but can it...? The more you knew what you needed to do to maximize your teaching efficiency, the less the creators had keyed in to those features. So I hope it's only the rare "Well, can you see the whole grade book?" kind of questions that we come up with, and not "How could you not have come up with a way to sort discussion boards?"... I am hoping all those pieces are there.
One of the other factors is management support, so management is also helping support the implementation by maintaining the Blackboard for another year? Right, that's right. And that
should give us the cross-over we need, one year to do that. Not like doing it in one week? Right. That would just be impossible, and I hope by having the new Academic Technology Specialist, which was something that was its own argument for that person, but having a full-time person, a little more, who can help people develop course layout as well as course content. That coincides nicely, so that she'll be able to help people make the most of that course management system. And it may get more unity and similarity between courses if they have similar structures, and maybe easier for the student. Right. Some templates that will help us not be just these rigid Regents file templates, but things that will help students, and also as we pull in some higher end tools like the classroom tools and things, that there will be ways for students to know how that works in a given site too. That will be great.
Do you anticipate that there will be very much resistance to this change? No, just...ah.. I would say three quarters of our faculty, even if they don't profess it, actually have a comfortable level of user ability with technology, and even if they get into the lower levels of what's going on they are not afraid to say, "okay, so I click this and click that," and they do it a few times and they're good with it, so I think that it's just training; playing with it a few times, and they'll have it down and see how great a tool it is. Then I think there are the quarter of the people, maybe it isn't even as large as a quarter, but the people who really resist having to do something this way, and it really does plant them in front of a computer for advising and some things like that, and they'll have to just get over it. You know..... Surprisingly enough, even though they don't want to do it themselves they'll say "Yes, we need to do it." Usually people don't complain about having manual steps removed from their lives once they see "Wow, that took a lot of time before that I hadn't even realized." And the duplicate, triplicate copies get misplaced and, you know, so maybe once it proves itself again...that it really works, everybody will say "How did we get along without this?" Dr. YYYY said my data will be skewed because
everybody is so dissatisfied with our existing products. That's true, but the thing that we do have is the paper forms we've used. The actual electronic record system is a pain and nobody will miss that. But some people will feel like filling out those forms is the way we've always done it. You have confidence in a piece of paper. I think it's actually even back-stepped, in that there will be people who will say "What was wrong with filling out the piece of paper?"
There may be some that will keep a paper record or print it out. Yes. Since we have gotten bit. Yes. They can print it out when it looks right on the screen to them. Some of that need to fill out the form...it's probably a matter of trust. That may be the thing that skews us, that they have such a high level of distrust for the existing technology that we have a greater barrier to getting people to be sure that they can trust the new one.
Are you going to trust the data that gets transferred over? We are transferring as little as possible, so that we have what's in there starting probably with Fall of 07, so that it is when XXX was here and we know he was using consistent tables for what he was doing, so that we are only taking over things that we can rely on. And then the only weakness will be if, even in that data, FX has done something screwy with it so that it's sending something crazy. But from this point forward it should be with no problems? Right. In fact we are not going to run concurrently, which would be the best practice in that situation, because usually that's how you find the bugs in your new system, but for us we have so little trust in the old system that there's no reason to run concurrently. The new system is probably the right one and the old one is the wrong one, so there's just no way to make that a valid comparison.
You'll start the Fall semester on the old system? Yes, I think so. I think right now, because they had to rebuild the ISS server, I think that is what it is called....when KKK first put it together CAMS couldn't get it to write something it was supposed to do, so she had to redo it. That'll delay us, and maybe we are a month behind schedule from where we thought we would be. So we had hoped that over the summer, maybe July, we could start putting all new registration into CAMS, but we are still having to put them into FX, so FX will be our primary document. I think that as we begin to register for Spring we'll be registering with CAMS. So at some point we do like an update of the Fall and forward the data over into CAMS? Right. Right. And DDDD said that probably within a week or so they will have that first data loaded so that we can see what it's going to do? I did talk to XXX about KKKK, and I volunteered to take a couple of his courses and try them out to see if Soft Chalk worked and all that. And then I talked to KKKK and he was willing to volunteer. Good. People have a lot of trust in whatever he says. If he tries it out and he says it is going to be fine, then I'm not going to worry. Yes. That's a good idea, because it is partly who's telling you it's going to work, too, and I think if it's coming, the course management stuff from you and KKKK and the other
services stuff from XXX and NNNN and me, the faculty will feel like it's endorsed. Coming from XXX Tech they're going to say "Yes, but who's telling us this?"
One of the points they had on management support was that management support was visible, that they supported this implementation, not only in the beginning, saying it must be done and walking away, but you saw them through the whole system. Yes, I think we are so much a part of the system that that is not going to be an issue. Down to the Bursar level, where the V.P. of Finance needs to know how it works, and it actually will get Advancement linked into the system in a way that they haven't been able to do before. And obviously Admissions wants to be a user of that system, and I'll need to know as much of the report drawing stuff for assessment, more than just how to register a student in it, although that's bound to be in it. Wouldn't it be easier to do ours? Yes. It will have to be. I mean it will just change that report thing. I'll still have, as long as we are having to pull from older years' data that isn't migrated, that cumbersome thing of looking at it in Excel, I mean in Access. That means that it should phase out? Yes, it will.
One thing that amazed me when I came here was that things were entered so many times, so that should eliminate a lot of that? It really should. And by different groups into different programs that couldn't talk to each other. It should be immensely helpful. Even if you take into account every student who has to enroll themselves in any given Blackboard class. You know, that time disappears too. And creating the classes, and all your time. Right.
Well, the theory that was recommended that I look into, Self-Determination Theory, is a very good fit. It talks about, when you try to motivate people, that you would expect resistance until they internalize it. After they internalize it, it can be controlled, because of overseeing them, or it becomes intrinsic and they are interested in actually doing it. And this article that was on facilitating acceptance of organizational change boiled it down to using three things: giving them a rationale for why they are doing it, offering them a choice about how to do it, and acknowledging their feelings about the task, and so the training that is done should help put those elements in. And I am sure that everyone will be acknowledging the feelings about it. And they had a very good, simple, small number of questions that they used in their instrument that fits right in with our other instrument and the data that we will be taking, so I think we will incorporate that into the model. Probably it will be for everybody at the convocation that we'll
have on Wednesday, August 20, and I might just....I was going to say a little about CAMS coming up. If you don't mind, I am going to use those three points to have my little introduction to what's going to happen; that's good for me to think about. I know you don't have a lot of reading time; would you be interested in reading this? That's the only application, the only study I found that was truly a workplace application, and not in the research. I said that's just exactly what we are doing. I know you came to hear the first person who interviewed; were you there for the second one? No, I didn't get to hear the second one. She talked a little bit about this because her dissertation work uses this self-determination stuff in exercise theory....how motivated people are to exercise. So that jumped out at me. I knew that LLLL used that in her dissertation. I really didn't know those three elements like that. Good.
Interview 5.
Some of the questions that we are wanting to discuss are on the Course Management system that is just to be included with the Integrated Student Information/Financial package; what do we call that entire CAMS package? What do we call it for the organization to discuss? CAMS. Okay, I guess that's the easiest thing. That's what I use in communicating it to everyone.
What was the real driver for why we started looking to make a change? The inability to extract data from the existing system that we can trust.
And how did we go about selecting the one that we are going to go with? A person was appointed, which was xxxxxxxx, to be the point or lead person on this, and she gathered information on four different systems and then she presented that information in a meeting. The participants in the meeting reviewed the information and then selected two of those to bring onsite.
And what was the extent of the involvement of other personnel? xxxxxx interviewed each of the people that was on the Administrative team to find out what weaknesses we had, what we were looking to gain, and from that she developed her criteria to make a selection, or recommendation really.
And so then they were involved as far as seeing the demonstration of the system? Yes. We had demonstrations, interviews with the two people from the organizations.
And then as far as what determined the selection? What were the main factors that you would say you looked at, the choice between those two? Well, I looked at it maybe a little different than the other people. I thought it was more the stability of the company, size of the company, what was going on in the market place. In other words, we didn't want to buy something when everyone else is transitioning to some other type of architecture in software. So, I was more concerned about whether it was a good investment. I let other people decide which ones were going to meet the needs of the Registrar and Admissions, the online stuff; I was concerned about whether we were making a good investment.
Not just the price, but the stability of the company as well? All types of factors? We didn't just pick out the low price bidder? As far as the course management, that just happened, it was a nice circumstance that it was included? It was a desire of some people on the team.
And then they checked out references? References were actually checked a couple of times. xxxxx did her checks, and then the President of The University talked to the presidents of several different schools using the program we selected.
My Chair was kind of interested in why, if we had had problems with the ones in the past, we would ask about certain features, particularly in course management, that weren't existing, but they said they would develop them and we so readily gave our trust that that was going to happen....as to what that was based on?
I don't know. I personally didn't ask...I wasn't looking at it from that angle. I got an answer on
that from SSSS the other day.
You know I don't work in those other areas, Pauline, so I just rely on them to make sure it's right. I was concerned about us making an investment that we are going to amortize over a long period of time...that that support is going to be there, that we are going in the direction that the market's going. Because if we are not, then these people are going to be trying to develop something else and we're just going to have a dinosaur. I was just trying to look out for that angle. That was kind of the direction that I was given.
Were you kind of looking at it as far as you wanted to have a stable infrastructure that will
allow growth here without having to redesign again? Absolutely.
Since we are anticipating or hoping that we will be in a growth mode with more online stuff and
all that? Yes.
Do you anticipate that there will be very much resistance to the change? Say from the staff level? No, I don't think there's going to be any resistance on our end. There's concern. It depends on how you define change. We don't mind going to a new system that feeds our data, but the general ledger is not changing. We actually have the one recommended by CAMS, which is Great Plains. And they interface? Yes. It's already set up a certain way. We would be resistant to changing how that data is collected. In other words, if you wanted to redefine the buckets, how you collect data for tuition, that would be a significant impact on us.
And that would create inconsistency over a period of time? Ah, it wouldn't work. What I made clear up front, and The President agreed, and so far I think everyone else has bought into, is that we are open to changing things if we need to, but what we would like to do first is install CAMS throughout the campus, have it operational, and then if we need to change how we collect our buckets of data, we do it after all the other variables are taken out. Otherwise, we'll have a General Ledger with inconsistencies. We wouldn't know if it's because we changed how the data is being identified and collected and transferred, or if there are problems with the CAMS software. We have certain ports; we have to collect data a certain way. We've got the program set up that way. A CPA firm has done the same thing, because they take our balance sheet, our financials, loaded into their system, and then they do the things they need to do, so it is a big deal to us.
But as far as your staff level people, they are not going to be resistant to learning CAMS? No. And it depends on who you are with, because for some of these people what they do won't change at all. The person who does Accounts Payable, that's in Great Plains. So it won't affect them at all. Not unless they cross-train? Yes. The person in AR, it will be a brand new job, so. But they are frustrated with how things are, so there's not much resistance. Dr. XXX told me your data is going to be so skewed because everybody is so unhappy with the existing situation. Absolutely. We can't do the fundamentals right now.
In addition to that, you are also in charge of the contract technology group. Um Huh. So to what extent is their involvement in this? Pretty significant. The lead person on our side, the University side, is an employee of XXX Tech. She is assigned to work as implementation consultant full-time, although when we don't have something for her to do she does something else for XXX Tech, so we don't get billed for that. So they are very much involved. Is she a
new XXXX Tech employee? She is. She is a local person. She's done work for the city of Cairo
and also the school system. I think she is a very knowledgeable person. Good.
Do you think, as far as the trust that the people on campus have in technology, that they are going to be judging it based on XXXX Tech or the technology department of the new CAMS, as far as, when they have a problem, that they can call somebody up to troubleshoot? Not sure I understand the question. Once it is fully implemented, where will they go for help as far as trying to troubleshoot any kind of a problem? Will they go through XXXX Tech or will they go? Who's the expert? I think there has to be an internal resource that people go to, whether that's a school employee or XXXX Tech. Somebody's got to be the expert. And then they can seek help wherever they need to? I would think so. And then decide which one it is going to be?
Yes, because you don't want everybody here to go directly to CAMS and say, I've got a problem, fix this, CAMS, do things. It needs to be administered, so it's like a funnel: we're down here and CAMS is up here. I would think most of these things are going to be internal issues, like issues with what size database, or things like that. Okay.
There's another thing I want to say related to that question: right now the security in FX Scholar is really loose, and all those things need to be brought to the level that they should be at. It needs to be administered by somebody local. Okay. So we will be able to do that with CAMS? That's the plan.
And it will be web-based rather than computer based, right? But all the server, data, will still be on campus? I don't know if you know this. We are going to add on to the back of this building, it's already been approved, a server room. It'll be a server room, an office for the person like the technician, a storage room. That should start in the next 30 days and finish probably in November. So all the servers will be up here. Well, good. So it will be designed for that purpose and have proper cooling and be on the first floor, centrally located, and will also have a generator backup. And will our firewall come in there? Because, you know, that was one of the things we talked about before, where everything came in up there, brought from the internet down here and spread out, so will everything be located there? Yes, everything in the server room will move in here. Well, that'll be great. That sounds like that will be a big improvement. A large investment, but it ought to be worth it. Yes. Right now we can't cool the room up there, and it's getting too much weight on the floor. It served the purpose of locating the
servers where the internet came in, but it was just not an ideal spot.... Terrible. ....but this will be lower; maybe lightning won't strike it. I hope. We certainly will put lightning arrestors in there.
Will you trust the data that's converted over into CAMS? From this point going forward hopefully we can. When CAMS is operational? I think CAMS will....I think its data will be a lot more reliable, from conversations that I have had with xxx and ttttt. There's just a lot of corruption in FX Scholar, and she just has her doubts about most all of it. I don't trust any of it
myself.
So for a period of time, you're going to have to rely on these backup databases that they are going to keep all the FX information in, like where they said they would go to the back records that way? I think they are only going to put one year in anyway. I think whatever goes in, if it's a...I think CAMS will manage accurately whatever is put in, but..... Garbage in, garbage out? That's right. You need to take it all with a grain of salt. You need to look at it and
see if this really makes sense.
Well, I know CAMS is supposed to help as far as being able to generate that IPEDS data, and make all that much easier, and hopefully it will also enable y'all to do some reporting more easily....that you might decide that you want to give internal cost reports that are not set up now, or something?
What amount of communication has been done so far, as far as what you have let your staff that reports to you know? You mean about CAMS? I've shared with them everything I know. The daily emails that I get from LLLLL, I don't share those. They are really only updates on whether we were able to load this data. No, we got this problem. Can we load this program? That would not be useful.
But they've got a general idea of when it's coming and the fact that it's coming and why it's coming? Yes, no secrets.
One of the things, when I talked with XXXXX, she said you need to go read Self-Determination Theory. You need to include that. You need to go read that. I did go read that. There were some good things in it. And a lot of it, too, talks about the way you help people internalize goals that you have, so that they become their own goals. That once they internalize it, they can be 1. controlled, because you're standing over them watching them, or 2. because it becomes interesting to them and part of their job, and they become motivated that way. And it said there
were three things you can do that would help that take place. One of them was communicating the rationale on why you are doing it in the first place so they know why they have to do it, offering some choice about how they get to do the task that makes it easier for them, and acknowledging their feelings about the task. But, like you said, I do think that people are very hopeful that it is going to be an improvement and easier, once we learn it, and hopefully more accurate, too. There's nobody here who'll be resistant, in my area. I don't think the faculty will, if they're...but to this point, the faculty have not really been communicated to. But once they are. I interviewed XXXXXX, and she wrote those three things down and said, if you don't mind, I'll use that, and she said the main part of the communication with the faculty will start with convocation (Aug. 20). And then they would roll out the training. I had
volunteered to xxxx that kkkk and I would take a couple of his courses, because he's used Softchalk and he's wondering whether or not that's going to work, and convert a couple of his over as soon as we can and play with it and see how it's going to work, because people think a whole lot of him. So, if he says this is going to be great, then even if it's going to be several months, everybody will be at ease and they will accept it based on how much they trust him. XXXX said, I think that's a good idea. I went and asked him and said, I went and volunteered you, is that okay? He said, Yes, I want to know. Well, I appreciate your time. Is there anything
.....
Oh, when it was talking about management support, it was saying that it should be evident all the way through. In other words you didn't just say, well, we picked this out, now go do it. And you saw those people continuing to be interested and involved through the implementation, so that it's obvious. And communication as far as keeping people kind of up-to-date on where they were in the process. That kind of thing. A lot of that's just common sense, right? Yes. I am kind of excited about getting into this and getting it written up, and I was just thrilled that my Chair was so interested. That's good.
Interview 6.
What were the main drivers behind our making the change on our student information system? Well, at the beginning, when I was first hired, I was the first one to ever have this position. What I really tried to do was organize everything, and I had to run a lot of reports, and in doing that,
I realized just how little data we collected and how disorganized it was. Our previous system, the one we have now, just made things really difficult; it was not user-friendly at all. Much of the data was inconsistent or even inaccurate, because it really didn't have any checks and balances to make sure it was correct, and what I ended up doing a lot of the time was taking a report, exporting it to Excel, and then literally hand counting it by sorting it or cutting and pasting it, and that kind of thing, to get at data that was being requested. Your position was Institutional Assessment, right? Uh-hum. Right.
So I was reporting to the national education system, doing the IPEDS report; that's an example of an external report. I also was asked to do internal reports for the President, a lot of people on campus, and I would always have to use that kind of a method because our system just didn't have nice reporting tools, and as I mentioned the data was just so inconsistent, and sometimes even inaccurate. So it's kind of a scary situation to be in when I am putting my name on this data and saying this is what I believe to be correct. That was really difficult to get.
Just reports, that was the main reason we changed? I would just say it was the main reason for me to kind of make a voice about it, because that was what I worked with. But in working with our Registrar about different things, and he actually assisted me in getting started in understanding how to get data out of our system, we realized that there was a whole series of issues. Some of those being that even though it was a kind of computer program, a lot of the things they did they still used paper and pencil for. It was still a very manual process to try to do much of anything. Students would stand in huge lines. We had a different system for Admissions than we did for Registration; things were being entered twice. Of course whenever you have something like that, there's going to be errors made when something has to be done a number of times. People would make mistakes, and things would get inaccurate from the errors. It was really difficult. We realized that we were spending so much time on things that still didn't make for a very nice system.....still left us with a lot of holes and a lot of errors, and it was really hard to get data out of. The business office mentioned that they had issues in not being integrated as well, financially. Exactly. Exactly. So we had a different program for Admissions, the Registrar was different, the Business office again completely separate. Everyone has had their own little separate areas, and nobody talked. When you have that kind of situation, if something happens over here, then the people over there don't know what's going on.
Again, that just leads to errors and inconsistencies. To make future decisions, you have to
understand what's happened in the past. Well, if you can't get an accurate picture of what's
happened, it's difficult to make decisions about financial needs or employment needs, so to have
all these little pieces all separated, that was really, really difficult. And like I said, we still
ended up with a lot of manual processes like paper and pencil, students standing in lines; just
nothing was online, nothing was done... Can you really trust the data? We really didn't trust
the data. Unfortunately, that was all we had, so you just had to use it, but you take everything
with a grain of salt: this is about what it is, this is probably a close estimate of what it is.
When you are talking about a school our size, 10 or 20 students here or there, the kind of errors
you have makes such a huge difference, you know, so it's scary to think that you have that kind of
error.
So a fully integrated student information system was the main goal? It was the main goal.
When students apply at the University, their information goes into one central database where
everyone can see it and it is the same. So if there's a change made, let's say they have an
address change, you change it in one place and everybody is aware of that change. Because even
trying to do things like sending out graduation surveys or whatever, we didn't have accurate
addresses. Nobody knew which one was the most current one, and there were five addresses
listed in there. Nothing was dated. Nobody knew which one was correct. Those kinds of things
that are important just to keep basic record keeping going didn't exist. It
wasn't there.
Course management...that was just a big plus? It was a plus to have it, but again, it wasn't
integrated. On the new system? Right. It wasn't a main driver, it was just a big plus? It was a
big plus because we were using Blackboard, which was a separate system... it's a great system.
But again, if students are enrolled in a class, it doesn't mean that they are on the Blackboard site.
And if they drop the class, they are still in Blackboard. Again, when the systems are so
separated, students don't necessarily understand that; even some of the faculty and staff don't
understand. And so all of these errors occur because of that, and it's really difficult.
Well then, when you were appointed or enlisted to find a replacement, how did you go about
that? I didn't really know how to go about it at first, to be honest. I just did some searches. I
had heard about different systems just from attending conferences and that kind of thing, so I
knew there were other systems out there that existed, that were better than ours, that had online
capabilities, made for fewer manual processes, that you could get reports out of and
understand your data. So I really just started searching using some online tools and finally got
kind of a list going, and eventually, when you kind of put your name out there, people start
contacting you. Their sales reps started contacting me. I eventually had a handful of systems
that all seemed like they had some similar types of features. I started getting some price quotes and
kind of knew about the price range we wanted to be in. Some were just way out of our league
as far as price, so those were automatically kind of rejected; some almost seemed like they were
too inexpensive. You do get what you pay for in some of these things. And I did look at online
demonstrations of every system. I had phone conversations and email conversations trying to
understand what the different products offered. I made kind of a list of features that all of them
had, and narrowed it down to about 5 or 6. At that time I presented that to the Administrative
Team, including the President, the VP of Academic Affairs, and the other Directors. We
decided on CAMS.
But you came down to like two? We came down to two that we thought were the best. They
asked me, "Which? What do you think? Seems like some of these are pretty similar. Which two
would you be the most comfortable with?" With a list of two, we got a little more in depth. Both of
those actually came and presented in person to us to go into more detail and show us live
demonstrations.
Did you involve key personnel or other personnel on campus?
[Interruption. Continued.]
If the new course program had features that didn't yet exist and they said they would develop
them, why did we give them instant credibility, believing that they would? They gave us a
list of references. I checked references from other people who used both systems that we had
narrowed it down to. We eventually chose CAMS. They were just willing to work with us, and
they were willing to say, "If this is something you need, this is something we can do for you."
And their current users said, "Yes, that's true." We've had a need and said this is something we
really need you to do, and they have been able to produce it fast.
And you still have Blackboard for a backup, don't you? Right, we still have Blackboard
for the Course Management System, but they told us they'll be able to convert all the courses so
that the tests and all the things that are in them won't have to be regenerated.
And one other thing that you had mentioned to me, because I was interested in whether or not
my model had any main missing parts that I needed to include, and you brought up a very
interesting theory, and that was what? Self-Determination Theory. That's actually a
motivational theory, a psychology theory, based on the idea that autonomy, competence, and relatedness
are the key elements that drive intrinsic motivation. Basically, it states that when those three
elements exist, the person will be intrinsically motivated to do whatever it is, make a change or
perform some type of task or whatever it is. That theory has been tested in a multitude of different
fields and areas. I really think it speaks to a number of different things and can be utilized in a
number of different situations.
And where we were looking at resistance/acceptance to change and how to manage the
change, you thought it was very good to look at Self-Determination Theory and how it
can be worked in to reduce the resistance. It had like three things, because I looked it up like
you told me to. It was to inform people of the need for change, give them a little bit of choice
in how it gets done, and acknowledge their feelings; those were like the three main things. It ties
in with our communication and management support being demonstrated throughout. If people
feel like they are being forced into something or don't really understand it, of course they are
going to be resistant to change. People don't like to change; even if what they currently do is
very manual and very difficult, they know it. They know the process, they know it works. They
know at the end they are going to get whatever they need to get. So when someone tries to
force the change on them, of course they're going to resist that and not be excited about it.
You would be pleased to know that I had printed the paper out, and when I had my interview
with the V.P. of Academic Affairs I showed it to her, and she made a note of it and said those three
things she was going to be sure to work in during the convocation. Oh, good. Starting the
early communication with the people that hadn't been directly involved. I thought that was really
good. So I really appreciate you mentioning that theory.