
REVIEW AND REDEFINITION OF SOU INSTITUTIONAL PEER COMPARATOR GROUP

Prepared By

Ryan Brown

Submitted in fulfillment of the requirements for


MM 598: Capstone Project
in the Master in Management Program of the
School of Business
Southern Oregon University

Under the supervision of


Rene Leo E. Ordonez, PhD
Professor of Business

Ashland, Oregon
Winter 2016
March 13, 2016
EXECUTIVE SUMMARY

Southern Oregon University (SOU) relies on peer comparisons for important management
and strategic planning decisions. However, the group of peer institutions currently used
by SOU through the National Center for Education Statistics (NCES) and its Integrated
Postsecondary Education Data System (IPEDS) was selected in 1997 and was not chosen
specifically for SOU.

Because this group is out of date, SOU may have been making key decisions based on
obsolete comparisons. This research project collected qualitative data from key SOU
offices and personnel in order to determine what criteria should be used in identifying
peer comparators. Quantitative analysis then identified new peers that better align with
current needs and priorities. Ensuring that SOU is comparing itself to true peers will allow
for stronger benchmarking and comparative analysis going forward.

Project objectives included:


- Determine proper criteria for SOU’s peer comparison institutions.
- Determine whether the current peer comparison group met the above criteria.
- If not, determine which institutions belong in SOU’s peer comparison group.
- Propose new custom comparison group for future SOU IPEDS reports.

Upon determining that a new peer comparison group was indeed needed for SOU, 55 key
members of the SOU community were surveyed regarding which criteria should be used
for identifying the new group, and whether and how those criteria should be weighted.

Survey results and subsequent analysis produced a group of 16 institutions recommended
to serve as the IPEDS peer comparison group for SOU going forward:

1. Eastern New Mexico University-Main Campus
2. Bemidji State University
3. University of Wisconsin-Green Bay
4. Western Oregon University
5. Emporia State University
6. University of Washington-Bothell Campus
7. University of South Florida-St Petersburg
8. Midwestern State University
9. The Evergreen State College
10. Indiana University-Northwest
11. Auburn University at Montgomery
12. Montana State University-Billings
13. Shepherd University

14. Missouri Western State University
15. University of South Carolina-Upstate
16. Eastern Oregon University

With the results of this project, SOU now has information that could result in an updated
peer comparison group, leading to better strategic planning and benchmarking.

ACKNOWLEDGEMENT

Thanks are due to Southern Oregon University Director of Institutional Research Chris
Stanek for his guidance and mentorship throughout this project. Thank you also to all
SOU faculty and staff who participated in the surveys.

TABLE OF CONTENTS

EXECUTIVE SUMMARY

ACKNOWLEDGEMENT

LIST OF TABLES

CHAPTER 1 - INTRODUCTION
    Background and Need for the Study
    Statement of Problem and Research Objectives

CHAPTER 2 - REVIEW OF RELATED LITERATURE

CHAPTER 3 - RESEARCH DESIGN AND METHODOLOGY
    Research Design
    Limitations of the Study

CHAPTER 4 - FINDINGS OF THE SURVEY

CHAPTER 5 - CONCLUSION AND RECOMMENDATION
    Conclusion
    Recommendation
    Suggestions for Future Studies

APPENDIX
    A  Current Peer Group Key Data
    B  List of Survey Recipients
    C  Possible Criteria Long List
    D  Survey No. 1
    E  Full Table Weighted Criteria
    F  Survey No. 2
    G  Research Proposal

BIBLIOGRAPHY

LIST OF TABLES

Table

1  Selected Criteria: Mean Importance and Weight
2  Top 50 Scoring Institutions Based on Weighted Total Score
3  Survey Results: Criteria Selection
4  Survey Results: Should Criteria Be Weighted?
5  Survey Results: Ranked Criteria from Most to Least Important
6  Criteria Importance: Mean Ranking
7  Survey Results: Should EOU and WOU Be Included?
8  Criteria Comparison: SOU vs. Recommended Institutions

CHAPTER 1

INTRODUCTION

A. Background and Need for the Study

Using data available from NCES and IPEDS, this research project undertook a review of
the colleges and universities currently being utilized by SOU as an institutional peer
comparison group.

Conversations with SOU’s Provost’s Office and Office of Institutional Research indicated
that the current peer comparison group was selected almost 20 years ago. At that time,
the Oregon University System (OUS) chose institutions to serve as a single peer
comparison group for all three of Oregon's public regional universities: SOU, Eastern
Oregon University (EOU), and Western Oregon University (WOU) (Weeks, Puckett, &
Daron, 2000). No information was identified during the research phase of this project to
indicate if SOU has ever selected and utilized a group of peer institutions chosen
specifically for SOU. When Weeks, Puckett, and Daron conducted their research, the
primary needs for establishing peer comparison groups for the OUS institutions included
“budgeting, faculty compensation analysis, performance measurement, and trend
analysis.” Comparisons among institutions were to be used as a critical aspect of a new
system-wide budgeting model.

This study was designed to determine if the current peer comparison group is still
relevant, and if not, which institutions should be included instead. In order to better
benchmark institutional priorities and new initiatives, SOU must be comparing itself
against true peers to help measure performance in key areas. By identifying a new peer
group, benchmarking and comparative analysis for SOU will be stronger going forward.

The current peer comparison group is comprised of:

- California State University-Stanislaus (CA)
- Eastern Washington University (WA)
- Fort Hays State University (KS)
- Plymouth State University (NH)
- Southeast Missouri State University (MO)
- Southern Utah University (UT)
- SUNY at Fredonia (NY)
- University of Mary Washington (VA)
- University of Michigan-Flint (MI)
- University of Wisconsin-Parkside (WI)

B. Statement of Problem and Research Objectives

SOU uses peer comparisons for a range of purposes, including evaluating the performance
of the institution as a whole in some areas, and evaluating the performance of specific
programs, initiatives, and institutional priorities and efforts. Specifically, metrics on which
SOU evaluates its performance compared to its peer institutions include enrollment,
number of degrees awarded annually, number of teaching faculty and staff, 6-year
bachelor's degree graduation rate, percent of students receiving Pell grants, student
services expenses per FTE enrollment, and more (see Appendix A).

Peer comparisons are also used for benchmarking future performance, strategic planning,
policy setting, and decision making; however, the group of universities that SOU currently
uses as a peer comparison group has not been updated in nearly 20 years. At the time the
current peer comparison group was formulated, OUS selected a group to apply to all three
of Oregon’s public regional universities collectively; it was not chosen specifically for SOU.

The objective of this research was to determine whether SOU is currently using an
outdated peer comparison group. Because the research indicated that the current group
was indeed obsolete, any comparative analysis, strategic planning, or benchmarking
based on comparisons to that group is likely problematic as well. Qualitative data
collected from the Provost's Office and other key departments on campus, together with
a review of existing literature on the subject, determined what criteria SOU should
consider when identifying a new peer comparison group. This information, combined
with quantitative analysis of data reported to NCES and IPEDS, determined which colleges
and universities were recommended for inclusion in SOU's peer comparison group.

Having a better, more sound comparison group should aid SOU in future planning as well
as in evaluating current and past practices.

Project objectives were:

- In consultation with the Provost’s Office, determine proper criteria for SOU’s peer
comparison institutions.
- Determine if the current peer comparison group meets the above criteria.
- If not, utilize existing NCES and IPEDS data to determine which institutions belong
in SOU’s peer comparison group.
- Propose new custom comparison group for future SOU IPEDS reports.

The findings of this research and recommendations will be presented to the Provost’s
Office and the Department of Institutional Research for their consideration and use going
forward.

CHAPTER 2

REVIEW OF RELATED LITERATURE

Much of the research into the selection of peer institutions dates from the 1980s, when
issues regarding the use of comparative data in higher education were examined from
an academic perspective by Teeter (1983), and Brinkman and Teeter (1987) defined the
various types of comparison groups. Even prior to Brinkman and Teeter's work, Terenzini,
Hartmark, Lorang, and Shirley (1980) identified math-based practices for assessing and
comparing institutions of higher education according to quantitative criteria and
variables. This work was later further refined by Teeter and Christal (1984).

In subsequent years, much of this earlier work has been revisited, with variations and
refinements made. One key addition to the earlier work has been utilization of
comparison methods that endeavor to better incorporate both quantitative and
qualitative analysis (Ingram, 1995; Zhao & Dean, 1997). When the OUS conducted
research for the establishment of peer groups in 1997, Weeks, Puckett, and Daron (2000)
found that comparisons with peer universities “can be an effective way for university
presidents to communicate with legislators, board members, and other stakeholders
about where their institution stands.” In the years since Weeks, Puckett and Daron
conducted their research, the OUS has disbanded. When established, the current peer
comparison group was selected for its relevance for all three of Oregon’s public regional
universities (SOU, EOU, WOU). Oregon Institute of Technology, the fourth of the state's
Technical and Regional Universities, was considered separately due to its technical focus
and mission. For the first time in decades, SOU now has an opportunity to establish
a peer comparison group specific to its own needs.

The OUS research recommended that the number of peer institutions in the group be
kept at a manageable number, and that it be focused on institutions with missions that
align (Weeks, Puckett, & Daron, 2000). Additionally, Hurley (2002) found that the primary
functions of institutional peer groups fall into three main categories: financial, program,
and planning. That is, Hurley’s research indicated that institutions primarily utilize peer
grouping to establish financial resource allocation, measure the performance of the
institution or specific programs against other institutions, and conduct long-term
planning. Carrigan (2012) describes a “thoughtful and deliberate” peer selection process
that involves numerous institutional stakeholders, including senior academic and
administrative leadership.

Although the review of literature did reveal at least one study indicating that an
institution's peer comparison group should be reviewed on a six-year cycle (Weeks,
Puckett, & Daron, 2000), no identified research focused specifically on the review of an
existing peer group.

Though NCES and IPEDS provide access to a large quantity of data on nearly every
institution of higher education in the United States, there are some limitations to what
data can be accessed and how, including the number of institutions that can be included
in a single query, broadness of some of the statistical categories, and exclusion of some
data that institutional decision makers might find useful (Schuh, 2002). Despite these
limitations, Schuh concludes that IPEDS data serves as a relevant and useful source for
cross-institutional comparisons (2002). Gater and Lombardi also called the IPEDS data
system “the most readily available and widely used source” for national higher education
data (2001).

CHAPTER 3

RESEARCH DESIGN AND METHODOLOGY

A. Research Design

This research project comprised two primary phases: information gathering and analysis.
The information gathering process included a review of existing literature as well as one-
on-one interviews with key SOU personnel, primarily from the Provost's Office and Office
of Institutional Research; the analysis phase consisted of both quantitative and
qualitative analysis. Additionally, two surveys were used to collect important
information. Each survey was sent to 55 key members of the SOU faculty, staff, and
administration. The recipients included administrative leadership, academic department
chairs and division directors, many administrative directors, and other select members of
the SOU community (see Appendix B for the full list of recipients).

Each survey was brief, the first asking just three questions and the second asking only
two. The initial survey was designed to identify which criteria would be used going
forward in the possible selection of a new peer comparison group by narrowing down the
number of criteria to be considered when selecting institutions. The second survey built
off of the original survey by soliciting input to incorporate when determining whether and
how to weight the final criteria based on importance.

Initial interviews with key personnel, particularly the Director of Institutional Research,
helped to identify possible criteria to consider when comparing institutions and
identifying those that may be considered for inclusion in SOU’s peer comparison group.
Additional criteria were added to this list after careful review of the existing literature, as
information from previous institutions that had undergone similar processes was
reviewed. This resulted in an initial “long list” of possible criteria (see Appendix C).

Because this list was deemed too lengthy to use for the initial survey, further input from
the Director of Institutional Research and another careful review of existing literature
refined the list, resulting in 32 potential criteria to send to survey recipients for input. The
survey asked three questions (see Appendix D for the full survey and results), the first of
which was: “Which of the below criteria do you believe should be used in designating the
SOU peer comparison group? Please check all that apply.”

The survey was designed to identify approximately 10-12 criteria from the refined list of
32 that the respondents felt should be utilized when selecting SOU’s new peer
comparison group.

The additional questions included in the initial survey were: “Are there criteria not listed
above that you believe should be considered when selecting the SOU peer comparison
group? If so, please write your choice(s) in below,” and “Are there institutions that you

believe should be considered for inclusion in the SOU peer comparison group regardless
of which criteria they match? If so, please write your choice(s) below." While the question
regarding additional criteria did not elicit useful responses, the question about specific
institutions did. Only four respondents answered that question in the affirmative, but all
four indicated that both EOU and WOU should be included in the SOU peer comparison
group regardless of how they match the selected criteria. This led to a follow-up question
in the subsequent survey.

The second survey, which was sent to the same 55 individuals, asked just two questions.
The first asked respondents whether the final criteria, identified through the initial
survey, should be weighted based on importance; those who responded in the
affirmative were then given an opportunity to rank the criteria from most to least
important.

The final question of the second survey asked recipients if they thought EOU and WOU
should be included in SOU’s peer comparison group regardless of how they match the
criteria.

The three criteria ranked by respondents as most important ("control of institution
(public vs. private)," "status as a regional institution," and "FTE enrollment") were
considered separately from the other seven criteria for weighting purposes. Because
control of institution was chosen as most important by a wide margin, only other public
universities were considered when identifying possible institutions for SOU's new peer
comparison group. FTE enrollment, ranked the third-most important criterion, was used
to bound the initial pool: institutions within a range of +/- 25 percent of SOU's FTE
enrollment were considered. The most recent data available through IPEDS indicated an
FTE enrollment of 4,394 for SOU, so the range used was 3,295 to 5,493. There are 117
public universities in the IPEDS system between 3,295 and 5,493 FTE enrollment,
providing the initial list of institutions to be considered for SOU's new peer comparison
group. Because determining regional status would require substantial time researching
individual institutions, the question of how to weight that criterion, which ranked second
most important, was reserved for later in the research process.
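The +/- 25 percent enrollment band described above can be reproduced with a short calculation. This is a sketch: the floor/ceiling rounding convention is an assumption, chosen only because it matches the bounds the report states (3,295 and 5,493).

```python
# Sketch: reproduce the +/- 25 percent FTE enrollment band around SOU's
# reported FTE of 4,394. The floor/ceil rounding is an assumption made so
# the results match the bounds stated in the report.
import math

SOU_FTE = 4394

lower = math.floor(SOU_FTE * 0.75)  # 3295.5 -> 3295
upper = math.ceil(SOU_FTE * 1.25)   # 5492.5 -> 5493

print(lower, upper)  # 3295 5493
```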

The remaining seven criteria—those ranked fourth to tenth most important—were
weighted based on the mean ranking each received. The lowest-ranked criterion,
"percentage of undergraduate students over age 25," had a mean importance ranking of
7.89. This score was set as a weight of 1.00, with each other criterion given a weight
based on how much more important it was ranked, according to its mean score. The next-
lowest-ranked criterion, "percent of undergraduate students receiving Pell grants," had
a mean score of 6.58, or 17 percent more important than the baseline score of 7.89; thus,
it was weighted at 1.17. The table below shows the weights of each of the seven criteria
that were weighted in this manner.

6
Criteria                                          Mean Importance   Weight
Percent of undergrads over age 25                       7.89         1.00
Percent of undergrads receiving Pell Grant              6.58         1.17
4-year graduation rate                                  6.42         1.19
Percent of total revenue from tuition and fees          5.95         1.25
Student-to-faculty ratio                                5.68         1.28
Percent of FTE comprised of undergrads                  5.11         1.35
FTE enrollment                                          4.58         1.42

Note: a lower "mean importance" score means the criterion was ranked by respondents
as more important. In the survey, the scale was 1 – 10 from most to least important.
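The weighting rule described above (a baseline weight of 1.00 for the least important criterion, plus each criterion's fractional distance below the 7.89 baseline) can be sketched as follows. The two-decimal rounding is an assumption made to match the published table.

```python
# Sketch of the report's weighting rule: weight = 1 + (baseline - mean) / baseline,
# where the baseline is the worst (highest) mean importance ranking, 7.89.
# Rounding to two decimals is an assumption chosen to match the table above.
mean_rankings = {
    "Percent of undergrads over age 25": 7.89,
    "Percent of undergrads receiving Pell Grant": 6.58,
    "4-year graduation rate": 6.42,
    "Percent of total revenue from tuition and fees": 5.95,
    "Student-to-faculty ratio": 5.68,
    "Percent of FTE comprised of undergrads": 5.11,
    "FTE enrollment": 4.58,
}

baseline = max(mean_rankings.values())  # 7.89
weights = {name: round(1 + (baseline - mean) / baseline, 2)
           for name, mean in mean_rankings.items()}

for name, weight in weights.items():
    print(f"{name}: {weight}")
```

Running this reproduces every weight in the table, e.g. 6.58 is 17 percent better than the 7.89 baseline and so maps to 1.17.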

Each of the 117 institutions was scored on a scale of 0-10 based on its proximity to SOU
on each of the selected criteria. An institution within +/- 5 percent of SOU on a given
criterion received 10 points for that criterion; within 5-10 percent, 8 points; within 10-15
percent, 6 points; within 15-20 percent, 4 points; and within 20-25 percent, 2 points. If
an institution fell outside the +/- 25 percent range, it received no points for that criterion.
These scores were then multiplied by the weight established for each criterion, producing
a weighted score on each criterion for each of the 117 institutions. The 50 institutions
with the highest total scores are below (see Appendix E for the full table of weighted
scores for all 117 institutions).
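The scoring bands can be expressed as a small function. This is a sketch under the assumption that each band's outer boundary is inclusive; the report does not specify which band a value falling exactly on a boundary belongs to.

```python
# Sketch of the proximity scoring bands: an institution's value on a criterion
# is compared with SOU's, and the fractional deviation maps to a point score.
# Treating each band's upper boundary as inclusive is an assumption; the report
# does not say how exact boundary values are handled.
def proximity_score(sou_value: float, other_value: float) -> int:
    deviation = abs(other_value - sou_value) / sou_value  # fractional distance from SOU
    for limit, points in [(0.05, 10), (0.10, 8), (0.15, 6), (0.20, 4), (0.25, 2)]:
        if deviation <= limit:
            return points
    return 0  # outside +/- 25 percent: no points

def weighted_score(sou_value: float, other_value: float, weight: float) -> float:
    # Multiply the band score by the criterion's weight from Table 1.
    return proximity_score(sou_value, other_value) * weight

# Example: a 4-year graduation rate of 24 percent vs. SOU's 22 percent is a
# ~9 percent deviation, so it scores 8 points, times the 1.19 weight.
print(weighted_score(22, 24, 1.19))  # 9.52
```

An institution's total weighted score is then the sum of these per-criterion values, which is how the table below was produced.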

Southern Oregon University 86.10
Eastern New Mexico University-Main Campus 59.64
Bemidji State University 59.14
University of Wisconsin-Green Bay 58.12
Western Oregon University 58.06
Emporia State University 57.64
University of Washington-Bothell Campus 57.30
University of South Florida-St Petersburg 56.88
Midwestern State University 55.70
The Evergreen State College 54.78
Indiana University-Northwest 54.38
Auburn University at Montgomery 54.22
Montana State University-Billings 53.68
Shepherd University 53.18
Missouri Western State University 53.14
University of South Carolina-Upstate 51.78
University of Wisconsin-Parkside 51.54
Lock Haven University 50.86
California State University-Channel Islands 48.86
Clarion University of Pennsylvania 47.88
Indiana University-South Bend 47.70
Saint Johns River State College 47.62
East Central University 47.46
SUNY College at Old Westbury 47.16
Southwestern Oklahoma State University 47.08
Indiana University-Southeast 47.04
Southern Polytechnic State University 46.90
Arizona State University-Polytechnic 46.12
Pennsylvania State University-Penn State Abington 45.00
Delaware State University 44.50
The University of Texas of the Permian Basin 44.38
University of Cincinnati-Blue Ash College 43.28
Frostburg State University 43.24
Alabama A & M University 42.60
Gulf Coast State College 41.90
Oklahoma State University-Oklahoma City 41.40
Southwest Minnesota State University 40.68
Cameron University 40.34
Georgia Highlands College 40.26
Grambling State University 39.80
Fort Lewis College 39.70
University of Mary Washington 39.70

Additional research on the 20 institutions with the highest total weighted scores was
conducted in order to determine if they were regional institutions. This analysis included
review of each university’s website, analysis of mission statements, contact with
representatives from the institutions if regional status was not apparent, and reliance on
secondary sources, including the U.S. News and World Report. This research indicated
that the top 20 scoring institutions all serve a regional mission, so further analysis of the
remaining 97 universities was deemed unnecessary: even if their overall scores improved
based on regional status, they would not improve relative to other regional institutions
and so could not leapfrog the highest-ranked universities.

B. Limitations of the Study

One of the major limitations of this study was the number of survey respondents. The 55
survey recipients were carefully selected based on their role within SOU and a desire to
have their feedback reflected in the survey results. Fewer than half of the 55 recipients
responded to the surveys, limiting the amount of input received. Additionally, because
the surveys were anonymous, there is no record of which recipients responded. If a larger
percentage of staff members and administrators responded than faculty members, for
example, the results may differ substantially from what they would have been had the
reverse been true. One way to remedy this issue in future research would be to include a
question on each survey asking the respondent's role at SOU. While not identifying
individual survey-takers, this information would provide valuable insight into whether
there may be bias in the results.

Another limitation, regarding one of the selected criteria, came to light late in the
process. "Mix of full-time to adjunct faculty" was included in the long list and the initial
refined list of possible criteria after being identified through the review of existing
literature, and it was then one of the 10 criteria selected by respondents for inclusion in
the final list to be considered when identifying a new peer comparison group. It was not
until IPEDS data began to be compiled for the purpose of ranking institutions that it was
discovered that this metric is not tracked by NCES and IPEDS. As such, it was omitted
from the final tabulations, resulting in rankings based on just nine criteria instead of 10.

CHAPTER 4

FINDINGS OF THE SURVEY

As previously stated, the two surveys were sent to a recipient list of 55 key SOU personnel.
The initial survey received 24 responses with the follow-up survey receiving 27 responses.
The number of responses to the various questions within each survey varied slightly. This
is fewer responses than hoped for, but enough to provide useful data and draw
conclusions.

The initial survey provided respondents with a list of 32 potential criteria and asked them
to choose all from the list that they thought should be considered when selecting a new
peer comparison group. Of these initial 32 potential criteria, ten criteria were selected by
more than 50 percent of respondents, becoming the full list of criteria that would be used
when selecting the new peer comparison group. The top 18 criteria are presented in the
table below (for full results, see Appendix D).

Criteria                                                        No.    %
Control of institution (public vs. private)                     19    86%
Status as a regional institution                                18    82%
Percentage of FTE made up by undergraduate students             16    73%
FTE enrollment                                                  16    73%
Student-to-faculty ratio                                        14    64%
Percentage of undergraduates receiving Pell grants              14    64%
Mix of full-time faculty to adjunct faculty                     12    55%
Percentage of total revenue from tuition                        12    55%
Percentage of undergraduates over age 25                        12    55%
Four-year graduation rate                                       12    55%
Carnegie classification                                         11    50%
Six-year graduation rate                                        11    50%
Degree of urbanization of locale                                11    50%
Percentage of FTE made up by graduate students                  10    45%
FTE-to-headcount ratio                                          10    45%
Percentage of student body identified as minority               10    45%
First-time, full-time fall-to-fall retention rate                9    41%
Percentage of total degrees awarded that are bachelor degrees    9    41%

The second survey, sent to the same 55 recipients, asked just two questions. The first
asked respondents whether the ten selected criteria should be weighted based on
importance, and the result was very clear: 80 percent of respondents indicated that yes,
they should be weighted:

Answer                                                      No.     %
Yes, the criteria should be weighted based on importance.   20     80%
No, all criteria should be counted equally.                  5     20%
Total                                                       25    100%

Those who replied in the affirmative were then asked to rank the ten criteria from most
to least important, with 1 being the score for most important and 10 the score for least
important.

Answer                                        1*   2   3   4   5   6   7   8   9  10*  Total
Control of institution (public vs. private)   10   2   1   3   1   1   1   0   0   0    19
Status as a regional institution               1   8   3   1   1   0   2   0   3   0    19
FTE enrollment                                 1   1   6   5   2   0   0   2   0   2    19
Percentage of FTE made up by
undergraduate students                         1   1   4   4   1   3   0   2   3   0    19
Four-year graduation rate                      1   1   3   0   1   2   4   2   1   4    19
Student-to-faculty ratio                       1   3   0   2   3   2   3   1   3   1    19
Mix of full-time faculty to adjunct faculty    3   0   1   0   2   2   4   2   2   3    19
Percentage of total revenue from tuition       0   1   1   2   3   5   2   4   1   0    19
Percentage of undergraduates over age 25       0   2   0   0   1   1   1   4   4   6    19
Percentage of undergraduates receiving
Pell Grants                                    1   0   0   2   4   3   2   2   2   3    19
Total                                         19  19  19  19  19  19  19  19  19  19     -

*1 = most important; 10 = least important.

The results showed a clear first choice: "Control of institution (public vs. private)" was
ranked most important by more than half of respondents and was the only criterion with
a mean ranking below 4. Only two other criteria ("Status as a regional institution" and
"FTE enrollment") had mean rankings under 5; the remaining seven criteria ranged from
a mean ranking of 5.11 to 7.89.

Criteria                                        Mean Ranking
Control of institution (public vs. private)         2.47
Status as a regional institution                    4.00
FTE enrollment                                      4.58
Percentage of FTE made up by undergrads             5.11
Student-to-faculty ratio                            5.68
Percentage of total revenue from tuition            5.95
Mix of full-time faculty to adjunct faculty         6.32
Four-year graduation rate                           6.42
Percent of undergrads receiving Pell Grant          6.58
Percentage of undergraduates over age 25            7.89

Note: A lower “mean ranking” correlates with a higher importance as ranked by survey
respondents. In the survey, importance was ranked from 1 – 10 with 1 being “most
important” and 10 being “least important.”

The final question of the second survey asked recipients if they thought EOU and WOU
should be included in SOU’s peer comparison group regardless of how they match the
criteria. A slight preference was indicated for including both institutions.

Should EOU and WOU Be Included?   Responses      %
Yes                                   13        54%
No                                    11        46%
Total                                 24       100%

CHAPTER 5

CONCLUSION AND RECOMMENDATION

A. Conclusion

After comparing data from the current peer comparison group to SOU, reviewing existing
literature—including the process by which the current group was selected—and
consulting with key SOU personnel, the determination was made that a new peer
comparison group was indeed necessary for the university. The criteria used to select the
current group in 1997 are no longer what survey respondents felt should be considered
when choosing a peer comparison group.

The criteria by which this new group is to be selected consists of:

- Control of institution (public vs. private)
- Status as a regional institution
- FTE enrollment
- Percentage of FTE made up by undergraduate students
- Four-year graduation rate
- Student-to-faculty ratio
- Percentage of total revenue from tuition and fees
- Percentage of undergraduates over age 25
- Percentage of undergraduates receiving Pell Grant

Based on survey results, these criteria are weighted according to importance.

B. Recommendation

Applying the above criteria to the 117 institutions identified as public universities with an
FTE enrollment within +/- 25 percent of SOU's yields 15 institutions that earned at least
60 percent of the total possible weighted score. These are the institutions that most
closely align with SOU based on the selected criteria and weighting.

Per the survey results and findings of this research, it is recommended that SOU change
its peer comparison group to these 15 institutions, plus Eastern Oregon University,
which survey respondents indicated should be included regardless of how it matches the
criteria. (Western Oregon University, which respondents also wanted included, already
appears in the list of 15 institutions that earned more than 60 percent of the total
possible weighted score.)

1. Eastern New Mexico University-Main Campus


2. Bemidji State University

3. University of Wisconsin-Green Bay
4. Western Oregon University
5. Emporia State University
6. University of Washington-Bothell Campus
7. University of South Florida-St Petersburg
8. Midwestern State University
9. The Evergreen State College
10. Indiana University-Northwest
11. Auburn University at Montgomery
12. Montana State University-Billings
13. Shepherd University
14. Missouri Western State University
15. University of South Carolina-Upstate
16. Eastern Oregon University

This group represents the institutions that most closely match SOU on the metrics that
surveyed members of the SOU community felt are most important for comparing the
university to other institutions.

Of the ten institutions that make up the current SOU peer comparison group, four appear
in the initial list of 117 public universities with FTE enrollment between 3,295 and
5,493 from which the new recommended group was drawn. The highest-ranking among them
was the University of Wisconsin-Parkside at No. 16.

Institution                                   % UG Pell   % UG over 25   12-mo FTE   UG share of FTE   4-yr grad rate   % revenue tuition/fees   Student-faculty ratio   Weighted score
Southern Oregon University                    38          25             4394        0.90              22               40                       21                      86.10
Eastern New Mexico University-Main Campus     36          26             4197        0.86              11               19                       19                      59.64
Bemidji State University                      35          n/a            4269        0.97              27               36                       22                      59.14
University of Wisconsin-Green Bay             31          25             5229        0.98              24               35                       23                      58.12
Western Oregon University                     43          18             4923        0.90              22               40                       17                      58.06
Emporia State University                      40          15             4828        0.72              22               43                       18                      57.64
University of Washington-Bothell Campus       34          25             4435        0.92              38               24                       19                      57.30
University of South Florida-St Petersburg     41          32             4260        0.90              17               34                       19                      56.88
Midwestern State University                   38          26             4948        0.98              18               32                       17                      55.70
The Evergreen State College                   46          34             4392        0.94              46               37                       22                      54.78
Indiana University-Northwest                  31          26             4424        0.93              9                41                       15                      54.38
Auburn University at Montgomery               42          27             4097        0.88              9                42                       15                      54.22
Montana State University-Billings             37          41             4152        0.93              10               39                       17                      53.68
Shepherd University                           33          23             3631        0.96              20               44                       16                      53.18
Missouri Western State University             43          20             4400        0.97              15               41                       17                      53.14
University of South Carolina-Upstate          46          20             5180        0.98              23               43                       17                      51.78

C. Suggestions for Future Studies

To ensure that SOU's peer comparison group does not once again become outdated, this
process should be repeated every five to seven years so that the group remains relevant
and inclusive of institutions that are similar to SOU. Additionally, future surveys
should present the criteria in random order; a process for randomizing the order of
criteria should be identified and applied before surveys are disseminated.
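The suggested randomization could be implemented with a simple shuffle, giving each survey recipient an independently ordered criteria list. This is a sketch under stated assumptions: the criterion list is abbreviated, and the per-recipient seed is a hypothetical convenience for reproducibility.

```python
import random

# Present the criteria in a fresh random order for each survey recipient,
# so that list position does not bias which criteria get selected or
# ranked highly. The criterion list is abbreviated for illustration.
CRITERIA = [
    "Control of institution (public vs. private)",
    "Status as a regional institution",
    "FTE enrollment",
    "Four-year graduation rate",
    "Student-to-faculty ratio",
]

def randomized_survey_order(criteria, seed=None):
    """Return a shuffled copy; a per-recipient seed makes the order reproducible."""
    rng = random.Random(seed)
    ordered = list(criteria)  # copy so the master list is never reordered
    rng.shuffle(ordered)
    return ordered

# Each recipient sees the same items, just in a different order.
print(randomized_survey_order(CRITERIA, seed="recipient-001"))
```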

APPENDIX A

CURRENT PEER GROUP KEY DATA

# of Bachelor's
University            FTE Enrollment   Headcount Enrollment   Bachelor's Degrees Awarded   Master's Degrees Awarded   FTE Teachers and Staff   6-year Bachelor's Grad Rate*   % Undergrads Receiving Pell   Student Services Expenses per FTE
SOU                   4,650            7,908                  773                          215                        258                      33%                            39%                           $1,222
CSU Stanislaus        7,054            9,087                  1,621                        316                        349                      49%                            58%                           $2,346
Eastern Washington    11,831           16,025                 2,252                        465                        547                      46%                            39%                           $1,572
Fort Hays             8,728            16,059                 2,798                        452                        319                      40%                            28%                           $1,109
Plymouth State        4,838            7,025                  878                          336                        296                      59%                            27%                           $1,850
Southeast MO          9,678            13,812                 1,623                        365                        394                      46%                            36%                           $1,860
Southern Utah         6,522            15,478                 930                          315                        275                      35%                            38%                           $2,114
SUNY Fredonia         5,523            5,987                  1,229                        137                        340                      63%                            35%                           $1,661
Mary Washington       4,506            5,802                  1,026                        246                        292                      71%                            15%                           $1,558
Michigan - Flint      6,820            10,042                 1,102                        343                        388                      37%                            45%                           $1,296
Wisconsin - Parkside  3,843            5,605                  678                          45                         200                      28%                            43%                           $3,343

*Based on first-time, full-time, degree/certificate-seeking students

APPENDIX B

LIST OF SURVEY RECIPIENTS

Name Title
Roy Saigo President
Susan Walsh Provost and VP for Academic and Student Affairs
Craig Morris VP for Finance and Administration
Liz Shelby Chief of Staff (retired)
Karen Stone AVP for Academic Resource Management
Jody Waters Associate Provost and Director of Graduate Studies
Lisa Garcia-Hanson AVP for Enrollment and Retention
Jeffery Gayton University Librarian
Chris Stanek Director of Institutional Research
Kelly Moutsatson Director of Admissions
Kristin Nagy Catz Director of University Assessment
Jennifer Fountain Director of Student Life
Danielle Mancuso Assistant Director of Student Life
Tim Robitz Director of Housing
Matt Sayre Director of Athletics
Bobby Heiken Associate Director of Athletics
David Humphrey Division Director - Oregon Center for the Arts
Greg Jones Division Director - Business, Comm., and the Environment
John King Division Director - Education, Health, and Leadership
Scott Rex Division Director - Humanities and Culture
Dan DeNeui Division Director - Social Sciences
Sherry Ettlich Division Director - STEM
Lee Ayers Division Director - Undergraduate Studies
Deborah Rosenberg Chair - Creative Arts
Miles Inada Chair - Creative Arts
Vicki Purslow Chair - Music
Laurie Kurutz Chair - Theatre
Scott Malbaurn Chair - Schneider Museum of Art
Joan McBee Chair - Business
Alena Ruggerio Chair - Communication
Alissa Arp Chair - Environmental Studies
Roni Adams Chair - Education Undergrad
Amy Belcastro Chair - Education Grad
Jamie Vener Chair - Health, OAL, and PE
Daniel Morris Chair - Foreign Language and Literatures
Wesley Leonard Chair - Native American Studies
Prakash Chenjeri Chair - Philosophy
Kylan de Vries Chair - GSWS
Charlotte Hadella Chair - English
Alison Burke Chair - Criminology and Criminal Justice
Linda Wilcox Young Chair - Economics
Pat Acklin Chair - Geography
Dustin Walcher Chair - History and Political Science
Paul Murray Chair - Psychology
Mark Shibley Chair - Sociology and Anthropology
Michael Parker Chair - Biology
Laura Hughes Chair - Chemistry
Peter Nordquist Chair - Computer Science
Jim Hatton Chair - Mathematics
Peter Wu Chair - Physics
Deborah Brown Chair - USEM
Curt Bacon Chair - Acc. Bacc.
Ken Mulliken Director - Honors College
Eva Skuratowicz Director - SOU Research Center
Janet Fratella VP for Development and Exec. Dir. of SOU Foundation

APPENDIX C

POSSIBLE CRITERIA LONG LIST

Carnegie classification
FTE enrollment
Part-time enrollment as a percentage of total FTE enrollment
Degrees awarded in business as a percentage of total degrees awarded
Degrees awarded in education as a percentage of total degrees awarded
Degrees awarded in humanities/social science as a percentage of total degrees awarded
Degrees in STEM as a percentage of total degrees awarded
Ratio of research to instructional expenditures
Geographic region
Control of Institution (Private vs. Public)
Highest degree level offered
Total student head count
FTE-to-headcount ratio
Percentage of undergraduate FTE
Percentage of graduate FTE
Percentage of student body identified as minority
Percentage of undergrads over age 25
Percentage of all students over age 25
Six-year graduation rate
Four-year graduation rate
First-time, full-time fall-to-fall retention rate
Student-to-faculty ratio
Freshman selectivity (percentage of rejected applicants)
Percentage of undergraduates receiving Pell grants
Percentage of undergraduates receiving federal loans
In-state undergraduate tuition and mandatory fees
Out-of-state undergraduate tuition and mandatory fees
In-state graduate tuition and mandatory fees
Out-of-state graduate tuition and mandatory fees
Number of instructional faculty
Percentage of full-time instructional faculty
Average faculty salary
Percentage of institutional expenditures on instruction
Average tuition per FTE
Average federal revenue per FTE
Average state/local revenue per FTE
Average total revenue per FTE
Percentage of revenue from tuition
Percentage of revenue from federal

Percentage of revenue from state/local
Number of degrees awarded
Percentage bachelor's
Percentage master's
Percent in humanities and social sciences
Percent in education
Percent in STEM
Percent in business
Percent in communications and art
Percent of black, non-Hispanic students
Percent of American Indian/Alaskan Native students
Percent of Asian/Pacific Islander students
Percent Hispanic students
Degree of urbanization of location
Regional institution

APPENDIX D

SURVEY NO. 1

Question No. 1: Which of the below criteria do you believe should be used in designating
the SOU peer comparison group? Please check all that apply.

(Results begin on next page)

#    Answer                                                                                Responses   %
2    Control of institution (public vs. private)                                           19          86%
3    Status as a regional institution                                                      18          82%
7    Percentage of FTE made up by undergraduate students                                   16          73%
5    FTE enrollment                                                                        16          73%
20   Student-to-faculty ratio                                                              14          64%
31   Percentage of undergraduates receiving Pell grants                                    14          64%
22   Mix of full-time faculty to adjunct faculty                                           12          55%
26   Percentage of total revenue from tuition                                              12          55%
30   Percentage of undergraduates over age 25                                              12          55%
17   Four-year graduation rate                                                             12          55%
1    Carnegie classification                                                               11          50%
18   Six-year graduation rate                                                              11          50%
4    Degree of urbanization of locale                                                      11          50%
8    Percentage of FTE made up by graduate students                                        10          45%
6    FTE-to-headcount ratio                                                                10          45%
29   Percentage of student body identified as minority                                     10          45%
19   First-time, full-time fall-to-fall retention rate                                     9           41%
10   Percentage of total degrees awarded that are bachelor's degrees                       9           41%
27   Percentage of total revenue from state/local sources                                  8           36%
24   Average tuition per FTE                                                               7           32%
25   Average state/local revenue per FTE                                                   6           27%
23   In-state undergraduate tuition and mandatory fees                                     6           27%
9    Number of degrees awarded                                                             6           27%
11   Percentage of total degrees awarded that are master's degrees                         6           27%
32   Percentage of undergraduates receiving federal loans                                  6           27%
21   Number of instructional faculty                                                       5           23%
12   Degrees awarded in business as a percentage of total degrees awarded                  4           18%
16   Degrees awarded in communication and art as a percentage of total degrees awarded     3           14%
13   Degrees awarded in education as a percentage of total degrees awarded                 3           14%
14   Degrees awarded in humanities/social science as a percentage of total degrees awarded 3           14%
15   Degrees awarded in STEM as a percentage of total degrees awarded                      3           14%
28   Ratio of research to instructional expenditures                                       3           14%

Question No. 2: Are there criteria not listed above that you believe should be considered
when selecting the SOU peer comparison group? If so, please write your choice(s) in
below.

Results:
Text Response
a;sdlkfjs;adl
Degree of urbanization of locale is a good measure, but potentially better is a
measure that defines isolation from metropolitan areas.
Have specific targeted degrees/programs that match SOU's mission, i.e. EMDA,
Performing Arts, Digital Cinema, Social Media & Public Engagement
no
Accrediting body?
Nothing comes to mind, but I would recommend items that are more fixed
demographics and not something we might set as a goal to move. For example, that
is why I did not check graduation rates.
Athletic affiliation (NAIA or NCAA II, III)
Percentage of county residents who attend institution, Percentage of students over
the age of 30, Percentage of students who are veterans, SES of students
1st generation students, median family income,
NAIA, on-campus housing, sustainability ranking, LGBT friendly

Question No. 3: Are there institutions that you believe should be considered for inclusion
in the SOU peer comparison group regardless which criteria they match? If so, please
write your choice(s) below.

Results:

Text Response
asdf;lkdsajf
all public universities within the state
no
Question: Some criteria above look like items to compete/compare such as graduation
rates, but I would not require universities to have a similar grad rate to qualify as a
peer comparison institution.
CSU Humboldt, Western Oregon University, Eastern Oregon University
Eastern and Western
Western Oregon University, Evergreen (WA state), Eastern Oregon

APPENDIX E

WEIGHTED CRITERIA FULL TABLE


Column layout: for each institution, the table lists the criterion value, unweighted
score, and weighted score for each of seven criteria, in this order: % of undergrads
receiving Pell grants; % of undergrads over age 25; 12-month FTE enrollment; % of FTE
comprised of undergrads; 4-year graduation rate; % of revenue from tuition and fees;
student-to-faculty ratio. The final two columns give the unweighted total and the
weighted total score.
Southern Oregon University 38 10 11.7 25 10 10 4394 10 14.2 0.90 10 13.5 22 10 11.9 40 10 12 21 10 12.8 70 86.10
Eastern New Mexico University-Main Campus 36 10 11.7 26 10 10 4197 10 14.2 0.86 10 13.5 11 0 0 19 0 0 19 8 10.24 48 59.64
Bemidji State University 35 8 9.36 0 0 4269 10 14.2 0.97 8 10.8 27 2 2.38 36 8 9.6 22 10 12.8 46 59.14
University of Wisconsin-Green Bay 31 4 4.68 25 10 10 5229 4 5.68 0.98 8 10.8 24 8 9.52 35 6 7.2 23 8 10.24 48 58.12
Western Oregon University 43 6 7.02 18 0 0 4923 6 8.52 0.90 10 13.5 22 10 11.9 40 10 12 17 4 5.12 46 58.06
Emporia State University 40 10 11.7 15 0 0 4828 8 11.36 0.72 4 5.4 22 10 11.9 43 8 9.6 18 6 7.68 46 57.64
University of Washington-Bothell Campus 34 8 9.36 25 10 10 4435 10 14.2 0.92 10 13.5 38 0 0 24 0 0 19 8 10.24 46 57.30
University of South Florida-St Petersburg 41 8 9.36 32 0 0 4260 10 14.2 0.90 10 13.5 17 2 2.38 34 6 7.2 19 8 10.24 44 56.88
Midwestern State University 38 10 11.7 26 10 10 4948 6 8.52 0.98 8 10.8 18 4 4.76 32 4 4.8 17 4 5.12 46 55.70
The Evergreen State College 46 4 4.68 34 0 0 4392 10 14.2 0.94 10 13.5 46 0 0 37 8 9.6 22 10 12.8 42 54.78
Indiana University-Northwest 31 4 4.68 26 10 10 4424 10 14.2 0.93 10 13.5 9 0 0 41 10 12 15 0 0 44 54.38
Auburn University at Montgomery 42 8 9.36 27 8 8 4097 8 11.36 0.88 10 13.5 9 0 0 42 10 12 15 0 0 44 54.22
Montana State University-Billings 37 10 11.7 41 0 0 4152 8 11.36 0.93 10 13.5 10 0 0 39 10 12 17 4 5.12 42 53.68
Shepherd University 33 6 7.02 23 8 8 3631 4 5.68 0.96 8 10.8 20 8 9.52 44 8 9.6 16 2 2.56 44 53.18
Missouri Western State University 43 6 7.02 20 4 4 4400 10 14.2 0.97 8 10.8 15 0 0 41 10 12 17 4 5.12 42 53.14
University of South Carolina-Upstate 46 4 4.68 20 4 4 5180 4 5.68 0.98 8 10.8 23 10 11.9 43 8 9.6 17 4 5.12 42 51.78
University of Wisconsin-Parkside 44 6 7.02 24 10 10 3801 6 8.52 0.98 8 10.8 8 0 0 31 2 2.4 21 10 12.8 42 51.54
Lock Haven University 40 10 11.7 9 0 0 5117 4 5.68 0.91 10 13.5 28 2 2.38 47 4 4.8 21 10 12.8 40 50.86
California State University-Channel Islands 46 4 4.68 20 4 4 4501 10 14.2 0.97 8 10.8 27 2 2.38 19 0 0 22 10 12.8 38 48.86
Clarion University of Pennsylvania 39 10 11.7 17 0 0 5383 2 2.84 0.90 10 13.5 29 0 0 44 8 9.6 19 8 10.24 38 47.88
Indiana University-South Bend 34 8 9.36 24 10 10 5374 2 2.84 0.94 10 13.5 6 0 0 38 10 12 13 0 0 40 47.70
Saint Johns River State College 41 8 9.36 29 6 6 4701 8 11.36 1.00 6 8.1 0 0 17 0 0 20 10 12.8 38 47.62
East Central University 46 4 4.68 23 8 8 3967 8 11.36 0.85 8 10.8 17 2 2.38 29 0 0 19 8 10.24 38 47.46
SUNY College at Old Westbury 51 0 0 25 10 10 4128 8 11.36 0.96 8 10.8 18 4 4.76 23 0 0 19 8 10.24 38 47.16
Southwestern Oklahoma State University 38 10 11.7 0 0 4435 10 14.2 0.91 10 13.5 15 0 0 28 0 0 18 6 7.68 36 47.08
Indiana University-Southeast 32 6 7.02 29 6 6 4963 6 8.52 0.94 10 13.5 9 0 0 42 10 12 14 0 0 38 47.04
Southern Polytechnic State University 41 8 9.36 29 6 6 5486 2 2.84 0.91 10 13.5 10 0 0 49 2 2.4 21 10 12.8 38 46.90
Arizona State University-Polytechnic 39 10 11.7 28 8 8 3373 2 2.84 0.93 10 13.5 36 0 0 49 2 2.4 18 6 7.68 38 46.12
Pennsylvania State University-Penn State Abington 42 8 9.36 0 0 3498 2 2.84 1.00 6 8.1 23 10 11.9 0 0 20 10 12.8 36 45.00
Delaware State University 48 2 2.34 6 0 0 4213 10 14.2 0.93 10 13.5 21 10 11.9 25 0 0 16 2 2.56 34 44.50
The University of Texas of the Permian Basin 19 0 0 27 8 8 3586 4 5.68 0.88 10 13.5 20 8 9.52 22 0 0 18 6 7.68 36 44.38
University of Cincinnati-Blue Ash College 43 6 7.02 25 10 10 3692 4 5.68 1.00 6 8.1 0 0 48 4 4.8 18 6 7.68 36 43.28
Frostburg State University 37 10 11.7 15 0 0 4936 6 8.52 0.90 10 13.5 20 8 9.52 26 0 0 15 0 0 34 43.24
Alabama A & M University 74 0 0 0 0 4419 10 14.2 0.83 8 10.8 11 0 0 32 4 4.8 20 10 12.8 32 42.60
Gulf Coast State College 41 8 9.36 40 0 0 4204 10 14.2 1.00 6 8.1 0 0 11 0 0 19 8 10.24 32 41.90
Oklahoma State University-Oklahoma City 40 10 11.7 0 0 4623 8 11.36 1.00 6 8.1 0 0 26 0 0 19 8 10.24 32 41.40
Southwest Minnesota State University 15 0 0 0 0 3618 4 5.68 0.92 10 13.5 21 10 11.9 36 8 9.6 29 0 0 32 40.68
Cameron University 49 0 0 39 0 0 4476 10 14.2 0.93 10 13.5 10 0 0 31 2 2.4 19 8 10.24 30 40.34
Georgia Highlands College 50 0 0 23 8 8 4040 8 11.36 1.00 6 8.1 0 0 27 0 0 21 10 12.8 32 40.26
Grambling State University 78 0 0 19 2 2 4603 10 14.2 0.84 8 10.8 12 0 0 26 0 0 20 10 12.8 30 39.80
Fort Lewis College 33 6 7.02 15 0 0 3648 4 5.68 0.10 0 0 18 4 4.76 38 10 12 19 8 10.24 32 39.70
University of Mary Washington 16 0 0 8 0 0 4370 10 14.2 0.94 10 13.5 64 0 0 40 10 12 15 0 0 30 39.70
Longwood University 25 0 0 4 0 0 4766 8 11.36 0.93 10 13.5 45 0 0 37 8 9.6 17 4 5.12 30 39.58
Clover Park Technical College 35 8 9.36 67 0 0 4117 8 11.36 1.00 6 8.1 0 0 12 0 0 23 8 10.24 30 39.06
Missouri Southern State University 56 0 0 29 6 6 4495 10 14.2 0.99 8 10.8 16 0 0 24 0 0 18 6 7.68 30 38.68
Washburn University 36 10 11.7 26 10 10 5327 2 2.84 0.94 10 13.5 13 0 0 29 0 0 14 0 0 32 38.04
Savannah State University 74 0 0 9 0 0 4537 10 14.2 0.96 8 10.8 8 0 0 21 0 0 20 10 12.8 28 37.80
Winthrop University 40 10 11.7 7 0 0 5486 2 2.84 0.87 10 13.5 34 0 0 44 8 9.6 14 0 0 30 37.64
Alcorn State University 76 0 0 23 8 8 3382 2 2.84 0.87 10 13.5 20 8 9.52 18 0 0 16 2 2.56 30 36.42
Texas A & M International University 56 0 0 14 0 0 5411 2 2.84 0.96 8 10.8 20 8 9.52 18 0 0 21 10 12.8 28 35.96
University of Maryland Eastern Shore 54 0 0 0 0 4219 10 14.2 0.93 10 13.5 19 6 7.14 17 0 0 14 0 0 26 34.84
Shawnee State University 54 0 0 18 0 0 3925 6 8.52 0.98 8 10.8 13 0 0 38 10 12 16 2 2.56 26 33.88
Columbia Basin College 32 6 7.02 32 0 0 5181 4 5.68 1.00 6 8.1 0 0 14 0 0 20 10 12.8 26 33.60
Western Connecticut State University 31 4 4.68 0 0 5117 4 5.68 0.94 10 13.5 20 8 9.52 26 0 0 14 0 0 26 33.38
Darton State College 57 0 0 42 0 0 4717 8 11.36 1.00 6 8.1 0 0 26 0 0 22 10 12.8 24 32.26
SUNY at Purchase College 33 6 7.02 10 0 0 4489 10 14.2 0.97 8 10.8 58 0 0 28 0 0 15 0 0 24 32.02
Pennsylvania State University-Penn State Harrisburg 33 6 7.02 0 0 4067 8 11.36 0.88 10 13.5 43 0 0 0 0 14 0 0 24 31.88
Albany State University 72 0 0 24 10 10 3638 4 5.68 0.89 10 13.5 10 0 0 26 0 0 16 2 2.56 26 31.74
Purdue University-North Central Campus 26 0 0 15 0 0 3802 6 8.52 0.99 8 10.8 5 0 0 40 10 12 15 0 0 24 31.32
SUNY College at Geneseo 25 0 0 2 0 0 5492 2 2.84 0.98 8 10.8 66 0 0 33 4 4.8 20 10 12.8 24 31.24
University of Baltimore 43 6 7.02 50 0 0 4773 8 11.36 0.69 2 2.7 18 4 4.76 48 4 4.8 15 0 0 24 30.64
Fitchburg State University 36 10 11.7 17 0 0 5075 4 5.68 0.74 4 5.4 30 0 0 34 6 7.2 15 0 0 24 29.98
Pennsylvania College of Technology 41 8 9.36 19 2 2 5324 2 2.84 1.00 6 8.1 32 0 0 0 0 18 6 7.68 24 29.98
University of Washington-Tacoma Campus 48 2 2.34 34 0 0 4271 10 14.2 0.84 8 10.8 41 0 0 24 0 0 16 2 2.56 22 29.90
University of Puerto Rico-Bayamon 63 0 0 11 0 0 4753 8 11.36 1.00 6 8.1 14 0 0 6 0 0 23 8 10.24 22 29.70
SUNY College at Potsdam 45 4 4.68 5 0 0 4140 8 11.36 0.92 10 13.5 36 0 0 21 0 0 13 0 0 22 29.54
California State University-Monterey Bay 49 0 0 15 0 0 5348 2 2.84 0.94 10 13.5 17 2 2.38 18 0 0 23 8 10.24 22 28.96
Fairmont State University 47 2 2.34 26 10 10 3558 4 5.68 0.96 8 10.8 16 0 0 21 0 0 15 0 0 24 28.82
Pennsylvania State University-Penn State Altoona 32 6 7.02 0 0 3897 6 8.52 1.00 6 8.1 46 0 0 0 0 17 4 5.12 22 28.76
Henderson State University 54 0 0 0 0 3400 2 2.84 0.90 10 13.5 19 6 7.14 20 0 0 17 4 5.12 22 28.60
University of North Carolina at Asheville 34 8 9.36 22 8 8 3413 2 2.84 1.00 6 8.1 38 0 0 27 0 0 14 0 0 24 28.30
SUNY at Fredonia 36 10 11.7 4 0 0 5411 2 2.84 0.90 10 13.5 48 0 0 27 0 0 15 0 0 22 28.04
Plymouth State University 29 2 2.34 0 0 4675 8 11.36 0.83 8 10.8 44 0 0 63 0 0 16 2 2.56 20 27.06
Pennsylvania State University-Penn State Erie-Behrend College 31 4 4.68 0 0 4157 8 11.36 0.98 8 10.8 44 0 0 0 0 15 0 0 20 26.84
University of Hawaii at Hilo 47 2 2.34 27 8 8 3622 4 5.68 0.96 8 10.8 11 0 0 24 0 0 14 0 0 22 26.82
Dalton State College 57 0 0 24 10 10 3821 6 8.52 1.00 6 8.1 8 0 0 16 0 0 29 0 0 22 26.62
Eastern Connecticut State University 28 2 2.34 11 0 0 4911 6 8.52 0.97 8 10.8 39 0 0 30 2 2.4 16 2 2.56 20 26.62
University of Puerto Rico-Arecibo 80 0 0 0 0 3599 4 5.68 1.00 6 8.1 6 0 0 3 0 0 22 10 12.8 20 26.58
Fayetteville State University 58 0 0 43 0 0 5371 2 2.84 0.89 10 13.5 17 2 2.38 17 0 0 18 6 7.68 20 26.40
Truman State University 20 0 0 2 0 0 5445 2 2.84 0.93 10 13.5 57 0 0 32 4 4.8 17 4 5.12 20 26.26
Northwest Florida State College 33 6 7.02 43 0 0 5023 6 8.52 1.00 6 8.1 0 0 19 0 0 26 2 2.56 20 26.20
Christopher Newport University 16 0 0 2 0 0 5135 4 5.68 0.97 8 10.8 55 0 0 37 8 9.6 15 0 0 20 26.08
Gordon State College 55 0 0 19 2 2 3307 2 2.84 1.00 6 8.1 0 0 24 0 0 21 10 12.8 20 25.74
Chicago State University 71 0 0 56 0 0 4406 10 14.2 0.84 8 10.8 2 0 0 17 0 0 12 0 0 18 25.00
North Seattle College 13 0 0 60 0 0 4329 10 14.2 1.00 6 8.1 0 0 23 0 0 16 2 2.56 18 24.86
Bowie State University 50 0 0 0 0 4698 8 11.36 0.84 8 10.8 8 0 0 29 0 0 16 2 2.56 18 24.72
Keene State College 24 0 0 5 0 0 4948 6 8.52 0.92 10 13.5 51 0 0 65 0 0 16 2 2.56 18 24.58
Jackson College 53 0 0 36 0 0 3887 6 8.52 1.00 6 8.1 0 0 23 0 0 18 6 7.68 18 24.30
CUNY Medgar Evers College 62 0 0 38 0 0 5101 4 5.68 1.00 6 8.1 4 0 0 10 0 0 19 8 10.24 18 24.02
SUNY College of Technology at Alfred 48 2 2.34 14 0 0 3516 4 5.68 1.00 6 8.1 42 0 0 19 0 0 18 6 7.68 18 23.80
The University of Texas Health Science Center at Houston 42 8 9.36 0 0 4496 10 14.2 0.34 0 0 0 0 4 0 0 5 0 0 18 23.56
Winston-Salem State University 58 0 0 31 2 2 5132 4 5.68 0.94 10 13.5 17 2 2.38 17 0 0 14 0 0 18 23.56
United States Military Academy 0 0 1 0 0 4609 10 14.2 1.00 6 8.1 78 0 0 0 0 0 7 0 0 16 22.30
Francis Marion University 56 0 0 0 0 3551 4 5.68 0.93 10 13.5 15 0 0 31 2 2.4 15 0 0 16 21.58
Colorado State University-Global Campus 31 4 4.68 87 0 0 4960 6 8.52 0.79 6 8.1 0 0 86 0 0 38 0 0 16 21.30
Yakima Valley Community College 56 0 0 33 0 0 3499 2 2.84 1.00 6 8.1 0 0 9 0 0 19 8 10.24 16 21.18
University of Illinois at Springfield 37 10 11.7 41 0 0 3941 6 8.52 0.61 0 0 33 0 0 27 0 0 14 0 0 16 20.22
South Seattle College 16 0 0 56 0 0 4092 8 11.36 1.00 6 8.1 0 0 24 0 0 15 0 0 14 19.46
Citadel Military College of South Carolina 24 0 0 9 0 0 3787 6 8.52 0.85 8 10.8 59 0 0 52 0 0 13 0 0 14 19.32
University of the District of Columbia 54 0 0 55 0 0 3587 4 5.68 0.92 10 13.5 5 0 0 19 0 0 11 0 0 14 19.18
Thomas Edison State College 12 0 0 89 0 0 5225 4 5.68 0.87 10 13.5 0 0 68 0 0 15 0 0 14 19.18
Midland College 21 0 0 0 0 3563 4 5.68 1.00 6 8.1 0 0 17 0 0 17 4 5.12 14 18.90
Arizona State University-West 50 0 0 32 0 0 3301 2 2.84 0.90 10 13.5 42 0 0 49 2 2.4 13 0 0 14 18.74
Pennsylvania State University-World Campus 35 8 9.36 0 0 5289 2 2.84 0.74 4 5.4 10 0 0 0 0 12 0 0 14 17.60
Skagit Valley College 27 0 0 43 0 0 3786 6 8.52 1.00 6 8.1 0 0 16 0 0 15 0 0 12 16.62
United States Air Force Academy 0 0 1 0 0 4924 6 8.52 1.00 6 8.1 83 0 0 0 0 0 8 0 0 12 16.62
University of Puerto Rico-Cayey 71 0 0 0 0 3455 2 2.84 1.00 6 8.1 5 0 0 4 0 0 25 4 5.12 12 16.06
University of Oklahoma-Health Sciences Center 29 2 2.34 39 0 0 4638 8 11.36 0.40 0 0 0 0 10 0 0 8 0 0 10 13.70
The University of Texas Medical Branch 22 0 0 0 0 3670 4 5.68 0.36 0 0 0 0 3 0 0 24 6 7.68 10 13.36
University of West Alabama 53 0 0 0 0 3321 2 2.84 0.53 0 0 20 8 9.52 51 0 0 15 0 0 10 12.36
Governors State University 51 0 0 58 0 0 4105 8 11.36 0.60 0 0 0 0 27 0 0 10 0 0 8 11.36
Colorado Mountain College 18 0 0 38 0 0 3503 2 2.84 1.00 6 8.1 0 0 11 0 0 13 0 0 8 10.94
Delta State University 45 4 4.68 16 0 0 3718 4 5.68 0.65 0 0 15 0 0 27 0 0 13 0 0 8 10.36
The University of Texas Health Science Center at San Antonio 30 4 4.68 58 0 0 3360 2 2.84 0.49 0 0 0 0 5 0 0 7 0 0 6 7.52
University of Nebraska Medical Center 23 0 0 35 0 0 3717 4 5.68 0.31 0 0 0 0 10 0 0 5 0 0 4 5.68
Medical University of South Carolina 9 0 0 61 0 0 5307 2 2.84 0.13 0 0 0 0 14 0 0 9 0 0 2 2.84

APPENDIX F

SURVEY NO. 2

Question No. 1: Results of the previous survey indicated 10 criteria to use in selecting
a new peer comparison group for Southern Oregon University. Should these criteria be
weighted based on importance?

Results:

#   Answer                                                      Responses   %
1   Yes, the criteria should be weighted based on importance.   20          80%
2   No, all criteria should be counted equally.                 5           20%
    Total                                                       25          100%

Question No. 1 Pt. 2: If yes, please rank the criteria in order of importance with 1 being
the most important and 10 being the least important.

Results:
(Columns 1-10 show the number of respondents assigning each rank.)

#    Answer                                                 1    2    3    4    5    6    7    8    9    10   Total
1    Control of institution (public vs. private)            10   2    1    3    1    1    1    0    0    0    19
2    Status as a regional institution                       1    8    3    1    1    0    2    0    3    0    19
3    FTE enrollment                                         1    1    6    5    2    0    0    2    0    2    19
4    Percentage of FTE made up by undergraduate students    1    1    4    4    1    3    0    2    3    0    19
5    Four-year graduation rate                              1    1    3    0    1    2    4    2    1    4    19
6    Student-to-faculty ratio                               1    3    0    2    3    2    3    1    3    1    19
7    Mix of full-time faculty to adjunct faculty            3    0    1    0    2    2    4    2    2    3    19
8    Percentage of total revenue from tuition               0    1    1    2    3    5    2    4    1    0    19
9    Percentage of undergraduates over age 25               0    2    0    0    1    1    1    4    4    6    19
10   Percentage of undergraduates receiving Pell Grants     1    0    0    2    4    3    2    2    2    3    19
     Total                                                  19   19   19   19   19   19   19   19   19   19   -

Question No. 2: A high percentage of respondents to the previous survey listed Eastern
Oregon University and Western Oregon University as institutions that should be
considered for inclusion in a new SOU Peer Comparison Group regardless of how they
match the selected criteria. Do you agree?

Results:
#   Answer                                                                       Responses   %
1   Yes, EOU and WOU should be included regardless of how they match the
    selected criteria.                                                           13          54%
2   No, institutions should be selected for the Peer Comparison Group based
    only on how they match the selected criteria.                                11          46%
    Total                                                                        24          100%

APPENDIX G

RESEARCH PROPOSAL

CAPSTONE PROPOSAL
MM 514 – Practical Research

Name(s): Ryan Brown

Proposed Research Topic/Title: Review and possible redefinition of SOU's institutional peer group

Executive Summary:

Southern Oregon University (SOU) relies on peer comparisons for important
management and strategic planning decisions; however, the group of peer
institutions currently utilized by SOU through the National Center for Education
Statistics (NCES) and Integrated Postsecondary Education Data System (IPEDS)
was selected in 1997 and was not chosen specifically for SOU.

If this group is out of date, SOU may be making key decisions based on obsolete
comparisons. This research project will collect qualitative data from key SOU
offices and personnel in order to determine what criteria should be used in
identifying peer comparators. If the current peer group does not fit these criteria,
quantitative analysis will identify new peers that better align with current needs
and priorities. Ensuring that SOU is comparing itself to true peers will allow for
stronger benchmarking and comparative analysis going forward.

Project objectives include:


- Determine proper criteria for SOU’s peer comparison institutions.
- Determine if the current peer comparison group meets above criteria.
- If not, determine which institutions belong in SOU’s peer comparison
group.
- Propose new custom comparison group for future SOU IPEDS reports.

Once this study is completed, SOU will have a better understanding of whether its
current group of peer comparator institutions remains relevant. This will either
reinforce that decisions made based on peer comparison data are currently using
an accurate comparison group, or provide an updated peer comparison group,
which could lead to better strategic planning.

Introduction/Background of the Study:

Using data available from NCES and IPEDS, this research project will review the
colleges and universities currently being utilized by SOU as an institutional peer
comparison group.

Through conversations with SOU's Provost's Office and Office of Institutional
Research, it has been determined that the current peer comparison group was
selected approximately 20 years ago. At that time, the Oregon University System
(OUS) chose institutions to serve as a singular peer comparison group for all
three of Oregon’s public regional universities: SOU, Western Oregon University

31
and Eastern Oregon University (Weeks, Puckett, & Daron, 2000). Preliminary
research has not found whether or not SOU has ever identified and utilized a
group of peer institutions chosen specifically for SOU. When Weeks, Puckett, and
Daron conducted their research, the primary needs for establishing peer
comparison groups for the OUS institutions included “budgeting, faculty
compensation analysis, performance measurement, and trend analysis.”
Comparisons among institutions were to be used as a critical aspect of a new
system-wide budgeting model.

This study will attempt to determine if the current peer comparison group is still
relevant, and if not, which institutions should be included instead. In order to
better benchmark institutional priorities and new initiatives, SOU must be
comparing itself against true peers to help measure performance in key areas. If a
new peer group is identified through this research, benchmarking and
comparative analysis for SOU will be stronger going forward.

The current peer comparison group includes:

- California State University-Stanislaus (CA)


- Eastern Washington University (WA)
- Fort Hays State University (KS)
- Plymouth State University (NH)
- Southeast Missouri State University (MO)
- Southern Utah University (UT)
- SUNY at Fredonia (NY)
- University of Mary Washington (VA)
- University of Michigan-Flint (MI)
- University of Wisconsin-Parkside (WI)

*Source: IPEDS Data Feedback Report 2014

Problem Statement:

SOU uses peer comparisons for a number of purposes, including evaluating the
performance of the institution as a whole and evaluating the performance of
specific programs, initiatives, and institutional priorities and efforts.
Specifically, metrics on which SOU evaluates its performance compared to its peer
institutions include enrollment, number of degrees awarded annually, number of
teaching faculty and staff, 6-year bachelor's degree graduation rate, percent of
students receiving Pell grants, student services expenses per FTE enrollment,
and more (see Appendix 1).

Peer comparisons are also used for benchmarking future performance, strategic
planning, policy setting, and decision making; however, the group of universities
that SOU currently uses as a peer comparison group has not been updated in nearly
20 years. At the time that the current peer comparison group was formulated, OUS
selected a group to apply to all three of Oregon's public regional universities;
it was not chosen specifically for SOU.

In order to ensure that SOU is comparing itself against true peers, it must evaluate
the current peer comparison group, determine if it is the correct collection of
universities for comparison, and if not, determine what criteria should be used to
select a new peer comparison group, then identify 10-15 institutions that meet
those criteria.

Research Objectives (or Purpose of the Research):

The objective of this research is to determine whether or not SOU is currently
using an outdated peer comparison group. If the research indicates that the
current group is obsolete, any comparative analysis, strategic planning, or
benchmarking based on comparisons to the current group may likewise be
problematic. If this is indeed the case, qualitative data collected from the
Provost's Office and other key departments on campus will determine what
criteria should be considered when identifying a new peer comparison group, and
this information, taken in conjunction with quantitative analysis of reports to
the NCES and IPEDS, will determine which colleges rightfully belong in SOU's peer
comparison group.

Having a better and more sound comparison group will aid SOU—and specifically
the Provost’s Office—in future planning as well as evaluation of current and past
practices.

Project objectives will be:

- In consultation with the Provost’s Office, determine proper criteria for
SOU’s peer comparison institutions.
- Determine if the current peer comparison group meets the above criteria.
- If not, utilize existing NCES and IPEDS data to determine which
institutions rightfully belong in SOU’s peer comparison group.
- Propose new custom comparison group for future SOU IPEDS reports.

Upon completion, findings and recommendations will be presented to the
Provost’s Office and the Department of Institutional Research for their
consideration and use going forward.

Review of
Related
Literature: Much of the research into the selection of peer institutions dates from the 1980s,
when issues regarding the use of comparative data in higher education were first
examined from an academic perspective by Teeter (1983), and Brinkman and
Teeter (1987) defined the various types of comparison groups. Even prior to
Brinkman and Teeter’s work, Terenzini, Hartmark, Lorang, and Shirley (1980)
identified quantitative methods for assessing and comparing institutions of
higher education according to measurable criteria and variables. This work was
later enhanced by Teeter and Christal (1984).

In subsequent years, much of this earlier work has been revisited, with variations
and refinements made. One key addition has been the use of comparison methods
that endeavor to better incorporate both quantitative and qualitative analysis
(Ingram, 1995; Zhao & Dean, 1997). When the OUS conducted research for the
establishment of peer groups in 1997, Weeks, Puckett, and Daron (2000) found
that comparisons with peer universities “can be an effective way for university
presidents to communicate with legislators, board members, and other
stakeholders about where their institution stands.” In the years since Weeks,
Puckett, and Daron conducted their research, the OUS has disbanded. When
established, the current peer comparison group was selected for its relevance to
all three of Oregon’s public regional universities (SOU, EOU, WOU). For the first
time in decades, SOU now has an opportunity to establish a peer comparison
group specific to its own needs.

The OUS research determined that representation in the peer group from all four
major U.S. Census Bureau geographic regions was ideal. It also recommended that
the group be kept to a manageable size, that it include both similar and
aspirational peers, and that it focus on institutions with aligned missions (Weeks,
Puckett, & Daron, 2000). Additionally, Hurley (2002)
found that the primary functions of institutional peer groups fall into three main
categories: financial, program, and planning. That is, Hurley’s research indicated that
institutions primarily utilize peer grouping to establish financial resource allocation,
measure the performance of the institution or specific programs against other
institutions, and conduct long-term planning. Carrigan (2012) describes a “thoughtful
and deliberate” peer selection process that involves numerous institutional
stakeholders, including senior academic and administrative leadership.

Although the review of literature did reveal at least one study indicating that an
institution’s peer comparison group should be reviewed on a six-year cycle
(Weeks, Puckett, & Daron, 2000), no identified research specifically focused on
the review of an existing peer group.

Though the National Center for Education Statistics (NCES) and the Integrated
Postsecondary Education Data System (IPEDS) provide access to a large quantity
of data on nearly every institution of higher education in the United States, there
are some limitations to what data can be accessed and how, including the number
of institutions that can be included in a single query, the broadness of some
statistical categories, and the exclusion of some data that institutional decision
makers might find useful (Schuh, 2002). Despite these limitations, Schuh (2002)
concludes that IPEDS data serves as a relevant and useful source for
cross-institutional comparisons. Gater and Lombardi (2001) also called the IPEDS
data system “the most readily available and widely used source” for national
higher education data.

Importance/
Benefits of
the Study: Once this study is completed, SOU will have a better understanding of whether its
current group of peer comparator institutions remains relevant. If not, a new
group will be identified. This will either reinforce that decisions made based on
peer comparison data are currently using an accurate comparison group, or
provide an updated peer comparison group, which will lead to better decision
making as it pertains to benchmarking against peers, strategic planning, and
setting policy for the institution. SOU will be able to better determine if its
financial resources are allocated in a manner that is consistent with similar
institutions across the country and better track trends that impact university
operations.

Comparisons with peer universities also serve as an effective way for the
university to communicate with key constituencies, including legislators, board
members, and the media. SOU will be able to more accurately compare itself to
similar institutions and gauge performance regarding enrollment, degrees

conferred, finances, demographics, faculty size and utilization, and other
institutional characteristics.

Research
Design: This project will rely on both qualitative and quantitative analysis. The
qualitative aspect of the research will be informed by input from key SOU personnel,
including the President, Chief of Staff, Vice President of Finance and
Administration, Provost and Vice President for Academic and Student Affairs,
Associate Vice President for Academic Resource Management, Associate Provost
and Director of Graduate Studies, Associate Vice President for Enrollment and
Retention, University Librarian, Director of Institutional Research, Director of
Admissions, Director of University Assessment, and Directors of all SOU academic
divisions: Oregon Center for the Arts; Business Communication and the
Environment; Education, Health, and Leadership; Humanities and Culture;
Science, Technology, Engineering, and Mathematics; Social Sciences; and
Undergraduate Studies. In addition to one-on-one interviews with some of these
key personnel, a brief survey utilizing a Likert scale will be sent to all. This survey
will be designed to identify which variables these individuals feel are important to
include in the criteria used for identifying a new peer comparison group.

Careful review of existing literature to identify best practices will play an
important role as well, particularly the work of Weeks, Puckett, and Daron (2000),
whose research established the existing peer comparison group in 1997. They
determined that a careful process of identifying peer institutions incorporates
“both informed administrative judgement at the campus level and an appropriate
array of statistical data.”

The statistical phase of this research will utilize secondary data available from
NCES and IPEDS. This database has statistical information on every institution of
higher education in the United States that receives federal funding and tools that
allow for comparison between individual institutions or groups of institutions.
Out of the numerous variables for which data is available via IPEDS, the choice of
which to include for identifying peer comparators will be based on best practices
gleaned from previous research as well as the results of the qualitative data
collected from SOU personnel. Weighting of variables will be established based on
results of the Likert survey.
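The weighting and scoring approach described above can be sketched in a short script. This is a hypothetical illustration only: the criteria names, Likert means, institutional figures, and normalization spreads below are placeholders rather than actual survey results or IPEDS values, and the similarity measure (a weighted sum of normalized gaps from SOU's profile) is one plausible way to combine the survey weights with the statistical data, not the method the study prescribes.

```python
# Illustrative sketch: ranking candidate peers by weighted similarity to SOU.
# All names and figures are placeholders, not actual survey or IPEDS data.

def likert_weights(mean_scores):
    """Convert mean Likert ratings per criterion into weights that sum to 1."""
    total = sum(mean_scores.values())
    return {k: v / total for k, v in mean_scores.items()}

def similarity_score(base, candidate, weights, spreads):
    """Weighted sum of normalized absolute gaps; lower means more similar."""
    return sum(
        weights[k] * abs(candidate[k] - base[k]) / spreads[k]
        for k in weights
    )

# Mean Likert ratings (1-5 scale) per criterion -- placeholder values.
ratings = {"enrollment": 4.6, "pell_pct": 3.8, "grad_rate": 4.1}
weights = likert_weights(ratings)

# Placeholder institutional profiles and variable spreads for normalization.
sou = {"enrollment": 6200, "pell_pct": 38.0, "grad_rate": 36.0}
spreads = {"enrollment": 5000, "pell_pct": 20.0, "grad_rate": 25.0}
candidates = {
    "Institution A": {"enrollment": 5900, "pell_pct": 41.0, "grad_rate": 34.0},
    "Institution B": {"enrollment": 11000, "pell_pct": 25.0, "grad_rate": 55.0},
}

# Rank candidates from most to least similar to SOU.
ranked = sorted(
    candidates,
    key=lambda name: similarity_score(sou, candidates[name], weights, spreads),
)
print(ranked)
```

In practice, the criteria and their weights would come from the survey results, the institutional figures from IPEDS queries, and the spreads from the observed range of each variable across the candidate pool.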

Data
Analysis: Because the data to be presented will not be defined until the qualitative
research is complete, the exact form of the presentation is not yet known.
However, simple bar graphs will likely be the clearest way to illustrate which
institutions meet the criteria established through the qualitative process.

A Likert scale survey will also be used to aid in weighting the criteria.

Results and
Deliverables: If it is determined that the current peer comparison group being used by SOU is
obsolete, the main deliverable will be a new group based on the qualitative and
quantitative analysis conducted. Additionally, a newly generated IPEDS Data
Feedback Report comparing SOU to its new comparator group will be presented.

Qualification
of
Researcher/s: Ryan Brown is Head of Community and Media Relations at SOU. He previously
served as Chief Communications Officer at Klamath Community College (KCC) in
Klamath Falls, Oregon. In both positions he has been routinely called upon to
research, collect, analyze, and present data and complex information. At both
institutions, Brown has held key administrative positions, reporting directly to
the presidents of both SOU and KCC. He has given communications and
marketing-related presentations at regional and national conferences, including
for the National Council on Marketing and Public Relations and the National
Council for Continuing Education & Training, in both of which he maintains
membership.

Prior to his career in education, Brown was an award-winning journalist, having
been honored by the Oregon Associated Press Broadcasters Association with
multiple statewide awards, including Best Newscast, Best Coverage of a Single
Story, and Best Light Feature. He is a past member of the Society of Professional
Journalists.

Brown holds an Associate Degree in General Education from Shasta College in
Redding, California, and a Bachelor of Arts in Communication Studies with a
Minor in English from Sonoma State University in Rohnert Park, California. He is
currently completing coursework toward a Master in Management degree at SOU.

References and
Bibliography: Brinkman, Paul T., and Teeter, Deborah J. (1987). Methods for selecting comparison
groups. In Paul T. Brinkman (ed.), Conducting Interinstitutional Comparisons.
New Directions for Institutional Research, no. 53, pp. 5-23. San Francisco:
Jossey-Bass.

Carrigan, Sarah D. (2012). Selecting peer institutions with IPEDS and other nationally
available data. New Directions for Institutional Research, no. 156, pp. 61-68.

Gater, Denise S., and Lombardi, John V. (2001). The use of IPEDS/AAUP faculty data in
institutional peer comparisons. TheCenter, University of Florida, TheCenter
Reports. 1-7.

Hom, Willard (2008). Peer grouping: the refinement of performance indicators.
Journal of Applied Research in the Community College 16(1): 45-51.

Hurley, R. G. (2002). Identification and assessment of community college peer
institution selection systems. Community College Review 29: 1-27.

Ingram, John A. (1995). Using IPEDS data for selecting peer institutions. Paper
presented at the Thirty-Fifth Annual Forum of the Association for
Institutional Research, Boston, MA, May 1995.

Schuh, John H. (2002). The integrated postsecondary education data system. New
Directions for Higher Education, no. 118, pp. 29-38.

Teeter, Deborah J. (1983). The politics of comparing data with other institutions. In
James W Firnberg and William F. Lasher (ed.), The Politics and Pragmatics of
Institutional Research. New Directions for Institutional Research, no. 38, pp.
39-48. San Francisco: Jossey-Bass.

Teeter, Deborah J., and Christal, Melodie E. (1984). A comparison of procedures for
establishing peer groups. Paper presented at the Annual Meeting of the
Southern Association for Institutional Research, Little Rock, AR, October
1984.

Terenzini, Patrick T., Hartmark, Leif, Lorang, Wendall G. Jr., and Shirley, Robert C.
(1980). A conceptual and methodological approach to the identification of
peer institutions. Research in Higher Education 12(4): 347-364.

Weeks, S. F., Puckett, D., and Daron, R. (2000). Developing peer groups for the
Oregon University System: from politics to analysis (and back). Research in
Higher Education 41(1): 1-20.

Zhao, Jisehn, and Dean, Donald C. (1997). Selecting peer institutions: A hybrid
approach. Paper presented at the Thirty-Seventh Annual Forum of the
Association for Institutional Research, Orlando, FL, May 1997.

Appendices
1. Current Peer Group Data
