
Competency Map: Visualizing Student Learning to Promote Student Success

Jeff Grann
Capella University, 225 South Sixth Street, 9th Floor, Minneapolis, MN 55402
Jeff.Grann@capella.edu

Deborah Bushway
Capella University, 225 South Sixth Street, 9th Floor, Minneapolis, MN 55402
Deborah.Bushway@capella.edu

ABSTRACT
Adult students often struggle to appreciate the relevance of their higher educational experiences to their careers. Capella University's competency map is a dashboard that visually indicates each student's status relative to specific assessed competencies. MBA students who utilize their competency map demonstrate competencies at slightly higher levels and persist in their program at greater rates, even after statistically controlling for powerful covariates such as course engagement.

Categories and Subject Descriptors


K.3.1 [Computers and Education]: Computer Uses in Education; H.4.3 [Information Systems Applications]: User Interfaces

General Terms
Experimentation, Design

Keywords
competency, learning analytics, visualization, evaluation

1. INTRODUCTION

Adult post-secondary students in America expect their educational experiences to be relevant to their careers. In 2010, 75 percent of Americans reported a college education to be essential for success in today's society [7]. Unfortunately, these high expectations for education are not being met. A recent international study found that only about half of the students surveyed believed their studies had improved their employment opportunities [17]. These findings may not surprise faculty, many of whom have long known that students often have a difficult time articulating the value of college beyond the credentials it produces [26], but have nevertheless remained generally satisfied with the readiness of their graduates for the job market [17]. However, faculty confidence in the outcomes of post-secondary education is challenged by survey data showing that most employers report insufficient preparation amongst new hires [17]. Moreover, the Organisation for Economic Co-operation and Development's international study of adult basic skills has shown comparatively middling to low performance of American adults in basic mathematics, use of technology, and literacy [18]. Combined, these studies suggest that more needs to be done to help students connect their educational experiences to their personal and professional aspirations.

Direct assessment programming and a supportive competency-based curriculum [8] present an opportunity for administrators to examine and improve the relevance of their programming. Competencies are defined as the knowledge, skills, abilities, and professional attributes required to successfully perform a task in a given context [11] and signal a degree of career relevance not visible in traditional curricular models. Central to the operations of a competency-based curriculum is the quality and coordination of faculty assessment practices across a program's curriculum [22]. The state of the art in educational assessment systems emphasizes the importance of aligning assessment practices through measures of student learning that can support both formative and summative purposes [19]. Direct assessment practices bridge these purposes by establishing measures of student learning that are embedded throughout a program and utilized with every student.

Student grades may seem sufficient as a measure of student learning for direct assessment programs, but several historical issues prevent institutions from relying on grades. Over the last decade, American regional accrediting agencies have consistently encouraged institutions to implement robust assessment systems to measure and promote student learning [20]. These efforts have had several positive effects, with many institutions now articulating intended learning outcomes, but these advances have occurred mostly at the institutional level and have largely not affected the grading practices of faculty [12]. By policy and practice, faculty members in American universities have long held sole responsibility for issuing student grades [1]. While this approach to grading is entirely appropriate, it has also led to the development of diverse grading practices, confusion over the basis for student grades, and a gradual lowering of performance standards in American colleges and universities.

Currently, A-level grades are estimated to represent 43 percent of all letter grades assigned by faculty, an increase of 28 percentage points since 1960 [21]. This increase in very high grades cannot be accounted for by corresponding increases in studying behavior [5] or in levels of demonstrated learning [4]. Historical reviews have reported little progress within academic programs toward measuring even the most common learning outcomes of an undergraduate education [6]. This weak state of grading practices in higher education has undermined grades as a marker of individual distinction in the job market. Human resource managers view grades with ambivalence when evaluating applicants for positions [25], and major companies view GPA as essentially "a silly number" [16].

Our position is that the most direct way for institutions to help students appreciate their educational experience is to align faculty grading practices with specific competencies and to visualize student learning for multiple stakeholders. Prior research has provided encouraging results suggesting that students and faculty can utilize personalized dashboards to improve academic outcomes [2, 27]. This paper reports on a multi-year effort to measure and visualize student learning via a competency map, presenting preliminary evaluation results with MBA students.

2. INSTITUTIONAL CONTEXT

Capella University is a regionally accredited online institution whose mission is to extend access to adult students seeking to maximize their personal and professional potential. Founded in 1993 as the Graduate School of America, the institution was renamed Capella University in 1999 and continues to serve primarily graduate students. The university's history of measuring and reporting on students' demonstration of outcomes and competencies has been instrumental to the development of the competency map (for more details, please see the National Institute for Learning Outcomes Assessment's case study [9]).

Conceptually, the university uses a fully-embedded assessment model (FEAM) to measure each student's competency demonstration via the alignment of all graded course assignments and scoring guide criteria. Alignment is established through the independent judgments of subject matter experts and assessment specialists, with disagreements mediated in a reconciliation session with the faculty chair. All subject matter experts receive detailed training (http://tinyurl.com/FEAMTraining) and are required to successfully complete a brief qualifying assessment. The competency map utilizes these documented alignments to measure and report on students' competency demonstration based on faculty's grading practices. Faculty grade assignments using an electronic scoring guide that both communicates faculty feedback to students and records criterion-specific judgments as data concerning a student's demonstration of a specific competency. This course development approach has matured so that all academic offerings are created by a team of subject matter experts and faculty working with curriculum specialists, instructional designers, and assessment specialists (for an overview, see Creating Capella University's Academic Offerings, http://tinyurl.com/AcademicOfferings).
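To make the fully-embedded assessment model concrete, the following minimal sketch (in Python) shows one way the documented alignments and criterion-specific judgments described above could be represented and grouped by competency. The class names, fields, and the 0-3 score scale are illustrative assumptions, not Capella's production schema.

from dataclasses import dataclass

# Illustrative data model only; names, fields, and the 0-3 score scale are assumptions.

@dataclass
class CriterionAlignment:
    # Links one scoring guide criterion on a graded assignment to a course competency.
    assignment_id: str
    criterion_id: str
    competency_id: str

@dataclass
class CriterionJudgment:
    # One faculty judgment on one criterion for one learner
    # (0 = non-performance, 1 = basic, 2 = proficient, 3 = distinguished).
    learner_id: str
    criterion_id: str
    score: int

def judgments_by_competency(alignments, judgments):
    # Group a learner's criterion-level scores under the competencies they are aligned to.
    criterion_to_competency = {a.criterion_id: a.competency_id for a in alignments}
    grouped = {}
    for j in judgments:
        competency_id = criterion_to_competency.get(j.criterion_id)
        if competency_id is not None:
            grouped.setdefault(competency_id, []).append(j.score)
    return grouped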

3. COMPETENCY MAP

Capella's competency map is intended to inform students of their status and progress toward demonstrating specific competencies. Students can use their competency map to conceptualize their academic experience, communicate accomplishments, and focus their future studying. The competency map is also designed to help faculty and advisors engage students more deeply in their academic program.

3.1 Student-Centered Design

The competency map went through multiple design iterations, both in its visualization and in the grain of data displayed to the student. We believed students would be most interested in their performance relative to their current course and decided early on to frame the competency map relative to a specific course. As Figure 1 illustrates, the initial intention was to simultaneously display student status on unit assignments, course competencies, and program outcomes. In 2012, a student focus group appreciated this information, but many students were overwhelmed by the amount of information provided.

Figure 1: Initial competency map prototype displaying unit-, course-, and program-level information

Based on this feedback, the team simplified the display using small multiples [28] to indicate a student's status on each competency. Initially, the team used a colored sliding bar, with the amount of coloring indicating the percentage of criteria assessed for a competency and the color of the bar indicating the overall demonstrated performance level (for an example, see Figure 2). However, subsequent testing revealed that students would inappropriately combine the two different metrics represented by the slider bars, believing that the bars indicated the amount of progress made relative to the amount of progress expected. We also learned that students often did not know the competencies by number and needed to refer to the course syllabus in order to interpret the competency map.

Figure 2: Early competency map prototype focusing exclusively on competency demonstration

The team therefore reworked the sliding bars into colored circles to help differentiate the concepts of completion and demonstration. The amount of shading indicates the percentage of criteria that have been assessed relative to the total number of criteria in the course aligned to the particular competency. The color of the circle represents the overall level of performance demonstrated for the particular competency and is based on the student's historical performance on all aligned scoring guide criteria. Capella University's grading policy recognizes four performance levels, which were mapped to the indicated colors: red = non-performance, yellow = basic, light green = proficient, and green = distinguished. In Figure 3, the student has demonstrated the first competency at a distinguished level of performance across the two criteria that have been assessed so far. All visual information is also presented numerically or textually to support visually impaired students.
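The circle rendering described above can be sketched as follows, assuming criterion judgments on a 0-3 scale and a simple averaging rule for the overall level; the names, thresholds, and rounding rule are our assumptions rather than the production algorithm.

# Illustrative sketch of the competency circle logic; the 0-3 scale, averaging rule,
# and color mapping follow the description above but are otherwise assumptions.

LEVEL_COLORS = {0: "red", 1: "yellow", 2: "light green", 3: "green"}

def competency_circle(assessed_scores, total_aligned_criteria):
    # Completion: share of aligned criteria assessed so far (drives the amount of shading).
    percent_shaded = 100.0 * len(assessed_scores) / total_aligned_criteria
    if not assessed_scores:
        return percent_shaded, None  # nothing assessed yet, so no performance color
    # Demonstration: overall level across all assessed criteria (drives the circle color).
    overall_level = round(sum(assessed_scores) / len(assessed_scores))
    return percent_shaded, LEVEL_COLORS[overall_level]

# Example mirroring Figure 3: two of six aligned criteria assessed, both distinguished.
print(competency_circle([3, 3], 6))  # (33.3..., 'green')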


3.2 Student Pilot

During the first and second academic quarters of 2013, School of Business students participating in a self-paced course pilot were provided with an email version of the competency map. These students were required to demonstrate each competency in the course by completing a series of assessments. The courses were referred to as self-paced because students were not required to advance on a weekly schedule or participate in weekly discussions with other students. The email version of the competency map contained a textual table that summarized the student's performance relative to each course competency. It was sent to students soon after their assessment was marked by faculty. This pilot involved sixteen different courses and more than two hundred students, which provided an opportunity for the team to test the performance of the data systems involved in producing the competency map. In this regard, the pilot successfully validated key pieces of the technical architecture and produced accurate competency map emails as expected.

The student pilot was not part of the formal evaluation of the competency map, but supportive data were collected from students. More than 80 percent of students reported that the emailed competency maps were helpful or very helpful for understanding their competency performance. Furthermore, email tracking reports showed that every competency map email sent was opened, suggesting that the information was relevant to students.
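As an illustration of the emailed summary, the sketch below renders a plain-text table of competency performance levels; the function name, fields, and layout are assumptions and do not reflect the pilot's actual implementation.

# Illustrative sketch of the emailed textual summary; names and layout are assumptions.

def render_competency_email(learner_name, competency_levels):
    # competency_levels: list of (competency label, performance level) pairs.
    width = max(len(label) for label, _ in competency_levels)
    lines = ["Competency summary for " + learner_name, ""]
    for label, level in competency_levels:
        lines.append(label.ljust(width) + "  " + level)
    return "\n".join(lines)

print(render_competency_email("Sample Learner", [
    ("Apply marketing principles", "Distinguished"),
    ("Analyze financial statements", "Proficient"),
]))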

3.3 MBA Rollout

Based on these successful pilots, the competency map was deployed in the July 2013 academic term for MBA graduate students. Several change management techniques were used to support the rollout of the competency map. A brief tutorial was developed to describe the unique design of the competency map to students (http://tinyurl.com/competencymap). Faculty were engaged in a variety of ways, including produced materials, meetings with leadership, listening sessions, and an interactive media piece that simulated multiple use case scenarios (http://tinyurl.com/FacultyScenarios). All student-facing staff groups were similarly engaged in competency map training sessions, including student advising, the career center, faculty support, IT support, customer care, and enrollment services.

3.3.1 Utilization

Students' use of the competency map was optional, and it was therefore important to track utilization in order to analyze outcome measures. To track utilization, we implemented a custom link solution that logged the student's ID number every time a student accessed the competency map. Using this technique, we could discern that 31 percent of the MBA students viewed the competency map during the July term. Of these students, more than half viewed the competency map repeatedly, and a small handful viewed it more than ten times throughout the term. All subsequent analysis relies on these utilization data to segment students based on competency map usage.
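The kind of access logging and usage segmentation described above could be sketched as follows; the log format, storage, and segmentation rule are assumptions rather than the custom link solution that was actually deployed.

# Illustrative sketch of access logging and usage segmentation; the log format and
# segmentation rule are assumptions, not the deployed custom link solution.
from collections import Counter
from datetime import datetime, timezone

access_log = []  # list of (learner_id, ISO timestamp) pairs

def log_competency_map_view(learner_id):
    # Record one competency map view, as the custom link would on each access.
    access_log.append((learner_id, datetime.now(timezone.utc).isoformat()))

def usage_segments(enrolled_ids):
    # Label each enrolled learner by how often they viewed the competency map.
    views = Counter(learner_id for learner_id, _ in access_log)
    return {
        learner_id: "none" if views[learner_id] == 0
        else "single" if views[learner_id] == 1
        else "repeat"
        for learner_id in enrolled_ids
    }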

3.3.2 Academic Performance

The competency map provides detailed information about faculty's performance expectations for each course assignment, and this information is intended to help focus students' studies and improve their academic performance. We measured academic performance by examining the distribution of demonstrated competency proficiency levels. As shown in Table 1, students who viewed the competency map demonstrated competencies at slightly higher levels of performance than students who did not, although this effect was not statistically significant.

Table 1: Competency demonstration performance levels by competency map utilization

Usage            Non-performance   Basic   Proficient   Distinguished
No Utilization   6%                8%      21%          65%
Utilization      3%                6%      17%          75%
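The comparison summarized in Table 1 can be outlined as a cross-tabulation with a test of independence; the data frame columns below are assumptions, and the chi-square test is one reasonable choice rather than the specific analysis reported here.

# Illustrative sketch of the Table 1 comparison; column names are assumptions and the
# chi-square test is one reasonable choice, not necessarily the authors' analysis.
import pandas as pd
from scipy.stats import chi2_contingency

def performance_by_usage(df):
    # df: one row per assessed competency, with columns 'viewed_map' (bool) and
    # 'level' ('non-performance', 'basic', 'proficient', 'distinguished').
    counts = pd.crosstab(df["viewed_map"], df["level"])
    row_percentages = (100 * counts.div(counts.sum(axis=1), axis=0)).round(0)
    chi2, p_value, _, _ = chi2_contingency(counts)
    return row_percentages, p_value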

3.3.3 Persistence

Based on prior studies that have found personalized student dashboards to increase persistence across academic programs [3], we expected the competency map to increase next-term registration rates. As expected, students who utilized the competency map registered for a course the following academic term at a significantly higher rate than students who never viewed the competency map. This effect is consistent with our belief that adult students value knowing how their educational experience relates to their career success. Given our relatively weak research design, this higher reregistration rate could be accounted for by multiple other explanations. One possibility is that a general engagement effect accounts for these results: students who are likely to persist in their program are generally more active and involved in their courses. This effect is likely to be significant given prior research that has found courseroom activity, such as frequency of logging into the course, posting in discussion threads, and dwell time, to be good predictors of learner success [14]. We attempted to statistically account for these effects using a logistic regression model (1) that included a generalized predictive model index (PMI) consisting of multiple measures of student engagement:

    ReRegistration = α + β₁·PMI + β₂·Usage + ε    (1)

By including the PMI in our regression model, we are able to test for the unique variance accounted for by utilization of the competency map on reregistration behavior. We found that competency map utilization significantly predicted reregistration the following term, even when the regression model included the PMI. Specifically, competency map utilization accounted for an additional 7 percent of the variance in reregistration behavior beyond the PMI metric alone. We interpret these results as strong evidence of the value students place on understanding and monitoring their progress in demonstrating competencies.
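A sketch of the nested-model comparison described above, fitting the logistic regression with and without competency map usage; the column names are assumptions, and the incremental pseudo-R-squared is a stand-in for the variance-accounted-for metric reported above.

# Illustrative sketch of the nested logistic regression comparison; column names and the
# incremental pseudo-R-squared are assumptions standing in for the reported analysis.
import statsmodels.formula.api as smf

def usage_effect_beyond_pmi(df):
    # df: one row per student, with columns 'reregistered' (0/1), 'pmi' (engagement index),
    # and 'used_map' (0/1 for competency map utilization).
    base = smf.logit("reregistered ~ pmi", data=df).fit(disp=False)
    full = smf.logit("reregistered ~ pmi + used_map", data=df).fit(disp=False)
    added = full.prsquared - base.prsquared  # incremental pseudo-R-squared for usage
    return full.params["used_map"], full.pvalues["used_map"], added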

Figure 3: Final competency map design

4. CONCLUSION

We have shown that course-level assessment practices can be aligned with a program's curriculum and successfully visualized for students in a manner that promotes reregistration behavior. Our approach was to develop an outcomes-based curricular structure with supporting competency assessments and criterion-referenced scoring guides. We believe this approach is unique in higher education in its engagement of and reliance on faculty to measure students' competency demonstration. Moreover, our positive results are important because, although American post-secondary enrollments have risen to historic levels [23], graduation rates have not significantly increased [15].


In many ways, the challenge of measuring and reporting competency demonstration is larger than any particular institution and needs to be placed in historical context to envision the adjacent possible [10]. Various thought leaders, employers, professional associations, policy makers, and accreditation officials are actively working in this area, especially in relation to critiquing the credit hour as a measure of student learning [13] and envisioning a future competency platform [24]. As more institutions measure learning outcomes, cross-institutional collaborations amongst faculty can help clarify performance expectations for degree programs and thereby support a focus on competency demonstration, for example via the Lumina Foundation's Tuning USA (http://tuningusa.org), the Association of American Colleges and Universities' Valid Assessment of Learning in Undergraduate Education rubrics (http://tinyurl.com/valuerubric), and the Lumina Foundation's Degree Qualifications Profile (http://degreeprofile.org). Such cross-institutional collaborations build trust and deepen a program's commitment to student success.

5. REFERENCES

[1] American Association of University Professors. The Assignment of Course Grades and Student Appeals. AAUP, Washington, D.C., 1998.
[2] K. E. Arnold. Signals: Applying academic analytics. Educause Quarterly, 33(1), 2010.
[3] K. E. Arnold and M. D. Pistilli. Course Signals at Purdue: Using learning analytics to increase student success. In Proceedings of the 2nd International Conference on Learning Analytics & Knowledge, 2012.
[4] R. Arum and J. Roksa. Academically Adrift: Limited Learning on College Campuses. University of Chicago Press, Chicago, 2011.
[5] P. Babcock and M. Marks. Leisure college, USA: The decline in student study time. Technical report, American Enterprise Institute for Public Policy Research, 2010.
[6] D. C. Bok. Our Underachieving Colleges: A Candid Look at How Much Students Learn and Why They Should Be Learning More. Princeton University Press, Princeton, N.J., 2008.
[7] W. J. Bushaw and S. J. Lopez. A time for change: The 42nd annual Phi Delta Kappa/Gallup poll of the public's attitudes toward the public schools. Phi Delta Kappan, 92(1), 2010.
[8] K. Field. Student aid can be awarded for competencies, not just credit hours. Chronicle of Higher Education, 59(29):A17, 2013.
[9] N. Jankowski. Capella University: An outcomes-based institution. Technical report, National Institute for Learning Outcomes Assessment, 2011.
[10] S. Johnson. Where Good Ideas Come From: The Natural History of Innovation. Riverhead Books, New York, 2010.
[11] E. A. Jones, R. A. Voorhees, and K. Paulson. Defining and assessing learning: Exploring competency-based initiatives. Committee Report 190, National Postsecondary Education Cooperative, 2002.

[12] G. D. Kuh and P. T. Ewell. The state of learning outcomes assessment in the United States. Higher Education Management and Policy, 22(1):1-20, 2010.
[13] A. Laitinen. Cracking the credit hour. 2012.
[14] L. P. Macfadyen and S. Dawson. Mining LMS data to develop an early warning system for educators: A proof of concept. Computers & Education, 54(2):588-599, 2010.
[15] D. Matthews. A stronger nation through higher education: How and why Americans must meet a big goal for college attainment. Technical report, Lumina Foundation for Education, 2010.
[16] G. L. McDowell. The Google Resume: How to Prepare for a Career and Land a Job at Apple, Microsoft, Google, or Any Top Tech Company. Wiley, Hoboken, N.J., 2011.
[17] M. Mourshed, D. Farrell, and D. Barton. Education to employment: Designing a system that works. Technical report, McKinsey Center for Government, 2013.
[18] Organisation for Economic Co-operation and Development. OECD Skills Outlook 2013: First Results from the Survey of Adult Skills. Technical report, OECD Publishing, 2013.
[19] J. W. Pellegrino, N. Chudowsky, and R. Glaser. Knowing What Students Know: The Science and Design of Educational Assessment. National Research Council, Washington, DC, 2001.
[20] S. Provezis. Regional accreditation and student learning outcomes: Mapping the territory. Technical report, National Institute for Learning Outcomes Assessment, 2010.
[21] S. Rojstaczer and C. Healy. Where A is ordinary: The evolution of American college and university grading, 1940-2009. Teachers College Record, 114(7), 2012.
[22] C. G. Schneider. Holding courses accountable for competencies central to the degree. Liberal Education, 99(1):14-21, 2013.
[23] T. D. Snyder and S. A. Dillow. Digest of Education Statistics 2009. Technical Report NCES 2010-013, National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education, 2010.
[24] L. Soares. A disruptive look at competency-based education: How the innovative use of technology will transform the college experience. Technical report, Center for American Progress, 2012.
[25] N. Spinks and B. Wells. Trends in the employment process: Resumes and job application letters. Career Development International, 4(1):40-45, 1999.
[26] J. Tagg. The Learning Paradigm College. Anker Publishing Company, Bolton, Mass., 2003.
[27] C. Thille and J. Smith. Cold rolled steel and knowledge: What can higher education learn about productivity? Change, 43(2):21-27, 2011.
[28] E. R. Tufte. Envisioning Information. Graphics Press, Cheshire, Conn., 1990.
