
Engineering Programs and NBA Accreditation

N J Rao and K Rajanikanth

Engineering Programs in India


 Are offered as per the regulations of All India Council for Technical Education (AICTE)
 Are offered by Tier 1 (Academically Autonomous) and Tier 2 (Academically
Non-autonomous) Institutions
 At present 95% of engineering colleges are academically non-autonomous, i.e., Tier 2
institutions.

National Board of Accreditation (NBA)


 Established in the year 1994 under Section 10(u) of the AICTE Act.
 NBA became Autonomous in January 2010 and in April 2013 the Memorandum of
Association and Rules of NBA were amended to make it completely independent of AICTE,
administratively as well as financially.
 NBA became a permanent member of the Washington Accord (an international accord) in
2014.

Washington Accord
 It recognizes the substantial equivalency of programs accredited by its signatory bodies and
recommends that graduates of programs accredited by any of the signatory bodies be
recognized by the other bodies as having met the academic requirements for entry
to the practice of engineering.

Accreditation
 Accreditation is a process of quality assurance and improvement, whereby a program in
an approved Institution is critically appraised to verify that the Institution or the program
continues to meet and/or exceed the Norms and Standards prescribed by the regulator from
time to time.
 It is a kind of recognition which indicates that a programme or Institution fulfils certain
standards.
 Programs, and not Educational Institutions, are considered for accreditation.

Purpose of accreditation is NOT TO


 find faults with the institution but to assess the current status of its performance
 denigrate the working style of the institution and its programs but to provide feedback on their strengths and weaknesses
 demarcate the boundaries of quality but to offer a sensitizing process for continuous
improvement in quality provisions

 select only institutions of national excellence but to provide benchmarks of excellence
and identification of good practices

Benefits of Accreditation
 Facilitates continuous Quality Improvement
 Demonstrates accountability to the public
 Improves staff morale
 Recognizes the achievements/innovations
 Facilitates information sharing
 Priority in getting financial assistance
 Helps the Institution to know its strengths, weaknesses and opportunities
 Initiates Institutions into innovative and modern methods of pedagogy
 Promotes intra and inter-Institutional interactions

What are Outcomes?


 An outcome of education is what the student should be able to do at the end of a program/course/instructional unit.
 Outcome-based education is an approach to education in which decisions about the
curriculum are driven by the exit learning outcomes that the students should display at
the end of the program/ course.

Why is OBE important?

Outcomes
 Outcomes are the abilities the students acquire at the end of the program
 Outcomes provide the basis for an effective interaction among stakeholders
 In outcome-based education, “product defines process”.
 It is results-oriented thinking, the opposite of input-based education where the emphasis is on the educational process and we are happy to accept whatever the result is.
 Outcome-based education is not simply producing outcomes for an existing
curriculum.

Perceived Disadvantages of OBE


 Imposition of Constraints
– The concern was that education should be open ended, taking the student as far as he or she is able to develop.
– “The proposed outcomes watered down academics in favour of ill-defined values and
process skills”
– “Traditional academic content is omitted or buried in a morass of pedagogic clap-trap
in the outcome-based education plans that have emerged to date”
 Inclusion and Emphasis on Attitudes and Values was Inappropriate
 Inhibition of Learning by Discovery
– Education should be valued for its own sake and not because it led to a pre-identified
outcome
 To define education as a set of outcomes decided in advance of teaching and learning conflicts with the wonderful, unpredictable voyages of exploration that characterize learning through discovery and enquiry.

Levels of Outcomes
 Program Educational Objectives: PEOs are broad statements that describe the career and professional accomplishments that the program is preparing its graduates to achieve within five years after graduation.
 Program Outcomes: POs are statements that describe what the students graduating
from engineering programs should be able to do
 Program Specific Outcomes: PSOs are statements that describe what the graduates
of a specific engineering program should be able to do
 Course Outcomes: COs are statements that describe what students should be able to
do at the end of a course

What is SAR (Self Assessment Report)?
 SAR is a compilation of data and information pertaining to a given program, prepared by the
college itself, for its assessment (identifying strengths and weaknesses) vis-à-vis
accomplishment of the defined POs and PSOs.
– SAR has two parts
– Part I seeks Institutional/Departmental information
– Part II seeks information on ten criteria: Programme Educational Objectives,
Programme Outcomes, Programme Curriculum, Students’ Performance, Faculty
Contributions, Facilities and Technical Support, Academic Support Units and
Teaching-Learning Process, Governance, Institutional Support and Financial
Resources, and Continuous Improvement in Attainment of Outcomes
 NBA reconfirms or differs from the assessment of the institution, using a mechanism of
peer review, in its evaluation report.

SAR Criteria (Tier II)

Criteria No.   Criteria                                                      Marks/Weightage

Program Level Criteria

1              Vision, Mission and Program Educational Objectives            60
2              Program Curriculum and Teaching–Learning Processes            120
3              Program Outcomes and Course Outcomes                          120
4              Students’ Performance                                         150
5              Faculty Information and Contributions                         200
6              Facilities and Technical Support                              80
7              Continuous Improvement                                        50
8              First Year Academics                                          50
9              Student Support Systems                                       50
10             Governance, Institutional Support and Financial Resources     120

               Total                                                         1000

Award of Accreditation
 Full Accreditation for 5 Years: 750 out of 1000 points with a minimum of 60% points in
Criteria 1, 4, 5, 6, 7 and 8
 Provisional Accreditation for 2 Years: Minimum 600 out of 1000 points
 No Accreditation: < 600 points out of 1000 points
V, M, PEOs, POs, PSOs
Sections 1, 2, 3, and 8: Work Flow

Vision and Mission


Vision: Where you “see” your department down the road; typically one sentence!
Mission: What you “do” to get there; typically 2-3 sentences.
• Must follow from the Vision and Mission of the Institute
• Must be shared with all stakeholders!
• Better to avoid “flowery” phrases (generally):
– World-Class
– Global excellence
– All round excellence …
• Must result from a well-defined and recorded process!

Vision and Mission - PROCESS


• Stakeholders: Top Management (...), Faculty and Staff, Current Students, Alumni,
Employers, Industry reps, ......
• Process:
– Initial brainstorming at multiple levels;
– Review, refine, and validate (Experts, Advisory Group,...)
– Wide publicity (Institute web site, campus, ...)
– Review “to close the loop” (5 years?)
– (Regular interactions with new faculty and staff; students?)
• Process documentation
• Records of process implementation

Vision & Mission of the Department

PEOs
• What the Graduates of the program are expected to achieve within 3 to 4 years of completing
the program.
• Can be abstract to some extent, but must be small in number and must be achievable.
• Must follow from Vision and Mission
• Must follow an established process
• Typically, the process is similar to the one for Vision and Mission
• Process Documentation
• Records of Process Implementation
• Must be shared with all stakeholders!
• Key elements (generally):
– Professional success
– Life-long learning, Higher Education, Research
– Ethical professional practice
– Communication skills
– Team player
– ……
• 3 to 5 PEOs may be arrived at following a well-defined and recorded process
• Measurement and closing the loop

Program Educational Objectives

(Sample) PEOs – UG IN EEE


Graduates of BE program in EEE will be able to
1. Engage in design of systems, tools and applications in the field of electrical and electronics
engineering and allied engineering industries
2. Apply the knowledge of electrical and electronics engineering to solve problems of social
relevance, and/or pursue higher education and research
3. Work effectively as individuals and as team members in multidisciplinary projects
4. Engage in lifelong learning, career enhancement and adapt to changing professional and
societal needs

Mission – PEO Mapping


• PEOs must be consistent with the Mission
• Example: A PEO states that the Graduates will be successful in Research BUT Mission has no
mention of Research!
• Develop the PEO-Mission Matrix
• The strength of mapping between a PEO and an element of Mission may be marked as
Substantial, Moderate, Slight
• Such mapping strengths must be justified
• From this perspective also, it is better to limit the number of PEOs to a reasonably small
number and have fairly crisp Mission statements.

        M1    M2    ...   Mk
PEO1    -     3           3
...
PEOn    1     1           1
• M1, M2, and so on are elements of the Mission
• Correlation levels are 1, 2, or 3, interpreted as follows: 1 – Slight; 2 – Moderate; 3 – Substantial. If there is no correlation, indicate it by a “–”
• Each mapping needs to be justified
• Example:
A PEO states that the Graduates will engage in life-long learning; this is mapped to an
element of the Mission statement, “environment conducive for self-directed learning”;
PEO3–M4: The mapping strength is “substantial”
Justification: The learning environment provided in the college is designed to promote
self-directed learning by the students; this coupled with the Program Curriculum will
lead Graduates to engage in continuous learning in their professional careers.
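Departments that keep this matrix in a spreadsheet or a small script can also check it mechanically; a minimal sketch follows (the PEO/Mission labels, correlation levels and justification text are illustrative, not taken from any particular program):

# PEO-Mission mapping matrix: 1 = Slight, 2 = Moderate, 3 = Substantial, None = no correlation.
peo_mission_map = {
    "PEO1": {"M1": 2, "M2": 3, "M3": None, "M4": None},
    "PEO2": {"M1": None, "M2": None, "M3": None, "M4": None},   # deliberately left unmapped
    "PEO3": {"M1": None, "M2": None, "M3": None, "M4": 3},
}

justifications = {
    ("PEO1", "M1"): "Emphasis on core engineering practice supports professional success.",
    ("PEO1", "M2"): "Industry interaction named in the Mission underpins this PEO.",
    ("PEO3", "M4"): "An environment conducive to self-directed learning supports lifelong learning.",
}

def check_matrix(matrix, justifications):
    # Flag PEOs with no Mission linkage and mapped cells lacking a recorded justification.
    for peo, row in matrix.items():
        if not any(row.values()):
            print(f"{peo}: not supported by any element of the Mission")
        for m, level in row.items():
            if level and (peo, m) not in justifications:
                print(f"{peo}-{m} (level {level}): justification missing")

check_matrix(peo_mission_map, justifications)   # flags PEO2 only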

POs and PSOs


• What the students become capable of, at the end of the program (PEOs look at the
graduates 3 to 4 years after the completion of the program!)
• POs (12 in number) are defined by NBA; are applicable to all UG programs; cover not
just technology competence but also skills and attitudes!
• PSOs are program specific; 2 to 4; need to be defined following a documented process

POs and PSOs

Attainment of POs / PSOs

Course Outcomes

CO Attainment

Closing the Quality Loop


• All the processes required for accreditation need to have the step of “closing the loop”.
• A model useful for understanding this is Deming’s Quality Cycle (Plan-Do-Check-Act):

• We plan the activity; do it; measure the performance (CHECK); and finally based on what
was planned and what was actually achieved, initiate appropriate action commencing the
next round of the quality cycle.
ACTION:
• If the attainment lags behind the planned target, we need to further analyze the reasons
for the same and plan suitable corrective actions for the next round.
• If the achievement exceeds the planned target, we need to “raise the bar”! Further, we
need to examine:
• If the targets set were too easy; if so, we need to raise the bar in a realistic fashion
• If the targets set were reasonable, then we need to plan for achieving the new
target level.
• This concept of the Quality Loop operates at all levels of attainment of outcomes; it will be
discussed in more detail in later sessions
• At Course Level:
• Target levels of attainment of Course Outcomes (COs) are set; Course is
delivered; actual attainment of COs is determined; AND
• The loop is closed either by increasing the target level for the next offering of the
course or
• By planning suitable improvements in the teaching /learning process to increase
the actual attainment so as to reach the target
• At PO, PSO Level:
• POs and PSOs are achieved through formal courses and other co-curricular and
extra-curricular activities
• Target levels of attainment of POs and PSOs are set; Program is delivered; actual
attainment of POs and PSOs is determined; AND
• The loop is closed either by increasing the target level for the next cycle of the
program or
• By planning suitable improvements in all the relevant activities to increase the
actual attainment
• “Closing the loop” must be carried out, in a similar manner, at the level of PEOs also!
• This concept applies even at higher levels of Mission and Vision though the time frames
involved are usually much larger!
• Thus Mission is revisited typically once in 5 to 6 years.
• It is much rarer to revisit the Vision in less than about 7 to 10 years!

This process view of Quality is central to Accreditation

Taxonomy of Teaching, Learning and Assessment


Dimensions of Learning
 Cognitive
– Cognitive Processes
– Knowledge Categories
 Affective (Emotion)
 Psychomotor
All three dimensions are involved to varying degrees in all experiences and activities
 Spiritual

Cognitive Processes
Anderson/Bloom’s Taxonomy
 Remember
 Understand
 Apply
 Analyze
 Evaluate
 Create

Remember
 Remembering involves retrieving relevant knowledge from long-term memory
 The relevant knowledge may be factual, conceptual, procedural, or some combination of
these
 Remembering knowledge is essential for meaningful learning and problem solving as that
knowledge is used in more complex tasks
 Action verbs: Recognize, recall, list, mention, state, draw, label, define, name, describe,
prove a theorem etc.

Understand
 We are said to understand when we are able to construct meaning from instructional
messages
 Instructional messages can be verbal, pictorial/ graphic or symbolic
 Instructional messages are received during lectures, demonstrations, field trips,
performances, or simulations, in books or on computer monitors
Action verbs for ‘Understanding’
 Interpret: translate, paraphrase, represent and clarify
 Exemplify: Illustrate and instantiate
 Classify: Categorize and subsume
 Summarize: Generalize and abstract
 Infer: Find a pattern
 Compare: Contrast, match, and map
 Explain: Construct a model

Apply
 Using procedures to perform exercises or solve problems
 Closely linked with procedural knowledge
Action verbs:
 Execute/Implement: determine, calculate, compute, estimate, solve, draw, relate,
modify, etc.

Analyze
Involves breaking material into its constituent parts and determining how the parts are related to one
another and to an overall structure
 Differentiate: Discriminate, differentiate, focus and select (Distinguishing relevant
parts or important parts from unimportant parts of presented material)
 Organize: Structure, integrate, find coherence, outline, and parse (Determine how
elements fit or function within a structure)
 Attribute: Deconstruct (Determine a point of view, bias, values, or intent underlying
presented material)

Analyse activities
 refining generalizations and avoiding oversimplifications
 developing one’s perspective: creating or exploring beliefs, arguments, or theories
 clarifying issues, conclusions, or beliefs
 developing criteria for evaluation: clarifying values and standards
 evaluating the credibility of sources of information
 questioning deeply: raising and pursuing root or significant questions
 clarifying arguments, interpretations, beliefs, or theories
 reading critically: clarifying or critiquing texts
 examining or evaluating assumptions
 distinguishing relevant from irrelevant facts
 making plausible inferences, predictions, or interpretations
 giving reasons and evaluating evidence and alleged facts
 recognizing contradictions
 exploring implications and consequences

Analysis in Engineering
 Use of the verb ‘analyze’ in engineering is a bit tricky
 It is not easy to design questions in this category for limited-time written examinations
 Analyse activities can be included in assignments related to case studies, projects, term
papers and field studies

Evaluate
 Make judgments based on criteria and standards
 Criteria used include quality, effectiveness, efficiency and consistency
 The standards may be either quantitative or qualitative
Action Verbs
 Check: Test, detect, monitor, coordinate
 Critique: Judge (Accuracy, adequacy, appropriateness, clarity, cohesiveness,
completeness, consistency, correctness, credibility, organization, reasonableness,

reasoning, relationships, reliability, significance, standards, usefulness, validity, values,
worth, criteria, standards, and procedures)

Create
 Involves putting elements together to form a coherent or functional whole
 While it includes objectives that call for unique production, it also refers to objectives
calling for production that students can and will do
Action verbs:
 Generate: systems, concepts, models, explanations, generalizations,
hypotheses, predictions, principles, problems, questions, stories, theories
 Plan (design)
 Produce

Critical Thinking
 Critical thinking refers to the deep, intentional and structured thinking process that is
aimed at analyzing and conceptualizing information, experiences, observations, and
existing knowledge for the purpose of creating original and creative solutions for the
challenges encountered
 Critical thinking is systematic and holistic in the sense that, while examining a proposed
solution, it examines its impact and consequences on other parts of the system, thus
ensuring that a solution at one level of the system does not create challenges and
difficulties somewhere else
 Thinking critically requires a positive, open and fair mindset that is able to objectively
examine the available information and is aware of the underlying assumptions and the
limitations they bring.
 Critical thinking is the art of analyzing and evaluating thinking with a view to improving it

Problem Solving
 Problem solving involves Apply, Analyze, Evaluate and Create processes

Nature of Engineering Courses


 The frameworks within which the majority of engineering and engineering science courses
are dealt with are fairly well defined
 The solution of open-ended problems is attempted in engineering programs mostly through
projects and sometimes through assignments, where the time available for solving is not a
major limitation
 Assessment items in class tests and end-semester examinations dominantly belong to
the Remember, Understand and Apply cognitive levels

Higher Orders of Learning/ Deep Learning/Meaningful Learning
 Apply (Implement)
 Analyze
 Evaluate
 Create

Categories of Knowledge
Knowledge
 The problem of characterizing knowledge is an enduring question of philosophy and
psychology
 Knowledge is organized and structured by the learner in line with a cognitivist -
constructivist tradition
 Knowledge is domain specific and contextualized
General Categories
 Factual
 Conceptual
 Procedural
 Metacognitive

Categories specific to Engineering


 Fundamental Design Principles
 Criteria and Specifications
 Practical Constraints
 Design Instrumentalities
Factual Knowledge
 Basic elements students must know if they are to be acquainted with the discipline or
solve any of the problems in it
 Exists at a relatively low level of abstraction
Subtypes of Factual Knowledge
 Knowledge of terminology (e.g., words, numerals, signs, pictures)
 Knowledge of specific details (including descriptive and prescriptive data) and elements
Samples of ‘Factual’ Knowledge
 Terminology: Signal-to-noise ratio, low-pass filter, VCVS, CCCS, power factor etc.
Specific details:
 Power supply frequency in India is 50 Hz
 Semiconductor devices fail above 120 °C
 Ball grid array packaging can provide for more than 200 input-output pins
 TI and Analog Devices are two semiconductor manufacturers making a wide variety of
analog ICs

Conceptual Knowledge
 A concept denotes all of the entities, phenomena, and/or relations in a given category or
class by using definitions.
 Concepts are abstract in that they omit the differences of the things in their extension
 Classical concepts are universal in that they apply equally to everything in their
extension.
 Concepts are also the basic elements of propositions, much the same way a word is the
basic semantic element of a sentence.
Includes
 knowledge of categories and classifications, and the relationships between and among
them
 schemas, mental models, or implicit or explicit theories
Schemas, models, and theories represent
 how a particular subject matter is organized and structured
 how the different parts or bits of information are interconnected and interrelated in a
more systematic manner
 how these parts function together
Examples of Conceptual Knowledge
 Force, acceleration, velocity, mass, voltage, current, temperature, entropy, stress, strain
 Kirchhoff’s laws
 Laws of thermodynamics

Procedural Knowledge
 is the “knowledge of how” to do something
 it often takes the form of a series or sequence of steps to be followed.
 includes knowledge of skills, algorithms, techniques, and methods, collectively known as
procedures
 also includes knowledge of the criteria used to determine when to use various procedures
 is specific or germane to particular subject matters or academic disciplines

Examples of Procedural Knowledge


 Solving matrix differential equation
 Preparing a truth-table from a logic expression
 Drawing a Bode plot
 Designing a filter as per specifications

Metacognitive Knowledge
 is knowledge about cognition in general as well as awareness of and knowledge about
one’s own cognition

Categories of Metacognitive knowledge
 Assessing the task at hand
 Evaluating one’s own strengths and weaknesses
 Planning an appropriate approach
 Applying strategies and monitoring performance
 Reflecting and adjusting one’s own approach
 Beliefs about intelligence and learning

Fundamental Design Concepts


Operational principles of devices, and components within a device
Examples
 A device can perform a variety of tasks by incorporating memory into it.
 A device that has two well defined states can be used as a memory unit.
 Stepping movement can be created through interaction between two salient magnetic
fields.
 An airfoil, by virtue of its shape, in particular its sharp trailing edge, generates lift when
inclined at an angle to the air stream.

Criteria and Specifications


 It is necessary to translate the qualitative goals for the device into specific, quantitative
goals.
 Design criteria vary widely in perceptibility
Examples
 Any power converter should have efficiency above 95%.
 The speed control unit for the dc motor should not create excessive harmonic distortion
on the power line.
 The SMPS output should have an output regulation of 0.5%.
 The speed of the dc motor should be controlled over a speed range of 1 to 300 RPM with
an accuracy of 0.05%.

Practical Constraints
 an array of less sharply defined considerations derived from experience in practice,
considerations that frequently do not lend themselves to theorizing, tabulation, or
programming into a computer.
Examples
 The legend should be written above the switch on the front panel
 The indicator lamp should be above the switch
 The clearances that must be allowed between physical parts in equipment for tools and
hands to reach different parts

 The design should be completed within two months

Design Instrumentalities
 Procedural knowledge including the procedures, way of thinking and judgmental skills by
which design is done.
Examples
1. Top-down approach to the design of a product
2. Phasing of development of a product
3. Structuring of an electronic product
4. Design walkthroughs.
5. Identify all members of the team early on and include every member in the group
communications from the outset.

Taxonomy Table
 It is a table of six cognitive processes (columns) and eight categories of knowledge
(rows).
 Each cell represents a specific combination of cognitive process and a category of
knowledge.

Taxonomy Table (Anderson-Bloom-Vincenti)

Cognitive Processes (columns): Remember, Understand, Apply, Analyze, Evaluate, Create

Knowledge Categories (rows): Factual, Conceptual, Procedural, Fundamental Design Principles, Criteria & Specifications, Practical Constraints, Design Instrumentalities, Metacognitive

Alignment
 Alignment refers to the correspondence of learning objectives, assessment and
instructional activities

Psychomotor domain
 It includes physical movement, coordination, and use of the motor-skill areas. (Simpson,
1972)
 Development of these skills requires practice and is measured in terms of speed,
precision, distance, procedures, or techniques in execution.

Affective Domain
 Proposed in 1956 by Krathwohl, Bloom, and Masia
 Difficult to structure
 Catch all: self-concept, motivation, interests, attitudes, beliefs, values, self-esteem,
morality, ego development, feelings, need achievement, locus of control, curiosity,
creativity, independence, mental health, personal growth, group dynamics, mental
imagery and personality

Relation between the three domains


 Cognitive, affective and psychomotor activities are not independent of one another
 Instruction needs to pay attention to these dependencies

Program Outcomes, Program Specific Outcomes, and Course Outcomes

POs and PSOs


• What the students become capable of, at the end of the program (PEOs look at the graduates
3 to 4 years after the completion of the program!)
• POs (12 in number) are defined by NBA; are applicable to all UG programs; cover not just
technology competence but also skills and attitudes!
• PSOs are program specific; 2 to 4; need to be defined following a documented process

POs defined by NBA


1. Engineering knowledge: Apply the knowledge of mathematics, science, engineering
fundamentals, and an engineering specialization to the solution of complex engineering
problems.
2. Problem analysis: Identify, formulate, research literature, and analyze complex engineering
problems reaching substantiated conclusions using first principles of mathematics, natural
sciences, and engineering sciences.
3. Design/development of solutions: Design solutions for complex engineering problems and
design system components or processes that meet the specified needs with appropriate
consideration for the public health and safety, and the cultural, societal, and environmental
considerations.
4. Conduct investigations of complex problems: Use research-based knowledge and research
methods including design of experiments, analysis and interpretation of data, and synthesis of
the information to provide valid conclusions.
5. Modern tool usage: Create, select, and apply appropriate techniques, resources, and modern
engineering and IT tools including prediction and modeling to complex engineering activities
with an understanding of the limitations.
6. The engineer and society: Apply reasoning informed by the contextual knowledge to assess
societal, health, safety, legal and cultural issues and the consequent responsibilities relevant to
the professional engineering practice.
7. Environment and sustainability: Understand the impact of the professional engineering
solutions in societal and environmental contexts, and demonstrate the knowledge of, and need
for sustainable development.
8. Ethics: Apply ethical principles and commit to professional ethics and responsibilities and norms of
the engineering practice.
9. Individual and team work: Function effectively as an individual, and as a member or leader in
diverse teams, and in multidisciplinary settings.
10. Communication: Communicate effectively on complex engineering activities with the
engineering community and with society at large, such as, being able to comprehend and
write effective reports and design documentation, make effective presentations, and give and
receive clear instructions.
11. Project management and finance: Demonstrate knowledge and understanding of the
engineering and management principles and apply these to one’s own work, as a member
and leader in a team, to manage projects and in multidisciplinary environments.
12. Life-long learning: Recognize the need for, and have the preparation and ability to engage in
independent and life-long learning in the broadest context of technological change.

Program Specific Outcomes (PSOs):


• Beyond POs
• Specific to the particular program
• 2 to 4 in number
• Must have a process for arriving at them
• Must be realistic
• Program Curriculum and other activities during the program must help the achievement of
PSOs as with POs!

PSOs - Examples
CSE: (Stem as with POs)
• Design, develop, test, and maintain Software Systems for business applications
• Design, develop, test, and maintain Systems Software.
• Maintain legacy software systems
ECE: (Stem as with POs)
• Specify, design, prototype and test modern electronic systems that perform analog and
digital signal processing functions.
• Architect, partition, and select appropriate technologies for implementation of a specified
communication system

Course Outcomes
Students learn well when
 They are clear about what they should be able to do at the end of a course
 Assessment is in alignment with what they are expected to do
 Instructional activities are designed and conducted to facilitate them to acquire what they
are expected to achieve

Assessment
 Understanding what students know and are able to do as a result of their engineering
education is fundamental to student success and to the quality and effectiveness of
engineering education
Many academics still view
 The assessment of student learning as an obligatory, externally imposed chore of
compliance and accountability
 Gathering evidence of student learning is not compliance with external demands but
rather an institutional strategy, a core function of continuous improvement and a means
for faculty to elevate student success and strengthen institutional health

Outcomes of Learning
 When we teach, we want our students to learn.
 The outcomes of learning go by many names: Outcomes, Course Outcomes, Learning Outcomes, Intended
Learning Outcomes, Instructional Objectives, Educational Objectives, Behavioral
Objectives, Performance Objectives, Terminal Objectives, Subordinate Skills,
Subordinate Objectives, General Instructional Objectives, Specific Learning Outcomes and
Competencies.

What is a Course Outcome?


 Course Outcomes are what the student should be able to do at the end of a course
 It is an effective ability, including attributes, skills and knowledge, to successfully carry out
some clearly identified activity
 The most important aspect of a CO is that it should be measurable

Structure of a CO Statement
 Action: Represents a cognitive/ affective/ psychomotor activity the learner should
perform. An action is indicated by an action verb representing the concerned cognitive
process.
 Knowledge: Represents the specific knowledge from any one or more of the eight
knowledge categories

 Conditions: represents the process the learner is expected to follow or the conditions
under which to perform the action (This is an optional element of CO)
 Criteria: represent the parameters that characterize the acceptability levels of performing
the action (This is an optional element of CO)

Sample 1
Determine the input-output characteristics of active two-port networks using Microcap simulator and
TI Analog Laboratory unit and compare their characteristics as obtained by simulation and Lab Unit
 Action: Determine (Apply)
 Knowledge: input-output characteristics of active two-port networks (Conceptual)
 Condition: using Microcap simulator and TI Analog Laboratory unit
 Criteria: compare its characteristics as obtained by simulation and Lab Unit

Sample 2
Macro model signal processing functions of resistors, capacitors, inductors, crystals, diodes,
Amplifiers, Op Amps, Comparators and Multipliers as one-port and two-port networks
 Action: Macro model (Understand)
 Knowledge: signal processing functions of …… (Conceptual and Procedural)
 Condition: One-port and two-port networks
 Criteria: None

Sample 3
Calculate major and minor losses associated with fluid flow in piping networks
 Action: Calculate (Apply)
 Knowledge: major and minor losses associated with fluid flow in piping networks
(Conceptual and Procedural)
 Condition: None
 Criteria: None

Sample 4
Determine the dynamic unbalanced conditions of a given mechanical system of rigid objects
subjected to force and acceleration
 Action: Determine (Apply)
 Knowledge: Dynamic unbalanced conditions (Conceptual and Procedural)
 Condition: given mechanical system of rigid objects subjected to force and acceleration
 Criteria: None

Dos and Don’ts
 Use only one action verb
 Do not use words including ‘like’, ‘such as’, ‘different’, ‘etc.’ with respect to knowledge
elements. Enumerate all the knowledge elements.
 Put in effort to make the CO statement as specific as possible and measurable

Check List
1. Does the CO begin with an action verb (e.g., state, define, explain, calculate,
determine, identify, select, and design)?
2. Is the CO stated in terms of student performance (rather than teacher performance
or subject matter to be covered)?
3. Is the CO stated as a learning product (rather than in terms of the learning process)?
4. Is the CO stated at the proper level of generality and relatively independent of other
COs (i.e., is it clear, concise, and readily definable)?
5. Is the CO attainable (does it take into account students’ backgrounds, prerequisite
competences, facilities, time available and so on)?

COs: Samples and comments


 Students will execute mini projects
 Instructional activities are designed to facilitate the attainment of COs by learners,
but themselves are not COs
 Have the concepts of compensators and controllers (P, PD, PI, PID)
 COs are competencies / behaviors that can be demonstrated; not descriptions of
internal changes in the students (though these are necessary)
 Optimal Generator scheduling for thermal power plants by using software package in the
lab
 No action verb; no way of assessing; no way of determining attainment level;
syllabus part is rewritten.
 Will get knowledge of protection schemes for Generator, Transformer and Induction Motor
 COs are competencies / behaviors that can be demonstrated; not descriptions of
internal changes in the students (though these are necessary) - See the comments in
the previous slide!
 Apply problem solving techniques to find solutions to problems.
 Too general; no clear way of assessing!
 Study variety of advanced abstract data type (ADT) and data structures and their
Implementations
 Activity that the student engages in during the Course; not what he/she becomes
capable of demonstrating at the end of the course
 Know the stress strain relation for a body subjected to loading within elastic limit.
 See the earlier comment; Not an action that can be demonstrated; Internal change

 Students will be able to learn the structure, properties and applications of modern metallic
materials, smart materials, non-metallic materials and advanced structural ceramics.
 An outcome? How to assess?
 Students will be aware of base band signal concepts and different equalizers.
 See the earlier comment; Not an action that can be demonstrated; Internal change
 Get complete knowledge regarding adaptive systems
 See the earlier comment; Not an action that can be demonstrated; Internal change;
Too ambitious to be realistic?

Exercise
Write a set of COs a student should acquire at the end of your course, emphasizing particularly the
relevant higher cognitive levels.
 Make sure that the CO does not appear to be like a single question.
 Avoid using the action verbs Apply, Analyze, Evaluate and Create. Use the action verbs
associated with these cognitive levels.
 Mark the number of classroom sessions you would need to conduct the instructional
activities for each competency

Attainment of Course Outcomes
Course Outcomes
 Course Outcomes are statements on what the students will be expected to attain at the
end of the course.
 The number of course outcomes is about 6.
 2-credit course has about 28 classroom sessions
 3-credit course has about 40 classroom sessions
 4-credit course has about 54 classroom sessions
 It is desirable to associate an approximate number of classroom sessions with each
Course Outcome.

Sample Course Outcomes


 Course: Analog Circuits and Systems Credits: 3:0:1
 Course Designers: K. Radhakrishna Rao and N.J. Rao

Cognitive levels: U = Understand, Ap = Apply.

CO1: Understand the characteristics of linear one-port and two-port signal processing networks (U; 3 class sessions)

CO2: Model one-port devices including R, L, C and diodes, two-port networks, and active devices including amplifiers, Op Amps, comparators, multipliers, BJTs and FETs (U; 9 class sessions; 4 lab sessions)

CO3: Understand how negative and positive feedback influence the behaviour of analog circuits (U; 4 class sessions; 4 lab sessions)

CO4: Design VCVS, CCVS, VCCS, CCCS, and DC and SMPS voltage regulators (Ap; 10 class sessions; 4 lab sessions)

CO5: Design analog filters (Ap; 8 class sessions; 8 lab sessions)

CO6: Design waveform generators, phase followers and frequency followers (Ap; 6 class sessions; 8 lab sessions)

Total hours of instruction: 40 class sessions and 28 lab sessions

Attainment of COs of the Course


 Attainment of COs can be measured directly and indirectly
 Direct attainment of COs can be determined from the performances of students in all the
relevant assessment instruments.
 Indirect attainment of COs can be determined from the course exit surveys.
 The exit survey form should permit receiving feedback from students on individual COs.
 Computation of indirect attainment of COs may turn out to be complex; the percentage
weightage to indirect attainment can be kept at a low percentage, say 10%.
Direct CO attainment
 Semester End Examination (SEE) is conducted and evaluated by the affiliating University.
 The Department will have access only to the marks obtained by each student in the course
 As information on each student’s SEE performance on individual COs is not available,
the Institution/Department has to take the SEE attainment (percentage marks) to be the same
for all COs of the course.
 The proportional weightages of CIE: SEE may be 20:80, 25:75 or 30:70.
 The number of assessment instruments used for CIE is decided by the instructor and/or
Department and sometimes by the affiliating University

Assessment Pattern
All assessment items in all CIE assessment instruments are to be tagged with
 Cognitive Level (CL)
 Course Outcome (CO)
 Marks

A sample assessment pattern for all the concerned CIE instruments (assuming 25% weightage for
CIE) is indicated below.

CL            A1 (5)    T1 (10)    T2 (10)
Remember      0         20%        20%
Understand    0         60%        40%
Apply         100%      20%        40%
Analyze       0         0          0
Evaluate      0         0          0
Create        0         0          0

Class average in CIE

CO     A1 (5)     T1 (10)    T2 (10)    CIE Class Average
       Cl. Ave    Cl. Ave    Cl. Ave
CO1    0          2.3/4      0.6/1      2.9/5 = 58%
CO2    1.5/2      2.1/3      0.8/1      4.4/6 = 76%
CO3    0.7/1      2.3/3      2.3/3      5.3/7 = 76%
CO4    1.7/2      0          1.2/2      2.9/4 = 72%
CO5    0          0          1.1/2      1.1/2 = 55%
CO6    0          0          0.7/1      0.7/1 = 70%
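Once every CIE assessment item is tagged with its CO, the marks allotted and the class-average marks, the CO-wise CIE class averages above follow mechanically. A minimal sketch (only the CO1 and CO3 tags from the table are used; the tags and helper names are illustrative):

from collections import defaultdict

# Each CIE item is tagged as (instrument, CO, marks allotted, class-average marks earned).
items = [
    ("T1", "CO1", 4, 2.3), ("T2", "CO1", 1, 0.6),
    ("A1", "CO3", 1, 0.7), ("T1", "CO3", 3, 2.3), ("T2", "CO3", 3, 2.3),
]

allotted = defaultdict(float)   # marks assigned to each CO across all CIE instruments
earned = defaultdict(float)     # class-average marks earned against each CO

for _instrument, co, max_marks, class_avg in items:
    allotted[co] += max_marks
    earned[co] += class_avg

for co in sorted(allotted):
    print(f"{co}: {earned[co]:.1f}/{allotted[co]:.0f} = {100 * earned[co] / allotted[co]:.0f}%")
# CO1: 2.9/5 = 58% and CO3: 5.3/7 = 76%, matching the table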

Setting CO Attainment Targets


 There can be several methods
Example 1:
 Same target is identified for all the COs of a course. For example
 The target can be “the class average marks > 60 marks”
Example 2
 Targets are the same for all COs and are set in terms of performance levels of different
groups of students.
 While this method classifies students into different categories, it does not provide any clues for
planning improvements in the quality of learning

Targets

% of students getting < 50:            10
% of students getting > 50 and < 65:   40
% of students getting > 65 and < 80:   30
% of students getting > 80:            10

Example 3
 Targets are set for each CO of a course and for different groups of students separately
 Provides considerable details which can lead to specific plans for improvement

CO-wise Targets (% of students in each band)

CO      < 50    > 50 and < 65    > 65 and < 80    > 80
CO1     10      40               40               10
CO2     20      30               40               10
CO3     20      30               40               10
CO4     10      40               40               10
CO5     20      20               50               10
CO6     20      20               50               10

Example 4
Setting targets for Course Outcomes
 Targets are set for each CO of a course separately.

CO      Target (Class Average)
CO1     60%
CO2     75%
CO3     70%
CO4     70%
CO5     80%
CO6     70%

 It does not directly indicate the distribution of performance among the students, but it has the
advantage of bringing out the relative difficulty of specific COs
 There are several ways of setting targets for Course Outcomes

Computation of CO Direct Attainment in the course Cxxx
Attainment of COi in a course Cxxx = (Wt. of CIE × Attainment of COi as a percentage in CIE) + (Wt. of SEE × Class Average Marks Percentage in SEE)

CO     CIE (weight 25)    SEE (weight 75)    Direct CO Attainment
       Cl. Ave            Cl. Ave            (0.25 × CIE Cl. Ave + 0.75 × SEE Cl. Ave)
CO1    2.9/5 = 58%        63%                61.75
CO2    4.4/6 = 76%        63%                65.9
CO3    5.3/7 = 76%        63%                65.9
CO4    2.9/4 = 72%        63%                64.7
CO5    1.1/2 = 55%        63%                59.6
CO6    0.7/1 = 70%        63%                64.1
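In code form, the direct attainment of a CO is just this weighted sum; a minimal sketch (the weights and the CO1 figures are those of the example; the function name is ours):

# Direct CO attainment = w_cie x CIE class average (%) + w_see x SEE class average (%).
# The SEE term is the same for every CO because CO-wise SEE data is not available.
def direct_co_attainment(cie_pct, see_pct, w_cie=0.25, w_see=0.75):
    return w_cie * cie_pct + w_see * see_pct

# CO1 of the table: CIE class average 58%, SEE class average 63%
print(direct_co_attainment(58, 63))   # 61.75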

CO Attainment and Attainment Gap


 Computation of Attainment of COs in Cxxx =
0.9 × Direct CO Attainment + 0.1 × Indirect CO Attainment

CO     Direct CO Attainment        Indirect CO Attainment   CO Attainment   CO Target %ge   CO Attainment Gap
       (0.25 CIE Cl. Ave +         (Exit Survey)
       0.75 SEE Cl. Ave)
CO1    61.75                       78                       62.3            60              -2.3%
CO2    65.9                        85                       67.8            75              7.3%
CO3    65.9                        76                       66.9            70              3.1%
CO4    64.7                        89                       67.1            70              2.9%
CO5    59.6                        78                       61.4            80              18.6%
CO6    64.1                        85                       66.2            70              3.8%

Note: When there are no attainment gaps, or the attainment gaps are negative, it is expected that the
instructor will raise the CO target the next time the course is offered.
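The gap and the follow-up action can also be expressed directly in code; a minimal sketch (the function name, the 5-point raise and the printed wording are ours; the two sample rows are CO1 and CO5 from the table above):

def close_the_loop(target_pct, attained_pct, raise_by=5.0):
    """Compare CO attainment with its target and suggest the next action.
    Gap = target - attainment; a non-positive gap means the target was met."""
    gap = target_pct - attained_pct
    if gap <= 0:
        return (f"gap {gap:.1f}%: target met; consider raising the target "
                f"(e.g. to {target_pct + raise_by:.0f}%) for the next offering")
    return f"gap {gap:.1f}%: analyse the causes and plan improvements in teaching/learning"

print("CO1:", close_the_loop(60, 62.3))   # target already met
print("CO5:", close_the_loop(80, 61.4))   # large gap to bridge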
Closure of the Quality Loop

CO     Target    CO Attainment Gap    Action proposed to bridge the gap    Modification of target where achieved
CO1    60%       -2.3%
CO2    75%       7.3%
CO3    70%       3.1%
CO4    70%       2.9%
CO5    80%       18.6%
CO6    70%       3.8%

Use of Surveys
A Recap
 Evaluation of attainment of POs and PSOs is based on Direct and Indirect Methods!
 Direct Methods:
The performance of students in different assessments (Internal, University) → Evaluation of
attainment of COs → Evaluation of attainment of POs and PSOs, based on the mappings from
COs to POs and PSOs
 Indirect Methods:
Program Exit Surveys, Alumni Surveys, and Employer Surveys are used to evaluate the
attainment of POs and PSOs

Attainment of POs and PSOs


 Evaluations of attainment of POs and PSOs based on Direct and Indirect Methods are
combined to arrive at the Final Evaluation.

Example: PO 5 (Modern Tool Usage): Evaluation Based on Direct Methods: Level 2


 Based on Indirect Methods (3 Surveys): 2.67
 Combined Evaluation: (w1 x 2) + (w2 x 2.67)
 The weights w1 and w2 need to be decided by the Institute.
Typical values can be 0.8 and 0.2 respectively!
 With these values, the combined value is: 1.6 + 0.54 = 2.13
(Between Level 2 and Level 3)
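A minimal sketch of this combination (the weights and level values are those of the PO 5 example; the function name is ours):

def combined_po_attainment(direct_level, indirect_level, w_direct=0.8, w_indirect=0.2):
    # Weighted combination of direct (CO-based) and indirect (survey-based) attainment levels
    return w_direct * direct_level + w_indirect * indirect_level

# PO 5 (Modern Tool Usage): direct evaluation Level 2, survey-based value 2.67
print(f"{combined_po_attainment(2, 2.67):.2f}")   # 2.13, i.e. between Level 2 and Level 3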

Attainment of PEOs
 Evaluation of attainment of PEOs is generally based only on Indirect Methods!

Indirect Methods:
Alumni Surveys, and Employer Surveys are generally used to evaluate the attainment of PEOs.
 Thus the data from Surveys is used for evaluating the attainment of POs and PSOs as well as
PEOs.
 The actual responses useful for these two different purposes are not identical!

Program Exit Survey - 1


Personal Details:
 Name
 Duration at the Institute (From...To....)
 Program of Study
 Rural / Urban Background
 Placement Status

 Status in GATE / GRE / ....
 ... ...
(What follows are sample questions only)
On a scale of 1 (worst) to 5 (best) where relevant (other ranges are possible, of Course)

Program Exit Survey - 2


 Level of comfort in working in groups
 Level of confidence in formulating imprecise real-world problems as formal engineering
problems
 Opportunities provided for acquiring leadership skills
 Communication skills and Interpersonal skills acquired during your stay in the Institute
 Nature of final-year project: (Research, Implementation, Fabrication, Purely theoretical,)

Program Exit Survey - 3


 Confidence in applying concepts of Mathematics and Computing in solving problems
 Usefulness of professional core courses during job interviews
 Availability and adequacy of modern tools in the laboratories
 Opportunities provided for working in multi-disciplinary project teams
 Usefulness of Mathematics, Professional core and electives in competitive exams like GATE,
GRE etc

Program Exit Survey - 4


 Level of understanding of the need to factor in sustainability, ethical, health, public safety,
and environmental issues in the solutions developed by you
 Opportunities for working on real-life problems during the program
 Extent of opportunities available for applying project management principles in academic
activities undertaken by you during the program
 Extent of usefulness of Basic Science and Engineering Science courses in problem solving

Program Exit Survey - 5


 New tools (outside the formal curriculum) learnt
 Extent of acquisition of critical analysis competency in solving complex engineering problems
 Opportunities available for working on projects with research focus (PG?)
 Open suggestions for improving the quality of the program

Alumni Survey
Personal Details:
 Name
 Duration at the Institute (From...To....)
 Program of Study
 Rural / Urban Background
 ....
 ... ...
On a scale of 1 (worst) to 5 (best) where relevant (other ranges are possible!) (These are sample
questions only):
 Current Position; Organization
 Initial Position; Organization
 Promotions, Organizations in which you worked along with period in each organization,
Rewards, Awards, projects handled etc
 Publication of Research Papers, White Papers etc.
 Level of comfort in working in groups – initially and at present
 Enhancement of qualifications (higher degrees, certificate courses etc), knowledge, skills etc.
(workshops, training programs etc)
 Level of confidence and success in formulating imprecise real-world problems as formal
engineering problems – initially, now
 Success in leadership roles (preparedness at program exit, success in on-site trainings etc.)
 Communication skills (level of acquisition during the program, usefulness in the job,
additional acquisitions during work etc.)
 Level of Interpersonal skills
 Ease with modern tools
 Learning curve with new tools
 New tools learnt during job
 Your assessment of need for professional ethics in work
 Comfort level with application of concepts of Mathematics and Engineering in solving real problems
 Usefulness of professional core courses in your professional practice.
 Relevance of professional electives to your profession so far
 Ability to factor in sustainability, ethical, health, public safety, and environmental issues in
the solutions developed by you
 Extent of application of project management principles in the projects handled/being handled
by you
 Extent of usefulness of Basic Science and Engineering Science courses in understanding
problems you solved so far in your career
 Open suggestions for improving the quality of the Program

Employer Survey
Organization Details: ...
Employee Details:
 Name
 Current Position
 Date of Joining the Organization
 Position at the time of joining ... ...
With respect to our Graduates, please indicate your assessment on the following:
 Ability to work well in groups
 Publication of Research Papers, White Papers etc.
 Level of confidence and success in formulating imprecise real- world problems as formal
engineering problems
 Success in leadership roles
 Communication skills
 Interpersonal skills
 Ability to learn and use new and modern tools
 Ethical Behavior
 Ability to factor in sustainability, ethical, health, public safety, and environmental issues in
the solutions developed
 Extent of application of project management principles in the projects handled/being handled
by him/her
 Extent of critical analysis competency exhibited in solving complex engineering problems
 Enthusiasm in participating in your CSR activities
 Any specific negative traits observed
 Open suggestions for improving the quality of our graduates

Using the Survey Data


Using the survey data for evaluating the attainment of a PO, PSO or PEO is the same:
Example: PO 5 (Modern Tool Usage)
1. Identify the responses that are relevant to this PO from each survey.
Example:
“Rate the Ability to learn and use new and modern tools” from Employer Survey
“New tools (outside the formal curriculum) learnt”
from Program Exit Survey and so on...
2. With data from only one type of survey, find the average rating for one relevant question.
Example (cont’d): Using Program Exit Survey
50 people answered the example question given earlier; 6 rated 1 (low); 35 rated 4; and 9
rated 5. So, the average is: 3.82
3. Repeat for all other relevant questions from the same survey
Example (cont’d): Assume there are 3 other relevant questions and their average ratings are 3.91,
4.15, and 4.88

4. The final average rating from this survey is 4.19
5. Set target levels of attainment
6. Example: Average value from a Survey is
< 3 → Level 1
≥ 3 and < 4 → Level 2
≥ 4 → Level 3
(Other ranges are possible; discuss in the department and record the justifications for setting the
target levels the way they are set)
7. So, Attainment of PO 5 from the survey under consideration is:
4.19 → Level 3
8. Repeat with other types of Surveys if relevant.
9. Compute the grand average as the Final Value of Attainment of this PO
Example: Attainment of PO5
From Program Exit Survey: Level 3
From Alumni Survey: Level 3
From Employer Survey: Level 2
Final Value: (3+3+2) / 3 = 2.67
10. Repeat this for each PO, PSO, and PEO.
Surveys useful for POs and PSOs: Program Exit Survey, Alumni Survey, Employer Survey
Surveys useful for PEOs: Alumni Survey, Employer Survey
Alternative approach for combining results from different surveys:
 Previous approach: Result of each survey was immediately quantized into one of the 3 levels
 Alternatively: We can retain the average value computed for each survey (without
quantizing), find the grand average value from all the relevant surveys, and then quantize!
Example: Attainment of PO5
Values from the Program Exit Survey, Alumni Survey and Employer Survey are respectively:
4.19, 4.32, 3.79 → Grand Average = 4.1 → Level 3
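Steps 2 to 9 above amount to a small amount of arithmetic; a minimal sketch (the rating counts and level thresholds follow the worked example; the function names are ours):

def average_rating(counts):
    # counts: {rating: number of respondents} on a 1 (worst) to 5 (best) scale
    total = sum(counts.values())
    return sum(rating * n for rating, n in counts.items()) / total

def to_level(value, thresholds=(3.0, 4.0)):
    # Quantize an average rating into attainment Level 1, 2 or 3
    if value < thresholds[0]:
        return 1
    if value < thresholds[1]:
        return 2
    return 3

# Step 2: one relevant Program Exit Survey question (6 rated 1, 35 rated 4, 9 rated 5)
q1 = average_rating({1: 6, 4: 35, 5: 9})            # 3.82
# Steps 3-4: average over all relevant questions of that survey
exit_survey_avg = sum([q1, 3.91, 4.15, 4.88]) / 4   # 4.19
# Steps 7-9: quantize each survey, then take the grand average across surveys
levels = [to_level(exit_survey_avg), 3, 2]          # exit, alumni and employer surveys
print(round(sum(levels) / len(levels), 2))          # 2.67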

Course Surveys
 Course Surveys: Mid-Course; Course-End
 Written / Electronic; Signed / Anonymous
 Mid-Course Survey:
– Typically, about a month after the start of the course; can be repeated after another
month!
– Useful for corrections in course delivery
 Course-End Survey:
– At the end of the course
– Useful for “closing the quality loop”
– May be used in computing course attainment, though the manual does not explicitly
recognize this approach!

Mid-Course Survey
 Helpful for mid-course corrections
 Typical Questions to be answered by all the students
(on a scale of 1 to 5 – most negative to most positive response):
– COs are clear
– Pace of coverage is comfortable
– Instruction is aligned to COs
– Questions are encouraged
– Good access to learning resources
– Examples are worked out well
– Good communication skills (of Faculty)
– Supportive attitude (of Faculty).....

Course-End Survey
 Helpful for: “closing the loop”
 Can be used in computing attainments of COs
 Questions generally cover:
– Course Management
– Learning Environment
– Attainment of COs
– Instructor characteristics.........
 Typical Questions to be answered by all the students
(on a scale of 1 to 5 – most negative to most positive response):
– COs were clear
– Instructional activities helped in attaining the COs
– Pace of coverage was comfortable
– Questions were encouraged
– Had good access to learning resources
– Examples were worked out well and were also useful for examinations
– Instructor had good communication skills
– Instructor’s attitude was supportive
– How much did you learn?
– Any specific CO(s) that you are not confident of? (Tick them in the list below)
– The course helped you in improving your problem solving abilities.........

Using the Survey Data


 Find the average rating for one relevant question.
Example: For a question related to CO3, of the 65 answers:
6 rated 1 (low); 54 rated 4; and 5 rated 5. So, the average is: 3.8

It corresponds to (as per our own settings)
Level 2 (medium)!
 Repeat for all other relevant questions
 The final attainment of that CO is the average of all these values
 This process is repeated for all the COs

Combining Direct & Indirect Evaluations


 The attainment levels obtained by direct methods and course-end survey can be combined to
get the final level of attainment.
 The relative weights need to be decided upon. (90% and 10% to 80% and 20%?)
 Example: CO2
– Direct method (University Examination + Internal Assessment): 1.9
– Based on Course-End Survey: 2
– Final Value: (0.9 x 1.9) + (0.1 x 2) = 1.91
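In code form (the 0.9/0.1 split and the CO2 values are those of the example):

w_direct, w_survey = 0.9, 0.1            # relative weights decided by the department
direct_level, survey_level = 1.9, 2.0    # CO2: direct methods and course-end survey
print(round(w_direct * direct_level + w_survey * survey_level, 2))   # 1.91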

Rubrics
 What?
– A Scoring Tool useful for subjective assessments
– A more systematic way of evaluating performance of students on tasks such as
Seminars, Projects, Term Papers...
 Must be shared up front with students
– Enables students to “do” what is expected
– Makes the process more transparent
– Allows self-evaluation by students
 Components:
– Attributes
– Descriptors
– Scores

Rubrics - Attributes
 The criteria by which the performance is to be evaluated
 Are derived from the planned outcomes
Example:
For a Technical Seminar on Operating Systems, some of the attributes can be:
– Verbal Skills
– Body language
– Technical Content
– ...
 The more clearly articulated the attributes, the better will be the usefulness of the rubrics

 Your comments on the above list
 Attributes can be organized hierarchically (attributes, sub-attributes)
Example:
– Verbal Communication
 Grammatically correct sentences
 Semantically clear sentences
 “Filler words”
 Voice Modulation
 ...
– Non-Verbal Communication
 Eye-Contact
 Posture
 ...

Descriptors
For each (sub) Attribute:
 Provide descriptions of performance at different levels of “quality”
 The levels can be 3 to 5 (typical)
 Number of Levels
– Too small  not much discrimination
– Too large  Taxing for all
– No hard and fast rule
 Avoid stand-alone vague descriptors
(Excellent, Creative, Weak, ...)
 Descriptors need to be as specific as possible
 Good descriptors 
– More objective evaluation
– More helpful for students in preparing well

Example: 3 Levels for “References Section in a Term Paper”


– GOOD (Highest Level): Latest references (up to the previous year) are included;
References are cited as per the specified standard (say IEEE Standard); References
cited cover the subject matter comprehensively.
– AVERAGE (Intermediate Level):
– POOR (Low Level): Only old (...) references; many of the References are not cited as per
the specified standard; References cited poorly cover the subject matter.

Scores
For each level of descriptor of each (sub) attribute:
 Assign scores
 Can be a single value or a range of values
 Avoid a range that is too wide

Example: For the descriptors given earlier –
Good Level: 8 to 10
Average Level: 4 to 7
Low Level: 0 to 3
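A minimal sketch (Python; hypothetical data layout) showing how an attribute's level descriptors and score ranges can be captured as data and used to validate an awarded score; it mirrors the "References Section" example above, with the intermediate descriptor left open as on the slide.

```python
# Sketch: one rubric attribute captured as data, with level descriptors and score ranges.

references_rubric = {
    "GOOD":    {"range": (8, 10),
                "descriptor": "Latest references; cited per the specified standard; comprehensive coverage"},
    "AVERAGE": {"range": (4, 7),
                "descriptor": "..."},  # intermediate-level descriptor left open on the slide
    "POOR":    {"range": (0, 3),
                "descriptor": "Only old references; mostly not cited per standard; poor coverage"},
}

def award_score(rubric, level, score):
    """Check that the awarded score lies within the chosen level's score range."""
    lo, hi = rubric[level]["range"]
    if not lo <= score <= hi:
        raise ValueError(f"score {score} is outside the {level} range {lo}-{hi}")
    return score

print(award_score(references_rubric, "GOOD", 9))  # 9
```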

Rubrics – Template (partly filled)


 Seminar Presentation:

Criterion: Use of Communication Aids
Descriptors with Scores:
– (8-10) Communication aids enhance the presentation. The font on the visuals is readable. Information is represented and organized to maximize audience comprehension. Details are minimized so that main points stand out.
– (4-7) Communication aids contribute to the quality of the presentation. Font size is mostly readable. Appropriate information is included. Sometimes main points are obscured by excessive details.
– (2-3) Communication aids marginally contribute to the quality of the presentation. Font size is mostly readable. Information included is often unimportant. Sometimes main points are obscured by excessive details.
– (0-1) Communication aids are poorly used. Font size is too small to read. Too much information is included. Details or some unimportant information is highlighted, and may confuse the audience.

Rubrics – Example
Rubric for Formal Oral Presentation:
Attributes:
Organization
Style
Use of tools
Depth of Content
Accuracy of Content
...
Descriptors and Scores:
Rubrics for Main / Mini Project:

Rubrics – Laboratory

Attributes:
Team Work
Viva
...
Scores??

Rubrics – Project
Attributes:
Technical
Requirements
Literature Survey
Design Alternatives
...
Report
...
Presentation
...
Team Work......

Rubrics – Exercises
 Main Project
 Seminar
 Laboratory Work

COs-POs and PSOs


 POs and PSOs are attained through program specific Core Courses.
 Each Course addresses a sub-set of POs and PSOs to varying levels (strengths) (1, 2 or 3).
Sometimes the COs are written first and then the POs/PSOs the course addresses are determined.
 Sometimes the POs/PSOs a Course should address are determined a priori, and the COs have to be written to meet the identified POs/PSOs.

Strength of CO-PO/PSO Mapping


 Attainment of a PO/PSO depends both on the attainment levels of associated COs and the
strength to which it is mapped
 It is necessary to determine the level (mapping strength) at which a particular PO/PSO is
addressed by the course.
 Strength of mapping is defined at three levels: Low (1), Medium (2) and High (3)
 Several methods can be devised to determine the strength at which a PO/PSO is addressed, but implementing
them across a few hundred courses can become a burden

Strength of CO-PO/PSO Mapping Sample


 A simple method is to relate the mapping level of a PO to the number of hours devoted to the COs
which address that PO (a computational sketch follows this list).
– If >40% of the classroom sessions/tutorials/lab hours address a particular PO, the PO is
considered to be addressed at Level 3
– If 25 to 40% of the sessions address a particular PO, the PO is considered to be
addressed at Level 2
– If 5 to 25% of the sessions address a particular PO, the PO is considered to be
addressed at Level 1
– If <5% of the sessions address a particular PO, the PO is considered not addressed
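A minimal sketch of this thresholding rule in Python. The 5% / 25% / 40% cut-offs come from the list above; treating boundary values as falling in the higher level is an assumption, and the function name is illustrative.

```python
# Sketch of the percentage-of-hours rule for CO-PO/PSO mapping strength.

def mapping_strength(hours_for_po, total_hours):
    """Return the mapping strength (0-3) of one PO/PSO for one course."""
    pct = 100.0 * hours_for_po / total_hours
    if pct > 40:
        return 3
    if pct >= 25:
        return 2
    if pct >= 5:
        return 1
    return 0   # below 5%: the PO is treated as not addressed

print(mapping_strength(11, 68))  # 1  (about 16% of the sessions)
print(mapping_strength(44, 68))  # 3  (about 65% of the sessions)
```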

Sample CO-PO/PSO Mappings


 Course: Analog Circuits and Systems Credits: 3:0:1
 Course Designers: K. Radhakrishna Rao and N.J. Rao

Course Outcome | POs | CL | Class Sessions (Hrs) | Lab Sessions (Hrs)
CO1: Understand the characteristics of linear one-port and two-port signal processing networks | PO1, PO10, PSO1 | U | 3 | 0
CO2: Model one-port devices including R, L, C and diodes, two-port networks, and active devices including amplifiers, Op Amps, comparators, multipliers, BJTs and FETs | PO2, PO10, PSO1 | U | 9 | 4
CO3: Understand how negative and positive feedback influence the behaviour of analog circuits | PO1, PSO1 | U | 4 | 4
CO4: Design VCVS, CCVS, VCCS, CCCS, and DC and SMPS voltage regulators | PO3, PO4, PO5, PSO1 | Ap | 10 | 4
CO5: Design analog filters | PO3, PO4, PO5, PSO1 | Ap | 8 | 8
CO6: Design waveform generators, phase followers and frequency followers | PO3, PO4, PO5, PSO1 | Ap | 6 | 8
Total Hours of instruction: 40 (Class) + 28 (Lab) = 68

Course – PO/PSO Mapping Strength

11 of 68 (16%) sessions are devoted to PO1  Mapping strength is 1
13 of 68 (19%) sessions are devoted to PO2  Mapping strength is 1
44 of 68 (64%) sessions are devoted to PO3  Mapping strength is 3
44 of 68 (64%) sessions are devoted to PO4  Mapping strength is 3
44 of 68 (64%) sessions are devoted to PO5  Mapping strength is 3
16 of 68 (23%) sessions are devoted to PO10  Mapping strength is 1
68 of 68 (100%) sessions are devoted to PSO1  Mapping strength is 3
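The counts above can be reproduced mechanically from the sample course's CO table. A minimal sketch, assuming the hours per CO are class plus lab sessions and percentages are truncated to whole numbers as on the slide; the variable and function names are illustrative.

```python
# Sketch: per-PO session counts and mapping strengths derived from the CO table above.

co_table = {
    "CO1": (["PO1", "PO10", "PSO1"],       3 + 0),
    "CO2": (["PO2", "PO10", "PSO1"],       9 + 4),
    "CO3": (["PO1", "PSO1"],               4 + 4),
    "CO4": (["PO3", "PO4", "PO5", "PSO1"], 10 + 4),
    "CO5": (["PO3", "PO4", "PO5", "PSO1"], 8 + 8),
    "CO6": (["PO3", "PO4", "PO5", "PSO1"], 6 + 8),
}

def strength(pct):
    """Mapping strength from the percentage of sessions (cut-offs as in the rule above)."""
    return 3 if pct > 40 else 2 if pct >= 25 else 1 if pct >= 5 else 0

total_hours = sum(hours for _, hours in co_table.values())   # 68

hours_per_po = {}
for pos, hours in co_table.values():
    for po in pos:
        hours_per_po[po] = hours_per_po.get(po, 0) + hours

for po, hrs in hours_per_po.items():
    pct = int(100 * hrs / total_hours)   # truncated, as on the slide
    print(f"{po}: {hrs} of {total_hours} ({pct}%) -> strength {strength(pct)}")
# PO1: 11 (16%) -> 1; PO2: 13 (19%) -> 1; PO3/PO4/PO5: 44 (64%) -> 3;
# PO10: 16 (23%) -> 1; PSO1: 68 (100%) -> 3
```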

Course-POs/PSO Mapping
 POs and PSOs are addressed through core courses, projects etc.
 A course/project etc. meets a subset of POs and PSOs to different strengths (1, 2 or 3)
The sample course (C302) below addresses a subset of POs and PSOs to varying strengths:

Course | PO1 | PO2 | PO3 | PO4 | PO5 | PO6 | PO7 | PO8 | PO9 | PO10 | PO11 | PO12 | PSO1 | PSO2
C302   |  1  |  1  |  3  |  3  |  3  |  0  |  0  |  0  |  0  |  1   |  0   |  0   |  3   |  0

CO Attainment and POs/PSOs


 Not every COi of the course will address every PO or PSO addressed by the course

CO  | POs                 | CO Attainment (%)
CO1 | PO1, PO10, PSO1     | 62.3
CO2 | PO2, PO10, PSO1     | 67.8
CO3 | PO1, PSO1           | 66.9
CO4 | PO3, PO4, PO5, PSO1 | 67.1
CO5 | PO3, PO4, PO5, PSO1 | 61.4
CO6 | PO3, PO4, PO5, PSO1 | 66.2

PO and PSO Attainment


 PO and PSO attainments are normalized to 1; that is, if a PO is addressed at strength 3 and the
attainment of every CO associated with that PO is 100%, then the attainment of that PO is 1

Attainment of PO1 in Cxxx = (1/3) x Ave(0.623, 0.669) = 0.215
Attainment of PO2 in Cxxx = (1/3) x Ave(0.678) = 0.226
Attainment of PO3 in Cxxx = (3/3) x Ave(0.671, 0.614, 0.662) = 0.649
Attainment of PO4 in Cxxx = (3/3) x Ave(0.671, 0.614, 0.662) = 0.649
Attainment of PO5 in Cxxx = (3/3) x Ave(0.671, 0.614, 0.662) = 0.649
Attainment of PO10 in Cxxx = (1/3) x Ave(0.623, 0.678) = 0.217
Attainment of PSO1 in Cxxx = (3/3) x Ave(0.623, 0.678, 0.669, 0.671, 0.614, 0.662) = 0.653
 These computations are approximate, but they give an indicative PO/PSO attainment
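A minimal sketch of the normalization above for the sample course: for each PO/PSO, attainment = (mapping strength / 3) x average fractional attainment of the COs mapped to it. The values are taken from the tables above; the variable names are illustrative.

```python
# Sketch: course-level PO/PSO attainment from CO attainments and mapping strengths.

co_attainment = {"CO1": 0.623, "CO2": 0.678, "CO3": 0.669,
                 "CO4": 0.671, "CO5": 0.614, "CO6": 0.662}

co_to_pos = {"CO1": ["PO1", "PO10", "PSO1"],
             "CO2": ["PO2", "PO10", "PSO1"],
             "CO3": ["PO1", "PSO1"],
             "CO4": ["PO3", "PO4", "PO5", "PSO1"],
             "CO5": ["PO3", "PO4", "PO5", "PSO1"],
             "CO6": ["PO3", "PO4", "PO5", "PSO1"]}

strength = {"PO1": 1, "PO2": 1, "PO3": 3, "PO4": 3, "PO5": 3, "PO10": 1, "PSO1": 3}

def po_attainment(po):
    """(strength/3) x average attainment of the COs mapped to this PO/PSO."""
    cos = [c for c, pos in co_to_pos.items() if po in pos]
    avg = sum(co_attainment[c] for c in cos) / len(cos)
    return (strength[po] / 3) * avg

for po in strength:
    print(po, round(po_attainment(po), 3))
# PO1 0.215, PO2 0.226, PO3/PO4/PO5 0.649, PO10 0.217, PSO1 0.653
```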

Attainment of POs and PSOs

Course            | PO1   | PO2   | PO3   | PO4   | PO5   | PO6 | PO7 | PO8 | PO9 | PO10  | PO11 | PO12 | PSO1  | PSO2
C302 (strength)   | 1     | 1     | 3     | 3     | 3     | 0   | 0   | 0   | 0   | 1     | 0    | 0    | 3     | 0
C302 (attainment) | 0.215 | 0.226 | 0.649 | 0.649 | 0.649 | 0   | 0   | 0   | 0   | 0.217 | 0    | 0    | 0.653 | 0

Program Curriculum, T-L and Other Processes; Highly Doable and Highly Useful
Criterion II: Program Curriculum and Teaching Learning Processes
From SAR and Evaluation Manual:
 Program Curriculum and Teaching Learning Processes : 120
 Program Curriculum : 20
 Teaching Learning Process : 100

Program Curriculum

Program Curriculum: 20
– Process used to identify the extent of compliance of the University Curriculum for
attaining the POs and PSOs; mention the curricular gaps if any: 10
– State the delivery details of the content beyond the syllabus for the attainment of the
POs and PSOs: 10

Curricular Gaps
Process (One possible approach):
 Responsibility: Board of Studies
 Board of Studies (Typical Composition):
HoD, Faculty, Alumni, Current Final-Year Students, Industry, University, Faculty from
other Academic Institutes, etc.
 Map all the COs (Core Subjects only) to POs and PSOs
 If any POs / PSOs are addressed in common by all the electives, record them
 Examine the strength of mappings to the POs and PSOs
 Identify weakly addressed POs and PSOs!
 Record the MoM
 Communicate to the University (copy retained)
 Brainstorm the additional content required to address the identified curricular gaps and
record the final decisions
 Deliver the content beyond the curriculum as planned
 Treat this as you would treat any other course! (Measure attainments, close the quality loop
etc)
(BoS can meet once a semester)
Example:
 Analysis of mapping of all courses to POs and PSOs reveals that one of the PSOs that is not
being addressed adequately by the University Curriculum is “Maintain Legacy Software
Application Systems”
 Planned additional content:
– An additional 4-Hour Module in the Software Engineering Course; Assessment is by
Group Discussion of a Case Study
– An additional 5-Day Hands-On Training Program on Software Maintenance delivered in
collaboration with Industry; Assessment is by a Lab Test.

Teaching – Learning Processes

Processes followed to improve the quality of Teaching – Learning 25

Quality of IA (Tests, Assignments etc) 20

Quality of Student Projects 25

Initiatives related to Industry Interaction 15

Initiatives related to Industry Internship / Summer Training 15

Total (Teaching – Learning Processes) 100

Discuss & Make Action Plans

Processes followed to improve the quality of Teaching – Learning

Academic Calendar 3

Pedagogic Initiatives 3

Weak and Bright Students 4

Classroom Teaching 3

Laboratory Experiments 3

Continuous Assessment in the Laboratory 3

Student Feedback of T-L and Action taken thereof 6

Processes followed to improve the quality of Teaching – Learning:


 Lesson Plan – Teaching Diary – (fortnightly?) Review – Recorded corrective action plans
where necessary – Course-end review
 Activity-based learning
 Tool-supported instruction
 Tech-support for weak students (including LMS, Discussion forums, Google Groups, etc.)
 Challenges, learning resource support, and rewards (not in terms of class grades!?) for bright
students
 Open-ended experiments in the laboratories and support for conducting them
 Rubrics for continuous evaluation in the laboratory
 Mid-course, end-course surveys, data analysis, recorded improvement actions, follow-up on
the effectiveness of such measures
 Case-study based learning
 (Tech) Book Study Clubs and follow-up
 ... ... ...

Quality of Internal Assessment:


 Process to ensure quality (5)
 Process to ensure quality of IA Papers (5)
 Evidence of coverage of COs (5)
 Quality of Assignments and relevance to COs (5)
----
 Assessment plan – prepared, reviewed, revised, and shared up-front with students; includes
CO-to-Assessment-Item mapping
 Academic audit of assessment instruments
 Incentives for assignments where they cannot be part of formal internal assessment

Quality of Student Projects:


 Process to identify and allocate projects (3)
 Type, relevance, relation to POs and PSOs (5)
 Process for monitoring and evaluation (5)
 Process to assess individual and team performance (5)
 Quality of completed projects/working prototypes (5)
 Evidence of papers published/awards received (2)

Quality of Student Projects:


 Recorded process for announcing / allocating projects
 Support for project laboratory
 Materials/ Tools /Budget support for projects; a recorded process; defined budgetary
allocations and monitoring
 Add-on module if required on project planning
 Milestones for review and evaluation; recorded evidence
 Rubrics for project evaluation

 ... ... ...

Industry-Institute Interaction:
 Industry – supported laboratories (5)
 Industry participation in Program Design and Course Delivery (even partial) (5)
 Impact Analysis of III and action taken thereof (5)

Internships, Summer Training:


 Industrial training / tours (3)
 Internship / summer training (>15 days); post-training assessment (4)
 Impact analysis of industrial training (4)
 Student feedback on such initiatives (4)

Other Processes – Highly Doable and Highly Useful


 Several processes, procedures, policies, activities are highly doable and highly useful in the
context of Accreditation
 Each of the above contributes in a small way only in terms of “points”; but collectively they
can have significant impact on the total score
 Apart from usefulness from Accreditation perspective, these are useful in themselves for
improving the quality of learning!
(These relate to Criteria 4 onwards only;
Criteria 1, 2 and 3 have already been discussed.)

To Do
 Establish Student Chapters of relevant professional societies, organize some events, and
maintain full records
 Bring out a technical magazine / newsletter (Once or twice a semester?)
 Faculty Performance Appraisal and Development System (FPADS) – Define, implement, and
record
 Visiting / Adjunct Faculty (at least 50 hours of interaction per year): Organize, Record
 Define the Maintenance Process (Preventive / Corrective / Calibration) and record the data
 Establish a Project Laboratory (Facilities, Rules for Usage etc)
 Define Safety Procedures and display them in Laboratories; define the Review process and record MoM
 Establish an Academic Audit Cell, define its functions, articulate the process details, and
record the actions etc
 Establish an Entrepreneurship Development Cell, define its functions, articulate the process
details, and record the actions etc

 Establish a formal Proctor System, define its functions, articulate the process details, and
record the actions etc
 Faculty Evaluation by Students: Process details, records, actions taken, rewards, etc.
 Comprehensive Student Feedback: On Resources, Procedures, etc.
 Self-Learning Facilities (other than traditional library)
 Career guidance, Training, and Placement Activities: Already exist! Record the processes,
maintain records
 Records of co-curricular and extra-curricular activities
 Service Rules – Formal Document
 Recruitment and Promotional Policies – Formal Documents
 Formal budgetary planning, analysis
 ... ... ...

Conclusion
 Additional efforts required to attain the POs and PSOs
 Must follow the Quality Cycle
 Involve all the stakeholders
 Maintain the Records
