ISA Certified Automation Professional (CAP)
Job Analysis Study
2004
Notice
The information presented in this publication is for the general education of the reader. Because neither the
author nor the publisher has any control over the use of the information by the reader, both the author and the
publisher disclaim any and all liability of any kind arising out of such use. The reader is expected to exercise sound
professional judgment in using any of the information presented in a particular application.
Additionally, neither the author nor the publisher has investigated or considered the effect of any patents on
the ability of the reader to use any of the information in a particular application. The reader is responsible for
reviewing any possible patents that may affect any particular use of the information presented.
Any references to commercial products in the work are cited as examples only. Neither the author nor the
publisher endorses any referenced commercial product. Any trademarks or tradenames referenced belong to the
respective owner of the mark or name. Neither the author nor the publisher makes any representation regarding the
availability of any referenced commercial product at any time. The manufacturer's instructions on use of any
commercial product must be followed at all times, even if in conflict with the information in this publication.
Copyright © 2004
ISA–The Instrumentation, Systems and Automation Society
67 Alexander Drive
P.O. Box 12277
Research Triangle Park, NC 27709
All rights reserved.
ISBN: 1-55617-903-0
No part of this work may be reproduced, stored in a retrieval system, or transmitted in any form or by any means,
electronic, mechanical, photocopying, recording or otherwise, without the prior written permission of the publisher.
ISA–The Instrumentation, Systems, and Automation Society works to protect the public by identifying
individuals who are competent to practice in several related career fields. Consistent with this mission,
the intended function of the ISA Certified Automation Professional (CAP) examination program is to
assess the competence of automation professionals. Passing scores on the examination indicate that the
Certified Automation Professional has achieved a level of ability consistent with requirements for
competence on the job.
The development of a quality credentialing or licensing examination must follow certain logically sound
and well-researched procedures. These principles and methods are outlined in federal regulation
(Uniform Guidelines on Employee Selection Procedures) and manuals, such as Standards for
Educational and Psychological Testing (published by the American Educational Research Association,
1999), and Standards for the Accreditation of Certification Programs (published by the National
Commission for Certifying Agencies, 2002), as well as standards set by the American National Standards
Institute (ANSI). Through its relationship with CASTLE Worldwide, Inc., ISA follows these standards in
developing examinations for its credentialing program.
The guidelines hold that it is necessary to determine the knowledge and skills needed to be a competent
practitioner in the field in order to develop a practice-related examination. The process for identifying
these competency areas includes a job analysis study, which serves as a blueprint for examination
development. A job analysis also helps to determine the type of examination, such as multiple-choice, to
be developed in order to assess essential competence in the most appropriate manner.
The critical reason for conducting a job analysis study is to ensure that the examination has content
validity. In psychometric terms, validation is the way a test developer documents that the competence to
be inferred from a test score is actually measured by the examination. Content validity is the most
commonly applied and accepted validation strategy used in establishing certification examinations. A
content-valid examination for ISA’s Certified Automation Professional program, then, appropriately
evaluates knowledge and skill required to function as a competent practitioner in the automation
profession. A content-valid examination in automation contains a representative sample of items that
measure the knowledge and skills essential to the job.
The job analysis study is an integral part of ensuring that the examination is content-valid—that the
aspects of automation covered on the examination reflect the tasks performed in the range of practice
settings throughout the United States and Canada. For both broad content areas and tasks, the study
validates importance and criticality to practice. These ratings play an important role in determining the
content of the examination.
The ISA Certified Automation Professional practice analysis study consisted of the following three
phases, which are the focus of this report:
I. Delineation of the Practice. A panel of automation experts identified the major
domains of practice, the tasks performed within each domain, and the knowledge
and skills associated with each task.
II. Validation Survey. An electronic questionnaire was distributed to a sample of
automation professionals, who rated the domains and tasks delineated by the panel.
III. Development of Test Specifications. Based on the ratings gathered from the
representative sample of automation professionals, the test specifications for the
examination were developed.
Since 1996, ISA has offered a well-recognized certification program for control systems technicians.
Certified Control System Technicians (CCSTs) work in a variety of industries to monitor and calibrate
devices that control the manufacturing process. In 2004, ISA began the first steps in the development of
a new credentialing program for Certified Automation Professionals.
The first steps in analyzing the automation profession included the identification of the major content
areas or domains, the listing of tasks performed under each domain, and the identification of the
knowledge and skills associated with each task. To conduct the study, ISA assembled a 15-member
panel of automation experts to discuss the practice. The panel members represented automation
professionals practicing in various job settings, in all geographic regions of the United States, and at
various experience levels, as well as educators. A complete list of panel members is provided in Appendix A.
A. The panel determined that the profession could be divided into six major domains of practice. The six
domains of practice denote major responsibilities performed by automation professionals. These
performance domains are:
1. Feasibility Study
2. Definition
3. System Design
4. Development
5. Deployment
6. Operation and Maintenance
B. Next, the panel delineated essential tasks in each of the six domains. The tasks define the domains
and focus the automation professional on public safety, health, and welfare. The panel subsequently
generated a list of knowledge and skills required to perform each task.
C. The panel members then evaluated each performance domain and task, rating each on importance
and criticality to the automation practice.
Based on the work of the panel of experts, CASTLE developed an electronic survey and distributed it to a
sample of automation professionals. The results of the survey are the focus of Phase II.
Using the domains and tasks identified by the panel of experts, CASTLE developed an electronic
questionnaire to be completed by a sample of automation professionals. ISA provided CASTLE with a list
of 1,500 names of professionals in the automation field. CASTLE distributed the questionnaire to these
1,500 professionals to consider, rate, and provide other feedback on the domain and task lists delineated
by the panel of experts. The questionnaire also solicited biographical information from the respondents in
order to ensure a representative response and completion by appropriately qualified individuals.
Of the 1,500 individuals who were asked to participate online, 219 submitted usable responses.
Discounting undeliverable e-mail addresses, out-of-office individuals, individuals unable to log into the
survey, and individuals opting out of the survey, the overall response rate was 14.95%. Given that the
survey required approximately 20 minutes to complete and that it was unsolicited, the response rate
achieved is reasonable. Not all individuals responded to every question; therefore, the total number of
responses per question may vary.
The characteristics of the sample are important as a means to assess the degree to which the group of
respondents represents the automation profession along key dimensions. The panel of experts
discussed key variables that might have an impact on how members of the profession view their work
and developed 14 questions that accounted for them. Survey respondents were asked to provide this
information by responding to the questions. The following tables summarize the information provided by
survey respondents. Because some respondents elected not to answer every question, the frequencies
reported below do not always total the number of respondents.
As shown in the chart and graph below, the majority of respondents (203, or 94.4%) are male.
GENDER
Frequency Percent
Male  203  94.4
Female  12  5.6
[Bar graph: number of respondents by gender]
As shown in the chart and graph below, the majority of the sample was more than 40 years old. Thirteen
individuals (6%) reported their age as under 30 years old.
AGE
Frequency Percent
61 years and above  7  3.3
[Bar graph: number of respondents by age group: Under 30 years, 31-40 years, 41-50 years, 51-60 years, 61 years and above]
As shown in the graph below, states were grouped into geographic regions. All regions were
represented in the sample.
[Map: states grouped into five numbered geographic regions; Alaska, Hawaii, and Puerto Rico shown as insets]
LOCATION
Region  Frequency  Percent
1  17  8.5
2  50  25.0
3  33  16.5
4  53  26.5
5  47  23.5
The table and graph below present the status of the respondents according to the years of experience
they reported. As evidenced by the table and graph, the respondents tended to be very experienced in
the automation profession with 97 individuals (45.1%) reporting more than 15 years of experience in the
field.
YEARS OF EXPERIENCE
Frequency Percent
I’m not an automation professional  3  1.4
Less than 1 year  3  1.4
More than 15 years  97  45.1
[Bar graph: number of respondents by years of experience: not an AP, less than one year, 1-5 years, 6-10 years, 11-15 years, more than 15 years]
The respondents were asked to provide the percentage of their time spent working as an automation
professional in their current position. Nearly two-thirds of the respondents (65.6%) reported spending 76
to 100 percent of their time working as an automation professional in their current position.
Frequency Percent
I’m not an automation professional  4  1.9
Less than 25 percent  6  2.8
[Bar graph: number of respondents by percent of time: not an AP, less than 25 percent, 25-50 percent, 51-75 percent, 76-100 percent]
The majority of the respondents reported working in both discrete/machine control and process/liquid/dry
control areas on a daily basis.
PROCESS AREAS
Frequency Percent
Discrete (Machine Control)  16  7.5
[Bar graph: number of respondents by control area: Discrete, Process, Both]
The majority of respondents (73.3%) reported that Project/Systems Engineering was their primary
responsibility in their current position.
PRIMARY RESPONSIBILITY
Frequency Percent
Information Systems  5  2.5
Operations and Maintenance  24  11.9
Project/Systems Engineering  148  73.3
Other  25  12.4
[Bar graph: number of respondents by primary responsibility: Information Systems, Operations & Maintenance, Project/Systems Engineering, Other]
Respondents were asked to select the responses that best described the industry in which they worked.
The responses are provided in the table below and the chart on the following page.
INDUSTRY
Frequency Percent
Aerospace  1  0.5
Automotive Manufacturing  4  1.9
Building Automation  6  2.8
Chemical Manufacturing  25  11.7
Metals Manufacturing  3  1.4
Petroleum Manufacturing  12  5.6
Pharmaceutical Manufacturing  27  12.6
Plastics Manufacturing  4  1.9
[Bar chart: number of respondents by industry]
The table and graph below present the status of the respondents according to their current employer’s
company or organization. As shown below, the greatest number (82, or 38.1%) of respondents reported
their current employer is best described as end-users. Only 13 individuals, or 6.0%, responded that their
employer did not fit a listed category.
CURRENT EMPLOYER
Frequency Percent
Control Systems Suppliers  15  7.0
End-Users  82  38.1
Engineering and Design Firm  44  20.5
Original Equipment Manufacturer (OEM)  22  10.2
Systems Integrators  39  18.1
Other  13  6.0
[Bar graph: number of respondents by current employer type]
Respondents were asked to indicate which, if any, certifications and licenses they held.
CERTIFICATIONS/LICENSES
Frequency
CEM 1
CQE 1
CCST 2
CSE 10
MCSE 2
PE 51
PMP 3
Other 22
Respondents were also asked to indicate which, if any, professional societies they belonged to.
ORGANIZATION MEMBERSHIP
Frequency
AIChE 13
ASME 3
CSIA 13
IBEW 27
IEEE 27
ISA 124
UA 1
Other 32
The table and chart below show that a significant majority of respondents (62.6%) reported their highest
level of education as the bachelor's degree. Respondents were also asked to provide the major/focus of
their highest degree. The responses are provided in Appendix C.
EDUCATION
Frequency Percent
High school/Secondary school  15  7.0
Other  4  1.9
[Bar graph: number of respondents by highest level of education: High school, Associate degree, Bachelor's Degree, Master's Degree, Doctoral Degree, Other]
The responses for annual income are provided in the table and graph below. Only three individuals
(1.4%) reported earning an annual income level of less than $20,000 while 28 individuals (13.4%)
reported earning an annual income level greater than $110,000.
ANNUAL INCOME
Frequency Percent
More than $110,000  28  13.4
[Bar graph: number of respondents by annual salary: Less than $20,000, $20,000-$49,999, $50,000-$79,999, $80,000-$110,000, More than $110,000]
A. Validation Scales. The panel of experts reviewed a number of scales that are often used in job
analysis and other validation studies for the purpose of collecting data that would account for how
members of the profession evaluate the domains and tasks. In making its selection, the panel considered
which scales seemed most appropriate for the automation profession and the purpose of the study. After
considerable discussion and rehearsal using the scales, the panel selected three: one for importance,
one for criticality, and one for frequency. These scales were then used to collect preliminary validation
data from members of the panel of experts and final validation data from survey respondents.
Participants (panel members and survey respondents) were asked to use four-point scales to express
their evaluation of the importance and criticality for each performance domain and task, with a “4”
representing the highest rating. The scale anchors for importance and criticality are listed below as a
reference. The description for frequency is also provided below.
Importance
Participants were asked to rate each domain on a rating of importance, or the degree to which
knowledge in the domain is essential to the minimally competent practice of the automation profession.
The rating anchors are provided below.
1. Slightly Important. Performance of tasks in this domain is only slightly essential to the job
performance of the certified automation professional.
2. Moderately Important. Performance of tasks in this domain is only moderately essential to the
job performance of the certified automation professional.
3. Very Important. Performance of tasks in this domain is clearly essential to the job performance
of the certified automation professional.
4. Extremely Important. Performance of tasks in this domain is absolutely essential to the job
performance of the certified automation professional.
Criticality
Participants were asked to rate each domain on a scale for criticality, or the degree to which adverse
effects (of some type) could result if the certified automation professional is not knowledgeable in the
domain. The rating anchors are provided below.
1. Minimal or No Harm. Inability to perform tasks within this performance domain would lead to
error with minimal adverse consequences.
2. Moderate Harm. Inability to perform tasks within this domain would lead to error with moderate
adverse consequences.
3. Substantial Harm. Inability to perform tasks within this domain would lead to error with
substantial adverse consequences.
4. Extreme Harm. Inability to perform tasks within this domain would definitely lead to error with
severe consequences.
Frequency
Participants were asked to provide the percent of time the certified automation professional spent
performing the duties associated with each domain. Directions in the survey required respondents to
ensure that percentages given for each domain added to 100%.
IMPORTANCE
Domain  Sample Size (N)  Mean  Standard Error of Mean  Standard Deviation
I. Feasibility Study  14  1.69  .1929  .722
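As a quick sanity check, the standard error column follows directly from the standard deviation and sample size (SE = SD/√N):

```python
import math

# Standard error of the mean for the Feasibility Study importance rating,
# using the sample size and standard deviation from the table above.
n, sd = 14, 0.722
se = sd / math.sqrt(n)
print(round(se, 4))  # agrees with the tabled .1929 to rounding
```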
The panelists rated the criticality of the domains as seen in the table below. Domain V (Deployment)
was the area seen as having the greatest potential for harmful results if the automation professional were
not competent in the domain.
CRITICALITY
Domain  Sample Size (N)  Mean  Standard Error of Mean  Standard Deviation
I. Feasibility Study  14  1.77  .2380  .890
As shown in the table on the following page, the panelists reported spending the least amount of time in
Domain I (Feasibility Study) and the most time in Domain IV (Development).
C. Survey Respondents’ Evaluations. Survey respondents employed the scales for importance,
criticality, and frequency to evaluate all domains and tasks. Their responses are summarized in the
tables on the following page.
As depicted in the table that follows, survey respondents indicated that all domains are very important.
Domain III (System Design) was seen as the most important of the six domains. Domain II (Definition)
was considered the second-most important, followed closely by Domain IV (Development). Domain VI
(Operation and Maintenance) was considered to be the least important, although it was considerably
higher than the scale mid-point.
IMPORTANCE
Domain  Sample Size (N)  Mean  Standard Error of Mean  Standard Deviation
I. Feasibility Study  217  3.03  .0540  .796
CRITICALITY
Domain  Sample Size (N)  Mean  Standard Error of Mean  Standard Deviation
I. Feasibility Study  217  2.43  .0608  .896
The survey respondents rated Domain III (System Design) as the most frequently performed, while
Domain VI (Operation and Maintenance) was rated as being performed the least often.
FREQUENCY
Domain  Sample Size (N)  Mean  Standard Error of Mean  Standard Deviation
I. Feasibility Study  212  10.29  .4965  7.229
As depicted in the chart that follows, both groups rated the importance of the domains similarly. As
shown in the following table, Domain I (Feasibility Study) had the greatest difference in ratings.
IMPORTANCE
Domain Survey Panel Difference
The two groups rated the criticality of the domains similarly with Domain IV (Development) having the
greatest difference (.58).
CRITICALITY
Domain Survey Panel Difference
FREQUENCY
Domain Survey Panel Difference
E. Survey Respondent Subgroups’ Evaluations. When using a survey to collect information regarding
a profession, the possibility that individuals in various settings have differing views of the profession is to
be expected. Finding meaningful differences in domain or task ratings among the various subgroups
might indicate that one should not generalize the survey results from one subgroup to another. With this
in mind, the responses of specific subgroups were compared using the criterion that more than one unit
of the four-point scale or 10 points on the frequency scale would indicate the possibility of meaningful
difference if any of the calculated values was lower than the scale mid-point. Subgroups were defined by
age, level of experience, time spent working as an automation professional in current position, control
areas worked in on a daily basis, area of responsibility, employer, and highest level of education.
Although three between-group differences were slightly greater than ten points on the frequency scale,
the importance and criticality means for the domain ratings were within one scale point for each
comparison. Consequently, the mean responses of the various subgroups do not vary to a practical
extent, indicating general agreement between and among the different subgroups of participants.
The following charts illustrate the similarities in means, or averages, for the responses of subgroups of
respondents. Only minor variations occur between the responses. The similarity in the ratings provides
support for generalizing from the survey results to the general population of qualified automation
professionals.
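The screening rule described above can be sketched as follows. This is an illustrative simplification (the function name is not from the study, and the additional scale mid-point condition is omitted):

```python
def flag_meaningful_difference(subgroup_means, threshold):
    """Flag a domain when subgroup means span more than the threshold:
    one point on the four-point importance/criticality scales,
    ten points on the frequency (percent-of-time) scale."""
    return max(subgroup_means) - min(subgroup_means) > threshold

# Age-subgroup importance means for Domain I (Feasibility Study):
print(flag_meaningful_difference([2.75, 3.01, 3.04, 3.18], threshold=1.0))  # → False
```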
IMPORTANCE
Domain  Under 30 years  31-40 years  41-50 years  51-60 years  61 years and above
I. Feasibility Study  2.75  3.01  3.04  3.18  **
CRITICALITY
Domain  Under 30 years  31-40 years  41-50 years  51-60 years  61 years and above
I. Feasibility Study  2.23  2.52  2.37  2.47  **
FREQUENCY
Domain  Under 30 years  31-40 years  41-50 years  51-60 years  61 years and above
I. Feasibility Study  11.15  10.27  9.94  10.88  **
IMPORTANCE
Domain  Not an AP  Less than 1 year  1-5 years  6-10 years  11-15 years  More than 15 years
I. Feasibility Study  **  **  3.21  2.89  3.00  3.10
CRITICALITY
Domain  Not an AP  Less than 1 year  1-5 years  6-10 years  11-15 years  More than 15 years
I. Feasibility Study  **  **  2.63  2.22  2.50  2.44
FREQUENCY
Domain  Not an AP  Less than 1 year  1-5 years  6-10 years  11-15 years  More than 15 years
I. Feasibility Study  **  **  13.06  7.64  11.49  10.22
IMPORTANCE
Domain  Not an AP  Less than 25 percent  25-50 percent  51-75 percent  76-100 percent
I. Feasibility Study  **  **  3.15  2.95  3.04
CRITICALITY
Domain  Not an AP  Less than 25 percent  25-50 percent  51-75 percent  76-100 percent
I. Feasibility Study  **  **  2.56  2.35  2.42
FREQUENCY
Domain  Not an AP  Less than 25 percent  25-50 percent  51-75 percent  76-100 percent
I. Feasibility Study  **  **  13.20  10.61  9.58
IMPORTANCE
Domain Discrete Process Both
CRITICALITY
Domain Discrete Process Both
FREQUENCY
Domain  Automation Engineer  Controls Engineer  Other
I. Feasibility Study  7.63  11.28  10.24
CRITICALITY
Domain  Field Engineering  Information Systems  Operations and Maintenance  Project/Systems Engineering  Other
I. Feasibility Study  **  **  2.18  2.33  2.72
II. Definition  **  **  2.68  2.75  2.84
III. System Design  **  **  2.86  3.32  3.56
IV. Development  **  **  2.64  3.11  3.08
FREQUENCY
Domain  Field Engineering  Information Systems  Operations and Maintenance  Project/Systems Engineering  Other
I. Feasibility Study  **  **  9.65  9.48  13.04
II. Definition  **  **  13.17  13.96  17.83
III. System Design  **  **  25.43  27.29  28.04
IV. Development  **  **  15.87  26.58  21.30
IMPORTANCE
Domain  Control Systems Suppliers  End-Users  Engineering and Design Firm  OEM  Systems Integrators  Other
I. Feasibility Study  3.20  2.98  3.07  2.82  3.13  3.23
CRITICALITY
Domain  Control Systems Suppliers  End-Users  Engineering and Design Firm  OEM  Systems Integrators  Other
I. Feasibility Study  2.93  2.21  2.59  2.59  2.41  2.46
FREQUENCY
Domain  Control Systems Suppliers  End-Users  Engineering and Design Firm  OEM  Systems Integrators  Other
I. Feasibility Study  16.00  10.49  8.88  9.64  9.47  10.17
IMPORTANCE
Domain  High school  Associate Degree  Bachelor's degree  Master's degree  Doctoral degree  Other
I. Feasibility Study  3.21  2.86  2.95  3.33  **  **
CRITICALITY
Domain  High school  Associate Degree  Bachelor's degree  Master's degree  Doctoral degree  Other
I. Feasibility Study  2.13  2.32  2.41  2.67  **  **
FREQUENCY
Domain  High school  Associate Degree  Bachelor's degree  Master's degree  Doctoral degree  Other
I. Feasibility Study  10.00  9.36  9.48  12.77  **  **
II. Definition  14.67  14.59  13.79  16.43  **  **
CASTLE assessed the reliability of the scales in order to determine how consistently the tasks measured
the domains of interest. Reliability refers to the degree to which tests or surveys are free from
measurement error. It is important to understand the consistency of the data along the importance and
criticality dimensions in order to draw defensible conclusions. With inconsistency (i.e., unreliability), it
would be impossible to reach accurate conclusions. Reliability was estimated as internal consistency
(Cronbach’s Alpha) using the respondents’ ratings of importance and criticality for each domain. This
calculates the extent to which each task rating within each domain consistently measures what other
tasks within that domain measure. Reliability coefficients range from 0 to 1 and should be above 0.7 to
be judged as adequate. Reliability values below 0.7 indicate an unacceptable amount of measurement
error. As shown below, all domains easily exceed this critical value.
RELIABILITY
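Cronbach's Alpha for a domain can be computed from a respondents-by-tasks matrix of ratings; the following is a minimal sketch (not the study's actual computation code):

```python
import numpy as np

def cronbach_alpha(ratings):
    """Internal consistency of task ratings within one domain.
    ratings: (n_respondents, n_tasks) array of importance or criticality ratings."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                          # number of task items in the domain
    item_vars = ratings.var(axis=0, ddof=1)       # sample variance of each task's ratings
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of respondents' summed scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)
```

Task ratings that move together across respondents drive alpha toward 1 (perfectly correlated items yield exactly 1), while uncorrelated ratings drive it toward 0, which is why values above 0.7 are read as adequate consistency.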
Working under the direction of CASTLE, the panel of experts developed a comprehensive list of the
knowledge and skills that the qualified automation professional must possess in order to provide
competent service in each task area. Members of the expert panel drafted these lists at the time that the
panel reached consensus on the tasks. CASTLE then circulated the list throughout the panel of experts
and collected revisions and editorial suggestions for each list from the entire panel.
Following the meeting, CASTLE and ISA arranged for a special committee to review the lists online using
software designed for that purpose in combination with a series of conference calls. CASTLE facilitated
the review meetings, which led to the final listing presented in Phase III of this report.
It is useful when conducting a job analysis in connection with the content validation of a credentialing
examination to understand that knowledge is normally considered a matter of the cognitive domain
(Bloom, et al., 1956). Within the cognitive domain, predominating taxonomies use different levels to
describe the learning outcomes desired. For a credentialing examination such as the CAP, the most
common levels are knowledge (which includes recall and comprehension), application, and analysis.
Knowledge refers to the remembering of previously learned subject matter and a grasp of its meaning.
Application is the ability to use subject matter in job-related situations, and analysis refers to the ability to
break subject matter into component parts in order to reveal its organization and structure. Skills may be
psychomotor or they may involve cognitive skills, such as critical thinking. The CAP examination should
include questions targeting each of these cognitive levels.
As shown in the charts on the preceding pages, the survey respondents indicated that all domains are
very important. Each of the six domains has an average importance of at least 2.58 on the four-point
rating scale, with 2 being “Moderately Important” and 3 being “Very Important.” Similarly, the respondents
considered all domains to be critical. Each of the six domains has an average criticality rating of at least
2.43 on the four-point scale, which means that incompetent performance of tasks in each domain could
result in “Moderate Harm” to “Substantial Harm” (of some type) to the public. It is of further value to note
that the panel of experts and survey respondents agreed on the average ratings for importance and
criticality of domains, with only one difference greater than one scale point. These data support the
validity of the six domains as major categories of responsibility in the practice of automation.
Of interest in the analysis was the possibility that respondents’ status along biographical dimensions
might influence their views about the practice of automation. All subgroups rated the domains within one
scale point or ten points on the frequency scale with the exception of three cases. In these three
instances, the highest between-group difference exceeded the lowest by greater than ten points on the
frequency scale. Two of these differences occurred when the area of primary responsibility was
examined. Differences were found in the ratings of frequency for Domain IV (Development) and Domain
VI (Operation and Maintenance). These differences were not unexpected as those respondents who
reported working in Operations and Maintenance as their primary area of responsibility reported
spending 12.52 percent more time performing duties associated with Domain VI (Operation and
Maintenance) than did those individuals who reported having another area of primary responsibility.
Respondents reporting their primary area of responsibility as Project/Systems Engineering reported
spending 10.71 percent more time in Domain IV (Development) than those individuals reporting another
area of primary responsibility. However, no differences greater than one scale point were
found on the Importance and Criticality ratings. The final difference was found when examining subgroup
differences based on current employer’s company or organization. The respondents reporting their
current employers were best described as System Integrators reported spending 14.09 percent more
time in Domain IV (Development) than those reporting their current employers were best described as
Control Systems Suppliers. However, no differences greater than one scale point were found on the
Importance and Criticality ratings. Therefore, the differences observed were not considered meaningful
in terms of influencing test specifications.
VII. Conclusion
The results of the job analysis survey validate the results of the panel of experts. This conclusion means
that the domains and tasks developed by the job analysis panel constitute an accurate definition of the
work of qualified automation professionals.
Based on a psychometric analysis of the tasks, knowledge, and skills identified by the job analysis study
and given the depth of knowledge and skill implied for protection of the public, competence in the
profession can best be assessed using a multiple-choice examination format.
The final phase of a job analysis study is the development of test specifications which identify the
proportion of questions from each domain and task that will appear on the CAP examination. Test
specifications are developed by combining the overall evaluations of importance, frequency, and
criticality, and converting the results into percentages. Importance, frequency, and criticality ratings were
weighted equally in this computation. These percentages are used to determine the number of questions
related to each domain and task.
TEST BLUEPRINT
Domain  % of Test  # of Items on Test
I. Feasibility Study  11.60%  20
V. Deployment  15.24%  27
This section of the report contains the domains, tasks, and knowledge and skill statements as delineated
by the practice analysis panel of experts and validated with data from the practice analysis survey.
Domain V. Deployment
RATINGS
Task  Importance  Criticality  Frequency  % of Items on Test  # of Items on Test
1  2.68  2.18  1.80  1.96%  4
TOTAL  11.60%  20
RATINGS
Task  Importance  Criticality  Frequency  % of Items on Test  # of Items on Test
1  3.11  2.55  2.05  3.23%  5
TOTAL  15.23%  26
Task 1: Determine operational strategies through discussion with key stakeholders and using
appropriate documentation in order to create and communicate design requirements.
Knowledge of:
1. Interviewing techniques
2. Different operating strategies
3. Team leadership and alignment
Skill in:
1. Leading an individual or group discussion
2. Communicating effectively
3. Writing in a technical and effective manner
ISA Certified Automation Professional 38
Job Analysis Study
4. Building consensus
5. Interpreting the data from interviews
Task 2: Analyze alternative technical solutions by conducting detailed studies in order to define
the final automation strategy.
Knowledge of:
1. Automation techniques
2. Control theories
3. Modeling and simulation techniques
4. Basic control elements (e.g., sensors, instruments, actuators, control systems,
drive systems, HMI, batch control, machine control)
5. Marketplace products available
6. Process and/or equipment operations
Skill in:
1. Applying and evaluating automation solutions
2. Making intelligent decisions
3. Using the different modeling tools
4. Determining when modeling is needed
Task 3: Establish detailed requirements and data including network architecture, communication
concepts, safety concepts, standards, vendor preferences, instrument and equipment
data sheets, reporting and information needs, and security architecture through
established practices in order to form the basis of the design.
Knowledge of:
1. Network architecture
2. Communication protocols, including field level
3. Safety concepts
4. Industry standards and codes
5. Security requirements
6. Safety standards (e.g., ISAM, ANSI, NFPA)
7. Control systems security practices
Skill in:
1. Conducting safety analyses
2. Determining which data is important to capture
3. Selecting applicable standards and codes
4. Identifying new guidelines that need to be developed
5. Defining information needed for reports
6. Completing instrument and equipment data sheets
Task 4: Generate a project cost estimate by gathering cost information in order to determine
continued project viability.
Knowledge of:
1. Control system costs
2. Estimating techniques
3. Available templates and tools
Skill in:
1. Creating cost estimates
2. Evaluating project viability
RATINGS
Task    Importance   Criticality   Frequency   % of Items on Test   # of Items on Test
1       3.31         3.26          2.16        3.15%                5
TOTAL                                          24.94%               44
Task 1: Perform safety and/or hazard analyses, security analyses, and regulatory compliance
assessments by identifying key issues and risks in order to comply with applicable
standards, policies, and regulations.
Knowledge of:
1. Applicable standards (e.g., ISA S84, IEC 61508, 21 CFR Part 11, NFPA)
2. Environmental standards (EPA)
3. Electrical, electrical equipment, enclosure, and electrical classification standards
(e.g., UL/FM, NEC, NEMA)
Skill in:
1. Participating in a Hazard Operability Review
2. Analyzing safety integrity levels
3. Analyzing hazards
4. Assessing security requirements or relevant security issues
5. Applying regulations to design
Task 5: Select the physical communication media, network architecture, and protocols based on
data requirements in order to complete system design and support system development.
Knowledge of:
1. Vendor protocols
2. Ethernet and other open networks (e.g., DeviceNet)
3. Physical requirements for networks/media
4. Physical topology rules/limitations
5. Network design
6. Security requirements
7. Backup practices
8. Grounding and bonding practices
Skill in:
1. Designing networks based on chosen protocols
Task 6: Develop a functional description of the automation solution (e.g., control scheme, alarms,
HMI, reports) using rules established in the definition stage in order to guide development
and programming.
Knowledge of:
1. Control theory
2. Visualization, alarming, database/reporting techniques
3. Documentation standards
4. Vendors' capabilities for their hardware and software products
5. General control strategies used within the industry
6. Process/equipment to be automated
7. Operating philosophy
Skill in:
1. Writing functional descriptions
2. Interpreting design specifications and user requirements
3. Communicating the functional description to stakeholders
Task 7: Design the test plan using chosen methodologies in order to execute appropriate testing
relative to functional requirements.
Knowledge of:
1. Relevant test standards
2. Simulation tools
3. Process Industry Practices (PIP) (Construction Industry Institute)
4. General software testing procedures
5. Functional description of the system/equipment to be automated
Skill in:
1. Writing test plans
2. Developing tests that validate that the system works as specified
Knowledge of:
1. Field devices, control devices, visualization devices, computers, and networks
2. Installation standards and recommended practices
3. Electrical and wiring practices
4. Specific customer preferences
5. Functional requirements of the system/equipment to be automated
6. Applicable construction codes
7. Documentation standards
Skill in:
1. Performing detailed design work
2. Documenting the design
Task 9: Prepare comprehensive construction work packages by organizing the detailed design
information and documents in order to release project for construction.
Knowledge of:
1. Applicable construction practices
2. Documentation standards
Skill in:
1. Assembling construction work packages
RATINGS
Task    Importance   Criticality   Frequency   % of Items on Test   # of Items on Test
1       2.99         2.61          2.33        2.82%                5
TOTAL                                          22.04%               39
Task 1: Develop Human Machine Interface (HMI) in accordance with the design documents in
order to meet the functional requirements.
Knowledge of:
1. Specific HMI software products
2. Tag definition schemes
3. Programming structure techniques
4. Network communications
5. Alarming schemes
6. Report configurations
7. Presentation techniques
8. Database fundamentals
9. Computer operating systems
10. Human factors
11. HMI supplier options
Skill in:
1. Presenting data in a logical and aesthetic fashion
2. Creating intuitive navigation menus
3. Implementing connections to remote devices
4. Documenting configuration and programming
5. Programming configurations
Knowledge of:
1. Specific networking software products (e.g., I/O servers).
2. Network topology
3. Network protocols
4. Physical media specifications (e.g., copper, fiber, RF, IR)
5. Computer operating systems
6. Interfacing and gateways
7. Data mapping
Skill in:
1. Analyzing throughput
2. Ensuring data integrity
3. Troubleshooting
4. Documenting configuration
5. Configuring network products
6. Interfacing systems
7. Manipulating data
Task 5: Implement security methodology in accordance with stakeholder requirements in order to
mitigate loss and risk.
Knowledge of:
1. Basic system/network security techniques
2. Customer security procedures
3. Control of user-level access privileges
4. Regulatory expectations (e.g., 21 CFR Part 11)
5. Industry standards (e.g., ISA)
Skill in:
1. Documenting security configuration
2. Configuring/programming of security system
3. Implementing security features
Task 6: Review configuration and programming using defined practices in order to establish
compliance with functional requirements.
Knowledge of:
1. Specific control software products
2. Specific HMI software products
3. Specific database software products
4. Specific reporting products
5. Programming structure techniques
6. Network communication
7. Alarming schemes
8. I/O structure
9. Memory addressing schemes
10. Hardware configurations
11. Computer operating systems
12. Defined practices
13. Functional requirements of system/equipment to be automated
RATINGS
Task    Importance   Criticality   Frequency   % of Items on Test   # of Items on Test
1       2.75         2.49          2.05        1.16%                2
TOTAL                                          15.24%               27
Domain V: Deployment
Task 1: Perform receipt verification of all field devices by comparing vendor records against
design specifications in order to ensure that devices are as specified.
Knowledge of:
1. Field devices (e.g., transmitters, final control valves, controllers, variable speed
drives, servo motors)
2. Design specifications
Skill in:
1. Interpreting specifications and vendor documents
2. Resolving differences
Task 3: Install configuration and programs by loading them into the target devices in order to
prepare for testing.
Knowledge of:
1. Control system (e.g., PLC, DCS, PC)
2. System administration
Skill in:
1. Installing software
2. Verifying software installation
3. Versioning techniques and revision control
4. Troubleshooting (i.e., resolving issues and retesting)
Task 4: Solve unforeseen problems identified during installation using troubleshooting skills in
order to correct deficiencies.
Knowledge of:
1. Troubleshooting techniques
2. Problem-solving strategies
3. Critical thinking
4. Processes, equipment, configurations, and programming
5. Debugging techniques
Skill in:
1. Solving problems
2. Determining root causes
3. Ferreting out information
4. Communicating with facility personnel
5. Implementing problem solutions
6. Documenting problems and solutions
Task 5: Test configuration and programming in accordance with the design documents by
executing the test plan in order to verify that the system operates as specified.
Knowledge of:
1. Programming and configuration
2. Test methodology (e.g., factory acceptance test, site acceptance test, unit-level
testing, system-level testing)
3. Test plan for the system/equipment to be automated
4. System to be tested
5. Applicable regulatory requirements relative to testing
RATINGS
Task    Importance   Criticality   Frequency   % of Items on Test   # of Items on Test
1       2.39         2.10          1.65        0.91%                2
TOTAL                                          10.95%               19
Task 1: Verify system performance and records periodically using established procedures in
order to ensure compliance with standards, regulations, and best practices.
Knowledge of:
1. Applicable standards
2. Performance metrics and acceptable limits
3. Records and record locations
4. Established procedures and purposes of procedures
Skill in:
1. Communicating orally and in writing
2. Auditing the system/equipment
3. Analyzing data and drawing conclusions
Task 2: Provide technical support for facility personnel by applying system expertise in order to
maximize system availability.
Knowledge of:
1. All system components
2. Processes and equipment
3. Automation system functionality
4. Other support resources
5. Control systems theories and applications
6. Analytical troubleshooting and root-cause analyses
Skill in:
1. Troubleshooting (i.e., resolving issues and retesting)
2. Investigating and listening
3. Programming and configuring automation system components
Task 3: Perform training needs analysis periodically for facility personnel using skill assessments
in order to establish objectives for the training program.
Knowledge of:
1. Personnel training requirements
2. Automation system technology
3. Assessment frequency
4. Assessment methodologies
Skill in:
1. Interviewing
2. Assessing level of skills
Task 4: Provide training for facility personnel by addressing identified objectives in order to
ensure the skill level of personnel is adequate for the technology and products used in
the system.
Knowledge of:
1. Training resources
2. Subject matter and training objectives
3. Teaching methodology
Skill in:
1. Writing training objectives
2. Creating the training
3. Organizing training classes (e.g., securing demos, preparing materials, securing
space)
4. Delivering training effectively
5. Answering questions effectively
Task 5: Monitor performance using software and hardware diagnostic tools in order to support
early detection of potential problems.
Knowledge of:
1. Automation systems
2. Performance metrics
3. Software and hardware diagnostic tools
4. Potential problem indicators
5. Baseline/normal system performance
6. Acceptable performance limits
Knowledge of:
1. Installed base of system equipment and software
2. Support agreements
3. Internal and external support resources
4. Lifecycle state and support level (including vendor product plans and future
changes)
Skill in:
1. Organizing and scheduling
2. Programming and configuring
3. Applying software updates (i.e., keys, patches)
Task 10: Determine the need for spare parts based on an assessment of installed base and
probability of failure in order to maximize system availability and minimize cost.
Knowledge of:
1. Critical system components
2. Installed base of system equipment and software
3. Component availability
4. Reliability analysis
5. Sourcing of spare parts
Skill in:
1. Acquiring and organizing information
2. Analyzing data
Task 11: Provide a system management plan by performing preventive maintenance,
implementing backups, and designing recovery plans in order to avoid and recover from
system failures.
Knowledge of:
1. Automation systems
2. Acceptable system downtime
3. Preventive maintenance procedures
4. Backup practices (e.g., frequency, storage media, storage location)
Skill in:
1. Acquiring and organizing information
2. Leading
3. Managing crises
4. Performing backups and restores
5. Using system tools
Contributors
ISA would like to thank these individuals and their employers for their contribution of time, expertise, and enthusiasm for the
Certified Automation Professional (CAP) program.
Additional Contributors:
Dave Adler, Senior Engineering Consultant, Eli Lilly, IN
Skip Holmes, Associate Director – Control & Information Systems, Corporate Engineering Technologies, Procter & Gamble, OH
Gavin Jacobs, Principal Engineer, Emerson Process Management, AB
Lee Lane, Manager of Applications Engineering, Rockwell Automation, OH
Bob Lindeman, Senior Project Manager, Aerospace Testing Alliance, TN
Ron Lutes, Vice President Performance Solutions, COMPEX, MO
Paul Maurath, Technical Section Head, P&G, OH
Greg McMillan (retired), Austin, TX
Joe Ruder, Principal Controls Engineer, Nestlé Purina Petcare, MO
Nicholas Sands, Control Engineer, E I du Pont, NJ
George Skene, Senior Controls Engineer, The Benham Companies, Inc., MI
Chris Stephens, Design Engineer III, Fluor Corporation, TX
Ken Valentine, Director Design Engineering – Control Systems, Fluor Corporation, TX
Jeff White, Control Engineer, Interstates Control Systems, Inc., IA
This booklet contains the ISA – The Instrumentation, Systems, and Automation Society role delineation survey
for the Certified Automation Professional along with instructional materials to aid you in completing it.
Directions are provided at the beginning of each section of the survey.
ISA – The Instrumentation, Systems, and Automation Society is developing a new certification for automation
professionals to cover the entire field of automation application. We appreciate your time in completing this
survey and we value your important input.
In Section A, you are asked to complete a Confidential Survey, which provides us with the demographic
information necessary to ensure that automation professionals working in various settings with differing
backgrounds are represented in the data collection.
In Section B, we have provided you with a list of definitions and terms that are used throughout the survey.
We suggest that you review the Definition of Terms before responding to any survey questions.
ISA-The Instrumentation, Systems, and Automation Society CAP Survey - March 2004 63
In Section C, you are asked to review the Task Statements required for competent performance in each
performance domain by the Certified Automation Professional, and rate each for importance, criticality, and
frequency as they pertain to the role of the Certified Automation Professional.
In Section D, you are asked to review the Performance Domains that define the role of the Certified
Automation Professional. We ask that you rate the importance, criticality, and frequency of these domains as
they pertain to the role of the Certified Automation Professional.
Please review the entire booklet before responding to any of the questions. Your review will help you to
understand our terminology and the structure of the role delineation survey.
Please mark your responses directly in this booklet. Please return your completed survey by 2 April 2004 in
the enclosed, self-addressed, stamped envelope to:
Thank you in advance for your help with this very important project. ISA will use your responses to help
determine the blueprint for the ISA Certified Automation Professional Examination.
Section A
Confidential Survey
Please fill in the following demographic information, which will be used to ensure that automation professionals
working in various settings with differing backgrounds are represented in the data collection.
All responses are kept strictly confidential by CASTLE Worldwide, Inc. Computer programs are used to sort the
data. Neither individual persons nor companies, nor their particular data, will be identifiable in any report
generated using information obtained through this survey.
4. How much experience do you have as an automation professional? (Please select one.)
I am not an automation professional.
Less than 1 year
1-5 years
6-10 years
11-15 years
More than 15 years
5. What percentage of your time do you spend working as an automation professional in your current
position? (Please select one.)
I am not an automation professional.
Less than 25 percent
25-50 percent
51-75 percent
76-100 percent
6. Which control area(s) do you work in on a daily basis? (Please select one.)
Discrete/Machine Control
Process/Liquid/Dry
Both Discrete/Machine Control and Process/Liquid/Dry
7. What is your primary responsibility in your current position? (Please select one.)
Field Engineering
Information Systems
Operations and Maintenance
Project/System Engineering
Other (Please specify.) ___________________________________
8. Which of the following best describes the industry in which you work? (Please select one.)
Aerospace
Automotive Manufacturing
Building Automation
Chemical Manufacturing
Consumer Goods
Electrical/Electronic Manufacturing
Engineering and Construction
Environmental/Waste Water/Waste
Food and Beverage Manufacturing
Machinery Manufacturing
Metals Manufacturing
Petroleum Manufacturing
Pharmaceutical Manufacturing
Plastics Manufacturing
Pulp and Paper Manufacturing
Textiles/Fabrics Manufacturing
Transportation
Utilities
Other (Please specify.) ___________________________________
9. Which of the following best describes your current employer's company or organization? (Please select one.)
Control Systems Suppliers
End-Users (petro-chem, food and beverage, pulp and paper)
Engineering and Design Firm
Original Equipment Manufacturer (OEM)
Systems Integrators
Other (Please specify.)
10. What certifications/licenses do you currently hold? (Please select all that apply.)
CEM
CQE
CCST
CSE
MSCE
PE
PMP
Other (Please specify.) ___________________
11. In which professional societies and/or organizations do you currently hold membership? (Please select
all that apply.)
AIChE
ASME
CSIA
IBEW
IEEE
ISA
UA
Other (Please specify.) ___________________________________
12. What is your highest level of education? (Please select one.)
High School/Secondary School
Associate Degree
Bachelor’s Degree
Master’s Degree
Doctoral Degree
Other (Please specify.) ___________________________________
13. What is the major/focus of study of your highest degree? (e.g., measurement, business administration,
mechanical engineering, electrical engineering)
__________________________________________________________________________________
Section B
Definition of Terms
Below are definitions of the terms found in this role delineation survey.
Certified Automation Professional (CAP): The ISA Certified Automation Professional (CAP) has
completed a four-year technical degree* and has five years of experience working in automation. CAPs are
responsible for the direction, definition, design, development/application, deployment, documentation, and
support of systems, software, and equipment used in control systems, manufacturing information systems,
systems integration, and operational consulting.
* There may be a variety of ways to combine education and experience to satisfy eligibility
requirements for an introductory two-year period.
Performance Domain: The performance domains are the major responsibilities or duties that define the
role of the Certified Automation Professional. Each performance domain may be considered a major
heading in an outline and may include a brief behavioral description. There are six performance domains
included in this survey, as identified by an expert panel:
• Feasibility Study
• Definition
• System Design
• Development
• Deployment
• Operation and Maintenance
Task Statement: A task is an activity performed within a performance domain. Each performance domain
consists of a series of tasks that collectively form a comprehensive and detailed description of each
performance domain. Typically, task statements answer such questions as:
Section C
Evaluation of Performance Domains
Instructions: You will be rating each performance domain identified by an expert panel on three
dimensions: importance, criticality, and frequency.
Please remember, the performance domains are the major responsibilities or duties that define the role of
the Certified Automation Professional. Each performance domain may be considered a major heading in an
outline. There are six performance domains included in this survey, as identified by an expert panel. Each
performance domain consists of a series of tasks that collectively form a comprehensive and detailed
description of each performance domain. A task is an activity performed within a performance domain. In
this section, you will validate the performance domains. If you are unclear what areas a performance
domain covers, please review Section D.
Importance: Importance is defined as the degree to which knowledge in the performance domain is
essential to the role of the Certified Automation Professional. Indicate how important each performance
domain is to the Certified Automation Professional. Rate each of the six performance domains by using the
scale below. Please assign each performance domain only one rating. DO NOT RANK THE DOMAINS.
Select the number of the description below that best exemplifies your rating for each performance domain,
and write that number in the space provided next to each performance domain.
1 = Slightly Important. Performance of tasks in this domain is only slightly essential to the job
performance of the Certified Automation Professional.
2 = Moderately Important. Performance of tasks in this domain is only moderately essential to the job
performance of the Certified Automation Professional.
3 = Very Important. Performance of tasks in this domain is clearly essential to the job performance of
the Certified Automation Professional.
4 = Extremely Important. Performance of tasks in this domain is absolutely essential to the job
performance of the Certified Automation Professional.
Rating of
Importance Performance Domain
1. Feasibility Study
2. Definition
3. System Design
4. Development
5. Deployment
6. Operation and Maintenance
Criticality: Criticality is defined as the potential for harmful consequences to occur if the Certified
Automation Professional is not knowledgeable in the performance domain. Indicate the degree to which the
inability of the Certified Automation Professional to perform tasks within the performance domain would be
seen as causing harm to employers, employees, the public, and/or other relevant stakeholders. Harm may
be physical, emotional, financial, etc. Rate each of the six performance domains by using the scale below.
Please assign each performance domain only one rating. DO NOT RANK THE DOMAINS. Select the
number of the description that best exemplifies your rating for each performance domain, and write that
number in the space provided next to each performance domain.
1 = Minimal or No Harm. Inability to perform tasks within this performance domain would lead to error
with minimal adverse consequences.
2 = Moderate Harm. Inability to perform tasks within this performance domain would lead to error with
moderate adverse consequences.
3 = Substantial Harm. Inability to perform tasks within this performance domain would lead to error with
substantial adverse consequences.
4 = Extreme Harm. Inability to perform tasks within this performance domain would definitely lead to
error with severe adverse consequences.
Rating of
Criticality Performance Domain
1. Feasibility Study
2. Definition
3. System Design
4. Development
5. Deployment
6. Operation and Maintenance
Frequency: What percent of time does the Certified Automation Professional spend performing duties
associated with each domain? Write the percentage in the space provided next to each domain. The total
must equal 100 percent.
Percent of
Time Performance Domain
1. Feasibility Study
2. Definition
3. System Design
4. Development
5. Deployment
6. Operation and Maintenance
100%
Section D
Evaluation of Task Statements
In this section you will rate the task statements associated with each of the six domains on three
dimensions – importance, criticality, and frequency – according to the scales below.
Please remember, a task is an activity performed within a performance domain. As previously discussed,
the performance domains are the major responsibilities and duties that define the role of the Certified
Automation Professional. In this section, you will validate the tasks. If you are unclear about the relationship
between the performance domains and the tasks, please review Section C.
Rating Scales
Importance: 1 – Slightly Important
Criticality*: 1 – Causing Minimal or No Harm
Frequency: 1 – About Once Per Year or Never
Circle the number corresponding to the Importance, Criticality, and Frequency rating for each task statement.
Please list any tasks related to Domain I that you think may have been overlooked.
_____________________________________________________________________________________
_____________________________________________________________________________________
Rating Scales
Importance: 1 – Slightly Important
Criticality*: 1 – Causing Minimal or No Harm
Frequency: 1 – About Once Per Year or Never
Please list any tasks related to Domain II that you think may have been overlooked.
_____________________________________________________________________________________
_____________________________________________________________________________________
DOMAIN III: SYSTEM DESIGN

Task 1: Perform safety and/or hazard analyses, security analyses, and regulatory compliance
assessments by identifying key issues and risks in order to comply with applicable standards,
policies, and regulations.
Importance: 1 2 3 4    Criticality: 1 2 3 4    Frequency: 1 2 3 4

Task 2: Establish standards, templates, and guidelines as applied to the automation system using
the information gathered in the definition stage and considering human-factor effects in order to
satisfy customer design criteria and preferences.
Importance: 1 2 3 4    Criticality: 1 2 3 4    Frequency: 1 2 3 4

Task 3: Create detailed equipment specifications and instrument data sheets based on vendor
selection criteria, characteristics and conditions of the physical environment, regulations, and
performance requirements in order to purchase equipment and support system design and
development.
Importance: 1 2 3 4    Criticality: 1 2 3 4    Frequency: 1 2 3 4

Task 4: Define the data structure layout and data flow model considering the volume and type of
data involved in order to provide specifications for hardware selection and software development.
Importance: 1 2 3 4    Criticality: 1 2 3 4    Frequency: 1 2 3 4

Task 5: Select the physical communication media, network architecture, and protocols based on
data requirements in order to complete system design and support system development.
Importance: 1 2 3 4    Criticality: 1 2 3 4    Frequency: 1 2 3 4

Task 6: Develop a functional description of the automation solution (e.g., control scheme, alarms,
HMI, reports) using rules established in the definition stage in order to guide development and
programming.
Importance: 1 2 3 4    Criticality: 1 2 3 4    Frequency: 1 2 3 4

Task 7: Design the test plan using chosen methodologies in order to execute appropriate testing
relative to functional requirements.
Importance: 1 2 3 4    Criticality: 1 2 3 4    Frequency: 1 2 3 4
Rating Scales
Importance: 1 – Slightly Important
Criticality*: 1 – Causing Minimal or No Harm
Frequency: 1 – About Once Per Year or Never
DOMAIN III: SYSTEM DESIGN (CONTINUED)

Task 8: Perform the detailed design for the project by converting the engineering and system
design into purchase requisitions, drawings, panel designs, and installation details consistent with
the specification and functional descriptions in order to provide detailed information for
development and deployment.
Importance: 1 2 3 4    Criticality: 1 2 3 4    Frequency: 1 2 3 4

Task 9: Prepare comprehensive construction work packages by organizing the detailed design
information and documents in order to release project for construction.
Importance: 1 2 3 4    Criticality: 1 2 3 4    Frequency: 1 2 3 4
Please list any tasks related to Domain III that you think may have been overlooked.
_____________________________________________________________________________________
_____________________________________________________________________________________
Please list any tasks related to Domain IV that you think may have been overlooked.
_____________________________________________________________________________________
_____________________________________________________________________________________
Rating Scales
Importance: 1 – Slightly Important
Criticality*: 1 – Causing Minimal or No Harm
Frequency: 1 – About Once Per Year or Never
Please list any tasks related to Domain V that you think may have been overlooked.
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
Rating Scales
Importance: 1 – Slightly Important
Criticality*: 1 – Causing Minimal or No Harm
Frequency: 1 – About Once Per Year or Never
Task 12: Follow a process for authorization and implementation of changes in accordance with
established standards or practices in order to safeguard system and documentation integrity.
Importance: 1 2 3 4    Criticality: 1 2 3 4    Frequency: 1 2 3 4
Please list any tasks related to Domain VI that you think may have been overlooked.
_____________________________________________________________________________________
____________________________________________________________________________________
_____________________________________________________________________________________