
TRINITY UNIVERSITY OF ASIA

Stages in Research Process


Selection and Development of a Problem

Veralynn P. Palileo 12/2/2011

SELECTION AND DEVELOPMENT OF A PROBLEM

The previous presentation made it clear that the research process starts with the identification of a problem. But what is a problem? More importantly, what is a valid problem? Or a researchable problem?

The Research Problem

According to Fisher & others (1991), a problem is a perceived difficulty, a feeling of discomfort with the way things are, a discrepancy between what someone believes should be and what is. Ardales (2008) said that without a problem, no research can be undertaken. And as Leedy (1980) put it, the problem is the heart of every research project because it is paramount in importance to the success of the research effort; thus the situation is simple: no problem, no research. Selltiz (1959) even contends that the formulation of a problem is often more essential than its solution.

Factors in Problem Selection

So what are the characteristics of a researchable problem? While research begins with a problem, bear in mind that not all problems are researchable. So how do we identify which is and which is not? In his book Basic Concepts and Methods in Research (2008), Dr. Venancio Ardales said that a problem is researchable when any of the following five conditions is true:
1. When there is no known answer or solution to the problem, such that a gap in knowledge exists;
2. When there are possible solutions to it but their effectiveness is untested or not yet known;
3. When there are answers or solutions but their possible results may be seemingly or factually contradictory;
4. When there are several possible and plausible explanations for the undesirable condition; and
5. When the existence of the phenomenon requires explanation.


Selltiz & others (1976) said that a problem is researchable when it meets three conditions:
1. The concept must be clear enough so that one can specify in words exactly what the question is.
2. The concepts must be such that they can be represented by some sort of evidence which is obtainable through direct observation or other less direct activities.
3. It must be feasible to carry out such operations or activities.

The following Criteria for Choice of Research Problems, as identified by Dr. Crestita Barrientos-Tan in her book A Research Guide in Nursing Education, are noteworthy for nurses:

1. Significance of the Problem. Research focuses on an existing or prevailing problem, the novelty and practical value of the study, the solution of which can contribute knowledge to the field of nursing. You may ask: Is this problem critical enough to prove the difference between what is ideal and what is real? Will its solution improve the practice of nursing and bring about change in nursing practitioners? What contributions or meaning will it give to the different sectors or beneficiaries of the study?

2. Problem Researchability. Not all problems can be investigated scientifically. The research problem must be manageable, its nature and scope specific and well defined. The phenomenon which is the focus of the problem must be observable, quantifiable and measurable. The problem must be subjected to empirical testing to identify specific variables and determine the relationships of those variables. Keep in mind that there are problems that can be solved merely on the basis of opinion, the application of rationality and personal values, or through debate. Examples: Should nurses join unions? Is family planning moral?

3. Feasibility of the Problem. Or the potential researchability of the problem, which can be established when it meets the following criteria:


i. Time: the problem is projected to be solved within a given time frame.
ii. Availability of subjects: the available population size is adequate enough for sampling purposes.
iii. Administrative control and group support: the problem is likely to be endorsed by the approval board concerned.
iv. Research resources: the problem must be of such nature and structure that a solution is possible using available space, computers and other equipment, transportation, communication and other facilities.
v. Fiscal resources: there must be sufficient available funds to pursue the study through completion. Hence, anticipated benefits from the study must justify its cost. The researcher should project needed expenses before finalizing the selection of a problem. A limited budget could be a constraint in the effective pursuit of the research.
vi. Experience of the researcher: the problem must represent the researcher's specialized field to ensure knowledge of the phenomenon under study, and skill in analyzing, interpreting, and rationalizing the implications of the results of the study to its target population.
vii. Ethical considerations: a research problem is feasible if it does not make any undue impositions on the respondents.

4. Potentials of the Researcher. There must be a genuine interest and curiosity about the particular problem on the part of the researcher. Research requires the researcher's experience in the field being investigated or the subject matter under inquiry.

Simply put, a research question is an explicit query about a problem or issue that can be challenged, examined, and analyzed, and that will yield useful new information. Answers to research questions add to our general knowledge. They can be used by other people in other places because the answers are valid no matter who asked the question or where the answer was found. This is the critical feature of research findings: they must be facts, not opinion.

Defining the Research Problem

A research problem must be clearly defined so as not to make it so broad that it overwhelms the researcher and leaves him with a predicament on where to start. This can be done, first, by defining the major concepts or terms and variables in the study. Second is limiting the scope of the study in terms of (1) issues, concerns, or subjects, (2) area coverage, (3) target population and/or sample population, (4) source of data or respondents, (5) time allotment, and (6) data requirements, qualitative or quantitative or both.


GATHERING OF MATERIALS AND/OR DATA

There is a wide array of data collection methods from which the researcher can choose. Which method or methods he will select for his particular study is determined by a number of factors. The major consideration is the nature of the research problem and the general objective, the latter made explicit in the specific objectives and hypotheses. Other factors to consider are the research design that has been selected for the study; the nature and area dispersion of the target population; operational feasibility; the availability of resources, which include money, time and trained personnel; and the type of data, which should have been specified as early as the formulation of the specific objectives and hypotheses of the study.

CLASSIFICATIONS OF RESEARCH DATA

In research, data refers to the results of the study from which inferences are based (Kerlinger, 1986). Research data can be classified on the basis of their source and their form.

By source, the data are either primary or secondary in type. Primary data are those which are gathered directly from the informants of the study. Those which are generated by a field researcher in a face-to-face interview with a respondent are primary in type. Secondary data are those which have been previously gathered, compiled and stored somewhere and may be made available to the researcher who finds them useful to his particular study. Many data of this type are found in government agencies, like the National Statistics Office (NSO), the National Economic Development Authority (NEDA), the Department of Health (DOH), the Department of Education, Culture, and Sports (DECS), the Department of Agriculture (DA), and the Commission on Population (POPCOM). Schools and non-government organizations (NGOs) may also have collections of information useful to the investigation being undertaken.

On the basis of form, the data are either qualitative or quantitative. They are qualitative when they are descriptions of the basic nature or characteristics of the people or objects under investigation. Examples are descriptions of people on the basis of complexion, color of the hair or eyes, attitudes, active or inactive participation, and so on. Data are quantitative in form when they are numerical in nature and have the property of measurability. Statistics on age, height, income, academic grade, distance and the like are examples of quantitative data. A research may require only qualitative information, as in the case of qualitative researches like an ethnographic study on the beliefs and rituals of a tribe. Other researches may be restricted to utilizing quantitative data, particularly if analyses require statistical summaries, comparisons, and correlations. It is possible, however, that a study may require both the qualitative and the quantitative forms of data. Similarly, some researches may require primary data, others may need secondary data only, while still others may stipulate the use of both primary and secondary data.

METHODS OF DATA COLLECTION

Data collection means gathering information to address those critical evaluation questions that you have identified earlier in the evaluation process. There are many methods available to gather information, and a wide variety of information sources. The most important issue related to data collection is selecting the most appropriate information or evidence to answer your questions. To plan data collection, you must think about the questions to be answered and the information sources available. Also, you must begin to think ahead about how the information could be organized, analyzed, interpreted and then reported to various audiences.


What kind of data should be collected?

The information you collect is the evidence you will have available to answer the evaluation questions. Poor evidence is information which cannot be trusted, is scant, or simply is not relevant to the questions asked. Good evidence is information that comes from reliable sources and through trustworthy methods that address important questions. There are two general types of information: descriptive and judgmental.

Descriptive information can include the following examples:
• Characteristics of the project
• Reports of project accomplishments
• Current skill or knowledge levels of project personnel and the target audience
• Amount of participation by the target audience
• Rates of use of an agricultural chemical
• Rates of production of a specific crop
• Policies concerning cost share
• Rules regarding livestock waste application
• Types of participants
• Demographic data

Judgmental information can include the following examples:
• Opinions from experts or consultants
• Consumer preferences
• Target audience's beliefs and values
• Technical agency personnel's interpretation of laws
• Stakeholders' perceived priorities
• Individuals' interpretation of guidelines

What methods should be used to collect data?

There are multiple ways to collect information to answer most questions. The ideal situation would be to collect from more than one source and/or to collect more than one type of information. The selection of a method for collecting information must balance several concerns, including: resources available, credibility, analysis and reporting resources, and the skill of the evaluator. Examples of different data collection methods are given below.



• Behaviour Observation Checklist: a list of behaviours or actions among participants being observed. A tally is kept for each behaviour or action observed.
  o Participant Observation
  o Non-Participant Observation
• Knowledge Tests: information about what a person already knows or has learned.
• Opinion Surveys: an assessment of how a person or group feels about a particular issue.
• Performance Tests: testing the ability to perform or master a particular skill.
• Delphi Technique: a method of survey research that requires surveying the same group of respondents repeatedly on the same issue in order to reach a consensus.
• Q-sorts: a rank-order procedure for sorting groups of objects. Participants sort cards that represent a particular topic into different piles that represent points along a continuum.
• Self-Ratings: a method used by participants to rank their own performance, knowledge, or attitudes.
• Questionnaire: a group of questions that people respond to verbally or in writing.
• Time Series: measuring a single variable consistently over time, i.e. daily, weekly, monthly, annually.
• Case Studies: experiences and characteristics of selected persons involved with a project.
• Individual Interviews: individuals' responses, opinions, and views.
• Group Interviews: small groups' responses, opinions, and views.
• Wear and Tear: measuring the apparent wear or accumulation on physical objects, such as a display or exhibit.
• Physical Evidence: residues or other physical by-products are observed.
• Panels, Hearings: opinions and ideas.
• Records: information from records, files, or receipts.
• Logs, Journals: a person's behaviour and reactions recorded as a narrative.
• Simulations: a person's behaviour in simulated settings.
• Advisory, Advocate Teams: ideas and viewpoints of selected persons.
• Judicial Review: evidence about activities is weighed and assessed by a jury of professionals.


Below are some issues to remember when choosing a data collection method.

• Availability: You may have information already available to you that can help answer some questions or guide the development of new guidelines. Review information in prior records, reports, and summaries.
• Need for Training or Expert Assistance: Some information collection methods will require special skill on the part of the evaluator, or perhaps staff will need to be trained to assist with the evaluation.
• Pilot Testing: You will need to test the information collection instrument or process you design, no matter the form or structure. You will need to plan time for this step and for any revisions that may result from this testing.
• Interruption Potential: The more disruptive an evaluation is to the routine of the project, the more likely that it will be unreliable or possibly sabotaged by those who feel they have more important things to do.
• Protocol Needs: In many situations, you need to obtain appropriate permission or clearance to collect information from people or other sources. You will have to allow time to work through the proper channels.
• Reactivity: You do not want how you ask something to alter the response you will get. Reactivity may also be a concern if your presence during data collection may possibly alter the results. For example, if you as a supervisor are administering an opinion survey about a specific project, the responses your employees give may be influenced by their desire to please you as their supervisor, rather than based on their true feelings.
• Bias: Bias means to be prejudiced in opinion or judgment. Bias can enter the evaluation process in a variety of ways. For example, if you use a self-selected sample (when a person decides to participate in a study, rather than being picked randomly by the researcher), how might these respondents be different from the people that chose not to participate?
• Reliability: Will the evaluation process you have designed consistently measure what you want it to measure? If you use multiple interviews, settings, or observers, will they consistently measure the same thing each time? If you design an instrument, will people interpret your questions the same way each time?
• Validity: Will the information collection methods you have designed produce information that measures what you say you are measuring? Be sure that the information you collect is relevant to the evaluation questions you are intending to answer.

How much information should you collect?

Sampling refers to selecting a portion of subjects in order to learn something about the entire population without having to measure the whole group, which in many cases might be quite large. The portion taken is known as the sample. There are two general types of sampling methods: random and purposive.

• Random methods are used to produce samples that are, to a given level of probable certainty, free of biasing forces. In a random sample, each individual in the population has an equal chance of being chosen for the sample.
• Purposive methods are used to produce a sample that will represent specific viewpoints or particular groups in the judgment of those selecting the sample. The purposive sample consists of individuals selected deliberately by the researcher.
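As a simple illustration of random selection (a minimal sketch; the population list, seed and sample size are hypothetical), a simple random sample can be drawn with Python's standard library, which gives every member of the sampling frame an equal chance of being chosen:

```python
import random

# Hypothetical sampling frame: a list of resident IDs in the study area.
population = [f"resident_{i:05d}" for i in range(1, 70001)]

# Draw a simple random sample of 382 residents without replacement;
# every resident has an equal chance of being selected.
random.seed(2011)  # fix the seed so the draw can be reproduced
sample = random.sample(population, k=382)

print(len(sample), sample[:5])
```

A purposive sample, by contrast, would be assembled by hand from individuals chosen for the specific viewpoints or groups they represent.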

Here are some questions to consider when deciding whether to sample:
• Should you use a sample of a population or a census (an entire population, such as all people living in the watershed)?
• Should you use a random or purposive sample?
• How large a sample size do you need?
• Is your sample likely to be biased?


The following table tells you the number of people you must survey to accurately represent the views of the population under study. For example, you may want to understand how all of the residents in a city feel about a particular issue. If the city population is 70,000 people, then the sample size will be 382 people (find the number 70,000 under the Population column; to the right is the sample size of 382). That's the number of people you'll have to include in order to make generalizations about the entire city population.
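Sample size tables of this kind are commonly derived from the Krejcie and Morgan (1970) formula, which assumes a 95% confidence level, a 5% margin of error, and maximum variability (p = 0.5). The minimal sketch below (the function name is illustrative) reproduces the worked example above: for a population of 70,000 it returns a sample size of 382.

```python
import math

def required_sample_size(population: int, chi_sq: float = 3.841,
                         margin_of_error: float = 0.05,
                         proportion: float = 0.5) -> int:
    """Krejcie and Morgan (1970) sample size formula, the basis of many
    published sample size tables (95% confidence, 5% margin of error,
    p = 0.5 for maximum variability)."""
    numerator = chi_sq * population * proportion * (1 - proportion)
    denominator = (margin_of_error ** 2) * (population - 1) \
                  + chi_sq * proportion * (1 - proportion)
    return math.ceil(numerator / denominator)

print(required_sample_size(70_000))   # -> 382, matching the example above
```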


Writing Questions

This section focuses on what questions to ask and how to write them. At some point you will probably need to design your own instrument; at minimum, you will have to modify an existing instrument. In Step 1 you began the process of developing your questions, as you wrote several critical questions your evaluation needs to answer. Now you should start writing the specific questions that you will ask your target audience. The exact wording of each question is very important: a great deal of research has studied the effects of question wording and style on responses. While writing good questions may seem to be more of an art than a science, some basic principles for writing questions can serve as a guide for developing a written instrument. Below is a checklist you can use when forming your questions:

• Is this question necessary? How will it be useful? What will it tell you?
• Will you need to ask several related questions on a subject to be able to answer your critical question?
• Do respondents have the necessary information to answer the question?
• Will the words in each question be universally understood by your target audience?
• Are abbreviations used? Will everyone in your sample understand what they mean?
• Are unconventional phrases used? If so, are they really necessary? Can they be deleted?
• Is the question too vague? Does it get directly to the subject matter?
• Can the question be misunderstood? Does it contain unclear phrases?
• Is the question misleading because of unstated assumptions or unseen implications?
• Are your assumptions the same as the target audience's?
• Have you assumed that the target audience has adequate knowledge to answer the question?
• Is the question too demanding? For example, does it ask too much on the part of the respondent in terms of mathematical calculations, or having to look up records?
• Is the question biased in a particular direction, without accompanying questions to balance the emphasis?
• Are you asking two questions at one time?


• Does the question have a double negative?
• Is the question wording likely to be objectionable to the target audience in any way?
• Are the answer choices mutually exclusive?
• Is the question technically accurate?
• Is an appropriate referent provided? For example: per year, per acre.

Instrument Construction

An instrument is the tangible form on which you elicit and record information. There are many types of instruments, and in some cases you may be the instrument. Instruments must be carefully chosen or designed. Sloppy or improper instruments can destroy an evaluation effort. Designing instruments is a complex process. An option is to find an instrument that already exists and adapt it to your evaluation effort. While using an already designed instrument may save some development time, you need to make sure that its use is valid for your evaluation.

Creating a Questionnaire

Of all the data collection methods listed in Step 4, questionnaires are a widely used method of collecting information. They can be a cost-effective way to reach a large number of people or a geographically diverse group. After writing the questions you want to ask, a few other items must be considered before creating your questionnaire.

General guidelines for questionnaire format, cover letter, and envelopes: Once the questions are written, they must be organized into some type of structure. The format could be assembled as a booklet, or as single sheets of paper stapled together in the corner. The questionnaire should include the following key elements:

Cover Letter: A questionnaire should always be accompanied by a cover letter. The letter should include the title of the questionnaire, the purpose of the study, why and how the participant was selected to receive the questionnaire, and who is sponsoring the research. Also included should be the names of the project sponsor and contact person, and addresses and phone numbers for these persons. Remember to include a deadline for returning the questionnaire.


Questionnaire Introduction: State the purpose of the questionnaire, why it is being conducted, and who is sponsoring the research or which agency is responsible for the questionnaire; in essence, a short recap of some of the information included in the cover letter.

Instructions: Give clear instructions on how to answer the questions. For example, will the answers be circled or will a check mark be used? Will the respondent be expected to fill in a blank? If there are open-ended questions, is the question written so that the respondent needs to answer with more than a yes or no response? Are there clearly written instructions that tell the respondent to skip to a particular section on a designated page?

Grouping Questions: Group questions with similar topics together in a logical flow. Use a transition statement when moving to a new topic within the questionnaire. For example, state: "Next we would like to ask you several questions about the vegetative filter strips used on your land."

Demographic Questions: Place all demographic questions at the end of the questionnaire. Demographic questions include asking a person's age, gender, amount of formal education, ethnic group, etc. Ask only the demographic information you need to know for analyzing the data.

Other Comments: Allow space on the questionnaire to ask respondents to share any other comments.

Thank You: Remember to thank the respondent for completing the questionnaire.

What to do with the questionnaire: At the end of the questionnaire, repeat the deadline for returning the completed instrument, and the name and address of the person it should be mailed to. Always include the "mail to" address in case the enclosed envelope is misplaced by the respondent.

Arranging Questions

The first rule in arranging questions is to put the most important question first. After reading the cover letter explaining the purpose of the survey, the first thing a respondent should find on the questionnaire is a question that relates directly to that purpose.


Here are some additional tips on ways to arrange questions so they are clear and easy to answer.

• Make each question fit on the same page. Never force respondents to turn a page in the middle of a question or flip pages back and forth to answer a question.
• Provide instructions on how to answer each question. Place directions in parentheses using lower case letters. For example: Since attending the workshop, which of the following management practices have you used? (circle each answer that applies)
• Arrange questions and the space for answers in a vertical flow. Put the answer choices underneath, instead of next to the questions. This way the respondent moves down the page rather than side to side. For example:
  Do you own a no-till drill?
  1) Yes
  2) No
• If using yes/no or other repeated answers, always keep answer categories in the same order from question to question. For example: 1) Yes 2) No. Do not switch to: 1) No 2) Yes.
• Use multiple columns to conserve space and make the question less repetitious. For example: How much of an effect did the watershed programs have on your farming operation? (circle the response that best represents your feelings; if you did not participate in the program circle DP)


Group questions of similar subject matter together. Suppose you were constructing a questionnaire that asked questions about three topics, such as vegetative filter strips, grass waterways, and conservation practices. You should organize the questions so that one section contains questions that relate specifically to vegetative filter strips, one section contains questions that relate specifically to grass waterways, and one section contains questions that relate specifically to conservation practices.

Checklist for Evaluating Your Questionnaire:
• A cover letter accompanies the questionnaire.
• Title of questionnaire will appeal to respondents.
• Questionnaire looks easy to complete.
• Print quality is clear and legible.
• Introduction is concise and relevant.
• Instructions are brief.
• Instructions are clear.
• Instructions are provided for each question or series of very similar questions.
• All questions are essential and relevant to the objective of the survey.
• Wording is at an appropriate literacy level for the survey population.
• Initial items are applicable to all members of the survey population.
• Initial items are non-threatening.
• Initial items are interesting.
• Items with similar content are grouped together.
• Adequate space is provided for respondents to write answers.
• Each question fits within the boundary of the page.
• All questions are arranged in a vertical flow.
• Demographic questions are at the end.
• A thank you is included at the end of the questionnaire.
• Instructions for mailing the questionnaire are included at the end.
• A self-addressed stamped envelope is included for each respondent.


USE OF AVAILABLE DATA

The researcher's study may not call for the use of the methods discussed above in generating data. Instead, it may only require locating and examining: 1) data which have been previously gathered by other researchers or those which were accumulated through a regular and systematic system, 2) materials about introspection, or 3) those which were written to inform, entertain, or influence public opinion.

Statistics

These are data which have been previously gathered by other researchers or those which were accumulated regularly and systematically for purposes of planning, administration, development of interventions, or for historical reasons. These statistics include those of surveys, censuses, vital records (as of births, deaths, and morbidity), official statistics (as on population, housing, economy and education), and service statistics (as in health, family planning, and emergency operations like providing aid to victims of fire, flood, typhoon, and other natural calamities).

Advantages of using statistics: The use of these available data offers several advantages. The researchers are spared from spending thousands of pesos, especially if the study population is large and widely distributed over the province or region; the data, being collected repeatedly and regularly, allow determination of trends over time; and collecting them does not involve the help of many persons nor the cooperation of the individuals who are the subject of the investigation. In using the said data, it is well for researchers to ensure that the methods used in obtaining them were scientific and appropriate, and that they are accurate.

Personal Documents

Materials about personal introspection refer to personal documents, which include autobiographies, diaries, letters, essays and the like. These materials, to be useful to a research undertaking, should meet the following criteria: 1) they should be tangible (either written or recorded); 2) they were produced on the writer's own initiative, or if not, their introspective content has been determined by the author; and 3) they focus on the author's personal experiences.

Mass Communications

Mass communications in the form of newspapers, magazines, radio, television and motion pictures, while they produce materials for information, entertainment and persuasion, are good sources of information for research use. Since they were not produced for the benefit of the researcher, they are free of the researcher's theoretical and personal bias. They also reflect broad aspects of the social conditions in which they are produced. Finally, they allow the researcher to view and examine the historical past as well as contemporary society. Content analysis is the appropriate method for obtaining data from materials produced by mass communications.

CRITICIZING, EVALUATING AND ESTABLISHING RELATIONSHIP BETWEEN AND AMONG THEM

Evaluation is like a pinwheel: it revolves, and your project should revolve around evaluation. Once the instrument development and testing process is underway, it is time to start developing your plan to handle the information you will collect. This is an important process. At this point, you should develop the process that will organize, analyze, interpret, summarize, and report your results. This is also the point where you may need the help of a consultant.

Organize

Before you begin to collect the first piece of information, you must develop a system to organize your data. Proper organization and planning will help ensure that the data will be kept secure and organized for the analysis. Tips for organizing your evaluation data:

• Set up a protocol on how to receive and record the information as it comes in. For example, one person on the project team should be in charge of handling all incoming mail.
• Label all data immediately as you collect or receive it. For example, label cassette tapes with the name of the interviewee, the interviewer, and any other pertinent information. If you are receiving questionnaires returned by mail, check them off, record the date received, code and number them, and add any other information needed.
• As data are received, check to be sure that the participant has completed the entire instrument correctly, that interviewers have used the proper questioning route, etc. You do not want to discover after all data are collected that there are errors. If data are being transcribed or transferred in some way, check to be sure that this is done accurately throughout the process.
• Back up all computer disks containing data.
• Set up a protocol for accessing the data, including who has, or does not have, access. Establish a secure place and way to store all data. If destroyed or lost, data cannot be replaced. If data are confidential, they should be stored in a locked place so that only the staff member working with the data has access.
• Set up a system to track all data. This will be your system to check that data are not lost or overlooked as analysis and summarizing are completed.
• Develop a format for storing and organizing your data prior to the analysis. For example, you could use a spreadsheet program to enter the raw data.
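As a small illustration of that last tip (a minimal sketch; the file name, column names and sample values are all hypothetical), returned questionnaires could be logged in a simple tracking file before analysis, for example with Python's csv module:

```python
import csv
import os
from datetime import date

LOG = "questionnaire_log.csv"          # hypothetical tracking file
FIELDS = ["questionnaire_id", "date_received", "complete", "notes"]

# Hypothetical questionnaires received in today's mail.
returned_today = [
    {"questionnaire_id": "Q-0042", "date_received": str(date.today()),
     "complete": "yes", "notes": ""},
    {"questionnaire_id": "Q-0057", "date_received": str(date.today()),
     "complete": "no", "notes": "section 3 left blank - follow up"},
]

write_header = not os.path.exists(LOG)  # write the header row only once
with open(LOG, "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    if write_header:
        writer.writeheader()
    writer.writerows(returned_today)    # append today's receipts to the log
```

The same file then doubles as the check that no returned questionnaire is lost or overlooked during analysis.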

Analyze

The first step in analyzing data is to determine what method of data analysis you will be using. If most of the information you collected consists of numbers, then the data are quantitative. If the information you collected consists of words, then the data are qualitative. With quantitative data, the analysis does not begin until all data are collected. In contrast, most qualitative data analysis begins as data are collected. For example, when conducting interviews, the transcripts are analyzed as soon as possible in order to generate additional questions for follow-up interviews.

Quantitative Data Analysis

If most of the information you collected is numerical (quantitative) data, then descriptive statistics can be used to characterize your data. Some of the more commonly used descriptive statistics are the mean, median, mode, standard deviation, and frequency.

Definitions:
Mean: The average score of the sample.
Median: The middle score when the responses are arranged in order (with an even number of responses, the average of the two middle scores).
Mode: The response given most often.
Standard Deviation: The distance from the mean within which roughly 68% of the responses fall, for normally distributed data.
Frequency: How often a particular response was given.
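These statistics can be computed directly with Python's built-in statistics module. The minimal sketch below uses the ten workshop ratings analyzed in the example that follows (the variable name is illustrative):

```python
import statistics
from collections import Counter

# Ten responses to "rate the overall quality of the workshop" on a 1-5 scale.
scores = [4, 5, 2, 4, 3, 4, 3, 3, 5, 4]

print("mean:", statistics.mean(scores))                # 3.7
print("median:", statistics.median(scores))            # 4.0
print("mode:", statistics.mode(scores))                # 4
print("std dev:", round(statistics.stdev(scores), 2))  # 0.95 (sample standard deviation)
print("frequency:", Counter(scores))                   # how often each rating was given
```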

For example, consider the data set for the following question:

Question: On a scale of 1 to 5, where 1 = poor and 5 = excellent, how would you rate the overall quality of the workshop?

Answers from 10 respondents: 4; 5; 2; 4; 3; 4; 3; 3; 5; 4

• The mean for this data set is 3.7 (the total of 37 divided by 10 scores).
• The median for this data set is 4 (when the scores are arranged in order, the two middle scores are both 4).
• The mode for this data set is 4 (this is the score reported most often).
• The standard deviation for this data set is 0.95 (in this data set, the majority of the scores were close to the mean of 3.7).
• The frequency for each response is as follows: 1: no responses; 2: one response; 3: three responses; 4: four responses; 5: two responses.

Qualitative Data Analysis

If most of your data collection was done using individual interviews, focus group interviews, open-ended questions, or case studies, then your data will be in the form of words (qualitative data). Unlike numerical data, which can be analyzed with a hand calculator or computer program, qualitative data consist of words that need to be analyzed initially by reading and sorting through the data. With qualitative data, the challenge is how to organize the information you have collected. How the data are ordered, categorized, and arranged is important because most qualitative data are words that must be interpreted for content.

Researchers who specialize in qualitative analysis use a method called content analysis. This process includes carefully reading the information, and then identifying, coding, and categorizing the main themes, topics, and/or patterns in the information. Coding is simply attaching some alpha-numeric symbol to phrases, sentences, or strings of words that follow a similar theme or pattern. This process allows you to then place these phrases of similar themes into a category for further analysis. There are several strategies that can be employed to help with content analysis. One example from Bogdan and Biklen contains ten different coding categories as a method for sorting qualitative data. These categories are:

• Setting/Context: these are data related to the evaluation setting.
• Definition of the situation: these types of data tell how the people in the study define the setting or define the topic; for example, what their worldview is about their work.
• Perspectives held by subjects: the information focuses on ways of thinking, such as shared ideas held by the participants.
• Subjects' ways of thinking about people and objects: this category is more detailed than the previous one; the codes include data that focus on people's understanding of each other, and of their world.
• Processes: these data include codes and phrases that categorize sequences of events, and changes that occur over time.
• Activities: codes include behaviours that occur on a regular basis.
• Events: the information in this category of data is categorized in relation to specific activities in the evaluation setting, or in the lives of the people interviewed.
• Strategy: these are the methods and techniques that people use to accomplish various tasks.
• Relationships and social structures: this type of information focuses on friendships, adversaries, mentors, romances, enemies or other individual relationships.
• Methods: data in this category are related to project or evaluation procedures, problems, successes, barriers, dilemmas, etc.

Bogdan and Biklen (1992) describe qualitative data analysis with the following definition: "Data analysis is the process of systematically searching and arranging the interview transcripts, field notes, and other materials that you accumulate to increase your own understanding of them, and to enable you to present what you have discovered to others. Analysis involves working with data, organizing them, breaking them into manageable units, synthesizing them, searching for patterns, discovering what is important and what is to be learned, and deciding what you will tell others."

What is important to understand from this discussion of quantitative and qualitative data analysis methods is that the analysis methods used will differ from one evaluation setting to another. There is no single prescription for conducting analysis that fits every situation. When conducting an evaluation you need to recognize this and base your data analysis methods on the nature of your data.
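As a rough illustration of the coding step described above (a minimal sketch; the codes, keywords and interview excerpts are entirely hypothetical), short phrases from transcripts can be tagged with alpha-numeric codes and then tallied by category:

```python
from collections import Counter

# Hypothetical code book: an alpha-numeric code for each theme, with keywords
# that suggest the theme is present in an excerpt.
CODE_BOOK = {
    "A1-Setting":       ["clinic", "ward", "barangay"],
    "B2-Process":       ["over time", "gradually", "changed"],
    "C3-Relationships": ["supervisor", "colleague", "mentor"],
}

# Hypothetical excerpts from interview transcripts.
excerpts = [
    "The ward was crowded, but staffing gradually improved.",
    "My mentor encouraged me to document everything.",
    "Practices in the barangay changed over time.",
]

tally = Counter()
for text in excerpts:
    for code, keywords in CODE_BOOK.items():
        if any(word in text.lower() for word in keywords):
            tally[code] += 1   # the excerpt is placed under this category

print(tally)   # e.g. Counter({'A1-Setting': 2, 'B2-Process': 2, 'C3-Relationships': 1})
```

In practice the tagging is done by a human reader; the point of the sketch is only to show how coded phrases, however they are assigned, can be grouped into categories for further analysis.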

Interpret

After the data have been analyzed, it is time to interpret the results. Put simply, interpretation is the process of bringing meaning to the data. You may ask yourself, "What does it all mean?" When interpreting the data you must sift through the mass of results and identify trends, commonalities and testimony that will help answer the critical evaluation questions that were generated in Step 1. If the evaluation is to be useful, the evaluator must interpret the information so that the stakeholders will understand the results and know how to use them.

Below is an exercise in data interpretation. Emerald Lake users were asked to rate their familiarity with several programs on a scale of 1 to 5, with 1 = not being familiar with the program and 5 = being very familiar with the program. The table below lists the programs and the average score each received.


• Gathering of materials and/or data
• Criticizing, evaluating and establishing relationship between and among them
• Reporting of facts, observing carefully the accepted rules and mechanics

HONESTY IN YOUR WORK


Honesty is essential, not only to enable straightforward, above-board communication, but to engender a level of trust and credibility in the outcomes of the research. This applies to all researchers, no matter what subject they are investigating. Although honesty must be maintained in all aspects of the research work, it is worth focusing here on several of the most important issues.
INTELLECTUAL OWNERSHIP AND PLAGIARISM

Unless otherwise stated, what you write will be regarded as your own work; the ideas will be considered your own unless you say to the contrary. The worst offence against honesty in this respect is called plagiarism: directly copying someone else's work into your report, thesis, etc. and letting it be assumed that it is your own. Using the thoughts, ideas and works of others without acknowledging their source, even if you paraphrase them into your own words, is unethical. Equally serious is claiming sole authorship of work which is in fact the result of collaboration or amanuensis (ghosting).
ACKNOWLEDGEMENT AND CITATION

Obviously, in no field of research can you rely entirely on your own ideas, concepts and theories. You can avoid accusations of plagiarism by acknowledging the sources of these features and their originators within your own text. This is called citation. Although there are several well established citation methods, they all consist of brief annotations or numbers placed within the text that identify the cited material, and a list of references at the end of the text that give the full publication details of the source material. These methods of reference cater for direct quotations or ideas etc. from the work of others gathered from a wide variety of sources (such as books, journals, conferences, talks, interviews, TV programmes etc.), and should be meticulously used. You should also indicate the assistance of others and any collaboration with others, usually in the form of a written acknowledgement at the beginning or end of the report.
RESPONSIBILITY AND ACCOUNTABILITY OF THE RESEARCHER

Apart from correct attribution, honesty is essential in the substance of what you write. You do have responsibilities to fellow researchers, respondents, the public and the academic community. Accurate descriptions are required of what you have done, how you have done it, the information you obtained, the techniques you used, the analysis you carried out, and the results of experiments: a myriad of details concerning every part of your work.
DATA AND INTERPRETATIONS

Although it is difficult, and some maintain that it is impossible, to be free from bias, distorting your data or results knowingly is a serious lapse of honesty. Scientific objectivity should be maintained as much as possible. If you can see any reason for a possibility of bias in any aspect of the research, it should be acknowledged and explained. If the study involves personal judgements and assessments, the basis for these should be given. Silently rejecting or ignoring evidence which happens to be contrary to one's beliefs, or being too selective in the data used and in presenting the results of the analysis, constitutes a breach of integrity. The sources of financial support for the research activities should be mentioned, and pressure and sponsorship from sources which might influence the impartiality of the research outcomes should be avoided.
WHERE DO YOU STAND?

The theoretical perspective, or epistemology, of the researcher should be made clear at the outset of the research so that the ground rules or assumptions that underpin the research can be understood by the readers, and in some instances, the subjects of the research. One of the principal functions of doing background research is to explore just this aspect, and to come to decisions on theory that will form the basis of your research approach. The theoretical approach will influence the type of data collection and analysis used. These methods are not ethically neutral so they will raise ethical issues.

SITUATIONS THAT RAISE ETHICAL ISSUES


Social research, and other forms of research which study people and their relationships to each other and to the world, need to be particularly sensitive about issues of ethical behaviour. As this kind of research often impinges on the sensibilities and rights of other people, researchers must be aware of necessary ethical standards which should be observed to avoid any harm which might be caused by carrying out or publishing the results of the research project.
RESEARCH AIMS

Although research aimed merely at gaining greater knowledge and understanding of a phenomenon has little or no ethical consequences (the expansion of scientific knowledge is generally regarded as a good thing), applied research is more easily subjected to ethical investigation. Will the results of the research benefit society, or at least not harm it? Will there be losers as well as gainers? The research aims and their consequences must be clearly stated. Normally you will have to argue that the aims of your research are in accordance with the ethical standards prescribed by your university or organization.
USE OF LANGUAGE

How you use language has an important influence when doing and writing up research. You should aim to be as neutral as possible in the way you use terminology involving people: who and what they are, and what they do. Guard against being patronizing or disparaging, and avoid bias, stereotyping, discrimination, prejudice and intolerance. You will notice that acceptable terminology changes with time, so be aware that terms used in some older literature are not suitable for use now. You need to be constantly aware of the real meaning of terms, and their use within the particular context.
PRESENTATION

This relates to how you present yourself in the role of the researcher, which might influence the attitude and expectations of the people you involve in your project. Student-researchers should present themselves as just that, and give the correct impression that they are doing the research as an academic exercise which does not have the institutional or political backing to cause immediate action. Practitioner researchers, such as teachers, nurses or social workers, have a professional status that lends more authority and possibly power to instigate change. Do not raise false expectations.

The research situation can also be influential. Stopping people in the street and asking a few standardized questions will not raise any expectations about actions, but if you spend a lot of time with a, perhaps lonely, old person delving into her personal history, the more intimate situation might give rise to a more personal relationship that could go beyond the simple research context. Even more expectations can be raised if you are working in a context of deprivation or inequality: will the subjects begin to expect you to do something to improve their situation?
DEALING WITH PARTICIPANTS

You should treat participants with due ethical consideration in the way you choose them, deal with them personally, and use the information they provide. In many cases, participants choose freely whether to take part in a survey by simply responding to the form or not. However, friends or relatives may feel that they have an obligation to help you despite any reservations they may have, which could result in a restriction of their freedom to refuse. Pressure might be exerted on participants if they are left too little time for due consideration, which might also result in them regretting taking part. Obviously, you should avoid dishonest means of persuasion, such as posing as an official, making unrealistic and untrue promises, being unduly persistent, and targeting people in vulnerable situations. This could occur almost inadvertently if you are not alert to people's situations and reactions.

Participants will decide whether to take part according to the information they receive about the research. The form that this information takes will depend on the type of person, the nature of the research process and the context. It should be clear and easily understood so they can make a fair assessment of the project in order to give informed consent. Particular attention is needed when getting consent from vulnerable people such as children, the elderly or ill, foreign language speakers and those who are illiterate. When working within organizations, managers or other people with overall responsibilities may need to be consulted, with the result that several layers of consent will be required. Make it clear and get agreement at all levels about what issues are to be discussed, how the investigation will be conducted, and how confidentiality will be maintained. Be aware that there may be conflicts of interest between the management and employees, so there must be some obvious form of protection for those making criticisms of the organization or systems of work or conditions. Although verbal explanations may be sufficient in informal situations, a written résumé on a flyer could be useful. Questionnaires should always provide the necessary written information as an introduction. Participants must have the right to terminate their participation at any time.

CARRYING OUT THE RESEARCH


POTENTIAL HARM AND GAIN

The principle behind ethical research is to cause no harm and, if possible, to produce some gain for the participants in the project and the wider field. Therefore the researcher should assess the potential of the chosen research methods and their outcomes for causing harm or gain. This involves recognizing what the risks might be and choosing methods that minimize these risks, and avoiding making any revelations that could in any way be harmful to the reputation, dignity or privacy of the subjects.
RECORDING DATA

There is a danger of simplifying transcripts when writing up data from interviews and open questions. When you clean up and organize the data, you can start to impose your own interpretation, ignoring vocal inflections, repetitions, asides, and subtleties of humour, thereby losing some of the meanings. Further distortion can be introduced by being governed by one's own particular assumptions.
PARTICIPANT INVOLVEMENT

Questions about rapport are raised if your research entails close communication between you, the researcher, and the participants. Will those involved understand the motivation for your actions and do these conform to your own practice? You should not take familiarity so far as to deceive in order to extract information that the participant might later regret giving. Neither should you raise unrealistic expectations in order to ingratiate yourself.
SENSITIVE MATERIAL

Information of a sensitive nature can be thrown up which, if revealed, could do damage to the participants or to other people. Every case will have to be judged individually, but if this information is relevant to the research, it must be presented in such a way that individuals are not damaged, for example by assuring confidentiality and anonymity. In cases of, for example, unfairness, victimization or bullying, it is unwise to get personally involved, but it may be possible to give advice to the participant about who to contact for help, such as a school tutor, trade union or ombudsman.
HONESTY, DECEPTION AND COVERT METHODS

Honesty is a basic tenet of ethically sound research, so any type of deception and use of covert methods should be ruled out. Although you might argue that certain information of benefit to society can only be gained by these methods, due to obstruction by people or organizations that are not willing to risk being scrutinized, how can you be sure of the benign consequences of your actions? The risks involved make the use of deception and covert methods extremely questionable, and in some cases even dangerous.
STORING AND TRANSMITTING DATA

The Data Protection Act 1998 in the UK and equivalent regulations elsewhere cover the conditions regarding collections of personal data in whatever form and at whatever scale. They spell out the rights of the subjects and the responsibilities of the compilers and holders of the data. The data that you have collected may well contain confidential details about people and/or organizations. It is therefore important to devise a storage system that is safe and accessible only to you. If you need to transmit data, take measures to ensure that the method of transmission is secure and not open to unauthorized access.
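As one small illustration (a minimal sketch, not a prescription; the file names are hypothetical and the example assumes the third-party cryptography package is installed), a confidential data file can be encrypted before it is transmitted, so that an intercepted copy is unreadable without the key:

```python
# Requires: pip install cryptography. In practice the key itself must be
# stored and shared through a separate, secure channel.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # keep this key separate from the data
fernet = Fernet(key)

with open("interview_transcripts.csv", "rb") as f:
    encrypted = fernet.encrypt(f.read())

with open("interview_transcripts.csv.enc", "wb") as f:
    f.write(encrypted)             # only this encrypted copy is transmitted

# The recipient, holding the same key, recovers the data with:
# Fernet(key).decrypt(encrypted)
```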
CHECKING DATA AND DRAFTS

It is appropriate to pass the drafts of your research report on to colleagues or supervisors for comment, but only with the proviso that the content is kept confidential, particularly as it is not ready for publication and dissemination at this stage. The intellectual independence of the findings of the report could be undermined if you allow sponsors to make comments on a draft and they demand changes to be made to conclusions that are contrary to their interests. It is not practical to let respondents read and edit large amounts of primary data.
DISSEMINATION

Dissemination of your results in the form of conference or journal papers, a website or other types of publication inevitably involves reducing the length of the material, and perhaps changing the style of the writing. You must therefore be careful that the publication remains true to the original and avoid oversimplification, bias towards particular results or even sensationalization.
DISPOSING OF RECORDS

A suitable time and method should be decided for disposing of the records at the end of the research project. Ideally, the matter will have been agreed with the participants as a part of their informed consent, so the decision will have been made much earlier. The basic policy is to ensure that all the data is anonymous and non-attributable. This can be done by removing all labels and titles that could lead to identification. Better still, data should be disposed of in such a way as to be completely indecipherable. This might entail shredding documents, formatting discs and erasing tapes.

