
ANALYTIC METHODS GREEN TEAM

16 May 2012 INTL 520 Mercyhurst University

Introduction

Page 1

ADVANCED ANALYTIC METHODS GREEN TEAM


Dean Atkins, Leslie Guelcher, David Krauza, Puru Naidu, Shawn Ruminski, Emily Slegel
Erie, PA

© 2012 Mercyhurst University, Green Team, Erie, PA. All rights reserved. No part of this book may be reproduced or transmitted in any form or by any means without written permission from the author.


Table of Contents

CHAPTER 1: GAP ANALYSIS ... 4
    Description ... 4
    Strengths ... 4
    Weaknesses ... 5
    How-To ... 5
    Personal Application ... 6
    Conclusion ... 8

CHAPTER 2: COST-BENEFIT ANALYSIS ... 9
    Description ... 9
    Strengths ... 9
    Weaknesses ... 10
    How-To ... 11
    Personal Application ... 12
    Conclusion ... 16
    Additional Resources ... 16

CHAPTER 3: CONTENT ANALYSIS ... 18
    Description ... 18
    Strengths ... 18
    Weaknesses ... 19
    How-To ... 19
    Personal Application ... 21
    Conclusion ... 25
    Additional Resources ... 25

CHAPTER 4: DELPHI METHOD ... 26
    Description ... 26
    Strengths ... 26
    Weaknesses ... 27
    How-To ... 27
    Personal Application ... 28
    Conclusion ... 30

CHAPTER 5: GAME THEORY ... 31
    Description ... 31
    Strengths ... 33
    Weaknesses ... 34
    How-To ... 34
    Personal Application ... 35
    Conclusion ... 37

CHAPTER 6: COMPARATIVE NEWS FRAME ANALYSIS ... 38
    Description ... 38
    Strengths ... 39
    Weaknesses ... 39
    How-To ... 40
    Personal Application ... 41
    Conclusion ... 45
    Further Information ... 45

BIBLIOGRAPHY ... 47


Chapter 1: Gap Analysis


Dean Atkins
Description
Gap analysis is a method that identifies the difference (gap) between the current state and a desired end state of a situation within a business or organization. Traditionally, gap analysis is internally focused and is very similar to benchmarking. As an analytic tool in the intelligence community, however, it is focused externally, on other countries and entities: the current state is generally known, and the analysis can be used to identify the likely courses of action a target may take to reach its desired end state. Gap analysis can be carried out in a number of ways.

Strengths
Flexibility. Gap analysis can be applied in a number of ways: quantitatively or qualitatively, broken into smaller components in a micro approach, or applied on a larger scale in a macro approach.

Can be applied internally or externally. Gap analysis is traditionally applied internally to move an organization from its current position toward a desirable end state. The intelligence field requires an external focus, which can be achieved by using a target's current state and desired end state to identify its possible courses of action.

Once gaps are identified, it is easier to develop actions and solutions. When gaps are perceived and identified, it becomes much easier to offer actions or solutions to a decision-maker. These recommendations are easily rationalized and explained, with a clear gap guiding the user.

Weaknesses
There is no standardized way of doing it. With so many variants, any number of methods can be used to perceive and demonstrate the gap, which also makes results hard to conduct consistently and compare across studies.

Doesn't offer a clear estimate. Most analytic techniques offer a decision-maker a clear estimate. Internally this may be realistic; externally, however, you cannot be certain which pathway or course of action a target will take, making an estimate much harder to extract from gap analysis.

Harder to apply externally. Externally focused gap analysis also has far less data and information to draw on. This makes it more challenging to establish the current state, know a target's desired end state, and therefore identify the likely courses of action to close the gap between the two.

How-To
1. Identify the target. Whether internal or external, identify the target to which you will apply the gap analysis.

2. Identify the current state. Using all available information, determine the target's current position: where are they now?

3. Identify the desired end state. Whether it is a statement of their intent or a likely outcome given the information you have, determine the target's desired end state: what are they trying to achieve, and where do they want to be?

4. Determine the gaps between the two states. This is where the most flexibility can be applied. In whatever manner is appropriate, determine the gaps between where the target is and where it wants to be. This may draw on qualitative or quantitative data in a micro or macro approach.

5. Interpret how the target may act to close the gap. Using the gaps you have identified, work out the likely courses of action the target may take to get from the current state to the desired end state. Although these may not be truly estimative, they can identify possible pathways.

6. Report results. Report the findings, clearly stating the current and desired end states, followed by the gaps identified and, if necessary, the likely courses of action the target may take to close each gap.
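For the quantitative, macro flavor of step 4, the gap computation itself is simple arithmetic over whatever metrics you have chosen. The sketch below (in Python, with invented metric names and values) compares a target's current-state metrics against a desired end state and reports the gap for each:

```python
# Hypothetical metrics for a target; real studies would draw these from
# collected data (step 2) and benchmarks (step 3).
current_state = {"win_percentage": 0.55, "goals_scored": 38, "avg_attendance": 450}
desired_state = {"win_percentage": 0.75, "goals_scored": 52, "avg_attendance": 700}

def find_gaps(current, desired):
    """Return {metric: desired - current} for every metric present in both."""
    return {m: desired[m] - current[m] for m in desired if m in current}

gaps = find_gaps(current_state, desired_state)
# Report the largest gaps first.
for metric, gap in sorted(gaps.items(), key=lambda kv: -abs(kv[1])):
    print(f"{metric}: gap of {gap:+.2f}")
```

Sorting by magnitude surfaces the widest gaps first, which is a natural starting point for step 5's interpretation of how the target might act.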

Personal Application
1. Identify the target. I applied gap analysis to the Men's Soccer Team at Mercyhurst University. My goal was to use gap analysis both in the traditional, internal way and in the intelligence manner of focusing on external actors. My secondary targets were therefore the other PSAC (Pennsylvania State Athletic Conference) teams.

2. Identify the current state. Using Excel and a number of statistics, I applied the gap analysis quantitatively to assess where each team stood over the past three years. Qualitatively, I interviewed the current Mercyhurst soccer coach for more detail on the internal current state of the Mercyhurst Men's Soccer Team.

3. Identify the desired end state. Building on the quantitative information gathered, I used previous PSAC winners as benchmarks (benchmarking is a similar analytic technique) and the PSAC title as the desired end state. Internally, I homed in on Mercyhurst with the same desired end state: the PSAC title.

4. Determine the gaps between the two states. I used three methods for determining the gaps between a team's current state and the desired end state of a PSAC title. The first method was externally focused and used quantitative data in Excel to look for trends and correlations between the data and success. A number of factors were measured: record, win percentage, home record, home win percentage, average attendance, squad size, number of players in each position, number of coaches, number of players by college year, goals scored, and goals conceded. The second method was internally focused and conducted qualitatively: an interview with the Mercyhurst soccer coach that identified a number of gaps between the team's current state and the desired end state of a PSAC title. The third method was developed from the coach's interview. Using the gaps identified internally, I explored how they might apply externally to the other teams in the PSAC. This became an external approach using a qualitative survey of teams' defensive weaknesses. It was a very specific gap analysis, focusing only on defensive weaknesses, measured across the following: counter attacks, corners, free kicks, penalties, individual mistakes, players out of position, open play, a high line, a deep line, red cards, crosses, and attacks through the middle.

5. Interpret how the target may act to close the gap. Using the gaps I identified, I developed a number of possible approaches the opposition might take next season to reach the desired end state of a PSAC title. Building on the external and internal factors, I was also able to offer a number of recommendations and suggestions to the Mercyhurst soccer coach. This data would be very useful if applied to an Indicators and Warnings analytic technique (a possible future study).

6. Report results. After compiling all the data, I presented my findings and conclusions to the Mercyhurst soccer coach and the Athletic Director.

Conclusion
The results of the gap analysis were very useful to the Mercyhurst soccer coach and left considerable scope for further study. The method itself is hard to implement because there is no set way of doing it, which makes it difficult to give instructions and a how-to methodology for such an expansive method. My personal application demonstrates how easily the method can be applied in a variety of ways; it is important, however, to define beforehand whether you are applying the method internally or externally, as I found myself blurring the lines on multiple occasions during my study.


Chapter 2: Cost-Benefit Analysis


Leslie Guelcher
Description
Cost-benefit analysis (CBA) is an analytic modifier that attempts to determine whether a project, course of action, or investment should be selected given limited investible funds (Mishan & Quah, 2007, p. 3). The process entails quantifying both the costs and the benefits of a project, and it can be used to reduce uncertainty. CBA is traditionally used in one of two settings: as an economic analysis to determine the social benefit of a public undertaking, or as an accounting function for private enterprises to determine the opportunity costs for a set of projects or decisions (Mishan & Quah, 2007, p. 5). For economic problems, the cost of a proposal is weighed as society's cost and the benefits as social benefits to determine whether a project will result in a net social benefit, where benefit − cost = net value (Robinson, 1993, p. 924). At the firm level, the costs are the actual, or estimated, costs of the project to the firm alone; similarly, the benefits are those accrued only to the firm within the framework of the project being examined (Sonnenreich, Albanese, & Stout, 2006, p. 55).

Strengths
Cost-benefit analysis is simple to implement when using quantitative data. When the project's costs and benefits have set financial figures attached, it is a simple matter of entering all costs and associated benefits to derive a net value for the project.

CBA reduces uncertainty. The process of listing all potential costs and quantifying the benefits allows an analyst to verify that a project has considered the true costs of both inputs (costs) and outputs (benefits). Using brainstorming, expert testimony, or other techniques to generate the list of costs and benefits aids the process.

Examples of using CBA are plentiful. Because CBA has been used in both accounting and economics for decades, there are plenty of academic articles, books, and downloadable spreadsheet templates to aid in preparing the analysis.

Methods for determining costs and benefits are available. A myriad of journal articles and other publications detail procedures for arriving at the costs and benefits of a specific project. For instance, in my application of CBA I found several articles that helped determine which factors to include as costs and benefits.

Weaknesses
Qualitative data can be manipulated. To work in CBA, qualitative data must be converted to quantitative data. Using a range for the data helps mitigate the bias that could otherwise permeate the analysis; it is easy to underestimate costs while overestimating benefits, which leads to faulty conclusions.

CBA is not a stand-alone answer. When using qualitative data, other modifiers are needed in conjunction with CBA to translate qualitative ideas into the numbers that quantitative analysis requires.


How-To
1. Identify the project, course of action, or investment to be analyzed. The project will usually be defined by decision-makers. It can be a question comparing two potential solutions, or a single investment whose projected usefulness/benefit the analyst is tasked with determining.

2. Choose appropriate modifiers. Additional modifiers should be used to generate a complete list of benefits and costs. By using more than one modifier, the analyst can ensure as complete a list as possible is included in the CBA. Various modifier ideas can be found in Additional Resources.

3. Determine the costs and benefits to be analyzed. The analyst can begin listing costs and benefits from individual research using Google Scholar, Lexis-Nexis, or other online sources for journal and technical articles. These papers give the analyst a starting point for determining what other information will be needed. From there, additional items can be added to the list from HUMINT (talking to experts), brainstorming, or other idea-generating activities.

4. Build a spreadsheet. Using a program like Microsoft Excel, either build a spreadsheet from scratch or find a template online; a Google search for "cost benefit filetype:xls" should turn up a number of ready-made templates. The spreadsheet should include an area listing all of the costs followed by all of the benefits, and each section (cost and benefit) should be auto-summed to improve accuracy.

5. Determine the numbers to use for quantifiable data. Through researching costs and numerical benefits, the analyst can start filling in the items on the spreadsheet. If exact numbers are not available, a range of costs/benefits should be used. Using Hubbard's 90 percent confidence interval (Hubbard, 2010, pp. 55, 103), the analyst should identify the low and high ends for both costs and benefits, listing each in its own column to produce a specific low total and high total for the analysis.

6. Convert qualitative data to quantitative. For CBA to work, all data must be in the form of numbers, so any fuzzy items require conversion from ideas to numbers. Again, Hubbard's ideas for creatively approaching items that might be considered unmeasurable can produce estimates that are close enough to reduce uncertainty (Hubbard, 2010, pp. 139-176).

7. Input quantities. Every cost and benefit needs a quantity, or range of quantities, associated with it; each item should be listed with what it is and what it costs.

8. Analyze the results. Add all costs together to obtain the total investment, add all benefits together to get the total benefit, and subtract the total investment from the total benefit to get the net value. A positive value indicates a net benefit; a negative value, a net cost.
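Steps 4 through 8 can be sketched without a spreadsheet at all. The Python fragment below (all line items and figures are invented for illustration) carries Hubbard-style low/high estimates through to a net-value range:

```python
# (low, high) annual estimates in dollars; line items are hypothetical.
costs = {
    "hardware": (2_000, 3_500),
    "training": (500, 1_200),
}
benefits = {
    "prevented_incidents": (3_600, 15_000),
    "productivity": (1_000, 4_000),
}

def net_value(costs, benefits):
    """Return (worst, best) net value = total benefit - total cost."""
    cost_low = sum(lo for lo, hi in costs.values())
    cost_high = sum(hi for lo, hi in costs.values())
    ben_low = sum(lo for lo, hi in benefits.values())
    ben_high = sum(hi for lo, hi in benefits.values())
    # Worst case pairs low benefits with high costs; best case is the reverse.
    return ben_low - cost_high, ben_high - cost_low

worst, best = net_value(costs, benefits)
print(f"Net value ranges from {worst:,} to {best:,}")
```

Pairing the low-benefit total with the high-cost total (and vice versa) yields a conservative worst case and an optimistic best case rather than a single point estimate.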

Personal Application
1. Identify a project. The first step was to find an area, industry, or concept that may not have utilized CBA in the past. I identified small-business cyber security spending as the area to investigate.

2. Define terms. To narrow down the items to investigate, I first determined what the cost-benefit analysis would entail (this is reviewed in the Description section, from the perspective of a single firm). I defined a small business as an organization with fewer than 50 employees. A cyber security incident was defined as "any undesirable event that is a result of an attack against the information system of the business" (Arora, Hall, Pinto, Ramsey, & Telang, 2004, p. 35).

3. Identify types of data. I next researched the types of incidents that fall under cyber security. I found information pertaining to measuring the risk of intrusion, the costs of hardware and software solutions, the maintenance (or update) costs for that hardware and software, and the costs of monitoring installed solutions. To quantify the benefits, I weighed the cost of computers being unusable for an hour/day/week against the number of incidents prevented by the installation of security devices. To obtain this information, I spoke with industry experts who advise on and sell security solutions to small businesses. As an example, a business with a firewall, anti-virus protection, anti-spam protection, and updated hardware/software can expect next to no intrusions into its network; however, eliminating any one of those solutions changes how at-risk the business is to intrusion. I decided to use a risk-based analysis because I could establish different risk levels for different types of business and then tailor the analysis to each. I identified five levels that depend on the type of network and data a business keeps on-site.

Risk Levels
Level 1: No network; no client/customer data, IP, or sensitive documents
Level 2: Networked, with staff data, but no client data, IP, or sensitive documents
Level 3: Networked, with staff data plus either client data or IP; no sensitive documents
Level 4: Networked, with either client data or IP, plus staff data and sensitive documents
Level 5: Networked, with client and staff data, IP, and sensitive documents

4. Quantify data. Because the benefit area consisted of qualitative data, I needed a way to measure it. Using information obtained from experts, journal articles, and technical publications, I was able to identify a 90 percent confidence interval for the benefits. I used the value-of-prevented-incidents formula developed by Sonnenreich et al. to quantify risk exposure:

Risk exposure = ALE = SLE × ARO

where SLE is the single loss expectancy (the cost of one security incident), ARO is the estimated annual rate of occurrence, and ALE is the total annual loss exposure. To determine the cost of an SLE, I used industry data provided by the Computer Security Institute (CSI) and the US Federal Bureau of Investigation (FBI). Again, I used low and high estimates when calculating the proposed benefit of prevented incidents.

Value of Prevented Incidents                 Low       High
Cost of single security incident (SLE)       $300      $500
Estimated annual rate of occurrence (ARO)    12        30
Total annual loss exposure (ALE)             $3,600    $15,000
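The risk-exposure arithmetic behind the table is a single multiplication. The sketch below reproduces the table's low and high figures:

```python
# ALE = SLE * ARO, per the Sonnenreich et al. formula cited above.

def annual_loss_exposure(sle_dollars, aro_count):
    """Annual loss exposure = cost of one incident * expected incidents/year."""
    return sle_dollars * aro_count

low = annual_loss_exposure(300, 12)    # low estimate
high = annual_loss_exposure(500, 30)   # high estimate
print(low, high)  # 3600 15000
```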

The other major qualitative measure was the savings to employee productivity. To determine the cost of lost productivity, I looked at the average number of incidents per organization based on the findings of CSI and the FBI. I then used the number of employees and an average wage to determine the savings from reducing the number and length of down-time periods on a computer network.

Monthly Productivity Savings              Low       High
Employees (count)                         10        30
Reduced hours/month of non-access         5         10
Average wage (dollars)                    55        55
Total monthly productivity savings        $2,750    $16,500

5. Enter costs. The costs associated with any given set of security answers are more easily constructed. The areas identified included: implementation planning, contract labor, internal implementation labor, training costs, opportunity costs, and capital costs/equipment. The labor costs are entered for a specific entity. As an example:

Worksheet - Enter Values in Right Column
Internal labor:
    IT staff cost/hour             $25.00
    Management cost/hour           $65.00
    Other staff cost/hour          $45.00
    Average wage                   $55.00
External labor cost/hour           $90.00
Expected life span (years)         2
Risk level (number)                5

The items entered into the worksheet are then used to calculate labor and other costs based on the risk level of the business (the last item on the list). Each risk level has associated costs and numbers of hours as they relate to each cost area identified above. The worksheet is designed to summarize all associated costs and benefits and then list the suggested hardware solutions for the business's risk level. As an example, the CBA for Level 5, using the data above, computes to:

Calculate Total Monthly Benefit     Low Est     High Est
Monthly benefit                     $6,545      $26,750
Monthly cost                        $7,750      $13,581
Total monthly benefit               $(1,205)    $13,169
Payback (months)                    3           3

The list of suggested hardware for the business includes: modem, switch, firewall/anti-spam, backup, document management, monitoring dashboard, and Internet monitoring. Each risk level has its own hardware suggestions and CBA analysis.

6. Analyze data. By adding risk analysis to the CBA, I was able to determine that a business with a low risk of intrusion, such as a mechanic with only one computer and no client personal data, could invest as little as $1,300 to secure its data infrastructure, plus an additional $100 a month for on-going maintenance. Meanwhile, a business with more risk, such as a manufacturer concerned only with protecting the intellectual property on its network, can invest anywhere from $9,000 in implementation costs and $2,000 a month in on-going costs up to $45,000 for implementation and $6,300 a month on-going.
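The worksheet logic above can be approximated in a few lines. In this hedged sketch, the productivity figures mirror the worked example, while the $45,000 implementation cost is the high-end figure from step 6, used here purely to illustrate a payback calculation:

```python
def monthly_productivity_savings(employees, hours_saved_per_month, avg_wage):
    """Savings from reduced downtime: headcount * hours recovered * wage."""
    return employees * hours_saved_per_month * avg_wage

def payback_months(implementation_cost, monthly_net_benefit):
    """Months to recoup the up-front cost; None when there is no net benefit."""
    if monthly_net_benefit <= 0:
        return None
    return implementation_cost / monthly_net_benefit

low = monthly_productivity_savings(10, 5, 55)     # low estimate: 2,750
high = monthly_productivity_savings(30, 10, 55)   # high estimate: 16,500

net_monthly = 26_750 - 13_581  # high-estimate monthly benefit minus cost
print(round(payback_months(45_000, net_monthly), 1))  # months to break even
```

Note that at the low estimate the net monthly benefit is negative, so no payback period exists; that kind of result should prompt a second look at the inputs.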

Conclusion
The cost-benefit analysis produced results that could be used to inform decisions; however, it did not produce results that are estimative. I needed to include risk analysis in order to obtain enough information to draw conclusions about the optimum level at which a small business should invest.

Additional Resources
http://www.techrepublic.com/downloads/a-projectmanagers-costbenefit-analysis/173615


http://www.compliancesforum.com/it-project-costbenefit-and-risk-analysis-templates
http://www.infotech.com/sem/costbenefit-analysis-tool?_kk=cost%20benefit&_kt=821b259e-18e5-49f1-9cb9-df22e94becc6&gclid=CM6g8aDda8CFSWFQAodGEgXDQ
http://www.oit.umn.edu/project-management/project-toolkit/index.htm
www.pbis.org/common/cms/documents/NewTeam/.../costbenefit.xls
www.dot.state.mn.us/transit/grants/.../Cost_Benefit_Wksht_4.xls
www.panopticinfo.com/docs/CostBenefitAnalysis.xls
www.projects.ed.ac.uk/.../Full.../ProjectCostBenefitWorkbook.xls
www.tc.faa.gov/acf/Cost_Benefit_Template1.xls


Chapter 3: Content Analysis


David Krauza
Description
Content analysis is an analytic modifier that uses "systematic, objective, quantitative analysis of message characteristics" (Neuendorf, 2002, p. 1) to determine the presence of certain words, concepts, themes, phrases, characters, or sentences within a text or set of texts, and quantifies that presence in an objective manner. It entails "a reading of a body of texts, images, and symbolic material, not necessarily from an author's or user's perspective" (Krippendorff, 2004, p. 3). Content analysis can be divided into two categories: conceptual analysis and relational analysis. Conceptual analysis is the traditional form: a concept is chosen, and the analysis involves quantifying and tallying its presence in the text(s). Relational analysis starts the same way, with the identification of a concept, but then attempts to identify semantic relationships in the text. In this form of content analysis, individual concepts carry no meaning on their own; meaning emerges from the relationships among them.

Strengths
Content analysis is unobtrusive. When conducting content analysis, the analyst or researcher does not need to come into contact with the subject being studied; in fact, the target may never know they have been the subject of a study. Consequently, there is very low risk that the target will change their behavior, as can happen with other forms of observation.

Useful in analyzing trends. Content analysis is useful when analyzing historical materials and is good for documenting trends over time.

Harder to trick with denial and deception tactics. Since content analysis documents trends over time, a sudden and large change in behavior is easily noticed. To defeat content analysis, a target would have to incorporate denial and deception tactics into every text it produces and maintain the practice over a long period.

Weaknesses
May not identify motives. Content analysis is a descriptive technique. It produces results that are good at describing what has happened or is happening, but it may not reveal why an event occurred or why those involved behaved as they did.

Limited availability of data. Like all analytic techniques, content analysis is limited by the availability of data. If the target under analysis does not produce sufficient content to analyze, the technique will be of limited benefit.

Vulnerability to bias. Content analysis is subject to various forms of bias, including failing to include relevant texts in the analysis and, in the case of relational content analysis, purposely miscoding text to arrive at a different meaning.

How-To
1. Determine the research question. The research question is the theory or perspective the analyst wishes to examine. Content analysis research questions differ from scientific hypotheses, which are "pitted against direct observational evidence" (Krippendorff, 2004, p. 31); answers to content analysis questions are inferences drawn from the text.

2. Choose appropriate tools. To perform content analysis effectively, software tools are necessary (see the Additional Resources section for suggestions). The software can automate the mundane tasks of tallying word usage and identifying patterns in coded texts. The analyst must choose the software that best answers the question at hand.

3. Obtain the necessary texts. There is a wealth of publicly available text online. Analysts can use public sources such as the Securities and Exchange Commission, media archives, and specialized websites such as Google Finance or SeekingAlpha.com. Transcripts of Congressional testimony also offer an excellent source of content; their advantage is that speakers must answer questions truthfully under penalty of perjury. Many news websites offer archives of interviews conducted by their staff. While a transcript is not always available (e.g., only video of the interview may be offered), such archives provide another place to acquire content. If cost is not a problem, there are also paid services where content can be found, including Reuters, Bloomberg, and Lexis-Nexis.

4. Analyze the texts. This is one of the easier tasks in content analysis. Since the software tools perform the raw analysis, the analyst only has to stage the data so that the tool can read the texts and produce output. This step will likely be repeated several times as new texts are discovered or anomalies in the data are corrected.

5. Interpret the results. Once the software results are available, the analyst must attempt to infer meaning from the texts. Krippendorff (2004, p. 219) recommends a statistical analysis of the results to arrive at an answer to the original research question. Another way to draw conclusions is to examine the report of word usage and reason directly from it.

6. Report the results. Raw content analysis output can overwhelm any decision-maker. It is necessary to present the results in a user-friendly format, including graphs and charts that can be easily understood by the intended audience.
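For the conceptual model, the "raw analysis" in step 4 reduces to counting concept occurrences across texts. Dedicated software does far more, but a minimal sketch (concept words and sample texts are invented) looks like this:

```python
import re
from collections import Counter

concepts = {"growth", "risk", "we", "i"}  # hypothetical concept words

def tally_concepts(texts, concepts):
    """Count occurrences of each concept word across all texts."""
    counts = Counter()
    for text in texts:
        # Lowercase and split into word tokens before matching.
        for word in re.findall(r"[a-z']+", text.lower()):
            if word in concepts:
                counts[word] += 1
    return counts

texts = [
    "We expect strong growth this quarter, though I see some risk.",
    "Growth remains our focus; we are managing risk carefully.",
]
print(tally_concepts(texts, concepts))
```

Tallying per document, rather than pooling all texts, would support the trend-over-time analysis described under Strengths.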

Personal Application
1. Identify the concept. The first step was to choose an industry to analyze. I chose the technology industry due to my familiarity with it. Within the industry I selected Research in Motion (RIM), Apple (AAPL), Google (GOOG), and Nokia (NOK) because they are generally considered to be on different trajectories: AAPL and GOOG on the ascendancy, while RIM and NOK are generally considered to have seen better days. The concept I wished to investigate was whether public commentary by company management could provide leading indicators of operating earnings performance, as measured by earnings before interest and taxes (EBIT).

2. Determine the content analysis method. After settling on the concept, I needed to decide whether to use the conceptual or the relational method. I chose the conceptual model (see the Description section for more detail) because it is a straightforward method with a substantial body of research behind it.

3. Choose content analysis software. I examined several software packages. The first was QDA Miner from Provalis Research, a mixed-methods qualitative data analysis package for coding, annotating, retrieving, and analyzing collections of documents; it can also be used to code transcripts, legal documents, journal articles, and books. Its coding and annotating features mean it is geared more toward relational content analysis. The package did support the conceptual model through its text-mining feature, but that feature required a dictionary not provided with the tool, and given my 10-week time frame, creating my own dictionary of financial indicator terms was impractical. This limitation essentially ruled out QDA Miner. Another package I investigated was MaxQDA from VERBI GmbH. Its functionality is very similar to QDA Miner's, but its text-mining feature is not part of the base product and requires an additional licensing fee, so MaxQDA was eliminated for the same reason. The final package I examined was Linguistic Inquiry and Word Count (LIWC), designed by James Pennebaker of the University of Texas. LIWC is based on Pennebaker's research into the use of pronouns and function words to determine an author's or speaker's motivation and intentions; the software calculates the degree to which different word categories are used in source texts. Pennebaker's (2011) theory and software fit the conceptual content analysis model I wished to follow, so LIWC became the software for my analysis.

4. Find source texts. I collected source texts from Seeking Alpha, which maintains transcripts of investor conference calls. I searched for transcripts from 2007 to the present (2012), extracted the text of each call for RIM, AAPL, GOOG, and NOK, and saved them as MS Word files; in all, I downloaded 120 transcripts across the four companies. I had also downloaded the Management Discussion and Analysis (MD&A) sections from the companies' SEC annual 10-K filings; however, the MD&A language was very boilerplate and showed very little variation in pronoun usage, and I was concerned it was skewing my results.

5. Execute. Once satisfied with the number of texts for each company, I began my analysis using LIWC. The LIWC output counts words in several categories; specifically, it identified word usage commonly associated with positive and negative emotion, and uses of first-person singular and plural pronouns. First-person pronoun use is significant because, according to Pennebaker, increased use of the first-person singular ("I") indicates a depressed state of mind and a lack of confidence, while first-person plural ("we") indicates confidence and a feeling of superiority. With the LIWC output I graphed word usage for the companies. The graphs were intended to help identify general trends or long-term patterns; they also made the data easy to present to viewers and, eventually, a decision-maker.

Figure 1: Apple, Inc. Negative Emotion compared with Operating Profit Margin

Once I had the LIWC output, I attempted to correlate word usage with each company's EBIT to identify any leading relationship. I also ran a regression in SPSS on the LIWC results and the companies' reported results to test for statistical significance.

Figure 2: Scatter Plot showing the correlation of Operating Revenue with First Person Singular (I) Pronouns


Conclusion
The conceptual content analysis produced results that are interesting; however, they are not estimative. Had the scope of my analysis been broadened to include more aspects of relational content analysis, incorporated a larger set of documents and transcripts, and covered a longer time period, content analysis would likely have gotten me closer to an estimative outcome than I was able to achieve over the 10 weeks of the project.

Additional Resources
http://secretlifeofpronouns.com/
http://www.liwc.net/
http://www.provalisresearch.com/
http://www.maxqda.com/
http://www01.ibm.com/software/analytics/spss/products/statistics/stats-standard/
http://writing.colostate.edu/guides/research/content/
http://academic.csuohio.edu/kneuendorf/content/


Chapter 4: Delphi Method


Puru Naidu
Description
The Delphi Method is a tool that uses the intuitive opinions, ideas, and thoughts of a group of experts to forecast and estimate events and trends and to support decision-making. It is based on the principle that forecasts from a structured group of individuals are more accurate than those from unstructured groups (Rowe and Wright, 2001). The method can be used to gain consensus on future trends and projections through a systematic process of communication and information gathering (Yousuf, 2007).

Strengths
Flexibility. The biggest strength of Delphi is its flexibility with respect to participants' locations, time, and cost. Participants can be geographically dispersed while taking part in the study; the study can be conducted over days or weeks and is not time sensitive; and it can be run using tools available on the Internet, making it very cost effective.
Anonymity. The method requires that participants remain anonymous to one another. This not only gives an equal opportunity to participants who might not voice their ideas and opinions during a live group discussion, but also prevents a participant's social or career status from influencing other members' projections.
Simplicity. The method is very simple and easy to comprehend. Participants can be briefed on the study in just a few sentences and will have a good understanding of its purpose and process.

Weaknesses
Facilitator. The method requires a facilitator who constructs, reconstructs, and analyzes the questionnaires. The facilitator's limited knowledge and biases can influence the forecast.
Does not work in all circumstances. There is no statistical data demonstrating the effectiveness of the Delphi method, and it does not necessarily work for short-range forecasting.
Participant interaction. The study is only effective when there are in-depth opinions and substantial interaction. It may require more than two rounds of questionnaires to reach a precise forecast.

How-To
1. Identify the topic and resources. The Delphi method is a forecasting tool, so the topic should align with that purpose. The organization or individual conducting the study should identify the facilitator and the channel of communication that will facilitate it.
2. Identify the participants. Once the topic and resources are established, the facilitator needs to find participants with the required expertise in the topic.
3. Initiate the first round. Send out the first questionnaire. It should be carefully constructed without any biases, stay relevant to the topic of study, and include a basic description of the study.
4. Analyze results. After the first round, the facilitator should analyze all the responses and use them to construct the second questionnaire.

5. Second round. Send out the second questionnaire. It should include the views and opinions of all participants from the first round for the other participants to review. Note: the facilitator can use more than two rounds of questionnaires until the study reaches a desired state.
6. Analyze and report. Following the last round, the facilitator should analyze the responses and state the findings of the study.
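The analysis step between rounds can be sketched in a few lines. A common Delphi feedback statistic is the panel median with the middle 50% of estimates (the interquartile range); the panel data below are invented for illustration.

```python
import statistics

def round_feedback(estimates):
    """Summarize one Delphi round for a numeric question: the median plus
    the middle 50% of estimates, a typical feedback summary shown to
    panelists before the next round."""
    ordered = sorted(estimates)
    q1, q2, q3 = statistics.quantiles(ordered, n=4)
    return {"median": q2, "iqr": (q1, q3)}

# Hypothetical panel of seven estimating total goals in a match, two rounds;
# opinions tend to converge after panelists see the round-one feedback.
round1 = [1, 2, 2, 3, 5, 2, 4]
round2 = [2, 2, 3, 3, 2, 3, 2]
print(round_feedback(round1))
print(round_feedback(round2))
```

A shrinking interquartile range between rounds is one simple signal that the panel is approaching the consensus the facilitator is looking for.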

Personal Application
1. Identify the topic and resources. I chose to apply the Delphi method to forecast the outcomes of two events: the Premier League soccer match between Arsenal and Chelsea on 21 April 2012, and the second round of the 2012 French presidential election. My most feasible resources for the study were Google Docs and school email.
2. Identify the participants. The participants for the Premier League topic were soccer enthusiasts who followed the sport regularly and were once players themselves. The participants for the French election topic were diverse but drawn from the college environment: students who had taken a political theory class, students from Western Europe, and students who followed European news closely. Participants were approached individually and made aware of the study.
3. Initiate the first round. Using Google Docs, I constructed the first questionnaire, emailed participants the questionnaire links, and gave them a time frame to respond. For the soccer study, my forecast question was "Who will win the Premier League match between Arsenal and Chelsea scheduled for 21 April 2012?" For the French election study, it was "Who will win the next French presidential election?" I gave participants all the possible outcomes to choose from and asked them to explain why they chose their answer, along with their confidence level in their prediction.
4. Analyze results. After the first round, I analyzed all the responses and collated the different reasons participants gave for their predictions. In the soccer study, most participants predicted Arsenal to win because of its good team form and home-field advantage; other participants cited factors that would lead Chelsea to win. In the French election study, all the participants chose Hollande to win, citing poll standings and recent news reports indicating a Hollande victory.
5. Second round. Following the analysis of the first round, I constructed the second questionnaire. I listed all the views and opinions and asked participants to rate their relevance to the outcome of the event.
6. Analyze and report. After analyzing the outcomes of both studies, I am inclined to state that the Delphi method does not always work, and that it can also be used with participants who are not experts in the research topic. In the soccer study, all participants were experts in the topic but failed to predict the correct outcome; nevertheless, there was some consensus among participants in their ratings of the relevance of the views and opinions listed in the second questionnaire. In the French election study, the participants were not experts in the topic, yet they possessed the ability to analyze the event in depth. All of them predicted the correct outcome starting from the first questionnaire and did not change their views.
7. Report results. I presented my analysis and findings to another Delphi method enthusiast, Mark Burgman, a professor at the University of Melbourne, Australia.

Conclusion
The Delphi Method is a simple forecasting tool that uses expert opinions and views to forecast the outcome of an event. Its strengths include its simplicity, its cost effectiveness, and the ability of geographically dispersed participants to take part in the study. However, in my personal application I found that it does not work in all circumstances. Of the two study groups, the soccer group, whose members were experts in the topic, was unable to predict the correct outcome, while the French election group, despite its lack of expertise, predicted the right one. Qualitative studies assert the method's effectiveness, but there is no quantitative data to prove it, so more research is needed.


Chapter 5: Game Theory


Shawn Ruminski
Description
In general, game theory is a branch of reasoning commonly used in economics and political theory. It is best used to understand the interactions between decision-makers. The traditional applications of game theory involve modeling situations in such a way that they represent a simplification of reality; they are "an abstraction we use to understand our observations and experiences" (Osborne, 2000). More specifically, games have some basic requirements:
- There are two or more players
- There is some choice of action where strategy matters
- There is at least one outcome, leading to a winner and a loser
- The outcome depends on strategic interaction between the players (Duffy, 2012)

Game theory, at least in the iteration studied for the purposes of this course, involves two actors with finite choices. These choices have tangible, measurable consequences. The actors are rational, meaning that they actively seek to maximize their payoffs. The game being modeled is most often set up as a matrix, with the payoffs for each decision laid out in the grid for each actor; given the choice between two payoffs, an actor will pick the higher number. The most common example of game theory is the Prisoner's Dilemma, an excellent model commonly used in law enforcement. It shows why two individuals might not cooperate even when it appears to be in their best interest to do so. The classic description of the Prisoner's Dilemma follows:

Two men are arrested, but the police do not possess enough information for a conviction. After separating the two men, the police offer both a similar deal: if one testifies against his partner (confesses) and the other remains silent (denies), the confessor gets a reduced punishment and the uncooperative man receives the largest sentence. If both remain silent, both are sentenced to only one month in jail on a minor charge. If each "rats out" the other, each receives a three-month sentence. Each prisoner must choose either to betray or to remain silent, and the decision of each is kept quiet. In this example, the socially preferred course of action is for both to deny; however, each man is incentivized to confess, because his individual payoff increases with confession.

Game theory can be very useful for analyzing and predicting strategic interactions between actors, but its efficacy is limited by the accuracy of the model. For this reason, the applications of game theory are limited. Shubik suggested that while game theory may be applicable to actual games (such as backgammon or chess), and may even be useful for constructing a model to approximate an economic structure such as a market, it is much harder to "trap the subtleties of a family quarrel or an international treaty bargaining session" (Shubik, 1975). In many situations, actors are unable to perfectly discern their environment, or their goals shift over time; these cases, in particular, are difficult to account for in game theory. A study by Green found that game-theoretic experts' forecasts were correct only 32 percent of the time, compared with 34 percent for unaided judgment by experts and 62 percent for simulated interaction (role-playing) by novices (Green, 2003). In the proper situations, however, game theory is very useful for forecasting the interactions between rational actors. Often, the simpler games offer more accurate forecasts. The key factor in its effectiveness, though, is rational choice (Osborne, 2000): humans do not always act rationally, and this puts game theory at a significant disadvantage.
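The Prisoner's Dilemma above can be checked mechanically. The sketch below encodes the sentences from the story as a payoff matrix (jail time, so smaller is better) and verifies that confessing is each prisoner's best response no matter what his partner does. The 12-month figure for the "largest sentence" and the 0-month "reduced punishment" are assumptions for illustration; the story only says reduced and largest.

```python
# Payoffs as months in jail, indexed by (row_action, col_action).
# Actions: 0 = deny (stay silent), 1 = confess. The 0- and 12-month
# values are assumed; the 1- and 3-month values come from the story.
JAIL = {
    (0, 0): (1, 1),    # both deny: one month each
    (1, 1): (3, 3),    # both confess: three months each
    (1, 0): (0, 12),   # row confesses alone: he walks, partner gets the most
    (0, 1): (12, 0),   # column confesses alone
}

def best_response(options, opponent_action, player):
    """Pick the action minimizing this player's jail time, holding the
    opponent's action fixed."""
    def months(action):
        profile = (action, opponent_action) if player == 0 else (opponent_action, action)
        return JAIL[profile][player]
    return min(options, key=months)

# Confessing is a dominant strategy for both players.
assert all(best_response((0, 1), opp, p) == 1 for opp in (0, 1) for p in (0, 1))
print("Equilibrium: both confess,", JAIL[(1, 1)], "- worse for both than", JAIL[(0, 0)])
```

The assertion captures the dilemma: both players' dominant strategies lead to (3, 3), even though (1, 1) is better for each of them.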

Strengths
Focus on the actions of individuals. Rather than being distracted by temporal elements of the situation, game theory distills it to the essentials: the set of actions available to each actor.
Helps counter judgmental biases. By examining the scenario and identifying the most significant variables, the analyst will often counter cognitive biases held about the situation as a whole (Green, 2003); breaking the scenario into its respective parts interrupts these biases.
Useful with undisputed assumptions. When the basic facts of the environment, as translated into the model, are generally agreed upon, game theory tends to be very successful in exhibiting the qualities of the actual scenario.
Applicable to a variety of fields. Game theory has proven effective when applied to military problems and economics. It has also been useful, although to a lesser extent, in law, ethics, sociology, biology, and classic parlor games (Martin, 1978).

Weaknesses
Humans do not always act rationally. It is difficult to model the impact of subjective influences (such as marketing or advertising in economics).
Quantifying payoffs is difficult. In complex political models and other qualitative situations, subtle differences in the payoffs may have a significant impact on the actors' optimum strategies.
Difficult to apply to complex situations. Game theory is not very effective at modeling complex situations with many actors. Even with few actors, the modeling often restricts the number of possible actions, since each action must be quantified and analyzed.
Often constricts the number of options for actors. Modeling real-world environments involves simplifying the variables, which often means analyzing only the most plausible actions. Although in reality actors could conceivably do any number of different things, in game theory this is not the case.

How To
1. Isolate the situation to be modeled. The situation must include two or more actors, strategic choices, and some possible outcomes.
2. Model the situation. The environment must be simplified and distilled into its most relevant variables.
3. Identify actors. For the purposes of this investigation, the number of actors was limited to two.
4. Identify the set of actions available to each actor. This may constrict the options available to decision makers.

5. Assign payoffs to each actor for every possible action. This most often takes the form of a matrix (simultaneous games) or a game tree (sequential games).
6. Forecast the probable actions of each actor. This is based on the rational choices they will make in the game and will involve some mathematical calculation of mixed strategies, such as a Nash equilibrium or another technique (Osborne, 2000).
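Step 6's mixed-strategy calculation can be sketched for the simplest case: a two-player, zero-sum 2x2 game with no saddle point. Row mixes so that Column is indifferent between her two actions; solving the indifference equation gives Row's equilibrium probability in closed form. This is a narrow special case, not a general Nash equilibrium solver.

```python
def mixed_equilibrium_2x2(a, b, c, d):
    """Row's payoff matrix in a zero-sum 2x2 game:
             col L   col R
    row T      a       b
    row B      c       d
    Returns p, the probability Row plays T, chosen so Column is
    indifferent: p*a + (1-p)*c == p*b + (1-p)*d. Valid only when the
    game has no saddle point (the denominator is nonzero).
    """
    return (d - c) / ((a - b) + (d - c))

# Matching pennies: Row wins 1 on a match, loses 1 on a mismatch.
p = mixed_equilibrium_2x2(1, -1, -1, 1)
print(p)  # 0.5: Row should randomize evenly
```

The symmetry of matching pennies makes the 50/50 answer intuitive; with asymmetric payoffs (as in the basketball scenario below, where shot values and shooting percentages differ) the same indifference logic yields uneven thresholds.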

Personal Application
1. Isolate the situation to be modeled. I chose to examine specific situations in a basketball game from a coaching standpoint. I am familiar with the game of basketball, which has many aspects that translate well to game-theoretic analysis. I specifically identified dealing with foul trouble, and an end-of-game scenario in which a team with the ball is down two points near the end of regulation.
2. Model the situation. Applying game theory to foul trouble was much more difficult than to the second situation, because it was hard to model the actions of the coach against the actions of the player. In the second scenario, I simplified the end of the game to two actors with two possible actions each.
3. Identify actors. In the first situation, the actors were the coach and the team's starter or bench player. In the second, the actors were the two coaches involved in the game.
4. Identify the set of actions available to each actor. In the first situation, the coach's actions are benching or playing a starter in foul trouble, and the player's actions were his performance on the court, measured by a statistic called Wins Produced per 48 minutes (WP48). In the second situation, the coach of the team with the lead (the defending coach) could choose to defend a two-point shot or a three-point shot, while the coach of the losing team (the offensive coach) could choose to shoot a two-point shot or a three-point shot.
5. Assign payoffs to each actor for every possible action. In the first situation, payoffs were assigned based on the calculated probability of fouling out, coupled with the WP48 statistic. In the second, payoffs for shooting or defending each shot were based on shooting percentages from this past year in the NBA, along with studies of open shooting percentages and the effect of good defense on shooting percentages.
6. Forecast the probable actions of each actor. In the first situation, the coach should let his starters play in spite of foul trouble; however, I provided data showing that this is often not what happens. In the second situation, the analysis shows that it is likely in the best interests of the losing team to shoot the three almost all the time. As long as the defending (winning) team guards the three-pointer less than about 80% of the time, the losing team should seek to end the game in regulation every time. Similarly, the team that is ahead should fear the three-pointer much more than overtime: as long as the losing team shoots the three at least a third of the time, the defending team should always defend the three.

Conclusion
Game theory is an excellent way to evaluate actors' most effective strategies. However, an effective application requires a situation that can be modeled accurately, including a precise evaluation of payoffs, in order to get the most out of game theory as an intelligence methodology. Furthermore, it is important to assess the target actors' own evaluation of payoffs rather than applying an arbitrary one. Given applicable situations and accurate models, though, game theory lends itself well to intelligence analysis and forecasting.


Chapter 6: Comparative News Frame Analysis


Emily Slegel
Description
Comparative News Frame Analysis is an analytic modifier that uses quantitative and qualitative analysis of the words in texts to understand how one specific frame is presented across newspapers and media outlets in different cultures and countries. Frames are how an individual cognitively comprehends and files events, and the news can provide a pseudo-environment for readers, thus framing an event for the public. The frames adopted by media to cover terrorism, and those adopted by governments to report and respond to it, have the power to influence society's perception of terrorist activity. Comparative News Frame Analysis identifies one specific frame, thereby revealing the perception of terrorist activity, for example, within a country or culture, and compares it to other frames to understand the current situation. The technique can be applied to various issues and events in history, from crises to the adoption of governmental policies. It is reliable up to a point, but is limited by the analyst's understanding of the topic and the technique. The method is fairly flexible and useful even with a small, but sufficient, amount of data.

Within Comparative News Frame Analysis, there are a variety of qualitative and quantitative techniques with which to compare frames in newspapers. One technique is discourse analysis, which seeks to understand links between texts by identifying particular frames. This is done by reading the text several times and marking certain contextual items. A quantitative technique for comparative news frame analysis is centering resonance analysis, which calculates a word's influence within a text from its position in the text's structure of words. Once influential words' scores are calculated, the results indicate the author's intentional choices of wording and message meaning, and the analyst can draw conclusions from these messages to determine the frame.

Strengths
Can use a small data set to draw conclusions. Comparative News Frame Analysis does not require a large data set in order to draw conclusions, is very useful with textual data, and places no limit on the number of news sources examined.
Easily understood. Thanks to previous literature and academic research, the method is well documented and easy to understand.
Can be used in a variety of languages. Native speakers of non-English languages can use this technique.
Any news source and any topic can be analyzed.
Results are easily communicated to decision makers. Using quantitative techniques, the method is easy to complete with the aid of software.

Weaknesses
Can only examine one frame/event at a time. The method does not allow more than one event to be examined at a time.
More robust results require more data. Some articles do not contain relevant textual items to examine, and if few articles are available, robust results may not be possible. It may also be challenging to find articles on one frame from different areas.
Conclusions require an experienced and knowledgeable analyst. Once the data is produced, the analyst must draw conclusions about the frames based upon experience, knowledge, and previous literature.
Susceptible to denial and deception. The analysis relies on what is published in the media, so denial and deception tactics can skew the results.

How-To
1. Pick a problem or issue to examine that can be seen across cultures or countries.
2. Research writings on the problem or issue from that region, country, or culture, including local newspaper articles, national newspaper articles, and blog sources.
3. Pick articles on the topic.
4. Determine qualitative, quantitative, or mixed techniques for the analysis:
a. Qualitative technique, discourse analysis: articles are read over to identify frames, which include conflict, human interest, economic consequences, morality, and responsibility. These frames are identified by symbols that carry specific attitudes and positions, including metaphors, exemplars, catch phrases, depictions, visual images, and appeals to principle. Examine the percentages of favorable, neutral, and unfavorable terms.

b. Quantitative technique, centering resonance analysis: I used Crawdad, software based on the concept of CRA. CRA analyzes text by creating word networks of nouns and noun phrases that represent main concepts, their influence, and their relationships. Simply input the .txt file into the program and convert it to the Crawdad-specific format (.cra); from there the software completes all the calculations on the desired articles individually. Crawdad calculates two scores for each article, influence and resonance. The higher the scores, the more influence and betweenness centrality a word has within that article. Influence scores range from 0 to 1; a score of 0.05 or higher is considered significant by leading researchers in the area, and a score above 0.1 is considered very significant.
5. Once results are achieved, determine the trends and draw conclusions.
6. Use Excel graphs and other visualizations to present the results to the decision maker.
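The network idea behind CRA can be illustrated with a crude sketch. The code below links words that co-occur in a sentence and scores each word by its share of links (degree centrality). This is only a rough proxy: real CRA extracts noun phrases via linguistic centering and scores words by betweenness centrality, and the stopword list here is a small hypothetical one.

```python
import itertools
import re
from collections import defaultdict

# Hypothetical minimal stopword list for illustration
STOPWORDS = {"the", "a", "an", "of", "in", "on", "and", "to", "was", "were"}

def influence_scores(sentences):
    """Crude proxy for CRA influence: link words that appear in the same
    sentence, then score each word by its normalized degree in the
    resulting network (real CRA uses betweenness centrality on a
    noun-phrase network built by linguistic centering)."""
    edges = defaultdict(set)
    for sentence in sentences:
        words = [w for w in re.findall(r"[a-z]+", sentence.lower())
                 if w not in STOPWORDS]
        for u, v in itertools.combinations(set(words), 2):
            edges[u].add(v)
            edges[v].add(u)
    n = len(edges)
    return {w: len(nbrs) / (n - 1) for w, nbrs in edges.items()}

# Two invented headline-style sentences
sentences = ["The bombing struck Madrid trains.",
             "Madrid mourned victims of the bombing."]
scores = influence_scores(sentences)
print(sorted(scores, key=scores.get, reverse=True)[:2])
```

Even in this toy example, the words that bridge both sentences ("bombing", "Madrid") score highest, which is the intuition behind treating structurally central words as carriers of the frame.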

Personal Application
1. Identify concept. The first step was to choose a terrorist attack/event. I had to choose an event for which each article would be long enough, in terms of words, to compare, and which had a variety of international news coverage. I chose the Madrid train bombing because of its international coverage and because the articles had enough words for comparison. The challenge was that a single article was often repeated across many news sources, so finding original coverage of the attack was difficult; using the Google News search engine and LexisNexis I was able to find only six locally written news articles. The concept I wished to investigate was whether the Madrid train bombing was viewed differently in different countries and regions, in order to understand the perception of terrorism in those countries.
2. Determine Comparative News Frame Analysis. Once I picked the event, I researched the varieties of news frame analysis available to determine the appropriate method. I began with strategic frame analysis, which analyzes news by looking at the subtle selection of certain aspects of an issue in order to frame the event. This would not work for my topic, since I wanted to analyze different countries' or regions' perceptions of the Madrid train bombing by analyzing the frames in the news. I then discovered a variant, Comparative News Frame Analysis, which looks at one particular topic and compares the frames presented in news articles from various areas of the world. I decided to use this method because there is a large body of research applying it to similar topics.
3. Research frames and framing. After determining my method, I researched what frames are and what they do. This was a lengthy process due to the vast amount of information available. There are also different academic views of frames and framing; psychology and sociology, for example, hold two different views, and I had to determine which definition of frames to use when applying Comparative News Frame Analysis. It is important to note that an analyst needs strong background knowledge of frames and framing to fully understand the results of the analysis. I read the book Psychology of Prejudice to understand prejudice and stereotyping, along with other articles on frames and framing.
4. Choose techniques. There are a variety of ways to conduct comparative news frame analysis, but based upon previous research, which focused on news coverage of other terrorist events, I followed the same path of using discourse analysis, a qualitative technique, and centering resonance analysis, a quantitative technique.
5. Execution. When applied to my six newspaper articles, discourse analysis appeared easily flawed and did not prove to be an effective way to analyze the newspapers. I have concluded that for the qualitative analysis, one needs a very good ability to identify textual items, including metaphors, exemplars, catch phrases, and depictions, and beyond simply identifying these items the analyst must also know what they mean. This could easily cause problems for the everyday analyst relying on this one method of analysis. I also noted that some articles did not contain some of the textual items to examine, which I had expected but which was a challenge nonetheless.


I chose centering resonance analysis as the quantitative technique because the other types of quantitative analysis used in other papers seemed to focus on the content itself, not on the relationships the words have within the articles, which help the analyst understand the frame of a newspaper article. This approach was used in a study of another act of terrorism comparing US and UK newspapers. I used the centering resonance software by Crawdad. The software was easy to use, though the analyst must draw the conclusions, which can be subjective because they are based upon experience, knowledge, and previous literature. Crawdad also made it very easy to visually understand the differences between the newspapers when looking at the links and relationships between the words.

After reading all the results for the different articles, I made a quick and simple graph, which enabled the results to be easily understood.


Conclusion
The results of the quantitative and qualitative techniques are replicable, but the method allows analysts to introduce bias into the conclusions. Comparative News Frame Analysis can help an analyst further his or her research in an area or topic, but its results do not by themselves produce an intelligence estimate. The method was easy to complete with the aid of software, and because it uses little technical vocabulary and is theoretically easy to understand, it can help an analyst communicate with the decision maker. Finally, I was able to answer my question about how different countries perceive terrorism, but I would need more data for more robust results.

Further Information
Crawdad Technologies, L. (2005). Crawdad Text Analysis System version 1.2. Chandler, AZ. http://www.crawdadtech.com/
Iyengar, S., & Simon, A. (1993). News coverage of the Gulf crisis and public opinion: A study of agenda-setting, priming, and framing. Communication Research, 20(3), 365-383. Retrieved from http://crx.sagepub.com/content/20/3/365.short
Koenig, T. (2006). Compounding mixed-methods problems in frame analysis through comparative research. Qualitative Research, 6(1), 61-76. Retrieved from http://qrj.sagepub.com/content/6/1/61.full.pdf
Papacharissi, Z., & Oliveira, M. (2008). News frames terrorism: A comparative analysis of frames employed in terrorism coverage in U.S. and U.K. newspapers. The International Journal of Press/Politics, 13(1), 52-74. Retrieved from http://hij.sagepub.com/content/13/1/52.full.pdf+html
Zanna, M. P., & Olson, J. M. (1994). The Psychology of Prejudice. Lawrence Erlbaum.
Zhang, J., & Fahmy, S. (2009). Colored revolutions in colored lenses: A comparative analysis of U.S. and Russian press coverage of political movements in Ukraine, Belarus, and Uzbekistan. International Journal of Communication, 3, 517-539. Retrieved from http://ijoc.org/ojs/index.php/ijoc/article/download/253/327


Bibliography
Arora, A., Hall, D., Pinto, C. A., Ramsey, D., & Telang, R. (2004). Measuring the risk-based value of IT security solutions. IT Pro, 35-42.
Duffy, J. (2012, January 10). Elements of a game, thinking strategically. Econ 1200 lecture. University of Pittsburgh.
Green, K. (2003). Forecasting Decisions in Conflict: Analogy, Game Theory, Unaided Judgement and Simulation Compared. Victoria University of Wellington.
Hubbard, D. W. (2010). How to Measure Anything. Hoboken, NJ: John Wiley & Sons.
Krippendorff, K. (2004). Content Analysis: An Introduction to Its Methodology. Thousand Oaks, CA: Sage Publications, Inc.
Martin, B. (1978). The selective usefulness of game theory. Social Studies of Science, 85-110.
Mishan, E., & Quah, E. (2007). Cost Benefit Analysis. New York, NY: Taylor & Francis e-Library.
Neuendorf, K. A. (2002). The Content Analysis Guidebook. Thousand Oaks, CA: Sage Publications, Inc.
Osborne, M. (2000). An Introduction to Game Theory. Oxford: Oxford University Press.
Pennebaker, J. W. (2011). The Secret Life of Pronouns: What Our Words Say About Us. New York, NY: Bloomsbury Press.
Robinson, R. (1993). Economic evaluation and health care: Cost-benefit analysis. BMJ, 924-926.
Shubik, M. (1975). Games for Society, Business and War. Amsterdam: Elsevier.
Sonnenreich, W., Albanese, J., & Stout, B. (2006). Return on Security Investment (ROSI): A practical quantitative model. Journal of Research and Practice in Information Technology, 55-66.

