
Table of Contents

More-Than-A-Medium
Internet-Studies-and-Online-Journalism
Metadisciplinary
International
Activism
Communication-Rights
Internet-Studies
General-Digital-Theory
Internet-Infrastructure
Internet-Governance
Stakeholder-Issues
Jurisdictional-Issues
Filtering-&-Information-Control
Search
Deep-packet-Inspection
Cloud-Computing
Privacy
Archivability
Online-Journalism
Early-Online-Journalism-Research
Canadian-Journalism-Scholarship
Public-Sphere
General-Online-Journalism-Theory
Political-Economy
Labor
Collaboration-and-Participation
Community
Vertical-Integration

More-Than-A-Medium

At the 2010 meeting of the Canadian Association of Journalists, Globe and Mail publisher John Stackhouse announced his newspaper would be redefining the roles of the paper's print and digital editions. The Globe and Mail's print edition had always been seen as the "paper of record"; now, this role would gradually shift online, while the physical newspaper would embrace the style and tone of a news magazine. As Stackhouse explained, the Globe's efforts to produce a print product distinct from the online paper were a move to preserve print ad revenue despite a shift towards increased online production. But in moving some of the paper's civic function online -- making the online edition Canada's "paper of record" -- the Globe highlighted an aspect of the online transition that has so far eluded the attention of scholars and professionals in Canada. What are the implications of using Internet infrastructure as a backbone for the delivery of the news? What are the specific consequences in Canada when the Fourth Estate moves online, ceding control over its method of distribution to an environment shaped by the commercial interests of online service providers, emerging regulatory structures, evolving technical protocols, and new legal regimes?

My project explores this question in detail. I take as my starting point the contention that the physical structure of the Internet, and the legal, political and technical questions attendant on this infrastructure, have been of too little interest to journalism scholars. Thus, I wish to determine how aspects of Internet infrastructure, policy, and governance -- including filtering, search, throttling, content inspection, vertical integration of broadband providers, global internet governance concerns, the increasing integration of cloud computing, privacy concerns and archivability -- will affect Canadian online journalism. This project could not be more timely, as it comes at a moment when the accelerating pace of change in Canadian media has outpaced the ability of Canadian journalism scholars to take stock of what such change might entail. To date, there has been limited study of online journalism in Canada. Elsewhere, journalism scholars point to the destabilizing effects of the Internet, yet rarely interrogate the structure of the medium that produces such effects. Especially when online print media are discussed, an emphasis on production and consumption overshadows any accounting of the materiality of transmission. My intention is to show that if Canadian journalism is to preserve its civic function, it must come to terms with the material and technical heterogeneity of the Internet, and understand the medium as a site of struggle between globalizing forces and local realities.

My research will span two fields of inquiry: Journalism Studies and Internet Studies. I plan to undertake a critical review of scholarship in these fields, using insights from Internet Studies to reframe the conversation about Canadian online journalism. I will also review Canadian industry reports and policy papers focused on digital media and/or the news industry, in order to identify key areas of concern for Canadian journalism. I will report my findings directly to professional and policy audiences, with the aim of producing awareness of critical issues in Internet structure and governance.
My goal here is threefold: first, I plan to expand the study of online journalism in Canada; second, I intend to provide resources for those shaping the future of Canadian journalism in the online space; and third, I hope thereby to ensure that the civic function of journalism in Canada is not compromised by inattention to the material realities of the Internet.

Internet-Studies-and-Online-Journalism

Metadisciplinary

1. Sterne, Jonathan. Digital Media and Disciplinarity. The Information Society: An International Journal 21.4 (2005): 249. Summary: In this paper Sterne draws on writings by Foucault and Bourdieu to argue that digital media studies has not yet constituted a truly novel scholarly discourse. Therefore, the reasons for disciplinizing the field would be largely strategic, and the symbolic benefits of becoming a discipline would be limited and carry significant intellectual costs. Sterne argues that digital media studies has not, as a field, constituted a new object or defined a set of objects and methods in terms that would make it a discipline; a certain looseness characterizes the field. Sterne suggests that it is necessary to question the assumed separation between digital media and every other everyday technology that is not considered communication or information technology. Thus it is important to question the possibility of digital media studies, as with Internet studies, insofar as the field assumes its object matters because it is a communication technology and not some other kind of technology. Sterne suggests that a serious discussion of disciplinarity should be grounded in these issues. He also argues that the intellectual justification for disciplinarity changes depending on whether the argument is made from the humanities or the social sciences. The author asserts that disciplinization as a strategic matter deals with legitimate concerns, which on an individual level intensify as the experience and prestige of the individual scholar declines. Sterne concludes that the best hope for the field lies in its ability to address and reframe big questions that cut across all the human sciences, and that disciplinarity is just an institutional promise for the field. Key quotes: If we consider digital media studies as an intellectual enterprise, then I believe it entirely fair to say that the field is not moving toward disciplinarity (...) There are canons of digital media research, but not proper or coherent subfields (p.251). Presently, we are more likely to apply the methods and theories of other fields to our objects than to make a unique contribution to the humanities and social sciences (p.251). By assuming that the most important technologies are the ones we call media, we denigrate rather than investigate the implications of digital media in broader networks of technologies, routines, and practices.

2. Mitchelstein, Eugenia, and Pablo J. Boczkowski. Between tradition and change: A review of recent research on online news production. Journalism 10.5 (2009): 562-586. Web. Summary: This article reviews scholarship on online news production published since 2000 and examines historical context and market environment, the process of innovation, alterations in journalistic practices, challenges to established professional dynamics, and the role of user-generated content. The purpose of this review is first to update and revise this area of inquiry during these years of exponential growth in research output, and second to chart new directions for future research. The authors suggest that scholarship about online journalism sits at the intersection of tradition and change, relying primarily on traditional lenses but showing potential for theoretical renewal. Regarding the context of online news production, scholars suggest that because of online news' lack of profitability there is not yet an adequate business model.
Several studies underscore the importance of advertising revenues and show that users do not seem ready to pay for content.
Some scholars link the growing importance of advertising revenues to the increased blurring of commercial and editorial content in the online environment. Studies on the process of innovation in online journalism tend to reject deterministic explanations and instead propose that technological innovations are mediated and shaped by initial conditions and contextual characteristics. Some studies suggest that established journalistic operations have tended not to realize the potential of new technologies, thus limiting change across the industry. Regarding the practices of online news production, scholars have argued that online journalism has increased the pressure on journalists to carry out multiple tasks in different media formats. Existing technologies, and how journalists use them, help shape the information-gathering process. Scholars also propose that online journalism has contributed to the collapse of the twice-a-day news cycle, leading to the emergence of high-speed news. Research about professional and occupational dynamics has concentrated on the identity of journalism as a profession and its relevance in a networked society, the self-reflection of journalists about possible changes to their professional identities, and the challenges posed by user-generated content to journalists as gatekeepers. Research on user-generated content suggests that most news organizations are not enthusiastic about UGC and that bloggers rely on journalists for information. Studies also show a low level of user involvement on forums and that journalists have been trying to co-opt and normalize blogging. The authors identify the lack of scholarship on historical matters as an important gap. Also absent are studies on the role that labor processes and conditions play in online news enterprises. Research on innovation has had a narrow focus and has given little attention to placing empirical findings in the context of comparable processes in other industries. Key quotes: Thus, online news producers seem to have adopted one of the practices that online media make possible, constantly publishing new information, which in turn has led to changes in their traditional way of producing news (p.569).

3. Mitchelstein, Eugenia, and Pablo J. Boczkowski. Online News Consumption Research: An Assessment of Past Work and an Agenda for the Future. New Media & Society (2010): 1461444809350193. Web. Summary: The authors assess the methods and findings in recent scholarship on online news consumption. Findings show that consumption of online news is not drastically different from that in traditional media. Likewise, the dominant modes of inquiry have not changed, because research has usually drawn on traditional theoretical and methodological approaches. These modes of inquiry have three limitations: the assumption of a division between print, broadcast and online media; the notion that the analysis should treat media features and social practices separately; and the tendency to focus on ordinary or extraordinary patterns but not both at the same time. Regarding whether online news complements or displaces traditional media consumption, they found conflicting research findings that could be tied to analytical strategies that ignore the manifold interpretations of news consumption across media.
Research on the effect of online news consumption on political knowledge has also arrived at conflicting findings because the studies have often looked at either media features or social practices but not at the interactions between them. On the issue of fragmentation and homogenization among online media audiences, the authors suggest that this tension might be due to the tendency to focus on either ordinary or extraordinary patterns of phenomena. Focus on the ordinary reveals homogenization; focus on the extraordinary reveals segmentation. Studying both the ordinary and the extraordinary could result in more comprehensive accounts of the tension between them. Scholarship on online news as a resource for civic participation shows a lack of consensus on the relationship between them, which is due to the previously mentioned limitations: divisions across media, between media features and social practices, and between ordinary and extraordinary patterns of audience behavior.
The authors propose an integrative research agenda that advocates: a) questioning taken-for-granted assumptions, b) pursuing empirical strategies that address various aspects of the studied phenomena and how they are integrated, including both the ordinary and the extraordinary, c) using mixed-method designs and triangulating the findings, and d) developing theory from findings. Key quotes: The existent research has made important contributions in a relatively short period of time. But it has failed to take full advantage of online news consumption for more extensive empirical, methodological, and theoretical renewal. Undertaking this renewal would not mean setting aside the existing empirical foci, methodological approaches, and theoretical lenses. But it would mean broadening the aspects of phenomena that deserve examination, the tools utilized to learn about them, and the analytical perspectives that explain the resulting findings (p.1094). Theoretical renewal emerges by highlighting how new concepts emerge from existent ones rather than by assuming that new phenomena are better understood through either an automatic application of the current theoretical tool-kit or the creation of an entirely new conceptual framework (p.1095).

4. Rall, Denise. Locating Four Pathways to Internet Scholarship. Cultural Science 3.2 (2010): 2-10. Summary: This paper studies Internet scholars to determine differences in their disciplinary perspectives, methodological approaches, life circumstances and diverse academic career tracks. Semi-structured interviews were conducted with 28 participants working or studying in universities in Australia, the U.S. and England. Four academics were chosen for further analysis. Through a biographical narrative of the four selected scholars the author outlines the process of Internet scholarship: how does it happen? What are the different kinds of internet scholars? From these cases, four pathways to Internet scholarship were located:
a. The Professional Internet Scholar: They range from messianic to pessimistic about the internet, but their longevity and success in scholarship also offers a dispassionate view of the internet as an area of research. They may be interested in the Internet as an emergent phenomenon, and so include internet-based research as part of their established research portfolios. Internet-specific research arrives via students and colleagues. Their strength is in taking the long view.
b. The Peripatetic Internet Scholar: They are defined by multiple and changing interests; they are flexible. These scholars are committed to their personal and family interests before academic ambitions. Their work on the Internet may be strongly tied to personal involvement with the internet. They work well with students and peers but lack enough collegial relationships or depth in their new disciplines to generate large numbers of publications.
c. The Research Internet Scholar: Views the Internet and other advanced technologies as a means to open opportunities within particular environments. Works in community-related projects. Internet research provides a good way to assess and build better basic services or employment opportunities with disadvantaged communities. They do not have established methodological frameworks. Success depends on the scholar's ability to evolve the novel research designs and evaluative methods that will work within the increasingly complex arrangements of virtuality and human society.
d. The Immersed Internet Scholar: They see the Internet as a set of technologies that take place behind or beyond the computer screen via vast networks of transferable code. They do not care much about collegiality, publications, teaching, promotion, or tenure.
But the Academy provides a good environment for them, and they often take positions in computer lab management or pursue graduate studies as a means to stay immersed in computer networks. These pathways are not mutually exclusive. There are a variety of mechanisms whereby internet scholars negotiate their traditional academic frameworks and their work in a new field of study. The profiles presented in this paper show that scholars tend to rely heavily on their traditional areas of study. Internet scholars can find the process of scholarly engagement challenging. Theses can be rejected because they sit on the edge of too many disciplines, there are no tenure-track positions for internet scholars, and they feel the need to reinvent themselves constantly. The author suggests that in order to advance, internet studies and research must locate a proper scholarly home within universities and allow its scholars to mature through processes of scholarly engagement that require both personal and professional development; internet research methods must be legitimized, and empirical projects should get more support from granting agencies.

---. Zombie Journalism. Journalism Studies Interest Group Newsletter:
Coleman, E. Gabriella. Ethnographic Approaches to Digital Media. 23 Sept. 2010. Web. 30 Dec. 2010.
Consalvo, Mia, and Charles Ess. The Handbook of Internet Studies. Wiley-Blackwell, 2011. Print.
Jenkins, Henry, David Thorburn, and Brad Seawell. Rethinking Media Change: The Aesthetics of Transition. MIT Press, 2004. Print.
Hunsinger, Jeremy, Lisbeth Klastrup, and Matthew Allen. International Handbook of Internet Research. 1st ed. Springer, 2010. Print.
Jones, Steve. Foreword. International Handbook of Internet Research. 1st ed. Ed. Jeremy Hunsinger, Lisbeth Klastrup, & Matthew Allen. Springer, 2010.
Lovink, Geert. Dark Fiber: Tracking Critical Internet Culture. MIT Press, 2003. Print.
Wahl-Jorgensen, Karin, and Thomas Hanitzsch. The Handbook of Journalism Studies. Taylor & Francis, 2008. Print.
Wellman, Barry. "" The Handbook of Internet Studies. Ed. Mia Consalvo & Charles Ess. Wiley-Blackwell, 2011. Print.

International

Cottle, Simon. Journalism studies: coming of (global) age? Journalism 10.3 (2009): 309-311. Web. Summary: In this paper Cottle argues that journalism researchers have failed to theorize and examine today's global crises and how they depend on the world's news media. He stresses that while citizen journalism, the blogosphere, user-generated content and corporate interests' colonization of journalism are all worthy of study, researching these issues poses the threat of losing sight of global crises and how they become conditioned and staged by the world's news media. So Cottle calls on journalism scholars to study today's major global issues and crises and how these are constituted and conducted within the media and communication flows around the world. Key Quotes: Hidden wars, forgotten disasters and permanent emergencies still abound in the world today and, because of their media invisibility, may command neither wider recognition nor political response (p.309). Today's threats go to the core of contemporary arguments about global cosmopolitanism and a possible emergent global public sphere, and should compel concerted responses from researchers working in the field of journalism studies (p.310).

Singer, Jane, and Ian Ashman. User-Generated Content and Journalistic Values. Citizen Journalism: Global Perspectives. Eds. Allan, Stuart, and Einar Thorsen. New York: Peter Lang, 2009. Summary: This chapter explores journalistic control over the norms that journalists see as framing their own products and processes, but not necessarily those of users. The authors examine this by looking at how journalists at Britain's Guardian newspaper and website assess and incorporate UGC. The chapter offers empirical evidence about the journalists' views of certain values (such as autonomy) in an environment that includes citizen journalism and other kinds of UGC. It is based on interviews with 33 print and online Guardian journalists, as well as brief questionnaires asking them to provide three words associated with credibility, responsibility, autonomy and competence. This questionnaire provided data on how the journalists define their occupational values. The journalists were also asked to highlight key ethical issues related to audience input. Regarding authenticity, most interviewees agreed that UGC poses a challenge to journalistic authority. Journalists believed they took adequate steps to ensure that what they wrote was credible but had no way to assess the credibility of UGC. Another threat to authority was perceived as coming from users who challenge what journalists write. The journalists also said that UGC has the potential to erode professional autonomy due to the immediate availability of hit logs and comment counts. The interviewees felt a responsibility to readers, which they believe distinguishes them from users. This chapter suggests that journalists are thinking about issues raised by UGC (autonomy, accountability, credibility and civility) in terms of an existing cultural framework defined by occupational norms. Key quotes: For Guardian staff, being a true or authentic journalist encompasses occupational norms of credibility, authority and accuracy. The extent to which UGC jeopardized or undermines personal and institutional authenticity was a major concern (p.236-7). In general, the journalists expressed concerns about the credibility of UGC itself and about the effects of anonymous and/or uncivil comments on personal and institutional credibility. The perception that UGC potentially challenged their authority was widespread; though they said they appreciated the fact that new voices could be heard, many clearly felt that too many of those voices were not worth listening to (p.241).
All these issues, from autonomy to accountability, credibility to civility, are connected to the challenges inherent in negotiating new relationships (p.241).

6. Quandt, Thorsten. (No) News on the World Wide Web? A comparative content analysis of online news in Europe and the United States. Journalism Studies 9.5 (2008): 717. Web. Summary: This study is a comparative content analysis of 10 online news media in five countries (U.S., U.K., France, Germany and Russia) to assess if they had fulfilled the promise of a new journalism (i.e. interaction between authors and audiences and the inclusion of multimedia content). The analysis considered what topics and regions/countries are reported in online stories, whether there are multimedia features and possibilities for interaction, how the source attributions and link structures work, and whether there is user-generated content (p.721). The selected sites are: sueddeutsche.de, Spiegel.de (Germany), news.bbc.co.uk, timesonline.co.uk (U.K.), le-monde.fr, lefigaro.fr (France), kommersant.ru, lenta.ru (Russia), newyorktimes.com, usatoday.com (U.S.). The analysis was carried out between 31 January and 13 February 2005. This is one of the first studies to look at the content of online news sites (p.720). Regarding formal characteristics of the sites, German and British ones do experiment with various forms of reporting, including background pieces, comments and opinion pieces, while French, Russian and American sites stick to neutral, informative news items (p.724). Seven of the 10 sites use multimedia content in no more than a fifth of their articles (p.727). There is no clear pattern when it comes to interactive options. German and English sites and the New York Times offer journalists' email addresses and feedback forms. In some cases they also offer links to forums. Le Monde and Kommersant offer discussion forums on a regular basis. Le Figaro and Lenta do not offer any options for interaction (p.727). In the analysis of content and links, in three-quarters of the sample one journalist is mentioned as sole author. Other authors and sources are indicated only in some media. Sueddeutsche.de, Le Monde, and USA Today cite agencies constantly. Only the German sites occasionally cite other media as authors. All the sites made heavy use of links inside the website. Many used links to other sites. Four media used links for cross-promotion. There were also links to partner websites and even to commercial websites (p.729). Regarding topics, actors and regions, online journalism is fairly conventional: the main emphasis of coverage was on national politics and economy, followed by human interest stories, international politics, crime, sports and culture (p.729). The importance of the U.S. is obvious in all the coverage; however, French sites largely ignore the U.S. The sites reproduce the same geographic bias as traditional newspapers by only referring to regions where their respective countries have vital interests (p.733). The analysed websites have a similar formal structure, lack of multimedia content, missing options for direct interaction with journalists, a similar repertoire of article types, missing source/author attributions, a general trend towards coverage of national political events, and a limited scope of news.
Websites do not use the Internet to its full potential for new types of writing, producing, linking and interacting. Key quotes: These findings contradict the notion of a homogeneous worldwide journalism on the Internet and the simple transfer of US studies (p.719-720). (W)hile the German and British news people use the Internet's unlimited space for offering analysis and comments, their French, Russian and American colleagues obviously have an informative, less subjective journalism in mind; this echoes findings from journalism surveys in conventional journalism (p.724).

(W)e could not find one single chat link in more than 1600 coded articles so basically, the users' interaction possibilities with the media still remain indirect and without obligations (to answer or even read the users' input) on the side of the journalists (p.727). Online journalism, as it is offered by market leaders in the respective countries, is basically good old news journalism, which is similar to what we know from offline newspapers (p.735).

8. Goggin, Gerard, and Mark J. McLelland. Internationalizing Internet Studies: Beyond Anglophone Paradigms. Internationalizing Internet Studies: Beyond Anglophone Paradigms. Ed. Gerard Goggin and Mark J. McLelland. New York: Routledge, 2009. Summary: This chapter is the book's introduction. The authors argue that it is necessary for Internet studies to include developments in the non-Anglophone world and to give greater recognition to local histories and cultures of use. Thus, it is necessary to study the cultural specificity of the Internet, its design, functions, uses and meanings. So far, the authors argue, Internet Studies has only accounted for the North American and European experiences. However, since the 1990s the Internet has become an essential medium in a wide range of countries. Chinese and Spanish are the languages growing most on the internet. Communications and media scholarship has not studied the ramifications of this shift in emphasis from English to non-English language users online and how this challenges the concepts, methods, assumptions, and frameworks used to study the internet. Scholarly work produced outside of English-speaking countries is hardly translated and published, and so has little impact on Internet and web studies. This book attempts to bring together understandings of culture, politics, use, and the social shaping of technology to suggest the profound implications of the internationalization of the internet. The authors argue that it is imperative to take the scholarship further by recognizing the different shaping of cultures of the internet in particular contexts. Key quotes: [W]hile there are certainly things that can be said about the Internet as a whole, it is crucial to come to grips with the very different sorts of communicative structures and cultures of use (...) we see the important work that is undertaken to describe, analyze, and theorize particular Internet forms, and how users are arranged, publics and audiences created, and relations of consumption and production are reconfigured (p.10-11).

9. Goggin, Gerard. The International Turn in Internet Governance: A World of Difference? Internationalizing Internet Studies: Beyond Anglophone Paradigms. Ed. Gerard Goggin and Mark J. McLelland. New York: Routledge, 2009. Summary: In this chapter the author reviews international internet governance, how it has unfolded, who it represents and includes, and what linguistic, social, and cultural values it inscribes in the technology. Since the 1970s and 1980s the internet community and its governance developed with the core principle of openness. The internet community made the development of the internet open to whoever wished to join in and had enough technical expertise. As more people became interested through the 1990s and the internet internationalized, the pressure grew for these users to be represented in how the internet was governed. The author refers to two key debates on internet governance: that over domain names in the 1990s, and the WSIS summits.
What became evident from the former was the considerable interest from a range of contributors, including civil society. What became obvious from the latter was that the development of the internet and its governance were reliant upon only one country, the United States.
In the preparation of the WSIS conferences, non-state actors were only observers and advisors, despite a declaration endorsing a multi-stakeholder approach. In response, civil society organized against state domination. The legacy of WSIS is that a global movement arose around the goal of ensuring that diversity and the importance of civil society concerns and their voices be recognized in the WSIS deliberations. The author suggests three conclusions when thinking about the internationalization of the internet. First, this internationalization has revealed issues for the design of governance and policy arrangements. These issues arose in part due to the demand that internet governance reflect the multiplicity of its users, and move beyond the predominance of the U.S. to allow other governments, organizations and individuals to participate in it. Second, internet governance raised issues such as who speaks, in what language, who decides, how the powerless are addressed, and how to meet claims to democracy and development, as well as technical collaboration and interworking. Finally, due to the two debates mentioned above, new collective forms of action emerged at the international level. The author argues that these are developments and debates we need to engage with when considering the forms and implications of the internet's internationalization for governance, power, and the place of users. Key quotes: What was striking about WSIS was that the debates were not restricted to narrow understandings of the digital divide; the discussions were very comprehensive indeed. Further, there were genuine, if deeply problematic, efforts to incorporate disadvantaged countries, groups, and actors. There is a widespread sense that WSIS was a failure, and considerable disappointment regarding this (p.55).

Deuze, Mark. The Future of Citizen Journalism. Citizen Journalism: Global Perspectives. Eds. Allan, Stuart, and Einar Thorsen. New York: Peter Lang, 2009.
George, Cherian. Contentious Journalism and the Internet: Towards Democratic Discourse in Malaysia and Singapore. NUS Press, 2006. Print.
Wurff, Richard, and Edmund Lauf. Print and Online Newspapers in Europe: A Comparative Analysis in 16 Countries. Het Spinhuis, 2005. Print.
Zhou, Yongming. Historicizing Online Politics: Telegraphy, the Internet, and Political Participation in China. Stanford University Press, 2006. Print.


Activism

1. Atton, Chris. An Alternative Internet. Edinburgh University Press, 2004. Print. Summary: This book looks at the alternative internet through a series of case studies exploring the use of the internet by individuals and organizations with alternative philosophies and practices, i.e. alternative to the dominant and expected ways of doing media. Atton examines Indymedia as an example of alternative journalism and a radical form of public journalism, intimately involved with global struggles against corporate governance. He also examines the British National Party's website and its discourse of far-right media. The second half of the book is devoted to popular cultural activity on the internet, and how audiences may become critics and commentators on those products and become creators in their own right. He looks into new forms of social authorship, such as open copyright, anti-copyright and copyleft, which challenge commercially based notions of ownership. Finally, the book examines online radio for political activism, popular music and education, as well as fan culture. Atton argues that when studying the internet one has to consider that it is a series of human processes. To think of it as an unproblematic source of social change is to ignore the political and economic determinants that shape technology, and how these may be influenced by social and cultural elites. It is also to ignore the obstacles to empowerment that legislation, inequalities of access and limits on media literacy pose to groups and individuals. Regarding radical online journalism, Atton asserts that Indymedia deploys various radical journalism methods, such as first-person native reports; radical critiques of government policies, actions and the mass media; use of mainstream sources; and the creation of spaces for discussion. Through these practices alternative journalism critiques dominant news values and transforms dominant practices. However, it does not represent a total rupture with journalistic norms. Atton argues that in analysing alternative media researchers should avoid the ghettoisation of such media, which may lead to their marginalization. They should also avoid the valorization of alternative media because of their difference, or because they appear to resist in a way that might be appealing to our own political sensibilities. Key quotes: [Radical journalism's] new practices signal a challenge to the epistemological basis of mainstream news production. In its place we see enacted a socially situated and self-reflexive form of journalism (...) We see a move away from journalism as expert culture and commodity; readers are invited to approach the knowledge presented here not as the product of an elite authority but as the result of a process that comes about through the impersonal connectedness of journalist and reader (p.60). Alternative media practices are hybrid practices that embody continuation as well as reform and rupture. Nor are they to be understood solely in relation to political activism (p.159).

4. Bennett, Lance. New Media Power: The Internet and Global Activism. Contesting Media Power: Alternative Media in a Networked World. Ed. Nick Couldry and James Curran. Lanham, Md.: Rowman & Littlefield, 2003. Print. Summary: This chapter explores the rise of global protest networks, which have used new digital media to coordinate activities and publicize information about their causes.
Bennett seeks to identify what conditions enable activists to use new media (mobile phones, internet, wireless networks) to communicate their message across geographical and media boundaries. According to the author, these activist organizations create networks that are segmented and polycentric; however, they are not centrally or hierarchically limited in their growth or capacities to recombine around threats and internal disruptions.
Their information exchange is relatively open. Regarding internet empowerment, Bennett explains that there are three generalizations: networks of diverse groups could not be sustained without digital communication channels; the scale of protest on a global level seems impossible without the global communication and coordination capabilities of the internet; and the internet enables the diversity and global scale of protest at a greatly reduced cost. Global activism is therefore empowered by the internet's capacity for internal (intra-group) and external communication (reaching audiences beyond activist circles). But there are also elements of human context that increase the power of the internet in global protest: the willingness of activists to share, merge and tolerate diverse political identities; the perception among activists of a need to scale protest activities across great reaches of time and space; and the growing permeability of all media, enabling viral messages to travel further and reach new publics. Therefore, Bennett contends that the internet is not inherently transformative of either human communication or social and political relations. Rather, it is the interaction between the internet and its users that generates the power of new media to create new spaces for discourse and coordinated action. Referring to the capacity for simultaneous membership in local and global communities, Bennett suggests that old Gramscian notions of class and group foundations of consciousness and resistance must be refigured: global activism is less nationalistic; its collectivism is rooted in individual choices of social networks; this collective individualism is facilitated by discourses conceived less in ideological terms than in broad categories of threat and harm and justice. The digital public sphere for contesting media power would be less important if the various media spheres were not becoming increasingly porous. The importance of new media in contesting power involves more than just their existence as new communication tools. The political impacts of emerging technologies reflect the changing social, psychological, and economic conditions experienced by their users. Key quotes: The Internet happens to be a medium well suited for easily linking (and staying connected) to others in search of new collective actions that do not challenge individual identities. Hence global collectives often become collectives capable of directed action while respecting diverse identities. This diversity may create various problems for maintaining thematic coherence in networks (...) Despite such vulnerabilities of networks, the power of the internet is thus inextricably bound to the transformation of identity itself (p.28). Researchers are beginning to pay attention to the pathways from micro-to-middle media that bring important messages in contact with mass media gatekeepers. The distributed property of the web makes it difficult for news organizations to close the gates on provocative stories that competitors will be tempted to report if they don't (p.34).

---. Anarchy on the Internet: Obstacles and Opportunities for Alternative Electronic Publishing. 1996. Web. 17 Dec. 2010.
Gillmor, Dan. We the Media: Grassroots Journalism by the People, for the People. O'Reilly Media, Inc., 2006. Print.
Nazario, Jose (Arbor Networks). Politically Motivated Denial of Service Attacks.


Communication-Rights

Shtern, Jeremy, and Marc Raboy. Media Divides: Communication Rights and the Right to Communicate in Canada. Vancouver: UBC Press, 2010.

Summary, Introduction: This anthology addresses the idea of "communication rights," historicizing the idea in the Canadian context and looking at the present-day constraints on the right to communicate. The "media divides" indicated by the title are between those who have access to communication rights and those who do not. The book draws on the Communication Rights Assessment Framework and Toolkit (CRAFT, 2005) developed by the Communication Rights in the Information Society campaign (CRIS), a rubric used mainly to assess communication rights in developing countries. The volume thus draws on a range of methodologies to examine the links between communication rights and media, access, Internet, privacy, and copyright. The argument is not so much for the crafting of new policies, but for the application of existing ones. The introduction points to the 1971 report Instant World: A Report on Telecommunications in Canada as a key text in arguing for communication rights, as well as a prescient discussion of problems that still plague Canadian communication. Among other things, Instant World predicted:
- technological convergence
- that telecommunications networks would be used to provide remote access to computer memory
- the rapid increase of processing speeds
- freestanding computers and mobile devices
- the need for broadband services
- on-demand programming
- electronic delivery of bills, etc.
However, the report did not concern itself with:
- corporate concentration of ownership
- public funding
- cultural diversity
- intellectual property rights
Canada has a strong tradition of associating itself with human rights, and of locating communication rights within human rights. The book attempts to hold Canada to its own standards. It also attempts to shift away from the fixation on the US found in much Canadian communication policy analysis. It stresses the importance of the "social cycle of communication," which is much broader than the commonly held notion of freedom of expression: the freedom to communicate is the freedom to speak, be heard, and hear.

Chapter One (Raboy and Shtern): Links between the "right to communicate" and the New World Information and Communication Order (1970s) conflicts with the West; the US and other countries argue that the idea of a free press is being challenged. Communication rights enter a period of stagnation and are picked up again in the 1990s, i.e. Voices 21 in 1999. There is persistent debate over whether communication rights are first-generation or second-generation human rights, which has bearing on to whom such rights are granted as well as how they are adjudicated. Does the individual have the ability to communicate freely in today's world, or must the state intervene? Notes that the UN Conference on Freedom of Information stipulated that "delegations to the conference should include in each instance persons actually engaged or experienced in press, radio, motion picture or other media," but there was no civil society representation. (Question: is there a mandate for media representation at the WSIS/IGF?)
other media" but there was no civil society (question - is there a mandate for media representation at the WSIS/IGF) Chapter Three (Raboy and Shtern) "The Horizontal View" Example of "Adbusters" case in which court decision "confirmed the property rights of media holders to decided what it was in their interests to publish" There are efforts to preserve source and content diversity in Canadian broadcasting, but not for Canadian newspapers Lack of funding and spectrum for community broadcasters - thus community broadcasting cannot have the influence envisaged by the Broadcasting Act Canada has high access to technology, but does not train people to make meaningful use of it, thus access it determined by socioeconomics 2006 Federal Accountability Act does not address problem of civil servants who refuse to create information in the first place - the government database showing who is asking for information access is not longer updated. Public service media is not archived in a publicly accessible format Canada has a "limited tootlkit" when it comes to addressing the impact of the media on an individual's honor and reputation (vs freedom of expression) Authors assess current privacy system to be "underfunded and poorly suited to the current environment Controversy over the provision of "third-language services" other than English and French Summary Chapter Four (Raboy) "Media" Focuses on Arar case as example of chilling and intimidation of the press. Media concentration as posing a restriction on free speech Broadcasting as a special case because of "pervasiveness" "invasiveness," "publicness," and "influence" Regulation is needed to protect "vulnerable values" such as diversity from market forces Interviews in 1998 revealed that Broadcasting Act was not felt to preserve communication rights on the ground. There is alarming evidence that the role of public consultation in communication policy development in Canada has been effectively diminished in recent years. CBC cuts = inability of CBC to serve the needs of regional Canada underrepresentation and misrepresentation of minorities CRTC's revisiting of its new media policy over the next few years seen as crucial to "the extent to which Canadian media are able to support communication rights." Chapter Five (Shade) Access 1994 Community Access Program provides funding for public internet access in schools, community centers, libraries - closed in 2006 despite success. characterized Canada's communications regime as beset by "globalization in a neo-liberal and antiregulatory environment, where issues of access, equity and social justice receive scant attention" Canadian citizens are now positioned only as consumers for IT services Pippa Norris: the social divide re ICTs, the global divide and the democratic divide (active vs passive users). The last is the most controversial: are ICTS the appropriate tool for democratic development Quotes Michael Powell drawing an analogy between digital divide and "Mercedes divide" - i.e., broadband as a luxury important to shift from digital divide to digital capabilities -- not only access, but use. Uses concept of "access rainbow," which considers governance, literacy and social facilitation, service and access provision, content and services, software tools, devices, and carriage facilities. Chapter Six Internet Defines "affordances" as new actions made possible by new modes of communication interaction or conditions enabled by new ICTs. 
- Examines communication rights in the context of three Internet technology areas: the practice of network traffic shaping; the broad area of Semantic Web and Web 2.0 applications; and Internet telephony, areas "linked by the layered architecture of Internet applications."
- Traffic shaping is cited by ISPs as necessary to maintain infrastructure, but it can violate principles of network neutrality. The two must be seen as distinct: network traffic shaping refers mainly to "the technical dimension of routing content across networks," while network neutrality refers mainly to a set of principles relating to network operators: a network operator should not discriminate between data packets sent by or destined for its users; a network operator should provide full transit for data sent and destined for other networks; network operators should have limited liability for the data they permit across their networks.
- E2E principle: as long as basic communications functionality exists at a lower layer, advanced functionality should not be built into that lower layer; it should be implemented by applications at the endpoints.
- Many ISPs now offer content-based services they view as complementary to their data products.
- In 2009, the CRTC established some rules about when traffic shaping is allowed, but they are far from espousing net neutrality.
- One ominous use of traffic shaping: deep packet inspection as a means of surveillance. (The difference between application-blind forwarding and inspection-based throttling is illustrated in the sketch below.)
- Makes the argument that Web 2.0 technologies have "weakened quality journalism and replaced it with a flood of amateur commentary."
- Also brings up the issue of Web 2.0 and data ownership re the cloud.

Key Quotes: "Scratch beneath the surface, as we have modestly tried to do in this book, and it is clear that the present linkages between rights, communication policy and practice are at best weak, when they are not misguided: that abstract and unenforceable claims to national sovereignty are favored over the ability of citizens to use their media; that when it is convenient to do so, government appropriates public communication for the purpose of promoting an idealized notion of national identity; the effort to cast this country as a champion of human and cultural rights is favored over realizing communication rights on the ground." (266) "Too often, ICT policy discourse frames "functionings" as the ability to purchase and consume products and services (consumer rights) rather than as citizens' rights, or the ability to access ICTs to create and participate meaningfully in democratic public life." (136) "These novel communications technologies do not give rise to the need for fundamental rights beyond those already recognized under the umbrella of communication rights as much as they create new conditions under which existing rights can be challenged." (174)
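The distinction the chapter draws, between routing that ignores what a packet carries and shaping that depends on inspecting it, can be made concrete with a short sketch. The Python below is a toy illustration only, not a description of any ISP's or the CRTC's actual systems; every name in it (Packet, classify_by_payload, shape) is hypothetical, and the BitTorrent handshake signature is simply a well-known example of the kind of payload pattern deep packet inspection matches on.

```python
# Toy contrast between application-blind forwarding and DPI-based traffic shaping.
# Hypothetical sketch; not any operator's real implementation.

from dataclasses import dataclass
import time


@dataclass
class Packet:
    src: str
    dst: str
    dst_port: int
    payload: bytes


def forward_neutral(packet: Packet) -> str:
    """A 'neutral' router uses only the addressing information needed to deliver
    the packet; it does not discriminate by application or content."""
    return f"forward to {packet.dst}"


def classify_by_payload(packet: Packet) -> str:
    """Deep packet inspection reads into the payload to guess the application.
    The leading byte 0x13 + 'BitTorrent protocol' is the standard BitTorrent
    handshake signature; the rest of the rules are illustrative only."""
    if packet.payload.startswith(b"\x13BitTorrent protocol"):
        return "peer-to-peer"
    if packet.dst_port in (80, 443):
        return "web"
    return "other"


def shape(packet: Packet, throttled_class: str = "peer-to-peer") -> str:
    """A shaper that delays one traffic class while passing others. This is the
    step where shaping can conflict with net-neutrality principles: the operator
    treats packets differently based on what they carry."""
    traffic_class = classify_by_payload(packet)
    if traffic_class == throttled_class:
        time.sleep(0.05)  # artificial delay standing in for rate limiting
        return f"delayed ({traffic_class}) then forwarded to {packet.dst}"
    return forward_neutral(packet)


if __name__ == "__main__":
    web = Packet("10.0.0.2", "93.184.216.34", 443, b"\x16\x03\x01...")
    p2p = Packet("10.0.0.2", "198.51.100.7", 6881, b"\x13BitTorrent protocol")
    print(shape(web))   # forwarded without inspection-based delay
    print(shape(p2p))   # classified by payload, then throttled
```

The point of the sketch is the architectural one the chapter makes: neutral forwarding needs only addressing information, whereas throttling a class of traffic requires reading into the payload, and that same capability is what makes deep packet inspection usable for surveillance.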

http://openmedia.ca/blog/bridging-broadband-gap-canada-needs-take-cue-latest-us-initiative


Internet-Studies


General-Digital-Theory

Bennett, Lance. New Media Power: The Internet and Global Activism. Contesting Media Power: Alternative Media in a Networked World. Ed. Nick Couldry and James Curran. Lanham, Md.: Rowman & Littlefield, 2003. Print. (Summary above, under Activism.)

Morozov, Evgeny. The Net Delusion: The Dark Side of Internet Freedom. PublicAffairs, 2011. Print. Note: From Dave Parry at profoundheterogeneity.com: This is what we would call one of the warnings Morozov offers: cyber-utopism is bad government policy. It is not that those working on making the internet a more just place, the EFF, the Berkman Center, etc. are cyber-utopians, but rather that cyber-utopism still prevails as a discourse at the level of the public, journalism, and public policy, and that this type of techno-determinism-utopism has a long history we ought to try and avoid repeating; people's lives are after all at stake here. There is one other cautionary tale in here, that Morozov doesn't spell out as much as I would like, but is worth noting before I move on to the second thesis in the book. The cyber-utopic-techno-determinist rhetoric which informs public policy, especially foreign policy, has one more danger: a military-industrial-surveillance-cyber complex. There is a lot of money to be made here and we should expect companies, against our own interest, to leverage the quick fix of a particular technological tool to make money at our expense and engineer a social world against our own best interests. (I think this is a point that Morozov, Doctorow, and Shirky would all agree on.)

Zittrain, Jonathan. The Future of the Internet--And How to Stop It. Yale University Press, 2009. Print. Summary: In this book Zittrain argues that the key to the success of the PC and the Internet is their openness to contribution and innovation. However, these characteristics are also what makes them vulnerable and what is prompting the emergence of tethered information appliances that facilitate regulation and control. Viruses and malware are successful because people can control the code that runs on their computers and can be tricked into running dangerous code. According to the author, if security problems worsen, internet users could end up preferring some sort of lockdown, which opens the door to new forms of regulatory surveillance and control. In this book the author asserts that a lockdown on PCs and the following rise of tethered appliances - bundled hardware and software created and controlled by one company - would eliminate our capability to influence and revolutionize technology. Stopping this future depends on some wisely developed and implemented locks, new technologies and a community ethos that secures access to those locks among groups with shared norms and a sense of public purpose, rather than in the hands of a single gatekeeping entity. So far computers have had generative systems, which are built on the notion that they are never fully complete, that they have many uses, and that the public can be trusted to invent and share good ones. Inclusive projects such as Wikipedia, which bring people together in meaningful conversations, commerce, or action, are possible not only due to users' interest but because of the generative technology that supports them. However, constant breaches of that trust can threaten the foundations of the generative systems. The author argues that insecurity on the internet makes the situation unsustainable and that generativity could be sacrificed. In this scenario, users frustrated with their PCs would substitute them with information appliances that offer safer experiences, or PCs could become appliancized. A mainstream dominated by non-generative systems could harm innovation and individual freedoms and opportunities for self-expression.
However, generative and non-generative systems are not mutually exclusive; they can compete and intertwine within a single system. With tethered appliances, regulators' interventions into the devices themselves are much more predictable, directly affecting the way people use the appliances. This makes surveillance and censorship easier, especially by repressive governments.


The author argues that so far internet insecurity has been dealt with by intergovernmental organizations and diplomatic initiatives such as WSIS, which have ended in bland consensus pronouncements. Zittrain considers that these meetings are missing key participants: computer scientists and geeks, and without them the prospect of coding new tools and protocols to facilitate social solutions is easily neglected. Zittrain suggests that the best approach to secure the internet and the innovations built upon it is to empower its users to contribute, rather than impose security models controlled by a handful of people. He adds that PC users unaware of their digital environments and unable to act when facing danger should also become more prepared. Key quotes: [The rise of tethered appliances] will affect how readily behavior on the Internet can be regulated, which in turn will determine the extent that regulators and commercial incumbents can constrain amateur innovation, which has been responsible for much of what we now consider precious about the Internet (p.9). [C]onsumers find themselves frustrated by PCs at a time when a variety of information appliances are arising as substitutes for the activities they value most. Digital video recorders, mobile phones, BlackBerries, and video game consoles will offer safer and more consistent experiences. Consumers will increasingly abandon the PC for these alternatives, or they will demand that the PC itself be appliancized (p.57). Generative systems are threatened by their mainstream success because new participants misunderstand or flout the ethos that makes the systems function well, and those not involved with the system find their legally protected interests challenged by it. Generative systems are not inherently self-sustaining when confronted with these challenges (p.65). The keys to maintaining a generative system are to ensure its internal security without resorting to lockdown, and to find ways to enable enough enforcement against its undesirable uses without requiring a system of perfect enforcement (p.126).

Castells, Manuel. Communication Power. Oxford University Press, 2009. Print.
Mayer-Schonberger, Viktor. Delete: The Virtue of Forgetting in the Digital Age. Princeton University Press, 2009. Print.


Internet-Infrastructure

Bowker, Geoffrey, Baker, Karen et al. Towards Information Infrastructure Studies: Ways of Knowing in a Networked Environment. International Handbook of Internet Research. Springer, 2010. Summary: This article explores the current change accompanying the development of the Internet in terms of its relationship with the nature and production of knowledge, from an infrastructure studies perspective. The authors define infrastructure as technologies and organizations that enable the development of science, and the individuals in existing and emerging roles associated with information infrastructures (p.98). Cyberinfrastructure is defined as the set of organizational practices, technical infrastructure, and social norms that collectively provide for the smooth operation of scientific work at a distance (p.102). New information infrastructure is fundamentally about distribution; our information society has moved towards a form of distribution where complex social, organizational, and knowledge work can be practiced at a global scale (p.112). In the context of the Internet, infrastructure studies explore the ways in which:
(1) New forms of sociality are being enabled/shaped by and shaping Information and Communication technologies. This is the social dimension of the new infrastructure. The Internet has transitioned from a research environment into a committee-run entity, and now into a marketplace of policies and regulations. Its infrastructure is reaching beyond the physical and technical to the individual and the community. Thus, mediation emerges as a way to negotiate these systems and networks, people and organizations, involving community and system building (p.106). The need for engaged participation ensures that issues such as standards formation, maintenance, and updating are addressed in information infrastructures.
(2) Social, ethical and political values are being built into these infrastructures. The Internet can be considered an immense database, which is the outcome of technological development over centuries. Social and political choices are being made in the construction of large interoperable databases and ontologies in the social sciences and humanities. If a dominant group within a given discipline gets to define the ontology of the database, they can reinforce their power. A wide range of cultural and organizational changes need to happen for a new technical infrastructure to be fruitful. Infrastructure studies analyses change in forms of practice, routines, and distributed cognition associated with knowledge work.
(3) The nature of knowledge work is changing with the introduction of new information technologies, modes of representation, and the shifts in work practice and systems for the accreditation of knowledge.
Information infrastructure studies is a multidisciplinary field in which the global and the local, the social and the technical are in flux in new and interesting ways for the first time in 500 years. It is in need of new models for scalable qualitative research, better forms of multi-modal research, and an integrated view of the social, organizational and cognitive. Researchers need to move freely between these three aspects (p.112-113). Key quotes: We are convinced that we are in the midst of developing fundamentally new ways of knowing -- though we put this in a 200-year span rather than a machine-centered 50-year frame (...) Information infrastructure is a great tool for distribution of knowledge, culture and practice.
Homesteading the space it has slowly opened out over the past two centuries involves building new kinds of community, new kinds of disciplinary homes and new understandings of ourselves (p. 113-114).

Wu, Timothy. The Master Switch: The Rise and Fall of Information Empires. New York: Alfred A. Knopf, 2010.


Summary: The Internet is the property of no one, whereas the Bell system belonged to a private corporation. This open character of the Internet makes our time without precedent in terms of culture and communications. In the 20th century there was a succession of optimistic and open media, each of which became a closed, controlled and highly centralized new industry (p.5-6). This oscillation of information industries between open and closed is what the author calls the Cycle. Throughout the book Wu reviews how film, radio, television, telephone, and cable TV went through this cycle in the 20th century. According to Wu, if the Internet should become subject to the Cycle the consequences would be staggering, and there are already signs that complete network openness is ending (p. 7). The Internet works over an infrastructure that does not belong to those using it. Initially the owner of this infrastructure was AT&T (p. 198). However, the Internet works with the Transmission Control Protocol (TCP), which allows it to run on any infrastructure; this made the Internet independent of the infrastructure over which it runs. TCP/IP became the only language with which you could get on the Net (p. 202). When AOL and Time Warner merged in 2001, the idea was to get AOL's subscribers to consume Time Warner's products, while Time Warner's consumers had to subscribe to AOL; this plan did not work due to the neutral design of the Internet (p. 265). This design can boost or destroy companies; net neutrality destroyed AOL Time Warner but it catapulted Google and Amazon. So, the idea of owning all sorts of services and products is not necessarily convenient (p. 268). In the 2010s the old conflict between open and closed systems has reappeared, with Google and its allies like Amazon on one side, and Apple, AT&T, and the entertainment conglomerates on the other. This conflict is the essence of the Cycle, which the Internet seems to have moderated but not abolished (p.289). The difference this time is that in previous Cycles (film, radio, TV) bigger organizations bought smaller ones until all the power would end up concentrated in one or two very centralized giants; in the 2010s there is no such predilection for central order, and the democratization of technology has given the individual more power than ever before (p.297-298). Wu proposes a constitutional approach, a regime whose goal is to constrain and divide all power that derives from the control of information. He suggests a Separations Principle, which would mean that those who develop information, those who own the network infrastructure on which it travels, and those who control the tools or venues of access must be kept apart from one another (p. 304). It also means that the government should not intervene in the market or favour any technology, network monopoly, or integration of the major functions of the industry. Key quotes: (T)he blessing of the state, implicit or explicit, has been crucial to every twentieth-century information empire (...) In every information industry, the government mediated what would have otherwise surely been a more tumultuous course of the Cycle (p. 160). The principle of net neutrality, instilled by the Internet's founders, is ultimately what wrecked AOL Time Warner. And that now iconic wreck, if nothing else, would attest powerfully to the claim that the Internet was at last the great exception, the slayer of the Cycle we have been visiting and revisiting (p. 260).
Like the separation of church and state, the Separation Principle means to pre-empt politics; it is a refusal to take sides between institutions that are historically, even naturally, bound to come into conflict, a refusal born of society's interest in preserving both (p. 304-305). The Internet with its uniquely open design has led to a moment when all other information networks have converged upon it as the one superhighway (...) this proposes an awesome dependence on a single network, and a no less vital need to preserve its openness from imperial designs (p. 318).
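Wu's point about TCP/IP abstracting away the underlying infrastructure is easy to illustrate: an application opens a connection with nothing but a hostname and a port, and never learns whether the bytes travel over DSL, cable, fibre or wireless underneath. The sketch below is a minimal illustration using Python's standard socket library, not anything drawn from Wu; the hostname is simply a placeholder for any web server.

import socket

def fetch_homepage(host: str = "example.com", port: int = 80) -> bytes:
    """Open a TCP connection and issue a bare HTTP/1.1 request.

    Nothing here refers to the physical network: TCP/IP hides it entirely.
    """
    with socket.create_connection((host, port), timeout=10) as conn:
        request = (
            f"GET / HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            f"Connection: close\r\n\r\n"
        ).encode("ascii")
        conn.sendall(request)
        chunks = []
        while True:
            data = conn.recv(4096)  # read until the server closes the connection
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

if __name__ == "__main__":
    print(fetch_homepage().split(b"\r\n", 1)[0].decode())  # e.g. "HTTP/1.1 200 OK"

The same code runs unchanged whether the broadband provider is vertically integrated or not, which is exactly the neutrality-by-design that, on Wu's account, undid AOL Time Warner.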


4. Lovink, Geert, "Toward Open and Dense Networks," in Boler, Megan, ed. Digital media and democracy: tactics in hard times. MIT Press, 2008. Key Quotes: "I feel that the technology is leading us to the question of whether we can create new forms of organization that at times will be dictated by its architecture. The increasing role of technology also poses the threat that the architecture of technologies will dictate what practices we can develop." (127)

Chadwick, Andrew, and Philip N. Howard. The Geopolitics of Internet Control. Routledge Handbook of Internet Politics. 1st ed. Routledge, 2008. Print.


Internet-Governance

Mathiason, John. Internet Governance: The New Frontier of Global Institutions. New York: Routledge, 2009. Summary: Non-State Actors and Internet Governance: The IETF is in control of Internet architecture, while the World Wide Web Consortium (W3C) oversees basic web applications. The majority of members of both groups are from private sector corporations, though outside of the US membership tends to come from corporations, government, and academia in a more proportional mix. Proprietary standards within a given corporate product are not monitored by either group; if a problem occurs (i.e. Microsoft) it is up to governments to intervene. Aside from the IETF and W3C, non-state actors include public sector groups concerned with a free, open internet (netizens), including the Electronic Frontier Foundation (EFF), Computer Professionals for Social Responsibility (CPSR), and the Association for Progressive Communication (APC). Internet governance emerges as a consequence of the need to manage the domain-name system - the memorandum of understanding on top-level domains called for WIPO standards in managing domain names: it was seen as a threat to the dominance of Network Solutions as well as an end-run around government regulations. The US government decided to have a public comment period, and then prepared a green paper calling for a transition, with Network Solutions playing a transitional role. Seen as US-centric. Followed by the White Paper, which recommended four key principles: stability, competition, private bottom-up coordination, and representation (international input into decision making). The principles were not intended to replace other regimes of international law. The White Paper was more attentive to the globalization issue: it noted that many countries pushed for globalization of the domain name system. The final body created to oversee these principles is ICANN... "there are precedents for international institutions that function as private entities while providing public services, but none as radical as ICANN." Initial heads of ICANN were symbolically representative of academic interests/internet freedom interests: most exited by 2000. ICANN stakeholders include governments, the private sector, and netizens. Persistent US control of ICANN has been a global issue; one past example is the US pressure to deny the .xxx domain in 2006. (A current example, not in the book, is US industry opposition to the expansion of TLDs.) 2003 - a summit on information sponsored by the International Telecommunication Union (a UN body) becomes the WSIS. The ITU was concerned that it was obsolete as an organization as a result of changes in the communications infrastructure. It organizes a conference which recognizes the importance of different participants (stakeholders), focusing on citizens, communities and information access. The Internet Governance Forum emerges out of the Working Group on Internet Governance (see below)... the IGF is now providing policy coordination in an area where technology, economics, social forces and international norms intersect... a new model for reaching international agreements. The book ends by postulating that the "multi-stakeholder approach" used to address Internet governance will soon be used to address other complex global situations, i.e., global warming.
Key Quotes: "Because of the political and economic consequences of the information revolution built around the Internet, ICANN has had to become involved in policy issues to an extent that the White Paper did not envisage." (92) "The Internet grew dramatically (in the years after ICANN was founded), and ICANN continued to grow, but the international political dimension of Internet Governance did not emerge until the World Summit on the Information Society in 2003." (96)

McLaughlin, Lisa and Victor Pickard. 2005. What is bottom-up about global internet governance? Global Media and Communication, December 2005, 1: 357-373.


Summary: This article argues that the inclusion of civil society in the World Summit on the Information Society (WSIS), which created the Working Group on Internet Governance (WGIG), has eroded an oppositional civil society within the summit itself; and it evaluates the WGIG as a manifestation of global neo-corporatism. The first phase of WSIS contemplated a series of diverse interests and concerns that by the second phase had been condensed into one agenda item focused on internet governance. Despite the participation of civil society organizations, the most powerful stakeholders (i.e. governments, UN agencies, and the private sector) had a pre-set, neoliberal agenda focused on harnessing the power of new information technology to unleash the entrepreneurial spirits of people in underdeveloped countries. According to the authors, the WSIS's emphasis on internet governance is a product of the complex interplay among mechanisms of inclusion and exclusion that characterizes global neocorporatist policy concertation. The WGIG was supposed to be open and inclusive, and its members selected from governments, civil society and the private sector (each one third of the membership). The WGIG identified as key issues the equitable distribution of resources, access for all, stable and secure functioning of the internet, and multilingualism and content. The first two WGIG meetings (2004, 2005) generated collective observations on internet governance, including that governance cannot be reduced to government activities. The third meeting focused on capacity building in developing countries, and the last meeting evaluated policy recommendations, like the replacement of ICANN. Of the various concerns explored in the first phase of WSIS only governance was dealt with in the second phase, while the rest remained ignored. Civil society groups whose concerns were not addressed (issues of gender, race, cultural diversity, human rights) have left the official process and pursued dialogue in other fora. The remaining civil society associations now seem more institutionalized and bureaucratized than they were during the first phase. Corporatism seeks to promote social integration and stability in advanced capitalist economies by creating cooperative agreements among a limited set of conflicting social groups. Global neo-corporatism does something similar, with the UN responding to NGO challenges by promoting cooperative arrangements among certain international organizations, and defusing radical opposition by co-opting moderate groups. WSIS intended open inclusion for all stakeholders based on the impossible proposition that civil society and the private sector would participate at an equal level with governments. Governments had closed plenary sessions where neither civil society organizations nor the private sector could participate. Neo-corporatist concertation supposes the passive exclusion of groups that threaten existing economic imperatives. The WGIG also excluded participants with no professional training or internet expertise; thus, the members of WGIG are more educated and privileged than the majority of members of their respective societies. WGIG members also had to be familiar with the language of ICANN, which further excluded civil society members and representatives from the bottom. However, WGIG has been cited by civil society members as a model for multi-stakeholder partnerships.
Key quotes: (W)hether willingly or not, the majority of nation-states have shifted their priorities from meeting the social and economic needs of their various constituencies to satisfying the economic interests of multinational corporations and wealthy social classes (Keane, 1998: 34) (p.366). Raboy, Marc. 2004. The WSIS as a Political Space in Global Media Governance. Continuum: Journal of Media & Cultural Studies, Vol. 18, No. 3, September 2004, pp. 347-361. Summary: The World Summit on the Information Society (WSIS) was held in two phases: Geneva 2003 and Tunis 2005. The first phase concentrated on issues that will characterize communication governance in the 21st century, which supposes a new paradigm in which new actors, such as civil society, will be increasingly involved. In
communication governance some decision makers are more important than others; and while nation-states hold enormous power in their territories, they are constrained by multilateral bodies, transnational corporations and international treaties. The WSIS is the third attempt of the UN to deal with information and communication issues on a global scale. WSIS was characterized by governments and civil society associations being at odds with one another. Countries such as Pakistan, Iran, Russia and China were opposed to the participation of civil society; China wanted to avoid references to media and human rights in the official texts; and the U.S. insisted on making information security a central point. Intellectual property was a sensitive issue for developing countries like Brazil and India, as was the question of funding the bridging of the digital divide. Many countries challenged the current Internet governance regime and the U.S. government's role as overseer of the Internet Corporation for Assigned Names and Numbers (ICANN). The first phase of WSIS produced a text destined to remain unapplied. However, the summit was important because of the result of civil society participation: this was the first UN formal structure including civil society, and the autonomous structures formed by civil society members form the basis of a new model of representation and legitimation of nongovernmental input to global affairs. Civil society organizations struggled to maintain a minimal degree of participation -- through official intervention and informal lobbying -- and their numerous contributions remained weak. Civil society also organized its own side events. The great achievement of civil society is the high degree of coordination between entities, the development of networks, the exchange of ideas and projects, and the articulation of an alternative discourse. By including civil society, the WSIS is a place of confrontation of opposing communicational paradigms. Thus, it exemplifies important emerging trends in global governance. Key quotes: The WSIS is the first UN summit where civil society was officially invited to be a participating partner (...) Many saw this as a fabulous opportunity, and they were disappointed (...) official decisions continue to be negotiated in intergovernmental structures, but the gains made by civil society will resonate (p.349). The promise of inclusion, loudly proclaimed by the event's organizers, was for the most part not realized (p.349). So, despite its disappointment -- in the tangible outcomes to be expected -- civil society has already moved towards a new paradigm and has begun to articulate a new conception of society based on communication between human beings. It is not a question of building a more equitable information society, but of developing a communication society, reviewing structures of power and domination that are expressed and sustained through information and media structures (p. 353)

Deibert, Ron, "Black Code: Censorship, Surveillance and the Militarization of Cyberspace," in Boler, ed. Digital Media and Democracy. Summary: Argues that in the face of increasingly commercial and military pressures on Internet infrastructure, "civic networks have begun to create an alternative transnational paradigm of Internet security and design, oriented about shared values and technologies."
(138) Such networks have been coalescing around the idea of "Internet protection" - they have tried to intervene in Internet governance, but Deibert points to the limitations of the IGF aside from "norm-promoting" and "coordinating." He also points to the "sousveillance" techniques of the Internet protection movement, which have resulted in the modification of corporate practices infringing on Internet freedom. Key Quotes: The Internet is much more than a simple appendage to other sectors of world politics -- it is the forum or commons within which civic communications take place. Preserving this commons from militarization is as
essential to global democratic governance as is the judicial restraint on force in the domestic political spheres. (154) Unless a transnational social movement arises to bring to bear on Internet governance the concerns of civic networks -- an open commons of information, freedom of speech, privacy and distributed grassroots communications -- the prospect of building a communications infrastructure that supports, rather than detracts from, these basic human rights will become increasingly difficult. (155) For all its many faults and digital divides, it is the Internet that is providing the means by which an increasing number of citizens around the world can and will deliberate, debate and ultimately have an input into the rules of the game by which they are governed." (158) "International relations and media theorists interested in and normatively in favor of opening up spaces for alternative voices, grassroots democracy, and global democratic governance to flourish will have to pay greater attention to the material foundations upon which global communications take place." (158).

Solum, Lawrence B. Models of Internet governance. Internet governance: infrastructure and institutions, Eds. Lee Bygrave and Jon Bing. Oxford: Oxford University Press, 2009. Summary: This chapter offers a typology of the different kinds of internet governance forms and analyzes five models. The author argues that no single model captures all the facets of internet governance or offers a solution to the problems raised by it. He rather supports a hybrid model of governance. Solum argues that research on internet governance should focus on the relationship between technical infrastructure and internet architecture (codes) and the impact of the internet on broad policy questions. The chapter is based on three ideas: (a) the internet is constituted by its architecture or codes; (b) problems of regulation can be analyzed using conventional tools such as normative theory, economics, and social theory; and (c) the space for discussing internet governance can be captured in a set of models or ideals. The fundamental subject of internet governance is what institutions should govern and how they should be organized. Governments can regulate the physical structure of the internet and the activities of its users within national boundaries. But it is apparent that the internet is also regulated by the market. The five models proposed are:
a. Spontaneous order: the internet is a self-governing realm beyond the reach of government control. The architecture of the internet is resistant to purely national control.
b. Transnational and international organizations: internet governance inherently transcends national borders, so the most appropriate institutions are transnational quasi-private cooperatives or international organizations based on treaties between national governments.
c. Code and internet architecture: many regulatory decisions are made by communications protocols and software that determine how the internet operates. Code has regulative effects on human behaviour. The nature of the internet is determined by code (the software and hardware that implement the internet). Since internet activity can originate in any location, because of its code, its physical location is irrelevant. Thus, code is the most important institution for internet governance.
d. National governments and law: national governments should regulate the internet through legal regulation.
It assumes that the internet can be regulated in accordance with the principle that each sovereign country has power over its territory. There are two problems: the attempt to subject architecture to national regulation, and the attempt to censor open-access content. The national regulation of the internet is costly and ineffective because the architecture and content can originate outside the national territory.
e. Market regulation and economics: market forces drive the fundamental decisions about the nature of the internet. This model attempts to describe internet governance in economic terms, as markets for products and services.


The author argues that a broad range of issues that involve the internet can be resolved without the intervention of any special institutions or principles of internet governance. Not all the issues of internet regulation require solutions, and in some cases it is best to leave the internet alone. The existing model of governance is that of code and architecture, and it has some benefits, like having produced a transparent internet. However, there is room for improvement. National regulation can become more sensitive to the value of preserving the internet architecture, and transnational governance institutions can become more sensitive to the economics of internet regulation. Key quotes: The internet may not be a separate, self-governing, libertarian utopia, but it is still a realm that hampers government regulation (p.10).

6. Bygrave, Lee and Terje Michaelsen. Governors of Internet. Internet governance: infrastructure and institutions, Eds. Lee Bygrave and Jon Bing. Oxford: Oxford University Press, 2009. Summary: This chapter describes the main organizations involved in internet governance. It outlines their responsibilities and agendas. The chapter also describes the roles played by national governments. The Internet Society (ISOC) was formed in 1992, with the goal of providing an organizational umbrella for internet standards development, based on the need to protect individuals involved in related lawsuits. It also funds the Internet Engineering Task Force (IETF). The Internet Architecture Board (IAB) was founded in 1992 to preside over the development of internet standards. It is a committee of the IETF and an advisory body of ISOC. The Internet Engineering Task Force was formed in 1986 as the main workhorse in the development of internet standards. It produces a range of technical and engineering documents such as protocol standards and best current practices. The Internet Engineering Steering Group (IESG) manages and oversees technical operations of the IETF. The Internet Research Task Force (IRTF) is affiliated with the IETF and is made up of several research groups. The World Wide Web Consortium (W3C) was established in 1994 at MIT. It develops standards and recommendations for the web. The Internet Assigned Numbers Authority (IANA) manages various unique codes, numbering systems, and other parameters which are crucial for internet communication. It is responsible for allocating IP addresses. The Internet Corporation for Assigned Names and Numbers (ICANN) is a non-profit public corporation registered in California. It was created in response to a call from the US Department of Commerce for a new corporate entity to assume primary responsibility for managing uniquely assigned parameters on the internet. The Internet Systems Consortium (ISC) was founded in 1994 to maintain the BIND code supporting DNS servers, and it is currently involved in several open-source projects. The chapter describes other minor internet governance organizations. In the 1980s and 1990s it was popular to deny the application of traditional law to the internet. However, rules did emerge, either in the form of procedural rules for standards development or in the form of rules for maintaining basic standards of politeness and courtesy in cyberspace (Netiquette). National governments have never made any real attempt to alter basic internet architecture. But they have often made decisions with an impact on internet development and usage.
The authors argue that the internet has never been beyond the reach of law (internet surveillance is proof of this), and that the basic question is how to apply the law.


Key quotes: Unlike the first generation of Internet governors, the bulk of new governors have traditionally not had governance of the internet as their primary remit (...) Many of them still do not have Internet governance as their chief concern but they are increasingly making conscious efforts to make their influence felt in the field. In doing so, they are infusing discourse on Internet governance with new policy agendas (p.3). Often governments become unwitting Internet governors in the sense that they have stumbled into the role without thinking specifically about the realities of the Internet. They have typically done so when passing legislation prior to, or on the cusp of, the emergence of the Internet as a major tool for commerce and personal self-realization, and drafting the legislation broadly enough to apply to the digital world (p.23).

Hubbard, Amanda and Lee Bygrave. Internet governance goes global. Internet governance: infrastructure and institutions, Eds. Lee A. Bygrave and Jon Bing. Oxford: Oxford University Press, 2009. Summary: This chapter deals with attempts to democratize internet governance. It describes the World Summit on the Information Society (WSIS), held in Geneva 2003 and Tunis 2005, and the Internet Governance Forum. The WSIS was a series of conferences for governments, businesses and civil society organized by the International Telecommunication Union and the UN. The entire process helped move policy discourse on internet governance from the sphere of the technical community into a global arena. Internet governance was not a central theme in the initial list prepared by the ITU. The key themes were infrastructure building, opening the gates to all users, services and applications, the needs of users, the development of a framework, and ICT and education. More than 11,000 individuals participated in Geneva 2003; in Tunis 2005 the number nearly doubled. But the authors argue that the summit's success in achieving a forum that included a multitude of divergent groups may have been the main cause of the lack of concrete progress on reaching consensus on difficult issues. The main formal outcome of Geneva 2003 was the adoption of a Declaration of Principles and a Plan of Action. The Working Group on Internet Governance (WGIG) was also created by the UN, and it in turn created several other groups and events; however, these seem to have duplicated one another. One benefit of the overlap was exposing a variety of viewpoints in different fora. Another benefit was the production of website content, white papers, and reports. The final report stated that internet governance is the development and application by governments, the private sector and civil society, of shared principles, norms, rules and decision-making procedures that shape the evolution and use of the internet. The long-term political impact of the WGIG lay in its proposal to create an open, multi-stakeholder forum for internet governance issues. The objective of Tunis 2005 was to review the progress towards reaching the goals set in Geneva. The Internet Governance Forum (IGF) is a discussion body, not a decision-making body. The authors conclude from the history of the WSIS, WGIG and IGF that, first, the broad range of topics that make up internet governance will continue to defy simplistic solutions. Second, the underlying political, economic and cultural differences among the stakeholders will continue to challenge any attempt to reach a complete agreement on internet governance.
And finally, that the way in which each stakeholder conveys its message will either hinder or advance the reception of that message.

Goggin, Gerard. The International Turn in Internet Governance. A World of Difference? Internationalizing Internet studies: beyond Anglophone paradigms, Eds. Gerard Goggin and Mark J. McLelland. New York: Routledge, 2009. Summary:
In this chapter the author reviews international internet governance: how it has unfolded, who it represents and includes, and what linguistic, social, and cultural values it inscribes in the technology. Since the 1970s and 1980s the internet community and its governance developed with the core principle of openness. The internet community made the development of the internet open to whoever wished to join in and had enough technical expertise. As more people became interested through the 1990s and the internet internationalized, the pressure grew for these users to be represented in how the internet was governed. The author refers to two key debates on internet governance: that over domain names in the 1990s, and the WSIS summits. What became evident from the former was the considerable interest from a range of contributors, including civil society. What became obvious from the latter was that the development of the internet and its governance were reliant upon only one country, the United States. In the preparation of the WSIS conferences, non-state actors were only observers and advisors, despite a declaration endorsing a multi-stakeholder approach. In response, civil society organized against state domination. The legacy of WSIS is that a global movement arose around the goal of ensuring that diversity and the importance of civil society concerns and their voices be recognized in the WSIS deliberations. The author suggests three conclusions when thinking about the internationalization of the internet. First, this internationalization has revealed issues for the design of governance and policy arrangements. These issues arose in part due to the demand that internet governance reflect the multiplicity of its users, and move beyond the predominance of the U.S. to allow other governments, organizations and individuals to participate in it. Second, the internationalization of internet governance raised issues like who speaks, in what language, who decides, how the powerless are addressed, and how to meet claims to democracy and development, as well as technical collaboration and interworking. Finally, due to the two debates mentioned above, new collective forms of action emerged at the international level. The author argues that these are developments and debates we need to engage with when considering the forms and implications of the internet's internationalization for governance, power, and the place of users. Key quotes: What was striking about WSIS was that the debates were not restricted to narrow understandings of the digital divide -- the discussions were very comprehensive indeed. Further, there were genuine, if deeply problematic, efforts to incorporate disadvantaged countries, groups, and actors. There is a widespread sense that WSIS was a failure, and considerable disappointment regarding this (p.55).

Hintz, Arne. 2009. Civil society media and global governance: intervening into the World Summit on the Information Society. Berlin/Münster: Lit.

Raboy, Marc, Landry, Normand, and Shtern, Jeremy. Digital Solidarities, Communication Policy and Multi-Stakeholder Global Governance. New York: Peter Lang, 2010.


Stakeholder-Issues

Deibert, Ron and Rohozinski, Rafael. "Beyond Denial: Introducing Next-Generation Internet Access Controls." Access Controlled. Key quotes: States no longer fear pariah status by openly declaring their intent to regulate and control cyberspace. The convenient rubric of terrorism, child pornography, and cyber security has contributed to a growing expectation that states should enforce order in cyberspace, including policing unwanted content. Paradoxically, advanced democratic states within the Organization for Security and Cooperation in Europe, including members of the European Union, are (perhaps unintentionally) leading the way toward the establishment of a global norm around filtering of content with the introduction of proposals to censor hate speech and militant Islamic content on the Internet.

Waldman, Steven. "The Internet and Mobile," in The information needs of communities: The changing media landscape in a broadband age. Federal Communications Commission, 2011. Summary: This chapter provides a summary of the FCC policies most relevant to news and journalism. Regarding access, the FCC has a plan for providing high-speed access for 100 million Americans over the next decade, encouraging wireless connections to generate independence from cable and telephone wires. This will expand access and potentially lower consumer prices. As news media migrate to the internet and wireless becomes more available, the author concludes that a flourishing wireless ecosystem is essential to the future of the news (p.305). Yet there are communities with high-speed internet service that simply do not use it because it is too expensive. To address that, the FCC seeks to increase digital literacy for low-income residents, with public libraries playing a key role. There is much debate about how to ensure net neutrality, including what the government's role in it should be. However, there is consensus that openness should persist. Net neutrality is key for the development of news and journalism. Despite net neutrality rules, the FCC recognizes that broadband providers tend to act as gatekeepers, favouring or disfavouring particular content, applications and services. There is debate about how revenue should be divided. While cable companies share some of their revenues with content providers, this does not happen with ISPs, which sell access to free content but pay nothing for it. The FCC has in recent years emphasized policies to expand mobile services, including broadband. This would have a positive impact on the delivery of local information to communities, particularly among groups with restricted access to a PC. The FCC has also taken steps to expand service for low-income consumers, Native Americans, and persons with disabilities. Another source of information for mobile subscribers is the FM chip, which lets mobile phones function as an FM radio. Broadcasters have complained that American consumers have less access to it than consumers in Europe and Asia, but wireless companies and device manufacturers say that many phones offer the service and there is no demand for it. Key quotes: A world without an open Internet would be one in which the very innovation we are depending on to save journalism would lose its oxygen before it had a chance to flourish (p.307).


Thompson, Marcelo. The Neutralization of Harmony: Whither the Good Information Environment, 18:2 B.U. J. Sci. & Tech. L. (forthcoming 2011). Summary: In this article Thompson criticises technological neutrality as a Western imposition on the world, while defending China's interventionist, Confucian approach. Technological neutrality in law supposes that the law should not benefit or hinder particular types of technological artefacts. This paper examines the idea of technological neutrality, particularly in relation to liberal and Confucian philosophical traditions. According to the author, as technology touches all dimensions of life, so does the expectation that the state limit its control of the technological. This principle of technological neutrality has been proposed to the WTO by Western countries -- according to the author -- as a means of preventing China from carrying out specific regulatory initiatives for its territorial Internet. The author argues that the law is very vague when referring to technological neutrality, based on the argument that the more abstract the law, the less susceptible it will be to technological variation. There is also the notion that by writing laws in ways that do not describe specific properties of technological artefacts, states will be able to stick to the non-discrimination principle. However, the author argues that the problem with this lies in what is meant by not framing the law in terms of technology itself. Originally, the operating systems of PCs and the Internet were designed to be accessible and flexible; the author argues this is changing, and now it has more to do with the architecture of the world-scale computational grid in which the Internet and the PC are intertwined: things that before were done at the PC are now done on the Internet, and people are adopting devices which in theory are able to perform the same functions as a computer but in practice are locked down, their range of processes contingent upon authorization. This results in an increasingly closed, concentrated and gatekeeped Internet (p.16). The law can't address these issues because it is directed only to the functions of the technological devices, but according to the author it needs to choose between different possible technological models. The author argues that devices are designed for technological and political reasons, which are excluded from the realm of state action by technological neutrality. He also argues that technological neutrality has revived political neutrality to the point that the theoretical foundations of politics seem to have already called it a day (p.23). According to Thompson, the political structure of old-fashioned forms of liberalism commands governmental restraint, restricting the pursuit of valuable goals and precluding the possibility of governmental action even where there would be sound reasons for action (p. 29). The Confucian concept of harmony supposes the expression of a diversity of options and criticisms, unlike neutrality, which tends to uniform negative constancy (p.41). Thompson argues that neither Western nor Eastern states should adopt technological neutrality. Technological artefacts embed societal and political value choices; from social media to search engines, the way we express friendship or the determination of the relevance and morality of what we can and cannot access has a fundamental and pervasive impact on individual and collective reason.
Thompson argues that the path pursued by China is one that has as a principle the political enframing of scientific endeavour rather than an affirmation of technological indulgence by the political (p. 48). He also argues that the United States and Google have been using international human rights and trade to pursue the neutralization of China's technological policies. Technological neutrality is a principle of deference, argues Thompson, as it asks states to regulate their own conduct and subjects the international community to the will of the states and corporations who hold technological stakes. Key quotes: [T]he principle of technological neutrality has become the touchstone of Western law and policy making in the information age, elevating neutrality to heights it had never reached before (p.5). Some would claim that technological neutrality is about ensuring that law has neutral effects upon technologies or technological markets, rather than being a matter of wording. However, technological neutrality is a matter of wording; it is in the explicitly articulated rules of the normative order that the effects of technological neutrality
are felt -- for what technological neutrality does is exclude reasons of a technological nature from an important dimension of practical reasoning, which is that of the reasons provided by law (p.12). [W]hen our focus moves from technological reasons towards technological artefacts, any illusion of autonomy disappears. This is so as, when reflected in the architecture of technological artefacts, technological reasons are modified by political ones (p. 19). [B]y excluding state action based on conceptions of the good -- here those that are reflected in technological artefacts -- technology neutrality is tantamount to political neutrality (p.21). Processes of standardization of technologies such as the Internet have a pervasive impact on our lives and attempts by non-state actors to capture the unfolding of such processes are much more serious than many such carried out in the houses of parliament (p.31).

Lawford, John, Janet Lo and Michael De Santis. Staying Neutral: Canadian Consumers and the Fight for Net Neutrality. Public Interest Advocacy Centre.
Ottawa, Ontario: 2009. Summary: This is a report that presents a view of net neutrality from a consumer perspective. Six focus groups were conducted in January 2009 in Toronto, Vancouver and Montreal (in French and English) to determine consumer
knowledge and reactions to net neutrality issues in Canada. The Public Interest Advocacy Centre (PIAC) also participated in CRTC hearings and interviewed stakeholders. Overall, consumers did not seem to be aware of the debates on net neutrality, but they were concerned about issues related to it, such as universal access, privacy, censorship and the commercialization of the internet. The concept of net neutrality was confusing for many focus group participants, and most of them were unaware of related issues. However, once they became aware they were interested and engaged in the discussion. Consumers opposed traffic shaping and throttling as a means to resolve bandwidth problems. Compared to the United States and Europe, Canada's regulatory measures have set the country back in terms of recognizing net neutrality as a principle of the internet and protecting broadband consumers. In the report PIAC calls on Parliament to preserve consumers' right to net neutrality, and urges the government to guarantee access and to set legal broadband service standards with forward-looking minimum speed targets for ISPs, similar to the European approach. The report recommends:
1. Parliamentary leadership for net neutrality
2. Consumer education on net neutrality and CRTC decisions
3. Guidance for consumers on the complaints procedure
4. Consideration of how this complaints mechanism will work for the CRTC and the CCTS
5. Consumers should use the complaint mechanism to challenge internet traffic management practices.
6. Technical guidance for time-sensitive criteria under s. 36 (the CRTC should report on what is and what is not time sensitive).
7. Forward-thinking legal standards for broadband
8. Consumer protection from removal of their internet access connection, without due process of the law, for copyright infringement allegations.
Key quotes: If ISPs are allowed to continue using their current ITMPs, then Canadian consumers may be looking at a future with a non-neutral internet in Canada (p. 7). This lack of awareness about net neutrality is troubling, since it makes the creation of net neutrality policies more difficult. If consumers are not able to opine upon or even identify net neutrality issues, they will be equally unable to choose appropriate solutions proposed to them by legislators (p.17). If broadband access and higher speeds were made an explicit government priority, as has been undertaken in the E.U. and started in the U.S., the problems of network capacity might be somewhat alleviated. However, it is clear that at present, Canada is not keeping pace with the rest of the world (p.74).

Malcolm, Jeremy. 2008. Multi-Stakeholder Governance and the Internet Governance Forum. Terminus Press.

Jurisdictional-Issues


Filtering-&-Information-Control

1. Deibert, Ron and Rohozinski, Rafael. Beyond Denial: Introducing Next-Generation Internet Access Controls. Access Controlled: The Shaping of Power, Rights and Rule in Cyberspace. Summary: Discusses the Australian filtering plan as an example of "internet censorship becoming a global norm." Talks about corporate "self-regulation pacts." Key Quotes: The center of gravity of practices aimed at managing cyberspace has shifted subtly from policies and practices aimed at denying access to content to methods that seek to normalize control and the exercise of power in cyberspace through a variety of means... these next-generation techniques employ the use of legal regulations to supplement or legitimize technical filtering measures, extralegal or covert practices, including offensive methods, and the outsourcing or privatizing of controls to 'third parties' to restrict what type of information can be posted, hosted, accessed or communicated online... examples include surveillance at key choke points of the Internet's infrastructure, legal takedown notices, stifling terms-of-usage policies, and national information-shaping strategies (6). Another control beyond denial in Access Controlled relates to the growing and widespread prevalence of cyberspace as a communications environment and the ways in which third-party intermediaries, including private companies and public institutions, host, service and ultimately control that environment. At one point in time, it might have been fair to characterize cyberspace largely as a separate and distinct realm -- something people "enter into" when they turn on their computer or play video games. Today, however, with always-on portable devices that are fully connected to the Internet, and much of society's transactions mediated through information and communication technologies, cyberspace is not so much a distinct realm as the very environment we inhabit.

2. Pariser, Eli, The Filter Bubble. Penguin, New York, 2011. Summary: Pariser discusses the problems inherent in personalized search: in attempting to customize sources to your preferences (since December 2009), Google forecloses exposure to the sort of information necessary for democratic exchange. Gives the example of Facebook becoming a primary news source, and one which filters out stories that are dissimilar to those you select. The filter bubble is invisible to most people; we don't know when we are in it and cannot choose to opt out. It is in part a consequence of the attention crash. Its roots lie in the collaborative filtering technologies developed in the 1990s, which remained stagnant until the launch of Amazon and the realization of the commodity potential of filtering. Pariser discusses "behavioral retargeting," in which ads are tied to a user and "follow" the user across various sites on the Internet... anticipates that this sort of technology will soon be used by content providers as well as advertisers. He notes that post-2000 media critics often spoke of the "disintermediation" of news -- the elimination of the editor as a sort of middleman -- as a positive thing, but argues (with Tim Wu) that the rise of the Internet did not eliminate intermediaries, but rather changed them. Google News is the example of a new sort of intermediary. (He admits that Google News is still slanted towards the editorial judgements of newspapers.)
Krishna Bharat, the designer of the Google News prototype, would like to move the technology onto the sites of other content producers as well, in order to enable them to personalize news for their clients. This means media consumption is changing from a "pull" system to a "push" system. Pariser also discusses demand-driven news: he gives the example of Las Ultimas Noticias, a Chilean paper that only follows up on the stories that get the most clicks. At Yahoo's Upshot news blog, editors assign articles based on search queries.
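Since Pariser traces personalization back to collaborative filtering, a bare-bones sketch of that technique may be useful. The reader names and click data below are invented for illustration, and real recommender systems are far more elaborate; the point is only to show the mechanism that sorts the public sphere Pariser describes: stories clicked by readers who click like you are promoted, and everything else quietly drops away.

from math import sqrt

# Which stories each (hypothetical) reader clicked.
clicks = {
    "ana":   {"politics-1", "politics-2", "tech-1"},
    "ben":   {"politics-1", "politics-2", "sports-1"},
    "carla": {"tech-1", "tech-2", "science-1"},
}

def cosine_similarity(a: set, b: set) -> float:
    """Cosine similarity between two readers' sets of clicked stories."""
    if not a or not b:
        return 0.0
    return len(a & b) / (sqrt(len(a)) * sqrt(len(b)))

def recommend(user: str, k: int = 3) -> list:
    """Rank unseen stories by how heavily similar readers clicked them."""
    seen = clicks[user]
    scores = {}
    for other, items in clicks.items():
        if other == user:
            continue
        weight = cosine_similarity(seen, items)
        for story in items - seen:          # only stories the user has not read
            scores[story] = scores.get(story, 0.0) + weight
    return sorted(scores, key=scores.get, reverse=True)[:k]

if __name__ == "__main__":
    print(recommend("ana"))  # e.g. ['sports-1', 'tech-2', 'science-1']

Run on this toy data, recommend("ana") surfaces the sports story her politics-reading neighbour clicked and never ranks highly anything her neighbours ignored -- the filter bubble in miniature.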


Key quotes: With little notice or fanfare, the digital world is fundamentally changing. What was once an anonymous medium where anyone could be anyone is now a tool for soliciting and analyzing our personalized data. Advertisers no longer needed to pay The New York Times to reach Times readers: they could target them wherever they went online. The era in which you needed to develop premium content to get premium audiences, in other words, was drawing to a close. (48) Personalization has given us... a public sphere sorted and manipulated by algorithms, fragmented by design, and hostile to dialogue. (164)

Zittrain, Jonathan. The Future of the Internet--And How to Stop It. Yale University Press, 2009. Print. (Summary and key quotes above.)
Zittrain considers that these meetings are missing key participants - computer scientists and geeks - and that without them the prospect of coding new tools and protocols to facilitate social solutions is easily neglected. Zittrain suggests that the best approach to securing the internet and the innovations built upon it is to empower its users to contribute, rather than impose security models controlled by a handful of people. He adds that PC users who are unaware of their digital environments and unable to act when facing danger should also become better prepared. Key quotes: [The rise of tethered appliances] will affect how readily behavior on the Internet can be regulated, which in turn will determine the extent that regulators and commercial incumbents can constrain amateur innovation, which has been responsible for much of what we now consider precious about the Internet (p.9).


[C]onsumers find themselves frustrated by PCs at a time when a variety of information appliances are arising as substitutes for the activities they value most. Digital video recorders, mobile phones, BlackBerries, and video game consoles will offer safer and more consistent experiences. Consumers will increasingly abandon the PC for these alternatives, or they will demand that the PC itself be appliancized (p.57). Generative systems are threatened by their mainstream success because new participants misunderstand or flout the ethos that makes the systems function well, and those not involved with the system find their legally protected interests challenged by it. Generative systems are not inherently self-sustaining when confronted with these challenges (p.65). The keys to maintaining a generative system are to ensure its internal security without resorting to lockdown, and to find ways to enable enough enforcement against its undesirable uses without requiring a system of perfect enforcement (p.126).


Deep-packet-Inspection Parsons, Christopher. Working Paper: Deep Packet Inspection in Perspective: Tracing its lineage and surveillance potentials. Kingston, ON, CAN: Surveillance Studies Centre, Queen's University, 2008. Summary: By using packet inspection and capture technologies, ISPs can search and record the content of unencrypted digital communications data packets. This paper explains the structure of these packets and describes the technologies that monitor their movements and content. The author argues that these should be considered surveillance technologies that can potentially be very invasive. Packets are transmitted from clients to servers; they contain seven layers of content: application, presentation, session, transport, network, data link, and physical. The server decodes the packet and responds to the client, which then receives the decoded packets with the requested information. Packet analysis technologies have been in use for over 15 years. There are three classes of packet inspection: shallow, medium, and deep. Shallow packet inspection (SPI) technologies drive the simple firewalls in most operating systems. These firewalls limit user-specified content from either leaving, or being received by, the client computer. The technology examines a packet's header information and evaluates it against a blacklist; it cannot read beyond the header information and focuses on the second and third layers. Medium packet inspection (MPI) refers to devices that stand between end-users' computers and ISP/Internet gateways. These proxies examine packet header information against a loaded parse-list. A parse-list is more subtle than a blacklist: whereas the latter establishes that something is either permissible or not, a parse-list allows specific packet types to be permitted or refused based on their data format types and associated location on the internet, rather than on their IP address alone. Deep packet inspection (DPI) technologies are found in expensive routing devices installed in major networking hubs. They make it possible to identify precisely the origin and content of each packet that passes through these hubs. MPI devices have limited application awareness, but DPI devices can look inside all traffic from specific IP addresses, select the HTTP traffic, and reassemble emails as they are typed out by users. Deep packet capture (DPC) allows thousands of packets to be stored until enough information is gathered to match the packets against the device's list of known packet types. It is then possible to know which application is generating and sending the packet, and rules can be applied to allow or disallow it. Rules can also moderate the rates of data flowing to and from an application (throttling). DPI slows the transmission of packets. DPC is not marketed as a way to constantly capture all of the data that ISPs' customers send and receive, but it can be used to improve network performance and to comply with regulatory demands. Evaluating packets lets ISPs identify applications that could affect network performance and develop rules that reduce congestion. DPI technologies can be used to improve network security, implement access requirements, guarantee quality of service, and tailor service for particular applications. The author distinguishes between surveillance, which involves using DPI devices to inspect each packet that passes along the network, and search, which involves looking for a particular element of network traffic.
Search, unlike blanket surveillance, could provide a deep field of data that is relatively limited in scope. The distinction between broad and narrow surveillance raises questions about the impact of surveillance, about who is, or may be, discriminated against, and about the responses to it. It raises questions of freedom of speech, privacy, surveillance, net neutrality, and freedom of association. In Canada, ISPs use DPI technologies to monitor and shape bandwidth, regulate file-sharing applications, modify webpages, and redirect customers to their own websites.
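To illustrate the difference in depth that Parsons describes, the following is a minimal sketch of shallow, medium and deep inspection applied to a single packet. It is my own simplification, not Parsons's code or any vendor's product; the packet fields, blacklist, parse-list and payload patterns are all hypothetical.

```python
# Illustrative packet represented as a dict; real inspection gear works on raw frames,
# but the header/payload split is enough to show what each inspection depth can see.
packet = {
    "src_ip": "203.0.113.7",
    "dst_ip": "198.51.100.20",
    "dst_port": 80,
    "content_type": "video/mp4",  # header-level metadata (hypothetical)
    "payload": "GET /movies/episode1.mp4 HTTP/1.1\r\nHost: example.com\r\n...",
}

IP_BLACKLIST = {"192.0.2.99"}                          # shallow: header fields only
PARSE_LIST_BLOCKED_TYPES = {"application/bittorrent"}  # medium: declared data-format types
DPI_PAYLOAD_PATTERNS = [".mp4", ".torrent"]            # deep: strings inside the payload

def shallow_inspect(pkt):
    """Firewall-style check: header information evaluated against a blacklist."""
    return "drop" if pkt["src_ip"] in IP_BLACKLIST or pkt["dst_ip"] in IP_BLACKLIST else "allow"

def medium_inspect(pkt):
    """Proxy-style check: permit or refuse packet types by declared data format."""
    return "drop" if pkt.get("content_type") in PARSE_LIST_BLOCKED_TYPES else "allow"

def deep_inspect(pkt):
    """DPI-style check: read the payload itself and apply a rule, e.g. throttle video."""
    if any(pattern in pkt["payload"] for pattern in DPI_PAYLOAD_PATTERNS):
        return "throttle"  # a rule moderating data rates for this application
    return "allow"

for name, check in [("shallow", shallow_inspect), ("medium", medium_inspect), ("deep", deep_inspect)]:
    print(name, check(packet))
```

Only the deep check reads the payload, which is why DPI is the level at which application-specific rules such as throttling become possible.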


Key quotes: DPI enables ISPs to identify the tools that are being used to communicate (...) in addition to the IP addresses that communications data is being transmitted to, number of intended recipients of data flows, and the perceived relations between individuals transmitting data to one another (p.12). Canadian ISPs are using DPI technologies to survey the entirety of their clients' activities as part of their routine business operations (...) use a massive surveillance technology to comprehensively remain aware of the data traffic coursing along their respective networks (...) there is no evidence that Canadian ISPs have shifted from surveillance to search (p.12).


Cloud-Computing 1. Deibert, Ronald. Questions of trust as we head into the cloud. The G20 Cannes Summit 2011: A new way forward, Eds. John Kirton and Madeline Koch, Newsdesk Media Group and the G20 Research Group, 2011. Summary: This article, by the director of the Canada Centre for Global Security Studies and the Citizen Lab, argues that cloud computing represents a paradigm shift in communications. Under the internet paradigm, the companies that ran the infrastructure were independent of the content that flowed through their networks. Now, by contrast, data is entrusted to international corporations - such as Google, Amazon and Facebook - who act as gatekeepers. Before, data was only as secure as people could keep it behind closed doors and in locked cabinets; now it is only as secure as the companies that host it. However, cloud computing companies are far less concerned with security than with making revenue. Thus, there has been a growing rash of major security breaches across governments and the private sector. Research has uncovered cloud-based espionage from jurisdictions in countries like China, Iran, Syria and Burma, targeting politicians, human rights activists and the military in Asia, Europe and North America. Cloud computing has created new governance issues. Data is transported in an instant to other political jurisdictions. Governments seeking to control cyberspace must work with the private sector that owns and operates the cloud in order to implement laws, regulations, incentives and other types of pressure. Different legislation on online privacy can result in citizens who use different communications services living with entirely different rights. For example, intermediary liability is very different in Canada and the U.S. than in Belarus, Iran or China. In non-democratic countries ISPs, telecom carriers and mobile operators are asked to police political content, track dissidents and send threatening messages over their networks. The author argues that the private sector that operates the cloud should be required to spend as much effort protecting users' privacy as it does policing the internet for law enforcement agencies and copyright holders. Key quotes: As cyberspace becomes an object of geopolitical contests and a political battlefield among authoritarian regimes and their adversaries, clouds will become vectors for cyber-espionage and politically motivated attacks (p.244). As people's data evaporates into the clouds, so seemingly do their rights (p.244).


Privacy


Archivability


Online-Journalism


Early-Online-Journalism-Research Allan, Stuart. Online news: journalism and the Internet. McGraw-Hill International, 2006. Print. Summary: This book examines the ways in which the users of online news are rewriting the rules that have traditionally governed journalism as a profession. The book analyses online news coverage of particular stories in the US and UK, considering the following questions: a. Which reportorial innovations have attracted public attention to online news, and why? b. To what extent have forms and practices of online news become conventionalised? c. What factors are shaping the gradual consolidation of these conventions? d. In what ways will online news have to develop in order to further enhance its reportorial integrity? Allan describes two crucial moments, or tipping points, for journalism: a. Murdoch's 2005 speech telling media owners that the next generation of news consumers are digitally native and expect news online on demand, so traditional journalism needs to evolve. b. The 2004 tsunami in Indonesia, when citizens began posting on weblogs about what was going on, including video and pictures. This material was picked up by news organizations, thus broadening the definition of news. Analysing the coverage of the Oklahoma City bombing, Allan suggests that this event established the internet as a site for breaking news. People went online in huge numbers for the first time for updates, and for the first time, instead of reproducing print content, news sites published the stories first. Other events, such as the death of Princess Diana, later led to debates about the quality of online news. A discussion also emerged about how traditional media supposedly just reproduced official statements, while online reporters seemed more open and would allow readers to participate in the production of news. Allan refers to the BBC's website and the Drudge Report as the two ends of the online journalism spectrum: the former with conventional news coverage and the latter covering rumors and odd news. The author calls Drudge "the most famous pioneer in the earliest years of the democratization of journalism" (p.44). Referring to blogging, Allan argues that while news websites seek to maximize profits by encouraging readers to stay on the site, weblogs are not interested in profit and link to several other sites. Newspapers at first dismissed bloggers as amateur journalists and as likalists (p.51). Later, blogs became essential to journalists. During 9/11 news organizations had problems with their websites due to huge traffic, and citizen journalists, as well as alternative news sites, were the main sources for updates online. Blogs have become a challenge for professional journalists because bloggers sometimes know more about particular topics and do a better job than professional journalists. Bloggers sometimes also cover important stories ignored at first by the mainstream media. Allan suggests that blogs and blogging can be considered a third tipping point for journalism. During crisis situations, blogs have proved to be crucial for reporters trying to get the story. Similarly, photos and videos sent by citizens in places where journalists do not have access, or by citizens who have witnessed a disaster or attack, began to be included by journalists in their stories. Alternative online news sites such as OhmyNews, IndyMedia and Wikinews emerged as a response to the concentration of mainstream media.
Now, mainstream commercial sites are capitalizing on user-generated content and openly experimenting with new forms of journalism previously used by alternative media, as a means to enhance their connectivity with users. However, Allan concludes, because of economic pressures, competition, and concerns about reportorial commitment and integrity, online news sites - mainstream and independent alike - are becoming more similar in how they define and cover news.


Key quotes: Online journalism, at its best, brings to bear alternative perspectives, context and ideological diversity to its reporting, providing users with the means to hear voices from around the globe (p.105). In the meantime, it is likely that citizen journalism will be increasingly recognized by commercial sites for its attractiveness to users (...) as well as for the relatively modest operational costs involved (...) And yet, I would suggest, both similarly promise to curtail the very aims, values and commitments which citizen journalism, at its best, represents (p.142). There appears to be little doubt - in the eyes of both advocates and critics alike - that citizen reporting is having a profound impact on the forms, practices and epistemologies of mainstream journalism, from the international level through to the local (p.166). [I]t is my perception that news sites are beginning to share more similarities than differences. This is so, in my view, not only in their appearance but also, more troublingly, in their preferred definitions of what counts as news and how best to report it (p.184).


Canadian-Journalism-Scholarship 1. Perigoe, Ross. Ten-year retrospective: Canada and the United States in the age of digital journalism. Journal of Media Practice, Vol. 10 (2&3), 2009. 2. Felczak, Michael, Richard Smith and Geoffrey Glass. Communicating with (Some) Canadians: Communication Rights and Government Online in Canada. Canadian Journal of Communication, Vol. 34 (3), 2009. 3. Sparks, Robert, Mary Lynn Young and Simon Darnell. Convergence, Corporate Restructuring, and Canadian Online News, 2000-2003. Canadian Journal of Communication, Vol. 31 (2), 2006. 4. Edge, Mark. Balancing Academic and Corporate Interests in Canadian Journalism Education. Journalism & Mass Communication Educator, Summer, 2004. 5. Winseck, Dwayne. Financialization and the Crisis of the Media: The Rise and Fall of (Some) Media Conglomerates in Canada. Canadian Journal of Communication, Vol. 35 (3), 2010. 6. Gasher, Mike and Sandra Gabriele. Increasing Circulation? A comparative news-flow study of the Montreal Gazette's hard-copy and on-line editions. Journalism Studies, Vol. 5 (3), 2004. 7. Barratt, Neil and Leslie Regan Shade. Commentary: Net Neutrality: Telecom Policy and the Public Interest. Canadian Journal of Communication, Vol. 32 (2), 2007. 8. Anderson, Steve. Net neutrality: The view from Canada. Media Development, Vol. 1, 2009. 9. Belanger, Pierre C. Online News at Canada's National Public Broadcaster: An Emerging Convergence. Canadian Journal of Communication, Vol. 30 (3), 2005. 10. Viseu, Ana, Andrew Clement and Jane Aspinall. Situating Privacy Online: Complex perceptions and everyday practices. Information, Communication & Society, Vol. 7 (1), 2004. 11. Milliken, Mary, Kerri Gibson and Susan O'Donnell. User-generated video and the online public sphere: Will YouTube facilitate digital freedom of expression in Atlantic Canada? American Communication Journal, Vol. 10 (3), 2008. 12. Thrift, Samantha. Who Controls Canada's Media? Conference Report: McGill Institute for the Study of Canada, Montreal. Canadian Journal of Communication, Vol. 28 (2), 2003.


Public-Sphere Dean, Jodi, "Communicative Capitalism" in Boler, Megan, ed. Digital media and democracy: tactics in hard times. MIT Press, 2008. Summary: Dean argues that the circulation of information, or content, within modern society has taken the place of debate linked to action. She does not deny that networked communication can lead to political action, but says the density of information in communicative capitalism makes this difficult if not impossible. She draws on Agamben (communicative exchanges are not fundamental to democratic politics now, but are rather the basic elements of capitalist production). Messages are uncoupled from the sender-receiver dynamic and become part of the circulating data stream. Messages have USE VALUE, while contributions simply have EXCHANGE VALUE. The belief in their value stems partly from the TECHNOLOGICAL FETISH, which condenses complex problems into issues solvable through a technological fix, and displaces real action into technological action. Key quotes: "Today, the circulation of content in the dense, intensive networks of global communications relieves top-level actors (corporate, institutional, governmental) from the obligation to respond. Rather than responding to messages sent by activists and critics, they counter with their own contributions to the circulating flow of information, hoping that sufficient volume will give their contributions dominance or stickiness (the result being) a multiplication of resistances and assertions so extensive that it hinders the formation of strong counterhegemonies." (102) "Communicative capitalism designates that form of late capitalism in which values heralded as central to democracy take material form in networked communications technologies." (104) "Struggles on the net reiterate struggles in real life, but insofar as they reiterate these struggles, they displace them." (109) Preston, Paschal. An Elusive Trans-national Public Sphere? Journalism and news cultures in the EU setting. Journalism Studies 10.1 (2009): 114. Web. Summary: This study analyses the media-technology relationship, focusing on the socio-cultural shaping of technology in a European context. It seeks to answer: (1) Is there an emergent European journalistic culture or set of practices reflecting a specific European sense of identity? (2) Is there a pattern in the way European issues are covered? The research comprised in-depth interviews with almost 100 senior journalists in 11 European countries, as well as reviews of the journalism studies literature in each country. The aim was a multi-country study of changes in journalism culture centered on dominant professional values, the influence of market forces in newsmaking, how controversial issues are dealt with, the role of technological innovation, and changing relations between journalists and audiences. Results showed that the dominant professional values have changed little due to technology, and they are very similar across countries. Growing commercialization of news has increased the volume of work and sped up the production process, negatively affecting quality in the newsrooms. Technology has changed routines but not values in the newsrooms. Technology has facilitated the expansion of media outlets, but this has not been matched by an expansion of diverse or novel forms of information. Instead, new technologies have encouraged the recycling of information across different media and platforms, and research and investigation have been reduced.
The increased number of media outlets has not been matched by any increase in the number of full-time professional journalists.


European issues and stories are usually covered in terms of their impact at the national level. Interviewees also emphasized a lack of audience interest in European issues. Some interviewees argued that there is no common emergent European journalistic culture, while others said that journalists share a philosophical commitment, especially those working for serious or high-brow media appealing to intellectual and political elites. EU-level integration has promoted transnational ownership and pan-EU regulatory frameworks related to the media, but this has not created a common journalistic culture. However, there is a certain common elite form of mediated public sphere that caters to the industrial, financial and political strata. Key quotes: The research evidence clearly points to the continuing disconnect between the spaces of connectivity, on the one hand, and the operational spaces of news cultures and journalism, on the other hand (p.125). Jenkins, Henry, and David Thorburn. Democracy and new media. MIT Press, 2004. Print. Tewksbury, David, and Jason Rittenberg. Online News Creation and Consumption: Implications for Modern Democracies. Routledge Handbook of Internet Politics. 1st ed. Ed. Andrew Chadwick & Philip N. Howard. Routledge, 2008. Print. Schudson, Michael. Click Here For Democracy. Democracy and New Media. Ed. Henry Jenkins & David Thorburn. The MIT Press, 2003. Print.


General-Online-Journalism-Theory Chyi, Hsiang Iris, "Information Surplus in the Digital Age: Impact and Implications." In Papacharissi, Zizi, ed. Journalism and Citizenship: New Agendas in Communication. Summary: Explores the role of "information surplus" and "attention deficit" in changing news consumption. Argues that news is now one of many information categories competing for users' information share. In the quest to make news desirable to consumers, the civic function of news diminishes. Calls for a new media research agenda through the lens of information surplus. Picard, Robert. Mapping Digital Media: Digitization and Media Business Models. Reference Series No. 5. Open Society Foundations. July 2011. Summary: In the Key Policy and Legal Issues section of the report, Picard argues that policy and law are needed to prevent internet and other telecommunication providers from controlling distribution, because otherwise they will try to take advantage of these systems and promote their own interests. Internet providers may try to limit the amount of certain types of content in order to reduce their investment in capacity. Therefore, equal access to and use of these systems by individuals and organizations should be guaranteed. The policy mechanisms needed for this are: a. specific protective measures to ensure that competitors' content is not excluded as a means of unfair competition b. net neutrality regulations c. requirements that certain types of content or content providers must be carried on distribution systems Picard identifies as a problem that the means of distribution are now global rather than national, while domestic and international law regarding taxation, the freedom to disseminate and receive, and mechanisms to censor or punish were created in the analog age, when borders, importation processes and domestic distribution systems were more controlled. Finally, another issue, according to the author, is to ensure mechanisms for content creators to benefit financially from their work. However, copyright and related rights protection needs to be balanced with fair use, protecting use by educational institutions, libraries and protected groups. Key Quotes: Most policies and laws on taxation, trade, libel, privacy, obscenity, and copyright date from an era when there were clearly identifiable producers, publishers and broadcasters who created and disseminated information and could be held responsible for it. Today, content is not disseminated merely through traditional channels for which policy processes and procedures were established, but is transported through constantly changing networks in which identifiable and anonymous users choose whether or not to access content, reconfigure and retransmit content, and create content of their own, thus gaining power over the content themselves (p.17-18). Many provisions to protect the business models of audio and visual producers have been put in place internationally and domestically over the past decade (...) Newspaper publishers are arguing for similar protection against aggregators, social networks and bloggers in the U.S. and a number of European states, and there are efforts to obtain a similar type of protection for broadcasters against retransmissions (p.18). Atton, Chris. An Alternative Internet: Radical Media, Politics and Creativity. Edinburgh: Edinburgh University Press, 2004.


Summary: This book looks at the alternative internet through a series of case studies exploring the use of the internet by individuals and organizations with alternative philosophies and practices, i.e. alternative to the dominant and expected ways of doing media. Atton examines Indymedia as an example of alternative journalism and a radical form of public journalism, intimately involved with global struggles against corporate governance. He also examines the British National Party's website and its discourse of far-right media. The second half of the book is devoted to popular cultural activity on the internet, and how audiences may become critics and commentators on those products and creators in their own right. He looks into new forms of social authorship, such as open copyright, anti-copyright and copyleft, which challenge commercially based notions of ownership. Finally, the book examines online radio for political activism, popular music and education, as well as fan culture. Atton argues that when studying the internet one has to consider that it is a series of human processes. To think of it as an unproblematic source of social change is to ignore the political and economic determinants that shape technology, and how these may be influenced by social and cultural elites. It is also to ignore the obstacles to empowerment that legislation, inequalities of access and limits on media literacy pose to groups and individuals. Regarding radical online journalism, Atton asserts that Indymedia deploys various radical journalism methods, such as first-person native reports; radical critiques of government policies, actions and the mass media; use of mainstream sources; and the creation of spaces for discussion. Through these practices alternative journalism critiques dominant news values and transforms dominant practices. However, it does not represent a total rupture with journalistic norms. Atton argues that in analysing alternative media researchers should avoid the ghettoisation of such media, which may lead to their marginalization. They should also avoid valorizing alternative media because of their difference or because they appear to resist in a way that might appeal to our own political sensibilities. Key quotes: [Radical journalism's] new practices signal a challenge to the epistemological basis of mainstream news production. In its place we see enacted a socially situated and self-reflexive form of journalism (...) We see a move away from journalism as expert culture and commodity; readers are invited to approach the knowledge presented here not as the product of an elite authority but as the result of a process that comes about through the impersonal connectedness of journalist and reader (p.60). Alternative media practices are hybrid practices that embody continuation as well as reform and rupture. Nor are they to be understood solely in relation to political activism (p.159). Greer, Jennifer and Donica Mensing. The Evolution of Online Newspapers: A longitudinal content analysis, 1997-2003. Internet Newspapers: The Making of a Mainstream Medium, Ed. Xigen Li. Mahwah, NJ: Laurence Erlbaum Associates, 2006. Summary: This is a longitudinal content analysis of 83 U.S. online newspapers - all attached to print newspapers - from 1997 to 2003. The study examines trends in news presentation and content, multimedia use, interactivity, and potential revenue sources. The results show two trends that emerged over the seven years spanned by the study.
First, online newspapers are offering more content, multimedia, interactivity, and revenue-generating features. Second, medium and large newspapers' sites are almost identical in the number of features offered; however, small newspapers' sites are significantly less varied. Regarding content, local stories still dominate, although other types of content are also offered, such as archives, national news and news wires. Multimedia use has increased steadily over the study period and the adoption of these features is accelerating. Newspapers are still working on interactive elements appropriate for an online news environment. These websites have also enhanced their


advertising and tried to make this business model work online. The study concludes that instead of stagnating, online newspapers have evolved, developing new features according to the needs of the audience and the organization. However, online newspapers continue to search for a successful business model. Key Quotes: The findings in this study suggest that online newspapers continue to search for a successful business plan, while adding new features and discarding those that have failed to deliver significant revenue (p.30). This study suggests that online newspapers are not only evolving, but that they are thriving at least in terms of variety of content and features available. Newspaper Web developers are experimenting, adding new elements, and abandoning features that do not work (p.30-31). Bucy, Erik P. and Robert B. Affe. The Contributions of Net News to Cyber Democracy: Civic Affordances of Major Metropolitan Newspaper Sites. Internet Newspapers: The Making of a Mainstream Medium, Ed. Xigen Li. Mahwah, NJ: Laurence Erlbaum Associates, 2006. Summary: This chapter examines the extent to which newspaper sites facilitate civic participation online through civic journalism and the use of interactive features. The study asked what opportunities for interactivity and citizenship the websites of leading U.S. newspapers offered, and whether the range of user options relevant to political participation during the 2004 presidential campaign could be conceptualized as civic affordances that turn readers from passive consumers into engaged citizens. The researchers carried out a content analysis of 48 newspaper websites. The unit of analysis was the Politics or Election 2004 section of each newspaper, which was coded in terms of election-related content, interactive features and information accessibility items on the home pages. Results showed that 81.3% of the analysed sites contained direct election-related content. The most common form of campaign content was candidates' biographies with links to their campaign sites. Online polls, quizzes, audio and video downloads, slide shows and graphics were also available. At least one of these features was present in 66.7% of the sample. Opportunities for human interaction were less frequent. The analysis suggested that interactive options increased with the circulation size of the newspaper. There was also a positive relationship between interactive mechanisms and political information. The researchers concluded that, to the extent that the interactive features made it possible for readers to have direct involvement in the campaign through the newspapers' sites, they can be conceptualized as civic affordances that call the site visitor to action. Key quotes: Digital developments are giving news organizations increased opportunities for interactivity at the same time that the civic journalism movement is creating renewed awareness of the public obligations of news that extend beyond the traditional function of information dissemination (p.232). As a growing source of civic information and activity online news organizations appear to be addressing this public responsibility by giving consideration to the broader democratic promise of the Internet as a discussion forum or site of public deliberation (p.238). Li, Xigen. News of Priority Issues in Print versus Internet Newspapers. Internet Newspapers: The Making of a Mainstream Medium, Ed. Xigen Li. Mahwah, NJ: Laurence Erlbaum Associates, 2006. Summary:


This study examines whether the different delivery and access modes of internet newspapers have an impact on identifying news as priority issues, a starting point in setting the agenda. It looks at how readers of print and internet newspapers identify priority news differently due to the different ways in which information is presented on each platform. Li used the print and online versions of The Washington Post, USA Today, and the Chicago Tribune. He looked, first, at the effects of different web designs on the identification of priority issues. Second, he isolated the effects through internet readers who focus on information seeking instead of just surfing. And third, he did a content analysis of the priority news in print and online newspapers. Li's hypotheses were: H1a: Priority issues identified by internet readers are noticeably different from those identified through content analysis. H1b: They are also different from those identified by readers of print newspapers. H2: The priority issues identified by internet readers are less likely to be associated with those identified in the content analysis than those identified by print newspaper readers. H3: Web design is associated with distinctive patterns in the priority issues identified. H4: Prominence cues used by readers to identify priority issues on websites are different from those used in traditional media. H5: The way that news is presented on websites is likely to change readers' views of what the priority issues are. Results showed that H1a, H1b, H2, and H3 were supported; H4 was partially supported; and H5 was not supported. The findings suggest that the web design approach and the information delivery and access modes of print and online newspapers do have an impact on how priority issues are identified. For online newspapers a different list of priority issues was generated. The priority issues identified in print newspapers correspond with those identified in the content analysis. The different web designs of each newspaper produced diverse patterns in the priority issues identified from print and online editions compared to the content analysis. Prominence cues used in print are still useful online, although some of them were no longer as evident online, such as story prominence and headline type size. The way news is presented online has little effect on what readers consider priority issues. Key quotes: When readers access news information from the Internet newspapers, if the prominence cues are not as evident as in print newspapers, they may try to find alternative cues in guiding their selection of news stories. Something identified by the print newspapers as priority issues may not be seen as so important in the internet newspapers by readers (p.264-265). Nguyen, An. Harnessing the potential of online news: Suggestions from a study on the relationship between online news advantages and its post-adoption consequences. Journalism 11.2 (2010): 223-241. Web. Summary: This study uses data from an Australian national survey to explore the relationship between nine common sociotechnical advantages of online news and how it is used and how it alters existing news use habits. The nine advantages are: no cost; multitasking; more news choices; in-depth and background information; 24/7 updates; customization; ability to discuss news with peers; have my say to the news media; and different viewpoints.


The article finds that, except for have my say to the news media, all the other attributes have a more or less marked effect on the way people adopt, use and integrate online news into daily life. Results also showed that online users seem to expect immediacy as well as depth and diversity of news and views. The study confirmed the previously found positive effect of online news attributes, especially immediacy and multitasking ability, on the decision to adopt online news and on the extent to which it is used, evaluated and affiliated with. This suggests that with the continuing improvement of online news technologies, online news has substantial potential for continued growth. Regarding the impact of online news on traditional sources, the study shows that the displacement effect is still very limited, with a reduction in overall traditional news use taking place among only 10 per cent of online news users. Smyrnaios, Nikos, Emmanuel Marty, and Franck Rebillard. Does the Long Tail Apply to Online News? A Quantitative Study of French-Speaking News Websites. New Media & Society (2010): 1461444809360699. Web. Summary: The multiplicity of news online is often presented as leading to pluralism: the web is expected to offer a wider range of news than offline media (the Long Tail theory). This article tests this hypothesis with a transdisciplinary quantitative study based on thousands of articles taken from 80 French-speaking websites from France, Canada, Belgium, Switzerland, Algeria, Morocco, Lebanon, China, Russia and sub-Saharan Africa. It also included 22 French blogs. The agenda-setting effect is more complex online due to the emergence of multiple intermediaries and new patterns of distribution (search engines, aggregators, portals, blogs). However, the content distributed by them seems to be redundant and to originate from press agencies and traditional media. This means that the growth of online information circulation does not necessarily mean that online news is more diverse in journalistic terms, or that it covers a wider range of topics than offline news. The researchers applied an automated crawling method to the sources to extract the headlines of the articles (the main data for the study) and the first lines of the articles. They calculated the degree of headline distribution among topics and focused on the wording of the headlines. They used textual statistics to compare the vocabulary used by the different categories of sources. Results show that websites exhibit diversity as well as a high concentration on a few major and redundant issues. Online news was produced in much the same way as offline news; it was varied but very unevenly distributed. Online journalists tend to concentrate on rewriting and editing, relying on existing material instead of original reporting. The most redundant sites were those without any journalistic staff, free dailies with small newsrooms, and radio and TV channels. Blogs, webzines and citizen journalism sites focused on creativity rather than reactivity or productivity. Although the sample in this study is not representative, the results highlight the necessity of questioning the ideal of pluralism that the web is supposed to embody.
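To give a sense of what the "textual statistics" in a study like this involve, here is a minimal sketch that compares the vocabularies of two hypothetical headline samples with a Jaccard index; high overlap would indicate the kind of redundancy the authors report. The headlines and category names are invented, and this is not the authors' actual method or data.

```python
# Hypothetical headline samples for two source categories (invented for illustration).
wire_based_site = [
    "Government announces new budget measures",
    "Budget measures draw criticism from opposition",
]
citizen_journalism_site = [
    "What the new budget measures mean for my neighbourhood",
    "Local artists react to cuts in cultural funding",
]

def vocabulary(headlines):
    """Lower-cased set of words appearing in a list of headlines."""
    return {word for headline in headlines for word in headline.lower().split()}

def jaccard(a, b):
    """Share of words common to both vocabularies (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b)

overlap = jaccard(vocabulary(wire_based_site), vocabulary(citizen_journalism_site))
print(f"vocabulary overlap: {overlap:.2f}")
```

Real studies of this kind work with thousands of crawled headlines and more refined measures, but the underlying comparison logic is similar.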
Dickinson, Roger. Studying the Sociology of Journalists: The Journalistic Field and the News World. Sociology Compass 2.5 (2008): 1383-1399. Print. Friend, Cecilia, and Jane B. Singer. Online Journalism Ethics: Traditions and Transitions. M.E. Sharpe, 2007. Print. McNair, Brian. Cultural chaos: journalism, news and power in a globalised world. Taylor & Francis, 2006. Print.


Zelizer, Barbie. The changing faces of journalism: tabloidization, technology and truthiness. Taylor & Francis, 2009. Print. ---. Managing The UK News Revolution. News Online: Transformations and Continuities. 1st ed. Ed. Graham Meikle & Guy Redden. Palgrave Macmillan, 2010. Print.


Political-Economy McChesney, Robert. The Crisis of Journalism and the Internet. News Online: Transformations and Continuities. 1st ed. Ed. Graham Meikle & Guy Redden. Palgrave Macmillan, 2010. Print. Summary: McChesney first describes the crisis of journalism; then suggests how the Internet could be a tool to improve the situation; and finally explains how internet governance issues impede the development of online journalism and proposes solutions. The crisis of journalism is described as the decline of investigative reporting, bad political reporting, a lack of international and local journalism, and the prevalence of infotainment (p.54). The Internet is usually blamed for this because it provides competition to the dominant commercial news media and takes away advertisers and readers. However, McChesney argues that the crisis of journalism began before the Internet. In the 1980s and 1990s news media were cutting back on journalists and resources because it was profitable in the short term (p.56). In order to make news profitable immediately, the autonomy of professional journalism was sacrificed. Thus, the market does not encourage the production of good journalism (p.59), and the Internet has only accentuated the flaws of commercial journalism (p.60). Corporations suggest that the solution is to allow media companies to merge and create monopolies, which should not be worrisome because the Internet is supposed to provide a forum for everyone else (p.55). But the author argues that the openness of the Internet is due to policy and technology, and telecommunications companies can censor the Internet and work with authorities for surveillance purposes (e.g. China and the US). By controlling the speed of the Internet and access to websites, the cable and phone companies seek to extend their monopoly business model to the broadband world. He argues that we cannot expect the Internet or the market to solve the crisis, and that resources and institutional support are also necessary. The role of public and community media is also pivotal to ending the crisis of journalism (p.65). Key Quotes: McChesney proposes that other policy issues should be addressed and resolved successfully for the Internet to even begin to fulfil its promise for society, and for journalism. For starters, if the Internet is to provide the foundation for free speech and free press, it has to be ubiquitous, high-speed and inexpensive. The future of a free press is dependent upon ubiquitous, inexpensive and super-fast Internet access as well as Network Neutrality (...) ending the digital divide and stopping corporate privatisation of the Internet (and the broader realm of digital communication) is mandatory (p.63-64). Benkler, Yochai. The Wealth of Networks: How Social Production Transforms Markets and Freedom. Yale University Press, 2006. Print. Fenton, Natalie. New media, old news: journalism & democracy in the digital age. SAGE, 2009. Print. Mansell, Robin. Political Economy, Power and New Media. New Media & Society 6.1 (2004): 96-105. Web.


Labor Deuze, Mark, and Timothy Marjoribanks. Newswork. Journalism 10.5 (2009): 555-561. Web. Summary: This paper is an introduction to a special issue of Journalism on labor. The authors argue that journalistic work is changing fast. There are mass layoffs, offshoring and outsourcing. News media organizations in North America and Europe are facing bankruptcy or closure. In the middle of this, the news industry is innovating its approaches to, and its organization and management of, the production process. Audiences are moving to the internet, which supports a greater emphasis on minute-by-minute reporting. This situation has prompted significant scholarly attention to working conditions and labor practices in journalism. Scholars suggest that the demands of the marketplace can be met by reporters and editors if they perform a more flexible, adaptive and multi-skilled professional identity, but that this should not come at the cost of upholding journalism's values. Another tension observed in the research is the casting of journalists as either individuals or as a collective. Key quotes: On the one hand we can observe a field that is losing its traditional bearings and casting its practitioners in a new entrepreneurial ideal of being free agents; on the other hand these professionals are not perceived to have any individual or collective power to enact some kind of meaningful change to the system (p.558). The primary function of the organization of newswork is not so much the liberation of the media professional from the constraints of technology and the market so she can do her best work, but rather to prevent the individual voices and talent of journalists from being heard, seen, or featured at all (p.558). Örnebring, Henrik. Technology and journalism-as-labour: Historical perspectives. Journalism 11.1 (2010): 57-74. Web. Summary: This article is a historical analysis of journalism as labor using labour process theory. It historicizes the relationship between journalism and technology, and places labor at the centre of analysis to address concerns over the state and future of journalism. The study focuses on the importance of the separation of conception and execution of labor, the increased differentiation of the labor process, the use of technology to increase productivity, and the deskilling of labor. The author asserts that technological determinism is common among journalists when they reflect on the changes in their profession. Some scholars have also taken a deterministic stance, while others are critical of it. In the 19th century the gradual division of conception and execution of journalistic labor represented a disconnection of the technology of printing from the actual news-gathering labor. In the 20th century the technical tasks in the production of news were carried out by support staff, including photographers, graphic designers, cameramen, and so on. The differentiation of labor was organized along technological lines: technical skills were separated from journalism. The author argues this is changing, and journalists are now expected to have technical skills in computer-based and digital technologies of production. These changes are linked to management's need for control, making journalism more cost-effective. The introduction of new technologies into the news process has produced a discourse of speed as the main criterion by which journalistic labor is judged.
The author suggests there is a need for further investigation into the deskilling of journalists and asserts that there seems to be evidence that wholesale upskilling among journalists is no more likely to occur than wholesale


deskilling. He suggests that as journalists become more skilled in digital production techniques, they may find less use for their news-gathering skills. The author concludes that news technologies are adapted according to existing patterns, which are shaped by a long historical process that has served to naturalize the dominance of technology over journalism. Key quotes: [T]he changes to the differentiation and specialization of journalistic labour are not so much driven by technological necessity as by the capitalist necessity to reduce overall labour costs (p.64). This link between news, periodicity and speed only became more marked in the interplay between technology and liberal capitalism that was the industrial revolution, more marked still with the emergence of broadcasting and accelerated even further with the introduction of online news provision (...) the discourse of speed, understood as at heart a capitalistic logic of competition and use of technology to increase productivity, has become a wholly naturalized element of journalism and forms a template for how journalists understand new technologies. The prime function of any new technology is to speed up the news process (p.65). Machill, Marcel, and Markus Beiler. The Importance of the Internet for Journalistic Research: A multi-method study of the research performed by journalists working for daily newspapers, radio, television and online. Journalism Studies 10.2 (2009): 178. Web. Summary: This article reports how journalists integrate online research methods into their work, how they assess internet search engines, and how highly developed their competence in using them is. The authors observed 235 German journalists working on a permanent or freelance basis for public and privately owned daily newspapers, radio and television companies. They then surveyed 601 journalists and conducted an experiment with another 48. The internet has advantages for journalists but also poses risks, such as distorted reality, which arises because sources on the internet can be easily manipulated, affecting their credibility and reliability. The observation was performed on an open and non-participating basis at the journalists' normal workplaces. It showed that journalists use computer-aided research tools more frequently, but for shorter periods, than classical tools. With increasing age, the frequency and duration of use of electronic research tools decreased, whereas the use of non-computer-aided research tools and the use of news agencies remained constant. The telephone remains the most important tool, and Google dominates the source-determination process, thus having a decisive influence on journalistic research. Online journalists used more computer-aided research tools and news agencies. The survey showed journalistic attention focused on a limited number of internet offerings, while journalists had a pragmatic attitude towards the internet as a research tool, even though they are aware of possible problems. Journalists give great importance to the internet for various research tasks, but especially for acquiring additional sources and information, which indicates the decisive influence of the internet on journalistic research.
Journalists agreed that the internet makes their job easier, and they identified as potential problems the increased pressure on journalists to be up-to-the-minute, the fact that the selection of information has become more important than getting new information, and the risk that quality suffers as a result of everyone being able to disseminate information via the internet. The search-engine experiment revealed that journalists have moderate success when using the internet as a research tool. The most successful journalists pursued an in-depth search strategy using semantically well thought-out search terms. The least successful journalists submitted a large number of queries, usually using general, imprecise or indirect search terms. Overall, the study shows that computer-aided research complements but does not replace classical research tools. Search engines, particularly Google, dominate the process of identifying additional sources. In many cases there seems to be no alternative. A cross-check on research hardly


occurs. Another problem is self-referentiality of the media. Journalists are aware of these problems, but they are not always consistent enough or in a position to change their behaviour.

Paterson, Chris A., and David Domingo. Making online news: the ethnography of new media production. Peter Lang, 2008. Usher, Nikki. Goodbye to the news: how out-of-work journalists assess enduring news values and the new media landscape. New Media & Society 12.6 (2010): 911-928. Web. Quandt, Thorsten et al. American and German Online Journalists at the Beginning of the 21st Century: A bi-national survey. Journalism Studies 7.2 (2006): 171. Web. Deuze, Mark. Media work. Polity, 2007. Print. Weiss, Amy Schmitz, and David Domingo. Innovation Processes in Online Newsrooms as Actor-Networks and Communities of Practice. New Media & Society (2010): 1461444809360400. Web.


Collaboration-and-Participation Bruns, Axel. Gatewatching: collaborative online news production. Peter Lang, 2005. Print.


Community Cavanagh, Allison. From Culture to Connection: Internet Community Studies. Sociology Compass 3.1 (2009): 1-15. Print. Tillema, Taede, Martin Dijst, and Tim Schwanen. Face-to-face and electronic communications in maintaining social networks: the influence of geographical and relational distance and of information content. New Media & Society (2010): 1461444809353011. Web.


Vertical-Integration Canada: New CRTC Regulatory Framework For Vertical Integration http://www.mondaq.com/canada/x/146880/Broadcasting/New+CRTC+Regulatory+Framework+for+Vertical+Integration Summary: This article summarizes the new rules published by the CRTC in September 2011, after it held public hearings to determine whether new regulatory safeguards were needed to prevent broadcasting companies - such as Bell, Shaw, Rogers and Quebecor - from obtaining an unfair competitive advantage over their competitors. In its Regulatory Policy, the commission prohibits broadcasting distributors from "acquiring exclusive rights to the distribution of programming on non-linear or ancillary platforms." It has also established a code of conduct for commercial negotiations and adopted specific measures to protect independent broadcasters and distributors. Regarding exclusivity, the commission ruled that no person may offer programming designed primarily for conventional television, specialty, pay or VOD services on an exclusive or preferential basis to their mobile or internet subscribers. Broadcasting companies should offer more choice and flexibility in their services and provide a pick-and-pay model. The commission established a code of conduct to guide commercial interactions between the various industry stakeholders and to ensure that no party engages in anti-competitive behavior. During a dispute between the operator of a programming undertaking and the operator of a distribution undertaking concerning the terms of carriage of programming, both shall continue to provide their services on the same terms and conditions as they did before the dispute. While encouraging fast conflict resolution, the commission will also introduce a mandatory mediation mechanism.

Winseck, Dwayne. Winners, Losers and Opportunities Lost in the CRTC Vertical-Integration Ruling. http://www.theglobeandmail.com/news/technology/digital-culture/dwayne-winseck/winners-losers-andopportunities-lost-in-the-crtc-vertical-integration-ruling/article2175946/ Summary: On September 20th, 2011 the CRTC published its new rules, after holding public hearings on vertical integration in June. This article summarizes the new rules and argues that they represent a partial success for consumers. The columnist notes that arguments about small markets needing big media players were made in the CRTC's press release. However, he argues that Canada's total media economy is not small, and it is growing fast, so what it needs is for carriers to provide "clear channels and the most open media set-up possible." The six key measures adopted by the CRTC are: a. The four vertically integrated conglomerates - Bell, Shaw, Rogers and QMI - cannot offer TV programs exclusively to their own mobile or internet subscribers. b. Programs created specifically for internet or mobile distribution by any of the companies mentioned above can be exclusive. c. Programming cannot be interrupted because of distribution rights conflicts. d. The status quo is maintained with respect to independent television producers' access to the schedules of the "Big Four" specialty channels. e. An end to "block-booking," i.e. tying access to one channel to taking a block of several channels. f. Vertically integrated companies must come up with a broader range of "pick and pay" models within six months, allowing people to order television programming services a la carte. The columnist argues that two important aspects were ignored by the CRTC's new rules:


a. "References to the existing provisions in the Telecommunications Act (1993) - the common carrier sections 27, 28 and 36, and specifically so when it comes to broadcast programming - are completely ignored, referenced only in passing." b. "There's nothing in the new rules establishing parity of treatment for rival online video distributors (OVDS) such as Netflix, Apple TV and Google TV versus the big four's online 'TV everywhere' initiatives and IPTV offerings." Canada. Parliament. Senate. Standing Committee on Transport and Communications. Interim report on the Canadian News Media. Ottawa, Ont.: Standing Senate Committee on Transportation and Communications, 2004 (Saint-Lazare, Quebec : Gibson Library Connections, 2010) Part B, II: Ownership Structures, p.30-41. Summary: This parliamentary report states that the changing ownership of the Canadian media has been an issue of concern for several years, prompting the formation of committees and the elaboration of reports since the 1970s. While most democracies constitutionally guarantee freedom of the press, it is not uncommon to find restrictions with respect to concentration, cross-media ownership and foreign ownership. Regarding horizontal integration, in Canada the CRTC has to approve all mergers for broadcasters, and for radio there are restrictions on multiple ownership within a single market. Regarding vertical integration, it is relatively unlimited in Canada. The CRTC examines every case and may apply conditions of licence that may require the separation of editorial operations; the Competition Bureau also examines the acquisitions on a case-by-case basis; while the federal government can impose limits on media ownership. Foreign ownership of media in Canada is restricted to up to 46.7% for broadcasters and 25% for print media. Non-Canadians can directly own 20% of a broadcaster and indirectly own 33.3%.

