In his later writings, particularly since the "Theory of Communicative Action," Habermas replaced this pessimistic perspective with a more hopeful model, in which grassroots initiatives and citizens' movements can influence and instigate public discourse. Thus, the idea of a revitalization of the public sphere could be built upon actual societal agents and existing social movements.

Though this view is both more optimistic and more accurate, it does not sufficiently take into account the pitfalls of the information society.

Invasion of Privacy and the Infosphere

Public discourse is the necessary condition and at the same time the unreachable ideal of modern democracy. Yet it has become a scarce good in the information society and the era of digitized mass communication. At the same time, governmental and commercially motivated surveillance and the pressure of public opinion cut deeply into people's private lives.

Privacy, however, is not just a citizen's right; it is a need and a necessity for civilized life and the development of subjectivity. The normative power of the public eye and the controlling power of discourse formations should be reviewed under the conditions of digital invasions of privacy in global communication networks and databases. Thus, we are witnessing the dawn of a tightly woven global infosphere: a digitized, networked panoptic sphere that leaves little space for unmonitored privacy and gives rise to a new sphere of public privacy.

The political reactions to the events of September 11 have shown that the surveillance function of the infosphere can be triggered when necessary, and that basic rights and liberties of citizens are then subordinated to the greater good of global surveillance. As much as the public sphere was a corollary of the private sphere in the early modern era, the sphere of amalgamated Public Privacy is the inescapable accompaniment of the infosphere.

Due to time restrictions, I cannot address all the complexities of the sphere of public privacy (see reference 1). In my presentation, I plan to focus on two examples of resistance strategies, on the macro- and the micro-level.

Revitalization of Public Discourse and the Infosphere

In contrast to conventional mass media, the Internet permits true nonhierarchical, multidirectional communication.

In addition, it has a low access and publication threshold; anyone who has the necessary equipment and skills can publish on the WWW or take part in its discussions. As a hybrid medium, the Internet is thus ideally suited to the interactive forms and needs of communication in the lifeworld. However, it must be emphasized that only the social and political actions of human beings, and not the technical structure, can lead to a revitalization of public discourse in the infosphere.

This revitalization is to be found on three main levels of public communication. The conventional media have come under pressure to react constantly to the Internet because of its new role in processes of agenda setting. At the same time, the mass media participate in the Internet with their own content and thus constitute a new form of mass communication. This trend is visible in the Internet presence of nearly all print and broadcast media, as well as in the media-mix and cross-media productions of established media conglomerates.

Regulating Invasion of Privacy

Currently, invasion of privacy is usually understood as a problem of the individual whose privacy has been infringed upon.

When it comes to collecting and disseminating data, users are at best asked to opt out, often without a clear description of what the government or commercial bodies will do with the data and, even more importantly, what third parties might be able to do with them.



The user-generated content environment has undergone such a dramatic change that even a mere eight years ago, when the Internet was already widespread, political campaigns had to resort to some degree of trickery to compel users to provide content. People had to be coaxed into generating data. These days, such data is voluntarily and widely generated by individuals themselves as a by-product of digitally mediated civic participation: people comment on and discuss politics on general-purpose digital platforms, and this digital mediation of their activities leaves behind a trove of data that is harvested by companies and data brokers.

Further, the quantitative depth of big data composed of online imprints is exponentially richer than pre-digital data. A large commercial database may easily contain thousands of data points on each individual; a recent report found that some data brokers held thousands of individual data points per person and were adding to them at a rapid pace (U.S. Federal Trade Commission). The volume and variety of this kind of big data is qualitatively different.

If anything, the problem of data analysis today is that there is too much data, too deep and too varied. All this data is burdensome without techniques for extracting usable information from it. However, the rise of computational methods and modeling is quickly catching up to the challenge of turning this deluge of data into usable information in the hands of political campaigns and others.



Computational methods used by political campaigns depend on multiple recent developments. First, technical developments in storage and database systems mean that large amounts of data can be stored and manipulated. Second, new methodologies allow processing of the semantic, unstructured information contained in user-generated natural language outputs such as conversations, as opposed to already structured data, such as financial transactions, which come in neatly packaged fields. Third, new tools allow human interactional data to be examined through a structural lens using methods such as social network analysis.

Fourth, the scale of the data allows for new kinds of correlational analyses that would previously have been hard to imagine.


First, given the amount of data that is being generated, even mere storage has been a challenge and has required developing new methods. YouTube has 72 hours of video uploaded every minute; as of last year, Facebook was processing billions of content items and interactions every day, and reportedly holds its data in a multi-petabyte Hadoop cluster. Second, new computational processing techniques allow for extracting semantic information from data without using an army of human coders and analysts, as would have been required under old techniques. Without these computational techniques, the texts would have to be read and summarized by a large number of people; even then, merely aggregating the results would pose a challenge.
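As a minimal sketch of what such automated coding replaces, the following assigns topics and a crude sentiment score to a handful of posts using tiny hand-built lexicons. All posts, topic names and word lists here are hypothetical; real systems use trained statistical models rather than keyword lists, but the principle of machine-coding text at scale is the same.

```python
from collections import Counter

# Hypothetical sample of user-generated posts (illustrative only).
posts = [
    "I support the new healthcare bill, great news",
    "Taxes are too high and the economy is struggling",
    "Great rally today, love this candidate",
    "Worried about the economy and jobs",
]

# Tiny hand-built lexicons standing in for trained models.
TOPIC_TERMS = {
    "healthcare": {"healthcare", "hospital", "insurance"},
    "economy": {"economy", "taxes", "jobs"},
}
POSITIVE = {"support", "great", "love", "good"}
NEGATIVE = {"worried", "struggling", "high", "bad"}

def code_post(text):
    """Assign topics and a crude sentiment score to one post."""
    words = set(text.lower().replace(",", "").split())
    topics = [t for t, terms in TOPIC_TERMS.items() if words & terms]
    sentiment = len(words & POSITIVE) - len(words & NEGATIVE)
    return topics, sentiment

# Code every post automatically, then aggregate topic frequencies,
# the step that would otherwise need an army of human coders.
coded = [code_post(p) for p in posts]
topic_counts = Counter(t for topics, _ in coded for t in topics)
print(topic_counts)
```

The point is not the (deliberately crude) lexicon but the workflow: once coding is a function rather than a human judgment, it scales to millions of posts for the cost of compute time.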

The broadened utility has occurred partly because data in the form of a network has increased significantly due to online social network platforms that are used for a variety of ends, including politics (Howard and Parks). Previously, gathering social network information from people was a difficult and costly endeavor; biases and difficulties in recalling social ties created a great many problems, as even small social networks required hundreds of interviews in which people were expected to name dozens, if not hundreds, of social ties.

Understandably, such research has always been very difficult and was carried out only on small samples. With the advent of networks encoded in software, network analysis became possible without the difficult step of collecting information directly from individuals.

For example, people with high centrality are useful propagators of information and opinions and, once identified, can be targeted by political campaigns as entry points to larger social networks. Such analysis has, of course, also led to many positive applications. For example, researchers have started identifying drug interactions by looking at Google searches for multiple drugs matched by symptoms, a feat that cannot practically be accomplished any other way, since the alternative would mean surveying all users of all drugs about all side effects. For political campaigns, however, this also opens the door to better identification of target individuals by looking for correlations between their political choices and other attributes.
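A minimal sketch of how such centrality might be computed, using a toy friendship graph with hypothetical names. Production work would typically use a dedicated library and richer measures (betweenness, eigenvector centrality); degree centrality is shown here because it is the simplest to state.

```python
# Hypothetical friendship graph as an adjacency list.
graph = {
    "alice": ["bob", "carol", "dave", "erin"],
    "bob": ["alice", "carol"],
    "carol": ["alice", "bob"],
    "dave": ["alice"],
    "erin": ["alice"],
}

def degree_centrality(g):
    """Degree centrality: a node's tie count divided by the maximum possible (n - 1)."""
    n = len(g)
    return {node: len(neigh) / (n - 1) for node, neigh in g.items()}

centrality = degree_centrality(graph)
# Rank users; the top entries are candidate "entry points" for a campaign.
ranked = sorted(centrality, key=centrality.get, reverse=True)
print(ranked[0])  # the most central user
```

Once platforms encode these ties in software, this computation runs over millions of users without a single interview.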

Considering that data brokers have thousands of data points on almost every individual in the United States, these new computational methods give campaigns powerful means with which to analyze, categorize and act upon the electorate. In this context, modeling is the act of inferring new information from data by creating a computational relationship between the underlying data and the target information. Modeling can be vastly more powerful than aggregate profiling, which attempts to categorize a user by combining available data and putting her in a category with many others.

However, the advent of big datasets that contain imprints of actual behavior and social network information (social interactions, conversations, friendship networks, histories of reading and commenting on a variety of platforms), along with advances in computational techniques, means that political campaigns (and indeed advertisers, corporations and others with access to these databases and the technical resources) can model individual voter preferences and attributes at a high level of precision, and, crucially, often without asking the voter a single direct question.

Strikingly, the results of such models may match the quality of answers that previously could be extracted only via direct questions, and far exceed the scope of information that could be gathered about a voter via traditional methods. In other words, access to just a fraction of Facebook data, processed through a computational model, allows Republicans and Democrats to be delineated largely correctly without consulting any other database, voter registration file, financial transaction record or organizational membership.
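A minimal sketch of this kind of modeling, using a naive-Bayes-style score over page "likes". The training records, page names and labels below are all hypothetical, and real models are trained on millions of records with far richer features; the sketch only shows the mechanism of inferring an unasked attribute from behavioral traces.

```python
import math
from collections import defaultdict

# Hypothetical training data: each record is (set of liked pages, party label).
training = [
    ({"page_gun_rights", "page_country_music"}, "R"),
    ({"page_gun_rights", "page_nascar"}, "R"),
    ({"page_environment", "page_indie_music"}, "D"),
    ({"page_environment", "page_urban_news"}, "D"),
]

def train(records):
    """Per-class like counts and class totals, for a naive-Bayes-style score."""
    counts = {"R": defaultdict(int), "D": defaultdict(int)}
    totals = {"R": 0, "D": 0}
    for likes, label in records:
        totals[label] += 1
        for page in likes:
            counts[label][page] += 1
    return counts, totals

def predict(likes, counts, totals):
    """Return the party whose users' likes best match this user's likes."""
    scores = {}
    for party in counts:
        score = math.log(totals[party] / sum(totals.values()))  # class prior
        for page in likes:
            p = (counts[party][page] + 1) / (totals[party] + 2)  # Laplace smoothing
            score += math.log(p)
        scores[party] = score
    return max(scores, key=scores.get)

counts, totals = train(training)
print(predict({"page_gun_rights"}, counts, totals))  # scores "R" higher
```

Note that the voter is never asked her affiliation: the label is inferred purely from behavioral by-products, which is what makes the modeling invisible to her.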

While parts of this example may seem trivial, since some of these attributes, such as age and gender, are traditional demographics and are usually included in traditional databases, it is important to note that here they are being estimated through modeling, not asked of or observed from the user. This means that these attributes can also be modeled on platforms where anonymous or pseudonymous postings are the norm.

This type of modeling also furthers the information asymmetry between campaigns and citizens; campaigns learn about a given voter while the voter has no idea that this modeling is taking place in the background. Crucially, this type of modeling allows access to psychological characteristics that were beyond the reach of traditional databases, as invasive as those might have been considered.


In other words, without asking a single question, researchers were able to model psychological traits as accurately as a psychologist administering a standardized, validated instrument. Social media data have similarly been used to accurately model attributes ranging from suicide rates to depression to other emotional and psychological variables (De Choudhury, et al.). Predicting turnout matters, too: campaigns do not want to spend resources on people who are unlikely to vote, and pollsters need this data to weight their samples correctly.

Previous voting records are also tenuous predictors; besides, there are many young voters entering the rolls. Gallup, whose likely-voter model had long been considered the gold standard, asks a series of seven questions that include intent and knowledge (where is your voting booth?). However, even with decades of expertise, Gallup has been missing election predictions due to its inability to correctly identify likely voters through survey data.

The gravity of the situation was such that Gallup became a punch line. The Obama campaign's modeling, by contrast, resulted in a targeted, highly efficient persuasion and turnout effort that focused mostly on turning out voters who were already Obama supporters, rather than spending a lot of effort persuading voters who would not end up voting. Obama campaign staffers told a gathering at the Personal Democracy Forum that in key states they were able to go deep into Republican territory, individually picking voters they had modeled as likely Democrats within otherwise Republican suburbs, breaking the lock of the precinct on voter targeting.

The advantages of stronger, better modeling, an expensive undertaking that depends on being able to purchase and manipulate large amounts of data, can hardly be overstated. Finally, big data modeling can predict behaviors in subtle ways and can be oriented more effectively toward altering behavior. For example, for years the holy grail of targeting for commercial marketers has been pregnancy and childbirth, as that is a time of great change for families, resulting in new consumption habits that can last for decades.

Previously, retailers could look for obvious steps like the creation of a baby registry; by then, however, the information is often already public to other marketers as well, and the target is well into the pregnancy and already establishing new consumption patterns. In a striking example, Duhigg recounts the tale of an angry father walking into Target and demanding to see the manager, to ask why his teenage daughter was being sent advertisements for maternity clothing, nursery furniture and baby paraphernalia.

The manager, it was reported, apologized profusely, only to receive an apology himself when the father went back home to talk with his daughter, who was, indeed, pregnant. Data modeling ferreted out facts that a parent did not know about his own child living under his own roof. These predictive analytics would not be as valuable without a corresponding rise in sophistication of behavioral science models of how to persuade, influence and move people to particular actions. Developing deeper models of human behavior is crucial to turning the ability to look, model and test big data into means of altering political behavior.

Edward Bernays, the founder of public relations, had himself posited that people were fundamentally irrational. All this changed thanks to research that emphasized the non-rational aspects of human behavior, and to attempts to measure and test such behavior modification within political contexts. Just as behavior analysis became more sophisticated, for the first time in modern political history an influx of scholars from the behavioral sciences moved into practical politics, starting with the Obama campaign.

Increasingly, however, elections are fought at the margins, in part because of pre-existing polarization, a winner-takes-all system in the case of the United States, and low turnout. Under these conditions, the operational capacity to find and convince just the right number of individual voters becomes increasingly important [7].

In such an environment, small differences in analytic capacity can be the push that propels the winning candidate, and combining psychographics with individual profiles in a privatized, non-transparent setting amplifies that advantage.


However, research shows that when afraid, only some people tend to become more conservative and vote for more conservative candidates. Until now, though, campaigns had to target the whole population, or at least a substantial segment, all at once, with the same message.

Experimental science in real-time environments

The online world has opened the door to real-time, inexpensive and large-scale testing of the effectiveness of persuasion and political communication, a significant novelty for political campaigns. Empirical discussions about politics would, at most, focus on surveys, and there has been surprisingly little testing or experimentation in political campaigns (Gerber and Green). Most importantly, field experiments are costly and time-consuming, and money and time are the resources on which political campaigns already place the highest premium.

In spite of these obstacles, some experiments were conducted; however, their results were often published too late for the election in question.


The field experiments demonstrating that face-to-face canvassing was most effective for turnout were published three years after they were conducted (Green, et al.). These experiments increased awareness that many methods campaigns traditionally spent money on (for example, slick mailers or phone calls) were not very effective.

A culture of experimentation was encouraged and embraced. The rise of digital platforms allowed real-time experimentation to be incorporated into the very delivery of the political message. The results are measured in real time and quickly integrated into the delivery, as the winning message becomes the message. Methodologically, of course, this is traditional experimental science, but it has become possible because campaigns now partially take place over a medium that allows for these experimental affordances. The Obama campaign incorporated experiments into its methods from early on. For example, when the campaign was still in its early stages, it created 24 different button and media combinations for its splash page (the first page that visitors land on).

Each variation was seen by thousands of people, an incredibly large number for a field experiment by old standards, but a relatively easy and cheap effort in the digital age (Siroker). Through such experimentation, the Obama campaign was led to feature his family prominently in much campaign material. The increasing digitization of political campaigns, as well as of political acts by ordinary people, provides a means through which political campaigns can now carry out such experiments with ease and effectiveness.

Power of platforms and algorithmic governance

These platforms operate via algorithms whose specifics, with regard to content visibility, data sharing and many other features of political consequence, are mostly opaque to people outside the small cadre of technical professionals within the company.
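The splash-page experiment described above amounts to a simple A/B comparison: compute each variant's conversion rate, pick the winner, and check that the difference is statistically meaningful. The sketch below uses a standard two-proportion z-statistic; all variant names and counts are hypothetical.

```python
import math

# Hypothetical results per splash-page variant: (visitors, signups).
variants = {
    "family_photo_learn_more": (13000, 1480),
    "speech_photo_sign_up": (13000, 1120),
    "video_join_now": (13000, 980),
}

def conversion_rate(visitors, signups):
    return signups / visitors

def two_proportion_z(a, b):
    """z-statistic for the difference between two variants' conversion rates."""
    (na, xa), (nb, xb) = a, b
    pa, pb = xa / na, xb / nb
    p = (xa + xb) / (na + nb)                      # pooled rate
    se = math.sqrt(p * (1 - p) * (1 / na + 1 / nb))
    return (pa - pb) / se

# Pick the best-performing variant, then test it against the runner-up.
best = max(variants, key=lambda v: conversion_rate(*variants[v]))
z = two_proportion_z(variants["family_photo_learn_more"],
                     variants["speech_photo_sign_up"])
print(best)
```

Because the message is delivered digitally, this loop (serve variants, measure, promote the winner) can run continuously during a campaign rather than once per mailing cycle.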

These proprietary algorithms determine the visibility of content and can be changed at will, with enormous consequences for political speech. Similarly, non-profits that relied on Facebook to reach their audiences have faced surprises when algorithms changed under them. The implications of opaque algorithms and pay-to-play are multiple. Since digital platforms can deliver messages individually (each Facebook user can see a different message tailored to her, as opposed to a TV ad that necessarily goes to large audiences), the opacity of algorithms and the private control of platforms alter the ability of the public to understand what is ostensibly part of the public sphere, but now operates in a privatized manner.

Campaigns can access this data through favorable platform policies that grant them access to user information. These private platforms can make it easier or harder for political campaigns to reach such user information, or may decide to package and sell data to campaigns in ways that differentially empower some campaigns over others. Further, a biased platform could decide to use its own store of big data to model voters and to target the voters of a candidate favorable to the economic or other interests of the platform's owners. Such a platform could help tilt an election without ever asking voters whom they preferred (gleaning that information instead through modeling, which research shows is quite feasible) and without openly supporting any candidate.

A similar technique could be possible for search results. Ordinary users often never visit pages that are not highlighted on the first page of Google results, and researchers have already found that a slight alteration of rankings could affect an election without voter awareness (Epstein and Robertson).

Big-data-driven computational politics engenders many potential consequences for politics in the networked era. In this section, I examine three aspects. First, the shift to tailored, individualized messaging based on profiles obtained through modeling brings the potential for significant harms to civic discourse.

Howard, and Hillygus and Shields, had already presciently warned of the dangers of data-rich campaigns. Wedge issues, however, can be double-edged for campaigns, in that they elicit significant passion on all sides. Hence, campaigns aim to put wedge issues in front of sympathetic audiences while hiding them from those who might be motivated in other directions (Hillygus and Shields; Howard). Until now, the ability to do just that has been limited by the availability of data for finding the exact wedge voter and by the means to target individuals (Barocas). The prevalence of wedge issues is further damaging in that it allows campaigns to remain ambiguous on important but broadly relevant topics (the economy, education) while campaigning furiously, but now also secretly, on issues that can mobilize small but crucial segments.

Such targeting can also incorporate psychographic profiles modeled from online social data, that is, data collected without directly interfacing with the individual. Hence, fear-mongering messages can be targeted only to those motivated by fear. Unlike broadcast messages, such messages are not visible to broad publics and thus cannot be countered, fact-checked or otherwise engaged in the shared public sphere the way a provocative or false political advertisement might have been.

This form of big data-enabled computational politics is a private one. At its core, it is opposed to the idea of a civic space functioning as a public, shared commons. It continues a trend started by direct mail and profiling, but with exponentially more data, new tools and more precision. The second negative effect derives from the information asymmetry and secrecy built into this mode of computational politics. While the observational aspect is similar, computational politics is currently exercised in a manner opposite to that of the panopticon. The panopticon operates by making the act and possibility of observation highly visible while hiding actual instances of observation, so that a prisoner never knows if she is being watched but is always aware that she could be.

Modern social engineering operates by making surveillance as implicit, hidden and invisible as possible, so that the observed person is unaware of it [10]. While browsers, cell phone companies, corporate and software companies, and, as recently revealed, the U.S. government all engage in extensive data collection, this observation is rarely made visible to those being watched. This model of hegemony is more in line with that proposed by Gramsci, which emphasizes manufacturing consent and obtaining legitimacy, albeit using state and other resources in an unequal setting, rather than using force or naked coercion. Research shows that people respond more positively to messages that they do not perceive as intentionally tailored to them, and that overt attempts at persuasion are less effective than indirect or implicit messages.

Political campaigns are acutely aware of this fact. As Hal Malchow, advisor and consultant to the Democratic Party, puts it explicitly: "When they see our fingerprints on this stuff, they believe it less." The public is constituted unequally; the campaign knows a great deal about every individual, while ordinary members of the public lack access to this information. Even when identity information is not embedded into a platform (such as Twitter, where people can and do use pseudonyms), identity often cannot be escaped. Modeling can ferret out many characteristics in a probabilistic but highly reliable manner (Kosinski, et al.). Commercial databases that match computer IP addresses to actual voter names for an overwhelming majority of voters in the United States (Campaign Grid; U.S. Federal Trade Commission) are now available. Thus, political campaigns with resources can now link individual computers to actual voters without their consent. Big data makes anonymity difficult to maintain, as computer scientists have shown repeatedly (Narayanan and Shmatikov). Given enough data, most profiles end up reducing to specific individuals; date of birth, gender and zip code together suffice to identify nearly 90 percent of individuals in the United States.
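A minimal sketch of why such quasi-identifiers defeat anonymity: group "anonymized" records by (date of birth, gender, zip code) and measure what fraction are unique, and hence re-identifiable by joining against any named database that carries the same three fields. The records below are hypothetical.

```python
from collections import Counter

# Hypothetical "anonymized" records: (birth_date, gender, zip_code), no names.
records = [
    ("1984-03-12", "F", "94107"),
    ("1984-03-12", "F", "94107"),  # only these two share a quasi-identifier
    ("1990-07-01", "M", "10001"),
    ("1975-11-23", "F", "60614"),
    ("2001-01-09", "M", "73301"),
]

def uniquely_identifiable(recs):
    """Fraction of records whose (DOB, gender, ZIP) combination is unique."""
    freq = Counter(recs)
    unique = sum(1 for r in recs if freq[r] == 1)
    return unique / len(recs)

print(uniquely_identifiable(records))  # 3 of 5 records are re-identifiable
```

At national scale the same computation yields the roughly 90 percent figure cited above: the combination is so sparse relative to the population that most cells contain exactly one person.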

On the surface, this century has ushered in new digital technologies that have brought about new opportunities for participation and collective action by citizens. Social movements around the world, ranging from the Arab uprisings to the Occupy movement in the United States (Gitlin), have made use of these new technologies to organize dissent against existing local, national and global power [11].


Such effects are real, and surely they are part of the story of the rise of the Internet. However, the history of most technologies shows that those with power find ways to harness new technologies and turn them into a means of furthering their own power (Spar). From the telegraph to the radio, the initial period of disruption was followed by a period of consolidation in which challengers were incorporated into transformed power structures, and disruption gave way to entrenchment.

The dynamics outlined in this paper for computational politics require access to expensive proprietary databases, often controlled by private platforms, as well as the equipment and expertise required to use this data effectively. At a minimum, this environment favors incumbents who already have troves of data, and favors entrenched and moneyed candidates within parties, as well as the data-rich among existing parties.