
May 30, 2017

Cyberspace and Foucault’s Panopticon – The Rise of Surveillance Society

“The circuits of communication are the supports of an accumulation and a centralization of knowledge.” ~ Jan Fernback

In his book “The Archaeology of Knowledge,” Foucault describes archaeology as an epistemic inquiry into the power dynamics and forces that give rise to discursive forms. Such “forces” also set forth the conditions that make certain discourses possible, or dominant over others. Knowledge is the power to decide what is “known” and practiced. Jan Fernback’s article “Profitable Surveillance: Online Community as Commercial Exploitation” speaks to the hierarchy that is created in cyber-culture. She states, “In collecting, storing and selling data, computers are, in essence, surveilling individual consumers.” In other words, computers and cyber-culture bear out Foucault’s theory that social control can be gained through technological devices, and that this control is shaped by discourse. This paper will examine how the rhetoric of cyber-culture, with its instilled ideas of personal agency, is intentionally manipulated on the Internet. Most importantly, these elements will engage with Foucault’s notions of power structure: the inevitable binary created between the “producer” and the “consumer,” the citizen and the “gatekeepers,” the wielder of rhetoric and the audience.

Foucault was concerned more with the systemic properties of such discursive formations than with their “content” or the specific actors involved in them. By “properties” he means conditions embedded in discourse itself: the conditions that make the rise of such discursive formations possible, the forces that brought them into existence and “reproduce” them over time. Online surveillance and other forms of “soft control” constitute such conditions: they not only make the case for “profitable surveillance” and the “rhetoric of community” as a strategy for surveillance, they in fact provide the conditions for the control of discourse through the selection and restriction of “aspects” of cyberspace. Through this “selection” process, power both reveals and hides itself within “public discourse.” The public discourse of cyberspace suggests that Foucault’s prediction looms true.

Janice Denegri-Knott suggests, “Online discourses seek to curtail freedoms and opportunities.” The “digital revolution” and users’ interactions with the Internet are emphasized in the name of “true connection[s] among people, for enhanced customer service, and for a more representative democracy” (United States Department of Commerce, 1999). It is important to note that intentional rhetoric is being used here. Words such as “true,” “enhanced,” and “representative” create a utopia, a space of ‘supposed’ individual freedom. This is an example of rhetoric being used for social identification and control. According to Kenneth Burke, identification means the conjoining or mutual acceptance of interests between two or more people or minds. Returning to the example provided by the US Department of Commerce, the conjoining of interests lies between the data miners and the users. It can be argued, then, that the data miners are using rhetoric as social control to lure users into divulging information by joining a community of discourses around a certain product. Users become the unwitting accomplices in their own exploitation by data-miners.

The rhetoric of cyberspace thus serves a discursive function, a value-based debate about what is desirable and what is not, and this is reflected in the “text” of the Web. On another level it serves a gate-keeping function, in that communities themselves are “gated” or regulated through member-access guidelines. The typical norm for membership involves creating an online profile that can be viewed by others: joining a dating site, writing an Amazon review, or posting on Yahoo Answers. Members must disclose themselves in such a way that useful information about them can be gleaned from a marketing and “prospecting” perspective. Such prospecting is exemplified by online headhunters who source job seekers from Internet leads, or by people on social media who want to join so-and-so’s “friends” list. Online profile generation is a crucial aspect of the rhetorical strategy that builds the case for surveillance: it makes surveillance not only acceptable but desirable, because it gives you access to an entire community and, for a job hunter, a potential spotlight for headhunters. Entire discursive formations (Friendster.com, flickr.com, etc.) arise from this utopian idea of “empowerment” and “access,” of today’s technology as “potentially liberating.” These notions are co-opted and thus embedded in much commercial online discourse. The co-optation, or embeddedness, of these “discursive forms” is revealed in the text of the webpage itself and hidden in the structures underneath that regulate the flow of online traffic. Thus ISPs (Internet Service Providers) have their own built-in requirements about what the end-user must provide, whether a monthly fee, certain disclosure agreements, or the disclosure of certain information.

Foucault’s “discipline through surveillance” operates as a “soft” form of “social control” because of its insidiousness and its cloak of invisibility. It is carried out through several forms of data-veillance. One is the profiling of online customers by gathering personal information such as name, phone number, site-visit history, and income. Sites such as Amazon.com allow users to join discussion boards, post product reviews, and divulge personal information “About You” through these forums, all of which is then collected for further profiling activity.
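
To make the mechanism concrete, here is a minimal, purely illustrative sketch in Python (not drawn from this article or from any actual retailer’s systems; every name and field is hypothetical) of the kind of profile record such data-veillance assembles, joining what the user volunteers with what is silently observed:

    # Illustrative only: a hypothetical consumer profile of the kind data-veillance assembles.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ConsumerProfile:
        name: str                  # volunteered at sign-up
        email: str                 # volunteered at sign-up
        reported_income: int       # volunteered in an "About You" form
        pages_visited: List[str] = field(default_factory=list)   # observed, not volunteered
        reviews_posted: List[str] = field(default_factory=list)  # observed, not volunteered

        def record_visit(self, url: str) -> None:
            """Silently append another observed behavior to the profile."""
            self.pages_visited.append(url)

    # The user sees only the review they post; the profile quietly accumulates behind it.
    profile = ConsumerProfile("A. Consumer", "a@example.com", reported_income=42000)
    profile.record_visit("/product/1234")
    profile.reviews_posted.append("Great headphones, five stars.")
    print(profile)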

The effects of data-mining on consumers: online profiling allows the marketing companies that receive this information to predict consumer behavior and to place consumers into categories that determine their economic value to potential marketers. “Online profiling” can also be “discriminatory,” in that targeting may be limited to those with sizable incomes or good credit, thus marginalizing or shutting out less economically well-to-do consumers by placing them (consumers, citizens, audience members) in a category of diminished economic value. This placement is inherently rhetorical, in that discursive forms of domination are embedded within the “form” itself; it might rightly be argued that this is where social identification in the Foucaultian sense applies, as a rhetorical aim of situating (placing) people within a hierarchy and reproducing the power structure through discourse.
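
Again as a purely illustrative sketch, and not any marketer’s actual method, a few lines of hypothetical scoring logic show how such categorization can hard-code the discrimination described above, with income and credit thresholds quietly deciding who is worth addressing at all:

    # Illustrative only: hypothetical tiering logic, not an actual industry scoring model.
    def economic_value_tier(reported_income: int, credit_score: int) -> str:
        if reported_income >= 100_000 and credit_score >= 720:
            return "premium"      # aggressively courted with offers
        if reported_income >= 40_000 and credit_score >= 620:
            return "standard"     # routine targeting
        return "low value"        # largely shut out of offers

    print(economic_value_tier(120_000, 750))  # -> premium
    print(economic_value_tier(25_000, 580))   # -> low value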

Foucault’s theory of domination can be used to understand how cyber-culture can in fact dominate the user. Boyle asserts that cyber-culture’s “work on surveillance, sovereignty and hard-wired censors is loosely framed by Foucaultian theory, and concludes, despite the Net’s catechism of freedom, that online life can be regulated and dominated” (Boyle). The “panopticon” figures in this analysis of cyber-culture in several ways. On the one hand, cyberspace can feel like a highly anonymous and private place where one is free of panoptic control. On the other hand, because there are people mining the information generated by any particular computer, a sort of invisible panopticon exists; a new layer is added to the original panopticon. The panopticon was conceived as a prison in which the guards could see all of the inmates and thereby regulate their behavior. Foucault argued that this arrangement becomes internalized: people no longer need a warden to tell them what to do because they have internalized the warden. In cyberspace there is, in theory, a space where no one is watching you and no one knows who you are. In practice, data miners have become the new wardens. The difference is that the user does not know of the warden’s presence; the layer between user and “guard” is mediated by a “cloak” that renders the “guards” invisible.

Rhetorical forms in their various disguises “have the potential to operate as a powerful panoptic technique for observing, classifying, and normalizing the individual and the collective” (Kitto 5). The “guise” critically examined in this instance is online education. In a similar vein, and in a general sense, identification works on the level of locating and surveilling the audience in order to manipulate its pathos, logos, and ethos and thereby gain control. Surveillance thus evolves into a series of disciplines to control the mind (via symbolism), the body (action), and the intentions that animate individuals to cooperate or enact desired behaviors. The end result is the exploitation of the cyber-audience as consumers engaging in “predictable” and “desirable” consumer behavior. This will mean one thing to Amazon, another to napster.com, and another in the case of distance online learning programs.

The rhetoric of cyberspace also serves a disciplinary function through discourse. Traditional notions of discipline involved sanctions by the state or government, rewards and punishments used to enact desired behavior and enforce laws. Discipline in cyber-culture performs a reproductive function: it re-produces existing discourse and structures of discourse, and it does so by constraining and regulating behavior through governments, intelligence agencies, and business organizations, for instance in the form of copyright law. Discipline “objectifies” the “object” by subjecting it to surveillance, and the very way in which the surveillance is carried out is the discursive formation of domination and economic exploitation by dominant interests.

While on the one hand cyber-culture uses specific rhetoric to create a sense of security, personal freedom, and participation, on the other hand specific rhetoric is used to discourage certain behaviors. The system of sanctions produces self-censorship and deters particular conduct. For example, free music downloading has been given many pejorative labels: ‘piracy’, ‘boot-legging’, and ‘stealing’, among others. These terms mark those who violate the established order as “deviants” engaged in “deviant behavior.” Similar negative connotations have been attached to “spam” (wasteful or intrusive email messages). Furthermore, such designations now have the force of litigation behind them to discourage users from noncompliance. The notion of “privacy rights” is a highly contested area surrounding the issue of file-sharing rights for music users. One example is the music industry’s effort to both discourage and penalize users displaying ‘deviant’ behavior, that is, downloading free music files, even suing Napster soon after its inception through various courtroom battles and lobbying efforts. Another is the lawsuit levied upon Verizon Communications by music-industry litigators over the private records of end-users on the Internet. These cases of litigation and lobbying by the so-called “gatekeepers” of our technology (the music-industry giants in this case) are no less a form of rhetorical discourse, one that seeks to expand and strengthen the rights of “gatekeepers” over “users.”

What does this say about the fate and future direction of cyberspace, or about “emancipatory discourse”? The Foucaultian theory of social control has proven accurate, but is accurate prediction to be equated with the “right” outcome or with truth-value? Foucault’s theories could reflect truth just as well as they could reflect the prevailing pessimism of the times, or a belief in the overwhelming power of our modern institutions of political economy. To put too much weight on the seeming accuracy of any theory is to overstate its importance and to feed an endless cycle of fatalism, especially the fatalism that is so characteristic of postmodern thought.

Precisely because, as I see it, Foucault has proven accurate in his predictions about power and Internet discourses as they relate to the forces dominating the Internet today, it becomes all the more important that we not allow his theories to go on predicting our future. We must not, because they are increasingly proving to explain present phenomena accurately. The predictive and the explanatory must evolve with the change and transformation of discursive formations and the forces that give rise to them, and this requires a radical re-situating of “identity” and “autonomy” amid the overt and covert systems of control of the present, with an eye to the future.

The Internet truly represented, for many, a potential field for liberating human discourse and freedom. This utopian ideal is not misplaced, but it lacks the realism with which we must assess the impact of technology on the human race. The rhetorical ideal of freedom and vibrant public discourse is not guaranteed. It is challenged by the dangers, as well as the opportunities, of “cyberspace” as a place for wielding rhetorical power, a power that can be and is abused by powerful economic interests in the name of political power. The citizen must certainly balance Foucault’s analysis with a healthy dose of optimism if we are to reclaim cyberspace as a free, openly accessible “digital commons” for all.

By Kevin Naruse @ 2017

Kevin Naruse is a web developer, social media consultant and blogger. You can visit his site at: kevinnaruse.com

Originally published at: http://kevinnaruse.com/blog/cyberspace-and-foucaults-panopticon-the-rise-of-surveillance-society/

Works Cited:

Campbell, John Edward, and Matt Carlson. “Panopticon.com: Online Surveillance and the Commodification of Privacy.” Journal of Broadcasting & Electronic Media 46.4 (Dec. 2002): 586. Communication & Mass Media Complete. 4 May 2008.

Fernback, Jan. “Profitable Surveillance: Online Community as Commercial Exploitation.” Conference Papers — International Communication Association (2003): 1-30. Communication & Mass Media Complete. May 2008. http://search.ebscohost.com/login.aspx?direct=true&db=ufh&AN=16028335&site=ehost-live

O’Regan, John P. “The Text as a Critical Object.” Critical Discourse Studies 3.2 (Oct. 2006): 179-209. Communication & Mass Media Complete. 4 May 2008.

Rybas, Sergey. “Contesting the Panopticon Metaphor: Online Education and Subjectivization of the Online User.” Conference Papers — International Communication Association (2007): 1-1. Communication & Mass Media Complete. 4 May 2008.

Categories: Editorial, Social Media
