Author: 16kmccarthy

GamerGate Gets Personal

Since I was sick on Friday and unable to access the live stream of most of the speakers, I took it upon myself to dive into the world of GamerGate and do some research through Twitter. I thought this approach would help me clear up some of my misunderstandings of GamerGate and weigh both sides of the conflict, but it's possible that I feel even more misinformed after hours of scrolling through seemingly petty arguments on Twitter.

What struck me on Twitter was the difference in online identities between the anti-GamerGate and GamerGate presences. On one side, we have people, like Anita Sarkeesian, whose names, faces, and ideas are well known and well publicized. As an anti-GamerGate feminist, Sarkeesian has offered herself to the public as one of the faces of opposition to misogyny and harassment in video game culture. On the other side, we have multitudes of GamerGate presences that do not share their names, their faces, or any ideas about why misogyny and harassment are "okay," but only that they hate people like Anita Sarkeesian. I take issue with the stances of these individuals because, from what I saw on Twitter, they are not arguing against Sarkeesian's ideas, but picking apart her existence as a whole. In fact, there were very few tweets that intelligently argued against Sarkeesian's theories, but countless that addressed how she "is not a gamer" and why "feminism is bad." Sarkeesian has not based her talks and messages on her existence as a gamer, but on the issues that are present in gamer culture. Are these GamerGate supporters withholding their identities so that they do not have to justify what they say, or is their lack of identity a mirror of video game culture?

Another thing that REALLY struck me was coming across this tweet when scrolling through the #GamerGate feed.

[Screenshot: a tweet from @FrankPwnatra featuring Professor Dougherty's retweet]

Mainly it struck me because this Twitter user was featuring Professor Dougherty's retweet. But, beyond that, I realized that Professor Dougherty might not even know about it, because "@FrankPwnatra" in no way tagged Professor Dougherty, but only shared a photo of her tweet. This discovery further supports my point about identity in the GamerGate conflict. Professor Dougherty has mentioned on numerous occasions that she has kept the same handle online for years, maintaining a consistent and strong identity that is undoubtedly connected to her publicized studies and opinions. The GamerGate supporter, though listed under a real name, did not refute the information in the tweet with a factual argument, but attacked the "agenda" of anti-GamerGate as a whole (which, as listed here, is simply a set of means of protecting oneself online). This user also failed to tag Professor Dougherty in the tweet, protecting himself from any legitimate conversation that might have started from his call-out. Was he avoiding conflict, or did he simply not see the importance of tagging Professor Dougherty in his accusation?

I believe that Internet users, especially those who engage in argumentative conversation, need to be accountable for their words and actions online and to back up their arguments with factual information. In my opinion, it is also helpful to reinforce claims with a solidified identity. My experience on Twitter showed me that, unlike Anita Sarkeesian, many individuals involved in the GamerGate conflict do not assume any responsibility for their words or actions.

LambdaMoo for Me, LambdaMoo for You

My experiences using LambdaMoo were unlike any that I have ever had on a computer. Never before had I been introduced to an interactive space created purely from the imaginations of its users, using strictly descriptive text. When I first experienced LambdaMoo in class, I was extremely lost and initially confused about what users went there to do. However, once I entered LambdaMoo again last night, I recognized that it was more welcoming than a lot of other places, both online and in everyday life, and that the other, more experienced users were interested in helping the newer ones learn. I felt a sense of closeness with the other people in the Coat Closet with me, and almost didn't want to leave.

[Screenshot: conversation in the Coat Closet, including the R. Kelly joke]

I didn’t make this R. Kelly joke, but I wish that I did.

I felt torn between exploring the other places built in LambdaMoo and maintaining conversations with the other users. My experiences with LambdaMoo were odd for me mainly because I interpreted them as a strange cross between gaming and reality. As Dibbell stated, "all of these entities — rooms, things, characters — are just different subprograms that the program allows to interact according to rules very roughly mimicking the laws of the physical world." For this reason, the experience was pretty challenging. If I were having a conversation with these people in real life, would I leave unannounced? Where does one draw the line between expressing social graces and acknowledging that these are people we will never have to know? It was difficult for me to determine how I should behave in a place where I would not be permanently attached to my behavior, but where I was expected to treat it like reality. For the same reason, it is difficult for me to interpret the acts committed by Mr._Bungle that Dibbell described in the reading. Though no physical harm was done, LambdaMoo strongly represents a reality for some people. For instance, as Cherny discussed, omitting a wave as you enter the room can be rude to other users. With this in mind, we must be mindful of the sensitivities of other users when using LambdaMoo, because there is no easy way to determine what their LambdaMoo experiences mean to them.
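To make Dibbell's description a little more concrete, here is a minimal sketch, written in Python rather than the MOO language LambdaMoo actually runs on; the class names and behavior are my own assumptions, meant only to restate the idea that rooms and characters are small subprograms interacting under shared, world-like rules (including the "wave" gesture Cherny mentions).

```python
# Illustrative sketch only -- LambdaMoo itself is written in the MOO
# programming language, not Python. These classes are assumptions of mine,
# restating Dibbell's point that rooms, things, and characters are
# subprograms interacting under rules that roughly mimic the physical world.

class Room:
    def __init__(self, name):
        self.name = name
        self.occupants = []

    def broadcast(self, message):
        # Everyone present "hears" whatever happens in the room.
        for character in self.occupants:
            print(f"[to {character.name}] {message}")


class Character:
    def __init__(self, name):
        self.name = name
        self.room = None

    def enter(self, room):
        self.room = room
        room.occupants.append(self)
        room.broadcast(f"{self.name} enters the {room.name}.")

    def wave(self):
        # The social gesture Cherny discusses: skipping it can read as rude.
        self.room.broadcast(f"{self.name} waves hello.")


coat_closet = Room("Coat Closet")
kate = Character("Kate")
kate.enter(coat_closet)
kate.wave()
```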

Is it Magic?

According to Turner’s article, in the mid-1990s, the general prediction was that the Internet would create a utopia. “The Net would level social hierarchies, distribute and personalize work, and dematerialize communication, exclaimed pundits and CEOs alike.”

It is interesting to weigh this notion that the Web would, without doubt or negative consequence, improve our social condition against what it has really done for society. It is understandable to assume that the Web would be a flawless, enriching resource in itself. According to Jones's article, new technology has enormous potential to improve various industries in the future. Big Think's resident "futurist," Michio Kaku, predicts that, in the next few decades, we can look forward to technology that resembles the magic we've only encountered in movies (invisibility cloaks and shape-shifting).

But rewind to a time before the Internet was part of daily life. Danah Boyd would suggest that every aspect of the new media and technology we are familiar with now would seem magical from the perspective of someone in the mid-1990s. Whenever new media and new technology come along, we experience a cycle of wonder, adaptation, and indifference. I suspect that when the invisibility cloak comes along, it might follow this cycle too.

Certainly the Internet has changed and improved the lives of those who use it in some respect. But are technology and the Internet creating a "harmonious electrosphere" and linking the human race, or is the "World Wide Web" linking only a certain demographic of the human race? When thinking of words to describe the Internet and social networking sites, "harmonious" is nowhere near the top of my list. Sure, social networking sites have offered us great opportunities to connect and share ideas, visuals, and opinions. But to whom has the Internet offered these opportunities? What percentage of the world's population actually has access not only to a computer, but also to the Internet? The Web has connected people from different countries, but it has not dissolved the social hierarchies within them. Maybe the opinions we most need to hear to alter our social condition are stifled by a lack of technology. For this reason, I'm not so sure that utopia can exist until we all have a way of experiencing the magic.

Kate McCarthy

The Hedgehog and the Fox

Just as there are always two sides to a story, there are two sides to any given Internet user's experience with online text. Speaking from a personal perspective, the Internet has both hindered and helped my learning, writing, and reading. As much as the Internet has distracted me, it has also kept me mesmerized and informed with articles, photographs, and videos. As difficult as it is for me to read long stretches of text on a computer screen, the Internet has also allowed me to answer questions as soon as I formulate them in my head.

The tone of Rich's article suggests that those of us (my generation) who do not consistently read books will be less successful than those who do. Specifically, it insinuates that if our generation does not read paper books, we will be at a disadvantage in a job interview. In my experience and understanding, this inference is a leap. Especially in the field of communications, it is important for interviewees to be well rounded, both in their hobbies and activities and in their knowledge.

In one of my classes this week, Sir Isaiah Berlin's essay The Hedgehog and the Fox came up in conversation. The essay takes off from a line by the Greek poet Archilochus: "The fox knows many things, but the hedgehog knows one big thing." Berlin proceeds to classify different philosophers as either foxes or hedgehogs. In my mind, I equate these classifications with different styles of reading. Those who read a single book deeply are hedgehogs, whereas those who read a multitude of different articles online are foxes. Neither is better or worse; we are just different classifications of "scholars." However, I am curious to hear other opinions: is breadth of knowledge more valuable, or depth?

Certainly tweeting is not comparable to each of us writing a book, but I agree with Dibbell’s argument. Tweets should not be devalued. In the past, technology has been a symbol of progress. Twitter and other social media, though not always academic, are also forms of progress, and our generation has mastered the technology. Though I do not always feel brilliant when I am tweeting, I know that it takes more effort to articulate things concisely and capture attention than to ignore filters and ramble on for more than 140 characters. It is a strange mix of qualitative and quantitative that we have adapted to seamlessly. Does anyone else feel a sense of accomplishment when including multiple layers of information within 140 characters? I know I do, despite it taking 30 minutes to find something worth tweeting about:

[Screenshot: the resulting tweet]

Kate McCarthy

Nostalgia and New Media

Like most people born in the 1990s, I find that nothing brings me more nostalgia than the thought of firing up the unmoving eyesore called the Gateway 2000 and plugging in the Internet jack anytime we wanted to endure the process of dialing up the Internet (here are 10 hours of dial-up tones for your listening pleasure; click HERE). Not many memories from that time of my life are as vivid as playing "Orly's Draw-A-Story" and "Madeline European Adventure," or listening to my mom accusingly inquire about the location of the Hallmark Card Studio CD. I still haven't forgotten that my idea of graphic art was the even, neon-green spraying of the airbrush tool in Microsoft Paint.

How my time was spent in 1999

In my early usage of computers, my experience strictly involved what van Dijk would call the space dimension of interactivity. Who would I be interacting with online at the age of seven? Aside from Orly, Madeline, and the occasional dose of Carmen Sandiego, my interactivity, especially its degree of synchronicity, was low. Enter AIM and AIM e-mail. These programs introduced an entirely different facet of interactive communication for me as a young computer user. The synchronicity of my interactive communication skyrocketed. My classmates and I would go home after school and immediately log on to AIM to talk with the same people we had just seen minutes earlier. Conversation was rapid-fire and never-ending. It was from this point forward that I realized I would never be without consistent communication with my friends.

Miss it

As I grew, more forms of "new media" entered my life: MySpace, YouTube, Twitter, and Facebook. At this point in time, I would still categorize social media sites such as Twitter and Facebook as "new media" based on Gitelman & Pingree's qualification: "When a new medium is introduced, its meaning—its potential, its limitations, the publicly agreed upon sense of what it does, and for whom—has not yet been pinned down." Though we are perfectly aware of the function and audience of these sites, are we entirely sure that we can "pin down" their potential and limitations? The article suggests that users of new media have not yet developed a sense of when to stop. It also notes that "old media," which they consider to include television and radio, has been pinned down; old media is certainly more predictable and settled than the forms of social media I would consider "new." When will the time come for social media sites to graduate into "old media" under Gitelman & Pingree's definition? Yes, one of the great things about social media and the World Wide Web is that an individual can post anything he or she wants at any given time, but in my opinion it is this same factor that will keep these sites from being pinned down. There are no limitations, and that can be both a great thing and a bad thing.

 

Kate McCarthy