Wednesday, April 16, 2014

The Psychology of Facebooking



When considering the possible motivations for creating a Facebook account, happiness and connectedness come to mind.  But new studies, such as those collected in Maria Konnikova's New Yorker article, reveal a different result entirely.



University of Michigan psychologist Ethan Kross and his colleagues conducted a two-week study concerning social media and its psychological effects.  According to Konnikova, the study transpired as follows:


Over two weeks, Kross and his colleagues sent text messages to eighty-two Ann Arbor residents five times per day. The researchers wanted to know a few things: how their subjects felt overall, how worried and lonely they were, how much they had used Facebook, and how often they had had direct interaction with others since the previous text message. Kross found that the more people used Facebook in the time between the two texts, the less happy they felt—and the more their overall satisfaction declined from the beginning of the study until its end. The data, he argues, shows that Facebook was making them unhappy.
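
(For the data-minded: here is a toy Python sketch of the kind of experience-sampling analysis described above. The numbers and column names are invented for illustration; this is not the researchers' actual code or data.)

```python
# Toy sketch of an experience-sampling analysis (hypothetical numbers
# and column names; not the researchers' actual data or code).
import pandas as pd

# Each row is one text-message survey from one participant.
surveys = pd.DataFrame({
    "participant": [1, 1, 1, 2, 2, 2],
    "facebook_hours_since_last_text": [0.1, 0.5, 1.2, 0.0, 0.8, 1.5],
    "happiness_rating": [8, 7, 5, 9, 7, 6],  # self-reported, 0-10
})

# The headline finding would appear as a negative correlation:
# more Facebook use between prompts, lower reported happiness.
print(surveys["facebook_hours_since_last_text"].corr(surveys["happiness_rating"]))
```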

Surprised?  Results such as those found in Kross' study are not uncommon or even new.  "In 1998, Robert Kraut, a researcher at Carnegie Mellon University, found that the more people used the Web, the lonelier and more depressed they felt. After people went online for the first time, their sense of happiness and social connectedness dropped, over one to two years, as a function of how often they used the Internet."  

The trend of treating the Internet as an antidepressant has existed for over a decade.  But its effectiveness at curing deep-seated issues of self-esteem and self-worth is nonexistent.

Not only do social networking sites such as Facebook increase loneliness, but studies conducted by researcher Hanna Krasnova and her colleagues at Berlin's Institute of Information Systems suggest that Facebook increases envy among users.  The more people use Facebook, the more they notice the achievements of their "like-minded peers" as being (or appearing to be) superior to their own.

"We want to learn about other people and have others learn about us," writes Konnikova,"but through that very learning process we may start to resent both others’ lives and the image of ourselves that we feel we need to continuously maintain."  What attracts users to Facebook is the very quality which repels them;  users want to upload and advertise the best parts of their lives, their work, and their experiences, but can also be offended when other users have representations that exceed their own.

So what is the solution?  Reserving posts for deep, philosophical insight and intellectual discourse as opposed to selfies?  Limiting posts to once or twice per day?  Balancing Facebook usage with real-time social interaction?  

The results, like those gathered in Kross' study, will vary.  But recognizing the potential for these outcomes can help protect users from the various psychological effects that can set in while exploring the digital-social realm.



(Also, as a side note, check out Konnikova's book, Mastermind: How to Think Like Sherlock Holmes.  Unrelated?  Yes.  Worth it?  Absolutely.)








Blog Prompt: Bolter, Chapters 9 & 16


Create a blog post reflecting on the following quote:

"The virtual traveler sees and interacts with bodies, not minds, and she must be inclined to deny the traditional hierarchy in which we are minds and merely have bodies."  (p.249)


This quote seems to defy certain assumptions related to online activity, namely that it is more an excursion in mental exercise than a physical one.  But digital objects are tangible in that they are the expression of a physical process of creating a product (e.g., code, text, images, videos) that exists within a context (e.g., the Internet).  And in order for virtual travelers to interact with minds as opposed to bodies, the content with which they interact would have to embody all the attributes of its host.  Because all traits cannot be expressed, the virtual content represents only what Sigmund Freud would call the "Superego," in which one presents only a portion of one's personal and mental capacity within any given context.  This is what humans do on a normal basis; one does not exhibit the same traits within every context (e.g., school, home, parties, grandparents' houses, funerals) but rather is selective with the qualities one reveals at specific times and places.

The Heartbleed Compromise: How Safe Is the Digital Realm?


The recent discovery of the Heartbleed compromise has affected myriad online services and accounts.  But many are confused as to how it began, what is going on now, and how safe online users will be in the future.

One important fact to note is that Heartbleed is a defect, not a virus, as Forbes writer James Lyne states in his article.  The software defect lies in the open-source OpenSSL code used by a large number of online services, "which means," writes Lyne, "that the code is available for anyone to review and for those handy with code to contribute to."  The flaw can be exploited, and data held in an affected server's memory can be retrieved.
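
To make the defect concrete, here is a minimal Python simulation of this class of bug. The real flaw lives in OpenSSL's C heartbeat code; this sketch only illustrates the missing bounds check, not the actual implementation.

```python
# Toy simulation of a Heartbleed-style over-read (illustrative only;
# the real bug lives in OpenSSL's C heartbeat handler, not in Python).

SERVER_MEMORY = b"...private keys... session cookies... passwords..."

def handle_heartbeat(payload: bytes, claimed_length: int) -> bytes:
    # Pretend the payload sits next to other server data in memory.
    memory = payload + SERVER_MEMORY
    # BUG: echo back as many bytes as the client *claims* it sent,
    # with no check against the payload's actual size.
    return memory[:claimed_length]

def handle_heartbeat_patched(payload: bytes, claimed_length: int) -> bytes:
    # The fix: refuse requests whose claimed length exceeds reality.
    if claimed_length > len(payload):
        raise ValueError("heartbeat length mismatch")
    return payload[:claimed_length]

# An attacker sends one byte but claims forty, and receives whatever
# happened to sit adjacent in memory:
print(handle_heartbeat(b"A", 40))
```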

This is not the worst of it, however.  Lyne points out that "There are lots of tools and processes that would have turned up such a fault very quickly, yet it went unnoticed for an extended period of time and was adopted in to a staggeringly large number of places."


And of course, it gets worse.  Given the open-source nature of the project, one would expect that experts would have caught and fixed such a flaw quickly.  So why did this not occur?  "Unfortunately, the project is very under funded and reviewed given the critical role it plays."

With the lack of accountability for its functions and the seeming absence of monetary support, how can users expect to entrust the entirety of their identity to the digital realm?  For a service that is increasingly essential to the integrity of professional, economic, and social realms, how can one ever trust the Internet with the most sensitive collections of information?

"This is certainly not the first bug like this (though this was one of the more painful ones) and it is very unlikely to be the last."  Vulnerabilities such as Heartbleed will always exist on the Internet.  But will there one day be a defect, or even a virus, that completely wipes the digital slate clean and reduces our online way of life to nothing?  







Saturday, April 12, 2014

What DO We Really Mean By 'Art?'


As a musician, I have an open mind when it comes to aesthetic expression.  That being said, I still have a hard time looking at a Jackson Pollock painting and experiencing the same type of pleasure that I do when I view Monet's "Beach in Pourville."



Beach.




Bile?


Angela Montefinise, a writer for the Huffington Post, published an article on the very question of defining art.  Based on the experts she interviewed, including Bard College President Leon Botstein, the question of whether art is a definable entity persists in modern discussions.  Botstein states that "the important question is, what is art? Is there such a thing? Does it exist and how do we recognize it?"

Recognizing, according to Botstein, is the essential part of defining art.  "[The artists are] trying to communicate with us, with others who participate in this art in a way that somehow suggests or communicates something of real value that can't be said or can't be easily described."  Artists communicate what cannot be spoken through a medium which does not speak but shows.

And because it cannot be spoken, it cannot be quantified.  "So much of what we do in school gets compared on a metrical level...it's all measurable. Art is beginning to create criteria of value that is not all about measurement." 




The crux of Botstein's viewpoint, though, is that art prevents the most dangerous entity known to humanity: boredom.

"Boredom creates a sense of meaningless, pointlessness. It destroys our sense of the power of time.  The finding or creating of beauty in the world is enormous protection against a sense of meaningless and resentment and pointlessness."





Blog Prompt: Paige, Abigail, and Latisha

Blog prompt: Watch the following videos (one, two, or all three), please; you don't have to watch the whole thing.  Who is the artist here?  Is it the graphic designer, or do you believe it is the programmers and engineers behind the creations?  Is it only one, a culmination of a few, or are all people who work on them artists in their own right?  Explain.


Based on the discussion, the definition of art is relative to context.  The graphic designer gives the framework for the presentation (product-based), while the programmers and engineers complete the electronic and digital work necessary to make the presentation possible (process-based).  Thus, each of the professionals who contribute to the "art" should be considered an artist, because without one, the other does not exist.


Saturday, April 5, 2014

Underneath the Stars and Stripes: Why "Captain America: The Winter Soldier" Has Larger Social Implications Than One Would Expect


[WARNING: CONTAINS SPOILERS]

As I finished salivating over my ticket to the midnight premiere of Captain America: The Winter Soldier, I assumed my position in the theater, complete with 3D glasses, popcorn, and an endless chasm of pent-up nerd energy.  Granted, as a True Believer, I would have enjoyed the experience even if the movie had fallen short of its star-spangled glory.  But as the previews finished out (for other Marvel movies such as The Amazing Spider-Man 2 and Guardians of the Galaxy, speaking of nerd energy) and the events of the plot unfolded, I was astounded.



It wasn't just the phenomenal cast.



It wasn't just the astounding action sequences and special effects.



In conjunction with these epic qualities, it was the plot which made the movie fantastic.

No, really, I'm serious; despite being an action movie, especially one of the comic-book realm, the premise of the movie was grounded in an engaging and contemplative story arc.


Movie critic Owen Gleiberman writes on this quality of the movie, saying that,
"When the Captain is surrounded by government officials on an elevator, and he realizes that none of them are on his side, the fight scene that follows isn't just brutally exciting. It expresses the film's theme: that you can't trust anyone in a society that wants to control everyone."

 Gleiberman highlights the relevance of the plot to modern times: "In his armored van, [Nick Fury] is attacked by shadow forces that want to militarize the world and make spying as common as breathing. Sound like anything you've read recently?"  The critic could be referring to hot button issues such as the President's policy on drones, the security discrepancies with the NSA, et cetera, et cetera. 

When audience members get a glimpse of the immense power of Project: Insight, even before the corrupt influence of H.Y.D.R.A. is revealed, Captain America raises the question: is it right to use such power on criminals without due process of the law?  Should a criminal be executed from 3,000 feet via red-hot death as opposed to being arrested and processed? 

The point is, Gleiberman writes, "What works here is setting up Captain America in a battle against ... America.  That's the way to turn a super-square into an awesome antihero."  Cap's boy-scout image gets a makeover when he's on the run from the world's largest police force.  The enemy-of-the-state element of the movie "plugs you right into what's happening now."



Digital Justice: Preventing Crime through the Utilization of Big Data

The idea of using data to predict crimes used to be reserved for fictional shows such as Intelligence or Person of Interest.  Now, the process may become a reality.

Marc Rogers, a principal security researcher at Lookout, recently had an article published in The Guardian on the subject of using "machine learning" as the basis for crime prevention.  "In this age of custom malware and targeted advanced persistent threats," writes Rogers, "slow, resource-intensive security is a big problem."  Rogers points out that the increasing frequency of cyber attacks, such as the one on the New York Times by Chinese hackers, highlights the rapid advancements being made by the bad guys.


So what is the solution to all of these online woes?




"We can use machines to identify more complex signals and relations in datasets far bigger than any human could analyze."  This approach stresses the importance of avoiding the singular, myopic approach to analyzing strings of data.  Rogers comments on the highly specific nature of malware detection that security agencies currently utilize:  "security systems were limited to searching in small chunks of that data for a few tell-tale bytes – those being a pattern of data or an antivirus signature unique to a specific attack. Found some matching bytes? Bad guy possibly detected. Didn't find the bytes? Not conclusive."

This system rules out the potential for closely-related-but-slightly-different malware to be detected.  Rogers' solution, in turn, involves the broadening of the cyber horizons.  "Instead of just creating a signature for each piece of malware," says Rogers, "we can build a database of all the malware and everything associated with the people who use it, from their development accounts through to the servers they use and how they plan to monetize."  Patterns of behavior, including the utilization of specific codes, can give analysts and law enforcement agencies valuable insight into the behavior of potential suspects and criminals. 
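
Here is a rough Python sketch of the contrast Rogers draws between signature matching and broader, feature-based scoring. The signatures, feature names, and weights are all invented for illustration; real security systems are vastly more sophisticated.

```python
# Illustrative contrast between signature matching and feature-based
# scoring (signatures, feature names, and weights are all invented).

# 1) Signature matching: hunt for exact tell-tale bytes.
KNOWN_SIGNATURES = [b"\xde\xad\xbe\xef", b"evil_payload_v1"]

def signature_scan(sample: bytes) -> bool:
    return any(sig in sample for sig in KNOWN_SIGNATURES)

print(signature_scan(b"...evil_payload_v1..."))  # True: exact match
print(signature_scan(b"...evil_payload_v2..."))  # False: one byte changed

# 2) Feature-based scoring: weigh broader signals about the sample and
# the people behind it, so a small tweak to the code no longer hides it.
def suspicion_score(features: dict, weights: dict) -> float:
    # A trained model would learn these weights from a large corpus.
    return sum(features.get(name, 0.0) * w for name, w in weights.items())

weights = {
    "contacts_known_c2_server": 0.5,
    "reuses_known_dev_account": 0.3,
    "payload_entropy": 0.2,
}
sample = {"contacts_known_c2_server": 1.0, "payload_entropy": 0.8}
print(suspicion_score(sample, weights))  # 0.66: worth a closer look
```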

Seems like a great idea...unless the technology goes the other way.



Rogers' altruistic approach to utilizing "big data" for crime prevention only looks at one half of the equation.  If such technology can be used to look for any and all code types, who is to say that a criminal could not hack the database, change the code types, and make their malware "invisible" to the system?  What if the dirtbag jerry-rigged the database to hunt for code related to bank information, setting up malware that uploads banking information from thousands of online accounts to his or her own off-shore account?

Rogers acknowledges that he has not yet found the absolute perfect system for digital crime prevention. "While this isn't going to solve the problem of advanced attacks and malware alone," he says, "for the first time the tables are turned and the good guys are gaining an advantage."  That is surely a step in the right direction.







Wednesday, April 2, 2014

"Too Big to Know: Rethinking Knowledge Now That The Facts Aren't The Facts, Experts are Everywhere, and the Smartest Person in the Room Is the Room."


David Weinberger's lengthy title hits the nail on the head.

The author points out that writers and experts in a given field often emphasize the wrong things.  The British Petroleum oil spill, for example, became the main topic of discussion when "the real topic is the limits of expert knowledge in tackling complex problems" (vii); within a discussion, the surface issue, or the result of the problem, takes precedence when the knowledge processes leading up to the issue are the actual cause.  And in questioning the knowledge, one must question the foundation on which said facts are built.

Weinberger discusses the history of facts, and in doing so points out that the basis of knowledge comes from facts as opposed to "logical deduction[s] from grand principles" (26).  Facts are the "bricks" of the knowledge "house" that construct our paradigms of values, truth, and reality.  This way, knowledge built and supported by solid facts cannot be widely discredited or challenged.  And because no single person can possess the whole of humanity's knowledge, we rely on secondary sources and follow the "footnotes" of other sources (21).

His approach to dissecting the basis of knowledge, and the limits of the human capacity for full understanding, shows that, for the most part, people are asking the wrong questions when it comes to information, data, and knowledge as a whole.

Sunday, March 30, 2014

iPhone VS. Traditional Camera: What the Experts Say



Over spring break, I travelled to Boston in order to participate in an a cappella competition.  On my downtime, I explored the city with my dad, taking pictures at almost every location.  But I did not do so with a standard camera; instead, I used my iPhone 5S to snap said pictures.

But are they up to snuff?

In late 2013, Sean Captain, a writer for a technology magazine called Tom's Guide, published an article about the growing use and popularity of smartphone cameras in relation to their ability to shoot quality photos.  He discusses how, in the beginning stages of Apple's iPhone development, "The first three iPhone models (original, 3G and 3GS) had abysmal cameras that, ironically, may have jump-started the photo-app craze."  The ability to quickly point, shoot, store, and share photos outweighed the quality. 

However, Captain notes that in 2010 Apple made advancements in quality that put the iPhone on the map with other high-end cameras.  The introduction of a sensor in the iPhone 4 that included "just-emerging technology called (unfortunately) backside illumination" propelled the iPhone into photographic stardom.  "This tech moves a lot of the wiring to the back of the sensor where it won't block light from reaching the pixels in the front."  Apple continued this trajectory in the newest iteration of the iPhone with the "True Tone flash" feature, which "mi[xes] the light from two LEDs — one white and one amber...[and] match[es] the lighting in any environment, complementing rather than contrasting with it."




In comparison to traditional cameras, the iPhone has also made advancements in its high-speed capabilities.  Captain describes the iPhone's ability to shoot 10 photos per second, saying that "The iPhone analyzes burst[s] of images and picks the best one to use as the final photograph. Sony pioneered similar technology several years ago in its Cybershot point-and-shoot cameras and then expanded out to higher-end models, such as its NEX Mirrorless cameras. That was a killer feature that point-and-shoots had over smartphone cameras. Not anymore."  Clearly, Apple was gaining on the competition.
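
Apple's selection logic is proprietary, but a common heuristic for picking the best frame is to score each image for sharpness and keep the winner. Here is a minimal Python sketch of that idea, using the variance of the Laplacian, a standard focus measure; this illustrates the general technique, not Apple's algorithm.

```python
# Minimal "pick the sharpest frame" heuristic (not Apple's algorithm):
# score each frame by the variance of its Laplacian, a standard focus
# measure, and keep the highest-scoring one.
import numpy as np
from scipy.ndimage import laplace

def sharpness(frame: np.ndarray) -> float:
    # More edges -> stronger Laplacian response -> higher variance.
    return float(laplace(frame.astype(float)).var())

def pick_best_frame(burst: list) -> np.ndarray:
    return max(burst, key=sharpness)

# Usage with ten hypothetical grayscale frames from a burst:
burst = [np.random.rand(480, 640) for _ in range(10)]
best = pick_best_frame(burst)
```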


So, in all, what makes the difference?  Portability?  Quality?  Price?  Tradition?

This is a photo of the view from the Performing Arts Theater at the Berklee College of Music in Boston. I shot this photo with my iPhone 5S.  You decide, America.


Wednesday, March 12, 2014

How to Change Your Baby's Life...Forever


As in every Conducting II class, my professor used a bizarre anecdote that related to a technical issue in a particular student's conducting.
   
This week, a student conducted the group in a four-part choral arrangement of "O Nata Lux" by Morten Lauridsen, an excerpt from his cycle Lux Aeterna.  One thing the student conductor failed to do was breathe with the singers on their entrances.  Instead of immediately launching into a discussion on synchronized breathing and cueing techniques, my professor began to describe his experience with the following commodity:




As a father of two infant children, my professor explained how the apparatus was a "life-saver" for him and for his children.  And the student gained a newfound understanding of the importance of breathing correctly.  Meanwhile, the rest of the class had a laugh for five minutes or so.





Saturday, March 8, 2014

AI: Artificial Invasion



The end of the world: not necessarily the best topic for small talk at a party, but one that merits attention nonetheless.  So then, among the experts at institutes such as the Future of Humanity Institute at the University of Oxford, UK (whoa, that place isn't real, is it?  Yes, it is.), what outcome has the largest probability of decimating the human race?


Zombie outbreak?



Nope.



Nuclear war?



Not really.



A coup d'état perpetrated by various forms of artificial intelligence?



Bingo.


Unfortunately, there is not much attention given to the subject. “I think there’s more academic papers published on either dung beetles or Star Trek than about actual existential risk,” says Stuart Armstrong, a philosopher and Research Fellow at the institute.

AI in particular receives little focus or discussion.  Regardless, people such as Armstrong stress the importance of the issue.  In his research, he gives a statistical example detailing the possibility of total human extinction in various end-of-the-world scenarios.

“One of the things that makes AI risk scary is that it’s one of the few that is genuinely an extinction risk if it were to go bad. With a lot of other risks, it’s actually surprisingly hard to get to an extinction risk... You take a nuclear war for instance, that will kill only a relatively small proportion of the planet. You add radiation fallout, slightly more, you add the nuclear winter you can maybe get 90%, 95% – 99% if you really stretch it and take extreme scenarios – but it’s really hard to get to the human race ending. The same goes for pandemics, even at their more virulent."


But it isn't the Terminator that Armstrong fears.  He fears the AI that is "smarter than us – more socially adept...better at politics, at economics, potentially at technological research.”

Armstrong goes on to discuss the myriad factors that accelerate the "Robot Uprising."  One factor, which can be seen in modern society, is the presence of machines in the workforce.  Armstrong expands the concept tenfold, saying that “You could take an AI if it was of human-level intelligence, copy it a hundred times, train it in a hundred different professions, copy those a hundred times and you have ten thousand high-level employees in a hundred professions, trained out maybe in the course of a week. Or you could copy it more and have millions of employees… And if they were truly superhuman you’d get performance beyond what I’ve just described.”

Okay, so robots take our jobs.  But where does the human extinction part come into play?

“Take an anti-virus program that’s dedicated to filtering out viruses from incoming emails and wants to achieve the highest success, and is cunning and you make that super-intelligent... Well it will realize that, say, killing everybody is a solution to its problems, because if it kills everyone and shuts down every computer, no more emails will be sent and as a side effect no viruses will be sent."

Wow.

But perhaps a safeguard, such as the "Three Laws of Robotics" from Asimov's I, Robot, would keep us mouth-breathers safe?

“It turns out that that’s a more complicated rule to describe, far more than we suspected initially. Because if you actually program it in successfully, let’s say we actually do manage to define what a human is, what life and death are and stuff like that, then its goal will now be to entomb every single human under the Earth’s crust, 10km down in concrete bunkers on feeding drips, because any other action would result in a less ideal outcome."

Oh.


With the rapid expansion of machinery in the workforce, the home, and even the battlefield, researchers such as Armstrong are becoming more vocal about their extrapolations.   If the "AI Apocalypse" comes, and humanity has no defense, then it'll be:




before you can say "The Humans Are Dead."


(Sorry, I couldn't help myself.)








Wednesday, March 5, 2014

Response to Shirky: Leave It to the Professionals?



What values are being presented in modern society?  How does technology affect those values?  

Clay Shirky, author of Here Comes Everybody, offers insight into the social and economic contexts surrounding the dilemma described in Chapter 1, in which a woman loses her phone and struggles to recover it from the girl who found it in a taxi cab.  Because of the woman's friend, Evan, the story was able to gain local, national, and global attention.

But is this a positive aspect of technology?  Sure, Evan used his digital prowess to help his friend get her phone back.  Sasha and her party were being unreasonable and cruel, and given the price of the phone, it was no laughing matter.  However, can this digital networking be used for nefarious purposes?  Shirky mentions that Evan was "able to escape the usual limitations of private life and to avail himself of capabilities previously reserved for professionals" and recover the phone for his friend.  Yet what long-term effects come from undermining the professionals themselves?



The Internet, in its sheer vastness, allows anyone to upload anything at any time.  Non-professionals, who, before the World Wide Web, were left to shout and hawk their wares on street corners, now have a voice, and that can be dangerous.

Tatev Harutyunyan, an Armenian journalist, writes in an article that "the dramatic expansion of online media has damaged Armenian journalism, flooding it with a new generation of unprofessional editors and journalists."  Information lacks sources and proper citation, important topics are undermined and replaced by trivial entertainment, and uniformity in the news business dissolves.

Could a similar trend be growing in the United States?


I don't know, I'm kind of bored.


Hey, look!  


It's Hulk smashing the Asgard out of Loki!


HAHAHAHAHAHAHAHAHA



* Emery, Adam.  "Puny God."  The Revengers. 2014.  Sony Pictures. 

Saturday, March 1, 2014

The YouTube Generation


Almost a year ago today, Google ran an article that analyzed the question of "who watches and contributes to YouTube the most?"  

The research concluded that millennials constitute the "YouTube generation."  According to this study, over 80% of millennials are part of the website's audience.  Based on its findings, Google has placed a label on said viewers that not only defines their demographic but also outlines their state of mind — Gen C.





In describing this new label, Google says that "...Gen C is more than an age-group; it's a mindset defined by creation, curation, connection and community."  It is a self-sustaining, expansive, dynamic entity that cultivates expression in every sense of the word.  


Now, while this information may point out a fairly obvious fact, namely that twenty-somethings make up the general population of YouTube and of social media in general, Google goes on to provide more stats about Gen C.





On the "social" front:



Okay, nothing earth-moving there.  Social media has grown in popularity ever since its conception.  The fact that millennials are proficient in the ways of digital media and technology seems fair.


On the technology front:



Slightly creepy choice of words, even for Google, but still no shocking revelation.  With multiple events and conversations going on at once across the Internet, it's not a stretch to say that smartphone users have a hard time unplugging.



Millennials use smartphones, smartphones have the YouTube, millennials use smartphones to watch the YouTube. Transitive property at work.





On the economic front:



Wait, how much?




Thank you, Google.


500 freakin' billion dollars.  

Generated by Tweets?  And Facebook posts?  And Tumbling???

Ginny Marvin, a writer for Marketing Land who focuses on online marketing and revenue, concluded that, at the end of 2013, "social media finally seems to have become a part of the revenue equation for many companies."

(Since you are probably tired of reading numbers, you can view Marvin's statistics and colorful charts here).

Does social media carry more power than previously thought?  Does the generation that spends too much time posting and taking selfies actually contain the potential to change the direction of society? 

Until that question can be answered definitively, please enjoy tank missile:




(Because really, nothing is more BA than Iron Man dodging a round, shooting a slightly delayed, lethal explosive, and walking away.
You're welcome.)