
Bibliography by Pehr Hovey

Page history last edited by Pehr Hovey 10 years, 5 months ago



Annotated Bibliography Assignment


By Pehr Hovey, @LitPlus Twitter Visualization Project Team


1.  Huberman, Bernardo A., Romero, Daniel M. and Wu, Fang. Social Networks that Matter: Twitter Under the Microscope (December 5, 2008). Available at SSRN: http://ssrn.com/abstract=1313405


Huberman and colleagues take a quantitative look at the social networks present on the social networking website Twitter. Unlike Facebook and some other popular social networking websites, ‘friendships’ on Twitter are uni-directional: linking to another user does not necessarily mean they will link back to you. The team at HP Labs examined users’ follower links to separate the complete social network surrounding a group of users from the ‘one that matters’ – the set of users who could be considered actual friends and who communicate with each other on a regular basis. They find that while a Twitter user may have a dense network of followers and followees, the sub-network of just ‘friends’ is comparatively sparse. They gauge the meaningfulness of these Twitter connections by analyzing how prolific a user’s posting is relative to both the sheer total number of followers and the specific number of ‘friends’. They find that posting frequency increases steadily as the number of true friends increases, but stays roughly flat as the total number of followers grows. Their conclusion is thus that the underlying ‘hidden’ and sparse network matters more than the superficial large network of all followers.
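The core idea – extracting the sparse ‘friend’ sub-network from the dense follower graph – can be sketched in a few lines. The threshold-based definition below (a ‘friend’ is someone a user has directed at least two posts to) follows the spirit of the paper; the data here is invented toy data, not actual Twitter output.

```python
# Sketch of extracting the sparse 'friend' sub-network from directed-post
# counts. A 'friend' edge exists when the sender has directed at least
# `threshold` posts at the recipient. Toy data, for illustration only.

def friend_network(directed_posts, threshold=2):
    """directed_posts maps (sender, recipient) -> number of directed posts.
    Returns the set of (sender, recipient) edges that qualify as 'friend' links."""
    return {pair for pair, count in directed_posts.items() if count >= threshold}

posts = {
    ("alice", "bob"): 5,    # alice replies to bob often -> friend edge
    ("alice", "carol"): 1,  # a single mention is not enough
    ("bob", "alice"): 2,
    ("carol", "alice"): 0,
}

friends = friend_network(posts)
print(sorted(friends))  # [('alice', 'bob'), ('bob', 'alice')]
```

Note how the four-edge follower-style graph collapses to just two reciprocated ‘friend’ edges – a miniature version of the sparsification the paper reports.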



2.  McCullough, Malcolm. (2006). On the Urbanism of Locative Media [Media and the City]. Places, 18(2),. Retrieved from: http://www.escholarship.org/uc/item/84x6m3nf


Malcolm McCullough discusses the apparent disconnect between ‘media’ as a form of disconnected, passive entertainment and urban spaces. He points out that until recently, our mass-media infrastructure has been tailored for remote viewing and does not engage us in a local, community-based fashion. A one-size-fits-all approach to ubiquitous computing lets us do anything anywhere at any time, but has the consequence of stripping us of our sense of space and place. Locative media seems to offer a chance to break away from this remote past and more firmly integrate media into our spaces. While there is a never-ending current of corporate information infrastructure (credit cards, UPC barcodes, etc.), the time is right for bottom-up, folk approaches to make inroads. He points to RFID, Bluetooth, and other technologies that rely on close physical proximity to work: to interact with this technology we must be physically present in a space – the experience cannot be had remotely. This, he says, is how media and urbanism can coexist.

3.  Twitter API: http://apiwiki.twitter.com/


Twitter has a free Application Programming Interface (API), which is documented on a wiki. The API lets programmers query the Twitter system to perform searches and also submit new tweets for an account (if they know the account password). Searches can be done by keyword or user and can also be filtered by location. Location filtering uses Twitter’s geotagging system, whereby participating users automatically record GPS location data when they submit a tweet. This feature requires a mobile phone or other platform that can supply latitude and longitude data.

Data is returned in a variety of formats that are easily parsed by programs but are less readable than the regular search interface on the Twitter website. Most developers use dedicated code libraries to integrate Twitter into their programs, but the standard search interface is also publicly accessible in a web browser: sample data (in JSON format) for a search of tweets near Santa Barbara can be retrieved by entering a query URL directly into the address bar.
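Such a browser query is just an HTTP GET with the location filter encoded in the URL. The sketch below only constructs the query string; the `search.json` endpoint and `geocode` parameter reflect the search API of that era as documented on the wiki cited above, and should be treated as historical rather than currently working.

```python
# Build a geocode-filtered query URL for the (historical) Twitter Search
# API. Endpoint and parameter names are from the era's documentation and
# may no longer work; this sketch only demonstrates the URL construction.
from urllib.parse import urlencode

def build_geo_search(lat, lon, radius_km, query=""):
    """Return a search URL filtered to tweets near (lat, lon)."""
    params = {
        "q": query,
        # geocode filter format: "latitude,longitude,radius" with unit suffix
        "geocode": f"{lat},{lon},{radius_km}km",
    }
    return "http://search.twitter.com/search.json?" + urlencode(params)

# Tweets within 25 km of Santa Barbara, CA
url = build_geo_search(34.42, -119.7, 25)
print(url)
```

Pasting a URL like this into the browser returned the raw JSON payload directly, which is how the screenshot on the original page was produced.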



4.  Rieser, Martin. “Locative Media and Spatial Narrative”. NeMe blog. Available at http://www.neme.org/main/1000/locative-media-and-spatial-narratives


Martin Rieser takes a broad look at spatial narratives in this paper for NeMe, an NGO focusing on questions of art, culture, and contemporary theory. The paper specifically examines how historical examples of ‘locative media’ can inform contemporary interactive public art. GPS and mobile computing technology have enabled artists to take the ‘diegesis’, or fictional event-space, and bring it to a physical place: the public can experience a (fictional) story in real life instead of in a book or a museum. With augmented reality, we can meld the real and fictional worlds even further.


Rieser points out that while the technologies enabling this new wave of locative media are relatively new, the concept of a narrative tied to a space stretches back as far as historians can see. He specifically discusses sacred ritual spaces that appear precisely placed and designed for effects beyond their casual appearance. For example, spaces like Stonehenge and worship sites in the American Southwest are oriented and laid out to align with solar and lunar events. Ancient storytelling about cosmic events would have been aided by these special places, which became part of the story. Rieser suggests that for this new wave of digital locative media it is not enough to simply have things happen in physical space; the story must be embedded in the streets – the narrative must arise from the space, not simply be presented in it.


5.  Daniel W. Goldberg, A Geocoding Best Practices Guide, University of Southern California, GIS Research Laboratory, Los Angeles, California, November 2008. Available at http://www.naaccr.org/filesystem/pdf/Geocoding_Best_Practices.pdf

Geocoding is the process of converting an imprecise textual location, such as a street address or place name, into precise latitude and longitude coordinates on the Earth. Software systems use data on street layout to make a best guess at where along a street an address lies. The process is far from straightforward, since an address may be ambiguous or outdated: the same region may have several streets with the same name, or a single physical street may have several different names. With such potential for inconsistency, it is entirely common for two different geocoding systems to return different results for the same input. This may be acceptable for consumer mapping applications but can prove hazardous for serious scientific research.


Two data sets that were geocoded using disparate methods cannot be combined without almost certainly introducing error. Furthermore, unrecognized imprecision in an individual data point can skew the conclusions of a study in non-negligible ways. Daniel Goldberg’s book discusses the pitfalls inherent in using geocoded data in epidemiology. When tracking cancer rates over time and space it is important to have reliable geographic data. One study highlighted in the book drew definite conclusions about cancer likelihood based on how close subjects live to a freeway; upon further analysis it was discovered that up to 24% of the data points varied widely in freeway distance when geocoded using two different systems. This could have a dramatic effect on the validity of the study’s conclusions.
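The size of a geocoding discrepancy can be quantified with the haversine great-circle distance between the two coordinate pairs a pair of geocoders return for the same address. The coordinates below are invented for illustration, not taken from the study.

```python
# Quantify the disagreement between two geocoders for the same address
# using the haversine great-circle distance. Coordinates are invented
# purely for illustration.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Same (hypothetical) address, geocoded by two different systems:
system_a = (34.0522, -118.2437)
system_b = (34.0622, -118.2437)  # 0.01 degrees of latitude further north

gap_km = haversine_km(*system_a, *system_b)
print(f"geocoder disagreement: {gap_km:.2f} km")  # about 1.11 km
```

A gap of a kilometre is far larger than the freeway-distance bands such studies rely on, which is why silently mixing the two systems’ outputs can flip a data point from ‘near the freeway’ to ‘far from it’.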


Though Goldberg’s book is aimed at public health researchers, he points out that the same issues apply to any field that seeks to draw conclusions from geospatial data. The book pivots from assessing these problems to laying out several dozen best practices that can reduce uncertainty in the data and help standardize the scientific community’s approach to handling geospatial information.

