
Announcement: Research Reports (By Posting Date)

Research reports focus on high-priority items in “Objects for Study.” Reports are written in a standard format designed both to synopsize the topic and to offer a preliminary evaluation of the opportunities it suggests for the Transliteracies Project’s goal of improving online reading.

ConceptVISTA


Summary

ConceptVISTA is free software created through the collaboration of researchers at the Geographic Visualization Science, Technology and Applications Center (GeoVISTA) at Pennsylvania State University, the University of Southampton, the University of Leeds, and UC Santa Barbara. Built on the principles of the semantic web, ConceptVISTA is a browser for mapping concepts, teaching objects, and their related contexts in a way that “allows users to define and link concepts and resources pertaining to a conceptual domain.”1 It is designed to provide a rigorous user-based environment in which users can add, view, and manage information related to concepts and resources, primarily for pedagogical purposes, through a web browser. Using the Web Ontology Language (OWL)2, it can also import and organize outside ontologies.

(more…)

SNAC

Research Report by Lindsay Thomas

Related Categories: Search & Data Mining Innovations | Online Knowledge Bases | Social Networking Systems

Summary:

The goal of the Social Networks and Archival Contexts Project (SNAC) is to rethink the ways in which primary humanities resources are described and accessed. A collaborative project from the Institute for Advanced Technology in the Humanities at the University of Virginia, the School of Information at the University of California at Berkeley, and the California Digital Library, SNAC uses the new standard Encoded Archival Context — Corporate Bodies, Persons, and Families (EAC-CPF) to “unlock” the descriptions of the creators of archives from the records they have created. The project aims to create open-source tools, not yet released, that allow archivists to separate the process of describing people from that of describing records, and to build a prototype online platform, released in December 2010 and currently in alpha stage, that links descriptions of people to one another and to descriptions of a wide variety of resources. The project received $348,000 over two years, starting in May 2010, from the National Endowment for the Humanities. The project team is directed by Daniel Pitti and Worthy Martin and also includes Ray Larson, Brian Tingle, Adrian Turner, and Krishna Janakiraman [i].

(more…)

Narrative as Metadata


Research Report by Eric Chuk

(created 3/17/10)


Related Categories:

Summary:

We have seen RoSE grow into a linked collection of references to documents and people of various eras, and we know intuitively that each of these entities, living or not, has a story, has significance. Or more precisely, we can study and ascribe meaning to them separately and in clusters. Doing so might also be called a form of storytelling, or making sense of the data by tracing a certain vantage point or line of inquiry. The very acts of telling and seeing have deep ties to narratology. (more…)

Zotero


Summary:

Zotero is an information manager that, as an extension for Firefox, facilitates storing citation information as well as clipping and sorting research material on the web. Funded by the United States Institute of Museum and Library Services, the Andrew W. Mellon Foundation, and the Alfred P. Sloan Foundation, Zotero is a product of the Center for History and New Media at George Mason University. The public beta version of Zotero 1.0 launched on October 5, 2006, while the most recent version went live in May 2009.

(more…)

WorldCat Identities


Summary:

In 2007, WorldCat, the union catalog of some 71,000 libraries that participate in the Online Computer Library Center (OCLC), implemented WorldCat Identities, which provides a summary page for every name in WorldCat. WorldCat currently includes over 30 million names of authors, filmmakers, fictional characters, actors, and other subjects of published works. The project team, led by Thomas Hickey, included Ralph LeVan, Tom Dehn, and Jenny Toves. It is currently in its beta stage.

(more…)

Freebase


Research Report by Renee Hudson

(created 1/30/2010; version 1.0)

Related Categories: Social Networking Systems | Tools for Analyzing Social Networks | Online Knowledge Bases | Text Visualization


Summary:

On March 9, 2007, Metaweb Technologies unveiled Freebase.com, an open database that pulls information from public sources like Wikipedia, MusicBrainz, Netflix, and the Securities and Exchange Commission. Like a wiki, Freebase also allows users to add their own content to contribute to the knowledge base of the website. However, unlike search engines that point a user toward a list of websites, Freebase connects users with clusters of information that emphasize relationships among pieces of data.

(more…)

MediaCommons


Summary:

MediaCommons is a project led by Kathleen Fitzpatrick and Avi Santo currently in development at USC as part of the Institute for the Future of the Book.  The project is a network for media scholars and students, providing access to scholarship beyond traditional peer-reviewed journals in the form of wikis, blogs, journals, and other digital media.  A major goal of the project is to re-imagine the scholarly press as a social network through which scholars can, via the affordances of digital network technologies, produce new knowledge about their specific fields.  As it stands, MediaCommons also has two projects in development, In Media Res and MediaCommonsPress.

(more…)

Academia.edu


Research Report by Renee Hudson

(created 11/20/09)

Related Categories:

Summary:
Richard Price, who recently earned his Ph.D. in Philosophy at All Souls College, Oxford, launched Academia.edu on September 16, 2008. The website seeks to answer the question “Who’s researching what?” by taking a social networking approach to academic relationships. Academia.edu illustrates these relationships as a genealogy in which users are grouped by university, then department, then their position within their department. Users add content based on their research, including what papers they have written, their research interests, and their advisors.

Description:
Academia.edu uses a tree-like genealogical structure to organize academic relationships. Colleges and universities are listed in a row along the top of the page; clicking the background and dragging it left or right allows a user to scroll through other universities. Using the search box in the middle of the page, users find their college or university and, if the school is not found, add it to the list of universities and colleges. Once users find their university, they see departments organized alphabetically on a tree that stems from it. Users then locate their department, or add it if it is not already part of the tree. (more…)
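The genealogical structure described above can be modeled as a simple nested mapping from university to department to members. The sketch below is a minimal illustration of that tree; all names and functions are hypothetical, not Academia.edu’s actual data model or API.

```python
# Toy model of the university -> department -> members tree.
# Everything here is illustrative, not the site's actual schema.

def add_member(tree, university, department, user):
    """Insert a user under a department, creating missing nodes on the way."""
    tree.setdefault(university, {}).setdefault(department, []).append(user)

def members(tree, university, department):
    """List users in a department, or an empty list if the node is absent."""
    return tree.get(university, {}).get(department, [])

directory = {}
add_member(directory, "Example University", "English", "A. Scholar")
add_member(directory, "Example University", "English", "B. Reader")
add_member(directory, "Example University", "Philosophy", "C. Thinker")

# A user whose school is missing simply creates a new top-level node,
# mirroring the "add it to the list" behavior described above.
add_member(directory, "New College", "History", "D. Writer")

print(members(directory, "Example University", "English"))
```

Because each level is created on demand, the tree grows exactly the way the interface does: first the school, then the department, then the individual.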

A Comparison of Development Platforms for Social Network Data Visualizations

Research Report by Salman Bakht
(created 10/22/09)


This report explores several software development platforms that may be used in developing web-based data visualizations. This report particularly focuses on comparing the suitability of these platforms for developing dynamic social network and document visualizations for ProSE (Professional Social Environment), the social network environment developed by the Bluesky Group of the Transliteracies Project. Adobe Flash (www.adobe.com/products/flash), Adobe Flex (www.adobe.com/products/flex), OpenLaszlo (www.openlaszlo.org), and Processing (www.processing.org) are examined herein. These platforms are first described in general terms in the Overview section, and then they are compared in terms of several factors, such as licensing and cost, accessibility, and ability to interface with databases. (more…)

Document Database Integration for the Professional Social Environment (ProSE)


Research Report by Salman Bakht

(created 10/6/09)


ProSE (Professional Social Environment) is a social network environment developed by the Bluesky Group of the Transliteracies Project. While online reading interfaces such as the Professional Reading Environment (PReE), being developed by the Electronic Textual Cultures Laboratory (ETCL), provide sophisticated access to data derived from documents in a professional or scholarly field, ProSE provides access to the social network connected to the field. ProSE models social networks in a way that seamlessly combines professional readers and writers, both contemporary and historical. Consequently, ProSE is designed to populate its social network database from existing databases within one or more fields of study to supplement user-created entries. This report describes the following databases, which may be integrated into ProSE:


  • English Broadside Ballad Archive (EBBA): a database of seventeenth-century broadside ballads, created by the Early Modern Center at UC Santa Barbara.

  • Early English Books Online (EEBO): a database of text images from 1475-1600.

  • Early English Books Online-Text Creation Partnership (EEBO-TCP): coding of the full text of 25,000 works in EEBO.

  • The Renaissance English Knowledgebase (REKn): a database developed at ETCL consisting of primary and secondary sources related to the Renaissance.

  • The Iter Bibliography: a bibliographical database for articles, essays, books, dissertations, encyclopedia entries, and reviews pertaining to the Middle Ages and Renaissance (400-1700).

(more…)

TimesPeople

Research Report by Renee Hudson

(created 6/07/09; version 1.0)

Related Categories: Social Networking Systems | Collective Reading

Original Object for Study description

Summary: The New York Times released TimesPeople on June 18, 2008 with the goal of creating a social network based on sharing content from the website. Initially available only as a plug-in for Firefox, TimesPeople became more widely available in September 2008, when it gained additional features such as the ability to sync TimesPeople activity with Facebook accounts. More recently, in February, The New York Times added the TimesPeople API to its current list of APIs to facilitate interaction with the Times outside of the website and to move one step closer to reimagining the future of the news through collaboration with developers.

While TimesPeople markets itself as a social network, its stripped-down style makes it more of a tool than a network. Like Twitter, rather than having “friends” on the site, users follow other users and are followed in turn. Profiles are limited to handle, location, and image. Rather than creating content like a blog post or a Facebook note, users interact with the NYTimes.com website through their actions on Times-created content. This content ranges from slideshows and articles users share with others to comments, reviews, and ratings of movies and restaurants.

(more…)

Jazz as an Extended Metaphor for Social Computing

Research Report by Aaron McLeran
(created 5/17/09; version 1.0)

Related Categories: Social Networking Systems | Online Social Networking (Tools for Analyzing)

The Ontological Problem of Social Computing

At UCSB’s Bluesky Social Computing Group, part of the University of California’s Transliteracies Project, we are tasked with researching the impact of social computing as a tool for collaborative research and with exploring new ways in which social computing might be used in the future. We have been confronted with the problem of how to conceptualize what social computing means and what we mean when we talk about collaborative research. This issue is further exacerbated by the variety of social computing experiences; broadly considered, anything related to the internet is, by definition, a form of social computing. Recent efforts have focused on developing and applying a deeper understanding of the ontological meaningfulness of concepts like “person” and “relationship.” Most realizations of digital social networks have trivial and naive answers to these questions, a situation which fundamentally limits their usefulness. To this end, Bluesky project leader and English department chair Dr. Alan Liu has suggested that new metaphors for social computing are needed.

(more…)

MONK Project


“MONK” by Salman Bakht, Pehr Hovey, and Kris McAbee.

(Created 4/23/09)

About the Authors:
Salman Bakht is a new media artist and composer currently studying in the Media Arts and Technology Program at UC Santa Barbara. Salman’s work focuses on the reuse and transformation of recorded audio using algorithmic composition methods. He is interested in creating art which analyzes, represents, and integrates with the physical environment and the media landscape.

Pehr Hovey is an algorithmic artist and researcher interested in the intersection of arts and technology. He is studying the mapping between audio and visual domains and how visual stimuli can be tightly integrated with the aural environment. He recently graduated with degrees in Computer Science and Computer Engineering from Washington University in Saint Louis and is currently a Masters student in Media Arts & Technology at UC-Santa Barbara.

Kris McAbee [Under Construction]

PDF version of the research report.


Summary
MONK, which stands for “Metadata Offer New Knowledge,” is a single digital environment of literary texts that endeavors to make “modern forms of text analysis and text mining accessible to humanities scholars” {1}. The metadata associated with any given document in the MONK environment ranges from data about individual words, to data about discursive organization, to bibliographic data. MONK offers the ability to read back and forth between these different levels of data and, therefore, to read as closely or as distantly as one wants. The current collection of texts in the MONK prototype consists of about 1200 works, including approximately 500 texts of various genres published between 1533 and 1625, alongside about 700 works of English and American fiction from about 1550 to 1923. Fundamentally, MONK assumes that operating through “coarse but consistent encodings across many texts in a heterogeneous document environment” offers significant scholarly benefit. The single environment that will bundle these operations for this large collection of texts will be housed at http://monkproject.org. (more…)

CommentPress Research Paper

“CommentPress” by Pehr Hovey and Renee Hudson.
(version 1.0; created 4/22/09)

About the Authors:
Pehr Hovey is an algorithmic artist and researcher interested in the intersection of arts and technology. He is studying the mapping between audio and visual domains and how visual stimuli can be tightly integrated with the aural environment. He recently graduated with degrees in Computer Science and Computer Engineering from Washington University in Saint Louis and is currently a Masters student in Media Arts & Technology at UC-Santa Barbara.

Renee Hudson received her BA in English at Stanford University and is currently a PhD student in English at UCLA. She specializes in twentieth century American literature. Her research interests include media theory, terrorism, and political violence.

PDF version of the research report.

Summary
The Institute for the Future of the Book officially launched CommentPress 1.0 on July 25, 2007 as a theme for WordPress. The initial project developed in 2006 when the Institute approached McKenzie Wark about creating an online version of his book Gamer Theory. In order to emphasize conversation about the book, the Institute departed from the typical blogging structure by placing comments next to the text rather than directly beneath it. The online project, called GAM3R 7H30RY, was so successful that it sparked similar projects, among them Mitchell Stephens’s “Holy of Holies: On the Constituents of Emptiness” and Lewis Lapham’s “The Iraq Study Group Report.” (more…)

Open Journal Systems


“Open Journal Systems” by Salman Bakht, Pehr Hovey, and Aaron McLeran.

(version 1.0; created 4/17/09)

About the Authors:

Salman Bakht: [Under Construction]

Pehr Hovey is an algorithmic artist and researcher interested in the intersection of arts and technology. He is studying the mapping between audio and visual domains and how visual stimuli can be tightly integrated with the aural environment. He recently graduated with degrees in Computer Science and Computer Engineering from Washington University in Saint Louis and is currently a Masters student in Media Arts & Technology at UC-Santa Barbara.

Aaron McLeran: [Under Construction]

PDF version of the research report.

Summary
Open Journal Systems (OJS) is an online management and publishing system for peer-reviewed journals developed by the Public Knowledge Project, a partnership among the University of British Columbia, the Simon Fraser University Library, the School of Education at Stanford University, and the Canadian Centre for Studies in Publishing at Simon Fraser University. Open Journal Systems is open-source software designed with the purpose of “making open access publishing a viable option for more journals, as open access can increase a journal’s readership as well as its contribution to the public good on a global scale”{1}, and it provides tools to aid with every stage of the publishing process. Additionally, OJS offers a set of reading tools which can optionally be added to a journal. As of January 2009, OJS was used by over 2,000 journals {2}. (more…)

World Without Oil (ARG)

Research Report by Lindsay Brandon Hunter
(created 9/11/08; version 1.0)

Related Categories: Alternative Interfaces | Collective Reading

Original Object for Study description

Summary: World Without Oil was an alternate reality game developed by Ken Eklund and Jane McGonigal with ITVS (Independent Television Service) Interactive, and funded by the Corporation for Public Broadcasting. Originally played between April 30 and June 1, 2007, World Without Oil was conceived as both an ARG and a “serious game,” in the sense of the Woodrow Wilson International Center for Scholars’ 2002 Serious Games Initiative. The game’s tag line, “Play it before you live it,” emphasized the what-if nature of the game: players were encouraged to explore what would change in their own realities in the event of a massive oil shortage. Game makers provided rough parameters for the in-game reality (the price of oil, fuel availability) as well as character content (blogs, videos), but the game was aggressively user-driven. The gamers’ task was to imagine the consequences of a massive oil crisis, communicate about their experience, and explore creative strategies for dealing with the attendant difficulties.

(more…)

The Lost Experience

Research Report by Renee Hudson
(created 06/03/08; version 1.0)

Related Categories: Alternative Interfaces | Collective Reading

Original Object for Study description

Summary:
Between seasons two and three of the television show Lost, ABC launched “The Lost Experience,” an Alternate Reality Game (ARG) designed to maintain viewer interest in the show. “The Lost Experience,” like many ARGs, incorporated a variety of media into its implementation. Players were encouraged to watch commercials that aired during the last episodes of season two in order to be notified of relevant websites that would provide clues to the game. In addition to websites, users watched mini-movies and read advertisements and a tie-in novel. They were also directed towards recordings and podcasts over the course of “The Lost Experience.” While the game itself is a multimedia experience, this report will focus on the textual elements that played a crucial role in the game. (more…)

CommentPress

Summary:

CommentPress was developed by the Institute for the Future of the Book as part of its ongoing experiments with “networked books.” First instituted in 2006 as part of McKenzie Wark’s GAM3R 7H30RY 1.1 publication, the software was developed to work with WordPress and intended to reconfigure the nature of blog discussions. CommentPress allows respondents to post comments in the margin of the text, on a paragraph-by-paragraph or “whole page” basis. This breaks down the top-down hierarchy typical of blogs, whereby a main post is positioned vertically above any commentary. Instead, a reader may view the text and commentary at the same time.
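The paragraph-level commenting model can be sketched as a mapping from paragraph index to a thread of comments. This is a minimal illustration of the idea, not CommentPress’s actual WordPress schema; all names are hypothetical.

```python
# Sketch of paragraph-anchored commenting in the spirit of CommentPress.
# Unlike a standard blog (one thread below the whole post), each paragraph
# carries its own thread, so text and commentary sit side by side.

paragraphs = [
    "First paragraph of the networked book...",
    "Second paragraph...",
]

comments = {}  # paragraph index -> list of (author, text)

def comment_on(index, author, text):
    """Attach a comment to a specific paragraph's margin thread."""
    comments.setdefault(index, []).append((author, text))

comment_on(0, "reader1", "This opening echoes the print edition.")
comment_on(0, "reader2", "Agreed; compare the next section.")
comment_on(1, "reader1", "A 'whole page' comment would attach to the post itself.")

# Render text and margin commentary together.
for i, para in enumerate(paragraphs):
    print(para)
    for author, text in comments.get(i, []):
        print(f"  [{author}] {text}")
```

Keying comments by paragraph rather than by post is the whole design move: the main text no longer sits “above” its discussion.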

Version 1.0 of CommentPress was released to the general public in July 2007 and the software has been used to generate discussion around Master’s Theses, scholarly articles, and books. (more…)

El Muro

Research Report by Kate Marshall
(created 6/01/07; version 1.0)

Related Categories: Alternative Interfaces | Art Installations

Original Object for Study description

Summary:
A self-inscribing wall, El Muro is an installation piece designed by Willy Sengewald and Richard The in 2004 as part of the “Sensitive Skin” project at the Berlin University of the Arts Digital Media course. The large monolith of El Muro stands alone in a darkened room, and produces graffiti on its surface that appears to write itself. The project calls for the dividing lines of urban and architectural space to be read — and re-read — as built expressions of political reality and as communications media. The walls to which El Muro refers include not only the borders of Berlin, Israel-Palestine, and US-Mexico, but also the anonymous urban walls that become advertising and graffiti canvases, as well as the wall as an abstract architectural element. In each case, the support structures of the built world become self-reflexive reading interfaces. (more…)

Moving Canvas

Research Report by Kate Marshall
(created 06/01/07; version 1.0)

Related Categories: Alternative Interfaces | Art Installations

Original Object for Study description

Summary:
Moving Canvas, a combination of technologies and installations designed by Frédéric Eyl, Gunnar Green, and Richard The, involves the placement of an LED projection device (“Parasite”) on the side of a commuter subway train. The device is enclosed in a suitcase equipped with suction cups, and its projection is aligned with the subway walls. The ensuing display projects words and images on the tunnel walls, viewable through the train windows. Cinematic time and commuter time combine radically as bodies and messages literally communicate through the subway tunnels.

The project was developed in 2005 as part of a digital media class at the Berlin University of the Arts, and exhibited with the university’s “Here/There” project. Moving Canvas takes another look at the problem of here and there by asking what lies between. (more…)

Interface Ecology

Summary:
Interface ecology is a theoretical framework for the study of relationships between interfaces; its objects range from social to computer interfaces. The practice of interface ecology is characterized by three intertwined objectives: the analysis of interfaces as cultural artifacts from an ecosystems approach, the production of systems and interfaces that elevate the role of human expression, and the translation between disparate cultures and disciplines. This approach was first theorized by Andruid Kerne through his own interdisciplinary work in performance art and computer science at New York University (1997-2001). He has published on interface ecology primarily within computer science and digital art forums from this period to the present. Five years ago, Kerne established the Interface Ecology Lab at Texas A&M University. (more…)

Ben Fry, Valence (2001)

Research Report by Brooke Belisle
(created 05/21/07; version 1.0)
[Status: Draft]

Related Categories:

Original Object for Study description

Summary:
Valence is a software program written by Ben Fry to dynamically render complex information as a visual, three-dimensional, and relational representation. It has been produced and installed in multiple versions that take various inputs. The original version, which ‘reads’ novels, was installed in 2001 at Ars Electronica in Austria and appeared in the film Minority Report. The latest version, which visualizes genetic information, was installed in 2002 at the Whitney Biennial in New York and appeared in the film Hulk. Images and QuickTime movies of various instantiations of Valence can be viewed at Fry’s website. (more…)

Remembrance of Media Past (Ayhan Aytes)

Summary:
Remembrance of Media Past engages with cultural archetypes as motivations for designing interfaces in contemporary media. I chose to take illuminated manuscripts as a central focus of my research because they were perhaps the most significant medium of complex information structures before the introduction of mechanical reproduction beginning with the Gutenberg era. In its final articulation, the project components attempt to link these antecedent cultural interfaces to more current approaches to complex information structures. (more…)

Peter Cho, “Typotopo”

Research Report by Kate Marshall
(created 4/27/07; version 1.0)
[Status: Draft]

Related Categories: Text Visualization | Text and Multimedia

Original Object for Study description

Summary:
Peter Cho’s body of typographic experiments, collected on Typotopo, visually explores the constituent parts of language and narrative. Cho’s work presents a range of graphic design innovations that use digital technology to access forms of letters or forms of texts. The text visualizations showcased on Typotopo ask not only how technology influences typography, but also what happens to the act of reading when letters, words, and narratives are experienced in interactive, dynamic environments.

(more…)

WordsEye: An Automatic Text-to-Scene Conversion System

Research Report by Nicole Starosielski
(created 3/13/07; version 1.0)
[Status: Draft]

Related Categories: Text and Multimedia | Alternative Interfaces | Text visualization

Summary:

WordsEye is a text-to-scene conversion tool that allows users to construct a computer modeled scene through the use of simple text. Users describe an environment, objects, actions and images, and WordsEye parses and conducts a syntactic and semantic analysis of these written statements. The program assigns depictors for each semantic element and its characteristics and then assembles a three-dimensional scene that approximates the user’s written description. This scene can then be modified and rendered as a static two-dimensional image. (more…)
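As a rough illustration of the text-to-scene idea, a converter might map recognized nouns to depictors and spatial prepositions to placement constraints. The toy sketch below is not WordsEye’s actual parser (which performs full syntactic and semantic analysis); the vocabulary, file names, and relation labels are all invented for illustration.

```python
# Toy text-to-scene sketch: nouns become depictors, prepositions become
# spatial constraints. WordsEye itself does far richer analysis;
# everything here is illustrative.

DEPICTORS = {"cat": "cat_model.obj", "table": "table_model.obj"}
RELATIONS = {"on": "above-touching", "under": "below"}

def text_to_scene(sentence):
    """Build a minimal scene description from a simple sentence."""
    words = sentence.lower().strip(".").split()
    objects = [w for w in words if w in DEPICTORS]
    relation = next((RELATIONS[w] for w in words if w in RELATIONS), None)
    scene = {"objects": [DEPICTORS[w] for w in objects]}
    # With exactly two objects and a recognized preposition, record a
    # placement constraint between them.
    if relation and len(objects) == 2:
        scene["constraint"] = (objects[0], relation, objects[1])
    return scene

print(text_to_scene("The cat is on the table."))
```

A real system would then hand such a structure to a renderer that positions the 3-D models and produces the static two-dimensional image the report describes.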

Noah Wardrip-Fruin’s News Reader (2003) (with David Durand, Brion Moss, and Elaine Froehlich)

Research Report by Brooke Belisle
(created 2/21/07; version 1.0)
[Status: Draft]

Related Categories: New Approaches to Reading Print Texts | New Reading Interfaces

Original Object for Study description

“It is difficult to get the news…”

Summary:
In 2003, New Radio and Performing Arts commissioned two artworks by Noah Wardrip-Fruin for their website, Turbulence.org. [1] Wardrip-Fruin produced Regime Change and News Reader, both of which he titled “Textual Instruments.” News Reader offers an interface for reading current news stories, and for what Wardrip-Fruin calls “playing” these stories or “playing” the online news environment. [2] As the user interacts with the news stories by clicking highlighted text, the stories multiply and warp in unpredictable ways. (more…)

LibraryThing

Research Report by Kimberly Knight
(created 2/19/07; version 1.0)

Related Categories: Text Visualization | Social Networking Systems | Online Knowledge Bases

Original Object for Study description

Summary:
LibraryThing is an online knowledge base and social networking tool for bibliophiles. The website allows users to catalog their personal libraries. By entering in their own books, users can locate others with similar libraries, find suggestions for books they might like, or even get “unsuggestions” for the books that are least like their own. Users can organize their collections according to self-defined tags and also view how others have tagged the same books. (more…)

Brian Kim Stefans, “The Dreamlife of Letters” (2000)

Research Report by Kim Knight
(created 2/18/07; version 1.0)

Related Categories: New Reading Interfaces | Text and Multimedia | Collective Reading

Original Object for Study description

Summary:
“The Dreamlife of Letters” is a flash poem by Brian Kim Stefans. Published in 2000, the piece is based upon an appropriated poem by Rachel Blau DuPlessis and takes the viewer through the mobile and unstable “dreamlife” of letters. The words of DuPlessis’ poem have been grouped together according to their first letter and animated in such a way that the passive viewer can only watch as the text moves around the screen. Influenced by the traditions of concrete poetry and ambient poetics, the piece foregrounds language not only as a medium of meaning, but also as a medium of design. (more…)

CaveWriting and the CAVE Simulator

Research Report by Nicole Starosielski
(created 2/6/07; version 1.0)
[Status: Draft]

Related Categories: Immersive Text Environments | Alternative Interfaces

Original Object for Study description

Summary:

Cave Writing is an interdisciplinary artistic practice developed at Brown University for the CAVE simulator, a virtual reality environment typically used for scientific visualization. Cave Writing began in 2002 when hypertext fiction writer Robert Coover initiated a series of workshops in Brown University’s CAVE that brought together faculty, students, artists and scientists in the development of creative projects integrating text, visual imagery, narrative and sound. Several notable projects from the workshop include Screen, developed by Noah Wardrip-Fruin, et al., John Cayley’s Torus and William Gillespie’s Word Museum. The release of CaveWriting 2006, a spatial hypertext authoring system designed by workshop developers, allows authors to directly manipulate text, imagery and 3D models in a graphical front-end environment. CaveWriting now expands beyond the physical limits of the CAVE simulator, making it relatively easy for anyone with a compatible personal computer to experiment and explore writing and reading in three dimensional environments. (more…)

Giselle Beiguelman, “esc for escape” (2004)

Research Report by Lisa Swanstrom
(created 12/15/06)
[Status: Draft]

Related Categories: Text Encoding | Text and Multimedia | Art Installations

Original Object for Study description

Summary:
“Tell us: What was the most scary, funniest, unforgettable error message of your life?” So asks Giselle Beiguelman’s “esc for escape” (2004), a multifaceted art project that solicits and archives error messages from computer users around the globe and re-expresses them in a variety of contexts and media. The project includes a public exhibition of error messages on electronic billboards in São Paulo, Brazil; a repository of selected error messages published on the web, entitled “The Book of Errors”; “The Monastery,” an archive of all error messages related to the project; and a DVD of the project. It also includes a project blog, as well as several “trailers,” which offer ironic visualizations of various error messages by the artist. In addition to providing a playful space for people to express their most “unforgettable” error messages, the project offers a subtle yet sustained and sophisticated commentary about the relationship between computer code and natural language in the digital age. (more…)

Semantic Web

Research Report by Angus Forbes
(created 10/6/06; version 1.0)
[Status: Draft]

Related Categories: Software / Coding Innovations, Search and Data Mining Innovations

Original Object for Study description

Summary:
The World Wide Web Consortium (W3C) uses the term “the semantic web” as an umbrella identifier for a number of initiatives that enable developers and archivists to add rich, meaningful metadata to digital resources. According to the W3C, the major reason for these initiatives to tag information with explicit meaning is to make “it easier for machines to automatically process and integrate information” [1]. The semantic web adds depth to the existing web protocols running over the application layer of the internet without requiring any changes to its more basic architecture. Currently, the main feature that organizes the web is the “link”: any document (or resource) can link to any other. Each link is coupled with a method (or protocol) for presenting the resource to the user or application that followed the link (e.g., by clicking on it). In one sense, then, the web is completely non-hierarchical and unstructured. The only structural meaning of a link between two web pages (or other resources) is that one refers to the other (and possibly vice versa); all other meanings are contextual and must be interpreted by humans. The goal of the semantic web is to provide a richer structure of relationships that formally defines some of the meanings linking resources, and in particular to provide an extensible, uniform structure that can be easily interpreted by search engines and other software tools. The W3C describes a number of potential practical applications that could make use of semantic web technologies, including enhanced search engines for multimedia collections, automated categorization, intelligent agents, web service discovery, and content mapping between disparate electronic resources [2]. (more…)
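The difference between an untyped hyperlink and a typed semantic relationship can be sketched with a minimal triple store in plain Python. This is an illustration of the idea only, not an actual W3C tool; the identifiers and predicates are hypothetical:

```python
# A plain hyperlink records only "A refers to B"; an RDF-style triple
# records *how* A relates to B as (subject, predicate, object).

plain_links = {
    "page:alice": ["page:acme"],          # meaning left to the human reader
}

triples = [
    ("page:alice", "worksFor",  "page:acme"),
    ("page:alice", "authorOf",  "doc:report1"),
    ("page:acme",  "locatedIn", "place:london"),
]

def query(predicate, store):
    """Return all (subject, object) pairs joined by the given relationship."""
    return [(s, o) for s, p, o in store if p == predicate]

# A machine can now answer "who works for whom?" without human interpretation:
print(query("worksFor", triples))   # [('page:alice', 'page:acme')]
```

The real semantic web expresses such triples in RDF, with URIs for subjects and predicates and OWL for defining the vocabularies themselves, but the structural gain over the bare link is exactly this: the relationship is named and machine-queryable.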

Collex

Summary:
Collex is a tool developed at the University of Virginia’s Applied Research in Patacriticism lab (ARP) and currently operated in conjunction with NINES (Networked Interface for Nineteenth-century Electronic Scholarship). Described as an “interpretive hub” (Nowviskie), Collex acts as an interface to nine different peer-reviewed scholarly databases. The interface allows users to search all nine databases at once, while results retain the unique characteristics of each individual source. Additionally, users can create exhibits for their own personal use, or they may submit exhibits to be shared with all users. As such, Collex and its relationship to its data evolve as users interact with it, relying on folksonomy and user-generated relationships to construct new ways of viewing the information it contains. (more…)
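Folksonomy-driven retrieval of the kind described above can be sketched as tag intersection over user-contributed labels. This is a generic illustration, not Collex’s actual implementation, and the object names and tags are invented:

```python
# Objects accumulate free-form user tags; a query returns every object
# carrying all of the requested tags, so categories emerge from use
# rather than from a fixed taxonomy.

tags = {
    "rossetti-letter-12":  {"rossetti", "manuscript", "1860s"},
    "swinburne-poem-3":    {"swinburne", "poem"},
    "rossetti-painting-7": {"rossetti", "painting", "1860s"},
}

def search(*wanted):
    """Objects whose tag set contains every requested tag, in sorted order."""
    return sorted(obj for obj, t in tags.items() if set(wanted) <= t)

print(search("rossetti", "1860s"))
# ['rossetti-letter-12', 'rossetti-painting-7']
```

Because any user can add a tag to any object, the searchable structure grows and shifts with use, which is the sense in which Collex’s relationship to its data “evolves.”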

FaceBook.com

Research Report by Katrina Kimport
(created 3/31/06; version 1.0)

Related Categories: Online reading and society; Social Networking

Original Object for Study description

Summary:
First launched in February 2004, Facebook.com (initially known as Thefacebook.com) is an online networking website that allows users to create their own profiles and to link to and view the profiles of others. Facebook is unique in that its online communities are based on offline university communities, and membership is restricted to users with a .edu email address.

At the time of this report, Facebook is the second-fastest-growing website and is particularly popular with young adults currently enrolled in or recently graduated from college. Because Facebook users are organized by college affiliation, users have a clear offline presence. The site thus offers the opportunity to investigate the relationship between offline communities and their online counterparts. (more…)

InfoDesign: Understanding by Design

Research Report by Mike Godwin
(created 8/30/06; version 1.0)
[Status: Draft]

Related Categories: Related Blogs

Original Object for Study description

Summary:
InfoDesign: Understanding by Design is a blog devoted to the relatively new discipline of Information Design. The blog, maintained by a small team of people, is widely regarded as one of the most comprehensive views of the growing field. New posts appear approximately weekly, compiling links to pertinent articles, people, companies, organizations, degree programs, publications, events, and job postings. Like many blogs, InfoDesign offers little native content to review, as it consists primarily of links to articles and websites. For the Transliteracies project, InfoDesign is relevant in two capacities: as a guide to the field of Information Design, a young academic and professional field devoted largely to improving online reading, and as an index to the most important topics within the field. Every link to a news article is categorized, and each of these categories (there are currently 35) is browsable. These 35 categories read as a list of what is important in information design right now, and each will be reviewed for relevance to the Transliteracies project. (more…)

The Coh-Metrix Project

Research Report by Kim Knight
(created 8/22/06; version 1.1 updated 9/15/06)

Related Categories: Cognitive Approaches to Reading

Original Object for Study description

Summary:
The Coh-Metrix Project is a research project concerned with predicting the readability of texts in order to facilitate textual comprehension. The underlying assumption of the project is that current “readability” tests, based upon word and sentence length, are inadequate for truly predicting textual coherence. Coherence in this context is defined as a mental representation that results from an interaction between the reader’s skills and goals and textual cohesion. The Coh-Metrix project proposes two tools that will provide a more nuanced prediction of textual cohesion than current indices allow: Coh-Metrix, which computes the cohesion of a text based on complex cohesion metrics, and CohGIT, which locates gaps in textual cohesion, facilitating textual improvement. The project relies upon an interdisciplinary approach to reading practices, drawing upon “psychology, linguistics, education, literary theory, cognitive science, mathematics, and artificial intelligence” (McNamara, Louwerse, & Graesser 5). (more…)
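The shallow indices the project criticizes reduce to word- and sentence-length arithmetic. The standard Flesch-Kincaid grade-level formula, sketched below in plain Python, is one such measure; Coh-Metrix itself computes far richer metrics, so this is only an illustration of the kind of surface-level index being criticized:

```python
def flesch_kincaid_grade(words, sentences, syllables):
    """Classic surface-level readability index.

    It depends only on average sentence length (words/sentences) and
    average word length in syllables (syllables/words) -- nothing here
    measures whether the text is actually coherent.
    """
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# A 100-word sample with 5 sentences and 140 syllables:
print(round(flesch_kincaid_grade(words=100, sentences=5, syllables=140), 2))
```

Two texts with identical counts score identically no matter how incoherent one of them is, which is precisely the limitation Coh-Metrix is designed to address.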

Andrew Elfenbein, “Cognitive Science and the History of Reading” (2006)

Research Report by Kim Knight
(created 8/18/06; version 1.0)

Related Categories: Cognitive Approaches to Reading | Past Reading Practices

Original Object for Study description

Elfenbein, Andrew. “Cognitive Science and the History of Reading.” PMLA 121.2 (2006): 484–500.

Summary:
Elfenbein uses the strategies and terms of cognitive approaches to the study of reading to analyze the varied critical response to Robert Browning’s Men and Women, published in 1855. He argues for a critical practice that joins the complexity of literary criticism with the scientific attention to microprocesses of reading. His aim is to reveal that microprocesses, although always individually inflected, are locatable in various cultures and time periods. (more…)

MediaWiki

Research Report by Mike Godwin
(created 8/13/06; version 1.0)
[Status: Draft]

Related Categories: Blog and Content Management Systems (CMS) | Online Knowledge Bases

Original Object for Study description

Summary:
The wiki is an increasingly popular content management system for organizing widely distributed collaborations over the internet. This report describes the relevant history and evolution of the wiki and then considers the technology, interface, and design of MediaWiki as an example of what a wiki is today. While there are dozens of implementations of the wiki format, MediaWiki is unique as the engine responsible for the operation of Wikipedia, currently the largest wiki, and as the software supported by the non-profit Wikimedia Foundation Inc. (more…)
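The wiki format’s core convention is that `[[Page Title]]` in running text becomes a link to that page, with links to not-yet-existing pages styled differently as an invitation to create them. The convention can be sketched in a few lines of Python; this illustrates the markup idea only, and is not MediaWiki’s actual parser (MediaWiki is written in PHP and handles far more syntax):

```python
import re

def render_wikilinks(text, existing_pages):
    """Turn [[Page]] markup into HTML links.

    Links to pages not in `existing_pages` get a 'new' class, mimicking
    the distinctly styled links wikis use to invite page creation.
    """
    def repl(match):
        title = match.group(1)
        href = title.replace(" ", "_")
        cls = "" if title in existing_pages else ' class="new"'
        return f'<a href="/wiki/{href}"{cls}>{title}</a>'

    return re.sub(r"\[\[([^\]|]+)\]\]", repl, text)

print(render_wikilinks("See [[Main Page]] and [[Draft Essay]].", {"Main Page"}))
```

The low barrier this creates, where linking and page creation are the same gesture available to any reader, is a large part of why the format scales to collaborations the size of Wikipedia.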

Computing with Words (Lotfi Zadeh’s Fuzzy Logic and Natural Language/Perception Processing)

Research Report by Angus Forbes
(created 8/11/06; version 1.0)
[Status: Draft]

Related Categories: Software / Coding Innovations

Original Object for Study description

Summary:
Fuzzy logic is a system of logic that applies meaning to imprecise concepts. Rather than labeling a statement as either “true” or “false,” as traditional binary logic does, fuzzy logic maps the statement along a continuum of values. These mappings are interconnected with other mapped statements, ultimately yielding applicable functions and rules despite the imprecision of the concepts on which the rules were based.

Fuzzy logic was developed initially by the engineer Lotfi Zadeh in the mid-1960s as a method of creating control systems whose inputs consist of imprecise data. More recently, Zadeh has conceived of a merger of natural language processing and fuzzy logic called Computing with Words, along with an associated Computational Theory of Perception, as a preliminary way of thinking about how to compute and reason with perceptual information. (more…)
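The core mechanism can be sketched in a few lines of Python. The set shapes and threshold values below are illustrative choices, not Zadeh’s, but the machinery is standard: a membership function grades how well a value fits an imprecise concept, and fuzzy AND is conventionally taken as the minimum of the memberships:

```python
def triangular(x, a, b, c):
    """Membership degree of x in a triangular fuzzy set rising from a,
    peaking at b, and falling back to zero at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def warm(temp_c):   # "warm" peaks at 25 degrees C (illustrative values)
    return triangular(temp_c, 15, 25, 35)

def humid(rh):      # "humid" peaks at 90% relative humidity (illustrative)
    return triangular(rh, 40, 90, 140)

# A crisp rule is all-or-nothing; a fuzzy rule fires to a degree.
# "IF warm AND humid THEN discomfort", with fuzzy AND = min:
discomfort = min(warm(22), humid(65))
print(warm(22), humid(65), discomfort)   # 0.7 0.5 0.5
```

A classical controller would have to declare 22 degrees either warm or not; the fuzzy version lets it be warm to degree 0.7, so rules degrade gracefully rather than switching abruptly at an arbitrary threshold.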

Haptic Visuality (Laura U. Marks’s Touch: Sensuous Theory and Multisensory Media)

Research Report by Angus Forbes
(created 8/11/06; version 1.0)
[Status: Draft]

Related Categories: Cognitive Approaches to Reading

Original Object for Study description

Summary:
In the last decade, the critical discourse of new media studies has shifted its focus from the virtual to the physical, from an abstract, decontextualized space to the embodied experience of augmented reality. Digital media have come to pervade everyday life, and new media criticism has increasingly encouraged culturally specific, materialist, and multisensory approaches. Laura Marks’s formulation of haptic visuality offers one such approach. As a way of seeing and knowing that calls upon multiple senses, haptic visuality offers a method of sensory analysis that does not depend on the presence of literal touch, smell, taste, or hearing. While many sensory analyses focus on the evocation of and interaction between these literal senses (for example, the study of tactile interfaces, kinesthetics, and textures), Marks’s concept of haptic visuality provides an alternative framework for discussing online new media works (too often understood as “simply” visual) in relation to multiple senses, affect, and embodiment. (more…)
