SDH/SEMI 2012

These are notes on SDH/SEMI 2012 (Programme) at Congress in Waterloo, Ontario. Note that these were written live and will therefore have typos and misrepresentations.

Ian Lancashire: The Effect of Alzheimer's Disease on Ross Macdonald and Enid Blyton

Lancashire has studied whether language change can provide an early warning for memory loss. Language impairment can have many causes, but memory loss is one of the most common. Lancashire looked at Ross Macdonald (Kenneth Millar) and Enid Blyton. He had previously published a book on this subject, Forgetful Muses. Lancashire digitized 16 novels and used MTAS 2.0 (Lidio Presutti). MTAS calculates type/token ratios and lexical repetition (word or phrase repetition). Lancashire took the same number of words from each novel (50,000 for Millar) to control for differences in novel length. Millar's vocabulary drops over time, which does not normally happen as you age. Lancashire showed a graph where repeating phrases go up, repeated words increase, and total types drop: Millar uses fewer distinct words and repeats the words he does use more often. Looking at both Blyton and Millar you see dramatic drop-offs which may have to do with issues other than memory loss: Millar had a family tragedy that corresponds with a drop, and Blyton had heart trouble.

James O'Sullivan: A New Visual Aesthetic: Materiality and Form in Electronic Literature

O'Sullivan is applying material modernist theory to poetry. Our perception of a text is shaped by theories of text and by the materiality of texts - their design and so on. Technological trends also change our perception of texts: texting, Twitter, blogging, and other text technologies can change what we think a text is and how we think it should work. O'Sullivan talked about the controversy over a poem by Yeats about the Dublin lock-out of 1913-14. A whole range of bibliographic components can influence reception.
Where a text appears, its design, the form of the text, historical context - all these can influence how something is interpreted, as the appearance of Yeats's poem in the Irish Times shows. He talked about "bibliographic code." O'Sullivan then turned to electronic literature. Electronic media have significantly enhanced the variety of forms e-lit can take. The linguistic code can be remediated into many rich electronic forms. He then talked about various electronic poems like Stephanie Strickland's V: Vniverse, Robert Kendall's Dispossession, and Kostelanetz's One Letter Changes. Materiality has been redefined as no longer the physical design but the media that frames a text.

Andrew Piper and Mark Algee-Hewitt: The Werther Effect

Piper is trying to understand the relationship between the novel and 18th-century writing. In the 18th century the epistolary novel becomes an image for connected text. Piper is trying to see how the language of Werther actually reappears in, or influences, other writers. How does Werther go on to influence language? Piper calls this a topological reading. Topology is about lexical recurrence; Freud's notion of the uncanny comes from Goethe and has to do with recurrence. Piper showed a neat topology graph that was like a mapped dendrogram. He did this by identifying the 91 most common words and getting their relative frequency over different works. He then creates a distance table using R. From that they get a force-directed graph to show distance, and then a Voronoi diagram that turns points into places and draws the size of each tile based on the length of the edges. What he found was that the language of Werther correlates with other works of Goethe about artists' lives (including a translation of Diderot's Rameau's Nephew). In all the thousands of articles on Werther none draw this connection. Then they looked at the cluster of works about artists and did the same process for "pages" in the works.
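Piper's pipeline (common-word frequencies, then a distance table, then a force-directed graph, then a Voronoi diagram) can be sketched in miniature. The notes say the distance table was built in R; what follows is only a hedged Python sketch of that one step, with an invented two-work toy corpus and a three-word vocabulary standing in for the 91 most common words:

```python
import math

def rel_freqs(text, vocab):
    """Relative frequency of each vocabulary word in a text."""
    words = text.lower().split()
    total = len(words)
    return [words.count(w) / total for w in vocab]

def distance_table(vectors):
    """Pairwise Euclidean distances between frequency vectors."""
    n = len(vectors)
    return [[math.sqrt(sum((a - b) ** 2 for a, b in zip(vectors[i], vectors[j])))
             for j in range(n)]
            for i in range(n)]

# Toy texts standing in for the digitized works (not the real corpus).
vocab = ["the", "heart", "art"]   # stand-in for the 91 most common words
works = {
    "A": "the heart the heart art the",
    "B": "the art art the the the",
}
vecs = [rel_freqs(t, vocab) for t in works.values()]
dist = distance_table(vecs)
print(dist[0][1])  # lexical distance between works A and B
```

From such a distance table one would then compute a force-directed layout and a Voronoi tessellation over the resulting points (for example with scipy.spatial.Voronoi); those later steps are omitted here.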
He argued that the work may not be the best chunk to analyze. Next he looked at topic modelling and talked about how words might have a structural effect but not a "meaning" effect. I didn't get this, but it sounds intriguing. He closed by talking about social text theory, diagrammatical reasoning, and the contingency of the visualizations. As Ian Lancashire pointed out at the end, this is fascinating work. The Voronoi diagrams alone were worth coming to the conference for.

Panel: TAPoR 2.0, Redesigning Around

I was part of a panel of speakers on TAPoR 2.0. We announced the release of the new version of TAPoR (see http://entry.tapor.ca).
Presentations on Recently Funded and Completed DH Projects

The last event was a session of short presentations and posters. Presenters spoke for 2 minutes about their posters and then stood by them so you could talk to them. I loved this combination of short presentation and poster. All the posters were interesting. Some that I remember:
Tuesday, May 29th

Iterative Design, Testing and Assessment Practices for Serious Games

I was part of a panel by the GRAND (Networks of Centres of Excellence) group at the University of Alberta that talked about our game experiments. Shane Riczu presented on the Return of the Magic locative game in Strathcona, Edmonton. Matt Bouchard talked about GWrit, a writing game. Michael Burden talked about an FPS (First Person Shooter) game designed for health education using Unity. He is now reimplementing it in Ren'Py, and he talked about migrating platforms and about how you help players figure out what to do. David Holmes talked about CobbleCards, a card game for ideation and game design. CobbleCards was designed by Patrick von Hauf; David developed a web site where you can create your own cards. Then David talked about fAR-Play, a locative game platform developed with folks in Computing Science. We have been using this platform to build different games. Shannon Lucky talked about different ways of assessing games, assessing the design process, and even assessing assessment. It is hard to find light and easy-to-run assessment probes for games. Joyce Yu talked about working with partners.

One thing that I have been trying to figure out is what seems to be an Alternate Reality Game taking place during and about Congress. The game goes under the Twitter tag #bonfireofthehumanities and involves the Torch Institute, which presents itself as a taxpayers' association annoyed by Congress. It is hard to figure out if it really is a game, or to find the time to play it.

Jon Saklofske: Gaming the Edition: Applying a Digital Game Framework to Digital Editions

Jon talked about how game players are editors. They don't change the predesigned systems, but they can change the "edition" of a game. This is different from reading as it has a co-creation component. He then talked about how we may use gamification in editorial systems.
Jon quoted Bogost to the effect that "gamification is bullshit" and others like Jenkins on games and learning. Learning through games is seductive but not necessarily sinister. INKE's scholarly modeling and prototyping team is trying to leverage game design knowledge for editing. They understand a model as "a representation of a system for investigating properties of system." Some of the gaming ideas they can use include:
Jon talked about avoiding some aspects of gaming (competition, leaderboards, and prizes) given the community culture of editors. Interestingly, games for academics could actually have punishments and real consequences. He described an idea for a game, not yet implemented or prototyped, that would allow people to play and practice at editing. It reminds me of the Ivanhoe game. He concluded that "game paradigms could lend integrity to social edition processes."

Andrew Keenan: Information Seeking Behaviour in First Person Shooter Video Games

Andrew is applying the information science idea of information seeking to the rich, complex spaces of multiplayer first-person shooters. Most info-seeking research looks at [...]. Andrew surveyed Mazewar, Battlezone, SpaceSim, Castle Zone 3D, Wolfenstein, Doom, Half-Life, Halo, Call of Duty: Modern Warfare 3, and Battlefield 3. He mentioned that a lot of games were made in Freescape. He talked about the HUD, the information that you see about yourself - the health bar, the inventory, and so on. He showed how the HUD (the information about yourself and your environment) has evolved in interesting ways; the new ones like Battlefield have all sorts of meta-information. Andrew then presented an argument about the change in on-screen information: the trend is to pack more information onscreen. Information behaviour provides a framework for understanding seeking processes. Cognitive gaps occur when someone realizes they are missing information and tries bridging practices to cross the gaps. Information seeking is a process of mastery. But how can people feel mastery in games where they are failing regularly? He draws on Juul to argue that failing in a game is a key learning component that presents you with information. Andrew proposed using Valve's heat maps, which show where players typically die/fail in games.
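The heat-map idea Keenan borrows from Valve is easy to sketch: accumulate failure locations into a grid and shade cells by count. A minimal, hypothetical Python example (the coordinates and grid size are invented, not from any real game telemetry):

```python
# Hypothetical death coordinates (x, y) on a small 4x4 map grid.
deaths = [(1, 2), (1, 2), (3, 0), (1, 2), (0, 0)]

WIDTH, HEIGHT = 4, 4
heat = [[0] * WIDTH for _ in range(HEIGHT)]
for x, y in deaths:
    heat[y][x] += 1  # count deaths per cell

# Crude text rendering: darker characters where more players die.
shades = " .:#"
for row in heat:
    print("".join(shades[min(count, 3)] for count in row))
```

A real pipeline would log thousands of positions and render them as a colour overlay on the level map, but the aggregation step is just this counting.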
See http://blog.counter-strike.net/science/maps.html There was discussion about why and whether we would want more information. Susan Brown asked an interesting question about types of failure: dying in games is a failure with few consequences that people get used to, while failure in the academy can be humiliating.

Amy Ratelle: Crossed Wires: Retrofitting the Brain for Information Architecture

Ratelle talked about the Children's Literature Archive. She talked about how programming enacts models of mental space, and about the relationship between techne and writing. She quoted Derrida to the effect that thinking and technology (like writing) are always connected - that we can't pretend that technology is other to thinking. She also argued that the CMS (Content Management System) and the database are essential to the digital humanities: the database is how many digital humanities projects share thinking.

Engelbert Gayagoy: Processing Interactivity and Synapse

Gayagoy started by talking about visual literacy and how visual literacy seems to be dependent on textual literacy. He argued that programming is the new visual literacy in design, and talked about programming languages like Processing. He called for museum studies to have more [...]. He reviewed the Adobe Museum of Digital Design and the Victoria and Albert Decode exhibit.

William Robinson: Creative Labour and the Performance of Play: Challenging Marxist Critique with Aesthetic Analytics

Robinson looked at definitions of art like "art is what museums show" and "art has a cluster of features." He wanted to see if a game or sport might fit one of these definitions. He talked about a cool move (an act of virtuosity) in Defense of the Ancients, comparing acts of virtuosity to art.

Jon Saklofske: Play-Editing: Recognizing the Player as Editor in Digital Environments

Jon presented a slightly different take on the idea of the player as editor (see above). This time the paper was aimed at a game studies audience.
Milena Radzikowska: Prospect on Constraints

Milena presented on her thesis work. She showed some sketches, including a cool "cog" design, for a larger research project to develop a framework for industrial plant operations. She is part of a larger team building an operations system. Because their industry partner was giving them proprietary information, they constructed an ice cream factory scenario to mimic the real industry problem. Their design goals were, first, to design an interface for operators who don't have engineering math, and second, to use "direct manipulation." She talked about Decision Support Systems, about which there is a literature, and was critical of the work in Human Machine Interfaces. She talked about Stan Ruecker's Rich Prospect Browsing theory: the idea is to show a meaningful representation of all items so that the prospect (the view of the information) lets you see the whole in different ways. She proposed four new Rich Prospect Browsing principles in addition to the seven Stan Ruecker has articulated:
Mihaela Ilovan: Exploring Humanist Citation Practice through Visualization

Mihaela presented a paper that a bunch of us worked on, a neat project to visualize citations in monographs. She talked about citation indexes. There are two opposing theories about what citations are about:
She then talked about the politics of technology. Most visualizations start from a normative view of citations to help people follow influence. Mihaela then asked what a constructivist visualization might look like. She compared science and humanities research/publishing and how that would affect citation. Science is evolutionary while the humanities are cumulative. Humanists still like footnotes and citations that are contextualized, and we have to take into account how citations are used in footnotes in the humanities: a footnote can be used to develop a dialogue on the side or for other purposes. Mihaela then talked about the development of a prototype called CiteLens. She talked about the personas developed for the design and the features desired. The key is that this prototype is for exploring argument in a monograph through citations, NOT for what citation visualizations usually do, which is show relationships between works that cite each other. She then showed CiteLens screen designs and talked about future iterations.

Wednesday, May 30, 2012

I had to miss the morning sessions to get some grants reviewed.

The Interface to Interface Research

A group of us presented a panel on the interface to interface research.
Ron Tetrault: Living in the Open: The Fate of Privacy in Digital Culture

Ron Tetrault was the recipient of the 2012 SDH-SEMI Outstanding Achievement Award! His award talk went back to his early scholarly electronic edition of the Lyrical Ballads of Samuel Taylor Coleridge and William Wordsworth. The project was imagined from the beginning to be free and open. He then switched to another sense of openness, quoting McLuhan: "any technology gradually creates a totally new human environment." The openness of having our information monetized by social media is now challenging our privacy. Gating is a way to control openness. Ron talked about "rights to privacy," a late 19th-century idea. This right emerged in response to technological change, as photography and the tabloid press challenged privacy. Any rights we claim to privacy are historical and class-inflected, emerging as the middle classes began to get larger homes. The growth of information technology begat fears of "data liquidity": people worried about tax returns being shared with other people. Canada has been a leader, setting up the office of the Privacy Commissioner in 1977. We have given up privacy for all sorts of conveniences. We have CCTVs all over the place, in public spaces and private spaces. Loyalty cards that we voluntarily accept are used to gather information about our consumption. Nowadays we worry less about Big Brother and more about little brother - peer-to-peer surveillance. Bentham thought the fear induced by the Panopticon would be enough to get people to spy on themselves. We now live in a "sousveillance" world where everyone watches everyone, and the digital has extended the reach of sousveillance. For Ron, while these new technologies bear risks, they also bring significant benefits. The watchers are becoming the watched; the gaze is being reversed. Citizen journalists can now watch and shame those who are in power.
Ron talked about the Arab Spring and how governments have responded by controlling access to social media. What do we do about it? McLuhan foresaw that as the speed of media increases, people begin to want transparency and participation in governance. Speech that used to be fairly private has now become public. We seem to tolerate or welcome what can be called a "participatory panopticon." If personhood is maintaining control over our self-representation, then we are losing our identity with social media. Social media has become about advertising yourself, self-exposure. To compensate we use technology to shut people out. We are renegotiating the private and the public. As educators, our obligation is to learn for ourselves and to teach students to manage their privacy. We need to teach them to think before they post! We had a great discussion around issues of privacy and archiving.

SDH/SEMI Annual General Meeting

We heard reports from the Exec. Our budget is in good shape and students are being supported. The Exec proposed to change our name to the Canadian Society for Digital Humanities / Société canadienne des humanités numériques. We had a fun discussion about the name and voted it in. We also have a new award, the CSDH/SCHN award for early career researchers; Susan Brown walked us through this and there will be a call for nominations. Susan then announced the Ian Lancashire award for student promise. This year it went to Lisa Macklem and Spencer Roberts. A representative from SSHRC talked about Connection Grants, which are for knowledge mobilization initiatives. They can be used as a first step in forming a network for other types of grants. They are one-year grants with a value of up to $50K. He then talked about SSHRC Tools, which comes out of discussions about the gaps between CFI and SSHRC. The SSHRC leaders will be discussing this; it is too early to tell what this program will be.
We suggested that the SSHRC leaders may not always represent us as well as they could. Then the Federation representatives (Ray Siemens and Jean-Marc Mangin) came to talk to us. They suggested that the Federation can represent our needs to organizations like SSHRC. The Federation has signed the Berlin Declaration. See http://fedcan.ca/en/issues/open-access - this is great news!
Page last modified on May 30, 2012, at 04:12 PM