
Congress 2015

These are my notes on the Canadian Federation for the Humanities and Social Sciences Congress 2015. I attended two societies: CSDH/SCHN and the Canadian Game Studies Association (CGSA).

Note: these are being written live which means that I don't finish lines and so on. I go back and correct the grammar when I have a chance.

The Twitter hashtag is #csdhach2015

Sunday, May 31st: Text Mining in the Humanities

I caught the end of an event talking about text mining in the humanities organized by Stéfan Sinclair with IBM.

Gias Uddin (IBM): Watson Analytics

Uddin presented Watson Analytics, which is not the big supercomputer Watson. Analytics tries to let you start with questions. You type in a natural language query and it provides suggestions.

He talked about predictive features that let you create a workbook and identify what you want to understand. This then creates a dashboard of all sorts of information. It has a spiral display in the center that shows what patterns predict the field you want to understand.

This idea for a course sounds brilliant.

He showed using Analytics on social data from Twitter. He then showed assembling a dashboard of your own and used it to look at tweets about 50 Shades of Grey.

There is a free version of Watson Analytics available at http://www.ibm.com/analytics/watson-analytics/

John Hawthorn: Bluemix/Watson Developer Cloud

John started by talking about how one doesn't need a lot of money to do analytics. Bluemix is a platform for cognitive computing. As a "Platform as a Service", all you need to do is upload data and write code; there is no need to manage anything else. The pricing is "pay for what you use", and everything can be tried for free in a prototype mode.

The Watson Developer Cloud is a catalogue of analytics capabilities built on Bluemix. There are a lot of different classifiers and language tools, and there is a web site for the Watson Developer Cloud. He showed doing sentiment analysis using the tools and compared Proust and Chekhov.

This is a platform that we can use to build services for our own use.
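
Not John's actual demo, but roughly what that kind of sentiment comparison looks like in Python using NLTK's VADER analyzer instead of the Watson services (the file names are placeholders for texts you already have):

    # A rough sketch of a Proust vs. Chekhov sentiment comparison with NLTK's
    # VADER analyzer (not the Watson services); proust.txt and chekhov.txt are
    # placeholder file names.
    import nltk
    from nltk.sentiment.vader import SentimentIntensityAnalyzer

    nltk.download("vader_lexicon", quiet=True)
    analyzer = SentimentIntensityAnalyzer()

    def mean_sentiment(path):
        """Average compound sentiment over the paragraphs of a text file."""
        with open(path, encoding="utf-8") as f:
            paragraphs = [p for p in f.read().split("\n\n") if p.strip()]
        scores = [analyzer.polarity_scores(p)["compound"] for p in paragraphs]
        return sum(scores) / len(scores)

    for author, path in [("Proust", "proust.txt"), ("Chekhov", "chekhov.txt")]:
        print(author, round(mean_sentiment(path), 3))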

John Miedema (IBM): Watson Explorer

Miedema talked about how customers want solutions that can read. We (DH folk) are experts. He then showed Watson Explorer (WEX). WEX is a content analysis tool that pulls in content from a wide range of sources and puts it in a single platform for analysis.

They have a Content Analytics Studio, which is an Eclipse-based platform that builds on UIMA. It lets people build structure from unstructured data.

Then they have the Content Miner which provides views of the content. This combines natural language annotators and visualization tools.

He demoed analyzing novels in Watson Content Analytics.

Stephen Perelgut (IBM): Watson Academic

Perelgut (perelegut@ibm.com) talked about the academic initiative and how they donate to the academy. They are moving just about everything into the cloud.

If you are teaching a class you can sign up and get student accounts.

They provide matching support for SSHRC researchers.

See http://www-304.ibm.com/ibm/university/academic/pub/page/academic_initiative

They have PhD fellowships and work with MITACS and so on.

Open Discussion

I asked about ethics and privacy. Does IBM design for privacy? Ryerson has appointed Dr. Ann Cavoukian the Executive Director of the Ryerson University Institute for Privacy and Big Data, and IBM has collaborated with them.

Monday, June 1st, 2015: Day 1 of CSDH/SCHN

John Simpson: Visualizing Philosopher and Topic Frequency Data Gathered from Named Entity Recognition Tools

John Simpson talked about work that we are doing to try to trace ideas over time. We first tried sequence alignment and that didn't work that well: we didn't have efficient enough code to work on the size of data we had. There were also a lot of problems around ...

Our next experiment focused on NER (named entity recognition). We used the InPhO (Indiana Philosophy Ontology) ontology of concepts and philosophers; they have a hierarchical ontology of philosophy. Our idea was to train an NER tool using the InPhO ontology. We found the ontology didn't work that well to train the Stanford NER.
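
For anyone wanting to try something similar, here is a minimal sketch of tagging philosopher names with the Stanford NER through NLTK's wrapper; the jar and model paths are placeholders, and a model trained on InPhO terms would be swapped in for the stock English model:

    # Minimal sketch: tag a sentence with the Stanford NER via NLTK.
    # The jar and model paths below are placeholders for local downloads.
    from nltk.tag import StanfordNERTagger

    st = StanfordNERTagger(
        "english.all.3class.distsim.crf.ser.gz",  # CRF model (placeholder path)
        "stanford-ner.jar",                       # Stanford NER jar (placeholder path)
        encoding="utf-8",
    )

    tokens = "Kant responded to Hume on the problem of causation .".split()
    print(st.tag(tokens))
    # e.g. [('Kant', 'PERSON'), ('responded', 'O'), ..., ('Hume', 'PERSON'), ...]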

John then showed a live demo.

Jean-Gabriel Ganascia: Extracting and Visualizing Named Entities using Interactive Streamgraphs – A Case Study on First World War Data

Ganascia is at the Observatoire de la vie littéraire, a laboratory of excellence at the Sorbonne Universities. He showed a system they developed to do NER on newspaper accounts from around the time of WWI. He talked about the problems of working with old newspapers.

Their NER system lets them locate entities right on the article to figure out what has worked or not. They do a number of semi-supervised passes on the data to train and disambiguate things.

He showed ThemeRiver visualizations and streamgraphs that they have adapted to show the evolution of ideas.

Harvey Quamen: What We Talk About When We Talk About Books: Topic Modelling Reader Responses on Social Reading Sites

Quamen and Anouk Lang are looking at social reading sites. They collected reviews of three titles from LibraryThing.

Quamen then showed a neat visualization that shows the reviews in clusters. It looked a bit like Mandala. They can see the book reviews in the context of the topics. When you look at the words of the topics, there always seems to be one topic, at about 95%, that carries sentiment ("good" and the like); a second topic, at about 5%, is plot summary.
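
This is not Quamen and Lang's pipeline, but a toy sketch in Python with gensim of where per-review topic proportions like 95%/5% come from:

    # Toy topic model over a handful of made-up "reviews" with gensim,
    # just to show how per-document topic proportions are produced.
    from gensim import corpora, models

    reviews = [
        "loved this book great characters good story highly recommend",
        "a girl with a dragon tattoo investigates a disappearance in sweden",
        "good read enjoyed it very much would recommend to friends",
        "the journalist and the hacker solve an old family mystery",
    ]
    texts = [r.split() for r in reviews]
    dictionary = corpora.Dictionary(texts)
    corpus = [dictionary.doc2bow(t) for t in texts]

    lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary,
                          passes=10, random_state=42)
    for topic in lda.print_topics():
        print(topic)
    # Topic proportions for a single review (e.g. sentiment vs. plot summary):
    print(lda.get_document_topics(corpus[0]))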

Then he talked about how his system doesn't show evolution over time. The conversations in reviews seem to move toward consensus, but which types of topics head toward consensus? He showed a nice graph of topics over time for the Dragon Tattoo book. Strangely, people are still doing plot summaries.

What they did find is that the social history of the book (major movie releases of the book) influences reviews. When movies come out you get plot reviews and negative reviews, which may have to do with comparisons with the book.

The topics don't seem to cycle in the sense that one topic comes and another goes. Instead the cycling responds to events like the release of a DVD or movie.

Harvey then talked about how reviews are also doing reviewer identity work, with claims about emotion ("I felt angry ..."). They looked for collocation of emotional words with words like "me" and "I". Then they graphed emotional claims and found that they increased over time. Even though reviews aren't driving to a consensus, they do get more emotional. Other claims are contestable, but emotional identity (affect negotiation) claims are not.

He concluded by saying that the book reviews are homogeneous. The 95% to 5% breakdown is very interesting. That the plot summaries don't seem to disappear could have to do with ways that people use summaries to make identity work points.

Discussion

We then had a great discussion about the challenges of topic modelling and the interpretation of journals.

Wendy Hui Kyong Chun: New Media: Wonderfully Creepy

Chun gave a joint ACCUTE/CSDH talk. She started in engineering and viewed it as a field outside of politics, but the Montreal massacre got her thinking about the social and political dimensions of new media. She worked at Gandalf Technologies in Ottawa.

We should think of ourselves as characters in a drama called "big data." Our character is determined not only by our actions, but by our "friends". Reading now is no longer silent: if you read on a Kindle, your highlights and how long you read are tracked.

A transparent system is one that makes us transparent as data. Chun wants us to think about all the signals that are caressing us. We need to embrace the promiscuity of the machine.

The issues are not simply technological. She is not a technological determinist, precisely so that we can challenge and rethink what technology is and does.

What we accept as true of technology depends on a whole host of decisions. Remember the promise of the internet and how it would be empowering: there was supposedly no race, age, or infirmity on the internet. The moral was to get online if you don't want to be discriminated against.

Snowden loved the internet of the 1990s when it was empowering. Chun thinks it strange that the internet was considered new in the 1990s. She thinks it was new as it mixed with cyberpunk fiction.

How is the internet sold as empowering? How is a control solution sold as empowering?

She then showed her network card activity. Our network cards are promiscuous - they gather everything and delete what isn't for you. New media is thus leaking all the time (like friends.) New media is wonderfully creepy because they work by sharing too much. The system works because all sorts of stuff is then screened out. The screening makes things personal.

She talked about new media: new media replaces mass media, replacing the mass with what is you. She talked about the conflation of diversity and participation. She compared the propaganda of the internet to Roland Barthes' idea of the writerly text. When the internet was democracy come true, we had Barthes come true in ideas of hypertext.

We need to consider how reading as writing has been conflated with democracy. New media runs on variety and diversity; they feed the algorithm. Crisis plus habit equals update. We can all speak and the world isn't better.

For Chun a reader is now a character. The reader is now being read by algorithms, so we are now characters reading in public. The windows are now on us. Silence is impossible as our networked devices are always ...

Part 2: Like, Like or the revenge of the valley girl. Algorithms of prediction are constantly comparing us to us.

The analogous is taking over: everything is being compared to everything. Then we look at how people deviate. This is redlining on another level. Race and gender are factored in. Netflix doesn't care what we review; it creates neighbourhoods based on our actions. The supposedly outdated markers (race, gender, sexuality, and class) are now used by big data. She talked about FICO, which is used for insurance calculations. Companies are using apparently unrelated discriminators to avoid having to use the obvious and politically incorrect markers. Your choices in insurance could be used to determine your gender or race.

These systems regard what we do as better than what we say. The systems are unverifiable: it is impossible to tell if they are working properly. There is no proof.

There is a relationship between reality and truth that is different in big data. Surveillance is now co-produced by states and businesses.

She then talked about issues of accountability and responsibility. She believes that attempts at accountability make us more vulnerable. They lead to slut shaming.

What if we understood trust as the ability to take risks? The right to be public in ways you control. What if we treated the internet as a place to loiter? People are being told that if you don't want to be harassed you should get off the internet. Loitering creates safer masses. She mentioned http://adnauseam.io/

Privacy offers no shelter so activists are "coming out" (homonationalism.) She talked about the undocumented. These narratives follow templates and restate things. What does it mean to confess in a medium in which everyone is always confessing?

She ended on writing, even if repeated.

"Writing, however inoperable, allows us to begin with ... to begin with."

I asked what software we should look at. She suggested looking at what our network cards do. She suggested we try our terminal and run "netstat -a". She also argued we should look at hardware.
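
A rough Python equivalent of "netstat -a", using the psutil library (it may need elevated privileges on some systems to see every connection):

    # List the machine's current network connections, netstat-style.
    import psutil

    for conn in psutil.net_connections(kind="inet"):
        local = f"{conn.laddr.ip}:{conn.laddr.port}" if conn.laddr else "-"
        remote = f"{conn.raddr.ip}:{conn.raddr.port}" if conn.raddr else "-"
        print(conn.status, local, "->", remote)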

If almost any correlation can be shown to be true, which ones matter?

She argued that we should stop storing everything. Storage takes up space and energy.

Stephen Wilcox: Embodied Invention

Wilcox talked about embodied invention (inventio in the sense of creation). Aristotle talked about common places (topoi) - invention by discourse. Sensory invention is knowledge that comes through the senses - the common tacit knowledge; it comes to us from Vico. Lyotard talks about embodied common places and embodied invention through play. Wilcox talks about commonplace games.

  • Discursive - discursive argument based on shared assumptions
  • Sensory - sensory argument based on generally accepted perceptions like comics
  • Embodied - argument that reflects a particular body

We can't share embodied enthymemes as we don't have common experiences of the body. That creates room for artists to create works that teach us about body experience.

We get memes and enthymemes - common relationships between phrases and so on. Common experience of having watched stuff provides a common place of materials for enthymemes.

With bodies we form different topoi, not common places.

He then talked about poesis - the play or make function. Play is a way of exchanging enthymemes. All play has a playground - spots where special rules obtain for playing.

Wilcox wants to trouble the idea that there is a common place. He talked about games that give you a sense of a different embodiment, like Auti-Sim, where you play as a child with autism, and Depression Quest, the famous game on what it is like to be depressed.

He is building a game about food allergies.

Maureen Engel: Playing Queer: Locative Media, City Space, and Game Mechanics

Engel started with a quote by Lev Manovich that "a prototype is a theory." She told a story about Grand Theft Auto and how there are games that incentivize us to beat up prostitutes to get money back. These stories ...

Games circulate among players in social situations where they participate in and resist what they are encouraged to do. Games are learned genres - players learn to play them (how to use interface, controls, and conventions.)

She then talked about "queer gaming." There are games that allow for same-sex playing and those that have embedded gay content. There is an emerging "gaymer" community that has ...

She then moved to asking if one can build a game that a queer person would be at home with. How can one design a game that is queer?

She gave Dys4ia as an example where content and mechanic have a congruence.

She then talked about a prototype she is building on the annual Gay Bus Tour of Edmonton. This tour is communal in that stories get developed by participants from year to year.

Urban space provides the structure against which a queer community is built. She talked about the tactical production of queer spaces. She sees locative media as a creative misuse of space that can be queer. Engel's game lets one use four tactics:

  • Queer movement
  • Queer acculturation
  • Queer community
  • Queer belonging

Mark Johnson: Algorithmic Generation of Global Racial, Cultural, Religious, Architectural and Linguistic Variation

Johnson started by talking about procedural generation. He talked about roguelikes, where everything on screen is a character. His game is Ultima Ratio Regum. He wants to move away from only generating maps. His game's AI uses grammars and then generates countries, cities, buildings ...

His game creates a model for names based on where the nation is. It also creates coats of arms and mottoes. The game generates religions, each with all its paraphernalia from cathedrals to altars.

The game is about cultural choices. It would be interesting to "read" his ontology - what did he think were the available choices.

He concluded by arguing that the game is the most complete cultural game you can find. He hopes it would be good to play.

Tuesday, June 2nd, 2015: Day 2 of CSDH/SCHN

Peter Webster: Improving Research Data Visibility and Accessibility, for Better Data Sharing and Higher Research Profile.

Webster addressed the need for the humanities to do a better job at depositing data with appropriate documentation.

He talked about a major research data facility project called Portage, led by CARL and others. We already have a number of projects like the Ontario Scholars Portal Dataverse and the British Columbia Research Libraries' Data Services. Then there is Islandora.

There are repositories that anyone can use like ICPSR, TAPAS, and figshare. figshare is open and Webster felt it was a lowest common denominator.

What should be done:

  • You should have metadata so it can be discovered
  • You should provide a persistent identifier over the long term
  • Ensure that the data is discoverable

There are several registries for data like re3data.org, DataCite, OAIster and so on. There are many metadata documentation standards like Dublin Core, the Data Documentation Initiative, and the TEI.

He talked about how we are moving from a model where data is only shared when published to data being shared for other reasons. Datasets are being treated as scholarship, of a sort.

We also don't have to release everything; there is a continuum. He then talked about the Hague Declaration on Knowledge Discovery in the Digital Age. The declaration says that analysis is not what copyright was supposed to prevent. A number of outfits like CRKN are now helping researchers get access to data.

Sonja Sapach: Data Stewardship in the Digital Humanities

Should depositing data be viewed as scholarly activity? Sapach presented a paper arguing for the recognition of depositing documented data.

Sapach talked about the TC3+ document arguing for data management and big data. She talked about the OECD declaration on access to research data. Canada is also committed to the G8 Open Data Charter.

The idea is to encourage good data stewardship. We have no standardized way of sharing data with other researchers and the public.

A lot of folk don't plan for data management at the beginning of the project. Data Management Plans are encouraged and she showed the U of Alberta DMP Builder http://www.library.ualberta.ca/researchdata/begin/ . She then outlined challenges:

  • Lack of repositories
  • Lack of metadata standards
  • Lack of culture of digital scholarly stewardship
  • Lack of funding

She then talked about the benefits for all:

  • Researchers get inspiration for new research
  • Reinterpretation of
  • Increased citation rates
  • Avoiding costly duplication
  • Institutional benefits
  • Canada has made a commitment - we need to work with

She returned to the opening question about scholarly status of data.

We have to alter our practices in a significant way, which then leads to different practices, and we need to see the results of these practices as scholarly practice and therefore as scholarship.

Javier de la Rosa: SylvaDB: A Framework for Research Productivity in the Digital Humanities

de la Rosa talked about the challenges of social network research using standard databases. In standard databases once you have a schema it is hard to change while research tends to evolve. There has been a reaction called NoSQL or Not Only SQL. There are now lots of new types of databases.

In the humanities we are still stuck in the relational paradigm. In the CulturePlex lab at Western they have been creating new tools. They have created a Query By Example tool to avoid SQL.

Humanists deal with highly connected data. The CulturePlex decided to embrace the challenge and developed SylvaDB. This is a graph database tool. A graph stores relationships - things related to other things. One type of graph is a property graph: directed, attributed, and multi-relational. This is the graph structure used by graph databases to store data. There are other solutions like Neo4j, Titan, HypergraphDB and so on, but these are complex for humanists.
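
To make the property graph idea concrete, here is a minimal sketch using the official Neo4j Python driver rather than SylvaDB itself; the URI, credentials, and the "Duke of Lerma" relationship are placeholders for a local Neo4j instance:

    # A tiny property graph: two attributed nodes and a typed, attributed edge.
    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687",
                                  auth=("neo4j", "password"))  # placeholder credentials

    with driver.session() as session:
        session.run(
            "MERGE (a:Person {name: $a}) "
            "MERGE (b:Person {name: $b}) "
            "MERGE (a)-[:PATRON_OF {year: 1599}]->(b)",
            a="Duke of Lerma", b="Court Painter",  # hypothetical example data
        )
        result = session.run(
            "MATCH (a:Person {name: $a})-[r]->(b) RETURN type(r), r.year, b.name",
            a="Duke of Lerma",
        )
        for record in result:
            print(record)

    driver.close()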

SylvaDB is built to be accessible - you have schemas, data, and analysis. It is built on PostgreSQL and Neo4j on Amazon EC2. He showed the schema building tool, which has a nice visual interface. SylvaDB then provides forms for data entry and a visual query builder.

de la Rosa then talked about use cases starting with the Hispanic Baroque. They have some neat projects studying social networks of people like the Duke of Lerma.

Rachel Deblinger: Memories/Motifs, Historical Memory & the unexpected inspirations of Digital Humanities

Deblinger talked about postwar representations of the Holocaust. She showed materials around the visits of war orphans to the US. She built Memories/Motifs in Scalar to weave a network of meaning about Holocaust memory.

She told the story of Irene and Rene, twins separated in Auschwitz who were then brought to the US and adopted. Deblinger is interested less in their story than in the representation of the story. The Memories/Motifs site lets you explore the representations.

Exploring by tags lets users "choose their own adventure": rather than having to read by individual, they can follow themes.

She then shifted to the ethical question and asked about transforming memories into a digital representation. She is aggregating and representing materials about people still alive. What does it mean to be a historian if you are curating live information about living people? She asked whether she has informed consent.

Graduate Student Networking and Mentoring Lunch

ACH and CSDH ran a mentoring and networking lunch for grad students. Some of the issues that came up:

  • Too many jobs seem to want the DH person to do everything
  • There is a tension around what skills you need to have or are expected to have

Jean-Gabriel Ganascia: Navigating through Memory Island of Stanford Encyclopedia of Philosophy – a demonstration of the Memory Island Technique

Ganascia talked about a visualization for looking at the Stanford Encyclopedia of Philosophy based on the INPHO ontology. He gave some background on visual memory techniques and information visualization.

They are trying to visualize knowledge and ontology. He talked about how tree maps are hard to understand, even though they are attractive. They want to spatialize the hierarchy of ontological information to make it easier to remember. They were inspired by the Art of Memory.

They are creating imaginary maps of islands and you can follow your trajectory on the islands. He talked about not using 3D as it can lead to lostness. They use proportion and show paths. The web site is http://www-poleia.lip6.fr/~polyle/2014-12/result-inpho/index.html

He talked about the difficulties of making labels.

The point is not just to create pictures, but to understand data.

Richard Furuta: The Astrolabe Project

Furuta and colleagues are building a database about astrolabes. He showed a nice typology of the main types of astrolabes.

Daniel Powell: Reproducing Text: Observations from a Pre-Digital Humanities

Powell talked about pre-digital ways of reproducing texts. He started with how we are doomed in the digital humanities to always talk about evolution over and over. We don't seem to evolve our history.

Powell wanted to tell a different story and outline the history of EEB (Early English Books) that became EEBO (EEB Online). EEBO drew heavily on microfilm. He commented on how little we know of the corporate history of UMI, which imaged the collection to microfilm. They remediated the print to microfilm and then to digital, and there are all sorts of problems with how they did it.

Microphotography was the information technology that was thought of as a goal in itself until the 1980s. In the 1990s they realized that they should shift to the internet. He described some of the challenges and iterations of the project. The scholar is now several times removed from the original book.

Powell then raised the issue of a technological humanities - we need to start looking at overlapping information technology not only computing. The humanities are closely tied to the history of technologies. We need to understand technology better. He talked about Mumford and Innis. We need a media archaeology to understand technologies.

The DH Manifesto 2.0 talks about deconstructing the materialities and practices of the humanities. We need to also ask about the disciplines, including our own.

Rockwell and Sinclair: Characteristic Curve: Reinterpreting Early Analytics

Stéfan Sinclair and I gave a paper on developing notebooks that recapitulate important experiments in text analysis. You can see some of our notebooks here.

Quinn DuPont: A Parallel Tradition - Cryptographic Writing as Humanities Computing

DuPont is interested in the history of cryptography as writing. He sees cryptography as a form of humanities writing. Cryptography evolved alongside text analysis; by Leibniz's day cryptography had diverged.

The history of computers is often seen as the history of a calculator. The Colossus, by contrast, was a text processing machine. Important to the emergence of computing is the co-development of the intelligence agencies that used computers in cryptography. I wonder if that ...

There was a second use for text analysis. Warren Weaver sent out a memo to collaborators about the possibility of using computers in automatic language translation. This was seen as an extension of cryptography: think of reading Russian as a form of decryption. Weaver posits a pivot language into which all languages can be translated.

He then goes back to Bacon and connects the discussion of cryptography to the discussion of a common language. Bacon saw cryptography as more than secret writing. He saw it as a way to bypass censorship, but also as a form of literacy.

Award Plenary: CSDH/SCHN Outstanding Achievement: Chad Gaffield

We recognized two graduate students with awards and then Chad Gaffield with the 2015 CSDH/SCHN Outstanding Achievement Award.

His talk was on "Digital Humanities at a Crossroads: Questions and Suggestions for Next Steps"

He talked about the Transatlantic Platform and extensions into Brazil and the Americas. He then turned to issues around where DH is going. At U of Ottawa they put together a taskforce and have produced a preliminary report.

He talked about an Ed Ayers article about how DH is being domesticated and losing its future. Maybe computing has been domesticated and it's not an issue any more. Or maybe its time has come and it's time to look ahead.

He talked about how surprising the ascendance of DH is. If DH is ascending then why is there so much antagonism? He talked about the antagonism towards digital and quantitative history/humanities that still exists. He talked about the myth of "solitary scholars" that still persists. There is an issue about the humanities being detached from campus and "not relevant."

He then tried to unpack why DH is doing so well. It is:

  • Helping campuses reimagine teaching and learning - DH has been a less breathless and more measured engagement with computing
  • Helping rethink undergraduate learning as research - he mentioned The Great Canadian Mysteries as a way of thinking about the
  • Helping us think about how to collaborate without forcing it - he gave the example of Susan Brown's CWRC
  • Helping us think about interdisciplinarity - DH brings discipline-based interdisciplinarity - the humanities have been wrestling with how to engage other disciplines
  • Helping us think about new ways of knowing - why do we think of coding and math as different ways of knowing - while we want to avoid the quantitative/qualitative dichotomy, we still need to be open to numeric/statistical approaches
  • Helping us imagine new partnerships of co-creation
  • Helping us get away from notions of detached scholarship - we think much more about engaged research than others

He then talked about how we now need to start defining Digital Humanities and be clear that having a Blackboard site is not doing DH. To be part of DH you need to be helping to advance the field, not just using technology.

We also have to avoid "revolutionary fervour"; he shared an article from the Globe and Mail about a McLuhan project that promised too much. We sometimes talk about the tools when we need to talk about insights.

He also talked about the need to avoid snapping at people playing in the humanities and need to take an engagement approach. Likewise we should engage those trying to take over the humanities. Finally we need to work closely with libraries and work on curriculum. DHSI shouldn't be the only workshop.

The 21st century is when we take seriously understanding the human condition. A key part of that is the new data we can use as evidence.

Wednesday, June 3rd: CSDH/SCHN and CGSA

The third day of CSDH/SCHN overlapped with the first day of CGSA. We had a digital demo / poster session at 8:30. Then there was a joint CSDH and CGSA session on gamergate - the topic that can't be tweeted. It was interesting that we actually had a discussion about tweeting the topic at the beginning.

A. Budac: #GamerGate: Distant Reading Games Discourse

Budac gave a paper I helped with on a project on gathering and studying the gamergate phenomenon. We have shared some of our data on Dataverse. We also have an interactive for exploring keywords.

E. Vist: Idiosyncratic tagging and the creation of intimate publics on Tumblr

Vist started by commenting on how we were promised an internet that has no race, gender and so on. Vist talked about how women started creating spaces where they could have "intimate publics."

The other dream for the internet was a true democracy. That is also dead. There isn't a debate anymore. The filter bubble is supposed to be bad, but for many women it is needed to avoid harassment.

Some of the affordances for networked publics include:

  • Scalability
  • Replicability
  • Persistence
  • Searchability

These affordances cause people who don't want to be blasted to take various defensive measures to create intimate publics. Twitter doesn't let one create such publics; Tumblr does. Twitter structured the GamerGate discussion in ways that privileged certain types of participants. Twitter let people search back to show that people who were critical were hypocritical. Twitter made it easy for gamergaters to investigate voices they wanted to silence and to sea-lion them.

People use idiosyncratic tags in Tumblr to exclude an audience and then to maintain the boundaries. Tumblr indexes only the initial tags, so you just put in a bunch of useless ones.

Twitter is built for the internet we wish we had. The reality is that certain practices and publics are privileged. The impossibility of creating intimate publics in Twitter means that people are leaving for Tumblr.

S. C. Ganzon: Nerdy Cupcakes, Boob Cams and Girl Gamers with Evil Boyfriends: Gender Capital and Constructions of Gamer Identities in Let's Plays by Women

Ganzon started by talking about how popular "Let's Play" YouTube videos are. There are very few Let's Play videos by women. Various features of YouTube privilege male voices.

Alas, those women who do post videos get seen as wanna-be gamers or sluts.

The Let's Play genre may go back to a screenshot version of Oregon Trail. LPs are not walkthroughs; they focus on the subjective reactions. One of the popular early versions is GameCenter CX, the Fuji TV show in Japan.

Those who are popular use branding to make money. They create a persona for an audience in order to get views (and money.) The constructed gamer identity has a lot to do with brands and is gendered. Let's Play feeds a toxic gamer culture by reinforcing attitudes.

She talked about the pixel pack, who collaborate. It is a common practice on YouTube and helps people gain subscribers. Few women LPers have a face cam or use their ...

Trolling and sexist comments were common, but on the more subscribed channels it was shut down by other subscribers. A lot of the discussion is around "fake girl gamers" or branding women as "attention whores". Broadcasting for women is perilous - if you do too much you get called names.

On YouTube a hierarchy among gamers will remain.

S. Murray: Three Faces of Aveline: Difference, GamerGate and the Visual Politics of Play

Murray does games and the history of art (visual studies.) The recent gamergate culture war has shown how political representations are. They show how difference and power are at the forefront.

She talked about the attempt to keep gaming safe from social justice issues. The idea is that the normative group is constructed as not political and not doing identity politics; they are just asking people to stop doing politics in gaming. Of course, the normative group is doing politics when it pretends not to.

It isn't really useful to point out racism/sexism in games as if it is a case of ignorance. Murray thinks gamergate made clear the deep structural racism/sexism in gaming. See Male Protagonist Bingo.

Game culture is the least progressive form of media representation despite being the newest. Given how large a percentage of gamers are women ...

She then talked about Assassin's Creed III Liberation. The main character is a lady assassin slave. Murray talked about the aesthetic origins and the persona of the main character.

Assassin's Creed owes a lot to Prince of Persia and swashbuckling movies. AC was envisaged as "Prince of Persia: Assassin." It became its own title, but there is still a swashbuckling quality with an orientalist aspect. There is a deployment of powerful images of the orient (the Middle East) in the games. Tropes of cruelty, harems, and so on are common. Games often draw on durable myths and fantasies from the cultures that create them.

In AC III: Liberation the main character is both slave and assassin. The core mechanics of combat, stealth, and navigation create a particular relationship to place.

She talked about Stuart Card's discussion of adventure as a form of colonial tourism. (I'm not sure I got this right, but there is a way in which AC seems to deploy adventure, in the sense of adventuring through different times and places, as a colonial appropriation.)

There is a tension between the character that disavows the masculinist construction, but underlying the game is a collection of tropes that reinforce an orientalist perspective.

She closed by talking about the different versions of the main character and passing between being a lady, slave and assassin.

There was an interesting question about the differences of methods between the papers. Ours was distant reading; the others were close.

David McClure: Textplot: Visualizing the thematic structure of texts and corpora

McClure worked on Neatline when he was at the University of Virginia. He talked about how ideas of mapping are being used in the humanities, where one can have maps that adapt geographic literality. He showed some Franco Moretti images.

He talked about Textexture as an example. Textplot is a stab at a visualization that builds out a semantic web around a term or terms. For histograms he generates a kernel density curve across the narrative time of the text. Once you have distributions you can do things with that information; you can compute the overlap of two distributions. He showed different examples, including a neat one of Humanist. He closed on some of the things that don't work well.
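
Not McClure's code, but a sketch of the underlying idea: estimate a kernel density curve of where a term falls across a text, then score the overlap between two terms' curves (it assumes each term occurs more than a couple of times; novel.txt is a placeholder file):

    import numpy as np
    from scipy.stats import gaussian_kde

    def term_density(tokens, term, grid):
        """KDE of a term's positions, evaluated on a grid over the text."""
        positions = np.array([i for i, t in enumerate(tokens) if t == term])
        return gaussian_kde(positions)(grid)

    def overlap(d1, d2, grid):
        """Shared area under two density curves (1.0 = identical distributions)."""
        return np.trapz(np.minimum(d1, d2), grid)

    tokens = open("novel.txt", encoding="utf-8").read().lower().split()
    grid = np.linspace(0, len(tokens), 500)
    d1 = term_density(tokens, "whale", grid)
    d2 = term_density(tokens, "sea", grid)
    print("overlap:", overlap(d1, d2, grid))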

He sees these visualizations as intuition pumps or deformances. Could these become more? Could these become a way to compare texts? (Matt Jockers tried this with ...)

I like McClure's clean design style. The code is at https://github.com/davidmcclure

Stan Ruecker: Why We Are Designing Hardware Peripherals for Text Analysis

Ruecker talked about a project designing physical peripherals for text analysis. A reviewer asked how his work might be immediately useful. Ruecker talked about how long it takes designs to become useful. It takes decades; we shouldn't expect immediate results from academic design research.

He then gave some neat examples of physical mockups of text analysis environments. This raised the question of what these models do and types of knowing. His designs are meant more for physical and tactile knowing.

His examples were fascinating. Check out the Twitter feed for photos.

I am reminded of what Bertin did for cluster analysis with blocks and rods.

Why are we doing this? To be smarter?

Micki Kaufman: Everything on Paper Will Be Used Against Me: Quantifying Kissinger

Kaufman started by commenting on how her work might not be considered proper history. I'm surprised she got this reaction.

She has studied two different corpora related to Kissinger:

  • Memcons are memoranda of conversations that were archived and organized by librarians
  • Telcons are transcripts of telephone conversations that were only released after a case went to the Supreme Court

She had an interesting interaction with ProQuest when she was scraping large numbers of documents.

She showed a series of really neat results in a sequence of thinking through topic modelling and term frequency histograms. She had a neat stacked bar graph of the memcons and telcons that showed an interesting break when Kissinger was made Secretary of State. There were really interesting differences in the two types of documents. The classifications of the telcons made a difference.

She then talked about a collocation study using AntConc. She looked at words collocating with "bombing" and found that in the telcons they get different collocates than in memcons. Bombing of Cambodia was illegal and thus talked about less in the memos that Kissinger knew would be eventually retained in the record.
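
The same kind of window collocation count can be done outside AntConc; here is a minimal Python sketch (telcons.txt and memcons.txt are placeholder files):

    # Count which words fall within +/- 5 tokens of "bombing" in each corpus.
    from collections import Counter

    def collocates(tokens, node, window=5):
        counts = Counter()
        for i, tok in enumerate(tokens):
            if tok == node:
                lo, hi = max(0, i - window), i + window + 1
                counts.update(t for t in tokens[lo:hi] if t != node)
        return counts

    for path in ("telcons.txt", "memcons.txt"):
        tokens = open(path, encoding="utf-8").read().lower().split()
        print(path, collocates(tokens, "bombing").most_common(10))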

She showed a Textplot of the data.

See her blog on this at http://blog.quantifyingkissinger.com/category/interactivity/interactive/

Toniesha Taylor: Millican to Ferguson: Digital Representations of Activists Discourse Then and Now

Since the Civil War, communities have had different relationships with authorities. The goal of expanding the franchise of black men ran against the goal of enfranchising white women. The tensions codified segregation rather than guaranteeing emancipation.

Millican, Texas, also known as Six-Shooter Junction, had one of the largest race riots in the US in 1868. At the time newspapers were the main form of information. By contrast, today there are all sorts of alternatives that activists can use.

The past can be understood as a window into the future. She talked about Millican when the war ended. Millican had been a big trading spot for the Confederacy. There was a flood, a crop disaster, and economic collapse. You have free blacks and poor whites. White men who had supported the Confederacy were forced out and replaced by black Republicans.

The KKK responded by marching against black churches. At Brook's church they fired on the KKK. The KKK fled and then came back.

She then talked about the language in the newspapers. An activist would use words like "patriot" and "god-fearing" for blacks. The ...

She gathered tweets of terms about Ferguson and got a million overnight. They looked for terms like "justice" and "god-fearing". She finds that the racist tropes function in the same way. The news doesn't show blacks trying to activate their rights, but whites losing theirs.

She wants to use the digital humanities methods to show that change can happen. She wants to show the past to help people really see the functioning of discourse and then be able to imagine things changing.

This project and discovering the parallels between 19th century racism and 21st century racism was obviously hard for Taylor. She talked about having to stand back from the results.

Dutta and McArthur: The Business of Culture: A Single-Subject Community

They talked about The Culturist - a project to simulate a real newsroom and reflect on that. The idea was to think about how business and culture are interrelated.

This course was student-centred: students were asked to get out and do the work. They found interdisciplinary teams were best.

They talked about hypercontent where a site/blog is focused on a narrow subject - all the research on that subject. These have become alternatives to the news. Some examples are Chalkbeat, Deep Sea News, Ebola Deeply, and Syria Deeply (coming from News Deeply.) Their hypersite was focused on anything about the business of culture.

They developed a social media strategy and then used SylvaDB to analyze it.

They did a 360-degree report at the end to explore what worked and what didn't. The recommendations include expanding their social media presence, exploring monetization strategies, and moving to an established platform to host the site.

Budac and Palmer: WIScking Ideas: Gather a Corpus from Wikipedia

Palmer started by talking about the background to this project. The WIScker tool came out of an Intensity Challenge at the U of Alberta. Palmer then talked about how Wikipedia, despite being unreliable, is a unique resource that has been used by a number of people to study ideas.

Budac then demoed the tool. She talked about how to use it and some of the issues. The tool is in Javascript so you can save the page locally and use it from your machine. The tool is at: http://analytics.artsrn.ualberta.ca/wiscker/

Palmer then talked about our idea of the epidemiology of ideas. He then talked about an actual project studying "terrorism" with the tool. He talked about how we used it with Voyant.
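
WIScker itself is the JavaScript tool linked above, but the same sort of corpus gathering can be sketched in Python against the MediaWiki API (the article titles here are just examples):

    # Fetch plain-text extracts of Wikipedia articles via the MediaWiki API.
    import requests

    API = "https://en.wikipedia.org/w/api.php"

    def fetch_extract(title):
        """Return the plain-text extract of one Wikipedia article."""
        params = {
            "action": "query",
            "prop": "extracts",
            "explaintext": 1,
            "format": "json",
            "titles": title,
        }
        pages = requests.get(API, params=params).json()["query"]["pages"]
        return next(iter(pages.values())).get("extract", "")

    corpus = {t: fetch_extract(t) for t in ["Terrorism", "Definitions of terrorism"]}
    for title, text in corpus.items():
        print(title, len(text.split()), "words")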

AGM: Annual General Meeting

Jeremy and Crystal from SSHRC talked about the Trans-Atlantic Platform. I have blogged about this at http://theoreti.ca/?p=5536

Their pilot is a programme on digital scholarship in conjunction with Digging into Data. There will be a launch in 2016 with funding decisions in 2017. With the expanded TAP there might be up to 14 countries involved.

John Simpson pitched Compute Canada starting with OwnCloud.

Amy Earhart: DH Futures: Conflict, Power, and Public Knowledge

Earhart gave the closing keynote. She told her story: she was a temporary lecturer for five years, and a NINES workshop launched her interest. She has spent a lot of time thinking about her time and power.

She talked about where we are going in light of the power structures in which we work. We are at a treacherous moment in the digital humanities: we are seen as representatives of neo-liberalism and so on. See Matthew Kirschenbaum's article "What is 'Digital Humanities,' and Why Are They Saying Such Terrible Things about It?"

Melissa Terras told Earhart a story about a request on Twitter that led to extreme reactions, right up to a very nasty tweet. There is a struggle for power taking place within and for the academy, and the digital humanities finds itself the site of battle. DH has an understanding of power that may have led to this problem as we think of ourselves ...

DH is always contained within the structures of the academy. DH, when it argues it is new and outside, just provokes fearful charges of anti-humanism. When DH views itself as a big tent, that betrays a naivete. The big tent ignores all the traditions that we come from. We should look at the sweeping generalizations carefully. She feels that DH should be particularized. If we understand how we are distinctive (rather than a big tent) then we might see what we have to offer. She feels we should "boycott the circus."

Digital humanities is ever a living term that is evolving. We need to think about how national structures constrain what we do. Reward systems and funding models influence what is done. We need to overcome assimilationist forms of thought. All sorts of things could come from outside the global north.

Is the digital humanities an outlier to humanistic study? In her forthcoming book she outlines how digital literary studies are influenced by the digital:

  • textual studies
  • new historicism
  • cultural studies
  • data/algorithmic studies

She gave examples of how textual studies felt outside the mainstream and felt safe from messy issues like race and gender. She talked about views of the purity of documents and how textual studies were about cleaning out corruption. Fredson Bowers seemed to be trying to breed pure texts just as he bred wolfhounds.

We tend to think of ourselves as outside of power. It is ironic that DH is an "in" crowd when we have thought of ourselves as outsiders. We need to be honest that DH does have power in the academy. Power manifests itself in terms of jobs, invitations and so on and DH has all of that. We need to be careful not to pretend we have no power when we do. We all slip into positions that are not critical of power.

Race is another issue that we ignore. The issues permeate the academy and society, so it would be shocking if it weren't an issue. Earhart followed an interesting trace in language based in race. We are in a moment when any criticism is perceived as attack, which makes it hard for us to really engage. We do need to be open to criticism and thinking critically.

DH should be a way to deconstruct power dynamics in the academy rather than being defensive.

There is nothing essentially anti-hierarchical to DH, but we have a moment now when we can intervene.

She talked about Willard's note on Humanist about the Yale grad student conference. It was as if having an Ivy legitimized us. We don't want a DH that reifies current power structures.

We have become a lightning rod because we learned outside the system. Those in power may be anxious about the new swaggering kid on the block.

The humanities and the academy seem to be under attack. DH seems to be alien to the humanities and getting what little is left. That makes us a Trojan horse - a corruption of the humanities. The points of tension with our colleagues are how we disrupt traditions. The more public we are the more threatening.

What is threatening is not the work, but how we do it. For many DH projects the most powerful person might be the undergrad programmer. That is disruptive.

She called for activist digital projects. She told a horror story about being reviewed for tenure and how the committee felt that DH itself was the problem. No amount of publication could be enough.

She talked about a project, White Violence, Black Resistance about the Millican riot. Her collaborator, Toniesha Taylor, talked earlier about the project. She talked about the activism of having students join them in doing meaningful research. The partnership is also looking at inequitable practices between Texas A&M and Prairie View A&M.

She talked about small DH projects and how they are important. We need to be frank and honest about power. To challenge power structures we will need to be mercilessly introspective. We need to be willing to be uncomfortable.

As someone pointed out, this was a brave talk that we needed to hear.

This has me asking myself if I would be willing to stop flying to give talks. Coming from Alberta, I feel we need to confront oil consumption in our practices. Given that invited talks are a sign of prestige, it is hard to imagine giving them up.

Thursday: June 4th: Canadian Game Studies Association

After a busy day with lots of papers I crashed and made it to the afternoon of CGSA.

Jérémy Bergeron: Comparison of Casual and Hardcore gamers’ physiological response patterns during a turn-based strategy card game

Bergeron is doing player research not game research. He is studying the types of players. He asked why we like games:

  • Graphics, storyline, game mechanics
  • Emotions or moods people want
  • Skills and challenges (ideas of flow)

Psychophysiology is the study of interior affective and cognitive states. They can study stress, cognitive overload, and emotions.

They wanted to see if psychophysiological markers help differentiate different types of players. They split players by how much they played: they had 15 hardcore and 15 casual players. Players played Myth and Magic.

They ran all their data through principal component analysis and found that the only difference was in respiration features. The differences are hard to explain. There are differences; now they need to start looking more closely. Is there really something like "player types" such that data could find it?
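
For what it's worth, here is a toy sketch (with made-up numbers, not their data) of the kind of PCA used to look for group differences across physiological features:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # 30 players x 6 physiological features; rows 0-14 "casual", 15-29 "hardcore".
    X = rng.normal(size=(30, 6))
    X[15:, 3] += 1.0  # pretend one respiration feature differs for hardcore players

    pca = PCA(n_components=2)
    scores = pca.fit_transform(StandardScaler().fit_transform(X))
    print("explained variance ratios:", pca.explained_variance_ratio_)
    print("casual mean scores:  ", scores[:15].mean(axis=0))
    print("hardcore mean scores:", scores[15:].mean(axis=0))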

The FUNii (funny) project is trying to create a predictive model of player's fun. This is just the beginning.

The interesting/scary thing is that many of the controllers that we are seeing in the new consoles etc. can actually measure our ...

I had two questions:

  • How confident are they in there being player types?
  • What are the ethics of helping companies use our devices to gather data about us?

Maude Bonenfant: Using Big Data tools and techniques to study a gamer community: The case of an online multiplayer game on Facebook

Bonenfant didn't talk about results so much as data and methods. They wanted to compare in-game and out-of-game activities. They want to identify identity moves.

She is studying Big Story Little Heroes, a multiplayer game on Facebook, which lets her get Facebook data. Again, did she have ethics approval for gathering the data? Did she sign any memos with Vandal?

She has a sample of about 550,000 players - 87% male. They used mixed qualitative and quantitative methods. They have Google Analytics connection data. They have Facebook analytics, but it is aggregated. The best data is from Vandal Games, the publisher. They track levels, wins and losses. They use RapidMiner to process and Excel to organize data. She showed RapidMiner.

You can't just work with the quantitative data - it doesn't necessarily explain what is happening. They saw a gap between level 8 and 9, and when they talked to the developers, they confirmed the design of an extra-hard level. There was also the issue of what changes happened when new features were introduced.

She wants to make an ethical reflection on the tools of big data. She worries about how big data tools might skew the results. Researchers are tied too closely to tools. Companies have too much data, but don't know how to ask questions. There is an automated generation of meaning from the tools and data.

Does she share her data?

Keiji Amano: Pachinko: Evolution in parlour and in video game

Amano presented a paper he and I worked on.

Alex Dean Cybulski: Playing in the Enclosure: the Xbox 360, Surveillance and Identity

He started by talking about the Xbox 360 and the Kinect and what it means to be spied on. Users were banned from Xbox Live based on actions like opening their case. He is interested in how Xbox 360 Achievements work and how Microsoft's surveillance works. How do they know what we are doing? How are they managing us?

He surveyed some understandings of surveillance. Why would MS watch us? The production of consoles is a high-risk industry - the industry wants to know what it is in for. The surveillance in games is of two types: the game software collects info, but there is also stuff gathered by the runtime environment of the Xbox, which is watching you and making a profile.

Videogame telemetrics are then used to optimize games. Social games, for example, will be managed so that players pay sooner and more often. Games are being turned into a crude Pavlovian science. It also minimizes expert knowledge.

Microsoft's network is how the info is collected. We are required to submit to getting an account. Then we get "juicy feedback". Players are enrolled in a network of consumption.

The Xbox uses a highly proprietary hardware supervisor - the "hypervisor". The hypervisor is checking code and hardware to make sure you don't install stuff you shouldn't. The hardware is now the sovereign territory of Microsoft. It is like an embassy from Microsoft in your house.

We are transformed into subjects.

In discussion we talked about surveillance and ethics. Some issues:

  • What is the ethics of working on psychometrics - what if we help industry spy on us?
  • What about the ethics of the data? We have to work with industry if we want to understand what is happening which puts us in a situation of compromise.
  • Can we resist surveillance? Can we not play ball?

Carl Therrien and Anthony Colpron: Now you’re playing with rhetoric. The evolution of marketing discourse in the video game specialized press (1981-1995)

Therrien and colleagues are looking at how players are addressed in marketing discourse.

Williams, in "The video game lightning rod", has followed themes in marketing over time using coding. They are applying this method. They are going through three magazines (one in the US, one in the UK, and one in France). They have created and revised a typology and are now encoding the materials.

His hypothesis is that we move from antagonism (competition) to power fantasy. In arcades the rhetoric was taunting and antagonizing players. Later the rhetoric moves to a power fantasy rhetoric: Nintendo's "Now you're playing with power."

Anthony Colpron then talked about preliminary results from CGW. War games with male antagonists were most common. Males were leaders and women don't show up that much. Most characters are of the dominant ethnicity (white). Technological features are used to attract buyers. War games are the most popular genre and the AI is part of what is sold.

Violence and sexuality: over 500 of 840 games depicted violence. The few women are sexualized.

They had interesting things to say about the taunting of players in the marketing ("Are you ready?"). They think cross-cultural comparisons should be interesting.

They decided to distinguish violence and hyperviolence. Hyperviolence introduces gore. War games have a very different type of violence.

It is interesting to see how auteurism and claims to art are presented. Finally there are the attractions of the media or graphics - interesting images or other media.

"The rhetoric and the play are never identical" B. Sutton-Smith, The Ambiguity of Play.

Emily Flynn-Jones: Letters to the Editor and Editorialized Gender: Discussions of Gender and Gaming in the Popular Press During the Mid-90s

Magazines often dictate how girls should consume and what values they should have. Flynn-Jones was hoping that gaming magazines might show how gamers are shaped. The ideal gamer is young, male and wealthy. The gamer is taught how to be a gamer.

She discovered two prolonged discussions about gender and gaming in the 1990s. There was a concerted attempt to attract girls with girly games. There were criticisms of the movement to create gender-segregated play. Tomb Raider provoked discussion.

By 2000 there was data ...

Nintendo Power and Electronic Gaming Monthly were two of the most popular gaming magazines. She looked at the editorials. She gave an example of a letter from 1995 to the editor of Nintendo Power complaining about representations. Some of the counter arguments include:

  • Counter examples
  • Gamer knowledge and gaming capital
  • Asserting male dominance
  • Drawing false equivalence
  • Male logic vs feminine emotion
  • Disengagement

She then presented Rhetoric Reloaded, a game that presents quotes and you have to guess whether each is from the 90s or now. Then she talked about a game called Rhetoric Reloaded Now that emphasizes that the rhetoric of gamers is consistent.

Ray Op'Tland: Gaming Evolution: Tracking Change in Gaming Technology and Titles

He talked about how we want to bring evolution into studying things like games. He has developed an ontology of evolution. Evolution doesn't really happen in culture the way it happens in other systems.

Evolution has variation, heredity, and differential fitness. Evolution can happen regardless of the underlying medium if you have these three. He was trying to propose the idea that there is a DNA equivalent in games so that we can follow memes through time in games. He had an interesting plot ...

He ended by warning us that "Genealogy is not Evolution" and we need to think about that.

At times his talk seemed patronizing, but perhaps I wasn't following properly. He seemed to be warning us about using evolution as a concept in a naive fashion.

Cat Goodfellow: Examining difference in video game traditions: tentative steps towards a Soviet history of gaming

We don't tend to talk about non-western games unless they have influenced us. Non-western gaming is seen as peripheral and primitive.

Post Soviet games (Witcher 2, Metro 2033, and S.T.A.L.K.E.R.) tend to have a cult status and are seen as coming from outside. Few realize that people have been playing games outside the west for 40 years.

She gave a bit of a history. In the 1970s there were a lot of arcade games that were like fair games - partly electronic, partly digital.

There was a console called Dendy made by a Korean company for the Russian market. She showed a neat reality gaming TV show: https://www.youtube.com/watch?v=Wzg3GQpRHLg

The Russians also came up with games that looked like Game & Watch. They were copying the idea of small hand-held games. She felt the ideology of these games fit the Russian spirit of the time. See http://www.ebay.com/bhp/nu-pogodi-game

"ELLO" electronic game was made in 1986 and made in Ukraine. Cost about 195 roubles. It was a some sort of mix of game with pegs and so on. You put a template on a board and that creates a game.

This was one of the most interesting papers! I now want to go on eBay and spend money on Russian game machines.
