
CSDH-SCHN 2014

These are my notes on the CSDH-SCHN (Canadian Society for Digital Humanities / Société canadienne des humanités numériques) 2014 conference at Congress. The conference is taking place at Brock University.

NOTE: these are being written live, so they are full of typos, omissions, gaps, lacunae, holes, black holes, pinholes and so on.

Javier de la Rosa: A Digital Demography of Ten Centuries of World Painting

De la Rosa started by discussing how art can show the social structures of a population when there is no census data. Their corpus of 25,000 paintings includes 47,000 faces, which they identify with face recognition. They have all sorts of problems:

  • Some faces are hard to recognize, and there are false positives
  • Most paintings from Europe
  • Metadata (like dates) is of mixed quality

Interestingly, there is a decreasing density of faces starting in the 15th century (or before). The number of paintings goes up, but so does the population. He had interesting data about the age of the people represented. Women are presented older in the 12th-13th centuries - were they patrons then? The trend afterwards is to show younger women.
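The face-density statistic is straightforward to compute once you have a detector. Here is a minimal sketch in Python using OpenCV's stock Haar cascade - my illustration, not the team's actual pipeline, and the folder name is hypothetical:

    import glob
    import cv2

    # Stock frontal-face detector shipped with opencv-python; a stand-in
    # for whatever detector the project actually used.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    counts = {}
    for path in glob.glob("paintings/*.jpg"):   # hypothetical image folder
        img = cv2.imread(path)
        if img is None:
            continue
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        counts[path] = len(faces)

    # Face density: average faces per painting, the statistic whose
    # decline he charted over the centuries.
    print(sum(counts.values()) / max(len(counts), 1))

Note that Haar cascades were trained on photographs, so painted faces will produce exactly the recognition misses and false positives listed above.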

He showed age pyramids for different periods. During the Baroque there were more youthful men; in the Renaissance, a focus on women. Painters seem most interested in women between 25 and 29. The algorithm, however, isn't always accurate for the young and the old. Painting styles also changed, and children are sometimes shown older than they were.

Painting demographics seems a fascinating way to study culture on a large scale.

Antonio Jiménez-Mavillard: Bullipedia, collaboration, motivations and trust

Antonio started by talking about the el Bulli restaurant, which was possibly one of the best restaurants in the world. The chef opened the el Bulli Foundation as a centre for creativity around cooking. They are building the Bullipedia to be an online gastronomic database.

Antonio talked about the design for this Bullipedia, which he is working on. He talked about gathering data about Taste of Home (a cooking web site). The social network was very centralized - everyone links to Taste. This led him to conclude that, if the Bullipedia wants more of a discussion, it needs to develop a community of trusted commentators.

Natalia Caldas: The Glory of New Spain: Depiction of Complex Identities in Casta Paintings

Caldas went back to art history. She is looking at casta paintings: sequences of paintings of families organized by caste, from the early 17th century to the 18th century. The casta system was imposed by the colonial regime, and the paintings often specified role, wealth and race. The paintings also depict the beauty of New Spain, its exoticness.

I think that creoles (criollos) were not depicted in these caste sequences. There were tensions between creoles and the Spanish-born.

Caldas built a graph database of casta paintings with lots of information about the people represented, objects, settings, family structure in order to see how these paintings represent social structure.

Interestingly, the Spanish seem to be the most represented single group, though 82% of the people represented are non-Spanish. Posing (of males) is the most common stance. Women are often shown holding a child. Hitting and fighting are also frequent. She showed a type of painting of a Spaniard holding off a black wife who is hitting him while a child holds onto her skirts.

An interesting question was about the audience of the paintings. These paintings are a fascinating window into colonial representations of family types.

Jason Boyd: Biographical Corpora and Computer-Assisted Exegesis: The Texting Wilde Project

Boyd talked about representations of an event in Wilde's life. He has extracted the incident from the corpus of texts and compared them to show that the story changed in different contexts. Biographical stories need to be understood in the context of the telling. So much of Wilde's life is gossip and stories spread over many different sources.

Boyd is trying to develop a genealogy of the facts of Wilde's life so one can track the "facts" over time and across the tellers.

He has developed a TEI P5-based encoding scheme designed for interpretation. He tried to code the texts using TEI XML and found that chunking would be very hard, so they worked at the unit of the paragraph. They found it hard to code anecdotes and recorded speech. They want to put anecdotes on a timeline so they can track the ways the anecdotes evolved.

Markup has been a way to think through the problems.

Stephen Wittek: Mapping Conversion in Early Modern Texts

Wittek's project is part of the Early Modern Conversions project. He talked about the distinction between histories and news - something that emerged in the 17th century as news began to emerge out of history. He used the transcribed texts in the EEBO collection, dividing them into two corpora: Commercial Drama and Commercial Print. He used a tool to replace spelling variants with candidate modern spellings, and then used Voyant and Mallet (topic modelling) to study the corpora.
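The normalization step is easy to picture. A minimal sketch, assuming a simple variant-to-modern lookup table (the tool he actually used presumably does this with candidate ranking rather than a fixed table):

    import re

    # Toy lookup table of early modern variants; a real one has thousands.
    variants = {"haue": "have", "loue": "love", "vertue": "virtue"}

    def modernize(text):
        # Replace each word with its modern candidate, if one is known.
        return re.sub(r"[A-Za-z]+",
                      lambda m: variants.get(m.group(0).lower(), m.group(0)),
                      text)

    print(modernize("I haue great loue of vertue"))
    # -> I have great love of virtue

Normalizing spelling matters here because Voyant and Mallet otherwise treat "loue" and "love" as unrelated types.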

He was encouraged by the topic modelling. He wants to track what is happening in theatre and in print. He tried Paper Machines, which works with Zotero and does historical topic modelling. This has given him ideas to follow up on.

James O'Sullivan: "More or Less All Plot": A Rolling Delta Analysis of the Commodification of Collaboration

They are using multivariate statistics to analyze the function words in James Patterson's novels. Patterson is a former advertising executive and the author of over 100 books written in collaboration with others. Why Patterson and his collaborators? He is a popular author - the world's best-selling author. Analyzing Patterson lets them look at pop culture and at how he works with his junior partners.
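For readers unfamiliar with the method: rolling Delta slides a window along a disputed text and attributes each window to the nearest reference author by Burrows's Delta (the mean absolute difference of z-scored function-word frequencies). A minimal, self-contained sketch on toy data - the study itself would typically use something like the stylo package in R, not this code:

    from collections import Counter
    import statistics

    # Toy reference samples, one per author; real inputs would be whole novels.
    refs = {
        "patterson": ("the and he was to of in it a that " * 30).split(),
        "partner":   ("a the but they was is to on at said " * 30).split(),
    }
    test = ("the and he was to of a on at said " * 40).split()  # disputed text

    # The most frequent words across the reference corpus are the features.
    all_ref = [w for toks in refs.values() for w in toks]
    vocab = [w for w, _ in Counter(all_ref).most_common(10)]

    def rel_freqs(tokens):
        c, n = Counter(tokens), len(tokens)
        return [c[w] / n for w in vocab]

    ref_f = {a: rel_freqs(t) for a, t in refs.items()}
    means = [statistics.mean(f[i] for f in ref_f.values()) for i in range(len(vocab))]
    stds  = [statistics.pstdev(f[i] for f in ref_f.values()) or 1e-9
             for i in range(len(vocab))]

    def zscores(f):
        return [(f[i] - means[i]) / stds[i] for i in range(len(vocab))]

    profiles = {a: zscores(f) for a, f in ref_f.items()}

    # Slide a window along the disputed text; each window goes to the
    # author whose profile is nearest by mean absolute z-score difference.
    win, step = 100, 50
    for start in range(0, len(test) - win + 1, step):
        zw = zscores(rel_freqs(test[start:start + win]))
        delta = {a: sum(abs(zw[i] - p[i]) for i in range(len(vocab))) / len(vocab)
                 for a, p in profiles.items()}
        print(start, min(delta, key=delta.get))

A run of windows that all attribute to the junior partner is the kind of signal behind the attribution finding below.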

Patterson commissions his own studies to figure out what the audience wants. He has developed writing into team or factory art.

O'Sullivan's study showed how the collaborators are stylistically distinct, with one exception: a Patterson-only novel which seems to have actually been co-written with...

O'Sullivan ended by talking about literary capitalism. Patterson, like Dumas before him, is a brand. His name is a trademark designed to sell books.

Panel: Culturing Tools

I was on a panel on TAPoR (Text Analysis Portal for Research) where we talked about ways of treating tools as research.

We had an interesting discussion about how to get reviews for tools. How do you get peers to review them?

Panel: The Lifecycle of Interfaces

This was another panel that I am involved in. The panel reports on research from the Interface Design group of the INKE project:

  • The panel started with a short presentation by Sarah Vela about a study of the evolution of the interface of Perseus. She talked about a framework for reading interface change and gave two examples.
  • Mihaela Ilovan looked at how to do citation analysis of books. She has been finishing a prototype of a tool called CiteLens that lets one explore citations coming out of a monograph.
  • Laurentia Romaniuk presented on our models for what Perseus could look like on a mobile interface. Youth are using mobile devices more than desktops - it is time to redesign curated knowledge for mobile media.
  • Luciano Frizzera presented on workflows and the really neat WrkFlx tool. He showed an evolution of workflow interfaces that led to his web tool for editing workflows. He then talked about play and a board game that came out of the workflow work. What if people could create games out of their workflows? He ended by showing the game DH Experience that they developed.
  • Tianyi Li talked about a project to archive the evolution of interface. Our case study was the Perseus project.

Issues about the preservation of digital knowledge for research are important both to this panel and to the Culturing Tools panel. I concluded by talking about the need for Research Data Management plans.

Panel: Thinking Critically About Information Visualization

This panel was on visualization in the humanities.

Tomoko Ichikawa talked about "Confusing, Misleading, and What Now?: Critiquing Visualizations." She adapts the 5E model (Entice, Enter, Engage, Exit, Extend) to critique visualizations following these criteria:

  • Perceptibility - users should be able to perceive it
  • Pre-Knowledge - give users some instructions if the viz is new
  • Comprehension - understanding what the visual is communicating; make sure people can understand it
  • Utility - giving clues as to what is interactive or what is an affordance
  • Interpretation - what meanings do people take away - does the system inspire more questions
  • Engagement - is it fun, novel or are there alternatives
  • Outcome - what people are going to take away from it (back to life or work)
  • Purpose - why was this made and what is it used for

John Simpson then talked about "Hermeneutics of Visualization"

Interactive visualizations invite dialogue. The experience of interacting can approach that of developing a tool (which might be an answer to Steve Ramsay on building). Interactive visualizations are a looking with rather than a looking at.

We wondered whether a tool can lie to you - where the dialogue is effective, but misleading.

Ryan Chartier talked about "Untangling the 'Hairball': Revealing Networks in Big Data"

Ryan talked about a project that scraped large numbers of tweets. We started by grabbing all tweets about Rob Ford and now have around two million. He showed an interactive tool for looking at the top hashtags that have to do with Rob Ford. The visualization can be seen here.
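Counting top hashtags from a dump like this is simple. A minimal sketch, assuming one tweet per line in standard Twitter API JSON (the file name is hypothetical):

    import json
    from collections import Counter

    counts = Counter()
    with open("robford_tweets.jsonl") as f:   # hypothetical tweet dump
        for line in f:
            tweet = json.loads(line)
            # Standard v1.1 tweet objects carry parsed hashtags in "entities".
            for tag in tweet.get("entities", {}).get("hashtags", []):
                counts[tag["text"].lower()] += 1

    print(counts.most_common(20))

At two million tweets this still fits comfortably in memory, since only the hashtag tallies are kept.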

John Montague then talked about "Visualizing Better Dendrograms"

John started by talking about big data. He asked "how do we use visualizations to find things we don't know?" He surveyed a number of text mining visualizations. He then showed and discussed an alternative way of representing a dendrogram, which shows clustering.

Kim Irwin talked about "Now You See It: Task-Relevant Representations in the Humanities"

Kim started with a blog post by Stéfan Sinclair in which he applied different D3 visualizations to the same data. A data-centric approach starts with the data and its properties. A task-centric approach starts with the task and what one wants to see. (Is there a look-centric approach that starts with the look you want?)

Kim presented 3 principles to guide us:

  • Pick a visual structure that matches the analytical task
  • Pick visual variables that fit the way we see (she recommended Jacques Bertin's typology of visual variables). She talked about how cognitively we focus on the centre of the visual field and therefore should put information there. Radial and circular graphs put nothing in the centre, which is a problem.
  • (And I missed the third principle.)

Kim showed us a neat app called Vispo (Visual Poetry).

Tuesday, May 27th

David Plotz: Fast, Cheap, and Out of Control

Kevin Kee introduced David Plotz by talking about Slate, of which Plotz is the editor-in-chief. Plotz offered to talk about pandas, but didn't.

Plotz talked about the news from the world of journalism, which is bad and getting worse. The wise ones are tolling the death knell of standards and journalism. Plotz thinks that perhaps all the things that are supposed to be terrible about the Internet may not be so bad.

He talked about an investigation of the Nobel Prize Sperm Bank that he conducted in 2000 and that unfolded over months. It happened because he had the time. He was hired for Slate and hadn't been on the web. From the beginning the wise men were sceptical of internet journalism. The Drudge Report breaking the Monica Lewinsky story was the first encounter. Then blogs came along and people complained about those. Then the Huffington Post. Then BuzzFeed, which is all about cat videos.

In the meantime newspapers were dying. Now digital journalism is journalism (and there is way too much of it). The NYT publishes 200 stories a day, the Huffington Post publishes 500, and so on. Then there are the tweets, vines and podcasts - everyone is publishing everything all the time. We are drowning in trivia.

Most hand-wringing about journalism is done by journalists or professors of journalism. The real purpose of journalism is to inform and entertain the public. By that standard we are in a golden age - we have never had so much news so fast and so cheap. Governments have opened their data mines to us. With open and accessible data we can make informed judgements instead of just speculating.

There are also new forms of digital journalism. Obsessional digital journalism that is tightly focused, like Politico or game/tech journalism. Explanatory journalism like Vox that clearly explains issues. Interactives are now available to help people. There is a surge in social journalism - like BuzzFeed. Audio journalism - now we have podcasts of all sorts. Public radio is more accessible and getting much larger audiences through the net. Even investigative journalism has benefited - the Intercept and Glenn Greenwald, for example. ProPublica is another.

Long-form journalism is also back. The internet is not just distraction; people are reading longer stories.

Why is the explosion happening now? It is possible and costs almost nothing - for 10K you can set up a reasonable enterprise. The mainstream media have shed so many jobs, and the recession means that there is a pool of very good people who will work for little. Digital publishers are inventing new business models fast.

There are, for Plotz, 4 things wrong with the digital revolution:

  1. We risk losing the newspaper - mid-sized newspapers can't survive. Little ones and global ones (Guardian, NYT) survive. The loss of the mid-sized means that there aren't papers to hold cities to account.
  2. Can digital journalism create jobs with dignity that people can live off? It is great that youth are getting jobs, but senior people aren't being hired.
  3. Speed is a problem. Most sites that survive on being current update constantly. Frenetic pace means that people write faster and more and it is sloppy. Haste may not be as dangerous as people think, but sometimes people get harmed by sloppy reporting.
  4. The increase in speed has been accompanied by an ideological sorting. We read in enclaves sorted by left and right news. The new digital is good at giving you what you want, but it doesn't give you what will surprise you. The cost is that we are increasingly sheltered, and it goes beyond the news. We live in communities and work in jobs with people like us. Plotz has found that it is commercially necessary to play to one side or the other.

Round-table: From New Media Journalism to Digital Humanities

After the Plotz talk, he generously agreed to join us for a conversation about the shifts in journalism and the academy. I was part of the panel and was struck by how patiently he answered our questions. Some of the issues discussed included:

  • Will the unbundling of albums and newspapers that has been accelerated by the web also affect the university? He believes an important part of the university experience has to do with the space - i.e. the social learning (and life) that takes place on campus in clubs and so on. This can't be replaced by MOOCs.
  • Does information lose context when published on the web and is this a problem?
  • He was asked about journalism schools (j-schools) and answered that he likes them, because otherwise you end up hiring people who know each other. The j-school allows people who don't have connections to become journalists.

Jan-Christoph Meister: The DH Paradox: Challenges and Opportunities for the Hermeneutic Disciplines

He defines the humanities as the hermeneutic disciplines and he defines hermeneutics as "the theory based, methodologically controlled explication of meaning encoded in human symbolic artefacts."

He quoted Sontag's anti-hermeneutic sensualism to the effect that what matters is the erotics of art. Meister claimed that the problem is actually big data. Meister feels we should not give up on meaning through analysis and hermeneutics.

The function of the digital humanities should be to help us understand how human artefacts became and are what they are, and to explore what they might have meant and might mean, rather than to determine what they are and mean.

He worries that purportedly objective techniques are more dangerous. We need to bring phenomenology back in. For example, we experience phenomena like temperature as analogue not as samples that are quantified. Zeno's paradox is another example and he had a great video of a fractal.

This is what he called the DH paradox - the tension between phenomenology and formalization.

Then he talked about Pi and its calculation, which was initially observed and approximated. Can we find the "Pi of narrative"? Narratology tries at formal approximation. Genette's "Discours du récit" and Propp's formal model of narrative were derived from reading stories. There is some question as to whether Propp actually analyzed Propp.

Chris then moved to big data: "In God we trust - all others must bring data" (Google). Through technology we have multi-sensory data and the data can be transcoded. It is tempting to think statistics will then do everything, but Deming showed some of the limitations of statistics.

What we do in the humanities is a highly recursive set of practices. He doesn't just want a tool, but the ability to interpret the text. CATMA allows you to add hermeneutical markup before analysis. There is thus a hermeneutical circle of interpretation and intervention. He then talked about CLÉA (Collaborative Literature Exploration and Annotation), which allows groups to negotiate interpretative markup.

What Meister now wants is to get automated markup. heureCLÉA takes the expert markup and feeds it into a machine-learning module that is then trained to be able to do it semi-automatically. Thus heureCLÉA tries to statistically model narratological perspectives. heureCLÉA is not fully automated, but uses human markup to train a system to then make suggestions back to you for more markup.
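The train-then-suggest loop is easy to sketch. A minimal illustration with scikit-learn, assuming sentence-level expert labels for one narratological category - heureCLÉA's real features and models are surely richer, and the example sentences are invented:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    # Expert markup: (text unit, does it signal a narrative time shift?)
    labelled = [
        ("The next morning he rose early.", 1),
        ("Years later she returned to the house.", 1),
        ("He looked at her and smiled.", 0),
        ("The room was dark and quiet.", 0),
    ]
    texts, labels = zip(*labelled)

    vec = TfidfVectorizer(ngram_range=(1, 2))
    clf = LogisticRegression().fit(vec.fit_transform(texts), labels)

    # The system suggests markup for unseen units; a human accepts or
    # corrects, and the corrections feed back into the training set.
    new = ["Two days afterwards the letter arrived."]
    print(clf.predict_proba(vec.transform(new)))

The hermeneutical point survives the automation: every suggestion routes back through a human reader.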

The DH paradox is an old story, Heinrich von Kleist (1834): "one might separate humankind into two classes: those capable of metaphors and those capable of formulas" (I missed the end.)

Meister had a good answer to my question about how he would respond to Moretti's claim that there is too much interpretation and that this sort of annotation can't be used on big data. He responded that the two go hand in hand: smaller annotation approaches can train for big data, and big data can generate hypotheses that can be tested in closer work. The question is why Moretti feels he has to distinguish what he is doing from interpretation.

Innis Across the Disciplines: New Insights, New Opportunities for the Digital Humanities, Communications and History

Viv Nelles, William Buxton, and I presented on John Bonnett's book Emergence and Empire.

I spoke about how Innis and Bonnet can help us understand this moment of communication technology change. Innis doesn't just tell us about how changes in technology can lead to societal changes, he also talks about what comes after - the restorative measures that can help rebalance things.

Viv Nelles talked about how Innis is unlikely to be resurrected, despite John Bonnett's work. He talked about the "cod-liver oil" argument for reading dead historians like Innis - namely that it is good for you. Nelles gives three reasons:

  • For Nelles it takes too much work to extract applicable ideas from Innis. Innis is also so opaque that you can read whatever you want into him if you have faith. (I would add that McLuhan was even more opaque and suggestive. If what you want is an oracle you can mine, McLuhan is better.)
  • For Nelles, if Innis presents a theory of emergent systems of interest today, there are better places to learn about such systems than Innis.
  • Innis still lives in the field of the study of Innis. History has moved on to the self, the sexual and the social. The centre of gravity has shifted away. The one place where Innis has had an influence is in environmental history.

He ended by saying that at McMaster Innis will probably no longer be taught in history, but he will be taught in Communication Studies and Multimedia.

John Bonnett then responded. In response to my concerns about big data, John's reading of Innis is that there will always be critics who help rebalance things - it isn't so dire. Bonnett's response to Nelles is that Innis makes sense to a wider public. He feels that Innis is relevant to global history and that we need to bring in spatial computing techniques.

Wednesday, May 28th

Annual General Meeting

We start the day with our Annual General Meeting. Some of the news:

  • We voted on a change to our organization. We will have just one President and two Vice Presidents (one English and one French).
  • The CSDH-SCHN Outstanding Contribution Award went to the Day of DH project I was involved in!
  • Dugan O'Neil gave us a presentation on Compute Canada, which now does Advanced Research Computing. They are developing services that are genuinely helpful to us, like adding ownCloud for moving data around. They are talking about a portal service competition.
  • Canadiana: We had a presentation about the Canadiana project talking about the new materials they are adding. They are partnering with CRKN.

Visualization, Epidemiology and Contagion

A bunch of us were on a panel on visualization.

John Simpson: The Epidemiology of Ideas

John Simpson talked about our attempts to study how ideas evolve - something we call the epidemiology of ideas. We got full text from JSTOR for about 10 journals in philosophy that we are trying to mine. He talked about the amount of computing power needed to compare hundreds of thousands of articles to each other. We want to do sequence alignment over large numbers of journals. He argued that we tend to back down from really big computational problems. Big numbers are important, both politically and because we can see things at scale.
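To make the scale concrete, here is a minimal sketch of pairwise comparison, with difflib standing in for the real sequence-alignment algorithm; the article texts are toy stand-ins:

    from difflib import SequenceMatcher
    from itertools import combinations

    articles = {   # toy stand-ins for JSTOR full texts
        "a1": "the idea of justice as fairness spreads",
        "a2": "justice as fairness is an idea that spreads",
        "a3": "an unrelated paper about modal logic",
    }

    # Every article against every other: quadratic in the corpus size.
    for (i, t1), (j, t2) in combinations(articles.items(), 2):
        score = SequenceMatcher(None, t1.split(), t2.split()).ratio()
        print(i, j, round(score, 2))

    # With 300,000 articles this is roughly 4.5e10 pairs, hence the cluster.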

We had an interesting discussion about how to use Mallet on our data in WestGrid.
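For anyone curious what "using Mallet on our data" looks like mechanically, here is a minimal sketch of driving Mallet's two standard commands from a Python batch script of the sort you might submit to a cluster queue; the install path and file names are hypothetical:

    import subprocess

    MALLET = "/path/to/mallet/bin/mallet"   # hypothetical install location

    # Import a directory of plain-text articles into Mallet's format.
    subprocess.run([MALLET, "import-dir",
                    "--input", "journals_txt",
                    "--output", "journals.mallet",
                    "--keep-sequence", "--remove-stopwords"], check=True)

    # Train a topic model and write out the topic keys and document mixtures.
    subprocess.run([MALLET, "train-topics",
                    "--input", "journals.mallet",
                    "--num-topics", "50",
                    "--output-topic-keys", "topic_keys.txt",
                    "--output-doc-topics", "doc_topics.txt"], check=True)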

There was also the beginning of a discussion about whether we should be inventing projects for HPC just to use it. This raises interesting questions about the way the humanities interact in the larger conversation. The meme of "big questions" and "big data" has gained traction with governments and the public - do the humanities need to identify the big questions we are working on, or change our research to be understandable as "big"?

Ernesto Peña: Viewer Susceptibility, Metaphorical Entailments, and the Glass Cast Prototype

Ernesto started by talking about workflow prototypes. His group has been doing usability studies on INKE prototypes. One thing they found was that people participate in interface studies in different ways: some participate and then think about the interface; some think about it while doing. He then asked what we are doing in interface studies. He feels we want generative ideas and proof-of-concept ideas, not bug testing. He is wondering if removing the working model might increase the chances that participants give feedback on the design, not the bugs.

The Glass Cast prototype is a neat sort of DNA-helix way of exploring citations and influence. Imagine a column with horizontal links and vertical ones. Participants were given a paper prototype and a pen and then videotaped (from above) as they commented on the paper designs. They were trying to understand what metaphors worked for users and whether metaphors are useful. One participant found their design distractingly "artsy", which suggests that metaphors can hinder comprehension.

Monica Brown: Contagion Rhetoric in Visualization Scholarship: A Critical Perspective

Monica Brown gave a paper talking about the use of the contagion metaphor.

Watching Olympia: Reading Interface

I gave a paper talking about reading the CSEC slides and the software they demoed.

Digital Demos

This year we ran an alternative to a poster session which we called Digital Demonstrations. The idea was that papers showing software should have a table and computer so they could show their innovations. We got a great turnout. Some of the neat projects:

  • SylvaDB, Design and Creation of Graph Databases - a cool graph database from the CulturePlex
  • IRC Mine: a free software tool for analyzing IRC conversations - a neat online tool to mine IRC chats
  • Rapid Cataloguing: Using Digital Cameras and Wi-Fi Enabled SD Cards - a tool that pulls images over wifi from a digital camera into a bibliographic database. The tool polls your camera and ingests any new images automatically. Very cool idea (see the sketch after this list).
  • Festos - a cool tool for OCR workflow. Very clean and it connects to Dropbox.
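The polling idea behind the rapid-cataloguing demo is worth sketching. A minimal version in Python, assuming the wifi SD card syncs images into a local folder; the folder and database names are my inventions, not the demo's:

    import os
    import sqlite3
    import time

    WATCH = "camera_inbox"   # hypothetical folder the wifi card syncs to
    db = sqlite3.connect("catalogue.db")
    db.execute("CREATE TABLE IF NOT EXISTS images (path TEXT PRIMARY KEY, added REAL)")

    while True:
        # Any file not yet in the table is a new photo; ingest it.
        for name in os.listdir(WATCH):
            db.execute("INSERT OR IGNORE INTO images VALUES (?, ?)",
                       (os.path.join(WATCH, name), time.time()))
        db.commit()
        time.sleep(5)   # poll every few seconds

The real tool presumably also extracts metadata and links each image to a bibliographic record.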

ACH Panel: Microsteps to Advance the Digital Humanities

Jarom McDonald started by talking about the issue of perceptions of the digital humanities. 10% of humanities professionals self-identify as working in the digital humanities, but only 2.5% of humanities funding grants go toward DH-inflected projects. What can we do to engage the larger humanities community and get people to think differently about funding? The ACH started the Microsteps programme to give small grants to smart new scholars. They have now funded 6 microgrants.

Roopika Risam then talked about "Digitizing Chinese Englishmen and ACH Microgrants". She talked about how texts represented in projects about 19th century writing tend to preserve writing by UK and American writers. "As a result, the projects tend to obscure connections between the English language and empire." The role of the colonies has yet to be fully integrated into how we think of Victorian English literature. We have to address archival silences - remedy the absence of colonial writing in the archives. The microgrant seeded a project that is now getting support from elsewhere. They are looking at shifting to Scalar and trying to tie into NINES.

Jarom channelled Scott Weingart on assessing relevancy in DH. Scott had some interesting graphs about growth in the DH community. He also had an interesting analysis of the words in accepted abstracts in America versus Europe: in North America words like "visual" rank higher; in Europe we see "archive". He had very interesting data about acceptance rates and how they correlate with keywords (text analysis does well).

Jeri Wieringa talked about "Bringing Rails Girls to the Digital Humanities". She talked about the gender gap in technology and the digital humanities and about an event she organized called Rails Girls DH. Coding tends to get more attention, which is why there have been projects to encourage coding among women. But that isn't really enough: there is stereotype vulnerability (the fear that your failure will be taken as confirming a stereotype). Their event was a day-long coaching session on Ruby on Rails. They had great participation and there is a lot of interest in the idea. There is, however, a limitation: microgrants and one-off events aren't good for building community. You need repetition and follow-up.

Corpus Analysis

The last session of the conference was on corpus analysis.

Grace S. Fong, Song Shi: Inclusion and Exclusion: Patterns of Selection and Distribution in Anthologies of Women’s Poetry in Late Imperial China (17C-early 20C)

Fong and Shi presented on the Ming Qing Women's Writing project. They are digitizing and making available Chinese women's literature. It is not a full-text collection - you search the metadata and get page images through a "page turner". In the mid to late 16th century there was a marked increase in anthologies of women's poetry, which shows that women's poetry was becoming important; anthologizing is an important trend. They are studying these anthologies as a way of studying the explosion of women's poetry.

Shi talked about why they used Microsoft Access. Access seems to have been a convenient way to deliver databases to researchers. It is also a bridge to MySQL, SPSS and ArcGIS, and it can run on individual researchers' computers. They generated hypotheses to test their tools and research - e.g., the hypothesis that female editors of anthologies are more likely to select poems titled with the character "bing" (illness). They created a neat map of where the poetry was coming from.
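A hypothesis like the "bing" one reduces to a single query once the data is in a relational database. A minimal sketch via Python's sqlite3 rather than Access; the table and column names are invented for illustration:

    import sqlite3

    db = sqlite3.connect("mingqing.db")   # hypothetical export of their data
    rows = db.execute("""
        SELECT a.editor_gender, COUNT(*) AS n
        FROM poems p JOIN anthologies a ON p.anthology_id = a.id
        WHERE p.title LIKE '%病%'          -- bing, "illness"
        GROUP BY a.editor_gender
    """).fetchall()
    print(rows)

Comparing these counts against each gender's overall selection totals would then give the relative likelihoods the hypothesis is about.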

Harvey Quamen: Visualizing Discourse: Archival Interfaces Using Topic Modeling and Vocabulary-Management Profiles

Harvey talked about his work on Sheila and Wilfred Watson as part of the EMiC project. They are doing topic modeling and then visualization. You can see a first tool here. He talked about Vocabulary-Management Profiles (VMPs) and how he is using topics with VMPs, referencing a paper by Gilbert Youmans. A VMP depends on context, as it measures new vocabulary: if you place a document in a different context, the words that are new will be different.
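A VMP in Youmans's sense is simple to compute: mark each token that is the first occurrence of its type in the text, then count those marks within a moving window (Youmans typically used a 35-token interval). A minimal sketch:

    def vmp(tokens, window=35):
        seen = set()
        first = []   # 1 where a token is the text's first use of its type
        for w in tokens:
            first.append(0 if w in seen else 1)
            seen.add(w)
        # VMP value at each position: new types inside the moving window.
        return [sum(first[i:i + window]) for i in range(len(tokens) - window + 1)]

    text = "call me ishmael some years ago never mind how long precisely".split()
    print(vmp(text, window=5))

The context-dependence Harvey mentioned falls out of the definition: prepend a different document and a word's "first occurrence" moves, so the profile changes.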

Harvey had interesting things to say about how people assume that in the digital humanities we are doing structuralism. He drew our attention to "Structuralism: Thinking with Computers" and Johanna Drucker's critique of Moretti.

I really liked how Quamen was thinking about studying a large collection of letters. He was sensitive to the dangers of assuming that changes in the corpus reflect changes in the authors. A collection of letters may not be as coherent as we would like; it may not be a map to an author's changing thought.

Jon Saklofske: Meta-adapters: Mediating compatibility to enhance the scholarly potential of scattered humanities data

Jon asked how we can make corpus study as easy as uploading to Voyant. He talked about the lack of standards across projects, which makes it hard to create cross-project corpora for study.

He talked about demoing tools and how tools can evolve as they are used for different theoretical issues. New Radial is a tool he has been extending to do different things. He now has an adaptor feature that allows texts to be brought in from different collections like http://dp.la and Europeana.

Instead of standardizing formats (as Tim Berners-Lee wants) New Radial takes an adaptor approach. This allows individual projects to keep their variety.
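The adaptor approach is easy to picture in code. A minimal sketch, with invented item shapes loosely modelled on dp.la and Europeana metadata - not New Radial's actual interfaces:

    from dataclasses import dataclass

    @dataclass
    class Record:          # a common record shape (my guess, not New Radial's)
        title: str
        source: str

    def dpla_adaptor(doc):       # hypothetical dp.la item
        return Record(title=doc["sourceResource"]["title"], source="dp.la")

    def europeana_adaptor(doc):  # hypothetical Europeana item
        return Record(title=doc["title"][0], source="Europeana")

    docs = [({"sourceResource": {"title": "Letter, 1915"}}, dpla_adaptor),
            ({"title": ["Carte postale"]}, europeana_adaptor)]
    print([adapt(d) for d, adapt in docs])

Each new collection costs one small adaptor, and no project has to change its native format.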

He talked about a meta-adaptor, which I didn't understand but which is a great idea. I also didn't understand how he avoided dropping things to the lowest common denominator the way Abbot does.

The End of the CSDH-SCHN conference.

Thursday, May 29th

On Thursday I visited some CGSA sessions.

Federica Giannelli and Benjamin Neudorf: Visual Memory and Gameplay in a Collaborative Bibliographic Management Tool and Game-Based Mapping Project

Federica and Benjamin presented Ref-Scape, which gamifies the collaborative building of a shared bibliography on Zotero. They use D3 to create a visualization that lets you see how things are connected and to connect items. They also have a scoring system to encourage people to contribute. They talked about the location of items and memory palaces.

The environment seemed to me potentially more useful for social network analysis: if it can analyze an open Zotero group, you could use Zotero as a source for social network data.
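Pulling an open group library for that kind of analysis is straightforward with the Zotero web API. A minimal sketch using the pyzotero client; the group ID is a placeholder, and the co-creator edge list is just one simple way to get a network out of a bibliography:

    from pyzotero import zotero

    # Public group libraries can be read without an API key.
    zot = zotero.Zotero("123456", "group")
    items = zot.top(limit=50)

    # Emit co-creator pairs as edges for a social network graph.
    for item in items:
        names = [c.get("lastName") or c.get("name", "")
                 for c in item["data"].get("creators", [])]
        for i in range(len(names)):
            for j in range(i + 1, len(names)):
                print(names[i], "--", names[j])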

John Montague & Luciano Frizzera: Have you ever been DH-experienced

John talked about the DH Experience game. He gave background on the development of the board game, which came out of work on workflow (see Luciano Frizzera's WrkFlx tool). We wanted to adapt the tool to gaming, so we thought we should prototype on paper first. Now Luciano is working on an online version of the game built with WrkFlx.

Sonja Sapach: Gotta Catch 'Em All: The Compelling Act of Creature Collection in Pokemon, Ni No Kuni, Shin Megami Tensei & World of Warcraft

Sonja talked about how collecting is so compelling. She crossed theories of collecting from the West with Japanese games. Collection is based on classification and goes back to Noah (in the sense that Noah is an archetypal collector, even if he didn't really collect). Shared features of the games she talked about:

  • Hundreds of collectible pets
  • Collected pets are battled - they are a representative of the player
  • There is a relationship between types of creatures and who beats who - a whole classification.
  • Design of creatures is "moe" - to provoke an emotional response

She identified 4 possible themes:

  • Exploration - creator of Pokemon wanted a game
  • Organization -
  • Specialized knowledge
  • Immortality

Sapach used Azuma's idea of database animals. The creatures collected are part of a grand non-narrative. Collecting has been streamlined to just hunting, acquisition, post-acquisition, and manipulation/display/cataloguing.

Jérémie Pelletier-Gagnon: Self-regulation as a System: Policing Erotic Video Games in Japan

Jérémie started with the CNN story about Rapelay and gamified violence. Facing international concerns, the game designers withdrew the game from Western markets. Some people saw the game as indicative of something about Japan. He took Lah's question "Why would Rapelay thrive in Japan?" and turned it into "What has allowed problematic games to survive in Japan?" To understand, you have to go back to the 1980s and games like Lolicon Syndrome (Enix, 1983). 177 (Macadamia, 1986) was one of the first rape games; "177" is the number of the Japanese rape legislation. It was presented in Japan to legislators as an example of unregulated games. A few years later Saori: Bishojotachi no Yakata (1991) was investigated. Police searched the company headquarters and arrested the president.

Japan in 1992 saw the creation of the Ethics Organization of Computer Software (EOCS) - a business association that rates products (not just games). EOCS got into conflict with publishers. Erotic game producers didn't have confidence in EOCS, so some formed their own association, the CSA. EOCS relaxed their guidelines and rated Rapelay as having no violence.

Why are there so many published erotic games? Jérémie argues that the system is essentially self-regulatory. The system is somewhat effective when it comes to child pornography games. It is also clear that the Japanese government doesn't want to regulate the industry.

I wonder why the government doesn't intervene?

Domini Gee: Visual Novels and the International Fan Community

Domini started by defining visual novels. They are big in Japan, but a niche interest outside. She wanted to look at the international fan community. Fans have organized translations and distribution. She looked at surveys of the fan community. She talked about some of the big web sites like the Visual Novel Translation Status Tracker. Translating is a lot of work when these games can have as much text as an 800-page book.

The relationship between fans and companies is complicated. She talked about an eruption between the company Minori and fan translators. Minori said that the game was developed under Japanese law and that they didn't want to risk international attention: "Ethics is different from country to country." The translators seem to use arguments similar to those of people who pirate music or software - "you force us to by not doing it yourself." She talked about a fan effort on Steam to get the company to reach out to the original developers.

Mimi Okabe: From Pain to Pleasure: An Exploration of Rape Fantasy in Japanese Boys’ Love Visual Novel Games

Okabe is looking at Boys' Love manga and games. She looks at themes as the Japanese discuss them, like Uke/Seme (receiver/attacker), Bishonen (beautiful boys), and Asobigokoro (playfulness). How should we think about boy-on-boy rape in BL games?

Her example was Enzai - a visual novel game that depicts man on man forced sex. As a visual novel game it doesn't have a lot of interactivity. There are different endings.

Uke/Seme paradigm - a paradigm of attack and domination. The Uke is typically the submissive, stereotypically "female" character. The paradigm seems to reinforce a patriarchal hierarchy of violence and sex. Some critics have argued that these representations are a fantasy and not read as real. These bodies are neither male nor female, and the fantasies may be...

Bishonen - beautiful boys with slender, hairless, toned bodies and mysterious-looking eyes. The bishonen can be read as a figure of resistance. In Enzai the bishonen have lots of scars.

Asobi (play) - Mimi argued that Enzai is responding to bishonen media - playing with it. The graphic depictions of bodies, blood and so on seem to be designed for female gratification. Male-on-male sex (and rape) is idealized for female fantasy.

We had a nice discussion at the end about representations of sexuality in Japan.
