
CSDH-SCHN 2018 And CGSA 2018

These are my notes from CSDH-SCHN 2018 and then CGSA 2018.

They are strictly my interpretation of the talks I went to. I also don't take notes all the time.

Saturday, May 26th, 2018

Networks in Text and Space

Corpora and Text Mining the Eighteenth Century: Catherine Nygren

Nygren talked about gathering and mining 18th-century travel texts. Deduplication was a major issue as she got texts from many sources. Excel has a nice Fuzzy Lookup Add-In that actually worked well. She used Python for the mining. She now has over 4000 texts in her bibliography. She compared the distribution of texts from ECCO to that of HTRC, which allows one to see the idiosyncrasies of different collections.
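
The deduplication problem is essentially fuzzy string matching. To give a sense of the idea, here is a minimal Python sketch (the titles and the similarity threshold are made up, and this is not Nygren's actual pipeline):

    from difflib import SequenceMatcher

    # Hypothetical bibliography entries pulled from different collections
    titles = [
        "A Tour Through the Whole Island of Great Britain (1724)",
        "A tour thro' the whole island of Great-Britain. 1724.",
        "Travels into Several Remote Nations of the World (1726)",
    ]

    def similar(a, b, threshold=0.85):
        """Return True if two titles are close enough to count as duplicates."""
        return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

    deduped = []
    for t in titles:
        if not any(similar(t, kept) for kept in deduped):
            deduped.append(t)

    print(deduped)  # the second title is folded into the first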

"Or, The Whale": Stylistic Analysis of Melville’s Cetology Narratives: Chelsea Miya

Texts carry traces of the factors that influence their writing. This is the foundation of stylometry. Miya tested stylometric techniques on a single novel, Moby Dick. She wanted to test the theory of others that Moby Dick was created from two separate texts.

Miya talked about the reception of Moby Dick and Melville's own thoughts. Melville thought Moby Dick would make him famous, but it didn't sell well and he died in obscurity. Later editors improved it (and abridged it). These shortened versions may have helped make the full novel such a classic. Miya wanted to compare the whale sections that got cut out to the adventure that was kept. Any chapter without any characters is a whale chapter.

She then used Stylo from Eder, Rybicki and Kestemont. She tested whether Stylo could disambiguate the chapters based on vocabulary, using cluster analysis of the most frequent words. She then tried network analysis to see how chapters might be connected.

She used the Zeta method to find words avoided or preferred in the two types of chapters. She did POS tagging and then clustered POS trigrams.
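
Stylo itself is an R package, but the core move of the cluster analysis - representing each chapter by the relative frequencies of its most frequent words and clustering those vectors - can be sketched in Python. The chapter texts below are placeholders, not her data:

    import matplotlib.pyplot as plt
    from sklearn.feature_extraction.text import CountVectorizer
    from scipy.cluster.hierarchy import linkage, dendrogram

    # Placeholder chapters; in practice these would be the full Moby Dick chapter texts
    chapters = {
        "ch32_cetology": "the whale is a spouting fish with a horizontal tail the sperm whale",
        "ch36_quarterdeck": "ahab stood on the quarter deck and spoke to the crew about the white whale",
        "ch42_whiteness": "the whiteness of the whale was what appalled me above all things",
    }

    # Relative frequencies of the most frequent words across the chapters
    vec = CountVectorizer(max_features=100)  # keep the 100 most frequent words
    counts = vec.fit_transform(chapters.values()).toarray().astype(float)
    freqs = counts / counts.sum(axis=1, keepdims=True)

    # Hierarchical clustering and a dendrogram, as in a Stylo cluster analysis
    tree = linkage(freqs, method="ward")
    dendrogram(tree, labels=list(chapters.keys()))
    plt.show()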

Scriblerian Mappings: Experiments in Visualizing 18th-century London's texts as Networks: Allison Muri

Muri talked about how normal digital editions map onto print. Muri's edition is now HTML 5, not TEI. She still pays attention to the document as a document, but she marks up people, events, and places. The idea is to make an edition of London as a rich place of literary allusions. Rather than the intertextual she is looking at the pantextual. She is drawing on McGann's idea of the social text as a network of texts in social relations.

She has a database and multiple zoomable maps. You can see relationships of people to place or books to place. She then talked about Pope's Dunciad and her digital edition of this.

Muri used HTML 5 rather than TEI to be able to control the design better. HTML 5 makes it possible to have semantic tags so one has the best of both. HTML 5 has microdata - data recognized by search engines. She showed examples.

She is now generating JSON data so that one can do force-directed graphs that let one examine relationships. She compared graphs for different texts.
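
Given node and edge data in JSON, a force-directed graph is a few lines with networkx. This is only an illustrative sketch; the JSON format shown is invented, not the format her edition exports:

    import json
    import networkx as nx
    import matplotlib.pyplot as plt

    # Invented example of the kind of JSON an edition might export
    data = json.loads("""
    {"nodes": ["Pope", "The Dunciad", "Grub Street"],
     "edges": [["Pope", "The Dunciad"], ["The Dunciad", "Grub Street"]]}
    """)

    G = nx.Graph()
    G.add_nodes_from(data["nodes"])
    G.add_edges_from(data["edges"])

    # spring_layout is a force-directed layout
    pos = nx.spring_layout(G, seed=42)
    nx.draw(G, pos, with_labels=True, node_color="lightblue")
    plt.show()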

Workflows and Architectures

Code Notebooks: New Tools for Digital Humanists: Kynan Tran Ly, Kaitlyn Grant, Robert Budac, Jinman Zhang, Greg Whistance-Smith, Geoffrey Rockwell, Anthony Owino, Jason Bradshaw, Stéfan Sinclair

Ly introduced literate programming and showed Jupyter notebooks. He talked about the advantages of code notebooks:

  • A portal into coding
  • Portable
  • Ability to share resources and code
  • Archive research
  • Reproduce others' results

He talked about training materials and the materials we have put up on TAPoR. He then talked about Spyral which is a notebook environment that is being developed to extend Voyant. This work has been supported by the SSHRC-funded partnership on Text Mining the Novel.

The Role of Digital Humanists in the Implementation of Computerized Workflows: Maryse Kiese

Kiese presented in French about workflows. She argued that workflows are important as they communicate knowledge about processes. She mentioned Raymond Williams, who talked about different ways of understanding culture. How are workflows for big data visualized and conceived, and for whom? She quoted someone on how important it is that people understand informatics; workflows made visible can help with that. Humanists don't contribute enough to the creation of tools. Workflows are a way of bringing them in.

In big data it is important to have domain experts involved in the iterative exploration of patterns. A workflow is an "abstraction of steps for executing a real world process." A workflow can thus communicate anything from a simple to a complex process.

Kiese has been working on machine learning with computer scientists to look at the "aboutness" of texts. If humanists participate in the whole process (as opposed to the end when the tools are made) it will help big data and our knowledge.

She then showed some workflows and talked about "trigger words". She is using Weka, which lets her implement her workflows.

I like the idea of workflows being a site for interdisciplinary work.

Textual Communities -- what and why: Peter Robinson

Robinson has been talking about textual communities for a long time and now it works. See http://textualcommunities.org . He then demoed the system.

They have 30,000 manuscript pages of the Canterbury Tales and are working on the transcription. He is employing a large number of transcribers which means the system needs to handle all the people and roles.

Everything is in a publicly accessible database that anyone can call. He then showed collation tools.

There is a sandbox version which allows one to try things out. If your project is flourishing then you want to move over to the production version.

It is working! This has taken decades. Why did it take so long? Part of the problem is that texts overlap pages. This is the multiple-hierarchy problem in XML. He treats each page as a complete TEI document, but the system can bring the pages together. Part of the definition of a text is that it is at least two things:

  • A physical object
  • An act of communication

Thus it is at least two hierarchies. It isn't really overlapping hierarchies, but two trees with the same leaves.

He talked about things not to do: when making an edition, don't make the software for it. There is a better way. He now wants others to make better systems.

Robinson then talked about the International Image Interoperability Framework, which allows distributed software for images. Alas, people who say they will use it are not doing so. Libraries are hiding what they have. The attitude is that everyone wants to create their own system, and they want to own the images in order to control them. These habits are reinforced by the traditions of the academy. Data and tools are in silos because we like it that way.

The Textual Communities system forces you to commit to an open license. We aren't trying to build something so much as provide affordances for others to build software on.

Format and Translations

Methodological Translation: _Lexicons of Early Modern English_ and TEI: Tim Alberdingk Thijm, Ian Lancashire

Ian Lancashire started by describing LEME (Lexicons of Early Modern English). This project has survived for decades by adapting with the times. It started in 1996, and in 2006 they used TAPoR funding to move it to MySQL and publish through the U of T Library. Now they have created a new interface and back end.

Tim Thijm then talked about the variety of formats for the different dictionaries included. TEI is great for interchange, but the style of their dictionaries doesn't fit it cleanly. Rather than create a new encoding format, they use a methodological translation: they have an encoding that suits them, but can translate it to TEI.

They use a pseudo-XML and then convert it to strict XML. He showed how their code does this. That XML is then translated into TEI.
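
I don't know the details of their pseudo-XML, but the general move - regularize a looser, human-friendly markup into well-formed XML that can then be transformed to TEI - might look roughly like this in Python. The input format here is invented for illustration:

    import re
    import xml.etree.ElementTree as ET

    # Invented pseudo-markup: an entry marker, a headword, and a gloss separated by a pipe
    pseudo = """
    <entry> abandon | to forsake utterly
    <entry> abase | to bring low
    """

    # Regularize each line into well-formed elements
    entries = []
    for line in pseudo.strip().splitlines():
        m = re.match(r"\s*<entry>\s*(.+?)\s*\|\s*(.+)", line)
        if m:
            entries.append(f"<entry><form>{m.group(1)}</form><sense>{m.group(2)}</sense></entry>")

    strict = "<dictionary>" + "".join(entries) + "</dictionary>"
    ET.fromstring(strict)  # raises an error if the result is not well-formed XML
    print(strict)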

He then talked about cases where a humanities expert has to intervene to make decisions. They use conservative rules, being careful about changing the order of the original.

Ian closed by arguing for openness.

New Format, Old Tricks? Texts’ Graphic Travels: Constance Crompton, Ruth Truong, Michelle Schwartz

Does the old tension debated in the PMLA about how databases are a genre (Folsom, 2007) still hold? Is the database a new genre? Is it a rhizome? Folsom also dealt with the database and how it was a natural genre for new media ("Database as a Genre of New Media", AI and Society, 2007). Computerization forces the structure of the database onto media content.

Hayles sees narrative and database as symbionts. Databases can include new content and maintain order; narrative loses its order if new content is put in.

The TEI was theorized by McGann in "Database, Interface, and Archival Fever". McGann undermines the idea of the database as the ultimate form for the 21st century. Johanna Drucker lightly chides the TEI community. Peter Stallybrass, in "Against Thinking", asks us to think of the commonplace - taking what is sweetest from each flower.

Crompton then talked about the Lesbian and Gay Liberation in Canada project, which is modeled in a graph database that allows you to not have to work it all out in advance. They use TEI for individual documents and the graph database (without a schema) for gathering them.

What might be the downside of schemalessness? Does it mean you can't check your entries? That you can't encourage standardized data?

Patterning Chaos: Periodical Publications and the Small Worlds of Literary Translators: Raluca Tanasescu, Chris Tanasescu

Raluca Tanasescu is working on a project looking at chaos theory and the translation of English poetry into Romanian after 1989. Everyone is a poet in Romania, so there are lots of anthologies and the like.

She put together the corpus and found that important venues were not listed in the national library. She looked at the social categories of translators, like academics or poets who sometimes translate.

She borrowed the idea of microspection from chaos theory to analyze the world of translators.

Chris Tanasescu then talked about experiments on social network theory and chaos.

Sunday, May 27th

Accessing Conceptual Domains

Adjoined Conceptual Domains in the Bilingual Poetry of Pablo Picasso: Luis Meneses

Meneses has been looking at Picasso's writings (poetry) and the challenges of interpreting them using computers given their visual composition.

He showed the On-Line Picasso Project - a system of integrated databases with art and associated narratives. They have tens of thousands of art works, articles, writings, and artefacts.

They are using semantic domains derived from his poetry. They first tried topic modelling, but that didn't work as the poems are too short. They have created a set of concepts and linked them to the words in different languages that Picasso used. How are the concepts created?

They know that Picasso worked with collage - are they inferring that he would work this way in writing?

They can also see which language (Spanish or French) is used most for a particular concept. They also look at parts of speech - Picasso likes verbs, perhaps as they convey action. They looked for correlations between language and semantic categories. Spanish seems to be used for nominal events and French for change, motion, social relations.
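
As a toy illustration of this kind of counting - which language Picasso uses for which concept - here is a sketch; the concept lexicon and the lines of poetry are invented, not the project's data:

    from collections import Counter

    # Invented concept lexicon linking a concept to words in both languages
    lexicon = {
        "motion": {"es": ["correr", "volar"], "fr": ["courir", "voler"]},
        "light": {"es": ["luz", "sol"], "fr": ["lumière", "soleil"]},
    }

    # Invented (language, line) pairs standing in for the poems
    poems = [("es", "la luz del sol"), ("fr", "courir vers la lumière")]

    counts = Counter()
    for lang, text in poems:
        tokens = text.lower().split()
        for concept, words in lexicon.items():
            counts[(concept, lang)] += sum(tokens.count(w) for w in words[lang])

    print(counts)  # e.g. how often "light" words appear in Spanish versus French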

Picasso said something like, "Computers are useless. All they can do is provide answers."

The Palatine Anthology project: an API allowing new interactions with the Greek epigrams: Marcello Vitali-Rosati

I had to surrender my laptop so I couldn't take notes.

Absorbing DiRT: Tool Discovery in the Digital Age: Kaitlyn Grant

Grant talked about a project to merge DiRT and TAPoR - two tool directories. McCarty commented on how DH does a poor job of keeping track of the objects of innovation that we develop. Bibliographies don't keep track of tools and digital resources well. Tool directories are one way to keep track of what we build. They go back to catalogues and lists and yearbooks. Grant gave a nice summary of the history of tool directories.

Both TAPoR and DiRT have changed over time. Sustainability is a real problem. In the case of TAPoR, the original portal was split up: TAPoR was simplified to just a tool directory, with the analytics being offered by Voyant and other tools. The idea is to keep resources relatively simple - do one thing well. Grant talked about how TAPoR has survived:

  • Keep infrastructure simple
  • Have a faculty champion who can get small grants and assistance
  • Adapt the infrastructure to different grant projects

DiRT, alas, was not sustainable. It had to be closed or merged. To do that, Grant had to crosswalk the fields, and programmers had to adapt TAPoR to merge the data.

This work has been supported by the SSHRC-funded partnership on Text Mining the Novel.

Abstraction and Big Data

Digitally Reading Canada Reads: Paul Barrett

Canada Reads is a sort of battle of the books on CBC. There is a theme each year that frames the public discourse. There is a lot of PR around the show and the books. We have a lot of stats on this mass reading event. It creates a sense of community through literature. Cultivating citizenship around the humanities has a history in McLuhan/Innis. Can-Lit can be the conscience of Canada.

Canada Reads promotes the idea that reading itself is worthwhile and can be a form of nationalism - national pedagogy. What do people actually say about Canada Reads? What sort of dialogue does it foster? What sort of public is made by this? How does the digital form a public dialogue? Who is borrowing the books?

There has been a lot of discussion about whether CanLit is a unifying sphere at all. Could CanLit be breaking up?

How can we actually assess how people talk about it all? The project has been gathering social media with the CanadaReads hashtag. Alas, self-publishers often use the hashtag to promote themselves.

Barrett gave examples of Instagram images and tweets. There are different genres of discussion.

They use a service, DiscoverText, that for $200 gives them all the tweets for a year.

Splendid Isolation: Geoffrey Rockwell

I then gave a paper about a French tradition of textual statistics. This discussed work I am doing with Stéfan Sinclair looking at the use of correspondence analysis in France. The work is being supported by the SSHRC-funded partnership on Text Mining the Novel.

Measuring Abstraction Using Word Embedding: Unsupervised Learning of Generic Categories: Ryder Wishart

Wishart started by talking about what he means by abstract categories. Abstraction in linguistics is different from philosophical abstraction. Function words are abstract.

Charles Ruhl has a book, On Monosemy. There is a cline (continuum) from closed categories far from reality to open categories related to reality. Concrete words are close to reality; they name things. The more abstract words are ones like "of" and "from". He gave criteria for abstraction.

It is important to distinguish language from what we do with language.

He asked some questions:

  • Can we use vectors to generate a picture of how text relates to text?
  • Can we capture the function-content continuum?

He calculated the distance between words and then graphed them in a network to see how function and content words work.
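
A rough Python sketch of that move - pairwise similarities from pretrained word vectors, thresholded into a network - might look like this. The model path, word list, and threshold are all stand-ins, not Wishart's materials:

    import itertools
    import networkx as nx
    from gensim.models import KeyedVectors

    # Assumes a pretrained word2vec-format model on disk (the path is hypothetical)
    wv = KeyedVectors.load_word2vec_format("vectors.bin", binary=True)

    candidates = ["of", "from", "with", "whale", "ship", "harpoon"]  # function vs. content words
    words = [w for w in candidates if w in wv]  # keep only words the model knows

    # Connect pairs whose cosine similarity passes an arbitrary threshold
    G = nx.Graph()
    G.add_nodes_from(words)
    for a, b in itertools.combinations(words, 2):
        sim = float(wv.similarity(a, b))
        if sim > 0.3:
            G.add_edge(a, b, weight=sim)

    print(G.edges(data=True))  # inspect how function and content words link up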

Monday, May 28

Climate Change and Academia - Joint Panel with ESAC

I was part of a panel that discussed the carbon costs of research, specifically air travel. Flying to meetings and conferences can be between 15 and 35 percent of the carbon footprint of a university, and the percentage is growing at many universities as buildings are made energy efficient. How then can we reduce the flying?

I talked about our experience organizing the Around The World Conference as a way of modeling different e-conference formats.

AGM: Sonja Sapach received the Ian Lancashire Award for Graduate Student Promise. This award "recognizes an outstanding presentation at our annual conference of original research in the digital humanities by a graduate student." Bravo Sonja!

Figuring out a path. Building Research and Knowledge Environments in a Digital Culture: Bertrand Gervais

Bertrand Gervais was awarded the CSDH-SCHN Outstanding Achievement Award. He was given this award for his community building among other things. He does digital aesthetics, research/creation in addition to writing novels. He told a story.

He asked what part of his knowledge is his, and what part is stable. His story was autobiographical. He talked about the fictions his roommate played (his roommate was in computing and played early computer games). Bertrand witnessed how the computer was not just a tool, but a new cultural form that was evolving over a network.

Bertrand got interested in cognitive science and literature. He worked with Jean-Guy Meunier and got hired by a literature department. He was more interested in how humans understood their world than in programming a computer to understand it. When hypertext fictions emerged he got interested in them and built a lab funded by CFI called NT2 (New Technologies Squared). This lab studied new culture and new artifacts - the computer as a medium and its impact on imagination.

NT2 created a directory of new media works (in French). Just putting up a database was not enough; they needed to frame it with supporting knowledge. They called it a research and knowledge environment. Artists wanted to be in their databases. The directory in some ways defines how knowledge is organized. Other resources support this. Part of developing the directory was developing taxonomies.

He talked about a fascinating project, Archiver le présent, that is dealing with exhaustion as an aesthetic.

Observatoire De L'Imaginaire Contemporain was created in 2009 to provide a place for people to contribute content to a digital form. The search engine there federates the other directories so you can find information from across the ecosystem of things that they have built. The universe of directories built can thus be woven together.

He then described a project to help people in Quebec to get into writing mobile fiction. They brought together libraries, editors, academics, writer's union, and others that have to do with mediation. They have a tool called Opuscules. This is going forward as a SSHRC partnership proposal.

Gervais finally returned to the story of his old roommate who went into computer science. After years of stress in high-tech he had a health crisis and lost his memories. Gervais ended up visiting and telling his stories of their times together. He wanted to share his CSDH prize with his friend. Gervais thought he would weave his friend into his story and did. Now he goes back to tell his friend a story of digital humanities.

And that was the end of CSDH-SCHN.

Wednesday, May 30th - CGSA - Canadian Game Studies Association

Starting Wednesday, I attended sessions of the Canadian Game Studies Association.

Game Industry

Leveling Up?: Examining Recent Trends in the Canadian Games Industry: Dylan Armitage

Armitage talked about the game industry and the Montreal model. Too much of the research is industry-based or from government - reports like the Canadian Interactive Industry Profile.

Ubisoft has set up in Winnipeg. Alberta has now announced a 25% tax credit to attract game companies.

Firm-Level Diversity in the App Store: Towards a Classification Taxonomy of the Game Industry

David Nieborg, Chris Young & Daniel Joseph

They are working on apps - how they are imagined, how they are sold and used, and their politics. See AppStudies.org. They have access

They see themselves as doing the political economy of platforms - platform capitalism. They all see themselves as doing "game production studies".

They are not studying cultures of production - i.e., what is going on in studios. They are following the money.

What they do brings up the issue of how to go beyond the categories like AAA and indie. Their thesis is to widen the scope of game studies to look at platforms like Apple's App Store. They want to find new ways to classify things.

They have data for 2 years from the two big app stores for three countries. They are coding it.

The Canadian store does 13.3 million CAD in one month. The top 100 apps make 85% of that. The pay model is 99% freemium (with the exception of Minecraft). Apps are being developed in the US, the UK, and then a lot of smaller countries. Canada doesn't have that much.

Jealous Gods, Dull Razors and Worthless Machines: Economic Platform Studies as a Challenge to the Orthodoxies of Business and Management Studies of the Video Game Industry: Dominic Arsenault

Arsenault started by joking about the foregone conclusions about the Famicom business template - that Nintendo is brilliant and that it set the template for all other work. He has a book from MIT Press which is a platform study of the Super NES. It is more an economic platform study.

He argued that the strategy was to capture the market by subsidizing console sales in order to establish the standard. This is the razor-and-blades model. See The Political Economy of Blockbuster Video Game. He feels this model has problems:

  • Everyone hates Nintendo: Nintendo isn't like other industries. They treat developers as competitors, not collaborators. If Nintendo were a person, what personality would they have?
  • Darwin didn't shave: The razor-and-blades model was just being winged. Nintendo didn't really buy into it.
  • Consoles suck: so Nintendo has to integrate games

He concluded by arguing that business models are not universal. There is a certain amount of riding chaos rather than strategy.

History In and Of Games

Deadplay: A methodology for the study of “dead” videogames: Dany Guay-Belanger

Guay-Belanger's project started at the Canada Science and Technology Museum, going through their videogame collection. He defines videogames as assemblages (DeLanda, 2006) and articulations (Slack and Wise, 2015). He uses paratextuality theory and adapts it to games. Lunenfeld and Consalvo talk about how fluid the relevant parts of an assemblage are.

Guay-Belanger made a podcast of his MA thesis - he does public history, so a podcast would engage others.

Games are dying in all sorts of ways. How can we preserve them? Games don't die entirely; they become zombies. Preservation of games is a matter of reanimating them. The goal is to preserve the aura. His way of reanimation/resurrection focused on:

  • Oral histories of creators and players
  • Play captures
  • Emulation

Where can we find stuff:

  • Online
  • Museums - like the Strong Museum of Play
  • Game studies centers and heritage collections

He talked about the games Joust (1982) and Seven Cities of Gold (1984), which he researched at the Strong. He talked about the creators of the games and their histories, including Dani Bunten Berry, who is featured in the CBC series The Artists.

Archiving an Untold History: Greg Whistance-Smith and others

Greg presented a paper that focused on developing an archive of interviews with Japanese developers that are the raw materials for John Szczepaniak's books, Untold History. He gave a nice short history of the Japanese game industry. He pointed out how we play a lot of Japanese games, but don't know much about their industry and the developers. This is partly because Japanese developers are often not allowed by their companies to give interviews and there isn't a culture of artist designers getting attention.

Szczepaniak then ran a Kickstarter to raise the funds for a series of books that are now published. We at the University of Alberta are now working with him to archive the collection. Greg talked about archiving issues, metadata schemes, and archiving technology.

He closed by talking about auteurship in Japan.

Real pasts and fictional futures: Counterfactual storytelling in Fallout 4: Samuel McCready

McCready talked about historical games and how Fallout 4 employs history. How does the game challenge our ideas about history? Fallout 4 provides a counterfactual history. Counterfactual history forces readers/players to think about who controls history. Valkyria Chronicles is another example that McCready talked about.

The tone of Fallout 4 is set by the post-apocalyptic world. You have preserved rooms, period materials, and devastation. Billboards capture the optimism of the past age with a dose of irony. There is a contrast between the ads (from the age) and the destruction of the setting - ads for "Sugar Bombs" cereal, for example.

McCready talks about how historians read these counterfactual games as an answer to attempts to whitewash the Cold War.

He ends by talking about types of history games:

  • best possible history - monumental - Civilization
  • postmodern history
  • critical history

We had an interesting talk about how counterfactual games can undermine stories about history.

Geek Girls

We had a showing of the excellent documentary Geek Girls by Gina Hara.

Friday, June 1

(Re)investigating Data & Methods

Identifying gaymer through an online community: A big data content analysis of r/gaymers: Jason Lajoie

Lajoie started by talking about how gaymers are not necessarily gay men. He then talked about the questions he brought to the subreddit r/gaymers.

He used the Reddit API to scrape the top stories over time. He got all posts ranked top for a year. He also scraped all the links and images from top posts.
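
For a sense of what that scrape looks like, here is a minimal sketch using the PRAW library (the credentials and limit are placeholders, and this is not Lajoie's code):

    import praw

    # Placeholder credentials; real ones come from registering an app with Reddit
    reddit = praw.Reddit(
        client_id="YOUR_ID",
        client_secret="YOUR_SECRET",
        user_agent="gaymers-study/0.1",
    )

    posts = []
    for submission in reddit.subreddit("gaymers").top(time_filter="year", limit=1000):
        posts.append({
            "author": str(submission.author),
            "title": submission.title,
            "text": submission.selftext,
            "url": submission.url,  # links and images come through here
            "score": submission.score,
        })

    print(len(posts), "top posts collected")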

He talked about his estimates of the number of contributors and how much of the posts were from top authors. The conversation doesn't seem to be dominated by the top authors.

He ran the content through MALLET to do topic modelling. He talked about the discursive formations. Some gaymers were seeking others to play with. Some were surveying others. Some were sharing things that they thought others would value. A fair amount of not-safe-for-work images were shared.
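
MALLET is its own Java tool, but the same kind of topic model can be sketched with gensim's LDA as a stand-in; the documents below are invented examples of the genres he described:

    from gensim import corpora
    from gensim.models import LdaModel

    # Invented stand-ins for the scraped posts
    docs = [
        "looking for people to play overwatch with tonight".split(),
        "short survey on lgbtq gamers for my class please share".split(),
        "found this cute fan art and thought you all would like it".split(),
    ]

    dictionary = corpora.Dictionary(docs)
    corpus = [dictionary.doc2bow(d) for d in docs]

    # Tiny model for illustration; a real MALLET run would use far more documents and topics
    lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=3, passes=10, random_state=1)
    for topic_id, topic in lda.print_topics(num_words=5):
        print(topic_id, topic)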

He talked about the language used. Lots of "I think" and other similar constructions.

Using Salience to Study Twitter Corpora: Robert Budac

Budac presented on work we are doing on studying the Twitter data we have. His paper was mostly about methodology. He started with definitions of words like corpora and salience.

He talked about our corpus which is at https://dataverse.library.ualberta.ca/dataset.xhtml?persistentId=doi:10.7939/DVN/10253 . We have 11 million tweets but there are lots of bots and retweets. Removing them gives us 2 million.

He then started talking about different measures of salience starting with frequency. Then he talked about tf-idf (term frequency - inverse document frequency) which shows words salient in particular documents. Then he talked about the Mann-Whitney test and showed salient words from comparisons of our corpus from before and after Trump announced his candidacy.
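
A rough sketch of those two measures in Python, using scikit-learn for tf-idf and SciPy for the Mann-Whitney test (the tweets are made up and this is not Budac's code):

    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from scipy.stats import mannwhitneyu

    # Made-up tweets standing in for the before/after corpora
    before = ["talking about games and streams tonight", "new patch notes look good"]
    after = ["everything is about the candidate now", "the candidate is all over my feed"]

    # tf-idf: words that are salient within particular documents
    tfidf = TfidfVectorizer()
    X = tfidf.fit_transform(before + after)
    top = np.asarray(X.sum(axis=0)).ravel().argsort()[::-1][:5]
    print([tfidf.get_feature_names_out()[i] for i in top])

    # Mann-Whitney U: compare a word's per-tweet counts in the two corpora
    def counts(docs, word):
        return [doc.lower().split().count(word) for doc in docs]

    stat, p = mannwhitneyu(counts(before, "candidate"), counts(after, "candidate"),
                           alternative="two-sided")
    print(stat, p)  # a low p-value flags "candidate" as differently salient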

He concluded by talking about how bots seem to be edging out humans.

Political Mobilization in the GG Community

ZP gave a great talk on a study of our data.

Using Autoethnography to revisit Korean Gaming Data from 2001-2002: Kym Stewart

Stewart collected data through "métissage" (autoethnography) in Korea during her master's. She talked about finding one's voice and how teaching can suppress it. She talked about kids' play in Korea. South Korea didn't have the home game console environment in the 1990s as Japanese consoles were banned. There were fewer channels on TV.

By 2001 Korea was seen as one of the most wired countries, full of avid Internet surfers. She found a changed country with lots of PC internet cafés (PC-bangs). Cybergames happened. There were internet cybertrains. The PS2 hit Korea in 2002.

Stewart did a series of surveys and interviews, including with professional gamers (foreign). She looked at policies. Games were now being seen as a viable Korean industry.

She then got interested in media education. Educators aren't that interested in data. She needed a new method. She found métissage - autobiographical writing woven with multiple voices. She gave us an example.

Dreams and Video Game Play: Nightmare Protection, Virtual Reality and Individual Differences

Gackenbach started by talking about the importance of dreaming. Mental content goes on all night long. It is during sleep that the brain does important work with minimal sensory input.

Does being in another fictional reality (games) affect dreaming? Gamer dreams may show some fundamental structural differences.

She then introduced her students, who gave short papers on their experiments.

Female Nightmare Protection as a Function of Sex Role Identity and Sex of Experimenter: Kendall Zielke

Gamers who play combat games practice quick reactions that allow them to develop defensive maneuvers in their dreams. There are nightmare protection effects in men and women. They randomly assigned 65 participants to either play a combat game or see a misery film or do a computer task. They then filled out dream reports. They looked for signs of the film or game in the resulting dreams. They coded the dream reports.

They found that actions from the game show up in the dreams. There was no difference in threats in dreams. There was more thinking in game dreams.

Emerging Implications of Virtual Reality Game Play on Dreams: A methodological refinement and extension: Neelinder Rai, Braden Wagner

This study was about virtual reality and effects on dreams. Dreams can be considered a form of virtual reality. They hypothesized that people who played with VR would have more control and a more vivid experience in dreams. Again the participants would submit dream reports.

They found that those that played the VR felt increased control in dreams. The VR experience primed people to report lucidity in previous dreams. Those who experienced VR had an increased frequency of locations in their dreams.

There seems to be an overlap of characteristics between VR and dreams (biological VR).

Individual differences in dreams and video game play: Jayne Gackenbach

Gackenbach reported on a third, correlational study. The study focused on dreams with media content - i.e., a dream about Twitter. They had 481 respondents. Respondents were asked for a dream about media.

While we often dream about what we have been doing, the question is what details about the media show up. The respondents were asked about emotions, the sense of presence, and game transfer phenomena.

She reported on results. There is a correlation between dreams of passive media and fear/anger. There is some relationship between the type of media consumed (passive and interactive) and dreams. Interactive media seems to be better for you in that you have more of a sense of control in dreams.

I'm fascinated by the idea of games as dreams and therefore the relationships between games and dreams. How is gaming influencing dreaming, and vice versa?

Diverse Representation in Games

Where Are Older Adults in this World? The Representation of Older Adults in Video Games: Julija Jeremic

Jeremic has been studying educational games. She is interested in incidental learning in commercial games. She is part of the Aging Well NCE and in particular is working on projects dealing with gaming and adults.

They created games and tested with adults. She described an online escape game based on Alice in Wonderland.

Adults tend to play mostly puzzles and quizzes. Part of the problem is that games are not accessible to older people. There are some people in their 60s and 70s who are avid gamers.

She commented on how there are very few older people in games, despite the aging population. The representations you do have are of people who just reminisce and then die. There seems to be some ageism in the games.

The World Could Always Use More Heroes: Character Diversity in Blizzard’s Overwatch: Gregory Blomquist

He looks at Overwatch as a transmedia franchise. He is looking at the characters, the variety of which is part of the game's success. He is looking at diversity in the characters. Diversity is when all audiences are confronted with characters of different backgrounds; it is the responsibility of producers. Pluralism is when you have variety, but many characters are hidden away as NPCs.

In Overwatch, teams don't do well unless you use different characters and have diversity on the team.

Then he switched to the transmedia aspect, where characters show up across different media. In Overwatch there is more diversity in the paratexts, which allow one to build richer backgrounds.

The End

And that was the end of my Congress.
