
DH 2015

These are my notes from DH 2015 at the University of Western Sydney in Sydney, Australia from the 29th of June to the 3rd of July 2015.

These notes are written live and therefore are full of typos, unfinished thoughts, and so on. My apologies for any misunderstandings.

I've found other notes at http://bibliobrary.net/2015/07/03/dh-2015-sydney-notes-friday/ (follow links to the other days).

Monday, June 29th

New Scholars Symposium

Rachel Hendry and I facilitated the New Scholars Symposium that brought together graduate students and recent PhDs to discuss futures for the field. A number of interesting questions were raised, including:

  • How can one understand the field when first entering it? What resources, journals, projects, initiatives should one explore?
  • What would an ethics code for DH look like?

(I will be adding more on this later - at the time I was participating.)

Wednesday, July 1st

The first full day of the conference was July 1st, Canada Day!

Welcomes

Greg Simms, an elder from one of the First Nations of Australia, welcomed us. Once we start sharing there will be reconciliation. When you take your first steps, remember those who walked this land before.

Paul Arthur followed and recognized all the people who worked on the conference.

Barney Glover (Vice-Chancellor and President, University of Western Sydney) welcomed us to the historic Parramatta campus. The university was built on the site of the Female Orphan School, which had the first three-storey building in Australia. The buildings now house the Whitlam Institute, named after an important Prime Minister of the 1970s. (He led the first Labor government in 23 years and was responsible for dramatic changes.)

Glover commented on the importance of investing in humanities research and teaching. In Australia there is also considerable pressure on and discussion of research infrastructure.

UWS is proud to be a pioneer in digital humanities. They created the first chair in Australia in the field.

John Fitzgerald (President of the Australian Academy of the Humanities) then talked. He thanked Uncle Greg for welcoming us to these lands and talked about collaboration around the humanities.

John Nerbonne (ADHO Steering Committee Chair, University of Groningen) welcomed us on behalf of ADHO. He talked about the background role of ADHO. This is the 10th anniversary of the DH conferences. It is also the 30th anniversary of the journal Digital Scholarship in the Humanities (previously Literary and Linguistic Computing).

Øyvind Eide and Mark Algee-Hewitt announced the Paul Fortier prize and bursaries.

Ray Abruzzi (Vice President and Publisher of Gale Digital Collections) talked about the collaboration of libraries and researchers. There can be duplication between the infrastructure set up by libraries and by researchers. He talked about how Gale is trying to support text and data mining through their products. You can get a hard drive of full data from their collections if your university licenses the services. They want to find out what happens to the data they ship to us. They want to experiment with new services.

Keynote: Jeffrey T. Schnapp (Harvard University): Small Data

He talked about the three areas that inform his work. He talked about a course called Teaching With Things that dealt with how to leverage the power of all the rich collections. The digital is at the center of a discussion of activating the cultural reserves. How do we weave space and information together? He will be running a workshop on Beautiful Data that looks at how to make resources matter. Digitization is not enough. The third area is a project called Curarium - a tool to leverage the information in collections.

He works on experiments in curating in spaces like the Trento Tunnel exhibition. Objects have to be changed, extended, transformed into other media to be shown in spaces.

His talk then turned to Small Data by comparing it to big data. Small in the discourse of big data tends to be associated with inadequacy. Last year's big data is now small. He gave the standard definition of big data having to do with volume, variety, and velocity.

The real issue is making sense of data, whether big or small.

He talked about live crisis archiving. They are interested in federated, participatory archives that are used as they are being gathered. The Digital Archive of Japan's 2011 Disaster is one example. It is an archive where one can study things that happened moments ago. There is a prospect for an intersection between the digital humanities and digital social sciences.

Data are not merely captured, but constructed by all sorts of structures. When it comes to systems, the big is entangled with the small. The deeper question is big and small: how do we design for zooming in and out?

He talked about data halos and how digital surrogates become a new set of objects. Real objects include aspects that can't be reduced in digitization. You lose touch and all the small-scale data when you digitize an object.

The humanities bring unique skills to the types of attention needed. Attention to detail, power, materiality are important to data. How can we bring these skill sets to bear? Poets have thought about putting things into words and what is lost and gained. I think of Heidegger on grasping with words.

He then talked about how we might respect the traces of what gets lost in digitization so as to create new experiences. How might we sculpt new experiences independent of the original objects using the digital?

There has never been a world of autonomous objects. They have always been part of a social world. We have a history of metadata schemes that structure our bibliographic and collection spaces. The conventions are to reduce an object to a series of facts in fields in a record. This is what we see in descriptions in online museum sites.

Objects are ignorant of human categories. Everyone can be multiple things and described in many different ways. Traditionally the metadata was only used and seen by experts aware of the traditions. Now they are used by publics before, during and after experiences of exhibits.

He considered the way MOMA and Amazon describe the same thing, a Moka Express coffee maker. Ironically the Amazon description is more accurate, while the MOMA one turns the object into a design object; to MOMA it isn't important that it is a stovetop coffee maker. Amazon treats the Moka as a network, with links to citizen consumers and so on.

What would a world look like where every object is thought of as a collection and vice versa? He talked about Hyper-Stacks, which works with the Victoria and Albert collection. He talked about RTI (Reflectance Transformation Imaging), which makes surface features available to touch in digital form. MicroCT is another technique. Low-cost tomography techniques are becoming accessible.

He showed experiments from Teaching With Things that tried to avoid canonical descriptions by presenting many different ways into objects. The aim is to model the world - to animate objects.

The final goal was to then put the webs modelled by people back into the physical world. He showed what they have done with the lightbox gallery at Harvard.

Steven Jones: The Priest and the Punched Cards

Jones talked about work that is coming out in a forthcoming book, a biography of the first decade or so of Busa's project. His work does both a media archaeology and a platform study of Busa's project. Jones retold the story of the encounter with Watson at IBM that got the project support. He then took that meeting apart in interesting ways, like an exploded diagram. What tensions were there? Tasman, in an oral history, suggested he was expected to make the Jesuit disappear. There was the poster Busa brought into Watson's office to turn the founder's words back on him.

Jones talked about the slogan "the possible we do now, the impossible takes a little longer." This was the slogan of the American military engineers that Busa might have met in the War.

Jones also followed IBM's war record and internationalism. IBM had an agenda too. In 1952 Busa and Tasman gave the mother of all computing demos: they showed off their method of mechanized concordance analysis. It was the first time IBM used computers for natural language processing.

He talked about the IBM SSEC computer installed in public view in a glass showroom in New York. Busa didn't use the SSEC, but a later computer that took over the showroom. He demoed literary data processing at a world's fair - the theme of which was "a new humanism."

Jones made some neat connections to the cold war. Then he talked about a project to work on the Dead Sea Scrolls that caused Busa a nervous breakdown. IBM continued to discuss the Dead Sea Scrolls project. Now IBM has the Watson Project. Watson is descended from Busa's project.

He talked about the importance of Busa and his project as a swerve.

Sinclair and Rockwell: Talking About Programming in the Digital Humanities

Stéfan Sinclair and I gave a paper sketching a history of the debates around programming in the digital humanities. We spent some time introducing people to Sally Sedelow and arguing that she was important to the field. Daniel Powell created a Wikipedia draft article at https://en.wikipedia.org/wiki/Draft:Sally_Yeates_Sedelow .

We had a great discussion after the paper about the value of understanding formalization. I tried to argue that there are many ways of formalizing information other than programming which should be just as important to the digital humanities.

Jeremy Browne: LinkedIn circa 2000 BCE: Towards a Network Model of Pusu-ken's Commercial Relationships in Old Assyria

Browne talked about creating a social network of people-events in order to disambiguate names in a small corpus of Assyrian business correspondence. They built a tool that allows the expert to disambiguate people. He showed a 3D visualization of the location of the tablets.

A Longitudinal Analysis of Knowledge Integration in Digital Humanities Using Co-citation Analysis

Muh-Chyun Tang talked about a citation analysis. He showed how digital humanists tend to publish alone, though this is changing. He also showed how keywords have changed and are becoming increasingly diverse.

He then talked about co-citation and quickly showed some results. They found that there seem to be separate, loose national clusters of authors. The co-author network remains fragmented.
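The counting behind a co-citation analysis like this is simple to sketch: two authors are co-cited whenever they appear together in the same paper's reference list. The helper below is a hypothetical illustration with invented data, not the paper's actual pipeline.

```python
from collections import Counter
from itertools import combinations

def cocitation_counts(reference_lists):
    """Count how often each pair of cited authors appears together
    in the same paper's reference list -- the raw material of a
    co-citation network."""
    pairs = Counter()
    for refs in reference_lists:
        # Sort so each unordered pair gets one canonical key.
        for a, b in combinations(sorted(set(refs)), 2):
            pairs[(a, b)] += 1
    return pairs

# Invented toy data: two papers and the authors they cite.
counts = cocitation_counts([
    ["Busa", "McCarty", "Moretti"],
    ["Moretti", "Busa"],
])
```

Pairs with high counts become the strongly linked nodes when the counts are turned into a network graph.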

Cora Johnson-Roberson: Susurrant: A Tool for Algorithmic Listening in Networked Soundscapes

Johnson-Roberson started by talking about algorithmic listening - how computers listen. It is an overlap of sonification, aesthetics, and music information retrieval. She mentioned Ryan Maguire's "The Ghost in the MP3" (2014), a supplement of machine listening, and Elias Pampalk et al.'s "Islands of Music" (2002). The largest users of machine listening are commercial outfits that analyze large amounts of musical data for services like MusicNest.

Algorithmic listening is important to the humanities as we should want to understand what is being sonically captured by computers.

Susurrant applies topic modelling on a corpus of music. She demoed it.

I didn't understand all of the paper, but it opened a whole new vista for me. Machine listening is listening to what a machine might hear, which is not necessarily what we hear. For that matter we don't hear as a machine so machine listening is always an approximation or metaphoric. Very provocative. This paper was one of the most innovative I heard.

Thursday, July 2nd

Deb Verhoeven started the day with a frank challenge. She asked why there were no women on the stage yesterday - "parade of patriarchs."

She talked about how uncomfortable women feel at/in DH. It is like the room is always too cold. When was the last time the conference air conditioning didn't feel right for me? It is time for us to sweat and have a sense of what it is like. It is not about inviting token women. It is about our plans to exit the stage. It is about letting others in by letting go of our privileged position. We are the problem. 50% isn't equity because of the decades of mostly men.

She offered some practical tips.

  • Number our days
  • Find someone who doesn't look or sound like me and mentor them
  • Have a clear purposeful succession plan and enact it
  • Do it because you embrace diversity

Time to learn to be silent, to listen, to let others lead the field.

Genevieve Bell: Making Life: the Art and Science of Robots

Bell is an anthropologist at Intel. She started with a legal disclaimer and an introduction to her background. She talked about the importance of acknowledging that we are on complicated lands. She talked about her encounter with Silicon Valley: on her second day she was asked to help Intel with women. Intel wanted help with women and ROW (rest of world). Her job is to explain people, which is what anthropologists do. In a technology company it is about imagining what technology could do for people.

She talked about a video of a Furby talking to an iPhone (Siri). The moment an object not only talks but listens, things change. There is the possibility of nurture. Her colleagues immediately worried that AIs would kill us. She quoted technologists who worry about AI. She talked about ambivalence and anxieties about robots. The most numerous robots are Roombas. See @selfAwareROOMBA .

There are other ways of looking at robots. We can look at origins of the imagination like Capek's original "robot". She traced the word/story from Capek's play out into the popular culture. We have a fascination with unshaped matter - when we make life through meddling. The story of golem. The story of Frankenstein. Since Frankenstein there has been an explosion of representations.

The problem is that Hollywood makes it look really simple. Rarely do movie robots have to be plugged in. The science is hard, and there is a long tradition of automata. What does it take to give things movement or life? Automata were mostly for the wealthy. She talked about an 18th-century automaton, the Canard Digérateur or Digesting Duck. The duck could be fed and it shat. You wound it up and it waddled around, ate, and shat. No one had seen anything like it and it made a great impression.

At the same time in Japan there were clockmakers learning and building Karakuri. She talked about the robot being about grace, not life. It was a very different idea of what bringing things to life might mean.

She talked about a 1928 robot built to open a conference. The robot, Eric, went on tour. All sorts of other robots followed. In 1939 Westinghouse had a robot called Elektro at the World's Fair. All these early instantiations are in dialogue with Capek's play.

She then asked what questions we might want to ask. One issue is bodies: why do robots have to have human bodies? See Cybernetic Zoo. What do the bodies being made for robots tell us? What is the shape of the object and what does that tell us about us and the world? Then there is the question of function. What does the robot do? What was Edison thinking when he developed his failed robot?

Another question is, what is the degree of autonomy that the robot has? This leads to all sorts of questions about control and what it means about robots having sentience. We don't trust sentience in things other than us. And yet we trust all these algorithms that do stuff for us.

She talked about "The Buddha In The Robot" - a book that challenges us to think differently about robots. Where does this leave us? There should be a constant dialogue about technology between engineers, poets, humanists and social scientists. Often the most powerful imaginations about technologies do not come from within technology. The challenge is how we stay in those conversations.

In questions she talked about autonomous cars and who is making decisions about them. Why invest in autonomous cars to solve problems of traffic rather than trains?

The History of Science in the Age of Networked Digital Humanities

Stephen P. Weldon, Ailie Smith and Gavan McCarthy presented on a project that is developing a bibliography for the history of science and technology.

Stephen Weldon started by talking about the Isis Bibliography and the field of the history of science. The bibliography was started by George Sarton, founder of the Isis journal; in the journal's first issue he published a bibliography that has continued ever since. Weldon now runs the bibliography. They now have a History of Science, Technology and Medicine database (hosted by EBSCO).

The question that comes up now is what to do with the pre-1974 data. The Sloan Foundation has funded a project to digitize the earlier bibliographies.

He talked about the new structure of what has been a fairly traditional bibliographic database structure. They are trying to do more with citations and authorities. They never had an authority file - just a controlled thesaurus.

They are now developing a way to connect to information outside on the web using open linked data. He talked about the concepts of the new informatics. The data can be exported so that it can be visualized and other analytics can be done. He showed dissertation data that has been visualized.

He concluded on the larger implications. As they become more than a bibliography and become a dataset they will need more data to properly represent the history of science. We need international standards.

In the second phase of the project they want to allow people to get an account and personally curate the information.

Ailie Smith then spoke about the eScholarship Research Centre and, within it, the Australian Science and Technology Heritage Centre (previously known as ASAP). They have been collecting information about Australian scientists and, more generally, Australian science.

They have created two systems: the Heritage Documentation Management System and the Online Heritage Resource Manager (OHRM). The OHRM is used for the Encyclopedia of Australian Science. Both systems work with standards. She showed some of the things they can do with the XML data. She showed a social network diagram of Australian science.

Trove is harvesting their data and vice versa. The HuNI (Humanities Networked Infrastructure) project is bringing data together and letting people search and play with it. Their resources are being designed so that data can be shared, aggregated and linked.

Gavan McCarthy then stepped further back in history with a case study of missing records: the long journey of the correspondence of Ferdinand von Mueller. Mueller was an important botanist and one of the best-known 19th-century scientists. McCarthy talked about the concept of entanglement and the importance of teleology (purpose).

The correspondence of von Mueller was destroyed in the 1930s. The project is to recover the correspondence by locating and transcribing letters around the world. They have recovered about 15,000 items. Their collection maps a community.

Mueller tended to use the lovely phrase "Regardfully Yours" which was then used for publications of the correspondence and the bibliography of the project. See Trove record.

The project has been going for a long time and they did a lot of their work in Mac Word 1.0 (1985). They had to rescue files encoded as Word 1.0. Microsoft had no records of their file formats. They had to extract the text and restyle the files. They are now trying to figure out how to port it to TEI.

McCarthy showed how you could follow information about Mueller from site to site. He showed Social Networks and Archival Context that has out-of-the-box visualizations that can be useful. Particularly interesting is the Radial Graph Demo you can play with.

He closed by talking about cladistics, which is based on taxonomies and hierarchies and which is now being challenged.

"We build what we can ... with the technology at hand ... in the world in which we find ourselves."

We then shifted to a panel discussion. There are interesting challenges in opening archives to academics, and questions about what happens when you deform an archive by using it for a purpose it wasn't intended for. What happens when people use Isis for tenure and promotion purposes?

Indigenous Digital Knowledge

Hart Cohen started by talking about "Journey to Horseshoe Bend", a book that he started to enrich with lots of additional material. His latest project is called Digital Archives and Discoverability. They are working on repatriation, data-diversity, and the ethos of storytelling they have developed. "Country" is a concept that combines territory and place.

Peter Radoll talked about the use of information technology in aboriginal communities. He talked about how often the archives are created for non-indigenous researchers. He is trying to figure out how to develop indigenous resources for (and by) indigenous people.

He gave some examples, like IndigenousX, a rotating indigenous Twitter feed. See http://indigenousx.com.au/ for more on the project.

Susan Beetson talked about what it is that people use in the community she comes from. She has been working on ideas for social media in remote communities. They worked with the One Laptop Per Child computers bringing them out and using them in projects with children. They reflexively developed lesson plans that got people using computers with others. They used Scratch to make stories and share with their families. She mentioned that Scratch had some parts that were alienating as it was globally developed.

Julia Torpey talked about her enhanced e-book project that makes visible stories following best practices. She showed a video from the e-book of a woman talking about her art.

Peter Read asked what he would do differently if he were to redo a digital project. What can we do? He argued for following our emotions. He talked about the last camp in Sydney. He imagines that digital recreations could lead us on a journey to understand things we couldn't understand before.

Robert Warren: Language, Cultural Influences and Intelligence in Historical Gazetteers of the Great War

Warren talked about all the neat things they can do with 3D data about battlefields like the First World War battlefield at Ypres (?) that he is working on. He circulated a 3D print of a trench.

He talked about historical accuracy and how the points are probabilistic as surveyors (in battle) were not always accurate. Armies used different coordinate systems. He is using a lot of image processing to process historical trench maps. They have the British maps that show where they think the Germans are and the German maps with where the Germans think the British are.

He showed the data structures needed. The same trench would be called the "Regina Trench" by the Canadians and something else by the Germans.

He plugged http://www.openhistoricalmap.org as a place to put map data.

Old Traces, New Links: Representation of Taiwan Baotu in OpenStreetMap

Jheng-Peng Huang started by talking about "Being Digital". He talked about a "Facebook of the Dead" where you take a biography and try to extract information to create a Facebook-like social network.

His project is about Taiwan Baotu - a set of 457 topographic maps of Taiwan when it was ruled by Japan. They are working with OpenStreetMap. They want it to be accurate, computational, and transformative.

Tyng-Reuy Chuang talked about how they are using a redrawing approach. They run their own instance of OpenStreetMap because they like its features; for example, they can redefine feature types and define their own. They can preserve the history and help people understand the historical maps by connecting them to modern maps.

See http://140.109.161.36/baotu

Padmini Ray Murray: Press F6 to Reload: Games Studies and the Future of the Digital Humanities in India

She believes we should be both thinking and doing. Artifacts can both "reify knowledge and communicate it." We must recognize the cultural specificity of all work. Murray quoted Tara McPherson.

She then shifted to the Indian game market, which is growing exponentially. It used to be that animation was outsourced to India. Game culture has been dominated by the US and Japan. What gets missed is how localization by companies changes cultural references for different markets. American localizers tend to remove things like rice bowls in Pokémon.

Some believe India will be the next site for original game designs. Murray wonders what an Indian game would look like that doesn't fall into the trap of the post-colonial exotic. Call of Duty has a mission in Madhya Pradesh. The local site is rendered in lots of detail, but the battle is between Americans and Russians: no Indians. Age of Empires has some battles that don't match the historical record.

Murray and a colleague ran a survey of game developers, players and industry personalities, both online and on paper. Players felt cultural location was important; developers didn't as much. Developers are thinking about exporting their games.

There are some recognizable Indian games. Unrest was the first Indian Kickstarter-funded game. The availability of affordable tools has expanded the development community. Murray has consulted on games.

Murray challenged the idea of media archaeology as layered - she thinks of it as more of a constellation. Murray hopes for practice based interventions in game studies/development that could move ideas out of the academy.

Foaad Khosmood: Game of Thrones for All: Model-based Generation of Universe appropriate Fictional Characters

Videogames have expanding universes. Huge worlds need large numbers of characters to keep up the suspension of disbelief. He gave some examples, like Daggerfall, which had a world the size of England. Audiences want more. Assassin's Creed Unity has thousands of characters separately modelled, but they are not distinct characters. The project is trying to create NPC agency. They want to create lots of "off the shelf" characters that can be downloaded by designers. They are mass-producing characters.

For a proof of concept they developed characters around Game of Thrones. They used Inform 7.

Friday, July 3rd

John Montague: Exploring Large Datasets with Topic Model Visualizations

Montague started with the question "how can we use visualizations to learn something new about a corpus?" He gave a brief introduction to topic modelling. Topics often seem arbitrary and they should always be considered exploratory. Correlation is not causation, and it is important to check distant reading with close reading.

We (I was on the team) obtained a collection of philosophy journals from JSTOR. He described what we had done previously with topic modelling (MALLET) and the philosophy journals.
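One routine step in this kind of work is aggregating the per-document topic weights that MALLET emits into per-year trends. The sketch below is an illustration with invented filenames and weights, not the project's actual code; it assumes the doc-topic rows have already been parsed into (name, weights) pairs.

```python
from collections import defaultdict

def mean_topic_weights(doc_topics, get_year):
    """Average per-document topic weights by year.

    doc_topics: (doc_name, [weight_per_topic]) pairs, e.g. parsed from
    a MALLET --output-doc-topics file (the exact format varies by version).
    get_year: maps a document name to its publication year.
    """
    sums, counts = {}, defaultdict(int)
    for name, weights in doc_topics:
        year = get_year(name)
        if year not in sums:
            sums[year] = [0.0] * len(weights)
        sums[year] = [s + w for s, w in zip(sums[year], weights)]
        counts[year] += 1
    return {y: [s / counts[y] for s in ws] for y, ws in sums.items()}

# Invented sample: three articles, three topics; filenames encode the year.
rows = [
    ("phil_1950_01.txt", [0.7, 0.2, 0.1]),
    ("phil_1950_02.txt", [0.5, 0.4, 0.1]),
    ("phil_1980_01.txt", [0.1, 0.1, 0.8]),
]
trends = mean_topic_weights(rows, lambda name: int(name.split("_")[1]))
```

Year-by-year averages like these are what a viewer such as the Galaxy Viewer can then plot as topic trends.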

He then talked about other programs that visualize topic modelling. He talked about a classification of visualizations. He then talked about the Galaxy Viewer, the visualization that we have developed. The viewer is at http://analytics.artsrn.ualberta.ca/viz/galaxy/ - Note that this is a prototype. It takes a while to load and has messy parts.

Micki Kaufman: "Everything on Paper Will Be Used Against Me" Quantifying Kissinger

She started with a quote about how digitization is not enough to compete with the fog of competing ideologies. She also showed a video quote by John Ehrlichman about how studying a fragment can be misleading. Kaufman is trying to find ways to study the whole.

The sources she has are 18,000 documents: meeting memoranda (memcons) and telephone conversation transcripts (telcons). She got a cease-and-desist letter from ProQuest for all the scraping she was doing, but they worked it out. She reinforced what John said about topic modelling being suggestive. She talked about how she is working within a set of archiving decisions. She was able to compare her topics to the archival categories, which reinforced her topics.

She talked about gaps in the collection and how gaps can raise questions, but you have to know enough of the subject to guess at gaps.

She then showed a neat stacked visualization of the histograms of all the topics. You can see a shift when Kissinger becomes Secretary of State. She then showed a 3D visualization from Gephi of her topics that was very interesting. Other visualizations suggest that Kissinger spoke differently in the memcons and telcons. He thought the memcons (?) would be likely to be seen, so he was more discreet.

She did a lot of work on collocations of "bombing" and "cambodia" to see where the bombing of Cambodia was discussed. The phone conversations have a cluster right when Watergate takes off.
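A windowed co-occurrence count is the usual way to operationalize a collocation like this; the helper below is a hypothetical sketch (not Kaufman's code) that counts how often one word appears within a few tokens of another.

```python
def window_cooccurrences(tokens, w1, w2, window=5):
    """Count occurrences of w1 that have w2 within `window` tokens
    on either side -- a crude collocation measure."""
    hits = 0
    for i, tok in enumerate(tokens):
        if tok == w1:
            left = tokens[max(0, i - window):i]
            right = tokens[i + 1:i + 1 + window]
            if w2 in left or w2 in right:
                hits += 1
    return hits

# Invented toy transcript.
tokens = ("the bombing of cambodia continued while "
          "the bombing campaign in laos expanded").lower().split()
```

Run per document and per month, counts like these yield the time-series clusters she showed.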

There was such a richness in the paper that I can't do it justice. See her blog/website on this at http://blog.quantifyingkissinger.com/

Peter Cornwell: Improving Compliance With Evolving Standards Using Computed Transformation of Digital Collections

Cornwell discussed a collaboration across a number of universities (Heidelberg, Lyon, and Westminster) that deals with data futures. He focused on a case study of Chinese propaganda posters ( http://chinaposters.westminster.ac.uk/zenphoto/ ). He talked about how to combine information from different holdings and ways to deal with annotation.

He talked about keeping documentation about the iterations of a project as it gets upgraded over time. You can end up with many slightly different copies of the same things.

He talked about IIIF (International Image Interoperability Framework) which allows image collections to work together.
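IIIF's interoperability rests on a fixed URL template for image requests: {base}/{identifier}/{region}/{size}/{rotation}/{quality}.{format}. A tiny helper shows the idea; the base URL and identifier here are made up, and note that the size keyword differs between Image API 2.x ("full") and 3.0 ("max").

```python
def iiif_image_url(base, identifier, region="full", size="max",
                   rotation="0", quality="default", fmt="jpg"):
    """Assemble a IIIF Image API request URL from its path segments:
    {base}/{identifier}/{region}/{size}/{rotation}/{quality}.{format}"""
    return f"{base}/{identifier}/{region}/{size}/{rotation}/{quality}.{fmt}"

# Hypothetical server and image identifier.
url = iiif_image_url("https://example.org/iiif", "poster42")
```

Because any compliant server understands the same template, viewers can mix images from different institutions' collections.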

We had an interesting question at the end about metadata.

Matthew Wilkens: Mapping and Modeling Literary Geography in the Twentieth Century

Wilkens is interested in the intersection of literary geography, social change and economics. He talked about the neoliberal hypothesis and how it is hard to think outside of markets. He is exploring it in collaboration with the U of Chicago Knowledge Lab. He created a corpus of about 10K volumes of US authors - published between 1880 and 1990. The idea is to run the process on HathiTrust materials.

He then ran NER on the texts and used the Google geocoding API to associate strings with regions and coordinates. You get about 96% accuracy on national data (countries mentioned).
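After NER and geocoding, the aggregation amounts to counting country attributions and normalizing. A minimal sketch, with a made-up gazetteer standing in for the geocoding step:

```python
from collections import Counter

# Made-up lookup standing in for NER output resolved by a geocoder.
GAZETTEER = {"Chicago": "US", "New York": "US", "London": "UK", "Paris": "France"}

def country_shares(place_mentions):
    """Fraction of resolvable place mentions attributed to each country;
    unresolvable strings are simply dropped."""
    countries = [GAZETTEER[p] for p in place_mentions if p in GAZETTEER]
    total = len(countries)
    return {c: n / total for c, n in Counter(countries).items()}

shares = country_shares(["Chicago", "London", "New York", "Narnia"])
```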

Wilkens is interested not just in the literary share but in how it changes. The US accounts for 70% of all country mentions.

He then talked about getting economic data to compare with the geographic information. It is hard to get data from before 1945. The Maddison Project has gathered and normalized historical data. Before 1900 economies were very different than they are now. You can't have countries become wildly wealthier before industrialization. What is more interesting is share of global GDP from 1900. He then subtracted GDP share from literary share.
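Subtracting GDP share from literary share is then per-country arithmetic; positive gaps mean a country gets more literary attention than its economic weight would suggest. The numbers below are illustrative only, not Wilkens's data.

```python
def share_gap(literary_share, gdp_share):
    """Per-country literary-attention share minus global-GDP share.
    Countries missing from the GDP table are treated as 0 share."""
    return {c: literary_share[c] - gdp_share.get(c, 0.0)
            for c in literary_share}

# Illustrative shares only.
gap = share_gap({"US": 0.70, "UK": 0.10}, {"US": 0.27, "UK": 0.04})
```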

I had to stop taking notes as it got complicated and I wasn't following the statistics.

He concluded that there is a "forever 50s": American fiction looks at the America of the 40s and 50s from an economic point of view. He then shifted to Western European nations.

He was interesting on why Canada couldn't be mapped: so many of its place names are the same as those of the UK.

Where does this leave us in terms of the neoliberal hypothesis? This technique has failed to find a correlation between literary geography and economic geography. Perhaps there isn't such a clear link between economic function and literary function.

Mapping the Emotions of London in Fiction, 1700-1900: A Crowdsourcing Experiment

This paper was presented by Ryan Heuser and Annalis Van Tran.

They started by talking about mapping named locations. They found more named locations in the centre than in the suburbs. They did a lot of proofing of the NER to see what problems cropped up. They mapped the frequency of place names.

Then they looked at colouring the locations. They wanted to get emotions for the places, which they crowdsourced using Mechanical Turk. They needed a framework for categorizing emotions and used Plutchik's psychoevolutionary model, presenting his emotion wheel to participants so they could check consensus among raters. They did not get a good consensus rate, so they went back and came up with a simpler set of emotions, settling on "fear" and "happiness". This raised issues of whether emotions like "happiness" aren't socially constructed: is happiness a modern phenomenon? So ... they used emoji that people clicked on, and they found much better consensus.
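Checking consensus among raters can be as simple as asking what share of items reach a modal-agreement threshold. A sketch with an invented threshold and invented ratings (the paper does not specify its measure):

```python
from collections import Counter

def consensus_rate(ratings_per_item, threshold=0.7):
    """Share of items whose most common label reaches `threshold`
    agreement among that item's raters."""
    agreed = 0
    for labels in ratings_per_item:
        top_count = Counter(labels).most_common(1)[0][1]
        if top_count / len(labels) >= threshold:
            agreed += 1
    return agreed / len(ratings_per_item)

# Invented ratings: five raters for each of two places.
items = [
    ["fear", "fear", "fear", "fear", "happiness"],       # 0.8 agreement
    ["fear", "happiness", "fear", "happiness", "fear"],  # 0.6 agreement
]
rate = consensus_rate(items)
```

A low rate on the full emotion wheel versus a high rate on a reduced label set is exactly the pattern that pushed them toward the simpler scheme.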

They then created emotion maps for the places. They found an east-west polarity of emotion. They wanted to see if there is a class difference between happy and fearful spaces, or whether it is jails vs. parks. They used a class map someone else generated and also looked at types of sites. They also found that descriptions of fearful spaces are often not clearly placed.

Are novels giving an emotion to places or drawing on existing associations with places? It seems that authors use places to create emotion rather than rewriting the emotion of places. Prisons are fearful, while parks are associated with happiness. They speculated that London is a binary sort of place. Fiction provides a simplification of the city; it helps people understand London.

John Bradley: How About Tools for the Whole Range of Scholarly Activities?

Bradley started by talking about how tools haven't been used much by folk. He mentioned how Pliny hasn't had much of an effect. He raised the issue of evidence of value.

He quoted Joris van Zundert to the effect that our tools are hermeneutically uninformed and inadequate. He has looked into the philosophy of tools mentioning Feibleman 1967. (See http://www.jstor.org/stable/2575191 ) Bradley distinguished 3 types of tools:

  • Tools for Making - tools for designing something in the world based on a conception
  • Tools for Exploring - tools for learning about the world
  • Tools for Thinking - tools for people to think about things rather than externalizing their thinking (making)

Bradley talked about how we have had tools for making for a while; tools like Omeka and Zotero are examples. DH 2.0 is seen as generative - making things.

Bradley then talked about tools for exploring that are like telescopes. He quoted Stephen Ramsay's and my paper on Developing Things. Text analysis software started as tools for making concordances and then shifted to becoming tools for exploring - especially for big data.

The third type of tools that are for thinking are cognitive and enhance our thinking. He gave as an example Mathematica - or the idea of notebooks for thinking through.

He feels that most tools for DH have been tools for exploring. Should we think about other types of tools?

A questioner asked if programming isn't a tool for thinking? Others asked whether the tools for making and the tools for thinking aren't the same thing. Can't the making be also a way of thinking in DH?

I object to the opening move that tools haven't had much uptake. Many tools like Zotero, Omeka, and Voyant get used a lot more than most books are read. Voyant gets over 50K runs a month.

centerNet/ADHO AGM

I attended the centerNet AGM. Neil Fraistat introduced the New Scholars Seminar. Kara Kennedy said a few words about the NSS. Let's hope we can ...

Ryan Cordell talked about DH Commons, a journal that reviews projects. The first issue will be out very soon. A neat feature they are adding is a "how did they make that" deep look at the technology.

John Nerbonne talked about what ADHO is. Alex Gil asked about a purported hierarchy in ADHO. Nerbonne talked about how ADHO is a bit baroque, but that this is due to it being an alliance. The structure has to do with the history of coming together.

I proposed exploring a job shadowing initiative to allow new people or people who feel excluded to shadow officers to get a sense of what is involved.

Tim Sherratt: Unremembering the Forgotten

Richard Neville introduced the keynote speaker Tim Sherratt, the manager of Trove.

Sherratt started by talking about previous times that academics have gathered here in Australia. Over 100 years ago the British Association for the Advancement of Science met in Australia. He talked about how that meeting was received. It took place during the war, and two German scientists were interned. He commented on how recent legislation having to do with the war on terror similarly sacrifices rights.

He went on to talk about how a handbook about Australia produced at the time of that conference promoted a white Australia. He talked about how WWI is being remembered as nation-forming a century later. This ignores all the people for whom WWI was not formational. Peter Read, whom we heard yesterday, wrote about the "stolen generation." There are initiatives to document other histories.

These different histories are forgotten not because they aren't archived, but because they don't fit popular conceptions of the nation. Find & Connect is a resource for forgotten Australians and those interested in child welfare in Australia, but ...

Memory as experienced is fragmentary and shaped by the context of recall. Memory is contested and complex. Access isn't enough. The ways access is structured limit memory. A simple search box can hide all sorts of things.

He then talked about archives about British atomic bomb tests in Australia. He harvested all the government collections that are not accessible. A lot of files are not shared, including a number of files that have to do with the cold war. Files about the atomic tests are not shared for reasons of non-proliferation that Sherratt has good reason to doubt.

As files are released, more and more are being closed. National security is becoming a magical mantra to close off access.

Open data is not open; data is structured. Likewise Australia has not always been open. He showed immigration documents and The Real Face of White Australia site, which shows the documents that exempted people from the dictation test.

He talked about Morph, which gathers scrapers and helps people understand how to get information, as access is not given but taken.

He connected opening up data about Australia to how the country is not open to people.

He talked about demonstrating against American bases like the Pine Gap Facility. Digital tools let us see things differently. Twitterbots can show random stuff that challenges us. They can protest things. His OperationBot creates new nonsense names for government initiatives as a way of protesting initiatives like those about immigrants.

His last project is a project of eyes. He talked about connecting with ordinary people other than those who came for the British Association for the Advancement of Science meeting in Australia.

His talk was a lovely tour that had relevance to current events. He put it up at http://discontents.com.au/unremembering-the-forgotten

Opening Closings: John Burrows and others

John Nerbonne introduced Willard McCarty, who introduced John Burrows, who started by talking about Gough Whitlam, a political pioneer who brought Australia into the 20th century. He ironically joked that current politicians are trying to do that again. He then went on to talk about his Delta measure. He showed how it can be used as a way of measuring difference between documents. We have good markers for authorship, chronology, and other aspects.
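Burrows's Delta compares texts by standardizing the relative frequencies of the most frequent words into z-scores and taking the mean absolute difference. A minimal sketch, using toy frequencies (not figures from the talk):

```python
from statistics import mean, stdev

def burrows_delta(freqs_a, freqs_b, corpus_freqs, words):
    """Burrows's Delta: mean absolute difference of z-scores over a
    fixed list of frequent words. corpus_freqs maps each word to its
    relative frequencies across all texts, used to standardize."""
    total = 0.0
    for w in words:
        mu, sigma = mean(corpus_freqs[w]), stdev(corpus_freqs[w])
        za = (freqs_a[w] - mu) / sigma
        zb = (freqs_b[w] - mu) / sigma
        total += abs(za - zb)
    return total / len(words)

# Toy relative frequencies of two frequent words in three texts.
corpus = {"the": [0.05, 0.06, 0.07], "of": [0.03, 0.02, 0.04]}
text_a = {"the": 0.05, "of": 0.03}
text_b = {"the": 0.07, "of": 0.04}
print(burrows_delta(text_a, text_b, corpus, ["the", "of"]))
```

Smaller Delta values mean two documents use their frequent words in more similar proportions, which is why the measure works as an authorship marker.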

We had awards. The 2015 Paul Fortier Prize went to Micki Kaufman, "Everything on Paper Will Be Used Against Me". Bravo! Then we had poster prizes. The Anthem Prize for the best poster went to a poster on "Who's talking in Edgeworth's Novels". Another poster, on "The Aboriginal Dreaming Meets Virtual Reality", also won a prize.

The Roberto Busa Prize for next year was announced today. Helen Aguera is at the NEH and has worked hard over the years to develop the digital humanities. She will be awarded the prize next year and give a keynote talk.

We got a great tour of next year's conference in Krakow. Then we heard about 2017 which will be in Montréal. The Montréal conference proposed the Twitter hashtag #whatifDH2017 to talk about inclusivity. The 2018 conference will be in Mexico!

Now I get to go snorkelling on the great barrier reef.


Page last modified on July 03, 2015, at 12:52 AM - Powered by PmWiki
