Lessons from TAPoR: Evidence of Research Value in a Canadian Context

"What difference does human science research make?"

The research culture in Canada is changing. It is no longer enough just to conduct research; we are expected to share our research broadly and to be able to explain how it benefits Canada. Further, humanities research is being compared with research from other fields: the humanities in Canada are expected to justify their value to society in ways that allow cross-disciplinary evaluation. It should not be surprising that, measured this way, we do not seem to benefit Canada the way medical researchers searching for a cure for cancer do.

Nonetheless, the Social Sciences and Humanities Research Council of Canada (SSHRC), in its Transformation consultation, started from the assumption that "Maximum Knowledge Impact" had to be added to SSHRC's core values: that human science research in Canada has to "build greater capacity for understanding research and its applicability", sharing new knowledge not just within the peer research community but also with the broader community. [1] More recently, SSHRC's newsletter _Dialogue_ (Summer 2008) ran a story on "Measuring impact" that argues,

"The debate over how to define the value of social sciences and humanities research is growing on campuses and communities around the world. Universities are being challenged to expand traditional criteria for tenure-track promotions to include more unconventional outputs, and governments are under increasing pressure to demonstrate the value of public investments in research." [2]

In Canada the push to define the value of research comes from a federal government that has been suspicious of the arts, humanities and social sciences. [3] SSHRC is responding by commissioning research on measurement and by trying to help researchers explain their work better so as to make the case for more funding. In this paper I will follow some of the questions that have arisen in the Canadian context, with a focus on the lessons learned in the TAPoR (Text Analysis Portal for Research) project: [4]

1. What are the different qualitative and quantitative indications of achievement that can be used by research projects? Which of these are appropriate in the humanities? In the paper I will survey the surprising variety of indicators that can be considered, depending on the project.

2. What sorts of statistics can humanities computing projects gather, especially about web projects? In this paper I will discuss the gathering of web statistics and their interpretation; a minimal sketch of what such gathering can look like follows this list. In Canada there are also ethical issues to be considered when gathering data about human activity.

3. How can projects organize so that value indicators are gathered efficiently? Large projects like TAPoR are now expected to gather data about the impact they have had, even if that data is preliminary. To do this appropriately, projects have to put in place communication and reporting systems that gather the right information. It is not enough to simply ask afterwards what happened: you get an incomplete picture if the gathering of information is haphazard.

4. Most importantly, do we in the humanities have traditions of assessment that we can draw on? Where we can draw on our own traditions, we are more likely to provide evidence of value that our colleagues understand. We also need to articulate our traditions of assessment and critique so that they can be included in comparative contexts. If we let the humanities be assessed by metrics from other areas, we will not only appear irrelevant, we will also find it hard to gather data and share it with colleagues. In this paper I hope to argue for assessments of value that fit the humanities and draw on our traditions of review and critique.
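
To make the second point concrete, here is a minimal sketch, in Python, of the kind of web statistics gathering discussed above. It assumes a server log in the common Apache/NCSA format in a file called access.log; the file name and field layout are illustrative assumptions, not a description of how the TAPoR portal actually logs use. Hashing visitor addresses before counting them is one simple response to the ethical issues around gathering data about human activity.

    import re
    from collections import Counter
    from hashlib import sha256

    # Matches common/combined Apache log lines:
    # IP, identity, user, [timestamp], "METHOD PATH PROTOCOL", status, ...
    LOG_LINE = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
        r'"\S+ (?P<path>\S+) [^"]*" (?P<status>\d{3})'
    )

    page_hits = Counter()
    visitors = set()

    with open("access.log") as log:
        for line in log:
            m = LOG_LINE.match(line)
            # Count only successfully served pages.
            if not m or m.group("status") != "200":
                continue
            page_hits[m.group("path")] += 1
            # Hash the IP address rather than storing it, so that
            # visitors can be counted without keeping identifying data.
            visitors.add(sha256(m.group("ip").encode()).hexdigest())

    print("Unique (hashed) visitors:", len(visitors))
    for path, hits in page_hits.most_common(10):
        print(hits, path)

Even a simple count like this has to be interpreted with care: raw hits conflate search-engine crawlers with human readers, and a "unique visitor" is really a unique address, which is part of why interpretation, not just collection, is at issue.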

In sum, in this paper I will use the TAPoR project as an example for discussing the situation around research metrics in Canada. I will outline how this has become important in the last five years, especially under the current government. I will survey metrics that we have used, discuss the gathering of web usage statistics, and finally talk about our traditions of review.

Notes

[1] Page 10 of Volume 1 of _From Granting Council to Knowledge Council_, January 2004. On page 3 the introductory "A Message from SSHRC's Council" says, "The role of researchers is not only to develop knowledge, although this is very important in and of itself. They must become far more proficient at moving the knowledge from research to action and, in the process, at linking up with a broad range of researchers and stakeholder-partners across the country." By implication we have to justify the value of our research.

[2] "Measuring impact; Weighing the value of social sciences and humanities research", "Dialogue", Summer 2008. <http://www.sshrc.ca/newsletter/2008summer/measuring_impact_e.asp>

[3] For example, two arts programs and a new media program were cancelled by the current Harper government.

[4] For more on TAPoR see <www.tapor.ca> or the TAPoR portal <portal.tapor.ca>.
