Kate Sherren

Landscapes - People - Global change

MES Scholarship opportunity: How can we learn to love the renewable energy landscapes of the Anthropocene?

Wind turbines near Amherst, Nova Scotia, with a train passing.

I have a new Legacy scholarship opportunity open for very high-GPA domestic students aiming for MES entry in September 2017.  Please get in touch if you think you might be a good fit, or to discuss other opportunities that close this fall such as Nova Scotia Graduate Scholarships (also open to international students) and SSHRC (domestic only).

Landscape impacts are oft-cited barriers to changes that are otherwise agreed to be necessary, such as those implied by a transition to renewable energy sources. Many examples exist, however, of deep attachment to man-made and otherwise purely functional landscape features, such as lighthouses, factories and hydroelectric dam headponds, attachment that in some cases extends far beyond their utility. The landscape of the Tantramar Marshes, the low-lying area that links New Brunswick and Nova Scotia, presents a unique opportunity to explore how people attach meaning and form attachments to large, utilitarian infrastructure. A natural experiment is occurring in the region, created by the overlap of the 2014 dismantling of the Radio Canada International (RCI) shortwave transmission towers (constructed in 1944) and the construction of fifteen 2.1 MW wind turbines in Amherst in 2012 by the Sprott Power Corp. Prospective students might use interviews, archival data, social media and/or spatial analysis to:

  • Understand the process by which attachment is formed to man-made, functional landscape infrastructure, over time;
  • Understand what drives the acceptance of and attachment to functional landscape features by locals; or,
  • Build insights about how to facilitate functional landscape change without sacrificing sense of place.

When to call a social scientist (or how to fool one)

In science, when human behavior enters the equation, things go nonlinear. That’s why Physics is easy and Sociology is hard. (Neil deGrasse Tyson, Twitter, 5 Feb 2016)

It is heartening to see increasing support for interdisciplinary applied research from funding bodies. Some countries (like Canada) still largely divide funding programs by discipline, requiring researchers to carve out feasible standalone disciplinary subprojects within more interdisciplinary projects and subject them individually to the rigours of granting bodies. By contrast, places such as the European Union welcome large, integrative and synthetic research projects. It seems clear, however, that such opportunities do not necessarily increase the likelihood of interdisciplinary team research. In fact, they sometimes seem to encourage members of more disciplinary teams to extend into unfamiliar domains to meet granting requirements. It is human nature to want to work with people similar to us, whom we understand and with whom we share language, methods and a sense of what constitutes good evidence. Specifically, based on what I have been asked to review for numerous journals over the past year, it is common for teams of biophysical scientists to engage in social science research in a way that would be unheard of in reverse. In many cases the first authors are students, themselves ‘converted’ from biophysical research to take on the social angle, poorly mentored by a team of biophysical scientists.

I am very sympathetic to the drive to reach outside familiar domains in research. My own career is not linear, and my set of interests and methods is broad. I have many times felt the terror of the dilettante at the conferences of the various disciplines into which I ‘dipped’ (before I learned to stop going to disciplinary conferences). My first degree was Geography, so I am a natural ‘borrower’. I find my natural home at applied conferences and in problem-based journals, where researchers and readers alike are more concerned with answering an important question than with the paradigm within which the answer was found. Unlike many, I am enthusiastic about the creative mixing of methods and theory as appropriate to solve problems, but I believe there is a blindness and an impotence to social science that is done in the image of biophysical science, without building on (or even awareness of) an extant rich body of understanding about how people think, feel and behave.

Purity, a great (and relevant) webcomic by xkcd.

Red flags

There are five common flaws that I see in social science papers led by biophysical teams, though of course they are also committed more broadly. Together, they are indicators of a positivistic mindset that has been set to a post-positivistic task – quantitative social science – without adequate recognition of the ways that people differ from biota, and of the fact that many scholars are already working in that space and have made substantial headway.

  1. Focus on sample size above instrument design. It is critical in any research using statistics to acquire a large enough sample of the desired population that inference can be made. Occasionally, however, it is clear that the design of the research instrument and its application have been sacrificed to the pursuit of a large sample. The sample is assumed to be the ultimate mark of quality, and is used to generate blinding amounts of statistics, perhaps in the hope that the logic of the task that generated them is not interrogated. Sometimes the pure distracting power of such academic ‘flashbang’ means editors publish the work, assuming that the presence of such tables indicates the work is rigorous. When I protested to one editor, I was told that the use of complex statistics, so long as the tools are applied with technical correctness, renders the work valid even if the insight is minimal because of poor instrument or research design.
  2. Use of convenience samples. A common sacrifice in the quest for a large n is the nature of the sample. In ecological work it may take a long time to find the species of interest, but once you have done so, the only limit to finding enough to sample is time. As long as individuals meet the criteria you can take what measurements or observations are needed to suit the study. By contrast, one of the great challenges of social science is how to find your population – define them and determine their prevalence for sampling – and find a way to gather information ethically from a robust number or diversity of them. You can’t force people to participate, unless your study depends entirely upon observation in public places. You can send surveys and reminders, you can go door-to-door, you can set up desks in high-traffic areas, but people are busy and can still say no. Social scientists focus on justifying survey effort and the validity of the sample achieved, and thus the insight, but would not simply ask different people in order to fill a deficit. A biophysical researcher, by contrast, may assume a person is a person, regardless of context, and turn to a convenience sample (e.g. tourists instead of residents) even when to do so renders the question they are asking utterly nonsensical. The salience of the question, the respondents’ ‘stake’ in the subject and the outcome of the research, is critical for generating meaningful responses.
  3. Ignoring context. Context also matters substantially in how data are collected from the chosen sample. When questing for a large sample size, it is common to use multiple interviewers. Rarely, however, do biophysical researchers doing such work account for (or even seem to recognize) the ways that interpersonal dynamics may bias the resulting answers. This is not surprising, as the gender and age of someone taking biotic samples do not generally affect the measurements. The gender and age of different interviewers will create biases within subsets of the data, however, as research participants respond differently to one interviewer than they might have to another. Moreover, research participants who are interviewed alone may respond differently from those who are interviewed with their partner and/or their children at their elbow, listening to what they say. These biases must be recognized and discussed when working with people.
  4. Gaming Cronbach’s Alpha. Another red flag is the misuse of a common social science metric to generate indices (often called ‘scales’) based on responses to related questions. Cronbach’s Alpha was developed to help social scientists assess whether responses to a set of questions were consistent enough across the sample for them to be collapsed into a single measure. That is, is each person’s set of responses internally consistent, even if the responses range widely across the sample? An acceptable Alpha suggests reliability, but not necessarily validity, i.e. that the index measures what it is intended to. Many researchers ‘game’ this metric (not just biophysical converts), testing various subsets of their questions to identify the ‘best’ score, and simply dropping the questions that are being answered differently. Biophysical scientists seem particularly prone to trusting the statistics over the respondents. The danger lies in blind acceptance that the questions left standing – those that give the best alpha – are a genuine measure of the phenomenon that was previously represented by a larger set of questions. The remaining questions must be interrogated to generate a meaningful index name that reflects the new conceptual coverage, and some attempt made to understand why other questions were not answered similarly. There may be a logical set of unidimensional subconcepts embedded within the question set that could be converted into their own indices. Moreover, a set of questions that more comprehensively covers the phenomenon may still be better than a subset, even if the alpha is lower than it could be. Such statistics are meant to be an aid to, not a replacement for, sociological thinking.
  5. Lack of engagement with social science literature. The final red flag is a lack of engagement with existing social science research, on the assumption that there is nothing to build on; this is very characteristic of biophysical researchers undertaking qualitative or quantitative social methods. The missing literature review is evident in the design of research, for instance in survey instruments that do not draw on established scales, concepts, theories or typologies from related work, leading to weak instruments. It is also evident in discussion sections that ignore existing social science research on the same or related topics, for instance discussing whether survey responses were correct in relation to the biophysical phenomena the questions cover, instead of how the responses relate to what we know about what guides human behaviour.
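The alpha gaming described in point 4 is easy to reproduce. Below is a minimal sketch in Python (my own illustration, not part of the original post; the `cronbach_alpha` helper and the synthetic Likert responses are assumptions for demonstration) showing the standard alpha formula, and how dropping an item that is answered differently inflates the score even though the trimmed index now covers less of the original concept.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of scale scores:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(summed scale))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each question
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed index
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert data: rows are respondents, columns are questions.
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(200, 1))                            # latent attitude
consistent = np.clip(base + rng.integers(-1, 2, size=(200, 3)), 1, 5)  # 3 related items
noisy = rng.integers(1, 6, size=(200, 1))                           # 1 off-dimension item
responses = np.hstack([consistent, noisy]).astype(float)

full_alpha = cronbach_alpha(responses)            # all 4 questions
trimmed_alpha = cronbach_alpha(responses[:, :3])  # drop the 'misbehaving' question
print(full_alpha, trimmed_alpha)                  # trimmed alpha is higher
```

The trimmed score looks more ‘reliable’, but nothing about the dropped question was wrong; it simply measured something the other three did not. That is exactly the judgment call the metric cannot make for you.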

Of course social scientists do this stuff sometimes, too. But we should know better.

Lab alumna on Quirks & Quarks

Ellen Whitman identifying post-fire understory vegetation in Northern Alberta for her PhD at the University of Alberta.

Exciting to hear Ellen Whitman, MES 2013, on CBC Radio 1’s Quirks & Quarks this past weekend, talking with Bob McDonald about her summer field season on post-fire impacts in the north. She did a great job, and touched briefly on her Masters work with Eric Rapaport and me on fire at the peri-urban fringe of Halifax. She is now working on her PhD in Mike Flannigan’s lab at the University of Alberta, looking at fire regimes and adaptation under short-interval fires, combining field observation and remote sensing. Exciting to hear about her progress, and rather awe-inspiring to hear her expertise, so eloquently and smoothly delivered during the 15-minute segment.

Another successful Globalink internship

With Jingwen (June) Qin on her last day as a Mitacs Globalink intern.

Farewell to Jingwen (June) Qin, who headed back to China early this morning to begin her final undergraduate year of urban planning at Wuhan University. She has been working with me on a research project this summer, funded by Mitacs Globalink, using Sina Weibo social media to understand Chinese student perspectives on Halifax. It was great to have her overlap with Ruoqian (Joy) Wang, last year’s Globalink intern, who has just arrived to begin her MES at SRES with Karen Harper and me. I hope June takes a similar path back to us next year. Thank you, and bon voyage, June.

What WAS said?

I was in Moncton for a meeting with DUC on Friday when the What Was Said report arrived in my email inbox. This report is a compilation of the stakeholder process that NB Power has so far undertaken around the future of the prematurely aging Mactaquac hydroelectric dam. They are careful not to call it a social science report or stakeholder analysis, which is appropriate, as it is the combined work of Corporate Research Associates (CRA) polling and coding by the PR firm National. Report authors suggest that New Brunswickers voiced almost unanimous concern about the environment (particularly fish passage), taxpayer burden, local and renewable energy sources, local suppliers, and transparent process. There was a nice geographic comparison of priorities: based on self-declared postal codes, those in the region were predictably more concerned with community impacts than cost, and vice versa for non-residents.

The cover of our new report based on a 2014 survey of NB residents on Mactaquac and general energy issues.

They could make much more meaning from what was collected. CRA did two surveys of 400 New Brunswick residents in 2015 and 2016, before and after the public engagement, but the results are aggregated (e.g. 59% heard of the Mactaquac issue) rather than compared across years to demonstrate changing awareness. The awareness indicated here is also captured as yes/no, rather than by levels of awareness as we did in 2014, when only 7% of our 500 survey respondents considered they knew quite a bit or a lot about the Mactaquac decision. Our new report, Mactaquac and Beyond (2016), delves into the drivers of various opinions on Mactaquac (among other things), revealing an imbalanced tug-of-war between economic benefits (rebuild with power – the dominant opinion) and environmental impacts (remove – minority view), with rebuild without power a compromise option driven by landscape and cost concerns. For female respondents the issue was a local one, particularly influenced by self-judged knowledge of the Mactaquac issue, while for men preferences rode on larger principles such as conservatism or position on hydroelectricity. Risk was a driver for the ill-informed, which suggests that misinformation about the possibility of failure may be influencing results. This kind of impartial social science is important, but also usefully associates perceived impacts and preferences with options, which is something that seems to have been intentionally avoided in the NB Power process as reported.

Thematic coding is a process of generalization and erasure, and must be undertaken very carefully and by trusted parties. The data collection itself can introduce many biases as well. It was interesting to see the different themes emerging depending on the type of data collection/intervention: Mactaquaction was clearly bombed with ‘keep the dam’ comments (note there was no way to avoid multiple entries), though environment topped other modes. Yet, what is in that huge ‘Other’ category, NBP? Later analyses of community and fish passage sessions include First Nations themes lumped with Infrastructure/Transportation and Other. Those are strange bedfellows. This is a green versus green debate: climate mitigation and an adapted headpond versus fish passage and hydrological integrity. What was included in ‘environment’ and how were these sources coded? This work suggests that fish passage only came up in the formal submissions, and transparency only in community sessions, but the appendices themselves belie this.

Our first Mactaquac paper is less-than-lovingly reproduced on pages 149 to 159 of the largely unsearchable scanned appendices (not including my less formal commentaries and feedback). The appendices include formal submissions from groups such as WWF, NCC and the NB Salmon Council dated as far back as January 6, 2015, including one from energy project collaborator Tom Beckley (p. 173-178) about his multiple relationships with the dam landscape as landholder, taxpayer, Local Service District committee member and scholar. His piece is a nice microcosm of the complexity of the Mactaquac decision. None of these thoughtful submissions is given any response. Appendix E, the Public Correspondence Snapshot, goes back even further, to December 15, 2014, and is filled with rich stories and important questions (all anonymized). The parts we can see scanned sometimes include quick replies that suggest an answer to the proffered questions will be forthcoming (see below): how powerful it would have been if these submissions AND answers were posted online as they arrived! As mentioned in our recent Mactaquac paper, this could help bring the conversation from “me” to “we”.

A typical answer from NB Power on a public submission on Mactaquac submitted via the stakeholder engagement website (Appendix E p. 234).

One public submission from a high school student demonstrates misinformation about Mactaquac (Appendix E p. 246).

One submission in particular caught my eye, from a student at Nackawic High School who believes that the decision has already been made not to rebuild (p. 246), asking questions to complete a Journalism class assignment. My first thought was… since when does NHS offer Journalism? My elective options in 1989 were typing and child care. But second, and clearly more salient here: the fact that such misinformation existed in April 2016 (and perhaps still exists), likely transmitted by a parent or teacher, is disturbing. Unlike other submissions, there is no evidence in the Appendices that the student ever got a reply – even poorly informed questions should be addressed, maybe especially poorly informed ones.

So what now? What will happen with What Was Said, and how will it feed into the decision? When will we hear about First Nations consultation, though at least one group has come out for removal? When will we get more details on options 3 or 4, comparable to that available for 1 and 2? What about real estate analyses? Impact on the river ecosystem? The NB Power and NSERC funding to the Canada Rivers Institute so far has generated lots of data, available through their ArcGIS online storymap, but a little preliminary synthesis would be great.

© 2016 Kate Sherren
