SciComm Conversations: “Does public engagement make science better?”

Listen to “Does public engagement make science better? Guest: Prof Bruce Lewenstein” on Spreaker.
Transcript
Achintya Rao [00:09]: Welcome to SciComm Conversations. My name is Achintya Rao. In our second season of the podcast, we have been hearing from science-communication researchers about their work. This episode was also recorded at the PCST conference held in Aberdeen last year. In it, I spoke with Professor Bruce Lewenstein from Cornell University on the question of whether public engagement makes science better. Bruce presented this question in his talk at the conference, in which he discussed work done at Cornell’s Center for Research on Programmable Plant Systems, or CROPPS.
Bruce Lewenstein [00:46]: I’m glad to be here. I’m Bruce Lewenstein. I teach science communication at Cornell University, which is in New York State, in the United States.
AR [00:56]: Thanks very much for being with us today, Bruce, on this episode of SciComm Conversations. We’re going to be talking today about the hypothesis that public engagement can change science. But before we go into it, can you tell our listeners a bit more about the CROPPS project that you’re involved with?
BL [01:12]: Yeah, so this is a big project, a multi-year, multi-institutional project funded by the US National Science Foundation to do research on… it’s called the Center for Research on Programmable Plant Systems. CROPPS. And the idea here is that we actually don’t know that much about how crops – plants, in general – sense their environment and send signals back out to their environment about, “I’m thirsty, I need more water,” or “I need more nutrients of a particular kind.” So the fundamental part of the project is to study plants and to understand more about how they communicate and how we can communicate with them, for example, to send a signal to a plant that says, “Hey, there’s a, uh… the weather prediction is it’s going to be warm the next few days; you might want to stock up on water.”
There are four technical research themes in this, uh, big center, and then there’s a fifth theme, which is on social and ethical issues in digital biology. “SEE,” we called it. And the idea there was, if we study… if the scientists who are involved in the CROPPS project, they’re all going to be doing public engagement, and what is it that they’re doing in that project in their public engagement? We can assess how audiences change, but what about the scientists themselves? How are they going to change, and how is their research going to change? Will that… will it change as a result of public engagement?
AR [02:55]: So that is, then, the hypothesis that you’re setting out to test?
BL [02:58]: Right. So the hypothesis here is those of us who believe in public engagement, and we, we’re past the dialogue model, and we’re going to have two-way communication… If you take that seriously, that means that both sides of the conversation have to be willing to change. So we know how we want publics to change, but how do we want scientists to change, and do they change? And what change… what is it that changes? And in particular, does their science change? Do they ask different questions? Nature isn’t going to change. Nature is there, but what we know about nature comes through the process of asking questions, and I wanted to see… And so we all assume that this public engagement is better for science. But we haven’t actually tested that, that it’s better for science. So that’s what I want to find out.
AR [03:54]: You’ve mentioned that we all believe that public engagement can change science, or a lot of us do anyway. But that objective of public engagement, having an effect on science and the scientists themselves, where does that idea come from?
BL [04:08]: It comes from the emergence of the idea of public engagement in science, which was a reaction to the idea… partly it was a reaction to the idea that if we just tell people things then everything will be better. But it came in part from a political position, a belief in democratisation, a belief that we will be able to solve the world’s problems better if we actually listen to communities and groups, and respond to their priorities and their interests. And that’s a political position. And it drove a lot of work in environmentalism and so on. And it has, in fact, created the field of environmental studies in a way. But it’s not something that’s really been tested very well regarding science. So it’s an idea that emerged 25–30 years ago, actually from a lot of research originally in the UK that was… that was the product of the Bodmer Report from the 1980s, which led us to really recognise we have to understand audiences. But many of the people who were involved in some of that work were quite committed to a political ideal of empowering communities. But if you’re empowering them, then somebody else has to give. And in this case, it’s the science that we think would give…
AR [05:49]: Right. That comes back to the sort of like two-way, the exchange… You can’t just… Even if the empowerment has been happening through engagement, it can’t just be one way, is the hypothesis. So this hypothesis that public engagement – science communication – can change science, how did you test it with the CROPPS project?
BL [06:06]: So here’s where things got a little tricky. So our original idea was that we were going to do a lot of interviews with the scientists, document their positions on public engagement, what their science was, follow them as they participated in engagement and then do sort of post-test things. For a variety of reasons, that never quite happened. So we’ve been part of the project. The project is now almost five years old. And what we’ve done is documented various public-engagement activities. And we’ve observed the meetings in a very… in a non-systematic way. But we’ve been… we’re part of the team. As I said, we’re a co-equal research team.
AR: You’re embedded, you’re not observing externally.
BL: Yeah. Yeah. So in a sense it’s an anthropological study. The problem is we don’t actually – for ethical reasons and for complicated reasons – we actually don’t have permission to quote people and observe. So actually testing the hypothesis is actually going to be somewhat harder than I had hoped.
AR [07:15]: Yeah. Were there any surprising findings from your work? Oh, actually, let’s… let’s take a step back. Coming back to the hypothesis. How did it work out?
BL [07:31]: So I don’t know yet. The project is still only a few years old and this is a long-term thing. What we are seeing is there are absolutely some scientists in the project who were already doing things, where they were engaging with publics in ways that had a fundamental impact on the research they did. So I’m thinking of one researcher who… he studies corn… corn or maize. Which term is used in the UK?
AR: I think both will do. I grew up hearing it in India, hearing it referred to… when it was a crop, it was maize; when it was consumed, it was corn.
BL: Anyway, corn. So as a public-engagement activity he brings different kinds of popcorn and different colours of popcorn. So you have… especially if you have kids, you know… But what he would have… what he would do then is he would have the kids in high-school projects, in middle-school projects talk about what traits they wanted to see in the corn. And he would use that to decide what traits to study in the next field season. Because he wasn’t… he didn’t particularly care about what colour the corn kernels were. Although it has some bearing on what market capabilities, what possibilities there are. But it was… it’s a way of getting at the underlying genetics. And that’s what he was interested in: the underlying genetics. So the exact phenotype was actually a little less important than understanding the genetics. So… so he was doing that. Another researcher at a different university but also part of the CROPPS team goes into high schools and has them do DNA replicat–, er, DNA manipulation. And again, he takes the data that they’re producing – on the projects the students have chosen – to be part of the data that he analyses for some of his genomic work.
So we definitely have some anecdotal examples of things they were already doing. And in particular, one of those – the popcorn guy – reached out to us. He said, “Tell me more about this social and ethical issues stuff, and help me understand it better.” So we’re seeing… and we’re seeing others. I mean, we’ve seen examples of a graduate student who says, “I could be working on one of several genes. Maybe I should talk to the public to find out which of those genes would be of interest to them. And then I’ll… ’cause I can work on that one, ’cause I… the genetics is the same. So I might as well work on the one that actually has value to the community that I’m working with, that I want to take up my innovation.” So we see things like that.
AR [10:31]: So coming back to the question that I started to ask earlier, were there any surprising findings in your work? Were there anything… any observations, any conclusions that just made you pause and think, “Oh, I hadn’t really thought about that.”
BL [10:45]: So, two elements… two things. One: not everybody cares about science. That’s actually hard for those of us who are involved in science communication to… accept, to come to terms with. So one of my students, Sohinee Bera, went to study some… had an opportunity to go talk with migrant farm workers. So these are in the United States; these are Spanish-speaking workers who actually live in Mexico, cross the border into the United States legally to work in the fields and then go back. And they do this every day, they commute every day. But it’s low literacy; even in Spanish, their reading levels are low. No English to speak of. And she wanted to ask them questions about their attitudes towards plant biotechnology. I wanted her to ask those questions! It wasn’t her idea, it was me, me and the team. And of course what she learned was, they’re not interested in plant biotechnology.
They gave… they gave answers and she got some interesting data. But really what we learned was if we want to work with marginalised populations, we have to think very, very differently about what our goals are. And what they… we have to think in their terms: what are they interested in? We were… as an extension of her work, she was trying to connect with some other farm-worker groups and was told, “You know what, our problems are actually not about plant biotechnology. Our problems are about access to health care and legal status in the United States. Given that that’s not what you’re working on, we don’t have the time to work with you to do public engagement about science. It’s just not important to us.”
AR [12:37]: There’s like a hierar–… Maslow’s hierarchy of needs, almost. If you don’t have the basics, then why are you going to think about these things? And of course there are lots of communities around the world, groups of people, that don’t have those basic needs met, and so they don’t have… It’s almost like we have to appreciate the luxury that we have to think about science and science communication.
BL [12:56]: Exactly. We have that luxury. But also it means if we talk about upstream engagement, upstream with whom? Which tributaries? I’m good at pushing metaphors too far! [laughs] Which tributaries to the upstream engagement should we be exploring? Because it turns out that some of those tributaries are people who are uninterested in where the water is going. They’ve got other […] and issues. The other thing, for me… I’ve been forced to rethink what I mean by the effect on science. So I like to think of science in three categories: science as a body of human knowledge; science as a process, a way of approaching the world; and science as institutions. And when I say I like to think of it, what I mean by that is, when people say science, what do they mean? Sometimes they mean the body of knowledge. Sometimes they mean the process. But sometimes they mean the institutions of science: universities, research institutes, government agencies. And so when I say what is the effect on science, I have to think about all of those. Not just does it change the body of knowledge. But also does it change the way we do science? Does it change some of the institutions of science?
And in particular… I’ve made a presentation similar to the one that I’m making at the conference in Aberdeen; I made it to a group in South Africa about six months ago. And somebody who runs a science centre there, someone I’ve known for many years, asked me a very nasty, harsh question. He wasn’t nasty. He just asked a harsh question in the Q&A, which is, he said, “In my science centre, I’m working with communities who have zero access to science. Their ability to walk in the door and just have fun with something that’s got the label of science… I have opened up so many possibilities for those people, career opportunities, areas that they never even knew were open to them to be interested in.” He said, “I don’t care if they go become lawyers or accountants. They at least considered, ‘Oh, maybe I could be a scientist or maybe I could do something related to science.’” And if one of them out of a hundred – he didn’t say it that way, but you know – goes into science, then that’s great. So his point was he’s impacted the community of… there’s now a larger community of people who consider science to be an area of possibility for them. And that’s an impact on science.
AR [15:51]: So a different way of almost upstreaming the impact in some senses, like a longer term, it’s not an immediate thing. And it isn’t changing science through the act of the public engagement, but eventually someone is going to go and do the science, and that will change the science.
BL [16:10]: And it changes the community of who chooses questions to pursue in science. It changes the range of ideas and perspectives that are brought into science. It raises… it changes the public discussions, but I don’t actually want to go to the public side. I want to stay on the science side. And as you said, if it changes who goes into science, then that ultimately changes the questions that will be asked. And therefore, what we will know about the natural world, the natural or the constructed world or the virtual world.
AR [16:51]: We’ve touched upon this a little bit, and you have spoken about researchers who are already in some way changing the… changing the… choosing the questions that they pursued as a result of their public-engagement efforts. But were there, from your study, were there any other benefits to the researchers themselves from the public engagement? And I don’t necessarily mean in terms of changing the science, but did they perceive any benefits to themselves?
BL [17:21]: Yeah. So I don’t have systematic data on that. That’s where our… the fact that we weren’t able to do quite the original systematic study we’d hoped for. What we do see is some anecdotal stuff about… as we already know, a number… many scientists say that they do public engagement because they feel some obligation to do it as a result of being publicly funded. And by obligation, not an obligation they don’t want to fulfil, but it is one of the things that they’re… And so what we have seen are some examples of “Yeah, I feel good when I do public engagement because I am fulfilling an obligation that I believe in, and now I have a structured way to do it. I wanted to do something public, and now I’ve had a chance to do it.” So there is that kind of thing. There are also some of the inherent benefits: they learn to… to speak publicly, they learn to do activities with school kids, so they learn some things – they feel they’ve gained some skills.
AR [18:30]: Yeah. When I was doing my research, one of the benefits that people found from doing public engagement was that they were in a better position to present their work to their own colleagues, because of the diversity of backgrounds and subfields that existed within the field. So once you get rid of the assumptions about what people know, and you start talking to the right audience, those skills translate.
BL [18:47]: These are absolutely… scientists are absolutely learning transferable skills: about presentation, about understanding your audience, about thinking about what your goal is in this presentation with this audience. And those skills are useful, whether you’re talking with school kids, or at a science cafe with people who’ve already had a couple of beers, or… so it wouldn’t be a science cafe, it would be a science pub! Or whether you’re talking to your colleagues, or maybe colleagues in a slightly different area, or you’re a young researcher and you’re trying to get a job, and you’re pitching yourself to a department that’s a little different from your degree department. Yeah, they absolutely learn transferable skills. So growth as a professional scientist… I was looking for a label: growth as a professional scientist.
AR [19:42]: Yeah, that’s a really good way of looking at it as well, I think, because how we then define ourselves as scientists is not just doing the research, but it’s also then all the other human aspects of the people that you engage with, whether they’re within your field or outside of it. You have… you’ve told us that you were not able – for a variety of reasons – to pursue the research in the way that you originally envisioned. So there must be some limitations that you have encountered. You’ve been quite open about this in the talks that you’ve given about the research. Could you tell our listeners some of the limitations as well?
BL [20:15]: So one of the key ones is actually the ethical-research issue. So if you’re doing research with human subjects, you need their permission. We wanted to do… and typically if you are doing anthropological work, you are going into a community, and you’re getting permission from the community to be there and you’re promising privacy and anonymity and all that kind of stuff. We had the problem that we have a geographically distributed network of somewhere between 100 and 150 people, who are coming and going, and who are in a variety of different kinds of meetings – small group meetings, big centre-wide meetings, some once a year in person, other times… And the logistics of getting IR–, well, in the United States we call it the IRB, the institutional review board… ethical approval undid me. [laughs] There were some, you know, there were some reasons for that that I won’t go into, but it just, it was really… how are we going to handle permission? Especially if some people would say, “No, I don’t give permission.” And we wouldn’t always know who was in the meeting. Or they would say, “Yes, I give you permission, but you can’t record.” And we’re recording all these meetings. I mean, the meetings are being recorded for internal purposes, so am I allowed to use those recordings for my research purposes if they haven’t given permission? So these things actually turned out to be more complicated than I had anticipated, and this was also right early in COVID, and it was just more than I could handle.
AR [21:58]: You also mentioned in your talks the challenge of hypothesis testing when you don’t have a control group. How would you even go about that? Would you have a group of researchers, long-term, who are told, “You are not allowed to talk about your research with anyone other than your peers,” almost, and even there, limited…
BL [22:17]: And you have to be in a five-year or hopefully ten-year NSF-funded… no, this is the problem, there is no control. And you know, I’m being partly facetious when I say it, that there’s no control group. But I am also, because I’ve framed it as this is a hypothesis test, well, in order to test a hypothesis, you need a control group. So this is [a] partly rhetorical piece about this. I think what it does involve is our having to be reflective as researchers about what can we know, and what is the value of different kinds of systematic data, different kinds of observational data, looking for anecdotes where people chose not to…
Right, so the classic examples here are from the early days of nanotechnology, which came five-to-ten years after the crisis of GMOs. A lot of the nanoscientists were… the reason they wanted to do work on public engagement and on social and ethical issues was, they said in so many words, “We don’t want to have happen to us what happened to GMOs.” They believed… they believed GMOs were good science, they believed their nano was good science, but they didn’t want the products of their nano to be prevented from being pursued in the way that many GMO scientists felt they were being blocked. So we have… we’re looking for counter-examples, and this gets to the question of: if there is no controversy, does that mean the public engagement has been successful? Or are there other ways of thinking about this? So the lack of a control group is partly a matter of, can I find other examples where people have or have not engaged in public engagement, and think about what has happened to the science that those people were pursuing?
AR [24:32]: I mean, there’s… there’s always the possibility of looking around the world – not every science culture has fully embraced… different cultures are on different paths of the trajectory towards this sort of fully upstream public engagement. And so you could always look at different cultures and researchers working in different environments, but I imagine it would be nearly impossible to disentangle socio-cultural factors from purely public engagement.
BL [24:59]: Yeah, yeah, I don’t think we can. And as you said earlier when you were talking about your own work, there are different sciences, and the opportunities, the ability for public engagement to affect high-energy physics is very different from the ability of public engagement to affect the development of new materials.
AR [25:22]: ’cause I was going to say, you used the word controversy earlier, and often our eyeballs are drawn towards controversies and risks and all of these sort of scary concepts in science, but there’s a lot of science that is just incrementally advancing some niche domain of knowledge that is – relatively speaking – uncontroversial and generally ignored by the populace as a whole. So yeah, how would, you know… like I said, I don’t think all sciences have the same capacity for public engagement. So how would you test the hypothesis in a different… you mentioned high-energy physics – we spoke about this earlier, before we started recording… Could you test the hypothesis? How would that affect the research itself?
BL [26:02]: So if I were in an ideal world, which I’m not, I know that, you know, I would first of all at the very beginning of a project, at the point where some researchers are saying, “Hey, we’ve got an idea for a… There’s a call for proposals. Let’s put together a proposal.” I would want to document at that time what those people, what they are imagining. We talk sometimes about visions. What’s the vision that they are pursuing? And document that as well as possible. Document what kinds of public-engagement activities they engage in. Document the visions that they have at the end of the project. And through observation and interviews and so forth, understand how we got from the first set of visions to the final visions and what role public engagement might have played in that. Again, we have a problem of controls, and we have a problem of clearly I’m an embedded researcher at that point, because I’m having these interviews and so they are reflecting on what they’re doing. So maybe I leave that part out. But even if I did that initial set of interviews, I’ve had an intervention.
AR [27:29]: Exactly. Yeah. So is it the very act of thinking about it might change… without having to do the public engagement… just getting to think about what would public engagement mean, you’ve already changed.
BL [27:39]: I’ve already changed. So it’s going to be a tricky question. Again, I’m not claiming this is in any way unique. Many people who do anthropological research, any kind of sociological research, anything with communities – they face this all the time. So I’m being a little… if I knew that work better – there are so many more things I want to know, that I need to read – there might well be ideas there for how to resolve this kind of problem.
AR [28:09]: Yeah, but overall I really love the idea of empirically testing a hypothesis, and good luck with the rest of your research and with this particular project as well. Thank you so very much, Bruce, for taking the time to talk to me about…
BL: It’s my pleasure. Thanks very much.
Music for SciComm Conversations is by Brodie Goodall. Follow Brodie on Instagram at “therealchangeling” or find them on LinkedIn.
SciComm Conversations, with the exception of the music from Game Changers, is released under the Creative Commons Attribution 4.0 licence.
The COALESCE project is funded by the European Union to establish the European Competence Centre for Science Communication.
Views and opinions expressed on this podcast are those of the guests only and do not necessarily reflect those of COALESCE or of the European Union.

