#65 Investing in Knowledge: The Life Cycle of Research

Dr. David Naylor, Professor of Medicine at the University of Toronto and founding CEO of the Institute for Clinical Evaluative Sciences


Sept 25, 2019

On this episode, we explore the life cycle of research: from seeking funding to sharing findings that inspire. Grace spoke to Dr. David Naylor about science culture in Canada and the Fundamental Science Review panel that he chaired. This review reported on the science funding ecosystem in Canada and was key to federal research budget increases in 2018. Stephania also sat down with Dr. Alan Bernstein, President of the Canadian Institute for Advanced Research (CIFAR) and former President of the Canadian Institutes of Health Research (CIHR). He shared what makes a grant application great and more likely to be awarded funds, as well as the collaborative research that CIFAR is leading. We also spoke with Dr. Orli Bahcall, a Senior Editor at Nature, who gave us her perspective on publishing, the value of preprints, and impact factor as a measure of publication success. Finally, we dove into what we can do to advocate for more research funding and engage with the public about science. From inspirational ideas to a published article, the research process is an extensive and expensive one.

Written by: Zeynep Kahramanoglu

Dr. Ben Mudrak on Preprints
Fundamental Science Review
The Impact Factor
Guide to Scholarly Metrics
Illinois Library: Scholarly Metrics
CIHR Grants and Award Expenditures
CIHR In Numbers

Marija [0:00] Hi, I'm Marija.

Zeynep [0:01] And I'm Zeynep.

Marija [0:16] And welcome to Episode 65 of Raw Talk. Across Canada and all around the world, thousands of scientists work in labs on a variety of complex topics, many of which we've delved into on previous episodes of Raw Talk.

Zeynep [0:18] Each of the questions being asked by this community, from understanding the role of a particular protein to developing new imaging techniques to diagnose cancer, has a tremendous impact on Canadians and has fundamentally shaped our day-to-day lives.

Marija [0:31] But how is the value of each project determined when we have limited resources? How do we decide what science gets funded and published and what gets left on the laboratory floor?

Zeynep [0:51] On today's episode, we talk about the life cycle of research: from gathering support and funds for research ideas, to the review and publication of research results, to public outreach and engagement, all of which ultimately lead to further investment in research, beginning the cycle again to add to discoveries.

Marija [1:09] But where does the money for science come from? There are organizations and charities in Canada that offer funding, but the simple answer is: from you. Scientists compete in a series of granting programs, which, in Canada, are mostly organized by the federal government in Ottawa. Ultimately, the money for science comes from the Canadian taxpayer.

Zeynep [1:31] The Canadian Institutes of Health Research, or CIHR, is one of the agencies of the Tri-Council created by the government and tasked with funding health research in Canada. There are also the Natural Sciences and Engineering Research Council, or NSERC, and the Social Sciences and Humanities Research Council, or SSHRC.

Marija [1:49] Grant proposals are sent to expert panels in these organizations who then decide which projects receive funding from a set budget determined by the government. These agencies are also involved in awarding competitive scholarships to graduate students and incentivizing early career scientists to work in Canada through a variety of programs.

Zeynep [2:10] So far, so good, but unfortunately, science and science funding have become a bit of a neglected topic in Canadian politics. Funding has stagnated and we are far behind our peer countries.

Marija [2:21] During the last election in 2015, however, the Liberals' platform spoke broadly of the importance of science and evidence-based decision making. And as one of his first acts as Prime Minister, Justin Trudeau appointed a Minister of Science.

Zeynep [2:36] Grace had the opportunity to speak with Dr. David Naylor, former president of the University of Toronto and chair of the recent Fundamental Science Review panel, which was created in 2016 at the direction of the Minister of Science, Kirsty Duncan. Dr. Naylor told us about the panel and their mandate.

Dr. David Naylor [2:53] The minister made it pretty clear early on she was concerned about funding. She had picked up the message that the Harper government had focused on applied research, had not provided a lot of funding in general, but had particularly earmarked it towards applications and innovation. So, from early on in her term as Minister, Dr. Duncan emphasized the need to think about how to support independent fundamental science. The concept of a panel rolled out; she was in touch with me about chairing it. And we worked together on the composition of the panel, tried to make sure we had reasonable gender balance, a mix of research-intensive and regional research universities, and of course the disciplinary balance, as best we could. I don't think we hit a perfect disciplinary balance; we had three people with a physics interest and we were light on the humanities. But it was certainly broader than many panels have been that have looked at things in the research space. So, a pretty wide-angle commitment on the part of the panel.

Marija [3:55] The FSR published its report in early 2017, but its suggested improvements were not implemented in the budget until the following year. We asked Dr. Naylor about some of the major findings of this report.

Dr. David Naylor [4:09] The main findings have been rehearsed many times. The minister's diagnosis was one that reflected the community's perspective. It was a fabulous panel, an amazing experience to work with them. But the community was also very engaged with the panel process and, you know, affirmed exactly what they'd been saying to the Minister. Namely, we had a shortfall in funding. The peak had been reached many years earlier, really early in the Chretien period of the so-called innovation agenda. Science funding per capita had fallen and we were seeing a squeeze on the next generation, which, of course, badly affects equity, diversity and inclusion, because of the shift in the composition of the research workforce, with a much more diverse pool of individuals undertaking graduate studies and entering the junior ranks of the professoriate. So, bad news for diversity with the squeeze on funding. And, more generally, you know, we had an imbalance in terms of applied versus fundamental research that had emerged. And there were, across the board, matters of concern: the instability of the budget of the Canada Foundation for Innovation, so that infrastructure was supported on a somewhat unpredictable and spiky basis; the graduate student and postdoctoral suite of supports not being refreshed or reviewed for many years, a big concern given the growth in the pool. Some of the career awards that the granting councils offered had been pulled back, which meant there was more reliance on the Canada Research Chairs. Well, you know, they were introduced in 2000 and had no further renewal of funding. In fact, in the Harper era, because logistics meant that some number of them went unfilled, instead of managing the logistics of the approval and application process, which, you know, involves basic queueing theory and some thought, the Harper government took that as a chance to claw back 350 of the chairs.
So there were missing chairs even against the 2,000 that were initially planned in terms of the budgeting, a side effect of leaving quite a number of spots unfilled. So there were, everywhere we looked, signs of a certain amount of neglect, a certain lack of investment. And we heard that in great detail from the research community, which was enormously engaged with our process in 2016-17. And, of course, the findings reflected all that, along with a series of recommendations to begin to bring us back into play, to restore a competitive footing for Canadian independent extramural research. Finally, I would say we were at considerable pains to emphasize it was about the entirety of the research realm. This wasn't focused on STEM; we invoked the French "les sciences humaines" to highlight that the humanities were very much in play here, that we were talking about scholarship in its broadest sense. And we were also concerned with what we saw as a life cycle approach. So we were thinking about everyone from graduate students at the masters and doctoral level right through to senior scholars: what suite of initiatives, including administrative and governance changes, would put Canadian science on a footing to improve its prospects for, you know, the next decade or the next generation. I think the financial recommendations were actually prudent and modest; they were not aggressive. They were really benchmarked very carefully against earlier high-water marks. And so, putting it all together, I think it was a blueprint to get us back on the road to being highly competitive. And unfortunately, though I think tremendous progress has been made, the government really has fallen short in both the speed and the final point of investment, so that we, I think, are recovering, but we still need a serious infusion of resources to get fully competitive in the years ahead.

Zeynep [7:56] As mentioned, in recent years there's been more emphasis on applied science, sometimes called translational science in the health setting. Applied science targets a clear problem facing humans, with direct applications intended to benefit them.

Marija [8:10] When thinking about how funding is allocated, Stephania sat down to speak with Dr. Alan Bernstein, the president of CIFAR and former president of CIHR from 2000 to 2007. He explained that thinking about potential impact on human health is a good way to build engagement and interest in scientists' work.

Dr. Alan Bernstein [8:31] In a democracy, at least, government represents people, and people pay for our science. So it's not surprising that people care about how their money is spent. And so, what do people care about? Well, for sure people care about diseases, right? Everyone cares about their health. So they want cancer to be solved, and even more specifically, they want the cancer that affects them the closest, whether it's themselves or a loved one, they want that one to be solved tomorrow. Right, we're all getting older, so they want dementia to be solved, and Alzheimer's and, you know, neurological disorders to be solved, mental illness, heart disease... I don't think that's a bad thing. I think that reflects an interest of the public in health and an understanding that you need research to solve those problems. If we are going to argue for science, with our politicians and with Canadians, we need to relate it to them and to the issues that they face. And it's not an argument about fundamental research versus applied. It's an argument about the importance of research and knowledge to create new jobs. So there needs to be an interlocutor between the public and the scientific community, which is, I think, in a sense, what people who run these agencies do, and certainly what I did. Their role is to, in part, explain to parliamentarians, who are representing the public, that sometimes the shortest way home is not a straight line. And that's where fundamental research comes in. To me the question is: is it interesting research? And will it be research that eventually will have impact? Impact on science, impact on people. That to me is the right question. And, it's a funny thing about science, but the best science has impact on science and on people. The questions on a grant application that ask that question, I don't think should be used to decide whether you get funding or not, unless it's an RFA, unless it's a strategic initiative.
To me, the main purpose of that question is to force the person who's writing the grant to think about that question. So again, I'll go back to, and this is going back a long time, when I wrote my postdoc application to work on chicken tumor viruses, avian tumor viruses, I was asked that question: what is the relevance to human cancer? So the honest answer is: I don't know. Right? Who cares about chicken tumor viruses and humans, right? On the other hand, it made me think about exactly this question. By and large, human cancers are not caused by retroviruses, certainly not by avian retroviruses. On the other hand, the mechanisms by which these viruses cause cancer can't be completely different, if you think about it, in hindsight at least, from how humans get cancer. How many ways can you get cancer? So viruses, again, I have the benefit of hindsight, what these viruses do is capture human genes, and they're mutated, and they're a hook to getting into how human cancers work. So what I wrote 50 years ago in my postdoc application was that the mechanisms by which these avian viruses cause cancer in chickens are probably going to be shared with how human cancers arise. Not the actual viruses themselves, but the underlying biomedical mechanism, and that turned out to be exactly right. It's not that I'm brilliant, at all, it's just obvious when you think about it. So it did force me to think about it. I never quite believed what I wrote, to be honest, but it forced me to think about it. And I think we all should have to think about how we're spending public money.

Zeynep [12:24] Dr. Naylor worries that Canada is still lacking in science culture and observes that in the US, the National Institutes of Health have been very successful in winning broad support from the US public and lawmakers, with per capita funding three times greater than its counterpart, the Canadian Institutes of Health Research.

Dr. David Naylor [12:42] It comes back to science culture. And, you know, the striking example of the US. Now, in the case of the US, I would emphasize that NSF, the National Science Foundation, has a per capita funding rate that isn't much different than NSERC's. So the bias is towards medical research, and somehow, through many, many decades of advocacy and education, the concept has been baked into the minds of lawmakers that medical research is a good thing, and that investing in medical research is part of the lifeblood of the US innovation economy and part of what will matter to the voter in a small town or a big city or a rural constituency. For reasons I still don't fully comprehend, that mindset has not permeated here.

Marija [13:29] In response to the FSR, funding for science in Canada was recently increased in 2018, but still remains low compared to other G7 countries.

Zeynep [13:39] Out of an approximate $1 billion budget for the 2017-2018 fiscal year, CIHR invested a little over $550 million in investigator-initiated project and foundation grants.

Marija [13:52] In the fall of 2017, a total of 3,415 applications were submitted to CIHR and only 512 of those applications were funded, with a total investment of approximately $372 million.

Zeynep [14:09] With such tight competition, which ideas and which scientists will receive that funding? What makes an application or project great? Dr. Bernstein shared what he looks for:

Dr. Alan Bernstein [14:19] I think you bet on two things. The number one thing, and probably the number two thing, is you bet on people. Good people, great people, by definition, do great things. So I really focus on the caliber of the individuals who want to do a particular project, because at the end of the day, that's the most important thing. I think secondarily, you pay some attention to: is this a really interesting question? And it may not be obvious, especially if it's outside your own area, that it's an interesting question. But if a really smart person, a really good scientist, said, I would like to work on X, then even if my initial reaction, I'll personalize it, is "why would you want to do that?", on the other hand, "gee, I really have a lot of respect for that individual", so then I'll do a little bit of thinking and reading about it, and "oh! Now I get it." It comes down to, again by definition, good people do interesting things. So you bet on good people, even when it's not obvious to you at the time that it's going to be an important question. I don't like either/or decisions. And so when you're running an organization like CIHR, you need a fundamental core of researchers doing a lot of great research. So that's, you know, the open competition, as it's called, at CIHR. I also think the right kind of strategic initiatives, that are addressing more immediate or obvious issues, should also be part of the portfolio, sort of a diversified portfolio. A lot of people in the research community will argue, when success rates are so low, that money should not be used to fund strategic things. On the other hand, a well-written RFA, a Request for Applications, in the strategic areas allows for a lot of really, really good fundamental research. And the best way to convince politicians to give CIHR more money is not to frame it as, "we're only going to do fundamental research.
We're not going to do anything that's relevant to Canadians." Framing it that way is a big mistake. And so we tried to write the strategic initiatives, I believe, from the institutes, in a way that was broad enough to encompass a lot of great fundamental research and tackle more immediate issues facing Canadians.

Marija [16:58] What are some more reasons for why the success rates in achieving funding are so low?

Dr. Alan Bernstein [17:03] One reason that the success rate in the open competition is so low, or too low, actually has got nothing really to do with government funding of CIHR. It has to do with the growth in the research community at universities. It's really gone through a huge expansion, driven by other things: driven by CFI, driven by Canada Research Chairs, driven by indirect costs, by the growth of our universities from philanthropy... blah, blah, blah. So it means that the denominator has gotten bigger.

Zeynep [17:36] Competition in science isn't only at the level of applying for funding, but also once you've carried out your study and are trying to share it with the world. There are many scientific journals that you can publish in, but with varying degrees of reach, rigor and impact.

Marija [17:50] Two of the most prestigious, highly cited, and widely shared journals that cut across fields are Science and Nature. Stephania spoke to Dr. Orli Bahcall, a Senior Editor at Nature Genetics, to hear more about how they choose which articles to accept for publication and what they think will have a strong impact.

Dr. Orli Bahcall [18:10] Assessments of manuscripts are all subjective. Our assessment of the advance of a work is always subjective. And we are here as editors, really as part of our community, to help and support our communities in different ways. And one of those ways is through the peer review and publication of manuscripts. And the things that we look for, you've mentioned a few of them, in assessing manuscripts are the novelty and the advance and the contribution to the field. But all of this fits within the spectrum of what, as editors, we're looking for within our journal and how we're looking to represent our field within our journal. And so we do this on a broader scale. While we're reading each manuscript, we make our assessment of each manuscript as a journal, or as a team of journals, as we have across Nature journals. We have very regular assessments where, just like we're doing here with a strategic planning meeting, we do our own strategic assessment. And we keep track of our fields. And we keep track of what work is going on, and the new trends and really exciting new areas that are developing. And this is the part I love the most about my job, the fact that we get to keep track of all this. And we discuss within a big team at Nature; I work with a team of 18 editors in the Biological Sciences team. And we meet together and we discuss what are the most important areas, and we each represent our fields. And we have discussions about how we represent that, what areas we want to represent, and what the standards within each area should be. And we write our own strategic review about how we're going to do that. And all of that, which we do a couple times a year and update as we go along as well, really is the major thing which informs how we assess manuscripts.
So each manuscript that we see, we are reading it to understand the manuscript and what was done and the level of advance, but as an editor, we're fitting that into the context of what we're looking to represent as a journal. So as an editor, when we're doing our regular assessments, if we say the word impact, what we're really looking at is not citations; we're looking at what do we as editors think is the impact of this work? And that's absolutely something that I talk about a lot. So when I talk about the impact of the work, I'm usually asking: is this study going to really change something in my field? And changing something in my field might be something like a resource. I mentioned the UK Biobank is just transforming human genetics research as we know it. And so that is one of the most impactful papers that I've published in the past couple of years. And that's because it does transform how we do human genetics, because it's providing a valuable resource. And so it might provide a resource, or it might have findings or analysis that shift the way the field is done, or just question some theories and make people re-evaluate them. I can consider that very impactful, because it's something that will be widely discussed in the field and makes people re-evaluate some of the theories. With impact, I'm often asking how this work is going to be used. So it's impactful if it's something that lots of people are going to be reading. And it's going to inspire; it might be a very initial result right now, but it's going to inspire a lot of people to do follow-up work. So, you know, there are many different ways to be impactful, but we're really looking at impactful as something that people are going to read, people are going to talk about; it's going to have an influence within this field or other fields.
Another good way to be impactful is because maybe, you know, within my field of genetics, it might be something that's quite commonly known, but it's really translating this to the next step. And we've been talking a lot here about genomic medicine and translation... So impactful is translating things into the clinic. And when we're talking about impact and the clinical translation of genomics, there are many different ways to look at that. So every manuscript has very different types of questions that we will look at to try to talk about that.

Zeynep [22:10] Dr. Bahcall also discussed the consultation system in place at Nature to try to find the right journal to fit an author's manuscript.

Dr. Orli Bahcall [22:17] I think the most important advice I give to authors when I talk to them is to keep in mind that there are many journals out there. And all of the journals are doing the job of representing their field the way they feel is appropriate for their journal or their set of journals, and understand that there's a fit for everyone and a place for everyone. And that's something that, at Nature journals, we also try to help our authors with as a service even more, because we have a whole set of journals across the Nature family. I try, and all the editors at the company try, to find the best home for any manuscript that's submitted to us, and we do believe we have a good home for any manuscript. And so the way we do that is, you know, when I talk to authors about some interesting work, and they submit this, if it's not a good fit for me, I talk to my team. And we'll also talk to editors across our office, across many other journals that are relevant, and ask who's interested in the manuscript. And I think that's been one of the most interesting processes; we've developed this, what we call the "consultation system", over the last five years, to really expand it and make it a lot more available and accessible for all of us. And I think that's been the most interesting experience, because as a group of journals at Nature, we're really able to work very closely together to understand all different aspects of a field and how different journals can represent different areas of a field really well. And within that, we can refer a manuscript that is very good technically, but may have a different angle. We can find different journals that will want to represent that in different ways, and we can offer authors different options of different journals to go to.

Marija [23:57] An important part of science is replication: making sure previous findings are repeated and validated. Just as important are negative results, where no effects or correlations are found. Although these studies may sometimes be perceived as less exciting, they're a vital part of the scientific process, and something journals are placing greater emphasis on.

Dr. Orli Bahcall [24:20] Negative results, I mean, it's really in just how you define a negative result. A negative result can be a positive result if you frame it right: finding a lack of association can be incredibly important; finding a lack of support for a theory that was assumed to be true can be incredibly important. So it's really in how you decide and frame it. I look for that just as much as any other kind of paper, and I absolutely have published some great ones like that. And I think we do look for that just as much. You may not see it, maybe depending on how it's framed, but I think there's incredible value in that... in publishing negative results. I think, you know, as I said, all journals have their own approach. And they're really looking for how to represent their fields in the best way at their journal. So I think everyone has a slightly different approach. I don't think it's as clear-cut as some might think, as in, we do or we don't publish negative results. It's really that we look for the best in the paper, we look for what it contributes to our field. So I just don't see that kind of dichotomy. But what I think is a little separate from a negative result is the idea of replication and follow-on papers as well. So there is a lot of interest in just pure replication studies from a manuscript, and how we represent and publish those. And those can be incredibly important. And very often, they can contribute a lot, but they're not enough to warrant publication in the same level of journal, or in the same journal, and it's difficult for that journal to find enough space. And we've definitely had that issue: we get a lot of replication studies that are not actually questioning or changing a publication, and we can't give them a full publication themselves, but we see a lot of value in collecting those in some way.
And I think that's something this field is still working with, and across our journals, we still discuss how to best do that. I think things like preprint servers have really been a huge help, having these more available in biology. And posting preprints can be great. We also have comment fields on all of our articles, so authors can post the preprint, add comments, and link directly to our papers. But we definitely do look for more ways that we could aggregate that and link it more directly to the original publication, which, I think, would be a good direction for publishing.

Zeynep [26:48] We've talked about how science is funded and how it's published. But despite all the competition involved, science can also be incredibly collaborative. CIFAR is the Canadian Institute for Advanced Research, a global organization that brings together top researchers from around the world to think about important questions that face science and humanity. Unlike CIHR, CIFAR is not a funding source, but a builder of community and ideas. Dr. Bernstein, their president, told us more:

Dr. Alan Bernstein [27:16] So CIFAR is a fabulous organization. It's a Canadian-based, global research organization where we build communities; we bring together some of the top researchers from around the world, including Canada, of course, around a question of importance to science and to humanity. We convene this community. They meet, you know, a couple times a year for five years, renewable, and what we want is, ultimately and ideally, a whole new way of thinking about a really important question... a transformative way of thinking about it. So we're not primarily a funder of research at all; we're a builder of global communities. We're not bounded by geography, we're not bounded by discipline, we're not bounded by questions. So we have programs in the social sciences, and we have programs in the biological sciences in life and health; we have programs in quantum materials, quantum information science, astrophysics, a new program in geophysics, and we invite anybody in the world to come forward with an idea for a new CIFAR program. And what we require is that it be a question of importance to humanity and to science, that it be of the nature that makes a great CIFAR program. So if you want to identify all the genes mutated in autism, the kind of work that Steve Scherer does, for example, it's a great project, but it's not a CIFAR question. That can be funded by CIHR. I'm not saying one's better than the other; it's just that we have a different niche in the ecosystem of research. We're interested in bigger, broader questions. So, just to give you a good example of one: brain, mind, and consciousness. What is the relationship between our brain and our mind? We can't write a CIHR grant to solve that problem. That program, led out of Western Ontario, has got about 20 to 25 people in it, from philosophers to molecular geneticists to cognitive psychologists, psychiatrists, etc., who are meeting to discuss that question.
So they don't need to do experiments; they need to discuss, to wrestle with such a huge question. And so we invite proposals in any area. We don't say it's got to be in the brain, it's got to be in geophysics, or it's got to be in disease; it could be in anything. We could have a global call that's targeted. We've talked about it, actually; we might one day. So, you know, you could have a global call on the environment, or whatever. But so far, they've been open calls. So it's got to be a CIFAR-type program: great leadership, global. We don't require that things be interdisciplinary. On the other hand, we believe that our contribution to the research ecosystem is to bring together people who normally would not get together, which then means both international and interdisciplinary.

Marija [30:24] The nature of science is continuously evolving. Technology has reshaped how we act, think, and ultimately, how we conduct science today. These changes have propelled interdisciplinary and intercontinental collaboration, creating a community of researchers across the globe working to advance science, together.

Dr. Alan Bernstein [30:46] We are much more collaborative, and science is much bigger and more international today than ever before.

Dr. Orli Bahcall [30:53] More and more, I'm appreciative of what it takes to do the type of genomic science that we see today. It requires a level of international collaboration. I think that's, I mean, that's been true for one to two decades, but increasingly, it's becoming even more true. And that's something that, when I was a graduate student, was not as much the case. I think there are differences in cultures, and that's something that's really important when we work together as scientists, but there's always the focus on the scientific method and the results. And I think that crosses borders; we all have good things that we can learn from each other. So if we want to just compare one thing currently in genomics between the UK and the US, and Canada too, and it's something we've been talking about at this meeting, it would be establishing resources that are invaluable for a field and for the progress of genomic medicine. And we've been talking about the need for population cohorts, establishing population cohorts and ensuring that these have genetic information, and now whole genome sequencing is the hope. And we talked about the UK Biobank, which has really become the primary model, the leading effort in this, and it's becoming, you know, the most invaluable resource; it has really shifted all of human genetics in the past couple of years. I mean, the value that establishing the UK Biobank, and providing fully open access to all researchers, has had for human genetics and human health can't be overstated. And the idea is to then model that in other countries. So the US has an effort, All of Us, and other countries are establishing biobanks too, but are further behind. And what we've been talking about today is a lot of people who are trying, here in Canada and through the McLaughlin Centre, to hopefully establish more of these population cohorts in Canada, which I think is an incredibly laudable effort.

Zeynep [32:55] In research, we are often faced with the dilemma of when to actually publish our work. Should we focus on publication quantity and divide our projects into smaller components, or package everything together in a potentially more impactful paper? We asked Dr. Bahcall to give us an editor's point of view.

Dr. Orli Bahcall [33:11] It comes down to figuring out what's the right fit for that work and that project, and the right way to present and package that work. And it's really great to involve an editor, any editors that you know, regardless of whether you plan to submit to their journal or not, because editors can have great ideas about the right way to package a set of studies and where to draw the line. I get these questions all the time, not necessarily about submitting a manuscript, but when I go to visit a lab and spend, say, a whole day with a lab, they want to know: here's all this amazing stuff going on, how do I decide where to stop? Or where to break things up? Or what's a good way to do it? And editors are used to seeing this; they'll have other things to weigh in with, you know, "maybe you can package it this way", or "maybe that's really enough now; if you want, you can submit it that way, you don't have to go on for two more years". But at the end of the day, it's your project, and you and your advisor need to be working together to figure out what you want to achieve. What are your goals as a student? What do you need to accomplish? And there's really no right or wrong answer.

Marija [34:18] While formal review of research studies must be extensive to maintain a certain level of quality, bias within this process can compromise what serves as the foundation of science: objectivity. Bias can come in many forms, whether related to where articles come from, gender, expertise, scientific domain, or conflicts of interest, to name a few. We asked Dr. Bahcall how editors limit bias during their review of manuscripts.

Dr. Orli Bahcall [34:47] This is something we care a lot about, obviously. We want to represent the best research from anywhere, from anyone, but we're not fully blinded, right? We're not blinded to who the authors are. We have looked at models where they are double blinded. So I've read manuscripts where, say, the author list is removed, and commented on those; we've all, as editors, done this at certain times and run trials of how that affects our editorial decisions or the review process. And we see that it doesn't really affect us. As editors, we really only care about publishing the best in science, as we should. At Nature journals we also offer what's called double-blind peer review. Authors can select during the submission process to have a blinded review, such that the author list and author information are removed from the manuscript. The editors will still see the author information, but the referees will not. We've done this now for several years across Nature journals; it's available to anyone who submits. And among those who opt into it and whose manuscripts are sent to review, we see really no effect, either on whether the manuscript is sent to review at that journal or on what the final decision is. So we don't see any actual impact, and quite rightly so, because we don't believe that it does have any impact. What we do look at also is the submission rates, which you alluded to, and where we get the best research and how well it's presented. So we do try to do more outreach in areas that we feel are underrepresented, that may not know to submit to our journals, or don't quite know the process, or may not feel the doors are as open, or may not have easy access to editors. That's something we care a lot about.
And if we realize there's really good work going on in an area or a lab that is just not submitting to top journals, we will go talk to them, do more outreach, visit their labs, and maybe give seminars about how to work on their papers, how to prepare a paper for submission, how to talk to an editor, all these things I'm talking about with you, and try to help them work through the process. That's something we all try to do.

Zeynep [37:07] When discussing publishing, it is vital to consider the issue of accessibility, that is, how easily articles can be read by both researchers and the general public. Some articles and journals are open access, meaning that they are free to everyone without barriers; unfortunately, the majority are not.

Stephania [37:23] With regards to communication, there has been some criticism of journals that are not free access. As we said, we have to get more funds, and a lot of this science is funded from taxpayer money, but then those same people have to pay again to access the published results. How do you respond personally to that kind of criticism of the journals?

Dr. Orli Bahcall [37:52] Yeah, so I think these are very exciting discussions that we've been having, and they've evolved a lot over the last decade or so. I remember very well when PLOS first started; they launched an incredible model for open access journals, and it has been very successful and very exciting to follow. Since then many other journals have launched and shown the viability of these models. But as you know, there's still the question of what is a viable business model, and that's something that all publishers really do struggle, and are continuing to struggle, to figure out: what is a successful business model to allow this to happen? I've never seen anyone who actually questions the idea of making the research findings and the data sets fully open and available; that's something that we all would like to see. At Nature journals we very much support openness of the methods and the data sets and analysis behind a manuscript, but for the final publication, there still needs to be some business model, something all of us do work with. I think one of the things that we do offer at Nature, as you said, is a number of open access journals. And we've increased this quite a bit over the last few years, to give authors more and more options for publishing in our journals, including fully open access journals.

Marija [39:22] Dr. Bahcall continues, explaining her support of preprints, which are full drafts of research articles made available online to the public before formal peer review and publication in a journal.

Dr. Orli Bahcall [39:35] We also support preprints more and more. Nature has supported preprints for a few decades, starting back when it was mostly only in physics. I come from an academic family, with two parents in astronomy, so for me growing up, preprints were the norm. In astronomy, preprints were how the communication of research happened, and the journal publication happened quite a bit later. So that for me was very normal. The norm was you preprinted your work as soon as it was done, everyone in your field read the preprint that week, discussed it, and gave you feedback, and then, once you collected the feedback and reassessed, some point later you would submit it to a journal. But it wasn't that big a deal; that wasn't what made your career at all. So that was the norm that I knew from my family, and it was quite natural that, when I became an editor, one of the first things I talked about was, "why aren't we doing more of these preprints? Why hasn't biology taken this on yet? Why aren't we more open in the communication?" I was very excited when that movement happened, very excited when biologists started to take on preprints more and more. Nature itself was one of the first publishers to adopt this, by launching its own preprint server, called Nature Precedings. It was a full preprint server at Nature that lasted for a few years, and as editors we were able to work quite closely with authors in looking at how we could best use the preprint service: authors might post their preprint there while also submitting to a Nature journal and going through the review process, and use the comments they received on the preprint throughout the review process.
We looked at models for how we could integrate this better with the review process, which were interesting. Unfortunately, it wasn't quite the right time for that as a company, and it closed after a few years, but I think we learned a lot from it. Since then bioRxiv has launched, and it's done an amazing job for biology; it has grown more and more and become an incredible resource. It's widely used, and I use it daily as well. Most of the big publishers now support bioRxiv, and at Nature, we've always allowed preprint servers and said very clearly that this will not negatively affect any editorial decision in considering a research manuscript. But now we actively encourage it; we work with bioRxiv to make submissions easier. And I think it's really important to encourage preprints for authors, because this is really part of getting your work out there. The way people communicate their own research has shifted from a time when you presented it at a meeting, with lots of travel, and that's where you got your main feedback. More and more now, that's not happening; even if you do go to conferences, there are only so many conferences you can go to. The main communication that we see today is through preprints. So this is why I encourage this, particularly for younger authors: post a preprint, take some time to collect your comments, and really ask people for comments on the preprint before you submit your manuscript. I think it's invaluable to get the work out there as early as possible and to get feedback and learn from that.

Zeynep [43:19] Another strong consideration when publishing is what we call the 'impact factor' of a journal. Journals such as Nature are known to be high impact. But what does that really mean?

Dr. Orli Bahcall [43:29] It's really important to talk about the Impact Factor. The Impact Factor, defined by ISI, is a measure of the level of citations of a journal through a somewhat complicated algorithm. It is the most common metric used to measure the citations of both a single publication and a journal overall. Not many metrics have become as widespread as that, which is why it's still the most widely used metric. I'd like to talk about this because, from what I see, it is authors, and often authors from certain institutions or certain locations, who are really driven, who have a lot of pressure, to publish in journals with a certain Impact Factor. So there is a lot of focus on that from the author's perspective. From a journal perspective, though, it's quite different, which is what I think people should really recognize. From a journal perspective, we don't actually talk about the Impact Factor that much. We see it as one metric that is quite commonly used, and it's easy, you know, provided for us: it's useful because it is provided every year, it's kept track of, and it's easy to compare. It is definitely one metric that is useful to look at. But we know it is just one of many metrics, and there are many other metrics that are as useful, or even much more useful, for us to look at.

Marija [45:01] In summary, a journal with a high Impact Factor has articles that are cited often. It can only be calculated after a journal has completed at least three years of publication; therefore, relatively new journals do not have an Impact Factor.
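As an illustration (not from the episode), the standard two-year Impact Factor reduces to a simple ratio: the citations a journal's articles from the previous two years received in a given year, divided by the number of citable items published in those two years. A minimal sketch in Python, with hypothetical numbers:

```python
def impact_factor(citations, citable_items):
    """Two-year Impact Factor: citations received this year to articles
    published in the previous two years, divided by the number of
    citable items published in those two years."""
    return citations / citable_items

# Hypothetical journal: 200 articles in 2017 and 250 in 2018,
# which together received 3600 citations during 2019.
print(impact_factor(3600, 200 + 250))  # 8.0
```

This also shows why a brand-new journal has no Impact Factor: it needs a full two-year publication window plus a citation year before the ratio is defined.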

Zeynep [45:18] As Dr. Bahcall mentions, the Impact Factor is one of many research metrics. Other metrics include CiteScore, the h-index, the g-index, the Eigenfactor score, and the Altmetric score, just to name a few. The h-index, for example, focuses on the impact of a single scholar instead of an entire journal. Another example is the Altmetric score, which centers on the attention an article has received, including mentions in public policy documents, online sources, the news, or social networks. The Illinois Library points out that this attention doesn't necessarily mean the article is important or of high quality.
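To make the contrast concrete (again, an illustration not from the episode), the h-index is computed from a scholar's per-paper citation counts alone, with no journal involved: it is the largest h such that h of their papers each have at least h citations. A minimal sketch in Python, using a hypothetical publication record:

```python
def h_index(citation_counts):
    """Largest h such that h papers each have at least h citations."""
    h = 0
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank  # the rank-th most-cited paper still has >= rank citations
        else:
            break
    return h

# Hypothetical scholar with six papers:
print(h_index([10, 8, 5, 4, 3, 1]))  # 4: four papers each have at least 4 citations
```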

Marija [45:53] Overall, many scholarly metrics should be taken into account when considering a journal's impact, and with it, a research article's prestige and influence in society.

Zeynep [46:06] In order to promote and advocate for funding, it's important to reach out and engage with the public. Stephania asked Dr. Bernstein how we can move forward, overcome the disconnect, and keep the public updated on what researchers are up to.

Stephania [46:18] You mentioned a little bit that since we're using taxpayers' money, it's important that we communicate to them what is going on in science. However, we have a big issue of science illiteracy and pseudoscience: people not believing in global warming, anti-vaxxers, all these things that are coming up. How do you think we should tackle this? It is true that with all this media, compared to 50 years ago, science is communicated a lot faster and a lot more easily, but it seems that there is a lot less trust in what official science says, and people just go on Google and figure it out on their own. So how do you think we can regain that trust?

Dr. Alan Bernstein [47:02] That's a really important question and a complicated one. I think with the web and Google and all that stuff, there is a sort of infectious spread of ideas: someone will see some crazy idea, and then it's picked up by a lot of other people. It seems like a lot of other people, but on a percentage basis, take anti-vaxxers, for example, an area I'm familiar with: most people vaccinate their children. There are some people who don't, for religious reasons or for reasons that are, to me, nonsensical, but most people still do. Does that mean we don't have to engage with the public? Absolutely not. So I think it's really important for everybody in the scientific community, especially young scientists, to engage with the public. And a couple of things on that. One, I've heard people say, "we need to educate the public". No, no, no, no, no. They're not an empty vessel waiting for their brain to be filled up with all this great knowledge that we have; that's the surest way to turn them off. We need to engage with the public. That's number one. Number two: there are lots of ways we can engage. For example, when I was much younger, the Canadian Cancer Society, from whom I was getting money through the National Cancer Institute of Canada, would ask me to give public talks. I remember once going to Sudbury to give a talk about cancer research. I always said yes when I was asked; I felt it was my obligation. I was getting money from them, from donors, from volunteers who were going door to door, so I would do it. It's amazing what you learn as a scientist: what we would take for granted about what people know, what they don't know, and what they care about is completely different from what I would have predicted. Third, for people who receive money from, let's say, CIHR: that's taxpayers, that's parliament. That money comes as a vote from parliament.
So when you get money from somebody, you should say "thank you". My message always, especially to graduate students, postdocs, young trainees, and young investigators, is that if you are supported by CIHR, you should write a thank-you letter to the Minister of Health. The letter can have three paragraphs; it's very easy. Paragraph one: "Dear Minister of Health, my name is Joe Blow. I am receiving funding from CIHR for a project called, you know, blah, blah, blah. Thank you very much. The reason I think this grant is important to Canadians is for the following reasons. If your lab is in Toronto, or in Timbuktu, wherever you are, I'd be happy to show you around my lab and talk to you more about my work. Yours sincerely." It doesn't cost a penny, because you can send a hardcopy letter to Ottawa without a postage stamp. Just address it; you don't even need the address: Minister of Health, Government of Canada, blah, blah, blah. Ditto your own Member of Parliament, ditto the Prime Minister, ditto the Minister of Industry, Science and Economic Development, and ditto the Minister of Science. The same letter. And the cool thing, especially for young people, is that it's a civics lesson; you're learning about how government works. You'll get a letter back. And all the Members of Parliament are in their home ridings this summer, especially because there is an election coming, so you can easily book an appointment to see your Member of Parliament. So I think the education of the public starts with the public, and starts with the people who represent the public, which are the Members of Parliament and the Cabinet and the Prime Minister. We all have an obligation to start engaging with them, starting by thanking them.

Marija [51:05] Dr. Naylor also emphasized the importance of engaging the public and building a culture that celebrates science.

Dr. David Naylor [51:13] I think there is a challenge in that this is a young country. We have had a tradition of excellent science in many ways, and we have to figure out how to celebrate that without it becoming a way for governments to assume they've spent enough money, because that's part of the game that's played: "Oh, we'll celebrate Canadian science, and now we don't need to spend any more. Next year we've had a Nobel Prize or a Wolf Prize or something else, and goody goody, now we don't need to spend any more money". That worries me. So it needs to be a broader thing about celebrating science, valuing it, embracing it, and seeing the need for further support, so that the public presses the government, rather than the government trying to impress on the public that they're spending enough by celebrating. Politics is a very tough business. It's a contact sport. These folks have to reapply for their jobs every few years. They make decisions based not just on evidence, but on values or preferences and context or circumstances. And a lot of this is highly positional and interest-based rather than being about finding common ground. It's orchestrated warfare in the trenches, in an election and in the legislature. So more and more academics are getting the fact that they have to do science communication, they have to do outreach, they have to try to inspire the very young generation and their teachers to carry the torch for science and scholarship. I think that's a very positive change that I've seen over the course of a pretty long career; it's quite different now than when I started out. And on the same point, although the term 'Knowledge Translation' has a lot of baggage in my view, knowledge sharing, knowledge co-creation, knowledge collaboration, whatever we want to call it, is also way more active now than it was a few years ago.
And it's not all about innovation in the marketplace; it's also about working with civil society organizations and working to build community, to build non-profits. There are so-called B corps, a hybrid in between commercial and non-profit. So there's a much more varied array of entities with whom academics are working in 2019 than, say, 30 or 40 years ago. I think that's all to the good. It's getting the message out, it's building relationships, it's building a culture of creativity and critical thinking and innovation in all facets of society. I do think that the non-profits are a big part of this equation on multiple levels: one, because many of the non-profits support research, a large number of them in the medical and health world, but also because some of the foundations that have been set up are supportive more broadly of science and scholarship. And I think they're allies to be enlisted. It was very telling, in the run-up to the 2018 budget, when the heads of several major Canadian family foundations came forward and wrote an open letter urging implementation of the report; that was a very powerful statement. So I think we have to activate those allies in the philanthropic community, in the non-profit foundations of various types, and see them as friends of research to the greatest extent possible. I do want to put an asterisk up and say, sometimes these entities try to nudge the direction of research more than seems helpful; sometimes they insist on matching funds in ways that I'm not sure are always constructive. So I'm not going to say that they're the answer to research funding shortfalls. We must have open, untrammeled research funding, without strings, so that the best questions and the best ideas can be pursued with complete independence by great scientists and scholars at all stages of their careers.

Marija [55:03] The cycle of research, from funding to review to publication to engaging and sharing findings with the public, comprises the necessary components of scientific discovery and progress. This cycle starts with the availability and prioritization of funding and resources for science. A critical area where Canada needs to improve is promoting a strong national science culture and advocating for the importance of scientific funding.

Zeynep [55:29] Be sure to check out our next episode where we explore science policy and evidence informed decision-making in government. We also dive deeper into what you can do as a scientist or science ally to make your voice heard in our upcoming election and beyond. This episode was hosted by myself, Zeynep Kahramanoglu and Marija Zivcheska. Stefania Assimopoulos, Grace Jacobs and Frank Telfer assisted with content creation. Alex Jacob was our audio engineer.

Marija [55:55] A very special thank you to our guests, Drs. Alan Bernstein, Orli Bahcall, and David Naylor for speaking to us and sharing their insights. And thank you for listening. Until next time, keep it raw.

Grace [56:17] Raw Talk Podcast is a student presentation at the Institute of Medical Science in the Faculty of Medicine at the University of Toronto. The opinions expressed on the show are not necessarily those of the IMS, the Faculty of Medicine, or the university. To learn more about the show, visit our website rawpodcast.com and stay up to date by following us on Twitter, Instagram, and Facebook @rawtalkpodcast. Support the show by using the affiliate link on our website when you shop on Amazon. Awesome! Don't forget to subscribe on iTunes, Spotify, or wherever else you listen to podcasts, and rate us five stars. Until next time, keep it raw.