Point of View: Bioengineering horizon scan 2020

  1. Luke Kemp (corresponding author)
  2. Laura Adam
  3. Christian R Boehm
  4. Rainer Breitling
  5. Rocco Casagrande
  6. Malcolm Dando
  7. Appolinaire Djikeng
  8. Nicholas G Evans
  9. Richard Hammond
  10. Kelly Hills
  11. Lauren A Holt
  12. Todd Kuiken
  13. Alemka Markotić
  14. Piers Millett
  15. Johnathan A Napier
  16. Cassidy Nelson
  17. Seán S ÓhÉigeartaigh
  18. Anne Osbourn
  19. Megan J Palmer
  20. Nicola J Patron
  21. Edward Perello
  22. Wibool Piyawattanametha
  23. Vanessa Restrepo-Schild
  24. Clarissa Rios-Rojas
  25. Catherine Rhodes
  26. Anna Roessing
  27. Deborah Scott
  28. Philip Shapira
  29. Christopher Simuntala
  30. Robert DJ Smith
  31. Lalitha S Sundaram
  32. Eriko Takano
  33. Gwyn Uttmark
  34. Bonnie C Wintle
  35. Nadia B Zahra
  36. William J Sutherland (corresponding author)
  1. Centre for the Study of Existential Risk (CSER), University of Cambridge, United Kingdom
  2. Biosecurity Research Initiative at St Catharine’s College, University of Cambridge, United Kingdom
  3. Ebiosec, Inc, United States
  4. Manchester Institute of Biotechnology, Faculty of Science and Bioengineering, University of Manchester, United Kingdom
  5. Gryphon Scientific, United States
  6. Division of Peace Studies and International Development, University of Bradford, United Kingdom
  7. Centre for Tropical Livestock Genetics and Health, Royal (Dick) School of Veterinary Studies, United Kingdom
  8. Department of Philosophy, University of Massachusetts, United States
  9. Rogue Bioethics, United States
  10. Cambridge Consultants, United Kingdom
  11. Genetic Engineering and Society Center, North Carolina State University, United States
  12. University Hospital for Infectious Diseases, Croatia
  13. Medical School, University of Rijeka, Croatia
  14. Catholic University of Croatia, Croatia
  15. Future of Humanity Institute, University of Oxford, United Kingdom
  16. iGEM Foundation, United States
  17. Rothamsted Research, United Kingdom
  18. John Innes Centre, United Kingdom
  19. Center for International Security and Cooperation (CISAC), Stanford University, United States
  20. Department of Bioengineering, Stanford University, United States
  21. The Earlham Institute, United Kingdom
  22. Arkurity Ltd, United Kingdom
  23. Biomedical Engineering Department, Faculty of Engineering, King Mongkut's Institute of Technology Ladkrabang, Thailand
  24. Institute for Quantitative Health Sciences and Engineering, Michigan State University, United States
  25. Chemistry Research Laboratory, University of Oxford, United Kingdom
  26. Ekpa’Palek: Empowering Latin-American Young Professionals, Peru
  27. Department of Politics, Languages and International Studies, University of Bath, United Kingdom
  28. Science, Technology & Innovation Studies, School of Social and Political Science, University of Edinburgh, United Kingdom
  29. Manchester Institute of Innovation Research, Alliance Manchester Business School, University of Manchester, United Kingdom
  30. SYNBIOCHEM, University of Manchester, United Kingdom
  31. School of Public Policy, Georgia Institute of Technology, United States
  32. National Biosafety Authority, Zambia
  33. Department of Chemistry, Stanford University, United States
  34. School of BioSciences, University of Melbourne, Australia
  35. Department of Biotechnology, Qarshi University, Pakistan
  36. Department of Zoology, University of Cambridge, United Kingdom

Peer review process

This article was accepted for publication as part of eLife's original publishing model.

History

  1. Version of Record published
  2. Accepted
  3. Received

Decision letter

  1. Helena Pérez Valle
    Reviewing Editor; eLife, United Kingdom
  2. Peter Rodgers
    Senior Editor; eLife, United Kingdom
  3. Ariel B Lindner
    Reviewer; Institut National de la Santé et de la Recherche Médicale, Université Paris Descartes, France

In the interests of transparency, eLife publishes the most substantive revision requests and the accompanying author responses.

Thank you for submitting your article "Bioengineering Horizon Scan 2020" for consideration by eLife. Your article has been reviewed by three peer reviewers, and the evaluation has been overseen by two editors from the eLife Features team (Helena Pérez Valle and Peter Rodgers). The following individuals involved in review of your submission have agreed to reveal their identity: Ariel B Lindner (Reviewer #3).

Summary:

Kemp et al. present a timely and well-written qualitative analysis of short- (<5 years) to long-term (>10 years) emerging trends in bioengineering and succinctly describe hurdles as well as potential positive and negative outcomes.

The manuscript is an update of a similar exercise carried out in 2017. The augmented Delphi process used to build a consensus among self-selected experts is well described, and its main deficiency (the 'loud voice' effect) was addressed by limiting proposers' contributions to the discussion. Overall, the manuscript is easily accessible to a general scientific audience, comprehensible for policy makers, and in large part readable by the general public. However, the manuscript would benefit from addressing a number of issues - see below.

Essential revisions:

1. Restructure the Introduction so that it starts with a discussion of why the scan is needed, why Bioengineering in particular is of interest, and what gap the work is trying to fill. Limit the discussion of the methods to the end of the Introduction.

2. Table 1 and the review of the 20 issues are results, so a Results section should be included with these elements.

3. Give a more detailed (but still brief) explanation about how the scoring of the issues works in the Introduction, as well as a more detailed explanation in the Materials and Methods. Also include why 20 was chosen as a cutoff for the number of issues finally reported. Please provide (either in the article or as supplementary data) a list of the 83 issues initially proposed and the z-scores calculated.

4. For each of the sections describing one of the issues (currently in the Introduction, to be moved to a Results section, see above), clearly identify in each what the emerging issue is, why it's a (potential) issue (positive or negative), what the drivers of the issue are, and the current state of the issue to provide a stronger takeaway message about why each issue is important.

5. On the comparison between the 2017 and 2020 scan:

i) Expand on the differences between the 2017 and the 2020 scan, including discussing whether any of the emerging issues of 2017 have come into effect; or why the experts may have switched focus (including whether expert selection may have biased this change; or whether any disruptive influences or drivers may have changed experts' focus).

ii) Map the final issues from the 2017 scan horizon to the full list of 83 issues of the 2020 scan horizon using the Z score matrix, rather than sharing the 2017 issues as a table. If the table is not openly accessible in the previous publication, include it here as supplementary data.

iii) If possible, designate which new issues on the 2020 list are due to a wider scope definition of bioengineering, otherwise the comparison between the 2017 and the 2020 scan horizons can be biased.

6. Report the gender/geographical/disciplinary bias in the 2nd as compared to the 1st experts' panel, since the 1st panel had 38 participants while the 2nd panel had 26.

7. Share statistics regarding the 'Novelty' yes/no response, and briefly discuss the distribution of yes/no responses for the issues after the 1st and 2nd scoring and in the final 20 issues.

8. Add a discussion as to whether the addition of devil's advocates to the Delphi process was helpful. Was there a correlation between Novelty and the issues proposed by the devil's advocates? What percentage of the devil's advocates' proposals made it through each round of cutoffs, and what percentage to the final list?

9. Report how different the ranking of the 41 issues selected in the 1st round was compared to the ranking of the issues after the 2nd round. This can signal controversy in the choices and expose whether any 'black swans' survived the scrutiny.

https://doi.org/10.7554/eLife.54489.sa1

Author response

[We repeat the reviewers’ points here in italic, and include our replies in plain text].

Essential revisions:

1. Restructure the Introduction so that it starts with a discussion of why the scan is needed, why Bioengineering in particular is of interest, and what gap the work is trying to fill. Limit the discussion of the methods to the end of the Introduction.

We now open the Introduction with a paragraph on why bioengineering is of interest (due to the speed of change and depth and breadth of societal impacts) and why the scan is needed (to create a process of periodically updated and improving horizon-scans as is done in other areas such as global conservation).

As recommended, we have shifted the discussion of methods to the end of the Introduction.

2. Table 1 and the review of the 20 issues are results, so a Results section should be included with these elements.

This has been added directly after the Introduction.

3. Give a more detailed (but still brief) explanation about how the scoring of the issues works in the Introduction, as well as a more detailed explanation in the Materials and Methods. Also include why 20 was chosen as a cutoff for the number of issues finally reported. Please provide (either in the article or as supplementary data) a list of the 83 issues initially proposed and the z-scores calculated.

We have added detail to the Introduction to further explain the voting protocol.

We have tried to be as thorough as possible in providing further details on scoring in the Methods section (under ‘Phase 2: Scoring’). We now provide further information on the novelty scores, the logistics of scoring, and how long the participants were given. If there is anything further that the reviewers would like to see covered in scoring, then we would be happy to add it in.

We now explain why 20 was chosen as a cut-off in the ‘Phase Three’ section: “The decision to keep the list to 20 was made by consensus by the workshop group and was influenced by a significant difference between the z-scores of the top and bottom 20 issues, but a much smaller spread of scores within the top 20.”

We are deep believers in transparency and thus all of the anonymised scoring data (including z-scores for the long-list, short-list and final list) have already been included via the Open Science Framework (see https://osf.io/ej8p2/). We will also provide these to the editors to include as supplementary data.
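To make the normalisation concrete, the sketch below shows one common way such panel scores are converted to z-scores: each rater's raw scores are standardised against that rater's own mean and spread (correcting for individually harsh or generous scoring tendencies) before being averaged per issue. This is an illustrative assumption about the workflow, not the authors' exact pipeline, and the data structure and function name are hypothetical.

```python
from statistics import mean, stdev

def zscore_issues(scores_by_rater):
    """Standardise each rater's scores, then average per issue.

    scores_by_rater: {rater: {issue: raw_score}} (hypothetical structure).
    Standardising within each rater corrects for individual scoring
    tendencies before scores are pooled across the panel.
    """
    z_by_issue = {}
    for rater, scores in scores_by_rater.items():
        mu = mean(scores.values())
        sd = stdev(scores.values())
        for issue, s in scores.items():
            z_by_issue.setdefault(issue, []).append((s - mu) / sd)
    # Rank issues by mean z-score, highest first
    return sorted(((issue, mean(zs)) for issue, zs in z_by_issue.items()),
                  key=lambda t: -t[1])
```

Under this scheme, two raters who use very different parts of the raw scale but agree on the ordering of issues contribute identically to the final ranking.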

4. For each of the sections describing one of the issues (currently in the Introduction, to be moved to a Results section, see above), clearly identify in each what the emerging issue is, why it's a (potential) issue (positive or negative), what the drivers of the issue are, and the current state of the issue to provide a stronger takeaway message about why each issue is important.

We have reviewed all the issues and made edits to ensure that these meet each of the recommended dimensions.

5. On the comparison between the 2017 and 2020 scan:

i) Expand on the differences between the 2017 and the 2020 scan, including discussing whether any of the emerging issues of 2017 have come into effect; or why the experts may have switched focus (including whether expert selection may have biased this change; or whether any disruptive influences or drivers may have changed experts' focus).

ii) Map the final issues from the 2017 scan horizon to the full list of 83 issues of the 2020 scan horizon using the Z score matrix, rather than sharing the 2017 issues as a table. If the table is not openly accessible in the previous publication, include it here as supplementary data.

iii) If possible, designate which new issues on the 2020 list are due to a wider scope definition of bioengineering, otherwise the comparison between the 2017 and the 2020 scan horizons can be biased.

i) Unfortunately, providing a comprehensive analysis of what issues have come into effect requires a separate methodology and journal article. However, we have tried to highlight some specific issues which appear to have become more prominent since 2017.

ii) We have added the z-scores for the 2017 list to the supplementary materials as suggested. As noted above, the z-scores and other scoring data have already been provided via the Open Science Framework and will be sent to the editors as supplementary data as well. We would suggest that, since the z-scores are being provided in the supplementary data, the summary table of the 2017 scan be kept. It is intended as an easily digested summary which will allow the audience to compare issues across the two scans. The discussion of similarities and differences across the scans would be much less fruitful without this qualitative summary.

iii) This is a very good idea. We have flagged now which issues we believe would have fallen outside of the 2017 scan’s scope but made it into the 2020 scan: neuronal probes expanding new sensory capabilities and the governance of cognitive enhancement.

6. Report the gender/geographical/disciplinary bias in the 2nd as compared to the 1st experts' panel, since the 1st panel had 38 participants while the 2nd panel had 26.

We have added Table 3, which provides a comparative analysis of both groups in terms of gender balance, disciplinary distribution, geographical coverage and sample size. We make a brief mention of this in-text (under Method: Phase III) to highlight that the two groups were largely comparable across each of these characteristics.

7. Share statistics regarding the 'Novelty' yes/no response, and briefly discuss the distribution of yes/no responses for the issues after the 1st and 2nd scoring and in the final 20 issues.

We now provide statistics on the novelty of all 41 shortlist issues in Table 4 at the end of the Methods section (now Supplementary file 2). The final paragraph of the Methods section now discusses the distribution of novelty.

8. Add a discussion as to whether the addition of devil's advocates to the Delphi process was helpful. Was there a correlation between Novelty and the issues proposed by the devil's advocates? What percentage of the devil's advocates' proposals made it through each round of cutoffs, and what percentage to the final list?

We have added a paragraph to the end of “Phase II: Scoring”, directly after our initial discussion of the devil’s advocates. As recommended, this covers the statistics on the novelty and success of the issues proposed by devil’s advocates.

9. Report how different the ranking of the 41 issues selected in the 1st round was compared to the ranking of the issues after the 2nd round. This can signal controversy in the choices and expose whether any 'black swans' survived the scrutiny.

A comparison of the ranks after first and second round scoring is now provided in Table 4 (now Supplementary file 2) at the end of the Methods section. We further discuss these in the final paragraph of this section.
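One simple way to quantify how much the ranking shifted between rounds (and thus flag potential 'black swans') is a Spearman rank correlation between the two orderings. The sketch below is illustrative only: the inputs and function name are hypothetical, and it assumes no tied ranks so the closed-form d² formula applies.

```python
def spearman_rho(rank1, rank2):
    """Spearman rank correlation between two rankings of the same items.

    rank1, rank2: lists of item labels ordered best-to-worst (hypothetical
    inputs). Assumes no tied ranks, so rho = 1 - 6*sum(d^2) / (n*(n^2 - 1)).
    """
    n = len(rank1)
    pos2 = {item: i for i, item in enumerate(rank2)}
    # d is the displacement of each item between the two rankings
    d2 = sum((i - pos2[item]) ** 2 for i, item in enumerate(rank1))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))
```

A rho near 1 would indicate the second round largely confirmed the first-round ordering, while low or negative values would signal the kind of controversy the reviewer describes.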

https://doi.org/10.7554/eLife.54489.sa2

Cite this article

Luke Kemp et al.
(2020)
Point of View: Bioengineering horizon scan 2020
eLife 9:e54489.
https://doi.org/10.7554/eLife.54489