# Agenda Chloro Review Call
Time: 05.07.2019, 9:15
Location: Skype
Participants: Jan, Simon, Niklas, Markus, Frank
## Strategic Decision
- Try Genome Biology (BMC) again
- Transfer to BMC Bioinformatics
- Something entirely different
## High-level Assessment of the Reviewer Reports
### Reviewer 1
- Misunderstanding of the benchmarking
- Some requests do not make sense in the context of benchmarking
### Reviewer 2
- Possibly the author of MITObim
- Mainly language issues
- Overall not well written
## Concrete TODOs
- Rebuild containers -> Frank (today)
- Re-run -> Jan (start as soon as Frank has built the containers (today))
  - Simulated -> Jan
  - Real -> Jan
- Re-run -> Jan
- Performance -> Frank
- Further analyses
  - 50 novel? -> Markus (over the weekend): list of all, start download
- Textual changes
  - Native speaker
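The container-rebuild and re-run TODOs could be scripted. A minimal dry-run sketch, assuming one Dockerfile per assembler under a hypothetical `containers/<tool>/` layout; the tool names below are placeholders, not necessarily the seven assemblers benchmarked here:

```shell
#!/bin/sh
# Dry-run sketch: prints one `docker build` command per assembler.
# The containers/<tool> layout and the tool names are assumptions;
# adjust them to the repository's actual structure, then pipe the
# output to `sh` to execute the builds.
set -eu

TAG=$(date +%Y%m%d)   # date-stamped image tag, e.g. chloro-bench/getorganelle:20190705
for tool in getorganelle chloroextractor novoplasty; do
  echo "docker build -t chloro-bench/${tool}:${TAG} containers/${tool}"
done
```

Tagging images with the build date would make it easy to tell which re-run used which container version.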
## Timeline
Re-submission before Frank's vacation in August!
August 9
## Decision Letter
```
GBIO-D-19-00938
The landscape of chloroplast genome assembly tools
Jan A Freudenthal; Simon Pfaff; Niklas Terhoeven; Arthur Korte; Markus J Ankenbrand; Frank Förster
Genome Biology
Dear Dr. Förster,
Thank you for submitting your manuscript to Genome Biology. We have now received reports from two referees, whose comments are available below.
As you will see from the reports, both referees raised serious concerns that the benchmark hasn’t provided sufficient advance to be useful for the field. At this stage, therefore, we are not persuaded that an offer of publication in Genome Biology would be justified.
Unless you can address all of the issues raised by the referees I am afraid that we would not be able to consider this work again. In particular we ask that you provide additional analyses as requested by both referees to convince them of the advance. The revised manuscript should address in full all the points raised by both referees; a separate list of the revisions made, specifying where in the manuscript the changes have been made, would also be helpful. Please note that all data generated for the manuscript should be deposited in a relevant public repository, with accession codes listed in the manuscript's Availability of Data and Materials section. All other supporting data should also be signposted.
If you decide to submit a revised version, you will need to submit this as a new manuscript, although we would consider it in the same way as we do all revised manuscripts. The exception is that we will take into account any work published in the intervening period.
For the time being, however, we hope that you will find our referees' comments helpful in deciding how to proceed. Please feel free to contact us if you have any questions.
Alternatively, if you do not wish to revise the manuscript extensively to add additional new insights, you might prefer to submit a lesser revised version of your manuscript to one of the more specialized journals published by BMC, such as BMC Bioinformatics (http://www.biomedcentral.com/journals) with which Genome Biology is affiliated. As in Genome Biology, original research published in BMC Bioinformatics is rigorously and rapidly peer-reviewed and is freely available online.
If you would like to find out more about the option to submit to BMC Bioinformatics, please click the link below and follow the online instructions:
https://www.editorialmanager.com/gbio/l.asp?i=244911&l=3SS3LBLG
This offer to transfer your submission is valid until 29 Sep 2019. If you encounter any problems with your submission, please contact our colleagues at the BMC Transfer Desk at transfersdesk@biomedcentral.com.
With best wishes,
Yixin
Yixin Yao
Genome Biology
https://genomebiology.biomedcentral.com/
Editors' comments (if any):
Reviewer #1: The methods are appropriate, but the same authors developed ChloroExtractor, an important tool used as a benchmark in this study. So, this reviewer is not sure of the novelty of this manuscript, and the authors haven't addressed the important question "Does the work represent a significant advance over previously published studies?".
User experience and success rate should be determined by users and documented through satisfaction surveys. This is especially important when the authors emphasize that the programs can be run on standard computer infrastructure.
There are >1,000 sequenced chloroplast genomes, the large majority of which are not fully assembled or annotated. So, the authors have plenty of opportunity to demonstrate the use of these assembly tools.
It is well known that intergenic spacer regions are not conserved; not even a single intergenic spacer region is conserved among the chloroplast genomes of the Gramineae (grasses). Intergenic spacer regions occupy >50% of the genome. So, how does a reference genome help in genome assembly?
For a real-life comparison, the authors could have sequenced a few chloroplast genomes and used those data to compare the different assembly tools, the time taken by each program, and their pros and cons.
- Is the paper of broad interest to others in the field, or of outstanding interest to a broad audience of biologists?
This is of interest to evolutionary biologists to study phylogenetics or for chloroplast genetic engineering.
Reviewer #2: This study provides the first systematic comparison of chloroplast assembly tools and shows significant differences between the tested assemblers in terms of generating whole chloroplast genome sequences and computational requirements. Moreover, the authors created Docker images for each tested tool, which are available to the scientific community and improve reproducibility; thus, large-scale screening for chloroplasts as hidden treasures within genomic sequencing data becomes feasible.
The topic of this paper is valuable and helpful for all scientists working in the field of plant chloroplast assembly. However, the overall manuscript is not well written and contains limited information about the systematic comparison of the chloroplast assembly tools and the Docker images for reproducible assembly.
Specific points:
The line numbers in the margin do not correspond to the main text. The major concerns about this manuscript can be found below, and major revision is suggested before publication in "Genome Biology".
Page1 part of "General introduction and motivation"
Line 4: A recent article should be cited
Line 11: A recent article should be cited
Page2
Line 16: It should be IRa and IRb
Line 18: A recent article should be cited
Lines 24-27: Rewrite the sentence
Part of "Approaches to extracting chloroplasts from whole genome data"
Line 6: Delete the dot before the quote
Lines 7-9: Rewrite the sentence
Part of "Purpose and scope of this study"
Line 2: It should be in italic type
Line 2: It should be genome
Line 3: Low-quality reads should be filtered out of the raw genomic data sets by quality control
Lines 4 and 5: Many command-line de novo assembly tools for chloroplast genomes have all of these features, e.g., MITObim.
Part of "Time requirements"
Line 1: It should be among
Line 4: Different tools should be tested with data sets of more than 1 gigabyte, and Figure 1 can't give valuable information, as there is no significant difference in time consumption. The same question applies to the part "Memory and CPU Usage". Why were these seven assemblers chosen for testing? "Figure" should be "Fig." and "table" should be "Table".
Part of "Qualitative"
Line 3: Can NGS data for crop genomes not be used with GetOrganelle? In this part, it would be better to use the same data set to test every assembler.
Part of "Quantitative"
Part of "Simulated data"
Line 1: Why not show the results for the different data sets with every assembler in Fig. 3 and Table 2? Fig. 3 and Table 2 show no regular pattern in the scores, except that the assembler GetOrganelle obtains a perfect score.
Part of "Real data sets"
Line 3: The performance of GetOrganelle was poor, as only 199 of a total of 356 assemblies that produced an output were circular, i.e., 55.9%; however, it achieved a median score of 99.7. Should the number "99.7" be "89.13", as shown in Table 3? The SD value of GetOrganelle is much larger than that of the other three tools, yet it achieved the highest scores.
Part of "Consistency"
Line 3: As shown in Fig. 6, only two tools achieved an R2 not equal to 1 when re-running assemblies and comparing the scores of the two runs; however, GetOrganelle is described as the only tool that succeeds in obtaining similar scores across runs. This conclusion is not rigorous.
```