# nf-core/funcscan pipeline nf-test conversion procedure
The following instructions explain how to add new tests using an 'intermediate' pipeline-level nf-test structure.
This structure and procedure are somewhat opinionated, based on early work on ampliseq, and may change in the future, so do not treat this as a canonical set of instructions.
Make sure you have nf-test and Nextflow installed and/or activated before working through these instructions.
## Set up the skeleton
These steps have already been done! They are just here for reference:
1. Make a `nf-test.config` that contains required nf-test related options
- Can be generated with `nf-test init`
- Mostly specifying where to find the tests, where to place them, etc.
- I just copied from other pipelines
2. Add to `.gitignore` whatever you set in the nf-test config as the `workDir`, e.g. `.nf-test/` and/or `.nf-tests`
3. Make a `tests/` directory (also done for you if you used `nf-test init`)
4. Replace `.github/workflows/ci.yml` with an nf-test structure (see pipelines such as createtaxdb, ampliseq, phageannotator, rnaseq)
- All are currently quite different; IMO ampliseq/createtaxdb are the simplest
- Just remember to replace any pipeline names in them
5. Add a lint override to `.nf-core.yml` so that the `actions_ci` check is skipped (a sketch follows this list)
- As of nf-core tools 2.13.1, the nf-test CI structure isn't supported by the linting, which is why we disable this check here
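For reference, a minimal sketch of that lint override in `.nf-core.yml` (the `lint`/`actions_ci` keys are the standard nf-core tools mechanism for skipping a lint test; check your tools version for the exact syntax):

```yaml
# .nf-core.yml -- tell nf-core lint not to check the CI workflow,
# since it no longer matches the template's actions_ci expectations
lint:
  actions_ci: false
```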
## Adding new tests
1. Make a config file with the required parameter settings for the test
- Don't forget to add this config as a profile at the bottom of `nextflow.config` (see the sketch below)
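As a sketch, a minimal test config and its registration in `nextflow.config` might look like this (the profile name `test_nothing`, the resource limits, and the samplesheet path are illustrative, not funcscan-specific values):

```nextflow
// conf/test_nothing.config -- illustrative sketch only
params {
    config_profile_name        = 'test_nothing profile'
    config_profile_description = 'Minimal settings to test pipeline plumbing with all tools skipped'

    // Keep resources small so the test fits on CI runners
    max_cpus   = 2
    max_memory = '6.GB'
    max_time   = '6.h'

    // Hypothetical samplesheet path -- point this at real test data
    input = 'path/to/test_samplesheet.csv'
}
```

```nextflow
// At the bottom of nextflow.config, inside the existing profiles block:
profiles {
    // ... existing profiles ...
    test_nothing { includeConfig 'conf/test_nothing.config' }
}
```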
2. Add the test's tag to the `tags` list in `.github/workflows/ci.yml` (a sketch follows)
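The tags list typically lives in a job matrix; a hedged sketch (the surrounding workflow layout varies between pipelines, so copy the structure from your actual `ci.yml`):

```yaml
# .github/workflows/ci.yml -- illustrative fragment only
jobs:
  test:
    strategy:
      matrix:
        tags:
          - "test_nothing"
          - "test_mynewtest" # <- add your new test's tag here
```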
3. Create a `<test_name>.nf.test` file under `tests/` (see the skeleton sketch after the assertion snippet below)
- Follow the structure used in other tests/pipelines
- It is critical that you have a specific `tag` at the top of the test that matches the name of both the `<test_name>.conf` file and the `<test_name>.nf.test` file
- Start your assertions simply with:
```nextflow
{ assert workflow.success },
{ assert snapshot(path("$outputDir/")).match() },
{ assert new File("$outputDir/pipeline_info/nf_core_pipeline_software_mqc_versions.yml").exists() },
{ assert new File("$outputDir/multiqc/multiqc_report.html").exists() },
```
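Those assertions sit inside an `assertAll()` in the `then` block of a pipeline-level test. A minimal sketch of the full file, assuming a test called `test_nothing` (the name, tag, and test title are illustrative):

```nextflow
// tests/test_nothing.nf.test -- illustrative sketch
nextflow_pipeline {

    name "Test pipeline: NFCORE_FUNCSCAN"
    script "main.nf"
    tag "test_nothing" // must match the config and file name

    test("test_nothing_profile") {

        when {
            params {
                outdir = "$outputDir" // provided by nf-test
            }
        }

        then {
            assertAll(
                { assert workflow.success },
                { assert snapshot(path("$outputDir/")).match() },
                { assert new File("$outputDir/pipeline_info/nf_core_pipeline_software_mqc_versions.yml").exists() },
                { assert new File("$outputDir/multiqc/multiqc_report.html").exists() }
            )
        }
    }
}
```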
4. Run your first `nf-test` command to generate a snapshot (this will be deposited alongside the `<test_name>.nf.test` file under `tests/`)
```bash
nf-test test --tag test_nothing --profile test_nothing,docker
```
- Don't forget to replace `docker` with your container/environment profile as necessary
- The `--tag` selects the corresponding `<test_name>.nf.test` file, while `--profile` loads the `<test_name>.conf` file with the test parameters
5. Enter the output directory of the test run and check that the contents of each file are AS EXPECTED (we don't want bugs, silent failures, empty files, etc.!)
- To enter the output directory of the test, change into `.nf-test/tests/<test hash>/`, using the hash reported in the stdout of the nf-test run; in the example below the hash is `c942120d`
```
Test [c942120d] 'test_nothing_profile' PASSED (16.543s)
Snapshots:
1 updated [test_nothing_profile]
```
- The contents of this directory consist of:
  - `meta/`: contains the `nextflow.log` file
  - `output/`: contains the results published to `--outdir`
  - `work/`: the typical Nextflow work directory
- Your snapshot will be checking the files in `output/`, as we snapshot against `$outputDir`, so validate these files (see the example below)!
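For example, assuming the short hash printed by nf-test is a prefix of the directory name on disk:

```bash
# Change into the test's output directory and inspect the published files
cd .nf-test/tests/c942120d*/output/
ls -lR
```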
6. Once you have verified the output is as expected for the test, run the test _again_ to identify instability
7. If instability between runs is found, update the assertions within `snapshot()` etc. as necessary, much like with module tests (see the sketch below)
- Currently it's best to snapshot files in `$outputDir`
- In the future this will likely be supplemented with, or replaced by, checks of _workflow_ channel emissions
- If copying assertions from module-level tests, be wary of 'process' vs 'workflow' definitions! For the time being, refer to each file using `$outputDir`
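As a sketch of what stabilised assertions can look like (the file paths here are invented for illustration, not real funcscan outputs):

```nextflow
then {
    assertAll(
        { assert workflow.success },
        // Snapshot only files whose contents are deterministic between runs
        { assert snapshot(path("$outputDir/some_tool/stable_table.tsv")).match("stable_table") },
        // For files with embedded timestamps, dates, or absolute paths,
        // assert existence rather than snapshotting the contents
        { assert new File("$outputDir/some_tool/report_with_timestamp.html").exists() }
    )
}
```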
8. Repeatedly run nf-test until the snapshot is stable
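If you have changed the assertions and need to regenerate an existing snapshot, rerun with nf-test's `--update-snapshot` flag:

```bash
nf-test test --tag test_nothing --profile test_nothing,docker --update-snapshot
```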
9. Commit, push, and on to the next config!