# Rework Jenkins pipelines

###### tags: `functional cycle 10`

Developers: Christoph, Samuel, Nikki
Appetite: 1/2 cycle

## Problem

1. Jenkins does not fail if stencils do not verify.
2. Let Jenkins build in a different folder and only copy the finished project/binary on success.
3. The PR plan cannot be restarted from a failed step, because the triggerPhrase is missing.
4. Prefer Python as the scripting language over bash (especially for spack).
5. Move the complete pipeline configuration to source control.
6. Use named arguments for bash scripts instead of positional arguments.
7. Make sure every stage fails gracefully (`-e` in bash; in Python we need to check the state of subprocesses).
8. Eliminate the 'build verification multi' and 'build substitution multi' stages, as they are unnecessary. Adapt the scripts accordingly (build folder paths will be broken).
9. Split the 'build' stage into a 'configure' and a 'build' stage.
10. Compare Jenkins to GitLab CI with the help of someone from CSCS.
11. Prepare a talk.

## Background

The daily and PR Jenkins plans for the DSL version of ICON are central to the development workflow. We therefore propose the following bug fixes, quality-of-life upgrades and robustness improvements.

## Appetite

## Known steps

1. **Jenkins stencil verification**
   To determine which stencils fail *verification* against their Fortran counterparts during the CI pipeline, we will serialise all relevant stencil verification metrics to `json` at runtime. The file is then parsed in a Jenkins `stencil_verification` stage, which fails if one of the stencils did not verify. The proposed steps are as follows:
   - Modify `::dawn::verify_field` so that it returns a struct of verification metrics together with the stencil validity flag, instead of **only** the flag.
   - In `CudaIcoCodeGen.cpp` modify `verifyAPI` so that:
     - `::dawn::verify_field` returns a struct of metrics.
     - we have a `__SERIALIZE_METRICS` preprocessor directive, enabling serialisation to JSON at will.
     - we call `serialize_to_json`, a function that writes the metrics returned from `::dawn::verify_field` to a JSON file on disk.
   - The JSON object should include the following keys/values and be structured as follows:
     ```json
     {
       "<StencilName>": [
         {
           "iteration": "<IterationNumber>",
           "max_relative_error": "<MaxRelError>",
           "min_relative_error": "<MinRelError>",
           "max_absolute_error": "<MaxAbsError>",
           "min_absolute_error": "<MinAbsError>",
           "stencil_is_valid": "<StencilIsValid>"
         }
       ],
       "<StencilName2>": [
         {
           "iteration": "<IterationNumber>",
           "max_relative_error": "<MaxRelError>",
           "min_relative_error": "<MinRelError>",
           "max_absolute_error": "<MaxAbsError>",
           "min_absolute_error": "<MinAbsError>",
           "stencil_is_valid": "<StencilIsValid>"
         }
       ]
     }
     ```
     Constructing the JSON can be done using the [nlohmann/json](https://json.nlohmann.me/) library. Utility functions to do so will be defined in their own module in `dsl/bindings_generator`.
   - The generated JSON file can then be parsed in a Jenkins stage. For this the [`readJSON`](https://www.jenkins.io/doc/pipeline/steps/pipeline-utility-steps/#readjson-read-json-from-files-in-the-workspace) facility from the Jenkins pipeline-utility-steps plugin can be used. If this approach does not work, the fallback option is a Python script that parses the JSON file.
   - If we discover a stencil where `stencil_is_valid` is `false`, the stage fails and the JSON file is echoed.

   **Note**: At a later stage we may also add additional build information to the JSON file, such as the type of compiler used. This part, however, is not directly relevant to verification testing.
3. Problems:
   - The SHA of the new commit needs to be passed, so a new triggerPhrase is needed.
   - How do we store the state of the pipeline when it is interrupted? We do not have containerized binaries, so the work folder has to save the state, which will conflict if multiple people use the pipeline.
   - A clean solution is unclear.
4.
   Have to create a venv and always activate it before launching a Python script. Maybe the venv can be activated from Groovy?
5. There is another DSL called job-dsl: https://plugins.jenkins.io/job-dsl/
7. For failing gracefully, be aware that spack sometimes catches all exceptions and does not pass them through; see also: https://github.com/MeteoSwiss-APN/cosmo/blob/mch/cosmo/ACC/jenkins/rdeploy/build_deps.py

## Things to keep in mind

1. Giacomo already wrote a file called 'parsing rules' for this step, which could be part of the solution or should be deleted if it is not.

## Potential Rabbit Holes

4. Do not try to force Python when bash is much easier for the respective step.
5. Time-box this, since information on this DSL seems too sparse to be viable.

## No Gos

# Progress

- [x] Added support for named arguments in bash scripts.
- [x] Improved icon4py install using `_external_src`
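
As a rough sketch of the fallback Python script mentioned under "Known steps": the function name `check_metrics` and the file name `stencil_metrics.json` are placeholders (not decided anywhere above), and we assume the proposed schema where all values are serialised as strings.

```python
import json
import sys


def check_metrics(path):
    """Return a list of (stencil, iteration) pairs that failed verification."""
    with open(path) as f:
        metrics = json.load(f)
    failed = []
    for stencil, iterations in metrics.items():
        for entry in iterations:
            # the proposed schema serialises all values as strings
            if str(entry["stencil_is_valid"]).lower() not in ("true", "1"):
                failed.append((stencil, entry["iteration"]))
    return failed


def main(path="stencil_metrics.json"):
    failed = check_metrics(path)
    if failed:
        for stencil, iteration in failed:
            print(f"FAILED: {stencil} (iteration {iteration})", file=sys.stderr)
        # echo the whole file so the Jenkins log shows all metrics
        with open(path) as f:
            print(f.read(), file=sys.stderr)
        sys.exit(1)
    print("all stencils verified")
```

The Jenkins stage would simply run this script and fail on a non-zero exit code.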
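
For the graceful-failure requirement (problem 7), a minimal sketch of checking subprocess state in Python, the rough equivalent of `set -e` in bash; `run_step` is a hypothetical helper name, and (per the note on spack above) it only catches non-zero exit codes, not exceptions a tool swallows internally.

```python
import subprocess
import sys


def run_step(cmd):
    """Run one pipeline step and fail the stage if the command fails.

    We check the return code explicitly (rather than passing check=True)
    so we can echo the captured output before exiting, keeping the
    Jenkins log useful.
    """
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        print(result.stdout)
        print(result.stderr, file=sys.stderr)
        sys.exit(result.returncode)
    return result.stdout
```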