# Manual test checklist
To simulate our users' environment as closely as possible, perform the
manual tests with the tools installed by our installer.
To build and run the installer, run:
```
INSTALLER=$(./scripts/release/build-installer.sh) && \
"${INSTALLER}" --non-interactive --install-vscode-extension
```
Then start the local server (with verbose logging to ease debugging):
```
ci-daemon -v2 --alsologtostderr
```
### CI/CD
* [ ] Check the CI/CD pipeline
### K8s Daemon
* [x] (Willian) Deploy the k8s daemon in minikube, with or without authentication (`./deployments/kubernetes/deploy.sh`)
* [x] Run the test tool `//cmd/tools/test`
### Local
Manually test the CI suite with projects in the [fuzz-testing repository][1].
[1]: https://gitlab.com/code-intelligence/fuzz-testing
Before starting, make sure you set these variables in the `.env` file at
the root of the core repository:
```
DOCKER_REGISTRY
DOCKER_REGISTRY_USER
DOCKER_REGISTRY_PASSWORD
```
If you don't have a `.env` file yet, copy `.env.template` to `.env` and
replace the placeholders.
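The setup above can be sketched as follows. This demo runs in a temporary directory; the template contents and the registry value are assumptions for illustration, not real credentials (run the real steps from the core repo root):

```shell
# Demo of the .env setup in a temp dir. Placeholder values are assumptions.
tmp=$(mktemp -d)
cat > "$tmp/.env.template" <<'EOF'
DOCKER_REGISTRY=<registry>
DOCKER_REGISTRY_USER=<user>
DOCKER_REGISTRY_PASSWORD=<password>
EOF
# copy the template and fill in one placeholder as an example
cp "$tmp/.env.template" "$tmp/.env"
sed -i 's|<registry>|registry.gitlab.com|' "$tmp/.env"
grep '^DOCKER_REGISTRY=' "$tmp/.env"
```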
When you start testing a project, write your name in parentheses after
the "Test $PROJECT" bullet item.
* [x] (moe) Test CppCMS
* [x] Create a fuzzing project in the CPPCMS directory in the fuzz-testing
repo using the following configuration:
* Build Script:
```
#!/bin/bash -eu
mkdir build
cd build
cmake -DCMAKE_TESTING_ENABLED=OFF -DDISABLE_SHARED=ON -DDISABLE_GCRYPT=ON \
  -DDISABLE_OPENSSL=ON -DDISABLE_FCGI=ON -DDISABLE_SCGI=ON -DDISABLE_HTTP=ON \
  -DDISABLE_CACHE=ON -DDISABLE_TCPCACHE=ON -DDISABLE_GZIP=ON ..
make -j$(nproc)
```
* Build Image: `registry.gitlab.com/code-intelligence/core/builders/cppcms`
* [x] Test Initializing the project.
* [x] Test the fuzz-this-function functionality by trying to generate a
fuzz target from an API function.
* [x] Create and run a test collection to test the "fuzz_json" fuzz test,
which is already created in the repo, with libfuzzer and the
address sanitizer as the fuzzer.
* [x] Check that charts are showing and updating once the test
collection is running.
* [x] Show coverage information for the fuzz test "fuzz_json"
* [x] Insert a bug at line 1220 in the file "src/json.cpp"
(e.g., `*(char*)2=1;`). Re-run the campaign to produce a crash
finding. Debug the found crash.
* [x] Make sure that the "Go to finding" functionality works in the
UI by navigating to the erroneous line.
* [x] Test grammar fuzzer: create and run a test collection to test
"fuzz_json_grammar" with libfuzzer and the address sanitizer
similar to "fuzz_json"
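The bug-insertion step above can be scripted with `sed`; here is a minimal sketch on a generated stand-in file (point the same command at `src/json.cpp` in the real project checkout):

```shell
# Inject the wild-pointer write before line 1220 so ASan reports a crash.
# Demonstrated on a generated stand-in file, not the real src/json.cpp.
tmp=$(mktemp -d)
seq -f '// line %g' 1300 > "$tmp/json.cpp"
sed -i '1220i *(char*)2=1;' "$tmp/json.cpp"
sed -n '1220p' "$tmp/json.cpp"   # the injected statement now sits at line 1220
```

Remember to revert the change after the crash finding has been debugged.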
* [x] (adrian) Test nginx (testing the socket fuzzer)
* [x] Test creating and initializing a fuzzing project using
* The build script:
```
#!/bin/bash -eu
./auto/configure
mkdir -p logs
make
```
* The build image: `registry.gitlab.com/code-intelligence/core/builders/nginx`
* [x] Create a new socket fuzz test using the "add fuzz target"
functionality in the UI.
* [x] select port 6666
* [x] select the nginx binary from the dropdown menu
* [x] use the following run arguments: `-p $SRC, -c .code-intelligence/nginx2.conf`
* [x] Run the automatically created test collection for the newly
created fuzz test with libfuzzer and the address sanitizer.
* [x] change the test collection's fuzzer engine options to
disable leak detection: `-detect_leaks=0`
* [x] start the collection
* [x] Show coverage information for the newly created fuzz test
* [x] (moe) Test struct fuzzer
* [x] Run the struct fuzzer test collection in the example project
in "core/fuzzing/testdata/projects/example/" (a crash should
be found quickly) using:
* Build Script: `make`
* Build Image: registry.gitlab.com/code-intelligence/core/builders/cppcms
* [x] Debug the crash and verify that you can step through the code
and see the value of the struct that caused the crash.
* [x] Test Code coverage
* [x] (Simon) Test string instrumentation in the example java project.
* [x] Create and initialize the project (found in the core repo:
fuzzing/testdata/projects/java_example) with the settings:
* build script:
```
mkdir -p build/libs
javac $(find src -name "*.java") -d build
jar cf build/libs/example.jar -C build com
```
* image: cifuzz/builders:maven
* [x] Start the default Test Collection and check that the crash
is found.
* [x] Test debugging the found crash, and step into the code until
you reach the bug location.
* [x] Try the `contains` method in the Parser class instead of `equals`
* [x] (Simon) Test WebGoat
* [x] Initialize the fuzzing project (two Spring Boot applications
should be started during this process; one of them is expected
to fail). Use the following configuration:
* Build script: `mvn clean package -DskipTests`
* Build image: cifuzz/builders:maven
* [x] Add a new Spring Boot Fuzz test to fuzz the `/challenge/5`
controller.
* [x] Check in the campaign expert options that ZAP is added as a
fuzzing engine.
* [x] Start the corresponding automatically created test collection
and check that an SQL Injection is reported.
* [x] Verify that there are multiple warnings reported from the ZAP
run.
* [x] Add a new Spring Boot Fuzz test to fuzz the `/InsecureLogin/task`
controller.
* [x] Start the corresponding automatically created test collection
and check that the login credentials are found
(username: CaptainJack, password: BlackPearl).
* [x] (seb) Test AltoroJ
* [x] Initialize the fuzzing project AltoroJ
* Build script: `gradle build`
* Build image: cifuzz/builders:gradle
* [x] Perform the web application analysis using the OpenAPI
definition `properties.json`.
* [x] Verify in the web app configuration (Pen button in the
navigation bar) that endpoints have been found.
* [x] In the web app configuration modify the application URL from
`localhost:8080/AltoroJ/api` to `localhost:8080/api` and save
the change.
* [x] Create a new web app fuzz target with all controllers except
for `logout`.
* [x] Run the automatically created campaign and verify that an SQL
injection is found.
* [x] Perform the web app analysis again but this time using the
"Web Crawler" option (start url: `http://localhost:8080`).
* [x] Create a new web app fuzz target (the list of controllers
should be different this time).
* [x] Run the newly created campaign and verify that at least one
XSS vulnerability is found.
* [ ] Spring Boot fuzz target edit UI: use the test Spring Boot project
in our fuzz-testing repo (or any Spring Boot project).
* [ ] Add a fuzz test
* [ ] Test adding/modifying/deleting requests in the setup chain of
the project.
* [ ] Test adding/modifying/deleting request templates from the fuzz
test.
* [ ] Test adding/modifying/deleting requests in a request template
of the fuzz test.
* [ ] Test adding/modifying/deleting a fuzzing policy for both the
test and the project.
* [x] (moe) UI
* [x] Test Adding/Removing/Updating a fuzz test (API, socket, and
Java targets).
* [x] Test Adding/Removing/Updating a test collection.
* [x] Test the ability to configure a fuzz test that already exists:
Click on an already created fuzz test and check if you can
modify the configuration.
* [x] (seb) Test the unit test runner
* [x] Initialize the project `yaml-cpp` from the fuzz-testing repo
* Build script:
```
#!/bin/bash -eu
mkdir -p build && cd build
cmake -DCMAKE_VERBOSE_MAKEFILE:BOOL=ON -DYAML_CPP_BUILD_TESTS=OFF ..
make -j$(nproc)
```
* Test script:
```
#!/bin/bash -eu
rm -rf build && mkdir build && cd build
cmake ..
make -j$(nproc)
test/run-tests
```
* Build image: cifuzz/builders:zint
* [x] Create a new test collection. Don't select any fuzz targets; use "Unit Test Runner" with the "address" sanitizer as the engine
* [x] Make one of the unit tests fail (e.g. insert `EXPECT_TRUE(false);` in `test/regex_test.cpp:14`)
* [x] Run the campaign and check that the failed test is reported as a bug
* [x] Make one of the unit tests crash (e.g. insert `*(char *) 1 = 2;` in `test/regex_test.cpp:14`)
* [x] Run the campaign and check that the ASan report shows up in the findings
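The two fault injections above can be scripted the same way; a sketch on a generated stand-in file (run the same `sed` commands against `test/regex_test.cpp` in the real checkout):

```shell
# Demonstrate both injections from the steps above on a generated file.
tmp=$(mktemp -d)
seq -f '// line %g' 20 > "$tmp/regex_test.cpp"
# variant 1: make a unit test fail with a failing assertion
sed '14i EXPECT_TRUE(false);' "$tmp/regex_test.cpp" > "$tmp/fail.cpp"
# variant 2: make a unit test crash with a wild-pointer write (ASan finding)
sed '14i *(char *) 1 = 2;' "$tmp/regex_test.cpp" > "$tmp/crash.cpp"
sed -n '14p' "$tmp/fail.cpp"
sed -n '14p' "$tmp/crash.cpp"
```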