# Unified Product Installer/Upgrader Tool
### todo: Everything
## Premise
Customers sometimes need reassurance that an upgrade or new installation will be stable and go smoothly. In the scenario considered here, a customer provides a DB dump and a sosreport. Together, these two pieces of data can be used to recreate the customer's environment.
Upgrade automation, which is often already in place for products, is then run on top of this replica environment, followed by the usual integration testing. Should any issues occur, engineers get a controlled, low-risk environment in which to debug them.
## General Workflow
1. Customer provides a database dump and sosreport.
2. The tool feeds this data into the automation needed to recreate ("mimic") the customer environment.
3. The usual upgrade testing/integration testing is run. If there are issues, engineers can then debug against the replica (see the sketch below).
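A minimal sketch of this flow as the orchestrator might drive it. Every function and field name here is an illustrative placeholder, not an existing API:

```python
# Hypothetical end-to-end driver; the three functions map to the three
# workflow steps above and are stubs, not real components.

def ingest_customer_data(db_dump: str, sosreport: str) -> dict:
    # Step 1: the file handler would stage the customer-provided files.
    return {"db_dump": db_dump, "sosreport": sosreport}

def recreate_environment(product: str, artifacts: dict) -> dict:
    # Step 2: recreation automation mimics the customer system on reserved hardware.
    return {"product": product, "artifacts": artifacts, "host": "replica.example.com"}

def run_upgrade_and_tests(environment: dict) -> dict:
    # Step 3: existing upgrade + integration test automation runs against the replica.
    return {"upgrade": "passed", "tests": "passed", "environment": environment}

if __name__ == "__main__":
    artifacts = ingest_customer_data("customer.dump", "sosreport.tar.xz")
    env = recreate_environment("ansible-tower", artifacts)
    print(run_upgrade_and_tests(env))
```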
## Products Under Consideration
- Ansible Tower
  - Dev and Project Manager input indicates that a typical customer database can reach into the terabyte range, and in the worst case can grow "as large as the system it is on". Loading database/sosreport data therefore needs to be robust enough to handle very large file transfers. For a proof of concept, this is less of a concern.
  - Devs also mention that DB dumps aren't normally collected for Tower; a slimmed-down export may be possible, but this is still an open question.
  - Upgrade automation for Tower is already in place on a Jenkins instance. It could likely be added to a pipeline that first customizes a Tower instance to mimic a customer's environment (see the sketch after this list). Open question: should this be Ansible-ized or kept within the existing Jenkins infrastructure?
- Satellite
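For the Tower case above, one option is to keep the existing Jenkins infrastructure and simply trigger the upgrade job with extra parameters pointing at the recreated customer environment. A minimal sketch using the `python-jenkins` library; the job name, parameter names, and URLs are assumptions about what such a pipeline might accept:

```python
import jenkins  # pip install python-jenkins

# Connection details and credentials are placeholders.
server = jenkins.Jenkins(
    "https://jenkins.example.com", username="svc-upgrader", password="api-token"
)

# "tower-upgrade-pipeline" and its parameters are hypothetical; the real job
# would be the existing Tower upgrade automation, extended with a stage that
# first customizes the Tower instance to mimic the customer environment.
server.build_job(
    "tower-upgrade-pipeline",
    parameters={
        "TOWER_HOST": "replica.example.com",
        "DB_DUMP_URL": "https://filehandler.example.com/dumps/case-01234.dump",
        "SOSREPORT_URL": "https://filehandler.example.com/sos/case-01234.tar.xz",
        "TARGET_TOWER_VERSION": "3.8.6",
    },
)
```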
# Design Requirements/Details
## Web Interface
- The web interface must present a clean, clear, and concise view of all of the components below.
- The interface should emphasise the recommended operation flow.
- The interface should be able to retrieve, present, and manage information stored in the database. This information will consist of:
  - A (filterable) list of all active systems.
    - The product, version, and customer name associated with that system.
  - A (filterable) list of all environments that can be recreated.
    - Environment information, including links to previous activity.
  - A (filterable) list of all previous environment recreations.
    - Recreation information should include product information, upgrade results, and test results.
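As a rough illustration of the "filterable list" requirement, here is a minimal sketch of an endpoint the web interface could call, assuming a Flask app and an in-memory list standing in for the real database; routes, field names, and sample data are all illustrative:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for rows the database described below would actually hold.
SYSTEMS = [
    {"id": 1, "product": "ansible-tower", "version": "3.8.6", "customer": "ACME"},
    {"id": 2, "product": "satellite", "version": "6.10", "customer": "Globex"},
]

@app.route("/systems")
def list_systems():
    """Return all active systems, optionally filtered by query parameters."""
    product = request.args.get("product")
    customer = request.args.get("customer")
    rows = [
        s for s in SYSTEMS
        if (product is None or s["product"] == product)
        and (customer is None or s["customer"] == customer)
    ]
    return jsonify(rows)

if __name__ == "__main__":
    app.run(debug=True)  # e.g. GET /systems?product=ansible-tower
```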
## Database
- The database must be able to store all information needed to populate the web interface.
- It should also store information needed by the underlying architecture.
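One possible shape for that data, sketched as a throwaway SQLite schema; table and column names are assumptions derived from the web interface requirements above, and a real deployment would presumably use a server-grade database:

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS systems (
    id        INTEGER PRIMARY KEY,
    product   TEXT NOT NULL,
    version   TEXT NOT NULL,
    customer  TEXT NOT NULL,
    active    INTEGER NOT NULL DEFAULT 1
);

CREATE TABLE IF NOT EXISTS environments (
    id         INTEGER PRIMARY KEY,
    system_id  INTEGER NOT NULL REFERENCES systems(id),
    db_dump    TEXT,   -- path/URL managed by the file handler
    sosreport  TEXT
);

CREATE TABLE IF NOT EXISTS recreations (
    id              INTEGER PRIMARY KEY,
    environment_id  INTEGER NOT NULL REFERENCES environments(id),
    upgrade_result  TEXT,   -- e.g. "passed" / "failed"
    test_result     TEXT,
    created_at      TEXT DEFAULT CURRENT_TIMESTAMP
);
"""

conn = sqlite3.connect("installer_upgrader.db")
conn.executescript(SCHEMA)
conn.commit()
```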
## File Handler
- The file handler must be able to receive files from one system and host them on another.
- This may also include on-demand download/serving of files as needed.
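A minimal sketch of the receive-and-serve behaviour, assuming a Flask service that streams uploads to disk in chunks (relevant given the multi-terabyte dumps mentioned earlier) and serves them back on demand; routes and the storage path are illustrative:

```python
from pathlib import Path
from flask import Flask, request, send_from_directory
from werkzeug.utils import secure_filename

app = Flask(__name__)
STORAGE = Path("/srv/filehandler")  # illustrative storage location

@app.route("/files/<name>", methods=["PUT"])
def receive(name: str):
    """Stream an uploaded file to disk in chunks so large dumps never sit in memory."""
    name = secure_filename(name)
    STORAGE.mkdir(parents=True, exist_ok=True)
    with open(STORAGE / name, "wb") as out:
        while chunk := request.stream.read(8 * 1024 * 1024):  # 8 MiB chunks
            out.write(chunk)
    return {"stored": name}, 201

@app.route("/files/<name>", methods=["GET"])
def serve(name: str):
    """Serve a previously received file to another system on demand."""
    return send_from_directory(STORAGE, name)

if __name__ == "__main__":
    app.run()
```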
## Orchestrator
The orchestrator is the underlying application that:
- decides what actions need to be taken
- knows how to talk to each component
- can reserve physical/virtual hardware
- is able to run ansible playbooks directly or via tower
- stores and retrieves configuration information
- can spin up the web interface
- has its own CLI
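Since the orchestrator needs its own CLI, here is a minimal skeleton of what that entry point could look like; the subcommands and option names are invented for illustration and would be refined as the components above firm up:

```python
import argparse

def main() -> None:
    parser = argparse.ArgumentParser(
        prog="orchestrator",
        description="Unified installer/upgrader orchestrator (sketch)",
    )
    sub = parser.add_subparsers(dest="command", required=True)

    recreate = sub.add_parser("recreate", help="recreate a customer environment")
    recreate.add_argument("--product", required=True, choices=["ansible-tower", "satellite"])
    recreate.add_argument("--db-dump", required=True)
    recreate.add_argument("--sosreport", required=True)

    upgrade = sub.add_parser("upgrade", help="run the upgrade against a recreated environment")
    upgrade.add_argument("--environment-id", required=True)

    web = sub.add_parser("web", help="spin up the web interface")
    web.add_argument("--port", type=int, default=8080)

    args = parser.parse_args()
    # Dispatch to the real components here; printing is just a placeholder.
    print(f"would run {args.command!r} with {vars(args)}")

if __name__ == "__main__":
    main()
```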
## Environment Recreation
Environment recreation is the most crucial part of this project.
- Recreation will be determined by a combination of SOS reports, product dumps, and sane defaults.
- Recreation is executed via Ansible playbooks and/or Tower workflows.
- Each playbook/workflow should conform to a standard, accepting a baseline set of arguments (sketched below).
- Product-specific arguments should not conflict with baseline arguments.
- Playbooks/workflows should be created by a product's Dev/QE/CSS team.
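A minimal sketch of invoking a recreation playbook against that baseline contract, using the `ansible-runner` library. The baseline variable names, playbook name, and URLs are assumptions about what the standard might define, not an agreed interface:

```python
import ansible_runner  # pip install ansible-runner

# Baseline arguments every recreation playbook/workflow would accept; the
# exact names are hypothetical and would be fixed by the standard.
baseline_vars = {
    "product": "ansible-tower",
    "product_version": "3.8.6",
    "db_dump_url": "https://filehandler.example.com/dumps/case-01234.dump",
    "sosreport_url": "https://filehandler.example.com/sos/case-01234.tar.xz",
    "target_host": "replica.example.com",
}

# Product-specific extras must not clash with the baseline keys above.
product_vars = {"tower_license_file": "/tmp/license.json"}

result = ansible_runner.run(
    private_data_dir="/tmp/recreate-run",   # working dir for runner artifacts
    playbook="recreate_environment.yml",    # provided by the product's Dev/QE/CSS team
    extravars={**baseline_vars, **product_vars},
)
print(result.status, result.rc)
```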
## Upgrader
The upgrader carries out the product upgrade actions. It follows the same standards/requirements as Environment Recreation.
## Test Runner
- Must be able to set up all requirements for a test environment
- Must be able to keep a stored environment configuration and accept an alternative configuration at runtime
- Must keep environments isolated and/or destroy environments when finished
- Must be able to collect/interpret test results and store them in the database (see the sketch below)
- Tests should be provided by a product's QE team; test selection criteria are at their discretion.
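A minimal sketch of the collect-and-store step, assuming the test jobs emit JUnit-style XML and results land in the `recreations` table from the database sketch above (both assumptions):

```python
import sqlite3
import xml.etree.ElementTree as ET

def summarize_junit(xml_path: str) -> str:
    """Reduce a JUnit-style results file to a simple pass/fail summary."""
    root = ET.parse(xml_path).getroot()
    failed = sum(
        1 for case in root.iter("testcase")
        if case.find("failure") is not None or case.find("error") is not None
    )
    return "passed" if failed == 0 else f"failed ({failed} test(s))"

def store_result(recreation_id: int, xml_path: str) -> None:
    """Record the interpreted result against a recreation row in the database."""
    summary = summarize_junit(xml_path)
    with sqlite3.connect("installer_upgrader.db") as conn:
        conn.execute(
            "UPDATE recreations SET test_result = ? WHERE id = ?",
            (summary, recreation_id),
        )

if __name__ == "__main__":
    store_result(1, "results/junit.xml")
```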