# Essay 3: Safeguarding Quality and Architectural Integrity
The previous two essays described the product vision and the architecture. This essay focuses on the means used to safeguard the quality and architectural integrity of the underlying system.
## Key Quality Attributes
For *Beets*, there are two key quality attributes that need to be satisfied: flexibility and accessibility (user-friendliness). The reasoning behind these two is straightforward. Although *Beets* is a media management system, it is designed for music geeks, so it should easily accommodate the varied requirements of different users. As for accessibility, not all music geeks are computer geeks, and an overly complex system is inaccessible to most people who love music. The system should therefore remain approachable to ordinary users so that it can be widely adopted.
Considering the current status of *Beets*, flexibility is already well satisfied, while accessibility can still be improved. The flexibility of *Beets* mainly concerns incorporating different plugins into its core functionality. This is achieved through an abstract base class called *BeetsPlugin*: by subclassing it and overriding its methods, plugins can easily extend *Beets*'s functionality.
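To make the extension mechanism concrete, here is a self-contained Python sketch of the subclass-and-register pattern that the plugin base class enables. All names here (`BasePlugin`, `EchoPlugin`, `run_command`) are illustrative, not the actual beets API.

```python
# A simplified sketch of the plugin pattern: the core discovers
# subclasses of a base class and asks each for the commands it adds.
# (Illustrative names only; not the real beets API.)

class BasePlugin:
    """Base class that the core discovers and calls into."""
    registry = []

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        BasePlugin.registry.append(cls)  # auto-register every subclass

    def commands(self):
        """Subclasses override this to expose new commands."""
        return {}


class EchoPlugin(BasePlugin):
    """A toy plugin adding one command, the way a real plugin would."""

    def commands(self):
        return {"echo": lambda text: text.upper()}


def run_command(name, *args):
    # The core iterates over registered plugins to find a handler.
    for plugin_cls in BasePlugin.registry:
        handlers = plugin_cls().commands()
        if name in handlers:
            return handlers[name](*args)
    raise KeyError(name)
```

Because registration happens automatically at class-definition time, adding a new plugin requires no change to the core: defining the subclass is enough.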
Nevertheless, the accessibility of *Beets* can be further improved. At present, users can only interact with *Beets* through a command-line interface (CLI); there is no graphical interface. In practice, a CLI is harder to use than a graphical interface, so a well-functioning graphical interface would greatly improve the accessibility of *Beets*.
## Quality Control
As open-source software, *Beets* has its own quality-control process, summarized in the following list.
1. First, contributors should fork the repo to create their own workspace. This prevents intentional and unintentional damage to the *Beets* repo.
2. For new features, one should first discuss them with the core developers, then implement the feature.
3. After making changes, tests should be added. These show that previous bugs are actually fixed, or that the new feature works as described.
4. Add documentation so that other contributors and users can understand the changes.
5. Add a changelog entry describing the changes.
6. Run the tests and style checks, and modify the code accordingly.
7. Push to the forked repo and open a pull request against the upstream repository.
8. Automated workflows check whether the changes work on all supported platforms and whether the code style meets the project's requirements.
9. Core developers review the changes manually to ensure they meet the quality requirements and decide whether the pull request should be merged. If the PR needs further modification, suggestions are given to help contributors meet the quality requirements.
## Key Elements of Beets' Continuous Integration Processes
As an open-source community project whose contributors come from different corners of the world and have limited opportunity to communicate, a Continuous Integration (CI) process is a necessity for building the project correctly. With CI, project members work on separate branches and merge their work into a single working branch, which must build and pass a series of tests in automated workflows. This practice guarantees that every member's code works properly together and is well tested. [^1]
*Beets* has integrated a detailed CI process, allowing contributors to build the project reliably and efficiently. The following CI practices are applied during the development of *Beets*:
- When a contributor opens a pull request (PR), it needs to pass 12 automatic checks: one *lint* check, one documentation validity check, and ten test runs covering five versions on the two supported platforms (Ubuntu and Windows). The automatic check process is shown in the figure *Automatic checks in CI*.
- When a PR is submitted, it triggers *GitHub Actions* to run automated test scripts on the different test platforms, which install the dependencies and execute the test flow.
- *Beets* also incorporates documentation into its CI via a single file named *changelog.rst*. This changelog summarizes all submitted PRs, including those that only change documentation. Every contributor needs to register their PR in the changelog with the PR number, a brief description of the changes, and the type of change (bug fix, new feature, new plugin, etc.).
- Every code contribution to *Beets* requires four parts: code, documentation, tests, and a changelog entry. All developers adhere to this standard workflow, which keeps the whole system clear and understandable for succeeding contributors.
{{<image file="beets_CI.png" caption="Automatic checks in CI">}}
## Test Process and Test Coverage
Tests help improve the reliability of software. *Beets*'s tests are written with *unittest* and *mock*: *unittest* tests the validity of each functional unit, namely the methods in Python, while *mock* simulates the behavior of external objects that the units depend on, so that the tests can focus on the code under test instead of its external dependencies. [^2]
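The combination of *unittest* and *mock* described above can be sketched as follows. The unit under test and the lookup service are hypothetical examples, not code from the beets test suite.

```python
# A minimal sketch of the unittest + mock style: the external
# dependency (here, a made-up metadata lookup) is replaced with a
# Mock so the unit under test runs in isolation.
import unittest
from unittest import mock


def tag_item(title, lookup):
    """Unit under test: format a track using an external lookup service."""
    info = lookup(title)
    return f"{info['artist']} - {title}"


class TagItemTest(unittest.TestCase):
    def test_tag_item_uses_lookup(self):
        # The Mock stands in for the real (slow, networked) service.
        fake_lookup = mock.Mock(return_value={"artist": "Miles Davis"})
        result = tag_item("So What", fake_lookup)
        self.assertEqual(result, "Miles Davis - So What")
        # mock also lets us verify how the dependency was called.
        fake_lookup.assert_called_once_with("So What")
```

A test module like this can be run with `python -m unittest`; no network or external service is needed because the dependency is mocked out.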
*Tox* is used to set up and execute the tests locally before submitting a PR. Tests can be executed against multiple Python versions; in the following use case, they are run with Python 2.7:
```
tox -e py27
```
With the following command, we can check the coverage of tests:
```
tox -e cov
```
*Beets* currently has 68 tests, while the average test coverage is merely 40%, which is somewhat low. Improving test coverage is urged, since it helps identify defects and gaps in requirements at an early stage and saves a lot of work later. [^3]
It is also good practice to check for programmatic and stylistic errors with the following command:
```
tox -e lint
```
*Lint* marks occurrences of suspicious or non-idiomatic constructs (i.e., potential bugs) in source code, which is important for reducing errors and improving the overall quality of the code.
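As an illustration, here is a short Python sketch (not from the beets codebase) of two constructs a typical linter flags, shown already corrected, with comments noting the problematic form:

```python
# Examples of issues a linter typically flags, with the fixes applied.

def first_or_default(items, default=None):
    # Linters flag "if items == None" (PEP 8 E711); "is None" is correct.
    if items is None:
        return default
    return items[0] if items else default


def append_tag(tag, tags=None):
    # Linters flag a mutable default argument like "tags=[]",
    # which would be shared across calls; the None idiom avoids that.
    if tags is None:
        tags = []
    tags.append(tag)
    return tags
```

Both patterns are easy to miss in review but trivial for a linter to catch automatically, which is exactly why the check is part of the CI pipeline.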
## Hotspot components
The figures generated by CodeScene[^4] make it clear that *Beets* does not have many contributors. The number of active contributors peaked at 35 in 2017, later than the year with the highest commits per author. This indicates that before 2016 the project was not yet prominent and still needed many features. By analyzing these contributions, we identified hotspot components, defined as files in the codebase that change frequently.
{{<image file="beets_commit.png" caption="Commit activity trend">}}
{{<image file="beets_contributor.png" caption="Active Contributors">}}
{{<image file="beets_hotspot.png" caption="Hotspot">}}
The most significant hotspot component is the `replaygain.py` file, the large red dot on the right in the figure above. This plugin supports ReplayGain, a technique for normalizing audio playback levels.
Another hotspot is the `library.py` file. Its `Library` class is the central repository for data in beets: it represents a database containing songs (`Item` instances) and groups of items (`Album` instances). `Library` is instantiated as a singleton, and a single invocation of beets usually has only one `Library`, powered under the hood by `dbcore`'s `Database`, which handles the SQLite abstraction, something like a very minimal ORM. The library is also responsible for handling queries that retrieve stored objects.
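The "very minimal ORM" idea can be sketched with nothing but the standard library: a thin class that maps rows in an SQLite table to Python objects and answers queries. This is an illustrative analogue, not the actual beets code; the class and table names are made up.

```python
# A self-contained sketch of a minimal ORM over SQLite, analogous in
# spirit to the Library/Database layering described above.
import sqlite3


class MiniLibrary:
    """Toy central repository: stores items and answers queries."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS items "
            "(id INTEGER PRIMARY KEY, title TEXT, album TEXT)"
        )

    def add(self, title, album):
        # Map an object's fields onto a table row.
        cur = self.db.execute(
            "INSERT INTO items (title, album) VALUES (?, ?)", (title, album)
        )
        self.db.commit()
        return cur.lastrowid

    def items(self, album=None):
        """Query stored items, optionally filtered by album."""
        if album is None:
            rows = self.db.execute("SELECT title, album FROM items")
        else:
            rows = self.db.execute(
                "SELECT title, album FROM items WHERE album = ?", (album,)
            )
        # Map rows back to Python objects (here, plain dicts).
        return [{"title": t, "album": a} for t, a in rows]
```

The point of the layering is the same as in beets: callers work with objects and queries, while one small class owns all the SQL.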
## Code quality
Code quality determines the ease of understanding, writing, and maintaining code. At the most basic level, it includes maintaining a specific coding style and contributing instructions.
The most basic and effective enforcement is PEP 8, a style guide for Python code that aims to make code easier to understand, test, and maintain. These rules keep the code organized and help programmers write better code by default.
For code documentation, Google's docstring format is the chosen style for describing the attributes of a class or the parameters of a method.
Besides the docstrings, the code also has comments in places where the code itself is not expressive enough, or where additional motivation is needed for specific lines, to help readers understand them more clearly.
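For reference, here is a short example of the Google docstring format. The function itself is illustrative, not taken from beets:

```python
# Illustrative function documented in the Google docstring style
# (Args / Returns sections), as used throughout the codebase.

def normalize_title(title, strip_feat=True):
    """Normalize a track title for comparison.

    Args:
        title: The raw track title string.
        strip_feat: Whether to drop a trailing "feat. ..." credit.

    Returns:
        The lowercased title with surrounding whitespace removed.
    """
    title = title.strip().lower()
    if strip_feat and " feat." in title:
        title = title.split(" feat.")[0]
    return title
```

The structured `Args`/`Returns` sections make the documentation machine-readable for tools like Sphinx as well as easy to scan for humans.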
Using style-checking tools invoked through tox, contributors can automatically detect formatting mistakes, which helps maintain a correct and consistent code style. Even for aspects a linter does not automatically enforce, such as naming conventions, the code quality is high: the code is easy to understand just by reading it.
## Quality Culture
In this part, we illustrate the quality culture of *Beets*, which mainly revolves around its issues and PRs, using ten examples of each.
We selected issues and PRs that give a clear overview of the quality culture of beets and the way the system is maintained. The following table lists the selected issues and PRs together with their current status.
| Issues | Status | PRs | Status |
| :----: | :----: | :----: |:----: |
| [#1840](https://github.com/beetbox/beets/issues/1840) | closed | [#4302](https://github.com/beetbox/beets/pull/4302) | merged |
| [#4168](https://github.com/beetbox/beets/issues/4168) | closed | [#4209](https://github.com/beetbox/beets/pull/4209) | merged |
| [#4169](https://github.com/beetbox/beets/issues/4169) | closed | [#4192](https://github.com/beetbox/beets/pull/4192) | merged |
| [#4116](https://github.com/beetbox/beets/issues/4116) | closed | [#4148](https://github.com/beetbox/beets/pull/4148) | closed |
| [#4089](https://github.com/beetbox/beets/issues/4089) | closed | [#4261](https://github.com/beetbox/beets/pull/4261) | closed |
| [#4305](https://github.com/beetbox/beets/issues/4305) | open | [#4195](https://github.com/beetbox/beets/pull/4195) | merged |
| [#4235](https://github.com/beetbox/beets/issues/4235) | open | [#4196](https://github.com/beetbox/beets/pull/4196) | merged |
| [#4072](https://github.com/beetbox/beets/issues/4072) | open | [#4130](https://github.com/beetbox/beets/pull/4130) | merged |
| [#3916](https://github.com/beetbox/beets/issues/3916) | closed | [#4124](https://github.com/beetbox/beets/pull/4124) | merged |
| [#3979](https://github.com/beetbox/beets/issues/3979) | open | [#3589](https://github.com/beetbox/beets/pull/3589) | open |
From this analysis, we can see that not every issue leads to a solution; some remain open purely for discussion. If an issue is fixed in a pull request, the PR must first pass the checks. Some PRs were closed because of invalid code style or failure to pass the checks. The maintainers of *Beets* then assess the necessity of the pull request and decide whether it will be merged into the main branch. As the variety of selected issues and PRs shows, all modifications follow a fixed and comprehensive flow, which keeps the whole system easy to maintain and of good quality. We illustrate the quality culture of *Beets* with some detailed examples below.
Issue [#4168](https://github.com/beetbox/beets/issues/4168) is an interesting example since it is linked to two pull requests and took multiple attempts to fix; it shows how pull requests tied to the same issue can be merged or closed. The first attempt, PR [#4192](https://github.com/beetbox/beets/pull/4192), worked well on the contributor's local machine, but not in the maintainer's environment. In the second attempt, PR [#4209](https://github.com/beetbox/beets/pull/4209), the contributor proposed an improved fix that passed the tests and was eventually merged.
There are also examples of closed issues and of PRs not merged into the main branch. Issue [#3916](https://github.com/beetbox/beets/issues/3916) argued that non-matching albumdisambigs should be penalized more heavily; it was closed and marked as stale after more than two months without active discussion. PR [#3589](https://github.com/beetbox/beets/pull/3589) attempts to fetch artists, performers, engineers, etc. flexibly, but it has not been merged because it has conflicts that must be resolved.
In summary, the quality culture of *Beets* rests on active discussion, constructive solutions from contributors, and rigorous verification by maintainers.
## Technical Debt
As defined by Ward Cunningham[^5], a system always contains cruft that makes the existing code harder to revise, and technical debt means dealing with this cruft like financial debt. Because of these internal deficiencies, extra effort must be spent to make changes, which corresponds to the interest on the debt.
To make an objective analysis of the technical debt of *Beets*, we use the tool SonarQube[^6] to analyze its code.
{{<image file="beets_technical_debt.png" caption="Technical debt analysis of Beets using SonarQube">}}
As shown in the figure above, 9 days and 1 hour would be needed to resolve all the technical debt, which is not a good result in absolute terms. However, the debt ratio of *Beets* is only 0.4%, which earns it an A maintainability rating. These results include the test files of *Beets*. From the debt overview, we can see that `command.py` contributes a large portion of the overall debt, 3 hours and 50 minutes; but considering its number of lines, its maintainability can still be rated A. Overall, the technical debt of the *Beets* codebase is low, and the whole system is easy to maintain.
As for the documentation of *Beets*, improvements should still be made to clarify the usage of some plugins. That would give developers a clearer view of the code, which would reduce the time needed to modify and extend the system's functionality.
The last item of technical debt is test coverage, which is below 60% and does not reach the standard.
## References
[^1]:https://www.freecodecamp.org/news/what-are-github-actions-and-how-can-you-automate-tests-and-slack-notifications/
[^2]:https://www.telerik.com/products/mocking/unit-testing.aspx
[^3]:https://www.simform.com/blog/test-coverage/
[^4]:https://codescene.io/
[^5]:http://c2.com/doc/oopsla92.html
[^6]:https://www.sonarqube.org/