---
tags: community, minutes
---
# PlasmaPy Community Meeting | Tuesday 2021 November 30 at 18:00 UT
### Video Conference Information
* [Zoom link](https://zoom.us/j/91633383503?pwd=QWNkdHpWeFhrYW1vQy91ODNTVG5Ndz09)
* Instant messaging: [Matrix](https://app.element.io/#/room/#plasmapy:openastronomy.org) and [Gitter](https://gitter.im/PlasmaPy/Lobby)
* [GitHub Minutes Repository](https://github.com/PlasmaPy/plasmapy-project/tree/master/minutes)
* ["Community" Sub-directory](https://github.com/PlasmaPy/plasmapy-project/tree/master/minutes/_community)
* [PlasmaPy on GitHub](https://github.com/PlasmaPy/plasmapy) ([pull requests](https://github.com/PlasmaPy/plasmapy/pulls), [issues](https://github.com/PlasmaPy/plasmapy/issues))
* [PlasmaPy Enhancement Proposals on GitHub](https://github.com/PlasmaPy/PlasmaPy-PLEPs)
* [PlasmaPy Google Calendar](https://calendar.google.com/calendar/embed?src=c_sqqq390s24jjfjp3q86pv41pi8%40group.calendar.google.com&ctz=America%2FNew_York)
## Agenda (please feel free to edit or add items)
1. Introductions
2. 10-15 minutes for [roadmap](https://hackmd.io/@plasmapy/ry0mmnj6v)
3. Solicit "Project Issues"
4. MagNetUS open science update (quick)
5. Normalization classes (if time)
6. [conda-forge build is failing](https://github.com/conda-forge/plasmapy-feedstock/pull/16#issuecomment-977102212)
    1. It also seems to be failing locally, but there are no failures in the remote tests on our repo?
7. Issues
    1. ...
8. Pull requests in progress
    1. [Adopt Contributor Covenant v2.1](https://github.com/PlasmaPy/PlasmaPy)
9. Pull requests **MERGED**
    1. ...
## Attendees
* Erik
* Nick
* Dominik
## Action Items
***Person***
* ...
## Minutes
* MagNetUS
    * Meeting was yesterday (11/29)
    * There is interest in open science
    * Video recording will be posted
* conda-forge
    * https://github.com/conda-forge/plasmapy-feedstock/pull/16
    * `test_grid_methods`
        * a full `pytest` run fails, while `pytest -k test_grid_methods` does not
        * neither does `pytest plasmapy/plasma/tests/test_grids.py`
    * current working hypothesis:
        * some test is modifying the `grids` module at runtime
        * we are not picking that up on `main` because GitHub Actions runs are parallelized via `pytest-xdist`
        * each worker subprocess has its own copy of the `grids` module in memory
        * `test_grid_methods` therefore never sees the change (see the sketch below)
    * confirmed by running `pytest -n 1 --dist=loadfile plasmapy/diagnostics plasmapy/plasma` (fails) vs. `-n 3` (passes)
        * the long command is needed because of issues with parallel test discovery in `particles`
    * Is there a way to catch these test failures in the future?
        * Run a sequential version of the test suite, probably as a GitHub Actions cron job (see the workflow sketch below)
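
To illustrate the working hypothesis above, here is a minimal, self-contained sketch of cross-test module pollution and how `pytest-xdist` can mask it. The module attribute and test names are hypothetical stand-ins, not PlasmaPy's actual tests:

```python
# test_pollution_sketch.py -- hypothetical illustration, not PlasmaPy's actual tests

import types

# Stand-in for a shared module such as `plasmapy.plasma.grids`.
# A real imported module is cached in `sys.modules`, so it is shared
# by every test that runs in the same process.
grids = types.SimpleNamespace(method="linear")


def test_modifies_module():
    # Mutates module-level state at runtime and never restores it.
    grids.method = "nearest"
    assert grids.method == "nearest"


def test_uses_default():
    # Implicitly assumes pristine module state, so it fails when it runs
    # in the same process after test_modifies_module.
    assert grids.method == "linear"
```

Run sequentially (`pytest test_pollution_sketch.py`, no `-n` option), `test_uses_default` fails because it executes after `test_modifies_module` in the same process. With `pytest -n 2`, the two tests may be sent to different worker processes, each of which imports its own fresh copy of the module state, so the failure can be hidden.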
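The sequential cron run could live in a separate GitHub Actions workflow. The following is a rough sketch only; the file name, schedule, Python version, and install command are placeholder assumptions, not settled configuration:

```yaml
# .github/workflows/sequential-tests.yml (hypothetical sketch)
name: sequential tests

on:
  schedule:
    - cron: "0 6 * * *"  # daily at 06:00 UTC; cadence is a placeholder

jobs:
  sequential-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: "3.9"
      - name: Install PlasmaPy and test dependencies
        # assumes a `tests` extra; adjust to the real test requirements
        run: pip install .[tests]
      - name: Run the test suite in a single process
        # no -n option, so pytest-xdist does not parallelize and
        # cross-test module pollution is not masked
        run: pytest
```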