# Geomagnetic model evaluation

**Proposal:** Add an API to VirES to allow on-demand geomagnetic model evaluation over custom coordinates.

**API:** The API accepts a VirES-style model specification string and a netCDF file (.nc). The file contains a grid of `Timestamp`, `Latitude`, `Longitude`, `Radius` coordinates. The API returns a matching netCDF file additionally containing `B_NEC` evaluated at those coordinates.

**Python interface** in viresclient: A convenience interface will be provided that accepts numpy arrays and transparently handles the creation of the input netCDF file and the extraction of variables from the returned netCDF. This is expected to be the main way that people interact with the API (at least in the short term). The user experience could follow something like:

```
model_values = evaluate_magnetic_model(
    model="CHAOS",
    times=np.array([datetime(...), ...]),
    positions=np.array(...),
)
model_values.as_numpy()
model_values.as_pandas()
model_values.as_xarray()
```

**Longer term:** This extends the scope of VirES towards being an on-demand model provider, specifically for geomagnetic models. We should consider a web interface that makes it more accessible. Some existing examples:

- NASA CCMC:
  - https://ccmc.gsfc.nasa.gov/models/IGRF~13/
  - https://ccmc.gsfc.nasa.gov/models/CM5~1.0/
- BGS:
  - http://www.geomag.bgs.ac.uk/data_service/models_compass/wmm_calc.html
  - https://geomag.bgs.ac.uk/web_service/GMModels/help/general
    - e.g. https://geomag.bgs.ac.uk/web_service/GMModels/igrf/13?latitude=0&longitude=0&altitude=0&date=2020-01-01&format=json
- GFZ:
  - https://ionocovar.agnld.uni-potsdam.de/Kalmag/
- To make the VirES models more accessible, can we make them available without login?
  - Could we consider a special service account for situations like this? i.e. an account that allows $N$ simultaneous async requests, which a public web frontend would use on behalf of anonymous users.

**Considerations:**

- Are there any relevant specifications/standards?
- Is it a problem to use netCDF instead of CDF (cf. SPASE & ISTP/CDF)?
- Are we creating a problem for future interoperability?
- What is a reasonable upper file-size limit for a single request?
  - This presumably corresponds to a fairly large grid, so it should not be too limiting.
  - Or maybe the processing duration could impose a limit?
- Should we be considering extra utilities beyond just returning field values?
  - e.g. L-values

## Implementation plan?

The input consists of:

- coordinates, supplied as one of:
  - a set of explicit coordinates (mandatory Time, Latitude, Longitude, Radius)
  - a fixed location (Latitude, Longitude, Radius), with a Time start, end, and time delta
  - a fixed grid (Latitude, Longitude, Radius) with a delta defining the step in Latitude/Longitude, and a Time start, end, and time delta
- a string defining the model to access
- a list of output parameters, e.g. `["B_NEC"]` (maybe we extend the options later)

Outputs:

- the evaluated values (i.e. `B_NEC`) on the provided grid

The inputs may be supplied as JSON parameters or within a netCDF file, and the evaluated model values returned as either JSON or netCDF (or CDF or CSV?). Two illustrative sketches follow.
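
As a rough illustration of the netCDF input path, the sketch below shows what the viresclient helper might do internally: pack user-supplied numpy arrays into a small netCDF file with the coordinate names proposed above. This is a minimal sketch under assumptions; the API endpoint, upload mechanism, and exact variable conventions are not yet defined.

```python
# Minimal sketch (assumption: coordinate names Timestamp/Latitude/Longitude/Radius
# as proposed above; the upload/endpoint details are not yet defined).
from datetime import datetime, timedelta

import numpy as np
import xarray as xr

n = 24
times = np.array(
    [datetime(2020, 1, 1) + timedelta(hours=i) for i in range(n)],
    dtype="datetime64[ns]",
)
latitudes = np.linspace(-80, 80, n)        # degrees
longitudes = np.linspace(-180, 170, n)     # degrees
radii = np.full(n, 6_371_200.0 + 450e3)    # metres, geocentric

# Pack the request coordinates into a netCDF file, one value per Timestamp.
ds = xr.Dataset(
    {
        "Latitude": ("Timestamp", latitudes),
        "Longitude": ("Timestamp", longitudes),
        "Radius": ("Timestamp", radii),
    },
    coords={"Timestamp": times},
)
ds.to_netcdf("model_evaluation_request.nc")
# This file would be sent together with a model specification string (e.g. "CHAOS");
# the response netCDF would contain the same coordinates plus a B_NEC variable.
```
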
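For the JSON-parameter path, a possible request body is sketched below, covering the "fixed grid + time range" input option. All parameter names and the overall schema here are hypothetical, purely to make the option concrete.

```python
# Hypothetical JSON request (illustrative schema only, not a defined API).
import json

request = {
    "model": "IGRF",
    "output": ["B_NEC"],
    "coordinates": {
        "type": "grid",
        "latitude": {"start": -90, "stop": 90, "step": 5},      # degrees
        "longitude": {"start": -180, "stop": 180, "step": 5},   # degrees
        "radius": 6_371_200.0,                                   # metres
        "time": {
            "start": "2020-01-01T00:00:00Z",
            "stop": "2020-01-02T00:00:00Z",
            "step": "PT1H",                                      # ISO 8601 duration
        },
    },
}
print(json.dumps(request, indent=2))
```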