# Volume rendering intro for geodata in yt

- this presentation: https://bit.ly/ytgeostuff
- yt's overview on volume rendering: link
- Chris's 2020 AGU poster: citeable archive, direct repo link
- Other code/repos:
  - yt: https://yt-project.org/
  - ytgeotools: https://github.com/chrishavlin/ytgeotools
  - yt_idv: https://github.com/yt-project/yt_idv
9/27/2022

## Table of Contents

- Overview
- Experiments in daskifying yt
- Development plan
- Some Final Notes

## yt and Dask: an overview

In the past months, I've been investigating and working on integrating Dask into the yt codebase. This document provides an overview of my efforts to date, but it is also meant as a preliminary YTEP (or pYTEP?) to solicit feedback from the yt community at an early stage, before getting too far into the weeds of refactoring.
1/15/2021

## 1. (particle) data IO

Full notebook available here.

yt reads particle and grid-based data by iterating across the chunks with frontend-specific IO functions. For gridded data, each frontend implements a `_read_fluid_selection` (e.g., `yt.frontend.amrvac.AMRVACIOHandler._read_fluid_selection`) that iterates over chunks and returns a flat dictionary with numpy arrays concatenated across the chunks. For particle data, frontends must implement a similar function, `_read_particle_fields`, which typically gets invoked within the `BaseIOHandler._read_particle_selection` function. In both cases, the read functions accept the chunks iterator, the fields to read, and a selector object:

```python
def _read_particle_fields(self, chunks, ptf, selector): ...
def _read_fluid_selection(self, chunks, selector, fields, size): ...
```
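To make the read pattern concrete, here is a minimal, frontend-agnostic sketch of the chunked-read logic described above: iterate over chunks, collect per-chunk arrays for each field, and concatenate into a flat `{field: ndarray}` dictionary. The function name, the plain-dict "chunks", and the ignored `selector`/`size` arguments are illustrative stand-ins, not yt's actual internals (a real frontend would apply the selector while reading from disk).

```python
import numpy as np

def read_fluid_selection(chunks, selector, fields, size):
    """Sketch of a chunked fluid read: flat dict of concatenated arrays."""
    # accumulate per-chunk arrays for each requested field
    per_field = {field: [] for field in fields}
    for chunk in chunks:
        for field in fields:
            # a real frontend would read from disk here and apply `selector`
            per_field[field].append(np.asarray(chunk[field]))
    # flatten: one contiguous array per field, concatenated across chunks
    return {field: np.concatenate(arrs) for field, arrs in per_field.items()}

# two fake "chunks" holding a density field
chunks = [{"density": np.ones(3)}, {"density": np.zeros(2)}]
out = read_fluid_selection(chunks, selector=None, fields=["density"], size=5)
print(out["density"].shape)  # (5,)
```

The dask experiments discussed in these notes replace the eager `np.concatenate` step with delayed per-chunk reads.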
## 3. dask-unyt arrays

yt uses unyt to track and convert units, so if we are using dask for IO and want to return delayed arrays, we need some level of dask-unyt support. In the notebook, working with unyt and dask, I demonstrate an initial prototype of a dask-unyt array. In this notebook, I create a custom dask collection by subclassing the primary dask.array class and adding some unyt functionality in hidden sidecar attributes. This custom class is handled automatically by the dask scheduler, so that if we have a dask client running and we create our new dask-unyt array, e.g.:

```python
import dask.array as da
from dask.distributed import Client

client = Client(threads_per_worker=2, n_workers=2)
```
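The "sidecar attribute" idea can be illustrated without dask at all. The sketch below uses a plain numpy subclass that carries a `units` string alongside its data and propagates it through slices and arithmetic, analogous to how the prototype attaches unyt information to a dask.array subclass. The `UnitArray` class and its attribute names are hypothetical, not taken from the actual prototype.

```python
import numpy as np

class UnitArray(np.ndarray):
    """A numpy subclass carrying a sidecar `units` attribute."""

    def __new__(cls, input_array, units=None):
        obj = np.asarray(input_array).view(cls)
        obj.units = units  # the sidecar attribute
        return obj

    def __array_finalize__(self, obj):
        # propagate the sidecar attribute through views, slices,
        # and the results of ufunc operations
        if obj is None:
            return
        self.units = getattr(obj, "units", None)

x = UnitArray(np.arange(4.0), units="km")
y = x * 2.0
print(y.units)  # the units ride along through operations: km
```

In the dask version, the same bookkeeping happens on the lazy collection, so units survive graph construction and are still available after `.compute()`.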