yt-4.4.0/CITATION

To cite yt in publications, please use:
Turk, M. J., Smith, B. D., Oishi, J. S., et al. 2011, ApJS, 192, 9
In the body of the text, please add a footnote to the yt webpage:
http://yt-project.org/
For LaTeX and BibTeX users:
\bibitem[Turk et al.(2011)]{2011ApJS..192....9T} Turk, M.~J., Smith, B.~D.,
Oishi, J.~S., et al.\ 2011, The Astrophysical Journal Supplement Series, 192, 9
@ARTICLE{2011ApJS..192....9T,
author = {{Turk}, M.~J. and {Smith}, B.~D. and {Oishi}, J.~S. and {Skory}, S. and
{Skillman}, S.~W. and {Abel}, T. and {Norman}, M.~L.},
title = "{yt: A Multi-code Analysis Toolkit for Astrophysical Simulation Data}",
journal = {The Astrophysical Journal Supplement Series},
archivePrefix = "arXiv",
eprint = {1011.3514},
primaryClass = "astro-ph.IM",
keywords = {cosmology: theory, methods: data analysis, methods: numerical},
year = 2011,
month = jan,
volume = 192,
eid = {9},
pages = {9},
doi = {10.1088/0067-0049/192/1/9},
adsurl = {http://adsabs.harvard.edu/abs/2011ApJS..192....9T},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
yt can also make use of functionality from other packages. If you utilize
ORIGAMI, we ask that you please cite the ORIGAMI paper:
@ARTICLE{2012ApJ...754..126F,
author = {{Falck}, B.~L. and {Neyrinck}, M.~C. and {Szalay}, A.~S.},
title = "{ORIGAMI: Delineating Halos Using Phase-space Folds}",
journal = {\apj},
archivePrefix = "arXiv",
eprint = {1201.2353},
primaryClass = "astro-ph.CO",
keywords = {dark matter, galaxies: halos, large-scale structure of universe, methods: numerical},
year = 2012,
month = aug,
volume = 754,
eid = {126},
pages = {126},
doi = {10.1088/0004-637X/754/2/126},
adsurl = {http://adsabs.harvard.edu/abs/2012ApJ...754..126F},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
The main homepage for ORIGAMI can be found here:
http://icg.port.ac.uk/~falckb/origami.html
yt-4.4.0/CONTRIBUTING.rst

.. This document is rendered in HTML with cross-reference links filled in at
https://yt-project.org/doc/developing/developing.html
.. _getting-involved:
Getting Involved
================
There are *lots* of ways to get involved with yt, as a community and as a
technical system -- and not all of them involve contributing code: you can also
participate in the community, help us with designing the websites, add
documentation, and share your scripts with others.
Coding is only one way to be involved!
Communication Channels
----------------------
There are three main communication channels for yt:
* Many yt developers participate in the yt Slack community. Slack is a free
chat service that many teams use to organize their work. You can get an
invite to yt's Slack organization by clicking the "Join us @ Slack" button
on this page: https://yt-project.org/community.html
* `yt-users <https://mail.python.org/archives/list/yt-users@python.org/>`_
is a relatively high-traffic mailing list where people are encouraged to ask
questions about the code, figure things out and so on.
* `yt-dev <https://mail.python.org/archives/list/yt-dev@python.org/>`_ is
a much lower-traffic mailing list designed to focus on discussions of
improvements to the code, ideas about planning, development issues, and so
on.
The easiest way to get involved with yt is to read the mailing lists, hang out
in IRC or Slack chat, and participate. If someone asks a question you know the
answer to (or have your own question about!), write back and answer it.
If you have an idea about something, suggest it! We not only welcome
participation, we encourage it.
Documentation
-------------
The yt documentation is constantly being updated, and we would very much
appreciate assistance with it. Whether that means adding a section, updating an
outdated section, contributing typo or grammatical fixes, adding a FAQ, or
increasing coverage of functionality, your help would be welcome.
The easiest way to help out is to fork the main yt repository and make changes
in it to contribute back to the ``yt-project``. A fork is a copy
of a repository; in this case the fork will live in the space under your
username on GitHub, rather than under ``yt-project``. If you have never made a
fork of a repository on GitHub, or are unfamiliar with this process, here is a
short article about how to do so:
https://help.github.com/en/github/getting-started-with-github/fork-a-repo .
The documentation for
``yt`` lives in the ``doc`` directory in the root of the yt git
repository. To make a contribution to the yt documentation you will
make your changes in your own fork of ``yt``. When you are done,
issue a pull request through the website for your new fork, and we can comment
back and forth and eventually accept your changes. See :ref:`sharing-changes` for
more information about contributing your changes to yt on GitHub.
Gallery Images and Videos
-------------------------
If you have an image or video you'd like to display in the image or video
galleries, getting it included is easy! You can either fork the yt homepage
repository and add it there, or email it to us and we'll add it to the
Gallery.
We're eager to show off the images and movies you make with yt, so please feel
free to drop us a line and let us know if you've got something great!
Technical Contributions
-----------------------
Contributing code is another excellent way to participate -- whether it's
bug fixes, new features, analysis modules, or a new code frontend. See
:ref:`creating_frontend` for more details.
The process is pretty simple: fork on GitHub, make changes, issue a pull
request. We can then go back and forth with comments in the pull request, but
usually we end up accepting.
For more information, see :ref:`contributing-code`, where we spell out how to
get up and running with a development environment, how to commit, and how to
use GitHub. When you're ready to share your changes with the community, refer to
:ref:`sharing-changes` to see how to contribute them back upstream.
Online Presence
---------------
Some of these fall under the other items, but if you'd like to help out with
the website or any of the other ways yt is presented online, please feel free!
Almost everything is kept in git repositories on GitHub, and it is very easy
to fork and contribute back changes.
Please feel free to dig in and contribute changes.
Word of Mouth
-------------
If you're using yt and it has increased your productivity, please feel
encouraged to share that information. Cite our `paper
<https://ui.adsabs.harvard.edu/abs/2011ApJS..192....9T>`_, tell your colleagues,
and just spread word of mouth. By telling people about your successes, you'll
help bring more eyes and hands to the table -- in this manner, by increasing
participation, collaboration, and simply spreading the limits of what the code
is asked to do, we hope to help scale the utility and capability of yt with the
community size.
Feel free to blog about, tweet about, and talk about what you are up to!
Long-Term Projects
------------------
There are some out-there ideas that have been bandied about for the
future directions of yt -- stuff like fun new types of visualization, remapping
of coordinates, new ways of accessing data, and even new APIs to make life easier.
yt is an ambitious project. Let's be ambitious together!
yt Community Code of Conduct
----------------------------
The community of participants in open source
scientific projects is made up of members from around the
globe with a diverse set of skills, personalities, and
experiences. It is through these differences that our
community experiences success and continued growth. We
expect everyone in our community to follow these guidelines
when interacting with others both inside and outside of our
community. Our goal is to keep ours a positive, inclusive,
successful, and growing community.
As members of the community,
- We pledge to treat all people with respect and
provide a harassment- and bullying-free environment,
regardless of sex, sexual orientation and/or gender
identity, disability, physical appearance, body size,
race, nationality, ethnicity, and religion. In
particular, sexual language and imagery, sexist,
racist, or otherwise exclusionary jokes are not
appropriate.
- We pledge to respect the work of others by
recognizing acknowledgment/citation requests of
original authors. As authors, we pledge to be explicit
about how we want our own work to be cited or
acknowledged.
- We pledge to welcome those interested in joining the
community, and realize that including people with a
variety of opinions and backgrounds will only serve to
enrich our community. In particular, discussions
relating to pros/cons of various technologies,
programming languages, and so on are welcome, but
these should be done with respect, taking proactive
measures to ensure that all participants are heard and
feel confident that they can freely express their
opinions.
- We pledge to welcome questions and answer them
respectfully, paying particular attention to those new
to the community. We pledge to provide respectful
criticisms and feedback in forums, especially in
discussion threads resulting from code
contributions.
- We pledge to be conscientious of the perceptions of
the wider community and to respond to criticism
respectfully. We will strive to model behaviors that
encourage productive debate and disagreement, both
within our community and where we are criticized. We
will treat those outside our community with the same
respect as people within our community.
- We pledge to help the entire community follow the
code of conduct, and to not remain silent when we see
violations of the code of conduct. We will take action
when members of our community violate this code, such as by
contacting confidential@yt-project.org (all emails sent to
this address will be treated with the strictest
confidence) or by talking privately with the person.
This code of conduct applies to all
community situations online and offline, including mailing
lists, forums, social media, conferences, meetings,
associated social events, and one-to-one interactions.
The yt Community Code of Conduct was adapted from the
`Astropy Community Code of Conduct
`_,
which was partially inspired by the PSF code of conduct.
.. _contributing-code:
How to Develop yt
=================
yt is a community project!
We are very happy to accept patches, features, and bugfixes from any member of
the community! yt is developed using git, primarily because it enables
very easy and straightforward submission of revisions. We're eager to hear
from you, and if you are developing yt, we encourage you to subscribe to the
`developer mailing list
<https://mail.python.org/archives/list/yt-dev@python.org/>`_. Please feel
free to hack around, commit changes, and send them upstream.
.. note:: If you already know how to use the `git version control system
<https://git-scm.com/>`_ and are comfortable with handling it yourself,
the quickest way to contribute to yt is to `fork us on GitHub
<https://github.com/yt-project/yt/fork>`_, make your changes, push the
changes to your fork, and issue a pull request. The rest of this
document is just an explanation of how to do that.
See :ref:`code-style-guide` for more information about coding style in yt and
:ref:`docstrings` for an example docstring. Please read them before hacking on
the codebase, and feel free to email any of the mailing lists for help with the
codebase.
Keep in touch, and happy hacking!
.. _open-issues:
Open Issues
-----------
If you're interested in participating in yt development, take a look at the
`issue tracker on GitHub
<https://github.com/yt-project/yt/issues>`_.
You can search by labels, indicating estimated level of difficulty or category,
to find issues that you would like to contribute to. Good first issues are
marked with a label of *new contributor friendly*. While we try to triage the
issue tracker regularly to assign appropriate labels to every issue, it may be
the case that issues not marked as *new contributor friendly* are actually
suitable for new contributors.
Here are some predefined issue searches that might be useful:
* Unresolved issues `marked "new contributor friendly"
<https://github.com/yt-project/yt/issues?q=is%3Aissue+is%3Aopen+label%3A%22new+contributor+friendly%22>`_.
* `All unresolved issues <https://github.com/yt-project/yt/issues>`_.
Submitting Changes
------------------
We provide a brief introduction to submitting changes here. yt thrives on the
strength of its communities (https://arxiv.org/abs/1301.7064 has further
discussion) and we encourage contributions from any user. While we do not
discuss version control, git, or the advanced usage of GitHub in detail
here, we do provide an outline of how to submit changes and we are happy to
provide further assistance or guidance.
Licensing
+++++++++
yt is licensed under the
BSD 3-clause license. Versions previous to yt-2.6 were released under the GPLv3.
All contributed code must be BSD-compatible. If you'd rather not license in
this manner, but still want to contribute, please consider creating an external
package, which we'll happily link to.
How To Get The Source Code For Editing
++++++++++++++++++++++++++++++++++++++
yt is hosted on GitHub, and you can see all of the yt repositories at
https://github.com/yt-project/. To fetch and modify source code, make sure you
have followed the steps above for bootstrapping your development (to ensure you
have a GitHub account, etc.).
In order to modify the source code for yt, we ask that you make a "fork" of the
main yt repository on GitHub. A fork is simply an exact copy of the main
repository (along with its history) that you will now own and to which you can make
modifications as you please. You can create a personal fork by visiting the yt
GitHub webpage at https://github.com/yt-project/yt/ . After logging in,
you should see an option near the top right labeled "fork". You now have
a forked copy of the yt repository for your own personal modification.
This forked copy exists on the GitHub repository, so in order to access
it locally you must clone it onto your machine from the command line:
.. code-block:: bash
$ git clone https://github.com/<USERNAME>/yt ./yt-git
This downloads that new forked repository to your local machine, so that you
can access it, read it, make modifications, etc. It will put the repository in
a local directory of the same name as the repository in the current working
directory.
.. code-block:: bash
$ cd yt-git
Verify that you are on the ``main`` branch of yt by running:
.. code-block:: bash
$ git branch
You can see any past state of the code by using the git log command.
For example, the following command would show you the last 5 revisions
(modifications to the code) that were submitted to that repository.
.. code-block:: bash
$ git log -n 5
Using the revision specifier (the number or hash identifier next to each
changeset), you can update the local repository to any past state of the
code (a previous changeset or version) by executing the command:
.. code-block:: bash
$ git checkout revision_specifier
You can always return to the most recent version of the code by executing the
same command as above with the most recent revision specifier in the
repository. However, using ``git log`` when you're checked out to an older
revision specifier will not show more recent changes to the repository. An
alternative option is to use ``checkout`` on a branch. In yt the ``main``
branch is our primary development branch, so checking out ``main`` should
return you to the tip (or most up-to-date revision specifier) on the ``main``
branch.
.. code-block:: bash
$ git checkout main
Lastly, if you want to use this new downloaded version of your yt repository as
the *active* version of yt on your computer (i.e. the one which is executed when
you run yt from the command line or the one that is loaded when you do ``import
yt``), then you must "activate" it by building yt from source as described in
:ref:`install-from-source`.
.. _reading-source:
How To Read The Source Code
+++++++++++++++++++++++++++
The root directory of the yt git repository contains a number of
subdirectories with different components of the code. Most of the yt source
code is contained in the yt subdirectory. This directory itself contains
the following subdirectories:
``frontends``
This is where interfaces to codes are created. Within each subdirectory of
yt/frontends/ there must exist the following files, even if empty (a minimal
skeleton sketch follows this directory overview):
* ``data_structures.py``, where subclasses of AMRGridPatch, Dataset
and AMRHierarchy are defined.
* ``io.py``, where a subclass of IOHandler is defined.
* ``fields.py``, where fields we expect to find in datasets are defined.
* ``misc.py``, where any miscellaneous functions or classes are defined.
* ``definitions.py``, where any definitions specific to the frontend are
defined (e.g., header formats).
``fields``
This is where all of the derived fields that ship with yt are defined.
``geometry``
This is where geometric helper routines are defined. Handlers
for grid and oct data, as well as helpers for coordinate transformations
can be found here.
``visualization``
This is where all visualization modules are stored. This includes plot
collections, the volume rendering interface, and pixelization frontends.
``data_objects``
All objects that handle data, processed or unprocessed, not explicitly
defined as visualization are located in here. This includes the base
classes for data regions, covering grids, time series, and so on. This
also includes derived fields and derived quantities.
``units``
This used to be where all the unit-handling code resided, but as of now it's
mostly just a thin wrapper around unyt.
``utilities``
All broadly useful code that doesn't clearly fit in one of the other
categories goes here.
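To make the ``frontends`` layout above concrete, here is a minimal,
hypothetical skeleton (class names and the on-disk field are invented for
illustration; the import paths follow the yt 4.x source tree, and a maintained
template lives in ``yt/frontends/_skeleton``):

.. code-block:: python

   # Hypothetical frontend sketch -- see yt/frontends/_skeleton for the
   # maintained template.
   from yt.data_objects.static_output import Dataset
   from yt.fields.field_info_container import FieldInfoContainer
   from yt.utilities.io_handler import BaseIOHandler


   class MyCodeFieldInfo(FieldInfoContainer):
       # Map on-disk field names to (units, aliases, display name).
       known_other_fields = (("density", ("g/cm**3", ["density"], None)),)


   class MyCodeDataset(Dataset):
       _field_info_class = MyCodeFieldInfo

       @classmethod
       def _is_valid(cls, filename, *args, **kwargs):
           # Claim only the files this frontend can actually read.
           return str(filename).endswith(".mycode")


   class MyCodeIOHandler(BaseIOHandler):
       _dataset_type = "mycode"

In a real frontend, ``data_structures.py`` would also subclass the index and
hierarchy machinery, and ``io.py`` would implement the ``_read_*`` methods
that feed data to yt.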
If you're looking for a specific file or function in the yt source code, use
the Unix ``find`` command:
.. code-block:: bash
$ find <DIRECTORY_TREE_TO_SEARCH> -name '<FILENAME>'
The above command will find the FILENAME in any subdirectory in the
DIRECTORY_TREE_TO_SEARCH. Alternatively, if you're looking for a function
call or a keyword in an unknown file in a directory tree, try:
.. code-block:: bash
$ grep -R <KEYWORD> <DIRECTORY_TREE_TO_SEARCH>
This can be very useful for tracking down functions in the yt source.
.. _building-yt:
Building yt
+++++++++++
If you have made changes to any C or Cython (``.pyx``) modules, you have to
rebuild yt before your changes are usable. See :ref:`install-from-source`.
.. _requirements-for-code-submission:
Requirements for Code Submission
--------------------------------
Modifications to the code typically fall into one of three categories, each of
which has different requirements for acceptance into the code base. These
requirements are in place for a few reasons -- to make sure that the code is
maintainable, testable, and that we can easily include information about
changes in changelogs during the release procedure. (See `YTEP-0008
`_ for more
detail.)
For all types of contributions, it is required that all tests pass, or that all non-passing tests are specifically accounted for.
* New Features
* New unit tests (possibly new answer tests) (See :ref:`testing`)
* Docstrings in the source code for the public API
* Addition of new feature to the narrative documentation (See :ref:`writing_documentation`)
* Addition of cookbook recipe (See :ref:`writing_documentation`)
* Extension or Breakage of API in Existing Features
* Update existing narrative docs and docstrings (See :ref:`writing_documentation`)
* Update existing cookbook recipes (See :ref:`writing_documentation`)
* Modify or create new unit tests (See :ref:`testing`)
* Bug fixes
* A unit test is encouraged, to ensure the breakage does not happen again in
the future. (See :ref:`testing`)
* At a minimum, a minimal, self-contained example demonstrating the bug should
be included in the body of the Pull Request, or as part of an
independent issue.
When submitting, you will be asked to make sure that your changes meet all of
these requirements. They are pretty easy to meet, and we're also happy to help
out with them. See :ref:`code-style-guide` for how to easily conform to our
style guide.
.. _git-with-yt:
How to Use git with yt
----------------------
If you're new to git, the following resource is pretty great for learning
the ins and outs:
* https://git-scm.com/
There also exist a number of step-by-step git tutorials to help you get used to
version controlling files with git. Here are a few resources that you may find
helpful:
* http://swcarpentry.github.io/git-novice/
* https://git-scm.com/docs/gittutorial
* https://try.github.io/
The commands that are essential for using git include:
* ``git --help`` which provides help for any git command. For example, you
can learn more about the ``log`` command by doing ``git log --help``.
* ``git add <paths>`` which stages changes to the specified paths for subsequent
committing (see below).
* ``git commit`` which commits staged changes (stage using ``git add`` as above)
in the working directory to the repository, creating a new "revision."
* ``git merge <branch>`` which merges the revisions from the specified branch
into the current branch, creating a union of their lines of development. This
updates the working directory.
* ``git pull <remote> <branch>`` which pulls revisions from the specified branch of the
specified remote repository into the current local branch. Equivalent to ``git
fetch <remote>`` and then ``git merge <remote>/<branch>``. This updates the
working directory.
* ``git push <remote>`` which sends revisions on local branches to matching
branches on the specified remote. ``git push <remote> <branch>`` will only
push changes for the specified branch.
* ``git log`` which shows a log of all revisions on the current branch. There
are many options you can pass to ``git log`` to get additional
information. One example is ``git log --oneline --decorate --graph --all``.
We are happy to answer questions about git use on our IRC channel, Slack
chat, or the mailing list, and to walk you through any troubles you might have.
Here are some general suggestions for using git with yt:
* Although not necessary, a common development workflow is to create a local
named branch other than ``main`` to address a feature request or bugfix. If
the dev work addresses a specific yt GitHub issue, you may include that issue
number in the branch name. For example, if you want to work on issue number X
regarding a cool new slice plot feature, you might name the branch:
``cool_new_plot_feature_X``. When you're ready to share your work, push your
feature branch to your remote and create a pull request to the ``main``
branch of the yt-project's repository.
* When contributing changes, you might be asked to make a handful of
modifications to your source code. We'll work through how to do this with
you, and try to make it as painless as possible.
* Your changes may fail automated style checks. See :ref:`code-style-guide` for
more information about automatically verifying your code style.
* You should only need one fork. To keep it in sync, you can sync from the
website. See :ref:`sharing-changes` for a description of the basic workflow
and :ref:`multiple-PRs` for a discussion about what to do when you want to
have multiple open pull requests at the same time.
* If you run into any troubles, stop by IRC (see :ref:`irc`), Slack, or the
mailing list.
.. _sharing-changes:
Making and Sharing Changes
--------------------------
The simplest way to submit changes to yt is to do the following:
* Build yt from the git repository
* Navigate to the root of the yt repository
* Make some changes and commit them
* Fork the `yt repository on GitHub <https://github.com/yt-project/yt>`_
* Push the changesets to your fork
* Issue a pull request.
Here's a more detailed flowchart of how to submit changes.
#. Fork yt on GitHub. (This step only has to be done once.) You can do
this at: https://github.com/yt-project/yt/fork.
#. Follow :ref:`install-from-source` for instructions on how to build yt
from the git repository. (Below, in :ref:`reading-source`, we describe how to
find items of interest.) If you have already forked the repository then
you can clone your fork locally::
git clone https://github.com/<USERNAME>/yt ./yt-git
This will create a local clone of your fork of yt in a folder named
``yt-git``.
#. Edit the source file you are interested in and
test your changes. (See :ref:`testing` for more information.)
#. Create a uniquely named branch to track your work. For example: ``git
checkout -b my-first-pull-request``
#. Stage your changes using ``git add <paths>``. This command takes an argument
which is a series of filenames whose changes you want to commit. After
staging, execute ``git commit -m "<commit message>. Addresses Issue
#X"``. Note that supplying an actual GitHub issue # in place of ``X`` will
cause your commit to appear in the issue tracker after pushing to your
remote. This can be very helpful for others who are interested in what work
is being done in connection to that issue.
#. Remember that this is a large development effort and to keep the code
accessible to everyone, good documentation is a must. Add in source code
comments for what you are doing. Add in docstrings
if you are adding a new function or class or keyword to a function.
Add documentation to the appropriate section of the online docs so that
people other than yourself know how to use your new code.
#. If your changes include new functionality or cover an untested area of the
code, add a test. (See :ref:`testing` for more information.) Commit
these changes as well.
#. Add your remote repository with a unique name identifier. It can be anything
but it is conventional to call it ``origin``. You can see names and URLs of
all the remotes you currently have configured with::
git remote -v
If you already have an ``origin`` remote, you can set it to your fork with::
git remote set-url origin https://github.com/<USERNAME>/yt
If you do not have an ``origin`` remote you will need to add it::
git remote add origin https://github.com/<USERNAME>/yt
In addition, it is also useful to add a remote for the main yt repository.
By convention we name this remote ``upstream``::
git remote add upstream https://github.com/yt-project/yt
Note that if you forked the yt repository on GitHub and then cloned from
there you will not need to add the ``origin`` remote.
#. Push your changes to your remote fork using the unique identifier you just
created and the command::
git push origin my-first-pull-request
Where you should substitute the name of the feature branch you are working on for
``my-first-pull-request``.
.. note::
Note that the above approach uses HTTPS as the transfer protocol
between your machine and GitHub. If you prefer to use SSH - or
perhaps you're behind a proxy that doesn't play well with SSL via
HTTPS - you may want to set up an `SSH key`_ on GitHub. Then, you use
the syntax ``ssh://git@github.com/<USERNAME>/yt``, or equivalent, in
place of ``https://github.com/<USERNAME>/yt`` in git commands.
For consistency, all commands we list in this document will use the HTTPS
protocol.
.. _SSH key: https://help.github.com/en/articles/connecting-to-github-with-ssh/
#. Issue a pull request at https://github.com/yt-project/yt/pull/new/main. A
pull request is essentially just asking people to review and accept the
modifications you have made to your personal version of the code.
During the course of your pull request you may be asked to make changes. These
changes may be related to style issues, correctness issues, or requesting
tests. The process for responding to pull request code review is relatively
straightforward.
#. Make requested changes, or leave a comment indicating why you don't think
they should be made.
#. Commit those changes to your local repository.
#. Push the changes to your fork::
git push origin my-first-pull-request
#. Your pull request will be automatically updated.
Once your pull request is merged, sync up with the main yt repository by pulling
from the ``upstream`` remote::
git checkout main
git pull upstream main
You might also want to sync your fork of yt on GitHub::
# sync my fork of yt with upstream
git push origin main
And delete the branch for the merged pull request::
# delete branch for merged pull request
git branch -d my-first-pull-request
git push origin --delete my-first-pull-request
These commands are optional but are nice for keeping your branch list
manageable. You can also delete the branch on your fork of yt on GitHub by
clicking the "delete branch" button on the page for the merged pull request on
GitHub.
.. _multiple-PRs:
Working with Multiple GitHub Pull Requests
------------------------------------------
Dealing with multiple pull requests on GitHub is straightforward. Development on
one feature should be isolated in one named branch, say ``feature_1`` while
development of another feature should be in another named branch, say
``feature_2``. A push to remote ``feature_1`` will automatically update any
active PR for which ``feature_1`` is a pointer to the ``HEAD`` commit. A push to
``feature_1`` *will not* update any pull requests involving ``feature_2``.
.. _code-style-guide:
Coding Style Guide
==================
Automatically checking and fixing code style
--------------------------------------------
We use the `pre-commit `_ framework to validate and
automatically fix code styling.
It is recommended (though not required) that you install ``pre-commit`` on your machine
(see their documentation) and, from the top level of the repo, run
.. code-block:: bash
$ pre-commit install
so that our hooks will run and update your changes on every commit.
If you do not want to, or are unable to, configure ``pre-commit`` on your machine, note
that after opening a pull request, it will still run as a static checker as part of our CI.
Some hooks also come with auto-fixing capabilities, which you can trigger manually in a
PR by commenting ``pre-commit.ci autofix`` (see `pre-commit.ci <https://pre-commit.ci/>`_).
We use a combination of `black `_,
`ruff `_ and `cython-lint
`_. See ``.pre-commit-config.yaml``
and ``pyproject.toml`` for the complete configuration details.
Note that formatters should not be run directly on the command line; for instance, do not run:
.. code-block:: bash
$ black yt
Instead, run them through ``pre-commit``:
.. code-block:: bash
$ pre-commit run black --all-files
The reason is that you may have a different version of ``black`` installed, which can
produce different results, while the version installed with pre-commit is guaranteed
to be in sync with the rest of the contributors.
Below is a list of additional guidelines for coding in yt that are not automatically
enforced.
Source code style guide
-----------------------
* In general, follow PEP-8 guidelines.
https://www.python.org/dev/peps/pep-0008/
* Classes are ``ConjoinedCapitals``, methods and functions are
``lowercase_with_underscores``.
* Do not use nested classes unless you have a very good reason to, such as
requiring a namespace or class-definition modification. Classes should live
at the top level. ``__metaclass__`` is exempt from this.
* Avoid copying memory when possible. For example, don't do
``a = a.reshape(3, 4)`` when ``a.shape = (3, 4)`` will do, and ``a = a * 3``
should be ``np.multiply(a, 3, a)`` (see the snippet after this list).
* In general, avoid all double-underscore method names: ``__something`` is
usually unnecessary.
* When writing a subclass, use the ``super()`` built-in to access the parent
class, rather than referring to it explicitly.
Ex: ``super().__init__()`` rather than ``SpecialGrid.__init__()``.
* Docstrings should describe input, output, behavior, and any state changes
that occur on an object. See :ref:`docstrings` below for a fiducial example
of a docstring.
* Unless there is a good reason not to (e.g., to avoid circular imports),
imports should happen at the top of the file.
* If you are comparing with a numpy boolean array, just refer to the array.
Ex: do ``np.all(array)`` instead of ``np.all(array == True)``.
* Only declare local variables if they will be used later. If you do not use the
return value of a function, do not store it in a variable.
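To illustrate several of the rules above in one place, here is a short snippet
(the array and class names are invented for illustration):

.. code-block:: python

   import numpy as np

   a = np.arange(1.0, 13.0)

   # Reshape in place rather than allocating with ``a = a.reshape(3, 4)``.
   a.shape = (3, 4)

   # Multiply in place rather than creating a temporary with ``a = a * 3``.
   np.multiply(a, 3, a)

   # Refer to a boolean array directly instead of comparing ``== True``.
   mask = a > 0
   assert np.all(mask)


   class Grid:
       def __init__(self):
           self.children = []


   class SpecialGrid(Grid):
       def __init__(self):
           # Use ``super()`` rather than naming the parent class explicitly.
           super().__init__()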
API Style Guide
---------------
* Internally, only import from source files directly -- instead of:
``from yt.visualization.api import ProjectionPlot``
do:
``from yt.visualization.plot_window import ProjectionPlot``
* Import symbols from the module where they are defined; avoid transitive
imports (see the snippet after this list).
* Import standard library modules, functions, and classes directly from the
standard library; do not import them from other yt files.
* Numpy is to be imported as ``np``.
* Do not use too many keyword arguments. If you have a lot of keyword
arguments, then you are doing too much in ``__init__`` and not enough via
parameter setting.
* Don't create a new class to replicate the functionality of an old class --
replace the old class. Too many options make for a confusing user
experience.
* Parameter files external to yt are a last resort.
* The usage of the ``**kwargs`` construction should be avoided. If they cannot
be avoided, they must be documented, even if they are only to be passed on to
a nested function.
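A minimal illustration of the import conventions above
(``yt.visualization.plot_window`` is the module where ``ProjectionPlot`` is
defined in the yt 4.x source tree):

.. code-block:: python

   # Import from the defining module, not a transitive re-export.
   from yt.visualization.plot_window import ProjectionPlot

   # Take standard library symbols straight from the standard library.
   from collections import defaultdict

   # NumPy is always imported as ``np``.
   import numpy as np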
.. _docstrings:
Docstrings
----------
The following is an example docstring. You can use it as a template for
docstrings in your code and as a guide for how we expect docstrings to look and
the level of detail we are looking for. Note that we use NumPy style docstrings
written in `Sphinx restructured text format
`_.
.. code-block:: rest
r"""A one-line summary that does not use variable names or the
function name.
Several sentences providing an extended description. Refer to
variables using back-ticks, e.g. ``var``.
Parameters
----------
var1 : array_like
Array_like means all those objects -- lists, nested lists, etc. --
that can be converted to an array. We can also refer to
variables like ``var1``.
var2 : int
The type above can either refer to an actual Python type
(e.g. ``int``), or describe the type of the variable in more
detail, e.g. ``(N,) ndarray`` or ``array_like``.
Long_variable_name : {'hi', 'ho'}, optional
Choices in brackets, default first when optional.
Returns
-------
describe : type
Explanation
output : type
Explanation
tuple : type
Explanation
items : type
even more explaining
Other Parameters
----------------
only_seldom_used_keywords : type
Explanation
common_parameters_listed_above : type
Explanation
Raises
------
BadException
Because you shouldn't have done that.
See Also
--------
otherfunc : relationship (optional)
newfunc : Relationship (optional), which could be fairly long, in which
case the line wraps here.
thirdfunc, fourthfunc, fifthfunc
Notes
-----
Notes about the implementation algorithm (if needed).
This can have multiple paragraphs.
You may include some math:
.. math:: X(e^{j\omega}) = x(n)e^{-j\omega n}
And even use a greek symbol like :math:`\omega` inline.
References
----------
Cite the relevant literature, e.g. [1]_. You may also cite these
references in the notes section above.
.. [1] O. McNoleg, "The integration of GIS, remote sensing,
expert systems and adaptive co-kriging for environmental habitat
modelling of the Highland Haggis using object-oriented, fuzzy-logic
and neural-network techniques," Computers & Geosciences, vol. 22,
pp. 585-588, 1996.
Examples
--------
These are written in doctest format, and should illustrate how to
use the function. Use the variables 'ds' for the dataset, 'pc' for
a plot collection, 'c' for a center, and 'L' for a vector.
>>> a = [1, 2, 3]
>>> print([x + 3 for x in a])
[4, 5, 6]
>>> print("a\n\nb")
a
b
"""
Variable Names and Enzo-isms
----------------------------
Avoid Enzo-isms. This includes but is not limited to:
* Hard-coding parameter names that are the same as those in Enzo. The
following translation table should be of some help. Note that the
parameters are now properties on a ``Dataset`` subclass: you access them
like ``ds.refine_by``.
- ``RefineBy`` => ``refine_by``
- ``TopGridRank`` => ``dimensionality``
- ``TopGridDimensions`` => ``domain_dimensions``
- ``InitialTime`` => ``current_time``
- ``DomainLeftEdge`` => ``domain_left_edge``
- ``DomainRightEdge`` => ``domain_right_edge``
- ``CurrentTimeIdentifier`` => ``unique_identifier``
- ``CosmologyCurrentRedshift`` => ``current_redshift``
- ``ComovingCoordinates`` => ``cosmological_simulation``
- ``CosmologyOmegaMatterNow`` => ``omega_matter``
- ``CosmologyOmegaLambdaNow`` => ``omega_lambda``
- ``CosmologyHubbleConstantNow`` => ``hubble_constant``
* Do not assume that the domain runs from 0 .. 1. This is not true
everywhere; query the dataset's actual edges (see the sketch after this list).
* Variable names should be short but descriptive.
* No globals!
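As a sketch of the domain rule above (``IsolatedGalaxy`` is one of yt's sample
datasets; ``yt.load_sample`` fetches it and requires the optional ``pooch``
dependency):

.. code-block:: python

   import yt

   ds = yt.load_sample("IsolatedGalaxy")

   # Derive geometry from the dataset rather than hard-coding 0..1 values.
   center = ds.domain_center          # not [0.5, 0.5, 0.5]
   width = ds.domain_width.min() / 4  # not 0.25
   yt.SlicePlot(ds, "z", ("gas", "density"), center=center, width=width).save()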
yt-4.4.0/COPYING.txt

===============================
The yt project licensing terms
===============================
yt is licensed under the terms of the Modified BSD License (also known as New
or Revised BSD), as follows:
Copyright (c) 2013-, yt Development Team
Copyright (c) 2006-2013, Matthew Turk
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of the yt Development Team nor the names of its
contributors may be used to endorse or promote products derived from this
software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
About the yt Development Team
-----------------------------
Matthew Turk began yt in 2006 and remains the project lead. Over time yt has
grown to include contributions from a large number of individuals from many
diverse institutions and scientific and technical backgrounds.
Until the fall of 2013, yt was licensed under the GPLv3. However, with consent
from all developers and on a public mailing list, yt has been relicensed under
the BSD 3-clause under a shared copyright model. For more information, see:
https://mail.python.org/archives/list/yt-dev@python.org/thread/G4DJDDGB4PSZFJVPWRSHNOSUMTISXC4X/
All versions of yt prior to this licensing change are available under the
GPLv3; all subsequent versions are available under the BSD 3-clause license.
The yt Development Team is the set of all contributors to the yt project. This
includes all of the yt subprojects.
The core team that coordinates development on GitHub can be found here:
https://github.com/yt-project/
Our Copyright Policy
--------------------
yt uses a shared copyright model. Each contributor maintains copyright
over their contributions to yt. But, it is important to note that these
contributions are typically only changes to the repositories. Thus, the yt
source code, in its entirety, is not the copyright of any single person or
institution. Instead, it is the collective copyright of the entire yt
Development Team. If individual contributors want to maintain a record of what
changes/contributions they have specific copyright on, they should indicate
their copyright in the commit message of the change, when they commit the
change to one of the yt repositories.
yt-4.4.0/CREDITS

yt is a group effort.
Contributors:
Tom Abel (tabel@slac.stanford.edu)
Gabriel Altay (gabriel.altay@gmail.com)
Kenza Arraki (karraki@nmsu.edu)
Kirk Barrow (kssbarrow@gatech.edu)
Ricarda Beckmann (ricarda.beckmann@astro.ox.ac.uk)
Christoph Behrens (cbehren2@gwdu101.global.gwdg.cluster)
Elliott Biondo (biondo@wisc.edu)
Alex Bogert (fbogert@ucsc.edu)
Josh Borrow (joshua.borrow@durham.ac.uk)
Robert Bradshaw (robertwb@gmail.com)
André-Patrick Bubel (code@andre-bubel.de)
Corentin Cadiou (corentin.cadiou@iap.fr)
Pengfei Chen (madcpf@gmail.com)
Yi-Hao Chen (ychen@astro.wisc.edu)
Yi-Hao Chen (yihaochentw@gmail.com)
Salvatore Cielo (cielo@iap.fr)
David Collins (dcollins4096@gmail.com)
Marianne Corvellec (marianne.corvellec@ens-lyon.org)
Jared Coughlin (jcoughl2@nd.edu)
Brian Crosby (bcrosby.bd@gmail.com)
Weiguang Cui (weiguang.cui@uwa.edu.au)
Andrew Cunningham (ajcunn@gmail.com)
Bili Dong (qobilidop@gmail.com)
Donald E Willcox (eugene.willcox@gmail.com)
Nicholas Earl (nchlsearl@gmail.com)
Hilary Egan (hilaryye@gmail.com)
Daniel Fenn (df11c@my.fsu.edu)
John Forbes (jcforbes@ucsc.edu)
Enrico Garaldi (egaraldi@uni-bonn.de)
Sam Geen (samgeen@gmail.com)
Austin Gilbert (agilbert39@gatech.edu)
Adam Ginsburg (keflavich@gmail.com)
Nick Gnedin (ngnedin@gmail.com)
Nathan Goldbaum (ngoldbau@illinois.edu)
William Gray (graywilliamj@gmail.com)
Philipp Grete (mail@pgrete.de)
Max Gronke (max.groenke@gmail.com)
Markus Haider (markus.haider@uibk.ac.at)
Eric Hallman (hallman13@gmail.com)
David Hannasch (David.A.Hannasch@gmail.com)
Stephanie Ho (stephaniehkho@gmail.com)
Axel Huebl (a.huebl@hzdr.de)
Cameron Hummels (chummels@gmail.com)
Suoqing Ji (jisuoqing@gmail.com)
Allyson Julian (astrohckr@gmail.com)
Anni Järvenpää (anni.jarvenpaa@gmail.com)
Christian Karch (chiffre@posteo.de)
Max Katz (maximilian.katz@stonybrook.edu)
BW Keller (kellerbw@mcmaster.ca)
Ashley Kelly (a.j.kelly@durham.ac.uk)
Chang-Goo Kim (changgoo@princeton.edu)
Ji-hoon Kim (me@jihoonkim.org)
Steffen Klemer (sklemer@phys.uni-goettingen.de)
Fabian Koller (anokfireball@posteo.de)
Claire Kopenhafer (clairekope@gmail.com)
Kacper Kowalik (xarthisius.kk@gmail.com)
Matthew Krafczyk (krafczyk.matthew@gmail.com)
Mark Krumholz (mkrumhol@ucsc.edu)
Michael Kuhlen (mqk@astro.berkeley.edu)
Avik Laha (al3510@moose.cc.columbia.edu)
Meagan Lang (langmm.astro@gmail.com)
Erwin Lau (ethlau@gmail.com)
Doris Lee (dorislee@berkeley.edu)
Eve Lee (elee@cita.utoronto.ca)
Sam Leitner (sam.leitner@gmail.com)
Yuan Li (yuan@astro.columbia.edu)
Alex Lindsay (al007@illinois.edu)
Yingchao Lu (yingchao.lu@gmail.com)
Yinghe Lu (yinghelu@lbl.gov)
Chris Malone (chris.m.malone@gmail.com)
John McCann (mccann@ucsb.edu)
Jonah Miller (jonah.maxwell.miller@gmail.com)
Joshua Moloney (joshua.moloney@colorado.edu)
Christopher Moody (cemoody@ucsc.edu)
Chris Moody (juxtaposicion@gmail.com)
Stuart Mumford (stuart@mumford.me.uk)
Madicken Munk (madicken.munk@gmail.com)
Andrew Myers (atmyers2@gmail.com)
Jill Naiman (jnaiman@cfa.harvard.edu)
Desika Narayanan (dnarayan@haverford.edu)
Kaylea Nelson (kaylea.nelson@yale.edu)
Brian O'Shea (oshea@msu.edu)
J.S. Oishi (jsoishi@gmail.com)
JC Passy (jcpassy@uvic.ca)
Hugo Pfister (pfister@loginhz02.iap.fr)
David Pérez-Suárez (dps.helio@gmail.com)
John Regan (john.regan@helsinki.fi)
Mark Richardson (Mark.Richardson.Work@gmail.com)
Sherwood Richers (srichers@tapir.caltech.edu)
Thomas Robitaille (thomas.robitaille@gmail.com)
Anna Rosen (rosen@ucolick.org)
Chuck Rozhon (rozhon2@illinois.edu)
Douglas Rudd (drudd@uchicago.edu)
Rafael Ruggiero (rafael.ruggiero@usp.br)
Hsi-Yu Schive (hyschive@gmail.com)
Anthony Scopatz (scopatz@gmail.com)
Noel Scudder (noel.scudder@stonybrook.edu)
Patrick Shriwise (shriwise@wisc.edu)
Devin Silvia (devin.silvia@gmail.com)
Abhishek Singh (abhisheksing@umass.edu)
Sam Skillman (samskillman@gmail.com)
Stephen Skory (s@skory.us)
Joseph Smidt (josephsmidt@gmail.com)
Aaron Smith (asmith@astro.as.utexas.edu)
Britton Smith (brittonsmith@gmail.com)
Geoffrey So (gsiisg@gmail.com)
Josh Soref (jsoref@users.noreply.github.com)
Antoine Strugarek (antoine.strugarek@cea.fr)
Elizabeth Tasker (tasker@astro1.sci.hokudai.ac.jp)
Ben Thompson (bthompson2090@gmail.com)
Benjamin Thompson (bthompson2090@gmail.com)
Robert Thompson (rthompsonj@gmail.com)
Joseph Tomlinson (jmtomlinson95@gmail.com)
Stephanie Tonnesen (stonnes@gmail.com)
Matthew Turk (matthewturk@gmail.com)
Miguel de Val-Borro (miguel.deval@gmail.com)
Kausik Venkat (kvenkat2@illinois.edu)
Casey W. Stark (caseywstark@gmail.com)
Rick Wagner (rwagner@physics.ucsd.edu)
Mike Warren (mswarren@gmail.com)
Charlie Watson (charlie.watson95@gmail.com)
Andrew Wetzel (andrew.wetzel@yale.edu)
John Wise (jwise@physics.gatech.edu)
Michael Zingale (michael.zingale@stonybrook.edu)
John ZuHone (jzuhone@gmail.com)
The PasteBin interface code (as well as the PasteBin itself)
was written by the Pocoo collective (pocoo.org).

The RamsesRead++ library used for snapshot reading was
developed by Oliver Hahn.

Thanks to everyone for all your contributions!
yt-4.4.0/MANIFEST.in

include README* CREDITS COPYING.txt CITATION setupext.py CONTRIBUTING.rst
include yt/visualization/mapserver/html/map.js
include yt/visualization/mapserver/html/map_index.html
include yt/visualization/mapserver/html/Leaflet.Coordinates-0.1.5.css
include yt/visualization/mapserver/html/Leaflet.Coordinates-0.1.5.src.js
include yt/utilities/tests/cosmology_answers.yml
include yt/utilities/mesh_types.yaml
exclude yt/utilities/lib/cykdtree/c_kdtree.cpp
prune tests
prune answer-store
recursive-include yt *.py *.pyx *.pxi *.pxd README* *.txt LICENSE* *.cu
recursive-include doc *.rst *.txt *.py *.ipynb *.png *.jpg *.css *.html
recursive-include doc *.h *.c *.sh *.svgz *.pdf *.svg *.pyx
# start with excluding all C/C++ files
recursive-exclude yt *.h *.c *.hpp *.cpp
# then include back every non-generated C/C++ source file
# the list can be generated by the following command
# git ls-files | grep -E '\.(h|c)(pp)?$'
include yt/frontends/artio/artio_headers/artio.c
include yt/frontends/artio/artio_headers/artio.h
include yt/frontends/artio/artio_headers/artio_endian.c
include yt/frontends/artio/artio_headers/artio_endian.h
include yt/frontends/artio/artio_headers/artio_file.c
include yt/frontends/artio/artio_headers/artio_grid.c
include yt/frontends/artio/artio_headers/artio_internal.h
include yt/frontends/artio/artio_headers/artio_mpi.c
include yt/frontends/artio/artio_headers/artio_mpi.h
include yt/frontends/artio/artio_headers/artio_parameter.c
include yt/frontends/artio/artio_headers/artio_particle.c
include yt/frontends/artio/artio_headers/artio_posix.c
include yt/frontends/artio/artio_headers/artio_selector.c
include yt/frontends/artio/artio_headers/artio_sfc.c
include yt/frontends/artio/artio_headers/cosmology.c
include yt/frontends/artio/artio_headers/cosmology.h
include yt/geometry/vectorized_ops.h
include yt/utilities/lib/_octree_raytracing.hpp
include yt/utilities/lib/cykdtree/c_kdtree.cpp
include yt/utilities/lib/cykdtree/c_kdtree.hpp
include yt/utilities/lib/cykdtree/c_utils.cpp
include yt/utilities/lib/cykdtree/c_utils.hpp
include yt/utilities/lib/cykdtree/windows/stdint.h
include yt/utilities/lib/endian_swap.h
include yt/utilities/lib/fixed_interpolator.cpp
include yt/utilities/lib/fixed_interpolator.hpp
include yt/utilities/lib/marching_cubes.h
include yt/utilities/lib/mesh_triangulation.h
include yt/utilities/lib/origami_tags.c
include yt/utilities/lib/origami_tags.h
include yt/utilities/lib/pixelization_constants.cpp
include yt/utilities/lib/pixelization_constants.hpp
include yt/utilities/lib/platform_dep.h
include yt/utilities/lib/platform_dep_math.hpp
include yt/utilities/lib/tsearch.c
include yt/utilities/lib/tsearch.h
include doc/README doc/activate doc/activate.csh doc/cheatsheet.tex
exclude doc/cheatsheet.pdf
include doc/extensions/README doc/Makefile
prune doc/source/reference/api/generated
prune doc/build
prune .tours
recursive-include yt/visualization/volume_rendering/shaders *.fragmentshader *.vertexshader
include yt/sample_data_registry.json
include conftest.py
include yt/py.typed
include yt/default.mplstyle
prune yt/frontends/_skeleton
recursive-include yt/frontends/amrvac *.par
recursive-exclude requirements *.txt
exclude minimal_requirements.txt
exclude .codecov.yml .coveragerc .git-blame-ignore-revs .gitmodules .hgchurn .mailmap
exclude .pre-commit-config.yaml clean.sh
exclude nose_answer.cfg nose_unit.cfg nose_ignores.txt nose_requirements.txt
yt-4.4.0/PKG-INFO

Metadata-Version: 2.1
Name: yt
Version: 4.4.0
Summary: An analysis and visualization toolkit for volumetric data
Author-email: The yt project
License: BSD 3-Clause
Project-URL: Homepage, https://yt-project.org/
Project-URL: Documentation, https://yt-project.org/doc/
Project-URL: Source, https://github.com/yt-project/yt/
Project-URL: Tracker, https://github.com/yt-project/yt/issues
Keywords: astronomy astrophysics visualization amr adaptivemeshrefinement
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: Console
Classifier: Framework :: Matplotlib
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: MacOS :: MacOS X
Classifier: Operating System :: POSIX :: AIX
Classifier: Operating System :: POSIX :: Linux
Classifier: Programming Language :: C
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Scientific/Engineering :: Astronomy
Classifier: Topic :: Scientific/Engineering :: Physics
Classifier: Topic :: Scientific/Engineering :: Visualization
Requires-Python: >=3.10.3
Description-Content-Type: text/markdown
License-File: COPYING.txt
Requires-Dist: cmyt>=1.1.2
Requires-Dist: ewah-bool-utils>=1.2.0
Requires-Dist: matplotlib>=3.5
Requires-Dist: more-itertools>=8.4
Requires-Dist: numpy<3,>=1.21.3
Requires-Dist: numpy!=2.0.1; platform_machine == "arm64" and platform_system == "Darwin"
Requires-Dist: packaging>=20.9
Requires-Dist: pillow>=8.3.2
Requires-Dist: tomli-w>=0.4.0
Requires-Dist: tqdm>=3.4.0
Requires-Dist: unyt>=2.9.2
Requires-Dist: tomli>=1.2.3; python_version < "3.11"
Requires-Dist: typing-extensions>=4.4.0; python_version < "3.12"
Provides-Extra: hdf5
Requires-Dist: h5py!=3.12.0,>=3.1.0; platform_system == "Windows" and extra == "hdf5"
Provides-Extra: netcdf4
Requires-Dist: netCDF4!=1.6.1,>=1.5.3; extra == "netcdf4"
Provides-Extra: fortran
Requires-Dist: f90nml>=1.1; extra == "fortran"
Provides-Extra: adaptahop
Provides-Extra: ahf
Provides-Extra: amrex
Provides-Extra: amrvac
Requires-Dist: yt[Fortran]; extra == "amrvac"
Provides-Extra: art
Provides-Extra: arepo
Requires-Dist: yt[HDF5]; extra == "arepo"
Provides-Extra: artio
Provides-Extra: athena
Provides-Extra: athena-pp
Provides-Extra: boxlib
Provides-Extra: cf-radial
Requires-Dist: xarray>=0.16.1; extra == "cf-radial"
Requires-Dist: arm-pyart>=1.19.2; extra == "cf-radial"
Provides-Extra: chimera
Requires-Dist: yt[HDF5]; extra == "chimera"
Provides-Extra: chombo
Requires-Dist: yt[HDF5]; extra == "chombo"
Provides-Extra: cholla
Requires-Dist: yt[HDF5]; extra == "cholla"
Provides-Extra: eagle
Requires-Dist: yt[HDF5]; extra == "eagle"
Provides-Extra: enzo-e
Requires-Dist: yt[HDF5]; extra == "enzo-e"
Requires-Dist: libconf>=1.0.1; extra == "enzo-e"
Provides-Extra: enzo
Requires-Dist: yt[HDF5]; extra == "enzo"
Requires-Dist: libconf>=1.0.1; extra == "enzo"
Provides-Extra: exodus-ii
Requires-Dist: yt[netCDF4]; extra == "exodus-ii"
Provides-Extra: fits
Requires-Dist: astropy>=4.0.1; extra == "fits"
Requires-Dist: regions>=0.7; extra == "fits"
Provides-Extra: flash
Requires-Dist: yt[HDF5]; extra == "flash"
Provides-Extra: gadget
Requires-Dist: yt[HDF5]; extra == "gadget"
Provides-Extra: gadget-fof
Requires-Dist: yt[HDF5]; extra == "gadget-fof"
Provides-Extra: gamer
Requires-Dist: yt[HDF5]; extra == "gamer"
Provides-Extra: gdf
Requires-Dist: yt[HDF5]; extra == "gdf"
Provides-Extra: gizmo
Requires-Dist: yt[HDF5]; extra == "gizmo"
Provides-Extra: halo-catalog
Requires-Dist: yt[HDF5]; extra == "halo-catalog"
Provides-Extra: http-stream
Requires-Dist: requests>=2.20.0; extra == "http-stream"
Provides-Extra: idefix
Requires-Dist: yt_idefix[HDF5]>=2.3.0; extra == "idefix"
Provides-Extra: moab
Requires-Dist: yt[HDF5]; extra == "moab"
Provides-Extra: nc4-cm1
Requires-Dist: yt[netCDF4]; extra == "nc4-cm1"
Provides-Extra: open-pmd
Requires-Dist: yt[HDF5]; extra == "open-pmd"
Provides-Extra: owls
Requires-Dist: yt[HDF5]; extra == "owls"
Provides-Extra: owls-subfind
Requires-Dist: yt[HDF5]; extra == "owls-subfind"
Provides-Extra: parthenon
Requires-Dist: yt[HDF5]; extra == "parthenon"
Provides-Extra: ramses
Requires-Dist: yt[Fortran]; extra == "ramses"
Requires-Dist: scipy; extra == "ramses"
Provides-Extra: rockstar
Provides-Extra: sdf
Requires-Dist: requests>=2.20.0; extra == "sdf"
Provides-Extra: stream
Provides-Extra: swift
Requires-Dist: yt[HDF5]; extra == "swift"
Provides-Extra: tipsy
Provides-Extra: ytdata
Requires-Dist: yt[HDF5]; extra == "ytdata"
Provides-Extra: full
Requires-Dist: cartopy>=0.22.0; extra == "full"
Requires-Dist: firefly>=3.2.0; extra == "full"
Requires-Dist: glueviz>=0.13.3; extra == "full"
Requires-Dist: ipython>=7.16.2; extra == "full"
Requires-Dist: ipywidgets>=8.0.0; extra == "full"
Requires-Dist: miniballcpp>=0.2.1; extra == "full"
Requires-Dist: mpi4py>=3.0.3; extra == "full"
Requires-Dist: pandas>=1.1.2; extra == "full"
Requires-Dist: pooch>=0.7.0; extra == "full"
Requires-Dist: pyaml>=17.10.0; extra == "full"
Requires-Dist: pykdtree>=1.3.1; extra == "full"
Requires-Dist: pyx>=0.15; extra == "full"
Requires-Dist: scipy>=1.5.0; extra == "full"
Requires-Dist: glue-core!=1.2.4; python_version >= "3.10" and extra == "full"
Requires-Dist: ratarmount~=0.8.1; (platform_system != "Windows" and platform_system != "Darwin") and extra == "full"
Requires-Dist: yt[adaptahop]; extra == "full"
Requires-Dist: yt[ahf]; extra == "full"
Requires-Dist: yt[amrex]; extra == "full"
Requires-Dist: yt[amrvac]; extra == "full"
Requires-Dist: yt[art]; extra == "full"
Requires-Dist: yt[arepo]; extra == "full"
Requires-Dist: yt[artio]; extra == "full"
Requires-Dist: yt[athena]; extra == "full"
Requires-Dist: yt[athena_pp]; extra == "full"
Requires-Dist: yt[boxlib]; extra == "full"
Requires-Dist: yt[cf_radial]; extra == "full"
Requires-Dist: yt[chimera]; extra == "full"
Requires-Dist: yt[chombo]; extra == "full"
Requires-Dist: yt[cholla]; extra == "full"
Requires-Dist: yt[eagle]; extra == "full"
Requires-Dist: yt[enzo_e]; extra == "full"
Requires-Dist: yt[enzo]; extra == "full"
Requires-Dist: yt[exodus_ii]; extra == "full"
Requires-Dist: yt[fits]; extra == "full"
Requires-Dist: yt[flash]; extra == "full"
Requires-Dist: yt[gadget]; extra == "full"
Requires-Dist: yt[gadget_fof]; extra == "full"
Requires-Dist: yt[gamer]; extra == "full"
Requires-Dist: yt[gdf]; extra == "full"
Requires-Dist: yt[gizmo]; extra == "full"
Requires-Dist: yt[halo_catalog]; extra == "full"
Requires-Dist: yt[http_stream]; extra == "full"
Requires-Dist: yt[idefix]; extra == "full"
Requires-Dist: yt[moab]; extra == "full"
Requires-Dist: yt[nc4_cm1]; extra == "full"
Requires-Dist: yt[open_pmd]; extra == "full"
Requires-Dist: yt[owls]; extra == "full"
Requires-Dist: yt[owls_subfind]; extra == "full"
Requires-Dist: yt[parthenon]; extra == "full"
Requires-Dist: yt[ramses]; extra == "full"
Requires-Dist: yt[rockstar]; extra == "full"
Requires-Dist: yt[sdf]; extra == "full"
Requires-Dist: yt[stream]; extra == "full"
Requires-Dist: yt[swift]; extra == "full"
Requires-Dist: yt[tipsy]; extra == "full"
Requires-Dist: yt[ytdata]; extra == "full"
Provides-Extra: mapserver
Requires-Dist: bottle; extra == "mapserver"
Provides-Extra: test
Requires-Dist: pyaml>=17.10.0; extra == "test"
Requires-Dist: pytest>=6.1; extra == "test"
Requires-Dist: pytest-mpl>=0.16.1; extra == "test"
Requires-Dist: sympy!=1.10,!=1.9; extra == "test"
Requires-Dist: imageio!=2.35.0; extra == "test"
# The yt Project
[PyPI](https://pypi.org/project/yt)
[Supported Python versions](https://pypi.org/project/yt/)
[Documentation](http://yt-project.org/docs/dev/)
[yt-users mailing list](https://mail.python.org/archives/list/yt-users@python.org/)
[yt-dev mailing list](https://mail.python.org/archives/list/yt-dev@python.org/)
[yt Hub](https://hub.yt/)
[NumFOCUS](http://numfocus.org)
[Donate to yt](https://numfocus.org/donate-to-yt)
[Build and test](https://github.com/yt-project/yt/actions/workflows/build-test.yaml)
[Bleeding-edge CI](https://github.com/yt-project/yt/actions/workflows/bleeding-edge.yaml)
[pre-commit.ci status](https://results.pre-commit.ci/latest/github/yt-project/yt/main)
[Ruff](https://github.com/charliermarsh/ruff)
yt is an open-source, permissively-licensed Python library for analyzing and
visualizing volumetric data.
yt supports structured, variable-resolution meshes, unstructured meshes, and
discrete or sampled data such as particles. Focused on driving
physically-meaningful inquiry, yt has been applied in domains such as
astrophysics, seismology, nuclear engineering, molecular dynamics, and
oceanography. yt is built by a friendly community of users and developers who
want to make it easy to use and easy to develop; we'd love it if you got involved!
We've written a [method
paper](https://ui.adsabs.harvard.edu/abs/2011ApJS..192....9T) you may be interested
in; if you use yt in the preparation of a publication, please consider citing
it.
## Code of Conduct
yt abides by a code of conduct, partially adapted from the PSF code of conduct,
which can be found [in our contributing
guide](http://yt-project.org/docs/dev/developing/developing.html#yt-community-code-of-conduct).
## Installation
You can install the most recent stable version of yt either with conda from
[conda-forge](https://conda-forge.org/):
```shell
conda install -c conda-forge yt
```
or with pip:
```shell
python -m pip install yt
```
More information on the various ways to install yt, and in particular to install from source,
can be found on [the project's website](https://yt-project.org/docs/dev/installing.html).
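yt also defines optional extras for frontend-specific dependencies (they are
listed in the package metadata above). As a minimal sketch, installing
everything and sanity-checking the result could look like:
```shell
python -m pip install "yt[full]"
python -c "import yt; print(yt.__version__)"
```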
## Getting Started
yt is designed to provide meaningful analysis of data. We have some Quickstart
example notebooks in the repository:
* [Introduction](https://github.com/yt-project/yt/tree/main/doc/source/quickstart/1\)_Introduction.ipynb)
* [Data Inspection](https://github.com/yt-project/yt/tree/main/doc/source/quickstart/2\)_Data_Inspection.ipynb)
* [Simple Visualization](https://github.com/yt-project/yt/tree/main/doc/source/quickstart/3\)_Simple_Visualization.ipynb)
* [Data Objects and Time Series](https://github.com/yt-project/yt/tree/main/doc/source/quickstart/4\)_Data_Objects_and_Time_Series.ipynb)
* [Derived Fields and Profiles](https://github.com/yt-project/yt/tree/main/doc/source/quickstart/5\)_Derived_Fields_and_Profiles.ipynb)
* [Volume Rendering](https://github.com/yt-project/yt/tree/main/doc/source/quickstart/6\)_Volume_Rendering.ipynb)
If you'd like to try these online, you can visit our [yt Hub](https://hub.yt/)
and run a notebook next to some of our example data.
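If you'd rather start from a plain script than a notebook, here is a minimal
sketch (it assumes the optional `pooch` dependency, which `yt.load_sample`
uses to fetch a public sample dataset over the network):
```python
import yt

# Download (once) and load a public sample dataset.
ds = yt.load_sample("IsolatedGalaxy")

# Make a density slice along the z-axis and write it to a PNG.
slc = yt.SlicePlot(ds, "z", ("gas", "density"))
slc.save()
```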
## Contributing
We love contributions! yt is open source, built on open source, and we'd love
to have you hang out in our community.
We have developed some [guidelines](CONTRIBUTING.rst) for contributing to yt.
**Imposter syndrome disclaimer**: We want your help. No, really.
There may be a little voice inside your head that is telling you that you're not
ready to be an open source contributor; that your skills aren't nearly good
enough to contribute. What could you possibly offer a project like this one?
We assure you - the little voice in your head is wrong. If you can write code at
all, you can contribute code to open source. Contributing to open source
projects is a fantastic way to advance one's coding skills. Writing perfect code
isn't the measure of a good developer (that would disqualify all of us!); it's
trying to create something, making mistakes, and learning from those
mistakes. That's how we all improve, and we are happy to help others learn.
Being an open source contributor doesn't just mean writing code, either. You can
help out by writing documentation, tests, or even giving feedback about the
project (and yes - that includes giving feedback about the contribution
process). Some of these contributions may be the most valuable to the project as
a whole, because you're coming to the project with fresh eyes, so you can see
the errors and assumptions that seasoned contributors have glossed over.
(This disclaimer was originally written by
[Adrienne Lowe](https://github.com/adriennefriend) for a
[PyCon talk](https://www.youtube.com/watch?v=6Uj746j9Heo), and was adapted by yt
based on its use in the README file for the
[MetPy project](https://github.com/Unidata/MetPy))
## Resources
We have some community and documentation resources available.
* Our latest documentation is always at http://yt-project.org/docs/dev/ and it
includes recipes, tutorials, and API documentation
* The [discussion mailing
list](https://mail.python.org/archives/list/yt-users@python.org//)
should be your first stop for general questions
* The [development mailing
list](https://mail.python.org/archives/list/yt-dev@python.org//) is
better suited for more development issues
* You can also join us on Slack at yt-project.slack.com ([request an
invite](https://yt-project.org/slack.html))
Is your code compatible with yt? Great! Please consider giving us a shout-out by adding a shiny badge to your README:
- markdown
```markdown
[![yt-project](https://img.shields.io/static/v1?label="works%20with"&message="yt"&color="blueviolet")](https://yt-project.org)
```
- rst
```reStructuredText
|yt-project|
.. |yt-project| image:: https://img.shields.io/static/v1?label="works%20with"&message="yt"&color="blueviolet"
:target: https://yt-project.org
```
## Powered by NumFOCUS
yt is a fiscally sponsored project of [NumFOCUS](https://numfocus.org/).
If you're interested in
supporting the active maintenance and development of this project, consider
[donating to the project](https://numfocus.salsalabs.org/donate-to-yt/index.html).
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/README.md 0000644 0001751 0000177 00000016300 14714401662 013044 0 ustar 00runner docker
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/conftest.py 0000644 0001751 0000177 00000041716 14714401662 013775 0 ustar 00runner docker import os
import shutil
import sys
import tempfile
from importlib.metadata import version
from importlib.util import find_spec
from pathlib import Path
import pytest
import yaml
from packaging.version import Version
from yt.config import ytcfg
from yt.utilities.answer_testing.testing_utilities import (
_compare_raw_arrays,
_hash_results,
_save_raw_arrays,
_save_result,
_streamline_for_io,
data_dir_load,
)
MPL_VERSION = Version(version("matplotlib"))
NUMPY_VERSION = Version(version("numpy"))
PILLOW_VERSION = Version(version("pillow"))
# setuptools is no longer installed by default starting in Python 3.12, so we
# need to be resilient if it's not available at runtime
if find_spec("setuptools") is not None:
SETUPTOOLS_VERSION = Version(version("setuptools"))
else:
SETUPTOOLS_VERSION = None
if find_spec("pandas") is not None:
PANDAS_VERSION = Version(version("pandas"))
else:
PANDAS_VERSION = None
def pytest_addoption(parser):
"""
Lets options be passed to test functions.
"""
parser.addoption(
"--with-answer-testing",
action="store_true",
)
parser.addoption(
"--answer-store",
action="store_true",
)
parser.addoption(
"--answer-raw-arrays",
action="store_true",
)
parser.addoption(
"--raw-answer-store",
action="store_true",
)
parser.addoption(
"--force-overwrite",
action="store_true",
)
parser.addoption(
"--no-hash",
action="store_true",
)
# Registered here because pytest_collection_modifyitems and the skip
# message below both reference it
parser.addoption(
"--answer-big-data",
action="store_true",
)
parser.addoption("--local-dir", default=None, help="Where answers are saved.")
# Tell pytest about the local-dir option in the ini files. This
# option is used for creating the answer directory on CI
parser.addini(
"local-dir",
default=str(Path(__file__).parent / "answer-store"),
help="answer directory.",
)
parser.addini(
"test_data_dir",
default=ytcfg.get("yt", "test_data_dir"),
help="Directory where data for tests is stored.",
)
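# Typical invocations combining the options above (a sketch; the answer
# directory path is hypothetical):
#   pytest --with-answer-testing --answer-store --local-dir=~/answers   # generate
#   pytest --with-answer-testing --local-dir=~/answers                  # compare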
def pytest_configure(config):
r"""
Reads in the tests/tests.yaml file. This file contains a list of
each answer test's answer file (including the changeset number).
"""
# Register custom marks for answer tests and big data
config.addinivalue_line("markers", "answer_test: Run the answer tests.")
config.addinivalue_line(
"markers", "big_data: Run answer tests that require large data files."
)
for value in (
# treat most warnings as errors
"error",
# >>> warnings emitted by testing frameworks, or in testing contexts
# we still have some yield-based tests awaiting transition to pytest
"ignore::pytest.PytestCollectionWarning",
# matplotlib warnings related to the Agg backend which is used in CI, not much we can do about it
"ignore:Matplotlib is currently using agg, which is a non-GUI backend, so cannot show the figure.:UserWarning",
r"ignore:tight_layout.+falling back to Agg renderer:UserWarning",
#
# >>> warnings from wrong values passed to numpy
# these should normally be curated out of the test suite but they are too numerous
# to deal with in a reasonable time at the moment.
"ignore:invalid value encountered in log10:RuntimeWarning",
"ignore:divide by zero encountered in log10:RuntimeWarning",
#
# >>> there are many places in yt (most notably at the frontend level)
# where we open files but never explicitly close them
# Although this is in general bad practice, it can be intentional and
# justified in contexts where reading speeds should be optimized.
# It is not clear at the time of writing how to approach this,
# so I'm going to ignore this class of warnings altogether for now.
"ignore:unclosed file.*:ResourceWarning",
):
config.addinivalue_line("filterwarnings", value)
if SETUPTOOLS_VERSION is not None and SETUPTOOLS_VERSION >= Version("67.3.0"):
# may be triggered by multiple dependencies
# see https://github.com/glue-viz/glue/issues/2364
# see https://github.com/matplotlib/matplotlib/issues/25244
config.addinivalue_line(
"filterwarnings",
r"ignore:(Deprecated call to `pkg_resources\.declare_namespace\('.*'\)`\.\n)?"
r"Implementing implicit namespace packages \(as specified in PEP 420\) "
r"is preferred to `pkg_resources\.declare_namespace`\.:DeprecationWarning",
)
if SETUPTOOLS_VERSION is not None and SETUPTOOLS_VERSION >= Version("67.5.0"):
# may be triggered by multiple dependencies
# see https://github.com/glue-viz/glue/issues/2364
# see https://github.com/matplotlib/matplotlib/issues/25244
config.addinivalue_line(
"filterwarnings",
"ignore:pkg_resources is deprecated as an API:DeprecationWarning",
)
if MPL_VERSION < Version("3.5.2") and PILLOW_VERSION >= Version("9.1"):
# see https://github.com/matplotlib/matplotlib/pull/22766
config.addinivalue_line(
"filterwarnings",
r"ignore:NONE is deprecated and will be removed in Pillow 10 \(2023-07-01\)\. "
r"Use Resampling\.NEAREST or Dither\.NONE instead\.:DeprecationWarning",
)
config.addinivalue_line(
"filterwarnings",
r"ignore:ADAPTIVE is deprecated and will be removed in Pillow 10 \(2023-07-01\)\. "
r"Use Palette\.ADAPTIVE instead\.:DeprecationWarning",
)
if NUMPY_VERSION >= Version("1.25"):
if find_spec("h5py") is not None and (
Version(version("h5py")) < Version("3.9")
):
# https://github.com/h5py/h5py/pull/2242
config.addinivalue_line(
"filterwarnings",
"ignore:`product` is deprecated as of NumPy 1.25.0"
":DeprecationWarning",
)
if PANDAS_VERSION is not None and PANDAS_VERSION >= Version("2.2.0"):
config.addinivalue_line(
"filterwarnings",
r"ignore:\s*Pyarrow will become a required dependency of pandas:DeprecationWarning",
)
if sys.version_info >= (3, 12):
# already patched (but not released) upstream:
# https://github.com/dateutil/dateutil/pull/1285
config.addinivalue_line(
"filterwarnings",
r"ignore:datetime\.datetime\.utcfromtimestamp\(\) is deprecated:DeprecationWarning",
)
if find_spec("ratarmount"):
# On Python 3.12+, there is a deprecation warning when calling os.fork()
# in a multi-threaded process. We use this mechanism to mount archives.
config.addinivalue_line(
"filterwarnings",
r"ignore:This process \(pid=\d+\) is multi-threaded, use of fork\(\) "
r"may lead to deadlocks in the child\."
":DeprecationWarning",
)
if find_spec("datatree"):
# the cf_radial dependency arm-pyart<=1.9.2 installs the now deprecated
# xarray-datatree package (which imports as datatree), which triggers
# a bunch of runtimewarnings when importing xarray.
# https://github.com/yt-project/yt/pull/5042#issuecomment-2457797694
config.addinivalue_line(
"filterwarnings",
"ignore:" r"Engine.*loading failed.*" ":RuntimeWarning",
)
def pytest_collection_modifyitems(config, items):
r"""
Decide which tests to skip based on command-line options.
"""
# Set up the skip marks
skip_answer = pytest.mark.skip(reason="--with-answer-testing not set.")
skip_unit = pytest.mark.skip(reason="Running answer tests, so skipping unit tests.")
skip_big = pytest.mark.skip(reason="--answer-big-data not set.")
# Loop over every collected test function
for item in items:
# If it's an answer test and the appropriate CL option hasn't
# been set, skip it
if "answer_test" in item.keywords and not config.getoption(
"--with-answer-testing"
):
item.add_marker(skip_answer)
# If it's an answer test that requires big data and the CL
# option hasn't been set, skip it
if (
"big_data" in item.keywords
and not config.getoption("--with-answer-testing")
and not config.getoption("--answer-big-data")
):
item.add_marker(skip_big)
if "answer_test" not in item.keywords and config.getoption(
"--with-answer-testing"
):
item.add_marker(skip_unit)
def pytest_itemcollected(item):
# Customize pytest-mpl decorator to add sensible defaults
mpl_marker = item.get_closest_marker("mpl_image_compare")
if mpl_marker is not None:
# in a future version, pytest-mpl may gain an option for doing this:
# https://github.com/matplotlib/pytest-mpl/pull/181
mpl_marker.kwargs.setdefault("tolerance", 0.5)
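# For reference, a test opting into image comparison inherits the 0.5
# tolerance default set above unless it overrides it. A sketch (the baseline
# filename and the test body are hypothetical):
#   @pytest.mark.mpl_image_compare(filename="slice_answer.png")
#   def test_slice_image():
#       ...  # build and return a matplotlib Figure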
def _param_list(request):
r"""
Saves the non-ds, non-fixture function arguments for saving to
the answer file.
"""
# pytest treats parametrized arguments as fixtures, so there's no
# clean way (that I know of) to separate them out from other
# fixtures, so we do it explicitly
blacklist = [
"hashing",
"answer_file",
"request",
"answer_compare",
"temp_dir",
"orbit_traj",
"etc_traj",
]
test_params = {}
for key, val in request.node.funcargs.items():
if key not in blacklist:
# For plotwindow, the callback arg is a tuple and the second
# element contains a memory address, so we need to drop it.
# The first element is the callback name, which is all that's
# needed
if key == "callback":
val = val[0]
test_params[key] = str(val)
# Convert python-specific data objects (such as tuples) to a more
# io-friendly format (in order to not have python-specific anchors
# in the answer yaml file)
test_params = _streamline_for_io(test_params)
return test_params
def _get_answer_files(request):
"""
Gets the path to where the hashed and raw answers are saved.
"""
answer_file = f"{request.cls.__name__}_{request.cls.answer_version}.yaml"
raw_answer_file = f"{request.cls.__name__}_{request.cls.answer_version}.h5"
# Add the local-dir aspect of the path. If there's a command line value,
# have that override the ini file value
clLocalDir = request.config.getoption("--local-dir")
iniLocalDir = request.config.getini("local-dir")
if clLocalDir is not None:
answer_file = os.path.join(os.path.expanduser(clLocalDir), answer_file)
raw_answer_file = os.path.join(os.path.expanduser(clLocalDir), raw_answer_file)
else:
answer_file = os.path.join(os.path.expanduser(iniLocalDir), answer_file)
raw_answer_file = os.path.join(os.path.expanduser(iniLocalDir), raw_answer_file)
# Make sure we don't overwrite unless we mean to
overwrite = request.config.getoption("--force-overwrite")
storing = request.config.getoption("--answer-store")
raw_storing = request.config.getoption("--raw-answer-store")
raw = request.config.getoption("--answer-raw-arrays")
if os.path.exists(answer_file) and storing and not overwrite:
raise FileExistsError(
"Use `--force-overwrite` to overwrite an existing answer file."
)
if os.path.exists(raw_answer_file) and raw_storing and raw and not overwrite:
raise FileExistsError(
"Use `--force-overwrite` to overwrite an existing raw answer file."
)
# If we do mean to overwrite, do so here by deleting the original file
if os.path.exists(answer_file) and storing and overwrite:
os.remove(answer_file)
if os.path.exists(raw_answer_file) and raw_storing and raw and overwrite:
os.remove(raw_answer_file)
print(os.path.abspath(answer_file))
return answer_file, raw_answer_file
@pytest.fixture(scope="function")
def hashing(request):
r"""
Handles initialization, generation, and saving of answer test
result hashes.
"""
no_hash = request.config.getoption("--no-hash")
store_hash = request.config.getoption("--answer-store")
raw = request.config.getoption("--answer-raw-arrays")
raw_store = request.config.getoption("--raw-answer-store")
# This check is so that, when checking if the answer file exists in
# _get_answer_files, we don't continuously fail. With this check,
# _get_answer_files is called once per class, despite this having function
# scope
if request.cls.answer_file is None:
request.cls.answer_file, request.cls.raw_answer_file = _get_answer_files(
request
)
if not no_hash and not store_hash and request.cls.saved_hashes is None:
try:
with open(request.cls.answer_file) as fd:
request.cls.saved_hashes = yaml.safe_load(fd)
except FileNotFoundError:
module_filename = f"{request.function.__module__.replace('.', os.sep)}.py"
with open(f"generate_test_{os.getpid()}.txt", "a") as fp:
fp.write(f"{module_filename}::{request.cls.__name__}\n")
pytest.fail("Answer file not found.", pytrace=False)
request.cls.hashes = {}
# Load the saved answers if we're comparing. We don't do this for the raw
# answers because those are huge
yield
# Get arguments and their values passed to the test (e.g., axis, field, etc.)
params = _param_list(request)
# Hash the test results. Don't save to request.cls.hashes so we still have
# raw data, in case we want to work with that
hashes = _hash_results(request.cls.hashes)
# Add the other test parameters
hashes.update(params)
# Add the function name as the "master" key to the hashes dict
hashes = {request.node.name: hashes}
# Save hashes
if not no_hash and store_hash:
_save_result(hashes, request.cls.answer_file)
# Compare hashes
elif not no_hash and not store_hash:
try:
for test_name, test_hash in hashes.items():
assert test_name in request.cls.saved_hashes
assert test_hash == request.cls.saved_hashes[test_name]
except AssertionError:
pytest.fail(f"Comparison failure: {request.node.name}", pytrace=False)
# Save raw data
if raw and raw_store:
_save_raw_arrays(
request.cls.hashes, request.cls.raw_answer_file, request.node.name
)
# Compare raw data. This is done one test at a time because the
# arrays can get quite large and storing everything in memory would
# be bad
if raw and not raw_store:
_compare_raw_arrays(
request.cls.hashes, request.cls.raw_answer_file, request.node.name
)
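# A sketch of a test class that consumes this fixture (the class and test
# names are hypothetical; the attributes are the ones _get_answer_files and
# the code above expect):
#   @pytest.mark.answer_test
#   class TestMyFrontend:
#       answer_file = None
#       saved_hashes = None
#       answer_version = "000"
#       def test_fields(self, hashing):
#           self.hashes.update({"field_values": computed_array})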
@pytest.fixture(scope="function")
def temp_dir():
r"""
Creates a temporary directory needed by certain tests.
"""
curdir = os.getcwd()
if int(os.environ.get("GENERATE_YTDATA", 0)):
tmpdir = os.getcwd()
else:
tmpdir = tempfile.mkdtemp()
os.chdir(tmpdir)
yield tmpdir
os.chdir(curdir)
if tmpdir != curdir:
shutil.rmtree(tmpdir)
@pytest.fixture(scope="class")
def ds(request):
# data_dir_load can take the cls, args, and kwargs. These optional
# arguments, if present, are given in a dictionary as the second
# element of the list
if isinstance(request.param, str):
ds_fn = request.param
opts = {}
else:
ds_fn, opts = request.param
try:
return data_dir_load(
ds_fn, cls=opts.get("cls"), args=opts.get("args"), kwargs=opts.get("kwargs")
)
except FileNotFoundError:
return pytest.skip(f"Data file: `{request.param}` not found.")
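# Because this fixture is parametrized indirectly, tests request it like so
# (a sketch; the path points at one of yt's public sample datasets):
#   @pytest.mark.parametrize(
#       "ds", ["IsolatedGalaxy/galaxy0030/galaxy0030"], indirect=True
#   )
#   class TestIsolatedGalaxy:
#       def test_load(self, ds):
#           assert ds is not None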
@pytest.fixture(scope="class")
def field(request):
"""
Fixture for returning the field. Needed because indirect=True is
used for loading the datasets.
"""
return request.param
@pytest.fixture(scope="class")
def dobj(request):
"""
Fixture for returning the ds_obj. Needed because indirect=True is
used for loading the datasets.
"""
return request.param
@pytest.fixture(scope="class")
def axis(request):
"""
Fixture for returning the axis. Needed because indirect=True is
used for loading the datasets.
"""
return request.param
@pytest.fixture(scope="class")
def weight(request):
"""
Fixture for returning the weight_field. Needed because
indirect=True is used for loading the datasets.
"""
return request.param
@pytest.fixture(scope="class")
def ds_repr(request):
"""
Fixture for returning the string representation of a dataset.
Needed because indirect=True is used for loading the datasets.
"""
return request.param
@pytest.fixture(scope="class")
def Npart(request):
"""
Fixture for returning the number of particles in a dataset.
Needed because indirect=True is used for loading the datasets.
"""
return request.param
././@PaxHeader 0000000 0000000 0000000 00000000034 00000000000 010212 x ustar 00 28 mtime=1731331021.2151506
yt-4.4.0/doc/ 0000755 0001751 0000177 00000000000 14714401715 012331 5 ustar 00runner docker ././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/Makefile 0000644 0001751 0000177 00000012047 14714401662 013776 0 ustar 00runner docker # Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = build
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source
.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest
help:
@echo "Please use \`make ' where is one of"
@echo " html to make standalone HTML files"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " singlehtml to make a single large HTML file"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " devhelp to make HTML files and a Devhelp project"
@echo " epub to make an epub"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " latexpdf to make LaTeX files and run them through pdflatex"
@echo " text to make text files"
@echo " man to make manual pages"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
@echo " clean to remove the build directory"
@echo " recipeclean to remove files produced by running the cookbook scripts"
clean:
-rm -rf $(BUILDDIR)/*
-rm -rf source/reference/api/yt.*
-rm -rf source/reference/api/modules.rst
fullclean: clean
recipeclean:
-rm -rf _temp/*.done source/cookbook/_static/*
html:
ifneq ($(READTHEDOCS),True)
SPHINX_APIDOC_OPTIONS=members,undoc-members,inherited-members,show-inheritance sphinx-apidoc \
-o source/reference/api/ \
-e ../yt $(shell find ../yt -name "*tests*" -type d) ../yt/utilities/voropp* ../yt/analysis_modules/*
endif
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
singlehtml:
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
@echo
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/yt.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/yt.qhc"
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/yt"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/yt"
@echo "# devhelp"
epub:
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
@echo
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make' in that directory to run these through (pdf)latex" \
"(use \`make latexpdf' here to do that automatically)."
latexpdf:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through pdflatex..."
make -C $(BUILDDIR)/latex all-pdf
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
text:
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
@echo
@echo "Build finished. The text files are in $(BUILDDIR)/text."
man:
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
@echo
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/README 0000644 0001751 0000177 00000000560 14714401662 013213 0 ustar 00runner docker This directory contains the uncompiled yt documentation. It's written to be
used with Sphinx, a tool designed for writing Python documentation. Sphinx is
available at this URL:
http://www.sphinx-doc.org/en/master/
Because the documentation requires a number of dependencies, we provide
pre-built versions online, accessible here:
https://yt-project.org/docs/dev/
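If you have the documentation dependencies installed, the Makefile in this
directory can build the HTML docs locally; as a sketch:

cd doc
make html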
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/activate 0000644 0001751 0000177 00000005712 14714401662 014062 0 ustar 00runner docker ### Adapted from virtualenv's activate script
# This file must be used with "source bin/activate" *from bash*
# you cannot run it directly
deactivate () {
# reset old environment variables
if [ -n "$_OLD_VIRTUAL_PATH" ] ; then
PATH="$_OLD_VIRTUAL_PATH"
export PATH
unset _OLD_VIRTUAL_PATH
fi
if [ -n "$_OLD_VIRTUAL_PYTHONHOME" ] ; then
PYTHONHOME="$_OLD_VIRTUAL_PYTHONHOME"
export PYTHONHOME
unset _OLD_VIRTUAL_PYTHONHOME
fi
### Begin extra yt vars
if [ -n "$_OLD_VIRTUAL_YT_DEST" ] ; then
YT_DEST="$_OLD_VIRTUAL_YT_DEST"
export YT_DEST
unset _OLD_VIRTUAL_YT_DEST
fi
if [ -n "$_OLD_VIRTUAL_PYTHONPATH" ] ; then
PYTHONPATH="$_OLD_VIRTUAL_PYTHONPATH"
export PYTHONPATH
unset _OLD_VIRTUAL_PYTHONPATH
fi
if [ -n "$_OLD_VIRTUAL_LD_LIBRARY_PATH" ] ; then
LD_LIBRARY_PATH="$_OLD_VIRTUAL_LD_LIBRARY_PATH"
export LD_LIBRARY_PATH
unset _OLD_VIRTUAL_LD_LIBRARY_PATH
fi
### End extra yt vars
# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands. Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "$BASH" -o -n "$ZSH_VERSION" ] ; then
hash -r
fi
if [ -n "$_OLD_VIRTUAL_PS1" ] ; then
PS1="$_OLD_VIRTUAL_PS1"
export PS1
unset _OLD_VIRTUAL_PS1
fi
unset VIRTUAL_ENV
if [ ! "$1" = "nondestructive" ] ; then
# Self destruct!
unset -f deactivate
fi
}
# unset irrelevant variables
deactivate nondestructive
VIRTUAL_ENV="__YT_DIR__"
export VIRTUAL_ENV
_OLD_VIRTUAL_PATH="$PATH"
PATH="$VIRTUAL_ENV/bin:$PATH"
export PATH
### Begin extra env vars for yt
_OLD_VIRTUAL_YT_DEST="$YT_DEST"
YT_DEST="$VIRTUAL_ENV"
export YT_DEST
_OLD_VIRTUAL_PYTHONPATH="$PYTHONPATH"
_OLD_VIRTUAL_LD_LIBRARY_PATH="$LD_LIBRARY_PATH"
LD_LIBRARY_PATH="$VIRTUAL_ENV/lib:$LD_LIBRARY_PATH"
export LD_LIBRARY_PATH
### End extra env vars for yt
# unset PYTHONHOME if set
# this will fail if PYTHONHOME is set to the empty string (which is bad anyway)
# could use `if (set -u; : $PYTHONHOME) ;` in bash
if [ -n "$PYTHONHOME" ] ; then
_OLD_VIRTUAL_PYTHONHOME="$PYTHONHOME"
unset PYTHONHOME
fi
if [ -z "$VIRTUAL_ENV_DISABLE_PROMPT" ] ; then
_OLD_VIRTUAL_PS1="$PS1"
if [ "x" != x ] ; then
PS1="$PS1"
else
if [ "`basename \"$VIRTUAL_ENV\"`" = "__" ] ; then
# special case for Aspen magic directories
# see http://www.zetadev.com/software/aspen/
PS1="[`basename \`dirname \"$VIRTUAL_ENV\"\``] $PS1"
else
PS1="(`basename \"$VIRTUAL_ENV\"`)$PS1"
fi
fi
export PS1
fi
# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands. Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "$BASH" -o -n "$ZSH_VERSION" ] ; then
hash -r
fi
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/activate.csh 0000644 0001751 0000177 00000003654 14714401662 014641 0 ustar 00runner docker # This file must be used with "source bin/activate.csh" *from csh*.
# You cannot run it directly.
# Created by Davide Di Blasi.
alias deactivate 'test $?_OLD_VIRTUAL_PATH != 0 && setenv PATH "$_OLD_VIRTUAL_PATH" && unset _OLD_VIRTUAL_PATH; test $?_OLD_VIRTUAL_YT_DEST != 0 && setenv YT_DEST "$_OLD_VIRTUAL_YT_DEST" && unset _OLD_VIRTUAL_YT_DEST; test $?_OLD_VIRTUAL_PYTHONPATH != 0 && setenv PYTHONPATH "$_OLD_VIRTUAL_PYTHONPATH" && unset _OLD_VIRTUAL_PYTHONPATH; test $?_OLD_VIRTUAL_LD_LIBRARY_PATH != 0 && setenv LD_LIBRARY_PATH "$_OLD_VIRTUAL_LD_LIBRARY_PATH" && unset _OLD_VIRTUAL_LD_LIBRARY_PATH; rehash; test $?_OLD_VIRTUAL_PROMPT != 0 && set prompt="$_OLD_VIRTUAL_PROMPT" && unset _OLD_VIRTUAL_PROMPT; unsetenv VIRTUAL_ENV; test "\!:*" != "nondestructive" && unalias deactivate'
# Unset irrelevant variables.
deactivate nondestructive
setenv VIRTUAL_ENV "__YT_DIR__"
if ($?PATH == 0) then
setenv PATH
endif
set _OLD_VIRTUAL_PATH="$PATH"
setenv PATH "${VIRTUAL_ENV}/bin:${PATH}"
### Begin extra yt vars
if ($?YT_DEST == 0) then
setenv YT_DEST
endif
set _OLD_VIRTUAL_YT_DEST="$YT_DEST"
setenv YT_DEST "${VIRTUAL_ENV}"
if ($?PYTHONPATH == 0) then
setenv PYTHONPATH
endif
set _OLD_VIRTUAL_PYTHONPATH="$PYTHONPATH"
setenv PYTHONPATH "${VIRTUAL_ENV}/lib/python2.7/site-packages:${PYTHONPATH}"
if ($?LD_LIBRARY_PATH == 0) then
setenv LD_LIBRARY_PATH
endif
set _OLD_VIRTUAL_LD_LIBRARY_PATH="$LD_LIBRARY_PATH"
setenv LD_LIBRARY_PATH "${VIRTUAL_ENV}/lib:${LD_LIBRARY_PATH}"
### End extra yt vars
set _OLD_VIRTUAL_PROMPT="$prompt"
if ("" != "") then
set env_name = ""
else
if (`basename "$VIRTUAL_ENV"` == "__") then
# special case for Aspen magic directories
# see http://www.zetadev.com/software/aspen/
set env_name = `basename \`dirname "$VIRTUAL_ENV"\``
else
set env_name = `basename "$VIRTUAL_ENV"`
endif
endif
set prompt = "[$env_name] $prompt"
unset env_name
rehash
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/cheatsheet.tex 0000644 0001751 0000177 00000046406 14714401662 015203 0 ustar 00runner docker \documentclass[10pt,landscape]{article}
\usepackage{multicol}
\usepackage{calc}
\usepackage{ifthen}
\usepackage[landscape]{geometry}
\usepackage[hyphens]{url}
% To make this come out properly in landscape mode, do one of the following
% 1.
% pdflatex cheatsheet.tex
%
% 2.
% latex cheatsheet.tex
% dvips -P pdf -t landscape cheatsheet.dvi
% ps2pdf cheatsheet.ps
% If you're reading this, be prepared for confusion. Making this was
% a learning experience for me, and it shows. Much of the placement
% was hacked in; if you make it better, let me know...
% 2008-04
% Changed page margin code to use the geometry package. Also added code for
% conditional page margins, depending on paper size. Thanks to Uwe Ziegenhagen
% for the suggestions.
% 2006-08
% Made changes based on suggestions from Gene Cooperman.
% 2012-11 - Stephen Skory
% Converted the latex cheat sheet to a yt cheat sheet, taken from
% http://www.stdout.org/~winston/latex/
% This sets page margins to .5 inch if using letter paper, and to 1cm
% if using A4 paper. (This probably isn't strictly necessary.)
% If using another size paper, use default 1cm margins.
\ifthenelse{\lengthtest { \paperwidth = 11in}}
{ \geometry{top=.5in,left=.5in,right=.5in,bottom=0.85in} }
{\ifthenelse{ \lengthtest{ \paperwidth = 297mm}}
{\geometry{top=1cm,left=1cm,right=1cm,bottom=1cm} }
{\geometry{top=1cm,left=1cm,right=1cm,bottom=1cm} }
}
% Turn off header and footer
\pagestyle{empty}
% Redefine section commands to use less space
\makeatletter
\renewcommand{\section}{\@startsection{section}{1}{0mm}%
{-1ex plus -.5ex minus -.2ex}%
{0.5ex plus .2ex}%x
{\normalfont\large\bfseries}}
\renewcommand{\subsection}{\@startsection{subsection}{2}{0mm}%
{-1explus -.5ex minus -.2ex}%
{0.5ex plus .2ex}%
{\normalfont\normalsize\bfseries}}
\renewcommand{\subsubsection}{\@startsection{subsubsection}{3}{0mm}%
{-1ex plus -.5ex minus -.2ex}%
{1ex plus .2ex}%
{\normalfont\small\bfseries}}
\makeatother
% Define BibTeX command
\def\BibTeX{{\rm B\kern-.05em{\sc i\kern-.025em b}\kern-.08em
T\kern-.1667em\lower.7ex\hbox{E}\kern-.125emX}}
% Don't print section numbers
\setcounter{secnumdepth}{0}
\setlength{\parindent}{0pt}
\setlength{\parskip}{0pt plus 0.5ex}
% -----------------------------------------------------------------------
\begin{document}
\raggedright
\fontsize{3mm}{3mm}\selectfont
\begin{multicols}{3}
% multicol parameters
% These lengths are set only within the two main columns
%\setlength{\columnseprule}{0.25pt}
\setlength{\premulticols}{1pt}
\setlength{\postmulticols}{1pt}
\setlength{\multicolsep}{1pt}
\setlength{\columnsep}{2pt}
\begin{center}
\Large{\textbf{yt Cheat Sheet}} \\
\end{center}
\subsection{General Info}
For everything yt, please see \url{http://yt-project.org}.
Documentation: \url{http://yt-project.org/doc/index.html}.
Need help? Start at \url{http://yt-project.org/doc/help/}, then
try the IRC chat room \url{http://yt-project.org/irc.html}
or the mailing list \url{https://mail.python.org/archives/list/yt-users@python.org/}. \\
\subsection{Installing yt} The easiest way to install yt is to use the
installation script found on the yt homepage or the docs linked above. If you
already have python set up with \texttt{numpy}, \texttt{scipy},
\texttt{matplotlib}, \texttt{h5py}, and \texttt{cython}, you can also use
\texttt{pip install yt}
\subsection{Command Line yt}
yt, and its convenience functions, are launched from a command line prompt.
Many commands have flags to control behavior.
Commands can be followed by
{\bf {-}{-}help} (e.g. {\bf yt render {-}{-}help}) for detailed help for that command
including a list of the available flags.
\texttt{yt load} \textit{dataset} \textemdash\ Load a single dataset. \\
\texttt{yt help} \textemdash\ Print yt help information. \\
\texttt{yt stats} \textit{dataset} \textemdash\ Print stats of a dataset. \\
\texttt{yt update} \textemdash\ Update yt to most recent version.\\
\texttt{yt update --all} \textemdash\ Update yt and dependencies to most recent version. \\
\texttt{yt version} \textemdash\ yt installation information. \\
\texttt{yt upload\_image} \textit{image.png} \textemdash\ Upload PNG image to imgur.com. \\
\texttt{yt upload\_notebook} \textit{notebook.nb} \textemdash\ Upload IPython notebook to \url{https://girder.hub.yt}.\\
\texttt{yt plot} \textit{dataset} \textemdash\ Create a set of images.\\
\texttt{yt render} \textit{dataset} \textemdash\ Create a simple
volume rendering. \\
\texttt{yt mapserver} \textit{dataset} \textemdash\ View a plot/projection in a Gmaps-like
interface. \\
\texttt{yt pastebin} \textit{text.out} \textemdash\ Post text to the pastebin at
paste.yt-project.org. \\
\texttt{yt pastebin\_grab} \textit{identifier} \textemdash\ Print content of pastebin to
STDOUT. \\
\texttt{yt bugreport} \textemdash\ Report a yt bug. \\
\texttt{yt hop} \textit{dataset} \textemdash\ Run hop on a dataset. \\
\subsection{yt Imports}
In order to use yt, Python must load the relevant yt modules into memory.
The import commands are entered in the Python/IPython shell or
used as part of a script.
\newlength{\MyLen}
\settowidth{\MyLen}{\texttt{letterpaper}/\texttt{a4paper} \ }
\texttt{import yt} \textemdash\
Load yt. \\
\texttt{from yt.config import ytcfg} \textemdash\
Used to set yt configuration options.
If used, must be called before importing any other module.\\
\texttt{from yt.analysis\_modules.\emph{halo\_finding}.api import \textasteriskcentered} \textemdash\
Load halo finding modules. Other modules
are loaded in a similar way by swapping the
\emph{emphasized} text.
See the \textbf{Analysis Modules} section for a listing and short descriptions of each.
\subsection{YTArray}
Simulation data in yt is returned as a YTArray. YTArray is a numpy array that
has unit data attached to it and can automatically handle unit conversions and
detect unit errors. Just like a numpy array, YTArray provides a wealth of
built-in functions to calculate properties of the data in the array. Here is a
very brief list of some useful ones.
\settowidth{\MyLen}{\texttt{multicol} }\\
\texttt{v = a.in\_cgs()} \textemdash\ Return the array in CGS units \\
\texttt{v = a.in\_units('Msun/pc**3')} \textemdash\ Return the array in solar masses per cubic parsec \\
\texttt{v = a.max(), a.min()} \textemdash\ Return maximum, minimum of \texttt{a}. \\
\texttt{index = a.argmax(), a.argmin()} \textemdash\ Return index of max,
min value of \texttt{a}.\\
\texttt{v = a[}\textit{index}\texttt{]} \textemdash\ Select a single value from \texttt{a} at location \textit{index}.\\
\texttt{b = a[}\textit{i:j}\texttt{]} \textemdash\ Select the slice of values from
\texttt{a} between
locations \textit{i} to \textit{j-1} saved to a new Numpy array \texttt{b} with length \textit{j-i}. \\
\texttt{sel = (a > const)} \textemdash\ Create a new boolean Numpy array
\texttt{sel}, of the same shape as \texttt{a},
that marks which values of \texttt{a > const}. Other operators (e.g. \textless, !=, \%) work as well.\\
\texttt{b = a[sel]} \textemdash\ Create a new Numpy array \texttt{b} made up of
elements from \texttt{a} that correspond to elements of \texttt{sel}
that are \textit{True}. In the above example \texttt{b} would be all elements of \texttt{a} that are greater than \texttt{const}.\\
\texttt{a.write\_hdf5(\textit{filename.h5})} \textemdash\ Save \texttt{a} to the hdf5 file \textit{filename.h5}.\\
\subsection{IPython Tips}
\settowidth{\MyLen}{\texttt{multicol} }
These tips work once IPython has been loaded, typically by invoking
\texttt{yt load} on the command line.
\texttt{Tab complete} \textemdash\ IPython will attempt to auto-complete a
variable or function name when the \texttt{Tab} key is pressed, e.g. \textit{HaloFi}\textendash\texttt{Tab} would auto-complete
to \textit{HaloFinder}. This also works with imports, e.g. \textit{from numpy.random.}\textendash\texttt{Tab}
would give you a list of random functions (note the trailing period before hitting \texttt{Tab}).\\
\texttt{?, ??} \textemdash\ Appending one or two question marks at the end of any object gives you
detailed information about it, e.g. \textit{variable\_name}?.\\
Below a few IPython ``magics'' are listed, which are IPython-specific shortcut commands.\\
\texttt{\%paste} \textemdash\ Paste content from the system clipboard into the IPython shell.\\
\texttt{\%hist} \textemdash\ Print recent command history.\\
\texttt{\%quickref} \textemdash\ Print IPython quick reference.\\
\texttt{\%pdb} \textemdash\ Automatically enter the Python debugger at an exception.\\
\texttt{\%debug} \textemdash\ Drop into a debugger at the location of the last unhandled exception. \\
\texttt{\%time, \%timeit} \textemdash\ Find running time of expressions for benchmarking.\\
\texttt{\%lsmagic} \textemdash\ List all available IPython magics. Hint: \texttt{?} works with magics.\\
Please see \url{http://ipython.org/documentation.html} for the full
IPython documentation.
\subsection{Load and Access Data}
The first step in using yt is to reference a simulation snapshot.
After that, simulation data is generally accessed in yt using \textit{Data Containers} which are Python objects
that define a region of simulation space from which data should be selected.
\settowidth{\MyLen}{\texttt{multicol} }
\texttt{ds = yt.load(}\textit{dataset}\texttt{)} \textemdash\ Reference a single snapshot.\\
\texttt{dd = ds.all\_data()} \textemdash\ Select the entire volume.\\
\texttt{a = dd[}\textit{field\_name}\texttt{]} \textemdash\ Copies the contents of \textit{field} into the
YTArray \texttt{a}. Similarly for other data containers.\\
\texttt{ds.field\_list} \textemdash\ A list of available fields in the snapshot. \\
\texttt{ds.derived\_field\_list} \textemdash\ A list of available derived fields
in the snapshot. \\
\texttt{val, loc = ds.find\_max("Density")} \textemdash\ Find the \texttt{val}ue of
the maximum of the field \texttt{Density} and its \texttt{loc}ation. \\
\texttt{sp = ds.sphere(}\textit{cen}\texttt{,}\textit{radius}\texttt{)} \textemdash\ Create a spherical data
container. \textit{cen} may be a coordinate, or ``max'' which
centers on the max density point. \textit{radius} may be a float in
code units or a tuple of (\textit{length, unit}).\\
\texttt{re = ds.region(\textit{cen}, \textit{left edge}, \textit{right edge})} \textemdash\ Create a
rectilinear data container. \textit{cen} is required but not used.
\textit{left} and \textit{right edge} are coordinate values that define the region.
\texttt{di = ds.disk(\textit{cen}, \textit{normal}, \textit{radius}, \textit{height})} \textemdash\
Create a cylindrical data container centered at \textit{cen} along the
direction set by \textit{normal},with total length
2$\times$\textit{height} and with radius \textit{radius}. \\
\texttt{ds.save\_object(sp, \textit{``sp\_for\_later''})} \textemdash\ Save an object (\texttt{sp}) for later use.\\
\texttt{sp = ds.load\_object(\textit{``sp\_for\_later''})} \textemdash\ Recover a saved object.\\
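A minimal end-to-end sketch combining the calls above (the dataset path is illustrative):\\
\texttt{import yt}\\
\texttt{ds = yt.load("galaxy0030/galaxy0030")}\\
\texttt{sp = ds.sphere("max", (10, "kpc"))}\\
\texttt{print(sp["gas", "density"].max())}\\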
\subsection{Defining New Fields}
\texttt{yt} supports on-disk fields, fields generated on demand, and in-memory fields.
Fields can either be created before a dataset is loaded using \texttt{add\_field}:
\texttt{def \_metal\_mass(\textit{field},\textit{data})}\\
\texttt{\hspace{4 mm} return data["metallicity"]*data["cell\_mass"]}\\
\texttt{add\_field("metal\_mass", units='g', function=\_metal\_mass)}\\
Or added to an existing dataset using \texttt{ds.add\_field}:
\texttt{ds.add\_field("metal\_mass", units='g', function=\_metal\_mass)}\\
\subsection{Slices and Projections}
\settowidth{\MyLen}{\texttt{multicol} }
\texttt{slc = yt.SlicePlot(ds, \textit{axis or normal vector}, \textit{fields}, \textit{center=}, \textit{width=}, \textit{weight\_field=}, \textit{additional parameters})} \textemdash\ Make a slice plot
perpendicular to \textit{axis} (specified via 'x', 'y', or 'z') or a normal vector for an off-axis slice of \textit{fields} weighted by \textit{weight\_field} at (code-units) \textit{center} with
\textit{width} in code units or a (value, unit) tuple. Hint: try \textit{yt.SlicePlot?} in IPython to see additional parameters.\\
\texttt{slc.save(\textit{file\_prefix})} \textemdash\ Save the slice to a png with name prefix \textit{file\_prefix}.
\texttt{.save()} works similarly for the commands below.\\
\texttt{prj = yt.ProjectionPlot(ds, \textit{axis or normal vector}, \textit{fields}, \textit{additional params})} \textemdash\ Same as \texttt{yt.SlicePlot} but for projections.\\
\subsection{Plot Annotations}
\settowidth{\MyLen}{\texttt{multicol} }
Plot callbacks are functions itemized in a registry that is attached to every plot object. They can be accessed and then called like \texttt{prj.annotate\_velocity(factor=16, normalize=False)}. Most callbacks also accept a \textit{plot\_args} dict that is fed to the matplotlib annotator. \\
\texttt{velocity(\textit{factor=},\textit{scale=},\textit{scale\_units=}, \textit{normalize=})} \textemdash\ Uses field "x-velocity" to draw quivers\\
\texttt{magnetic\_field(\textit{factor=},\textit{scale=},\textit{scale\_units=}, \textit{normalize=})} \textemdash\ Uses field "Bx" to draw quivers\\
\texttt{quiver(\textit{field\_x},\textit{field\_y},\textit{factor=},\textit{scale=},\textit{scale\_units=}, \textit{normalize=})} \\
\texttt{contour(\textit{field=},\textit{levels=},\textit{factor=},\textit{clim=},\textit{take\_log=}, \textit{additional parameters})} \textemdash Plots a number of contours \textit{ncont} to interpolate \textit{field} optionally using \textit{take\_log}, upper and lower \textit{c}ontour\textit{lim}its and \textit{factor} number of points in the interpolation.\\
\texttt{grids(\textit{alpha=}, \textit{draw\_ids=}, \textit{periodic=}, \textit{min\_level=}, \textit{max\_level=})} \textemdash Add grid boundaries. \\
\texttt{streamlines(\textit{field\_x},\textit{field\_y},\textit{factor=},\textit{density=})}\\
\texttt{clumps(\textit{clumplist})} \textemdash\ Generate \textit{clumplist} using the clump finder and plot. \\
\texttt{arrow(\textit{pos}, \textit{code\_size})} Add an arrow at a \textit{pos}ition. \\
\texttt{point(\textit{pos}, \textit{text})} \textemdash\ Add text at a \textit{pos}ition. \\
\texttt{marker(\textit{pos}, \textit{marker=})} \textemdash\ Add a matplotlib-defined marker at a \textit{pos}ition. \\
\texttt{sphere(\textit{center}, \textit{radius}, \textit{text=})} \textemdash\ Draw a circle and append \textit{text}.\\
\texttt{hop\_circles(\textit{hop\_output}, \textit{max\_number=}, \textit{annotate=}, \textit{min\_size=}, \textit{max\_size=}, \textit{font\_size=}, \textit{print\_halo\_size=}, \textit{fixed\_radius=}, \textit{min\_mass=}, \textit{print\_halo\_mass=}, \textit{width=})} \textemdash\ Draw a halo, printing its ID and mass, clipping halos depending on the number of particles (\textit{size}) and optionally fixing the drawn circle radius to be constant for all halos.\\
\texttt{hop\_particles(\textit{hop\_output},\textit{max\_number=},\textit{p\_size=},\\
\textit{min\_size},\textit{alpha=})} \textemdash\ Draw particle positions for member halos with a certain number of pixels per particle.\\
\texttt{particles(\textit{width},\textit{p\_size=},\textit{col=}, \textit{marker=}, \textit{stride=}, \textit{ptype=}, \textit{stars\_only=}, \textit{dm\_only=}, \textit{minimum\_mass=}, \textit{alpha=})} \textemdash\ Draw particles of \textit{p\_size} pixels in a slab of \textit{width} with \textit{col}or using a matplotlib \textit{marker} plotting only every \textit{stride} number of particles.\\
\texttt{title(\textit{text})}\\
\subsection{The $\sim$/.yt/ Directory}
\settowidth{\MyLen}{\texttt{multicol} }
yt will automatically check for configuration files in a special directory (\texttt{\$HOME/.yt/}) in the user's home directory.
The \texttt{config} file \textemdash\ Settings that control runtime behavior. \\
The \texttt{my\_plugins.py} file \textemdash\ Add functions, derived fields, constants, or other commonly-used Python code to yt.
\subsection{Analysis Modules}
\settowidth{\MyLen}{\texttt{multicol}}
The import name for each module is listed at the end of each description (see \textbf{yt Imports}).
\texttt{Absorption Spectrum} \textemdash\ (\texttt{absorption\_spectrum}). \\
\texttt{Clump Finder} \textemdash\ Find clumps defined by density thresholds (\texttt{level\_sets}). \\
\texttt{Halo Finding} \textemdash\ Locate halos of dark matter particles (\texttt{halo\_finding}). \\
\texttt{Light Cone Generator} \textemdash\ Stitch datasets together to perform analysis over cosmological volumes. \\
\texttt{Light Ray Generator} \textemdash\ Analyze the path of light rays.\\
\texttt{Rockstar Halo Finding} \textemdash\ Locate halos of dark matter using the Rockstar halo finder (\texttt{halo\_finding.rockstar}). \\
\texttt{Star Particle Analysis} \textemdash\ Analyze star formation history and assemble spectra (\texttt{star\_analysis}). \\
\texttt{Sunrise Exporter} \textemdash\ Export data to the sunrise visualization format (\texttt{sunrise\_export}). \\
\subsection{Parallel Analysis}
\settowidth{\MyLen}{\texttt{multicol}}
Nearly all of yt is parallelized using
MPI\@. The \textit{mpi4py} package must be installed for parallelism in yt; to
install it, \textit{pip install mpi4py} on the command line usually works.
Execute Python in parallel like this:\\
\textit{mpirun -n 12 python script.py}\\
The file \texttt{script.py} must call \texttt{yt.enable\_parallelism()} to
turn on yt's parallelism. If it doesn't, all cores will execute the
same serial yt script. The launch command may differ between the systems on which
you use yt; please consult your system's documentation for details on how to run
parallel applications.
\texttt{parallel\_objects()} \textemdash\ A way to parallelize analysis over objects
(such as halos or clumps).\\
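A parallel script sketch (run with \texttt{mpirun -n 4 python script.py}; the dataset path is illustrative, and projections are parallelized automatically):\\
\texttt{import yt}\\
\texttt{yt.enable\_parallelism()}\\
\texttt{ds = yt.load("galaxy0030/galaxy0030")}\\
\texttt{yt.ProjectionPlot(ds, "x", ("gas", "density")).save()}\\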
\subsection{Git}
\settowidth{\MyLen}{\texttt{multicol}}
Please see \url{https://git-scm.com/} for the latest Git documentation.
\texttt{git clone https://github.com/yt-project/yt} \textemdash\ Clone the yt
repository. \\
\texttt{git status} \textemdash\ Show status of working tree.\\
\texttt{git diff} \textemdash\ Show changed files in the working tree. \\
\texttt{git log} \textemdash\ Show a log of changes in reverse chronological
order.\\
\texttt{git revert <commit>} \textemdash\ Revert the changes in an existing
commit and create a new commit with the reverted changes. \\
\texttt{git add <paths>} \textemdash\ Stage changes in the working tree to
the index. \\
\texttt{git commit} \textemdash\ Commit staged changes to the repository. \\
\texttt{git merge <branch>} \textemdash\ Merge the revisions from the specified branch on
top of the current branch.\\
\texttt{git push <remote>} \textemdash\ Push changes to a remote repository. \\
\texttt{git push <remote> <branch>} \textemdash\ Push changes in the specified
branch to a remote repository. \\
\texttt{git pull <remote> <branch>} \textemdash\ Pull changes from the
specified branch of the remote repository. This is equivalent to \texttt{git
fetch <remote>} and then \texttt{git merge <remote>/<branch>}.\\
\subsection{FAQ}
\settowidth{\MyLen}{\texttt{multicol}}
\texttt{slc.set\_log('field', False)} \textemdash\ When plotting \texttt{field}, use linear scaling instead of log scaling.
%\rule{0.3\linewidth}{0.25pt}
%\scriptsize
% Can put some final stuff here like copyright etc...
\end{multicols}
\end{document}
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/docstring_idioms.txt 0000644 0001751 0000177 00000003227 14714401662 016437 0 ustar 00runner docker Idioms for Docstrings in yt
===========================
For a full list of recognized constructs for marking up docstrings, see the
Sphinx documentation:
http://www.sphinx-doc.org/en/master/
Specifically, these sections:
http://www.sphinx-doc.org/en/master/usage/restructuredtext/
http://www.sphinx-doc.org/en/master/usage/restructuredtext/roles.html#cross-referencing-syntax
Variables in Examples
---------------------
In order to construct short, useful examples, some variables must be specified.
However, because examples often require a bit of setup, here is a list of
useful variable names that correspond to specific instances the user is
presupposed to have created.
* `ds`: a dataset, loaded successfully
* `sp`: a sphere
* `c`: a 3-component "center"
* `L`: a 3-component vector that corresponds to either angular momentum or a
normal vector
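For example, a short docstring example using these variables might read (a
hypothetical sketch; `total_mass` is one of yt's derived quantities):

  >>> sp = ds.sphere(c, (10.0, "kpc"))
  >>> print(sp.quantities.total_mass())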
Cross-Referencing
-----------------
To enable sufficient linkages between different sections of the documentation,
good cross-referencing is key. To reference a section of the documentation,
you can use this construction:
For more information, see :ref:`image_writer`.
This will insert a link to the section in the documentation which has been
identified with `image_writer` as its name.
Referencing Classes and Functions
---------------------------------
To indicate the return type of a given object, you can reference it using this
construction:
This function returns a :class:`ProjectionPlot`.
To reference a function, you can use:
To write out this array, use :func:`save_image`.
To reference a method, you can use:
To add a projection, use :meth:`ProjectionPlot.set_width`.
././@PaxHeader 0000000 0000000 0000000 00000000034 00000000000 010212 x ustar 00 28 mtime=1731331021.2151506
yt-4.4.0/doc/extensions/ 0000755 0001751 0000177 00000000000 14714401715 014530 5 ustar 00runner docker ././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/extensions/README 0000644 0001751 0000177 00000000231 14714401662 015405 0 ustar 00runner docker This includes a version of the Numpy Documentation extension that has been
slightly modified to emit extra TOC tree items.
-- Matt Turk, March 25, 2011
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/extensions/config_help.py 0000644 0001751 0000177 00000002025 14714401662 017357 0 ustar 00runner docker import re
import subprocess
from docutils import statemachine
from docutils.parsers.rst import Directive
def setup(app):
app.add_directive("config_help", GetConfigHelp)
setup.app = app
setup.config = app.config
setup.confdir = app.confdir
retdict = dict(version="1.0", parallel_read_safe=True, parallel_write_safe=True)
return retdict
class GetConfigHelp(Directive):
required_arguments = 1
optional_arguments = 0
final_argument_whitespace = True
def run(self):
rst_file = self.state_machine.document.attributes["source"]
data = (
subprocess.check_output(self.arguments[0].split(" ") + ["-h"])
.decode("utf8")
.split("\n")
)
ind = next(
(i for i, val in enumerate(data) if re.match(r"\s{0,3}\{.*\}\s*$", val))
)
lines = [".. code-block:: none", ""] + data[ind + 1 :]
self.state_machine.insert_input(
statemachine.string2lines("\n".join(lines)), rst_file
)
return []
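# Example usage in an RST document (a sketch; any command that prints
# argparse-style help for "-h" should work):
#
#   .. config_help:: yt config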
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/extensions/pythonscript_sphinxext.py 0000644 0001751 0000177 00000005321 14714401662 021764 0 ustar 00runner docker import errno
import glob
import os
import shutil
import subprocess
import tempfile
import time
import uuid
from docutils import nodes
from docutils.parsers.rst import Directive
class PythonScriptDirective(Directive):
"""Execute an inline python script and display images.
This uses exec to execute an inline python script, copies
any images produced by the script, and embeds them in the document
along with the script.
"""
required_arguments = 0
optional_arguments = 0
has_content = True
def run(self):
cwd = os.getcwd()
tmpdir = tempfile.mkdtemp()
os.chdir(tmpdir)
rst_file = self.state_machine.document.attributes["source"]
rst_dir = os.path.abspath(os.path.dirname(rst_file))
image_dir, image_rel_dir = make_image_dir(setup, rst_dir)
# Construct script from cell content
content = "\n".join(self.content)
with open("temp.py", "w") as f:
f.write(content)
# Use sphinx logger?
uid = uuid.uuid4().hex[:8]
print("")
print(f">> Contents of the script: {uid}")
print(content)
print("")
start = time.time()
subprocess.call(["python", "temp.py"])
print(f">> The execution of the script {uid} took {time.time() - start:f} s")
text = ""
for im in sorted(glob.glob("*.png")):
text += get_image_tag(im, image_dir, image_rel_dir)
code = content
literal = nodes.literal_block(code, code)
literal["language"] = "python"
attributes = {"format": "html"}
img_node = nodes.raw("", text, **attributes)
# clean up
os.chdir(cwd)
shutil.rmtree(tmpdir, True)
return [literal, img_node]
def setup(app):
app.add_directive("python-script", PythonScriptDirective)
setup.app = app
setup.config = app.config
setup.confdir = app.confdir
retdict = dict(version="0.1", parallel_read_safe=True, parallel_write_safe=True)
return retdict
def get_image_tag(filename, image_dir, image_rel_dir):
my_uuid = uuid.uuid4().hex
shutil.move(filename, image_dir + os.path.sep + my_uuid + filename)
relative_filename = image_rel_dir + os.path.sep + my_uuid + filename
    # Embed the relocated image in the rendered HTML output.
    return f'<img src="{relative_filename}" width="600"><br>'
def make_image_dir(setup, rst_dir):
image_dir = os.path.join(setup.app.builder.outdir, "_images")
rel_dir = os.path.relpath(setup.confdir, rst_dir)
image_rel_dir = os.path.join(rel_dir, "_images")
thread_safe_mkdir(image_dir)
return image_dir, image_rel_dir
def thread_safe_mkdir(dirname):
try:
os.makedirs(dirname)
except OSError as e:
if e.errno != errno.EEXIST:
raise
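# Example usage in an RST document (a sketch): the directive body is executed
# as a standalone Python script, and any PNG files it writes are embedded
# below the echoed source.
#
#   .. python-script::
#
#      import matplotlib.pyplot as plt
#      plt.plot([1, 2, 3])
#      plt.savefig("example.png")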
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/extensions/yt_colormaps.py 0000644 0001751 0000177 00000003775 14714401662 017632 0 ustar 00runner docker # This extension is quite simple:
# 1. It accepts a script name
# 2. This script is added to the document in a literalinclude
# 3. Any _static images found will be added
import glob
import os
import shutil
from docutils.parsers.rst import Directive, directives
# Some of this magic comes from the matplotlib plot_directive.
def setup(app):
app.add_directive("yt_colormaps", ColormapScript)
setup.app = app
setup.config = app.config
setup.confdir = app.confdir
retdict = dict(version="0.1", parallel_read_safe=True, parallel_write_safe=True)
return retdict
class ColormapScript(Directive):
required_arguments = 1
optional_arguments = 0
def run(self):
rst_file = self.state_machine.document.attributes["source"]
rst_dir = os.path.abspath(os.path.dirname(rst_file))
script_fn = directives.path(self.arguments[0])
script_bn = os.path.basename(script_fn)
# This magic is from matplotlib
dest_dir = os.path.abspath(
os.path.join(setup.app.builder.outdir, os.path.dirname(script_fn))
)
if not os.path.exists(dest_dir):
            os.makedirs(dest_dir)  # create the destination directory on first use
rel_dir = os.path.relpath(rst_dir, setup.confdir)
place = os.path.join(dest_dir, rel_dir)
if not os.path.isdir(place):
os.makedirs(place)
shutil.copyfile(
os.path.join(rst_dir, script_fn), os.path.join(place, script_bn)
)
im_path = os.path.join(rst_dir, "_static")
images = sorted(glob.glob(os.path.join(im_path, "*.png")))
lines = []
for im in images:
im_name = os.path.join("_static", os.path.basename(im))
lines.append(f".. image:: {im_name}")
lines.append(" :width: 400")
lines.append(f" :target: ../../_images/{os.path.basename(im)}")
lines.append("\n")
lines.append("\n")
self.state_machine.insert_input(lines, rst_file)
return []
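# Example usage in an RST document (hypothetical script name): the named
# script is copied into the build tree and every PNG found in the page's
# _static directory is embedded after it.
#
#   .. yt_colormaps:: cmap_images.py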
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/extensions/yt_cookbook.py 0000644 0001751 0000177 00000005477 14714401662 017442 0 ustar 00runner docker # This extension is quite simple:
# 1. It accepts a script name
# 2. This script is added to the document in a literalinclude
# 3. Any _static images found will be added
import glob
import os
import shutil
from docutils.parsers.rst import Directive, directives
# Some of this magic comes from the matplotlib plot_directive.
def setup(app):
app.add_directive("yt_cookbook", CookbookScript)
setup.app = app
setup.config = app.config
setup.confdir = app.confdir
retdict = dict(version="0.1", parallel_read_safe=True, parallel_write_safe=True)
return retdict
data_patterns = ["*.h5", "*.out", "*.dat", "*.mp4"]
class CookbookScript(Directive):
required_arguments = 1
optional_arguments = 0
def run(self):
rst_file = self.state_machine.document.attributes["source"]
rst_dir = os.path.abspath(os.path.dirname(rst_file))
script_fn = directives.path(self.arguments[0])
script_bn = os.path.basename(script_fn)
script_name = os.path.basename(self.arguments[0]).split(".")[0]
# This magic is from matplotlib
dest_dir = os.path.abspath(
os.path.join(setup.app.builder.outdir, os.path.dirname(script_fn))
)
if not os.path.exists(dest_dir):
            os.makedirs(dest_dir)  # create the destination directory on first use
rel_dir = os.path.relpath(rst_dir, setup.confdir)
place = os.path.join(dest_dir, rel_dir)
if not os.path.isdir(place):
os.makedirs(place)
shutil.copyfile(
os.path.join(rst_dir, script_fn), os.path.join(place, script_bn)
)
im_path = os.path.join(rst_dir, "_static")
images = sorted(glob.glob(os.path.join(im_path, f"{script_name}__*.png")))
lines = []
lines.append(f"(`{script_bn} <{script_fn}>`__)")
lines.append("\n")
lines.append("\n")
lines.append(f".. literalinclude:: {self.arguments[0]}")
lines.append("\n")
lines.append("\n")
for im in images:
im_name = os.path.join("_static", os.path.basename(im))
lines.append(f".. image:: {im_name}")
lines.append(" :width: 400")
lines.append(f" :target: ../_images/{os.path.basename(im)}")
lines.append("\n")
lines.append("\n")
for ext in data_patterns:
data_files = sorted(
glob.glob(os.path.join(im_path, f"{script_name}__*.{ext}"))
)
for df in data_files:
df_bn = os.path.basename(df)
shutil.copyfile(
os.path.join(rst_dir, df), os.path.join(dest_dir, rel_dir, df_bn)
)
lines.append(f" * Data: `{df_bn} <{df}>`__)")
lines.append("\n")
self.state_machine.insert_input(lines, rst_file)
return []
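# Example usage in an RST document (hypothetical recipe name): the recipe is
# shown via literalinclude, and _static images named "<recipe>__*.png" are
# embedded after it.
#
#   .. yt_cookbook:: simple_slice.py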
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/extensions/yt_showfields.py 0000644 0001751 0000177 00000001471 14714401662 017771 0 ustar 00runner docker import subprocess
import sys
from docutils.parsers.rst import Directive
def setup(app):
app.add_directive("yt_showfields", ShowFields)
setup.app = app
setup.config = app.config
setup.confdir = app.confdir
retdict = dict(version="1.0", parallel_read_safe=True, parallel_write_safe=True)
return retdict
class ShowFields(Directive):
required_arguments = 0
optional_arguments = 0
parallel_read_safe = True
parallel_write_safe = True
def run(self):
rst_file = self.state_machine.document.attributes["source"]
lines = subprocess.check_output(
[sys.executable, "./helper_scripts/show_fields.py"]
)
lines = lines.decode("utf8")
lines = lines.split("\n")
self.state_machine.insert_input(lines, rst_file)
return []
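# Example usage in an RST document; the directive takes no arguments and
# inserts the output of helper_scripts/show_fields.py:
#
#   .. yt_showfields::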
././@PaxHeader 0000000 0000000 0000000 00000000034 00000000000 010212 x ustar 00 28 mtime=1731331021.2151506
yt-4.4.0/doc/helper_scripts/ 0000755 0001751 0000177 00000000000 14714401715 015357 5 ustar 00runner docker ././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/helper_scripts/code_support.py 0000644 0001751 0000177 00000005062 14714401662 020443 0 ustar 00runner docker vals = [
"FluidQuantities",
"Particles",
"Parameters",
"Units",
"ReadOnDemand",
"LoadRawData",
"LevelOfSupport",
"ContactPerson",
]
class CodeSupport:
def __init__(self, **kwargs):
self.support = {}
for v in vals:
self.support[v] = "N"
for k, v in kwargs.items():
if k in vals:
self.support[k] = v
Y = "Y"
N = "N"
code_names = ["Enzo", "Orion", "FLASH", "RAMSES", "Chombo", "Gadget", "ART", "ZEUS"]
codes = dict(
Enzo=CodeSupport(
FluidQuantities=Y,
Particles=Y,
Parameters=Y,
Units=Y,
ReadOnDemand=Y,
LoadRawData=Y,
ContactPerson="Matt Turk",
LevelOfSupport="Full",
),
Orion=CodeSupport(
FluidQuantities=Y,
Particles=N,
Parameters=Y,
Units=Y,
ReadOnDemand=Y,
LoadRawData=Y,
ContactPerson="Jeff Oishi",
LevelOfSupport="Full",
),
FLASH=CodeSupport(
FluidQuantities=Y,
Particles=N,
Parameters=N,
Units=Y,
ReadOnDemand=Y,
LoadRawData=Y,
ContactPerson="John !ZuHone",
LevelOfSupport="Partial",
),
RAMSES=CodeSupport(
FluidQuantities=Y,
Particles=N,
Parameters=N,
Units=N,
ReadOnDemand=Y,
LoadRawData=Y,
ContactPerson="Matt Turk",
LevelOfSupport="Partial",
),
Chombo=CodeSupport(
FluidQuantities=Y,
Particles=N,
Parameters=N,
Units=N,
ReadOnDemand=Y,
LoadRawData=Y,
ContactPerson="Jeff Oishi",
LevelOfSupport="Partial",
),
Gadget=CodeSupport(
FluidQuantities=N,
Particles=Y,
Parameters=Y,
Units=Y,
ReadOnDemand=N,
LoadRawData=N,
ContactPerson="Chris Moody",
LevelOfSupport="Partial",
),
ART=CodeSupport(
FluidQuantities=N,
Particles=N,
Parameters=N,
Units=N,
ReadOnDemand=N,
LoadRawData=N,
ContactPerson="Matt Turk",
LevelOfSupport="None",
),
ZEUS=CodeSupport(
FluidQuantities=N,
Particles=N,
Parameters=N,
Units=N,
ReadOnDemand=N,
LoadRawData=N,
ContactPerson="Matt Turk",
LevelOfSupport="None",
),
)
print("|| . ||", end=" ")
for c in code_names:
print(f"{c} || ", end=" ")
print()
for vn in vals:
print(f"|| !{vn} ||", end=" ")
for c in code_names:
print(f"{codes[c].support[vn]} || ", end=" ")
print()
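# The loops above emit a MoinMoin-style wiki table, roughly:
#   || . || Enzo ||  Orion ||  ...
#   || !FluidQuantities || Y ||  Y ||  ...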
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/helper_scripts/parse_cb_list.py 0000644 0001751 0000177 00000002545 14714401662 020551 0 ustar 00runner docker import inspect
from textwrap import TextWrapper
import yt
ds = yt.load("RD0005-mine/RedshiftOutput0005")
output = open("source/visualizing/_cb_docstrings.inc", "w")
template = """
.. function:: %(clsname)s%(sig)s:
(This is a proxy for :class:`~%(clsproxy)s`.)
%(docstring)s
"""
tw = TextWrapper(initial_indent=" ", subsequent_indent=" ", width=60)
def write_docstring(f, name, cls):
if not hasattr(cls, "_type_name") or cls._type_name is None:
return
for clsi in inspect.getmro(cls):
docstring = inspect.getdoc(clsi.__init__)
if docstring is not None:
break
clsname = cls._type_name
sig = inspect.formatargspec(*inspect.getargspec(cls.__init__))
sig = sig.replace("**kwargs", "**field_parameters")
clsproxy = f"yt.visualization.plot_modifications.{cls.__name__}"
# docstring = "\n".join([" %s" % line for line in docstring.split("\n")])
# print(docstring)
f.write(
template
% dict(
clsname=clsname,
sig=sig,
clsproxy=clsproxy,
docstring="\n".join(tw.wrap(docstring)),
)
)
# docstring = docstring))
for n, c in sorted(yt.visualization.api.callback_registry.items()):
write_docstring(output, n, c)
print(f".. autoclass:: yt.visualization.plot_modifications.{n}")
print(" :members:")
print()
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/helper_scripts/parse_dq_list.py 0000644 0001751 0000177 00000002023 14714401662 020560 0 ustar 00runner docker import inspect
from textwrap import TextWrapper
import yt
ds = yt.load("RD0005-mine/RedshiftOutput0005")
output = open("source/analyzing/_dq_docstrings.inc", "w")
template = """
.. function:: %(funcname)s%(sig)s:
(This is a proxy for :func:`~%(funcproxy)s`.)
%(docstring)s
"""
tw = TextWrapper(initial_indent=" ", subsequent_indent=" ", width=60)
def write_docstring(f, name, func):
docstring = inspect.getdoc(func)
funcname = name
sig = inspect.formatargspec(*inspect.getargspec(func))
sig = sig.replace("data, ", "")
sig = sig.replace("(data)", "()")
funcproxy = f"yt.data_objects.derived_quantities.{func.__name__}"
docstring = "\n".join(" %s" % line for line in docstring.split("\n"))
f.write(
template
% dict(funcname=funcname, sig=sig, funcproxy=funcproxy, docstring=docstring)
)
# docstring = "\n".join(tw.wrap(docstring))))
dd = ds.all_data()
for n, func in sorted(dd.quantities.functions.items()):
print(n, func)
write_docstring(output, n, func[1])
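# Each generated entry looks roughly like this (hypothetical quantity name):
#
#   .. function:: total_mass():
#       (This is a proxy for :func:`~yt.data_objects.derived_quantities.TotalMass`.)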
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/helper_scripts/parse_object_list.py 0000644 0001751 0000177 00000002046 14714401662 021427 0 ustar 00runner docker import inspect
from textwrap import TextWrapper
import yt
ds = yt.load("RD0005-mine/RedshiftOutput0005")
output = open("source/analyzing/_obj_docstrings.inc", "w")
template = """
.. class:: %(clsname)s%(sig)s:
For more information, see :ref:`%(docstring)s`
(This is a proxy for :class:`~%(clsproxy)sBase`.)
"""
tw = TextWrapper(initial_indent=" ", subsequent_indent=" ", width=60)
def write_docstring(f, name, cls):
for clsi in inspect.getmro(cls):
docstring = inspect.getdoc(clsi.__init__)
if docstring is not None:
break
clsname = name
sig = inspect.formatargspec(*inspect.getargspec(cls.__init__))
sig = sig.replace("**kwargs", "**field_parameters")
clsproxy = f"yt.data_objects.data_containers.{cls.__name__}"
f.write(
template
% dict(
clsname=clsname, sig=sig, clsproxy=clsproxy, docstring="physical-object-api"
)
)
for n, c in sorted(ds.__dict__.items()):
if hasattr(c, "_con_args"):
print(n)
write_docstring(output, n, c)
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/helper_scripts/run_recipes.py 0000755 0001751 0000177 00000004661 14714401662 020262 0 ustar 00runner docker #!/usr/bin/env python3
import glob
import os
import shutil
import subprocess
import sys
import tempfile
import traceback
from multiprocessing import Pool
import matplotlib
from yt.config import ytcfg
matplotlib.use("Agg")
FPATTERNS = ["*.png", "*.txt", "*.h5", "*.dat", "*.mp4"]
DPATTERNS = ["LC*", "LR", "DD0046"]
BADF = [
"cloudy_emissivity.h5",
"apec_emissivity.h5",
"xray_emissivity.h5",
"AMRGridData_Slice_x_density.png",
]
CWD = os.getcwd()
ytcfg["yt", "serialize"] = False
BLACKLIST = ["opengl_ipython", "opengl_vr"]
def prep_dirs():
for directory in glob.glob(f"{ytcfg.get('yt', 'test_data_dir')}/*"):
os.symlink(directory, os.path.basename(directory))
def run_recipe(payload):
(recipe,) = payload
module_name, ext = os.path.splitext(os.path.basename(recipe))
dest = os.path.join(os.path.dirname(recipe), "_static", module_name)
if module_name in BLACKLIST:
return 0
if not os.path.exists(f"{CWD}/_temp/{module_name}.done"):
sys.stderr.write(f"Started {module_name}\n")
tmpdir = tempfile.mkdtemp()
os.chdir(tmpdir)
prep_dirs()
try:
subprocess.check_call(["python", recipe])
except Exception as exc:
trace = "".join(traceback.format_exception(*sys.exc_info()))
trace += f" in module: {module_name}\n"
trace += f" recipe: {recipe}\n"
raise Exception(trace) from exc
open(f"{CWD}/_temp/{module_name}.done", "wb").close()
for pattern in FPATTERNS:
for fname in glob.glob(pattern):
if fname not in BADF:
shutil.move(fname, f"{dest}__{fname}")
for pattern in DPATTERNS:
for dname in glob.glob(pattern):
shutil.move(dname, dest)
os.chdir(CWD)
shutil.rmtree(tmpdir, True)
sys.stderr.write(f"Finished with {module_name}\n")
return 0
for path in [
"_temp",
"source/cookbook/_static",
"source/visualizing/colormaps/_static",
]:
fpath = os.path.join(CWD, path)
if os.path.exists(fpath):
shutil.rmtree(fpath)
os.makedirs(fpath)
os.chdir("_temp")
recipes = []
for rpath in ["source/cookbook", "source/visualizing/colormaps"]:
fpath = os.path.join(CWD, rpath)
sys.path.append(fpath)
recipes += glob.glob(f"{fpath}/*.py")
WPOOL = Pool(processes=6)
RES = WPOOL.map_async(run_recipe, ((recipe,) for recipe in recipes))
RES.get()
os.chdir(CWD)
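# This script expects to run from the doc/ directory (it resolves recipe and
# output paths relative to the current working directory), e.g.:
#   python3 helper_scripts/run_recipes.py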
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/helper_scripts/show_fields.py 0000644 0001751 0000177 00000023624 14714401662 020247 0 ustar 00runner docker import inspect
import numpy as np
import yt.frontends as frontends_module
from yt.config import ytcfg
from yt.fields.derived_field import NullFunc
from yt.frontends.api import _frontends
from yt.frontends.stream.fields import StreamFieldInfo
from yt.funcs import obj_length
from yt.testing import fake_random_ds
from yt.units import dimensions
from yt.units.yt_array import Unit
from yt.utilities.cosmology import Cosmology
fields, units = [], []
for fname, (code_units, _aliases, _dn) in StreamFieldInfo.known_other_fields:
fields.append(("gas", fname))
units.append(code_units)
base_ds = fake_random_ds(4, fields=fields, units=units)
base_ds.index
base_ds.cosmological_simulation = 1
base_ds.cosmology = Cosmology()
ytcfg["yt", "internals", "within_testing"] = True
np.seterr(all="ignore")
def _strip_ftype(field):
if not isinstance(field, tuple):
return field
elif field[0] == "all":
return field
return field[1]
np.random.seed(int(0x4D3D3D3))
units = [base_ds._get_field_info(f).units for f in fields]
fields = [_strip_ftype(f) for f in fields]
ds = fake_random_ds(16, fields=fields, units=units, particles=1)
ds.parameters["HydroMethod"] = "streaming"
ds.parameters["EOSType"] = 1.0
ds.parameters["EOSSoundSpeed"] = 1.0
ds.conversion_factors["Time"] = 1.0
ds.conversion_factors.update({f: 1.0 for f in fields})
ds.gamma = 5.0 / 3.0
ds.current_redshift = 0.0001
ds.cosmological_simulation = 1
ds.hubble_constant = 0.7
ds.omega_matter = 0.27
ds.omega_lambda = 0.73
ds.cosmology = Cosmology(
hubble_constant=ds.hubble_constant,
omega_matter=ds.omega_matter,
omega_lambda=ds.omega_lambda,
unit_registry=ds.unit_registry,
)
for my_unit in ["m", "pc", "AU", "au"]:
new_unit = f"{my_unit}cm"
my_u = Unit(my_unit, registry=ds.unit_registry)
ds.unit_registry.add(
new_unit,
my_u.base_value,
dimensions.length,
"\\rm{%s}/(1+z)" % my_unit,
prefixable=True,
)
header = r"""
.. _field-list:
Field List
==========
This is a list of many of the fields available in yt. We have attempted to
include most of the fields that are accessible through the plugin system, as
well as the fields that are known by the frontends; however, it is possible to
generate many more permutations, particularly through vector operations. For
more information about the fields framework, see :ref:`fields`.
Some fields are recognized by specific frontends only. These are typically
fields like density and temperature that have their own names and units in
the different frontend datasets. Often, these fields are aliased to their
yt-named counterpart fields (typically 'gas' fieldtypes). For example, in
the ``FLASH`` frontend, the ``dens`` field (i.e. ``(flash, dens)``) is aliased
to the gas field density (i.e. ``(gas, density)``), similarly ``(flash, velx)``
is aliased to ``(gas, velocity_x)``, and so on. In what follows, if a field
is aliased it will be noted.
Try using ``ds.field_list`` and ``ds.derived_field_list`` to view the
native and derived fields available for your dataset, respectively. For example,
to display the native fields in alphabetical order:
.. notebook-cell::
import yt
ds = yt.load("Enzo_64/DD0043/data0043")
for i in sorted(ds.field_list):
print(i)
To figure out what all of the field types here mean, see
:ref:`known-field-types`.
.. contents:: Table of Contents
:depth: 1
:local:
:backlinks: none
.. _yt-fields:
Universal Fields
----------------
"""
footer = """
Index of Fields
---------------
.. contents::
:depth: 3
:backlinks: none
"""
print(header)
seen = []
def fix_units(units, in_cgs=False):
unit_object = Unit(units, registry=ds.unit_registry)
if in_cgs:
unit_object = unit_object.get_cgs_equivalent()
latex = unit_object.latex_representation()
return latex.replace(r"\ ", "~")
def print_all_fields(fl):
for fn in sorted(fl):
df = fl[fn]
f = df._function
s = f"{df.name}"
print(s)
print("^" * len(s))
print()
if obj_length(df.units) > 0:
# Most universal fields are in CGS except for these special fields
if df.name[1] in [
"particle_position",
"particle_position_x",
"particle_position_y",
"particle_position_z",
"entropy",
"kT",
"metallicity",
"dx",
"dy",
"dz",
"cell_volume",
"x",
"y",
"z",
]:
print(f" * Units: :math:`{fix_units(df.units)}`")
else:
print(f" * Units: :math:`{fix_units(df.units, in_cgs=True)}`")
print(f" * Sampling Method: {df.sampling_type}")
print()
print("**Field Source**")
print()
if f == NullFunc:
print("No source available.")
print()
continue
else:
print(".. code-block:: python")
print()
for line in inspect.getsource(f).split("\n"):
print(" " + line)
print()
ds.index
print_all_fields(ds.field_info)
class FieldInfo:
"""a simple container to hold the information about fields"""
def __init__(self, ftype, field, ptype):
name = field[0]
self.units = ""
u = field[1][0]
if len(u) > 0:
self.units = r":math:`\mathrm{%s}`" % fix_units(u)
a = [f"``{f}``" for f in field[1][1] if f]
self.aliases = " ".join(a)
self.dname = ""
if field[1][2] is not None:
self.dname = f":math:`{field[1][2]}`"
if ftype != "particle_type":
ftype = f"'{ftype}'"
self.name = f"({ftype}, '{name}')"
self.ptype = ptype
current_frontends = [f for f in _frontends if f not in ["stream"]]
for frontend in current_frontends:
this_f = getattr(frontends_module, frontend)
field_info_names = [fi for fi in dir(this_f) if "FieldInfo" in fi]
dataset_names = [dset for dset in dir(this_f) if "Dataset" in dset]
if frontend == "gadget":
# Drop duplicate entry for GadgetHDF5, add special case for FieldInfo
# entry
dataset_names = ["GadgetDataset"]
field_info_names = ["GadgetFieldInfo"]
elif frontend == "boxlib":
field_info_names = []
for d in dataset_names:
if "Maestro" in d:
field_info_names.append("MaestroFieldInfo")
elif "Castro" in d:
field_info_names.append("CastroFieldInfo")
else:
field_info_names.append("BoxlibFieldInfo")
elif frontend == "chombo":
# remove low dimensional field info containers for ChomboPIC
field_info_names = [
f for f in field_info_names if "1D" not in f and "2D" not in f
]
for dset_name, fi_name in zip(dataset_names, field_info_names):
fi = getattr(this_f, fi_name)
nfields = 0
if hasattr(fi, "known_other_fields"):
known_other_fields = fi.known_other_fields
nfields += len(known_other_fields)
else:
known_other_fields = []
if hasattr(fi, "known_particle_fields"):
known_particle_fields = fi.known_particle_fields
if "Tipsy" in fi_name:
known_particle_fields += tuple(fi.aux_particle_fields.values())
nfields += len(known_particle_fields)
else:
known_particle_fields = []
if nfields > 0:
print(f".. _{dset_name.replace('Dataset', '')}_specific_fields:\n")
h = f"{dset_name.replace('Dataset', '')}-Specific Fields"
print(h)
print("-" * len(h) + "\n")
field_stuff = []
for field in known_other_fields:
field_stuff.append(FieldInfo(frontend, field, False))
for field in known_particle_fields:
if frontend in ["sph", "halo_catalogs", "sdf"]:
field_stuff.append(FieldInfo("particle_type", field, True))
else:
field_stuff.append(FieldInfo("io", field, True))
# output
len_name = 10
len_units = 5
len_aliases = 7
len_part = 9
len_disp = 12
for f in field_stuff:
len_name = max(len_name, len(f.name))
len_aliases = max(len_aliases, len(f.aliases))
len_units = max(len_units, len(f.units))
len_disp = max(len_disp, len(f.dname))
fstr = "{nm:{nw}} {un:{uw}} {al:{aw}} {pt:{pw}} {dp:{dw}}"
header = fstr.format(
nm="field name",
nw=len_name,
un="units",
uw=len_units,
al="aliases",
aw=len_aliases,
pt="particle?",
pw=len_part,
dp="display name",
dw=len_disp,
)
div = fstr.format(
nm="=" * len_name,
nw=len_name,
un="=" * len_units,
uw=len_units,
al="=" * len_aliases,
aw=len_aliases,
pt="=" * len_part,
pw=len_part,
dp="=" * len_disp,
dw=len_disp,
)
print(div)
print(header)
print(div)
for f in field_stuff:
print(
fstr.format(
nm=f.name,
nw=len_name,
un=f.units,
uw=len_units,
al=f.aliases,
aw=len_aliases,
pt=f.ptype,
pw=len_part,
dp=f.dname,
dw=len_disp,
)
)
print(div)
print("")
print(footer)
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/helper_scripts/split_auto.py 0000644 0001751 0000177 00000003370 14714401662 020120 0 ustar 00runner docker import collections
templates = dict(
autoclass=r"""
%(name)s
%(header)s
.. autoclass:: %(name)s
:members:
:inherited-members:
:undoc-members:
""",
autofunction=r"""
%(name)s
%(header)s
.. autofunction:: %(name)s
""",
index_file=r"""
%(title)s
%(header)s
.. autosummary::
:toctree: generated/%(dn)s
""",
)
file_names = dict(
ft=("Field Types", "source/api/field_types/%s.rst"),
pt=("Plot Types", "source/api/plot_types/%s.rst"),
cl=("Callback List", "source/api/callback_list/%s.rst"),
ee=("Extension Types", "source/api/extension_types/%s.rst"),
dd=("Derived Datatypes", "source/api/derived_datatypes/%s.rst"),
mt=("Miscellaneous Types", "source/api/misc_types/%s.rst"),
fl=("Function List", "source/api/function_list/%s.rst"),
ds=("Data Sources", "source/api/data_sources/%s.rst"),
dq=("Derived Quantities", "source/api/derived_quantities/%s.rst"),
)
to_include = collections.defaultdict(list)
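# Each line of auto_generated.txt is expected to look like (hypothetical entry):
#   autoclass :: yt.visualization.plot_window.ProjectionPlot :: pt
# i.e. template name :: fully qualified object name :: file_names key.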
for line in open("auto_generated.txt"):
ftype, name, file_name = (s.strip() for s in line.split("::"))
cn = name.split(".")[-1]
if cn[0] == "_":
cn = cn[1:] # For leading _
fn = file_names[file_name][1] % cn
# if not os.path.exists(os.path.dirname(fn)):
# os.mkdir(os.path.dirname(fn))
header = "-" * len(name)
dd = dict(header=header, name=name)
# open(fn, "w").write(templates[ftype] % dd)
to_include[file_name].append(name)
for key, val in file_names.items():
title, file = val
fn = file.rsplit("/", 1)[0] + ".rst"
print(fn)
f = open(fn, "w")
dn = fn.split("/")[-1][:-4]
dd = dict(header="=" * len(title), title=title, dn=dn)
f.write(templates["index_file"] % dd)
for obj in sorted(to_include[key]):
f.write(f" {obj}\n")
././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/helper_scripts/table.py 0000644 0001751 0000177 00000006320 14714401662 017022 0 ustar 00runner docker contents = [
(
"Getting Started",
[
("welcome/index.html", "Welcome to yt!", "What's yt all about?"),
(
"orientation/index.html",
"yt Orientation",
"Quickly get up and running with yt: zero to sixty.",
),
(
"help/index.html",
"How to Ask for Help",
"Some guidelines on how and where to ask for help with yt",
),
(
"workshop.html",
"Workshop Tutorials",
"Videos, slides and scripts from the 2012 workshop covering many "
+ "aspects of yt, from beginning to advanced.",
),
],
),
(
"Everyday yt",
[
(
"analyzing/index.html",
"Analyzing Data",
"An overview of different ways to handle and process data: loading "
+ "data, using and manipulating objects and fields, examining and "
+ "manipulating particles, derived fields, generating processed data, "
+ "time series analysis.",
),
(
"visualizing/index.html",
"Visualizing Data",
"An overview of different ways to visualize data: making projections, "
+ "slices, phase plots, streamlines, and volume rendering; modifying "
+ "plots; the fixed resolution buffer.",
),
(
"interacting/index.html",
"Interacting with yt",
"Different ways -- scripting, GUIs, prompts, explorers -- to explore "
+ "your data.",
),
],
),
(
"Advanced Usage",
[
(
"advanced/index.html",
"Advanced yt usage",
"Advanced topics: parallelism, debugging, ways to drive yt, "
+ "developing",
),
(
"getting_involved/index.html",
"Getting Involved",
"How to participate in the community, contribute code and share "
+ "scripts",
),
],
),
(
"Reference Materials",
[
(
"cookbook/index.html",
"The Cookbook",
"A bunch of illustrated examples of how to do things",
),
(
"reference/index.html",
"Reference Materials",
"A list of all bundled fields, API documentation, the Change Log...",
),
("faq/index.html", "FAQ", "Frequently Asked Questions: answered for you!"),
],
),
]
heading_template = r"""
%s
"""
subheading_template = r"""
%s
|
%s
|
"""
t = ""
for heading, items in contents:
s = ""
for subheading in items:
s += subheading_template % subheading
t += heading_template % (heading, s)
print(t)
././@PaxHeader 0000000 0000000 0000000 00000000034 00000000000 010212 x ustar 00 28 mtime=1731331021.2151506
yt-4.4.0/doc/source/ 0000755 0001751 0000177 00000000000 14714401715 013631 5 ustar 00runner docker ././@PaxHeader 0000000 0000000 0000000 00000000034 00000000000 010212 x ustar 00 28 mtime=1731331021.2191505
yt-4.4.0/doc/source/_static/ 0000755 0001751 0000177 00000000000 14714401715 015257 5 ustar 00runner docker ././@PaxHeader 0000000 0000000 0000000 00000000026 00000000000 010213 x ustar 00 22 mtime=1731330994.0
yt-4.4.0/doc/source/_static/apiKey01.jpg 0000644 0001751 0000177 00000203454 14714401662 017355 0 ustar 00runner docker [binary JPEG image data omitted]