zc.buildout-1.7.1/CHANGES.txt

Change History
**************

1.7.1 (2013-02-21)
==================

Fixed: Constraints intended to prevent upgrading to
buildout-2-compatible recipes weren't expressed correctly, leading to
unintentional use of zc.recipe.egg-2.0.0a3.

1.7.0 (2013-01-11)
==================

- Unless version requirements are specified, buildout won't upgrade
  itself past version 1.

- Versions in versions sections can now be simple constraints, like
  <2.0dev, in addition to being simple versions.

  This is used to prevent upgrading zc.recipe.egg and
  zc.recipe.testrunner past version 1.

- If buildout is bootstrapped with a non-final release, it won't
  downgrade itself to a final release.

- Fix: distribute 0.6.33 broke Python 2.4 compatibility

- Remove `data_files` from `setup.py`, which was installing README.txt
  in the current directory during installation [Domen Kožar]

- Windows fix: use cli-64.exe/cli.exe depending on 64/32 bit, and try
  cli.exe if cli-64.exe is not found, fixing
  9c6be7ac6d218f09e33725e07dccc4af74d8cf97

- Windows fix: `buildout init` was broken; re.sub does not like a
  single backslash

- Fixed all builds on travis-ci [Domen Kožar]

- Use os._exit instead of sys.exit after upgrade forking [Domen Kožar]

- Revert cfa0478937d16769c268bf51e60e69cd3ead50f3, it only broke a
  feature [Domen Kožar]

1.6.3 (2012-08-22)
==================

- Fix Windows regression (see:
  https://github.com/buildout/buildout/commit/90bc44f9bffd0d9eb09aacf08c6a4c2fed797319
  and
  https://github.com/buildout/buildout/commit/e65b7bfbd7c7ccd556a278016a16b63ae8ef782b)
  [aclark4life]

1.6.2 (2012-08-21)
==================

- Fix Windows regression (see:
  https://github.com/buildout/buildout/commit/cfa0478937d16769c268bf51e60e69cd3ead50f3)
  [aclark4life]

1.6.1 (2012-08-18)
==================

- `bootstrap.py -d init` would invoke buildout with the arguments
  `init bootstrap`, leading to installation of the bootstrap package.
  Now bootstrap.py first runs any commands passed, then tries to
  bootstrap. (Domen Kožar)

- Fix Python 2.4 support (Domen Kožar)

- Added travis-ci testing (Domen Kožar)

1.6.0 (2012-08-15)
==================

- The buildout init command now accepts distribution requirements and
  paths to set up a custom interpreter part that has the distributions
  or parts in the path. For example::

    python bootstrap.py init BeautifulSoup

- Introduce a cache for the expensive `buildout._dir_hash` function.

- Remove duplicate path from script's sys.path setup.

- Changed the broken dash-S check to pass the configuration options
  -S -c separately, to make zc.buildout more compatible with the PyPy
  interpreter, which has less flexible argument parsing than CPython.

  Note that PyPy post 1.4.0 is needed to make buildout work at all,
  due to missing support for the ``-E`` option, which only got added
  afterwards.

- Made sure to download extended configuration files only once per
  buildout run, even if they are referenced multiple times (patch by
  Rafael Monnerat).

- Ported the speedup optimization patch by Ross Patterson to the 1.5.x
  series. Improved the patch to calculate required_by packages in
  linear time in verbose mode (-v). Running a relatively simple
  Buildout environment yielded a running-time improvement from 30
  seconds to 10 seconds. (Domen Kožar, Ross Patterson)

- Removed unnecessary pyc recompilation with optimization flags.
Running Buildout with pre-downloaded ~300 packages that were installed in empty eggs repository yielded in running time improvement from 1126 seconds to 348 seconds. (Domen Kožar) Bugs fixed: - In the download module, fixed the handling of directories that are pointed to by file-system paths and ``file:`` URLs. - Removed any traces of the implementation of ``extended-by``. Raise a UserError if the option is encountered instead of ignoring it, though. - https://bugs.launchpad.net/bugs/697913 : Buildout doesn't honor exit code from scripts. Fixed. - Handle both addition and subtraction of elements (+= and -=) on the same key in the same section. 1.5.2 (2010-10-11) ================== - changed metadata 'url' to pypi.python.org in order to solve a temporary outage of buildout.org - IMPORTANT: For better backwards compatibility with the pre-1.5 line, this release has two big changes from 1.5.0 and 1.5.1. - Buildout defaults to including site packages. - Buildout loads recipes and extensions with the same constraints to site-packages that it builds eggs, instead of never allowing access to site-packages. This means that the default configuration should better support pre-existing use of system Python in recipes or builds. - To make it easier to detect the fact that buildout has set the PYTHONPATH, BUILDOUT_ORIGINAL_PYTHONPATH is always set in the environment, even if PYTHONPATH was not originally set. BUILDOUT_ORIGINAL_PYTHONPATH will be an empty string if PYTHONPATH was not set. 1.5.1 (2010-08-29) ================== New features: - Scripts store the old PYTHONPATH in BUILDOUT_ORIGINAL_PYTHONPATH if it existed, and store nothing in the value if it did not exist. This allows code that does not want subprocesses to have the system-Python-protected site.py to set the environment of the subprocess as it was originally. Bugs fixed: - https://bugs.launchpad.net/bugs/623590 : If include-site-packages were true and versions were not set explicitly, system eggs were preferred over newer released eggs. Fixed. 1.5.0 (2010-08-23) ================== New features: - zc.buildout supports Python 2.7. - By default, Buildout and the bootstrap script now prefer final versions of Buildout, recipes, and extensions. This can be changed by using the --accept-buildout-test-releases flag (or -t for short) when calling bootstrap. This will hopefully allow beta releases of these items to be more easily and safely made in the future. NOTE: dependencies of your own software are not affected by this new behavior. Buildout continues to choose the newest available versions of your dependencies regardless of whether they are final releases. To prevent this, use the pre-existing switch ``prefer-final = true`` in the [buildout] section of your configuration file (see http://pypi.python.org/pypi/zc.buildout#preferring-final-releases) or pin your versions using a versions section (see http://pypi.python.org/pypi/zc.buildout#repeatable-buildouts-controlling-eggs-used). Bugs fixed: - You can now again use virtualenv with Buildout. The new features to let buildout be used with a system Python are disabled in this configuration, and the previous script generation behavior (1.4.3) is used, even if the new function ``zc.buildout.easy_install.sitepackage_safe_scripts`` is used. 1.5.0b2 (2010-04-29) ==================== This was a re-release of 1.4.3 in order to keep 1.5.0b1 release from hurting workflows that combined virtualenv with zc.buildout. 
1.5.0b1 (2010-04-29) ==================== New Features: - Added buildout:socket-timout option so that socket timeout can be configured both from command line and from config files. (gotcha) - Buildout can be safely used with a system Python (or any Python with code in site-packages), as long as you use (1) A fresh checkout, (2) the new bootstrap.py, and (3) recipes that use the new ``zc.buildout.easy_install.sitepackage_safe_scripts`` function to generate scripts and interpreters. Many recipes will need to be updated to use this new function. The scripts and interpreters generated by ``zc.recipe.egg`` will continue to use the older function, not safe with system Pythons. Use the ``z3c.recipe.scripts`` as a replacement. zc.recipe.egg is still a fully supported, and simpler, way of generating scripts and interpreters if you are using a "clean" Python, without code installed in site-packages. It keeps its previous behavior in order to provide backwards compatibility. The z3c.recipe.scripts recipe allows you to control how you use the code in site-packages. You can exclude it entirely (preferred); allow eggs in it to fulfill package dependencies declared in setup.py and buildout configuration; allow it to be available but not used to fulfill dependencies declared in setup.py or buildout configuration; or only allow certain eggs in site-packages to fulfill dependencies. - Added new function, ``zc.buildout.easy_install.sitepackage_safe_scripts``, to generate scripts and interpreter. It produces a full-featured interpreter (all command-line options supported) and the ability to safely let scripts include site packages, such as with a system Python. The ``z3c.recipe.scripts`` recipe uses this new function. - Improve bootstrap. * New options let you specify where to find ez_setup.py and where to find a download cache. These options can keep bootstrap from going over the network. * Another new option lets you specify where to put generated eggs. * The buildout script generated by bootstrap honors more of the settings in the designated configuration file (e.g., buildout.cfg). * Correctly handle systems where pkg_resources is present but the rest of setuptools is missing (like Ubuntu installs). https://bugs.launchpad.net/zc.buildout/+bug/410528 - You can develop zc.buildout using Distribute instead of Setuptools. Use the --distribute option on the dev.py script. (Releases should be tested with both Distribute and Setuptools.) The tests for zc.buildout pass with Setuptools and Python 2.4, 2.5, 2.6, and 2.7; and with Distribute and Python 2.5, 2.6, and 2.7. Using zc.buildout with Distribute and Python 2.4 is not recommended. - The ``distribute-version`` now works in the [buildout] section, mirroring the ``setuptools-version`` option (this is for consistency; using the general-purpose ``versions`` option is preferred). Bugs fixed: - Using Distribute with the ``allow-picked-versions = false`` buildout option no longer causes an error. - The handling and documenting of default buildout options was normalized. This means, among other things, that ``bin/buildout -vv`` and ``bin/buildout annotate`` correctly list more of the options. - Installing a namespace package using a Python that already has a package in the same namespace (e.g., in the Python's site-packages) failed in some cases. It is now handled correctly. - Another variation of this error showed itself when at least two dependencies were in a shared location like site-packages, and the first one met the "versions" setting. 
The first dependency would be added, but subsequent dependencies from the same location (e.g., site-packages) would use the version of the package found in the shared location, ignoring the version setting. This is also now handled correctly. 1.4.3 (2009-12-10) ================== Bugs fixed: - Using pre-detected setuptools version for easy_installing tgz files. This prevents a recursion error when easy_installing an upgraded "distribute" tgz. Note that setuptools did not have this recursion problem solely because it was packaged as an ``.egg``, which does not have to go through the easy_install step. 1.4.2 (2009-11-01) ================== New Feature: - Added a --distribute option to the bootstrap script, in order to use Distribute rather than Setuptools. By default, Setuptools is used. Bugs fixed: - While checking for new versions of setuptools and buildout itself, compare requirement locations instead of requirement objects. - Incrementing didn't work properly when extending multiple files. https://bugs.launchpad.net/zc.buildout/+bug/421022 - The download API computed MD5 checksums of text files wrong on Windows. 1.4.1 (2009-08-27) ================== New Feature: - Added a debug built-in recipe to make writing some tests easier. Bugs fixed: - (introduced in 1.4.0) option incrementing (-=) and decrementing (-=) didn't work in the buildout section. https://bugs.launchpad.net/zc.buildout/+bug/420463 - Option incrementing and decrementing didn't work for options specified on the command line. - Scripts generated with relative-paths enabled couldn't be symbolically linked to other locations and still work. - Scripts run using generated interpreters didn't have __file__ set correctly. - The standard Python -m option didn't work for custom interpreters. 1.4.0 (2009-08-26) ================== - When doing variable substitutions, you can omit the section name to refer to a variable in the same section (e.g. ${:foo}). - When doing variable substitution, you can use the special option, ``_buildout_section_name_`` to get the section name. This is most handy for getting the current section name (e.g. ${:_buildout_section_name_}). - A new special option, ``<`` allows sections to be used as macros. - Added annotate command for annotated sections. Displays sections key-value pairs along with the value origin. - Added a download API that handles the download cache, offline mode etc and is meant to be reused by recipes. - Used the download API to allow caching of base configurations (specified by the buildout section's 'extends' option). 1.3.1 (2009-08-12) ================== - Bug fixed: extras were ignored in some cases when versions were specified. 1.3.0 (2009-06-22) ================== - Better Windows compatibility in test infrastructure. - Now the bootstrap.py has an optional --version argument, that can be used to force zc.buildout version to use. - ``zc.buildout.testing.buildoutSetUp`` installs a new handler in the python root logging facility. This handler is now removed during tear down as it might disturb other packages reusing buildout's testing infrastructure. - fixed usage of 'relative_paths' keyword parameter on Windows - Added an unload entry point for extensions. - Fixed bug: when the relative paths option was used, relative paths could be inserted into sys.path if a relative path was used to run the generated script. 1.2.1 (2009-03-18) ================== - Refactored generation of relative egg paths to generate simpler code. 
1.2.0 (2009-03-17)
==================

- Added a relative_paths option to zc.buildout.easy_install.script to
  generate egg paths relative to the script they're used in.

1.1.2 (2009-03-16)
==================

- Added Python 2.6 support. Removed Python 2.3 support.

- Fixed remaining deprecation warnings under Python 2.6, both when
  running our tests and when using the package.

- Switched from using os.popen* to subprocess.Popen, to avoid a
  deprecation warning in Python 2.6.  See:
  http://docs.python.org/library/subprocess.html#replacing-os-popen-os-popen2-os-popen3

- Made sure the 'redo_pyc' function and the doctest checkers work with
  Python executable paths containing spaces.

- Expand shell patterns when processing the list of paths in
  `develop`, e.g.::

    [buildout]
    develop = ./local-checkouts/*

- Conditionally import and use hashlib.md5 when it's available,
  instead of the md5 module, which is deprecated in Python 2.6.

- Added Jython support for bootstrap, development bootstrap and
  zc.buildout support on Jython.

- Fixed a bug that would cause buildout to break while computing a
  directory hash if it found a broken symlink (Launchpad #250573)

1.1.1 (2008-07-28)
==================

- Fixed a bug that caused buildouts to fail when variable
  substitutions are used to name standard directories, as in::

    [buildout]
    eggs-directory = ${buildout:directory}/develop-eggs

1.1.0 (2008-07-19)
==================

- Added a buildout-level unzip option to change the default policy for
  unzipping zip-safe eggs.

- Tracebacks are now printed for internal errors (as opposed to user
  errors) even without the -D option.

- pyc and pyo files are regenerated for installed eggs so that the
  stored path in code objects matches the install location.

1.0.6 (2008-06-13)
==================

- Manually reverted the changeset for the fix for
  https://bugs.launchpad.net/zc.buildout/+bug/239212 to verify that
  the test actually fails with the changeset:
  http://svn.zope.org/zc.buildout/trunk/src/zc/buildout/buildout.py?rev=87309&r1=87277&r2=87309
  Thanks to tarek for pointing this out. (seletz)

- Fixed the test for the += -= syntax in buildout.txt, as the test was
  actually wrong. The original implementation did a split/join on
  whitespace, and later on that was corrected to respect the original
  EOL setting; the test was not updated, though. (seletz)

- Added a test to verify against
  https://bugs.launchpad.net/zc.buildout/+bug/239212 in allowhosts.txt
  (seletz)

- Further fixes for """AttributeError: Buildout instance has no
  attribute '_logger'""" by providing reasonable defaults within the
  Buildout constructor (related to the new 'allow-hosts' option)
  (patch by Gottfried Ganssauge) (ajung)

1.0.5 (2008-06-10)
==================

- Fixed wrong split when using the += and -= syntax (mustapha)

1.0.4 (2008-06-10)
==================

- Added the `allow-hosts` option (tarek)

- Quote the 'executable' argument when trying to detect the python
  version using popen4. (sidnei)

- Quote the 'spec' argument; as in the case of installing an egg from
  the buildout-cache, if the filename contains spaces it would fail
  (sidnei)

- Extended configuration syntax to allow -= and += operators (malthe,
  mustapha).

1.0.3 (2008-06-01)
==================

- Fix for """AttributeError: Buildout instance has no attribute
  '_logger'""" by providing reasonable defaults within the Buildout
  constructor. (patch by Gottfried Ganssauge) (ajung)

1.0.2 (2008-05-13)
==================

- More fixes for Windows.
A quoted sh-bang is now used on Windows to make the .exe files work with a Python executable in 'program files'. - Added "-t " option for specifying the socket timeout. (ajung) 1.0.1 (2008-04-02) ================== - Made easy_install.py's _get_version accept non-final releases of Python, like 2.4.4c0. (hannosch) - Applied various patches for Windows (patch by Gottfried Ganssauge). (ajung) - Applied patch fixing rmtree issues on Windows (patch by Gottfried Ganssauge). (ajung) 1.0.0 (2008-01-13) ================== - Added a French translation of the buildout tutorial. 1.0.0b31 (2007-11-01) ===================== Feature Changes --------------- - Added a configuration option that allows buildouts to ignore dependency_links metadata specified in setup. By default dependency_links in setup are used in addition to buildout specified find-links. This can make it hard to control where eggs come from. Here's how to tell buildout to ignore URLs in dependency_links:: [buildout] use-dependency-links = false By default use-dependency-links is true, which matches the behavior of previous versions of buildout. - Added a configuration option that causes buildout to error if a version is picked. This is a nice safety belt when fixing all versions is intended, especially when creating releases. Bugs Fixed ---------- - 151820: Develop failed if the setup.py script imported modules in the distribution directory. - Verbose logging of the develop command was omitting detailed output. - The setup command wasn't documented. - The setup command failed if run in a directory without specifying a configuration file. - The setup command raised a stupid exception if run without arguments. - When using a local find links or index, distributions weren't copied to the download cache. - When installing from source releases, a version specification (via a buildout versions section) for setuptools was ignored when deciding which setuptools to use to build an egg from the source release. 1.0.0b30 (2007-08-20) ===================== Feature Changes --------------- - Changed the default policy back to what it was to avoid breakage in existing buildouts. Use:: [buildout] prefer-final = true to get the new policy. The new policy will go into effect in buildout 2. 1.0.0b29 (2007-08-20) ===================== Feature Changes --------------- - Now, final distributions are prefered over non-final versions. If both final and non-final versions satisfy a requirement, then the final version will be used even if it is older. The normal way to override this for specific packages is to specifically require a non-final version, either specifically or via a lower bound. - There is a buildout prefer-final version that can be used with a value of "false":: prefer-final = false To prefer newer versions, regardless of whether or not they are final, buildout-wide. - The new simple Python index, http://cheeseshop.python.org/simple, is used as the default index. This will provide better performance than the human package index interface, http://pypi.python.org/pypi. More importantly, it lists hidden distributions, so buildouts with fixed distribution versions will be able to find old distributions even if the distributions have been hidden in the human PyPI interface. Bugs Fixed ---------- - 126441: Look for default.cfg in the right place on Windows. 1.0.0b28 (2007-07-05) ===================== Bugs Fixed ---------- - When requiring a specific version, buildout looked for new versions even if that single version was already installed. 
1.0.0b27 (2007-06-20)
=====================

Bugs Fixed
----------

- Scripts were generated incorrectly on Windows.  This included the
  buildout script itself, making buildout completely unusable.

1.0.0b26 (2007-06-19)
=====================

Feature Changes
---------------

- Thanks to recent fixes in setuptools, I was able to change buildout
  to use find-links and index information when searching extensions.

  Sadly, this work, especially the timing, was motivated by the need
  to use alternate indexes due to performance problems in the cheese
  shop (http://www.python.org/pypi/).  I really hope we can address
  these performance problems soon.

1.0.0b25 (2007-05-31)
=====================

Feature Changes
---------------

- buildout now changes to the buildout directory before running recipe
  install and update methods.

- Added a new init command for creating a new buildout.  This creates
  an empty configuration file and then bootstraps.

- Except when using the new init command, it is now an error to run
  buildout without a configuration file.

- In verbose mode, when adding distributions to fulfill requirements
  of already-added distributions, we now show why the new
  distributions are being added.

- Changed the logging format to exclude the logger name for the
  zc.buildout logger.  This reduces noise in the output.

- Cleaned up lots of messages, adding missing periods and adding
  quotes around requirement strings and file paths.

Bugs Fixed
----------

- 114614: Buildouts could take a very long time if there were
  dependency problems in large sets of pathologically interdependent
  packages.

- 59270: Buggy recipes can cause failures in later recipes via chdir.

- 61890: file:// urls don't seem to work in find-links.

  setuptools requires that file urls that point to directories must
  end in a "/".  Added a workaround.

- 75607: buildout should not run if it creates an empty buildout.cfg.

1.0.0b24 (2007-05-09)
=====================

Feature Changes
---------------

- Improved error reporting by showing which packages require other
  packages that can't be found or that cause version conflicts.

- Added an API for use by recipe writers to clean up created files
  when recipe errors occur.

- Log installed scripts.

Bugs Fixed
----------

- 92891: bootstrap crashes with recipe option in buildout section.

- 113085: Buildout exited with a zero exit status when internal errors
  occurred.

1.0.0b23 (2007-03-19)
=====================

Feature Changes
---------------

- Added support for download caches.  A buildout can specify a cache
  for distribution downloads.  The cache can be shared among buildouts
  to reduce network access and to support creating source
  distributions for applications allowing install without network
  access.

- Log scripts created, as suggested in:
  https://bugs.launchpad.net/zc.buildout/+bug/71353

Bugs Fixed
----------

- It wasn't possible to give options on the command line for sections
  not defined in a configuration file.

1.0.0b22 (2007-03-15)
=====================

Feature Changes
---------------

- Improved error reporting and debugging support:

  - Added "logical tracebacks" that show functionally what the
    buildout was doing when an error occurs.  Don't show a Python
    traceback unless the -D option is used.

  - Added a -D option that causes the buildout to print a traceback
    and start the pdb post-mortem debugger when an error occurs.

  - Warnings are printed for unused options in the buildout section
    and installed-part sections.  This should make it easier to catch
    option misspellings.
- Changed the way the installed database (.installed.cfg) is handled
  to avoid database corruption when a user breaks out of a buildout
  with control-c.

- Don't save an installed database if there are no installed parts or
  develop egg links.

1.0.0b21 (2007-03-06)
=====================

Feature Changes
---------------

- Added support for repeatable buildouts by allowing egg versions to
  be specified in a versions section.

- The easy_install module install and build functions now accept a
  versions argument that supplies a mapping from project names to
  version numbers.  This can be used to fix version numbers for
  required distributions and their dependencies.

  When a version isn't fixed, either via a versions option or via a
  fixed version number in a requirement, a debug log message is
  emitted indicating the version picked.  This is useful for setting
  versions options.

  A default_versions function can be used to set a default value for
  this option.

- Adjusted the output for verbosity levels.  Using a single -v option
  no longer causes voluminous setuptools output.  Using -vv and -vvv
  now triggers extra setuptools output.

- Added a remove testing helper function that removes files or
  directories.

1.0.0b20 (2007-02-08)
=====================

Feature Changes
---------------

- Added a buildout newest option, to control whether the newest
  distributions should be sought to meet requirements.  This might
  also provide a hint to recipes that don't deal with distributions.
  For example, a recipe that manages subversion checkouts might not
  update a checkout if newest is set to "false".

- Added a *newest* keyword parameter to the
  zc.buildout.easy_install.install and zc.buildout.easy_install.build
  functions to control whether the newest distributions that meet
  given requirements should be sought.  If a false value is provided
  for this parameter and already installed eggs meet the given
  requirements, then no attempt will be made to search for newer
  distributions.

- The recipe-testing support setUp function now adds the name
  *buildout* to the test namespace with a value that is the path to
  the buildout script in the sample buildout.  This allows tests to
  use

  >>> print system(buildout),

  rather than:

  >>> print system(join('bin', 'buildout')),

Bugs Fixed
----------

- Paths returned from update methods replaced lists of installed files
  rather than augmenting them.

1.0.0b19 (2007-01-24)
=====================

Bugs Fixed
----------

- Explicitly specifying a Python executable failed if the output of
  running Python with the -V option included a 2-digit (rather than a
  3-digit) version number.

1.0.0b18 (2007-01-22)
=====================

Feature Changes
---------------

- Added documentation for some previously undocumented features of the
  easy_install APIs.

- By popular demand, added a -o command-line option that is a
  shorthand for the assignment buildout:offline=true.

Bugs Fixed
----------

- When deciding whether recipe develop eggs had changed, buildout
  incorrectly considered files in .svn and CVS directories.

1.0.0b17 (2006-12-07)
=====================

Feature Changes
---------------

- Configuration files can now be loaded from URLs.

Bugs Fixed
----------

- https://bugs.launchpad.net/products/zc.buildout/+bug/71246
  Buildout extensions installed as eggs couldn't be loaded in offline
  mode.

1.0.0b16 (2006-12-07)
=====================

Feature Changes
---------------

- A new command-line argument, -U, suppresses reading user defaults.

- You can now suppress use of an installed-part database (e.g.
  .installed.cfg) by specifying an empty value for the buildout
  installed option.

Bugs Fixed
----------

- When the install command is used with a list of parts, only those
  parts are supposed to be installed, but the buildout was also
  building parts that those parts depended on.

1.0.0b15 (2006-12-06)
=====================

Bugs Fixed
----------

- Uninstall recipes weren't loaded correctly in cases where no parts
  in the (new) configuration used the recipe egg.

1.0.0b14 (2006-12-05)
=====================

Feature Changes
---------------

- Added uninstall recipes for dealing with complex uninstallation
  scenarios.

Bugs Fixed
----------

- Automatic upgrades weren't performed on Windows due to a bug that
  caused buildout to incorrectly determine that it wasn't running
  locally in a buildout.

- Fixed some spurious test failures on Windows.

1.0.0b13 (2006-12-04)
=====================

Feature Changes
---------------

- Variable substitutions now reflect option data written by recipes.

- A part referenced by a part in a parts list is now added to the
  parts list before the referencing part.  This means that you can
  omit parts from the parts list if they are referenced by other
  parts.

- Added a develop function to the easy_install module to aid in
  creating develop eggs with custom build_ext options.

- The build and develop functions in the easy_install module now
  return the path of the egg or egg link created.

- Removed the limitation that parts named in the install command can
  only name configured parts.

- Removed support for ConfigParser-style variable substitutions
  (e.g. %(foo)s).  Only the string-template style of variable
  (e.g. ${section:option}) substitutions will be supported.
  Supporting both violates "there's only one way to do it".

- Deprecated the buildout-section extendedBy option.

Bugs Fixed
----------

- We treat setuptools as a dependency of any distribution that
  (declares that it) uses namespace packages, whether it declares
  setuptools as a dependency or not.  This wasn't working for eggs
  installed by virtue of being dependencies.

1.0.0b12 (2006-10-24)
=====================

Feature Changes
---------------

- Added an initialization argument to the
  zc.buildout.easy_install.scripts function to include initialization
  code in generated scripts.

1.0.0b11 (2006-10-24)
=====================

Bugs Fixed
----------

`67737 `_
  Verbose and quiet output options caused errors when the develop
  buildout option was used to create develop eggs.

`67871 `_
  Installation failed if the source was a (local) unzipped egg.

`67873 `_
  There was an error in producing an error message when part names
  passed to the install command weren't included in the configuration.

1.0.0b10 (2006-10-16)
=====================

Feature Changes
---------------

- Renamed the runsetup command to setup.  (The old name still works.)

- Added a recipe update method.  Now install is only called when a
  part is installed for the first time, or after an uninstall.
  Otherwise, update is called.  For backward compatibility, recipes
  that don't define update methods are still supported.

- If a distribution defines namespace packages but fails to declare
  setuptools as one of its dependencies, we now treat setuptools as an
  implicit dependency.  We generate a warning if the distribution is a
  develop egg.

- You can now create develop eggs for setup scripts that don't use
  setuptools.

Bugs Fixed
----------

- Egg links weren't removed when corresponding entries were removed
  from develop sections.
- Running a non-local buildout command (one not installed in the
  buildout) led to a hang if new versions of zc.buildout or setuptools
  were available.  Now we issue a warning and don't upgrade.

- When installing zip-safe eggs from local directories, the eggs were
  moved, rather than copied, removing them from the source directory.

1.0.0b9 (2006-10-02)
====================

Bugs Fixed
----------

Non-zip-safe eggs were not unzipped when they were installed.

1.0.0b8 (2006-10-01)
====================

Bugs Fixed
----------

- Installing source distributions failed when using alternate Python
  versions (depending on the versions of Python used.)

- Installing eggs wasn't handled as efficiently as possible due to a
  bug in egg URL parsing.

- Fixed a bug in runsetup that caused setup scripts that introspected
  __file__ to fail.

1.0.0b7
=======

Added a documented testing framework for use by recipes.  Refactored
the buildout tests to use it.

Added a runsetup command to run a setup script.  This is handy if,
like me, you don't install setuptools in your system Python.

1.0.0b6
=======

Fixed https://launchpad.net/products/zc.buildout/+bug/60582: Use of
extension options caused bootstrapping to fail if the eggs directory
didn't already exist.  We no longer use extensions for bootstrapping.
There really isn't any reason to anyway.

1.0.0b5
=======

Refactored to do more work in buildout and less work in easy_install.
This makes things go a little faster, makes errors a little easier to
handle, and allows extensions (like the sftp extension) to influence
more of the process.

This was done to fix a problem in using the sftp support.

1.0.0b4
=======

- Added an **experimental** extensions mechanism, mainly to support
  adding sftp support to buildouts that need it.

- Fixed buildout self-updating on Windows.

1.0.0b3
=======

- Added a help option (-h, --help)

- Increased the default level of verbosity.

- Buildouts now automatically update themselves to new versions of
  zc.buildout and setuptools.

- Added Windows support.

- Added a recipe API for generating user errors.

- No-longer generate a py_zc.buildout script.

- Fixed some bugs in variable substitutions.  The characters "-", "."
  and " " weren't allowed in section or option names.  Substitutions
  with invalid names were ignored, which caused misleading failures
  downstream.

- Improved error handling.  No longer show tracebacks for user errors.

- Now require a recipe option (and therefore a section) for every
  part.

- Expanded the easy_install module API to:

  - Allow extra paths to be provided

  - Specify explicit entry points

  - Specify entry-point arguments

1.0.0b2
=======

Added support for specifying some build_ext options when installing
eggs from source distributions.

1.0.0b1
=======

- Changed the bootstrapping code to only install setuptools and
  zc.buildout.  The bootstrap code no-longer runs the buildout itself.
  This was to fix a bug that caused parts to be recreated
  unnecessarily because the recipe signature in the initial buildout
  reflected temporary locations for setuptools and zc.buildout.

- Now create a minimal setup.py if it doesn't exist and issue a
  warning that it is being created.

- Fixed bug in saving installed configuration data.  %'s and extra
  spaces weren't quoted.
1.0.0a1
=======

Initial public version

zc.buildout-1.7.1/COPYRIGHT.txt

Zope Foundation and Contributors

zc.buildout-1.7.1/DEVELOPERS.txt

Developing buildout itself
**************************

When you're developing buildout itself, you need to know two things:

- Use a clean python *without* setuptools installed.  Otherwise many
  tests will find your already-installed setuptools, leading to test
  differences when setuptools' presence is explicitly tested.

- Don't bootstrap with ``python bootstrap/bootstrap.py`` but with
  ``python dev.py``.

A note on testing:

- You should have specific python executable versions at specific
  locations, or PYTHONxy environment variables pointing to them.  See
  zc.buildout's testing.py, ``def find_python(version)``.

  If you use python2.5 to test, the test runner will look for
  python2.6 in your path.  If you use any other python version, it
  will look for python2.5 in your path, or it will look for the
  PYTHON25 environment variable.  To start the tests with Python 2.7,
  for instance, you would run::

    $ env PYTHON25=/path/to/bin/python2.5 bin/test

zc.buildout-1.7.1/LICENSE.txt

Zope Public License (ZPL) Version 2.1

A copyright notice accompanies this license document that identifies
the copyright holders.

This license has been certified as open source.  It has also been
designated as GPL compatible by the Free Software Foundation (FSF).

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:

1. Redistributions in source code must retain the accompanying
   copyright notice, this list of conditions, and the following
   disclaimer.

2. Redistributions in binary form must reproduce the accompanying
   copyright notice, this list of conditions, and the following
   disclaimer in the documentation and/or other materials provided
   with the distribution.

3. Names of the copyright holders must not be used to endorse or
   promote products derived from this software without prior written
   permission from the copyright holders.

4. The right to distribute this software or to use it for any purpose
   does not give you the right to use Servicemarks (sm) or Trademarks
   (tm) of the copyright holders.  Use of them is covered by separate
   agreement with the copyright holders.

5. If any files are modified, you must cause the modified files to
   carry prominent notices stating that you changed the files and the
   date of any change.

Disclaimer

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE COPYRIGHT HOLDERS BE
LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
zc.buildout-1.7.1/PKG-INFO0000644000076500007650000124540112111415075014455 0ustar jimjim00000000000000Metadata-Version: 1.1 Name: zc.buildout Version: 1.7.1 Summary: System for managing development buildouts Home-page: http://pypi.python.org/pypi/zc.buildout Author: Jim Fulton Author-email: jim@zope.com License: ZPL 2.1 Description: ******** Buildout ******** .. contents:: The Buildout project provides support for creating applications, especially Python applications. It provides tools for assembling applications from multiple parts, Python or otherwise. An application may actually contain multiple programs, processes, and configuration settings. The word "buildout" refers to a description of a set of parts and the software to create and assemble them. It is often used informally to refer to an installed system based on a buildout definition. For example, if we are creating an application named "Foo", then "the Foo buildout" is the collection of configuration and application-specific software that allows an instance of the application to be created. We may refer to such an instance of the application informally as "a Foo buildout". To get a feel for some of the things you might use buildouts for, see the `Buildout examples`_. To lean more about using buildouts, see `Detailed Documentation`_. To see screencasts, talks, useful links and more documentation, visit the `Buildout website `_. Recipes ******* Existing recipes include: `zc.recipe.egg `_ The egg recipe installes one or more eggs, with their dependencies. It installs their console-script entry points with the needed eggs included in their paths. It is suitable for use with a "clean" Python: one without packages installed in site-packages. `z3c.recipe.scripts `_ Like zc.recipe.egg, this recipe builds interpreter scripts and entry point scripts based on eggs. It can be used with a Python that has packages installed in site-packages, such as a system Python. The interpreter also has more features than the one offered by zc.recipe.egg. `zc.recipe.testrunner `_ The testrunner egg creates a test runner script for one or more eggs. `zc.recipe.zope3checkout `_ The zope3checkout recipe installs a Zope 3 checkout into a buildout. `zc.recipe.zope3instance `_ The zope3instance recipe sets up a Zope 3 instance. `zc.recipe.filestorage `_ The filestorage recipe sets up a ZODB file storage for use in a Zope 3 instance created by the zope3instance recipe. Buildout examples ***************** Here are a few examples of what you can do with buildouts. We'll present these as a set of use cases. Try out an egg ============== Sometimes you want to try an egg (or eggs) that someone has released. You'd like to get a Python interpreter that lets you try things interactively or run sample scripts without having to do path manipulations. If you can and don't mind modifying your Python installation, you could use easy_install, otherwise, you could create a directory somewhere and create a buildout.cfg file in that directory containing:: [buildout] parts = mypython [mypython] recipe = zc.recipe.egg interpreter = mypython eggs = theegg where theegg is the name of the egg you want to try out. Run buildout in this directory. It will create a bin subdirectory that includes a mypython script. If you run mypython without any arguments you'll get an interactive interpreter with the egg in the path. If you run it with a script and script arguments, the script will run with the egg in its path. Of course, you can specify as many eggs as you want in the eggs option. 
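For instance, a part that pulls several eggs into a single interpreter
might look like the following sketch (the egg names here are
placeholders, not real projects)::

    [mypython]
    recipe = zc.recipe.egg
    interpreter = mypython
    eggs =
        theegg
        anotheregg
        thirdegg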
If the egg provides any scripts (console_scripts entry points), those will be installed in your bin directory too. Work on a package ================= I often work on packages that are managed separately. They don't have scripts to be installed, but I want to be able to run their tests using the `zope.testing test runner `_. In this kind of application, the program to be installed is the test runner. A good example of this is `zc.ngi `_. Here I have a subversion project for the zc.ngi package. The software is in the src directory. The configuration file is very simple:: [buildout] develop = . parts = test [test] recipe = zc.recipe.testrunner eggs = zc.ngi I use the develop option to create a develop egg based on the current directory. I request a test script named "test" using the zc.recipe.testrunner recipe. In the section for the test script, I specify that I want to run the tests in the zc.ngi package. When I check out this project into a new sandbox, I run bootstrap.py to get setuptools and zc.buildout and to create bin/buildout. I run bin/buildout, which installs the test script, bin/test, which I can then use to run the tests. This is probably the most common type of buildout. If I need to run a previous version of zc.buildout, I use the `--version` option of the bootstrap.py script:: $ python bootstrap.py --version 1.1.3 The `zc.buildout project `_ is a slightly more complex example of this type of buildout. Install egg-based scripts ========================= A variation of the `Try out an egg`_ use case is to install scripts into your ~/bin directory (on Unix, of course). My ~/bin directory is a buildout with a configuration file that looks like:: [buildout] parts = foo bar bin-directory = . [foo] ... where foo and bar are packages with scripts that I want available. As I need new scripts, I can add additional sections. The bin-directory option specified that scripts should be installed into the current directory. Multi-program multi-machine systems =================================== Using an older prototype version of the buildout, we've build a number of systems involving multiple programs, databases, and machines. One typical example consists of: - Multiple Zope instances - Multiple ZEO servers - An LDAP server - Cache-invalidation and Mail delivery servers - Dozens of add-on packages - Multiple test runners - Multiple deployment modes, including dev, stage, and prod, with prod deployment over multiple servers Parts installed include: - Application software installs, including Zope, ZEO and LDAP software - Add-on packages - Bundles of configuration that define Zope, ZEO and LDAP instances - Utility scripts such as test runners, server-control scripts, cron jobs. Questions and Bug Reporting *************************** Please send questions and comments to the `distutils SIG mailing list `_. Report bugs using the `zc.buildout Launchpad Bug Tracker `_. System Python and zc.buildout 1.5 ********************************* The 1.5 line of zc.buildout introduced a number of changes. Problems ======== As usual, please send questions and comments to the `distutils SIG mailing list `_. Report bugs using the `zc.buildout Launchpad Bug Tracker `_. If problems are keeping you from your work, here's an easy way to revert to the old code temporarily: switch to a custom "emergency" bootstrap script, available from http://svn.zope.org/repos/main/zc.buildout/branches/1.4/bootstrap/bootstrap.py . This customized script will select zc.buildout 1.4.4 by default. 
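A buildout like this is still driven by an ordinary configuration
file; at the top level it is mostly just a much longer parts list.  A
heavily simplified, hypothetical sketch (every part name below is
invented)::

    [buildout]
    develop = src/ourapp
    parts =
        zope-instance-1
        zope-instance-2
        zeo-server
        ldap-server
        test

Each of those parts then gets its own section naming a recipe
(zc.recipe.zope3instance, zc.recipe.testrunner, and so on), exactly as
in the smaller examples above.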
zc.buildout 1.4.4 will not upgrade itself unless you explicitly specify a new version. It will also prefer older versions of zc.recipe.egg and some other common recipes. If you have trouble with other recipes, consider using a standard buildout "versions" section to specify older versions of these, as described in the Buildout documentation (http://pypi.python.org/pypi/zc.buildout#repeatable-buildouts-controlling-eggs-used). Working with a System Python ============================ While there are a number of new features available in zc.buildout 1.5, the biggest is that Buildout itself supports usage with a system Python. This can work if you follow a couple of simple rules. 1. Use the new bootstrap.py (available from http://svn.zope.org/*checkout*/zc.buildout/trunk/bootstrap/bootstrap.py). 2. Use buildout recipes that have been upgraded to work with zc.buildout 1.5 and higher. Specifically, they should use ``zc.buildout.easy_install.sitepackage_safe_scripts`` to generate their scripts, if any, rather than ``zc.buildout.easy_install.scripts``. See the `Recipes That Support a System Python`_ section below for more details on recipes that are available as of this writing, and `Updating Recipes to Support a System Python`_ for instructions on how to update a recipe. Note that you should generally only need to update recipes that generate scripts. You can then use ``include-site-packages = false`` and ``exec-sitecustomize = false`` buildout options to eliminate access to your Python's site packages and not execute its sitecustomize file, if it exists, respectively. Alternately, you can use the ``allowed-eggs-from-site-packages`` buildout option as a glob-aware whitelist of eggs that may come from site-packages. This value defaults to "*", accepting all eggs. It's important to note that recipes not upgraded for zc.buildout 1.5.0 should continue to work--just without internal support for a system Python. Using a system Python is inherently fragile. Using a clean, freshly-installed Python without customization in site-packages is more robust and repeatable. See some of the regression tests added to the 1.5.0 line for the kinds of issues that you can encounter with a system Python, and see http://pypi.python.org/pypi/z3c.recipe.scripts#including-site-packages-and-sitecustomize for more discussion. However, using a system Python can be very convenient, and the zc.buildout code for this feature has been tested by many users already. Moreover, it has automated tests to exercise the problems that have been encountered and fixed. Many people rely on it. Recipes That Support a System Python ==================================== zc.recipe.egg continues to generate old-style scripts that are not safe for use with a system Python. This was done for backwards compatibility, because it is integral to so many buildouts and used as a dependency of so many other recipes. If you want to generate new-style scripts that do support system Python usage, use z3c.recipe.scripts instead (http://pypi.python.org/pypi/z3c.recipe.scripts). z3c.recipe.scripts has the same script and interpreter generation options as zc.recipe.egg, plus a few more for the new features mentioned above. In the simplest case, you should be able to simply change ``recipe = zc.recipe.egg`` to ``recipe = z3c.recipe.scripts`` in the pertinent sections of your buildout configuration and your generated scripts will work with a system Python. Other updated recipes include zc.recipe.testrunner 1.4.0 and z3c.recipe.tag 0.4.0. 
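For illustration only (the section and egg names here are made up),
such a change might look like the following.  A part such as::

    [scripts]
    recipe = zc.recipe.egg
    eggs = myapplication

would become::

    [scripts]
    recipe = z3c.recipe.scripts
    eggs = myapplication

with the remaining options left unchanged.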
Others should be updated soon: see their change documents for details, or see `Updating Recipes to Support a System Python`_ for instructions on how to update recipes yourself. Templates for creating Python scripts with the z3c.recipe.filetemplate recipe can be easily changed to support a system Python. - If you don't care about supporting relative paths, simply using a generated interpreter with the eggs you want should be sufficient, as it was before. For instance, if the interpreter is named "py", use ``#!${buildout:bin-directory/py}`` or ``#!/usr/bin/env ${buildout:bin-directory/py}``). - If you do care about relative paths, (``relative-paths = true`` in your buildout configuration), then z3c.recipe.scripts does require a bit more changes, as is usual for the relative path support in that package. First, use z3c.recipe.scripts to generate a script or interpreter with the dependencies you want. This will create a directory in ``parts`` that has a site.py and sitecustomize.py. Then, begin your script as in the snippet below. The example assumes that the z3c.recipe.scripts generated were from a Buildout configuration section labeled "scripts": adjust accordingly. :: #!${buildout:executable} -S ${python-relative-path-setup} import sys sys.path.insert(0, ${scripts:parts-directory|path-repr}) import site Updating Recipes to Support a System Python =========================================== You should generally only need to update recipes that generate scripts. These recipes need to change from using ``zc.buildout.easy_install.scripts`` to be using ``zc.buildout.easy_install.sitepackage_safe_scripts``. The signatures of the two functions are different. Please compare:: def scripts( reqs, working_set, executable, dest, scripts=None, extra_paths=(), arguments='', interpreter=None, initialization='', relative_paths=False, ): def sitepackage_safe_scripts( dest, working_set, executable, site_py_dest, reqs=(), scripts=None, interpreter=None, extra_paths=(), initialization='', include_site_packages=False, exec_sitecustomize=False, relative_paths=False, script_arguments='', script_initialization='', ): In most cases, the arguments are merely reordered. The ``reqs`` argument is no longer required in order to make it easier to generate an interpreter alone. The ``arguments`` argument was renamed to ``script_arguments`` to clarify that it did not affect interpreter generation. The only new required argument is ``site_py_dest``. It must be the path to a directory in which the customized site.py and sitecustomize.py files will be written. A typical generation in a recipe will look like this. (In the recipe's __init__ method...) :: self.options = options b_options = buildout['buildout'] options['parts-directory'] = os.path.join( b_options['parts-directory'], self.name) (In the recipe's install method...) :: options = self.options generated = [] if not os.path.exists(options['parts-directory']): os.mkdir(options['parts-directory']) generated.append(options['parts-directory']) Then ``options['parts-directory']`` can be used for the ``site_py_dest`` value. If you want to support the other arguments (``include_site_packages``, ``exec_sitecustomize``, ``script_initialization``, as well as the ``allowed-eggs-from-site-packages`` option), you might want to look at some of the code in https://github.com/buildout/buildout/blob/1.6.x/z3c.recipe.scripts_/src/z3c/recipe/scripts/scripts.py . You might even be able to adopt some of it by subclassing or delegating. 
The Scripts class in that file is the closest to what you might be used to from zc.recipe.egg. Important note for recipe authors: As of buildout 1.5.2, the code in recipes is *always run with the access to the site-packages as configured in the buildout section*. virtualenv ========== Using virtualenv (http://pypi.python.org/pypi/virtualenv) with the --no-site-packages option already provided a simple way of using a system Python. This is intended to continue to work, and some automated tests exist to demonstrate this. However, it is only supported to the degree that people have found it to work in the past. The existing Buildout tests for virtualenv are only for problems encountered previously. They are very far from comprehensive. Using Buildout with a system python has at least three advantages over using Buildout in conjunction with virtualenv. They may or may not be pertinent to your desired usage. - Unlike ``virtualenv --no-site-packages``, Buildout's support allows you to choose to let packages from your system Python be available to your software (see ``include-site-packages`` in http://pypi.python.org/pypi/z3c.recipe.scripts). You can even specify which eggs installed in your system Python can be allowed to fulfill some of your packages' dependencies (see ``allowed-eggs-from-site-packages`` in http://pypi.python.org/pypi/z3c.recipe.scripts). At the expense of some repeatability and platform dependency, this flexibility means that, for instance, you can rely on difficult-to-build eggs like lxml coming from your system Python. - Buildout's implementation has a full set of automated tests. - An integral Buildout implementation means fewer steps and fewer dependencies to work with a system Python. Detailed Documentation ********************** Buildouts ========= The word "buildout" refers to a description of a set of parts and the software to create and assemble them. It is often used informally to refer to an installed system based on a buildout definition. For example, if we are creating an application named "Foo", then "the Foo buildout" is the collection of configuration and application-specific software that allows an instance of the application to be created. We may refer to such an instance of the application informally as "a Foo buildout". This document describes how to define buildouts using buildout configuration files and recipes. There are three ways to set up the buildout software and create a buildout instance: 1. Install the zc.buildout egg with easy_install and use the buildout script installed in a Python scripts area. 2. Use the buildout bootstrap script to create a buildout that includes both the setuptools and zc.buildout eggs. This allows you to use the buildout software without modifying a Python install. The buildout script is installed into your buildout local scripts area. 3. Use a buildout command from an already installed buildout to bootstrap a new buildout. (See the section on bootstraping later in this document.) Often, a software project will be managed in a software repository, such as a subversion repository, that includes some software source directories, buildout configuration files, and a copy of the buildout bootstrap script. To work on the project, one would check out the project from the repository and run the bootstrap script which installs setuptools and zc.buildout into the checkout as well as any parts defined. We have a sample buildout that we created using the bootstrap command of an existing buildout (method 3 above). 
It has the absolute minimum information. We have bin, develop-eggs, eggs and parts directories, and a configuration file: >>> ls(sample_buildout) d bin - buildout.cfg d develop-eggs d eggs d parts The bin directory contains scripts. >>> ls(sample_buildout, 'bin') - buildout >>> ls(sample_buildout, 'eggs') - setuptools-0.6-py2.4.egg - zc.buildout-1.0-py2.4.egg The develop-eggs directory is initially empty: >>> ls(sample_buildout, 'develop-eggs') The develop-eggs directory holds egg links for software being developed in the buildout. We separate develop-eggs and other eggs to allow eggs directories to be shared across multiple buildouts. For example, a common developer technique is to define a common eggs directory in their home that all non-develop eggs are stored in. This allows larger buildouts to be set up much more quickly and saves disk space. The parts directory just contains some helpers for the buildout script itself. >>> ls(sample_buildout, 'parts') d buildout The parts directory provides an area where recipes can install part data. For example, if we built a custom Python, we would install it in the part directory. Part data is stored in a sub-directory of the parts directory with the same name as the part. Buildouts are defined using configuration files. These are in the format defined by the Python ConfigParser module, with extensions that we'll describe later. By default, when a buildout is run, it looks for the file buildout.cfg in the directory where the buildout is run. The minimal configuration file has a buildout section that defines no parts: >>> cat(sample_buildout, 'buildout.cfg') [buildout] parts = A part is simply something to be created by a buildout. It can be almost anything, such as a Python package, a program, a directory, or even a configuration file. Recipes ------- A part is created by a recipe. Recipes are always installed as Python eggs. They can be downloaded from a package server, such as the Python Package Index, or they can be developed as part of a project using a "develop" egg. A develop egg is a special kind of egg that gets installed as an "egg link" that contains the name of a source directory. Develop eggs don't have to be packaged for distribution to be used and can be modified in place, which is especially useful while they are being developed. Let's create a recipe as part of the sample project. We'll create a recipe for creating directories. First, we'll create a recipes source directory for our local recipes: >>> mkdir(sample_buildout, 'recipes') and then we'll create a source file for our mkdir recipe: >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import logging, os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... options['path'] = os.path.join( ... buildout['buildout']['directory'], ... options['path'], ... ) ... if not os.path.isdir(os.path.dirname(options['path'])): ... logging.getLogger(self.name).error( ... 'Cannot create %s. %s is not a directory.', ... options['path'], os.path.dirname(options['path'])) ... raise zc.buildout.UserError('Invalid Path') ... ... ... def install(self): ... path = self.options['path'] ... logging.getLogger(self.name).info( ... 'Creating directory %s', os.path.basename(path)) ... os.mkdir(path) ... return path ... ... def update(self): ... pass ... """) Currently, recipes must define 3 methods [#future_recipe_methods]_: - a constructor, - an install method, and - an update method. 
The constructor is responsible for updating a part's options to reflect data read from other sections. The buildout system keeps track of whether a part specification has changed. A part specification has changed if its options, after adjusting for data read from other sections, have changed, or if the recipe has changed. Only the options for the part are considered. If data are read from other sections, then that information has to be reflected in the part's options. In the Mkdir example, the given path is interpreted relative to the buildout directory, and data from the buildout directory is read. The path option is updated to reflect this. If the directory option was changed in the buildout section, we would know to update parts created using the mkdir recipe using relative path names. When buildout is run, it saves configuration data for installed parts in a file named ".installed.cfg". In subsequent runs, it compares part-configuration data stored in the .installed.cfg file and the part-configuration data loaded from the configuration files as modified by recipe constructors to decide if the configuration of a part has changed. If the configuration has changed, or if the recipe has changed, then the part is uninstalled and reinstalled. The buildout only looks at the part's options, so any data used to configure the part needs to be reflected in the part's options. It is the job of a recipe constructor to make sure that the options include all relevant data. Of course, parts are also uninstalled if they are no longer used. The recipe defines a constructor that takes a buildout object, a part name, and an options dictionary. It saves them in instance attributes. If the path is relative, we'll interpret it as relative to the buildout directory. The buildout object passed in is a mapping from section name to a mapping of options for that section. The buildout directory is available as the directory option of the buildout section. We normalize the path and save it back into the options dictionary. The install method is responsible for creating the part. In this case, we need the path of the directory to create. We'll use the path option from our options dictionary. The install method logs what it's doing using the Python logging module. We return the path that we installed. If the part is uninstalled or reinstalled, then the path returned will be removed by the buildout machinery. A recipe install method is expected to return a string, or an iterable of strings containing paths to be removed if a part is uninstalled. For most recipes, this is all of the uninstall support needed. For more complex uninstallation scenarios, use `Uninstall recipes`_. The update method is responsible for updating an already installed part. An empty method is often provided, as in this example, if parts can't be updated. An update method can return None, a string, or an iterable of strings. If a string or iterable of strings is returned, then the saved list of paths to be uninstalled is updated with the new information by adding any new files returned by the update method. We need to provide packaging information so that our recipe can be installed as a develop egg. The minimum information we need to specify [#packaging_info]_ is a name. For recipes, we also need to define the names of the recipe classes as entry points. Packaging information is provided via a setup.py script: >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... ... setup( ... name = "recipes", ... 
entry_points = {'zc.buildout': ['mkdir = mkdir:Mkdir']}, ... ) ... """) Our setup script defines an entry point. Entry points provide a way for an egg to define the services it provides. Here we've said that we define a zc.buildout entry point named mkdir. Recipe classes must be exposed as entry points in the zc.buildout group. we give entry points names within the group. We also need a README.txt for our recipes to avoid an annoying warning from distutils, on which setuptools and zc.buildout are based: >>> write(sample_buildout, 'recipes', 'README.txt', " ") Now let's update our buildout.cfg: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = mystuff ... """) Let's go through the changes one by one:: develop = recipes This tells the buildout to install a development egg for our recipes. Any number of paths can be listed. The paths can be relative or absolute. If relative, they are treated as relative to the buildout directory. They can be directory or file paths. If a file path is given, it should point to a Python setup script. If a directory path is given, it should point to a directory containing a setup.py file. Development eggs are installed before building any parts, as they may provide locally-defined recipes needed by the parts. :: parts = data-dir Here we've named a part to be "built". We can use any name we want except that different part names must be unique and recipes will often use the part name to decide what to do. :: [data-dir] recipe = recipes:mkdir path = mystuff When we name a part, we also create a section of the same name that contains part data. In this section, we'll define the recipe to be used to install the part. In this case, we also specify the path to be created. Let's run the buildout. We do so by running the build script in the buildout: >>> import os >>> os.chdir(sample_buildout) >>> buildout = os.path.join(sample_buildout, 'bin', 'buildout') >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory mystuff We see that the recipe created the directory, as expected: >>> ls(sample_buildout) - .installed.cfg d bin - buildout.cfg d develop-eggs d eggs d mystuff d parts d recipes In addition, .installed.cfg has been created containing information about the part we installed: >>> cat(sample_buildout, '.installed.cfg') [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = data-dir [data-dir] __buildout_installed__ = /sample-buildout/mystuff __buildout_signature__ = recipes-c7vHV6ekIDUPy/7fjAaYjg== path = /sample-buildout/mystuff recipe = recipes:mkdir Note that the directory we installed is included in .installed.cfg. In addition, the path option includes the actual destination directory. If we change the name of the directory in the configuration file, we'll see that the directory gets removed and recreated: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling data-dir. Installing data-dir. 
data-dir: Creating directory mydata >>> ls(sample_buildout) - .installed.cfg d bin - buildout.cfg d develop-eggs d eggs d mydata d parts d recipes If any of the files or directories created by a recipe are removed, the part will be reinstalled: >>> rmdir(sample_buildout, 'mydata') >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling data-dir. Installing data-dir. data-dir: Creating directory mydata Error reporting --------------- If a user makes an error, an error needs to be printed and work needs to stop. This is accomplished by logging a detailed error message and then raising a (or an instance of a subclass of a) zc.buildout.UserError exception. Raising an error other than a UserError still displays the error, but labels it as a bug in the buildout software or recipe. In the sample above, of someone gives a non-existent directory to create the directory in: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = /xxx/mydata ... """) We'll get a user error, not a traceback. >>> print system(buildout), Develop: '/sample-buildout/recipes' data-dir: Cannot create /xxx/mydata. /xxx is not a directory. While: Installing. Getting section data-dir. Initializing part data-dir. Error: Invalid Path Recipe Error Handling --------------------- If an error occurs during installation, it is up to the recipe to clean up any system side effects, such as files created. Let's update the mkdir recipe to support multiple paths: >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import logging, os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... ... # Normalize paths and check that their parent ... # directories exist: ... paths = [] ... for path in options['path'].split(): ... path = os.path.join(buildout['buildout']['directory'], path) ... if not os.path.isdir(os.path.dirname(path)): ... logging.getLogger(self.name).error( ... 'Cannot create %s. %s is not a directory.', ... options['path'], os.path.dirname(options['path'])) ... raise zc.buildout.UserError('Invalid Path') ... paths.append(path) ... options['path'] = ' '.join(paths) ... ... def install(self): ... paths = self.options['path'].split() ... for path in paths: ... logging.getLogger(self.name).info( ... 'Creating directory %s', os.path.basename(path)) ... os.mkdir(path) ... return paths ... ... def update(self): ... pass ... """) .. >>> clean_up_pyc(sample_buildout, 'recipes', 'mkdir.py') If there is an error creating a path, the install method will exit and leave previously created paths in place: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = foo bin ... """) >>> print system(buildout), # doctest: +ELLIPSIS Develop: '/sample-buildout/recipes' Uninstalling data-dir. Installing data-dir. data-dir: Creating directory foo data-dir: Creating directory bin While: Installing data-dir. An internal error occurred due to a bug in either zc.buildout or in a recipe being used: Traceback (most recent call last): ... OSError: [Errno 17] File exists: '/sample-buildout/bin' We meant to create a directory bins, but typed bin. Now foo was left behind. >>> os.path.exists('foo') True If we fix the typo: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... 
[data-dir] ... recipe = recipes:mkdir ... path = foo bins ... """) >>> print system(buildout), # doctest: +ELLIPSIS Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory foo While: Installing data-dir. An internal error occurred due to a bug in either zc.buildout or in a recipe being used: Traceback (most recent call last): ... OSError: [Errno 17] File exists: '/sample-buildout/foo' Now they fail because foo exists, because it was left behind. >>> remove('foo') Let's fix the recipe: >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import logging, os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... ... # Normalize paths and check that their parent ... # directories exist: ... paths = [] ... for path in options['path'].split(): ... path = os.path.join(buildout['buildout']['directory'], path) ... if not os.path.isdir(os.path.dirname(path)): ... logging.getLogger(self.name).error( ... 'Cannot create %s. %s is not a directory.', ... options['path'], os.path.dirname(options['path'])) ... raise zc.buildout.UserError('Invalid Path') ... paths.append(path) ... options['path'] = ' '.join(paths) ... ... def install(self): ... paths = self.options['path'].split() ... created = [] ... try: ... for path in paths: ... logging.getLogger(self.name).info( ... 'Creating directory %s', os.path.basename(path)) ... os.mkdir(path) ... created.append(path) ... except: ... for d in created: ... os.rmdir(d) ... raise ... ... return paths ... ... def update(self): ... pass ... """) .. >>> clean_up_pyc(sample_buildout, 'recipes', 'mkdir.py') And put back the typo: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = foo bin ... """) When we rerun the buildout: >>> print system(buildout), # doctest: +ELLIPSIS Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory foo data-dir: Creating directory bin While: Installing data-dir. An internal error occurred due to a bug in either zc.buildout or in a recipe being used: Traceback (most recent call last): ... OSError: [Errno 17] File exists: '/sample-buildout/bin' we get the same error, but we don't get the directory left behind: >>> os.path.exists('foo') False It's critical that recipes clean up partial effects when errors occur. Because recipes most commonly create files and directories, buildout provides a helper API for removing created files when an error occurs. Option objects have a created method that can be called to record files as they are created. If the install or update method returns with an error, then any registered paths are removed automatically. The method returns the files registered and can be used to return the files created. Let's use this API to simplify the recipe: >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import logging, os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... ... # Normalize paths and check that their parent ... # directories exist: ... paths = [] ... for path in options['path'].split(): ... path = os.path.join(buildout['buildout']['directory'], path) ... if not os.path.isdir(os.path.dirname(path)): ... logging.getLogger(self.name).error( ... 'Cannot create %s. %s is not a directory.', ... options['path'], os.path.dirname(options['path'])) ... 
raise zc.buildout.UserError('Invalid Path') ... paths.append(path) ... options['path'] = ' '.join(paths) ... ... def install(self): ... paths = self.options['path'].split() ... for path in paths: ... logging.getLogger(self.name).info( ... 'Creating directory %s', os.path.basename(path)) ... os.mkdir(path) ... self.options.created(path) ... ... return self.options.created() ... ... def update(self): ... pass ... """) .. >>> clean_up_pyc(sample_buildout, 'recipes', 'mkdir.py') We returned by calling created, taking advantage of the fact that it returns the registered paths. We did this for illustrative purposes. It would be simpler just to return the paths as before. If we rerun the buildout, again, we'll get the error and no directories will be created: >>> print system(buildout), # doctest: +ELLIPSIS Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory foo data-dir: Creating directory bin While: Installing data-dir. An internal error occurred due to a bug in either zc.buildout or in a recipe being used: Traceback (most recent call last): ... OSError: [Errno 17] File exists: '/sample-buildout/bin' >>> os.path.exists('foo') False Now, we'll fix the typo again and we'll get the directories we expect: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = foo bins ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory foo data-dir: Creating directory bins >>> os.path.exists('foo') True >>> os.path.exists('bins') True Configuration file syntax ------------------------- As mentioned earlier, buildout configuration files use the format defined by the Python ConfigParser module with extensions. The extensions are: - option names are case sensitive - option values can use a substitution syntax, described below, to refer to option values in specific sections. - option values can be appended or removed using the - and + operators. The ConfigParser syntax is very flexible. Section names can contain any characters other than newlines and right square braces ("]"). Option names can contain any characters other than newlines, colons, and equal signs, can not start with a space, and don't include trailing spaces. It is likely that, in the future, some characters will be given special buildout-defined meanings. This is already true of the characters ":", "$", "%", "(", and ")". For now, it is a good idea to keep section and option names simple, sticking to alphanumeric characters, hyphens, and periods. Annotated sections ------------------ When used with the `annotate` command, buildout displays annotated sections. All sections are displayed, sorted alphabetically. For each section, all key-value pairs are displayed, sorted alphabetically, along with the origin of the value (file name or COMPUTED_VALUE, DEFAULT_VALUE, COMMAND_LINE_VALUE). >>> print system(buildout+ ' annotate'), ... 
# doctest: +ELLIPSIS +NORMALIZE_WHITESPACE Annotated sections ================== [buildout] accept-buildout-test-releases= false DEFAULT_VALUE allow-hosts= * DEFAULT_VALUE allow-picked-versions= true DEFAULT_VALUE allowed-eggs-from-site-packages= * DEFAULT_VALUE bin-directory= bin DEFAULT_VALUE develop= recipes /sample-buildout/buildout.cfg develop-eggs-directory= develop-eggs DEFAULT_VALUE directory= /sample-buildout COMPUTED_VALUE eggs-directory= eggs DEFAULT_VALUE exec-sitecustomize= true DEFAULT_VALUE executable= ... DEFAULT_VALUE find-links= DEFAULT_VALUE include-site-packages= true DEFAULT_VALUE install-from-cache= false DEFAULT_VALUE installed= .installed.cfg DEFAULT_VALUE log-format= DEFAULT_VALUE log-level= INFO DEFAULT_VALUE newest= true DEFAULT_VALUE offline= false DEFAULT_VALUE parts= data-dir /sample-buildout/buildout.cfg parts-directory= parts DEFAULT_VALUE prefer-final= false DEFAULT_VALUE python= buildout DEFAULT_VALUE relative-paths= false DEFAULT_VALUE socket-timeout= DEFAULT_VALUE unzip= false DEFAULT_VALUE use-dependency-links= true DEFAULT_VALUE [data-dir] path= foo bins /sample-buildout/buildout.cfg recipe= recipes:mkdir /sample-buildout/buildout.cfg Variable substitutions ---------------------- Buildout configuration files support variable substitution. To illustrate this, we'll create an debug recipe to allow us to see interactions with the buildout: >>> write(sample_buildout, 'recipes', 'debug.py', ... """ ... class Debug: ... ... def __init__(self, buildout, name, options): ... self.buildout = buildout ... self.name = name ... self.options = options ... ... def install(self): ... items = self.options.items() ... items.sort() ... for option, value in items: ... print option, value ... return () ... ... update = install ... """) This recipe doesn't actually create anything. The install method doesn't return anything, because it didn't create any files or directories. We also have to update our setup script: >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... entry_points = ( ... ''' ... [zc.buildout] ... mkdir = mkdir:Mkdir ... debug = debug:Debug ... ''') ... setup(name="recipes", entry_points=entry_points) ... """) We've rearranged the script a bit to make the entry points easier to edit. In particular, entry points are now defined as a configuration string, rather than a dictionary. Let's update our configuration to provide variable substitution examples: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir debug ... log-level = INFO ... ... [debug] ... recipe = recipes:debug ... File 1 = ${data-dir:path}/file ... File 2 = ${debug:File 1}/log ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) We used a string-template substitution for File 1 and File 2. This type of substitution uses the string.Template syntax. Names substituted are qualified option names, consisting of a section name and option name joined by a colon. Now, if we run the buildout, we'll see the options with the values substituted. >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling data-dir. Installing data-dir. data-dir: Creating directory mydata Installing debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log recipe recipes:debug Note that the substitution of the data-dir path option reflects the update to the option performed by the mkdir recipe. It might seem surprising that mydata was created again. 
This is because we changed our recipes package by adding the debug module. The buildout system didn't know if this module could affect the mkdir recipe, so it assumed it could and reinstalled mydata. If we rerun the buildout: >>> print system(buildout), Develop: '/sample-buildout/recipes' Updating data-dir. Updating debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log recipe recipes:debug We can see that mydata was not recreated. Note that, in this case, we didn't specify a log level, so we didn't get output about what the buildout was doing. Section and option names in variable substitutions are only allowed to contain alphanumeric characters, hyphens, periods and spaces. This restriction might be relaxed in future releases. We can omit the section name in a variable substitution to refer to the current section. We can also use the special option, _buildout_section_name_, to get the current section name. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir debug ... log-level = INFO ... ... [debug] ... recipe = recipes:debug ... File 1 = ${data-dir:path}/file ... File 2 = ${:File 1}/log ... my_name = ${:_buildout_section_name_} ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Updating data-dir. Installing debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log my_name debug recipe recipes:debug Automatic part selection and ordering ------------------------------------- When a section with a recipe is referred to, either through variable substitution or by an initializing recipe, the section is treated as a part and added to the part list before the referencing part. For example, we can leave data-dir out of the parts list: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... log-level = INFO ... ... [debug] ... recipe = recipes:debug ... File 1 = ${data-dir:path}/file ... File 2 = ${debug:File 1}/log ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) It will still be treated as a part: >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Updating data-dir. Installing debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log recipe recipes:debug >>> cat('.installed.cfg') # doctest: +ELLIPSIS [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = data-dir debug ... Note that the data-dir part is included *before* the debug part, because the debug part refers to the data-dir part. Even if we list the data-dir part after the debug part, it will be included before: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug data-dir ... log-level = INFO ... ... [debug] ... recipe = recipes:debug ... File 1 = ${data-dir:path}/file ... File 2 = ${debug:File 1}/log ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) It will still be treated as a part: >>> print system(buildout), Develop: '/sample-buildout/recipes' Updating data-dir. Updating debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log recipe recipes:debug >>> cat('.installed.cfg') # doctest: +ELLIPSIS [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = data-dir debug ... 
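The same automatic inclusion happens when the reference comes from a recipe rather than from the configuration. As a rough sketch (the recipe below is hypothetical and not part of the sample project), a recipe whose constructor reads options from the data-dir section would likewise cause data-dir to be treated as a part and installed before the referencing part::

    import logging

    class NeedsDataDir:
        # Hypothetical recipe used only to illustrate part inclusion
        # triggered by an initializing recipe.

        def __init__(self, buildout, name, options):
            self.name, self.options = name, options
            # Reading an option from the data-dir section here is enough
            # for buildout to add data-dir to the part list ahead of
            # this part.
            options['data-path'] = buildout['data-dir']['path']

        def install(self):
            logging.getLogger(self.name).info(
                'Using data directory %s', self.options['data-path'])
            return ()

        update = install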
Extending sections (macros) --------------------------- A section (other than the buildout section) can extend one or more other sections using the ``<=`` option. Options from the referenced sections are copied to the refering section *before* variable substitution. This, together with the ability to refer to variables of the current section allows sections to be used as macros. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = myfiles ... log-level = INFO ... ... [debug] ... recipe = recipes:debug ... ... [with_file1] ... <= debug ... file1 = ${:path}/file1 ... color = red ... ... [with_file2] ... <= debug ... file2 = ${:path}/file2 ... color = blue ... ... [myfiles] ... <= with_file1 ... with_file2 ... path = mydata ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Uninstalling data-dir. Installing myfiles. color blue file1 mydata/file1 file2 mydata/file2 path mydata recipe recipes:debug In this example, the debug, with_file1 and with_file2 sections act as macros. In particular, the variable substitutions are performed relative to the myfiles section. Adding and removing options --------------------------- We can append and remove values to an option by using the + and - operators. This is illustrated below; first we define a base configuration. >>> write(sample_buildout, 'base.cfg', ... """ ... [buildout] ... parts = part1 part2 part3 ... ... [part1] ... recipe = ... option = a1 a2 ... ... [part2] ... recipe = ... option = b1 ... b2 ... b3 ... b4 ... ... [part3] ... recipe = ... option = c1 c2 ... ... [part4] ... recipe = ... option = d2 ... d3 ... d5 ... ... """) Extending this configuration, we can "adjust" the values set in the base configuration file. >>> write(sample_buildout, 'extension1.cfg', ... """ ... [buildout] ... extends = base.cfg ... ... # appending values ... [part1] ... option += a3 a4 ... ... # removing values ... [part2] ... option -= b1 ... b2 ... ... # alt. spelling ... [part3] ... option+=c3 c4 c5 ... ... # combining both adding and removing ... [part4] ... option += d1 ... d4 ... option -= d5 ... ... # normal assignment ... [part5] ... option = h1 h2 ... """) An additional extension. >>> write(sample_buildout, 'extension2.cfg', ... """ ... [buildout] ... extends = extension1.cfg ... ... # appending values ... [part1] ... option += a5 ... ... # removing values ... [part2] ... option -= b1 ... b2 ... b3 ... ... """) To verify that the options are adjusted correctly, we'll set up an extension that prints out the options. >>> mkdir(sample_buildout, 'demo') >>> write(sample_buildout, 'demo', 'demo.py', ... """ ... def ext(buildout): ... print [part['option'] for name, part in buildout.items() \ ... if name.startswith('part')] ... """) >>> write(sample_buildout, 'demo', 'setup.py', ... """ ... from setuptools import setup ... ... setup( ... name="demo", ... entry_points={'zc.buildout.extension': ['ext = demo:ext']}, ... ) ... """) Set up a buildout configuration for this extension. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = demo ... parts = ... """) >>> os.chdir(sample_buildout) >>> print system(os.path.join(sample_buildout, 'bin', 'buildout')), # doctest: +ELLIPSIS Develop: '/sample-buildout/demo' Uninstalling myfiles. Getting distribution for 'recipes'. zip_safe flag not set; analyzing archive contents... Got recipes 0.0.0. warning: install_lib: 'build/lib...' does not exist -- no Python modules to install Verify option values. 
>>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = demo ... extensions = demo ... extends = extension2.cfg ... """) >>> print system(os.path.join('bin', 'buildout')), ['a1 a2/na3 a4/na5', 'b4', 'c1 c2/nc3 c4 c5', 'd2/nd3/nd1/nd4', 'h1 h2'] Develop: '/sample-buildout/demo' Annotated sections output shows which files are responsible for which operations. >>> print system(os.path.join('bin', 'buildout') + ' annotate'), ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE Annotated sections ================== ... [part1] option= a1 a2 a3 a4 a5 /sample-buildout/base.cfg += /sample-buildout/extension1.cfg += /sample-buildout/extension2.cfg recipe= /sample-buildout/base.cfg [part2] option= b4 /sample-buildout/base.cfg -= /sample-buildout/extension1.cfg -= /sample-buildout/extension2.cfg recipe= /sample-buildout/base.cfg [part3] option= c1 c2 c3 c4 c5 /sample-buildout/base.cfg += /sample-buildout/extension1.cfg recipe= /sample-buildout/base.cfg [part4] option= d2 d3 d1 d4 /sample-buildout/base.cfg += /sample-buildout/extension1.cfg -= /sample-buildout/extension1.cfg recipe= /sample-buildout/base.cfg [part5] option= h1 h2 /sample-buildout/extension1.cfg Cleanup. >>> os.remove(os.path.join(sample_buildout, 'base.cfg')) >>> os.remove(os.path.join(sample_buildout, 'extension1.cfg')) >>> os.remove(os.path.join(sample_buildout, 'extension2.cfg')) Multiple configuration files ---------------------------- A configuration file can "extend" another configuration file. Options are read from the other configuration file if they aren't already defined by your configuration file. The configuration files your file extends can extend other configuration files. The same file may be used more than once although, of course, cycles aren't allowed. To see how this works, we use an example: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... extends = base.cfg ... ... [debug] ... op = buildout ... """) >>> write(sample_buildout, 'base.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... op = base ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing debug. op buildout recipe recipes:debug The example is pretty trivial, but the pattern it illustrates is pretty common. In a more practical example, the base buildout might represent a product and the extending buildout might be a customization. Here is a more elaborate example. >>> other = tmpdir('other') >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... extends = b1.cfg b2.cfg %(b3)s ... ... [debug] ... op = buildout ... """ % dict(b3=os.path.join(other, 'b3.cfg'))) >>> write(sample_buildout, 'b1.cfg', ... """ ... [buildout] ... extends = base.cfg ... ... [debug] ... op1 = b1 1 ... op2 = b1 2 ... """) >>> write(sample_buildout, 'b2.cfg', ... """ ... [buildout] ... extends = base.cfg ... ... [debug] ... op2 = b2 2 ... op3 = b2 3 ... """) >>> write(other, 'b3.cfg', ... """ ... [buildout] ... extends = b3base.cfg ... ... [debug] ... op4 = b3 4 ... """) >>> write(other, 'b3base.cfg', ... """ ... [debug] ... op5 = b3base 5 ... """) >>> write(sample_buildout, 'base.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... name = base ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. 
name base op buildout op1 b1 1 op2 b2 2 op3 b2 3 op4 b3 4 op5 b3base 5 recipe recipes:debug There are several things to note about this example: - We can name multiple files in an extends option. - We can reference files recursively. - Relative file names in extended options are interpreted relative to the directory containing the referencing configuration file. Loading Configuration from URLs ------------------------------- Configuration files can be loaded from URLs. To see how this works, we'll set up a web server with some configuration files. >>> server_data = tmpdir('server_data') >>> write(server_data, "r1.cfg", ... """ ... [debug] ... op1 = r1 1 ... op2 = r1 2 ... """) >>> write(server_data, "r2.cfg", ... """ ... [buildout] ... extends = r1.cfg ... ... [debug] ... op2 = r2 2 ... op3 = r2 3 ... """) >>> server_url = start_server(server_data) >>> write('client.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... extends = %(url)s/r2.cfg ... ... [debug] ... recipe = recipes:debug ... name = base ... """ % dict(url=server_url)) >>> print system(buildout+ ' -c client.cfg'), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. name base op1 r1 1 op2 r2 2 op3 r2 3 recipe recipes:debug Here we specified a URL for the file we extended. The file we downloaded itself referred to a file on the server using a relative URL reference. Relative references are interpreted relative to the base URL when they appear in configuration files loaded via URL. We can also specify a URL as the configuration file to be used by a buildout. >>> os.remove('client.cfg') >>> write(server_data, 'remote.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... extends = r2.cfg ... ... [debug] ... recipe = recipes:debug ... name = remote ... """) >>> print system(buildout + ' -c ' + server_url + '/remote.cfg'), While: Initializing. Error: Missing option: buildout:directory Normally, the buildout directory defaults to the directory containing the configuration file. This won't work for configuration files loaded from URLs. In this case, the buildout directory would normally be defined on the command line: >>> print system(buildout ... + ' -c ' + server_url + '/remote.cfg' ... + ' buildout:directory=' + sample_buildout ... ), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. name remote op1 r1 1 op2 r2 2 op3 r2 3 recipe recipes:debug User defaults ------------- If the file $HOME/.buildout/default.cfg exists, it is read before reading the configuration file. ($HOME is the value of the HOME environment variable. The '/' is replaced by the operating system file delimiter.) >>> old_home = os.environ['HOME'] >>> home = tmpdir('home') >>> mkdir(home, '.buildout') >>> write(home, '.buildout', 'default.cfg', ... """ ... [debug] ... op1 = 1 ... op7 = 7 ... """) >>> os.environ['HOME'] = home >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. name base op buildout op1 b1 1 op2 b2 2 op3 b2 3 op4 b3 4 op5 b3base 5 op7 7 recipe recipes:debug A buildout command-line argument, -U, can be used to suppress reading user defaults: >>> print system(buildout + ' -U'), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. name base op buildout op1 b1 1 op2 b2 2 op3 b2 3 op4 b3 4 op5 b3base 5 recipe recipes:debug >>> os.environ['HOME'] = old_home Log level --------- We can control the level of logging by specifying a log level in our configuration file. 
For example, to suppress info messages, we can set the logging level to WARNING >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... log-level = WARNING ... extends = b1.cfg b2.cfg ... """) >>> print system(buildout), name base op1 b1 1 op2 b2 2 op3 b2 3 recipe recipes:debug Uninstall recipes ----------------- As we've seen, when parts are installed, buildout keeps track of files and directories that they create. When the parts are uninstalled, these files and directories are deleted. Sometimes more cleanup is needed. For example, a recipe might add a system service by calling chkconfig --add during installation. Later during uninstallation, chkconfig --del will need to be called to remove the system service. In order to deal with these uninstallation issues, you can register uninstall recipes. Uninstall recipes are registered using the 'zc.buildout.uninstall' entry point. Parts specify uninstall recipes using the 'uninstall' option. In comparison to regular recipes, uninstall recipes are much simpler. They are simply callable objects that accept the name of the part to be uninstalled and the part's options dictionary. Uninstall recipes don't have access to the part itself since it may not be possible to instantiate it at uninstallation time. Here's a recipe that simulates installation of a system service, along with an uninstall recipe that simulates removing the service. >>> write(sample_buildout, 'recipes', 'service.py', ... """ ... class Service: ... ... def __init__(self, buildout, name, options): ... self.buildout = buildout ... self.name = name ... self.options = options ... ... def install(self): ... print "chkconfig --add %s" % self.options['script'] ... return () ... ... def update(self): ... pass ... ... ... def uninstall_service(name, options): ... print "chkconfig --del %s" % options['script'] ... """) To use these recipes we must register them using entry points. Make sure to use the same name for the recipe and uninstall recipe. This is required to let buildout know which uninstall recipe goes with which recipe. >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... entry_points = ( ... ''' ... [zc.buildout] ... mkdir = mkdir:Mkdir ... debug = debug:Debug ... service = service:Service ... ... [zc.buildout.uninstall] ... service = service:uninstall_service ... ''') ... setup(name="recipes", entry_points=entry_points) ... """) Here's how these recipes could be used in a buildout: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = service ... ... [service] ... recipe = recipes:service ... script = /path/to/script ... """) When the buildout is run, the service will be installed: >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling debug. Installing service. chkconfig --add /path/to/script The service has been installed. If the buildout is run again with no changes, the service shouldn't be changed. >>> print system(buildout) Develop: '/sample-buildout/recipes' Updating service. Now we change the service part to trigger uninstallation and re-installation. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = service ... ... [service] ... recipe = recipes:service ... script = /path/to/a/different/script ... """) >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling service. Running uninstall recipe. chkconfig --del /path/to/script Installing service. 
chkconfig --add /path/to/a/different/script Now we remove the service part, and add another part. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... """) >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling service. Running uninstall recipe. chkconfig --del /path/to/a/different/script Installing debug. recipe recipes:debug Uninstall recipes don't have to take care of removing all the files and directories created by the part. This is still done automatically, following the execution of the uninstall recipe. An upshot is that an uninstallation recipe can access files and directories created by a recipe before they are deleted. For example, here's an uninstallation recipe that simulates backing up a directory before it is deleted. It is designed to work with the mkdir recipe introduced earlier. >>> write(sample_buildout, 'recipes', 'backup.py', ... """ ... import os ... def backup_directory(name, options): ... path = options['path'] ... size = len(os.listdir(path)) ... print "backing up directory %s of size %s" % (path, size) ... """) It must be registered with the zc.buildout.uninstall entry point. Notice how it is given the name 'mkdir' to associate it with the mkdir recipe. >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... entry_points = ( ... ''' ... [zc.buildout] ... mkdir = mkdir:Mkdir ... debug = debug:Debug ... service = service:Service ... ... [zc.buildout.uninstall] ... uninstall_service = service:uninstall_service ... mkdir = backup:backup_directory ... ''') ... setup(name="recipes", entry_points=entry_points) ... """) Now we can use it with a mkdir part. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = dir debug ... ... [dir] ... recipe = recipes:mkdir ... path = my_directory ... ... [debug] ... recipe = recipes:debug ... """) Run the buildout to install the part. >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling debug. Installing dir. dir: Creating directory my_directory Installing debug. recipe recipes:debug Now we remove the part from the configuration file. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... """) When the buildout is run the part is removed, and the uninstall recipe is run before the directory is deleted. >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling dir. Running uninstall recipe. backing up directory /sample-buildout/my_directory of size 0 Updating debug. recipe recipes:debug Now we will return the registration to normal for the benefit of the rest of the examples. >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... entry_points = ( ... ''' ... [zc.buildout] ... mkdir = mkdir:Mkdir ... debug = debug:Debug ... ''') ... setup(name="recipes", entry_points=entry_points) ... """) Command-line usage ------------------ A number of arguments can be given on the buildout command line. The command usage is:: buildout [options and assignments] [command [command arguments]] The following options are supported: -h (or --help) Print basic usage information. If this option is used, then all other options are ignored. -c filename The -c option can be used to specify a configuration file, rather than buildout.cfg in the current directory. 
-t socket_timeout Specify the socket timeout in seconds. -v Increment the verbosity by 10. The verbosity is used to adjust the logging level. The verbosity is subtracted from the numeric value of the log-level option specified in the configuration file. -q Decrement the verbosity by 10. -U Don't read user-default configuration. -o Run in off-line mode. This is equivalent to the assignment buildout:offline=true. -O Run in non-off-line mode. This is equivalent to the assignment buildout:offline=false. This is the default buildout mode. The -O option would normally be used to override a true offline setting in a configuration file. -n Run in newest mode. This is equivalent to the assignment buildout:newest=true. With this setting, which is the default, buildout will try to find the newest versions of distributions available that satisfy its requirements. -N Run in non-newest mode. This is equivalent to the assignment buildout:newest=false. With this setting, buildout will not seek new distributions if installed distributions satisfy its requirements. Assignments are of the form:: section_name:option_name=value Options and assignments can be given in any order. Here's an example: >>> write(sample_buildout, 'other.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... installed = .other.cfg ... log-level = WARNING ... ... [debug] ... name = other ... recipe = recipes:debug ... """) Note that we used the installed buildout option to specify an alternate file to store information about installed parts. >>> print system(buildout+' -c other.cfg debug:op1=foo -v'), Develop: '/sample-buildout/recipes' Installing debug. name other op1 foo recipe recipes:debug Here we used the -c option to specify an alternate configuration file, and the -v option to increase the level of logging from the default, WARNING. Options can also be combined in the usual Unix way, as in: >>> print system(buildout+' -vcother.cfg debug:op1=foo'), Develop: '/sample-buildout/recipes' Updating debug. name other op1 foo recipe recipes:debug Here we combined the -v and -c options with the configuration file name. Note that the -c option has to be last, because it takes an argument. >>> os.remove(os.path.join(sample_buildout, 'other.cfg')) >>> os.remove(os.path.join(sample_buildout, '.other.cfg')) The most commonly used command is 'install', and it takes a list of parts to install. If any parts are specified, only those parts are installed. To illustrate this, we'll update our configuration and run the buildout in the usual way: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug d1 d2 d3 ... ... [d1] ... recipe = recipes:mkdir ... path = d1 ... ... [d2] ... recipe = recipes:mkdir ... path = d2 ... ... [d3] ... recipe = recipes:mkdir ... path = d3 ... ... [debug] ... recipe = recipes:debug ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. recipe recipes:debug Installing d1. d1: Creating directory d1 Installing d2. d2: Creating directory d2 Installing d3. d3: Creating directory d3 >>> ls(sample_buildout) - .installed.cfg - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d d1 d d2 d d3 d demo d develop-eggs d eggs d parts d recipes >>> cat(sample_buildout, '.installed.cfg') ... 
# doctest: +NORMALIZE_WHITESPACE [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = debug d1 d2 d3 [debug] __buildout_installed__ = __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== recipe = recipes:debug [d1] __buildout_installed__ = /sample-buildout/d1 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d1 recipe = recipes:mkdir [d2] __buildout_installed__ = /sample-buildout/d2 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d2 recipe = recipes:mkdir [d3] __buildout_installed__ = /sample-buildout/d3 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d3 recipe = recipes:mkdir Now we'll update our configuration file: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug d2 d3 d4 ... ... [d2] ... recipe = recipes:mkdir ... path = data2 ... ... [d3] ... recipe = recipes:mkdir ... path = data3 ... ... [d4] ... recipe = recipes:mkdir ... path = ${d2:path}-extra ... ... [debug] ... recipe = recipes:debug ... x = 1 ... """) and run the buildout specifying just d3 and d4: >>> print system(buildout+' install d3 d4'), Develop: '/sample-buildout/recipes' Uninstalling d3. Installing d3. d3: Creating directory data3 Installing d4. d4: Creating directory data2-extra >>> ls(sample_buildout) - .installed.cfg - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d d1 d d2 d data2-extra d data3 d demo d develop-eggs d eggs d parts d recipes Only the d3 and d4 recipes ran. d3 was removed and data3 and data2-extra were created. The .installed.cfg is only updated for the recipes that ran: >>> cat(sample_buildout, '.installed.cfg') ... # doctest: +NORMALIZE_WHITESPACE [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = debug d1 d2 d3 d4 [debug] __buildout_installed__ = __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== recipe = recipes:debug [d1] __buildout_installed__ = /sample-buildout/d1 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d1 recipe = recipes:mkdir [d2] __buildout_installed__ = /sample-buildout/d2 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d2 recipe = recipes:mkdir [d3] __buildout_installed__ = /sample-buildout/data3 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/data3 recipe = recipes:mkdir [d4] __buildout_installed__ = /sample-buildout/data2-extra __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/data2-extra recipe = recipes:mkdir Note that the installed data for debug, d1, and d2 haven't changed, because we didn't install those parts and that the d1 and d2 directories are still there. Now, if we run the buildout without the install command: >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling d2. Uninstalling d1. Uninstalling debug. Installing debug. recipe recipes:debug x 1 Installing d2. d2: Creating directory data2 Updating d3. Updating d4. We see the output of the debug recipe and that data2 was created. 
We also see that d1 and d2 have gone away: >>> ls(sample_buildout) - .installed.cfg - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d data2 d data2-extra d data3 d demo d develop-eggs d eggs d parts d recipes Alternate directory and file locations -------------------------------------- The buildout normally puts the bin, eggs, and parts directories in the directory containing the configuration file. You can provide alternate locations, and even names, for these directories. >>> alt = tmpdir('sample-alt') >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = ... develop-eggs-directory = %(developbasket)s ... eggs-directory = %(basket)s ... bin-directory = %(scripts)s ... parts-directory = %(work)s ... """ % dict( ... developbasket = os.path.join(alt, 'developbasket'), ... basket = os.path.join(alt, 'basket'), ... scripts = os.path.join(alt, 'scripts'), ... work = os.path.join(alt, 'work'), ... )) >>> print system(buildout), Creating directory '/sample-alt/scripts'. Creating directory '/sample-alt/work'. Creating directory '/sample-alt/basket'. Creating directory '/sample-alt/developbasket'. Develop: '/sample-buildout/recipes' Uninstalling d4. Uninstalling d3. Uninstalling d2. Uninstalling debug. >>> ls(alt) d basket d developbasket d scripts d work >>> ls(alt, 'developbasket') - recipes.egg-link You can also specify an alternate buildout directory: >>> rmdir(alt) >>> alt = tmpdir('sample-alt') >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... directory = %(alt)s ... develop = %(recipes)s ... parts = ... """ % dict( ... alt=alt, ... recipes=os.path.join(sample_buildout, 'recipes'), ... )) >>> print system(buildout), Creating directory '/sample-alt/bin'. Creating directory '/sample-alt/parts'. Creating directory '/sample-alt/eggs'. Creating directory '/sample-alt/develop-eggs'. Develop: '/sample-buildout/recipes' >>> ls(alt) - .installed.cfg d bin d develop-eggs d eggs d parts >>> ls(alt, 'develop-eggs') - recipes.egg-link Logging control --------------- Three buildout options are used to control logging: log-level specifies the log level verbosity adjusts the log level log-format allows an alternate logging format to be specified We've already seen the log level and verbosity. Let's look at an example of changing the format: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = ... log-level = 25 ... verbosity = 5 ... log-format = %(levelname)s %(message)s ... """) Here, we've changed the format to include the log-level name, rather than the logger name. We've also illustrated, with a contrived example, that the log level can be a numeric value and that the verbosity can be specified in the configuration file. Because the verbosity is subtracted from the log level, we get a final log level of 20, which is the INFO level. >>> print system(buildout), INFO Develop: '/sample-buildout/recipes' Predefined buildout options --------------------------- Buildouts have a number of predefined options that recipes can use and that users can override in their configuration files. To see these, we'll run a minimal buildout configuration with a debug logging level. One of the features of debug logging is that the configuration database is shown. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... parts = ... """) >>> print system(buildout+' -vv'), # doctest: +NORMALIZE_WHITESPACE Installing 'zc.buildout >=1.9a1, <2dev', 'setuptools'. 
We have a develop egg: zc.buildout X.X. We have the best distribution that satisfies 'setuptools'. Picked: setuptools = V.V Configuration data: [buildout] accept-buildout-test-releases = false allow-hosts = * allow-picked-versions = true allowed-eggs-from-site-packages = * bin-directory = /sample-buildout/bin develop-eggs-directory = /sample-buildout/develop-eggs directory = /sample-buildout eggs-directory = /sample-buildout/eggs exec-sitecustomize = true executable = python find-links = include-site-packages = true install-from-cache = false installed = /sample-buildout/.installed.cfg log-format = log-level = INFO newest = true offline = false parts = parts-directory = /sample-buildout/parts prefer-final = false python = buildout relative-paths = false socket-timeout = unzip = false use-dependency-links = true verbosity = 20 zc.buildout-version = >=1.9a1, <2dev All of these options can be overridden by configuration files or by command-line assignments. We've discussed most of these options already, but let's review them and touch on some we haven't discussed: allowed-eggs-from-site-packages Sometimes you need or want to control what eggs from site-packages are used. The allowed-eggs-from-site-packages option allows you to specify a whitelist of project names that may be included from site-packages. You can use globs to specify the value. It defaults to a single value of '*', indicating that any package may come from site-packages. Here's a usage example:: [buildout] ... allowed-eggs-from-site-packages = demo bigdemo zope.* This option interacts with the ``include-site-packages`` option in the following ways. If ``include-site-packages`` is true, then ``allowed-eggs-from-site-packages`` filters what eggs from site-packages may be chosen. Therefore, if ``allowed-eggs-from-site-packages`` is an empty list, then no eggs from site-packages are chosen, but site-packages will still be included at the end of path lists. If ``include-site-packages`` is false, the value of ``allowed-eggs-from-site-packages`` is irrelevant. See the ``include-site-packages`` description for more information. bin-directory The directory path where scripts are written. This can be a relative path, which is interpreted relative to the directory option. develop-eggs-directory The directory path where development egg links are created for software being created in the local project. This can be a relative path, which is interpreted relative to the directory option. directory The buildout directory. This is the base for other buildout file and directory locations, when relative locations are used. eggs-directory The directory path where downloaded eggs are put. It is common to share this directory across buildouts. Eggs in this directory should *never* be modified. This can be a relative path, which is interpreted relative to the directory option. exec-sitecustomize Normally the Python's real sitecustomize module is processed. If you do not want it to be processed in order to increase the repeatability of your buildout, set this value to 'false'. This will be honored irrespective of the setting for include-site-packages. This option will be honored by some recipes and not others. z3c.recipe.scripts honors this and zc.recipe.egg does not, for instance. executable The Python executable used to run the buildout. See the python option below. include-site-packages You can choose not to have the site-packages of the underlying Python available to your script or interpreter, in addition to the packages from your eggs. 
This can increase repeatability for your buildout. This option will be better used by some recipes than others. z3c.recipe.scripts honors this fully and zc.recipe.egg only partially, for instance. installed The file path where information about the results of the previous buildout run is written. This can be a relative path, which is interpreted relative to the directory option. This file provides an inventory of installed parts with information needed to decide which, if any, parts need to be uninstalled. log-format The format used for logging messages. log-level The log level before verbosity adjustment. parts A whitespace-separated list of parts to be installed. parts-directory A working directory that parts can use to store data. python The name of a section containing information about the default Python interpreter. Recipes that need a Python installation typically have options to tell them which Python installation to use. By convention, if a section-specific option isn't used, the option is looked for in the buildout section. The option must point to a section with an executable option giving the path to a Python executable. By default, the buildout section defines the default Python as the Python used to run the buildout. relative-paths The paths generated by zc.buildout are absolute by default, and this option is ``false``. However, if you set this value to be ``true``, bin/buildout will be generated with code that makes the paths relative. Some recipes, such as zc.recipe.egg and z3c.recipe.scripts, honor this value as well. unzip By default, zc.buildout doesn't unzip zip-safe eggs ("unzip = false"). This follows the policy of setuptools itself. Experience shows this policy to be inconvenient. Zipped eggs make debugging more difficult and often import more slowly. You can include an unzip option in the buildout section to change the default unzipping policy ("unzip = true"). use-dependency-links By default, buildout will obey the setuptools dependency_links metadata when it looks for dependencies. This behavior can be controlled with the use-dependency-links buildout option:: [buildout] ... use-dependency-links = false The option defaults to true. If you set it to false, then dependency links are only looked for in the locations specified by find-links. verbosity A log-level adjustment. Typically, this is set via the -q and -v command-line options. Creating new buildouts and bootstrapping ---------------------------------------- If zc.buildout is installed, you can use it to create a new buildout with its own local copies of zc.buildout and setuptools and with local buildout scripts. >>> sample_bootstrapped = tmpdir('sample-bootstrapped') >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped, 'setup.cfg') ... +' init'), Creating '/sample-bootstrapped/setup.cfg'. Creating directory '/sample-bootstrapped/bin'. Creating directory '/sample-bootstrapped/parts'. Creating directory '/sample-bootstrapped/eggs'. Creating directory '/sample-bootstrapped/develop-eggs'. Generated script '/sample-bootstrapped/bin/buildout'. Note that a basic setup.cfg was created for us. This is because we provided an 'init' argument. By default, the generated ``setup.cfg`` is as minimal as it could be: >>> cat(sample_bootstrapped, 'setup.cfg') [buildout] parts = We also get other buildout artifacts: >>> ls(sample_bootstrapped) d bin d develop-eggs d eggs d parts - setup.cfg >>> ls(sample_bootstrapped, 'bin') - buildout >>> _ = (ls(sample_bootstrapped, 'eggs'), ... 
ls(sample_bootstrapped, 'develop-eggs')) - setuptools-0.6-py2.3.egg - zc.buildout-1.0-py2.3.egg (We list both the eggs and develop-eggs directories because the buildout or setuptools egg could be installed in the develop-eggs directory if the original buildout had develop eggs for either buildout or setuptools.) If relative-paths is ``true``, the buildout script uses relative paths. >>> write(sample_bootstrapped, 'setup.cfg', ... ''' ... [buildout] ... relative-paths = true ... parts = ... ''') >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped, 'setup.cfg') ... +' bootstrap'), Generated script '/sample-bootstrapped/bin/buildout'. >>> buildout_script = join(sample_bootstrapped, 'bin', 'buildout') >>> import sys >>> if sys.platform.startswith('win'): ... buildout_script += '-script.py' >>> print open(buildout_script).read() # doctest: +ELLIPSIS #!... -S import os join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) import sys sys.path[0:0] = [ join(base, 'parts/buildout'), ] import os path = sys.path[0] if os.environ.get('PYTHONPATH'): path = os.pathsep.join([path, os.environ['PYTHONPATH']]) os.environ['BUILDOUT_ORIGINAL_PYTHONPATH'] = os.environ.get('PYTHONPATH', '') os.environ['PYTHONPATH'] = path import site # imports custom buildout-generated site.py import zc.buildout.buildout if __name__ == '__main__': sys.exit(zc.buildout.buildout.main()) Note that, in the above two examples, the buildout script was installed but not run. To run the buildout, we'd have to run the installed buildout script. If we have an existing buildout that already has a buildout.cfg, we'll normally use the bootstrap command instead of init. It will complain if there isn't a configuration file: >>> sample_bootstrapped2 = tmpdir('sample-bootstrapped2') >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped2, 'setup.cfg') ... +' bootstrap'), While: Initializing. Error: Couldn't open /sample-bootstrapped2/setup.cfg >>> write(sample_bootstrapped2, 'setup.cfg', ... """ ... [buildout] ... parts = ... """) >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped2, 'setup.cfg') ... +' bootstrap'), Creating directory '/sample-bootstrapped2/bin'. Creating directory '/sample-bootstrapped2/parts'. Creating directory '/sample-bootstrapped2/eggs'. Creating directory '/sample-bootstrapped2/develop-eggs'. Generated script '/sample-bootstrapped2/bin/buildout'. Similarly, if there is a configuration file and we use the init command, we'll get an error that the configuration file already exists: >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped, 'setup.cfg') ... +' init'), While: Initializing. Error: '/sample-bootstrapped/setup.cfg' already exists. Initial eggs ------------ When using the ``init`` command, you can specify distribution requirements or paths to use: >>> cd(sample_bootstrapped) >>> remove('setup.cfg') >>> print system(buildout + ' -csetup.cfg init demo other ./src'), Creating '/sample-bootstrapped/setup.cfg'. Generated script '/sample-bootstrapped/bin/buildout'. Getting distribution for 'zc.recipe.egg<2dev'. Got zc.recipe.egg 1.3.3dev. Installing py. Getting distribution for 'demo'. Got demo 0.4c1. Getting distribution for 'other'. Got other 1.0. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. Generated script '/sample-bootstrapped/bin/demo'. Generated interpreter '/sample-bootstrapped/bin/py'. 
This causes a ``py`` part to be included that sets up a custom Python interpreter with the given requirements or paths: >>> cat('setup.cfg') [buildout] parts = py [py] recipe = zc.recipe.egg interpreter = py eggs = demo other extra-paths = ./src Passing requirements or paths causes the buildout to be run as part of initialization. In the example above, we got a number of distributions installed and 2 scripts generated. The first, ``demo``, was defined by the ``demo`` project. The second, ``py``, was defined by the generated configuration. It's a "custom interpreter" that behaves like a standard Python interpreter, except that it includes the specified eggs and extra paths in its Python path. We specified a source directory that didn't exist. Buildout created it for us: >>> ls('.') - .installed.cfg d bin d develop-eggs d eggs d parts - setup.cfg d src >>> uncd() .. Make sure it works if the dir is already there: >>> cd(sample_bootstrapped) >>> _ = system(buildout + ' -csetup.cfg buildout:parts=') >>> remove('setup.cfg') >>> print system(buildout + ' -csetup.cfg init demo other ./src'), Creating '/sample-bootstrapped/setup.cfg'. Installing py. Generated script '/sample-bootstrapped/bin/demo'. Generated interpreter '/sample-bootstrapped/bin/py'. .. cleanup >>> _ = system(buildout + ' -csetup.cfg buildout:parts=') >>> uncd() Newest and Offline Modes ------------------------ By default buildout and recipes will try to find the newest versions of distributions needed to satisfy requirements. This can be very time consuming, especially when incrementally working on setting up a buildout or working on a recipe. The buildout newest option can be used to suppress this. If the newest option is set to false, then new distributions won't be sought if an installed distribution meets requirements. The newest option can be set to false using the -N command-line option. The offline option goes a bit further. If the buildout offline option is given a value of "true", the buildout and recipes that are aware of the option will avoid doing network access. This is handy when running the buildout when not connected to the internet. It also makes buildouts run much faster. This option is typically set using the buildout -o option. Preferring Final Releases ------------------------- Currently, when searching for new releases of your project's dependencies, the newest available release is used. This isn't usually ideal, as you may get development or alpha releases that aren't ready to be widely used. You can request that final releases be preferred using the ``prefer-final`` option in the buildout section:: [buildout] ... prefer-final = true When the ``prefer-final`` option is set to true, then when searching for new releases, final releases are preferred. If there are final releases that satisfy distribution requirements, then those releases are used even if newer non-final releases are available. In buildout version 2, all final releases will be preferred by default--that is, ``prefer-final`` will also default to 'true'. You will then need to use a 'false' value for ``prefer-final`` to get the newest releases. A separate option controls the behavior of the build system itself. When buildout looks for recipes, extensions, and updates to itself, it prefers final releases by default, as of the 1.5.0 release. The ``accept-buildout-test-releases`` option will let you override this behavior.
However, it is typically changed by the --accept-buildout-test-releases option to the bootstrap script, since bootstrapping is the first step to selecting a buildout. Finding distributions --------------------- By default, buildout searches the Python Package Index when looking for distributions. You can, instead, specify your own index to search using the `index` option:: [buildout] ... index = http://index.example.com/ This index, or the default of http://pypi.python.org/simple/ if no index is specified, will always be searched for distributions unless running buildout with options that prevent searching for distributions. The latest version of the distribution that meets the requirements of the buildout will always be used. You can also specify more locations to search for distributions using the `find-links` option. All locations specified will be searched for distributions along with the package index as described before. Locations can be urls:: [buildout] ... find-links = http://download.zope.org/distribution/ They can also be directories on disk:: [buildout] ... find-links = /some/path Finally, they can also be direct paths to distributions:: [buildout] ... find-links = /some/path/someegg-1.0.0-py2.3.egg Any number of locations can be specified in the `find-links` option:: [buildout] ... find-links = http://download.zope.org/distribution/ /some/otherpath /some/path/someegg-1.0.0-py2.3.egg Dependency links ---------------- By default buildout will obey the setuptools dependency_links metadata when it looks for dependencies. This behavior can be controlled with the use-dependency-links buildout option:: [buildout] ... use-dependency-links = false The option defaults to true. If you set it to false, then dependency links are only looked for in the locations specified by find-links. Controlling the installation database ------------------------------------- The buildout installed option is used to specify the file used to save information on installed parts. This option is initialized to ".installed.cfg", but it can be overridden in the configuration file or on the command line: >>> write('buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... """) >>> print system(buildout+' buildout:installed=inst.cfg'), Develop: '/sample-buildout/recipes' Installing debug. recipe recipes:debug >>> ls(sample_buildout) - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d demo d develop-eggs d eggs - inst.cfg d parts d recipes The installation database can be disabled by supplying an empty buildout installed option: >>> os.remove('inst.cfg') >>> print system(buildout+' buildout:installed='), Develop: '/sample-buildout/recipes' Installing debug. recipe recipes:debug >>> ls(sample_buildout) - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d demo d develop-eggs d eggs d parts d recipes Note that there will be no installation database if there are no parts: >>> write('buildout.cfg', ... """ ... [buildout] ... parts = ... """) >>> print system(buildout+' buildout:installed=inst.cfg'), >>> ls(sample_buildout) - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d demo d develop-eggs d eggs d parts d recipes Extensions ---------- A feature allows code to be loaded and run after configuration files have been read but before the buildout has begun any processing. The intent is to allow special plugins such as urllib2 request handlers to be loaded. 
To load an extension, we use the extensions option and list one or more distribution requirements, on separate lines. The distributions named will be loaded and any ``zc.buildout.extension`` entry points found will be called with the buildout as an argument. When buildout finishes processing, any ``zc.buildout.unloadextension`` entry points found will be called with the buildout as an argument. Let's create a sample extension in our sample buildout created in the previous section: >>> mkdir(sample_bootstrapped, 'demo') >>> write(sample_bootstrapped, 'demo', 'demo.py', ... """ ... def ext(buildout): ... print 'ext', list(buildout) ... def unload(buildout): ... print 'unload', list(buildout) ... """) >>> write(sample_bootstrapped, 'demo', 'setup.py', ... """ ... from setuptools import setup ... ... setup( ... name = "demo", ... entry_points = { ... 'zc.buildout.extension': ['ext = demo:ext'], ... 'zc.buildout.unloadextension': ['ext = demo:unload'], ... }, ... ) ... """) Our extension just prints 'ext' or 'unload' and lists the sections found in the buildout passed to it. We'll update our buildout.cfg to list the demo directory as a develop egg to be built: >>> write(sample_bootstrapped, 'buildout.cfg', ... """ ... [buildout] ... develop = demo ... parts = ... """) >>> os.chdir(sample_bootstrapped) >>> print system(os.path.join(sample_bootstrapped, 'bin', 'buildout')), Develop: '/sample-bootstrapped/demo' Now we can add the extensions option. We were a bit tricky and ran the buildout once with the demo develop egg defined but without the extension option. This is because extensions are loaded before the buildout creates develop eggs. We needed to use a separate buildout run to create the develop egg. Normally, when eggs are loaded from the network, we wouldn't need to do anything special. >>> write(sample_bootstrapped, 'buildout.cfg', ... """ ... [buildout] ... develop = demo ... extensions = demo ... parts = ... """) We see that our extension is loaded and executed: >>> print system(os.path.join(sample_bootstrapped, 'bin', 'buildout')), ext ['buildout'] Develop: '/sample-bootstrapped/demo' unload ['buildout'] Allow hosts ----------- In some environments the links visited by `zc.buildout` can be forbidden by paranoid firewalls. These URLs might be on the chain of links visited by `zc.buildout`, whether they are defined in the `find-links` option or by various eggs in their `url`, `download_url`, or `dependency_links` metadata. Tracking them is made even harder by the fact that package_index works like a spider and might visit links and go on to other locations. The `allow-hosts` option provides a way to prevent this, and works exactly like the one provided in `easy_install`. You can provide a list of allowed hosts, together with wildcards:: [buildout] ... allow-hosts = *.python.org example.com All URLs that do not match these hosts will not be visited. .. [#future_recipe_methods] In the future, additional methods may be added. Older recipes with fewer methods will still be supported. .. [#packaging_info] If we wanted to create a distribution from this package, we would need to specify much more information. See the `setuptools documentation `_. Always unzipping eggs ===================== By default, zc.buildout doesn't unzip zip-safe eggs. >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... find-links = %(link_server)s ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = demo ...
''' % globals()) >>> _ = system(buildout) >>> ls('eggs') - demo-0.4c1-py2.4.egg - demoneeded-1.2c1-py2.4.egg d setuptools-0.6c8-py2.4.egg - zc.buildout.egg-link This follows the policy followed by setuptools itself. Experience shows this policy to be inconvenient. Zipped eggs make debugging more difficult and often import more slowly. You can include an unzip option in the buildout section to change the default unzipping policy. >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... find-links = %(link_server)s ... unzip = true ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = demo ... ''' % globals()) >>> import os >>> for name in os.listdir('eggs'): ... if name.startswith('demo'): ... remove('eggs', name) >>> _ = system(buildout) >>> ls('eggs') d demo-0.4c1-py2.4.egg d demoneeded-1.2c1-py2.4.egg d setuptools-0.6c8-py2.4.egg - zc.buildout.egg-link Repeatable buildouts: controlling eggs used =========================================== One of the goals of zc.buildout is to provide enough control to make buildouts repeatable. It should be possible to check the buildout configuration files for a project into a version control system and later use the checked-in files to get the same buildout, subject to changes in the environment outside the buildout. An advantage of using Python eggs is that dependencies of eggs used are automatically determined and used. The automatic inclusion of dependent distributions is at odds with the goal of repeatable buildouts. To support repeatable buildouts, a versions section can be created with options for each distribution name whose version is to be fixed. The section can then be specified via the buildout versions option. To see how this works, we'll create two versions of a recipe egg: >>> mkdir('recipe') >>> write('recipe', 'recipe.py', ... ''' ... class Recipe: ... def __init__(*a): pass ... def install(self): ... print 'recipe v1' ... return () ... update = install ... ''') >>> write('recipe', 'setup.py', ... ''' ... from setuptools import setup ... setup(name='spam', version='1', py_modules=['recipe'], ... entry_points={'zc.buildout': ['default = recipe:Recipe']}, ... ) ... ''') >>> write('recipe', 'README', '') >>> print system(buildout+' setup recipe bdist_egg'), # doctest: +ELLIPSIS Running setup script 'recipe/setup.py'. ... >>> rmdir('recipe', 'build') >>> write('recipe', 'recipe.py', ... ''' ... class Recipe: ... def __init__(*a): pass ... def install(self): ... print 'recipe v2' ... return () ... update = install ... ''') >>> write('recipe', 'setup.py', ... ''' ... from setuptools import setup ... setup(name='spam', version='2', py_modules=['recipe'], ... entry_points={'zc.buildout': ['default = recipe:Recipe']}, ... ) ... ''') >>> print system(buildout+' setup recipe bdist_egg'), # doctest: +ELLIPSIS Running setup script 'recipe/setup.py'. ... and we'll configure a buildout to use it: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = foo ... find-links = %s ... ... [foo] ... recipe = spam ... ''' % join('recipe', 'dist')) If we run the buildout, it will use version 2: >>> print system(buildout), Getting distribution for 'spam'. Got spam 2. Installing foo. recipe v2 We can specify a versions section that lists our recipe and name it in the buildout section: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = foo ... find-links = %s ... versions = release-1 ... ... [release-1] ... spam = 1 ... eggs = 2.2 ... ... [foo] ... recipe = spam ...
''' % join('recipe', 'dist')) Here we created a release-1 section listing the version 1 for the spam distribution. We told the buildout to use it by specifying release-1 as in the versions option. Now, if we run the buildout, we'll use version 1 of the spam recipe: >>> print system(buildout), Getting distribution for 'spam==1'. Got spam 1. Uninstalling foo. Installing foo. recipe v1 Running the buildout in verbose mode will help us get information about versions used. If we run the buildout in verbose mode without specifying a versions section: >>> print system(buildout+' buildout:versions= -v'), # doctest: +ELLIPSIS Installing 'zc.buildout >=1.99, <2dev', 'setuptools'. We have a develop egg: zc.buildout 1.0.0. We have the best distribution that satisfies 'setuptools'. Picked: setuptools = 0.6 Installing 'spam'. We have the best distribution that satisfies 'spam'. Picked: spam = 2. Uninstalling foo. Installing foo. recipe v2 We'll get output that includes lines that tell us what versions buildout chose a for us, like:: zc.buildout.easy_install.picked: spam = 2 This allows us to discover versions that are picked dynamically, so that we can fix them in a versions section. If we run the buildout with the versions section: >>> print system(buildout+' -v'), # doctest: +ELLIPSIS Installing 'zc.buildout >=1.99, <2dev', 'setuptools'. We have a develop egg: zc.buildout 1.0.0. We have the best distribution that satisfies 'setuptools'. Picked: setuptools = 0.6 Installing 'spam'. We have the distribution that satisfies 'spam==1'. Uninstalling foo. Installing foo. recipe v1 We won't get output for the spam distribution, which we didn't pick, but we will get output for setuptools, which we didn't specify versions for. You can request buildout to generate an error if it picks any versions: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = foo ... find-links = %s ... versions = release-1 ... allow-picked-versions = false ... ... [release-1] ... spam = 1 ... eggs = 2.2 ... ... [foo] ... recipe = spam ... ''' % join('recipe', 'dist')) Using the download utility ========================== The ``zc.buildout.download`` module provides a download utility that handles the details of downloading files needed for a buildout run from the internet. It downloads files to the local file system, using the download cache if desired and optionally checking the downloaded files' MD5 checksum. We setup an HTTP server that provides a file we want to download: >>> server_data = tmpdir('sample_files') >>> write(server_data, 'foo.txt', 'This is a foo text.') >>> server_url = start_server(server_data) We also use a fresh directory for temporary files in order to make sure that all temporary files have been cleaned up in the end: >>> import tempfile >>> old_tempdir = tempfile.tempdir >>> tempfile.tempdir = tmpdir('tmp') Downloading without using the cache ----------------------------------- If no download cache should be used, the download utility is instantiated without any arguments: >>> from zc.buildout.download import Download >>> download = Download() >>> print download.cache_dir None Downloading a file is achieved by calling the utility with the URL as an argument. A tuple is returned that consists of the path to the downloaded copy of the file and a boolean value indicating whether this is a temporary file meant to be cleaned up during the same buildout run: >>> path, is_temp = download(server_url+'foo.txt') >>> print path /.../buildout-... >>> cat(path) This is a foo text. 
As we aren't using the download cache and haven't specified a target path either, the download has ended up in a temporary file: >>> is_temp True >>> import tempfile >>> path.startswith(tempfile.gettempdir()) True We are responsible for cleaning up temporary files behind us: >>> remove(path) When trying to access a file that doesn't exist, we'll get an exception: >>> try: download(server_url+'not-there') # doctest: +ELLIPSIS ... except: print 'download error' ... else: print 'woops' download error Downloading a local file doesn't produce a temporary file but simply returns the local file itself: >>> download(join(server_data, 'foo.txt')) ('/sample_files/foo.txt', False) We can also have the downloaded file's MD5 sum checked: >>> try: from hashlib import md5 ... except ImportError: from md5 import new as md5 >>> path, is_temp = download(server_url+'foo.txt', ... md5('This is a foo text.').hexdigest()) >>> is_temp True >>> remove(path) >>> download(server_url+'foo.txt', ... md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch downloading 'http://localhost/foo.txt' The error message in the event of an MD5 checksum mismatch for a local file reads somewhat differently: >>> download(join(server_data, 'foo.txt'), ... md5('This is a foo text.').hexdigest()) ('/sample_files/foo.txt', False) >>> download(join(server_data, 'foo.txt'), ... md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch for local resource at '/sample_files/foo.txt'. Finally, we can download the file to a specified place in the file system: >>> target_dir = tmpdir('download-target') >>> path, is_temp = download(server_url+'foo.txt', ... path=join(target_dir, 'downloaded.txt')) >>> print path /download-target/downloaded.txt >>> cat(path) This is a foo text. >>> is_temp False Trying to download a file in offline mode will result in an error: >>> download = Download(cache=None, offline=True) >>> download(server_url+'foo.txt') Traceback (most recent call last): UserError: Couldn't download 'http://localhost/foo.txt' in offline mode. As an exception to this rule, file system paths and URLs in the ``file`` scheme will still work: >>> cat(download(join(server_data, 'foo.txt'))[0]) This is a foo text. >>> cat(download('file:' + join(server_data, 'foo.txt'))[0]) This is a foo text. >>> remove(path) Downloading using the download cache ------------------------------------ In order to make use of the download cache, we need to configure the download utility differently. To do this, we pass a directory path as the ``cache`` attribute upon instantiation: >>> cache = tmpdir('download-cache') >>> download = Download(cache=cache) >>> print download.cache_dir /download-cache/ Simple usage ~~~~~~~~~~~~ When using the cache, a file will be stored in the cache directory when it is first downloaded. The file system path returned by the download utility points to the cached copy: >>> ls(cache) >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/foo.txt >>> cat(path) This is a foo text. >>> is_temp False Whenever the file is downloaded again, the cached copy is used. Let's change the file on the server to see this: >>> write(server_data, 'foo.txt', 'The wrong text.') >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/foo.txt >>> cat(path) This is a foo text. 
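For reference, the cache-backed pattern demonstrated above can be collected into one small standalone sketch. This is only an illustration of the API shown in this section; the cache directory and URL are placeholders, and the cache directory must already exist (a missing cache directory is an error, as shown below)::

    from zc.buildout.download import Download

    # A download utility bound to an existing cache directory.
    download = Download(cache='/var/cache/buildout-downloads')

    # The first call fetches the file and stores a copy in the cache;
    # later calls for the same file name reuse the cached copy.
    path, is_temp = download('http://example.com/foo.txt')

    # With a cache configured, the returned path points into the cache
    # and is_temp is False, so there is no temporary file to clean up.
    print path, is_temp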
If we specify an MD5 checksum for a file that is already in the cache, the cached copy's checksum will be verified: >>> download(server_url+'foo.txt', md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch for cached download from 'http://localhost/foo.txt' at '/download-cache/foo.txt' Trying to access another file at a different URL which has the same base name will result in the cached copy being used: >>> mkdir(server_data, 'other') >>> write(server_data, 'other', 'foo.txt', 'The wrong text.') >>> path, is_temp = download(server_url+'other/foo.txt') >>> print path /download-cache/foo.txt >>> cat(path) This is a foo text. Given a target path for the download, the utility will provide a copy of the file at that location both when first downloading the file and when using a cached copy: >>> remove(cache, 'foo.txt') >>> ls(cache) >>> write(server_data, 'foo.txt', 'This is a foo text.') >>> path, is_temp = download(server_url+'foo.txt', ... path=join(target_dir, 'downloaded.txt')) >>> print path /download-target/downloaded.txt >>> cat(path) This is a foo text. >>> is_temp False >>> ls(cache) - foo.txt >>> remove(path) >>> write(server_data, 'foo.txt', 'The wrong text.') >>> path, is_temp = download(server_url+'foo.txt', ... path=join(target_dir, 'downloaded.txt')) >>> print path /download-target/downloaded.txt >>> cat(path) This is a foo text. >>> is_temp False In offline mode, downloads from any URL will be successful if the file is found in the cache: >>> download = Download(cache=cache, offline=True) >>> cat(download(server_url+'foo.txt')[0]) This is a foo text. Local resources will be cached just like any others since download caches are sometimes used to create source distributions: >>> remove(cache, 'foo.txt') >>> ls(cache) >>> write(server_data, 'foo.txt', 'This is a foo text.') >>> download = Download(cache=cache) >>> cat(download('file:' + join(server_data, 'foo.txt'), path=path)[0]) This is a foo text. >>> ls(cache) - foo.txt >>> remove(cache, 'foo.txt') >>> cat(download(join(server_data, 'foo.txt'), path=path)[0]) This is a foo text. >>> ls(cache) - foo.txt >>> remove(cache, 'foo.txt') However, resources with checksum mismatches will not be copied to the cache: >>> download(server_url+'foo.txt', md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch downloading 'http://localhost/foo.txt' >>> ls(cache) >>> remove(path) If the file is completely missing it should notify the user of the error: >>> download(server_url+'bar.txt') Traceback (most recent call last): UserError: Error downloading extends for URL http://localhost/bar.txt: (404, 'Not Found') >>> ls(cache) Finally, let's see what happens if the download cache to be used doesn't exist as a directory in the file system yet: >>> Download(cache=join(cache, 'non-existent'))(server_url+'foo.txt') Traceback (most recent call last): UserError: The directory: '/download-cache/non-existent' to be used as a download cache doesn't exist. Using namespace sub-directories of the download cache ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ It is common to store cached copies of downloaded files within sub-directories of the download cache to keep some degree of order. For example, zc.buildout stores downloaded distributions in a sub-directory named "dist". Those sub-directories are also known as namespaces. So far, we haven't specified any namespaces to use, so the download utility stored files directly inside the download cache. 
Let's use a namespace "test" instead: >>> download = Download(cache=cache, namespace='test') >>> print download.cache_dir /download-cache/test The namespace sub-directory hasn't been created yet: >>> ls(cache) Downloading a file now creates the namespace sub-directory and places a copy of the file inside it: >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/test/foo.txt >>> ls(cache) d test >>> ls(cache, 'test') - foo.txt >>> cat(path) This is a foo text. >>> is_temp False The next time we want to download that file, the copy from inside the cache namespace is used. To see this clearly, we put a file with the same name but different content both on the server and in the cache's root directory: >>> write(server_data, 'foo.txt', 'The wrong text.') >>> write(cache, 'foo.txt', 'The wrong text.') >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/test/foo.txt >>> cat(path) This is a foo text. >>> rmdir(cache, 'test') >>> remove(cache, 'foo.txt') >>> write(server_data, 'foo.txt', 'This is a foo text.') Using a hash of the URL as the filename in the cache ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ So far, the base name of the downloaded file read from the URL has been used for the name of the cached copy of the file. This may not be desirable in some cases, for example when downloading files from different locations that have the same base name due to some naming convention, or if the file content depends on URL parameters. In such cases, an MD5 hash of the complete URL may be used as the filename in the cache: >>> download = Download(cache=cache, hash_name=True) >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/09f5793fcdc1716727f72d49519c688d >>> cat(path) This is a foo text. >>> ls(cache) - 09f5793fcdc1716727f72d49519c688d The path was printed just to illustrate matters; we cannot know the real checksum since we don't know which port the server happens to listen at when the test is run, so we don't actually know the full URL of the file. Let's check that the checksum actually belongs to the particular URL used: >>> path.lower() == join(cache, md5(server_url+'foo.txt').hexdigest()).lower() True The cached copy is used when downloading the file again: >>> write(server_data, 'foo.txt', 'The wrong text.') >>> (path, is_temp) == download(server_url+'foo.txt') True >>> cat(path) This is a foo text. >>> ls(cache) - 09f5793fcdc1716727f72d49519c688d If we change the URL, even in such a way that it keeps the base name of the file the same, the file will be downloaded again this time and put in the cache under a different name: >>> path2, is_temp = download(server_url+'other/foo.txt') >>> print path2 /download-cache/537b6d73267f8f4447586989af8c470e >>> path == path2 False >>> path2.lower() == join(cache, md5(server_url+'other/foo.txt').hexdigest()).lower() True >>> cat(path) This is a foo text. >>> cat(path2) The wrong text. >>> ls(cache) - 09f5793fcdc1716727f72d49519c688d - 537b6d73267f8f4447586989af8c470e >>> remove(path) >>> remove(path2) >>> write(server_data, 'foo.txt', 'This is a foo text.') Using the cache purely as a fall-back ------------------------------------- Sometimes it is desirable to try downloading a file from the net if at all possible, and use the cache purely as a fall-back option when a server is down or if we are in offline mode. 
This mode is only in effect if a download cache is configured in the first place: >>> download = Download(cache=cache, fallback=True) >>> print download.cache_dir /download-cache/ A downloaded file will be cached: >>> ls(cache) >>> path, is_temp = download(server_url+'foo.txt') >>> ls(cache) - foo.txt >>> cat(cache, 'foo.txt') This is a foo text. >>> is_temp False If the file cannot be served, the cached copy will be used: >>> remove(server_data, 'foo.txt') >>> try: Download()(server_url+'foo.txt') # doctest: +ELLIPSIS ... except: print 'download error' ... else: print 'woops' download error >>> path, is_temp = download(server_url+'foo.txt') >>> cat(path) This is a foo text. >>> is_temp False Similarly, if the file is served but we're in offline mode, we'll fall back to using the cache: >>> write(server_data, 'foo.txt', 'The wrong text.') >>> get(server_url+'foo.txt') 'The wrong text.' >>> offline_download = Download(cache=cache, offline=True, fallback=True) >>> path, is_temp = offline_download(server_url+'foo.txt') >>> print path /download-cache/foo.txt >>> cat(path) This is a foo text. >>> is_temp False However, when downloading the file normally with the cache being used in fall-back mode, the file will be downloaded from the net and the cached copy will be replaced with the new content: >>> cat(download(server_url+'foo.txt')[0]) The wrong text. >>> cat(cache, 'foo.txt') The wrong text. When trying to download a resource whose checksum does not match, the cached copy will neither be used nor overwritten: >>> write(server_data, 'foo.txt', 'This is a foo text.') >>> download(server_url+'foo.txt', md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch downloading 'http://localhost/foo.txt' >>> cat(cache, 'foo.txt') The wrong text. Configuring the download utility from buildout options ------------------------------------------------------ The configuration options explained so far derive from the build logic implemented by the calling code. Other options configure the download utility for use in a particular project or buildout run; they are read from the ``buildout`` configuration section. The latter can be passed directly as the first argument to the download utility's constructor. The location of the download cache is specified by the ``download-cache`` option: >>> download = Download({'download-cache': cache}, namespace='cmmi') >>> print download.cache_dir /download-cache/cmmi If the ``download-cache`` option specifies a relative path, it is understood relative to the current working directory, or to the buildout directory if that is given: >>> download = Download({'download-cache': 'relative-cache'}) >>> print download.cache_dir /sample-buildout/relative-cache/ >>> download = Download({'directory': join(sample_buildout, 'root'), ... 'download-cache': 'relative-cache'}) >>> print download.cache_dir /sample-buildout/root/relative-cache/ Keyword parameters take precedence over the corresponding options: >>> download = Download({'download-cache': cache}, cache=None) >>> print download.cache_dir None Whether to assume offline mode can be inferred from either the ``offline`` or the ``install-from-cache`` option. 
As usual with zc.buildout, these options must assume one of the values 'true' and 'false': >>> download = Download({'offline': 'true'}) >>> download.offline True >>> download = Download({'offline': 'false'}) >>> download.offline False >>> download = Download({'install-from-cache': 'true'}) >>> download.offline True >>> download = Download({'install-from-cache': 'false'}) >>> download.offline False These two options are combined using logical 'or': >>> download = Download({'offline': 'true', 'install-from-cache': 'false'}) >>> download.offline True >>> download = Download({'offline': 'false', 'install-from-cache': 'true'}) >>> download.offline True The ``offline`` keyword parameter takes precedence over both the ``offline`` and ``install-from-cache`` options: >>> download = Download({'offline': 'true'}, offline=False) >>> download.offline False >>> download = Download({'install-from-cache': 'false'}, offline=True) >>> download.offline True Regressions ----------- MD5 checksum calculation needs to be reliable on all supported systems, which requires text files to be treated as binary to avoid implicit line-ending conversions: >>> text = 'First line of text.\r\nSecond line.\r\n' >>> f = open(join(server_data, 'foo.txt'), 'wb') >>> f.write(text) >>> f.close() >>> path, is_temp = Download()(server_url+'foo.txt', md5(text).hexdigest()) >>> remove(path) When "downloading" a directory given by file-system path or ``file:`` URL and using a download cache at the same time, the cached directory wasn't handled correctly. Consequently, the cache was defeated and an attempt to cache the directory a second time broke. This is how it should work: >>> download = Download(cache=cache) >>> dirpath = join(server_data, 'some_directory') >>> mkdir(dirpath) >>> dest, _ = download(dirpath) If we now modify the source tree, the second download will produce the original one from the cache: >>> mkdir(join(dirpath, 'foo')) >>> ls(dirpath) d foo >>> dest, _ = download(dirpath) >>> ls(dest) Clean up -------- We should have cleaned up all temporary files created by downloading things: >>> ls(tempfile.tempdir) Reset the global temporary directory: >>> tempfile.tempdir = old_tempdir Using a download cache ====================== Normally, when distributions are installed, if any processing is needed, they are downloaded from the internet to a temporary directory and then installed from there. A download cache can be used to avoid the download step. This can be useful to reduce network access and to create source distributions of an entire buildout. The buildout download-cache option can be used to specify a directory to be used as a download cache. In this example, we'll create a directory to hold the cache: >>> cache = tmpdir('cache') And set up a buildout that downloads some eggs: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... download-cache = %(cache)s ... find-links = %(link_server)s ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = demo ==0.2 ... ''' % globals()) We specified a link server that has some distributions available for download: >>> print get(link_server), bigdemo-0.1-py2.4.egg
demo-0.1-py2.4.egg
demo-0.2-py2.4.egg
demo-0.3-py2.4.egg
demo-0.4c1-py2.4.egg
demoneeded-1.0.zip
demoneeded-1.1.zip
demoneeded-1.2c1.zip
extdemo-1.4.zip
index/
other-1.0-py2.4.egg
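Note that a download cache need not be private to one project. Because download-cache is an ordinary buildout option, it can also be set once in the per-user default configuration file, ``~/.buildout/default.cfg``, so that every buildout run by that user shares a single cache. A sketch only, with a placeholder path::

    [buildout]
    download-cache = /home/me/.buildout/download-cache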
We'll enable logging on the link server so we can see what's going on: >>> get(link_server+'enable_server_logging') GET 200 /enable_server_logging '' We also specified a download cache. If we run the buildout, we'll see the eggs installed from the link server as usual: >>> print system(buildout), GET 200 / GET 200 /demo-0.2-py2.4.egg GET 200 /demoneeded-1.2c1.zip Installing eggs. Getting distribution for 'demo==0.2'. Got demo 0.2. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. Generated script '/sample-buildout/bin/demo'. We'll also get the download cache populated. The buildout doesn't put files in the cache directly. It creates an intermediate directory, dist: >>> ls(cache) d dist >>> ls(cache, 'dist') - demo-0.2-py2.4.egg - demoneeded-1.2c1.zip If we remove the installed eggs from eggs directory and re-run the buildout: >>> import os >>> for f in os.listdir('eggs'): ... if f.startswith('demo'): ... remove('eggs', f) >>> print system(buildout), GET 200 / Updating eggs. Getting distribution for 'demo==0.2'. Got demo 0.2. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. We see that the distributions aren't downloaded, because they're downloaded from the cache. Installing solely from a download cache --------------------------------------- A download cache can be used as the basis of application source releases. In an application source release, we want to distribute an application that can be built without making any network accesses. In this case, we distribute a buildout with download cache and tell the buildout to install from the download cache only, without making network accesses. The buildout install-from-cache option can be used to signal that packages should be installed only from the download cache. Let's remove our installed eggs and run the buildout with the install-from-cache option set to true: >>> for f in os.listdir('eggs'): ... if f.startswith('demo'): ... remove('eggs', f) >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... download-cache = %(cache)s ... install-from-cache = true ... find-links = %(link_server)s ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = demo ... ''' % globals()) >>> print system(buildout), Uninstalling eggs. Installing eggs. Getting distribution for 'demo'. Got demo 0.2. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. Generated script '/sample-buildout/bin/demo'. Caching extended configuration ============================== As mentioned in the general buildout documentation, configuration files can extend each other, including the ability to download configuration being extended from a URL. If desired, zc.buildout caches downloaded configuration in order to be able to use it when run offline. As we're going to talk about downloading things, let's start an HTTP server. Also, all of the following will take place inside the sample buildout. >>> server_data = tmpdir('server_data') >>> server_url = start_server(server_data) >>> cd(sample_buildout) We also use a fresh directory for temporary files in order to make sure that all temporary files have been cleaned up in the end: >>> import tempfile >>> old_tempdir = tempfile.tempdir >>> tempfile.tempdir = tmpdir('tmp') Basic use of the extends cache ------------------------------ We put some base configuration on a server and reference it from a sample buildout: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... foo = bar ... """) >>> write('buildout.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... 
""" % server_url) When trying to run this buildout offline, we'll find that we cannot read all of the required configuration: >>> print system(buildout + ' -o') While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. Trying the same online, we can: >>> print system(buildout) Unused options for buildout: 'foo'. As long as we haven't said anything about caching downloaded configuration, nothing gets cached. Offline mode will still cause the buildout to fail: >>> print system(buildout + ' -o') While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. Let's now specify a cache for base configuration files. This cache is different from the download cache used by recipes for caching distributions and other files; one might, however, use a namespace subdirectory of the download cache for it. The configuration cache we specify will be created when running buildout and the base.cfg file will be put in it (with the file name being a hash of the complete URL): >>> mkdir('cache') >>> write('buildout.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... extends-cache = cache ... """ % server_url) >>> print system(buildout) Unused options for buildout: 'foo'. >>> cache = join(sample_buildout, 'cache') >>> ls(cache) - 5aedc98d7e769290a29d654a591a3a45 >>> import os >>> cat(cache, os.listdir(cache)[0]) [buildout] parts = foo = bar We can now run buildout offline as it will read base.cfg from the cache: >>> print system(buildout + ' -o') Unused options for buildout: 'foo'. The cache is being used purely as a fall-back in case we are offline or don't have access to a configuration file to be downloaded. As long as we are online, buildout attempts to download a fresh copy of each file even if a cached copy of the file exists. To see this, we put different configuration in the same place on the server and run buildout in offline mode so it takes base.cfg from the cache: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... bar = baz ... """) >>> print system(buildout + ' -o') Unused options for buildout: 'foo'. In online mode, buildout will download and use the modified version: >>> print system(buildout) Unused options for buildout: 'bar'. Trying offline mode again, the new version will be used as it has been put in the cache now: >>> print system(buildout + ' -o') Unused options for buildout: 'bar'. Clean up: >>> rmdir(cache) Specifying extends cache and offline mode ----------------------------------------- Normally, the values of buildout options such as the location of a download cache or whether to use offline mode are determined by first reading the user's default configuration, updating it with the project's configuration and finally applying command-line options. User and project configuration are assembled by reading a file such as ``~/.buildout/default.cfg``, ``buildout.cfg`` or a URL given on the command line, recursively (depth-first) downloading any base configuration specified by the ``buildout:extends`` option read from each of those config files, and finally evaluating each config file to provide default values for options not yet read. This works fine for all options that do not influence how configuration is downloaded in the first place. The ``extends-cache`` and ``offline`` options, however, are treated differently from the procedure described in order to make it simple and obvious to see where a particular configuration file came from under any particular circumstances. 
- Offline and extends-cache settings are read from the two root config files exclusively. Otherwise one could construct configuration files that, when read, imply that they should have been read from a different source than they have. Also, specifying the extends cache within a file that might have to be taken from the cache before being read wouldn't make a lot of sense. - Offline and extends-cache settings given by the user's defaults apply to the process of assembling the project's configuration. If no extends cache has been specified by the user's default configuration, the project's root config file must be available, be it from disk or from the net. - Offline mode turned on by the ``-o`` command line option is honoured from the beginning even though command line options are applied to the configuration last. If offline mode is not requested by the command line, it may be switched on by either the user's or the project's config root. Extends cache ~~~~~~~~~~~~~ Let's see the above rules in action. We create a new home directory for our user and write user and project configuration that recursively extends online bases, using different caches: >>> mkdir('home') >>> mkdir('home', '.buildout') >>> mkdir('cache') >>> mkdir('user-cache') >>> os.environ['HOME'] = join(sample_buildout, 'home') >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... extends-cache = user-cache ... """) >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... """ % server_url) >>> write(server_data, 'base_default.cfg', """\ ... [buildout] ... foo = bar ... offline = false ... """) >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... extends-cache = cache ... """) >>> write('fancy.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... """ % server_url) >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... offline = false ... """) Buildout will now assemble its configuration from all of these 6 files, defaults first. The online resources end up in the respective extends caches: >>> print system(buildout) Unused options for buildout: 'foo'. >>> ls('user-cache') - 10e772cf422123ef6c64ae770f555740 >>> cat('user-cache', os.listdir('user-cache')[0]) [buildout] foo = bar offline = false >>> ls('cache') - c72213127e6eb2208a3e1fc1dba771a7 >>> cat('cache', os.listdir('cache')[0]) [buildout] parts = offline = false If, on the other hand, the extends caches are specified in files that get extended themselves, they won't be used for assembling the configuration they belong to (user's or project's, resp.). The extends cache specified by the user's defaults does, however, apply to downloading project configuration. Let's rewrite the config files, clean out the caches and re-run buildout: >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... """) >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... extends-cache = user-cache ... """ % server_url) >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... """) >>> write('fancy.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... extends-cache = cache ... """ % server_url) >>> remove('user-cache', os.listdir('user-cache')[0]) >>> remove('cache', os.listdir('cache')[0]) >>> print system(buildout) Unused options for buildout: 'foo'. 
>>> ls('user-cache') - 0548bad6002359532de37385bb532e26 >>> cat('user-cache', os.listdir('user-cache')[0]) [buildout] parts = offline = false >>> ls('cache') Clean up: >>> rmdir('user-cache') >>> rmdir('cache') Offline mode and installation from cache ----------------------------~~~~~~~~~~~~ If we run buildout in offline mode now, it will fail because it cannot get at the remote configuration file needed by the user's defaults: >>> print system(buildout + ' -o') While: Initializing. Error: Couldn't download 'http://localhost/base_default.cfg' in offline mode. Let's now successively turn on offline mode by different parts of the configuration and see when buildout applies this setting in each case: >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... offline = true ... """) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base_default.cfg' in offline mode. >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... """) >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... offline = true ... """ % server_url) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... """ % server_url) >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... offline = true ... """) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... """) >>> write('fancy.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... offline = true ... """ % server_url) >>> print system(buildout) Unused options for buildout: 'foo'. The ``install-from-cache`` option is treated accordingly: >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... install-from-cache = true ... """) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base_default.cfg' in offline mode. >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... """) >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... install-from-cache = true ... """ % server_url) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... """ % server_url) >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... install-from-cache = true ... """) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... """) >>> write('fancy.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... install-from-cache = true ... """ % server_url) >>> print system(buildout) While: Installing. Checking for upgrades. An internal error occurred ... 
ValueError: install_from_cache set to true with no download cache >>> rmdir('home', '.buildout') Newest and non-newest behaviour for extends cache ------------------------------------------------- While offline mode forbids network access completely, 'newest' mode determines whether to look for updated versions of a resource even if some version of it is already present locally. If we run buildout in newest mode (``newest = true``), the configuration files are updated with each run: >>> mkdir("cache") >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... """) >>> write('buildout.cfg', """\ ... [buildout] ... extends-cache = cache ... extends = %sbase.cfg ... """ % server_url) >>> print system(buildout) >>> ls('cache') - 5aedc98d7e769290a29d654a591a3a45 >>> cat('cache', os.listdir(cache)[0]) [buildout] parts = A change to ``base.cfg`` is picked up on the next buildout run: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... foo = bar ... """) >>> print system(buildout + " -n") Unused options for buildout: 'foo'. >>> cat('cache', os.listdir(cache)[0]) [buildout] parts = foo = bar In contrast, when not using ``newest`` mode (``newest = false``), the files already present in the extends cache will not be updated: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... """) >>> print system(buildout + " -N") Unused options for buildout: 'foo'. >>> cat('cache', os.listdir(cache)[0]) [buildout] parts = foo = bar Even when updating base configuration files with a buildout run, any given configuration file will be downloaded only once during that particular run. If some base configuration file is extended more than once, its cached copy is used: >>> write(server_data, 'baseA.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... foo = bar ... """ % server_url) >>> write(server_data, 'baseB.cfg', """\ ... [buildout] ... extends-cache = cache ... extends = %sbase.cfg ... bar = foo ... """ % server_url) >>> write('buildout.cfg', """\ ... [buildout] ... extends-cache = cache ... newest = true ... extends = %sbaseA.cfg %sbaseB.cfg ... """ % (server_url, server_url)) >>> print system(buildout + " -n") Unused options for buildout: 'bar' 'foo'. (XXX We patch download utility's API to produce readable output for the test; a better solution would utilise the logging already done by the utility.) >>> import zc.buildout >>> old_download = zc.buildout.download.Download.download >>> def wrapper_download(self, url, md5sum=None, path=None): ... print "The URL %s was downloaded." % url ... return old_download(url, md5sum, path) >>> zc.buildout.download.Download.download = wrapper_download >>> zc.buildout.buildout.main([]) The URL http://localhost/baseA.cfg was downloaded. The URL http://localhost/base.cfg was downloaded. The URL http://localhost/baseB.cfg was downloaded. Unused options for buildout: 'bar' 'foo'. >>> zc.buildout.download.Download.download = old_download The deprecated ``extended-by`` option ------------------------------------- The ``buildout`` section used to recognise an option named ``extended-by`` that was deprecated at some point and removed in the 1.5 line. Since ignoring this option silently was considered harmful as a matter of principle, a UserError is raised if that option is encountered now: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... extended-by = foo.cfg ... """) >>> print system(buildout) While: Initializing. Error: No-longer supported "extended-by" option found in http://localhost/base.cfg. 
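To summarize the behaviour demonstrated in this section, a project that wants remotely extended configuration to keep working without network access typically combines the options along these lines (a sketch only; the URL and cache directory are placeholders)::

    [buildout]
    extends = http://config.example.com/base.cfg
    extends-cache = extends-cache
    parts =

A first online run downloads base.cfg and stores it in the extends cache; after that, running buildout with -o (or setting ``offline = true``) can assemble the same configuration from the cache, and -N avoids re-downloading configuration files that are already cached.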
Clean up -------- We should have cleaned up all temporary files created by downloading things: >>> ls(tempfile.tempdir) Reset the global temporary directory: >>> tempfile.tempdir = old_tempdir Using zc.buildout to run setup scripts ====================================== zc buildout has a convenience command for running setup scripts. Why? There are two reasons. If a setup script doesn't import setuptools, you can't use any setuptools-provided commands, like bdist_egg. When buildout runs a setup script, it arranges to import setuptools before running the script so setuptools-provided commands are available. If you use a squeaky-clean Python to do your development, the setup script that would import setuptools because setuptools isn't in the path. Because buildout requires setuptools and knows where it has installed a setuptools egg, it adds the setuptools egg to the Python path before running the script. To run a setup script, use the buildout setup command, passing the name of a script or a directory containing a setup script and arguments to the script. Let's look at an example: >>> mkdir('test') >>> cd('test') >>> write('setup.py', ... ''' ... from distutils.core import setup ... setup(name='sample') ... ''') We've created a super simple (stupid) setup script. Note that it doesn't import setuptools. Let's try running it to create an egg. We'll use the buildout script from our sample buildout: >>> print system(buildout+' setup'), ... # doctest: +NORMALIZE_WHITESPACE Error: The setup command requires the path to a setup script or directory containing a setup script, and its arguments. Oops, we forgot to give the name of the setup script: >>> print system(buildout+' setup setup.py bdist_egg'), ... # doctest: +ELLIPSIS Running setup script 'setup.py'. ... >>> ls('dist') - sample-0.0.0-py2.5.egg Note that we can specify a directory name. This is often shorter and preferred by the lazy :) >>> print system(buildout+' setup . bdist_egg'), # doctest: +ELLIPSIS Running setup script './setup.py'. ... Automatic Buildout Updates ========================== When a buildout is run, one of the first steps performed is to check for updates to either zc.buildout or setuptools. To demonstrate this, we've created some "new releases" of buildout and setuptools in a new_releases folder: >>> ls(new_releases) d setuptools - setuptools-1.99.99-py2.4.egg d zc.buildout - zc.buildout-1.100.0b1-pyN.N.egg - zc.buildout-1.99.99-py2.4.egg - zc.buildout-2.0.0-pyN.N.egg Let's update the sample buildout.cfg to look in this area: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = show-versions ... develop = showversions ... ... [show-versions] ... recipe = showversions ... """ % dict(new_releases=new_releases)) We'll also include a recipe that echos the versions of setuptools and zc.buildout used: >>> mkdir(sample_buildout, 'showversions') >>> write(sample_buildout, 'showversions', 'showversions.py', ... """ ... import pkg_resources ... ... class Recipe: ... ... def __init__(self, buildout, name, options): ... pass ... ... def install(self): ... for project in 'zc.buildout', 'setuptools': ... req = pkg_resources.Requirement.parse(project) ... print project, pkg_resources.working_set.find(req).version ... return () ... update = install ... """) >>> write(sample_buildout, 'showversions', 'setup.py', ... """ ... from setuptools import setup ... ... setup( ... name = "showversions", ... 
entry_points = {'zc.buildout': ['default = showversions:Recipe']}, ... ) ... """) Now if we run the buildout, the buildout will upgrade itself to the new versions found in new releases: >>> print system(buildout), Getting distribution for 'zc.buildout>=1.99, <2dev'. Got zc.buildout 1.99.99. Getting distribution for 'setuptools'. Got setuptools 1.99.99. Upgraded: zc.buildout version 1.99.99, setuptools version 1.99.99; restarting. Generated script '/sample-buildout/bin/buildout'. Develop: '/sample-buildout/showversions' Installing show-versions. zc.buildout 1.99.99 setuptools 1.99.99 Notice that, even though we have a newer beta version of zc.buildout available, the final "1.99.99" was selected. If you want to get non-final versions, specify a specific version in your buildout's versions section, you typically want to use the --accept-buildout-test-releases option to the bootstrap script, which internally uses the ``accept-buildout-test-releases = true`` discussed below. Also, even thought there's a later final version, buildout won't upgrade itself past version 1. Our buildout script's site.py has been updated to use the new eggs: >>> cat(sample_buildout, 'parts', 'buildout', 'site.py') ... # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS "... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" setuptools_path = '/sample-buildout/eggs/setuptools-1.99.99-pyN.N.egg' sys.path.append(setuptools_path) known_paths.add(os.path.normcase(setuptools_path)) import pkg_resources buildout_paths = [ '/sample-buildout/eggs/zc.buildout-1.99.99-pyN.N.egg', '/sample-buildout/eggs/setuptools-1.99.99-pyN.N.egg' ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) pkg_resources.working_set.add_entry(sitedir) sys.__egginsert = len(buildout_paths) # Support setuptools. original_paths = [ ... ] for path in original_paths: if path == setuptools_path or path not in known_paths: addsitedir(path, known_paths) return known_paths ... Now, let's recreate the sample buildout. If we specify constraints on the versions of zc.buildout and setuptools (or distribute) to use, running the buildout will install earlier versions of these packages: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = show-versions ... develop = showversions ... zc.buildout-version = < 1.99 ... setuptools-version = < 1.99 ... distribute-version = < 1.99 ... ... [show-versions] ... recipe = showversions ... """ % dict(new_releases=new_releases)) Now we can see that we actually "upgrade" to an earlier version. >>> print system(buildout), Upgraded: zc.buildout version 1.0.0, setuptools version 0.6; restarting. Develop: '/sample-buildout/showversions' Updating show-versions. zc.buildout 1.0.0 setuptools 0.6 There are a number of cases, described below, in which the updates don't happen. We won't upgrade in offline mode: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = show-versions ... develop = showversions ... ... [show-versions] ... recipe = showversions ... """ % dict(new_releases=new_releases)) >>> print system(buildout+' -o'), Develop: '/sample-buildout/showversions' Updating show-versions. 
zc.buildout 1.0.0 setuptools 0.6 Or in non-newest mode: >>> print system(buildout+' -N'), Develop: '/sample-buildout/showversions' Updating show-versions. zc.buildout 1.0.0 setuptools 0.6 We also won't upgrade if the buildout script being run isn't in the buildout's bin directory. To see this we'll create a new buildout directory: >>> sample_buildout2 = tmpdir('sample_buildout2') >>> write(sample_buildout2, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = ... """ % dict(new_releases=new_releases)) >>> cd(sample_buildout2) >>> print system(buildout), Creating directory '/sample_buildout2/bin'. Creating directory '/sample_buildout2/parts'. Creating directory '/sample_buildout2/eggs'. Creating directory '/sample_buildout2/develop-eggs'. Getting distribution for 'zc.buildout>=1.99, <2dev'. Got zc.buildout 1.99.99. Getting distribution for 'setuptools'. Got setuptools 1.99.99. Not upgrading because not running a local buildout command. >>> ls('bin') As mentioned above, the ``accept-buildout-test-releases = true`` means that newer non-final versions of these dependencies are preferred. Typically users are not expected to actually manipulate this value. Instead, the bootstrap script creates a buildout buildout script that passes in the value as a command line override. This then results in the buildout script being rewritten to remember the decision. We'll mimic this by passing the argument actually in the command line. >>> cd(sample_buildout) >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = show-versions ... develop = showversions ... ... [show-versions] ... recipe = showversions ... """ % dict(new_releases=new_releases)) >>> print system(buildout + ... ' buildout:accept-buildout-test-releases=true'), ... # doctest: +NORMALIZE_WHITESPACE Getting distribution for 'zc.buildout>=1.99, <2dev'. Got zc.buildout 1.100.0b1. Upgraded: zc.buildout version 1.100.0b1, setuptools version 1.99.99; restarting. Generated script '/sample-buildout/bin/buildout'. NOTE: Accepting early releases of build system packages. Rerun bootstrap without --accept-buildout-test-releases (-t) to return to default behavior. Develop: '/sample-buildout/showversions' Updating show-versions. zc.buildout 1.100.0b1 setuptools 1.99.99 The buildout script shows the change. >>> buildout_script = join(sample_buildout, 'bin', 'buildout') >>> import sys >>> if sys.platform.startswith('win'): ... buildout_script += '-script.py' >>> print open(buildout_script).read() # doctest: +ELLIPSIS #... sys.argv.insert(1, 'buildout:accept-buildout-test-releases=true') print ('NOTE: Accepting early releases of build system packages. Rerun ' 'bootstrap without --accept-buildout-test-releases (-t) to return to ' 'default behavior.') ... If the update process for buildout or setuptools fails the error should be caught (displaying a warning) and the rest of the buildout update process should continue. >>> version = sys.version_info[0:2] >>> egg = new_releases + '/zc.buildout-1.99.99-py%s.%s.egg' % version >>> copy_egg = new_releases + '/zc.buildout-1.1000-py%s.%s.egg' % version >>> import shutil >>> shutil.copy(egg, copy_egg) Create a broken egg >>> mkdir(sample_buildout, 'broken') >>> write(sample_buildout, 'broken', 'setup.py', "import broken_egg\n") >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... 
parts = show-versions ... develop = ... broken ... ... [broken] ... recipe = zc.recipe.egg ... eggs = broken ... """ % dict(new_releases=new_releases)) >>> import subprocess >>> subprocess.call([buildout]) 1 Debugging buildouts =================== Buildouts can be pretty complex. When things go wrong, it isn't always obvious why. Errors can occur due to problems in user input or due to bugs in zc.buildout or recipes. When an error occurs, Python's post-mortem debugger can be used to inspect the state of the buildout or recipe code where the error occurred. To enable this, use the -D option to the buildout. Let's create a recipe that has a bug: >>> mkdir(sample_buildout, 'recipes') >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... options['path'] = os.path.join( ... buildout['buildout']['directory'], ... options['path'], ... ) ... ... def install(self): ... directory = self.options['directory'] ... os.mkdir(directory) ... return directory ... ... def update(self): ... pass ... """) >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... ... setup(name = "recipes", ... entry_points = {'zc.buildout': ['mkdir = mkdir:Mkdir']}, ... ) ... """) And create a buildout that uses it: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = mystuff ... """) If we run the buildout, we'll get an error: >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing data-dir. While: Installing data-dir. Error: Missing option: data-dir:directory If we want to debug the error, we can add the -D option. Here's we'll supply some input: >>> print system(buildout+" -D", """\ ... up ... p self.options.keys() ... q ... """), Develop: '/sample-buildout/recipes' Installing data-dir. > /zc/buildout/buildout.py(925)__getitem__() -> raise MissingOption("Missing option: %s:%s" % (self.name, key)) (Pdb) > /sample-buildout/recipes/mkdir.py(14)install() -> directory = self.options['directory'] (Pdb) ['path', 'recipe'] (Pdb) While: Installing data-dir. Traceback (most recent call last): File "/zc/buildout/buildout.py", line 1352, in main getattr(buildout, command)(args) File "/zc/buildout/buildout.py", line 383, in install installed_files = self[part]._call(recipe.install) File "/zc/buildout/buildout.py", line 961, in _call return f() File "/sample-buildout/recipes/mkdir.py", line 14, in install directory = self.options['directory'] File "/zc/buildout/buildout.py", line 925, in __getitem__ raise MissingOption("Missing option: %s:%s" % (self.name, key)) MissingOption: Missing option: data-dir:directory Starting pdb: Testing Support =============== The zc.buildout.testing module provides an API that can be used when writing recipe tests. This API is documented below. Many examples of using this API can be found in the zc.buildout, zc.recipe.egg, and zc.recipe.testrunner tests. zc.buildout.testing.buildoutSetUp(test) --------------------------------------- The buildoutSetup function can be used as a doctest setup function. It creates a sample buildout that can be used by tests, changing the current working directory to the sample_buildout. It also adds a number of names to the test namespace: ``sample_buildout`` This is the name of a buildout with a basic configuration. 
``buildout``
    This is the path of the buildout script in the sample buildout.

``ls(*path)``
    List the contents of a directory.  The directory path is provided as one
    or more strings, to be joined with os.path.join.

``cat(*path)``
    Display the contents of a file.  The file path is provided as one or more
    strings, to be joined with os.path.join.  On Windows, if the file doesn't
    exist, the function will try adding a '-script.py' suffix.  This helps to
    work around a difference in script generation on Windows.

``mkdir(*path)``
    Create a directory.  The directory path is provided as one or more
    strings, to be joined with os.path.join.

``rmdir(*path)``
    Remove a directory.  The directory path is provided as one or more
    strings, to be joined with os.path.join.

``remove(*path)``
    Remove a directory or file.  The path is provided as one or more strings,
    to be joined with os.path.join.

``tmpdir(name)``
    Create a temporary directory with the given name.  The directory will be
    automatically removed at the end of the test.  The path of the created
    directory is returned.

    Further, if the normalize_path normalizing substitution (see below) is
    used, then any paths starting with this path will be normalized to::

      /name/restofpath

    No two temporary directories can be created with the same name.  A
    directory created with tmpdir can be removed with rmdir and recreated.

    Note that the sample_buildout directory is created by calling this
    function.

``write(*path_and_contents)``
    Create a file.  The file path is provided as one or more strings, to be
    joined with os.path.join.  The last argument is the file contents.

``system(command, input='')``
    Execute a system command with the given input passed to the command's
    standard input.  The output (error and regular output) from the command
    is returned.

``get(url)``
    Get a web page.

``cd(*path)``
    Change to the given directory.  The directory path is provided as one or
    more strings, to be joined with os.path.join.  The directory will be
    reset at the end of the test.

``uncd()``
    Change to the directory that was current prior to the previous call to
    ``cd``.  You can call ``cd`` multiple times and then ``uncd`` the same
    number of times to return to the same location.

``join(*path)``
    A convenient reference to os.path.join.

``register_teardown(func)``
    Register a tear-down function.  The function will be called with no
    arguments at the end of the test.

``start_server(path)``
    Start a web server on the given path.  The server will be shut down at
    the end of the test.  The server URL is returned.

    You can cause the server to start and stop logging its output using: >>> get(server_url+'enable_server_logging') and: >>> get(server_url+'disable_server_logging') This can be useful to see how buildout is interacting with a server.

``sdist(setup, dest)``
    Create a source distribution by running the given setup file and placing
    the result in the given destination directory.  If the setup argument is
    a directory, then the setup.py file in that directory is used.

``bdist_egg(setup, executable, dest)``
    Create an egg by running the given setup file with the given Python
    executable and placing the result in the given destination directory.
    If the setup argument is a directory, then the setup.py file in that
    directory is used.

``find_python(version)``
    Find a Python executable for the given version, where version is a
    string like "2.4".  This function uses the following strategy to find a
    Python of the given version:

    - Look for an environment variable of the form PYTHON%(version)s.
    - On Windows, look for \Python%(version)s\python

    - On Unix, try running python%(version)s or just python to get the
      executable

``zc.buildout.testing.buildoutTearDown(test)``
----------------------------------------------

Tear down everything set up by zc.buildout.testing.buildoutSetUp.  Any
functions passed to register_teardown are called as well.

``install(project, destination)``
---------------------------------

Install eggs for a given project into a destination.  If the destination is
a test object, then the eggs directory of the sample buildout
(sample_buildout) defined by the test will be used.  Tests will use this to
install the distributions for the packages being tested (and their
dependencies) into a sample buildout.  The egg to be used should already be
loaded, by importing one of the modules provided, before calling this
function.

``install_develop(project, destination)``
-----------------------------------------

Like install, but a develop egg is installed even if the current egg is not
a develop egg.

``Output normalization``
------------------------

Recipe tests often generate output that is dependent on temporary file
locations, operating system conventions or Python versions.  To deal with
these dependencies, we often use zope.testing.renormalizing.RENormalizing to
normalize test output.  zope.testing.renormalizing.RENormalizing takes pairs
of regular expressions and substitutions.  The zc.buildout.testing module
provides a few helpful variables that define regular-expression/substitution
pairs that you can pass to zope.testing.renormalizing.RENormalizing.

``normalize_path``
    Converts test paths, based on directories created with tmpdir(), to
    simple paths.

``normalize_script``
    On Unix-like systems, scripts are implemented in single files without
    suffixes.  On Windows, scripts are implemented with 2 files, a
    -script.py file and a .exe file.  This normalization converts directory
    listings of Windows scripts to the form generated on Unix-like systems.

``normalize_egg_py``
    Normalize Python version and platform indicators, if specified, in egg
    names.

Python API for egg and script installation
==========================================

The easy_install module provides some functions to provide support for egg
and script installation.  It provides functionality at the Python level that
is similar to easy_install, with a few exceptions:

- By default, we look for new packages *and* the packages that they depend
  on.  This is somewhat like (and uses) the --upgrade option of
  easy_install, except that we also upgrade required packages.

- If the highest-revision package satisfying a specification is already
  present, then we don't try to get another one.  This saves a lot of
  search time in the common case that packages are pegged to specific
  versions.

- If there is a develop egg that satisfies a requirement, we don't look for
  additional distributions.  We always give preference to develop eggs.

- Distutils options for building extensions can be passed.

Distribution installation
-------------------------

The easy_install module provides a function, install, for installing one or
more packages and their dependencies.  The install function takes 2
positional arguments:

- An iterable of setuptools requirement strings for the distributions to be
  installed, and

- A destination directory to install to and to satisfy requirements from.
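Before walking through the keyword options one by one, here is the bare call
shape.  This is only a sketch: ``tmpdir`` and ``link_server`` are the test
fixtures used by the worked examples below, and the destination name and
requirement string are arbitrary::

    import zc.buildout.easy_install

    dest = tmpdir('install-overview')     # hypothetical destination directory
    ws = zc.buildout.easy_install.install(
        ['demo==0.2'],                    # iterable of requirement strings
        dest,                             # destination directory
        links=[link_server],              # keyword options, described below
        index=link_server + 'index/')

The optional keyword arguments are described next, and the examples that
follow run calls of exactly this shape against a real link server.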
The destination directory can be None, in which case, no new distributions are downloaded and there will be an error if the needed distributions can't be found among those already installed. It supports a number of optional keyword arguments: links A sequence of URLs, file names, or directories to look for links to distributions. index The URL of an index server, or almost any other valid URL. :) If not specified, the Python Package Index, http://pypi.python.org/simple/, is used. You can specify an alternate index with this option. If you use the links option and if the links point to the needed distributions, then the index can be anything and will be largely ignored. In the examples, here, we'll just point to an empty directory on our link server. This will make our examples run a little bit faster. executable A path to a Python executable. Distributions will be installed using this executable and will be for the matching Python version. path A list of additional directories to search for locally-installed distributions. always_unzip A flag indicating that newly-downloaded distributions should be directories even if they could be installed as zip files. working_set An existing working set to be augmented with additional distributions, if necessary to satisfy requirements. This allows you to call install multiple times, if necessary, to gather multiple sets of requirements. newest A boolean value indicating whether to search for new distributions when already-installed distributions meet the requirement. When this is true, the default, and when the destination directory is not None, then the install function will search for the newest distributions that satisfy the requirements. versions A dictionary mapping project names to version numbers to be used when selecting distributions. This can be used to specify a set of distribution versions independent of other requirements. use_dependency_links A flag indicating whether to search for dependencies using the setup dependency_links metadata or not. If true, links are searched for using dependency_links in preference to other locations. Defaults to true. include_site_packages A flag indicating whether Python's non-standard-library packages should be available for finding dependencies. Defaults to true. Paths outside of Python's standard library--or more precisely, those that are not included when Python is started with the -S argument--are loosely referred to as "site-packages" here. relative_paths Adjust egg paths so they are relative to the script path. This allows scripts to work when scripts and eggs are moved, as long as they are both moved in the same way. The install method returns a working set containing the distributions needed to meet the given requirements. We have a link server that has a number of eggs: >>> print get(link_server), bigdemo-0.1-py2.4.egg
demo-0.1-py2.4.egg
demo-0.2-py2.4.egg
demo-0.3-py2.4.egg
demo-0.4c1-py2.4.egg
demoneeded-1.0.zip
demoneeded-1.1.zip
demoneeded-1.2c1.zip
extdemo-1.4.zip
index/
other-1.0-py2.4.egg
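Before installing anything from this server, one note on the ``working_set``
option described above, which the examples in this section don't otherwise
exercise: the working set returned by one call can be passed back in to a
later call so that requirements are gathered incrementally.  This is only a
sketch, reusing the ``dest`` and ``link_server`` names from the surrounding
examples::

    ws = zc.buildout.easy_install.install(
        ['demo'], dest, links=[link_server], index=link_server + 'index/')
    ws = zc.buildout.easy_install.install(
        ['other'], dest, links=[link_server], index=link_server + 'index/',
        working_set=ws)   # augment the working set from the first call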
Let's make a directory and install the demo egg to it, using the demo: >>> dest = tmpdir('sample-install') >>> import zc.buildout.easy_install >>> ws = zc.buildout.easy_install.install( ... ['demo==0.2'], dest, ... links=[link_server], index=link_server+'index/') We requested version 0.2 of the demo distribution to be installed into the destination server. We specified that we should search for links on the link server and that we should use the (empty) link server index directory as a package index. The working set contains the distributions we retrieved. >>> for dist in ws: ... print dist demo 0.2 demoneeded 1.1 We got demoneeded because it was a dependency of demo. And the actual eggs were added to the eggs directory. >>> ls(dest) - demo-0.2-py2.4.egg - demoneeded-1.1-py2.4.egg If we remove the version restriction on demo, but specify a false value for newest, no new distributions will be installed: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... newest=False) >>> ls(dest) - demo-0.2-py2.4.egg - demoneeded-1.1-py2.4.egg If we leave off the newest option, we'll get an update for demo: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/') >>> ls(dest) - demo-0.2-py2.4.egg - demo-0.3-py2.4.egg - demoneeded-1.1-py2.4.egg Note that we didn't get the newest versions available. There were release candidates for newer versions of both packages. By default, final releases are preferred. We can change this behavior using the prefer_final function: >>> zc.buildout.easy_install.prefer_final(False) True The old setting is returned. >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/') >>> for dist in ws: ... print dist demo 0.4c1 demoneeded 1.2c1 >>> ls(dest) - demo-0.2-py2.4.egg - demo-0.3-py2.4.egg - demo-0.4c1-py2.4.egg - demoneeded-1.1-py2.4.egg - demoneeded-1.2c1-py2.4.egg Let's put the setting back to the default. >>> zc.buildout.easy_install.prefer_final(True) False We can supply additional distributions. We can also supply specifications for distributions that would normally be found via dependencies. We might do this to specify a specific version. >>> ws = zc.buildout.easy_install.install( ... ['demo', 'other', 'demoneeded==1.0'], dest, ... links=[link_server], index=link_server+'index/') >>> for dist in ws: ... print dist demo 0.3 other 1.0 demoneeded 1.0 >>> ls(dest) - demo-0.2-py2.4.egg - demo-0.3-py2.4.egg - demo-0.4c1-py2.4.egg - demoneeded-1.0-py2.4.egg - demoneeded-1.1-py2.4.egg - demoneeded-1.2c1-py2.4.egg d other-1.0-py2.4.egg We can request that eggs be unzipped even if they are zip safe. This can be useful when debugging. (Note that Distribute will unzip eggs by default, so if you are using Distribute, most or all eggs will already be unzipped without this flag.) >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... always_unzip=True) >>> ls(dest) d demo-0.3-py2.4.egg d demoneeded-1.1-py2.4.egg >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... 
always_unzip=False) >>> ls(dest) - demo-0.3-py2.4.egg - demoneeded-1.1-py2.4.egg We can also set a default by calling the always_unzip function: >>> zc.buildout.easy_install.always_unzip(True) False The old default is returned: >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/') >>> ls(dest) d demo-0.3-py2.4.egg d demoneeded-1.1-py2.4.egg >>> zc.buildout.easy_install.always_unzip(False) True >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/') >>> ls(dest) - demo-0.3-py2.4.egg - demoneeded-1.1-py2.4.egg >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... always_unzip=True) >>> ls(dest) d demo-0.3-py2.4.egg d demoneeded-1.1-py2.4.egg Specifying version information independent of requirements ---------------------------------------------------------- Sometimes it's useful to specify version information independent of normal requirements specifications. For example, a buildout may need to lock down a set of versions, without having to put put version numbers in setup files or part definitions. If a dictionary is passed to the install function, mapping project names to version numbers, then the versions numbers will be used. >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... versions = dict(demo='0.2', demoneeded='1.0')) >>> [d.version for d in ws] ['0.2', '1.0'] In this example, we specified a version for demoneeded, even though we didn't define a requirement for it. The versions specified apply to dependencies as well as the specified requirements. If we specify a version that's incompatible with a requirement, then we'll get an error: >>> from zope.testing.loggingsupport import InstalledHandler >>> handler = InstalledHandler('zc.buildout.easy_install') >>> import logging >>> logging.getLogger('zc.buildout.easy_install').propagate = False >>> ws = zc.buildout.easy_install.install( ... ['demo >0.2'], dest, links=[link_server], ... index=link_server+'index/', ... versions = dict(demo='0.2', demoneeded='1.0')) Traceback (most recent call last): ... IncompatibleConstraintError: Bad constraint 0.2 demo>0.2 >>> print handler zc.buildout.easy_install DEBUG Installing 'demo >0.2'. zc.buildout.easy_install ERROR The constraint, 0.2, is not consistent with the requirement, 'demo>0.2'. >>> handler.clear() If no versions are specified, a debugging message will be output reporting that a version was picked automatically: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... ) >>> print handler zc.buildout.easy_install DEBUG Installing 'demo'. zc.buildout.easy_install DEBUG We have the best distribution that satisfies 'demo'. zc.buildout.easy_install DEBUG Picked: demo = 0.3 zc.buildout.easy_install DEBUG Getting required 'demoneeded' zc.buildout.easy_install DEBUG required by demo 0.3. zc.buildout.easy_install DEBUG We have the best distribution that satisfies 'demoneeded'. 
zc.buildout.easy_install DEBUG Picked: demoneeded = 1.1 >>> handler.uninstall() >>> logging.getLogger('zc.buildout.easy_install').propagate = True We can request that we get an error if versions are picked: >>> zc.buildout.easy_install.allow_picked_versions(False) True (The old setting is returned.) >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... ) Traceback (most recent call last): ... UserError: Picked: demo = 0.3 >>> zc.buildout.easy_install.allow_picked_versions(True) False The function default_versions can be used to get and set default version information to be used when no version information is passes. If called with an argument, it sets the default versions: >>> zc.buildout.easy_install.default_versions(dict(demoneeded='1')) {} It always returns the previous default versions. If called without an argument, it simply returns the default versions without changing them: >>> zc.buildout.easy_install.default_versions() {'demoneeded': '1'} So with the default versions set, we'll get the requested version even if the versions option isn't used: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... ) >>> [d.version for d in ws] ['0.3', '1.0'] Of course, we can unset the default versions by passing an empty dictionary: >>> zc.buildout.easy_install.default_versions({}) {'demoneeded': '1'} >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... ) >>> [d.version for d in ws] ['0.3', '1.1'] Dependencies in Site Packages ----------------------------- Paths outside of Python's standard library--or more precisely, those that are not included when Python is started with the -S argument--are loosely referred to as "site-packages" here. These site-packages are searched by default for distributions. This can be disabled, so that, for instance, a system Python can be used with buildout, cleaned of any packages installed by a user or system package manager. The default behavior can be controlled and introspected using zc.buildout.easy_install.include_site_packages. >>> zc.buildout.easy_install.include_site_packages() True Here's an example of using a Python executable that includes our dependencies. Our "py_path" will have the "demoneeded," and "demo" packages available. We'll simply be asking for "demoneeded" here, but without any external index or links. >>> from zc.buildout.tests import create_sample_sys_install >>> py_path, site_packages_path = make_py() >>> create_sample_sys_install(site_packages_path) >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, links=[], executable=py_path, ... index=None) >>> [dist.project_name for dist in workingset] ['demoneeded'] That worked fine. Let's try again with site packages not allowed. We'll change the policy by changing the default. Notice that the function for changing the default value returns the previous value. >>> zc.buildout.easy_install.include_site_packages(False) True >>> zc.buildout.easy_install.include_site_packages() False >>> zc.buildout.easy_install.clear_index_cache() >>> rmdir(example_dest) >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, links=[], executable=py_path, ... index=None) Traceback (most recent call last): ... 
MissingDistribution: Couldn't find a distribution for 'demoneeded'. >>> zc.buildout.easy_install.clear_index_cache() Now we'll reset the default. >>> zc.buildout.easy_install.include_site_packages(True) False >>> zc.buildout.easy_install.include_site_packages() True Dependency links ---------------- Setuptools allows metadata that describes where to search for package dependencies. This option is called dependency_links. Buildout has its own notion of where to look for dependencies, but it also uses the setup tools dependency_links information if it's available. Let's demo this by creating an egg that specifies dependency_links. To begin, let's create a new egg repository. This repository hold a newer version of the 'demoneeded' egg than the sample repository does. >>> repoloc = tmpdir('repo') >>> from zc.buildout.tests import create_egg >>> create_egg('demoneeded', '1.2', repoloc) >>> link_server2 = start_server(repoloc) Turn on logging on this server so that we can see when eggs are pulled from it. >>> get(link_server2 + 'enable_server_logging') GET 200 /enable_server_logging '' Now we can create an egg that specifies that its dependencies are found on this server. >>> repoloc = tmpdir('repo2') >>> create_egg('hasdeps', '1.0', repoloc, ... install_requires = "'demoneeded'", ... dependency_links = [link_server2]) Let's add the egg to another repository. >>> link_server3 = start_server(repoloc) Now let's install the egg. >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, ... links=[link_server3], index=link_server3+'index/') GET 200 / GET 200 /demoneeded-1.2-pyN.N.egg The server logs show that the dependency was retrieved from the server specified in the dependency_links. Now let's see what happens if we provide two different ways to retrieve the dependencies. >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... links=[link_server, link_server3]) GET 200 / GET 200 /demoneeded-1.2-pyN.N.egg Once again the dependency is fetched from the logging server even though it is also available from the non-logging server. This is because the version on the logging server is newer and buildout normally chooses the newest egg available. If you wish to control where dependencies come from regardless of dependency_links setup metadata use the 'use_dependency_links' option to zc.buildout.easy_install.install(). >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... links=[link_server, link_server3], ... use_dependency_links=False) Notice that this time the dependency egg is not fetched from the logging server. When you specify not to use dependency_links, eggs will only be searched for using the links you explicitly provide. Another way to control this option is with the zc.buildout.easy_install.use_dependency_links() function. This function sets the default behavior for the zc.buildout.easy_install() function. >>> zc.buildout.easy_install.use_dependency_links(False) True The function returns its previous setting. >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... 
links=[link_server, link_server3]) It can be overridden by passing a keyword argument to the install function. >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... links=[link_server, link_server3], ... use_dependency_links=True) GET 200 /demoneeded-1.2-pyN.N.egg To return the dependency_links behavior to normal call the function again. >>> zc.buildout.easy_install.use_dependency_links(True) False >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... links=[link_server, link_server3]) GET 200 /demoneeded-1.2-pyN.N.egg Script generation ----------------- The easy_install module provides support for creating scripts from eggs. It provides two competing functions. One, ``scripts``, is a well-established approach to generating reliable scripts with a "clean" Python--e.g., one that does not have any packages in its site-packages. The other, ``sitepackage_safe_scripts``, is newer, a bit trickier, and is designed to work with a Python that has code in its site-packages, such as a system Python. Both are similar to setuptools except that they provides facilities for baking a script's path into the script. This has two advantages: - The eggs to be used by a script are not chosen at run time, making startup faster and, more importantly, deterministic. - The script doesn't have to import pkg_resources because the logic that pkg_resources would execute at run time is executed at script-creation time. (There is an exception in ``sitepackage_safe_scripts`` if you want to have your Python's site packages available, as discussed below, but even in that case pkg_resources is only partially activated, which can be a significant time savings.) The ``scripts`` function ~~~~~~~~~~~~~~~~~~~~~~~~ The ``scripts`` function is the first way to generate scripts that we'll examine. It is the earlier approach that the package offered. Let's create a destination directory for it to place them in: >>> bin = tmpdir('bin') Now, we'll use the scripts function to generate scripts in this directory from the demo egg: >>> import sys >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin) the four arguments we passed were: 1. A sequence of distribution requirements. These are of the same form as setuptools requirements. Here we passed a single requirement, for the version 0.1 demo distribution. 2. A working set, 3. The Python executable to use, and 3. The destination directory. The bin directory now contains a generated script: >>> ls(bin) - demo The return value is a list of the scripts generated: >>> import os, sys >>> if sys.platform == 'win32': ... scripts == [os.path.join(bin, 'demo.exe'), ... os.path.join(bin, 'demo-script.py')] ... else: ... scripts == [os.path.join(bin, 'demo')] True Note that in Windows, 2 files are generated for each script. A script file, ending in '-script.py', and an exe file that allows the script to be invoked directly without having to specify the Python interpreter and without having to provide a '.py' suffix. 
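The rest of this section introduces the keyword options of ``scripts`` one
at a time: an ``interpreter`` name, a script-name mapping, ``extra_paths``,
``arguments``, ``initialization`` and ``relative_paths``.  For orientation,
a combined call might look like the following sketch; ``ws``, ``bin`` and
``sys.executable`` are the names used in the examples above, and the other
values are hypothetical::

    scripts = zc.buildout.easy_install.scripts(
        ['demo'],                     # requirements (or (name, module, attr) tuples)
        ws,                           # working set providing the distributions
        sys.executable,               # Python executable to use
        bin,                          # destination directory for the scripts
        dict(demo='run'),             # rename the generated script
        extra_paths=['/path/to/extra'],            # extra sys.path entries
        arguments='1, 2',             # source text placed in the entry-point call
        interpreter='py',             # also generate an interpreter script
        initialization='import os\nos.chdir("foo")')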
The demo script run the entry point defined in the demo egg: >>> cat(bin, 'demo') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) Some things to note: - The demo and demoneeded eggs are added to the beginning of sys.path. - The module for the script entry point is imported and the entry point, in this case, 'main', is run. Rather than requirement strings, you can pass tuples containing 3 strings: - A script name, - A module, - An attribute expression for an entry point within the module. For example, we could have passed entry point information directly rather than passing a requirement: >>> scripts = zc.buildout.easy_install.scripts( ... [('demo', 'eggrecipedemo', 'main')], ... ws, sys.executable, bin) >>> cat(bin, 'demo') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) Passing entry-point information directly is handy when using eggs (or distributions) that don't declare their entry points, such as distributions that aren't based on setuptools. The interpreter keyword argument can be used to generate a script that can be used to invoke the Python interactive interpreter with the path set based on the working set. This generated script can also be used to run other scripts with the path set on the working set: >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, interpreter='py') >>> ls(bin) - demo - py >>> if sys.platform == 'win32': ... scripts == [os.path.join(bin, 'demo.exe'), ... os.path.join(bin, 'demo-script.py'), ... os.path.join(bin, 'py.exe'), ... os.path.join(bin, 'py-script.py')] ... else: ... scripts == [os.path.join(bin, 'demo'), ... os.path.join(bin, 'py')] True The py script simply runs the Python interactive interpreter with the path set: >>> cat(bin, 'py') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-pyN.N.egg', '/sample-install/demoneeded-1.1-pyN.N.egg', ] _interactive = True if len(sys.argv) > 1: _options, _args = __import__("getopt").getopt(sys.argv[1:], 'ic:m:') _interactive = False for (_opt, _val) in _options: if _opt == '-i': _interactive = True elif _opt == '-c': exec _val elif _opt == '-m': sys.argv[1:] = _args _args = [] __import__("runpy").run_module( _val, {}, "__main__", alter_sys=True) if _args: sys.argv[:] = _args __file__ = _args[0] del _options, _args execfile(__file__) if _interactive: del _interactive __import__("code").interact(banner="", local=globals()) If invoked with a script name and arguments, it will run that script, instead. >>> write('ascript', ''' ... "demo doc" ... print sys.argv ... print (__name__, __file__, __doc__) ... ''') >>> print system(join(bin, 'py')+' ascript a b c'), ['ascript', 'a', 'b', 'c'] ('__main__', 'ascript', 'demo doc') For Python 2.5 and higher, you can also use the -m option to run a module: >>> print system(join(bin, 'py')+' -m pdb'), usage: pdb.py scriptfile [arg] ... >>> print system(join(bin, 'py')+' -m pdb what'), Error: what does not exist An additional argument can be passed to define which scripts to install and to provide script names. 
The argument is a dictionary mapping original script names to new script names. >>> bin = tmpdir('bin2') >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, dict(demo='run')) >>> if sys.platform == 'win32': ... scripts == [os.path.join(bin, 'run.exe'), ... os.path.join(bin, 'run-script.py')] ... else: ... scripts == [os.path.join(bin, 'run')] True >>> ls(bin) - run >>> print system(os.path.join(bin, 'run')), 3 1 The ``scripts`` function: Including extra paths in scripts ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ We can pass a keyword argument, extra paths, to cause additional paths to be included in the a generated script: >>> foo = tmpdir('foo') >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, dict(demo='run'), ... extra_paths=[foo]) >>> cat(bin, 'run') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', '/foo', ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) The ``scripts`` function: Providing script arguments ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ An "argument" keyword argument can be used to pass arguments to an entry point. The value passed is a source string to be placed between the parentheses in the call: >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, dict(demo='run'), ... arguments='1, 2') >>> cat(bin, 'run') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main(1, 2)) The ``scripts`` function: Passing initialization code ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ You can also pass script initialization code: >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, dict(demo='run'), ... arguments='1, 2', ... initialization='import os\nos.chdir("foo")') >>> cat(bin, 'run') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', ] import os os.chdir("foo") import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main(1, 2)) The ``scripts`` function: Relative paths ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Sometimes, you want to be able to move a buildout directory around and have scripts still work without having to rebuild them. We can control this using the relative_paths option to install. You need to pass a common base directory of the scripts and eggs: >>> bo = tmpdir('bo') >>> ba = tmpdir('ba') >>> mkdir(bo, 'eggs') >>> mkdir(bo, 'bin') >>> mkdir(bo, 'other') >>> ws = zc.buildout.easy_install.install( ... ['demo'], join(bo, 'eggs'), links=[link_server], ... index=link_server+'index/') >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, join(bo, 'bin'), dict(demo='run'), ... extra_paths=[ba, join(bo, 'bar')], ... interpreter='py', ... 
relative_paths=bo) >>> cat(bo, 'bin', 'run') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import os join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) import sys sys.path[0:0] = [ join(base, 'eggs/demo-0.3-pyN.N.egg'), join(base, 'eggs/demoneeded-1.1-pyN.N.egg'), '/ba', join(base, 'bar'), ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) Note that the extra path we specified that was outside the directory passed as relative_paths wasn't converted to a relative path. Of course, running the script works: >>> print system(join(bo, 'bin', 'run')), 3 1 We specified an interpreter and its paths are adjusted too: >>> cat(bo, 'bin', 'py') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import os join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) import sys sys.path[0:0] = [ join(base, 'eggs/demo-0.3-pyN.N.egg'), join(base, 'eggs/demoneeded-1.1-pyN.N.egg'), '/ba', join(base, 'bar'), ] _interactive = True if len(sys.argv) > 1: _options, _args = __import__("getopt").getopt(sys.argv[1:], 'ic:m:') _interactive = False for (_opt, _val) in _options: if _opt == '-i': _interactive = True elif _opt == '-c': exec _val elif _opt == '-m': sys.argv[1:] = _args _args = [] __import__("runpy").run_module( _val, {}, "__main__", alter_sys=True) if _args: sys.argv[:] = _args __file__ = _args[0] del _options, _args execfile(__file__) if _interactive: del _interactive __import__("code").interact(banner="", local=globals()) The ``sitepackage_safe_scripts`` function ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The newer function for creating scripts is ``sitepackage_safe_scripts``. It has the same basic functionality as the ``scripts`` function: it can create scripts to run arbitrary entry points, and to run a Python interpreter. The following are the differences from a user's perspective. - It can be used safely with a Python that has packages installed itself, such as a system-installed Python. - In contrast to the interpreter generated by the ``scripts`` method, which supports only a small subset of the usual Python executable's options, the interpreter generated by ``sitepackage_safe_scripts`` supports all of them. This makes it possible to use as full Python replacement for scripts that need the distributions specified in your buildout. - Both the interpreter and the entry point scripts allow you to include the site packages, and/or the sitecustomize, of the Python executable, if desired. It works by creating site.py and sitecustomize.py files that set up the desired paths and initialization. These must be placed within an otherwise empty directory. Typically this is in a recipe's parts directory. Here's the simplest example, building an interpreter script. >>> interpreter_dir = tmpdir('interpreter') >>> interpreter_parts_dir = os.path.join( ... interpreter_dir, 'parts', 'interpreter') >>> interpreter_bin_dir = os.path.join(interpreter_dir, 'bin') >>> mkdir(interpreter_bin_dir) >>> mkdir(interpreter_dir, 'eggs') >>> mkdir(interpreter_dir, 'parts') >>> mkdir(interpreter_parts_dir) >>> ws = zc.buildout.easy_install.install( ... ['demo'], join(interpreter_dir, 'eggs'), links=[link_server], ... index=link_server+'index/') >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... 
interpreter='py') Depending on whether the machine being used is running Windows or not, this produces either three or four files. In both cases, we have site.py and sitecustomize.py generated in the parts/interpreter directory. For Windows, we have py.exe and py-script.py; for other operating systems, we have py. >>> sitecustomize_path = os.path.join( ... interpreter_parts_dir, 'sitecustomize.py') >>> site_path = os.path.join(interpreter_parts_dir, 'site.py') >>> interpreter_path = os.path.join(interpreter_bin_dir, 'py') >>> if sys.platform == 'win32': ... py_path = os.path.join(interpreter_bin_dir, 'py-script.py') ... expected = [sitecustomize_path, ... site_path, ... os.path.join(interpreter_bin_dir, 'py.exe'), ... py_path] ... else: ... py_path = interpreter_path ... expected = [sitecustomize_path, site_path, py_path] ... >>> assert generated == expected, repr((generated, expected)) We didn't ask for any initialization, and we didn't ask to use the underlying sitecustomization, so sitecustomize.py is empty. >>> cat(sitecustomize_path) The interpreter script is simple. It puts the directory with the site.py and sitecustomize.py on the PYTHONPATH and (re)starts Python. >>> cat(py_path) #!/usr/bin/python -S import os import sys argv = [sys.executable] + sys.argv[1:] environ = os.environ.copy() path = '/interpreter/parts/interpreter' if environ.get('PYTHONPATH'): path = os.pathsep.join([path, environ['PYTHONPATH']]) environ['PYTHONPATH'] = path os.execve(sys.executable, argv, environ) The site.py file is a modified version of the underlying Python's site.py. The most important modification is that it has a different version of the addsitepackages function. It sets up the Python path, similarly to the behavior of the function it replaces. The following shows the part that buildout inserts, in the simplest case. >>> sys.stdout.write('#\n'); cat(site_path) ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" buildout_paths = [ '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg' ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) return known_paths def original_addsitepackages(known_paths):... Here are some examples of the interpreter in use. >>> print call_py(interpreter_path, "print 16+26") 42 >>> res = call_py(interpreter_path, "import sys; print sys.path") >>> print res # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE ['', '/interpreter/parts/interpreter', ..., '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg'] >>> clean_paths = eval(res.strip()) # This is used later for comparison. If you provide initialization, it goes in sitecustomize.py. >>> def reset_interpreter(): ... # This is necessary because, in our tests, the timestamps of the ... # .pyc files are not outdated when we want them to be. ... rmdir(interpreter_bin_dir) ... mkdir(interpreter_bin_dir) ... rmdir(interpreter_parts_dir) ... mkdir(interpreter_parts_dir) ... >>> reset_interpreter() >>> initialization_string = """\ ... import os ... os.environ['FOO'] = 'bar baz bing shazam'""" >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... 
interpreter='py', initialization=initialization_string) >>> cat(sitecustomize_path) import os os.environ['FOO'] = 'bar baz bing shazam' >>> print call_py(interpreter_path, "import os; print os.environ['FOO']") bar baz bing shazam If you use relative paths, this affects the interpreter and site.py. (This is again the UNIX version; the Windows version uses subprocess instead of os.execve.) >>> reset_interpreter() >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', relative_paths=interpreter_dir) >>> cat(py_path) #!/usr/bin/python -S import os import sys join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) argv = [sys.executable] + sys.argv[1:] environ = os.environ.copy() path = join(base, 'parts/interpreter') if environ.get('PYTHONPATH'): path = os.pathsep.join([path, environ['PYTHONPATH']]) environ['PYTHONPATH'] = path os.execve(sys.executable, argv, environ) For site.py, we again show only the pertinent parts. Notice that the egg paths join a base to a path, as with the use of this argument in the ``scripts`` function. >>> sys.stdout.write('#\n'); cat(site_path) # doctest: +ELLIPSIS #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) base = os.path.dirname(base) buildout_paths = [ join(base, 'eggs/demo-0.3-pyN.N.egg'), join(base, 'eggs/demoneeded-1.1-pyN.N.egg') ]... The paths resolve in practice as you would expect. >>> print call_py(interpreter_path, ... "import sys, pprint; pprint.pprint(sys.path)") ... # doctest: +ELLIPSIS ['', '/interpreter/parts/interpreter', ..., '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg'] The ``extra_paths`` argument affects the path in site.py. Notice that /interpreter/other is added after the eggs. >>> reset_interpreter() >>> mkdir(interpreter_dir, 'other') >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', extra_paths=[join(interpreter_dir, 'other')]) >>> sys.stdout.write('#\n'); cat(site_path) # doctest: +ELLIPSIS #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" buildout_paths = [ '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg', '/interpreter/other' ]... >>> print call_py(interpreter_path, ... "import sys, pprint; pprint.pprint(sys.path)") ... # doctest: +ELLIPSIS ['', '/interpreter/parts/interpreter', ..., '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg', '/interpreter/other'] The ``sitepackage_safe_scripts`` function: using site-packages ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The ``sitepackage_safe_scripts`` function supports including site packages. This has some advantages and some serious dangers. A typical reason to include site-packages is that it is easier to install one or more dependencies in your Python than it is with buildout. 
Some packages, such as lxml or Python PostgreSQL integration, have dependencies that can be much easier to build and/or install using other mechanisms, such as your operating system's package manager. By installing some core packages into your Python's site-packages, this can significantly simplify some application installations. However, doing this has a significant danger. One of the primary goals of buildout is to provide repeatability. Some packages (one of the better known Python openid packages, for instance) change their behavior depending on what packages are available. If Python curl bindings are available, these may be preferred by the library. If a certain XML package is installed, it may be preferred by the library. These hidden choices may cause small or large behavior differences. The fact that they can be rarely encountered can actually make it worse: you forget that this might be a problem, and debugging the differences can be difficult. If you allow site-packages to be included in your buildout, and the Python you use is not managed precisely by your application (for instance, it is a system Python), you open yourself up to these possibilities. Don't be unaware of the dangers. That explained, let's see how it works. If you don't use namespace packages, this is very straightforward. >>> reset_interpreter() >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', include_site_packages=True) >>> sys.stdout.write('#\n'); cat(site_path) ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" setuptools_path = None buildout_paths = [ '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg' ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) sys.__egginsert = len(buildout_paths) # Support distribute. original_paths = [ ... ] for path in original_paths: if path == setuptools_path or path not in known_paths: addsitedir(path, known_paths) return known_paths def original_addsitepackages(known_paths):... It simply adds the original paths using addsitedir after the code to add the buildout paths. Here's an example of the new script in use. Other documents and tests in this package give the feature a more thorough workout, but this should give you an idea of the feature. >>> res = call_py(interpreter_path, "import sys; print sys.path") >>> print res # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE ['', '/interpreter/parts/interpreter', ..., '/interpreter/eggs/demo-0.3-py2.4.egg', '/interpreter/eggs/demoneeded-1.1-py2.4.egg', ...] The clean_paths gathered earlier is a subset of this full list of paths. >>> full_paths = eval(res.strip()) >>> len(clean_paths) < len(full_paths) True >>> set(os.path.normpath(p) for p in clean_paths).issubset( ... os.path.normpath(p) for p in full_paths) True Unfortunately, because of how setuptools namespace packages are implemented differently for operating system packages (debs or rpms) as opposed to standard setuptools installation, there's a slightly trickier dance if you use them. To show this we'll needs some extra eggs that use namespaces. We'll use the ``tellmy.fortune`` package, which we'll need to make an initial call to another text fixture to create. 
>>> from zc.buildout.tests import create_sample_namespace_eggs >>> namespace_eggs = tmpdir('namespace_eggs') >>> create_sample_namespace_eggs(namespace_eggs) >>> reset_interpreter() >>> ws = zc.buildout.easy_install.install( ... ['demo', 'tellmy.fortune'], join(interpreter_dir, 'eggs'), ... links=[link_server, namespace_eggs], index=link_server+'index/') >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', include_site_packages=True) >>> sys.stdout.write('#\n'); cat(site_path) ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" setuptools_path = '...setuptools...' sys.path.append(setuptools_path) known_paths.add(os.path.normcase(setuptools_path)) import pkg_resources buildout_paths = [ '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/tellmy.fortune-1.0-pyN.N.egg', '...setuptools...', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg' ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) pkg_resources.working_set.add_entry(sitedir) sys.__egginsert = len(buildout_paths) # Support distribute. original_paths = [ ... ] for path in original_paths: if path == setuptools_path or path not in known_paths: addsitedir(path, known_paths) return known_paths def original_addsitepackages(known_paths):... >>> print call_py(interpreter_path, "import sys; print sys.path") ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE ['', '/interpreter/parts/interpreter', ..., '...setuptools...', '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/tellmy.fortune-1.0-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg', ...] As you can see, the script now first imports pkg_resources. Then we need to process egg files specially to look for namespace packages there *before* we process process lines in .pth files that use the "import" feature--lines that might be part of the setuptools namespace package implementation for system packages, as mentioned above, and that must come after processing egg namespaces. The most complex that this function gets is if you use namespace packages, include site-packages, and use relative paths. For completeness, we'll look at that result. >>> reset_interpreter() >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', include_site_packages=True, ... relative_paths=interpreter_dir) >>> sys.stdout.write('#\n'); cat(site_path) ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) base = os.path.dirname(base) setuptools_path = '...setuptools...' 
sys.path.append(setuptools_path) known_paths.add(os.path.normcase(setuptools_path)) import pkg_resources buildout_paths = [ join(base, 'eggs/demo-0.3-pyN.N.egg'), join(base, 'eggs/tellmy.fortune-1.0-pyN.N.egg'), '...setuptools...', join(base, 'eggs/demoneeded-1.1-pyN.N.egg') ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) pkg_resources.working_set.add_entry(sitedir) sys.__egginsert = len(buildout_paths) # Support distribute. original_paths = [ ... ] for path in original_paths: if path == setuptools_path or path not in known_paths: addsitedir(path, known_paths) return known_paths def original_addsitepackages(known_paths):... >>> print call_py(interpreter_path, "import sys; print sys.path") ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE ['', '/interpreter/parts/interpreter', ..., '...setuptools...', '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/tellmy.fortune-1.0-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg', ...] The ``exec_sitecustomize`` argument does the same thing for the sitecustomize module--it allows you to include the code from the sitecustomize module in the underlying Python if you set the argument to True. The z3c.recipe.scripts package sets up the full environment necessary to demonstrate this piece. The ``sitepackage_safe_scripts`` function: writing scripts for entry points ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ All of the examples so far for this function have been creating interpreters. The function can also write scripts for entry points. They are almost identical to the scripts that we saw for the ``scripts`` function except that they ``import site`` after setting the sys.path to include our custom site.py and sitecustomize.py files. These files then initialize the Python environment as we have already seen. Let's see a simple example. >>> reset_interpreter() >>> ws = zc.buildout.easy_install.install( ... ['demo'], join(interpreter_dir, 'eggs'), links=[link_server], ... index=link_server+'index/') >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... reqs=['demo']) As before, in Windows, 2 files are generated for each script. A script file, ending in '-script.py', and an exe file that allows the script to be invoked directly without having to specify the Python interpreter and without having to provide a '.py' suffix. This is in addition to the site.py and sitecustomize.py files that are generated as with our interpreter examples above. >>> if sys.platform == 'win32': ... demo_path = os.path.join(interpreter_bin_dir, 'demo-script.py') ... expected = [sitecustomize_path, ... site_path, ... os.path.join(interpreter_bin_dir, 'demo.exe'), ... demo_path] ... else: ... demo_path = os.path.join(interpreter_bin_dir, 'demo') ... expected = [sitecustomize_path, site_path, demo_path] ... 
>>> assert generated == expected, repr((generated, expected)) The demo script runs the entry point defined in the demo egg: >>> cat(demo_path) # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 -S import sys sys.path[0:0] = [ '/interpreter/parts/interpreter', ] import os path = sys.path[0] if os.environ.get('PYTHONPATH'): path = os.pathsep.join([path, os.environ['PYTHONPATH']]) os.environ['BUILDOUT_ORIGINAL_PYTHONPATH'] = os.environ.get('PYTHONPATH', '') os.environ['PYTHONPATH'] = path import site # imports custom buildout-generated site.py import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) >>> demo_call = join(interpreter_bin_dir, 'demo') >>> if sys.platform == 'win32': ... demo_call = '"%s"' % demo_call >>> print system(demo_call) 3 1 There are a few differences from the ``scripts`` function. First, the ``reqs`` argument (an iterable of string requirements or entry point tuples) is a keyword argument here. We see that in the example above. Second, the ``arguments`` argument is now named ``script_arguments`` to try and clarify that it does not affect interpreters. While the ``initialization`` argument continues to affect both the interpreters and the entry point scripts, if you have initialization that is only pertinent to the entry point scripts, you can use the ``script_initialization`` argument. Let's see ``script_arguments`` and ``script_initialization`` in action. >>> reset_interpreter() >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... reqs=['demo'], script_arguments='1, 2', ... script_initialization='import os\nos.chdir("foo")') >>> cat(demo_path) # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 -S import sys sys.path[0:0] = [ '/interpreter/parts/interpreter', ] import os path = sys.path[0] if os.environ.get('PYTHONPATH'): path = os.pathsep.join([path, os.environ['PYTHONPATH']]) os.environ['BUILDOUT_ORIGINAL_PYTHONPATH'] = os.environ.get('PYTHONPATH', '') os.environ['PYTHONPATH'] = path import site # imports custom buildout-generated site.py import os os.chdir("foo") import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main(1, 2)) Handling custom build options for extensions provided in source distributions ----------------------------------------------------------------------------- Sometimes, we need to control how extension modules are built. The build function provides this level of control. It takes a single package specification, downloads a source distribution, and builds it with specified custom build options. The build function takes 3 positional arguments: spec A package specification for a source distribution dest A destination directory build_ext A dictionary of options to be passed to the distutils build_ext command when building extensions. It supports a number of optional keyword arguments: links a sequence of URLs, file names, or directories to look for links to distributions, index The URL of an index server, or almost any other valid URL. :) If not specified, the Python Package Index, http://pypi.python.org/simple/, is used. You can specify an alternate index with this option. If you use the links option and if the links point to the needed distributions, then the index can be anything and will be largely ignored. In the examples, here, we'll just point to an empty directory on our link server. This will make our examples run a little bit faster. executable A path to a Python executable. 
Distributions will be installed using this executable and will be for the matching Python version. path A list of additional directories to search for locally-installed distributions. newest A boolean value indicating whether to search for new distributions when already-installed distributions meet the requirement. When this is true (the default) and the destination directory is not None, then the install function will search for the newest distributions that satisfy the requirements. versions A dictionary mapping project names to version numbers to be used when selecting distributions. This can be used to specify a set of distribution versions independent of other requirements. Our link server includes a source distribution that contains a simple extension, extdemo.c:: #include <Python.h> #include <extdemo.h> static PyMethodDef methods[] = {}; PyMODINIT_FUNC initextdemo(void) { PyObject *m; m = Py_InitModule3("extdemo", methods, ""); #ifdef TWO PyModule_AddObject(m, "val", PyInt_FromLong(2)); #else PyModule_AddObject(m, "val", PyInt_FromLong(EXTDEMO)); #endif } The extension depends on a system-dependent include file, extdemo.h, that defines a constant, EXTDEMO, that is exposed by the extension. We'll add an include directory to our sample buildout and add the needed include file to it: >>> mkdir('include') >>> write('include', 'extdemo.h', ... """ ... #define EXTDEMO 42 ... """) Now, we can use the build function to create an egg from the source distribution: >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/') ['/sample-install/extdemo-1.4-py2.4-unix-i686.egg'] The function returns the list of eggs created. Now if we look in our destination directory, we see we have an extdemo egg: >>> ls(dest) - demo-0.2-py2.4.egg d demo-0.3-py2.4.egg - demoneeded-1.0-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.4-py2.4-unix-i686.egg Let's update our link server with a new version of extdemo: >>> update_extdemo() >>> print get(link_server), bigdemo-0.1-py2.4.egg
demo-0.1-py2.4.egg
demo-0.2-py2.4.egg
demo-0.3-py2.4.egg
demo-0.4c1-py2.4.egg
demoneeded-1.0.zip
demoneeded-1.1.zip
demoneeded-1.2c1.zip
extdemo-1.4.zip
extdemo-1.5.zip
index/
other-1.0-py2.4.egg
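Before we look at how the new distribution is picked up, it may help to see how the ``build`` function tends to be called outside of a doctest, for example from a recipe. The following is only a hypothetical sketch: the helper name, the option values, and the paths are invented for illustration and are not part of zc.buildout::

    import os
    import zc.buildout.easy_install

    def build_extdemo(buildout_dir, eggs_dir, find_links, index_url):
        # distutils build_ext options are passed as a dictionary; here we
        # point include-dirs at a directory shipped inside the buildout.
        build_ext = {
            'include-dirs': os.path.join(buildout_dir, 'include'),
        }
        # spec, dest, and build_ext are positional; the rest are keyword
        # arguments, as in the examples above.
        return zc.buildout.easy_install.build(
            'extdemo', eggs_dir, build_ext,
            links=[find_links], index=index_url,
            newest=False)

As in the doctest above, the return value is the list of eggs that were built.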
The easy_install module caches information about servers to reduce network access. To see the update, we have to call the clear_index_cache function to clear the index cache: >>> zc.buildout.easy_install.clear_index_cache() If we run build with newest set to False, we won't get an update: >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/', ... newest=False) ['/sample-install/extdemo-1.4-py2.4-linux-i686.egg'] >>> ls(dest) - demo-0.2-py2.4.egg d demo-0.3-py2.4.egg - demoneeded-1.0-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.4-py2.4-unix-i686.egg But if we run it with the default True setting for newest, then we'll get an updated egg: >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/') ['/sample-install/extdemo-1.5-py2.4-unix-i686.egg'] >>> ls(dest) - demo-0.2-py2.4.egg d demo-0.3-py2.4.egg - demoneeded-1.0-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.4-py2.4-unix-i686.egg d extdemo-1.5-py2.4-unix-i686.egg The versions option also influences the versions used. For example, if we specify a version for extdemo, then that will be used, even though it isn't the newest. Let's clean out the destination directory first: >>> import os >>> for name in os.listdir(dest): ... remove(dest, name) >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/', ... versions=dict(extdemo='1.4')) ['/sample-install/extdemo-1.4-py2.4-unix-i686.egg'] >>> ls(dest) d extdemo-1.4-py2.4-unix-i686.egg Handling custom build options for extensions in develop eggs ------------------------------------------------------------ The develop function is similar to the build function, except that, rather than building an egg from a downloaded source distribution, it creates a develop egg (an egg link) from a source directory containing a setup.py script. The develop function takes 2 positional arguments: setup The path to a setup script, typically named "setup.py", or a directory containing a setup.py script. dest The directory to install the egg link to. It supports some optional keyword arguments: build_ext A dictionary of options to be passed to the distutils build_ext command when building extensions. executable A path to a Python executable. Distributions will be installed using this executable and will be for the matching Python version. We have a local directory containing the extdemo source: >>> ls(extdemo) - MANIFEST - MANIFEST.in - README - extdemo.c - setup.py Now, we can use the develop function to create a develop egg from the source directory: >>> zc.buildout.easy_install.develop( ... extdemo, dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}) '/sample-install/extdemo.egg-link' The name of the egg link created is returned. Now if we look in our destination directory, we see we have an extdemo egg link: >>> ls(dest) d extdemo-1.4-py2.4-unix-i686.egg - extdemo.egg-link And that the source directory contains the compiled extension: >>> ls(extdemo) - MANIFEST - MANIFEST.in - README d build - extdemo.c d extdemo.egg-info - extdemo.so - setup.py Download cache -------------- Normally, when distributions are installed, if any processing is needed, they are downloaded from the internet to a temporary directory and then installed from there. A download cache can be used to avoid the download step. 
This can be useful to reduce network access and to create source distributions of an entire buildout. A download cache is specified by calling the download_cache function. The function always returns the previous setting. If no argument is passed, then the setting is unchanged. If an argument is passed, the download cache is set to the given path, which must point to an existing directory. Passing None clears the cache setting. To see this work, we'll create a directory and set it as the cache directory: >>> cache = tmpdir('cache') >>> zc.buildout.easy_install.download_cache(cache) We'll recreate our destination directory: >>> remove(dest) >>> dest = tmpdir('sample-install') We'd like to see what is being fetched from the server, so we'll enable server logging: >>> get(link_server+'enable_server_logging') GET 200 /enable_server_logging '' Now, if we install demo, and extdemo: >>> ws = zc.buildout.easy_install.install( ... ['demo==0.2'], dest, ... links=[link_server], index=link_server+'index/', ... always_unzip=True) GET 200 / GET 404 /index/demo/ GET 200 /index/ GET 200 /demo-0.2-py2.4.egg GET 404 /index/demoneeded/ GET 200 /demoneeded-1.1.zip >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/') GET 404 /index/extdemo/ GET 200 /extdemo-1.5.zip ['/sample-install/extdemo-1.5-py2.4-linux-i686.egg'] Not only will we get eggs in our destination directory: >>> ls(dest) d demo-0.2-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.5-py2.4-linux-i686.egg But we'll get distributions in the cache directory: >>> ls(cache) - demo-0.2-py2.4.egg - demoneeded-1.1.zip - extdemo-1.5.zip The cache directory contains uninstalled distributions, such as zipped eggs or source distributions. Let's recreate our destination directory and clear the index cache: >>> remove(dest) >>> dest = tmpdir('sample-install') >>> zc.buildout.easy_install.clear_index_cache() Now when we install the distributions: >>> ws = zc.buildout.easy_install.install( ... ['demo==0.2'], dest, ... links=[link_server], index=link_server+'index/', ... always_unzip=True) GET 200 / GET 404 /index/demo/ GET 200 /index/ GET 404 /index/demoneeded/ >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/') GET 404 /index/extdemo/ ['/sample-install/extdemo-1.5-py2.4-linux-i686.egg'] >>> ls(dest) d demo-0.2-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.5-py2.4-linux-i686.egg Note that we didn't download the distributions from the link server. If we remove the restriction on demo, we'll download a newer version from the link server: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, ... links=[link_server], index=link_server+'index/', ... always_unzip=True) GET 200 /demo-0.3-py2.4.egg Normally, the download cache is the preferred source of downloads, but not the only one. Installing solely from a download cache --------------------------------------- A download cache can be used as the basis of application source releases. In an application source release, we want to distribute an application that can be built without making any network accesses. In this case, we distribute a download cache and tell the easy_install module to install from the download cache only, without making network accesses. 
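In a buildout configuration, the same machinery is normally driven from the ``[buildout]`` section with the ``download-cache`` and ``install-from-cache`` options rather than by calling these functions directly. A minimal sketch (the part name and cache path are only examples)::

    [buildout]
    parts = myapp
    # Save every downloaded distribution here for later reuse...
    download-cache = ${buildout:directory}/downloads
    # ...and, for an application source release, avoid the network
    # entirely and install only from what the cache already holds.
    install-from-cache = true

Setting ``install-from-cache`` to true corresponds to the ``install_from_cache(True)`` call shown next.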
The install_from_cache function can be used to signal that packages should be installed only from the download cache. The function always returns the previous setting. Calling it with no arguments returns the current setting without changing it: >>> zc.buildout.easy_install.install_from_cache() False Calling it with a boolean value changes the setting and returns the previous setting: >>> zc.buildout.easy_install.install_from_cache(True) False Let's remove demo-0.3-py2.4.egg from the cache, clear the index cache, recreate the destination directory, and reinstall demo: >>> for f in os.listdir(cache): ... if f.startswith('demo-0.3-'): ... remove(cache, f) >>> zc.buildout.easy_install.clear_index_cache() >>> remove(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, ... links=[link_server], index=link_server+'index/', ... always_unzip=True) >>> ls(dest) d demo-0.2-py2.4.egg d demoneeded-1.1-py2.4.egg This time, we didn't download from or even query the link server. .. Disable the download cache: >>> zc.buildout.easy_install.download_cache(None) '/cache' >>> zc.buildout.easy_install.install_from_cache(False) True Distribute Support ================== Distribute is a drop-in replacement for Setuptools. zc.buildout is now compatible with Distribute 0.6. To use Distribute in your buildout, you need to use the ``--distribute`` option of the ``bootstrap.py`` script:: $ python bootstrap.py --distribute This will download and install the latest Distribute 0.6 release in the ``eggs`` directory, and use this version for the scripts that are created in ``bin``. Notice that if you have a shared eggs directory, a buildout that uses Distribute will not interfere with other buildouts that are based on Setuptools and that are sharing the same eggs directory. For more information about the Distribute project, see: http://python-distribute.org Change History ************** 1.7.1 (2013-02-21) ================== Fixed: Constraints intended to prevent upgrading to buildout-2-compatible recipes weren't expressed correctly, leading to unintentional use of zc.recipe.egg-2.0.0a3. 1.7.0 (2013-01-11) ================== - Unless version requirements are specified, buildout won't upgrade itself past version 1. - Versions in versions sections can now be simple constraints, like <2.0dev in addition to being simple versions. This is used to prevent upgrading zc.recipe.egg and zc.recipe.testrunner past version 1. - If buildout is bootstrapped with a non-final release, it won't downgrade itself to a final release. 
- Fix: distribute 0.6.33 broke Python 2.4 compatibility - remove `data_files` from `setup.py`, which was installing README.txt in current directory during installation [Domen Kožar] - Windows fix: use cli-64.exe/cli.exe depending on 64/32 bit and try cli.exe if cli-64.exe is not found, fixing 9c6be7ac6d218f09e33725e07dccc4af74d8cf97 - Windows fix: `buildout init` was broken, re.sub does not like a single backslash - fixed all builds on travis-ci [Domen Kožar] - use os._exit insted of sys.exit after ugrade forking [Domen Kožar] - Revert cfa0478937d16769c268bf51e60e69cd3ead50f3, it only broke a feature [Domen Kožar] 1.6.3 (2012-08-22) ================== - Fix Windows regression (see: https://github.com/buildout/buildout/commit/90bc44f9bffd0d9eb09aacf08c6a4c2fed797319 and https://github.com/buildout/buildout/commit/e65b7bfbd7c7ccd556a278016a16b63ae8ef782b) [aclark4life] 1.6.2 (2012-08-21) ================== - Fix Windows regression (see: https://github.com/buildout/buildout/commit/cfa0478937d16769c268bf51e60e69cd3ead50f3) [aclark4life] 1.6.1 (2012-08-18) ================== - `bootstrap.py -d init` would invoke buildout with arguments `init bootstrap` leading into installation of bootstrap package. now bootstrap.py first runs any commands passed, then tries to bootstrap. (Domen Kožar) - fix Python 2.4 support (Domen Kožar) - added travis-ci testing (Domen Kožar) 1.6.0 (2012-08-15) ================== - The buildout init command now accepts distribution requirements and paths to set up a custom interpreter part that has the distributions or parts in the path. For example:: python bootstrap.py init BeautifulSoup - Introduce a cache for the expensive `buildout._dir_hash` function. - Remove duplicate path from script's sys.path setup. - changed broken dash S check to pass the configuration options -S -c separately, to make zc.buildout more compatible with the PyPy interpreter, which has less flexible argument parsing than CPython. Note that PyPy post 1.4.0 is needed to make buildout work at all, due to missing support for the ``-E`` option, which only got added afterwards. - Made sure to download extended configuration files only once per buildout run even if they are referenced multiple times (patch by Rafael Monnerat). - Ported speedup optimization patch by Ross Patterson to 1.5.x series. Improved patch to calculate required_by packages in linear time in verbose mode (-v). Running relatively simple Buildout envornment yielded in running time improvement from 30 seconds to 10 seconds. (Domen Kožar, Ross Patterson) - Removed unnecessary pyc recompilation with optimization flags. Running Buildout with pre-downloaded ~300 packages that were installed in empty eggs repository yielded in running time improvement from 1126 seconds to 348 seconds. (Domen Kožar) Bugs fixed: - In the download module, fixed the handling of directories that are pointed to by file-system paths and ``file:`` URLs. - Removed any traces of the implementation of ``extended-by``. Raise a UserError if the option is encountered instead of ignoring it, though. - https://bugs.launchpad.net/bugs/697913 : Buildout doesn't honor exit code from scripts. Fixed. - Handle both addition and subtraction of elements (+= and -=) on the same key in the same section. 1.5.2 (2010-10-11) ================== - changed metadata 'url' to pypi.python.org in order to solve a temporary outage of buildout.org - IMPORTANT: For better backwards compatibility with the pre-1.5 line, this release has two big changes from 1.5.0 and 1.5.1. 
- Buildout defaults to including site packages. - Buildout loads recipes and extensions with the same constraints to site-packages that it builds eggs, instead of never allowing access to site-packages. This means that the default configuration should better support pre-existing use of system Python in recipes or builds. - To make it easier to detect the fact that buildout has set the PYTHONPATH, BUILDOUT_ORIGINAL_PYTHONPATH is always set in the environment, even if PYTHONPATH was not originally set. BUILDOUT_ORIGINAL_PYTHONPATH will be an empty string if PYTHONPATH was not set. 1.5.1 (2010-08-29) ================== New features: - Scripts store the old PYTHONPATH in BUILDOUT_ORIGINAL_PYTHONPATH if it existed, and store nothing in the value if it did not exist. This allows code that does not want subprocesses to have the system-Python-protected site.py to set the environment of the subprocess as it was originally. Bugs fixed: - https://bugs.launchpad.net/bugs/623590 : If include-site-packages were true and versions were not set explicitly, system eggs were preferred over newer released eggs. Fixed. 1.5.0 (2010-08-23) ================== New features: - zc.buildout supports Python 2.7. - By default, Buildout and the bootstrap script now prefer final versions of Buildout, recipes, and extensions. This can be changed by using the --accept-buildout-test-releases flag (or -t for short) when calling bootstrap. This will hopefully allow beta releases of these items to be more easily and safely made in the future. NOTE: dependencies of your own software are not affected by this new behavior. Buildout continues to choose the newest available versions of your dependencies regardless of whether they are final releases. To prevent this, use the pre-existing switch ``prefer-final = true`` in the [buildout] section of your configuration file (see http://pypi.python.org/pypi/zc.buildout#preferring-final-releases) or pin your versions using a versions section (see http://pypi.python.org/pypi/zc.buildout#repeatable-buildouts-controlling-eggs-used). Bugs fixed: - You can now again use virtualenv with Buildout. The new features to let buildout be used with a system Python are disabled in this configuration, and the previous script generation behavior (1.4.3) is used, even if the new function ``zc.buildout.easy_install.sitepackage_safe_scripts`` is used. 1.5.0b2 (2010-04-29) ==================== This was a re-release of 1.4.3 in order to keep 1.5.0b1 release from hurting workflows that combined virtualenv with zc.buildout. 1.5.0b1 (2010-04-29) ==================== New Features: - Added buildout:socket-timout option so that socket timeout can be configured both from command line and from config files. (gotcha) - Buildout can be safely used with a system Python (or any Python with code in site-packages), as long as you use (1) A fresh checkout, (2) the new bootstrap.py, and (3) recipes that use the new ``zc.buildout.easy_install.sitepackage_safe_scripts`` function to generate scripts and interpreters. Many recipes will need to be updated to use this new function. The scripts and interpreters generated by ``zc.recipe.egg`` will continue to use the older function, not safe with system Pythons. Use the ``z3c.recipe.scripts`` as a replacement. zc.recipe.egg is still a fully supported, and simpler, way of generating scripts and interpreters if you are using a "clean" Python, without code installed in site-packages. It keeps its previous behavior in order to provide backwards compatibility. 
The z3c.recipe.scripts recipe allows you to control how you use the code in site-packages. You can exclude it entirely (preferred); allow eggs in it to fulfill package dependencies declared in setup.py and buildout configuration; allow it to be available but not used to fulfill dependencies declared in setup.py or buildout configuration; or only allow certain eggs in site-packages to fulfill dependencies. - Added new function, ``zc.buildout.easy_install.sitepackage_safe_scripts``, to generate scripts and interpreter. It produces a full-featured interpreter (all command-line options supported) and the ability to safely let scripts include site packages, such as with a system Python. The ``z3c.recipe.scripts`` recipe uses this new function. - Improve bootstrap. * New options let you specify where to find ez_setup.py and where to find a download cache. These options can keep bootstrap from going over the network. * Another new option lets you specify where to put generated eggs. * The buildout script generated by bootstrap honors more of the settings in the designated configuration file (e.g., buildout.cfg). * Correctly handle systems where pkg_resources is present but the rest of setuptools is missing (like Ubuntu installs). https://bugs.launchpad.net/zc.buildout/+bug/410528 - You can develop zc.buildout using Distribute instead of Setuptools. Use the --distribute option on the dev.py script. (Releases should be tested with both Distribute and Setuptools.) The tests for zc.buildout pass with Setuptools and Python 2.4, 2.5, 2.6, and 2.7; and with Distribute and Python 2.5, 2.6, and 2.7. Using zc.buildout with Distribute and Python 2.4 is not recommended. - The ``distribute-version`` now works in the [buildout] section, mirroring the ``setuptools-version`` option (this is for consistency; using the general-purpose ``versions`` option is preferred). Bugs fixed: - Using Distribute with the ``allow-picked-versions = false`` buildout option no longer causes an error. - The handling and documenting of default buildout options was normalized. This means, among other things, that ``bin/buildout -vv`` and ``bin/buildout annotate`` correctly list more of the options. - Installing a namespace package using a Python that already has a package in the same namespace (e.g., in the Python's site-packages) failed in some cases. It is now handled correctly. - Another variation of this error showed itself when at least two dependencies were in a shared location like site-packages, and the first one met the "versions" setting. The first dependency would be added, but subsequent dependencies from the same location (e.g., site-packages) would use the version of the package found in the shared location, ignoring the version setting. This is also now handled correctly. 1.4.3 (2009-12-10) ================== Bugs fixed: - Using pre-detected setuptools version for easy_installing tgz files. This prevents a recursion error when easy_installing an upgraded "distribute" tgz. Note that setuptools did not have this recursion problem solely because it was packaged as an ``.egg``, which does not have to go through the easy_install step. 1.4.2 (2009-11-01) ================== New Feature: - Added a --distribute option to the bootstrap script, in order to use Distribute rather than Setuptools. By default, Setuptools is used. Bugs fixed: - While checking for new versions of setuptools and buildout itself, compare requirement locations instead of requirement objects. 
- Incrementing didn't work properly when extending multiple files. https://bugs.launchpad.net/zc.buildout/+bug/421022 - The download API computed MD5 checksums of text files wrong on Windows. 1.4.1 (2009-08-27) ================== New Feature: - Added a debug built-in recipe to make writing some tests easier. Bugs fixed: - (introduced in 1.4.0) option incrementing (-=) and decrementing (-=) didn't work in the buildout section. https://bugs.launchpad.net/zc.buildout/+bug/420463 - Option incrementing and decrementing didn't work for options specified on the command line. - Scripts generated with relative-paths enabled couldn't be symbolically linked to other locations and still work. - Scripts run using generated interpreters didn't have __file__ set correctly. - The standard Python -m option didn't work for custom interpreters. 1.4.0 (2009-08-26) ================== - When doing variable substitutions, you can omit the section name to refer to a variable in the same section (e.g. ${:foo}). - When doing variable substitution, you can use the special option, ``_buildout_section_name_`` to get the section name. This is most handy for getting the current section name (e.g. ${:_buildout_section_name_}). - A new special option, ``<`` allows sections to be used as macros. - Added annotate command for annotated sections. Displays sections key-value pairs along with the value origin. - Added a download API that handles the download cache, offline mode etc and is meant to be reused by recipes. - Used the download API to allow caching of base configurations (specified by the buildout section's 'extends' option). 1.3.1 (2009-08-12) ================== - Bug fixed: extras were ignored in some cases when versions were specified. 1.3.0 (2009-06-22) ================== - Better Windows compatibility in test infrastructure. - Now the bootstrap.py has an optional --version argument, that can be used to force zc.buildout version to use. - ``zc.buildout.testing.buildoutSetUp`` installs a new handler in the python root logging facility. This handler is now removed during tear down as it might disturb other packages reusing buildout's testing infrastructure. - fixed usage of 'relative_paths' keyword parameter on Windows - Added an unload entry point for extensions. - Fixed bug: when the relative paths option was used, relative paths could be inserted into sys.path if a relative path was used to run the generated script. 1.2.1 (2009-03-18) ================== - Refactored generation of relative egg paths to generate simpler code. 1.2.0 (2009-03-17) ================== - Added a relative_paths option to zc.buildout.easy_install.script to generate egg paths relative to the script they're used in. 1.1.2 (2009-03-16) ================== - Added Python 2.6 support. Removed Python 2.3 support. - Fixed remaining deprecation warnings under Python 2.6, both when running our tests and when using the package. - Switched from using os.popen* to subprocess.Popen, to avoid a deprecation warning in Python 2.6. See: http://docs.python.org/library/subprocess.html#replacing-os-popen-os-popen2-os-popen3 - Made sure the 'redo_pyc' function and the doctest checkers work with Python executable paths containing spaces. - Expand shell patterns when processing the list of paths in `develop`, e.g:: [buildout] develop = ./local-checkouts/* - Conditionally import and use hashlib.md5 when it's available instead of md5 module, which is deprecated in Python 2.6. 
- Added Jython support for bootstrap, development bootstrap and zc.buildout support on Jython - Fixed a bug that would cause buildout to break while computing a directory hash if it found a broken symlink (Launchpad #250573) 1.1.1 (2008-07-28) ================== - Fixed a bug that caused buildouts to fail when variable substitutions are used to name standard directories, as in:: [buildout] eggs-directory = ${buildout:directory}/develop-eggs 1.1.0 (2008-07-19) ================== - Added a buildout-level unzip option tp change the default policy for unzipping zip-safe eggs. - Tracebacks are now printed for internal errors (as opposed to user errors) even without the -D option. - pyc and pyo files are regenerated for installed eggs so that the stored path in code objects matches the the install location. 1.0.6 (2008-06-13) ================== - Manually reverted the changeset for the fix for https://bugs.launchpad.net/zc.buildout/+bug/239212 to verify thet the test actually fails with the changeset: http://svn.zope.org/zc.buildout/trunk/src/zc/buildout/buildout.py?rev=87309&r1=87277&r2=87309 Thanks tarek for pointing this out. (seletz) - fixed the test for the += -= syntax in buildout.txt as the test was actually wronng. The original implementation did a split/join on whitespace, and later on that was corrected to respect the original EOL setting, the test was not updated, though. (seletz) - added a test to verify against https://bugs.launchpad.net/zc.buildout/+bug/239212 in allowhosts.txt (seletz) - further fixes for """AttributeError: Buildout instance has no attribute '_logger'""" by providing reasonable defaults within the Buildout constructor (related to the new 'allow-hosts' option) (patch by Gottfried Ganssauge) (ajung) 1.0.5 (2008-06-10) ================== - Fixed wrong split when using the += and -= syntax (mustapha) 1.0.4 (2008-06-10) ================== - Added the `allow-hosts` option (tarek) - Quote the 'executable' argument when trying to detect the python version using popen4. (sidnei) - Quote the 'spec' argument, as in the case of installing an egg from the buildout-cache, if the filename contains spaces it would fail (sidnei) - Extended configuration syntax to allow -= and += operators (malthe, mustapha). 1.0.3 (2008-06-01) ================== - fix for """AttributeError: Buildout instance has no attribute '_logger'""" by providing reasonable defaults within the Buildout constructor. (patch by Gottfried Ganssauge) (ajung) 1.0.2 (2008-05-13) ================== - More fixes for Windows. A quoted sh-bang is now used on Windows to make the .exe files work with a Python executable in 'program files'. - Added "-t " option for specifying the socket timeout. (ajung) 1.0.1 (2008-04-02) ================== - Made easy_install.py's _get_version accept non-final releases of Python, like 2.4.4c0. (hannosch) - Applied various patches for Windows (patch by Gottfried Ganssauge). (ajung) - Applied patch fixing rmtree issues on Windows (patch by Gottfried Ganssauge). (ajung) 1.0.0 (2008-01-13) ================== - Added a French translation of the buildout tutorial. 1.0.0b31 (2007-11-01) ===================== Feature Changes --------------- - Added a configuration option that allows buildouts to ignore dependency_links metadata specified in setup. By default dependency_links in setup are used in addition to buildout specified find-links. This can make it hard to control where eggs come from. 
Here's how to tell buildout to ignore URLs in dependency_links:: [buildout] use-dependency-links = false By default use-dependency-links is true, which matches the behavior of previous versions of buildout. - Added a configuration option that causes buildout to error if a version is picked. This is a nice safety belt when fixing all versions is intended, especially when creating releases. Bugs Fixed ---------- - 151820: Develop failed if the setup.py script imported modules in the distribution directory. - Verbose logging of the develop command was omitting detailed output. - The setup command wasn't documented. - The setup command failed if run in a directory without specifying a configuration file. - The setup command raised a stupid exception if run without arguments. - When using a local find links or index, distributions weren't copied to the download cache. - When installing from source releases, a version specification (via a buildout versions section) for setuptools was ignored when deciding which setuptools to use to build an egg from the source release. 1.0.0b30 (2007-08-20) ===================== Feature Changes --------------- - Changed the default policy back to what it was to avoid breakage in existing buildouts. Use:: [buildout] prefer-final = true to get the new policy. The new policy will go into effect in buildout 2. 1.0.0b29 (2007-08-20) ===================== Feature Changes --------------- - Now, final distributions are prefered over non-final versions. If both final and non-final versions satisfy a requirement, then the final version will be used even if it is older. The normal way to override this for specific packages is to specifically require a non-final version, either specifically or via a lower bound. - There is a buildout prefer-final version that can be used with a value of "false":: prefer-final = false To prefer newer versions, regardless of whether or not they are final, buildout-wide. - The new simple Python index, http://cheeseshop.python.org/simple, is used as the default index. This will provide better performance than the human package index interface, http://pypi.python.org/pypi. More importantly, it lists hidden distributions, so buildouts with fixed distribution versions will be able to find old distributions even if the distributions have been hidden in the human PyPI interface. Bugs Fixed ---------- - 126441: Look for default.cfg in the right place on Windows. 1.0.0b28 (2007-07-05) ===================== Bugs Fixed ---------- - When requiring a specific version, buildout looked for new versions even if that single version was already installed. 1.0.0b27 (2007-06-20) ===================== Bugs Fixed ---------- - Scripts were generated incorrectly on Windows. This included the buildout script itself, making buildout completely unusable. 1.0.0b26 (2007-06-19) ===================== Feature Changes --------------- - Thanks to recent fixes in setuptools, I was able to change buildout to use find-link and index information when searching extensions. Sadly, this work, especially the timing, was motivated my the need to use alternate indexes due to performance problems in the cheese shop (http://www.python.org/pypi/). I really home we can address these performance problems soon. 1.0.0b25 (2007-05-31) ===================== Feature Changes --------------- - buildout now changes to the buildout directory before running recipe install and update methods. - Added a new init command for creating a new buildout. 
This creates an empty configuration file and then bootstraps. - Except when using the new init command, it is now an error to run buildout without a configuration file. - In verbose mode, when adding distributions to fulful requirements of already-added distributions, we now show why the new distributions are being added. - Changed the logging format to exclude the logger name for the zc.buildout logger. This reduces noise in the output. - Clean up lots of messages, adding missing periods and adding quotes around requirement strings and file paths. Bugs Fixed ---------- - 114614: Buildouts could take a very long time if there were dependency problems in large sets of pathologically interdependent packages. - 59270: Buggy recipes can cause failures in later recipes via chdir - 61890: file:// urls don't seem to work in find-links setuptools requires that file urls that point to directories must end in a "/". Added a workaround. - 75607: buildout should not run if it creates an empty buildout.cfg 1.0.0b24 (2007-05-09) ===================== Feature Changes --------------- - Improved error reporting by showing which packages require other packages that can't be found or that cause version conflicts. - Added an API for use by recipe writers to clean up created files when recipe errors occur. - Log installed scripts. Bugs Fixed ---------- - 92891: bootstrap crashes with recipe option in buildout section. - 113085: Buildout exited with a zero exist status when internal errors occurred. 1.0.0b23 (2007-03-19) ===================== Feature Changes --------------- - Added support for download caches. A buildout can specify a cache for distribution downloads. The cache can be shared among buildouts to reduce network access and to support creating source distributions for applications allowing install without network access. - Log scripts created, as suggested in: https://bugs.launchpad.net/zc.buildout/+bug/71353 Bugs Fixed ---------- - It wasn't possible to give options on the command line for sections not defined in a configuration file. 1.0.0b22 (2007-03-15) ===================== Feature Changes --------------- - Improved error reporting and debugging support: - Added "logical tracebacks" that show functionally what the buildout was doing when an error occurs. Don't show a Python traceback unless the -D option is used. - Added a -D option that causes the buildout to print a traceback and start the pdb post-mortem debugger when an error occurs. - Warnings are printed for unused options in the buildout section and installed-part sections. This should make it easier to catch option misspellings. - Changed the way the installed database (.installed.cfg) is handled to avoid database corruption when a user breaks out of a buildout with control-c. - Don't save an installed database if there are no installed parts or develop egg links. 1.0.0b21 (2007-03-06) ===================== Feature Changes --------------- - Added support for repeatable buildouts by allowing egg versions to be specified in a versions section. - The easy_install module install and build functions now accept a versions argument that supplied to mapping from project name to version numbers. This can be used to fix version numbers for required distributions and their depenencies. When a version isn't fixed, using either a versions option or using a fixed version number in a requirement, then a debug log message is emitted indicating the version picked. This is useful for setting versions options. 
A default_versions function can be used to set a default value for this option. - Adjusted the output for verbosity levels. Using a single -v option no longer causes voluminous setuptools output. Uisng -vv and -vvv now triggers extra setuptools output. - Added a remove testing helper function that removes files or directories. 1.0.0b20 (2007-02-08) ===================== Feature Changes --------------- - Added a buildout newest option, to control whether the newest distributions should be sought to meet requirements. This might also provide a hint to recipes that don't deal with distributions. For example, a recipe that manages subversion checkouts might not update a checkout if newest is set to "false". - Added a *newest* keyword parameter to the zc.buildout.easy_install.install and zc.buildout.easy_install.build functions to control whether the newest distributions that meed given requirements should be sought. If a false value is provided for this parameter and already installed eggs meet the given requirements, then no attempt will be made to search for newer distributions. - The recipe-testing support setUp function now adds the name *buildout* to the test namespace with a value that is the path to the buildout script in the sample buildout. This allows tests to use >>> print system(buildout), rather than: >>> print system(join('bin', 'buildout')), Bugs Fixed ---------- - Paths returned from update methods replaced lists of installed files rather than augmenting them. 1.0.0b19 (2007-01-24) ===================== Bugs Fixed ---------- - Explicitly specifying a Python executable failed if the output of running Python with the -V option included a 2-digit (rather than a 3-digit) version number. 1.0.0b18 (2007-01-22) ===================== Feature Changes --------------- - Added documentation for some previously undocumented features of the easy_install APIs. - By popular demand, added a -o command-line option that is a short hand for the assignment buildout:offline=true. Bugs Fixed ---------- - When deciding whether recipe develop eggs had changed, buildout incorrectly considered files in .svn and CVS directories. 1.0.0b17 (2006-12-07) ===================== Feature Changes --------------- - Configuration files can now be loaded from URLs. Bugs Fixed ---------- - https://bugs.launchpad.net/products/zc.buildout/+bug/71246 Buildout extensions installed as eggs couldn't be loaded in offline mode. 1.0.0b16 (2006-12-07) ===================== Feature Changes --------------- - A new command-line argument, -U, suppresses reading user defaults. - You can now suppress use of an installed-part database (e.g. .installed.cfg) by sprifying an empty value for the buildout installed option. Bugs Fixed ---------- - When the install command is used with a list of parts, only those parts are supposed to be installed, but the buildout was also building parts that those parts depended on. 1.0.0b15 (2006-12-06) ===================== Bugs Fixed ---------- - Uninstall recipes weren't loaded correctly in cases where no parts in the (new) configuration used the recipe egg. 1.0.0b14 (2006-12-05) ===================== Feature Changes --------------- - Added uninstall recipes for dealing with complex uninstallation scenarios. Bugs Fixed ---------- - Automatic upgrades weren't performed on Windows due to a bug that caused buildout to incorrectly determine that it wasn't running locally in a buildout. - Fixed some spurious test failures on Windows. 
1.0.0b13 (2006-12-04) ===================== Feature Changes --------------- - Variable substitutions now reflect option data written by recipes. - A part referenced by a part in a parts list is now added to the parts list before the referencing part. This means that you can omit parts from the parts list if they are referenced by other parts. - Added a develop function to the easy_install module to aid in creating develop eggs with custom build_ext options. - The build and develop functions in the easy_install module now return the path of the egg or egg link created. - Removed the limitation that parts named in the install command can only name configured parts. - Removed support ConfigParser-style variable substitutions (e.g. %(foo)s). Only the string-template style of variable (e.g. ${section:option}) substitutions will be supported. Supporting both violates "there's only one way to do it". - Deprecated the buildout-section extendedBy option. Bugs Fixed ---------- - We treat setuptools as a dependency of any distribution that (declares that it) uses namespace packages, whether it declares setuptools as a dependency or not. This wasn't working for eggs intalled by virtue of being dependencies. 1.0.0b12 (2006-10-24) ===================== Feature Changes --------------- - Added an initialization argument to the zc.buildout.easy_install.scripts function to include initialization code in generated scripts. 1.0.0b11 (2006-10-24) ===================== Bugs Fixed ---------- `67737 `_ Verbose and quite output options caused errors when the develop buildout option was used to create develop eggs. `67871 `_ Installation failed if the source was a (local) unzipped egg. `67873 `_ There was an error in producing an error message when part names passed to the install command weren't included in the configuration. 1.0.0b10 (2006-10-16) ===================== Feature Changes --------------- - Renamed the runsetup command to setup. (The old name still works.) - Added a recipe update method. Now install is only called when a part is installed for the first time, or after an uninstall. Otherwise, update is called. For backward compatibility, recipes that don't define update methiods are still supported. - If a distribution defines namespace packages but fails to declare setuptools as one of its dependencies, we now treat setuptools as an implicit dependency. We generate a warning if the distribution is a develop egg. - You can now create develop eggs for setup scripts that don't use setuptools. Bugs Fixed ---------- - Egg links weren't removed when corresponding entries were removed from develop sections. - Running a non-local buildout command (one not installed in the buildout) ket to a hang if there were new versions of zc.buildout or setuptools were available. Now we issue a warning and don't upgrade. - When installing zip-safe eggs from local directories, the eggs were moved, rather than copied, removing them from the source directory. 1.0.0b9 (2006-10-02) ==================== Bugs Fixed ---------- Non-zip-safe eggs were not unzipped when they were installed. 1.0.0b8 (2006-10-01) ==================== Bugs Fixed ---------- - Installing source distributions failed when using alternate Python versions (depending on the versions of Python used.) - Installing eggs wasn't handled as efficiently as possible due to a bug in egg URL parsing. - Fixed a bug in runsetup that caused setup scripts that introspected __file__ to fail. 1.0.0b7 ======= Added a documented testing framework for use by recipes. 
Refactored the buildout tests to use it. Added a runsetup command run a setup script. This is handy if, like me, you don't install setuptools in your system Python. 1.0.0b6 ======= Fixed https://launchpad.net/products/zc.buildout/+bug/60582 Use of extension options caused bootstrapping to fail if the eggs directory didn't already exist. We no longer use extensions for bootstrapping. There really isn't any reason to anyway. 1.0.0b5 ======= Refactored to do more work in buildout and less work in easy_install. This makes things go a little faster, makes errors a little easier to handle, and allows extensions (like the sftp extension) to influence more of the process. This was done to fix a problem in using the sftp support. 1.0.0b4 ======= - Added an **experimental** extensions mechanism, mainly to support adding sftp support to buildouts that need it. - Fixed buildout self-updating on Windows. 1.0.0b3 ======= - Added a help option (-h, --help) - Increased the default level of verbosity. - Buildouts now automatically update themselves to new versions of zc.buildout and setuptools. - Added Windows support. - Added a recipe API for generating user errors. - No-longer generate a py_zc.buildout script. - Fixed some bugs in variable substitutions. The characters "-", "." and " ", weren't allowed in section or option names. Substitutions with invalid names were ignored, which caused missleading failures downstream. - Improved error handling. No longer show tracebacks for user errors. - Now require a recipe option (and therefore a section) for every part. - Expanded the easy_install module API to: - Allow extra paths to be provided - Specify explicit entry points - Specify entry-point arguments 1.0.0b2 ======= Added support for specifying some build_ext options when installing eggs from source distributions. 1.0.0b1 ======= - Changed the bootstrapping code to only install setuptools and zc.buildout. The bootstrap code no-longer runs the buildout itself. This was to fix a bug that caused parts to be recreated unnecessarily because the recipe signature in the initial buildout reflected temporary locations for setuptools and zc.buildout. - Now create a minimal setup.py if it doesn't exist and issue a warning that it is being created. - Fixed bug in saving installed configuration data. %'s and extra spaces weren't quoted. 1.0.0a1 ======= Initial public version Download ********************** Keywords: development build Platform: UNKNOWN Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: Zope Public License Classifier: Topic :: Software Development :: Build Tools Classifier: Topic :: Software Development :: Libraries :: Python Modules zc.buildout-1.7.1/README.txt0000644000076500007650000001523212111414155015050 0ustar jimjim00000000000000******** Buildout ******** .. contents:: The Buildout project provides support for creating applications, especially Python applications. It provides tools for assembling applications from multiple parts, Python or otherwise. An application may actually contain multiple programs, processes, and configuration settings. The word "buildout" refers to a description of a set of parts and the software to create and assemble them. It is often used informally to refer to an installed system based on a buildout definition. For example, if we are creating an application named "Foo", then "the Foo buildout" is the collection of configuration and application-specific software that allows an instance of the application to be created. 
We may refer to such an instance of the application informally as "a Foo buildout". To get a feel for some of the things you might use buildouts for, see the `Buildout examples`_. To lean more about using buildouts, see `Detailed Documentation`_. To see screencasts, talks, useful links and more documentation, visit the `Buildout website `_. Recipes ******* Existing recipes include: `zc.recipe.egg `_ The egg recipe installes one or more eggs, with their dependencies. It installs their console-script entry points with the needed eggs included in their paths. It is suitable for use with a "clean" Python: one without packages installed in site-packages. `z3c.recipe.scripts `_ Like zc.recipe.egg, this recipe builds interpreter scripts and entry point scripts based on eggs. It can be used with a Python that has packages installed in site-packages, such as a system Python. The interpreter also has more features than the one offered by zc.recipe.egg. `zc.recipe.testrunner `_ The testrunner egg creates a test runner script for one or more eggs. `zc.recipe.zope3checkout `_ The zope3checkout recipe installs a Zope 3 checkout into a buildout. `zc.recipe.zope3instance `_ The zope3instance recipe sets up a Zope 3 instance. `zc.recipe.filestorage `_ The filestorage recipe sets up a ZODB file storage for use in a Zope 3 instance created by the zope3instance recipe. Buildout examples ***************** Here are a few examples of what you can do with buildouts. We'll present these as a set of use cases. Try out an egg ============== Sometimes you want to try an egg (or eggs) that someone has released. You'd like to get a Python interpreter that lets you try things interactively or run sample scripts without having to do path manipulations. If you can and don't mind modifying your Python installation, you could use easy_install, otherwise, you could create a directory somewhere and create a buildout.cfg file in that directory containing:: [buildout] parts = mypython [mypython] recipe = zc.recipe.egg interpreter = mypython eggs = theegg where theegg is the name of the egg you want to try out. Run buildout in this directory. It will create a bin subdirectory that includes a mypython script. If you run mypython without any arguments you'll get an interactive interpreter with the egg in the path. If you run it with a script and script arguments, the script will run with the egg in its path. Of course, you can specify as many eggs as you want in the eggs option. If the egg provides any scripts (console_scripts entry points), those will be installed in your bin directory too. Work on a package ================= I often work on packages that are managed separately. They don't have scripts to be installed, but I want to be able to run their tests using the `zope.testing test runner `_. In this kind of application, the program to be installed is the test runner. A good example of this is `zc.ngi `_. Here I have a subversion project for the zc.ngi package. The software is in the src directory. The configuration file is very simple:: [buildout] develop = . parts = test [test] recipe = zc.recipe.testrunner eggs = zc.ngi I use the develop option to create a develop egg based on the current directory. I request a test script named "test" using the zc.recipe.testrunner recipe. In the section for the test script, I specify that I want to run the tests in the zc.ngi package. When I check out this project into a new sandbox, I run bootstrap.py to get setuptools and zc.buildout and to create bin/buildout. 
I run bin/buildout, which installs the test script, bin/test, which I can then use to run the tests. This is probably the most common type of buildout. If I need to run a previous version of zc.buildout, I use the `--version` option of the bootstrap.py script:: $ python bootstrap.py --version 1.1.3 The `zc.buildout project `_ is a slightly more complex example of this type of buildout. Install egg-based scripts ========================= A variation of the `Try out an egg`_ use case is to install scripts into your ~/bin directory (on Unix, of course). My ~/bin directory is a buildout with a configuration file that looks like:: [buildout] parts = foo bar bin-directory = . [foo] ... where foo and bar are packages with scripts that I want available. As I need new scripts, I can add additional sections. The bin-directory option specifies that scripts should be installed into the current directory. Multi-program multi-machine systems =================================== Using an older prototype version of the buildout, we've built a number of systems involving multiple programs, databases, and machines. One typical example consists of: - Multiple Zope instances - Multiple ZEO servers - An LDAP server - Cache-invalidation and Mail delivery servers - Dozens of add-on packages - Multiple test runners - Multiple deployment modes, including dev, stage, and prod, with prod deployment over multiple servers Parts installed include: - Application software installs, including Zope, ZEO and LDAP software - Add-on packages - Bundles of configuration that define Zope, ZEO and LDAP instances - Utility scripts such as test runners, server-control scripts, and cron jobs. Questions and Bug Reporting *************************** Please send questions and comments to the `distutils SIG mailing list `_. Report bugs using the `zc.buildout Launchpad Bug Tracker `_. zc.buildout-1.7.1/setup.cfg0000644000076500007650000000007312111415075015172 0ustar jimjim00000000000000[egg_info] tag_build = tag_date = 0 tag_svn_revision = 0 zc.buildout-1.7.1/setup.py0000644000076500007650000000566612111414421015072 0ustar jimjim00000000000000############################################################################## # # Copyright (c) 2006-2009 Zope Foundation and Contributors. # All Rights Reserved. # # This software is subject to the provisions of the Zope Public License, # Version 2.1 (ZPL). A copy of the ZPL should accompany this distribution. # THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED # WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED # WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS # FOR A PARTICULAR PURPOSE.
# ############################################################################## name = "zc.buildout" version = "1.7.1" import os from setuptools import setup def read(*rnames): return open(os.path.join(os.path.dirname(__file__), *rnames)).read() long_description=( read('README.txt') + '\n' + read('SYSTEM_PYTHON_HELP.txt') + '\n' + 'Detailed Documentation\n' '**********************\n' + '\n' + read('src', 'zc', 'buildout', 'buildout.txt') + '\n' + read('src', 'zc', 'buildout', 'unzip.txt') + '\n' + read('src', 'zc', 'buildout', 'repeatable.txt') + '\n' + read('src', 'zc', 'buildout', 'download.txt') + '\n' + read('src', 'zc', 'buildout', 'downloadcache.txt') + '\n' + read('src', 'zc', 'buildout', 'extends-cache.txt') + '\n' + read('src', 'zc', 'buildout', 'setup.txt') + '\n' + read('src', 'zc', 'buildout', 'update.txt') + '\n' + read('src', 'zc', 'buildout', 'debugging.txt') + '\n' + read('src', 'zc', 'buildout', 'testing.txt') + '\n' + read('src', 'zc', 'buildout', 'easy_install.txt') + '\n' + read('src', 'zc', 'buildout', 'distribute.txt') + '\n' + read('CHANGES.txt') + '\n' + 'Download\n' '**********************\n' ) entry_points = """ [console_scripts] buildout = %(name)s.buildout:main [zc.buildout] debug = %(name)s.testrecipes:Debug """ % dict(name=name) setup( name = name, version = version, author = "Jim Fulton", author_email = "jim@zope.com", description = "System for managing development buildouts", long_description=long_description, license = "ZPL 2.1", keywords = "development build", url='http://pypi.python.org/pypi/zc.buildout', packages = ['zc', 'zc.buildout'], package_dir = {'': 'src'}, namespace_packages = ['zc'], install_requires = 'setuptools', include_package_data = True, entry_points = entry_points, extras_require = dict(test=['zope.testing']), zip_safe=False, classifiers = [ 'Intended Audience :: Developers', 'License :: OSI Approved :: Zope Public License', 'Topic :: Software Development :: Build Tools', 'Topic :: Software Development :: Libraries :: Python Modules', ], ) zc.buildout-1.7.1/src/0000755000076500007650000000000012111415075014140 5ustar jimjim00000000000000zc.buildout-1.7.1/src/zc/0000755000076500007650000000000012111415075014554 5ustar jimjim00000000000000zc.buildout-1.7.1/src/zc/__init__.py0000644000076500007650000000014612111414141016657 0ustar jimjim00000000000000try: __import__('pkg_resources').declare_namespace(__name__) except: # bootstrapping pass zc.buildout-1.7.1/src/zc/buildout/0000755000076500007650000000000012111415075016403 5ustar jimjim00000000000000zc.buildout-1.7.1/src/zc/buildout/__init__.py0000644000076500007650000000143512111414155020515 0ustar jimjim00000000000000############################################################################## # # Copyright (c) 2006 Zope Foundation and Contributors. # All Rights Reserved. # # This software is subject to the provisions of the Zope Public License, # Version 2.1 (ZPL). A copy of the ZPL should accompany this distribution. # THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED # WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED # WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS # FOR A PARTICULAR PURPOSE. 
# ############################################################################## """Buildout package $Id$ """ class UserError(Exception): """Errors made by a user """ def __str__(self): return " ".join(map(str, self)) zc.buildout-1.7.1/src/zc/buildout/allowhosts.txt0000644000076500007650000000752212111414155021347 0ustar jimjim00000000000000Allow hosts ----------- On some environments the links visited by `zc.buildout` can be forbidden by paranoiac firewalls. These URL might be on the chain of links visited by `zc.buildout` whether they are defined in the `find-links` option or by various eggs in their `url`, `download_url` and `dependency_links` metadata. It is even harder to track that package_index works like a spider and might visit links and go to other location. The `allow-hosts` option provides a way to prevent this, and works exactly like the one provided in `easy_install` (see `easy_install allow-hosts option`_). You can provide a list of allowed host, together with wildcards:: [buildout] ... allow-hosts = *.python.org example.com Let's create a develop egg in our buildout that specifies `dependency_links` which points to a server in the outside world:: >>> mkdir(sample_buildout, 'allowdemo') >>> write(sample_buildout, 'allowdemo', 'dependencydemo.py', ... 'import eggrecipekss.core') >>> write(sample_buildout, 'allowdemo', 'setup.py', ... '''from setuptools import setup; setup( ... name='allowdemo', py_modules=['dependencydemo'], ... install_requires = 'kss.core', ... dependency_links = ['http://dist.plone.org'], ... zip_safe=True, version='1') ... ''') Now let's configure the buildout to use the develop egg, together with some rules that disallow any website but PyPI and local files:: >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... develop = allowdemo ... parts = eggs ... allow-hosts = ... pypi.python.org ... ... [eggs] ... recipe = zc.recipe.egg:eggs ... eggs = allowdemo ... ''') Now we can run the buildout and make sure all attempts to dist.plone.org fails:: >>> print system(buildout) # doctest: +ELLIPSIS Develop: '/sample-buildout/allowdemo' ... Link to http://dist.plone.org ***BLOCKED*** by --allow-hosts ... While: Installing eggs. Getting distribution for 'kss.core'. Error: Couldn't find a distribution for 'kss.core'. That's what we wanted : this will prevent any attempt to access unwanted domains. For instance, some packages are listing in their links `svn://` links. These can lead to error in some cases, and can therefore be protected like this:: XXX (showcase with a svn:// file) >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... develop = allowdemo ... parts = eggs ... allow-hosts = ... ^(!svn://).* ... ... [eggs] ... recipe = zc.recipe.egg:eggs ... eggs = allowdemo ... ''') Now we can run the buildout and make sure all attempts to dist.plone.org fails:: >>> print system(buildout) # doctest: +ELLIPSIS Develop: '/sample-buildout/allowdemo' ... Link to http://dist.plone.org ***BLOCKED*** by --allow-hosts ... While: Installing eggs. Getting distribution for 'kss.core'. Error: Couldn't find a distribution for 'kss.core'. Test for issues --------------- Test for 1.0.5 breakage as in https://bugs.launchpad.net/zc.buildout/+bug/239212:: >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... parts=python ... foo = ${python:interpreter} ... ... [python] ... recipe=zc.recipe.egg ... eggs=zc.buildout ... interpreter=python ... ''') >>> print system(buildout) Unused options for buildout: 'foo'. Installing python. 
Generated script '/sample-buildout/bin/buildout'. Generated interpreter '/sample-buildout/bin/python'. The bug 239212 above would have got us an *AttributeError* on *buildout._allow_hosts*. This was fixed in this changeset: http://svn.zope.org/zc.buildout/trunk/src/zc/buildout/buildout.py?rev=87309&r1=87277&r2=87309 zc.buildout-1.7.1/src/zc/buildout/bootstrap.txt0000644000076500007650000003017712111414155021167 0ustar jimjim00000000000000Make sure the bootstrap script actually works:: >>> import os, sys >>> from os.path import dirname, join >>> import zc.buildout >>> bootstrap_py = join( ... dirname( ... dirname( ... dirname( ... dirname(zc.buildout.__file__) ... ) ... ) ... ), ... 'bootstrap', 'bootstrap.py') >>> sample_buildout = tmpdir('sample') >>> os.chdir(sample_buildout) >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = ... ''') >>> write('bootstrap.py', open(bootstrap_py).read()) >>> print 'X'; print system( ... zc.buildout.easy_install._safe_arg(sys.executable)+' '+ ... 'bootstrap.py'); print 'X' # doctest: +ELLIPSIS X... Creating directory '/sample/bin'. Creating directory '/sample/parts'. Creating directory '/sample/eggs'. Creating directory '/sample/develop-eggs'. Generated script '/sample/bin/buildout'. ... >>> ls(sample_buildout) d bin - bootstrap.py - buildout.cfg d develop-eggs d eggs d parts >>> ls(sample_buildout, 'bin') - buildout >>> print 'X'; ls(sample_buildout, 'eggs') # doctest: +ELLIPSIS X... d zc.buildout-...egg The buildout script it has generated is a new-style script, using a customized site.py. >>> buildout_script = join(sample_buildout, 'bin', 'buildout') >>> if sys.platform.startswith('win'): ... buildout_script += '-script.py' >>> print open(buildout_script).read() # doctest: +ELLIPSIS #... import sys sys.path[0:0] = [ '/sample/parts/buildout', ] import os path = sys.path[0] if os.environ.get('PYTHONPATH'): path = os.pathsep.join([path, os.environ['PYTHONPATH']]) os.environ['BUILDOUT_ORIGINAL_PYTHONPATH'] = os.environ.get('PYTHONPATH', '') os.environ['PYTHONPATH'] = path import site # imports custom buildout-generated site.py import zc.buildout.buildout if __name__ == '__main__': sys.exit(zc.buildout.buildout.main()) The bootstrap process prefers final versions of zc.buildout, so it has selected the (generated-locally) 99.99 egg rather than the also-available 100.0b1 egg. We can see that in the buildout script's site.py. >>> buildout_site_py = join( ... sample_buildout, 'parts', 'buildout', 'site.py') >>> print open(buildout_site_py).read() # doctest: +ELLIPSIS "... buildout_paths = [ '/sample/eggs/setuptools-...egg', '/sample/eggs/zc.buildout-1.99.99-pyN.N.egg' ] ... If you want to accept early releases of zc.buildout, you either need to specify an explicit version (using --version here and specifying the version in the buildout configuration file using the ``buildout-version`` option or the ``versions`` option) or specify that you accept early releases by using ``--accept-buildout-test-releases`` on the bootstrap script. Here's an example. >>> ignored = system( ... zc.buildout.easy_install._safe_arg(sys.executable)+' '+ ... 'bootstrap.py --accept-buildout-test-releases') >>> print open(buildout_site_py).read() # doctest: +ELLIPSIS "... buildout_paths = [ '/sample/eggs/setuptools-...egg', '/sample/eggs/zc.buildout-1.100.0b1-pyN.N.egg' ] ... Notice we are now using zc.buildout 1.100.0b1, a non-final release. The buildout script remembers the decision to accept early releases, and alerts the user. >>> print system(join('bin', 'buildout')), ... 
# doctest: +NORMALIZE_WHITESPACE NOTE: Accepting early releases of build system packages. Rerun bootstrap without --accept-buildout-test-releases (-t) to return to default behavior. This is accomplished within the script itself. >>> print open(buildout_script).read() # doctest: +ELLIPSIS #... sys.argv.insert(1, 'buildout:accept-buildout-test-releases=true') print ('NOTE: Accepting early releases of build system packages. Rerun ' 'bootstrap without --accept-buildout-test-releases (-t) to return to ' 'default behavior.') ... As the note says, to undo, you just need to re-run bootstrap without --accept-buildout-test-releases. >>> ignored = system( ... zc.buildout.easy_install._safe_arg(sys.executable)+' '+ ... 'bootstrap.py') >>> print open(buildout_site_py).read() # doctest: +ELLIPSIS "... buildout_paths = [ '/sample/eggs/setuptools-...egg', '/sample/eggs/zc.buildout-1.99.99-pyN.N.egg' ] ... >>> ('buildout:accept-buildout-test-releases=true' in ... open(buildout_script).read()) False Now we will try the `--version` option, which lets you define a version for `zc.buildout`. If not provided, bootstrap will look for the latest one. Let's try with an unknown version:: >>> print 'XX'; print system( ... zc.buildout.easy_install._safe_arg(sys.executable)+' '+ ... 'bootstrap.py --version UNKNOWN'); print 'X' # doctest: +ELLIPSIS ... X... No local packages or download links found for zc.buildout==UNKNOWN... ... Now let's try with `1.1.2`, which happens to exist:: >>> print 'X'; print system( ... zc.buildout.easy_install._safe_arg(sys.executable)+' '+ ... 'bootstrap.py --version 1.1.2'); print 'X' ... X Generated script '/sample/bin/buildout'. X Versions older than 1.5.0 put their egg dependencies in the ``buildout`` script. Let's make sure it was generated as we expect:: >>> print open(buildout_script).read() # doctest: +ELLIPSIS #... import sys sys.path[0:0] = [ '/sample/eggs/setuptools-...egg', '/sample/eggs/zc.buildout-1.1.2...egg', ] import zc.buildout.buildout if __name__ == '__main__': zc.buildout.buildout.main() Let's try with `1.2.1`:: >>> print 'X'; print system( ... zc.buildout.easy_install._safe_arg(sys.executable)+' '+ ... 'bootstrap.py --version 1.2.1'); print 'X' # doctest: +ELLIPSIS ... X Generated script '/sample/bin/buildout'. X Let's make sure the generated ``buildout`` script uses it:: >>> print open(buildout_script).read() # doctest: +ELLIPSIS #... import sys sys.path[0:0] = [ '/sample/eggs/setuptools-...egg', '/sample/eggs/zc.buildout-1.2.1...egg', ] import zc.buildout.buildout if __name__ == '__main__': zc.buildout.buildout.main() ``zc.buildout`` now can also run with `Distribute` with the `--distribute` option:: >>> print 'XX'; print system( ... zc.buildout.easy_install._safe_arg(sys.executable)+' '+ ... 'bootstrap.py --distribute') # doctest: +ELLIPSIS ... X...Generated script '/sample/bin/buildout'... Let's make sure the generated ``site.py`` uses it:: >>> print open(buildout_site_py).read() # doctest: +ELLIPSIS "... buildout_paths = [ '/sample/eggs/distribute-...egg', '/sample/eggs/zc.buildout-1.99.99-pyN.N.egg' ] ... Make sure both options can be used together:: >>> print 'XX'; print system( ... zc.buildout.easy_install._safe_arg(sys.executable)+' '+ ... 'bootstrap.py --distribute --version 1.2.1') ... # doctest: +ELLIPSIS X...Generated script '/sample/bin/buildout'... Let's make sure the old-style generated ``buildout`` script uses ``Distribute`` *and* ``zc.buildout-1.2.1``:: >>> print open(buildout_script).read() # doctest: +ELLIPSIS #... 
import sys sys.path[0:0] = [ '/sample/eggs/distribute-...egg', '/sample/eggs/zc.buildout-1.2.1...egg', ] import zc.buildout.buildout if __name__ == '__main__': zc.buildout.buildout.main() Last, the -c option needs to work on bootstrap.py:: >>> conf_file = os.path.join(sample_buildout, 'other.cfg') >>> f = open(conf_file, 'w') >>> f.write('[buildout]\nparts=\n\n') >>> f.close() >>> print 'XX'; print system( ... zc.buildout.easy_install._safe_arg(sys.executable)+' '+ ... 'bootstrap.py -c %s --distribute' % conf_file) # doctest: +ELLIPSIS X...Generated script '/sample/bin/buildout'... You can specify a location of ez_setup.py or distribute_setup, so you can rely on a local or remote location. We'll write our own ez_setup.py that we will also use to test some other bootstrap options. >>> write('ez_setup.py', '''\ ... def use_setuptools(**kwargs): ... import sys, pprint ... pprint.pprint(kwargs, width=40) ... sys.exit() ... ''') >>> print system( ... zc.buildout.easy_install._safe_arg(sys.executable)+' '+ ... 'bootstrap.py --setup-source=./ez_setup.py') ... # doctest: +ELLIPSIS {'download_delay': 0, 'to_dir': '...'} You can also pass a download-cache, and a place in which eggs should be stored (they are normally stored in a temporary directory). >>> print system( ... zc.buildout.easy_install._safe_arg(sys.executable)+' '+ ... 'bootstrap.py --setup-source=./ez_setup.py '+ ... '--download-base=./download-cache --eggs=eggs') ... # doctest: +ELLIPSIS {'download_base': '/sample/download-cache/', 'download_delay': 0, 'to_dir': '/sample/eggs'} Here's the entire help text. >>> print system( ... zc.buildout.easy_install._safe_arg(sys.executable)+' '+ ... 'bootstrap.py --help'), ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE Usage: [DESIRED PYTHON FOR BUILDOUT] bootstrap.py [options] Bootstraps a buildout-based project. Simply run this script in a directory containing a buildout.cfg, using the Python that you want bin/buildout to use. Note that by using --setup-source and --download-base to point to local resources, you can keep this script from going over the network. Options: -h, --help show this help message and exit -v VERSION, --version=VERSION use a specific zc.buildout version -d, --distribute Use Distribute rather than Setuptools. --setup-source=SETUP_SOURCE Specify a URL or file location for the setup file. If you use Setuptools, this will default to http://peak.telecommunity.com/dist/ez_setup.py; if you use Distribute, this will default to http://python- distribute.org/distribute_setup.py. --download-base=DOWNLOAD_BASE Specify a URL or directory for downloading zc.buildout and either Setuptools or Distribute. Defaults to PyPI. --eggs=EGGS Specify a directory for storing eggs. Defaults to a temporary directory that is deleted when the bootstrap script completes. -t, --accept-buildout-test-releases Normally, if you do not specify a --version, the bootstrap script and buildout gets the newest *final* versions of zc.buildout and its recipes and extensions for you. If you use this flag, bootstrap and buildout will get the newest releases even if they are alphas or betas. -c CONFIG_FILE Specify the path to the buildout configuration file to be used. Rebootstrap and create config file. >>> remove('buildout.cfg') >>> print system( ... zc.buildout.easy_install._safe_arg(sys.executable) + ... ' bootstrap.py init') # doctest: +NORMALIZE_WHITESPACE Creating '/sample/buildout.cfg'. 
>>> cat('buildout.cfg') # doctest: +NORMALIZE_WHITESPACE [buildout] parts = zc.buildout-1.7.1/src/zc/buildout/bootstrap1.txt0000644000076500007650000000335212111414155021243 0ustar jimjim00000000000000Bootstrap.py won't bootstrap buildout 2 even if there's a 2 egg in the user's egg cache: >>> import os >>> home = os.environ['HOME'] >>> mkdir(home) >>> mkdir(home, '.buildout') >>> write(home, '.buildout', 'default.cfg', """ ... [buildout] ... eggs-directory = %s ... """ % sample_eggs) >>> import os, sys >>> from os.path import dirname, join >>> import zc.buildout >>> bootstrap_py = join( ... dirname( ... dirname( ... dirname( ... dirname(zc.buildout.__file__) ... ) ... ) ... ), ... 'bootstrap', 'bootstrap.py') >>> sample_buildout = tmpdir('sample') >>> os.chdir(sample_buildout) >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = ... ''') >>> write('bootstrap.py', open(bootstrap_py).read()) >>> print 'X'; print system( ... zc.buildout.easy_install._safe_arg(sys.executable)+' '+ ... 'bootstrap.py'); print 'X' # doctest: +ELLIPSIS X... Creating directory '/sample/bin'. Creating directory '/sample/parts'. Creating directory '/sample/develop-eggs'. Generated script '/sample/bin/buildout'. ... The bootstrap process prefers final versions of zc.buildout, so it has selected the (generated-locally) 99.99 egg rather than the also-available 100.0b1 egg. We can see that in the buildout script's site.py. >>> buildout_site_py = join( ... sample_buildout, 'parts', 'buildout', 'site.py') >>> print open(buildout_site_py).read() # doctest: +ELLIPSIS "... buildout_paths = [ '/sample_eggs/setuptools-...egg', '/sample_eggs/zc.buildout-1.99.99-pyN.N.egg' ] ... It hasn't chosen buildout2. zc.buildout-1.7.1/src/zc/buildout/buildout.py0000644000076500007650000020651612111414155020614 0ustar jimjim00000000000000############################################################################## # # Copyright (c) 2005-2009 Zope Foundation and Contributors. # All Rights Reserved. # # This software is subject to the provisions of the Zope Public License, # Version 2.1 (ZPL). A copy of the ZPL should accompany this distribution. # THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED # WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED # WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS # FOR A PARTICULAR PURPOSE. # ############################################################################## """Buildout main script """ from rmtree import rmtree try: from hashlib import md5 except ImportError: # Python 2.4 and older from md5 import md5 import ConfigParser import copy import distutils.errors import glob import itertools import logging import os import pkg_resources import re import shutil import sys import tempfile import UserDict import warnings import subprocess import zc.buildout import zc.buildout.download import zc.buildout.easy_install realpath = zc.buildout.easy_install.realpath pkg_resources_loc = pkg_resources.working_set.find( pkg_resources.Requirement.parse('setuptools')).location _isurl = re.compile('([a-zA-Z0-9+.-]+)://').match is_jython = sys.platform.startswith('java') _sys_executable_has_broken_dash_S = ( zc.buildout.easy_install._has_broken_dash_S(sys.executable)) class MissingOption(zc.buildout.UserError, KeyError): """A required option was missing. """ class MissingSection(zc.buildout.UserError, KeyError): """A required section is missing. """ def __str__(self): return "The referenced section, %r, was not defined." 
% self[0] def _annotate_section(section, note): for key in section: section[key] = (section[key], note) return section def _annotate(data, note): for key in data: data[key] = _annotate_section(data[key], note) return data def _print_annotate(data): sections = data.keys() sections.sort() print print "Annotated sections" print "="*len("Annotated sections") for section in sections: print print '[%s]' % section keys = data[section].keys() keys.sort() for key in keys: value, notes = data[section][key] keyvalue = "%s= %s" % (key, value) print keyvalue line = ' ' for note in notes.split(): if note == '[+]': line = '+= ' elif note == '[-]': line = '-= ' else: print line, note line = ' ' print def _unannotate_section(section): for key in section: value, note = section[key] section[key] = value return section def _unannotate(data): for key in data: data[key] = _unannotate_section(data[key]) return data _buildout_default_options = _annotate_section({ 'accept-buildout-test-releases': 'false', 'allow-hosts': '*', 'allow-picked-versions': 'true', 'allowed-eggs-from-site-packages': '*', 'bin-directory': 'bin', 'develop-eggs-directory': 'develop-eggs', 'eggs-directory': 'eggs', 'executable': sys.executable, 'exec-sitecustomize': 'true', 'find-links': '', 'include-site-packages': 'true', 'install-from-cache': 'false', 'installed': '.installed.cfg', 'log-format': '', 'log-level': 'INFO', 'newest': 'true', 'offline': 'false', 'parts-directory': 'parts', 'prefer-final': 'false', 'python': 'buildout', 'relative-paths': 'false', 'socket-timeout': '', 'unzip': 'false', 'use-dependency-links': 'true', }, 'DEFAULT_VALUE') class Buildout(UserDict.DictMixin): def __init__(self, config_file, cloptions, user_defaults=True, windows_restart=False, command=None, args=()): __doing__ = 'Initializing.' self.__windows_restart = windows_restart # default options data = dict(buildout=_buildout_default_options.copy()) self._buildout_dir = os.getcwd() if not _isurl(config_file): config_file = os.path.abspath(config_file) base = os.path.dirname(config_file) if not os.path.exists(config_file): if command == 'init': self._init_config(config_file, args) elif command == 'setup': # Sigh. This model of a buildout instance # with methods is breaking down. :( config_file = None data['buildout']['directory'] = ('.', 'COMPUTED_VALUE') else: raise zc.buildout.UserError( "Couldn't open %s" % config_file) elif command == 'init': raise zc.buildout.UserError( "%r already exists." 
% config_file) if config_file: data['buildout']['directory'] = (os.path.dirname(config_file), 'COMPUTED_VALUE') else: base = None cloptions = dict( (section, dict((option, (value, 'COMMAND_LINE_VALUE')) for (_, option, value) in v)) for (section, v) in itertools.groupby(sorted(cloptions), lambda v: v[0]) ) override = cloptions.get('buildout', {}).copy() # load user defaults, which override defaults if user_defaults: user_config = os.path.join(os.path.expanduser('~'), '.buildout', 'default.cfg') if os.path.exists(user_config): _update(data, _open(os.path.dirname(user_config), user_config, [], data['buildout'].copy(), override, set())) # load configuration files if config_file: _update(data, _open(os.path.dirname(config_file), config_file, [], data['buildout'].copy(), override, set())) # apply command-line options _update(data, cloptions) self._annotated = copy.deepcopy(data) self._raw = _unannotate(data) self._data = {} self._parts = [] # provide some defaults before options are parsed # because while parsing options those attributes might be # used already (Gottfried Ganssauge) buildout_section = data['buildout'] # Try to make sure we have absolute paths for standard # directories. We do this before doing substitutions, in case # a one of these gets read by another section. If any # variable references are used though, we leave it as is in # _buildout_path. if 'directory' in buildout_section: self._buildout_dir = buildout_section['directory'] for name in ('bin', 'parts', 'eggs', 'develop-eggs'): d = self._buildout_path(buildout_section[name+'-directory']) buildout_section[name+'-directory'] = d # Attributes on this buildout object shouldn't be used by # recipes in their __init__. It can cause bugs, because the # recipes will be instantiated below (``options = self['buildout']``) # before this has completed initializing. These attributes are # left behind for legacy support but recipe authors should # beware of using them. A better practice is for a recipe to # use the buildout['buildout'] options. links = buildout_section['find-links'] self._links = links and links.split() or () allow_hosts = buildout_section['allow-hosts'].split('\n') self._allow_hosts = tuple([host.strip() for host in allow_hosts if host.strip() != '']) self._logger = logging.getLogger('zc.buildout') self.offline = (buildout_section['offline'] == 'true') self.newest = (buildout_section['newest'] == 'true') self.accept_buildout_test_releases = ( buildout_section['accept-buildout-test-releases'] == 'true') ################################################################## ## WARNING!!! ## ALL ATTRIBUTES MUST HAVE REASONABLE DEFAULTS AT THIS POINT ## OTHERWISE ATTRIBUTEERRORS MIGHT HAPPEN ANY TIME FROM RECIPES. ## RECIPES SHOULD GENERALLY USE buildout['buildout'] OPTIONS, NOT ## BUILDOUT ATTRIBUTES. ################################################################## # initialize some attrs and buildout directories. options = self['buildout'] # now reinitialize links = options.get('find-links', '') self._links = links and links.split() or () allow_hosts = options['allow-hosts'].split('\n') self._allow_hosts = tuple([host.strip() for host in allow_hosts if host.strip() != '']) self._buildout_dir = options['directory'] # Make sure we have absolute paths for standard directories. We do this # a second time here in case someone overrode these in their configs. 
for name in ('bin', 'parts', 'eggs', 'develop-eggs'): d = self._buildout_path(options[name+'-directory']) options[name+'-directory'] = d if options['installed']: options['installed'] = os.path.join(options['directory'], options['installed']) self._setup_logging() self.versions = { 'zc.recipe.egg': '<2dev', 'zc.recipe.testrunner': '<2dev', } versions_option = options.get('versions') if versions_option: self.versions.update(self[versions_option]) zc.buildout.easy_install.default_versions(self.versions) self.offline = options.get_bool('offline') if self.offline: options['newest'] = 'false' self.newest = options.get_bool('newest') zc.buildout.easy_install.prefer_final( options.get_bool('prefer-final')) self.accept_buildout_test_releases = options.get_bool( 'accept-buildout-test-releases') zc.buildout.easy_install.use_dependency_links( options.get_bool('use-dependency-links')) zc.buildout.easy_install.allow_picked_versions( options.get_bool('allow-picked-versions')) zc.buildout.easy_install.install_from_cache( options.get_bool('install-from-cache')) zc.buildout.easy_install.always_unzip(options.get_bool('unzip')) allowed_eggs = tuple(name.strip() for name in options[ 'allowed-eggs-from-site-packages'].split('\n')) self.include_site_packages = options.get_bool('include-site-packages') self.exec_sitecustomize = options.get_bool('exec-sitecustomize') if (_sys_executable_has_broken_dash_S and (not self.include_site_packages or allowed_eggs != ('*',))): # We can't do this if the executable has a broken -S. warnings.warn(zc.buildout.easy_install.BROKEN_DASH_S_WARNING) self.include_site_packages = True zc.buildout.easy_install.allowed_eggs_from_site_packages(allowed_eggs) zc.buildout.easy_install.include_site_packages( self.include_site_packages) download_cache = options.get('download-cache') if download_cache: download_cache = os.path.join(options['directory'], download_cache) if not os.path.isdir(download_cache): raise zc.buildout.UserError( 'The specified download cache:\n' '%r\n' "Doesn't exist.\n" % download_cache) download_cache = os.path.join(download_cache, 'dist') if not os.path.isdir(download_cache): os.mkdir(download_cache) zc.buildout.easy_install.download_cache(download_cache) # "Use" each of the defaults so they aren't reported as unused options. for name in _buildout_default_options: options[name] # Do the same for extends-cache which is not among the defaults but # wasn't recognized as having been used since it was used before # tracking was turned on. options.get('extends-cache') os.chdir(options['directory']) def _buildout_path(self, name): if '${' in name: return name return os.path.join(self._buildout_dir, name) def bootstrap(self, args): __doing__ = 'Bootstrapping.' self._setup_directories() options = self['buildout'] # Get a base working set for our distributions that corresponds to the # stated desires in the configuration. 
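# The 'zc.buildout<2dev' requirement below keeps this working set on the 1.x series, so a buildout-2 egg cannot be selected during bootstrap.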
distributions = ['setuptools', 'zc.buildout<2dev'] if options.get('offline') == 'true': ws = zc.buildout.easy_install.working_set( distributions, options['executable'], [options['develop-eggs-directory'], options['eggs-directory']], prefer_final=not self.accept_buildout_test_releases, ) else: ws = zc.buildout.easy_install.install( distributions, options['eggs-directory'], links=self._links, index=options.get('index'), executable=options['executable'], path=[options['develop-eggs-directory']], newest=self.newest, allow_hosts=self._allow_hosts, prefer_final=not self.accept_buildout_test_releases, ) # Now copy buildout and setuptools eggs, and record destination eggs: entries = [] for name in 'setuptools', 'zc.buildout': r = pkg_resources.Requirement.parse(name) dist = ws.find(r) if dist.precedence == pkg_resources.DEVELOP_DIST: dest = os.path.join(self['buildout']['develop-eggs-directory'], name+'.egg-link') open(dest, 'w').write(dist.location) entries.append(dist.location) else: dest = os.path.join(self['buildout']['eggs-directory'], os.path.basename(dist.location)) entries.append(dest) if not os.path.exists(dest): if os.path.isdir(dist.location): shutil.copytree(dist.location, dest) else: shutil.copy2(dist.location, dest) # Create buildout script. # Ideally the (possibly) new version of buildout would get a # chance to write the script. Not sure how to do that. ws = pkg_resources.WorkingSet(entries) ws.require('zc.buildout') partsdir = os.path.join(options['parts-directory'], 'buildout') if not os.path.exists(partsdir): os.mkdir(partsdir) # (Honor the relative-paths option.) relative_paths = options.get('relative-paths', 'false') if relative_paths == 'true': relative_paths = options['directory'] elif relative_paths == 'false': relative_paths = '' else: raise zc.buildout.UserError("relative_paths must be true or false") if (self.accept_buildout_test_releases and self._annotated['buildout']['accept-buildout-test-releases'][1] == 'COMMAND_LINE_VALUE'): # Bootstrap was called with '--accept-buildout-test-releases'. # Continue to honor that setting. script_initialization = _early_release_initialization_code else: script_initialization = '' zc.buildout.easy_install.sitepackage_safe_scripts( options['bin-directory'], ws, options['executable'], partsdir, reqs=['zc.buildout'], relative_paths=relative_paths, include_site_packages=self.include_site_packages, script_initialization=script_initialization, exec_sitecustomize=self.exec_sitecustomize, ) def _init_config(self, config_file, args): print 'Creating %r.' % config_file f = open(config_file, 'w') if args: sep = re.compile(r'[\\/]') ossep = os.path.sep if ossep == '\\': ossep = '\\\\' # re.sub does not like a single backslash eggs = '\n '.join(a for a in args if not sep.search(a)) paths = '\n '.join( sep.sub(ossep, a) for a in args if sep.search(a)) f.write('[buildout]\n' 'parts = py\n' '\n' '[py]\n' 'recipe = zc.recipe.egg\n' 'interpreter = py\n' 'eggs =\n' ) if eggs: f.write(' %s\n' % eggs) if paths: f.write('extra-paths =\n %s\n' % paths) for p in [a for a in args if sep.search(a)]: if not os.path.exists(p): os.mkdir(p) else: f.write('[buildout]\nparts =\n') f.close() def init(self, args): self.bootstrap(()) if args: self.install(()) def install(self, install_args): __doing__ = 'Installing.' self._load_extensions() self._setup_directories() # Add develop-eggs directory to path so that it gets searched # for eggs: sys.path.insert(0, self['buildout']['develop-eggs-directory']) # Check for updates. This could cause the process to be restarted. 
self._maybe_upgrade() # load installed data (installed_part_options, installed_exists )= self._read_installed_part_options() # Remove old develop eggs self._uninstall( installed_part_options['buildout'].get( 'installed_develop_eggs', '') ) # Build develop eggs installed_develop_eggs = self._develop() installed_part_options['buildout']['installed_develop_eggs' ] = installed_develop_eggs if installed_exists: self._update_installed( installed_develop_eggs=installed_develop_eggs) # get configured and installed part lists conf_parts = self['buildout']['parts'] conf_parts = conf_parts and conf_parts.split() or [] installed_parts = installed_part_options['buildout']['parts'] installed_parts = installed_parts and installed_parts.split() or [] if install_args: install_parts = install_args uninstall_missing = False else: install_parts = conf_parts uninstall_missing = True # load and initialize recipes [self[part]['recipe'] for part in install_parts] if not install_args: install_parts = self._parts if self._log_level < logging.DEBUG: sections = list(self) sections.sort() print print 'Configuration data:' for section in self._data: _save_options(section, self[section], sys.stdout) print # compute new part recipe signatures self._compute_part_signatures(install_parts) # uninstall parts that are no-longer used or whose configs # have changed for part in reversed(installed_parts): if part in install_parts: old_options = installed_part_options[part].copy() installed_files = old_options.pop('__buildout_installed__') new_options = self.get(part) if old_options == new_options: # The options are the same, but are all of the # installed files still there? If not, we should # reinstall. if not installed_files: continue for f in installed_files.split('\n'): if not os.path.exists(self._buildout_path(f)): break else: continue # output debugging info if self._logger.getEffectiveLevel() < logging.DEBUG: for k in old_options: if k not in new_options: self._logger.debug("Part %s, dropped option %s.", part, k) elif old_options[k] != new_options[k]: self._logger.debug( "Part %s, option %s changed:\n%r != %r", part, k, new_options[k], old_options[k], ) for k in new_options: if k not in old_options: self._logger.debug("Part %s, new option %s.", part, k) elif not uninstall_missing: continue self._uninstall_part(part, installed_part_options) installed_parts = [p for p in installed_parts if p != part] if installed_exists: self._update_installed(parts=' '.join(installed_parts)) # Check for unused buildout options: _check_for_unused_options_in_section(self, 'buildout') # install new parts for part in install_parts: signature = self[part].pop('__buildout_signature__') saved_options = self[part].copy() recipe = self[part].recipe if part in installed_parts: # update need_to_save_installed = False __doing__ = 'Updating %s.', part self._logger.info(*__doing__) old_options = installed_part_options[part] old_installed_files = old_options['__buildout_installed__'] try: update = recipe.update except AttributeError: update = recipe.install self._logger.warning( "The recipe for %s doesn't define an update " "method. 
Using its install method.", part) try: installed_files = self[part]._call(update) except: installed_parts.remove(part) self._uninstall(old_installed_files) if installed_exists: self._update_installed( parts=' '.join(installed_parts)) raise old_installed_files = old_installed_files.split('\n') if installed_files is None: installed_files = old_installed_files else: if isinstance(installed_files, str): installed_files = [installed_files] else: installed_files = list(installed_files) need_to_save_installed = [ p for p in installed_files if p not in old_installed_files] if need_to_save_installed: installed_files = (old_installed_files + need_to_save_installed) else: # install need_to_save_installed = True __doing__ = 'Installing %s.', part self._logger.info(*__doing__) installed_files = self[part]._call(recipe.install) if installed_files is None: self._logger.warning( "The %s install returned None. A path or " "iterable of paths should be returned.", part) installed_files = () elif isinstance(installed_files, str): installed_files = [installed_files] else: installed_files = list(installed_files) installed_part_options[part] = saved_options saved_options['__buildout_installed__' ] = '\n'.join(installed_files) saved_options['__buildout_signature__'] = signature installed_parts = [p for p in installed_parts if p != part] installed_parts.append(part) _check_for_unused_options_in_section(self, part) if need_to_save_installed: installed_part_options['buildout']['parts'] = ( ' '.join(installed_parts)) self._save_installed_options(installed_part_options) installed_exists = True else: assert installed_exists # nothing to tell the user here self._update_installed(parts=' '.join(installed_parts)) if installed_develop_eggs: if not installed_exists: self._save_installed_options(installed_part_options) elif (not installed_parts) and installed_exists: os.remove(self['buildout']['installed']) self._unload_extensions() def _update_installed(self, **buildout_options): installed = self['buildout']['installed'] f = open(installed, 'a') f.write('\n[buildout]\n') for option, value in buildout_options.items(): _save_option(option, value, f) f.close() def _uninstall_part(self, part, installed_part_options): # uninstall part __doing__ = 'Uninstalling %s.', part self._logger.info(*__doing__) # run uninstall recipe recipe, entry = _recipe(installed_part_options[part]) try: uninstaller = _install_and_load( recipe, 'zc.buildout.uninstall', entry, self) self._logger.info('Running uninstall recipe.') uninstaller(part, installed_part_options[part]) except (ImportError, pkg_resources.DistributionNotFound), v: pass # remove created files and directories self._uninstall( installed_part_options[part]['__buildout_installed__']) def _setup_directories(self): __doing__ = 'Setting up buildout directories' # Create buildout directories for name in ('bin', 'parts', 'eggs', 'develop-eggs'): d = self['buildout'][name+'-directory'] if not os.path.exists(d): self._logger.info('Creating directory %r.', d) os.mkdir(d) def _develop(self): """Install sources by running setup.py develop on them """ __doing__ = 'Processing directories listed in the develop option' develop = self['buildout'].get('develop') if not develop: return '' dest = self['buildout']['develop-eggs-directory'] old_files = os.listdir(dest) env = dict(os.environ, PYTHONPATH=pkg_resources_loc) here = os.getcwd() try: try: for setup in develop.split(): setup = self._buildout_path(setup) files = glob.glob(setup) if not files: self._logger.warn("Couldn't develop %r (not found)", 
setup) else: files.sort() for setup in files: self._logger.info("Develop: %r", setup) __doing__ = 'Processing develop directory %r.', setup zc.buildout.easy_install.develop(setup, dest) except: # if we had an error, we need to roll back changes, by # removing any files we created. self._sanity_check_develop_eggs_files(dest, old_files) self._uninstall('\n'.join( [os.path.join(dest, f) for f in os.listdir(dest) if f not in old_files ])) raise else: self._sanity_check_develop_eggs_files(dest, old_files) return '\n'.join([os.path.join(dest, f) for f in os.listdir(dest) if f not in old_files ]) finally: os.chdir(here) def _sanity_check_develop_eggs_files(self, dest, old_files): for f in os.listdir(dest): if f in old_files: continue if not (os.path.isfile(os.path.join(dest, f)) and f.endswith('.egg-link')): self._logger.warning( "Unexpected entry, %r, in develop-eggs directory.", f) def _compute_part_signatures(self, parts): # Compute recipe signature and add to options for part in parts: options = self.get(part) if options is None: options = self[part] = {} recipe, entry = _recipe(options) req = pkg_resources.Requirement.parse(recipe) sig = _dists_sig(pkg_resources.working_set.resolve([req])) options['__buildout_signature__'] = ' '.join(sig) def _read_installed_part_options(self): old = self['buildout']['installed'] if old and os.path.isfile(old): parser = ConfigParser.RawConfigParser() parser.optionxform = lambda s: s parser.read(old) result = {} for section in parser.sections(): options = {} for option, value in parser.items(section): if '%(' in value: for k, v in _spacey_defaults: value = value.replace(k, v) options[option] = value result[section] = Options(self, section, options) return result, True else: return ({'buildout': Options(self, 'buildout', {'parts': ''})}, False, ) def _uninstall(self, installed): for f in installed.split('\n'): if not f: continue f = self._buildout_path(f) if os.path.isdir(f): rmtree(f) elif os.path.isfile(f): try: os.remove(f) except OSError: if not ( sys.platform == 'win32' and (realpath(os.path.join(os.path.dirname(sys.argv[0]), 'buildout.exe')) == realpath(f) ) # Sigh. This is the exectable used to run the buildout # and, of course, it's in use. Leave it. ): raise def _install(self, part): options = self[part] recipe, entry = _recipe(options) recipe_class = pkg_resources.load_entry_point( recipe, 'zc.buildout', entry) installed = recipe_class(self, part, options).install() if installed is None: installed = [] elif isinstance(installed, basestring): installed = [installed] base = self._buildout_path('') installed = [d.startswith(base) and d[len(base):] or d for d in installed] return ' '.join(installed) def _save_installed_options(self, installed_options): installed = self['buildout']['installed'] if not installed: return f = open(installed, 'w') _save_options('buildout', installed_options['buildout'], f) for part in installed_options['buildout']['parts'].split(): print >>f _save_options(part, installed_options[part], f) f.close() def _error(self, message, *args): raise zc.buildout.UserError(message % args) def _setup_logging(self): root_logger = logging.getLogger() self._logger = logging.getLogger('zc.buildout') handler = logging.StreamHandler(sys.stdout) log_format = self['buildout']['log-format'] if not log_format: # No format specified. 
Use different formatter for buildout # and other modules, showing logger name except for buildout log_format = '%(name)s: %(message)s' buildout_handler = logging.StreamHandler(sys.stdout) buildout_handler.setFormatter(logging.Formatter('%(message)s')) self._logger.propagate = False self._logger.addHandler(buildout_handler) handler.setFormatter(logging.Formatter(log_format)) root_logger.addHandler(handler) level = self['buildout']['log-level'] if level in ('DEBUG', 'INFO', 'WARNING', 'ERROR', 'CRITICAL'): level = getattr(logging, level) else: try: level = int(level) except ValueError: self._error("Invalid logging level %s", level) verbosity = self['buildout'].get('verbosity', 0) try: verbosity = int(verbosity) except ValueError: self._error("Invalid verbosity %s", verbosity) level -= verbosity root_logger.setLevel(level) self._log_level = level def _maybe_upgrade(self): # See if buildout or setuptools need to be upgraded. # If they do, do the upgrade and restart the buildout process. __doing__ = 'Checking for upgrades.' if not self.newest: return specs = ['zc.buildout'] if zc.buildout.easy_install.is_distribute: specs.append('distribute') else: specs.append('setuptools') # Prevent downgrading due to prefer-final: options = self['buildout'] if not ('zc.buildout-version' in options or 'zc.buildout' in self.versions): v = pkg_resources.working_set.find( pkg_resources.Requirement.parse('zc.buildout') ).version options['zc.buildout-version'] = '>=' + v + ', <2dev' ws = zc.buildout.easy_install.install( [ (spec + ' ' + options.get(spec+'-version', '')).strip() for spec in specs ], options['eggs-directory'], links = options.get('find-links', '').split(), index = options.get('index'), path = [options['develop-eggs-directory']], allow_hosts = self._allow_hosts, prefer_final=not self.accept_buildout_test_releases, ) upgraded = [] for project in 'zc.buildout', 'setuptools': req = pkg_resources.Requirement.parse(project) project_location = pkg_resources.working_set.find(req).location if ws.find(req).location != project_location: upgraded.append(ws.find(req)) if not upgraded: return __doing__ = 'Upgrading.' should_run = realpath( os.path.join(os.path.abspath(options['bin-directory']), 'buildout') ) if sys.platform == 'win32': should_run += '-script.py' if realpath(os.path.abspath(sys.argv[0])) != should_run: self._logger.debug("Running %r.", realpath(sys.argv[0])) self._logger.debug("Local buildout is %r.", should_run) self._logger.warn("Not upgrading because not running a local " "buildout command.") return if sys.platform == 'win32' and not self.__windows_restart: args = map(zc.buildout.easy_install._safe_arg, sys.argv) args.insert(1, '-W') if not __debug__: args.insert(0, '-O') args.insert(0, zc.buildout.easy_install._safe_arg (sys.executable)) os.execv(sys.executable, args) self._logger.info("Upgraded:\n %s;\nrestarting.", ",\n ".join([("%s version %s" % (dist.project_name, dist.version) ) for dist in upgraded ] ), ) # the new dist is different, so we've upgraded. # Update the scripts and return True # Ideally the new version of buildout would get a chance to write the # script. Not sure how to do that. partsdir = os.path.join(options['parts-directory'], 'buildout') if os.path.exists(partsdir): # This is primarily for unit tests, in which .py files change too # fast for Python to know to regenerate the .pyc/.pyo files. 
shutil.rmtree(partsdir) os.mkdir(partsdir) if (self.accept_buildout_test_releases and self._annotated['buildout']['accept-buildout-test-releases'][1] == 'COMMAND_LINE_VALUE'): # Bootstrap was called with '--accept-buildout-test-releases'. # Continue to honor that setting. script_initialization = _early_release_initialization_code else: script_initialization = '' # (Honor the relative-paths option.) relative_paths = options.get('relative-paths', 'false') if relative_paths == 'true': relative_paths = options['directory'] elif relative_paths == 'false': relative_paths = '' else: raise zc.buildout.UserError("relative_paths must be true or false") zc.buildout.easy_install.sitepackage_safe_scripts( options['bin-directory'], ws, options['executable'], partsdir, reqs=['zc.buildout'], relative_paths=relative_paths, include_site_packages=self.include_site_packages, script_initialization=script_initialization, exec_sitecustomize=self.exec_sitecustomize, ) # Restart args = list(sys.argv) if not __debug__: args.insert(0, '-O') args.insert(0, sys.executable) # We want to make sure that our new site.py is used for rerunning # buildout, so we put the partsdir in PYTHONPATH for our restart. # This overrides any set PYTHONPATH, but since we generally are # trying to run with a completely "clean" python (only the standard # library) then that should be fine. env = os.environ.copy() env['PYTHONPATH'] = partsdir # windows: Popen will quote args itself if needed # see subprocess.list2cmdline os._exit(subprocess.Popen(args, env=env).wait()) def _load_extensions(self): __doing__ = 'Loading extensions.' specs = self['buildout'].get('extensions', '').split() if specs: path = [self['buildout']['develop-eggs-directory']] if self.offline: dest = None path.append(self['buildout']['eggs-directory']) else: dest = self['buildout']['eggs-directory'] if not os.path.exists(dest): self._logger.info('Creating directory %r.', dest) os.mkdir(dest) zc.buildout.easy_install.install( specs, dest, path=path, working_set=pkg_resources.working_set, links = self['buildout'].get('find-links', '').split(), index = self['buildout'].get('index'), newest=self.newest, allow_hosts=self._allow_hosts, prefer_final=not self.accept_buildout_test_releases) # Clear cache because extensions might now let us read pages we # couldn't read before. zc.buildout.easy_install.clear_index_cache() for ep in pkg_resources.iter_entry_points('zc.buildout.extension'): ep.load()(self) def _unload_extensions(self): __doing__ = 'Unloading extensions.' specs = self['buildout'].get('extensions', '').split() if specs: for ep in pkg_resources.iter_entry_points( 'zc.buildout.unloadextension'): ep.load()(self) def setup(self, args): if not args: raise zc.buildout.UserError( "The setup command requires the path to a setup script or \n" "directory containing a setup script, and its arguments." 
) setup = args.pop(0) if os.path.isdir(setup): setup = os.path.join(setup, 'setup.py') self._logger.info("Running setup script %r.", setup) setup = os.path.abspath(setup) fd, tsetup = tempfile.mkstemp() exe = zc.buildout.easy_install._safe_arg(sys.executable) try: os.write(fd, zc.buildout.easy_install.runsetup_template % dict( setuptools=pkg_resources_loc, setupdir=os.path.dirname(setup), setup=setup, __file__ = setup, )) if is_jython: arg_list = list() for a in args: arg_list.append(zc.buildout.easy_install._safe_arg(a)) subprocess.Popen([exe] + list(tsetup) + arg_list).wait() else: os.spawnl(os.P_WAIT, sys.executable, exe, tsetup, *[zc.buildout.easy_install._safe_arg(a) for a in args]) finally: os.close(fd) os.remove(tsetup) runsetup = setup # backward compat. def annotate(self, args): _print_annotate(self._annotated) def __getitem__(self, section): __doing__ = 'Getting section %s.', section try: return self._data[section] except KeyError: pass try: data = self._raw[section] except KeyError: raise MissingSection(section) options = Options(self, section, data) self._data[section] = options options._initialize() return options def __setitem__(self, key, value): raise NotImplementedError('__setitem__') def __delitem__(self, key): raise NotImplementedError('__delitem__') def keys(self): return self._raw.keys() def __iter__(self): return iter(self._raw) def _install_and_load(spec, group, entry, buildout): __doing__ = 'Loading recipe %r.', spec try: req = pkg_resources.Requirement.parse(spec) buildout_options = buildout['buildout'] if pkg_resources.working_set.find(req) is None: __doing__ = 'Installing recipe %s.', spec if buildout.offline: dest = None path = [buildout_options['develop-eggs-directory'], buildout_options['eggs-directory'], ] else: dest = buildout_options['eggs-directory'] path = [buildout_options['develop-eggs-directory']] zc.buildout.easy_install.install( [spec], dest, links=buildout._links, index=buildout_options.get('index'), path=path, working_set=pkg_resources.working_set, newest=buildout.newest, allow_hosts=buildout._allow_hosts, prefer_final=not buildout.accept_buildout_test_releases) __doing__ = 'Loading %s recipe entry %s:%s.', group, spec, entry return pkg_resources.load_entry_point( req.project_name, group, entry) except Exception, v: buildout._logger.log( 1, "Could't load %s entry point %s\nfrom %s:\n%s.", group, entry, spec, v) raise class Options(UserDict.DictMixin): def __init__(self, buildout, section, data): self.buildout = buildout self.name = section self._raw = data self._cooked = {} self._data = {} def _initialize(self): name = self.name __doing__ = 'Initializing section %s.', name if '<' in self._raw: self._raw = self._do_extend_raw(name, self._raw, []) # force substitutions for k, v in self._raw.items(): if '${' in v: self._dosub(k, v) if self.name == 'buildout': return # buildout section can never be a part recipe = self.get('recipe') if not recipe: return reqs, entry = _recipe(self._data) buildout = self.buildout recipe_class = _install_and_load(reqs, 'zc.buildout', entry, buildout) __doing__ = 'Initializing part %s.', name self.recipe = recipe_class(buildout, name, self) buildout._parts.append(name) def _do_extend_raw(self, name, data, doing): if name == 'buildout': return data if name in doing: raise zc.buildout.UserError("Infinite extending loop %r" % name) doing.append(name) try: to_do = data.pop('<', None) if to_do is None: return data __doing__ = 'Loading input sections for %r', name result = {} for iname in to_do.split('\n'): iname = 
iname.strip() if not iname: continue raw = self.buildout._raw.get(iname) if raw is None: raise zc.buildout.UserError("No section named %r" % iname) result.update(self._do_extend_raw(iname, raw, doing)) result.update(data) return result finally: assert doing.pop() == name def _dosub(self, option, v): __doing__ = 'Getting option %s:%s.', self.name, option seen = [(self.name, option)] v = '$$'.join([self._sub(s, seen) for s in v.split('$$')]) self._cooked[option] = v def get(self, option, default=None, seen=None): try: return self._data[option] except KeyError: pass v = self._cooked.get(option) if v is None: v = self._raw.get(option) if v is None: return default __doing__ = 'Getting option %s:%s.', self.name, option if '${' in v: key = self.name, option if seen is None: seen = [key] elif key in seen: raise zc.buildout.UserError( "Circular reference in substitutions.\n" ) else: seen.append(key) v = '$$'.join([self._sub(s, seen) for s in v.split('$$')]) seen.pop() self._data[option] = v return v _template_split = re.compile('([$]{[^}]*})').split _simple = re.compile('[-a-zA-Z0-9 ._]+$').match _valid = re.compile('\${[-a-zA-Z0-9 ._]*:[-a-zA-Z0-9 ._]+}$').match def _sub(self, template, seen): value = self._template_split(template) subs = [] for ref in value[1::2]: s = tuple(ref[2:-1].split(':')) if not self._valid(ref): if len(s) < 2: raise zc.buildout.UserError("The substitution, %s,\n" "doesn't contain a colon." % ref) if len(s) > 2: raise zc.buildout.UserError("The substitution, %s,\n" "has too many colons." % ref) if not self._simple(s[0]): raise zc.buildout.UserError( "The section name in substitution, %s,\n" "has invalid characters." % ref) if not self._simple(s[1]): raise zc.buildout.UserError( "The option name in substitution, %s,\n" "has invalid characters." 
% ref) section, option = s if not section: section = self.name v = self.buildout[section].get(option, None, seen) if v is None: if option == '_buildout_section_name_': v = self.name else: raise MissingOption("Referenced option does not exist:", section, option) subs.append(v) subs.append('') return ''.join([''.join(v) for v in zip(value[::2], subs)]) def __getitem__(self, key): try: return self._data[key] except KeyError: pass v = self.get(key) if v is None: raise MissingOption("Missing option: %s:%s" % (self.name, key)) return v def __setitem__(self, option, value): if not isinstance(value, str): raise TypeError('Option values must be strings', value) self._data[option] = value def __delitem__(self, key): if key in self._raw: del self._raw[key] if key in self._data: del self._data[key] if key in self._cooked: del self._cooked[key] elif key in self._data: del self._data[key] else: raise KeyError, key def keys(self): raw = self._raw return list(self._raw) + [k for k in self._data if k not in raw] def copy(self): result = self._raw.copy() result.update(self._cooked) result.update(self._data) return result def _call(self, f): buildout_directory = self.buildout['buildout']['directory'] self._created = [] try: try: os.chdir(buildout_directory) return f() except: for p in self._created: if os.path.isdir(p): rmtree(p) elif os.path.isfile(p): os.remove(p) else: self.buildout._logger.warn("Couldn't clean up %r.", p) raise finally: self._created = None os.chdir(buildout_directory) def created(self, *paths): try: self._created.extend(paths) except AttributeError: raise TypeError( "Attempt to register a created path while not installing", self.name) return self._created def query_bool(self, name, default=None): """Given a name, return a boolean value for that name. ``default``, if given, should be 'true', 'false', or None. """ if default is not None: value = self.setdefault(name, default=default) else: value = self.get(name) if value is None: return value return _convert_bool(name, value) def get_bool(self, name): """Given a name, return a boolean value for that name. 
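The value must be the string 'true' or 'false'; any other value causes a UserError to be raised. Unlike query_bool, no default can be supplied, so a missing option raises MissingOption.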
""" return _convert_bool(name, self[name]) def _convert_bool(name, value): if value not in ('true', 'false'): raise zc.buildout.UserError( 'Invalid value for %s option: %s' % (name, value)) else: return value == 'true' _spacey_nl = re.compile('[ \t\r\f\v]*\n[ \t\r\f\v\n]*' '|' '^[ \t\r\f\v]+' '|' '[ \t\r\f\v]+$' ) _spacey_defaults = [ ('%(__buildout_space__)s', ' '), ('%(__buildout_space_n__)s', '\n'), ('%(__buildout_space_r__)s', '\r'), ('%(__buildout_space_f__)s', '\f'), ('%(__buildout_space_v__)s', '\v'), ] def _quote_spacey_nl(match): match = match.group(0).split('\n', 1) result = '\n\t'.join( [(s .replace(' ', '%(__buildout_space__)s') .replace('\r', '%(__buildout_space_r__)s') .replace('\f', '%(__buildout_space_f__)s') .replace('\v', '%(__buildout_space_v__)s') .replace('\n', '%(__buildout_space_n__)s') ) for s in match] ) return result def _save_option(option, value, f): value = _spacey_nl.sub(_quote_spacey_nl, value) if value.startswith('\n\t'): value = '%(__buildout_space_n__)s' + value[2:] if value.endswith('\n\t'): value = value[:-2] + '%(__buildout_space_n__)s' print >>f, option, '=', value def _save_options(section, options, f): print >>f, '[%s]' % section items = options.items() items.sort() for option, value in items: _save_option(option, value, f) def _open(base, filename, seen, dl_options, override, downloaded): """Open a configuration file and return the result as a dictionary, Recursively open other files based on buildout options found. """ _update_section(dl_options, override) _dl_options = _unannotate_section(dl_options.copy()) newest = _convert_bool('newest', _dl_options.get('newest', 'false')) fallback = newest and not (filename in downloaded) download = zc.buildout.download.Download( _dl_options, cache=_dl_options.get('extends-cache'), fallback=fallback, hash_name=True) is_temp = False if _isurl(filename): path, is_temp = download(filename) fp = open(path) base = filename[:filename.rfind('/')] elif _isurl(base): if os.path.isabs(filename): fp = open(filename) base = os.path.dirname(filename) else: filename = base + '/' + filename path, is_temp = download(filename) fp = open(path) base = filename[:filename.rfind('/')] else: filename = os.path.join(base, filename) fp = open(filename) base = os.path.dirname(filename) downloaded.add(filename) if filename in seen: if is_temp: fp.close() os.remove(path) raise zc.buildout.UserError("Recursive file include", seen, filename) root_config_file = not seen seen.append(filename) result = {} parser = ConfigParser.RawConfigParser() parser.optionxform = lambda s: s parser.readfp(fp) if is_temp: fp.close() os.remove(path) extends = None for section in parser.sections(): options = dict(parser.items(section)) if section == 'buildout': extends = options.pop('extends', extends) if 'extended-by' in options: raise zc.buildout.UserError( 'No-longer supported "extended-by" option found in %s.' 
% filename) result[section] = options result = _annotate(result, filename) if root_config_file and 'buildout' in result: dl_options = _update_section(dl_options, result['buildout']) if extends: extends = extends.split() eresult = _open(base, extends.pop(0), seen, dl_options, override, downloaded) for fname in extends: _update(eresult, _open(base, fname, seen, dl_options, override, downloaded)) result = _update(eresult, result) seen.pop() return result ignore_directories = '.svn', 'CVS' _dir_hashes = {} def _dir_hash(dir): dir_hash = _dir_hashes.get(dir, None) if dir_hash is not None: return dir_hash hash = md5() for (dirpath, dirnames, filenames) in os.walk(dir): dirnames[:] = [n for n in dirnames if n not in ignore_directories] filenames[:] = [f for f in filenames if (not (f.endswith('pyc') or f.endswith('pyo')) and os.path.exists(os.path.join(dirpath, f))) ] hash.update(' '.join(dirnames)) hash.update(' '.join(filenames)) for name in filenames: hash.update(open(os.path.join(dirpath, name)).read()) _dir_hashes[dir] = dir_hash = hash.digest().encode('base64').strip() return dir_hash def _dists_sig(dists): result = [] for dist in dists: location = dist.location if dist.precedence == pkg_resources.DEVELOP_DIST: result.append(dist.project_name + '-' + _dir_hash(location)) else: result.append(os.path.basename(location)) return result def _update_section(s1, s2): # Base section 2 on section 1; section 1 is copied, with key-value pairs # in section 2 overriding those in section 1. If there are += or -= # operators in section 2, process these to add or substract items (delimited # by newlines) from the preexisting values. s2 = s2.copy() # avoid mutating the second argument, which is unexpected # Sort on key, then on the addition or substraction operator (+ comes first) for k, v in sorted(s2.items(), key=lambda x: (x[0].rstrip(' +'), x[0][-1])): v2, note2 = v if k.endswith('+'): key = k.rstrip(' +') # Find v1 in s2 first; it may have been defined locally too. v1, note1 = s2.get(key, s1.get(key, ("", ""))) newnote = ' [+] '.join((note1, note2)).strip() s2[key] = "\n".join((v1).split('\n') + v2.split('\n')), newnote del s2[k] elif k.endswith('-'): key = k.rstrip(' -') # Find v1 in s2 first; it may have been set by a += operation first v1, note1 = s2.get(key, s1.get(key, ("", ""))) newnote = ' [-] '.join((note1, note2)).strip() s2[key] = ("\n".join( [v for v in v1.split('\n') if v not in v2.split('\n')]), newnote) del s2[k] s1.update(s2) return s1 def _update(d1, d2): for section in d2: if section in d1: d1[section] = _update_section(d1[section], d2[section]) else: d1[section] = d2[section] return d1 def _recipe(options): recipe = options['recipe'] if ':' in recipe: recipe, entry = recipe.split(':') else: entry = 'default' return recipe, entry def _doing(): _, v, tb = sys.exc_info() message = str(v) doing = [] while tb is not None: d = tb.tb_frame.f_locals.get('__doing__') if d: doing.append(d) tb = tb.tb_next if doing: sys.stderr.write('While:\n') for d in doing: if not isinstance(d, str): d = d[0] % d[1:] sys.stderr.write(' %s\n' % d) def _error(*message): sys.stderr.write('Error: ' + ' '.join(message) +'\n') sys.exit(1) _internal_error_template = """ An internal error occurred due to a bug in either zc.buildout or in a recipe being used: """ def _check_for_unused_options_in_section(buildout, section): options = buildout[section] unused = [option for option in options._raw if option not in options._data] if unused: buildout._logger.warn("Unused options for %s: %s." 
% (section, ' '.join(map(repr, unused))) ) _early_release_initialization_code = """\ sys.argv.insert(1, 'buildout:accept-buildout-test-releases=true') print ('NOTE: Accepting early releases of build system packages. Rerun ' 'bootstrap without --accept-buildout-test-releases (-t) to return to ' 'default behavior.') """ _usage = """\ Usage: buildout [options] [assignments] [command [command arguments]] Options: -h, --help Print this message and exit. -v Increase the level of verbosity. This option can be used multiple times. -q Decrease the level of verbosity. This option can be used multiple times. -c config_file Specify the path to the buildout configuration file to be used. This defaults to the file named "buildout.cfg" in the current working directory. -t socket_timeout Specify the socket timeout in seconds. -U Don't read user defaults. -o Run in off-line mode. This is equivalent to the assignment buildout:offline=true. -O Run in non-off-line mode. This is equivalent to the assignment buildout:offline=false. This is the default buildout mode. The -O option would normally be used to override a true offline setting in a configuration file. -n Run in newest mode. This is equivalent to the assignment buildout:newest=true. With this setting, which is the default, buildout will try to find the newest versions of distributions available that satisfy its requirements. -N Run in non-newest mode. This is equivalent to the assignment buildout:newest=false. With this setting, buildout will not seek new distributions if installed distributions satisfy its requirements. -D Debug errors. If an error occurs, then the post-mortem debugger will be started. This is especially useful for debugging recipe problems. -s Squelch warnings about using an executable with a broken -S implementation. Assignments are of the form: section:option=value and are used to provide configuration options that override those given in the configuration file. For example, to run the buildout in offline mode, use buildout:offline=true. Options and assignments can be interspersed. Commands: install [parts] Install parts. If no command arguments are given, then the parts definition from the configuration file is used. Otherwise, the arguments specify the parts to be installed. Note that the semantics differ depending on whether any parts are specified. If parts are specified, then only those parts will be installed. If no parts are specified, then the parts specified by the buildout parts option will be installed along with all of their dependencies. bootstrap Create a new buildout in the current working directory, copying the buildout and setuptools eggs, and creating a basic directory structure and a buildout-local buildout script. init Initialize a buildout, creating a buildout.cfg file if it doesn't exist and then performing the same actions as for the buildout command. setup script [setup command and options] Run a given setup script, arranging that setuptools is in the script's path and that it has been imported so that setuptools-provided commands (like bdist_egg) can be used even if the setup script doesn't import setuptools itself. The script can be given either as a script path or a path to a directory containing a setup.py script. annotate Display annotated sections. All sections are displayed, sorted alphabetically. For each section, all key-value pairs are displayed, sorted alphabetically, along with the origin of the value (file name or COMPUTED_VALUE, DEFAULT_VALUE, COMMAND_LINE_VALUE). 
""" def _help(): print _usage sys.exit(0) def main(args=None): if args is None: args = sys.argv[1:] config_file = 'buildout.cfg' verbosity = 0 options = [] windows_restart = False user_defaults = True debug = False ignore_broken_dash_s = False while args: if args[0][0] == '-': op = orig_op = args.pop(0) op = op[1:] while op and op[0] in 'vqhWUoOnNDAs': if op[0] == 'v': verbosity += 10 elif op[0] == 'q': verbosity -= 10 elif op[0] == 'W': windows_restart = True elif op[0] == 'U': user_defaults = False elif op[0] == 'o': options.append(('buildout', 'offline', 'true')) elif op[0] == 'O': options.append(('buildout', 'offline', 'false')) elif op[0] == 'n': options.append(('buildout', 'newest', 'true')) elif op[0] == 'N': options.append(('buildout', 'newest', 'false')) elif op[0] == 'D': debug = True elif op[0] == 's': ignore_broken_dash_s = True else: _help() op = op[1:] if op[:1] in ('c', 't'): op_ = op[:1] op = op[1:] if op_ == 'c': if op: config_file = op else: if args: config_file = args.pop(0) else: _error("No file name specified for option", orig_op) elif op_ == 't': try: timeout = int(args.pop(0)) except IndexError: _error("No timeout value specified for option", orig_op) except ValueError: _error("No timeout value must be numeric", orig_op) import socket print 'Setting socket time out to %d seconds' % timeout socket.setdefaulttimeout(timeout) elif op: if orig_op == '--help': _help() _error("Invalid option", '-'+op[0]) elif '=' in args[0]: option, value = args.pop(0).split('=', 1) if len(option.split(':')) != 2: _error('Invalid option:', option) section, option = option.split(':') options.append((section.strip(), option.strip(), value.strip())) else: # We've run out of command-line options and option assignnemnts # The rest should be commands, so we'll stop here break if verbosity < 0 or ignore_broken_dash_s: broken_dash_S_filter_action = 'ignore' elif verbosity == 0: # This is the default. broken_dash_S_filter_action = 'once' else: broken_dash_S_filter_action = 'default' warnings.filterwarnings( broken_dash_S_filter_action, re.escape( zc.buildout.easy_install.BROKEN_DASH_S_WARNING), UserWarning) if verbosity: options.append(('buildout', 'verbosity', str(verbosity))) if args: command = args.pop(0) if command not in ( 'install', 'bootstrap', 'runsetup', 'setup', 'init', 'annotate', ): _error('invalid command:', command) else: command = 'install' try: try: buildout = Buildout(config_file, options, user_defaults, windows_restart, command, args) getattr(buildout, command)(args) except Exception, v: _doing() exc_info = sys.exc_info() import pdb, traceback if debug: traceback.print_exception(*exc_info) sys.stderr.write('\nStarting pdb:\n') pdb.post_mortem(exc_info[2]) else: if isinstance(v, (zc.buildout.UserError, distutils.errors.DistutilsError, ) ): _error(str(v)) else: sys.stderr.write(_internal_error_template) traceback.print_exception(*exc_info) sys.exit(1) finally: logging.shutdown() if sys.version_info[:2] < (2, 4): def reversed(iterable): result = list(iterable); result.reverse() return result zc.buildout-1.7.1/src/zc/buildout/buildout.txt0000644000076500007650000025344012111414155021001 0ustar jimjim00000000000000Buildouts ========= The word "buildout" refers to a description of a set of parts and the software to create and assemble them. It is often used informally to refer to an installed system based on a buildout definition. 
For example, if we are creating an application named "Foo", then "the Foo buildout" is the collection of configuration and application-specific software that allows an instance of the application to be created. We may refer to such an instance of the application informally as "a Foo buildout". This document describes how to define buildouts using buildout configuration files and recipes. There are three ways to set up the buildout software and create a buildout instance: 1. Install the zc.buildout egg with easy_install and use the buildout script installed in a Python scripts area. 2. Use the buildout bootstrap script to create a buildout that includes both the setuptools and zc.buildout eggs. This allows you to use the buildout software without modifying a Python install. The buildout script is installed into your buildout local scripts area. 3. Use a buildout command from an already installed buildout to bootstrap a new buildout. (See the section on bootstraping later in this document.) Often, a software project will be managed in a software repository, such as a subversion repository, that includes some software source directories, buildout configuration files, and a copy of the buildout bootstrap script. To work on the project, one would check out the project from the repository and run the bootstrap script which installs setuptools and zc.buildout into the checkout as well as any parts defined. We have a sample buildout that we created using the bootstrap command of an existing buildout (method 3 above). It has the absolute minimum information. We have bin, develop-eggs, eggs and parts directories, and a configuration file: >>> ls(sample_buildout) d bin - buildout.cfg d develop-eggs d eggs d parts The bin directory contains scripts. >>> ls(sample_buildout, 'bin') - buildout >>> ls(sample_buildout, 'eggs') - setuptools-0.6-py2.4.egg - zc.buildout-1.0-py2.4.egg The develop-eggs directory is initially empty: >>> ls(sample_buildout, 'develop-eggs') The develop-eggs directory holds egg links for software being developed in the buildout. We separate develop-eggs and other eggs to allow eggs directories to be shared across multiple buildouts. For example, a common developer technique is to define a common eggs directory in their home that all non-develop eggs are stored in. This allows larger buildouts to be set up much more quickly and saves disk space. The parts directory just contains some helpers for the buildout script itself. >>> ls(sample_buildout, 'parts') d buildout The parts directory provides an area where recipes can install part data. For example, if we built a custom Python, we would install it in the part directory. Part data is stored in a sub-directory of the parts directory with the same name as the part. Buildouts are defined using configuration files. These are in the format defined by the Python ConfigParser module, with extensions that we'll describe later. By default, when a buildout is run, it looks for the file buildout.cfg in the directory where the buildout is run. The minimal configuration file has a buildout section that defines no parts: >>> cat(sample_buildout, 'buildout.cfg') [buildout] parts = A part is simply something to be created by a buildout. It can be almost anything, such as a Python package, a program, a directory, or even a configuration file. Recipes ------- A part is created by a recipe. Recipes are always installed as Python eggs. 
They can be downloaded from a package server, such as the Python Package Index, or they can be developed as part of a project using a "develop" egg. A develop egg is a special kind of egg that gets installed as an "egg link" that contains the name of a source directory. Develop eggs don't have to be packaged for distribution to be used and can be modified in place, which is especially useful while they are being developed. Let's create a recipe as part of the sample project. We'll create a recipe for creating directories. First, we'll create a recipes source directory for our local recipes: >>> mkdir(sample_buildout, 'recipes') and then we'll create a source file for our mkdir recipe: >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import logging, os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... options['path'] = os.path.join( ... buildout['buildout']['directory'], ... options['path'], ... ) ... if not os.path.isdir(os.path.dirname(options['path'])): ... logging.getLogger(self.name).error( ... 'Cannot create %s. %s is not a directory.', ... options['path'], os.path.dirname(options['path'])) ... raise zc.buildout.UserError('Invalid Path') ... ... ... def install(self): ... path = self.options['path'] ... logging.getLogger(self.name).info( ... 'Creating directory %s', os.path.basename(path)) ... os.mkdir(path) ... return path ... ... def update(self): ... pass ... """) Currently, recipes must define 3 methods [#future_recipe_methods]_: - a constructor, - an install method, and - an update method. The constructor is responsible for updating a parts options to reflect data read from other sections. The buildout system keeps track of whether a part specification has changed. A part specification has changed if it's options, after adjusting for data read from other sections, has changed, or if the recipe has changed. Only the options for the part are considered. If data are read from other sections, then that information has to be reflected in the parts options. In the Mkdir example, the given path is interpreted relative to the buildout directory, and data from the buildout directory is read. The path option is updated to reflect this. If the directory option was changed in the buildout sections, we would know to update parts created using the mkdir recipe using relative path names. When buildout is run, it saves configuration data for installed parts in a file named ".installed.cfg". In subsequent runs, it compares part-configuration data stored in the .installed.cfg file and the part-configuration data loaded from the configuration files as modified by recipe constructors to decide if the configuration of a part has changed. If the configuration has changed, or if the recipe has changed, then the part is uninstalled and reinstalled. The buildout only looks at the part's options, so any data used to configure the part needs to be reflected in the part's options. It is the job of a recipe constructor to make sure that the options include all relevant data. Of course, parts are also uninstalled if they are no-longer used. The recipe defines a constructor that takes a buildout object, a part name, and an options dictionary. It saves them in instance attributes. If the path is relative, we'll interpret it as relative to the buildout directory. The buildout object passed in is a mapping from section name to a mapping of options for that section. 
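For example, the Mkdir constructor above reaches through this mapping to get the buildout directory. Stripped of its error checking (and leaving out the install and update methods), the core of that constructor looks like this; it is a condensed restatement of the recipe shown earlier, not new behavior::

    import os

    class Mkdir:

        def __init__(self, buildout, name, options):
            self.name, self.options = name, options
            # buildout behaves like a mapping of mappings:
            # buildout[section_name][option_name] -> string value
            buildout_directory = buildout['buildout']['directory']
            # Re-base the part's path option on the buildout directory and
            # write it back, so the change is reflected in the part's
            # recorded options.
            options['path'] = os.path.join(
                buildout_directory, options['path'])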
The buildout directory is available as the directory option of the buildout section. We normalize the path and save it back into the options directory. The install method is responsible for creating the part. In this case, we need the path of the directory to create. We'll use a path option from our options dictionary. The install method logs what it's doing using the Python logging call. We return the path that we installed. If the part is uninstalled or reinstalled, then the path returned will be removed by the buildout machinery. A recipe install method is expected to return a string, or an iterable of strings containing paths to be removed if a part is uninstalled. For most recipes, this is all of the uninstall support needed. For more complex uninstallation scenarios use `Uninstall recipes`_. The update method is responsible for updating an already installed part. An empty method is often provided, as in this example, if parts can't be updated. An update method can return None, a string, or an iterable of strings. If a string or iterable of strings is returned, then the saved list of paths to be uninstalled is updated with the new information by adding any new files returned by the update method. We need to provide packaging information so that our recipe can be installed as a develop egg. The minimum information we need to specify [#packaging_info]_ is a name. For recipes, we also need to define the names of the recipe classes as entry points. Packaging information is provided via a setup.py script: >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... ... setup( ... name = "recipes", ... entry_points = {'zc.buildout': ['mkdir = mkdir:Mkdir']}, ... ) ... """) Our setup script defines an entry point. Entry points provide a way for an egg to define the services it provides. Here we've said that we define a zc.buildout entry point named mkdir. Recipe classes must be exposed as entry points in the zc.buildout group. we give entry points names within the group. We also need a README.txt for our recipes to avoid an annoying warning from distutils, on which setuptools and zc.buildout are based: >>> write(sample_buildout, 'recipes', 'README.txt', " ") Now let's update our buildout.cfg: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = mystuff ... """) Let's go through the changes one by one:: develop = recipes This tells the buildout to install a development egg for our recipes. Any number of paths can be listed. The paths can be relative or absolute. If relative, they are treated as relative to the buildout directory. They can be directory or file paths. If a file path is given, it should point to a Python setup script. If a directory path is given, it should point to a directory containing a setup.py file. Development eggs are installed before building any parts, as they may provide locally-defined recipes needed by the parts. :: parts = data-dir Here we've named a part to be "built". We can use any name we want except that different part names must be unique and recipes will often use the part name to decide what to do. :: [data-dir] recipe = recipes:mkdir path = mystuff When we name a part, we also create a section of the same name that contains part data. In this section, we'll define the recipe to be used to install the part. In this case, we also specify the path to be created. Let's run the buildout. 
We do so by running the build script in the buildout: >>> import os >>> os.chdir(sample_buildout) >>> buildout = os.path.join(sample_buildout, 'bin', 'buildout') >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory mystuff We see that the recipe created the directory, as expected: >>> ls(sample_buildout) - .installed.cfg d bin - buildout.cfg d develop-eggs d eggs d mystuff d parts d recipes In addition, .installed.cfg has been created containing information about the part we installed: >>> cat(sample_buildout, '.installed.cfg') [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = data-dir [data-dir] __buildout_installed__ = /sample-buildout/mystuff __buildout_signature__ = recipes-c7vHV6ekIDUPy/7fjAaYjg== path = /sample-buildout/mystuff recipe = recipes:mkdir Note that the directory we installed is included in .installed.cfg. In addition, the path option includes the actual destination directory. If we change the name of the directory in the configuration file, we'll see that the directory gets removed and recreated: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling data-dir. Installing data-dir. data-dir: Creating directory mydata >>> ls(sample_buildout) - .installed.cfg d bin - buildout.cfg d develop-eggs d eggs d mydata d parts d recipes If any of the files or directories created by a recipe are removed, the part will be reinstalled: >>> rmdir(sample_buildout, 'mydata') >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling data-dir. Installing data-dir. data-dir: Creating directory mydata Error reporting --------------- If a user makes an error, an error needs to be printed and work needs to stop. This is accomplished by logging a detailed error message and then raising a (or an instance of a subclass of a) zc.buildout.UserError exception. Raising an error other than a UserError still displays the error, but labels it as a bug in the buildout software or recipe. In the sample above, of someone gives a non-existent directory to create the directory in: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = /xxx/mydata ... """) We'll get a user error, not a traceback. >>> print system(buildout), Develop: '/sample-buildout/recipes' data-dir: Cannot create /xxx/mydata. /xxx is not a directory. While: Installing. Getting section data-dir. Initializing part data-dir. Error: Invalid Path Recipe Error Handling --------------------- If an error occurs during installation, it is up to the recipe to clean up any system side effects, such as files created. Let's update the mkdir recipe to support multiple paths: >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import logging, os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... ... # Normalize paths and check that their parent ... # directories exist: ... paths = [] ... for path in options['path'].split(): ... path = os.path.join(buildout['buildout']['directory'], path) ... if not os.path.isdir(os.path.dirname(path)): ... logging.getLogger(self.name).error( ... 'Cannot create %s. %s is not a directory.', ... 
options['path'], os.path.dirname(options['path'])) ... raise zc.buildout.UserError('Invalid Path') ... paths.append(path) ... options['path'] = ' '.join(paths) ... ... def install(self): ... paths = self.options['path'].split() ... for path in paths: ... logging.getLogger(self.name).info( ... 'Creating directory %s', os.path.basename(path)) ... os.mkdir(path) ... return paths ... ... def update(self): ... pass ... """) .. >>> clean_up_pyc(sample_buildout, 'recipes', 'mkdir.py') If there is an error creating a path, the install method will exit and leave previously created paths in place: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = foo bin ... """) >>> print system(buildout), # doctest: +ELLIPSIS Develop: '/sample-buildout/recipes' Uninstalling data-dir. Installing data-dir. data-dir: Creating directory foo data-dir: Creating directory bin While: Installing data-dir. An internal error occurred due to a bug in either zc.buildout or in a recipe being used: Traceback (most recent call last): ... OSError: [Errno 17] File exists: '/sample-buildout/bin' We meant to create a directory bins, but typed bin. Now foo was left behind. >>> os.path.exists('foo') True If we fix the typo: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = foo bins ... """) >>> print system(buildout), # doctest: +ELLIPSIS Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory foo While: Installing data-dir. An internal error occurred due to a bug in either zc.buildout or in a recipe being used: Traceback (most recent call last): ... OSError: [Errno 17] File exists: '/sample-buildout/foo' Now they fail because foo exists, because it was left behind. >>> remove('foo') Let's fix the recipe: >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import logging, os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... ... # Normalize paths and check that their parent ... # directories exist: ... paths = [] ... for path in options['path'].split(): ... path = os.path.join(buildout['buildout']['directory'], path) ... if not os.path.isdir(os.path.dirname(path)): ... logging.getLogger(self.name).error( ... 'Cannot create %s. %s is not a directory.', ... options['path'], os.path.dirname(options['path'])) ... raise zc.buildout.UserError('Invalid Path') ... paths.append(path) ... options['path'] = ' '.join(paths) ... ... def install(self): ... paths = self.options['path'].split() ... created = [] ... try: ... for path in paths: ... logging.getLogger(self.name).info( ... 'Creating directory %s', os.path.basename(path)) ... os.mkdir(path) ... created.append(path) ... except: ... for d in created: ... os.rmdir(d) ... raise ... ... return paths ... ... def update(self): ... pass ... """) .. >>> clean_up_pyc(sample_buildout, 'recipes', 'mkdir.py') And put back the typo: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = foo bin ... """) When we rerun the buildout: >>> print system(buildout), # doctest: +ELLIPSIS Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory foo data-dir: Creating directory bin While: Installing data-dir. 
An internal error occurred due to a bug in either zc.buildout or in a recipe being used: Traceback (most recent call last): ... OSError: [Errno 17] File exists: '/sample-buildout/bin' we get the same error, but we don't get the directory left behind: >>> os.path.exists('foo') False It's critical that recipes clean up partial effects when errors occur. Because recipes most commonly create files and directories, buildout provides a helper API for removing created files when an error occurs. Option objects have a created method that can be called to record files as they are created. If the install or update method returns with an error, then any registered paths are removed automatically. The method returns the files registered and can be used to return the files created. Let's use this API to simplify the recipe: >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import logging, os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... ... # Normalize paths and check that their parent ... # directories exist: ... paths = [] ... for path in options['path'].split(): ... path = os.path.join(buildout['buildout']['directory'], path) ... if not os.path.isdir(os.path.dirname(path)): ... logging.getLogger(self.name).error( ... 'Cannot create %s. %s is not a directory.', ... options['path'], os.path.dirname(options['path'])) ... raise zc.buildout.UserError('Invalid Path') ... paths.append(path) ... options['path'] = ' '.join(paths) ... ... def install(self): ... paths = self.options['path'].split() ... for path in paths: ... logging.getLogger(self.name).info( ... 'Creating directory %s', os.path.basename(path)) ... os.mkdir(path) ... self.options.created(path) ... ... return self.options.created() ... ... def update(self): ... pass ... """) .. >>> clean_up_pyc(sample_buildout, 'recipes', 'mkdir.py') We returned by calling created, taking advantage of the fact that it returns the registered paths. We did this for illustrative purposes. It would be simpler just to return the paths as before. If we rerun the buildout, again, we'll get the error and no directories will be created: >>> print system(buildout), # doctest: +ELLIPSIS Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory foo data-dir: Creating directory bin While: Installing data-dir. An internal error occurred due to a bug in either zc.buildout or in a recipe being used: Traceback (most recent call last): ... OSError: [Errno 17] File exists: '/sample-buildout/bin' >>> os.path.exists('foo') False Now, we'll fix the typo again and we'll get the directories we expect: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = foo bins ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory foo data-dir: Creating directory bins >>> os.path.exists('foo') True >>> os.path.exists('bins') True Configuration file syntax ------------------------- As mentioned earlier, buildout configuration files use the format defined by the Python ConfigParser module with extensions. The extensions are: - option names are case sensitive - option values can use a substitution syntax, described below, to refer to option values in specific sections. - option values can be appended or removed using the - and + operators. The ConfigParser syntax is very flexible. 
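Here is a small fragment showing the first two extensions in use (the section and option names are invented for illustration; the + and - operators are demonstrated later in this document)::

    [dirs]
    # Option names are case sensitive, so Data and data would be two
    # different options.
    Data = var/data

    [log-dir]
    recipe = recipes:mkdir
    # A value from another section is referenced as ${section:option}.
    path = ${dirs:Data}/log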
Section names can contain any characters other than newlines and right square braces ("]"). Option names can contain any characters other than newlines, colons, and equal signs, can not start with a space, and don't include trailing spaces. It is likely that, in the future, some characters will be given special buildout-defined meanings. This is already true of the characters ":", "$", "%", "(", and ")". For now, it is a good idea to keep section and option names simple, sticking to alphanumeric characters, hyphens, and periods. Annotated sections ------------------ When used with the `annotate` command, buildout displays annotated sections. All sections are displayed, sorted alphabetically. For each section, all key-value pairs are displayed, sorted alphabetically, along with the origin of the value (file name or COMPUTED_VALUE, DEFAULT_VALUE, COMMAND_LINE_VALUE). >>> print system(buildout+ ' annotate'), ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE Annotated sections ================== [buildout] accept-buildout-test-releases= false DEFAULT_VALUE allow-hosts= * DEFAULT_VALUE allow-picked-versions= true DEFAULT_VALUE allowed-eggs-from-site-packages= * DEFAULT_VALUE bin-directory= bin DEFAULT_VALUE develop= recipes /sample-buildout/buildout.cfg develop-eggs-directory= develop-eggs DEFAULT_VALUE directory= /sample-buildout COMPUTED_VALUE eggs-directory= eggs DEFAULT_VALUE exec-sitecustomize= true DEFAULT_VALUE executable= ... DEFAULT_VALUE find-links= DEFAULT_VALUE include-site-packages= true DEFAULT_VALUE install-from-cache= false DEFAULT_VALUE installed= .installed.cfg DEFAULT_VALUE log-format= DEFAULT_VALUE log-level= INFO DEFAULT_VALUE newest= true DEFAULT_VALUE offline= false DEFAULT_VALUE parts= data-dir /sample-buildout/buildout.cfg parts-directory= parts DEFAULT_VALUE prefer-final= false DEFAULT_VALUE python= buildout DEFAULT_VALUE relative-paths= false DEFAULT_VALUE socket-timeout= DEFAULT_VALUE unzip= false DEFAULT_VALUE use-dependency-links= true DEFAULT_VALUE [data-dir] path= foo bins /sample-buildout/buildout.cfg recipe= recipes:mkdir /sample-buildout/buildout.cfg Variable substitutions ---------------------- Buildout configuration files support variable substitution. To illustrate this, we'll create an debug recipe to allow us to see interactions with the buildout: >>> write(sample_buildout, 'recipes', 'debug.py', ... """ ... class Debug: ... ... def __init__(self, buildout, name, options): ... self.buildout = buildout ... self.name = name ... self.options = options ... ... def install(self): ... items = self.options.items() ... items.sort() ... for option, value in items: ... print option, value ... return () ... ... update = install ... """) This recipe doesn't actually create anything. The install method doesn't return anything, because it didn't create any files or directories. We also have to update our setup script: >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... entry_points = ( ... ''' ... [zc.buildout] ... mkdir = mkdir:Mkdir ... debug = debug:Debug ... ''') ... setup(name="recipes", entry_points=entry_points) ... """) We've rearranged the script a bit to make the entry points easier to edit. In particular, entry points are now defined as a configuration string, rather than a dictionary. Let's update our configuration to provide variable substitution examples: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir debug ... log-level = INFO ... ... [debug] ... 
recipe = recipes:debug ... File 1 = ${data-dir:path}/file ... File 2 = ${debug:File 1}/log ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) We used a string-template substitution for File 1 and File 2. This type of substitution uses the string.Template syntax. Names substituted are qualified option names, consisting of a section name and option name joined by a colon. Now, if we run the buildout, we'll see the options with the values substituted. >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling data-dir. Installing data-dir. data-dir: Creating directory mydata Installing debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log recipe recipes:debug Note that the substitution of the data-dir path option reflects the update to the option performed by the mkdir recipe. It might seem surprising that mydata was created again. This is because we changed our recipes package by adding the debug module. The buildout system didn't know if this module could effect the mkdir recipe, so it assumed it could and reinstalled mydata. If we rerun the buildout: >>> print system(buildout), Develop: '/sample-buildout/recipes' Updating data-dir. Updating debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log recipe recipes:debug We can see that mydata was not recreated. Note that, in this case, we didn't specify a log level, so we didn't get output about what the buildout was doing. Section and option names in variable substitutions are only allowed to contain alphanumeric characters, hyphens, periods and spaces. This restriction might be relaxed in future releases. We can ommit the section name in a variable substitution to refer to the current section. We can also use the special option, _buildout_section_name_ to get the current section name. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir debug ... log-level = INFO ... ... [debug] ... recipe = recipes:debug ... File 1 = ${data-dir:path}/file ... File 2 = ${:File 1}/log ... my_name = ${:_buildout_section_name_} ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Updating data-dir. Installing debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log my_name debug recipe recipes:debug Automatic part selection and ordering ------------------------------------- When a section with a recipe is referred to, either through variable substitution or by an initializing recipe, the section is treated as a part and added to the part list before the referencing part. For example, we can leave data-dir out of the parts list: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... log-level = INFO ... ... [debug] ... recipe = recipes:debug ... File 1 = ${data-dir:path}/file ... File 2 = ${debug:File 1}/log ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) It will still be treated as a part: >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Updating data-dir. Installing debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log recipe recipes:debug >>> cat('.installed.cfg') # doctest: +ELLIPSIS [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = data-dir debug ... 
Note that the data-dir part is included *before* the debug part, because the debug part refers to the data-dir part. Even if we list the data-dir part after the debug part, it will be included before: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug data-dir ... log-level = INFO ... ... [debug] ... recipe = recipes:debug ... File 1 = ${data-dir:path}/file ... File 2 = ${debug:File 1}/log ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) It will still be treated as a part: >>> print system(buildout), Develop: '/sample-buildout/recipes' Updating data-dir. Updating debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log recipe recipes:debug >>> cat('.installed.cfg') # doctest: +ELLIPSIS [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = data-dir debug ... Extending sections (macros) --------------------------- A section (other than the buildout section) can extend one or more other sections using the ``<=`` option. Options from the referenced sections are copied to the refering section *before* variable substitution. This, together with the ability to refer to variables of the current section allows sections to be used as macros. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = myfiles ... log-level = INFO ... ... [debug] ... recipe = recipes:debug ... ... [with_file1] ... <= debug ... file1 = ${:path}/file1 ... color = red ... ... [with_file2] ... <= debug ... file2 = ${:path}/file2 ... color = blue ... ... [myfiles] ... <= with_file1 ... with_file2 ... path = mydata ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Uninstalling data-dir. Installing myfiles. color blue file1 mydata/file1 file2 mydata/file2 path mydata recipe recipes:debug In this example, the debug, with_file1 and with_file2 sections act as macros. In particular, the variable substitutions are performed relative to the myfiles section. Adding and removing options --------------------------- We can append and remove values to an option by using the + and - operators. This is illustrated below; first we define a base configuration. >>> write(sample_buildout, 'base.cfg', ... """ ... [buildout] ... parts = part1 part2 part3 ... ... [part1] ... recipe = ... option = a1 a2 ... ... [part2] ... recipe = ... option = b1 ... b2 ... b3 ... b4 ... ... [part3] ... recipe = ... option = c1 c2 ... ... [part4] ... recipe = ... option = d2 ... d3 ... d5 ... ... """) Extending this configuration, we can "adjust" the values set in the base configuration file. >>> write(sample_buildout, 'extension1.cfg', ... """ ... [buildout] ... extends = base.cfg ... ... # appending values ... [part1] ... option += a3 a4 ... ... # removing values ... [part2] ... option -= b1 ... b2 ... ... # alt. spelling ... [part3] ... option+=c3 c4 c5 ... ... # combining both adding and removing ... [part4] ... option += d1 ... d4 ... option -= d5 ... ... # normal assignment ... [part5] ... option = h1 h2 ... """) An additional extension. >>> write(sample_buildout, 'extension2.cfg', ... """ ... [buildout] ... extends = extension1.cfg ... ... # appending values ... [part1] ... option += a5 ... ... # removing values ... [part2] ... option -= b1 ... b2 ... b3 ... ... """) To verify that the options are adjusted correctly, we'll set up an extension that prints out the options. 
>>> mkdir(sample_buildout, 'demo') >>> write(sample_buildout, 'demo', 'demo.py', ... """ ... def ext(buildout): ... print [part['option'] for name, part in buildout.items() \ ... if name.startswith('part')] ... """) >>> write(sample_buildout, 'demo', 'setup.py', ... """ ... from setuptools import setup ... ... setup( ... name="demo", ... entry_points={'zc.buildout.extension': ['ext = demo:ext']}, ... ) ... """) Set up a buildout configuration for this extension. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = demo ... parts = ... """) >>> os.chdir(sample_buildout) >>> print system(os.path.join(sample_buildout, 'bin', 'buildout')), # doctest: +ELLIPSIS Develop: '/sample-buildout/demo' Uninstalling myfiles. Getting distribution for 'recipes'. zip_safe flag not set; analyzing archive contents... Got recipes 0.0.0. warning: install_lib: 'build/lib...' does not exist -- no Python modules to install Verify option values. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = demo ... extensions = demo ... extends = extension2.cfg ... """) >>> print system(os.path.join('bin', 'buildout')), ['a1 a2/na3 a4/na5', 'b4', 'c1 c2/nc3 c4 c5', 'd2/nd3/nd1/nd4', 'h1 h2'] Develop: '/sample-buildout/demo' Annotated sections output shows which files are responsible for which operations. >>> print system(os.path.join('bin', 'buildout') + ' annotate'), ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE Annotated sections ================== ... [part1] option= a1 a2 a3 a4 a5 /sample-buildout/base.cfg += /sample-buildout/extension1.cfg += /sample-buildout/extension2.cfg recipe= /sample-buildout/base.cfg [part2] option= b4 /sample-buildout/base.cfg -= /sample-buildout/extension1.cfg -= /sample-buildout/extension2.cfg recipe= /sample-buildout/base.cfg [part3] option= c1 c2 c3 c4 c5 /sample-buildout/base.cfg += /sample-buildout/extension1.cfg recipe= /sample-buildout/base.cfg [part4] option= d2 d3 d1 d4 /sample-buildout/base.cfg += /sample-buildout/extension1.cfg -= /sample-buildout/extension1.cfg recipe= /sample-buildout/base.cfg [part5] option= h1 h2 /sample-buildout/extension1.cfg Cleanup. >>> os.remove(os.path.join(sample_buildout, 'base.cfg')) >>> os.remove(os.path.join(sample_buildout, 'extension1.cfg')) >>> os.remove(os.path.join(sample_buildout, 'extension2.cfg')) Multiple configuration files ---------------------------- A configuration file can "extend" another configuration file. Options are read from the other configuration file if they aren't already defined by your configuration file. The configuration files your file extends can extend other configuration files. The same file may be used more than once although, of course, cycles aren't allowed. To see how this works, we use an example: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... extends = base.cfg ... ... [debug] ... op = buildout ... """) >>> write(sample_buildout, 'base.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... op = base ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing debug. op buildout recipe recipes:debug The example is pretty trivial, but the pattern it illustrates is pretty common. In a more practical example, the base buildout might represent a product and the extending buildout might be a customization. Here is a more elaborate example. >>> other = tmpdir('other') >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... 
extends = b1.cfg b2.cfg %(b3)s ... ... [debug] ... op = buildout ... """ % dict(b3=os.path.join(other, 'b3.cfg'))) >>> write(sample_buildout, 'b1.cfg', ... """ ... [buildout] ... extends = base.cfg ... ... [debug] ... op1 = b1 1 ... op2 = b1 2 ... """) >>> write(sample_buildout, 'b2.cfg', ... """ ... [buildout] ... extends = base.cfg ... ... [debug] ... op2 = b2 2 ... op3 = b2 3 ... """) >>> write(other, 'b3.cfg', ... """ ... [buildout] ... extends = b3base.cfg ... ... [debug] ... op4 = b3 4 ... """) >>> write(other, 'b3base.cfg', ... """ ... [debug] ... op5 = b3base 5 ... """) >>> write(sample_buildout, 'base.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... name = base ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. name base op buildout op1 b1 1 op2 b2 2 op3 b2 3 op4 b3 4 op5 b3base 5 recipe recipes:debug There are several things to note about this example: - We can name multiple files in an extends option. - We can reference files recursively. - Relative file names in extended options are interpreted relative to the directory containing the referencing configuration file. Loading Configuration from URLs ------------------------------- Configuration files can be loaded from URLs. To see how this works, we'll set up a web server with some configuration files. >>> server_data = tmpdir('server_data') >>> write(server_data, "r1.cfg", ... """ ... [debug] ... op1 = r1 1 ... op2 = r1 2 ... """) >>> write(server_data, "r2.cfg", ... """ ... [buildout] ... extends = r1.cfg ... ... [debug] ... op2 = r2 2 ... op3 = r2 3 ... """) >>> server_url = start_server(server_data) >>> write('client.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... extends = %(url)s/r2.cfg ... ... [debug] ... recipe = recipes:debug ... name = base ... """ % dict(url=server_url)) >>> print system(buildout+ ' -c client.cfg'), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. name base op1 r1 1 op2 r2 2 op3 r2 3 recipe recipes:debug Here we specified a URL for the file we extended. The file we downloaded, itself referred to a file on the server using a relative URL reference. Relative references are interpreted relative to the base URL when they appear in configuration files loaded via URL. We can also specify a URL as the configuration file to be used by a buildout. >>> os.remove('client.cfg') >>> write(server_data, 'remote.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... extends = r2.cfg ... ... [debug] ... recipe = recipes:debug ... name = remote ... """) >>> print system(buildout + ' -c ' + server_url + '/remote.cfg'), While: Initializing. Error: Missing option: buildout:directory Normally, the buildout directory defaults to directory containing a configuration file. This won't work for configuration files loaded from URLs. In this case, the buildout directory would normally be defined on the command line: >>> print system(buildout ... + ' -c ' + server_url + '/remote.cfg' ... + ' buildout:directory=' + sample_buildout ... ), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. name remote op1 r1 1 op2 r2 2 op3 r2 3 recipe recipes:debug User defaults ------------- If the file $HOME/.buildout/default.cfg, exists, it is read before reading the configuration file. ($HOME is the value of the HOME environment variable. The '/' is replaced by the operating system file delimiter.) 
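A common use of this file is to hold settings that should apply to all of your buildouts, such as the shared eggs directory mentioned earlier (the path below is only an example)::

    # $HOME/.buildout/default.cfg
    [buildout]
    eggs-directory = /home/me/.buildout/eggs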
>>> old_home = os.environ['HOME'] >>> home = tmpdir('home') >>> mkdir(home, '.buildout') >>> write(home, '.buildout', 'default.cfg', ... """ ... [debug] ... op1 = 1 ... op7 = 7 ... """) >>> os.environ['HOME'] = home >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. name base op buildout op1 b1 1 op2 b2 2 op3 b2 3 op4 b3 4 op5 b3base 5 op7 7 recipe recipes:debug A buildout command-line argument, -U, can be used to suppress reading user defaults: >>> print system(buildout + ' -U'), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. name base op buildout op1 b1 1 op2 b2 2 op3 b2 3 op4 b3 4 op5 b3base 5 recipe recipes:debug >>> os.environ['HOME'] = old_home Log level --------- We can control the level of logging by specifying a log level in out configuration file. For example, so suppress info messages, we can set the logging level to WARNING >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... log-level = WARNING ... extends = b1.cfg b2.cfg ... """) >>> print system(buildout), name base op1 b1 1 op2 b2 2 op3 b2 3 recipe recipes:debug Uninstall recipes ----------------- As we've seen, when parts are installed, buildout keeps track of files and directories that they create. When the parts are uninstalled these files and directories are deleted. Sometimes more clean up is needed. For example, a recipe might add a system service by calling chkconfig --add during installation. Later during uninstallation, chkconfig --del will need to be called to remove the system service. In order to deal with these uninstallation issues, you can register uninstall recipes. Uninstall recipes are registered using the 'zc.buildout.uninstall' entry point. Parts specify uninstall recipes using the 'uninstall' option. In comparison to regular recipes, uninstall recipes are much simpler. They are simply callable objects that accept the name of the part to be uninstalled and the part's options dictionary. Uninstall recipes don't have access to the part itself since it maybe not be able to be instantiated at uninstallation time. Here's a recipe that simulates installation of a system service, along with an uninstall recipe that simulates removing the service. >>> write(sample_buildout, 'recipes', 'service.py', ... """ ... class Service: ... ... def __init__(self, buildout, name, options): ... self.buildout = buildout ... self.name = name ... self.options = options ... ... def install(self): ... print "chkconfig --add %s" % self.options['script'] ... return () ... ... def update(self): ... pass ... ... ... def uninstall_service(name, options): ... print "chkconfig --del %s" % options['script'] ... """) To use these recipes we must register them using entry points. Make sure to use the same name for the recipe and uninstall recipe. This is required to let buildout know which uninstall recipe goes with which recipe. >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... entry_points = ( ... ''' ... [zc.buildout] ... mkdir = mkdir:Mkdir ... debug = debug:Debug ... service = service:Service ... ... [zc.buildout.uninstall] ... service = service:uninstall_service ... ''') ... setup(name="recipes", entry_points=entry_points) ... """) Here's how these recipes could be used in a buildout: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = service ... ... [service] ... recipe = recipes:service ... script = /path/to/script ... 
""") When the buildout is run the service will be installed >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling debug. Installing service. chkconfig --add /path/to/script The service has been installed. If the buildout is run again with no changes, the service shouldn't be changed. >>> print system(buildout) Develop: '/sample-buildout/recipes' Updating service. Now we change the service part to trigger uninstallation and re-installation. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = service ... ... [service] ... recipe = recipes:service ... script = /path/to/a/different/script ... """) >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling service. Running uninstall recipe. chkconfig --del /path/to/script Installing service. chkconfig --add /path/to/a/different/script Now we remove the service part, and add another part. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... """) >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling service. Running uninstall recipe. chkconfig --del /path/to/a/different/script Installing debug. recipe recipes:debug Uninstall recipes don't have to take care of removing all the files and directories created by the part. This is still done automatically, following the execution of the uninstall recipe. An upshot is that an uninstallation recipe can access files and directories created by a recipe before they are deleted. For example, here's an uninstallation recipe that simulates backing up a directory before it is deleted. It is designed to work with the mkdir recipe introduced earlier. >>> write(sample_buildout, 'recipes', 'backup.py', ... """ ... import os ... def backup_directory(name, options): ... path = options['path'] ... size = len(os.listdir(path)) ... print "backing up directory %s of size %s" % (path, size) ... """) It must be registered with the zc.buildout.uninstall entry point. Notice how it is given the name 'mkdir' to associate it with the mkdir recipe. >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... entry_points = ( ... ''' ... [zc.buildout] ... mkdir = mkdir:Mkdir ... debug = debug:Debug ... service = service:Service ... ... [zc.buildout.uninstall] ... uninstall_service = service:uninstall_service ... mkdir = backup:backup_directory ... ''') ... setup(name="recipes", entry_points=entry_points) ... """) Now we can use it with a mkdir part. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = dir debug ... ... [dir] ... recipe = recipes:mkdir ... path = my_directory ... ... [debug] ... recipe = recipes:debug ... """) Run the buildout to install the part. >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling debug. Installing dir. dir: Creating directory my_directory Installing debug. recipe recipes:debug Now we remove the part from the configuration file. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... """) When the buildout is run the part is removed, and the uninstall recipe is run before the directory is deleted. >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling dir. Running uninstall recipe. backing up directory /sample-buildout/my_directory of size 0 Updating debug. 
recipe recipes:debug Now we will return the registration to normal for the benefit of the rest of the examples. >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... entry_points = ( ... ''' ... [zc.buildout] ... mkdir = mkdir:Mkdir ... debug = debug:Debug ... ''') ... setup(name="recipes", entry_points=entry_points) ... """) Command-line usage ------------------ A number of arguments can be given on the buildout command line. The command usage is:: buildout [options and assignments] [command [command arguments]] The following options are supported: -h (or --help) Print basic usage information. If this option is used, then all other options are ignored. -c filename The -c option can be used to specify a configuration file, rather than buildout.cfg in the current directory. -t socket_timeout Specify the socket timeout in seconds. -v Increment the verbosity by 10. The verbosity is used to adjust the logging level. The verbosity is subtracted from the numeric value of the log-level option specified in the configuration file. -q Decrement the verbosity by 10. -U Don't read user-default configuration. -o Run in off-line mode. This is equivalent to the assignment buildout:offline=true. -O Run in non-off-line mode. This is equivalent to the assignment buildout:offline=false. This is the default buildout mode. The -O option would normally be used to override a true offline setting in a configuration file. -n Run in newest mode. This is equivalent to the assignment buildout:newest=true. With this setting, which is the default, buildout will try to find the newest versions of distributions available that satisfy its requirements. -N Run in non-newest mode. This is equivalent to the assignment buildout:newest=false. With this setting, buildout will not seek new distributions if installed distributions satisfy it's requirements. Assignments are of the form:: section_name:option_name=value Options and assignments can be given in any order. Here's an example: >>> write(sample_buildout, 'other.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... installed = .other.cfg ... log-level = WARNING ... ... [debug] ... name = other ... recipe = recipes:debug ... """) Note that we used the installed buildout option to specify an alternate file to store information about installed parts. >>> print system(buildout+' -c other.cfg debug:op1=foo -v'), Develop: '/sample-buildout/recipes' Installing debug. name other op1 foo recipe recipes:debug Here we used the -c option to specify an alternate configuration file, and the -v option to increase the level of logging from the default, WARNING. Options can also be combined in the usual Unix way, as in: >>> print system(buildout+' -vcother.cfg debug:op1=foo'), Develop: '/sample-buildout/recipes' Updating debug. name other op1 foo recipe recipes:debug Here we combined the -v and -c options with the configuration file name. Note that the -c option has to be last, because it takes an argument. >>> os.remove(os.path.join(sample_buildout, 'other.cfg')) >>> os.remove(os.path.join(sample_buildout, '.other.cfg')) The most commonly used command is 'install' and it takes a list of parts to install. if any parts are specified, only those parts are installed. To illustrate this, we'll update our configuration and run the buildout in the usual way: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug d1 d2 d3 ... ... [d1] ... recipe = recipes:mkdir ... path = d1 ... ... 
[d2] ... recipe = recipes:mkdir ... path = d2 ... ... [d3] ... recipe = recipes:mkdir ... path = d3 ... ... [debug] ... recipe = recipes:debug ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. recipe recipes:debug Installing d1. d1: Creating directory d1 Installing d2. d2: Creating directory d2 Installing d3. d3: Creating directory d3 >>> ls(sample_buildout) - .installed.cfg - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d d1 d d2 d d3 d demo d develop-eggs d eggs d parts d recipes >>> cat(sample_buildout, '.installed.cfg') ... # doctest: +NORMALIZE_WHITESPACE [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = debug d1 d2 d3 [debug] __buildout_installed__ = __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== recipe = recipes:debug [d1] __buildout_installed__ = /sample-buildout/d1 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d1 recipe = recipes:mkdir [d2] __buildout_installed__ = /sample-buildout/d2 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d2 recipe = recipes:mkdir [d3] __buildout_installed__ = /sample-buildout/d3 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d3 recipe = recipes:mkdir Now we'll update our configuration file: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug d2 d3 d4 ... ... [d2] ... recipe = recipes:mkdir ... path = data2 ... ... [d3] ... recipe = recipes:mkdir ... path = data3 ... ... [d4] ... recipe = recipes:mkdir ... path = ${d2:path}-extra ... ... [debug] ... recipe = recipes:debug ... x = 1 ... """) and run the buildout specifying just d3 and d4: >>> print system(buildout+' install d3 d4'), Develop: '/sample-buildout/recipes' Uninstalling d3. Installing d3. d3: Creating directory data3 Installing d4. d4: Creating directory data2-extra >>> ls(sample_buildout) - .installed.cfg - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d d1 d d2 d data2-extra d data3 d demo d develop-eggs d eggs d parts d recipes Only the d3 and d4 recipes ran. d3 was removed and data3 and data2-extra were created. The .installed.cfg is only updated for the recipes that ran: >>> cat(sample_buildout, '.installed.cfg') ... # doctest: +NORMALIZE_WHITESPACE [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = debug d1 d2 d3 d4 [debug] __buildout_installed__ = __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== recipe = recipes:debug [d1] __buildout_installed__ = /sample-buildout/d1 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d1 recipe = recipes:mkdir [d2] __buildout_installed__ = /sample-buildout/d2 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d2 recipe = recipes:mkdir [d3] __buildout_installed__ = /sample-buildout/data3 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/data3 recipe = recipes:mkdir [d4] __buildout_installed__ = /sample-buildout/data2-extra __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/data2-extra recipe = recipes:mkdir Note that the installed data for debug, d1, and d2 haven't changed, because we didn't install those parts and that the d1 and d2 directories are still there. Now, if we run the buildout without the install command: >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling d2. Uninstalling d1. 
Uninstalling debug. Installing debug. recipe recipes:debug x 1 Installing d2. d2: Creating directory data2 Updating d3. Updating d4. We see the output of the debug recipe and that data2 was created. We also see that d1 and d2 have gone away: >>> ls(sample_buildout) - .installed.cfg - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d data2 d data2-extra d data3 d demo d develop-eggs d eggs d parts d recipes Alternate directory and file locations -------------------------------------- The buildout normally puts the bin, eggs, and parts directories in the directory containing the configuration file. You can provide alternate locations, and even names for these directories. >>> alt = tmpdir('sample-alt') >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = ... develop-eggs-directory = %(developbasket)s ... eggs-directory = %(basket)s ... bin-directory = %(scripts)s ... parts-directory = %(work)s ... """ % dict( ... developbasket = os.path.join(alt, 'developbasket'), ... basket = os.path.join(alt, 'basket'), ... scripts = os.path.join(alt, 'scripts'), ... work = os.path.join(alt, 'work'), ... )) >>> print system(buildout), Creating directory '/sample-alt/scripts'. Creating directory '/sample-alt/work'. Creating directory '/sample-alt/basket'. Creating directory '/sample-alt/developbasket'. Develop: '/sample-buildout/recipes' Uninstalling d4. Uninstalling d3. Uninstalling d2. Uninstalling debug. >>> ls(alt) d basket d developbasket d scripts d work >>> ls(alt, 'developbasket') - recipes.egg-link You can also specify an alternate buildout directory: >>> rmdir(alt) >>> alt = tmpdir('sample-alt') >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... directory = %(alt)s ... develop = %(recipes)s ... parts = ... """ % dict( ... alt=alt, ... recipes=os.path.join(sample_buildout, 'recipes'), ... )) >>> print system(buildout), Creating directory '/sample-alt/bin'. Creating directory '/sample-alt/parts'. Creating directory '/sample-alt/eggs'. Creating directory '/sample-alt/develop-eggs'. Develop: '/sample-buildout/recipes' >>> ls(alt) - .installed.cfg d bin d develop-eggs d eggs d parts >>> ls(alt, 'develop-eggs') - recipes.egg-link Logging control --------------- Three buildout options are used to control logging: log-level specifies the log level verbosity adjusts the log level log-format allows an alternate logging format to be specified We've already seen the log level and verbosity. Let's look at an example of changing the format: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = ... log-level = 25 ... verbosity = 5 ... log-format = %(levelname)s %(message)s ... """) Here, we've changed the format to include the log-level name, rather than the logger name. We've also illustrated, with a contrived example, that the log level can be a numeric value and that the verbosity can be specified in the configuration file. Because the verbosity is subtracted from the log level, we get a final log level of 20, which is the INFO level. >>> print system(buildout), INFO Develop: '/sample-buildout/recipes' Predefined buildout options --------------------------- Buildouts have a number of predefined options that recipes can use and that users can override in their configuration files. To see these, we'll run a minimal buildout configuration with a debug logging level. One of the features of debug logging is that the configuration database is shown.
>>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... parts = ... """) >>> print system(buildout+' -vv'), # doctest: +NORMALIZE_WHITESPACE Installing 'zc.buildout >=1.9a1, <2dev', 'setuptools'. We have a develop egg: zc.buildout X.X. We have the best distribution that satisfies 'setuptools'. Picked: setuptools = V.V Configuration data: [buildout] accept-buildout-test-releases = false allow-hosts = * allow-picked-versions = true allowed-eggs-from-site-packages = * bin-directory = /sample-buildout/bin develop-eggs-directory = /sample-buildout/develop-eggs directory = /sample-buildout eggs-directory = /sample-buildout/eggs exec-sitecustomize = true executable = python find-links = include-site-packages = true install-from-cache = false installed = /sample-buildout/.installed.cfg log-format = log-level = INFO newest = true offline = false parts = parts-directory = /sample-buildout/parts prefer-final = false python = buildout relative-paths = false socket-timeout = unzip = false use-dependency-links = true verbosity = 20 zc.buildout-version = >=1.9a1, <2dev All of these options can be overridden by configuration files or by command-line assignments. We've discussed most of these options already, but let's review them and touch on some we haven't discussed: allowed-eggs-from-site-packages Sometimes you need or want to control what eggs from site-packages are used. The allowed-eggs-from-site-packages option allows you to specify a whitelist of project names that may be included from site-packages. You can use globs to specify the value. It defaults to a single value of '*', indicating that any package may come from site-packages. Here's a usage example:: [buildout] ... allowed-eggs-from-site-packages = demo bigdemo zope.* This option interacts with the ``include-site-packages`` option in the following ways. If ``include-site-packages`` is true, then ``allowed-eggs-from-site-packages`` filters what eggs from site-packages may be chosen. Therefore, if ``allowed-eggs-from-site-packages`` is an empty list, then no eggs from site-packages are chosen, but site-packages will still be included at the end of path lists. If ``include-site-packages`` is false, the value of ``allowed-eggs-from-site-packages`` is irrelevant. See the ``include-site-packages`` description for more information. bin-directory The directory path where scripts are written. This can be a relative path, which is interpreted relative to the directory option. develop-eggs-directory The directory path where development egg links are created for software being created in the local project. This can be a relative path, which is interpreted relative to the directory option. directory The buildout directory. This is the base for other buildout file and directory locations, when relative locations are used. eggs-directory The directory path where downloaded eggs are put. It is common to share this directory across buildouts. Eggs in this directory should *never* be modified. This can be a relative path, which is interpreted relative to the directory option. exec-sitecustomize Normally the Python's real sitecustomize module is processed. If you do not want it to be processed in order to increase the repeatability of your buildout, set this value to 'false'. This will be honored irrespective of the setting for include-site-packages. This option will be honored by some recipes and not others. z3c.recipe.scripts honors this and zc.recipe.egg does not, for instance. executable The Python executable used to run the buildout. 
See the python option below. include-site-packages You can choose not to have the site-packages of the underlying Python available to your script or interpreter, in addition to the packages from your eggs. This can increase repeatability for your buildout. This option will be better used by some recipes than others. z3c.recipe.scripts honors this fully and zc.recipe.egg only partially, for instance. installed The file path where information about the results of the previous buildout run is written. This can be a relative path, which is interpreted relative to the directory option. This file provides an inventory of installed parts with information needed to decide which parts, if any, need to be uninstalled. log-format The format used for logging messages. log-level The log level before verbosity adjustment. parts A whitespace-separated list of parts to be installed. parts-directory A working directory that parts can use to store data. python The name of a section containing information about the default Python interpreter. Recipes that need a Python installation typically have options to tell them which Python installation to use. By convention, if a section-specific option isn't used, the option is looked for in the buildout section. The option must point to a section with an executable option giving the path to a Python executable. By default, the buildout section defines the default Python as the Python used to run the buildout. relative-paths The paths generated by zc.buildout are absolute by default, and this option is ``false``. However, if you set this value to be ``true``, bin/buildout will be generated with code that makes the paths relative. Some recipes, such as zc.recipe.egg and z3c.recipe.scripts, honor this value as well. unzip By default, zc.buildout doesn't unzip zip-safe eggs ("unzip = false"). This follows the policy used by setuptools itself. Experience shows this policy to be inconvenient: zipped eggs make debugging more difficult and often import more slowly. You can include an unzip option in the buildout section to change the default unzipping policy ("unzip = true"). use-dependency-links By default buildout will obey the setuptools dependency_links metadata when it looks for dependencies. This behavior can be controlled with the use-dependency-links buildout option:: [buildout] ... use-dependency-links = false The option defaults to true. If you set it to false, then dependency links are only looked for in the locations specified by find-links. verbosity A log-level adjustment. Typically, this is set via the -q and -v command-line options. Creating new buildouts and bootstrapping ---------------------------------------- If zc.buildout is installed, you can use it to create a new buildout with its own local copies of zc.buildout and setuptools and with local buildout scripts. >>> sample_bootstrapped = tmpdir('sample-bootstrapped') >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped, 'setup.cfg') ... +' init'), Creating '/sample-bootstrapped/setup.cfg'. Creating directory '/sample-bootstrapped/bin'. Creating directory '/sample-bootstrapped/parts'. Creating directory '/sample-bootstrapped/eggs'. Creating directory '/sample-bootstrapped/develop-eggs'. Generated script '/sample-bootstrapped/bin/buildout'. Note that a basic setup.cfg was created for us. This is because we provided an 'init' argument.
By default, the generated ``setup.cfg`` is as minimal as it could be: >>> cat(sample_bootstrapped, 'setup.cfg') [buildout] parts = We also get other buildout artifacts: >>> ls(sample_bootstrapped) d bin d develop-eggs d eggs d parts - setup.cfg >>> ls(sample_bootstrapped, 'bin') - buildout >>> _ = (ls(sample_bootstrapped, 'eggs'), ... ls(sample_bootstrapped, 'develop-eggs')) - setuptools-0.6-py2.3.egg - zc.buildout-1.0-py2.3.egg (We list both the eggs and develop-eggs directories because the buildout or setuptools egg could be installed in the develop-eggs directory if the original buildout had develop eggs for either buildout or setuptools.) If relative-paths is ``true``, the buildout script uses relative paths. >>> write(sample_bootstrapped, 'setup.cfg', ... ''' ... [buildout] ... relative-paths = true ... parts = ... ''') >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped, 'setup.cfg') ... +' bootstrap'), Generated script '/sample-bootstrapped/bin/buildout'. >>> buildout_script = join(sample_bootstrapped, 'bin', 'buildout') >>> import sys >>> if sys.platform.startswith('win'): ... buildout_script += '-script.py' >>> print open(buildout_script).read() # doctest: +ELLIPSIS #!... -S import os join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) import sys sys.path[0:0] = [ join(base, 'parts/buildout'), ] import os path = sys.path[0] if os.environ.get('PYTHONPATH'): path = os.pathsep.join([path, os.environ['PYTHONPATH']]) os.environ['BUILDOUT_ORIGINAL_PYTHONPATH'] = os.environ.get('PYTHONPATH', '') os.environ['PYTHONPATH'] = path import site # imports custom buildout-generated site.py import zc.buildout.buildout if __name__ == '__main__': sys.exit(zc.buildout.buildout.main()) Note that, in the above two examples, the buildout script was installed but not run. To run the buildout, we'd have to run the installed buildout script. If we have an existing buildout that already has a buildout.cfg, we'll normally use the bootstrap command instead of init. It will complain if there isn't a configuration file: >>> sample_bootstrapped2 = tmpdir('sample-bootstrapped2') >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped2, 'setup.cfg') ... +' bootstrap'), While: Initializing. Error: Couldn't open /sample-bootstrapped2/setup.cfg >>> write(sample_bootstrapped2, 'setup.cfg', ... """ ... [buildout] ... parts = ... """) >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped2, 'setup.cfg') ... +' bootstrap'), Creating directory '/sample-bootstrapped2/bin'. Creating directory '/sample-bootstrapped2/parts'. Creating directory '/sample-bootstrapped2/eggs'. Creating directory '/sample-bootstrapped2/develop-eggs'. Generated script '/sample-bootstrapped2/bin/buildout'. Similarly, if there is a configuration file and we use the init command, we'll get an error that the configuration file already exists: >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped, 'setup.cfg') ... +' init'), While: Initializing. Error: '/sample-bootstrapped/setup.cfg' already exists. Initial eggs ------------ When using the ``init`` command, you can specify distribution requirements or paths to use: >>> cd(sample_bootstrapped) >>> remove('setup.cfg') >>> print system(buildout + ' -csetup.cfg init demo other ./src'), Creating '/sample-bootstrapped/setup.cfg'. Generated script '/sample-bootstrapped/bin/buildout'. Getting distribution for 'zc.recipe.egg<2dev'. Got zc.recipe.egg 1.3.3dev. Installing py. 
Getting distribution for 'demo'. Got demo 0.4c1. Getting distribution for 'other'. Got other 1.0. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. Generated script '/sample-bootstrapped/bin/demo'. Generated interpreter '/sample-bootstrapped/bin/py'. This causes a ``py`` part to be included that sets up a custom Python interpreter with the given requirements or paths: >>> cat('setup.cfg') [buildout] parts = py [py] recipe = zc.recipe.egg interpreter = py eggs = demo other extra-paths = ./src Passing requirements or paths causes the buildout to be run as part of initialization. In the example above, we got a number of distributions installed and two scripts generated. The first, ``demo``, was defined by the ``demo`` project. The second, ``py``, was defined by the generated configuration. It's a "custom interpreter" that behaves like a standard Python interpreter, except that it includes the specified eggs and extra paths in its Python path. We specified a source directory that didn't exist. Buildout created it for us: >>> ls('.') - .installed.cfg d bin d develop-eggs d eggs d parts - setup.cfg d src >>> uncd() .. Make sure it works if the dir is already there: >>> cd(sample_bootstrapped) >>> _ = system(buildout + ' -csetup.cfg buildout:parts=') >>> remove('setup.cfg') >>> print system(buildout + ' -csetup.cfg init demo other ./src'), Creating '/sample-bootstrapped/setup.cfg'. Installing py. Generated script '/sample-bootstrapped/bin/demo'. Generated interpreter '/sample-bootstrapped/bin/py'. .. cleanup >>> _ = system(buildout + ' -csetup.cfg buildout:parts=') >>> uncd() Newest and Offline Modes ------------------------ By default buildout and recipes will try to find the newest versions of distributions needed to satisfy requirements. This can be very time-consuming, especially when incrementally working on setting up a buildout or working on a recipe. The buildout newest option can be used to suppress this. If the newest option is set to false, then new distributions won't be sought if an installed distribution meets requirements. The newest option can be set to false using the -N command-line option. The offline option goes a bit further. If the buildout offline option is given a value of "true", the buildout and recipes that are aware of the option will avoid doing network access. This is handy when running the buildout when not connected to the internet. It also makes buildouts run much faster. This option is typically set using the buildout -o option. Preferring Final Releases ------------------------- Currently, when searching for new releases of your project's dependencies, the newest available release is used. This isn't usually ideal, as you may get development or alpha releases that aren't ready to be widely used. You can request that final releases be preferred using the ``prefer-final`` option in the buildout section:: [buildout] ... prefer-final = true When the ``prefer-final`` option is set to true, then when searching for new releases, final releases are preferred. If there are final releases that satisfy distribution requirements, then those releases are used even if newer non-final releases are available. In buildout version 2, all final releases will be preferred by default--that is, ``prefer-final`` will also default to 'true'. You will then need to use a 'false' value for ``prefer-final`` to get the newest releases. A separate option controls the behavior of the build system itself.
When buildout looks for recipes, extensions, and for updates to itself, it does prefer final releases by default, as of the 1.5.0 release. The ``accept-buildout-test-releases`` option will let you override this behavior. However, it is typically changed by the --accept-buildout-test-releases option to the bootstrap script, since bootstrapping is the first step to selecting a buildout. Finding distributions --------------------- By default, buildout searches the Python Package Index when looking for distributions. You can, instead, specify your own index to search using the `index` option:: [buildout] ... index = http://index.example.com/ This index, or the default of http://pypi.python.org/simple/ if no index is specified, will always be searched for distributions unless running buildout with options that prevent searching for distributions. The latest version of the distribution that meets the requirements of the buildout will always be used. You can also specify more locations to search for distributions using the `find-links` option. All locations specified will be searched for distributions along with the package index as described before. Locations can be urls:: [buildout] ... find-links = http://download.zope.org/distribution/ They can also be directories on disk:: [buildout] ... find-links = /some/path Finally, they can also be direct paths to distributions:: [buildout] ... find-links = /some/path/someegg-1.0.0-py2.3.egg Any number of locations can be specified in the `find-links` option:: [buildout] ... find-links = http://download.zope.org/distribution/ /some/otherpath /some/path/someegg-1.0.0-py2.3.egg Dependency links ---------------- By default buildout will obey the setuptools dependency_links metadata when it looks for dependencies. This behavior can be controlled with the use-dependency-links buildout option:: [buildout] ... use-dependency-links = false The option defaults to true. If you set it to false, then dependency links are only looked for in the locations specified by find-links. Controlling the installation database ------------------------------------- The buildout installed option is used to specify the file used to save information on installed parts. This option is initialized to ".installed.cfg", but it can be overridden in the configuration file or on the command line: >>> write('buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... """) >>> print system(buildout+' buildout:installed=inst.cfg'), Develop: '/sample-buildout/recipes' Installing debug. recipe recipes:debug >>> ls(sample_buildout) - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d demo d develop-eggs d eggs - inst.cfg d parts d recipes The installation database can be disabled by supplying an empty buildout installed option: >>> os.remove('inst.cfg') >>> print system(buildout+' buildout:installed='), Develop: '/sample-buildout/recipes' Installing debug. recipe recipes:debug >>> ls(sample_buildout) - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d demo d develop-eggs d eggs d parts d recipes Note that there will be no installation database if there are no parts: >>> write('buildout.cfg', ... """ ... [buildout] ... parts = ... 
""") >>> print system(buildout+' buildout:installed=inst.cfg'), >>> ls(sample_buildout) - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d demo d develop-eggs d eggs d parts d recipes Extensions ---------- A feature allows code to be loaded and run after configuration files have been read but before the buildout has begun any processing. The intent is to allow special plugins such as urllib2 request handlers to be loaded. To load an extension, we use the extensions option and list one or more distribution requirements, on separate lines. The distributions named will be loaded and any ``zc.buildout.extension`` entry points found will be called with the buildout as an argument. When buildout finishes processing, any ``zc.buildout.unloadextension`` entry points found will be called with the buildout as an argument. Let's create a sample extension in our sample buildout created in the previous section: >>> mkdir(sample_bootstrapped, 'demo') >>> write(sample_bootstrapped, 'demo', 'demo.py', ... """ ... def ext(buildout): ... print 'ext', list(buildout) ... def unload(buildout): ... print 'unload', list(buildout) ... """) >>> write(sample_bootstrapped, 'demo', 'setup.py', ... """ ... from setuptools import setup ... ... setup( ... name = "demo", ... entry_points = { ... 'zc.buildout.extension': ['ext = demo:ext'], ... 'zc.buildout.unloadextension': ['ext = demo:unload'], ... }, ... ) ... """) Our extension just prints out the word 'demo', and lists the sections found in the buildout passed to it. We'll update our buildout.cfg to list the demo directory as a develop egg to be built: >>> write(sample_bootstrapped, 'buildout.cfg', ... """ ... [buildout] ... develop = demo ... parts = ... """) >>> os.chdir(sample_bootstrapped) >>> print system(os.path.join(sample_bootstrapped, 'bin', 'buildout')), Develop: '/sample-bootstrapped/demo' Now we can add the extensions option. We were a bit tricky and ran the buildout once with the demo develop egg defined but without the extension option. This is because extensions are loaded before the buildout creates develop eggs. We needed to use a separate buildout run to create the develop egg. Normally, when eggs are loaded from the network, we wouldn't need to do anything special. >>> write(sample_bootstrapped, 'buildout.cfg', ... """ ... [buildout] ... develop = demo ... extensions = demo ... parts = ... """) We see that our extension is loaded and executed: >>> print system(os.path.join(sample_bootstrapped, 'bin', 'buildout')), ext ['buildout'] Develop: '/sample-bootstrapped/demo' unload ['buildout'] Allow hosts ----------- On some environments the links visited by `zc.buildout` can be forbidden by paranoiac firewalls. These URL might be on the chain of links visited by `zc.buildout` wheter they are defined in the `find-links` option, wheter they are defined by various eggs in their `url`, `download_url`, `dependency_links` metadata. It is even harder to track that package_index works like a spider and might visit links and go to other location. The `allow-hosts` option provides a way to prevent this, and works exactly like the one provided in `easy_install`. You can provide a list of allowed host, together with wildcards:: [buildout] ... allow-hosts = *.python.org example.com All urls that does not match these hosts will not be visited. .. [#future_recipe_methods] In the future, additional methods may be added. Older recipes with fewer methods will still be supported. .. 
[#packaging_info] If we wanted to create a distribution from this package, we would need specify much more information. See the `setuptools documentation `_. zc.buildout-1.7.1/src/zc/buildout/debugging.txt0000644000076500007650000000612412111414155021100 0ustar jimjim00000000000000Debugging buildouts =================== Buildouts can be pretty complex. When things go wrong, it isn't always obvious why. Errors can occur due to problems in user input or due to bugs in zc.buildout or recipes. When an error occurs, Python's post-mortem debugger can be used to inspect the state of the buildout or recipe code where the error occurred. To enable this, use the -D option to the buildout. Let's create a recipe that has a bug: >>> mkdir(sample_buildout, 'recipes') >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... options['path'] = os.path.join( ... buildout['buildout']['directory'], ... options['path'], ... ) ... ... def install(self): ... directory = self.options['directory'] ... os.mkdir(directory) ... return directory ... ... def update(self): ... pass ... """) >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... ... setup(name = "recipes", ... entry_points = {'zc.buildout': ['mkdir = mkdir:Mkdir']}, ... ) ... """) And create a buildout that uses it: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = mystuff ... """) If we run the buildout, we'll get an error: >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing data-dir. While: Installing data-dir. Error: Missing option: data-dir:directory If we want to debug the error, we can add the -D option. Here's we'll supply some input: >>> print system(buildout+" -D", """\ ... up ... p self.options.keys() ... q ... """), Develop: '/sample-buildout/recipes' Installing data-dir. > /zc/buildout/buildout.py(925)__getitem__() -> raise MissingOption("Missing option: %s:%s" % (self.name, key)) (Pdb) > /sample-buildout/recipes/mkdir.py(14)install() -> directory = self.options['directory'] (Pdb) ['path', 'recipe'] (Pdb) While: Installing data-dir. Traceback (most recent call last): File "/zc/buildout/buildout.py", line 1352, in main getattr(buildout, command)(args) File "/zc/buildout/buildout.py", line 383, in install installed_files = self[part]._call(recipe.install) File "/zc/buildout/buildout.py", line 961, in _call return f() File "/sample-buildout/recipes/mkdir.py", line 14, in install directory = self.options['directory'] File "/zc/buildout/buildout.py", line 925, in __getitem__ raise MissingOption("Missing option: %s:%s" % (self.name, key)) MissingOption: Missing option: data-dir:directory Starting pdb: zc.buildout-1.7.1/src/zc/buildout/dependencylinks.txt0000644000076500007650000001367412111414155022334 0ustar jimjim00000000000000Dependency links ---------------- By default buildout will obey the setuptools dependency_links metadata when it looks for dependencies. This behavior can be controlled with the use-dependency-links buildout option. [buildout] ... use-dependency-links = false The option defaults to true. If you set it to false, then dependency links are only looked for in the locations specified by find-links. Let's see this feature in action. To begin, let's create a new egg repository. 
This repository uses the same sample eggs as the normal testing repository. >>> link_server2 = start_server(sample_eggs) Turn on logging on this server so that we can see when eggs are pulled from it. >>> get(link_server2 + 'enable_server_logging') GET 200 /enable_server_logging '' Let's create a develop egg in our buildout that specifies dependency_links which point to the new server. >>> mkdir(sample_buildout, 'depdemo') >>> write(sample_buildout, 'depdemo', 'dependencydemo.py', ... 'import eggrecipedemoneeded') >>> write(sample_buildout, 'depdemo', 'setup.py', ... '''from setuptools import setup; setup( ... name='depdemo', py_modules=['dependencydemo'], ... install_requires = 'demoneeded', ... dependency_links = ['%s'], ... zip_safe=True, version='1') ... ''' % link_server2) Now let's configure the buildout to use the develop egg. >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... develop = depdemo ... parts = eggs ... ... [eggs] ... recipe = zc.recipe.egg:eggs ... eggs = depdemo ... ''') Now we can run the buildout. >>> print system(buildout) GET 200 / GET 200 /demoneeded-1.2c1.zip Develop: '/sample-buildout/depdemo' Installing eggs. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. Notice that the egg was retrieved from the logging server. Now let's change the egg so that it doesn't specify dependency links. >>> write(sample_buildout, 'depdemo', 'setup.py', ... '''from setuptools import setup; setup( ... name='depdemo', py_modules=['dependencydemo'], ... install_requires = 'demoneeded', ... zip_safe=True, version='1') ... ''') Now we'll remove the existing dependency egg, and rerunning the buildout to see where the egg comes from this time. >>> from glob import glob >>> from os.path import join >>> def remove_demoneeded_egg(): ... for egg in glob(join(sample_buildout, 'eggs', 'demoneeded*.egg')): ... remove(sample_buildout, 'eggs', egg) >>> remove_demoneeded_egg() >>> print system(buildout) # doctest: +ELLIPSIS Develop: '/sample-buildout/depdemo' ... Getting distribution for 'demoneeded'. While: Updating eggs. Getting distribution for 'demoneeded'. Error: Couldn't find a distribution for 'demoneeded'. Now it can't find the dependency since neither the buildout configuration nor setup specifies where to look. Let's change things so that the buildout configuration specifies where to look for eggs. >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... develop = depdemo ... parts = eggs ... find-links = %s ... ... [eggs] ... recipe = zc.recipe.egg:eggs ... eggs = depdemo ... ''' % link_server) >>> print system(buildout) Develop: '/sample-buildout/depdemo' Installing eggs. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. This time the dependency egg was found on the server without logging configured. Now let's change things once again so that both buildout and setup specify different places to look for the dependency egg. >>> write(sample_buildout, 'depdemo', 'setup.py', ... '''from setuptools import setup; setup( ... name='depdemo', py_modules=['dependencydemo'], ... install_requires = 'demoneeded', ... dependency_links = ['%s'], ... zip_safe=True, version='1') ... ''' % link_server2) >>> remove_demoneeded_egg() >>> print system(buildout) #doctest: +ELLIPSIS GET 200 /... Develop: '/sample-buildout/depdemo' Updating eggs. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. So when both setuptools and buildout specify places to search for eggs, the dependency_links takes precedence over find-links. 
There is a buildout option that you can specify to change this behavior. It is the use-dependency-links option. This option defaults to true. When you specify false for this option, buildout will ignore dependency_links and only look for eggs using find-links. Here is an example of using this option to disable dependency_links. >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... develop = depdemo ... parts = eggs ... find-links = %s ... use-dependency-links = false ... ... [eggs] ... recipe = zc.recipe.egg:eggs ... eggs = depdemo ... ''' % link_server) >>> remove_demoneeded_egg() >>> print system(buildout) Develop: '/sample-buildout/depdemo' Updating eggs. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. Notice that this time the egg isn't downloaded from the logging server. If we set the option to true, things return to the way they were before. The dependency's are looked for first in the logging server. >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... develop = depdemo ... parts = eggs ... find-links = %s ... use-dependency-links = true ... ... [eggs] ... recipe = zc.recipe.egg:eggs ... eggs = depdemo ... ''' % link_server) >>> remove_demoneeded_egg() >>> print system(buildout) #doctest: +ELLIPSIS GET 200 /... Develop: '/sample-buildout/depdemo' Updating eggs. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. zc.buildout-1.7.1/src/zc/buildout/distribute.txt0000644000076500007650000000134212111414155021320 0ustar jimjim00000000000000Distribute Support ================== Distribute is a drop-in replacement for Setuptools. zc.buildout is now compatible with Distribute 0.6. To use Distribute in your buildout, you need use the ``--distribute`` option of the ``bootstrap.py`` script:: $ python bootstrap.py --distribute This will download and install the latest Distribute 0.6 release in the ``eggs`` directory, and use this version for the scripts that are created in ``bin``. Notice that if you have a shared eggs directory, a buildout that uses Distribute will not interfer with other buildouts that are based on Setuptools and that are sharing the same eggs directory. Form more information about the Distribute project, see: http://python-distribute.org zc.buildout-1.7.1/src/zc/buildout/download.py0000644000076500007650000002122212111414155020561 0ustar jimjim00000000000000############################################################################## # # Copyright (c) 2009 Zope Foundation and Contributors. # All Rights Reserved. # # This software is subject to the provisions of the Zope Public License, # Version 2.1 (ZPL). A copy of the ZPL should accompany this distribution. # THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED # WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED # WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS # FOR A PARTICULAR PURPOSE. # ############################################################################## """Buildout download infrastructure""" try: from hashlib import md5 except ImportError: from md5 import new as md5 from zc.buildout.easy_install import realpath import logging import os import os.path import re import shutil import tempfile import urllib import urlparse import zc.buildout class URLOpener(urllib.FancyURLopener): http_error_default = urllib.URLopener.http_error_default class ChecksumError(zc.buildout.UserError): pass url_opener = URLOpener() class Download(object): """Configurable download utility. 
Handles the download cache and offline mode. Download(options=None, cache=None, namespace=None, offline=False, fallback=False, hash_name=False, logger=None) options: mapping of buildout options (e.g. a ``buildout`` config section) cache: path to the download cache (excluding namespaces) namespace: namespace directory to use inside the cache offline: whether to operate in offline mode fallback: whether to use the cache as a fallback (try downloading first) hash_name: whether to use a hash of the URL as cache file name logger: an optional logger to receive download-related log messages """ def __init__(self, options={}, cache=-1, namespace=None, offline=-1, fallback=False, hash_name=False, logger=None): self.directory = options.get('directory', '') self.cache = cache if cache == -1: self.cache = options.get('download-cache') self.namespace = namespace self.offline = offline if offline == -1: self.offline = (options.get('offline') == 'true' or options.get('install-from-cache') == 'true') self.fallback = fallback self.hash_name = hash_name self.logger = logger or logging.getLogger('zc.buildout') @property def download_cache(self): if self.cache is not None: return realpath(os.path.join(self.directory, self.cache)) @property def cache_dir(self): if self.download_cache is not None: return os.path.join(self.download_cache, self.namespace or '') def __call__(self, url, md5sum=None, path=None): """Download a file according to the utility's configuration. url: URL to download md5sum: MD5 checksum to match path: where to place the downloaded file Returns the path to the downloaded file. """ if self.cache: local_path, is_temp = self.download_cached(url, md5sum) else: local_path, is_temp = self.download(url, md5sum, path) return locate_at(local_path, path), is_temp def download_cached(self, url, md5sum=None): """Download a file from a URL using the cache. This method assumes that the cache has been configured. Optionally, it raises a ChecksumError if a cached copy of a file has an MD5 mismatch, but will not remove the copy in that case. """ if not os.path.exists(self.download_cache): raise zc.buildout.UserError( 'The directory:\n' '%r\n' "to be used as a download cache doesn't exist.\n" % self.download_cache) cache_dir = self.cache_dir if not os.path.exists(cache_dir): os.mkdir(cache_dir) cache_key = self.filename(url) cached_path = os.path.join(cache_dir, cache_key) self.logger.debug('Searching cache at %s' % cache_dir) if os.path.exists(cached_path): is_temp = False if self.fallback: try: _, is_temp = self.download(url, md5sum, cached_path) except ChecksumError: raise except Exception: pass if not check_md5sum(cached_path, md5sum): raise ChecksumError( 'MD5 checksum mismatch for cached download ' 'from %r at %r' % (url, cached_path)) self.logger.debug('Using cache file %s' % cached_path) else: self.logger.debug('Cache miss; will cache %s as %s' % (url, cached_path)) _, is_temp = self.download(url, md5sum, cached_path) return cached_path, is_temp def download(self, url, md5sum=None, path=None): """Download a file from a URL to a given or temporary path. An online resource is always downloaded to a temporary file and moved to the specified path only after the download is complete and the checksum (if given) matches. If path is None, the temporary file is returned and the client code is responsible for cleaning it up. """ # Make sure the drive letter in windows-style file paths isn't # interpreted as a URL scheme. 
if re.match(r"^[A-Za-z]:\\", url): url = 'file:' + url parsed_url = urlparse.urlparse(url, 'file') url_scheme, _, url_path = parsed_url[:3] if url_scheme == 'file': self.logger.debug('Using local resource %s' % url) if not check_md5sum(url_path, md5sum): raise ChecksumError( 'MD5 checksum mismatch for local resource at %r.' % url_path) return locate_at(url_path, path), False if self.offline: raise zc.buildout.UserError( "Couldn't download %r in offline mode." % url) self.logger.info('Downloading %s' % url) urllib._urlopener = url_opener handle, tmp_path = tempfile.mkstemp(prefix='buildout-') try: try: tmp_path, headers = urllib.urlretrieve(url, tmp_path) if not check_md5sum(tmp_path, md5sum): raise ChecksumError( 'MD5 checksum mismatch downloading %r' % url) except IOError, e: os.remove(tmp_path) raise zc.buildout.UserError("Error downloading extends for URL" " %s: %r" % (url, e[1:3])) except Exception, e: os.remove(tmp_path) raise finally: os.close(handle) if path: shutil.move(tmp_path, path) return path, False else: return tmp_path, True def filename(self, url): """Determine a file name from a URL according to the configuration. """ if self.hash_name: return md5(url).hexdigest() else: if re.match(r"^[A-Za-z]:\\", url): url = 'file:' + url parsed = urlparse.urlparse(url, 'file') url_path = parsed[2] if parsed[0] == 'file': while True: url_path, name = os.path.split(url_path) if name: return name if not url_path: break else: for name in reversed(url_path.split('/')): if name: return name url_host, url_port = parsed[-2:] return '%s:%s' % (url_host, url_port) def check_md5sum(path, md5sum): """Tell whether the MD5 checksum of the file at path matches. No checksum being given is considered a match. """ if md5sum is None: return True f = open(path, 'rb') checksum = md5() try: chunk = f.read(2**16) while chunk: checksum.update(chunk) chunk = f.read(2**16) return checksum.hexdigest() == md5sum finally: f.close() def remove(path): if os.path.exists(path): os.remove(path) def locate_at(source, dest): if dest is None or realpath(dest) == realpath(source): return source if os.path.isdir(source): shutil.copytree(source, dest) else: try: os.link(source, dest) except (AttributeError, OSError): shutil.copyfile(source, dest) return dest zc.buildout-1.7.1/src/zc/buildout/download.txt0000644000076500007650000004222212111414155020753 0ustar jimjim00000000000000Using the download utility ========================== The ``zc.buildout.download`` module provides a download utility that handles the details of downloading files needed for a buildout run from the internet. It downloads files to the local file system, using the download cache if desired and optionally checking the downloaded files' MD5 checksum. We setup an HTTP server that provides a file we want to download: >>> server_data = tmpdir('sample_files') >>> write(server_data, 'foo.txt', 'This is a foo text.') >>> server_url = start_server(server_data) We also use a fresh directory for temporary files in order to make sure that all temporary files have been cleaned up in the end: >>> import tempfile >>> old_tempdir = tempfile.tempdir >>> tempfile.tempdir = tmpdir('tmp') Downloading without using the cache ----------------------------------- If no download cache should be used, the download utility is instantiated without any arguments: >>> from zc.buildout.download import Download >>> download = Download() >>> print download.cache_dir None Downloading a file is achieved by calling the utility with the URL as an argument. 
A tuple is returned that consists of the path to the downloaded copy of the file and a boolean value indicating whether this is a temporary file meant to be cleaned up during the same buildout run: >>> path, is_temp = download(server_url+'foo.txt') >>> print path /.../buildout-... >>> cat(path) This is a foo text. As we aren't using the download cache and haven't specified a target path either, the download has ended up in a temporary file: >>> is_temp True >>> import tempfile >>> path.startswith(tempfile.gettempdir()) True We are responsible for cleaning up temporary files behind us: >>> remove(path) When trying to access a file that doesn't exist, we'll get an exception: >>> try: download(server_url+'not-there') # doctest: +ELLIPSIS ... except: print 'download error' ... else: print 'woops' download error Downloading a local file doesn't produce a temporary file but simply returns the local file itself: >>> download(join(server_data, 'foo.txt')) ('/sample_files/foo.txt', False) We can also have the downloaded file's MD5 sum checked: >>> try: from hashlib import md5 ... except ImportError: from md5 import new as md5 >>> path, is_temp = download(server_url+'foo.txt', ... md5('This is a foo text.').hexdigest()) >>> is_temp True >>> remove(path) >>> download(server_url+'foo.txt', ... md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch downloading 'http://localhost/foo.txt' The error message in the event of an MD5 checksum mismatch for a local file reads somewhat differently: >>> download(join(server_data, 'foo.txt'), ... md5('This is a foo text.').hexdigest()) ('/sample_files/foo.txt', False) >>> download(join(server_data, 'foo.txt'), ... md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch for local resource at '/sample_files/foo.txt'. Finally, we can download the file to a specified place in the file system: >>> target_dir = tmpdir('download-target') >>> path, is_temp = download(server_url+'foo.txt', ... path=join(target_dir, 'downloaded.txt')) >>> print path /download-target/downloaded.txt >>> cat(path) This is a foo text. >>> is_temp False Trying to download a file in offline mode will result in an error: >>> download = Download(cache=None, offline=True) >>> download(server_url+'foo.txt') Traceback (most recent call last): UserError: Couldn't download 'http://localhost/foo.txt' in offline mode. As an exception to this rule, file system paths and URLs in the ``file`` scheme will still work: >>> cat(download(join(server_data, 'foo.txt'))[0]) This is a foo text. >>> cat(download('file:' + join(server_data, 'foo.txt'))[0]) This is a foo text. >>> remove(path) Downloading using the download cache ------------------------------------ In order to make use of the download cache, we need to configure the download utility differently. To do this, we pass a directory path as the ``cache`` attribute upon instantiation: >>> cache = tmpdir('download-cache') >>> download = Download(cache=cache) >>> print download.cache_dir /download-cache/ Simple usage ~~~~~~~~~~~~ When using the cache, a file will be stored in the cache directory when it is first downloaded. The file system path returned by the download utility points to the cached copy: >>> ls(cache) >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/foo.txt >>> cat(path) This is a foo text. >>> is_temp False Whenever the file is downloaded again, the cached copy is used. 
Let's change the file on the server to see this: >>> write(server_data, 'foo.txt', 'The wrong text.') >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/foo.txt >>> cat(path) This is a foo text. If we specify an MD5 checksum for a file that is already in the cache, the cached copy's checksum will be verified: >>> download(server_url+'foo.txt', md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch for cached download from 'http://localhost/foo.txt' at '/download-cache/foo.txt' Trying to access another file at a different URL which has the same base name will result in the cached copy being used: >>> mkdir(server_data, 'other') >>> write(server_data, 'other', 'foo.txt', 'The wrong text.') >>> path, is_temp = download(server_url+'other/foo.txt') >>> print path /download-cache/foo.txt >>> cat(path) This is a foo text. Given a target path for the download, the utility will provide a copy of the file at that location both when first downloading the file and when using a cached copy: >>> remove(cache, 'foo.txt') >>> ls(cache) >>> write(server_data, 'foo.txt', 'This is a foo text.') >>> path, is_temp = download(server_url+'foo.txt', ... path=join(target_dir, 'downloaded.txt')) >>> print path /download-target/downloaded.txt >>> cat(path) This is a foo text. >>> is_temp False >>> ls(cache) - foo.txt >>> remove(path) >>> write(server_data, 'foo.txt', 'The wrong text.') >>> path, is_temp = download(server_url+'foo.txt', ... path=join(target_dir, 'downloaded.txt')) >>> print path /download-target/downloaded.txt >>> cat(path) This is a foo text. >>> is_temp False In offline mode, downloads from any URL will be successful if the file is found in the cache: >>> download = Download(cache=cache, offline=True) >>> cat(download(server_url+'foo.txt')[0]) This is a foo text. Local resources will be cached just like any others since download caches are sometimes used to create source distributions: >>> remove(cache, 'foo.txt') >>> ls(cache) >>> write(server_data, 'foo.txt', 'This is a foo text.') >>> download = Download(cache=cache) >>> cat(download('file:' + join(server_data, 'foo.txt'), path=path)[0]) This is a foo text. >>> ls(cache) - foo.txt >>> remove(cache, 'foo.txt') >>> cat(download(join(server_data, 'foo.txt'), path=path)[0]) This is a foo text. >>> ls(cache) - foo.txt >>> remove(cache, 'foo.txt') However, resources with checksum mismatches will not be copied to the cache: >>> download(server_url+'foo.txt', md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch downloading 'http://localhost/foo.txt' >>> ls(cache) >>> remove(path) If the file is completely missing it should notify the user of the error: >>> download(server_url+'bar.txt') Traceback (most recent call last): UserError: Error downloading extends for URL http://localhost/bar.txt: (404, 'Not Found') >>> ls(cache) Finally, let's see what happens if the download cache to be used doesn't exist as a directory in the file system yet: >>> Download(cache=join(cache, 'non-existent'))(server_url+'foo.txt') Traceback (most recent call last): UserError: The directory: '/download-cache/non-existent' to be used as a download cache doesn't exist. Using namespace sub-directories of the download cache ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ It is common to store cached copies of downloaded files within sub-directories of the download cache to keep some degree of order. 
For example, zc.buildout stores downloaded distributions in a sub-directory named "dist". Those sub-directories are also known as namespaces. So far, we haven't specified any namespaces to use, so the download utility stored files directly inside the download cache. Let's use a namespace "test" instead: >>> download = Download(cache=cache, namespace='test') >>> print download.cache_dir /download-cache/test The namespace sub-directory hasn't been created yet: >>> ls(cache) Downloading a file now creates the namespace sub-directory and places a copy of the file inside it: >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/test/foo.txt >>> ls(cache) d test >>> ls(cache, 'test') - foo.txt >>> cat(path) This is a foo text. >>> is_temp False The next time we want to download that file, the copy from inside the cache namespace is used. To see this clearly, we put a file with the same name but different content both on the server and in the cache's root directory: >>> write(server_data, 'foo.txt', 'The wrong text.') >>> write(cache, 'foo.txt', 'The wrong text.') >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/test/foo.txt >>> cat(path) This is a foo text. >>> rmdir(cache, 'test') >>> remove(cache, 'foo.txt') >>> write(server_data, 'foo.txt', 'This is a foo text.') Using a hash of the URL as the filename in the cache ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ So far, the base name of the downloaded file read from the URL has been used for the name of the cached copy of the file. This may not be desirable in some cases, for example when downloading files from different locations that have the same base name due to some naming convention, or if the file content depends on URL parameters. In such cases, an MD5 hash of the complete URL may be used as the filename in the cache: >>> download = Download(cache=cache, hash_name=True) >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/09f5793fcdc1716727f72d49519c688d >>> cat(path) This is a foo text. >>> ls(cache) - 09f5793fcdc1716727f72d49519c688d The path was printed just to illustrate matters; we cannot know the real checksum since we don't know which port the server happens to listen at when the test is run, so we don't actually know the full URL of the file. Let's check that the checksum actually belongs to the particular URL used: >>> path.lower() == join(cache, md5(server_url+'foo.txt').hexdigest()).lower() True The cached copy is used when downloading the file again: >>> write(server_data, 'foo.txt', 'The wrong text.') >>> (path, is_temp) == download(server_url+'foo.txt') True >>> cat(path) This is a foo text. >>> ls(cache) - 09f5793fcdc1716727f72d49519c688d If we change the URL, even in such a way that it keeps the base name of the file the same, the file will be downloaded again this time and put in the cache under a different name: >>> path2, is_temp = download(server_url+'other/foo.txt') >>> print path2 /download-cache/537b6d73267f8f4447586989af8c470e >>> path == path2 False >>> path2.lower() == join(cache, md5(server_url+'other/foo.txt').hexdigest()).lower() True >>> cat(path) This is a foo text. >>> cat(path2) The wrong text. 
>>> ls(cache) - 09f5793fcdc1716727f72d49519c688d - 537b6d73267f8f4447586989af8c470e >>> remove(path) >>> remove(path2) >>> write(server_data, 'foo.txt', 'This is a foo text.') Using the cache purely as a fall-back ------------------------------------- Sometimes it is desirable to try downloading a file from the net if at all possible, and use the cache purely as a fall-back option when a server is down or if we are in offline mode. This mode is only in effect if a download cache is configured in the first place: >>> download = Download(cache=cache, fallback=True) >>> print download.cache_dir /download-cache/ A downloaded file will be cached: >>> ls(cache) >>> path, is_temp = download(server_url+'foo.txt') >>> ls(cache) - foo.txt >>> cat(cache, 'foo.txt') This is a foo text. >>> is_temp False If the file cannot be served, the cached copy will be used: >>> remove(server_data, 'foo.txt') >>> try: Download()(server_url+'foo.txt') # doctest: +ELLIPSIS ... except: print 'download error' ... else: print 'woops' download error >>> path, is_temp = download(server_url+'foo.txt') >>> cat(path) This is a foo text. >>> is_temp False Similarly, if the file is served but we're in offline mode, we'll fall back to using the cache: >>> write(server_data, 'foo.txt', 'The wrong text.') >>> get(server_url+'foo.txt') 'The wrong text.' >>> offline_download = Download(cache=cache, offline=True, fallback=True) >>> path, is_temp = offline_download(server_url+'foo.txt') >>> print path /download-cache/foo.txt >>> cat(path) This is a foo text. >>> is_temp False However, when downloading the file normally with the cache being used in fall-back mode, the file will be downloaded from the net and the cached copy will be replaced with the new content: >>> cat(download(server_url+'foo.txt')[0]) The wrong text. >>> cat(cache, 'foo.txt') The wrong text. When trying to download a resource whose checksum does not match, the cached copy will neither be used nor overwritten: >>> write(server_data, 'foo.txt', 'This is a foo text.') >>> download(server_url+'foo.txt', md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch downloading 'http://localhost/foo.txt' >>> cat(cache, 'foo.txt') The wrong text. Configuring the download utility from buildout options ------------------------------------------------------ The configuration options explained so far derive from the build logic implemented by the calling code. Other options configure the download utility for use in a particular project or buildout run; they are read from the ``buildout`` configuration section. The latter can be passed directly as the first argument to the download utility's constructor. The location of the download cache is specified by the ``download-cache`` option: >>> download = Download({'download-cache': cache}, namespace='cmmi') >>> print download.cache_dir /download-cache/cmmi If the ``download-cache`` option specifies a relative path, it is understood relative to the current working directory, or to the buildout directory if that is given: >>> download = Download({'download-cache': 'relative-cache'}) >>> print download.cache_dir /sample-buildout/relative-cache/ >>> download = Download({'directory': join(sample_buildout, 'root'), ... 
'download-cache': 'relative-cache'}) >>> print download.cache_dir /sample-buildout/root/relative-cache/ Keyword parameters take precedence over the corresponding options: >>> download = Download({'download-cache': cache}, cache=None) >>> print download.cache_dir None Whether to assume offline mode can be inferred from either the ``offline`` or the ``install-from-cache`` option. As usual with zc.buildout, these options must assume one of the values 'true' and 'false': >>> download = Download({'offline': 'true'}) >>> download.offline True >>> download = Download({'offline': 'false'}) >>> download.offline False >>> download = Download({'install-from-cache': 'true'}) >>> download.offline True >>> download = Download({'install-from-cache': 'false'}) >>> download.offline False These two options are combined using logical 'or': >>> download = Download({'offline': 'true', 'install-from-cache': 'false'}) >>> download.offline True >>> download = Download({'offline': 'false', 'install-from-cache': 'true'}) >>> download.offline True The ``offline`` keyword parameter takes precedence over both the ``offline`` and ``install-from-cache`` options: >>> download = Download({'offline': 'true'}, offline=False) >>> download.offline False >>> download = Download({'install-from-cache': 'false'}, offline=True) >>> download.offline True Regressions ----------- MD5 checksum calculation needs to be reliable on all supported systems, which requires text files to be treated as binary to avoid implicit line-ending conversions: >>> text = 'First line of text.\r\nSecond line.\r\n' >>> f = open(join(server_data, 'foo.txt'), 'wb') >>> f.write(text) >>> f.close() >>> path, is_temp = Download()(server_url+'foo.txt', md5(text).hexdigest()) >>> remove(path) When "downloading" a directory given by file-system path or ``file:`` URL and using a download cache at the same time, the cached directory wasn't handled correctly. Consequently, the cache was defeated and an attempt to cache the directory a second time broke. This is how it should work: >>> download = Download(cache=cache) >>> dirpath = join(server_data, 'some_directory') >>> mkdir(dirpath) >>> dest, _ = download(dirpath) If we now modify the source tree, the second download will produce the original one from the cache: >>> mkdir(join(dirpath, 'foo')) >>> ls(dirpath) d foo >>> dest, _ = download(dirpath) >>> ls(dest) Clean up -------- We should have cleaned up all temporary files created by downloading things: >>> ls(tempfile.tempdir) Reset the global temporary directory: >>> tempfile.tempdir = old_tempdir zc.buildout-1.7.1/src/zc/buildout/downloadcache.txt0000644000076500007650000001040112111414155021731 0ustar jimjim00000000000000Using a download cache ====================== Normally, when distributions are installed, if any processing is needed, they are downloaded from the internet to a temporary directory and then installed from there. A download cache can be used to avoid the download step. This can be useful to reduce network access and to create source distributions of an entire buildout. The buildout download-cache option can be used to specify a directory to be used as a download cache. In this example, we'll create a directory to hold the cache: >>> cache = tmpdir('cache') And set up a buildout that downloads some eggs: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... download-cache = %(cache)s ... find-links = %(link_server)s ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = demo ==0.2 ... 
''' % globals()) We specified a link server that has some distributions available for download: >>> print get(link_server), bigdemo-0.1-py2.4.egg
demo-0.1-py2.4.egg
demo-0.2-py2.4.egg
demo-0.3-py2.4.egg
demo-0.4c1-py2.4.egg
demoneeded-1.0.zip
demoneeded-1.1.zip
demoneeded-1.2c1.zip
extdemo-1.4.zip
index/
other-1.0-py2.4.egg
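As an aside, a download cache does not have to be configured per project.  It
is common to point all of a user's buildouts at one shared cache by putting
the same option into the per-user defaults file, ``~/.buildout/default.cfg``;
a sketch, with an arbitrary path::

    [buildout]
    download-cache = /home/user/.buildout/download-cache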
We'll enable logging on the link server so we can see what's going on: >>> get(link_server+'enable_server_logging') GET 200 /enable_server_logging '' We also specified a download cache. If we run the buildout, we'll see the eggs installed from the link server as usual: >>> print system(buildout), GET 200 / GET 200 /demo-0.2-py2.4.egg GET 200 /demoneeded-1.2c1.zip Installing eggs. Getting distribution for 'demo==0.2'. Got demo 0.2. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. Generated script '/sample-buildout/bin/demo'. We'll also get the download cache populated. The buildout doesn't put files in the cache directly. It creates an intermediate directory, dist: >>> ls(cache) d dist >>> ls(cache, 'dist') - demo-0.2-py2.4.egg - demoneeded-1.2c1.zip If we remove the installed eggs from eggs directory and re-run the buildout: >>> import os >>> for f in os.listdir('eggs'): ... if f.startswith('demo'): ... remove('eggs', f) >>> print system(buildout), GET 200 / Updating eggs. Getting distribution for 'demo==0.2'. Got demo 0.2. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. We see that the distributions aren't downloaded, because they're downloaded from the cache. Installing solely from a download cache --------------------------------------- A download cache can be used as the basis of application source releases. In an application source release, we want to distribute an application that can be built without making any network accesses. In this case, we distribute a buildout with download cache and tell the buildout to install from the download cache only, without making network accesses. The buildout install-from-cache option can be used to signal that packages should be installed only from the download cache. Let's remove our installed eggs and run the buildout with the install-from-cache option set to true: >>> for f in os.listdir('eggs'): ... if f.startswith('demo'): ... remove('eggs', f) >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... download-cache = %(cache)s ... install-from-cache = true ... find-links = %(link_server)s ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = demo ... ''' % globals()) >>> print system(buildout), Uninstalling eggs. Installing eggs. Getting distribution for 'demo'. Got demo 0.2. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. Generated script '/sample-buildout/bin/demo'. zc.buildout-1.7.1/src/zc/buildout/easy_install.py0000644000076500007650000022434012111414155021447 0ustar jimjim00000000000000############################################################################# # # Copyright (c) 2005 Zope Foundation and Contributors. # All Rights Reserved. # # This software is subject to the provisions of the Zope Public License, # Version 2.1 (ZPL). A copy of the ZPL should accompany this distribution. # THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED # WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED # WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS # FOR A PARTICULAR PURPOSE. # ############################################################################## """Python easy_install API This module provides a high-level Python API for installing packages. It doesn't install scripts. It uses setuptools and requires it to be installed. 
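A rough sketch of calling the module-level ``install`` function (the
requirement, destination directory and find-links URL below are placeholders,
not part of the API):

    import zc.buildout.easy_install
    ws = zc.buildout.easy_install.install(
        ['demo==0.2'],                       # requirement specifications
        '/path/to/eggs',                     # directory to install eggs into
        links=['http://example.com/links'],  # optional find-links locations
        )
    # ws is a pkg_resources working set holding the requested
    # distributions and their dependencies.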
""" import distutils.errors import fnmatch import glob import logging import os import pkg_resources import py_compile import re import setuptools.archive_util import setuptools.command.setopt import setuptools.package_index import shutil import struct import subprocess import sys import tempfile import warnings import zc.buildout import zipimport _oprp = getattr(os.path, 'realpath', lambda path: path) def realpath(path): return os.path.normcase(os.path.abspath(_oprp(path))) default_index_url = os.environ.get( 'buildout-testing-index-url', 'http://pypi.python.org/simple', ) logger = logging.getLogger('zc.buildout.easy_install') url_match = re.compile('[a-z0-9+.-]+://').match is_win32 = sys.platform == 'win32' is_64 = struct.calcsize("P") == 8 is_jython = sys.platform.startswith('java') is_distribute = ( pkg_resources.Requirement.parse('setuptools').key=='distribute') BROKEN_DASH_S_WARNING = ( 'Buildout has been asked to exclude or limit site-packages so that ' 'builds can be repeatable when using a system Python. However, ' 'the chosen Python executable has a broken implementation of -S (see ' 'https://bugs.launchpad.net/virtualenv/+bug/572545 for an example ' "problem) and this breaks buildout's ability to isolate site-packages. " "If the executable already has a clean site-packages (e.g., " "using virtualenv's ``--no-site-packages`` option) you may be getting " 'equivalent repeatability. To silence this warning, use the -s argument ' 'to the buildout script. Alternatively, use a Python executable with a ' 'working -S (such as a standard Python binary).') if is_jython: import java.lang.System jython_os_name = (java.lang.System.getProperties()['os.name']).lower() setuptools_loc = pkg_resources.working_set.find( pkg_resources.Requirement.parse('setuptools') ).location # Include buildout and setuptools eggs in paths. We prevent dupes just to # keep from duplicating any log messages about them. buildout_loc = pkg_resources.working_set.find( pkg_resources.Requirement.parse('zc.buildout')).location buildout_and_setuptools_path = [setuptools_loc] if os.path.normpath(setuptools_loc) != os.path.normpath(buildout_loc): buildout_and_setuptools_path.append(buildout_loc) def _has_broken_dash_S(executable): """Detect https://bugs.launchpad.net/virtualenv/+bug/572545 .""" # The first attempt here was to simply have the executable attempt to import # ConfigParser and return the return code. That worked except for tests on # Windows, where the return code was wrong for the fake Python executable # generated by the virtualenv.txt test, apparently because setuptools' .exe # file does not pass the -script.py's returncode back properly, at least in # some circumstances. Therefore...print statements. stdout, stderr = subprocess.Popen( [executable, '-S', '-c', 'try:\n' ' import ConfigParser\n' 'except ImportError:\n' ' print 1\n' 'else:\n' ' print 0\n'], stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate() return bool(int(stdout.strip())) def _get_system_paths(executable): """Return lists of standard lib and site paths for executable. """ # We want to get a list of the site packages, which is not easy. # The canonical way to do this is to use # distutils.sysconfig.get_python_lib(), but that only returns a # single path, which does not reflect reality for many system # Pythons, which have multiple additions. Instead, we start Python # with -S, which does not import site.py and set up the extra paths # like site-packages or (Ubuntu/Debian) dist-packages and # python-support. 
We then compare that sys.path with the normal one # (minus user packages if this is Python 2.6, because we don't # support those (yet?). The set of the normal one minus the set of # the ones in ``python -S`` is the set of packages that are # effectively site-packages. # # The given executable might not be the current executable, so it is # appropriate to do another subprocess to figure out what the # additional site-package paths are. Moreover, even if this # executable *is* the current executable, this code might be run in # the context of code that has manipulated the sys.path--for # instance, to add local zc.buildout or setuptools eggs. def get_sys_path(*args, **kwargs): cmd = [executable] cmd.extend(args) cmd.extend([ "-c", "import sys, os;" "print repr([os.path.normpath(p) for p in sys.path if p])"]) # Windows needs some (as yet to be determined) part of the real env. env = os.environ.copy() # We need to make sure that PYTHONPATH, which will often be set # to include a custom buildout-generated site.py, is not set, or # else we will not get an accurate sys.path for the executable. env.pop('PYTHONPATH', None) env.update(kwargs) _proc = subprocess.Popen( cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, env=env) stdout, stderr = _proc.communicate(); if _proc.returncode: raise RuntimeError( 'error trying to get system packages:\n%s' % (stderr,)) res = eval(stdout.strip()) try: res.remove('.') except ValueError: pass return res stdlib = get_sys_path('-S') # stdlib only no_user_paths = get_sys_path(PYTHONNOUSERSITE='x') site_paths = [p for p in no_user_paths if p not in stdlib] return (stdlib, site_paths) def _get_version_info(executable): cmd = [executable, '-Sc', 'import sys; print(repr(tuple(x for x in sys.version_info)))'] _proc = subprocess.Popen( cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE) stdout, stderr = _proc.communicate(); if _proc.returncode: raise RuntimeError( 'error trying to get system packages:\n%s' % (stderr,)) return eval(stdout.strip()) _versions = {sys.executable: '%d.%d' % sys.version_info[:2]} def _get_version(executable): try: return _versions[executable] except KeyError: cmd = _safe_arg(executable) + ' -V' p = subprocess.Popen(cmd, shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, close_fds=not is_win32) i, o = (p.stdin, p.stdout) i.close() version = o.read().strip() o.close() pystring, version = version.split() assert pystring == 'Python' version = re.match('(\d[.]\d)([.].*\d)?$', version).group(1) _versions[executable] = version return version FILE_SCHEME = re.compile('file://', re.I).match class AllowHostsPackageIndex(setuptools.package_index.PackageIndex): """Will allow urls that are local to the system. No matter what is allow_hosts. """ def url_ok(self, url, fatal=False): if FILE_SCHEME(url): return True return setuptools.package_index.PackageIndex.url_ok(self, url, False) _indexes = {} def _get_index(executable, index_url, find_links, allow_hosts=('*',), path=None): # If path is None, the index will use sys.path. If you provide an empty # path ([]), it will complain uselessly about missing index pages for # packages found in the paths that you expect to use. Therefore, this path # is always the same as the _env path in the Installer. 
key = executable, index_url, tuple(find_links) index = _indexes.get(key) if index is not None: return index if index_url is None: index_url = default_index_url index = AllowHostsPackageIndex( index_url, hosts=allow_hosts, search_path=path, python=_get_version(executable) ) if find_links: index.add_find_links(find_links) _indexes[key] = index return index clear_index_cache = _indexes.clear if is_win32: # work around spawn lamosity on windows # XXX need safe quoting (see the subprocess.list2cmdline) and test def _safe_arg(arg): return '"%s"' % arg else: _safe_arg = str # The following string is used to run easy_install in # Installer._call_easy_install. It is usually started with python -S # (that is, don't import site at start). That flag, and all of the code # in this snippet above the last two lines, exist to work around a # relatively rare problem. If # # - your buildout configuration is trying to install a package that is within # a namespace package, and # # - you use a Python that has a different version of this package # installed in in its site-packages using # --single-version-externally-managed (that is, using the mechanism # sometimes used by system packagers: # http://peak.telecommunity.com/DevCenter/setuptools#install-command ), and # # - the new package tries to do sys.path tricks in the setup.py to get a # __version__, # # then the older package will be loaded first, making the setup version # the wrong number. While very arguably packages simply shouldn't do # the sys.path tricks, some do, and we don't want buildout to fall over # when they do. # # The namespace packages installed in site-packages with # --single-version-externally-managed use a mechanism that cause them to # be processed when site.py is imported (see # http://mail.python.org/pipermail/distutils-sig/2009-May/011730.html # for another description of the problem). Simply starting Python with # -S addresses the problem in Python 2.4 and 2.5, but Python 2.6's # distutils imports a value from the site module, so we unfortunately # have to do more drastic surgery in the _easy_install_preface code below. # # Here's an example of the .pth files created by setuptools when using that # flag: # # import sys,new,os; # p = os.path.join(sys._getframe(1).f_locals['sitedir'], *('',)); # ie = os.path.exists(os.path.join(p,'__init__.py')); # m = not ie and sys.modules.setdefault('',new.module('')); # mp = (m or []) and m.__dict__.setdefault('__path__',[]); # (p not in mp) and mp.append(p) # # The code, below, then, runs under -S, indicating that site.py should # not be loaded initially. It gets the initial sys.path under these # circumstances, and then imports site (because Python 2.6's distutils # will want it, as mentioned above). It then reinstates the old sys.path # value. Then it removes namespace packages (created by the setuptools # code above) from sys.modules. It identifies namespace packages by # iterating over every loaded module. It first looks if there is a # __path__, so it is a package; and then it sees if that __path__ does # not have an __init__.py. (Note that PEP 382, # http://www.python.org/dev/peps/pep-0382, makes it possible to have a # namespace package that has an __init__.py, but also should make it # unnecessary for site.py to preprocess these packages, so it should be # fine, as far as can be guessed as of this writing.) Finally, it # imports easy_install and runs it. 
_easy_install_preface = '''\ import sys,os;\ p = sys.path[:];\ import site;\ sys.path[:] = p;\ [sys.modules.pop(k) for k, v in sys.modules.items()\ if hasattr(v, '__path__') and len(v.__path__)==1 and\ not os.path.exists(os.path.join(v.__path__[0],'__init__.py'))];''' _easy_install_cmd = ( 'from setuptools.command.easy_install import main;main()') class Installer: _versions = {} _download_cache = None _install_from_cache = False _prefer_final = True _use_dependency_links = True _allow_picked_versions = True _always_unzip = False _include_site_packages = True _allowed_eggs_from_site_packages = ('*',) def __init__(self, dest=None, links=(), index=None, executable=sys.executable, always_unzip=None, path=None, newest=True, versions=None, use_dependency_links=None, allow_hosts=('*',), include_site_packages=None, allowed_eggs_from_site_packages=None, prefer_final=None, ): self._dest = dest self._allow_hosts = allow_hosts if self._install_from_cache: if not self._download_cache: raise ValueError("install_from_cache set to true with no" " download cache") links = () index = 'file://' + self._download_cache if use_dependency_links is not None: self._use_dependency_links = use_dependency_links if prefer_final is not None: self._prefer_final = prefer_final self._links = links = list(_fix_file_links(links)) if self._download_cache and (self._download_cache not in links): links.insert(0, self._download_cache) self._index_url = index self._executable = executable self._has_broken_dash_S = _has_broken_dash_S(self._executable) if always_unzip is not None: self._always_unzip = always_unzip path = (path and path[:] or []) if include_site_packages is not None: self._include_site_packages = include_site_packages if allowed_eggs_from_site_packages is not None: self._allowed_eggs_from_site_packages = tuple( allowed_eggs_from_site_packages) if self._has_broken_dash_S: if (not self._include_site_packages or self._allowed_eggs_from_site_packages != ('*',)): # We can't do this if the executable has a broken -S. warnings.warn(BROKEN_DASH_S_WARNING) self._include_site_packages = True self._allowed_eggs_from_site_packages = ('*',) self._easy_install_cmd = _easy_install_cmd else: self._easy_install_cmd = _easy_install_preface + _easy_install_cmd self._easy_install_cmd = _safe_arg(self._easy_install_cmd) stdlib, self._site_packages = _get_system_paths(executable) version_info = _get_version_info(executable) if version_info == sys.version_info: # Maybe we can add the buildout and setuptools path. If we # are including site_packages, we only have to include the extra # bits here, so we don't duplicate. On the other hand, if we # are not including site_packages, we only want to include the # parts that are not in site_packages, so the code is the same. path.extend( set(buildout_and_setuptools_path).difference( self._site_packages)) if self._include_site_packages: path.extend(self._site_packages) if dest is not None and dest not in path: path.insert(0, dest) self._path = path if self._dest is None: newest = False self._newest = newest self._env = pkg_resources.Environment(path, python=_get_version(executable)) self._index = _get_index(executable, index, links, self._allow_hosts, self._path) if versions is not None: self._versions = versions _allowed_eggs_from_site_packages_regex = None def allow_site_package_egg(self, name): if (not self._include_site_packages or not self._allowed_eggs_from_site_packages): # If the answer is a blanket "no," perform a shortcut. 
return False if self._allowed_eggs_from_site_packages_regex is None: pattern = '(%s)' % ( '|'.join( fnmatch.translate(name) for name in self._allowed_eggs_from_site_packages), ) self._allowed_eggs_from_site_packages_regex = re.compile(pattern) return bool(self._allowed_eggs_from_site_packages_regex.match(name)) def _satisfied(self, req, source=None): # We get all distributions that match the given requirement. If we are # not supposed to include site-packages for the given egg, we also # filter those out. Even if include_site_packages is False and so we # have excluded site packages from the _env's paths (see # Installer.__init__), we need to do the filtering here because an # .egg-link, such as one for setuptools or zc.buildout installed by # zc.buildout.buildout.Buildout.bootstrap, can indirectly include a # path in our _site_packages. dists = [dist for dist in self._env[req.project_name] if ( dist in req and ( dist.location not in self._site_packages or self.allow_site_package_egg(dist.project_name)) ) ] if not dists: logger.debug('We have no distributions for %s that satisfies %r.', req.project_name, str(req)) return None, self._obtain(req, source) # Note that dists are sorted from best to worst, as promised by # env.__getitem__ for dist in dists: if (dist.precedence == pkg_resources.DEVELOP_DIST and dist.location not in self._site_packages): # System eggs are sometimes installed as develop eggs. # Those are not the kind of develop eggs we are looking for # here: we want ones that the buildout itself has locally as # develop eggs. logger.debug('We have a develop egg: %s', dist) return dist, None # Special common case, we have a specification for a single version: specs = req.specs if len(specs) == 1 and specs[0][0] == '==': logger.debug('We have the distribution that satisfies %r.', str(req)) return dists[0], None if self._prefer_final: fdists = [dist for dist in dists if _final_version(dist.parsed_version) ] if fdists: # There are final dists, so only use those dists = fdists if not self._newest: # We don't need the newest, so we'll use the newest one we # find, which is the first returned by # Environment.__getitem__. return dists[0], None best_we_have = dists[0] # Because dists are sorted from best to worst # We have some installed distros. There might, theoretically, be # newer ones. Let's find out which ones are available and see if # any are newer. We only do this if we're willing to install # something, which is only true if dest is not None: if self._dest is not None: best_available = self._obtain(req, source) else: best_available = None if best_available is None: # That's a bit odd. There aren't any distros available. # We should use the best one we have that meets the requirement. 
logger.debug( 'There are no distros available that meet %r.\n' 'Using our best, %s.', str(req), best_available) return best_we_have, None if self._prefer_final: if _final_version(best_available.parsed_version): if _final_version(best_we_have.parsed_version): if (best_we_have.parsed_version < best_available.parsed_version ): return None, best_available else: return None, best_available else: if (not _final_version(best_we_have.parsed_version) and (best_we_have.parsed_version < best_available.parsed_version ) ): return None, best_available else: if (best_we_have.parsed_version < best_available.parsed_version ): return None, best_available logger.debug( 'We have the best distribution that satisfies %r.', str(req)) return best_we_have, None def _load_dist(self, dist): dists = pkg_resources.Environment( dist.location, python=_get_version(self._executable), )[dist.project_name] assert len(dists) == 1 return dists[0] def _call_easy_install(self, spec, ws, dest, dist): tmp = tempfile.mkdtemp(dir=dest) try: path = setuptools_loc args = ('-c', self._easy_install_cmd, '-mUNxd', _safe_arg(tmp)) if not self._has_broken_dash_S: args = ('-S',) + args if self._always_unzip: args += ('-Z', ) level = logger.getEffectiveLevel() if level > 0: args += ('-q', ) elif level < 0: args += ('-v', ) args += (_safe_arg(spec), ) if level <= logging.DEBUG: logger.debug('Running easy_install:\n%s "%s"\npath=%s\n', self._executable, '" "'.join(args), path) if is_jython: extra_env = dict(os.environ, PYTHONPATH=path) else: args += (dict(os.environ, PYTHONPATH=path), ) sys.stdout.flush() # We want any pending output first if is_jython: exit_code = subprocess.Popen( [_safe_arg(self._executable)] + list(args), env=extra_env).wait() else: exit_code = os.spawnle( os.P_WAIT, self._executable, _safe_arg (self._executable), *args) dists = [] env = pkg_resources.Environment( [tmp], python=_get_version(self._executable), ) for project in env: dists.extend(env[project]) if exit_code: logger.error( "An error occurred when trying to install %s. " "Look above this message for any errors that " "were output by easy_install.", dist) if not dists: raise zc.buildout.UserError("Couldn't install: %s" % dist) if len(dists) > 1: logger.warn("Installing %s\n" "caused multiple distributions to be installed:\n" "%s\n", dist, '\n'.join(map(str, dists))) else: d = dists[0] if d.project_name != dist.project_name: logger.warn("Installing %s\n" "Caused installation of a distribution:\n" "%s\n" "with a different project name.", dist, d) if d.version != dist.version: logger.warn("Installing %s\n" "Caused installation of a distribution:\n" "%s\n" "with a different version.", dist, d) result = [] for d in dists: newloc = os.path.join(dest, os.path.basename(d.location)) if os.path.exists(newloc): if os.path.isdir(newloc): shutil.rmtree(newloc) else: os.remove(newloc) os.rename(d.location, newloc) [d] = pkg_resources.Environment( [newloc], python=_get_version(self._executable), )[d.project_name] result.append(d) return result finally: shutil.rmtree(tmp) def _obtain(self, requirement, source=None): # initialize out index for this project: index = self._index if index.obtain(requirement) is None: # Nothing is available. return None # Filter the available dists for the requirement and source flag. If # we are not supposed to include site-packages for the given egg, we # also filter those out. 
Even if include_site_packages is False and so # we have excluded site packages from the _env's paths (see # Installer.__init__), we need to do the filtering here because an # .egg-link, such as one for setuptools or zc.buildout installed by # zc.buildout.buildout.Buildout.bootstrap, can indirectly include a # path in our _site_packages. dists = [dist for dist in index[requirement.project_name] if ( dist in requirement and ( dist.location not in self._site_packages or self.allow_site_package_egg(dist.project_name)) and ( (not source) or (dist.precedence == pkg_resources.SOURCE_DIST)) ) ] # If we prefer final dists, filter for final and use the # result if it is non empty. if self._prefer_final: fdists = [dist for dist in dists if _final_version(dist.parsed_version) ] if fdists: # There are final dists, so only use those dists = fdists # Now find the best one: best = [] bestv = () for dist in dists: distv = dist.parsed_version if distv > bestv: best = [dist] bestv = distv elif distv == bestv: best.append(dist) if not best: return None if len(best) == 1: return best[0] if self._download_cache: for dist in best: if (realpath(os.path.dirname(dist.location)) == self._download_cache ): return dist best.sort() return best[-1] def _fetch(self, dist, tmp, download_cache): if (download_cache and (realpath(os.path.dirname(dist.location)) == download_cache) ): return dist new_location = self._index.download(dist.location, tmp) if (download_cache and (realpath(new_location) == realpath(dist.location)) and os.path.isfile(new_location) ): # setuptools avoids making extra copies, but we want to copy # to the download cache shutil.copy2(new_location, tmp) new_location = os.path.join(tmp, os.path.basename(new_location)) return dist.clone(location=new_location) def _get_dist(self, requirement, ws, always_unzip): __doing__ = 'Getting distribution for %r.', str(requirement) # Maybe an existing dist is already the best dist that satisfies the # requirement dist, avail = self._satisfied(requirement) if dist is None: if self._dest is not None: logger.info(*__doing__) # Retrieve the dist: if avail is None: raise MissingDistribution(requirement, ws) # We may overwrite distributions, so clear importer # cache. sys.path_importer_cache.clear() tmp = self._download_cache if tmp is None: tmp = tempfile.mkdtemp('get_dist') try: dist = self._fetch(avail, tmp, self._download_cache) if dist is None: raise zc.buildout.UserError( "Couldn't download distribution %s." % avail) if dist.precedence == pkg_resources.EGG_DIST: # It's already an egg, just fetch it into the dest newloc = os.path.join( self._dest, os.path.basename(dist.location)) if os.path.isdir(dist.location): # we got a directory. It must have been # obtained locally. Just copy it. shutil.copytree(dist.location, newloc) else: if self._always_unzip: should_unzip = True else: metadata = pkg_resources.EggMetadata( zipimport.zipimporter(dist.location) ) should_unzip = ( metadata.has_metadata('not-zip-safe') or not metadata.has_metadata('zip-safe') ) if should_unzip: setuptools.archive_util.unpack_archive( dist.location, newloc) else: shutil.copyfile(dist.location, newloc) redo_pyc(newloc) # Getting the dist from the environment causes the # distribution meta data to be read. Cloning isn't # good enough. dists = pkg_resources.Environment( [newloc], python=_get_version(self._executable), )[dist.project_name] else: # It's some other kind of dist. 
We'll let easy_install # deal with it: dists = self._call_easy_install( dist.location, ws, self._dest, dist) for dist in dists: redo_pyc(dist.location) finally: if tmp != self._download_cache: shutil.rmtree(tmp) self._env.scan([self._dest]) dist = self._env.best_match(requirement, ws) logger.info("Got %s.", dist) else: dists = [dist] for dist in dists: if (dist.has_metadata('dependency_links.txt') and not self._install_from_cache and self._use_dependency_links ): for link in dist.get_metadata_lines('dependency_links.txt'): link = link.strip() if link not in self._links: logger.debug('Adding find link %r from %s', link, dist) self._links.append(link) self._index = _get_index(self._executable, self._index_url, self._links, self._allow_hosts, self._path) for dist in dists: # Check whether we picked a version and, if we did, report it: if not ( dist.precedence == pkg_resources.DEVELOP_DIST or (len(requirement.specs) == 1 and requirement.specs[0][0] == '==') ): logger.debug('Picked: %s = %s', dist.project_name, dist.version) if not self._allow_picked_versions: raise zc.buildout.UserError( 'Picked: %s = %s' % (dist.project_name, dist.version) ) return dists def _maybe_add_setuptools(self, ws, dist): if dist.has_metadata('namespace_packages.txt'): for r in dist.requires(): if r.project_name in ('setuptools', 'distribute'): break else: # We have a namespace package but no requirement for setuptools if dist.precedence == pkg_resources.DEVELOP_DIST: logger.warn( "Develop distribution: %s\n" "uses namespace packages but the distribution " "does not require setuptools.", dist) requirement = self._constrain( pkg_resources.Requirement.parse('setuptools') ) if ws.find(requirement) is None: for dist in self._get_dist(requirement, ws, False): ws.add(dist) def _constrain(self, requirement): if is_distribute and requirement.key == 'setuptools': requirement = pkg_resources.Requirement.parse('distribute') constraint = self._versions.get(requirement.project_name) if constraint: requirement = _constrained_requirement(constraint, requirement) return requirement def install(self, specs, working_set=None): logger.debug('Installing %s.', repr(specs)[1:-1]) path = self._path destination = self._dest if destination is not None and destination not in path: path.insert(0, destination) requirements = [self._constrain(pkg_resources.Requirement.parse(spec)) for spec in specs] if working_set is None: ws = pkg_resources.WorkingSet([]) else: ws = working_set for requirement in requirements: for dist in self._get_dist(requirement, ws, self._always_unzip): ws.add(dist) self._maybe_add_setuptools(ws, dist) # OK, we have the requested distributions and they're in the working # set, but they may have unmet requirements. We'll resolve these # requirements. This is code modified from # pkg_resources.WorkingSet.resolve. We can't reuse that code directly # because we have to constrain our requirements (see # versions_section_ignored_for_dependency_in_favor_of_site_packages in # zc.buildout.tests). requirements.reverse() # Set up the stack. processed = {} # This is a set of processed requirements. best = {} # This is a mapping of key -> dist. # Note that we don't use the existing environment, because we want # to look for new eggs unless what we have is the best that # matches the requirement. env = pkg_resources.Environment(ws.entries) while requirements: # Process dependencies breadth-first. req = self._constrain(requirements.pop(0)) if req in processed: # Ignore cyclic or redundant dependencies. 
continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map. dist = ws.by_key.get(req.key) if dist is None: try: dist = best[req.key] = env.best_match(req, ws) except pkg_resources.VersionConflict, err: raise VersionConflict(err, ws) if dist is None or ( dist.location in self._site_packages and not self.allow_site_package_egg(dist.project_name)): # If we didn't find a distribution in the # environment, or what we found is from site # packages and not allowed to be there, try # again. if destination: logger.debug('Getting required %r', str(req)) else: logger.debug('Adding required %r', str(req)) _log_requirement(ws, req) for dist in self._get_dist(req, ws, self._always_unzip): ws.add(dist) self._maybe_add_setuptools(ws, dist) if dist not in req: # Oops, the "best" so far conflicts with a dependency. raise VersionConflict( pkg_resources.VersionConflict(dist, req), ws) requirements.extend(dist.requires(req.extras)[::-1]) processed[req] = True if dist.location in self._site_packages: logger.debug('Egg from site-packages: %s', dist) return ws def build(self, spec, build_ext): requirement = self._constrain(pkg_resources.Requirement.parse(spec)) dist, avail = self._satisfied(requirement, 1) if dist is not None: return [dist.location] # Retrieve the dist: if avail is None: raise zc.buildout.UserError( "Couldn't find a source distribution for %r." % str(requirement)) logger.debug('Building %r', spec) tmp = self._download_cache if tmp is None: tmp = tempfile.mkdtemp('get_dist') try: dist = self._fetch(avail, tmp, self._download_cache) build_tmp = tempfile.mkdtemp('build') try: setuptools.archive_util.unpack_archive(dist.location, build_tmp) if os.path.exists(os.path.join(build_tmp, 'setup.py')): base = build_tmp else: setups = glob.glob( os.path.join(build_tmp, '*', 'setup.py')) if not setups: raise distutils.errors.DistutilsError( "Couldn't find a setup script in %s" % os.path.basename(dist.location) ) if len(setups) > 1: raise distutils.errors.DistutilsError( "Multiple setup scripts in %s" % os.path.basename(dist.location) ) base = os.path.dirname(setups[0]) setup_cfg = os.path.join(base, 'setup.cfg') if not os.path.exists(setup_cfg): f = open(setup_cfg, 'w') f.close() setuptools.command.setopt.edit_config( setup_cfg, dict(build_ext=build_ext)) dists = self._call_easy_install( base, pkg_resources.WorkingSet(), self._dest, dist) for dist in dists: redo_pyc(dist.location) return [dist.location for dist in dists] finally: shutil.rmtree(build_tmp) finally: if tmp != self._download_cache: shutil.rmtree(tmp) def default_versions(versions=None): old = Installer._versions if versions is not None: Installer._versions = versions return old def download_cache(path=-1): old = Installer._download_cache if path != -1: if path: path = realpath(path) Installer._download_cache = path return old def install_from_cache(setting=None): old = Installer._install_from_cache if setting is not None: Installer._install_from_cache = bool(setting) return old def prefer_final(setting=None): old = Installer._prefer_final if setting is not None: Installer._prefer_final = bool(setting) return old def include_site_packages(setting=None): old = Installer._include_site_packages if setting is not None: Installer._include_site_packages = bool(setting) return old def allowed_eggs_from_site_packages(setting=None): old = Installer._allowed_eggs_from_site_packages if setting is not None: Installer._allowed_eggs_from_site_packages = tuple(setting) return old def use_dependency_links(setting=None): 
old = Installer._use_dependency_links if setting is not None: Installer._use_dependency_links = bool(setting) return old def allow_picked_versions(setting=None): old = Installer._allow_picked_versions if setting is not None: Installer._allow_picked_versions = bool(setting) return old def always_unzip(setting=None): old = Installer._always_unzip if setting is not None: Installer._always_unzip = bool(setting) return old def install(specs, dest, links=(), index=None, executable=sys.executable, always_unzip=None, path=None, working_set=None, newest=True, versions=None, use_dependency_links=None, allow_hosts=('*',), include_site_packages=None, allowed_eggs_from_site_packages=None, prefer_final=None): installer = Installer( dest, links, index, executable, always_unzip, path, newest, versions, use_dependency_links, allow_hosts=allow_hosts, include_site_packages=include_site_packages, allowed_eggs_from_site_packages=allowed_eggs_from_site_packages, prefer_final=prefer_final) return installer.install(specs, working_set) def build(spec, dest, build_ext, links=(), index=None, executable=sys.executable, path=None, newest=True, versions=None, allow_hosts=('*',), include_site_packages=None, allowed_eggs_from_site_packages=None): installer = Installer( dest, links, index, executable, True, path, newest, versions, allow_hosts=allow_hosts, include_site_packages=include_site_packages, allowed_eggs_from_site_packages=allowed_eggs_from_site_packages) return installer.build(spec, build_ext) def _rm(*paths): for path in paths: if os.path.isdir(path): shutil.rmtree(path) elif os.path.exists(path): os.remove(path) def _copyeggs(src, dest, suffix, undo): result = [] undo.append(lambda : _rm(*result)) for name in os.listdir(src): if name.endswith(suffix): new = os.path.join(dest, name) _rm(new) os.rename(os.path.join(src, name), new) result.append(new) assert len(result) == 1, str(result) undo.pop() return result[0] def develop(setup, dest, build_ext=None, executable=sys.executable): if os.path.isdir(setup): directory = setup setup = os.path.join(directory, 'setup.py') else: directory = os.path.dirname(setup) undo = [] try: if build_ext: setup_cfg = os.path.join(directory, 'setup.cfg') if os.path.exists(setup_cfg): os.rename(setup_cfg, setup_cfg+'-develop-aside') def restore_old_setup(): if os.path.exists(setup_cfg): os.remove(setup_cfg) os.rename(setup_cfg+'-develop-aside', setup_cfg) undo.append(restore_old_setup) else: open(setup_cfg, 'w') undo.append(lambda: os.remove(setup_cfg)) setuptools.command.setopt.edit_config( setup_cfg, dict(build_ext=build_ext)) fd, tsetup = tempfile.mkstemp() undo.append(lambda: os.remove(tsetup)) undo.append(lambda: os.close(fd)) os.write(fd, runsetup_template % dict( setuptools=setuptools_loc, setupdir=directory, setup=setup, __file__ = setup, )) tmp3 = tempfile.mkdtemp('build', dir=dest) undo.append(lambda : shutil.rmtree(tmp3)) args = [ zc.buildout.easy_install._safe_arg(tsetup), '-q', 'develop', '-mxN', '-d', _safe_arg(tmp3), ] log_level = logger.getEffectiveLevel() if log_level <= 0: if log_level == 0: del args[1] else: args[1] == '-v' if log_level < logging.DEBUG: logger.debug("in: %r\n%s", directory, ' '.join(args)) # XXX It looks like someone tried to get clever with "bad" develop eggs, but this # currently fails on Windows #p = subprocess.Popen( # [_safe_arg(executable)] + args, stdout=subprocess.PIPE, stderr=subprocess.PIPE) #if p.wait() > 0: # raise zc.buildout.UserError("Installing develop egg failed: %s" % p.stderr.read()) if is_jython: assert 
subprocess.Popen([_safe_arg(executable)] + args).wait() == 0 else: assert os.spawnl(os.P_WAIT, executable, _safe_arg(executable), *args) == 0 return _copyeggs(tmp3, dest, '.egg-link', undo) finally: undo.reverse() [f() for f in undo] def working_set(specs, executable, path, include_site_packages=None, allowed_eggs_from_site_packages=None, prefer_final=None): return install( specs, None, executable=executable, path=path, include_site_packages=include_site_packages, allowed_eggs_from_site_packages=allowed_eggs_from_site_packages, prefer_final=prefer_final) ############################################################################ # Script generation functions def scripts( reqs, working_set, executable, dest, scripts=None, extra_paths=(), arguments='', interpreter=None, initialization='', relative_paths=False, ): """Generate scripts and/or an interpreter. See sitepackage_safe_scripts for a version that can be used with a Python that has code installed in site-packages. It has more options and a different approach. """ path = _get_path(working_set, extra_paths) if initialization: initialization = '\n'+initialization+'\n' generated = _generate_scripts( reqs, working_set, dest, path, scripts, relative_paths, initialization, executable, arguments) if interpreter: sname = os.path.join(dest, interpreter) spath, rpsetup = _relative_path_and_setup(sname, path, relative_paths) generated.extend( _pyscript(spath, sname, executable, rpsetup)) return generated # We need to give an alternate name to the ``scripts`` function so that it # can be referenced within sitepackage_safe_scripts, which uses ``scripts`` # as an argument name. _original_scripts_function = scripts def sitepackage_safe_scripts( dest, working_set, executable, site_py_dest, reqs=(), scripts=None, interpreter=None, extra_paths=(), initialization='', include_site_packages=False, exec_sitecustomize=False, relative_paths=False, script_arguments='', script_initialization='', ): """Generate scripts and/or an interpreter from a system Python. This accomplishes the same job as the ``scripts`` function, above, but it does so in an alternative way that allows safely including Python site packages, if desired, and choosing to execute the Python's sitecustomize. 
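    A rough sketch of a call (the directories and the requirement are
    illustrative; ``ws`` is a working set such as the one returned by
    ``install``):

        generated = sitepackage_safe_scripts(
            '/path/to/bin', ws, sys.executable, '/path/to/parts/py',
            reqs=['demo'], interpreter='py')
        # ``generated`` lists every file written: the custom site.py and
        # sitecustomize.py, the console scripts, and the interpreter script.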
""" if _has_broken_dash_S(executable): if not include_site_packages: warnings.warn(BROKEN_DASH_S_WARNING) return _original_scripts_function( reqs, working_set, executable, dest, scripts, extra_paths, script_arguments, interpreter, initialization, relative_paths) generated = [] generated.append(_generate_sitecustomize( site_py_dest, executable, initialization, exec_sitecustomize)) generated.append(_generate_site( site_py_dest, working_set, executable, extra_paths, include_site_packages, relative_paths)) script_initialization = _script_initialization_template % dict( site_py_dest=site_py_dest, script_initialization=script_initialization) if not script_initialization.endswith('\n'): script_initialization += '\n' generated.extend(_generate_scripts( reqs, working_set, dest, [site_py_dest], scripts, relative_paths, script_initialization, executable, script_arguments, block_site=True)) if interpreter: generated.extend(_generate_interpreter( interpreter, dest, executable, site_py_dest, relative_paths)) return generated _script_initialization_template = ''' import os path = sys.path[0] if os.environ.get('PYTHONPATH'): path = os.pathsep.join([path, os.environ['PYTHONPATH']]) os.environ['BUILDOUT_ORIGINAL_PYTHONPATH'] = os.environ.get('PYTHONPATH', '') os.environ['PYTHONPATH'] = path import site # imports custom buildout-generated site.py %(script_initialization)s''' # Utilities for the script generation functions. # These are shared by both ``scripts`` and ``sitepackage_safe_scripts`` def _get_path(working_set, extra_paths=()): """Given working set and extra paths, return a normalized path list.""" path = [dist.location for dist in working_set] path.extend(extra_paths) # order preserving unique unique_path = [] for p in path: if p not in unique_path: unique_path.append(p) return map(realpath, unique_path) def _generate_scripts(reqs, working_set, dest, path, scripts, relative_paths, initialization, executable, arguments, block_site=False): """Generate scripts for the given requirements. - reqs is an iterable of string requirements or entry points. - The requirements must be findable in the given working_set. - The dest is the directory in which the scripts should be created. - The path is a list of paths that should be added to sys.path. - The scripts is an optional dictionary. If included, the keys should be the names of the scripts that should be created, as identified in their entry points; and the values should be the name the script should actually be created with. - relative_paths, if given, should be the path that is the root of the buildout (the common path that should be the root of what is relative). 
""" if isinstance(reqs, str): raise TypeError('Expected iterable of requirements or entry points,' ' got string.') generated = [] entry_points = [] for req in reqs: if isinstance(req, str): req = pkg_resources.Requirement.parse(req) dist = working_set.find(req) for name in pkg_resources.get_entry_map(dist, 'console_scripts'): entry_point = dist.get_entry_info('console_scripts', name) entry_points.append( (name, entry_point.module_name, '.'.join(entry_point.attrs)) ) else: entry_points.append(req) for name, module_name, attrs in entry_points: if scripts is not None: sname = scripts.get(name) if sname is None: continue else: sname = name sname = os.path.join(dest, sname) spath, rpsetup = _relative_path_and_setup(sname, path, relative_paths) generated.extend( _script(sname, executable, rpsetup, spath, initialization, module_name, attrs, arguments, block_site=block_site)) return generated def _relative_path_and_setup(sname, path, relative_paths=False, indent_level=1, omit_os_import=False): """Return a string of code of paths and of setup if appropriate. - sname is the full path to the script name to be created. - path is the list of paths to be added to sys.path. - relative_paths, if given, should be the path that is the root of the buildout (the common path that should be the root of what is relative). - indent_level is the number of four-space indents that the path should insert before each element of the path. """ if relative_paths: relative_paths = os.path.normcase(relative_paths) sname = os.path.normcase(os.path.abspath(sname)) spath = _format_paths( [_relativitize(os.path.normcase(path_item), sname, relative_paths) for path_item in path], indent_level=indent_level) rpsetup = relative_paths_setup if not omit_os_import: rpsetup = '\n\nimport os\n' + rpsetup for i in range(_relative_depth(relative_paths, sname)): rpsetup += "\nbase = os.path.dirname(base)" else: spath = _format_paths((repr(p) for p in path), indent_level=indent_level) rpsetup = '' return spath, rpsetup def _relative_depth(common, path): """Return number of dirs separating ``path`` from ancestor, ``common``. For instance, if path is /foo/bar/baz/bing, and common is /foo, this will return 2--in UNIX, the number of ".." to get from bing's directory to foo. This is a helper for _relative_path_and_setup. """ n = 0 while 1: dirname = os.path.dirname(path) if dirname == path: raise AssertionError("dirname of %s is the same" % dirname) if dirname == common: break n += 1 path = dirname return n def _relative_path(common, path): """Return the relative path from ``common`` to ``path``. This is a helper for _relativitize, which is a helper to _relative_path_and_setup. """ r = [] while 1: dirname, basename = os.path.split(path) r.append(basename) if dirname == common: break if dirname == path: raise AssertionError("dirname of %s is the same" % dirname) path = dirname r.reverse() return os.path.join(*r) def _relativitize(path, script, relative_paths): """Return a code string for the given path. Path is relative to the base path ``relative_paths``if the common prefix between ``path`` and ``script`` starts with ``relative_paths``. 
""" if path == script: raise AssertionError("path == script") common = os.path.dirname(os.path.commonprefix([path, script])) if (common == relative_paths or common.startswith(os.path.join(relative_paths, '')) ): return "join(base, %r)" % _relative_path(common, path) else: return repr(path) relative_paths_setup = """ join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__)))""" def _write_script(full_name, contents, logged_type): """Write contents of script in full_name, logging the action. The only tricky bit in this function is that it supports Windows by creating exe files using a pkg_resources helper. """ generated = [] script_name = full_name if is_win32: script_name += '-script.py' # Generate exe file and give the script a magic name. exe = full_name + '.exe' if is_64: resource = 'cli-64.exe' else: resource = 'cli.exe' try: new_data = pkg_resources.resource_string('setuptools', resource) except IOError: # setuptools has just cli.exe, no matter if 64/32 bit new_data = pkg_resources.resource_string('setuptools', 'cli.exe') if not os.path.exists(exe) or (open(exe, 'rb').read() != new_data): # Only write it if it's different. open(exe, 'wb').write(new_data) generated.append(exe) changed = not (os.path.exists(script_name) and open(script_name).read() == contents) if changed: open(script_name, 'w').write(contents) try: os.chmod(script_name, 0755) except (AttributeError, os.error): pass logger.info("Generated %s %r.", logged_type, full_name) generated.append(script_name) return generated def _format_paths(paths, indent_level=1): """Format paths for inclusion in a script.""" separator = ',\n' + indent_level * ' ' return separator.join(paths) def _script(dest, executable, relative_paths_setup, path, initialization, module_name, attrs, arguments, block_site=False): if block_site: dash_S = ' -S' else: dash_S = '' contents = script_template % dict( python=_safe_arg(executable), dash_S=dash_S, path=path, module_name=module_name, attrs=attrs, arguments=arguments, initialization=initialization, relative_paths_setup=relative_paths_setup, ) return _write_script(dest, contents, 'script') if is_jython and jython_os_name == 'linux': script_header = '#!/usr/bin/env %(python)s%(dash_S)s' else: script_header = '#!%(python)s%(dash_S)s' sys_path_template = '''\ import sys sys.path[0:0] = [ %s, ] ''' script_template = script_header + '''\ %(relative_paths_setup)s import sys sys.path[0:0] = [ %(path)s, ] %(initialization)s import %(module_name)s if __name__ == '__main__': sys.exit(%(module_name)s.%(attrs)s(%(arguments)s)) ''' # These are used only by the older ``scripts`` function. 
def _pyscript(path, dest, executable, rsetup): contents = py_script_template % dict( python=_safe_arg(executable), dash_S='', path=path, relative_paths_setup=rsetup, ) return _write_script(dest, contents, 'interpreter') py_script_template = script_header + '''\ %(relative_paths_setup)s import sys sys.path[0:0] = [ %(path)s, ] _interactive = True if len(sys.argv) > 1: _options, _args = __import__("getopt").getopt(sys.argv[1:], 'ic:m:') _interactive = False for (_opt, _val) in _options: if _opt == '-i': _interactive = True elif _opt == '-c': exec _val elif _opt == '-m': sys.argv[1:] = _args _args = [] __import__("runpy").run_module( _val, {}, "__main__", alter_sys=True) if _args: sys.argv[:] = _args __file__ = _args[0] del _options, _args execfile(__file__) if _interactive: del _interactive __import__("code").interact(banner="", local=globals()) ''' # These are used only by the newer ``sitepackage_safe_scripts`` function. def _get_module_file(executable, name, silent=False): """Return a module's file path. - executable is a path to the desired Python executable. - name is the name of the (pure, not C) Python module. """ cmd = [executable, "-Sc", "import imp; " "fp, path, desc = imp.find_module(%r); " "fp.close(); " "print path" % (name,)] env = os.environ.copy() # We need to make sure that PYTHONPATH, which will often be set to # include a custom buildout-generated site.py, is not set, or else # we will not get an accurate value for the "real" site.py and # sitecustomize.py. env.pop('PYTHONPATH', None) _proc = subprocess.Popen( cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, env=env) stdout, stderr = _proc.communicate(); if _proc.returncode: if not silent: logger.info( 'Could not find file for module %s:\n%s', name, stderr) return None # else: ... res = stdout.strip() if res.endswith('.pyc') or res.endswith('.pyo'): raise RuntimeError('Cannot find uncompiled version of %s' % (name,)) if not os.path.exists(res): raise RuntimeError( 'File does not exist for module %s:\n%s' % (name, res)) return res def _generate_sitecustomize(dest, executable, initialization='', exec_sitecustomize=False): """Write a sitecustomize file with optional custom initialization. The created script will execute the underlying Python's sitecustomize if exec_sitecustomize is True. """ sitecustomize_path = os.path.join(dest, 'sitecustomize.py') sitecustomize = open(sitecustomize_path, 'w') if initialization: sitecustomize.write(initialization + '\n') if exec_sitecustomize: real_sitecustomize_path = _get_module_file( executable, 'sitecustomize', silent=True) if real_sitecustomize_path: real_sitecustomize = open(real_sitecustomize_path, 'r') sitecustomize.write( '\n# The following is from\n# %s\n' % (real_sitecustomize_path,)) sitecustomize.write(real_sitecustomize.read()) real_sitecustomize.close() sitecustomize.close() return sitecustomize_path def _generate_site(dest, working_set, executable, extra_paths=(), include_site_packages=False, relative_paths=False): """Write a site.py file with eggs from working_set. extra_paths will be added to the path. If include_site_packages is True, paths from the underlying Python will be added. 
""" path = _get_path(working_set, extra_paths) site_path = os.path.join(dest, 'site.py') original_path_setup = preamble = '' if include_site_packages: stdlib, site_paths = _get_system_paths(executable) # We want to make sure that paths from site-packages, such as those # allowed by allowed_eggs_from_site_packages, always come last, or # else site-packages paths may include packages that mask the eggs we # really want. path = [p for p in path if p not in site_paths] # Now we set up the code we need. original_path_setup = original_path_snippet % ( _format_paths((repr(p) for p in site_paths), 2),) distribution = working_set.find( pkg_resources.Requirement.parse('setuptools')) if distribution is not None: # We need to worry about namespace packages. if relative_paths: location = _relativitize( distribution.location, os.path.normcase(os.path.abspath(site_path)), relative_paths) else: location = repr(distribution.location) preamble = namespace_include_site_packages_setup % (location,) original_path_setup = ( addsitedir_namespace_originalpackages_snippet + original_path_setup) else: preamble = '\n setuptools_path = None' egg_path_string, relative_preamble = _relative_path_and_setup( site_path, path, relative_paths, indent_level=2, omit_os_import=True) if relative_preamble: relative_preamble = '\n'.join( [(line and ' %s' % (line,) or line) for line in relative_preamble.split('\n')]) preamble = relative_preamble + preamble addsitepackages_marker = 'def addsitepackages(' enableusersite_marker = 'ENABLE_USER_SITE = ' successful_rewrite = False real_site_path = _get_module_file(executable, 'site') real_site = open(real_site_path, 'r') site = open(site_path, 'w') try: for line in real_site.readlines(): if line.startswith(enableusersite_marker): site.write(enableusersite_marker) site.write('False # buildout does not support user sites.\n') elif line.startswith(addsitepackages_marker): site.write(addsitepackages_script % ( preamble, egg_path_string, original_path_setup)) site.write(line[len(addsitepackages_marker):]) successful_rewrite = True else: site.write(line) finally: site.close() real_site.close() if not successful_rewrite: raise RuntimeError( 'Buildout did not successfully rewrite %s to %s' % (real_site_path, site_path)) return site_path namespace_include_site_packages_setup = ''' setuptools_path = %s sys.path.append(setuptools_path) known_paths.add(os.path.normcase(setuptools_path)) import pkg_resources''' addsitedir_namespace_originalpackages_snippet = ''' pkg_resources.working_set.add_entry(sitedir)''' original_path_snippet = ''' sys.__egginsert = len(buildout_paths) # Support distribute. original_paths = [ %s ] for path in original_paths: if path == setuptools_path or path not in known_paths: addsitedir(path, known_paths)''' addsitepackages_script = '''\ def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. 
See original_addsitepackages, below, for the original version."""%s buildout_paths = [ %s ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase)%s return known_paths def original_addsitepackages(''' def _generate_interpreter(name, dest, executable, site_py_dest, relative_paths=False): """Write an interpreter script, using the site.py approach.""" full_name = os.path.join(dest, name) site_py_dest_string, rpsetup = _relative_path_and_setup( full_name, [site_py_dest], relative_paths, omit_os_import=True) if rpsetup: rpsetup += "\n" if sys.platform == 'win32': windows_import = '\nimport subprocess' # os.exec* is a mess on Windows, particularly if the path # to the executable has spaces and the Python is using MSVCRT. # The standard fix is to surround the executable's path with quotes, # but that has been unreliable in testing. # # Here's a demonstration of the problem. Given a Python # compiled with a MSVCRT-based compiler, such as the free Visual # C++ 2008 Express Edition, and an executable path with spaces # in it such as the below, we see the following. # # >>> import os # >>> p0 = 'C:\\Documents and Settings\\Administrator\\My Documents\\Downloads\\Python-2.6.4\\PCbuild\\python.exe' # >>> os.path.exists(p0) # True # >>> os.execv(p0, []) # Traceback (most recent call last): # File "", line 1, in # OSError: [Errno 22] Invalid argument # # That seems like a standard problem. The standard solution is # to quote the path (see, for instance # http://bugs.python.org/issue436259). However, this solution, # and other variations, fail: # # >>> p1 = '"C:\\Documents and Settings\\Administrator\\My Documents\\Downloads\\Python-2.6.4\\PCbuild\\python.exe"' # >>> os.execv(p1, []) # Traceback (most recent call last): # File "", line 1, in # OSError: [Errno 22] Invalid argument # # We simply use subprocess instead, since it handles everything # nicely, and the transparency of exec* (that is, not running, # perhaps unexpectedly, in a subprocess) is arguably not a # necessity, at least for many use cases. execute = 'subprocess.call(argv, env=environ)' else: windows_import = '' execute = 'os.execve(sys.executable, argv, environ)' contents = interpreter_template % dict( python=_safe_arg(executable), dash_S=' -S', site_dest=site_py_dest_string, relative_paths_setup=rpsetup, windows_import=windows_import, execute=execute, ) return _write_script(full_name, contents, 'interpreter') interpreter_template = script_header + ''' import os import sys%(windows_import)s %(relative_paths_setup)s argv = [sys.executable] + sys.argv[1:] environ = os.environ.copy() path = %(site_dest)s if environ.get('PYTHONPATH'): path = os.pathsep.join([path, environ['PYTHONPATH']]) environ['PYTHONPATH'] = path %(execute)s ''' # End of script generation code. ############################################################################ runsetup_template = """ import sys sys.path.insert(0, %(setupdir)r) sys.path.insert(0, %(setuptools)r) import os, setuptools __file__ = %(__file__)r os.chdir(%(setupdir)r) sys.argv[0] = %(setup)r execfile(%(setup)r) """ class VersionConflict(zc.buildout.UserError): def __init__(self, err, ws): ws = list(ws) ws.sort() self.err, self.ws = err, ws def __str__(self): existing_dist, req = self.err result = ["There is a version conflict.", "We already have: %s" % existing_dist, ] for dist in self.ws: if req in dist.requires(): result.append("but %s requires %r." 
% (dist, str(req))) return '\n'.join(result) class MissingDistribution(zc.buildout.UserError): def __init__(self, req, ws): ws = list(ws) ws.sort() self.data = req, ws def __str__(self): req, ws = self.data return "Couldn't find a distribution for %r." % str(req) def _log_requirement(ws, req): if not logger.isEnabledFor(logging.DEBUG): return for dist in ws: if req in dist.requires(): logger.debug(" required by %s." % dist) def _fix_file_links(links): for link in links: if link.startswith('file://') and link[-1] != '/': if os.path.isdir(link[7:]): # work around excessive restriction in setuptools: link += '/' yield link _final_parts = '*final-', '*final' def _final_version(parsed_version): for part in parsed_version: if (part[:1] == '*') and (part not in _final_parts): return False return True def redo_pyc(egg): if not os.path.isdir(egg): return for dirpath, dirnames, filenames in os.walk(egg): for filename in filenames: if not filename.endswith('.py'): continue filepath = os.path.join(dirpath, filename) if not (os.path.exists(filepath+'c') or os.path.exists(filepath+'o')): # If it wasn't compiled, it may not be compilable continue # OK, it looks like we should try to compile. # Remove old files. for suffix in 'co': if os.path.exists(filepath+suffix): os.remove(filepath+suffix) # Compile under current optimization try: py_compile.compile(filepath) except py_compile.PyCompileError: logger.warning("Couldn't compile %s", filepath) def _constrained_requirement(constraint, requirement): return pkg_resources.Requirement.parse( "%s[%s]%s" % ( requirement.project_name, ','.join(requirement.extras), _constrained_requirement_constraint(constraint, requirement) ) ) class IncompatibleConstraintError(zc.buildout.UserError): """A specified version is incompatible with a given requirement. """ IncompatibleVersionError = IncompatibleConstraintError # Backward compatibility def bad_constraint(constraint, requirement): logger.error("The constraint, %s, is not consistent with the " "requirement, %r.", constraint, str(requirement)) raise IncompatibleConstraintError("Bad constraint", constraint, requirement) _parse_constraint = re.compile(r'([<>]=?)(\S+)').match _comparef = { '>' : lambda x, y: x > y, '>=': lambda x, y: x >= y, '<' : lambda x, y: x < y, '<=': lambda x, y: x <= y, } _opop = {'<': '>', '>': '<'} _opeqop = {'<': '>=', '>': '<='} def _constrained_requirement_constraint(constraint, requirement): # Simple cases: # No specs to merge with: if not requirement.specs: if not constraint[0] in '<=>': constraint = '==' + constraint return constraint # Simple single-version constraint: if constraint[0] not in '<>': if constraint.startswith('='): assert constraint.startswith('==') constraint = constraint[2:] if constraint in requirement: return '=='+constraint bad_constraint(constraint, requirement) # OK, we have a complex constraint (<, <=, >=, or >) and specs. # In many cases, the spec needs to filter constraints. # In other cases, the constraints need to limit the constraint. specs = requirement.specs cop, cv = _parse_constraint(constraint).group(1, 2) pcv = pkg_resources.parse_version(cv) # Special case, all of the specs are == specs: if not [op for (op, v) in specs if op != '==']: # There aren't any non-== specs. # See if any of the specs satisfy the constraint: specs = [op+v for (op, v) in specs if _comparef[cop](pkg_resources.parse_version(v), pcv)] if specs: return ','.join(specs) bad_constraint(constraint, requirement) cop0 = cop[0] # Normalize specs by splitting >= and <= specs.
We need to do this # because these have really weird semantics. Also cache parsed # versions, which we'll need for comparisons: specs = [] for op, v in requirement.specs: pv = pkg_resources.parse_version(v) if op == _opeqop[cop0]: specs.append((op[0], v, pv)) specs.append(('==', v, pv)) else: specs.append((op, v, pv)) # Error if there are opposite specs that conflict with the constraint # and there are no equal specs that satisfy the constraint: if [v for (op, v, pv) in specs if op == _opop[cop0] and _comparef[_opop[cop0]](pv, pcv) ]: eqspecs = [op+v for (op, v, pv) in specs if _comparef[cop](pv, pcv)] if eqspecs: # OK, we do, use these: return ','.join(eqspecs) bad_constraint(constraint, requirement) # We have a combination of range constraints and eq specs that # satisfy the requirement. # Return the constraint + the filtered specs return ','.join( op+v for (op, v) in ( [(cop, cv)] + [(op, v) for (op, v, pv) in specs if _comparef[cop](pv, pcv)] ) ) zc.buildout-1.7.1/src/zc/buildout/easy_install.txt0000644000076500007650000021161312111414155021635 0ustar jimjim00000000000000Python API for egg and script installation ========================================== The easy_install module provides some functions to provide support for egg and script installation. It provides functionality at the Python level that is similar to easy_install, with a few exceptions: - By default, we look for new packages *and* the packages that they depend on. This is somewhat like (and uses) the --upgrade option of easy_install, except that we also upgrade required packages. - If the highest-revision package satisfying a specification is already present, then we don't try to get another one. This saves a lot of search time in the common case that packages are pegged to specific versions. - If there is a develop egg that satisfies a requirement, we don't look for additional distributions. We always give preference to develop eggs. - Distutils options for building extensions can be passed. Distribution installation ------------------------- The easy_install module provides a function, install, for installing one or more packages and their dependencies. The install function takes 2 positional arguments: - An iterable of setuptools requirement strings for the distributions to be installed, and - A destination directory to install to and to satisfy requirements from. The destination directory can be None, in which case, no new distributions are downloaded and there will be an error if the needed distributions can't be found among those already installed. It supports a number of optional keyword arguments: links A sequence of URLs, file names, or directories to look for links to distributions. index The URL of an index server, or almost any other valid URL. :) If not specified, the Python Package Index, http://pypi.python.org/simple/, is used. You can specify an alternate index with this option. If you use the links option and if the links point to the needed distributions, then the index can be anything and will be largely ignored. In the examples, here, we'll just point to an empty directory on our link server. This will make our examples run a little bit faster. executable A path to a Python executable. Distributions will be installed using this executable and will be for the matching Python version. path A list of additional directories to search for locally-installed distributions. always_unzip A flag indicating that newly-downloaded distributions should be directories even if they could be installed as zip files.
working_set An existing working set to be augmented with additional distributions, if necessary to satisfy requirements. This allows you to call install multiple times, if necessary, to gather multiple sets of requirements. newest A boolean value indicating whether to search for new distributions when already-installed distributions meet the requirement. When this is true, the default, and when the destination directory is not None, then the install function will search for the newest distributions that satisfy the requirements. versions A dictionary mapping project names to version numbers to be used when selecting distributions. This can be used to specify a set of distribution versions independent of other requirements. use_dependency_links A flag indicating whether to search for dependencies using the setup dependency_links metadata or not. If true, links are searched for using dependency_links in preference to other locations. Defaults to true. include_site_packages A flag indicating whether Python's non-standard-library packages should be available for finding dependencies. Defaults to true. Paths outside of Python's standard library--or more precisely, those that are not included when Python is started with the -S argument--are loosely referred to as "site-packages" here. relative_paths Adjust egg paths so they are relative to the script path. This allows scripts to work when scripts and eggs are moved, as long as they are both moved in the same way. The install method returns a working set containing the distributions needed to meet the given requirements. We have a link server that has a number of eggs: >>> print get(link_server), bigdemo-0.1-py2.4.egg
demo-0.1-py2.4.egg
demo-0.2-py2.4.egg
demo-0.3-py2.4.egg
demo-0.4c1-py2.4.egg
demoneeded-1.0.zip
demoneeded-1.1.zip
demoneeded-1.2c1.zip
extdemo-1.4.zip
index/
other-1.0-py2.4.egg
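In outline, a call to the install function wires these pieces together as follows. This is only a sketch: the destination path and server URLs here are placeholders, and the doctest examples below show real calls against the link server listed above::

    import zc.buildout.easy_install

    # Requirements and destination are positional; links and index are
    # keyword options (the path and URLs here are placeholders).
    ws = zc.buildout.easy_install.install(
        ['demo==0.2'], '/path/to/eggs',
        links=['http://localhost:8000/'],
        index='http://localhost:8000/index/')

    # The return value is a pkg_resources working set.
    for dist in ws:
        print dist.project_name, dist.version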
Let's make a directory and install the demo egg to it, using the demo: >>> dest = tmpdir('sample-install') >>> import zc.buildout.easy_install >>> ws = zc.buildout.easy_install.install( ... ['demo==0.2'], dest, ... links=[link_server], index=link_server+'index/') We requested version 0.2 of the demo distribution to be installed into the destination server. We specified that we should search for links on the link server and that we should use the (empty) link server index directory as a package index. The working set contains the distributions we retrieved. >>> for dist in ws: ... print dist demo 0.2 demoneeded 1.1 We got demoneeded because it was a dependency of demo. And the actual eggs were added to the eggs directory. >>> ls(dest) - demo-0.2-py2.4.egg - demoneeded-1.1-py2.4.egg If we remove the version restriction on demo, but specify a false value for newest, no new distributions will be installed: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... newest=False) >>> ls(dest) - demo-0.2-py2.4.egg - demoneeded-1.1-py2.4.egg If we leave off the newest option, we'll get an update for demo: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/') >>> ls(dest) - demo-0.2-py2.4.egg - demo-0.3-py2.4.egg - demoneeded-1.1-py2.4.egg Note that we didn't get the newest versions available. There were release candidates for newer versions of both packages. By default, final releases are preferred. We can change this behavior using the prefer_final function: >>> zc.buildout.easy_install.prefer_final(False) True The old setting is returned. >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/') >>> for dist in ws: ... print dist demo 0.4c1 demoneeded 1.2c1 >>> ls(dest) - demo-0.2-py2.4.egg - demo-0.3-py2.4.egg - demo-0.4c1-py2.4.egg - demoneeded-1.1-py2.4.egg - demoneeded-1.2c1-py2.4.egg Let's put the setting back to the default. >>> zc.buildout.easy_install.prefer_final(True) False We can supply additional distributions. We can also supply specifications for distributions that would normally be found via dependencies. We might do this to specify a specific version. >>> ws = zc.buildout.easy_install.install( ... ['demo', 'other', 'demoneeded==1.0'], dest, ... links=[link_server], index=link_server+'index/') >>> for dist in ws: ... print dist demo 0.3 other 1.0 demoneeded 1.0 >>> ls(dest) - demo-0.2-py2.4.egg - demo-0.3-py2.4.egg - demo-0.4c1-py2.4.egg - demoneeded-1.0-py2.4.egg - demoneeded-1.1-py2.4.egg - demoneeded-1.2c1-py2.4.egg d other-1.0-py2.4.egg We can request that eggs be unzipped even if they are zip safe. This can be useful when debugging. (Note that Distribute will unzip eggs by default, so if you are using Distribute, most or all eggs will already be unzipped without this flag.) >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... always_unzip=True) >>> ls(dest) d demo-0.3-py2.4.egg d demoneeded-1.1-py2.4.egg >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... 
always_unzip=False) >>> ls(dest) - demo-0.3-py2.4.egg - demoneeded-1.1-py2.4.egg We can also set a default by calling the always_unzip function: >>> zc.buildout.easy_install.always_unzip(True) False The old default is returned: >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/') >>> ls(dest) d demo-0.3-py2.4.egg d demoneeded-1.1-py2.4.egg >>> zc.buildout.easy_install.always_unzip(False) True >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/') >>> ls(dest) - demo-0.3-py2.4.egg - demoneeded-1.1-py2.4.egg >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... always_unzip=True) >>> ls(dest) d demo-0.3-py2.4.egg d demoneeded-1.1-py2.4.egg Specifying version information independent of requirements ---------------------------------------------------------- Sometimes it's useful to specify version information independent of normal requirements specifications. For example, a buildout may need to lock down a set of versions, without having to put put version numbers in setup files or part definitions. If a dictionary is passed to the install function, mapping project names to version numbers, then the versions numbers will be used. >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... versions = dict(demo='0.2', demoneeded='1.0')) >>> [d.version for d in ws] ['0.2', '1.0'] In this example, we specified a version for demoneeded, even though we didn't define a requirement for it. The versions specified apply to dependencies as well as the specified requirements. If we specify a version that's incompatible with a requirement, then we'll get an error: >>> from zope.testing.loggingsupport import InstalledHandler >>> handler = InstalledHandler('zc.buildout.easy_install') >>> import logging >>> logging.getLogger('zc.buildout.easy_install').propagate = False >>> ws = zc.buildout.easy_install.install( ... ['demo >0.2'], dest, links=[link_server], ... index=link_server+'index/', ... versions = dict(demo='0.2', demoneeded='1.0')) Traceback (most recent call last): ... IncompatibleConstraintError: Bad constraint 0.2 demo>0.2 >>> print handler zc.buildout.easy_install DEBUG Installing 'demo >0.2'. zc.buildout.easy_install ERROR The constraint, 0.2, is not consistent with the requirement, 'demo>0.2'. >>> handler.clear() If no versions are specified, a debugging message will be output reporting that a version was picked automatically: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... ) >>> print handler zc.buildout.easy_install DEBUG Installing 'demo'. zc.buildout.easy_install DEBUG We have the best distribution that satisfies 'demo'. zc.buildout.easy_install DEBUG Picked: demo = 0.3 zc.buildout.easy_install DEBUG Getting required 'demoneeded' zc.buildout.easy_install DEBUG required by demo 0.3. zc.buildout.easy_install DEBUG We have the best distribution that satisfies 'demoneeded'. 
zc.buildout.easy_install DEBUG Picked: demoneeded = 1.1 >>> handler.uninstall() >>> logging.getLogger('zc.buildout.easy_install').propagate = True We can request that we get an error if versions are picked: >>> zc.buildout.easy_install.allow_picked_versions(False) True (The old setting is returned.) >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... ) Traceback (most recent call last): ... UserError: Picked: demo = 0.3 >>> zc.buildout.easy_install.allow_picked_versions(True) False The function default_versions can be used to get and set default version information to be used when no version information is passes. If called with an argument, it sets the default versions: >>> zc.buildout.easy_install.default_versions(dict(demoneeded='1')) {} It always returns the previous default versions. If called without an argument, it simply returns the default versions without changing them: >>> zc.buildout.easy_install.default_versions() {'demoneeded': '1'} So with the default versions set, we'll get the requested version even if the versions option isn't used: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... ) >>> [d.version for d in ws] ['0.3', '1.0'] Of course, we can unset the default versions by passing an empty dictionary: >>> zc.buildout.easy_install.default_versions({}) {'demoneeded': '1'} >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... ) >>> [d.version for d in ws] ['0.3', '1.1'] Dependencies in Site Packages ----------------------------- Paths outside of Python's standard library--or more precisely, those that are not included when Python is started with the -S argument--are loosely referred to as "site-packages" here. These site-packages are searched by default for distributions. This can be disabled, so that, for instance, a system Python can be used with buildout, cleaned of any packages installed by a user or system package manager. The default behavior can be controlled and introspected using zc.buildout.easy_install.include_site_packages. >>> zc.buildout.easy_install.include_site_packages() True Here's an example of using a Python executable that includes our dependencies. Our "py_path" will have the "demoneeded," and "demo" packages available. We'll simply be asking for "demoneeded" here, but without any external index or links. >>> from zc.buildout.tests import create_sample_sys_install >>> py_path, site_packages_path = make_py() >>> create_sample_sys_install(site_packages_path) >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, links=[], executable=py_path, ... index=None) >>> [dist.project_name for dist in workingset] ['demoneeded'] That worked fine. Let's try again with site packages not allowed. We'll change the policy by changing the default. Notice that the function for changing the default value returns the previous value. >>> zc.buildout.easy_install.include_site_packages(False) True >>> zc.buildout.easy_install.include_site_packages() False >>> zc.buildout.easy_install.clear_index_cache() >>> rmdir(example_dest) >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, links=[], executable=py_path, ... index=None) Traceback (most recent call last): ... 
MissingDistribution: Couldn't find a distribution for 'demoneeded'. >>> zc.buildout.easy_install.clear_index_cache() Now we'll reset the default. >>> zc.buildout.easy_install.include_site_packages(True) False >>> zc.buildout.easy_install.include_site_packages() True Dependency links ---------------- Setuptools allows metadata that describes where to search for package dependencies. This option is called dependency_links. Buildout has its own notion of where to look for dependencies, but it also uses the setup tools dependency_links information if it's available. Let's demo this by creating an egg that specifies dependency_links. To begin, let's create a new egg repository. This repository hold a newer version of the 'demoneeded' egg than the sample repository does. >>> repoloc = tmpdir('repo') >>> from zc.buildout.tests import create_egg >>> create_egg('demoneeded', '1.2', repoloc) >>> link_server2 = start_server(repoloc) Turn on logging on this server so that we can see when eggs are pulled from it. >>> get(link_server2 + 'enable_server_logging') GET 200 /enable_server_logging '' Now we can create an egg that specifies that its dependencies are found on this server. >>> repoloc = tmpdir('repo2') >>> create_egg('hasdeps', '1.0', repoloc, ... install_requires = "'demoneeded'", ... dependency_links = [link_server2]) Let's add the egg to another repository. >>> link_server3 = start_server(repoloc) Now let's install the egg. >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, ... links=[link_server3], index=link_server3+'index/') GET 200 / GET 200 /demoneeded-1.2-pyN.N.egg The server logs show that the dependency was retrieved from the server specified in the dependency_links. Now let's see what happens if we provide two different ways to retrieve the dependencies. >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... links=[link_server, link_server3]) GET 200 / GET 200 /demoneeded-1.2-pyN.N.egg Once again the dependency is fetched from the logging server even though it is also available from the non-logging server. This is because the version on the logging server is newer and buildout normally chooses the newest egg available. If you wish to control where dependencies come from regardless of dependency_links setup metadata use the 'use_dependency_links' option to zc.buildout.easy_install.install(). >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... links=[link_server, link_server3], ... use_dependency_links=False) Notice that this time the dependency egg is not fetched from the logging server. When you specify not to use dependency_links, eggs will only be searched for using the links you explicitly provide. Another way to control this option is with the zc.buildout.easy_install.use_dependency_links() function. This function sets the default behavior for the zc.buildout.easy_install() function. >>> zc.buildout.easy_install.use_dependency_links(False) True The function returns its previous setting. >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... 
links=[link_server, link_server3]) It can be overridden by passing a keyword argument to the install function. >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... links=[link_server, link_server3], ... use_dependency_links=True) GET 200 /demoneeded-1.2-pyN.N.egg To return the dependency_links behavior to normal, call the function again. >>> zc.buildout.easy_install.use_dependency_links(True) False >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... links=[link_server, link_server3]) GET 200 /demoneeded-1.2-pyN.N.egg Script generation ----------------- The easy_install module provides support for creating scripts from eggs. It provides two competing functions. One, ``scripts``, is a well-established approach to generating reliable scripts with a "clean" Python--e.g., one that does not have any packages in its site-packages. The other, ``sitepackage_safe_scripts``, is newer, a bit trickier, and is designed to work with a Python that has code in its site-packages, such as a system Python. Both are similar to setuptools except that they provide facilities for baking a script's path into the script. This has two advantages: - The eggs to be used by a script are not chosen at run time, making startup faster and, more importantly, deterministic. - The script doesn't have to import pkg_resources because the logic that pkg_resources would execute at run time is executed at script-creation time. (There is an exception in ``sitepackage_safe_scripts`` if you want to have your Python's site packages available, as discussed below, but even in that case pkg_resources is only partially activated, which can be a significant time savings.) The ``scripts`` function ~~~~~~~~~~~~~~~~~~~~~~~~ The ``scripts`` function is the first way to generate scripts that we'll examine. It is the earlier approach that the package offered. Let's create a destination directory for it to place them in: >>> bin = tmpdir('bin') Now, we'll use the scripts function to generate scripts in this directory from the demo egg: >>> import sys >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin) The four arguments we passed were: 1. A sequence of distribution requirements. These are of the same form as setuptools requirements. Here we passed a single requirement, for the demo distribution. 2. A working set, 3. The Python executable to use, and 4. The destination directory. The bin directory now contains a generated script: >>> ls(bin) - demo The return value is a list of the scripts generated: >>> import os, sys >>> if sys.platform == 'win32': ... scripts == [os.path.join(bin, 'demo.exe'), ... os.path.join(bin, 'demo-script.py')] ... else: ... scripts == [os.path.join(bin, 'demo')] True Note that in Windows, two files are generated for each script: a script file, ending in '-script.py', and an exe file that allows the script to be invoked directly without having to specify the Python interpreter and without having to provide a '.py' suffix.
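In a buildout recipe, the list returned by ``scripts`` is typically what the recipe's install() method hands back to buildout, so that the generated files are tracked and removed on uninstall. The following is a condensed, illustrative sketch only, loosely modeled on an egg-installing recipe; the ``egg`` and ``find-links`` option names are examples, not part of this API::

    import sys
    import zc.buildout.easy_install

    class Recipe(object):

        def __init__(self, buildout, name, options):
            self.buildout, self.name, self.options = buildout, name, options

        def install(self):
            buildout_section = self.buildout['buildout']
            # Install the requested egg (and its dependencies) ...
            ws = zc.buildout.easy_install.install(
                [self.options['egg']],
                buildout_section['eggs-directory'],
                links=self.options.get('find-links', '').split())
            # ... then generate its console scripts and report what was written.
            return zc.buildout.easy_install.scripts(
                [self.options['egg']], ws, sys.executable,
                buildout_section['bin-directory'])

        update = install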
The demo script run the entry point defined in the demo egg: >>> cat(bin, 'demo') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) Some things to note: - The demo and demoneeded eggs are added to the beginning of sys.path. - The module for the script entry point is imported and the entry point, in this case, 'main', is run. Rather than requirement strings, you can pass tuples containing 3 strings: - A script name, - A module, - An attribute expression for an entry point within the module. For example, we could have passed entry point information directly rather than passing a requirement: >>> scripts = zc.buildout.easy_install.scripts( ... [('demo', 'eggrecipedemo', 'main')], ... ws, sys.executable, bin) >>> cat(bin, 'demo') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) Passing entry-point information directly is handy when using eggs (or distributions) that don't declare their entry points, such as distributions that aren't based on setuptools. The interpreter keyword argument can be used to generate a script that can be used to invoke the Python interactive interpreter with the path set based on the working set. This generated script can also be used to run other scripts with the path set on the working set: >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, interpreter='py') >>> ls(bin) - demo - py >>> if sys.platform == 'win32': ... scripts == [os.path.join(bin, 'demo.exe'), ... os.path.join(bin, 'demo-script.py'), ... os.path.join(bin, 'py.exe'), ... os.path.join(bin, 'py-script.py')] ... else: ... scripts == [os.path.join(bin, 'demo'), ... os.path.join(bin, 'py')] True The py script simply runs the Python interactive interpreter with the path set: >>> cat(bin, 'py') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-pyN.N.egg', '/sample-install/demoneeded-1.1-pyN.N.egg', ] _interactive = True if len(sys.argv) > 1: _options, _args = __import__("getopt").getopt(sys.argv[1:], 'ic:m:') _interactive = False for (_opt, _val) in _options: if _opt == '-i': _interactive = True elif _opt == '-c': exec _val elif _opt == '-m': sys.argv[1:] = _args _args = [] __import__("runpy").run_module( _val, {}, "__main__", alter_sys=True) if _args: sys.argv[:] = _args __file__ = _args[0] del _options, _args execfile(__file__) if _interactive: del _interactive __import__("code").interact(banner="", local=globals()) If invoked with a script name and arguments, it will run that script, instead. >>> write('ascript', ''' ... "demo doc" ... print sys.argv ... print (__name__, __file__, __doc__) ... ''') >>> print system(join(bin, 'py')+' ascript a b c'), ['ascript', 'a', 'b', 'c'] ('__main__', 'ascript', 'demo doc') For Python 2.5 and higher, you can also use the -m option to run a module: >>> print system(join(bin, 'py')+' -m pdb'), usage: pdb.py scriptfile [arg] ... >>> print system(join(bin, 'py')+' -m pdb what'), Error: what does not exist An additional argument can be passed to define which scripts to install and to provide script names. 
The argument is a dictionary mapping original script names to new script names. >>> bin = tmpdir('bin2') >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, dict(demo='run')) >>> if sys.platform == 'win32': ... scripts == [os.path.join(bin, 'run.exe'), ... os.path.join(bin, 'run-script.py')] ... else: ... scripts == [os.path.join(bin, 'run')] True >>> ls(bin) - run >>> print system(os.path.join(bin, 'run')), 3 1 The ``scripts`` function: Including extra paths in scripts ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ We can pass a keyword argument, extra paths, to cause additional paths to be included in the a generated script: >>> foo = tmpdir('foo') >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, dict(demo='run'), ... extra_paths=[foo]) >>> cat(bin, 'run') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', '/foo', ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) The ``scripts`` function: Providing script arguments ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ An "argument" keyword argument can be used to pass arguments to an entry point. The value passed is a source string to be placed between the parentheses in the call: >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, dict(demo='run'), ... arguments='1, 2') >>> cat(bin, 'run') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main(1, 2)) The ``scripts`` function: Passing initialization code ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ You can also pass script initialization code: >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, dict(demo='run'), ... arguments='1, 2', ... initialization='import os\nos.chdir("foo")') >>> cat(bin, 'run') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', ] import os os.chdir("foo") import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main(1, 2)) The ``scripts`` function: Relative paths ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Sometimes, you want to be able to move a buildout directory around and have scripts still work without having to rebuild them. We can control this using the relative_paths option to install. You need to pass a common base directory of the scripts and eggs: >>> bo = tmpdir('bo') >>> ba = tmpdir('ba') >>> mkdir(bo, 'eggs') >>> mkdir(bo, 'bin') >>> mkdir(bo, 'other') >>> ws = zc.buildout.easy_install.install( ... ['demo'], join(bo, 'eggs'), links=[link_server], ... index=link_server+'index/') >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, join(bo, 'bin'), dict(demo='run'), ... extra_paths=[ba, join(bo, 'bar')], ... interpreter='py', ... 
relative_paths=bo) >>> cat(bo, 'bin', 'run') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import os join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) import sys sys.path[0:0] = [ join(base, 'eggs/demo-0.3-pyN.N.egg'), join(base, 'eggs/demoneeded-1.1-pyN.N.egg'), '/ba', join(base, 'bar'), ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) Note that the extra path we specified that was outside the directory passed as relative_paths wasn't converted to a relative path. Of course, running the script works: >>> print system(join(bo, 'bin', 'run')), 3 1 We specified an interpreter and its paths are adjusted too: >>> cat(bo, 'bin', 'py') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import os join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) import sys sys.path[0:0] = [ join(base, 'eggs/demo-0.3-pyN.N.egg'), join(base, 'eggs/demoneeded-1.1-pyN.N.egg'), '/ba', join(base, 'bar'), ] _interactive = True if len(sys.argv) > 1: _options, _args = __import__("getopt").getopt(sys.argv[1:], 'ic:m:') _interactive = False for (_opt, _val) in _options: if _opt == '-i': _interactive = True elif _opt == '-c': exec _val elif _opt == '-m': sys.argv[1:] = _args _args = [] __import__("runpy").run_module( _val, {}, "__main__", alter_sys=True) if _args: sys.argv[:] = _args __file__ = _args[0] del _options, _args execfile(__file__) if _interactive: del _interactive __import__("code").interact(banner="", local=globals()) The ``sitepackage_safe_scripts`` function ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The newer function for creating scripts is ``sitepackage_safe_scripts``. It has the same basic functionality as the ``scripts`` function: it can create scripts to run arbitrary entry points, and to run a Python interpreter. The following are the differences from a user's perspective. - It can be used safely with a Python that has packages installed itself, such as a system-installed Python. - In contrast to the interpreter generated by the ``scripts`` method, which supports only a small subset of the usual Python executable's options, the interpreter generated by ``sitepackage_safe_scripts`` supports all of them. This makes it possible to use as full Python replacement for scripts that need the distributions specified in your buildout. - Both the interpreter and the entry point scripts allow you to include the site packages, and/or the sitecustomize, of the Python executable, if desired. It works by creating site.py and sitecustomize.py files that set up the desired paths and initialization. These must be placed within an otherwise empty directory. Typically this is in a recipe's parts directory. Here's the simplest example, building an interpreter script. >>> interpreter_dir = tmpdir('interpreter') >>> interpreter_parts_dir = os.path.join( ... interpreter_dir, 'parts', 'interpreter') >>> interpreter_bin_dir = os.path.join(interpreter_dir, 'bin') >>> mkdir(interpreter_bin_dir) >>> mkdir(interpreter_dir, 'eggs') >>> mkdir(interpreter_dir, 'parts') >>> mkdir(interpreter_parts_dir) >>> ws = zc.buildout.easy_install.install( ... ['demo'], join(interpreter_dir, 'eggs'), links=[link_server], ... index=link_server+'index/') >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... 
interpreter='py') Depending on whether the machine being used is running Windows or not, this produces either three or four files. In both cases, we have site.py and sitecustomize.py generated in the parts/interpreter directory. For Windows, we have py.exe and py-script.py; for other operating systems, we have py. >>> sitecustomize_path = os.path.join( ... interpreter_parts_dir, 'sitecustomize.py') >>> site_path = os.path.join(interpreter_parts_dir, 'site.py') >>> interpreter_path = os.path.join(interpreter_bin_dir, 'py') >>> if sys.platform == 'win32': ... py_path = os.path.join(interpreter_bin_dir, 'py-script.py') ... expected = [sitecustomize_path, ... site_path, ... os.path.join(interpreter_bin_dir, 'py.exe'), ... py_path] ... else: ... py_path = interpreter_path ... expected = [sitecustomize_path, site_path, py_path] ... >>> assert generated == expected, repr((generated, expected)) We didn't ask for any initialization, and we didn't ask to use the underlying sitecustomization, so sitecustomize.py is empty. >>> cat(sitecustomize_path) The interpreter script is simple. It puts the directory with the site.py and sitecustomize.py on the PYTHONPATH and (re)starts Python. >>> cat(py_path) #!/usr/bin/python -S import os import sys argv = [sys.executable] + sys.argv[1:] environ = os.environ.copy() path = '/interpreter/parts/interpreter' if environ.get('PYTHONPATH'): path = os.pathsep.join([path, environ['PYTHONPATH']]) environ['PYTHONPATH'] = path os.execve(sys.executable, argv, environ) The site.py file is a modified version of the underlying Python's site.py. The most important modification is that it has a different version of the addsitepackages function. It sets up the Python path, similarly to the behavior of the function it replaces. The following shows the part that buildout inserts, in the simplest case. >>> sys.stdout.write('#\n'); cat(site_path) ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" buildout_paths = [ '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg' ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) return known_paths def original_addsitepackages(known_paths):... Here are some examples of the interpreter in use. >>> print call_py(interpreter_path, "print 16+26") 42 >>> res = call_py(interpreter_path, "import sys; print sys.path") >>> print res # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE ['', '/interpreter/parts/interpreter', ..., '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg'] >>> clean_paths = eval(res.strip()) # This is used later for comparison. If you provide initialization, it goes in sitecustomize.py. >>> def reset_interpreter(): ... # This is necessary because, in our tests, the timestamps of the ... # .pyc files are not outdated when we want them to be. ... rmdir(interpreter_bin_dir) ... mkdir(interpreter_bin_dir) ... rmdir(interpreter_parts_dir) ... mkdir(interpreter_parts_dir) ... >>> reset_interpreter() >>> initialization_string = """\ ... import os ... os.environ['FOO'] = 'bar baz bing shazam'""" >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... 
interpreter='py', initialization=initialization_string) >>> cat(sitecustomize_path) import os os.environ['FOO'] = 'bar baz bing shazam' >>> print call_py(interpreter_path, "import os; print os.environ['FOO']") bar baz bing shazam If you use relative paths, this affects the interpreter and site.py. (This is again the UNIX version; the Windows version uses subprocess instead of os.execve.) >>> reset_interpreter() >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', relative_paths=interpreter_dir) >>> cat(py_path) #!/usr/bin/python -S import os import sys join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) argv = [sys.executable] + sys.argv[1:] environ = os.environ.copy() path = join(base, 'parts/interpreter') if environ.get('PYTHONPATH'): path = os.pathsep.join([path, environ['PYTHONPATH']]) environ['PYTHONPATH'] = path os.execve(sys.executable, argv, environ) For site.py, we again show only the pertinent parts. Notice that the egg paths join a base to a path, as with the use of this argument in the ``scripts`` function. >>> sys.stdout.write('#\n'); cat(site_path) # doctest: +ELLIPSIS #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) base = os.path.dirname(base) buildout_paths = [ join(base, 'eggs/demo-0.3-pyN.N.egg'), join(base, 'eggs/demoneeded-1.1-pyN.N.egg') ]... The paths resolve in practice as you would expect. >>> print call_py(interpreter_path, ... "import sys, pprint; pprint.pprint(sys.path)") ... # doctest: +ELLIPSIS ['', '/interpreter/parts/interpreter', ..., '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg'] The ``extra_paths`` argument affects the path in site.py. Notice that /interpreter/other is added after the eggs. >>> reset_interpreter() >>> mkdir(interpreter_dir, 'other') >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', extra_paths=[join(interpreter_dir, 'other')]) >>> sys.stdout.write('#\n'); cat(site_path) # doctest: +ELLIPSIS #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" buildout_paths = [ '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg', '/interpreter/other' ]... >>> print call_py(interpreter_path, ... "import sys, pprint; pprint.pprint(sys.path)") ... # doctest: +ELLIPSIS ['', '/interpreter/parts/interpreter', ..., '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg', '/interpreter/other'] The ``sitepackage_safe_scripts`` function: using site-packages ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The ``sitepackage_safe_scripts`` function supports including site packages. This has some advantages and some serious dangers. A typical reason to include site-packages is that it is easier to install one or more dependencies in your Python than it is with buildout. 
Some packages, such as lxml or Python PostgreSQL integration, have dependencies that can be much easier to build and/or install using other mechanisms, such as your operating system's package manager. By installing some core packages into your Python's site-packages, this can significantly simplify some application installations. However, doing this has a significant danger. One of the primary goals of buildout is to provide repeatability. Some packages (one of the better known Python openid packages, for instance) change their behavior depending on what packages are available. If Python curl bindings are available, these may be preferred by the library. If a certain XML package is installed, it may be preferred by the library. These hidden choices may cause small or large behavior differences. The fact that they can be rarely encountered can actually make it worse: you forget that this might be a problem, and debugging the differences can be difficult. If you allow site-packages to be included in your buildout, and the Python you use is not managed precisely by your application (for instance, it is a system Python), you open yourself up to these possibilities. Don't be unaware of the dangers. That explained, let's see how it works. If you don't use namespace packages, this is very straightforward. >>> reset_interpreter() >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', include_site_packages=True) >>> sys.stdout.write('#\n'); cat(site_path) ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" setuptools_path = None buildout_paths = [ '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg' ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) sys.__egginsert = len(buildout_paths) # Support distribute. original_paths = [ ... ] for path in original_paths: if path == setuptools_path or path not in known_paths: addsitedir(path, known_paths) return known_paths def original_addsitepackages(known_paths):... It simply adds the original paths using addsitedir after the code to add the buildout paths. Here's an example of the new script in use. Other documents and tests in this package give the feature a more thorough workout, but this should give you an idea of the feature. >>> res = call_py(interpreter_path, "import sys; print sys.path") >>> print res # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE ['', '/interpreter/parts/interpreter', ..., '/interpreter/eggs/demo-0.3-py2.4.egg', '/interpreter/eggs/demoneeded-1.1-py2.4.egg', ...] The clean_paths gathered earlier is a subset of this full list of paths. >>> full_paths = eval(res.strip()) >>> len(clean_paths) < len(full_paths) True >>> set(os.path.normpath(p) for p in clean_paths).issubset( ... os.path.normpath(p) for p in full_paths) True Unfortunately, because of how setuptools namespace packages are implemented differently for operating system packages (debs or rpms) as opposed to standard setuptools installation, there's a slightly trickier dance if you use them. To show this we'll needs some extra eggs that use namespaces. We'll use the ``tellmy.fortune`` package, which we'll need to make an initial call to another text fixture to create. 
>>> from zc.buildout.tests import create_sample_namespace_eggs >>> namespace_eggs = tmpdir('namespace_eggs') >>> create_sample_namespace_eggs(namespace_eggs) >>> reset_interpreter() >>> ws = zc.buildout.easy_install.install( ... ['demo', 'tellmy.fortune'], join(interpreter_dir, 'eggs'), ... links=[link_server, namespace_eggs], index=link_server+'index/') >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', include_site_packages=True) >>> sys.stdout.write('#\n'); cat(site_path) ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" setuptools_path = '...setuptools...' sys.path.append(setuptools_path) known_paths.add(os.path.normcase(setuptools_path)) import pkg_resources buildout_paths = [ '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/tellmy.fortune-1.0-pyN.N.egg', '...setuptools...', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg' ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) pkg_resources.working_set.add_entry(sitedir) sys.__egginsert = len(buildout_paths) # Support distribute. original_paths = [ ... ] for path in original_paths: if path == setuptools_path or path not in known_paths: addsitedir(path, known_paths) return known_paths def original_addsitepackages(known_paths):... >>> print call_py(interpreter_path, "import sys; print sys.path") ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE ['', '/interpreter/parts/interpreter', ..., '...setuptools...', '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/tellmy.fortune-1.0-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg', ...] As you can see, the script now first imports pkg_resources. Then we need to process egg files specially to look for namespace packages there *before* we process process lines in .pth files that use the "import" feature--lines that might be part of the setuptools namespace package implementation for system packages, as mentioned above, and that must come after processing egg namespaces. The most complex that this function gets is if you use namespace packages, include site-packages, and use relative paths. For completeness, we'll look at that result. >>> reset_interpreter() >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', include_site_packages=True, ... relative_paths=interpreter_dir) >>> sys.stdout.write('#\n'); cat(site_path) ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) base = os.path.dirname(base) setuptools_path = '...setuptools...' 
sys.path.append(setuptools_path) known_paths.add(os.path.normcase(setuptools_path)) import pkg_resources buildout_paths = [ join(base, 'eggs/demo-0.3-pyN.N.egg'), join(base, 'eggs/tellmy.fortune-1.0-pyN.N.egg'), '...setuptools...', join(base, 'eggs/demoneeded-1.1-pyN.N.egg') ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) pkg_resources.working_set.add_entry(sitedir) sys.__egginsert = len(buildout_paths) # Support distribute. original_paths = [ ... ] for path in original_paths: if path == setuptools_path or path not in known_paths: addsitedir(path, known_paths) return known_paths def original_addsitepackages(known_paths):... >>> print call_py(interpreter_path, "import sys; print sys.path") ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE ['', '/interpreter/parts/interpreter', ..., '...setuptools...', '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/tellmy.fortune-1.0-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg', ...] The ``exec_sitecustomize`` argument does the same thing for the sitecustomize module--it allows you to include the code from the sitecustomize module in the underlying Python if you set the argument to True. The z3c.recipe.scripts package sets up the full environment necessary to demonstrate this piece. The ``sitepackage_safe_scripts`` function: writing scripts for entry points ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ All of the examples so far for this function have been creating interpreters. The function can also write scripts for entry points. They are almost identical to the scripts that we saw for the ``scripts`` function except that they ``import site`` after setting the sys.path to include our custom site.py and sitecustomize.py files. These files then initialize the Python environment as we have already seen. Let's see a simple example. >>> reset_interpreter() >>> ws = zc.buildout.easy_install.install( ... ['demo'], join(interpreter_dir, 'eggs'), links=[link_server], ... index=link_server+'index/') >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... reqs=['demo']) As before, in Windows, 2 files are generated for each script. A script file, ending in '-script.py', and an exe file that allows the script to be invoked directly without having to specify the Python interpreter and without having to provide a '.py' suffix. This is in addition to the site.py and sitecustomize.py files that are generated as with our interpreter examples above. >>> if sys.platform == 'win32': ... demo_path = os.path.join(interpreter_bin_dir, 'demo-script.py') ... expected = [sitecustomize_path, ... site_path, ... os.path.join(interpreter_bin_dir, 'demo.exe'), ... demo_path] ... else: ... demo_path = os.path.join(interpreter_bin_dir, 'demo') ... expected = [sitecustomize_path, site_path, demo_path] ... 
>>> assert generated == expected, repr((generated, expected)) The demo script runs the entry point defined in the demo egg: >>> cat(demo_path) # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 -S import sys sys.path[0:0] = [ '/interpreter/parts/interpreter', ] import os path = sys.path[0] if os.environ.get('PYTHONPATH'): path = os.pathsep.join([path, os.environ['PYTHONPATH']]) os.environ['BUILDOUT_ORIGINAL_PYTHONPATH'] = os.environ.get('PYTHONPATH', '') os.environ['PYTHONPATH'] = path import site # imports custom buildout-generated site.py import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) >>> demo_call = join(interpreter_bin_dir, 'demo') >>> if sys.platform == 'win32': ... demo_call = '"%s"' % demo_call >>> print system(demo_call) 3 1 There are a few differences from the ``scripts`` function. First, the ``reqs`` argument (an iterable of string requirements or entry point tuples) is a keyword argument here. We see that in the example above. Second, the ``arguments`` argument is now named ``script_arguments`` to try and clarify that it does not affect interpreters. While the ``initialization`` argument continues to affect both the interpreters and the entry point scripts, if you have initialization that is only pertinent to the entry point scripts, you can use the ``script_initialization`` argument. Let's see ``script_arguments`` and ``script_initialization`` in action. >>> reset_interpreter() >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... reqs=['demo'], script_arguments='1, 2', ... script_initialization='import os\nos.chdir("foo")') >>> cat(demo_path) # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 -S import sys sys.path[0:0] = [ '/interpreter/parts/interpreter', ] import os path = sys.path[0] if os.environ.get('PYTHONPATH'): path = os.pathsep.join([path, os.environ['PYTHONPATH']]) os.environ['BUILDOUT_ORIGINAL_PYTHONPATH'] = os.environ.get('PYTHONPATH', '') os.environ['PYTHONPATH'] = path import site # imports custom buildout-generated site.py import os os.chdir("foo") import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main(1, 2)) Handling custom build options for extensions provided in source distributions ----------------------------------------------------------------------------- Sometimes, we need to control how extension modules are built. The build function provides this level of control. It takes a single package specification, downloads a source distribution, and builds it with specified custom build options. The build function takes 3 positional arguments: spec A package specification for a source distribution dest A destination directory build_ext A dictionary of options to be passed to the distutils build_ext command when building extensions. It supports a number of optional keyword arguments: links a sequence of URLs, file names, or directories to look for links to distributions, index The URL of an index server, or almost any other valid URL. :) If not specified, the Python Package Index, http://pypi.python.org/simple/, is used. You can specify an alternate index with this option. If you use the links option and if the links point to the needed distributions, then the index can be anything and will be largely ignored. In the examples, here, we'll just point to an empty directory on our link server. This will make our examples run a little bit faster. executable A path to a Python executable. 
Distributions will be installed using this executable and will be for the matching Python version. path A list of additional directories to search for locally-installed distributions. newest A boolean value indicating whether to search for new distributions when already-installed distributions meet the requirement. When this is true, the default, and when the destination directory is not None, then the install function will search for the newest distributions that satisfy the requirements. versions A dictionary mapping project names to version numbers to be used when selecting distributions. This can be used to specify a set of distribution versions independent of other requirements. Our link server included a source distribution that includes a simple extension, extdemo.c:: #include <Python.h> #include <extdemo.h> static PyMethodDef methods[] = {}; PyMODINIT_FUNC initextdemo(void) { PyObject *m; m = Py_InitModule3("extdemo", methods, ""); #ifdef TWO PyModule_AddObject(m, "val", PyInt_FromLong(2)); #else PyModule_AddObject(m, "val", PyInt_FromLong(EXTDEMO)); #endif } The extension depends on a system-dependent include file, extdemo.h, that defines a constant, EXTDEMO, that is exposed by the extension. We'll add an include directory to our sample buildout and add the needed include file to it: >>> mkdir('include') >>> write('include', 'extdemo.h', ... """ ... #define EXTDEMO 42 ... """) Now, we can use the build function to create an egg from the source distribution: >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/') ['/sample-install/extdemo-1.4-py2.4-unix-i686.egg'] The function returns the list of eggs created. Now if we look in our destination directory, we see we have an extdemo egg: >>> ls(dest) - demo-0.2-py2.4.egg d demo-0.3-py2.4.egg - demoneeded-1.0-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.4-py2.4-unix-i686.egg Let's update our link server with a new version of extdemo: >>> update_extdemo() >>> print get(link_server), bigdemo-0.1-py2.4.egg
demo-0.1-py2.4.egg
demo-0.2-py2.4.egg
demo-0.3-py2.4.egg
demo-0.4c1-py2.4.egg
demoneeded-1.0.zip
demoneeded-1.1.zip
demoneeded-1.2c1.zip
extdemo-1.4.zip
extdemo-1.5.zip
index/
other-1.0-py2.4.egg
The easy_install module caches information about servers to reduce network access. To see the update, we have to call the clear_index_cache function to clear the index cache: >>> zc.buildout.easy_install.clear_index_cache() If we run build with newest set to False, we won't get an update: >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/', ... newest=False) ['/sample-install/extdemo-1.4-py2.4-linux-i686.egg'] >>> ls(dest) - demo-0.2-py2.4.egg d demo-0.3-py2.4.egg - demoneeded-1.0-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.4-py2.4-unix-i686.egg But if we run it with the default True setting for newest, then we'll get an updated egg: >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/') ['/sample-install/extdemo-1.5-py2.4-unix-i686.egg'] >>> ls(dest) - demo-0.2-py2.4.egg d demo-0.3-py2.4.egg - demoneeded-1.0-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.4-py2.4-unix-i686.egg d extdemo-1.5-py2.4-unix-i686.egg The versions option also influences the versions used. For example, if we specify a version for extdemo, then that will be used, even though it isn't the newest. Let's clean out the destination directory first: >>> import os >>> for name in os.listdir(dest): ... remove(dest, name) >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/', ... versions=dict(extdemo='1.4')) ['/sample-install/extdemo-1.4-py2.4-unix-i686.egg'] >>> ls(dest) d extdemo-1.4-py2.4-unix-i686.egg Handling custom build options for extensions in develop eggs ------------------------------------------------------------ The develop function is similar to the build function, except that, rather than building an egg from a source distribution, it creates a develop egg (an egg link) from a source directory containing a setup.py script. The develop function takes 2 positional arguments: setup The path to a setup script, typically named "setup.py", or a directory containing a setup.py script. dest The directory to install the egg link to. It supports some optional keyword arguments: build_ext A dictionary of options to be passed to the distutils build_ext command when building extensions. executable A path to a Python executable. Distributions will be installed using this executable and will be for the matching Python version. We have a local directory containing the extdemo source: >>> ls(extdemo) - MANIFEST - MANIFEST.in - README - extdemo.c - setup.py Now, we can use the develop function to create a develop egg from the source directory: >>> zc.buildout.easy_install.develop( ... extdemo, dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}) '/sample-install/extdemo.egg-link' The name of the egg link created is returned. Now if we look in our destination directory, we see we have an extdemo egg link: >>> ls(dest) d extdemo-1.4-py2.4-unix-i686.egg - extdemo.egg-link And that the source directory contains the compiled extension: >>> ls(extdemo) - MANIFEST - MANIFEST.in - README d build - extdemo.c d extdemo.egg-info - extdemo.so - setup.py Download cache -------------- Normally, when distributions are installed, if any processing is needed, they are downloaded from the internet to a temporary directory and then installed from there. A download cache can be used to avoid the download step. 
This can be useful to reduce network access and to create source distributions of an entire buildout. A download cache is specified by calling the download_cache function. The function always returns the previous setting. If no argument is passed, then the setting is unchanged. If an argument is passed, the download cache is set to the given path, which must point to an existing directory. Passing None clears the cache setting. To see this work, we'll create a directory and set it as the cache directory: >>> cache = tmpdir('cache') >>> zc.buildout.easy_install.download_cache(cache) We'll recreate our destination directory: >>> remove(dest) >>> dest = tmpdir('sample-install') We'd like to see what is being fetched from the server, so we'll enable server logging: >>> get(link_server+'enable_server_logging') GET 200 /enable_server_logging '' Now, if we install demo, and extdemo: >>> ws = zc.buildout.easy_install.install( ... ['demo==0.2'], dest, ... links=[link_server], index=link_server+'index/', ... always_unzip=True) GET 200 / GET 404 /index/demo/ GET 200 /index/ GET 200 /demo-0.2-py2.4.egg GET 404 /index/demoneeded/ GET 200 /demoneeded-1.1.zip >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/') GET 404 /index/extdemo/ GET 200 /extdemo-1.5.zip ['/sample-install/extdemo-1.5-py2.4-linux-i686.egg'] Not only will we get eggs in our destination directory: >>> ls(dest) d demo-0.2-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.5-py2.4-linux-i686.egg But we'll get distributions in the cache directory: >>> ls(cache) - demo-0.2-py2.4.egg - demoneeded-1.1.zip - extdemo-1.5.zip The cache directory contains uninstalled distributions, such as zipped eggs or source distributions. Let's recreate our destination directory and clear the index cache: >>> remove(dest) >>> dest = tmpdir('sample-install') >>> zc.buildout.easy_install.clear_index_cache() Now when we install the distributions: >>> ws = zc.buildout.easy_install.install( ... ['demo==0.2'], dest, ... links=[link_server], index=link_server+'index/', ... always_unzip=True) GET 200 / GET 404 /index/demo/ GET 200 /index/ GET 404 /index/demoneeded/ >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/') GET 404 /index/extdemo/ ['/sample-install/extdemo-1.5-py2.4-linux-i686.egg'] >>> ls(dest) d demo-0.2-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.5-py2.4-linux-i686.egg Note that we didn't download the distributions from the link server. If we remove the restriction on demo, we'll download a newer version from the link server: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, ... links=[link_server], index=link_server+'index/', ... always_unzip=True) GET 200 /demo-0.3-py2.4.egg Normally, the download cache is the preferred source of downloads, but not the only one. Installing solely from a download cache --------------------------------------- A download cache can be used as the basis of application source releases. In an application source release, we want to distribute an application that can be built without making any network accesses. In this case, we distribute a download cache and tell the easy_install module to install from the download cache only, without making network accesses. 
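For orientation, a buildout-based project usually expresses this policy through options in its configuration rather than by calling these functions directly. A minimal sketch (the directory name ``downloads`` is illustrative) might look like::

    [buildout]
    download-cache = downloads
    install-from-cache = true

With settings like these, normal runs populate the ``downloads`` directory, and turning on ``install-from-cache`` asks buildout to satisfy distribution requirements from that cache instead of the network.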
The install_from_cache function can be used to signal that packages should be installed only from the download cache. The function always returns the previous setting. Calling it with no arguments returns the current setting without changing it: >>> zc.buildout.easy_install.install_from_cache() False Calling it with a boolean value changes the setting and returns the previous setting: >>> zc.buildout.easy_install.install_from_cache(True) False Let's remove demo-0.3-py2.4.egg from the cache, clear the index cache, recreate the destination directory, and reinstall demo: >>> for f in os.listdir(cache): ... if f.startswith('demo-0.3-'): ... remove(cache, f) >>> zc.buildout.easy_install.clear_index_cache() >>> remove(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, ... links=[link_server], index=link_server+'index/', ... always_unzip=True) >>> ls(dest) d demo-0.2-py2.4.egg d demoneeded-1.1-py2.4.egg This time, we didn't download from or even query the link server. .. Disable the download cache: >>> zc.buildout.easy_install.download_cache(None) '/cache' >>> zc.buildout.easy_install.install_from_cache(False) True zc.buildout-1.7.1/src/zc/buildout/extends-cache.txt0000644000076500007650000003563012111414155021664 0ustar jimjim00000000000000Caching extended configuration ============================== As mentioned in the general buildout documentation, configuration files can extend each other, including the ability to download configuration being extended from a URL. If desired, zc.buildout caches downloaded configuration in order to be able to use it when run offline. As we're going to talk about downloading things, let's start an HTTP server. Also, all of the following will take place inside the sample buildout. >>> server_data = tmpdir('server_data') >>> server_url = start_server(server_data) >>> cd(sample_buildout) We also use a fresh directory for temporary files in order to make sure that all temporary files have been cleaned up in the end: >>> import tempfile >>> old_tempdir = tempfile.tempdir >>> tempfile.tempdir = tmpdir('tmp') Basic use of the extends cache ------------------------------ We put some base configuration on a server and reference it from a sample buildout: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... foo = bar ... """) >>> write('buildout.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... """ % server_url) When trying to run this buildout offline, we'll find that we cannot read all of the required configuration: >>> print system(buildout + ' -o') While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. Trying the same online, we can: >>> print system(buildout) Unused options for buildout: 'foo'. As long as we haven't said anything about caching downloaded configuration, nothing gets cached. Offline mode will still cause the buildout to fail: >>> print system(buildout + ' -o') While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. Let's now specify a cache for base configuration files. This cache is different from the download cache used by recipes for caching distributions and other files; one might, however, use a namespace subdirectory of the download cache for it. The configuration cache we specify will be created when running buildout and the base.cfg file will be put in it (with the file name being a hash of the complete URL): >>> mkdir('cache') >>> write('buildout.cfg', """\ ... [buildout] ... 
extends = %sbase.cfg ... extends-cache = cache ... """ % server_url) >>> print system(buildout) Unused options for buildout: 'foo'. >>> cache = join(sample_buildout, 'cache') >>> ls(cache) - 5aedc98d7e769290a29d654a591a3a45 >>> import os >>> cat(cache, os.listdir(cache)[0]) [buildout] parts = foo = bar We can now run buildout offline as it will read base.cfg from the cache: >>> print system(buildout + ' -o') Unused options for buildout: 'foo'. The cache is being used purely as a fall-back in case we are offline or don't have access to a configuration file to be downloaded. As long as we are online, buildout attempts to download a fresh copy of each file even if a cached copy of the file exists. To see this, we put different configuration in the same place on the server and run buildout in offline mode so it takes base.cfg from the cache: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... bar = baz ... """) >>> print system(buildout + ' -o') Unused options for buildout: 'foo'. In online mode, buildout will download and use the modified version: >>> print system(buildout) Unused options for buildout: 'bar'. Trying offline mode again, the new version will be used as it has been put in the cache now: >>> print system(buildout + ' -o') Unused options for buildout: 'bar'. Clean up: >>> rmdir(cache) Specifying extends cache and offline mode ----------------------------------------- Normally, the values of buildout options such as the location of a download cache or whether to use offline mode are determined by first reading the user's default configuration, updating it with the project's configuration and finally applying command-line options. User and project configuration are assembled by reading a file such as ``~/.buildout/default.cfg``, ``buildout.cfg`` or a URL given on the command line, recursively (depth-first) downloading any base configuration specified by the ``buildout:extends`` option read from each of those config files, and finally evaluating each config file to provide default values for options not yet read. This works fine for all options that do not influence how configuration is downloaded in the first place. The ``extends-cache`` and ``offline`` options, however, are treated differently from the procedure described in order to make it simple and obvious to see where a particular configuration file came from under any particular circumstances. - Offline and extends-cache settings are read from the two root config files exclusively. Otherwise one could construct configuration files that, when read, imply that they should have been read from a different source than they have. Also, specifying the extends cache within a file that might have to be taken from the cache before being read wouldn't make a lot of sense. - Offline and extends-cache settings given by the user's defaults apply to the process of assembling the project's configuration. If no extends cache has been specified by the user's default configuration, the project's root config file must be available, be it from disk or from the net. - Offline mode turned on by the ``-o`` command line option is honoured from the beginning even though command line options are applied to the configuration last. If offline mode is not requested by the command line, it may be switched on by either the user's or the project's config root. Extends cache ~~~~~~~~~~~~~ Let's see the above rules in action. 
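To keep the following scenario straight, here is a sketch of the two extension chains it sets up at first (schematic only; later steps move the ``extends-cache`` options around to show how their placement matters)::

    ~/.buildout/default.cfg   (extends-cache = user-cache)
        extends -> fancy_default.cfg
            extends -> <server>/base_default.cfg

    buildout.cfg              (extends-cache = cache)
        extends -> fancy.cfg
            extends -> <server>/base.cfg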
We create a new home directory for our user and write user and project configuration that recursively extends online bases, using different caches: >>> mkdir('home') >>> mkdir('home', '.buildout') >>> mkdir('cache') >>> mkdir('user-cache') >>> os.environ['HOME'] = join(sample_buildout, 'home') >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... extends-cache = user-cache ... """) >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... """ % server_url) >>> write(server_data, 'base_default.cfg', """\ ... [buildout] ... foo = bar ... offline = false ... """) >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... extends-cache = cache ... """) >>> write('fancy.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... """ % server_url) >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... offline = false ... """) Buildout will now assemble its configuration from all of these 6 files, defaults first. The online resources end up in the respective extends caches: >>> print system(buildout) Unused options for buildout: 'foo'. >>> ls('user-cache') - 10e772cf422123ef6c64ae770f555740 >>> cat('user-cache', os.listdir('user-cache')[0]) [buildout] foo = bar offline = false >>> ls('cache') - c72213127e6eb2208a3e1fc1dba771a7 >>> cat('cache', os.listdir('cache')[0]) [buildout] parts = offline = false If, on the other hand, the extends caches are specified in files that get extended themselves, they won't be used for assembling the configuration they belong to (user's or project's, resp.). The extends cache specified by the user's defaults does, however, apply to downloading project configuration. Let's rewrite the config files, clean out the caches and re-run buildout: >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... """) >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... extends-cache = user-cache ... """ % server_url) >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... """) >>> write('fancy.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... extends-cache = cache ... """ % server_url) >>> remove('user-cache', os.listdir('user-cache')[0]) >>> remove('cache', os.listdir('cache')[0]) >>> print system(buildout) Unused options for buildout: 'foo'. >>> ls('user-cache') - 0548bad6002359532de37385bb532e26 >>> cat('user-cache', os.listdir('user-cache')[0]) [buildout] parts = offline = false >>> ls('cache') Clean up: >>> rmdir('user-cache') >>> rmdir('cache') Offline mode and installation from cache ----------------------------~~~~~~~~~~~~ If we run buildout in offline mode now, it will fail because it cannot get at the remote configuration file needed by the user's defaults: >>> print system(buildout + ' -o') While: Initializing. Error: Couldn't download 'http://localhost/base_default.cfg' in offline mode. Let's now successively turn on offline mode by different parts of the configuration and see when buildout applies this setting in each case: >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... offline = true ... """) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base_default.cfg' in offline mode. >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... 
""") >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... offline = true ... """ % server_url) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... """ % server_url) >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... offline = true ... """) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... """) >>> write('fancy.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... offline = true ... """ % server_url) >>> print system(buildout) Unused options for buildout: 'foo'. The ``install-from-cache`` option is treated accordingly: >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... install-from-cache = true ... """) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base_default.cfg' in offline mode. >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... """) >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... install-from-cache = true ... """ % server_url) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... """ % server_url) >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... install-from-cache = true ... """) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... """) >>> write('fancy.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... install-from-cache = true ... """ % server_url) >>> print system(buildout) While: Installing. Checking for upgrades. An internal error occurred ... ValueError: install_from_cache set to true with no download cache >>> rmdir('home', '.buildout') Newest and non-newest behaviour for extends cache ------------------------------------------------- While offline mode forbids network access completely, 'newest' mode determines whether to look for updated versions of a resource even if some version of it is already present locally. If we run buildout in newest mode (``newest = true``), the configuration files are updated with each run: >>> mkdir("cache") >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... """) >>> write('buildout.cfg', """\ ... [buildout] ... extends-cache = cache ... extends = %sbase.cfg ... """ % server_url) >>> print system(buildout) >>> ls('cache') - 5aedc98d7e769290a29d654a591a3a45 >>> cat('cache', os.listdir(cache)[0]) [buildout] parts = A change to ``base.cfg`` is picked up on the next buildout run: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... foo = bar ... """) >>> print system(buildout + " -n") Unused options for buildout: 'foo'. >>> cat('cache', os.listdir(cache)[0]) [buildout] parts = foo = bar In contrast, when not using ``newest`` mode (``newest = false``), the files already present in the extends cache will not be updated: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... 
""") >>> print system(buildout + " -N") Unused options for buildout: 'foo'. >>> cat('cache', os.listdir(cache)[0]) [buildout] parts = foo = bar Even when updating base configuration files with a buildout run, any given configuration file will be downloaded only once during that particular run. If some base configuration file is extended more than once, its cached copy is used: >>> write(server_data, 'baseA.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... foo = bar ... """ % server_url) >>> write(server_data, 'baseB.cfg', """\ ... [buildout] ... extends-cache = cache ... extends = %sbase.cfg ... bar = foo ... """ % server_url) >>> write('buildout.cfg', """\ ... [buildout] ... extends-cache = cache ... newest = true ... extends = %sbaseA.cfg %sbaseB.cfg ... """ % (server_url, server_url)) >>> print system(buildout + " -n") Unused options for buildout: 'bar' 'foo'. (XXX We patch download utility's API to produce readable output for the test; a better solution would utilise the logging already done by the utility.) >>> import zc.buildout >>> old_download = zc.buildout.download.Download.download >>> def wrapper_download(self, url, md5sum=None, path=None): ... print "The URL %s was downloaded." % url ... return old_download(url, md5sum, path) >>> zc.buildout.download.Download.download = wrapper_download >>> zc.buildout.buildout.main([]) The URL http://localhost/baseA.cfg was downloaded. The URL http://localhost/base.cfg was downloaded. The URL http://localhost/baseB.cfg was downloaded. Unused options for buildout: 'bar' 'foo'. >>> zc.buildout.download.Download.download = old_download The deprecated ``extended-by`` option ------------------------------------- The ``buildout`` section used to recognise an option named ``extended-by`` that was deprecated at some point and removed in the 1.5 line. Since ignoring this option silently was considered harmful as a matter of principle, a UserError is raised if that option is encountered now: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... extended-by = foo.cfg ... """) >>> print system(buildout) While: Initializing. Error: No-longer supported "extended-by" option found in http://localhost/base.cfg. Clean up -------- We should have cleaned up all temporary files created by downloading things: >>> ls(tempfile.tempdir) Reset the global temporary directory: >>> tempfile.tempdir = old_tempdir zc.buildout-1.7.1/src/zc/buildout/repeatable.txt0000644000076500007650000001215412111414155021251 0ustar jimjim00000000000000Repeatable buildouts: controlling eggs used =========================================== One of the goals of zc.buildout is to provide enough control to make buildouts repeatable. It should be possible to check the buildout configuration files for a project into a version control system and later use the checked in files to get the same buildout, subject to changes in the environment outside the buildout. An advantage of using Python eggs is that depenencies of eggs used are automatically determined and used. The automatic inclusion of depenent distributions is at odds with the goal of repeatable buildouts. To support repeatable buildouts, a versions section can be created with options for each distribution name whos version is to be fixed. The section can then be specified via the buildout versions option. To see how this works, we'll create two versions of a recipe egg: >>> mkdir('recipe') >>> write('recipe', 'recipe.py', ... ''' ... class Recipe: ... def __init__(*a): pass ... def install(self): ... 
print 'recipe v1' ... return () ... update = install ... ''') >>> write('recipe', 'setup.py', ... ''' ... from setuptools import setup ... setup(name='spam', version='1', py_modules=['recipe'], ... entry_points={'zc.buildout': ['default = recipe:Recipe']}, ... ) ... ''') >>> write('recipe', 'README', '') >>> print system(buildout+' setup recipe bdist_egg'), # doctest: +ELLIPSIS Running setup script 'recipe/setup.py'. ... >>> rmdir('recipe', 'build') >>> write('recipe', 'recipe.py', ... ''' ... class Recipe: ... def __init__(*a): pass ... def install(self): ... print 'recipe v2' ... return () ... update = install ... ''') >>> write('recipe', 'setup.py', ... ''' ... from setuptools import setup ... setup(name='spam', version='2', py_modules=['recipe'], ... entry_points={'zc.buildout': ['default = recipe:Recipe']}, ... ) ... ''') >>> print system(buildout+' setup recipe bdist_egg'), # doctest: +ELLIPSIS Running setup script 'recipe/setup.py'. ... and we'll configure a buildout to use it: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = foo ... find-links = %s ... ... [foo] ... recipe = spam ... ''' % join('recipe', 'dist')) If we run the buildout, it will use version 2: >>> print system(buildout), Getting distribution for 'spam'. Got spam 2. Installing foo. recipe v2 We can specify a versions section that lists our recipe and name it in the buildout section: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = foo ... find-links = %s ... versions = release-1 ... ... [release-1] ... spam = 1 ... eggs = 2.2 ... ... [foo] ... recipe = spam ... ''' % join('recipe', 'dist')) Here we created a release-1 section listing the version 1 for the spam distribution. We told the buildout to use it by specifying release-1 as in the versions option. Now, if we run the buildout, we'll use version 1 of the spam recipe: >>> print system(buildout), Getting distribution for 'spam==1'. Got spam 1. Uninstalling foo. Installing foo. recipe v1 Running the buildout in verbose mode will help us get information about versions used. If we run the buildout in verbose mode without specifying a versions section: >>> print system(buildout+' buildout:versions= -v'), # doctest: +ELLIPSIS Installing 'zc.buildout >=1.99, <2dev', 'setuptools'. We have a develop egg: zc.buildout 1.0.0. We have the best distribution that satisfies 'setuptools'. Picked: setuptools = 0.6 Installing 'spam'. We have the best distribution that satisfies 'spam'. Picked: spam = 2. Uninstalling foo. Installing foo. recipe v2 We'll get output that includes lines that tell us what versions buildout chose a for us, like:: zc.buildout.easy_install.picked: spam = 2 This allows us to discover versions that are picked dynamically, so that we can fix them in a versions section. If we run the buildout with the versions section: >>> print system(buildout+' -v'), # doctest: +ELLIPSIS Installing 'zc.buildout >=1.99, <2dev', 'setuptools'. We have a develop egg: zc.buildout 1.0.0. We have the best distribution that satisfies 'setuptools'. Picked: setuptools = 0.6 Installing 'spam'. We have the distribution that satisfies 'spam==1'. Uninstalling foo. Installing foo. recipe v1 We won't get output for the spam distribution, which we didn't pick, but we will get output for setuptools, which we didn't specify versions for. You can request buildout to generate an error if it picks any versions: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = foo ... find-links = %s ... versions = release-1 ... allow-picked-versions = false ... ... 
[release-1] ... spam = 1 ... eggs = 2.2 ... ... [foo] ... recipe = spam ... ''' % join('recipe', 'dist')) zc.buildout-1.7.1/src/zc/buildout/rmtree.py0000644000076500007650000000315412111414155020254 0ustar jimjim00000000000000############################################################################## # # Copyright (c) 2006 Zope Foundation and Contributors. # All Rights Reserved. # # This software is subject to the provisions of the Zope Public License, # Version 2.1 (ZPL). A copy of the ZPL should accompany this distribution. # THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED # WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED # WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS # FOR A PARTICULAR PURPOSE. # ############################################################################## import shutil import os import doctest def rmtree (path): """ A variant of shutil.rmtree which tries hard to be successful On windows shutil.rmtree aborts when it tries to delete a read only file. This tries to chmod the file to writeable and retries before giving up. >>> from tempfile import mkdtemp Let's make a directory ... >>> d = mkdtemp() and make sure it is actually there >>> os.path.isdir (d) 1 Now create a file ... >>> foo = os.path.join (d, 'foo') >>> open (foo, 'w').write ('huhu') and make it unwriteable >>> os.chmod (foo, 0400) rmtree should be able to remove it: >>> rmtree (d) and now the directory is gone >>> os.path.isdir (d) 0 """ def retry_writeable (func, path, exc): os.chmod (path, 0600) func (path) shutil.rmtree (path, onerror = retry_writeable) def test_suite(): return doctest.DocTestSuite() if "__main__" == __name__: doctest.testmod() zc.buildout-1.7.1/src/zc/buildout/runsetup.txt0000644000076500007650000000320112111414155021023 0ustar jimjim00000000000000Running setup scripts ===================== Buildouts are often used to work on packages that will be distributed as eggs. During development, we use develop eggs. When you've completed a development cycle, you'll need to run your setup script to generate a distribution and, perhaps, uploaded it to the Python package index. If your script uses setuptools, you'll need setuptools in your Python path, which may be an issue if you haven't installed setuptools into your Python installation. The buildout setup command is helpful in a situation like this. It can be used to run a setup script and it does so with the setuptools egg in the Python path and with setuptools already imported. The fact that setuptools is imported means that you can use setuptools-based commands, like bdist_egg even with packages that don't use setuptools. To illustrate this, we'll create a package in a sample buildout: >>> mkdir('hello') >>> write('hello', 'hello.py', 'print "Hello World!"') >>> write('hello', 'README', 'This is hello') >>> write('hello', 'setup.py', ... """ ... from distutils.core import setup ... setup(name="hello", ... version="1.0", ... py_modules=["hello"], ... author="Bob", ... author_email="bob@foo.com", ... ) ... """) We can use the buildout command to generate the hello egg: >>> print system(buildout +' setup hello -q bdist_egg'), Running setup script 'hello/setup.py'. zip_safe flag not set; analyzing archive contents... 
The hello directory now has a hello egg in its dist directory: >>> ls('hello', 'dist') - hello-1.0-py2.4.egg zc.buildout-1.7.1/src/zc/buildout/setup.txt0000644000076500007650000000361612111414155020310 0ustar jimjim00000000000000Using zc.buildout to run setup scripts ====================================== zc buildout has a convenience command for running setup scripts. Why? There are two reasons. If a setup script doesn't import setuptools, you can't use any setuptools-provided commands, like bdist_egg. When buildout runs a setup script, it arranges to import setuptools before running the script so setuptools-provided commands are available. If you use a squeaky-clean Python to do your development, a setup script that imports setuptools will fail because setuptools isn't in the path. Because buildout requires setuptools and knows where it has installed a setuptools egg, it adds the setuptools egg to the Python path before running the script. To run a setup script, use the buildout setup command, passing the name of a script or a directory containing a setup script and arguments to the script. Let's look at an example: >>> mkdir('test') >>> cd('test') >>> write('setup.py', ... ''' ... from distutils.core import setup ... setup(name='sample') ... ''') We've created a super simple (stupid) setup script. Note that it doesn't import setuptools. Let's try running it to create an egg. We'll use the buildout script from our sample buildout: >>> print system(buildout+' setup'), ... # doctest: +NORMALIZE_WHITESPACE Error: The setup command requires the path to a setup script or directory containing a setup script, and its arguments. Oops, we forgot to give the name of the setup script: >>> print system(buildout+' setup setup.py bdist_egg'), ... # doctest: +ELLIPSIS Running setup script 'setup.py'. ... >>> ls('dist') - sample-0.0.0-py2.5.egg Note that we can specify a directory name. This is often shorter and preferred by the lazy :) >>> print system(buildout+' setup . bdist_egg'), # doctest: +ELLIPSIS Running setup script './setup.py'. ... zc.buildout-1.7.1/src/zc/buildout/testing.py0000644000076500007650000005070512111414155020437 0ustar jimjim00000000000000############################################################################## # # Copyright (c) 2004-2009 Zope Foundation and Contributors. # All Rights Reserved. # # This software is subject to the provisions of the Zope Public License, # Version 2.1 (ZPL). A copy of the ZPL should accompany this distribution. # THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED # WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED # WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS # FOR A PARTICULAR PURPOSE. 
# ############################################################################## """Various test-support utility functions $Id$ """ import BaseHTTPServer import errno import logging import os import pkg_resources import random import re import shutil import socket import subprocess import sys import tempfile import textwrap import threading import time import urllib2 import zc.buildout.buildout import zc.buildout.easy_install from zc.buildout.rmtree import rmtree fsync = getattr(os, 'fsync', lambda fileno: None) is_win32 = sys.platform == 'win32' # Only some unixes allow scripts in shebang lines: script_in_shebang = is_win32 if sys.platform == 'linux2': f = subprocess.Popen('uname -r', shell=True, stdout=subprocess.PIPE).stdout r = f.read().strip() f.close() r = tuple(map(int, re.match(r'\d+(\.\d+)*', r).group(0).split('.'))) if r >= (2, 6, 27, 9): # http://www.in-ulm.de/~mascheck/various/shebang/ script_in_shebang = True setuptools_location = pkg_resources.working_set.find( pkg_resources.Requirement.parse('setuptools')).location def cat(dir, *names): path = os.path.join(dir, *names) if (not os.path.exists(path) and is_win32 and os.path.exists(path+'-script.py') ): path = path+'-script.py' print open(path).read(), def ls(dir, *subs): if subs: dir = os.path.join(dir, *subs) names = os.listdir(dir) names.sort() for name in names: if os.path.isdir(os.path.join(dir, name)): print 'd ', elif os.path.islink(os.path.join(dir, name)): print 'l ', else: print '- ', print name def mkdir(*path): os.mkdir(os.path.join(*path)) def remove(*path): path = os.path.join(*path) if os.path.isdir(path): shutil.rmtree(path) else: os.remove(path) def rmdir(*path): shutil.rmtree(os.path.join(*path)) def write(dir, *args): path = os.path.join(dir, *(args[:-1])) f = open(path, 'w') f.write(args[-1]) f.flush() fsync(f.fileno()) f.close() def clean_up_pyc(*path): base, filename = os.path.join(*path[:-1]), path[-1] if filename.endswith('.py'): filename += 'c' # .py -> .pyc for path in ( os.path.join(base, filename), os.path.join(base, '__pycache__', filename), ): if os.path.exists(path): remove(path) ## FIXME - check for other platforms MUST_CLOSE_FDS = not sys.platform.startswith('win') def system(command, input=''): env = dict(os.environ) env['COLUMNS'] = '80' p = subprocess.Popen(command, shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, close_fds=MUST_CLOSE_FDS, env=env, ) i, o, e = (p.stdin, p.stdout, p.stderr) if input: i.write(input) i.close() result = o.read() + e.read() o.close() e.close() return result def call_py(interpreter, cmd, flags=None): if sys.platform == 'win32': args = ['"%s"' % arg for arg in (interpreter, flags, cmd) if arg] args.insert(-1, '"-c"') return system('"%s"' % ' '.join(args)) else: cmd = repr(cmd) return system( ' '.join(arg for arg in (interpreter, flags, '-c', cmd) if arg)) def get(url): return urllib2.urlopen(url).read() def _runsetup(setup, executable, *args): if os.path.isdir(setup): setup = os.path.join(setup, 'setup.py') d = os.path.dirname(setup) args = list(args) args.insert(0, '-q') env = dict(os.environ) if executable == sys.executable: env['PYTHONPATH'] = setuptools_location # else pass an executable that has setuptools! See testselectingpython.py. 
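    # The remainder runs the setup script from its own directory (so relative
    # paths in the script resolve), prints the script's output only if it
    # fails, removes any 'build' directory it leaves behind, and restores
    # the original working directory.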
here = os.getcwd() try: os.chdir(d) p = subprocess.Popen( [executable, setup] + args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, close_fds=MUST_CLOSE_FDS, env=env) out = p.stdout.read() if p.wait(): print out if os.path.exists('build'): rmtree('build') finally: os.chdir(here) def sdist(setup, dest): _runsetup(setup, sys.executable, 'sdist', '-d', dest, '--formats=zip') def bdist_egg(setup, executable, dest): _runsetup(setup, executable, 'bdist_egg', '-d', dest) def sys_install(setup, dest): _runsetup(setup, sys.executable, 'install', '--install-purelib', dest, '--install-scripts', dest, '--record', os.path.join(dest, '__added_files__'), '--single-version-externally-managed') def find_python(version): env_friendly_version = ''.join(version.split('.')) e = os.environ.get('PYTHON%s' % env_friendly_version) if e is not None: return e if is_win32: e = '\Python%s\python.exe' % env_friendly_version if os.path.exists(e): return e else: cmd = 'python%s -c "import sys; print sys.executable"' % version p = subprocess.Popen(cmd, shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, close_fds=MUST_CLOSE_FDS) i, o = (p.stdin, p.stdout) i.close() e = o.read().strip() o.close() if os.path.exists(e): return e cmd = 'python -c "import sys; print \'%s.%s\' % sys.version_info[:2]"' p = subprocess.Popen(cmd, shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, close_fds=MUST_CLOSE_FDS) i, o = (p.stdin, p.stdout) i.close() e = o.read().strip() o.close() if e == version: cmd = 'python -c "import sys; print sys.executable"' p = subprocess.Popen(cmd, shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, close_fds=MUST_CLOSE_FDS) i, o = (p.stdin, p.stdout) i.close() e = o.read().strip() o.close() if os.path.exists(e): return e raise ValueError( "Couldn't figure out the executable for Python %(version)s.\n" "Set the environment variable PYTHON%(envversion)s to the location\n" "of the Python %(version)s executable before running the tests." % {'version': version, 'envversion': env_friendly_version}) def wait_until(label, func, *args, **kw): if 'timeout' in kw: kw = dict(kw) timeout = kw.pop('timeout') else: timeout = 30 deadline = time.time()+timeout while time.time() < deadline: if func(*args, **kw): return time.sleep(0.01) raise ValueError('Timed out waiting for: '+label) def get_installer_values(): """Get the current values for the easy_install module. This is necessary because instantiating a Buildout will force the Buildout's values on the installer. Returns a dict of names-values suitable for set_installer_values.""" names = ('default_versions', 'download_cache', 'install_from_cache', 'prefer_final', 'include_site_packages', 'allowed_eggs_from_site_packages', 'use_dependency_links', 'allow_picked_versions', 'always_unzip' ) values = {} for name in names: values[name] = getattr(zc.buildout.easy_install, name)() return values def set_installer_values(values): """Set the given values on the installer.""" for name, value in values.items(): getattr(zc.buildout.easy_install, name)(value) def make_buildout(executable=None): """Make a buildout that uses this version of zc.buildout.""" # Create a basic buildout.cfg to avoid a warning from buildout. open('buildout.cfg', 'w').write( "[buildout]\nparts =\n" ) # Get state of installer defaults so we can reinstate them (instantiating # a Buildout will force the Buildout's defaults on the installer). 
installer_values = get_installer_values() # Use the buildout bootstrap command to create a buildout config = [ ('buildout', 'log-level', 'WARNING'), # trick bootstrap into putting the buildout develop egg # in the eggs dir. ('buildout', 'develop-eggs-directory', 'eggs'), ] if executable is not None: config.append(('buildout', 'executable', executable)) zc.buildout.buildout.Buildout( 'buildout.cfg', config, user_defaults=False, ).bootstrap([]) # Create the develop-eggs dir, which didn't get created the usual # way due to the trick above: os.mkdir('develop-eggs') # Reinstate the default values of the installer. set_installer_values(installer_values) def buildoutSetUp(test): test.globs['__tear_downs'] = __tear_downs = [] test.globs['register_teardown'] = register_teardown = __tear_downs.append installer_values = get_installer_values() register_teardown( lambda: set_installer_values(installer_values) ) here = os.getcwd() register_teardown(lambda: os.chdir(here)) handlers_before_set_up = logging.getLogger().handlers[:] def restore_root_logger_handlers(): root_logger = logging.getLogger() for handler in root_logger.handlers[:]: root_logger.removeHandler(handler) for handler in handlers_before_set_up: root_logger.addHandler(handler) register_teardown(restore_root_logger_handlers) base = tempfile.mkdtemp('buildoutSetUp') base = os.path.realpath(base) register_teardown(lambda base=base: rmtree(base)) old_home = os.environ.get('HOME') os.environ['HOME'] = os.path.join(base, 'bbbBadHome') def restore_home(): if old_home is None: del os.environ['HOME'] else: os.environ['HOME'] = old_home register_teardown(restore_home) base = os.path.join(base, '_TEST_') os.mkdir(base) tmp = tempfile.mkdtemp('buildouttests') register_teardown(lambda: rmtree(tmp)) zc.buildout.easy_install.default_index_url = 'file://'+tmp os.environ['buildout-testing-index-url'] = ( zc.buildout.easy_install.default_index_url) os.environ.pop('PYTHONPATH', None) def tmpdir(name): path = os.path.join(base, name) mkdir(path) return path sample = tmpdir('sample-buildout') os.chdir(sample) make_buildout() def start_server(path): port, thread = _start_server(path, name=path) url = 'http://localhost:%s/' % port register_teardown(lambda: stop_server(url, thread)) return url def make_py(initialization=''): """Returns paths to new executable and to its site-packages. """ buildout = tmpdir('executable_buildout') site_packages_dir = os.path.join(buildout, 'site-packages') mkdir(site_packages_dir) old_wd = os.getcwd() os.chdir(buildout) make_buildout() # Normally we don't process .pth files in extra-paths. We want to # in this case so that we can test with setuptools system installs # (--single-version-externally-managed), which use .pth files. 
initialization = ( ('import sys\n' 'import site\n' 'known_paths = set(sys.path)\n' 'site_packages_dir = %r\n' 'site.addsitedir(site_packages_dir, known_paths)\n' ) % (site_packages_dir,)) + initialization initialization = '\n'.join( ' ' + line for line in initialization.split('\n')) install_develop( 'zc.recipe.egg', os.path.join(buildout, 'develop-eggs')) install_develop( 'z3c.recipe.scripts', os.path.join(buildout, 'develop-eggs')) write('buildout.cfg', textwrap.dedent('''\ [buildout] parts = py include-site-packages = false exec-sitecustomize = false [py] recipe = z3c.recipe.scripts interpreter = py initialization = %(initialization)s extra-paths = %(site-packages)s eggs = setuptools ''') % { 'initialization': initialization, 'site-packages': site_packages_dir}) system(os.path.join(buildout, 'bin', 'buildout')) os.chdir(old_wd) return ( os.path.join(buildout, 'bin', 'py'), site_packages_dir) cdpaths = [] def cd(*path): path = os.path.join(*path) cdpaths.append(os.path.abspath(os.getcwd())) os.chdir(path) def uncd(): os.chdir(cdpaths.pop()) test.globs.update(dict( sample_buildout = sample, ls = ls, cat = cat, mkdir = mkdir, rmdir = rmdir, remove = remove, tmpdir = tmpdir, write = write, system = system, call_py = call_py, get = get, cd = cd, uncd = uncd, join = os.path.join, sdist = sdist, bdist_egg = bdist_egg, start_server = start_server, buildout = os.path.join(sample, 'bin', 'buildout'), wait_until = wait_until, make_py = make_py, clean_up_pyc = clean_up_pyc, )) def buildoutTearDown(test): for f in test.globs['__tear_downs']: f() class Server(BaseHTTPServer.HTTPServer): def __init__(self, tree, *args): BaseHTTPServer.HTTPServer.__init__(self, *args) self.tree = os.path.abspath(tree) __run = True def serve_forever(self): while self.__run: self.handle_request() def handle_error(self, *_): self.__run = False class Handler(BaseHTTPServer.BaseHTTPRequestHandler): Server.__log = False def __init__(self, request, address, server): self.__server = server self.tree = server.tree BaseHTTPServer.BaseHTTPRequestHandler.__init__( self, request, address, server) def do_GET(self): if '__stop__' in self.path: raise SystemExit if self.path == '/enable_server_logging': self.__server.__log = True self.send_response(200) return if self.path == '/disable_server_logging': self.__server.__log = False self.send_response(200) return path = os.path.abspath(os.path.join(self.tree, *self.path.split('/'))) if not ( ((path == self.tree) or path.startswith(self.tree+os.path.sep)) and os.path.exists(path) ): self.send_response(404, 'Not Found') #self.send_response(200) out = 'Not Found' #out = '\n'.join(self.tree, self.path, path) self.send_header('Content-Length', str(len(out))) self.send_header('Content-Type', 'text/html') self.end_headers() self.wfile.write(out) return self.send_response(200) if os.path.isdir(path): out = ['\n'] names = os.listdir(path) names.sort() for name in names: if os.path.isdir(os.path.join(path, name)): name += '/' out.append('%s
\n' % (name, name)) out.append('\n') out = ''.join(out) self.send_header('Content-Length', str(len(out))) self.send_header('Content-Type', 'text/html') else: out = open(path, 'rb').read() self.send_header('Content-Length', len(out)) if path.endswith('.egg'): self.send_header('Content-Type', 'application/zip') elif path.endswith('.gz'): self.send_header('Content-Type', 'application/x-gzip') elif path.endswith('.zip'): self.send_header('Content-Type', 'application/x-gzip') else: self.send_header('Content-Type', 'text/html') self.end_headers() self.wfile.write(out) def log_request(self, code): if self.__server.__log: print '%s %s %s' % (self.command, code, self.path) def _run(tree, port): server_address = ('localhost', port) httpd = Server(tree, server_address, Handler) httpd.serve_forever() def get_port(): for i in range(10): port = random.randrange(20000, 30000) s = socket.socket(socket.AF_INET, socket.SOCK_STREAM) try: try: s.connect(('localhost', port)) except socket.error: return port finally: s.close() raise RuntimeError, "Can't find port" def _start_server(tree, name=''): port = get_port() thread = threading.Thread(target=_run, args=(tree, port), name=name) thread.setDaemon(True) thread.start() wait(port, up=True) return port, thread def start_server(tree): return _start_server(tree)[0] def stop_server(url, thread=None): try: urllib2.urlopen(url+'__stop__') except Exception: pass if thread is not None: thread.join() # wait for thread to stop def wait(port, up): addr = 'localhost', port for i in range(120): time.sleep(0.25) try: s = socket.socket(socket.AF_INET, socket.SOCK_STREAM) s.connect(addr) s.close() if up: break except socket.error, e: if e[0] not in (errno.ECONNREFUSED, errno.ECONNRESET): raise s.close() if not up: break else: if up: raise else: raise SystemError("Couln't stop server") def install(project, destination): if not isinstance(destination, basestring): destination = os.path.join(destination.globs['sample_buildout'], 'eggs') dist = pkg_resources.working_set.find( pkg_resources.Requirement.parse(project)) if dist.location.endswith('.egg'): destination = os.path.join(destination, os.path.basename(dist.location), ) if os.path.isdir(dist.location): shutil.copytree(dist.location, destination) else: shutil.copyfile(dist.location, destination) else: # copy link open(os.path.join(destination, project+'.egg-link'), 'w' ).write(dist.location) def install_develop(project, destination): if not isinstance(destination, basestring): destination = os.path.join(destination.globs['sample_buildout'], 'develop-eggs') dist = pkg_resources.working_set.find( pkg_resources.Requirement.parse(project)) open(os.path.join(destination, project+'.egg-link'), 'w' ).write(dist.location) def _normalize_path(match): path = match.group(1) if os.path.sep == '\\': path = path.replace('\\\\', '/') if path.startswith('\\'): path = path[1:] return '/' + path.replace(os.path.sep, '/') if sys.platform == 'win32': sep = r'[\\/]' # Windows uses both sometimes. 
else:
    sep = re.escape(os.path.sep)

normalize_path = (
    re.compile(
        r'''[^'" \t\n\r!]+%(sep)s_[Tt][Ee][Ss][Tt]_%(sep)s([^"' \t\n\r]+)'''
        % dict(sep=sep)),
    _normalize_path,
    )

normalize_endings = re.compile('\r\n'), '\n'

normalize_script = (
    re.compile('(\n?)- ([a-zA-Z_.-]+)-script.py\n- \\2.exe\n'),
    '\\1- \\2\n')

normalize_egg_py = (
    re.compile('-py\d[.]\d(-\S+)?.egg'),
    '-pyN.N.egg',
    )
zc.buildout-1.7.1/src/zc/buildout/testing.txt0000644000076500007650000001475712111414155020635 0ustar jimjim00000000000000Testing Support
===============

The zc.buildout.testing module provides an API that can be used when
writing recipe tests. This API is documented below. Many examples of
using this API can be found in the zc.buildout, zc.recipe.egg, and
zc.recipe.testrunner tests.

zc.buildout.testing.buildoutSetUp(test)
---------------------------------------

The buildoutSetUp function can be used as a doctest setup function.
It creates a sample buildout that can be used by tests, changing the
current working directory to the sample_buildout. It also adds a
number of names to the test namespace:

``sample_buildout``
    This is the name of a buildout with a basic configuration.

``buildout``
    This is the path of the buildout script in the sample buildout.

``ls(*path)``
    List the contents of a directory. The directory path is provided
    as one or more strings, to be joined with os.path.join.

``cat(*path)``
    Display the contents of a file. The file path is provided as one
    or more strings, to be joined with os.path.join. On Windows, if
    the file doesn't exist, the function will try adding a
    '-script.py' suffix. This helps to work around a difference in
    script generation on Windows.

``mkdir(*path)``
    Create a directory. The directory path is provided as one or more
    strings, to be joined with os.path.join.

``rmdir(*path)``
    Remove a directory. The directory path is provided as one or more
    strings, to be joined with os.path.join.

``remove(*path)``
    Remove a directory or file. The path is provided as one or more
    strings, to be joined with os.path.join.

``tmpdir(name)``
    Create a temporary directory with the given name. The directory
    will be automatically removed at the end of the test. The path of
    the created directory is returned.

    Further, if the normalize_path normalizing substitution (see
    below) is used, then any paths starting with this path will be
    normalized to::

      /name/restofpath

    No two temporary directories can be created with the same name.
    A directory created with tmpdir can be removed with rmdir and
    recreated.

    Note that the sample_buildout directory is created by calling this
    function.

``write(*path_and_contents)``
    Create a file. The file path is provided as one or more strings,
    to be joined with os.path.join. The last argument is the file
    contents.

``system(command, input='')``
    Execute a system command with the given input passed to the
    command's standard input. The output (error and regular output)
    from the command is returned.

``get(url)``
    Get a web page.

``cd(*path)``
    Change to the given directory. The directory path is provided as
    one or more strings, to be joined with os.path.join.

    The directory will be reset at the end of the test.

``uncd()``
    Change to the directory that was current prior to the previous
    call to ``cd``. You can call ``cd`` multiple times and then
    ``uncd`` the same number of times to return to the same location.

``join(*path)``
    A convenient reference to os.path.join.

``register_teardown(func)``
    Register a tear-down function. The function will be called with
    no arguments at the end of the test.
``start_server(path)``
    Start a web server on the given path. The server will be shut
    down at the end of the test. The server URL is returned.

    You can cause the server to start and stop logging its output
    using:

    >>> get(server_url+'enable_server_logging')

    and:

    >>> get(server_url+'disable_server_logging')

    This can be useful to see how buildout is interacting with a
    server.

``sdist(setup, dest)``
    Create a source distribution by running the given setup file and
    placing the result in the given destination directory. If the
    setup argument is a directory, then the setup.py file in that
    directory is used.

``bdist_egg(setup, executable, dest)``
    Create an egg by running the given setup file with the given
    Python executable and placing the result in the given destination
    directory. If the setup argument is a directory, then the
    setup.py file in that directory is used.

``find_python(version)``
    Find a Python executable for the given version, where version is
    a string like "2.4".

    This function uses the following strategy to find a Python of the
    given version:

    - Look for an environment variable of the form PYTHON%(version)s.

    - On Windows, look for \Python%(version)s\python

    - On Unix, try running python%(version)s or just python to get
      the executable

``zc.buildout.testing.buildoutTearDown(test)``
----------------------------------------------

Tear down everything set up by zc.buildout.testing.buildoutSetUp. Any
functions passed to register_teardown are called as well.

``install(project, destination)``
---------------------------------

Install eggs for a given project into a destination. If the
destination is a test object, then the eggs directory of the sample
buildout (sample_buildout) defined by the test will be used. Tests
will use this to install the distributions for the packages being
tested (and their dependencies) into a sample buildout. The egg to be
used should already be loaded, by importing one of the modules
provided, before calling this function.

``install_develop(project, destination)``
-----------------------------------------

Like install, but a develop egg is installed even if the current egg
is not a develop egg.

``Output normalization``
------------------------

Recipe tests often generate output that is dependent on temporary file
locations, operating system conventions or Python versions. To deal
with these dependencies, we often use
zope.testing.renormalizing.RENormalizing to normalize test output.
zope.testing.renormalizing.RENormalizing takes pairs of regular
expressions and substitutions. The zc.buildout.testing module provides
a few helpful variables that define regular-expression/substitution
pairs that you can pass to zope.testing.renormalizing.RENormalizing.

``normalize_path``
    Converts test paths, based on directories created with tmpdir(),
    to simple paths.

``normalize_script``
    On Unix-like systems, scripts are implemented in single files
    without suffixes. On Windows, scripts are implemented with two
    files, a -script.py file and a .exe file. This normalization
    converts directory listings of Windows scripts to the form
    generated on Unix-like systems.

``normalize_egg_py``
    Normalize Python version and platform indicators, if specified, in
    egg names.
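For example, a recipe package's test module might wire these pieces
together roughly as follows. This is only a sketch: the project name
``my.recipe`` and the doctest file ``README.txt`` are placeholders for
whatever the recipe package being tested actually uses::

    import doctest
    import unittest

    import zc.buildout.testing
    from zope.testing import renormalizing

    # Substitution pairs that make doctest output stable across
    # temporary directories, platforms and Python versions.
    checker = renormalizing.RENormalizing([
        zc.buildout.testing.normalize_path,
        zc.buildout.testing.normalize_endings,
        zc.buildout.testing.normalize_script,
        zc.buildout.testing.normalize_egg_py,
        ])

    def setUp(test):
        zc.buildout.testing.buildoutSetUp(test)
        # Make the recipe being tested available in the sample buildout
        # as a develop egg ("my.recipe" is a placeholder project name).
        zc.buildout.testing.install_develop('my.recipe', test)

    def test_suite():
        return unittest.TestSuite([
            doctest.DocFileSuite(
                'README.txt',
                setUp=setUp,
                tearDown=zc.buildout.testing.buildoutTearDown,
                checker=checker,
                optionflags=doctest.ELLIPSIS,
                ),
            ])

With a setup along these lines, the doctests in the recipe's doctest
file can use the names documented above (``write``, ``system``,
``buildout`` and so on) and have their output compared in normalized
form.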
zc.buildout-1.7.1/src/zc/buildout/testing_bugfix.txt0000644000076500007650000000256212111414155022170 0ustar jimjim00000000000000Bug fixes in zc.buildout.testing ================================ Logging handler which did not get deleted ----------------------------------------- The buildout testing set up runs a buildout which adds a ``logging.StreamHandler`` to the root logger. But tear down did not remove it. This can disturb other tests of packages reusing zc.buildout.testing. The handers before calling set up are: >>> import logging >>> oldn = len(logging.getLogger().handlers) >>> logging.getLogger().handlers # doctest: +ELLIPSIS [] After calling it, a ``logging.StreamHandler`` was added: >>> import zc.buildout.testing >>> import doctest >>> test = doctest.DocTestParser().get_doctest( ... '>>> x', {}, 'foo', 'foo.py', 0) >>> zc.buildout.testing.buildoutSetUp(test) >>> len(logging.getLogger().handlers) == oldn + 1 True >>> logging.getLogger().handlers # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE [, ] But tear down removes the new logging handler: >>> zc.buildout.testing.buildoutTearDown(test) >>> len(logging.getLogger().handlers) == oldn True >>> logging.getLogger().handlers # doctest: +ELLIPSIS [] zc.buildout-1.7.1/src/zc/buildout/testrecipes.py0000644000076500007650000000054512111414155021311 0ustar jimjim00000000000000 class Debug: def __init__(self, buildout, name, options): self.buildout = buildout self.name = name self.options = options def install(self): items = self.options.items() items.sort() for option, value in items: print " %s=%r" % (option, value) return () update = install zc.buildout-1.7.1/src/zc/buildout/tests.py0000644000076500007650000040557712111414155020137 0ustar jimjim00000000000000############################################################################## # # Copyright (c) 2004-2009 Zope Foundation and Contributors. # All Rights Reserved. # # This software is subject to the provisions of the Zope Public License, # Version 2.1 (ZPL). A copy of the ZPL should accompany this distribution. # THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED # WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED # WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS # FOR A PARTICULAR PURPOSE. # ############################################################################## import doctest from zope.testing import renormalizing import os import pkg_resources import re import shutil import sys import tempfile import unittest import zc.buildout.easy_install import zc.buildout.testing import zc.buildout.testselectingpython import zipfile os_path_sep = os.path.sep if os_path_sep == '\\': os_path_sep *= 2 def develop_w_non_setuptools_setup_scripts(): """ We should be able to deal with setup scripts that aren't setuptools based. >>> mkdir('foo') >>> write('foo', 'setup.py', ... ''' ... from distutils.core import setup ... setup(name="foo") ... ''') >>> write('buildout.cfg', ... ''' ... [buildout] ... develop = foo ... parts = ... ''') >>> print system(join('bin', 'buildout')), Develop: '/sample-buildout/foo' >>> ls('develop-eggs') - foo.egg-link - z3c.recipe.scripts.egg-link - zc.recipe.egg.egg-link """ def develop_verbose(): """ We should be able to deal with setup scripts that aren't setuptools based. >>> mkdir('foo') >>> write('foo', 'setup.py', ... ''' ... from setuptools import setup ... setup(name="foo") ... ''') >>> write('buildout.cfg', ... ''' ... [buildout] ... develop = foo ... parts = ... 
''') >>> print system(join('bin', 'buildout')+' -vv'), # doctest: +ELLIPSIS Installing... Develop: '/sample-buildout/foo' ... Installed /sample-buildout/foo ... >>> ls('develop-eggs') - foo.egg-link - z3c.recipe.scripts.egg-link - zc.recipe.egg.egg-link >>> print system(join('bin', 'buildout')+' -vvv'), # doctest: +ELLIPSIS Installing... Develop: '/sample-buildout/foo' in: '/sample-buildout/foo' ... -q develop -mxN -d /sample-buildout/develop-eggs/... """ def buildout_error_handling(): r"""Buildout error handling Asking for a section that doesn't exist, yields a missing section error: >>> import os >>> os.chdir(sample_buildout) >>> import zc.buildout.buildout >>> buildout = zc.buildout.buildout.Buildout('buildout.cfg', []) >>> buildout['eek'] Traceback (most recent call last): ... MissingSection: The referenced section, 'eek', was not defined. Asking for an option that doesn't exist, a MissingOption error is raised: >>> buildout['buildout']['eek'] Traceback (most recent call last): ... MissingOption: Missing option: buildout:eek It is an error to create a variable-reference cycle: >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... parts = ... x = ${buildout:y} ... y = ${buildout:z} ... z = ${buildout:x} ... ''') >>> print system(os.path.join(sample_buildout, 'bin', 'buildout')), ... # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS While: Initializing. Getting section buildout. Initializing section buildout. Getting option buildout:y. Getting option buildout:z. Getting option buildout:x. Getting option buildout:y. Error: Circular reference in substitutions. It is an error to use funny characters in variable refereces: >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... develop = recipes ... parts = data_dir debug ... x = ${bui$ldout:y} ... ''') >>> print system(os.path.join(sample_buildout, 'bin', 'buildout')), While: Initializing. Getting section buildout. Initializing section buildout. Getting option buildout:x. Error: The section name in substitution, ${bui$ldout:y}, has invalid characters. >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... develop = recipes ... parts = data_dir debug ... x = ${buildout:y{z} ... ''') >>> print system(os.path.join(sample_buildout, 'bin', 'buildout')), While: Initializing. Getting section buildout. Initializing section buildout. Getting option buildout:x. Error: The option name in substitution, ${buildout:y{z}, has invalid characters. and too have too many or too few colons: >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... develop = recipes ... parts = data_dir debug ... x = ${parts} ... ''') >>> print system(os.path.join(sample_buildout, 'bin', 'buildout')), While: Initializing. Getting section buildout. Initializing section buildout. Getting option buildout:x. Error: The substitution, ${parts}, doesn't contain a colon. >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... develop = recipes ... parts = data_dir debug ... x = ${buildout:y:z} ... ''') >>> print system(os.path.join(sample_buildout, 'bin', 'buildout')), While: Initializing. Getting section buildout. Initializing section buildout. Getting option buildout:x. Error: The substitution, ${buildout:y:z}, has too many colons. Al parts have to have a section: >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... parts = x ... ''') >>> print system(os.path.join(sample_buildout, 'bin', 'buildout')), While: Installing. Getting section x. Error: The referenced section, 'x', was not defined. 
and all parts have to have a specified recipe: >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... parts = x ... ... [x] ... foo = 1 ... ''') >>> print system(os.path.join(sample_buildout, 'bin', 'buildout')), While: Installing. Error: Missing option: x:recipe """ make_dist_that_requires_setup_py_template = """ from setuptools import setup setup(name=%r, version=%r, install_requires=%r, ) """ def make_dist_that_requires(dest, name, requires=[], version=1, egg=''): os.mkdir(os.path.join(dest, name)) open(os.path.join(dest, name, 'setup.py'), 'w').write( make_dist_that_requires_setup_py_template % (name, version, requires) ) def show_who_requires_when_there_is_a_conflict(): """ It's a pain when we require eggs that have requirements that are incompatible. We want the error we get to tell us what is missing. Let's make a few develop distros, some of which have incompatible requirements. >>> make_dist_that_requires(sample_buildout, 'sampley', ... ['demoneeded ==1.0']) >>> make_dist_that_requires(sample_buildout, 'samplez', ... ['demoneeded ==1.1']) Now, let's create a buildout that requires y and z: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... develop = sampley samplez ... find-links = %(link_server)s ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = sampley ... samplez ... ''' % globals()) >>> print system(buildout), Develop: '/sample-buildout/sampley' Develop: '/sample-buildout/samplez' Installing eggs. Getting distribution for 'demoneeded==1.1'. Got demoneeded 1.1. While: Installing eggs. Error: There is a version conflict. We already have: demoneeded 1.1 but sampley 1 requires 'demoneeded==1.0'. Here, we see that sampley required an older version of demoneeded. What if we hadn't required sampley ourselves: >>> make_dist_that_requires(sample_buildout, 'samplea', ['sampleb']) >>> make_dist_that_requires(sample_buildout, 'sampleb', ... ['sampley', 'samplea']) >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... develop = sampley samplez samplea sampleb ... find-links = %(link_server)s ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = samplea ... samplez ... ''' % globals()) If we use the verbose switch, we can see where requirements are coming from: >>> print system(buildout+' -v'), # doctest: +ELLIPSIS Installing 'zc.buildout >=1.99, <2dev', 'setuptools'. We have a develop egg: zc.buildout 1.0.0 We have the best distribution that satisfies 'setuptools'. Picked: setuptools = 0.6 Develop: '/sample-buildout/sampley' Develop: '/sample-buildout/samplez' Develop: '/sample-buildout/samplea' Develop: '/sample-buildout/sampleb' ...Installing eggs. Installing 'samplea', 'samplez'. We have a develop egg: samplea 1 We have a develop egg: samplez 1 Getting required 'demoneeded==1.1' required by samplez 1. We have the distribution that satisfies 'demoneeded==1.1'. Getting required 'sampleb' required by samplea 1. We have a develop egg: sampleb 1 Getting required 'sampley' required by sampleb 1. We have a develop egg: sampley 1 While: Installing eggs. Error: There is a version conflict. We already have: demoneeded 1.1 but sampley 1 requires 'demoneeded==1.0'. """ def show_who_requires_missing_distributions(): """ When working with a lot of eggs, which require eggs recursively, it can be hard to tell why we're requiring things we can't find. Fortunately, buildout will tell us who's asking for something that we can't find. 
>>> make_dist_that_requires(sample_buildout, 'sampley', ['demoneeded']) >>> make_dist_that_requires(sample_buildout, 'samplea', ['sampleb']) >>> make_dist_that_requires(sample_buildout, 'sampleb', ... ['sampley', 'samplea']) >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... develop = sampley samplea sampleb ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = samplea ... ''') >>> print system(buildout), Develop: '/sample-buildout/sampley' Develop: '/sample-buildout/samplea' Develop: '/sample-buildout/sampleb' Installing eggs. Couldn't find index page for 'demoneeded' (maybe misspelled?) Getting distribution for 'demoneeded'. While: Installing eggs. Getting distribution for 'demoneeded'. Error: Couldn't find a distribution for 'demoneeded'. """ def show_eggs_from_site_packages(): """ Sometimes you want to know what eggs are coming from site-packages. This might be for a diagnostic, or so that you can get a starting value for the allowed-eggs-from-site-packages option. The -v flag will also include this information. Our "py_path" has the "demoneeded," "demo" packages available. We'll ask for "bigdemo," which will get both of them. Here's our set up. >>> py_path, site_packages_path = make_py() >>> create_sample_sys_install(site_packages_path) >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... prefer-final = true ... find-links = %(link_server)s ... ... [primed_python] ... executable = %(py_path)s ... ... [eggs] ... recipe = zc.recipe.egg:eggs ... python = primed_python ... eggs = bigdemo ... ''' % globals()) Now here is the output. The lines that begin with "Egg from site-packages:" indicate the eggs from site-packages that have been selected. You'll see we have two: demo 0.3 and demoneeded 1.1. >>> print system(buildout+" -v"), Installing 'zc.buildout >=1.9.0, <2dev', 'setuptools'. We have a develop egg: zc.buildout V We have the best distribution that satisfies 'setuptools'. Picked: setuptools = V Installing 'zc.recipe.egg'. We have a develop egg: zc.recipe.egg V Installing eggs. Installing 'bigdemo'. We have no distributions for bigdemo that satisfies 'bigdemo'. Getting distribution for 'bigdemo'. Got bigdemo 0.1. Picked: bigdemo = 0.1 Getting required 'demo' required by bigdemo 0.1. We have the best distribution that satisfies 'demo'. Egg from site-packages: demo 0.3 Getting required 'demoneeded' required by demo 0.3. We have the best distribution that satisfies 'demoneeded'. Egg from site-packages: demoneeded 1.1 """ def test_comparing_saved_options_with_funny_characters(): """ If an option has newlines, extra/odd spaces or a %, we need to make sure the comparison with the saved value works correctly. >>> mkdir(sample_buildout, 'recipes') >>> write(sample_buildout, 'recipes', 'debug.py', ... ''' ... class Debug: ... def __init__(self, buildout, name, options): ... options['debug'] = \"\"\" ... ... ... path foo ... ... ... ... \"\"\" ... options['debug1'] = \"\"\" ... ... ... ... path foo ... ... ... ... \"\"\" ... options['debug2'] = ' x ' ... options['debug3'] = '42' ... options['format'] = '%3d' ... ... def install(self): ... open('t', 'w').write('t') ... return 't' ... ... update = install ... ''') >>> write(sample_buildout, 'recipes', 'setup.py', ... ''' ... from setuptools import setup ... setup( ... name = "recipes", ... entry_points = {'zc.buildout': ['default = debug:Debug']}, ... ) ... ''') >>> write(sample_buildout, 'recipes', 'README.txt', " ") >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... develop = recipes ... 
parts = debug ... ... [debug] ... recipe = recipes ... ''') >>> os.chdir(sample_buildout) >>> buildout = os.path.join(sample_buildout, 'bin', 'buildout') >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing debug. If we run the buildout again, we shoudn't get a message about uninstalling anything because the configuration hasn't changed. >>> print system(buildout), Develop: '/sample-buildout/recipes' Updating debug. """ def finding_eggs_as_local_directories(): r""" It is possible to set up find-links so that we could install from a local directory that may contained unzipped eggs. >>> src = tmpdir('src') >>> write(src, 'setup.py', ... ''' ... from setuptools import setup ... setup(name='demo', py_modules=[''], ... zip_safe=False, version='1.0', author='bob', url='bob', ... author_email='bob') ... ''') >>> write(src, 't.py', '#\n') >>> write(src, 'README.txt', '') >>> _ = system(join('bin', 'buildout')+' setup ' + src + ' bdist_egg') Install it so it gets unzipped: >>> d1 = tmpdir('d1') >>> ws = zc.buildout.easy_install.install( ... ['demo'], d1, links=[join(src, 'dist')], ... ) >>> ls(d1) d demo-1.0-py2.4.egg Then try to install it again: >>> d2 = tmpdir('d2') >>> ws = zc.buildout.easy_install.install( ... ['demo'], d2, links=[d1], ... ) >>> ls(d2) d demo-1.0-py2.4.egg """ def make_sure__get_version_works_with_2_digit_python_versions(): """ This is a test of an internal function used by higher-level machinery. We'll start by creating a faux 'python' that executable that prints a 2-digit version. This is a bit of a pain to do portably. :( >>> mkdir('demo') >>> write('demo', 'setup.py', ... ''' ... from setuptools import setup ... setup(name='demo', ... entry_points = {'console_scripts': ['demo = demo:main']}, ... ) ... ''') >>> write('demo', 'demo.py', ... ''' ... def main(): ... print 'Python 2.5' ... ''') >>> write('buildout.cfg', ... ''' ... [buildout] ... develop = demo ... parts = ... ''') >>> print system(join('bin', 'buildout')), Develop: '/sample-buildout/demo' >>> import zc.buildout.easy_install >>> ws = zc.buildout.easy_install.working_set( ... ['demo'], sys.executable, ['develop-eggs']) >>> bool(zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, 'bin')) True >>> print system(join('bin', 'demo')), Python 2.5 Now, finally, let's test _get_version: >>> zc.buildout.easy_install._get_version(join('bin', 'demo')) '2.5' """ def create_sections_on_command_line(): """ >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = ... x = ${foo:bar} ... ''') >>> print system(buildout + ' foo:bar=1 -vv'), # doctest: +ELLIPSIS Installing 'zc.buildout >=1.99, <2dev', 'setuptools'. ... [foo] bar = 1 ... """ def test_help(): """ >>> print system(os.path.join(sample_buildout, 'bin', 'buildout')+' -h'), ... # doctest: +ELLIPSIS Usage: buildout [options] [assignments] [command [command arguments]] Options: -h, --help ... >>> print system(os.path.join(sample_buildout, 'bin', 'buildout') ... +' --help'), ... # doctest: +ELLIPSIS Usage: buildout [options] [assignments] [command [command arguments]] Options: -h, --help ... """ def test_bootstrap_with_extension(): """ We had a problem running a bootstrap with an extension. Let's make sure it is fixed. Basically, we don't load extensions when bootstrapping. >>> d = tmpdir('sample-bootstrap') >>> write(d, 'buildout.cfg', ... ''' ... [buildout] ... extensions = some_awsome_extension ... parts = ... ''') >>> os.chdir(d) >>> print system(os.path.join(sample_buildout, 'bin', 'buildout') ... 
+ ' bootstrap'), Creating directory '/sample-bootstrap/bin'. Creating directory '/sample-bootstrap/parts'. Creating directory '/sample-bootstrap/eggs'. Creating directory '/sample-bootstrap/develop-eggs'. Generated script '/sample-bootstrap/bin/buildout'. """ def bug_92891_bootstrap_crashes_with_egg_recipe_in_buildout_section(): """ >>> d = tmpdir('sample-bootstrap') >>> write(d, 'buildout.cfg', ... ''' ... [buildout] ... parts = buildout ... eggs-directory = eggs ... ... [buildout] ... recipe = zc.recipe.egg ... eggs = zc.buildout ... scripts = buildout=buildout ... ''') >>> os.chdir(d) >>> print system(os.path.join(sample_buildout, 'bin', 'buildout') ... + ' bootstrap'), Creating directory '/sample-bootstrap/bin'. Creating directory '/sample-bootstrap/parts'. Creating directory '/sample-bootstrap/eggs'. Creating directory '/sample-bootstrap/develop-eggs'. Generated script '/sample-bootstrap/bin/buildout'. >>> print system(os.path.join('bin', 'buildout')), Unused options for buildout: 'scripts' 'eggs'. """ def removing_eggs_from_develop_section_causes_egg_link_to_be_removed(): ''' >>> cd(sample_buildout) Create a develop egg: >>> mkdir('foo') >>> write('foo', 'setup.py', ... """ ... from setuptools import setup ... setup(name='foox') ... """) >>> write('buildout.cfg', ... """ ... [buildout] ... develop = foo ... parts = ... """) >>> print system(join('bin', 'buildout')), Develop: '/sample-buildout/foo' >>> ls('develop-eggs') - foox.egg-link - z3c.recipe.scripts.egg-link - zc.recipe.egg.egg-link Create another: >>> mkdir('bar') >>> write('bar', 'setup.py', ... """ ... from setuptools import setup ... setup(name='fooy') ... """) >>> write('buildout.cfg', ... """ ... [buildout] ... develop = foo bar ... parts = ... """) >>> print system(join('bin', 'buildout')), Develop: '/sample-buildout/foo' Develop: '/sample-buildout/bar' >>> ls('develop-eggs') - foox.egg-link - fooy.egg-link - z3c.recipe.scripts.egg-link - zc.recipe.egg.egg-link Remove one: >>> write('buildout.cfg', ... """ ... [buildout] ... develop = bar ... parts = ... """) >>> print system(join('bin', 'buildout')), Develop: '/sample-buildout/bar' It is gone >>> ls('develop-eggs') - fooy.egg-link - z3c.recipe.scripts.egg-link - zc.recipe.egg.egg-link Remove the other: >>> write('buildout.cfg', ... """ ... [buildout] ... parts = ... """) >>> print system(join('bin', 'buildout')), All gone >>> ls('develop-eggs') - z3c.recipe.scripts.egg-link - zc.recipe.egg.egg-link ''' def add_setuptools_to_dependencies_when_namespace_packages(): ''' Often, a package depends on setuptools soley by virtue of using namespace packages. In this situation, package authors often forget to declare setuptools as a dependency. This is a mistake, but, unfortunately, a common one that we need to work around. If an egg uses namespace packages and does not include setuptools as a depenency, we will still include setuptools in the working set. If we see this for a devlop egg, we will also generate a warning. >>> mkdir('foo') >>> mkdir('foo', 'src') >>> mkdir('foo', 'src', 'stuff') >>> write('foo', 'src', 'stuff', '__init__.py', ... """__import__('pkg_resources').declare_namespace(__name__) ... """) >>> mkdir('foo', 'src', 'stuff', 'foox') >>> write('foo', 'src', 'stuff', 'foox', '__init__.py', '') >>> write('foo', 'setup.py', ... """ ... from setuptools import setup ... setup(name='foox', ... namespace_packages = ['stuff'], ... package_dir = {'': 'src'}, ... packages = ['stuff', 'stuff.foox'], ... ) ... 
""") >>> write('foo', 'README.txt', '') >>> write('buildout.cfg', ... """ ... [buildout] ... develop = foo ... parts = ... """) >>> print system(join('bin', 'buildout')), Develop: '/sample-buildout/foo' Now, if we generate a working set using the egg link, we will get a warning and we will get setuptools included in the working set. >>> import logging, zope.testing.loggingsupport >>> handler = zope.testing.loggingsupport.InstalledHandler( ... 'zc.buildout.easy_install', level=logging.WARNING) >>> logging.getLogger('zc.buildout.easy_install').propagate = False >>> [dist.project_name ... for dist in zc.buildout.easy_install.working_set( ... ['foox'], sys.executable, ... [join(sample_buildout, 'eggs'), ... join(sample_buildout, 'develop-eggs'), ... ])] ['foox', 'setuptools'] >>> print handler zc.buildout.easy_install WARNING Develop distribution: foox 0.0.0 uses namespace packages but the distribution does not require setuptools. >>> handler.clear() On the other hand, if we have a regular egg, rather than a develop egg: >>> os.remove(join('develop-eggs', 'foox.egg-link')) >>> _ = system(join('bin', 'buildout') + ' setup foo bdist_egg -d' ... + join(sample_buildout, 'eggs')) >>> ls('develop-eggs') - z3c.recipe.scripts.egg-link - zc.recipe.egg.egg-link >>> print 'START ->'; ls('eggs') # doctest: +ELLIPSIS START... - foox-0.0.0-py2.4.egg ... We do not get a warning, but we do get setuptools included in the working set: >>> [dist.project_name ... for dist in zc.buildout.easy_install.working_set( ... ['foox'], sys.executable, ... [join(sample_buildout, 'eggs'), ... join(sample_buildout, 'develop-eggs'), ... ])] ['foox', 'setuptools'] >>> print handler, We get the same behavior if the it is a depedency that uses a namespace package. >>> mkdir('bar') >>> write('bar', 'setup.py', ... """ ... from setuptools import setup ... setup(name='bar', install_requires = ['foox']) ... """) >>> write('bar', 'README.txt', '') >>> write('buildout.cfg', ... """ ... [buildout] ... develop = foo bar ... parts = ... """) >>> print system(join('bin', 'buildout')), Develop: '/sample-buildout/foo' Develop: '/sample-buildout/bar' >>> [dist.project_name ... for dist in zc.buildout.easy_install.working_set( ... ['bar'], sys.executable, ... [join(sample_buildout, 'eggs'), ... join(sample_buildout, 'develop-eggs'), ... ])] ['bar', 'foox', 'setuptools'] >>> print handler, zc.buildout.easy_install WARNING Develop distribution: foox 0.0.0 uses namespace packages but the distribution does not require setuptools. >>> logging.getLogger('zc.buildout.easy_install').propagate = True >>> handler.uninstall() ''' def develop_preserves_existing_setup_cfg(): """ See "Handling custom build options for extensions in develop eggs" in easy_install.txt. This will be very similar except that we'll have an existing setup.cfg: >>> write(extdemo, "setup.cfg", ... ''' ... # sampe cfg file ... ... [foo] ... bar = 1 ... ... [build_ext] ... define = X,Y ... ''') >>> mkdir('include') >>> write('include', 'extdemo.h', ... ''' ... #define EXTDEMO 42 ... ''') >>> dest = tmpdir('dest') >>> zc.buildout.easy_install.develop( ... extdemo, dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}) '/dest/extdemo.egg-link' >>> ls(dest) - extdemo.egg-link >>> cat(extdemo, "setup.cfg") # sampe cfg file [foo] bar = 1 [build_ext] define = X,Y """ def uninstall_recipes_used_for_removal(): """ Uninstall recipes need to be called when a part is removed too: >>> mkdir("recipes") >>> write("recipes", "setup.py", ... ''' ... from setuptools import setup ... 
setup(name='recipes', ... entry_points={ ... 'zc.buildout': ["demo=demo:Install"], ... 'zc.buildout.uninstall': ["demo=demo:uninstall"], ... }) ... ''') >>> write("recipes", "demo.py", ... ''' ... class Install: ... def __init__(*args): pass ... def install(self): ... print 'installing' ... return () ... def uninstall(name, options): print 'uninstalling' ... ''') >>> write('buildout.cfg', ''' ... [buildout] ... develop = recipes ... parts = demo ... [demo] ... recipe = recipes:demo ... ''') >>> print system(join('bin', 'buildout')), Develop: '/sample-buildout/recipes' Installing demo. installing >>> write('buildout.cfg', ''' ... [buildout] ... develop = recipes ... parts = demo ... [demo] ... recipe = recipes:demo ... x = 1 ... ''') >>> print system(join('bin', 'buildout')), Develop: '/sample-buildout/recipes' Uninstalling demo. Running uninstall recipe. uninstalling Installing demo. installing >>> write('buildout.cfg', ''' ... [buildout] ... develop = recipes ... parts = ... ''') >>> print system(join('bin', 'buildout')), Develop: '/sample-buildout/recipes' Uninstalling demo. Running uninstall recipe. uninstalling """ def extensions_installed_as_eggs_work_in_offline_mode(): ''' >>> mkdir('demo') >>> write('demo', 'demo.py', ... """ ... def ext(buildout): ... print 'ext', list(buildout) ... """) >>> write('demo', 'setup.py', ... """ ... from setuptools import setup ... ... setup( ... name = "demo", ... py_modules=['demo'], ... entry_points = {'zc.buildout.extension': ['ext = demo:ext']}, ... ) ... """) >>> bdist_egg(join(sample_buildout, "demo"), sys.executable, ... join(sample_buildout, "eggs")) >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... extensions = demo ... parts = ... offline = true ... """) >>> print system(join(sample_buildout, 'bin', 'buildout')), ext ['buildout'] ''' def changes_in_svn_or_CVS_dont_affect_sig(): """ If we have a develop recipe, it's signature shouldn't be affected to changes in .svn or CVS directories. >>> mkdir('recipe') >>> write('recipe', 'setup.py', ... ''' ... from setuptools import setup ... setup(name='recipe', ... entry_points={'zc.buildout': ['default=foo:Foo']}) ... ''') >>> write('recipe', 'foo.py', ... ''' ... class Foo: ... def __init__(*args): pass ... def install(*args): return () ... update = install ... ''') >>> write('buildout.cfg', ... ''' ... [buildout] ... develop = recipe ... parts = foo ... ... [foo] ... recipe = recipe ... ''') >>> print system(join(sample_buildout, 'bin', 'buildout')), Develop: '/sample-buildout/recipe' Installing foo. >>> mkdir('recipe', '.svn') >>> mkdir('recipe', 'CVS') >>> print system(join(sample_buildout, 'bin', 'buildout')), Develop: '/sample-buildout/recipe' Updating foo. >>> write('recipe', '.svn', 'x', '1') >>> write('recipe', 'CVS', 'x', '1') >>> print system(join(sample_buildout, 'bin', 'buildout')), Develop: '/sample-buildout/recipe' Updating foo. """ if hasattr(os, 'symlink'): def bug_250537_broken_symlink_doesnt_affect_sig(): """ If we have a develop recipe, it's signature shouldn't be affected by broken symlinks, and better yet, computing the hash should not break because of the missing target file. >>> mkdir('recipe') >>> write('recipe', 'setup.py', ... ''' ... from setuptools import setup ... setup(name='recipe', ... entry_points={'zc.buildout': ['default=foo:Foo']}) ... ''') >>> write('recipe', 'foo.py', ... ''' ... class Foo: ... def __init__(*args): pass ... def install(*args): return () ... update = install ... ''') >>> write('buildout.cfg', ... ''' ... [buildout] ... 
develop = recipe ... parts = foo ... ... [foo] ... recipe = recipe ... ''') >>> print system(join(sample_buildout, 'bin', 'buildout')), Develop: '/sample-buildout/recipe' Installing foo. >>> write('recipe', 'some-file', '1') >>> os.symlink(join('recipe', 'some-file'), ... join('recipe', 'another-file')) >>> ls('recipe') l another-file - foo.py - foo.pyc d recipe.egg-info - setup.py - some-file >>> remove('recipe', 'some-file') >>> print system(join(sample_buildout, 'bin', 'buildout')), Develop: '/sample-buildout/recipe' Updating foo. """ def o_option_sets_offline(): """ >>> print system(join(sample_buildout, 'bin', 'buildout')+' -vvo'), ... # doctest: +ELLIPSIS ... offline = true ... """ def recipe_upgrade(): """ The buildout will upgrade recipes in newest (and non-offline) mode. Let's create a recipe egg >>> mkdir('recipe') >>> write('recipe', 'recipe.py', ... ''' ... class Recipe: ... def __init__(*a): pass ... def install(self): ... print 'recipe v1' ... return () ... update = install ... ''') >>> write('recipe', 'setup.py', ... ''' ... from setuptools import setup ... setup(name='recipe', version='1', py_modules=['recipe'], ... entry_points={'zc.buildout': ['default = recipe:Recipe']}, ... ) ... ''') >>> write('recipe', 'README', '') >>> print system(buildout+' setup recipe bdist_egg'), # doctest: +ELLIPSIS Running setup script 'recipe/setup.py'. ... >>> rmdir('recipe', 'build') And update our buildout to use it. >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = foo ... find-links = %s ... ... [foo] ... recipe = recipe ... ''' % join('recipe', 'dist')) >>> print system(buildout), Getting distribution for 'recipe'. Got recipe 1. Installing foo. recipe v1 Now, if we update the recipe egg: >>> write('recipe', 'recipe.py', ... ''' ... class Recipe: ... def __init__(*a): pass ... def install(self): ... print 'recipe v2' ... return () ... update = install ... ''') >>> write('recipe', 'setup.py', ... ''' ... from setuptools import setup ... setup(name='recipe', version='2', py_modules=['recipe'], ... entry_points={'zc.buildout': ['default = recipe:Recipe']}, ... ) ... ''') >>> print system(buildout+' setup recipe bdist_egg'), # doctest: +ELLIPSIS Running setup script 'recipe/setup.py'. ... We won't get the update if we specify -N: >>> print system(buildout+' -N'), Updating foo. recipe v1 or if we use -o: >>> print system(buildout+' -o'), Updating foo. recipe v1 But we will if we use neither of these: >>> print system(buildout), Getting distribution for 'recipe'. Got recipe 2. Uninstalling foo. Installing foo. recipe v2 We can also select a particular recipe version: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = foo ... find-links = %s ... ... [foo] ... recipe = recipe ==1 ... ''' % join('recipe', 'dist')) >>> print system(buildout), Uninstalling foo. Installing foo. recipe v1 """ def update_adds_to_uninstall_list(): """ Paths returned by the update method are added to the list of paths to uninstall >>> mkdir('recipe') >>> write('recipe', 'setup.py', ... ''' ... from setuptools import setup ... setup(name='recipe', ... entry_points={'zc.buildout': ['default = recipe:Recipe']}, ... ) ... ''') >>> write('recipe', 'recipe.py', ... ''' ... import os ... class Recipe: ... def __init__(*_): pass ... def install(self): ... r = ('a', 'b', 'c') ... for p in r: os.mkdir(p) ... return r ... def update(self): ... r = ('c', 'd', 'e') ... for p in r: ... if not os.path.exists(p): ... os.mkdir(p) ... return r ... ''') >>> write('buildout.cfg', ... ''' ... [buildout] ... 
develop = recipe ... parts = foo ... ... [foo] ... recipe = recipe ... ''') >>> print system(buildout), Develop: '/sample-buildout/recipe' Installing foo. >>> print system(buildout), Develop: '/sample-buildout/recipe' Updating foo. >>> cat('.installed.cfg') # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE [buildout] ... [foo] __buildout_installed__ = a b c d e __buildout_signature__ = ... """ def log_when_there_are_not_local_distros(): """ >>> from zope.testing.loggingsupport import InstalledHandler >>> handler = InstalledHandler('zc.buildout.easy_install') >>> import logging >>> logger = logging.getLogger('zc.buildout.easy_install') >>> old_propogate = logger.propagate >>> logger.propagate = False >>> dest = tmpdir('sample-install') >>> import zc.buildout.easy_install >>> ws = zc.buildout.easy_install.install( ... ['demo==0.2'], dest, ... links=[link_server], index=link_server+'index/') >>> print handler # doctest: +ELLIPSIS zc.buildout.easy_install DEBUG Installing 'demo==0.2'. zc.buildout.easy_install DEBUG We have no distributions for demo that satisfies 'demo==0.2'. ... >>> handler.uninstall() >>> logger.propagate = old_propogate """ def internal_errors(): """Internal errors are clearly marked and don't generate tracebacks: >>> mkdir(sample_buildout, 'recipes') >>> write(sample_buildout, 'recipes', 'mkdir.py', ... ''' ... class Mkdir: ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... options['path'] = os.path.join( ... buildout['buildout']['directory'], ... options['path'], ... ) ... ''') >>> write(sample_buildout, 'recipes', 'setup.py', ... ''' ... from setuptools import setup ... setup(name = "recipes", ... entry_points = {'zc.buildout': ['mkdir = mkdir:Mkdir']}, ... ) ... ''') >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... ''') >>> print system(buildout), # doctest: +ELLIPSIS Develop: '/sample-buildout/recipes' While: Installing. Getting section data-dir. Initializing part data-dir. An internal error occurred due to a bug in either zc.buildout or in a recipe being used: Traceback (most recent call last): ... NameError: global name 'os' is not defined """ def whine_about_unused_options(): ''' >>> write('foo.py', ... """ ... class Foo: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... options['x'] ... ... def install(self): ... self.options['y'] ... return () ... """) >>> write('setup.py', ... """ ... from setuptools import setup ... setup(name = "foo", ... entry_points = {'zc.buildout': ['default = foo:Foo']}, ... ) ... """) >>> write('buildout.cfg', ... """ ... [buildout] ... develop = . ... parts = foo ... a = 1 ... ... [foo] ... recipe = foo ... x = 1 ... y = 1 ... z = 1 ... """) >>> print system(buildout), Develop: '/sample-buildout/.' Unused options for buildout: 'a'. Installing foo. Unused options for foo: 'z'. ''' def abnormal_exit(): """ People sometimes hit control-c while running a builout. We need to make sure that the installed database Isn't corrupted. To test this, we'll create some evil recipes that exit uncleanly: >>> mkdir('recipes') >>> write('recipes', 'recipes.py', ... ''' ... import os ... ... class Clean: ... def __init__(*_): pass ... def install(_): return () ... def update(_): pass ... ... class EvilInstall(Clean): ... def install(_): os._exit(1) ... ... class EvilUpdate(Clean): ... def update(_): os._exit(1) ... ''') >>> write('recipes', 'setup.py', ... 
''' ... import setuptools ... setuptools.setup(name='recipes', ... entry_points = { ... 'zc.buildout': [ ... 'clean = recipes:Clean', ... 'evil_install = recipes:EvilInstall', ... 'evil_update = recipes:EvilUpdate', ... 'evil_uninstall = recipes:Clean', ... ], ... }, ... ) ... ''') Now let's look at 3 cases: 1. We exit during installation after installing some other parts: >>> write('buildout.cfg', ... ''' ... [buildout] ... develop = recipes ... parts = p1 p2 p3 p4 ... ... [p1] ... recipe = recipes:clean ... ... [p2] ... recipe = recipes:clean ... ... [p3] ... recipe = recipes:evil_install ... ... [p4] ... recipe = recipes:clean ... ''') >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing p1. Installing p2. Installing p3. >>> print system(buildout), Develop: '/sample-buildout/recipes' Updating p1. Updating p2. Installing p3. >>> print system(buildout+' buildout:parts='), Develop: '/sample-buildout/recipes' Uninstalling p2. Uninstalling p1. 2. We exit while updating: >>> write('buildout.cfg', ... ''' ... [buildout] ... develop = recipes ... parts = p1 p2 p3 p4 ... ... [p1] ... recipe = recipes:clean ... ... [p2] ... recipe = recipes:clean ... ... [p3] ... recipe = recipes:evil_update ... ... [p4] ... recipe = recipes:clean ... ''') >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing p1. Installing p2. Installing p3. Installing p4. >>> print system(buildout), Develop: '/sample-buildout/recipes' Updating p1. Updating p2. Updating p3. >>> print system(buildout+' buildout:parts='), Develop: '/sample-buildout/recipes' Uninstalling p2. Uninstalling p1. Uninstalling p4. Uninstalling p3. 3. We exit while installing or updating after uninstalling: >>> write('buildout.cfg', ... ''' ... [buildout] ... develop = recipes ... parts = p1 p2 p3 p4 ... ... [p1] ... recipe = recipes:evil_update ... ... [p2] ... recipe = recipes:clean ... ... [p3] ... recipe = recipes:clean ... ... [p4] ... recipe = recipes:clean ... ''') >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing p1. Installing p2. Installing p3. Installing p4. >>> write('buildout.cfg', ... ''' ... [buildout] ... develop = recipes ... parts = p1 p2 p3 p4 ... ... [p1] ... recipe = recipes:evil_update ... ... [p2] ... recipe = recipes:clean ... ... [p3] ... recipe = recipes:clean ... ... [p4] ... recipe = recipes:clean ... x = 1 ... ''') >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling p4. Updating p1. >>> write('buildout.cfg', ... ''' ... [buildout] ... develop = recipes ... parts = p1 p2 p3 p4 ... ... [p1] ... recipe = recipes:clean ... ... [p2] ... recipe = recipes:clean ... ... [p3] ... recipe = recipes:clean ... ... [p4] ... recipe = recipes:clean ... ''') >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling p1. Installing p1. Updating p2. Updating p3. Installing p4. """ def install_source_dist_with_bad_py(): """ >>> mkdir('badegg') >>> mkdir('badegg', 'badegg') >>> write('badegg', 'badegg', '__init__.py', '#\\n') >>> mkdir('badegg', 'badegg', 'scripts') >>> write('badegg', 'badegg', 'scripts', '__init__.py', '#\\n') >>> write('badegg', 'badegg', 'scripts', 'one.py', ... ''' ... return 1 ... ''') >>> write('badegg', 'setup.py', ... ''' ... from setuptools import setup, find_packages ... setup( ... name='badegg', ... version='1', ... packages = find_packages('.'), ... zip_safe=False) ... ''') >>> print system(buildout+' setup badegg sdist'), # doctest: +ELLIPSIS Running setup script 'badegg/setup.py'. ... 
>>> dist = join('badegg', 'dist') >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs bo ... find-links = %(dist)s ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = badegg ... ... [bo] ... recipe = zc.recipe.egg ... eggs = zc.buildout ... scripts = buildout=bo ... ''' % globals()) >>> print system(buildout);print 'X' # doctest: +ELLIPSIS Installing eggs. Getting distribution for 'badegg'. Got badegg 1. Installing bo. ... SyntaxError: ...'return' outside function... ... SyntaxError: ...'return' outside function... ... >>> ls('eggs') # doctest: +ELLIPSIS d badegg-1-py2.4.egg ... >>> ls('bin') - bo - buildout """ def version_requirements_in_build_honored(): ''' >>> update_extdemo() >>> dest = tmpdir('sample-install') >>> mkdir('include') >>> write('include', 'extdemo.h', ... """ ... #define EXTDEMO 42 ... """) >>> zc.buildout.easy_install.build( ... 'extdemo ==1.4', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/', ... newest=False) ['/sample-install/extdemo-1.4-py2.4-linux-i686.egg'] ''' def bug_105081_Specific_egg_versions_are_ignored_when_newer_eggs_are_around(): """ Buildout might ignore a specific egg requirement for a recipe: - Have a newer version of an egg in your eggs directory - Use 'recipe==olderversion' in your buildout.cfg to request an older version Buildout will go and fetch the older version, but it will *use* the newer version when installing a part with this recipe. >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = x ... find-links = %(sample_eggs)s ... ... [x] ... recipe = zc.recipe.egg ... eggs = demo ... ''' % globals()) >>> print system(buildout), Installing x. Getting distribution for 'demo'. Got demo 0.4c1. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. Generated script '/sample-buildout/bin/demo'. >>> print system(join('bin', 'demo')), 4 2 >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = x ... find-links = %(sample_eggs)s ... ... [x] ... recipe = zc.recipe.egg ... eggs = demo ==0.1 ... ''' % globals()) >>> print system(buildout), Uninstalling x. Installing x. Getting distribution for 'demo==0.1'. Got demo 0.1. Generated script '/sample-buildout/bin/demo'. >>> print system(join('bin', 'demo')), 1 2 """ def versions_section_ignored_for_dependency_in_favor_of_site_packages(): r""" This is a test for a bugfix. The error showed itself when at least two dependencies were in a shared location like site-packages, and the first one met the "versions" setting. The first dependency would be added, but subsequent dependencies from the same location (e.g., site-packages) would use the version of the package found in the shared location, ignoring the version setting. We begin with a Python that has demoneeded version 1.1 installed and a demo version 0.3, all in a site-packages-like shared directory. We need to create this. ``eggrecipedemo.main()`` shows the number after the dot (that is, ``X`` in ``1.X``), for the demo package and the demoneeded package, so this demonstrates that our Python does in fact have demo version 0.3 and demoneeded version 1.1. >>> py_path = make_py_with_system_install(make_py, sample_eggs) >>> print call_py( ... py_path, ... "import tellmy.version; print tellmy.version.__version__"), 1.1 Now here's a setup that would expose the bug, using the zc.buildout.easy_install API. >>> example_dest = tmpdir('example_dest') >>> workingset = zc.buildout.easy_install.install( ... ['tellmy.version'], example_dest, links=[sample_eggs], ... 
executable=py_path, ... index=None, ... versions={'tellmy.version': '1.0'}) >>> for dist in workingset: ... res = str(dist) ... if res.startswith('tellmy.version'): ... print res ... break tellmy.version 1.0 Before the bugfix, the desired tellmy.version distribution would have been blocked the one in site-packages. """ def handle_namespace_package_in_both_site_packages_and_buildout_eggs(): r""" If you have the same namespace package in both site-packages and in buildout, we need to be very careful that faux-Python-executables and scripts generated by easy_install.sitepackage_safe_scripts correctly combine the two. We show this with the local recipe that uses the function, z3c.recipe.scripts. To demonstrate this, we will create three packages: tellmy.version 1.0, tellmy.version 1.1, and tellmy.fortune 1.0. tellmy.version 1.1 is installed. >>> py_path = make_py_with_system_install(make_py, sample_eggs) >>> print call_py( ... py_path, ... "import tellmy.version; print tellmy.version.__version__") 1.1 Now we will create a buildout that creates a script and a faux-Python script. We want to see that both can successfully import the specified versions of tellmy.version and tellmy.fortune. >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... find-links = %(link_server)s ... ... [primed_python] ... executable = %(py_path)s ... ... [eggs] ... recipe = z3c.recipe.scripts ... python = primed_python ... interpreter = py ... include-site-packages = true ... eggs = tellmy.version == 1.0 ... tellmy.fortune == 1.0 ... demo ... script-initialization = ... import tellmy.version ... print tellmy.version.__version__ ... import tellmy.fortune ... print tellmy.fortune.__version__ ... ''' % globals()) >>> print system(buildout) Installing eggs. Getting distribution for 'tellmy.version==1.0'. Got tellmy.version 1.0. Getting distribution for 'tellmy.fortune==1.0'. Got tellmy.fortune 1.0. Getting distribution for 'demo'. Got demo 0.4c1. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. Generated script '/sample-buildout/bin/demo'. Generated interpreter '/sample-buildout/bin/py'. Finally, we are ready to see if it worked. Prior to the bug fix that this tests, the results of both calls below was the following:: 1.1 Traceback (most recent call last): ... ImportError: No module named fortune In other words, we got the site-packages version of tellmy.version, and we could not import tellmy.fortune at all. The following are the correct results for the interpreter and for the script. >>> print call_py( ... join('bin', 'py'), ... "import tellmy.version; " + ... "print tellmy.version.__version__; " + ... "import tellmy.fortune; " + ... "print tellmy.fortune.__version__") # doctest: +ELLIPSIS 1.0 1.0... >>> print system(join('bin', 'demo')) 1.0 1.0 4 2 """ if not zc.buildout.testing.script_in_shebang: del handle_namespace_package_in_both_site_packages_and_buildout_eggs def handle_sys_path_version_hack(): r""" This is a test for a bugfix. If you use a Python that has a different version of one of your dependencies, and the new package tries to do sys.path tricks in the setup.py to get a __version__, and it uses namespace packages, the older package will be loaded first, making the setup version the wrong number. While very arguably packages simply shouldn't do this, some do, and we don't want buildout to fall over when they do. To demonstrate this, we will need to create a distribution that has one of these unpleasant tricks, and a Python that has an older version installed. 
>>> py_path, site_packages_path = make_py() >>> for version in ('1.0', '1.1'): ... tmp = tempfile.mkdtemp() ... try: ... write(tmp, 'README.txt', '') ... mkdir(tmp, 'src') ... mkdir(tmp, 'src', 'tellmy') ... write(tmp, 'src', 'tellmy', '__init__.py', ... "__import__(" ... "'pkg_resources').declare_namespace(__name__)\n") ... mkdir(tmp, 'src', 'tellmy', 'version') ... write(tmp, 'src', 'tellmy', 'version', ... '__init__.py', '__version__=%r\n' % version) ... write( ... tmp, 'setup.py', ... "from setuptools import setup\n" ... "import sys\n" ... "sys.path.insert(0, 'src')\n" ... "from tellmy.version import __version__\n" ... "setup(\n" ... " name='tellmy.version',\n" ... " package_dir = {'': 'src'},\n" ... " packages = ['tellmy', 'tellmy.version'],\n" ... " install_requires = ['setuptools'],\n" ... " namespace_packages=['tellmy'],\n" ... " zip_safe=True, version=__version__,\n" ... " author='bob', url='bob', author_email='bob')\n" ... ) ... zc.buildout.testing.sdist(tmp, sample_eggs) ... if version == '1.0': ... # We install the 1.0 version in site packages the way a ... # system packaging system (debs, rpms) would do it. ... zc.buildout.testing.sys_install(tmp, site_packages_path) ... finally: ... shutil.rmtree(tmp) >>> print call_py( ... py_path, ... "import tellmy.version; print tellmy.version.__version__") 1.0 >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... find-links = %(sample_eggs)s ... ... [primed_python] ... executable = %(py_path)s ... ... [eggs] ... recipe = zc.recipe.egg:eggs ... python = primed_python ... eggs = tellmy.version == 1.1 ... ''' % globals()) Before the bugfix, running this buildout would generate this error: Installing eggs. Getting distribution for 'tellmy.version==1.1'. Installing tellmy.version 1.1 Caused installation of a distribution: tellmy.version 1.0 with a different version. Got None. While: Installing eggs. Error: There is a version conflict. We already have: tellmy.version 1.0 You can see the copiously commented fix for this in easy_install.py (see zc.buildout.easy_install.Installer._call_easy_install and particularly the comment leading up to zc.buildout.easy_install._easy_install_cmd). Now the install works correctly, as seen here. >>> print system(buildout) Installing eggs. Getting distribution for 'tellmy.version==1.1'. Got tellmy.version 1.1. """ def isolated_include_site_packages(): """ This is an isolated test of the include_site_packages functionality, passing the argument directly to install, overriding a default. Our "py_path" has the "demoneeded" and "demo" packages available. We'll simply be asking for "demoneeded" here. >>> py_path, site_packages_path = make_py() >>> create_sample_sys_install(site_packages_path) >>> zc.buildout.easy_install.include_site_packages(False) True >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, links=[], executable=py_path, ... index=None, include_site_packages=True) >>> [dist.project_name for dist in workingset] ['demoneeded'] That worked fine. Let's try again with site packages not allowed (and reversing the default). >>> zc.buildout.easy_install.include_site_packages(True) False >>> zc.buildout.easy_install.clear_index_cache() >>> rmdir(example_dest) >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, links=[], executable=py_path, ... 
index=None, include_site_packages=False) Traceback (most recent call last): ... MissingDistribution: Couldn't find a distribution for 'demoneeded'. That's a failure, as expected. Now we explore an important edge case. Some system Pythons include setuptools (and other Python packages) in their site-packages (or equivalent) using a .egg-info directory. The pkg_resources module (from setuptools) considers a package installed using .egg-info to be a develop egg. zc.buildout.buildout.Buildout.bootstrap will make setuptools and zc.buildout available to the buildout via the eggs directory, for normal eggs; or the develop-eggs directory, for develop-eggs. If setuptools or zc.buildout is found in site-packages and considered by pkg_resources to be a develop egg, then the bootstrap code will use a .egg-link in the local develop-eggs, pointing to site-packages, in its entirety. Because develop-eggs must always be available for searching for distributions, this indirectly brings site-packages back into the search path for distributions. Because of this, we have to take special care that we still exclude site-packages even in this case. See the comments about site packages in the Installer._satisfied and Installer._obtain methods for the implementation (as of this writing). In this demonstration, we insert a link to the "demoneeded" distribution in our develop-eggs, which would bring the package back in, except for the special care we have taken to exclude it. >>> zc.buildout.easy_install.clear_index_cache() >>> rmdir(example_dest) >>> example_dest = tmpdir('site-packages-example-install') >>> mkdir(example_dest, 'develop-eggs') >>> write(example_dest, 'develop-eggs', 'demoneeded.egg-link', ... site_packages_path) >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, links=[], ... path=[join(example_dest, 'develop-eggs')], ... executable=py_path, ... index=None, include_site_packages=False) Traceback (most recent call last): ... MissingDistribution: Couldn't find a distribution for 'demoneeded'. The MissingDistribution error shows that buildout correctly excluded the "site-packages" source even though it was indirectly included in the path via a .egg-link file. """ def include_site_packages_bug_623590(): """ As mentioned in isolated_include_site_packages, some system Pythons include various Python packages in their site-packages (or equivalent) using a .egg-info directory. The pkg_resources module (from setuptools) considers a package installed using .egg-info to be a develop egg We generally prefer develop eggs when we are selecting dependencies, because we expect them to be eggs that buildout has been told to develop. However, we should not consider these site-packages eggs as develop eggs--they should not have automatic precedence over eggs available elsewhere. We have specific code to handle this case, as identified in bug 623590. See zc.buildout.easy_install.Installer._satisfied, as of this writing, for the pertinent code. Here's the test for the bugfix. >>> py_path, site_packages_path = make_py() >>> create_sample_sys_install(site_packages_path) >>> zc.buildout.easy_install.include_site_packages(False) True >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demo'], example_dest, links=[sample_eggs], executable=py_path, ... 
index=None, include_site_packages=True, prefer_final=False) >>> [(dist.project_name, dist.version) for dist in workingset] [('demo', '0.4c1'), ('demoneeded', '1.2c1')] """ def allowed_eggs_from_site_packages(): """ Sometimes you need or want to control what eggs from site-packages are used. The allowed-eggs-from-site-packages option allows you to specify a whitelist of project names that may be included from site-packages. You can use globs to specify the value. It defaults to a single value of '*', indicating that any package may come from site-packages. This option interacts with include-site-packages in the following ways. If include-site-packages is true, then allowed-eggs-from-site-packages filters what eggs from site-packages may be chosen. If allowed-eggs-from-site-packages is an empty list, then no eggs from site-packages are chosen, but site-packages will still be included at the end of path lists. If include-site-packages is false, allowed-eggs-from-site-packages is irrelevant. This test shows the interaction with the zc.buildout.easy_install API. Another test below (allow_site_package_eggs_option) shows using it with a buildout.cfg. Our "py_path" has the "demoneeded" and "demo" packages available. We'll simply be asking for "demoneeded" here. >>> py_path, site_packages_path = make_py() >>> create_sample_sys_install(site_packages_path) >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, links=[], executable=py_path, ... index=None, ... allowed_eggs_from_site_packages=['demoneeded', 'other']) >>> [dist.project_name for dist in workingset] ['demoneeded'] That worked fine. It would work fine for a glob too. >>> zc.buildout.easy_install.clear_index_cache() >>> rmdir(example_dest) >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, links=[], executable=py_path, ... index=None, ... allowed_eggs_from_site_packages=['?emon*', 'other']) >>> [dist.project_name for dist in workingset] ['demoneeded'] But now let's try again with 'demoneeded' not allowed. >>> zc.buildout.easy_install.clear_index_cache() >>> rmdir(example_dest) >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, links=[], executable=py_path, ... index=None, ... allowed_eggs_from_site_packages=['demo']) Traceback (most recent call last): ... MissingDistribution: Couldn't find a distribution for 'demoneeded'. Here's the same, but with an empty list. >>> zc.buildout.easy_install.clear_index_cache() >>> rmdir(example_dest) >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, links=[], executable=py_path, ... index=None, ... allowed_eggs_from_site_packages=[]) Traceback (most recent call last): ... MissingDistribution: Couldn't find a distribution for 'demoneeded'. Of course, this doesn't stop us from getting a package from elsewhere. Here, we add a link server. >>> zc.buildout.easy_install.clear_index_cache() >>> rmdir(example_dest) >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, executable=py_path, ... links=[link_server], index=link_server+'index/', ... 
allowed_eggs_from_site_packages=['other']) >>> [dist.project_name for dist in workingset] ['demoneeded'] >>> [dist.location for dist in workingset] ['/site-packages-example-install/demoneeded-1.1-py2.6.egg'] Finally, here's an example of an interaction: we say that it is OK to allow the "demoneeded" egg to come from site-packages, but we don't include-site-packages. >>> zc.buildout.easy_install.clear_index_cache() >>> rmdir(example_dest) >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, links=[], executable=py_path, ... index=None, include_site_packages=False, ... allowed_eggs_from_site_packages=['demoneeded']) Traceback (most recent call last): ... MissingDistribution: Couldn't find a distribution for 'demoneeded'. """ def allowed_eggs_from_site_packages_dependencies_bugfix(): """ If you specify that a package with a dependency may come from site-packages, that doesn't mean that the dependency may come from site-packages. This is a test for a bug fix to verify that this is true. >>> py_path, site_packages_path = make_py() >>> create_sample_sys_install(site_packages_path) >>> interpreter_dir = tmpdir('interpreter') >>> interpreter_parts_dir = os.path.join( ... interpreter_dir, 'parts', 'interpreter') >>> interpreter_bin_dir = os.path.join(interpreter_dir, 'bin') >>> mkdir(interpreter_bin_dir) >>> mkdir(interpreter_dir, 'eggs') >>> mkdir(interpreter_dir, 'parts') >>> mkdir(interpreter_parts_dir) >>> ws = zc.buildout.easy_install.install( ... ['demo'], join(interpreter_dir, 'eggs'), executable=py_path, ... links=[link_server], index=link_server+'index/', ... allowed_eggs_from_site_packages=['demo']) >>> [dist.project_name for dist in ws] ['demo', 'demoneeded'] >>> from pprint import pprint >>> pprint([dist.location for dist in ws]) ['/executable_buildout/site-packages', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg'] """ def allowed_eggs_from_site_packages_bug_592524(): """ When we use allowed_eggs_from_site_packages, we need to make sure that the site-packages paths are not inserted with the normal egg paths. They already included at the end, and including them along with the normal egg paths will possibly mask subsequent egg paths. This affects interpreters and scripts generated by sitepackage_safe_scripts. Our "py_path" has the "demoneeded" and "demo" packages available. >>> py_path, site_packages_path = make_py() >>> create_sample_sys_install(site_packages_path) >>> interpreter_dir = tmpdir('interpreter') >>> interpreter_parts_dir = os.path.join( ... interpreter_dir, 'parts', 'interpreter') >>> interpreter_bin_dir = os.path.join(interpreter_dir, 'bin') >>> mkdir(interpreter_bin_dir) >>> mkdir(interpreter_dir, 'eggs') >>> mkdir(interpreter_dir, 'parts') >>> mkdir(interpreter_parts_dir) >>> ws = zc.buildout.easy_install.install( ... ['demo', 'other'], join(interpreter_dir, 'eggs'), executable=py_path, ... links=[link_server], index=link_server+'index/', ... allowed_eggs_from_site_packages=['demo']) >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, py_path, interpreter_parts_dir, ... interpreter='py', include_site_packages=True) Now we will look at the paths in the site.py we generated. Notice that the site-packages are at the end. They were not before this bugfix. 
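In rough terms, the property being checked is that buildout-controlled
paths come first and site-packages comes last, so nothing in
site-packages can shadow a picked egg.  A minimal sketch of that
ordering rule (illustrative only; the helper name is made up and the
real list is assembled by the generated site.py)::

    def ordered_sys_path(buildout_paths, site_packages_paths):
        # Paths controlled by buildout (the generated parts directory
        # and the picked eggs) come first; site-packages is appended
        # last so it can never shadow them.
        return list(buildout_paths) + list(site_packages_paths)
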
>>> test = 'import pprint, sys; pprint.pprint(sys.path[-4:])' >>> print call_py(join(interpreter_bin_dir, 'py'), test) ['/interpreter/eggs/other-1.0-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg', '/executable_buildout/eggs/setuptools-0.0-pyN.N.egg', '/executable_buildout/site-packages'] """ if not zc.buildout.testing.script_in_shebang: del allowed_eggs_from_site_packages_bug_592524 def subprocesses_have_same_environment_by_default(): """ The scripts generated by sitepackage_safe_scripts set the PYTHONPATH so that, if the environment is maintained (the default behavior), subprocesses get the same Python packages. First, we set up a script and an interpreter. >>> interpreter_dir = tmpdir('interpreter') >>> interpreter_parts_dir = os.path.join( ... interpreter_dir, 'parts', 'interpreter') >>> interpreter_bin_dir = os.path.join(interpreter_dir, 'bin') >>> mkdir(interpreter_bin_dir) >>> mkdir(interpreter_dir, 'eggs') >>> mkdir(interpreter_dir, 'parts') >>> mkdir(interpreter_parts_dir) >>> ws = zc.buildout.easy_install.install( ... ['demo'], join(interpreter_dir, 'eggs'), links=[link_server], ... index=link_server+'index/') >>> test = ( ... "import subprocess, sys; subprocess.call(" ... "[sys.executable, '-c', " ... "'import eggrecipedemo; print eggrecipedemo.x'])") >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... reqs=['demo'], interpreter='py', ... script_initialization=test + '; sys.exit(0)') This works for the script. >>> print system(join(interpreter_bin_dir, 'demo')) 3 This also works for the generated interpreter. >>> print call_py(join(interpreter_bin_dir, 'py'), test) 3 If you have a PYTHONPATH in your environment, it will be honored, after the buildout-generated path. >>> original_pythonpath = os.environ.get('PYTHONPATH') >>> os.environ['PYTHONPATH'] = 'foo' >>> test = ( ... "import subprocess, sys; subprocess.call(" ... "[sys.executable, '-c', " ... "'import sys, pprint; pprint.pprint(sys.path)'])") >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... reqs=['demo'], interpreter='py', ... script_initialization=test + '; sys.exit(0)') This works for the script. As you can see, /sample_buildout/foo is included right after the "parts" directory that contains site.py and sitecustomize.py. You can also see, actually more easily than in the other example, that we have the desired eggs available. >>> print system(join(interpreter_bin_dir, 'demo')), # doctest: +ELLIPSIS ['', '/interpreter/parts/interpreter', '/sample-buildout/foo', ... '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg'] This also works for the generated interpreter, with identical results. >>> print call_py(join(interpreter_bin_dir, 'py'), test), ... # doctest: +ELLIPSIS ['', '/interpreter/parts/interpreter', '/sample-buildout/foo', ... '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg'] >>> # Cleanup >>> if original_pythonpath: ... os.environ['PYTHONPATH'] = original_pythonpath ... else: ... del os.environ['PYTHONPATH'] ... """ def bootstrap_makes_buildout_that_works_with_system_python(): r""" In order to work smoothly with a system Python, bootstrapping creates the buildout script with zc.buildout.easy_install.sitepackage_safe_scripts. If it did not, a variety of problems might happen. 
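The call that bootstrap makes has the same shape as the ones exercised
elsewhere in this module (a rough sketch with placeholder paths, not the
exact bootstrap code)::

    import sys
    import pkg_resources
    import zc.buildout.easy_install

    # "ws" stands in for the working set of eggs that bootstrap just
    # installed; 'bin' and 'parts/buildout' are placeholder paths.
    ws = pkg_resources.working_set
    zc.buildout.easy_install.sitepackage_safe_scripts(
        'bin', ws, sys.executable, 'parts/buildout',
        reqs=['zc.buildout'])

The resulting bin/buildout script gets its path from buildout rather
than from whatever happens to be importable from the system Python's
site-packages.
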
For instance, if another version of buildout or setuptools is installed in the site-packages than is desired, it may cause a problem. A problem actually experienced in the field is when a recipe wants a different version of a dependency that is installed in site-packages. We will create a similar situation, and show that it is now handled. First let's write a dummy recipe. >>> mkdir(sample_buildout, 'recipes') >>> write(sample_buildout, 'recipes', 'dummy.py', ... ''' ... import logging, os, zc.buildout ... class Dummy: ... def __init__(self, buildout, name, options): ... pass ... def install(self): ... return () ... def update(self): ... pass ... ''') >>> write(sample_buildout, 'recipes', 'setup.py', ... ''' ... from setuptools import setup ... ... setup( ... name = "recipes", ... entry_points = {'zc.buildout': ['dummy = dummy:Dummy']}, ... install_requires = 'demoneeded==1.2c1', ... ) ... ''') >>> write(sample_buildout, 'recipes', 'README.txt', " ") Now we'll try to use it with a Python that has a different version of demoneeded installed. >>> py_path, site_packages_path = make_py() >>> create_sample_sys_install(site_packages_path) >>> rmdir('develop-eggs') >>> from zc.buildout.testing import make_buildout >>> make_buildout(executable=py_path) >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... develop = recipes ... parts = dummy ... find-links = %(link_server)s ... executable = %(py_path)s ... ... [dummy] ... recipe = recipes:dummy ... ''' % globals()) Now we actually run the buildout. Before the change, we got the following error: Develop: '/sample-buildout/recipes' While: Installing. Getting section dummy. Initializing section dummy. Installing recipe recipes. Error: There is a version conflict. We already have: demoneeded 1.1 but recipes 0.0.0 requires 'demoneeded==1.2c1'. Now, it is handled smoothly. >>> print system(buildout) Develop: '/sample-buildout/recipes' Getting distribution for 'demoneeded==1.2c1'. Got demoneeded 1.2c1. Installing dummy. Here's the same story with a namespace package, which has some additional complications behind the scenes. First, a recipe, in the "tellmy" namespace. >>> mkdir(sample_buildout, 'ns') >>> mkdir(sample_buildout, 'ns', 'tellmy') >>> write(sample_buildout, 'ns', 'tellmy', '__init__.py', ... "__import__('pkg_resources').declare_namespace(__name__)\n") >>> mkdir(sample_buildout, 'ns', 'tellmy', 'recipes') >>> write(sample_buildout, 'ns', 'tellmy', 'recipes', '__init__.py', ' ') >>> write(sample_buildout, 'ns', 'tellmy', 'recipes', 'dummy.py', ... ''' ... import logging, os, zc.buildout ... class Dummy: ... def __init__(self, buildout, name, options): ... pass ... def install(self): ... return () ... def update(self): ... pass ... ''') >>> write(sample_buildout, 'ns', 'setup.py', ... ''' ... from setuptools import setup ... setup( ... name="tellmy.recipes", ... packages=['tellmy', 'tellmy.recipes'], ... install_requires=['setuptools'], ... namespace_packages=['tellmy'], ... entry_points = {'zc.buildout': ... ['dummy = tellmy.recipes.dummy:Dummy']}, ... ) ... ''') Now, a buildout that uses it. >>> create_sample_namespace_eggs(sample_eggs, site_packages_path) >>> rmdir('develop-eggs') >>> from zc.buildout.testing import make_buildout >>> make_buildout(executable=py_path) >>> write(sample_buildout, 'buildout.cfg', ... ''' ... [buildout] ... develop = ns ... recipes ... parts = dummy ... find-links = %(link_server)s ... executable = %(py_path)s ... ... [dummy] ... recipe = tellmy.recipes:dummy ... 
''' % globals()) Now we actually run the buildout. >>> print system(buildout) Develop: '/sample-buildout/ns' Develop: '/sample-buildout/recipes' Uninstalling dummy. Installing dummy. """ if not zc.buildout.testing.script_in_shebang: del bootstrap_makes_buildout_that_works_with_system_python def test_exit_codes(): """ >>> import subprocess >>> def call(s): ... p = subprocess.Popen(s, stdin=subprocess.PIPE, ... stdout=subprocess.PIPE, stderr=subprocess.STDOUT) ... p.stdin.close() ... print p.stdout.read() ... print 'Exit:', bool(p.wait()) >>> call(buildout) Exit: False >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = x ... ''') >>> call(buildout) # doctest: +NORMALIZE_WHITESPACE While: Installing. Getting section x. Error: The referenced section, 'x', was not defined. Exit: True >>> write('setup.py', ... ''' ... from setuptools import setup ... setup(name='zc.buildout.testexit', entry_points={ ... 'zc.buildout': ['default = testexitrecipe:x']}) ... ''') >>> write('testexitrecipe.py', ... ''' ... x y ... ''') >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = x ... develop = . ... ... [x] ... recipe = zc.buildout.testexit ... ''') >>> call(buildout) # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS Develop: '/sample-buildout/.' While: Installing. Getting section x. Initializing section x. Loading zc.buildout recipe entry zc.buildout.testexit:default. An internal error occurred due to a bug in either zc.buildout or in a recipe being used: Traceback (most recent call last): ... x y ^ SyntaxError: invalid syntax Exit: True """ def bug_59270_recipes_always_start_in_buildout_dir(): """ Recipes can rely on running from buildout directory >>> mkdir('bad_start') >>> write('bad_recipe.py', ... ''' ... import os ... class Bad: ... def __init__(self, *_): ... print os.getcwd() ... def install(self): ... print os.getcwd() ... os.chdir('bad_start') ... print os.getcwd() ... return () ... ''') >>> write('setup.py', ... ''' ... from setuptools import setup ... setup(name='bad.test', ... entry_points={'zc.buildout': ['default=bad_recipe:Bad']},) ... ''') >>> write('buildout.cfg', ... ''' ... [buildout] ... develop = . ... parts = b1 b2 ... [b1] ... recipe = bad.test ... [b2] ... recipe = bad.test ... ''') >>> os.chdir('bad_start') >>> print system(join(sample_buildout, 'bin', 'buildout') ... +' -c '+join(sample_buildout, 'buildout.cfg')), Develop: '/sample-buildout/.' /sample-buildout /sample-buildout Installing b1. /sample-buildout /sample-buildout/bad_start Installing b2. /sample-buildout /sample-buildout/bad_start """ def bug_61890_file_urls_dont_seem_to_work_in_find_dash_links(): """ This bug arises from the fact that setuptools is overly restrictive about file urls, requiring that file urls pointing at directories must end in a slash. >>> dest = tmpdir('sample-install') >>> import zc.buildout.easy_install >>> sample_eggs = sample_eggs.replace(os.path.sep, '/') >>> ws = zc.buildout.easy_install.install( ... ['demo==0.2'], dest, ... links=['file://'+sample_eggs], index=link_server+'index/') >>> for dist in ws: ... print dist demo 0.2 demoneeded 1.1 >>> ls(dest) - demo-0.2-py2.4.egg - demoneeded-1.1-py2.4.egg """ def bug_75607_buildout_should_not_run_if_it_creates_an_empty_buildout_cfg(): """ >>> remove('buildout.cfg') >>> print system(buildout), While: Initializing. 
Error: Couldn't open /sample-buildout/buildout.cfg """ def dealing_with_extremely_insane_dependencies(): r""" There was a problem with analysis of dependencies taking a long time, in part because the analysis would get repeated every time a package was encountered in a dependency list. Now, we don't do the analysis any more: >>> import os >>> for i in range(5): ... p = 'pack%s' % i ... deps = [('pack%s' % j) for j in range(5) if j is not i] ... if i == 4: ... deps.append('pack5') ... mkdir(p) ... write(p, 'setup.py', ... 'from setuptools import setup\n' ... 'setup(name=%r, install_requires=%r,\n' ... ' url="u", author="a", author_email="e")\n' ... % (p, deps)) >>> write('buildout.cfg', ... ''' ... [buildout] ... develop = pack0 pack1 pack2 pack3 pack4 ... parts = pack1 ... ... [pack1] ... recipe = zc.recipe.egg:eggs ... eggs = pack0 ... ''') >>> print system(buildout), Develop: '/sample-buildout/pack0' Develop: '/sample-buildout/pack1' Develop: '/sample-buildout/pack2' Develop: '/sample-buildout/pack3' Develop: '/sample-buildout/pack4' Installing pack1. Couldn't find index page for 'pack5' (maybe misspelled?) Getting distribution for 'pack5'. While: Installing pack1. Getting distribution for 'pack5'. Error: Couldn't find a distribution for 'pack5'. However, if we run in verbose mode, we can see why packages were included: >>> print system(buildout+' -v'), # doctest: +ELLIPSIS Installing 'zc.buildout >=1.9.0, <2dev', 'setuptools'. We have a develop egg: zc.buildout 1.0.0 We have the best distribution that satisfies 'setuptools'. Picked: setuptools = 0.6 Develop: '/sample-buildout/pack0' Develop: '/sample-buildout/pack1' Develop: '/sample-buildout/pack2' Develop: '/sample-buildout/pack3' Develop: '/sample-buildout/pack4' ...Installing pack1. Installing 'pack0'. We have a develop egg: pack0 0.0.0 Getting required 'pack4' required by pack0 0.0.0. We have a develop egg: pack4 0.0.0 Getting required 'pack3' required by pack0 0.0.0. required by pack4 0.0.0. We have a develop egg: pack3 0.0.0 Getting required 'pack2' required by pack0 0.0.0. required by pack4 0.0.0. required by pack3 0.0.0. We have a develop egg: pack2 0.0.0 Getting required 'pack1' required by pack0 0.0.0. required by pack4 0.0.0. required by pack3 0.0.0. required by pack2 0.0.0. We have a develop egg: pack1 0.0.0 Getting required 'pack5' required by pack4 0.0.0. We have no distributions for pack5 that satisfies 'pack5'. Couldn't find index page for 'pack5' (maybe misspelled?) Getting distribution for 'pack5'. While: Installing pack1. Getting distribution for 'pack5'. Error: Couldn't find a distribution for 'pack5'. """ def read_find_links_to_load_extensions(): """ We'll create a wacky buildout extension that is just another name for http: >>> src = tmpdir('src') >>> write(src, 'wacky_handler.py', ... ''' ... import urllib2 ... class Wacky(urllib2.HTTPHandler): ... wacky_open = urllib2.HTTPHandler.http_open ... def install(buildout=None): ... urllib2.install_opener(urllib2.build_opener(Wacky)) ... ''') >>> write(src, 'setup.py', ... ''' ... from setuptools import setup ... setup(name='wackyextension', version='1', ... py_modules=['wacky_handler'], ... entry_points = {'zc.buildout.extension': ... ['default = wacky_handler:install'] ... }, ... ) ... ''') >>> print system(buildout+' setup '+src+' bdist_egg'), ... # doctest: +ELLIPSIS Running setup ... creating 'dist/wackyextension-1-... 
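For reference, buildout finds extensions through the
``zc.buildout.extension`` entry point group and calls each loaded entry
point with the running buildout object, which is why ``wacky_handler``
only needs to expose an ``install`` callable.  A rough sketch of that
handshake (illustrative; this is not buildout's actual loader code)::

    import pkg_resources

    # "buildout" here is the running Buildout instance.
    for ep in pkg_resources.iter_entry_points('zc.buildout.extension'):
        extension = ep.load()
        extension(buildout)  # here: wacky_handler.install(buildout)
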
Now we'll create a buildout that uses this extension to load other
packages:

    >>> wacky_server = link_server.replace('http', 'wacky')
    >>> dist = 'file://' + join(src, 'dist').replace(os.path.sep, '/')
    >>> write('buildout.cfg',
    ... '''
    ... [buildout]
    ... parts = demo
    ... extensions = wackyextension
    ... find-links = %(wacky_server)s/demoneeded-1.0.zip
    ...              %(dist)s
    ... [demo]
    ... recipe = zc.recipe.egg
    ... eggs = demoneeded
    ... ''' % globals())

When we run the buildout, it will load the extension from the dist
directory and then use the wacky extension to load the demo package.

    >>> print system(buildout),
    Getting distribution for 'wackyextension'.
    Got wackyextension 1.
    Installing demo.
    Getting distribution for 'demoneeded'.
    Got demoneeded 1.0.

    """

def distributions_from_local_find_links_make_it_to_download_cache():
    """
If we specify a local directory in find links, distributions found there
need to make it to the download cache.

    >>> mkdir('test')
    >>> write('test', 'setup.py',
    ... '''
    ... from setuptools import setup
    ... setup(name='foo')
    ... ''')
    >>> print system(buildout+' setup test bdist_egg'), # doctest: +ELLIPSIS
    Running setup script 'test/setup.py'.
    ...

    >>> mkdir('cache')
    >>> old_cache = zc.buildout.easy_install.download_cache('cache')
    >>> list(zc.buildout.easy_install.install(['foo'], 'eggs',
    ...          links=[join('test', 'dist')])) # doctest: +ELLIPSIS
    [foo 0.0.0 ...

    >>> ls('cache')
    - foo-0.0.0-py2.4.egg

    >>> _ = zc.buildout.easy_install.download_cache(old_cache)
    """

def create_egg(name, version, dest, install_requires=None,
               dependency_links=None):
    d = tempfile.mkdtemp()
    if dest=='available':
        extras = dict(x=['x'])
    else:
        extras = {}
    if dependency_links:
        links = 'dependency_links = %s, ' % dependency_links
    else:
        links = ''
    if install_requires:
        requires = 'install_requires = %s, ' % install_requires
    else:
        requires = ''
    try:
        open(os.path.join(d, 'setup.py'), 'w').write(
            'from setuptools import setup\n'
            'setup(name=%r, version=%r, extras_require=%r, zip_safe=True,\n'
            ' %s %s py_modules=["setup"]\n)'
            % (name, str(version), extras, requires, links)
            )
        zc.buildout.testing.bdist_egg(d, sys.executable,
                                      os.path.abspath(dest))
    finally:
        shutil.rmtree(d)

def prefer_final_permutation(existing, available):
    for d in ('existing', 'available'):
        if os.path.exists(d):
            shutil.rmtree(d)
        os.mkdir(d)
    for version in existing:
        create_egg('spam', version, 'existing')
    for version in available:
        create_egg('spam', version, 'available')
    zc.buildout.easy_install.clear_index_cache()
    [dist] = list(
        zc.buildout.easy_install.install(['spam'], 'existing', ['available'],
                                         always_unzip=True)
        )
    if dist.extras:
        print 'downloaded', dist.version
    else:
        print 'had', dist.version
    sys.path_importer_cache.clear()

def prefer_final():
    """
This test tests several permutations:

Using different version numbers to work around zip importer cache
problems. :(

- With prefer final:

  - no existing and newer dev available

    >>> prefer_final_permutation((), [1, '2a1'])
    downloaded 1

  - no existing and only dev available

    >>> prefer_final_permutation((), ['3a1'])
    downloaded 3a1

  - final existing and only dev available

    >>> prefer_final_permutation([4], ['5a1'])
    had 4

  - final existing and newer final available

    >>> prefer_final_permutation([6], [7])
    downloaded 7

  - final existing and same final available

    >>> prefer_final_permutation([8], [8])
    had 8

  - final existing and older final available

    >>> prefer_final_permutation([10], [9])
    had 10

  - only dev existing and final available

    >>> prefer_final_permutation(['12a1'], [11])
    downloaded 11

  - only dev existing and no final available newer dev available

    >>> prefer_final_permutation(['13a1'], ['13a2'])
    downloaded 13a2

  - only dev existing and no final available older dev available

    >>> prefer_final_permutation(['15a1'], ['14a1'])
    had 15a1

  - only dev existing and no final available same dev available

    >>> prefer_final_permutation(['16a1'], ['16a1'])
    had 16a1

- Without prefer final:

    >>> _ = zc.buildout.easy_install.prefer_final(False)

  - no existing and newer dev available

    >>> prefer_final_permutation((), [18, '19a1'])
    downloaded 19a1

  - no existing and only dev available

    >>> prefer_final_permutation((), ['20a1'])
    downloaded 20a1

  - final existing and only dev available

    >>> prefer_final_permutation([21], ['22a1'])
    downloaded 22a1

  - final existing and newer final available

    >>> prefer_final_permutation([23], [24])
    downloaded 24

  - final existing and same final available

    >>> prefer_final_permutation([25], [25])
    had 25

  - final existing and older final available

    >>> prefer_final_permutation([27], [26])
    had 27

  - only dev existing and final available

    >>> prefer_final_permutation(['29a1'], [28])
    had 29a1

  - only dev existing and no final available newer dev available

    >>> prefer_final_permutation(['30a1'], ['30a2'])
    downloaded 30a2

  - only dev existing and no final available older dev available

    >>> prefer_final_permutation(['32a1'], ['31a1'])
    had 32a1

  - only dev existing and no final available same dev available

    >>> prefer_final_permutation(['33a1'], ['33a1'])
    had 33a1

    >>> _ = zc.buildout.easy_install.prefer_final(True)
    """

def buildout_prefer_final_option():
    """
The prefer-final buildout option can be used to override the default
preference for newer distributions.

The default is prefer-final = false:

    >>> write('buildout.cfg',
    ... '''
    ... [buildout]
    ... parts = eggs
    ... find-links = %(link_server)s
    ...
    ... [eggs]
    ... recipe = zc.recipe.egg:eggs
    ... eggs = demo
    ... ''' % globals())
    >>> print system(buildout+' -v'), # doctest: +ELLIPSIS
    Installing 'zc.buildout >=1.99, <2dev', 'setuptools'.
    ...
    Picked: demo = 0.4c1
    ...
    Picked: demoneeded = 1.2c1

Here we see that the newest versions of demo and demoneeded are used,
even though they are release candidates rather than final releases.  We
get the same behavior if we add prefer-final = false:

    >>> write('buildout.cfg',
    ... '''
    ... [buildout]
    ... parts = eggs
    ... find-links = %(link_server)s
    ... prefer-final = false
    ...
    ... [eggs]
    ... recipe = zc.recipe.egg:eggs
    ... eggs = demo
    ... ''' % globals())
    >>> print system(buildout+' -v'), # doctest: +ELLIPSIS
    Installing 'zc.buildout >=1.99, <2dev', 'setuptools'.
    ...
    Picked: demo = 0.4c1
    ...
    Picked: demoneeded = 1.2c1

If we specify prefer-final = true, we'll get the newest final
distributions instead:

    >>> write('buildout.cfg',
    ... '''
    ... [buildout]
    ... parts = eggs
    ... find-links = %(link_server)s
    ... prefer-final = true
    ...
    ... [eggs]
    ... recipe = zc.recipe.egg:eggs
    ... eggs = demo
    ...
''' % globals()) >>> print system(buildout+' -v'), # doctest: +ELLIPSIS Installing 'zc.buildout >=1.99, <2dev', 'setuptools'. ... Picked: demo = 0.3 ... Picked: demoneeded = 1.1 We get an error if we specify anything but true or false: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... find-links = %(link_server)s ... prefer-final = no ... ... [eggs] ... recipe = zc.recipe.egg:eggs ... eggs = demo ... ''' % globals()) >>> print system(buildout+' -v'), # doctest: +ELLIPSIS While: Initializing. Error: Invalid value for prefer-final option: no """ def wont_downgrade_due_to_prefer_final(): r""" If we install a non-final buildout version, we don't want to downgrade just bcause we prefer-final. If a buildout version isn't specified, either through buildout-version or a versions entry, then buildout-version gets set to >=CURRENT_VERSION, <2dev. >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = ... ''') >>> [v] = [l.split('=', 1)[1].strip() ... for l in system(buildout+' -vv').split('\n') ... if l.startswith('zc.buildout-version = ')] >>> v == '>=' + pkg_resources.working_set.find( ... pkg_resources.Requirement.parse('zc.buildout') ... ).version+', <2dev' True >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = ... zc.buildout-version = >.1 ... ''') >>> [l.split('=', 1)[1].strip() ... for l in system(buildout+' -vv').split('\n') ... if l.startswith('zc.buildout-version =')] ['>.1'] >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = ... versions = versions ... [versions] ... zc.buildout = 43 ... ''') >>> print system(buildout+' -v'), # doctest: +ELLIPSIS Installing... Error: Couldn't find a distribution for 'zc.buildout==43'. """ def buildout_prefer_final_build_system_option(): """ The accept-buildout-test-releases buildout option can be used for overriding the default preference for final distributions for recipes, buildout extensions, and buildout itself. It is usually controlled via the bootstrap script rather than in the configuration file, but we will test the machinery using the file. Set up. This creates sdists for demorecipe 1.0 and 1.1b1, and for demoextension 1.0 and 1.1b1. >>> create_sample_recipe_sdists(sample_eggs) >>> create_sample_extension_sdists(sample_eggs) The default is accept-buildout-test-releases = false: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = demo ... find-links = %(link_server)s ... extensions = demoextension ... ... [demo] ... recipe = demorecipe ... ''' % globals()) >>> print system(buildout+' -v'), # doctest: +ELLIPSIS Installing ... Picked: demoextension = 1.0 ... Picked: demorecipe = 1.0 ... Here we see that the final versions of demorecipe and demoextension were used. We get the same behavior if we explicitly state that accept-buildout-test-releases = false. >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = demo ... find-links = %(link_server)s ... extensions = demoextension ... accept-buildout-test-releases = false ... ... [demo] ... recipe = demorecipe ... ''' % globals()) >>> print system(buildout+' -v'), # doctest: +ELLIPSIS Installing ... Picked: demoextension = 1.0 ... Picked: demorecipe = 1.0 ... If we specify accept-buildout-test-releases = true, we'll get the newest distributions in the build system: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = demo ... find-links = %(link_server)s ... extensions = demoextension ... accept-buildout-test-releases = true ... ... [demo] ... recipe = demorecipe ... 
''' % globals()) >>> print system(buildout+' -v'), # doctest: +ELLIPSIS Installing ... Picked: demoextension = 1.1b1 ... Picked: demorecipe = 1.1b1 ... We get an error if we specify anything but true or false: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = demo ... find-links = %(link_server)s ... extensions = demoextension ... accept-buildout-test-releases = no ... ... [demo] ... recipe = demorecipe ... ''' % globals()) >>> print system(buildout+' -v'), # doctest: +ELLIPSIS While: Initializing. Error: Invalid value for accept-buildout-test-releases option: no """ def develop_with_modules(): """ Distribution setup scripts can import modules in the distribution directory: >>> mkdir('foo') >>> write('foo', 'bar.py', ... '''# empty ... ''') >>> write('foo', 'setup.py', ... ''' ... import bar ... from setuptools import setup ... setup(name="foo") ... ''') >>> write('buildout.cfg', ... ''' ... [buildout] ... develop = foo ... parts = ... ''') >>> print system(join('bin', 'buildout')), Develop: '/sample-buildout/foo' >>> ls('develop-eggs') - foo.egg-link - z3c.recipe.scripts.egg-link - zc.recipe.egg.egg-link """ def dont_pick_setuptools_if_version_is_specified_when_required_by_src_dist(): """ When installing a source distribution, we got setuptools without honoring our version specification. >>> mkdir('dist') >>> write('setup.py', ... ''' ... from setuptools import setup ... setup(name='foo', version='1', py_modules=['foo'], zip_safe=True) ... ''') >>> write('foo.py', '') >>> _ = system(buildout+' setup . sdist') >>> if zc.buildout.easy_install.is_distribute: ... distribute_version = 'distribute = %s' % ( ... pkg_resources.working_set.find( ... pkg_resources.Requirement.parse('distribute')).version,) ... else: ... distribute_version = '' ... >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = foo ... find-links = dist ... versions = versions ... allow-picked-versions = false ... ... [versions] ... setuptools = %s ... foo = 1 ... %s ... ... [foo] ... recipe = zc.recipe.egg ... eggs = foo ... ''' % (pkg_resources.working_set.find( ... pkg_resources.Requirement.parse('setuptools')).version, ... distribute_version)) >>> print system(buildout), Installing foo. Getting distribution for 'foo==1'. Got foo 1. """ def pyc_files_have_correct_paths(): r""" >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... find-links = %(link_server)s ... unzip = true ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = demo ... interpreter = py ... ''' % globals()) >>> _ = system(buildout) >>> write('t.py', ... ''' ... import eggrecipedemo, eggrecipedemoneeded ... print eggrecipedemo.main.func_code.co_filename ... print eggrecipedemoneeded.f.func_code.co_filename ... ''') >>> print system(join('bin', 'py')+ ' t.py'), /sample-buildout/eggs/demo-0.4c1-py2.4.egg/eggrecipedemo.py /sample-buildout/eggs/demoneeded-1.2c1-py2.4.egg/eggrecipedemoneeded.py >>> import os >>> for name in os.listdir('eggs'): ... if name.startswith('demoneeded'): ... ls('eggs', name) d EGG-INFO - eggrecipedemoneeded.py - eggrecipedemoneeded.pyc """ def dont_mess_with_standard_dirs_with_variable_refs(): """ >>> write('buildout.cfg', ... ''' ... [buildout] ... eggs-directory = ${buildout:directory}/develop-eggs ... parts = ... 
''' % globals()) >>> print system(buildout), """ def expand_shell_patterns_in_develop_paths(): """ Sometimes we want to include a number of eggs in some directory as develop eggs, without explicitly listing all of them in our buildout.cfg >>> make_dist_that_requires(sample_buildout, 'sampley') >>> make_dist_that_requires(sample_buildout, 'samplez') Now, let's create a buildout that has a shell pattern that matches both: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... develop = sample* ... find-links = %(link_server)s ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = sampley ... samplez ... ''' % globals()) We can see that both eggs were found: >>> print system(buildout), Develop: '/sample-buildout/sampley' Develop: '/sample-buildout/samplez' Installing eggs. """ def warn_users_when_expanding_shell_patterns_yields_no_results(): """ Sometimes shell patterns do not match anything, so we want to warn our users about it... >>> make_dist_that_requires(sample_buildout, 'samplea') So if we have 2 patterns, one that has a matching directory, and another one that does not >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... develop = samplea grumble* ... find-links = %(link_server)s ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = samplea ... ''' % globals()) We should get one of the eggs, and a warning for the pattern that did not match anything. >>> print system(buildout), Develop: '/sample-buildout/samplea' Couldn't develop '/sample-buildout/grumble*' (not found) Installing eggs. """ def make_sure_versions_dont_cancel_extras(): """ There was a bug that caused extras in requirements to be lost. >>> open('setup.py', 'w').write(''' ... from setuptools import setup ... setup(name='extraversiondemo', version='1.0', ... url='x', author='x', author_email='x', ... extras_require=dict(foo=['demo']), py_modules=['t']) ... ''') >>> open('README', 'w').close() >>> open('t.py', 'w').close() >>> sdist('.', sample_eggs) >>> mkdir('dest') >>> ws = zc.buildout.easy_install.install( ... ['extraversiondemo[foo]'], 'dest', links=[sample_eggs], ... versions = dict(extraversiondemo='1.0') ... ) >>> sorted(dist.key for dist in ws) ['demo', 'demoneeded', 'extraversiondemo'] """ def increment_buildout_options(): r""" >>> write('b1.cfg', ''' ... [buildout] ... parts = p1 ... x = 1 ... y = a ... b ... ... [p1] ... recipe = zc.buildout:debug ... foo = ${buildout:x} ${buildout:y} ... ''') >>> write('buildout.cfg', ''' ... [buildout] ... extends = b1.cfg ... parts += p2 ... x += 2 ... y -= a ... ... [p2] ... <= p1 ... ''') >>> print system(buildout), Installing p1. foo='1\n2 b' recipe='zc.buildout:debug' Installing p2. foo='1\n2 b' recipe='zc.buildout:debug' """ def increment_buildout_with_multiple_extended_files_421022(): r""" >>> write('foo.cfg', ''' ... [buildout] ... foo-option = foo ... [other] ... foo-option = foo ... ''') >>> write('bar.cfg', ''' ... [buildout] ... bar-option = bar ... [other] ... bar-option = bar ... ''') >>> write('buildout.cfg', ''' ... [buildout] ... parts = p other ... extends = bar.cfg foo.cfg ... bar-option += baz ... foo-option += ham ... ... [other] ... recipe = zc.buildout:debug ... bar-option += baz ... foo-option += ham ... ... [p] ... recipe = zc.buildout:debug ... x = ${buildout:bar-option} ${buildout:foo-option} ... ''') >>> print system(buildout), Installing p. recipe='zc.buildout:debug' x='bar\nbaz foo\nham' Installing other. 
bar-option='bar\nbaz'
foo-option='foo\nham'
recipe='zc.buildout:debug'
    """

def increment_on_command_line():
    r"""
    >>> write('buildout.cfg', '''
    ... [buildout]
    ... parts = p1
    ... x = 1
    ... y = a
    ...     b
    ...
    ... [p1]
    ... recipe = zc.buildout:debug
    ... foo = ${buildout:x} ${buildout:y}
    ...
    ... [p2]
    ... <= p1
    ... ''')

    >>> print system(buildout+' buildout:parts+=p2 p1:foo+=bar'),
    Installing p1.
    foo='1 a\nb\nbar'
    recipe='zc.buildout:debug'
    Installing p2.
    foo='1 a\nb\nbar'
    recipe='zc.buildout:debug'
    """

def test_constrained_requirement():
    """
    zc.buildout.easy_install._constrained_requirement(constraint, requirement)

    Transforms a requirement by applying a constraint.  Here's a table of
    examples:

    >>> from zc.buildout.easy_install import IncompatibleConstraintError
    >>> examples = [
    ...     # original, constraint, transformed
    ...     ('x', '1', 'x==1'),
    ...     ('x>1', '2', 'x==2'),
    ...     ('x>3', '2', IncompatibleConstraintError),
    ...     ('x>1', '>2', 'x>2'),
    ...     ('x>1', '>=2', 'x>=2'),
    ...     ('x<1', '>2', IncompatibleConstraintError),
    ...     ('x<=1', '>=1', 'x>=1,<1,==1'),
    ...     ('x<3', '>1', 'x>1,<3'),
    ...     ('x==2', '>1', 'x==2'),
    ...     ('x==2', '>=2', 'x==2'),
    ...     ('x[y]', '1', 'x[y]==1'),
    ...     ('x[y]>1', '2', 'x[y]==2'),
    ...     ('x<3', '2', 'x==2'),
    ...     ('x<1', '2', IncompatibleConstraintError),
    ...     ('x<3', '<2', 'x<2'),
    ...     ('x<3', '<=2', 'x<=2'),
    ...     ('x>3', '<2', IncompatibleConstraintError),
    ...     ('x>=1', '<=1', 'x<=1,>1,==1'),
    ...     ('x<3', '>1', 'x>1,<3'),
    ...     ('x==2', '<3', 'x==2'),
    ...     ('x==2', '<=2', 'x==2'),
    ...     ('x[y]<3', '2', 'x[y]==2'),
    ...     ]
    >>> from zc.buildout.easy_install import _constrained_requirement
    >>> for o, c, e in examples:
    ...     try:
    ...         o = pkg_resources.Requirement.parse(o)
    ...         if isinstance(e, str):
    ...             e = pkg_resources.Requirement.parse(e)
    ...         g = _constrained_requirement(c, o)
    ...     except IncompatibleConstraintError:
    ...         g = IncompatibleConstraintError
    ...     if str(g) != str(e):
    ...         print 'failed', o, c, g, '!=', e
    """

######################################################################

def make_py_with_system_install(make_py, sample_eggs):
    py_path, site_packages_path = make_py()
    create_sample_namespace_eggs(sample_eggs, site_packages_path)
    return py_path

def create_sample_namespace_eggs(dest, site_packages_path=None):
    from zc.buildout.testing import write, mkdir
    for pkg, version in (('version', '1.0'), ('version', '1.1'),
                         ('fortune', '1.0')):
        tmp = tempfile.mkdtemp()
        try:
            write(tmp, 'README.txt', '')
            mkdir(tmp, 'src')
            mkdir(tmp, 'src', 'tellmy')
            write(tmp, 'src', 'tellmy', '__init__.py',
                  "__import__("
                  "'pkg_resources').declare_namespace(__name__)\n")
            mkdir(tmp, 'src', 'tellmy', pkg)
            write(tmp, 'src', 'tellmy', pkg,
                  '__init__.py', '__version__=%r\n' % version)
            write(
                tmp, 'setup.py',
                "from setuptools import setup\n"
                "setup(\n"
                " name='tellmy.%(pkg)s',\n"
                " package_dir = {'': 'src'},\n"
                " packages = ['tellmy', 'tellmy.%(pkg)s'],\n"
                " install_requires = ['setuptools'],\n"
                " namespace_packages=['tellmy'],\n"
                " zip_safe=True, version=%(version)r,\n"
                " author='bob', url='bob', author_email='bob')\n"
                % locals()
                )
            zc.buildout.testing.sdist(tmp, dest)
            if (site_packages_path and pkg == 'version' and
                version == '1.1'):
                # We install the 1.1 version in site packages the way a
                # system packaging system (debs, rpms) would do it.
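                # (For context: zc.buildout.testing.sys_install installs
                # the distribution straight into site_packages_path, in
                # the .egg-info style layout described in the doctests
                # above, which is why pkg_resources then treats it as a
                # develop egg.)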
zc.buildout.testing.sys_install(tmp, site_packages_path) finally: shutil.rmtree(tmp) def create_sample_extension_sdists(dest): from zc.buildout.testing import write, mkdir name = 'demoextension' for version in ('1.0', '1.1b1'): tmp = tempfile.mkdtemp() try: write(tmp, 'README.txt', '') write(tmp, name + '.py', "def ext(buildout):\n" " pass\n" "def unload(buildout):\n" " pass\n" % locals()) write(tmp, 'setup.py', "from setuptools import setup\n" "setup(\n" " name = %(name)r,\n" " py_modules = [%(name)r],\n" " entry_points = {\n" " 'zc.buildout.extension': " "['ext = %(name)s:ext'],\n" " 'zc.buildout.unloadextension': " "['ext = %(name)s:unload'],\n" " },\n" " zip_safe=True, version=%(version)r,\n" " author='bob', url='bob', author_email='bob')\n" % locals()) zc.buildout.testing.sdist(tmp, dest) finally: shutil.rmtree(tmp) def create_sample_recipe_sdists(dest): from zc.buildout.testing import write, mkdir name = 'demorecipe' for version in ('1.0', '1.1b1'): tmp = tempfile.mkdtemp() try: write(tmp, 'README.txt', '') write(tmp, name + '.py', "import logging, os, zc.buildout\n" "class Demorecipe:\n" " def __init__(self, buildout, name, options):\n" " self.name, self.options = name, options\n" " def install(self):\n" " return ()\n" " def update(self):\n" " pass\n" % locals()) write(tmp, 'setup.py', "from setuptools import setup\n" "setup(\n" " name = %(name)r,\n" " py_modules = [%(name)r],\n" " entry_points = {'zc.buildout': " "['default = %(name)s:Demorecipe']},\n" " zip_safe=True, version=%(version)r,\n" " author='bob', url='bob', author_email='bob')\n" % locals()) zc.buildout.testing.sdist(tmp, dest) finally: shutil.rmtree(tmp) def _write_eggrecipedemoneeded(tmp, minor_version, suffix=''): from zc.buildout.testing import write write(tmp, 'README.txt', '') write(tmp, 'eggrecipedemoneeded.py', 'y=%s\ndef f():\n pass' % minor_version) write( tmp, 'setup.py', "from setuptools import setup\n" "setup(name='demoneeded', py_modules=['eggrecipedemoneeded']," " zip_safe=True, version='1.%s%s', author='bob', url='bob', " "author_email='bob')\n" % (minor_version, suffix) ) def _write_eggrecipedemo(tmp, minor_version, suffix=''): from zc.buildout.testing import write write(tmp, 'README.txt', '') write( tmp, 'eggrecipedemo.py', 'import eggrecipedemoneeded\n' 'x=%s\n' 'def main(): print x, eggrecipedemoneeded.y\n' % minor_version) write( tmp, 'setup.py', "from setuptools import setup\n" "setup(name='demo', py_modules=['eggrecipedemo']," " install_requires = 'demoneeded'," " entry_points={'console_scripts': " "['demo = eggrecipedemo:main']}," " zip_safe=True, version='0.%s%s')\n" % (minor_version, suffix) ) def create_sample_sys_install(site_packages_path): for creator, minor_version in ( (_write_eggrecipedemoneeded, 1), (_write_eggrecipedemo, 3)): # Write the files and install in site_packages_path. 
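        # This leaves the test's "system Python" with demoneeded 1.1 and
        # demo 0.3 installed directly in site-packages, which is what the
        # include-site-packages tests above rely on.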
tmp = tempfile.mkdtemp() try: creator(tmp, minor_version) zc.buildout.testing.sys_install(tmp, site_packages_path) finally: shutil.rmtree(tmp) def create_sample_eggs(test, executable=sys.executable): from zc.buildout.testing import write dest = test.globs['sample_eggs'] tmp = tempfile.mkdtemp() try: for i in (0, 1, 2): suffix = i==2 and 'c1' or '' _write_eggrecipedemoneeded(tmp, i, suffix) zc.buildout.testing.sdist(tmp, dest) write( tmp, 'setup.py', "from setuptools import setup\n" "setup(name='other', zip_safe=False, version='1.0', " "py_modules=['eggrecipedemoneeded'])\n" ) zc.buildout.testing.bdist_egg(tmp, executable, dest) os.remove(os.path.join(tmp, 'eggrecipedemoneeded.py')) for i in (1, 2, 3, 4): suffix = i==4 and 'c1' or '' _write_eggrecipedemo(tmp, i, suffix) zc.buildout.testing.bdist_egg(tmp, executable, dest) write(tmp, 'eggrecipebigdemo.py', 'import eggrecipedemo') write( tmp, 'setup.py', "from setuptools import setup\n" "setup(name='bigdemo', " " install_requires = 'demo'," " py_modules=['eggrecipebigdemo'], " " zip_safe=True, version='0.1')\n" ) zc.buildout.testing.bdist_egg(tmp, executable, dest) finally: shutil.rmtree(tmp) extdemo_c = """ #include #include static PyMethodDef methods[] = {{NULL}}; PyMODINIT_FUNC initextdemo(void) { PyObject *m; m = Py_InitModule3("extdemo", methods, ""); #ifdef TWO PyModule_AddObject(m, "val", PyInt_FromLong(2)); #else PyModule_AddObject(m, "val", PyInt_FromLong(EXTDEMO)); #endif } """ extdemo_setup_py = """ import os from distutils.core import setup, Extension if os.environ.get('test-variable'): print "Have environment test-variable:", os.environ['test-variable'] setup(name = "extdemo", version = "%s", url="http://www.zope.org", author="Demo", author_email="demo@demo.com", ext_modules = [Extension('extdemo', ['extdemo.c'])], ) """ def add_source_dist(test, version=1.4): if 'extdemo' not in test.globs: test.globs['extdemo'] = test.globs['tmpdir']('extdemo') tmp = test.globs['extdemo'] write = test.globs['write'] try: write(tmp, 'extdemo.c', extdemo_c); write(tmp, 'setup.py', extdemo_setup_py % version); write(tmp, 'README', ""); write(tmp, 'MANIFEST.in', "include *.c\n"); test.globs['sdist'](tmp, test.globs['sample_eggs']) except: shutil.rmtree(tmp) def easy_install_SetUp(test): zc.buildout.testing.buildoutSetUp(test) sample_eggs = test.globs['tmpdir']('sample_eggs') test.globs['sample_eggs'] = sample_eggs os.mkdir(os.path.join(sample_eggs, 'index')) create_sample_eggs(test) add_source_dist(test) test.globs['link_server'] = test.globs['start_server']( test.globs['sample_eggs']) test.globs['update_extdemo'] = lambda : add_source_dist(test, 1.5) zc.buildout.testing.install_develop('zc.recipe.egg', test) zc.buildout.testing.install_develop('z3c.recipe.scripts', test) def buildout_txt_setup(test): zc.buildout.testing.buildoutSetUp(test) mkdir = test.globs['mkdir'] eggs = os.environ['buildout-testing-index-url'][7:] test.globs['sample_eggs'] = eggs create_sample_eggs(test) for name in os.listdir(eggs): if '-' in name: pname = name.split('-')[0] if not os.path.exists(os.path.join(eggs, pname)): mkdir(eggs, pname) shutil.move(os.path.join(eggs, name), os.path.join(eggs, pname, name)) dist = pkg_resources.working_set.find( pkg_resources.Requirement.parse('zc.recipe.egg')) mkdir(eggs, 'zc.recipe.egg') zc.buildout.testing.sdist( os.path.dirname(dist.location), os.path.join(eggs, 'zc.recipe.egg'), ) egg_parse = re.compile('([0-9a-zA-Z_.]+)-([0-9a-zA-Z_.]+)-py(\d[.]\d).egg$' ).match def makeNewRelease(project, ws, dest, version='1.99.99'): dist = 
ws.find(pkg_resources.Requirement.parse(project)) eggname, oldver, pyver = egg_parse( os.path.basename(dist.location) ).groups() dest = os.path.join(dest, "%s-%s-py%s.egg" % (eggname, version, pyver)) if os.path.isfile(dist.location): shutil.copy(dist.location, dest) zip = zipfile.ZipFile(dest, 'a') zip.writestr( 'EGG-INFO/PKG-INFO', zip.read('EGG-INFO/PKG-INFO').replace("Version: %s" % oldver, "Version: %s" % version) ) zip.close() else: shutil.copytree(dist.location, dest) info_path = os.path.join(dest, 'EGG-INFO', 'PKG-INFO') info = open(info_path).read().replace("Version: %s" % oldver, "Version: %s" % version) open(info_path, 'w').write(info) def getWorkingSetWithBuildoutEgg(test): sample_buildout = test.globs['sample_buildout'] eggs = os.path.join(sample_buildout, 'eggs') # If the zc.buildout dist is a develop dist, convert it to a # regular egg in the sample buildout req = pkg_resources.Requirement.parse('zc.buildout') dist = pkg_resources.working_set.find(req) if dist.precedence == pkg_resources.DEVELOP_DIST: # We have a develop egg, create a real egg for it: here = os.getcwd() os.chdir(os.path.dirname(dist.location)) assert os.spawnle( os.P_WAIT, sys.executable, zc.buildout.easy_install._safe_arg(sys.executable), os.path.join(os.path.dirname(dist.location), 'setup.py'), '-q', 'bdist_egg', '-d', eggs, dict(os.environ, PYTHONPATH=pkg_resources.working_set.find( pkg_resources.Requirement.parse('setuptools') ).location, ), ) == 0 os.chdir(here) os.remove(os.path.join(eggs, 'zc.buildout.egg-link')) # Rebuild the buildout script ws = pkg_resources.WorkingSet([eggs]) ws.require('zc.buildout') zc.buildout.easy_install.scripts( ['zc.buildout'], ws, sys.executable, os.path.join(sample_buildout, 'bin')) else: ws = pkg_resources.working_set return ws def updateSetup(test): zc.buildout.testing.buildoutSetUp(test) new_releases = test.globs['tmpdir']('new_releases') test.globs['new_releases'] = new_releases ws = getWorkingSetWithBuildoutEgg(test) # now let's make the new releases makeNewRelease('zc.buildout', ws, new_releases) makeNewRelease('zc.buildout', ws, new_releases, '1.100.0b1') makeNewRelease('zc.buildout', ws, new_releases, '2.0.0') os.mkdir(os.path.join(new_releases, 'zc.buildout')) if zc.buildout.easy_install.is_distribute: makeNewRelease('distribute', ws, new_releases) os.mkdir(os.path.join(new_releases, 'distribute')) else: makeNewRelease('setuptools', ws, new_releases) os.mkdir(os.path.join(new_releases, 'setuptools')) def bootstrapSetup(test): easy_install_SetUp(test) sample_eggs = test.globs['sample_eggs'] ws = getWorkingSetWithBuildoutEgg(test) makeNewRelease('zc.buildout', ws, sample_eggs) makeNewRelease('zc.buildout', ws, sample_eggs, '1.100.0b1') makeNewRelease('zc.buildout', ws, sample_eggs, '2.0.0') os.environ['bootstrap-testing-find-links'] = test.globs['link_server'] normalize_bang = ( re.compile(re.escape('#!'+ zc.buildout.easy_install._safe_arg(sys.executable))), '#!/usr/local/bin/python2.4', ) hide_distribute_additions = (re.compile('install_dir .+\n'), '') hide_zip_safe_message = ( # This comes in a different place in the output in Python 2.7. It's not # important to our tests. Hide it. re.compile( '((?<=\n)\n)?zip_safe flag not set; analyzing archive contents...\n'), '') hide_first_index_page_message = ( # This comes in a different place in the output in Python 2.7. It's not # important to our tests. Hide it. 
re.compile( "Couldn't find index page for '[^']+' \(maybe misspelled\?\)\n"), '') def test_suite(): test_suite = [ doctest.DocFileSuite( 'buildout.txt', setUp=buildout_txt_setup, tearDown=zc.buildout.testing.buildoutTearDown, checker=renormalizing.RENormalizing([ zc.buildout.testing.normalize_path, zc.buildout.testing.normalize_endings, zc.buildout.testing.normalize_script, zc.buildout.testing.normalize_egg_py, zc.buildout.tests.hide_distribute_additions, hide_zip_safe_message, (re.compile(r'zc.buildout-version = >=\S+, <2dev'), ''), (re.compile(r"Installing 'zc.buildout >=\S+, <2dev"), ''), (re.compile('__buildout_signature__ = recipes-\S+'), '__buildout_signature__ = recipes-SSSSSSSSSSS'), (re.compile('executable = [\S ]+python\S*', re.I), 'executable = python'), (re.compile('[-d] (setuptools|distribute)-\S+[.]egg'), 'setuptools.egg'), (re.compile('zc.buildout(-\S+)?[.]egg(-link)?'), 'zc.buildout.egg'), (re.compile('creating \S*setup.cfg'), 'creating setup.cfg'), (re.compile('hello\%ssetup' % os.path.sep), 'hello/setup'), (re.compile('Picked: (\S+) = \S+'), 'Picked: \\1 = V.V'), (re.compile(r'We have a develop egg: zc.buildout (\S+)'), 'We have a develop egg: zc.buildout X.X.'), (re.compile(r'\\[\\]?'), '/'), (re.compile('WindowsError'), 'OSError'), (re.compile(r'\[Error \d+\] Cannot create a file ' r'when that file already exists: '), '[Errno 17] File exists: ' ), (re.compile('distribute'), 'setuptools'), (re.compile('Got zc.recipe.egg \S+'), 'Got zc.recipe.egg'), ]) ), doctest.DocFileSuite( 'runsetup.txt', 'repeatable.txt', 'setup.txt', setUp=zc.buildout.testing.buildoutSetUp, tearDown=zc.buildout.testing.buildoutTearDown, checker=renormalizing.RENormalizing([ zc.buildout.testing.normalize_path, zc.buildout.testing.normalize_endings, zc.buildout.testing.normalize_script, zc.buildout.testing.normalize_egg_py, zc.buildout.tests.hide_distribute_additions, hide_zip_safe_message, (re.compile(r"Installing 'zc.buildout >=\S+, <2dev"), ''), (re.compile( r"Getting distribution for 'zc.buildout >=\S+, <2dev"), ''), (re.compile('__buildout_signature__ = recipes-\S+'), '__buildout_signature__ = recipes-SSSSSSSSSSS'), (re.compile('executable = [\S ]+python\S*', re.I), 'executable = python'), (re.compile('[-d] (setuptools|distribute)-\S+[.]egg'), 'setuptools.egg'), (re.compile('zc.buildout(-\S+)?[.]egg(-link)?'), 'zc.buildout.egg'), (re.compile('creating \S*setup.cfg'), 'creating setup.cfg'), (re.compile('hello\%ssetup' % os.path.sep), 'hello/setup'), (re.compile('Picked: (\S+) = \S+'), 'Picked: \\1 = V.V'), (re.compile(r'We have a develop egg: zc.buildout (\S+)'), 'We have a develop egg: zc.buildout X.X.'), (re.compile(r'\\[\\]?'), '/'), (re.compile('WindowsError'), 'OSError'), (re.compile(r'\[Error \d+\] Cannot create a file ' r'when that file already exists: '), '[Errno 17] File exists: ' ), (re.compile('distribute'), 'setuptools'), ]) ), doctest.DocFileSuite( 'debugging.txt', setUp=zc.buildout.testing.buildoutSetUp, tearDown=zc.buildout.testing.buildoutTearDown, checker=renormalizing.RENormalizing([ zc.buildout.testing.normalize_path, zc.buildout.testing.normalize_endings, zc.buildout.tests.hide_distribute_additions, (re.compile(r'\S+buildout.py'), 'buildout.py'), (re.compile(r'line \d+'), 'line NNN'), (re.compile(r'py\(\d+\)'), 'py(NNN)'), ]) ), doctest.DocFileSuite( 'update.txt', setUp=updateSetup, tearDown=zc.buildout.testing.buildoutTearDown, checker=renormalizing.RENormalizing([ zc.buildout.testing.normalize_path, zc.buildout.testing.normalize_endings, zc.buildout.testing.normalize_script, 
zc.buildout.testing.normalize_egg_py, normalize_bang, zc.buildout.tests.hide_distribute_additions, (re.compile(r"Installing 'zc.buildout >=\S+, <2dev"), ''), (re.compile( r"Getting distribution for 'zc.buildout>=\S+, *<2dev"), ''), (re.compile('99[.]99'), 'NINETYNINE.NINETYNINE'), (re.compile('(zc.buildout|setuptools)-\d+[.]\d+\S*' '-py\d.\d.egg'), '\\1.egg'), (re.compile('distribute-\d+[.]\d+\S*' '-py\d.\d.egg'), 'setuptools.egg'), (re.compile('(zc.buildout|setuptools)( version)? \d+[.]\d+\S*'), '\\1 V.V'), (re.compile('distribute( version)? \d+[.]\d+\S*'), 'setuptools V.V'), (re.compile('[-d] (setuptools|distribute)'), '- setuptools'), (re.compile('distribute'), 'setuptools'), (re.compile("\nUnused options for buildout: " "'(distribute|setuptools)\-version'\."), '') ]) ), doctest.DocFileSuite( 'easy_install.txt', 'downloadcache.txt', 'dependencylinks.txt', 'allowhosts.txt', 'unzip.txt', 'upgrading_distribute.txt', setUp=easy_install_SetUp, tearDown=zc.buildout.testing.buildoutTearDown, checker=renormalizing.RENormalizing([ zc.buildout.testing.normalize_path, zc.buildout.testing.normalize_endings, zc.buildout.testing.normalize_script, zc.buildout.testing.normalize_egg_py, normalize_bang, hide_first_index_page_message, zc.buildout.tests.hide_distribute_additions, (re.compile('extdemo[.]pyd'), 'extdemo.so'), (re.compile('[-d] (setuptools|distribute)-\S+[.]egg'), 'setuptools.egg'), (re.compile(r'\\[\\]?'), '/'), (re.compile(r'\#!\S+\bpython\S*'), '#!/usr/bin/python'), # Normalize generate_script's Windows interpreter to UNIX: (re.compile(r'\nimport subprocess\n'), '\n'), (re.compile('subprocess\\.call\\(argv, env=environ\\)'), 'os.execve(sys.executable, argv, environ)'), (re.compile('distribute'), 'setuptools'), # Distribute unzips eggs by default. (re.compile('\- demoneeded'), 'd demoneeded'), ]+(sys.version_info < (2, 5) and [ (re.compile('.*No module named runpy.*', re.S), ''), (re.compile('.*usage: pdb.py scriptfile .*', re.S), ''), (re.compile('.*Error: what does not exist.*', re.S), ''), ] or [])), ), doctest.DocFileSuite( 'download.txt', 'extends-cache.txt', setUp=easy_install_SetUp, tearDown=zc.buildout.testing.buildoutTearDown, optionflags=doctest.NORMALIZE_WHITESPACE | doctest.ELLIPSIS, checker=renormalizing.RENormalizing([ (re.compile(' at -?0x[^>]+'), ''), (re.compile('http://localhost:[0-9]{4,5}/'), 'http://localhost/'), (re.compile('[0-9a-f]{32}'), ''), zc.buildout.testing.normalize_path, ]), ), doctest.DocTestSuite( setUp=easy_install_SetUp, tearDown=zc.buildout.testing.buildoutTearDown, checker=renormalizing.RENormalizing([ zc.buildout.testing.normalize_path, zc.buildout.testing.normalize_endings, zc.buildout.testing.normalize_script, zc.buildout.testing.normalize_egg_py, zc.buildout.tests.hide_distribute_additions, hide_first_index_page_message, (re.compile(r"Installing 'zc.buildout >=\S+, <2dev"), 'Installing '), (re.compile(r'^(\w+\.)*(Missing\w+: )'), '\2'), (re.compile("buildout: Running \S*setup.py"), 'buildout: Running setup.py'), (re.compile('(setuptools|distribute)-\S+-'), 'setuptools.egg'), (re.compile('zc.buildout-\S+-'), 'zc.buildout.egg'), (re.compile('File "\S+one.py"'), 'File "one.py"'), (re.compile(r'We have a develop egg: (\S+) (\S+)'), r'We have a develop egg: \1 V'), (re.compile('Picked: (setuptools|distribute) = \S+'), 'Picked: setuptools = V'), (re.compile(r'\\[\\]?'), '/'), (re.compile( '-q develop -mxN -d "/sample-buildout/develop-eggs'), '-q develop -mxN -d /sample-buildout/develop-eggs' ), (re.compile(r'^[*]...'), '...'), # for 92891 # 
bootstrap_crashes_with_egg_recipe_in_buildout_section (re.compile(r"Unused options for buildout: 'eggs' 'scripts'\."), "Unused options for buildout: 'scripts' 'eggs'."), (re.compile('distribute'), 'setuptools'), # Distribute unzips eggs by default. (re.compile('\- demoneeded'), 'd demoneeded'), ]), ), zc.buildout.rmtree.test_suite(), doctest.DocFileSuite( 'windows.txt', setUp=zc.buildout.testing.buildoutSetUp, tearDown=zc.buildout.testing.buildoutTearDown, checker=renormalizing.RENormalizing([ zc.buildout.testing.normalize_path, zc.buildout.testing.normalize_endings, zc.buildout.testing.normalize_script, zc.buildout.testing.normalize_egg_py, zc.buildout.tests.hide_distribute_additions, (re.compile('__buildout_signature__ = recipes-\S+'), '__buildout_signature__ = recipes-SSSSSSSSSSS'), (re.compile('[-d] setuptools-\S+[.]egg'), 'setuptools.egg'), (re.compile('zc.buildout(-\S+)?[.]egg(-link)?'), 'zc.buildout.egg'), (re.compile('creating \S*setup.cfg'), 'creating setup.cfg'), (re.compile('hello\%ssetup' % os.path.sep), 'hello/setup'), (re.compile('Picked: (\S+) = \S+'), 'Picked: \\1 = V.V'), (re.compile(r'We have a develop egg: zc.buildout (\S+)'), 'We have a develop egg: zc.buildout X.X.'), (re.compile(r'\\[\\]?'), '/'), (re.compile('WindowsError'), 'OSError'), (re.compile(r'\[Error 17\] Cannot create a file ' r'when that file already exists: '), '[Errno 17] File exists: ' ), ]) ), doctest.DocFileSuite( 'testing_bugfix.txt', checker=renormalizing.RENormalizing([ # Python 2.7 (re.compile( re.escape( 'testrunner.logsupport.NullHandler instance at')), 'testrunner.logsupport.NullHandler object at'), (re.compile(re.escape('logging.StreamHandler instance at')), 'logging.StreamHandler object at'), ]) ), ] if zc.buildout.testing.script_in_shebang: test_suite.append(zc.buildout.testselectingpython.test_suite()) # adding bootstrap.txt doctest to the suite # only if bootstrap.py is present bootstrap_py = os.path.join( os.path.dirname( os.path.dirname( os.path.dirname( os.path.dirname(zc.buildout.__file__) ) ) ), 'bootstrap', 'bootstrap.py') if os.path.exists(bootstrap_py): test_suite.append(doctest.DocFileSuite( 'bootstrap.txt', 'bootstrap1.txt', 'bootstrap_cl_settings.test', setUp=bootstrapSetup, tearDown=zc.buildout.testing.buildoutTearDown, checker=renormalizing.RENormalizing([ zc.buildout.testing.normalize_path, zc.buildout.testing.normalize_endings, zc.buildout.testing.normalize_script, zc.buildout.testing.normalize_egg_py, normalize_bang, (re.compile('Downloading.*setuptools.*egg\n'), ''), (re.compile('options:'), 'Options:'), (re.compile('usage:'), 'Usage:'), ]), )) test_suite.append(doctest.DocFileSuite( 'virtualenv.txt', setUp=easy_install_SetUp, tearDown=zc.buildout.testing.buildoutTearDown, checker=renormalizing.RENormalizing([ zc.buildout.testing.normalize_path, zc.buildout.testing.normalize_endings, zc.buildout.testing.normalize_script, zc.buildout.testing.normalize_egg_py, zc.buildout.tests.hide_distribute_additions, (re.compile('(setuptools|distribute)-\S+-'), 'setuptools.egg'), (re.compile('zc.buildout-\S+-'), 'zc.buildout.egg'), (re.compile(re.escape('#!"/executable_buildout/bin/py"')), '#!/executable_buildout/bin/py'), # Windows. (re.compile(re.escape('/broken_s/')), '/broken_S/'), # Windows. 
]), )) return unittest.TestSuite(test_suite) zc.buildout-1.7.1/src/zc/buildout/testselectingpython.py0000644000076500007650000000703712111414155023101 0ustar jimjim00000000000000############################################################################## # # Copyright (c) 2006 Zope Foundation and Contributors. # All Rights Reserved. # # This software is subject to the provisions of the Zope Public License, # Version 2.1 (ZPL). A copy of the ZPL should accompany this distribution. # THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED # WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED # WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS # FOR A PARTICULAR PURPOSE. # ############################################################################## import os, re, subprocess, sys, textwrap, unittest, doctest from zope.testing import renormalizing import zc.buildout.tests import zc.buildout.testing if sys.version_info[:2] == (2, 5): other_version = "2.6" else: other_version = "2.5" __test__ = dict( test_selecting_python_via_easy_install= """\ We can specify a specific Python executable. >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], ... index='http://www.python.org/pypi/', ... always_unzip=True, executable=other_executable) >>> ls(dest) d demo-0.3-py%(other_version)s.egg d demoneeded-1.1-py%(other_version)s.egg """ % dict(other_version=other_version) ) def multi_python(test): other_executable = zc.buildout.testing.find_python(other_version) command = textwrap.dedent('''\ try: import setuptools except ImportError: import sys sys.exit(1) ''') env = dict(os.environ) env.pop('PYTHONPATH', None) if subprocess.call([other_executable, '-c', command], env=env): # the other executable does not have setuptools. Get setuptools. # We will do this using the same tools we are testing, for better or # worse. Alternatively, we could try using bootstrap. 
executable_dir = test.globs['tmpdir']('executable_dir') executable_parts = os.path.join(executable_dir, 'parts') test.globs['mkdir'](executable_parts) ws = zc.buildout.easy_install.install( ['setuptools'], executable_dir, index='http://www.python.org/pypi/', always_unzip=True, executable=other_executable) zc.buildout.easy_install.sitepackage_safe_scripts( executable_dir, ws, other_executable, executable_parts, reqs=['setuptools'], interpreter='py') original_executable = other_executable other_executable = os.path.join(executable_dir, 'py') assert not subprocess.call( [other_executable, '-c', command], env=env), ( 'test set up failed') sample_eggs = test.globs['tmpdir']('sample_eggs') os.mkdir(os.path.join(sample_eggs, 'index')) test.globs['sample_eggs'] = sample_eggs zc.buildout.tests.create_sample_eggs(test, executable=other_executable) test.globs['other_executable'] = other_executable def setup(test): zc.buildout.testing.buildoutSetUp(test) multi_python(test) zc.buildout.tests.add_source_dist(test) test.globs['link_server'] = test.globs['start_server']( test.globs['sample_eggs']) def test_suite(): return doctest.DocTestSuite( setUp=setup, tearDown=zc.buildout.testing.buildoutTearDown, checker=renormalizing.RENormalizing([ (re.compile('setuptools-\S+-py%s.egg' % other_version), 'setuptools-V-py%s.egg' % other_version), ]), ) zc.buildout-1.7.1/src/zc/buildout/unzip.txt0000644000076500007650000000244712111414155020316 0ustar jimjim00000000000000Always unzipping eggs ===================== By default, zc.buildout doesn't unzip zip-safe eggs. >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... find-links = %(link_server)s ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = demo ... ''' % globals()) >>> _ = system(buildout) >>> ls('eggs') - demo-0.4c1-py2.4.egg - demoneeded-1.2c1-py2.4.egg d setuptools-0.6c8-py2.4.egg - zc.buildout.egg-link This follows the policy followed by setuptools itself. Experience shows this policy to to be inconvenient. Zipped eggs make debugging more difficult and often import more slowly. You can include an unzip option in the buildout section to change the default unzipping policy. >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... find-links = %(link_server)s ... unzip = true ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = demo ... ''' % globals()) >>> import os >>> for name in os.listdir('eggs'): ... if name.startswith('demo'): ... remove('eggs', name) >>> _ = system(buildout) >>> ls('eggs') d demo-0.4c1-py2.4.egg d demoneeded-1.2c1-py2.4.egg d setuptools-0.6c8-py2.4.egg - zc.buildout.egg-link zc.buildout-1.7.1/src/zc/buildout/update.txt0000644000076500007650000002330512111414155020427 0ustar jimjim00000000000000Automatic Buildout Updates ========================== When a buildout is run, one of the first steps performed is to check for updates to either zc.buildout or setuptools. To demonstrate this, we've created some "new releases" of buildout and setuptools in a new_releases folder: >>> ls(new_releases) d setuptools - setuptools-1.99.99-py2.4.egg d zc.buildout - zc.buildout-1.100.0b1-pyN.N.egg - zc.buildout-1.99.99-py2.4.egg - zc.buildout-2.0.0-pyN.N.egg Let's update the sample buildout.cfg to look in this area: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = show-versions ... develop = showversions ... ... [show-versions] ... recipe = showversions ... 
""" % dict(new_releases=new_releases)) We'll also include a recipe that echos the versions of setuptools and zc.buildout used: >>> mkdir(sample_buildout, 'showversions') >>> write(sample_buildout, 'showversions', 'showversions.py', ... """ ... import pkg_resources ... ... class Recipe: ... ... def __init__(self, buildout, name, options): ... pass ... ... def install(self): ... for project in 'zc.buildout', 'setuptools': ... req = pkg_resources.Requirement.parse(project) ... print project, pkg_resources.working_set.find(req).version ... return () ... update = install ... """) >>> write(sample_buildout, 'showversions', 'setup.py', ... """ ... from setuptools import setup ... ... setup( ... name = "showversions", ... entry_points = {'zc.buildout': ['default = showversions:Recipe']}, ... ) ... """) Now if we run the buildout, the buildout will upgrade itself to the new versions found in new releases: >>> print system(buildout), Getting distribution for 'zc.buildout>=1.99, <2dev'. Got zc.buildout 1.99.99. Getting distribution for 'setuptools'. Got setuptools 1.99.99. Upgraded: zc.buildout version 1.99.99, setuptools version 1.99.99; restarting. Generated script '/sample-buildout/bin/buildout'. Develop: '/sample-buildout/showversions' Installing show-versions. zc.buildout 1.99.99 setuptools 1.99.99 Notice that, even though we have a newer beta version of zc.buildout available, the final "1.99.99" was selected. If you want to get non-final versions, specify a specific version in your buildout's versions section, you typically want to use the --accept-buildout-test-releases option to the bootstrap script, which internally uses the ``accept-buildout-test-releases = true`` discussed below. Also, even thought there's a later final version, buildout won't upgrade itself past version 1. Our buildout script's site.py has been updated to use the new eggs: >>> cat(sample_buildout, 'parts', 'buildout', 'site.py') ... # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS "... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" setuptools_path = '/sample-buildout/eggs/setuptools-1.99.99-pyN.N.egg' sys.path.append(setuptools_path) known_paths.add(os.path.normcase(setuptools_path)) import pkg_resources buildout_paths = [ '/sample-buildout/eggs/zc.buildout-1.99.99-pyN.N.egg', '/sample-buildout/eggs/setuptools-1.99.99-pyN.N.egg' ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) pkg_resources.working_set.add_entry(sitedir) sys.__egginsert = len(buildout_paths) # Support setuptools. original_paths = [ ... ] for path in original_paths: if path == setuptools_path or path not in known_paths: addsitedir(path, known_paths) return known_paths ... Now, let's recreate the sample buildout. If we specify constraints on the versions of zc.buildout and setuptools (or distribute) to use, running the buildout will install earlier versions of these packages: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = show-versions ... develop = showversions ... zc.buildout-version = < 1.99 ... setuptools-version = < 1.99 ... distribute-version = < 1.99 ... ... [show-versions] ... recipe = showversions ... """ % dict(new_releases=new_releases)) Now we can see that we actually "upgrade" to an earlier version. 
>>> print system(buildout), Upgraded: zc.buildout version 1.0.0, setuptools version 0.6; restarting. Develop: '/sample-buildout/showversions' Updating show-versions. zc.buildout 1.0.0 setuptools 0.6 There are a number of cases, described below, in which the updates don't happen. We won't upgrade in offline mode: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = show-versions ... develop = showversions ... ... [show-versions] ... recipe = showversions ... """ % dict(new_releases=new_releases)) >>> print system(buildout+' -o'), Develop: '/sample-buildout/showversions' Updating show-versions. zc.buildout 1.0.0 setuptools 0.6 Or in non-newest mode: >>> print system(buildout+' -N'), Develop: '/sample-buildout/showversions' Updating show-versions. zc.buildout 1.0.0 setuptools 0.6 We also won't upgrade if the buildout script being run isn't in the buildout's bin directory. To see this we'll create a new buildout directory: >>> sample_buildout2 = tmpdir('sample_buildout2') >>> write(sample_buildout2, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = ... """ % dict(new_releases=new_releases)) >>> cd(sample_buildout2) >>> print system(buildout), Creating directory '/sample_buildout2/bin'. Creating directory '/sample_buildout2/parts'. Creating directory '/sample_buildout2/eggs'. Creating directory '/sample_buildout2/develop-eggs'. Getting distribution for 'zc.buildout>=1.99, <2dev'. Got zc.buildout 1.99.99. Getting distribution for 'setuptools'. Got setuptools 1.99.99. Not upgrading because not running a local buildout command. >>> ls('bin') As mentioned above, the ``accept-buildout-test-releases = true`` means that newer non-final versions of these dependencies are preferred. Typically users are not expected to actually manipulate this value. Instead, the bootstrap script creates a buildout buildout script that passes in the value as a command line override. This then results in the buildout script being rewritten to remember the decision. We'll mimic this by passing the argument actually in the command line. >>> cd(sample_buildout) >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = show-versions ... develop = showversions ... ... [show-versions] ... recipe = showversions ... """ % dict(new_releases=new_releases)) >>> print system(buildout + ... ' buildout:accept-buildout-test-releases=true'), ... # doctest: +NORMALIZE_WHITESPACE Getting distribution for 'zc.buildout>=1.99, <2dev'. Got zc.buildout 1.100.0b1. Upgraded: zc.buildout version 1.100.0b1, setuptools version 1.99.99; restarting. Generated script '/sample-buildout/bin/buildout'. NOTE: Accepting early releases of build system packages. Rerun bootstrap without --accept-buildout-test-releases (-t) to return to default behavior. Develop: '/sample-buildout/showversions' Updating show-versions. zc.buildout 1.100.0b1 setuptools 1.99.99 The buildout script shows the change. >>> buildout_script = join(sample_buildout, 'bin', 'buildout') >>> import sys >>> if sys.platform.startswith('win'): ... buildout_script += '-script.py' >>> print open(buildout_script).read() # doctest: +ELLIPSIS #... sys.argv.insert(1, 'buildout:accept-buildout-test-releases=true') print ('NOTE: Accepting early releases of build system packages. 
Rerun ' 'bootstrap without --accept-buildout-test-releases (-t) to return to ' 'default behavior.') ... If the update process for buildout or setuptools fails the error should be caught (displaying a warning) and the rest of the buildout update process should continue. >>> version = sys.version_info[0:2] >>> egg = new_releases + '/zc.buildout-1.99.99-py%s.%s.egg' % version >>> copy_egg = new_releases + '/zc.buildout-1.1000-py%s.%s.egg' % version >>> import shutil >>> shutil.copy(egg, copy_egg) Create a broken egg >>> mkdir(sample_buildout, 'broken') >>> write(sample_buildout, 'broken', 'setup.py', "import broken_egg\n") >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = show-versions ... develop = ... broken ... ... [broken] ... recipe = zc.recipe.egg ... eggs = broken ... """ % dict(new_releases=new_releases)) >>> import subprocess >>> subprocess.call([buildout]) 1 zc.buildout-1.7.1/src/zc/buildout/upgrading_distribute.txt0000644000076500007650000000446012111414155023364 0ustar jimjim00000000000000Installing setuptools/distribute -------------------------------- Some initial test setup: >>> import sys >>> import zc.buildout >>> dest = tmpdir('sample-install') Setuptools (0.6something) is packaged as an ``.egg``. So when installing it, the egg is downloaded and used. Distribute is packaged as a tarball, which makes an easy_install call necessary. In older versions of buildout, the ``_call_easy_install()`` method would call ``_get_dist()`` to get hold of the setuptools path for calling easy_install. When an updated "distribute" was found, this would try an install again, leading to an infinite recursion. The solution is to just use the setuptools location found at import time, like happens with the buildout and setuptools location that is inserted in scripts' paths. We test this corner case by patching the ``_get_dist()`` call: >>> def mock_get_dist(requirement, ws, always_unzip): ... raise RuntimeError("We should not get called") When installing setuptools itself, we expect the "Getting dist" message not to be printed. We call ``_call_easy_install()`` directly and get an error because of a non-existing tarball, but that's the OK for this corner case test: we only want to test that ``_get_dist()`` isn't getting called: >>> class MockDist(object): ... def __str__(self): ... return 'nonexisting.tgz' ... @property ... def project_name(self): ... # Testing corner case: there *is* actually ... # a newer setuptools package on pypi than we ... # are running with, so it really installs it ... # and compares project_name. We're past the ... # point that we're testing, so we just raise ... # the normally expected error. ... raise zc.buildout.UserError( ... "Couldn't install: nonexisting.tgz") >>> dist = MockDist() >>> installer = zc.buildout.easy_install.Installer( ... dest=dest, ... links=[link_server], ... index=link_server+'index/', ... executable=sys.executable, ... always_unzip=True) >>> installer._get_dist = mock_get_dist >>> installer._call_easy_install('setuptools', None, dest, dist) Traceback (most recent call last): ... UserError: Couldn't install: nonexisting.tgz zc.buildout-1.7.1/src/zc/buildout/virtualenv.txt0000644000076500007650000002324312111414155021345 0ustar jimjim00000000000000Version 1.5.0 of buildout (and higher) provides the ability to use buildout directly with a system Python if you use z3c.recipe.scripts or other isolation-aware recipes that use the sitepackage_safe_scripts function. 
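For orientation, the kind of configuration meant here can be sketched as follows; the part name and the ``demo`` egg are placeholders only, and a working variant of this setup is exercised against a specially prepared Python later in this file::

    [buildout]
    parts = scripts

    [scripts]
    recipe = z3c.recipe.scripts
    eggs = demo
    interpreter = py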
Some people use virtualenv to provide similar functionality. Unfortunately, a problem with the virtualenv executable as of this writing means that -S will not work properly with it (see https://bugs.launchpad.net/virtualenv/+bug/572545). This breaks buildout's approach to providing isolation. Because of this, if buildout detects an executable with a broken -S option, it will revert to its pre-1.5.0 behavior. If buildout has been asked to provide isolation, it will warn the user that isolation will not be provided by buildout, but proceed. This should give full backwards compatibility to virtualenv users. The only minor annoyance in the future may be recipes that explicitly use the new buildout functionality to provide isolation: as described above, the builds will proceed, but users will receive warnings that buildout is not providing isolation itself. The warnings themselves can be squelched when running bin/buildout with the ``-s`` option or with a lower verbosity than usual (e.g., one or more ``-q`` options). For tests, then, we can examine several things. We'll focus on four. - Running bootstrap with an executable broken in this way will not try to do any -S tricks. - Running sitepackage_safe_scripts with a virtualenv will create an old-style script. This will affect the bin/buildout script that is created, for instance. If the sitepackage_safe_scripts function is asked to provide isolation under these circumstances, it will warn that isolation will not be available, but still create the desired script. - Using the easy_install Installer or install or build functions and trying to request isolation will generate a warning and then the isolation request will be ignored as it proceeds. - Passing -s (or -q) to the bin/buildout script will squelch warnings. Testing these involves first creating a Python that exhibits the same behavior as the problematic one we care about from virtualenv. Let's do that first. >>> import os, sys >>> from zc.buildout.easy_install import _safe_arg >>> py_path, site_packages_path = make_py() >>> if sys.platform == 'win32': ... py_script_path = py_path + '-script.py' ... else: ... py_script_path = py_path ... >>> py_file = open(py_script_path) >>> py_lines = py_file.readlines() >>> py_file.close() >>> py_file = open(py_script_path, 'w') >>> extra = '''\ ... new_argv = argv[:1] ... for ix, val in enumerate(argv[1:]): ... if val.startswith('--'): ... new_argv.append(val) ... if val.startswith('-') and len(val) > 1: ... if 'S' in val: ... val = val.replace('S', '') ... environ['BROKEN_DASH_S'] = 'Y' ... if val != '-': ... new_argv.append(val) ... if 'c' in val: ... new_argv.extend(argv[ix+2:]) ... break ... else: ... new_argv.extend(argv[ix+1:]) ... argv = new_argv ... ''' >>> for line in py_lines: ... py_file.write(line) ... if line.startswith('environ = os.environ.copy()'): ... py_file.write(extra) ... print 'Rewritten.' ... Rewritten. >>> py_file.close() >>> sitecustomize_path = join(os.path.dirname(site_packages_path), ... 'parts', 'py', 'sitecustomize.py') >>> sitecustomize_file = open(sitecustomize_path, 'a') >>> sitecustomize_file.write(''' ... import os, sys ... sys.executable = %r ... if 'BROKEN_DASH_S' in os.environ: ... class ImportHook: ... site = None ... ... @classmethod ... def find_module(klass, fullname, path=None): ... if klass.site is None and 'site' in sys.modules: ... # Pop site out of sys.modules. This will be a ... # close-enough approximation of site not being ... # loaded for our tests--it lets us provoke the ... 
# right errors when the fixes are absent, and ... # works well enough when the fixes are present. ... klass.site = sys.modules.pop('site') ... if fullname == 'ConfigParser': ... raise ImportError(fullname) ... elif fullname == 'site': ... # Keep the site module from being processed twice. ... return klass ... ... @classmethod ... def load_module(klass, fullname): ... if fullname == 'site': ... return klass.site ... raise ImportError(fullname) ... ... sys.meta_path.append(ImportHook) ... ''' % (py_path,)) >>> sitecustomize_file.close() >>> print call_py( ... _safe_arg(py_path), ... "import ConfigParser") >>> print 'X'; print call_py( ... _safe_arg(py_path), ... "import ConfigParser", ... '-S') # doctest: +ELLIPSIS X...Traceback (most recent call last): ... ImportError: No module named ConfigParser >>> from zc.buildout.easy_install import _has_broken_dash_S >>> _has_broken_dash_S(py_path) True Well, that was ugly, but it seems to have done the trick. The executable represented by py_path has the same problematic characteristic as the virtualenv one: -S results in a Python that does not allow the import of some packages from the standard library. We'll test with this. First, let's try running bootstrap. >>> from os.path import dirname, join >>> import zc.buildout >>> bootstrap_py = join( ... dirname( ... dirname( ... dirname( ... dirname(zc.buildout.__file__) ... ) ... ) ... ), ... 'bootstrap', 'bootstrap.py') >>> broken_S_buildout = tmpdir('broken_S') >>> os.chdir(broken_S_buildout) >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = ... ''') >>> write('bootstrap.py', open(bootstrap_py).read()) >>> print 'X'; print system( ... _safe_arg(py_path)+' '+ ... 'bootstrap.py'); print 'X' # doctest: +ELLIPSIS X... Generated script '/broken_S/bin/buildout'. ... If bootstrap didn't look out for a broken -S, that would have failed. Moreover, take a look at bin/buildout: >>> cat('bin', 'buildout') # doctest: +NORMALIZE_WHITESPACE #!/executable_buildout/bin/py import sys sys.path[0:0] = [ '/broken_S/eggs/setuptools-0.0-pyN.N.egg', '/broken_S/eggs/zc.buildout-0.0-pyN.N.egg', ] import zc.buildout.buildout if __name__ == '__main__': sys.exit(zc.buildout.buildout.main()) That's the old-style buildout script: no changes for users with this issue. Of course, they don't get the new features either, presumably because they don't need or want them. This means that if they use a recipe that tries to use a new feature, the behavior needs to degrade gracefully. Here's an example. We'll switch to another buildout in which it is easier to use local dev versions of zc.buildout and z3c.recipe.scripts. >>> os.chdir(dirname(dirname(buildout))) >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... find-links = %(link_server)s ... include-site-packages = false ... ... [primed_python] ... executable = %(py_path)s ... ... [eggs] ... recipe = z3c.recipe.scripts ... python = primed_python ... interpreter = py ... eggs = demo ... ''' % globals()) >>> print system(buildout), # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS Installing eggs. Getting distribution for 'demo'. Got demo 0.4c1. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. Generated script '/sample-buildout/bin/demo'. Generated interpreter '/sample-buildout/bin/py'. ...UserWarning: Buildout has been asked to exclude or limit site-packages so that builds can be repeatable when using a system Python. 
However, the chosen Python executable has a broken implementation of -S (see https://bugs.launchpad.net/virtualenv/+bug/572545 for an example problem) and this breaks buildout's ability to isolate site-packages. If the executable already has a clean site-packages (e.g., using virtualenv's ``--no-site-packages`` option) you may be getting equivalent repeatability. To silence this warning, use the -s argument to the buildout script. Alternatively, use a Python executable with a working -S (such as a standard Python binary). warnings.warn(BROKEN_DASH_S_WARNING) So, it did what we asked as best it could, but gave a big warning. If you don't want those warnings for those particular recipes that use the new features, you can use the "-s" option to squelch the warnings. >>> print system(buildout + ' -s'), Updating eggs. A lower verbosity (one or more -q options) also quiets the warning. >>> print system(buildout + ' -q'), Notice that, as we saw before with bin/buildout, the generated scripts are old-style, because the new-style feature gracefully degrades to the previous implementation when it encounters an executable with a broken dash-S. >>> print 'X'; cat('bin', 'py') # doctest: +ELLIPSIS X... import sys sys.path[0:0] = [ '/sample-buildout/eggs/demo-0.4c1-pyN.N.egg', '/sample-buildout/eggs/demoneeded-1.2c1-pyN.N.egg', ] ... zc.buildout-1.7.1/src/zc/buildout/windows.txt0000644000076500007650000000356512111414155020645 0ustar jimjim00000000000000zc.buildout on MS-Windows ========================= Certain aspects of every software project are dependent on the operating system used. The same - of course - applies to zc.buildout. To test that Windows doesn't get in the way, we'll test some system dependent aspects. The following recipe will create a read-only file which shutil.rmtree can't delete. >>> mkdir('recipe') >>> write('recipe', 'recipe.py', ... ''' ... import os ... class Recipe: ... def __init__(self, buildout, name, options): ... self.location = os.path.join( ... buildout['buildout']['parts-directory'], ... name) ... ... def install(self): ... print "can't remove read only files" ... if not os.path.exists (self.location): ... os.makedirs (self.location) ... ... name = os.path.join (self.location, 'readonly.txt') ... open (name, 'w').write ('this is a read only file') ... os.chmod (name, 0400) ... return () ... ... update = install ... ''') >>> write('recipe', 'setup.py', ... ''' ... from setuptools import setup ... setup(name='spam', version='1', py_modules=['recipe'], ... entry_points={'zc.buildout': ['default = recipe:Recipe']}, ... ) ... ''') >>> write('recipe', 'README', '') >>> print system(buildout+' setup recipe bdist_egg'), # doctest: +ELLIPSIS Running setup script 'recipe/setup.py'. ... and we'll configure a buildout to use it: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = foo ... find-links = %s ... ... [foo] ... recipe = spam ... ''' % join('recipe', 'dist')) >>> print system(buildout), Getting distribution for 'spam'. Got spam 1. Installing foo. 
can't remove read only files zc.buildout-1.7.1/src/zc.buildout.egg-info/0000755000076500007650000000000012111415075020074 0ustar jimjim00000000000000zc.buildout-1.7.1/src/zc.buildout.egg-info/dependency_links.txt0000644000076500007650000000000112111415075024142 0ustar jimjim00000000000000 zc.buildout-1.7.1/src/zc.buildout.egg-info/entry_points.txt0000644000076500007650000000015612111415075023374 0ustar jimjim00000000000000 [console_scripts] buildout = zc.buildout.buildout:main [zc.buildout] debug = zc.buildout.testrecipes:Debug zc.buildout-1.7.1/src/zc.buildout.egg-info/namespace_packages.txt0000644000076500007650000000000312111415075024420 0ustar jimjim00000000000000zc zc.buildout-1.7.1/src/zc.buildout.egg-info/not-zip-safe0000644000076500007650000000000112111414176022323 0ustar jimjim00000000000000 zc.buildout-1.7.1/src/zc.buildout.egg-info/PKG-INFO0000644000076500007650000124540112111415075021200 0ustar jimjim00000000000000Metadata-Version: 1.1 Name: zc.buildout Version: 1.7.1 Summary: System for managing development buildouts Home-page: http://pypi.python.org/pypi/zc.buildout Author: Jim Fulton Author-email: jim@zope.com License: ZPL 2.1 Description: ******** Buildout ******** .. contents:: The Buildout project provides support for creating applications, especially Python applications. It provides tools for assembling applications from multiple parts, Python or otherwise. An application may actually contain multiple programs, processes, and configuration settings. The word "buildout" refers to a description of a set of parts and the software to create and assemble them. It is often used informally to refer to an installed system based on a buildout definition. For example, if we are creating an application named "Foo", then "the Foo buildout" is the collection of configuration and application-specific software that allows an instance of the application to be created. We may refer to such an instance of the application informally as "a Foo buildout". To get a feel for some of the things you might use buildouts for, see the `Buildout examples`_. To learn more about using buildouts, see `Detailed Documentation`_. To see screencasts, talks, useful links and more documentation, visit the `Buildout website `_. Recipes ******* Existing recipes include: `zc.recipe.egg `_ The egg recipe installs one or more eggs, with their dependencies. It installs their console-script entry points with the needed eggs included in their paths. It is suitable for use with a "clean" Python: one without packages installed in site-packages. `z3c.recipe.scripts `_ Like zc.recipe.egg, this recipe builds interpreter scripts and entry point scripts based on eggs. It can be used with a Python that has packages installed in site-packages, such as a system Python. The interpreter also has more features than the one offered by zc.recipe.egg. `zc.recipe.testrunner `_ The testrunner recipe creates a test runner script for one or more eggs. `zc.recipe.zope3checkout `_ The zope3checkout recipe installs a Zope 3 checkout into a buildout. `zc.recipe.zope3instance `_ The zope3instance recipe sets up a Zope 3 instance. `zc.recipe.filestorage `_ The filestorage recipe sets up a ZODB file storage for use in a Zope 3 instance created by the zope3instance recipe. Buildout examples ***************** Here are a few examples of what you can do with buildouts. We'll present these as a set of use cases. Try out an egg ============== Sometimes you want to try an egg (or eggs) that someone has released.
You'd like to get a Python interpreter that lets you try things interactively or run sample scripts without having to do path manipulations. If you can and don't mind modifying your Python installation, you could use easy_install; otherwise, you could create a directory somewhere and create a buildout.cfg file in that directory containing:: [buildout] parts = mypython [mypython] recipe = zc.recipe.egg interpreter = mypython eggs = theegg where theegg is the name of the egg you want to try out. Run buildout in this directory. It will create a bin subdirectory that includes a mypython script. If you run mypython without any arguments, you'll get an interactive interpreter with the egg in the path. If you run it with a script and script arguments, the script will run with the egg in its path. Of course, you can specify as many eggs as you want in the eggs option. If the egg provides any scripts (console_scripts entry points), those will be installed in your bin directory too. Work on a package ================= I often work on packages that are managed separately. They don't have scripts to be installed, but I want to be able to run their tests using the `zope.testing test runner `_. In this kind of application, the program to be installed is the test runner. A good example of this is `zc.ngi `_. Here I have a subversion project for the zc.ngi package. The software is in the src directory. The configuration file is very simple:: [buildout] develop = . parts = test [test] recipe = zc.recipe.testrunner eggs = zc.ngi I use the develop option to create a develop egg based on the current directory. I request a test script named "test" using the zc.recipe.testrunner recipe. In the section for the test script, I specify that I want to run the tests in the zc.ngi package. When I check out this project into a new sandbox, I run bootstrap.py to get setuptools and zc.buildout and to create bin/buildout. I run bin/buildout, which installs the test script, bin/test, which I can then use to run the tests. This is probably the most common type of buildout. If I need to run a previous version of zc.buildout, I use the `--version` option of the bootstrap.py script:: $ python bootstrap.py --version 1.1.3 The `zc.buildout project `_ is a slightly more complex example of this type of buildout. Install egg-based scripts ========================= A variation of the `Try out an egg`_ use case is to install scripts into your ~/bin directory (on Unix, of course). My ~/bin directory is a buildout with a configuration file that looks like:: [buildout] parts = foo bar bin-directory = . [foo] ... where foo and bar are packages with scripts that I want available. As I need new scripts, I can add additional sections. The bin-directory option specifies that scripts should be installed into the current directory. Multi-program multi-machine systems =================================== Using an older prototype version of the buildout, we've built a number of systems involving multiple programs, databases, and machines.
One typical example consists of: - Multiple Zope instances - Multiple ZEO servers - An LDAP server - Cache-invalidation and Mail delivery servers - Dozens of add-on packages - Multiple test runners - Multiple deployment modes, including dev, stage, and prod, with prod deployment over multiple servers Parts installed include: - Application software installs, including Zope, ZEO and LDAP software - Add-on packages - Bundles of configuration that define Zope, ZEO and LDAP instances - Utility scripts such as test runners, server-control scripts, cron jobs. Questions and Bug Reporting *************************** Please send questions and comments to the `distutils SIG mailing list `_. Report bugs using the `zc.buildout Launchpad Bug Tracker `_. System Python and zc.buildout 1.5 ********************************* The 1.5 line of zc.buildout introduced a number of changes. Problems ======== As usual, please send questions and comments to the `distutils SIG mailing list `_. Report bugs using the `zc.buildout Launchpad Bug Tracker `_. If problems are keeping you from your work, here's an easy way to revert to the old code temporarily: switch to a custom "emergency" bootstrap script, available from http://svn.zope.org/repos/main/zc.buildout/branches/1.4/bootstrap/bootstrap.py . This customized script will select zc.buildout 1.4.4 by default. zc.buildout 1.4.4 will not upgrade itself unless you explicitly specify a new version. It will also prefer older versions of zc.recipe.egg and some other common recipes. If you have trouble with other recipes, consider using a standard buildout "versions" section to specify older versions of these, as described in the Buildout documentation (http://pypi.python.org/pypi/zc.buildout#repeatable-buildouts-controlling-eggs-used). Working with a System Python ============================ While there are a number of new features available in zc.buildout 1.5, the biggest is that Buildout itself supports usage with a system Python. This can work if you follow a couple of simple rules. 1. Use the new bootstrap.py (available from http://svn.zope.org/*checkout*/zc.buildout/trunk/bootstrap/bootstrap.py). 2. Use buildout recipes that have been upgraded to work with zc.buildout 1.5 and higher. Specifically, they should use ``zc.buildout.easy_install.sitepackage_safe_scripts`` to generate their scripts, if any, rather than ``zc.buildout.easy_install.scripts``. See the `Recipes That Support a System Python`_ section below for more details on recipes that are available as of this writing, and `Updating Recipes to Support a System Python`_ for instructions on how to update a recipe. Note that you should generally only need to update recipes that generate scripts. You can then use ``include-site-packages = false`` and ``exec-sitecustomize = false`` buildout options to eliminate access to your Python's site packages and not execute its sitecustomize file, if it exists, respectively. Alternately, you can use the ``allowed-eggs-from-site-packages`` buildout option as a glob-aware whitelist of eggs that may come from site-packages. This value defaults to "*", accepting all eggs. It's important to note that recipes not upgraded for zc.buildout 1.5.0 should continue to work--just without internal support for a system Python. Using a system Python is inherently fragile. Using a clean, freshly-installed Python without customization in site-packages is more robust and repeatable. 
See some of the regression tests added to the 1.5.0 line for the kinds of issues that you can encounter with a system Python, and see http://pypi.python.org/pypi/z3c.recipe.scripts#including-site-packages-and-sitecustomize for more discussion. However, using a system Python can be very convenient, and the zc.buildout code for this feature has been tested by many users already. Moreover, it has automated tests to exercise the problems that have been encountered and fixed. Many people rely on it. Recipes That Support a System Python ==================================== zc.recipe.egg continues to generate old-style scripts that are not safe for use with a system Python. This was done for backwards compatibility, because it is integral to so many buildouts and used as a dependency of so many other recipes. If you want to generate new-style scripts that do support system Python usage, use z3c.recipe.scripts instead (http://pypi.python.org/pypi/z3c.recipe.scripts). z3c.recipe.scripts has the same script and interpreter generation options as zc.recipe.egg, plus a few more for the new features mentioned above. In the simplest case, you should be able to simply change ``recipe = zc.recipe.egg`` to ``recipe = z3c.recipe.scripts`` in the pertinent sections of your buildout configuration and your generated scripts will work with a system Python. Other updated recipes include zc.recipe.testrunner 1.4.0 and z3c.recipe.tag 0.4.0. Others should be updated soon: see their change documents for details, or see `Updating Recipes to Support a System Python`_ for instructions on how to update recipes yourself. Templates for creating Python scripts with the z3c.recipe.filetemplate recipe can be easily changed to support a system Python. - If you don't care about supporting relative paths, simply using a generated interpreter with the eggs you want should be sufficient, as it was before. For instance, if the interpreter is named "py", use ``#!${buildout:bin-directory/py}`` or ``#!/usr/bin/env ${buildout:bin-directory/py}``). - If you do care about relative paths, (``relative-paths = true`` in your buildout configuration), then z3c.recipe.scripts does require a bit more changes, as is usual for the relative path support in that package. First, use z3c.recipe.scripts to generate a script or interpreter with the dependencies you want. This will create a directory in ``parts`` that has a site.py and sitecustomize.py. Then, begin your script as in the snippet below. The example assumes that the z3c.recipe.scripts generated were from a Buildout configuration section labeled "scripts": adjust accordingly. :: #!${buildout:executable} -S ${python-relative-path-setup} import sys sys.path.insert(0, ${scripts:parts-directory|path-repr}) import site Updating Recipes to Support a System Python =========================================== You should generally only need to update recipes that generate scripts. These recipes need to change from using ``zc.buildout.easy_install.scripts`` to be using ``zc.buildout.easy_install.sitepackage_safe_scripts``. The signatures of the two functions are different. 
Please compare:: def scripts( reqs, working_set, executable, dest, scripts=None, extra_paths=(), arguments='', interpreter=None, initialization='', relative_paths=False, ): def sitepackage_safe_scripts( dest, working_set, executable, site_py_dest, reqs=(), scripts=None, interpreter=None, extra_paths=(), initialization='', include_site_packages=False, exec_sitecustomize=False, relative_paths=False, script_arguments='', script_initialization='', ): In most cases, the arguments are merely reordered. The ``reqs`` argument is no longer required in order to make it easier to generate an interpreter alone. The ``arguments`` argument was renamed to ``script_arguments`` to clarify that it did not affect interpreter generation. The only new required argument is ``site_py_dest``. It must be the path to a directory in which the customized site.py and sitecustomize.py files will be written. A typical generation in a recipe will look like this. (In the recipe's __init__ method...) :: self.options = options b_options = buildout['buildout'] options['parts-directory'] = os.path.join( b_options['parts-directory'], self.name) (In the recipe's install method...) :: options = self.options generated = [] if not os.path.exists(options['parts-directory']): os.mkdir(options['parts-directory']) generated.append(options['parts-directory']) Then ``options['parts-directory']`` can be used for the ``site_py_dest`` value. If you want to support the other arguments (``include_site_packages``, ``exec_sitecustomize``, ``script_initialization``, as well as the ``allowed-eggs-from-site-packages`` option), you might want to look at some of the code in https://github.com/buildout/buildout/blob/1.6.x/z3c.recipe.scripts_/src/z3c/recipe/scripts/scripts.py . You might even be able to adopt some of it by subclassing or delegating. The Scripts class in that file is the closest to what you might be used to from zc.recipe.egg. Important note for recipe authors: As of buildout 1.5.2, the code in recipes is *always run with the access to the site-packages as configured in the buildout section*. virtualenv ========== Using virtualenv (http://pypi.python.org/pypi/virtualenv) with the --no-site-packages option already provided a simple way of using a system Python. This is intended to continue to work, and some automated tests exist to demonstrate this. However, it is only supported to the degree that people have found it to work in the past. The existing Buildout tests for virtualenv are only for problems encountered previously. They are very far from comprehensive. Using Buildout with a system python has at least three advantages over using Buildout in conjunction with virtualenv. They may or may not be pertinent to your desired usage. - Unlike ``virtualenv --no-site-packages``, Buildout's support allows you to choose to let packages from your system Python be available to your software (see ``include-site-packages`` in http://pypi.python.org/pypi/z3c.recipe.scripts). You can even specify which eggs installed in your system Python can be allowed to fulfill some of your packages' dependencies (see ``allowed-eggs-from-site-packages`` in http://pypi.python.org/pypi/z3c.recipe.scripts). At the expense of some repeatability and platform dependency, this flexibility means that, for instance, you can rely on difficult-to-build eggs like lxml coming from your system Python. - Buildout's implementation has a full set of automated tests. - An integral Buildout implementation means fewer steps and fewer dependencies to work with a system Python. 
Detailed Documentation ********************** Buildouts ========= The word "buildout" refers to a description of a set of parts and the software to create and assemble them. It is often used informally to refer to an installed system based on a buildout definition. For example, if we are creating an application named "Foo", then "the Foo buildout" is the collection of configuration and application-specific software that allows an instance of the application to be created. We may refer to such an instance of the application informally as "a Foo buildout". This document describes how to define buildouts using buildout configuration files and recipes. There are three ways to set up the buildout software and create a buildout instance: 1. Install the zc.buildout egg with easy_install and use the buildout script installed in a Python scripts area. 2. Use the buildout bootstrap script to create a buildout that includes both the setuptools and zc.buildout eggs. This allows you to use the buildout software without modifying a Python install. The buildout script is installed into your buildout local scripts area. 3. Use a buildout command from an already installed buildout to bootstrap a new buildout. (See the section on bootstraping later in this document.) Often, a software project will be managed in a software repository, such as a subversion repository, that includes some software source directories, buildout configuration files, and a copy of the buildout bootstrap script. To work on the project, one would check out the project from the repository and run the bootstrap script which installs setuptools and zc.buildout into the checkout as well as any parts defined. We have a sample buildout that we created using the bootstrap command of an existing buildout (method 3 above). It has the absolute minimum information. We have bin, develop-eggs, eggs and parts directories, and a configuration file: >>> ls(sample_buildout) d bin - buildout.cfg d develop-eggs d eggs d parts The bin directory contains scripts. >>> ls(sample_buildout, 'bin') - buildout >>> ls(sample_buildout, 'eggs') - setuptools-0.6-py2.4.egg - zc.buildout-1.0-py2.4.egg The develop-eggs directory is initially empty: >>> ls(sample_buildout, 'develop-eggs') The develop-eggs directory holds egg links for software being developed in the buildout. We separate develop-eggs and other eggs to allow eggs directories to be shared across multiple buildouts. For example, a common developer technique is to define a common eggs directory in their home that all non-develop eggs are stored in. This allows larger buildouts to be set up much more quickly and saves disk space. The parts directory just contains some helpers for the buildout script itself. >>> ls(sample_buildout, 'parts') d buildout The parts directory provides an area where recipes can install part data. For example, if we built a custom Python, we would install it in the part directory. Part data is stored in a sub-directory of the parts directory with the same name as the part. Buildouts are defined using configuration files. These are in the format defined by the Python ConfigParser module, with extensions that we'll describe later. By default, when a buildout is run, it looks for the file buildout.cfg in the directory where the buildout is run. The minimal configuration file has a buildout section that defines no parts: >>> cat(sample_buildout, 'buildout.cfg') [buildout] parts = A part is simply something to be created by a buildout. 
It can be almost anything, such as a Python package, a program, a directory, or even a configuration file. Recipes ------- A part is created by a recipe. Recipes are always installed as Python eggs. They can be downloaded from a package server, such as the Python Package Index, or they can be developed as part of a project using a "develop" egg. A develop egg is a special kind of egg that gets installed as an "egg link" that contains the name of a source directory. Develop eggs don't have to be packaged for distribution to be used and can be modified in place, which is especially useful while they are being developed. Let's create a recipe as part of the sample project. We'll create a recipe for creating directories. First, we'll create a recipes source directory for our local recipes: >>> mkdir(sample_buildout, 'recipes') and then we'll create a source file for our mkdir recipe: >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import logging, os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... options['path'] = os.path.join( ... buildout['buildout']['directory'], ... options['path'], ... ) ... if not os.path.isdir(os.path.dirname(options['path'])): ... logging.getLogger(self.name).error( ... 'Cannot create %s. %s is not a directory.', ... options['path'], os.path.dirname(options['path'])) ... raise zc.buildout.UserError('Invalid Path') ... ... ... def install(self): ... path = self.options['path'] ... logging.getLogger(self.name).info( ... 'Creating directory %s', os.path.basename(path)) ... os.mkdir(path) ... return path ... ... def update(self): ... pass ... """) Currently, recipes must define 3 methods [#future_recipe_methods]_: - a constructor, - an install method, and - an update method. The constructor is responsible for updating a parts options to reflect data read from other sections. The buildout system keeps track of whether a part specification has changed. A part specification has changed if it's options, after adjusting for data read from other sections, has changed, or if the recipe has changed. Only the options for the part are considered. If data are read from other sections, then that information has to be reflected in the parts options. In the Mkdir example, the given path is interpreted relative to the buildout directory, and data from the buildout directory is read. The path option is updated to reflect this. If the directory option was changed in the buildout sections, we would know to update parts created using the mkdir recipe using relative path names. When buildout is run, it saves configuration data for installed parts in a file named ".installed.cfg". In subsequent runs, it compares part-configuration data stored in the .installed.cfg file and the part-configuration data loaded from the configuration files as modified by recipe constructors to decide if the configuration of a part has changed. If the configuration has changed, or if the recipe has changed, then the part is uninstalled and reinstalled. The buildout only looks at the part's options, so any data used to configure the part needs to be reflected in the part's options. It is the job of a recipe constructor to make sure that the options include all relevant data. Of course, parts are also uninstalled if they are no-longer used. The recipe defines a constructor that takes a buildout object, a part name, and an options dictionary. It saves them in instance attributes. 
If the path is relative, we'll interpret it as relative to the buildout directory. The buildout object passed in is a mapping from section name to a mapping of options for that section. The buildout directory is available as the directory option of the buildout section. We normalize the path and save it back into the options directory. The install method is responsible for creating the part. In this case, we need the path of the directory to create. We'll use a path option from our options dictionary. The install method logs what it's doing using the Python logging call. We return the path that we installed. If the part is uninstalled or reinstalled, then the path returned will be removed by the buildout machinery. A recipe install method is expected to return a string, or an iterable of strings containing paths to be removed if a part is uninstalled. For most recipes, this is all of the uninstall support needed. For more complex uninstallation scenarios use `Uninstall recipes`_. The update method is responsible for updating an already installed part. An empty method is often provided, as in this example, if parts can't be updated. An update method can return None, a string, or an iterable of strings. If a string or iterable of strings is returned, then the saved list of paths to be uninstalled is updated with the new information by adding any new files returned by the update method. We need to provide packaging information so that our recipe can be installed as a develop egg. The minimum information we need to specify [#packaging_info]_ is a name. For recipes, we also need to define the names of the recipe classes as entry points. Packaging information is provided via a setup.py script: >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... ... setup( ... name = "recipes", ... entry_points = {'zc.buildout': ['mkdir = mkdir:Mkdir']}, ... ) ... """) Our setup script defines an entry point. Entry points provide a way for an egg to define the services it provides. Here we've said that we define a zc.buildout entry point named mkdir. Recipe classes must be exposed as entry points in the zc.buildout group. we give entry points names within the group. We also need a README.txt for our recipes to avoid an annoying warning from distutils, on which setuptools and zc.buildout are based: >>> write(sample_buildout, 'recipes', 'README.txt', " ") Now let's update our buildout.cfg: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = mystuff ... """) Let's go through the changes one by one:: develop = recipes This tells the buildout to install a development egg for our recipes. Any number of paths can be listed. The paths can be relative or absolute. If relative, they are treated as relative to the buildout directory. They can be directory or file paths. If a file path is given, it should point to a Python setup script. If a directory path is given, it should point to a directory containing a setup.py file. Development eggs are installed before building any parts, as they may provide locally-defined recipes needed by the parts. :: parts = data-dir Here we've named a part to be "built". We can use any name we want except that different part names must be unique and recipes will often use the part name to decide what to do. 
:: [data-dir] recipe = recipes:mkdir path = mystuff When we name a part, we also create a section of the same name that contains part data. In this section, we'll define the recipe to be used to install the part. In this case, we also specify the path to be created. Let's run the buildout. We do so by running the build script in the buildout: >>> import os >>> os.chdir(sample_buildout) >>> buildout = os.path.join(sample_buildout, 'bin', 'buildout') >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory mystuff We see that the recipe created the directory, as expected: >>> ls(sample_buildout) - .installed.cfg d bin - buildout.cfg d develop-eggs d eggs d mystuff d parts d recipes In addition, .installed.cfg has been created containing information about the part we installed: >>> cat(sample_buildout, '.installed.cfg') [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = data-dir [data-dir] __buildout_installed__ = /sample-buildout/mystuff __buildout_signature__ = recipes-c7vHV6ekIDUPy/7fjAaYjg== path = /sample-buildout/mystuff recipe = recipes:mkdir Note that the directory we installed is included in .installed.cfg. In addition, the path option includes the actual destination directory. If we change the name of the directory in the configuration file, we'll see that the directory gets removed and recreated: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling data-dir. Installing data-dir. data-dir: Creating directory mydata >>> ls(sample_buildout) - .installed.cfg d bin - buildout.cfg d develop-eggs d eggs d mydata d parts d recipes If any of the files or directories created by a recipe are removed, the part will be reinstalled: >>> rmdir(sample_buildout, 'mydata') >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling data-dir. Installing data-dir. data-dir: Creating directory mydata Error reporting --------------- If a user makes an error, an error needs to be printed and work needs to stop. This is accomplished by logging a detailed error message and then raising a (or an instance of a subclass of a) zc.buildout.UserError exception. Raising an error other than a UserError still displays the error, but labels it as a bug in the buildout software or recipe. In the sample above, of someone gives a non-existent directory to create the directory in: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = /xxx/mydata ... """) We'll get a user error, not a traceback. >>> print system(buildout), Develop: '/sample-buildout/recipes' data-dir: Cannot create /xxx/mydata. /xxx is not a directory. While: Installing. Getting section data-dir. Initializing part data-dir. Error: Invalid Path Recipe Error Handling --------------------- If an error occurs during installation, it is up to the recipe to clean up any system side effects, such as files created. Let's update the mkdir recipe to support multiple paths: >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import logging, os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... ... # Normalize paths and check that their parent ... 
# directories exist: ... paths = [] ... for path in options['path'].split(): ... path = os.path.join(buildout['buildout']['directory'], path) ... if not os.path.isdir(os.path.dirname(path)): ... logging.getLogger(self.name).error( ... 'Cannot create %s. %s is not a directory.', ... options['path'], os.path.dirname(options['path'])) ... raise zc.buildout.UserError('Invalid Path') ... paths.append(path) ... options['path'] = ' '.join(paths) ... ... def install(self): ... paths = self.options['path'].split() ... for path in paths: ... logging.getLogger(self.name).info( ... 'Creating directory %s', os.path.basename(path)) ... os.mkdir(path) ... return paths ... ... def update(self): ... pass ... """) .. >>> clean_up_pyc(sample_buildout, 'recipes', 'mkdir.py') If there is an error creating a path, the install method will exit and leave previously created paths in place: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = foo bin ... """) >>> print system(buildout), # doctest: +ELLIPSIS Develop: '/sample-buildout/recipes' Uninstalling data-dir. Installing data-dir. data-dir: Creating directory foo data-dir: Creating directory bin While: Installing data-dir. An internal error occurred due to a bug in either zc.buildout or in a recipe being used: Traceback (most recent call last): ... OSError: [Errno 17] File exists: '/sample-buildout/bin' We meant to create a directory bins, but typed bin. Now foo was left behind. >>> os.path.exists('foo') True If we fix the typo: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = foo bins ... """) >>> print system(buildout), # doctest: +ELLIPSIS Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory foo While: Installing data-dir. An internal error occurred due to a bug in either zc.buildout or in a recipe being used: Traceback (most recent call last): ... OSError: [Errno 17] File exists: '/sample-buildout/foo' Now they fail because foo exists, because it was left behind. >>> remove('foo') Let's fix the recipe: >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import logging, os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... ... # Normalize paths and check that their parent ... # directories exist: ... paths = [] ... for path in options['path'].split(): ... path = os.path.join(buildout['buildout']['directory'], path) ... if not os.path.isdir(os.path.dirname(path)): ... logging.getLogger(self.name).error( ... 'Cannot create %s. %s is not a directory.', ... options['path'], os.path.dirname(options['path'])) ... raise zc.buildout.UserError('Invalid Path') ... paths.append(path) ... options['path'] = ' '.join(paths) ... ... def install(self): ... paths = self.options['path'].split() ... created = [] ... try: ... for path in paths: ... logging.getLogger(self.name).info( ... 'Creating directory %s', os.path.basename(path)) ... os.mkdir(path) ... created.append(path) ... except: ... for d in created: ... os.rmdir(d) ... raise ... ... return paths ... ... def update(self): ... pass ... """) .. >>> clean_up_pyc(sample_buildout, 'recipes', 'mkdir.py') And put back the typo: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... 
recipe = recipes:mkdir ... path = foo bin ... """) When we rerun the buildout: >>> print system(buildout), # doctest: +ELLIPSIS Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory foo data-dir: Creating directory bin While: Installing data-dir. An internal error occurred due to a bug in either zc.buildout or in a recipe being used: Traceback (most recent call last): ... OSError: [Errno 17] File exists: '/sample-buildout/bin' we get the same error, but we don't get the directory left behind: >>> os.path.exists('foo') False It's critical that recipes clean up partial effects when errors occur. Because recipes most commonly create files and directories, buildout provides a helper API for removing created files when an error occurs. Option objects have a created method that can be called to record files as they are created. If the install or update method returns with an error, then any registered paths are removed automatically. The method returns the files registered and can be used to return the files created. Let's use this API to simplify the recipe: >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import logging, os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... ... # Normalize paths and check that their parent ... # directories exist: ... paths = [] ... for path in options['path'].split(): ... path = os.path.join(buildout['buildout']['directory'], path) ... if not os.path.isdir(os.path.dirname(path)): ... logging.getLogger(self.name).error( ... 'Cannot create %s. %s is not a directory.', ... options['path'], os.path.dirname(options['path'])) ... raise zc.buildout.UserError('Invalid Path') ... paths.append(path) ... options['path'] = ' '.join(paths) ... ... def install(self): ... paths = self.options['path'].split() ... for path in paths: ... logging.getLogger(self.name).info( ... 'Creating directory %s', os.path.basename(path)) ... os.mkdir(path) ... self.options.created(path) ... ... return self.options.created() ... ... def update(self): ... pass ... """) .. >>> clean_up_pyc(sample_buildout, 'recipes', 'mkdir.py') We returned by calling created, taking advantage of the fact that it returns the registered paths. We did this for illustrative purposes. It would be simpler just to return the paths as before. If we rerun the buildout, again, we'll get the error and no directories will be created: >>> print system(buildout), # doctest: +ELLIPSIS Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory foo data-dir: Creating directory bin While: Installing data-dir. An internal error occurred due to a bug in either zc.buildout or in a recipe being used: Traceback (most recent call last): ... OSError: [Errno 17] File exists: '/sample-buildout/bin' >>> os.path.exists('foo') False Now, we'll fix the typo again and we'll get the directories we expect: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = foo bins ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing data-dir. data-dir: Creating directory foo data-dir: Creating directory bins >>> os.path.exists('foo') True >>> os.path.exists('bins') True Configuration file syntax ------------------------- As mentioned earlier, buildout configuration files use the format defined by the Python ConfigParser module with extensions. 
The extensions are: - option names are case sensitive - option values can use a substitution syntax, described below, to refer to option values in specific sections. - option values can be appended or removed using the - and + operators. The ConfigParser syntax is very flexible. Section names can contain any characters other than newlines and right square braces ("]"). Option names can contain any characters other than newlines, colons, and equal signs, can not start with a space, and don't include trailing spaces. It is likely that, in the future, some characters will be given special buildout-defined meanings. This is already true of the characters ":", "$", "%", "(", and ")". For now, it is a good idea to keep section and option names simple, sticking to alphanumeric characters, hyphens, and periods. Annotated sections ------------------ When used with the `annotate` command, buildout displays annotated sections. All sections are displayed, sorted alphabetically. For each section, all key-value pairs are displayed, sorted alphabetically, along with the origin of the value (file name or COMPUTED_VALUE, DEFAULT_VALUE, COMMAND_LINE_VALUE). >>> print system(buildout+ ' annotate'), ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE Annotated sections ================== [buildout] accept-buildout-test-releases= false DEFAULT_VALUE allow-hosts= * DEFAULT_VALUE allow-picked-versions= true DEFAULT_VALUE allowed-eggs-from-site-packages= * DEFAULT_VALUE bin-directory= bin DEFAULT_VALUE develop= recipes /sample-buildout/buildout.cfg develop-eggs-directory= develop-eggs DEFAULT_VALUE directory= /sample-buildout COMPUTED_VALUE eggs-directory= eggs DEFAULT_VALUE exec-sitecustomize= true DEFAULT_VALUE executable= ... DEFAULT_VALUE find-links= DEFAULT_VALUE include-site-packages= true DEFAULT_VALUE install-from-cache= false DEFAULT_VALUE installed= .installed.cfg DEFAULT_VALUE log-format= DEFAULT_VALUE log-level= INFO DEFAULT_VALUE newest= true DEFAULT_VALUE offline= false DEFAULT_VALUE parts= data-dir /sample-buildout/buildout.cfg parts-directory= parts DEFAULT_VALUE prefer-final= false DEFAULT_VALUE python= buildout DEFAULT_VALUE relative-paths= false DEFAULT_VALUE socket-timeout= DEFAULT_VALUE unzip= false DEFAULT_VALUE use-dependency-links= true DEFAULT_VALUE [data-dir] path= foo bins /sample-buildout/buildout.cfg recipe= recipes:mkdir /sample-buildout/buildout.cfg Variable substitutions ---------------------- Buildout configuration files support variable substitution. To illustrate this, we'll create an debug recipe to allow us to see interactions with the buildout: >>> write(sample_buildout, 'recipes', 'debug.py', ... """ ... class Debug: ... ... def __init__(self, buildout, name, options): ... self.buildout = buildout ... self.name = name ... self.options = options ... ... def install(self): ... items = self.options.items() ... items.sort() ... for option, value in items: ... print option, value ... return () ... ... update = install ... """) This recipe doesn't actually create anything. The install method doesn't return anything, because it didn't create any files or directories. We also have to update our setup script: >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... entry_points = ( ... ''' ... [zc.buildout] ... mkdir = mkdir:Mkdir ... debug = debug:Debug ... ''') ... setup(name="recipes", entry_points=entry_points) ... """) We've rearranged the script a bit to make the entry points easier to edit. 
In particular, entry points are now defined as a configuration string, rather than a dictionary. Let's update our configuration to provide variable substitution examples: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir debug ... log-level = INFO ... ... [debug] ... recipe = recipes:debug ... File 1 = ${data-dir:path}/file ... File 2 = ${debug:File 1}/log ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) We used a string-template substitution for File 1 and File 2. This type of substitution uses the string.Template syntax. Names substituted are qualified option names, consisting of a section name and option name joined by a colon. Now, if we run the buildout, we'll see the options with the values substituted. >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling data-dir. Installing data-dir. data-dir: Creating directory mydata Installing debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log recipe recipes:debug Note that the substitution of the data-dir path option reflects the update to the option performed by the mkdir recipe. It might seem surprising that mydata was created again. This is because we changed our recipes package by adding the debug module. The buildout system didn't know if this module could affect the mkdir recipe, so it assumed it could and reinstalled mydata. If we rerun the buildout: >>> print system(buildout), Develop: '/sample-buildout/recipes' Updating data-dir. Updating debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log recipe recipes:debug We can see that mydata was not recreated. Note that, in this case, we didn't specify a log level, so we didn't get output about what the buildout was doing. Section and option names in variable substitutions are only allowed to contain alphanumeric characters, hyphens, periods and spaces. This restriction might be relaxed in future releases. We can omit the section name in a variable substitution to refer to the current section. We can also use the special option, _buildout_section_name_, to get the current section name. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir debug ... log-level = INFO ... ... [debug] ... recipe = recipes:debug ... File 1 = ${data-dir:path}/file ... File 2 = ${:File 1}/log ... my_name = ${:_buildout_section_name_} ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Updating data-dir. Installing debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log my_name debug recipe recipes:debug Automatic part selection and ordering ------------------------------------- When a section with a recipe is referred to, either through variable substitution or by an initializing recipe, the section is treated as a part and added to the part list before the referencing part. For example, we can leave data-dir out of the parts list: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... log-level = INFO ... ... [debug] ... recipe = recipes:debug ... File 1 = ${data-dir:path}/file ... File 2 = ${debug:File 1}/log ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) It will still be treated as a part: >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. 
Updating data-dir. Installing debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log recipe recipes:debug >>> cat('.installed.cfg') # doctest: +ELLIPSIS [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = data-dir debug ... Note that the data-dir part is included *before* the debug part, because the debug part refers to the data-dir part. Even if we list the data-dir part after the debug part, it will be included before: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug data-dir ... log-level = INFO ... ... [debug] ... recipe = recipes:debug ... File 1 = ${data-dir:path}/file ... File 2 = ${debug:File 1}/log ... ... [data-dir] ... recipe = recipes:mkdir ... path = mydata ... """) It will still be treated as a part: >>> print system(buildout), Develop: '/sample-buildout/recipes' Updating data-dir. Updating debug. File 1 /sample-buildout/mydata/file File 2 /sample-buildout/mydata/file/log recipe recipes:debug >>> cat('.installed.cfg') # doctest: +ELLIPSIS [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = data-dir debug ... Extending sections (macros) --------------------------- A section (other than the buildout section) can extend one or more other sections using the ``<=`` option. Options from the referenced sections are copied to the refering section *before* variable substitution. This, together with the ability to refer to variables of the current section allows sections to be used as macros. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = myfiles ... log-level = INFO ... ... [debug] ... recipe = recipes:debug ... ... [with_file1] ... <= debug ... file1 = ${:path}/file1 ... color = red ... ... [with_file2] ... <= debug ... file2 = ${:path}/file2 ... color = blue ... ... [myfiles] ... <= with_file1 ... with_file2 ... path = mydata ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Uninstalling data-dir. Installing myfiles. color blue file1 mydata/file1 file2 mydata/file2 path mydata recipe recipes:debug In this example, the debug, with_file1 and with_file2 sections act as macros. In particular, the variable substitutions are performed relative to the myfiles section. Adding and removing options --------------------------- We can append and remove values to an option by using the + and - operators. This is illustrated below; first we define a base configuration. >>> write(sample_buildout, 'base.cfg', ... """ ... [buildout] ... parts = part1 part2 part3 ... ... [part1] ... recipe = ... option = a1 a2 ... ... [part2] ... recipe = ... option = b1 ... b2 ... b3 ... b4 ... ... [part3] ... recipe = ... option = c1 c2 ... ... [part4] ... recipe = ... option = d2 ... d3 ... d5 ... ... """) Extending this configuration, we can "adjust" the values set in the base configuration file. >>> write(sample_buildout, 'extension1.cfg', ... """ ... [buildout] ... extends = base.cfg ... ... # appending values ... [part1] ... option += a3 a4 ... ... # removing values ... [part2] ... option -= b1 ... b2 ... ... # alt. spelling ... [part3] ... option+=c3 c4 c5 ... ... # combining both adding and removing ... [part4] ... option += d1 ... d4 ... option -= d5 ... ... # normal assignment ... [part5] ... option = h1 h2 ... """) An additional extension. >>> write(sample_buildout, 'extension2.cfg', ... """ ... [buildout] ... extends = extension1.cfg ... ... 
# appending values ... [part1] ... option += a5 ... ... # removing values ... [part2] ... option -= b1 ... b2 ... b3 ... ... """) To verify that the options are adjusted correctly, we'll set up an extension that prints out the options. >>> mkdir(sample_buildout, 'demo') >>> write(sample_buildout, 'demo', 'demo.py', ... """ ... def ext(buildout): ... print [part['option'] for name, part in buildout.items() \ ... if name.startswith('part')] ... """) >>> write(sample_buildout, 'demo', 'setup.py', ... """ ... from setuptools import setup ... ... setup( ... name="demo", ... entry_points={'zc.buildout.extension': ['ext = demo:ext']}, ... ) ... """) Set up a buildout configuration for this extension. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = demo ... parts = ... """) >>> os.chdir(sample_buildout) >>> print system(os.path.join(sample_buildout, 'bin', 'buildout')), # doctest: +ELLIPSIS Develop: '/sample-buildout/demo' Uninstalling myfiles. Getting distribution for 'recipes'. zip_safe flag not set; analyzing archive contents... Got recipes 0.0.0. warning: install_lib: 'build/lib...' does not exist -- no Python modules to install Verify option values. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = demo ... extensions = demo ... extends = extension2.cfg ... """) >>> print system(os.path.join('bin', 'buildout')), ['a1 a2/na3 a4/na5', 'b4', 'c1 c2/nc3 c4 c5', 'd2/nd3/nd1/nd4', 'h1 h2'] Develop: '/sample-buildout/demo' Annotated sections output shows which files are responsible for which operations. >>> print system(os.path.join('bin', 'buildout') + ' annotate'), ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE Annotated sections ================== ... [part1] option= a1 a2 a3 a4 a5 /sample-buildout/base.cfg += /sample-buildout/extension1.cfg += /sample-buildout/extension2.cfg recipe= /sample-buildout/base.cfg [part2] option= b4 /sample-buildout/base.cfg -= /sample-buildout/extension1.cfg -= /sample-buildout/extension2.cfg recipe= /sample-buildout/base.cfg [part3] option= c1 c2 c3 c4 c5 /sample-buildout/base.cfg += /sample-buildout/extension1.cfg recipe= /sample-buildout/base.cfg [part4] option= d2 d3 d1 d4 /sample-buildout/base.cfg += /sample-buildout/extension1.cfg -= /sample-buildout/extension1.cfg recipe= /sample-buildout/base.cfg [part5] option= h1 h2 /sample-buildout/extension1.cfg Cleanup. >>> os.remove(os.path.join(sample_buildout, 'base.cfg')) >>> os.remove(os.path.join(sample_buildout, 'extension1.cfg')) >>> os.remove(os.path.join(sample_buildout, 'extension2.cfg')) Multiple configuration files ---------------------------- A configuration file can "extend" another configuration file. Options are read from the other configuration file if they aren't already defined by your configuration file. The configuration files your file extends can extend other configuration files. The same file may be used more than once although, of course, cycles aren't allowed. To see how this works, we use an example: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... extends = base.cfg ... ... [debug] ... op = buildout ... """) >>> write(sample_buildout, 'base.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... op = base ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing debug. op buildout recipe recipes:debug The example is pretty trivial, but the pattern it illustrates is pretty common. 
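One common use of this pattern is to keep a shared base configuration under version control and layer a small, site-specific file on top of it, for example to pin distribution versions as discussed under "Repeatable buildouts: controlling eggs used" below. The following sketch uses made-up file names and a hypothetical version pin::

   # shared.cfg -- a hypothetical base kept in version control
   [buildout]
   parts = debug

   [debug]
   recipe = recipes:debug
   op = base

   # local.cfg -- a hypothetical overlay, run with: bin/buildout -c local.cfg
   [buildout]
   extends = shared.cfg
   versions = versions

   [versions]
   demo = 0.4c1

   [debug]
   op = local

Only the options set in the extending file are overridden; everything else is inherited from the base.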
In a more practical example, the base buildout might represent a product and the extending buildout might be a customization. Here is a more elaborate example. >>> other = tmpdir('other') >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... extends = b1.cfg b2.cfg %(b3)s ... ... [debug] ... op = buildout ... """ % dict(b3=os.path.join(other, 'b3.cfg'))) >>> write(sample_buildout, 'b1.cfg', ... """ ... [buildout] ... extends = base.cfg ... ... [debug] ... op1 = b1 1 ... op2 = b1 2 ... """) >>> write(sample_buildout, 'b2.cfg', ... """ ... [buildout] ... extends = base.cfg ... ... [debug] ... op2 = b2 2 ... op3 = b2 3 ... """) >>> write(other, 'b3.cfg', ... """ ... [buildout] ... extends = b3base.cfg ... ... [debug] ... op4 = b3 4 ... """) >>> write(other, 'b3base.cfg', ... """ ... [debug] ... op5 = b3base 5 ... """) >>> write(sample_buildout, 'base.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... name = base ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. name base op buildout op1 b1 1 op2 b2 2 op3 b2 3 op4 b3 4 op5 b3base 5 recipe recipes:debug There are several things to note about this example: - We can name multiple files in an extends option. - We can reference files recursively. - Relative file names in extended options are interpreted relative to the directory containing the referencing configuration file. Loading Configuration from URLs ------------------------------- Configuration files can be loaded from URLs. To see how this works, we'll set up a web server with some configuration files. >>> server_data = tmpdir('server_data') >>> write(server_data, "r1.cfg", ... """ ... [debug] ... op1 = r1 1 ... op2 = r1 2 ... """) >>> write(server_data, "r2.cfg", ... """ ... [buildout] ... extends = r1.cfg ... ... [debug] ... op2 = r2 2 ... op3 = r2 3 ... """) >>> server_url = start_server(server_data) >>> write('client.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... extends = %(url)s/r2.cfg ... ... [debug] ... recipe = recipes:debug ... name = base ... """ % dict(url=server_url)) >>> print system(buildout+ ' -c client.cfg'), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. name base op1 r1 1 op2 r2 2 op3 r2 3 recipe recipes:debug Here we specified a URL for the file we extended. The file we downloaded, itself referred to a file on the server using a relative URL reference. Relative references are interpreted relative to the base URL when they appear in configuration files loaded via URL. We can also specify a URL as the configuration file to be used by a buildout. >>> os.remove('client.cfg') >>> write(server_data, 'remote.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... extends = r2.cfg ... ... [debug] ... recipe = recipes:debug ... name = remote ... """) >>> print system(buildout + ' -c ' + server_url + '/remote.cfg'), While: Initializing. Error: Missing option: buildout:directory Normally, the buildout directory defaults to directory containing a configuration file. This won't work for configuration files loaded from URLs. In this case, the buildout directory would normally be defined on the command line: >>> print system(buildout ... + ' -c ' + server_url + '/remote.cfg' ... + ' buildout:directory=' + sample_buildout ... ), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. 
name remote op1 r1 1 op2 r2 2 op3 r2 3 recipe recipes:debug User defaults ------------- If the file $HOME/.buildout/default.cfg exists, it is read before reading the configuration file. ($HOME is the value of the HOME environment variable. The '/' is replaced by the operating system file delimiter.) >>> old_home = os.environ['HOME'] >>> home = tmpdir('home') >>> mkdir(home, '.buildout') >>> write(home, '.buildout', 'default.cfg', ... """ ... [debug] ... op1 = 1 ... op7 = 7 ... """) >>> os.environ['HOME'] = home >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. name base op buildout op1 b1 1 op2 b2 2 op3 b2 3 op4 b3 4 op5 b3base 5 op7 7 recipe recipes:debug A buildout command-line argument, -U, can be used to suppress reading user defaults: >>> print system(buildout + ' -U'), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. name base op buildout op1 b1 1 op2 b2 2 op3 b2 3 op4 b3 4 op5 b3base 5 recipe recipes:debug >>> os.environ['HOME'] = old_home Log level --------- We can control the level of logging by specifying a log level in our configuration file. For example, to suppress info messages, we can set the logging level to WARNING: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... log-level = WARNING ... extends = b1.cfg b2.cfg ... """) >>> print system(buildout), name base op1 b1 1 op2 b2 2 op3 b2 3 recipe recipes:debug Uninstall recipes ----------------- As we've seen, when parts are installed, buildout keeps track of files and directories that they create. When the parts are uninstalled, these files and directories are deleted. Sometimes more cleanup is needed. For example, a recipe might add a system service by calling chkconfig --add during installation. Later during uninstallation, chkconfig --del will need to be called to remove the system service. In order to deal with these uninstallation issues, you can register uninstall recipes. Uninstall recipes are registered using the 'zc.buildout.uninstall' entry point. Parts specify uninstall recipes using the 'uninstall' option. In comparison to regular recipes, uninstall recipes are much simpler. They are simply callable objects that accept the name of the part to be uninstalled and the part's options dictionary. Uninstall recipes don't have access to the part itself, since it may not be possible to instantiate it at uninstallation time. Here's a recipe that simulates installation of a system service, along with an uninstall recipe that simulates removing the service. >>> write(sample_buildout, 'recipes', 'service.py', ... """ ... class Service: ... ... def __init__(self, buildout, name, options): ... self.buildout = buildout ... self.name = name ... self.options = options ... ... def install(self): ... print "chkconfig --add %s" % self.options['script'] ... return () ... ... def update(self): ... pass ... ... ... def uninstall_service(name, options): ... print "chkconfig --del %s" % options['script'] ... """) To use these recipes, we must register them using entry points. Make sure to use the same name for the recipe and uninstall recipe. This is required to let buildout know which uninstall recipe goes with which recipe. >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... entry_points = ( ... ''' ... [zc.buildout] ... mkdir = mkdir:Mkdir ... debug = debug:Debug ... service = service:Service ... ... [zc.buildout.uninstall] ... service = service:uninstall_service ... ''') ... 
setup(name="recipes", entry_points=entry_points) ... """) Here's how these recipes could be used in a buildout: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = service ... ... [service] ... recipe = recipes:service ... script = /path/to/script ... """) When the buildout is run the service will be installed >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling debug. Installing service. chkconfig --add /path/to/script The service has been installed. If the buildout is run again with no changes, the service shouldn't be changed. >>> print system(buildout) Develop: '/sample-buildout/recipes' Updating service. Now we change the service part to trigger uninstallation and re-installation. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = service ... ... [service] ... recipe = recipes:service ... script = /path/to/a/different/script ... """) >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling service. Running uninstall recipe. chkconfig --del /path/to/script Installing service. chkconfig --add /path/to/a/different/script Now we remove the service part, and add another part. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... """) >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling service. Running uninstall recipe. chkconfig --del /path/to/a/different/script Installing debug. recipe recipes:debug Uninstall recipes don't have to take care of removing all the files and directories created by the part. This is still done automatically, following the execution of the uninstall recipe. An upshot is that an uninstallation recipe can access files and directories created by a recipe before they are deleted. For example, here's an uninstallation recipe that simulates backing up a directory before it is deleted. It is designed to work with the mkdir recipe introduced earlier. >>> write(sample_buildout, 'recipes', 'backup.py', ... """ ... import os ... def backup_directory(name, options): ... path = options['path'] ... size = len(os.listdir(path)) ... print "backing up directory %s of size %s" % (path, size) ... """) It must be registered with the zc.buildout.uninstall entry point. Notice how it is given the name 'mkdir' to associate it with the mkdir recipe. >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... entry_points = ( ... ''' ... [zc.buildout] ... mkdir = mkdir:Mkdir ... debug = debug:Debug ... service = service:Service ... ... [zc.buildout.uninstall] ... uninstall_service = service:uninstall_service ... mkdir = backup:backup_directory ... ''') ... setup(name="recipes", entry_points=entry_points) ... """) Now we can use it with a mkdir part. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = dir debug ... ... [dir] ... recipe = recipes:mkdir ... path = my_directory ... ... [debug] ... recipe = recipes:debug ... """) Run the buildout to install the part. >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling debug. Installing dir. dir: Creating directory my_directory Installing debug. recipe recipes:debug Now we remove the part from the configuration file. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... 
""") When the buildout is run the part is removed, and the uninstall recipe is run before the directory is deleted. >>> print system(buildout) Develop: '/sample-buildout/recipes' Uninstalling dir. Running uninstall recipe. backing up directory /sample-buildout/my_directory of size 0 Updating debug. recipe recipes:debug Now we will return the registration to normal for the benefit of the rest of the examples. >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... entry_points = ( ... ''' ... [zc.buildout] ... mkdir = mkdir:Mkdir ... debug = debug:Debug ... ''') ... setup(name="recipes", entry_points=entry_points) ... """) Command-line usage ------------------ A number of arguments can be given on the buildout command line. The command usage is:: buildout [options and assignments] [command [command arguments]] The following options are supported: -h (or --help) Print basic usage information. If this option is used, then all other options are ignored. -c filename The -c option can be used to specify a configuration file, rather than buildout.cfg in the current directory. -t socket_timeout Specify the socket timeout in seconds. -v Increment the verbosity by 10. The verbosity is used to adjust the logging level. The verbosity is subtracted from the numeric value of the log-level option specified in the configuration file. -q Decrement the verbosity by 10. -U Don't read user-default configuration. -o Run in off-line mode. This is equivalent to the assignment buildout:offline=true. -O Run in non-off-line mode. This is equivalent to the assignment buildout:offline=false. This is the default buildout mode. The -O option would normally be used to override a true offline setting in a configuration file. -n Run in newest mode. This is equivalent to the assignment buildout:newest=true. With this setting, which is the default, buildout will try to find the newest versions of distributions available that satisfy its requirements. -N Run in non-newest mode. This is equivalent to the assignment buildout:newest=false. With this setting, buildout will not seek new distributions if installed distributions satisfy it's requirements. Assignments are of the form:: section_name:option_name=value Options and assignments can be given in any order. Here's an example: >>> write(sample_buildout, 'other.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... installed = .other.cfg ... log-level = WARNING ... ... [debug] ... name = other ... recipe = recipes:debug ... """) Note that we used the installed buildout option to specify an alternate file to store information about installed parts. >>> print system(buildout+' -c other.cfg debug:op1=foo -v'), Develop: '/sample-buildout/recipes' Installing debug. name other op1 foo recipe recipes:debug Here we used the -c option to specify an alternate configuration file, and the -v option to increase the level of logging from the default, WARNING. Options can also be combined in the usual Unix way, as in: >>> print system(buildout+' -vcother.cfg debug:op1=foo'), Develop: '/sample-buildout/recipes' Updating debug. name other op1 foo recipe recipes:debug Here we combined the -v and -c options with the configuration file name. Note that the -c option has to be last, because it takes an argument. >>> os.remove(os.path.join(sample_buildout, 'other.cfg')) >>> os.remove(os.path.join(sample_buildout, '.other.cfg')) The most commonly used command is 'install' and it takes a list of parts to install. 
if any parts are specified, only those parts are installed. To illustrate this, we'll update our configuration and run the buildout in the usual way: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug d1 d2 d3 ... ... [d1] ... recipe = recipes:mkdir ... path = d1 ... ... [d2] ... recipe = recipes:mkdir ... path = d2 ... ... [d3] ... recipe = recipes:mkdir ... path = d3 ... ... [debug] ... recipe = recipes:debug ... """) >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling debug. Installing debug. recipe recipes:debug Installing d1. d1: Creating directory d1 Installing d2. d2: Creating directory d2 Installing d3. d3: Creating directory d3 >>> ls(sample_buildout) - .installed.cfg - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d d1 d d2 d d3 d demo d develop-eggs d eggs d parts d recipes >>> cat(sample_buildout, '.installed.cfg') ... # doctest: +NORMALIZE_WHITESPACE [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = debug d1 d2 d3 [debug] __buildout_installed__ = __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== recipe = recipes:debug [d1] __buildout_installed__ = /sample-buildout/d1 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d1 recipe = recipes:mkdir [d2] __buildout_installed__ = /sample-buildout/d2 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d2 recipe = recipes:mkdir [d3] __buildout_installed__ = /sample-buildout/d3 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d3 recipe = recipes:mkdir Now we'll update our configuration file: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug d2 d3 d4 ... ... [d2] ... recipe = recipes:mkdir ... path = data2 ... ... [d3] ... recipe = recipes:mkdir ... path = data3 ... ... [d4] ... recipe = recipes:mkdir ... path = ${d2:path}-extra ... ... [debug] ... recipe = recipes:debug ... x = 1 ... """) and run the buildout specifying just d3 and d4: >>> print system(buildout+' install d3 d4'), Develop: '/sample-buildout/recipes' Uninstalling d3. Installing d3. d3: Creating directory data3 Installing d4. d4: Creating directory data2-extra >>> ls(sample_buildout) - .installed.cfg - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d d1 d d2 d data2-extra d data3 d demo d develop-eggs d eggs d parts d recipes Only the d3 and d4 recipes ran. d3 was removed and data3 and data2-extra were created. The .installed.cfg is only updated for the recipes that ran: >>> cat(sample_buildout, '.installed.cfg') ... 
# doctest: +NORMALIZE_WHITESPACE [buildout] installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link parts = debug d1 d2 d3 d4 [debug] __buildout_installed__ = __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== recipe = recipes:debug [d1] __buildout_installed__ = /sample-buildout/d1 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d1 recipe = recipes:mkdir [d2] __buildout_installed__ = /sample-buildout/d2 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/d2 recipe = recipes:mkdir [d3] __buildout_installed__ = /sample-buildout/data3 __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/data3 recipe = recipes:mkdir [d4] __buildout_installed__ = /sample-buildout/data2-extra __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== path = /sample-buildout/data2-extra recipe = recipes:mkdir Note that the installed data for debug, d1, and d2 haven't changed, because we didn't install those parts and that the d1 and d2 directories are still there. Now, if we run the buildout without the install command: >>> print system(buildout), Develop: '/sample-buildout/recipes' Uninstalling d2. Uninstalling d1. Uninstalling debug. Installing debug. recipe recipes:debug x 1 Installing d2. d2: Creating directory data2 Updating d3. Updating d4. We see the output of the debug recipe and that data2 was created. We also see that d1 and d2 have gone away: >>> ls(sample_buildout) - .installed.cfg - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d data2 d data2-extra d data3 d demo d develop-eggs d eggs d parts d recipes Alternate directory and file locations -------------------------------------- The buildout normally puts the bin, eggs, and parts directories in the directory in the directory containing the configuration file. You can provide alternate locations, and even names for these directories. >>> alt = tmpdir('sample-alt') >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = ... develop-eggs-directory = %(developbasket)s ... eggs-directory = %(basket)s ... bin-directory = %(scripts)s ... parts-directory = %(work)s ... """ % dict( ... developbasket = os.path.join(alt, 'developbasket'), ... basket = os.path.join(alt, 'basket'), ... scripts = os.path.join(alt, 'scripts'), ... work = os.path.join(alt, 'work'), ... )) >>> print system(buildout), Creating directory '/sample-alt/scripts'. Creating directory '/sample-alt/work'. Creating directory '/sample-alt/basket'. Creating directory '/sample-alt/developbasket'. Develop: '/sample-buildout/recipes' Uninstalling d4. Uninstalling d3. Uninstalling d2. Uninstalling debug. >>> ls(alt) d basket d developbasket d scripts d work >>> ls(alt, 'developbasket') - recipes.egg-link You can also specify an alternate buildout directory: >>> rmdir(alt) >>> alt = tmpdir('sample-alt') >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... directory = %(alt)s ... develop = %(recipes)s ... parts = ... """ % dict( ... alt=alt, ... recipes=os.path.join(sample_buildout, 'recipes'), ... )) >>> print system(buildout), Creating directory '/sample-alt/bin'. Creating directory '/sample-alt/parts'. Creating directory '/sample-alt/eggs'. Creating directory '/sample-alt/develop-eggs'. 
Develop: '/sample-buildout/recipes' >>> ls(alt) - .installed.cfg d bin d develop-eggs d eggs d parts >>> ls(alt, 'develop-eggs') - recipes.egg-link Logging control --------------- Three buildout options are used to control logging: log-level specifies the log level verbosity adjusts the log level log-format allows an alternate logging for mat to be specified We've already seen the log level and verbosity. Let's look at an example of changing the format: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = ... log-level = 25 ... verbosity = 5 ... log-format = %(levelname)s %(message)s ... """) Here, we've changed the format to include the log-level name, rather than the logger name. We've also illustrated, with a contrived example, that the log level can be a numeric value and that the verbosity can be specified in the configuration file. Because the verbosity is subtracted from the log level, we get a final log level of 20, which is the INFO level. >>> print system(buildout), INFO Develop: '/sample-buildout/recipes' Predefined buildout options --------------------------- Buildouts have a number of predefined options that recipes can use and that users can override in their configuration files. To see these, we'll run a minimal buildout configuration with a debug logging level. One of the features of debug logging is that the configuration database is shown. >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... parts = ... """) >>> print system(buildout+' -vv'), # doctest: +NORMALIZE_WHITESPACE Installing 'zc.buildout >=1.9a1, <2dev', 'setuptools'. We have a develop egg: zc.buildout X.X. We have the best distribution that satisfies 'setuptools'. Picked: setuptools = V.V Configuration data: [buildout] accept-buildout-test-releases = false allow-hosts = * allow-picked-versions = true allowed-eggs-from-site-packages = * bin-directory = /sample-buildout/bin develop-eggs-directory = /sample-buildout/develop-eggs directory = /sample-buildout eggs-directory = /sample-buildout/eggs exec-sitecustomize = true executable = python find-links = include-site-packages = true install-from-cache = false installed = /sample-buildout/.installed.cfg log-format = log-level = INFO newest = true offline = false parts = parts-directory = /sample-buildout/parts prefer-final = false python = buildout relative-paths = false socket-timeout = unzip = false use-dependency-links = true verbosity = 20 zc.buildout-version = >=1.9a1, <2dev All of these options can be overridden by configuration files or by command-line assignments. We've discussed most of these options already, but let's review them and touch on some we haven't discussed: allowed-eggs-from-site-packages Sometimes you need or want to control what eggs from site-packages are used. The allowed-eggs-from-site-packages option allows you to specify a whitelist of project names that may be included from site-packages. You can use globs to specify the value. It defaults to a single value of '*', indicating that any package may come from site-packages. Here's a usage example:: [buildout] ... allowed-eggs-from-site-packages = demo bigdemo zope.* This option interacts with the ``include-site-packages`` option in the following ways. If ``include-site-packages`` is true, then ``allowed-eggs-from-site-packages`` filters what eggs from site-packages may be chosen. 
Therefore, if ``allowed-eggs-from-site-packages`` is an empty list, then no eggs from site-packages are chosen, but site-packages will still be included at the end of path lists. If ``include-site-packages`` is false, the value of ``allowed-eggs-from-site-packages`` is irrelevant. See the ``include-site-packages`` description for more information. bin-directory The directory path where scripts are written. This can be a relative path, which is interpreted relative to the directory option. develop-eggs-directory The directory path where development egg links are created for software being created in the local project. This can be a relative path, which is interpreted relative to the directory option. directory The buildout directory. This is the base for other buildout file and directory locations, when relative locations are used. eggs-directory The directory path where downloaded eggs are put. It is common to share this directory across buildouts. Eggs in this directory should *never* be modified. This can be a relative path, which is interpreted relative to the directory option. exec-sitecustomize Normally Python's real sitecustomize module is processed. If you do not want it to be processed in order to increase the repeatability of your buildout, set this value to 'false'. This will be honored irrespective of the setting for include-site-packages. This option will be honored by some recipes and not others. z3c.recipe.scripts honors this and zc.recipe.egg does not, for instance. executable The Python executable used to run the buildout. See the python option below. include-site-packages You can choose not to have the site-packages of the underlying Python available to your script or interpreter, in addition to the packages from your eggs. This can increase repeatability for your buildout. This option will be better used by some recipes than others. z3c.recipe.scripts honors this fully and zc.recipe.egg only partially, for instance. installed The file path where information about the results of the previous buildout run is written. This can be a relative path, which is interpreted relative to the directory option. This file provides an inventory of installed parts with information needed to decide which if any parts need to be uninstalled. log-format The format used for logging messages. log-level The log level before verbosity adjustment. parts A whitespace-separated list of parts to be installed. parts-directory A working directory that parts can use to store data. python The name of a section containing information about the default Python interpreter. Recipes that need a Python installation typically have options to tell them which Python installation to use. By convention, if a section-specific option isn't used, the option is looked for in the buildout section. The option must point to a section with an executable option giving the path to a Python executable. By default, the buildout section defines the default Python as the Python used to run the buildout. relative-paths The paths generated by zc.buildout are absolute by default, and this option is ``false``. However, if you set this value to be ``true``, bin/buildout will be generated with code that makes the paths relative. Some recipes, such as zc.recipe.egg and z3c.recipe.scripts, honor this value as well. unzip By default, zc.buildout doesn't unzip zip-safe eggs ("unzip = false"). This follows the policy followed by setuptools itself. Experience shows this policy to be inconvenient. 
Zipped eggs make debugging more difficult and often import more slowly. You can include an unzip option in the buildout section to change the default unzipping policy ("unzip = true"). use-dependency-links By default buildout will obey the setuptools dependency_links metadata when it looks for dependencies. This behavior can be controlled with the use-dependency-links buildout option:: [buildout] ... use-dependency-links = false The option defaults to true. If you set it to false, then dependency links are only looked for in the locations specified by find-links. verbosity A log-level adjustment. Typically, this is set via the -q and -v command-line options. Creating new buildouts and bootstrapping ---------------------------------------- If zc.buildout is installed, you can use it to create a new buildout with it's own local copies of zc.buildout and setuptools and with local buildout scripts. >>> sample_bootstrapped = tmpdir('sample-bootstrapped') >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped, 'setup.cfg') ... +' init'), Creating '/sample-bootstrapped/setup.cfg'. Creating directory '/sample-bootstrapped/bin'. Creating directory '/sample-bootstrapped/parts'. Creating directory '/sample-bootstrapped/eggs'. Creating directory '/sample-bootstrapped/develop-eggs'. Generated script '/sample-bootstrapped/bin/buildout'. Note that a basic setup.cfg was created for us. This is because we provided an 'init' argument. By default, the generated ``setup.cfg`` is as minimal as it could be: >>> cat(sample_bootstrapped, 'setup.cfg') [buildout] parts = We also get other buildout artifacts: >>> ls(sample_bootstrapped) d bin d develop-eggs d eggs d parts - setup.cfg >>> ls(sample_bootstrapped, 'bin') - buildout >>> _ = (ls(sample_bootstrapped, 'eggs'), ... ls(sample_bootstrapped, 'develop-eggs')) - setuptools-0.6-py2.3.egg - zc.buildout-1.0-py2.3.egg (We list both the eggs and develop-eggs directories because the buildout or setuptools egg could be installed in the develop-eggs directory if the original buildout had develop eggs for either buildout or setuptools.) If relative-paths is ``true``, the buildout script uses relative paths. >>> write(sample_bootstrapped, 'setup.cfg', ... ''' ... [buildout] ... relative-paths = true ... parts = ... ''') >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped, 'setup.cfg') ... +' bootstrap'), Generated script '/sample-bootstrapped/bin/buildout'. >>> buildout_script = join(sample_bootstrapped, 'bin', 'buildout') >>> import sys >>> if sys.platform.startswith('win'): ... buildout_script += '-script.py' >>> print open(buildout_script).read() # doctest: +ELLIPSIS #!... -S import os join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) import sys sys.path[0:0] = [ join(base, 'parts/buildout'), ] import os path = sys.path[0] if os.environ.get('PYTHONPATH'): path = os.pathsep.join([path, os.environ['PYTHONPATH']]) os.environ['BUILDOUT_ORIGINAL_PYTHONPATH'] = os.environ.get('PYTHONPATH', '') os.environ['PYTHONPATH'] = path import site # imports custom buildout-generated site.py import zc.buildout.buildout if __name__ == '__main__': sys.exit(zc.buildout.buildout.main()) Note that, in the above two examples, the buildout script was installed but not run. To run the buildout, we'd have to run the installed buildout script. If we have an existing buildout that already has a buildout.cfg, we'll normally use the bootstrap command instead of init. 
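For an existing project, the sequence is typically something like the following (an illustrative sketch; the path and project layout are made up)::

   cd /path/to/project    # the directory already contains a buildout.cfg
   buildout bootstrap     # use the installed zc.buildout to generate bin/buildout
   bin/buildout           # run the local script to install the parts

The bootstrap command reuses the configuration file it is given (buildout.cfg by default) rather than generating a new one.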
It will complain if there isn't a configuration file: >>> sample_bootstrapped2 = tmpdir('sample-bootstrapped2') >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped2, 'setup.cfg') ... +' bootstrap'), While: Initializing. Error: Couldn't open /sample-bootstrapped2/setup.cfg >>> write(sample_bootstrapped2, 'setup.cfg', ... """ ... [buildout] ... parts = ... """) >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped2, 'setup.cfg') ... +' bootstrap'), Creating directory '/sample-bootstrapped2/bin'. Creating directory '/sample-bootstrapped2/parts'. Creating directory '/sample-bootstrapped2/eggs'. Creating directory '/sample-bootstrapped2/develop-eggs'. Generated script '/sample-bootstrapped2/bin/buildout'. Similarly, if there is a configuration file and we use the init command, we'll get an error that the configuration file already exists: >>> print system(buildout ... +' -c'+os.path.join(sample_bootstrapped, 'setup.cfg') ... +' init'), While: Initializing. Error: '/sample-bootstrapped/setup.cfg' already exists. Initial eggs ------------ When using the ``init`` command, you can specify distribution requirements or paths to use: >>> cd(sample_bootstrapped) >>> remove('setup.cfg') >>> print system(buildout + ' -csetup.cfg init demo other ./src'), Creating '/sample-bootstrapped/setup.cfg'. Generated script '/sample-bootstrapped/bin/buildout'. Getting distribution for 'zc.recipe.egg<2dev'. Got zc.recipe.egg 1.3.3dev. Installing py. Getting distribution for 'demo'. Got demo 0.4c1. Getting distribution for 'other'. Got other 1.0. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. Generated script '/sample-bootstrapped/bin/demo'. Generated interpreter '/sample-bootstrapped/bin/py'. This causes a ``py`` part to be included that sets up a custom Python interpreter with the given requirements or paths: >>> cat('setup.cfg') [buildout] parts = py [py] recipe = zc.recipe.egg interpreter = py eggs = demo other extra-paths = ./src Passing requirements or paths causes the buildout to be run as part of initialization. In the example above, we got a number of distributions installed and two scripts generated. The first, ``demo``, was defined by the ``demo`` project. The second, ``py``, was defined by the generated configuration. It's a "custom interpreter" that behaves like a standard Python interpreter, except that it includes the specified eggs and extra paths in its Python path. We specified a source directory that didn't exist. Buildout created it for us: >>> ls('.') - .installed.cfg d bin d develop-eggs d eggs d parts - setup.cfg d src >>> uncd() .. Make sure it works if the dir is already there: >>> cd(sample_bootstrapped) >>> _ = system(buildout + ' -csetup.cfg buildout:parts=') >>> remove('setup.cfg') >>> print system(buildout + ' -csetup.cfg init demo other ./src'), Creating '/sample-bootstrapped/setup.cfg'. Installing py. Generated script '/sample-bootstrapped/bin/demo'. Generated interpreter '/sample-bootstrapped/bin/py'. .. cleanup >>> _ = system(buildout + ' -csetup.cfg buildout:parts=') >>> uncd() Newest and Offline Modes ------------------------ By default, buildout and recipes will try to find the newest versions of distributions needed to satisfy requirements. This can be very time-consuming, especially when incrementally working on setting up a buildout or working on a recipe. The buildout newest option can be used to suppress this. 
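For example, the check can be turned off in the configuration itself; this snippet (following the document's usual abbreviated form) is the configuration-file equivalent of the -N command-line option::

   [buildout]
   ...
   newest = false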
If the newest option is set to false, then new distributions won't be sought if an installed distribution meets requirements. The newest option can be set to false using the -N command-line option. The offline option goes a bit further. If the buildout offline option is given a value of "true", the buildout and recipes that are aware of the option will avoid doing network access. This is handy when running the buildout when not connected to the internet. It also makes buildouts run much faster. This option is typically set using the buildout -o option. Preferring Final Releases ------------------------- Currently, when searching for new releases of your project's dependencies, the newest available release is used. This isn't usually ideal, as you may get a development release or alpha releases not ready to be widely used. You can request that final releases be preferred using the ``prefer-final`` option in the buildout section:: [buildout] ... prefer-final = true When the ``prefer-final`` option is set to true, then when searching for new releases, final releases are preferred. If there are final releases that satisfy distribution requirements, then those releases are used even if newer non-final releases are available. In buildout version 2, all final releases will be preferred by default--that is ``prefer-final`` will also default to 'true'. You will then need to use a 'false' value for ``prefer-final`` to get the newest releases. A separate option controls the behavior of the build system itself. When buildout looks for recipes, extensions, and for updates to itself, it does prefer final releases by default, as of the 1.5.0 release. The ``accept-buildout-test-releases`` option will let you override this behavior. However, it is typically changed by the --accept-buildout-test-releases option to the bootstrap script, since bootstrapping is the first step to selecting a buildout. Finding distributions --------------------- By default, buildout searches the Python Package Index when looking for distributions. You can, instead, specify your own index to search using the `index` option:: [buildout] ... index = http://index.example.com/ This index, or the default of http://pypi.python.org/simple/ if no index is specified, will always be searched for distributions unless running buildout with options that prevent searching for distributions. The latest version of the distribution that meets the requirements of the buildout will always be used. You can also specify more locations to search for distributions using the `find-links` option. All locations specified will be searched for distributions along with the package index as described before. Locations can be urls:: [buildout] ... find-links = http://download.zope.org/distribution/ They can also be directories on disk:: [buildout] ... find-links = /some/path Finally, they can also be direct paths to distributions:: [buildout] ... find-links = /some/path/someegg-1.0.0-py2.3.egg Any number of locations can be specified in the `find-links` option:: [buildout] ... find-links = http://download.zope.org/distribution/ /some/otherpath /some/path/someegg-1.0.0-py2.3.egg Dependency links ---------------- By default buildout will obey the setuptools dependency_links metadata when it looks for dependencies. This behavior can be controlled with the use-dependency-links buildout option:: [buildout] ... use-dependency-links = false The option defaults to true. 
If you set it to false, then dependency links are only looked for in the locations specified by find-links. Controlling the installation database ------------------------------------- The buildout installed option is used to specify the file used to save information on installed parts. This option is initialized to ".installed.cfg", but it can be overridden in the configuration file or on the command line: >>> write('buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = debug ... ... [debug] ... recipe = recipes:debug ... """) >>> print system(buildout+' buildout:installed=inst.cfg'), Develop: '/sample-buildout/recipes' Installing debug. recipe recipes:debug >>> ls(sample_buildout) - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d demo d develop-eggs d eggs - inst.cfg d parts d recipes The installation database can be disabled by supplying an empty buildout installed option: >>> os.remove('inst.cfg') >>> print system(buildout+' buildout:installed='), Develop: '/sample-buildout/recipes' Installing debug. recipe recipes:debug >>> ls(sample_buildout) - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d demo d develop-eggs d eggs d parts d recipes Note that there will be no installation database if there are no parts: >>> write('buildout.cfg', ... """ ... [buildout] ... parts = ... """) >>> print system(buildout+' buildout:installed=inst.cfg'), >>> ls(sample_buildout) - b1.cfg - b2.cfg - base.cfg d bin - buildout.cfg d demo d develop-eggs d eggs d parts d recipes Extensions ---------- A feature allows code to be loaded and run after configuration files have been read but before the buildout has begun any processing. The intent is to allow special plugins such as urllib2 request handlers to be loaded. To load an extension, we use the extensions option and list one or more distribution requirements, on separate lines. The distributions named will be loaded and any ``zc.buildout.extension`` entry points found will be called with the buildout as an argument. When buildout finishes processing, any ``zc.buildout.unloadextension`` entry points found will be called with the buildout as an argument. Let's create a sample extension in our sample buildout created in the previous section: >>> mkdir(sample_bootstrapped, 'demo') >>> write(sample_bootstrapped, 'demo', 'demo.py', ... """ ... def ext(buildout): ... print 'ext', list(buildout) ... def unload(buildout): ... print 'unload', list(buildout) ... """) >>> write(sample_bootstrapped, 'demo', 'setup.py', ... """ ... from setuptools import setup ... ... setup( ... name = "demo", ... entry_points = { ... 'zc.buildout.extension': ['ext = demo:ext'], ... 'zc.buildout.unloadextension': ['ext = demo:unload'], ... }, ... ) ... """) Our extension just prints out the word 'demo', and lists the sections found in the buildout passed to it. We'll update our buildout.cfg to list the demo directory as a develop egg to be built: >>> write(sample_bootstrapped, 'buildout.cfg', ... """ ... [buildout] ... develop = demo ... parts = ... """) >>> os.chdir(sample_bootstrapped) >>> print system(os.path.join(sample_bootstrapped, 'bin', 'buildout')), Develop: '/sample-bootstrapped/demo' Now we can add the extensions option. We were a bit tricky and ran the buildout once with the demo develop egg defined but without the extension option. This is because extensions are loaded before the buildout creates develop eggs. We needed to use a separate buildout run to create the develop egg. 
Normally, when eggs are loaded from the network, we wouldn't need to do anything special. >>> write(sample_bootstrapped, 'buildout.cfg', ... """ ... [buildout] ... develop = demo ... extensions = demo ... parts = ... """) We see that our extension is loaded and executed: >>> print system(os.path.join(sample_bootstrapped, 'bin', 'buildout')), ext ['buildout'] Develop: '/sample-bootstrapped/demo' unload ['buildout'] Allow hosts ----------- In some environments, the links visited by `zc.buildout` can be forbidden by paranoid firewalls. These URLs might be on the chain of links visited by `zc.buildout`, whether they are defined in the `find-links` option or by various eggs in their `url`, `download_url`, or `dependency_links` metadata. This is even harder to track because package_index works like a spider and might visit links and go to other locations. The `allow-hosts` option provides a way to prevent this, and works exactly like the one provided in `easy_install`. You can provide a list of allowed hosts, together with wildcards:: [buildout] ... allow-hosts = *.python.org example.com URLs that do not match these hosts will not be visited. .. [#future_recipe_methods] In the future, additional methods may be added. Older recipes with fewer methods will still be supported. .. [#packaging_info] If we wanted to create a distribution from this package, we would need to specify much more information. See the setuptools documentation. Always unzipping eggs ===================== By default, zc.buildout doesn't unzip zip-safe eggs. >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... find-links = %(link_server)s ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = demo ... ''' % globals()) >>> _ = system(buildout) >>> ls('eggs') - demo-0.4c1-py2.4.egg - demoneeded-1.2c1-py2.4.egg d setuptools-0.6c8-py2.4.egg - zc.buildout.egg-link This follows the policy followed by setuptools itself. Experience shows this policy to be inconvenient. Zipped eggs make debugging more difficult and often import more slowly. You can include an unzip option in the buildout section to change the default unzipping policy. >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... find-links = %(link_server)s ... unzip = true ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = demo ... ''' % globals()) >>> import os >>> for name in os.listdir('eggs'): ... if name.startswith('demo'): ... remove('eggs', name) >>> _ = system(buildout) >>> ls('eggs') d demo-0.4c1-py2.4.egg d demoneeded-1.2c1-py2.4.egg d setuptools-0.6c8-py2.4.egg - zc.buildout.egg-link Repeatable buildouts: controlling eggs used =========================================== One of the goals of zc.buildout is to provide enough control to make buildouts repeatable. It should be possible to check the buildout configuration files for a project into a version control system and later use the checked-in files to get the same buildout, subject to changes in the environment outside the buildout. An advantage of using Python eggs is that dependencies of the eggs used are automatically determined and used. The automatic inclusion of dependent distributions is at odds with the goal of repeatable buildouts. To support repeatable buildouts, a versions section can be created with options for each distribution name whose version is to be fixed. The section can then be specified via the buildout versions option. To see how this works, we'll create two versions of a recipe egg: >>> mkdir('recipe') >>> write('recipe', 'recipe.py', ... ''' ...
class Recipe: ... def __init__(*a): pass ... def install(self): ... print 'recipe v1' ... return () ... update = install ... ''') >>> write('recipe', 'setup.py', ... ''' ... from setuptools import setup ... setup(name='spam', version='1', py_modules=['recipe'], ... entry_points={'zc.buildout': ['default = recipe:Recipe']}, ... ) ... ''') >>> write('recipe', 'README', '') >>> print system(buildout+' setup recipe bdist_egg'), # doctest: +ELLIPSIS Running setup script 'recipe/setup.py'. ... >>> rmdir('recipe', 'build') >>> write('recipe', 'recipe.py', ... ''' ... class Recipe: ... def __init__(*a): pass ... def install(self): ... print 'recipe v2' ... return () ... update = install ... ''') >>> write('recipe', 'setup.py', ... ''' ... from setuptools import setup ... setup(name='spam', version='2', py_modules=['recipe'], ... entry_points={'zc.buildout': ['default = recipe:Recipe']}, ... ) ... ''') >>> print system(buildout+' setup recipe bdist_egg'), # doctest: +ELLIPSIS Running setup script 'recipe/setup.py'. ... and we'll configure a buildout to use it: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = foo ... find-links = %s ... ... [foo] ... recipe = spam ... ''' % join('recipe', 'dist')) If we run the buildout, it will use version 2: >>> print system(buildout), Getting distribution for 'spam'. Got spam 2. Installing foo. recipe v2 We can specify a versions section that lists our recipe and name it in the buildout section: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = foo ... find-links = %s ... versions = release-1 ... ... [release-1] ... spam = 1 ... eggs = 2.2 ... ... [foo] ... recipe = spam ... ''' % join('recipe', 'dist')) Here we created a release-1 section listing the version 1 for the spam distribution. We told the buildout to use it by specifying release-1 as in the versions option. Now, if we run the buildout, we'll use version 1 of the spam recipe: >>> print system(buildout), Getting distribution for 'spam==1'. Got spam 1. Uninstalling foo. Installing foo. recipe v1 Running the buildout in verbose mode will help us get information about versions used. If we run the buildout in verbose mode without specifying a versions section: >>> print system(buildout+' buildout:versions= -v'), # doctest: +ELLIPSIS Installing 'zc.buildout >=1.99, <2dev', 'setuptools'. We have a develop egg: zc.buildout 1.0.0. We have the best distribution that satisfies 'setuptools'. Picked: setuptools = 0.6 Installing 'spam'. We have the best distribution that satisfies 'spam'. Picked: spam = 2. Uninstalling foo. Installing foo. recipe v2 We'll get output that includes lines that tell us what versions buildout chose a for us, like:: zc.buildout.easy_install.picked: spam = 2 This allows us to discover versions that are picked dynamically, so that we can fix them in a versions section. If we run the buildout with the versions section: >>> print system(buildout+' -v'), # doctest: +ELLIPSIS Installing 'zc.buildout >=1.99, <2dev', 'setuptools'. We have a develop egg: zc.buildout 1.0.0. We have the best distribution that satisfies 'setuptools'. Picked: setuptools = 0.6 Installing 'spam'. We have the distribution that satisfies 'spam==1'. Uninstalling foo. Installing foo. recipe v1 We won't get output for the spam distribution, which we didn't pick, but we will get output for setuptools, which we didn't specify versions for. You can request buildout to generate an error if it picks any versions: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = foo ... find-links = %s ... 
versions = release-1 ... allow-picked-versions = false ... ... [release-1] ... spam = 1 ... eggs = 2.2 ... ... [foo] ... recipe = spam ... ''' % join('recipe', 'dist')) Using the download utility ========================== The ``zc.buildout.download`` module provides a download utility that handles the details of downloading files needed for a buildout run from the internet. It downloads files to the local file system, using the download cache if desired and optionally checking the downloaded files' MD5 checksum. We setup an HTTP server that provides a file we want to download: >>> server_data = tmpdir('sample_files') >>> write(server_data, 'foo.txt', 'This is a foo text.') >>> server_url = start_server(server_data) We also use a fresh directory for temporary files in order to make sure that all temporary files have been cleaned up in the end: >>> import tempfile >>> old_tempdir = tempfile.tempdir >>> tempfile.tempdir = tmpdir('tmp') Downloading without using the cache ----------------------------------- If no download cache should be used, the download utility is instantiated without any arguments: >>> from zc.buildout.download import Download >>> download = Download() >>> print download.cache_dir None Downloading a file is achieved by calling the utility with the URL as an argument. A tuple is returned that consists of the path to the downloaded copy of the file and a boolean value indicating whether this is a temporary file meant to be cleaned up during the same buildout run: >>> path, is_temp = download(server_url+'foo.txt') >>> print path /.../buildout-... >>> cat(path) This is a foo text. As we aren't using the download cache and haven't specified a target path either, the download has ended up in a temporary file: >>> is_temp True >>> import tempfile >>> path.startswith(tempfile.gettempdir()) True We are responsible for cleaning up temporary files behind us: >>> remove(path) When trying to access a file that doesn't exist, we'll get an exception: >>> try: download(server_url+'not-there') # doctest: +ELLIPSIS ... except: print 'download error' ... else: print 'woops' download error Downloading a local file doesn't produce a temporary file but simply returns the local file itself: >>> download(join(server_data, 'foo.txt')) ('/sample_files/foo.txt', False) We can also have the downloaded file's MD5 sum checked: >>> try: from hashlib import md5 ... except ImportError: from md5 import new as md5 >>> path, is_temp = download(server_url+'foo.txt', ... md5('This is a foo text.').hexdigest()) >>> is_temp True >>> remove(path) >>> download(server_url+'foo.txt', ... md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch downloading 'http://localhost/foo.txt' The error message in the event of an MD5 checksum mismatch for a local file reads somewhat differently: >>> download(join(server_data, 'foo.txt'), ... md5('This is a foo text.').hexdigest()) ('/sample_files/foo.txt', False) >>> download(join(server_data, 'foo.txt'), ... md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch for local resource at '/sample_files/foo.txt'. Finally, we can download the file to a specified place in the file system: >>> target_dir = tmpdir('download-target') >>> path, is_temp = download(server_url+'foo.txt', ... path=join(target_dir, 'downloaded.txt')) >>> print path /download-target/downloaded.txt >>> cat(path) This is a foo text. 
>>> is_temp False Trying to download a file in offline mode will result in an error: >>> download = Download(cache=None, offline=True) >>> download(server_url+'foo.txt') Traceback (most recent call last): UserError: Couldn't download 'http://localhost/foo.txt' in offline mode. As an exception to this rule, file system paths and URLs in the ``file`` scheme will still work: >>> cat(download(join(server_data, 'foo.txt'))[0]) This is a foo text. >>> cat(download('file:' + join(server_data, 'foo.txt'))[0]) This is a foo text. >>> remove(path) Downloading using the download cache ------------------------------------ In order to make use of the download cache, we need to configure the download utility differently. To do this, we pass a directory path as the ``cache`` attribute upon instantiation: >>> cache = tmpdir('download-cache') >>> download = Download(cache=cache) >>> print download.cache_dir /download-cache/ Simple usage ~~~~~~~~~~~~ When using the cache, a file will be stored in the cache directory when it is first downloaded. The file system path returned by the download utility points to the cached copy: >>> ls(cache) >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/foo.txt >>> cat(path) This is a foo text. >>> is_temp False Whenever the file is downloaded again, the cached copy is used. Let's change the file on the server to see this: >>> write(server_data, 'foo.txt', 'The wrong text.') >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/foo.txt >>> cat(path) This is a foo text. If we specify an MD5 checksum for a file that is already in the cache, the cached copy's checksum will be verified: >>> download(server_url+'foo.txt', md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch for cached download from 'http://localhost/foo.txt' at '/download-cache/foo.txt' Trying to access another file at a different URL which has the same base name will result in the cached copy being used: >>> mkdir(server_data, 'other') >>> write(server_data, 'other', 'foo.txt', 'The wrong text.') >>> path, is_temp = download(server_url+'other/foo.txt') >>> print path /download-cache/foo.txt >>> cat(path) This is a foo text. Given a target path for the download, the utility will provide a copy of the file at that location both when first downloading the file and when using a cached copy: >>> remove(cache, 'foo.txt') >>> ls(cache) >>> write(server_data, 'foo.txt', 'This is a foo text.') >>> path, is_temp = download(server_url+'foo.txt', ... path=join(target_dir, 'downloaded.txt')) >>> print path /download-target/downloaded.txt >>> cat(path) This is a foo text. >>> is_temp False >>> ls(cache) - foo.txt >>> remove(path) >>> write(server_data, 'foo.txt', 'The wrong text.') >>> path, is_temp = download(server_url+'foo.txt', ... path=join(target_dir, 'downloaded.txt')) >>> print path /download-target/downloaded.txt >>> cat(path) This is a foo text. >>> is_temp False In offline mode, downloads from any URL will be successful if the file is found in the cache: >>> download = Download(cache=cache, offline=True) >>> cat(download(server_url+'foo.txt')[0]) This is a foo text. 
Local resources will be cached just like any others since download caches are sometimes used to create source distributions: >>> remove(cache, 'foo.txt') >>> ls(cache) >>> write(server_data, 'foo.txt', 'This is a foo text.') >>> download = Download(cache=cache) >>> cat(download('file:' + join(server_data, 'foo.txt'), path=path)[0]) This is a foo text. >>> ls(cache) - foo.txt >>> remove(cache, 'foo.txt') >>> cat(download(join(server_data, 'foo.txt'), path=path)[0]) This is a foo text. >>> ls(cache) - foo.txt >>> remove(cache, 'foo.txt') However, resources with checksum mismatches will not be copied to the cache: >>> download(server_url+'foo.txt', md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch downloading 'http://localhost/foo.txt' >>> ls(cache) >>> remove(path) If the file is completely missing it should notify the user of the error: >>> download(server_url+'bar.txt') Traceback (most recent call last): UserError: Error downloading extends for URL http://localhost/bar.txt: (404, 'Not Found') >>> ls(cache) Finally, let's see what happens if the download cache to be used doesn't exist as a directory in the file system yet: >>> Download(cache=join(cache, 'non-existent'))(server_url+'foo.txt') Traceback (most recent call last): UserError: The directory: '/download-cache/non-existent' to be used as a download cache doesn't exist. Using namespace sub-directories of the download cache ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ It is common to store cached copies of downloaded files within sub-directories of the download cache to keep some degree of order. For example, zc.buildout stores downloaded distributions in a sub-directory named "dist". Those sub-directories are also known as namespaces. So far, we haven't specified any namespaces to use, so the download utility stored files directly inside the download cache. Let's use a namespace "test" instead: >>> download = Download(cache=cache, namespace='test') >>> print download.cache_dir /download-cache/test The namespace sub-directory hasn't been created yet: >>> ls(cache) Downloading a file now creates the namespace sub-directory and places a copy of the file inside it: >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/test/foo.txt >>> ls(cache) d test >>> ls(cache, 'test') - foo.txt >>> cat(path) This is a foo text. >>> is_temp False The next time we want to download that file, the copy from inside the cache namespace is used. To see this clearly, we put a file with the same name but different content both on the server and in the cache's root directory: >>> write(server_data, 'foo.txt', 'The wrong text.') >>> write(cache, 'foo.txt', 'The wrong text.') >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/test/foo.txt >>> cat(path) This is a foo text. >>> rmdir(cache, 'test') >>> remove(cache, 'foo.txt') >>> write(server_data, 'foo.txt', 'This is a foo text.') Using a hash of the URL as the filename in the cache ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ So far, the base name of the downloaded file read from the URL has been used for the name of the cached copy of the file. This may not be desirable in some cases, for example when downloading files from different locations that have the same base name due to some naming convention, or if the file content depends on URL parameters. 
In such cases, an MD5 hash of the complete URL may be used as the filename in the cache: >>> download = Download(cache=cache, hash_name=True) >>> path, is_temp = download(server_url+'foo.txt') >>> print path /download-cache/09f5793fcdc1716727f72d49519c688d >>> cat(path) This is a foo text. >>> ls(cache) - 09f5793fcdc1716727f72d49519c688d The path was printed just to illustrate matters; we cannot know the real checksum since we don't know which port the server happens to listen at when the test is run, so we don't actually know the full URL of the file. Let's check that the checksum actually belongs to the particular URL used: >>> path.lower() == join(cache, md5(server_url+'foo.txt').hexdigest()).lower() True The cached copy is used when downloading the file again: >>> write(server_data, 'foo.txt', 'The wrong text.') >>> (path, is_temp) == download(server_url+'foo.txt') True >>> cat(path) This is a foo text. >>> ls(cache) - 09f5793fcdc1716727f72d49519c688d If we change the URL, even in such a way that it keeps the base name of the file the same, the file will be downloaded again this time and put in the cache under a different name: >>> path2, is_temp = download(server_url+'other/foo.txt') >>> print path2 /download-cache/537b6d73267f8f4447586989af8c470e >>> path == path2 False >>> path2.lower() == join(cache, md5(server_url+'other/foo.txt').hexdigest()).lower() True >>> cat(path) This is a foo text. >>> cat(path2) The wrong text. >>> ls(cache) - 09f5793fcdc1716727f72d49519c688d - 537b6d73267f8f4447586989af8c470e >>> remove(path) >>> remove(path2) >>> write(server_data, 'foo.txt', 'This is a foo text.') Using the cache purely as a fall-back ------------------------------------- Sometimes it is desirable to try downloading a file from the net if at all possible, and use the cache purely as a fall-back option when a server is down or if we are in offline mode. This mode is only in effect if a download cache is configured in the first place: >>> download = Download(cache=cache, fallback=True) >>> print download.cache_dir /download-cache/ A downloaded file will be cached: >>> ls(cache) >>> path, is_temp = download(server_url+'foo.txt') >>> ls(cache) - foo.txt >>> cat(cache, 'foo.txt') This is a foo text. >>> is_temp False If the file cannot be served, the cached copy will be used: >>> remove(server_data, 'foo.txt') >>> try: Download()(server_url+'foo.txt') # doctest: +ELLIPSIS ... except: print 'download error' ... else: print 'woops' download error >>> path, is_temp = download(server_url+'foo.txt') >>> cat(path) This is a foo text. >>> is_temp False Similarly, if the file is served but we're in offline mode, we'll fall back to using the cache: >>> write(server_data, 'foo.txt', 'The wrong text.') >>> get(server_url+'foo.txt') 'The wrong text.' >>> offline_download = Download(cache=cache, offline=True, fallback=True) >>> path, is_temp = offline_download(server_url+'foo.txt') >>> print path /download-cache/foo.txt >>> cat(path) This is a foo text. >>> is_temp False However, when downloading the file normally with the cache being used in fall-back mode, the file will be downloaded from the net and the cached copy will be replaced with the new content: >>> cat(download(server_url+'foo.txt')[0]) The wrong text. >>> cat(cache, 'foo.txt') The wrong text. 
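In calling code, the fall-back behaviour demonstrated above is usually wrapped in a small helper. The following is only a sketch (``fetch``, ``cache_dir`` and ``url`` are illustrative names, not part of the API); it prefers a fresh download, falls back to the cached copy, and cleans up any temporary file, which, as shown earlier, is the caller's responsibility::

    import os
    from zc.buildout.download import Download

    def fetch(url, cache_dir, md5sum=None):
        # Prefer a fresh copy from the net; fall back to the cached copy
        # when the server is unreachable or we are in offline mode.
        download = Download(cache=cache_dir, fallback=True)
        path, is_temp = download(url, md5sum=md5sum)
        try:
            return open(path).read()
        finally:
            if is_temp:
                os.remove(path)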
When trying to download a resource whose checksum does not match, the cached copy will neither be used nor overwritten: >>> write(server_data, 'foo.txt', 'This is a foo text.') >>> download(server_url+'foo.txt', md5('The wrong text.').hexdigest()) Traceback (most recent call last): ChecksumError: MD5 checksum mismatch downloading 'http://localhost/foo.txt' >>> cat(cache, 'foo.txt') The wrong text. Configuring the download utility from buildout options ------------------------------------------------------ The configuration options explained so far derive from the build logic implemented by the calling code. Other options configure the download utility for use in a particular project or buildout run; they are read from the ``buildout`` configuration section. The latter can be passed directly as the first argument to the download utility's constructor. The location of the download cache is specified by the ``download-cache`` option: >>> download = Download({'download-cache': cache}, namespace='cmmi') >>> print download.cache_dir /download-cache/cmmi If the ``download-cache`` option specifies a relative path, it is understood relative to the current working directory, or to the buildout directory if that is given: >>> download = Download({'download-cache': 'relative-cache'}) >>> print download.cache_dir /sample-buildout/relative-cache/ >>> download = Download({'directory': join(sample_buildout, 'root'), ... 'download-cache': 'relative-cache'}) >>> print download.cache_dir /sample-buildout/root/relative-cache/ Keyword parameters take precedence over the corresponding options: >>> download = Download({'download-cache': cache}, cache=None) >>> print download.cache_dir None Whether to assume offline mode can be inferred from either the ``offline`` or the ``install-from-cache`` option. As usual with zc.buildout, these options must assume one of the values 'true' and 'false': >>> download = Download({'offline': 'true'}) >>> download.offline True >>> download = Download({'offline': 'false'}) >>> download.offline False >>> download = Download({'install-from-cache': 'true'}) >>> download.offline True >>> download = Download({'install-from-cache': 'false'}) >>> download.offline False These two options are combined using logical 'or': >>> download = Download({'offline': 'true', 'install-from-cache': 'false'}) >>> download.offline True >>> download = Download({'offline': 'false', 'install-from-cache': 'true'}) >>> download.offline True The ``offline`` keyword parameter takes precedence over both the ``offline`` and ``install-from-cache`` options: >>> download = Download({'offline': 'true'}, offline=False) >>> download.offline False >>> download = Download({'install-from-cache': 'false'}, offline=True) >>> download.offline True Regressions ----------- MD5 checksum calculation needs to be reliable on all supported systems, which requires text files to be treated as binary to avoid implicit line-ending conversions: >>> text = 'First line of text.\r\nSecond line.\r\n' >>> f = open(join(server_data, 'foo.txt'), 'wb') >>> f.write(text) >>> f.close() >>> path, is_temp = Download()(server_url+'foo.txt', md5(text).hexdigest()) >>> remove(path) When "downloading" a directory given by file-system path or ``file:`` URL and using a download cache at the same time, the cached directory wasn't handled correctly. Consequently, the cache was defeated and an attempt to cache the directory a second time broke. 
This is how it should work: >>> download = Download(cache=cache) >>> dirpath = join(server_data, 'some_directory') >>> mkdir(dirpath) >>> dest, _ = download(dirpath) If we now modify the source tree, the second download will produce the original one from the cache: >>> mkdir(join(dirpath, 'foo')) >>> ls(dirpath) d foo >>> dest, _ = download(dirpath) >>> ls(dest) Clean up -------- We should have cleaned up all temporary files created by downloading things: >>> ls(tempfile.tempdir) Reset the global temporary directory: >>> tempfile.tempdir = old_tempdir Using a download cache ====================== Normally, when distributions are installed, if any processing is needed, they are downloaded from the internet to a temporary directory and then installed from there. A download cache can be used to avoid the download step. This can be useful to reduce network access and to create source distributions of an entire buildout. The buildout download-cache option can be used to specify a directory to be used as a download cache. In this example, we'll create a directory to hold the cache: >>> cache = tmpdir('cache') And set up a buildout that downloads some eggs: >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... download-cache = %(cache)s ... find-links = %(link_server)s ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = demo ==0.2 ... ''' % globals()) We specified a link server that has some distributions available for download: >>> print get(link_server), bigdemo-0.1-py2.4.egg
demo-0.1-py2.4.egg
demo-0.2-py2.4.egg
demo-0.3-py2.4.egg
demo-0.4c1-py2.4.egg
demoneeded-1.0.zip
demoneeded-1.1.zip
demoneeded-1.2c1.zip
extdemo-1.4.zip
index/
other-1.0-py2.4.egg
We'll enable logging on the link server so we can see what's going on: >>> get(link_server+'enable_server_logging') GET 200 /enable_server_logging '' We also specified a download cache. If we run the buildout, we'll see the eggs installed from the link server as usual: >>> print system(buildout), GET 200 / GET 200 /demo-0.2-py2.4.egg GET 200 /demoneeded-1.2c1.zip Installing eggs. Getting distribution for 'demo==0.2'. Got demo 0.2. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. Generated script '/sample-buildout/bin/demo'. We'll also get the download cache populated. The buildout doesn't put files in the cache directly. It creates an intermediate directory, dist: >>> ls(cache) d dist >>> ls(cache, 'dist') - demo-0.2-py2.4.egg - demoneeded-1.2c1.zip If we remove the installed eggs from eggs directory and re-run the buildout: >>> import os >>> for f in os.listdir('eggs'): ... if f.startswith('demo'): ... remove('eggs', f) >>> print system(buildout), GET 200 / Updating eggs. Getting distribution for 'demo==0.2'. Got demo 0.2. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. We see that the distributions aren't downloaded, because they're downloaded from the cache. Installing solely from a download cache --------------------------------------- A download cache can be used as the basis of application source releases. In an application source release, we want to distribute an application that can be built without making any network accesses. In this case, we distribute a buildout with download cache and tell the buildout to install from the download cache only, without making network accesses. The buildout install-from-cache option can be used to signal that packages should be installed only from the download cache. Let's remove our installed eggs and run the buildout with the install-from-cache option set to true: >>> for f in os.listdir('eggs'): ... if f.startswith('demo'): ... remove('eggs', f) >>> write('buildout.cfg', ... ''' ... [buildout] ... parts = eggs ... download-cache = %(cache)s ... install-from-cache = true ... find-links = %(link_server)s ... ... [eggs] ... recipe = zc.recipe.egg ... eggs = demo ... ''' % globals()) >>> print system(buildout), Uninstalling eggs. Installing eggs. Getting distribution for 'demo'. Got demo 0.2. Getting distribution for 'demoneeded'. Got demoneeded 1.2c1. Generated script '/sample-buildout/bin/demo'. Caching extended configuration ============================== As mentioned in the general buildout documentation, configuration files can extend each other, including the ability to download configuration being extended from a URL. If desired, zc.buildout caches downloaded configuration in order to be able to use it when run offline. As we're going to talk about downloading things, let's start an HTTP server. Also, all of the following will take place inside the sample buildout. >>> server_data = tmpdir('server_data') >>> server_url = start_server(server_data) >>> cd(sample_buildout) We also use a fresh directory for temporary files in order to make sure that all temporary files have been cleaned up in the end: >>> import tempfile >>> old_tempdir = tempfile.tempdir >>> tempfile.tempdir = tmpdir('tmp') Basic use of the extends cache ------------------------------ We put some base configuration on a server and reference it from a sample buildout: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... foo = bar ... """) >>> write('buildout.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... 
""" % server_url) When trying to run this buildout offline, we'll find that we cannot read all of the required configuration: >>> print system(buildout + ' -o') While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. Trying the same online, we can: >>> print system(buildout) Unused options for buildout: 'foo'. As long as we haven't said anything about caching downloaded configuration, nothing gets cached. Offline mode will still cause the buildout to fail: >>> print system(buildout + ' -o') While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. Let's now specify a cache for base configuration files. This cache is different from the download cache used by recipes for caching distributions and other files; one might, however, use a namespace subdirectory of the download cache for it. The configuration cache we specify will be created when running buildout and the base.cfg file will be put in it (with the file name being a hash of the complete URL): >>> mkdir('cache') >>> write('buildout.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... extends-cache = cache ... """ % server_url) >>> print system(buildout) Unused options for buildout: 'foo'. >>> cache = join(sample_buildout, 'cache') >>> ls(cache) - 5aedc98d7e769290a29d654a591a3a45 >>> import os >>> cat(cache, os.listdir(cache)[0]) [buildout] parts = foo = bar We can now run buildout offline as it will read base.cfg from the cache: >>> print system(buildout + ' -o') Unused options for buildout: 'foo'. The cache is being used purely as a fall-back in case we are offline or don't have access to a configuration file to be downloaded. As long as we are online, buildout attempts to download a fresh copy of each file even if a cached copy of the file exists. To see this, we put different configuration in the same place on the server and run buildout in offline mode so it takes base.cfg from the cache: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... bar = baz ... """) >>> print system(buildout + ' -o') Unused options for buildout: 'foo'. In online mode, buildout will download and use the modified version: >>> print system(buildout) Unused options for buildout: 'bar'. Trying offline mode again, the new version will be used as it has been put in the cache now: >>> print system(buildout + ' -o') Unused options for buildout: 'bar'. Clean up: >>> rmdir(cache) Specifying extends cache and offline mode ----------------------------------------- Normally, the values of buildout options such as the location of a download cache or whether to use offline mode are determined by first reading the user's default configuration, updating it with the project's configuration and finally applying command-line options. User and project configuration are assembled by reading a file such as ``~/.buildout/default.cfg``, ``buildout.cfg`` or a URL given on the command line, recursively (depth-first) downloading any base configuration specified by the ``buildout:extends`` option read from each of those config files, and finally evaluating each config file to provide default values for options not yet read. This works fine for all options that do not influence how configuration is downloaded in the first place. The ``extends-cache`` and ``offline`` options, however, are treated differently from the procedure described in order to make it simple and obvious to see where a particular configuration file came from under any particular circumstances. 
- Offline and extends-cache settings are read from the two root config files exclusively. Otherwise one could construct configuration files that, when read, imply that they should have been read from a different source than they have. Also, specifying the extends cache within a file that might have to be taken from the cache before being read wouldn't make a lot of sense. - Offline and extends-cache settings given by the user's defaults apply to the process of assembling the project's configuration. If no extends cache has been specified by the user's default configuration, the project's root config file must be available, be it from disk or from the net. - Offline mode turned on by the ``-o`` command line option is honoured from the beginning even though command line options are applied to the configuration last. If offline mode is not requested by the command line, it may be switched on by either the user's or the project's config root. Extends cache ~~~~~~~~~~~~~ Let's see the above rules in action. We create a new home directory for our user and write user and project configuration that recursively extends online bases, using different caches: >>> mkdir('home') >>> mkdir('home', '.buildout') >>> mkdir('cache') >>> mkdir('user-cache') >>> os.environ['HOME'] = join(sample_buildout, 'home') >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... extends-cache = user-cache ... """) >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... """ % server_url) >>> write(server_data, 'base_default.cfg', """\ ... [buildout] ... foo = bar ... offline = false ... """) >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... extends-cache = cache ... """) >>> write('fancy.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... """ % server_url) >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... offline = false ... """) Buildout will now assemble its configuration from all of these 6 files, defaults first. The online resources end up in the respective extends caches: >>> print system(buildout) Unused options for buildout: 'foo'. >>> ls('user-cache') - 10e772cf422123ef6c64ae770f555740 >>> cat('user-cache', os.listdir('user-cache')[0]) [buildout] foo = bar offline = false >>> ls('cache') - c72213127e6eb2208a3e1fc1dba771a7 >>> cat('cache', os.listdir('cache')[0]) [buildout] parts = offline = false If, on the other hand, the extends caches are specified in files that get extended themselves, they won't be used for assembling the configuration they belong to (user's or project's, resp.). The extends cache specified by the user's defaults does, however, apply to downloading project configuration. Let's rewrite the config files, clean out the caches and re-run buildout: >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... """) >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... extends-cache = user-cache ... """ % server_url) >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... """) >>> write('fancy.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... extends-cache = cache ... """ % server_url) >>> remove('user-cache', os.listdir('user-cache')[0]) >>> remove('cache', os.listdir('cache')[0]) >>> print system(buildout) Unused options for buildout: 'foo'. 
>>> ls('user-cache') - 0548bad6002359532de37385bb532e26 >>> cat('user-cache', os.listdir('user-cache')[0]) [buildout] parts = offline = false >>> ls('cache') Clean up: >>> rmdir('user-cache') >>> rmdir('cache') Offline mode and installation from cache ----------------------------~~~~~~~~~~~~ If we run buildout in offline mode now, it will fail because it cannot get at the remote configuration file needed by the user's defaults: >>> print system(buildout + ' -o') While: Initializing. Error: Couldn't download 'http://localhost/base_default.cfg' in offline mode. Let's now successively turn on offline mode by different parts of the configuration and see when buildout applies this setting in each case: >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... offline = true ... """) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base_default.cfg' in offline mode. >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... """) >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... offline = true ... """ % server_url) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... """ % server_url) >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... offline = true ... """) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... """) >>> write('fancy.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... offline = true ... """ % server_url) >>> print system(buildout) Unused options for buildout: 'foo'. The ``install-from-cache`` option is treated accordingly: >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... install-from-cache = true ... """) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base_default.cfg' in offline mode. >>> write('home', '.buildout', 'default.cfg', """\ ... [buildout] ... extends = fancy_default.cfg ... """) >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... install-from-cache = true ... """ % server_url) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. >>> write('home', '.buildout', 'fancy_default.cfg', """\ ... [buildout] ... extends = %sbase_default.cfg ... """ % server_url) >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... install-from-cache = true ... """) >>> print system(buildout) While: Initializing. Error: Couldn't download 'http://localhost/base.cfg' in offline mode. >>> write('buildout.cfg', """\ ... [buildout] ... extends = fancy.cfg ... """) >>> write('fancy.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... install-from-cache = true ... """ % server_url) >>> print system(buildout) While: Installing. Checking for upgrades. An internal error occurred ... 
ValueError: install_from_cache set to true with no download cache >>> rmdir('home', '.buildout') Newest and non-newest behaviour for extends cache ------------------------------------------------- While offline mode forbids network access completely, 'newest' mode determines whether to look for updated versions of a resource even if some version of it is already present locally. If we run buildout in newest mode (``newest = true``), the configuration files are updated with each run: >>> mkdir("cache") >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... """) >>> write('buildout.cfg', """\ ... [buildout] ... extends-cache = cache ... extends = %sbase.cfg ... """ % server_url) >>> print system(buildout) >>> ls('cache') - 5aedc98d7e769290a29d654a591a3a45 >>> cat('cache', os.listdir(cache)[0]) [buildout] parts = A change to ``base.cfg`` is picked up on the next buildout run: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... foo = bar ... """) >>> print system(buildout + " -n") Unused options for buildout: 'foo'. >>> cat('cache', os.listdir(cache)[0]) [buildout] parts = foo = bar In contrast, when not using ``newest`` mode (``newest = false``), the files already present in the extends cache will not be updated: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... """) >>> print system(buildout + " -N") Unused options for buildout: 'foo'. >>> cat('cache', os.listdir(cache)[0]) [buildout] parts = foo = bar Even when updating base configuration files with a buildout run, any given configuration file will be downloaded only once during that particular run. If some base configuration file is extended more than once, its cached copy is used: >>> write(server_data, 'baseA.cfg', """\ ... [buildout] ... extends = %sbase.cfg ... foo = bar ... """ % server_url) >>> write(server_data, 'baseB.cfg', """\ ... [buildout] ... extends-cache = cache ... extends = %sbase.cfg ... bar = foo ... """ % server_url) >>> write('buildout.cfg', """\ ... [buildout] ... extends-cache = cache ... newest = true ... extends = %sbaseA.cfg %sbaseB.cfg ... """ % (server_url, server_url)) >>> print system(buildout + " -n") Unused options for buildout: 'bar' 'foo'. (XXX We patch download utility's API to produce readable output for the test; a better solution would utilise the logging already done by the utility.) >>> import zc.buildout >>> old_download = zc.buildout.download.Download.download >>> def wrapper_download(self, url, md5sum=None, path=None): ... print "The URL %s was downloaded." % url ... return old_download(url, md5sum, path) >>> zc.buildout.download.Download.download = wrapper_download >>> zc.buildout.buildout.main([]) The URL http://localhost/baseA.cfg was downloaded. The URL http://localhost/base.cfg was downloaded. The URL http://localhost/baseB.cfg was downloaded. Unused options for buildout: 'bar' 'foo'. >>> zc.buildout.download.Download.download = old_download The deprecated ``extended-by`` option ------------------------------------- The ``buildout`` section used to recognise an option named ``extended-by`` that was deprecated at some point and removed in the 1.5 line. Since ignoring this option silently was considered harmful as a matter of principle, a UserError is raised if that option is encountered now: >>> write(server_data, 'base.cfg', """\ ... [buildout] ... parts = ... extended-by = foo.cfg ... """) >>> print system(buildout) While: Initializing. Error: No-longer supported "extended-by" option found in http://localhost/base.cfg. 
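To summarize the extends cache in project terms, here is a minimal sketch of a configuration that stays usable offline once it has been run online at least once (the URL and cache directory name are only examples)::

    [buildout]
    extends = http://config.example.com/base.cfg
    extends-cache = extends-cache
    parts =

After one online run has populated the ``extends-cache`` directory, ``bin/buildout -o`` (or ``offline = true``) can read ``base.cfg`` from the cache instead of the network.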
Clean up -------- We should have cleaned up all temporary files created by downloading things: >>> ls(tempfile.tempdir) Reset the global temporary directory: >>> tempfile.tempdir = old_tempdir Using zc.buildout to run setup scripts ====================================== zc.buildout has a convenience command for running setup scripts. Why? There are two reasons. If a setup script doesn't import setuptools, you can't use any setuptools-provided commands, like bdist_egg. When buildout runs a setup script, it arranges to import setuptools before running the script so setuptools-provided commands are available. If you use a squeaky-clean Python to do your development, a setup script that imports setuptools will fail because setuptools isn't in the path. Because buildout requires setuptools and knows where it has installed a setuptools egg, it adds the setuptools egg to the Python path before running the script. To run a setup script, use the buildout setup command, passing the name of a script or a directory containing a setup script and arguments to the script. Let's look at an example: >>> mkdir('test') >>> cd('test') >>> write('setup.py', ... ''' ... from distutils.core import setup ... setup(name='sample') ... ''') We've created a super simple (stupid) setup script. Note that it doesn't import setuptools. Let's try running it to create an egg. We'll use the buildout script from our sample buildout: >>> print system(buildout+' setup'), ... # doctest: +NORMALIZE_WHITESPACE Error: The setup command requires the path to a setup script or directory containing a setup script, and its arguments. Oops, we forgot to give the name of the setup script: >>> print system(buildout+' setup setup.py bdist_egg'), ... # doctest: +ELLIPSIS Running setup script 'setup.py'. ... >>> ls('dist') - sample-0.0.0-py2.5.egg Note that we can specify a directory name. This is often shorter and preferred by the lazy :) >>> print system(buildout+' setup . bdist_egg'), # doctest: +ELLIPSIS Running setup script './setup.py'. ... Automatic Buildout Updates ========================== When a buildout is run, one of the first steps performed is to check for updates to either zc.buildout or setuptools. To demonstrate this, we've created some "new releases" of buildout and setuptools in a new_releases folder: >>> ls(new_releases) d setuptools - setuptools-1.99.99-py2.4.egg d zc.buildout - zc.buildout-1.100.0b1-pyN.N.egg - zc.buildout-1.99.99-py2.4.egg - zc.buildout-2.0.0-pyN.N.egg Let's update the sample buildout.cfg to look in this area: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = show-versions ... develop = showversions ... ... [show-versions] ... recipe = showversions ... """ % dict(new_releases=new_releases)) We'll also include a recipe that echoes the versions of setuptools and zc.buildout used: >>> mkdir(sample_buildout, 'showversions') >>> write(sample_buildout, 'showversions', 'showversions.py', ... """ ... import pkg_resources ... ... class Recipe: ... ... def __init__(self, buildout, name, options): ... pass ... ... def install(self): ... for project in 'zc.buildout', 'setuptools': ... req = pkg_resources.Requirement.parse(project) ... print project, pkg_resources.working_set.find(req).version ... return () ... update = install ... """) >>> write(sample_buildout, 'showversions', 'setup.py', ... """ ... from setuptools import setup ... ... setup( ... name = "showversions", ...
entry_points = {'zc.buildout': ['default = showversions:Recipe']}, ... ) ... """) Now if we run the buildout, the buildout will upgrade itself to the new versions found in new releases: >>> print system(buildout), Getting distribution for 'zc.buildout>=1.99, <2dev'. Got zc.buildout 1.99.99. Getting distribution for 'setuptools'. Got setuptools 1.99.99. Upgraded: zc.buildout version 1.99.99, setuptools version 1.99.99; restarting. Generated script '/sample-buildout/bin/buildout'. Develop: '/sample-buildout/showversions' Installing show-versions. zc.buildout 1.99.99 setuptools 1.99.99 Notice that, even though we have a newer beta version of zc.buildout available, the final "1.99.99" was selected. If you want to get non-final versions, specify a specific version in your buildout's versions section, you typically want to use the --accept-buildout-test-releases option to the bootstrap script, which internally uses the ``accept-buildout-test-releases = true`` discussed below. Also, even thought there's a later final version, buildout won't upgrade itself past version 1. Our buildout script's site.py has been updated to use the new eggs: >>> cat(sample_buildout, 'parts', 'buildout', 'site.py') ... # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS "... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" setuptools_path = '/sample-buildout/eggs/setuptools-1.99.99-pyN.N.egg' sys.path.append(setuptools_path) known_paths.add(os.path.normcase(setuptools_path)) import pkg_resources buildout_paths = [ '/sample-buildout/eggs/zc.buildout-1.99.99-pyN.N.egg', '/sample-buildout/eggs/setuptools-1.99.99-pyN.N.egg' ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) pkg_resources.working_set.add_entry(sitedir) sys.__egginsert = len(buildout_paths) # Support setuptools. original_paths = [ ... ] for path in original_paths: if path == setuptools_path or path not in known_paths: addsitedir(path, known_paths) return known_paths ... Now, let's recreate the sample buildout. If we specify constraints on the versions of zc.buildout and setuptools (or distribute) to use, running the buildout will install earlier versions of these packages: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = show-versions ... develop = showversions ... zc.buildout-version = < 1.99 ... setuptools-version = < 1.99 ... distribute-version = < 1.99 ... ... [show-versions] ... recipe = showversions ... """ % dict(new_releases=new_releases)) Now we can see that we actually "upgrade" to an earlier version. >>> print system(buildout), Upgraded: zc.buildout version 1.0.0, setuptools version 0.6; restarting. Develop: '/sample-buildout/showversions' Updating show-versions. zc.buildout 1.0.0 setuptools 0.6 There are a number of cases, described below, in which the updates don't happen. We won't upgrade in offline mode: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = show-versions ... develop = showversions ... ... [show-versions] ... recipe = showversions ... """ % dict(new_releases=new_releases)) >>> print system(buildout+' -o'), Develop: '/sample-buildout/showversions' Updating show-versions. 
zc.buildout 1.0.0 setuptools 0.6 Or in non-newest mode: >>> print system(buildout+' -N'), Develop: '/sample-buildout/showversions' Updating show-versions. zc.buildout 1.0.0 setuptools 0.6 We also won't upgrade if the buildout script being run isn't in the buildout's bin directory. To see this we'll create a new buildout directory: >>> sample_buildout2 = tmpdir('sample_buildout2') >>> write(sample_buildout2, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = ... """ % dict(new_releases=new_releases)) >>> cd(sample_buildout2) >>> print system(buildout), Creating directory '/sample_buildout2/bin'. Creating directory '/sample_buildout2/parts'. Creating directory '/sample_buildout2/eggs'. Creating directory '/sample_buildout2/develop-eggs'. Getting distribution for 'zc.buildout>=1.99, <2dev'. Got zc.buildout 1.99.99. Getting distribution for 'setuptools'. Got setuptools 1.99.99. Not upgrading because not running a local buildout command. >>> ls('bin') As mentioned above, the ``accept-buildout-test-releases = true`` means that newer non-final versions of these dependencies are preferred. Typically users are not expected to actually manipulate this value. Instead, the bootstrap script creates a buildout buildout script that passes in the value as a command line override. This then results in the buildout script being rewritten to remember the decision. We'll mimic this by passing the argument actually in the command line. >>> cd(sample_buildout) >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... parts = show-versions ... develop = showversions ... ... [show-versions] ... recipe = showversions ... """ % dict(new_releases=new_releases)) >>> print system(buildout + ... ' buildout:accept-buildout-test-releases=true'), ... # doctest: +NORMALIZE_WHITESPACE Getting distribution for 'zc.buildout>=1.99, <2dev'. Got zc.buildout 1.100.0b1. Upgraded: zc.buildout version 1.100.0b1, setuptools version 1.99.99; restarting. Generated script '/sample-buildout/bin/buildout'. NOTE: Accepting early releases of build system packages. Rerun bootstrap without --accept-buildout-test-releases (-t) to return to default behavior. Develop: '/sample-buildout/showversions' Updating show-versions. zc.buildout 1.100.0b1 setuptools 1.99.99 The buildout script shows the change. >>> buildout_script = join(sample_buildout, 'bin', 'buildout') >>> import sys >>> if sys.platform.startswith('win'): ... buildout_script += '-script.py' >>> print open(buildout_script).read() # doctest: +ELLIPSIS #... sys.argv.insert(1, 'buildout:accept-buildout-test-releases=true') print ('NOTE: Accepting early releases of build system packages. Rerun ' 'bootstrap without --accept-buildout-test-releases (-t) to return to ' 'default behavior.') ... If the update process for buildout or setuptools fails the error should be caught (displaying a warning) and the rest of the buildout update process should continue. >>> version = sys.version_info[0:2] >>> egg = new_releases + '/zc.buildout-1.99.99-py%s.%s.egg' % version >>> copy_egg = new_releases + '/zc.buildout-1.1000-py%s.%s.egg' % version >>> import shutil >>> shutil.copy(egg, copy_egg) Create a broken egg >>> mkdir(sample_buildout, 'broken') >>> write(sample_buildout, 'broken', 'setup.py', "import broken_egg\n") >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... find-links = %(new_releases)s ... index = %(new_releases)s ... 
parts = show-versions ... develop = ... broken ... ... [broken] ... recipe = zc.recipe.egg ... eggs = broken ... """ % dict(new_releases=new_releases)) >>> import subprocess >>> subprocess.call([buildout]) 1 Debugging buildouts =================== Buildouts can be pretty complex. When things go wrong, it isn't always obvious why. Errors can occur due to problems in user input or due to bugs in zc.buildout or recipes. When an error occurs, Python's post-mortem debugger can be used to inspect the state of the buildout or recipe code where the error occurred. To enable this, use the -D option to the buildout. Let's create a recipe that has a bug: >>> mkdir(sample_buildout, 'recipes') >>> write(sample_buildout, 'recipes', 'mkdir.py', ... """ ... import os, zc.buildout ... ... class Mkdir: ... ... def __init__(self, buildout, name, options): ... self.name, self.options = name, options ... options['path'] = os.path.join( ... buildout['buildout']['directory'], ... options['path'], ... ) ... ... def install(self): ... directory = self.options['directory'] ... os.mkdir(directory) ... return directory ... ... def update(self): ... pass ... """) >>> write(sample_buildout, 'recipes', 'setup.py', ... """ ... from setuptools import setup ... ... setup(name = "recipes", ... entry_points = {'zc.buildout': ['mkdir = mkdir:Mkdir']}, ... ) ... """) And create a buildout that uses it: >>> write(sample_buildout, 'buildout.cfg', ... """ ... [buildout] ... develop = recipes ... parts = data-dir ... ... [data-dir] ... recipe = recipes:mkdir ... path = mystuff ... """) If we run the buildout, we'll get an error: >>> print system(buildout), Develop: '/sample-buildout/recipes' Installing data-dir. While: Installing data-dir. Error: Missing option: data-dir:directory If we want to debug the error, we can add the -D option. Here's we'll supply some input: >>> print system(buildout+" -D", """\ ... up ... p self.options.keys() ... q ... """), Develop: '/sample-buildout/recipes' Installing data-dir. > /zc/buildout/buildout.py(925)__getitem__() -> raise MissingOption("Missing option: %s:%s" % (self.name, key)) (Pdb) > /sample-buildout/recipes/mkdir.py(14)install() -> directory = self.options['directory'] (Pdb) ['path', 'recipe'] (Pdb) While: Installing data-dir. Traceback (most recent call last): File "/zc/buildout/buildout.py", line 1352, in main getattr(buildout, command)(args) File "/zc/buildout/buildout.py", line 383, in install installed_files = self[part]._call(recipe.install) File "/zc/buildout/buildout.py", line 961, in _call return f() File "/sample-buildout/recipes/mkdir.py", line 14, in install directory = self.options['directory'] File "/zc/buildout/buildout.py", line 925, in __getitem__ raise MissingOption("Missing option: %s:%s" % (self.name, key)) MissingOption: Missing option: data-dir:directory Starting pdb: Testing Support =============== The zc.buildout.testing module provides an API that can be used when writing recipe tests. This API is documented below. Many examples of using this API can be found in the zc.buildout, zc.recipe.egg, and zc.recipe.testrunner tests. zc.buildout.testing.buildoutSetUp(test) --------------------------------------- The buildoutSetup function can be used as a doctest setup function. It creates a sample buildout that can be used by tests, changing the current working directory to the sample_buildout. It also adds a number of names to the test namespace: ``sample_buildout`` This is the name of a buildout with a basic configuration. 
``buildout`` This is the path of the buildout script in the sample buildout. ``ls(*path)`` List the contents of a directory. The directory path is provided as one or more strings, to be joined with os.path.join. ``cat(*path)`` Display the contents of a file. The file path is provided as one or more strings, to be joined with os.path.join. On Windows, if the file doesn't exist, the function will try adding a '-script.py' suffix. This helps to work around a difference in script generation on Windows. ``mkdir(*path)`` Create a directory. The directory path is provided as one or more strings, to be joined with os.path.join. ``rmdir(*path)`` Remove a directory. The directory path is provided as one or more strings, to be joined with os.path.join. ``remove(*path)`` Remove a directory or file. The path is provided as one or more strings, to be joined with os.path.join. ``tmpdir(name)`` Create a temporary directory with the given name. The directory will be automatically removed at the end of the test. The path of the created directory is returned. Further, if the normalize_path normalizing substitution (see below) is used, then any paths starting with this path will be normalized to:: /name/restofpath No two temporary directories can be created with the same name. A directory created with tmpdir can be removed with rmdir and recreated. Note that the sample_buildout directory is created by calling this function. ``write(*path_and_contents)`` Create a file. The file path is provided as one or more strings, to be joined with os.path.join. The last argument is the file contents. ``system(command, input='')`` Execute a system command with the given input passed to the command's standard input. The output (error and regular output) from the command is returned. ``get(url)`` Get a web page. ``cd(*path)`` Change to the given directory. The directory path is provided as one or more strings, to be joined with os.path.join. The directory will be reset at the end of the test. ``uncd()`` Change to the directory that was current prior to the previous call to ``cd``. You can call ``cd`` multiple times and then ``uncd`` the same number of times to return to the same location. ``join(*path)`` A convenient reference to os.path.join. ``register_teardown(func)`` Register a tear-down function. The function will be called with no arguments at the end of the test. ``start_server(path)`` Start a web server on the given path. The server will be shut down at the end of the test. The server URL is returned. You can cause the server to start and stop logging its output using: >>> get(server_url+'enable_server_logging') and: >>> get(server_url+'disable_server_logging') This can be useful to see how buildout is interacting with a server. ``sdist(setup, dest)`` Create a source distribution by running the given setup file and placing the result in the given destination directory. If the setup argument is a directory, then the setup.py file in that directory is used. ``bdist_egg(setup, executable, dest)`` Create an egg by running the given setup file with the given Python executable and placing the result in the given destination directory. If the setup argument is a directory, then the setup.py file in that directory is used. ``find_python(version)`` Find a Python executable for the given version, where version is a string like "2.4". This function uses the following strategy to find a Python of the given version: - Look for an environment variable of the form PYTHON%(version)s.
- On Windows, look for \Python%(version)s\python - On Unix, try running python%(version)s or just python to get the executable ``zc.buildout.testing.buildoutTearDown(test)`` ---------------------------------------------- Tear down everything set up by zc.buildout.testing.buildoutSetUp. Any functions passed to register_teardown are called as well. ``install(project, destination)`` --------------------------------- Install eggs for a given project into a destination. If the destination is a test object, then the eggs directory of the sample buildout (sample_buildout) defined by the test will be used. Tests will use this to install the distributions for the packages being tested (and their dependencies) into a sample buildout. The egg to be used should already be loaded, by importing one of the modules provided, before calling this function. ``install_develop(project, destination)`` ----------------------------------------- Like install, but a develop egg is installed even if the current egg is not a develop egg. ``Output normalization`` ------------------------ Recipe tests often generate output that is dependent on temporary file locations, operating system conventions or Python versions. To deal with these dependencies, we often use zope.testing.renormalizing.RENormalizing to normalize test output. zope.testing.renormalizing.RENormalizing takes pairs of regular expressions and substitutions. The zc.buildout.testing module provides a few helpful variables that define regular-expression/substitution pairs that you can pass to zope.testing.renormalizing.RENormalizing. ``normalize_path`` Converts test paths, based on directories created with tmpdir(), to simple paths. ``normalize_script`` On Unix-like systems, scripts are implemented in single files without suffixes. On Windows, scripts are implemented with 2 files, a -script.py file and a .exe file. This normalization converts directory listings of Windows scripts to the form generated on Unix-like systems. ``normalize_egg_py`` Normalize Python version and platform indicators, if specified, in egg names. Python API for egg and script installation ========================================== The easy_install module provides some functions to provide support for egg and script installation. It provides functionality at the Python level that is similar to easy_install, with a few exceptions: - By default, we look for new packages *and* the packages that they depend on. This is somewhat like (and uses) the --upgrade option of easy_install, except that we also upgrade required packages. - If the highest-revision package satisfying a specification is already present, then we don't try to get another one. This saves a lot of search time in the common case that packages are pegged to specific versions. - If there is a develop egg that satisfies a requirement, we don't look for additional distributions. We always give preference to develop eggs. - Distutils options for building extensions can be passed. Distribution installation ------------------------- The easy_install module provides a function, install, for installing one or more packages and their dependencies. The install function takes 2 positional arguments: - An iterable of setuptools requirement strings for the distributions to be installed, and - A destination directory to install to and to satisfy requirements from.
The destination directory can be None, in which case, no new distributions are downloaded and there will be an error if the needed distributions can't be found among those already installed. It supports a number of optional keyword arguments: links A sequence of URLs, file names, or directories to look for links to distributions. index The URL of an index server, or almost any other valid URL. :) If not specified, the Python Package Index, http://pypi.python.org/simple/, is used. You can specify an alternate index with this option. If you use the links option and if the links point to the needed distributions, then the index can be anything and will be largely ignored. In the examples, here, we'll just point to an empty directory on our link server. This will make our examples run a little bit faster. executable A path to a Python executable. Distributions will be installed using this executable and will be for the matching Python version. path A list of additional directories to search for locally-installed distributions. always_unzip A flag indicating that newly-downloaded distributions should be directories even if they could be installed as zip files. working_set An existing working set to be augmented with additional distributions, if necessary to satisfy requirements. This allows you to call install multiple times, if necessary, to gather multiple sets of requirements. newest A boolean value indicating whether to search for new distributions when already-installed distributions meet the requirement. When this is true, the default, and when the destination directory is not None, then the install function will search for the newest distributions that satisfy the requirements. versions A dictionary mapping project names to version numbers to be used when selecting distributions. This can be used to specify a set of distribution versions independent of other requirements. use_dependency_links A flag indicating whether to search for dependencies using the setup dependency_links metadata or not. If true, links are searched for using dependency_links in preference to other locations. Defaults to true. include_site_packages A flag indicating whether Python's non-standard-library packages should be available for finding dependencies. Defaults to true. Paths outside of Python's standard library--or more precisely, those that are not included when Python is started with the -S argument--are loosely referred to as "site-packages" here. relative_paths Adjust egg paths so they are relative to the script path. This allows scripts to work when scripts and eggs are moved, as long as they are both moved in the same way. The install method returns a working set containing the distributions needed to meet the given requirements. We have a link server that has a number of eggs: >>> print get(link_server), bigdemo-0.1-py2.4.egg
demo-0.1-py2.4.egg
demo-0.2-py2.4.egg
demo-0.3-py2.4.egg
demo-0.4c1-py2.4.egg
demoneeded-1.0.zip
demoneeded-1.1.zip
demoneeded-1.2c1.zip
extdemo-1.4.zip
index/
other-1.0-py2.4.egg
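Before walking through these options one at a time below, here is a rough sketch of how several of the keyword arguments described above might be combined in a single install call. This is only an illustration, not part of the test: the requirement names, the ``dest`` directory and the ``link_server`` URL stand in for whatever your own code uses, and the pinned version is made up::

    import zc.buildout.easy_install

    ws = zc.buildout.easy_install.install(
        ['demo', 'other'],               # setuptools requirement strings
        dest,                            # destination directory for new eggs
        links=[link_server],             # where to look for distribution links
        index=link_server + 'index/',    # package index (largely ignored here)
        versions={'demoneeded': '1.1'},  # pin versions independently of requirements
        newest=False,                    # don't look for upgrades when requirements are met
        always_unzip=True,               # install eggs as directories, not zip files
        )
    for dist in ws:
        print dist

The detailed examples that follow introduce each of these options separately.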
Let's make a directory and install the demo egg to it, using the install function: >>> dest = tmpdir('sample-install') >>> import zc.buildout.easy_install >>> ws = zc.buildout.easy_install.install( ... ['demo==0.2'], dest, ... links=[link_server], index=link_server+'index/') We requested version 0.2 of the demo distribution to be installed into the destination directory. We specified that we should search for links on the link server and that we should use the (empty) link server index directory as a package index. The working set contains the distributions we retrieved. >>> for dist in ws: ... print dist demo 0.2 demoneeded 1.1 We got demoneeded because it was a dependency of demo. And the actual eggs were added to the eggs directory. >>> ls(dest) - demo-0.2-py2.4.egg - demoneeded-1.1-py2.4.egg If we remove the version restriction on demo, but specify a false value for newest, no new distributions will be installed: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... newest=False) >>> ls(dest) - demo-0.2-py2.4.egg - demoneeded-1.1-py2.4.egg If we leave off the newest option, we'll get an update for demo: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/') >>> ls(dest) - demo-0.2-py2.4.egg - demo-0.3-py2.4.egg - demoneeded-1.1-py2.4.egg Note that we didn't get the newest versions available. There were release candidates for newer versions of both packages. By default, final releases are preferred. We can change this behavior using the prefer_final function: >>> zc.buildout.easy_install.prefer_final(False) True The old setting is returned. >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/') >>> for dist in ws: ... print dist demo 0.4c1 demoneeded 1.2c1 >>> ls(dest) - demo-0.2-py2.4.egg - demo-0.3-py2.4.egg - demo-0.4c1-py2.4.egg - demoneeded-1.1-py2.4.egg - demoneeded-1.2c1-py2.4.egg Let's put the setting back to the default. >>> zc.buildout.easy_install.prefer_final(True) False We can supply additional distributions. We can also supply specifications for distributions that would normally be found via dependencies. We might do this to specify a specific version. >>> ws = zc.buildout.easy_install.install( ... ['demo', 'other', 'demoneeded==1.0'], dest, ... links=[link_server], index=link_server+'index/') >>> for dist in ws: ... print dist demo 0.3 other 1.0 demoneeded 1.0 >>> ls(dest) - demo-0.2-py2.4.egg - demo-0.3-py2.4.egg - demo-0.4c1-py2.4.egg - demoneeded-1.0-py2.4.egg - demoneeded-1.1-py2.4.egg - demoneeded-1.2c1-py2.4.egg d other-1.0-py2.4.egg We can request that eggs be unzipped even if they are zip safe. This can be useful when debugging. (Note that Distribute will unzip eggs by default, so if you are using Distribute, most or all eggs will already be unzipped without this flag.) >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... always_unzip=True) >>> ls(dest) d demo-0.3-py2.4.egg d demoneeded-1.1-py2.4.egg >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ...
always_unzip=False) >>> ls(dest) - demo-0.3-py2.4.egg - demoneeded-1.1-py2.4.egg We can also set a default by calling the always_unzip function: >>> zc.buildout.easy_install.always_unzip(True) False The old default is returned: >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/') >>> ls(dest) d demo-0.3-py2.4.egg d demoneeded-1.1-py2.4.egg >>> zc.buildout.easy_install.always_unzip(False) True >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/') >>> ls(dest) - demo-0.3-py2.4.egg - demoneeded-1.1-py2.4.egg >>> rmdir(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... always_unzip=True) >>> ls(dest) d demo-0.3-py2.4.egg d demoneeded-1.1-py2.4.egg Specifying version information independent of requirements ---------------------------------------------------------- Sometimes it's useful to specify version information independent of normal requirements specifications. For example, a buildout may need to lock down a set of versions, without having to put version numbers in setup files or part definitions. If a dictionary is passed to the install function, mapping project names to version numbers, then the version numbers will be used. >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... versions = dict(demo='0.2', demoneeded='1.0')) >>> [d.version for d in ws] ['0.2', '1.0'] In this example, we specified a version for demoneeded, even though we didn't define a requirement for it. The versions specified apply to dependencies as well as the specified requirements. If we specify a version that's incompatible with a requirement, then we'll get an error: >>> from zope.testing.loggingsupport import InstalledHandler >>> handler = InstalledHandler('zc.buildout.easy_install') >>> import logging >>> logging.getLogger('zc.buildout.easy_install').propagate = False >>> ws = zc.buildout.easy_install.install( ... ['demo >0.2'], dest, links=[link_server], ... index=link_server+'index/', ... versions = dict(demo='0.2', demoneeded='1.0')) Traceback (most recent call last): ... IncompatibleConstraintError: Bad constraint 0.2 demo>0.2 >>> print handler zc.buildout.easy_install DEBUG Installing 'demo >0.2'. zc.buildout.easy_install ERROR The constraint, 0.2, is not consistent with the requirement, 'demo>0.2'. >>> handler.clear() If no versions are specified, a debugging message will be output reporting that a version was picked automatically: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... ) >>> print handler zc.buildout.easy_install DEBUG Installing 'demo'. zc.buildout.easy_install DEBUG We have the best distribution that satisfies 'demo'. zc.buildout.easy_install DEBUG Picked: demo = 0.3 zc.buildout.easy_install DEBUG Getting required 'demoneeded' zc.buildout.easy_install DEBUG required by demo 0.3. zc.buildout.easy_install DEBUG We have the best distribution that satisfies 'demoneeded'.
zc.buildout.easy_install DEBUG Picked: demoneeded = 1.1 >>> handler.uninstall() >>> logging.getLogger('zc.buildout.easy_install').propagate = True We can request that we get an error if versions are picked: >>> zc.buildout.easy_install.allow_picked_versions(False) True (The old setting is returned.) >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... ) Traceback (most recent call last): ... UserError: Picked: demo = 0.3 >>> zc.buildout.easy_install.allow_picked_versions(True) False The function default_versions can be used to get and set default version information to be used when no version information is passes. If called with an argument, it sets the default versions: >>> zc.buildout.easy_install.default_versions(dict(demoneeded='1')) {} It always returns the previous default versions. If called without an argument, it simply returns the default versions without changing them: >>> zc.buildout.easy_install.default_versions() {'demoneeded': '1'} So with the default versions set, we'll get the requested version even if the versions option isn't used: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... ) >>> [d.version for d in ws] ['0.3', '1.0'] Of course, we can unset the default versions by passing an empty dictionary: >>> zc.buildout.easy_install.default_versions({}) {'demoneeded': '1'} >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, links=[link_server], index=link_server+'index/', ... ) >>> [d.version for d in ws] ['0.3', '1.1'] Dependencies in Site Packages ----------------------------- Paths outside of Python's standard library--or more precisely, those that are not included when Python is started with the -S argument--are loosely referred to as "site-packages" here. These site-packages are searched by default for distributions. This can be disabled, so that, for instance, a system Python can be used with buildout, cleaned of any packages installed by a user or system package manager. The default behavior can be controlled and introspected using zc.buildout.easy_install.include_site_packages. >>> zc.buildout.easy_install.include_site_packages() True Here's an example of using a Python executable that includes our dependencies. Our "py_path" will have the "demoneeded," and "demo" packages available. We'll simply be asking for "demoneeded" here, but without any external index or links. >>> from zc.buildout.tests import create_sample_sys_install >>> py_path, site_packages_path = make_py() >>> create_sample_sys_install(site_packages_path) >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, links=[], executable=py_path, ... index=None) >>> [dist.project_name for dist in workingset] ['demoneeded'] That worked fine. Let's try again with site packages not allowed. We'll change the policy by changing the default. Notice that the function for changing the default value returns the previous value. >>> zc.buildout.easy_install.include_site_packages(False) True >>> zc.buildout.easy_install.include_site_packages() False >>> zc.buildout.easy_install.clear_index_cache() >>> rmdir(example_dest) >>> example_dest = tmpdir('site-packages-example-install') >>> workingset = zc.buildout.easy_install.install( ... ['demoneeded'], example_dest, links=[], executable=py_path, ... index=None) Traceback (most recent call last): ... 
MissingDistribution: Couldn't find a distribution for 'demoneeded'. >>> zc.buildout.easy_install.clear_index_cache() Now we'll reset the default. >>> zc.buildout.easy_install.include_site_packages(True) False >>> zc.buildout.easy_install.include_site_packages() True Dependency links ---------------- Setuptools allows metadata that describes where to search for package dependencies. This option is called dependency_links. Buildout has its own notion of where to look for dependencies, but it also uses the setup tools dependency_links information if it's available. Let's demo this by creating an egg that specifies dependency_links. To begin, let's create a new egg repository. This repository hold a newer version of the 'demoneeded' egg than the sample repository does. >>> repoloc = tmpdir('repo') >>> from zc.buildout.tests import create_egg >>> create_egg('demoneeded', '1.2', repoloc) >>> link_server2 = start_server(repoloc) Turn on logging on this server so that we can see when eggs are pulled from it. >>> get(link_server2 + 'enable_server_logging') GET 200 /enable_server_logging '' Now we can create an egg that specifies that its dependencies are found on this server. >>> repoloc = tmpdir('repo2') >>> create_egg('hasdeps', '1.0', repoloc, ... install_requires = "'demoneeded'", ... dependency_links = [link_server2]) Let's add the egg to another repository. >>> link_server3 = start_server(repoloc) Now let's install the egg. >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, ... links=[link_server3], index=link_server3+'index/') GET 200 / GET 200 /demoneeded-1.2-pyN.N.egg The server logs show that the dependency was retrieved from the server specified in the dependency_links. Now let's see what happens if we provide two different ways to retrieve the dependencies. >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... links=[link_server, link_server3]) GET 200 / GET 200 /demoneeded-1.2-pyN.N.egg Once again the dependency is fetched from the logging server even though it is also available from the non-logging server. This is because the version on the logging server is newer and buildout normally chooses the newest egg available. If you wish to control where dependencies come from regardless of dependency_links setup metadata use the 'use_dependency_links' option to zc.buildout.easy_install.install(). >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... links=[link_server, link_server3], ... use_dependency_links=False) Notice that this time the dependency egg is not fetched from the logging server. When you specify not to use dependency_links, eggs will only be searched for using the links you explicitly provide. Another way to control this option is with the zc.buildout.easy_install.use_dependency_links() function. This function sets the default behavior for the zc.buildout.easy_install() function. >>> zc.buildout.easy_install.use_dependency_links(False) True The function returns its previous setting. >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... 
links=[link_server, link_server3]) It can be overridden by passing a keyword argument to the install function. >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... links=[link_server, link_server3], ... use_dependency_links=True) GET 200 /demoneeded-1.2-pyN.N.egg To return the dependency_links behavior to normal, call the function again. >>> zc.buildout.easy_install.use_dependency_links(True) False >>> rmdir(example_dest) >>> example_dest = tmpdir('example-install') >>> workingset = zc.buildout.easy_install.install( ... ['hasdeps'], example_dest, index=link_server+'index/', ... links=[link_server, link_server3]) GET 200 /demoneeded-1.2-pyN.N.egg Script generation ----------------- The easy_install module provides support for creating scripts from eggs. It provides two competing functions. One, ``scripts``, is a well-established approach to generating reliable scripts with a "clean" Python--e.g., one that does not have any packages in its site-packages. The other, ``sitepackage_safe_scripts``, is newer, a bit trickier, and is designed to work with a Python that has code in its site-packages, such as a system Python. Both are similar to setuptools except that they provide facilities for baking a script's path into the script. This has two advantages: - The eggs to be used by a script are not chosen at run time, making startup faster and, more importantly, deterministic. - The script doesn't have to import pkg_resources because the logic that pkg_resources would execute at run time is executed at script-creation time. (There is an exception in ``sitepackage_safe_scripts`` if you want to have your Python's site packages available, as discussed below, but even in that case pkg_resources is only partially activated, which can be a significant time savings.) The ``scripts`` function ~~~~~~~~~~~~~~~~~~~~~~~~ The ``scripts`` function is the first way to generate scripts that we'll examine. It is the earlier approach that the package offered. Let's create a destination directory for it to place them in: >>> bin = tmpdir('bin') Now, we'll use the scripts function to generate scripts in this directory from the demo egg: >>> import sys >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin) The four arguments we passed were: 1. A sequence of distribution requirements. These are of the same form as setuptools requirements. Here we passed a single requirement, for the demo distribution. 2. A working set, 3. The Python executable to use, and 4. The destination directory. The bin directory now contains a generated script: >>> ls(bin) - demo The return value is a list of the scripts generated: >>> import os, sys >>> if sys.platform == 'win32': ... scripts == [os.path.join(bin, 'demo.exe'), ... os.path.join(bin, 'demo-script.py')] ... else: ... scripts == [os.path.join(bin, 'demo')] True Note that in Windows, 2 files are generated for each script: a script file, ending in '-script.py', and an exe file that allows the script to be invoked directly without having to specify the Python interpreter and without having to provide a '.py' suffix.
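When a test or other tool needs to read the generated script file itself, it has to account for this Windows naming difference (the ``cat`` helper described earlier does so automatically). A minimal sketch of the idea, with a made-up helper name::

    import os, sys

    def script_source(bin_dir, name):
        # On Windows the Python source lives in NAME-script.py next to
        # NAME.exe; elsewhere the generated script is the source itself.
        path = os.path.join(bin_dir, name)
        if sys.platform == 'win32':
            path += '-script.py'
        return path

    # e.g.: print open(script_source(bin, 'demo')).read()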
The demo script run the entry point defined in the demo egg: >>> cat(bin, 'demo') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) Some things to note: - The demo and demoneeded eggs are added to the beginning of sys.path. - The module for the script entry point is imported and the entry point, in this case, 'main', is run. Rather than requirement strings, you can pass tuples containing 3 strings: - A script name, - A module, - An attribute expression for an entry point within the module. For example, we could have passed entry point information directly rather than passing a requirement: >>> scripts = zc.buildout.easy_install.scripts( ... [('demo', 'eggrecipedemo', 'main')], ... ws, sys.executable, bin) >>> cat(bin, 'demo') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) Passing entry-point information directly is handy when using eggs (or distributions) that don't declare their entry points, such as distributions that aren't based on setuptools. The interpreter keyword argument can be used to generate a script that can be used to invoke the Python interactive interpreter with the path set based on the working set. This generated script can also be used to run other scripts with the path set on the working set: >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, interpreter='py') >>> ls(bin) - demo - py >>> if sys.platform == 'win32': ... scripts == [os.path.join(bin, 'demo.exe'), ... os.path.join(bin, 'demo-script.py'), ... os.path.join(bin, 'py.exe'), ... os.path.join(bin, 'py-script.py')] ... else: ... scripts == [os.path.join(bin, 'demo'), ... os.path.join(bin, 'py')] True The py script simply runs the Python interactive interpreter with the path set: >>> cat(bin, 'py') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-pyN.N.egg', '/sample-install/demoneeded-1.1-pyN.N.egg', ] _interactive = True if len(sys.argv) > 1: _options, _args = __import__("getopt").getopt(sys.argv[1:], 'ic:m:') _interactive = False for (_opt, _val) in _options: if _opt == '-i': _interactive = True elif _opt == '-c': exec _val elif _opt == '-m': sys.argv[1:] = _args _args = [] __import__("runpy").run_module( _val, {}, "__main__", alter_sys=True) if _args: sys.argv[:] = _args __file__ = _args[0] del _options, _args execfile(__file__) if _interactive: del _interactive __import__("code").interact(banner="", local=globals()) If invoked with a script name and arguments, it will run that script, instead. >>> write('ascript', ''' ... "demo doc" ... print sys.argv ... print (__name__, __file__, __doc__) ... ''') >>> print system(join(bin, 'py')+' ascript a b c'), ['ascript', 'a', 'b', 'c'] ('__main__', 'ascript', 'demo doc') For Python 2.5 and higher, you can also use the -m option to run a module: >>> print system(join(bin, 'py')+' -m pdb'), usage: pdb.py scriptfile [arg] ... >>> print system(join(bin, 'py')+' -m pdb what'), Error: what does not exist An additional argument can be passed to define which scripts to install and to provide script names. 
The argument is a dictionary mapping original script names to new script names. >>> bin = tmpdir('bin2') >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, dict(demo='run')) >>> if sys.platform == 'win32': ... scripts == [os.path.join(bin, 'run.exe'), ... os.path.join(bin, 'run-script.py')] ... else: ... scripts == [os.path.join(bin, 'run')] True >>> ls(bin) - run >>> print system(os.path.join(bin, 'run')), 3 1 The ``scripts`` function: Including extra paths in scripts ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ We can pass a keyword argument, extra_paths, to cause additional paths to be included in a generated script: >>> foo = tmpdir('foo') >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, dict(demo='run'), ... extra_paths=[foo]) >>> cat(bin, 'run') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', '/foo', ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) The ``scripts`` function: Providing script arguments ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ An "arguments" keyword argument can be used to pass arguments to an entry point. The value passed is a source string to be placed between the parentheses in the call: >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, dict(demo='run'), ... arguments='1, 2') >>> cat(bin, 'run') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main(1, 2)) The ``scripts`` function: Passing initialization code ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ You can also pass script initialization code: >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, bin, dict(demo='run'), ... arguments='1, 2', ... initialization='import os\nos.chdir("foo")') >>> cat(bin, 'run') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import sys sys.path[0:0] = [ '/sample-install/demo-0.3-py2.4.egg', '/sample-install/demoneeded-1.1-py2.4.egg', ] import os os.chdir("foo") import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main(1, 2)) The ``scripts`` function: Relative paths ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Sometimes, you want to be able to move a buildout directory around and have scripts still work without having to rebuild them. We can control this using the relative_paths option to install. You need to pass a common base directory of the scripts and eggs: >>> bo = tmpdir('bo') >>> ba = tmpdir('ba') >>> mkdir(bo, 'eggs') >>> mkdir(bo, 'bin') >>> mkdir(bo, 'other') >>> ws = zc.buildout.easy_install.install( ... ['demo'], join(bo, 'eggs'), links=[link_server], ... index=link_server+'index/') >>> scripts = zc.buildout.easy_install.scripts( ... ['demo'], ws, sys.executable, join(bo, 'bin'), dict(demo='run'), ... extra_paths=[ba, join(bo, 'bar')], ... interpreter='py', ...
relative_paths=bo) >>> cat(bo, 'bin', 'run') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import os join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) import sys sys.path[0:0] = [ join(base, 'eggs/demo-0.3-pyN.N.egg'), join(base, 'eggs/demoneeded-1.1-pyN.N.egg'), '/ba', join(base, 'bar'), ] import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) Note that the extra path we specified that was outside the directory passed as relative_paths wasn't converted to a relative path. Of course, running the script works: >>> print system(join(bo, 'bin', 'run')), 3 1 We specified an interpreter and its paths are adjusted too: >>> cat(bo, 'bin', 'py') # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 import os join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) import sys sys.path[0:0] = [ join(base, 'eggs/demo-0.3-pyN.N.egg'), join(base, 'eggs/demoneeded-1.1-pyN.N.egg'), '/ba', join(base, 'bar'), ] _interactive = True if len(sys.argv) > 1: _options, _args = __import__("getopt").getopt(sys.argv[1:], 'ic:m:') _interactive = False for (_opt, _val) in _options: if _opt == '-i': _interactive = True elif _opt == '-c': exec _val elif _opt == '-m': sys.argv[1:] = _args _args = [] __import__("runpy").run_module( _val, {}, "__main__", alter_sys=True) if _args: sys.argv[:] = _args __file__ = _args[0] del _options, _args execfile(__file__) if _interactive: del _interactive __import__("code").interact(banner="", local=globals()) The ``sitepackage_safe_scripts`` function ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The newer function for creating scripts is ``sitepackage_safe_scripts``. It has the same basic functionality as the ``scripts`` function: it can create scripts to run arbitrary entry points, and to run a Python interpreter. The following are the differences from a user's perspective. - It can be used safely with a Python that has packages installed itself, such as a system-installed Python. - In contrast to the interpreter generated by the ``scripts`` method, which supports only a small subset of the usual Python executable's options, the interpreter generated by ``sitepackage_safe_scripts`` supports all of them. This makes it possible to use as full Python replacement for scripts that need the distributions specified in your buildout. - Both the interpreter and the entry point scripts allow you to include the site packages, and/or the sitecustomize, of the Python executable, if desired. It works by creating site.py and sitecustomize.py files that set up the desired paths and initialization. These must be placed within an otherwise empty directory. Typically this is in a recipe's parts directory. Here's the simplest example, building an interpreter script. >>> interpreter_dir = tmpdir('interpreter') >>> interpreter_parts_dir = os.path.join( ... interpreter_dir, 'parts', 'interpreter') >>> interpreter_bin_dir = os.path.join(interpreter_dir, 'bin') >>> mkdir(interpreter_bin_dir) >>> mkdir(interpreter_dir, 'eggs') >>> mkdir(interpreter_dir, 'parts') >>> mkdir(interpreter_parts_dir) >>> ws = zc.buildout.easy_install.install( ... ['demo'], join(interpreter_dir, 'eggs'), links=[link_server], ... index=link_server+'index/') >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... 
interpreter='py') Depending on whether the machine being used is running Windows or not, this produces either three or four files. In both cases, we have site.py and sitecustomize.py generated in the parts/interpreter directory. For Windows, we have py.exe and py-script.py; for other operating systems, we have py. >>> sitecustomize_path = os.path.join( ... interpreter_parts_dir, 'sitecustomize.py') >>> site_path = os.path.join(interpreter_parts_dir, 'site.py') >>> interpreter_path = os.path.join(interpreter_bin_dir, 'py') >>> if sys.platform == 'win32': ... py_path = os.path.join(interpreter_bin_dir, 'py-script.py') ... expected = [sitecustomize_path, ... site_path, ... os.path.join(interpreter_bin_dir, 'py.exe'), ... py_path] ... else: ... py_path = interpreter_path ... expected = [sitecustomize_path, site_path, py_path] ... >>> assert generated == expected, repr((generated, expected)) We didn't ask for any initialization, and we didn't ask to use the underlying sitecustomization, so sitecustomize.py is empty. >>> cat(sitecustomize_path) The interpreter script is simple. It puts the directory with the site.py and sitecustomize.py on the PYTHONPATH and (re)starts Python. >>> cat(py_path) #!/usr/bin/python -S import os import sys argv = [sys.executable] + sys.argv[1:] environ = os.environ.copy() path = '/interpreter/parts/interpreter' if environ.get('PYTHONPATH'): path = os.pathsep.join([path, environ['PYTHONPATH']]) environ['PYTHONPATH'] = path os.execve(sys.executable, argv, environ) The site.py file is a modified version of the underlying Python's site.py. The most important modification is that it has a different version of the addsitepackages function. It sets up the Python path, similarly to the behavior of the function it replaces. The following shows the part that buildout inserts, in the simplest case. >>> sys.stdout.write('#\n'); cat(site_path) ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" buildout_paths = [ '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg' ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) return known_paths def original_addsitepackages(known_paths):... Here are some examples of the interpreter in use. >>> print call_py(interpreter_path, "print 16+26") 42 >>> res = call_py(interpreter_path, "import sys; print sys.path") >>> print res # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE ['', '/interpreter/parts/interpreter', ..., '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg'] >>> clean_paths = eval(res.strip()) # This is used later for comparison. If you provide initialization, it goes in sitecustomize.py. >>> def reset_interpreter(): ... # This is necessary because, in our tests, the timestamps of the ... # .pyc files are not outdated when we want them to be. ... rmdir(interpreter_bin_dir) ... mkdir(interpreter_bin_dir) ... rmdir(interpreter_parts_dir) ... mkdir(interpreter_parts_dir) ... >>> reset_interpreter() >>> initialization_string = """\ ... import os ... os.environ['FOO'] = 'bar baz bing shazam'""" >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... 
interpreter='py', initialization=initialization_string) >>> cat(sitecustomize_path) import os os.environ['FOO'] = 'bar baz bing shazam' >>> print call_py(interpreter_path, "import os; print os.environ['FOO']") bar baz bing shazam If you use relative paths, this affects the interpreter and site.py. (This is again the UNIX version; the Windows version uses subprocess instead of os.execve.) >>> reset_interpreter() >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', relative_paths=interpreter_dir) >>> cat(py_path) #!/usr/bin/python -S import os import sys join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) argv = [sys.executable] + sys.argv[1:] environ = os.environ.copy() path = join(base, 'parts/interpreter') if environ.get('PYTHONPATH'): path = os.pathsep.join([path, environ['PYTHONPATH']]) environ['PYTHONPATH'] = path os.execve(sys.executable, argv, environ) For site.py, we again show only the pertinent parts. Notice that the egg paths join a base to a path, as with the use of this argument in the ``scripts`` function. >>> sys.stdout.write('#\n'); cat(site_path) # doctest: +ELLIPSIS #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) base = os.path.dirname(base) buildout_paths = [ join(base, 'eggs/demo-0.3-pyN.N.egg'), join(base, 'eggs/demoneeded-1.1-pyN.N.egg') ]... The paths resolve in practice as you would expect. >>> print call_py(interpreter_path, ... "import sys, pprint; pprint.pprint(sys.path)") ... # doctest: +ELLIPSIS ['', '/interpreter/parts/interpreter', ..., '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg'] The ``extra_paths`` argument affects the path in site.py. Notice that /interpreter/other is added after the eggs. >>> reset_interpreter() >>> mkdir(interpreter_dir, 'other') >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', extra_paths=[join(interpreter_dir, 'other')]) >>> sys.stdout.write('#\n'); cat(site_path) # doctest: +ELLIPSIS #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" buildout_paths = [ '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg', '/interpreter/other' ]... >>> print call_py(interpreter_path, ... "import sys, pprint; pprint.pprint(sys.path)") ... # doctest: +ELLIPSIS ['', '/interpreter/parts/interpreter', ..., '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg', '/interpreter/other'] The ``sitepackage_safe_scripts`` function: using site-packages ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The ``sitepackage_safe_scripts`` function supports including site packages. This has some advantages and some serious dangers. A typical reason to include site-packages is that it is easier to install one or more dependencies in your Python than it is with buildout. 
Some packages, such as lxml or Python PostgreSQL integration, have dependencies that can be much easier to build and/or install using other mechanisms, such as your operating system's package manager. Installing some core packages into your Python's site-packages can significantly simplify some application installations. However, doing this has a significant danger. One of the primary goals of buildout is to provide repeatability. Some packages (one of the better known Python openid packages, for instance) change their behavior depending on what packages are available. If Python curl bindings are available, these may be preferred by the library. If a certain XML package is installed, it may be preferred by the library. These hidden choices may cause small or large behavior differences. The fact that they are rarely encountered can actually make it worse: you forget that this might be a problem, and debugging the differences can be difficult. If you allow site-packages to be included in your buildout, and the Python you use is not managed precisely by your application (for instance, it is a system Python), you open yourself up to these possibilities. Don't be unaware of the dangers. That explained, let's see how it works. If you don't use namespace packages, this is very straightforward. >>> reset_interpreter() >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', include_site_packages=True) >>> sys.stdout.write('#\n'); cat(site_path) ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" setuptools_path = None buildout_paths = [ '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg' ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) sys.__egginsert = len(buildout_paths) # Support distribute. original_paths = [ ... ] for path in original_paths: if path == setuptools_path or path not in known_paths: addsitedir(path, known_paths) return known_paths def original_addsitepackages(known_paths):... It simply adds the original paths using addsitedir after the code to add the buildout paths. Here's an example of the new script in use. Other documents and tests in this package give the feature a more thorough workout, but this should give you an idea of the feature. >>> res = call_py(interpreter_path, "import sys; print sys.path") >>> print res # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE ['', '/interpreter/parts/interpreter', ..., '/interpreter/eggs/demo-0.3-py2.4.egg', '/interpreter/eggs/demoneeded-1.1-py2.4.egg', ...] The clean_paths gathered earlier is a subset of this full list of paths. >>> full_paths = eval(res.strip()) >>> len(clean_paths) < len(full_paths) True >>> set(os.path.normpath(p) for p in clean_paths).issubset( ... os.path.normpath(p) for p in full_paths) True Unfortunately, because setuptools namespace packages are implemented differently for operating system packages (debs or rpms) than for standard setuptools installations, there's a slightly trickier dance if you use them. To show this we'll need some extra eggs that use namespaces. We'll use the ``tellmy.fortune`` package, which we'll need to make an initial call to another test fixture to create.
>>> from zc.buildout.tests import create_sample_namespace_eggs >>> namespace_eggs = tmpdir('namespace_eggs') >>> create_sample_namespace_eggs(namespace_eggs) >>> reset_interpreter() >>> ws = zc.buildout.easy_install.install( ... ['demo', 'tellmy.fortune'], join(interpreter_dir, 'eggs'), ... links=[link_server, namespace_eggs], index=link_server+'index/') >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', include_site_packages=True) >>> sys.stdout.write('#\n'); cat(site_path) ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" setuptools_path = '...setuptools...' sys.path.append(setuptools_path) known_paths.add(os.path.normcase(setuptools_path)) import pkg_resources buildout_paths = [ '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/tellmy.fortune-1.0-pyN.N.egg', '...setuptools...', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg' ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) pkg_resources.working_set.add_entry(sitedir) sys.__egginsert = len(buildout_paths) # Support distribute. original_paths = [ ... ] for path in original_paths: if path == setuptools_path or path not in known_paths: addsitedir(path, known_paths) return known_paths def original_addsitepackages(known_paths):... >>> print call_py(interpreter_path, "import sys; print sys.path") ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE ['', '/interpreter/parts/interpreter', ..., '...setuptools...', '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/tellmy.fortune-1.0-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg', ...] As you can see, the script now first imports pkg_resources. Then we need to process egg files specially to look for namespace packages there *before* we process process lines in .pth files that use the "import" feature--lines that might be part of the setuptools namespace package implementation for system packages, as mentioned above, and that must come after processing egg namespaces. The most complex that this function gets is if you use namespace packages, include site-packages, and use relative paths. For completeness, we'll look at that result. >>> reset_interpreter() >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... interpreter='py', include_site_packages=True, ... relative_paths=interpreter_dir) >>> sys.stdout.write('#\n'); cat(site_path) ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE #... def addsitepackages(known_paths): """Add site packages, as determined by zc.buildout. See original_addsitepackages, below, for the original version.""" join = os.path.join base = os.path.dirname(os.path.abspath(os.path.realpath(__file__))) base = os.path.dirname(base) base = os.path.dirname(base) setuptools_path = '...setuptools...' 
sys.path.append(setuptools_path) known_paths.add(os.path.normcase(setuptools_path)) import pkg_resources buildout_paths = [ join(base, 'eggs/demo-0.3-pyN.N.egg'), join(base, 'eggs/tellmy.fortune-1.0-pyN.N.egg'), '...setuptools...', join(base, 'eggs/demoneeded-1.1-pyN.N.egg') ] for path in buildout_paths: sitedir, sitedircase = makepath(path) if not sitedircase in known_paths and os.path.exists(sitedir): sys.path.append(sitedir) known_paths.add(sitedircase) pkg_resources.working_set.add_entry(sitedir) sys.__egginsert = len(buildout_paths) # Support distribute. original_paths = [ ... ] for path in original_paths: if path == setuptools_path or path not in known_paths: addsitedir(path, known_paths) return known_paths def original_addsitepackages(known_paths):... >>> print call_py(interpreter_path, "import sys; print sys.path") ... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE ['', '/interpreter/parts/interpreter', ..., '...setuptools...', '/interpreter/eggs/demo-0.3-pyN.N.egg', '/interpreter/eggs/tellmy.fortune-1.0-pyN.N.egg', '/interpreter/eggs/demoneeded-1.1-pyN.N.egg', ...] The ``exec_sitecustomize`` argument does the same thing for the sitecustomize module--it allows you to include the code from the sitecustomize module in the underlying Python if you set the argument to True. The z3c.recipe.scripts package sets up the full environment necessary to demonstrate this piece. The ``sitepackage_safe_scripts`` function: writing scripts for entry points ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ All of the examples so far for this function have been creating interpreters. The function can also write scripts for entry points. They are almost identical to the scripts that we saw for the ``scripts`` function except that they ``import site`` after setting the sys.path to include our custom site.py and sitecustomize.py files. These files then initialize the Python environment as we have already seen. Let's see a simple example. >>> reset_interpreter() >>> ws = zc.buildout.easy_install.install( ... ['demo'], join(interpreter_dir, 'eggs'), links=[link_server], ... index=link_server+'index/') >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... reqs=['demo']) As before, in Windows, 2 files are generated for each script. A script file, ending in '-script.py', and an exe file that allows the script to be invoked directly without having to specify the Python interpreter and without having to provide a '.py' suffix. This is in addition to the site.py and sitecustomize.py files that are generated as with our interpreter examples above. >>> if sys.platform == 'win32': ... demo_path = os.path.join(interpreter_bin_dir, 'demo-script.py') ... expected = [sitecustomize_path, ... site_path, ... os.path.join(interpreter_bin_dir, 'demo.exe'), ... demo_path] ... else: ... demo_path = os.path.join(interpreter_bin_dir, 'demo') ... expected = [sitecustomize_path, site_path, demo_path] ... 
>>> assert generated == expected, repr((generated, expected)) The demo script runs the entry point defined in the demo egg: >>> cat(demo_path) # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 -S import sys sys.path[0:0] = [ '/interpreter/parts/interpreter', ] import os path = sys.path[0] if os.environ.get('PYTHONPATH'): path = os.pathsep.join([path, os.environ['PYTHONPATH']]) os.environ['BUILDOUT_ORIGINAL_PYTHONPATH'] = os.environ.get('PYTHONPATH', '') os.environ['PYTHONPATH'] = path import site # imports custom buildout-generated site.py import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main()) >>> demo_call = join(interpreter_bin_dir, 'demo') >>> if sys.platform == 'win32': ... demo_call = '"%s"' % demo_call >>> print system(demo_call) 3 1 There are a few differences from the ``scripts`` function. First, the ``reqs`` argument (an iterable of string requirements or entry point tuples) is a keyword argument here. We see that in the example above. Second, the ``arguments`` argument is now named ``script_arguments`` to try and clarify that it does not affect interpreters. While the ``initialization`` argument continues to affect both the interpreters and the entry point scripts, if you have initialization that is only pertinent to the entry point scripts, you can use the ``script_initialization`` argument. Let's see ``script_arguments`` and ``script_initialization`` in action. >>> reset_interpreter() >>> generated = zc.buildout.easy_install.sitepackage_safe_scripts( ... interpreter_bin_dir, ws, sys.executable, interpreter_parts_dir, ... reqs=['demo'], script_arguments='1, 2', ... script_initialization='import os\nos.chdir("foo")') >>> cat(demo_path) # doctest: +NORMALIZE_WHITESPACE #!/usr/local/bin/python2.4 -S import sys sys.path[0:0] = [ '/interpreter/parts/interpreter', ] import os path = sys.path[0] if os.environ.get('PYTHONPATH'): path = os.pathsep.join([path, os.environ['PYTHONPATH']]) os.environ['BUILDOUT_ORIGINAL_PYTHONPATH'] = os.environ.get('PYTHONPATH', '') os.environ['PYTHONPATH'] = path import site # imports custom buildout-generated site.py import os os.chdir("foo") import eggrecipedemo if __name__ == '__main__': sys.exit(eggrecipedemo.main(1, 2)) Handling custom build options for extensions provided in source distributions ----------------------------------------------------------------------------- Sometimes, we need to control how extension modules are built. The build function provides this level of control. It takes a single package specification, downloads a source distribution, and builds it with specified custom build options. The build function takes 3 positional arguments: spec A package specification for a source distribution dest A destination directory build_ext A dictionary of options to be passed to the distutils build_ext command when building extensions. It supports a number of optional keyword arguments: links a sequence of URLs, file names, or directories to look for links to distributions, index The URL of an index server, or almost any other valid URL. :) If not specified, the Python Package Index, http://pypi.python.org/simple/, is used. You can specify an alternate index with this option. If you use the links option and if the links point to the needed distributions, then the index can be anything and will be largely ignored. In the examples, here, we'll just point to an empty directory on our link server. This will make our examples run a little bit faster. executable A path to a Python executable. 
Distributions will be installed using this executable and will be for the matching Python version. path A list of additional directories to search for locally-installed distributions. newest A boolean value indicating whether to search for new distributions when already-installed distributions meet the requirement. When this is true, the default, and when the destination directory is not None, then the install function will search for the newest distributions that satisfy the requirements. versions A dictionary mapping project names to version numbers to be used when selecting distributions. This can be used to specify a set of distribution versions independent of other requirements. Our link server includes a source distribution that contains a simple extension, extdemo.c:: #include <Python.h> #include <extdemo.h> static PyMethodDef methods[] = {}; PyMODINIT_FUNC initextdemo(void) { PyObject *m; m = Py_InitModule3("extdemo", methods, ""); #ifdef TWO PyModule_AddObject(m, "val", PyInt_FromLong(2)); #else PyModule_AddObject(m, "val", PyInt_FromLong(EXTDEMO)); #endif } The extension depends on a system-dependent include file, extdemo.h, that defines a constant, EXTDEMO, that is exposed by the extension. We'll add an include directory to our sample buildout and add the needed include file to it: >>> mkdir('include') >>> write('include', 'extdemo.h', ... """ ... #define EXTDEMO 42 ... """) Now, we can use the build function to create an egg from the source distribution: >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/') ['/sample-install/extdemo-1.4-py2.4-unix-i686.egg'] The function returns the list of eggs it created. Now if we look in our destination directory, we see we have an extdemo egg: >>> ls(dest) - demo-0.2-py2.4.egg d demo-0.3-py2.4.egg - demoneeded-1.0-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.4-py2.4-unix-i686.egg Let's update our link server with a new version of extdemo: >>> update_extdemo() >>> print get(link_server), bigdemo-0.1-py2.4.egg
demo-0.1-py2.4.egg
demo-0.2-py2.4.egg
demo-0.3-py2.4.egg
demo-0.4c1-py2.4.egg
demoneeded-1.0.zip
demoneeded-1.1.zip
demoneeded-1.2c1.zip
extdemo-1.4.zip
extdemo-1.5.zip
index/
other-1.0-py2.4.egg
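The ``build_ext`` dictionary is not limited to ``include-dirs``: as noted above, the options are passed to the distutils ``build_ext`` command, so other standard ``build_ext`` options, such as ``library-dirs``, ``libraries``, or ``define``, should be usable in the same hyphenated form. Here is a hypothetical sketch, not part of the test run above; the extra option values are made up purely for illustration::

    zc.buildout.easy_install.build(
        'extdemo', dest,
        {'include-dirs': os.path.join(sample_buildout, 'include'),
         'library-dirs': '/usr/local/lib',  # extra linker search path (made up)
         'libraries': 'm',                  # link against libm, for example
         'define': 'TWO'},                  # defines the TWO macro used in extdemo.c
        links=[link_server], index=link_server+'index/')
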
The easy_install caches information about servers to reduce network access. To see the update, we have to call the clear_index_cache function to clear the index cache: >>> zc.buildout.easy_install.clear_index_cache() If we run build with newest set to False, we won't get an update: >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/', ... newest=False) ['/sample-install/extdemo-1.4-py2.4-linux-i686.egg'] >>> ls(dest) - demo-0.2-py2.4.egg d demo-0.3-py2.4.egg - demoneeded-1.0-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.4-py2.4-unix-i686.egg But if we run it with the default True setting for newest, then we'll get an updated egg: >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/') ['/sample-install/extdemo-1.5-py2.4-unix-i686.egg'] >>> ls(dest) - demo-0.2-py2.4.egg d demo-0.3-py2.4.egg - demoneeded-1.0-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.4-py2.4-unix-i686.egg d extdemo-1.5-py2.4-unix-i686.egg The versions option also influences the versions used. For example, if we specify a version for extdemo, then that will be used, even though it isn't the newest. Let's clean out the destination directory first: >>> import os >>> for name in os.listdir(dest): ... remove(dest, name) >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/', ... versions=dict(extdemo='1.4')) ['/sample-install/extdemo-1.4-py2.4-unix-i686.egg'] >>> ls(dest) d extdemo-1.4-py2.4-unix-i686.egg Handling custom build options for extensions in develop eggs ------------------------------------------------------------ The develop function is similar to the build function, except that it creates a develop egg (an egg link) from a source directory containing a setup.py script, rather than building an egg from a source distribution. The develop function takes 2 positional arguments: setup The path to a setup script, typically named "setup.py", or a directory containing a setup.py script. dest The directory in which to install the egg link. It supports some optional keyword arguments: build_ext A dictionary of options to be passed to the distutils build_ext command when building extensions. executable A path to a Python executable. Distributions will be installed using this executable and will be for the matching Python version. We have a local directory containing the extdemo source: >>> ls(extdemo) - MANIFEST - MANIFEST.in - README - extdemo.c - setup.py Now, we can use the develop function to create a develop egg from this source directory: >>> zc.buildout.easy_install.develop( ... extdemo, dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}) '/sample-install/extdemo.egg-link' The name of the egg link created is returned. Now if we look in our destination directory, we see we have an extdemo egg link: >>> ls(dest) d extdemo-1.4-py2.4-unix-i686.egg - extdemo.egg-link And that the source directory contains the compiled extension: >>> ls(extdemo) - MANIFEST - MANIFEST.in - README d build - extdemo.c d extdemo.egg-info - extdemo.so - setup.py Download cache -------------- Normally, when distributions are installed, if any processing is needed, they are downloaded from the internet to a temporary directory and then installed from there. A download cache can be used to avoid the download step.
This can be useful to reduce network access and to create source distributions of an entire buildout. A download cache is specified by calling the download_cache function. The function always returns the previous setting. If no argument is passed, then the setting is unchanged. If an argument is passed, the download cache is set to the given path, which must point to an existing directory. Passing None clears the cache setting. To see this work, we'll create a directory and set it as the cache directory: >>> cache = tmpdir('cache') >>> zc.buildout.easy_install.download_cache(cache) We'll recreate our destination directory: >>> remove(dest) >>> dest = tmpdir('sample-install') We'd like to see what is being fetched from the server, so we'll enable server logging: >>> get(link_server+'enable_server_logging') GET 200 /enable_server_logging '' Now, if we install demo, and extdemo: >>> ws = zc.buildout.easy_install.install( ... ['demo==0.2'], dest, ... links=[link_server], index=link_server+'index/', ... always_unzip=True) GET 200 / GET 404 /index/demo/ GET 200 /index/ GET 200 /demo-0.2-py2.4.egg GET 404 /index/demoneeded/ GET 200 /demoneeded-1.1.zip >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/') GET 404 /index/extdemo/ GET 200 /extdemo-1.5.zip ['/sample-install/extdemo-1.5-py2.4-linux-i686.egg'] Not only will we get eggs in our destination directory: >>> ls(dest) d demo-0.2-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.5-py2.4-linux-i686.egg But we'll get distributions in the cache directory: >>> ls(cache) - demo-0.2-py2.4.egg - demoneeded-1.1.zip - extdemo-1.5.zip The cache directory contains uninstalled distributions, such as zipped eggs or source distributions. Let's recreate our destination directory and clear the index cache: >>> remove(dest) >>> dest = tmpdir('sample-install') >>> zc.buildout.easy_install.clear_index_cache() Now when we install the distributions: >>> ws = zc.buildout.easy_install.install( ... ['demo==0.2'], dest, ... links=[link_server], index=link_server+'index/', ... always_unzip=True) GET 200 / GET 404 /index/demo/ GET 200 /index/ GET 404 /index/demoneeded/ >>> zc.buildout.easy_install.build( ... 'extdemo', dest, ... {'include-dirs': os.path.join(sample_buildout, 'include')}, ... links=[link_server], index=link_server+'index/') GET 404 /index/extdemo/ ['/sample-install/extdemo-1.5-py2.4-linux-i686.egg'] >>> ls(dest) d demo-0.2-py2.4.egg d demoneeded-1.1-py2.4.egg d extdemo-1.5-py2.4-linux-i686.egg Note that we didn't download the distributions from the link server. If we remove the restriction on demo, we'll download a newer version from the link server: >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, ... links=[link_server], index=link_server+'index/', ... always_unzip=True) GET 200 /demo-0.3-py2.4.egg Normally, the download cache is the preferred source of downloads, but not the only one. Installing solely from a download cache --------------------------------------- A download cache can be used as the basis of application source releases. In an application source release, we want to distribute an application that can be built without making any network accesses. In this case, we distribute a download cache and tell the easy_install module to install from the download cache only, without making network accesses. 
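As a rough, hypothetical sketch of what such an offline installation could look like in code (the ``install_from_cache`` function used here is described next; the cache and destination paths are made-up examples rather than part of the test setup above)::

    import zc.buildout.easy_install

    # Point the download machinery at the cache directory shipped with the
    # application source release, and require that everything come from it.
    # Both functions return the previous setting so it can be restored.
    old_cache = zc.buildout.easy_install.download_cache('/path/to/shipped/cache')
    old_flag = zc.buildout.easy_install.install_from_cache(True)
    try:
        # No network access should be needed here; all distributions must
        # already be present in the shipped cache.
        ws = zc.buildout.easy_install.install(['demo'], '/path/to/eggs')
    finally:
        # Restore whatever settings were in effect before.
        zc.buildout.easy_install.download_cache(old_cache)
        zc.buildout.easy_install.install_from_cache(old_flag)
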
The install_from_cache function can be used to signal that packages should be installed only from the download cache. The function always returns the previous setting. Calling it with no arguments returns the current setting without changing it: >>> zc.buildout.easy_install.install_from_cache() False Calling it with a boolean value changes the setting and returns the previous setting: >>> zc.buildout.easy_install.install_from_cache(True) False Let's remove demo-0.3-py2.4.egg from the cache, clear the index cache, recreate the destination directory, and reinstall demo: >>> for f in os.listdir(cache): ... if f.startswith('demo-0.3-'): ... remove(cache, f) >>> zc.buildout.easy_install.clear_index_cache() >>> remove(dest) >>> dest = tmpdir('sample-install') >>> ws = zc.buildout.easy_install.install( ... ['demo'], dest, ... links=[link_server], index=link_server+'index/', ... always_unzip=True) >>> ls(dest) d demo-0.2-py2.4.egg d demoneeded-1.1-py2.4.egg This time, we didn't download from or even query the link server. .. Disable the download cache: >>> zc.buildout.easy_install.download_cache(None) '/cache' >>> zc.buildout.easy_install.install_from_cache(False) True Distribute Support ================== Distribute is a drop-in replacement for Setuptools. zc.buildout is now compatible with Distribute 0.6. To use Distribute in your buildout, you need to use the ``--distribute`` option of the ``bootstrap.py`` script:: $ python bootstrap.py --distribute This will download and install the latest Distribute 0.6 release in the ``eggs`` directory, and use this version for the scripts that are created in ``bin``. Notice that if you have a shared eggs directory, a buildout that uses Distribute will not interfere with other buildouts that are based on Setuptools and that are sharing the same eggs directory. For more information about the Distribute project, see: http://python-distribute.org Change History ************** 1.7.1 (2013-02-21) ================== Fixed: Constraints intended to prevent upgrading to buildout-2-compatible recipes weren't expressed correctly, leading to unintentional use of zc.recipe.egg-2.0.0a3. 1.7.0 (2013-01-11) ================== - Unless version requirements are specified, buildout won't upgrade itself past version 1. - Versions in versions sections can now be simple constraints, like <2.0dev, in addition to being simple versions. This is used to prevent upgrading zc.recipe.egg and zc.recipe.testrunner past version 1. - If buildout is bootstrapped with a non-final release, it won't downgrade itself to a final release.
- Fix: distribute 0.6.33 broke Python 2.4 compatibility - remove `data_files` from `setup.py`, which was installing README.txt in current directory during installation [Domen Kožar] - Windows fix: use cli-64.exe/cli.exe depending on 64/32 bit and try cli.exe if cli-64.exe is not found, fixing 9c6be7ac6d218f09e33725e07dccc4af74d8cf97 - Windows fix: `buildout init` was broken, re.sub does not like a single backslash - fixed all builds on travis-ci [Domen Kožar] - use os._exit insted of sys.exit after ugrade forking [Domen Kožar] - Revert cfa0478937d16769c268bf51e60e69cd3ead50f3, it only broke a feature [Domen Kožar] 1.6.3 (2012-08-22) ================== - Fix Windows regression (see: https://github.com/buildout/buildout/commit/90bc44f9bffd0d9eb09aacf08c6a4c2fed797319 and https://github.com/buildout/buildout/commit/e65b7bfbd7c7ccd556a278016a16b63ae8ef782b) [aclark4life] 1.6.2 (2012-08-21) ================== - Fix Windows regression (see: https://github.com/buildout/buildout/commit/cfa0478937d16769c268bf51e60e69cd3ead50f3) [aclark4life] 1.6.1 (2012-08-18) ================== - `bootstrap.py -d init` would invoke buildout with arguments `init bootstrap` leading into installation of bootstrap package. now bootstrap.py first runs any commands passed, then tries to bootstrap. (Domen Kožar) - fix Python 2.4 support (Domen Kožar) - added travis-ci testing (Domen Kožar) 1.6.0 (2012-08-15) ================== - The buildout init command now accepts distribution requirements and paths to set up a custom interpreter part that has the distributions or parts in the path. For example:: python bootstrap.py init BeautifulSoup - Introduce a cache for the expensive `buildout._dir_hash` function. - Remove duplicate path from script's sys.path setup. - changed broken dash S check to pass the configuration options -S -c separately, to make zc.buildout more compatible with the PyPy interpreter, which has less flexible argument parsing than CPython. Note that PyPy post 1.4.0 is needed to make buildout work at all, due to missing support for the ``-E`` option, which only got added afterwards. - Made sure to download extended configuration files only once per buildout run even if they are referenced multiple times (patch by Rafael Monnerat). - Ported speedup optimization patch by Ross Patterson to 1.5.x series. Improved patch to calculate required_by packages in linear time in verbose mode (-v). Running relatively simple Buildout envornment yielded in running time improvement from 30 seconds to 10 seconds. (Domen Kožar, Ross Patterson) - Removed unnecessary pyc recompilation with optimization flags. Running Buildout with pre-downloaded ~300 packages that were installed in empty eggs repository yielded in running time improvement from 1126 seconds to 348 seconds. (Domen Kožar) Bugs fixed: - In the download module, fixed the handling of directories that are pointed to by file-system paths and ``file:`` URLs. - Removed any traces of the implementation of ``extended-by``. Raise a UserError if the option is encountered instead of ignoring it, though. - https://bugs.launchpad.net/bugs/697913 : Buildout doesn't honor exit code from scripts. Fixed. - Handle both addition and subtraction of elements (+= and -=) on the same key in the same section. 1.5.2 (2010-10-11) ================== - changed metadata 'url' to pypi.python.org in order to solve a temporary outage of buildout.org - IMPORTANT: For better backwards compatibility with the pre-1.5 line, this release has two big changes from 1.5.0 and 1.5.1. 
- Buildout defaults to including site packages. - Buildout loads recipes and extensions with the same constraints to site-packages that it builds eggs, instead of never allowing access to site-packages. This means that the default configuration should better support pre-existing use of system Python in recipes or builds. - To make it easier to detect the fact that buildout has set the PYTHONPATH, BUILDOUT_ORIGINAL_PYTHONPATH is always set in the environment, even if PYTHONPATH was not originally set. BUILDOUT_ORIGINAL_PYTHONPATH will be an empty string if PYTHONPATH was not set. 1.5.1 (2010-08-29) ================== New features: - Scripts store the old PYTHONPATH in BUILDOUT_ORIGINAL_PYTHONPATH if it existed, and store nothing in the value if it did not exist. This allows code that does not want subprocesses to have the system-Python-protected site.py to set the environment of the subprocess as it was originally. Bugs fixed: - https://bugs.launchpad.net/bugs/623590 : If include-site-packages were true and versions were not set explicitly, system eggs were preferred over newer released eggs. Fixed. 1.5.0 (2010-08-23) ================== New features: - zc.buildout supports Python 2.7. - By default, Buildout and the bootstrap script now prefer final versions of Buildout, recipes, and extensions. This can be changed by using the --accept-buildout-test-releases flag (or -t for short) when calling bootstrap. This will hopefully allow beta releases of these items to be more easily and safely made in the future. NOTE: dependencies of your own software are not affected by this new behavior. Buildout continues to choose the newest available versions of your dependencies regardless of whether they are final releases. To prevent this, use the pre-existing switch ``prefer-final = true`` in the [buildout] section of your configuration file (see http://pypi.python.org/pypi/zc.buildout#preferring-final-releases) or pin your versions using a versions section (see http://pypi.python.org/pypi/zc.buildout#repeatable-buildouts-controlling-eggs-used). Bugs fixed: - You can now again use virtualenv with Buildout. The new features to let buildout be used with a system Python are disabled in this configuration, and the previous script generation behavior (1.4.3) is used, even if the new function ``zc.buildout.easy_install.sitepackage_safe_scripts`` is used. 1.5.0b2 (2010-04-29) ==================== This was a re-release of 1.4.3 in order to keep 1.5.0b1 release from hurting workflows that combined virtualenv with zc.buildout. 1.5.0b1 (2010-04-29) ==================== New Features: - Added buildout:socket-timout option so that socket timeout can be configured both from command line and from config files. (gotcha) - Buildout can be safely used with a system Python (or any Python with code in site-packages), as long as you use (1) A fresh checkout, (2) the new bootstrap.py, and (3) recipes that use the new ``zc.buildout.easy_install.sitepackage_safe_scripts`` function to generate scripts and interpreters. Many recipes will need to be updated to use this new function. The scripts and interpreters generated by ``zc.recipe.egg`` will continue to use the older function, not safe with system Pythons. Use the ``z3c.recipe.scripts`` as a replacement. zc.recipe.egg is still a fully supported, and simpler, way of generating scripts and interpreters if you are using a "clean" Python, without code installed in site-packages. It keeps its previous behavior in order to provide backwards compatibility. 
The z3c.recipe.scripts recipe allows you to control how you use the code in site-packages. You can exclude it entirely (preferred); allow eggs in it to fulfill package dependencies declared in setup.py and buildout configuration; allow it to be available but not used to fulfill dependencies declared in setup.py or buildout configuration; or only allow certain eggs in site-packages to fulfill dependencies. - Added new function, ``zc.buildout.easy_install.sitepackage_safe_scripts``, to generate scripts and interpreter. It produces a full-featured interpreter (all command-line options supported) and the ability to safely let scripts include site packages, such as with a system Python. The ``z3c.recipe.scripts`` recipe uses this new function. - Improve bootstrap. * New options let you specify where to find ez_setup.py and where to find a download cache. These options can keep bootstrap from going over the network. * Another new option lets you specify where to put generated eggs. * The buildout script generated by bootstrap honors more of the settings in the designated configuration file (e.g., buildout.cfg). * Correctly handle systems where pkg_resources is present but the rest of setuptools is missing (like Ubuntu installs). https://bugs.launchpad.net/zc.buildout/+bug/410528 - You can develop zc.buildout using Distribute instead of Setuptools. Use the --distribute option on the dev.py script. (Releases should be tested with both Distribute and Setuptools.) The tests for zc.buildout pass with Setuptools and Python 2.4, 2.5, 2.6, and 2.7; and with Distribute and Python 2.5, 2.6, and 2.7. Using zc.buildout with Distribute and Python 2.4 is not recommended. - The ``distribute-version`` now works in the [buildout] section, mirroring the ``setuptools-version`` option (this is for consistency; using the general-purpose ``versions`` option is preferred). Bugs fixed: - Using Distribute with the ``allow-picked-versions = false`` buildout option no longer causes an error. - The handling and documenting of default buildout options was normalized. This means, among other things, that ``bin/buildout -vv`` and ``bin/buildout annotate`` correctly list more of the options. - Installing a namespace package using a Python that already has a package in the same namespace (e.g., in the Python's site-packages) failed in some cases. It is now handled correctly. - Another variation of this error showed itself when at least two dependencies were in a shared location like site-packages, and the first one met the "versions" setting. The first dependency would be added, but subsequent dependencies from the same location (e.g., site-packages) would use the version of the package found in the shared location, ignoring the version setting. This is also now handled correctly. 1.4.3 (2009-12-10) ================== Bugs fixed: - Using pre-detected setuptools version for easy_installing tgz files. This prevents a recursion error when easy_installing an upgraded "distribute" tgz. Note that setuptools did not have this recursion problem solely because it was packaged as an ``.egg``, which does not have to go through the easy_install step. 1.4.2 (2009-11-01) ================== New Feature: - Added a --distribute option to the bootstrap script, in order to use Distribute rather than Setuptools. By default, Setuptools is used. Bugs fixed: - While checking for new versions of setuptools and buildout itself, compare requirement locations instead of requirement objects. 
- Incrementing didn't work properly when extending multiple files. https://bugs.launchpad.net/zc.buildout/+bug/421022 - The download API computed MD5 checksums of text files wrong on Windows. 1.4.1 (2009-08-27) ================== New Feature: - Added a debug built-in recipe to make writing some tests easier. Bugs fixed: - (introduced in 1.4.0) option incrementing (-=) and decrementing (-=) didn't work in the buildout section. https://bugs.launchpad.net/zc.buildout/+bug/420463 - Option incrementing and decrementing didn't work for options specified on the command line. - Scripts generated with relative-paths enabled couldn't be symbolically linked to other locations and still work. - Scripts run using generated interpreters didn't have __file__ set correctly. - The standard Python -m option didn't work for custom interpreters. 1.4.0 (2009-08-26) ================== - When doing variable substitutions, you can omit the section name to refer to a variable in the same section (e.g. ${:foo}). - When doing variable substitution, you can use the special option, ``_buildout_section_name_`` to get the section name. This is most handy for getting the current section name (e.g. ${:_buildout_section_name_}). - A new special option, ``<`` allows sections to be used as macros. - Added annotate command for annotated sections. Displays sections key-value pairs along with the value origin. - Added a download API that handles the download cache, offline mode etc and is meant to be reused by recipes. - Used the download API to allow caching of base configurations (specified by the buildout section's 'extends' option). 1.3.1 (2009-08-12) ================== - Bug fixed: extras were ignored in some cases when versions were specified. 1.3.0 (2009-06-22) ================== - Better Windows compatibility in test infrastructure. - Now the bootstrap.py has an optional --version argument, that can be used to force zc.buildout version to use. - ``zc.buildout.testing.buildoutSetUp`` installs a new handler in the python root logging facility. This handler is now removed during tear down as it might disturb other packages reusing buildout's testing infrastructure. - fixed usage of 'relative_paths' keyword parameter on Windows - Added an unload entry point for extensions. - Fixed bug: when the relative paths option was used, relative paths could be inserted into sys.path if a relative path was used to run the generated script. 1.2.1 (2009-03-18) ================== - Refactored generation of relative egg paths to generate simpler code. 1.2.0 (2009-03-17) ================== - Added a relative_paths option to zc.buildout.easy_install.script to generate egg paths relative to the script they're used in. 1.1.2 (2009-03-16) ================== - Added Python 2.6 support. Removed Python 2.3 support. - Fixed remaining deprecation warnings under Python 2.6, both when running our tests and when using the package. - Switched from using os.popen* to subprocess.Popen, to avoid a deprecation warning in Python 2.6. See: http://docs.python.org/library/subprocess.html#replacing-os-popen-os-popen2-os-popen3 - Made sure the 'redo_pyc' function and the doctest checkers work with Python executable paths containing spaces. - Expand shell patterns when processing the list of paths in `develop`, e.g:: [buildout] develop = ./local-checkouts/* - Conditionally import and use hashlib.md5 when it's available instead of md5 module, which is deprecated in Python 2.6. 
- Added Jython support for bootstrap, development bootstrap and zc.buildout support on Jython - Fixed a bug that would cause buildout to break while computing a directory hash if it found a broken symlink (Launchpad #250573) 1.1.1 (2008-07-28) ================== - Fixed a bug that caused buildouts to fail when variable substitutions are used to name standard directories, as in:: [buildout] eggs-directory = ${buildout:directory}/develop-eggs 1.1.0 (2008-07-19) ================== - Added a buildout-level unzip option tp change the default policy for unzipping zip-safe eggs. - Tracebacks are now printed for internal errors (as opposed to user errors) even without the -D option. - pyc and pyo files are regenerated for installed eggs so that the stored path in code objects matches the the install location. 1.0.6 (2008-06-13) ================== - Manually reverted the changeset for the fix for https://bugs.launchpad.net/zc.buildout/+bug/239212 to verify thet the test actually fails with the changeset: http://svn.zope.org/zc.buildout/trunk/src/zc/buildout/buildout.py?rev=87309&r1=87277&r2=87309 Thanks tarek for pointing this out. (seletz) - fixed the test for the += -= syntax in buildout.txt as the test was actually wronng. The original implementation did a split/join on whitespace, and later on that was corrected to respect the original EOL setting, the test was not updated, though. (seletz) - added a test to verify against https://bugs.launchpad.net/zc.buildout/+bug/239212 in allowhosts.txt (seletz) - further fixes for """AttributeError: Buildout instance has no attribute '_logger'""" by providing reasonable defaults within the Buildout constructor (related to the new 'allow-hosts' option) (patch by Gottfried Ganssauge) (ajung) 1.0.5 (2008-06-10) ================== - Fixed wrong split when using the += and -= syntax (mustapha) 1.0.4 (2008-06-10) ================== - Added the `allow-hosts` option (tarek) - Quote the 'executable' argument when trying to detect the python version using popen4. (sidnei) - Quote the 'spec' argument, as in the case of installing an egg from the buildout-cache, if the filename contains spaces it would fail (sidnei) - Extended configuration syntax to allow -= and += operators (malthe, mustapha). 1.0.3 (2008-06-01) ================== - fix for """AttributeError: Buildout instance has no attribute '_logger'""" by providing reasonable defaults within the Buildout constructor. (patch by Gottfried Ganssauge) (ajung) 1.0.2 (2008-05-13) ================== - More fixes for Windows. A quoted sh-bang is now used on Windows to make the .exe files work with a Python executable in 'program files'. - Added "-t " option for specifying the socket timeout. (ajung) 1.0.1 (2008-04-02) ================== - Made easy_install.py's _get_version accept non-final releases of Python, like 2.4.4c0. (hannosch) - Applied various patches for Windows (patch by Gottfried Ganssauge). (ajung) - Applied patch fixing rmtree issues on Windows (patch by Gottfried Ganssauge). (ajung) 1.0.0 (2008-01-13) ================== - Added a French translation of the buildout tutorial. 1.0.0b31 (2007-11-01) ===================== Feature Changes --------------- - Added a configuration option that allows buildouts to ignore dependency_links metadata specified in setup. By default dependency_links in setup are used in addition to buildout specified find-links. This can make it hard to control where eggs come from. 
Here's how to tell buildout to ignore URLs in dependency_links:: [buildout] use-dependency-links = false By default use-dependency-links is true, which matches the behavior of previous versions of buildout. - Added a configuration option that causes buildout to error if a version is picked. This is a nice safety belt when fixing all versions is intended, especially when creating releases. Bugs Fixed ---------- - 151820: Develop failed if the setup.py script imported modules in the distribution directory. - Verbose logging of the develop command was omitting detailed output. - The setup command wasn't documented. - The setup command failed if run in a directory without specifying a configuration file. - The setup command raised a stupid exception if run without arguments. - When using a local find links or index, distributions weren't copied to the download cache. - When installing from source releases, a version specification (via a buildout versions section) for setuptools was ignored when deciding which setuptools to use to build an egg from the source release. 1.0.0b30 (2007-08-20) ===================== Feature Changes --------------- - Changed the default policy back to what it was to avoid breakage in existing buildouts. Use:: [buildout] prefer-final = true to get the new policy. The new policy will go into effect in buildout 2. 1.0.0b29 (2007-08-20) ===================== Feature Changes --------------- - Now, final distributions are prefered over non-final versions. If both final and non-final versions satisfy a requirement, then the final version will be used even if it is older. The normal way to override this for specific packages is to specifically require a non-final version, either specifically or via a lower bound. - There is a buildout prefer-final version that can be used with a value of "false":: prefer-final = false To prefer newer versions, regardless of whether or not they are final, buildout-wide. - The new simple Python index, http://cheeseshop.python.org/simple, is used as the default index. This will provide better performance than the human package index interface, http://pypi.python.org/pypi. More importantly, it lists hidden distributions, so buildouts with fixed distribution versions will be able to find old distributions even if the distributions have been hidden in the human PyPI interface. Bugs Fixed ---------- - 126441: Look for default.cfg in the right place on Windows. 1.0.0b28 (2007-07-05) ===================== Bugs Fixed ---------- - When requiring a specific version, buildout looked for new versions even if that single version was already installed. 1.0.0b27 (2007-06-20) ===================== Bugs Fixed ---------- - Scripts were generated incorrectly on Windows. This included the buildout script itself, making buildout completely unusable. 1.0.0b26 (2007-06-19) ===================== Feature Changes --------------- - Thanks to recent fixes in setuptools, I was able to change buildout to use find-link and index information when searching extensions. Sadly, this work, especially the timing, was motivated my the need to use alternate indexes due to performance problems in the cheese shop (http://www.python.org/pypi/). I really home we can address these performance problems soon. 1.0.0b25 (2007-05-31) ===================== Feature Changes --------------- - buildout now changes to the buildout directory before running recipe install and update methods. - Added a new init command for creating a new buildout. 
This creates an empty configuration file and then bootstraps. - Except when using the new init command, it is now an error to run buildout without a configuration file. - In verbose mode, when adding distributions to fulful requirements of already-added distributions, we now show why the new distributions are being added. - Changed the logging format to exclude the logger name for the zc.buildout logger. This reduces noise in the output. - Clean up lots of messages, adding missing periods and adding quotes around requirement strings and file paths. Bugs Fixed ---------- - 114614: Buildouts could take a very long time if there were dependency problems in large sets of pathologically interdependent packages. - 59270: Buggy recipes can cause failures in later recipes via chdir - 61890: file:// urls don't seem to work in find-links setuptools requires that file urls that point to directories must end in a "/". Added a workaround. - 75607: buildout should not run if it creates an empty buildout.cfg 1.0.0b24 (2007-05-09) ===================== Feature Changes --------------- - Improved error reporting by showing which packages require other packages that can't be found or that cause version conflicts. - Added an API for use by recipe writers to clean up created files when recipe errors occur. - Log installed scripts. Bugs Fixed ---------- - 92891: bootstrap crashes with recipe option in buildout section. - 113085: Buildout exited with a zero exist status when internal errors occurred. 1.0.0b23 (2007-03-19) ===================== Feature Changes --------------- - Added support for download caches. A buildout can specify a cache for distribution downloads. The cache can be shared among buildouts to reduce network access and to support creating source distributions for applications allowing install without network access. - Log scripts created, as suggested in: https://bugs.launchpad.net/zc.buildout/+bug/71353 Bugs Fixed ---------- - It wasn't possible to give options on the command line for sections not defined in a configuration file. 1.0.0b22 (2007-03-15) ===================== Feature Changes --------------- - Improved error reporting and debugging support: - Added "logical tracebacks" that show functionally what the buildout was doing when an error occurs. Don't show a Python traceback unless the -D option is used. - Added a -D option that causes the buildout to print a traceback and start the pdb post-mortem debugger when an error occurs. - Warnings are printed for unused options in the buildout section and installed-part sections. This should make it easier to catch option misspellings. - Changed the way the installed database (.installed.cfg) is handled to avoid database corruption when a user breaks out of a buildout with control-c. - Don't save an installed database if there are no installed parts or develop egg links. 1.0.0b21 (2007-03-06) ===================== Feature Changes --------------- - Added support for repeatable buildouts by allowing egg versions to be specified in a versions section. - The easy_install module install and build functions now accept a versions argument that supplied to mapping from project name to version numbers. This can be used to fix version numbers for required distributions and their depenencies. When a version isn't fixed, using either a versions option or using a fixed version number in a requirement, then a debug log message is emitted indicating the version picked. This is useful for setting versions options. 
A default_versions function can be used to set a default value for this option. - Adjusted the output for verbosity levels. Using a single -v option no longer causes voluminous setuptools output. Uisng -vv and -vvv now triggers extra setuptools output. - Added a remove testing helper function that removes files or directories. 1.0.0b20 (2007-02-08) ===================== Feature Changes --------------- - Added a buildout newest option, to control whether the newest distributions should be sought to meet requirements. This might also provide a hint to recipes that don't deal with distributions. For example, a recipe that manages subversion checkouts might not update a checkout if newest is set to "false". - Added a *newest* keyword parameter to the zc.buildout.easy_install.install and zc.buildout.easy_install.build functions to control whether the newest distributions that meed given requirements should be sought. If a false value is provided for this parameter and already installed eggs meet the given requirements, then no attempt will be made to search for newer distributions. - The recipe-testing support setUp function now adds the name *buildout* to the test namespace with a value that is the path to the buildout script in the sample buildout. This allows tests to use >>> print system(buildout), rather than: >>> print system(join('bin', 'buildout')), Bugs Fixed ---------- - Paths returned from update methods replaced lists of installed files rather than augmenting them. 1.0.0b19 (2007-01-24) ===================== Bugs Fixed ---------- - Explicitly specifying a Python executable failed if the output of running Python with the -V option included a 2-digit (rather than a 3-digit) version number. 1.0.0b18 (2007-01-22) ===================== Feature Changes --------------- - Added documentation for some previously undocumented features of the easy_install APIs. - By popular demand, added a -o command-line option that is a short hand for the assignment buildout:offline=true. Bugs Fixed ---------- - When deciding whether recipe develop eggs had changed, buildout incorrectly considered files in .svn and CVS directories. 1.0.0b17 (2006-12-07) ===================== Feature Changes --------------- - Configuration files can now be loaded from URLs. Bugs Fixed ---------- - https://bugs.launchpad.net/products/zc.buildout/+bug/71246 Buildout extensions installed as eggs couldn't be loaded in offline mode. 1.0.0b16 (2006-12-07) ===================== Feature Changes --------------- - A new command-line argument, -U, suppresses reading user defaults. - You can now suppress use of an installed-part database (e.g. .installed.cfg) by sprifying an empty value for the buildout installed option. Bugs Fixed ---------- - When the install command is used with a list of parts, only those parts are supposed to be installed, but the buildout was also building parts that those parts depended on. 1.0.0b15 (2006-12-06) ===================== Bugs Fixed ---------- - Uninstall recipes weren't loaded correctly in cases where no parts in the (new) configuration used the recipe egg. 1.0.0b14 (2006-12-05) ===================== Feature Changes --------------- - Added uninstall recipes for dealing with complex uninstallation scenarios. Bugs Fixed ---------- - Automatic upgrades weren't performed on Windows due to a bug that caused buildout to incorrectly determine that it wasn't running locally in a buildout. - Fixed some spurious test failures on Windows. 
1.0.0b13 (2006-12-04) ===================== Feature Changes --------------- - Variable substitutions now reflect option data written by recipes. - A part referenced by a part in a parts list is now added to the parts list before the referencing part. This means that you can omit parts from the parts list if they are referenced by other parts. - Added a develop function to the easy_install module to aid in creating develop eggs with custom build_ext options. - The build and develop functions in the easy_install module now return the path of the egg or egg link created. - Removed the limitation that parts named in the install command can only name configured parts. - Removed support ConfigParser-style variable substitutions (e.g. %(foo)s). Only the string-template style of variable (e.g. ${section:option}) substitutions will be supported. Supporting both violates "there's only one way to do it". - Deprecated the buildout-section extendedBy option. Bugs Fixed ---------- - We treat setuptools as a dependency of any distribution that (declares that it) uses namespace packages, whether it declares setuptools as a dependency or not. This wasn't working for eggs intalled by virtue of being dependencies. 1.0.0b12 (2006-10-24) ===================== Feature Changes --------------- - Added an initialization argument to the zc.buildout.easy_install.scripts function to include initialization code in generated scripts. 1.0.0b11 (2006-10-24) ===================== Bugs Fixed ---------- `67737 `_ Verbose and quite output options caused errors when the develop buildout option was used to create develop eggs. `67871 `_ Installation failed if the source was a (local) unzipped egg. `67873 `_ There was an error in producing an error message when part names passed to the install command weren't included in the configuration. 1.0.0b10 (2006-10-16) ===================== Feature Changes --------------- - Renamed the runsetup command to setup. (The old name still works.) - Added a recipe update method. Now install is only called when a part is installed for the first time, or after an uninstall. Otherwise, update is called. For backward compatibility, recipes that don't define update methiods are still supported. - If a distribution defines namespace packages but fails to declare setuptools as one of its dependencies, we now treat setuptools as an implicit dependency. We generate a warning if the distribution is a develop egg. - You can now create develop eggs for setup scripts that don't use setuptools. Bugs Fixed ---------- - Egg links weren't removed when corresponding entries were removed from develop sections. - Running a non-local buildout command (one not installed in the buildout) ket to a hang if there were new versions of zc.buildout or setuptools were available. Now we issue a warning and don't upgrade. - When installing zip-safe eggs from local directories, the eggs were moved, rather than copied, removing them from the source directory. 1.0.0b9 (2006-10-02) ==================== Bugs Fixed ---------- Non-zip-safe eggs were not unzipped when they were installed. 1.0.0b8 (2006-10-01) ==================== Bugs Fixed ---------- - Installing source distributions failed when using alternate Python versions (depending on the versions of Python used.) - Installing eggs wasn't handled as efficiently as possible due to a bug in egg URL parsing. - Fixed a bug in runsetup that caused setup scripts that introspected __file__ to fail. 1.0.0b7 ======= Added a documented testing framework for use by recipes. 
Refactored the buildout tests to use it. Added a runsetup command run a setup script. This is handy if, like me, you don't install setuptools in your system Python. 1.0.0b6 ======= Fixed https://launchpad.net/products/zc.buildout/+bug/60582 Use of extension options caused bootstrapping to fail if the eggs directory didn't already exist. We no longer use extensions for bootstrapping. There really isn't any reason to anyway. 1.0.0b5 ======= Refactored to do more work in buildout and less work in easy_install. This makes things go a little faster, makes errors a little easier to handle, and allows extensions (like the sftp extension) to influence more of the process. This was done to fix a problem in using the sftp support. 1.0.0b4 ======= - Added an **experimental** extensions mechanism, mainly to support adding sftp support to buildouts that need it. - Fixed buildout self-updating on Windows. 1.0.0b3 ======= - Added a help option (-h, --help) - Increased the default level of verbosity. - Buildouts now automatically update themselves to new versions of zc.buildout and setuptools. - Added Windows support. - Added a recipe API for generating user errors. - No-longer generate a py_zc.buildout script. - Fixed some bugs in variable substitutions. The characters "-", "." and " ", weren't allowed in section or option names. Substitutions with invalid names were ignored, which caused missleading failures downstream. - Improved error handling. No longer show tracebacks for user errors. - Now require a recipe option (and therefore a section) for every part. - Expanded the easy_install module API to: - Allow extra paths to be provided - Specify explicit entry points - Specify entry-point arguments 1.0.0b2 ======= Added support for specifying some build_ext options when installing eggs from source distributions. 1.0.0b1 ======= - Changed the bootstrapping code to only install setuptools and zc.buildout. The bootstrap code no-longer runs the buildout itself. This was to fix a bug that caused parts to be recreated unnecessarily because the recipe signature in the initial buildout reflected temporary locations for setuptools and zc.buildout. - Now create a minimal setup.py if it doesn't exist and issue a warning that it is being created. - Fixed bug in saving installed configuration data. %'s and extra spaces weren't quoted. 
1.0.0a1 ======= Initial public version Download ********************** Keywords: development build Platform: UNKNOWN Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: Zope Public License Classifier: Topic :: Software Development :: Build Tools Classifier: Topic :: Software Development :: Libraries :: Python Modules zc.buildout-1.7.1/src/zc.buildout.egg-info/requires.txt0000644000076500007650000000003712111415075022474 0ustar jimjim00000000000000setuptools [test] zope.testingzc.buildout-1.7.1/src/zc.buildout.egg-info/SOURCES.txt0000644000076500007650000000251712111415075021765 0ustar jimjim00000000000000CHANGES.txt COPYRIGHT.txt DEVELOPERS.txt LICENSE.txt README.txt SYSTEM_PYTHON_HELP.txt setup.py todo.txt src/zc/__init__.py src/zc.buildout.egg-info/PKG-INFO src/zc.buildout.egg-info/SOURCES.txt src/zc.buildout.egg-info/dependency_links.txt src/zc.buildout.egg-info/entry_points.txt src/zc.buildout.egg-info/namespace_packages.txt src/zc.buildout.egg-info/not-zip-safe src/zc.buildout.egg-info/requires.txt src/zc.buildout.egg-info/top_level.txt src/zc/buildout/__init__.py src/zc/buildout/allowhosts.txt src/zc/buildout/bootstrap.txt src/zc/buildout/bootstrap1.txt src/zc/buildout/buildout.py src/zc/buildout/buildout.txt src/zc/buildout/debugging.txt src/zc/buildout/dependencylinks.txt src/zc/buildout/distribute.txt src/zc/buildout/download.py src/zc/buildout/download.txt src/zc/buildout/downloadcache.txt src/zc/buildout/easy_install.py src/zc/buildout/easy_install.txt src/zc/buildout/extends-cache.txt src/zc/buildout/repeatable.txt src/zc/buildout/rmtree.py src/zc/buildout/runsetup.txt src/zc/buildout/setup.txt src/zc/buildout/testing.py src/zc/buildout/testing.txt src/zc/buildout/testing_bugfix.txt src/zc/buildout/testrecipes.py src/zc/buildout/tests.py src/zc/buildout/testselectingpython.py src/zc/buildout/unzip.txt src/zc/buildout/update.txt src/zc/buildout/upgrading_distribute.txt src/zc/buildout/virtualenv.txt src/zc/buildout/windows.txtzc.buildout-1.7.1/src/zc.buildout.egg-info/top_level.txt0000644000076500007650000000000312111415075022617 0ustar jimjim00000000000000zc zc.buildout-1.7.1/SYSTEM_PYTHON_HELP.txt0000644000076500007650000002320012111414155017042 0ustar jimjim00000000000000System Python and zc.buildout 1.5 ********************************* The 1.5 line of zc.buildout introduced a number of changes. Problems ======== As usual, please send questions and comments to the `distutils SIG mailing list `_. Report bugs using the `zc.buildout Launchpad Bug Tracker `_. If problems are keeping you from your work, here's an easy way to revert to the old code temporarily: switch to a custom "emergency" bootstrap script, available from http://svn.zope.org/repos/main/zc.buildout/branches/1.4/bootstrap/bootstrap.py . This customized script will select zc.buildout 1.4.4 by default. zc.buildout 1.4.4 will not upgrade itself unless you explicitly specify a new version. It will also prefer older versions of zc.recipe.egg and some other common recipes. If you have trouble with other recipes, consider using a standard buildout "versions" section to specify older versions of these, as described in the Buildout documentation (http://pypi.python.org/pypi/zc.buildout#repeatable-buildouts-controlling-eggs-used). Working with a System Python ============================ While there are a number of new features available in zc.buildout 1.5, the biggest is that Buildout itself supports usage with a system Python. This can work if you follow a couple of simple rules. 1. 
Use the new bootstrap.py (available from http://svn.zope.org/*checkout*/zc.buildout/trunk/bootstrap/bootstrap.py). 2. Use buildout recipes that have been upgraded to work with zc.buildout 1.5 and higher. Specifically, they should use ``zc.buildout.easy_install.sitepackage_safe_scripts`` to generate their scripts, if any, rather than ``zc.buildout.easy_install.scripts``. See the `Recipes That Support a System Python`_ section below for more details on recipes that are available as of this writing, and `Updating Recipes to Support a System Python`_ for instructions on how to update a recipe. Note that you should generally only need to update recipes that generate scripts. You can then use ``include-site-packages = false`` and ``exec-sitecustomize = false`` buildout options to eliminate access to your Python's site packages and not execute its sitecustomize file, if it exists, respectively. Alternately, you can use the ``allowed-eggs-from-site-packages`` buildout option as a glob-aware whitelist of eggs that may come from site-packages. This value defaults to "*", accepting all eggs. It's important to note that recipes not upgraded for zc.buildout 1.5.0 should continue to work--just without internal support for a system Python. Using a system Python is inherently fragile. Using a clean, freshly-installed Python without customization in site-packages is more robust and repeatable. See some of the regression tests added to the 1.5.0 line for the kinds of issues that you can encounter with a system Python, and see http://pypi.python.org/pypi/z3c.recipe.scripts#including-site-packages-and-sitecustomize for more discussion. However, using a system Python can be very convenient, and the zc.buildout code for this feature has been tested by many users already. Moreover, it has automated tests to exercise the problems that have been encountered and fixed. Many people rely on it. Recipes That Support a System Python ==================================== zc.recipe.egg continues to generate old-style scripts that are not safe for use with a system Python. This was done for backwards compatibility, because it is integral to so many buildouts and used as a dependency of so many other recipes. If you want to generate new-style scripts that do support system Python usage, use z3c.recipe.scripts instead (http://pypi.python.org/pypi/z3c.recipe.scripts). z3c.recipe.scripts has the same script and interpreter generation options as zc.recipe.egg, plus a few more for the new features mentioned above. In the simplest case, you should be able to simply change ``recipe = zc.recipe.egg`` to ``recipe = z3c.recipe.scripts`` in the pertinent sections of your buildout configuration and your generated scripts will work with a system Python. Other updated recipes include zc.recipe.testrunner 1.4.0 and z3c.recipe.tag 0.4.0. Others should be updated soon: see their change documents for details, or see `Updating Recipes to Support a System Python`_ for instructions on how to update recipes yourself. Templates for creating Python scripts with the z3c.recipe.filetemplate recipe can be easily changed to support a system Python. - If you don't care about supporting relative paths, simply using a generated interpreter with the eggs you want should be sufficient, as it was before. For instance, if the interpreter is named "py", use ``#!${buildout:bin-directory/py}`` or ``#!/usr/bin/env ${buildout:bin-directory/py}``). 
- If you do care about relative paths, (``relative-paths = true`` in your buildout configuration), then z3c.recipe.scripts does require a bit more changes, as is usual for the relative path support in that package. First, use z3c.recipe.scripts to generate a script or interpreter with the dependencies you want. This will create a directory in ``parts`` that has a site.py and sitecustomize.py. Then, begin your script as in the snippet below. The example assumes that the z3c.recipe.scripts generated were from a Buildout configuration section labeled "scripts": adjust accordingly. :: #!${buildout:executable} -S ${python-relative-path-setup} import sys sys.path.insert(0, ${scripts:parts-directory|path-repr}) import site Updating Recipes to Support a System Python =========================================== You should generally only need to update recipes that generate scripts. These recipes need to change from using ``zc.buildout.easy_install.scripts`` to be using ``zc.buildout.easy_install.sitepackage_safe_scripts``. The signatures of the two functions are different. Please compare:: def scripts( reqs, working_set, executable, dest, scripts=None, extra_paths=(), arguments='', interpreter=None, initialization='', relative_paths=False, ): def sitepackage_safe_scripts( dest, working_set, executable, site_py_dest, reqs=(), scripts=None, interpreter=None, extra_paths=(), initialization='', include_site_packages=False, exec_sitecustomize=False, relative_paths=False, script_arguments='', script_initialization='', ): In most cases, the arguments are merely reordered. The ``reqs`` argument is no longer required in order to make it easier to generate an interpreter alone. The ``arguments`` argument was renamed to ``script_arguments`` to clarify that it did not affect interpreter generation. The only new required argument is ``site_py_dest``. It must be the path to a directory in which the customized site.py and sitecustomize.py files will be written. A typical generation in a recipe will look like this. (In the recipe's __init__ method...) :: self.options = options b_options = buildout['buildout'] options['parts-directory'] = os.path.join( b_options['parts-directory'], self.name) (In the recipe's install method...) :: options = self.options generated = [] if not os.path.exists(options['parts-directory']): os.mkdir(options['parts-directory']) generated.append(options['parts-directory']) Then ``options['parts-directory']`` can be used for the ``site_py_dest`` value. If you want to support the other arguments (``include_site_packages``, ``exec_sitecustomize``, ``script_initialization``, as well as the ``allowed-eggs-from-site-packages`` option), you might want to look at some of the code in https://github.com/buildout/buildout/blob/1.6.x/z3c.recipe.scripts_/src/z3c/recipe/scripts/scripts.py . You might even be able to adopt some of it by subclassing or delegating. The Scripts class in that file is the closest to what you might be used to from zc.recipe.egg. Important note for recipe authors: As of buildout 1.5.2, the code in recipes is *always run with the access to the site-packages as configured in the buildout section*. virtualenv ========== Using virtualenv (http://pypi.python.org/pypi/virtualenv) with the --no-site-packages option already provided a simple way of using a system Python. This is intended to continue to work, and some automated tests exist to demonstrate this. However, it is only supported to the degree that people have found it to work in the past. 
The existing Buildout tests for virtualenv are only for problems encountered previously. They are very far from comprehensive. Using Buildout with a system python has at least three advantages over using Buildout in conjunction with virtualenv. They may or may not be pertinent to your desired usage. - Unlike ``virtualenv --no-site-packages``, Buildout's support allows you to choose to let packages from your system Python be available to your software (see ``include-site-packages`` in http://pypi.python.org/pypi/z3c.recipe.scripts). You can even specify which eggs installed in your system Python can be allowed to fulfill some of your packages' dependencies (see ``allowed-eggs-from-site-packages`` in http://pypi.python.org/pypi/z3c.recipe.scripts). At the expense of some repeatability and platform dependency, this flexibility means that, for instance, you can rely on difficult-to-build eggs like lxml coming from your system Python. - Buildout's implementation has a full set of automated tests. - An integral Buildout implementation means fewer steps and fewer dependencies to work with a system Python. zc.buildout-1.7.1/todo.txt0000644000076500007650000000244212111414155015057 0ustar jimjim00000000000000 - Report error if repeated parts - tests - distribution dependency links - offline mode (there is an indirect test in the testrunner tests) - Load from urls - control python for develop (probably a new recipe) - proper handling of extras - Common recipes - configure-make-make-install - download, checkout - Should it be possible to provide multiple recipies? Or should recipies be combined through inheritence (or composition)? - Python - Some way to freeze versions so we can have reproducable buildouts. Maybe simple approach: - Egg recipe outputs dependency info with debug logging - Egg recipe has option to specify dependencies. When used, don't automatically fetch newer data. - Option to search python path for distros - Part dependencies - custom uninstall - Fix develop so thet ordinary eggs fetched as dependencies end up in eggs directory. "fixed" except that fix is ineffective due to setuptools bug. :( - spelling :) - document recipe initialization order Issues - Should we include setuptools and buildout eggs for buildout process in environment when searching for requirements? - We don't want to look for new versions of setuptools all the time. For now, we always use a local dist if there is one. Needs more thought.