.flake80100644 0000000 0000000 00000000062 13025567015 010762 0ustar000000000 0000000 [flake8] max-line-length=80 ignore=E111,E114,E402 .gitattributes0100644 0000000 0000000 00000000171 13025567015 012503 0ustar000000000 0000000 # Prevent /bin/sh scripts from being clobbered by autocrlf=true git_ssh text eol=lf repo text eol=lf hooks/* text eol=lf .gitignore0100644 0000000 0000000 00000000033 13025567015 011575 0ustar000000000 0000000 *.pyc .repopickle_* /repoc .mailmap0100644 0000000 0000000 00000001573 13025567015 011240 0ustar000000000 0000000 Anthony Newnam Anthony Hu Xiuyun Hu xiuyun Hu Xiuyun Hu Xiuyun Jelly Chen chenguodong Jia Bi bijia JoonCheol Park Jooncheol Park Sergii Pylypenko pelya Shawn Pearce Shawn O. Pearce Ulrik Sjölin Ulrik Sjolin Ulrik Sjölin Ulrik Sjolin Ulrik Sjölin Ulrik Sjölin .project0100644 0000000 0000000 00000000552 13025567015 011262 0ustar000000000 0000000 git-repo org.python.pydev.PyDevBuilder org.python.pydev.pythonNature .pydevproject0100644 0000000 0000000 00000000636 13025567015 012335 0ustar000000000 0000000 /git-repo python 2.6 Default COPYING0100644 0000000 0000000 00000026136 13025567015 010654 0ustar000000000 0000000 Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. 
"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. 
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS

APPENDIX: How to apply the Apache License to your work.

To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives.

   Copyright [yyyy] [name of copyright owner]

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.

README.md

# repo

Repo is a tool built on top of Git. Repo helps manage many Git repositories, does the uploads to revision control systems, and automates parts of the development workflow. Repo is not meant to replace Git, only to make it easier to work with Git.

The repo command is an executable Python script that you can put anywhere in your path.

* Homepage: https://code.google.com/p/git-repo/
* Bug reports: https://code.google.com/p/git-repo/issues/
* Source: https://code.google.com/p/git-repo/
* Overview: https://source.android.com/source/developing.html
* Docs: https://source.android.com/source/using-repo.html
* [Submitting patches](./SUBMITTING_PATCHES.md)

SUBMITTING_PATCHES.md

# Short Version

- Make small logical changes.
- Provide a meaningful commit message.
- Check for coding errors and style nits with pyflakes and flake8
- Make sure all code is under the Apache License, 2.0.
- Publish your changes for review.
- Make corrections if requested.
- Verify your changes on gerrit so they can be submitted.

  `git push https://gerrit-review.googlesource.com/git-repo HEAD:refs/for/master`

# Long Version

I wanted a file describing how to submit patches for repo, so I started with the one found in the core Git distribution (Documentation/SubmittingPatches), which itself was based on the patch submission guidelines for the Linux kernel.

However there are some differences, so please review and familiarize yourself with the following relevant bits.

## Make separate commits for logically separate changes.

Unless your patch is really trivial, you should not be sending out a patch that was generated between your working tree and your commit head. Instead, always make a commit with a complete commit message and generate a series of patches from your repository. It is a good discipline.

Describe the technical detail of the change(s). If your description starts to get too long, that's a sign that you probably need to split up your commit into finer-grained pieces.

## Check for coding errors and style nits with pyflakes and flake8

### Coding errors

Run `pyflakes` on changed modules:

    pyflakes file.py

Ideally there should be no new errors or warnings introduced.

### Style violations

Run `flake8` on changed modules:

    flake8 file.py

Note that repo generally follows [Google's python style guide](https://google.github.io/styleguide/pyguide.html) rather than [PEP 8](https://www.python.org/dev/peps/pep-0008/), so it's possible that the output of `flake8` will be quite noisy. It's not mandatory to avoid all warnings, but at least the maximum line length should be followed.

If there are many occurrences of the same warning that cannot be avoided without going against the Google style guide, these may be suppressed in the included `.flake8` file.

## Check the license

repo is licensed under the Apache License, 2.0.

Because of this licensing model *every* file within the project *must* list the license that covers it in the header of the file. Any new contributions to an existing file *must* be submitted under the current license of that file. Any new files *must* clearly indicate which license they are provided under in the file header.

Please verify that you are legally allowed and willing to submit your changes under the license covering each file *prior* to submitting your patch. It is virtually impossible to remove a patch once it has been applied and pushed out.

## Sending your patches.

Do not email your patches to anyone.

Instead, login to the Gerrit Code Review tool at:

  https://gerrit-review.googlesource.com/

Ensure you have completed one of the necessary contributor agreements, providing documentation to the project maintainers that they have the right to redistribute your work under the Apache License:

  https://gerrit-review.googlesource.com/#/settings/agreements

Ensure you have obtained an HTTP password to authenticate:

  https://gerrit-review.googlesource.com/new-password

Ensure that you have the local commit hook installed to automatically add a ChangeId to your commits:

    curl -Lo `git rev-parse --git-dir`/hooks/commit-msg https://gerrit-review.googlesource.com/tools/hooks/commit-msg
    chmod +x `git rev-parse --git-dir`/hooks/commit-msg

If you have already committed your changes you will need to amend the commit to get the ChangeId added.
git commit --amend Push your patches over HTTPS to the review server, possibly through a remembered remote to make this easier in the future: git config remote.review.url https://gerrit-review.googlesource.com/git-repo git config remote.review.push HEAD:refs/for/master git push review You will be automatically emailed a copy of your commits, and any comments made by the project maintainers. ## Make changes if requested The project maintainer who reviews your changes might request changes to your commit. If you make the requested changes you will need to amend your commit and push it to the review server again. ## Verify your changes on gerrit After you receive a Code-Review+2 from the maintainer, select the Verified button on the gerrit page for the change. This verifies that you have tested your changes and notifies the maintainer that they are ready to be submitted. The maintainer will then submit your changes to the repository. color.py0100644 0000000 0000000 00000010527 13025567015 011306 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. import os import sys import pager COLORS = {None: -1, 'normal': -1, 'black': 0, 'red': 1, 'green': 2, 'yellow': 3, 'blue': 4, 'magenta': 5, 'cyan': 6, 'white': 7} ATTRS = {None: -1, 'bold': 1, 'dim': 2, 'ul': 4, 'blink': 5, 'reverse': 7} RESET = "\033[m" def is_color(s): return s in COLORS def is_attr(s): return s in ATTRS def _Color(fg=None, bg=None, attr=None): fg = COLORS[fg] bg = COLORS[bg] attr = ATTRS[attr] if attr >= 0 or fg >= 0 or bg >= 0: need_sep = False code = "\033[" if attr >= 0: code += chr(ord('0') + attr) need_sep = True if fg >= 0: if need_sep: code += ';' need_sep = True if fg < 8: code += '3%c' % (ord('0') + fg) else: code += '38;5;%d' % fg if bg >= 0: if need_sep: code += ';' if bg < 8: code += '4%c' % (ord('0') + bg) else: code += '48;5;%d' % bg code += 'm' else: code = '' return code DEFAULT = None def SetDefaultColoring(state): """Set coloring behavior to |state|. This is useful for overriding config options via the command line. """ if state is None: # Leave it alone -- return quick! 
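  # Example, for illustration: the _Color() helper above composes an ANSI SGR
  # escape from the COLORS/ATTRS tables, so _Color(fg='red', attr='bold')
  # yields '\033[1;31m' (attribute 1 = bold, then foreground 31 = red), with
  # colors 8 and above falling back to the extended '38;5;N' / '48;5;N' forms.
  # RESET ('\033[m') returns the terminal to its default rendition.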
return global DEFAULT state = state.lower() if state in ('auto',): DEFAULT = state elif state in ('always', 'yes', 'true', True): DEFAULT = 'always' elif state in ('never', 'no', 'false', False): DEFAULT = 'never' class Coloring(object): def __init__(self, config, section_type): self._section = 'color.%s' % section_type self._config = config self._out = sys.stdout on = DEFAULT if on is None: on = self._config.GetString(self._section) if on is None: on = self._config.GetString('color.ui') if on == 'auto': if pager.active or os.isatty(1): self._on = True else: self._on = False elif on in ('true', 'always'): self._on = True else: self._on = False def redirect(self, out): self._out = out @property def is_on(self): return self._on def write(self, fmt, *args): self._out.write(fmt % args) def flush(self): self._out.flush() def nl(self): self._out.write('\n') def printer(self, opt=None, fg=None, bg=None, attr=None): s = self c = self.colorer(opt, fg, bg, attr) def f(fmt, *args): s._out.write(c(fmt, *args)) return f def nofmt_printer(self, opt=None, fg=None, bg=None, attr=None): s = self c = self.nofmt_colorer(opt, fg, bg, attr) def f(fmt): s._out.write(c(fmt)) return f def colorer(self, opt=None, fg=None, bg=None, attr=None): if self._on: c = self._parse(opt, fg, bg, attr) def f(fmt, *args): output = fmt % args return ''.join([c, output, RESET]) return f else: def f(fmt, *args): return fmt % args return f def nofmt_colorer(self, opt=None, fg=None, bg=None, attr=None): if self._on: c = self._parse(opt, fg, bg, attr) def f(fmt): return ''.join([c, fmt, RESET]) return f else: def f(fmt): return fmt return f def _parse(self, opt, fg, bg, attr): if not opt: return _Color(fg, bg, attr) v = self._config.GetString('%s.%s' % (self._section, opt)) if v is None: return _Color(fg, bg, attr) v = v.strip().lower() if v == "reset": return RESET elif v == '': return _Color(fg, bg, attr) have_fg = False for a in v.split(' '): if is_color(a): if have_fg: bg = a else: fg = a elif is_attr(a): attr = a return _Color(fg, bg, attr) command.py0100644 0000000 0000000 00000017070 13025567015 011606 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. import os import optparse import platform import re import sys from error import NoSuchProjectError from error import InvalidProjectGroupsError class Command(object): """Base class for any command line action in repo. """ common = False manifest = None _optparse = None def WantPager(self, _opt): return False def ReadEnvironmentOptions(self, opts): """ Set options from environment variables. """ env_options = self._RegisteredEnvironmentOptions() for env_key, opt_key in env_options.items(): # Get the user-set option value if any opt_value = getattr(opts, opt_key) # If the value is set, it means the user has passed it as a command # line option, and we should use that. Otherwise we can try to set it # with the value from the corresponding environment variable. 
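      # For example (hypothetical subclass): if _RegisteredEnvironmentOptions()
      # returns {'REPO_MY_OPTION': 'my_option'}, then opts.my_option falls back
      # to the value of $REPO_MY_OPTION whenever the corresponding flag was not
      # given on the command line.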
if opt_value is not None: continue env_value = os.environ.get(env_key) if env_value is not None: setattr(opts, opt_key, env_value) return opts @property def OptionParser(self): if self._optparse is None: try: me = 'repo %s' % self.NAME usage = self.helpUsage.strip().replace('%prog', me) except AttributeError: usage = 'repo %s' % self.NAME self._optparse = optparse.OptionParser(usage=usage) self._Options(self._optparse) return self._optparse def _Options(self, p): """Initialize the option parser. """ def _RegisteredEnvironmentOptions(self): """Get options that can be set from environment variables. Return a dictionary mapping environment variable name to option key name that it can override. Example: {'REPO_MY_OPTION': 'my_option'} Will allow the option with key value 'my_option' to be set from the value in the environment variable named 'REPO_MY_OPTION'. Note: This does not work properly for options that are explicitly set to None by the user, or options that are defined with a default value other than None. """ return {} def Usage(self): """Display usage and terminate. """ self.OptionParser.print_usage() sys.exit(1) def Execute(self, opt, args): """Perform the action, after option parsing is complete. """ raise NotImplementedError def _ResetPathToProjectMap(self, projects): self._by_path = dict((p.worktree, p) for p in projects) def _UpdatePathToProjectMap(self, project): self._by_path[project.worktree] = project def _GetProjectByPath(self, manifest, path): project = None if os.path.exists(path): oldpath = None while path and \ path != oldpath and \ path != manifest.topdir: try: project = self._by_path[path] break except KeyError: oldpath = path path = os.path.dirname(path) if not project and path == manifest.topdir: try: project = self._by_path[path] except KeyError: pass else: try: project = self._by_path[path] except KeyError: pass return project def GetProjects(self, args, manifest=None, groups='', missing_ok=False, submodules_ok=False): """A list of projects that match the arguments. """ if not manifest: manifest = self.manifest all_projects_list = manifest.projects result = [] mp = manifest.manifestProject if not groups: groups = mp.config.GetString('manifest.groups') if not groups: groups = 'default,platform-' + platform.system().lower() groups = [x for x in re.split(r'[,\s]+', groups) if x] if not args: derived_projects = {} for project in all_projects_list: if submodules_ok or project.sync_s: derived_projects.update((p.name, p) for p in project.GetDerivedSubprojects()) all_projects_list.extend(derived_projects.values()) for project in all_projects_list: if (missing_ok or project.Exists) and project.MatchesGroups(groups): result.append(project) else: self._ResetPathToProjectMap(all_projects_list) for arg in args: projects = manifest.GetProjectsWithName(arg) if not projects: path = os.path.abspath(arg).replace('\\', '/') project = self._GetProjectByPath(manifest, path) # If it's not a derived project, update path->project mapping and # search again, as arg might actually point to a derived subproject. 
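        # For instance (hypothetical layout): an argument like external/foo/bar,
        # where external/foo is a regular project and bar one of its
        # sub-projects, only resolves after the parent's GetDerivedSubprojects()
        # entries have been added to the path map.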
if (project and not project.Derived and (submodules_ok or project.sync_s)): search_again = False for subproject in project.GetDerivedSubprojects(): self._UpdatePathToProjectMap(subproject) search_again = True if search_again: project = self._GetProjectByPath(manifest, path) or project if project: projects = [project] if not projects: raise NoSuchProjectError(arg) for project in projects: if not missing_ok and not project.Exists: raise NoSuchProjectError(arg) if not project.MatchesGroups(groups): raise InvalidProjectGroupsError(arg) result.extend(projects) def _getpath(x): return x.relpath result.sort(key=_getpath) return result def FindProjects(self, args, inverse=False): result = [] patterns = [re.compile(r'%s' % a, re.IGNORECASE) for a in args] for project in self.GetProjects(''): for pattern in patterns: match = pattern.search(project.name) or pattern.search(project.relpath) if not inverse and match: result.append(project) break if inverse and match: break else: if inverse: result.append(project) result.sort(key=lambda project: project.relpath) return result # pylint: disable=W0223 # Pylint warns that the `InteractiveCommand` and `PagedCommand` classes do not # override method `Execute` which is abstract in `Command`. Since that method # is always implemented in classes derived from `InteractiveCommand` and # `PagedCommand`, this warning can be suppressed. class InteractiveCommand(Command): """Command which requires user interaction on the tty and must not run within a pager, even if the user asks to. """ def WantPager(self, _opt): return False class PagedCommand(Command): """Command which defaults to output in a pager, as its display tends to be larger than one screen full. """ def WantPager(self, _opt): return True # pylint: enable=W0223 class MirrorSafeCommand(object): """Command permits itself to run within a mirror, and does not require a working directory. """ class GitcAvailableCommand(object): """Command that requires GITC to be available, but does not require the local client to be a GITC client. """ class GitcClientCommand(object): """Command that requires the local client to be a GITC client. """ docs/0040755 0000000 0000000 00000000000 13025567015 010544 5ustar000000000 0000000 docs/manifest-format.txt0100644 0000000 0000000 00000033504 13025567015 014403 0ustar000000000 0000000 repo Manifest Format ==================== A repo manifest describes the structure of a repo client; that is the directories that are visible and where they should be obtained from with git. The basic structure of a manifest is a bare Git repository holding a single 'default.xml' XML file in the top level directory. Manifests are inherently version controlled, since they are kept within a Git repository. Updates to manifests are automatically obtained by clients during `repo sync`. XML File Format --------------- A manifest XML file (e.g. 'default.xml') roughly conforms to the following DTD: ]> A description of the elements and their attributes follows. Element manifest ---------------- The root element of the file. Element remote -------------- One or more remote elements may be specified. Each remote element specifies a Git URL shared by one or more projects and (optionally) the Gerrit review server those projects upload changes through. Attribute `name`: A short name unique to this manifest file. The name specified here is used as the remote name in each project's .git/config, and is therefore automatically available to commands like `git fetch`, `git remote`, `git pull` and `git push`. 
Attribute `alias`: The alias, if specified, is used to override `name` to be set as the remote name in each project's .git/config. Its value can be duplicated while attribute `name` has to be unique in the manifest file. This helps each project to be able to have same remote name which actually points to different remote url. Attribute `fetch`: The Git URL prefix for all projects which use this remote. Each project's name is appended to this prefix to form the actual URL used to clone the project. Attribute `pushurl`: The Git "push" URL prefix for all projects which use this remote. Each project's name is appended to this prefix to form the actual URL used to "git push" the project. This attribute is optional; if not specified then "git push" will use the same URL as the `fetch` attribute. Attribute `review`: Hostname of the Gerrit server where reviews are uploaded to by `repo upload`. This attribute is optional; if not specified then `repo upload` will not function. Attribute `revision`: Name of a Git branch (e.g. `master` or `refs/heads/master`). Remotes with their own revision will override the default revision. Element default --------------- At most one default element may be specified. Its remote and revision attributes are used when a project element does not specify its own remote or revision attribute. Attribute `remote`: Name of a previously defined remote element. Project elements lacking a remote attribute of their own will use this remote. Attribute `revision`: Name of a Git branch (e.g. `master` or `refs/heads/master`). Project elements lacking their own revision attribute will use this revision. Attribute `dest-branch`: Name of a Git branch (e.g. `master`). Project elements not setting their own `dest-branch` will inherit this value. If this value is not set, projects will use `revision` by default instead. Attribute `sync-j`: Number of parallel jobs to use when synching. Attribute `sync-c`: Set to true to only sync the given Git branch (specified in the `revision` attribute) rather than the whole ref space. Project elements lacking a sync-c element of their own will use this value. Attribute `sync-s`: Set to true to also sync sub-projects. Element manifest-server ----------------------- At most one manifest-server may be specified. The url attribute is used to specify the URL of a manifest server, which is an XML RPC service. The manifest server should implement the following RPC methods: GetApprovedManifest(branch, target) Return a manifest in which each project is pegged to a known good revision for the current branch and target. This is used by repo sync when the --smart-sync option is given. The target to use is defined by environment variables TARGET_PRODUCT and TARGET_BUILD_VARIANT. These variables are used to create a string of the form $TARGET_PRODUCT-$TARGET_BUILD_VARIANT, e.g. passion-userdebug. If one of those variables or both are not present, the program will call GetApprovedManifest without the target parameter and the manifest server should choose a reasonable default target. GetManifest(tag) Return a manifest in which each project is pegged to the revision at the specified tag. This is used by repo sync when the --smart-tag option is given. Element project --------------- One or more project elements may be specified. Each element describes a single Git repository to be cloned into the repo client workspace. You may specify Git-submodules by creating a nested project. 
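As an illustration only (the project, remote, and path names below are made up, and the snippet merely sketches the nesting just described), a sub-project is written as a project element inside another project element; parsing such a fragment with Python's xml.etree shows how the child is discovered:

  import xml.etree.ElementTree as ET

  # Hypothetical manifest fragment; names and URLs are placeholders.
  manifest_xml = """
  <manifest>
    <remote name="origin" fetch="https://git.example.com/"/>
    <default remote="origin" revision="master"/>
    <project name="platform/outer" path="outer">
      <project name="inner" path="inner"/>
    </project>
  </manifest>
  """

  root = ET.fromstring(manifest_xml)
  for parent in root.findall('project'):
    for child in parent.findall('project'):
      # repo prefixes a nested project's name and path with its parent's.
      print('%s/%s checked out at %s/%s' % (parent.get('name'),
                                            child.get('name'),
                                            parent.get('path'),
                                            child.get('path')))
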
Git-submodules will be automatically recognized and inherit their parent's attributes, but those may be overridden by an explicitly specified project element. Attribute `name`: A unique name for this project. The project's name is appended onto its remote's fetch URL to generate the actual URL to configure the Git remote with. The URL gets formed as: ${remote_fetch}/${project_name}.git where ${remote_fetch} is the remote's fetch attribute and ${project_name} is the project's name attribute. The suffix ".git" is always appended as repo assumes the upstream is a forest of bare Git repositories. If the project has a parent element, its name will be prefixed by the parent's. The project name must match the name Gerrit knows, if Gerrit is being used for code reviews. Attribute `path`: An optional path relative to the top directory of the repo client where the Git working directory for this project should be placed. If not supplied the project name is used. If the project has a parent element, its path will be prefixed by the parent's. Attribute `remote`: Name of a previously defined remote element. If not supplied the remote given by the default element is used. Attribute `revision`: Name of the Git branch the manifest wants to track for this project. Names can be relative to refs/heads (e.g. just "master") or absolute (e.g. "refs/heads/master"). Tags and/or explicit SHA-1s should work in theory, but have not been extensively tested. If not supplied the revision given by the remote element is used if applicable, else the default element is used. Attribute `dest-branch`: Name of a Git branch (e.g. `master`). When using `repo upload`, changes will be submitted for code review on this branch. If unspecified both here and in the default element, `revision` is used instead. Attribute `groups`: List of groups to which this project belongs, whitespace or comma separated. All projects belong to the group "all", and each project automatically belongs to a group of its name:`name` and path:`path`. E.g. for , that project definition is implicitly in the following manifest groups: default, name:monkeys, and path:barrel-of. If you place a project in the group "notdefault", it will not be automatically downloaded by repo. If the project has a parent element, the `name` and `path` here are the prefixed ones. Attribute `sync-c`: Set to true to only sync the given Git branch (specified in the `revision` attribute) rather than the whole ref space. Attribute `sync-s`: Set to true to also sync sub-projects. Attribute `upstream`: Name of the Git ref in which a sha1 can be found. Used when syncing a revision locked manifest in -c mode to avoid having to sync the entire ref space. Attribute `clone-depth`: Set the depth to use when fetching this project. If specified, this value will override any value given to repo init with the --depth option on the command line. Attribute `force-path`: Set to true to force this project to create the local mirror repository according to its `path` attribute (if supplied) rather than the `name` attribute. This attribute only applies to the local mirrors syncing, it will be ignored when syncing the projects in a client working directory. Element extend-project ---------------------- Modify the attributes of the named project. This element is mostly useful in a local manifest file, to modify the attributes of an existing project without completely replacing the existing project definition. This makes the local manifest more robust against changes to the original manifest. 
Attribute `path`: If specified, limit the change to projects checked out at the specified path, rather than all projects with the given name. Attribute `groups`: List of additional groups to which this project belongs. Same syntax as the corresponding element of `project`. Element annotation ------------------ Zero or more annotation elements may be specified as children of a project element. Each element describes a name-value pair that will be exported into each project's environment during a 'forall' command, prefixed with REPO__. In addition, there is an optional attribute "keep" which accepts the case insensitive values "true" (default) or "false". This attribute determines whether or not the annotation will be kept when exported with the manifest subcommand. Element copyfile ---------------- Zero or more copyfile elements may be specified as children of a project element. Each element describes a src-dest pair of files; the "src" file will be copied to the "dest" place during 'repo sync' command. "src" is project relative, "dest" is relative to the top of the tree. Element linkfile ---------------- It's just like copyfile and runs at the same time as copyfile but instead of copying it creates a symlink. Element remove-project ---------------------- Deletes the named project from the internal manifest table, possibly allowing a subsequent project element in the same manifest file to replace the project with a different source. This element is mostly useful in a local manifest file, where the user can remove a project, and possibly replace it with their own definition. Element include --------------- This element provides the capability of including another manifest file into the originating manifest. Normal rules apply for the target manifest to include - it must be a usable manifest on its own. Attribute `name`: the manifest to include, specified relative to the manifest repository's root. Local Manifests =============== Additional remotes and projects may be added through local manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml`. For example: $ ls .repo/local_manifests local_manifest.xml another_local_manifest.xml $ cat .repo/local_manifests/local_manifest.xml Users may add projects to the local manifest(s) prior to a `repo sync` invocation, instructing repo to automatically download and manage these extra projects. Manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml` will be loaded in alphabetical order. Additional remotes and projects may also be added through a local manifest, stored in `$TOP_DIR/.repo/local_manifest.xml`. This method is deprecated in favor of using multiple manifest files as mentioned above. If `$TOP_DIR/.repo/local_manifest.xml` exists, it will be loaded before any manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml`. editor.py0100644 0000000 0000000 00000005144 13025567015 011455 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
from __future__ import print_function import os import re import sys import subprocess import tempfile from error import EditorError class Editor(object): """Manages the user's preferred text editor.""" _editor = None globalConfig = None @classmethod def _GetEditor(cls): if cls._editor is None: cls._editor = cls._SelectEditor() return cls._editor @classmethod def _SelectEditor(cls): e = os.getenv('GIT_EDITOR') if e: return e if cls.globalConfig: e = cls.globalConfig.GetString('core.editor') if e: return e e = os.getenv('VISUAL') if e: return e e = os.getenv('EDITOR') if e: return e if os.getenv('TERM') == 'dumb': print( """No editor specified in GIT_EDITOR, core.editor, VISUAL or EDITOR. Tried to fall back to vi but terminal is dumb. Please configure at least one of these before using this command.""", file=sys.stderr) sys.exit(1) return 'vi' @classmethod def EditString(cls, data): """Opens an editor to edit the given content. Args: data : the text to edit Returns: new value of edited text; None if editing did not succeed """ editor = cls._GetEditor() if editor == ':': return data fd, path = tempfile.mkstemp() try: os.write(fd, data) os.close(fd) fd = None if re.compile("^.*[$ \t'].*$").match(editor): args = [editor + ' "$@"', 'sh'] shell = True else: args = [editor] shell = False args.append(path) try: rc = subprocess.Popen(args, shell=shell).wait() except OSError as e: raise EditorError('editor failed, %s: %s %s' % (str(e), editor, path)) if rc != 0: raise EditorError('editor failed with exit status %d: %s %s' % (rc, editor, path)) fd2 = open(path) try: return fd2.read() finally: fd2.close() finally: if fd: os.close(fd) os.remove(path) error.py0100644 0000000 0000000 00000006043 13025567015 011317 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. class ManifestParseError(Exception): """Failed to parse the manifest file. """ class ManifestInvalidRevisionError(Exception): """The revision value in a project is incorrect. """ class NoManifestException(Exception): """The required manifest does not exist. """ def __init__(self, path, reason): super(NoManifestException, self).__init__() self.path = path self.reason = reason def __str__(self): return self.reason class EditorError(Exception): """Unspecified error from the user's text editor. """ def __init__(self, reason): super(EditorError, self).__init__() self.reason = reason def __str__(self): return self.reason class GitError(Exception): """Unspecified internal error from git. """ def __init__(self, command): super(GitError, self).__init__() self.command = command def __str__(self): return self.command class UploadError(Exception): """A bundle upload to Gerrit did not succeed. """ def __init__(self, reason): super(UploadError, self).__init__() self.reason = reason def __str__(self): return self.reason class DownloadError(Exception): """Cannot download a repository. 
""" def __init__(self, reason): super(DownloadError, self).__init__() self.reason = reason def __str__(self): return self.reason class NoSuchProjectError(Exception): """A specified project does not exist in the work tree. """ def __init__(self, name=None): super(NoSuchProjectError, self).__init__() self.name = name def __str__(self): if self.name is None: return 'in current directory' return self.name class InvalidProjectGroupsError(Exception): """A specified project is not suitable for the specified groups """ def __init__(self, name=None): super(InvalidProjectGroupsError, self).__init__() self.name = name def __str__(self): if self.name is None: return 'in current directory' return self.name class RepoChangedException(Exception): """Thrown if 'repo sync' results in repo updating its internal repo or manifest repositories. In this special case we must use exec to re-execute repo with the new code and manifest. """ def __init__(self, extra_args=None): super(RepoChangedException, self).__init__() self.extra_args = extra_args or [] class HookError(Exception): """Thrown if a 'repo-hook' could not be run. The common case is that the file wasn't present when we tried to run it. """ git_command.py0100644 0000000 0000000 00000016403 13025567015 012450 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
from __future__ import print_function import fcntl import os import select import sys import subprocess import tempfile from signal import SIGTERM from error import GitError from trace import REPO_TRACE, IsTrace, Trace from wrapper import Wrapper GIT = 'git' MIN_GIT_VERSION = (1, 5, 4) GIT_DIR = 'GIT_DIR' LAST_GITDIR = None LAST_CWD = None _ssh_proxy_path = None _ssh_sock_path = None _ssh_clients = [] def ssh_sock(create=True): global _ssh_sock_path if _ssh_sock_path is None: if not create: return None tmp_dir = '/tmp' if not os.path.exists(tmp_dir): tmp_dir = tempfile.gettempdir() _ssh_sock_path = os.path.join( tempfile.mkdtemp('', 'ssh-', tmp_dir), 'master-%r@%h:%p') return _ssh_sock_path def _ssh_proxy(): global _ssh_proxy_path if _ssh_proxy_path is None: _ssh_proxy_path = os.path.join( os.path.dirname(__file__), 'git_ssh') return _ssh_proxy_path def _add_ssh_client(p): _ssh_clients.append(p) def _remove_ssh_client(p): try: _ssh_clients.remove(p) except ValueError: pass def terminate_ssh_clients(): global _ssh_clients for p in _ssh_clients: try: os.kill(p.pid, SIGTERM) p.wait() except OSError: pass _ssh_clients = [] _git_version = None class _sfd(object): """select file descriptor class""" def __init__(self, fd, dest, std_name): assert std_name in ('stdout', 'stderr') self.fd = fd self.dest = dest self.std_name = std_name def fileno(self): return self.fd.fileno() class _GitCall(object): def version(self): p = GitCommand(None, ['--version'], capture_stdout=True) if p.Wait() == 0: if hasattr(p.stdout, 'decode'): return p.stdout.decode('utf-8') else: return p.stdout return None def version_tuple(self): global _git_version if _git_version is None: ver_str = git.version() _git_version = Wrapper().ParseGitVersion(ver_str) if _git_version is None: print('fatal: "%s" unsupported' % ver_str, file=sys.stderr) sys.exit(1) return _git_version def __getattr__(self, name): name = name.replace('_','-') def fun(*cmdv): command = [name] command.extend(cmdv) return GitCommand(None, command).Wait() == 0 return fun git = _GitCall() def git_require(min_version, fail=False): git_version = git.version_tuple() if min_version <= git_version: return True if fail: need = '.'.join(map(str, min_version)) print('fatal: git %s or later required' % need, file=sys.stderr) sys.exit(1) return False def _setenv(env, name, value): env[name] = value.encode() class GitCommand(object): def __init__(self, project, cmdv, bare = False, provide_stdin = False, capture_stdout = False, capture_stderr = False, disable_editor = False, ssh_proxy = False, cwd = None, gitdir = None): env = os.environ.copy() for key in [REPO_TRACE, GIT_DIR, 'GIT_ALTERNATE_OBJECT_DIRECTORIES', 'GIT_OBJECT_DIRECTORY', 'GIT_WORK_TREE', 'GIT_GRAFT_FILE', 'GIT_INDEX_FILE']: if key in env: del env[key] # If we are not capturing std* then need to print it. 
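    # For example, _GitCall.version() above constructs
    # GitCommand(None, ['--version'], capture_stdout=True): tee['stdout'] ends
    # up False, so _CaptureOutput() only accumulates the text into self.stdout
    # instead of echoing it to the terminal, while stderr is still passed
    # through.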
self.tee = {'stdout': not capture_stdout, 'stderr': not capture_stderr} if disable_editor: _setenv(env, 'GIT_EDITOR', ':') if ssh_proxy: _setenv(env, 'REPO_SSH_SOCK', ssh_sock()) _setenv(env, 'GIT_SSH', _ssh_proxy()) if 'http_proxy' in env and 'darwin' == sys.platform: s = "'http.proxy=%s'" % (env['http_proxy'],) p = env.get('GIT_CONFIG_PARAMETERS') if p is not None: s = p + ' ' + s _setenv(env, 'GIT_CONFIG_PARAMETERS', s) if 'GIT_ALLOW_PROTOCOL' not in env: _setenv(env, 'GIT_ALLOW_PROTOCOL', 'file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc') if project: if not cwd: cwd = project.worktree if not gitdir: gitdir = project.gitdir command = [GIT] if bare: if gitdir: _setenv(env, GIT_DIR, gitdir) cwd = None command.append(cmdv[0]) # Need to use the --progress flag for fetch/clone so output will be # displayed as by default git only does progress output if stderr is a TTY. if sys.stderr.isatty() and cmdv[0] in ('fetch', 'clone'): if '--progress' not in cmdv and '--quiet' not in cmdv: command.append('--progress') command.extend(cmdv[1:]) if provide_stdin: stdin = subprocess.PIPE else: stdin = None stdout = subprocess.PIPE stderr = subprocess.PIPE if IsTrace(): global LAST_CWD global LAST_GITDIR dbg = '' if cwd and LAST_CWD != cwd: if LAST_GITDIR or LAST_CWD: dbg += '\n' dbg += ': cd %s\n' % cwd LAST_CWD = cwd if GIT_DIR in env and LAST_GITDIR != env[GIT_DIR]: if LAST_GITDIR or LAST_CWD: dbg += '\n' dbg += ': export GIT_DIR=%s\n' % env[GIT_DIR] LAST_GITDIR = env[GIT_DIR] dbg += ': ' dbg += ' '.join(command) if stdin == subprocess.PIPE: dbg += ' 0<|' if stdout == subprocess.PIPE: dbg += ' 1>|' if stderr == subprocess.PIPE: dbg += ' 2>|' Trace('%s', dbg) try: p = subprocess.Popen(command, cwd = cwd, env = env, stdin = stdin, stdout = stdout, stderr = stderr) except Exception as e: raise GitError('%s: %s' % (command[1], e)) if ssh_proxy: _add_ssh_client(p) self.process = p self.stdin = p.stdin def Wait(self): try: p = self.process rc = self._CaptureOutput() finally: _remove_ssh_client(p) return rc def _CaptureOutput(self): p = self.process s_in = [_sfd(p.stdout, sys.stdout, 'stdout'), _sfd(p.stderr, sys.stderr, 'stderr')] self.stdout = '' self.stderr = '' for s in s_in: flags = fcntl.fcntl(s.fd, fcntl.F_GETFL) fcntl.fcntl(s.fd, fcntl.F_SETFL, flags | os.O_NONBLOCK) while s_in: in_ready, _, _ = select.select(s_in, [], []) for s in in_ready: buf = s.fd.read(4096) if not buf: s_in.remove(s) continue if not hasattr(buf, 'encode'): buf = buf.decode() if s.std_name == 'stdout': self.stdout += buf else: self.stderr += buf if self.tee[s.std_name]: s.dest.write(buf) s.dest.flush() return p.wait() git_config.py0100644 0000000 0000000 00000050776 13025567015 012312 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
from __future__ import print_function import contextlib import errno import json import os import re import subprocess import sys try: import threading as _threading except ImportError: import dummy_threading as _threading import time from pyversion import is_python3 if is_python3(): import urllib.request import urllib.error else: import urllib2 import imp urllib = imp.new_module('urllib') urllib.request = urllib2 urllib.error = urllib2 from signal import SIGTERM from error import GitError, UploadError from trace import Trace if is_python3(): from http.client import HTTPException else: from httplib import HTTPException from git_command import GitCommand from git_command import ssh_sock from git_command import terminate_ssh_clients R_HEADS = 'refs/heads/' R_TAGS = 'refs/tags/' ID_RE = re.compile(r'^[0-9a-f]{40}$') REVIEW_CACHE = dict() def IsId(rev): return ID_RE.match(rev) def _key(name): parts = name.split('.') if len(parts) < 2: return name.lower() parts[ 0] = parts[ 0].lower() parts[-1] = parts[-1].lower() return '.'.join(parts) class GitConfig(object): _ForUser = None @classmethod def ForUser(cls): if cls._ForUser is None: cls._ForUser = cls(configfile = os.path.expanduser('~/.gitconfig')) return cls._ForUser @classmethod def ForRepository(cls, gitdir, defaults=None): return cls(configfile = os.path.join(gitdir, 'config'), defaults = defaults) def __init__(self, configfile, defaults=None, jsonFile=None): self.file = configfile self.defaults = defaults self._cache_dict = None self._section_dict = None self._remotes = {} self._branches = {} self._json = jsonFile if self._json is None: self._json = os.path.join( os.path.dirname(self.file), '.repo_' + os.path.basename(self.file) + '.json') def Has(self, name, include_defaults = True): """Return true if this configuration file has the key. """ if _key(name) in self._cache: return True if include_defaults and self.defaults: return self.defaults.Has(name, include_defaults = True) return False def GetBoolean(self, name): """Returns a boolean from the configuration file. None : The value was not defined, or is not a boolean. True : The value was set to true or yes. False: The value was set to false or no. """ v = self.GetString(name) if v is None: return None v = v.lower() if v in ('true', 'yes'): return True if v in ('false', 'no'): return False return None def GetString(self, name, all_keys=False): """Get the first value for a key, or None if it is not defined. This configuration file is used first, if the key is not defined or all_keys = True then the defaults are also searched. """ try: v = self._cache[_key(name)] except KeyError: if self.defaults: return self.defaults.GetString(name, all_keys = all_keys) v = [] if not all_keys: if v: return v[0] return None r = [] r.extend(v) if self.defaults: r.extend(self.defaults.GetString(name, all_keys = True)) return r def SetString(self, name, value): """Set the value(s) for a key. Only this configuration file is modified. The supplied value should be either a string, or a list of strings (to store multiple values). 
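    For example (using 'core.editor' purely as an illustrative key):
    SetString('core.editor', 'vim') runs
    `git config --file <this file> --replace-all core.editor vim` once the
    cached value differs; a list is written with --replace-all followed by
    --add for each additional element; and None unsets the key via
    --unset-all.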
""" key = _key(name) try: old = self._cache[key] except KeyError: old = [] if value is None: if old: del self._cache[key] self._do('--unset-all', name) elif isinstance(value, list): if len(value) == 0: self.SetString(name, None) elif len(value) == 1: self.SetString(name, value[0]) elif old != value: self._cache[key] = list(value) self._do('--replace-all', name, value[0]) for i in range(1, len(value)): self._do('--add', name, value[i]) elif len(old) != 1 or old[0] != value: self._cache[key] = [value] self._do('--replace-all', name, value) def GetRemote(self, name): """Get the remote.$name.* configuration values as an object. """ try: r = self._remotes[name] except KeyError: r = Remote(self, name) self._remotes[r.name] = r return r def GetBranch(self, name): """Get the branch.$name.* configuration values as an object. """ try: b = self._branches[name] except KeyError: b = Branch(self, name) self._branches[b.name] = b return b def GetSubSections(self, section): """List all subsection names matching $section.*.* """ return self._sections.get(section, set()) def HasSection(self, section, subsection = ''): """Does at least one key in section.subsection exist? """ try: return subsection in self._sections[section] except KeyError: return False def UrlInsteadOf(self, url): """Resolve any url.*.insteadof references. """ for new_url in self.GetSubSections('url'): for old_url in self.GetString('url.%s.insteadof' % new_url, True): if old_url is not None and url.startswith(old_url): return new_url + url[len(old_url):] return url @property def _sections(self): d = self._section_dict if d is None: d = {} for name in self._cache.keys(): p = name.split('.') if 2 == len(p): section = p[0] subsect = '' else: section = p[0] subsect = '.'.join(p[1:-1]) if section not in d: d[section] = set() d[section].add(subsect) self._section_dict = d return d @property def _cache(self): if self._cache_dict is None: self._cache_dict = self._Read() return self._cache_dict def _Read(self): d = self._ReadJson() if d is None: d = self._ReadGit() self._SaveJson(d) return d def _ReadJson(self): try: if os.path.getmtime(self._json) \ <= os.path.getmtime(self.file): os.remove(self._json) return None except OSError: return None try: Trace(': parsing %s', self.file) fd = open(self._json) try: return json.load(fd) finally: fd.close() except (IOError, ValueError): os.remove(self._json) return None def _SaveJson(self, cache): try: fd = open(self._json, 'w') try: json.dump(cache, fd, indent=2) finally: fd.close() except (IOError, TypeError): if os.path.exists(self._json): os.remove(self._json) def _ReadGit(self): """ Read configuration data from git. This internal method populates the GitConfig cache. 
""" c = {} d = self._do('--null', '--list') if d is None: return c for line in d.decode('utf-8').rstrip('\0').split('\0'): # pylint: disable=W1401 # Backslash is not anomalous if '\n' in line: key, val = line.split('\n', 1) else: key = line val = None if key in c: c[key].append(val) else: c[key] = [val] return c def _do(self, *args): command = ['config', '--file', self.file] command.extend(args) p = GitCommand(None, command, capture_stdout = True, capture_stderr = True) if p.Wait() == 0: return p.stdout else: GitError('git config %s: %s' % (str(args), p.stderr)) class RefSpec(object): """A Git refspec line, split into its components: forced: True if the line starts with '+' src: Left side of the line dst: Right side of the line """ @classmethod def FromString(cls, rs): lhs, rhs = rs.split(':', 2) if lhs.startswith('+'): lhs = lhs[1:] forced = True else: forced = False return cls(forced, lhs, rhs) def __init__(self, forced, lhs, rhs): self.forced = forced self.src = lhs self.dst = rhs def SourceMatches(self, rev): if self.src: if rev == self.src: return True if self.src.endswith('/*') and rev.startswith(self.src[:-1]): return True return False def DestMatches(self, ref): if self.dst: if ref == self.dst: return True if self.dst.endswith('/*') and ref.startswith(self.dst[:-1]): return True return False def MapSource(self, rev): if self.src.endswith('/*'): return self.dst[:-1] + rev[len(self.src) - 1:] return self.dst def __str__(self): s = '' if self.forced: s += '+' if self.src: s += self.src if self.dst: s += ':' s += self.dst return s _master_processes = [] _master_keys = set() _ssh_master = True _master_keys_lock = None def init_ssh(): """Should be called once at the start of repo to init ssh master handling. At the moment, all we do is to create our lock. """ global _master_keys_lock assert _master_keys_lock is None, "Should only call init_ssh once" _master_keys_lock = _threading.Lock() def _open_ssh(host, port=None): global _ssh_master # Acquire the lock. This is needed to prevent opening multiple masters for # the same host when we're running "repo sync -jN" (for N > 1) _and_ the # manifest specifies a different host from the # one that was passed to repo init. _master_keys_lock.acquire() try: # Check to see whether we already think that the master is running; if we # think it's already running, return right away. if port is not None: key = '%s:%s' % (host, port) else: key = host if key in _master_keys: return True if not _ssh_master \ or 'GIT_SSH' in os.environ \ or sys.platform in ('win32', 'cygwin'): # failed earlier, or cygwin ssh can't do this # return False # We will make two calls to ssh; this is the common part of both calls. command_base = ['ssh', '-o','ControlPath %s' % ssh_sock(), host] if port is not None: command_base[1:1] = ['-p', str(port)] # Since the key wasn't in _master_keys, we think that master isn't running. # ...but before actually starting a master, we'll double-check. This can # be important because we can't tell that that 'git@myhost.com' is the same # as 'myhost.com' where "User git" is setup in the user's ~/.ssh/config file. check_command = command_base + ['-O','check'] try: Trace(': %s', ' '.join(check_command)) check_process = subprocess.Popen(check_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE) check_process.communicate() # read output, but ignore it... isnt_running = check_process.wait() if not isnt_running: # Our double-check found that the master _was_ infact running. Add to # the list of keys. 
_master_keys.add(key) return True except Exception: # Ignore exceptions. We will fall back to the normal command and print # to the log there. pass command = command_base[:1] + \ ['-M', '-N'] + \ command_base[1:] try: Trace(': %s', ' '.join(command)) p = subprocess.Popen(command) except Exception as e: _ssh_master = False print('\nwarn: cannot enable ssh control master for %s:%s\n%s' % (host,port, str(e)), file=sys.stderr) return False time.sleep(1) ssh_died = (p.poll() is not None) if ssh_died: return False _master_processes.append(p) _master_keys.add(key) return True finally: _master_keys_lock.release() def close_ssh(): global _master_keys_lock terminate_ssh_clients() for p in _master_processes: try: os.kill(p.pid, SIGTERM) p.wait() except OSError: pass del _master_processes[:] _master_keys.clear() d = ssh_sock(create=False) if d: try: os.rmdir(os.path.dirname(d)) except OSError: pass # We're done with the lock, so we can delete it. _master_keys_lock = None URI_SCP = re.compile(r'^([^@:]*@?[^:/]{1,}):') URI_ALL = re.compile(r'^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/') def GetSchemeFromUrl(url): m = URI_ALL.match(url) if m: return m.group(1) return None @contextlib.contextmanager def GetUrlCookieFile(url, quiet): if url.startswith('persistent-'): try: p = subprocess.Popen( ['git-remote-persistent-https', '-print_config', url], stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE) try: cookieprefix = 'http.cookiefile=' proxyprefix = 'http.proxy=' cookiefile = None proxy = None for line in p.stdout: line = line.strip() if line.startswith(cookieprefix): cookiefile = line[len(cookieprefix):] if line.startswith(proxyprefix): proxy = line[len(proxyprefix):] # Leave subprocess open, as cookie file may be transient. if cookiefile or proxy: yield cookiefile, proxy return finally: p.stdin.close() if p.wait(): err_msg = p.stderr.read() if ' -print_config' in err_msg: pass # Persistent proxy doesn't support -print_config. elif not quiet: print(err_msg, file=sys.stderr) except OSError as e: if e.errno == errno.ENOENT: pass # No persistent proxy. raise yield GitConfig.ForUser().GetString('http.cookiefile'), None def _preconnect(url): m = URI_ALL.match(url) if m: scheme = m.group(1) host = m.group(2) if ':' in host: host, port = host.split(':') else: port = None if scheme in ('ssh', 'git+ssh', 'ssh+git'): return _open_ssh(host, port) return False m = URI_SCP.match(url) if m: host = m.group(1) return _open_ssh(host) return False class Remote(object): """Configuration options related to a remote. """ def __init__(self, config, name): self._config = config self.name = name self.url = self._Get('url') self.pushUrl = self._Get('pushurl') self.review = self._Get('review') self.projectname = self._Get('projectname') self.fetch = list(map(RefSpec.FromString, self._Get('fetch', all_keys=True))) self._review_url = None def _InsteadOf(self): globCfg = GitConfig.ForUser() urlList = globCfg.GetSubSections('url') longest = "" longestUrl = "" for url in urlList: key = "url."
+ url + ".insteadOf" insteadOfList = globCfg.GetString(key, all_keys=True) for insteadOf in insteadOfList: if self.url.startswith(insteadOf) \ and len(insteadOf) > len(longest): longest = insteadOf longestUrl = url if len(longest) == 0: return self.url return self.url.replace(longest, longestUrl, 1) def PreConnectFetch(self): connectionUrl = self._InsteadOf() return _preconnect(connectionUrl) def ReviewUrl(self, userEmail): if self._review_url is None: if self.review is None: return None u = self.review if u.startswith('persistent-'): u = u[len('persistent-'):] if u.split(':')[0] not in ('http', 'https', 'sso'): u = 'http://%s' % u if u.endswith('/Gerrit'): u = u[:len(u) - len('/Gerrit')] if u.endswith('/ssh_info'): u = u[:len(u) - len('/ssh_info')] if not u.endswith('/'): u += '/' http_url = u if u in REVIEW_CACHE: self._review_url = REVIEW_CACHE[u] elif 'REPO_HOST_PORT_INFO' in os.environ: host, port = os.environ['REPO_HOST_PORT_INFO'].split() self._review_url = self._SshReviewUrl(userEmail, host, port) REVIEW_CACHE[u] = self._review_url elif u.startswith('sso:'): self._review_url = u # Assume it's right REVIEW_CACHE[u] = self._review_url else: try: info_url = u + 'ssh_info' info = urllib.request.urlopen(info_url).read() if info == 'NOT_AVAILABLE' or '<' in info: # If `info` contains '<', we assume the server gave us some sort # of HTML response back, like maybe a login page. # # Assume HTTP if SSH is not enabled or ssh_info doesn't look right. self._review_url = http_url else: host, port = info.split() self._review_url = self._SshReviewUrl(userEmail, host, port) except urllib.error.HTTPError as e: raise UploadError('%s: %s' % (self.review, str(e))) except urllib.error.URLError as e: raise UploadError('%s: %s' % (self.review, str(e))) except HTTPException as e: raise UploadError('%s: %s' % (self.review, e.__class__.__name__)) REVIEW_CACHE[u] = self._review_url return self._review_url + self.projectname def _SshReviewUrl(self, userEmail, host, port): username = self._config.GetString('review.%s.username' % self.review) if username is None: username = userEmail.split('@')[0] return 'ssh://%s@%s:%s/' % (username, host, port) def ToLocal(self, rev): """Convert a remote revision string to something we have locally. """ if self.name == '.' or IsId(rev): return rev if not rev.startswith('refs/'): rev = R_HEADS + rev for spec in self.fetch: if spec.SourceMatches(rev): return spec.MapSource(rev) if not rev.startswith(R_HEADS): return rev raise GitError('remote %s does not have %s' % (self.name, rev)) def WritesTo(self, ref): """True if the remote stores to the tracking ref. """ for spec in self.fetch: if spec.DestMatches(ref): return True return False def ResetFetch(self, mirror=False): """Set the fetch refspec to its default value. """ if mirror: dst = 'refs/heads/*' else: dst = 'refs/remotes/%s/*' % self.name self.fetch = [RefSpec(True, 'refs/heads/*', dst)] def Save(self): """Save this remote to the configuration. 
""" self._Set('url', self.url) if self.pushUrl is not None: self._Set('pushurl', self.pushUrl + '/' + self.projectname) else: self._Set('pushurl', self.pushUrl) self._Set('review', self.review) self._Set('projectname', self.projectname) self._Set('fetch', list(map(str, self.fetch))) def _Set(self, key, value): key = 'remote.%s.%s' % (self.name, key) return self._config.SetString(key, value) def _Get(self, key, all_keys=False): key = 'remote.%s.%s' % (self.name, key) return self._config.GetString(key, all_keys = all_keys) class Branch(object): """Configuration options related to a single branch. """ def __init__(self, config, name): self._config = config self.name = name self.merge = self._Get('merge') r = self._Get('remote') if r: self.remote = self._config.GetRemote(r) else: self.remote = None @property def LocalMerge(self): """Convert the merge spec to a local name. """ if self.remote and self.merge: return self.remote.ToLocal(self.merge) return None def Save(self): """Save this branch back into the configuration. """ if self._config.HasSection('branch', self.name): if self.remote: self._Set('remote', self.remote.name) else: self._Set('remote', None) self._Set('merge', self.merge) else: fd = open(self._config.file, 'a') try: fd.write('[branch "%s"]\n' % self.name) if self.remote: fd.write('\tremote = %s\n' % self.remote.name) if self.merge: fd.write('\tmerge = %s\n' % self.merge) finally: fd.close() def _Set(self, key, value): key = 'branch.%s.%s' % (self.name, key) return self._config.SetString(key, value) def _Get(self, key, all_keys=False): key = 'branch.%s.%s' % (self.name, key) return self._config.GetString(key, all_keys = all_keys) git_refs.py0100644 0000000 0000000 00000007503 13025567015 011772 0ustar000000000 0000000 # # Copyright (C) 2009 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
import os from trace import Trace HEAD = 'HEAD' R_HEADS = 'refs/heads/' R_TAGS = 'refs/tags/' R_PUB = 'refs/published/' R_M = 'refs/remotes/m/' class GitRefs(object): def __init__(self, gitdir): self._gitdir = gitdir self._phyref = None self._symref = None self._mtime = {} @property def all(self): self._EnsureLoaded() return self._phyref def get(self, name): try: return self.all[name] except KeyError: return '' def deleted(self, name): if self._phyref is not None: if name in self._phyref: del self._phyref[name] if name in self._symref: del self._symref[name] if name in self._mtime: del self._mtime[name] def symref(self, name): try: self._EnsureLoaded() return self._symref[name] except KeyError: return '' def _EnsureLoaded(self): if self._phyref is None or self._NeedUpdate(): self._LoadAll() def _NeedUpdate(self): Trace(': scan refs %s', self._gitdir) for name, mtime in self._mtime.items(): try: if mtime != os.path.getmtime(os.path.join(self._gitdir, name)): return True except OSError: return True return False def _LoadAll(self): Trace(': load refs %s', self._gitdir) self._phyref = {} self._symref = {} self._mtime = {} self._ReadPackedRefs() self._ReadLoose('refs/') self._ReadLoose1(os.path.join(self._gitdir, HEAD), HEAD) scan = self._symref attempts = 0 while scan and attempts < 5: scan_next = {} for name, dest in scan.items(): if dest in self._phyref: self._phyref[name] = self._phyref[dest] else: scan_next[name] = dest scan = scan_next attempts += 1 def _ReadPackedRefs(self): path = os.path.join(self._gitdir, 'packed-refs') try: fd = open(path, 'r') mtime = os.path.getmtime(path) except IOError: return except OSError: return try: for line in fd: line = str(line) if line[0] == '#': continue if line[0] == '^': continue line = line[:-1] p = line.split(' ') ref_id = p[0] name = p[1] self._phyref[name] = ref_id finally: fd.close() self._mtime['packed-refs'] = mtime def _ReadLoose(self, prefix): base = os.path.join(self._gitdir, prefix) for name in os.listdir(base): p = os.path.join(base, name) if os.path.isdir(p): self._mtime[prefix] = os.path.getmtime(base) self._ReadLoose(prefix + name + '/') elif name.endswith('.lock'): pass else: self._ReadLoose1(p, prefix + name) def _ReadLoose1(self, path, name): try: fd = open(path, 'rb') except IOError: return try: try: mtime = os.path.getmtime(path) ref_id = fd.readline() except (IOError, OSError): return finally: fd.close() try: ref_id = ref_id.decode() except AttributeError: pass if not ref_id: return ref_id = ref_id[:-1] if ref_id.startswith('ref: '): self._symref[name] = ref_id[5:] else: self._phyref[name] = ref_id self._mtime[name] = mtime git_ssh0100755 0000000 0000000 00000000116 13025567015 011175 0ustar000000000 0000000 #!/bin/sh exec ssh -o "ControlMaster no" -o "ControlPath $REPO_SSH_SOCK" "$@" gitc_utils.py0100644 0000000 0000000 00000012665 13025567015 012343 0ustar000000000 0000000 # # Copyright (C) 2015 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
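# gitc_utils pins each project's revisionExpr to a concrete commit (via batched `git ls-remote` calls) and writes the resulting manifest for the GITC client.
# Illustrative sketch only (not part of the original module, helper name is hypothetical): the batching
# loop in generate_gitc_manifest() below is equivalent to walking the project list in fixed-size windows:
#
#   def _batches(items, size):
#       for start in range(0, len(items), size):
#           yield items[start:start + size]
#
#   for batch in _batches(projects, NUM_BATCH_RETRIEVE_REVISIONID):
#       _set_project_revisions(batch)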
from __future__ import print_function import os import platform import re import sys import time import git_command import git_config import wrapper from error import ManifestParseError NUM_BATCH_RETRIEVE_REVISIONID = 32 def get_gitc_manifest_dir(): return wrapper.Wrapper().get_gitc_manifest_dir() def parse_clientdir(gitc_fs_path): return wrapper.Wrapper().gitc_parse_clientdir(gitc_fs_path) def _set_project_revisions(projects): """Sets the revisionExpr for a list of projects. Because of the limit of open file descriptors allowed, length of projects should not be overly large. Recommend calling this function multiple times with each call not exceeding NUM_BATCH_RETRIEVE_REVISIONID projects. @param projects: List of project objects to set the revisionExpr for. """ # Retrieve the commit id for each project based off of its current # revisionExpr, if it is not already a commit id. project_gitcmds = [( project, git_command.GitCommand(None, ['ls-remote', project.remote.url, project.revisionExpr], capture_stdout=True, cwd='/tmp')) for project in projects if not git_config.IsId(project.revisionExpr)] for proj, gitcmd in project_gitcmds: if gitcmd.Wait(): print('FATAL: Failed to retrieve revisionExpr for %s' % proj) sys.exit(1) revisionExpr = gitcmd.stdout.split('\t')[0] if not revisionExpr: raise(ManifestParseError('Invalid SHA-1 revision project %s (%s)' % (proj.remote.url, proj.revisionExpr))) proj.revisionExpr = revisionExpr def _manifest_groups(manifest): """Returns the manifest group string that should be synced. This is the same logic used by Command.GetProjects(), which is used during repo sync. @param manifest: The XmlManifest object """ mp = manifest.manifestProject groups = mp.config.GetString('manifest.groups') if not groups: groups = 'default,platform-' + platform.system().lower() return groups def generate_gitc_manifest(gitc_manifest, manifest, paths=None): """Generate a manifest for shafsd to use for this GITC client. @param gitc_manifest: Current gitc manifest, or None if there isn't one yet. @param manifest: A GitcManifest object loaded with the current repo manifest. @param paths: List of project paths we want to update. """ print('Generating GITC Manifest by fetching revision SHAs for each ' 'project.') if paths is None: paths = manifest.paths.keys() groups = [x for x in re.split(r'[,\s]+', _manifest_groups(manifest)) if x] # Convert the paths to projects, and filter them to the matched groups. projects = [manifest.paths[p] for p in paths] projects = [p for p in projects if p.MatchesGroups(groups)] if gitc_manifest is not None: for path, proj in manifest.paths.iteritems(): if not proj.MatchesGroups(groups): continue if not proj.upstream and not git_config.IsId(proj.revisionExpr): proj.upstream = proj.revisionExpr if not path in gitc_manifest.paths: # Any new projects need their first revision, even if we weren't asked # for them. projects.append(proj) elif not path in paths: # And copy revisions from the previous manifest if we're not updating # them now.
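# (old_revision is only present once a project has been started; for those we carry it forward instead of the pinned revisionExpr.)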
gitc_proj = gitc_manifest.paths[path] if gitc_proj.old_revision: proj.revisionExpr = None proj.old_revision = gitc_proj.old_revision else: proj.revisionExpr = gitc_proj.revisionExpr index = 0 while index < len(projects): _set_project_revisions( projects[index:(index+NUM_BATCH_RETRIEVE_REVISIONID)]) index += NUM_BATCH_RETRIEVE_REVISIONID if gitc_manifest is not None: for path, proj in gitc_manifest.paths.iteritems(): if proj.old_revision and path in paths: # If we updated a project that has been started, keep the old-revision # updated. repo_proj = manifest.paths[path] repo_proj.old_revision = repo_proj.revisionExpr repo_proj.revisionExpr = None # Convert URLs from relative to absolute. for _name, remote in manifest.remotes.iteritems(): remote.fetchUrl = remote.resolvedFetchUrl # Save the manifest. save_manifest(manifest) def save_manifest(manifest, client_dir=None): """Save the manifest file in the client_dir. @param client_dir: Client directory to save the manifest in. @param manifest: Manifest object to save. """ if not client_dir: client_dir = manifest.gitc_client_dir with open(os.path.join(client_dir, '.manifest'), 'w') as f: manifest.Save(f, groups=_manifest_groups(manifest)) # TODO(sbasi/jorg): Come up with a solution to remove the sleep below. # Give the GITC filesystem time to register the manifest changes. time.sleep(3) hooks/0040755 0000000 0000000 00000000000 13025567015 010737 5ustar000000000 0000000 hooks/commit-msg0100755 0000000 0000000 00000011066 13025567015 012742 0ustar000000000 0000000 #!/bin/sh # From Gerrit Code Review 2.12.1 # # Part of Gerrit Code Review (https://www.gerritcodereview.com/) # # Copyright (C) 2009 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # unset GREP_OPTIONS CHANGE_ID_AFTER="Bug|Issue|Test" MSG="$1" # Check for, and add if missing, a unique Change-Id # add_ChangeId() { clean_message=`sed -e ' /^diff --git .*/{ s/// q } /^Signed-off-by:/d /^#/d ' "$MSG" | git stripspace` if test -z "$clean_message" then return fi # Do not add Change-Id to temp commits if echo "$clean_message" | head -1 | grep -q '^\(fixup\|squash\)!' then return fi if test "false" = "`git config --bool --get gerrit.createChangeId`" then return fi # Does Change-Id: already exist? if so, exit (no change). 
if grep -i '^Change-Id:' "$MSG" >/dev/null then return fi id=`_gen_ChangeId` T="$MSG.tmp.$$" AWK=awk if [ -x /usr/xpg4/bin/awk ]; then # Solaris AWK is just too broken AWK=/usr/xpg4/bin/awk fi # Get core.commentChar from git config or use default symbol commentChar=`git config --get core.commentChar` commentChar=${commentChar:-#} # How this works: # - parse the commit message as (textLine+ blankLine*)* # - assume textLine+ to be a footer until proven otherwise # - exception: the first block is not footer (as it is the title) # - read textLine+ into a variable # - then count blankLines # - once the next textLine appears, print textLine+ blankLine* as these # aren't footer # - in END, the last textLine+ block is available for footer parsing $AWK ' BEGIN { # while we start with the assumption that textLine+ # is a footer, the first block is not. isFooter = 0 footerComment = 0 blankLines = 0 } # Skip lines starting with commentChar without any spaces before it. /^'"$commentChar"'/ { next } # Skip the line starting with the diff command and everything after it, # up to the end of the file, assuming it is only patch data. # If more than one line before the diff was empty, strip all but one. /^diff --git / { blankLines = 0 while (getline) { } next } # Count blank lines outside footer comments /^$/ && (footerComment == 0) { blankLines++ next } # Catch footer comment /^\[[a-zA-Z0-9-]+:/ && (isFooter == 1) { footerComment = 1 } /]$/ && (footerComment == 1) { footerComment = 2 } # We have a non-blank line after blank lines. Handle this. (blankLines > 0) { print lines for (i = 0; i < blankLines; i++) { print "" } lines = "" blankLines = 0 isFooter = 1 footerComment = 0 } # Detect that the current block is not the footer (footerComment == 0) && (!/^\[?[a-zA-Z0-9-]+:/ || /^[a-zA-Z0-9-]+:\/\//) { isFooter = 0 } { # We need this information about the current last comment line if (footerComment == 2) { footerComment = 0 } if (lines != "") { lines = lines "\n"; } lines = lines $0 } # Footer handling: # If the last block is considered a footer, splice in the Change-Id at the # right place. # Look for the right place to inject Change-Id by considering # CHANGE_ID_AFTER. Keys listed in it (case insensitive) come first, # then Change-Id, then everything else (eg. Signed-off-by:). # # Otherwise just print the last block, a new line and the Change-Id as a # block of its own. END { unprinted = 1 if (isFooter == 0) { print lines "\n" lines = "" } changeIdAfter = "^(" tolower("'"$CHANGE_ID_AFTER"'") "):" numlines = split(lines, footer, "\n") for (line = 1; line <= numlines; line++) { if (unprinted && match(tolower(footer[line]), changeIdAfter) != 1) { unprinted = 0 print "Change-Id: I'"$id"'" } print footer[line] } if (unprinted) { print "Change-Id: I'"$id"'" } }' "$MSG" > "$T" && mv "$T" "$MSG" || rm -f "$T" } _gen_ChangeIdInput() { echo "tree `git write-tree`" if parent=`git rev-parse "HEAD^0" 2>/dev/null` then echo "parent $parent" fi echo "author `git var GIT_AUTHOR_IDENT`" echo "committer `git var GIT_COMMITTER_IDENT`" echo printf '%s' "$clean_message" } _gen_ChangeId() { _gen_ChangeIdInput | git hash-object -t commit --stdin } add_ChangeId hooks/pre-auto-gc0100755 0000000 0000000 00000003144 13025567015 013007 0ustar000000000 0000000 #!/bin/sh # # An example hook script to verify if you are on battery, in case you # are running Linux or OS X. Called by git-gc --auto with no arguments. # The hook should exit with non-zero status after issuing an appropriate # message if it wants to stop the auto repacking. 
# This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA if test -x /sbin/on_ac_power && /sbin/on_ac_power then exit 0 elif test "$(cat /sys/class/power_supply/AC/online 2>/dev/null)" = 1 then exit 0 elif grep -q 'on-line' /proc/acpi/ac_adapter/AC/state 2>/dev/null then exit 0 elif grep -q '0x01$' /proc/apm 2>/dev/null then exit 0 elif grep -q "AC Power \+: 1" /proc/pmu/info 2>/dev/null then exit 0 elif test -x /usr/bin/pmset && /usr/bin/pmset -g batt | grep -q "drawing from 'AC Power'" then exit 0 elif test -d /sys/bus/acpi/drivers/battery && test 0 = \ "$(find /sys/bus/acpi/drivers/battery/ -type l | wc -l)"; then # No battery exists. exit 0 fi echo "Auto packing deferred; not on AC" exit 1 main.py0100755 0000000 0000000 00000037574 13025567015 011132 0ustar000000000 0000000 #!/usr/bin/env python # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
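# main.py is the entry point invoked by the `repo` wrapper: _Repo._Run() splits argv into global options, the subcommand name and its arguments, then dispatches. For example (illustrative values), `repo --trace sync -j4` yields glob=['--trace'], name='sync', argv=['-j4'].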
from __future__ import print_function import getpass import imp import netrc import optparse import os import sys import time from pyversion import is_python3 if is_python3(): import urllib.request else: import urllib2 urllib = imp.new_module('urllib') urllib.request = urllib2 try: import kerberos except ImportError: kerberos = None from color import SetDefaultColoring from trace import SetTrace from git_command import git, GitCommand from git_config import init_ssh, close_ssh from command import InteractiveCommand from command import MirrorSafeCommand from command import GitcAvailableCommand, GitcClientCommand from subcmds.version import Version from editor import Editor from error import DownloadError from error import InvalidProjectGroupsError from error import ManifestInvalidRevisionError from error import ManifestParseError from error import NoManifestException from error import NoSuchProjectError from error import RepoChangedException import gitc_utils from manifest_xml import GitcManifest, XmlManifest from pager import RunPager from wrapper import WrapperPath, Wrapper from subcmds import all_commands if not is_python3(): # pylint:disable=W0622 input = raw_input # pylint:enable=W0622 global_options = optparse.OptionParser( usage="repo [-p|--paginate|--no-pager] COMMAND [ARGS]" ) global_options.add_option('-p', '--paginate', dest='pager', action='store_true', help='display command output in the pager') global_options.add_option('--no-pager', dest='no_pager', action='store_true', help='disable the pager') global_options.add_option('--color', choices=('auto', 'always', 'never'), default=None, help='control color usage: auto, always, never') global_options.add_option('--trace', dest='trace', action='store_true', help='trace git command execution') global_options.add_option('--time', dest='time', action='store_true', help='time repo command execution') global_options.add_option('--version', dest='show_version', action='store_true', help='display this version of repo') class _Repo(object): def __init__(self, repodir): self.repodir = repodir self.commands = all_commands # add 'branch' as an alias for 'branches' all_commands['branch'] = all_commands['branches'] def _Run(self, argv): result = 0 name = None glob = [] for i in range(len(argv)): if not argv[i].startswith('-'): name = argv[i] if i > 0: glob = argv[:i] argv = argv[i + 1:] break if not name: glob = argv name = 'help' argv = [] gopts, _gargs = global_options.parse_args(glob) if gopts.trace: SetTrace() if gopts.show_version: if name == 'help': name = 'version' else: print('fatal: invalid usage of --version', file=sys.stderr) return 1 SetDefaultColoring(gopts.color) try: cmd = self.commands[name] except KeyError: print("repo: '%s' is not a repo command. See 'repo help'." 
% name, file=sys.stderr) return 1 cmd.repodir = self.repodir cmd.manifest = XmlManifest(cmd.repodir) cmd.gitc_manifest = None gitc_client_name = gitc_utils.parse_clientdir(os.getcwd()) if gitc_client_name: cmd.gitc_manifest = GitcManifest(cmd.repodir, gitc_client_name) cmd.manifest.isGitcClient = True Editor.globalConfig = cmd.manifest.globalConfig if not isinstance(cmd, MirrorSafeCommand) and cmd.manifest.IsMirror: print("fatal: '%s' requires a working directory" % name, file=sys.stderr) return 1 if isinstance(cmd, GitcAvailableCommand) and not gitc_utils.get_gitc_manifest_dir(): print("fatal: '%s' requires GITC to be available" % name, file=sys.stderr) return 1 if isinstance(cmd, GitcClientCommand) and not gitc_client_name: print("fatal: '%s' requires a GITC client" % name, file=sys.stderr) return 1 try: copts, cargs = cmd.OptionParser.parse_args(argv) copts = cmd.ReadEnvironmentOptions(copts) except NoManifestException as e: print('error: in `%s`: %s' % (' '.join([name] + argv), str(e)), file=sys.stderr) print('error: manifest missing or unreadable -- please run init', file=sys.stderr) return 1 if not gopts.no_pager and not isinstance(cmd, InteractiveCommand): config = cmd.manifest.globalConfig if gopts.pager: use_pager = True else: use_pager = config.GetBoolean('pager.%s' % name) if use_pager is None: use_pager = cmd.WantPager(copts) if use_pager: RunPager(config) start = time.time() try: result = cmd.Execute(copts, cargs) except (DownloadError, ManifestInvalidRevisionError, NoManifestException) as e: print('error: in `%s`: %s' % (' '.join([name] + argv), str(e)), file=sys.stderr) if isinstance(e, NoManifestException): print('error: manifest missing or unreadable -- please run init', file=sys.stderr) result = 1 except NoSuchProjectError as e: if e.name: print('error: project %s not found' % e.name, file=sys.stderr) else: print('error: no project in current directory', file=sys.stderr) result = 1 except InvalidProjectGroupsError as e: if e.name: print('error: project group must be enabled for project %s' % e.name, file=sys.stderr) else: print('error: project group must be enabled for the project in the current directory', file=sys.stderr) result = 1 finally: elapsed = time.time() - start hours, remainder = divmod(elapsed, 3600) minutes, seconds = divmod(remainder, 60) if gopts.time: if hours == 0: print('real\t%dm%.3fs' % (minutes, seconds), file=sys.stderr) else: print('real\t%dh%dm%.3fs' % (hours, minutes, seconds), file=sys.stderr) return result def _MyRepoPath(): return os.path.dirname(__file__) def _CheckWrapperVersion(ver, repo_path): if not repo_path: repo_path = '~/bin/repo' if not ver: print('no --wrapper-version argument', file=sys.stderr) sys.exit(1) exp = Wrapper().VERSION ver = tuple(map(int, ver.split('.'))) if len(ver) == 1: ver = (0, ver[0]) exp_str = '.'.join(map(str, exp)) if exp[0] > ver[0] or ver < (0, 4): print(""" !!! A new repo command (%5s) is available. !!! !!! You must upgrade before you can continue: !!! cp %s %s """ % (exp_str, WrapperPath(), repo_path), file=sys.stderr) sys.exit(1) if exp > ver: print(""" ... A new repo command (%5s) is available. ... 
You should upgrade soon: cp %s %s """ % (exp_str, WrapperPath(), repo_path), file=sys.stderr) def _CheckRepoDir(repo_dir): if not repo_dir: print('no --repo-dir argument', file=sys.stderr) sys.exit(1) def _PruneOptions(argv, opt): i = 0 while i < len(argv): a = argv[i] if a == '--': break if a.startswith('--'): eq = a.find('=') if eq > 0: a = a[0:eq] if not opt.has_option(a): del argv[i] continue i += 1 _user_agent = None def _UserAgent(): global _user_agent if _user_agent is None: py_version = sys.version_info os_name = sys.platform if os_name == 'linux2': os_name = 'Linux' elif os_name == 'win32': os_name = 'Win32' elif os_name == 'cygwin': os_name = 'Cygwin' elif os_name == 'darwin': os_name = 'Darwin' p = GitCommand( None, ['describe', 'HEAD'], cwd = _MyRepoPath(), capture_stdout = True) if p.Wait() == 0: repo_version = p.stdout if len(repo_version) > 0 and repo_version[-1] == '\n': repo_version = repo_version[0:-1] if len(repo_version) > 0 and repo_version[0] == 'v': repo_version = repo_version[1:] else: repo_version = 'unknown' _user_agent = 'git-repo/%s (%s) git/%s Python/%d.%d.%d' % ( repo_version, os_name, '.'.join(map(str, git.version_tuple())), py_version[0], py_version[1], py_version[2]) return _user_agent class _UserAgentHandler(urllib.request.BaseHandler): def http_request(self, req): req.add_header('User-Agent', _UserAgent()) return req def https_request(self, req): req.add_header('User-Agent', _UserAgent()) return req def _AddPasswordFromUserInput(handler, msg, req): # If repo could not find auth info from netrc, try to get it from user input url = req.get_full_url() user, password = handler.passwd.find_user_password(None, url) if user is None: print(msg) try: user = input('User: ') password = getpass.getpass() except KeyboardInterrupt: return handler.passwd.add_password(None, url, user, password) class _BasicAuthHandler(urllib.request.HTTPBasicAuthHandler): def http_error_401(self, req, fp, code, msg, headers): _AddPasswordFromUserInput(self, msg, req) return urllib.request.HTTPBasicAuthHandler.http_error_401( self, req, fp, code, msg, headers) def http_error_auth_reqed(self, authreq, host, req, headers): try: old_add_header = req.add_header def _add_header(name, val): val = val.replace('\n', '') old_add_header(name, val) req.add_header = _add_header return urllib.request.AbstractBasicAuthHandler.http_error_auth_reqed( self, authreq, host, req, headers) except: reset = getattr(self, 'reset_retry_count', None) if reset is not None: reset() elif getattr(self, 'retried', None): self.retried = 0 raise class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler): def http_error_401(self, req, fp, code, msg, headers): _AddPasswordFromUserInput(self, msg, req) return urllib.request.HTTPDigestAuthHandler.http_error_401( self, req, fp, code, msg, headers) def http_error_auth_reqed(self, auth_header, host, req, headers): try: old_add_header = req.add_header def _add_header(name, val): val = val.replace('\n', '') old_add_header(name, val) req.add_header = _add_header return urllib.request.AbstractDigestAuthHandler.http_error_auth_reqed( self, auth_header, host, req, headers) except: reset = getattr(self, 'reset_retry_count', None) if reset is not None: reset() elif getattr(self, 'retried', None): self.retried = 0 raise class _KerberosAuthHandler(urllib.request.BaseHandler): def __init__(self): self.retried = 0 self.context = None self.handler_order = urllib.request.BaseHandler.handler_order - 50 def http_error_401(self, req, fp, code, msg, headers): # pylint:disable=unused-argument 
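# Hand the 401 off to the Negotiate (SPNEGO) exchange below; it returns the retried response, or None if Kerberos auth cannot be completed.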
host = req.get_host() retry = self.http_error_auth_reqed('www-authenticate', host, req, headers) return retry def http_error_auth_reqed(self, auth_header, host, req, headers): try: spn = "HTTP@%s" % host authdata = self._negotiate_get_authdata(auth_header, headers) if self.retried > 3: raise urllib.request.HTTPError(req.get_full_url(), 401, "Negotiate auth failed", headers, None) else: self.retried += 1 neghdr = self._negotiate_get_svctk(spn, authdata) if neghdr is None: return None req.add_unredirected_header('Authorization', neghdr) response = self.parent.open(req) srvauth = self._negotiate_get_authdata(auth_header, response.info()) if self._validate_response(srvauth): return response except kerberos.GSSError: return None except: self.reset_retry_count() raise finally: self._clean_context() def reset_retry_count(self): self.retried = 0 def _negotiate_get_authdata(self, auth_header, headers): authhdr = headers.get(auth_header, None) if authhdr is not None: for mech_tuple in authhdr.split(","): mech, __, authdata = mech_tuple.strip().partition(" ") if mech.lower() == "negotiate": return authdata.strip() return None def _negotiate_get_svctk(self, spn, authdata): if authdata is None: return None result, self.context = kerberos.authGSSClientInit(spn) if result < kerberos.AUTH_GSS_COMPLETE: return None result = kerberos.authGSSClientStep(self.context, authdata) if result < kerberos.AUTH_GSS_CONTINUE: return None response = kerberos.authGSSClientResponse(self.context) return "Negotiate %s" % response def _validate_response(self, authdata): if authdata is None: return None result = kerberos.authGSSClientStep(self.context, authdata) if result == kerberos.AUTH_GSS_COMPLETE: return True return None def _clean_context(self): if self.context is not None: kerberos.authGSSClientClean(self.context) self.context = None def init_http(): handlers = [_UserAgentHandler()] mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm() try: n = netrc.netrc() for host in n.hosts: p = n.hosts[host] mgr.add_password(p[1], 'http://%s/' % host, p[0], p[2]) mgr.add_password(p[1], 'https://%s/' % host, p[0], p[2]) except netrc.NetrcParseError: pass except IOError: pass handlers.append(_BasicAuthHandler(mgr)) handlers.append(_DigestAuthHandler(mgr)) if kerberos: handlers.append(_KerberosAuthHandler()) if 'http_proxy' in os.environ: url = os.environ['http_proxy'] handlers.append(urllib.request.ProxyHandler({'http': url, 'https': url})) if 'REPO_CURL_VERBOSE' in os.environ: handlers.append(urllib.request.HTTPHandler(debuglevel=1)) handlers.append(urllib.request.HTTPSHandler(debuglevel=1)) urllib.request.install_opener(urllib.request.build_opener(*handlers)) def _Main(argv): result = 0 opt = optparse.OptionParser(usage="repo wrapperinfo -- ...") opt.add_option("--repo-dir", dest="repodir", help="path to .repo/") opt.add_option("--wrapper-version", dest="wrapper_version", help="version of the wrapper script") opt.add_option("--wrapper-path", dest="wrapper_path", help="location of the wrapper script") _PruneOptions(argv, opt) opt, argv = opt.parse_args(argv) _CheckWrapperVersion(opt.wrapper_version, opt.wrapper_path) _CheckRepoDir(opt.repodir) Version.wrapper_version = opt.wrapper_version Version.wrapper_path = opt.wrapper_path repo = _Repo(opt.repodir) try: try: init_ssh() init_http() result = repo._Run(argv) or 0 finally: close_ssh() except KeyboardInterrupt: print('aborted by user', file=sys.stderr) result = 1 except ManifestParseError as mpe: print('fatal: %s' % mpe, file=sys.stderr) result = 1 except RepoChangedException as 
rce: # If repo changed, re-exec ourselves. # argv = list(sys.argv) argv.extend(rce.extra_args) try: os.execv(__file__, argv) except OSError as e: print('fatal: cannot restart repo after upgrade', file=sys.stderr) print('fatal: %s' % e, file=sys.stderr) result = 128 sys.exit(result) if __name__ == '__main__': _Main(sys.argv[1:]) manifest_xml.py0100644 0000000 0000000 00000077450 13025567015 012666 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import itertools import os import re import sys import xml.dom.minidom from pyversion import is_python3 if is_python3(): import urllib.parse else: import imp import urlparse urllib = imp.new_module('urllib') urllib.parse = urlparse import gitc_utils from git_config import GitConfig from git_refs import R_HEADS, HEAD from project import RemoteSpec, Project, MetaProject from error import ManifestParseError, ManifestInvalidRevisionError MANIFEST_FILE_NAME = 'manifest.xml' LOCAL_MANIFEST_NAME = 'local_manifest.xml' LOCAL_MANIFESTS_DIR_NAME = 'local_manifests' # urljoin gets confused if the scheme is not known. urllib.parse.uses_relative.extend(['ssh', 'git', 'persistent-https', 'rpc']) urllib.parse.uses_netloc.extend(['ssh', 'git', 'persistent-https', 'rpc']) class _Default(object): """Project defaults within the manifest.""" revisionExpr = None destBranchExpr = None remote = None sync_j = 1 sync_c = False sync_s = False def __eq__(self, other): return self.__dict__ == other.__dict__ def __ne__(self, other): return self.__dict__ != other.__dict__ class _XmlRemote(object): def __init__(self, name, alias=None, fetch=None, pushUrl=None, manifestUrl=None, review=None, revision=None): self.name = name self.fetchUrl = fetch self.pushUrl = pushUrl self.manifestUrl = manifestUrl self.remoteAlias = alias self.reviewUrl = review self.revision = revision self.resolvedFetchUrl = self._resolveFetchUrl() def __eq__(self, other): return self.__dict__ == other.__dict__ def __ne__(self, other): return self.__dict__ != other.__dict__ def _resolveFetchUrl(self): url = self.fetchUrl.rstrip('/') manifestUrl = self.manifestUrl.rstrip('/') # urljoin will gets confused over quite a few things. The ones we care # about here are: # * no scheme in the base url, like # We handle no scheme by replacing it with an obscure protocol, gopher # and then replacing it with the original when we are done. 
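# Illustrative example (values are not from the original source): with manifestUrl 'https://host/platform/manifest' and fetch '..', urljoin resolves to 'https://host/'; the temporary gopher:// prefix below exists only so scheme-less manifest URLs survive the same urljoin.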
if manifestUrl.find(':') != manifestUrl.find('/') - 1: url = urllib.parse.urljoin('gopher://' + manifestUrl, url) url = re.sub(r'^gopher://', '', url) else: url = urllib.parse.urljoin(manifestUrl, url) return url def ToRemoteSpec(self, projectName): url = self.resolvedFetchUrl.rstrip('/') + '/' + projectName remoteName = self.name if self.remoteAlias: remoteName = self.remoteAlias return RemoteSpec(remoteName, url=url, pushUrl=self.pushUrl, review=self.reviewUrl, orig_name=self.name) class XmlManifest(object): """manages the repo configuration file""" def __init__(self, repodir): self.repodir = os.path.abspath(repodir) self.topdir = os.path.dirname(self.repodir) self.manifestFile = os.path.join(self.repodir, MANIFEST_FILE_NAME) self.globalConfig = GitConfig.ForUser() self.localManifestWarning = False self.isGitcClient = False self.repoProject = MetaProject(self, 'repo', gitdir = os.path.join(repodir, 'repo/.git'), worktree = os.path.join(repodir, 'repo')) self.manifestProject = MetaProject(self, 'manifests', gitdir = os.path.join(repodir, 'manifests.git'), worktree = os.path.join(repodir, 'manifests')) self._Unload() def Override(self, name): """Use a different manifest, just for the current instantiation. """ path = os.path.join(self.manifestProject.worktree, name) if not os.path.isfile(path): raise ManifestParseError('manifest %s not found' % name) old = self.manifestFile try: self.manifestFile = path self._Unload() self._Load() finally: self.manifestFile = old def Link(self, name): """Update the repo metadata to use a different manifest. """ self.Override(name) try: if os.path.lexists(self.manifestFile): os.remove(self.manifestFile) os.symlink('manifests/%s' % name, self.manifestFile) except OSError as e: raise ManifestParseError('cannot link manifest %s: %s' % (name, str(e))) def _RemoteToXml(self, r, doc, root): e = doc.createElement('remote') root.appendChild(e) e.setAttribute('name', r.name) e.setAttribute('fetch', r.fetchUrl) if r.pushUrl is not None: e.setAttribute('pushurl', r.pushUrl) if r.remoteAlias is not None: e.setAttribute('alias', r.remoteAlias) if r.reviewUrl is not None: e.setAttribute('review', r.reviewUrl) if r.revision is not None: e.setAttribute('revision', r.revision) def _ParseGroups(self, groups): return [x for x in re.split(r'[,\s]+', groups) if x] def Save(self, fd, peg_rev=False, peg_rev_upstream=True, groups=None): """Write the current manifest out to the given file descriptor. """ mp = self.manifestProject if groups is None: groups = mp.config.GetString('manifest.groups') if groups: groups = self._ParseGroups(groups) doc = xml.dom.minidom.Document() root = doc.createElement('manifest') doc.appendChild(root) # Save out the notice. There's a little bit of work here to give it the # right whitespace, which assumes that the notice is automatically indented # by 4 by minidom. 
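# (The [4:] slice below drops the leading indent on the first line, since minidom will indent the start of the text node itself.)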
if self.notice: notice_element = root.appendChild(doc.createElement('notice')) notice_lines = self.notice.splitlines() indented_notice = ('\n'.join(" "*4 + line for line in notice_lines))[4:] notice_element.appendChild(doc.createTextNode(indented_notice)) d = self.default for r in sorted(self.remotes): self._RemoteToXml(self.remotes[r], doc, root) if self.remotes: root.appendChild(doc.createTextNode('')) have_default = False e = doc.createElement('default') if d.remote: have_default = True e.setAttribute('remote', d.remote.name) if d.revisionExpr: have_default = True e.setAttribute('revision', d.revisionExpr) if d.destBranchExpr: have_default = True e.setAttribute('dest-branch', d.destBranchExpr) if d.sync_j > 1: have_default = True e.setAttribute('sync-j', '%d' % d.sync_j) if d.sync_c: have_default = True e.setAttribute('sync-c', 'true') if d.sync_s: have_default = True e.setAttribute('sync-s', 'true') if have_default: root.appendChild(e) root.appendChild(doc.createTextNode('')) if self._manifest_server: e = doc.createElement('manifest-server') e.setAttribute('url', self._manifest_server) root.appendChild(e) root.appendChild(doc.createTextNode('')) def output_projects(parent, parent_node, projects): for project_name in projects: for project in self._projects[project_name]: output_project(parent, parent_node, project) def output_project(parent, parent_node, p): if not p.MatchesGroups(groups): return name = p.name relpath = p.relpath if parent: name = self._UnjoinName(parent.name, name) relpath = self._UnjoinRelpath(parent.relpath, relpath) e = doc.createElement('project') parent_node.appendChild(e) e.setAttribute('name', name) if relpath != name: e.setAttribute('path', relpath) remoteName = None if d.remote: remoteName = d.remote.name if not d.remote or p.remote.orig_name != remoteName: remoteName = p.remote.orig_name e.setAttribute('remote', remoteName) if peg_rev: if self.IsMirror: value = p.bare_git.rev_parse(p.revisionExpr + '^0') else: value = p.work_git.rev_parse(HEAD + '^0') e.setAttribute('revision', value) if peg_rev_upstream: if p.upstream: e.setAttribute('upstream', p.upstream) elif value != p.revisionExpr: # Only save the origin if the origin is not a sha1, and the default # isn't our value e.setAttribute('upstream', p.revisionExpr) else: revision = self.remotes[p.remote.orig_name].revision or d.revisionExpr if not revision or revision != p.revisionExpr: e.setAttribute('revision', p.revisionExpr) if p.upstream and p.upstream != p.revisionExpr: e.setAttribute('upstream', p.upstream) if p.dest_branch and p.dest_branch != d.destBranchExpr: e.setAttribute('dest-branch', p.dest_branch) for c in p.copyfiles: ce = doc.createElement('copyfile') ce.setAttribute('src', c.src) ce.setAttribute('dest', c.dest) e.appendChild(ce) for l in p.linkfiles: le = doc.createElement('linkfile') le.setAttribute('src', l.src) le.setAttribute('dest', l.dest) e.appendChild(le) default_groups = ['all', 'name:%s' % p.name, 'path:%s' % p.relpath] egroups = [g for g in p.groups if g not in default_groups] if egroups: e.setAttribute('groups', ','.join(egroups)) for a in p.annotations: if a.keep == "true": ae = doc.createElement('annotation') ae.setAttribute('name', a.name) ae.setAttribute('value', a.value) e.appendChild(ae) if p.sync_c: e.setAttribute('sync-c', 'true') if p.sync_s: e.setAttribute('sync-s', 'true') if p.clone_depth: e.setAttribute('clone-depth', str(p.clone_depth)) self._output_manifest_project_extras(p, e) if p.subprojects: subprojects = set(subp.name for subp in p.subprojects) 
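# Recurse so nested projects are emitted as children of this project's element.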
output_projects(p, e, list(sorted(subprojects))) projects = set(p.name for p in self._paths.values() if not p.parent) output_projects(None, root, list(sorted(projects))) if self._repo_hooks_project: root.appendChild(doc.createTextNode('')) e = doc.createElement('repo-hooks') e.setAttribute('in-project', self._repo_hooks_project.name) e.setAttribute('enabled-list', ' '.join(self._repo_hooks_project.enabled_repo_hooks)) root.appendChild(e) doc.writexml(fd, '', ' ', '\n', 'UTF-8') def _output_manifest_project_extras(self, p, e): """Manifests can modify e if they support extra project attributes.""" pass @property def paths(self): self._Load() return self._paths @property def projects(self): self._Load() return list(self._paths.values()) @property def remotes(self): self._Load() return self._remotes @property def default(self): self._Load() return self._default @property def repo_hooks_project(self): self._Load() return self._repo_hooks_project @property def notice(self): self._Load() return self._notice @property def manifest_server(self): self._Load() return self._manifest_server @property def IsMirror(self): return self.manifestProject.config.GetBoolean('repo.mirror') @property def IsArchive(self): return self.manifestProject.config.GetBoolean('repo.archive') def _Unload(self): self._loaded = False self._projects = {} self._paths = {} self._remotes = {} self._default = None self._repo_hooks_project = None self._notice = None self.branch = None self._manifest_server = None def _Load(self): if not self._loaded: m = self.manifestProject b = m.GetBranch(m.CurrentBranch).merge if b is not None and b.startswith(R_HEADS): b = b[len(R_HEADS):] self.branch = b nodes = [] nodes.append(self._ParseManifestXml(self.manifestFile, self.manifestProject.worktree)) local = os.path.join(self.repodir, LOCAL_MANIFEST_NAME) if os.path.exists(local): if not self.localManifestWarning: self.localManifestWarning = True print('warning: %s is deprecated; put local manifests in `%s` instead' % (LOCAL_MANIFEST_NAME, os.path.join(self.repodir, LOCAL_MANIFESTS_DIR_NAME)), file=sys.stderr) nodes.append(self._ParseManifestXml(local, self.repodir)) local_dir = os.path.abspath(os.path.join(self.repodir, LOCAL_MANIFESTS_DIR_NAME)) try: for local_file in sorted(os.listdir(local_dir)): if local_file.endswith('.xml'): local = os.path.join(local_dir, local_file) nodes.append(self._ParseManifestXml(local, self.repodir)) except OSError: pass try: self._ParseManifest(nodes) except ManifestParseError as e: # There was a problem parsing, unload ourselves in case they catch # this error and try again later, we will show the correct error self._Unload() raise e if self.IsMirror: self._AddMetaProjectMirror(self.repoProject) self._AddMetaProjectMirror(self.manifestProject) self._loaded = True def _ParseManifestXml(self, path, include_root): try: root = xml.dom.minidom.parse(path) except (OSError, xml.parsers.expat.ExpatError) as e: raise ManifestParseError("error parsing manifest %s: %s" % (path, e)) if not root or not root.childNodes: raise ManifestParseError("no root node in %s" % (path,)) for manifest in root.childNodes: if manifest.nodeName == 'manifest': break else: raise ManifestParseError("no in %s" % (path,)) nodes = [] for node in manifest.childNodes: # pylint:disable=W0631 # We only get here if manifest is initialised if node.nodeName == 'include': name = self._reqatt(node, 'name') fp = os.path.join(include_root, name) if not os.path.isfile(fp): raise ManifestParseError("include %s doesn't exist or isn't a file" % (name,)) try: 
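# Included manifests are resolved relative to include_root and their nodes are spliced in place of the include element.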
nodes.extend(self._ParseManifestXml(fp, include_root)) # should isolate this to the exact exception, but that's # tricky. actual parsing implementation may vary. except (KeyboardInterrupt, RuntimeError, SystemExit): raise except Exception as e: raise ManifestParseError( "failed parsing included manifest %s: %s", (name, e)) else: nodes.append(node) return nodes def _ParseManifest(self, node_list): for node in itertools.chain(*node_list): if node.nodeName == 'remote': remote = self._ParseRemote(node) if remote: if remote.name in self._remotes: if remote != self._remotes[remote.name]: raise ManifestParseError( 'remote %s already exists with different attributes' % (remote.name)) else: self._remotes[remote.name] = remote for node in itertools.chain(*node_list): if node.nodeName == 'default': new_default = self._ParseDefault(node) if self._default is None: self._default = new_default elif new_default != self._default: raise ManifestParseError('duplicate default in %s' % (self.manifestFile)) if self._default is None: self._default = _Default() for node in itertools.chain(*node_list): if node.nodeName == 'notice': if self._notice is not None: raise ManifestParseError( 'duplicate notice in %s' % (self.manifestFile)) self._notice = self._ParseNotice(node) for node in itertools.chain(*node_list): if node.nodeName == 'manifest-server': url = self._reqatt(node, 'url') if self._manifest_server is not None: raise ManifestParseError( 'duplicate manifest-server in %s' % (self.manifestFile)) self._manifest_server = url def recursively_add_projects(project): projects = self._projects.setdefault(project.name, []) if project.relpath is None: raise ManifestParseError( 'missing path for %s in %s' % (project.name, self.manifestFile)) if project.relpath in self._paths: raise ManifestParseError( 'duplicate path %s in %s' % (project.relpath, self.manifestFile)) self._paths[project.relpath] = project projects.append(project) for subproject in project.subprojects: recursively_add_projects(subproject) for node in itertools.chain(*node_list): if node.nodeName == 'project': project = self._ParseProject(node) recursively_add_projects(project) if node.nodeName == 'extend-project': name = self._reqatt(node, 'name') if name not in self._projects: raise ManifestParseError('extend-project element specifies non-existent ' 'project: %s' % name) path = node.getAttribute('path') groups = node.getAttribute('groups') if groups: groups = self._ParseGroups(groups) for p in self._projects[name]: if path and p.relpath != path: continue if groups: p.groups.extend(groups) if node.nodeName == 'repo-hooks': # Get the name of the project and the (space-separated) list of enabled. repo_hooks_project = self._reqatt(node, 'in-project') enabled_repo_hooks = self._reqatt(node, 'enabled-list').split() # Only one project can be the hooks project if self._repo_hooks_project is not None: raise ManifestParseError( 'duplicate repo-hooks in %s' % (self.manifestFile)) # Store a reference to the Project. try: repo_hooks_projects = self._projects[repo_hooks_project] except KeyError: raise ManifestParseError( 'project %s not found for repo-hooks' % (repo_hooks_project)) if len(repo_hooks_projects) != 1: raise ManifestParseError( 'internal error parsing repo-hooks in %s' % (self.manifestFile)) self._repo_hooks_project = repo_hooks_projects[0] # Store the enabled hooks in the Project object. 
self._repo_hooks_project.enabled_repo_hooks = enabled_repo_hooks if node.nodeName == 'remove-project': name = self._reqatt(node, 'name') if name not in self._projects: raise ManifestParseError('remove-project element specifies non-existent ' 'project: %s' % name) for p in self._projects[name]: del self._paths[p.relpath] del self._projects[name] # If the manifest removes the hooks project, treat it as if it deleted # the repo-hooks element too. if self._repo_hooks_project and (self._repo_hooks_project.name == name): self._repo_hooks_project = None def _AddMetaProjectMirror(self, m): name = None m_url = m.GetRemote(m.remote.name).url if m_url.endswith('/.git'): raise ManifestParseError('refusing to mirror %s' % m_url) if self._default and self._default.remote: url = self._default.remote.resolvedFetchUrl if not url.endswith('/'): url += '/' if m_url.startswith(url): remote = self._default.remote name = m_url[len(url):] if name is None: s = m_url.rindex('/') + 1 manifestUrl = self.manifestProject.config.GetString('remote.origin.url') remote = _XmlRemote('origin', fetch=m_url[:s], manifestUrl=manifestUrl) name = m_url[s:] if name.endswith('.git'): name = name[:-4] if name not in self._projects: m.PreSync() gitdir = os.path.join(self.topdir, '%s.git' % name) project = Project(manifest = self, name = name, remote = remote.ToRemoteSpec(name), gitdir = gitdir, objdir = gitdir, worktree = None, relpath = name or None, revisionExpr = m.revisionExpr, revisionId = None) self._projects[project.name] = [project] self._paths[project.relpath] = project def _ParseRemote(self, node): """ reads a element from the manifest file """ name = self._reqatt(node, 'name') alias = node.getAttribute('alias') if alias == '': alias = None fetch = self._reqatt(node, 'fetch') pushUrl = node.getAttribute('pushurl') if pushUrl == '': pushUrl = None review = node.getAttribute('review') if review == '': review = None revision = node.getAttribute('revision') if revision == '': revision = None manifestUrl = self.manifestProject.config.GetString('remote.origin.url') return _XmlRemote(name, alias, fetch, pushUrl, manifestUrl, review, revision) def _ParseDefault(self, node): """ reads a element from the manifest file """ d = _Default() d.remote = self._get_remote(node) d.revisionExpr = node.getAttribute('revision') if d.revisionExpr == '': d.revisionExpr = None d.destBranchExpr = node.getAttribute('dest-branch') or None sync_j = node.getAttribute('sync-j') if sync_j == '' or sync_j is None: d.sync_j = 1 else: d.sync_j = int(sync_j) sync_c = node.getAttribute('sync-c') if not sync_c: d.sync_c = False else: d.sync_c = sync_c.lower() in ("yes", "true", "1") sync_s = node.getAttribute('sync-s') if not sync_s: d.sync_s = False else: d.sync_s = sync_s.lower() in ("yes", "true", "1") return d def _ParseNotice(self, node): """ reads a element from the manifest file The element is distinct from other tags in the XML in that the data is conveyed between the start and end tag (it's not an empty-element tag). The white space (carriage returns, indentation) for the notice element is relevant and is parsed in a way that is based on how python docstrings work. In fact, the code is remarkably similar to here: http://www.python.org/dev/peps/pep-0257/ """ # Get the data out of the node... notice = node.childNodes[0].data # Figure out minimum indentation, skipping the first line (the same line # as the tag)... 
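# (Dedent by the smallest indent seen, mirroring the PEP 257 trimming algorithm referenced above.)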
minIndent = sys.maxsize lines = notice.splitlines() for line in lines[1:]: lstrippedLine = line.lstrip() if lstrippedLine: indent = len(line) - len(lstrippedLine) minIndent = min(indent, minIndent) # Strip leading / trailing blank lines and also indentation. cleanLines = [lines[0].strip()] for line in lines[1:]: cleanLines.append(line[minIndent:].rstrip()) # Clear completely blank lines from front and back... while cleanLines and not cleanLines[0]: del cleanLines[0] while cleanLines and not cleanLines[-1]: del cleanLines[-1] return '\n'.join(cleanLines) def _JoinName(self, parent_name, name): return os.path.join(parent_name, name) def _UnjoinName(self, parent_name, name): return os.path.relpath(name, parent_name) def _ParseProject(self, node, parent = None, **extra_proj_attrs): """ reads a element from the manifest file """ name = self._reqatt(node, 'name') if parent: name = self._JoinName(parent.name, name) remote = self._get_remote(node) if remote is None: remote = self._default.remote if remote is None: raise ManifestParseError("no remote for project %s within %s" % (name, self.manifestFile)) revisionExpr = node.getAttribute('revision') or remote.revision if not revisionExpr: revisionExpr = self._default.revisionExpr if not revisionExpr: raise ManifestParseError("no revision for project %s within %s" % (name, self.manifestFile)) path = node.getAttribute('path') if not path: path = name if path.startswith('/'): raise ManifestParseError("project %s path cannot be absolute in %s" % (name, self.manifestFile)) rebase = node.getAttribute('rebase') if not rebase: rebase = True else: rebase = rebase.lower() in ("yes", "true", "1") sync_c = node.getAttribute('sync-c') if not sync_c: sync_c = False else: sync_c = sync_c.lower() in ("yes", "true", "1") sync_s = node.getAttribute('sync-s') if not sync_s: sync_s = self._default.sync_s else: sync_s = sync_s.lower() in ("yes", "true", "1") clone_depth = node.getAttribute('clone-depth') if clone_depth: try: clone_depth = int(clone_depth) if clone_depth <= 0: raise ValueError() except ValueError: raise ManifestParseError('invalid clone-depth %s in %s' % (clone_depth, self.manifestFile)) dest_branch = node.getAttribute('dest-branch') or self._default.destBranchExpr upstream = node.getAttribute('upstream') groups = '' if node.hasAttribute('groups'): groups = node.getAttribute('groups') groups = self._ParseGroups(groups) if parent is None: relpath, worktree, gitdir, objdir = self.GetProjectPaths(name, path) else: relpath, worktree, gitdir, objdir = \ self.GetSubprojectPaths(parent, name, path) default_groups = ['all', 'name:%s' % name, 'path:%s' % relpath] groups.extend(set(default_groups).difference(groups)) if self.IsMirror and node.hasAttribute('force-path'): if node.getAttribute('force-path').lower() in ("yes", "true", "1"): gitdir = os.path.join(self.topdir, '%s.git' % path) project = Project(manifest = self, name = name, remote = remote.ToRemoteSpec(name), gitdir = gitdir, objdir = objdir, worktree = worktree, relpath = relpath, revisionExpr = revisionExpr, revisionId = None, rebase = rebase, groups = groups, sync_c = sync_c, sync_s = sync_s, clone_depth = clone_depth, upstream = upstream, parent = parent, dest_branch = dest_branch, **extra_proj_attrs) for n in node.childNodes: if n.nodeName == 'copyfile': self._ParseCopyFile(project, n) if n.nodeName == 'linkfile': self._ParseLinkFile(project, n) if n.nodeName == 'annotation': self._ParseAnnotation(project, n) if n.nodeName == 'project': project.subprojects.append(self._ParseProject(n, parent = 
project)) return project def GetProjectPaths(self, name, path): relpath = path if self.IsMirror: worktree = None gitdir = os.path.join(self.topdir, '%s.git' % name) objdir = gitdir else: worktree = os.path.join(self.topdir, path).replace('\\', '/') gitdir = os.path.join(self.repodir, 'projects', '%s.git' % path) objdir = os.path.join(self.repodir, 'project-objects', '%s.git' % name) return relpath, worktree, gitdir, objdir def GetProjectsWithName(self, name): return self._projects.get(name, []) def GetSubprojectName(self, parent, submodule_path): return os.path.join(parent.name, submodule_path) def _JoinRelpath(self, parent_relpath, relpath): return os.path.join(parent_relpath, relpath) def _UnjoinRelpath(self, parent_relpath, relpath): return os.path.relpath(relpath, parent_relpath) def GetSubprojectPaths(self, parent, name, path): relpath = self._JoinRelpath(parent.relpath, path) gitdir = os.path.join(parent.gitdir, 'subprojects', '%s.git' % path) objdir = os.path.join(parent.gitdir, 'subproject-objects', '%s.git' % name) if self.IsMirror: worktree = None else: worktree = os.path.join(parent.worktree, path).replace('\\', '/') return relpath, worktree, gitdir, objdir def _ParseCopyFile(self, project, node): src = self._reqatt(node, 'src') dest = self._reqatt(node, 'dest') if not self.IsMirror: # src is project relative; # dest is relative to the top of the tree project.AddCopyFile(src, dest, os.path.join(self.topdir, dest)) def _ParseLinkFile(self, project, node): src = self._reqatt(node, 'src') dest = self._reqatt(node, 'dest') if not self.IsMirror: # src is project relative; # dest is relative to the top of the tree project.AddLinkFile(src, dest, os.path.join(self.topdir, dest)) def _ParseAnnotation(self, project, node): name = self._reqatt(node, 'name') value = self._reqatt(node, 'value') try: keep = self._reqatt(node, 'keep').lower() except ManifestParseError: keep = "true" if keep != "true" and keep != "false": raise ManifestParseError('optional "keep" attribute must be ' '"true" or "false"') project.AddAnnotation(name, value, keep) def _get_remote(self, node): name = node.getAttribute('remote') if not name: return None v = self._remotes.get(name) if not v: raise ManifestParseError("remote %s not defined in %s" % (name, self.manifestFile)) return v def _reqatt(self, node, attname): """ reads a required attribute from the node. """ v = node.getAttribute(attname) if not v: raise ManifestParseError("no %s in <%s> within %s" % (attname, node.nodeName, self.manifestFile)) return v def projectsDiff(self, manifest): """return the projects differences between two manifests. The diff will be from self to given manifest. 
""" fromProjects = self.paths toProjects = manifest.paths fromKeys = sorted(fromProjects.keys()) toKeys = sorted(toProjects.keys()) diff = {'added': [], 'removed': [], 'changed': [], 'unreachable': []} for proj in fromKeys: if not proj in toKeys: diff['removed'].append(fromProjects[proj]) else: fromProj = fromProjects[proj] toProj = toProjects[proj] try: fromRevId = fromProj.GetCommitRevisionId() toRevId = toProj.GetCommitRevisionId() except ManifestInvalidRevisionError: diff['unreachable'].append((fromProj, toProj)) else: if fromRevId != toRevId: diff['changed'].append((fromProj, toProj)) toKeys.remove(proj) for proj in toKeys: diff['added'].append(toProjects[proj]) return diff class GitcManifest(XmlManifest): def __init__(self, repodir, gitc_client_name): """Initialize the GitcManifest object.""" super(GitcManifest, self).__init__(repodir) self.isGitcClient = True self.gitc_client_name = gitc_client_name self.gitc_client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(), gitc_client_name) self.manifestFile = os.path.join(self.gitc_client_dir, '.manifest') def _ParseProject(self, node, parent = None): """Override _ParseProject and add support for GITC specific attributes.""" return super(GitcManifest, self)._ParseProject( node, parent=parent, old_revision=node.getAttribute('old-revision')) def _output_manifest_project_extras(self, p, e): """Output GITC Specific Project attributes""" if p.old_revision: e.setAttribute('old-revision', str(p.old_revision)) pager.py0100755 0000000 0000000 00000004003 13025567015 011261 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import os import select import sys active = False def RunPager(globalConfig): global active if not os.isatty(0) or not os.isatty(1): return pager = _SelectPager(globalConfig) if pager == '' or pager == 'cat': return # This process turns into the pager; a child it forks will # do the real processing and output back to the pager. This # is necessary to keep the pager in control of the tty. # try: r, w = os.pipe() pid = os.fork() if not pid: os.dup2(w, 1) os.dup2(w, 2) os.close(r) os.close(w) active = True return os.dup2(r, 0) os.close(r) os.close(w) _BecomePager(pager) except Exception: print("fatal: cannot start pager '%s'" % pager, file=sys.stderr) sys.exit(255) def _SelectPager(globalConfig): try: return os.environ['GIT_PAGER'] except KeyError: pass pager = globalConfig.GetString('core.pager') if pager: return pager try: return os.environ['PAGER'] except KeyError: pass return 'less' def _BecomePager(pager): # Delaying execution of the pager until we have output # ready works around a long-standing bug in popularly # available versions of 'less', a better 'more'. 
# _a, _b, _c = select.select([0], [], [0]) os.environ['LESS'] = 'FRSX' try: os.execvp(pager, [pager]) except OSError: os.execv('/bin/sh', ['sh', '-c', pager]) progress.py0100644 0000000 0000000 00000003764 13025567015 012041 0ustar000000000 0000000 # # Copyright (C) 2009 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. import os import sys from time import time from trace import IsTrace _NOT_TTY = not os.isatty(2) class Progress(object): def __init__(self, title, total=0, units=''): self._title = title self._total = total self._done = 0 self._lastp = -1 self._start = time() self._show = False self._units = units def update(self, inc=1): self._done += inc if _NOT_TTY or IsTrace(): return if not self._show: if 0.5 <= time() - self._start: self._show = True else: return if self._total <= 0: sys.stderr.write('\r%s: %d, ' % ( self._title, self._done)) sys.stderr.flush() else: p = (100 * self._done) / self._total if self._lastp != p: self._lastp = p sys.stderr.write('\r%s: %3d%% (%d%s/%d%s) ' % ( self._title, p, self._done, self._units, self._total, self._units)) sys.stderr.flush() def end(self): if _NOT_TTY or IsTrace() or not self._show: return if self._total <= 0: sys.stderr.write('\r%s: %d, done. \n' % ( self._title, self._done)) sys.stderr.flush() else: p = (100 * self._done) / self._total sys.stderr.write('\r%s: %3d%% (%d%s/%d%s), done. \n' % ( self._title, p, self._done, self._units, self._total, self._units)) sys.stderr.flush() project.py0100644 0000000 0000000 00000266141 13025567015 011643 0ustar000000000 0000000 # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
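# [Editor's note] A hedged usage sketch for the Progress class defined in
# progress.py above: nothing is drawn for the first 0.5 seconds, output is
# suppressed when stderr is not a tty (or tracing is on), and every update
# rewrites a single stderr line in place via a leading '\r'. The demo function
# below is illustrative only and assumes the repo source tree is importable.
def _editor_progress_demo(total=250):
  from progress import Progress
  import time
  pm = Progress('Fetching projects', total=total)
  for _ in range(total):
    time.sleep(0.01)   # stand-in for real per-project work
    pm.update()        # redraws '\rFetching projects: NN% (i/total)' on stderr
  pm.end()             # appends ', done.' and the final newline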
from __future__ import print_function import errno import filecmp import glob import os import random import re import shutil import stat import subprocess import sys import tarfile import tempfile import time import traceback from color import Coloring from git_command import GitCommand, git_require from git_config import GitConfig, IsId, GetSchemeFromUrl, GetUrlCookieFile, \ ID_RE from error import GitError, HookError, UploadError, DownloadError from error import ManifestInvalidRevisionError from error import NoManifestException from trace import IsTrace, Trace from git_refs import GitRefs, HEAD, R_HEADS, R_TAGS, R_PUB, R_M from pyversion import is_python3 if is_python3(): import urllib.parse else: import imp import urlparse urllib = imp.new_module('urllib') urllib.parse = urlparse # pylint:disable=W0622 input = raw_input # pylint:enable=W0622 def _lwrite(path, content): lock = '%s.lock' % path fd = open(lock, 'w') try: fd.write(content) finally: fd.close() try: os.rename(lock, path) except OSError: os.remove(lock) raise def _error(fmt, *args): msg = fmt % args print('error: %s' % msg, file=sys.stderr) def _warn(fmt, *args): msg = fmt % args print('warn: %s' % msg, file=sys.stderr) def not_rev(r): return '^' + r def sq(r): return "'" + r.replace("'", "'\''") + "'" _project_hook_list = None def _ProjectHooks(): """List the hooks present in the 'hooks' directory. These hooks are project hooks and are copied to the '.git/hooks' directory of all subprojects. This function caches the list of hooks (based on the contents of the 'repo/hooks' directory) on the first call. Returns: A list of absolute paths to all of the files in the hooks directory. """ global _project_hook_list if _project_hook_list is None: d = os.path.realpath(os.path.abspath(os.path.dirname(__file__))) d = os.path.join(d, 'hooks') _project_hook_list = [os.path.join(d, x) for x in os.listdir(d)] return _project_hook_list class DownloadedChange(object): _commit_cache = None def __init__(self, project, base, change_id, ps_id, commit): self.project = project self.base = base self.change_id = change_id self.ps_id = ps_id self.commit = commit @property def commits(self): if self._commit_cache is None: self._commit_cache = self.project.bare_git.rev_list('--abbrev=8', '--abbrev-commit', '--pretty=oneline', '--reverse', '--date-order', not_rev(self.base), self.commit, '--') return self._commit_cache class ReviewableBranch(object): _commit_cache = None def __init__(self, project, branch, base): self.project = project self.branch = branch self.base = base @property def name(self): return self.branch.name @property def commits(self): if self._commit_cache is None: self._commit_cache = self.project.bare_git.rev_list('--abbrev=8', '--abbrev-commit', '--pretty=oneline', '--reverse', '--date-order', not_rev(self.base), R_HEADS + self.name, '--') return self._commit_cache @property def unabbrev_commits(self): r = dict() for commit in self.project.bare_git.rev_list(not_rev(self.base), R_HEADS + self.name, '--'): r[commit[0:8]] = commit return r @property def date(self): return self.project.bare_git.log('--pretty=format:%cd', '-n', '1', R_HEADS + self.name, '--') def UploadForReview(self, people, auto_topic=False, draft=False, dest_branch=None): self.project.UploadForReview(self.name, people, auto_topic=auto_topic, draft=draft, dest_branch=dest_branch) def GetPublishedRefs(self): refs = {} output = self.project.bare_git.ls_remote( self.branch.remote.SshReviewUrl(self.project.UserEmail), 'refs/changes/*') for line in output.split('\n'): try: 
(sha, ref) = line.split() refs[sha] = ref except ValueError: pass return refs class StatusColoring(Coloring): def __init__(self, config): Coloring.__init__(self, config, 'status') self.project = self.printer('header', attr='bold') self.branch = self.printer('header', attr='bold') self.nobranch = self.printer('nobranch', fg='red') self.important = self.printer('important', fg='red') self.added = self.printer('added', fg='green') self.changed = self.printer('changed', fg='red') self.untracked = self.printer('untracked', fg='red') class DiffColoring(Coloring): def __init__(self, config): Coloring.__init__(self, config, 'diff') self.project = self.printer('header', attr='bold') class _Annotation(object): def __init__(self, name, value, keep): self.name = name self.value = value self.keep = keep class _CopyFile(object): def __init__(self, src, dest, abssrc, absdest): self.src = src self.dest = dest self.abs_src = abssrc self.abs_dest = absdest def _Copy(self): src = self.abs_src dest = self.abs_dest # copy file if it does not exist or is out of date if not os.path.exists(dest) or not filecmp.cmp(src, dest): try: # remove existing file first, since it might be read-only if os.path.exists(dest): os.remove(dest) else: dest_dir = os.path.dirname(dest) if not os.path.isdir(dest_dir): os.makedirs(dest_dir) shutil.copy(src, dest) # make the file read-only mode = os.stat(dest)[stat.ST_MODE] mode = mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH) os.chmod(dest, mode) except IOError: _error('Cannot copy file %s to %s', src, dest) class _LinkFile(object): def __init__(self, git_worktree, src, dest, relsrc, absdest): self.git_worktree = git_worktree self.src = src self.dest = dest self.src_rel_to_dest = relsrc self.abs_dest = absdest def __linkIt(self, relSrc, absDest): # link file if it does not exist or is out of date if not os.path.islink(absDest) or (os.readlink(absDest) != relSrc): try: # remove existing file first, since it might be read-only if os.path.lexists(absDest): os.remove(absDest) else: dest_dir = os.path.dirname(absDest) if not os.path.isdir(dest_dir): os.makedirs(dest_dir) os.symlink(relSrc, absDest) except IOError: _error('Cannot link file %s to %s', relSrc, absDest) def _Link(self): """Link the self.rel_src_to_dest and self.abs_dest. Handles wild cards on the src linking all of the files in the source in to the destination directory. """ # We use the absSrc to handle the situation where the current directory # is not the root of the repo absSrc = os.path.join(self.git_worktree, self.src) if os.path.exists(absSrc): # Entity exists so just a simple one to one link operation self.__linkIt(self.src_rel_to_dest, self.abs_dest) else: # Entity doesn't exist assume there is a wild card absDestDir = self.abs_dest if os.path.exists(absDestDir) and not os.path.isdir(absDestDir): _error('Link error: src with wildcard, %s must be a directory', absDestDir) else: absSrcFiles = glob.glob(absSrc) for absSrcFile in absSrcFiles: # Create a releative path from source dir to destination dir absSrcDir = os.path.dirname(absSrcFile) relSrcDir = os.path.relpath(absSrcDir, absDestDir) # Get the source file name srcFile = os.path.basename(absSrcFile) # Now form the final full paths to srcFile. They will be # absolute for the desintaiton and relative for the srouce. 
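# [Editor's note] A hedged sketch of how _LinkFile above materialises a
# <linkfile>: the link target is stored relative to the destination's
# directory (os.path.relpath), so the symlink keeps working wherever the tree
# is checked out. The helper is hypothetical and mirrors the remove-stale /
# mkdir / symlink steps of __linkIt without the wildcard handling.
import os

def _editor_make_relative_link(worktree, src, absdest):
  # Compute the link target relative to the destination directory.
  relsrc = os.path.relpath(os.path.join(worktree, src),
                           os.path.dirname(absdest))
  if os.path.islink(absdest) and os.readlink(absdest) == relsrc:
    return  # already points at the right place
  if os.path.lexists(absdest):
    os.remove(absdest)            # stale file or stale link; may be read-only
  elif not os.path.isdir(os.path.dirname(absdest)):
    os.makedirs(os.path.dirname(absdest))
  os.symlink(relsrc, absdest)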
absDest = os.path.join(absDestDir, srcFile) relSrc = os.path.join(relSrcDir, srcFile) self.__linkIt(relSrc, absDest) class RemoteSpec(object): def __init__(self, name, url=None, pushUrl=None, review=None, revision=None, orig_name=None): self.name = name self.url = url self.pushUrl = pushUrl self.review = review self.revision = revision self.orig_name = orig_name class RepoHook(object): """A RepoHook contains information about a script to run as a hook. Hooks are used to run a python script before running an upload (for instance, to run presubmit checks). Eventually, we may have hooks for other actions. This shouldn't be confused with files in the 'repo/hooks' directory. Those files are copied into each '.git/hooks' folder for each project. Repo-level hooks are associated instead with repo actions. Hooks are always python. When a hook is run, we will load the hook into the interpreter and execute its main() function. """ def __init__(self, hook_type, hooks_project, topdir, manifest_url, abort_if_user_denies=False): """RepoHook constructor. Params: hook_type: A string representing the type of hook. This is also used to figure out the name of the file containing the hook. For example: 'pre-upload'. hooks_project: The project containing the repo hooks. If you have a manifest, this is manifest.repo_hooks_project. OK if this is None, which will make the hook a no-op. topdir: Repo's top directory (the one containing the .repo directory). Scripts will run with CWD as this directory. If you have a manifest, this is manifest.topdir manifest_url: The URL to the manifest git repo. abort_if_user_denies: If True, we'll throw a HookError() if the user doesn't allow us to run the hook. """ self._hook_type = hook_type self._hooks_project = hooks_project self._manifest_url = manifest_url self._topdir = topdir self._abort_if_user_denies = abort_if_user_denies # Store the full path to the script for convenience. if self._hooks_project: self._script_fullpath = os.path.join(self._hooks_project.worktree, self._hook_type + '.py') else: self._script_fullpath = None def _GetHash(self): """Return a hash of the contents of the hooks directory. We'll just use git to do this. This hash has the property that if anything changes in the directory we will return a different has. SECURITY CONSIDERATION: This hash only represents the contents of files in the hook directory, not any other files imported or called by hooks. Changes to imported files can change the script behavior without affecting the hash. Returns: A string representing the hash. This will always be ASCII so that it can be printed to the user easily. """ assert self._hooks_project, "Must have hooks to calculate their hash." # We will use the work_git object rather than just calling GetRevisionId(). # That gives us a hash of the latest checked in version of the files that # the user will actually be executing. Specifically, GetRevisionId() # doesn't appear to change even if a user checks out a different version # of the hooks repo (via git checkout) nor if a user commits their own revs. # # NOTE: Local (non-committed) changes will not be factored into this hash. # I think this is OK, since we're really only worried about warning the user # about upstream changes. return self._hooks_project.work_git.rev_parse('HEAD') def _GetMustVerb(self): """Return 'must' if the hook is required; 'should' if not.""" if self._abort_if_user_denies: return 'must' else: return 'should' def _CheckForHookApproval(self): """Check to see whether this hook has been approved. 
We'll accept approval of manifest URLs if they're using secure transports. This way the user can say they trust the manifest hoster. For insecure hosts, we fall back to checking the hash of the hooks repo. Note that we ask permission for each individual hook even though we use the hash of all hooks when detecting changes. We'd like the user to be able to approve / deny each hook individually. We only use the hash of all hooks because there is no other easy way to detect changes to local imports. Returns: True if this hook is approved to run; False otherwise. Raises: HookError: Raised if the user doesn't approve and abort_if_user_denies was passed to the consturctor. """ if self._ManifestUrlHasSecureScheme(): return self._CheckForHookApprovalManifest() else: return self._CheckForHookApprovalHash() def _CheckForHookApprovalHelper(self, subkey, new_val, main_prompt, changed_prompt): """Check for approval for a particular attribute and hook. Args: subkey: The git config key under [repo.hooks.] to store the last approved string. new_val: The new value to compare against the last approved one. main_prompt: Message to display to the user to ask for approval. changed_prompt: Message explaining why we're re-asking for approval. Returns: True if this hook is approved to run; False otherwise. Raises: HookError: Raised if the user doesn't approve and abort_if_user_denies was passed to the consturctor. """ hooks_config = self._hooks_project.config git_approval_key = 'repo.hooks.%s.%s' % (self._hook_type, subkey) # Get the last value that the user approved for this hook; may be None. old_val = hooks_config.GetString(git_approval_key) if old_val is not None: # User previously approved hook and asked not to be prompted again. if new_val == old_val: # Approval matched. We're done. return True else: # Give the user a reason why we're prompting, since they last told # us to "never ask again". prompt = 'WARNING: %s\n\n' % (changed_prompt,) else: prompt = '' # Prompt the user if we're not on a tty; on a tty we'll assume "no". if sys.stdout.isatty(): prompt += main_prompt + ' (yes/always/NO)? ' response = input(prompt).lower() print() # User is doing a one-time approval. if response in ('y', 'yes'): return True elif response == 'always': hooks_config.SetString(git_approval_key, new_val) return True # For anything else, we'll assume no approval. if self._abort_if_user_denies: raise HookError('You must allow the %s hook or use --no-verify.' % self._hook_type) return False def _ManifestUrlHasSecureScheme(self): """Check if the URI for the manifest is a secure transport.""" secure_schemes = ('file', 'https', 'ssh', 'persistent-https', 'sso', 'rpc') parse_results = urllib.parse.urlparse(self._manifest_url) return parse_results.scheme in secure_schemes def _CheckForHookApprovalManifest(self): """Check whether the user has approved this manifest host. Returns: True if this hook is approved to run; False otherwise. """ return self._CheckForHookApprovalHelper( 'approvedmanifest', self._manifest_url, 'Run hook scripts from %s' % (self._manifest_url,), 'Manifest URL has changed since %s was allowed.' % (self._hook_type,)) def _CheckForHookApprovalHash(self): """Check whether the user has approved the hooks repo. Returns: True if this hook is approved to run; False otherwise. 
""" prompt = ('Repo %s run the script:\n' ' %s\n' '\n' 'Do you want to allow this script to run') return self._CheckForHookApprovalHelper( 'approvedhash', self._GetHash(), prompt % (self._GetMustVerb(), self._script_fullpath), 'Scripts have changed since %s was allowed.' % (self._hook_type,)) def _ExecuteHook(self, **kwargs): """Actually execute the given hook. This will run the hook's 'main' function in our python interpreter. Args: kwargs: Keyword arguments to pass to the hook. These are often specific to the hook type. For instance, pre-upload hooks will contain a project_list. """ # Keep sys.path and CWD stashed away so that we can always restore them # upon function exit. orig_path = os.getcwd() orig_syspath = sys.path try: # Always run hooks with CWD as topdir. os.chdir(self._topdir) # Put the hook dir as the first item of sys.path so hooks can do # relative imports. We want to replace the repo dir as [0] so # hooks can't import repo files. sys.path = [os.path.dirname(self._script_fullpath)] + sys.path[1:] # Exec, storing global context in the context dict. We catch exceptions # and convert to a HookError w/ just the failing traceback. context = {'__file__': self._script_fullpath} try: exec(compile(open(self._script_fullpath).read(), self._script_fullpath, 'exec'), context) except Exception: raise HookError('%s\nFailed to import %s hook; see traceback above.' % (traceback.format_exc(), self._hook_type)) # Running the script should have defined a main() function. if 'main' not in context: raise HookError('Missing main() in: "%s"' % self._script_fullpath) # Add 'hook_should_take_kwargs' to the arguments to be passed to main. # We don't actually want hooks to define their main with this argument-- # it's there to remind them that their hook should always take **kwargs. # For instance, a pre-upload hook should be defined like: # def main(project_list, **kwargs): # # This allows us to later expand the API without breaking old hooks. kwargs = kwargs.copy() kwargs['hook_should_take_kwargs'] = True # Call the main function in the hook. If the hook should cause the # build to fail, it will raise an Exception. We'll catch that convert # to a HookError w/ just the failing traceback. try: context['main'](**kwargs) except Exception: raise HookError('%s\nFailed to run main() for %s hook; see traceback ' 'above.' % (traceback.format_exc(), self._hook_type)) finally: # Restore sys.path and CWD. sys.path = orig_syspath os.chdir(orig_path) def Run(self, user_allows_all_hooks, **kwargs): """Run the hook. If the hook doesn't exist (because there is no hooks project or because this particular hook is not enabled), this is a no-op. Args: user_allows_all_hooks: If True, we will never prompt about running the hook--we'll just assume it's OK to run it. kwargs: Keyword arguments to pass to the hook. These are often specific to the hook type. For instance, pre-upload hooks will contain a project_list. Raises: HookError: If there was a problem finding the hook or the user declined to run a required hook (from _CheckForHookApproval). """ # No-op if there is no hooks project or if hook is disabled. if ((not self._hooks_project) or (self._hook_type not in self._hooks_project.enabled_repo_hooks)): return # Bail with a nice error if we can't find the hook. if not os.path.isfile(self._script_fullpath): raise HookError('Couldn\'t find repo hook: "%s"' % self._script_fullpath) # Make sure the user is OK with running the hook. 
if (not user_allows_all_hooks) and (not self._CheckForHookApproval()): return # Run the hook with the same version of python we're using. self._ExecuteHook(**kwargs) class Project(object): # These objects can be shared between several working trees. shareable_files = ['description', 'info'] shareable_dirs = ['hooks', 'objects', 'rr-cache', 'svn'] # These objects can only be used by a single working tree. working_tree_files = ['config', 'packed-refs', 'shallow'] working_tree_dirs = ['logs', 'refs'] def __init__(self, manifest, name, remote, gitdir, objdir, worktree, relpath, revisionExpr, revisionId, rebase=True, groups=None, sync_c=False, sync_s=False, clone_depth=None, upstream=None, parent=None, is_derived=False, dest_branch=None, optimized_fetch=False, old_revision=None): """Init a Project object. Args: manifest: The XmlManifest object. name: The `name` attribute of manifest.xml's project element. remote: RemoteSpec object specifying its remote's properties. gitdir: Absolute path of git directory. objdir: Absolute path of directory to store git objects. worktree: Absolute path of git working tree. relpath: Relative path of git working tree to repo's top directory. revisionExpr: The `revision` attribute of manifest.xml's project element. revisionId: git commit id for checking out. rebase: The `rebase` attribute of manifest.xml's project element. groups: The `groups` attribute of manifest.xml's project element. sync_c: The `sync-c` attribute of manifest.xml's project element. sync_s: The `sync-s` attribute of manifest.xml's project element. upstream: The `upstream` attribute of manifest.xml's project element. parent: The parent Project object. is_derived: False if the project was explicitly defined in the manifest; True if the project is a discovered submodule. dest_branch: The branch to which to push changes for review by default. optimized_fetch: If True, when a project is set to a sha1 revision, only fetch from the remote if the sha1 is not present locally. old_revision: saved git commit id for open GITC projects. """ self.manifest = manifest self.name = name self.remote = remote self.gitdir = gitdir.replace('\\', '/') self.objdir = objdir.replace('\\', '/') if worktree: self.worktree = os.path.normpath(worktree.replace('\\', '/')) else: self.worktree = None self.relpath = relpath self.revisionExpr = revisionExpr if revisionId is None \ and revisionExpr \ and IsId(revisionExpr): self.revisionId = revisionExpr else: self.revisionId = revisionId self.rebase = rebase self.groups = groups self.sync_c = sync_c self.sync_s = sync_s self.clone_depth = clone_depth self.upstream = upstream self.parent = parent self.is_derived = is_derived self.optimized_fetch = optimized_fetch self.subprojects = [] self.snapshots = {} self.copyfiles = [] self.linkfiles = [] self.annotations = [] self.config = GitConfig.ForRepository(gitdir=self.gitdir, defaults=self.manifest.globalConfig) if self.worktree: self.work_git = self._GitGetByExec(self, bare=False, gitdir=gitdir) else: self.work_git = None self.bare_git = self._GitGetByExec(self, bare=True, gitdir=gitdir) self.bare_ref = GitRefs(gitdir) self.bare_objdir = self._GitGetByExec(self, bare=True, gitdir=objdir) self.dest_branch = dest_branch self.old_revision = old_revision # This will be filled in if a project is later identified to be the # project containing repo hooks. 
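# [Editor's note] A hedged sketch of the revision bookkeeping in
# Project.__init__ above: when the manifest's `revision` is already a full
# commit id, it is promoted to revisionId immediately and no ref resolution is
# needed later; otherwise revisionId stays None until GetRevisionId() resolves
# the expression. The 40-hex-digit regex below is only a stand-in for
# git_config's IsId()/ID_RE.
import re

_EDITOR_SHA1_RE = re.compile(r'^[0-9a-f]{40}$')

def _editor_split_revision(revision_expr):
  # Returns (revisionExpr, revisionId) the way the constructor stores them.
  if revision_expr and _EDITOR_SHA1_RE.match(revision_expr):
    return revision_expr, revision_expr
  return revision_expr, None

# _editor_split_revision('refs/heads/master') -> ('refs/heads/master', None)
# _editor_split_revision('0' * 40)            -> ('000...0', '000...0')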
self.enabled_repo_hooks = [] @property def Derived(self): return self.is_derived @property def Exists(self): return os.path.isdir(self.gitdir) and os.path.isdir(self.objdir) @property def CurrentBranch(self): """Obtain the name of the currently checked out branch. The branch name omits the 'refs/heads/' prefix. None is returned if the project is on a detached HEAD. """ b = self.work_git.GetHead() if b.startswith(R_HEADS): return b[len(R_HEADS):] return None def IsRebaseInProgress(self): w = self.worktree g = os.path.join(w, '.git') return os.path.exists(os.path.join(g, 'rebase-apply')) \ or os.path.exists(os.path.join(g, 'rebase-merge')) \ or os.path.exists(os.path.join(w, '.dotest')) def IsDirty(self, consider_untracked=True): """Is the working directory modified in some way? """ self.work_git.update_index('-q', '--unmerged', '--ignore-missing', '--refresh') if self.work_git.DiffZ('diff-index', '-M', '--cached', HEAD): return True if self.work_git.DiffZ('diff-files'): return True if consider_untracked and self.work_git.LsOthers(): return True return False _userident_name = None _userident_email = None @property def UserName(self): """Obtain the user's personal name. """ if self._userident_name is None: self._LoadUserIdentity() return self._userident_name @property def UserEmail(self): """Obtain the user's email address. This is very likely to be their Gerrit login. """ if self._userident_email is None: self._LoadUserIdentity() return self._userident_email def _LoadUserIdentity(self): u = self.bare_git.var('GIT_COMMITTER_IDENT') m = re.compile("^(.*) <([^>]*)> ").match(u) if m: self._userident_name = m.group(1) self._userident_email = m.group(2) else: self._userident_name = '' self._userident_email = '' def GetRemote(self, name): """Get the configuration for a single remote. """ return self.config.GetRemote(name) def GetBranch(self, name): """Get the configuration for a single branch. """ return self.config.GetBranch(name) def GetBranches(self): """Get all existing local branches. """ current = self.CurrentBranch all_refs = self._allrefs heads = {} for name, ref_id in all_refs.items(): if name.startswith(R_HEADS): name = name[len(R_HEADS):] b = self.GetBranch(name) b.current = name == current b.published = None b.revision = ref_id heads[name] = b for name, ref_id in all_refs.items(): if name.startswith(R_PUB): name = name[len(R_PUB):] b = heads.get(name) if b: b.published = ref_id return heads def MatchesGroups(self, manifest_groups): """Returns true if the manifest groups specified at init should cause this project to be synced. Prefixing a manifest group with "-" inverts the meaning of a group. All projects are implicitly labelled with "all". labels are resolved in order. In the example case of project_groups: "all,group1,group2" manifest_groups: "-group1,group2" the project will be matched. The special manifest group "default" will match any project that does not have the special project group "notdefault" """ expanded_manifest_groups = manifest_groups or ['default'] expanded_project_groups = ['all'] + (self.groups or []) if 'notdefault' not in expanded_project_groups: expanded_project_groups += ['default'] matched = False for group in expanded_manifest_groups: if group.startswith('-') and group[1:] in expanded_project_groups: matched = False elif group in expanded_project_groups: matched = True return matched # Status Display ## def UncommitedFiles(self, get_all=True): """Returns a list of strings, uncommitted files in the git tree. 
Args: get_all: a boolean, if True - get information about all different uncommitted files. If False - return as soon as any kind of uncommitted files is detected. """ details = [] self.work_git.update_index('-q', '--unmerged', '--ignore-missing', '--refresh') if self.IsRebaseInProgress(): details.append("rebase in progress") if not get_all: return details changes = self.work_git.DiffZ('diff-index', '--cached', HEAD).keys() if changes: details.extend(changes) if not get_all: return details changes = self.work_git.DiffZ('diff-files').keys() if changes: details.extend(changes) if not get_all: return details changes = self.work_git.LsOthers() if changes: details.extend(changes) return details def HasChanges(self): """Returns true if there are uncommitted changes. """ if self.UncommitedFiles(get_all=False): return True else: return False def PrintWorkTreeStatus(self, output_redir=None): """Prints the status of the repository to stdout. Args: output: If specified, redirect the output to this object. """ if not os.path.isdir(self.worktree): if output_redir is None: output_redir = sys.stdout print(file=output_redir) print('project %s/' % self.relpath, file=output_redir) print(' missing (run "repo sync")', file=output_redir) return self.work_git.update_index('-q', '--unmerged', '--ignore-missing', '--refresh') rb = self.IsRebaseInProgress() di = self.work_git.DiffZ('diff-index', '-M', '--cached', HEAD) df = self.work_git.DiffZ('diff-files') do = self.work_git.LsOthers() if not rb and not di and not df and not do and not self.CurrentBranch: return 'CLEAN' out = StatusColoring(self.config) if output_redir is not None: out.redirect(output_redir) out.project('project %-40s', self.relpath + '/ ') branch = self.CurrentBranch if branch is None: out.nobranch('(*** NO BRANCH ***)') else: out.branch('branch %s', branch) out.nl() if rb: out.important('prior sync failed; rebase still in progress') out.nl() paths = list() paths.extend(di.keys()) paths.extend(df.keys()) paths.extend(do) for p in sorted(set(paths)): try: i = di[p] except KeyError: i = None try: f = df[p] except KeyError: f = None if i: i_status = i.status.upper() else: i_status = '-' if f: f_status = f.status.lower() else: f_status = '-' if i and i.src_path: line = ' %s%s\t%s => %s (%s%%)' % (i_status, f_status, i.src_path, p, i.level) else: line = ' %s%s\t%s' % (i_status, f_status, p) if i and not f: out.added('%s', line) elif (i and f) or (not i and f): out.changed('%s', line) elif not i and not f: out.untracked('%s', line) else: out.write('%s', line) out.nl() return 'DIRTY' def PrintWorkTreeDiff(self, absolute_paths=False): """Prints the status of the repository to stdout. """ out = DiffColoring(self.config) cmd = ['diff'] if out.is_on: cmd.append('--color') cmd.append(HEAD) if absolute_paths: cmd.append('--src-prefix=a/%s/' % self.relpath) cmd.append('--dst-prefix=b/%s/' % self.relpath) cmd.append('--') p = GitCommand(self, cmd, capture_stdout=True, capture_stderr=True) has_diff = False for line in p.process.stdout: if not has_diff: out.nl() out.project('project %s/' % self.relpath) out.nl() has_diff = True print(line[:-1]) p.Wait() # Publish / Upload ## def WasPublished(self, branch, all_refs=None): """Was the branch published (uploaded) for code review? If so, returns the SHA-1 hash of the last published state for the branch. 
""" key = R_PUB + branch if all_refs is None: try: return self.bare_git.rev_parse(key) except GitError: return None else: try: return all_refs[key] except KeyError: return None def CleanPublishedCache(self, all_refs=None): """Prunes any stale published refs. """ if all_refs is None: all_refs = self._allrefs heads = set() canrm = {} for name, ref_id in all_refs.items(): if name.startswith(R_HEADS): heads.add(name) elif name.startswith(R_PUB): canrm[name] = ref_id for name, ref_id in canrm.items(): n = name[len(R_PUB):] if R_HEADS + n not in heads: self.bare_git.DeleteRef(name, ref_id) def GetUploadableBranches(self, selected_branch=None): """List any branches which can be uploaded for review. """ heads = {} pubed = {} for name, ref_id in self._allrefs.items(): if name.startswith(R_HEADS): heads[name[len(R_HEADS):]] = ref_id elif name.startswith(R_PUB): pubed[name[len(R_PUB):]] = ref_id ready = [] for branch, ref_id in heads.items(): if branch in pubed and pubed[branch] == ref_id: continue if selected_branch and branch != selected_branch: continue rb = self.GetUploadableBranch(branch) if rb: ready.append(rb) return ready def GetUploadableBranch(self, branch_name): """Get a single uploadable branch, or None. """ branch = self.GetBranch(branch_name) base = branch.LocalMerge if branch.LocalMerge: rb = ReviewableBranch(self, branch, base) if rb.commits: return rb return None def UploadForReview(self, branch=None, people=([], []), auto_topic=False, draft=False, dest_branch=None): """Uploads the named branch for code review. """ if branch is None: branch = self.CurrentBranch if branch is None: raise GitError('not currently on a branch') branch = self.GetBranch(branch) if not branch.LocalMerge: raise GitError('branch %s does not track a remote' % branch.name) if not branch.remote.review: raise GitError('remote %s has no review url' % branch.remote.name) if dest_branch is None: dest_branch = self.dest_branch if dest_branch is None: dest_branch = branch.merge if not dest_branch.startswith(R_HEADS): dest_branch = R_HEADS + dest_branch if not branch.remote.projectname: branch.remote.projectname = self.name branch.remote.Save() url = branch.remote.ReviewUrl(self.UserEmail) if url is None: raise UploadError('review not configured') cmd = ['push'] if url.startswith('ssh://'): rp = ['gerrit receive-pack'] for e in people[0]: rp.append('--reviewer=%s' % sq(e)) for e in people[1]: rp.append('--cc=%s' % sq(e)) cmd.append('--receive-pack=%s' % " ".join(rp)) cmd.append(url) if dest_branch.startswith(R_HEADS): dest_branch = dest_branch[len(R_HEADS):] upload_type = 'for' if draft: upload_type = 'drafts' ref_spec = '%s:refs/%s/%s' % (R_HEADS + branch.name, upload_type, dest_branch) if auto_topic: ref_spec = ref_spec + '/' + branch.name if not url.startswith('ssh://'): rp = ['r=%s' % p for p in people[0]] + \ ['cc=%s' % p for p in people[1]] if rp: ref_spec = ref_spec + '%' + ','.join(rp) cmd.append(ref_spec) if GitCommand(self, cmd, bare=True).Wait() != 0: raise UploadError('Upload failed') msg = "posted to %s for %s" % (branch.remote.review, dest_branch) self.bare_git.UpdateRef(R_PUB + branch.name, R_HEADS + branch.name, message=msg) # Sync ## def _ExtractArchive(self, tarpath, path=None): """Extract the given tar on its current location Args: - tarpath: The path to the actual tar file """ try: with tarfile.open(tarpath, 'r') as tar: tar.extractall(path=path) return True except (IOError, tarfile.TarError) as e: _error("Cannot extract archive %s: %s", tarpath, str(e)) return False def Sync_NetworkHalf(self, 
quiet=False, is_new=None, current_branch_only=False, force_sync=False, clone_bundle=True, no_tags=False, archive=False, optimized_fetch=False, prune=False): """Perform only the network IO portion of the sync process. Local working directory/branch state is not affected. """ if archive and not isinstance(self, MetaProject): if self.remote.url.startswith(('http://', 'https://')): _error("%s: Cannot fetch archives from http/https remotes.", self.name) return False name = self.relpath.replace('\\', '/') name = name.replace('/', '_') tarpath = '%s.tar' % name topdir = self.manifest.topdir try: self._FetchArchive(tarpath, cwd=topdir) except GitError as e: _error('%s', e) return False # From now on, we only need absolute tarpath tarpath = os.path.join(topdir, tarpath) if not self._ExtractArchive(tarpath, path=topdir): return False try: os.remove(tarpath) except OSError as e: _warn("Cannot remove archive %s: %s", tarpath, str(e)) self._CopyAndLinkFiles() return True if is_new is None: is_new = not self.Exists if is_new: self._InitGitDir(force_sync=force_sync) else: self._UpdateHooks() self._InitRemote() if is_new: alt = os.path.join(self.gitdir, 'objects/info/alternates') try: fd = open(alt, 'rb') try: alt_dir = fd.readline().rstrip() finally: fd.close() except IOError: alt_dir = None else: alt_dir = None if clone_bundle \ and alt_dir is None \ and self._ApplyCloneBundle(initial=is_new, quiet=quiet): is_new = False if not current_branch_only: if self.sync_c: current_branch_only = True elif not self.manifest._loaded: # Manifest cannot check defaults until it syncs. current_branch_only = False elif self.manifest.default.sync_c: current_branch_only = True need_to_fetch = not (optimized_fetch and (ID_RE.match(self.revisionExpr) and self._CheckForSha1())) if (need_to_fetch and not self._RemoteFetch(initial=is_new, quiet=quiet, alt_dir=alt_dir, current_branch_only=current_branch_only, no_tags=no_tags, prune=prune)): return False if self.worktree: self._InitMRef() else: self._InitMirrorHead() try: os.remove(os.path.join(self.gitdir, 'FETCH_HEAD')) except OSError: pass return True def PostRepoUpgrade(self): self._InitHooks() def _CopyAndLinkFiles(self): if self.manifest.isGitcClient: return for copyfile in self.copyfiles: copyfile._Copy() for linkfile in self.linkfiles: linkfile._Link() def GetCommitRevisionId(self): """Get revisionId of a commit. Use this method instead of GetRevisionId to get the id of the commit rather than the id of the current git object (for example, a tag) """ if not self.revisionExpr.startswith(R_TAGS): return self.GetRevisionId(self._allrefs) try: return self.bare_git.rev_list(self.revisionExpr, '-1')[0] except GitError: raise ManifestInvalidRevisionError('revision %s in %s not found' % (self.revisionExpr, self.name)) def GetRevisionId(self, all_refs=None): if self.revisionId: return self.revisionId rem = self.GetRemote(self.remote.name) rev = rem.ToLocal(self.revisionExpr) if all_refs is not None and rev in all_refs: return all_refs[rev] try: return self.bare_git.rev_parse('--verify', '%s^0' % rev) except GitError: raise ManifestInvalidRevisionError('revision %s in %s not found' % (self.revisionExpr, self.name)) def Sync_LocalHalf(self, syncbuf, force_sync=False): """Perform only the local IO portion of the sync process. Network access is not required. 
""" self._InitWorkTree(force_sync=force_sync) all_refs = self.bare_ref.all self.CleanPublishedCache(all_refs) revid = self.GetRevisionId(all_refs) def _doff(): self._FastForward(revid) self._CopyAndLinkFiles() head = self.work_git.GetHead() if head.startswith(R_HEADS): branch = head[len(R_HEADS):] try: head = all_refs[head] except KeyError: head = None else: branch = None if branch is None or syncbuf.detach_head: # Currently on a detached HEAD. The user is assumed to # not have any local modifications worth worrying about. # if self.IsRebaseInProgress(): syncbuf.fail(self, _PriorSyncFailedError()) return if head == revid: # No changes; don't do anything further. # Except if the head needs to be detached # if not syncbuf.detach_head: # The copy/linkfile config may have changed. self._CopyAndLinkFiles() return else: lost = self._revlist(not_rev(revid), HEAD) if lost: syncbuf.info(self, "discarding %d commits", len(lost)) try: self._Checkout(revid, quiet=True) except GitError as e: syncbuf.fail(self, e) return self._CopyAndLinkFiles() return if head == revid: # No changes; don't do anything further. # # The copy/linkfile config may have changed. self._CopyAndLinkFiles() return branch = self.GetBranch(branch) if not branch.LocalMerge: # The current branch has no tracking configuration. # Jump off it to a detached HEAD. # syncbuf.info(self, "leaving %s; does not track upstream", branch.name) try: self._Checkout(revid, quiet=True) except GitError as e: syncbuf.fail(self, e) return self._CopyAndLinkFiles() return upstream_gain = self._revlist(not_rev(HEAD), revid) pub = self.WasPublished(branch.name, all_refs) if pub: not_merged = self._revlist(not_rev(revid), pub) if not_merged: if upstream_gain: # The user has published this branch and some of those # commits are not yet merged upstream. We do not want # to rewrite the published commits so we punt. # syncbuf.fail(self, "branch %s is published (but not merged) and is now " "%d commits behind" % (branch.name, len(upstream_gain))) return elif pub == head: # All published commits are merged, and thus we are a # strict subset. We can fast-forward safely. # syncbuf.later1(self, _doff) return # Examine the local commits not in the remote. Find the # last one attributed to this user, if any. # local_changes = self._revlist(not_rev(revid), HEAD, format='%H %ce') last_mine = None cnt_mine = 0 for commit in local_changes: commit_id, committer_email = commit.decode('utf-8').split(' ', 1) if committer_email == self.UserEmail: last_mine = commit_id cnt_mine += 1 if not upstream_gain and cnt_mine == len(local_changes): return if self.IsDirty(consider_untracked=False): syncbuf.fail(self, _DirtyError()) return # If the upstream switched on us, warn the user. # if branch.merge != self.revisionExpr: if branch.merge and self.revisionExpr: syncbuf.info(self, 'manifest switched %s...%s', branch.merge, self.revisionExpr) elif branch.merge: syncbuf.info(self, 'manifest no longer tracks %s', branch.merge) if cnt_mine < len(local_changes): # Upstream rebased. Not everything in HEAD # was created by this user. 
# syncbuf.info(self, "discarding %d commits removed from upstream", len(local_changes) - cnt_mine) branch.remote = self.GetRemote(self.remote.name) if not ID_RE.match(self.revisionExpr): # in case of manifest sync the revisionExpr might be a SHA1 branch.merge = self.revisionExpr if not branch.merge.startswith('refs/'): branch.merge = R_HEADS + branch.merge branch.Save() if cnt_mine > 0 and self.rebase: def _dorebase(): self._Rebase(upstream='%s^1' % last_mine, onto=revid) self._CopyAndLinkFiles() syncbuf.later2(self, _dorebase) elif local_changes: try: self._ResetHard(revid) self._CopyAndLinkFiles() except GitError as e: syncbuf.fail(self, e) return else: syncbuf.later1(self, _doff) def AddCopyFile(self, src, dest, absdest): # dest should already be an absolute path, but src is project relative # make src an absolute path abssrc = os.path.join(self.worktree, src) self.copyfiles.append(_CopyFile(src, dest, abssrc, absdest)) def AddLinkFile(self, src, dest, absdest): # dest should already be an absolute path, but src is project relative # make src relative path to dest absdestdir = os.path.dirname(absdest) relsrc = os.path.relpath(os.path.join(self.worktree, src), absdestdir) self.linkfiles.append(_LinkFile(self.worktree, src, dest, relsrc, absdest)) def AddAnnotation(self, name, value, keep): self.annotations.append(_Annotation(name, value, keep)) def DownloadPatchSet(self, change_id, patch_id): """Download a single patch set of a single change to FETCH_HEAD. """ remote = self.GetRemote(self.remote.name) cmd = ['fetch', remote.name] cmd.append('refs/changes/%2.2d/%d/%d' % (change_id % 100, change_id, patch_id)) if GitCommand(self, cmd, bare=True).Wait() != 0: return None return DownloadedChange(self, self.GetRevisionId(), change_id, patch_id, self.bare_git.rev_parse('FETCH_HEAD')) # Branch Management ## def StartBranch(self, name, branch_merge=''): """Create a new branch off the manifest's revision. """ if not branch_merge: branch_merge = self.revisionExpr head = self.work_git.GetHead() if head == (R_HEADS + name): return True all_refs = self.bare_ref.all if R_HEADS + name in all_refs: return GitCommand(self, ['checkout', name, '--'], capture_stdout=True, capture_stderr=True).Wait() == 0 branch = self.GetBranch(name) branch.remote = self.GetRemote(self.remote.name) branch.merge = branch_merge if not branch.merge.startswith('refs/') and not ID_RE.match(branch_merge): branch.merge = R_HEADS + branch_merge revid = self.GetRevisionId(all_refs) if head.startswith(R_HEADS): try: head = all_refs[head] except KeyError: head = None if revid and head and revid == head: ref = os.path.join(self.gitdir, R_HEADS + name) try: os.makedirs(os.path.dirname(ref)) except OSError: pass _lwrite(ref, '%s\n' % revid) _lwrite(os.path.join(self.worktree, '.git', HEAD), 'ref: %s%s\n' % (R_HEADS, name)) branch.Save() return True if GitCommand(self, ['checkout', '-b', branch.name, revid], capture_stdout=True, capture_stderr=True).Wait() == 0: branch.Save() return True return False def CheckoutBranch(self, name): """Checkout a local topic branch. Args: name: The name of the branch to checkout. Returns: True if the checkout succeeded; False if it didn't; None if the branch didn't exist. 
""" rev = R_HEADS + name head = self.work_git.GetHead() if head == rev: # Already on the branch # return True all_refs = self.bare_ref.all try: revid = all_refs[rev] except KeyError: # Branch does not exist in this project # return None if head.startswith(R_HEADS): try: head = all_refs[head] except KeyError: head = None if head == revid: # Same revision; just update HEAD to point to the new # target branch, but otherwise take no other action. # _lwrite(os.path.join(self.worktree, '.git', HEAD), 'ref: %s%s\n' % (R_HEADS, name)) return True return GitCommand(self, ['checkout', name, '--'], capture_stdout=True, capture_stderr=True).Wait() == 0 def AbandonBranch(self, name): """Destroy a local topic branch. Args: name: The name of the branch to abandon. Returns: True if the abandon succeeded; False if it didn't; None if the branch didn't exist. """ rev = R_HEADS + name all_refs = self.bare_ref.all if rev not in all_refs: # Doesn't exist return None head = self.work_git.GetHead() if head == rev: # We can't destroy the branch while we are sitting # on it. Switch to a detached HEAD. # head = all_refs[head] revid = self.GetRevisionId(all_refs) if head == revid: _lwrite(os.path.join(self.worktree, '.git', HEAD), '%s\n' % revid) else: self._Checkout(revid, quiet=True) return GitCommand(self, ['branch', '-D', name], capture_stdout=True, capture_stderr=True).Wait() == 0 def PruneHeads(self): """Prune any topic branches already merged into upstream. """ cb = self.CurrentBranch kill = [] left = self._allrefs for name in left.keys(): if name.startswith(R_HEADS): name = name[len(R_HEADS):] if cb is None or name != cb: kill.append(name) rev = self.GetRevisionId(left) if cb is not None \ and not self._revlist(HEAD + '...' + rev) \ and not self.IsDirty(consider_untracked=False): self.work_git.DetachHead(HEAD) kill.append(cb) if kill: old = self.bare_git.GetHead() try: self.bare_git.DetachHead(rev) b = ['branch', '-d'] b.extend(kill) b = GitCommand(self, b, bare=True, capture_stdout=True, capture_stderr=True) b.Wait() finally: if ID_RE.match(old): self.bare_git.DetachHead(old) else: self.bare_git.SetHead(old) left = self._allrefs for branch in kill: if (R_HEADS + branch) not in left: self.CleanPublishedCache() break if cb and cb not in kill: kill.append(cb) kill.sort() kept = [] for branch in kill: if R_HEADS + branch in left: branch = self.GetBranch(branch) base = branch.LocalMerge if not base: base = rev kept.append(ReviewableBranch(self, branch, base)) return kept # Submodule Management ## def GetRegisteredSubprojects(self): result = [] def rec(subprojects): if not subprojects: return result.extend(subprojects) for p in subprojects: rec(p.subprojects) rec(self.subprojects) return result def _GetSubmodules(self): # Unfortunately we cannot call `git submodule status --recursive` here # because the working tree might not exist yet, and it cannot be used # without a working tree in its current implementation. 
def get_submodules(gitdir, rev): # Parse .gitmodules for submodule sub_paths and sub_urls sub_paths, sub_urls = parse_gitmodules(gitdir, rev) if not sub_paths: return [] # Run `git ls-tree` to read SHAs of submodule object, which happen to be # revision of submodule repository sub_revs = git_ls_tree(gitdir, rev, sub_paths) submodules = [] for sub_path, sub_url in zip(sub_paths, sub_urls): try: sub_rev = sub_revs[sub_path] except KeyError: # Ignore non-exist submodules continue submodules.append((sub_rev, sub_path, sub_url)) return submodules re_path = re.compile(r'^submodule\.([^.]+)\.path=(.*)$') re_url = re.compile(r'^submodule\.([^.]+)\.url=(.*)$') def parse_gitmodules(gitdir, rev): cmd = ['cat-file', 'blob', '%s:.gitmodules' % rev] try: p = GitCommand(None, cmd, capture_stdout=True, capture_stderr=True, bare=True, gitdir=gitdir) except GitError: return [], [] if p.Wait() != 0: return [], [] gitmodules_lines = [] fd, temp_gitmodules_path = tempfile.mkstemp() try: os.write(fd, p.stdout) os.close(fd) cmd = ['config', '--file', temp_gitmodules_path, '--list'] p = GitCommand(None, cmd, capture_stdout=True, capture_stderr=True, bare=True, gitdir=gitdir) if p.Wait() != 0: return [], [] gitmodules_lines = p.stdout.split('\n') except GitError: return [], [] finally: os.remove(temp_gitmodules_path) names = set() paths = {} urls = {} for line in gitmodules_lines: if not line: continue m = re_path.match(line) if m: names.add(m.group(1)) paths[m.group(1)] = m.group(2) continue m = re_url.match(line) if m: names.add(m.group(1)) urls[m.group(1)] = m.group(2) continue names = sorted(names) return ([paths.get(name, '') for name in names], [urls.get(name, '') for name in names]) def git_ls_tree(gitdir, rev, paths): cmd = ['ls-tree', rev, '--'] cmd.extend(paths) try: p = GitCommand(None, cmd, capture_stdout=True, capture_stderr=True, bare=True, gitdir=gitdir) except GitError: return [] if p.Wait() != 0: return [] objects = {} for line in p.stdout.split('\n'): if not line.strip(): continue object_rev, object_path = line.split()[2:4] objects[object_path] = object_rev return objects try: rev = self.GetRevisionId() except GitError: return [] return get_submodules(self.gitdir, rev) def GetDerivedSubprojects(self): result = [] if not self.Exists: # If git repo does not exist yet, querying its submodules will # mess up its states; so return here. return result for rev, path, url in self._GetSubmodules(): name = self.manifest.GetSubprojectName(self, path) relpath, worktree, gitdir, objdir = \ self.manifest.GetSubprojectPaths(self, name, path) project = self.manifest.paths.get(relpath) if project: result.extend(project.GetDerivedSubprojects()) continue remote = RemoteSpec(self.remote.name, url=url, pushUrl=self.remote.pushUrl, review=self.remote.review, revision=self.remote.revision) subproject = Project(manifest=self.manifest, name=name, remote=remote, gitdir=gitdir, objdir=objdir, worktree=worktree, relpath=relpath, revisionExpr=rev, revisionId=rev, rebase=self.rebase, groups=self.groups, sync_c=self.sync_c, sync_s=self.sync_s, parent=self, is_derived=True) result.append(subproject) result.extend(subproject.GetDerivedSubprojects()) return result # Direct Git Commands ## def _CheckForSha1(self): try: # if revision (sha or tag) is not present then following function # throws an error. self.bare_git.rev_parse('--verify', '%s^0' % self.revisionExpr) return True except GitError: # There is no such persistent revision. We have to fetch it. 
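# [Editor's note] A hedged sketch of the line-matching step inside
# parse_gitmodules above: once the committed .gitmodules blob has been
# flattened with `git config --file <tmpfile> --list`, each line looks like
# `submodule.<name>.path=<path>` or `submodule.<name>.url=<url>`, and the two
# regexes pair paths with urls by submodule name. Here the result is paired
# into (name, path, url) tuples rather than the two parallel lists the real
# helper returns; the sample input is invented.
import re

_EDITOR_RE_PATH = re.compile(r'^submodule\.([^.]+)\.path=(.*)$')
_EDITOR_RE_URL = re.compile(r'^submodule\.([^.]+)\.url=(.*)$')

def _editor_pair_submodules(config_lines):
  paths, urls = {}, {}
  for line in config_lines:
    m = _EDITOR_RE_PATH.match(line)
    if m:
      paths[m.group(1)] = m.group(2)
      continue
    m = _EDITOR_RE_URL.match(line)
    if m:
      urls[m.group(1)] = m.group(2)
  return [(name, paths[name], urls.get(name, ''))
          for name in sorted(paths)]

# _editor_pair_submodules(['submodule.libfoo.path=third_party/libfoo',
#                          'submodule.libfoo.url=https://example.com/libfoo'])
#   -> [('libfoo', 'third_party/libfoo', 'https://example.com/libfoo')]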
return False def _FetchArchive(self, tarpath, cwd=None): cmd = ['archive', '-v', '-o', tarpath] cmd.append('--remote=%s' % self.remote.url) cmd.append('--prefix=%s/' % self.relpath) cmd.append(self.revisionExpr) command = GitCommand(self, cmd, cwd=cwd, capture_stdout=True, capture_stderr=True) if command.Wait() != 0: raise GitError('git archive %s: %s' % (self.name, command.stderr)) def _RemoteFetch(self, name=None, current_branch_only=False, initial=False, quiet=False, alt_dir=None, no_tags=False, prune=False): is_sha1 = False tag_name = None depth = None # The depth should not be used when fetching to a mirror because # it will result in a shallow repository that cannot be cloned or # fetched from. if not self.manifest.IsMirror: if self.clone_depth: depth = self.clone_depth else: depth = self.manifest.manifestProject.config.GetString('repo.depth') # The repo project should never be synced with partial depth if self.relpath == '.repo/repo': depth = None if depth: current_branch_only = True if ID_RE.match(self.revisionExpr) is not None: is_sha1 = True if current_branch_only: if self.revisionExpr.startswith(R_TAGS): # this is a tag and its sha1 value should never change tag_name = self.revisionExpr[len(R_TAGS):] if is_sha1 or tag_name is not None: if self._CheckForSha1(): return True if is_sha1 and not depth: # When syncing a specific commit and --depth is not set: # * if upstream is explicitly specified and is not a sha1, fetch only # upstream as users expect only upstream to be fetch. # Note: The commit might not be in upstream in which case the sync # will fail. # * otherwise, fetch all branches to make sure we end up with the # specific commit. if self.upstream: current_branch_only = not ID_RE.match(self.upstream) else: current_branch_only = False if not name: name = self.remote.name ssh_proxy = False remote = self.GetRemote(name) if remote.PreConnectFetch(): ssh_proxy = True if initial: if alt_dir and 'objects' == os.path.basename(alt_dir): ref_dir = os.path.dirname(alt_dir) packed_refs = os.path.join(self.gitdir, 'packed-refs') remote = self.GetRemote(name) all_refs = self.bare_ref.all ids = set(all_refs.values()) tmp = set() for r, ref_id in GitRefs(ref_dir).all.items(): if r not in all_refs: if r.startswith(R_TAGS) or remote.WritesTo(r): all_refs[r] = ref_id ids.add(ref_id) continue if ref_id in ids: continue r = 'refs/_alt/%s' % ref_id all_refs[r] = ref_id ids.add(ref_id) tmp.add(r) tmp_packed = '' old_packed = '' for r in sorted(all_refs): line = '%s %s\n' % (all_refs[r], r) tmp_packed += line if r not in tmp: old_packed += line _lwrite(packed_refs, tmp_packed) else: alt_dir = None cmd = ['fetch'] if depth: cmd.append('--depth=%s' % depth) else: # If this repo has shallow objects, then we don't know which refs have # shallow objects or not. Tell git to unshallow all fetched refs. Don't # do this with projects that don't have shallow objects, since it is less # efficient. if os.path.exists(os.path.join(self.gitdir, 'shallow')): cmd.append('--depth=2147483647') if quiet: cmd.append('--quiet') if not self.worktree: cmd.append('--update-head-ok') cmd.append(name) # If using depth then we should not get all the tags since they may # be outside of the depth. 
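# [Editor's note] A hedged sketch of the shallow-fetch decision at the top of
# _RemoteFetch above: mirrors are never shallow, a per-project clone-depth
# overrides the global repo.depth setting, the repo tool's own checkout
# (.repo/repo) is never shallow, and any depth forces a current-branch-only
# fetch. The plain arguments stand in for the manifest / git-config lookups in
# the real code.
def _editor_effective_depth(is_mirror, clone_depth, repo_depth, relpath):
  if is_mirror:
    return None, False          # a shallow mirror cannot be cloned from
  depth = clone_depth or repo_depth
  if relpath == '.repo/repo':
    depth = None                # never sync repo itself with partial depth
  return depth, bool(depth)     # (depth, current_branch_only forced?)

# _editor_effective_depth(False, None, '1', 'platform/build')
#   -> ('1', True)
# _editor_effective_depth(False, None, '1', '.repo/repo')
#   -> (None, False)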
if no_tags or depth: cmd.append('--no-tags') else: cmd.append('--tags') if prune: cmd.append('--prune') spec = [] if not current_branch_only: # Fetch whole repo spec.append(str((u'+refs/heads/*:') + remote.ToLocal('refs/heads/*'))) elif tag_name is not None: spec.append('tag') spec.append(tag_name) if not self.manifest.IsMirror: branch = self.revisionExpr if is_sha1 and depth and git_require((1, 8, 3)): # Shallow checkout of a specific commit, fetch from that commit and not # the heads only as the commit might be deeper in the history. spec.append(branch) else: if is_sha1: branch = self.upstream if branch is not None and branch.strip(): if not branch.startswith('refs/'): branch = R_HEADS + branch spec.append(str((u'+%s:' % branch) + remote.ToLocal(branch))) cmd.extend(spec) ok = False for _i in range(2): gitcmd = GitCommand(self, cmd, bare=True, ssh_proxy=ssh_proxy) ret = gitcmd.Wait() if ret == 0: ok = True break # If needed, run the 'git remote prune' the first time through the loop elif (not _i and "error:" in gitcmd.stderr and "git remote prune" in gitcmd.stderr): prunecmd = GitCommand(self, ['remote', 'prune', name], bare=True, ssh_proxy=ssh_proxy) ret = prunecmd.Wait() if ret: break continue elif current_branch_only and is_sha1 and ret == 128: # Exit code 128 means "couldn't find the ref you asked for"; if we're # in sha1 mode, we just tried sync'ing from the upstream field; it # doesn't exist, thus abort the optimization attempt and do a full sync. break elif ret < 0: # Git died with a signal, exit immediately break time.sleep(random.randint(30, 45)) if initial: if alt_dir: if old_packed != '': _lwrite(packed_refs, old_packed) else: os.remove(packed_refs) self.bare_git.pack_refs('--all', '--prune') if is_sha1 and current_branch_only and self.upstream: # We just synced the upstream given branch; verify we # got what we wanted, else trigger a second run of all # refs. 
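# --- Illustrative sketch (not part of the original source): the refspec choices the
# fetch above boils down to. `remote_name`, `branch` and `tag` are hypothetical
# inputs, and the refs/remotes/<name>/* mapping mirrors what remote.ToLocal()
# produces for a standard fetch configuration.
def build_fetch_spec(remote_name, branch=None, tag=None, current_branch_only=True):
    if not current_branch_only:
        # Mirror every branch into refs/remotes/<name>/*.
        return ['+refs/heads/*:refs/remotes/%s/*' % remote_name]
    if tag:
        # `git fetch <remote> tag <name>` fetches a single tag.
        return ['tag', tag]
    if not branch:
        return []
    if not branch.startswith('refs/'):
        branch = 'refs/heads/' + branch
    local = branch.replace('refs/heads/', 'refs/remotes/%s/' % remote_name, 1)
    return ['+%s:%s' % (branch, local)]

# build_fetch_spec('origin', branch='master')
#   -> ['+refs/heads/master:refs/remotes/origin/master']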
if not self._CheckForSha1(): if not depth: # Avoid infinite recursion when depth is True (since depth implies # current_branch_only) return self._RemoteFetch(name=name, current_branch_only=False, initial=False, quiet=quiet, alt_dir=alt_dir) if self.clone_depth: self.clone_depth = None return self._RemoteFetch(name=name, current_branch_only=current_branch_only, initial=False, quiet=quiet, alt_dir=alt_dir) return ok def _ApplyCloneBundle(self, initial=False, quiet=False): if initial and \ (self.manifest.manifestProject.config.GetString('repo.depth') or self.clone_depth): return False remote = self.GetRemote(self.remote.name) bundle_url = remote.url + '/clone.bundle' bundle_url = GitConfig.ForUser().UrlInsteadOf(bundle_url) if GetSchemeFromUrl(bundle_url) not in ('http', 'https', 'persistent-http', 'persistent-https'): return False bundle_dst = os.path.join(self.gitdir, 'clone.bundle') bundle_tmp = os.path.join(self.gitdir, 'clone.bundle.tmp') exist_dst = os.path.exists(bundle_dst) exist_tmp = os.path.exists(bundle_tmp) if not initial and not exist_dst and not exist_tmp: return False if not exist_dst: exist_dst = self._FetchBundle(bundle_url, bundle_tmp, bundle_dst, quiet) if not exist_dst: return False cmd = ['fetch'] if quiet: cmd.append('--quiet') if not self.worktree: cmd.append('--update-head-ok') cmd.append(bundle_dst) for f in remote.fetch: cmd.append(str(f)) cmd.append('refs/tags/*:refs/tags/*') ok = GitCommand(self, cmd, bare=True).Wait() == 0 if os.path.exists(bundle_dst): os.remove(bundle_dst) if os.path.exists(bundle_tmp): os.remove(bundle_tmp) return ok def _FetchBundle(self, srcUrl, tmpPath, dstPath, quiet): if os.path.exists(dstPath): os.remove(dstPath) cmd = ['curl', '--fail', '--output', tmpPath, '--netrc', '--location'] if quiet: cmd += ['--silent'] if os.path.exists(tmpPath): size = os.stat(tmpPath).st_size if size >= 1024: cmd += ['--continue-at', '%d' % (size,)] else: os.remove(tmpPath) if 'http_proxy' in os.environ and 'darwin' == sys.platform: cmd += ['--proxy', os.environ['http_proxy']] with GetUrlCookieFile(srcUrl, quiet) as (cookiefile, _proxy): if cookiefile: cmd += ['--cookie', cookiefile, '--cookie-jar', cookiefile] if srcUrl.startswith('persistent-'): srcUrl = srcUrl[len('persistent-'):] cmd += [srcUrl] if IsTrace(): Trace('%s', ' '.join(cmd)) try: proc = subprocess.Popen(cmd) except OSError: return False curlret = proc.wait() if curlret == 22: # From curl man page: # 22: HTTP page not retrieved. The requested url was not found or # returned another error with the HTTP error code being 400 or above. # This return code only appears if -f, --fail is used. 
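# --- Illustrative sketch (not part of the original source): the resumable curl
# invocation _FetchBundle() above builds, with the cookie and proxy handling left
# out. `url` and `tmp_path` are hypothetical.
import os
import subprocess

def fetch_bundle(url, tmp_path):
    cmd = ['curl', '--fail', '--location', '--netrc', '--output', tmp_path]
    if os.path.exists(tmp_path):
        size = os.stat(tmp_path).st_size
        if size >= 1024:
            # Resume a partially downloaded bundle instead of starting over.
            cmd += ['--continue-at', '%d' % size]
        else:
            os.remove(tmp_path)
    cmd.append(url)
    # With --fail, curl exits 22 when the server has no clone.bundle (HTTP >= 400).
    return subprocess.call(cmd) == 0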
if not quiet: print("Server does not provide clone.bundle; ignoring.", file=sys.stderr) return False if os.path.exists(tmpPath): if curlret == 0 and self._IsValidBundle(tmpPath, quiet): os.rename(tmpPath, dstPath) return True else: os.remove(tmpPath) return False else: return False def _IsValidBundle(self, path, quiet): try: with open(path) as f: if f.read(16) == '# v2 git bundle\n': return True else: if not quiet: print("Invalid clone.bundle file; ignoring.", file=sys.stderr) return False except OSError: return False def _Checkout(self, rev, quiet=False): cmd = ['checkout'] if quiet: cmd.append('-q') cmd.append(rev) cmd.append('--') if GitCommand(self, cmd).Wait() != 0: if self._allrefs: raise GitError('%s checkout %s ' % (self.name, rev)) def _CherryPick(self, rev): cmd = ['cherry-pick'] cmd.append(rev) cmd.append('--') if GitCommand(self, cmd).Wait() != 0: if self._allrefs: raise GitError('%s cherry-pick %s ' % (self.name, rev)) def _Revert(self, rev): cmd = ['revert'] cmd.append('--no-edit') cmd.append(rev) cmd.append('--') if GitCommand(self, cmd).Wait() != 0: if self._allrefs: raise GitError('%s revert %s ' % (self.name, rev)) def _ResetHard(self, rev, quiet=True): cmd = ['reset', '--hard'] if quiet: cmd.append('-q') cmd.append(rev) if GitCommand(self, cmd).Wait() != 0: raise GitError('%s reset --hard %s ' % (self.name, rev)) def _Rebase(self, upstream, onto=None): cmd = ['rebase'] if onto is not None: cmd.extend(['--onto', onto]) cmd.append(upstream) if GitCommand(self, cmd).Wait() != 0: raise GitError('%s rebase %s ' % (self.name, upstream)) def _FastForward(self, head, ffonly=False): cmd = ['merge', head] if ffonly: cmd.append("--ff-only") if GitCommand(self, cmd).Wait() != 0: raise GitError('%s merge %s ' % (self.name, head)) def _InitGitDir(self, mirror_git=None, force_sync=False): init_git_dir = not os.path.exists(self.gitdir) init_obj_dir = not os.path.exists(self.objdir) try: # Initialize the bare repository, which contains all of the objects. if init_obj_dir: os.makedirs(self.objdir) self.bare_objdir.init() # If we have a separate directory to hold refs, initialize it as well. 
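# --- Illustrative sketch (not part of the original source): _IsValidBundle() above
# only checks the 16-byte "# v2 git bundle" signature. A heavier alternative, shown
# here as an assumption rather than what repo does, is to let git validate the file;
# note `git bundle verify` is stricter, since it also requires the bundle's
# prerequisite commits to exist locally.
import subprocess

def bundle_is_valid(gitdir, path):
    return subprocess.call(
        ['git', '--git-dir', gitdir, 'bundle', 'verify', path]) == 0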
if self.objdir != self.gitdir: if init_git_dir: os.makedirs(self.gitdir) if init_obj_dir or init_git_dir: self._ReferenceGitDir(self.objdir, self.gitdir, share_refs=False, copy_all=True) try: self._CheckDirReference(self.objdir, self.gitdir, share_refs=False) except GitError as e: if force_sync: print("Retrying clone after deleting %s" % self.gitdir, file=sys.stderr) try: shutil.rmtree(os.path.realpath(self.gitdir)) if self.worktree and os.path.exists(os.path.realpath (self.worktree)): shutil.rmtree(os.path.realpath(self.worktree)) return self._InitGitDir(mirror_git=mirror_git, force_sync=False) except: raise e raise e if init_git_dir: mp = self.manifest.manifestProject ref_dir = mp.config.GetString('repo.reference') or '' if ref_dir or mirror_git: if not mirror_git: mirror_git = os.path.join(ref_dir, self.name + '.git') repo_git = os.path.join(ref_dir, '.repo', 'projects', self.relpath + '.git') if os.path.exists(mirror_git): ref_dir = mirror_git elif os.path.exists(repo_git): ref_dir = repo_git else: ref_dir = None if ref_dir: _lwrite(os.path.join(self.gitdir, 'objects/info/alternates'), os.path.join(ref_dir, 'objects') + '\n') self._UpdateHooks() m = self.manifest.manifestProject.config for key in ['user.name', 'user.email']: if m.Has(key, include_defaults=False): self.config.SetString(key, m.GetString(key)) self.config.SetString('filter.lfs.smudge', 'git-lfs smudge --skip -- %f') if self.manifest.IsMirror: self.config.SetString('core.bare', 'true') else: self.config.SetString('core.bare', None) except Exception: if init_obj_dir and os.path.exists(self.objdir): shutil.rmtree(self.objdir) if init_git_dir and os.path.exists(self.gitdir): shutil.rmtree(self.gitdir) raise def _UpdateHooks(self): if os.path.exists(self.gitdir): self._InitHooks() def _InitHooks(self): hooks = os.path.realpath(self._gitdir_path('hooks')) if not os.path.exists(hooks): os.makedirs(hooks) for stock_hook in _ProjectHooks(): name = os.path.basename(stock_hook) if name in ('commit-msg',) and not self.remote.review \ and self is not self.manifest.manifestProject: # Don't install a Gerrit Code Review hook if this # project does not appear to use it for reviews. 
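# --- Illustrative sketch (not part of the original source): the effect of the
# `repo.reference` / mirror handling above. Writing the mirror's objects directory
# into objects/info/alternates lets git borrow its objects instead of downloading
# them again. Paths are hypothetical.
import os

def add_alternate(gitdir, mirror_git):
    info_dir = os.path.join(gitdir, 'objects', 'info')
    if not os.path.isdir(info_dir):
        os.makedirs(info_dir)
    with open(os.path.join(info_dir, 'alternates'), 'w') as f:
        f.write(os.path.join(mirror_git, 'objects') + '\n')

# add_alternate('/src/.repo/projects/foo.git', '/mirrors/foo.git')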
# # Since the manifest project is one of those, but also # managed through gerrit, it's excluded continue dst = os.path.join(hooks, name) if os.path.islink(dst): continue if os.path.exists(dst): if filecmp.cmp(stock_hook, dst, shallow=False): os.remove(dst) else: _warn("%s: Not replacing locally modified %s hook", self.relpath, name) continue try: os.symlink(os.path.relpath(stock_hook, os.path.dirname(dst)), dst) except OSError as e: if e.errno == errno.EPERM: raise GitError('filesystem must support symlinks') else: raise def _InitRemote(self): if self.remote.url: remote = self.GetRemote(self.remote.name) remote.url = self.remote.url remote.pushUrl = self.remote.pushUrl remote.review = self.remote.review remote.projectname = self.name if self.worktree: remote.ResetFetch(mirror=False) else: remote.ResetFetch(mirror=True) remote.Save() def _InitMRef(self): if self.manifest.branch: self._InitAnyMRef(R_M + self.manifest.branch) def _InitMirrorHead(self): self._InitAnyMRef(HEAD) def _InitAnyMRef(self, ref): cur = self.bare_ref.symref(ref) if self.revisionId: if cur != '' or self.bare_ref.get(ref) != self.revisionId: msg = 'manifest set to %s' % self.revisionId dst = self.revisionId + '^0' self.bare_git.UpdateRef(ref, dst, message=msg, detach=True) else: remote = self.GetRemote(self.remote.name) dst = remote.ToLocal(self.revisionExpr) if cur != dst: msg = 'manifest set to %s' % self.revisionExpr self.bare_git.symbolic_ref('-m', msg, ref, dst) def _CheckDirReference(self, srcdir, destdir, share_refs): symlink_files = self.shareable_files[:] symlink_dirs = self.shareable_dirs[:] if share_refs: symlink_files += self.working_tree_files symlink_dirs += self.working_tree_dirs to_symlink = symlink_files + symlink_dirs for name in set(to_symlink): dst = os.path.realpath(os.path.join(destdir, name)) if os.path.lexists(dst): src = os.path.realpath(os.path.join(srcdir, name)) # Fail if the links are pointing to the wrong place if src != dst: raise GitError('--force-sync not enabled; cannot overwrite a local ' 'work tree. If you\'re comfortable with the ' 'possibility of losing the work tree\'s git metadata,' ' use `repo sync --force-sync {0}` to ' 'proceed.'.format(self.relpath)) def _ReferenceGitDir(self, gitdir, dotgit, share_refs, copy_all): """Update |dotgit| to reference |gitdir|, using symlinks where possible. Args: gitdir: The bare git repository. Must already be initialized. dotgit: The repository you would like to initialize. share_refs: If true, |dotgit| will store its refs under |gitdir|. Only one work tree can store refs under a given |gitdir|. copy_all: If true, copy all remaining files from |gitdir| -> |dotgit|. This saves you the effort of initializing |dotgit| yourself. """ symlink_files = self.shareable_files[:] symlink_dirs = self.shareable_dirs[:] if share_refs: symlink_files += self.working_tree_files symlink_dirs += self.working_tree_dirs to_symlink = symlink_files + symlink_dirs to_copy = [] if copy_all: to_copy = os.listdir(gitdir) dotgit = os.path.realpath(dotgit) for name in set(to_copy).union(to_symlink): try: src = os.path.realpath(os.path.join(gitdir, name)) dst = os.path.join(dotgit, name) if os.path.lexists(dst): continue # If the source dir doesn't exist, create an empty dir. if name in symlink_dirs and not os.path.lexists(src): os.makedirs(src) # If the source file doesn't exist, ensure the destination # file doesn't either. 
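# --- Illustrative sketch (not part of the original source): the per-hook decision
# _InitHooks() above makes -- symlink the stock hook, but never clobber a hook the
# user has edited. `stock_hook` and `hooks_dir` are hypothetical paths.
import filecmp
import os

def install_hook(stock_hook, hooks_dir):
    dst = os.path.join(hooks_dir, os.path.basename(stock_hook))
    if os.path.islink(dst):
        return                                   # already a symlink; leave it
    if os.path.exists(dst):
        if not filecmp.cmp(stock_hook, dst, shallow=False):
            return                               # locally modified; do not replace
        os.remove(dst)                           # identical copy; swap in a symlink
    os.symlink(os.path.relpath(stock_hook, hooks_dir), dst)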
if name in symlink_files and not os.path.lexists(src): try: os.remove(dst) except OSError: pass if name in to_symlink: os.symlink(os.path.relpath(src, os.path.dirname(dst)), dst) elif copy_all and not os.path.islink(dst): if os.path.isdir(src): shutil.copytree(src, dst) elif os.path.isfile(src): shutil.copy(src, dst) except OSError as e: if e.errno == errno.EPERM: raise DownloadError('filesystem must support symlinks') else: raise def _InitWorkTree(self, force_sync=False): dotgit = os.path.join(self.worktree, '.git') init_dotgit = not os.path.exists(dotgit) try: if init_dotgit: os.makedirs(dotgit) self._ReferenceGitDir(self.gitdir, dotgit, share_refs=True, copy_all=False) try: self._CheckDirReference(self.gitdir, dotgit, share_refs=True) except GitError as e: if force_sync: try: shutil.rmtree(dotgit) return self._InitWorkTree(force_sync=False) except: raise e raise e if init_dotgit: _lwrite(os.path.join(dotgit, HEAD), '%s\n' % self.GetRevisionId()) cmd = ['read-tree', '--reset', '-u'] cmd.append('-v') cmd.append(HEAD) if GitCommand(self, cmd).Wait() != 0: raise GitError("cannot initialize work tree") self._CopyAndLinkFiles() except Exception: if init_dotgit: shutil.rmtree(dotgit) raise def _gitdir_path(self, path): return os.path.realpath(os.path.join(self.gitdir, path)) def _revlist(self, *args, **kw): a = [] a.extend(args) a.append('--') return self.work_git.rev_list(*a, **kw) @property def _allrefs(self): return self.bare_ref.all def _getLogs(self, rev1, rev2, oneline=False, color=True, pretty_format=None): """Get logs between two revisions of this project.""" comp = '..' if rev1: revs = [rev1] if rev2: revs.extend([comp, rev2]) cmd = ['log', ''.join(revs)] out = DiffColoring(self.config) if out.is_on and color: cmd.append('--color') if pretty_format is not None: cmd.append('--pretty=format:%s' % pretty_format) if oneline: cmd.append('--oneline') try: log = GitCommand(self, cmd, capture_stdout=True, capture_stderr=True) if log.Wait() == 0: return log.stdout except GitError: # worktree may not exist if groups changed for example. In that case, # try in gitdir instead. 
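# --- Illustrative sketch (not part of the original source): the sharing layout
# _ReferenceGitDir() above sets up between a project's bare gitdir and a worktree's
# .git directory. The directory list here is a representative subset, not the exact
# shareable_dirs value.
import os

SHARED_DIRS = ['hooks', 'objects', 'rr-cache']

def link_shared(gitdir, dotgit):
    for name in SHARED_DIRS:
        src = os.path.join(gitdir, name)
        dst = os.path.join(dotgit, name)
        if os.path.lexists(dst):
            continue                              # already linked or copied
        if not os.path.lexists(src):
            os.makedirs(src)                      # create the shared dir on first use
        os.symlink(os.path.relpath(src, os.path.dirname(dst)), dst)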
if not os.path.exists(self.worktree): return self.bare_git.log(*cmd[1:]) else: raise return None def getAddedAndRemovedLogs(self, toProject, oneline=False, color=True, pretty_format=None): """Get the list of logs from this revision to given revisionId""" logs = {} selfId = self.GetRevisionId(self._allrefs) toId = toProject.GetRevisionId(toProject._allrefs) logs['added'] = self._getLogs(selfId, toId, oneline=oneline, color=color, pretty_format=pretty_format) logs['removed'] = self._getLogs(toId, selfId, oneline=oneline, color=color, pretty_format=pretty_format) return logs class _GitGetByExec(object): def __init__(self, project, bare, gitdir): self._project = project self._bare = bare self._gitdir = gitdir def LsOthers(self): p = GitCommand(self._project, ['ls-files', '-z', '--others', '--exclude-standard'], bare=False, gitdir=self._gitdir, capture_stdout=True, capture_stderr=True) if p.Wait() == 0: out = p.stdout if out: # Backslash is not anomalous return out[:-1].split('\0') # pylint: disable=W1401 return [] def DiffZ(self, name, *args): cmd = [name] cmd.append('-z') cmd.extend(args) p = GitCommand(self._project, cmd, gitdir=self._gitdir, bare=False, capture_stdout=True, capture_stderr=True) try: out = p.process.stdout.read() r = {} if out: out = iter(out[:-1].split('\0')) # pylint: disable=W1401 while out: try: info = next(out) path = next(out) except StopIteration: break class _Info(object): def __init__(self, path, omode, nmode, oid, nid, state): self.path = path self.src_path = None self.old_mode = omode self.new_mode = nmode self.old_id = oid self.new_id = nid if len(state) == 1: self.status = state self.level = None else: self.status = state[:1] self.level = state[1:] while self.level.startswith('0'): self.level = self.level[1:] info = info[1:].split(' ') info = _Info(path, *info) if info.status in ('R', 'C'): info.src_path = info.path info.path = next(out) r[info.path] = info return r finally: p.Wait() def GetHead(self): if self._bare: path = os.path.join(self._project.gitdir, HEAD) else: path = os.path.join(self._project.worktree, '.git', HEAD) try: fd = open(path, 'rb') except IOError as e: raise NoManifestException(path, str(e)) try: line = fd.read() finally: fd.close() try: line = line.decode() except AttributeError: pass if line.startswith('ref: '): return line[5:-1] return line[:-1] def SetHead(self, ref, message=None): cmdv = [] if message is not None: cmdv.extend(['-m', message]) cmdv.append(HEAD) cmdv.append(ref) self.symbolic_ref(*cmdv) def DetachHead(self, new, message=None): cmdv = ['--no-deref'] if message is not None: cmdv.extend(['-m', message]) cmdv.append(HEAD) cmdv.append(new) self.update_ref(*cmdv) def UpdateRef(self, name, new, old=None, message=None, detach=False): cmdv = [] if message is not None: cmdv.extend(['-m', message]) if detach: cmdv.append('--no-deref') cmdv.append(name) cmdv.append(new) if old is not None: cmdv.append(old) self.update_ref(*cmdv) def DeleteRef(self, name, old=None): if not old: old = self.rev_parse(name) self.update_ref('-d', name, old) self._project.bare_ref.deleted(name) def rev_list(self, *args, **kw): if 'format' in kw: cmdv = ['log', '--pretty=format:%s' % kw['format']] else: cmdv = ['rev-list'] cmdv.extend(args) p = GitCommand(self._project, cmdv, bare=self._bare, gitdir=self._gitdir, capture_stdout=True, capture_stderr=True) r = [] for line in p.process.stdout: if line[-1] == '\n': line = line[:-1] r.append(line) if p.Wait() != 0: raise GitError('%s rev-list %s: %s' % (self._project.name, str(args), p.stderr)) return r def 
__getattr__(self, name): """Allow arbitrary git commands using pythonic syntax. This allows you to do things like: git_obj.rev_parse('HEAD') Since we don't have a 'rev_parse' method defined, the __getattr__ will run. We'll replace the '_' with a '-' and try to run a git command. Any other positional arguments will be passed to the git command, and the following keyword arguments are supported: config: An optional dict of git config options to be passed with '-c'. Args: name: The name of the git command to call. Any '_' characters will be replaced with '-'. Returns: A callable object that will try to call git with the named command. """ name = name.replace('_', '-') def runner(*args, **kwargs): cmdv = [] config = kwargs.pop('config', None) for k in kwargs: raise TypeError('%s() got an unexpected keyword argument %r' % (name, k)) if config is not None: if not git_require((1, 7, 2)): raise ValueError('cannot set config on command line for %s()' % name) for k, v in config.items(): cmdv.append('-c') cmdv.append('%s=%s' % (k, v)) cmdv.append(name) cmdv.extend(args) p = GitCommand(self._project, cmdv, bare=self._bare, gitdir=self._gitdir, capture_stdout=True, capture_stderr=True) if p.Wait() != 0: raise GitError('%s %s: %s' % (self._project.name, name, p.stderr)) r = p.stdout try: r = r.decode('utf-8') except AttributeError: pass if r.endswith('\n') and r.index('\n') == len(r) - 1: return r[:-1] return r return runner class _PriorSyncFailedError(Exception): def __str__(self): return 'prior sync failed; rebase still in progress' class _DirtyError(Exception): def __str__(self): return 'contains uncommitted changes' class _InfoMessage(object): def __init__(self, project, text): self.project = project self.text = text def Print(self, syncbuf): syncbuf.out.info('%s/: %s', self.project.relpath, self.text) syncbuf.out.nl() class _Failure(object): def __init__(self, project, why): self.project = project self.why = why def Print(self, syncbuf): syncbuf.out.fail('error: %s/: %s', self.project.relpath, str(self.why)) syncbuf.out.nl() class _Later(object): def __init__(self, project, action): self.project = project self.action = action def Run(self, syncbuf): out = syncbuf.out out.project('project %s/', self.project.relpath) out.nl() try: self.action() out.nl() return True except GitError: out.nl() return False class _SyncColoring(Coloring): def __init__(self, config): Coloring.__init__(self, config, 'reposync') self.project = self.printer('header', attr='bold') self.info = self.printer('info') self.fail = self.printer('fail', fg='red') class SyncBuffer(object): def __init__(self, config, detach_head=False): self._messages = [] self._failures = [] self._later_queue1 = [] self._later_queue2 = [] self.out = _SyncColoring(config) self.out.redirect(sys.stderr) self.detach_head = detach_head self.clean = True def info(self, project, fmt, *args): self._messages.append(_InfoMessage(project, fmt % args)) def fail(self, project, err=None): self._failures.append(_Failure(project, err)) self.clean = False def later1(self, project, what): self._later_queue1.append(_Later(project, what)) def later2(self, project, what): self._later_queue2.append(_Later(project, what)) def Finish(self): self._PrintMessages() self._RunLater() self._PrintMessages() return self.clean def _RunLater(self): for q in ['_later_queue1', '_later_queue2']: if not self._RunQueue(q): return def _RunQueue(self, queue): for m in getattr(self, queue): if not m.Run(self): self.clean = False return False setattr(self, queue, []) return True def 
_PrintMessages(self): for m in self._messages: m.Print(self) for m in self._failures: m.Print(self) self._messages = [] self._failures = [] class MetaProject(Project): """A special project housed under .repo. """ def __init__(self, manifest, name, gitdir, worktree): Project.__init__(self, manifest=manifest, name=name, gitdir=gitdir, objdir=gitdir, worktree=worktree, remote=RemoteSpec('origin'), relpath='.repo/%s' % name, revisionExpr='refs/heads/master', revisionId=None, groups=None) def PreSync(self): if self.Exists: cb = self.CurrentBranch if cb: base = self.GetBranch(cb).merge if base: self.revisionExpr = base self.revisionId = None def MetaBranchSwitch(self): """ Prepare MetaProject for manifest branch switch """ # detach and delete manifest branch, allowing a new # branch to take over syncbuf = SyncBuffer(self.config, detach_head=True) self.Sync_LocalHalf(syncbuf) syncbuf.Finish() return GitCommand(self, ['update-ref', '-d', 'refs/heads/default'], capture_stdout=True, capture_stderr=True).Wait() == 0 @property def LastFetch(self): try: fh = os.path.join(self.gitdir, 'FETCH_HEAD') return os.path.getmtime(fh) except OSError: return 0 @property def HasChanges(self): """Has the remote received new commits not yet checked out? """ if not self.remote or not self.revisionExpr: return False all_refs = self.bare_ref.all revid = self.GetRevisionId(all_refs) head = self.work_git.GetHead() if head.startswith(R_HEADS): try: head = all_refs[head] except KeyError: head = None if revid == head: return False elif self._revlist(not_rev(HEAD), revid): return True return False pyversion.py0100644 0000000 0000000 00000001233 13025567015 012220 0ustar000000000 0000000 # # Copyright (C) 2013 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. import sys def is_python3(): return sys.version_info[0] == 3 repo0100755 0000000 0000000 00000066276 13025567015 010525 0ustar000000000 0000000 #!/usr/bin/env python # repo default configuration # import os REPO_URL = os.environ.get('REPO_URL', None) if not REPO_URL: REPO_URL = 'https://gerrit.googlesource.com/git-repo' REPO_REV = 'stable' # Copyright (C) 2008 Google Inc. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
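# --- Illustrative sketch (not part of the original source): the question
# MetaProject.HasChanges above answers -- "does the pinned revision contain commits
# that HEAD has not merged yet?". `gitdir`, `revid` and `head_id` are hypothetical.
import subprocess

def has_changes(gitdir, revid, head_id):
    if revid == head_id:
        return False
    # rev-list ^HEAD <revid>: commits reachable from revid but not from HEAD.
    out = subprocess.check_output(
        ['git', '--git-dir', gitdir, 'rev-list', '^' + head_id, revid])
    return bool(out.strip())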
# increment this whenever we make important changes to this script VERSION = (1, 23) # increment this if the MAINTAINER_KEYS block is modified KEYRING_VERSION = (1, 2) # Each individual key entry is created by using: # gpg --armor --export keyid MAINTAINER_KEYS = """ Repo Maintainer -----BEGIN PGP PUBLIC KEY BLOCK----- Version: GnuPG v1.4.2.2 (GNU/Linux) mQGiBEj3ugERBACrLJh/ZPyVSKeClMuznFIrsQ+hpNnmJGw1a9GXKYKk8qHPhAZf WKtrBqAVMNRLhL85oSlekRz98u41H5si5zcuv+IXJDF5MJYcB8f22wAy15lUqPWi VCkk1l8qqLiuW0fo+ZkPY5qOgrvc0HW1SmdH649uNwqCbcKb6CxaTxzhOwCgj3AP xI1WfzLqdJjsm1Nq98L0cLcD/iNsILCuw44PRds3J75YP0pze7YF/6WFMB6QSFGu aUX1FsTTztKNXGms8i5b2l1B8JaLRWq/jOnZzyl1zrUJhkc0JgyZW5oNLGyWGhKD Fxp5YpHuIuMImopWEMFIRQNrvlg+YVK8t3FpdI1RY0LYqha8pPzANhEYgSfoVzOb fbfbA/4ioOrxy8ifSoga7ITyZMA+XbW8bx33WXutO9N7SPKS/AK2JpasSEVLZcON ae5hvAEGVXKxVPDjJBmIc2cOe7kOKSi3OxLzBqrjS2rnjiP4o0ekhZIe4+ocwVOg e0PLlH5avCqihGRhpoqDRsmpzSHzJIxtoeb+GgGEX8KkUsVAhbQpUmVwbyBNYWlu dGFpbmVyIDxyZXBvQGFuZHJvaWQua2VybmVsLm9yZz6IYAQTEQIAIAUCSPe6AQIb AwYLCQgHAwIEFQIIAwQWAgMBAh4BAheAAAoJEBZTDV6SD1xl1GEAn0x/OKQpy7qI 6G73NJviU0IUMtftAKCFMUhGb/0bZvQ8Rm3QCUpWHyEIu7kEDQRI97ogEBAA2wI6 5fs9y/rMwD6dkD/vK9v4C9mOn1IL5JCPYMJBVSci+9ED4ChzYvfq7wOcj9qIvaE0 GwCt2ar7Q56me5J+byhSb32Rqsw/r3Vo5cZMH80N4cjesGuSXOGyEWTe4HYoxnHv gF4EKI2LK7xfTUcxMtlyn52sUpkfKsCpUhFvdmbAiJE+jCkQZr1Z8u2KphV79Ou+ P1N5IXY/XWOlq48Qf4MWCYlJFrB07xjUjLKMPDNDnm58L5byDrP/eHysKexpbakL xCmYyfT6DV1SWLblpd2hie0sL3YejdtuBMYMS2rI7Yxb8kGuqkz+9l1qhwJtei94 5MaretDy/d/JH/pRYkRf7L+ke7dpzrP+aJmcz9P1e6gq4NJsWejaALVASBiioqNf QmtqSVzF1wkR5avZkFHuYvj6V/t1RrOZTXxkSk18KFMJRBZrdHFCWbc5qrVxUB6e N5pja0NFIUCigLBV1c6I2DwiuboMNh18VtJJh+nwWeez/RueN4ig59gRTtkcc0PR 35tX2DR8+xCCFVW/NcJ4PSePYzCuuLvp1vEDHnj41R52Fz51hgddT4rBsp0nL+5I socSOIIezw8T9vVzMY4ArCKFAVu2IVyBcahTfBS8q5EM63mONU6UVJEozfGljiMw xuQ7JwKcw0AUEKTKG7aBgBaTAgT8TOevpvlw91cAAwUP/jRkyVi/0WAb0qlEaq/S ouWxX1faR+vU3b+Y2/DGjtXQMzG0qpetaTHC/AxxHpgt/dCkWI6ljYDnxgPLwG0a Oasm94BjZc6vZwf1opFZUKsjOAAxRxNZyjUJKe4UZVuMTk6zo27Nt3LMnc0FO47v FcOjRyquvgNOS818irVHUf12waDx8gszKxQTTtFxU5/ePB2jZmhP6oXSe4K/LG5T +WBRPDrHiGPhCzJRzm9BP0lTnGCAj3o9W90STZa65RK7IaYpC8TB35JTBEbrrNCp w6lzd74LnNEp5eMlKDnXzUAgAH0yzCQeMl7t33QCdYx2hRs2wtTQSjGfAiNmj/WW Vl5Jn+2jCDnRLenKHwVRFsBX2e0BiRWt/i9Y8fjorLCXVj4z+7yW6DawdLkJorEo p3v5ILwfC7hVx4jHSnOgZ65L9s8EQdVr1ckN9243yta7rNgwfcqb60ILMFF1BRk/ 0V7wCL+68UwwiQDvyMOQuqkysKLSDCLb7BFcyA7j6KG+5hpsREstFX2wK1yKeraz 5xGrFy8tfAaeBMIQ17gvFSp/suc9DYO0ICK2BISzq+F+ZiAKsjMYOBNdH/h0zobQ HTHs37+/QLMomGEGKZMWi0dShU2J5mNRQu3Hhxl3hHDVbt5CeJBb26aQcQrFz69W zE3GNvmJosh6leayjtI9P2A6iEkEGBECAAkFAkj3uiACGwwACgkQFlMNXpIPXGWp TACbBS+Up3RpfYVfd63c1cDdlru13pQAn3NQy/SN858MkxN+zym86UBgOad2 =CMiZ -----END PGP PUBLIC KEY BLOCK----- Conley Owens -----BEGIN PGP PUBLIC KEY BLOCK----- Version: GnuPG v1.4.11 (GNU/Linux) mQENBFHRvc8BCADFg45Xx/y6QDC+T7Y/gGc7vx0ww7qfOwIKlAZ9xG3qKunMxo+S hPCnzEl3cq+6I1Ww/ndop/HB3N3toPXRCoN8Vs4/Hc7by+SnaLFnacrm+tV5/OgT V37Lzt8lhay1Kl+YfpFwHYYpIEBLFV9knyfRXS/428W2qhdzYfvB15/AasRmwmor py4NIzSs8UD/SPr1ihqNCdZM76+MQyN5HMYXW/ALZXUFG0pwluHFA7hrfPG74i8C zMiP7qvMWIl/r/jtzHioH1dRKgbod+LZsrDJ8mBaqsZaDmNJMhss9g76XvfMyLra 9DI9/iFuBpGzeqBv0hwOGQspLRrEoyTeR6n1ABEBAAG0H0NvbmxleSBPd2VucyA8 Y2NvM0BhbmRyb2lkLmNvbT6JATgEEwECACIFAlHRvc8CGwMGCwkIBwMCBhUIAgkK CwQWAgMBAh4BAheAAAoJEGe35EhpKzgsP6AIAJKJmNtn4l7hkYHKHFSo3egb6RjQ zEIP3MFTcu8HFX1kF1ZFbrp7xqurLaE53kEkKuAAvjJDAgI8mcZHP1JyplubqjQA xvv84gK+OGP3Xk+QK1ZjUQSbjOpjEiSZpRhWcHci3dgOUH4blJfByHw25hlgHowd a/2PrNKZVcJ92YienaxxGjcXEUcd0uYEG2+rwllQigFcnMFDhr9B71MfalRHjFKE fmdoypqLrri61YBc59P88Rw2/WUpTQjgNubSqa3A2+CKdaRyaRw+2fdF4TdR0h8W 
zbg+lbaPtJHsV+3mJC7fq26MiJDRJa5ZztpMn8su20gbLgi2ShBOaHAYDDi5AQ0E UdG9zwEIAMoOBq+QLNozAhxOOl5GL3StTStGRgPRXINfmViTsihrqGCWBBUfXlUE OytC0mYcrDUQev/8ToVoyqw+iGSwDkcSXkrEUCKFtHV/GECWtk1keyHgR10YKI1R mquSXoubWGqPeG1PAI74XWaRx8UrL8uCXUtmD8Q5J7mDjKR5NpxaXrwlA0bKsf2E Gp9tu1kKauuToZhWHMRMqYSOGikQJwWSFYKT1KdNcOXLQF6+bfoJ6sjVYdwfmNQL Ixn8QVhoTDedcqClSWB17VDEFDFa7MmqXZz2qtM3X1R/MUMHqPtegQzBGNhRdnI2 V45+1Nnx/uuCxDbeI4RbHzujnxDiq70AEQEAAYkBHwQYAQIACQUCUdG9zwIbDAAK CRBnt+RIaSs4LNVeB/0Y2pZ8I7gAAcEM0Xw8drr4omg2fUoK1J33ozlA/RxeA/lJ I3KnyCDTpXuIeBKPGkdL8uMATC9Z8DnBBajRlftNDVZS3Hz4G09G9QpMojvJkFJV By+01Flw/X+eeN8NpqSuLV4W+AjEO8at/VvgKr1AFvBRdZ7GkpI1o6DgPe7ZqX+1 dzQZt3e13W0rVBb/bUgx9iSLoeWP3aq/k+/GRGOR+S6F6BBSl0SQ2EF2+dIywb1x JuinEP+AwLAUZ1Bsx9ISC0Agpk2VeHXPL3FGhroEmoMvBzO0kTFGyoeT7PR/BfKv +H/g3HsL2LOB9uoIm8/5p2TTU5ttYCXMHhQZ81AY =AUp4 -----END PGP PUBLIC KEY BLOCK----- """ GIT = 'git' # our git command MIN_GIT_VERSION = (1, 7, 2) # minimum supported git version repodir = '.repo' # name of repo's private directory S_repo = 'repo' # special repo repository S_manifests = 'manifests' # special manifest repository REPO_MAIN = S_repo + '/main.py' # main script MIN_PYTHON_VERSION = (2, 6) # minimum supported python version GITC_CONFIG_FILE = '/gitc/.config' GITC_FS_ROOT_DIR = '/gitc/manifest-rw/' import errno import optparse import re import shutil import stat import subprocess import sys if sys.version_info[0] == 3: import urllib.request import urllib.error else: import imp import urllib2 urllib = imp.new_module('urllib') urllib.request = urllib2 urllib.error = urllib2 def _print(*objects, **kwargs): sep = kwargs.get('sep', ' ') end = kwargs.get('end', '\n') out = kwargs.get('file', sys.stdout) out.write(sep.join(objects) + end) # Python version check ver = sys.version_info if (ver[0], ver[1]) < MIN_PYTHON_VERSION: _print('error: Python version %s unsupported.\n' 'Please use Python 2.6 - 2.7 instead.' % sys.version.split(' ')[0], file=sys.stderr) sys.exit(1) home_dot_repo = os.path.expanduser('~/.repoconfig') gpg_dir = os.path.join(home_dot_repo, 'gnupg') extra_args = [] init_optparse = optparse.OptionParser(usage="repo init -u url [options]") # Logging group = init_optparse.add_option_group('Logging options') group.add_option('-q', '--quiet', dest="quiet", action="store_true", default=False, help="be quiet") # Manifest group = init_optparse.add_option_group('Manifest options') group.add_option('-u', '--manifest-url', dest='manifest_url', help='manifest repository location', metavar='URL') group.add_option('-b', '--manifest-branch', dest='manifest_branch', help='manifest branch or revision', metavar='REVISION') group.add_option('-m', '--manifest-name', dest='manifest_name', help='initial manifest file', metavar='NAME.xml') group.add_option('--mirror', dest='mirror', action='store_true', help='create a replica of the remote repositories ' 'rather than a client working directory') group.add_option('--reference', dest='reference', help='location of mirror directory', metavar='DIR') group.add_option('--depth', type='int', default=None, dest='depth', help='create a shallow clone with given depth; see git clone') group.add_option('--archive', dest='archive', action='store_true', help='checkout an archive instead of a git repository for ' 'each project. 
See git archive.') group.add_option('-g', '--groups', dest='groups', default='default', help='restrict manifest projects to ones with specified ' 'group(s) [default|all|G1,G2,G3|G4,-G5,-G6]', metavar='GROUP') group.add_option('-p', '--platform', dest='platform', default="auto", help='restrict manifest projects to ones with a specified ' 'platform group [auto|all|none|linux|darwin|...]', metavar='PLATFORM') group.add_option('--no-clone-bundle', dest='no_clone_bundle', action='store_true', help='disable use of /clone.bundle on HTTP/HTTPS') # Tool group = init_optparse.add_option_group('repo Version options') group.add_option('--repo-url', dest='repo_url', help='repo repository location', metavar='URL') group.add_option('--repo-branch', dest='repo_branch', help='repo branch or revision', metavar='REVISION') group.add_option('--no-repo-verify', dest='no_repo_verify', action='store_true', help='do not verify repo source code') # Other group = init_optparse.add_option_group('Other options') group.add_option('--config-name', dest='config_name', action="store_true", default=False, help='Always prompt for name/e-mail') def _GitcInitOptions(init_optparse_arg): init_optparse_arg.set_usage("repo gitc-init -u url -c client [options]") g = init_optparse_arg.add_option_group('GITC options') g.add_option('-f', '--manifest-file', dest='manifest_file', help='Optional manifest file to use for this GITC client.') g.add_option('-c', '--gitc-client', dest='gitc_client', help='The name of the gitc_client instance to create or modify.') _gitc_manifest_dir = None def get_gitc_manifest_dir(): global _gitc_manifest_dir if _gitc_manifest_dir is None: _gitc_manifest_dir = '' try: with open(GITC_CONFIG_FILE, 'r') as gitc_config: for line in gitc_config: match = re.match('gitc_dir=(?P.*)', line) if match: _gitc_manifest_dir = match.group('gitc_manifest_dir') except IOError: pass return _gitc_manifest_dir def gitc_parse_clientdir(gitc_fs_path): """Parse a path in the GITC FS and return its client name. @param gitc_fs_path: A subdirectory path within the GITC_FS_ROOT_DIR. @returns: The GITC client name """ if gitc_fs_path == GITC_FS_ROOT_DIR: return None if not gitc_fs_path.startswith(GITC_FS_ROOT_DIR): manifest_dir = get_gitc_manifest_dir() if manifest_dir == '': return None if manifest_dir[-1] != '/': manifest_dir += '/' if gitc_fs_path == manifest_dir: return None if not gitc_fs_path.startswith(manifest_dir): return None return gitc_fs_path.split(manifest_dir)[1].split('/')[0] return gitc_fs_path.split(GITC_FS_ROOT_DIR)[1].split('/')[0] class CloneFailure(Exception): """Indicate the remote clone of repo itself failed. """ def _Init(args, gitc_init=False): """Installs repo by cloning it over the network. """ if gitc_init: _GitcInitOptions(init_optparse) opt, args = init_optparse.parse_args(args) if args: init_optparse.print_usage() sys.exit(1) url = opt.repo_url if not url: url = REPO_URL extra_args.append('--repo-url=%s' % url) branch = opt.repo_branch if not branch: branch = REPO_REV extra_args.append('--repo-branch=%s' % branch) if branch.startswith('refs/heads/'): branch = branch[len('refs/heads/'):] if branch.startswith('refs/'): _print("fatal: invalid branch name '%s'" % branch, file=sys.stderr) raise CloneFailure() try: if gitc_init: gitc_manifest_dir = get_gitc_manifest_dir() if not gitc_manifest_dir: _print('fatal: GITC filesystem is not available. 
Exiting...', file=sys.stderr) sys.exit(1) gitc_client = opt.gitc_client if not gitc_client: gitc_client = gitc_parse_clientdir(os.getcwd()) if not gitc_client: _print('fatal: GITC client (-c) is required.', file=sys.stderr) sys.exit(1) client_dir = os.path.join(gitc_manifest_dir, gitc_client) if not os.path.exists(client_dir): os.makedirs(client_dir) os.chdir(client_dir) if os.path.exists(repodir): # This GITC Client has already initialized repo so continue. return os.mkdir(repodir) except OSError as e: if e.errno != errno.EEXIST: _print('fatal: cannot make %s directory: %s' % (repodir, e.strerror), file=sys.stderr) # Don't raise CloneFailure; that would delete the # name. Instead exit immediately. # sys.exit(1) _CheckGitVersion() try: if NeedSetupGnuPG(): can_verify = SetupGnuPG(opt.quiet) else: can_verify = True dst = os.path.abspath(os.path.join(repodir, S_repo)) _Clone(url, dst, opt.quiet, not opt.no_clone_bundle) if can_verify and not opt.no_repo_verify: rev = _Verify(dst, branch, opt.quiet) else: rev = 'refs/remotes/origin/%s^0' % branch _Checkout(dst, branch, rev, opt.quiet) except CloneFailure: if opt.quiet: _print('fatal: repo init failed; run without --quiet to see why', file=sys.stderr) raise def ParseGitVersion(ver_str): if not ver_str.startswith('git version '): return None num_ver_str = ver_str[len('git version '):].strip().split('-')[0] to_tuple = [] for num_str in num_ver_str.split('.')[:3]: if num_str.isdigit(): to_tuple.append(int(num_str)) else: to_tuple.append(0) return tuple(to_tuple) def _CheckGitVersion(): cmd = [GIT, '--version'] try: proc = subprocess.Popen(cmd, stdout=subprocess.PIPE) except OSError as e: _print(file=sys.stderr) _print("fatal: '%s' is not available" % GIT, file=sys.stderr) _print('fatal: %s' % e, file=sys.stderr) _print(file=sys.stderr) _print('Please make sure %s is installed and in your path.' 
% GIT, file=sys.stderr) raise CloneFailure() ver_str = proc.stdout.read().strip() proc.stdout.close() proc.wait() ver_act = ParseGitVersion(ver_str) if ver_act is None: _print('error: "%s" unsupported' % ver_str, file=sys.stderr) raise CloneFailure() if ver_act < MIN_GIT_VERSION: need = '.'.join(map(str, MIN_GIT_VERSION)) _print('fatal: git %s or later required' % need, file=sys.stderr) raise CloneFailure() def NeedSetupGnuPG(): if not os.path.isdir(home_dot_repo): return True kv = os.path.join(home_dot_repo, 'keyring-version') if not os.path.exists(kv): return True kv = open(kv).read() if not kv: return True kv = tuple(map(int, kv.split('.'))) if kv < KEYRING_VERSION: return True return False def SetupGnuPG(quiet): try: os.mkdir(home_dot_repo) except OSError as e: if e.errno != errno.EEXIST: _print('fatal: cannot make %s directory: %s' % (home_dot_repo, e.strerror), file=sys.stderr) sys.exit(1) try: os.mkdir(gpg_dir, stat.S_IRWXU) except OSError as e: if e.errno != errno.EEXIST: _print('fatal: cannot make %s directory: %s' % (gpg_dir, e.strerror), file=sys.stderr) sys.exit(1) env = os.environ.copy() try: env['GNUPGHOME'] = gpg_dir except UnicodeEncodeError: env['GNUPGHOME'] = gpg_dir.encode() cmd = ['gpg', '--import'] try: proc = subprocess.Popen(cmd, env=env, stdin=subprocess.PIPE) except OSError as e: if not quiet: _print('warning: gpg (GnuPG) is not available.', file=sys.stderr) _print('warning: Installing it is strongly encouraged.', file=sys.stderr) _print(file=sys.stderr) return False proc.stdin.write(MAINTAINER_KEYS) proc.stdin.close() if proc.wait() != 0: _print('fatal: registering repo maintainer keys failed', file=sys.stderr) sys.exit(1) _print() fd = open(os.path.join(home_dot_repo, 'keyring-version'), 'w') fd.write('.'.join(map(str, KEYRING_VERSION)) + '\n') fd.close() return True def _SetConfig(local, name, value): """Set a git configuration option to the specified value. 
""" cmd = [GIT, 'config', name, value] if subprocess.Popen(cmd, cwd=local).wait() != 0: raise CloneFailure() def _InitHttp(): handlers = [] mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm() try: import netrc n = netrc.netrc() for host in n.hosts: p = n.hosts[host] mgr.add_password(p[1], 'http://%s/' % host, p[0], p[2]) mgr.add_password(p[1], 'https://%s/' % host, p[0], p[2]) except: # pylint: disable=bare-except pass handlers.append(urllib.request.HTTPBasicAuthHandler(mgr)) handlers.append(urllib.request.HTTPDigestAuthHandler(mgr)) if 'http_proxy' in os.environ: url = os.environ['http_proxy'] handlers.append(urllib.request.ProxyHandler({'http': url, 'https': url})) if 'REPO_CURL_VERBOSE' in os.environ: handlers.append(urllib.request.HTTPHandler(debuglevel=1)) handlers.append(urllib.request.HTTPSHandler(debuglevel=1)) urllib.request.install_opener(urllib.request.build_opener(*handlers)) def _Fetch(url, local, src, quiet): if not quiet: _print('Get %s' % url, file=sys.stderr) cmd = [GIT, 'fetch'] if quiet: cmd.append('--quiet') err = subprocess.PIPE else: err = None cmd.append(src) cmd.append('+refs/heads/*:refs/remotes/origin/*') cmd.append('refs/tags/*:refs/tags/*') proc = subprocess.Popen(cmd, cwd=local, stderr=err) if err: proc.stderr.read() proc.stderr.close() if proc.wait() != 0: raise CloneFailure() def _DownloadBundle(url, local, quiet): if not url.endswith('/'): url += '/' url += 'clone.bundle' proc = subprocess.Popen( [GIT, 'config', '--get-regexp', 'url.*.insteadof'], cwd=local, stdout=subprocess.PIPE) for line in proc.stdout: m = re.compile(r'^url\.(.*)\.insteadof (.*)$').match(line) if m: new_url = m.group(1) old_url = m.group(2) if url.startswith(old_url): url = new_url + url[len(old_url):] break proc.stdout.close() proc.wait() if not url.startswith('http:') and not url.startswith('https:'): return False dest = open(os.path.join(local, '.git', 'clone.bundle'), 'w+b') try: try: r = urllib.request.urlopen(url) except urllib.error.HTTPError as e: if e.code in [401, 403, 404, 501]: return False _print('fatal: Cannot get %s' % url, file=sys.stderr) _print('fatal: HTTP error %s' % e.code, file=sys.stderr) raise CloneFailure() except urllib.error.URLError as e: _print('fatal: Cannot get %s' % url, file=sys.stderr) _print('fatal: error %s' % e.reason, file=sys.stderr) raise CloneFailure() try: if not quiet: _print('Get %s' % url, file=sys.stderr) while True: buf = r.read(8192) if buf == '': return True dest.write(buf) finally: r.close() finally: dest.close() def _ImportBundle(local): path = os.path.join(local, '.git', 'clone.bundle') try: _Fetch(local, local, path, True) finally: os.remove(path) def _Clone(url, local, quiet, clone_bundle): """Clones a git repository to a new subdirectory of repodir """ try: os.mkdir(local) except OSError as e: _print('fatal: cannot make %s directory: %s' % (local, e.strerror), file=sys.stderr) raise CloneFailure() cmd = [GIT, 'init', '--quiet'] try: proc = subprocess.Popen(cmd, cwd=local) except OSError as e: _print(file=sys.stderr) _print("fatal: '%s' is not available" % GIT, file=sys.stderr) _print('fatal: %s' % e, file=sys.stderr) _print(file=sys.stderr) _print('Please make sure %s is installed and in your path.' 
% GIT, file=sys.stderr) raise CloneFailure() if proc.wait() != 0: _print('fatal: could not create %s' % local, file=sys.stderr) raise CloneFailure() _InitHttp() _SetConfig(local, 'remote.origin.url', url) _SetConfig(local, 'remote.origin.fetch', '+refs/heads/*:refs/remotes/origin/*') if clone_bundle and _DownloadBundle(url, local, quiet): _ImportBundle(local) _Fetch(url, local, 'origin', quiet) def _Verify(cwd, branch, quiet): """Verify the branch has been signed by a tag. """ cmd = [GIT, 'describe', 'origin/%s' % branch] proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=cwd) cur = proc.stdout.read().strip() proc.stdout.close() proc.stderr.read() proc.stderr.close() if proc.wait() != 0 or not cur: _print(file=sys.stderr) _print("fatal: branch '%s' has not been signed" % branch, file=sys.stderr) raise CloneFailure() m = re.compile(r'^(.*)-[0-9]{1,}-g[0-9a-f]{1,}$').match(cur) if m: cur = m.group(1) if not quiet: _print(file=sys.stderr) _print("info: Ignoring branch '%s'; using tagged release '%s'" % (branch, cur), file=sys.stderr) _print(file=sys.stderr) env = os.environ.copy() try: env['GNUPGHOME'] = gpg_dir except UnicodeEncodeError: env['GNUPGHOME'] = gpg_dir.encode() cmd = [GIT, 'tag', '-v', cur] proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=cwd, env=env) out = proc.stdout.read() proc.stdout.close() err = proc.stderr.read() proc.stderr.close() if proc.wait() != 0: _print(file=sys.stderr) _print(out, file=sys.stderr) _print(err, file=sys.stderr) _print(file=sys.stderr) raise CloneFailure() return '%s^0' % cur def _Checkout(cwd, branch, rev, quiet): """Checkout an upstream branch into the repository and track it. """ cmd = [GIT, 'update-ref', 'refs/heads/default', rev] if subprocess.Popen(cmd, cwd=cwd).wait() != 0: raise CloneFailure() _SetConfig(cwd, 'branch.default.remote', 'origin') _SetConfig(cwd, 'branch.default.merge', 'refs/heads/%s' % branch) cmd = [GIT, 'symbolic-ref', 'HEAD', 'refs/heads/default'] if subprocess.Popen(cmd, cwd=cwd).wait() != 0: raise CloneFailure() cmd = [GIT, 'read-tree', '--reset', '-u'] if not quiet: cmd.append('-v') cmd.append('HEAD') if subprocess.Popen(cmd, cwd=cwd).wait() != 0: raise CloneFailure() def _FindRepo(): """Look for a repo installation, starting at the current directory. """ curdir = os.getcwd() repo = None olddir = None while curdir != '/' \ and curdir != olddir \ and not repo: repo = os.path.join(curdir, repodir, REPO_MAIN) if not os.path.isfile(repo): repo = None olddir = curdir curdir = os.path.dirname(curdir) return (repo, os.path.join(curdir, repodir)) class _Options(object): help = False def _ParseArguments(args): cmd = None opt = _Options() arg = [] for i in range(len(args)): a = args[i] if a == '-h' or a == '--help': opt.help = True elif not a.startswith('-'): cmd = a arg = args[i + 1:] break return cmd, opt, arg def _Usage(): gitc_usage = "" if get_gitc_manifest_dir(): gitc_usage = " gitc-init Initialize a GITC Client.\n" _print( """usage: repo COMMAND [ARGS] repo is not yet installed. Use "repo init" to install it here. The most commonly used repo commands are: init Install repo in the current working directory """ + gitc_usage + """ help Display detailed help on a command For access to the full online help, install repo ("repo init"). 
""", file=sys.stderr) sys.exit(1) def _Help(args): if args: if args[0] == 'init': init_optparse.print_help() sys.exit(0) elif args[0] == 'gitc-init': _GitcInitOptions(init_optparse) init_optparse.print_help() sys.exit(0) else: _print("error: '%s' is not a bootstrap command.\n" ' For access to online help, install repo ("repo init").' % args[0], file=sys.stderr) else: _Usage() sys.exit(1) def _NotInstalled(): _print('error: repo is not installed. Use "repo init" to install it here.', file=sys.stderr) sys.exit(1) def _NoCommands(cmd): _print("""error: command '%s' requires repo to be installed first. Use "repo init" to install it here.""" % cmd, file=sys.stderr) sys.exit(1) def _RunSelf(wrapper_path): my_dir = os.path.dirname(wrapper_path) my_main = os.path.join(my_dir, 'main.py') my_git = os.path.join(my_dir, '.git') if os.path.isfile(my_main) and os.path.isdir(my_git): for name in ['git_config.py', 'project.py', 'subcmds']: if not os.path.exists(os.path.join(my_dir, name)): return None, None return my_main, my_git return None, None def _SetDefaultsTo(gitdir): global REPO_URL global REPO_REV REPO_URL = gitdir proc = subprocess.Popen([GIT, '--git-dir=%s' % gitdir, 'symbolic-ref', 'HEAD'], stdout=subprocess.PIPE, stderr=subprocess.PIPE) REPO_REV = proc.stdout.read().strip() proc.stdout.close() proc.stderr.read() proc.stderr.close() if proc.wait() != 0: _print('fatal: %s has no current branch' % gitdir, file=sys.stderr) sys.exit(1) def main(orig_args): cmd, opt, args = _ParseArguments(orig_args) repo_main, rel_repo_dir = None, None # Don't use the local repo copy, make sure to switch to the gitc client first. if cmd != 'gitc-init': repo_main, rel_repo_dir = _FindRepo() wrapper_path = os.path.abspath(__file__) my_main, my_git = _RunSelf(wrapper_path) cwd = os.getcwd() if get_gitc_manifest_dir() and cwd.startswith(get_gitc_manifest_dir()): _print('error: repo cannot be used in the GITC local manifest directory.' '\nIf you want to work on this GITC client please rerun this ' 'command from the corresponding client under /gitc/', file=sys.stderr) sys.exit(1) if not repo_main: if opt.help: _Usage() if cmd == 'help': _Help(args) if not cmd: _NotInstalled() if cmd == 'init' or cmd == 'gitc-init': if my_git: _SetDefaultsTo(my_git) try: _Init(args, gitc_init=(cmd == 'gitc-init')) except CloneFailure: shutil.rmtree(os.path.join(repodir, S_repo), ignore_errors=True) sys.exit(1) repo_main, rel_repo_dir = _FindRepo() else: _NoCommands(cmd) if my_main: repo_main = my_main ver_str = '.'.join(map(str, VERSION)) me = [sys.executable, repo_main, '--repo-dir=%s' % rel_repo_dir, '--wrapper-version=%s' % ver_str, '--wrapper-path=%s' % wrapper_path, '--'] me.extend(orig_args) me.extend(extra_args) try: os.execv(sys.executable, me) except OSError as e: _print("fatal: unable to start %s" % repo_main, file=sys.stderr) _print("fatal: %s" % e, file=sys.stderr) sys.exit(148) if __name__ == '__main__': if ver[0] == 3: _print('warning: Python 3 support is currently experimental. YMMV.\n' 'Please use Python 2.6 - 2.7 instead.', file=sys.stderr) main(sys.argv[1:]) subcmds/0040755 0000000 0000000 00000000000 13025567015 011254 5ustar000000000 0000000 subcmds/__init__.py0100644 0000000 0000000 00000002606 13025567015 013366 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. 
# You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. import os all_commands = {} my_dir = os.path.dirname(__file__) for py in os.listdir(my_dir): if py == '__init__.py': continue if py.endswith('.py'): name = py[:-3] clsn = name.capitalize() while clsn.find('_') > 0: h = clsn.index('_') clsn = clsn[0:h] + clsn[h + 1:].capitalize() mod = __import__(__name__, globals(), locals(), ['%s' % name]) mod = getattr(mod, name) try: cmd = getattr(mod, clsn)() except AttributeError: raise SyntaxError('%s/%s does not define class %s' % ( __name__, py, clsn)) name = name.replace('_', '-') cmd.NAME = name all_commands[name] = cmd if 'help' in all_commands: all_commands['help'].commands = all_commands subcmds/abandon.py0100644 0000000 0000000 00000004052 13025567015 013226 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import sys from command import Command from git_command import git from progress import Progress class Abandon(Command): common = True helpSummary = "Permanently abandon a development branch" helpUsage = """ %prog [...] This subcommand permanently abandons a development branch by deleting it (and all its history) from your local repository. It is equivalent to "git branch -D ". """ def Execute(self, opt, args): if not args: self.Usage() nb = args[0] if not git.check_ref_format('heads/%s' % nb): print("error: '%s' is not a valid name" % nb, file=sys.stderr) sys.exit(1) nb = args[0] err = [] success = [] all_projects = self.GetProjects(args[1:]) pm = Progress('Abandon %s' % nb, len(all_projects)) for project in all_projects: pm.update() status = project.AbandonBranch(nb) if status is not None: if status: success.append(project) else: err.append(project) pm.end() if err: for p in err: print("error: %s/: cannot abandon %s" % (p.relpath, nb), file=sys.stderr) sys.exit(1) elif not success: print('error: no project has branch %s' % nb, file=sys.stderr) sys.exit(1) else: print('Abandoned in %d project(s):\n %s' % (len(success), '\n '.join(p.relpath for p in success)), file=sys.stderr) subcmds/branches.py0100644 0000000 0000000 00000011643 13025567015 013415 0ustar000000000 0000000 # # Copyright (C) 2009 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. 
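# --- Illustrative sketch (not part of the original source): the filename-to-class
# convention the subcmds loader above depends on -- each subcmds/foo_bar.py must
# define a class FooBar, or the loader raises SyntaxError.
def class_name_for(py_filename):
    name = py_filename[:-3]                       # strip '.py'
    clsn = name.capitalize()
    while clsn.find('_') > 0:
        h = clsn.index('_')
        clsn = clsn[0:h] + clsn[h + 1:].capitalize()
    return clsn

# class_name_for('abandon.py')      -> 'Abandon'
# class_name_for('cherry_pick.py')  -> 'CherryPick'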
# You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import sys from color import Coloring from command import Command class BranchColoring(Coloring): def __init__(self, config): Coloring.__init__(self, config, 'branch') self.current = self.printer('current', fg='green') self.local = self.printer('local') self.notinproject = self.printer('notinproject', fg='red') class BranchInfo(object): def __init__(self, name): self.name = name self.current = 0 self.published = 0 self.published_equal = 0 self.projects = [] def add(self, b): if b.current: self.current += 1 if b.published: self.published += 1 if b.revision == b.published: self.published_equal += 1 self.projects.append(b) @property def IsCurrent(self): return self.current > 0 @property def IsSplitCurrent(self): return self.current != 0 and self.current != len(self.projects) @property def IsPublished(self): return self.published > 0 @property def IsPublishedEqual(self): return self.published_equal == len(self.projects) class Branches(Command): common = True helpSummary = "View current topic branches" helpUsage = """ %prog [...] Summarizes the currently available topic branches. Branch Display -------------- The branch display output by this command is organized into four columns of information; for example: *P nocolor | in repo repo2 | The first column contains a * if the branch is the currently checked out branch in any of the specified projects, or a blank if no project has the branch checked out. The second column contains either blank, p or P, depending upon the upload status of the branch. (blank): branch not yet published by repo upload P: all commits were published by repo upload p: only some commits were published by repo upload The third column contains the branch name. The fourth column (after the | separator) lists the projects that the branch appears in, or does not appear in. If no project list is shown, then the branch appears in all projects. 
""" def Execute(self, opt, args): projects = self.GetProjects(args) out = BranchColoring(self.manifest.manifestProject.config) all_branches = {} project_cnt = len(projects) for project in projects: for name, b in project.GetBranches().items(): b.project = project if name not in all_branches: all_branches[name] = BranchInfo(name) all_branches[name].add(b) names = list(sorted(all_branches)) if not names: print(' (no branches)', file=sys.stderr) return width = 25 for name in names: if width < len(name): width = len(name) for name in names: i = all_branches[name] in_cnt = len(i.projects) if i.IsCurrent: current = '*' hdr = out.current else: current = ' ' hdr = out.local if i.IsPublishedEqual: published = 'P' elif i.IsPublished: published = 'p' else: published = ' ' hdr('%c%c %-*s' % (current, published, width, name)) out.write(' |') if in_cnt < project_cnt: fmt = out.write paths = [] non_cur_paths = [] if i.IsSplitCurrent or (in_cnt < project_cnt - in_cnt): in_type = 'in' for b in i.projects: if not i.IsSplitCurrent or b.current: paths.append(b.project.relpath) else: non_cur_paths.append(b.project.relpath) else: fmt = out.notinproject in_type = 'not in' have = set() for b in i.projects: have.add(b.project) for p in projects: if not p in have: paths.append(p.relpath) s = ' %s %s' % (in_type, ', '.join(paths)) if not i.IsSplitCurrent and (width + 7 + len(s) < 80): fmt = out.current if i.IsCurrent else fmt fmt(s) else: fmt(' %s:' % in_type) fmt = out.current if i.IsCurrent else out.write for p in paths: out.nl() fmt(width*' ' + ' %s' % p) fmt = out.write for p in non_cur_paths: out.nl() fmt(width*' ' + ' %s' % p) else: out.write(' in all projects') out.nl() subcmds/checkout.py0100644 0000000 0000000 00000003370 13025567015 013433 0ustar000000000 0000000 # # Copyright (C) 2009 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import sys from command import Command from progress import Progress class Checkout(Command): common = True helpSummary = "Checkout a branch for development" helpUsage = """ %prog [...] """ helpDescription = """ The '%prog' command checks out an existing branch that was previously created by 'repo start'. The command is equivalent to: repo forall [...] 
-c git checkout """ def Execute(self, opt, args): if not args: self.Usage() nb = args[0] err = [] success = [] all_projects = self.GetProjects(args[1:]) pm = Progress('Checkout %s' % nb, len(all_projects)) for project in all_projects: pm.update() status = project.CheckoutBranch(nb) if status is not None: if status: success.append(project) else: err.append(project) pm.end() if err: for p in err: print("error: %s/: cannot checkout %s" % (p.relpath, nb), file=sys.stderr) sys.exit(1) elif not success: print('error: no project has branch %s' % nb, file=sys.stderr) sys.exit(1) subcmds/cherry_pick.py0100644 0000000 0000000 00000006550 13025567015 014133 0ustar000000000 0000000 # # Copyright (C) 2010 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import re import sys from command import Command from git_command import GitCommand CHANGE_ID_RE = re.compile(r'^\s*Change-Id: I([0-9a-f]{40})\s*$') class CherryPick(Command): common = True helpSummary = "Cherry-pick a change." helpUsage = """ %prog """ helpDescription = """ '%prog' cherry-picks a change from one branch to another. The change id will be updated, and a reference to the old change id will be added. """ def _Options(self, p): pass def Execute(self, opt, args): if len(args) != 1: self.Usage() reference = args[0] p = GitCommand(None, ['rev-parse', '--verify', reference], capture_stdout = True, capture_stderr = True) if p.Wait() != 0: print(p.stderr, file=sys.stderr) sys.exit(1) sha1 = p.stdout.strip() p = GitCommand(None, ['cat-file', 'commit', sha1], capture_stdout=True) if p.Wait() != 0: print("error: Failed to retrieve old commit message", file=sys.stderr) sys.exit(1) old_msg = self._StripHeader(p.stdout) p = GitCommand(None, ['cherry-pick', sha1], capture_stdout = True, capture_stderr = True) status = p.Wait() print(p.stdout, file=sys.stdout) print(p.stderr, file=sys.stderr) if status == 0: # The cherry-pick was applied correctly. We just need to edit the # commit message. 
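      # _Reformat() below rebuilds the message: it drops any existing
      # Change-Id line from the old commit message and appends a
      # '(cherry picked from commit <sha1>)' reference; the amended commit
      # is then created by piping the new message into
      # 'git commit --amend -F -'.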
new_msg = self._Reformat(old_msg, sha1) p = GitCommand(None, ['commit', '--amend', '-F', '-'], provide_stdin = True, capture_stdout = True, capture_stderr = True) p.stdin.write(new_msg) p.stdin.close() if p.Wait() != 0: print("error: Failed to update commit message", file=sys.stderr) sys.exit(1) else: print('NOTE: When committing (please see above) and editing the commit ' 'message, please remove the old Change-Id-line and add:') print(self._GetReference(sha1), file=sys.stderr) print(file=sys.stderr) def _IsChangeId(self, line): return CHANGE_ID_RE.match(line) def _GetReference(self, sha1): return "(cherry picked from commit %s)" % sha1 def _StripHeader(self, commit_msg): lines = commit_msg.splitlines() return "\n".join(lines[lines.index("")+1:]) def _Reformat(self, old_msg, sha1): new_msg = [] for line in old_msg.splitlines(): if not self._IsChangeId(line): new_msg.append(line) # Add a blank line between the message and the change id/reference try: if new_msg[-1].strip() != "": new_msg.append("") except IndexError: pass new_msg.append(self._GetReference(sha1)) return "\n".join(new_msg) subcmds/diff.py0100644 0000000 0000000 00000002606 13025567015 012537 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from command import PagedCommand class Diff(PagedCommand): common = True helpSummary = "Show changes between commit and working tree" helpUsage = """ %prog [...] The -u option causes '%prog' to generate diff output with file paths relative to the repository root, so the output can be applied to the Unix 'patch' command. """ def _Options(self, p): def cmd(option, opt_str, value, parser): setattr(parser.values, option.dest, list(parser.rargs)) while parser.rargs: del parser.rargs[0] p.add_option('-u', '--absolute', dest='absolute', action='store_true', help='Paths are relative to the repository root') def Execute(self, opt, args): for project in self.GetProjects(args): project.PrintWorkTreeDiff(opt.absolute) subcmds/diffmanifests.py0100644 0000000 0000000 00000016367 13025567015 014462 0ustar000000000 0000000 # # Copyright (C) 2014 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
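# Illustrative usage (the manifest file names below are hypothetical):
#
#   repo diffmanifests manifest-old.xml manifest-new.xml
#   repo diffmanifests manifest-old.xml --raw
#
# With a single manifest argument the currently checked out manifest is
# used as the second side of the comparison, as described in the
# helpDescription below.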
from color import Coloring from command import PagedCommand from manifest_xml import XmlManifest class _Coloring(Coloring): def __init__(self, config): Coloring.__init__(self, config, "status") class Diffmanifests(PagedCommand): """ A command to see logs in projects represented by manifests This is used to see deeper differences between manifests. Where a simple diff would only show a diff of sha1s for example, this command will display the logs of the project between both sha1s, allowing user to see diff at a deeper level. """ common = True helpSummary = "Manifest diff utility" helpUsage = """%prog manifest1.xml [manifest2.xml] [options]""" helpDescription = """ The %prog command shows differences between project revisions of manifest1 and manifest2. if manifest2 is not specified, current manifest.xml will be used instead. Both absolute and relative paths may be used for manifests. Relative paths start from project's ".repo/manifests" folder. The --raw option Displays the diff in a way that facilitates parsing, the project pattern will be [] and the commit pattern will be with status values respectively : A = Added project R = Removed project C = Changed project U = Project with unreachable revision(s) (revision(s) not found) for project, and A = Added commit R = Removed commit for a commit. Only changed projects may contain commits, and commit status always starts with a space, and are part of last printed project. Unreachable revisions may occur if project is not up to date or if repo has not been initialized with all the groups, in which case some projects won't be synced and their revisions won't be found. """ def _Options(self, p): p.add_option('--raw', dest='raw', action='store_true', help='Display raw diff.') p.add_option('--no-color', dest='color', action='store_false', default=True, help='does not display the diff in color.') p.add_option('--pretty-format', dest='pretty_format', action='store', metavar='', help='print the log using a custom git pretty format string') def _printRawDiff(self, diff): for project in diff['added']: self.printText("A %s %s" % (project.relpath, project.revisionExpr)) self.out.nl() for project in diff['removed']: self.printText("R %s %s" % (project.relpath, project.revisionExpr)) self.out.nl() for project, otherProject in diff['changed']: self.printText("C %s %s %s" % (project.relpath, project.revisionExpr, otherProject.revisionExpr)) self.out.nl() self._printLogs(project, otherProject, raw=True, color=False) for project, otherProject in diff['unreachable']: self.printText("U %s %s %s" % (project.relpath, project.revisionExpr, otherProject.revisionExpr)) self.out.nl() def _printDiff(self, diff, color=True, pretty_format=None): if diff['added']: self.out.nl() self.printText('added projects : \n') self.out.nl() for project in diff['added']: self.printProject('\t%s' % (project.relpath)) self.printText(' at revision ') self.printRevision(project.revisionExpr) self.out.nl() if diff['removed']: self.out.nl() self.printText('removed projects : \n') self.out.nl() for project in diff['removed']: self.printProject('\t%s' % (project.relpath)) self.printText(' at revision ') self.printRevision(project.revisionExpr) self.out.nl() if diff['changed']: self.out.nl() self.printText('changed projects : \n') self.out.nl() for project, otherProject in diff['changed']: self.printProject('\t%s' % (project.relpath)) self.printText(' changed from ') self.printRevision(project.revisionExpr) self.printText(' to ') self.printRevision(otherProject.revisionExpr) self.out.nl() 
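        # For each changed project, also print the commits added or removed
        # between the two revisions, reusing the colored output helpers set
        # up in Execute().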
self._printLogs(project, otherProject, raw=False, color=color, pretty_format=pretty_format) self.out.nl() if diff['unreachable']: self.out.nl() self.printText('projects with unreachable revisions : \n') self.out.nl() for project, otherProject in diff['unreachable']: self.printProject('\t%s ' % (project.relpath)) self.printRevision(project.revisionExpr) self.printText(' or ') self.printRevision(otherProject.revisionExpr) self.printText(' not found') self.out.nl() def _printLogs(self, project, otherProject, raw=False, color=True, pretty_format=None): logs = project.getAddedAndRemovedLogs(otherProject, oneline=(pretty_format is None), color=color, pretty_format=pretty_format) if logs['removed']: removedLogs = logs['removed'].split('\n') for log in removedLogs: if log.strip(): if raw: self.printText(' R ' + log) self.out.nl() else: self.printRemoved('\t\t[-] ') self.printText(log) self.out.nl() if logs['added']: addedLogs = logs['added'].split('\n') for log in addedLogs: if log.strip(): if raw: self.printText(' A ' + log) self.out.nl() else: self.printAdded('\t\t[+] ') self.printText(log) self.out.nl() def Execute(self, opt, args): if not args or len(args) > 2: self.Usage() self.out = _Coloring(self.manifest.globalConfig) self.printText = self.out.nofmt_printer('text') if opt.color: self.printProject = self.out.nofmt_printer('project', attr = 'bold') self.printAdded = self.out.nofmt_printer('green', fg = 'green', attr = 'bold') self.printRemoved = self.out.nofmt_printer('red', fg = 'red', attr = 'bold') self.printRevision = self.out.nofmt_printer('revision', fg = 'yellow') else: self.printProject = self.printAdded = self.printRemoved = self.printRevision = self.printText manifest1 = XmlManifest(self.manifest.repodir) manifest1.Override(args[0]) if len(args) == 1: manifest2 = self.manifest else: manifest2 = XmlManifest(self.manifest.repodir) manifest2.Override(args[1]) diff = manifest1.projectsDiff(manifest2) if opt.raw: self._printRawDiff(diff) else: self._printDiff(diff, color=opt.color, pretty_format=opt.pretty_format) subcmds/download.py0100644 0000000 0000000 00000006237 13025567015 013442 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import re import sys from command import Command from error import GitError CHANGE_RE = re.compile(r'^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$') class Download(Command): common = True helpSummary = "Download and checkout a change" helpUsage = """ %prog {project change[/patchset]}... """ helpDescription = """ The '%prog' command downloads a change from the review system and makes it available in your project's local working directory. 
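For example, a hypothetical invocation such as

  repo download platform/build 1234/5

fetches patch set 5 of change 1234 for the platform/build project and
checks it out in that project's working directory; if no patch set is
given, patch set 1 is assumed.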
""" def _Options(self, p): p.add_option('-c', '--cherry-pick', dest='cherrypick', action='store_true', help="cherry-pick instead of checkout") p.add_option('-r', '--revert', dest='revert', action='store_true', help="revert instead of checkout") p.add_option('-f', '--ff-only', dest='ffonly', action='store_true', help="force fast-forward merge") def _ParseChangeIds(self, args): if not args: self.Usage() to_get = [] project = None for a in args: m = CHANGE_RE.match(a) if m: if not project: self.Usage() chg_id = int(m.group(1)) if m.group(2): ps_id = int(m.group(2)) else: ps_id = 1 to_get.append((project, chg_id, ps_id)) else: project = self.GetProjects([a])[0] return to_get def Execute(self, opt, args): for project, change_id, ps_id in self._ParseChangeIds(args): dl = project.DownloadPatchSet(change_id, ps_id) if not dl: print('[%s] change %d/%d not found' % (project.name, change_id, ps_id), file=sys.stderr) sys.exit(1) if not opt.revert and not dl.commits: print('[%s] change %d/%d has already been merged' % (project.name, change_id, ps_id), file=sys.stderr) continue if len(dl.commits) > 1: print('[%s] %d/%d depends on %d unmerged changes:' \ % (project.name, change_id, ps_id, len(dl.commits)), file=sys.stderr) for c in dl.commits: print(' %s' % (c), file=sys.stderr) if opt.cherrypick: try: project._CherryPick(dl.commit) except GitError: print('[%s] Could not complete the cherry-pick of %s' \ % (project.name, dl.commit), file=sys.stderr) sys.exit(1) elif opt.revert: project._Revert(dl.commit) elif opt.ffonly: project._FastForward(dl.commit, ffonly=True) else: project._Checkout(dl.commit) subcmds/forall.py0100644 0000000 0000000 00000030216 13025567015 013104 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import errno import fcntl import multiprocessing import re import os import select import signal import sys import subprocess from color import Coloring from command import Command, MirrorSafeCommand _CAN_COLOR = [ 'branch', 'diff', 'grep', 'log', ] class ForallColoring(Coloring): def __init__(self, config): Coloring.__init__(self, config, 'forall') self.project = self.printer('project', attr='bold') class Forall(Command, MirrorSafeCommand): common = False helpSummary = "Run a shell command in each project" helpUsage = """ %prog [...] -c [...] %prog -r str1 [str2] ... -c [...]" """ helpDescription = """ Executes the same shell command in each project. The -r option allows running the command only on projects matching regex or wildcard expression. Output Formatting ----------------- The -p option causes '%prog' to bind pipes to the command's stdin, stdout and stderr streams, and pipe all output into a continuous stream that is displayed in a single pager session. Project headings are inserted before the output of each command is displayed. If the command produces no output in a project, no heading is displayed. 
The formatting convention used by -p is very suitable for some types of searching, e.g. `repo forall -p -c git log -SFoo` will print all commits that add or remove references to Foo. The -v option causes '%prog' to display stderr messages if a command produces output only on stderr. Normally the -p option causes command output to be suppressed until the command produces at least one byte of output on stdout. Environment ----------- pwd is the project's working directory. If the current client is a mirror client, then pwd is the Git repository. REPO_PROJECT is set to the unique name of the project. REPO_PATH is the path relative the the root of the client. REPO_REMOTE is the name of the remote system from the manifest. REPO_LREV is the name of the revision from the manifest, translated to a local tracking branch. If you need to pass the manifest revision to a locally executed git command, use REPO_LREV. REPO_RREV is the name of the revision from the manifest, exactly as written in the manifest. REPO_COUNT is the total number of projects being iterated. REPO_I is the current (1-based) iteration count. Can be used in conjunction with REPO_COUNT to add a simple progress indicator to your command. REPO__* are any extra environment variables, specified by the "annotation" element under any project element. This can be useful for differentiating trees based on user-specific criteria, or simply annotating tree details. shell positional arguments ($1, $2, .., $#) are set to any arguments following . Unless -p is used, stdin, stdout, stderr are inherited from the terminal and are not redirected. If -e is used, when a command exits unsuccessfully, '%prog' will abort without iterating through the remaining projects. """ def _Options(self, p): def cmd(option, opt_str, value, parser): setattr(parser.values, option.dest, list(parser.rargs)) while parser.rargs: del parser.rargs[0] p.add_option('-r', '--regex', dest='regex', action='store_true', help="Execute the command only on projects matching regex or wildcard expression") p.add_option('-i', '--inverse-regex', dest='inverse_regex', action='store_true', help="Execute the command only on projects not matching regex or wildcard expression") p.add_option('-g', '--groups', dest='groups', help="Execute the command only on projects matching the specified groups") p.add_option('-c', '--command', help='Command (and arguments) to execute', dest='command', action='callback', callback=cmd) p.add_option('-e', '--abort-on-errors', dest='abort_on_errors', action='store_true', help='Abort if a command exits unsuccessfully') g = p.add_option_group('Output') g.add_option('-p', dest='project_header', action='store_true', help='Show project headers before output') g.add_option('-v', '--verbose', dest='verbose', action='store_true', help='Show command error messages') g.add_option('-j', '--jobs', dest='jobs', action='store', type='int', default=1, help='number of commands to execute simultaneously') def WantPager(self, opt): return opt.project_header and opt.jobs == 1 def _SerializeProject(self, project): """ Serialize a project._GitGetByExec instance. project._GitGetByExec is not pickle-able. Instead of trying to pass it around between processes, make a dict ourselves containing only the attributes that we need. 
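    Returns:
      A plain dict holding only the fields DoWork() needs: name, relpath,
      remote_name, lrev, rrev, annotations, gitdir and worktree.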
""" if not self.manifest.IsMirror: lrev = project.GetRevisionId() else: lrev = None return { 'name': project.name, 'relpath': project.relpath, 'remote_name': project.remote.name, 'lrev': lrev, 'rrev': project.revisionExpr, 'annotations': dict((a.name, a.value) for a in project.annotations), 'gitdir': project.gitdir, 'worktree': project.worktree, } def Execute(self, opt, args): if not opt.command: self.Usage() cmd = [opt.command[0]] shell = True if re.compile(r'^[a-z0-9A-Z_/\.-]+$').match(cmd[0]): shell = False if shell: cmd.append(cmd[0]) cmd.extend(opt.command[1:]) if opt.project_header \ and not shell \ and cmd[0] == 'git': # If this is a direct git command that can enable colorized # output and the user prefers coloring, add --color into the # command line because we are going to wrap the command into # a pipe and git won't know coloring should activate. # for cn in cmd[1:]: if not cn.startswith('-'): break else: cn = None # pylint: disable=W0631 if cn and cn in _CAN_COLOR: class ColorCmd(Coloring): def __init__(self, config, cmd): Coloring.__init__(self, config, cmd) if ColorCmd(self.manifest.manifestProject.config, cn).is_on: cmd.insert(cmd.index(cn) + 1, '--color') # pylint: enable=W0631 mirror = self.manifest.IsMirror rc = 0 smart_sync_manifest_name = "smart_sync_override.xml" smart_sync_manifest_path = os.path.join( self.manifest.manifestProject.worktree, smart_sync_manifest_name) if os.path.isfile(smart_sync_manifest_path): self.manifest.Override(smart_sync_manifest_path) if opt.regex: projects = self.FindProjects(args) elif opt.inverse_regex: projects = self.FindProjects(args, inverse=True) else: projects = self.GetProjects(args, groups=opt.groups) os.environ['REPO_COUNT'] = str(len(projects)) pool = multiprocessing.Pool(opt.jobs, InitWorker) try: config = self.manifest.manifestProject.config results_it = pool.imap( DoWorkWrapper, self.ProjectArgs(projects, mirror, opt, cmd, shell, config)) pool.close() for r in results_it: rc = rc or r if r != 0 and opt.abort_on_errors: raise Exception('Aborting due to previous error') except (KeyboardInterrupt, WorkerKeyboardInterrupt): # Catch KeyboardInterrupt raised inside and outside of workers print('Interrupted - terminating the pool') pool.terminate() rc = rc or errno.EINTR except Exception as e: # Catch any other exceptions raised print('Got an error, terminating the pool: %s: %s' % (type(e).__name__, e), file=sys.stderr) pool.terminate() rc = rc or getattr(e, 'errno', 1) finally: pool.join() if rc != 0: sys.exit(rc) def ProjectArgs(self, projects, mirror, opt, cmd, shell, config): for cnt, p in enumerate(projects): try: project = self._SerializeProject(p) except Exception as e: print('Project list error on project %s: %s: %s' % (p.name, type(e).__name__, e), file=sys.stderr) return except KeyboardInterrupt: print('Project list interrupted', file=sys.stderr) return yield [mirror, opt, cmd, shell, cnt, config, project] class WorkerKeyboardInterrupt(Exception): """ Keyboard interrupt exception for worker processes. """ pass def InitWorker(): signal.signal(signal.SIGINT, signal.SIG_IGN) def DoWorkWrapper(args): """ A wrapper around the DoWork() method. Catch the KeyboardInterrupt exceptions here and re-raise them as a different, ``Exception``-based exception to stop it flooding the console with stacktraces and making the parent hang indefinitely. 
""" project = args.pop() try: return DoWork(project, *args) except KeyboardInterrupt: print('%s: Worker interrupted' % project['name']) raise WorkerKeyboardInterrupt() def DoWork(project, mirror, opt, cmd, shell, cnt, config): env = os.environ.copy() def setenv(name, val): if val is None: val = '' if hasattr(val, 'encode'): val = val.encode() env[name] = val setenv('REPO_PROJECT', project['name']) setenv('REPO_PATH', project['relpath']) setenv('REPO_REMOTE', project['remote_name']) setenv('REPO_LREV', project['lrev']) setenv('REPO_RREV', project['rrev']) setenv('REPO_I', str(cnt + 1)) for name in project['annotations']: setenv("REPO__%s" % (name), project['annotations'][name]) if mirror: setenv('GIT_DIR', project['gitdir']) cwd = project['gitdir'] else: cwd = project['worktree'] if not os.path.exists(cwd): if (opt.project_header and opt.verbose) \ or not opt.project_header: print('skipping %s/' % project['relpath'], file=sys.stderr) return if opt.project_header: stdin = subprocess.PIPE stdout = subprocess.PIPE stderr = subprocess.PIPE else: stdin = None stdout = None stderr = None p = subprocess.Popen(cmd, cwd=cwd, shell=shell, env=env, stdin=stdin, stdout=stdout, stderr=stderr) if opt.project_header: out = ForallColoring(config) out.redirect(sys.stdout) class sfd(object): def __init__(self, fd, dest): self.fd = fd self.dest = dest def fileno(self): return self.fd.fileno() empty = True errbuf = '' p.stdin.close() s_in = [sfd(p.stdout, sys.stdout), sfd(p.stderr, sys.stderr)] for s in s_in: flags = fcntl.fcntl(s.fd, fcntl.F_GETFL) fcntl.fcntl(s.fd, fcntl.F_SETFL, flags | os.O_NONBLOCK) while s_in: in_ready, _out_ready, _err_ready = select.select(s_in, [], []) for s in in_ready: buf = s.fd.read(4096) if not buf: s.fd.close() s_in.remove(s) continue if not opt.verbose: if s.fd != p.stdout: errbuf += buf continue if empty and out: if not cnt == 0: out.nl() if mirror: project_header_path = project['name'] else: project_header_path = project['relpath'] out.project('project %s/', project_header_path) out.nl() out.flush() if errbuf: sys.stderr.write(errbuf) sys.stderr.flush() errbuf = '' empty = False s.dest.write(buf) s.dest.flush() r = p.wait() return r subcmds/gitc_delete.py0100644 0000000 0000000 00000003267 13025567015 014103 0ustar000000000 0000000 # # Copyright (C) 2015 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import os import shutil import sys from command import Command, GitcClientCommand import gitc_utils from pyversion import is_python3 if not is_python3(): # pylint:disable=W0622 input = raw_input # pylint:enable=W0622 class GitcDelete(Command, GitcClientCommand): common = True visible_everywhere = False helpSummary = "Delete a GITC Client." helpUsage = """ %prog """ helpDescription = """ This subcommand deletes the current GITC client, deleting the GITC manifest and all locally downloaded sources. 
""" def _Options(self, p): p.add_option('-f', '--force', dest='force', action='store_true', help='Force the deletion (no prompt).') def Execute(self, opt, args): if not opt.force: prompt = ('This will delete GITC client: %s\nAre you sure? (yes/no) ' % self.gitc_manifest.gitc_client_name) response = input(prompt).lower() if not response == 'yes': print('Response was not "yes"\n Exiting...') sys.exit(1) shutil.rmtree(self.gitc_manifest.gitc_client_dir) subcmds/gitc_init.py0100644 0000000 0000000 00000006000 13025567015 013570 0ustar000000000 0000000 # # Copyright (C) 2015 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import os import sys import gitc_utils from command import GitcAvailableCommand from manifest_xml import GitcManifest from subcmds import init import wrapper class GitcInit(init.Init, GitcAvailableCommand): common = True helpSummary = "Initialize a GITC Client." helpUsage = """ %prog [options] [client name] """ helpDescription = """ The '%prog' command is ran to initialize a new GITC client for use with the GITC file system. This command will setup the client directory, initialize repo, just like repo init does, and then downloads the manifest collection and installs it in the .repo/directory of the GITC client. Once this is done, a GITC manifest is generated by pulling the HEAD SHA for each project and generates the properly formatted XML file and installs it as .manifest in the GITC client directory. The -c argument is required to specify the GITC client name. The optional -f argument can be used to specify the manifest file to use for this GITC client. """ def _Options(self, p): super(GitcInit, self)._Options(p) g = p.add_option_group('GITC options') g.add_option('-f', '--manifest-file', dest='manifest_file', help='Optional manifest file to use for this GITC client.') g.add_option('-c', '--gitc-client', dest='gitc_client', help='The name of the gitc_client instance to create or modify.') def Execute(self, opt, args): gitc_client = gitc_utils.parse_clientdir(os.getcwd()) if not gitc_client or (opt.gitc_client and gitc_client != opt.gitc_client): print('fatal: Please update your repo command. See go/gitc for instructions.', file=sys.stderr) sys.exit(1) self.client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(), gitc_client) super(GitcInit, self).Execute(opt, args) manifest_file = self.manifest.manifestFile if opt.manifest_file: if not os.path.exists(opt.manifest_file): print('fatal: Specified manifest file %s does not exist.' % opt.manifest_file) sys.exit(1) manifest_file = opt.manifest_file manifest = GitcManifest(self.repodir, gitc_client) manifest.Override(manifest_file) gitc_utils.generate_gitc_manifest(None, manifest) print('Please run `cd %s` to view your GITC client.' 
% os.path.join(wrapper.Wrapper().GITC_FS_ROOT_DIR, gitc_client)) subcmds/grep.py0100644 0000000 0000000 00000017373 13025567015 012573 0ustar000000000 0000000 # # Copyright (C) 2009 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import sys from color import Coloring from command import PagedCommand from git_command import git_require, GitCommand class GrepColoring(Coloring): def __init__(self, config): Coloring.__init__(self, config, 'grep') self.project = self.printer('project', attr='bold') class Grep(PagedCommand): common = True helpSummary = "Print lines matching a pattern" helpUsage = """ %prog {pattern | -e pattern} [...] """ helpDescription = """ Search for the specified patterns in all project files. Boolean Options --------------- The following options can appear as often as necessary to express the pattern to locate: -e PATTERN --and, --or, --not, -(, -) Further, the -r/--revision option may be specified multiple times in order to scan multiple trees. If the same file matches in more than one tree, only the first result is reported, prefixed by the revision name it was found under. Examples ------- Look for a line that has '#define' and either 'MAX_PATH or 'PATH_MAX': repo grep -e '#define' --and -\\( -e MAX_PATH -e PATH_MAX \\) Look for a line that has 'NODE' or 'Unexpected' in files that contain a line that matches both expressions: repo grep --all-match -e NODE -e Unexpected """ def _Options(self, p): def carry(option, opt_str, value, parser): pt = getattr(parser.values, 'cmd_argv', None) if pt is None: pt = [] setattr(parser.values, 'cmd_argv', pt) if opt_str == '-(': pt.append('(') elif opt_str == '-)': pt.append(')') else: pt.append(opt_str) if value is not None: pt.append(value) g = p.add_option_group('Sources') g.add_option('--cached', action='callback', callback=carry, help='Search the index, instead of the work tree') g.add_option('-r', '--revision', dest='revision', action='append', metavar='TREEish', help='Search TREEish, instead of the work tree') g = p.add_option_group('Pattern') g.add_option('-e', action='callback', callback=carry, metavar='PATTERN', type='str', help='Pattern to search for') g.add_option('-i', '--ignore-case', action='callback', callback=carry, help='Ignore case differences') g.add_option('-a', '--text', action='callback', callback=carry, help="Process binary files as if they were text") g.add_option('-I', action='callback', callback=carry, help="Don't match the pattern in binary files") g.add_option('-w', '--word-regexp', action='callback', callback=carry, help='Match the pattern only at word boundaries') g.add_option('-v', '--invert-match', action='callback', callback=carry, help='Select non-matching lines') g.add_option('-G', '--basic-regexp', action='callback', callback=carry, help='Use POSIX basic regexp for patterns (default)') g.add_option('-E', '--extended-regexp', action='callback', callback=carry, help='Use POSIX extended regexp for patterns') g.add_option('-F', '--fixed-strings', 
action='callback', callback=carry, help='Use fixed strings (not regexp) for pattern') g = p.add_option_group('Pattern Grouping') g.add_option('--all-match', action='callback', callback=carry, help='Limit match to lines that have all patterns') g.add_option('--and', '--or', '--not', action='callback', callback=carry, help='Boolean operators to combine patterns') g.add_option('-(', '-)', action='callback', callback=carry, help='Boolean operator grouping') g = p.add_option_group('Output') g.add_option('-n', action='callback', callback=carry, help='Prefix the line number to matching lines') g.add_option('-C', action='callback', callback=carry, metavar='CONTEXT', type='str', help='Show CONTEXT lines around match') g.add_option('-B', action='callback', callback=carry, metavar='CONTEXT', type='str', help='Show CONTEXT lines before match') g.add_option('-A', action='callback', callback=carry, metavar='CONTEXT', type='str', help='Show CONTEXT lines after match') g.add_option('-l', '--name-only', '--files-with-matches', action='callback', callback=carry, help='Show only file names containing matching lines') g.add_option('-L', '--files-without-match', action='callback', callback=carry, help='Show only file names not containing matching lines') def Execute(self, opt, args): out = GrepColoring(self.manifest.manifestProject.config) cmd_argv = ['grep'] if out.is_on and git_require((1, 6, 3)): cmd_argv.append('--color') cmd_argv.extend(getattr(opt, 'cmd_argv', [])) if '-e' not in cmd_argv: if not args: self.Usage() cmd_argv.append('-e') cmd_argv.append(args[0]) args = args[1:] projects = self.GetProjects(args) full_name = False if len(projects) > 1: cmd_argv.append('--full-name') full_name = True have_rev = False if opt.revision: if '--cached' in cmd_argv: print('fatal: cannot combine --cached and --revision', file=sys.stderr) sys.exit(1) have_rev = True cmd_argv.extend(opt.revision) cmd_argv.append('--') bad_rev = False have_match = False for project in projects: p = GitCommand(project, cmd_argv, bare = False, capture_stdout = True, capture_stderr = True) if p.Wait() != 0: # no results # if p.stderr: if have_rev and 'fatal: ambiguous argument' in p.stderr: bad_rev = True else: out.project('--- project %s ---' % project.relpath) out.nl() out.write("%s", p.stderr) out.nl() continue have_match = True # We cut the last element, to avoid a blank line. # r = p.stdout.split('\n') r = r[0:-1] if have_rev and full_name: for line in r: rev, line = line.split(':', 1) out.write("%s", rev) out.write(':') out.project(project.relpath) out.write('/') out.write("%s", line) out.nl() elif full_name: for line in r: out.project(project.relpath) out.write('/') out.write("%s", line) out.nl() else: for line in r: print(line) if have_match: sys.exit(0) elif have_rev and bad_rev: for r in opt.revision: print("error: can't search revision %s" % r, file=sys.stderr) sys.exit(1) else: sys.exit(1) subcmds/help.py0100644 0000000 0000000 00000012201 13025567015 012547 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
# See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import re import sys from formatter import AbstractFormatter, DumbWriter from color import Coloring from command import PagedCommand, MirrorSafeCommand, GitcAvailableCommand, GitcClientCommand import gitc_utils class Help(PagedCommand, MirrorSafeCommand): common = False helpSummary = "Display detailed help on a command" helpUsage = """ %prog [--all|command] """ helpDescription = """ Displays detailed usage information about a command. """ def _PrintAllCommands(self): print('usage: repo COMMAND [ARGS]') print('The complete list of recognized repo commands are:') commandNames = list(sorted(self.commands)) maxlen = 0 for name in commandNames: maxlen = max(maxlen, len(name)) fmt = ' %%-%ds %%s' % maxlen for name in commandNames: command = self.commands[name] try: summary = command.helpSummary.strip() except AttributeError: summary = '' print(fmt % (name, summary)) print("See 'repo help ' for more information on a " 'specific command.') def _PrintCommonCommands(self): print('usage: repo COMMAND [ARGS]') print('The most commonly used repo commands are:') def gitc_supported(cmd): if not isinstance(cmd, GitcAvailableCommand) and not isinstance(cmd, GitcClientCommand): return True if self.manifest.isGitcClient: return True if isinstance(cmd, GitcClientCommand): return False if gitc_utils.get_gitc_manifest_dir(): return True return False commandNames = list(sorted([name for name, command in self.commands.items() if command.common and gitc_supported(command)])) maxlen = 0 for name in commandNames: maxlen = max(maxlen, len(name)) fmt = ' %%-%ds %%s' % maxlen for name in commandNames: command = self.commands[name] try: summary = command.helpSummary.strip() except AttributeError: summary = '' print(fmt % (name, summary)) print( "See 'repo help ' for more information on a specific command.\n" "See 'repo help --all' for a complete list of recognized commands.") def _PrintCommandHelp(self, cmd): class _Out(Coloring): def __init__(self, gc): Coloring.__init__(self, gc, 'help') self.heading = self.printer('heading', attr='bold') self.wrap = AbstractFormatter(DumbWriter()) def _PrintSection(self, heading, bodyAttr): try: body = getattr(cmd, bodyAttr) except AttributeError: return if body == '' or body is None: return self.nl() self.heading('%s', heading) self.nl() self.heading('%s', ''.ljust(len(heading), '-')) self.nl() me = 'repo %s' % cmd.NAME body = body.strip() body = body.replace('%prog', me) asciidoc_hdr = re.compile(r'^\n?([^\n]{1,})\n([=~-]{2,})$') for para in body.split("\n\n"): if para.startswith(' '): self.write('%s', para) self.nl() self.nl() continue m = asciidoc_hdr.match(para) if m: title = m.group(1) section_type = m.group(2) if section_type[0] in ('=', '-'): p = self.heading else: def _p(fmt, *args): self.write(' ') self.heading(fmt, *args) p = _p p('%s', title) self.nl() p('%s', ''.ljust(len(title), section_type[0])) self.nl() continue self.wrap.add_flowing_data(para) self.wrap.end_paragraph(1) self.wrap.end_paragraph(0) out = _Out(self.manifest.globalConfig) out._PrintSection('Summary', 'helpSummary') cmd.OptionParser.print_help() out._PrintSection('Description', 'helpDescription') def _Options(self, p): p.add_option('-a', '--all', dest='show_all', action='store_true', help='show the complete list of commands') def Execute(self, opt, args): if len(args) == 0: if opt.show_all: self._PrintAllCommands() else: self._PrintCommonCommands() elif len(args) == 
1: name = args[0] try: cmd = self.commands[name] except KeyError: print("repo: '%s' is not a repo command." % name, file=sys.stderr) sys.exit(1) cmd.manifest = self.manifest self._PrintCommandHelp(cmd) else: self._PrintCommandHelp(self) subcmds/info.py0100644 0000000 0000000 00000013664 13025567015 012570 0ustar000000000 0000000 # # Copyright (C) 2012 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from command import PagedCommand from color import Coloring from error import NoSuchProjectError from git_refs import R_M class _Coloring(Coloring): def __init__(self, config): Coloring.__init__(self, config, "status") class Info(PagedCommand): common = True helpSummary = "Get info on the manifest branch, current branch or unmerged branches" helpUsage = "%prog [-dl] [-o [-b]] [...]" def _Options(self, p): p.add_option('-d', '--diff', dest='all', action='store_true', help="show full info and commit diff including remote branches") p.add_option('-o', '--overview', dest='overview', action='store_true', help='show overview of all local commits') p.add_option('-b', '--current-branch', dest="current_branch", action="store_true", help="consider only checked out branches") p.add_option('-l', '--local-only', dest="local", action="store_true", help="Disable all remote operations") def Execute(self, opt, args): self.out = _Coloring(self.manifest.globalConfig) self.heading = self.out.printer('heading', attr = 'bold') self.headtext = self.out.printer('headtext', fg = 'yellow') self.redtext = self.out.printer('redtext', fg = 'red') self.sha = self.out.printer("sha", fg = 'yellow') self.text = self.out.nofmt_printer('text') self.dimtext = self.out.printer('dimtext', attr = 'dim') self.opt = opt manifestConfig = self.manifest.manifestProject.config mergeBranch = manifestConfig.GetBranch("default").merge manifestGroups = (manifestConfig.GetString('manifest.groups') or 'all,-notdefault') self.heading("Manifest branch: ") if self.manifest.default.revisionExpr: self.headtext(self.manifest.default.revisionExpr) self.out.nl() self.heading("Manifest merge branch: ") self.headtext(mergeBranch) self.out.nl() self.heading("Manifest groups: ") self.headtext(manifestGroups) self.out.nl() self.printSeparator() if not opt.overview: self.printDiffInfo(args) else: self.printCommitOverview(args) def printSeparator(self): self.text("----------------------------") self.out.nl() def printDiffInfo(self, args): try: projs = self.GetProjects(args) except NoSuchProjectError: return for p in projs: self.heading("Project: ") self.headtext(p.name) self.out.nl() self.heading("Mount path: ") self.headtext(p.worktree) self.out.nl() self.heading("Current revision: ") self.headtext(p.revisionExpr) self.out.nl() localBranches = p.GetBranches().keys() self.heading("Local Branches: ") self.redtext(str(len(localBranches))) if len(localBranches) > 0: self.text(" [") self.text(", ".join(localBranches)) self.text("]") self.out.nl() if self.opt.all: self.findRemoteLocalDiff(p) self.printSeparator() def findRemoteLocalDiff(self, 
project): #Fetch all the latest commits if not self.opt.local: project.Sync_NetworkHalf(quiet=True, current_branch_only=True) logTarget = R_M + self.manifest.manifestProject.config.GetBranch("default").merge bareTmp = project.bare_git._bare project.bare_git._bare = False localCommits = project.bare_git.rev_list( '--abbrev=8', '--abbrev-commit', '--pretty=oneline', logTarget + "..", '--') originCommits = project.bare_git.rev_list( '--abbrev=8', '--abbrev-commit', '--pretty=oneline', ".." + logTarget, '--') project.bare_git._bare = bareTmp self.heading("Local Commits: ") self.redtext(str(len(localCommits))) self.dimtext(" (on current branch)") self.out.nl() for c in localCommits: split = c.split() self.sha(split[0] + " ") self.text(" ".join(split[1:])) self.out.nl() self.printSeparator() self.heading("Remote Commits: ") self.redtext(str(len(originCommits))) self.out.nl() for c in originCommits: split = c.split() self.sha(split[0] + " ") self.text(" ".join(split[1:])) self.out.nl() def printCommitOverview(self, args): all_branches = [] for project in self.GetProjects(args): br = [project.GetUploadableBranch(x) for x in project.GetBranches()] br = [x for x in br if x] if self.opt.current_branch: br = [x for x in br if x.name == project.CurrentBranch] all_branches.extend(br) if not all_branches: return self.out.nl() self.heading('Projects Overview') project = None for branch in all_branches: if project != branch.project: project = branch.project self.out.nl() self.headtext(project.relpath) self.out.nl() commits = branch.commits date = branch.date self.text('%s %-33s (%2d commit%s, %s)' % ( branch.name == project.CurrentBranch and '*' or ' ', branch.name, len(commits), len(commits) != 1 and 's' or '', date)) self.out.nl() for commit in commits: split = commit.split() self.text('{0:38}{1} '.format('','-')) self.sha(split[0] + " ") self.text(" ".join(split[1:])) self.out.nl() subcmds/init.py0100644 0000000 0000000 00000034123 13025567015 012571 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import os import platform import re import shutil import sys from pyversion import is_python3 if is_python3(): import urllib.parse else: import imp import urlparse urllib = imp.new_module('urllib') urllib.parse = urlparse from color import Coloring from command import InteractiveCommand, MirrorSafeCommand from error import ManifestParseError from project import SyncBuffer from git_config import GitConfig from git_command import git_require, MIN_GIT_VERSION class Init(InteractiveCommand, MirrorSafeCommand): common = True helpSummary = "Initialize repo in the current directory" helpUsage = """ %prog [options] """ helpDescription = """ The '%prog' command is run once to install and initialize repo. The latest repo source code and manifest collection is downloaded from the server and is installed in the .repo/ directory in the current working directory. 
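For example, a typical first run might look like (the URL below is a
placeholder, not a real manifest server):

  repo init -u https://gerrit.example.com/platform/manifest -b master

after which 'repo sync' fetches the projects listed in the selected
manifest.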
The optional -b argument can be used to select the manifest branch to checkout and use. If no branch is specified, master is assumed. The optional -m argument can be used to specify an alternate manifest to be used. If no manifest is specified, the manifest default.xml will be used. The --reference option can be used to point to a directory that has the content of a --mirror sync. This will make the working directory use as much data as possible from the local reference directory when fetching from the server. This will make the sync go a lot faster by reducing data traffic on the network. The --no-clone-bundle option disables any attempt to use $URL/clone.bundle to bootstrap a new Git repository from a resumeable bundle file on a content delivery network. This may be necessary if there are problems with the local Python HTTP client or proxy configuration, but the Git binary works. Switching Manifest Branches --------------------------- To switch to another manifest branch, `repo init -b otherbranch` may be used in an existing client. However, as this only updates the manifest, a subsequent `repo sync` (or `repo sync -d`) is necessary to update the working directory files. """ def _Options(self, p): # Logging g = p.add_option_group('Logging options') g.add_option('-q', '--quiet', dest="quiet", action="store_true", default=False, help="be quiet") # Manifest g = p.add_option_group('Manifest options') g.add_option('-u', '--manifest-url', dest='manifest_url', help='manifest repository location', metavar='URL') g.add_option('-b', '--manifest-branch', dest='manifest_branch', help='manifest branch or revision', metavar='REVISION') g.add_option('-m', '--manifest-name', dest='manifest_name', default='default.xml', help='initial manifest file', metavar='NAME.xml') g.add_option('--mirror', dest='mirror', action='store_true', help='create a replica of the remote repositories ' 'rather than a client working directory') g.add_option('--reference', dest='reference', help='location of mirror directory', metavar='DIR') g.add_option('--depth', type='int', default=None, dest='depth', help='create a shallow clone with given depth; see git clone') g.add_option('--archive', dest='archive', action='store_true', help='checkout an archive instead of a git repository for ' 'each project. 
See git archive.') g.add_option('-g', '--groups', dest='groups', default='default', help='restrict manifest projects to ones with specified ' 'group(s) [default|all|G1,G2,G3|G4,-G5,-G6]', metavar='GROUP') g.add_option('-p', '--platform', dest='platform', default='auto', help='restrict manifest projects to ones with a specified ' 'platform group [auto|all|none|linux|darwin|...]', metavar='PLATFORM') g.add_option('--no-clone-bundle', dest='no_clone_bundle', action='store_true', help='disable use of /clone.bundle on HTTP/HTTPS') # Tool g = p.add_option_group('repo Version options') g.add_option('--repo-url', dest='repo_url', help='repo repository location', metavar='URL') g.add_option('--repo-branch', dest='repo_branch', help='repo branch or revision', metavar='REVISION') g.add_option('--no-repo-verify', dest='no_repo_verify', action='store_true', help='do not verify repo source code') # Other g = p.add_option_group('Other options') g.add_option('--config-name', dest='config_name', action="store_true", default=False, help='Always prompt for name/e-mail') def _RegisteredEnvironmentOptions(self): return {'REPO_MANIFEST_URL': 'manifest_url', 'REPO_MIRROR_LOCATION': 'reference'} def _SyncManifest(self, opt): m = self.manifest.manifestProject is_new = not m.Exists if is_new: if not opt.manifest_url: print('fatal: manifest url (-u) is required.', file=sys.stderr) sys.exit(1) if not opt.quiet: print('Get %s' % GitConfig.ForUser().UrlInsteadOf(opt.manifest_url), file=sys.stderr) # The manifest project object doesn't keep track of the path on the # server where this git is located, so let's save that here. mirrored_manifest_git = None if opt.reference: manifest_git_path = urllib.parse.urlparse(opt.manifest_url).path[1:] mirrored_manifest_git = os.path.join(opt.reference, manifest_git_path) if not mirrored_manifest_git.endswith(".git"): mirrored_manifest_git += ".git" if not os.path.exists(mirrored_manifest_git): mirrored_manifest_git = os.path.join(opt.reference + '/.repo/manifests.git') m._InitGitDir(mirror_git=mirrored_manifest_git) if opt.manifest_branch: m.revisionExpr = opt.manifest_branch else: m.revisionExpr = 'refs/heads/master' else: if opt.manifest_branch: m.revisionExpr = opt.manifest_branch else: m.PreSync() if opt.manifest_url: r = m.GetRemote(m.remote.name) r.url = opt.manifest_url r.ResetFetch() r.Save() groups = re.split(r'[,\s]+', opt.groups) all_platforms = ['linux', 'darwin', 'windows'] platformize = lambda x: 'platform-' + x if opt.platform == 'auto': if (not opt.mirror and not m.config.GetString('repo.mirror') == 'true'): groups.append(platformize(platform.system().lower())) elif opt.platform == 'all': groups.extend(map(platformize, all_platforms)) elif opt.platform in all_platforms: groups.append(platformize(opt.platform)) elif opt.platform != 'none': print('fatal: invalid platform flag', file=sys.stderr) sys.exit(1) groups = [x for x in groups if x] groupstr = ','.join(groups) if opt.platform == 'auto' and groupstr == 'default,platform-' + platform.system().lower(): groupstr = None m.config.SetString('manifest.groups', groupstr) if opt.reference: m.config.SetString('repo.reference', opt.reference) if opt.archive: if is_new: m.config.SetString('repo.archive', 'true') else: print('fatal: --archive is only supported when initializing a new ' 'workspace.', file=sys.stderr) print('Either delete the .repo folder in this workspace, or initialize ' 'in another location.', file=sys.stderr) sys.exit(1) if opt.mirror: if is_new: m.config.SetString('repo.mirror', 'true') else: print('fatal: 
--mirror is only supported when initializing a new ' 'workspace.', file=sys.stderr) print('Either delete the .repo folder in this workspace, or initialize ' 'in another location.', file=sys.stderr) sys.exit(1) if not m.Sync_NetworkHalf(is_new=is_new, quiet=opt.quiet, clone_bundle=not opt.no_clone_bundle): r = m.GetRemote(m.remote.name) print('fatal: cannot obtain manifest %s' % r.url, file=sys.stderr) # Better delete the manifest git dir if we created it; otherwise next # time (when user fixes problems) we won't go through the "is_new" logic. if is_new: shutil.rmtree(m.gitdir) sys.exit(1) if opt.manifest_branch: m.MetaBranchSwitch() syncbuf = SyncBuffer(m.config) m.Sync_LocalHalf(syncbuf) syncbuf.Finish() if is_new or m.CurrentBranch is None: if not m.StartBranch('default'): print('fatal: cannot create default in manifest', file=sys.stderr) sys.exit(1) def _LinkManifest(self, name): if not name: print('fatal: manifest name (-m) is required.', file=sys.stderr) sys.exit(1) try: self.manifest.Link(name) except ManifestParseError as e: print("fatal: manifest '%s' not available" % name, file=sys.stderr) print('fatal: %s' % str(e), file=sys.stderr) sys.exit(1) def _Prompt(self, prompt, value): sys.stdout.write('%-10s [%s]: ' % (prompt, value)) a = sys.stdin.readline().strip() if a == '': return value return a def _ShouldConfigureUser(self): gc = self.manifest.globalConfig mp = self.manifest.manifestProject # If we don't have local settings, get from global. if not mp.config.Has('user.name') or not mp.config.Has('user.email'): if not gc.Has('user.name') or not gc.Has('user.email'): return True mp.config.SetString('user.name', gc.GetString('user.name')) mp.config.SetString('user.email', gc.GetString('user.email')) print() print('Your identity is: %s <%s>' % (mp.config.GetString('user.name'), mp.config.GetString('user.email'))) print('If you want to change this, please re-run \'repo init\' with --config-name') return False def _ConfigureUser(self): mp = self.manifest.manifestProject while True: print() name = self._Prompt('Your Name', mp.UserName) email = self._Prompt('Your Email', mp.UserEmail) print() print('Your identity is: %s <%s>' % (name, email)) sys.stdout.write('is this correct [y/N]? ') a = sys.stdin.readline().strip().lower() if a in ('yes', 'y', 't', 'true'): break if name != mp.UserName: mp.config.SetString('user.name', name) if email != mp.UserEmail: mp.config.SetString('user.email', email) def _HasColorSet(self, gc): for n in ['ui', 'diff', 'status']: if gc.Has('color.%s' % n): return True return False def _ConfigureColor(self): gc = self.manifest.globalConfig if self._HasColorSet(gc): return class _Test(Coloring): def __init__(self): Coloring.__init__(self, gc, 'test color display') self._on = True out = _Test() print() print("Testing colorized output (for 'repo diff', 'repo status'):") for c in ['black', 'red', 'green', 'yellow', 'blue', 'magenta', 'cyan']: out.write(' ') out.printer(fg=c)(' %-6s ', c) out.write(' ') out.printer(fg='white', bg='black')(' %s ' % 'white') out.nl() for c in ['bold', 'dim', 'ul', 'reverse']: out.write(' ') out.printer(fg='black', attr=c)(' %-6s ', c) out.nl() sys.stdout.write('Enable color display in this user account (y/N)? ') a = sys.stdin.readline().strip().lower() if a in ('y', 'yes', 't', 'true', 'on'): gc.SetString('color.ui', 'auto') def _ConfigureDepth(self, opt): """Configure the depth we'll sync down. Args: opt: Options from optparse. We care about opt.depth. """ # Opt.depth will be non-None if user actually passed --depth to repo init. 
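    # In short: a positive --depth value is stored as a string under
    # 'repo.depth' in the manifest project's config, while zero or negative
    # values store None, which clears any previously configured depth.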
if opt.depth is not None: if opt.depth > 0: # Positive values will set the depth. depth = str(opt.depth) else: # Negative numbers will clear the depth; passing None to SetString # will do that. depth = None # We store the depth in the main manifest project. self.manifest.manifestProject.config.SetString('repo.depth', depth) def _DisplayResult(self): if self.manifest.IsMirror: init_type = 'mirror ' else: init_type = '' print() print('repo %shas been initialized in %s' % (init_type, self.manifest.topdir)) current_dir = os.getcwd() if current_dir != self.manifest.topdir: print('If this is not the directory in which you want to initialize ' 'repo, please run:') print(' rm -r %s/.repo' % self.manifest.topdir) print('and try again.') def Execute(self, opt, args): git_require(MIN_GIT_VERSION, fail=True) if opt.reference: opt.reference = os.path.expanduser(opt.reference) # Check this here, else manifest will be tagged "not new" and init won't be # possible anymore without removing the .repo/manifests directory. if opt.archive and opt.mirror: print('fatal: --mirror and --archive cannot be used together.', file=sys.stderr) sys.exit(1) self._SyncManifest(opt) self._LinkManifest(opt.manifest_name) if os.isatty(0) and os.isatty(1) and not self.manifest.IsMirror: if opt.config_name or self._ShouldConfigureUser(): self._ConfigureUser() self._ConfigureColor() self._ConfigureDepth(opt) self._DisplayResult() subcmds/list.py0100644 0000000 0000000 00000005700 13025567015 012600 0ustar000000000 0000000 # # Copyright (C) 2011 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import sys from command import Command, MirrorSafeCommand class List(Command, MirrorSafeCommand): common = True helpSummary = "List projects and their associated directories" helpUsage = """ %prog [-f] [...] %prog [-f] -r str1 [str2]..." """ helpDescription = """ List all projects; pass '.' to list the project for the cwd. This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'. """ def _Options(self, p): p.add_option('-r', '--regex', dest='regex', action='store_true', help="Filter the project list based on regex or wildcard matching of strings") p.add_option('-g', '--groups', dest='groups', help="Filter the project list based on the groups the project is in") p.add_option('-f', '--fullpath', dest='fullpath', action='store_true', help="Display the full work tree path instead of the relative path") p.add_option('-n', '--name-only', dest='name_only', action='store_true', help="Display only the name of the repository") p.add_option('-p', '--path-only', dest='path_only', action='store_true', help="Display only the path of the repository") def Execute(self, opt, args): """List all projects and the associated directories. This may be possible to do with 'repo forall', but repo newbies have trouble figuring that out. The idea here is that it should be more discoverable. Args: opt: The options. args: Positional args. Can be a list of projects to list, or empty. 
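    For example, a hypothetical invocation such as

      repo list -f -r 'platform/.*'

    prints '<full work tree path> : <project name>' for every project
    matching the pattern.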
""" if opt.fullpath and opt.name_only: print('error: cannot combine -f and -n', file=sys.stderr) sys.exit(1) if not opt.regex: projects = self.GetProjects(args, groups=opt.groups) else: projects = self.FindProjects(args) def _getpath(x): if opt.fullpath: return x.worktree return x.relpath lines = [] for project in projects: if opt.name_only and not opt.path_only: lines.append("%s" % ( project.name)) elif opt.path_only and not opt.name_only: lines.append("%s" % (_getpath(project))) else: lines.append("%s : %s" % (_getpath(project), project.name)) lines.sort() print('\n'.join(lines)) subcmds/manifest.py0100644 0000000 0000000 00000005322 13025567015 013433 0ustar000000000 0000000 # # Copyright (C) 2009 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import os import sys from command import PagedCommand class Manifest(PagedCommand): common = False helpSummary = "Manifest inspection utility" helpUsage = """ %prog [-o {-|NAME.xml} [-r]] """ _helpDescription = """ With the -o option, exports the current manifest for inspection. The manifest and (if present) local_manifest.xml are combined together to produce a single manifest file. This file can be stored in a Git repository for use during future 'repo init' invocations. """ @property def helpDescription(self): helptext = self._helpDescription + '\n' r = os.path.dirname(__file__) r = os.path.dirname(r) fd = open(os.path.join(r, 'docs', 'manifest-format.txt')) for line in fd: helptext += line fd.close() return helptext def _Options(self, p): p.add_option('-r', '--revision-as-HEAD', dest='peg_rev', action='store_true', help='Save revisions as current HEAD') p.add_option('--suppress-upstream-revision', dest='peg_rev_upstream', default=True, action='store_false', help='If in -r mode, do not write the upstream field. ' 'Only of use if the branch names for a sha1 manifest are ' 'sensitive.') p.add_option('-o', '--output-file', dest='output_file', default='-', help='File to save the manifest to', metavar='-|NAME.xml') def _Output(self, opt): if opt.output_file == '-': fd = sys.stdout else: fd = open(opt.output_file, 'w') self.manifest.Save(fd, peg_rev = opt.peg_rev, peg_rev_upstream = opt.peg_rev_upstream) fd.close() if opt.output_file != '-': print('Saved manifest to %s' % opt.output_file, file=sys.stderr) def Execute(self, opt, args): if args: self.Usage() if opt.output_file is not None: self._Output(opt) return print('error: no operation to perform', file=sys.stderr) print('error: see repo help manifest', file=sys.stderr) sys.exit(1) subcmds/overview.py0100644 0000000 0000000 00000005247 13025567015 013501 0ustar000000000 0000000 # # Copyright (C) 2012 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. 
# You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function from color import Coloring from command import PagedCommand class Overview(PagedCommand): common = True helpSummary = "Display overview of unmerged project branches" helpUsage = """ %prog [--current-branch] [...] """ helpDescription = """ The '%prog' command is used to display an overview of the projects branches, and list any local commits that have not yet been merged into the project. The -b/--current-branch option can be used to restrict the output to only branches currently checked out in each project. By default, all branches are displayed. """ def _Options(self, p): p.add_option('-b', '--current-branch', dest="current_branch", action="store_true", help="Consider only checked out branches") def Execute(self, opt, args): all_branches = [] for project in self.GetProjects(args): br = [project.GetUploadableBranch(x) for x in project.GetBranches()] br = [x for x in br if x] if opt.current_branch: br = [x for x in br if x.name == project.CurrentBranch] all_branches.extend(br) if not all_branches: return class Report(Coloring): def __init__(self, config): Coloring.__init__(self, config, 'status') self.project = self.printer('header', attr='bold') self.text = self.printer('text') out = Report(all_branches[0].project.config) out.text("Deprecated. See repo info -o.") out.nl() out.project('Projects Overview') out.nl() project = None for branch in all_branches: if project != branch.project: project = branch.project out.nl() out.project('project %s/' % project.relpath) out.nl() commits = branch.commits date = branch.date print('%s %-33s (%2d commit%s, %s)' % ( branch.name == project.CurrentBranch and '*' or ' ', branch.name, len(commits), len(commits) != 1 and 's' or ' ', date)) for commit in commits: print('%-35s - %s' % ('', commit)) subcmds/prune.py0100644 0000000 0000000 00000003400 13025567015 012751 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function from color import Coloring from command import PagedCommand class Prune(PagedCommand): common = True helpSummary = "Prune (delete) already merged topics" helpUsage = """ %prog [...] 
""" def Execute(self, opt, args): all_branches = [] for project in self.GetProjects(args): all_branches.extend(project.PruneHeads()) if not all_branches: return class Report(Coloring): def __init__(self, config): Coloring.__init__(self, config, 'status') self.project = self.printer('header', attr='bold') out = Report(all_branches[0].project.config) out.project('Pending Branches') out.nl() project = None for branch in all_branches: if project != branch.project: project = branch.project out.nl() out.project('project %s/' % project.relpath) out.nl() commits = branch.commits date = branch.date print('%s %-33s (%2d commit%s, %s)' % ( branch.name == project.CurrentBranch and '*' or ' ', branch.name, len(commits), len(commits) != 1 and 's' or ' ', date)) subcmds/rebase.py0100644 0000000 0000000 00000011274 13025567015 013071 0ustar000000000 0000000 # # Copyright (C) 2010 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import sys from command import Command from git_command import GitCommand class Rebase(Command): common = True helpSummary = "Rebase local branches on upstream branch" helpUsage = """ %prog {[...] | -i ...} """ helpDescription = """ '%prog' uses git rebase to move local changes in the current topic branch to the HEAD of the upstream history, useful when you have made commits in a topic branch but need to incorporate new upstream changes "underneath" them. """ def _Options(self, p): p.add_option('-i', '--interactive', dest="interactive", action="store_true", help="interactive rebase (single project only)") p.add_option('-f', '--force-rebase', dest='force_rebase', action='store_true', help='Pass --force-rebase to git rebase') p.add_option('--no-ff', dest='no_ff', action='store_true', help='Pass --no-ff to git rebase') p.add_option('-q', '--quiet', dest='quiet', action='store_true', help='Pass --quiet to git rebase') p.add_option('--autosquash', dest='autosquash', action='store_true', help='Pass --autosquash to git rebase') p.add_option('--whitespace', dest='whitespace', action='store', metavar='WS', help='Pass --whitespace to git rebase') p.add_option('--auto-stash', dest='auto_stash', action='store_true', help='Stash local modifications before starting') p.add_option('-m', '--onto-manifest', dest='onto_manifest', action='store_true', help='Rebase onto the manifest version instead of upstream ' 'HEAD. 
This helps to make sure the local tree stays ' 'consistent if you previously synced to a manifest.') def Execute(self, opt, args): all_projects = self.GetProjects(args) one_project = len(all_projects) == 1 if opt.interactive and not one_project: print('error: interactive rebase not supported with multiple projects', file=sys.stderr) if len(args) == 1: print('note: project %s is mapped to more than one path' % (args[0],), file=sys.stderr) return -1 for project in all_projects: cb = project.CurrentBranch if not cb: if one_project: print("error: project %s has a detached HEAD" % project.relpath, file=sys.stderr) return -1 # ignore branches with detatched HEADs continue upbranch = project.GetBranch(cb) if not upbranch.LocalMerge: if one_project: print("error: project %s does not track any remote branches" % project.relpath, file=sys.stderr) return -1 # ignore branches without remotes continue args = ["rebase"] if opt.whitespace: args.append('--whitespace=%s' % opt.whitespace) if opt.quiet: args.append('--quiet') if opt.force_rebase: args.append('--force-rebase') if opt.no_ff: args.append('--no-ff') if opt.autosquash: args.append('--autosquash') if opt.interactive: args.append("-i") if opt.onto_manifest: args.append('--onto') args.append(project.revisionExpr) args.append(upbranch.LocalMerge) print('# %s: rebasing %s -> %s' % (project.relpath, cb, upbranch.LocalMerge), file=sys.stderr) needs_stash = False if opt.auto_stash: stash_args = ["update-index", "--refresh", "-q"] if GitCommand(project, stash_args).Wait() != 0: needs_stash = True # Dirty index, requires stash... stash_args = ["stash"] if GitCommand(project, stash_args).Wait() != 0: return -1 if GitCommand(project, args).Wait() != 0: return -1 if needs_stash: stash_args.append('pop') stash_args.append('--quiet') if GitCommand(project, stash_args).Wait() != 0: return -1 subcmds/selfupdate.py0100644 0000000 0000000 00000003665 13025567015 013771 0ustar000000000 0000000 # # Copyright (C) 2009 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function from optparse import SUPPRESS_HELP import sys from command import Command, MirrorSafeCommand from subcmds.sync import _PostRepoUpgrade from subcmds.sync import _PostRepoFetch class Selfupdate(Command, MirrorSafeCommand): common = False helpSummary = "Update repo to the latest version" helpUsage = """ %prog """ helpDescription = """ The '%prog' command upgrades repo to the latest version, if a newer version is available. Normally this is done automatically by 'repo sync' and does not need to be performed by an end-user. 
""" def _Options(self, p): g = p.add_option_group('repo Version options') g.add_option('--no-repo-verify', dest='no_repo_verify', action='store_true', help='do not verify repo source code') g.add_option('--repo-upgraded', dest='repo_upgraded', action='store_true', help=SUPPRESS_HELP) def Execute(self, opt, args): rp = self.manifest.repoProject rp.PreSync() if opt.repo_upgraded: _PostRepoUpgrade(self.manifest) else: if not rp.Sync_NetworkHalf(): print("error: can't update repo", file=sys.stderr) sys.exit(1) rp.bare_git.gc('--auto') _PostRepoFetch(rp, no_repo_verify = opt.no_repo_verify, verbose = True) subcmds/smartsync.py0100644 0000000 0000000 00000002003 13025567015 013641 0ustar000000000 0000000 # # Copyright (C) 2010 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from subcmds.sync import Sync class Smartsync(Sync): common = True helpSummary = "Update working tree to the latest known good revision" helpUsage = """ %prog [...] """ helpDescription = """ The '%prog' command is a shortcut for sync -s. """ def _Options(self, p): Sync._Options(self, p, show_smart=False) def Execute(self, opt, args): opt.smart_sync = True Sync.Execute(self, opt, args) subcmds/stage.py0100644 0000000 0000000 00000005667 13025567015 012744 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import sys from color import Coloring from command import InteractiveCommand from git_command import GitCommand class _ProjectList(Coloring): def __init__(self, gc): Coloring.__init__(self, gc, 'interactive') self.prompt = self.printer('prompt', fg='blue', attr='bold') self.header = self.printer('header', attr='bold') self.help = self.printer('help', fg='red', attr='bold') class Stage(InteractiveCommand): common = True helpSummary = "Stage file(s) for commit" helpUsage = """ %prog -i [...] """ helpDescription = """ The '%prog' command stages files to prepare the next commit. 
""" def _Options(self, p): p.add_option('-i', '--interactive', dest='interactive', action='store_true', help='use interactive staging') def Execute(self, opt, args): if opt.interactive: self._Interactive(opt, args) else: self.Usage() def _Interactive(self, opt, args): all_projects = [p for p in self.GetProjects(args) if p.IsDirty()] if not all_projects: print('no projects have uncommitted modifications', file=sys.stderr) return out = _ProjectList(self.manifest.manifestProject.config) while True: out.header(' %s', 'project') out.nl() for i in range(len(all_projects)): p = all_projects[i] out.write('%3d: %s', i + 1, p.relpath + '/') out.nl() out.nl() out.write('%3d: (', 0) out.prompt('q') out.write('uit)') out.nl() out.prompt('project> ') try: a = sys.stdin.readline() except KeyboardInterrupt: out.nl() break if a == '': out.nl() break a = a.strip() if a.lower() in ('q', 'quit', 'exit'): break if not a: continue try: a_index = int(a) except ValueError: a_index = None if a_index is not None: if a_index == 0: break if 0 < a_index and a_index <= len(all_projects): _AddI(all_projects[a_index - 1]) continue projects = [p for p in all_projects if a in [p.name, p.relpath]] if len(projects) == 1: _AddI(projects[0]) continue print('Bye.') def _AddI(project): p = GitCommand(project, ['add', '--interactive'], bare=False) p.Wait() subcmds/start.py0100644 0000000 0000000 00000007614 13025567015 012770 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import os import sys from command import Command from git_config import IsId from git_command import git import gitc_utils from progress import Progress from project import SyncBuffer class Start(Command): common = True helpSummary = "Start a new branch for development" helpUsage = """ %prog [--all | ...] """ helpDescription = """ '%prog' begins a new branch of development, starting from the revision specified in the manifest. """ def _Options(self, p): p.add_option('--all', dest='all', action='store_true', help='begin branch in all projects') def Execute(self, opt, args): if not args: self.Usage() nb = args[0] if not git.check_ref_format('heads/%s' % nb): print("error: '%s' is not a valid name" % nb, file=sys.stderr) sys.exit(1) err = [] projects = [] if not opt.all: projects = args[1:] if len(projects) < 1: projects = ['.',] # start it in the local project by default all_projects = self.GetProjects(projects, missing_ok=bool(self.gitc_manifest)) # This must happen after we find all_projects, since GetProjects may need # the local directory, which will disappear once we save the GITC manifest. if self.gitc_manifest: gitc_projects = self.GetProjects(projects, manifest=self.gitc_manifest, missing_ok=True) for project in gitc_projects: if project.old_revision: project.already_synced = True else: project.already_synced = False project.old_revision = project.revisionExpr project.revisionExpr = None # Save the GITC manifest. 
gitc_utils.save_manifest(self.gitc_manifest) # Make sure we have a valid CWD if not os.path.exists(os.getcwd()): os.chdir(self.manifest.topdir) pm = Progress('Starting %s' % nb, len(all_projects)) for project in all_projects: pm.update() if self.gitc_manifest: gitc_project = self.gitc_manifest.paths[project.relpath] # Sync projects that have not been opened. if not gitc_project.already_synced: proj_localdir = os.path.join(self.gitc_manifest.gitc_client_dir, project.relpath) project.worktree = proj_localdir if not os.path.exists(proj_localdir): os.makedirs(proj_localdir) project.Sync_NetworkHalf() sync_buf = SyncBuffer(self.manifest.manifestProject.config) project.Sync_LocalHalf(sync_buf) project.revisionId = gitc_project.old_revision # If the current revision is a specific SHA1 then we can't push back # to it; so substitute with dest_branch if defined, or with manifest # default revision instead. branch_merge = '' if IsId(project.revisionExpr): if project.dest_branch: branch_merge = project.dest_branch else: branch_merge = self.manifest.default.revisionExpr if not project.StartBranch(nb, branch_merge=branch_merge): err.append(project) pm.end() if err: for p in err: print("error: %s/: cannot start %s" % (p.relpath, nb), file=sys.stderr) sys.exit(1) subcmds/status.py0100644 0000000 0000000 00000014545 13025567015 013157 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from command import PagedCommand try: import threading as _threading except ImportError: import dummy_threading as _threading import glob import itertools import os from color import Coloring class Status(PagedCommand): common = True helpSummary = "Show the working tree status" helpUsage = """ %prog [...] """ helpDescription = """ '%prog' compares the working tree to the staging area (aka index), and the most recent commit on this branch (HEAD), in each project specified. A summary is displayed, one line per file where there is a difference between these three states. The -j/--jobs option can be used to run multiple status queries in parallel. The -o/--orphans option can be used to show objects that are in the working directory, but not associated with a repo project. This includes unmanaged top-level files and directories, but also includes deeper items. For example, if dir/subdir/proj1 and dir/subdir/proj2 are repo projects, dir/subdir/proj3 will be shown if it is not known to repo. Status Display -------------- The status display is organized into three columns of information, for example if the file 'subcmds/status.py' is modified in the project 'repo' on branch 'devwork': project repo/ branch devwork -m subcmds/status.py The first column explains how the staging area (index) differs from the last commit (HEAD). 
Its values are always displayed in upper case and have the following meanings: -: no difference A: added (not in HEAD, in index ) M: modified ( in HEAD, in index, different content ) D: deleted ( in HEAD, not in index ) R: renamed (not in HEAD, in index, path changed ) C: copied (not in HEAD, in index, copied from another) T: mode changed ( in HEAD, in index, same content ) U: unmerged; conflict resolution required The second column explains how the working directory differs from the index. Its values are always displayed in lower case and have the following meanings: -: new / unknown (not in index, in work tree ) m: modified ( in index, in work tree, modified ) d: deleted ( in index, not in work tree ) """ def _Options(self, p): p.add_option('-j', '--jobs', dest='jobs', action='store', type='int', default=2, help="number of projects to check simultaneously") p.add_option('-o', '--orphans', dest='orphans', action='store_true', help="include objects in working directory outside of repo projects") def _StatusHelper(self, project, clean_counter, sem): """Obtains the status for a specific project. Obtains the status for a project, redirecting the output to the specified object. It will release the semaphore when done. Args: project: Project to get status of. clean_counter: Counter for clean projects. sem: Semaphore, will call release() when complete. output: Where to output the status. """ try: state = project.PrintWorkTreeStatus() if state == 'CLEAN': next(clean_counter) finally: sem.release() def _FindOrphans(self, dirs, proj_dirs, proj_dirs_parents, outstring): """find 'dirs' that are present in 'proj_dirs_parents' but not in 'proj_dirs'""" status_header = ' --\t' for item in dirs: if not os.path.isdir(item): outstring.append(''.join([status_header, item])) continue if item in proj_dirs: continue if item in proj_dirs_parents: self._FindOrphans(glob.glob('%s/.*' % item) + glob.glob('%s/*' % item), proj_dirs, proj_dirs_parents, outstring) continue outstring.append(''.join([status_header, item, '/'])) def Execute(self, opt, args): all_projects = self.GetProjects(args) counter = itertools.count() if opt.jobs == 1: for project in all_projects: state = project.PrintWorkTreeStatus() if state == 'CLEAN': next(counter) else: sem = _threading.Semaphore(opt.jobs) threads = [] for project in all_projects: sem.acquire() t = _threading.Thread(target=self._StatusHelper, args=(project, counter, sem)) threads.append(t) t.daemon = True t.start() for t in threads: t.join() if len(all_projects) == next(counter): print('nothing to commit (working directory clean)') if opt.orphans: proj_dirs = set() proj_dirs_parents = set() for project in self.GetProjects(None, missing_ok=True): proj_dirs.add(project.relpath) (head, _tail) = os.path.split(project.relpath) while head != "": proj_dirs_parents.add(head) (head, _tail) = os.path.split(head) proj_dirs.add('.repo') class StatusColoring(Coloring): def __init__(self, config): Coloring.__init__(self, config, 'status') self.project = self.printer('header', attr = 'bold') self.untracked = self.printer('untracked', fg = 'red') orig_path = os.getcwd() try: os.chdir(self.manifest.topdir) outstring = [] self._FindOrphans(glob.glob('.*') + glob.glob('*'), proj_dirs, proj_dirs_parents, outstring) if outstring: output = StatusColoring(self.manifest.globalConfig) output.project('Objects not within a project (orphans)') output.nl() for entry in outstring: output.untracked(entry) output.nl() else: print('No orphan files or directories') finally: # Restore CWD. 
os.chdir(orig_path) subcmds/sync.py0100644 0000000 0000000 00000107544 13025567015 012612 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import json import netrc from optparse import SUPPRESS_HELP import os import re import shutil import socket import subprocess import sys import tempfile import time from pyversion import is_python3 if is_python3(): import http.cookiejar as cookielib import urllib.error import urllib.parse import urllib.request import xmlrpc.client else: import cookielib import imp import urllib2 import urlparse import xmlrpclib urllib = imp.new_module('urllib') urllib.error = urllib2 urllib.parse = urlparse urllib.request = urllib2 xmlrpc = imp.new_module('xmlrpc') xmlrpc.client = xmlrpclib try: import threading as _threading except ImportError: import dummy_threading as _threading try: import resource def _rlimit_nofile(): return resource.getrlimit(resource.RLIMIT_NOFILE) except ImportError: def _rlimit_nofile(): return (256, 256) try: import multiprocessing except ImportError: multiprocessing = None from git_command import GIT, git_require from git_config import GetUrlCookieFile from git_refs import R_HEADS, HEAD import gitc_utils from project import Project from project import RemoteSpec from command import Command, MirrorSafeCommand from error import RepoChangedException, GitError, ManifestParseError from project import SyncBuffer from progress import Progress from wrapper import Wrapper from manifest_xml import GitcManifest _ONE_DAY_S = 24 * 60 * 60 class _FetchError(Exception): """Internal error thrown in _FetchHelper() when we don't want stack trace.""" pass class Sync(Command, MirrorSafeCommand): jobs = 1 common = True helpSummary = "Update working tree to the latest revision" helpUsage = """ %prog [...] """ helpDescription = """ The '%prog' command synchronizes local project directories with the remote repositories specified in the manifest. If a local project does not yet exist, it will clone a new local directory from the remote repository and set up tracking branches as specified in the manifest. If the local project already exists, '%prog' will update the remote branches and rebase any new local changes on top of the new remote changes. '%prog' will synchronize all projects listed at the command line. Projects can be specified either by name, or by a relative or absolute path to the project's local directory. If no projects are specified, '%prog' will synchronize all projects listed in the manifest. The -d/--detach option can be used to switch specified projects back to the manifest revision. This option is especially helpful if the project is currently on a topic branch, but the manifest revision is temporarily needed. The -s/--smart-sync option can be used to sync to a known good build as specified by the manifest-server element in the current manifest. The -t/--smart-tag option is similar and allows you to specify a custom tag/label. 
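For example, to sync to the latest known good build, or to the manifest the
server provides for a known tag:

  %prog -s
  %prog -t TAG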
The -u/--manifest-server-username and -p/--manifest-server-password options can be used to specify a username and password to authenticate with the manifest server when using the -s or -t option. If -u and -p are not specified when using the -s or -t option, '%prog' will attempt to read authentication credentials for the manifest server from the user's .netrc file. '%prog' will not use authentication credentials from -u/-p or .netrc if the manifest server specified in the manifest file already includes credentials. The -f/--force-broken option can be used to proceed with syncing other projects if a project sync fails. The --force-sync option can be used to overwrite existing git directories if they have previously been linked to a different object direcotry. WARNING: This may cause data to be lost since refs may be removed when overwriting. The --no-clone-bundle option disables any attempt to use $URL/clone.bundle to bootstrap a new Git repository from a resumeable bundle file on a content delivery network. This may be necessary if there are problems with the local Python HTTP client or proxy configuration, but the Git binary works. The --fetch-submodules option enables fetching Git submodules of a project from server. The -c/--current-branch option can be used to only fetch objects that are on the branch specified by a project's revision. The --optimized-fetch option can be used to only fetch projects that are fixed to a sha1 revision if the sha1 revision does not already exist locally. The --prune option can be used to remove any refs that no longer exist on the remote. SSH Connections --------------- If at least one project remote URL uses an SSH connection (ssh://, git+ssh://, or user@host:path syntax) repo will automatically enable the SSH ControlMaster option when connecting to that host. This feature permits other projects in the same '%prog' session to reuse the same SSH tunnel, saving connection setup overheads. To disable this behavior on UNIX platforms, set the GIT_SSH environment variable to 'ssh'. For example: export GIT_SSH=ssh %prog Compatibility ~~~~~~~~~~~~~ This feature is automatically disabled on Windows, due to the lack of UNIX domain socket support. This feature is not compatible with url.insteadof rewrites in the user's ~/.gitconfig. '%prog' is currently not able to perform the rewrite early enough to establish the ControlMaster tunnel. If the remote SSH daemon is Gerrit Code Review, version 2.0.10 or later is required to fix a server side protocol bug. """ def _Options(self, p, show_smart=True): try: self.jobs = self.manifest.default.sync_j except ManifestParseError: self.jobs = 1 p.add_option('-f', '--force-broken', dest='force_broken', action='store_true', help="continue sync even if a project fails to sync") p.add_option('--force-sync', dest='force_sync', action='store_true', help="overwrite an existing git directory if it needs to " "point to a different object directory. 
WARNING: this " "may cause loss of data") p.add_option('-l', '--local-only', dest='local_only', action='store_true', help="only update working tree, don't fetch") p.add_option('-n', '--network-only', dest='network_only', action='store_true', help="fetch only, don't update working tree") p.add_option('-d', '--detach', dest='detach_head', action='store_true', help='detach projects back to manifest revision') p.add_option('-c', '--current-branch', dest='current_branch_only', action='store_true', help='fetch only current branch from server') p.add_option('-q', '--quiet', dest='quiet', action='store_true', help='be more quiet') p.add_option('-j', '--jobs', dest='jobs', action='store', type='int', help="projects to fetch simultaneously (default %d)" % self.jobs) p.add_option('-m', '--manifest-name', dest='manifest_name', help='temporary manifest to use for this sync', metavar='NAME.xml') p.add_option('--no-clone-bundle', dest='no_clone_bundle', action='store_true', help='disable use of /clone.bundle on HTTP/HTTPS') p.add_option('-u', '--manifest-server-username', action='store', dest='manifest_server_username', help='username to authenticate with the manifest server') p.add_option('-p', '--manifest-server-password', action='store', dest='manifest_server_password', help='password to authenticate with the manifest server') p.add_option('--fetch-submodules', dest='fetch_submodules', action='store_true', help='fetch submodules from server') p.add_option('--no-tags', dest='no_tags', action='store_true', help="don't fetch tags") p.add_option('--optimized-fetch', dest='optimized_fetch', action='store_true', help='only fetch projects fixed to sha1 if revision does not exist locally') p.add_option('--prune', dest='prune', action='store_true', help='delete refs that no longer exist on the remote') if show_smart: p.add_option('-s', '--smart-sync', dest='smart_sync', action='store_true', help='smart sync using manifest from the latest known good build') p.add_option('-t', '--smart-tag', dest='smart_tag', action='store', help='smart sync using manifest from a known tag') g = p.add_option_group('repo Version options') g.add_option('--no-repo-verify', dest='no_repo_verify', action='store_true', help='do not verify repo source code') g.add_option('--repo-upgraded', dest='repo_upgraded', action='store_true', help=SUPPRESS_HELP) def _FetchProjectList(self, opt, projects, *args, **kwargs): """Main function of the fetch threads when jobs are > 1. Delegates most of the work to _FetchHelper. Args: opt: Program options returned from optparse. See _Options(). projects: Projects to fetch. *args, **kwargs: Remaining arguments to pass to _FetchHelper. See the _FetchHelper docstring for details. """ for project in projects: success = self._FetchHelper(opt, project, *args, **kwargs) if not success and not opt.force_broken: break def _FetchHelper(self, opt, project, lock, fetched, pm, sem, err_event): """Fetch git objects for a single project. Args: opt: Program options returned from optparse. See _Options(). project: Project object for the project to fetch. lock: Lock for accessing objects that are shared amongst multiple _FetchHelper() threads. fetched: set object that we will add project.gitdir to when we're done (with our lock held). pm: Instance of a Project object. We will call pm.update() (with our lock held). sem: We'll release() this semaphore when we exit so that another thread can be started up. err_event: We'll set this event in the case of an error (after printing out info about the error). 
Returns: Whether the fetch was successful. """ # We'll set to true once we've locked the lock. did_lock = False if not opt.quiet: print('Fetching project %s' % project.name) # Encapsulate everything in a try/except/finally so that: # - We always set err_event in the case of an exception. # - We always make sure we call sem.release(). # - We always make sure we unlock the lock if we locked it. try: try: start = time.time() success = project.Sync_NetworkHalf( quiet=opt.quiet, current_branch_only=opt.current_branch_only, force_sync=opt.force_sync, clone_bundle=not opt.no_clone_bundle, no_tags=opt.no_tags, archive=self.manifest.IsArchive, optimized_fetch=opt.optimized_fetch, prune=opt.prune) self._fetch_times.Set(project, time.time() - start) # Lock around all the rest of the code, since printing, updating a set # and Progress.update() are not thread safe. lock.acquire() did_lock = True if not success: err_event.set() print('error: Cannot fetch %s' % project.name, file=sys.stderr) if opt.force_broken: print('warn: --force-broken, continuing to sync', file=sys.stderr) else: raise _FetchError() fetched.add(project.gitdir) pm.update() except _FetchError: pass except Exception as e: print('error: Cannot fetch %s (%s: %s)' \ % (project.name, type(e).__name__, str(e)), file=sys.stderr) err_event.set() raise finally: if did_lock: lock.release() sem.release() return success def _Fetch(self, projects, opt): fetched = set() lock = _threading.Lock() pm = Progress('Fetching projects', len(projects)) objdir_project_map = dict() for project in projects: objdir_project_map.setdefault(project.objdir, []).append(project) threads = set() sem = _threading.Semaphore(self.jobs) err_event = _threading.Event() for project_list in objdir_project_map.values(): # Check for any errors before running any more tasks. # ...we'll let existing threads finish, though. if err_event.isSet() and not opt.force_broken: break sem.acquire() kwargs = dict(opt=opt, projects=project_list, lock=lock, fetched=fetched, pm=pm, sem=sem, err_event=err_event) if self.jobs > 1: t = _threading.Thread(target = self._FetchProjectList, kwargs = kwargs) # Ensure that Ctrl-C will not freeze the repo process. t.daemon = True threads.add(t) t.start() else: self._FetchProjectList(**kwargs) for t in threads: t.join() # If we saw an error, exit with code 1 so that other scripts can check. if err_event.isSet(): print('\nerror: Exited sync due to fetch errors', file=sys.stderr) sys.exit(1) pm.end() self._fetch_times.Save() if not self.manifest.IsArchive: self._GCProjects(projects) return fetched def _GCProjects(self, projects): gc_gitdirs = {} for project in projects: if len(project.manifest.GetProjectsWithName(project.name)) > 1: print('Shared project %s found, disabling pruning.' 
% project.name) project.bare_git.config('--replace-all', 'gc.pruneExpire', 'never') gc_gitdirs[project.gitdir] = project.bare_git has_dash_c = git_require((1, 7, 2)) if multiprocessing and has_dash_c: cpu_count = multiprocessing.cpu_count() else: cpu_count = 1 jobs = min(self.jobs, cpu_count) if jobs < 2: for bare_git in gc_gitdirs.values(): bare_git.gc('--auto') return config = {'pack.threads': cpu_count / jobs if cpu_count > jobs else 1} threads = set() sem = _threading.Semaphore(jobs) err_event = _threading.Event() def GC(bare_git): try: try: bare_git.gc('--auto', config=config) except GitError: err_event.set() except: err_event.set() raise finally: sem.release() for bare_git in gc_gitdirs.values(): if err_event.isSet(): break sem.acquire() t = _threading.Thread(target=GC, args=(bare_git,)) t.daemon = True threads.add(t) t.start() for t in threads: t.join() if err_event.isSet(): print('\nerror: Exited sync due to gc errors', file=sys.stderr) sys.exit(1) def _ReloadManifest(self, manifest_name=None): if manifest_name: # Override calls _Unload already self.manifest.Override(manifest_name) else: self.manifest._Unload() def _DeleteProject(self, path): print('Deleting obsolete path %s' % path, file=sys.stderr) # Delete the .git directory first, so we're less likely to have a partially # working git repository around. There shouldn't be any git projects here, # so rmtree works. try: shutil.rmtree(os.path.join(path, '.git')) except OSError: print('Failed to remove %s' % os.path.join(path, '.git'), file=sys.stderr) print('error: Failed to delete obsolete path %s' % path, file=sys.stderr) print(' remove manually, then run sync again', file=sys.stderr) return -1 # Delete everything under the worktree, except for directories that contain # another git project dirs_to_remove = [] failed = False for root, dirs, files in os.walk(path): for f in files: try: os.remove(os.path.join(root, f)) except OSError: print('Failed to remove %s' % os.path.join(root, f), file=sys.stderr) failed = True dirs[:] = [d for d in dirs if not os.path.lexists(os.path.join(root, d, '.git'))] dirs_to_remove += [os.path.join(root, d) for d in dirs if os.path.join(root, d) not in dirs_to_remove] for d in reversed(dirs_to_remove): if os.path.islink(d): try: os.remove(d) except OSError: print('Failed to remove %s' % os.path.join(root, d), file=sys.stderr) failed = True elif len(os.listdir(d)) == 0: try: os.rmdir(d) except OSError: print('Failed to remove %s' % os.path.join(root, d), file=sys.stderr) failed = True continue if failed: print('error: Failed to delete obsolete path %s' % path, file=sys.stderr) print(' remove manually, then run sync again', file=sys.stderr) return -1 # Try deleting parent dirs if they are empty project_dir = path while project_dir != self.manifest.topdir: if len(os.listdir(project_dir)) == 0: os.rmdir(project_dir) else: break project_dir = os.path.dirname(project_dir) return 0 def UpdateProjectList(self): new_project_paths = [] for project in self.GetProjects(None, missing_ok=True): if project.relpath: new_project_paths.append(project.relpath) file_name = 'project.list' file_path = os.path.join(self.manifest.repodir, file_name) old_project_paths = [] if os.path.exists(file_path): fd = open(file_path, 'r') try: old_project_paths = fd.read().split('\n') finally: fd.close() for path in old_project_paths: if not path: continue if path not in new_project_paths: # If the path has already been deleted, we don't need to do it gitdir = os.path.join(self.manifest.topdir, path, '.git') if os.path.exists(gitdir): 
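          # Build a throwaway Project object for this now-unreferenced path so
          # that IsDirty() can be checked before its working tree is deleted.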
project = Project( manifest = self.manifest, name = path, remote = RemoteSpec('origin'), gitdir = gitdir, objdir = gitdir, worktree = os.path.join(self.manifest.topdir, path), relpath = path, revisionExpr = 'HEAD', revisionId = None, groups = None) if project.IsDirty(): print('error: Cannot remove project "%s": uncommitted changes ' 'are present' % project.relpath, file=sys.stderr) print(' commit changes, then run sync again', file=sys.stderr) return -1 elif self._DeleteProject(project.worktree): return -1 new_project_paths.sort() fd = open(file_path, 'w') try: fd.write('\n'.join(new_project_paths)) fd.write('\n') finally: fd.close() return 0 def Execute(self, opt, args): if opt.jobs: self.jobs = opt.jobs if self.jobs > 1: soft_limit, _ = _rlimit_nofile() self.jobs = min(self.jobs, (soft_limit - 5) / 3) if opt.network_only and opt.detach_head: print('error: cannot combine -n and -d', file=sys.stderr) sys.exit(1) if opt.network_only and opt.local_only: print('error: cannot combine -n and -l', file=sys.stderr) sys.exit(1) if opt.manifest_name and opt.smart_sync: print('error: cannot combine -m and -s', file=sys.stderr) sys.exit(1) if opt.manifest_name and opt.smart_tag: print('error: cannot combine -m and -t', file=sys.stderr) sys.exit(1) if opt.manifest_server_username or opt.manifest_server_password: if not (opt.smart_sync or opt.smart_tag): print('error: -u and -p may only be combined with -s or -t', file=sys.stderr) sys.exit(1) if None in [opt.manifest_server_username, opt.manifest_server_password]: print('error: both -u and -p must be given', file=sys.stderr) sys.exit(1) if opt.manifest_name: self.manifest.Override(opt.manifest_name) manifest_name = opt.manifest_name smart_sync_manifest_name = "smart_sync_override.xml" smart_sync_manifest_path = os.path.join( self.manifest.manifestProject.worktree, smart_sync_manifest_name) if opt.smart_sync or opt.smart_tag: if not self.manifest.manifest_server: print('error: cannot smart sync: no manifest server defined in ' 'manifest', file=sys.stderr) sys.exit(1) manifest_server = self.manifest.manifest_server if not opt.quiet: print('Using manifest server %s' % manifest_server) if not '@' in manifest_server: username = None password = None if opt.manifest_server_username and opt.manifest_server_password: username = opt.manifest_server_username password = opt.manifest_server_password else: try: info = netrc.netrc() except IOError: # .netrc file does not exist or could not be opened pass else: try: parse_result = urllib.parse.urlparse(manifest_server) if parse_result.hostname: auth = info.authenticators(parse_result.hostname) if auth: username, _account, password = auth else: print('No credentials found for %s in .netrc' % parse_result.hostname, file=sys.stderr) except netrc.NetrcParseError as e: print('Error parsing .netrc file: %s' % e, file=sys.stderr) if (username and password): manifest_server = manifest_server.replace('://', '://%s:%s@' % (username, password), 1) transport = PersistentTransport(manifest_server) if manifest_server.startswith('persistent-'): manifest_server = manifest_server[len('persistent-'):] try: server = xmlrpc.client.Server(manifest_server, transport=transport) if opt.smart_sync: p = self.manifest.manifestProject b = p.GetBranch(p.CurrentBranch) branch = b.merge if branch.startswith(R_HEADS): branch = branch[len(R_HEADS):] env = os.environ.copy() if 'SYNC_TARGET' in env: target = env['SYNC_TARGET'] [success, manifest_str] = server.GetApprovedManifest(branch, target) elif 'TARGET_PRODUCT' in env and 'TARGET_BUILD_VARIANT' in 
env: target = '%s-%s' % (env['TARGET_PRODUCT'], env['TARGET_BUILD_VARIANT']) [success, manifest_str] = server.GetApprovedManifest(branch, target) else: [success, manifest_str] = server.GetApprovedManifest(branch) else: assert(opt.smart_tag) [success, manifest_str] = server.GetManifest(opt.smart_tag) if success: manifest_name = smart_sync_manifest_name try: f = open(smart_sync_manifest_path, 'w') try: f.write(manifest_str) finally: f.close() except IOError as e: print('error: cannot write manifest to %s:\n%s' % (smart_sync_manifest_path, e), file=sys.stderr) sys.exit(1) self._ReloadManifest(manifest_name) else: print('error: manifest server RPC call failed: %s' % manifest_str, file=sys.stderr) sys.exit(1) except (socket.error, IOError, xmlrpc.client.Fault) as e: print('error: cannot connect to manifest server %s:\n%s' % (self.manifest.manifest_server, e), file=sys.stderr) sys.exit(1) except xmlrpc.client.ProtocolError as e: print('error: cannot connect to manifest server %s:\n%d %s' % (self.manifest.manifest_server, e.errcode, e.errmsg), file=sys.stderr) sys.exit(1) else: # Not smart sync or smart tag mode if os.path.isfile(smart_sync_manifest_path): try: os.remove(smart_sync_manifest_path) except OSError as e: print('error: failed to remove existing smart sync override manifest: %s' % e, file=sys.stderr) rp = self.manifest.repoProject rp.PreSync() mp = self.manifest.manifestProject mp.PreSync() if opt.repo_upgraded: _PostRepoUpgrade(self.manifest, quiet=opt.quiet) if not opt.local_only: mp.Sync_NetworkHalf(quiet=opt.quiet, current_branch_only=opt.current_branch_only, no_tags=opt.no_tags, optimized_fetch=opt.optimized_fetch) if mp.HasChanges: syncbuf = SyncBuffer(mp.config) mp.Sync_LocalHalf(syncbuf) if not syncbuf.Finish(): sys.exit(1) self._ReloadManifest(manifest_name) if opt.jobs is None: self.jobs = self.manifest.default.sync_j if self.gitc_manifest: gitc_manifest_projects = self.GetProjects(args, missing_ok=True) gitc_projects = [] opened_projects = [] for project in gitc_manifest_projects: if project.relpath in self.gitc_manifest.paths and \ self.gitc_manifest.paths[project.relpath].old_revision: opened_projects.append(project.relpath) else: gitc_projects.append(project.relpath) if not args: gitc_projects = None if gitc_projects != [] and not opt.local_only: print('Updating GITC client: %s' % self.gitc_manifest.gitc_client_name) manifest = GitcManifest(self.repodir, self.gitc_manifest.gitc_client_name) if manifest_name: manifest.Override(manifest_name) else: manifest.Override(self.manifest.manifestFile) gitc_utils.generate_gitc_manifest(self.gitc_manifest, manifest, gitc_projects) print('GITC client successfully synced.') # The opened projects need to be synced as normal, therefore we # generate a new args list to represent the opened projects. # TODO: make this more reliable -- if there's a project name/path overlap, # this may choose the wrong project. 
args = [os.path.relpath(self.manifest.paths[p].worktree, os.getcwd()) for p in opened_projects] if not args: return all_projects = self.GetProjects(args, missing_ok=True, submodules_ok=opt.fetch_submodules) self._fetch_times = _FetchTimes(self.manifest) if not opt.local_only: to_fetch = [] now = time.time() if _ONE_DAY_S <= (now - rp.LastFetch): to_fetch.append(rp) to_fetch.extend(all_projects) to_fetch.sort(key=self._fetch_times.Get, reverse=True) fetched = self._Fetch(to_fetch, opt) _PostRepoFetch(rp, opt.no_repo_verify) if opt.network_only: # bail out now; the rest touches the working tree return # Iteratively fetch missing and/or nested unregistered submodules previously_missing_set = set() while True: self._ReloadManifest(manifest_name) all_projects = self.GetProjects(args, missing_ok=True, submodules_ok=opt.fetch_submodules) missing = [] for project in all_projects: if project.gitdir not in fetched: missing.append(project) if not missing: break # Stop us from non-stopped fetching actually-missing repos: If set of # missing repos has not been changed from last fetch, we break. missing_set = set(p.name for p in missing) if previously_missing_set == missing_set: break previously_missing_set = missing_set fetched.update(self._Fetch(missing, opt)) if self.manifest.IsMirror or self.manifest.IsArchive: # bail out now, we have no working tree return if self.UpdateProjectList(): sys.exit(1) syncbuf = SyncBuffer(mp.config, detach_head = opt.detach_head) pm = Progress('Syncing work tree', len(all_projects)) for project in all_projects: pm.update() if project.worktree: project.Sync_LocalHalf(syncbuf, force_sync=opt.force_sync) pm.end() print(file=sys.stderr) if not syncbuf.Finish(): sys.exit(1) # If there's a notice that's supposed to print at the end of the sync, print # it now... 
if self.manifest.notice: print(self.manifest.notice) def _PostRepoUpgrade(manifest, quiet=False): wrapper = Wrapper() if wrapper.NeedSetupGnuPG(): wrapper.SetupGnuPG(quiet) for project in manifest.projects: if project.Exists: project.PostRepoUpgrade() def _PostRepoFetch(rp, no_repo_verify=False, verbose=False): if rp.HasChanges: print('info: A new version of repo is available', file=sys.stderr) print(file=sys.stderr) if no_repo_verify or _VerifyTag(rp): syncbuf = SyncBuffer(rp.config) rp.Sync_LocalHalf(syncbuf) if not syncbuf.Finish(): sys.exit(1) print('info: Restarting repo with latest version', file=sys.stderr) raise RepoChangedException(['--repo-upgraded']) else: print('warning: Skipped upgrade to unverified version', file=sys.stderr) else: if verbose: print('repo version %s is current' % rp.work_git.describe(HEAD), file=sys.stderr) def _VerifyTag(project): gpg_dir = os.path.expanduser('~/.repoconfig/gnupg') if not os.path.exists(gpg_dir): print('warning: GnuPG was not available during last "repo init"\n' 'warning: Cannot automatically authenticate repo."""', file=sys.stderr) return True try: cur = project.bare_git.describe(project.GetRevisionId()) except GitError: cur = None if not cur \ or re.compile(r'^.*-[0-9]{1,}-g[0-9a-f]{1,}$').match(cur): rev = project.revisionExpr if rev.startswith(R_HEADS): rev = rev[len(R_HEADS):] print(file=sys.stderr) print("warning: project '%s' branch '%s' is not signed" % (project.name, rev), file=sys.stderr) return False env = os.environ.copy() env['GIT_DIR'] = project.gitdir.encode() env['GNUPGHOME'] = gpg_dir.encode() cmd = [GIT, 'tag', '-v', cur] proc = subprocess.Popen(cmd, stdout = subprocess.PIPE, stderr = subprocess.PIPE, env = env) out = proc.stdout.read() proc.stdout.close() err = proc.stderr.read() proc.stderr.close() if proc.wait() != 0: print(file=sys.stderr) print(out, file=sys.stderr) print(err, file=sys.stderr) print(file=sys.stderr) return False return True class _FetchTimes(object): _ALPHA = 0.5 def __init__(self, manifest): self._path = os.path.join(manifest.repodir, '.repo_fetchtimes.json') self._times = None self._seen = set() def Get(self, project): self._Load() return self._times.get(project.name, _ONE_DAY_S) def Set(self, project, t): self._Load() name = project.name old = self._times.get(name, t) self._seen.add(name) a = self._ALPHA self._times[name] = (a*t) + ((1-a) * old) def _Load(self): if self._times is None: try: f = open(self._path) try: self._times = json.load(f) finally: f.close() except (IOError, ValueError): try: os.remove(self._path) except OSError: pass self._times = {} def Save(self): if self._times is None: return to_delete = [] for name in self._times: if name not in self._seen: to_delete.append(name) for name in to_delete: del self._times[name] try: f = open(self._path, 'w') try: json.dump(self._times, f, indent=2) finally: f.close() except (IOError, TypeError): try: os.remove(self._path) except OSError: pass # This is a replacement for xmlrpc.client.Transport using urllib2 # and supporting persistent-http[s]. It cannot change hosts from # request to request like the normal transport, the real url # is passed during initialization. 
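# A minimal sketch (hypothetical helper, mirroring the scheme handling inside
# PersistentTransport.request() below): persistent-http[s] URLs are fetched
# over plain http/https, and a proxied persistent-https request is sent as
# http so that the proxy itself can handle the https leg.
def _normalize_persistent_scheme(scheme, proxied=False):
  if scheme == 'persistent-http':
    return 'http'
  if scheme == 'persistent-https':
    return 'http' if proxied else 'https'
  return scheme
# e.g. _normalize_persistent_scheme('persistent-https') == 'https'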
class PersistentTransport(xmlrpc.client.Transport): def __init__(self, orig_host): self.orig_host = orig_host def request(self, host, handler, request_body, verbose=False): with GetUrlCookieFile(self.orig_host, not verbose) as (cookiefile, proxy): # Python doesn't understand cookies with the #HttpOnly_ prefix # Since we're only using them for HTTP, copy the file temporarily, # stripping those prefixes away. if cookiefile: tmpcookiefile = tempfile.NamedTemporaryFile() tmpcookiefile.write("# HTTP Cookie File") try: with open(cookiefile) as f: for line in f: if line.startswith("#HttpOnly_"): line = line[len("#HttpOnly_"):] tmpcookiefile.write(line) tmpcookiefile.flush() cookiejar = cookielib.MozillaCookieJar(tmpcookiefile.name) try: cookiejar.load() except cookielib.LoadError: cookiejar = cookielib.CookieJar() finally: tmpcookiefile.close() else: cookiejar = cookielib.CookieJar() proxyhandler = urllib.request.ProxyHandler if proxy: proxyhandler = urllib.request.ProxyHandler({ "http": proxy, "https": proxy }) opener = urllib.request.build_opener( urllib.request.HTTPCookieProcessor(cookiejar), proxyhandler) url = urllib.parse.urljoin(self.orig_host, handler) parse_results = urllib.parse.urlparse(url) scheme = parse_results.scheme if scheme == 'persistent-http': scheme = 'http' if scheme == 'persistent-https': # If we're proxying through persistent-https, use http. The # proxy itself will do the https. if proxy: scheme = 'http' else: scheme = 'https' # Parse out any authentication information using the base class host, extra_headers, _ = self.get_host_info(parse_results.netloc) url = urllib.parse.urlunparse(( scheme, host, parse_results.path, parse_results.params, parse_results.query, parse_results.fragment)) request = urllib.request.Request(url, request_body) if extra_headers is not None: for (name, header) in extra_headers: request.add_header(name, header) request.add_header('Content-Type', 'text/xml') try: response = opener.open(request) except urllib.error.HTTPError as e: if e.code == 501: # We may have been redirected through a login process # but our POST turned into a GET. Retry. response = opener.open(request) else: raise p, u = xmlrpc.client.getparser() while 1: data = response.read(1024) if not data: break p.feed(data) p.close() return u.close() def close(self): pass subcmds/upload.py0100644 0000000 0000000 00000040763 13025567015 013121 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
from __future__ import print_function import copy import re import sys from command import InteractiveCommand from editor import Editor from error import HookError, UploadError from git_command import GitCommand from project import RepoHook from pyversion import is_python3 # pylint:disable=W0622 if not is_python3(): input = raw_input else: unicode = str # pylint:enable=W0622 UNUSUAL_COMMIT_THRESHOLD = 5 def _ConfirmManyUploads(multiple_branches=False): if multiple_branches: print('ATTENTION: One or more branches has an unusually high number ' 'of commits.') else: print('ATTENTION: You are uploading an unusually high number of commits.') print('YOU PROBABLY DO NOT MEAN TO DO THIS. (Did you rebase across ' 'branches?)') answer = input("If you are sure you intend to do this, type 'yes': ").strip() return answer == "yes" def _die(fmt, *args): msg = fmt % args print('error: %s' % msg, file=sys.stderr) sys.exit(1) def _SplitEmails(values): result = [] for value in values: result.extend([s.strip() for s in value.split(',')]) return result class Upload(InteractiveCommand): common = True helpSummary = "Upload changes for code review" helpUsage = """ %prog [--re --cc] []... """ helpDescription = """ The '%prog' command is used to send changes to the Gerrit Code Review system. It searches for topic branches in local projects that have not yet been published for review. If multiple topic branches are found, '%prog' opens an editor to allow the user to select which branches to upload. '%prog' searches for uploadable changes in all projects listed at the command line. Projects can be specified either by name, or by a relative or absolute path to the project's local directory. If no projects are specified, '%prog' will search for uploadable changes in all projects listed in the manifest. If the --reviewers or --cc options are passed, those emails are added to the respective list of users, and emails are sent to any new users. Users passed as --reviewers must already be registered with the code review system, or the upload will fail. Configuration ------------- review.URL.autoupload: To disable the "Upload ... (y/N)?" prompt, you can set a per-project or global Git configuration option. If review.URL.autoupload is set to "true" then repo will assume you always answer "y" at the prompt, and will not prompt you further. If it is set to "false" then repo will assume you always answer "n", and will abort. review.URL.autoreviewer: To automatically append a user or mailing list to reviews, you can set a per-project or global Git option to do so. review.URL.autocopy: To automatically copy a user or mailing list to all uploaded reviews, you can set a per-project or global Git option to do so. Specifically, review.URL.autocopy can be set to a comma separated list of reviewers who you always want copied on all uploads with a non-empty --re argument. review.URL.username: Override the username used to connect to Gerrit Code Review. By default the local part of the email address is used. The URL must match the review URL listed in the manifest XML file, or in the .git/config within the project. For example: [remote "origin"] url = git://git.example.com/project.git review = http://review.example.com/ [review "http://review.example.com/"] autoupload = true autocopy = johndoe@company.com,my-team-alias@company.com review.URL.uploadtopic: To add a topic branch whenever uploading a commit, you can set a per-project or global Git option to do so. 
If review.URL.uploadtopic is set to "true" then repo will assume you always want the equivalent of the -t option to the repo command. If unset or set to "false" then repo will make use of only the command line option. References ---------- Gerrit Code Review: http://code.google.com/p/gerrit/ """ def _Options(self, p): p.add_option('-t', dest='auto_topic', action='store_true', help='Send local branch name to Gerrit Code Review') p.add_option('--re', '--reviewers', type='string', action='append', dest='reviewers', help='Request reviews from these people.') p.add_option('--cc', type='string', action='append', dest='cc', help='Also send email to these email addresses.') p.add_option('--br', type='string', action='store', dest='branch', help='Branch to upload.') p.add_option('--cbr', '--current-branch', dest='current_branch', action='store_true', help='Upload current git branch.') p.add_option('-d', '--draft', action='store_true', dest='draft', default=False, help='If specified, upload as a draft.') p.add_option('-D', '--destination', '--dest', type='string', action='store', dest='dest_branch', metavar='BRANCH', help='Submit for review on this target branch.') # Options relating to upload hook. Note that verify and no-verify are NOT # opposites of each other, which is why they store to different locations. # We are using them to match 'git commit' syntax. # # Combinations: # - no-verify=False, verify=False (DEFAULT): # If stdout is a tty, can prompt about running upload hooks if needed. # If user denies running hooks, the upload is cancelled. If stdout is # not a tty and we would need to prompt about upload hooks, upload is # cancelled. # - no-verify=False, verify=True: # Always run upload hooks with no prompt. # - no-verify=True, verify=False: # Never run upload hooks, but upload anyway (AKA bypass hooks). # - no-verify=True, verify=True: # Invalid p.add_option('--no-verify', dest='bypass_hooks', action='store_true', help='Do not run the upload hook.') p.add_option('--verify', dest='allow_all_hooks', action='store_true', help='Run the upload hook without prompting.') def _SingleBranch(self, opt, branch, people): project = branch.project name = branch.name remote = project.GetBranch(name).remote key = 'review.%s.autoupload' % remote.review answer = project.config.GetBoolean(key) if answer is False: _die("upload blocked by %s = false" % key) if answer is None: date = branch.date commit_list = branch.commits destination = opt.dest_branch or project.dest_branch or project.revisionExpr print('Upload project %s/ to remote branch %s:' % (project.relpath, destination)) print(' branch %s (%2d commit%s, %s):' % ( name, len(commit_list), len(commit_list) != 1 and 's' or '', date)) for commit in commit_list: print(' %s' % commit) sys.stdout.write('to %s (y/N)? 
' % remote.review) answer = sys.stdin.readline().strip().lower() answer = answer in ('y', 'yes', '1', 'true', 't') if answer: if len(branch.commits) > UNUSUAL_COMMIT_THRESHOLD: answer = _ConfirmManyUploads() if answer: self._UploadAndReport(opt, [branch], people) else: _die("upload aborted by user") def _MultipleBranches(self, opt, pending, people): projects = {} branches = {} script = [] script.append('# Uncomment the branches to upload:') for project, avail in pending: script.append('#') script.append('# project %s/:' % project.relpath) b = {} for branch in avail: if branch is None: continue name = branch.name date = branch.date commit_list = branch.commits if b: script.append('#') destination = opt.dest_branch or project.dest_branch or project.revisionExpr script.append('# branch %s (%2d commit%s, %s) to remote branch %s:' % ( name, len(commit_list), len(commit_list) != 1 and 's' or '', date, destination)) for commit in commit_list: script.append('# %s' % commit) b[name] = branch projects[project.relpath] = project branches[project.name] = b script.append('') script = [ x.encode('utf-8') if issubclass(type(x), unicode) else x for x in script ] script = Editor.EditString("\n".join(script)).split("\n") project_re = re.compile(r'^#?\s*project\s*([^\s]+)/:$') branch_re = re.compile(r'^\s*branch\s*([^\s(]+)\s*\(.*') project = None todo = [] for line in script: m = project_re.match(line) if m: name = m.group(1) project = projects.get(name) if not project: _die('project %s not available for upload', name) continue m = branch_re.match(line) if m: name = m.group(1) if not project: _die('project for branch %s not in script', name) branch = branches[project.name].get(name) if not branch: _die('branch %s not in %s', name, project.relpath) todo.append(branch) if not todo: _die("nothing uncommented for upload") many_commits = False for branch in todo: if len(branch.commits) > UNUSUAL_COMMIT_THRESHOLD: many_commits = True break if many_commits: if not _ConfirmManyUploads(multiple_branches=True): _die("upload aborted by user") self._UploadAndReport(opt, todo, people) def _AppendAutoList(self, branch, people): """ Appends the list of reviewers in the git project's config. Appends the list of users in the CC list in the git project's config if a non-empty reviewer list was found. 
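# ---------------------------------------------------------------------------
# An illustrative, self-contained sketch (not part of upload.py) of how the
# editor script built by _MultipleBranches() above is interpreted: a branch is
# selected for upload when its line is uncommented (so it matches branch_re
# without a leading '#'), and project lines re-establish which project the
# following branch lines belong to.  Names here are hypothetical.
import re

_project_re = re.compile(r'^#?\s*project\s*([^\s]+)/:$')
_branch_re = re.compile(r'^\s*branch\s*([^\s(]+)\s*\(.*')

def _parse_selection(script_lines):
  selected = []
  current_project = None
  for line in script_lines:
    m = _project_re.match(line)
    if m:
      current_project = m.group(1)
      continue
    m = _branch_re.match(line)
    if m:
      selected.append((current_project, m.group(1)))
  return selected

# Example: only the un-commented branch line is picked up.
# _parse_selection(['# project platform/build/:',
#                   '#  branch topic-a (2 commits, ...) to remote branch master:',
#                   '  branch topic-b (1 commit, ...) to remote branch master:'])
# -> [('platform/build', 'topic-b')]
# ---------------------------------------------------------------------------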
""" name = branch.name project = branch.project key = 'review.%s.autoreviewer' % project.GetBranch(name).remote.review raw_list = project.config.GetString(key) if not raw_list is None: people[0].extend([entry.strip() for entry in raw_list.split(',')]) key = 'review.%s.autocopy' % project.GetBranch(name).remote.review raw_list = project.config.GetString(key) if not raw_list is None and len(people[0]) > 0: people[1].extend([entry.strip() for entry in raw_list.split(',')]) def _FindGerritChange(self, branch): last_pub = branch.project.WasPublished(branch.name) if last_pub is None: return "" refs = branch.GetPublishedRefs() try: # refs/changes/XYZ/N --> XYZ return refs.get(last_pub).split('/')[-2] except (AttributeError, IndexError): return "" def _UploadAndReport(self, opt, todo, original_people): have_errors = False for branch in todo: try: people = copy.deepcopy(original_people) self._AppendAutoList(branch, people) # Check if there are local changes that may have been forgotten changes = branch.project.UncommitedFiles() if changes: key = 'review.%s.autoupload' % branch.project.remote.review answer = branch.project.config.GetBoolean(key) # if they want to auto upload, let's not ask because it could be automated if answer is None: sys.stdout.write('Uncommitted changes in ' + branch.project.name) sys.stdout.write(' (did you forget to amend?):\n') sys.stdout.write('\n'.join(changes) + '\n') sys.stdout.write('Continue uploading? (y/N) ') a = sys.stdin.readline().strip().lower() if a not in ('y', 'yes', 't', 'true', 'on'): print("skipping upload", file=sys.stderr) branch.uploaded = False branch.error = 'User aborted' continue # Check if topic branches should be sent to the server during upload if opt.auto_topic is not True: key = 'review.%s.uploadtopic' % branch.project.remote.review opt.auto_topic = branch.project.config.GetBoolean(key) destination = opt.dest_branch or branch.project.dest_branch # Make sure our local branch is not setup to track a different remote branch merge_branch = self._GetMergeBranch(branch.project) if destination: full_dest = 'refs/heads/%s' % destination if not opt.dest_branch and merge_branch and merge_branch != full_dest: print('merge branch %s does not match destination branch %s' % (merge_branch, full_dest)) print('skipping upload.') print('Please use `--destination %s` if this is intentional' % destination) branch.uploaded = False continue branch.UploadForReview(people, auto_topic=opt.auto_topic, draft=opt.draft, dest_branch=destination) branch.uploaded = True except UploadError as e: branch.error = e branch.uploaded = False have_errors = True print(file=sys.stderr) print('----------------------------------------------------------------------', file=sys.stderr) if have_errors: for branch in todo: if not branch.uploaded: if len(str(branch.error)) <= 30: fmt = ' (%s)' else: fmt = '\n (%s)' print(('[FAILED] %-15s %-15s' + fmt) % ( branch.project.relpath + '/', \ branch.name, \ str(branch.error)), file=sys.stderr) print() for branch in todo: if branch.uploaded: print('[OK ] %-15s %s' % ( branch.project.relpath + '/', branch.name), file=sys.stderr) if have_errors: sys.exit(1) def _GetMergeBranch(self, project): p = GitCommand(project, ['rev-parse', '--abbrev-ref', 'HEAD'], capture_stdout = True, capture_stderr = True) p.Wait() local_branch = p.stdout.strip() p = GitCommand(project, ['config', '--get', 'branch.%s.merge' % local_branch], capture_stdout = True, capture_stderr = True) p.Wait() merge_branch = p.stdout.strip() return merge_branch def Execute(self, opt, args): 
project_list = self.GetProjects(args) pending = [] reviewers = [] cc = [] branch = None if opt.branch: branch = opt.branch for project in project_list: if opt.current_branch: cbr = project.CurrentBranch up_branch = project.GetUploadableBranch(cbr) if up_branch: avail = [up_branch] else: avail = None print('ERROR: Current branch (%s) not uploadable. ' 'You may be able to type ' '"git branch --set-upstream-to m/master" to fix ' 'your branch.' % str(cbr), file=sys.stderr) else: avail = project.GetUploadableBranches(branch) if avail: pending.append((project, avail)) if not pending: print("no branches ready for upload", file=sys.stderr) return if not opt.bypass_hooks: hook = RepoHook('pre-upload', self.manifest.repo_hooks_project, self.manifest.topdir, self.manifest.manifestProject.GetRemote('origin').url, abort_if_user_denies=True) pending_proj_names = [project.name for (project, avail) in pending] pending_worktrees = [project.worktree for (project, avail) in pending] try: hook.Run(opt.allow_all_hooks, project_list=pending_proj_names, worktree_list=pending_worktrees) except HookError as e: print("ERROR: %s" % str(e), file=sys.stderr) return if opt.reviewers: reviewers = _SplitEmails(opt.reviewers) if opt.cc: cc = _SplitEmails(opt.cc) people = (reviewers, cc) if len(pending) == 1 and len(pending[0][1]) == 1: self._SingleBranch(opt, pending[0][1][0], people) else: self._MultipleBranches(opt, pending, people) subcmds/version.py0100644 0000000 0000000 00000002540 13025567015 013311 0ustar000000000 0000000 # # Copyright (C) 2009 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import sys from command import Command, MirrorSafeCommand from git_command import git from git_refs import HEAD class Version(Command, MirrorSafeCommand): wrapper_version = None wrapper_path = None common = False helpSummary = "Display the version of repo" helpUsage = """ %prog """ def Execute(self, opt, args): rp = self.manifest.repoProject rem = rp.GetRemote(rp.remote.name) print('repo version %s' % rp.work_git.describe(HEAD)) print(' (from %s)' % rem.url) if Version.wrapper_path is not None: print('repo launcher version %s' % Version.wrapper_version) print(' (from %s)' % Version.wrapper_path) print(git.version().strip()) print('Python %s' % sys.version) tests/0040755 0000000 0000000 00000000000 13025567015 010756 5ustar000000000 0000000 tests/fixtures/0040755 0000000 0000000 00000000000 13025567015 012627 5ustar000000000 0000000 tests/fixtures/gitc_config0100644 0000000 0000000 00000000045 13025567015 015021 0ustar000000000 0000000 gitc_dir=/test/usr/local/google/gitc tests/fixtures/test.gitconfig0100644 0000000 0000000 00000000042 13025567015 015472 0ustar000000000 0000000 [section] empty nonempty = true tests/test_git_config.py0100644 0000000 0000000 00000002315 13025567015 014475 0ustar000000000 0000000 import os import unittest import git_config def fixture(*paths): """Return a path relative to test/fixtures. 
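# ---------------------------------------------------------------------------
# An annotation, not part of test_git_config.py: the gitc_config fixture above
# holds a single "gitc_dir=<path>" line.  The launcher's get_gitc_manifest_dir()
# (exercised by tests/test_wrapper.py below) presumably reduces to something like
# the sketch here -- return the configured path, or '' when the config file is
# missing.  The function name mirrors the wrapper API; the body is illustrative
# only, not the project's implementation.
import os

def get_gitc_manifest_dir(config_file):
  try:
    with open(config_file) as f:
      for line in f:
        key, _, value = line.strip().partition('=')
        if key == 'gitc_dir':
          return value
  except IOError:
    pass
  return ''
# ---------------------------------------------------------------------------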
""" return os.path.join(os.path.dirname(__file__), 'fixtures', *paths) class GitConfigUnitTest(unittest.TestCase): """Tests the GitConfig class. """ def setUp(self): """Create a GitConfig object using the test.gitconfig fixture. """ config_fixture = fixture('test.gitconfig') self.config = git_config.GitConfig(config_fixture) def test_GetString_with_empty_config_values(self): """ Test config entries with no value. [section] empty """ val = self.config.GetString('section.empty') self.assertEqual(val, None) def test_GetString_with_true_value(self): """ Test config entries with a string value. [section] nonempty = true """ val = self.config.GetString('section.nonempty') self.assertEqual(val, 'true') def test_GetString_from_missing_file(self): """ Test missing config file """ config_fixture = fixture('not.present.gitconfig') config = git_config.GitConfig(config_fixture) val = config.GetString('empty') self.assertEqual(val, None) if __name__ == '__main__': unittest.main() tests/test_wrapper.py0100644 0000000 0000000 00000005615 13025567015 014053 0ustar000000000 0000000 # # Copyright (C) 2015 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. import os import unittest import wrapper def fixture(*paths): """Return a path relative to tests/fixtures. 
""" return os.path.join(os.path.dirname(__file__), 'fixtures', *paths) class RepoWrapperUnitTest(unittest.TestCase): """Tests helper functions in the repo wrapper """ def setUp(self): """Load the wrapper module every time """ wrapper._wrapper_module = None self.wrapper = wrapper.Wrapper() def test_get_gitc_manifest_dir_no_gitc(self): """ Test reading a missing gitc config file """ self.wrapper.GITC_CONFIG_FILE = fixture('missing_gitc_config') val = self.wrapper.get_gitc_manifest_dir() self.assertEqual(val, '') def test_get_gitc_manifest_dir(self): """ Test reading the gitc config file and parsing the directory """ self.wrapper.GITC_CONFIG_FILE = fixture('gitc_config') val = self.wrapper.get_gitc_manifest_dir() self.assertEqual(val, '/test/usr/local/google/gitc') def test_gitc_parse_clientdir_no_gitc(self): """ Test parsing the gitc clientdir without gitc running """ self.wrapper.GITC_CONFIG_FILE = fixture('missing_gitc_config') self.assertEqual(self.wrapper.gitc_parse_clientdir('/something'), None) self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test'), 'test') def test_gitc_parse_clientdir(self): """ Test parsing the gitc clientdir """ self.wrapper.GITC_CONFIG_FILE = fixture('gitc_config') self.assertEqual(self.wrapper.gitc_parse_clientdir('/something'), None) self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test'), 'test') self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test/'), 'test') self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test/extra'), 'test') self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test'), 'test') self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/'), 'test') self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/extra'), 'test') self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/'), None) self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/'), None) if __name__ == '__main__': unittest.main() trace.py0100644 0000000 0000000 00000001641 13025567015 011263 0ustar000000000 0000000 # # Copyright (C) 2008 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from __future__ import print_function import sys import os REPO_TRACE = 'REPO_TRACE' try: _TRACE = os.environ[REPO_TRACE] == '1' except KeyError: _TRACE = False def IsTrace(): return _TRACE def SetTrace(): global _TRACE _TRACE = True def Trace(fmt, *args): if IsTrace(): print(fmt % args, file=sys.stderr) wrapper.py0100644 0000000 0000000 00000001655 13025567015 011652 0ustar000000000 0000000 #!/usr/bin/env python # # Copyright (C) 2014 The Android Open Source Project # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. 
wrapper.py0100644 0000000 0000000 00000001655 13025567015 011652 0ustar000000000 0000000 #!/usr/bin/env python
#
# Copyright (C) 2014 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from __future__ import print_function
import imp
import os


def WrapperPath():
  return os.path.join(os.path.dirname(__file__), 'repo')

_wrapper_module = None
def Wrapper():
  global _wrapper_module
  if not _wrapper_module:
    _wrapper_module = imp.load_source('wrapper', WrapperPath())
  return _wrapper_module
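# ---------------------------------------------------------------------------
# An annotation, not part of wrapper.py.  Wrapper() lazily loads the 'repo'
# launcher script sitting next to this module, so callers such as
# tests/test_wrapper.py above can do wrapper.Wrapper().get_gitc_manifest_dir().
# The imp module used above is deprecated on modern Python 3; a rough importlib
# equivalent (a sketch under that assumption, not the project's code) is:
import importlib.machinery
import importlib.util

def _load_wrapper_module():
  loader = importlib.machinery.SourceFileLoader('wrapper', WrapperPath())
  spec = importlib.util.spec_from_loader('wrapper', loader)
  module = importlib.util.module_from_spec(spec)
  loader.exec_module(module)
  return module
# ---------------------------------------------------------------------------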